ERIC Educational Resources Information Center
Zhumasheva, Anara; Zhumabaeva, Zaida; Sakenov, Janat; Vedilina, Yelena; Zhaxylykova, Nuriya; Sekenova, Balkumis
2016-01-01
The current study focuses on the research topic of creating a theoretical model of development of information competence among students enrolled in elective courses. In order to examine specific features of the theoretical model of development of information competence among students enrolled in elective courses, we performed an analysis of…
ERIC Educational Resources Information Center
Dunlop, David Livingston
The purpose of this study was to use an information theoretic memory model to quantitatively investigate classification sorting and recall behaviors of various groups of students. The model provided theorems for the determination of information theoretic measures from which inferences concerning mental processing were made. The basic procedure…
Modeling of information diffusion in Twitter-like social networks under information overload.
Li, Pei; Li, Wei; Wang, Hui; Zhang, Xin
2014-01-01
Due to the existence of information overload in social networks, it becomes increasingly difficult for users to find useful information according to their interests. This paper considers Twitter-like social networks and proposes models to characterize the process of information diffusion under information overload. Users are classified into different types according to their in-degrees and out-degrees, and user behaviors are generalized into two categories: generating and forwarding. View scope is introduced to model the user information-processing capability under information overload, and the average number of times a message appears in view scopes after it is generated by a user of a given type is adopted to characterize the information diffusion efficiency, which is calculated theoretically. To verify the accuracy of the theoretical analysis, we conduct simulations, whose results are in close agreement with the theoretical predictions. These results are important for understanding the diffusion dynamics in social networks, and this analysis framework can be extended to consider more realistic situations.
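The view-scope mechanism described in this abstract can be sketched as a branching process: each holder of a message exposes it to their followers, each follower sees it in their limited view scope with some probability, and a seen message is occasionally forwarded. The following is a minimal Monte Carlo sketch under a strong simplifying assumption (a homogeneous network where every user has the same number of followers) with illustrative probabilities, not the paper's actual model or parameters:

```python
import random

def cascade_appearances(n_followers, p_seen, p_forward, max_depth=30, rng=random):
    """One Monte Carlo run: count how many times a single message lands in a
    follower's view scope. Each follower sees the message with probability
    p_seen and, having seen it, re-posts it with probability p_forward.
    Hypothetical homogeneous-network simplification, not the paper's model."""
    appearances = 0
    holders = 1  # users currently holding (having just posted) the message
    for _ in range(max_depth):
        new_holders = 0
        for _ in range(holders):
            for _ in range(n_followers):
                if rng.random() < p_seen:         # lands in a view scope
                    appearances += 1
                    if rng.random() < p_forward:  # follower re-posts it
                        new_holders += 1
        holders = new_holders
        if holders == 0:
            break
    return appearances

rng = random.Random(0)
runs = 50000
avg = sum(cascade_appearances(5, 0.4, 0.2, rng=rng) for _ in range(runs)) / runs
# For a subcritical cascade (d * p_seen * p_forward < 1) the expectation has
# the closed form d * p_seen / (1 - d * p_seen * p_forward).
closed_form = 5 * 0.4 / (1 - 5 * 0.4 * 0.2)
```

The simulated average converges to the branching-process expectation, mirroring on a toy scale the paper's agreement between simulation and theory.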
Information theoretic analysis of canny edge detection in visual communication
NASA Astrophysics Data System (ADS)
Jiang, Bo; Rahman, Zia-ur
2011-06-01
In general edge detection evaluation, edge detectors are examined, analyzed, and compared either visually or with a metric for a specific application. This analysis is usually independent of the characteristics of the image-gathering, transmission, and display processes that do impact the quality of the acquired image and thus the resulting edge image. We propose a new information theoretic analysis of edge detection that unites the different components of the visual communication channel and assesses edge detection algorithms in an integrated manner based on Shannon's information theory. An edge detection algorithm is considered to achieve high performance only if the information rate from the scene to the edge image approaches the maximum possible. Thus, by holding the initial conditions of the visual communication system constant, different edge detection algorithms can be evaluated. This analysis is normally limited to linear shift-invariant filters, so in order to examine the Canny edge operator in our proposed system, we need to estimate its "power spectral density" (PSD). Since the Canny operator is non-linear and shift-variant, we perform the estimation for a set of different system environment conditions using simulations. In this paper we first introduce the PSD of the Canny operator for a range of system parameters. Then, using the estimated PSD, we assess the Canny operator using information theoretic analysis. The information-theoretic metric is also used to compare the performance of the Canny operator with that of other edge-detection operators. This provides a simple tool for selecting appropriate edge-detection algorithms based on system parameters, and for adjusting their parameters to maximize information throughput.
One-dimensional barcode reading: an information theoretic approach
NASA Astrophysics Data System (ADS)
Houni, Karim; Sawaya, Wadih; Delignon, Yves
2008-03-01
In the context of converging identification and data-transmission technologies, the barcode has found its place as the simplest and most pervasive solution for new uses, especially within mobile commerce, bringing new life to this long-lived technology. From a communication theory point of view, a barcode is a singular code based on a graphical representation of the information to be transmitted. We present an information theoretic approach to the analysis of 1D image-based barcode reading. With a barcode facing the camera, distortions and acquisition are modeled as a communication channel. The performance of the system is evaluated by means of the average mutual information. On the basis of this theoretical criterion for reliable transmission, we introduce two new measures: the theoretical depth of field and the theoretical resolution. Simulations illustrate the gain of this approach.
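The channel view of barcode reading can be illustrated with a toy version of the problem: a single barcode module (black or white bit) observed through additive Gaussian noise, with the mutual information between the printed bit and the camera sample computed numerically. This is a simplified stand-in for the paper's full acquisition model; the noise level, used here as a proxy for defocus, is an assumption for illustration only:

```python
import math

def mutual_information_bits(sigma, grid=4000, span=8.0):
    """I(X;Y) in bits for equiprobable X in {0, 1} and Y = X + N(0, sigma^2).
    h(Y) is integrated numerically (midpoint rule); h(Y|X) is the Gaussian
    differential entropy 0.5*log2(2*pi*e*sigma^2)."""
    def gauss(y, mu):
        return (math.exp(-(y - mu) ** 2 / (2 * sigma ** 2))
                / (sigma * math.sqrt(2 * math.pi)))
    lo, hi = -span * sigma, 1.0 + span * sigma
    dy = (hi - lo) / grid
    h_y = 0.0
    for i in range(grid):
        y = lo + (i + 0.5) * dy
        p = 0.5 * gauss(y, 0.0) + 0.5 * gauss(y, 1.0)
        if p > 0:
            h_y -= p * math.log2(p) * dy
    h_y_given_x = 0.5 * math.log2(2 * math.pi * math.e * sigma ** 2)
    return h_y - h_y_given_x
```

As the noise grows, I(X;Y) falls from 1 bit toward 0; the theoretical depth-of-field and resolution measures in the paper are built on this kind of mutual-information criterion.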
Information-Theoretical Complexity Analysis of Selected Elementary Chemical Reactions
NASA Astrophysics Data System (ADS)
Molina-Espíritu, M.; Esquivel, R. O.; Dehesa, J. S.
We investigate the complexity of selected elementary chemical reactions (namely, the hydrogenic-abstraction reaction and the identity SN2 exchange reaction) by means of the following single and composite information-theoretic measures: disequilibrium (D), exponential entropy (L), Fisher information (I), power entropy (J), the I-D, D-L and I-J planes, and the Fisher-Shannon (FS) and Lopez-Mancini-Calbet (LMC) shape complexities. These quantities, which are functionals of the one-particle density, are computed in both position (r) and momentum (p) spaces. The analysis reveals that the chemically significant regions of these reactions can be identified through most of the single information-theoretic measures and the two-component planes: not only the regions commonly revealed by the energy, such as the reactant/product (R/P) and the transition state (TS), but also those absent from the energy profile, such as the bond cleavage energy region (BCER), the bond breaking/forming regions (B-B/F) and the charge transfer process (CT). The analysis of the complexities shows that the energy profile of the abstraction reaction bears the same information-theoretical features as the LMC and FS measures, whereas the identity SN2 exchange reaction does not exhibit such simple behavior with respect to the LMC and FS measures. Most of the chemical features of interest (BCER, B-B/F and CT) are only revealed when particular information-theoretic aspects of localizability (L or J), uniformity (D) and disorder (I) are considered.
The application of information theory for the research of aging and aging-related diseases.
Blokh, David; Stambler, Ilia
2017-10-01
This article reviews the application of information-theoretical analysis, employing measures of entropy and mutual information, for the study of aging and aging-related diseases. The research of aging and aging-related diseases is particularly suitable for the application of information theory methods, as aging processes and related diseases are multi-parametric, with continuous parameters coexisting alongside discrete parameters, and with the relations between the parameters being as a rule non-linear. Information theory provides unique analytical capabilities for the solution of such problems, with unique advantages over common linear biostatistics. Among the age-related diseases, information theory has been used in the study of neurodegenerative diseases (particularly using EEG time series for diagnosis and prediction), cancer (particularly for establishing individual and combined cancer biomarkers), diabetes (mainly utilizing mutual information to characterize the diseased and aging states), and heart disease (mainly for the analysis of heart rate variability). Few works have employed information theory for the analysis of general aging processes and frailty, as underlying determinants and possible early preclinical diagnostic measures for aging-related diseases. Generally, the use of information-theoretical analysis permits not only establishing the (non-linear) correlations between diagnostic or therapeutic parameters of interest, but may also provide a theoretical insight into the nature of aging and related diseases by establishing the measures of variability, adaptation, regulation or homeostasis, within a system of interest. It may be hoped that the increased use of such measures in research may considerably increase diagnostic and therapeutic capabilities and the fundamental theoretical mathematical understanding of aging and disease.
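A core quantity in the studies this review surveys is the mutual information between a (discretized) clinical parameter and a disease or age state. The following is a minimal plug-in estimator; the toy data stand in for real biomarkers and the variable names are illustrative, not taken from the review:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from paired discrete samples,
    e.g. a discretized biomarker paired with a disease-state label."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    mi = 0.0
    for (x, y), c in pxy.items():
        # p(x,y) * log2( p(x,y) / (p(x) * p(y)) ), with counts cancelled out
        mi += (c / n) * math.log2(c * n / (px[x] * py[y]))
    return mi
```

Unlike a linear correlation coefficient, this estimate is sensitive to any statistical dependence between the variables, which is what suits it to the non-linear relations the review emphasizes.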
Python for Information Theoretic Analysis of Neural Data
Ince, Robin A. A.; Petersen, Rasmus S.; Swan, Daniel C.; Panzeri, Stefano
2008-01-01
Information theory, the mathematical theory of communication in the presence of noise, is playing an increasingly important role in modern quantitative neuroscience. It makes it possible to treat neural systems as stochastic communication channels and gain valuable, quantitative insights into their sensory coding function. These techniques provide results on how neurons encode stimuli in a way which is independent of any specific assumptions on which part of the neuronal response is signal and which is noise, and they can be usefully applied even to highly non-linear systems where traditional techniques fail. In this article, we describe our work and experiences using Python for information theoretic analysis. We outline some of the algorithmic, statistical and numerical challenges in the computation of information theoretic quantities from neural data. In particular, we consider the problems arising from limited sampling bias and from calculation of maximum entropy distributions in the presence of constraints representing the effects of different orders of interaction in the system. We explain how and why using Python has allowed us to significantly improve the speed and domain of applicability of the information theoretic algorithms, allowing analysis of data sets characterized by larger numbers of variables. We also discuss how our use of Python is facilitating integration with collaborative databases and centralised computational resources. PMID:19242557
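One challenge the article names, computing maximum entropy distributions under constraints representing different interaction orders, has a closed form at first order: constraining only the single-neuron firing probabilities yields the independent (product) model. A minimal sketch for two binary neurons follows; the example response distribution is hypothetical, and this is not the authors' code:

```python
import math

def entropy_bits(dist):
    """Entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def first_order_maxent(joint):
    """Maximum-entropy model for two binary neurons constrained only by the
    single-neuron firing probabilities: the independent product model."""
    p1 = sum(p for (r1, _), p in joint.items() if r1 == 1)
    p2 = sum(p for (_, r2), p in joint.items() if r2 == 1)
    return {(a, b): (p1 if a else 1 - p1) * (p2 if b else 1 - p2)
            for a in (0, 1) for b in (0, 1)}

# Hypothetical correlated responses: the two neurons tend to fire together.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
indep = first_order_maxent(joint)
```

The entropy gap between the first-order model and the true joint distribution quantifies how much structure the pairwise correlations add beyond what the firing rates alone explain.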
The Philosophy of Information as an Underlying and Unifying Theory of Information Science
ERIC Educational Resources Information Center
Tomic, Taeda
2010-01-01
Introduction: Philosophical analyses of theoretical principles underlying these sub-domains reveal the philosophy of information as the underlying meta-theory of information science. Method: Conceptual research on the knowledge sub-domains in information science and philosophy and analysis of their mutual connection. Analysis: Similarities between…
Informational analysis for compressive sampling in radar imaging.
Zhang, Jingxiong; Yang, Ke
2015-03-24
Compressive sampling, or compressed sensing (CS), works on the assumption that the underlying signal is sparse or compressible, relies on the trans-informational capability of the measurement matrix employed and the resultant measurements, and operates with optimization-based algorithms for signal reconstruction. It is thus able to compress data while acquiring them, leading to sub-Nyquist sampling strategies that promote efficiency in data acquisition while ensuring certain accuracy criteria. Information theory provides a framework complementary to classic CS theory for analyzing information mechanisms and for determining the necessary number of measurements in a CS environment, such as CS-radar, a radar sensor conceptualized or designed with CS principles and techniques. Despite increasing awareness of information-theoretic perspectives on CS-radar, reported research has been rare. This paper seeks to bridge the gap in the interdisciplinary area of CS, radar and information theory by analyzing information flows in CS-radar, from sparse scenes to measurements, and by determining the sub-Nyquist sampling rates necessary for scene reconstruction within certain distortion thresholds, given differing scene sparsity and average per-sample signal-to-noise ratios (SNRs). Simulated studies were performed to complement and validate the information-theoretic analysis. The combined strategy proposed in this paper is valuable for information-theoretic oriented CS-radar system analysis and performance evaluation.
Esquivel, Rodolfo O; Molina-Espíritu, Moyocoyani; López-Rosa, Sheila; Soriano-Correa, Catalina; Barrientos-Salcedo, Carolina; Kohout, Miroslav; Dehesa, Jesús S
2015-08-24
In this work we undertake a pioneering information-theoretical analysis of 18 selected amino acids extracted from a natural protein, bacteriorhodopsin (1C3W). The conformational structures of each amino acid are analyzed by use of various quantum chemistry methodologies at high levels of theory: HF, M062X and CISD(Full). The Shannon entropy, Fisher information and disequilibrium are determined to grasp the spatial spreading features of delocalizability, order and uniformity of the optimized structures. These three entropic measures uniquely characterize all amino acids through a predominant information-theoretic quality scheme (PIQS), which gathers all chemical families by means of three major spreading features: delocalization, narrowness and uniformity. This scheme recognizes four major chemical families: aliphatic (delocalized), aromatic (delocalized), electro-attractive (narrowed) and tiny (uniform). All chemical families recognized by the existing energy-based classifications are embraced by this entropic scheme. Finally, novel chemical patterns are shown in the information planes associated with the PIQS entropic measures.
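Two of the measures named here are simple density functionals: the Shannon entropy S = -∫ρ ln ρ and the disequilibrium D = ∫ρ². On a grid, both reduce to sums. The sketch below checks them against the closed forms for a 1D Gaussian density, a stand-in for the molecular one-particle densities used in the paper; it is an illustration of the quantities, not the paper's methodology:

```python
import math

def entropy_and_disequilibrium(rho, dx):
    """Shannon entropy S = -integral(rho ln rho) and disequilibrium
    D = integral(rho^2) for a density sampled on a uniform grid with
    spacing dx (midpoint rule)."""
    s = -sum(r * math.log(r) for r in rho if r > 0) * dx
    d = sum(r * r for r in rho) * dx
    return s, d

# 1D Gaussian test density with sigma = 1 on [-10, 10].
sigma, n, lo, hi = 1.0, 4000, -10.0, 10.0
dx = (hi - lo) / n
rho = [math.exp(-((lo + (i + 0.5) * dx) ** 2) / (2 * sigma ** 2))
       / (sigma * math.sqrt(2 * math.pi)) for i in range(n)]
s, d = entropy_and_disequilibrium(rho, dx)
# Closed forms: S = 0.5*ln(2*pi*e*sigma^2) ~= 1.41894, D = 1/(2*sigma*sqrt(pi)) ~= 0.28209
```

A more delocalized density raises S and lowers D, which is the localizability/uniformity trade-off the PIQS scheme exploits.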
Open source tools for the information theoretic analysis of neural data.
Ince, Robin A A; Mazzoni, Alberto; Petersen, Rasmus S; Panzeri, Stefano
2010-01-01
The recent and rapid development of open source software tools for the analysis of neurophysiological datasets consisting of simultaneous multiple recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and for the integration of information obtained at different spatial and temporal scales. In this review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons, and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in MATLAB and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.
IMMAN: free software for information theory-based chemometric analysis.
Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo
2015-05-01
The features and theoretical background of a new and free computational program for chemometric analysis, denominated IMMAN (an acronym for Information theory-based CheMoMetrics ANalysis), are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches each. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. In addition, a generalization scheme for the previously defined differential Shannon entropy is discussed, and the Jeffreys information measure is introduced for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities such as missing-value processing, dataset partitioning, and browsing, and provides single-parameter or ensemble (multi-criteria) ranking options. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, and comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between the IMMAN and WEKA feature selection tools using the Arcene dataset demonstrated similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA supervised algorithms.
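Of the supervised filters listed in this abstract, information gain is the simplest to sketch: IG(class; feature) = H(class) − H(class | feature), computed on a discretized feature column. A minimal ranking sketch follows; the feature names and toy data are illustrative, and this is not IMMAN's own implementation:

```python
import math
from collections import Counter

def entropy(labels):
    """H(Y) in bits for a list of discrete class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature, labels):
    """IG = H(class) - H(class | feature) for a discrete feature column
    (e.g. one produced by equal-interval discretization)."""
    n = len(labels)
    groups = {}
    for f, y in zip(feature, labels):
        groups.setdefault(f, []).append(y)
    h_cond = sum(len(ys) / n * entropy(ys) for ys in groups.values())
    return entropy(labels) - h_cond

def rank_features(columns, labels):
    """Return feature names sorted by information gain, best first."""
    scores = {name: information_gain(col, labels) for name, col in columns.items()}
    return sorted(scores, key=scores.get, reverse=True)
```

A feature that perfectly predicts the class scores H(class) bits; an independent feature scores 0, so sorting by IG yields a rank-based filter of the kind IMMAN provides.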
Information Diffusion in Facebook-Like Social Networks Under Information Overload
NASA Astrophysics Data System (ADS)
Li, Pei; Xing, Kai; Wang, Dapeng; Zhang, Xin; Wang, Hui
2013-07-01
Research on social networks has received remarkable attention, since many people use social networks to broadcast information and stay connected with their friends. However, due to information overload in social networks, it becomes increasingly difficult for users to find useful information. This paper considers Facebook-like social networks and models the process of information diffusion under information overload. The term view scope is introduced to model the user information-processing capability under information overload, and the average number of times a message appears in view scopes after it is generated is proposed to characterize the information diffusion efficiency. Through theoretical analysis, we find, surprisingly, that factors such as network structure and view scope number have no impact on the information diffusion efficiency. To verify the results, we conduct simulations, whose results are in close agreement with the theoretical analysis.
Information theoretic analysis of linear shift-invariant edge-detection operators
NASA Astrophysics Data System (ADS)
Jiang, Bo; Rahman, Zia-ur
2012-06-01
Generally, the designs of digital image processing algorithms and image gathering devices remain separate. Consequently, the performance of digital image processing algorithms is evaluated without taking into account the influence of the image gathering process. However, experiments show that the image gathering process has a profound impact on the performance of digital image processing and the quality of the resulting images. Huck et al. proposed a definitive theoretical analysis of visual communication channels, in which the different parts, such as image gathering, processing, and display, are assessed in an integrated manner using Shannon's information theory. We perform an end-to-end information theory based system analysis to assess linear shift-invariant edge-detection algorithms. We evaluate the performance of the different algorithms as a function of the characteristics of the scene and the parameters, such as sampling and additive noise, that define the image gathering system. The edge-detection algorithm is regarded as having high performance only if the information rate from the scene to the edge image approaches its maximum possible value. This goal can be achieved only by jointly optimizing all processes. Our information-theoretic assessment provides a new tool that allows us to compare different linear shift-invariant edge detectors in a common environment.
Security Analysis of Selected AMI Failure Scenarios Using Agent Based Game Theoretic Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abercrombie, Robert K; Schlicher, Bob G; Sheldon, Frederick T
Information security analysis can be performed using game theory implemented in dynamic Agent Based Game Theoretic (ABGT) simulations. Such simulations can be verified with the results from game theory analysis and further used to explore larger scale, real world scenarios involving multiple attackers, defenders, and information assets. We concentrated our analysis on the Advanced Metering Infrastructure (AMI) functional domain, for which the National Electric Sector Cybersecurity Organization Resource (NESCOR) working group has currently documented 29 failure scenarios. The strategy for the game was developed by analyzing five electric sector representative failure scenarios contained in the AMI functional domain. From these five selected scenarios, we group them into three specific threat categories affecting confidentiality, integrity and availability (CIA). The analysis using our ABGT simulation demonstrates how to model the AMI functional domain using a set of rationalized game theoretic rules decomposed from the failure scenarios in terms of how those scenarios might impact the AMI network with respect to CIA.
Information-theoretic signatures of biodiversity in the barcoding gene.
Barbosa, Valmir C
2018-08-14
Analyzing the information content of DNA, though holding the promise to help quantify how the processes of evolution have led to information gain throughout the ages, has remained an elusive goal. Paradoxically, one of the main reasons for this has been precisely the great diversity of life on the planet: if on the one hand this diversity is a rich source of data for information-content analysis, on the other hand there is so much variation as to make the task unmanageable. During the past decade or so, however, succinct fragments of the COI mitochondrial gene, which is present in all animal phyla and in a few others, have been shown to be useful for species identification through DNA barcoding. A few million such fragments are now publicly available through the BOLD systems initiative, thus providing an unprecedented opportunity for relatively comprehensive information-theoretic analyses of DNA to be attempted. Here we show how a generalized form of total correlation can yield distinctive information-theoretic descriptors of the phyla represented in those fragments. In order to illustrate the potential of this analysis to provide new insight into the evolution of species, we performed principal component analysis on standardized versions of the said descriptors for 23 phyla. Surprisingly, we found that, though based solely on the species represented in the data, the first principal component correlates strongly with the natural logarithm of the number of all known living species for those phyla. The new descriptors thus constitute clear information-theoretic signatures of the processes whereby evolution has given rise to current biodiversity, which suggests their potential usefulness in further related studies.
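The total correlation underlying these descriptors reduces, in its basic form, to the difference between the summed per-position entropies and the joint entropy of aligned fragments. The following is a minimal plug-in sketch over toy aligned sequences; the fragments are invented, not COI data, and the paper's generalized and standardized form is not reproduced here:

```python
import math
from collections import Counter

def entropy_bits(samples):
    """Plug-in entropy in bits of a list of hashable observations."""
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in Counter(samples).values())

def total_correlation(fragments):
    """C = sum_i H(X_i) - H(X_1, ..., X_k) in bits, where X_i is the
    nucleotide at position i across equal-length aligned fragments."""
    k = len(fragments[0])
    marginal_sum = sum(entropy_bits([f[i] for f in fragments]) for i in range(k))
    joint = entropy_bits(fragments)  # each whole fragment is one joint symbol
    return marginal_sum - joint
```

C is zero exactly when the positions are statistically independent and grows with shared structure across positions, which is the property that makes it a candidate signature of within-phylum sequence constraints.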
A Generalized Information Theoretical Model for Quantum Secret Sharing
NASA Astrophysics Data System (ADS)
Bai, Chen-Ming; Li, Zhi-Hui; Xu, Ting-Ting; Li, Yong-Ming
2016-11-01
An information theoretical model for quantum secret sharing was introduced by H. Imai et al. (Quantum Inf. Comput. 5(1), 69-80, 2005) and analyzed using quantum information theory. In this paper, we analyze this information theoretical model using the properties of the quantum access structure. Based on this analysis, we propose a generalized model definition for quantum secret sharing schemes. In our model, more quantum access structures can be realized by our generalized quantum secret sharing schemes than by the previous one. In addition, we analyze two kinds of important quantum access structures to illustrate the existence and rationality of the generalized quantum secret sharing schemes, and consider the security of the scheme through simple examples.
Suggestions for presenting the results of data analyses
Anderson, David R.; Link, William A.; Johnson, Douglas H.; Burnham, Kenneth P.
2001-01-01
We give suggestions for the presentation of research results from frequentist, information-theoretic, and Bayesian analysis paradigms, followed by several general suggestions. The information-theoretic and Bayesian methods offer alternative approaches to data analysis and inference compared to traditionally used methods. Guidance is lacking on the presentation of results under these alternative procedures and on nontesting aspects of classical frequentist methods of statistical analysis. Null hypothesis testing has come under intense criticism. We recommend less reporting of the results of statistical tests of null hypotheses in cases where the null is surely false anyway, or where the null hypothesis is of little interest to science or management.
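In the information-theoretic paradigm the authors describe, results are typically presented as AIC differences and Akaike weights rather than null-hypothesis p-values. A minimal sketch of those two computations follows; the log-likelihoods below are invented for illustration, not results from any study:

```python
import math

def aic(log_likelihood, n_params):
    """Akaike's information criterion: AIC = -2 log L + 2K."""
    return -2.0 * log_likelihood + 2.0 * n_params

def akaike_weights(aics):
    """Model weights from AIC values: w_i proportional to exp(-delta_i / 2),
    where delta_i is each model's AIC difference from the best model."""
    best = min(aics)
    rel = [math.exp(-0.5 * (a - best)) for a in aics]
    total = sum(rel)
    return [r / total for r in rel]

# Two hypothetical candidate models fit to the same data set:
aics = [aic(-100.0, 3), aic(-99.5, 5)]  # the 5-parameter model fits slightly better
weights = akaike_weights(aics)
```

The weights sum to one and can be read as the relative support for each model, which is the kind of summary the authors recommend reporting in place of significance tests.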
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abercrombie, Robert K; Sheldon, Frederick T.
Cyber-physical computing infrastructures typically consist of a number of interconnected sites. Their operation critically depends on both cyber and physical components. Both types of components are subject to attacks of different kinds and frequencies, which must be accounted for in the initial provisioning and subsequent operation of the infrastructure via information security analysis. Information security analysis can be performed using game theory implemented in dynamic Agent Based Game Theoretic (ABGT) simulations. Such simulations can be verified with the results from game theory analysis and further used to explore larger scale, real world scenarios involving multiple attackers, defenders, and information assets. We concentrated our analysis on the electric sector failure scenarios and impact analyses documented by the NESCOR Working Group Study. From the Section 5 electric sector representative failure scenarios, we extracted the four generic failure scenarios and grouped them into three specific threat categories (confidentiality, integrity, and availability). These specific failure scenarios serve as a demonstration of our simulation. The analysis using our ABGT simulation demonstrates how to model the electric sector functional domain using a set of rationalized game theoretic rules decomposed from the failure scenarios in terms of how those scenarios might impact the cyber-physical infrastructure network with respect to CIA.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bader, Brett William; Chew, Peter A.; Abdelali, Ahmed
We describe an entirely statistics-based, unsupervised, and language-independent approach to multilingual information retrieval, which we call Latent Morpho-Semantic Analysis (LMSA). LMSA overcomes some of the shortcomings of related previous approaches such as Latent Semantic Analysis (LSA). LMSA has an important theoretical advantage over LSA: it combines well-known techniques in a novel way to break the terms of LSA down into units which correspond more closely to morphemes. Thus, it has a particular appeal for use with morphologically complex languages such as Arabic. We show through empirical results that the theoretical advantages of LMSA can translate into significant gains in precision in multilingual information retrieval tests. These gains are not matched either when a standard stemmer is used with LSA, or when terms are indiscriminately broken down into n-grams.
A toolbox for the fast information analysis of multiple-site LFP, EEG and spike train recordings
Magri, Cesare; Whittingstall, Kevin; Singh, Vanessa; Logothetis, Nikos K; Panzeri, Stefano
2009-01-01
Background Information theory is an increasingly popular framework for studying how the brain encodes sensory information. Despite its widespread use for the analysis of spike trains of single neurons and of small neural populations, its application to the analysis of other types of neurophysiological signals (EEGs, LFPs, BOLD) has remained relatively limited so far. This is due to the limited-sampling bias which affects calculation of information, to the complexity of the techniques to eliminate the bias, and to the lack of publicly available fast routines for the information analysis of multi-dimensional responses. Results Here we introduce a new C- and Matlab-based information theoretic toolbox, specifically developed for neuroscience data. This toolbox implements a novel computationally-optimized algorithm for estimating many of the main information theoretic quantities and bias correction techniques used in neuroscience applications. We illustrate and test the toolbox in several ways. First, we verify that these algorithms provide accurate and unbiased estimates of the information carried by analog brain signals (i.e. LFPs, EEGs, or BOLD) even when using limited amounts of experimental data. This test is important since existing algorithms had so far been tested primarily on spike trains. Second, we apply the toolbox to the analysis of EEGs recorded from a subject watching natural movies, and we characterize the electrode locations, frequencies and signal features carrying the most visual information. Third, we explain how the toolbox can be used to break down the information carried by different features of the neural signal into distinct components reflecting different ways in which correlations between parts of the neural signal contribute to coding. We illustrate this breakdown by analyzing LFPs recorded from primary visual cortex during presentation of naturalistic movies.
Conclusion The new toolbox presented here implements fast and data-robust computations of the most relevant quantities used in information theoretic analysis of neural data. The toolbox can be easily used within Matlab, the environment used by most neuroscience laboratories for the acquisition, preprocessing and plotting of neural data. It can therefore significantly enlarge the domain of application of information theory to neuroscience, and lead to new discoveries about the neural code. PMID:19607698
A toolbox for the fast information analysis of multiple-site LFP, EEG and spike train recordings.
Magri, Cesare; Whittingstall, Kevin; Singh, Vanessa; Logothetis, Nikos K; Panzeri, Stefano
2009-07-16
Information theory is an increasingly popular framework for studying how the brain encodes sensory information. Despite its widespread use for the analysis of spike trains of single neurons and of small neural populations, its application to the analysis of other types of neurophysiological signals (EEGs, LFPs, BOLD) has remained relatively limited so far. This is due to the limited-sampling bias which affects calculation of information, to the complexity of the techniques to eliminate the bias, and to the lack of publicly available fast routines for the information analysis of multi-dimensional responses. Here we introduce a new C- and Matlab-based information theoretic toolbox, specifically developed for neuroscience data. This toolbox implements a novel computationally-optimized algorithm for estimating many of the main information theoretic quantities and bias correction techniques used in neuroscience applications. We illustrate and test the toolbox in several ways. First, we verify that these algorithms provide accurate and unbiased estimates of the information carried by analog brain signals (i.e. LFPs, EEGs, or BOLD) even when using limited amounts of experimental data. This test is important since existing algorithms were so far tested primarily on spike trains. Second, we apply the toolbox to the analysis of EEGs recorded from a subject watching natural movies, and we characterize the electrodes locations, frequencies and signal features carrying the most visual information. Third, we explain how the toolbox can be used to break down the information carried by different features of the neural signal into distinct components reflecting different ways in which correlations between parts of the neural signal contribute to coding. We illustrate this breakdown by analyzing LFPs recorded from primary visual cortex during presentation of naturalistic movies. 
The new toolbox presented here implements fast and data-robust computations of the most relevant quantities used in information theoretic analysis of neural data. The toolbox can be easily used within Matlab, the environment used by most neuroscience laboratories for the acquisition, preprocessing and plotting of neural data. It can therefore significantly enlarge the domain of application of information theory to neuroscience, and lead to new discoveries about the neural code.
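The limited-sampling bias this abstract refers to arises because plug-in entropy estimates from finite data are biased downward. A minimal illustration of the idea (the first-order Miller-Madow correction, sketched here for exposition; it is not a routine from the toolbox itself) looks like this:

```python
import math
import random
from collections import Counter

def plugin_entropy(samples):
    """Naive (plug-in) entropy estimate, in bits, from discrete samples."""
    n = len(samples)
    return -sum((c / n) * math.log2(c / n)
                for c in Counter(samples).values())

def miller_madow_entropy(samples):
    """Plug-in estimate plus the first-order bias correction
    (K - 1) / (2 N ln 2), where K is the number of occupied bins."""
    n = len(samples)
    k = len(set(samples))
    return plugin_entropy(samples) + (k - 1) / (2 * n * math.log(2))

# With few samples from a fair 4-sided die, the plug-in estimate tends to
# fall below the true entropy (2 bits); the correction pushes it back up.
random.seed(0)
samples = [random.randrange(4) for _ in range(40)]
naive = plugin_entropy(samples)
corrected = miller_madow_entropy(samples)
```

More elaborate corrections (e.g. extrapolation or shuffling estimators of the kind the toolbox implements) follow the same pattern: compute the naive estimate, then subtract an explicit model of its bias.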
ERIC Educational Resources Information Center
Mikkelsen, Kim Sass
2017-01-01
Contemporary case studies rely on verbal arguments and set theory to build or evaluate theoretical claims. While existing procedures excel in the use of qualitative information (information about kind), they ignore quantitative information (information about degree) at central points of the analysis. Effectively, contemporary case studies rely on…
Thermophysical Properties of Selected Aerospace Materials. Part 1. Thermal Radiative Properties
1976-01-01
Each material is covered by a text section and a specification table. The former reviews and discusses the available data and information, the theoretical guidelines, and other factors on which the critical evaluation, analysis, and synthesis of the data are based. [A nomenclature list follows, defining symbols such as θ′ (zenith angle for viewing conditions), Δθ (half angle of acceptance of the optical system), κ (loss value factor), λ (wavelength), and ρ (reflectance).]
On the Correct Analysis of the Foundations of Theoretical Physics
NASA Astrophysics Data System (ADS)
Kalanov, Temur Z.
2007-04-01
The problem of truth in science, the most urgent problem of our time, is discussed, and a correct theoretical analysis of the foundations of theoretical physics is proposed. The principle of the unity of formal logic and rational dialectics is the methodological basis of the analysis. The main result is as follows: the generally accepted foundations of theoretical physics (i.e. Newtonian mechanics, Maxwell electrodynamics, thermodynamics, statistical physics and physical kinetics, the theory of relativity, quantum mechanics) contain a set of logical errors. These errors are explained by the existence of a global cause: they are a collateral and inevitable result of the inductive way of cognizing Nature, i.e. of the movement from the formation of separate concepts to the formation of a system of concepts. Consequently, theoretical physics has entered its greatest crisis, which means that physics is leaving the stage of progress as a science of phenomena to become a science of essence (information). Acknowledgment: The books ``Surprises in Theoretical Physics'' (1979) and ``More Surprises in Theoretical Physics'' (1991) by Sir Rudolf Peierls stimulated my 25-year work.
The Educational (Im)possibility for Dietetics: A Poststructural Discourse Analysis
ERIC Educational Resources Information Center
Gingras, Jacqui
2009-01-01
Inquiring into the theoretical underpinnings of dietetic curriculum provides a means for further understanding who dietitians are (identity) and what dietitians do (performativity). Since dietetic curriculum exists as a structural influence on the dietetic student identity, it is worth inquiring into how such a structure is theoretically informed,…
NASA Astrophysics Data System (ADS)
Chernyavskiy, Andrey; Khamitov, Kamil; Teplov, Alexey; Voevodin, Vadim; Voevodin, Vladimir
2016-10-01
In recent years, quantum information technologies (QIT) have developed rapidly, but their implementation faces serious difficulties, some of which are challenging computational tasks. This work is devoted to a deep and broad analysis of the parallel algorithmic properties of such tasks. As an example we take one- and two-qubit transformations of a many-qubit quantum state, which are the most critical kernels of many important QIT applications. The analysis of the algorithms uses the methodology of the AlgoWiki project (algowiki-project.org) and consists of two parts: theoretical and experimental. The theoretical part covers features such as sequential and parallel complexity, macro structure, and the visual information graph. The experimental part was carried out on the petascale Lomonosov supercomputer (Moscow State University, Russia) and includes the analysis of locality and memory access, scalability, and a set of more specific dynamic characteristics of the implementation. This approach allowed us to identify bottlenecks and generate ideas for efficiency improvements.
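The one-qubit transformation kernel analyzed above touches pairs of amplitudes whose indices differ only in the target bit, which is precisely what makes its memory-access pattern and parallel structure interesting. A minimal pure-Python sketch of such a kernel (illustrative only, not the AlgoWiki implementation) is:

```python
import math

def apply_one_qubit_gate(state, gate, target, n_qubits):
    """Apply a 2x2 gate to qubit `target` (0 = least significant) of an
    n-qubit state vector of length 2**n_qubits, in place."""
    stride = 1 << target
    (a, b), (c, d) = gate
    for base in range(0, 1 << n_qubits, stride << 1):
        for offset in range(stride):
            i0 = base + offset      # amplitude with target bit = 0
            i1 = i0 + stride        # amplitude with target bit = 1
            x0, x1 = state[i0], state[i1]
            state[i0] = a * x0 + b * x1
            state[i1] = c * x0 + d * x1
    return state

# Hadamard on qubit 0 of |000>: spreads amplitude over that qubit.
h = 1 / math.sqrt(2)
H = ((h, h), (h, -h))
state = [0.0] * 8
state[0] = 1.0
apply_one_qubit_gate(state, H, target=0, n_qubits=3)
```

Note that for target qubit k the stride is 2^k, so the same algorithm sweeps memory with very different locality depending on the target; that is the kind of dynamic characteristic the study measures on real hardware.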
Deep and Structured Robust Information Theoretic Learning for Image Analysis.
Deng, Yue; Bao, Feng; Deng, Xuesong; Wang, Ruiping; Kong, Youyong; Dai, Qionghai
2016-07-07
This paper presents a robust information theoretic (RIT) model to reduce the uncertainties, i.e. missing and noisy labels, in general discriminative data representation tasks. The fundamental pursuit of our model is to simultaneously learn a transformation function and a discriminative classifier that maximize the mutual information of data and their labels in the latent space. In this general paradigm, we respectively discuss three types of the RIT implementations with linear subspace embedding, deep transformation and structured sparse learning. In practice, the RIT and deep RIT are exploited to solve the image categorization task whose performances will be verified on various benchmark datasets. The structured sparse RIT is further applied to a medical image analysis task for brain MRI segmentation that allows group-level feature selections on the brain tissues.
Spreading dynamics of an e-commerce preferential information model on scale-free networks
NASA Astrophysics Data System (ADS)
Wan, Chen; Li, Tao; Guan, Zhi-Hong; Wang, Yuanmei; Liu, Xiongding
2017-02-01
In order to study the influence of the preferential degree and the heterogeneity of underlying networks on the spread of preferential e-commerce information, we propose a novel susceptible-infected-beneficial model based on scale-free networks. The spreading dynamics of the preferential information are analyzed in detail using the mean-field theory. We determine the basic reproductive number and equilibria. The theoretical analysis indicates that the basic reproductive number depends mainly on the preferential degree and the topology of the underlying networks. We prove the global stability of the information-elimination equilibrium. The permanence of preferential information and the global attractivity of the information-prevailing equilibrium are also studied in detail. Some numerical simulations are presented to verify the theoretical results.
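Dynamics of this susceptible-infected-beneficial kind can be prototyped with a small agent-based simulation. The sketch below uses a standard Barabási-Albert preferential-attachment graph and illustrative rates (`beta`, `delta` are assumptions, not the paper's mean-field parameters):

```python
import random

def barabasi_albert(n, m, seed=1):
    """Scale-free network via preferential attachment; adjacency as sets."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    targets = list(range(m))        # nodes the next newcomer attaches to
    pool = []                       # degree-weighted endpoint pool
    for new in range(m, n):
        for t in set(targets):
            adj[new].add(t)
            adj[t].add(new)
        pool.extend(targets)
        pool.extend([new] * m)
        targets = [rng.choice(pool) for _ in range(m)]
    return adj

def simulate_sib(adj, beta=0.3, delta=0.2, steps=50, seed=2):
    """Discrete-time S -> I -> B spreading from a single seed node."""
    rng = random.Random(seed)
    state = {v: "S" for v in adj}
    state[0] = "I"                  # seed the preferential information
    for _ in range(steps):
        nxt = dict(state)
        for v, s in state.items():
            if s == "S":
                k = sum(1 for u in adj[v] if state[u] == "I")
                if k and rng.random() < 1 - (1 - beta) ** k:
                    nxt[v] = "I"    # informed by an infected neighbor
            elif s == "I" and rng.random() < delta:
                nxt[v] = "B"        # infected user becomes a beneficiary
        state = nxt
    return state

final = simulate_sib(barabasi_albert(200, 3))
```

Sweeping `beta` over networks of different degree heterogeneity reproduces, qualitatively, the dependence of the epidemic threshold on topology that the mean-field analysis derives exactly.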
2015-03-01
[Surviving fragment: a figure caption and a reference. Fig. 2: values of 7 information-theoretic criteria plotted against the model order used; the legend is labeled according to the figures in which the power spectra appear (Brovelli et al. 2004). Reference: Identification of directed influence: Granger causality, Kullback-Leibler divergence, and complexity. Neural Computation. 2012;24(7):1722-1739. doi:10.1162]
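The record above survives only as a caption about information-theoretic criteria plotted against model order. The underlying procedure (fit autoregressive models of increasing order, then keep the order that minimizes a criterion such as AIC) can be sketched as follows; this is a generic illustration, not the report's own code:

```python
import math
import random

def autocorr(x, maxlag):
    """Biased sample autocovariances r[0..maxlag]."""
    n = len(x)
    m = sum(x) / n
    xc = [v - m for v in x]
    return [sum(xc[t] * xc[t + k] for t in range(n - k)) / n
            for k in range(maxlag + 1)]

def levinson(r, order):
    """Levinson-Durbin recursion: prediction polynomial and residual variance."""
    a, e = [1.0], r[0]
    for k in range(1, order + 1):
        lam = -sum(a[j] * r[k - j] for j in range(len(a))) / e
        a = [a[j] + lam * (a[k - j] if k - j < len(a) else 0.0)
             for j in range(k)] + [lam]
        e *= 1.0 - lam * lam
    return a, e

def aic_order(x, max_order):
    """Order minimizing AIC = N ln(residual variance) + 2 * order."""
    n, r = len(x), autocorr(x, max_order)
    return min(range(1, max_order + 1),
               key=lambda p: n * math.log(levinson(r, p)[1]) + 2 * p)

# Synthetic AR(2) data: x[t] = 0.75 x[t-1] - 0.5 x[t-2] + noise.
rng = random.Random(5)
x, x1, x2 = [], 0.0, 0.0
for _ in range(2000):
    v = 0.75 * x1 - 0.5 * x2 + rng.gauss(0.0, 1.0)
    x.append(v)
    x1, x2 = v, x1
order = aic_order(x, 8)
```

Plotting the criterion against candidate order, as the lost figure did, typically shows a sharp drop up to the true order followed by a shallow penalty-driven rise.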
The Role of Trust in Information Science and Technology.
ERIC Educational Resources Information Center
Marsh, Stephen; Dibben, Mark R.
2003-01-01
Discusses the notion of trust as it relates to information science and technology, specifically user interfaces, autonomous agents, and information systems. Highlights include theoretical meaning of trust; trust and levels of analysis, including organizational trust; electronic commerce, user interfaces, and static trust; dynamic trust; and trust…
Information theoretic analysis of edge detection in visual communication
NASA Astrophysics Data System (ADS)
Jiang, Bo; Rahman, Zia-ur
2010-08-01
Generally, the designs of digital image processing algorithms and image gathering devices remain separate. Consequently, the performance of digital image processing algorithms is evaluated without taking into account the artifacts introduced by the image gathering process. However, experiments show that the image gathering process profoundly impacts the performance of digital image processing and the quality of the resulting images. Huck et al. proposed a definitive theoretical analysis of visual communication channels, where the different parts, such as image gathering, processing, and display, are assessed in an integrated manner using Shannon's information theory. In this paper, we perform an end-to-end information-theoretic system analysis to assess edge detection methods. We evaluate the performance of the different algorithms as a function of the characteristics of the scene and of the parameters, such as sampling and additive noise, that define the image gathering system. An edge detection algorithm is regarded as high-performing only if the information rate from the scene to the edge map approaches the maximum possible; this goal can be achieved only by jointly optimizing all processes. People generally use subjective judgment to compare different edge detection methods; there has been no common tool for evaluating the performance of the different algorithms and guiding the selection of the best algorithm for a given system or scene. Our information-theoretic assessment provides such a tool, allowing the different edge detection operators to be compared in a common environment.
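A crude stand-in for the final link of such an assessment is the mutual information between a reference edge map and a detector's output, treated as discrete sequences. The sketch below uses hypothetical toy data and is only illustrative of the scoring idea, not of Huck et al.'s full channel analysis:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Mutual information, in bits, between two equal-length discrete sequences."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    return sum((c / n) * math.log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

# Toy reference edge map (1 = edge pixel) and two "detectors":
ref = [1, 0, 0, 1, 1, 0, 0, 0]
perfect = list(ref)         # reproduces the reference exactly
blind = [0] * len(ref)      # outputs nothing, carries no information
```

A perfect detector attains the full entropy of the reference map, while an uninformative one attains zero bits; real detectors under noise and sampling artifacts fall in between, which is what makes the measure usable as a common yardstick.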
Detection of allosteric signal transmission by information-theoretic analysis of protein dynamics
Pandini, Alessandro; Fornili, Arianna; Fraternali, Franca; Kleinjung, Jens
2012-01-01
Allostery offers a highly specific way to modulate protein function. Therefore, understanding this mechanism is of increasing interest for protein science and drug discovery. However, allosteric signal transmission is difficult to detect experimentally and to model because it is often mediated by local structural changes propagating along multiple pathways. To address this, we developed a method to identify communication pathways by an information-theoretical analysis of molecular dynamics simulations. Signal propagation was described as information exchange through a network of correlated local motions, modeled as transitions between canonical states of protein fragments. The method was used to describe allostery in two-component regulatory systems. In particular, the transmission from the allosteric site to the signaling surface of the receiver domain NtrC was shown to be mediated by a layer of hub residues. The location of hubs preferentially connected to the allosteric site was found in close agreement with key residues experimentally identified as involved in the signal transmission. The comparison with the networks of the homologues CheY and FixJ highlighted similarities in their dynamics. In particular, we showed that a preorganized network of fragment connections between the allosteric and functional sites exists already in the inactive state of all three proteins.—Pandini, A., Fornili, A., Fraternali, F., Kleinjung, J. Detection of allosteric signal transmission by information-theoretic analysis of protein dynamics. PMID:22071506
He, Meilin; Devine, Laura; Zhuang, Jun
2018-02-01
The government, the private sector, and other users of the Internet are increasingly faced with the risk of cyber incidents. Damage to computer systems and theft of sensitive data caused by cyber attacks have the potential to result in lasting harm to entities under attack, or to society as a whole. The effects of cyber attacks are not always obvious, and detecting them is not a simple proposition. As the U.S. federal government believes that information sharing on cybersecurity issues among organizations is essential to safety, security, and resilience, the importance of trusted information exchange has been emphasized to support public and private decision making by encouraging the creation of the Information Sharing and Analysis Center (ISAC). Through a decision-theoretic approach, this article provides new perspectives on ISAC and on the advent of the new Information Sharing and Analysis Organizations (ISAOs), which are intended to provide similar benefits to organizations that cannot fit easily into the ISAC structure. To help understand the processes of information sharing against cyber threats, this article illustrates 15 representative information sharing structures between ISAC, government, and other participating entities, and provides discussions on the strategic interactions between different stakeholders. This article also identifies the costs of information sharing and information security borne by different parties in this public-private partnership both before and after cyber attacks, as well as the two main benefits. This article provides perspectives on the mechanism of information sharing and some detailed cost-benefit analysis. © 2017 Society for Risk Analysis.
ERIC Educational Resources Information Center
Skinner, Ann
2018-01-01
Resource-based theory provided the theoretical foundation to investigate the extent that developer knowledge correlated to success of information technology (IT) development projects. Literature indicated there was a knowledge gap in understanding whether developer information system development, behavior and business knowledge contributed to IT…
Multiscale analysis of information dynamics for linear multivariate processes.
Faes, Luca; Montalto, Alessandro; Stramaglia, Sebastiano; Nollo, Giandomenico; Marinazzo, Daniele
2016-08-01
In the study of complex physical and physiological systems represented by multivariate time series, an issue of great interest is the description of the system dynamics over a range of different temporal scales. While information-theoretic approaches to the multiscale analysis of complex dynamics are being increasingly used, the theoretical properties of the applied measures are poorly understood. This study introduces for the first time a framework for the analytical computation of information dynamics for linear multivariate stochastic processes explored at different time scales. After showing that the multiscale processing of a vector autoregressive (VAR) process introduces a moving average (MA) component, we describe how to represent the resulting VARMA process using state-space (SS) models and how to exploit the SS model parameters to compute analytical measures of information storage and information transfer for the original and rescaled processes. The framework is then used to quantify multiscale information dynamics for simulated unidirectionally and bidirectionally coupled VAR processes, showing that rescaling may lead to insightful patterns of information storage and transfer but also to potentially misleading behaviors.
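For Gaussian processes these quantities have closed forms. For example, for an AR(1) process x[n] = a x[n-1] + w[n], the information storage is S = -0.5 ln(1 - a^2) nats, since the lag-one correlation equals a. The sketch below checks that formula empirically with a naive correlation estimator; it is a toy stand-in for the authors' state-space computations, not their method:

```python
import math
import random

def ar1_series(a, n, seed=3):
    """Simulate x[t] = a * x[t-1] + white Gaussian noise."""
    rng = random.Random(seed)
    x, out = 0.0, []
    for _ in range(n):
        x = a * x + rng.gauss(0.0, 1.0)
        out.append(x)
    return out

def lag1_corr(x):
    """Sample lag-one autocorrelation."""
    n = len(x)
    m = sum(x) / n
    num = sum((x[t] - m) * (x[t + 1] - m) for t in range(n - 1))
    den = sum((v - m) ** 2 for v in x)
    return num / den

a = 0.6
analytic = -0.5 * math.log(1.0 - a * a)    # information storage, nats
x = ar1_series(a, 20000)
rho = lag1_corr(x)
estimated = -0.5 * math.log(1.0 - rho * rho)
```

Averaging consecutive samples (the multiscale step) turns this AR process into an ARMA one, which is exactly why the paper's state-space treatment is needed beyond toy cases like this.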
ERIC Educational Resources Information Center
Virkkunen, Jaakko; Ristimaki, Paivi
2012-01-01
In this article, we study the relationships between culturally existing general strategy concepts and a small information and communication technology firm's specific strategic challenge in its management team's search for a new strategy concept. We apply three theoretical ideas of cultural historical activity theory: (a) the idea of double…
ERIC Educational Resources Information Center
Song, Jing
2008-01-01
From the perspective of asymmetric information, a principal-agent model is used to put forward a new theoretical explanation for the validity and effectiveness of tenure. Furthermore, through an analysis of the conditions of implementing an effective tenure system, it is argued that such a system is more efficient in research universities. Based…
A New Methodology for Systematic Exploitation of Technology Databases.
ERIC Educational Resources Information Center
Bedecarrax, Chantal; Huot, Charles
1994-01-01
Presents the theoretical aspects of a data analysis methodology that can help transform sequential raw data from a database into useful information, using the statistical analysis of patents as an example. Topics discussed include relational analysis and a technology watch approach. (Contains 17 references.) (LRW)
Wilson, Kumanan; Code, Catherine; Dornan, Christopher; Ahmad, Nadya; Hébert, Paul; Graham, Ian
2004-01-01
Background The media play an important role at the interface of science and policy by communicating scientific information to the public and policy makers. In issues of theoretical risk, in which there is scientific uncertainty, the media's role as disseminators of information is particularly important due to the potential to influence public perception of the severity of the risk. In this article we describe how the Canadian print media reported the theoretical risk of blood transmission of Creutzfeldt-Jakob disease (CJD). Methods We searched 3 newspaper databases for articles published by 6 major Canadian daily newspapers between January 1990 and December 1999. We identified all articles relating to blood transmission of CJD. In duplicate we extracted information from the articles and entered the information into a qualitative software program. We compared the observations obtained from this content analysis with information obtained from a previous policy analysis examining the Canadian blood system's decision-making concerning the potential transfusion transmission of CJD. Results Our search identified 245 relevant articles. We observed that newspapers in one instance accelerated a policy decision, which had important resource and health implications, by communicating information on risk to the public. We also observed that newspapers primarily relied upon expert opinion (47 articles) as opposed to published medical evidence (28 articles) when communicating risk information. Journalists we interviewed described the challenges of balancing their responsibility to raise awareness of potential health threats with not unnecessarily arousing fear amongst the public. Conclusions Based on our findings we recommend that journalists report information from both expert opinion sources and from published studies when communicating information on risk.
We also recommend researchers work more closely with journalists to assist them in identifying and appraising relevant scientific information on risk. PMID:14706119
ERIC Educational Resources Information Center
Yang, Qinghua; Yang, Fan; Zhou, Chun
2015-01-01
Purpose: The purpose of this paper is to investigate how the information about haze, a term used in China to describe the air pollution problem, is portrayed on Chinese social media by different types of organizations using the theoretical framework of the health belief model (HBM). Design/methodology/approach: A content analysis was conducted…
Varn, D P; Crutchfield, J P
2016-03-13
Erwin Schrödinger famously and presciently ascribed the vehicle transmitting the hereditary information underlying life to an 'aperiodic crystal'. We compare and contrast this, only later discovered to be stored in the linear biomolecule DNA, with the information-bearing, layered quasi-one-dimensional materials investigated by the emerging field of chaotic crystallography. Despite differences in functionality, the same information measures capture structure and novelty in both, suggesting an intimate coherence between the information character of biotic and abiotic matter: a broadly applicable physics of information. We review layered solids and consider three examples of how information- and computation-theoretic techniques are being applied to understand their structure. In particular, (i) we review recent efforts to apply new kinds of information measures to quantify disordered crystals; (ii) we discuss the structure of ice I in information-theoretic terms; and (iii) we recount recent investigations into the structure of tris(bicyclo[2.1.1]hexeno)benzene, showing how an information-theoretic analysis yields additional insight into its structure. We then illustrate a new Second Law of Thermodynamics that describes information processing in active low-dimensional materials, reviewing Maxwell's Demon and a new class of molecular devices that act as information catalysts. Lastly, we conclude by speculating on how these ideas from informational materials science may impact biology. © 2016 The Author(s).
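One of the simplest information measures used to quantify layered structures is the block entropy of the stacking sequence, whose growth with block length separates periodic from disordered layering. A toy sketch with synthetic two-letter stacking sequences (illustrative data, not from the paper):

```python
import math
import random
from collections import Counter

def block_entropy(seq, L):
    """Shannon entropy (bits) of length-L blocks in a stacking sequence."""
    blocks = [seq[i:i + L] for i in range(len(seq) - L + 1)]
    n = len(blocks)
    return -sum((c / n) * math.log2(c / n)
                for c in Counter(blocks).values())

periodic = "ABAB" * 250                             # ideal periodic stacking
rng = random.Random(4)
disordered = "".join(rng.choice("AB") for _ in range(1000))
# The periodic sequence's block entropy saturates (only 2 blocks occur at
# any length), while the random sequence's keeps growing with L.
```

Chaotic-crystallography analyses push this further, e.g. subtracting successive block entropies to estimate the entropy rate and the stored "memory" of the stacking process.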
McParland, Joanna L; Williams, Lynn; Gozdzielewska, Lucyna; Young, Mairi; Smith, Fraser; MacDonald, Jennifer; Langdridge, Darren; Davis, Mark; Price, Lesley; Flowers, Paul
2018-05-27
Changing public awareness of antimicrobial resistance (AMR) represents a global public health priority. A systematic review of interventions that targeted public AMR awareness and associated behaviour was previously conducted. Here, we focus on identifying the active content of these interventions and explore potential mechanisms of action. The project took a novel approach to intervention mapping utilizing the following steps: (1) an exploration of explicit and tacit theory and theoretical constructs within the interventions using the Theoretical Domains Framework (TDFv2), (2) retrospective coding of behaviour change techniques (BCTs) using the BCT Taxonomy v1, and (3) an investigation of coherent links between the TDF domains and BCTs across the interventions. Of 20 studies included, only four reported an explicit theoretical basis to their intervention. However, TDF analysis revealed that nine of the 14 TDF domains were utilized, most commonly 'Knowledge' and 'Environmental context and resources'. The BCT analysis showed that all interventions contained at least one BCT, and 14 of 93 (15%) BCTs were coded, most commonly 'Information about health consequences', 'Credible source', and 'Instruction on how to perform the behaviour'. We identified nine relevant TDF domains and 14 BCTs used in these interventions. Only 15% of BCTs have been applied in AMR interventions, thus providing a clear opportunity for the development of novel interventions in this context. This methodological approach provides a useful way of retrospectively mapping theoretical constructs and BCTs when reviewing studies that provide limited information on theory and intervention content.
Statement of contribution
What is already known on this subject? Evidence of the effectiveness of interventions that target the public to engage them with AMR is mixed; the public continue to show poor knowledge and misperceptions of AMR. Little is known about the common, active ingredients of AMR interventions targeting the public, and information on explicit theoretical content is sparse. Information on the components of AMR public health interventions is urgently needed to enable the design of effective interventions to engage the public with AMR stewardship behaviour.
What does this study add? The analysis shows very few studies reported any explicit theoretical basis to the interventions they described. Many interventions share common components, including core mechanisms of action and behaviour change techniques. The analysis suggests components of future interventions to engage the public with AMR. © 2018 The Authors. British Journal of Health Psychology published by John Wiley & Sons Ltd on behalf of British Psychological Society.
ERIC Educational Resources Information Center
Jonson, Jessica L.; Thompson, Robert J., Jr.; Guetterman, Timothy C.; Mitchell, Nancy
2017-01-01
Increasing the use of learning outcome assessments to inform educational decisions is a major challenge in higher education. For this study we used a sense-making theoretical perspective to guide an analysis of the relationship of information characteristics and faculty assessment knowledge and beliefs with the use of general education assessment…
A Thematic Analysis of Theoretical Models for Translational Science in Nursing: Mapping the Field
Mitchell, Sandra A.; Fisher, Cheryl A.; Hastings, Clare E.; Silverman, Leanne B.; Wallen, Gwenyth R.
2010-01-01
Background The quantity and diversity of conceptual models in translational science may complicate rather than advance the use of theory. Purpose This paper offers a comparative thematic analysis of the models available to inform knowledge development, transfer, and utilization. Method Literature searches identified 47 models for knowledge translation. Four thematic areas emerged: (1) evidence-based practice and knowledge transformation processes; (2) strategic change to promote adoption of new knowledge; (3) knowledge exchange and synthesis for application and inquiry; (4) designing and interpreting dissemination research. Discussion This analysis distinguishes the contributions made by leaders and researchers at each phase in the process of discovery, development, and service delivery. It also informs the selection of models to guide activities in knowledge translation. Conclusions A flexible theoretical stance is essential to simultaneously develop new knowledge and accelerate the translation of that knowledge into practice behaviors and programs of care that support optimal patient outcomes. PMID:21074646
Investigating nurse practitioners in the private sector: a theoretically informed research protocol.
Adams, Margaret; Gardner, Glenn; Yates, Patsy
2017-06-01
To report a study protocol, and the theoretical framework (normalisation process theory) that informs it, for a case study investigation of private sector nurse practitioners. Most research evaluating nurse practitioner service is focused on public, mainly acute care environments where nurse practitioner service is well established with strong structures for governance and sustainability. Conversely, there is a lack of clarity in governance for emerging models in the private sector. In a climate of healthcare reform, nurse practitioner service is extending beyond the familiar public health sector. Further research is required to inform knowledge of the practice, operational framework and governance of new nurse practitioner models. The proposed research will use a multiple exploratory case study design to examine private sector nurse practitioner service. Data collection includes interviews, surveys and audits. A sequential mixed method approach to analysis of each case will be conducted. Findings from within-case analysis will lead to a meta-synthesis across all four cases to gain a holistic understanding of the phenomenon under study, private sector nurse practitioner service. Normalisation process theory will be used to guide the research process, specifically the coding and analysis of data using theory constructs and the relevant components associated with those constructs. This article provides a blueprint for the research and describes a theoretical framework, normalisation process theory, in terms of its flexibility as an analytical framework. Consistent with the goals of best research practice, this study protocol will inform the research community in the field of primary health care about emerging research in this field. Publishing a study protocol ensures researcher fidelity to the analysis plan and supports research collaboration across teams. © 2016 John Wiley & Sons Ltd.
Dunn-Walters, Deborah K.; Belelovsky, Alex; Edelman, Hanna; Banerjee, Monica; Mehr, Ramit
2002-01-01
We have developed a rigorous graph-theoretical algorithm for quantifying the shape properties of mutational lineage trees. We show that information about the dynamics of hypermutation and antigen-driven clonal selection during the humoral immune response is contained in the shape of mutational lineage trees deduced from the responding clones. Age and tissue related differences in the selection process can be studied using this method. Thus, tree shape analysis can be used as a means of elucidating humoral immune response dynamics in various situations. PMID:15144020
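Tree-shape quantification of the kind described can be sketched with a few standard measures computed from a child-to-parent map. The lineage below is a hypothetical toy example, and the measures shown are generic illustrations rather than the authors' full graph-theoretical set:

```python
from collections import defaultdict

def tree_shape(parent):
    """Shape measures of a lineage tree given a child -> parent mapping."""
    children = defaultdict(list)
    nodes = set(parent) | set(parent.values())
    root = (set(parent.values()) - set(parent)).pop()
    for c, p in parent.items():
        children[p].append(c)
    leaves = [v for v in nodes if not children[v]]

    def depth(v):
        d = 0
        while v != root:
            v = parent[v]
            d += 1
        return d

    depths = [depth(v) for v in leaves]
    return {"nodes": len(nodes), "leaves": len(leaves),
            "max_depth": max(depths),
            "mean_leaf_depth": sum(depths) / len(depths)}

# Toy lineage: germline root "r" with two diverging mutant branches.
parent = {"a": "r", "b": "r", "c": "a", "d": "a", "e": "c"}
shape = tree_shape(parent)
```

Comparing such statistics across trees from different ages or tissues is how differences in hypermutation and selection dynamics become visible: strong antigen-driven selection tends to produce deeper, more elongated trees than neutral expansion.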
Health information systems: a survey of frameworks for developing countries.
Marcelo, A B
2010-01-01
The objective of this paper is to perform a survey of excellent research on health information systems (HIS) analysis and design, and their underlying theoretical frameworks. It classifies these frameworks along major themes, and analyzes the different approaches to HIS development that are practical in resource-constrained environments. Literature review based on PubMed citations and conference proceedings, as well as Internet searches on information systems in general, and health information systems in particular. The field of health information systems development has been studied extensively. Despite this, failed implementations are still common. Theoretical frameworks for HIS development are available that can guide implementers. As awareness, acceptance, and demand for health information systems increase globally, the variety of approaches and strategies will also follow. For developing countries with scarce resources, a trial-and-error approach can be very costly. Lessons from the successes and failures of initial HIS implementations have been abstracted into theoretical frameworks. These frameworks organize complex HIS concepts into methodologies that standardize techniques in implementation. As globalization continues to impact healthcare in the developing world, demand for more responsive health systems will become urgent. More comprehensive frameworks and practical tools to guide HIS implementers will be imperative.
NASA Astrophysics Data System (ADS)
Prasai, Binay; Wilson, A. R.; Wiley, B. J.; Ren, Y.; Petkov, Valeri
2015-10-01
We scrutinize the extent to which current theoretical modeling alone can reveal real-world metallic nanoparticles (NPs) at the atomic level, demonstrate that it is insufficient, and show how it can be improved by a pragmatic approach involving straightforward experiments. In particular, 4 to 6 nm silica-supported Au100-xPdx (x = 30, 46 and 58) NPs explored for catalytic applications are characterized structurally by total scattering experiments, including high-energy synchrotron X-ray diffraction (XRD) coupled to atomic pair distribution function (PDF) analysis. Atomic-level models for the NPs are built by molecular dynamics simulations based on the Sutton-Chen (SC) method, an archetype of current theoretical modeling. Models are matched against independent experimental data and are shown to be inaccurate unless their theoretical foundation, i.e. the SC method, is supplemented with basic yet crucial information on the length and strength of metal-to-metal bonds and, when necessary, structural disorder in the actual NPs studied. An atomic PDF-based approach for accessing such information and implementing it in theoretical modeling is put forward. For completeness, the approach is also demonstrated concisely on 15 nm water-dispersed Au particles explored for bio-medical applications and 16 nm hexane-dispersed Fe48Pd52 particles explored for magnetic applications. It is argued that when "tuned up" against experiments relevant to metals and alloys confined to nanoscale dimensions, such as total scattering coupled to atomic PDF analysis, rather than by mere intuition and/or against data for the respective solids, atomic-level theoretical modeling can provide a sound understanding of the synthesis-structure-property relationships in real-world metallic NPs.
Ultimately this can help advance nanoscience and technology a step closer to producing metallic NPs by rational design. Electronic supplementary information (ESI) available: XRD patterns, TEM and 3D structure modelling methodology. See DOI: 10.1039/c5nr04678e
Judging nursing information on the WWW: a theoretical understanding.
Cader, Raffik; Campbell, Steve; Watson, Don
2009-09-01
This paper is a report of a study of the judgement processes nurses use when evaluating World Wide Web information related to nursing practice. The World Wide Web has increased the global accessibility of online health information. However, the variable nature of the quality of World Wide Web information and its perceived level of reliability may lead to misinformation. This makes demands on healthcare professionals, and on nurses in particular, to ensure that health information of reliable quality is selected for use in practice. A grounded theory approach was adopted. Semi-structured interviews and focus groups were used to collect data, between 2004 and 2005, from 20 nurses undertaking a postqualification graduate course at a university and 13 nurses from a local hospital in the United Kingdom. A theoretical framework emerged that gave insight into the judgement process nurses use when evaluating World Wide Web information. Participants broke the judgement process down into specific tasks. In addition, they used tacit, process and propositional knowledge and intuition, quasi-rational cognition and analysis to undertake these tasks. World Wide Web information cues, time available and nurses' critical skills were influencing factors in their judgement process. Addressing the issue of quality and reliability associated with World Wide Web information is a global challenge. This theoretical framework could contribute towards meeting this challenge.
Wright, Adam; Ricciardi, Thomas N.; Zwick, Martin
2005-01-01
The Medical Quality Improvement Consortium data warehouse contains de-identified data on more than 3.6 million patients including their problem lists, test results, procedures and medication lists. This study uses reconstructability analysis, an information-theoretic data mining technique, on the MQIC data warehouse to empirically identify risk factors for various complications of diabetes including myocardial infarction and microalbuminuria. The risk factors identified match those risk factors identified in the literature, demonstrating the utility of the MQIC data warehouse for outcomes research, and RA as a technique for mining clinical data warehouses. PMID:16779156
Toward Validation of the Genius Discipline-Specific Literacy Model
ERIC Educational Resources Information Center
Ellis, Edwin S.; Wills, Stephen; Deshler, Donald D.
2011-01-01
An analysis of the rationale and theoretical foundations of the Genius Discipline-specific Literacy Model and its use of SMARTvisuals to cue information-processing skills and strategies and focus attention on essential informational elements in high-frequency topics in history and the English language arts are presented. Quantitative data…
ERIC Educational Resources Information Center
Salajan, Florin D.; Chiper, Sorina
2013-01-01
This article conducts an exploration of Romania's European integration process through higher education. It contends that integration occurs at "formal" and "informal levels" through institutional norms and human agency, respectively. Through theoretical and empirical analysis, the authors discuss the modalities through which…
Keller, L Robin; Wang, Yitong
2017-06-01
For the last 30 years, researchers in risk analysis, decision analysis, and economics have consistently proven that decisionmakers employ different processes for evaluating and combining anticipated and actual losses, gains, delays, and surprises. Although rational models generally prescribe a consistent response, people's heuristic processes will sometimes lead them to be inconsistent in the way they respond to information presented in theoretically equivalent ways. We point out several promising future research directions by listing and detailing a series of answered, partly answered, and unanswered questions. © 2016 Society for Risk Analysis.
ERIC Educational Resources Information Center
Mittal, Surabhi; Mehar, Mamta
2016-01-01
Purpose: The paper analyzes factors that affect the likelihood of adoption of different agriculture-related information sources by farmers. Design/Methodology/Approach: The paper links the theoretical understanding of the existing multiple sources of information that farmers use, with the empirical model to analyze the factors that affect the…
Hash Functions and Information Theoretic Security
NASA Astrophysics Data System (ADS)
Bagheri, Nasour; Knudsen, Lars R.; Naderi, Majid; Thomsen, Søren S.
Information theoretic security is an important security notion in cryptography as it provides a true lower bound for attack complexities. However, in practice attacks often have a higher cost than the information theoretic bound. In this paper we study the relationship between information theoretic attack costs and real costs. We show that in the information theoretic model, many well-known and commonly used hash functions such as MD5 and SHA-256 fail to be preimage resistant.
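The 2^n preimage bound that the paper takes as its information-theoretic baseline can be made concrete with a toy brute-force search. The sketch below is our own illustration, not from the paper: it truncates SHA-256 to n bits so the search finishes quickly, and the truncation width and search order are arbitrary assumptions.

```python
import hashlib
import itertools

def truncated_hash(data: bytes, bits: int) -> int:
    """First `bits` bits of SHA-256, as an integer (a toy n-bit hash)."""
    digest = int.from_bytes(hashlib.sha256(data).digest(), "big")
    return digest >> (256 - bits)

def brute_force_preimage(target: int, bits: int):
    """Try byte strings in order until one hashes to `target`."""
    tries = 0
    for length in itertools.count(1):
        for candidate in itertools.product(range(256), repeat=length):
            tries += 1
            msg = bytes(candidate)
            if truncated_hash(msg, bits) == target:
                return msg, tries

# For an ideal n-bit hash the expected cost is about 2^n tries; here n = 12.
bits = 12
target = truncated_hash(b"some message", bits)
preimage, tries = brute_force_preimage(target, bits)
print(f"preimage found after {tries} tries (2^{bits} = {2 ** bits})")
```

Real attacks, as the paper observes, can cost more than this bound because memory access and per-evaluation cost are not captured by a pure query count.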
NASA Astrophysics Data System (ADS)
Katura, Takusige; Tanaka, Naoki; Obata, Akiko; Sato, Hiroki; Maki, Atsushi
2005-08-01
In this study, we analyzed, from an information-theoretic viewpoint, the interrelation between the spontaneous low-frequency fluctuations around 0.1 Hz in hemoglobin concentration in the cerebral cortex, mean arterial blood pressure, and heart rate. As measures of information transfer, we used transfer entropy (TE), proposed for two-factor systems by Schreiber, and intrinsic transfer entropy (ITE), introduced for the further analysis of three-factor systems by extending the original TE. Information transfer analysis based on both TE and ITE suggests that systemic cardiovascular fluctuations alone cannot account for the cerebrovascular fluctuations; that is, regulation of regional cerebral energetic metabolism is an important candidate generation mechanism. Such an information transfer analysis seems useful for revealing the interrelation between elements that regulate each other in a complex manner.
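Schreiber's transfer entropy, the two-factor measure used above, can be sketched for binary sequences with a history length of one. This is a simplification for illustration: the study's hemodynamic signals are continuous, and ITE extends this measure to three factors. The toy sequences below are our own assumption.

```python
import math
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy X -> Y in bits, with history length 1:
    TE = sum p(y1, y0, x0) * log2[ p(y1 | y0, x0) / p(y1 | y0) ]."""
    n = len(y) - 1
    triples = Counter((y[t + 1], y[t], x[t]) for t in range(n))
    pairs_src = Counter((y[t], x[t]) for t in range(n))
    pairs_self = Counter((y[t + 1], y[t]) for t in range(n))
    singles = Counter(y[t] for t in range(n))
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_full = c / pairs_src[(y0, x0)]   # p(y1 | y0, x0)
        p_self = pairs_self[(y1, y0)] / singles[y0]  # p(y1 | y0)
        te += p_joint * math.log2(p_full / p_self)
    return te

# y copies x with a one-step lag, so information flows from x to y.
x = [0, 1, 1, 0, 1, 0, 0, 1] * 50
y = [0] + x[:-1]
print(f"TE(x->y) = {transfer_entropy(x, y):.3f} bits")
```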
An Analysis of Machine- and Human-Analytics in Classification.
Tam, Gary K L; Kothari, Vivek; Chen, Min
2017-01-01
In this work, we present a study that traces the technical and cognitive processes in two visual analytics applications back to a common theoretic model of the soft knowledge that may be added into a visual analytics process for constructing a decision-tree model. Both case studies involved the development of classification models based on the "bag of features" approach. Both compared a visual analytics approach using parallel coordinates with a machine-learning approach using information theory. Both found that the visual analytics approach had some advantages over the machine-learning approach, especially when sparse datasets were used as the ground truth. We examine various possible factors that may have contributed to such advantages, and collect empirical evidence supporting the observation and reasoning of these factors. We propose an information-theoretic model as a common theoretic basis to explain the phenomena exhibited in these two case studies. Together we provide interconnected empirical and theoretical evidence to support the usefulness of visual analytics.
RANDOMIZATION PROCEDURES FOR THE ANALYSIS OF EDUCATIONAL EXPERIMENTS.
ERIC Educational Resources Information Center
COLLIER, RAYMOND O.
Certain specific aspects of hypothesis tests used for analysis of results in randomized experiments were studied: (1) the development of the theoretical factor, that of providing information on statistical tests for certain experimental designs, and (2) the development of the applied element, that of supplying the experimenter with machinery for…
NASA Astrophysics Data System (ADS)
Gontrani, Lorenzo; Caminiti, Ruggero; Salma, Umme; Campetella, Marco
2017-09-01
We present here a structural and vibrational analysis of melted methylammonium nitrate, the simplest compound of the family of alkylammonium nitrates. The calculated static and dynamical features were validated by comparing the experimental X-ray data with the theoretical ones. A reliable description cannot be obtained with classical molecular dynamics owing to polarization effects. In contrast, the structure factor and the vibrational frequencies obtained from ab initio molecular dynamics trajectories are in very good agreement with experiment. A careful analysis has provided additional information on the complex hydrogen-bonding network that exists in this liquid.
An Information-Theoretic-Cluster Visualization for Self-Organizing Maps.
Brito da Silva, Leonardo Enzo; Wunsch, Donald C
2018-06-01
Improved data visualization will be a significant tool to enhance cluster analysis. In this paper, an information-theoretic-based method for cluster visualization using self-organizing maps (SOMs) is presented. The information-theoretic visualization (IT-vis) has the same structure as the unified distance matrix, but instead of depicting Euclidean distances between adjacent neurons, it displays the similarity between the distributions associated with adjacent neurons. Each SOM neuron has an associated subset of the data set whose cardinality controls the granularity of the IT-vis and with which the first- and second-order statistics are computed and used to estimate their probability density functions. These are used to calculate the similarity measure, based on Renyi's quadratic cross entropy and cross information potential (CIP). The introduced visualizations combine the low computational cost and kernel estimation properties of the representative CIP and the data structure representation of a single-linkage-based grouping algorithm to generate an enhanced SOM-based visualization. The visual quality of the IT-vis is assessed by comparing it with other visualization methods for several real-world and synthetic benchmark data sets. Thus, this paper also contains a significant literature survey. The experiments demonstrate the IT-vis cluster revealing capabilities, in which cluster boundaries are sharply captured. Additionally, the information-theoretic visualizations are used to perform clustering of the SOM. Compared with other methods, IT-vis of large SOMs yielded the best results in this paper, for which the quality of the final partitions was evaluated using external validity indices.
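The cross information potential (CIP) underlying IT-vis is, for Gaussian kernels, the mean pairwise kernel evaluation between two samples, and Renyi's quadratic cross entropy is its negative log. Below is a 1-D sketch of that relationship; the bandwidth and the samples are illustrative assumptions, whereas the paper estimates densities per SOM neuron.

```python
import math

def gaussian_kernel(diff, sigma):
    return math.exp(-diff * diff / (2 * sigma * sigma)) / (sigma * math.sqrt(2 * math.pi))

def cross_information_potential(a, b, sigma=0.5):
    """CIP of two 1-D samples: mean pairwise Gaussian kernel value.
    (Kernels of width sigma convolve to an effective width sigma * sqrt(2).)"""
    total = sum(gaussian_kernel(ai - bj, sigma * math.sqrt(2)) for ai in a for bj in b)
    return total / (len(a) * len(b))

def renyi_quadratic_cross_entropy(a, b, sigma=0.5):
    return -math.log(cross_information_potential(a, b, sigma))

# Samples drawn from similar regions give a higher CIP (lower cross entropy)
# than samples from well-separated regions.
near_a = [0.0, 0.1, -0.1, 0.05]
near_b = [0.02, -0.05, 0.08, 0.0]
far = [5.0, 5.1, 4.9, 5.05]
print(renyi_quadratic_cross_entropy(near_a, near_b) <
      renyi_quadratic_cross_entropy(near_a, far))
```

In the visualization, a low cross entropy between adjacent neurons' distributions marks a cluster interior, a high one a cluster boundary.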
Dynamic Substrate for the Physical Encoding of Sensory Information in Bat Biosonar
NASA Astrophysics Data System (ADS)
Müller, Rolf; Gupta, Anupam K.; Zhu, Hongxiao; Pannala, Mittu; Gillani, Uzair S.; Fu, Yanqing; Caspers, Philip; Buck, John R.
2017-04-01
Horseshoe bats have dynamic biosonar systems with interfaces for ultrasonic emission (reception) that change shape while diffracting the outgoing (incoming) sound waves. An information-theoretic analysis based on numerical and physical prototypes shows that these shape changes add sensory information (mutual information between distant shape conformations <20 %), increase the number of resolvable directions of sound incidence, and improve the accuracy of direction finding. These results demonstrate that horseshoe bats have a highly effective substrate for dynamic encoding of sensory information.
Functional analysis of ultra high information rates conveyed by rat vibrissal primary afferents
Chagas, André M.; Theis, Lucas; Sengupta, Biswa; Stüttgen, Maik C.; Bethge, Matthias; Schwarz, Cornelius
2013-01-01
Sensory receptors determine the type and the quantity of information available for perception. Here, we quantified and characterized the information transferred by primary afferents in the rat whisker system using neural system identification. Quantification of “how much” information is conveyed by primary afferents, using the direct method (DM), a classical information theoretic tool, revealed that primary afferents transfer huge amounts of information (up to 529 bits/s). Information theoretic analysis of instantaneous spike-triggered kinematic stimulus features was used to gain functional insight on “what” is coded by primary afferents. Amongst the kinematic variables tested—position, velocity, and acceleration—primary afferent spikes encoded velocity best. The other two variables contributed to information transfer, but only if combined with velocity. We further revealed three additional characteristics that play a role in information transfer by primary afferents. Firstly, primary afferent spikes show preference for well separated multiple stimuli (i.e., well separated sets of combinations of the three instantaneous kinematic variables). Secondly, neurons are sensitive to short strips of the stimulus trajectory (up to 10 ms pre-spike time), and thirdly, they show spike patterns (precise doublet and triplet spiking). In order to deal with these complexities, we used a flexible probabilistic neuron model fitting mixtures of Gaussians to the spike triggered stimulus distributions, which quantitatively captured the contribution of the mentioned features and allowed us to achieve a full functional analysis of the total information rate indicated by the DM. We found that instantaneous position, velocity, and acceleration explained about 50% of the total information rate. Adding a 10 ms pre-spike interval of stimulus trajectory achieved 80–90%. The final 10–20% were found to be due to non-linear coding by spike bursts. PMID:24367295
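The "how much" side of the direct method (DM) rests on the entropy of binned spike words. The following is a deliberately reduced sketch: it computes only the total word-entropy rate, whereas the full DM also subtracts a noise-entropy term estimated from repeated stimulus presentations. Word length, bin size, and the toy trains are arbitrary assumptions.

```python
import math
import random
from collections import Counter

def word_entropy(spikes, word_len):
    """Shannon entropy in bits of overlapping binary words of length word_len."""
    words = [tuple(spikes[i:i + word_len]) for i in range(len(spikes) - word_len + 1)]
    n = len(words)
    return -sum((c / n) * math.log2(c / n) for c in Counter(words).values())

def total_entropy_rate(spikes, word_len, bin_ms):
    """Total entropy rate in bits/s (the DM's upper bound on information rate)."""
    return word_entropy(spikes, word_len) / (word_len * bin_ms / 1000.0)

# An irregular spike train supports a far higher entropy rate than a regular one.
random.seed(0)
irregular = [random.randint(0, 1) for _ in range(2000)]
regular = [1, 0] * 1000
print(total_entropy_rate(irregular, 8, 2), total_entropy_rate(regular, 8, 2))
```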
Goals for Long-Range Research in Information Science.
ERIC Educational Resources Information Center
Pearson, Charls
In order to discuss the research goals of information science (IS), both its logical and its specific nature must be determined. Peircean logical analysis shows that IS may be classified in three parts: pure science, applied science, and technology. The deficiency in the present state of the art is in the pure science, or theoretical portion, of…
ERIC Educational Resources Information Center
Tamosiunas, Teodoras
2006-01-01
Purpose: The purpose of the research is to investigate how particular information from the environment serves as didactic material for students of Siauliai University Faculty of Social Sciences in learning to carry out scientific analysis and theoretical generalization of data in their theses. Methodology: The main sources--Internet databases,…
ERIC Educational Resources Information Center
Katz, Phyllis; McGinnis, J. Randy; Riedinger, Kelly; Marbach-Ad, Gili; Dai, Amy
2013-01-01
In case studies of two first-year elementary classroom teachers, we explored the influence of informal science education (ISE) they experienced in their teacher education program. Our theoretical lens was identity development, delimited to classroom science teaching. We used complementary data collection methods and analysis, including interviews,…
ERIC Educational Resources Information Center
Reddy, Dinesh Sampangirama
2017-01-01
Cybersecurity threats confront the United States on a daily basis, making them one of the major national security challenges. One approach to meeting these challenges is to improve user cybersecurity behavior. End user security behavior hinges on end user acceptance and use of the protective information technologies such as anti-virus and…
NASA Astrophysics Data System (ADS)
Herrmann, K.
2009-11-01
Information-theoretic approaches still play a minor role in financial market analysis. Nonetheless, there have been two very similar approaches evolving during the last years, one in the so-called econophysics and the other in econometrics. Both generalize the notion of GARCH processes in an information-theoretic sense and are able to capture kurtosis better than traditional models. In this article we present both approaches in a more general framework. The latter allows the derivation of a wide range of new models. We choose a third model using an entropy measure suggested by Kapur. In an application to financial market data, we find that all considered models - with similar flexibility in terms of skewness and kurtosis - lead to very similar results.
More data, less information? Potential for nonmonotonic information growth using GEE.
Shoben, Abigail B; Rudser, Kyle D; Emerson, Scott S
2017-01-01
Statistical intuition suggests that increasing the total number of observations available for analysis should increase the precision with which parameters can be estimated. Such monotonic growth of statistical information is of particular importance when data are analyzed sequentially, such as in confirmatory clinical trials. However, monotonic information growth is not always guaranteed, even when using a valid but inefficient estimator. In this article, we demonstrate the theoretical possibility of nonmonotonic information growth when using generalized estimating equations (GEE) to estimate a slope and provide intuition for why this possibility exists. We use theoretical and simulation-based results to characterize situations that may result in nonmonotonic information growth. Nonmonotonic information growth is most likely to occur when (1) accrual is fast relative to follow-up on each individual, (2) correlation among measurements from the same individual is high, and (3) measurements are becoming more variable further from randomization. In such situations, study designers should plan interim analyses to avoid the conditions most likely to result in nonmonotonic information growth.
Lizier, Joseph T; Heinzle, Jakob; Horstmann, Annette; Haynes, John-Dylan; Prokopenko, Mikhail
2011-02-01
The human brain undertakes highly sophisticated information processing facilitated by the interaction between its sub-regions. We present a novel method for interregional connectivity analysis, using multivariate extensions to the mutual information and transfer entropy. The method allows us to identify the underlying directed information structure between brain regions, and how that structure changes according to behavioral conditions. This method is distinctive in its use of asymmetric, multivariate, information-theoretic analysis, which captures not only directional and non-linear relationships, but also collective interactions. Importantly, the method is able to estimate multivariate information measures with only relatively little data. We demonstrate the method by analyzing functional magnetic resonance imaging time series to establish the directed information structure between brain regions involved in a visuo-motor tracking task. Importantly, this results in a tiered structure, with known movement planning regions driving visual and motor control regions. Also, we examine the changes in this structure as the difficulty of the tracking task is increased. We find that task difficulty modulates the coupling strength between regions of a cortical network involved in movement planning and between the motor cortex and the cerebellum, which is involved in the fine-tuning of motor control. It is likely these methods will find utility in identifying interregional structure (and experimentally induced changes in this structure) in other cognitive tasks and data modalities.
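The pairwise building block of such connectivity analyses, mutual information between two discretized signals, can be sketched with a plug-in estimator. The paper's contribution is the directed, multivariate extension, which this toy does not reach; the sequences below are illustrative assumptions.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in mutual information in bits between two discrete sequences."""
    n = len(xs)
    joint = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    mi = 0.0
    for (xv, yv), c in joint.items():
        p_xy = c / n
        mi += p_xy * math.log2(p_xy * n * n / (px[xv] * py[yv]))
    return mi

# "Region" b perfectly tracks region a; region c follows a different rhythm
# that is only weakly related to a.
a = [0, 1, 0, 1, 1, 0] * 100
b = a[:]
c = [0, 0, 1] * 200
print(mutual_information(a, b), mutual_information(a, c))
```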
Spectral Entropies as Information-Theoretic Tools for Complex Network Comparison
NASA Astrophysics Data System (ADS)
De Domenico, Manlio; Biamonte, Jacob
2016-10-01
Any physical system can be viewed from the perspective that information is implicitly represented in its state. However, the quantification of this information when it comes to complex networks has remained largely elusive. In this work, we use techniques inspired by quantum statistical mechanics to define an entropy measure for complex networks and to develop a set of information-theoretic tools, based on network spectral properties, such as Rényi q entropy, generalized Kullback-Leibler and Jensen-Shannon divergences, the latter allowing us to define a natural distance measure between complex networks. First, we show that by minimizing the Kullback-Leibler divergence between an observed network and a parametric network model, inference of model parameter(s) by means of maximum-likelihood estimation can be achieved and model selection can be performed with appropriate information criteria. Second, we show that the information-theoretic metric quantifies the distance between pairs of networks and we can use it, for instance, to cluster the layers of a multilayer system. By applying this framework to networks corresponding to sites of the human microbiome, we perform hierarchical cluster analysis and recover with high accuracy existing community-based associations. Our results imply that spectral-based statistical inference in complex networks results in demonstrably superior performance as well as a conceptual backbone, filling a gap towards a network information theory.
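The entropy measures above are defined on a network density matrix. One common construction, assumed here for illustration, takes rho proportional to exp(-beta * L) for the combinatorial Laplacian L (beta = 1 is an arbitrary choice); the Rényi q-entropies then follow from the spectrum of rho.

```python
import numpy as np

def density_spectrum(adjacency, beta=1.0):
    """Spectrum of rho = exp(-beta * L) / Tr exp(-beta * L), L the graph Laplacian."""
    laplacian = np.diag(adjacency.sum(axis=1)) - adjacency
    weights = np.exp(-beta * np.linalg.eigvalsh(laplacian))
    return weights / weights.sum()

def renyi_entropy(spectrum, q=2.0):
    """Renyi q-entropy (natural log) of a density-matrix spectrum."""
    if q == 1.0:  # Shannon / von Neumann limit
        p = spectrum[spectrum > 0]
        return float(-(p * np.log(p)).sum())
    return float(np.log((spectrum ** q).sum()) / (1.0 - q))

# Two 4-node graphs with different topologies get different spectral entropies.
complete = np.ones((4, 4)) - np.eye(4)
path = np.diag(np.ones(3), 1) + np.diag(np.ones(3), -1)
s_complete = renyi_entropy(density_spectrum(complete))
s_path = renyi_entropy(density_spectrum(path))
print(s_complete, s_path)
```

Comparing such spectra between two networks (e.g. via Jensen-Shannon divergence) gives the distance measure the paper uses for clustering multilayer systems.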
Jenkinson, Garrett; Abante, Jordi; Feinberg, Andrew P; Goutsias, John
2018-03-07
DNA methylation is a stable form of epigenetic memory used by cells to control gene expression. Whole genome bisulfite sequencing (WGBS) has emerged as a gold-standard experimental technique for studying DNA methylation by producing high resolution genome-wide methylation profiles. Statistical modeling and analysis is employed to computationally extract and quantify information from these profiles in an effort to identify regions of the genome that demonstrate crucial or aberrant epigenetic behavior. However, the performance of most currently available methods for methylation analysis is hampered by their inability to directly account for statistical dependencies between neighboring methylation sites, thus ignoring significant information available in WGBS reads. We present a powerful information-theoretic approach for genome-wide modeling and analysis of WGBS data based on the 1D Ising model of statistical physics. This approach takes into account correlations in methylation by utilizing a joint probability model that encapsulates all information available in WGBS methylation reads and produces accurate results even when applied on single WGBS samples with low coverage. Using the Shannon entropy, our approach provides a rigorous quantification of methylation stochasticity in individual WGBS samples genome-wide. Furthermore, it utilizes the Jensen-Shannon distance to evaluate differences in methylation distributions between a test and a reference sample. Differential performance assessment using simulated and real human lung normal/cancer data demonstrates a clear superiority of our approach over DSS, a recently proposed method for WGBS data analysis. Critically, these results demonstrate that marginal methods become statistically invalid when correlations are present in the data.
This contribution demonstrates clear benefits and the necessity of modeling joint probability distributions of methylation using the 1D Ising model of statistical physics and of quantifying methylation stochasticity using concepts from information theory. By employing this methodology, substantial improvement of DNA methylation analysis can be achieved by effectively taking into account the massive amount of statistical information available in WGBS data, which is largely ignored by existing methods.
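The Jensen-Shannon distance used to compare test and reference methylation distributions is the square root of the Jensen-Shannon divergence; for discrete distributions (with log base 2) it is a metric bounded by 1. A sketch with hypothetical three-level methylation distributions, not data from the paper:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence in bits between discrete distributions."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jensen_shannon_distance(p, q):
    """Square root of the JS divergence: symmetric, and a metric in [0, 1]."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return math.sqrt(0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m))

# Hypothetical methylation-level distributions at one genomic region.
reference = [0.7, 0.2, 0.1]    # mostly unmethylated
test_same = [0.68, 0.22, 0.10]
test_diff = [0.1, 0.2, 0.7]    # mostly methylated
print(jensen_shannon_distance(reference, test_same),
      jensen_shannon_distance(reference, test_diff))
```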
Applying thematic analysis theory to practice: a researcher's experience.
Tuckett, Anthony G
2005-01-01
This article describes an experience of thematic analysis. In order to answer the question 'What does analysis look like in practice?' it describes in brief how the methodology of grounded theory, the epistemology of social constructionism, and the theoretical stance of symbolic interactionism inform analysis. Additionally, analysis is examined by evidencing the systematic processes--here termed organising, coding, writing, theorising, and reading--that led the researcher to develop a final thematic schema.
Maximizing the Biochemical Resolving Power of Fluorescence Microscopy
Esposito, Alessandro; Popleteeva, Marina; Venkitaraman, Ashok R.
2013-01-01
Most recent advances in fluorescence microscopy have focused on achieving spatial resolutions below the diffraction limit. However, the inherent capability of fluorescence microscopy to non-invasively resolve different biochemical or physical environments in biological samples has not yet been formally described, because an adequate and general theoretical framework is lacking. Here, we develop a mathematical characterization of the biochemical resolution in fluorescence detection with Fisher information analysis. To improve the precision and the resolution of quantitative imaging methods, we demonstrate strategies for the optimization of fluorescence lifetime, fluorescence anisotropy and hyperspectral detection, as well as different multi-dimensional techniques. We describe optimized imaging protocols, provide optimization algorithms, and characterize precision and resolving power in biochemical imaging through analysis of the general properties of Fisher information in fluorescence detection. These strategies enable the optimal use of the information content available within the limited photon budget typically available in fluorescence microscopy. This theoretical foundation leads to a generalized strategy for the optimization of multi-dimensional optical detection, and demonstrates how the parallel detection of all properties of fluorescence can maximize the biochemical resolving power of fluorescence microscopy, an approach we term Hyper Dimensional Imaging Microscopy (HDIM). Our work provides a theoretical framework for the description of the biochemical resolution in fluorescence microscopy, irrespective of spatial resolution, and for the development of a new class of microscopes that exploit multi-parametric detection systems. PMID:24204821
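For the simplest case the Fisher-information argument can be made concrete: an exponential fluorescence decay with lifetime tau observed through n detected photons has Fisher information I(tau) = n / tau^2, so no unbiased estimator can beat a standard deviation of tau / sqrt(n) (the Cramér-Rao bound). The simulation below is an idealized sketch (no background, no instrument response function) checking the maximum-likelihood estimator against that bound; all parameter values are illustrative.

```python
import math
import random

def crlb_std(tau, n_photons):
    """Cramer-Rao lower bound on the std of a lifetime estimate:
    I(tau) = n / tau^2  =>  std >= tau / sqrt(n)."""
    return tau / math.sqrt(n_photons)

def mle_lifetime(arrival_times):
    """ML lifetime estimate for an exponential decay: the sample mean."""
    return sum(arrival_times) / len(arrival_times)

random.seed(1)
tau_true, n_photons = 2.5, 400
estimates = [mle_lifetime([random.expovariate(1 / tau_true) for _ in range(n_photons)])
             for _ in range(300)]
mean_est = sum(estimates) / len(estimates)
std_est = math.sqrt(sum((e - mean_est) ** 2 for e in estimates) / len(estimates))
print(std_est, crlb_std(tau_true, n_photons))  # MLE scatter hovers near the bound
```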
McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W
2015-03-27
Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
Information flow dynamics in the brain
NASA Astrophysics Data System (ADS)
Rabinovich, Mikhail I.; Afraimovich, Valentin S.; Bick, Christian; Varona, Pablo
2012-03-01
The timing and dynamics of information in the brain are a hot field in modern neuroscience. The analysis of the temporal evolution of brain information is crucially important for the understanding of higher cognitive mechanisms in normal and pathological states. From the perspective of information dynamics, in this review we discuss working memory capacity, language dynamics, goal-dependent behavior programming and other functions of brain activity. In contrast with the classical description of information theory, which is mostly algebraic, brain information flow dynamics deals with problems such as the stability/instability of information flows, their quality, the timing of sequential processing, the top-down cognitive control of perceptual information, and information creation. In this framework, different types of information flow instabilities correspond to different cognitive disorders. On the other hand, the robustness of cognitive activity is related to the control of information flow stability. We discuss these problems using both experimental and theoretical approaches, and we argue that brain activity is better understood by considering information flows in the phase space of the corresponding dynamical model. In particular, we show how theory helps to understand intriguing experimental results in this matter, and how recent knowledge inspires new theoretical formalisms that can be tested with modern experimental techniques.
Investigating Addiction in the Changing Universe
Dastoury, Mojgan; Aminaee, Tayebe; Ghaumi, Raheleh
2014-01-01
The process of globalization, as the most significant characteristic of the modern era, is facilitated by several factors including information technology, the industry of production and distribution of information, and the flow of goods, services, human beings, capital and information. This phenomenon, along with the complex and various identities and lifestyles created by national and transnational determinants, has widely changed the nature of social phenomena, including addiction. The present study aims to investigate the contribution of sociological studies in the field of addiction during 2001 to 2011 in Iran. This is done through performing content analysis on 41 peer-reviewed papers. The selected samples were surveyed and compared according to their theoretical frameworks and the social groups under study. The results showed that the analyzed papers extensively overlooked the process of contemporary social change in Iran, which could be caused either by the theoretical basis of the studies or the social groups under study. Following the theoretical views of previous decades, these papers largely considered addiction as a type of social deviation and misbehavior related to men living in urban areas. PMID:25363096
Information-theoretic metamodel of organizational evolution
NASA Astrophysics Data System (ADS)
Sepulveda, Alfredo
2011-12-01
Social organizations are abstractly modeled by holarchies---self-similar connected networks---and intelligent complex adaptive multiagent systems---large networks of autonomous reasoning agents interacting via scaled processes. However, little is known of how information shapes evolution in such organizations, a gap that can lead to misleading analytics. The research problem addressed in this study was the ineffective manner in which classical model-predict-control methods used in business analytics attempt to define organization evolution. The purpose of the study was to construct an effective metamodel for organization evolution based on a proposed complex adaptive structure---the info-holarchy. Theoretical foundations of this study were holarchies, complex adaptive systems, evolutionary theory, and quantum mechanics, among other recently developed physical and information theories. Research questions addressed how information evolution patterns gleaned from the study's inductive metamodel more aptly explained volatility in organizations. In this study, a hybrid grounded theory based on abstract inductive extensions of information theories was utilized as the research methodology. An overarching heuristic metamodel was framed from the theoretical analysis of the properties of these extension theories and applied to business, neural, and computational entities. This metamodel resulted in the synthesis of a metaphor for, and generalization of, organization evolution, serving as the recommended and appropriate analytical tool to view business dynamics for future applications. This study may manifest positive social change through a fundamental understanding of complexity in business from general information theories, resulting in more effective management.
Information processing and dynamics in minimally cognitive agents.
Beer, Randall D; Williams, Paul L
2015-01-01
There has been considerable debate in the literature about the relative merits of information processing versus dynamical approaches to understanding cognitive processes. In this article, we explore the relationship between these two styles of explanation using a model agent evolved to solve a relational categorization task. Specifically, we separately analyze the operation of this agent using the mathematical tools of information theory and dynamical systems theory. Information-theoretic analysis reveals how task-relevant information flows through the system to be combined into a categorization decision. Dynamical analysis reveals the key geometrical and temporal interrelationships underlying the categorization decision. Finally, we propose a framework for directly relating these two different styles of explanation and discuss the possible implications of our analysis for some of the ongoing debates in cognitive science. Copyright © 2014 Cognitive Science Society, Inc.
Theoretical analysis of Lumry-Eyring models in differential scanning calorimetry
Sanchez-Ruiz, Jose M.
1992-01-01
A theoretical analysis of several protein denaturation models (Lumry-Eyring models) that include a rate-limited step leading to an irreversibly denatured state of the protein (the final state) has been carried out. The differential scanning calorimetry transitions predicted for these models can be broadly classified into four groups: situations A, B, C, and C′. (A) The transition is calorimetrically irreversible but the rate-limited, irreversible step takes place with significant rate only at temperatures slightly above those corresponding to the transition. Equilibrium thermodynamics analysis is permissible. (B) The transition is distorted by the occurrence of the rate-limited step; nevertheless, it contains thermodynamic information about the reversible unfolding of the protein, which could be obtained upon the appropriate data treatment. (C) The heat absorption is entirely determined by the kinetics of formation of the final state and no thermodynamic information can be extracted from the calorimetric transition; the rate-determining step is the irreversible process itself. (C′) same as C, but, in this case, the rate-determining step is a previous step in the unfolding pathway. It is shown that ligand and protein concentration effects on transitions corresponding to situation C (strongly rate-limited transitions) are similar to those predicted by equilibrium thermodynamics for simple reversible unfolding models. It has been widely held in recent literature that experimentally observed ligand and protein concentration effects support the applicability of equilibrium thermodynamics to irreversible protein denaturation. The theoretical analysis reported here disfavors this claim. PMID:19431826
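Situation C above, where the heat absorption is entirely determined by the kinetics of forming the final state, can be sketched numerically: for the two-state irreversible scheme N -> F with a first-order Arrhenius rate, the excess heat capacity during a linear temperature scan is dH * dalpha/dT. All parameter values below are illustrative, not fitted to any protein.

```python
import numpy as np

def dsc_trace_irreversible(T, scan_rate, Ea, T_star, dH):
    """Excess heat capacity for the strongly rate-limited case (situation C):
    N -> F with first-order Arrhenius rate k(T); alpha is the fraction of F.
    T_star is the temperature where k = 1/min; Ea and dH in J/mol,
    scan_rate in K/min."""
    R = 8.314
    k = np.exp(-(Ea / R) * (1.0 / T - 1.0 / T_star))    # per minute
    alpha = np.zeros_like(T)
    dT = T[1] - T[0]
    for i in range(1, T.size):                           # Euler integration
        dalpha = (k[i - 1] / scan_rate) * (1.0 - alpha[i - 1]) * dT
        alpha[i] = min(alpha[i - 1] + dalpha, 1.0)
    cp_excess = dH * np.gradient(alpha, T)               # J/(mol K)
    return alpha, cp_excess

T = np.linspace(300.0, 360.0, 2001)
dH = 400e3
alpha, cp = dsc_trace_irreversible(T, scan_rate=1.0, Ea=300e3, T_star=330.0, dH=dH)
Tm = T[np.argmax(cp)]            # apparent transition temperature
area = cp.sum() * (T[1] - T[0])  # peak area recovers dH
print(Tm, area)
```

Because the trace is purely kinetic, the apparent Tm shifts with scan rate, one of the diagnostic tests the paper's analysis implies for distinguishing situation C from equilibrium behavior.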
ERIC Educational Resources Information Center
Gorman, Bonnie B.
2012-01-01
In this dissertation, the National Survey of Student Engagement (NSSE) serves as a nodal point through which to examine the power relations shaping the direction and practices of higher education in the twenty-first century. Theoretically, my analysis is informed by Foucault's concept of governmentality, briefly defined as a technology of power…
Sequence information gain based motif analysis.
Maynou, Joan; Pairó, Erola; Marco, Santiago; Perera, Alexandre
2015-11-09
The detection of regulatory regions in candidate sequences is essential for understanding the regulation of a particular gene and the mechanisms involved. This paper proposes a novel methodology based on information theoretic metrics for finding regulatory sequences in promoter regions. This methodology (SIGMA) has been tested on genomic sequence data for Homo sapiens and Mus musculus. SIGMA has been compared with different publicly available alternatives for motif detection, such as MEME/MAST, Biostrings (Bioconductor package), MotifRegressor, and previous work such as Qresiduals projections or information-theoretic detectors. Comparative results, in the form of Receiver Operating Characteristic curves, show that, in 70% of the studied Transcription Factor Binding Sites, the SIGMA detector performs better and behaves more robustly than the methods compared, while having a similar computational time. The performance of SIGMA can be explained by its parametric simplicity in modelling the non-linear co-variability in the binding motif positions. Sequence Information Gain based Motif Analysis is a generalisation of a non-linear model of cis-regulatory sequence detection based on Information Theory. This generalisation allows us to detect transcription factor binding sites with maximum performance regardless of the covariability observed in the positions of the training set of sequences. SIGMA is freely available to the public at http://b2slab.upc.edu.
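ROC comparisons like the one used to benchmark SIGMA reduce, for a single detector, to the area under the curve; a minimal sketch (with made-up detector scores, not SIGMA output) computes the AUC via the rank-sum identity.

```python
import numpy as np

def roc_auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity:
    AUC = P(score of a random positive > score of a random negative)."""
    scores = np.asarray(scores, float)
    labels = np.asarray(labels, bool)
    pos, neg = scores[labels], scores[~labels]
    greater = (pos[:, None] > neg[None, :]).sum()   # pairwise comparisons
    ties = (pos[:, None] == neg[None, :]).sum()     # ties count as half
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Toy detector scores: true binding sites should score higher on average.
labels = np.array([1, 1, 1, 0, 0, 0, 0])
scores = np.array([0.9, 0.8, 0.4, 0.7, 0.3, 0.2, 0.1])
print(roc_auc(scores, labels))   # 11/12, about 0.917
```

The pairwise formulation is O(n^2) but transparent; rank-based implementations scale better for genome-sized score lists.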
Information physics fundamentals of nanophotonics.
Naruse, Makoto; Tate, Naoya; Aono, Masashi; Ohtsu, Motoichi
2013-05-01
Nanophotonics has been extensively studied with the aim of unveiling and exploiting light-matter interactions that occur at a scale below the diffraction limit of light, and recent progress made in experimental technologies--both in nanomaterial fabrication and characterization--is driving further advancements in the field. From the viewpoint of information, on the other hand, novel architectures, design and analysis principles, and even novel computing paradigms should be considered so that we can fully benefit from the potential of nanophotonics. This paper examines the information physics aspects of nanophotonics. More specifically, we present some fundamental and emergent information properties that stem from optical excitation transfer mediated by optical near-field interactions and the hierarchical properties inherent in optical near-fields. We theoretically and experimentally investigate aspects such as unidirectional signal transfer, energy efficiency and networking effects, among others, and we present their basic theoretical formalisms and describe demonstrations of practical applications. A stochastic analysis of light-assisted material formation is also presented, where an information-based approach provides a deeper understanding of the phenomena involved, such as self-organization. Furthermore, the spatio-temporal dynamics of optical excitation transfer and its inherent stochastic attributes are utilized for solution searching, paving the way to a novel computing paradigm that exploits coherent and dissipative processes in nanophotonics.
NASA Astrophysics Data System (ADS)
Kim, J.
2016-12-01
Considering high levels of uncertainty, epistemological conflicts over facts and values, and a sense of urgency, normal paradigm-driven science will be insufficient to mobilize people and nations toward sustainability. The conceptual framework to bridge societal system dynamics with that of the natural ecosystems in which humanity operates remains deficient. The key to understanding their coevolution is to understand 'self-organization.' An information-theoretic approach may shed light on this by providing a framework that not only bridges humanity and nature but also generates useful knowledge for understanding and sustaining the integrity of ecological-societal systems. How can information theory help understand the interface between ecological systems and social systems? How can self-organizing processes be delineated and steered to fulfil sustainability? How should the flow of information from data through models to decision-makers be evaluated? These are the core questions posed by sustainability science, in which visioneering (i.e., the engineering of vision) is an essential framework. Yet visioneering has neither a quantitative measure nor an information-theoretic framework to work with and teach. This presentation is an attempt to accommodate the framework of self-organizing hierarchical open systems with visioneering into a common information-theoretic framework. A case study is presented with the UN/FAO's communal vision of climate-smart agriculture (CSA), which pursues a trilemma of efficiency, mitigation, and resilience. Challenges of delineating and facilitating self-organizing systems are discussed using transdisciplinary tools such as complex systems thinking, dynamic process network analysis and multi-agent systems modeling. Acknowledgments: This study was supported by the Korea Meteorological Administration Research and Development Program under Grant KMA-2012-0001-A (WISE project).
Carey, Mariko; Jefford, Michael; Schofield, Penelope; Kelly, Siobhan; Krishnasamy, Meinir; Aranda, Sanchia
2006-04-01
Based on a theoretical framework, we developed an audiovisual resource to promote self-management of eight common chemotherapy side-effects. A patient needs analysis identified the content domains; best evidence on preparing patients for threatening medical procedures and a systematic review of effective self-care strategies informed the script content. Patients and health professionals were invited to complete a written evaluation of the video. A 25-min video was produced. Fifty health professionals and 37 patients completed the evaluation. All considered the video informative and easy to understand. The majority believed the video would reduce anxiety and help patients prepare for chemotherapy. Underpinned by a robust theoretical framework, we have developed an evidence-based resource that is perceived by both patients and health professionals as likely to enhance preparedness for chemotherapy.
POLLUTION PREVENTION: THE ROLE OF ENVIRONMENTAL MANAGEMENT AND INFORMATION
The theoretical analysis undertaken here addresses the following issues. First we examine whether firms with high intrinsic quality products would choose to produce more or less environmentally friendly products than their competitors. Second, we investigate how the environmen...
From recording discrete actions to studying continuous goal-directed behaviours in team sports.
Correia, Vanda; Araújo, Duarte; Vilar, Luís; Davids, Keith
2013-01-01
This paper highlights the importance of examining interpersonal interactions in performance analysis of team sports, predicated on the relationship between perception and action, compared to the traditional cataloguing of actions by individual performers. We discuss how ecological dynamics may provide a potential unifying theoretical and empirical framework to achieve this re-emphasis in research. With reference to data from illustrative studies on performance analysis and sport expertise, we critically evaluate some of the main assumptions and methodological approaches with regard to understanding how information influences action and decision-making during team sports performance. Current data demonstrate how the understanding of performance behaviours in team sports by sport scientists and practitioners may be enhanced with a re-emphasis in research on the dynamics of emergent ongoing interactions. Ecological dynamics provides formal and theoretically grounded descriptions of player-environment interactions with respect to key performance goals and the unfolding information of competitive performance. Developing these formal descriptions and explanations of sport performance may provide a significant contribution to the field of performance analysis, supporting design and intervention in both research and practice.
On the presence of electric currents in the solar atmosphere. I - A theoretical framework
NASA Technical Reports Server (NTRS)
Hagyard, M.; Low, B. C.; Tandberg-Hanssen, E.
1981-01-01
The general magnetic field above the solar photosphere is divided by an elementary analysis based on Ampere's law into two parts: a potential field due to electric currents below the photosphere and a field produced by electric currents above the photosphere combined with the induced mirror currents. The latter, by symmetry, has a set of field lines lying in the plane taken to be the photosphere which may be constructed from given vector magnetograph measurements. These field lines also represent all the information on the electric currents above the photosphere that a magnetograph can provide. Theoretical illustrations are given, and implications for data analysis are discussed.
How much a galaxy knows about its large-scale environment?: An information theoretic perspective
NASA Astrophysics Data System (ADS)
Pandey, Biswajit; Sarkar, Suman
2017-05-01
The small-scale environment characterized by the local density is known to play a crucial role in deciding galaxy properties, but the role of the large-scale environment in galaxy formation and evolution remains less clear. We propose an information theoretic framework to investigate the influence of the large-scale environment on galaxy properties and apply it to data from the Galaxy Zoo project, which provides visual morphological classifications of ~1 million galaxies from the Sloan Digital Sky Survey. We find a non-zero mutual information between morphology and environment that decreases with increasing length-scale but persists throughout the entire range of length-scales probed. We estimate the conditional mutual information and the interaction information between morphology and environment by conditioning the environment on different length-scales and find a synergic interaction between them that operates up to length-scales of at least ~30 h^-1 Mpc. Our analysis indicates that these interactions largely arise due to the mutual information shared between the environments on different length-scales.
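The mutual information between morphology and environment used in this analysis can be illustrated with a plug-in estimator for two discrete variables; the two-class toy data below are invented for illustration, not Galaxy Zoo data.

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate of I(X;Y) in bits for two discrete variables,
    e.g. a galaxy morphology class vs. an environment class."""
    x, y = np.asarray(x), np.asarray(y)
    mi = 0.0
    for xv in np.unique(x):
        for yv in np.unique(y):
            pxy = np.mean((x == xv) & (y == yv))   # joint probability
            px, py = np.mean(x == xv), np.mean(y == yv)
            if pxy > 0:
                mi += pxy * np.log2(pxy / (px * py))
    return mi

morph = [0, 0, 1, 1, 0, 1, 0, 1]   # e.g. spiral = 0, elliptical = 1
envir = [0, 0, 1, 1, 0, 1, 1, 0]   # e.g. low / high density
print(mutual_information(morph, envir))   # non-zero: classes share information
```

Conditioning on a second environment variable in the same way yields the conditional mutual information and, by subtraction, the interaction information the abstract refers to.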
Measuring, Understanding, and Responding to Covert Social Networks: Passive and Active Tomography
2017-11-29
Methods for generating a random sample of networks with desired properties are important tools for the analysis of social, biological, and information...on Theoretical Foundations for Statistical Network Analysis at the Isaac Newton Institute for Mathematical Sciences at Cambridge U. (organized by... The problems span three disciplines (the social sciences, statistics, and EECS); scientific focus is needed at the interfaces.
Recent statistical methods for orientation data
NASA Technical Reports Server (NTRS)
Batschelet, E.
1972-01-01
The application of statistical methods in the areas of animal orientation and navigation is discussed. The method employed is limited to the two-dimensional case. Various tests for determining the validity of the statistical analysis are presented. Mathematical models are included to support the theoretical considerations, and tables of data are developed to show the value of information obtained by statistical analysis.
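A standard tool from this literature is the Rayleigh test for directional data; a small sketch (with illustrative homing directions, and Zar's approximation for the p-value) might look like the following.

```python
import numpy as np

def rayleigh_test(angles_deg):
    """Rayleigh test for a unimodal departure from uniformity in
    two-dimensional orientation data. Returns the mean resultant
    length R and an approximate p-value (Zar's formula)."""
    a = np.deg2rad(np.asarray(angles_deg, float))
    n = a.size
    Rn = np.hypot(np.cos(a).sum(), np.sin(a).sum())  # resultant length
    R = Rn / n
    # Approximation to the p-value of the Rayleigh z = n*R^2 statistic.
    p = np.exp(np.sqrt(1 + 4 * n + 4 * (n**2 - Rn**2)) - (1 + 2 * n))
    return R, min(p, 1.0)

# Homing directions clustered around 40 degrees: strong orientation.
R, p = rayleigh_test([35, 45, 50, 38, 42, 30, 55, 41])
print(R, p)   # R close to 1, p small
```

For directions spread uniformly around the circle, R collapses toward zero and the p-value approaches 1, so the null of no preferred direction is retained.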
A post-colonial analysis of healthcare discourses addressing aboriginal women.
Browne, Annette J; Smye, Vicki
2002-01-01
Annette Browne and Vicki Smye use post-colonial theoretical perspectives to inform a critical analysis of healthcare discourses related to cervical cancer among Canadian aboriginal women. They also examine how decontextualised discourses addressing aboriginal women's risks for cervical cancer can perpetuate negative stereotypical images of aboriginal women while downplaying or ignoring the historical, social and economic context of women's health risks.
[Verbal patient information through nurses--a case of stroke patients].
Christmann, Elli; Holle, Regina; Schüssler, Dörte; Beier, Jutta; Dassen, Theo
2004-06-01
The article presents the results of theoretical work in the field of nursing education on the topic of verbal patient information by nurses, using stroke patients as a case. The literature review and analysis show that there is a general shortage of (stroke) patient information and a lack of successful concepts and strategies for verbal (stroke) patient information delivered by nurses in hospitals. The authors have developed a theoretical basis for health information as a nursing intervention, modelled as a communicative teach-and-learn process that applies to all patients. The health information takes place as a separate nursing intervention within a non-public, face-to-face communication situation and within the steps of the nursing process. Health information is seen as a learning process for patients and nurses alike. We consider learning as information production (constructivism) and information processing (cognitivism). Both processes are influenced by various factors, including the patient's illness situation and personality, the information content, and the environment. For successful health information, it is necessary to attend to these aspects, which can be realized through a constructivist understanding of didactics. An evaluation study is needed to test this concept of health information.
McDonnell, J. D.; Schunck, N.; Higdon, D.; ...
2015-03-24
Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. In addition, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDonnell, J. D.; Schunck, N.; Higdon, D.
2015-03-24
Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. As a result, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
On Utilizing Optimal and Information Theoretic Syntactic Modeling for Peptide Classification
NASA Astrophysics Data System (ADS)
Aygün, Eser; Oommen, B. John; Cataltepe, Zehra
Syntactic methods in pattern recognition have been used extensively in bioinformatics, and in particular, in the analysis of gene and protein expressions, and in the recognition and classification of bio-sequences. These methods are almost universally distance-based. This paper concerns the use of an Optimal and Information Theoretic (OIT) probabilistic model [11] to achieve peptide classification using the information residing in their syntactic representations. The latter has traditionally been achieved using the edit distances required in the respective peptide comparisons. We advocate that one can model the differences between compared strings as a mutation model consisting of random Substitutions, Insertions and Deletions (SID) obeying the OIT model. Thus, in this paper, we show that the probability measure obtained from the OIT model can be perceived as a sequence similarity metric, using which a Support Vector Machine (SVM)-based peptide classifier, referred to as OIT_SVM, can be devised.
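The traditional distance-based baseline that the OIT model is contrasted with is the edit distance over the SID operations; a minimal dynamic-programming sketch (on toy peptide strings, not the paper's data) is shown below.

```python
def edit_distance(a, b):
    """Levenshtein distance: the minimum number of substitutions,
    insertions and deletions (the SID operations) turning string a
    into string b, computed by dynamic programming."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i                # delete all of a's prefix
    for j in range(n + 1):
        d[0][j] = j                # insert all of b's prefix
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution / match
    return d[m][n]

print(edit_distance("GAVLI", "GAVIL"))  # → 2
```

The OIT model replaces this single integer with a probability that one string mutated into the other under random SID operations, which is then used as a similarity measure inside the SVM.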
Information-Theoretical Analysis of EEG Microstate Sequences in Python.
von Wegner, Frederic; Laufs, Helmut
2018-01-01
We present an open-source Python package to compute information-theoretical quantities for electroencephalographic data. Electroencephalography (EEG) measures the electrical potential generated by the cerebral cortex and the set of spatial patterns projected by the brain's electrical potential on the scalp surface can be clustered into a set of representative maps called EEG microstates. Microstate time series are obtained by competitively fitting the microstate maps back into the EEG data set, i.e., by substituting the EEG data at a given time with the label of the microstate that has the highest similarity with the actual EEG topography. As microstate sequences consist of non-metric random variables, e.g., the letters A-D, we recently introduced information-theoretical measures to quantify these time series. In wakeful resting state EEG recordings, we found new characteristics of microstate sequences such as periodicities related to EEG frequency bands. The algorithms used are here provided as an open-source package and their use is explained in a tutorial style. The package is self-contained and the programming style is procedural, focusing on code intelligibility and easy portability. Using a sample EEG file, we demonstrate how to perform EEG microstate segmentation using the modified K-means approach, and how to compute and visualize the recently introduced information-theoretical tests and quantities. The time-lagged mutual information function is derived as a discrete symbolic alternative to the autocorrelation function for metric time series and confidence intervals are computed from Markov chain surrogate data. The software package provides an open-source extension to the existing implementations of the microstate transform and is specifically designed to analyze resting state EEG recordings.
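The time-lagged mutual information described above, the symbolic analogue of the autocorrelation function, can be sketched independently of the package. In this toy example (not the package's own code), a strictly periodic microstate sequence is fully predictable, so its lagged mutual information equals the marginal entropy, while a shuffled copy falls near zero.

```python
import numpy as np

def lagged_mutual_information(seq, lag):
    """Time-lagged mutual information I(X_t; X_{t+lag}) in bits for a
    symbolic sequence, e.g. microstate labels A-D."""
    x, y = np.asarray(seq)[:-lag], np.asarray(seq)[lag:]
    symbols = np.unique(seq)
    mi = 0.0
    for s1 in symbols:
        for s2 in symbols:
            pxy = np.mean((x == s1) & (y == s2))
            px, py = np.mean(x == s1), np.mean(y == s2)
            if pxy > 0:
                mi += pxy * np.log2(pxy / (px * py))
    return mi

rng = np.random.default_rng(2)
periodic = list("ABCD" * 50)            # perfectly predictable sequence
shuffled = rng.permutation(periodic)    # same symbols, order destroyed
print(lagged_mutual_information(periodic, 4),
      lagged_mutual_information(shuffled, 4))
```

For real microstate sequences, periodicities related to EEG frequency bands appear as oscillations of this function over the lag, and the package's Markov-chain surrogates supply the confidence intervals.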
NASA Astrophysics Data System (ADS)
Deepha, V.; Praveena, R.; Sivakumar, Raman; Sadasivam, K.
2014-03-01
The increasing interest in naturally occurring flavonoids is well founded given their bioactivity as antioxidants. The present investigation employs combined experimental and theoretical methods to determine the radical scavenging activity and phytochemicals present in Crotalaria globosa, a novel plant source. Preliminary quantification of the ethanolic extract of leaves shows higher phenolic and flavonoid content than the root extract; this is also validated through the DPPH radical scavenging assay. Further analysis is carried out with successive extracts of leaves in solvents of varying polarity. In the DPPH and FRAP assays, the ethyl acetate fraction (EtOAc) exhibits higher scavenging activity followed by the ethanol fraction (EtOH), whereas in the NOS assay the ethanol fraction is slightly predominant over the EtOAc fraction. LC-MS analysis provides tentative information about the presence of a flavonoid C-glycoside in the EtOAc fraction (yellow solid). The presence of the flavonoid isoorientin has been confirmed through isolation (PTLC) and detected by spectroscopic methods (UV-visible and 1H NMR). Using the B3LYP/6-311G(d,p) level of theory, the structure and reactivity of isoorientin have been explored theoretically. Analysis of the theoretical bond dissociation energy values for all O-H sites of isoorientin reveals that less energy is required to dissociate an H-atom from the B-ring than from the A- and C-rings. To validate the antioxidant characteristics of isoorientin, the relevant molecular descriptors (IP, HOMO-LUMO gap, Mulliken spin density analysis and molecular electrostatic potential surfaces) have been computed and interpreted. From the experimental and theoretical results, it is shown that isoorientin can act as a potent antiradical scavenger in oxidative systems.
Two dimensional Fourier transform methods for fringe pattern analysis
NASA Astrophysics Data System (ADS)
Sciammarella, C. A.; Bhat, G.
An overview of the use of FFTs for fringe pattern analysis is presented, with emphasis on fringe patterns containing displacement information. The techniques are illustrated via analysis of the displacement and strain distributions in the direction perpendicular to the loading, in a disk under diametral compression. The experimental strain distribution is compared to the theoretical one, and the agreement is found to be excellent in regions where the elasticity solution models the actual problem well.
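The Fourier-transform fringe method summarized above can be sketched in one dimension: isolate the carrier sideband in the spectrum, inverse-transform, and subtract the carrier phase to recover the displacement-induced phase. The fringe pattern below is synthetic and all parameters (carrier frequency, phase amplitude, band width) are illustrative.

```python
import numpy as np

# 1D sketch of Fourier-transform fringe analysis: a fringe signal with a
# carrier frequency k0 is phase-modulated; band-passing the +k0 sideband
# yields an analytic signal whose angle carries the phase (displacement).
N = 256
x = np.arange(N)
k0 = 32                                    # carrier, in cycles per record
phi = 0.5 * np.sin(2 * np.pi * x / N)      # "displacement" phase to recover
fringe = 1.0 + np.cos(2 * np.pi * k0 * x / N + phi)

S = np.fft.fft(fringe)
band = np.zeros(N, dtype=complex)
band[k0 - 16:k0 + 16] = S[k0 - 16:k0 + 16]   # keep only the +k0 sideband
analytic = np.fft.ifft(band)                 # ~ 0.5 * exp(i*(carrier + phi))
carrier = 2 * np.pi * k0 * x / N
phi_rec = np.angle(analytic * np.exp(-1j * carrier))
print(np.max(np.abs(phi_rec - phi)))         # small reconstruction error
```

In two dimensions the same steps apply to a 2D FFT of the fringe image, with phase unwrapping added when the modulation exceeds one fringe.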
The use of models by ecologists and environmental managers, to inform environmental management and decision-making, has grown exponentially in the past 50 years. Due to logistical, economic and theoretical benefits, model users are frequently transferring preexisting models to n...
Guided waves by axisymmetric and non-axisymmetric surface loading on hollow cylinders
Shin; Rose
1999-06-01
Guided waves generated by axisymmetric and non-axisymmetric surface loading on a hollow cylinder are studied. For the theoretical analysis of the superposed guided waves, a normal mode concept is employed. The amplitude factors of individual guided wave modes are studied with respect to varying surface pressure loading profiles. Both theoretical and experimental focus is given to the guided waves generated by both axisymmetric and non-axisymmetric excitation. For the experiments, a comb transducer and a high-power tone-burst function generator system are used on a sample Inconel tube. Surface loading conditions, such as circumferential loading angles and axial loading lengths, are used together with the frequency and phase velocity to control the axisymmetric and non-axisymmetric mode excitations. The experimental study demonstrates the use of a practical non-axisymmetric partial loading technique in generating axisymmetric modes, particularly useful in the inspection of tubing and piping with limited circumferential access. Both the theoretical and experimental studies also indicate that the set of flexural modes reflected from a defect carries information on the reflector's circumferential angle, as well as potentially other classification and sizing features. The axisymmetric and non-axisymmetric guided wave modes should both be carefully considered to improve the overall analysis of guided waves generated in hollow cylinders.
Prasai, Binay; Wilson, A R; Wiley, B J; Ren, Y; Petkov, Valeri
2015-11-14
The extent to which current theoretical modeling alone can reveal real-world metallic nanoparticles (NPs) at the atomic level is scrutinized, shown to be insufficient, and then improved through a pragmatic approach involving straightforward experiments. In particular, silica-supported Au(100-x)Pd(x) NPs (x = 30, 46 and 58), 4 to 6 nm in size and explored for catalytic applications, are characterized structurally by total scattering experiments, including high-energy synchrotron X-ray diffraction (XRD) coupled to atomic pair distribution function (PDF) analysis. Atomic-level models for the NPs are built by molecular dynamics simulations based on the Sutton-Chen (SC) method, an archetype of current theoretical modeling. Models are matched against independent experimental data and are demonstrated to be inaccurate unless their theoretical foundation, i.e. the SC method, is supplemented with basic yet crucial information on the length and strength of metal-to-metal bonds and, when necessary, structural disorder in the actual NPs studied. An atomic PDF-based approach for accessing such information and implementing it in theoretical modeling is put forward. For completeness, the approach is concisely demonstrated on 15 nm water-dispersed Au particles explored for bio-medical applications and 16 nm hexane-dispersed Fe48Pd52 particles explored for magnetic applications as well. It is argued that when "tuned up" against experiments relevant to metals and alloys confined to nanoscale dimensions, such as total scattering coupled to atomic PDF analysis, rather than by mere intuition and/or against data for the respective solids, atomic-level theoretical modeling can provide a sound understanding of the synthesis-structure-property relationships in real-world metallic NPs. Ultimately this can help advance nanoscience and technology a step closer to producing metallic NPs by rational design.
Günay, Ulviye; Kılınç, Gülsen
2018-06-01
Nursing education contains both theoretical and practical training processes. Clinical training is the basis of nursing education. The quality of clinical training is closely related to the quality of the clinical learning environment. This study aimed to determine the transfer of theoretical knowledge into clinical practice by nursing students and the difficulties they experience during this process. A qualitative research design was used in the study. The study was conducted in 2015 with 30 nursing students in a university located in the east of Turkey, constituting three focus groups. The questions directed to the students during the focus group interviews were as follows: What do you think about your clinical training? How do you evaluate yourself in the process of putting your theoretical knowledge into clinical practice? What kind of difficulties are you experiencing in clinical practices? The data were interpreted using the method of content analysis. Most of the students reported that theoretical information they received was excessive, their ability to put most of this information into practice was weak, and they lacked courage to touch patients for fear of implementing procedures incorrectly. As a result of the analysis of the data, five main themes were determined: clinical training, guidance, communication, hospital environment, and expectations. The results of this study showed that nursing students found their clinical knowledge and skills insufficient and usually failed to transfer their theoretical knowledge into clinical practices. The study observed that nursing students experienced various issues in clinical practices. In order to fix these issues and achieve an effective clinical training environment, collaboration should be achieved among nursing instructors, nurses, nursing school and hospital managements.
Additionally, the number of nursing educators should be increased and training programs should be provided regarding effective clinical training methods. Copyright © 2018 Elsevier Ltd. All rights reserved.
Gary D. Grossman; Robert E. Ratajczak; C. Michael Wagner; J. Todd Petty
2010-01-01
1. We used information theoretic statistics [Akaike's Information Criterion (AIC)] and regression analysis in a multiple hypothesis testing approach to assess the processes capable of explaining long-term demographic variation in a lightly exploited brook trout population in Ball Creek, NC. We sampled a 100-m-long second-order site during both spring and autumn 1991…
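As a sketch of the model-selection machinery named above, Akaike's Information Criterion can be computed from a model's log-likelihood, or from the residual sum of squares for least-squares fits. The model count, sample size, and RSS values below are invented for illustration, not taken from the Ball Creek data.

```python
import math

def aic(log_likelihood, k):
    """Akaike's Information Criterion: AIC = 2k - 2 ln L."""
    return 2 * k - 2 * log_likelihood

def aic_least_squares(rss, n, k):
    """AIC for a least-squares fit with Gaussian errors: n ln(RSS/n) + 2k."""
    return n * math.log(rss / n) + 2 * k

# Hypothetical comparison: a simpler model (3 parameters) vs a richer one (6),
# fit to the same n = 50 observations; the lower AIC is preferred.
n = 50
model_a = aic_least_squares(rss=120.0, n=n, k=3)
model_b = aic_least_squares(rss=110.0, n=n, k=6)
best = "A" if model_a < model_b else "B"
```

Here the richer model's small RSS improvement does not pay for its three extra parameters, so AIC favors the simpler model.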
Informing Physics: Jacob Bekenstein and the Informational Turn in Theoretical Physics
NASA Astrophysics Data System (ADS)
Belfer, Israel
2014-03-01
In his PhD dissertation in the early 1970s, the Mexican-Israeli theoretical physicist Jacob Bekenstein developed the thermodynamics of black holes using a generalized version of the second law of thermodynamics. This work made it possible for physicists to describe and analyze black holes using information-theoretical concepts. It also helped to transform information theory into a fundamental and foundational concept in theoretical physics. The story of Bekenstein's work—which was initially opposed by many scientists, including Stephen Hawking—highlights the transformation within physics towards an information-oriented scientific mode of theorizing. This "informational turn" amounted to a mild-mannered revolution within physics, revolutionary without being rebellious.
Challenging convention: symbolic interactionism and grounded theory.
Newman, Barbara
2008-01-01
Little is written in the literature about the decisions researchers make, and their justifications of method, in response to a particular clinical problem together with an appropriate and congruent theoretical perspective, particularly for Glaserian grounded theory. I contend that the utilisation of symbolic interactionism as a theoretical perspective to inform and guide the evolving research process and analysis of data when using the classic or Glaserian grounded theory (GT) method is not always appropriate. Within this article I offer an analysis of the key issues to be addressed when contemplating the use of Glaserian GT and the utilisation of an appropriate theoretical perspective, rather than accepting the convention of symbolic interactionism (SI). The analysis became imperative in a study I conducted that sought to explore the concerns, adaptive behaviours, psychosocial processes and relevant interactions over a 12-month period among newly diagnosed persons with end stage renal disease, dependent on haemodialysis in the home environment for survival. The reality of perception was central to the end product in the study. Human ethics approval was granted by six committees within the New South Wales Health Department and one from a university.
Analysis of nonlinear internal waves observed by Landsat thematic mapper
NASA Astrophysics Data System (ADS)
Artale, V.; Levi, D.; Marullo, S.; Santoleri, R.
1990-09-01
In this work we test the compatibility between the theoretical parameters of a nonlinear wave model and the quantitative information that one can deduce from satellite-derived data. The theoretical parameters are obtained by applying an inverse problem to the solution of the Cauchy problem for the Korteweg-de Vries equation. Our results are applied to the case of internal wave patterns elaborated from two different satellite sensors at the south of Messina (the thematic mapper) and at the north of Messina (the synthetic aperture radar).
Analysis of cellular signal transduction from an information theoretic approach.
Uda, Shinsuke; Kuroda, Shinya
2016-03-01
Signal transduction processes the information of various cellular functions, including cell proliferation, differentiation, and death. The information for controlling cell fate is transmitted by concentrations of cellular signaling molecules. However, how much information is transmitted in signaling pathways has thus far not been investigated. Shannon's information theory paves the way to quantitatively analyze information transmission in signaling pathways. The theory has recently been applied to signal transduction, and mutual information of signal transduction has been determined to be a measure of information transmission. We review this work and provide an overview of how signal transduction transmits informational input and exerts biological output. Copyright © 2015 Elsevier Ltd. All rights reserved.
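The mutual-information measure reviewed above has a direct plug-in estimator for discretized data. The sketch below computes I(X;Y) in bits from paired stimulus/response samples; the two toy datasets are hypothetical, not drawn from any signaling experiment.

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Plug-in estimate of I(X;Y) in bits from a list of (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * math.log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

# A perfectly reliable pathway transmits 1 bit per binary stimulus...
coupled = [(0, 0), (1, 1)] * 50
# ...while a response independent of the stimulus transmits none.
independent = [(0, 0), (0, 1), (1, 0), (1, 1)] * 25
```

In real signaling data the variables are concentrations, so binning (or more sophisticated estimators) precedes this plug-in step.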
"I May Be Crackin', But Um Fackin": Racial Humor in "The Watsons Go To Birmingham--1963"
ERIC Educational Resources Information Center
McNair, Jonda C.
2008-01-01
This article examines the utilization of racial humor in Christopher Paul Curtis' novel, "The Watsons Go To Birmingham--1963." The theoretical perspectives that inform the analysis include critical race theory and humor theory. The results of the analysis reveal that the use of humor in this book is influenced to a significant degree by race and…
Vakorin, Vasily A.; Mišić, Bratislav; Krakovska, Olga; McIntosh, Anthony Randal
2011-01-01
Variability in source dynamics across the sources in an activated network may be indicative of how the information is processed within a network. Information-theoretic tools allow one not only to characterize local brain dynamics but also to describe interactions between distributed brain activity. This study follows such a framework and explores the relations between signal variability and asymmetry in mutual interdependencies in a data-driven pipeline of non-linear analysis of neuromagnetic sources reconstructed from human magnetoencephalographic (MEG) data collected as a reaction to a face recognition task. Asymmetry in non-linear interdependencies in the network was analyzed using transfer entropy, which quantifies predictive information transfer between the sources. Variability of the source activity was estimated using multi-scale entropy, quantifying the rate of which information is generated. The empirical results are supported by an analysis of synthetic data based on the dynamics of coupled systems with time delay in coupling. We found that the amount of information transferred from one source to another was correlated with the difference in variability between the dynamics of these two sources, with the directionality of net information transfer depending on the time scale at which the sample entropy was computed. The results based on synthetic data suggest that both time delay and strength of coupling can contribute to the relations between variability of brain signals and information transfer between them. Our findings support the previous attempts to characterize functional organization of the activated brain, based on a combination of non-linear dynamics and temporal features of brain connectivity, such as time delay. PMID:22131968
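A minimal, first-order version of the transfer-entropy computation described above can be sketched with plug-in frequency estimates. Real MEG analyses use embedded histories and careful bias-corrected estimators; the binary series here are synthetic, built only to show the directionality of the measure.

```python
import math
from collections import Counter

def transfer_entropy(src, dst):
    """Plug-in estimate of first-order transfer entropy TE(src -> dst), in bits:
    TE = sum p(d1, d0, s0) * log2[ p(d1 | d0, s0) / p(d1 | d0) ]."""
    triples = list(zip(dst[1:], dst[:-1], src[:-1]))  # (next dst, past dst, past src)
    n = len(triples)
    c3 = Counter(triples)
    c2 = Counter((d0, s0) for _, d0, s0 in triples)
    cd = Counter((d1, d0) for d1, d0, _ in triples)
    c1 = Counter(d0 for _, d0, _ in triples)
    return sum((c / n) * math.log2((c / c2[(d0, s0)]) / (cd[(d1, d0)] / c1[d0]))
               for (d1, d0, s0), c in c3.items())

# dst copies src with one step of lag, so src's past fully determines dst's future:
src = [0, 1, 0, 0, 1, 1, 0, 1] * 40
dst = [0] + src[:-1]
te_fwd = transfer_entropy(src, dst)   # positive: predictive information flows src -> dst
te_self = transfer_entropy(dst, dst)  # exactly zero: the source adds nothing new
```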
Motivation and Satisfaction in Internet-Supported Learning Environments: A Review
ERIC Educational Resources Information Center
Bekele, Teklu Abate
2010-01-01
Previous studies examined student motivation and satisfaction in Internet-Supported Learning Environments (ISLE) in higher education but none provided a comprehensive analysis of significant methodological and theoretical issues. To contribute toward filling this knowledge gap and then to better inform instructional systems development, practice,…
Vocabulary, Grammar, Sex, and Aging
ERIC Educational Resources Information Center
Moscoso del Prado Martín, Fermín
2017-01-01
Understanding the changes in our language abilities along the lifespan is a crucial step for understanding the aging process both in normal and in abnormal circumstances. Besides controlled experimental tasks, it is equally crucial to investigate language in unconstrained conversation. I present an information-theoretical analysis of a corpus of…
Concepts and Synonymy in the UMLS Metathesaurus
Merrill, Gary H.
2009-01-01
This paper advances a detailed exploration of the complex relationships among terms, concepts, and synonymy in the UMLS (Unified Medical Language System) Metathesaurus, and proposes the study and understanding of the Metathesaurus from a model-theoretic perspective. Initial sections provide the background and motivation for such an approach, and a careful informal treatment of these notions is offered as a context and basis for the formal analysis. What emerges from this is a set of puzzles and confusions in the Metathesaurus and its literature pertaining to synonymy and its relation to terms and concepts. A model theory for a segment of the Metathesaurus is then constructed, and its adequacy relative to the informal treatment is demonstrated. Finally, it is shown how this approach clarifies and addresses the puzzles educed from the informal discussion, and how the model-theoretic perspective may be employed to evaluate some fundamental criticisms of the Metathesaurus. For users of the UMLS, two significant results of this analysis are a rigorous clarification of the different senses of synonymy that appear in treatments of the Metathesaurus and an illustration of the dangers in computing inferences involving ambiguous terms. PMID:19838995
Strategic analysis for safeguards systems: a feasibility study. Volume 2. Appendix
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldman, A J
1984-12-01
This appendix provides detailed information regarding game theory (strategic analysis) and its potential role in safeguards, to supplement the main body of this report. In particular, it includes an extensive, though not comprehensive, review of literature on game theory and on other topics that relate to the formulation of a game-theoretic model (e.g. the payoff functions). The appendix describes the basic form and components of game theory models, and the solvability of various models. It then discusses three basic issues related to the use of strategic analysis in material accounting: (1) its understandability; (2) its viability in regulatory settings; and (3) difficulties in the use of mixed strategies. Each of the components of a game-theoretic model is then discussed and related to the present context.
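To make the mixed strategies mentioned in issue (3) concrete, the closed-form solution of a 2x2 zero-sum matrix game is sketched below. The payoff numbers are purely illustrative and are not taken from the safeguards models in the report.

```python
def solve_2x2_zero_sum(a, b, c, d):
    """Closed-form mixed-strategy solution of a 2x2 zero-sum game with row-player
    payoff matrix [[a, b], [c, d]], assuming no saddle point (interior optimum)."""
    denom = a - b - c + d
    p = (d - c) / denom          # probability the row player plays row 1
    q = (d - b) / denom          # probability the column player plays column 1
    v = (a * d - b * c) / denom  # value of the game to the row player
    return p, q, v

# Matching pennies: both players should randomize 50/50; the game value is 0.
p, q, v = solve_2x2_zero_sum(1, -1, -1, 1)
```

The practical difficulty the appendix alludes to is visible even here: the "solution" is a probability distribution over actions, which a regulator must somehow commit to and audit.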
2016-12-22
assumptions of behavior. This research proposes an information theoretic methodology to discover such complex network structures and dynamics while overcoming…the difficulties historically associated with their study. Indeed, this was the first application of an information theoretic methodology as a tool…
On the theoretical velocity distribution and flow resistance in natural channels
NASA Astrophysics Data System (ADS)
Moramarco, Tommaso; Dingman, S. Lawrence
2017-12-01
The velocity distribution in natural channels is of considerable interest for streamflow measurements to obtain information on discharge and flow resistance. This study focuses on the comparison of theoretical velocity distributions based on 1) entropy theory, and 2) the two-parameter power law. The analysis identifies the correlation between the parameters of the distributions and defines their dependence on the geometric and hydraulic characteristics of the channel. Specifically, we investigate how the parameters are related to the flow resistance in terms of Manning roughness, shear velocity and water surface slope, and several formulae showing their relationships are proposed. Velocity measurements carried out in the past 20 years at Ponte Nuovo gauged section along the Tiber River, central Italy, are the basis for the analysis.
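One widely used entropy-based profile of the kind compared here, Chiu's distribution, has a simple closed form linking the entropy parameter M to the mean-to-maximum velocity ratio. The sketch below is a generic illustration with made-up parameter values, not a fit to the Tiber River data.

```python
import math

def chiu_velocity(xi, u_max, M):
    """Chiu's entropy-based profile u(xi) = (u_max / M) ln[1 + (e^M - 1) xi],
    with xi in [0, 1] the normalized position (xi = 1 at the velocity maximum)."""
    return (u_max / M) * math.log(1.0 + (math.exp(M) - 1.0) * xi)

def mean_to_max_ratio(M):
    """Phi(M) = u_mean / u_max = e^M / (e^M - 1) - 1/M, which links the entropy
    parameter M to the discharge-relevant mean velocity."""
    return math.exp(M) / (math.exp(M) - 1.0) - 1.0 / M
```

In practice M is calibrated per gauged section from velocity measurements, after which a single surface (or maximum) velocity reading yields the mean velocity and hence discharge.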
NASA Technical Reports Server (NTRS)
Acton, W. H.; Crabtree, M. S.; Simons, J. C.; Gomer, F. E.; Eckel, J. S.
1983-01-01
Information theoretic analysis and subjective paired-comparison and task ranking techniques were employed in order to scale the workload of 20 communications-related tasks frequently performed by the captain and first officer of transport category aircraft. Tasks were drawn from taped conversations between aircraft and air traffic controllers (ATC). Twenty crewmembers performed subjective message comparisons and task rankings on the basis of workload. Information theoretic results indicated a broad range of task difficulty levels, and substantial differences between captain and first officer workload levels. Preliminary subjective data tended to corroborate these results. A hybrid scale reflecting the results of both the analytical and the subjective techniques is currently being developed. The findings will be used to select representative sets of communications for use in high fidelity simulation.
Li, Yue; Jha, Devesh K; Ray, Asok; Wettergren, Thomas A
2018-06-01
This paper presents information-theoretic performance analysis of passive sensor networks for detection of moving targets. The proposed method falls largely under the category of data-level information fusion in sensor networks. To this end, a measure of information contribution for sensors is formulated in a symbolic dynamics framework. The network information state is approximately represented as the largest principal component of the time series collected across the network. To quantify each sensor's contribution for generation of the information content, Markov machine models as well as x-Markov (pronounced as cross-Markov) machine models, conditioned on the network information state, are constructed; the difference between the conditional entropies of these machines is then treated as an approximate measure of information contribution by the respective sensors. The x-Markov models represent the conditional temporal statistics given the network information state. The proposed method has been validated on experimental data collected from a local area network of passive sensors for target detection, where the statistical characteristics of environmental disturbances are similar to those of the target signal in the sense of time scale and texture. A distinctive feature of the proposed algorithm is that the network decisions are independent of the behavior and identity of the individual sensors, which is desirable from computational perspectives. Results are presented to demonstrate the proposed method's efficacy to correctly identify the presence of a target with very low false-alarm rates. The performance of the underlying algorithm is compared with that of a recent data-driven, feature-level information fusion algorithm. It is shown that the proposed algorithm outperforms the other algorithm.
Wu, Wenjie; Wu, Zemin; Rong, Chunying; Lu, Tian; Huang, Ying; Liu, Shubin
2015-07-23
The electrophilic aromatic substitution reaction for nitration, halogenation, sulfonation, and acylation is a vastly important category of chemical transformation. Its reactivity and regioselectivity are predominantly determined by the nucleophilicity of carbon atoms on the aromatic ring, which in turn is immensely influenced by the group attached to the aromatic ring a priori. In this work, taking advantage of recent developments in quantifying nucleophilicity (electrophilicity) with descriptors from the information-theoretic approach in density functional reactivity theory, we examine the reactivity properties of this reaction system from three perspectives. These include scaling patterns of information-theoretic quantities such as Shannon entropy, Fisher information, Ghosh-Berkowitz-Parr entropy and information gain at both molecular and atomic levels; quantitative predictions of the barrier height with both Hirshfeld charge and information gain; and energetic decomposition analyses of the barrier height for the reactions. To that end, we focused in this work on the identity reaction of the monosubstituted-benzene molecule reacting with hydrogen fluoride using boron trifluoride as the catalyst in the gas phase. We also considered 19 substituting groups, 9 of which are ortho/para directing and the other 9 meta directing, besides the case of R = -H. Similar scaling patterns for these information-theoretic quantities found for stable species elsewhere were disclosed for these reaction systems. We also unveiled novel scaling patterns for information gain at the atomic level. The barrier height of the reactions can reliably be predicted by using both the Hirshfeld charge and information gain at the regioselective carbon atom.
The energy decomposition analysis ensued yields an unambiguous picture about the origin of the barrier height, where we showed that it is the electrostatic interaction that plays the dominant role, while the roles played by exchange-correlation and steric effects are minor but indispensable. Results obtained in this work should shed new light for better understanding of the factors governing the reactivity for this class of reactions and assisting ongoing efforts for the design of new and more efficient catalysts for such kind of transformations.
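The "information gain" descriptor used above is the Kullback-Leibler divergence between a normalized, discretized electron density and a reference density. The sketch below shows the generic computation on toy probability vectors; the vectors are illustrative stand-ins, not actual Hirshfeld-partitioned densities.

```python
import math

def shannon_entropy(p):
    """Shannon entropy -sum p_i ln p_i (in nats) of a normalized distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def information_gain(p, q):
    """Kullback-Leibler divergence sum p_i ln(p_i / q_i): the information
    gained when a reference distribution q is refined into p."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

uniform = [0.25] * 4           # e.g. a featureless reference density
skewed = [0.4, 0.3, 0.2, 0.1]  # e.g. a density polarized by a substituent
```

Information gain is zero only when the two distributions coincide and grows as the actual density departs from the reference, which is what makes it usable as a site-reactivity descriptor.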
Novel Image Encryption Scheme Based on Chebyshev Polynomial and Duffing Map
2014-01-01
We present a novel image encryption algorithm using a Chebyshev polynomial for permutation and substitution and a Duffing map for substitution. Comprehensive security analysis has been performed on the designed scheme using key space analysis, visual testing, histogram analysis, information entropy calculation, correlation coefficient analysis, differential analysis, key sensitivity test, and speed test. The study demonstrates that the proposed image encryption algorithm shows the advantages of a key space of more than 10^113 and a desirable level of security based on the good statistical results and theoretical arguments. PMID:25143970
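As a loose sketch of only the chaotic-keystream idea behind such schemes (not the published algorithm), one can iterate the Chebyshev map T_n(x) = cos(n · arccos x) on [-1, 1] and quantize its orbit into bytes. The seed, degree, and XOR cipher below are illustrative choices of my own.

```python
import math

def chebyshev_keystream(seed, degree=4, length=16):
    """Iterate the Chebyshev map T_n(x) = cos(n * arccos(x)) on [-1, 1]
    and quantize each state into one keystream byte."""
    x = seed
    out = []
    for _ in range(length):
        x = math.cos(degree * math.acos(x))
        out.append(int((x + 1.0) / 2.0 * 255))
    return out

def xor_cipher(data, stream):
    """XOR is involutive, so the same call encrypts and decrypts."""
    return bytes(b ^ s for b, s in zip(data, stream))

ks = chebyshev_keystream(0.321)
ct = xor_cipher(b"attack at dawn!!", ks)
pt = xor_cipher(ct, ks)
```

The chaotic map's sensitivity to the seed is what the paper's key sensitivity test probes: a tiny change in the seed yields an unrelated keystream after a few iterations. A toy XOR stream like this is of course far weaker than the full permutation-substitution scheme being described.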
Toward Theory-Based Instruction in Scientific Problem Solving.
ERIC Educational Resources Information Center
Heller, Joan I.; And Others
Several empirical and theoretical analyses related to scientific problem-solving are reviewed, including: detailed studies of individuals at different levels of expertise, and computer models simulating some aspects of human information processing during problem solving. Analysis of these studies has revealed many facets about the nature of the…
Disgust: Evolved Function and Structure
ERIC Educational Resources Information Center
Tybur, Joshua M.; Lieberman, Debra; Kurzban, Robert; DeScioli, Peter
2013-01-01
Interest in and research on disgust has surged over the past few decades. The field, however, still lacks a coherent theoretical framework for understanding the evolved function or functions of disgust. Here we present such a framework, emphasizing 2 levels of analysis: that of evolved function and that of information processing. Although there is…
Leader Positivity and Follower Creativity: An Experimental Analysis
ERIC Educational Resources Information Center
Avey, James B.; Richmond, F. Lynn; Nixon, Don R.
2012-01-01
Using an experimental research design, 191 working adults were randomly assigned to two experimental conditions in order to test a theoretical model linking leader and follower positive psychological capital (PsyCap). Multiple methods were used to gather information from the participants. We found when leader PsyCap was manipulated experimentally,…
Vulnerability and the Neo-Liberal Youth Citizen: A View from Australia
ERIC Educational Resources Information Center
McLeod, Julie
2012-01-01
This article develops a critical discourse analysis of Australian youth and community policies, examined through a discussion of theoretical debates about citizenship and vulnerability. Informed by a Foucauldian genealogical approach, it explores citizenship, not in terms of rights and universal categories, but in terms of relational, situated and…
Students Individual Engagement in GIS
ERIC Educational Resources Information Center
Madsen, Lene Møller; Christiansen, Frederik; Rump, Camilla
2014-01-01
This paper develops two sets of concepts to theorize why students engage differently in Geographical Information Systems (GIS). These theoretical concepts are used as an analytical lens to explore empirical data on the experiences and engagement of students enrolled in an undergraduate GIS course in planning and management. The analysis shows that…
Networked Learning for Agricultural Extension: A Framework for Analysis and Two Cases
ERIC Educational Resources Information Center
Kelly, Nick; Bennett, John McLean; Starasts, Ann
2017-01-01
Purpose: This paper presents economic and pedagogical motivations for adopting information and communications technology (ICT)- mediated learning networks in agricultural education and extension. It proposes a framework for networked learning in agricultural extension and contributes a theoretical and case-based rationale for adopting the…
ERIC Educational Resources Information Center
Ferguson, William D.
2011-01-01
Undergraduate economics lags behind cutting-edge economic theory. The author briefly reviews six related advances that profoundly extend and deepen economic analysis: game-theoretic modeling, collective-action problems, information economics and contracting, social preference theory, conceptualizing rationality, and institutional theory. He offers…
Averaging Models: Parameters Estimation with the R-Average Procedure
ERIC Educational Resources Information Center
Vidotto, G.; Massidda, D.; Noventa, S.
2010-01-01
The Functional Measurement approach, proposed within the theoretical framework of Information Integration Theory (Anderson, 1981, 1982), can be a useful multi-attribute analysis tool. Compared to the majority of statistical models, the averaging model can account for interaction effects without adding complexity. The R-Average method (Vidotto &…
A Study of Students on the Autism Spectrum Transformation in a High School Transition Program
ERIC Educational Resources Information Center
Moore-Gumora, Courteny
2014-01-01
This study brings together the theoretical and empirical practices of traditional informative education, radical transformative education, and sustainable education reform. An analysis of learning disability and constructivist learning are used to elucidate the socio-complexity of historic academic constructs concerning educational leadership for…
ERIC Educational Resources Information Center
Enyi, Amaechi Uneke
2015-01-01
The study entitled "Language and Interactional Discourse: Deconstructing the Talk-Generating Machinery in Natural Conversation" is an analysis of spontaneous and informal conversation. The study, carried out in the theoretical and methodological tradition of Ethnomethodology, was aimed at explicating how ordinary talk is organized and…
Pattern Activity Clustering and Evaluation (PACE)
NASA Astrophysics Data System (ADS)
Blasch, Erik; Banas, Christopher; Paul, Michael; Bussjager, Becky; Seetharaman, Guna
2012-06-01
With the vast amount of network information available on the activities of people (i.e. motions, transportation routes, and site visits) there is a need to explore the salient properties of data that detect and discriminate the behavior of individuals. Recent machine learning approaches include methods of data mining, statistical analysis, clustering, and estimation that support activity-based intelligence. We seek to explore contemporary methods in activity analysis using machine learning techniques that discover and characterize behaviors that enable grouping, anomaly detection, and adversarial intent prediction. To evaluate these methods, we describe the mathematics and potential information theory metrics to characterize behavior. A scenario is presented to demonstrate the concept and metrics that could be useful for layered sensing behavior pattern learning and analysis. We leverage work on group tracking, learning, and clustering approaches, and utilize information-theoretic metrics for classification, behavioral and event pattern recognition, and activity and entity analysis. The performance evaluation of activity analysis supports high-level information fusion of user alerts, data queries and sensor management for data extraction, relations discovery, and situation analysis of existing data.
NASA Astrophysics Data System (ADS)
Yang, Liang-Yi; Sun, Di-Hua; Zhao, Min; Cheng, Sen-Lin; Zhang, Geng; Liu, Hui
2018-03-01
In this paper, a new micro-cooperative driving car-following model is proposed to investigate the effect of continuous historical velocity difference information on traffic stability. The linear stability criterion of the new model is derived with linear stability theory and the results show that the unstable region in the headway-sensitivity space will be shrunk by taking the continuous historical velocity difference information into account. Through nonlinear analysis, the mKdV equation is derived to describe the traffic evolution behavior of the new model near the critical point. Via numerical simulations, the theoretical analysis results are verified and the results indicate that the continuous historical velocity difference information can enhance the stability of traffic flow in the micro-cooperative driving process.
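A toy discrete-time sketch of a car-following model in this family (an optimal-velocity relaxation term plus a term driven by stored historical velocity differences) can illustrate the stability setting: uniform flow on a ring road is an equilibrium of the dynamics. All parameter values are invented for illustration and do not reproduce the paper's model or its mKdV analysis.

```python
import math

def optimal_velocity(headway, v_max=2.0, h_c=4.0):
    """Bando-type optimal velocity function (illustrative parameters)."""
    return v_max / 2.0 * (math.tanh(headway - h_c) + math.tanh(h_c))

def step(positions, velocities, history, a=0.8, lam=0.3, dt=0.1, road=100.0):
    """One Euler step on a ring road; the relative-velocity term uses the
    average of each car's stored historical velocity differences."""
    n = len(positions)
    new_x, new_v = [], []
    for i in range(n):
        lead = (i + 1) % n
        headway = (positions[lead] - positions[i]) % road
        dv_avg = sum(history[i]) / len(history[i]) if history[i] else 0.0
        acc = a * (optimal_velocity(headway) - velocities[i]) + lam * dv_avg
        v = velocities[i] + acc * dt
        new_v.append(v)
        new_x.append((positions[i] + v * dt) % road)
    for i in range(n):  # record this step's velocity differences (keep last 5)
        history[i].append(velocities[(i + 1) % n] - velocities[i])
        if len(history[i]) > 5:
            history[i].pop(0)
    return new_x, new_v

# Uniform flow is an equilibrium: equally spaced cars driving at the optimal
# velocity for their headway stay uniform under the dynamics.
n_cars, road = 10, 100.0
positions = [i * road / n_cars for i in range(n_cars)]
v_eq = optimal_velocity(road / n_cars)
velocities = [v_eq] * n_cars
history = [[] for _ in range(n_cars)]
for _ in range(50):
    positions, velocities = step(positions, velocities, history)
```

The paper's linear stability analysis asks when small perturbations of exactly this uniform state decay rather than grow, and shows the historical velocity-difference term enlarges the stable region.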
Social challenges when implementing information systems in everyday work in a nursing context.
Nilsson, Lina; Eriksén, Sara; Borg, Christel
2014-09-01
Implementation of information systems in healthcare has become a lengthy process where healthcare staff (eg, nurses) are expected to put information into systems without getting the overall picture of the potential usefulness for their own work. The aim of this study was to explore social challenges when implementing information systems in everyday work in a nursing context. Moreover, this study aimed at putting perceived social challenges in a theoretical framework to address them more constructively when implementing information systems in healthcare. Influenced by institutional ethnography, the findings are based on interviews, observations, and written reflections. The categories (with subcategories) presented in the findings were power (changing the existing hierarchy; alienation), professional identity (calling on hold; expert becomes novice; changed routines), and encounter (ignorant introductions; preconceived notions). Social Cognitive Theory, Diffusion of Innovations, organizational culture, and dramaturgical analysis are proposed to set up a theoretical framework. If social challenges are not considered and addressed in the implementation process, it will be affected by nurses' solidarity to existing power structures and their own professional identity. Thus, implementation of information systems affects more aspects in the organization than might have been intended. These aspects need to be taken into account in the implementation process.
A practical and systematic review of Weibull statistics for reporting strengths of dental materials
Quinn, George D.; Quinn, Janet B.
2011-01-01
Objectives: To review the history, theory and current applications of Weibull analyses sufficient to make informed decisions regarding practical use of the analysis in dental material strength testing. Data: References are made to examples in the engineering and dental literature, but this paper also includes illustrative analyses of Weibull plots, fractographic interpretations, and Weibull distribution parameters obtained for a dense alumina, two feldspathic porcelains, and a zirconia. Sources: Informational sources include Weibull's original articles, later articles specific to applications and theoretical foundations of Weibull analysis, texts on statistics and fracture mechanics and the international standards literature. Study selection: The chosen Weibull analyses are used to illustrate technique, the importance of flaw size distributions, physical meaning of Weibull parameters and concepts of "equivalent volumes" to compare measured strengths obtained from different test configurations. Conclusions: Weibull analysis has a strong theoretical basis and can be of particular value in dental applications, primarily because of test specimen size limitations and the use of different test configurations. Also endemic to dental materials, however, is increased difficulty in satisfying application requirements, such as confirming fracture origin type and diligence in obtaining quality strength data. PMID:19945745
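The core of a Weibull strength analysis, estimating the Weibull modulus m and characteristic strength from a set of fracture strengths, can be sketched with the standard linearization ln ln[1/(1-P)] = m ln(sigma) - m ln(sigma0) and median-rank probability estimates. The synthetic "strength" values below are generated from a known distribution purely to check the fit; the standards the paper cites prescribe specific estimators (e.g. maximum likelihood with bias correction), so this is only a conceptual sketch.

```python
import math

def weibull_fit(strengths):
    """Estimate Weibull modulus m and characteristic strength s0 by linear
    regression of y = ln ln[1/(1 - P)] on x = ln(strength), with median-rank
    probabilities P_i = (i + 0.5)/N for the sorted strengths."""
    s = sorted(strengths)
    n = len(s)
    xs = [math.log(v) for v in s]
    ys = [math.log(-math.log(1.0 - (i + 0.5) / n)) for i in range(n)]
    mx, my = sum(xs) / n, sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    s0 = math.exp(mx - my / m)  # from y = m (x - ln s0)
    return m, s0

# Synthetic strengths placed at the median-rank quantiles of a Weibull
# distribution with m = 10 and s0 = 500 (think MPa):
true_m, true_s0, n = 10.0, 500.0, 30
samples = [true_s0 * (-math.log(1.0 - (i + 0.5) / n)) ** (1.0 / true_m)
           for i in range(n)]
m_hat, s0_hat = weibull_fit(samples)
```

A high modulus m means a narrow strength distribution (consistent flaw sizes); the characteristic strength s0 is the stress at which 63.2% of specimens have failed.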
Applications of statistical physics and information theory to the analysis of DNA sequences
NASA Astrophysics Data System (ADS)
Grosse, Ivo
2000-10-01
DNA carries the genetic information of most living organisms, and the goal of genome projects is to uncover that genetic information. One basic task in the analysis of DNA sequences is the recognition of protein coding genes. Powerful computer programs for gene recognition have been developed, but most of them are based on statistical patterns that vary from species to species. In this thesis I address the question of whether there exist universal statistical patterns that are different in coding and noncoding DNA of all living species, regardless of their phylogenetic origin. In search for such species-independent patterns I study the mutual information function of genomic DNA sequences, and find that it shows persistent period-three oscillations. To understand the biological origin of the observed period-three oscillations, I compare the mutual information function of genomic DNA sequences to the mutual information function of stochastic model sequences. I find that the pseudo-exon model is able to reproduce the mutual information function of genomic DNA sequences. Moreover, I find that a generalization of the pseudo-exon model can connect the existence and the functional form of long-range correlations to the presence and the length distributions of coding and noncoding regions. Based on these theoretical studies I am able to find an information-theoretical quantity, the average mutual information (AMI), whose probability distributions are significantly different in coding and noncoding DNA, while they are almost identical in all studied species. These findings show that there exist universal statistical patterns that are different in coding and noncoding DNA of all studied species, and they suggest that the AMI may be used to identify genes in different living species, irrespective of their taxonomic origin.
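The central quantity here, the mutual information function I(k), measures the statistical dependence between symbols k positions apart. A toy sketch (not the thesis code) shows the period-three signal on a synthetic "coding-like" sequence in which every third base is fixed:

```python
import math
import random
from collections import Counter

def mutual_information_function(seq, k):
    """Mutual information I(k), in bits, between symbols at distance k:
    I(k) = sum_ab p_ab(k) * log2( p_ab(k) / (p_a * p_b) )."""
    pairs = [(seq[i], seq[i + k]) for i in range(len(seq) - k)]
    n = len(pairs)
    joint = Counter(pairs)
    left = Counter(a for a, _ in pairs)
    right = Counter(b for _, b in pairs)
    mi = 0.0
    for (a, b), c in joint.items():
        pab = c / n
        mi += pab * math.log2(pab * n * n / (left[a] * right[b]))
    return mi

# synthetic sequence with codon-like period-3 structure:
# every third position is 'G', the rest are uniform over ACGT
random.seed(0)
bases = "ACGT"
seq = "".join("G" if i % 3 == 0 else random.choice(bases)
              for i in range(3000))
mi1 = mutual_information_function(seq, 1)
mi2 = mutual_information_function(seq, 2)
mi3 = mutual_information_function(seq, 3)
```

Because positions separated by a multiple of three share the same codon phase, I(3) stands well above I(1) and I(2), mirroring the period-three oscillations reported for genomic DNA.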
NASA Astrophysics Data System (ADS)
Rey, Michaël; Nikitin, Andrei V.; Bézard, Bruno; Rannou, Pascal; Coustenis, Athena; Tyuterev, Vladimir G.
2018-03-01
The spectrum of methane is very important for the analysis and modeling of Titan's atmosphere, but its insufficient knowledge in the near infrared, with the absence of reliable absorption coefficients, is an important limitation. To help the astronomy community analyze high-quality spectra, we report in the present work the first accurate theoretical methane line lists (T = 50-350 K) of 12CH4 and 13CH4 up to 13400 cm-1 (λ > 0.75 μm). These lists are built from extensive variational calculations using our recent ab initio potential and dipole moment surfaces and will be freely accessible via the TheoReTS information system (http://theorets.univ-reims.fr, http://theorets.tsu.ru). Validation of these lists is presented throughout the present paper. For the sample of lines where upper energies were available from published analyses of experimental laboratory 12CH4 spectra, small empirical corrections in positions were introduced that could be useful for future high-resolution applications. We finally apply the TheoReTS line lists to model Titan spectra as observed by VIMS and by DISR, respectively onboard Cassini and Huygens. These data are used to check that the TheoReTS line lists are able to model observations. We also make comparisons with other experimental or theoretical line lists. It appears that TheoReTS gives very reliable results, better than ExoMol and even HITRAN2012, except around 1.6 μm, where it gives very similar results. We conclude that TheoReTS is suitable to be used for the modeling of planetary radiative transfer and photometry. A re-analysis of spectra recorded by the DISR instrument during the descent of the Huygens probe suggests that the CH4 mixing ratio decreases with altitude in Titan's stratosphere, reaching a value of ∼10^-2 above the 110 km altitude.
NASA Technical Reports Server (NTRS)
Kummerow, Christian; Giglio, Louis
1994-01-01
This paper describes a multichannel physical approach for retrieving rainfall and vertical structure information from satellite-based passive microwave observations. The algorithm makes use of statistical inversion techniques based upon theoretically calculated relations between rainfall rates and brightness temperatures. Potential errors introduced into the theoretical calculations by the unknown vertical distribution of hydrometeors are overcome by explicitly accounting for diverse hydrometeor profiles. This is accomplished by allowing for a number of different vertical distributions in the theoretical brightness temperature calculations and requiring consistency between the observed and calculated brightness temperatures. This paper will focus primarily on the theoretical aspects of the retrieval algorithm, which includes a procedure used to account for inhomogeneities of the rainfall within the satellite field of view as well as a detailed description of the algorithm as it is applied over both ocean and land surfaces. The residual error between observed and calculated brightness temperatures is found to be an important quantity in assessing the uniqueness of the solution. It is further found that the residual error is a meaningful quantity that can be used to derive expected accuracies from this retrieval technique. Examples comparing the retrieved results as well as the detailed analysis of the algorithm performance under various circumstances are the subject of a companion paper.
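The consistency requirement between observed and calculated brightness temperatures can be caricatured in a few lines (an illustrative sketch with made-up profile data, not the operational algorithm): each candidate hydrometeor profile carries precomputed brightness temperatures, the retrieval keeps the profile whose simulated temperatures best match the observation, and the RMS residual doubles as a rough quality indicator.

```python
def select_profile(observed_tb, candidates):
    """Pick the candidate hydrometeor profile whose simulated brightness
    temperatures (K) have the smallest RMS residual to the observations;
    return the profile together with that residual."""
    def rms(a, b):
        return (sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)) ** 0.5
    best = min(candidates, key=lambda c: rms(observed_tb, c["tb"]))
    return best, rms(observed_tb, best["tb"])

# hypothetical lookup table: rain rate (mm/h) with 3-channel Tb (K)
candidates = [
    {"rain": 1.0, "tb": [270.0, 255.0, 240.0]},
    {"rain": 5.0, "tb": [250.0, 235.0, 225.0]},
    {"rain": 15.0, "tb": [230.0, 220.0, 215.0]},
]
profile, residual = select_profile([251.0, 236.0, 224.0], candidates)
```

In this toy case the middle profile wins with a small residual; a large residual for every candidate would flag, as in the text, that no assumed vertical structure explains the observation well.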
The dynamics of information-driven coordination phenomena: A transfer entropy analysis
Borge-Holthoefer, Javier; Perra, Nicola; Gonçalves, Bruno; González-Bailón, Sandra; Arenas, Alex; Moreno, Yamir; Vespignani, Alessandro
2016-01-01
Data from social media provide unprecedented opportunities to investigate the processes that govern the dynamics of collective social phenomena. We consider an information theoretical approach to define and measure the temporal and structural signatures typical of collective social events as they arise and gain prominence. We use the symbolic transfer entropy analysis of microblogging time series to extract directed networks of influence among geolocalized subunits in social systems. This methodology captures the emergence of system-level dynamics close to the onset of socially relevant collective phenomena. The framework is validated against a detailed empirical analysis of five case studies. In particular, we identify a change in the characteristic time scale of the information transfer that flags the onset of information-driven collective phenomena. Furthermore, our approach identifies an order-disorder transition in the directed network of influence between social subunits. In the absence of clear exogenous driving, social collective phenomena can be represented as endogenously driven structural transitions of the information transfer network. This study provides results that can help define models and predictive algorithms for the analysis of societal events based on open source data. PMID:27051875
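The workhorse quantity in this study, transfer entropy, can be sketched for the simplest case of two binary symbol series with history length one (an illustrative toy, not the study's geolocalized pipeline): T_{Y→X} quantifies how much knowing Y's past reduces uncertainty about X's next symbol beyond what X's own past provides.

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Transfer entropy T_{Y->X} in bits, with history length 1:
    T = sum p(x1, x0, y0) * log2[ p(x1|x0,y0) / p(x1|x0) ]."""
    triples = list(zip(x[1:], x[:-1], y[:-1]))
    n = len(triples)
    c_xxy = Counter(triples)
    c_xy = Counter((x0, y0) for _, x0, y0 in triples)
    c_xx = Counter((x1, x0) for x1, x0, _ in triples)
    c_x = Counter(x0 for _, x0, _ in triples)
    te = 0.0
    for (x1, x0, y0), c in c_xxy.items():
        te += (c / n) * math.log2(c * c_x[x0] / (c_xy[(x0, y0)] * c_xx[(x1, x0)]))
    return te

# coupled toy series: x copies y with a one-step lag, so information
# flows from Y to X but not the other way around
random.seed(0)
y = [random.randint(0, 1) for _ in range(5000)]
x = [0] + y[:-1]
te_yx = transfer_entropy(x, y)  # should be close to 1 bit
te_xy = transfer_entropy(y, x)  # should be close to 0
```

The strong asymmetry between the two directions is exactly the kind of directed-influence signal the paper extracts, at scale, from microblogging time series.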
Mapping interictal epileptic discharges using mutual information between concurrent EEG and fMRI.
Caballero-Gaudes, César; Van de Ville, Dimitri; Grouiller, Frédéric; Thornton, Rachel; Lemieux, Louis; Seeck, Margitta; Lazeyras, François; Vulliemoz, Serge
2013-03-01
The mapping of haemodynamic changes related to interictal epileptic discharges (IED) in simultaneous electroencephalography (EEG) and functional MRI (fMRI) studies is usually carried out by means of EEG-correlated fMRI analyses, where the EEG information specifies the model to test on the fMRI signal. The sensitivity and specificity critically depend on the accuracy of EEG detection and the validity of the haemodynamic model. In this study we investigated whether an information theoretic analysis based on the mutual information (MI) between the presence of epileptic activity on EEG and the fMRI data can provide further insights into the haemodynamic changes related to interictal epileptic activity. The important features of MI are that: 1) both recording modalities are treated symmetrically; 2) no a priori model of the haemodynamic response function, or assumption of a linear relationship between the spiking activity and BOLD responses, is required; and 3) no parametric model of the type of noise or its probability distribution is necessary for the computation of MI. Fourteen patients with pharmaco-resistant focal epilepsy underwent EEG-fMRI, and intracranial EEG and/or surgical resection with positive postoperative outcome (seizure freedom or considerable reduction in seizure frequency) was available in 7/14 patients. We used nonparametric statistical assessment of the MI maps based on a four-dimensional wavelet packet resampling method. The results of MI were compared to the statistical parametric maps obtained with two conventional General Linear Model (GLM) analyses based on the informed basis set (canonical HRF and its temporal and dispersion derivatives) and the Finite Impulse Response (FIR) models. The MI results were concordant with the electro-clinically or surgically defined epileptogenic area in 8/14 patients and showed the same degree of concordance as the results obtained with the GLM-based methods in 12 patients (7 concordant and 5 discordant).
In one patient, the information theoretic analysis improved the delineation of the irritative zone compared with the GLM-based methods. Our findings suggest that an information theoretic analysis can provide clinically relevant information about the BOLD signal changes associated with the generation and propagation of interictal epileptic discharges. The concordance between the MI, GLM and FIR maps supports the validity of the assumptions adopted in GLM-based analyses of interictal epileptic activity with EEG-fMRI, indicating that they do not significantly constrain the localization of the epileptogenic zone. Copyright © 2012 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Miranda, Silvania V.; Tarapanoff, Kira M. A.
2008-01-01
Introduction: The paper deals with the identification of the information needs and information competencies of a professional group. Theoretical basis: A theoretical relationship between information needs and information competencies as subjects is proposed. Three dimensions are examine: cognitive, affective and situational. The recognition of an…
Quantitative pathology in virtual microscopy: history, applications, perspectives.
Kayser, Gian; Kayser, Klaus
2013-07-01
With the emerging success of commercially available personal computers and the rapid progress in the development of information technologies, morphometric analyses of static histological images have been introduced to improve our understanding of the biology of diseases such as cancer. The first applications were quantifications of immunohistochemical expression patterns. In addition to object counting and feature extraction, laws of thermodynamics have been applied in morphometric calculations termed syntactic structure analysis. Here, one has to consider that the information of an image can be calculated for separate hierarchical layers such as single pixels, clusters of pixels, segmented small objects, clusters of small objects, and objects of higher order composed of several small objects. Using syntactic structure analysis in histological images, functional states can be extracted and the efficiency of labor in tissues can be quantified. Image standardization procedures, such as shading correction and color normalization, can overcome artifacts blurring clear thresholds. Morphometric techniques are not only useful for learning more about biological features of growth patterns; they can also be helpful in routine diagnostic pathology. In such cases, entropy calculations are applied in analogy to theoretical considerations concerning information content. Thus, regions with high information content can automatically be highlighted. Analysis of the "regions of high diagnostic value" can, in the context of clinical information, site of involvement and patient data (e.g., age, sex), deliver support in histopathological differential diagnoses. It can be expected that quantitative virtual microscopy will open new possibilities for automated histological support. Automated integrated quantification of histological slides also serves for quality assurance.
The development and theoretical background of morphometric analyses in histopathology are reviewed, as well as their application and potential future implementation in virtual microscopy. Copyright © 2012 Elsevier GmbH. All rights reserved.
Dynamics analysis of epidemic and information spreading in overlay networks.
Liu, Guirong; Liu, Zhimei; Jin, Zhen
2018-05-07
We establish an SIS-UAU model to present the dynamics of epidemic and information spreading in overlay networks. The overlay network is represented by two layers: one where the dynamics of the epidemic evolves and another where the information spreads. We theoretically derive the explicit formulas for the basic reproduction number of awareness, R_0^a, by analyzing the self-consistent equation, and the basic reproduction number of disease, R_0^d, by using the next generation matrix. The formula of R_0^d shows that the effect of awareness can reduce the basic reproduction number of disease. In particular, when awareness does not affect epidemic spreading, R_0^d is shown to match the existing theoretical results. Furthermore, we demonstrate that the disease-free equilibrium is globally asymptotically stable if R_0^d < 1, and the endemic equilibrium is globally asymptotically stable if R_0^d > 1. Finally, numerical simulations show that information plays a vital role in preventing and controlling disease and effectively reduces the final disease scale. Copyright © 2018 Elsevier Ltd. All rights reserved.
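The threshold behavior around R_0^d can be illustrated with the simplest mean-field SIS equation (a deliberately reduced sketch; the paper's two-layer overlay model is far richer, and the awareness factor below is a toy assumption): for R0 = β/γ > 1 the infected fraction settles at 1 − 1/R0, and below the threshold the epidemic dies out.

```python
def sis_endemic_fraction(beta, gamma, steps=20000, dt=0.01, i0=0.01):
    """Forward-Euler iteration of the mean-field SIS equation
    di/dt = beta*i*(1-i) - gamma*i.  For R0 = beta/gamma > 1 the
    infected fraction converges to i* = 1 - 1/R0; otherwise to 0."""
    i = i0
    for _ in range(steps):
        i += dt * (beta * i * (1 - i) - gamma * i)
    return i

# supercritical case: R0 = 0.6/0.2 = 3, endemic level 1 - 1/3
i_endemic = sis_endemic_fraction(0.6, 0.2)

# awareness caricatured as a factor (1 - a) scaling the transmission
# rate: strong awareness (a = 0.9) pushes R0 below 1 and the epidemic
# dies out, echoing the role of awareness in the model above
i_aware = sis_endemic_fraction(0.6 * (1 - 0.9), 0.2)
```

The simulation confirms both regimes of the stability result: convergence to the endemic equilibrium when R0 > 1 and to the disease-free equilibrium when R0 < 1.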
Intermodulation Atomic Force Microscopy and Spectroscopy
NASA Astrophysics Data System (ADS)
Hutter, Carsten; Platz, Daniel; Tholen, Erik; Haviland, David; Hansson, Hans
2009-03-01
We present a powerful new method of dynamic AFM, which allows one to gain far more information about the tip-surface interaction than standard amplitude or phase imaging, while scanning at comparable speed. Our method, called intermodulation atomic force microscopy (ImAFM), employs the manifestly nonlinear phenomenon of intermodulation to extract information about tip-surface forces. ImAFM uses one eigenmode of a mechanical resonator, the latter driven at two frequencies to produce many spectral peaks near its resonance, where sensitivity is highest [1]. We furthermore present a protocol for decoding the combined information encoded in the spectrum of intermodulation peaks. Our theoretical framework suggests methods to enhance the gained information by using a different parameter regime as compared to Ref. [1]. We also discuss strategies for solving the inverse problem, i.e., for extracting the nonlinear tip-surface interaction from the response, and note the limitations of our theoretical analysis. We will further report on the latest progress in experimentally employing our new protocol. [1] D. Platz, E. A. Tholen, D. Pesen, and D. B. Haviland, Appl. Phys. Lett. 92, 153106 (2008).
Use of information sources by family physicians: a literature survey.
Verhoeven, A A; Boerma, E J; Meyboom-de Jong, B
1995-01-01
Analysis of the use of information sources by family physicians is important for both practical and theoretical reasons. First, analysis of the ways in which family physicians handle information may point to opportunities for improvement. Second, such efforts may lead to improvements in the methodology of literature research in general. This article reports on a survey of the literature on information use by family physicians. Eleven relevant research publications could be found. The data showed that family physicians used colleagues most often as information sources, followed by journals and books. This outcome corresponded with results in other professions. Several factors influenced the use of information sources by family physicians, including the physical, functional, and intellectual accessibility of the source; the physician's age; participation by the physician in research or education; the social context of the physician; practice characteristics; and the stage of the information-gathering process. The publications studied suggested ways to improve information gathering in the areas of computerization, education, library organization, and journal articles. PMID:7703946
Mass media and environmental issues: a theoretical analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parlour, J.W.
1980-01-01
A critique of the weak empirical and theoretical foundations of commentaries on the mass media in the environmental literature argues that they stem from the incidental rather than fundamental concern for the social dimensions of environmental problems. The contributions of information theory, cybernetics, sociology, and political science to micro and macro theories of mass communications are reviewed. Information from empirical analyses of the mass media's portrayal of social issues, including the environment, is related to Hall's dominant ideology thesis of the mass media and the elitist-conflict model of society. It is argued that the media's portrayal of environmental issues is structured by dominant power-holding groups in society, with the result that the media effectively function to maintain and reinforce the status quo to the advantage of these dominant groups. 78 references.
Angle-of-Arrival Assisted GNSS Collaborative Positioning.
Huang, Bin; Yao, Zheng; Cui, Xiaowei; Lu, Mingquan
2016-06-20
For outdoor and global navigation satellite system (GNSS)-challenged scenarios, collaborative positioning algorithms are proposed to fuse information from GNSS satellites and terrestrial wireless systems. This paper derives the Cramér-Rao lower bound (CRLB) and algorithms for angle-of-arrival (AOA)-assisted GNSS collaborative positioning. Based on the CRLB model and collaborative positioning algorithms, theoretical analyses are performed to specify the effects of various factors on the accuracy of collaborative positioning, including the number of users, their distribution, and the accuracy of the AOA measurements. Besides, the influence of the relative location of the collaborative users is also discussed in order to choose appropriate neighboring users, which helps reduce computational complexity. Simulations and actual experiments are carried out with several GNSS receivers in different scenarios, and the results are consistent with the theoretical analysis.
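The CRLB for pure AOA positioning in 2-D can be sketched in a few lines (an illustrative simplification with made-up geometry, not the paper's GNSS-fusion bound): each anchor contributes Fisher information along the direction perpendicular to its line of sight, and adding collaborators can only lower the bound.

```python
def aoa_crlb(anchors, pos, sigma):
    """CRLB on 2-D position MSE from angle-of-arrival measurements with
    i.i.d. Gaussian angle noise (std sigma, radians).  Each anchor adds
    Fisher information (1/sigma^2) * g g^T, where g = grad of
    atan2(y - yi, x - xi) w.r.t. (x, y) = [-(y - yi), (x - xi)] / d^2."""
    x, y = pos
    f11 = f12 = f22 = 0.0
    for xi, yi in anchors:
        dx, dy = x - xi, y - yi
        d2 = dx * dx + dy * dy
        gx, gy = -dy / d2, dx / d2
        f11 += gx * gx / sigma ** 2
        f12 += gx * gy / sigma ** 2
        f22 += gy * gy / sigma ** 2
    det = f11 * f22 - f12 * f12
    return (f11 + f22) / det  # trace of the inverse FIM (bound on MSE)

pos = (3.0, 4.0)
anchors3 = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
anchors6 = anchors3 + [(10.0, 10.0), (-5.0, 5.0), (5.0, -5.0)]
b3 = aoa_crlb(anchors3, pos, 0.01)
b6 = aoa_crlb(anchors6, pos, 0.01)
```

Since each extra anchor adds a positive semidefinite term to the Fisher information matrix, b6 < b3: more collaborating users tighten the bound, in line with the paper's analysis of how the number and distribution of users affect accuracy.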
An Algebra-Based Introductory Computational Neuroscience Course with Lab.
Fink, Christian G
2017-01-01
A course in computational neuroscience has been developed at Ohio Wesleyan University which requires no previous experience with calculus or computer programming, and which exposes students to theoretical models of neural information processing and techniques for analyzing neural data. The exploration of theoretical models of neural processes is conducted in the classroom portion of the course, while data analysis techniques are covered in lab. Students learn to program in MATLAB and are offered the opportunity to conclude the course with a final project in which they explore a topic of their choice within computational neuroscience. Results from a questionnaire administered at the beginning and end of the course indicate significant gains in student facility with core concepts in computational neuroscience, as well as with analysis techniques applied to neural data.
Alternate entropy measure for assessing volatility in financial markets.
Bose, Ranjan; Hamacher, Kay
2012-11-01
We propose two alternate information theoretical approaches to assess non-Gaussian fluctuations in the return dynamics of financial markets. Specifically, we use superinformation, which is a measure of the disorder of the entropy of a time series. We argue on theoretical grounds for its usefulness and show that it can be applied effectively for analyzing returns. A study of stock market data covering over five years has been carried out using this approach. We show how superinformation helps to identify and classify important signals in the time series. The financial crisis of 2008 comes out very clearly in the superinformation plots. In addition, we introduce the super mutual information. Distinct super mutual information signatures are observed that might be used to mitigate idiosyncratic risk. The universality of our approach has been tested by carrying out the analysis for the 100 stocks listed in the S&P100 index. The average superinformation values for the S&P100 stocks correlate very well with the VIX.
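The first ingredient of superinformation is the entropy of returns in a window, which rises with volatility; the paper's superinformation then takes the disorder (entropy) of that window-entropy series itself. A minimal sketch of the first step, with synthetic "returns" standing in for market data (an illustrative assumption, not the authors' code):

```python
import math
import random

def shannon_entropy(values, lo=-24.0, hi=24.0, width=1.0):
    """Shannon entropy (bits) of a sample over fixed uniform bins;
    values outside [lo, hi) are clipped into the edge bins."""
    nbins = int((hi - lo) / width)
    counts = [0] * nbins
    for v in values:
        k = int((min(max(v, lo), hi - 1e-9) - lo) / width)
        counts[k] += 1
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

# synthetic return windows: calm regime vs crisis-like regime
random.seed(3)
calm = [random.gauss(0.0, 0.7) for _ in range(2000)]    # low volatility
crisis = [random.gauss(0.0, 6.0) for _ in range(2000)]  # high volatility
h_calm = shannon_entropy(calm)
h_crisis = shannon_entropy(crisis)
```

With fixed bins, the high-volatility window spreads over many more bins and its entropy is several bits higher; a time series of such window entropies, fed through the same entropy estimator once more, yields the superinformation-style measure described above.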
Schaper, Louise; Pervan, Graham
2007-01-01
The research reported in this paper describes the development, empirical validation and analysis of a model of technology acceptance by Australian occupational therapists. The study described involved the collection of quantitative data through a national survey. The theoretical significance of this work is that it uses a thoroughly constructed research model, with one of the largest sample sizes ever tested (n=1605), to extend technology acceptance research into the health sector. Results provide strong support for the model. This work reveals the complexity of the constructs and relationships that influence technology acceptance and highlights the need to include sociotechnical and system issues in studies of technology acceptance in healthcare to improve information system implementation success in this arena. The results of this study have practical and theoretical implications for health informaticians and researchers in the field of health informatics and information systems, tertiary educators, Commonwealth and State Governments and the allied health professions.
Can Facebook informational use foster adolescent civic engagement?
Lenzi, Michela; Vieno, Alessio; Altoè, Gianmarco; Scacchi, Luca; Perkins, Douglas D; Zukauskiene, Rita; Santinello, Massimo
2015-06-01
The findings on the association between Social Networking Sites and civic engagement are mixed. The present study aims to evaluate a theoretical model linking the informational use of Internet-based social media (specifically, Facebook) with civic competencies and intentions for future civic engagement, taking into account the mediating role of civic discussions with family and friends and sharing the news online. Participants were 114 Italian high school students aged 14-17 years (57 % boys). Path analysis was used to evaluate the proposed theoretical model. Results showed that Facebook informational use was associated with higher levels of adolescent perceived competence for civic action, both directly and through the mediation of civic discussion with parents and friends (offline). Higher levels of civic competencies, then, were associated with a stronger intention to participate in the civic domain in the future. Our findings suggest that Facebook may provide adolescents with additional tools through which they can learn civic activities or develop the skills necessary to participate in the future.
ERIC Educational Resources Information Center
Pettersson, Rune
2014-01-01
Information design has practical and theoretical components. As an academic discipline we may view information design as a combined discipline, a practical theory, or as a theoretical practice. So far information design has incorporated facts, influences, methods, practices, principles, processes, strategies, and tools from a large number of…
ERIC Educational Resources Information Center
Williams, Robert; Hewison, Alistair; Wildman, Stuart; Roskell, Carolyn
2013-01-01
This paper presents findings from a qualitative study undertaken with 46 African and African Caribbean men exploring their experiences of fatherhood. Data analysis was informed by Connell's theoretical work on changing gender relations. Findings indicate that fathers' lives were mediated by masculinities, racism, gender, migration and generational…
Shaking Religious Education: A New Look at the Literature
ERIC Educational Resources Information Center
Kameniar, Barbara
2007-01-01
This article offers an analysis of religious education practice through the literature that informs it. It engages Derrida's critique of the "metaphysics of presence" (1982a) to develop a theoretical framework for a new look at the ways in which different approaches to religious education represent religion and racial difference. The…
Time Is Precious: Variable- and Event-Centred Approaches to Process Analysis in CSCL Research
ERIC Educational Resources Information Center
Reimann, Peter
2009-01-01
Although temporality is a key characteristic of the core concepts of CSCL--interaction, communication, learning, knowledge building, technology use--and although CSCL researchers have privileged access to process data, the theoretical constructs and methods employed in research practice frequently neglect to make full use of information relating…
Invisible and Hypervisible Academics: The Experiences of Black and Minority Ethnic Teacher Educators
ERIC Educational Resources Information Center
Lander, Vini; Santoro, Ninetta
2017-01-01
This qualitative study investigated the experiences of Black and Minority Ethnic (BME) teacher educators in England and Australia working within the predominantly white space of the academy. Data analysis was informed by a multidimensional theoretical framework drawing on Critical Race Theory, whiteness and Puwar's concept of the Space Invader.…
Communication Network Integration and Group Uniformity in a Complex Organization.
ERIC Educational Resources Information Center
Danowski, James A.; Farace, Richard V.
This paper contains a discussion of the limitations of research on group processes in complex organizations and the manner in which a procedure for network analysis in on-going systems can reduce problems. The research literature on group uniformity processes and on theoretical models of these processes from an information processing perspective…
USDA-ARS?s Scientific Manuscript database
An essential step to understanding the genomic biology of any organism is to comprehensively survey its transcriptome. We present the Bovine Gene Atlas (BGA) a compendium of over 7.2 million unique 20 base Illumina DGE tags representing 100 tissue transcriptomes collected primarily from L1 Dominette...
Data Mining in Earth System Science (DMESS 2011)
Forrest M. Hoffman; J. Walter Larson; Richard Tran Mills; Bhorn-Gustaf Brooks; Auroop R. Ganguly; William Hargrove; et al
2011-01-01
From field-scale measurements to global climate simulations and remote sensing, the growing body of very large and long time series Earth science data are increasingly difficult to analyze, visualize, and interpret. Data mining, information theoretic, and machine learning techniques, such as cluster analysis, singular value decomposition, block entropy, Fourier and...
This work addresses a potentially serious problem in analysis or synthesis of spatially explicit data on ground water quality from wells, known to geographers as the modifiable areal unit problem (MAUP). It results from the fact that in regional aggregation of spatial data, inves...
Closeout Report for CTEQ Summer School 2015
DOE Office of Scientific and Technical Information (OSTI.GOV)
Han, Tao
The CTEQ Collaboration is an informal group of 37 experimental and theoretical high energy physicists from 20 universities and 5 national labs, engaged in a program to advance research in and understanding of QCD. This program includes the well-known collaborative project on global QCD analysis of parton distributions, the organization of a variety of workshops, periodic collaboration meetings, and the subject of this proposal: the CTEQ Summer Schools on QCD Analysis and Phenomenology.
The structural, connectomic and network covariance of the human brain.
Irimia, Andrei; Van Horn, John D
2013-02-01
Though it is widely appreciated that complex structural, functional and morphological relationships exist between distinct areas of the human cerebral cortex, the extent to which such relationships coincide remains insufficiently understood. Here we determine the extent to which correlations between brain regions are modulated by either structural, connectomic or network-theoretic properties using a structural neuroimaging data set of magnetic resonance imaging (MRI) and diffusion tensor imaging (DTI) volumes acquired from N=110 healthy human adults. To identify the linear relationships between all available pairs of regions, we use canonical correlation analysis to test whether a statistically significant correlation exists between each pair of cortical parcels as quantified via structural, connectomic or network-theoretic measures. In addition to this, we investigate (1) how each group of canonical variables (whether structural, connectomic or network-theoretic) contributes to the overall correlation and, additionally, (2) whether each individual variable makes a significant contribution to the test of the omnibus null hypothesis according to which no correlation between regions exists across subjects. We find that, although region-to-region correlations are extensively modulated by structural and connectomic measures, there are appreciable differences in how these two groups of measures drive inter-regional correlation patterns. Additionally, our results indicate that the network-theoretic properties of the cortex are strong modulators of region-to-region covariance. Our findings are useful for understanding the structural and connectomic relationship between various parts of the brain, and can inform theoretical and computational models of cortical information processing. Published by Elsevier Inc.
Accurate airway centerline extraction based on topological thinning using graph-theoretic analysis.
Bian, Zijian; Tan, Wenjun; Yang, Jinzhu; Liu, Jiren; Zhao, Dazhe
2014-01-01
Quantitative analysis of the airway tree is of critical importance in the CT-based diagnosis and treatment of common pulmonary diseases. Extraction of the airway centerline is a precursor to identifying the airway's hierarchical structure, measuring geometrical parameters, and guiding visualized detection. Traditional methods suffer from extra branches and circles caused by incomplete segmentation results, which induce false analysis in applications. This paper proposes an automatic and robust centerline extraction method for the airway tree. First, the centerline is located with a topological thinning method: border voxels are deleted symmetrically and iteratively so as to preserve topological and geometrical properties. Second, structural information is generated using graph-theoretic analysis. Then inaccurate circles are removed with a distance-weighting strategy, and extra branches are pruned according to clinical anatomic knowledge. The centerline region without false appendices is eventually determined after the described phases. Experimental results show that the proposed method identifies more than 96% of branches, keeps consistency across different cases, and achieves a superior circle-free structure and centrality.
Game theoretic approach for cooperative feature extraction in camera networks
NASA Astrophysics Data System (ADS)
Redondi, Alessandro E. C.; Baroffio, Luca; Cesana, Matteo; Tagliasacchi, Marco
2016-07-01
Visual sensor networks (VSNs) consist of several camera nodes with wireless communication capabilities that can perform visual analysis tasks such as object identification, recognition, and tracking. Often, VSN deployments result in many camera nodes with overlapping fields of view. In the past, such redundancy has been exploited in two different ways: (1) to improve the accuracy/quality of the visual analysis task by exploiting multiview information or (2) to reduce the energy consumed for performing the visual task, by applying temporal scheduling techniques among the cameras. We propose a game theoretic framework based on the Nash bargaining solution to bridge the gap between the two aforementioned approaches. The key tenet of the proposed framework is for cameras to reduce the consumed energy in the analysis process by exploiting the redundancy in the reciprocal fields of view. Experimental results in both simulated and real-life scenarios confirm that the proposed scheme is able to increase the network lifetime, with a negligible loss in terms of visual analysis accuracy.
Quantum Information Theory of Measurement
NASA Astrophysics Data System (ADS)
Glick, Jennifer Ranae
Quantum measurement lies at the heart of quantum information processing and is one of the criteria for quantum computation. Despite its central role, there remains a need for a robust quantum information-theoretical description of measurement. In this work, I will quantify how information is processed in a quantum measurement by framing it in quantum information-theoretic terms. I will consider a diverse set of measurement scenarios, including weak and strong measurements, and parallel and consecutive measurements. In each case, I will perform a comprehensive analysis of the role of entanglement and entropy in the measurement process and track the flow of information through all subsystems. In particular, I will discuss how weak and strong measurements are fundamentally of the same nature and show that weak values can be computed exactly for certain measurements with an arbitrary interaction strength. In the context of the Bell-state quantum eraser, I will derive a trade-off between the coherence and "which-path" information of an entangled pair of photons and show that a quantum information-theoretic approach yields additional insights into the origins of complementarity. I will consider two types of quantum measurements: those that are made within a closed system where every part of the measurement device, the ancilla, remains under control (what I will call unamplified measurements), and those performed within an open system where some degrees of freedom are traced over (amplified measurements). For sequences of measurements of the same quantum system, I will show that information about the quantum state is encoded in the measurement chain and that some of this information is "lost" when the measurements are amplified: the ancillae become equivalent to a quantum Markov chain.
Finally, using the coherent structure of unamplified measurements, I will outline a protocol for generating remote entanglement, an essential resource for quantum teleportation and quantum cryptographic tasks.
Info/information theory: speakers choose shorter words in predictive contexts.
Mahowald, Kyle; Fedorenko, Evelina; Piantadosi, Steven T; Gibson, Edward
2013-02-01
A major open question in natural language research is the role of communicative efficiency in the origin and on-line processing of language structures. Here, we use word pairs like chimp/chimpanzee, which differ in length but have nearly identical meanings, to investigate the communicative properties of lexical systems and the communicative pressures on language users. If language is designed to be information-theoretically optimal, then shorter words should convey less information than their longer counterparts, when controlling for meaning. Consistent with this prediction, a corpus analysis revealed that the short form of our meaning-matched pairs occurs in more predictive contexts than the longer form. Second, a behavioral study showed that language users choose the short form more often in predictive contexts, suggesting that tendencies to be information-theoretically efficient manifest in explicit behavioral choices. Our findings, which demonstrate the prominent role of communicative efficiency in the structure of the lexicon, complement and extend the results of Piantadosi, Tily, and Gibson (2011), who showed that word length is better correlated with Shannon information content than with frequency. Crucially, we show that this effect arises at least in part from active speaker choice. Copyright © 2012 Elsevier B.V. All rights reserved.
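The predictability measure involved here can be made concrete with a toy plug-in estimate of a word's average surprisal given its preceding word. This is only an illustrative sketch (the function name and mini-corpus are invented; real analyses use large, smoothed n-gram models):

```python
import math
from collections import Counter

def avg_information_content(corpus_tokens, word):
    """Average surprisal (bits) of `word` given the preceding word:
    the token-weighted mean of -log2 P(word | previous),
    using raw bigram counts as a plug-in probability estimate."""
    bigrams = Counter(zip(corpus_tokens, corpus_tokens[1:]))
    context_totals = Counter(corpus_tokens[:-1])
    surprisals = []
    for (prev, w), n in bigrams.items():
        if w == word:
            p = n / context_totals[prev]       # P(word | prev)
            surprisals.extend([-math.log2(p)] * n)
    return sum(surprisals) / len(surprisals)

tokens = "the chimp saw the chimp and the dog saw a chimp".split()
info = avg_information_content(tokens, "chimp")
```

In this toy corpus, "chimp" follows "the" two times out of three (surprisal log2(3/2) each) and "a" deterministically once (surprisal 0), so the average works out to 2·log2(1.5)/3 bits. On real corpora, the same token-weighted average is what a short form occurring in more predictive contexts drives down.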
Mielniczuk, Jan; Teisseyre, Paweł
2018-03-01
Detection of gene-gene interactions is one of the most important challenges in genome-wide case-control studies. Besides traditional logistic regression analysis, entropy-based methods have recently attracted significant attention. Among entropy-based methods, interaction information is one of the most promising measures, having many desirable properties. Although both logistic regression and interaction information have been used in several genome-wide association studies, the relationship between them has not been thoroughly investigated theoretically. The present paper attempts to fill this gap. We show that although certain connections between the two methods exist, in general they refer to two different concepts of dependence, and looking for interactions in these two senses leads to different approaches to interaction detection. We introduce an ordering between interaction measures and specify conditions, for independent and dependent genes, under which interaction information is a more discriminative measure than logistic regression. Moreover, we show that for so-called perfect distributions these measures are equivalent. The numerical experiments illustrate the theoretical findings, indicating that interaction information and its modified version are more universal tools for detecting various types of interaction than logistic regression and linkage disequilibrium measures. © 2017 WILEY PERIODICALS, INC.
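Interaction information is computable directly from the entropies of a 3-way contingency table. As a minimal numpy sketch of the measure itself (not the authors' implementation; the XOR-style toy table is invented to show a purely synergistic interaction):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def interaction_information(counts):
    """Interaction information II(X;Y;Z) from a 3-way contingency table,
    e.g. genotype A x genotype B x case/control status.
    II = I(X;Y|Z) - I(X;Y); positive values indicate synergy."""
    p = counts / counts.sum()
    Hxyz = entropy(p.ravel())
    Hxy = entropy(p.sum(axis=2).ravel())
    Hxz = entropy(p.sum(axis=1).ravel())
    Hyz = entropy(p.sum(axis=0).ravel())
    Hx = entropy(p.sum(axis=(1, 2)))
    Hy = entropy(p.sum(axis=(0, 2)))
    Hz = entropy(p.sum(axis=(0, 1)))
    # II = -H(X) - H(Y) - H(Z) + H(X,Y) + H(X,Z) + H(Y,Z) - H(X,Y,Z)
    return -Hx - Hy - Hz + Hxy + Hxz + Hyz - Hxyz

# XOR-like epistasis: Z = X xor Y; neither gene alone predicts status,
# but together they determine it exactly
counts = np.zeros((2, 2, 2))
for x in (0, 1):
    for y in (0, 1):
        counts[x, y, x ^ y] = 25  # 25 observations per configuration
ii = interaction_information(counts)  # 1 bit of pure synergy
```

This XOR pattern is exactly the case where marginal (main-effect) methods such as single-locus logistic regression see nothing, while interaction information registers a full bit of synergy.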
Theory and analysis of a large field polarization imaging system with obliquely incident light.
Lu, Xiaotian; Jin, Weiqi; Li, Li; Wang, Xia; Qiu, Su; Liu, Jing
2018-02-05
Polarization imaging technology provides information not only about the irradiance of a target but also about its degree and angle of polarization, giving it extensive application potential. However, polarization imaging theory is based on paraxial optics: when a beam of obliquely incident light passes through an analyser, the direction of propagation is not perpendicular to the analyser surface, and the applicability of traditional paraxial polarization imaging theory is challenged. This paper investigates a theoretical model of a polarization imaging system with obliquely incident light and establishes a polarization imaging transmission model for a large field of obliquely incident light. In an imaging experiment with an integrating-sphere light source and a rotatable polarizer, the transmission model is verified and analysed for two cases: natural light and linearly polarized light incidence. The theoretical model is consistent with the experimental results yet differs distinctly from the traditional paraxial approximation model. The results demonstrate the accuracy and necessity of the proposed model and its value in guiding theoretical and systematic research on large-field polarization imaging.
Research on the novel FBG detection system for temperature and strain field distribution
NASA Astrophysics Data System (ADS)
Liu, Zhi-chao; Yang, Jin-hua
2017-10-01
To collect temperature and strain field distribution information, a novel FBG detection system was designed. The system applies a linearly chirped FBG structure to obtain a large bandwidth. The novel FBG cover was designed with a linearly varying thickness so that it responds differently at different locations, allowing the temperature and strain field distributions to be obtained simultaneously from the reflection spectrum. The cover's theoretical response function is calculated, and its solution is derived for the strain field distribution. Simulation analysis shows the trends of the temperature and strain field distributions under different strain strengths and action positions, from which the strain field distribution can be resolved. In the experiments, FOB100-series equipment was used to measure temperature, and JSM-A10-series equipment was used to measure the strain field distribution. The average experimental error was better than 1.1% for temperature and better than 1.3% for strain, with individual outliers in the test data when the strain was small. Feasibility is demonstrated by theoretical analysis, simulation, and experiment, making the system well suited for practical application.
The Sonic Altimeter for Aircraft
NASA Technical Reports Server (NTRS)
Draper, C S
1937-01-01
Discussed here are results already achieved with sonic altimeters in light of the theoretical possibilities of such instruments. From the information gained in this investigation, a procedure is outlined to determine whether or not a further development program is justified by the value of the sonic altimeter as an aircraft instrument. The information available in the literature is reviewed and condensed into a summary of sonic altimeter developments. Various methods of receiving the echo and timing the interval between the signal and the echo are considered. A theoretical discussion is given of sonic altimeter errors due to uncertainties in timing, variations in sound velocity, aircraft speed, location of the sending and receiving units, and inclinations of the flight path with respect to the ground surface. Plots are included which summarize the results in each case. An analysis is given of the effect of an inclined flight path on the frequency of the echo. A brief study of the acoustical phases of the sonic altimeter problem is carried through. The results of this analysis are used to predict approximately the maximum operating altitudes of a reasonably designed sonic altimeter under very good and very bad conditions. A final comparison is made between the estimated and experimental maximum operating altitudes which shows good agreement where quantitative information is available.
ERIC Educational Resources Information Center
Koh, Kyungwon
2011-01-01
Contemporary young people are engaged in a variety of information behaviors, such as information seeking, using, sharing, and creating. The ways youth interact with information have transformed in the shifting digital information environment; however, relatively little empirical research exists and no theoretical framework adequately explains…
An algorithmic approach to crustal deformation analysis
NASA Technical Reports Server (NTRS)
Iz, Huseyin Baki
1987-01-01
In recent years the analysis of crustal deformation measurements has become important as a result of current improvements in geodetic methods and an increasing amount of theoretical and observational data provided by several earth sciences. A first-generation data analysis algorithm which combines a priori information with current geodetic measurements was proposed. Relevant methods which can be used in the algorithm were discussed. Prior information is the unifying feature of this algorithm. Some of the problems which may arise through the use of a priori information in the analysis were indicated and preventive measures were demonstrated. The first step in the algorithm is the optimal design of deformation networks. The second step in the algorithm identifies the descriptive model of the deformation field. The final step in the algorithm is the improved estimation of deformation parameters. Although deformation parameters are estimated in the process of model discrimination, they can further be improved by the use of a priori information about them. According to the proposed algorithm this information must first be tested against the estimates calculated using the sample data only. Null-hypothesis testing procedures were developed for this purpose. Six different estimators which employ a priori information were examined. Emphasis was put on the case when the prior information is wrong and analytical expressions for possible improvements under incompatible prior information were derived.
NASA Astrophysics Data System (ADS)
Fernandes, Geraldo W. Rocha; Rodrigues, António M.; Ferreira, Carlos Alberto
2018-03-01
This article aims to characterise the research on science teachers' professional development programs that support the use of Information and Communication Technologies (ICTs), and the main trends in the theoretical frameworks (theoretical foundation, literature review or background) that underpin these studies. Through a systematic review of the literature, 76 articles were found and divided into two axes on training science teachers and the use of digital technologies, each with its own categories. The first axis (characterisation of articles) presents the key features of the selected articles (major subjects, training and actions for professional development, and major ICT tools and digital resources). The second axis (trends in theoretical frameworks) has three categories, covering theoretical frameworks that emphasise (a) digital technologies, (b) prospects of curricular renewal and (c) cognitive processes. The review also identified a group of articles whose theoretical frameworks contain multiple elements without deepening any of them, or that lack a theoretical framework supporting the study altogether. In this review, we found that many professional development programs for teachers still use inadequate strategies for bringing about change in teacher practices. New professional development proposals are emerging with the objective of minimising such difficulties, and this analysis could be a helpful tool for restructuring those proposals.
A Novel Statistical Analysis and Interpretation of Flow Cytometry Data
2013-03-31
the resulting residuals appear random. In the work that follows, I∗ = 200. The values of B and b̂j are known from the experiment. Notice that the...conjunction with the model parameter vector in a two-stage process. Unfortunately two-stage estimation may cause some parameters of the mathematical model to...information theoretic criteria such as Akaike's Information Criterion (AIC). From (4.3), it follows that the scaled residuals rjk = λjI[n̂](tj , zk; ~q
Concept analysis of moral courage in nursing: A hybrid model.
Sadooghiasl, Afsaneh; Parvizy, Soroor; Ebadi, Abbas
2018-02-01
Moral courage is one of the most fundamental virtues in the nursing profession; however, little attention has been paid to it, and no exact and clear definition of moral courage has been available. This study was carried out to define and clarify the concept in the nursing profession. The study used a hybrid model of concept analysis comprising three phases: a theoretical phase, a fieldwork phase, and a final analysis phase. To find relevant literature, electronic searches of valid databases were conducted using keywords related to the concept of courage. Fieldwork data were collected over an 11-month period from 2013 to 2014; in this phase, in-depth interviews were performed with 10 nurses. Conventional content analysis, following Graneheim and Lundman's stages, was used in the theoretical and fieldwork phases, and the results were combined in the final analysis phase. Ethical consideration: Permission for this study was obtained from the ethics committee of Tehran University of Medical Sciences, and oral and written informed consent was received from the participants. Of the 750 titles retrieved in the theoretical phase, 26 texts were analyzed, yielding 494 codes from the text analysis and 226 codes from the interview analysis. The literature review in the theoretical phase revealed the features of inherent-transcendental characteristics and a difficult nature; the fieldwork phase added moral self-actualization, rationalism, spiritual beliefs, and scientific-professional qualifications to the features of the concept. Moral courage is a pure and prominent characteristic of human beings. Its antecedents include model orientation, model acceptance, rationalism, individual excellence, acquisition of academic and professional qualifications, spiritual beliefs, organizational support, organizational repression, and internal and external personal barriers.
Professional excellence resulting from moral courage can be crystallized in the provision of professional care, the creation of peace of mind, and the nurse's sound decision making and proper functioning.
NASA Astrophysics Data System (ADS)
Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro
2012-06-01
ACAT 2011. This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011) which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facilities Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell.
We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities which were, ultimately, the key factors in the success of the workshop. Further information on ACAT 2011 can be found at http://acat2011.cern.ch Dr Liliana Teodorescu Brunel University ACATgroup The PDF also contains details of the workshop's committees and sponsors.
Information-theoretic approach to lead-lag effect on financial markets
NASA Astrophysics Data System (ADS)
Fiedor, Paweł
2014-08-01
Recently the interest of researchers has shifted from the analysis of synchronous relationships between financial instruments to the analysis of more meaningful asynchronous relationships. Both types of analysis concentrate mostly on Pearson's correlation coefficient, and intraday lead-lag relationships (where one variable in a pair is time-lagged) are consequently associated with it. Under the Efficient-Market Hypothesis such relationships are not possible, as all information is embedded in the prices, but in real markets we find such dependencies. In this paper we analyse lead-lag relationships between financial instruments and extend the known methodology by using mutual information instead of Pearson's correlation coefficient. Mutual information is not only a more general measure, sensitive to non-linear dependencies, but can also lead to a simpler procedure of statistical validation of links between financial instruments. We analyse lagged relationships using New York Stock Exchange 100 data not only on an intraday level, but also for daily stock returns, which have usually been ignored.
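The lagged mutual-information procedure can be sketched with a simple histogram (plug-in) estimator. This is a toy illustration on invented data, not the paper's validated pipeline; real analyses need careful estimator choice and statistical validation of each link:

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Estimate mutual information (bits) between two samples
    with a histogram plug-in estimator."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y, shape (1, bins)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

def lead_lag_mi(returns_a, returns_b, max_lag=5, bins=8):
    """MI between series A and a lagged copy of series B.
    A peak at lag k suggests A leads B by k steps."""
    return {lag: mutual_information(returns_a[:-lag], returns_b[lag:], bins)
            for lag in range(1, max_lag + 1)}

rng = np.random.default_rng(0)
a = rng.standard_normal(2000)
b = np.roll(a, 2) + 0.5 * rng.standard_normal(2000)  # B follows A with lag 2
scores = lead_lag_mi(a, b, max_lag=4)
best = max(scores, key=scores.get)  # the planted lag of 2 dominates
```

Because the dependence here is linear, Pearson correlation would find it too; the point of mutual information is that the same scan also detects non-linear lead-lag dependencies.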
The Effect of Visual Information on the Manual Approach and Landing
NASA Technical Reports Server (NTRS)
Wewerinke, P. H.
1982-01-01
The effect of visual information, in combination with basic display information, on approach performance was investigated. A pre-experimental model analysis was performed in terms of the optimal control model. The resulting aircraft approach performance predictions were compared with the results of a moving-base simulator program. The results illustrate that the model provides a meaningful description of the visual (scene) perception process involved in the complex (multi-variable, time-varying) manual approach task, with a useful predictive capability. The theoretical framework was shown to allow a straightforward investigation of the complex interaction of a variety of task variables.
A framework for performing workplace hazard and risk analysis: a participative ergonomics approach.
Morag, Ido; Luria, Gil
2013-01-01
Despite the unanimity among researchers about the centrality of workplace analysis based on participatory ergonomics (PE) as a basis for preventive interventions, there is still little agreement about the necessity of a theoretical framework for providing practical guidance. In an effort to develop a conceptual PE framework, the authors, focusing on 20 studies, found five primary dimensions for characterising an analytical structure: (1) extent of workforce involvement; (2) analysis duration; (3) diversity of reporter role types; (4) scope of analysis and (5) supportive information system for analysis management. An ergonomics analysis carried out in a chemical manufacturing plant serves as a case study for evaluating the proposed framework. The study simultaneously demonstrates the five dimensions and evaluates their feasibility. It showed that managerial leadership was fundamental to the successful implementation of the analysis; that all job holders should participate in analysing their own workplace; and that simplified reporting methods contributed to a desirable outcome. This paper seeks to clarify the scope of workplace ergonomics analysis by offering a theoretical and structured framework for providing practical advice and guidance. Essential to successfully implementing the analytical framework are managerial involvement, participation of all job holders and simplified reporting methods.
An information-theoretical perspective on weighted ensemble forecasts
NASA Astrophysics Data System (ADS)
Weijs, Steven V.; van de Giesen, Nick
2013-08-01
This paper presents an information-theoretical method for weighting ensemble forecasts with new information. Weighted ensemble forecasts can be used to adjust the distribution that an existing ensemble of time series represents, without modifying the values in the ensemble itself. The weighting can, for example, add new seasonal forecast information in an existing ensemble of historically measured time series that represents climatic uncertainty. A recent article in this journal compared several methods to determine the weights for the ensemble members and introduced the pdf-ratio method. In this article, a new method, the minimum relative entropy update (MRE-update), is presented. Based on the principle of minimum discrimination information, an extension of the principle of maximum entropy (POME), the method ensures that no more information is added to the ensemble than is present in the forecast. This is achieved by minimizing relative entropy, with the forecast information imposed as constraints. From this same perspective, an information-theoretical view on the various weighting methods is presented. The MRE-update is compared with the existing methods and the parallels with the pdf-ratio method are analysed. The paper provides a new, information-theoretical justification for one version of the pdf-ratio method that turns out to be equivalent to the MRE-update. All other methods result in sets of ensemble weights that, seen from the information-theoretical perspective, add either too little or too much (i.e. fictitious) information to the ensemble.
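For a single moment constraint, the minimum-relative-entropy idea has a closed form: the updated weights are an exponential tilting of the uniform weights, with the tilt parameter chosen so that the weighted ensemble mean matches the forecast. The sketch below is an illustration of that principle only (the ensemble values and target are invented; it is not the MRE-update implementation from the paper, which handles richer forecast constraints):

```python
import numpy as np

def mre_weights(ensemble, target_mean, tol=1e-10):
    """Minimum-relative-entropy weights for an ensemble: tilt uniform
    weights exponentially so the weighted mean equals `target_mean`,
    adding no information beyond that single constraint."""
    x = np.asarray(ensemble, dtype=float)

    def weighted_mean(lam):
        w = np.exp(lam * (x - x.mean()))  # exponential tilting
        w /= w.sum()
        return w @ x, w

    # the tilted mean is monotonically increasing in lam: bisect for it
    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        m, _ = weighted_mean(mid)
        if m < target_mean:
            lo = mid
        else:
            hi = mid
    return weighted_mean(0.5 * (lo + hi))[1]

members = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # e.g. historical ensemble values
w = mre_weights(members, target_mean=3.8)      # forecast shifts the mean upward
```

The resulting weights favour the wetter (larger) members just enough to meet the forecast mean, which is the sense in which no fictitious information is added.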
Chircop, Andrea
2008-06-01
This theoretical exploration is an attempt to conceptualize the link between gender and urban environmental health. The proposed ecofeminist framework enables an understanding of the link between the urban physical and social environments and health inequities mediated by gender and socioeconomic status. This framework is proposed as a theoretical magnifying glass to reveal the underlying logic that connects environmental exploitation on the one hand, and gendered health inequities on the other. Ecofeminism has the potential to reveal an inherent, normative conceptual analysis and argumentative justification of western society that permits the oppression of women and the exploitation of the environment. This insight will contribute to a better understanding of the mechanisms underlying gendered environmental health inequities and inform healthy public policy that is supportive of urban environmental health, particularly for low-income mothers.
Linking theory with qualitative research through study of stroke caregiving families.
Pierce, Linda L; Steiner, Victoria; Cervantez Thompson, Teresa L; Friedemann, Marie-Luise
2014-01-01
This theoretical article outlines the deliberate process of applying a qualitative data analysis method rooted in Friedemann's Framework of Systemic Organization through the study of a web-based education and support intervention for stroke caregiving families. Directed by Friedemann's framework, the analytic method involved developing, refining, and using a coding rubric to explore interactive patterns between caregivers and care recipients from this 3-month feasibility study using this education and support intervention. Specifically, data were gathered from the intervention's web-based discussion component between caregivers and the nurse specialist, as well as from telephone caregiver interviews. A theoretical framework guided the process of developing and refining this coding rubric for the purpose of organizing data; but, more importantly, guided the investigators' thought processes, allowing them to extract rich information from the data set, as well as synthesize this information to generate a broad understanding of the caring situation. © 2013 Association of Rehabilitation Nurses.
Reconceptualising "Identity Slippage": Additional Language Learning and (L2) Identity Development
ERIC Educational Resources Information Center
Armour, William
2009-01-01
This paper reconsiders the theoretical concept of "identity slippage" by considering a detailed exegesis of three model conversations taught to learners of Japanese as an additional language. To inform my analysis of these conversations and how they contribute to identity slippage, I have used the work of the systemic-functional linguist Jay Lemke…
ERIC Educational Resources Information Center
Batallan, Graciela; Dente, Liliana; Ritta, Loreley
2017-01-01
This article aims to open up a debate on methodological aspects of ethnographic research, arguing for the legitimacy of the information produced in a research "taller" or workshop using a participatory methodology and video production as a methodological tool. Based on the theoretical foundations and analysis of a "taller"…
Code of Federal Regulations, 2012 CFR
2012-07-01
..., if no processes or imports require reports at the time, within 90 days of having processes or imports... information: (i) Theoretical analysis. Manufacturers records must include: the reaction or reactions believed... records must include: the reaction or reactions believed to be generating PCBs and the levels of PCBs...
ERIC Educational Resources Information Center
Gweon, Gahgene; Jain, Mahaveer; McDonough, John; Raj, Bhiksha; Rose, Carolyn P.
2013-01-01
This paper contributes to a theory-grounded methodological foundation for automatic collaborative learning process analysis. It does this by illustrating how insights from the social psychology and sociolinguistics of speech style provide a theoretical framework to inform the design of a computational model. The purpose of that model is to detect…
Disability and Diversity on CSU Websites: A Critical Discourse Study
ERIC Educational Resources Information Center
Gabel, Susan L.; Reid, Denise; Pearson, Holly; Ruiz, Litzy; Hume-Dawson, Rodney
2016-01-01
With more than 325,000 students, the California State University (CSU) system is one of the largest in the United States, making it a useful unit of analysis for studying disability and diversity. Using a critical discourse theoretical framework and borrowing strategies from Astroff (2001) and Pauwells (2012), we found disability information on CSU…
Children's Perceptions and Learning about Tropical Rainforests: An Analysis of Their Drawings
ERIC Educational Resources Information Center
Bowker, Rob
2007-01-01
This study analysed 9- to 11-year-old children's drawings of tropical rainforests immediately before and after a visit to the Humid Tropics Biome at the Eden Project, Cornwall, UK. A theoretical framework derived from considerations of informal learning and constructivism was used as a basis to develop a methodology to interpret the children's…
ERIC Educational Resources Information Center
Sochos, Antigonos
2014-01-01
The couple relationship is an essential source of support for individuals undergoing psychological treatment and the aim of this study was to apply a new methodology in assessing the quality of such support. A theoretically informed thematic analysis of interview transcripts was conducted, triangulated by quantitative data. Twenty-one brief…
Normative Discourse and Persuasion: An Analysis of Ga'dang Informal Litigation.
ERIC Educational Resources Information Center
Walrod, Michael R.
A study of the discourse of Ga'dang, a Philippine language, focuses on normative discourse and persuasion, especially the ways in which the former is used to accomplish the latter. The first five chapters outline the theoretical framework of the study, placing normative and persuasive discourse in a philosophical context and relating them to the…
Business Spanish in the Real World: A Task-Based Needs Analysis
ERIC Educational Resources Information Center
Martin, Alexandra; Adrada-Rafael, Sergio
2017-01-01
The growing demand for Spanish for Specific Purposes (SSP) courses at universities in the United States in the last two decades (Klee, 2015) has brought to light the need for more theoretically driven research in this field, which can inform pedagogical decisions and materials design. The present study conceptually replicates Serafini and Torres…
Picturing Sex Education: Notes on the Politics of Visual Stratification
ERIC Educational Resources Information Center
Janssen, Diederik F.
2006-01-01
This paper addresses the scarcity of research on depictions and layout in sex education materials. It is argued that pictures and layout can inform an analysis of social stratification based on visual access. This process of social organization is located using four theoretical models. However, these models do not lend themselves to a close reading…
Anomalies in Economics Enrollment: 1991-1992 to 1995-1996
ERIC Educational Resources Information Center
Lombardi, Waldo; Ramrattan, Lall B.; Szenberg, Michael
2004-01-01
This paper presents data and empirical models to explain the causes of the decline in the enrollment of economics majors during the 1991-1992 to 1995-1996 academic years. It first discusses the theoretical bases for a qualitative analysis of this type. It then discusses a sample survey methodology used to obtain cross-sectional information from…
Quantum entanglement helps in improving economic efficiency
NASA Astrophysics Data System (ADS)
Du, Jiangfeng; Ju, Chenyong; Li, Hui
2005-02-01
We propose an economic regulation approach based on quantum game theory for the government to reduce the abuses of oligopolistic competition. Theoretical analysis shows that this approach can help the government improve the economic efficiency of the oligopolistic market and help prevent monopoly due to incorrect information. These advantages are attributed entirely to quantum entanglement, a uniquely quantum mechanical feature.
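As a rough numerical illustration of how entanglement can push a quantum duopoly toward the efficient outcome, the sketch below uses the Li-Du-Massar quantum Cournot scheme, a standard construction in the quantum-game literature; it is not necessarily the authors' exact model, and the demand intercept `a`, cost `c`, and entanglement values `gamma` are illustrative choices.

```python
import numpy as np

def profits(x1, x2, gamma, a=100.0, c=10.0):
    """Li-Du-Massar quantum Cournot duopoly: the firms' classical
    moves x1, x2 are entangled into output quantities q1, q2."""
    q1 = x1 * np.cosh(gamma) + x2 * np.sinh(gamma)
    q2 = x2 * np.cosh(gamma) + x1 * np.sinh(gamma)
    margin = np.maximum(a - q1 - q2, 0.0) - c   # price floored at zero
    return q1 * margin, q2 * margin

def nash_quantities(gamma, a=100.0, c=10.0, iters=200):
    """Symmetric Nash equilibrium via damped best-response iteration
    on a grid (each firm's payoff is concave in its own move)."""
    grid = np.linspace(0.0, a - c, 3601)
    x1 = x2 = (a - c) / 3.0
    for _ in range(iters):
        br1 = grid[np.argmax(profits(grid, x2, gamma, a, c)[0])]
        br2 = grid[np.argmax(profits(x1, grid, gamma, a, c)[1])]
        x1, x2 = 0.5 * x1 + 0.5 * br1, 0.5 * x2 + 0.5 * br2
    return x1, x2

for gamma in (0.0, 1.0, 2.0):
    x1, x2 = nash_quantities(gamma)
    u1, u2 = profits(x1, x2, gamma)
    print(f"gamma={gamma}: total profit {u1 + u2:.1f}")
```

At `gamma=0` the scheme reduces to the classical Cournot equilibrium; as `gamma` grows, total equilibrium profit rises toward the cooperative (monopoly) level, which is the efficiency gain the abstract attributes to entanglement.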
Social science and linguistic text analysis of nurses' records: a systematic review and critique.
Buus, Niels; Hamilton, Bridget Elizabeth
2016-03-01
The two aims of the paper were to systematically review and critique social science and linguistic text analyses of nursing records in order to inform future research in this emerging area. Systematic searches in reference databases and in citation indexes identified 12 articles that included analyses of the social and linguistic features of records and recording. Two reviewers extracted data using established criteria for the evaluation of qualitative research papers. A common characteristic of nursing records was the economical use of language with local meanings that conveyed little information to the uninitiated reader. Records were dominated by technocratic-medical discourse focused on patients' bodies, and they depicted only very limited aspects of nursing practice. Nurses made moral evaluations in their categorisation of patients, which reflected detailed surveillance of patients' disturbing behaviour. The text analysis methods were rarely transparent in the articles, which could suggest research quality problems. For most articles, the significance of the findings was substantiated more by theoretical readings of the institutional settings than by the analysis of textual data. More probing empirical research on nurses' records and a wider range of theoretical perspectives have the potential to expose the situated meanings of nursing work in healthcare organisations. © 2015 John Wiley & Sons Ltd.
Shutin, Dmitriy; Zlobinskaya, Olga
2010-02-01
The goal of this contribution is to apply model-based information-theoretic measures to the quantification of relative differences between immunofluorescent signals. Several models for approximating the empirical fluorescence intensity distributions are considered, namely Gaussian, Gamma, Beta, and kernel densities. As a distance measure the Hellinger distance and the Kullback-Leibler divergence are considered. For the Gaussian, Gamma, and Beta models the closed-form expressions for evaluating the distance as a function of the model parameters are obtained. The advantages of the proposed quantification framework as compared to simple mean-based approaches are analyzed with numerical simulations. Two biological experiments are also considered. The first is the functional analysis of the p8 subunit of the TFIIH complex responsible for a rare hereditary multi-system disorder--trichothiodystrophy group A (TTD-A). In the second experiment the proposed methods are applied to assess the UV-induced DNA lesion repair rate. A good agreement between our in vivo results and those obtained with an alternative in vitro measurement is established. We believe that the computational simplicity and the effectiveness of the proposed quantification procedure will make it very attractive for different analysis tasks in functional proteomics, as well as in high-content screening. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.
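The closed-form distances the abstract mentions are easy to state for the Gaussian model. The sketch below implements the standard closed-form Kullback-Leibler divergence and Hellinger distance between two univariate Gaussians; the function names are ours, and the Gamma/Beta closed forms derived in the paper are not reproduced here.

```python
import numpy as np

def kl_gauss(mu1, s1, mu2, s2):
    """Closed-form Kullback-Leibler divergence KL(N1 || N2) between
    two univariate Gaussians N(mu1, s1^2) and N(mu2, s2^2)."""
    return np.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

def hellinger_gauss(mu1, s1, mu2, s2):
    """Closed-form Hellinger distance between two univariate Gaussians
    (symmetric, bounded in [0, 1], unlike the KL divergence)."""
    h2 = 1 - np.sqrt(2 * s1 * s2 / (s1**2 + s2**2)) * \
         np.exp(-(mu1 - mu2)**2 / (4 * (s1**2 + s2**2)))
    return np.sqrt(h2)

# Two fitted intensity models: identical vs. shifted distributions.
print(hellinger_gauss(0.0, 1.0, 0.0, 1.0))   # identical -> 0
print(hellinger_gauss(0.0, 1.0, 3.0, 1.0))   # well separated -> near 1
```

The boundedness and symmetry of the Hellinger distance are what make it convenient for quantifying relative differences between two fluorescent signals, whereas the KL divergence is asymmetric in its arguments.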
Evol and ProDy for bridging protein sequence evolution and structural dynamics
Mao, Wenzhi; Liu, Ying; Chennubhotla, Chakra; Lezon, Timothy R.; Bahar, Ivet
2014-01-01
Correlations between sequence evolution and structural dynamics are of utmost importance in understanding the molecular mechanisms of function and their evolution. We have integrated Evol, a new package for fast and efficient comparative analysis of evolutionary patterns and conformational dynamics, into ProDy, a computational toolbox designed for inferring protein dynamics from experimental and theoretical data. Using information-theoretic approaches, Evol coanalyzes conservation and coevolution profiles extracted from multiple sequence alignments of protein families with their inferred dynamics. Availability and implementation: ProDy and Evol are open-source and freely available under MIT License from http://prody.csb.pitt.edu/. Contact: bahar@pitt.edu PMID:24849577
Study of inelastic e-Cd and e-Zn collisions
NASA Astrophysics Data System (ADS)
Piwinski, Mariusz; Klosowski, Lukasz; Dziczek, Darek; Chwirot, Stanislaw
2016-09-01
Electron-photon coincidence experiments are well known for providing more detailed information about electron-atom collisions than any other technique. The Electron Impact Coherence Parameters (EICP) obtained in such studies deliver the most complete characterization of the inelastic collision and allow verification of proposed theoretical models. We present Stokes and EICP parameters characterising electronic excitation of the lowest singlet P state of cadmium and zinc atoms at various collision energies. The experiments were performed using the electron-photon coincidence technique in its coherence analysis version. The data obtained are presented and compared with existing CCC and RDWA theoretical predictions.
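For a singlet P excitation, the EICP are conventionally recovered from the reduced Stokes parameters of the coincident photon via the standard relations (degree of linear polarization, alignment angle, and transferred angular momentum). The sketch below encodes those textbook relations; the function name and the dictionary keys are our own labels, not notation from this paper.

```python
import math

def eicp_from_stokes(p1, p2, p3):
    """Standard conversion of reduced Stokes parameters (P1, P2, P3)
    into coherence parameters for a 1P1 excitation:
      P_lin  - degree of linear polarization of the charge cloud,
      gamma  - alignment angle (radians) of the charge cloud,
      L_perp - angular momentum transferred perpendicular to the
               scattering plane,
      P      - total polarization degree."""
    p_lin = math.hypot(p1, p2)
    gamma = 0.5 * math.atan2(p2, p1)
    l_perp = -p3
    p_total = math.sqrt(p1**2 + p2**2 + p3**2)
    return {"P_lin": p_lin, "gamma": gamma, "L_perp": l_perp, "P": p_total}

# Fully linearly polarized light along the reference axis:
print(eicp_from_stokes(1.0, 0.0, 0.0))
```

A total polarization degree below unity signals loss of coherence, which is part of what makes these parameters such a stringent test of CCC and RDWA calculations.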
An inlet analysis for the NASA hypersonic research engine aerothermodynamic integration model
NASA Technical Reports Server (NTRS)
Andrews, E. H., Jr.; Russell, J. W.; Mackley, E. A.; Simmonds, A. L.
1974-01-01
A theoretical analysis of the inlet of the NASA Hypersonic Research Engine (HRE) Aerothermodynamic Integration Model (AIM) has been undertaken using a method-of-characteristics computer program. The purpose of the analysis was to obtain pretest information on the full-scale HRE inlet in support of the experimental AIM program (completed May 1974). Mass-flow-ratio and additive-drag-coefficient schedules were obtained that adequately defined the range attained in the AIM tests. Mass-weighted average values of inlet total-pressure recovery, kinetic-energy efficiency, and throat Mach number were obtained.
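Mass-weighted averaging, used above for the total-pressure recovery and throat Mach number, simply weights each streamtube's value by its mass flow. A minimal sketch (the numbers are illustrative, not AIM data):

```python
def mass_weighted_average(values, mdots):
    """Mass-weighted average of a flow quantity (e.g. total-pressure
    recovery) sampled over streamtubes with mass flows mdots."""
    return sum(v * m for v, m in zip(values, mdots)) / sum(mdots)

# Two streamtubes: recoveries 0.8 and 0.6 carrying 3.0 and 1.0 kg/s.
eta_bar = mass_weighted_average([0.8, 0.6], [3.0, 1.0])
print(eta_bar)
```

The high-mass-flow streamtube dominates, so the average lands nearer 0.8 than the unweighted mean of 0.7.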
Ilott, Irene; Gerrish, Kate; Booth, Andrew; Field, Becky
2013-10-01
There is an international imperative to implement research into clinical practice to improve health care. Understanding the dynamics of change requires knowledge from theoretical and empirical studies. This paper presents a novel approach to testing a new meta-theoretical framework: the Consolidated Framework for Implementation Research. The utility of the Framework was evaluated using a post hoc, deductive analysis of 11 narrative accounts of innovation in health care services and practice from England, collected in 2010. A matrix comprising the five domains and 39 constructs of the Framework was developed to examine the coherence of the terminology, to compare results across contexts and to identify new theoretical developments. The Framework captured the complexity of implementation across 11 diverse examples, offering theoretically informed, comprehensive coverage. The Framework drew attention to relevant points in individual cases together with patterns across cases; for example, all were internally developed innovations that brought direct or indirect patient advantage. In 10 cases, the change was led by clinicians. Most initiatives had been maintained for several years and there was evidence of spread in six examples. Areas for further development within the Framework include sustainability and patient/public engagement in implementation. Our analysis suggests that this conceptual framework has the potential to offer useful insights, whether as part of a situational analysis or by developing context-specific propositions for hypothesis testing. Such studies are vital now that innovation is being promoted as core business for health care. © 2012 John Wiley & Sons Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Streets, W.E.
As the need for rapid and more accurate determinations of gamma-emitting radionuclides in environmental and mixed waste samples grows, there is continued interest in the development of theoretical tools to eliminate the need for some laboratory analyses and to enhance the quality of information from necessary analyses. In gamma spectrometry, the use of theoretical self-absorption coefficients (SACs) can eliminate the need to determine the SAC empirically by counting a known source through each sample. This empirical approach requires extra counting time and introduces another source of counting error, which must be included in the calculation of results. The empirical determination of SACs is routinely used when the nuclides of interest are specified; theoretical determination of the SAC can enhance the information for the analysis of true unknowns, where there may be no prior knowledge about radionuclides present in a sample. Determination of an exact SAC does require knowledge of the total composition of a sample. In support of the Department of Energy's (DOE) Environmental Survey Program, the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory developed theoretical self-absorption models to estimate SACs for the determination of non-specified radionuclides in samples of unknown, widely varying compositions. Subsequently, another SAC model, in a different counting geometry and for specified nuclides, was developed for another application. These two models are now used routinely for the determination of gamma-emitting radionuclides in a wide variety of environmental and mixed waste samples.
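For a uniform slab source counted along its axis, a common textbook form of the theoretical self-absorption coefficient is the slab self-attenuation factor, which depends only on the linear attenuation coefficient at the gamma-ray energy and the sample thickness. The sketch below uses that simple geometry as an illustration; it is not the ACL's specific model, and the numerical inputs are hypothetical.

```python
import math

def slab_sac(mu, t):
    """Self-absorption coefficient for a uniform slab source of
    thickness t (cm) with linear attenuation coefficient mu (1/cm)
    at the gamma-ray energy: the fraction of emitted photons that
    escape unattenuated, relative to a massless source."""
    x = mu * t
    if x < 1e-9:                      # thin-sample limit: no attenuation
        return 1.0
    return (1.0 - math.exp(-x)) / x

# Correcting an observed count rate: true = observed / SAC.
observed = 120.0                      # counts/s, illustrative value
corrected = observed / slab_sac(mu=0.15, t=2.0)
print(round(corrected, 1))
```

Computing the SAC this way requires knowing mu for the sample matrix at each energy, which is exactly why, as the abstract notes, an exact theoretical SAC demands knowledge of the sample's total composition.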
Determinants of Internet use as a preferred source of information on personal health.
Lemire, Marc; Paré, Guy; Sicotte, Claude; Harvey, Charmian
2008-11-01
To understand the personal, social and cultural factors likely to explain recourse to the Internet as a preferred source of personal health information. A cross-sectional survey was conducted among a population of 2923 Internet users visiting a firmly established website that offers information on personal health. Multiple regression analysis was performed to identify the determinants of site use. The analysis template comprised four classes of determinants likely to explain Internet use: beliefs, intentions, user satisfaction and socio-demographic characteristics. Seven-point Likert scales were used. An analysis of the psychometric qualities of the variables provided compelling evidence of the constructs' validity and reliability. A confirmatory factor analysis confirmed the correspondence with the factors predicted by the theoretical model. The regression analysis explained 35% of the variance in Internet use. Use was directly associated with five factors: perceived usefulness, importance given to written media in searches for health information, concern for personal health, importance given to the opinions of physicians and other health professionals, and the trust placed in the information available on the site itself. This study confirms the importance of information credibility for the frequency of Internet use as a preferred source of information on personal health. It also shows the potentially influential role of the Internet in the development of personal knowledge of health issues.
A framework for characterizing eHealth literacy demands and barriers.
Chan, Connie V; Kaufman, David R
2011-11-17
Consumer eHealth interventions are of a growing importance in the individual management of health and health behaviors. However, a range of access, resources, and skills barriers prevent health care consumers from fully engaging in and benefiting from the spectrum of eHealth interventions. Consumers may engage in a range of eHealth tasks, such as participating in health discussion forums and entering information into a personal health record. eHealth literacy names a set of skills and knowledge that are essential for productive interactions with technology-based health tools, such as proficiency in information retrieval strategies, and communicating health concepts effectively. We propose a theoretical and methodological framework for characterizing complexity of eHealth tasks, which can be used to diagnose and describe literacy barriers and inform the development of solution strategies. We adapted and integrated two existing theoretical models relevant to the analysis of eHealth literacy into a single framework to systematically categorize and describe task demands and user performance on tasks needed by health care consumers in the information age. The method derived from the framework is applied to (1) code task demands using a cognitive task analysis, and (2) code user performance on tasks. The framework and method are applied to the analysis of a Web-based consumer eHealth task with information-seeking and decision-making demands. We present the results from the in-depth analysis of the task performance of a single user as well as of 20 users on the same task to illustrate both the detailed analysis and the aggregate measures obtained and potential analyses that can be performed using this method. The analysis shows that the framework can be used to classify task demands as well as the barriers encountered in user performance of the tasks. 
Our approach can be used to (1) characterize the challenges confronted by participants in performing the tasks, (2) determine the extent to which application of the framework to the cognitive task analysis can predict and explain the problems encountered by participants, and (3) inform revisions to the framework to increase accuracy of predictions. The results of this illustrative application suggest that the framework is useful for characterizing task complexity and for diagnosing and explaining barriers encountered in task completion. The framework and analytic approach can be a potentially powerful generative research platform to inform development of rigorous eHealth examination and design instruments, such as to assess eHealth competence, to design and evaluate consumer eHealth tools, and to develop an eHealth curriculum.
A Framework for Characterizing eHealth Literacy Demands and Barriers
Chan, Connie V
2011-01-01
Background Consumer eHealth interventions are of a growing importance in the individual management of health and health behaviors. However, a range of access, resources, and skills barriers prevent health care consumers from fully engaging in and benefiting from the spectrum of eHealth interventions. Consumers may engage in a range of eHealth tasks, such as participating in health discussion forums and entering information into a personal health record. eHealth literacy names a set of skills and knowledge that are essential for productive interactions with technology-based health tools, such as proficiency in information retrieval strategies, and communicating health concepts effectively. Objective We propose a theoretical and methodological framework for characterizing complexity of eHealth tasks, which can be used to diagnose and describe literacy barriers and inform the development of solution strategies. Methods We adapted and integrated two existing theoretical models relevant to the analysis of eHealth literacy into a single framework to systematically categorize and describe task demands and user performance on tasks needed by health care consumers in the information age. The method derived from the framework is applied to (1) code task demands using a cognitive task analysis, and (2) code user performance on tasks. The framework and method are applied to the analysis of a Web-based consumer eHealth task with information-seeking and decision-making demands. We present the results from the in-depth analysis of the task performance of a single user as well as of 20 users on the same task to illustrate both the detailed analysis and the aggregate measures obtained and potential analyses that can be performed using this method. Results The analysis shows that the framework can be used to classify task demands as well as the barriers encountered in user performance of the tasks. 
Our approach can be used to (1) characterize the challenges confronted by participants in performing the tasks, (2) determine the extent to which application of the framework to the cognitive task analysis can predict and explain the problems encountered by participants, and (3) inform revisions to the framework to increase accuracy of predictions. Conclusions The results of this illustrative application suggest that the framework is useful for characterizing task complexity and for diagnosing and explaining barriers encountered in task completion. The framework and analytic approach can be a potentially powerful generative research platform to inform development of rigorous eHealth examination and design instruments, such as to assess eHealth competence, to design and evaluate consumer eHealth tools, and to develop an eHealth curriculum. PMID:22094891
[Educative intervention and development of position and critical reading].
Angulo-Bernal, Sonia Elizabeth; Leyva-González, Félix Arturo; Viniegra-Velázquez, Leonardo
2007-01-01
To investigate, among professors of technical courses in the health area, the effects of an educational strategy promoting participation on the development of a position toward education and on the aptitude for critical reading of theoretical texts and educational research reports. A longitudinal intervention study was conducted. To measure the degree of development of a position toward education, the instrument "Concepts and ideas about education" was applied. It consists of 72 statements, organized in pairs that express two different approaches to education: participative and passive. To assess the degree of development of critical reading, two instruments were applied: 1) critical reading of theoretical texts on education, and 2) critical reading of educational research reports, each comprising 120 items. The validity and reliability of the three instruments were assessed by experts with experience in teaching and educational research. The strategy was implemented through seminar-style activities held twice a week, with a duration of five hours per session, for nine months. Outside the classroom, participants completed readings of a theoretical text and/or an educational research report and worked through a reading guide (task). In the classroom, a discussion atmosphere was fostered, promoting the participation of the students at all times; a space was opened for reflective recovery of their own experience, for analysis and exchange of ideas, and for criticism and self-criticism of prevailing educational practices. The professor intervened when individual participation diminished, channeling the discussion and pointing out the students' enlightening observations and strong arguments.
The three instruments were applied to the group of professors of technical courses (n = 10); the initial measurement was taken before the educational strategy began and the final measurement at its end. Scoring of the instruments and data capture were performed by a blinded technician. After the educational strategy, we observed a statistically significant advance in position toward education, inferred through its main indicator, and in critical reading of theoretical texts. The advances shown in critical reading of educational research reports were below those for theoretical texts. The development of a position toward education and of aptitudes for critical reading of theoretical texts and educational research reports among professors of technical courses in the health area is possible, provided educational environments are created that lead to participation mediated by critique.
Tao, Donghua
2008-01-01
This study extended the Technology Acceptance Model (TAM) by examining the roles of two aspects of e-resource characteristics, namely, information quality and system quality, in predicting public health students’ intention to use e-resources for completing research paper assignments. Both focus groups and a questionnaire were used to collect data. Descriptive analysis, data screening, and Structural Equation Modeling (SEM) techniques were used for data analysis. The study found that perceived usefulness played a major role in determining students’ intention to use e-resources. Perceived usefulness and perceived ease of use fully mediated the impact that information quality and system quality had on behavior intention. The research model enriches the existing technology acceptance literature by extending TAM. Representing two aspects of e-resource characteristics provides greater explanatory information for diagnosing problems of system design, development, and implementation. PMID:18999300
Tao, Donghua
2008-11-06
This study extended the Technology Acceptance Model (TAM) by examining the roles of two aspects of e-resource characteristics, namely, information quality and system quality, in predicting public health students' intention to use e-resources for completing research paper assignments. Both focus groups and a questionnaire were used to collect data. Descriptive analysis, data screening, and Structural Equation Modeling (SEM) techniques were used for data analysis. The study found that perceived usefulness played a major role in determining students' intention to use e-resources. Perceived usefulness and perceived ease of use fully mediated the impact that information quality and system quality had on behavior intention. The research model enriches the existing technology acceptance literature by extending TAM. Representing two aspects of e-resource characteristics provides greater explanatory information for diagnosing problems of system design, development, and implementation.
Information Superiority via Formal Concept Analysis
NASA Astrophysics Data System (ADS)
Koester, Bjoern; Schmidt, Stefan E.
This chapter will show how to get more mileage out of information. To achieve that, we first introduce the fundamentals of Formal Concept Analysis (FCA), a highly versatile field of applied lattice theory that allows hidden relationships to be uncovered in relational data. Moreover, FCA provides a distinguished supporting framework for finding and filling information gaps in a systematic and rigorous way. In addition, we build bridges to other communities that can be related to FCA via a universal approach, so that other research areas may benefit from a theory that has been elaborated for more than twenty years. Last but not least, the essential benefits of FCA are presented algorithmically as well as theoretically, by investigating a real data set from the MIPT Terrorism Knowledge Base and by demonstrating an application in the field of Web Information Retrieval and Web Intelligence.
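The core of FCA is the pair of derivation operators on a formal context (a binary object-attribute relation): a formal concept is a pair (extent, intent) that is closed under both. The sketch below enumerates all concepts of a toy context by brute force; the context and function names are our own illustrative choices, not data from the chapter.

```python
from itertools import combinations

# A toy formal context: objects mapped to the attributes they have.
context = {
    "duck":  {"flies", "swims", "lays_eggs"},
    "eagle": {"flies", "lays_eggs"},
    "trout": {"swims", "lays_eggs"},
}

def intent(objs):
    """Derivation operator: attributes common to all given objects."""
    attrs = set.union(*context.values())
    for o in objs:
        attrs &= context[o]
    return attrs

def extent(attrs):
    """Derivation operator: objects possessing all given attributes."""
    return {o for o, has in context.items() if attrs <= has}

def concepts():
    """All formal concepts (extent, intent): every pair obtained as
    (A'', A') for some object set A is closed under both operators."""
    found = set()
    for r in range(len(context) + 1):
        for objs in combinations(context, r):
            i = intent(objs)
            found.add((frozenset(extent(i)), frozenset(i)))
    return found

for e, i in sorted(concepts(), key=lambda c: len(c[0])):
    print(sorted(e), "<->", sorted(i))
```

Ordered by extent inclusion, these concepts form the concept lattice that FCA tools navigate; real implementations replace the exponential enumeration with algorithms such as NextClosure.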
Potvin, Noah; Bradt, Joke; Ghetti, Claire
2018-03-09
Over the past decade, caregiver pre-bereavement has received increased scholarly and clinical attention across multiple healthcare fields. Pre-bereavement represents a nascent area for music therapy to develop best practices in and an opportunity to establish clinical relevancy in the interdisciplinary team. This study was an exploratory inquiry into the role of music therapy with pre-bereaved informal hospice caregivers. This study intended to articulate (a) what pre-bereavement needs are present for informal hospice caregivers, (b) which of those needs were addressed in music, and (c) the process by which music therapy addressed those needs. A constructivist grounded theory methodology using situational analysis was used. We interviewed 14 currently bereaved informal hospice caregivers who had participated in music therapy with the care recipient. Analysis resulted in a theoretical model of resource-oriented music therapy promoting caregiver resilience. The resource, caregivers' stable caring relationships with care recipients through their pre-illness identities (i.e., spouse, parent, or child), is amplified through music therapy. Engagement with this resource mediates the risk of increased care burden and results in resilience fostering purposefulness and value in caregiving. Resource-oriented music therapy provides a unique clinical avenue for supporting caregivers through pre-bereavement, and was acknowledged by caregivers as a unique and integral hospice service. Within this model, caregivers are better positioned to develop meaning from the experience of providing care through the death of a loved one.
An information theory analysis of spatial decisions in cognitive development
Scott, Nicole M.; Sera, Maria D.; Georgopoulos, Apostolos P.
2015-01-01
Performance in a cognitive task can be considered as the outcome of a decision-making process operating across various knowledge domains or aspects of a single domain. Therefore, an analysis of these decisions in various tasks can shed light on the interplay and integration of these domains (or elements within a single domain) as they are associated with specific task characteristics. In this study, we applied an information theoretic approach to assess quantitatively the gain of knowledge across various elements of the cognitive domain of spatial, relational knowledge, as a function of development. Specifically, we examined changing spatial relational knowledge from ages 5 to 10 years. Our analyses consisted of a two-step process. First, we performed a hierarchical clustering analysis on the decisions made in 16 different tasks of spatial relational knowledge to determine which tasks were performed similarly at each age group as well as to discover how the tasks clustered together. We next used two measures of entropy to capture the gradual emergence of order in the development of relational knowledge. These measures of “cognitive entropy” were defined based on two independent aspects of chunking, namely (1) the number of clusters formed at each age group, and (2) the distribution of tasks across the clusters. We found that both measures of entropy decreased with age in a quadratic fashion and were positively and linearly correlated. The decrease in entropy and, therefore, gain of information during development was accompanied by improved performance. These results document, for the first time, the orderly and progressively structured “chunking” of decisions across the development of spatial relational reasoning and quantify this gain within a formal information-theoretic framework. PMID:25698915
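The second entropy measure described above, the distribution of tasks across clusters, is ordinary Shannon entropy over cluster occupancy. The sketch below is a minimal illustration with invented cluster labels, not the study's data; note how the older group's fewer, more uneven chunks yield lower "cognitive entropy."

```python
import math
from collections import Counter

def cluster_entropy(assignments):
    """Shannon entropy (bits) of how tasks distribute across clusters:
    fewer and more uneven clusters give lower entropy."""
    n = len(assignments)
    return -sum((c / n) * math.log2(c / n)
                for c in Counter(assignments).values())

# 16 tasks: a younger group scatters them over many small clusters,
# an older group chunks them into a few larger ones (illustrative).
young = list("ABCDEFGHABCDEFGH")      # 8 clusters, 2 tasks each
older = list("AAAAAAAABBBBCCDD")      # 4 clusters, uneven sizes
print(cluster_entropy(young), cluster_entropy(older))
```

Tracking this quantity per age group is what lets the decrease in entropy be fit as a quadratic function of age, as reported in the abstract.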
Influencing young women to pursue a career in the creative information technologies
NASA Astrophysics Data System (ADS)
Mosco, Michele
A leaky pipeline is often cited as the cause of the underrepresentation of women in computer-related professions. However, females may not even enter the pipeline; that is, they do not enroll in creative information technology coursework as early as high school. Creative information technology careers include web design, digital photography, and multimedia. Two constructs of social cognitive career theory, outcome expectations and self-efficacy, provided the theoretical framework for this investigation into why young women are not exhibiting interest in these careers. Using an action research structure, a female-segregated technology club was implemented at the high school. The study intended to increase the participants' interest in pursuing careers in the creative information technology field through the components of career choice outlined in the theoretical framework. The outcome expectations of "With whom will I work?" and "What will I do?" were addressed through the presentation of female role models and career information. Self-efficacy was targeted through instruction in technology skills directly related to the creative information technology fields. Data were collected through the administration of a pretest/posttest survey instrument, researcher observations, individual participant interviews, and an analysis of the participants' creative products. Quantitative findings indicated few statistically significant program effects. The participants' perceptions of those employed in these careers did not change, but their technology self-efficacy increased on three indicators. Analysis of qualitative data yielded a more complete picture: although the young women had little prior knowledge of those employed in these fields, they did enjoy learning technology to develop creative projects in a social atmosphere where they could persevere through the technology frustrations they encountered.
All of the data types affirmed that the participants' interest in these careers changed positively after the intervention.
Phenolic Analysis and Theoretic Design for Chinese Commercial Wines' Authentication.
Li, Si-Yu; Zhu, Bao-Qing; Reeves, Malcolm J; Duan, Chang-Qing
2018-01-01
To develop a robust tool for the varietal, regional, and vintage authentication of Chinese commercial wines, phenolic compounds in 121 Chinese commercial dry red wines were detected and quantified using high-performance liquid chromatography triple-quadrupole mass spectrometry (HPLC-QqQ-MS/MS), and the differentiation abilities of principal component analysis (PCA), partial least squares discriminant analysis (PLS-DA), and orthogonal partial least squares discriminant analysis (OPLS-DA) were compared. OPLS-DA models outperformed PCA and PLS-DA in differentiating wines according to their varieties (Cabernet Sauvignon or other varieties), regions (east or west Cabernet Sauvignon wines), and vintages (young or old Cabernet Sauvignon wines). The S-plots provided by the OPLS-DA models identified the key phenolic compounds that were both statistically and biochemically significant in sample differentiation. In addition, the OPLS-DA models showed promise for differentiating samples by more detailed regional and vintage information. On the basis of these results, a promising theoretical design for wine authentication was proposed for the first time, which may be helpful in the practical authentication of more commercial wines. The phenolic data of the 121 Chinese commercial dry red wines were processed with different statistical tools for varietal, regional, and vintage differentiation, and a promising theoretical design was summarized that may be helpful for wine authentication in practical situations. © 2017 Institute of Food Technologists®.
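The chemometric workflow above can be illustrated with a minimal principal component analysis. The sketch below implements PCA directly via the SVD on an invented toy matrix standing in for phenolic concentrations; the values, group sizes, and separation are all hypothetical, and the supervised PLS-DA/OPLS-DA steps of the paper are not implemented here.

```python
# Hedged sketch: PCA via SVD on a toy "phenolic profile" matrix.
# All numbers are invented; the study used HPLC-QqQ-MS/MS data for
# 121 wines and supervised OPLS-DA, which this sketch does not implement.
import numpy as np

def pca_scores(X, n_components=2):
    """Scores of the mean-centred rows of X on the top principal components."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T, s**2 / (len(X) - 1)

rng = np.random.default_rng(0)
# 20 "Cabernet" and 20 "other" samples, separated along one phenolic axis.
cab = rng.normal([5.0, 1.0, 0.5], 0.3, size=(20, 3))
other = rng.normal([3.0, 1.0, 0.5], 0.3, size=(20, 3))
scores, variances = pca_scores(np.vstack([cab, other]))
# PC1 should capture the (synthetic) varietal separation.
sep = abs(scores[:20, 0].mean() - scores[20:, 0].mean())
```

In the unsupervised score plot the two synthetic varieties separate along PC1; the paper's point is that OPLS-DA sharpens exactly this kind of separation by rotating class-irrelevant variance into orthogonal components.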
Dynamics-A explorer RIMS data analysis
NASA Technical Reports Server (NTRS)
Banks, P. M.; Clauer, C. R.
1985-01-01
Activities of the RIMS instrument during the extended mission are planned. The modes of operation needed for RIMS to achieve the science requirements, utilizing the new and exciting information on the composition and dynamics of low-energy (0-50 eV) ions in the Earth's ionosphere and magnetosphere, are determined. The specific science problems and the required RIMS operational modes needed to acquire the desired data are identified. Analysis was performed on the RIMS data to achieve the science results, and this new information was used in determining RIMS operations during the latter part of the mission. Necessary sensitivity tests of RIMS operating modes and instrument performance were suggested, and the in-flight results were compared with theoretical models.
Observer-based distributed adaptive iterative learning control for linear multi-agent systems
NASA Astrophysics Data System (ADS)
Li, Jinsha; Liu, Sanyang; Li, Junmin
2017-10-01
This paper investigates the consensus problem for linear multi-agent systems from the viewpoint of two-dimensional systems when the state information of each agent is not available. An observer-based, fully distributed adaptive iterative learning protocol is designed. A local observer is designed for each agent, and it is shown that, without using any global information about the communication graph, all agents achieve consensus perfectly for every undirected connected communication graph as the number of iterations tends to infinity. A Lyapunov-like energy function is employed to facilitate the learning protocol design and property analysis. Finally, a simulation example is given to illustrate the theoretical analysis.
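The iteration-domain idea behind iterative learning control can be seen in a deliberately minimal single-agent sketch (not the paper's observer-based multi-agent protocol): a unit-delay plant repeats a finite-horizon task, and a P-type update with an assumed gain of 0.5 refines the input from the previous trial's tracking error.

```python
# Minimal P-type iterative learning control (ILC) sketch -- NOT the
# paper's observer-based multi-agent protocol.  A unit-delay plant
# y(t+1) = u(t) repeats a 30-step task; the input is corrected between
# trials from the previous trial's tracking error.  Gain 0.5 is assumed.
import numpy as np

T, gamma = 30, 0.5
r = np.sin(np.linspace(0.0, 2.0 * np.pi, T))   # reference trajectory

def trial(u):
    y = np.zeros(T)
    y[1:] = u[:-1]                             # unit-delay plant
    return y

u = np.zeros(T)
errors = []
for _ in range(25):                            # iteration (learning) axis
    e = r - trial(u)
    errors.append(np.abs(e[1:]).max())         # t = 0 is not controllable
    u[:-1] += gamma * e[1:]                    # P-type ILC update
errors.append(np.abs((r - trial(u))[1:]).max())
```

Because the plant is a pure delay, each trial scales the error by exactly 1 - gamma, so the tracking error decays geometrically along the iteration axis, which is the "second dimension" in the two-dimensional-systems view.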
Measuring Integrated Information from the Decoding Perspective
Oizumi, Masafumi; Amari, Shun-ichi; Yanagawa, Toru; Fujii, Naotaka; Tsuchiya, Naotsugu
2016-01-01
Accumulating evidence indicates that the capacity to integrate information in the brain is a prerequisite for consciousness. Integrated Information Theory (IIT) of consciousness provides a mathematical approach to quantifying the information integrated in a system, called integrated information, Φ. Integrated information is defined theoretically as the amount of information a system generates as a whole, above and beyond the amount of information its parts independently generate. IIT predicts that the amount of integrated information in the brain should reflect levels of consciousness. Empirical evaluation of this theory requires computing integrated information from neural data acquired in experiments, although difficulties with using the original measure Φ preclude such computations. Although some practical measures have been proposed previously, we found that these measures fail to satisfy the theoretical requirements for a measure of integrated information. Measures of integrated information should satisfy the following lower and upper bounds: the lower bound of integrated information should be 0, and is equal to 0 when the system does not generate information (no information) or when the system comprises independent parts (no integration); the upper bound of integrated information is the amount of information generated by the whole system. Here we derive the novel practical measure Φ* by introducing the concept of mismatched decoding developed in information theory. We show that Φ* is properly bounded from below and above, as required, as a measure of integrated information. We derive the analytical expression of Φ* under the Gaussian assumption, which makes it readily applicable to experimental data. Our novel measure Φ* can generally be used as a measure of integrated information in research on consciousness, and also as a tool for network analysis in diverse areas of biology. PMID:26796119
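Under the Gaussian assumption, measures of this family reduce to covariance-determinant computations. The sketch below shows only the Gaussian mutual-information primitive on which such measures are built (it is not the paper's mismatched-decoding Φ*), checked against the bivariate closed form I = -0.5 * log(1 - rho^2).

```python
# Gaussian mutual-information primitive used by Gaussian integrated-
# information measures (this is NOT the paper's mismatched-decoding Phi*):
#   I(X; Y) = 0.5 * ( log det(Sx) + log det(Sy) - log det(Sxy) )
import numpy as np

def gaussian_mi(cov, dx):
    """MI (nats) between the first dx dims and the rest of a joint Gaussian."""
    logdet = lambda M: np.linalg.slogdet(M)[1]
    return 0.5 * (logdet(cov[:dx, :dx]) + logdet(cov[dx:, dx:]) - logdet(cov))

# Sanity check against the bivariate closed form I = -0.5 * log(1 - rho^2).
rho = 0.8
cov = np.array([[1.0, rho], [rho, 1.0]])
mi = gaussian_mi(cov, 1)
closed_form = -0.5 * np.log(1 - rho**2)
```

The "whole minus parts" structure of integrated information replaces the full covariance with one whose parts are forced to be independent; that step, and the mismatched-decoding correction, are what the paper adds on top of this primitive.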
Finite-block-length analysis in classical and quantum information theory.
Hayashi, Masahito
2017-01-01
Coding technology is used in several information processing tasks. In particular, when noise during transmission disturbs communications, coding technology is employed to protect the information. However, there are two types of coding technology: coding in classical information theory and coding in quantum information theory. Although the physical media used to transmit information ultimately obey quantum mechanics, we need to choose the type of coding depending on the kind of information device, classical or quantum, that is being used. In both branches of information theory, there are many elegant theoretical results under the ideal assumption that an infinitely large system is available. In a realistic situation, we need to account for finite size effects. The present paper reviews finite size effects in classical and quantum information theory with respect to various topics, including applied aspects.
PMID:28302962
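As a concrete instance of a finite-block-length correction, the widely used second-order ("normal") approximation for a binary symmetric channel reads R ≈ C - sqrt(V/n) * Qinv(eps), with capacity C and channel dispersion V. The snippet below evaluates it with illustrative parameters; the crossover probability 0.11 and eps = 1e-3 are assumptions, not values from the paper.

```python
# Normal (second-order) approximation to the best achievable coding rate
# at finite blocklength n for a binary symmetric channel.  Parameter
# values below are illustrative only.
import math
from statistics import NormalDist

def bsc_rate(n, eps, p):
    """Approximate max rate (bits/channel use) at blocklength n, error eps."""
    h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)   # binary entropy
    C = 1 - h                                            # capacity
    V = p * (1 - p) * math.log2((1 - p) / p) ** 2        # dispersion
    qinv = NormalDist().inv_cdf(1 - eps)                 # Q^{-1}(eps)
    return C - math.sqrt(V / n) * qinv

# The backoff from capacity shrinks like 1/sqrt(n).
rates = [bsc_rate(n, 1e-3, 0.11) for n in (100, 1000, 10000)]
```

At n = 100 the achievable rate is well below capacity, and the gap closes at the 1/sqrt(n) speed that finite-block-length analysis quantifies.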
Study on the security of the authentication scheme with key recycling in QKD
NASA Astrophysics Data System (ADS)
Li, Qiong; Zhao, Qiang; Le, Dan; Niu, Xiamu
2016-09-01
In quantum key distribution (QKD), information-theoretically secure authentication is necessary to guarantee the integrity and authenticity of the information exchanged over the classical channel. To reduce key consumption, the authentication scheme with key recycling (KR), in which a secret but fixed hash function is used for multiple messages while each tag is encrypted with a one-time pad (OTP), is preferred in QKD. Based on the assumption that the OTP key is perfect, the security of the authentication scheme has been proved. However, the OTP key of authentication in a practical QKD system is not perfect. How an imperfect OTP affects the security of the authentication scheme with KR is analyzed thoroughly in this paper. In a practical QKD system, the information of the OTP key resulting from QKD is partially leaked to the adversary. Although the leakage is usually small enough to be neglected, it leads to increasingly degraded security of the authentication scheme as the system runs continuously. Both our theoretical analysis and simulation results demonstrate that the security level of the authentication scheme with KR, mainly indicated by its substitution probability, degrades exponentially in the number of rounds and gradually diminishes to zero.
Statistical Issues for Calculating Reentry Hazards
NASA Technical Reports Server (NTRS)
Matney, Mark; Bacon, John
2016-01-01
A number of statistical tools have been developed over the years for assessing the risk that reentering objects pose to human populations. These tools make use of the characteristics (e.g., mass, shape, size) of debris that are predicted by aerothermal models to survive reentry. This information, combined with information on the expected ground path of the reentry, is used to compute the probability that one or more of the surviving debris pieces might hit a person on the ground and cause one or more casualties. The statistical portion of this analysis relies on a number of assumptions about how the debris footprint and the human population are distributed in latitude and longitude, and about how to use that information to arrive at realistic risk numbers. This inevitably involves assumptions that simplify the problem and make it tractable, but it is often difficult to test the accuracy and applicability of these assumptions. This paper builds on previous IAASS work to re-examine many of these theoretical assumptions, including the mathematical basis for the hazard calculations, and to outline the conditions under which the simplifying assumptions hold. This study also employs empirical and theoretical information to test these assumptions, and makes recommendations on how to improve the accuracy of these calculations in the future.
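The casualty-probability computation described above can be caricatured in a few lines: expected casualties are a sum, over surviving fragments and ground bands, of impact probability times population density times casualty area. Every number below is invented for illustration; this is not NASA's actual tool.

```python
# Invented-numbers sketch of the casualty expectation E_c described
# above: sum over fragments and ground bands of
#   P(impact in band) * population density * casualty area.
def expected_casualties(casualty_areas, band_prob, band_density):
    """E_c for independent fragments over latitude/longitude bands."""
    total = 0.0
    for area in casualty_areas:                  # m^2 per fragment
        for p, rho in zip(band_prob, band_density):
            total += p * rho * area              # expected hits per band
    return total

casualty_areas = [0.5, 1.2, 0.36]   # m^2, hypothetical surviving debris
band_prob = [0.2, 0.5, 0.3]         # impact probability per band
band_density = [5e-6, 5e-5, 1e-5]   # people per m^2, hypothetical

Ec = expected_casualties(casualty_areas, band_prob, band_density)
```

With these invented inputs E_c is on the order of 1e-5, the kind of figure that is compared against the commonly cited 1-in-10,000 reentry casualty-risk threshold; the paper's subject is exactly how the distributional assumptions behind `band_prob` and `band_density` should be justified.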
Statistical Issues for Calculating Reentry Hazards
NASA Technical Reports Server (NTRS)
Bacon, John B.; Matney, Mark
2016-01-01
A number of statistical tools have been developed over the years for assessing the risk that reentering objects pose to human populations. These tools make use of the characteristics (e.g., mass, shape, size) of debris that are predicted by aerothermal models to survive reentry. This information, combined with information on the expected ground path of the reentry, is used to compute the probability that one or more of the surviving debris pieces might hit a person on the ground and cause one or more casualties. The statistical portion of this analysis relies on a number of assumptions about how the debris footprint and the human population are distributed in latitude and longitude, and about how to use that information to arrive at realistic risk numbers. This inevitably involves assumptions that simplify the problem and make it tractable, but it is often difficult to test the accuracy and applicability of these assumptions. This paper builds on previous IAASS work to re-examine one of these theoretical assumptions. This study employs empirical and theoretical information to test the assumption of a fully random decay along the argument of latitude of the final orbit, and makes recommendations on how to improve the accuracy of this calculation in the future.
Van Houdt, Sabine; Sermeus, Walter; Vanhaecht, Kris; De Lepeleire, Jan
2014-12-24
Strategies to improve care coordination between primary and hospital care do not always have the desired results. This is partly due to incomplete understanding of the key concepts of care coordination. An in-depth analysis of existing theoretical frameworks for the study of care coordination identified 14 interrelated key concepts. In another study, these 14 key concepts were further explored in patients' experiences. Additionally, "patient characteristics" was identified as a new key concept in patients' experiences, and the previously identified key concept "quality of relationship" between healthcare professionals was extended to "quality of relationship" with the patient. Together, these 15 interrelated key concepts resulted in a new theoretical framework. The present study aimed to improve our understanding of the 15 previously identified key concepts and to explore potentially previously unidentified key concepts, and the links between them, by examining how healthcare professionals experience care coordination. A qualitative design was used. Six focus groups were conducted, including primary healthcare professionals involved in the care of patients who had breast cancer surgery at three hospitals in Belgium. Data were analyzed using constant comparative analysis. All 15 previously identified key concepts of care coordination were further explored in healthcare professionals' experiences. Links between these 15 concepts were identified, including nine newly identified links. The concept "external factors" was linked with all six concepts relating to (inter)organizational mechanisms: "task characteristics", "structure", "knowledge and information technology", "administrative operational processes", "cultural factors", and "need for coordination". Five of these concepts related to three concepts of relational coordination: "roles", "quality of relationship", and "exchange of information".
The concept "task characteristics" was linked only with "roles" and "exchange of information". The concept "patient characteristics" was related to the concepts "need for coordination" and "patient outcome". Outcome was influenced by "roles", "quality of relationship", and "exchange of information". External factors and the (inter)organizational mechanisms should enhance "roles" and "quality of relationship" between healthcare professionals and with the patient, as well as "exchange of information" and the setting and sharing of common "goals", to improve care coordination and quality of care.
Tobin, David L; Banker, Judith D; Weisberg, Laura; Bowers, Wayne
2007-12-01
Although several studies have shown that eating disorders clinicians do not generally use treatment manuals, findings regarding what they do use have typically been vague, or closely linked to a particular theoretical approach. Our goal was to identify what eating disorder clinicians do with their patients in a more theoretically neutral context. We also sought to describe an empirically defined approach to psychotherapeutic practice as defined by clinicians via factor analysis. A survey developed for this study was administered to 265 clinicians recruited online and at regional and international meetings for eating disorders professionals. Only 6% of respondents reported they adhered closely to treatment manuals and 98% of the respondents indicated they used both behavioral and dynamically informed interventions. Factor analysis of clinicians' use of 32 therapeutic strategies suggested seven dimensions: Psychodynamic Interventions, Coping Skills Training, Family History, CBT, Contracts, Therapist Disclosure, and Patient Feelings. The findings of this study suggest that most clinicians use a wide array of eating disorder treatment interventions drawn from empirically supported treatments, such as CBT-BN, and from treatments that have no randomized controlled trial support. Factor analysis suggested theoretically linked dimensions of treatment, but also dimensions that are common across models. (c) 2007 by Wiley Periodicals, Inc.
Research in Computational Astrobiology
NASA Technical Reports Server (NTRS)
Chaban, Galina; Colombano, Silvano; Scargle, Jeff; New, Michael H.; Pohorille, Andrew; Wilson, Michael A.
2003-01-01
We report on several projects in the field of computational astrobiology, which is devoted to advancing our understanding of the origin, evolution, and distribution of life in the Universe using theoretical and computational tools. Research projects included modifying existing computer simulation codes to use efficient, multiple time step algorithms; statistical methods for analysis of astrophysical data via optimal partitioning methods; electronic structure calculations on water-nucleic acid complexes; incorporation of structural information into genomic sequence analysis methods; and calculations of shock-induced formation of polycyclic aromatic hydrocarbon compounds.
The intellectual core of enterprise information systems: a co-citation analysis
NASA Astrophysics Data System (ADS)
Shiau, Wen-Lung
2016-10-01
Enterprise information systems (EISs) have evolved in the past 20 years, attracting the attention of international practitioners and scholars. Although literature reviews and analyses have been conducted to examine the multiple dimensions of EISs, no co-citation analysis has been conducted to examine the knowledge structures involved in EIS studies; thus, the current study fills this research gap. This study investigated the intellectual structures of EISs. All data source documents (1083 articles and 24,090 citations) were obtained from the Institute for Scientific Information Web of Knowledge database. A co-citation analysis was used to analyse EIS data. By using factor analysis, we identified eight critical factors: (a) factors affecting the implementation and success of information systems (ISs); (b) the successful implementation of enterprise resource planning (ERP); (c) IS evaluation and success; (d) system science studies; (e) factors influencing ERP success; (f) case research and theoretical models; (g) user acceptance of information technology; and (h) IS frameworks. Multidimensional scaling and cluster analysis were used to visually map the resultant EIS knowledge. It is difficult to implement an EIS in an enterprise, and each organisation exhibits specific considerations. The current findings indicate that managers must focus on ameliorating inferior project performance levels, enabling a transition from 'vicious' to 'virtuous' projects. Successful EIS implementation yields substantial organisational advantages.
Zhao, Yubin; Li, Xiaofan; Zhang, Sha; Meng, Tianhui; Zhang, Yiwen
2016-08-23
In practical localization system design, researchers need to consider several aspects to make positioning efficient and effective, e.g., the available auxiliary information, sensing devices, equipment deployment, and the environment. These practical concerns translate into technical problems, e.g., sequential position state propagation, the target-anchor geometry effect, non-line-of-sight (NLOS) identification, and the related prior information. It is necessary to construct an efficient framework that can exploit multiple sources of available information and guide the system design. In this paper, we propose a scalable method to analyze system performance based on the Cramér-Rao lower bound (CRLB), which can fuse all of the information adaptively. First, we use an abstract function to represent the whole wireless localization system model. The unknown vector of the CRLB then consists of two parts: the first part is the estimated vector, and the second part is the auxiliary vector, which helps improve the estimation accuracy. Accordingly, the Fisher information matrix is divided into two parts: the state matrix and the auxiliary matrix. Unlike purely theoretical analyses, our CRLB can serve as a practical fundamental limit for a system that fuses multiple sources of information in a complicated environment, e.g., recursive Bayesian estimation based on the hidden Markov model, the map-matching method, and NLOS identification and mitigation methods. Thus, the theoretical results more closely approach the real case. In addition, our method is more adaptable than other CRLBs when more unknown important factors are considered. We use the proposed method to analyze a wireless sensor network-based indoor localization system. The influence of hybrid LOS/NLOS channels, building layout information, and the relative height differences between the target and anchors are analyzed.
It is demonstrated that our method exploits all of the available information for the indoor localization systems and serves as an indicator for practical system evaluation.
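The FIM partition described above has a compact linear-algebra core: the bound on the estimated part is the inverse of the Schur complement of the auxiliary block. The toy numbers below are invented, but they show the ordering such analyses exploit: knowing the auxiliary vector exactly gives the tightest bound, prior information on it an intermediate bound, and treating it as fully unknown the loosest.

```python
# Toy sketch of the partitioned Fisher information matrix (FIM) idea:
# unknown vector [theta; psi], with theta the estimated part and psi
# the auxiliary part.  The numbers are invented for illustration.
import numpy as np

F = np.array([[10.0, 2.0],
              [ 2.0, 4.0]])                    # joint FIM, illustrative

# psi completely unknown: CRLB(theta) = 1 / (Schur complement of F_pp)
schur = F[0, 0] - F[0, 1] * F[1, 0] / F[1, 1]
crlb_unknown = 1.0 / schur

# prior information about psi adds to the auxiliary block and tightens it
F_prior = F.copy()
F_prior[1, 1] += 6.0
crlb_prior = np.linalg.inv(F_prior)[0, 0]

# psi perfectly known: only the state block matters
crlb_known = 1.0 / F[0, 0]
```

The Schur-complement form agrees with taking the (0, 0) entry of the full inverse FIM, which is how fusing extra auxiliary information (maps, NLOS state, heights) shows up directly as a smaller bound on the position estimate.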
Liao, Qiuyan; Cowling, Benjamin J; Lam, Wendy Wing Tak; Fielding, Richard
2011-06-01
Understanding population responses to influenza helps optimize public health interventions. Relevant theoretical frameworks remain nascent. We aimed to model associations between trust in information, perceived hygiene effectiveness, knowledge about the causes of influenza, perceived susceptibility and worry, and personal hygiene practices (PHPs) associated with influenza. Cross-sectional household telephone surveys on avian influenza A/H5N1 (2006) and pandemic influenza A/H1N1 (2009) gathered comparable data on trust in formal and informal sources of influenza information, influenza-related knowledge, perceived hygiene effectiveness, worry, perceived susceptibility, and PHPs. Exploratory factor analysis confirmed domain content, while confirmatory factor analysis was used to evaluate the extracted factors. The hypothesized model, compiled from different theoretical frameworks, was optimized with structural equation modelling using the A/H5N1 data. The optimized model was then tested against the A/H1N1 dataset. The model was robust across datasets, though corresponding path weights differed. Trust in formal information was positively associated with perceived hygiene effectiveness, which was positively associated with PHPs in both datasets. Trust in formal information was positively associated with influenza worry in the A/H5N1 data, and with knowledge of influenza cause in the A/H1N1 data, both variables being positively associated with PHPs. Trust in informal information was positively associated with influenza worry in both datasets. Independent of information trust, perceived influenza susceptibility was associated with influenza worry, and worry was significantly associated with PHPs in the A/H5N1 data only. Knowledge of influenza cause and perceived PHP effectiveness were associated with PHPs. Improving trust in formal information should increase PHPs.
Human movement analysis using stereophotogrammetry. Part 1: theoretical background.
Cappozzo, Aurelio; Della Croce, Ugo; Leardini, Alberto; Chiari, Lorenzo
2005-02-01
This paper sets the stage for a series of reviews dealing with the problems associated with the reconstruction and analysis of in vivo skeletal system kinematics using optoelectronic stereophotogrammetric data. Instantaneous bone position and orientation and joint kinematic variable estimations are addressed in the framework of rigid body mechanics. The conceptual background to these exercises is discussed. Focus is placed on the experimental and analytical problem of merging the information relative to movement and that relative to the morphology of the anatomical body parts of interest. The various global and local frames that may be used in this context are defined. Common anatomical and mathematical conventions that can be used to describe joint kinematics are illustrated in a comparative fashion. The authors believe that an effort to systematize the different theoretical and experimental approaches to the problems involved and related nomenclatures, as currently reported in the literature, is needed to facilitate data and knowledge sharing, and to provide renewed momentum for the advancement of human movement analysis.
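The "merging movement and morphology" step above ultimately rests on estimating a bone-embedded frame's pose from marker coordinates. A standard least-squares building block for that is the SVD-based (Kabsch) rigid-pose fit, sketched below on synthetic markers; the cluster, rotation, and translation are invented purely for the check.

```python
# Sketch of the standard least-squares rigid-pose step used in
# stereophotogrammetric analysis: given marker positions in a local
# (anatomical) frame and their measured global positions, recover the
# rotation R and translation t with the SVD (Kabsch) method.
import numpy as np

def rigid_pose(local, world):
    """Least-squares R, t such that world ~= local @ R.T + t (row points)."""
    cl, cw = local.mean(axis=0), world.mean(axis=0)
    H = (local - cl).T @ (world - cw)          # cross-dispersion matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    return R, cw - R @ cl

# Synthetic check: move a marker cluster by a known pose and recover it.
rng = np.random.default_rng(1)
local = rng.normal(size=(5, 3))                # 5 markers in the local frame
angle = 0.7
Rz = np.array([[np.cos(angle), -np.sin(angle), 0.0],
               [np.sin(angle),  np.cos(angle), 0.0],
               [0.0, 0.0, 1.0]])
t_true = np.array([0.1, -0.2, 0.3])
world = local @ Rz.T + t_true
R, t = rigid_pose(local, world)
```

With noise-free markers the pose is recovered exactly; with real stereophotogrammetric data the same estimator averages out marker noise and soft-tissue artefact in a least-squares sense, which is where the problems reviewed in this series begin.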
Ethics and rationality in information-enriched decisions: A model for technical communication
NASA Astrophysics Data System (ADS)
Dressel, S. B.; Carlson, P.; Killingsworth, M. J.
1993-12-01
In a technological culture, information has a crucial impact upon decisions, but exactly how information plays into decisions is not always clear. Decisions that are effective, efficient, and ethical must be rational. That is, we must be able to determine and present good reasons for our actions. The topic of this paper is how information relates to good reasons and thereby affects the best decisions. A brief sketch of a model for decision-making is presented, which offers a synthesis of theoretical approaches to argument and to information analysis. The model is then applied to a brief hypothetical case. The main purpose is to put the model before an interested audience in the hope of stimulating discussion and further research.
Depressed Mothers as Informants on Child Behavior: Methodological Issues
Ordway, Monica Roosa
2011-01-01
Mothers with depressive symptoms more frequently report behavioral problems among their children than non-depressed mothers, leading to a debate regarding the accuracy of depressed mothers as informants on children’s behavior. The purpose of this integrative review was to identify methodological challenges in research related to the debate. Data were extracted from 43 papers (6 theoretical, 36 research reports, and 1 instrument scoring manual). The analysis focused on the methodologies considered when using depressed mothers as informants. Nine key themes were identified, and I concluded that researchers should incorporate multiple informants, identify the characteristics of maternal depression, and incorporate advanced statistical methodology. The use of a conceptual framework to understand informant discrepancies within child behavior evaluations is suggested for future research. PMID:21964958
Combination of real options and game-theoretic approach in investment analysis
NASA Astrophysics Data System (ADS)
Arasteh, Abdollah
2016-09-01
Investments in technology account for a large share of capital investment by major companies. Assessing such investment projects is critical to the efficient assignment of resources. Viewing investment projects as real options, this paper develops a method for assessing technology investment decisions in the presence of uncertainty and competition. It combines game-theoretic models of strategic market interactions with a real options approach. Several key characteristics underlie the model. First, our study shows how investment strategies depend on competitive interactions. Under the force of competition, firms hurry to exercise their options early. The resulting "hurry equilibrium" destroys the option value of waiting and induces aggressive investment behavior. Second, we obtain optimal investment policies and critical investment thresholds. This suggests that consolidation will be unavoidable in some information product markets. The model offers new intuitions into the forces that shape market behavior as observed in the information technology industry. It can be used to specify optimal investment policies for technology innovations and adoptions, multistage R&D, and investment projects in information technology.
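The option value of waiting that competition erodes can be shown in a one-period numerical toy (invented figures, not the paper's game model): a project worth V0 today moves to V_up or V_down next period, and deferral keeps the right to skip the bad state.

```python
# One-period "value of waiting" toy (invented figures, not the paper's
# game-theoretic model).  A project costs I; its value today is V0 and
# moves to V_up or V_down next period with risk-neutral probability q.
def invest_now(V0, I):
    return V0 - I                              # commit immediately

def wait_and_see(V_up, V_down, I, q, r):
    # invest next period only in the good state, discounted at rate r
    up = max(V_up - I, 0.0)
    down = max(V_down - I, 0.0)
    return (q * up + (1.0 - q) * down) / (1.0 + r)

V0, V_up, V_down, I, q, r = 100.0, 140.0, 70.0, 90.0, 0.5, 0.05
npv_now = invest_now(V0, I)
npv_wait = wait_and_see(V_up, V_down, I, q, r)
```

With these numbers deferral beats immediate investment; the paper's point is that a rival able to preempt destroys exactly this wedge, which is why the "hurry equilibrium" pushes firms toward early exercise.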
The Dolinar Receiver in an Information Theoretic Framework
NASA Technical Reports Server (NTRS)
Erkmen, Baris I.; Birnbaum, Kevin M.; Moision, Bruce E.; Dolinar, Samuel J.
2011-01-01
Optical communication at the quantum limit requires that measurements on the optical field be maximally informative, but devising physical measurements that accomplish this objective has proven challenging. The Dolinar receiver exemplifies a rare instance of success in distinguishing between two coherent states: an adaptive local oscillator is mixed with the signal prior to photodetection, which yields an error probability that meets the Helstrom lower bound with equality. Here we apply the same local-oscillator-based architecture with an information-theoretic optimization criterion. We begin with an analysis of this receiver in a general framework for an arbitrary coherent-state modulation alphabet, and then we concentrate on two relevant examples. First, we study a binary antipodal alphabet and show that the Dolinar receiver's feedback function not only minimizes the probability of error but also maximizes the mutual information. Next, we study ternary modulation consisting of antipodal coherent states and the vacuum state. We derive an analytic expression for a near-optimal local-oscillator feedback function, and, via simulation, we determine its photon information efficiency (PIE). We provide the PIE versus dimensional information efficiency (DIE) trade-off curve and show that this modulation and receiver combination performs universally better than (generalized) on-off keying plus photon counting, although the advantage asymptotically vanishes as the bits per photon diverge towards infinity.
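For the binary antipodal alphabet discussed above, the Helstrom bound that the Dolinar receiver attains has a closed form in the mean photon number N = |alpha|^2. The snippet below only evaluates that bound; it does not simulate the adaptive receiver itself.

```python
# Helstrom minimum error probability for discriminating the antipodal
# coherent states |alpha> and |-alpha>, with mean photon number
# N = |alpha|^2:
#   P_e = 0.5 * (1 - sqrt(1 - exp(-4 N)))
# This is the bound the Dolinar receiver meets with equality; the code
# evaluates the bound only, it does not model the feedback loop.
import math

def helstrom_binary_coherent(N):
    """Minimum error probability for |+alpha> vs |-alpha>, N = |alpha|^2."""
    return 0.5 * (1.0 - math.sqrt(1.0 - math.exp(-4.0 * N)))

probs = {N: helstrom_binary_coherent(N) for N in (0.0, 0.1, 0.5, 1.0)}
```

At N = 0 the two states coincide and the bound is the coin-flip value 0.5; for large N it falls off roughly as exp(-4N)/4, far below what any static (non-adaptive) measurement of the same field achieves.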
ERIC Educational Resources Information Center
Bierschenk, Bernhard
Empirical information is presented on how researchers at Swedish institutes of education perceive, structure, and define educational and psychological problems. The collection, evaluation, and presentation of the results of the study were made on the basis of system theoretic assumptions in that the description and analysis of the initial phase of…
ERIC Educational Resources Information Center
Antoniadou, Victoria
2011-01-01
This article describes the contradictions reported by student-teachers in Barcelona who engaged in telecollaboration with transatlantic peers via Second Life, during their initial training in Teaching English as a Foreign Language. The data analysis draws upon Grounded Theory and is theoretically informed by Activity Theory and the notion of…
ERIC Educational Resources Information Center
Logan, Helen
2018-01-01
This paper presents lesser known accounts from policy makers whose experiences as elite informants span 40 or so years in Australian early childhood education and care (ECEC) policy history between 1972 and 2009. Drawing on a post-structuralist theoretical frame, this paper employs a Foucauldian-influenced approach to discourse analysis. Given the…
ERIC Educational Resources Information Center
Balboni, Daniel C.
2016-01-01
Researchers have conducted both theoretical and empirical research on the participation of youth in sports to understand the motivation to continue involvement. Researchers have further examined the positive effects of sports on youth who participate. Although information has been gathered in these areas regarding keeping middle school and high…
ERIC Educational Resources Information Center
Khanova, Julia
2013-01-01
The study explored the role of online teaching experience in pedagogical innovation in the area of Library and Information Science (LIS) education. Based on the data from interviews with 25 LIS faculty who have relevant experience and from the syllabi for their courses, the study provides evidence that transitioning courses to online modality…
Improved separability criteria via some classes of measurements
NASA Astrophysics Data System (ADS)
Shen, Shu-Qian; Li, Ming; Li-Jost, Xianqing; Fei, Shao-Ming
2018-05-01
Entanglement detection via local measurements can be experimentally implemented. Based on mutually unbiased measurements and general symmetric informationally complete positive-operator-valued measures, we present separability criteria for bipartite quantum states which, by theoretical analysis, are stronger than the related existing criteria based on these measurements. Two detailed examples are supplemented to show the efficiency of the presented separability criteria.
ERIC Educational Resources Information Center
Kramer, Lorie Renee
2012-01-01
This qualitative study used narrative analysis to explore the role of relationships between adults and their canine companions and the role of this relationship in personal growth and well-being. The theoretical frameworks to inform the study consisted of attachment theory and a blend of relational theory and connected knowing. The study focused…
ERIC Educational Resources Information Center
Dowell, Nia M. M.; Graesser, Arthur C.; Cai, Zhiqiang
2016-01-01
The goal of this article is to preserve and distribute the information presented at the LASI (2014) workshop on Coh-Metrix, a theoretically grounded, computational linguistics facility that analyzes texts on multiple levels of language and discourse. The workshop focused on the utility of Coh-Metrix in discourse theory and educational practice. We…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crossno, Patricia Joyce; Dunlavy, Daniel M.; Stanton, Eric T.
This report is a summary of the accomplishments of the 'Scalable Solutions for Processing and Searching Very Large Document Collections' LDRD, which ran from FY08 through FY10. Our goal was to investigate scalable text analysis; specifically, methods for information retrieval and visualization that could scale to extremely large document collections. Towards that end, we designed, implemented, and demonstrated a scalable framework for text analysis - ParaText - as a major project deliverable. Further, we demonstrated the benefits of using visual analysis in text analysis algorithm development, improved performance of heterogeneous ensemble models in data classification problems, and the advantages of information theoretic methods in user analysis and interpretation in cross language information retrieval. The project involved 5 members of the technical staff and 3 summer interns (including one who worked two summers). It resulted in a total of 14 publications, 3 new software libraries (2 open source and 1 internal to Sandia), several new end-user software applications, and over 20 presentations. Several follow-on projects have already begun or will start in FY11, with additional projects currently in proposal.
Ripp, Isabelle; Zur Nieden, Anna-Nora; Blankenagel, Sonja; Franzmeier, Nicolai; Lundström, Johan N; Freiherr, Jessica
2018-05-07
In this study, we aimed to understand how whole-brain neural networks compute sensory information integration, based on the olfactory and visual systems. Task-related functional magnetic resonance imaging (fMRI) data were obtained during unimodal and bimodal sensory stimulation. Based on the identification of multisensory integration processing (MIP)-specific hub-like network nodes, analyzed with network-based statistics using region-of-interest-based connectivity matrices, we conclude that the following brain areas are important for processing the presented bimodal sensory information: the right precuneus, connected contralaterally to the supramarginal gyrus, for memory-related imagery and phonology retrieval; and the left middle occipital gyrus, connected ipsilaterally to the inferior frontal gyrus via the inferior fronto-occipital fasciculus, including functional aspects of working memory. Applying graph theory to quantify the resulting complex network topologies indicates a significantly increased global efficiency and clustering coefficient in networks including aspects of MIP, reflecting simultaneously better integration and segregation. Graph-theoretical analysis of positive and negative network correlations, which allows inferences about excitatory and inhibitory network architectures, revealed (not significantly, but very consistently) that MIP-specific neural networks are dominated by inhibitory relationships between brain regions involved in stimulus processing. © 2018 Wiley Periodicals, Inc.
Data Curation and Visualization for MuSIASEM Analysis of the Nexus
NASA Astrophysics Data System (ADS)
Renner, Ansel
2017-04-01
A novel software-based approach to relational analysis, applying recent theoretical advancements of the Multi-Scale Integrated Analysis of Societal and Ecosystem Metabolism (MuSIASEM) accounting framework, is presented. This research explores and explains underutilized ways in which software can assist complex system analysis across the stages of data collection, exploration, analysis and dissemination, in a transparent and collaborative manner. This work is being conducted as part of, and in support of, the four-year European Commission H2020 project Moving Towards Adaptive Governance in Complexity: Informing Nexus Security (MAGIC). In MAGIC, theoretical advancements to MuSIASEM propose a powerful new approach to spatial-temporal WEFC relational analysis, in accordance with a structural-functional scaling mechanism appropriate for biophysically relevant complex system analyses. Software is designed primarily in JavaScript using the Angular2 model-view-controller framework and the Data-Driven Documents (D3) library. These design choices clarify and modularize data flow, simplify research practitioners' work, allow for and assist stakeholder involvement, and advance collaboration at all stages. Data requirements and scalable, robust yet lightweight structuring are first explained. Next, algorithms to process these data are explored. Lastly, data interfaces and data visualization approaches are presented and described.
Dynamics of social contagions with local trend imitation.
Zhu, Xuzhen; Wang, Wei; Cai, Shimin; Stanley, H Eugene
2018-05-09
Research on social contagion dynamics has not yet included a theoretical analysis of the ubiquitous local trend imitation (LTI) characteristic. We propose a social contagion model with a tent-like adoption probability to investigate the effect of this LTI characteristic on behavior spreading. We also propose a generalized edge-based compartmental theory to describe the proposed model. Through extensive numerical simulations and theoretical analyses, we find a crossover in the phase transition: when the LTI capacity is strong, the growth of the final adoption size exhibits a second-order phase transition. When the LTI capacity is weak, we see a first-order phase transition. For a given behavioral information transmission probability, there is an optimal LTI capacity that maximizes the final adoption size. Finally we find that the above phenomena are not qualitatively affected by the heterogeneous degree distribution. Our suggested theoretical predictions agree with the simulation results.
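A minimal sketch of a tent-like adoption probability of the kind analyzed above; the peak location and maximum probability are invented parameters, not values from the model.

```python
def tent_adoption_prob(frac_adopted, peak=0.5, p_max=0.8):
    """Tent-like adoption probability for local trend imitation (LTI):
    willingness to adopt rises while a behaviour is gaining local
    popularity, then falls once most neighbours have already adopted.
    `peak` and `p_max` are illustrative parameters, not from the paper."""
    if not 0.0 <= frac_adopted <= 1.0:
        raise ValueError("frac_adopted must lie in [0, 1]")
    if frac_adopted <= peak:
        return p_max * frac_adopted / peak
    return p_max * (1.0 - frac_adopted) / (1.0 - peak)

# Evaluate across the full range of adopted-neighbour fractions
probs = [tent_adoption_prob(f / 10) for f in range(11)]
```

The rise-then-fall shape is the LTI characteristic whose strength, per the abstract, governs whether the final adoption size grows through a second-order or first-order transition.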
Mixing Categories and Modal Logics in the Quantum Setting
NASA Astrophysics Data System (ADS)
Cinà, Giovanni
The study of the foundations of Quantum Mechanics, especially after the advent of Quantum Computation and Information, has benefited from the application of category-theoretic tools and modal logics to the analysis of Quantum processes: we witness a wealth of theoretical frameworks cast in either of the two languages. This paper explores the interplay of the two formalisms in the peculiar context of Quantum Theory. After a review of some influential abstract frameworks, we show how different modal logic frames can be extracted from the category of finite dimensional Hilbert spaces, connecting the Categorical Quantum Mechanics approach to some modal logics that have been proposed for Quantum Computing. We then apply a general version of the same technique to two other categorical frameworks, the `topos approach' of Doering and Isham and the sheaf-theoretic work on contextuality by Abramsky and Brandenburger, suggesting how some key features can be expressed with modal languages.
Information Processing in Living Systems
NASA Astrophysics Data System (ADS)
Tkačik, Gašper; Bialek, William
2016-03-01
Life depends as much on the flow of information as on the flow of energy. Here we review the many efforts to make this intuition precise. Starting with the building blocks of information theory, we explore examples where it has been possible to measure, directly, the flow of information in biological networks, or more generally where information-theoretic ideas have been used to guide the analysis of experiments. Systems of interest range from single molecules (the sequence diversity in families of proteins) to groups of organisms (the distribution of velocities in flocks of birds), and all scales in between. Many of these analyses are motivated by the idea that biological systems may have evolved to optimize the gathering and representation of information, and we review the experimental evidence for this optimization, again across a wide range of scales.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tralshawala, Nilesh; Howard, Don; Knight, Bryon
2008-02-28
In conventional infrared thermography, determination of thermal diffusivity requires thickness information. Recently GE has been experimenting with the use of lateral heat flow to determine thermal diffusivity without thickness information. This work builds on previous work at NASA Langley and Wayne State University, but we incorporate thermal time-of-flight (tof) analysis rather than curve fitting to obtain quantitative information. We have developed appropriate theoretical models and a tof-based data analysis framework to experimentally determine all components of thermal diffusivity from the time-temperature measurements. Initial validation was carried out using finite difference simulations. Experimental validation was done using anisotropic carbon fiber reinforced polymer (CFRP) composites. We found that in the CFRP samples used, the in-plane component of diffusivity is about eight times larger than the through-thickness component.
Peres Penteado, Alissa; Fábio Maciel, Rafael; Erbs, João; Feijó Ortolani, Cristina Lucia; Aguiar Roza, Bartira; Torres Pisa, Ivan
2015-01-01
The entire kidney transplantation process in Brazil is defined through laws, decrees, ordinances, and resolutions, but there is no defined theoretical map describing this process. From such a representation it is possible to perform analyses, such as identifying bottlenecks and the information and communication technologies (ICTs) that support the process. The aim of this study was to analyze and represent the kidney transplantation workflow using business process modeling notation (BPMN) and then to identify the ICTs involved in the process. This study was conducted in eight steps, including document analysis and professional evaluation. The results include the BPMN model of the kidney transplantation process in Brazil and the identification of ICTs. We found that there are substantial delays in the process because many different ICTs are involved, which can cause information to be poorly integrated.
Instrumentation for localized superconducting cavity diagnostics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Conway, Z. A.; Ge, M.; Iwashita, Y.
2017-01-12
Superconducting accelerator cavities are now routinely operated at levels approaching the theoretical limit of niobium. To achieve these operating levels more information than is available from the RF excitation signal is required to characterize and determine fixes for the sources of performance limitations. This information is obtained using diagnostic techniques which complement the analysis of the RF signal. In this paper we describe the operation and select results from three of these diagnostic techniques: the use of large scale thermometer arrays, second sound wave defect location and high precision cavity imaging with the Kyoto camera.
Information theoretic quantification of diagnostic uncertainty.
Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T
2012-01-01
Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
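The essay's pairing of Bayes' rule with an entropy measure of diagnostic uncertainty can be sketched in a few lines. The sensitivity, specificity, and pre-test probability below are made-up illustrative values, not figures from the paper.

```python
import math

def post_test_probability(pre, sensitivity, specificity, positive=True):
    """Bayes' rule for a binary diagnostic test result."""
    if positive:
        num = sensitivity * pre
        den = num + (1 - specificity) * (1 - pre)
    else:
        num = (1 - sensitivity) * pre
        den = num + specificity * (1 - pre)
    return num / den

def binary_entropy(p):
    """Diagnostic uncertainty in bits: maximal (1 bit) at p = 0.5."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Hypothetical test: sensitivity 0.90, specificity 0.95, rare disease.
pre = 0.01
post = post_test_probability(pre, 0.90, 0.95, positive=True)
uncertainty_before = binary_entropy(pre)
uncertainty_after = binary_entropy(post)
```

Here an unexpected positive result raises the disease probability only to about 0.15, and the entropy measure shows diagnostic uncertainty *increasing* (the probability moves toward 0.5) — the kind of information-theoretic reading of an unexpected result the essay advocates.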
Orbital Evasive Target Tracking and Sensor Management
2012-03-30
...sensor management is to maximize the total information gain in the observer-to-target assignment. We compare the information-based approach to the game-theoretic criterion for tracking with multiple space-borne observers. The results indicate that the game-theoretic approach is more effective than the information-based approach in sensor management.
Observations of Terrestrial Nightglow (Meinel Bands) at King
NASA Astrophysics Data System (ADS)
Won, Young-In; Cho, Young-Min; Lee, Bang Yong; Kim, Jhoon; Chung, Jong Kyun; Kim, Yong Ha
1999-12-01
A Fourier Transform Spectrometer was used to study upper mesospheric thermodynamics by observing the hydroxyl (OH) emission. Rocket-borne and satellite-borne photometers place the peak emission altitude near 87 km. The instrument was installed in February 1999 at King Sejong station (62.22 deg S, 301.25 deg E), Antarctica, and has been in routine operation since then. An intensive operational effort resulted in substantial data between April and June 1999. A harmonic analysis was carried out to extract information on the tidal characteristics. The measured amplitudes of the 12-hour oscillation are in the range of 2.4-3.7 K, which is in reasonable agreement with theoretical model outputs. The harmonic analysis also revealed an 8-hour oscillation, which is not expected from traditional theoretical studies. In addition, the observed 8-hour oscillations are apparent and sometimes dominate the temperature variation in the upper mesosphere.
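The tidal amplitudes quoted above come from a harmonic analysis of the temperature series. A hedged sketch of such a least-squares fit on synthetic data (the amplitudes, phases, and sampling below are invented, not the observed values):

```python
import numpy as np

def fit_harmonics(t, temp, periods):
    """Least-squares fit of cosine/sine pairs at the given periods (hours);
    returns the amplitude of each harmonic."""
    cols = []
    for p in periods:
        w = 2 * np.pi / p
        cols += [np.cos(w * t), np.sin(w * t)]
    design = np.column_stack(cols + [np.ones_like(t)])  # + constant offset
    coef, *_ = np.linalg.lstsq(design, temp, rcond=None)
    return [np.hypot(coef[2 * i], coef[2 * i + 1]) for i in range(len(periods))]

t = np.arange(0.0, 72.0, 0.5)                       # three days, half-hourly
temp = 3.0 * np.cos(2 * np.pi * t / 12.0) \
     + 1.2 * np.cos(2 * np.pi * t / 8.0 + 0.4)      # 12-h and 8-h "tides"
amp12, amp8 = fit_harmonics(t, temp, [12.0, 8.0])
```

On noiseless synthetic data the fit recovers both the semidiurnal (12-hour) and terdiurnal (8-hour) amplitudes exactly, which is the quantity reported in kelvin above.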
Multivariate analysis for scanning tunneling spectroscopy data
NASA Astrophysics Data System (ADS)
Yamanishi, Junsuke; Iwase, Shigeru; Ishida, Nobuyuki; Fujita, Daisuke
2018-01-01
We applied principal component analysis (PCA) to two-dimensional tunneling spectroscopy (2DTS) data obtained on a Si(111)-(7 × 7) surface to explore the effectiveness of multivariate analysis for interpreting 2DTS data. We demonstrated that several components that originated mainly from specific atoms at the Si(111)-(7 × 7) surface can be extracted by PCA. Furthermore, we showed that hidden components in the tunneling spectra can be decomposed (peak separation), which is difficult to achieve with normal 2DTS analysis without the support of theoretical calculations. Our analysis showed that multivariate analysis can be an additional powerful way to analyze 2DTS data and extract hidden information from a large amount of spectroscopic data.
Foveal analysis and peripheral selection during active visual sampling
Ludwig, Casimir J. H.; Davies, J. Rhys; Eckstein, Miguel P.
2014-01-01
Human vision is an active process in which information is sampled during brief periods of stable fixation in between gaze shifts. Foveal analysis serves to identify the currently fixated object and has to be coordinated with a peripheral selection process of the next fixation location. Models of visual search and scene perception typically focus on the latter, without considering foveal processing requirements. We developed a dual-task noise classification technique that enables identification of the information uptake for foveal analysis and peripheral selection within a single fixation. Human observers had to use foveal vision to extract visual feature information (orientation) from different locations for a psychophysical comparison. The selection of to-be-fixated locations was guided by a different feature (luminance contrast). We inserted noise in both visual features and identified the uptake of information by looking at correlations between the noise at different points in time and behavior. Our data show that foveal analysis and peripheral selection proceeded completely in parallel. Peripheral processing stopped some time before the onset of an eye movement, but foveal analysis continued during this period. Variations in the difficulty of foveal processing did not influence the uptake of peripheral information and the efficacy of peripheral selection, suggesting that foveal analysis and peripheral selection operated independently. These results provide important theoretical constraints on how to model target selection in conjunction with foveal object identification: in parallel and independently. PMID:24385588
Hadwin, Julie A; Garner, Matthew; Perez-Olivas, Gisela
2006-11-01
The aim of this paper is to explore parenting as one potential route through which information processing biases for threat develop in children. It reviews information processing biases in childhood anxiety in the context of theoretical models and empirical research in the adult anxiety literature. Specifically, it considers how adult models have been used and adapted to develop a theoretical framework with which to investigate information processing biases in children. The paper then considers research which specifically aims to understand the relationship between parenting and the development of information processing biases in children. It concludes that a clearer theoretical framework is required to understand the significance of information biases in childhood anxiety, as well as their origins in parenting.
Juvé-Udina, Maria-Eulàlia
2012-06-01
This manuscript is the third of a triad of papers introducing the philosophical and theoretical approaches that support the development and validation of a nursing interface terminology as a standard vocabulary designed to ease data entry into electronic health records, to produce information and to generate knowledge. To analyze the philosophical and theoretical approaches considered in the development of a new nursing interface terminology called ATIC. Review, analysis and discussion of the main philosophical orientations, high and mid-range theories and nursing scientific literature to develop an interpretative conceptualization of the metaparadigm concepts "Health", "Environment" and "Nursing". In the two previous papers, the ATIC terminology, its foundation on pragmatism, holism, post-positivism and constructivism, and the construction of the meaning of the concept "Individual" were discussed. In this third paper, Health is conceptualized as a multidimensional balance state, and the concepts of Partial health status, Disease and Being ill are explored within it. The analysis of the Environment theories drives its conceptualization as a group of variables that have the potential to affect health status. In this orientation, Nursing is understood as the scientific discipline focused on the study of health status in the particular environment and experience of individuals, groups, communities or societies. The ATIC terminology is rooted in an eclectic philosophical and theoretical foundation, allowing it to be used from different trends within the totality paradigm.
Haldar, Justin P.; Leahy, Richard M.
2013-01-01
This paper presents a novel family of linear transforms that can be applied to data collected from the surface of a 2-sphere in three-dimensional Fourier space. This family of transforms generalizes the previously-proposed Funk-Radon Transform (FRT), which was originally developed for estimating the orientations of white matter fibers in the central nervous system from diffusion magnetic resonance imaging data. The new family of transforms is characterized theoretically, and efficient numerical implementations of the transforms are presented for the case when the measured data is represented in a basis of spherical harmonics. After these general discussions, attention is focused on a particular new transform from this family that we name the Funk-Radon and Cosine Transform (FRACT). Based on theoretical arguments, it is expected that FRACT-based analysis should yield significantly better orientation information (e.g., improved accuracy and higher angular resolution) than FRT-based analysis, while maintaining the strong characterizability and computational efficiency of the FRT. Simulations are used to confirm these theoretical characteristics, and the practical significance of the proposed approach is illustrated with real diffusion weighted MRI brain data. These experiments demonstrate that, in addition to having strong theoretical characteristics, the proposed approach can outperform existing state-of-the-art orientation estimation methods with respect to measures such as angular resolution and robustness to noise and modeling errors. PMID:23353603
Schalock, Robert L; Luckasson, Ruth; Tassé, Marc J; Verdugo, Miguel Angel
2018-04-01
This article describes a holistic theoretical framework that can be used to explain intellectual disability (ID) and organize relevant information into a usable roadmap to guide understanding and application. Developing the framework involved analyzing the four current perspectives on ID and synthesizing this information into a holistic theoretical framework. Practices consistent with the framework are described, and examples are provided of how multiple stakeholders can apply the framework. The article concludes with a discussion of the advantages and implications of a holistic theoretical approach to ID.
Multidisciplinary analysis and design of printed wiring boards
NASA Astrophysics Data System (ADS)
Fulton, Robert E.; Hughes, Joseph L.; Scott, Waymond R., Jr.; Umeagukwu, Charles; Yeh, Chao-Pin
1991-04-01
Modern printed wiring board design depends on electronic prototyping using computer-based simulation and design tools. Existing electrical computer-aided design (ECAD) tools emphasize circuit connectivity with only rudimentary analysis capabilities. This paper describes a prototype integrated PWB design environment, denoted Thermal Structural Electromagnetic Testability (TSET), being developed at Georgia Tech in collaboration with companies in the electronics industry. TSET provides design guidance based on enhanced electrical and mechanical CAD capabilities, including electromagnetic modeling, testability analysis, thermal management and solid mechanics analysis. TSET development is based on a strong analytical and theoretical science base and incorporates an integrated information framework and a common database design based on a systematic structured methodology.
Swapping Settings: Researching Information Literacy in Workplace and in Educational Contexts
ERIC Educational Resources Information Center
Lundh, Anna Hampson; Limberg, Louise; Lloyd, Annemaree
2013-01-01
Introduction: Information literacy research is characterised by a multitude of interests, research approaches and theoretical starting-points. Challenges lie in the relevance of research to professional fields where information literacy is a concern, and the need to build a strong theoretical base for the research area. We aim to lay a foundation…
The Theoretical Principles of the Organization of Information Systems.
ERIC Educational Resources Information Center
Kulikowski, Juliusz Lech
A survey of the theoretical problems connected with the organization and design of systems for processing and transmitting information is presented in this article. It gives a definition of Information Systems (IS) and classifies them from various points of view. It discusses briefly the most important aspects of the organization of IS, such as…
Grace, Francesca C; Meurk, Carla S; Head, Brian W; Hall, Wayne D; Harris, Meredith G; Whiteford, Harvey A
2017-05-30
Heightened fiscal constraints, increases in the chronic disease burden and in consumer expectations are among several factors contributing to the global interest in evidence-informed health policy. The present article builds on previous work that explored how the Australian Federal Government applied five instruments of policy, or policy levers, to implement a series of reforms under the Australian National Mental Health Strategy (NMHS). The present article draws on theoretical insights from political science to analyse the relative successes and failures of these levers, as portrayed in formal government evaluations of the NMHS. Documentary analysis of six evaluation documents corresponding to three National Mental Health Plans was undertaken. Both the content and approach of these government-funded, independently conducted evaluations were appraised. An overall improvement was apparent in the development and application of policy levers over time. However, this finding should be interpreted with caution due to variations in evaluation approach according to Plan and policy lever. Tabulated summaries of the success and failure of each policy initiative, ordered by lever type, are provided to establish a resource that could be consulted for future policy-making. This analysis highlights the complexities of health service reform and underscores the limitations of narrowly focused empirical approaches. A theoretical framework is provided that could inform the evaluation and targeted selection of appropriate policy levers in mental health.
Török I, András; Vincze, Gábor
2011-01-01
[corrected] The Szondi test is widely applied in clinical diagnostics in Hungary, and it follows from the theory that its interpretation can yield information about attachment. Its validity is supported by empirical research and clinical experience. By analyzing modern attachment theory more thoroughly, it becomes clear in what ways the Szondi-test constellations regarding attachment differ from the classification based on questionnaires, which allows discrete measurement of attachment style. With the Szondi test the classification of attachment style is less certain, but when completed with exploration it is more informative in vector C (the vector of relation and attachment information), whereas short questionnaires make classification of attachment style possible directly. In our empirical analysis we present the integration of the above-mentioned clinical and theoretical experiences. In the present analysis we compare the vector C and S constellations of the two-profile Szondi tests of 80 persons with the dimensions of the ECR-R questionnaire and with Collins and Read's questionnaire classification of attachment style. The statistical results suggest that it is legitimate to compare questionnaire methods, which allow discrete classification of attachment, with the Szondi test's information content regarding attachment. By applying the methods together, we obtain a unique, complementary cross-section of the information relating to attachment. Comparing the two methods (projective and questionnaire) also establishes the need for theoretical integration. We also attempt to explain Fraley's evolutionary non-adaptivity of avoidant attachment, in whose presence the adaptivity of early attachment, counterbalancing exploration and the need for security and providing closeness-farness, loses its balance.
Lindner, Michael; Vicente, Raul; Priesemann, Viola; Wibral, Michael
2011-11-18
Transfer entropy (TE) is a measure for the detection of directed interactions. Transfer entropy is an information theoretic implementation of Wiener's principle of observational causality. It offers an approach to the detection of neuronal interactions that is free of an explicit model of the interactions. Hence, it offers the power to analyze linear and nonlinear interactions alike. This allows for example the comprehensive analysis of directed interactions in neural networks at various levels of description. Here we present the open-source MATLAB toolbox TRENTOOL that allows the user to handle the considerable complexity of this measure and to validate the obtained results using non-parametrical statistical testing. We demonstrate the use of the toolbox and the performance of the algorithm on simulated data with nonlinear (quadratic) coupling and on local field potentials (LFP) recorded from the retina and the optic tectum of the turtle (Pseudemys scripta elegans) where a neuronal one-way connection is likely present. In simulated data TE detected information flow in the simulated direction reliably with false positives not exceeding the rates expected under the null hypothesis. In the LFP data we found directed interactions from the retina to the tectum, despite the complicated signal transformations between these stages. No false positive interactions in the reverse directions were detected. TRENTOOL is an implementation of transfer entropy and mutual information analysis that aims to support the user in the application of this information theoretic measure. TRENTOOL is implemented as a MATLAB toolbox and available under an open source license (GPL v3). For the use with neural data TRENTOOL seamlessly integrates with the popular FieldTrip toolbox.
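TRENTOOL estimates transfer entropy with embedding optimization and non-parametric significance testing; the core quantity itself, for history length 1 and discrete signals, admits a short sketch. This toy plug-in estimator is not TRENTOOL's algorithm, and the coupled binary series below are invented test data.

```python
from collections import Counter
import math
import random

def transfer_entropy(x, y):
    """Plug-in estimate of TE_{X->Y} (bits) with history length 1:
    TE = sum p(y1, y0, x0) * log2[ p(y1 | y0, x0) / p(y1 | y0) ]."""
    triples, pairs_yx, pairs_yy, singles = Counter(), Counter(), Counter(), Counter()
    n = len(y) - 1
    for t in range(n):
        y1, y0, x0 = y[t + 1], y[t], x[t]
        triples[(y1, y0, x0)] += 1
        pairs_yx[(y0, x0)] += 1
        pairs_yy[(y1, y0)] += 1
        singles[y0] += 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]
        p_cond_hist = pairs_yy[(y1, y0)] / singles[y0]
        te += p_joint * math.log2(p_cond_full / p_cond_hist)
    return te

random.seed(0)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]                  # y copies x with a one-step delay
te_xy = transfer_entropy(x, y)    # close to 1 bit: x drives y
te_yx = transfer_entropy(y, x)    # close to 0: no reverse coupling
```

The asymmetry (large TE in the coupled direction, near zero in the reverse) is the directed-interaction signature the toolbox detects in the retina-to-tectum LFP data, there with proper statistical testing against surrogate data.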
Modeling of optical mirror and electromechanical behavior
NASA Astrophysics Data System (ADS)
Wang, Fang; Lu, Chao; Liu, Zishun; Liu, Ai Q.; Zhang, Xu M.
2001-10-01
This paper presents finite element (FE) simulation and theoretical analysis of novel MEMS fiber-optical switches actuated by electrostatic attraction. FE simulation of the switches under static and dynamic loading is first carried out to reveal the mechanical characteristics: the minimum or critical switching voltages, the natural frequencies, mode shapes, and the response under different levels of electrostatic attraction load. To validate the FE simulation results, a theoretical (or analytical) model is then developed for one specific switch, i.e., Plate_40_104. Good agreement is found between the FE simulation and the analytical results. From both FE simulation and theoretical analysis, the critical switching voltage for Plate_40_104 is derived to be 238 V for a switching angle of 12°. The critical switching-on and switching-off times are 431 μs and 67 μs, respectively. The present study not only develops good FE and analytical models, but also demonstrates step by step a method to simplify a real optical switch structure, with reference to the FE simulation results, for analytical purposes. With the FE and analytical models it is easy to obtain any information about the mechanical behavior of the optical switches, which is helpful for design optimization.
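The notion of a critical switching voltage can be sketched with the textbook parallel-plate electrostatic model, where pull-in instability occurs once the plate has travelled a third of the gap. The spring constant and geometry below are hypothetical illustrations, not the Plate_40_104 parameters, and the torsional switch in the paper would require the corresponding torsional model.

```python
from math import sqrt

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def pull_in_voltage(k, gap, area):
    """Pull-in voltage of an ideal parallel-plate electrostatic actuator.

    Instability occurs when the plate has travelled one third of the
    initial gap g0, giving V_PI = sqrt(8*k*g0^3 / (27*eps0*A)).
    """
    return sqrt(8.0 * k * gap**3 / (27.0 * EPS0 * area))

# Illustrative (hypothetical) numbers, not the geometry from the paper:
k = 0.5                 # N/m, effective spring constant
gap = 3e-6              # m, initial electrode gap
area = 100e-6 * 40e-6   # m^2, electrode area
print(f"pull-in voltage: {pull_in_voltage(k, gap, area):.1f} V")
```

The cubic dependence on the gap is why small changes in the fabricated gap shift the critical voltage strongly, which is the kind of sensitivity the FE model is used to capture.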
Siyah Mansoory, Meysam; Oghabian, Mohammad Ali; Jafari, Amir Homayoun; Shahbabaie, Alireza
2017-01-01
Graph theoretical analysis of functional Magnetic Resonance Imaging (fMRI) data has provided new measures for mapping the human brain in vivo. Of all methods to measure the functional connectivity between regions, Linear Correlation (LC) between activity time series of brain regions is the most ubiquitous linear measure. However, LC consistently underestimates the strength of the dependence required for graph construction and analysis, because only the marginals, not the full bivariate distributions, are Gaussian. In a number of studies, Mutual Information (MI), a purely nonlinear measure, has been employed as a similarity measure between each pair of time series of brain regions. Owing to the complex fractal organization of the brain, which indicates self-similarity, more information on the brain can be revealed by fMRI Fractal Dimension (FD) analysis. In the present paper, Box-Counting Fractal Dimension (BCFD) is introduced for graph theoretical analysis of fMRI data in 17 methamphetamine drug users and 18 normal controls. BCFD performance was then evaluated against the LC and MI methods. Moreover, the global topological graph properties of the brain networks, including global efficiency, clustering coefficient and characteristic path length, were also investigated in addict subjects. Statistical tests (P<0.05) indicated that topological graph properties were significantly disrupted in addicts compared to normal subjects during resting-state fMRI. Based on the results, analyzing the graph topological properties (representing the brain networks) based on BCFD is a more reliable method than LC and MI.
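The box-counting dimension itself is straightforward to sketch: cover the set with boxes of shrinking size ε and fit the slope of log N(ε) against log(1/ε). The minimal version below works on 2-D point sets and uses a filled square (dimension 2) as a sanity check; applying it to fMRI signals, as in the paper, would require an additional embedding step that is beyond this sketch.

```python
from math import log

def box_count(points, eps):
    """Number of eps-sized boxes needed to cover a set of 2-D points."""
    return len({(int(x // eps), int(y // eps)) for x, y in points})

def box_counting_dimension(points, size):
    """Least-squares slope of log N(eps) against log(1/eps)."""
    scales = [size / 2**k for k in range(1, 7)]
    xs = [log(1.0 / e) for e in scales]
    ys = [log(box_count(points, e)) for e in scales]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

# Sanity check: a densely sampled filled unit square has dimension 2
grid = [(i / 128, j / 128) for i in range(128) for j in range(128)]
print(round(box_counting_dimension(grid, 1.0), 2))
```

A fractal set such as a Sierpinski carpet sampled the same way would instead give a non-integer slope, which is the quantity BCFD feeds into the graph construction.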
Information Retrieval and Graph Analysis Approaches for Book Recommendation
Benkoussas, Chahinez; Bellot, Patrice
2015-01-01
A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex users' queries. We used different theoretical retrieval models: probabilistic models such as InL2 (a Divergence from Randomness model) and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach to a related-document network composed of social links. We call this network, constructed from documents and the social information provided by each of them, the Directed Graph of Documents (DGD). Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrates that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments. PMID:26504899
2016-04-19
…event is the same as conditioning on the event being certain, which formalizes the standard informal interpretation of conditional probability. The game-theoretic application of our model, discussed within an example, sheds light on a number of issues in the analysis of extensive-form games. Published in Games and Economic Behavior, Vol. 87 (2014).
ERIC Educational Resources Information Center
Deniz, Hasan; Donnelly, Lisa A.; Yilmaz, Irfan
2008-01-01
In this study, using multiple regression analysis, we aimed to explore the factors related to acceptance of evolutionary theory among preservice Turkish biology teachers using conceptual ecology for biological evolution as a theoretical lens. We aimed to determine the extent to which we can account for the variance in acceptance of evolutionary…
Report on the Domestic Natural Disaster Health Workforce
2011-11-30
DEMPS volunteers are required to review and ensure that their information in the DEMPS database is current and complete on a quarterly basis. Retiring… emergency and critical-care nurses; and (3) paramedics. • Uses a pilot case study focusing on a theoretical major earthquake scenario in the … on the three core occupational subgroups. • Incorporates feedback from a multi-stakeholder conference. This landscape analysis supports
ERIC Educational Resources Information Center
Borgos, Jill E.
2013-01-01
This article applies the theoretical framework of principal-agent theory in order to better understand the complex organisational relationships emerging between entities invested in the establishment and monitoring of cross-border international branch campus medical schools. Using the key constructs of principal-agent theory, information asymmetry…
ERIC Educational Resources Information Center
Rasmussen, Ingvill; Ludvigsen, Sten
2009-01-01
This article discusses how to analyze educational reforms in which information and communications technology (ICT) is used as a central catalyst to change practices. We explore the relationship between theoretical conceptualizations and empirical findings, drawing on the work of Larry Cuban and Yrjo Engestrom. We claim that reform research has…
ERIC Educational Resources Information Center
Marcovitz, Alan B., Ed.
Four computer programs written in FORTRAN and BASIC develop theoretical predictions and data reduction for a junior-senior level heat exchanger experiment. Programs may be used at the terminal in the laboratory to check progress of the experiment or may be used in the batch mode for interpretation of final information for a formal report. Several…
Presence and absence of bats across habitat scales in the Upper Coastal Plain of South Carolina
W. Mark Ford; Jennifer M. Menzel; Michael A. Menzel; John W. Edwards; John C. Kilgo
2006-01-01
During 2001, we used active acoustical sampling (Anabat II) to survey foraging habitat relationships of bats on the Savannah River Site (SRS) in the upper Coastal Plain of South Carolina. Using an a priori information-theoretic approach, we conducted logistic regression analysis to examine presence of individual bat species relative to a suite of microhabitat, stand,...
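An "a priori information-theoretic approach" in this literature usually means ranking a small candidate set of models by AIC rather than null-hypothesis testing. The sketch below compares two Bernoulli detection models on invented count data; the habitat classes and numbers are illustrative assumptions, not the SRS data, and a full analysis would use logistic regression with several covariates.

```python
from math import log

def log_lik(successes, trials, p):
    """Bernoulli log-likelihood, guarding the p in {0, 1} edge cases."""
    if p in (0.0, 1.0):
        return 0.0 if successes in (0, trials) else float("-inf")
    return successes * log(p) + (trials - successes) * log(1 - p)

def aic(ll, k):
    """Akaike's information criterion: 2k - 2 ln L."""
    return 2 * k - 2 * ll

# Hypothetical detection counts for one bat species in two habitat classes
data = {"riparian": (34, 60), "upland": (9, 60)}  # (detections, surveys)

# Model 1: a single detection probability for all habitats (k = 1)
s = sum(d for d, n in data.values())
n = sum(n for d, n in data.values())
ll1 = log_lik(s, n, s / n)

# Model 2: one detection probability per habitat class (k = 2)
ll2 = sum(log_lik(d, m, d / m) for d, m in data.values())

for name, a in [("constant", aic(ll1, 1)), ("habitat", aic(ll2, 2))]:
    print(f"{name}: AIC = {a:.1f}")
```

With these made-up counts the habitat model has the lower AIC, i.e. the extra parameter is justified by the gain in likelihood, which is exactly the trade-off the information-theoretic approach formalizes.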
ERIC Educational Resources Information Center
Teplovs, Chris
2015-01-01
This commentary reflects on the contributions to learning analytics and theory by a paper that describes how multiple theoretical frameworks were woven together to inform the creation of a new, automated discourse analysis tool. The commentary highlights the contributions of the original paper, provides some alternative approaches, and touches on…
Information analysis of hyperspectral images from the hyperion satellite
NASA Astrophysics Data System (ADS)
Puzachenko, Yu. G.; Sandlersky, R. B.; Krenke, A. N.; Puzachenko, M. Yu.
2017-07-01
A new method of estimating outgoing-radiation spectra from data obtained by the Hyperion EO-1 satellite is considered. In theoretical terms, this method is based on the nonequilibrium thermodynamics concept, with corresponding estimates of the entropy and the Kullback information. The obtained information estimates make it possible to assess the effective work of the landscape cover, both in general and for its various types, and to identify the spectrum ranges primarily responsible for the information increment and, accordingly, for the effective work. The information is measured in the frequency band intervals corresponding to the peaks of solar radiation absorption by different pigments, mesophyll, and water, to evaluate the system's operation through synthesis and moisture accumulation. This method is expected to be effective in the investigation of ecosystem functioning by hyperspectral remote sensing.
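The two quantities the method rests on, Shannon entropy and the Kullback (Kullback-Leibler) information, reduce to short formulas once spectra are normalized to probability distributions over bands. The band intensities below are invented for illustration; the actual analysis would use Hyperion radiances per channel.

```python
from math import log2

def normalize(spectrum):
    """Turn raw band intensities into a probability distribution."""
    total = sum(spectrum)
    return [v / total for v in spectrum]

def shannon_entropy(p):
    """H(p) = -sum p_i log2 p_i, in bits."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def kullback_leibler(p, q):
    """K(p || q); assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical band intensities: incoming solar vs. reflected radiation
incoming = normalize([120, 180, 240, 200, 150, 90])
reflected = normalize([40, 90, 60, 150, 130, 80])

print(f"H(reflected) = {shannon_entropy(reflected):.3f} bits")
print(f"K(reflected || incoming) = {kullback_leibler(reflected, incoming):.3f} bits")
```

The Kullback information is zero only when the two distributions coincide, so a large value flags bands where the surface reshapes the solar spectrum most, which is how the paper localizes the "effective work" of the landscape cover.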
Using enterprise architecture artefacts in an organisation
NASA Astrophysics Data System (ADS)
Niemi, Eetu; Pekkola, Samuli
2017-03-01
As a tool for management and planning, Enterprise Architecture (EA) can potentially align organisations' business processes, information, information systems and technology towards a common goal, and supply the information required within this journey. However, an explicit view of why, how, when and by whom EA artefacts are used in order to realise this full potential has not been defined. Utilising the features of information systems use studies and data from a case study with 14 EA stakeholder interviews, we identify and describe 15 EA artefact use situations that are then reflected in the related literature. Their analysis enriches understanding of what EA artefacts are, and how, why and when they are used, and results in a theoretical framework for understanding their use in general.
Levin, Yulia; Tzelgov, Joseph
2016-01-01
The present study suggests that the idea that Stroop interference originates from multiple components may gain theoretically from integrating two independent frameworks. The first framework is represented by the well-known notion of “semantic gradient” of interference and the second one is the distinction between two types of conflict – the task and the informational conflict – giving rise to the interference (MacLeod and MacDonald, 2000; Goldfarb and Henik, 2007). The proposed integration led to the conclusion that two (i.e., orthographic and lexical components) of the four theoretically distinct components represent task conflict, and the other two (i.e., indirect and direct informational conflict components) represent informational conflict. The four components were independently estimated in a series of experiments. The results confirmed the contribution of task conflict (estimated by a robust orthographic component) and of informational conflict (estimated by a strong direct informational conflict component) to Stroop interference. However, the performed critical review of the relevant literature (see General Discussion), as well as the results of the experiments reported, showed that the other two components expressing each type of conflict (i.e., the lexical component of task conflict and the indirect informational conflict) were small and unstable. The present analysis refines our knowledge of the origins of Stroop interference by providing evidence that each type of conflict has its major and minor contributions. The implications for cognitive control of an automatic reading process are also discussed. PMID:26955363
Adaptive Spike Threshold Enables Robust and Temporally Precise Neuronal Encoding.
Huang, Chao; Resnik, Andrey; Celikel, Tansu; Englitz, Bernhard
2016-06-01
Neural processing rests on the intracellular transformation of information as synaptic inputs are translated into action potentials. This transformation is governed by the spike threshold, which depends on the history of the membrane potential on many temporal scales. While the adaptation of the threshold after spiking activity has been addressed before, both theoretically and experimentally, it has only recently been demonstrated that the subthreshold membrane state also influences the effective spike threshold. The consequences for neural computation are not yet well understood. We address this question here using neural simulations and whole-cell intracellular recordings in combination with information theoretic analysis. We show that an adaptive spike threshold leads to better stimulus discrimination for tight input correlations than would be achieved otherwise, independent of whether the stimulus is encoded in the rate or pattern of action potentials. The time scales of input selectivity are jointly governed by membrane and threshold dynamics. Encoding information using adaptive thresholds further ensures robust information transmission across cortical states, i.e., decoding from different states is less state-dependent in the adaptive threshold case if the decoding is performed in reference to the timing of the population response. Results from in vitro neural recordings were consistent with simulations from adaptive threshold neurons. In summary, the adaptive spike threshold reduces information loss during intracellular information transfer, improves stimulus discriminability and ensures robust decoding across membrane states in a regime of highly correlated inputs, similar to those seen in sensory nuclei during the encoding of sensory information.
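The core mechanism, a threshold that tracks both recent spiking and the subthreshold membrane state, can be caricatured in a few lines. The neuron model and every parameter below are toy assumptions chosen for illustration, not the fitted models or recordings of the study.

```python
import random

def simulate(inputs, dt=1e-3, tau_m=0.02, tau_th=0.05,
             v_rest=0.0, th_base=1.0, th_jump=0.5, couple=0.3):
    """Leaky integrate-and-fire neuron with an adaptive spike threshold.

    The threshold relaxes toward th_base + couple*v, so it tracks the
    subthreshold membrane state, and it jumps by th_jump after each
    spike (a toy version of the mechanism discussed in the paper).
    """
    v, th = v_rest, th_base
    spikes = []
    for t, i_in in enumerate(inputs):
        v += dt / tau_m * (v_rest - v) + i_in            # leaky integration
        th += dt / tau_th * (th_base + couple * v - th)  # threshold relaxation
        if v >= th:
            spikes.append(t)
            th += th_jump                                # spike-triggered jump
            v = v_rest
    return spikes

random.seed(1)
drive = [max(0.0, random.gauss(0.08, 0.05)) for _ in range(2000)]
print(len(simulate(drive)), "spikes in 2 s of simulated input")
```

Because the threshold rises while the membrane is depolarized, slow drifts of the input are partially cancelled and only fast, tightly correlated fluctuations cross threshold, which is the intuition behind the improved discrimination reported in the abstract.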
Analysis of Implicit Uncertain Systems. Part 1: Theoretical Framework
1994-12-07
Analysis of Implicit Uncertain Systems, Part I: Theoretical Framework. Fernando Paganini, John Doyle. December 7, 1994. Abstract: This paper… model and a number of constraints relevant to the analysis problem under consideration. In Part I of this paper we propose a theoretical framework which
Patterns-Based IS Change Management in SMEs
NASA Astrophysics Data System (ADS)
Makna, Janis; Kirikova, Marite
The majority of information systems change management guidelines and standards are either too abstract or too bureaucratic to be easily applicable in small enterprises. This chapter proposes an approach, a method, and a prototype designed specifically for information systems change management in small and medium enterprises. The approach is based on proven patterns of changes in the set of information systems elements. The set of elements was obtained by theoretical analysis of information systems and business process definitions and enterprise architectures. The patterns were derived from a number of information systems theories and tested in 48 information systems change management projects. The prototype presents and helps to handle three basic change patterns, which help to anticipate the overall scope of changes related to particular elementary changes in an enterprise information system. Using the prototype requires only basic knowledge of organizational business processes and information management.
Medical scientists' information practices in the research work context.
Roos, Annikki
2015-03-01
The aim of the study was to investigate the information practices of medical scientists in the research work context. This is a qualitative study based on semi-structured interviews. The interviews were transcribed and analysed in a web tool for qualitative analysis. Activity theory was used as the theoretical framework. The generating motives for information-related activity come from the core activity, research work. The motives result in actions such as searching for and using information. Usability, accessibility and ease of use are the most important conditions that determine information-related operations. Medical scientists search for and use information most of all at the beginning and at the end of the research work. Information practices appear as an instrument-producing activity for the central activity. Information services should be embedded in this core activity; in practice, libraries should follow researchers' workflow and embed their tools and services in it. © 2015 Health Libraries Journal.
A Study on the Security Levels of Spread-Spectrum Embedding Schemes in the WOA Framework.
Wang, Yuan-Gen; Zhu, Guopu; Kwong, Sam; Shi, Yun-Qing
2017-08-23
Security analysis is a very important issue for digital watermarking. Several years ago, following Kerckhoffs' principle, the famous four security levels, namely insecurity, key security, subspace security, and stego-security, were defined for spread-spectrum (SS) embedding schemes in the framework of the watermarked-only attack. However, up to now there has been little application of these security-level definitions to the theoretical analysis of the security of SS embedding schemes, owing to the difficulty of the analysis. In this paper, based on the security definition, we present a theoretical analysis to evaluate the security levels of five typical SS embedding schemes: classical SS, improved SS (ISS), the circular extension of ISS, and nonrobust and robust natural watermarking. The theoretical analysis of these typical SS schemes is successfully performed by taking advantage of the convolution of probability distributions to derive the probabilistic models of watermarked signals. Moreover, simulations are conducted to illustrate and validate our theoretical analysis. We believe that the theoretical and practical analysis presented in this paper can bridge the gap between the definition of the four security levels and its application to the theoretical analysis of SS embedding schemes.
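For readers unfamiliar with the schemes under analysis, classical additive spread-spectrum embedding and its correlation detector can be sketched in a few lines. The signal model, carrier, and embedding strength below are generic textbook choices, not the exact formulations analyzed in the paper.

```python
import random

def embed_ss(host, w, alpha=1.5):
    """Classical additive spread-spectrum embedding: y = x + alpha*w."""
    return [x + alpha * wi for x, wi in zip(host, w)]

def detect(signal, w):
    """Normalized correlation of the signal with the (secret) carrier w."""
    return sum(s * wi for s, wi in zip(signal, w)) / len(w)

random.seed(7)
n = 4096
host = [random.gauss(0, 1) for _ in range(n)]             # host acts as interference
carrier = [random.choice((-1.0, 1.0)) for _ in range(n)]  # secret PN sequence

marked = embed_ss(host, carrier)
print(f"marked:   {detect(marked, carrier):.2f}")   # near alpha
print(f"unmarked: {detect(host, carrier):.2f}")     # near 0
```

The host term averages out in the correlation while the embedded carrier does not; the security question the paper studies is what an attacker who observes only `marked`, without `carrier`, can infer about that secret.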
Multiple neural states of representation in short-term memory? It's a matter of attention.
Larocque, Joshua J; Lewis-Peacock, Jarrod A; Postle, Bradley R
2014-01-01
Short-term memory (STM) refers to the capacity-limited retention of information over a brief period of time, and working memory (WM) refers to the manipulation and use of that information to guide behavior. In recent years it has become apparent that STM and WM interact and overlap with other cognitive processes, including attention (the selection of a subset of information for further processing) and long-term memory (LTM: the encoding and retention of an effectively unlimited amount of information for a much longer period of time). Broadly speaking, there have been two classes of memory models: systems models, which posit distinct stores for STM and LTM (Atkinson and Shiffrin, 1968; Baddeley and Hitch, 1974); and state-based models, which posit a common store with different activation states corresponding to STM and LTM (Cowan, 1995; McElree, 1996; Oberauer, 2002). In this paper, we will focus on state-based accounts of STM. First, we will consider several theoretical models that postulate, based on considerable behavioral evidence, that information in STM can exist in multiple representational states. We will then consider how neural data from recent studies of STM can inform and constrain these theoretical models. In the process we will highlight the inferential advantage of multivariate, information-based analyses of neuroimaging data (fMRI and electroencephalography (EEG)) over conventional activation-based analysis approaches (Postle, in press). We will conclude by addressing lingering questions regarding the fractionation of STM, highlighting differences between the attention to information vs. the retention of information during brief memory delays.
Rogers, Brian; Gyani, Alex
2010-01-01
Patrick Hughes's 'reverspective' artworks provide a novel way of investigating the effectiveness of different sources of 3-D information for the human visual system. Our empirical findings show that the converging lines of simple linear perspective can be as effective as the rich array of 3-D cues present in natural scenes in determining what we see, even when these cues are in conflict with binocular disparities. Theoretical considerations reveal that, once the information provided by motion parallax transformations is correctly understood, there is no need to invoke higher-level processes or an interpretation based on familiarity or past experience in order to explain either the 'reversed' depth or the apparent, concomitant rotation of a reverspective artwork as the observer moves from side to side. What we see in reverspectives is the most likely real-world scenario (distal stimulus) that could have created the perspective and parallax transformations (proximal stimulus) that stimulate our visual systems.
Practical secure quantum communications
NASA Astrophysics Data System (ADS)
Diamanti, Eleni
2015-05-01
We review recent advances in the field of quantum cryptography, focusing in particular on practical implementations of two central protocols for quantum network applications, namely key distribution and coin flipping. The former allows two parties to share secret messages with information-theoretic security, even in the presence of a malicious eavesdropper in the communication channel, which is impossible with classical resources alone. The latter enables two distrustful parties to agree on a random bit, again with information-theoretic security, and with a cheating probability lower than the one that can be reached in a classical scenario. Our implementations rely on continuous-variable technology for quantum key distribution and on a plug and play discrete-variable system for coin flipping, and necessitate a rigorous security analysis adapted to the experimental schemes and their imperfections. In both cases, we demonstrate the protocols with provable security over record long distances in optical fibers and assess the performance of our systems as well as their limitations. The reported advances offer a powerful toolbox for practical applications of secure communications within future quantum networks.
NASA Astrophysics Data System (ADS)
Shulgina, T. M.; Gordova, Y. E.; Martynova, Y. V.
2014-12-01
Making education relevant to workplace tasks is a key problem of higher education in the environmental sciences. To answer this challenge, several new courses for students of the "Climatology" and "Meteorology" specialties were developed and implemented at Tomsk State University, combining theoretical knowledge from up-to-date environmental sciences with computational tasks. To organize the educational process we use the open-source course management system Moodle (www.moodle.org), which allows us to combine text and multimedia in the theoretical part of the courses. The hands-on approach is realized through innovative trainings performed within the information-computational web-GIS platform "Climate" (http://climate.scert.ru/). The platform has a set of tools and databases allowing a researcher to analyse climate change over a selected territory. The tools are also used for student trainings, which contain practical tasks on climate modeling and on climate change assessment and analysis. Laboratory exercises cover three topics: "Analysis of regional climate changes"; "Analysis of climate extreme indices on the regional scale"; and "Analysis of future climate". They are designed to consolidate students' knowledge of the discipline, to instill the skills to work independently with large amounts of geophysical data using the modern processing and analysis tools of the web-GIS platform "Climate", and to train students to present the results of laboratory work as reports with a statement of the problem, the results of calculations, and a logically justified conclusion. Thus, students are engaged in the use of modern tools of geophysical data analysis, which supports the dynamic of their professional learning.
The approach can help to fill this gap because it is the only approach that offers experience, increases students' involvement, and advances the use of modern information and communication tools. Financial support for this research from the RFBR (13-05-12034, 14-05-00502), SB RAS project VIII.80.2.1, and the grant of the President of RF (№ 181) is acknowledged.
A simplified computational memory model from information processing.
Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang
2016-11-23
This paper proposes a computational model of memory from the viewpoint of information processing. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network that abstracts memory function and simulates memory information processing. First, meta-memory is defined to represent neurons or brain cortices on the basis of biology and graph theory, and an intra-modular network is developed with the modeling algorithm by mapping nodes and edges; the bi-modular network is then delineated from its intra-modular and inter-modular parts. Finally, a polynomial retrieval algorithm is introduced. In this paper we simulate the memory phenomena and the functions of memorization and strengthening with information processing algorithms. The theoretical analysis and the simulation results show that the model is in accordance with memory phenomena from an information-processing point of view.
From information theory to quantitative description of steric effects.
Alipour, Mojtaba; Safari, Zahra
2016-07-21
Immense efforts have been made in the literature to apply information theory descriptors to the electronic structure theory of various systems. In the present study, information theoretic quantities, such as the Fisher information, Shannon entropy, Onicescu information energy, and Ghosh-Berkowitz-Parr entropy, have been used to present a quantitative description of one of the most widely used concepts in chemistry, namely steric effects. Taking the experimental steric scales for different compounds as benchmark sets, there are reasonable linear relationships between the experimental scales of the steric effects and the theoretical values of steric energies calculated from information theory functionals. Examining the results obtained from the information theoretic quantities with the two representations, electron density and shape function, shows that the Shannon entropy performs best for this purpose. The usefulness of considering the contributions of functional groups' steric energies and geometries, on the one hand, and of dissecting the effects of both global and local information measures simultaneously, on the other, has also been explored. Furthermore, the utility of the information functionals for describing steric effects in several chemical transformations, such as electrophilic and nucleophilic reactions and host-guest chemistry, has been analyzed. The functionals of information theory correlate remarkably with the stability of systems and with the experimental scales. Overall, these findings show that information theoretic quantities can be introduced as quantitative measures of steric effects and provide further evidence of the quality of information theory in helping theoreticians and experimentalists to interpret different problems in real systems.
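The Shannon entropy of an electron density, S = -∫ ρ(r) ln ρ(r) d³r, can be checked numerically on the one case with a clean closed form: the hydrogen 1s density in atomic units, for which S = 3 + ln π. The radial trapezoidal integration below is a generic numerical sketch, not the functionals or densities used in the study.

```python
from math import pi, exp, log

def rho_1s(r):
    """Ground-state hydrogen electron density in atomic units: exp(-2r)/pi."""
    return exp(-2.0 * r) / pi

def shannon_entropy_radial(rho, r_max=20.0, n=20000):
    """S = -integral of rho*ln(rho) over 3-D space for a spherically
    symmetric density, via the trapezoidal rule on the radial integral
    with volume element 4*pi*r^2 dr."""
    dr = r_max / n
    total = 0.0
    for i in range(n + 1):
        r = i * dr
        d = rho(r)
        f = -4.0 * pi * r * r * d * log(d)
        total += f * (0.5 if i in (0, n) else 1.0)  # trapezoid end weights
    return total * dr

s = shannon_entropy_radial(rho_1s)
print(f"S = {s:.4f}  (analytic: 3 + ln(pi) = {3 + log(pi):.4f})")
```

Because a more diffuse (sterically "larger") density spreads probability over more space, its Shannon entropy rises, which is the qualitative link to steric scales the paper quantifies.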
Risk/Benefit Communication about Food-A Systematic Review of the Literature.
Frewer, L J; Fischer, A R H; Brennan, M; Bánáti, D; Lion, R; Meertens, R M; Rowe, G; Siegrist, M; Verbeke, W; Vereijken, C M J L
2016-07-26
A systematic review was conducted to address two research questions: (1) the extent to which different theoretical frameworks have been applied to food risk/benefit communication, and (2) the impact such food risk/benefit communication interventions have had on related risk/benefit attitudes and behaviors. Fifty-four papers were identified. The analysis revealed that (primarily European or US) research interest has been relatively recent. Certain food issues were of greater interest to researchers than others, perhaps reflecting the occurrence of a crisis or a policy concern. Three broad themes relevant to the development of best practice in risk (benefit) communication were identified: the characteristics of the target population, the contents of the information, and the characteristics of the information sources. Within these themes, independent and dependent variables differed considerably. Overall, acute risk (benefit) communication will require advances in communication processes, whereas chronic communication needs to identify audience requirements. Both citizens' risk/benefit perceptions and (if relevant) related behaviors need to be taken into account, and recommendations for behavioral change need to be concrete and actionable. The application of theoretical frameworks to the study of risk (benefit) communication was infrequent, and developing predictive models of effective risk (benefit) communication may be contingent on improved theoretical perspectives.
A study on the correlation between the dewetting temperature of Ag film and SERS intensity.
Quan, Jiamin; Zhang, Jie; Qi, Xueqiang; Li, Junying; Wang, Ning; Zhu, Yong
2017-11-07
Thermally dewetted metal nano-islands have been actively investigated as cost-effective, large-area SERS-active substrates with good reproducibility and repeatability via a simple fabrication process. However, the correlation between the dewetting temperature of a metal film and SERS intensity has not been systematically studied. In this work, taking Ag nano-islands (AgNIs) as an example, we report a strategy for investigating this correlation. We describe the morphology evolution of AgNIs on a planar SiO2 substrate at different temperatures and obtain quantitative information on the surface-limited diffusion process (SLDP) as a function of annealing temperature via classical mean-field nucleation theory. These functions were further used in electromagnetic-field simulations to obtain the theoretical correlation between the dewetting temperature of the Ag film and the SERS response. In addition, Raman mapping with R6G as an analyte was performed on samples annealed at different temperatures to analyze the correlation between dewetting temperature and SERS intensity, which is consistent with the theoretical analysis. For the SLDP, morphological characterization of five samples prepared at different annealing temperatures successfully illustrates the change in SERS intensity with temperature, with only a small deviation between the experimental results and the theoretical prediction.
McParlin, Catherine; Bell, Ruth; Robson, Stephen C; Muirhead, Colin R; Araújo-Soares, Vera
2017-06-01
To investigate barriers and facilitators to physical activity (PA) guideline implementation for midwives when advising obese pregnant women. A cross-sectional, self-completion, anonymous questionnaire was designed using the Theoretical Domains Framework; this framework was developed to evaluate the implementation of guidelines by health care professionals. A total of 40 questions were included. These were informed by previous research on pregnant women's and midwives' views, knowledge and attitudes to PA, and supported by national evidence-based guidelines. Demographic information and free-text comments were also collected. Three diverse NHS Trusts in the North East of England. All midwives employed by two hospital Trusts and the community midwives from the third Trust (n=375) were invited to participate. Mean domain scores were calculated. Factor and regression analyses were performed to describe which theoretical domains may be influencing practice. Free-text comments were analysed thematically. 192 (53%) questionnaires were returned. Mean domain scores were highest for social/professional role and knowledge, and lowest for skills, beliefs about capabilities and behaviour regulation. Regression analysis indicated that the skills and memory/attention/decision domains had a statistically significant influence on midwives discussing PA with obese pregnant women and advising them accordingly. Midwives' comments indicated that they felt it was part of their role to discuss PA with all pregnant women but that they lacked the skills and resources to do so effectively. Midwives seem to have the necessary knowledge about the need for and importance of PA advice for obese women and believe it is part of their role, but perceive that they lack the necessary skills and resources, and do not plan or prioritise the discussion of PA with obese pregnant women. Designing interventions that improve skills, promote routine enquiry regarding PA and provide resources (e.g. information, referral pathways) may help improve midwives' PA advice. Copyright © 2016 Elsevier Ltd. All rights reserved.
King, Andy J
2015-01-01
Researchers and practitioners have an increasing interest in visual components of health information and health communication messages. This study contributes to this evolving body of research by providing an account of the visual images and information featured in printed cancer communication materials. Using content analysis, 147 pamphlets and 858 images were examined to determine how frequently images are used in printed materials, what types of images are used, what information is conveyed visually, and whether or not current recommendations for the inclusion of visual content were being followed. Although visual messages were found to be common in printed health materials, existing recommendations about the inclusion of visual content were only partially followed. Results are discussed in terms of how relevant theoretical frameworks in the areas of behavior change and visual persuasion seem to be used in these materials, as well as how more theory-oriented research is necessary in visual messaging efforts.
Daly, Louise; McCarron, Mary; Higgins, Agnes; McCallion, Philip
2013-02-01
This paper presents a theory explaining the processes used by informal carers of people with dementia to manage alterations to their own, and the person with dementia's, relationships with and places within their social worlds. Informal carers provide the majority of care to people with dementia. A great deal of international informal dementia care research is available, much of which elucidates the content, impacts and consequences of the informal caring role and the coping mechanisms that carers use. However, the socially situated experiences and processes integral to informal caring in dementia have not yet been robustly accounted for. A classic grounded theory approach was used, as it is designed for research enquiries that aim to generate theory illustrating the social patterns of action used to address an identified problem. Thirty interviews were conducted with 31 participants between 2006 and 2008. The theory was conceptualised from the data using the concurrent methods of theoretical sampling, constant comparative analysis, memo writing and theoretical sensitivity. Informal carers' main concern was identified as 'Living on the fringes', which was stimulated by dementia-related stigma and living a different life. The theory of 'Sustaining Place' explains the social pattern of actions employed by informal carers to manage this problem on behalf of themselves and the person with dementia. The theory identifies an imperative for nurses, other formal carers and society to engage in actions that support and enable social connectedness, social inclusion and citizenship for informal carers and people with dementia. 'Sustaining Place' facilitates enhanced understanding of the complex and socially situated nature of informal dementia care through its portrayal of informal carers as social agents, and can be used to guide nurses to better support those who live with dementia. © 2012 Blackwell Publishing Ltd.
Unifying cost and information in information-theoretic competitive learning.
Kamimura, Ryotaro
2005-01-01
In this paper, we introduce costs into the framework of information maximization and try to maximize the ratio of information to its associated cost. We have shown that competitive learning is realized by maximizing mutual information between input patterns and competitive units. One shortcoming of the method is that maximizing information does not necessarily produce representations faithful to the input patterns, because information maximization primarily focuses on the parts of input patterns that are used to distinguish between patterns. Therefore, we introduce a cost that represents the average distance between input patterns and connection weights. By minimizing the cost, the final connection weights reflect the input patterns well. We applied the method to a political data analysis, a voting attitude problem and the Wisconsin cancer problem. Experimental results confirmed that, when the cost was introduced, representations faithful to the input patterns were obtained. In addition, improved generalization performance was obtained within a relatively short learning time.
Redundancy and reduction: Speakers manage syntactic information density
Florian Jaeger, T.
2010-01-01
A principle of efficient language production based on information theoretic considerations is proposed: Uniform Information Density predicts that language production is affected by a preference to distribute information uniformly across the linguistic signal. This prediction is tested against data from syntactic reduction. A single multilevel logit model analysis of naturally distributed data from a corpus of spontaneous speech is used to assess the effect of information density on complementizer that-mentioning, while simultaneously evaluating the predictions of several influential alternative accounts: availability, ambiguity avoidance, and dependency processing accounts. Information density emerges as an important predictor of speakers’ preferences during production. As information is defined in terms of probabilities, it follows that production is probability-sensitive, in that speakers’ preferences are affected by the contextual probability of syntactic structures. The merits of a corpus-based approach to the study of language production are discussed as well. PMID:20434141
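Uniform Information Density rests on the standard information-theoretic definition of a word's information content as its surprisal, -log2 P(word | context). The sketch below computes per-word surprisal from a toy conditional-probability table; the verbs and probabilities are invented for illustration and are not drawn from the corpus analyzed in the paper.

```python
import math

def surprisal(p):
    """Information content, in bits, of an event with probability p."""
    return -math.log2(p)

# Hypothetical probabilities of the complementizer "that" after two matrix verbs.
p_that_given_context = {"think": 0.6, "confirm": 0.15}

for verb, p in p_that_given_context.items():
    print(f"that | {verb}: {surprisal(p):.2f} bits")
```

Under UID, the more predictable (lower-surprisal) complementizer is the better candidate for omission, since dropping it removes less information and keeps the information rate of the signal more uniform.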
Support Net for Frontline Providers
2016-03-01
influencing members' continuance intentions in professional virtual communities - a longitudinal study. Journal of Information Science, 33(4), 451-467...from a scientific and theoretically based manner. Results from this project provide critical prevalence information, theoretical development, and
Brain activity and cognition: a connection from thermodynamics and information theory.
Collell, Guillem; Fauquet, Jordi
2015-01-01
The connection between brain and mind is an important scientific and philosophical question that we are still far from completely understanding. A crucial point for our work is that thermodynamics provides a convenient framework for modeling brain activity, whereas cognition can be modeled in information-theoretical terms; several models have been proposed so far from both approaches. A second critical remark is the existence of deep theoretical connections between thermodynamics and information theory. Indeed, some well-known authors claim that the laws of thermodynamics are nothing but principles of information theory. Unlike in physics or chemistry, a formalization of the relationship between information and energy is currently lacking in neuroscience. In this paper we propose a framework for connecting physical brain models and cognitive models by means of the theoretical connections between information theory and thermodynamics. Ultimately, this article aims at providing further insight into the formal relationship between cognition and neural activity.
Information-theoretical noninvasive damage detection in bridge structures
NASA Astrophysics Data System (ADS)
Sudu Ambegedara, Amila; Sun, Jie; Janoyan, Kerop; Bollt, Erik
2016-11-01
Damage detection of mechanical structures such as bridges is an important research problem in civil engineering. Using spatially distributed sensor time series data collected from a recent experiment on a local bridge in upstate New York, we study noninvasive damage detection using information-theoretical methods. Several findings emerged. First, the time series data, which represent accelerations measured at the sensors, more closely follow a Laplace distribution than a normal distribution, allowing us to develop parameter estimators for various information-theoretic measures such as entropy and mutual information. Second, as damage is introduced by the removal of bolts of the first diaphragm connection, the interaction between spatially nearby sensors as measured by mutual information becomes weaker, suggesting that the bridge is "loosened." Finally, using a proposed optimal mutual information interaction procedure to prune away indirect interactions, we find that the primary direction of interaction or influence aligns with the traffic direction on the bridge even after the bridge is damaged.
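The Laplace finding matters because closed-form information measures follow from it: a Laplace distribution with scale b has differential entropy h = 1 + ln(2b), and its maximum-likelihood estimates are the sample median (location) and the mean absolute deviation from that median (scale). The sketch below implements this parametric entropy estimator; it is a generic illustration under the Laplace assumption, not the paper's full mutual-information pipeline, and the sample values are invented.

```python
import math

def laplace_fit(samples):
    """MLE for Laplace(mu, b): mu = sample median, b = mean |x - mu|."""
    xs = sorted(samples)
    n = len(xs)
    mu = xs[n // 2] if n % 2 else 0.5 * (xs[n // 2 - 1] + xs[n // 2])
    b = sum(abs(x - mu) for x in xs) / n
    return mu, b

def laplace_entropy(b):
    """Differential entropy (nats) of a Laplace distribution with scale b."""
    return 1.0 + math.log(2.0 * b)

# Toy acceleration-like sample; real inputs would be a sensor time series.
mu, b = laplace_fit([-1.0, 0.0, 1.0, 2.0, -2.0])
print(mu, b, laplace_entropy(b))
```

Fitting a parametric family and plugging its parameters into a closed-form entropy is typically far more sample-efficient than nonparametric entropy estimation, which is presumably why the distributional finding is highlighted.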
Yardley, Sarah; Brosnan, Caragh; Richardson, Jane
2013-01-01
Theoretical integration is a necessary element of study design if clarification of experiential learning is to be achieved, yet there are few published examples demonstrating how it can be accomplished. This methodological article provides a worked example of research methodology that achieved clarification of authentic early experiences (AEEs) through a bi-directional approach to theory and data. Bi-directional refers to our simultaneous use of theory to guide and interrogate empirical data and of empirical data to refine theory. We explain the five steps of our methodological approach: (1) understanding the context; (2) critique of existing applications of socio-cultural models to inform study design; (3) data generation; (4) analysis and interpretation; and (5) theoretical development through a novel application of Metis. These steps resulted in understanding of how and why different outcomes arose for students participating in AEEs. Our approach offers a mechanism for clarification, without which evidence-based, effective ways to maximise constructive learning cannot be developed. In our example it also contributed to greater theoretical understanding of the influence of social interactions. By sharing this example of research undertaken to develop both theory and educational practice, we hope to assist others seeking to conduct similar research.
Expanded managed care liability: what impact on employer coverage?
Studdert, D M; Sage, W M; Gresenz, C R; Hensler, D R
1999-01-01
Policymakers are considering legislative changes that would increase managed care organizations' exposure to civil liability for withholding coverage or failing to deliver needed care. Using a combination of empirical information and theoretical analysis, we assess the likely responses of health plans and Employee Retirement Income Security Act (ERISA) plan sponsors to an expansion of liability, and we evaluate the policy impact of those moves. We conclude that the direct costs of liability are uncertain but that the prospect of litigation may have other important effects on coverage decision making, information exchange, risk contracting, and the extent of employers' involvement in health coverage.
Epistemic Beliefs and Conceptual Understanding in Biotechnology: A Case Study
NASA Astrophysics Data System (ADS)
Rebello, Carina M.; Siegel, Marcelle A.; Witzig, Stephen B.; Freyermuth, Sharyn K.; McClure, Bruce A.
2012-04-01
The purpose of this investigation was to explore students' epistemic beliefs and conceptual understanding of biotechnology. Epistemic beliefs can influence reasoning, how individuals evaluate information, and informed decision-making abilities. These skills are important for an informed citizenry that will participate in debates regarding areas in science such as biotechnology. We report an in-depth case study analysis of three undergraduate, non-science majors in a biotechnology course designed for non-biochemistry majors. We selected participants who performed above average and below average on the first in-class exam. Data from multiple sources (interviews, exams, and a concept instrument) were used to construct (a) individual profiles and (b) a cross-case analysis of our participants' conceptual development and epistemic beliefs from two different theoretical perspectives: Women's Ways of Knowing and the Reflective Judgment Model. Two independent trained researchers coded all case records independently for both theoretical perspectives, with initial Cohen's kappa values above .715 (substantial agreement), and then reached consensus on the codes. Results indicate that a student with a more sophisticated epistemology demonstrated greater conceptual understanding at the end of the course than a student with a less sophisticated epistemology, even though the latter performed better initially. A student with a less sophisticated epistemology and low initial conceptual performance did not demonstrate gains in overall conceptual understanding. The results suggest the need for instructional interventions fostering the epistemological development of learners in order to facilitate their conceptual growth.
The meaning of numbers in health: exploring health numeracy in a Mexican-American population.
Schapira, Marilyn M; Fletcher, Kathlyn E; Ganschow, Pamela S; Walker, Cindy M; Tyler, Bruce; Del Pozo, Sam; Schauer, Carrie; Jacobs, Elizabeth A
2011-07-01
Health numeracy can be defined as the ability to use numeric information in the context of health. The interpretation and application of numbers in health may vary across cultural groups. To explore the construct of health numeracy among persons who identify as Mexican American. Qualitative focus group study. Groups were stratified by preferred language and level of education. Audio-recordings were transcribed and Spanish groups (n = 3) translated to English. An analysis was conducted using principles of grounded theory. A purposeful sample of participants from clinical and community sites in the Milwaukee and Chicago metropolitan areas. A theoretical framework of health numeracy was developed based upon categories and major themes that emerged from the analysis. Six focus groups were conducted with 50 participants. Initial agreement in coding was 59-67% with 100% reached after reconciliation by the coding team. Three major themes emerged: 1) numeracy skills are applied to a broad range of communication and decision making tasks in health, 2) affective and cognitive responses to numeric information influence use of numbers in the health setting, and 3) there exists a strong desire to understand the meaning behind numbers used in health. The findings informed a theoretical framework of health numeracy. Numbers are important across a range of skills and applications in health in a sample of an urban Mexican-American population. This study expands previous work that strives to understand the application of numeric skills to medical decision making and health behaviors.
Network meta-analysis, electrical networks and graph theory.
Rücker, Gerta
2012-12-01
Network meta-analysis is an active field of research in clinical biostatistics. It aims to combine information from all randomized comparisons among a set of treatments for a given medical condition. We show how graph-theoretical methods can be applied to network meta-analysis. A meta-analytic graph consists of vertices (treatments) and edges (randomized comparisons). We illustrate the correspondence between meta-analytic networks and electrical networks, where variance corresponds to resistance, treatment effects to voltage, and weighted treatment effects to current flows. Based thereon, we then show that graph-theoretical methods routinely applied to electrical networks also work well in network meta-analysis. In more detail, the resulting consistent treatment effects induced in the edges can be estimated via the Moore-Penrose pseudoinverse of the Laplacian matrix, and the variances of the treatment effects are estimated in analogy to electrical effective resistances. It is shown that this method, being computationally simple, leads to the usual fixed-effect model estimate when applied to pairwise meta-analysis and is consistent with published results when applied to network meta-analysis examples from the literature. Moreover, problems of heterogeneity and inconsistency, random-effects modeling and the inclusion of multi-armed trials are addressed. Copyright © 2012 John Wiley & Sons, Ltd.
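The electrical analogy can be made concrete: the effective resistance between two nodes, the analogue of the variance of a comparison, follows from the Moore-Penrose pseudoinverse L+ of the graph Laplacian as r_ij = L+_ii + L+_jj - 2 L+_ij. The sketch below computes this for a toy three-treatment network with unit-weight comparisons; it illustrates the graph-theoretical machinery only, not the full inverse-variance weighting of a real meta-analysis.

```python
import numpy as np

def effective_resistance(adjacency):
    """Matrix of pairwise effective resistances of a weighted graph,
    computed from the Laplacian pseudoinverse."""
    A = np.asarray(adjacency, dtype=float)
    L = np.diag(A.sum(axis=1)) - A      # graph Laplacian
    Lp = np.linalg.pinv(L)              # Moore-Penrose pseudoinverse
    d = np.diag(Lp)
    return d[:, None] + d[None, :] - 2.0 * Lp

# Triangle of three treatments, each pair compared with unit weight.
A = [[0, 1, 1],
     [1, 0, 1],
     [1, 1, 0]]
R = effective_resistance(A)
print(R)  # each off-diagonal entry is 2/3: one edge in parallel with two in series
```

The triangle gives a quick sanity check: a direct unit resistor in parallel with a two-resistor path yields (1 * 2)/(1 + 2) = 2/3, matching the pseudoinverse computation.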
Layover and shadow detection based on distributed spaceborne single-baseline InSAR
NASA Astrophysics Data System (ADS)
Huanxin, Zou; Bin, Cai; Changzhou, Fan; Yun, Ren
2014-03-01
Distributed spaceborne single-baseline InSAR is an effective technique for obtaining high-quality digital elevation models. Layover and shadow are ubiquitous phenomena in SAR images because of the geometric relations of SAR imaging. In single-baseline InSAR signal processing, the phase singularity of layover and shadow areas makes the phase difficult to filter and unwrap. This paper analyzes the geometric and signal models of layover and shadow areas. Based on the interferometric signal autocorrelation matrix, we propose a signal-number estimation method based on information-theoretic criteria to distinguish layover and shadow from normal InSAR areas. The effectiveness and practicability of the proposed method are validated by simulation experiments and theoretical analysis.
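"Information-theoretic criteria" for signal-number estimation usually refers to AIC/MDL-style estimators applied to the eigenvalues of the sample autocorrelation matrix (the Wax-Kailath formulation). The sketch below implements the MDL variant on a hand-picked eigenvalue spectrum with two dominant signals over a flat noise floor; the eigenvalues and snapshot count are invented for illustration, not taken from the paper.

```python
import math

def mdl_source_count(eigvals, n_snapshots):
    """Wax-Kailath MDL estimate of the number of signals from the
    eigenvalues of a p x p sample autocorrelation matrix."""
    lam = sorted(eigvals, reverse=True)
    p, N = len(lam), n_snapshots
    best_k, best_score = 0, float("inf")
    for k in range(p):
        tail = lam[k:]                      # presumed noise eigenvalues
        m = len(tail)
        arith = sum(tail) / m
        geom = math.exp(sum(math.log(x) for x in tail) / m)
        # Log-likelihood term is zero when the tail is flat (geom == arith);
        # the second term penalizes model complexity.
        score = -N * m * math.log(geom / arith) \
                + 0.5 * k * (2 * p - k) * math.log(N)
        if score < best_score:
            best_k, best_score = k, score
    return best_k

# Two strong signal eigenvalues over a unit noise floor.
k_hat = mdl_source_count([10.0, 5.0, 1.0, 1.0, 1.0], n_snapshots=1000)
print(k_hat)  # 2
```

In the InSAR setting, more than one estimated signal in a pixel's autocorrelation matrix flags layover (superposed returns), while an absence of signal energy above the noise floor flags shadow.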
Evol and ProDy for bridging protein sequence evolution and structural dynamics.
Bakan, Ahmet; Dutta, Anindita; Mao, Wenzhi; Liu, Ying; Chennubhotla, Chakra; Lezon, Timothy R; Bahar, Ivet
2014-09-15
Correlations between sequence evolution and structural dynamics are of utmost importance in understanding the molecular mechanisms of function and their evolution. We have integrated Evol, a new package for fast and efficient comparative analysis of evolutionary patterns and conformational dynamics, into ProDy, a computational toolbox designed for inferring protein dynamics from experimental and theoretical data. Using information-theoretic approaches, Evol coanalyzes conservation and coevolution profiles extracted from multiple sequence alignments of protein families with their inferred dynamics. ProDy and Evol are open-source and freely available under MIT License from http://prody.csb.pitt.edu/. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Adopting the sensemaking perspective for chronic disease self-management.
Mamykina, Lena; Smaldone, Arlene M; Bakken, Suzanne R
2015-08-01
Self-monitoring is an integral component of managing many chronic diseases; however, few theoretical frameworks address how individuals understand self-monitoring data and use them to guide self-management. The aim was to articulate a theoretical framework of sensemaking in diabetes self-management that integrates existing scholarship with empirical data. The proposed framework is grounded in theories of sensemaking adopted from organizational behavior, education, and human-computer interaction. To empirically validate the framework, the researchers reviewed and analyzed reports on qualitative studies of diabetes self-management practices published in peer-reviewed journals from 2000 to 2015. The proposed framework distinguishes between sensemaking and habitual modes of self-management and identifies three essential sensemaking activities: perception of new information related to health and wellness, development of inferences that inform the selection of actions, and carrying out daily activities in response to new information. The analysis of qualitative findings from 50 published reports provided ample empirical evidence for the proposed framework; however, it also identified a number of barriers to engaging in sensemaking in diabetes self-management. The proposed framework suggests new directions for research in diabetes self-management and for the design of new informatics interventions for data-driven self-management. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
Reed, Frances M; Fitzgerald, Les; Rae, Melanie
2016-01-01
To highlight philosophical and theoretical considerations for planning a mixed methods research design that can inform a practice model to guide rural district nursing end-of-life care. Conceptual models of nursing in the community are general and lack guidance for rural district nursing care. A combination of pragmatism and nurse agency theory can provide a framework for ethical considerations in mixed methods research in the private world of rural district end-of-life care. Reflection on experience gathered in a two-stage qualitative research phase, involving rural district nurses who use advocacy successfully, can inform a quantitative phase for testing and complementing the data. Ongoing data analysis and integration result in generalisable inferences that achieve the research objective. Mixed methods research that creatively combines philosophical and theoretical elements to guide design in the particular ethical situation of community end-of-life care can be used to explore an emerging field of interest and to test the findings for evidence to guide quality nursing practice. Combining philosophy and nursing theory to guide mixed methods research design increases the opportunity for sound research outcomes that can inform a nursing model of care.
2009-01-01
theoretical framework developed by Edward L. Thorndike and his contemporaries (1935), proposed that (a) learning occurs in both formal and informal settings...implies, and general learning theory (Thorndike, 1935) suggests, that more motivated employees should acquire more knowledge, so there should be a...that is predicted by tacit knowledge and general learning theory (Sternberg & Wagner, 1993; Thorndike, 1935). Table 3 reports modest true-score
ERIC Educational Resources Information Center
Muganyizi, Projestine S.; Nystrom, Lennarth; Axemo, Pia; Emmelin, Maria
2011-01-01
Grounded theory guided the analysis of 30 in-depth interviews with raped women and community members who had supported raped women in their contact with the police and health care services in Tanzania. The aim of this study was to understand and conceptualize the experiences of the informants by creating a theoretical model focusing on barriers,…
A Preliminary Analysis of the Theoretical Parameters of Organizaational Learning.
1995-09-01
PARAMETERS OF ORGANIZATIONAL LEARNING THESIS Presented to the Faculty of the Graduate School of Logistics and Acquisition Management of the Air...Organizational Learning Parameters in the Knowledge Acquisition Category 2-3. Organizational Learning Parameters in the Information Distribution Category...Learning Refined Scale 4-94 4-145. Composition of Refined Scale 4 Knowledge Flow 4-95 4-146. Cronbach's Alpha Statistics for the Complete Knowledge Flow
Amemori, Masamitsu; Michie, Susan; Korhonen, Tellervo; Murtomaa, Heikki; Kinnunen, Taru H
2011-05-26
Tobacco use adversely affects oral health. Clinical guidelines recommend that dental providers promote tobacco abstinence and provide patients who use tobacco with brief tobacco use cessation counselling. Research shows, however, that these guidelines are seldom implemented. To improve guideline adherence and to develop effective interventions, it is essential to understand provider behaviour and the challenges to implementation. This study aimed to develop a theoretically informed measure for assessing, among dental providers, implementation difficulties related to tobacco use prevention and cessation (TUPAC) counselling guidelines, to evaluate those difficulties among a sample of dental providers, and to investigate a possible underlying structure of the applied theoretical domains. A 35-item questionnaire was developed based on key theoretical domains relevant to the implementation behaviours of healthcare providers. Specific items were drawn mostly from the literature on TUPAC counselling studies of healthcare providers. The data were collected from dentists (n = 73) and dental hygienists (n = 22) in 36 dental clinics in Finland using a web-based survey. Of 95 providers, 73 participated (76.8%). We used Cronbach's alpha to ascertain the internal consistency of the questionnaire. Mean domain scores were calculated to assess different aspects of implementation difficulties, and exploratory factor analysis was used to assess the theoretical domain structure. The authors agreed on the labels assigned to the factors on the basis of their component domains and the broader behavioural and theoretical literature. Internal consistency values for the theoretical domains varied from 0.50 ('emotion') to 0.71 ('environmental context and resources'). The domain 'environmental context and resources' had the lowest mean score (21.3%; 95% confidence interval [CI], 17.2 to 25.4) and was identified as a potential implementation difficulty. The domain 'emotion' had the highest mean score (60%; 95% CI, 55.0 to 65.0). Three factors were extracted that together explain 70.8% of the variance: motivation (47.6% of variance, α = 0.86), capability (13.3% of variance, α = 0.83), and opportunity (10.0% of variance, α = 0.71). This study demonstrated a theoretically informed approach to identifying possible implementation difficulties in TUPAC counselling among dental providers. This approach provides a method for moving from diagnosing implementation difficulties to designing and evaluating interventions.
A review and assessment of hydrodynamic cavitation as a technology for the future.
Gogate, Parag R; Pandit, Aniruddha B
2005-01-01
In the present work, the current status of hydrodynamic cavitation reactors is reviewed, covering bubble dynamics analysis, optimum design considerations, design correlations for cavitational intensity (in terms of collapse pressure) and cavitational yield, and several successful chemical synthesis applications that clearly illustrate the utility of these reactors. The theoretical discussion, based on modeling the bubble dynamics equations, aims at understanding how cavitational intensity depends on the operating parameters, and recommendations are made for the choice of optimized operating conditions. The design information based on the theoretical analysis is also supported with experimental illustrations concentrating on chemical synthesis applications. Hydrodynamic cavitation reactors are assessed and compared with sonochemical reactors by citing different industrially important reactions (oxidation of toluene, o-xylene, m-xylene, p-xylene, mesitylene, o-nitrotoluene, p-nitrotoluene, m-nitrotoluene, o-chlorotoluene and p-chlorotoluene, and the trans-esterification reaction, i.e., synthesis of biodiesel). Some recommendations are also made for future work and for the choice of operating conditions needed to realize industrial-scale applications of cavitational reactors.
NASA Technical Reports Server (NTRS)
Goldblum, A.; Rein, R.
1987-01-01
Analysis of C-alpha atom positions from cysteines involved in disulphide bridges in protein crystals shows that their geometric characteristics are unique with respect to other, non-bridging Cys-Cys pairs. They may be used for predicting disulphide connections in incompletely determined protein structures, such as those from low-resolution crystallography or theoretical folding experiments. The basic unit for analysis and prediction is the 3 x 3 distance matrix for the C-alpha positions of residues (i - 1), Cys(i), (i + 1) with (j - 1), Cys(j), (j + 1). In each of its columns, rows and diagonals, the outer distances are larger than the central distance. This analysis is compared with some analytical models.
Effect of current vehicle’s interruption on traffic stability in cooperative car-following theory
NASA Astrophysics Data System (ADS)
Zhang, Geng; Liu, Hui
2017-12-01
To reveal the impact of the current vehicle’s interruption information on traffic flow, a new car-following model with consideration of the current vehicle’s interruption is proposed and the influence of the current vehicle’s interruption on traffic stability is investigated through theoretical analysis and numerical simulation. By linear analysis, the linear stability condition of the new model is obtained and the negative influence of the current vehicle’s interruption on traffic stability is shown in the headway-sensitivity space. Through nonlinear analysis, the modified Korteweg-de Vries (mKdV) equation of the new model near the critical point is derived and it can be used to describe the propagating behavior of the traffic density wave. Finally, numerical simulation confirms the analytical results, which shows that the current vehicle’s interruption information can destabilize traffic flow and should be considered in real traffic.
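The linear stability condition referred to above follows the standard car-following framework. As a minimal, hypothetical sketch, and using the classical Bando optimal-velocity function as a stand-in rather than the paper's interruption-extended model, the textbook criterion that uniform flow is stable when the sensitivity a exceeds 2 V'(h) can be checked numerically:

```python
import math

def ov(h):
    """Bando optimal-velocity function, used here as a standard stand-in;
    the paper's model adds interruption terms that are not reproduced."""
    return math.tanh(h - 2.0) + math.tanh(2.0)

def is_linearly_stable(sensitivity, headway, dh=1e-6):
    """Classical linear stability criterion for uniform flow in an
    optimal-velocity car-following model: stable when a > 2 V'(h).
    V'(h) is approximated by a central finite difference."""
    v_prime = (ov(headway + dh) - ov(headway - dh)) / (2.0 * dh)
    return bool(sensitivity > 2.0 * v_prime)
```

At headway h = 2, V'(h) is at its maximum (about 1), so the stability threshold is a sensitivity of about 2; interruption information, per the abstract, shrinks the stable region further.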
Visibility Graph Based Time Series Analysis.
Stephen, Mutua; Gu, Changgui; Yang, Huijie
2015-01-01
Network based time series analysis has made considerable achievements in recent years. By mapping mono- or multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches construct static networks and consequently provide limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and, at the same time, a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.
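The visibility mapping at the heart of this method can be sketched as follows. This is the standard natural visibility criterion (two samples are linked when every intermediate sample lies below their line of sight), not the paper's full segment-to-temporal-network pipeline; the function name is illustrative:

```python
def visibility_graph(series):
    """Build the natural visibility graph of a scalar time series.

    Nodes are time indices; indices i < j are linked when every
    intermediate sample k satisfies
    y_k < y_j + (y_i - y_j) * (j - k) / (j - i),
    i.e. lies strictly below the line of sight between the two points.
    Returns the edge set as pairs (i, j)."""
    n = len(series)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            visible = all(
                series[k] < series[j]
                + (series[i] - series[j]) * (j - k) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                edges.add((i, j))
    return edges
```

Adjacent samples are always mutually visible, so the resulting graph is connected; in the paper's scheme each series segment would yield one such graph, and successive graphs would then be linked in time.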
Dykes, Patricia C; Hurley, Ann; Cashen, Margaret; Bakken, Suzanne; Duffy, Mary E
2007-01-01
The use of health information technology (HIT) to support communication processes and access to data and information in acute care settings is a relatively new phenomenon, and a means of evaluating the impact of HIT in hospital settings is needed. The purpose of this research was to design and psychometrically evaluate the Impact of Health Information Technology (I-HIT) scale. I-HIT was designed to measure nurses' perceptions of the ways in which HIT influences interdisciplinary communication and workflow patterns, and their satisfaction with HIT applications and tools. Content for a 43-item tool was derived from the literature and supported theoretically by the Coiera model and by nurse informaticists. Internal consistency reliability analysis using Cronbach's alpha was conducted on the 43-item scale to initiate the item-reduction process. Items with an item-total correlation of less than 0.35 were removed, leaving a total of 29 items. Item analysis, exploratory principal component analysis and internal consistency reliability using Cronbach's alpha were used to confirm the 29-item scale. Principal component analysis with varimax rotation produced a four-factor solution that explained 58.5% of the total variance (general advantages, information tools to support information needs, information tools to support communication needs, and workflow implications). Internal consistency was 0.95 for the total scale and ranged from 0.80 to 0.89 for the four subscales. I-HIT demonstrated psychometric adequacy and is recommended for measuring the impact of HIT on nursing practice in acute care settings.
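The item-reduction and reliability steps described here (Cronbach's alpha plus an item-total correlation cutoff of 0.35) are standard psychometric procedures and can be sketched as follows; the function names are illustrative, not from the study:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

def corrected_item_total(items):
    """Corrected item-total correlation: each item against the sum
    of all the other items (the item itself is excluded)."""
    items = np.asarray(items, dtype=float)
    rest = items.sum(axis=1, keepdims=True) - items
    return np.array([np.corrcoef(items[:, j], rest[:, j])[0, 1]
                     for j in range(items.shape[1])])

def reduce_scale(items, cutoff=0.35):
    """Drop items whose corrected item-total correlation is below cutoff,
    mirroring the 0.35 criterion used for I-HIT item reduction."""
    keep = corrected_item_total(items) >= cutoff
    return np.asarray(items, dtype=float)[:, keep], keep
```

In practice the reduction would be run iteratively, recomputing alpha after each removal, until all remaining items clear the cutoff.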
Computational Thermomechanical Modelling of Early-Age Silicate Composites
NASA Astrophysics Data System (ADS)
Vala, J.; Št'astník, S.; Kozák, V.
2009-09-01
In early-age silicate composites widely used in civil engineering, especially fresh concrete mixtures, strains and stresses arise not only from exterior mechanical loads but also from complicated non-deterministic physical and chemical processes. Their numerical prediction at the macro-scale level requires non-trivial physical analysis based on thermodynamic principles, making use of micro-structural information from both theoretical and experimental research. The paper introduces a computational model, based on a nonlinear system of macroscopic evolution equations supplied with effective material characteristics coming from the micro-scale analysis, and sketches the algorithm for its numerical analysis.
Portraits of self-organization in fish schools interacting with robots
NASA Astrophysics Data System (ADS)
Aureli, M.; Fiorilli, F.; Porfiri, M.
2012-05-01
In this paper, we propose an enabling computational and theoretical framework for the analysis of experimental instances of collective behavior in response to external stimuli. In particular, this work addresses the characterization of aggregation and interaction phenomena in robot-animal groups through the exemplary analysis of fish schooling in the vicinity of a biomimetic robot. We adapt global observables from statistical mechanics to capture the main features of the shoal collective motion and its response to the robot from experimental observations. We investigate the shoal behavior by using a diffusion mapping analysis performed on these global observables that also informs the definition of relevant portraits of self-organization.
Tavender, Emma J; Bosch, Marije; Gruen, Russell L; Green, Sally E; Knott, Jonathan; Francis, Jill J; Michie, Susan; O'Connor, Denise A
2014-01-13
Mild traumatic brain injury is a frequent cause of presentation to emergency departments. Despite the availability of clinical practice guidelines in this area, there is variation in practice. One of the aims of the Neurotrauma Evidence Translation program is to develop and evaluate a targeted, theory- and evidence-informed intervention to improve the management of mild traumatic brain injury in Australian emergency departments. This study is the first step in the intervention development process and uses the Theoretical Domains Framework to explore the factors perceived to influence the uptake of four key evidence-based recommended practices for managing mild traumatic brain injury. Semi-structured interviews were conducted with emergency staff in the Australian state of Victoria. The interview guide was developed using the Theoretical Domains Framework to explore current practice and to identify the factors perceived to influence practice. Two researchers coded the interview transcripts using thematic content analysis. A total of 42 participants (9 Directors, 20 doctors and 13 nurses) were interviewed over a seven-month period. 
The results suggested that (i) the prospective assessment of post-traumatic amnesia was influenced by: knowledge; beliefs about consequences; environmental context and resources; skills; social/professional role and identity; and beliefs about capabilities; (ii) the use of guideline-developed criteria or decision rules to inform the appropriate use of a CT scan was influenced by: knowledge; beliefs about consequences; environmental context and resources; memory, attention and decision processes; beliefs about capabilities; social influences; skills and behavioral regulation; (iii) providing verbal and written patient information on discharge was influenced by: beliefs about consequences; environmental context and resources; memory, attention and decision processes; social/professional role and identity; and knowledge; (iv) the practice of providing brief, routine follow-up on discharge was influenced by: environmental context and resources; social/professional role and identity; knowledge; beliefs about consequences; and motivation and goals. Using the Theoretical Domains Framework, factors thought to influence the management of mild traumatic brain injury in the emergency department were identified. These factors present theoretically based targets for a future intervention.
Haldar, Justin P; Leahy, Richard M
2013-05-01
This paper presents a novel family of linear transforms that can be applied to data collected from the surface of a 2-sphere in three-dimensional Fourier space. This family of transforms generalizes the previously-proposed Funk-Radon Transform (FRT), which was originally developed for estimating the orientations of white matter fibers in the central nervous system from diffusion magnetic resonance imaging data. The new family of transforms is characterized theoretically, and efficient numerical implementations of the transforms are presented for the case when the measured data is represented in a basis of spherical harmonics. After these general discussions, attention is focused on a particular new transform from this family that we name the Funk-Radon and Cosine Transform (FRACT). Based on theoretical arguments, it is expected that FRACT-based analysis should yield significantly better orientation information (e.g., improved accuracy and higher angular resolution) than FRT-based analysis, while maintaining the strong characterizability and computational efficiency of the FRT. Simulations are used to confirm these theoretical characteristics, and the practical significance of the proposed approach is illustrated with real diffusion weighted MRI brain data. These experiments demonstrate that, in addition to having strong theoretical characteristics, the proposed approach can outperform existing state-of-the-art orientation estimation methods with respect to measures such as angular resolution and robustness to noise and modeling errors. Copyright © 2013 Elsevier Inc. All rights reserved.
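In a spherical-harmonic basis the classical FRT is diagonal, which is what makes this family of transforms so efficient to implement: each degree-l coefficient is simply scaled by 2*pi*P_l(0), where P_l is the Legendre polynomial, so all odd-degree terms vanish. A minimal sketch of that well-documented step follows (the FRACT's additional cosine-weighted kernel is not reproduced here, and the function names are illustrative):

```python
import numpy as np

def legendre_at_zero(l):
    """P_l(0) in closed form: zero for odd l,
    (-1)**(l//2) * (l-1)!! / l!! for even l."""
    if l % 2 == 1:
        return 0.0
    val = 1.0
    for k in range(2, l + 1, 2):
        val *= (k - 1) / k          # builds (l-1)!!/l!!
    return val if (l // 2) % 2 == 0 else -val

def funk_radon_sh(coeffs, degrees):
    """Apply the Funk-Radon Transform to spherical-harmonic
    coefficients: c_lm -> 2*pi*P_l(0) * c_lm."""
    return np.array([2.0 * np.pi * legendre_at_zero(l) * c
                     for c, l in zip(coeffs, degrees)])
```

Since P_1(0) = 0 and P_2(0) = -1/2, a degree-1 coefficient is annihilated while a degree-2 coefficient is scaled by -pi; the FRACT modifies these per-degree weights to sharpen the recovered orientation profile.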
Physical Violence between Siblings: A Theoretical and Empirical Analysis
ERIC Educational Resources Information Center
Hoffman, Kristi L.; Kiecolt, K. Jill; Edwards, John N.
2005-01-01
This study develops and tests a theoretical model to explain sibling violence based on the feminist, conflict, and social learning theoretical perspectives and research in psychology and sociology. A multivariate analysis of data from 651 young adults generally supports hypotheses from all three theoretical perspectives. Males with brothers have…
Affine Isoperimetry and Information Theoretic Inequalities
ERIC Educational Resources Information Center
Lv, Songjun
2012-01-01
There are essential connections between the isoperimetric theory and information theoretic inequalities. In general, the Brunn-Minkowski inequality and the entropy power inequality, as well as the classical isoperimetric inequality and the classical entropy-moment inequality, turn out to be equivalent in some certain sense, respectively. Based on…
FLUT - A program for aeroelastic stability analysis. [of aircraft structures in subsonic flow
NASA Technical Reports Server (NTRS)
Johnson, E. H.
1977-01-01
A computer program (FLUT) that can be used to evaluate the aeroelastic stability of aircraft structures in subsonic flow is described. The algorithm synthesizes data from a structural vibration analysis with an unsteady aerodynamics analysis and then performs a complex eigenvalue analysis to assess the system stability. The theoretical basis of the program is discussed with special emphasis placed on some innovative techniques which improve the efficiency of the analysis. User information needed to efficiently and successfully utilize the program is provided. In addition to identifying the required input, the flow of the program execution and some possible sources of difficulty are included. The use of the program is demonstrated with a listing of the input and output for a simple example.
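The final step FLUT performs, a complex eigenvalue analysis, admits a generic sketch: a linear system is asymptotically stable when every eigenvalue of its state matrix has a strictly negative real part, and an eigenvalue crossing into the right half-plane signals divergence (flutter onset, in the aeroelastic setting). The matrices below are illustrative damped-oscillator examples, not FLUT's assembled structural-aerodynamic matrices:

```python
import numpy as np

def is_stable(A):
    """Stability test for the linear system x' = A x: stable iff
    all eigenvalues of A have strictly negative real parts."""
    return bool(np.all(np.linalg.eigvals(A).real < 0))
```

A positively damped oscillator ([[0, 1], [-1, -0.1]]) passes the test; flipping the sign of the damping term ([[0, 1], [-1, 0.1]]) pushes the eigenvalue real parts positive and the test fails, which is the kind of transition a flutter sweep over airspeed looks for.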
Dean, Marleah; Scherr, Courtney L; Clements, Meredith; Koruo, Rachel; Martinez, Jennifer; Ross, Amy
2017-09-01
To investigate BRCA-positive, unaffected patients' - referred to as previvors - information needs after testing positive for a deleterious BRCA genetic mutation. 25 qualitative interviews were conducted with previvors. Data were analyzed using the constant comparison method of grounded theory. Analysis revealed a theoretical model of previvors' information needs related to the stage of their health journey. Specifically, a four-stage model was developed based on the data: (1) pre-testing information needs, (2) post-testing information needs, (3) pre-management information needs, and (4) post-management information needs. Two recurring dimensions of desired knowledge also emerged within the stages-personal/social knowledge and medical knowledge. While previvors may be genetically predisposed to develop cancer, they have not been diagnosed with cancer, and therefore have different information needs than cancer patients and cancer survivors. This model can serve as a framework for assisting healthcare providers in meeting the specific information needs of cancer previvors. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Shipman, D. L.
1972-01-01
The development of a model to simulate the information system of a program management type of organization is reported. The model statistically determines the following parameters: type of messages, destinations, delivery durations, type of processing, processing durations, communication channels, outgoing messages, and priorities. The total management information system of the program management organization is considered, including formal and informal information flows and both facilities and equipment. The model is written in the General Purpose System Simulation 2 computer programming language for use on the Univac 1108, Executive 8 computer. The model is simulated on a daily basis and collects queue and resource utilization statistics for each decision point. The statistics are then used by management to evaluate proposed resource allocations, to evaluate proposed changes to the system, and to identify potential problem areas. The model employs both empirical and theoretical distributions, which are adjusted to simulate the information flow being studied.
A simplified computational memory model from information processing
Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang
2016-01-01
This paper proposes a computational model of memory from the viewpoint of information processing. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network built by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent the neuron or brain cortices on the basis of biology and graph theory, and an intra-modular network is developed with the modeling algorithm by mapping nodes and edges; the bi-modular network is then delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. We simulate the memory phenomena and the functions of memorization and strengthening with information processing algorithms. The theoretical analysis and the simulation results show that the model is in accordance with memory phenomena from an information processing view. PMID:27876847
2014-01-01
Background More than a fifth of Australian children arrive at school developmentally vulnerable. To counteract this, the Healthy Kids Check (HKC), a one-off health assessment aimed at preschool children, was introduced in 2008 into Australian general practice. Delivery of services has, however, remained low. The Theoretical Domains Framework, which provides a method to understand behaviours theoretically, can be condensed into three core components: capability, opportunity and motivation, and the COM-B model. Utilising this system, this study aimed to determine the barriers and enablers to delivery of the HKC, to inform the design of an intervention to promote provision of HKC services in Australian general practice. Methods Data from 6 focus group discussions with 40 practitioners from general practices in socio-culturally diverse areas of Melbourne, Victoria, were analysed using thematic analysis. Results Many practitioners expressed uncertainty regarding their capabilities and the practicalities of delivering HKCs, but in some cases HKCs had acted as a catalyst for professional development. Key connections between immunisation services and delivery of HKCs prompted practices to have systems of recall and reminder in place. Standardisation of methods for developmental assessment and streamlined referral pathways affected practitioners’ confidence and motivation to perform HKCs. Conclusion Application of a systematic framework effectively demonstrated how a number of behaviours could be targeted to increase delivery of HKCs. Interventions need to target practice systems, the support of office staff and referral options, as well as practitioners’ training. Many behavioural changes could be applied through a single intervention programme delivered by the primary healthcare organisations charged with local healthcare needs (Medicare Locals) providing vital links between general practice, community and the health of young children. PMID:24886520
New directions in evidence-based policy research: a critical analysis of the literature
2014-01-01
Despite 40 years of research into evidence-based policy (EBP) and a continued drive from both policymakers and researchers to increase research uptake in policy, barriers to the use of evidence are persistently identified in the literature. However, it is not clear what explains this persistence – whether they represent real factors, or if they are artefacts of approaches used to study EBP. Based on an updated review, this paper analyses this literature to explain persistent barriers and facilitators. We critically describe the literature in terms of its theoretical underpinnings, definitions of ‘evidence’, methods, and underlying assumptions of research in the field, and aim to illuminate the EBP discourse by comparison with approaches from other fields. Much of the research in this area is theoretically naive, focusing primarily on the uptake of research evidence as opposed to evidence defined more broadly, and privileging academics’ research priorities over those of policymakers. Little empirical data analysing the processes or impact of evidence use in policy is available to inform researchers or decision-makers. EBP research often assumes that policymakers do not use evidence and that more evidence – meaning research evidence – use would benefit policymakers and populations. We argue that these assumptions are unsupported, biasing much of EBP research. The agenda of ‘getting evidence into policy’ has side-lined the empirical description and analysis of how research and policy actually interact in vivo. Rather than asking how research evidence can be made more influential, academics should aim to understand what influences and constitutes policy, and produce more critically and theoretically informed studies of decision-making. We question the main assumptions made by EBP researchers, explore the implications of doing so, and propose new directions for EBP research, and health policy. PMID:25023520
Monaghan, Thomas
2015-08-01
This critical analysis of the literature examines the factors and theoretical perspectives contributing to the theory-practice gap for newly qualified nurses within the United Kingdom. It aims to inform, guide and promote effective nursing education both academically and practically. A systematic search strategy was conducted to identify relevant literature covering the period 2000-2014, so as to include only contemporary theoretical perspectives from the period after Project 2000. The literature was systematically investigated using nursing research databases: the Cumulative Index of Nursing and Allied Health Literature, Allied and Complementary Medicine, the U.S. National Library of Medicine and Internurse. Only articles conducted within the United Kingdom and written in English were included, and only literature concerning nurses and newly qualified nurses was considered. A series of key words was used to identify relevant literature. The systematic review revealed that newly qualified nurses feel unprepared for practice and lack confidence in their own abilities. Newly qualified nurses also felt that not enough time was dedicated to the development of clinical skills during their training. The use of preceptorship programmes was found to reduce the transitional stress associated with becoming a qualified nursing practitioner. Despite the increasing research being undertaken on the theory-practice gap, there is still a need for nursing educators, practice areas and regulatory bodies to invest further in research. The effects of preceptorship and simulation exercises in particular require more research to provide regulatory bodies with enough evidence to make an informed decision as to whether their use should be mandatory. Copyright © 2015 Elsevier Ltd. All rights reserved.
Hernández-Fernández, Dra Asunción; Mora, Elísabet; Vizcaíno Hernández, María Isabel
2018-04-17
The dual aim of this research is, firstly, to analyze the physiological and unconscious emotional response of consumers to a new technological product and, secondly, to link this emotional response to consumers' conscious verbal reports of positive and negative product perceptions. To do this, biometric and self-reported measures of emotional response are combined. On the one hand, a neuromarketing experiment based on facial recognition of the emotions of 10 subjects exposed to the physical attributes and economic information of a technological product shows the prevalence of the ambivalent emotion of surprise. On the other hand, a netnographic qualitative approach using sentiment analysis of online comments from 67 users characterises the valence of this emotion as mainly negative in the case and context studied. Theoretical, practical and methodological contributions are anticipated from this paper. From a theoretical point of view, this proposal contributes valuable information to the product design process, to the effective development of the marketing mix variables of price and promotion, and to the successful selection of the target market. From a practical point of view, the approach employed in the case study on the product Google Glass provides empirical evidence useful in the decision-making process for this and other technological enterprises launching a new product. And from a methodological point of view, the usefulness of integrated neuromarketing-eWOM analysis could contribute to the proliferation of this tandem in marketing research. Copyright © 2018 Elsevier Inc. All rights reserved.
Analytical Implications of Using Practice Theory in Workplace Information Literacy Research
ERIC Educational Resources Information Center
Moring, Camilla; Lloyd, Annemaree
2013-01-01
Introduction: This paper considers practice theory and the analytical implications of using this theoretical approach in information literacy research. More precisely the aim of the paper is to discuss the translation of practice theoretical assumptions into strategies that frame the analytical focus and interest when researching workplace…
2008-02-01
Information Theoretic Procedures
Mufalli, Frank; Nagi, Rakesh; Llinas, Jim; Mishra, Sumita
SUNY at Buffalo (CUBRC); Paine College
NLPIR: A Theoretical Framework for Applying Natural Language Processing to Information Retrieval.
ERIC Educational Resources Information Center
Zhou, Lina; Zhang, Dongsong
2003-01-01
Proposes a theoretical framework called NLPIR that integrates natural language processing (NLP) into information retrieval (IR) based on the assumption that there exists representation distance between queries and documents. Discusses problems in traditional keyword-based IR, including relevance, and describes some existing NLP techniques.…
Sun, Wenqi; Yuan, Guozan; Liu, Jingxin; Ma, Li; Liu, Chengbu
2013-04-01
The title molecule (E)-2-[2-(2,6-dichlorophenyl)ethenyl]-8-hydroxyquinoline (DPEQ) was synthesized and characterized by FT-IR, UV-vis, NMR spectroscopy. The molecular geometry, vibrational frequencies and gauge independent atomic orbital (GIAO) 1H and 13C NMR chemical shift values of the compound in the ground state have been calculated by using the density functional theory (DFT) method. All the assignments of the theoretical frequencies were performed by potential energy distributions using VEDA 4 program. The calculated results indicate that the theoretical vibrational frequencies, 1H and 13C NMR chemical shift values show good agreement with experimental data. The electronic properties like UV-vis spectral analysis and HOMO-LUMO analysis of DPEQ have been reported and compared with experimental data. Information about the size, shape, charge density distribution and site of chemical reactivity of the molecule has been obtained by mapping electron density isosurface with molecular electrostatic potential (MEP). Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
François, Stéphanie; Perraud, Véronique; Pflieger, Maryline; Monod, Anne; Wortham, Henri
In this work, glass tube and mist chamber sampling techniques using 2,4-dinitrophenylhydrazine as the derivative agent for the analysis of gaseous carbonyl compounds are compared. Trapping efficiencies of formaldehyde, acetaldehyde, propionaldehyde, acetone, acrolein, glyoxal, crotonaldehyde, benzaldehyde, butyraldehyde and valeraldehyde are experimentally determined using a gas-phase generator. In addition, to generalise our results to all atmospheric gaseous compounds and derivative agents, theoretical trapping efficiencies and enrichment factors are expressed taking into account the mechanisms involved in the two kinds of traps. Theoretical and experimental results show that, as expected, the trapping efficiencies of the glass tube depend mainly on the solubility of the compounds. The results provide new information and a better understanding of the phenomena occurring in the mist chamber and of the ability of this sampler to concentrate samples. Hence, the mist chamber is the more convenient sampling method when trapping is associated with fast derivatisation of the compounds, whereas the glass tube technique must be used to trap atmospheric compounds without simultaneous derivatisation.
Theoretical analysis of tsunami generation by pyroclastic flows
Watts, P.; Waythomas, C.F.
2003-01-01
Pyroclastic flows are a common product of explosive volcanism and have the potential to initiate tsunamis whenever thick, dense flows encounter bodies of water. We evaluate the process of tsunami generation by pyroclastic flow by decomposing the pyroclastic flow into two components, the dense underflow portion, which we term the pyroclastic debris flow, and the plume, which includes the surge and coignimbrite ash cloud parts of the flow. We consider five possible wave generation mechanisms. These mechanisms consist of steam explosion, pyroclastic debris flow, plume pressure, plume shear, and pressure impulse wave generation. Our theoretical analysis of tsunami generation by these mechanisms provides an estimate of tsunami features such as a characteristic wave amplitude and wavelength. We find that in most situations, tsunami generation is dominated by the pyroclastic debris flow component of a pyroclastic flow. This work presents information sufficient to construct tsunami sources for an arbitrary pyroclastic flow interacting with most bodies of water. Copyright 2003 by the American Geophysical Union.
Thermoelectric Generation Of Current - Theoretical And Experimental Analysis
NASA Astrophysics Data System (ADS)
Ruciński, Adam; Rusowicz, Artur
2017-12-01
This paper provides some information about thermoelectric technology. Some new materials with improved figures of merit are presented. These materials in Peltier modules make it possible to generate electric current thanks to a temperature difference. The paper indicates possible applications of thermoelectric modules as interesting tools for using various waste heat sources. Some zero-dimensional equations describing the conditions of electric power generation are given. Also, operating parameters of Peltier modules, such as voltage and electric current, are analyzed. The paper shows chosen characteristics of power generation parameters. Then, an experimental stand for ongoing research and experimental measurements are described. The authors consider the resistance of a receiver placed in the electric circuit with thermoelectric elements. Finally, both the analysis of experimental results and conclusions drawn from theoretical findings are presented. Voltage generation of about 1.5 to 2.5 V for the temperature difference from 65 to 85 K was observed when a bismuth telluride thermoelectric couple (traditionally used in cooling technology) was used.
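The zero-dimensional equations mentioned in the abstract are, in textbook form, the Seebeck relation and a simple series circuit. A hedged sketch follows; the module-level Seebeck coefficient of 0.02 V/K used in the test is hypothetical, chosen only so that the open-circuit voltage lands in the reported 1.5 to 2.5 V range for the reported 65 to 85 K temperature differences:

```python
def teg_output(alpha, delta_t, r_internal, r_load):
    """Zero-dimensional thermoelectric generator model (a textbook
    sketch, not the paper's exact equations).

    alpha      : effective Seebeck coefficient of the module, V/K
    delta_t    : hot-cold junction temperature difference, K
    r_internal : internal electrical resistance of the module, ohm
    r_load     : resistance of the receiver in the circuit, ohm
    Returns (open-circuit voltage V, load current A, load power W)."""
    v_oc = alpha * delta_t                     # Seebeck voltage
    current = v_oc / (r_internal + r_load)     # series circuit
    power = current ** 2 * r_load              # delivered to the load
    return v_oc, current, power
```

The sketch also reproduces the matched-load rule the experimental stand probes by varying the receiver resistance: delivered power peaks when r_load equals r_internal.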
pyAudioAnalysis: An Open-Source Python Library for Audio Signal Analysis.
Giannakopoulos, Theodoros
2015-01-01
Audio information plays a rather important role in the increasing digital content that is available today, resulting in a need for methodologies that automatically analyze such content: audio event recognition for home automations and surveillance systems, speech recognition, music information retrieval, multimodal analysis (e.g. audio-visual analysis of online videos for content-based recommendation), etc. This paper presents pyAudioAnalysis, an open-source Python library that provides a wide range of audio analysis procedures including: feature extraction, classification of audio signals, supervised and unsupervised segmentation and content visualization. pyAudioAnalysis is licensed under the Apache License and is available at GitHub (https://github.com/tyiannak/pyAudioAnalysis/). Here we present the theoretical background behind the wide range of the implemented methodologies, along with evaluation metrics for some of the methods. pyAudioAnalysis has been already used in several audio analysis research applications: smart-home functionalities through audio event detection, speech emotion recognition, depression classification based on audio-visual features, music segmentation, multimodal content-based movie recommendation and health applications (e.g. monitoring eating habits). The feedback provided from all these particular audio applications has led to practical enhancement of the library.
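The short-term feature extraction front end that pyAudioAnalysis provides can be illustrated with a generic sketch. The function below is not the library's API (which has changed across versions) but shows the framing-plus-feature pattern with two classic short-term features, energy and zero-crossing rate:

```python
import numpy as np

def short_term_features(signal, fs, win=0.050, step=0.025):
    """Slide a 50 ms window with a 25 ms hop over a signal and
    compute per-frame energy and zero-crossing rate; an illustrative
    stand-in for a library-style short-term feature extractor."""
    w, s = int(win * fs), int(step * fs)
    feats = []
    for start in range(0, len(signal) - w + 1, s):
        frame = np.asarray(signal[start:start + w], dtype=float)
        energy = float(np.mean(frame ** 2))
        # fraction of adjacent-sample pairs whose sign changes
        zcr = float(np.mean(np.abs(np.diff(np.sign(frame))) > 0))
        feats.append((energy, zcr))
    return feats
```

Classification, segmentation and visualization in the library are then built on top of matrices of exactly this kind of per-frame feature vector.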
pyAudioAnalysis: An Open-Source Python Library for Audio Signal Analysis
Giannakopoulos, Theodoros
2015-01-01
Audio information plays a rather important role in the increasing digital content that is available today, resulting in a need for methodologies that automatically analyze such content: audio event recognition for home automations and surveillance systems, speech recognition, music information retrieval, multimodal analysis (e.g. audio-visual analysis of online videos for content-based recommendation), etc. This paper presents pyAudioAnalysis, an open-source Python library that provides a wide range of audio analysis procedures including: feature extraction, classification of audio signals, supervised and unsupervised segmentation and content visualization. pyAudioAnalysis is licensed under the Apache License and is available at GitHub (https://github.com/tyiannak/pyAudioAnalysis/). Here we present the theoretical background behind the wide range of the implemented methodologies, along with evaluation metrics for some of the methods. pyAudioAnalysis has been already used in several audio analysis research applications: smart-home functionalities through audio event detection, speech emotion recognition, depression classification based on audio-visual features, music segmentation, multimodal content-based movie recommendation and health applications (e.g. monitoring eating habits). The feedback provided from all these particular audio applications has led to practical enhancement of the library. PMID:26656189
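As an illustration of the short-term analysis described above, here is a plain-NumPy sketch of windowed feature extraction (per-frame energy and zero-crossing rate). This is not pyAudioAnalysis's own API; the 50 ms window and 25 ms step are typical defaults assumed for the example.

```python
# Minimal sketch of windowed short-term feature extraction, the kind of
# processing an audio-analysis library performs (plain NumPy, not the
# pyAudioAnalysis API).
import numpy as np

def short_term_features(signal, fs, win=0.050, step=0.025):
    """Per-frame energy and zero-crossing rate."""
    w, s = int(win * fs), int(step * fs)
    feats = []
    for start in range(0, len(signal) - w + 1, s):
        frame = signal[start:start + w]
        energy = float(np.mean(frame ** 2))               # mean squared amplitude
        zcr = float(np.mean(np.abs(np.diff(np.sign(frame)))) / 2)  # crossings/sample
        feats.append((energy, zcr))
    return np.array(feats)

fs = 16000
t = np.arange(fs) / fs
feats = short_term_features(np.sin(2 * np.pi * 440 * t), fs)
print(feats.shape)  # one (energy, zcr) pair per 25 ms step
```

Classification and segmentation stages then operate on sequences of such feature vectors rather than on raw samples.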
Workshop on High-Field NMR and Biological Applications
NASA Astrophysics Data System (ADS)
Scientists at the Pacific Northwest Laboratory have been working toward the establishment of a new Molecular Science Research Center (MSRC). The primary scientific thrust of this new research center is in the areas of theoretical chemistry, chemical dynamics, surface and interfacial science, and studies on the structure and interactions of biological macromolecules. The MSRC will provide important new capabilities for studies on the structure of biological macromolecules. The MSRC program includes several types of advanced spectroscopic techniques for molecular structure analysis, and a theory and modeling laboratory for molecular mechanics/dynamics calculations and graphics. The goal is to closely integrate experimental and theoretical studies on macromolecular structure, and to join these research efforts with those of the molecular biology programs to provide new insights into the structure/function relationships of biological macromolecules. One of the areas of structural biology on which initial efforts in the MSRC will be focused is the application of high-field 2-D NMR to the study of biological macromolecules. First, there is interest in obtaining 3-D structural information on large proteins and oligonucleotides. Second, one of the primary objectives is to closely link theoretical approaches to molecular structure analysis with the results obtained in experimental research using NMR and other spectroscopies.
Theoretical foundations for information representation and constraint specification
NASA Technical Reports Server (NTRS)
Menzel, Christopher P.; Mayer, Richard J.
1991-01-01
Research accomplished at the Knowledge Based Systems Laboratory of the Department of Industrial Engineering at Texas A&M University is described. Outlined here are the theoretical foundations necessary to construct a Neutral Information Representation Scheme (NIRS), which will allow for automated data transfer and translation between model languages, procedural programming languages, database languages, transaction and process languages, and knowledge representation and reasoning control languages for information system specification.
Desveaux, Laura; Gagliardi, Anna R
2018-06-04
Post-market surveillance of medical devices is reliant on physician reporting of adverse medical device events (AMDEs). Few studies have examined factors that influence whether and how physicians report AMDEs, an essential step in the development of behaviour change interventions. This study was a secondary analysis comparing application of the Theoretical Domains Framework (TDF) and the Tailored Implementation for Chronic Diseases (TICD) framework to identify potential behaviour change interventions that correspond to determinants of AMDE reporting. A previous study involving qualitative interviews with Canadian physicians who implant medical devices identified themes reflecting AMDE reporting determinants. In this secondary analysis, themes that emerged from the primary analysis were independently mapped to the TDF and TICD. Determinants and corresponding intervention options arising from both frameworks (and both mappers) were compared. Both theoretical frameworks were useful for identifying interventions corresponding to behavioural determinants of AMDE reporting. Information or education strategies that provide evidence about AMDEs, and audit and feedback of AMDE data, were identified as interventions to target the theme of physician beliefs; improving information systems, and reminder cues, prompts and awards were identified as interventions to address determinants arising from the organization or systems themes; and modifying financial/non-financial incentives and sharing data on outcomes associated with AMDEs were identified as interventions to target device market themes.
Numerous operational challenges were encountered in the application of both frameworks including a lack of clarity about how directly relevant to themes the domains/determinants should be, how many domains/determinants to select, if and how to resolve discrepancies across multiple mappers, and how to choose interventions from among the large number associated with selected domains/determinants. Given discrepancies in mapping themes to determinants/domains and the resulting interventions offered by the two frameworks, uncertainty remains about how to choose interventions that best match behavioural determinants in a given context. Further research is needed to provide more nuanced guidance on the application of TDF and TICD for a broader audience, which is likely to increase the utility and uptake of these frameworks in practice.
Indications of the Mineralogy of Callisto and Mars from Reflectance Spectroscopy.
NASA Astrophysics Data System (ADS)
Calvin, Wendy Marie
1991-02-01
Remotely sensed reflectance spectra contain information on mineral identities, grain sizes, and abundances. This thesis consists of analysis of such spectra for two planetary objects, Callisto and Mars. Theoretical modeling of telescopic spectra of Callisto indicates that the surface consists of 20 to 45 wt% water ice at large grain sizes. In the spectral region beyond 3 μm, absorption by hydrated mineral phases is dominant. The non-ice material is spectrally similar to hydrous alteration minerals that are commonly found in certain petrologic types of meteorites. New high-resolution data of Callisto are consistent with the findings of the modeling study. In addition, these new data have identified the presence of a small amount of fine-grained water ice on the leading hemisphere, through a characteristic absorption near 3.4 μm. Variations in the depth of this absorption feature indicate dynamic competition between processes which create and erode fine-grained water ice. Calibration and analysis of spectrometer data from the Mariner 6 and 7 spacecraft has provided new information regarding the mineralogy of Mars. Laboratory measurements and theoretical calculations of CO_2 frosts have allowed an analysis of spectra taken over the martian south polar cap. The grain sizes in the seasonal cap are quite large and there may be evidence of contamination by water frost or dust. Analysis of Mariner spectra in non-polar regions has tentatively identified absorption features near 2.76 μm and 5.4 μm. The locations of these features, and other absorptions identified from terrestrial observations, are consistent with the spectra of hydrous magnesium carbonates. The hydrous carbonates do not have the strong spectral features typically associated with carbonates. Theoretical calculations of mixtures indicate that 10-30 wt% of these minerals can be included and still be consistent with spectral observations.
These minerals form on Earth through weathering of mafic minerals, with the production of amorphous iron oxides as byproducts, consistent with both present and inferred past martian environments. The presence of hydrous carbonates can provide a mechanism for having abundant carbonates on Mars even though spectral searches for (anhydrous) carbonates will not find any evidence for them.
Tsotsos, John K.
2017-01-01
Much has been written about how the biological brain might represent and process visual information, and how this might inspire and inform machine vision systems. Indeed, tremendous progress has been made, especially during the last decade in the latter area. However, a key question seems too often, if not mostly, to be ignored. This question is simply: do proposed solutions scale with the reality of the brain's resources? This scaling question applies equally to brain and to machine solutions. A number of papers have examined the inherent computational difficulty of visual information processing using theoretical and empirical methods. The main goal of this activity had three components: to understand the deep nature of the computational problem of visual information processing; to discover how well the computational difficulty of vision matches the fixed resources of biological seeing systems; and to abstract from the matching exercise the key principles that lead to the observed characteristics of biological visual performance. This set of components was termed complexity level analysis in Tsotsos (1987) and was proposed as an important complement to Marr's three levels of analysis. This paper revisits that work with the advantage that decades of hindsight can provide. PMID:28848458
Determination of principal stress in birefringent composites by hole-drilling method
NASA Technical Reports Server (NTRS)
Prabhakaran, R.
1981-01-01
The application of transmission photoelasticity to stress analysis of composite materials is discussed. The method consists of drilling very small holes at points where the state of stress is to be determined. Experiments are described which verify the theoretical predictions. The limitations of the method are discussed, and it is concluded that valuable information concerning the state of stress in a composite model can be obtained through the suggested method.
Fractal Point Process and Queueing Theory and Application to Communication Networks
1999-12-31
use of nonlinear dynamics and chaos in the design of innovative analog error-protection codes for communications applications. In the chaos...the following theses, patent, and papers. 1. A. Narula, M. D. Trott, and G. W. Wornell, "Information-Theoretic Analysis of Multiple-Antenna...Bounds," in Proc. Int. Conf. Dec. Control, (Japan), Dec. 1996. 5. G. W. Wornell and M. D. Trott, "Efficient Signal Processing Techniques for
Discovering Knowledge from AIS Database for Application in VTS
NASA Astrophysics Data System (ADS)
Tsou, Ming-Cheng
The widespread use of the Automatic Identification System (AIS) has had a significant impact on maritime technology. AIS enables the Vessel Traffic Service (VTS) not only to offer commonly known functions such as identification, tracking and monitoring of vessels, but also to provide rich real-time information that is useful for marine traffic investigation, statistical analysis and theoretical research. However, due to the rapid accumulation of AIS observation data, the VTS platform is often unable to absorb and analyze it quickly and effectively. Traditional observation and analysis methods are becoming less suitable for the modern AIS generation of VTS. In view of this, we applied the same data mining technique used for business intelligence discovery (in Customer Relation Management (CRM) business marketing) to the analysis of AIS observation data. This recasts the marine traffic problem as a business-marketing problem and integrates technologies such as Geographic Information Systems (GIS), database management systems, data warehousing and data mining to facilitate the discovery of hidden and valuable information in a huge amount of observation data. Consequently, this provides marine traffic managers with a useful strategic planning resource.
Comparison of information theoretic divergences for sensor management
NASA Astrophysics Data System (ADS)
Yang, Chun; Kadar, Ivan; Blasch, Erik; Bakich, Michael
2011-06-01
In this paper, we compare the information-theoretic metrics of the Kullback-Leibler (K-L) and Rényi (α) divergence formulations for sensor management. Information-theoretic metrics are well suited for sensor management as they afford comparisons between distributions resulting from different types of sensors under different actions. The difference in distributions can also be measured as entropy formulations to discern the communication channel capacity (i.e., the Shannon limit). We formulate a sensor management scenario for target tracking and compare various metrics for performance evaluation as a function of the design parameter (α) so as to determine which measures might be appropriate for sensor management given the dynamics of the scenario and the design parameter.
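The two divergences compared in this record have standard closed forms for discrete distributions, and the Rényi divergence recovers K-L as α → 1. A minimal sketch (the distributions p and q are illustrative, and both are assumed strictly positive):

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p||q) in nats (p, q strictly positive)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log(p / q)))

def renyi_divergence(p, q, alpha):
    """Renyi divergence of order alpha != 1; tends to D(p||q) as alpha -> 1."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.log(np.sum(p**alpha * q**(1 - alpha))) / (alpha - 1))

p = [0.7, 0.2, 0.1]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))
print(renyi_divergence(p, q, 0.5), renyi_divergence(p, q, 0.999))
```

Sweeping α, as the paper does, amounts to evaluating `renyi_divergence` over a grid of orders for the predicted sensor-conditioned distributions.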
Temporal Methods to Detect Content-Based Anomalies in Social Media
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skryzalin, Jacek; Field, Jr., Richard; Fisher, Andrew N.
Here, we develop a method for time-dependent topic tracking and meme trending in social media. Our objective is to identify time periods whose content differs significantly from normal, and we utilize two techniques to do so. The first is an information-theoretic analysis of the distributions of terms emitted during different periods of time. In the second, we cluster documents from each time period and analyze the tightness of each clustering. We also discuss a method of combining the scores created by each technique, and we provide ample empirical analysis of our methodology on various Twitter datasets.
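One common information-theoretic way to compare term distributions across time periods, of the kind this record describes, is the Jensen-Shannon divergence. The sketch below uses it on two toy token streams; the choice of JS divergence and the example tokens are illustrative assumptions, not the paper's exact scoring function.

```python
import math
from collections import Counter

def term_distribution(tokens, vocab):
    """Relative term frequencies over a fixed vocabulary."""
    counts = Counter(tokens)
    total = sum(counts.values())
    return [counts[w] / total for w in vocab]

def js_divergence(p, q):
    """Jensen-Shannon divergence (base 2, bounded in [0, 1])."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    def kl(a, b):
        return sum(ai * math.log2(ai / bi) for ai, bi in zip(a, b) if ai > 0)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

normal = "ship port cargo ship sea port".split()
anomal = "fire fire alarm ship fire port".split()
vocab = sorted(set(normal) | set(anomal))
score = js_divergence(term_distribution(normal, vocab),
                      term_distribution(anomal, vocab))
print(score)  # higher score = period deviates more from normal content
```

A time period whose score exceeds a threshold calibrated on historical periods would be flagged as anomalous.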
Brain activity and cognition: a connection from thermodynamics and information theory
Collell, Guillem; Fauquet, Jordi
2015-01-01
The connection between brain and mind is an important scientific and philosophical question that we are still far from completely understanding. A crucial point to our work is noticing that thermodynamics provides a convenient framework to model brain activity, whereas cognition can be modeled in information-theoretical terms. In fact, several models have been proposed so far from both approaches. A second critical remark is the existence of deep theoretical connections between thermodynamics and information theory. In fact, some well-known authors claim that the laws of thermodynamics are nothing but principles in information theory. Unlike in physics or chemistry, a formalization of the relationship between information and energy is currently lacking in neuroscience. In this paper we propose a framework to connect physical brain and cognitive models by means of the theoretical connections between information theory and thermodynamics. Ultimately, this article aims at providing further insight on the formal relationship between cognition and neural activity. PMID:26136709
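One concrete quantitative bridge between thermodynamics and information theory of the kind this record invokes is Landauer's principle: erasing one bit of information dissipates at least k_B·T·ln 2 of heat. A minimal calculation at body temperature (the application to neural tissue is illustrative, not a claim from the paper):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temp_kelvin, bits=1):
    """Minimum heat dissipated by erasing `bits` bits at temperature T."""
    return bits * K_B * temp_kelvin * math.log(2)

print(landauer_limit(310.0))  # ~3e-21 J per bit at ~37 C
```

Real neural signalling dissipates many orders of magnitude more energy per bit than this bound, which is part of why formalizing the information-energy relationship in neuroscience remains open.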
Murphy, Kerry; O'Connor, Denise A; Browning, Colette J; French, Simon D; Michie, Susan; Francis, Jill J; Russell, Grant M; Workman, Barbara; Flicker, Leon; Eccles, Martin P; Green, Sally E
2014-03-03
Dementia is a growing problem, causing substantial burden for patients, their families, and society. General practitioners (GPs) play an important role in diagnosing and managing dementia; however, there are gaps between recommended and current practice. The aim of this study was to explore GPs' reported practice in diagnosing and managing dementia and to describe, in theoretical terms, the proposed explanations for practice that was and was not consistent with evidence-based guidelines. Semi-structured interviews were conducted with GPs in Victoria, Australia. The Theoretical Domains Framework (TDF) guided data collection and analysis. Interviews explored the factors hindering and enabling achievement of 13 recommended behaviours. Data were analysed using content and thematic analysis. This paper presents an in-depth description of the factors influencing two behaviours, assessing co-morbid depression using a validated tool, and conducting a formal cognitive assessment using a validated scale. A total of 30 GPs were interviewed. Most GPs reported that they did not assess for co-morbid depression using a validated tool as per recommended guidance. Barriers included the belief that depression can be adequately assessed using general clinical indicators and that validated tools provide little additional information (theoretical domain of 'Beliefs about consequences'); discomfort in using validated tools ('Emotion'), possibly due to limited training and confidence ('Skills'; 'Beliefs about capabilities'); limited awareness of the need for, and forgetting to conduct, a depression assessment ('Knowledge'; 'Memory, attention and decision processes'). Most reported practising in a manner consistent with the recommendation that a formal cognitive assessment using a validated scale be undertaken. 
Key factors enabling this were having an awareness of the need to conduct a cognitive assessment ('Knowledge'); possessing the necessary skills and confidence ('Skills'; 'Beliefs about capabilities'); and having adequate time and resources ('Environmental context and resources'). This is the first study to our knowledge to use a theoretical approach to investigate the barriers and enablers to guideline-recommended diagnosis and management of dementia in general practice. It has identified key factors likely to explain GPs' uptake of the guidelines. The results have informed the design of an intervention aimed at supporting practice change in line with dementia guidelines, which is currently being evaluated in a cluster randomised trial.
[Preliminarily application of content analysis to qualitative nursing data].
Liang, Shu-Yuan; Chuang, Yeu-Hui; Wu, Shu-Fang
2012-10-01
Content analysis is a methodology for objectively and systematically studying the content of communication in various formats. Content analysis in nursing research and nursing education is called qualitative content analysis. Qualitative content analysis is frequently applied to nursing research, as it allows researchers to determine categories inductively and deductively. This article examines qualitative content analysis in nursing research from theoretical and practical perspectives. We first describe how content analysis concepts such as unit of analysis, meaning unit, code, category, and theme are used. Next, we describe the basic steps involved in using content analysis, including data preparation, data familiarization, analysis unit identification, creating tentative coding categories, category refinement, and establishing category integrity. Finally, this paper introduces the concept of content analysis rigor, including dependability, confirmability, credibility, and transferability. This article elucidates the content analysis method in order to help professionals conduct systematic research that generates data that are informative and useful in practical application.
Reflections of the social environment in chimpanzee memory: applying rational analysis beyond humans
Stevens, Jeffrey R.; Marewski, Julian N.; Schooler, Lael J.; Gilby, Ian C.
2016-08-01
In cognitive science, the rational analysis framework allows modelling of how physical and social environments impose information-processing demands onto cognitive systems. In humans, for example, past social contact among individuals predicts their future contact with linear and power functions. These features of the human environment constrain the optimal way to remember information and probably shape how memory records are retained and retrieved. We offer a primer on how biologists can apply rational analysis to study animal behaviour. Using chimpanzees (Pan troglodytes) as a case study, we modelled 19 years of observational data on their social contact patterns. Much like humans, the frequency of past encounters in chimpanzees linearly predicted future encounters, and the recency of past encounters predicted future encounters with a power function. Consistent with the rational analyses carried out for human memory, these findings suggest that chimpanzee memory performance should reflect those environmental regularities. In re-analysing existing chimpanzee memory data, we found that chimpanzee memory patterns mirrored their social contact patterns. Our findings hint that human and chimpanzee memory systems may have evolved to solve similar information-processing problems. Overall, rational analysis offers novel theoretical and methodological avenues for the comparative study of cognition. PMID:27853606
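Fitting the power function of recency described above is typically done by linear regression in log-log space, since p = a·t^b implies log p = log a + b·log t. The recency data below are hypothetical illustrative numbers, not the chimpanzee dataset.

```python
import numpy as np

# Hypothetical recency data: probability of a future encounter as a
# function of days since the last encounter (illustrative numbers only).
recency_days = np.array([1, 2, 4, 8, 16, 32], dtype=float)
p_future = np.array([0.50, 0.35, 0.25, 0.17, 0.12, 0.085])

# A power function p = a * t**b is linear in log-log space.
b, log_a = np.polyfit(np.log(recency_days), np.log(p_future), 1)
print(b)  # negative exponent: encounter probability decays with recency
```

The fitted exponent b characterizes how steeply the environment "forgets" contacts, which rational analysis then compares against memory retrieval curves.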
Hötzel, Fabian; Seino, Kaori; Huck, Christian; Skibbe, Olaf; Bechstedt, Friedhelm; Pucci, Annemarie
2015-06-10
The metal-atom chains on the Si(111)-5×2-Au surface represent an exceedingly interesting system for the understanding of one-dimensional electrical interconnects. While other metal-atom chain structures on silicon suffer from metal-to-insulator transitions, Si(111)-5×2-Au stays metallic at least down to 20 K, as we have proven by the anisotropic absorption from localized plasmon polaritons in the infrared. A quantitative analysis of the infrared plasmonic signal, done here for the first time, yields valuable band structure information in agreement with the theoretically derived data. The experimental and theoretical results are consistently explained in the framework of the atomic geometry, electronic structure, and IR spectra of the recent Kwon-Kang model.
Making a Traditional Study-Abroad Program Geographic: A Theoretically Informed Regional Approach
ERIC Educational Resources Information Center
Jokisch, Brad
2009-01-01
Geographers have been active in numerous focused study-abroad programs, but few have created or led language-based programs overseas. This article describes the development of a Spanish language program in Ecuador and how it was made geographic primarily through a theoretically informed regional geography course. The approach employs theoretical…
Content Based Image Retrieval and Information Theory: A General Approach.
ERIC Educational Resources Information Center
Zachary, John; Iyengar, S. S.; Barhen, Jacob
2001-01-01
Proposes an alternative real valued representation of color based on the information theoretic concept of entropy. A theoretical presentation of image entropy is accompanied by a practical description of the merits and limitations of image entropy compared to color histograms. Results suggest that image entropy is a promising approach to image…
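Image entropy of the kind this record proposes is computed from the intensity histogram: H = -Σ p_i log2 p_i over the occupied bins. A minimal sketch (8-bit grayscale assumed; the test images are synthetic illustrations):

```python
import numpy as np

def image_entropy(pixels, bins=256):
    """Shannon entropy (bits) of an 8-bit image's intensity histogram."""
    hist, _ = np.histogram(pixels, bins=bins, range=(0, 256))
    p = hist[hist > 0] / hist.sum()          # probabilities of occupied bins
    return float(-np.sum(p * np.log2(p)))

flat = np.full((8, 8), 128)              # single intensity -> entropy 0
noisy = np.arange(256).reshape(16, 16)   # all 256 intensities once -> entropy 8
print(image_entropy(flat), image_entropy(noisy))  # 0.0 8.0
```

Unlike a full color histogram, this collapses each image (or channel) to a single real value, which is what makes entropy-based indexing compact for retrieval.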
ERIC Educational Resources Information Center
Fleer, Marilyn
2016-01-01
The concept of "perezhivanie" has received increasing attention in recent years. However, a clear understanding of this term has not yet been established. Mostly what is highlighted is the need for more informed theoretical discussion. In this paper, discussions centre on what "perezhivanie" means for research in early…
The Public Library User and the Charter Tourist: Two Travellers, One Analogy
ERIC Educational Resources Information Center
Eriksson, Catarina A. M.; Michnik, Katarina E.; Nordeborg, Yoshiko
2013-01-01
Introduction: A new theoretical model, relevant to library and information science, is implemented in this paper. The aim of this study is to contribute to the theoretical concepts of library and information science by introducing an ethnological model developed for investigating charter tourist styles thereby increasing our knowledge of users'…
Towards Improved Student Experiences in Service Learning in Information Systems Courses
ERIC Educational Resources Information Center
Petkova, Olga
2017-01-01
The paper explores relevant past research on service-learning in Information Systems courses since 2000. One of the conclusions from this is that most of the publications are not founded on specific theoretical models and are mainly about sharing instructor or student experiences. Then several theoretical frameworks from Education and other…
ERIC Educational Resources Information Center
Park, Young Ki
2011-01-01
This study explains the role of information technologies in enabling organizations to successfully sense and manage opportunities and threats and achieve competitive advantage in turbulent environments. I use two approaches, a set-theoretic configurational theory approach and a variance theory approach, which are theoretically and methodologically…
ERIC Educational Resources Information Center
Schalock, Robert L.; Luckasson, Ruth; Tassé, Marc J.; Verdugo, Miguel Angel
2018-01-01
This article describes a holistic theoretical framework that can be used to explain intellectual disability (ID) and organize relevant information into a usable roadmap to guide understanding and application. Developing the framework involved analyzing the four current perspectives on ID and synthesizing this information into a holistic…
Suppressing disease spreading by using information diffusion on multiplex networks.
Wang, Wei; Liu, Quan-Hui; Cai, Shi-Min; Tang, Ming; Braunstein, Lidia A; Stanley, H Eugene
2016-07-06
Although there is always an interplay between the dynamics of information diffusion and disease spreading, empirical research on the systemic coevolution mechanisms connecting these two spreading dynamics is still lacking. Here we investigate the coevolution mechanisms and dynamics between information and disease spreading by utilizing real data and a proposed spreading model on a multiplex network. Our empirical analysis finds asymmetrical interactions between the information and disease spreading dynamics. Our results obtained from both the theoretical framework and extensive stochastic numerical simulations suggest that an information outbreak can be triggered in a communication network by its own spreading dynamics or by a disease outbreak on a contact network, but that the disease threshold is not affected by information spreading. Our key finding is that there is an optimal information transmission rate that markedly suppresses the disease spreading. We find that the time evolution of the dynamics in the proposed model qualitatively agrees with the real-world spreading processes at the optimal information transmission rate.
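The asymmetric coupling described above (disease seeds awareness; awareness only damps infection) can be caricatured in a few lines. This is a toy discrete-time sketch, not the paper's model: the ring topologies, all rates, and the awareness-reduction factor are assumed for illustration.

```python
import random

# Toy coupled awareness (communication layer) + SIS disease (contact layer)
# dynamics on the same N nodes; all parameters are illustrative assumptions.
random.seed(1)
N = 200
comm = {i: [(i - 1) % N, (i + 1) % N, (i + 5) % N] for i in range(N)}  # info layer
contact = {i: [(i - 1) % N, (i + 1) % N] for i in range(N)}            # disease layer

aware, infected = set(), {0}
LAMBDA, DELTA = 0.3, 0.1        # awareness transmission / forgetting
BETA, MU, GAMMA = 0.4, 0.2, 0.5  # infection rate, recovery, reduction if aware

for _ in range(100):
    new_aware, new_inf = set(aware), set(infected)
    for i in aware:                       # awareness spreads on comm layer
        for j in comm[i]:
            if random.random() < LAMBDA:
                new_aware.add(j)
    for i in infected:
        new_aware.add(i)                  # infection makes a node aware
        for j in contact[i]:              # disease spreads on contact layer
            rate = BETA * (GAMMA if j in new_aware else 1.0)
            if random.random() < rate:
                new_inf.add(j)
    aware = {i for i in new_aware if random.random() > DELTA}
    infected = {i for i in new_inf if random.random() > MU}

print(len(aware), len(infected))
```

Sweeping LAMBDA in such a simulation is one way to probe for the optimal information transmission rate the paper reports.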
Cardiorespiratory Information Dynamics during Mental Arithmetic and Sustained Attention
Widjaja, Devy; Montalto, Alessandro; Vlemincx, Elke; Marinazzo, Daniele; Van Huffel, Sabine; Faes, Luca
2015-01-01
An analysis of cardiorespiratory dynamics during mental arithmetic, which induces stress, and sustained attention was conducted using information theory. The information storage and internal information of heart rate variability (HRV) were determined respectively as the self-entropy of the tachogram, and the self-entropy of the tachogram conditioned to the knowledge of respiration. The information transfer and cross information from respiration to HRV were assessed as the transfer and cross-entropy, both measures of cardiorespiratory coupling. These information-theoretic measures identified significant nonlinearities in the cardiorespiratory time series. Additionally, it was shown that, although mental stress is related to a reduction in vagal activity, no difference in cardiorespiratory coupling was found when several mental states (rest, mental stress, sustained attention) are compared. However, the self-entropy of HRV conditioned to respiration was very informative to study the predictability of RR interval series during mental tasks, and showed higher predictability during mental arithmetic compared to sustained attention or rest. PMID:26042824
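The transfer entropy used above as a cardiorespiratory coupling measure can be illustrated with a minimal plug-in estimator for discrete symbol sequences (lag 1). The binary series below are synthetic illustrations, not HRV or respiration data; real HRV analyses use finer discretizations and longer histories.

```python
import math
from collections import Counter

def transfer_entropy(source, target):
    """Plug-in transfer entropy source -> target (bits), history length 1."""
    triples = list(zip(target[1:], target[:-1], source[:-1]))  # (y_next, y, x)
    c_xyz = Counter(triples)
    c_yz = Counter((y, z) for _, y, z in triples)
    c_xy = Counter((x, y) for x, y, _ in triples)
    c_y = Counter(y for _, y, _ in triples)
    n = len(triples)
    te = 0.0
    for (x, y, z), c in c_xyz.items():
        # p(y_next|y,x) / p(y_next|y) expressed with raw counts
        te += (c / n) * math.log2((c * c_y[y]) / (c_yz[(y, z)] * c_xy[(x, y)]))
    return te

# target copies source with a one-step delay, so TE should be clearly positive
src = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0] * 10
tgt = [0] + src[:-1]
print(transfer_entropy(src, tgt))
```

Conditioning the tachogram's self-entropy on respiration, as in the study, uses the same counting machinery with respiration in the conditioning set.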
Sheehan, Joanne; Sherman, Kerry A; Lam, Thomas; Boyages, John
2007-04-01
Little is known of the psychosocial factors associated with decision regret in the context of breast reconstruction following mastectomy for breast cancer treatment. Moreover, there is a paucity of theoretically based research in the area of post-decision regret. Adopting the theoretical framework of the Monitoring Process Model (Cancer 1995;76(1):167-177), the current study assessed the role of information satisfaction, current psychological distress and the moderating effect of monitoring coping style in the experience of regret over the decision to undergo reconstructive surgery. Women (N=123) diagnosed with breast cancer who had undergone immediate or delayed breast reconstruction following mastectomy participated in the study. The majority of participants (52.8%, n=65) experienced no decision regret, 27.6% experienced mild regret and 19.5% moderate to strong regret. Bivariate analyses indicated that decision regret was associated with low satisfaction with preparatory information, depression, anxiety and stress. Multinomial logistic regression analysis showed, controlling for mood state and time since last reconstructive procedure, that lower satisfaction with information and increased depression were associated with increased likelihood of experiencing regret. Monitoring coping style moderated the association between anxiety and regret (beta=-0.10, OR=0.91, p=0.01), whereby low monitors who were highly anxious had a greater likelihood of experiencing regret than highly anxious high monitors. Copyright (c) 2006 John Wiley & Sons, Ltd.
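The reported moderation effect illustrates the standard link between a logistic-regression coefficient and an odds ratio, OR = exp(β). Applied to the abstract's β = -0.10 (the small discrepancy with the reported OR of 0.91 presumably reflects rounding of β before publication):

```python
import math

beta = -0.10                 # moderation coefficient from the abstract
odds_ratio = math.exp(beta)  # OR = exp(beta) for a one-unit predictor change
print(round(odds_ratio, 2))
```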
Information theoretical assessment of visual communication with wavelet coding
NASA Astrophysics Data System (ADS)
Rahman, Zia-ur
1995-06-01
A visual communication channel can be characterized by the efficiency with which it conveys information, and by the quality of the images restored from the transmitted data. Efficient data representation requires the use of constraints of the visual communication channel. Our information-theoretic analysis combines the design of the wavelet compression algorithm with the design of the visual communication channel. Shannon's communication theory, Wiener's restoration filter, and the critical design factors of image gathering and display are combined to provide metrics for measuring the efficiency of data transmission, and for quantitatively assessing the visual quality of the restored image. These metrics are: a) the mutual information (Eta) between the radiance field and the restored image, and b) the efficiency of the channel, which can be roughly measured as the ratio (Eta)/H, where H is the average number of bits used to transmit the data. Huck, et al. (Journal of Visual Communication and Image Representation, Vol. 4, No. 2, 1993) have shown that channels designed to maximize (Eta) also maximize the visual quality of the restored image. Our assessment provides a framework for designing channels which provide the highest possible visual quality for a given amount of data under the critical design limitations of the image gathering and display devices. Results show that a trade-off exists between the maximum realizable information of the channel and its efficiency: an increase in one leads to a decrease in the other. The final selection of which of these quantities to maximize is, of course, application dependent.
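As a minimal illustration of the two metrics, the sketch below computes the mutual information (Eta) for an idealized additive white Gaussian noise channel and the efficiency ratio (Eta)/H for an assumed transmission rate H. The signal/noise variances and the 8 bits/sample rate are arbitrary assumptions; the paper evaluates these quantities for a full wavelet image-coding channel.

```python
import numpy as np

def awgn_mutual_info_bits(signal_var, noise_var):
    """Mutual information (bits/sample) conveyed by an additive white
    Gaussian noise channel: 0.5 * log2(1 + SNR)."""
    return 0.5 * np.log2(1.0 + signal_var / noise_var)

signal_var, noise_var = 100.0, 1.0
eta = awgn_mutual_info_bits(signal_var, noise_var)  # information conveyed, (Eta)
H = 8.0                                 # assumed bits/sample actually transmitted
efficiency = eta / H                    # the (Eta)/H ratio from the abstract
print(eta, efficiency)
```

Raising H always drives the efficiency (Eta)/H down once (Eta) saturates, which is one way to see the trade-off the abstract describes.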
A practice course to cultivate students' comprehensive ability of photoelectricity
NASA Astrophysics Data System (ADS)
Lv, Yong; Liu, Yang; Niu, Chunhui; Liu, Lishuang
2017-08-01
After studying many theoretical courses, it is important and urgent for students majoring in optoelectronic information science and engineering to cultivate their comprehensive ability in photoelectricity. We set up a comprehensive practice course named "Integrated Design of Optoelectronic Information System" (IDOIS) so that students can integrate their knowledge of optics, electronics and computer programming to design, install and debug an optoelectronic system with independent functions. Eight years of practice show that this course can train students' ability to analyze, design/develop and debug photoelectric systems, and improve their abilities in document retrieval, design proposal and summary report writing, teamwork, innovation consciousness and skill.
Informed spectral analysis: audio signal parameter estimation using side information
NASA Astrophysics Data System (ADS)
Fourer, Dominique; Marchand, Sylvain
2013-12-01
Parametric models are of great interest for representing and manipulating sounds. However, the quality of the resulting signals depends on the precision of the parameters. When the signals are available, these parameters can be estimated, but the presence of noise decreases the resulting precision of the estimation. Furthermore, the Cramér-Rao bound shows the minimal error reachable with the best estimator, which can be insufficient for demanding applications. These limitations can be overcome by using the coding approach, which consists in directly transmitting the parameters with the best precision using the minimal bitrate. However, this approach does not take advantage of the information provided by estimation from the signal, and may require a larger bitrate and cause a loss of compatibility with existing file formats. The purpose of this article is to propose a compromise approach, called the 'informed approach,' which combines analysis with (coded) side information in order to increase the precision of parameter estimation using a lower bitrate than pure coding approaches, the audio signal being known at the coder. Thus, the analysis problem is presented in a coder/decoder configuration where the side information is computed and inaudibly embedded into the mixture signal at the coder. At the decoder, the extra information is extracted and is used to assist the analysis process. This study proposes applying this approach to audio spectral analysis using sinusoidal modeling, which is a well-known model with practical applications and where theoretical bounds have been calculated. This work aims at uncovering new approaches for audio quality-based applications. It provides a solution for challenging problems like active listening of music, source separation, and realistic sound transformations.
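A toy version of the estimation problem conveys the idea: recovering a sinusoid's frequency from a noisy signal by FFT peak picking, where side information (here, an assumed known frequency band) constrains the search. This is only a sketch of the analysis-plus-side-information principle; the paper's scheme embeds coded side information inaudibly in the signal itself rather than supplying it out of band.

```python
import numpy as np

rng = np.random.default_rng(1)
fs, n = 8000, 1024
f0, a0 = 440.0, 1.0                       # true sinusoid parameters (assumed)
t = np.arange(n) / fs
x = a0 * np.sin(2 * np.pi * f0 * t) + 0.1 * rng.standard_normal(n)

# Blind estimation: pick the peak of the windowed magnitude spectrum.
win = np.hanning(n)
spec = np.abs(np.fft.rfft(x * win))
k = int(np.argmax(spec))
f_est = k * fs / n                        # resolution limited to fs/n ~ 7.8 Hz

# "Informed" variant: side information narrows the search to a known band,
# protecting the estimate against spurious out-of-band peaks.
lo, hi = int(400 * n / fs), int(480 * n / fs)
k_informed = lo + int(np.argmax(spec[lo:hi]))
f_informed = k_informed * fs / n
print(f_est, f_informed)
```

In a real informed coder, the side information would carry quantized corrections to the estimates, pushing precision below what the Cramér-Rao bound allows for blind estimation alone.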
Usefulness of a Regional Health Care Information System in primary care: a case study.
Maass, Marianne C; Asikainen, Paula; Mäenpää, Tiina; Wanne, Olli; Suominen, Tarja
2008-08-01
The goal of this paper is to describe some benefits and possible cost consequences of computer based access to specialised health care information. A before-after activity analysis regarding 20 diabetic patients' clinical appointments was performed in a Health Centre in Satakunta region in Finland. Cost data, an interview, time-and-motion studies, and flow charts based on modelling were applied. Access to up-to-date diagnostic information reduced redundant clinical re-appointments, repeated tests, and mail orders for missing data. Timely access to diagnostic information brought about several benefits regarding workflow, patient care, and disease management. These benefits resulted in theoretical net cost savings. The study results indicated that Regional Information Systems may be useful tools to support performance and improve efficiency. However, further studies are required in order to verify how the monetary savings would impact the performance of Health Care Units.
Zamora-López, Gorka; Zhou, Changsong; Kurths, Jürgen
2009-01-01
Sensory stimuli entering the nervous system follow particular paths of processing, typically separated (segregated) from the paths of other modal information. However, sensory perception, awareness and cognition emerge from the combination of information (integration). The corticocortical networks of cats and macaque monkeys display three prominent characteristics: (i) modular organisation (facilitating the segregation), (ii) abundant alternative processing paths and (iii) the presence of highly connected hubs. Here, we study in detail the organisation and potential function of the cortical hubs by graph analysis and information theoretical methods. We find that the cortical hubs form a spatially delocalised, but topologically central module with the capacity to integrate multisensory information in a collaborative manner. With this, we resolve the underlying anatomical substrate that supports the simultaneous capacity of the cortex to segregate and to integrate multisensory information. PMID:20428515
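The hub analysis described above can be caricatured on a synthetic modular network (a toy graph, not the cat or macaque connectivity data): hubs are identified by degree, and their mutual ("rich-club") density is compared with the network's overall density to show that hubs form a topologically central module.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 40
A = np.zeros((n, n), dtype=int)
# Two dense modules (segregation)...
for lo, hi in [(0, 18), (18, 36)]:
    for i in range(lo, hi):
        for j in range(i + 1, hi):
            if rng.random() < 0.4:
                A[i, j] = A[j, i] = 1
# ...plus a few hubs wired across the whole network (integration).
for h in range(36, 40):
    for j in range(n):
        if j != h and rng.random() < 0.6:
            A[h, j] = A[j, h] = 1

deg = A.sum(axis=1)
top = np.argsort(deg)[-4:]               # highest-degree nodes = candidate hubs
sub = A[np.ix_(top, top)]
hub_density = sub.sum() / (len(top) * (len(top) - 1))
overall = A.sum() / (n * (n - 1))
print(sorted(int(i) for i in top), hub_density, overall)
```

The hubs' inter-connection density exceeding the overall density is the signature of a "central module" of the kind the study reports, here by construction.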
Starke, Sandra D; Baber, Chris
2018-07-01
User interface (UI) design can affect the quality of decision making, where decisions based on digitally presented content are commonly informed by visually sampling information through eye movements. Analysis of the resulting scan patterns - the order in which people visually attend to different regions of interest (ROIs) - gives an insight into information foraging strategies. In this study, we quantified scan pattern characteristics for participants engaging with conceptually different user interface designs. Four interfaces were modified along two dimensions relating to effort in accessing information: data presentation (either alpha-numerical data or colour blocks), and information access time (all information sources readily available or sequential revealing of information required). The aim of the study was to investigate whether a) people develop repeatable scan patterns and b) different UI concepts affect information foraging and task performance. Thirty-two participants (eight for each UI concept) were given the task to correctly classify 100 credit card transactions as normal or fraudulent based on nine transaction attributes. Attributes varied in their usefulness for predicting the correct outcome. Conventional and more recent (network analysis- and bioinformatics-based) eye tracking metrics were used to quantify visual search. Empirical findings were evaluated in the context of random data and possible accuracy for theoretical decision making strategies. Results showed short repeating sequence fragments within longer scan patterns across participants and conditions, comprising a systematic and a random search component. The UI design concept showing alpha-numerical data in full view resulted in the most complete data foraging, while the design concept showing colour blocks in full view resulted in the fastest task completion time. Decision accuracy was not significantly affected by UI design.
Theoretical calculations showed that the difference in achievable accuracy between very complex and simple decision making strategies was small. We conclude that goal-directed search of familiar information results in repeatable scan pattern fragments (often corresponding to information sources considered particularly important), but not in repeatable complete scan patterns. The underlying concept of the UI affects how visual search is performed and how a decision-making strategy develops. This should be taken into consideration when designing for applied domains. Copyright © 2018 Elsevier Ltd. All rights reserved.
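The search for repeating sequence fragments within longer scan patterns can be sketched with simple n-gram counting over a sequence of ROI labels. The labels and the sequence below are hypothetical; the study itself uses more elaborate network-analysis- and bioinformatics-based metrics.

```python
from collections import Counter

def repeated_fragments(scan, k=3, min_count=2):
    """Count length-k sub-sequences (n-grams) of a scan pattern over ROIs
    and keep those that repeat -- a simple stand-in for detecting the
    systematic component of visual search within a longer, noisier pattern."""
    grams = Counter(tuple(scan[i:i + k]) for i in range(len(scan) - k + 1))
    return {g: c for g, c in grams.items() if c >= min_count}

# Hypothetical ROI visit sequence: the A->M->C fragment recurs amid noise.
scan = list("AMCAMCXBAMCYAMC")
print(repeated_fragments(scan))
```

A fragment that recurs far more often than chance would suggest (here A-M-C) marks the systematic search component; the surrounding one-off n-grams play the role of the random component.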
On-board data management study for EOPAP
NASA Technical Reports Server (NTRS)
Davisson, L. D.
1975-01-01
The requirements, implementation techniques, and mission analysis associated with on-board data management for EOPAP were studied. SEASAT-A was used as a baseline, and the storage requirements, data rates, and information extraction requirements were investigated for each of the following proposed SEASAT sensors: a short pulse 13.9 GHz radar, a long pulse 13.9 GHz radar, a synthetic aperture radar, a multispectral passive microwave radiometer facility, and an infrared/visible very high resolution radiometer (VHRR). Rate distortion theory was applied to determine theoretical minimum data rates, which were compared with the rates required by practical techniques. It was concluded that practical techniques can be used which approach the theoretical optimum based upon an empirically determined source random process model. The results of the preceding investigations were used to recommend an on-board data management system for (1) data compression through information extraction, optimal noiseless coding, source coding with distortion, data buffering, and data selection under command or as a function of data activity, (2) command handling, (3) spacecraft operation and control, and (4) experiment operation and monitoring.
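The gap between a theoretical minimum rate and a practical scheme can be illustrated for the noiseless-coding case mentioned above: the entropy of an empirically determined source model lower-bounds any lossless code, while a naive fixed-length code pays the ceiling of log2 of the alphabet size. The symbol names and frequencies below are hypothetical stand-ins for sensor data statistics.

```python
import math
from collections import Counter

# Hypothetical empirical source model: symbol frequencies from sensor data.
counts = Counter({'calm_sea': 70, 'swell': 20, 'storm': 9, 'ice': 1})
total = sum(counts.values())
probs = {s: c / total for s, c in counts.items()}

# Theoretical minimum rate for lossless coding: the source entropy.
H = -sum(p * math.log2(p) for p in probs.values())

# A simple practical scheme: fixed-length codes, one per symbol.
fixed = math.ceil(math.log2(len(counts)))
print(H, fixed)   # entropy ~1.2 bits/symbol vs 2-bit fixed-length codes
```

A practical entropy coder (Huffman, arithmetic) would close most of this gap, which is the sense in which practical techniques "approach the theoretical optimum".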
Combustion: an oil spill mitigation tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1979-11-01
The technical feasibility of using combustion as an oil spill mitigation tool was studied. Part I of the two-part report is a practical guide oriented toward the needs of potential users, while Part II is the research or resource document from which the practical guidance was drawn. The study included theoretical evaluations of combustion of petroleum pool fires under the effects of weathering and an oil classification system related to combustion potential. The theoretical analysis of combustion is balanced by practical experience of oil burning and case history information. Decision elements are provided which can be used as a guide for technical evaluations of a particular oil spill situation. The rationale for assessing technical feasibility is given in the context of other alternatives available for response to an oil spill. A series of research and technology development concepts are included for future research. The ethics of using oil burning are discussed as issues, concerns, and tradeoffs. A detailed annotated bibliography is appended along with a capsule review of a decade of oil burning studies and other support information.
Altan, Irem; Charbonneau, Patrick; Snell, Edward H.
2016-01-01
Crystallization is a key step in macromolecular structure determination by crystallography. While a robust theoretical treatment of the process is available, due to the complexity of the system, the experimental process is still largely one of trial and error. In this article, efforts in the field are discussed together with a theoretical underpinning using a solubility phase diagram. Prior knowledge has been used to develop tools that computationally predict the crystallization outcome and define mutational approaches that enhance the likelihood of crystallization. For the most part these tools are based on binary outcomes (crystal or no crystal), and the full information contained in an assembly of crystallization screening experiments is lost. The potential of this additional information is illustrated by examples where new biological knowledge can be obtained and where a target can be sub-categorized to predict which class of reagents provides the crystallization driving force. Computational analysis of crystallization requires complete and correctly formatted data. While massive crystallization screening efforts are under way, the data available from many of these studies are sparse. The potential for this data and the steps needed to realize this potential are discussed. PMID:26792536
Comprehensive proteomic analysis of Penicillium verrucosum.
Nöbauer, Katharina; Hummel, Karin; Mayrhofer, Corina; Ahrens, Maike; Setyabudi, Francis M C; Schmidt-Heydt, Markus; Eisenacher, Martin; Razzazi-Fazeli, Ebrahim
2017-05-01
Mass spectrometric identification of proteins in species lacking validated sequence information is a major problem in veterinary science. In the present study, we used ochratoxin A producing Penicillium verrucosum to identify and quantitatively analyze proteins of an organism with yet no protein information available. The work presented here aimed to provide a comprehensive protein identification of P. verrucosum using shotgun proteomics. We were able to identify 3631 proteins in an "ab initio" translated database from DNA sequences of P. verrucosum. Additionally, a sequential window acquisition of all theoretical fragment-ion spectra analysis was done to find differentially regulated proteins at two different time points of the growth curve. We compared the proteins at the beginning (day 3) and at the end of the log phase (day 12). © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Using text analysis to quantify the similarity and evolution of scientific disciplines
Dias, Laércio; Gerlach, Martin; Scharloth, Joachim; Altmann, Eduardo G
2018-01-01
We use an information-theoretic measure of linguistic similarity to investigate the organization and evolution of scientific fields. An analysis of almost 20 M papers from the past three decades reveals that the linguistic similarity is related but different from experts and citation-based classifications, leading to an improved view on the organization of science. A temporal analysis of the similarity of fields shows that some fields (e.g. computer science) are becoming increasingly central, but that on average the similarity between pairs of disciplines has not changed in the last decades. This suggests that tendencies of convergence (e.g. multi-disciplinarity) and divergence (e.g. specialization) of disciplines are in balance. PMID:29410857
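An information-theoretic linguistic similarity of the kind described above can be sketched with the Jensen-Shannon divergence between word-frequency distributions. The toy "texts" below are invented; the study works with millions of paper abstracts and a more refined measure.

```python
import math
from collections import Counter

def jsd(p, q):
    """Jensen-Shannon divergence (bits) between two word distributions;
    0 for identical distributions, 1 for disjoint vocabularies."""
    vocab = set(p) | set(q)
    m = {w: 0.5 * (p.get(w, 0) + q.get(w, 0)) for w in vocab}
    def kl(a):
        return sum(a.get(w, 0) * math.log2(a.get(w, 0) / m[w])
                   for w in vocab if a.get(w, 0) > 0)
    return 0.5 * kl(p) + 0.5 * kl(q)

def word_dist(text):
    counts = Counter(text.lower().split())
    n = sum(counts.values())
    return {w: c / n for w, c in counts.items()}

# Hypothetical field vocabularies.
cs = word_dist("algorithm complexity network graph data algorithm")
physics = word_dist("energy field quantum particle field symmetry")
applied_math = word_dist("graph network matrix algorithm proof data")

# Smaller divergence = more similar vocabulary.
print(jsd(cs, applied_math), jsd(cs, physics))
```

Computing such pairwise divergences over time is what lets the study track whether disciplines' vocabularies converge or diverge.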
The evolutionary basis of human social learning
Morgan, T. J. H.; Rendell, L. E.; Ehn, M.; Hoppitt, W.; Laland, K. N.
2012-01-01
Humans are characterized by an extreme dependence on culturally transmitted information. Such dependence requires the complex integration of social and asocial information to generate effective learning and decision making. Recent formal theory predicts that natural selection should favour adaptive learning strategies, but relevant empirical work is scarce and rarely examines multiple strategies or tasks. We tested nine hypotheses derived from theoretical models, running a series of experiments investigating factors affecting when and how humans use social information, and whether such behaviour is adaptive, across several computer-based tasks. The number of demonstrators, consensus among demonstrators, confidence of subjects, task difficulty, number of sessions, cost of asocial learning, subject performance and demonstrator performance all influenced subjects' use of social information, and did so adaptively. Our analysis provides strong support for the hypothesis that human social learning is regulated by adaptive learning rules. PMID:21795267
From "They" Science to "Our" Science: Hip Hop Epistemology in STEAM Education
NASA Astrophysics Data System (ADS)
Dolberry, Maurice E.
Hip hop has moved from being considered a type of music into being understood as a culture in which a prominent type of music originates. Hip hop culture has a philosophy and epistemological constructs as well. This study analyzed those constructs to determine how conceptions of science factor in hip hop worldviews. Pedagogical models in culturally responsive teaching and Science, Technology, Engineering, Arts, and Mathematics (STEAM) education were also examined to discern their philosophical connections with hip hop culture. These connections were used to create two theoretical models. The first one, Hip Hop Science, described how scientific thought functions in hip hop culture. The second model, Hip Hop STEAM Pedagogy, proposes how hip hop culture can inform STEAM teaching practices. The study began by using Critical Race Theory to create a theoretical framework proposing how the two theoretical models could be derived from the philosophical and pedagogical concepts. Content analysis and narrative inquiry were used to analyze data collected from scholarly texts, hip hop songs, and interviews with hip hop-responsive educators. The data from these sources were used initially to assess the adequacy of the proposed theoretical framework, and subsequently to improve its viability. Four overlapping themes emerged from the data analyses, including hip hop-resistance to formal education; how hip hop culture informs pedagogical practice in hip hop-responsive classrooms; conceptions of knowledge and reality that shape how hip hoppers conduct scientific inquiry; and hip hop-based philosophies of effective teaching for hip hoppers as a marginalized cultural group. The findings indicate that there are unique connections between hip hop epistemology, sciencemindedness, and pedagogical practices in STEAM education. 
The revised theoretical framework clarified the nature of these connections, and supported claims from prior research that hip hop culture provides viable sites of engagement for STEAM educators. It concluded with suggestions for future research that further explicates hip hop epistemology and Hip Hop STEAM Pedagogy.
Identifying Seizure Onset Zone From the Causal Connectivity Inferred Using Directed Information
NASA Astrophysics Data System (ADS)
Malladi, Rakesh; Kalamangalam, Giridhar; Tandon, Nitin; Aazhang, Behnaam
2016-10-01
In this paper, we developed a model-based and a data-driven estimator for directed information (DI) to infer the causal connectivity graph between electrocorticographic (ECoG) signals recorded from the brain and to identify the seizure onset zone (SOZ) in epileptic patients. Directed information, an information-theoretic quantity, is a general metric to infer causal connectivity between time series and is not restricted to a particular class of models, unlike the popular metrics based on Granger causality or transfer entropy. The proposed estimators are shown to be almost surely convergent. Causal connectivity between ECoG electrodes in five epileptic patients is inferred using the proposed DI estimators, after validating their performance on simulated data. We then proposed a model-based and a data-driven SOZ identification algorithm to identify the SOZ from the causal connectivity inferred using the model-based and data-driven DI estimators, respectively. The data-driven SOZ identification outperforms the model-based SOZ identification algorithm when benchmarked against visual analysis by a neurologist, the current clinical gold standard. The causal connectivity analysis presented here is a first step towards developing novel non-surgical treatments for epilepsy.
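A minimal plug-in estimator conveys the idea of directed information: under a first-order Markov approximation, the DI rate from X to Y reduces to the conditional mutual information I(X_{t-1}; Y_t | Y_{t-1}), estimable from empirical counts for binary sequences. This is only a sketch; the paper's model-based and data-driven estimators are substantially more general, and the synthetic sequences below stand in for ECoG data.

```python
import numpy as np
from collections import Counter

def di_rate_order1(x, y):
    """First-order plug-in approximation to the directed information rate
    X -> Y: I(X_{t-1}; Y_t | Y_{t-1}) in bits, from empirical counts."""
    triples = Counter(zip(x[:-1], y[1:], y[:-1]))  # (X_{t-1}, Y_t, Y_{t-1})
    total = sum(triples.values())
    p = {k: v / total for k, v in triples.items()}
    p_yy, p_xy, p_yp = Counter(), Counter(), Counter()
    for (xp, yc, yp), v in p.items():
        p_yy[(yc, yp)] += v    # p(Y_t, Y_{t-1})
        p_xy[(xp, yp)] += v    # p(X_{t-1}, Y_{t-1})
        p_yp[yp] += v          # p(Y_{t-1})
    return sum(v * np.log2(v * p_yp[yp] / (p_yy[(yc, yp)] * p_xy[(xp, yp)]))
               for (xp, yc, yp), v in p.items())

rng = np.random.default_rng(3)
n = 20000
x = rng.integers(0, 2, n)
y = np.empty(n, dtype=int)
y[0] = 0
for t in range(1, n):                     # y copies x's past through 10% noise
    y[t] = x[t - 1] if rng.random() < 0.9 else 1 - x[t - 1]

print(di_rate_order1(x, y))                      # coupled: ~0.5 bits
print(di_rate_order1(rng.integers(0, 2, n), y))  # uncoupled: near zero
```

Thresholding such pairwise estimates over all electrode pairs yields a causal connectivity graph of the kind the SOZ identification algorithms operate on.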
Variables in psychology: a critique of quantitative psychology.
Toomela, Aaro
2008-09-01
Mind is hidden from direct observation; it can be studied only by observing behavior. Variables encode information about behaviors. There is no one-to-one correspondence between behaviors and mental events underlying the behaviors, however. In order to understand mind it would be necessary to understand exactly what information is represented in variables. This aim cannot be reached after variables are already encoded. Therefore, statistical data analysis can be very misleading in studies aimed at understanding mind that underlies behavior. In this article different kinds of information that can be represented in variables are described. It is shown how informational ambiguity of variables leads to problems of theoretically meaningful interpretation of the results of statistical data analysis procedures in terms of hidden mental processes. Reasons are provided why presence of dependence between variables does not imply causal relationship between events represented by variables and absence of dependence between variables cannot rule out the causal dependence of events represented by variables. It is concluded that variable-psychology has a very limited range of application for the development of a theory of mind-psychology.
An ideal observer analysis of visual working memory.
Sims, Chris R; Jacobs, Robert A; Knill, David C
2012-10-01
Limits in visual working memory (VWM) strongly constrain human performance across many tasks. However, the nature of these limits is not well understood. In this article we develop an ideal observer analysis of human VWM by deriving the expected behavior of an optimally performing but limited-capacity memory system. This analysis is framed around rate-distortion theory, a branch of information theory that provides optimal bounds on the accuracy of information transmission subject to a fixed information capacity. The result of the ideal observer analysis is a theoretical framework that provides a task-independent and quantitative definition of visual memory capacity and yields novel predictions regarding human performance. These predictions are subsequently evaluated and confirmed in 2 empirical studies. Further, the framework is general enough to allow the specification and testing of alternative models of visual memory (e.g., how capacity is distributed across multiple items). We demonstrate that a simple model developed on the basis of the ideal observer analysis, one that allows variability in the number of stored memory representations but does not assume the presence of a fixed item limit, provides an excellent account of the empirical data and further offers a principled reinterpretation of existing models of VWM. PsycINFO Database Record (c) 2012 APA, all rights reserved.
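The rate-distortion framing can be made concrete for a Gaussian source, where R(D) has a closed form. The sketch below shows the key qualitative prediction: splitting a fixed capacity across more items degrades recall fidelity smoothly, with no hard item limit. The capacity value and unit stimulus variance are arbitrary assumptions, not parameters from the paper.

```python
import math

def gaussian_rate(variance, distortion):
    """Rate-distortion function of a Gaussian source (bits/item):
    R(D) = 0.5 * log2(sigma^2 / D) for D < sigma^2, else 0."""
    return max(0.0, 0.5 * math.log2(variance / distortion))

def distortion_at(variance, capacity_bits):
    """Invert R(D): best achievable squared error given a bit budget."""
    return variance * 2 ** (-2 * capacity_bits)

# A fixed channel capacity split evenly across N memorized items: more
# items means fewer bits per item, hence noisier recall.
capacity, variance = 6.0, 1.0
for n_items in (1, 2, 4, 8):
    d = distortion_at(variance, capacity / n_items)
    print(n_items, round(d, 4))
```

This graded capacity-precision trade-off is what distinguishes the rate-distortion account from fixed-slot models of VWM.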
Anti-Noise Bidirectional Quantum Steganography Protocol with Large Payload
NASA Astrophysics Data System (ADS)
Qu, Zhiguo; Chen, Siyi; Ji, Sai; Ma, Songya; Wang, Xiaojun
2018-06-01
An anti-noise bidirectional quantum steganography protocol with large payload is proposed in this paper. In the new protocol, Alice and Bob are able to transmit classical information bits to each other while covertly teleporting secret quantum states. The new protocol introduces bidirectional quantum remote state preparation into bidirectional quantum secure communication, not only to expand the secret information from classical bits to quantum states, but also to extract the phase and amplitude values of the secret quantum state, greatly enlarging the capacity of the secret information. The new protocol also achieves better imperceptibility, since an eavesdropper can hardly detect the hidden channel or obtain effective secret quantum states. Compared with previous quantum steganography schemes, the new protocol obtains higher transmission efficiency and better availability owing to its unique bidirectional design. Furthermore, theoretical analysis shows that the new algorithm can effectively resist quantum noise. Finally, the performance analysis proves that the new protocol not only has good imperceptibility and high security, but also a large payload.
An information theoretic approach of designing sparse kernel adaptive filters.
Liu, Weifeng; Park, Il; Principe, José C
2009-12-01
This paper discusses an information theoretic approach of designing sparse kernel adaptive filters. To determine useful data to be learned and remove redundant ones, a subjective information measure called surprise is introduced. Surprise captures the amount of information a datum contains which is transferable to a learning system. Based on this concept, we propose a systematic sparsification scheme, which can drastically reduce the time and space complexity without harming the performance of kernel adaptive filters. Nonlinear regression, short term chaotic time-series prediction, and long term time-series forecasting examples are presented.
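The surprise criterion can be sketched in a Gaussian-process view of kernel regression: a datum's surprise is, up to constants, its negative log predictive likelihood given the points already learned, so redundant data score low and informative or abnormal data score high. The RBF kernel, noise level, and synthetic data below are assumptions for illustration, not the paper's full sparsification scheme.

```python
import numpy as np

def rbf(a, b, width=1.0):
    """Gaussian (RBF) kernel matrix between two 1-D point sets."""
    return np.exp(-np.subtract.outer(a, b) ** 2 / (2 * width ** 2))

def surprise(x_new, y_new, X, y, noise=0.01):
    """Surprise of a datum = negative log predictive likelihood under a
    Gaussian-process view of kernel regression. High surprise means the
    datum is informative; near-zero or negative means it is redundant."""
    K = rbf(X, X) + noise * np.eye(len(X))
    k = rbf(X, np.array([x_new]))[:, 0]
    mean = k @ np.linalg.solve(K, y)                  # predictive mean
    var = 1.0 + noise - k @ np.linalg.solve(K, k)     # predictive variance
    return 0.5 * (np.log(2 * np.pi * var) + (y_new - mean) ** 2 / var)

rng = np.random.default_rng(4)
X = np.linspace(-2, 2, 30)
y = np.sin(X) + 0.05 * rng.standard_normal(30)

print(surprise(0.5, np.sin(0.5), X, y))   # redundant datum: low surprise
print(surprise(5.0, np.sin(5.0), X, y))   # novel region: elevated surprise
print(surprise(0.5, 3.0, X, y))           # abnormal datum: very high surprise
```

A sparsification rule then learns only data whose surprise exceeds a threshold, which is how the time and space complexity of the filter is kept down.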
Yao, Ruan; Li-Ying, Wang; Ting-Jun, Zhu; Men-Bao, Qian; Chun-Li, Cao; Yu-Wan, Hao; Tian, Tian; Shi-Zhu, Li
2017-03-01
To assess the theoretical knowledge and practical skills of parasitic diseases among technicians from disease control and prevention institutions. The Assessment on National Parasitic Disease Control and Prevention Techniques was organized in September, 2015. In total, 124 subjects from disease control and prevention institutions at the province, prefecture or county level in 31 provinces joined the assessment. A database was built consisting of subjects' basic information and assessment scores. Statistical analysis was used to analyze the scores by gender, age, professional title, institutions and places of participants. The average total score of all the subjects was 123.3, with a passing rate of 57.3%. The average scores of male subjects (48 subjects) and female subjects (76 subjects) were 125.9 and 121.7 respectively; the average scores of the subjects aged under 30 years (57 subjects), between 30 and 40 years (61 subjects) and above 40 years (6 subjects) were 119.6, 128.1 and 111.2 respectively; the average scores of persons with junior (94 subjects), intermediate (28 subjects) and senior (2 subjects) professional titles were 119.2, 135.9 and 140.5 respectively. The average theoretical assessment score of all the subjects was 61.9, with a passing rate of 62.9%. The average practical skill assessment score of all the subjects was 61.4, with a passing rate of 58.1%. The theoretical assessment results ranged widely. The theoretical knowledge of technicians from disease control and prevention institutions is low in general. Therefore, specific training based on daily work needs to be enhanced.
ERIC Educational Resources Information Center
Prado, Javier Calzada; Marzal, Miguel Angel
2013-01-01
Introduction: The role of library and information science professionals as knowledge facilitators is solidly grounded in the profession's theoretical foundations as much as connected with its social relevance. Knowledge science is presented in this paper as a convenient theoretical framework for this mission, and knowledge engagement…
Bell, Nikki; Vaughan, Nicholas P; Morris, Len; Griffin, Peter
2012-04-01
Few studies have assessed respiratory protective equipment (RPE) failures at the organizational level despite evidence to suggest that compliance with good practice may be low. The aim of this study was to develop an understanding of what current RPE programmes look like across industry and how this compares with good practice. Twenty cross-industry site visits were conducted with companies that had RPE programmes in place. Visits involved management interviews to explore current RPE systems and procedures and the decision making underpinning these. Observations of RPE operatives were included followed by short interviews to discuss the behaviours observed. Post-site assessments jointly undertaken by an RPE scientist and psychologist produced ratings for each site on six critical aspects of RPE programmes (knowledge/awareness, selection, use, training/information, supervision, and storage/cleaning/maintenance). Overall ratings for theoretical competence (i.e. management knowledge of RPE) and practical control (i.e. actual RPE practice on the shop floor) were also given. Qualitative analysis was performed on all interview data. The performance of RPE programmes varied across industry. Fewer than half the companies visited were considered to have an acceptable level of theoretical competence and practical control. Four distinct groups emerged from the 20 sites studied, ranging from Learners (low theoretical competence and practical control--four sites), Developers (acceptable theoretical competence and low practical control--five sites), and Fortuitous (low theoretical competence and acceptable practical control--two sites), to Proficient (acceptable theoretical competence and practical control--nine sites). None of the companies visited were achieving optimal control through the use of RPE. Widespread inadequacies were found with programme implementation, particularly training, supervision, and maintenance. 
Our taxonomy based on the four groups (Learners, Developers, Fortuitous, and Proficient) provided a useful expert-informed tool for explaining the variation in performance of RPE programmes across industry. Although further research and development are required, this taxonomy offers a useful starting point for the development of practical tools that may assist managers in making the much-needed improvements to all facets of programme implementation, particularly training, supervision, and maintenance.
SpectralNET – an application for spectral graph analysis and visualization
Forman, Joshua J; Clemons, Paul A; Schreiber, Stuart L; Haggarty, Stephen J
2005-01-01
Background Graph theory provides a computational framework for modeling a variety of datasets including those emerging from genomics, proteomics, and chemical genetics. Networks of genes, proteins, small molecules, or other objects of study can be represented as graphs of nodes (vertices) and interactions (edges) that can carry different weights. SpectralNET is a flexible application for analyzing and visualizing these biological and chemical networks. Results Available both as a standalone .NET executable and as an ASP.NET web application, SpectralNET was designed specifically with the analysis of graph-theoretic metrics in mind, a computational task not easily accessible using currently available applications. Users can choose either to upload a network for analysis using a variety of input formats, or to have SpectralNET generate an idealized random network for comparison to a real-world dataset. Whichever graph-generation method is used, SpectralNET displays detailed information about each connected component of the graph, including graphs of degree distribution, clustering coefficient by degree, and average distance by degree. In addition, extensive information about the selected vertex is shown, including degree, clustering coefficient, various distance metrics, and the corresponding components of the adjacency, Laplacian, and normalized Laplacian eigenvectors. SpectralNET also displays several graph visualizations, including a linear dimensionality reduction for uploaded datasets (Principal Components Analysis) and a non-linear dimensionality reduction that provides an elegant view of global graph structure (Laplacian eigenvectors). Conclusion SpectralNET provides an easily accessible means of analyzing graph-theoretic metrics for data modeling and dimensionality reduction. SpectralNET is publicly available as both a .NET application and an ASP.NET web application from http://chembank.broad.harvard.edu/resources/. Source code is available upon request. PMID:16236170
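The per-vertex metrics SpectralNET reports (degree, clustering coefficient) and the Laplacian matrix whose eigenvectors drive its spectral layout are standard graph-theoretic quantities. As an illustrative sketch in plain Python (not SpectralNET's .NET code), they can be computed for a small adjacency matrix as follows:

```python
# Sketch: degree, local clustering coefficient, and graph Laplacian
# for a small undirected network (illustrative; not SpectralNET code).

def degree(adj, v):
    return sum(adj[v])

def clustering_coefficient(adj, v):
    """Fraction of pairs of v's neighbours that are themselves linked."""
    nbrs = [u for u, a in enumerate(adj[v]) if a]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(adj[u][w] for i, u in enumerate(nbrs) for w in nbrs[i + 1:])
    return 2.0 * links / (k * (k - 1))

def laplacian(adj):
    """L = D - A, whose eigenvectors give the spectral embedding."""
    n = len(adj)
    return [[degree(adj, i) if i == j else -adj[i][j] for j in range(n)]
            for i in range(n)]

# Toy network: triangle 0-1-2 plus a pendant vertex 3 attached to 0.
A = [[0, 1, 1, 1],
     [1, 0, 1, 0],
     [1, 1, 0, 0],
     [1, 0, 0, 0]]
```

Vertex 1's two neighbours (0 and 2) are linked, so its clustering coefficient is 1.0, while vertex 0's is 1/3; each row of the Laplacian sums to zero.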
SpectralNET--an application for spectral graph analysis and visualization.
Forman, Joshua J; Clemons, Paul A; Schreiber, Stuart L; Haggarty, Stephen J
2005-10-19
Graph theory provides a computational framework for modeling a variety of datasets including those emerging from genomics, proteomics, and chemical genetics. Networks of genes, proteins, small molecules, or other objects of study can be represented as graphs of nodes (vertices) and interactions (edges) that can carry different weights. SpectralNET is a flexible application for analyzing and visualizing these biological and chemical networks. Available both as a standalone .NET executable and as an ASP.NET web application, SpectralNET was designed specifically with the analysis of graph-theoretic metrics in mind, a computational task not easily accessible using currently available applications. Users can choose either to upload a network for analysis using a variety of input formats, or to have SpectralNET generate an idealized random network for comparison to a real-world dataset. Whichever graph-generation method is used, SpectralNET displays detailed information about each connected component of the graph, including graphs of degree distribution, clustering coefficient by degree, and average distance by degree. In addition, extensive information about the selected vertex is shown, including degree, clustering coefficient, various distance metrics, and the corresponding components of the adjacency, Laplacian, and normalized Laplacian eigenvectors. SpectralNET also displays several graph visualizations, including a linear dimensionality reduction for uploaded datasets (Principal Components Analysis) and a non-linear dimensionality reduction that provides an elegant view of global graph structure (Laplacian eigenvectors). SpectralNET provides an easily accessible means of analyzing graph-theoretic metrics for data modeling and dimensionality reduction. SpectralNET is publicly available as both a .NET application and an ASP.NET web application from http://chembank.broad.harvard.edu/resources/. Source code is available upon request.
Information Theoretic Studies and Assessment of Space Object Identification
2014-03-24
localization are contained in Ref. [5]. 1.7.1 A Bayesian MPE Based Analysis of 2D Point-Source-Pair Superresolution. In a second recently submitted paper [6], a ... related problem of the optical superresolution (OSR) of a pair of equal-brightness point sources separated spatially by a distance (or angle) smaller... 5. arXiv:1403.4897 [physics.optics] (19 March 2014). 6. S. Prasad, “Asymptotics of Bayesian error probability and 2D pair superresolution,” submitted to Opt. Express.
Development of the Environmental Technical Information System
1975-04-01
official Department of the Army position, unless so designated by other authorized documents. ... regulations that may concern the Army. CHLUS is complete for six states and for areas of federal jurisdiction, and data for another ten states are... Contents: Theoretical Analysis; Isolating the Export Industries: Direct Methods; Description of the Models; Economic Impact on Local Businesses; Change in
Linear Quantum Systems: Non-Classical States and Robust Stability
2016-06-29
has a history going back some 50 years, to the birth of modern control theory with Kalman’s foundational work on filtering and LQG optimal control. ... analysis and control of quantum linear systems and their interactions with non-classical quantum fields by developing control theoretic concepts exploiting
NASA Astrophysics Data System (ADS)
Ercan, İlke; Suyabatmaz, Enes
2018-06-01
The saturation in the efficiency and performance scaling of conventional electronic technologies brings about the development of novel computational paradigms. Brownian circuits are among the promising alternatives that can exploit fluctuations to increase the efficiency of information processing in nanocomputing. A Brownian cellular automaton, where signals propagate randomly and are driven by local transition rules, can be made computationally universal by embedding arbitrary asynchronous circuits on it. One of the potential realizations of such circuits is via single electron tunneling (SET) devices, since SET technology enables the simulation of noise and fluctuations in a fashion similar to Brownian search. In this paper, we perform a physical-information-theoretic analysis of the efficiency limitations of Brownian NAND and half-adder circuits implemented using SET technology. The method we employ here establishes a solid ground that enables studying computational and physical features of this emerging technology on an equal footing, and yields fundamental lower bounds that provide valuable insights into how far its efficiency can be improved in principle. In order to provide a basis for comparison, we also analyze a NAND gate and a half-adder circuit implemented in complementary metal-oxide-semiconductor (CMOS) technology to show how the fundamental bound of the Brownian circuit compares against a conventional paradigm.
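The paper's bounds come from a detailed physical-information-theoretic analysis of the actual circuits; as a much simpler illustration of the same flavour of reasoning, a Landauer-type lower bound on the dissipation of a logically irreversible NAND gate can be sketched as follows (the uniform-input assumption and the temperature are mine, not the paper's):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def landauer_bound(h_in_bits, h_out_bits, T=300.0):
    """Minimum dissipation (J) when a gate maps h_in bits of input
    entropy onto h_out bits of output entropy (Landauer-type bound)."""
    erased = max(h_in_bits - h_out_bits, 0.0)
    return k_B * T * math.log(2) * erased

# NAND with uniformly random inputs: 2 bits in; the output is 0 with
# probability 1/4 and 1 with probability 3/4, so ~1.19 bits are erased.
h_in = 2.0
h_out = shannon_entropy([0.25, 0.75])
E_min = landauer_bound(h_in, h_out)  # a few zeptojoules at 300 K
```

The point of the sketch is only the direction of the argument: irreversible logic has an entropy deficit that translates into a temperature-dependent minimum heat cost, which is the kind of fundamental lower bound the paper derives rigorously for the SET and CMOS implementations.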
Efficient retrieval of landscape Hessian: Forced optimal covariance adaptive learning
NASA Astrophysics Data System (ADS)
Shir, Ofer M.; Roslund, Jonathan; Whitley, Darrell; Rabitz, Herschel
2014-06-01
Knowledge of the Hessian matrix at the landscape optimum of a controlled physical observable offers valuable information about the system robustness to control noise. The Hessian can also assist in physical landscape characterization, which is of particular interest in quantum system control experiments. The recently developed landscape theoretical analysis motivated the development of an automated method to learn the Hessian matrix about the global optimum without derivative measurements from noisy data. The current study introduces the forced optimal covariance adaptive learning (FOCAL) technique for this purpose. FOCAL relies on the covariance matrix adaptation evolution strategy (CMA-ES) that exploits covariance information amongst the control variables by means of principal component analysis. The FOCAL technique is designed to operate with experimental optimization, generally involving continuous high-dimensional search landscapes (dimension ≳30) with large Hessian condition numbers (≳10⁴). This paper introduces the theoretical foundations of the inverse relationship between the covariance learned by the evolution strategy and the actual Hessian matrix of the landscape. FOCAL is presented and demonstrated to retrieve the Hessian matrix with high fidelity on both model landscapes and quantum control experiments, which are observed to possess nonseparable, nonquadratic search landscapes. The recovered Hessian forms were corroborated by physical knowledge of the systems. The implications of FOCAL extend beyond the investigated studies to potentially cover other physically motivated multivariate landscapes.
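The inverse covariance-Hessian relationship the paper formalizes can be illustrated on a toy quadratic landscape: points distributed as exp(-J) with J(x) = ½ xᵀHx have covariance C = H⁻¹, so inverting a learned covariance recovers the Hessian. The following sketch (a hypothetical 2-D example, not the FOCAL implementation) demonstrates this:

```python
import math, random

# Toy check of the FOCAL premise: sample x ~ exp(-0.5 x^T H x),
# estimate the sample covariance, and invert it to recover H.

H = [[3.0, 1.0],
     [1.0, 2.0]]                       # "true" Hessian (hypothetical)

det = H[0][0] * H[1][1] - H[0][1] * H[1][0]
C = [[ H[1][1] / det, -H[0][1] / det],  # C = H^{-1}
     [-H[1][0] / det,  H[0][0] / det]]

# Cholesky factor L of C (C = L L^T), used to sample x = L z.
l11 = math.sqrt(C[0][0])
l21 = C[1][0] / l11
l22 = math.sqrt(C[1][1] - l21 * l21)

random.seed(0)
N = 100_000
xs = []
for _ in range(N):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    xs.append((l11 * z1, l21 * z1 + l22 * z2))

# Sample covariance, then its inverse as the Hessian estimate.
m1 = sum(x for x, _ in xs) / N
m2 = sum(y for _, y in xs) / N
c11 = sum((x - m1) ** 2 for x, _ in xs) / N
c22 = sum((y - m2) ** 2 for _, y in xs) / N
c12 = sum((x - m1) * (y - m2) for x, y in xs) / N
d = c11 * c22 - c12 * c12
H_est = [[ c22 / d, -c12 / d],
         [-c12 / d,  c11 / d]]        # close to H for large N
```

FOCAL's contribution is making this inversion work when the covariance is learned adaptively by CMA-ES on noisy, non-quadratic experimental landscapes, which the toy example deliberately sidesteps.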
Information Dynamics and Aspects of Musical Perception
NASA Astrophysics Data System (ADS)
Dubnov, Shlomo
Musical experience has often been suggested to be related to the forming of expectations and their fulfillment or denial. In terms of information theory, expectancies and predictions serve to reduce uncertainty about the future and might be used to efficiently represent and "compress" data. In this chapter we present an information theoretic model of musical listening based on the idea that expectations arising from past musical material frame our appraisal of what comes next, and that this process eventually results in the creation of emotions or feelings. Using a notion of "information rate" we can measure the amount of information between past and present in the musical signal on different time scales using statistics of sound spectral features. Several musical pieces are analyzed in terms of short and long term information rate dynamics and are compared to analysis of musical form and its structural functions. The findings suggest that a relation exists between information dynamics and musical structure that eventually leads to the creation of the human listening experience and feelings such as "wow" and "aha".
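A minimal, assumption-laden sketch of the "information rate" idea: for a symbolic sequence, the mutual information between the previous and current symbol separates a predictable pattern from a less predictable one. The chapter works with continuous spectral features over multiple time scales; the discrete first-order version below is my simplification:

```python
import math
from collections import Counter

# Toy "information rate": I(past; present) = H(present) - H(present|past),
# estimated from bigram counts of a symbolic sequence.

def entropy(counts):
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def information_rate(seq):
    unigrams = Counter(seq[1:])
    bigrams = Counter(zip(seq, seq[1:]))
    contexts = Counter(seq[:-1])
    h_present = entropy(unigrams)
    # Conditional entropy H(present | previous symbol).
    h_cond = 0.0
    n = len(seq) - 1
    for (a, b), c in bigrams.items():
        p_ab = c / n
        p_b_given_a = c / contexts[a]
        h_cond -= p_ab * math.log2(p_b_given_a)
    return h_present - h_cond

predictable = "abababababababab"   # past fully determines present
random_like = "abbabaabbaababba"   # past tells us little
```

The predictable sequence yields an information rate near 1 bit (expectations are maximally informative), the irregular one near 0, mirroring the chapter's claim that high information rate marks passages where the past strongly shapes the appraisal of what comes next.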
Chahine, Saad; Cristancho, Sayra; Padgett, Jessica; Lingard, Lorelei
2017-06-01
In the competency-based medical education (CBME) approach, clinical competency committees are responsible for making decisions about trainees' competence. However, we currently lack a theoretical model for group decision-making to inform this emerging assessment phenomenon. This paper proposes an organizing framework to study and guide the decision-making processes of clinical competency committees. This is an explanatory, non-exhaustive review, tailored to identify relevant theoretical and evidence-based papers related to small group decision-making. The search was conducted using Google Scholar, Web of Science, MEDLINE, ERIC, and PsycINFO for relevant literature. Using a thematic analysis, two researchers (SC & JP) met four times between April-June 2016 to consolidate the literature included in this review. Three theoretical orientations towards group decision-making emerged from the review: schema, constructivist, and social influence. Schema orientations focus on how groups use algorithms for decision-making. Constructivist orientations focus on how groups construct their shared understanding. Social influence orientations focus on how individual members influence the group's perspective on a decision. Moderators of decision-making relevant to all orientations include: guidelines, stressors, authority, and leadership. Clinical competency committees are the mechanisms by which groups of clinicians will be in charge of interpreting multiple assessment data points and coming to a shared decision about trainee competence. The way in which these committees make decisions can have huge implications for trainee progression and, ultimately, patient care. Therefore, there is a pressing need to build the science of how such group decision-making works in practice. This synthesis suggests a preliminary organizing framework that can be used in the implementation and study of clinical competency committees.
NASA Astrophysics Data System (ADS)
Feng, Ximeng; Li, Gang; Yu, Haixia; Wang, Shaohui; Yi, Xiaoqing; Lin, Ling
2018-03-01
Noninvasive blood component analysis by spectroscopy has been a hotspot in biomedical engineering in recent years. Dynamic spectrum provides an excellent idea for noninvasive blood component measurement, but studies have been limited to the application of broadband light sources and high-resolution spectroscopy instruments. In order to remove redundant information, a more effective wavelength selection method is presented in this paper. In contrast to many common wavelength selection methods, this method is based on the sensing mechanism, giving it a clear physical rationale and enabling it to effectively avoid noise from the acquisition system. The spectral difference coefficient was theoretically proved to have guiding significance for wavelength selection. After theoretical analysis, the multi-band spectral difference coefficient-wavelength selection method combined with the dynamic spectrum was proposed. An experimental analysis based on clinical trial data from 200 volunteers has been conducted to illustrate the effectiveness of this method. The extreme learning machine was used to develop the calibration models between the dynamic spectrum data and hemoglobin concentration. The experiment result shows that the prediction precision of hemoglobin concentration using the multi-band spectral difference coefficient-wavelength selection method is higher compared with other methods.
[Health community agent: subject of the buccal health practice in Alagoinhas, Bahia state].
Rodrigues, Ana Aurea Alécio de Oliveira; Santos, Adriano Maia Dos; Assis, Marluce Maria Araújo
2010-05-01
This study examines the micropolitics of the work carried out by the Oral Health Team (ESB) in the Family Health Program (PSF) of Alagoinhas, Bahia State, with a central theoretical focus on the specific and singular forms that daily work practice assumes through the use of hard, light-hard, and light technologies. The methodological trajectory is grounded in the historical-social current, adopting a dialectic approach of a qualitative nature. The data collection techniques used were semi-structured interviews, observation of the work process, and documental analysis. Data analysis was oriented by hermeneutic dialectics, allowing comparison of the different levels of analysis and articulating the theoretical with the empirical evidence. The results reveal that the Family Health Teams are multidisciplinary but have not yet developed interdisciplinary work, so a juxtaposition of skills occurs. Each unit plans its work process according to the singularities of the social subjects, implementing different ways of welcoming, informing, attending, and referring. An effort to change the work process can be perceived from the perspective of the amplified clinic, with the community health agent standing out as a social/collective subject.
The role of ecological dynamics in analysing performance in team sports.
Vilar, Luís; Araújo, Duarte; Davids, Keith; Button, Chris
2012-01-01
Performance analysis is a subdiscipline of sports sciences and one approach, notational analysis, has been used to objectively audit and describe behaviours of performers during different subphases of play, providing additional information for practitioners to improve future sports performance. Recent criticisms of these methods have suggested the need for a sound theoretical rationale to explain performance behaviours, not just describe them. The aim of this article was to show how ecological dynamics provides a valid theoretical explanation of performance in team sports by explaining the formation of successful and unsuccessful patterns of play, based on symmetry-breaking processes emerging from functional interactions between players and the performance environment. We offer the view that ecological dynamics is an upgrade to more operational methods of performance analysis that merely document statistics of competitive performance. In support of our arguments, we refer to exemplar data on competitive performance in team sports that have revealed functional interpersonal interactions between attackers and defenders, based on variations in the spatial positioning of performers relative to each other in critical performance areas, such as the scoring zones. Implications of this perspective are also considered for practice task design and sport development programmes.
A Meta-Analysis of Serious Digital Games for Healthy Lifestyle Promotion
DeSmet, Ann; Van Ryckeghem, Dimitri; Compernolle, Sofie; Baranowski, Tom; Thompson, Debbe; Crombez, Geert; Poels, Karolien; Van Lippevelde, Wendy; Bastiaensens, Sara; Van Cleemput, Katrien; Vandebosch, Heidi; De Bourdeaudhuij, Ilse
2015-01-01
Several systematic reviews have described health-promoting effects of serious games but so far no meta-analysis has been reported. This paper presents a meta-analysis of 54 serious digital game studies for healthy lifestyle promotion, in which we investigated the overall effectiveness of serious digital games on healthy lifestyle promotion outcomes and the role of theoretically and clinically important moderators. Findings showed serious games have small positive effects on healthy lifestyles (g=0.260, 95% CI 0.148; 0.373) and their determinants (g=0.334, 95% CI 0.260; 0.407), especially for knowledge. Effects on clinical outcomes were significant, but much smaller (g=0.079, 95% CI 0.038; 0.120). Long-term effects were maintained for all outcomes except for behavior. Serious games are best individually tailored to both socio-demographic and change need information, and benefit from a strong focus on game theories or a dual theoretical foundation in both behavioral prediction and game theories. They can be effective either as stand-alone or as part of multi-component programs, and appeal to populations regardless of age and gender. Given that effects of games remain heterogeneous, further exploration of which game features create larger effects is needed. PMID:25172024
Hershberger, Patricia E; Finnegan, Lorna; Altfeld, Susan; Lake, Sara; Hirshfeld-Cytron, Jennifer
2013-01-01
Young women with cancer now face the complex decision about whether to undergo fertility preservation. Yet little is known about how these women process information involved in making this decision. The purpose of this article is to expand theoretical understanding of the decision-making process by examining aspects of information processing among young women diagnosed with cancer. Using a grounded theory approach, 27 women with cancer participated in individual, semistructured interviews. Data were coded and analyzed using constant-comparison techniques that were guided by 5 dimensions within the Contemplate phase of the decision-making process framework. In the first dimension, young women acquired information primarily from clinicians and Internet sources. Experiential information, often obtained from peers, occurred in the second dimension. Preferences and values were constructed in the third dimension as women acquired factual, moral, and ethical information. Women desired tailored, personalized information that was specific to their situation in the fourth dimension; however, women struggled with communicating these needs to clinicians. In the fifth dimension, women offered detailed descriptions of clinician behaviors that enhance or impede decisional debriefing. Better understanding of theoretical underpinnings surrounding women's information processes can facilitate decision support and improve clinical care.
Glynn, Liam G; Glynn, Fergus; Casey, Monica; Wilkinson, Louise Gaffney; Hayes, Patrick S; Heaney, David; Murphy, Andrew W M
2018-05-02
Problematic translational gaps continue to exist between demonstrating the positive impact of healthcare interventions in research settings and their implementation into routine daily practice. The aim of this qualitative evaluation of the SMART MOVE trial was to conduct a theoretically informed analysis, using normalisation process theory, of the potential barriers and levers to the implementation of an mHealth intervention to promote physical activity in primary care. The study took place in the West of Ireland with recruitment in the community from the Clare Primary Care Network. SMART MOVE trial participants and the staff from four primary care centres were invited to take part and all agreed to do so. A qualitative methodology combining focus groups (general practitioners, practice nurses and non-clinical staff from four separate primary care centres, n = 14) and individual semi-structured interviews (intervention and control SMART MOVE trial participants, n = 4), with purposeful sampling and the principles of Framework Analysis, was employed. Normalisation process theory was used to develop the topic guide for the interviews and also informed the data analysis process. Four themes emerged from the analysis: personal and professional exercise strategies; roles and responsibilities to support active engagement; utilisation challenges; and evaluation, adoption and adherence. It was evident that introducing a new healthcare intervention demands a comprehensive evaluation of the intervention itself and also the environment in which it is to operate. Despite certain obstacles, the opportunity exists for the successful implementation of a novel healthcare intervention that addresses a hitherto unresolved healthcare need, provided that the intervention has strong usability attributes for both disseminators and target users and coheres strongly with the core objectives and culture of the health care environment in which it is to operate.
We carried out a theoretical analysis of stakeholder informed barriers and levers to the implementation of a novel exercise promotion tool in the Irish primary care setting. We believe that this process amplifies the implementation potential of such an intervention in primary care. The SMART MOVE trial is registered at Current Controlled Trials (ISRCTN99944116; Date of registration: 1st August 2012).
Role of information theoretic uncertainty relations in quantum theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jizba, Petr, E-mail: p.jizba@fjfi.cvut.cz; ITP, Freie Universität Berlin, Arnimallee 14, D-14195 Berlin; Dunningham, Jacob A., E-mail: J.Dunningham@sussex.ac.uk
2015-04-15
Uncertainty relations based on information theory for both discrete and continuous distribution functions are briefly reviewed. We extend these results to account for (differential) Rényi entropy and its related entropy power. This allows us to find a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics is illustrated with a simple two-energy-level model where they outperform both the usual Robertson–Schrödinger uncertainty relation and Shannon entropy based uncertainty relation. In the continuous case the ensuing entropy power uncertainty relations are discussed in the context of heavy tailed wave functions and Schrödinger cat states. Again, improvement over both the Robertson–Schrödinger uncertainty principle and Shannon ITUR is demonstrated in these cases. Further salient issues such as the proof of a generalized entropy power inequality and a geometric picture of information-theoretic uncertainty relations are also discussed.
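The paper's Rényi-entropy ITURs are beyond a short snippet, but the Shannon special case they generalize can be illustrated numerically: for a qubit measured in two mutually unbiased bases, the Maassen-Uffink relation gives H(Z) + H(X) >= 1 bit for every state. The following check (my toy example over a real-amplitude state family, not the paper's two-level model) verifies the bound and its saturation:

```python
import math

# Toy check of the Shannon entropic uncertainty relation for a qubit:
# computational (Z) and Hadamard (X) bases are mutually unbiased, so
# the Maassen-Uffink bound is H(Z) + H(X) >= -2*log2(1/sqrt(2)) = 1 bit.

def H(ps):
    """Shannon entropy in bits, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in ps if p > 1e-12)

def qubit_entropies(theta):
    a, b = math.cos(theta), math.sin(theta)    # state a|0> + b|1>
    pz = [a * a, b * b]                        # Z-basis probabilities
    px = [((a + b) ** 2) / 2, ((a - b) ** 2) / 2]  # X-basis probabilities
    return H(pz), H(px)

bound = 1.0
# Sweep states; the minimum entropy sum saturates the bound at
# basis states (H(Z)=0, H(X)=1) and never dips below it.
worst = min(sum(qubit_entropies(k * math.pi / 200)) for k in range(200))
```

The same sweep with Rényi entropies of conjugate orders is the direction the paper takes, where the generalized relations can become tighter than this Shannon bound.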
Flannery, C; McHugh, S; Anaba, A E; Clifford, E; O'Riordan, M; Kenny, L C; McAuliffe, F M; Kearney, P M; Byrne, M
2018-05-21
Obesity during pregnancy is associated with increased risk of gestational diabetes mellitus (GDM) and other complications. Physical activity is a modifiable lifestyle factor that may help to prevent these complications but many women reduce their physical activity levels during pregnancy. Interventions targeting physical activity in pregnancy are on-going but few identify the underlying behaviour change mechanisms by which the intervention is expected to work. To enhance intervention effectiveness, recent tools in behavioural science such as the Theoretical Domains Framework (TDF) and COM-B model (capability, opportunity, motivation and behaviour) have been employed to understand behaviours for intervention development. Using these behaviour change methods, this study aimed to identify the enablers and barriers to physical activity in overweight and obese pregnant women. Semi-structured interviews were conducted with a purposive sample of overweight and obese women at different stages of pregnancy attending a public antenatal clinic in a large academic maternity hospital in Cork, Ireland. Interviews were recorded and transcribed into NVivo V.10 software. Data analysis followed the framework approach, drawing on the TDF and the COM-B model. Twenty-one themes were identified and these mapped directly on to the COM-B model of behaviour change and ten of the TDF domains. Having the social opportunity to engage in physical activity was identified as an enabler; pregnant women suggested being active was easier when supported by their partners. Knowledge was a commonly reported barrier with women lacking information on safe activities during pregnancy and describing the information received from their midwife as 'limited'. Having the physical capability and physical opportunity to carry out physical activity were also identified as barriers; experiencing pain, a lack of time, having other children, and working prevented women from being active.
A wide range of barriers and enablers were identified which influenced women's capability, motivation and opportunity to engage in physical activity, with "knowledge" as the most commonly reported barrier. This study is a theoretical starting point in making a 'behavioural diagnosis' and the results will be used to inform the development of an intervention to increase physical activity levels among overweight and obese pregnant women.
A Theoretical Sketch of Medical Professionalism as a Normative Complex
ERIC Educational Resources Information Center
Holtman, Matthew C.
2008-01-01
Validity arguments for assessment tools intended to measure medical professionalism suffer for lack of a clear theoretical statement of what professionalism is and how it should behave. Drawing on several decades of field research addressing deviance and informal social control among physicians, a theoretical sketch of professionalism is presented…
The use of information theory for the evaluation of biomarkers of aging and physiological age.
Blokh, David; Stambler, Ilia
2017-04-01
The present work explores the application of information-theoretical measures, such as entropy and normalized mutual information, to research on biomarkers of aging. The use of information theory affords unique methodological advantages for the study of aging processes, as it allows evaluating non-linear relations between biological parameters, providing the precise quantitative strength of those relations, both for individual and multiple parameters, showing cumulative or synergistic effect. Here we illustrate those capabilities utilizing a dataset on heart disease, including diagnostic parameters routinely available to physicians. The use of information-theoretical methods, utilizing normalized mutual information, revealed the exact amount of information that various diagnostic parameters or their combinations contained about the persons' age. Based on those exact informative values for the correlation of measured parameters with age, we constructed a diagnostic rule (a decision tree) to evaluate physiological age, as compared to chronological age. The present data illustrated that younger subjects suffering from heart disease showed characteristics of people of higher age (higher physiological age). Utilizing information-theoretical measures, with additional data, it may be possible to create further clinically applicable information-theory-based markers and models for the evaluation of physiological age, its relation to age-related diseases and its potential modifications by therapeutic interventions. Copyright © 2017 Elsevier B.V. All rights reserved.
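As an illustration of the normalized-mutual-information scoring described above, the quantity U(age; marker) = I(age; marker) / H(age) measures what fraction of the uncertainty about age group a marker removes. The data below are hypothetical toy labels, not the paper's heart-disease dataset:

```python
import math
from collections import Counter

# Toy sketch: normalized mutual information between a discretized
# diagnostic marker and an age group (hypothetical data).

def entropy(labels):
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def normalized_mi(xs, ys):
    """I(X;Y) / H(Y): fraction of H(Y) explained by X."""
    hx, hy = entropy(xs), entropy(ys)
    hxy = entropy(list(zip(xs, ys)))
    mi = hx + hy - hxy
    return mi / hy if hy > 0 else 0.0

age_group = ["young", "young", "young", "old", "old", "old"]
marker_hi = ["lo", "lo", "lo", "hi", "hi", "hi"]   # tracks age perfectly
marker_no = ["lo", "hi", "lo", "hi", "lo", "hi"]   # carries no age signal
```

A perfectly informative marker scores 1.0 and an unrelated one scores near 0; ranking real markers by this score is what lets the paper pick splits for its physiological-age decision tree.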
2011-01-01
Background Tobacco use adversely affects oral health. Clinical guidelines recommend that dental providers promote tobacco abstinence and provide patients who use tobacco with brief tobacco use cessation counselling. Research shows that these guidelines are seldom implemented, however. To improve guideline adherence and to develop effective interventions, it is essential to understand provider behaviour and challenges to implementation. This study aimed to develop a theoretically informed measure for assessing dental providers' implementation difficulties related to tobacco use prevention and cessation (TUPAC) counselling guidelines, to evaluate those difficulties among a sample of dental providers, and to investigate a possible underlying structure of applied theoretical domains. Methods A 35-item questionnaire was developed based on key theoretical domains relevant to the implementation behaviours of healthcare providers. Specific items were drawn mostly from the literature on TUPAC counselling studies of healthcare providers. The data were collected from dentists (n = 73) and dental hygienists (n = 22) in 36 dental clinics in Finland using a web-based survey. Of the 95 providers, 73 participated (76.8%). We used Cronbach's alpha to ascertain the internal consistency of the questionnaire. Mean domain scores were calculated to assess different aspects of implementation difficulties and exploratory factor analysis to assess the theoretical domain structure. The authors agreed on the labels assigned to the factors on the basis of their component domains and the broader behavioural and theoretical literature. Results Internal consistency values for theoretical domains varied from 0.50 ('emotion') to 0.71 ('environmental context and resources'). The domain environmental context and resources had the lowest mean score (21.3%; 95% confidence interval [CI], 17.2 to 25.4) and was identified as a potential implementation difficulty.
The domain emotion provided the highest mean score (60%; 95% CI, 55.0 to 65.0). Three factors were extracted that explain 70.8% of the variance: motivation (47.6% of variance, α = 0.86), capability (13.3% of variance, α = 0.83), and opportunity (10.0% of variance, α = 0.71). Conclusions This study demonstrated a theoretically informed approach to identifying possible implementation difficulties in TUPAC counselling among dental providers. This approach provides a method for moving from diagnosing implementation difficulties to designing and evaluating interventions. PMID:21615948
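Cronbach's alpha, used above to ascertain internal consistency, can be computed directly from item scores as alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). The 4-item, 5-respondent data below are hypothetical, not the study's questionnaire responses:

```python
# Sketch: Cronbach's alpha for a domain of k items scored per respondent
# (hypothetical Likert-style data; not the TUPAC questionnaire).

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

def cronbach_alpha(items):
    """items: one inner list per item, each holding respondents' scores."""
    k = len(items)
    totals = [sum(col) for col in zip(*items)]   # per-respondent total score
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

items = [[3, 4, 2, 5, 4],
         [3, 5, 2, 4, 4],
         [2, 4, 3, 5, 3],
         [3, 4, 2, 5, 5]]
alpha = cronbach_alpha(items)   # high: the items move together
```

The study's domain alphas of 0.50-0.71 would correspond to items that covary far less tightly than this deliberately consistent toy set.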
Detailed Analysis of the Interoccurrence Time Statistics in Seismic Activity
NASA Astrophysics Data System (ADS)
Tanaka, Hiroki; Aizawa, Yoji
2017-02-01
The interoccurrence time statistics of seismicity is studied theoretically as well as numerically by taking into account the conditional probability and the correlations among many earthquakes in different magnitude levels. It is known so far that the interoccurrence time statistics is well approximated by the Weibull distribution, but more detailed information about the interoccurrence times can be obtained from the analysis of the conditional probability. Firstly, we propose the Embedding Equation Theory (EET), where the conditional probability is described by two kinds of correlation coefficients; one is the magnitude correlation and the other is the inter-event time correlation. Furthermore, the scaling law of each correlation coefficient is clearly determined from the numerical data analysis carried out with the Preliminary Determination of Epicenter (PDE) Catalog and the Japan Meteorological Agency (JMA) Catalog. Secondly, the EET is examined to derive the magnitude dependence of the interoccurrence time statistics and the multi-fractal relation is successfully formulated. Theoretically we cannot prove the universality of the multi-fractal relation in seismic activity; nevertheless, the theoretical results well reproduce all numerical data in our analysis, where several common features or invariant aspects are clearly observed. Especially in the case of stationary ensembles the multi-fractal relation seems to obey an invariant curve, and furthermore in the case of non-stationary (moving time) ensembles for the aftershock regime the multi-fractal relation seems to satisfy a certain invariant curve at any moving time. It is emphasized that the multi-fractal relation plays an important role to unify the statistical laws of seismicity: actually the Gutenberg-Richter law and the Weibull distribution are unified in the multi-fractal relation, and some universality conjectures regarding the seismicity are briefly discussed.
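The Weibull approximation mentioned above can be checked on inter-event times with the standard probability-plot linearization: if F(t) = 1 - exp(-(t/λ)^k), then log(-log(1-F)) is linear in log t with slope k. The sketch below fits synthetic simulated times, not the PDE/JMA catalogues:

```python
import math, random

# Sketch: estimate Weibull shape k and scale lam from inter-event
# times via least squares on log(-log(1 - F)) vs log(t).

def fit_weibull(times):
    ts = sorted(times)
    n = len(ts)
    xs, ys = [], []
    for i, t in enumerate(ts):
        F = (i + 0.5) / n             # plotting-position empirical CDF
        xs.append(math.log(t))
        ys.append(math.log(-math.log(1.0 - F)))
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    k = sxy / sxx                     # slope = shape parameter
    lam = math.exp(mx - my / k)       # intercept gives the scale
    return k, lam

random.seed(1)
# Synthetic Weibull(k=0.8, lam=2.0) times via inverse-transform sampling.
sample = [2.0 * (-math.log(1.0 - random.random())) ** (1 / 0.8)
          for _ in range(5000)]
k_hat, lam_hat = fit_weibull(sample)
```

A shape k < 1, as recovered here, corresponds to the clustered (bursty) inter-event behaviour typical of seismicity; applying the same fit per magnitude level is the starting point for the magnitude dependence the EET formalizes.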
NASA Astrophysics Data System (ADS)
Tapia-Herrera, R.; Huerta-Lopez, C. I.; Martinez-Cruzado, J. A.
2009-05-01
Results of site characterization for an experimental site in the metropolitan area of Tijuana, B. C., Mexico are presented as part of on-going research in which time series of earthquakes, ambient noise, and induced vibrations were processed with three different methods: H/V spectral ratios, Spectral Analysis of Surface Waves (SASW), and the Random Decrement Method (RDM). Forward modeling using the wave propagation stiffness matrix method (Roësset and Kausel, 1981) was used to compute the theoretical SH/P and SV/P spectral ratios, and the experimental H/V spectral ratios were computed following the conventional concepts of Fourier analysis. The theoretical and experimental H/V spectral ratios were then compared. For the SASW method the theoretical dispersion curves were also computed and compared with the experimental ones, and finally the theoretical free-vibration decay curve was compared with the experimental one obtained with the RDM. All three methods were tested with ambient noise, induced vibrations, and earthquake signals. The experimental spectral ratios obtained with both ambient noise and earthquake signals agree quite well with the theoretical spectral ratios, particularly at the fundamental vibration frequency of the recording site. Differences between the fundamental vibration frequencies are evident for sites located on alluvial fill (~0.6 Hz) and sites located on conglomerate/sandstone fill (0.75 Hz). Shear wave velocities for the soft soil layers of the 4-layer discrete soil model range from as low as 100 m/s up to 280 m/s. The results with the SASW provided information that allows the identification of low-velocity layers not detected previously with traditional seismic methods. The damping estimates obtained with the RDM are within the expected values, and the dominant frequency of the system, also obtained with the RDM, agrees to within ±20% with the one obtained by means of the H/V spectral ratio.
Should the model for risk-informed regulation be game theory rather than decision theory?
Bier, Vicki M; Lin, Shi-Woei
2013-02-01
Risk analysts frequently view the regulation of risks as being largely a matter of decision theory. According to this view, risk analysis methods provide information on the likelihood and severity of various possible outcomes; this information should then be assessed using a decision-theoretic approach (such as cost/benefit analysis) to determine whether the risks are acceptable, and whether additional regulation is warranted. However, this view ignores the fact that in many industries (particularly industries that are technologically sophisticated and employ specialized risk and safety experts), risk analyses may be done by regulated firms, not by the regulator. Moreover, those firms may have more knowledge about the levels of safety at their own facilities than the regulator does. This creates a situation in which the regulated firm has both the opportunity, and often also the motive, to provide inaccurate (in particular, favorably biased) risk information to the regulator, and hence the regulator has reason to doubt the accuracy of the risk information provided by regulated parties. Researchers have argued that decision theory is capable of dealing with many such strategic interactions as well as game theory can. This is especially true in two-player, two-stage games in which the follower has a unique best strategy in response to the leader's strategy, as appears to be the case in the situation analyzed in this article. However, even in such cases, we agree with Cox that game-theoretic methods and concepts can still be useful. In particular, the tools of mechanism design, and especially the revelation principle, can simplify the analysis of such games because the revelation principle provides rigorous assurance that it is sufficient to analyze only games in which licensees truthfully report their risk levels, making the problem more manageable.
Without that, it would generally be necessary to consider much more complicated forms of strategic behavior (including deception), to identify optimal regulatory strategies. Therefore, we believe that the types of regulatory interactions analyzed in this article are better modeled using game theory rather than decision theory. In particular, the goals of this article are to review the relevant literature in game theory and regulatory economics (to stimulate interest in this area among risk analysts), and to present illustrative results showing how the application of game theory can provide useful insights into the theory and practice of risk-informed regulation. © 2012 Society for Risk Analysis.
Gonzalez Bernaldo de Quiros, Fernan; Dawidowski, Adriana R; Figar, Silvana
2017-02-01
In this study, we aimed: 1) to conceptualize the theoretical challenges facing health information systems (HIS) to represent patients' decisions about health and medical treatments in everyday life; 2) to suggest approaches for modeling these processes. The conceptualization of the theoretical and methodological challenges was discussed in 2015 during a series of interdisciplinary meetings attended by health informatics staff, epidemiologists and health professionals working in quality management and primary and secondary prevention of chronic diseases of the Hospital Italiano de Buenos Aires, together with sociologists, anthropologists and e-health stakeholders. HIS are facing the need and challenge to represent social human processes based on constructivist and complexity theories, which are the current frameworks of human sciences for understanding human learning and socio-cultural changes. Computer systems based on these theories can model processes of social construction of concrete and subjective entities and the interrelationships between them. These theories could be implemented, among other ways, through the mapping of health assets, analysis of social impact through community trials and modeling of complexity with system simulation tools. This analysis suggested the need to complement the traditional linear causal explanations of disease onset (and treatments) that are the bases for models of analysis of HIS with constructivist and complexity frameworks. Both may enlighten the complex interrelationships among patients, health services and the health system. The aim of this strategy is to clarify people's decision making processes to improve the efficiency, quality and equity of the health services and the health system.
Kraft, Reuben H.; Mckee, Phillip Justin; Dagro, Amy M.; Grafton, Scott T.
2012-01-01
This article presents the integration of brain injury biomechanics and graph theoretical analysis of neuronal connections, or connectomics, to form a neurocomputational model that captures spatiotemporal characteristics of trauma. We relate localized mechanical brain damage predicted from biofidelic finite element simulations of the human head subjected to impact with degradation in the structural connectome for a single individual. The finite element model incorporates various length scales into the full head simulations by including anisotropic constitutive laws informed by diffusion tensor imaging. Coupling between the finite element analysis and network-based tools is established through experimentally-based cellular injury thresholds for white matter regions. Once edges are degraded, graph theoretical measures are computed on the “damaged” network. For a frontal impact, the simulations predict that the temporal and occipital regions undergo the most axonal strain and strain rate at short times (less than 24 hrs), leading to the initiation of cellular death and resulting in damage that depends on the angle of impact and the underlying microstructure of the brain tissue. The monotonic cellular death relationships predict a spatiotemporal change of structural damage. Interestingly, at 96 hrs post-impact, computations predict that no network nodes were completely disconnected from the network, despite significant damage to network edges. At early times, network measures of global and local efficiency were degraded little; however, as time increased to 96 hrs the network properties were significantly reduced. In the future, this computational framework could help inform functional networks from physics-based structural brain biomechanics to obtain not only a biomechanics-based understanding of injury, but also neurophysiological insight. PMID:22915997
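The graph-theoretic step described above, removing damaged edges and recomputing a network measure, can be sketched in a few lines. The small graph, the choice of "damaged" edges, and the unweighted global-efficiency measure below are illustrative stand-ins, not the study's connectome or injury thresholds.

```python
from collections import deque

def global_efficiency(nodes, adj):
    """Average inverse shortest-path length over all ordered node pairs
    (unweighted BFS distances); disconnected pairs contribute 0."""
    total, pairs = 0.0, 0
    for s in nodes:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj.get(u, ()):
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        for t in nodes:
            if t != s:
                pairs += 1
                if t in dist:
                    total += 1.0 / dist[t]
    return total / pairs

def build_adj(edges):
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    return adj

nodes = list(range(6))
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0), (0, 3)]
e_intact = global_efficiency(nodes, build_adj(edges))
# "damage" two edges (e.g. strain exceeding a cellular injury threshold)
damaged = [e for e in edges if e not in {(0, 3), (2, 3)}]
e_damaged = global_efficiency(nodes, build_adj(damaged))
```

As in the article's 96-hour result, all nodes here remain connected after edge removal, yet the efficiency measure drops.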
Liaw, Siaw-Teng; Pearce, Christopher; Liyanage, Harshana; Liaw, Gladys S S; de Lusignan, Simon
2014-01-01
Increasing investment in eHealth aims to improve cost effectiveness and safety of care. Data extraction and aggregation can create new data products to improve professional practice and provide feedback to improve the quality of source data. A previous systematic review concluded that locally relevant clinical indicators and use of clinical record systems could support clinical governance. We aimed to extend and update the review with a theoretical framework. We searched PubMed, Medline, Web of Science, ABI Inform (Proquest) and Business Source Premier (EBSCO) using the terms curation, information ecosystem, data quality management (DQM), data governance, information governance (IG) and data stewardship. We focused on and analysed the scope of DQM and IG processes, theoretical frameworks, and determinants of the processing, quality assurance, presentation and sharing of data across the enterprise. There are good theoretical reasons for integrated governance, but there is variable alignment of DQM, IG and health system objectives across the health enterprise. Ethical constraints exist that require health information ecosystems to process data in ways that are aligned with improving health and system efficiency and ensuring patient safety. Despite an increasingly 'big-data' environment, DQM and IG in health services are still fragmented across the data production cycle. We extend current work on DQM and IG with a theoretical framework for integrated IG across the data cycle. The dimensions of this theory-based framework would require testing with qualitative and quantitative studies to examine the applicability and utility, along with an evaluation of its impact on data quality across the health enterprise.
Van Zyl, Hendra; Kotze, Marike; Laubscher, Ria
2014-03-28
eHealth has been identified as a useful approach to disseminate HIV/AIDS information. Together with Consumer Health Informatics (CHI), the Web-to-Public Knowledge Transfer Model (WPKTM) has been applied as a theoretical framework to identify consumer needs for AfroAIDSinfo, a South African Web portal. As part of the CHI practice, regular eSurveys are conducted to determine whether these needs are changing and are continually being met. eSurveys show high rates of satisfaction with the content as well as the modes of delivery. The nature of information is thought of as reliable to reuse; both for education and for referencing of information. Using CHI and the WPKTM as a theoretical framework, it ensures that needs of consumers are being met and that they find the tailored methods of presenting the information agreeable. Combining ICTs and theories in eHealth interventions, this approach can be expanded to deliver information in other sectors of public health.
Davis, Thomas D
2017-01-01
Practice evaluation strategies range in style from the formal-analytic tools of single-subject designs, rapid assessment instruments, algorithmic steps in evidence-informed practice, and computer software applications, to the informal-interactive tools of clinical supervision, consultation with colleagues, use of client feedback, and clinical experience. The purpose of this article is to provide practice researchers in social work with an evidence-informed theory that is capable of explaining both how and why social workers use practice evaluation strategies to self-monitor the effectiveness of their interventions in terms of client change. The author delineates the theoretical contours and consequences of what is called dual-process theory. Drawing on evidence-informed advances in the cognitive and social neurosciences, the author identifies among everyday social workers a theoretically stable, informal-interactive tool preference that is a cognitively necessary, sufficient, and stand-alone preference that requires neither the supplementation nor balance of formal-analytic tools. The author's delineation of dual-process theory represents a theoretical contribution in the century-old attempt to understand how and why social workers evaluate their practice the way they do.
Information-theoretic decomposition of embodied and situated systems.
Da Rold, Federico
2018-07-01
The embodied and situated view of cognition stresses the importance of real-time and nonlinear bodily interaction with the environment for developing concepts and structuring knowledge. In this article, populations of robots controlled by an artificial neural network learn a wall-following task through artificial evolution. At the end of the evolutionary process, time series are recorded from perceptual and motor neurons of selected robots. Information-theoretic measures are estimated on pairings of variables to unveil nonlinear interactions that structure the agent-environment system. Specifically, the mutual information is utilized to quantify the degree of dependence and the transfer entropy to detect the direction of the information flow. Furthermore, the system is analyzed with the local form of such measures, thus capturing the underlying dynamics of information. Results show that different measures are interdependent and complementary in uncovering aspects of the robots' interaction with the environment, as well as characteristics of the functional neural structure. Therefore, the set of information-theoretic measures provides a decomposition of the system, capturing the intricacy of nonlinear relationships that characterize robots' behavior and neural dynamics. Copyright © 2018 Elsevier Ltd. All rights reserved.
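Plug-in estimators for the two measures named above, mutual information and transfer entropy, can be written compactly for discrete time series. The binary signals below are synthetic; the article's sensor and motor recordings are continuous and would require binning or other estimators.

```python
import math
import random
from collections import Counter

def H(*series):
    """Joint Shannon entropy (bits) of one or more aligned discrete series."""
    joint = list(zip(*series))
    n = len(joint)
    return -sum(c / n * math.log2(c / n) for c in Counter(joint).values())

def mutual_information(x, y):
    return H(x) + H(y) - H(x, y)

def transfer_entropy(src, dst):
    """TE(src -> dst) with history length 1:
    H(dst_{t+1} | dst_t) - H(dst_{t+1} | dst_t, src_t)."""
    d0, d1, s0 = dst[:-1], dst[1:], src[:-1]
    return H(d1, d0) + H(d0, s0) - H(d0) - H(d1, d0, s0)

random.seed(1)
src = [random.randint(0, 1) for _ in range(4000)]
dst = [0] + src[:-1]  # dst copies src with a one-step delay
mi_lagged = mutual_information(src[:-1], dst[1:])
te_fwd = transfer_entropy(src, dst)  # src drives dst
te_bwd = transfer_entropy(dst, src)  # no information flows back
```

Mutual information alone is symmetric and cannot distinguish the two directions; transfer entropy recovers the asymmetry, which is exactly why the article pairs the two measures.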
(I Can't Get No) Saturation: A simulation and guidelines for sample sizes in qualitative research.
van Rijnsoever, Frank J
2017-01-01
I explore the sample size in qualitative research that is required to reach theoretical saturation. I conceptualize a population as consisting of sub-populations that contain different types of information sources that hold a number of codes. Theoretical saturation is reached after all the codes in the population have been observed once in the sample. I delineate three different scenarios to sample information sources: "random chance," which is based on probability sampling, "minimal information," which yields at least one new code per sampling step, and "maximum information," which yields the largest number of new codes per sampling step. Next, I use simulations to assess the minimum sample size for each scenario for systematically varying hypothetical populations. I show that theoretical saturation is more dependent on the mean probability of observing codes than on the number of codes in a population. Moreover, the minimal and maximum information scenarios are significantly more efficient than random chance, but yield fewer repetitions per code to validate the findings. I formulate guidelines for purposive sampling and recommend that researchers follow a minimal information scenario.
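The random-chance scenario can be simulated directly: draw sources at random until every code has been seen. The population parameters below (20 codes, 200 sources, a 30% chance that a source holds a given code) are arbitrary illustrations, not values from the article.

```python
import random

def sample_until_saturation(sources, all_codes, rng):
    """Random-chance scenario: sample sources uniformly without
    replacement until every code has been observed at least once."""
    order = list(sources)
    rng.shuffle(order)
    seen = set()
    for n, src in enumerate(order, start=1):
        seen |= src
        if seen >= all_codes:  # superset test: saturation reached
            return n
    return len(order)  # population exhausted without saturation

gen = random.Random(42)
codes = set(range(20))
# each source holds each code independently with probability 0.3
sources = [{c for c in codes if gen.random() < 0.3} for _ in range(200)]
sizes = [sample_until_saturation(sources, codes, random.Random(s)) for s in range(300)]
mean_n = sum(sizes) / len(sizes)
```

Repeating this over populations with varying inclusion probabilities reproduces the article's central observation: the required sample size is driven mainly by the mean probability of observing a code, not by how many codes exist.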
A Demographic Perspective on Family Change
Bianchi, Suzanne M.
2014-01-01
Demographic analysis seeks to understand how individual microlevel decisions about child-bearing, marriage and partnering, geographic mobility, and behaviors that influence health and longevity aggregate to macrolevel population trends and differentials in fertility, mortality and migration. In this review, I first discuss theoretical perspectives—classic demographic transition theory, the perspective of the “second demographic transition,” the spread of developmental idealism—that inform demographers’ understanding of macrolevel population change. Then, I turn to a discussion of the role that demographically informed data collection has played in illuminating family change since the mid-20th century in the United States. Finally, I discuss ways in which demographic theory and data collection might inform future areas of family research, particularly in the area of intergenerational family relationships and new and emerging family forms. PMID:26078785
[Development of indicators for evaluating public dental healthcare services].
Bueno, Vera Lucia Ribeiro de Carvalho; Cordoni Júnior, Luiz; Mesas, Arthur Eumann
2011-07-01
The objective of this article is to describe and analyze the development of indicators used to identify strengths and deficiencies in public dental healthcare services in the municipality of Cambé, Paraná. The methodology employed was a historical-organizational case study. A theoretical model of the service was developed for evaluation planning. To achieve this, information was collected through triangulation of methods (interviews, document analysis and observation). A matrix was then developed which presents analysis dimensions, criteria, indicators, scoring, parameters and sources of information. Three workshops were staged during the process with local service professionals in order to verify whether both the logical model and the matrix represented the service adequately. The period for collecting data was from November 2006 through July 2007. As a result, a flowchart of the organization of the public dental health service and a matrix with two analysis dimensions, twelve criteria and twenty-four indicators were developed. The development of indicators favoring the participation of people involved with the practice has enabled more comprehensive and realistic evaluation planning.
Implementation of a cost-accounting model in a biobank: practical implications.
Gonzalez-Sanchez, Maria Beatriz; Lopez-Valeiras, Ernesto; García-Montero, Andres C
2014-01-01
Given the state of the global economy, cost measurement and control have become increasingly relevant over the past years. The scarcity of resources and the need to use these resources more efficiently are making cost information essential in management, even in non-profit public institutions. Biobanks are no exception. However, no empirical experiences on the implementation of cost accounting in biobanks have been published to date. The aim of this paper is to present a step-by-step implementation of a cost-accounting tool for the main production and distribution activities of a real/active biobank, including a comprehensive explanation of how to perform the calculations carried out in this model. Two mathematical models for the analysis of (1) production costs and (2) request costs (order management and sample distribution) have stemmed from the analysis of the results of this implementation, and different theoretical scenarios have been prepared. The global analysis and discussion provide valuable information for internal biobank management and even for strategic decisions at the level of governmental research and development policies.
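The two cost models (production and request) reduce to simple arithmetic once the cost drivers are fixed. The allocation rule and every figure below are illustrative assumptions, not data or formulas from the paper.

```python
def unit_production_cost(direct_per_sample, annual_overhead, annual_samples):
    """Full cost of producing one stored sample: direct consumables and
    labor plus an allocated share of overhead (a simple volume-based
    allocation, assumed here for illustration)."""
    return direct_per_sample + annual_overhead / annual_samples

def request_cost(order_handling, distribution_per_sample, n_samples):
    """Cost of fulfilling one request: a fixed order-management component
    plus per-sample distribution costs."""
    return order_handling + distribution_per_sample * n_samples

prod = unit_production_cost(direct_per_sample=12.0,
                            annual_overhead=50000.0,
                            annual_samples=2500)
req = request_cost(order_handling=30.0,
                   distribution_per_sample=4.0,
                   n_samples=10)
```

Varying the inputs (e.g. halving annual throughput) generates the kind of theoretical scenarios the paper prepares for management decisions.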
Visibility Graph Based Time Series Analysis
Stephen, Mutua; Gu, Changgui; Yang, Huijie
2015-01-01
Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks. PMID:26571115
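The first step of the method, mapping a series segment to a visibility graph, can be sketched as follows. The toy segment is invented, and the criterion used is the common natural-visibility definition (two points are linked if all intermediate points lie strictly below the connecting line), which may differ in detail from the paper's variant.

```python
def visibility_edges(series):
    """Natural visibility graph of a series segment: points (a, x_a) and
    (b, x_b) are connected if every intermediate point lies strictly
    below the straight line joining them."""
    n = len(series)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            line = lambda c: series[a] + (series[b] - series[a]) * (c - a) / (b - a)
            if all(series[c] < line(c) for c in range(a + 1, b)):
                edges.add((a, b))
    return edges

ts = [3.0, 1.0, 2.0, 0.5, 4.0]
edges = visibility_edges(ts)
```

In the proposed framework each segment's graph describes one state, and linking the graphs of successive segments yields the temporal network of networks.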
Determining the optimal forensic DNA analysis procedure following investigation of sample quality.
Hedell, Ronny; Hedman, Johannes; Mostad, Petter
2018-07-01
Crime scene traces of various types are routinely sent to forensic laboratories for analysis, generally with the aim of addressing questions about the source of the trace. The laboratory may choose to analyse the samples in different ways depending on the type and quality of the sample, the importance of the case and the cost and performance of the available analysis methods. Theoretically well-founded guidelines for the choice of analysis method are, however, lacking in most situations. In this paper, it is shown how such guidelines can be created using Bayesian decision theory. The theory is applied to forensic DNA analysis, showing how the information from the initial qPCR analysis can be utilized. It is assumed that the alternatives are to analyse the sample with a standard short tandem repeat (STR) DNA assay, to use the standard assay together with a complementary assay, or to cancel the analysis following quantification. The decision is based on information about the DNA amount and level of DNA degradation of the forensic sample, as well as case circumstances and the cost of analysis. Semi-continuous electropherogram models are used for simulation of DNA profiles and for computation of likelihood ratios. It is shown how tables and graphs, prepared beforehand, can be used to quickly find the optimal decision in forensic casework.
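The decision-theoretic comparison can be sketched as an expected-utility calculation over the three alternatives. The success-probability model, costs and evidential values below are invented for illustration; the paper derives its quantities from qPCR measurements and semi-continuous electropherogram models, not from this toy formula.

```python
def success_prob(dna_amount_ng, degradation):
    """Toy probability of obtaining a usable STR profile:
    more template DNA helps, degradation (0..1) hurts."""
    return dna_amount_ng / (dna_amount_ng + 0.05) * (1.0 - degradation)

def best_option(dna_amount_ng, degradation, case_value):
    """Pick the analysis route with the highest expected net utility:
    P(usable profile) * evidential value - analysis cost."""
    p = success_prob(dna_amount_ng, degradation)
    options = {
        "cancel": 0.0,
        "standard STR": p * case_value - 100.0,
        "standard + complementary assay": min(1.0, 1.3 * p) * case_value - 180.0,
    }
    return max(options, key=options.get)
```

Tabulating `best_option` over a grid of DNA amounts and degradation levels yields the kind of prepared-beforehand look-up tables the authors propose for casework.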
ERIC Educational Resources Information Center
Alsadoon, Abeer; Prasad, P. W. C.; Beg, Azam
2017-01-01
Making the students understand the theoretical concepts of digital logic design concepts is one of the major issues faced by the academics, therefore the teachers have tried different techniques to link the theoretical information to the practical knowledge. Use of software simulations is a technique for learning and practice that can be applied…
Savory, Christopher N.; Ganose, Alex M.; Travis, Will; Atri, Ria S.; Palgrave, Robert G.
2016-01-01
As the worldwide demand for energy increases, low-cost solar cells are being looked to as a solution for the future. To attain this, non-toxic earth-abundant materials are crucial; however, cell efficiencies for current materials are in many cases limited. In this article, we examine the two silver copper sulfides AgCuS and Ag3CuS2 as possible solar absorbers using hybrid density functional theory, diffuse reflectance spectroscopy, XPS and Hall effect measurements. We show that both compounds demonstrate promising electronic structures and band gaps for high theoretical efficiency solar cells, based on Shockley–Queisser limits. Detailed analysis of their optical properties, however, indicates that only AgCuS should be of interest for PV applications, with a high theoretical efficiency. From this, we also calculate the band alignment of AgCuS against various buffer layers to aid future device construction. PMID:27774149
Sato, Keiko; Akimoto, Kazunori
2017-06-01
In general, estrogen receptor-positive (ER+) breast cancer has been considered to have a good prognosis and to be responsive to endocrine therapy. However, one third of patients with ER+ breast cancer exhibit endocrine therapy resistance, and many patients develop recurrence and die 5 to 10 years after diagnosis. In ER+ breast cancer, a major problem is to distinguish those patients most likely to develop recurrence or metastatic disease within 10 years after diagnosis from those with a sufficiently good prognosis. We downloaded the messenger RNA expression data and the clinical information for 401 patients with ER+ breast cancer from the cBioPortal for Cancer Genomics. An information-theoretic approach was used to identify the prognostic factors for survival in patients with ER+ breast cancer and to classify those patients according to the prognostic factors. The information-theoretic approach contributed to the identification of KMT2C and SLC20A1 as prognostic biomarkers in ER+ breast cancer. We found that both low KMT2C expression and high SLC20A1 expression were associated with poor outcomes. The expression levels of both KMT2C and SLC20A1 were strongly and significantly associated with differences in survival. The 10-year survival rate for ER+ patients with low KMT2C and high SLC20A1 expression was 0%. In contrast, for ER+ patients with high KMT2C and low SLC20A1 expression, the 10-year survival rate was 86.78%. Our results strongly suggest that clinical examination of the expression of both KMT2C and SLC20A1 in ER+ breast cancer will be very useful for the determination of prognosis and therapy. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.
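An information-theoretic screen of this kind can be mimicked by computing the mutual information between discretized expression levels and the survival outcome. The cohort below is entirely fabricated to mirror the reported direction of effects (low KMT2C and high SLC20A1 both unfavourable); none of the counts come from the study.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from paired discrete data."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum(c / n * math.log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

# fabricated cohort: (KMT2C level, SLC20A1 level, survived 10 years)
cohort = (
    [("high", "low", True)] * 80 + [("high", "low", False)] * 12 +
    [("low", "high", True)] * 2 + [("low", "high", False)] * 40 +
    [("high", "high", True)] * 20 + [("high", "high", False)] * 25 +
    [("low", "low", True)] * 15 + [("low", "low", False)] * 18
)
kmt2c = [k for k, _, _ in cohort]
slc20a1 = [s for _, s, _ in cohort]
outcome = [o for _, _, o in cohort]
mi_kmt2c = mutual_information(kmt2c, outcome)
mi_slc = mutual_information(slc20a1, outcome)
mi_joint = mutual_information(list(zip(kmt2c, slc20a1)), outcome)
```

Ranking candidate genes by such mutual-information scores, and then combining the top markers, is one generic way an information-theoretic approach can separate prognostic groups; the joint marker is never less informative than either gene alone.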
NASA Astrophysics Data System (ADS)
Wang, Jianxiong
2014-06-01
This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in a computer, computing in Earth sciences, multivariate data analysis, automated computation in quantum field theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round-table discussions on open source, knowledge sharing and scientific collaboration stimulated reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all of the workshop's activities. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Sciences. Details of committees and sponsors are available in the PDF.
Crystal structure prediction supported by incomplete experimental data
NASA Astrophysics Data System (ADS)
Tsujimoto, Naoto; Adachi, Daiki; Akashi, Ryosuke; Todo, Synge; Tsuneyuki, Shinji
2018-05-01
We propose an efficient theoretical scheme for structure prediction on the basis of the idea of combining methods, which optimize theoretical calculation and experimental data simultaneously. In this scheme, we formulate a cost function based on a weighted sum of interatomic potential energies and a penalty function which is defined with partial experimental data totally insufficient for conventional structure analysis. In particular, we define the cost function using "crystallinity" formulated with only peak positions within the small range of the x-ray-diffraction pattern. We apply this method to well-known polymorphs of SiO2 and C with up to 108 atoms in the simulation cell and show that it reproduces the correct structures efficiently with very limited information of diffraction peaks. This scheme opens a new avenue for determining and predicting structures that are difficult to determine by conventional methods.
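The cost function, a weighted sum of the interatomic potential energy and a penalty built from a few diffraction peak positions, can be illustrated on a 1-D toy crystal. The Lennard-Jones chain, the single "observed" peak, and the weight below are stand-ins for the paper's crystallinity-based formulation, not its actual functional form.

```python
def chain_energy(a, shells=5):
    """Toy interatomic-potential term: Lennard-Jones energy per atom of a
    1-D chain with lattice constant a, summed over a few neighbour shells."""
    return sum((m * a) ** -12 - 2.0 * (m * a) ** -6 for m in range(1, shells + 1))

def peak_penalty(a, observed_d=1.02):
    """Penalty from a single observed diffraction peak position: for a 1-D
    chain the first peak corresponds to d-spacing a, so penalize the
    mismatch with the (assumed) observed spacing."""
    return (a - observed_d) ** 2

def cost(a, weight=5.0):
    """Weighted sum of theoretical energy and experimental penalty."""
    return chain_energy(a) + weight * peak_penalty(a)

# coarse grid search over candidate lattice constants
grid = [0.8 + 0.001 * i for i in range(400)]
best_a = min(grid, key=cost)
```

The optimum is pulled between the purely theoretical minimum and the structure implied by the partial experimental data, which is the essence of the combined scheme; the real method optimizes full 3-D structures rather than one lattice constant.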
[Clinical judgment is a schema. Conceptual proposals and training perspectives].
Nagels, Marc
2017-06-01
Clinical judgment is a critical concept for the development of nursing and nursing education. Its theoretical origins are multiple and there is not yet consensus on its definition. The analysis of the scientific and professional literature shows heterogeneous and dispersed points of view, notably on the role of intuition, on its cognitive and metacognitive dimensions, and on its proximity to other concepts. Between professional stakes and epistemological constructions, clinical judgment is still an emerging concept. To overcome this obstacle and contribute to the theoretical effort, we argue that clinical judgment should be analyzed as a schema. It presents all of a schema's characteristics: diagnosis and information necessary for reasoning, a rational decision-making process, metacognitive control and evaluation of decision-making. This perspective opens the way to a better understanding of nursing activity. In conclusion, recommendations for developing clinical judgment in training are presented.
Gray, John R.
2005-01-01
The Advisory Committee on Water Information's Subcommittee on Sedimentation sponsored the Federal Interagency Sediment Monitoring Instrument and Analysis Research Workshop on September 9-11, 2003, at the U.S. Geological Survey Flagstaff Field Center, Arizona. The workshop brought together a diverse group representing most Federal agencies whose mission includes fluvial-sediment issues; academia; the private sector; and others with interests and expertise in fluvial-sediment monitoring (suspended sediment, bedload, bed material, and bed topography) and associated data-analysis techniques. The workshop emphasized technological and theoretical advances related to measurements of suspended sediment, bedload, bed material and bed topography, and data analyses. This workshop followed and expanded upon part of the 2002 Federal Interagency Workshop on Turbidity and Other Sediment Surrogates (http://water.usgs.gov/pubs/circ/2003/circ1250/), which initiated a process to provide national standards for measurement and use of turbidity and other sediment-surrogate data. This report provides a description of the salient attributes of the workshop and related information, major deliberations and findings, and principal recommendations. This information is available for evaluation by the Subcommittee on Sedimentation, which may opt to develop an action plan based on the recommendations that it endorses for consideration by the Advisory Committee on Water Information.
Thomson, Oliver P; Petty, Nicola J; Moore, Ann P
2014-02-01
How practitioners conceive clinical practice influences many aspects of their clinical work, including how they view knowledge, clinical decision-making, and their actions. Osteopaths have relied upon the philosophical and theoretical foundations upon which the profession was built to guide clinical practice. However, it is currently unknown how osteopaths conceive clinical practice, and how these conceptions develop and influence their clinical work. This paper reports the conceptions of practice of experienced osteopaths in the UK. A constructivist grounded theory approach was taken in this study. The constant comparative method of analysis was used to code and analyse data. Purposive sampling was employed to initially select participants. Subsequent theoretical sampling, informed by data analysis, allowed specific participants to be sampled. Data collection methods involved semi-structured interviews and non-participant observation of practitioners during a patient appointment, which was video-recorded and followed by a video-prompted reflective interview. Participants' conceptions of practice lay on a continuum from technical rationality to professional artistry; their development was influenced by educational experience, view of health and disease, epistemology of practice knowledge, the theory-practice relationship, and perceived therapeutic role. The findings from this study provide the first theoretical insight into osteopaths' conceptions of clinical practice and the factors that influence such conceptions. Copyright © 2013 Elsevier Ltd. All rights reserved.
Tavender, Emma J; Bosch, Marije; Gruen, Russell L; Green, Sally E; Michie, Susan; Brennan, Sue E; Francis, Jill J; Ponsford, Jennie L; Knott, Jonathan C; Meares, Sue; Smyth, Tracy; O'Connor, Denise A
2015-05-25
Despite the availability of evidence-based guidelines for the management of mild traumatic brain injury in the emergency department (ED), variations in practice exist. Interventions designed to implement recommended behaviours can reduce this variation. Using theory to inform intervention development is advocated; however, there is no consensus on how to select or apply theory. Integrative theoretical frameworks, based on syntheses of theories and theoretical constructs relevant to implementation, have the potential to assist in the intervention development process. This paper describes the process of applying two theoretical frameworks to investigate the factors influencing recommended behaviours and the choice of behaviour change techniques and modes of delivery for an implementation intervention. A stepped approach was followed: (i) identification of locally applicable and actionable evidence-based recommendations as targets for change, (ii) selection and use of two theoretical frameworks for identifying barriers to and enablers of change (Theoretical Domains Framework and Model of Diffusion of Innovations in Service Organisations) and (iii) identification and operationalisation of intervention components (behaviour change techniques and modes of delivery) to address the barriers and enhance the enablers, informed by theory, evidence and feasibility/acceptability considerations. We illustrate this process in relation to one recommendation, prospective assessment of post-traumatic amnesia (PTA) by ED staff using a validated tool. Four recommendations for managing mild traumatic brain injury were targeted with the intervention. The intervention targeting the PTA recommendation consisted of 14 behaviour change techniques and addressed 6 theoretical domains and 5 organisational domains. The mode of delivery was informed by six Cochrane reviews. 
It was delivered via five intervention components: (i) local stakeholder meetings, (ii) identification of local opinion leader teams, (iii) a train-the-trainer workshop for appointed local opinion leaders, (iv) local training workshops for delivery by trained local opinion leaders and (v) provision of tools and materials to prompt recommended behaviours. Two theoretical frameworks were used in a complementary manner to inform intervention development in managing mild traumatic brain injury in the ED. The effectiveness and cost-effectiveness of the developed intervention is being evaluated in a cluster randomised trial, part of the Neurotrauma Evidence Translation (NET) program.
Quantum Gravitational Effects on the Boundary
NASA Astrophysics Data System (ADS)
James, F.; Park, I. Y.
2018-04-01
Quantum gravitational effects might hold the key to some of the outstanding problems in theoretical physics. We analyze the perturbative quantum effects on the boundary of a gravitational system and the Dirichlet boundary condition imposed at the classical level. Our analysis reveals that for a black hole solution, there is a contradiction between the quantum effects and the Dirichlet boundary condition: the black hole solution of the one-particle-irreducible action no longer satisfies the Dirichlet boundary condition as would be expected without going into details. The analysis also suggests that the tension between the Dirichlet boundary condition and loop effects is connected with a certain mechanism of information storage on the boundary.
The smooth entropy formalism for von Neumann algebras
NASA Astrophysics Data System (ADS)
Berta, Mario; Furrer, Fabian; Scholz, Volkher B.
2016-01-01
We discuss information-theoretic concepts on infinite-dimensional quantum systems. In particular, we lift the smooth entropy formalism as introduced by Renner and collaborators for finite-dimensional systems to von Neumann algebras. For the smooth conditional min- and max-entropy, we recover similar characterizing properties and information-theoretic operational interpretations as in the finite-dimensional case. We generalize the entropic uncertainty relation with quantum side information of Tomamichel and Renner and discuss applications to quantum cryptography. In particular, we prove the possibility to perform privacy amplification and classical data compression with quantum side information modeled by a von Neumann algebra.
Informational laws of genome structures
Bonnici, Vincenzo; Manca, Vincenzo
2016-01-01
In recent years, the analysis of genomes by means of strings of length k occurring in the genomes, called k-mers, has provided important insights into the basic mechanisms and design principles of genome structures. In the present study, we focus on the proper choice of the value of k for applying information theoretic concepts that express intrinsic aspects of genomes. The value k = log2(n), where n is the genome length, is determined to be the best choice in the definition of some genomic informational indexes that are studied and computed for seventy genomes. These indexes, which are based on information entropies and on suitable comparisons with random genomes, suggest five informational laws, which all of the considered genomes obey. Moreover, an informational genome complexity measure is proposed, which is a generalized logistic map that balances entropic and anti-entropic components of genomes and is related to their evolutionary dynamics. Finally, applications to computational synthetic biology are briefly outlined. PMID:27354155
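The k-mer entropy underlying these indexes is straightforward to compute. The sketch below is not the authors' code (the function name and toy sequence are illustrative); it estimates the Shannon entropy of a sequence's k-mer distribution using the k = log2(n) choice discussed above:

```python
import math
from collections import Counter

def kmer_entropy(genome, k=None):
    """Empirical Shannon entropy (bits) of the k-mer distribution of a sequence.

    If k is not given, use k = round(log2(n)), the choice advocated for
    genomic informational indexes, where n is the sequence length.
    """
    n = len(genome)
    if k is None:
        k = max(1, round(math.log2(n)))
    total = n - k + 1  # number of (overlapping) k-mers
    counts = Counter(genome[i:i + k] for i in range(total))
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A periodic toy "genome" contains only four distinct 8-mers,
# so its k-mer entropy is close to log2(4) = 2 bits.
periodic = "ACGT" * 64
print(round(kmer_entropy(periodic), 3))  # approximately 2 bits
```

Comparing this empirical entropy to the maximal value for a random sequence of the same length is the kind of contrast the informational indexes above are built on.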
Faes, Luca; Nollo, Giandomenico; Krohova, Jana; Czippelova, Barbora; Turianikova, Zuzana; Javorka, Michal
2017-07-01
To fully elucidate the complex physiological mechanisms underlying the short-term autonomic regulation of heart period (H), systolic and diastolic arterial pressure (S, D) and respiratory (R) variability, the joint dynamics of these variables need to be explored using multivariate time series analysis. This study proposes information-theoretic measures to quantify causal interactions between nodes of the cardiovascular/cardiorespiratory network and to assess the nature (synergistic or redundant) of these directed interactions. Indexes of information transfer and information modification are extracted from the H, S, D and R series measured from healthy subjects in a resting state and during postural stress. Computations are performed in the framework of multivariate linear regression, using bootstrap techniques to assess on a single-subject basis the statistical significance of each measure and of its transitions across conditions. We find patterns of information transfer and modification which are related to specific cardiovascular and cardiorespiratory mechanisms in resting conditions and to their modification induced by orthostatic stress.
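The linear-regression framework for directed information transfer can be illustrated with a minimal sketch. Under a linear-Gaussian assumption, the transfer entropy from one series to another reduces to comparing residual variances of a restricted autoregression (own past only) and a full one (own past plus the source's past). This is a simplified order-1 illustration, not the authors' implementation, and all names are ours:

```python
import math
import random

def ols_residual_var(y, X):
    """Residual variance of OLS of y on predictor columns X (with intercept),
    solved via the normal equations and Gaussian elimination."""
    n = len(y)
    cols = [[1.0] * n] + [list(c) for c in X]
    p = len(cols)
    A = [[sum(cols[i][t] * cols[j][t] for t in range(n)) for j in range(p)] for i in range(p)]
    b = [sum(cols[i][t] * y[t] for t in range(n)) for i in range(p)]
    for i in range(p):                      # forward elimination with pivoting
        piv = max(range(i, p), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        b[i], b[piv] = b[piv], b[i]
        for r in range(i + 1, p):
            f = A[r][i] / A[i][i]
            for c in range(i, p):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    beta = [0.0] * p
    for i in range(p - 1, -1, -1):          # back substitution
        beta[i] = (b[i] - sum(A[i][c] * beta[c] for c in range(i + 1, p))) / A[i][i]
    resid = [y[t] - sum(beta[j] * cols[j][t] for j in range(p)) for t in range(n)]
    return sum(e * e for e in resid) / n

def transfer_entropy(src, dst, lag=1):
    """Linear-Gaussian transfer entropy (nats) from src to dst, order 1."""
    y, y_past, x_past = dst[lag:], dst[:-lag], src[:-lag]
    full = ols_residual_var(y, [y_past, x_past])
    restricted = ols_residual_var(y, [y_past])
    return 0.5 * math.log(restricted / full)

# Toy system: x drives y with one-step delay, but not vice versa.
random.seed(1)
x = [random.gauss(0, 1) for _ in range(2000)]
y = [0.0]
for t in range(1, 2000):
    y.append(0.5 * y[-1] + 0.8 * x[t - 1] + random.gauss(0, 0.5))
print(transfer_entropy(x, y) > transfer_entropy(y, x))  # True: information flows x -> y
```

The bootstrap significance testing described in the abstract would be layered on top of such estimates by resampling the series and recomputing the measure.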
Oussalah, Abderrahim; Fournier, Jean-Paul; Guéant, Jean-Louis; Braun, Marc
2015-02-01
Data regarding knowledge acquisition during residency training are sparse. Predictors of theoretical learning quality, academic career achievements and evidence-based medical practice during residency are unknown. We performed a cross-sectional study on residents and attending physicians across several residency programs in 2 French faculties of medicine. We comprehensively evaluated the information-seeking behavior (I-SB) during residency using a standardized questionnaire and looked for independent predictors of theoretical learning quality, academic career achievements, and evidence-based medical practice among I-SB components using multivariate logistic regression analysis. Between February 2013 and May 2013, 338 fellows and attending physicians were included in the study. Textbooks and international medical journals were reported to be used on a regular basis by 24% and 57% of the respondents, respectively. Among the respondents, 47% refer systematically (4.4%) or frequently (42.6%) to published guidelines from scientific societies upon their publication. The median self-reported theoretical learning quality score was 5/10 (interquartile range, 3-6; range, 1-10). A high theoretical learning quality score (upper quartile) was independently and strongly associated with the following I-SB components: systematic reading of clinical guidelines upon their publication (odds ratio [OR], 5.55; 95% confidence interval [CI], 1.77-17.44); having access to a library that offers the leading textbooks of the specialty in the medical department (OR, 2.45, 95% CI, 1.33-4.52); knowledge of the specialty leading textbooks (OR, 2.12; 95% CI, 1.09-4.10); and PubMed search skill score ≥5/10 (OR, 1.94; 95% CI, 1.01-3.73). 
Research Master (M2) and/or PhD thesis enrolment were independently and strongly associated with the following predictors: PubMed search skill score ≥5/10 (OR, 4.10; 95% CI, 1.46-11.53); knowledge of the leading medical journals of the specialty (OR, 3.33; 95% CI, 1.32-8.38); attending national and international academic conferences and meetings (OR, 2.43; 95% CI, 1.09-5.43); and using academic theoretical learning supports several times a week (OR, 2.23; 95% CI, 1.11- 4.49). This study showed weaknesses in the theoretical learning framework during residency. I-SB was independently associated with quality of academic theoretical learning, academic career achievements, and the use of evidence-based medicine in everyday clinical practice. CNIL No.1797639.
Neutron beam measurement of industrial polymer materials for composition and bulk integrity
NASA Astrophysics Data System (ADS)
Rogante, M.; Rosta, L.; Heaton, M. E.
2013-10-01
Among non-destructive diagnostics, neutron beam techniques are irreplaceable for supplying fundamental information in the complete analysis of industrial materials and components. In this paper, nanoscale small-angle neutron scattering analysis and prompt gamma activation analysis for the characterization of industrial polymers are considered. The basic theoretical aspects are briefly introduced and some applications are presented. The investigations of the SU-8 polymer in axial airflow microturbines—i.e. microelectromechanical systems—are presented foremost. Also presented are full and feasibility studies on polyurethanes, composites based on cross-linked polymers reinforced by carbon fibres and polymer cement concrete. The obtained results have provided a substantial contribution to the improvement of the considered materials, and indeed confirmed the industrial applicability of the adopted techniques in the analysis of polymers.
Error analysis of stochastic gradient descent ranking.
Chen, Hong; Tang, Yi; Li, Luoqing; Yuan, Yuan; Li, Xuelong; Tang, Yuanyan
2013-06-01
Ranking is an important task in machine learning and information retrieval, e.g., collaborative filtering, recommender systems, drug discovery, etc. A kernel-based stochastic gradient descent algorithm with the least squares loss is proposed for ranking in this paper. The implementation of this algorithm is simple, and an expression of the solution is derived via a sampling operator and an integral operator. An explicit convergence rate for learning a ranking function is given in terms of suitable choices of the step size and the regularization parameter. The analysis technique used here is capacity independent and is novel in the error analysis of ranking learning. Experimental results on real-world data have shown the effectiveness of the proposed algorithm in ranking tasks, which verifies the theoretical analysis of the ranking error.
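A minimal sketch shows how pairwise least-squares ranking with SGD, a step size, and a regularization parameter fit together. This uses a linear score rather than the paper's kernel-based learner, and all names and data are illustrative:

```python
import random

def sgd_rank(data, steps=20000, step_size=0.01, reg=1e-4, seed=0):
    """Pairwise least-squares ranking by stochastic gradient descent.

    data: list of (feature_vector, relevance_score). At each step a random
    pair (x, x') is drawn and the weights move along the gradient of
    ((w . (x - x')) - (y - y'))^2 plus an L2 penalty; the step size and
    regularization parameter are the quantities the convergence rate
    in the abstract depends on.
    """
    rng = random.Random(seed)
    d = len(data[0][0])
    w = [0.0] * d
    for _ in range(steps):
        (x1, y1), (x2, y2) = rng.sample(data, 2)
        diff = [a - b for a, b in zip(x1, x2)]
        err = sum(wi * di for wi, di in zip(w, diff)) - (y1 - y2)
        for i in range(d):
            w[i] -= step_size * (err * diff[i] + reg * w[i])
    return w

# Toy data: relevance is driven by the first feature only.
rng = random.Random(42)
feats = [[rng.random(), rng.random()] for _ in range(200)]
data = [(x, 2.0 * x[0] + 0.1 * rng.random()) for x in feats]
w = sgd_rank(data)

def score(x):
    return sum(wi * xi for wi, xi in zip(w, x))

print(score([0.9, 0.5]) > score([0.1, 0.5]))  # True: higher first feature ranks higher
```

Replacing the inner product with a kernel evaluation against sampled points would recover the kernelized variant the paper analyzes.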
Fiori, Simone
2003-12-01
In recent work, we introduced nonlinear adaptive activation function (FAN) artificial neuron models, which learn their activation functions in an unsupervised way by information-theoretic adapting rules. We also applied networks of these neurons to some blind signal processing problems, such as independent component analysis and blind deconvolution. The aim of this letter is to study some fundamental aspects of FAN units' learning by investigating the properties of the associated learning differential equation systems.
Theoretical bases of project management in conditions of innovative economy based on fuzzy modeling
NASA Astrophysics Data System (ADS)
Beilin, I. L.; Khomenko, V. V.
2018-05-01
In recent years, more and more Russian enterprises (both private and public) are trying to organize their activities on the basis of modern scientific research in order to improve the management of economic processes. Business planning, financial and investment analysis, modern software products based on the latest scientific developments are introduced everywhere. At the same time, there is a growing demand for market research (both at the microeconomic and macroeconomic levels), for financial and general economic information.
Current progress in multiple-image blind demixing algorithms
NASA Astrophysics Data System (ADS)
Szu, Harold H.
2000-06-01
Imagery edges occur naturally in human visual systems as a consequence of redundancy reduction towards "sparse and orthogonal feature maps," which have recently been derived from the maximum-entropy information-theoretic first principle of artificial neural networks. After a brief review of such Independent Component Analysis or Blind Source Separation of edge maps, we explore the de-mixing condition for more than two imagery objects recognizable by an intelligent pair of cameras with memory in a time-multiplex fashion.
Informal Monograph on Riverine Sand Dunes
1991-10-01
current deficiencies in the theoretical models are discussed and are concluded to stem from the difficulties inherent in the analysis of nonuniform, turbulent... [equations (4) and (5), garbled in extraction, define T(x,t) and its Fourier transform B(k,t)] ...is known about the formation, behavior and characteristics of alluvial bed forms, and the principal deficiencies in our knowledge about them. Section
Spectroscopy of infrared-active phonons in high-temperature superconductors
NASA Technical Reports Server (NTRS)
Litvinchuk, A. P.; Thomsen, C.; Cardona, M.; Borjesson, L.
1995-01-01
For a large variety of superconducting materials both experimental and theoretical lattice dynamical studies have been performed to date. The assignment of the observed infrared- and Raman-active phonon modes to the particular lattice eigenmodes is generally accepted. We will concentrate here upon the analysis of the changes of the infrared-phonon parameters (frequency and linewidth) upon entering the superconducting state which, as will be shown, may provide information on the magnitude of the superconductivity-related gap and its dependence on the superconducting transition temperature Tc.
Computing Properties Of Chemical Mixtures At Equilibrium
NASA Technical Reports Server (NTRS)
Mcbride, B. J.; Gordon, S.
1995-01-01
Scientists and engineers need data on chemical equilibrium compositions to calculate theoretical thermodynamic properties of chemical systems. This information is essential in the design and analysis of such equipment as compressors, turbines, nozzles, engines, shock tubes, heat exchangers, and chemical-processing equipment. CET93 is a general program that calculates chemical equilibrium compositions and properties of mixtures for any chemical system for which thermodynamic data are available. It includes thermodynamic data for more than 1,300 gaseous and condensed species and thermal-transport data for 151 gases. Written in FORTRAN 77.
2003-03-01
were stimulated by Cordes and Dougherty (1993). A proposed theory of burnout facilitates such efforts (Schaufeli et al., 1995). Schaufeli et al... as the depletion of mental resources. Moore's study focuses on work exhaustion in technology professionals and, therefore, uses the Schaufeli et al. concept and measure of exhaustion. To place emphasis on the workplace aspect of the Schaufeli et al. exhaustion construct, Moore's research
Error analysis for a spaceborne laser ranging system
NASA Technical Reports Server (NTRS)
Pavlis, E. C.
1979-01-01
The dependence (or independence) of baseline accuracies, obtained from a typical mission of a spaceborne ranging system, on several factors is investigated. The emphasis is placed on a priori station information, but factors such as the elevation cut-off angle, the geometry of the network, the mean orbital height, and to a limited extent geopotential modeling are also examined. The results are obtained through simulations, but some theoretical justification is also given. Guidelines for freeing the results from these dependencies are suggested for most of the factors.
A Model for Chorus Associated Electrostatic Bursts.
1983-06-30
neglect of the ion contribution to D is justified since v_b^2 >> v_i^2. Now Equation 8 can be further simplified since D_2 << g_2, allowing a separation of... will be treated here as a review for those who may be unfamiliar with this beam instability. We will assume v_b^2 >> v_e^2 to simplify the analysis. Then we... In principle, one might use the theoretical prediction to try to extract information on the beam from the spectrum. But in this case, the beam
Analysing the diffusion and adoption of mobile IT across social worlds.
Nielsen, Jeppe Agger; Mengiste, Shegaw Anagaw
2014-06-01
The diffusion and adoption of information technology innovations (e.g. mobile information technology) in healthcare organizations involves a dynamic process of change with multiple stakeholders with competing interests, varying commitments, and conflicting values. Nevertheless, the extant literature on mobile information technology diffusion and adoption has predominantly focused on organizations and individuals as the unit of analysis, with little emphasis on the environment in which healthcare organizations are embedded. We propose the social worlds approach as a promising theoretical lens for dealing with this limitation together with reports from a case study of a mobile information technology innovation in elderly home care in Denmark including both the sociopolitical and organizational levels in the analysis. Using the notions of social worlds, trajectories, and boundary objects enables us to show how mobile information technology innovation in Danish home care can facilitate negotiation and collaboration across different social worlds in one setting while becoming a source of tension and conflicts in others. The trajectory of mobile information technology adoption was shaped by influential stakeholders in the Danish home care sector. Boundary objects across multiple social worlds legitimized the adoption, but the use arrangement afforded by the new technology interfered with important aspects of home care practices, creating resistance among the healthcare personnel.
A multiscale cerebral neurochemical connectome of the rat brain
Schöttler, Judith; Ercsey-Ravasz, Maria; Cosa-Linan, Alejandro; Varga, Melinda; Toroczkai, Zoltan; Spanagel, Rainer
2017-01-01
Understanding the rat neurochemical connectome is fundamental for exploring neuronal information processing. By using advanced data mining, supervised machine learning, and network analysis, this study integrates over 5 decades of neuroanatomical investigations into a multiscale, multilayer neurochemical connectome of the rat brain. This neurochemical connectivity database (ChemNetDB) is supported by comprehensive systematically-determined receptor distribution maps. The rat connectome has an onion-type structural organization and shares a number of structural features with mesoscale connectomes of mouse and macaque. Furthermore, we demonstrate that extremal values of graph theoretical measures (e.g., degree and betweenness) are associated with evolutionary-conserved deep brain structures such as amygdala, bed nucleus of the stria terminalis, dorsal raphe, and lateral hypothalamus, which regulate primitive, yet fundamental functions, such as circadian rhythms, reward, aggression, anxiety, and fear. The ChemNetDB is a freely available resource for systems analysis of motor, sensory, emotional, and cognitive information processing. PMID:28671956
NASA Technical Reports Server (NTRS)
Watson, Ken; Hummer-Miller, Susanne; Kruse, Fred A.
1986-01-01
A theoretical radiance model was employed together with laboratory data on a suite of igneous rocks to evaluate various algorithms for processing Thermal Infrared Multispectral Scanner (TIMS) data. Two aspects of the general problem were examined: extraction of emissivity information from the observed TIMS radiance data, and how to use emissivity data in a way that is geologically meaningful. The four algorithms were evaluated for appropriate band combinations of TIMS data acquired on both day and night overflights of the Tuscarora Mountains, including the Carlin gold deposit, in north-central Nevada. Analysis of a color composited PC decorrelated image (Bands 3, 4, 5--blue/green/red) of the Northern Grapevine Mountains, Nevada, area showed some useful correlation with the regional geology. The thermal infrared region provides fundamental spectral information that can be used to discriminate the major rock types occurring on the Earth's surface.
NASA Technical Reports Server (NTRS)
Seidel, A. D.
1974-01-01
The economic value of information produced by an assumed operational version of an earth resources survey satellite of the ERTS class is assessed. The theoretical capability of an ERTS system to provide improved agricultural forecasts is analyzed and this analysis is used as a reasonable input to the econometric methods derived by ECON. An econometric investigation into the markets for agricultural commodities is summarized. An overview of the effort including the objectives, scopes, and architecture of the analysis, and the estimation strategy employed is presented. The results and conclusions focus on the economic importance of improved crop forecasts, U.S. exports, and government policy operations. Several promising avenues of further investigation are suggested.
Face-space architectures: evidence for the use of independent color-based features.
Nestor, Adrian; Plaut, David C; Behrmann, Marlene
2013-07-01
The concept of psychological face space lies at the core of many theories of face recognition and representation. To date, much of the understanding of face space has been based on principal component analysis (PCA); the structure of the psychological space is thought to reflect some important aspects of a physical face space characterized by PCA applications to face images. In the present experiments, we investigated alternative accounts of face space and found that independent component analysis provided the best fit to human judgments of face similarity and identification. Thus, our results challenge an influential approach to the study of human face space and provide evidence for the role of statistically independent features in face encoding. In addition, our findings support the use of color information in the representation of facial identity, and we thus argue for the inclusion of such information in theoretical and computational constructs of face space.
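The contrast between PCA and ICA that motivates this result can be illustrated on synthetic data: for whitened data, PCA leaves a rotation of the axes undetermined, while ICA fixes it by maximizing non-Gaussianity. The brute-force sketch below (illustrative only; an exhaustive angle search stands in for FastICA, and all names and constants are ours) recovers a hidden mixing rotation of two independent sub-Gaussian sources:

```python
import math
import random

def excess_kurtosis(y):
    """Sample excess kurtosis; zero for Gaussian data, -1.2 for uniform data."""
    n = len(y)
    m = sum(y) / n
    v = sum((t - m) ** 2 for t in y) / n
    return sum((t - m) ** 4 for t in y) / (n * v * v) - 3.0

def ica_rotation(x1, x2, grid=180):
    """For already-white 2-D data, search the rotation angle maximizing the
    non-Gaussianity (|excess kurtosis|) of one output component.
    Source directions are ambiguous modulo 90 degrees."""
    best, best_phi = -1.0, 0.0
    for g in range(grid):
        phi = (math.pi / 2) * g / grid
        c, s = math.cos(phi), math.sin(phi)
        k = abs(excess_kurtosis([c * a + s * b for a, b in zip(x1, x2)]))
        if k > best:
            best, best_phi = k, phi
    return best_phi

# Two independent uniform sources, mixed by an "unknown" rotation theta.
rng = random.Random(0)
s1 = [rng.uniform(-1, 1) for _ in range(8000)]
s2 = [rng.uniform(-1, 1) for _ in range(8000)]
theta = 0.6
x1 = [math.cos(theta) * a - math.sin(theta) * b for a, b in zip(s1, s2)]
x2 = [math.sin(theta) * a + math.cos(theta) * b for a, b in zip(s1, s2)]
phi_hat = ica_rotation(x1, x2)
print(round(phi_hat, 2))  # close to the mixing angle 0.6
```

Because the mixed data remain white, any PCA of them is rotation-blind here; only the higher-order statistics exploited by ICA identify the independent directions, which is the property the face-similarity result above turns on.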
Li, Sikun; Wang, Xiangzhao; Su, Xianyu; Tang, Feng
2012-04-20
This paper theoretically discusses the modulus of two-dimensional (2D) wavelet transform (WT) coefficients, calculated using two frequently used 2D daughter wavelet definitions, in optical fringe pattern analysis. The discussion shows that neither is good enough to represent the reliability of the phase data. The differences between the two definitions in the performance of 2D WT are also discussed. We propose a new 2D daughter wavelet definition for reliability-guided phase unwrapping of optical fringe patterns. The modulus of the advanced 2D WT coefficients, obtained by using a daughter wavelet under this new definition, includes not only modulation information but also local frequency information of the deformed fringe pattern. Therefore, it can be treated as a good parameter that represents the reliability of the retrieved phase data. Computer simulation and experiments show the validity of the proposed method.
A project management system for the X-29A flight test program
NASA Technical Reports Server (NTRS)
Stewart, J. F.; Bauer, C. A.
1983-01-01
The project-management system developed for NASA's participation in the X-29A aircraft development program is characterized from a theoretical perspective, as an example of a system appropriate to advanced, highly integrated technology projects. System-control theory is applied to the analysis of classical project-management techniques and structures, which are found to be of closed-loop multivariable type, and the effects of increasing project complexity and integration are evaluated. The importance of information flow, sampling frequency, information holding, and delays is stressed. The X-29A system is developed in four stages: establishment of overall objectives and requirements, determination of information processes (block diagrams), definition of personnel functional roles and relationships, and development of a detailed work-breakdown structure. The resulting system is shown to require a greater information flow to management than conventional methods. Sample block diagrams are provided.
Montangie, Lisandro; Montani, Fernando
2016-10-01
Spike correlations among neurons are widely encountered in the brain. Although models accounting for pairwise interactions have proved able to capture some of the most important features of population activity at the level of the retina, the evidence shows that pairwise neuronal correlation analysis does not resolve cooperative population dynamics by itself. By means of a series expansion for short time scales of the mutual information conveyed by a population of neurons, the information transmission can be broken down into firing rate and correlational components. In a proposed extension of this framework, we investigate the information components considering both second- and higher-order correlations. We show that the existence of a mixed stimulus-dependent correlation term defines a new scenario for the interplay between pairwise and higher-than-pairwise interactions in noise and signal correlations that would lead either to redundancy or synergy in the information-theoretic sense.
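The synergy/redundancy distinction has a compact information-theoretic illustration. The toy sketch below is not the paper's series expansion; it simply compares the mutual information carried jointly by two binary "neurons" about a binary stimulus with the sum of their single-neuron informations. An XOR-like code is purely synergistic: each neuron alone is uninformative, yet the pair identifies the stimulus exactly:

```python
import math
from collections import Counter

def mutual_info(pairs):
    """I(S;R) in bits, estimated from a list of (stimulus, response) samples."""
    n = len(pairs)
    pj = Counter(pairs)                      # joint counts
    ps = Counter(s for s, _ in pairs)        # stimulus marginal
    pr = Counter(r for _, r in pairs)        # response marginal
    return sum((c / n) * math.log2(n * c / (ps[s] * pr[r]))
               for (s, r), c in pj.items())

# XOR code: the stimulus equals r1 XOR r2; each (s, r1, r2) combination
# consistent with that rule is repeated to form an empirical sample.
samples = []
for s in (0, 1):
    for r1 in (0, 1):
        for r2 in (0, 1):
            if (r1 ^ r2) == s:
                samples += [(s, (r1, r2))] * 25
i_joint = mutual_info(samples)
i_ind = (mutual_info([(s, r[0]) for s, r in samples]) +
         mutual_info([(s, r[1]) for s, r in samples]))
print(round(i_joint - i_ind, 3))  # prints 1.0: the pair is purely synergistic
```

A redundant code (both neurons copying the stimulus) would make the same difference negative, which is the sign convention behind the synergy/redundancy classification above.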
Liao, C-M; You, S-H; Cheng, Y-H
2015-01-01
Influenza poses a significant public health burden worldwide. Understanding how and to what extent people would change their behaviour in response to influenza outbreaks is critical for formulating public health policies. We incorporated the information-theoretic framework into a behaviour-influenza (BI) transmission dynamics system in order to understand the effects of individual behavioural change on influenza epidemics. We showed that information transmission of risk perception played a crucial role in the spread of health-seeking behaviour throughout influenza epidemics. Here a network BI model provides a new approach for understanding the risk perception spread and human behavioural change during disease outbreaks. Our study allows simultaneous consideration of epidemiological, psychological, and social factors as predictors of individual perception rates in behaviour-disease transmission systems. We suggest that a monitoring system with precise information on risk perception should be constructed to effectively promote health behaviours in preparation for emerging disease outbreaks.
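A minimal behaviour-disease sketch makes the coupling concrete. This is not the authors' network BI model; it is an illustrative SIR system (all parameter names and values are ours) in which transmission is suppressed as perceived risk, proxied here by current infection prevalence, spreads through the population:

```python
def simulate_bi(beta=0.5, gamma=0.2, alpha=5.0, days=200, dt=0.1):
    """Minimal behaviour-influenza sketch: an SIR epidemic whose transmission
    rate is scaled down as perceived risk rises with prevalence.
    alpha controls how strongly risk perception suppresses contacts;
    returns the peak infectious fraction."""
    s, i, r = 0.999, 0.001, 0.0
    peak = i
    for _ in range(int(days / dt)):
        eff_beta = beta / (1.0 + alpha * i)   # behavioural reduction of contacts
        new_inf = eff_beta * s * i
        rec = gamma * i
        s -= dt * new_inf
        i += dt * (new_inf - rec)
        r += dt * rec
        peak = max(peak, i)
    return peak

# A stronger behavioural response (larger alpha) flattens the epidemic peak.
p_free = simulate_bi(alpha=0.0)
p_resp = simulate_bi(alpha=5.0)
print(p_free > p_resp)  # True
```

Replacing the prevalence proxy with an explicit information-diffusion process over a contact network is what distinguishes the network BI model described above from this homogeneous sketch.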
What should we measure? Conceptualizing usage in health information exchange
Jasperson, Jon
2010-01-01
Under the provisions of the Health Information Technology for Economic & Clinical Health act providers need to demonstrate their ‘meaningful use’ of electronic health record systems' health information exchange (HIE) capability. HIE usage is not a simple construct, but the choice of its measurement must attend to the users, context, and objectives of the system being examined. This review examined how usage is reported in the existing literature and also what conceptualizations of usage might best reflect the nature and objectives of HIE. While existing literature on HIE usage included a diverse set of measures, most were theoretically weak, did not attend to the interplay of measure, level of analysis and architectural strategy, and did not reflect how HIE usage affected the actual process of care. Attention to these issues will provide greater insight into the effects of previously inaccessible information on medical decision-making and the process of care. PMID:20442148
Collaborative Manufacturing Management in Networked Supply Chains
NASA Astrophysics Data System (ADS)
Pouly, Michel; Naciri, Souleiman; Berthold, Sébastien
ERP systems provide information management and analysis to industrial companies and support their planning activities. They are currently based mostly on theoretical values (averages) of parameters rather than on actual shop floor data, which disturbs the planning algorithms. On the other hand, sharing data between manufacturers, suppliers and customers is becoming very important to ensure reactivity towards market variability. This paper proposes software solutions to address these requirements and methods to automatically capture the corresponding shop floor information. In order to share data produced by different legacy systems along the collaborative networked supply chain, we propose to use the Generic Product Model developed by Hitachi to extract, translate and store heterogeneous ERP data.
A Bourdieusian Analysis of U.S. Military Culture Ground in the Mental Help-Seeking Literature.
Abraham, Traci; Cheney, Ann M; Curran, Geoffrey M
2017-09-01
This theoretical treatise uses the scientific literature concerning help seeking for mental illness among those with a background in the U.S. military to posit a more complex definition of military culture. The help-seeking literature is used to illustrate how hegemonic masculinity, when situated in the military field, informs the decision to seek formal treatment for mental illness among those men with a background in the U.S. military. These analyses advocate for a nuanced, multidimensional, and situated definition of U.S. military culture that emphasizes the way in which institutional structures and social relations of power intersect with individual values, beliefs, and motivations to inform and structure health-related practices.
NASA Astrophysics Data System (ADS)
Ijaz, S.; Nadeem, S.
2017-11-01
A theoretical examination is presented in this analysis to study the flow of a bio-nanofluid through a curved stenotic channel. The curved channel is considered with an overlapping stenotic region. The effect of convective conditions is incorporated to discuss the heat transfer characteristics. The mathematical problem of the curved stenotic channel is formulated and then solved using an exact technique. To discuss the hemodynamics of the curved stenotic channel, the expression for resistance to blood flow is evaluated by dividing the channel into pre-stenotic, stenotic and post-stenotic regions. In this investigation gold, silver and copper nanoparticles are used as drug carriers. The graphical results reveal that increasing the nanoparticle concentration reduces the hemodynamic effects of the stenosed curved channel, and that Au nanoparticles are more effective drug carriers for minimizing these effects than Ag and Cu nanoparticles. This analysis provides valuable theoretical information for nanoparticles used as drug agents in bio-inspired applications.
Cárdenas, Walter HZ; Mamani, Javier B; Sibov, Tatiana T; Caous, Cristofer A; Amaro, Edson; Gamarra, Lionel F
2012-01-01
Background Nanoparticles in suspension are often utilized for intracellular labeling and evaluation of toxicity in experiments conducted in vitro. The purpose of this study was to undertake a computational modeling analysis of the deposition kinetics of a magnetite nanoparticle agglomerate in cell culture medium. Methods Finite difference methods and the Crank–Nicolson algorithm were used to solve the equation of mass transport in order to analyze concentration profiles and dose deposition. Theoretical data were confirmed by experimental magnetic resonance imaging. Results Different behavior in the deposited dose fraction was found for magnetic nanoparticles up to 50 nm in diameter compared with larger-diameter magnetic nanoparticles. Small changes in the dispersion factor cause variations of up to 22% in the deposited dose. The experimental data confirmed the theoretical results. Conclusion These findings are important in planning for nanomaterial absorption, because they provide valuable information for efficient intracellular labeling and controlling toxicity. This model enables determination of the in vitro transport behavior of specific magnetic nanoparticles, which is also relevant to other models that use cellular components and particle absorption processes. PMID:22745539
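The Crank–Nicolson scheme named in the Methods can be sketched for the simplest case, 1-D diffusion with absorbing (Dirichlet) boundaries. This is a generic illustration of the algorithm, not the paper's full sedimentation–diffusion transport model; grid and coefficient values are assumptions.

```python
import numpy as np

def crank_nicolson_diffusion(c0, D=1e-2, dx=0.1, dt=0.1, steps=100):
    """Crank-Nicolson time stepping for dc/dt = D d2c/dx2 on a 1-D grid
    with absorbing (c = 0) boundaries: solve (I - rL) c_new = (I + rL) c_old,
    where L is the discrete Laplacian and r = D dt / (2 dx^2)."""
    n = len(c0)
    r = D * dt / (2 * dx**2)
    L = -2 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
    A = np.eye(n) - r * L
    B = np.eye(n) + r * L
    c = np.array(c0, dtype=float)
    for _ in range(steps):
        c = np.linalg.solve(A, B @ c)
    return c

# An initial concentration spike spreads out symmetrically over time
profile = crank_nicolson_diffusion([0, 0, 1.0, 0, 0])
print(profile.round(3))
```

Crank–Nicolson is second-order accurate in time and unconditionally stable, which is why it is a common choice for deposition-kinetics problems like the one modeled here.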
NASA Technical Reports Server (NTRS)
Wolf, David R.
2004-01-01
The topic of this paper is a hierarchy of information-like functions, here named the information correlation functions, where each function of the hierarchy may be thought of as the information between the variables it depends upon. The information correlation functions are particularly suited to the description of the emergence of complex behaviors due to many-body or many-agent processes. They are particularly well suited to the quantification of the decomposition of the information carried among a set of variables or agents, and its subsets. In more graphical language, they provide the information theoretic basis for understanding the synergistic and non-synergistic components of a system, and as such should serve as a forceful toolkit for the analysis of the complexity structure of complex many-agent systems. The information correlation functions are the natural generalization to an arbitrary number of sets of variables of the sequence starting with the entropy function (one set of variables) and the mutual information function (two sets). We start by describing the traditional measures of information (entropy) and mutual information.
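The hierarchy the abstract describes begins with entropy and mutual information; its third-order member (often called co-information or interaction information) can be computed directly from a joint distribution. The XOR example below is a standard illustration of synergy (chosen here for demonstration, not taken from the paper): negative co-information signals that the three variables jointly carry information that no pair does.

```python
import numpy as np
from itertools import product

def entropy(p):
    """Shannon entropy in bits of a (joint) probability array."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Joint distribution of three binary variables X, Y, Z with Z = X XOR Y,
# where X and Y are independent fair coins -- a classic synergy example.
pxyz = np.zeros((2, 2, 2))
for x, y in product([0, 1], repeat=2):
    pxyz[x, y, x ^ y] = 0.25

H = {
    'x': entropy(pxyz.sum(axis=(1, 2))), 'y': entropy(pxyz.sum(axis=(0, 2))),
    'z': entropy(pxyz.sum(axis=(0, 1))), 'xy': entropy(pxyz.sum(axis=2)),
    'xz': entropy(pxyz.sum(axis=1)),     'yz': entropy(pxyz.sum(axis=0)),
    'xyz': entropy(pxyz),
}
I_xy = H['x'] + H['y'] - H['xy']                       # mutual information
co_info = (H['x'] + H['y'] + H['z']
           - H['xy'] - H['xz'] - H['yz'] + H['xyz'])   # co-information
print(I_xy, co_info)
```

Here I(X;Y) = 0 (the pair alone is uninformative) while the co-information is -1 bit, the signature of a purely synergistic system.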
Multi-trait analysis of genome-wide association summary statistics using MTAG.
Turley, Patrick; Walters, Raymond K; Maghzian, Omeed; Okbay, Aysu; Lee, James J; Fontana, Mark Alan; Nguyen-Viet, Tuan Anh; Wedow, Robbee; Zacher, Meghan; Furlotte, Nicholas A; Magnusson, Patrik; Oskarsson, Sven; Johannesson, Magnus; Visscher, Peter M; Laibson, David; Cesarini, David; Neale, Benjamin M; Benjamin, Daniel J
2018-02-01
We introduce multi-trait analysis of GWAS (MTAG), a method for joint analysis of summary statistics from genome-wide association studies (GWAS) of different traits, possibly from overlapping samples. We apply MTAG to summary statistics for depressive symptoms (N eff = 354,862), neuroticism (N = 168,105), and subjective well-being (N = 388,538). As compared to the 32, 9, and 13 genome-wide significant loci identified in the single-trait GWAS (most of which are themselves novel), MTAG increases the number of associated loci to 64, 37, and 49, respectively. Moreover, association statistics from MTAG yield more informative bioinformatics analyses and increase the variance explained by polygenic scores by approximately 25%, matching theoretical expectations.
The gene and the genon concept: a functional and information-theoretic analysis
Scherrer, Klaus; Jost, Jürgen
2007-01-01
‘Gene' has become a vague and ill-defined concept. To set the stage for mathematical analysis of gene storage and expression, we return to the original concept of the gene as a function encoded in the genome, basis of genetic analysis, that is a polypeptide or other functional product. The additional information needed to express a gene is contained within each mRNA as an ensemble of signals, added to or superimposed onto the coding sequence. To designate this programme, we introduce the term ‘genon'. Individual genons are contained in the pre-mRNA forming a pre-genon. A genomic domain contains a proto-genon, with the signals of transcription activation in addition to the pre-genon in the transcripts. Some contain several mRNAs and hence genons, to be singled out by RNA processing and differential splicing. The programme in the genon in cis is implemented by corresponding factors of protein or RNA nature contained in the transgenon of the cell or organism. The gene, the cis programme contained in the individual domain and transcript, and the trans programme of factors, can be analysed by information theory. PMID:17353929
Development and initial evaluation of the Clinical Information Systems Success Model (CISSM).
Garcia-Smith, Dianna; Effken, Judith A
2013-06-01
Most clinical information systems (CIS) today are technically sound, but the number of successful implementations of these systems is low. The purpose of this study was to develop and test a theoretically based integrated CIS Success Model (CISSM) from the nurse perspective. Model predictors of CIS success were taken from existing research on information systems acceptance, user satisfaction, use intention, user behavior and perceptions, as well as clinical research. Data collected online from 234 registered nurses in four hospitals were used to test the model. Each nurse had used the Cerner Power Chart Admission Health Profile for at least 3 months. Psychometric testing and factor analysis of the 23-item CISSM instrument established its construct validity and reliability. Initial analysis showed nurses' satisfaction with and dependency on CIS use predicted their perceived CIS use Net Benefit. Further analysis identified Social Influence and Facilitating Conditions as other predictors of CIS user Net Benefit. The level of hospital CIS integration may account for the role of CIS Use Dependency in the success of CIS. Based on our experience, CISSM provides a formative as well as summative tool for evaluating CIS success from the nurse's perspective. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Olsen, Jolie; Sen, Sandip
2014-04-01
Steven Brams's [(1994). Theory of moves. Cambridge University Press] Theory of Moves (TOM) is an alternative to traditional game theoretic treatment of real-life interactions, in which players choose strategies based on analysis of future moves and counter-moves that arise if game play commences at a specified start state and either player can choose to move first. In repeated play, players using TOM rationality arrive at nonmyopic equilibria. One advantage of TOM is its ability to model scenarios in which power asymmetries exist between players. In particular, threat power, i.e. the ability of one player to threaten and sustain immediate, globally disadvantageous outcomes to force a desirable result long term, can be utilised to induce Pareto optimal states in games such as Prisoner's Dilemma which result in Pareto-dominated outcomes using traditional methods. Unfortunately, prior work on TOM is limited by an assumption of complete information. This paper presents a mechanism that can be used by a player to utilise threat power when playing a strict, ordinal 2×2 game under incomplete information. We also analyse the benefits of threat power and support in this analysis with empirical evidence.
NASA Astrophysics Data System (ADS)
Bell, A.; Tang, G.; Yang, P.; Wu, D.
2017-12-01
Due to their high spatial and temporal coverage, cirrus clouds have a profound role in regulating the Earth's energy budget. Variability of their radiative, geometric, and microphysical properties can pose significant uncertainties in global climate model simulations if not adequately constrained. Thus, the development of retrieval methodologies able to accurately retrieve ice cloud properties and present associated uncertainties is essential. The effectiveness of cirrus cloud retrievals relies on accurate a priori understanding of ice radiative properties, as well as the current state of the atmosphere. Current studies have implemented information content theory analyses prior to retrievals to quantify the amount of information that should be expected on parameters to be retrieved, as well as the relative contribution of information provided by certain measurement channels. Through this analysis, retrieval algorithms can be designed to maximize the information in measurements, and therefore ensure enough information is present to retrieve ice cloud properties. In this study, we present such an information content analysis to quantify the amount of information to be expected in retrievals of cirrus ice water path and particle effective diameter using sub-millimeter and thermal infrared radiometry. Preliminary results show these bands to be sensitive to changes in ice water path and effective diameter, and thus lend confidence in their ability to simultaneously retrieve these parameters. Further quantification of the sensitivity and information provided by these bands can then be used to design an optimal retrieval scheme. While this information content analysis is employed on a theoretical retrieval combining simulated radiance measurements, the methodology is in general applicable to any instrument or retrieval approach.
Comparing capacity coefficient and dual task assessment of visual multitasking workload
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blaha, Leslie M.
Capacity coefficient analysis could offer a theoretically grounded alternative to subjective measures and dual task assessment of cognitive workload. Workload capacity or workload efficiency is a human information processing modeling construct defined as the amount of information that can be processed by the visual cognitive system in a specified amount of time. In this paper, I explore the relationship between capacity coefficient analysis of workload efficiency and dual task response time measures. To capture multitasking performance, I examine how the relatively simple assumptions underlying the capacity construct generalize beyond single visual decision making tasks. The fundamental tools for measuring workload efficiency are the integrated hazard and reverse hazard functions of response times, which are defined by log transforms of the response time distribution. These functions are used in the capacity coefficient analysis to provide a functional assessment of the amount of work completed by the cognitive system over the entire range of response times. For the study of visual multitasking, capacity coefficient analysis enables a comparison of visual information throughput as the number of tasks increases from one to two to any number of simultaneous tasks. I illustrate the use of capacity coefficients for visual multitasking on sample data from dynamic multitasking in the modified Multi-attribute Task Battery.
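The integrated-hazard machinery described above can be sketched numerically. The snippet below uses simulated response times and a simplified version of the standard OR-task (first-terminating) capacity coefficient C(t) = H_dual(t) / (H_a(t) + H_b(t)); the data and parameter values are assumptions for illustration, not the paper's estimator or data.

```python
import numpy as np

def integrated_hazard(rts, t):
    """Empirical integrated hazard H(t) = -log S(t), where S(t) is the
    empirical survivor function of the response times."""
    s = (np.asarray(rts, dtype=float) > t).mean()
    return -np.log(s) if s > 0 else np.inf

def capacity_coefficient(rt_dual, rt_a, rt_b, t):
    """OR-task capacity coefficient: values near 1 indicate
    unlimited-capacity parallel processing; below 1 limited capacity;
    above 1 super capacity. Illustrative sketch only."""
    denom = integrated_hazard(rt_a, t) + integrated_hazard(rt_b, t)
    return integrated_hazard(rt_dual, t) / denom

rng = np.random.default_rng(0)
# Simulated single-task RTs (seconds) and dual-task RTs from an
# independent parallel race -- which should yield C(t) near 1.
rt_a = rng.exponential(0.5, 2000)
rt_b = rng.exponential(0.5, 2000)
rt_dual = np.minimum(rng.exponential(0.5, 2000),
                     rng.exponential(0.5, 2000))
c = capacity_coefficient(rt_dual, rt_a, rt_b, t=0.4)
print(round(c, 2))
```

Because the simulated dual-task times come from an independent race between two single-task processes, the estimated coefficient falls close to the unlimited-capacity benchmark of 1.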
An Ideal Observer Analysis of Visual Working Memory
Sims, Chris R.; Jacobs, Robert A.; Knill, David C.
2013-01-01
Limits in visual working memory (VWM) strongly constrain human performance across many tasks. However, the nature of these limits is not well understood. In this paper we develop an ideal observer analysis of human visual working memory, by deriving the expected behavior of an optimally performing, but limited-capacity memory system. This analysis is framed around rate–distortion theory, a branch of information theory that provides optimal bounds on the accuracy of information transmission subject to a fixed information capacity. The result of the ideal observer analysis is a theoretical framework that provides a task-independent and quantitative definition of visual memory capacity and yields novel predictions regarding human performance. These predictions are subsequently evaluated and confirmed in two empirical studies. Further, the framework is general enough to allow the specification and testing of alternative models of visual memory (for example, how capacity is distributed across multiple items). We demonstrate that a simple model developed on the basis of the ideal observer analysis—one which allows variability in the number of stored memory representations, but does not assume the presence of a fixed item limit—provides an excellent account of the empirical data, and further offers a principled re-interpretation of existing models of visual working memory. PMID:22946744
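The rate–distortion bound at the heart of this framework has a closed form for a Gaussian source under squared-error loss: D(R) = sigma^2 * 2^(-2R). The sketch below uses that textbook bound to show how per-item distortion must grow when a fixed capacity budget is split across more remembered items; the even split across items is an illustrative assumption, not the paper's model.

```python
def gaussian_rate_distortion(sigma2, rate_bits):
    """Distortion-rate function of a Gaussian source with variance sigma2
    under squared error: D(R) = sigma2 * 2**(-2R). This is the optimal
    bound on transmission accuracy at a fixed information capacity."""
    return sigma2 * 2 ** (-2 * rate_bits)

# A fixed capacity budget split evenly across N remembered items:
# per-item precision degrades as set size grows.
total_bits, sigma2 = 8.0, 1.0
for n_items in (1, 2, 4, 8):
    d = gaussian_rate_distortion(sigma2, total_bits / n_items)
    print(n_items, d)
```

The monotone rise of distortion with set size is the qualitative set-size effect that rate–distortion accounts of VWM capacity predict.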
Wixted, John T; Mickes, Laura
2018-01-01
Receiver operating characteristic (ROC) analysis was introduced to the field of eyewitness identification 5 years ago. Since that time, it has been both influential and controversial, and the debate has raised an issue about measuring discriminability that is rarely considered. The issue concerns the distinction between empirical discriminability (measured by area under the ROC curve) vs. underlying/theoretical discriminability (measured by d' or variants of it). Under most circumstances, the two measures will agree about a difference between two conditions in terms of discriminability. However, it is possible for them to disagree, and that fact can lead to confusion about which condition actually yields higher discriminability. For example, if the two conditions have implications for real-world practice (e.g., a comparison of competing lineup formats), should a policymaker rely on the area-under-the-curve measure or the theory-based measure? Here, we illustrate the fact that a given empirical ROC yields as many underlying discriminability measures as there are theories that one is willing to take seriously. No matter which theory is correct, for practical purposes, the singular area-under-the-curve measure best identifies the diagnostically superior procedure. For that reason, area under the ROC curve informs policy in a way that underlying theoretical discriminability never can. At the same time, theoretical measures of discriminability are equally important, but for a different reason. Without an adequate theoretical understanding of the relevant task, the field will be in no position to enhance empirical discriminability.
ERIC Educational Resources Information Center
All-Union Inst. for Scientific and Technical Information, Moscow (USSR).
Reports given before the Committee on "Research on the Theoretical Basis of Information" of the International Federation for Documentation (FID/RI) are presented unaltered and unabridged in English or in Russian -- the language of their presentation. Each report is accompanied by an English or Russian resume. Generally, only original…
Integrated Education in Conflicted Societies: Is There a Need for New Theoretical Language?
ERIC Educational Resources Information Center
Zembylas, Michalinos; Bekerman, Zvi
2013-01-01
This article takes on the issue of "integrated education" in conflicted societies and engages in a deeper analysis of its dominant theoretical concepts, approaches, and implications. This analysis suggests that the theoretical language that drives current approaches of integrated education may unintentionally be complicit to the project…
Economic growth and carbon emission control
NASA Astrophysics Data System (ADS)
Zhang, Zhenyu
The question about whether environmental improvement is compatible with continued economic growth remains unclear and requires further study in a specific context. This study intends to provide insight on the potential for carbon emissions control in the absence of international agreement, and to connect the empirical analysis with a theoretical framework. The Chinese electricity generation sector is used as a case study to demonstrate the problem. Both social planner and private problems are examined to derive the conditions that define the optimal level of production and pollution. The private problem is demonstrated under emission regulation using an emission tax, an input tax and an abatement subsidy respectively. The socially optimal emission flow is imposed into the private problem. To provide tractable analytical results, a Cobb-Douglas type production function is used to describe the joint production process of the desired output and undesired output (i.e., electricity and emissions). A modified Hamiltonian approach is employed to solve the system and the steady state solutions are examined for policy implications. The theoretical analysis suggests that the ratio of emissions to desired output (referred to as the 'emission factor') is a function of productive capital and other parameters. The finding of a non-constant emission factor shows that reducing emissions without further cutting back the production of desired outputs is feasible under some circumstances. Rather than an ad hoc specification, the optimal conditions derived from our theoretical framework are used to examine the relationship between desired output and emission level. Data come from the China Statistical Yearbook and China Electric Power Yearbook, and provincial electricity generation data for the years 1993-2003 are used to estimate the Cobb-Douglas type joint production by the full information maximum likelihood (FIML) method.
The empirical analysis sheds light on the optimal policies of emissions control required for achieving the social goal in a private context. The results suggest that the efficiency of abatement technology is crucial for the timing of executing the emission tax, and that an emission tax is preferred to an input tax as long as the detection of emissions is not costly and abatement technology is efficient. Keywords: Economic growth, Carbon emission, Power generation, Joint production, China
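The non-constant emission factor result can be illustrated with a toy Cobb-Douglas joint production sketch. All functional forms and parameter values below are assumptions chosen for demonstration, not the dissertation's estimated specification.

```python
def cobb_douglas_joint(k, labor=1.0, A=1.0, alpha=0.4, beta=0.3,
                       abatement=0.2, gamma=0.6):
    """Illustrative Cobb-Douglas joint production of a desired output
    (electricity) and an undesired output (emissions). With gamma < 1,
    the emission factor e/q = (1 - abatement) * q**(gamma - 1) declines
    as productive capital k grows -- i.e., it is not constant."""
    output = A * k**alpha * labor**beta
    emissions = (1.0 - abatement) * output**gamma
    return output, emissions, emissions / output   # last term: emission factor

for k in (1.0, 2.0, 4.0):
    q, e, factor = cobb_douglas_joint(k)
    print(round(q, 3), round(e, 3), round(factor, 3))
```

In this sketch output rises with capital while the emission factor falls, which is the sense in which emissions can be reduced without cutting back desired output.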
Information theoretical assessment of image gathering and coding for digital restoration
NASA Technical Reports Server (NTRS)
Huck, Friedrich O.; John, Sarah; Reichenbach, Stephen E.
1990-01-01
The process of image-gathering, coding, and restoration is presently treated in its entirety rather than as a catenation of isolated tasks, on the basis of the relationship between the spectral information density of a transmitted signal and the restorability of images from the signal. This 'information-theoretic' assessment accounts for the information density and efficiency of the acquired signal as a function of the image-gathering system's design and radiance-field statistics, as well as for the information efficiency and data compression that are obtainable through the combination of image gathering with coding to reduce signal redundancy. It is found that high information efficiency is achievable only through minimization of image-gathering degradation as well as signal redundancy.
A New Methodology for Simultaneous Multi-layer Retrievals of Ice and Liquid Water Cloud Properties
NASA Astrophysics Data System (ADS)
Sourdeval, O.; Labonnote, L.; Baran, A. J.; Brogniez, G.
2014-12-01
It is widely recognized that the study of clouds has nowadays become one of the major concerns of the climate research community. Consequently, a multitude of retrieval methodologies have been developed during the last decades in order to obtain accurate retrievals of cloud properties that can be supplied to climate models. Most of the current methodologies have proven satisfactory for separately retrieving ice or liquid cloud properties, but very few of them have attempted simultaneous retrievals of these two cloud types. Recent studies nevertheless show that the omission of one of these layers can have strong consequences on the retrievals and their accuracy. In this study, a new methodology that simultaneously retrieves the properties of ice and liquid clouds is presented. The optical thickness and the effective radius of up to two liquid cloud layers and the ice water path of one ice cloud layer are simultaneously retrieved, along with an accurate estimation of their uncertainties. Radiometric measurements ranging from the visible to the thermal infrared are used for performing the retrievals. In order to quantify the capabilities and limitations of our methodology, the results of a theoretical information content analysis are first presented. This analysis provides an a priori understanding of how much information should be expected on each of the retrieval parameters in different atmospheric conditions, and which set of channels is likely to provide this information. After such theoretical considerations, global retrievals corresponding to several months of A-Train data are presented. Comparisons of our retrievals with operational products from active and passive instruments are performed and show good global agreement. These comparisons are useful for validating our retrievals but also for testing how operational products can be influenced by multi-layer configurations.
NASA Astrophysics Data System (ADS)
Gordova, Yulia; Gorbatenko, Valentina; Martynova, Yulia; Shulgina, Tamara
2014-05-01
A problem of making education relevant to workplace tasks is a key problem of higher education, because old-school training programs are not keeping pace with the rapidly changing situation in the professional field of environmental sciences. A joint group of specialists from Tomsk State University and the Siberian Center for Environmental Research and Training/IMCES SB RAS developed several new courses for students of the "Climatology" and "Meteorology" specialties, which combine theoretical knowledge from up-to-date environmental sciences with practical tasks. To organize the educational process we use the open-source course management system Moodle (www.moodle.org), which gave us an opportunity to combine text and multimedia in the theoretical part of the educational courses. The hands-on approach is realized through innovative trainings performed within the information-computational platform "Climate" (http://climate.scert.ru/) using web GIS tools. These trainings contain practical tasks on climate modeling and climate change assessment and analysis, and are performed with the typical tools used by scientists in this kind of research. Thus, students are engaged in the use of modern geophysical data analysis tools, which cultivates the dynamics of their professional learning. The hands-on approach helps to fill this gap because it is the only approach that offers experience, increases student involvement, and advances the use of modern information and communication tools. The courses are implemented at Tomsk State University and help form a modern curriculum in the Earth system science area. This work is partially supported by SB RAS project VIII.80.2.1 and RFBR grants 13-05-12034 and 14-05-00502.
Hershberger, Patricia E.; Finnegan, Lorna; Altfeld, Susan; Lake, Sara; Hirshfeld-Cytron, Jennifer
2014-01-01
Background Young women with cancer now face the complex decision about whether to undergo fertility preservation. Yet little is known about how these women process information involved in making this decision. Objective The purpose of this paper is to expand theoretical understanding of the decision-making process by examining aspects of information processing among young women diagnosed with cancer. Methods Using a grounded theory approach, 27 women with cancer participated in individual, semi-structured interviews. Data were coded and analyzed using constant-comparison techniques that were guided by five dimensions within the Contemplate phase of the decision-making process framework. Results In the first dimension, young women acquired information primarily from clinicians and Internet sources. Experiential information, often obtained from peers, occurred in the second dimension. Preferences and values were constructed in the third dimension as women acquired factual, moral, and ethical information. Women desired tailored, personalized information that was specific to their situation in the fourth dimension; however, women struggled with communicating these needs to clinicians. In the fifth dimension, women offered detailed descriptions of clinician behaviors that enhance or impede decisional debriefing. Conclusion Better understanding of theoretical underpinnings surrounding women’s information processes can facilitate decision support and improve clinical care. PMID:24552086
Hop limited epidemic-like information spreading in mobile social networks with selfish nodes
NASA Astrophysics Data System (ADS)
Wu, Yahui; Deng, Su; Huang, Hongbin
2013-07-01
Similar to epidemics, information can be transmitted directly among users in mobile social networks. Different from epidemics, we can control the spreading process by adjusting the corresponding parameters (e.g., hop count) directly. This paper proposes a theoretical model to evaluate the performance of an epidemic-like spreading algorithm, in which the maximal hop count of the information is limited. In addition, our model can be used to evaluate the impact of users’ selfish behavior. Simulations show the accuracy of our theoretical model. Numerical results show that the information hop count can have an important impact. In addition, the impact of selfish behavior is related to the information hop count.
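The hop-count control described in this abstract can be sketched with a small Monte Carlo simulation: each message carries a hop counter, is forwarded only while the counter is below the limit, and selfish nodes receive but never forward. This is a generic illustration of the mechanism, not the paper's analytical model; the network construction and all parameter values are assumptions.

```python
import random

def hop_limited_spread(n=2000, avg_deg=8, max_hops=3, p_forward=0.6,
                       selfish_frac=0.3, seed=1):
    """Fraction of nodes reached by hop-limited epidemic-like spreading
    on a random directed contact structure. 'Selfish' nodes accept the
    message but refuse to forward it."""
    rng = random.Random(seed)
    neighbors = [[rng.randrange(n) for _ in range(avg_deg)] for _ in range(n)]
    selfish = set(rng.sample(range(n), int(selfish_frac * n)))
    informed = {0}
    frontier = [(0, 0)]                      # (node, hops used so far)
    while frontier:
        node, hops = frontier.pop()
        if hops >= max_hops or node in selfish:
            continue                         # hop limit reached, or selfish node
        for nb in neighbors[node]:
            if nb not in informed and rng.random() < p_forward:
                informed.add(nb)
                frontier.append((nb, hops + 1))
    return len(informed) / n

print(round(hop_limited_spread(max_hops=1), 3),
      round(hop_limited_spread(max_hops=4), 3))
```

Raising the hop limit (or lowering the selfish fraction) increases coverage, matching the abstract's conclusion that the impact of selfish behavior is tied to the information hop count.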
NASA Astrophysics Data System (ADS)
Blasch, Erik; Kadar, Ivan; Hintz, Kenneth; Biermann, Joachim; Chong, Chee-Yee; Salerno, John; Das, Subrata
2007-04-01
Resource management (or process refinement) is critical for information fusion operations in that users, sensors, and platforms need to be informed, based on mission needs, on how to collect, process, and exploit data. To address these growing concerns, a panel session was conducted at the International Society of Information Fusion Conference in 2006 to discuss the various issues surrounding the interaction of Resource Management with Level 2/3 Situation and Threat Assessment. This paper briefly consolidates the discussions of the invited panelists. The common themes include: (1) addressing the user in system management, sensor control, and knowledge-based information collection; (2) determining a standard set of fusion metrics for optimization and evaluation based on the application; (3) allowing dynamic and adaptive updating to deliver timely information needs and information rates; (4) optimizing the joint objective functions at all information fusion levels based on decision-theoretic analysis; (5) providing constraints from distributed resource mission planning and scheduling; and (6) defining L2/3 situation entity definitions for knowledge discovery, modeling, and information projection.
Yardley, Lucy; Morrison, Leanne G; Andreou, Panayiota; Joseph, Judith; Little, Paul
2010-09-17
It is recognised as good practice to use qualitative methods to elicit users' views of internet-delivered health-care interventions during their development. This paper seeks to illustrate the advantages of combining usability testing with 'theoretical modelling', i.e. analyses that relate the findings of qualitative studies during intervention development to social science theory, in order to gain deeper insights into the reasons and context for how people respond to the intervention. This paper illustrates how usability testing may be enriched by theoretical modelling by means of two qualitative studies of users' views of the delivery of information in an internet-delivered intervention to help users decide whether they needed to seek medical care for their cold or flu symptoms. In Study 1, 21 participants recruited from a city in southern England were asked to 'think aloud' while viewing draft web-pages presented in paper format. In Study 2, views of our prototype website were elicited, again using think aloud methods, in a sample of 26 participants purposively sampled for diversity in education levels. Both data-sets were analysed by thematic analysis. Study 1 revealed that although the information provided by the draft web-pages had many of the intended empowering benefits, users often felt overwhelmed by the quantity of information. Relating these findings to theory and research on factors influencing preferences for information-seeking we hypothesised that to meet the needs of different users (especially those with lower literacy levels) our website should be designed to provide only essential personalised advice, but with options to access further information. Study 2 showed that our website design did prove accessible to users with different literacy levels. However, some users seemed to want still greater control over how information was accessed. 
Educational level need not be an insuperable barrier to appreciating web-based access to detailed health-related information, provided that users feel they can quickly gain access to the specific information they seek.