ERIC Educational Resources Information Center
Li, Feifei
2017-01-01
An information-correction method for testlet-based tests is introduced. This method takes advantage of both generalizability theory (GT) and item response theory (IRT). The measurement error for the examinee proficiency parameter is often underestimated when a unidimensional conditional-independence IRT model is specified for a testlet dataset. By…
Saeed, Faisal; Salim, Naomie; Abdo, Ammar
2013-07-01
Many consensus clustering methods have been applied in different areas such as pattern recognition, machine learning, information theory and bioinformatics. However, few methods have been used for clustering chemical compounds. In this paper, an information theory- and voting-based algorithm, the Adaptive Cumulative Voting-based Aggregation Algorithm (A-CVAA), was examined for combining multiple clusterings of chemical structures. The effectiveness of the clusterings was evaluated based on the ability of the clustering method to separate active from inactive molecules in each cluster, and the results were compared with Ward's method. The MDL Drug Data Report (MDDR) dataset and the Maximum Unbiased Validation (MUV) dataset were used. Experiments suggest that the adaptive cumulative voting-based consensus method can improve the effectiveness of combining multiple clusterings of chemical structures.
Modeling method of time sequence model based grey system theory and application proceedings
NASA Astrophysics Data System (ADS)
Wei, Xuexia; Luo, Yaling; Zhang, Shiqiang
2015-12-01
This article presents a modeling method for the grey system GM(1,1) model based on information reuse and grey system theory. The method not only greatly enhances the fitting and predictive accuracy of the GM(1,1) model, but also retains the conventional approach's merit of simple computation. On this basis, we present a syphilis trend forecasting method built on information reuse and the grey system GM(1,1) model.
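For readers unfamiliar with GM(1,1), the conventional model the abstract builds on can be sketched in a few lines: accumulate the series, fit the grey differential equation by least squares, and invert the accumulation. The sketch below (Python, illustrative data) implements only the standard model, not the paper's information-reuse variant.

```python
import numpy as np

def gm11_fit_predict(x0, horizon=3):
    """Fit a standard GM(1,1) grey model to a short positive series and
    forecast `horizon` steps ahead. A minimal sketch of the conventional
    model; the paper's information-reuse variant is not reproduced."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                     # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])          # background values (mean generation)
    # Solve x0(k) + a*z1(k) = b for [a, b] by least squares.
    B = np.column_stack([-z1, np.ones(len(z1))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    n = len(x0)
    k = np.arange(n + horizon)
    # Time response: x1_hat(k+1) = (x0(1) - b/a) * exp(-a*k) + b/a
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.concatenate([[x0[0]], np.diff(x1_hat)])  # inverse AGO
    return x0_hat[:n], x0_hat[n:]

fit, forecast = gm11_fit_predict([2.87, 3.28, 3.34, 3.62, 3.79], horizon=2)
print(fit, forecast)
```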
An information theory criteria based blind method for enumerating active users in DS-CDMA system
NASA Astrophysics Data System (ADS)
Samsami Khodadad, Farid; Abed Hodtani, Ghosheh
2014-11-01
In this paper, a new blind algorithm for active user enumeration in asynchronous direct-sequence code division multiple access (DS-CDMA) under a multipath channel scenario is proposed. The proposed method is based on information theory criteria. There are two main categories of information criteria widely used in active user enumeration: the Akaike Information Criterion (AIC) and the Minimum Description Length (MDL) criterion. The main difference between the two criteria is their penalty functions. Because of this difference, MDL is a consistent enumerator with better performance at higher signal-to-noise ratios (SNR), whereas AIC is preferred at lower SNRs. In the sequel, we propose an SNR-adaptive method based on subspace analysis and a trained genetic algorithm that attains the performance of both. Moreover, unlike previous methods, our method uses only a single antenna, which decreases hardware complexity. Simulation results show that the proposed method is capable of estimating the number of active users without any prior knowledge, and demonstrate the method's efficiency.
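The two criteria the paper starts from are the classic Wax-Kailath eigenvalue tests: both compare the geometric and arithmetic means of the smallest sample-covariance eigenvalues, differing only in the penalty term. A minimal sketch with illustrative eigenvalues; the paper's SNR-adaptive subspace/genetic-algorithm hybrid is not reproduced.

```python
import numpy as np

def enumerate_sources(eigvals, n_snapshots):
    """Wax-Kailath AIC/MDL enumeration of signal sources from the
    eigenvalues of a sample covariance matrix."""
    lam = np.sort(np.asarray(eigvals, dtype=float))[::-1]
    p, N = len(lam), n_snapshots
    aic, mdl = [], []
    for k in range(p):
        tail = lam[k:]                     # the p-k smallest eigenvalues
        ratio = np.exp(np.mean(np.log(tail))) / np.mean(tail)  # geo/arith mean
        ll = -N * (p - k) * np.log(ratio)  # negative log-likelihood term
        aic.append(2 * ll + 2 * k * (2 * p - k))
        mdl.append(ll + 0.5 * k * (2 * p - k) * np.log(N))
    return int(np.argmin(aic)), int(np.argmin(mdl))

# Three dominant eigenvalues suggest three active users.
k_aic, k_mdl = enumerate_sources([9.1, 7.4, 5.2, 1.1, 1.0, 0.95, 0.9], 500)
print(k_aic, k_mdl)
```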
Managing for resilience: an information theory-based ...
Ecosystems are complex and multivariate; hence, methods to assess the dynamics of ecosystems should have the capacity to evaluate multiple indicators simultaneously. Most research on identifying leading indicators of regime shifts has focused on univariate methods and simple models, which have limited utility when evaluating real ecosystems, particularly because drivers are often unknown. We discuss some common univariate and multivariate approaches for detecting critical transitions in ecosystems and demonstrate their capabilities via case studies. Synthesis and applications. We illustrate the utility of an information theory-based index for assessing ecosystem dynamics. Trends in this index also provide a sentinel of both abrupt and gradual transitions in ecosystems. In response to the need to identify leading indicators of regime shifts in ecosystems, our research compares traditional indicators and Fisher information, an information theory-based method, by examining four case study systems. Results demonstrate the utility of these methods and offer great promise for quantifying and managing for resilience.
ERIC Educational Resources Information Center
Jesness, Bradley
This paper examines concepts in information-processing theory which are likely to be relevant to development and characterizes the methods and data upon which the concepts are based. Among the concepts examined are those which have slight empirical grounds. Other concepts examined are those which seem to have empirical bases but which are…
Jiang, Wen; Cao, Ying; Yang, Lin; He, Zichang
2017-08-28
Specific emitter identification plays an important role in contemporary military affairs. However, most existing specific emitter identification methods have not taken the processing of uncertain information into account. Therefore, this paper proposes a time-space domain information fusion method based on Dempster-Shafer evidence theory, which is able to deal with uncertain information in the process of specific emitter identification. In this approach, each radar generates a group of evidence based on the information it obtains, and the main task is to fuse the multiple groups of evidence to reach a reasonable result. Within the framework of a recursive centralized fusion model, the proposed method incorporates a correlation coefficient, which measures the relevance between bodies of evidence, and a quantum mechanical approach, which is based on the parameters of the radar itself. The simulation results of an illustrative example demonstrate that the proposed method can effectively deal with uncertain information and reach a reasonable recognition result.
Rolling bearing fault diagnosis based on information fusion using Dempster-Shafer evidence theory
NASA Astrophysics Data System (ADS)
Pei, Di; Yue, Jianhai; Jiao, Jing
2017-10-01
This paper presents a fault diagnosis method for rolling bearings based on information fusion. Acceleration sensors are arranged at different positions to collect bearing vibration data as diagnostic evidence. Dempster-Shafer (D-S) evidence theory is used to fuse the multi-sensor data and improve diagnostic accuracy. The efficiency of the proposed method is demonstrated on a high-speed train transmission test bench. The experimental results show that the proposed method improves rolling bearing fault diagnosis accuracy compared with traditional signal analysis methods.
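The fusion step described here is Dempster's rule of combination, which multiplies masses on intersecting focal elements and renormalizes by the non-conflicting mass. A minimal sketch with hypothetical sensor masses; the fault hypotheses and numbers are illustrative, not from the paper.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions over a frame
    of discernment; focal elements are frozensets."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb             # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two sensors rate hypotheses: inner-race fault (I) vs outer-race fault (O).
F = frozenset
m_sensor1 = {F({"I"}): 0.6, F({"O"}): 0.1, F({"I", "O"}): 0.3}
m_sensor2 = {F({"I"}): 0.5, F({"O"}): 0.2, F({"I", "O"}): 0.3}
print(dempster_combine(m_sensor1, m_sensor2))
```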
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ridolfi, E.; Napolitano, F., E-mail: francesco.napolitano@uniroma1.it; Alfonso, L.
2016-06-08
The description of river topography has a crucial role in accurate one-dimensional (1D) hydraulic modelling. Specifically, cross-sectional data define the riverbed elevation, the flood-prone area, and thus, the hydraulic behavior of the river. Here, the problem of the optimal cross-sectional spacing is solved through an information theory-based concept. The optimal subset of locations is the one with the maximum information content and the minimum amount of redundancy. The original contribution is the introduction of a methodology to sample river cross sections in the presence of bridges. The approach is tested on the Grosseto River (IT) and is compared to existing guidelines. The results show that the information theory-based approach can support traditional methods to estimate rivers' cross-sectional spacing.
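The maximum-information/minimum-redundancy idea can be sketched as a greedy selection over quantized water-level series: repeatedly add the location that adds the most entropy while sharing the least mutual information with those already chosen. The scoring below is a simplified stand-in for the paper's exact objective, and the data are simulated.

```python
import numpy as np
from collections import Counter

def entropy(*cols, bins=8):
    """Joint Shannon entropy (bits) of one or more series after quantization."""
    digitized = [np.digitize(c, np.histogram(c, bins)[1][:-1]) for c in cols]
    counts = Counter(zip(*digitized))
    p = np.array(list(counts.values()), float) / len(digitized[0])
    return -np.sum(p * np.log2(p))

def select_sections(series, k):
    """Greedy pick of k cross sections: maximize added information,
    penalize average mutual information with already-picked ones."""
    chosen = [max(range(len(series)), key=lambda i: entropy(series[i]))]
    while len(chosen) < k:
        def score(i):
            mi = sum(entropy(series[i]) + entropy(series[j])
                     - entropy(series[i], series[j]) for j in chosen)
            return entropy(series[i]) - mi / len(chosen)
        chosen.append(max((i for i in range(len(series)) if i not in chosen),
                          key=score))
    return chosen

rng = np.random.default_rng(0)
levels = [rng.normal(size=200) for _ in range(6)]   # simulated water levels
print(select_sections(levels, 3))
```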
Grounded Theory as a "Family of Methods": A Genealogical Analysis to Guide Research
ERIC Educational Resources Information Center
Babchuk, Wayne A.
2011-01-01
This study traces the evolution of grounded theory from a nuclear to an extended family of methods and considers the implications that decision-making based on informed choices throughout all phases of the research process has for realizing the potential of grounded theory for advancing adult education theory and practice. [This paper was…
A Natural Teaching Method Based on Learning Theory.
ERIC Educational Resources Information Center
Smilkstein, Rita
1991-01-01
The natural teaching method is active and student-centered, based on schema and constructivist theories, and informed by research in neuroplasticity. A schema is a mental picture or understanding of something we have learned. Humans can have knowledge only to the degree to which they have constructed schemas from learning experiences and practice.…
Academic Primer Series: Eight Key Papers about Education Theory
Gottlieb, Michael; Boysen-Osborn, Megan; Chan, Teresa M.; Krzyzaniak, Sara M.; Pineda, Nicolas; Spector, Jordan; Sherbino, Jonathan
2017-01-01
Introduction Many teachers adopt instructional methods based on assumptions of best practices without attention to or knowledge of supporting education theory. Familiarity with a variety of theories informs education that is efficient, strategic, and evidence-based. As part of the Academic Life in Emergency Medicine Faculty Incubator Program, a list of key education theories for junior faculty was developed. Methods A list of key papers on theories relevant to medical education was generated using an expert panel, a virtual community of practice synthetic discussion, and a social media call for resources. A three-round, Delphi-informed voting methodology including novice and expert educators produced a rank order of the top papers. Results These educators identified 34 unique papers. Eleven papers described the general use of education theory, while 23 papers focused on a specific theory. The top three papers on general education theories and top five papers on specific education theory were selected and summarized. The relevance of each paper for junior faculty and faculty developers is also presented. Conclusion This paper presents a reading list of key papers for junior faculty in medical education roles. Three papers about general education theories and five papers about specific educational theories are identified and annotated. These papers may help provide foundational knowledge in education theory to inform junior faculty teaching practice. PMID:28210367
Research and Development of Web-Based Virtual Online Classroom
ERIC Educational Resources Information Center
Yang, Zongkai; Liu, Qingtang
2007-01-01
Building a web-based virtual learning environment depends on information technologies and concerns the methods and theories of technology-supported learning. A web-based virtual online classroom is designed and developed based on learning theories and streaming media technologies. It is composed of two parts: an instructional communicating environment…
Evidence Combination From an Evolutionary Game Theory Perspective.
Deng, Xinyang; Han, Deqiang; Dezert, Jean; Deng, Yong; Shyr, Yu
2016-09-01
Dempster-Shafer evidence theory is a primary methodology for multisource information fusion because it is good at dealing with uncertain information. This theory provides Dempster's rule of combination to synthesize multiple evidences from various information sources. However, in some cases, counter-intuitive results may be obtained based on that combination rule. Numerous new or improved methods have been proposed to suppress these counter-intuitive results based on perspectives such as minimizing the information loss or deviation. Inspired by evolutionary game theory, this paper considers a biological and evolutionary perspective to study the combination of evidences. An evolutionary combination rule (ECR) is proposed to help find the most biologically supported proposition in a multievidence system. Within the proposed ECR, we develop a Jaccard matrix game to formalize the interaction between propositions in evidences, and utilize the replicator dynamics to mimic the evolution of propositions. Experimental results show that the proposed ECR can effectively suppress the counter-intuitive behaviors that appear in typical paradoxes of evidence theory, compared with many existing methods. Properties of the ECR, such as solution stability and convergence, have been mathematically proved as well.
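The evolutionary machinery behind the ECR is replicator dynamics: the share of each proposition grows when its fitness against the current population exceeds the average. A minimal sketch with a hypothetical similarity payoff matrix standing in for the paper's Jaccard matrix game.

```python
import numpy as np

def replicator(payoff, x, steps=2000, dt=0.01):
    """Discrete-time replicator dynamics: proposition shares grow when
    their fitness exceeds the population average."""
    for _ in range(steps):
        fitness = payoff @ x
        x = x + dt * x * (fitness - x @ fitness)
        x = np.clip(x, 0, None)
        x /= x.sum()
    return x

# Hypothetical similarity matrix between three propositions, seeded with
# shares proportional to their combined evidence masses.
A = np.array([[1.0, 0.2, 0.1],
              [0.2, 1.0, 0.3],
              [0.1, 0.3, 1.0]])
shares = replicator(A, np.array([0.5, 0.3, 0.2]))
print(shares)   # the surviving mix of propositions
```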
NASA Astrophysics Data System (ADS)
Guan, Yihong; Luo, Yatao; Yang, Tao; Qiu, Lei; Li, Junchang
2012-01-01
The spatial information features of Markov random field images are used in image segmentation; this can effectively remove noise and yield more accurate segmentation results. Based on the fuzziness and clustering of pixel grayscale information, we find the clustering centers of the different tissues and the background in a medical image using the fuzzy c-means clustering method. We then find the threshold points for multi-threshold segmentation using a two-dimensional histogram method and segment the image. Multivariate information is fused based on Dempster-Shafer evidence theory to obtain image fusion and segmentation. This paper combines the above three theories to propose a new human brain image segmentation method. Experimental results show that the segmentation result is more in line with human vision, and is of vital significance for accurate analysis and application of tissues.
ERIC Educational Resources Information Center
Huvila, Isto
2008-01-01
Introduction: A work roles and role theory-based approach to conceptualise human information activity, denoted information work analysis is discussed. The present article explicates the approach and its special characteristics and benefits in comparison to earlier methods of analysing human information work. Method: The approach is discussed in…
Review of Reliability-Based Design Optimization Approach and Its Integration with Bayesian Method
NASA Astrophysics Data System (ADS)
Zhang, Xiangnan
2018-03-01
Many uncertain factors arise in practical engineering, such as the external load environment, material properties, geometrical shape, and initial and boundary conditions. Reliability methods measure the structural safety condition and determine the optimal combination of design parameters based on probabilistic theory. Reliability-based design optimization (RBDO), which combines reliability theory with optimization, is the most commonly used approach to minimize structural cost or other performance measures under uncertain variables. However, it cannot handle various kinds of incomplete information. The Bayesian approach is utilized to incorporate this kind of incomplete information in its uncertainty quantification. In this paper, the RBDO approach and its integration with the Bayesian method are introduced.
An Improved Image Matching Method Based on Surf Algorithm
NASA Astrophysics Data System (ADS)
Chen, S. J.; Zheng, S. Z.; Xu, Z. G.; Guo, C. C.; Ma, X. L.
2018-04-01
Many state-of-the-art image matching methods based on feature matching have been widely studied in the remote sensing field. These feature matching methods achieve high operating efficiency but suffer from low accuracy and robustness. This paper proposes an improved image matching method based on the SURF algorithm. The proposed method introduces a color invariant transformation, information entropy theory, and a series of constraint conditions to improve feature point detection and matching accuracy. First, the color invariant transformation model is applied to the two matching images to retain more color information during the matching process, and information entropy theory is used to identify the image content carrying the most information. Then the SURF algorithm is applied to detect and describe feature points in the images. Finally, constraint conditions, including Delaunay triangulation construction, a similarity function, and a projective invariant, are employed to eliminate mismatches and improve matching precision. The proposed method has been validated on remote sensing images, and the results demonstrate its high precision and robustness.
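The information entropy measure used to gauge how much information an image carries is the Shannon entropy of its grayscale histogram, sketched below; the color invariant transform, SURF detection, and constraint stages are not reproduced.

```python
import numpy as np

def image_entropy(gray):
    """Shannon entropy (bits/pixel) of a grayscale image histogram: a
    flat histogram (rich content) scores high, a uniform patch scores low."""
    hist = np.bincount(np.asarray(gray, dtype=np.uint8).ravel(), minlength=256)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

img = np.random.default_rng(1).integers(0, 256, (64, 64))  # stand-in image
print(image_entropy(img))
```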
NASA Astrophysics Data System (ADS)
Tacnet, Jean-Marc; Dupouy, Guillaume; Carladous, Simon; Dezert, Jean; Batton-Hubert, Mireille
2017-04-01
In mountain areas, natural phenomena such as snow avalanches, debris flows and rock falls put people and objects at risk, with sometimes dramatic consequences. Risk is classically considered as a combination of hazard, the combination of the intensity and frequency of the phenomenon, and vulnerability, which corresponds to the consequences of the phenomenon on exposed people and material assets. Risk management consists in identifying the risk level as well as choosing the best strategies for risk prevention, i.e. mitigation. In the context of natural phenomena in mountainous areas, technical and scientific knowledge is often lacking. Risk management decisions are therefore based on imperfect information. This information comes from more or less reliable sources ranging from historical data and expert assessments to numerical simulations. Finally, risk management decisions are the result of complex knowledge management and reasoning processes. Tracing the information and propagating information quality from data acquisition to decisions are therefore important steps in the decision-making process. One major goal today is therefore to assist decision-making while considering the availability, quality and reliability of information content and sources. A global integrated framework is proposed to improve the risk management process in a context of information imperfection provided by more or less reliable sources: uncertainty as well as imprecision, inconsistency and incompleteness are considered. Several methods are used and associated in an original way: sequential decision context description, development of specific multi-criteria decision-making methods, imperfection propagation in numerical modeling, and information fusion. This framework not only assists in decision-making but also traces the process and evaluates the impact of information quality on decision-making. We focus on and present two main developments. The first relates to uncertainty and imprecision propagation in numerical modeling, using both a classical Monte-Carlo probabilistic approach and a so-called hybrid approach using possibility theory. The second deals with new multi-criteria decision-making methods which consider information imperfection, source reliability, importance and conflict, using fuzzy sets as well as possibility and belief function theories. The implemented methods consider information imperfection propagation and information fusion in total aggregation methods such as AHP (Saaty, 1980), in partial aggregation methods such as the Electre outranking method (see Soft Electre Tri), and in decisions in certain but also risky or uncertain contexts (see the new COWA-ER and FOWA-ER, Cautious and Fuzzy Ordered Weighted Averaging-Evidential Reasoning). For example, the ER-MCDA methodology considers expert assessment as a multi-criteria decision process based on imperfect information provided by more or less heterogeneous, reliable and conflicting sources: it mixes AHP, fuzzy sets theory, possibility theory and belief function theory using the DSmT (Dezert-Smarandache Theory) framework, which provides powerful fusion rules.
A Model-Driven Development Method for Management Information Systems
NASA Astrophysics Data System (ADS)
Mizuno, Tomoki; Matsumoto, Keinosuke; Mori, Naoki
Traditionally, management information systems (MIS) have been developed without formal methods. With informal methods, an MIS is developed over its lifecycle without any models, which causes many problems, such as unreliable system design specifications. In order to overcome these problems, a model theory approach was proposed. The approach is based on the idea that a system can be modeled by automata and set theory. However, it is very difficult to generate automata of the system to be developed right from the start. On the other hand, there is a model-driven development method that can flexibly accommodate changes in business logic or implementation technologies. In model-driven development, a system is modeled using a modeling language such as UML. This paper proposes a new development method for management information systems that applies the model-driven development method to a component of the model theory approach. The experiment showed that the method reduced development effort by more than 30%.
Detecting spatial regimes in ecosystems | Science Inventory ...
Research on early warning indicators has generally focused on assessing temporal transitions with limited application of these methods to detecting spatial regimes. Traditional spatial boundary detection procedures that result in ecoregion maps are typically based on ecological potential (i.e. potential vegetation), and often fail to account for ongoing changes due to stressors such as land use change and climate change and their effects on plant and animal communities. We use Fisher information, an information theory based method, on both terrestrial and aquatic animal data (US Breeding Bird Survey and marine zooplankton) to identify ecological boundaries, and compare our results to traditional early warning indicators, conventional ecoregion maps, and multivariate analysis such as nMDS (non-metric Multidimensional Scaling) and cluster analysis. We successfully detect spatial regimes and transitions in both terrestrial and aquatic systems using Fisher information. Furthermore, Fisher information provided explicit spatial information about community change that is absent from other multivariate approaches. Our results suggest that defining spatial regimes based on animal communities may better reflect ecological reality than do traditional ecoregion maps, especially in our current era of rapid and unpredictable ecological change. Use an information theory based method to identify ecological boundaries and compare our results to traditional early warning
MAVEN-SA: Model-Based Automated Visualization for Enhanced Situation Awareness
2005-11-01
…methods. But historically, as arts evolve, these how-to methods become systematized and codified (e.g. the development and refinement of color theory)… schema (as necessary) 3. Draw inferences from new knowledge to support the decision-making process… Visual language theory suggests that humans process… informed by theories of learning. Over the years, many types of software have been developed to support student learning. The various types of…
Li, Jingchao; Cao, Yunpeng; Ying, Yulong; Li, Shuying
2016-01-01
Bearing failure is one of the dominant causes of failure and breakdowns in rotating machinery, leading to huge economic loss. Aiming at the nonstationary and nonlinear characteristics of bearing vibration signals as well as the complexity of condition-indicating information distribution in the signals, a novel rolling element bearing fault diagnosis method based on multifractal theory and gray relation theory was proposed in the paper. Firstly, a generalized multifractal dimension algorithm was developed to extract the characteristic vectors of fault features from the bearing vibration signals, which can offer more meaningful and distinguishing information reflecting different bearing health status in comparison with conventional single fractal dimension. After feature extraction by multifractal dimensions, an adaptive gray relation algorithm was applied to implement an automated bearing fault pattern recognition. The experimental results show that the proposed method can identify various bearing fault types as well as severities effectively and accurately. PMID:28036329
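The recognition stage rests on gray relational analysis: a measured feature vector is matched to the reference template with the highest relational grade. A simplified per-pair sketch using Deng's coefficient with distinguishing coefficient rho; the multifractal-dimension templates are hypothetical numbers for illustration.

```python
import numpy as np

def gray_relational_grade(reference, candidate, rho=0.5):
    """Deng's gray relational coefficient averaged over features; the
    multifractal feature extraction stage is not reproduced."""
    diff = np.abs(np.asarray(reference, float) - np.asarray(candidate, float))
    dmax = diff.max() if diff.max() > 0 else 1e-12   # guard identical vectors
    coeff = (diff.min() + rho * dmax) / (diff + rho * dmax)
    return coeff.mean()

# Hypothetical multifractal-dimension templates per bearing condition.
templates = {"normal":     [1.9, 1.6, 1.3],
             "inner race": [2.4, 1.9, 1.5],
             "outer race": [2.2, 2.0, 1.8]}
measured = [2.3, 1.9, 1.6]
grades = {k: gray_relational_grade(v, measured) for k, v in templates.items()}
print(max(grades, key=grades.get), grades)
```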
Cooperative Localization for Multi-AUVs Based on GM-PHD Filters and Information Entropy Theory
Zhang, Lichuan; Wang, Tonghao; Xu, Demin
2017-01-01
Cooperative localization (CL) is considered a promising method for underwater localization with respect to multiple autonomous underwater vehicles (multi-AUVs). In this paper, we proposed a CL algorithm based on information entropy theory and the probability hypothesis density (PHD) filter, aiming to enhance the global localization accuracy of the follower. In the proposed framework, the follower carries lower cost navigation systems, whereas the leaders carry better ones. Meanwhile, the leaders acquire the followers’ observations, including both measurements and clutter. Then, the PHD filters are utilized on the leaders and the results are communicated to the followers. The followers then perform weighted summation based on all received messages and obtain a final positioning result. Based on the information entropy theory and the PHD filter, the follower is able to acquire a precise knowledge of its position. PMID:28991191
ERIC Educational Resources Information Center
St Quinton, Tom; Brunton, Julie A.
2018-01-01
Purpose: This study is the 3rd piece of formative research utilizing the theory of planned behavior to inform the development of a behavior change intervention. Focus groups were used to identify reasons for and solutions to previously identified key beliefs in addition to potentially effective behavior change techniques. Method: A purposive…
A Study of Driver's Route Choice Behavior Based on Evolutionary Game Theory
Jiang, Xiaowei; Ji, Yanjie; Deng, Wei
2014-01-01
This paper proposes a route choice analytic method that embeds cumulative prospect theory in evolutionary game theory to analyze how the drivers adjust their route choice behaviors under the influence of the traffic information. A simulated network with two alternative routes and one variable message sign is built to illustrate the analytic method. We assume that the drivers in the transportation system are bounded rational, and the traffic information they receive is incomplete. An evolutionary game model is constructed to describe the evolutionary process of the drivers' route choice decision-making behaviors. Here we conclude that the traffic information plays an important role in the route choice behavior. The driver's route decision-making process develops towards different evolutionary stable states in accordance with different transportation situations. The analysis results also demonstrate that employing cumulative prospect theory and evolutionary game theory to study the driver's route choice behavior is effective. This analytic method provides an academic support and suggestion for the traffic guidance system, and may optimize the travel efficiency to a certain extent. PMID:25610455
Liang, Zhaohui; Liu, Jun; Huang, Jimmy X; Zeng, Xing
2018-01-01
The genetic polymorphism of Cytochrome P450 (CYP450) is considered one of the main causes of adverse drug reactions (ADRs). In order to explore the latent correlations between ADRs and potentially corresponding single-nucleotide polymorphisms (SNPs) in CYP450, three algorithms based on information theory are used as the main method to predict possible relations. The study uses a retrospective case-control design to explore the potential relation of ADRs to specific genomic locations and SNPs. Genomic data collected from 53 healthy volunteers are used for the analysis, and genomic data from another 30 healthy volunteers excluded from the study are used as the control group. The SNPs at five loci of CYP2D6*2, *10, *14 and CYP1A2*1C, *1F are detected by an Applied Biosystems 3130xl analyzer. The raw data are processed by ChromasPro to detect the specific alleles at the above loci in each sample. The secondary data are reorganized and processed in R together with the ADR reports from clinical records. Three information theory-based algorithms are implemented for the screening task: JMI, CMIM, and mRMR. If a SNP is selected by more than two algorithms, we conclude with confidence that it is related to the corresponding ADR. The selection results are compared with a control decision tree + LASSO regression model. In the study group where ADRs occur, 10 SNPs are considered relevant to the occurrence of a specific ADR by the combined information theory model. In comparison, only 5 SNPs are considered relevant to a specific ADR by the decision tree + LASSO regression model. In addition, the new method detects more relevant SNP-ADR pairs that are affected by both SNP and dosage. This implies that the new information theory-based model is effective in discovering correlations between ADRs and CYP450 SNPs and is helpful in predicting the genotypes potentially vulnerable to some ADRs. The newly proposed information theory-based model performs better at detecting relations between SNPs and ADRs than the decision tree + LASSO regression model. The new model is more sensitive in detecting ADRs, while the old method is more reliable; the choice of algorithm should therefore depend on pragmatic needs.
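Of the three criteria, mRMR is the easiest to sketch: greedily keep SNPs with maximum mutual information with the ADR label and minimum average mutual information with SNPs already kept. The sketch below is generic, using synthetic genotypes and sklearn's mutual_info_score; the paper additionally runs JMI and CMIM and keeps SNPs selected by more than two of the algorithms.

```python
import numpy as np
from sklearn.metrics import mutual_info_score

def mrmr_select(X, y, k):
    """mRMR-style greedy screen: relevance to the label minus average
    redundancy with already-selected features."""
    chosen = []
    while len(chosen) < k:
        def score(j):
            rel = mutual_info_score(y, X[:, j])
            red = (np.mean([mutual_info_score(X[:, j], X[:, s]) for s in chosen])
                   if chosen else 0.0)
            return rel - red
        cands = [j for j in range(X.shape[1]) if j not in chosen]
        chosen.append(max(cands, key=score))
    return chosen

rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(53, 5))                     # genotypes at 5 loci
y = (X[:, 2] > 0).astype(int) ^ (rng.random(53) < 0.1)   # noisy ADR label tied to locus 2
print(mrmr_select(X, y, 2))
```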
Teaching Methods Utilizing a Field Theory Viewpoint in the Elementary Reading Program.
ERIC Educational Resources Information Center
LeChuga, Shirley; Lowry, Heath
1980-01-01
Suggests and lists sources of information on reading instruction that discuss the promotion and enrichment of the interactive learning process between children and their environment based on principles underlying the cognitive-field theory of learning. (MKM)
A Comparison of Web-Based and Face-to-Face Functional Measurement Experiments
ERIC Educational Resources Information Center
Van Acker, Frederik; Theuns, Peter
2010-01-01
Information Integration Theory (IIT) is concerned with how people combine information into an overall judgment. A method is hereby presented to perform Functional Measurement (FM) experiments, the methodological counterpart of IIT, on the Web. In a comparison of Web-based FM experiments, face-to-face experiments, and computer-based experiments in…
ERIC Educational Resources Information Center
Yamagata-Lynch, Lisa C.
2007-01-01
Understanding human activity in real-world situations often involves complicated data collection, analysis, and presentation methods. This article discusses how Cultural-Historical Activity Theory (CHAT) can inform design-based research practices that focus on understanding activity in real-world situations. I provide a sample data set with…
Persona, Marek; Kutarov, Vladimir V; Kats, Boris M; Persona, Andrzej; Marczewska, Barbara
2007-01-01
The paper describes a new method for predicting the octanol-water partition coefficient, based on molecular graph theory. The results obtained using the new method correlate well with experimental values. These results were compared with those obtained by ten other structure-correlation methods. The comparison shows that graph theory can be very useful in structure-correlation research.
An information theory framework for dynamic functional domain connectivity.
Vergara, Victor M; Miller, Robyn; Calhoun, Vince
2017-06-01
Dynamic functional network connectivity (dFNC) analyzes the time evolution of coherent activity in the brain. In this technique dynamic changes are considered for the whole brain. This paper proposes an information theory framework to measure information flowing among subsets of functional networks called functional domains. Our method aims at estimating the bits of information contained in and shared among domains. The succession of dynamic functional states is estimated at the domain level. Information quantity is based on the probabilities of observing each dynamic state. A mutual information measurement is then obtained from probabilities across domains; we named this value the cross domain mutual information (CDMI). Strong CDMIs were observed in relation to the subcortical domain. Domains related to sensory input, motor control and the cerebellum form another CDMI cluster. Information flow among other domains was seldom found. Other methods of dynamic connectivity focus on whole-brain dFNC matrices; in the current framework, information theory is applied to states estimated from pairs of multi-network functional domains. In this context, we apply information theory to measure information flow across functional domains. The identified CDMI clusters point to known information pathways in the basal ganglia and among areas of sensory input, patterns found in static functional connectivity. In contrast, CDMI across brain areas of higher level cognitive processing follows a different pattern that indicates scarce information sharing. These findings show that employing information theory to formally measure information flow through brain domains reveals additional features of functional connectivity.
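Estimating CDMI reduces to a plug-in mutual information between the state-label sequences of two domains, computed from observed state probabilities, as sketched below with hypothetical state sequences; finite-sample bias corrections are omitted.

```python
import numpy as np
from collections import Counter

def cross_domain_mutual_information(states_a, states_b):
    """Plug-in mutual information (bits) between the dynamic-state
    sequences of two functional domains."""
    n = len(states_a)
    pa, pb = Counter(states_a), Counter(states_b)
    pab = Counter(zip(states_a, states_b))
    return sum((c / n) * np.log2((c / n) / ((pa[a] / n) * (pb[b] / n)))
               for (a, b), c in pab.items())

# Hypothetical state labels over 12 windows for two domains.
subcortical = [0, 0, 1, 1, 2, 2, 2, 0, 1, 1, 0, 2]
motor       = [0, 0, 1, 1, 2, 1, 2, 0, 1, 1, 0, 2]
print(cross_domain_mutual_information(subcortical, motor))
```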
Exploring Older Adults' Health Information Seeking Behaviors
ERIC Educational Resources Information Center
Manafo, Elizabeth; Wong, Sharon
2012-01-01
Objective: To explore older adults' (55-70 years) health information-seeking behaviors. Methods: Using a qualitative methodology, based on grounded theory, data were collected using in-depth interviews. Participants were community-living, older adults in Toronto, Canada who independently seek nutrition and health information. Interview transcripts…
Greville-Harris, Maddy; Bostock, Jennifer; Din, Amy; Graham, Cynthia A; Lewith, George; Liossi, Christina; O’Riordan, Tim; White, Peter; Yardley, Lucy
2016-01-01
Background According to established ethical principles and guidelines, patients in clinical trials should be fully informed about the interventions they might receive. However, information about placebo-controlled clinical trials typically focuses on the new intervention being tested and provides limited and at times misleading information about placebos. Objective We aimed to create an informative, scientifically accurate, and engaging website that could be used to improve understanding of placebo effects among patients who might be considering taking part in a placebo-controlled clinical trial. Methods Our approach drew on evidence-, theory-, and person-based intervention development. We used existing evidence and theory about placebo effects to develop content that was scientifically accurate. We used existing evidence and theory of health behavior to ensure our content would be communicated persuasively, to an audience who might currently be ignorant or misinformed about placebo effects. A qualitative ‘think aloud’ study was conducted in which 10 participants viewed prototypes of the website and spoke their thoughts out loud in the presence of a researcher. Results The website provides information about 10 key topics and uses text, evidence summaries, quizzes, audio clips of patients’ stories, and a short film to convey key messages. Comments from participants in the think aloud study highlighted occasional misunderstandings and off-putting/confusing features. These were addressed by modifying elements of content, style, and navigation to improve participants’ experiences of using the website. Conclusions We have developed an evidence-based website that incorporates theory-based techniques to inform members of the public about placebos and placebo effects. Qualitative research ensured our website was engaging and convincing for our target audience who might not perceive a need to learn about placebo effects. Before using the website in clinical trials, it is necessary to test its effects on key outcomes including patients’ knowledge and capacity for making informed choices about placebos. PMID:27288271
ERIC Educational Resources Information Center
Jiang, Yong
2017-01-01
Traditional mathematical methods built around exactitude have limitations when applied to the processing of educational information, due to their uncertainty and imperfection. Alternative mathematical methods, such as grey system theory, have been widely applied in processing incomplete information systems and have proven effective in a number of…
ERIC Educational Resources Information Center
Association for Education in Journalism and Mass Communication.
The Communication Theory and Method Division section of the proceedings contains the following 16 papers: "Profiling TV Ratings Users: Content-Based Advisories and Their Adoption" (Robert Abelman and David Atkin); "It's All About the Information: Salience Effects on the Perceptions of News Exemplification" (Francesca R. Dillman…
Information-theoretic metamodel of organizational evolution
NASA Astrophysics Data System (ADS)
Sepulveda, Alfredo
2011-12-01
Social organizations are abstractly modeled by holarchies---self-similar connected networks---and intelligent complex adaptive multiagent systems---large networks of autonomous reasoning agents interacting via scaled processes. However, little is known of how information shapes evolution in such organizations, a gap that can lead to misleading analytics. The research problem addressed in this study was the ineffective manner in which classical model-predict-control methods used in business analytics attempt to define organization evolution. The purpose of the study was to construct an effective metamodel for organization evolution based on a proposed complex adaptive structure---the info-holarchy. Theoretical foundations of this study were holarchies, complex adaptive systems, evolutionary theory, and quantum mechanics, among other recently developed physical and information theories. Research questions addressed how information evolution patterns gleaned from the study's inductive metamodel more aptly explained volatility in organizations. In this study, a hybrid grounded theory based on abstract inductive extensions of information theories was utilized as the research methodology. An overarching heuristic metamodel was framed from the theoretical analysis of the properties of these extension theories and applied to business, neural, and computational entities. This metamodel resulted in the synthesis of a metaphor for, and generalization of, organization evolution, serving as the recommended and appropriate analytical tool to view business dynamics for future applications. This study may manifest positive social change through a fundamental understanding of complexity in business from general information theories, resulting in more effective management.
Bai, Xiao-ping; Zhang, Xi-wei
2013-01-01
Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process in which many indexes need to be evaluated to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes; uses a quantitative method for the cost index and integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes; and integrates engineering economics, reliability theories, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents the detailed computing process and steps, including selecting all order indexes, establishing the index matrix, computing the score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for risk computation in building construction projects.
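The information-entropy ingredient is typically the entropy weight method: indexes whose scores vary more across schemes carry more information and receive larger weights. A minimal sketch with illustrative scores; the cost/progress/quality/safety scoring itself is project-specific and not reproduced.

```python
import numpy as np

def entropy_weights(matrix):
    """Entropy weighting of evaluation indexes: low-entropy (high-variation)
    columns get larger weights."""
    X = np.asarray(matrix, dtype=float)
    P = X / X.sum(axis=0)                  # normalize each index column
    m = len(X)
    with np.errstate(divide="ignore", invalid="ignore"):
        E = -np.nansum(P * np.log(P), axis=0) / np.log(m)  # entropy per index
    d = 1.0 - E                            # degree of divergence
    return d / d.sum()

# Rows: candidate schemes; columns: cost, progress, quality, safety scores.
scores = [[0.8, 0.7, 0.9, 0.6],
          [0.6, 0.9, 0.8, 0.7],
          [0.9, 0.6, 0.7, 0.9]]
w = entropy_weights(scores)
print(w, (np.asarray(scores) * w).sum(axis=1))  # weighted synthesis score per scheme
```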
Generalised squeezing and information theory approach to quantum entanglement
NASA Technical Reports Server (NTRS)
Vourdas, A.
1993-01-01
It is shown that the usual one- and two-mode squeezing are based on reducible representations of the SU(1,1) group. Generalized squeezing is introduced with the use of different SU(1,1) rotations on each irreducible sector. Two-mode squeezing entangles the modes and information theory methods are used to study this entanglement. The entanglement of three modes is also studied with the use of the strong subadditivity property of the entropy.
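For the two-mode case, the entanglement entropy has a standard closed form: the reduced state of either mode is thermal with mean photon number sinh²r, giving S = cosh²r ln cosh²r − sinh²r ln sinh²r. A short numerical check:

```python
import numpy as np

def two_mode_squeezed_entropy(r):
    """Von Neumann entropy (nats) of one mode of a two-mode squeezed
    vacuum with squeezing parameter r."""
    c2, s2 = np.cosh(r) ** 2, np.sinh(r) ** 2
    return c2 * np.log(c2) - (s2 * np.log(s2) if s2 > 0 else 0.0)

for r in (0.0, 0.5, 1.0, 2.0):
    print(r, two_mode_squeezed_entropy(r))  # entanglement grows with squeezing
```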
Motivating Students in Credit-Based Information Literacy Courses: Theories and Practice.
ERIC Educational Resources Information Center
Jacobson, Trudi E.; Xu, Lijuan
2002-01-01
Discusses methods for enhancing student motivation, particularly in information literacy courses in higher education. Topics include Keller's ARCS (Attention, Relevance, Confidence, Satisfaction). motivation model; course design; teaching behaviors; teacher enthusiasm; clarity in presenting materials; interaction; active engagement; cooperative…
Digital focusing of OCT images based on scalar diffraction theory and information entropy.
Liu, Guozhong; Zhi, Zhongwei; Wang, Ruikang K
2012-11-01
This paper describes a digital method that is capable of automatically focusing optical coherence tomography (OCT) en face images without prior knowledge of the point spread function of the imaging system. The method utilizes a scalar diffraction model to simulate wave propagation from out-of-focus scatter to the focal plane, from which the propagation distance between the out-of-focus plane and the focal plane is determined automatically via an image-definition-evaluation criterion based on information entropy theory. By use of the proposed approach, we demonstrate that the lateral resolution close to that at the focal plane can be recovered from the imaging planes outside the depth of field region with minimal loss of resolution. Fresh onion tissues and mouse fat tissues are used in the experiments to show the performance of the proposed method.
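The two ingredients, scalar-diffraction propagation and an entropy-based definition criterion, can be sketched with the angular spectrum method: propagate the complex field over candidate distances and keep the plane whose intensity entropy is lowest. The exact focus criterion and parameters below are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel, dz):
    """Scalar-diffraction (angular spectrum) propagation of a complex
    en face field by distance dz; units are meters."""
    ny, nx = field.shape
    FX, FY = np.meshgrid(np.fft.fftfreq(nx, pixel), np.fft.fftfreq(ny, pixel))
    kz2 = (1.0 / wavelength) ** 2 - FX ** 2 - FY ** 2
    H = np.exp(2j * np.pi * dz * np.sqrt(np.maximum(kz2, 0.0)))
    H[kz2 < 0] = 0.0                      # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * H)

def intensity_entropy(intensity):
    """Entropy of the normalized intensity; sharper images concentrate
    energy and score lower."""
    p = intensity.ravel() / intensity.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

# Autofocus: scan candidate distances, keep the lowest-entropy plane.
rng = np.random.default_rng(0)
defocused = rng.standard_normal((128, 128)) + 1j * rng.standard_normal((128, 128))
dzs = np.linspace(-200e-6, 200e-6, 21)
best = min(dzs, key=lambda dz: intensity_entropy(
    np.abs(angular_spectrum_propagate(defocused, 1.3e-6, 5e-6, dz)) ** 2))
print(best)
```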
McKernan McKay, Mary; Alicea, Stacey; Elwyn, Laura; McClain, Zachary R B; Parker, Gary; Small, Latoya A; Mellins, Claude Ann
2014-01-01
This article describes a program of prevention and intervention research conducted by the CHAMP (Collaborative HIV prevention and Adolescent Mental health Project; McKay & Paikoff, 2007) investigative team. CHAMP refers to a set of theory-driven, evidence-informed, collaboratively designed, family-based approaches meant to address the prevention, health, and mental health needs of poverty-impacted African American and Latino urban youth who are either at risk for HIV exposure or perinatally infected and at high risk for reinfection and possible transmission. CHAMP approaches are informed by theoretical frameworks that incorporate an understanding of the critical influences of multilevel contextual factors on youth risk taking and engagement in protective health behaviors. Highly influential theories include the triadic theory of influence, social action theory, and ecological developmental perspectives. CHAMP program delivery strategies were developed via a highly collaborative process drawing upon community-based participatory research methods in order to enhance cultural and contextual sensitivity of program content and format. The development and preliminary outcomes associated with a family-based intervention for a new population, perinatally HIV-infected youth and their adult caregivers, referred to as CHAMP+, is described to illustrate the integration of theory, existing evidence, and intensive input from consumers and healthcare providers.
Gene Regulatory Network Inferences Using a Maximum-Relevance and Maximum-Significance Strategy
Liu, Wei; Zhu, Wen; Liao, Bo; Chen, Xiangtao
2016-01-01
Recovering gene regulatory networks from expression data is a challenging problem in systems biology that provides valuable information on the regulatory mechanisms of cells. A number of algorithms based on computational models are currently used to recover network topology. However, most of these algorithms have limitations. For example, many models tend to be complicated because of the “large p, small n” problem. In this paper, we propose a novel regulatory network inference method called the maximum-relevance and maximum-significance network (MRMSn) method, which converts the problem of recovering networks into a problem of how to select the regulator genes for each gene. To solve the latter problem, we present an algorithm that is based on information theory and selects the regulator genes for a specific gene by maximizing the relevance and significance. A first-order incremental search algorithm is used to search for regulator genes. Eventually, a strict constraint is adopted to adjust all of the regulatory relationships according to the obtained regulator genes and thus obtain the complete network structure. We performed our method on five different datasets and compared our method to five state-of-the-art methods for network inference based on information theory. The results confirm the effectiveness of our method. PMID:27829000
Qualitative model-based diagnosis using possibility theory
NASA Technical Reports Server (NTRS)
Joslyn, Cliff
1994-01-01
The potential for the use of possibility in the qualitative model-based diagnosis of spacecraft systems is described. The first sections of the paper briefly introduce the Model-Based Diagnostic (MBD) approach to spacecraft fault diagnosis; Qualitative Modeling (QM) methodologies; and the concepts of possibilistic modeling in the context of Generalized Information Theory (GIT). Then the necessary conditions for the applicability of possibilistic methods to qualitative MBD, and a number of potential directions for such an application, are described.
NASA Astrophysics Data System (ADS)
Wen, Xueda; Matsuura, Shunji; Ryu, Shinsei
2016-06-01
We develop an approach based on edge theories to calculate the entanglement entropy and related quantities in (2+1)-dimensional topologically ordered phases. Our approach is complementary to, e.g., the existing methods using replica trick and Witten's method of surgery, and applies to a generic spatial manifold of genus g , which can be bipartitioned in an arbitrary way. The effects of fusion and braiding of Wilson lines can be also straightforwardly studied within our framework. By considering a generic superposition of states with different Wilson line configurations, through an interference effect, we can detect, by the entanglement entropy, the topological data of Chern-Simons theories, e.g., the R symbols, monodromy, and topological spins of quasiparticles. Furthermore, by using our method, we calculate other entanglement/correlation measures such as the mutual information and the entanglement negativity. In particular, it is found that the entanglement negativity of two adjacent noncontractible regions on a torus provides a simple way to distinguish Abelian and non-Abelian topological orders.
ERIC Educational Resources Information Center
Smith, Kasee L.; Rayfield, John
2017-01-01
Understanding methods for effectively instructing STEM education concepts is essential in the current climate of education (Freeman, Marginson, & Tyler 2014). Kolb's experiential learning theory (ELT) outlines four specific modes of learning, based on preferences for grasping and transforming information. This quasi-experimental study was…
Use of Intervention Mapping to Enhance Health Care Professional Practice: A Systematic Review.
Durks, Desire; Fernandez-Llimos, Fernando; Hossain, Lutfun N; Franco-Trigo, Lucia; Benrimoj, Shalom I; Sabater-Hernández, Daniel
2017-08-01
Intervention Mapping is a planning protocol for developing behavior change interventions, the first three steps of which are intended to establish the foundations and rationales of such interventions. This systematic review aimed to identify programs that used Intervention Mapping to plan changes in health care professional practice. Specifically, it provides an analysis of the information provided by the programs in the first three steps of the protocol to determine their foundations and rationales of change. A literature search was undertaken in PubMed, Scopus, SciELO, and DOAJ using "Intervention Mapping" as keyword. Key information was gathered, including theories used, determinants of practice, research methodologies, theory-based methods, and practical applications. Seventeen programs aimed at changing a range of health care practices were included. The social cognitive theory and the theory of planned behavior were the most frequently used frameworks in driving change within health care practices. Programs used a large variety of research methodologies to identify determinants of practice. Specific theory-based methods (e.g., modelling and active learning) and practical applications (e.g., health care professional training and facilitation) were reported to inform the development of practice change interventions and programs. In practice, Intervention Mapping delineates a three-step systematic, theory- and evidence-driven process for establishing the theoretical foundations and rationales underpinning change in health care professional practice. The use of Intervention Mapping can provide health care planners with useful guidelines for the theoretical development of practice change interventions and programs.
Evidential analysis of difference images for change detection of multitemporal remote sensing images
NASA Astrophysics Data System (ADS)
Chen, Yin; Peng, Lijuan; Cremers, Armin B.
2018-03-01
In this article, we develop two methods for unsupervised change detection in multitemporal remote sensing images based on Dempster-Shafer's theory of evidence (DST). In most unsupervised change detection methods, the probability distribution of the difference image is assumed to be characterized by mixture models, whose parameters are estimated by the expectation maximization (EM) method. However, the main drawback of the EM method is that it does not consider spatial contextual information, which may entail rather noisy detection results with numerous spurious alarms. To remedy this, we first develop an evidence-theory-based EM method (EEM) which incorporates spatial contextual information in EM by iteratively fusing the belief assignments of neighboring pixels to the central pixel. Second, an evidential labeling method in the sense of maximizing a posteriori probability (MAP) is proposed in order to further enhance the detection result. It first uses the parameters estimated by EEM to initialize the class labels of a difference image. Then it iteratively fuses class conditional information and spatial contextual information, and updates labels and class parameters. Finally it converges to a fixed state which gives the detection result. A simulated image set and two real remote sensing data sets are used to evaluate the two evidential change detection methods. Experimental results show that the new evidential methods are comparable to other prevalent methods in terms of total error rate.
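The non-spatial baseline the paper improves on is plain EM on a two-class (unchanged/changed) Gaussian mixture of difference-image values, sketched below on synthetic data; the evidential fusion of neighboring belief assignments (EEM) and the MAP labeling are not reproduced.

```python
import numpy as np

def em_gaussian_mixture(d, iters=100):
    """Plain EM for a two-component Gaussian mixture on difference-image
    values d; returns class priors, means, and standard deviations."""
    d = np.asarray(d, dtype=float).ravel()
    mu = np.percentile(d, [25, 75])        # crude initialization
    sig = np.array([d.std(), d.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibilities of each class for each pixel
        lik = np.exp(-0.5 * ((d[:, None] - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))
        r = pi * lik
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update priors, means, and variances
        n = r.sum(axis=0)
        pi = n / len(d)
        mu = (r * d[:, None]).sum(axis=0) / n
        sig = np.sqrt((r * (d[:, None] - mu) ** 2).sum(axis=0) / n)
    return pi, mu, sig

rng = np.random.default_rng(0)
diff = np.concatenate([rng.normal(0, 1, 9000), rng.normal(4, 1.5, 1000)])
print(em_gaussian_mixture(diff))   # changed pixels form the high-mean class
```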
Prospect Theory and Interval-Valued Hesitant Set for Safety Evacuation Model
NASA Astrophysics Data System (ADS)
Kou, Meng; Lu, Na
2018-01-01
This study applies research results from prospect theory and multi-attribute decision making theory, taking into account the complexity, uncertainty, and multifactor influences of the underground mine fire system as well as decision makers' psychological behavior of emotion and intuition, to establish an intuitionistic fuzzy multiple attribute decision making method based on prospect theory. The model established by this method can explain decision makers' safety evacuation decision behavior in the complex system of an underground mine fire, given the uncertainty of the environment, the imperfection of information, human psychological behavior, and other factors.
Ford, John A; Jones, Andrew P; Wong, Geoff; Clark, Allan B; Porter, Tom; Shakespeare, Tom; Swart, Ann Marie; Steel, Nicholas
2015-01-01
Introduction: The UK has an ageing population, especially in rural areas, where deprivation is high among older people. Previous research has identified this group as at high risk of poor access to healthcare. The aim of this study is to generate a theory of how socioeconomically disadvantaged older people from rural areas access primary care, to develop an intervention based on this theory and test it in a feasibility trial. Methods and analysis: On the basis of the MRC Framework for Developing and Evaluating Complex Interventions, three methods will be used to generate the theory. First, a realist review will elucidate the patient pathway based on existing literature. Second, an analysis of the English Longitudinal Study of Ageing will be completed using structural equation modelling. Third, 15 semistructured interviews will be undertaken with patients and four focus groups with health professionals. A triangulation protocol will be used to allow each of these methods to inform and be informed by each other, and to integrate data into one overall realist theory. Based on this theory, an intervention will be developed in discussion with stakeholders to ensure that the intervention is feasible and practical. The intervention will be tested within a feasibility trial, the design of which will depend on the intervention. Lessons from the feasibility trial will be used to refine the intervention and gather the information needed for a definitive trial. Ethics and dissemination: Ethics approval from the regional ethics committee has been granted for the focus groups with health professionals and interviews with patients. Ethics approval will be sought for the feasibility trial after the intervention has been designed. Findings will be disseminated to the key stakeholders involved in intervention development, to researchers, clinicians and health planners through peer-reviewed journal articles and conference publications, and locally through a dissemination event. PMID:26384728
Counting the number of Feynman graphs in QCD
NASA Astrophysics Data System (ADS)
Kaneko, T.
2018-05-01
Information about the number of Feynman graphs for a given physical process in a given field theory is especially useful for confirming the result of a Feynman graph generator used in an automatic system of perturbative calculations. A method of counting the number of Feynman graphs weighted by their symmetry factors was established based on zero-dimensional field theory, and has been used in scalar theories and QED. In this article the method is generalized to more complicated models by direct calculation of generating functions on a computer algebra system. The method is applied to QCD with and without counterterms, where many higher-order contributions are calculated automatically.
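The zero-dimensional principle behind such counting is easy to demonstrate in the simplest setting. In zero dimensions the path integral of phi^4 scalar theory collapses to an ordinary Gaussian moment, and the order-g^k coefficient of Z(g) = <exp(g*phi^4/24)> equals (4k-1)!!/(24^k k!), the symmetry-factor-weighted number of k-vertex vacuum graphs. The sketch below evaluates these coefficients; it illustrates the principle only, not the paper's QCD computation.

```python
from fractions import Fraction
from math import factorial

def double_factorial(n):
    """n!! for odd n; (2m-1)!! counts pairings of 2m fields (Wick's theorem)."""
    out = 1
    while n > 1:
        out *= n
        n -= 2
    return out

# Z(g) = sum_k (g/24)^k <phi^(4k)> / k! with Gaussian moment <phi^(4k)> = (4k-1)!!
for k in range(1, 5):
    coeff = Fraction(double_factorial(4 * k - 1), 24 ** k * factorial(k))
    print(f"order g^{k}: weighted vacuum-graph count = {coeff}")
# order g^1 gives 1/8: the single figure-eight diagram with symmetry factor 8
```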
Computational Methods in Drug Discovery
Sliwoski, Gregory; Kothiwale, Sandeepkumar; Meiler, Jens
2014-01-01
Computer-aided drug discovery/design methods have played a major role in the development of therapeutically important small molecules for over three decades. These methods are broadly classified as either structure-based or ligand-based methods. Structure-based methods are in principle analogous to high-throughput screening in that both target and ligand structure information is imperative. Structure-based approaches include ligand docking, pharmacophore, and ligand design methods. The article discusses the theory behind the most important methods and recent successful applications. Ligand-based methods use only ligand information to predict activity based on similarity or dissimilarity to previously known active ligands. We review widely used ligand-based methods such as ligand-based pharmacophores, molecular descriptors, and quantitative structure-activity relationships. In addition, important tools such as target/ligand databases, homology modeling, ligand fingerprint methods, etc., necessary for successful implementation of various computer-aided drug discovery/design methods in a drug discovery campaign are discussed. Finally, computational methods for toxicity prediction and optimization for favorable physiologic properties are discussed with successful examples from the literature. PMID:24381236
Information theory applications for biological sequence analysis.
Vinga, Susana
2014-05-01
Information theory (IT) addresses the analysis of communication systems and has been widely applied in molecular biology. In particular, alignment-free sequence analysis and comparison have greatly benefited from concepts derived from IT, such as entropy and mutual information. This review covers several aspects of IT applications, ranging from genome global analysis and comparison, including block-entropy estimation and resolution-free metrics based on iterative maps, to local analysis, comprising the classification of motifs, prediction of transcription factor binding sites and sequence characterization based on linguistic complexity and entropic profiles. IT has also been applied to high-level correlations that combine DNA, RNA or protein features with sequence-independent properties, such as gene mapping and phenotype analysis, and has provided models based on communication systems theory to describe information transmission channels at the cell level and during evolutionary processes. While not exhaustive, this review attempts to categorize existing methods and to indicate their relation with broader transversal topics such as genomic signatures, data compression and complexity, time series analysis and phylogenetic classification, providing a resource for future developments in this promising area.
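As a small, self-contained example of block-entropy estimation on sequences, the sketch below computes the Shannon entropy of overlapping k-mers of a toy DNA string; the sequence is invented, and serious analyses must correct for finite-sample bias at larger k.

```python
import math
from collections import Counter

def block_entropy(seq, k):
    """Shannon entropy (bits) of the empirical distribution of overlapping k-mers."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

seq = "ACGTACGTGGAACCTTACGATCGATCGGGTACG"   # toy sequence for illustration
for k in (1, 2, 3):
    h = block_entropy(seq, k)
    print(f"H_{k} = {h:.3f} bits  ({h / k:.3f} bits/symbol)")
```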
Digital focusing of OCT images based on scalar diffraction theory and information entropy
Liu, Guozhong; Zhi, Zhongwei; Wang, Ruikang K.
2012-01-01
This paper describes a digital method that is capable of automatically focusing optical coherence tomography (OCT) en face images without prior knowledge of the point spread function of the imaging system. The method utilizes a scalar diffraction model to simulate wave propagation from out-of-focus scatterers to the focal plane, from which the propagation distance between the out-of-focus plane and the focal plane is determined automatically via an image-definition-evaluation criterion based on information entropy theory. Using the proposed approach, we demonstrate that lateral resolution close to that at the focal plane can be recovered from imaging planes outside the depth-of-field region with minimal loss. Fresh onion tissues and mouse fat tissues are used in the experiments to show the performance of the proposed method. PMID:23162717
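A hedged sketch of the two ingredients, angular-spectrum propagation and an entropy-based definition metric, follows. The wavelength and pixel pitch are placeholder values, the synthetic field stands in for a real OCT en face plane, and treating the entropy minimum as the focus (sharp images concentrate intensity into fewer pixels) is an assumption about the paper's exact criterion.

```python
import numpy as np

def angular_spectrum(field, dz, wavelength, dx):
    """Propagate a complex field over distance dz with the angular spectrum method."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = np.maximum(1.0 / wavelength**2 - FX**2 - FY**2, 0.0)  # drop evanescent part
    H = np.exp(2j * np.pi * np.sqrt(arg) * dz)
    return np.fft.ifft2(np.fft.fft2(field) * H)

def image_entropy(intensity):
    p = intensity / intensity.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def autofocus(field, distances, wavelength=1.3e-6, dx=5e-6):
    """Scan candidate distances; return the one minimizing image entropy."""
    scores = [image_entropy(np.abs(angular_spectrum(field, dz, wavelength, dx))**2)
              for dz in distances]
    return distances[int(np.argmin(scores))]

rng = np.random.default_rng(0)                     # synthetic field, just to run
field = rng.standard_normal((128, 128)) + 1j * rng.standard_normal((128, 128))
print(autofocus(field, np.linspace(-200e-6, 200e-6, 21)))
```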
Control theory based airfoil design for potential flow and a finite volume discretization
NASA Technical Reports Server (NTRS)
Reuther, J.; Jameson, A.
1994-01-01
This paper describes the implementation of optimization techniques based on control theory for airfoil design. In previous studies it was shown that control theory could be used to devise an effective optimization procedure for two-dimensional profiles in which the shape is determined by a conformal transformation from a unit circle, and the control is the mapping function. The goal of our present work is to develop a method which does not depend on conformal mapping, so that it can be extended to treat three-dimensional problems. Therefore, we have developed a method which can address arbitrary geometric shapes through the use of a finite volume method to discretize the potential flow equation. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented, where both target speed distributions and minimum drag are used as objective functions.
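The essential economy of the control-theory approach, one adjoint solve yielding the complete design gradient for roughly the cost of one extra flow solution, can be sketched on a toy linear state equation; the matrix, cost function, and scalar control below are invented stand-ins for the potential-flow discretization and shape variables.

```python
import numpy as np

n = 50
rng = np.random.default_rng(0)
B = 0.1 * rng.standard_normal((n, n))
b = rng.standard_normal(n)
u_target = rng.standard_normal(n)

def A(a):
    """State operator depending on a scalar design control a (dA/da = I here)."""
    return (2.0 + a) * np.eye(n) + B

def cost_and_grad(a):
    u = np.linalg.solve(A(a), b)             # state solve: A(a) u = b
    r = u - u_target
    J = 0.5 * r @ r                          # cost J = 0.5 ||u - u_target||^2
    psi = np.linalg.solve(A(a).T, r)         # single adjoint solve: A^T psi = dJ/du
    return J, -psi @ u                       # dJ/da = -psi^T (dA/da) u = -psi^T u

a0 = 0.3
J0, g = cost_and_grad(a0)
eps = 1e-6
J1, _ = cost_and_grad(a0 + eps)
print(g, (J1 - J0) / eps)                    # adjoint gradient vs. finite difference
```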
Pieterse, Arwen H; de Vries, Marieke; Kunneman, Marleen; Stiggelbout, Anne M; Feldman-Stewart, Deb
2013-01-01
Healthcare decisions, particularly those involving weighing benefits and harms that may significantly affect quality and/or length of life, should reflect patients' preferences. To support patients in making choices, patient decision aids and values clarification methods (VCM) in particular have been developed. VCM are intended to help patients determine which aspects of the choices are important to their selection of a preferred option. Several types of VCM exist. However, they are often designed without clear reference to theory, which makes it difficult for their development to be systematic and internally coherent. Our goal was to provide theory-informed recommendations for the design of VCM. Process theories of decision making specify components of decision processes and thus identify particular processes that VCM could aim to facilitate. We conducted a review of the MEDLINE and PsycINFO databases and of references to theories included in retrieved papers, to identify process theories of decision making. We selected a theory if (a) it fulfilled criteria for a process theory; (b) it provided a coherent description of the whole process of decision making; and (c) empirical evidence supported at least some of its postulates. Four theories met our criteria: Image Theory, Differentiation and Consolidation theory, Parallel Constraint Satisfaction theory, and Fuzzy-trace Theory. Based on these, we propose that VCM should: help optimize mental representations; encourage consideration of all potentially appropriate options; delay selection of an initially favoured option; facilitate the retrieval of relevant values from memory; facilitate the comparison of options and their attributes; and offer time to decide. In conclusion, our theory-based design recommendations are explicit and transparent, providing an opportunity to test each in a systematic manner. Copyright © 2012 Elsevier Ltd. All rights reserved.
The digital language of amino acids.
Kurić, L
2007-11-01
The subject of this paper is a digital approach to the investigation of the biochemical basis of genetic processes. The digital mechanisms of nucleic acid and protein biosynthesis, the evolution of biomacromolecules and, especially, the biochemical evolution of genetic language are analyzed by applying cybernetic methods, information theory and systems theory. This paper reports new methods for developing new technologies in genetics, based on programmatic, cybernetic and informational systems and laws. The results of practical application of the new technology could be useful in bioinformatics, genetics, biochemistry, medicine and other natural sciences.
Quantum Approach to Informatics
NASA Astrophysics Data System (ADS)
Stenholm, Stig; Suominen, Kalle-Antti
2005-08-01
An essential overview of quantum information. Information, whether inscribed as a mark on a stone tablet or encoded as a magnetic domain on a hard drive, must be stored in a physical object and thus made subject to the laws of physics. Traditionally, information processing such as computation occurred in a framework governed by laws of classical physics. However, information can also be stored and processed using the states of matter described by non-classical quantum theory. Understanding this quantum information, a fundamentally different type of information, has been a major project of physicists and information theorists in recent years, and recent experimental research has started to yield promising results. Quantum Approach to Informatics fills the need for a concise introduction to this burgeoning new field, offering an intuitive approach for readers in both the physics and information science communities, as well as in related fields. Only a basic background in quantum theory is required, and the text keeps the focus on bringing this theory to bear on contemporary informatics. Instead of proofs and other highly formal structures, detailed examples present the material, making this a uniquely accessible introduction to quantum informatics. Topics covered include:
- An introduction to quantum information and the qubit
- Concepts and methods of quantum theory important for informatics
- The application of information concepts to quantum physics
- Quantum information processing and computing
- Quantum gates
- Error correction using quantum-based methods
- Physical realizations of quantum computing circuits
A helpful and economical resource for understanding this exciting new application of quantum theory to informatics, Quantum Approach to Informatics provides students and researchers in physics and information science, as well as other interested readers with some scientific background, with an essential overview of the field.
Autosophy information theory provides lossless data and video compression based on the data content
NASA Astrophysics Data System (ADS)
Holtz, Klaus E.; Holtz, Eric S.; Holtz, Diana
1996-09-01
A new autosophy information theory provides an alternative to the classical Shannon information theory. Using the new theory in communication networks provides both a high degree of lossless compression and virtually unbreakable encryption codes for network security. The bandwidth in a conventional Shannon communication is determined only by the data volume and the hardware parameters, such as image size, resolution, or frame rates in television. The data content, or what is shown on the screen, is irrelevant. In contrast, the bandwidth in autosophy communication is determined only by data content, such as novelty and movement in television images; it is the data volume and hardware parameters that become irrelevant. Basically, the new communication methods use prior 'knowledge' of the data, stored in a library, to encode subsequent transmissions. The more 'knowledge' stored in the libraries, the higher the potential compression ratio. 'Information' is redefined as that which is not already known by the receiver; everything already known is redundant and need not be re-transmitted. In a perfect communication each transmission code, called a 'tip,' creates a new 'engram' of knowledge in the library, so that a single tip transmission can represent any amount of data. Autosophy theories provide six separate learning modes, or omni-dimensional networks, all of which can be used for data compression. The new information theory reveals the theoretical flaws of other data compression methods, including the Huffman, Ziv-Lempel, and LZW codes and commercial compression codes such as V.42bis and MPEG-2.
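To ground the general idea of encoding against a growing library, here is a minimal dictionary coder of the LZW family, one of the very methods the abstract counts as flawed by autosophy's standards; it is shown only because the shared principle is visible in it: each emitted code adds a new library entry, loosely analogous to a 'tip' creating an 'engram', so repeated patterns cost a single code on later occurrences.

```python
def lzw_encode(data):
    """Minimal LZW: the library starts with all single bytes and grows as we emit."""
    library = {bytes([i]): i for i in range(256)}
    w, out = b"", []
    for byte in data:
        wc = w + bytes([byte])
        if wc in library:
            w = wc                           # pattern already 'known' by the library
        else:
            out.append(library[w])           # transmit code for the known prefix
            library[wc] = len(library)       # the new library entry ('engram')
            w = bytes([byte])
    if w:
        out.append(library[w])
    return out

codes = lzw_encode(b"abababababab")
print(len(codes), codes)                     # 6 codes for 12 bytes
```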
Techni-kits and Techni-kit Building Systems
NASA Technical Reports Server (NTRS)
Callender, E. D.; Hartsough, C.; Morris, R. V.; Yamamoto, Y.
1985-01-01
Techni-kits consist of theories, methods, standards and computer-based tools that assist in the design of information-intensive systems. A techni-kit "building system" is a techni-kit that builds other techni-kits.
Remote sensing of suspended sediment water research: principles, methods, and progress
NASA Astrophysics Data System (ADS)
Shen, Ping; Zhang, Jing
2011-12-01
In this paper, we review the principles, data, methods and steps of suspended sediment research using remote sensing, summarize representative models and methods, and analyze the deficiencies of existing methods. Drawing on recent progress in remote sensing theory and its application to suspended sediment in water, we introduce data processing methods such as atmospheric correction and adjacency effect correction, together with intelligent algorithms such as neural networks, genetic algorithms and support vector machines, into suspended sediment inversion research. Combined with other geographic information and based on Bayesian theory, this improves suspended sediment inversion precision and aims to provide a reference for related researchers.
NASA Astrophysics Data System (ADS)
Clemens, Joshua William
Game theory has application across multiple fields, spanning from economic strategy to optimal control of an aircraft and missile on an intercept trajectory. The idea of game theory is fascinating in that we can actually mathematically model real-world scenarios and determine optimal decision making. It may not always be easy to mathematically model certain real-world scenarios, nonetheless, game theory gives us an appreciation for the complexity involved in decision making. This complexity is especially apparent when the players involved have access to different information upon which to base their decision making (a nonclassical information pattern). Here we will focus on the class of adversarial two-player games (sometimes referred to as pursuit-evasion games) with nonclassical information pattern. We present a two-sided (simultaneous) optimization solution method for the two-player linear quadratic Gaussian (LQG) multistage game. This direct solution method allows for further interpretation of each player's decision making (strategy) as compared to previously used formal solution methods. In addition to the optimal control strategies, we present a saddle point proof and we derive an expression for the optimal performance index value. We provide some numerical results in order to further interpret the optimal control strategies and to highlight real-world application of this game-theoretic optimal solution.
de Bock, Élodie; Hardouin, Jean-Benoit; Blanchin, Myriam; Le Neel, Tanguy; Kubis, Gildas; Bonnaud-Antignac, Angélique; Dantan, Étienne; Sébille, Véronique
2016-10-01
The objective was to compare classical test theory and Rasch-family models derived from item response theory for the analysis of longitudinal patient-reported outcomes data with possibly informative intermittent missing items. A simulation study was performed in order to assess and compare the performance of classical test theory and Rasch model in terms of bias, control of the type I error and power of the test of time effect. The type I error was controlled for classical test theory and Rasch model whether data were complete or some items were missing. Both methods were unbiased and displayed similar power with complete data. When items were missing, Rasch model remained unbiased and displayed higher power than classical test theory. Rasch model performed better than the classical test theory approach regarding the analysis of longitudinal patient-reported outcomes with possibly informative intermittent missing items mainly for power. This study highlights the interest of Rasch-based models in clinical research and epidemiology for the analysis of incomplete patient-reported outcomes data. © The Author(s) 2013.
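For orientation, a minimal sketch of the Rasch item response function and a person-ability estimate follows: the probability of a positive response to item i is P = exp(theta - b_i) / (1 + exp(theta - b_i)), and theta is estimated by maximum likelihood over the observed items only, which is how Rasch-family models accommodate intermittent missing items. The difficulties and responses are invented, and the paper's simulation design is not reproduced.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def rasch_p(theta, b):
    """Rasch model: probability of a positive response given ability and difficulty."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def estimate_theta(responses, difficulties):
    """ML ability estimate; NaN responses (missing items) are simply skipped."""
    mask = ~np.isnan(responses)
    x, b = responses[mask], difficulties[mask]
    def nll(theta):
        p = rasch_p(theta, b)
        return -np.sum(x * np.log(p) + (1 - x) * np.log(1 - p))
    return minimize_scalar(nll, bounds=(-5, 5), method="bounded").x

b = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])        # item difficulties (invented)
x = np.array([1, 1, np.nan, 1, 0])               # one response missing
print(estimate_theta(x, b))
```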
Sea Ice Detection Based on an Improved Similarity Measurement Method Using Hyperspectral Data.
Han, Yanling; Li, Jue; Zhang, Yun; Hong, Zhonghua; Wang, Jing
2017-05-15
Hyperspectral remote sensing technology can acquire nearly continuous spectrum information and rich sea ice image information, thus providing an important means of sea ice detection. However, the correlation and redundancy among hyperspectral bands reduce the accuracy of traditional sea ice detection methods. Based on the spectral characteristics of sea ice, this study presents an improved similarity measurement method based on linear prediction (ISMLP) to detect sea ice. First, the first original band with a large amount of information is determined based on mutual information theory. Subsequently, a second original band with the least similarity is chosen by the spectral correlation measuring method. Finally, subsequent bands are selected through the linear prediction method, and a support vector machine classifier model is applied to classify sea ice. In experiments performed on images of Baffin Bay and Bohai Bay, comparative analyses were conducted to compare the proposed method and traditional sea ice detection methods. The proposed ISMLP method achieved the highest classification accuracies (91.18% and 94.22%) in both experiments. These results indicate that the ISMLP method exhibits better overall performance than other methods and can be effectively applied to hyperspectral sea ice detection.
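A hedged sketch of the greedy selection loop follows: ISMLP picks the first band by mutual information and the second by spectral correlation, whereas this toy version uses variance as a stand-in for the first pick and linear-prediction error throughout; the data are random and only the structure of the procedure is illustrated.

```python
import numpy as np

def select_bands(X, k):
    """Greedy linear-prediction band selection; X has shape (n_pixels, n_bands).
    Start from the most informative band (variance as a stand-in), then repeatedly
    add the band worst predicted by a least-squares mix of the chosen ones."""
    selected = [int(np.argmax(X.var(axis=0)))]
    while len(selected) < k:
        A = X[:, selected]
        best_j, best_err = None, -1.0
        for j in range(X.shape[1]):
            if j in selected:
                continue
            coef, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
            err = float(np.linalg.norm(X[:, j] - A @ coef))
            if err > best_err:
                best_j, best_err = j, err
        selected.append(best_j)              # least similar = hardest to predict
    return selected

rng = np.random.default_rng(1)
X = rng.standard_normal((500, 20))           # synthetic 'hyperspectral' pixels
print(select_bands(X, 4))                    # indices of 4 selected bands
```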
[Traceability of Wine Varieties Using Near Infrared Spectroscopy Combined with Cyclic Voltammetry].
Li, Meng-hua; Li, Jing-ming; Li, Jun-hui; Zhang, Lu-da; Zhao, Long-lian
2015-06-01
To achieve traceability of wine varieties, a method was proposed to fuse near-infrared (NIR) spectra and cyclic voltammograms (CV), which carry different information, using D-S evidence theory. NIR spectra and CV curves of three different varieties of wines (cabernet sauvignon, merlot, cabernet gernischt) from seven different geographical origins were collected separately. The discriminant models were built using the PLS-DA method. On this basis, D-S evidence theory was applied to integrate the two kinds of discrimination results. After integration by D-S evidence theory, the cross-validation accuracy was 95.69% and the validation-set accuracy 94.12% for wine variety identification. When considering only the wines from Yantai, the cross-validation accuracy was 99.46% and the validation-set accuracy 100%. All traceability models after fusion achieved better classification results than either individual method. These results suggest that the proposed method of combining electrochemical information with spectral information via the D-S evidence combination formula improves the discrimination performance of the models and is a promising tool for discriminating different kinds of wines.
Constructing acoustic timefronts using random matrix theory.
Hegewisch, Katherine C; Tomsovic, Steven
2013-10-01
In a recent letter [Hegewisch and Tomsovic, Europhys. Lett. 97, 34002 (2012)], random matrix theory is introduced for long-range acoustic propagation in the ocean. The theory is expressed in terms of unitary propagation matrices that represent the scattering between acoustic modes due to sound speed fluctuations induced by the ocean's internal waves. The scattering exhibits a power-law decay as a function of the differences in mode numbers thereby generating a power-law, banded, random unitary matrix ensemble. This work gives a more complete account of that approach and extends the methods to the construction of an ensemble of acoustic timefronts. The result is a very efficient method for studying the statistical properties of timefronts at various propagation ranges that agrees well with propagation based on the parabolic equation. It helps identify which information about the ocean environment can be deduced from the timefronts and how to connect features of the data to that environmental information. It also makes direct connections to methods used in other disordered waveguide contexts where the use of random matrix theory has a multi-decade history.
Reverse and direct methods for solving the characteristic equation
NASA Astrophysics Data System (ADS)
Lozhkin, Alexander; Bozek, Pavol; Lyalin, Vadim; Tarasov, Vladimir; Tothova, Maria; Sultanov, Ravil
2016-06-01
Fundamentals of the information-linguistic interpretation of geometry are presented briefly. A method of solving the characteristic equation based on Euler's formula is described, along with the decomposition of the characteristic equation into several equations for Jordan curves. Applications of the theory to problems in mechatronics are outlined briefly.
The use of information theory for the evaluation of biomarkers of aging and physiological age.
Blokh, David; Stambler, Ilia
2017-04-01
The present work explores the application of information theoretical measures, such as entropy and normalized mutual information, for research of biomarkers of aging. The use of information theory affords unique methodological advantages for the study of aging processes, as it allows evaluating non-linear relations between biological parameters, providing the precise quantitative strength of those relations, both for individual and multiple parameters, showing cumulative or synergistic effect. Here we illustrate those capabilities utilizing a dataset on heart disease, including diagnostic parameters routinely available to physicians. The use of information-theoretical methods, utilizing normalized mutual information, revealed the exact amount of information that various diagnostic parameters or their combinations contained about the persons' age. Based on those exact informative values for the correlation of measured parameters with age, we constructed a diagnostic rule (a decision tree) to evaluate physiological age, as compared to chronological age. The present data illustrated that younger subjects suffering from heart disease showed characteristics of people of higher age (higher physiological age). Utilizing information-theoretical measures, with additional data, it may be possible to create further clinically applicable information-theory-based markers and models for the evaluation of physiological age, its relation to age-related diseases and its potential modifications by therapeutic interventions. Copyright © 2017 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Breitmeyer, Bruno G.; Ganz, Leo
1976-01-01
This paper reviewed briefly the major types of masking effects obtained with various methods and the major theories or models that have been proposed to account for these effects, and outlined a three-mechanism model of visual pattern masking based on psychophysical and neurophysiological properties of the visual system. (Author/RK)
ERIC Educational Resources Information Center
Kim, ChanMin; Keller, John M.
2008-01-01
This study investigated what kind of supportive information can be effective in improving the situation where there were severe motivational challenges. Motivational and volitional email messages (MVEM) were constructed based on an integrated model of four theories and methods, which are Keller's ARCS model, Kuhl's action control theory,…
A Method for Capturing and Reconciling Stakeholder Intentions Based on the Formal Concept Analysis
NASA Astrophysics Data System (ADS)
Aoyama, Mikio
Information systems are ubiquitous in our daily life. Thus, information systems need to work appropriately anywhere, at any time, for everybody. Conventional information systems engineering tends to engineer systems from the viewpoint of system functionality. However, the diversity of usage contexts requires a fundamental change in our current thinking on information systems: from the functionality the systems provide to the goals the systems should achieve. The intentional approach embraces the goals and related aspects of information systems. This chapter presents a method for capturing, structuring and reconciling the diverse goals of multiple stakeholders. The heart of the method lies in the hierarchical structuring of goals by a goal lattice based on formal concept analysis, a semantic extension of lattice theory. We illustrate the effectiveness of the presented method through application to self-checkout systems for large-scale supermarkets.
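To make the formal-concept-analysis core tangible, the sketch below enumerates all formal concepts of a tiny invented goal context (stakeholder goals as objects, their attributes as FCA attributes): a concept is a pair (O, A) that is a fixed point of the two derivation operators, and these concepts ordered by inclusion form the goal lattice. The goals and attributes are hypothetical, not taken from the chapter's self-checkout case.

```python
from itertools import combinations

context = {                                   # hypothetical stakeholder goals
    "goal_fast_checkout": {"customer", "throughput"},
    "goal_low_staffing": {"retailer", "cost"},
    "goal_error_free_scan": {"customer", "retailer", "quality"},
}
all_attrs = set().union(*context.values())

def shared_attrs(objs):
    """Attributes common to every goal in objs (all_attrs if objs is empty)."""
    attrs = set(all_attrs)
    for o in objs:
        attrs &= context[o]
    return attrs

def matching_objs(attrs):
    """Goals possessing every attribute in attrs."""
    return {o for o, a in context.items() if attrs <= a}

concepts = []
for r in range(len(all_attrs) + 1):           # small context: brute-force closure test
    for attrs in map(set, combinations(sorted(all_attrs), r)):
        objs = matching_objs(attrs)
        if shared_attrs(objs) == attrs:       # (objs, attrs) is a formal concept
            concepts.append((sorted(objs), sorted(attrs)))

for objs, attrs in concepts:
    print(objs, attrs)
```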
Bogg, Tim; Finn, Peter R.
2009-01-01
Objective: Using insights from Ecological Systems Theory and Reinforcement Sensitivity Theory, the current study assessed the utility of a series of hypothetical role-based alcohol-consumption scenarios that varied in their presentation of rewarding and punishing information. Method: The scenarios, along with measures of impulsive sensation seeking and a self-report of weekly alcohol consumption, were administered to a sample of alcohol-dependent and non-alcohol-dependent college-age individuals (N = 170). Results: The results showed scenario attendance decisions were largely unaffected by alcohol-dependence status and variations in contextual reward and punishment information. In contrast to the attendance findings, the results for the alcohol-consumption decisions showed alcohol-dependent individuals reported a greater frequency of deciding to drink, as well as indicating greater alcohol consumption in the contexts of complementary rewarding or nonpunishing information. Regression results provided evidence for the criterion-related validity of scenario outcomes in an account of diagnostic alcohol problems. Conclusions: The results are discussed in terms of the conceptual and predictive gains associated with an assessment approach to alcohol-consumption decision making that combines situational information organized and balanced through the frameworks of Ecological Systems Theory and Reinforcement Sensitivity Theory. PMID:19371496
NASA Astrophysics Data System (ADS)
Wang, Fei
2013-09-01
Geiger-mode detectors have single-photon sensitivity and picosecond timing resolution, which makes them good candidates for low-light-level ranging applications, especially flash three-dimensional imaging applications where the received laser power is extremely limited. Another advantage of Geiger-mode APDs is their large output current, which can drive CMOS timing circuits directly, meaning that large-format focal plane arrays can be easily fabricated using mature CMOS technology. However, Geiger-mode detector based FPAs can only measure the range information of a scene, not its reflectivity. Reflectivity is a major characteristic which can help target classification and identification. Because photon arrivals follow Poisson statistics, the detection probability is tightly connected to the number of incident photons. Employing this relation, a signal intensity estimation method based on probability inversion is proposed: instead of measuring intensity directly, several detections are conducted, the detection probability is obtained, and the intensity is estimated from it. The relation between the estimator's accuracy, measuring range and number of detections is discussed based on statistical theory. Finally, a Monte-Carlo simulation is conducted to verify the correctness of this theory. Using 100 detections, a signal intensity equal to 4.6 photons per detection can be measured using this method. With slight modification of the measuring strategy, intensity information can be obtained using current Geiger-mode detector based FPAs, which can enrich the information acquired and broaden the application field of the current technology.
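The probability-inversion idea reduces to one line of algebra: with Poisson arrivals, P(trigger) = 1 - exp(-n), so the mean photon number follows from the empirical trigger rate as n_hat = -ln(1 - k/N). The Monte-Carlo sketch below checks this for the 4.6-photon, 100-detection case quoted in the abstract; the clipping guard near saturation is an assumption of this sketch, not the paper's procedure.

```python
import numpy as np

rng = np.random.default_rng(7)
n_true, N, repeats = 4.6, 100, 1000                 # photons/gate, gates, experiments
triggers = rng.poisson(n_true, (repeats, N)) > 0    # Geiger mode: any photon fires
k = triggers.sum(axis=1)
k = np.clip(k, None, N - 1)          # avoid log(0) when all N gates fire (biases
                                     # strong-signal estimates; more gates help)
n_hat = -np.log(1.0 - k / N)
print(f"estimate: {n_hat.mean():.2f} +/- {n_hat.std():.2f} (true {n_true})")
```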
Control theory based airfoil design using the Euler equations
NASA Technical Reports Server (NTRS)
Jameson, Antony; Reuther, James
1994-01-01
This paper describes the implementation of optimization techniques based on control theory for airfoil design. In our previous work it was shown that control theory could be employed to devise effective optimization procedures for two-dimensional profiles by using the potential flow equation with either a conformal mapping or a general coordinate system. The goal of our present work is to extend the development to treat the Euler equations in two-dimensions by procedures that can readily be generalized to treat complex shapes in three-dimensions. Therefore, we have developed methods which can address airfoil design through either an analytic mapping or an arbitrary grid perturbation method applied to a finite volume discretization of the Euler equations. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented for both the inverse problem and drag minimization problem.
Theory-based interventions in physical activity: a systematic review of literature in Iran.
Abdi, Jalal; Eftekhar, Hassan; Estebsari, Fatemeh; Sadeghi, Roya
2014-11-30
Lack of physical activity is ranked fourth among the causes of human death and chronic diseases. Using models and theories to design, implement, and evaluate health education and health promotion interventions has many advantages. Focusing on models and theories of physical activity, we systematically studied the educational and promotional interventions carried out in Iran from 2003 to 2013. Three information databases were used to systematically select papers using key words: the Iranian Magazine Database (MAGIRAN), the Iran Medical Library (MEDLIB), and the Scientific Information Database (SID). Twenty papers were selected and studied. Having been applied in 9 studies, the Transtheoretical Model (TTM) was the most widespread model in Iran (PENDER in 3 studies, BASNEF in 2, and the Theory of Planned Behavior in 2 studies). With regard to educational methods, almost all studies used a combination of methods, the most widely used being group discussion. Only one integrated study was done. Behavior maintenance was not addressed in 75% of the studies. Almost all studies used self-reporting instruments. The effectiveness of educational methods was assessed in none of the studies. Most of the included studies had several methodological weaknesses, which hinder the validity and applicability of their results. According to the findings, we suggest need assessment when using models, consultation on epidemiology and methodology, attention to maintenance of physical activity, use of other theories and models such as social marketing and social-cognitive theory, and other educational methods such as experiential and complementary approaches.
Côté, José; Cossette, Sylvie; Ramirez-Garcia, Pilar; Rouleau, Geneviève; Auger, Patricia; Boudreau, François; Gagnon, Marie-Pierre
2017-01-01
Background. In the domain of health behavior change, the deployment and utilization of information and communications technologies as a way to deliver interventions appear to be promising. This article describes the development of a web-based tailored intervention, TAVIE en santé, to support people living with HIV in the adoption of healthy behaviors. Methods. This intervention was developed through an Intervention Mapping (IM) framework and is based on the theory of planned behavior. Results. Crucial steps of IM are the selection of key determinants of behavior and the selection of useful theory-based intervention methods to change the targeted determinants (active ingredients). The content and the sequence of the intervention are then created based on these parameters. TAVIE en santé is composed of 7 interactive web sessions hosted by a virtual nurse. It aims to develop and strengthen skills required for behavior change. Based on an algorithm using individual cognitive data (attitude, perceived behavioral control, and intention), the number of sessions, theory-based intervention methods, and message contents are tailored to each user. Conclusion. TAVIE en santé is currently being evaluated. The use of IM allows developing an intervention with a systematic approach based on theory, empirical evidence, and clinical and experiential knowledge.
Ko, Linda K; Turner-McGrievy, Gabrielle M; Campbell, Marci K
2014-04-01
Podcasting is an emerging technology, and previous interventions have shown promising results using theory-based podcasts for weight loss among overweight and obese individuals. This study investigated whether constructs of social cognitive theory and information processing theories (IPTs) mediate the effect of a podcast intervention on weight loss among overweight individuals. Data are from Pounds off Digitally, a study testing the efficacy of two weight loss podcast interventions (a control podcast and a theory-based podcast). Path models were constructed (n = 66). The IPTs (elaboration likelihood model, information control theory, and cognitive load theory) mediated the effect of the theory-based podcast on weight loss. The intervention was significantly associated with all IPTs. Information control theory and cognitive load theory were related to elaboration, and elaboration was associated with weight loss. Social cognitive theory constructs did not mediate weight loss. Future podcast interventions grounded in theory may be effective in promoting weight loss.
Quantum entanglement of identical particles by standard information-theoretic notions
Lo Franco, Rosario; Compagno, Giuseppe
2016-01-01
Quantum entanglement of identical particles is essential in quantum information theory. Yet, its correct determination remains an open issue hindering the general understanding and exploitation of many-particle systems. Operator-based methods have been developed that attempt to overcome the issue. Here we introduce a state-based method which, like second quantization, does not label identical particles and presents conceptual and technical advances compared to the previous ones. It establishes the quantitative role played by arbitrary wave function overlaps, local measurements and particle nature (bosons or fermions) in assessing entanglement by notions commonly used in quantum information theory for distinguishable particles, like partial trace. Our approach furthermore shows that bringing identical particles into the same spatial location functions as an entangling gate, providing fundamental theoretical support to recent experimental observations with ultracold atoms. These results pave the way to set and interpret experiments for utilizing quantum correlations in realistic scenarios where overlap of particles can count, as in Bose-Einstein condensates, quantum dots and biological molecular aggregates. PMID:26857475
[Design of the image browser for PACS image workstation].
Li, Feng; Zhou, He-Qin
2006-09-01
The design of a PACS image workstation based on DICOM 3.0 is introduced in this paper; then the design method of the PACS image browser based on control system theory is presented, focusing on two main units: the DICOM analyzer and the information mapping transformer.
Application of data fusion technology based on D-S evidence theory in fire detection
NASA Astrophysics Data System (ADS)
Cai, Zhishan; Chen, Musheng
2015-12-01
In fire detection, judgment and identification based on a single fire characteristic parameter are subject to environmental disturbances, so detection performance is limited and false positive and false negative rates increase. The compound fire detector employs information fusion technology to judge and identify multiple fire characteristic parameters in order to improve the reliability and accuracy of fire detection. The D-S evidence theory is applied to the multi-sensor data fusion: first, the data from all sensors are normalized to obtain the normalized basic probability function of fire occurrence; then the fusion is conducted using D-S evidence theory; finally the judgment results are given. The results show that the method meets the goal of accurate fire signal identification and increases the accuracy of fire alarms, and is therefore simple and effective.
Research on Capturing of Customer Requirements Based on Innovation Theory
NASA Astrophysics Data System (ADS)
junwu, Ding; dongtao, Yang; zhenqiang, Bao
To capture customer requirements information accurately and effectively, a new customer requirements capture modeling method is proposed. Based on an analysis of the function requirement models of previous products and the application of the technology system evolution laws of the Theory of Inventive Problem Solving (TRIZ), customer requirements can be evolved from existing product designs by modifying the functional requirement unit and confirming the direction of evolutionary design. Finally, a case study is provided to illustrate the feasibility of the proposed approach.
The Sanctuary Model of Trauma-Informed Organizational Change
ERIC Educational Resources Information Center
Bloom, Sandra L.; Sreedhar, Sarah Yanosy
2008-01-01
This article features the Sanctuary Model[R], a trauma-informed method for creating or changing an organizational culture. Although the model is based on trauma theory, its tenets have application in working with children and adults across a wide diagnostic spectrum. Originally developed in a short-term, acute inpatient psychiatric setting for…
NASA Astrophysics Data System (ADS)
Kopylova, N. S.; Bykova, A. A.; Beregovoy, D. N.
2018-05-01
Based on the landscape-geographical approach, a structural and logical scheme for the Northwestern Federal District Econet has been developed, which can be integrated into the federal and world ecological networks in order to improve the environmental infrastructure of the region. A method of organizing the Northwestern Federal District Econet on the basis of graph theory, by means of the Quantum GIS geographic information system, is proposed as an effective means of preserving and recreating the unique biodiversity of landscapes and of regulating environmental protection.
NASA Astrophysics Data System (ADS)
Shi, Jing; Shi, Yunli; Tan, Jian; Zhu, Lei; Li, Hu
2018-02-01
Traditional power forecasting models cannot efficiently take various factors into account, nor can they identify the relevant factors. In this paper, mutual information from information theory and the artificial-intelligence random forests algorithm are introduced into medium- and long-term electricity demand prediction. Mutual information can identify highly related factors based on the average mutual information between a variety of variables and electricity demand; different industries may be highly associated with different variables. The random forests algorithm was used to build a separate forecasting model for each industry according to its correlated factors. The data of electricity consumption in Jiangsu Province are taken as a practical example, and the above methods are compared with methods that use neither mutual information nor industry-specific models. The simulation results show that the above method is scientific, effective, and can provide higher prediction accuracy.
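A minimal sketch of the two-stage pipeline follows, using scikit-learn's mutual_info_regression for factor screening and a random forest for the per-industry model; the candidate factors, their effect sizes, and the MI threshold are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
n = 400
gdp = rng.normal(100, 10, n)                  # hypothetical drivers of one
temp = rng.normal(20, 5, n)                   # industry's electricity demand
noise = rng.normal(0, 1, n)                   # an irrelevant candidate factor
demand = 0.8 * gdp + 0.3 * temp**2 + rng.normal(0, 5, n)

X = np.column_stack([gdp, temp, noise])
mi = mutual_info_regression(X, demand, random_state=0)
print(dict(zip(["gdp", "temp", "noise"], mi.round(3))))

keep = mi > 0.05                              # retain only high-MI factors
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X[:, keep], demand)                 # industry-specific forecasting model
```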
Deafness, Thought-Bubbles and Theory of Mind Development
Wellman, Henry M.; Peterson, Candida C.
2013-01-01
The processes and mechanisms of theory of mind development were examined via a training study of false belief conceptions in deaf children of hearing parents (N = 43). In comparison to two different control conditions, training based on thought-bubble instruction about beliefs was linked with improved false belief understanding as well as progress on a broader theory-of-mind scale. By combining intervention, microgenetic, and developmental-scaling methods the findings provide informative data about the nature and mechanisms of theory-of-mind change in deaf children, as well as an initial demonstration of a useful intervention for enhancing social cognition in deaf children of hearing parents. The methods and results also point to possible avenues for the study of conceptual change more generally. PMID:23544856
NASA Astrophysics Data System (ADS)
Mohammad-Djafari, Ali
2015-01-01
The main object of this tutorial article is first to review the main inference tools using the Bayesian approach, entropy, information theory and their corresponding geometries. This review is focused mainly on the ways these tools have been used in data, signal and image processing. After a short introduction of the quantities related to the Bayes rule, entropy and the Maximum Entropy Principle (MEP), relative entropy and the Kullback-Leibler divergence, and Fisher information, we study their use in different fields of data and signal processing, such as entropy in source separation, Fisher information in model order selection, different maximum-entropy-based methods in time series spectral estimation and, finally, general linear inverse problems.
Research on the Fusion of Dependent Evidence Based on Rank Correlation Coefficient.
Shi, Fengjian; Su, Xiaoyan; Qian, Hong; Yang, Ning; Han, Wenhua
2017-10-16
In order to meet the higher accuracy and system reliability requirements, the information fusion for multi-sensor systems is an increasing concern. Dempster-Shafer evidence theory (D-S theory) has been investigated for many applications in multi-sensor information fusion due to its flexibility in uncertainty modeling. However, classical evidence theory assumes that the evidence is independent of each other, which is often unrealistic. Ignoring the relationship between the evidence may lead to unreasonable fusion results, and even lead to wrong decisions. This assumption severely prevents D-S evidence theory from practical application and further development. In this paper, an innovative evidence fusion model to deal with dependent evidence based on rank correlation coefficient is proposed. The model first uses rank correlation coefficient to measure the dependence degree between different evidence. Then, total discount coefficient is obtained based on the dependence degree, which also considers the impact of the reliability of evidence. Finally, the discount evidence fusion model is presented. An example is illustrated to show the use and effectiveness of the proposed method.
NASA Astrophysics Data System (ADS)
Suo, M. Q.; Li, Y. P.; Huang, G. H.
2011-09-01
In this study, an inventory-theory-based interval-parameter two-stage stochastic programming (IB-ITSP) model is proposed by integrating inventory theory into an interval-parameter two-stage stochastic optimization framework. This method can not only address system uncertainties with complex presentation but also reflect the transferring batch (the quantity transferred at once) and period (the corresponding cycle time) in decision-making problems. A case study of water allocation in water resources management planning demonstrates the applicability of this method. Under different flow levels, different transferring measures are generated by this method when the promised water cannot be met. Moreover, interval solutions associated with different transferring costs have also been provided. They can be used for generating decision alternatives and thus help water resources managers identify desired policies. Compared with the ITSP method, the IB-ITSP model can provide a positive measure for solving water shortage problems and afford useful information for decision makers under uncertainty.
Lin, Fen-Fang; Wang, Ke; Yang, Ning; Yan, Shi-Guang; Zheng, Xin-Yu
2012-02-01
In this paper, main factors affecting soil quality, such as soil type, land use pattern, lithology type, topography, road, and industry type, were used to obtain the spatial distribution characteristics of regional soil quality precisely; mutual information theory was adopted to select the main environmental factors, and the decision tree algorithm See5.0 was applied to predict the grade of regional soil quality. The main factors affecting regional soil quality were soil type, land use, lithology type, distance to town, distance to water area, altitude, distance to road, and distance to industrial land. The prediction accuracy of the decision tree model with the variables selected by mutual information was obviously higher than that of the model with all variables, and, for the former model, whether using decision trees or decision rules, the prediction accuracy was above 80%. Based on continuous and categorical data, the method of mutual information theory integrated with a decision tree could not only reduce the number of input parameters for the decision tree algorithm, but also predict and assess regional soil quality effectively.
NASA Astrophysics Data System (ADS)
Pan, Feng; Pachepsky, Yakov A.; Guber, Andrey K.; McPherson, Brian J.; Hill, Robert L.
2012-01-01
Summary: Understanding streamflow patterns in space and time is important for improving flood and drought forecasting, water resources management, and predictions of ecological changes. Objectives of this work include (a) to characterize the spatial and temporal patterns of streamflow using information theory-based measures at two thoroughly-monitored agricultural watersheds located in different hydroclimatic zones with similar land use, and (b) to elucidate and quantify temporal and spatial scale effects on those measures. We selected two USDA experimental watersheds to serve as case study examples, including the Little River experimental watershed (LREW) in Tifton, Georgia and the Sleepers River experimental watershed (SREW) in North Danville, Vermont. Both watersheds possess several nested sub-watersheds and more than 30 years of continuous data records of precipitation and streamflow. Information content measures (metric entropy and mean information gain) and complexity measures (effective measure complexity and fluctuation complexity) were computed based on the binary encoding of 5-year streamflow and precipitation time series data. We quantified patterns of streamflow using probabilities of joint or sequential appearances of the binary symbol sequences. Results of our analysis illustrate that information content measures of streamflow time series are much smaller than those for precipitation data, and the streamflow data also exhibit higher complexity, suggesting that the watersheds effectively act as filters of the precipitation information that leads to the observed additional complexity in streamflow measures. Correlation coefficients between the information-theory-based measures and time intervals are close to 0.9, demonstrating the significance of temporal scale effects on streamflow patterns. Moderate spatial scale effects on streamflow patterns are observed with absolute values of correlation coefficients between the measures and sub-watershed area varying from 0.2 to 0.6 in the two watersheds. We conclude that temporal effects must be evaluated and accounted for when the information theory-based methods are used for performance evaluation and comparison of hydrological models.
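A compact sketch of the information content measures on a binary-encoded series follows; binarizing the flow about its median and the word length L = 5 are assumptions of this illustration, not necessarily the paper's encoding.

```python
import math
from collections import Counter

def block_entropy(sym, L):
    """Shannon entropy (bits) of overlapping length-L words of a symbol sequence."""
    counts = Counter(tuple(sym[i:i + L]) for i in range(len(sym) - L + 1))
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def metric_entropy(sym, L):
    """Block entropy per symbol; in [0, 1] for a binary alphabet."""
    return block_entropy(sym, L) / L

def mean_info_gain(sym, L):
    """Average information gained by seeing one more symbol after an L-word."""
    return block_entropy(sym, L + 1) - block_entropy(sym, L)

flow = [3.2, 3.1, 8.4, 9.0, 4.2, 3.9, 3.8, 7.5, 8.1, 4.0] * 30  # toy streamflow
median = sorted(flow)[len(flow) // 2]
sym = [1 if q > median else 0 for q in flow]   # binary encoding about the median
print(metric_entropy(sym, 5), mean_info_gain(sym, 5))
```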
NASA Astrophysics Data System (ADS)
Cui, Jia; Hong, Bei; Jiang, Xuepeng; Chen, Qinghua
2017-05-01
To reinforce the correlation analysis of threat factors in risk assessment, a dynamic safety risk assessment method based on particle filtering is proposed, with threat analysis at its core. Based on risk assessment standards, the method selects threat indicators, applies a particle filtering algorithm to calculate the influence weights of the threat indicators, and determines information system risk levels by combining them with state estimation theory. To improve the computational efficiency of the particle filtering algorithm, the k-means clustering algorithm is introduced: by clustering all particles and using each cluster centroid as a representative, the amount of computation is reduced. Empirical results indicate that the method can reasonably capture the mutual dependence and influence among risk elements. Under the circumstance of limited information, it provides a scientific basis for formulating a risk management control strategy.
Laukkanen, Sanna; Kangas, Annika; Kangas, Jyrki
2002-02-01
Voting theory has a lot in common with utility theory, and especially with group decision-making. An expected-utility-maximising strategy exists in voting situations, as well as in decision-making situations. Therefore, it is natural to utilise the achievements of voting theory in group decision-making as well. Most voting systems are based on a single criterion or on holistic preference information about the decision alternatives. However, a voting scheme called multicriteria approval has been specially developed for decision-making situations with multiple criteria. This study considers voting theory from the group decision support point of view and compares it with some other methods applied to similar purposes in natural resource management. A case study is presented in which the approval voting approach is introduced to natural resources planning and tested in a forestry group decision-making process. The multicriteria approval method was found to be a potential approach for handling some challenges typical of forestry group decision support. These challenges include (i) utilising ordinal information in the evaluation of decision alternatives, (ii) being readily understandable for, and treating equally, all stakeholders in possession of different levels of knowledge on the subject considered, (iii) fast and cheap acquisition of preference information from several stakeholders, and (iv) dealing with multiple criteria.
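A minimal sketch of a multicriteria-approval-style tally is given below; real applications let stakeholders set an approval border per criterion, while this toy version approves an alternative on a criterion when it scores at least the mean across alternatives, and all plans and scores are invented.

```python
criteria = ["timber", "scenery", "biodiversity"]   # hypothetical forestry criteria
alternatives = {
    "plan_A": [7, 4, 5],
    "plan_B": [5, 6, 6],
    "plan_C": [3, 8, 7],
}
# approval border per criterion: here, the mean score across alternatives
borders = [sum(v[i] for v in alternatives.values()) / len(alternatives)
           for i in range(len(criteria))]
approvals = {name: sum(score >= b for score, b in zip(scores, borders))
             for name, scores in alternatives.items()}
print(approvals)   # the plan approved on the most criteria wins (plan_B here)
```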
Feasibility study of molecular memory device based on DNA using methylation to store information
NASA Astrophysics Data System (ADS)
Jiang, Liming; Qiu, Wanzhi; Al-Dirini, Feras; Hossain, Faruque M.; Evans, Robin; Skafidas, Efstratios
2016-07-01
DNA, because of its robustness and dense information storage capability, has been proposed as a potential candidate for next-generation storage media. However, encoding information into the DNA sequence requires molecular synthesis technology, which to date is costly and prone to synthesis errors. Reading the DNA strand information is also complex. Ideally, DNA storage will provide methods for modifying stored information. Here, we conduct a feasibility study investigating the use of the DNA 5-methylcytosine (5mC) methylation state as a molecular memory to store information. We propose a new 1-bit memory device and study, based on the density functional theory and non-equilibrium Green's function method, the feasibility of electrically reading the information. Our results show that changes to methylation states lead to changes in the peak of negative differential resistance which can be used to interrogate memory state. Our work demonstrates a new memory concept based on methylation state which can be beneficial in the design of next generation DNA based molecular electronic memory devices.
NASA Astrophysics Data System (ADS)
Fu, Libi; Song, Weiguo; Lo, Siuming
2017-01-01
Emergencies in mass events are related to a variety of factors and processes. An important factor is the transmission of information on danger, which influences nonlinear crowd dynamics during the process of crowd dispersion. Because of the considerable uncertainty in this process, a method is needed to investigate this influence. In this paper, a novel fuzzy-theory-based method is presented to study crowd dynamics under the influence of information transmission. Fuzzy functions and rules are designed for the ambiguous description of human states, and fuzzy inference is employed to decide the output values of decision making, such as pedestrian movement speeds and directions. Through simulation under four-way pedestrian situations, good crowd dispersion phenomena are achieved. Simulation results under different conditions demonstrate that information transmission cannot always induce successful crowd dispersion in all situations; this depends on whether decision strategies in response to information on danger are unified and effective, especially in dense crowds. Results also suggest that an increase in drift strength at low density, and in the percentage of pedestrians who choose one of the furthest unoccupied Von Neumann neighbors from the dangerous source as the drift direction at high density, is helpful for crowd dispersion. Compared with previous work, our comprehensive study provides an in-depth understanding of nonlinear crowd dynamics under the effect of information on danger.
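A minimal sketch of the fuzzy-inference idea follows; the membership functions, rule base, and speed universe are invented for illustration rather than the paper's calibrated design.

import numpy as np

def tri(x, a, b, c):
    # Triangular membership function on a point or an array x
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def pedestrian_speed(danger, density):
    # Mamdani-style inference for speed (fraction of maximum) in [0, 1]
    v = np.linspace(0, 1, 101)
    low_d, high_d = tri(danger, -0.5, 0, 0.6), tri(danger, 0.4, 1, 1.5)
    low_c, high_c = tri(density, -0.5, 0, 0.6), tri(density, 0.4, 1, 1.5)
    slow, fast = tri(v, 0, 0.2, 0.5), tri(v, 0.5, 0.8, 1.0)
    agg = np.minimum(min(high_d, low_c), fast)       # high danger, open space: flee fast
    agg = np.maximum(agg, np.minimum(high_c, slow))  # dense crowd: forced slow
    agg = np.maximum(agg, np.minimum(low_d, slow))   # no urgency: amble
    return float((v * agg).sum() / (agg.sum() + 1e-9))  # centroid defuzzification

print(pedestrian_speed(danger=0.9, density=0.2))  # fast escape
print(pedestrian_speed(danger=0.9, density=0.9))  # slowed by the dense crowd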
An application of information theory to stochastic classical gravitational fields
NASA Astrophysics Data System (ADS)
Angulo, J.; Angulo, J. C.; Angulo, J. M.
2018-06-01
The objective of this study lies in incorporating concepts developed in information theory (entropy, complexity, etc.) with the aim of quantifying the variation of the uncertainty associated with a stochastic physical system resident in a spatiotemporal region. As an example of application, a relativistic classical gravitational field has been considered, with a stochastic behavior resulting from the effect induced by one or several external perturbation sources. One of the key concepts of the study is the covariance kernel between two points within the chosen region. Using this concept and appropriate criteria, a methodology is proposed to evaluate the change of uncertainty at a given spatiotemporal point, based on available information and efficiently applying the diverse methods that information theory provides. For illustration, a stochastic version of the Einstein equation with an added Gaussian Langevin term is analyzed.
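For a zero-mean Gaussian model of the field, the covariance kernel fixes the joint differential entropy of the field values at n chosen spacetime points, H = (1/2) ln((2 pi e)^n det Sigma). A small Python sketch, with an RBF kernel and sample points invented for illustration:

import numpy as np

def gaussian_entropy(points, kernel):
    # Differential entropy (nats) of field values at the chosen spacetime points
    n = len(points)
    S = np.array([[kernel(p, q) for q in points] for p in points])
    return 0.5 * (n * np.log(2 * np.pi * np.e) + np.linalg.slogdet(S)[1])

rbf = lambda p, q, ell=1.0: np.exp(-np.sum((np.array(p) - np.array(q)) ** 2) / (2 * ell ** 2))
close = [(0, 0), (0.1, 0), (0.2, 0)]   # strongly correlated points
far = [(0, 0), (3, 0), (6, 0)]         # nearly independent points
print(gaussian_entropy(close, rbf), gaussian_entropy(far, rbf))

Strong correlation between nearby points lowers the joint entropy, i.e., knowing the field at one point reduces the uncertainty at its neighbors.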
Langholz, Bryan; Thomas, Duncan C.; Stovall, Marilyn; Smith, Susan A.; Boice, John D.; Shore, Roy E.; Bernstein, Leslie; Lynch, Charles F.; Zhang, Xinbo; Bernstein, Jonine L.
2009-01-01
Summary: Methods for the analysis of individually matched case-control studies with location-specific radiation dose and tumor location information are described. These include likelihood methods for analyses that use only cases with precise tumor location information, and methods that also include cases with imprecise tumor location information. The theory establishes that each of these likelihood-based methods estimates the same radiation rate ratio parameters, within the context of the appropriate model for location- and subject-level covariate effects. The underlying assumptions are characterized, and the potential strengths and limitations of each method are described. The methods are illustrated and compared using the WECARE study of radiation and asynchronous contralateral breast cancer. PMID:18647297
A novel model for DNA sequence similarity analysis based on graph theory.
Qi, Xingqin; Wu, Qin; Zhang, Yusen; Fuller, Eddie; Zhang, Cun-Quan
2011-01-01
Determination of sequence similarity is one of the major steps in computational phylogenetic studies. As we know, during evolutionary history, not only DNA mutations of individual nucleotides but also subsequent rearrangements have occurred. It has been one of the major tasks of computational biologists to develop novel mathematical descriptors for similarity analysis such that various mutation phenomena are captured simultaneously. In this paper, departing from the traditional bases for constructing mathematical descriptors (e.g., nucleotide frequency, geometric representations), we construct novel mathematical descriptors based on graph theory. In particular, for each DNA sequence we set up a weighted directed graph, whose adjacency matrix is used to induce a representative vector for the sequence. This new approach measures similarity based on both the ordering and the frequency of nucleotides, so that much more information is involved. As an application, the method is tested on a set of 0.9-kb mtDNA sequences of twelve different primate species. All output phylogenetic trees with various distance estimations have the same topology and are generally consistent with the reported results from earlier studies, which demonstrates the new method's efficiency; we also test the new method on a simulated dataset, which shows that it performs better than the traditional global alignment method when subsequent rearrangements happen frequently during evolutionary history.
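One plausible construction in the spirit of the paper (the authors' exact weighting is not reproduced here): every ordered nucleotide pair at distance d contributes weight 1/d to an arc of a 4-node digraph, so both ordering and frequency enter the descriptor.

import numpy as np

NUC = {"A": 0, "C": 1, "G": 2, "T": 3}

def sequence_descriptor(seq, max_gap=3):
    # Weighted digraph on {A, C, G, T}; the flattened, normalized adjacency
    # matrix serves as the representative vector of the sequence
    A = np.zeros((4, 4))
    idx = [NUC[c] for c in seq if c in NUC]
    for d in range(1, max_gap + 1):
        for u, v in zip(idx[:-d], idx[d:]):
            A[u, v] += 1.0 / d
    total = A.sum()
    return (A / total).ravel() if total else A.ravel()

def descriptor_distance(s1, s2):
    # Euclidean distance between descriptors as a sequence dissimilarity
    return float(np.linalg.norm(sequence_descriptor(s1) - sequence_descriptor(s2)))

print(descriptor_distance("ACGTACGTAC", "ACGTACGTAG"))  # small: near-identical
print(descriptor_distance("ACGTACGTAC", "TTTTGGGGCC"))  # larger: dissimilar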
Design and realization of retina-like three-dimensional imaging based on a MOEMS mirror
NASA Astrophysics Data System (ADS)
Cao, Jie; Hao, Qun; Xia, Wenze; Peng, Yuxin; Cheng, Yang; Mu, Jiaxing; Wang, Peng
2016-07-01
To balance the conflicting demands of high-resolution, large-field-of-view, and real-time imaging, a retina-like imaging method based on time-of-flight (TOF) is proposed. Mathematical models of 3D imaging based on a MOEMS mirror are developed. Based on this method, we perform simulations of retina-like scanning properties, including compression of redundant information and rotation and scaling invariance. To validate the theory, we developed a prototype and conducted relevant experiments. The preliminary results agree well with the simulations.
Greville-Harris, Maddy; Bostock, Jennifer; Din, Amy; Graham, Cynthia A; Lewith, George; Liossi, Christina; O'Riordan, Tim; White, Peter; Yardley, Lucy; Bishop, Felicity L
2016-06-10
According to established ethical principles and guidelines, patients in clinical trials should be fully informed about the interventions they might receive. However, information about placebo-controlled clinical trials typically focuses on the new intervention being tested and provides limited and at times misleading information about placebos. We aimed to create an informative, scientifically accurate, and engaging website that could be used to improve understanding of placebo effects among patients who might be considering taking part in a placebo-controlled clinical trial. Our approach drew on evidence-, theory-, and person-based intervention development. We used existing evidence and theory about placebo effects to develop content that was scientifically accurate. We used existing evidence and theory of health behavior to ensure our content would be communicated persuasively, to an audience who might currently be ignorant or misinformed about placebo effects. A qualitative 'think aloud' study was conducted in which 10 participants viewed prototypes of the website and spoke their thoughts out loud in the presence of a researcher. The website provides information about 10 key topics and uses text, evidence summaries, quizzes, audio clips of patients' stories, and a short film to convey key messages. Comments from participants in the think aloud study highlighted occasional misunderstandings and off-putting/confusing features. These were addressed by modifying elements of content, style, and navigation to improve participants' experiences of using the website. We have developed an evidence-based website that incorporates theory-based techniques to inform members of the public about placebos and placebo effects. Qualitative research ensured our website was engaging and convincing for our target audience who might not perceive a need to learn about placebo effects. Before using the website in clinical trials, it is necessary to test its effects on key outcomes including patients' knowledge and capacity for making informed choices about placebos.
ERIC Educational Resources Information Center
DeMars, Christine E.
2012-01-01
In structural equation modeling software, either limited-information (bivariate proportions) or full-information item parameter estimation routines could be used for the 2-parameter item response theory (IRT) model. Limited-information methods assume the continuous variable underlying an item response is normally distributed. For skewed and…
A new method based on Dempster-Shafer theory and fuzzy c-means for brain MRI segmentation
NASA Astrophysics Data System (ADS)
Liu, Jie; Lu, Xi; Li, Yunpeng; Chen, Xiaowu; Deng, Yong
2015-10-01
In this paper, a new method is proposed to reduce sensitivity to motion noise and uncertainty in magnetic resonance imaging (MRI) segmentation, especially when only one brain image is available. The method incorporates spatial neighborhood information by fusing the information of each pixel with that of its neighbors using Dempster-Shafer (DS) theory. The basic probability assignment (BPA) of each single hypothesis is obtained from the membership function produced by applying fuzzy c-means (FCM) clustering to the gray levels of the MRI. Multiple hypotheses are then generated from the single hypotheses, and the objective pixel's BPA is updated by fusing it with the BPAs of its neighbors to obtain the final result. Examples in MRI segmentation are demonstrated at the end of the paper, in which our method is compared with some previous methods. The results show that the proposed method is more effective than the other methods in motion-blurred MRI segmentation.
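The fusion step can be sketched for a two-class frame in a few lines of Python; the ignorance mass held back from each FCM membership is an illustrative parameter.

def dempster(m1, m2):
    # Dempster's rule over frame {a, b} with focal sets 'a', 'b', 'ab'
    K = m1['a'] * m2['b'] + m1['b'] * m2['a']   # conflict mass
    norm = 1.0 - K                              # > 0 here: some mass is always ignorance
    return {
        'a':  (m1['a'] * m2['a'] + m1['a'] * m2['ab'] + m1['ab'] * m2['a']) / norm,
        'b':  (m1['b'] * m2['b'] + m1['b'] * m2['ab'] + m1['ab'] * m2['b']) / norm,
        'ab': (m1['ab'] * m2['ab']) / norm,
    }

def bpa_from_membership(u, ignorance=0.2):
    # Turn an FCM membership u for class 'a' into a BPA, holding back ignorance
    return {'a': (1 - ignorance) * u, 'b': (1 - ignorance) * (1 - u), 'ab': ignorance}

pixel = bpa_from_membership(0.55)            # ambiguous pixel
for u_neighbor in (0.8, 0.7, 0.9):           # fuse with its neighbors' BPAs
    pixel = dempster(pixel, bpa_from_membership(u_neighbor))
print(pixel)                                 # mass shifts decisively toward 'a'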
Impact of a theory-based video on initiation of long-acting reversible contraception after abortion.
Davidson, AuTumn S; Whitaker, Amy K; Martins, Summer L; Hill, Brandon; Kuhn, Caroline; Hagbom-Ma, Catherine; Gilliam, Melissa
2015-03-01
Adoption of long-acting reversible contraception (LARC) (i.e., the intrauterine device or the contraceptive implant) immediately after abortion is associated with high contraceptive satisfaction and reduced rates of repeat abortion. Theory-based counseling interventions have been demonstrated to improve a variety of health behaviors; data on theory-based counseling interventions for postabortion contraception are lacking. Informed by the transtheoretical model of behavioral change, a video intervention was developed to increase awareness of, and dispel misconceptions about, LARC methods. The intervention was evaluated in a randomized controlled trial among women aged 18-29 years undergoing surgical abortion at a clinic in Chicago, IL. Participants were randomized 1:1 to watch the intervention video or a stress management video (control), both 7 minutes in duration. Contraceptive methods were supplied to all participants free of charge. Rates of LARC initiation immediately after abortion were not significantly different between the 2 study arms; 59.6% in the intervention arm and 51.6% in the control arm chose a LARC method (P = .27). This study resulted in an unexpectedly high rate of LARC initiation immediately after abortion, which could not be attributed to the theory-based counseling intervention. Copyright © 2015 Elsevier Inc. All rights reserved.
IRT Model Selection Methods for Dichotomous Items
ERIC Educational Resources Information Center
Kang, Taehoon; Cohen, Allan S.
2007-01-01
Fit of the model to the data is important if the benefits of item response theory (IRT) are to be obtained. In this study, the authors compared model selection results using the likelihood ratio test, two information-based criteria, and two Bayesian methods. An example illustrated the potential for inconsistency in model selection depending on…
Ko, Linda K.; Turner-McGrievy, Gabrielle; Campbell, Marci K.
2016-01-01
Podcasting is an emerging technology, and previous interventions have shown promising results using theory-based podcasts for weight loss among overweight and obese individuals. This study investigated whether constructs of social cognitive theory and information processing theories (IPTs) mediate the effect of a podcast intervention on weight loss among overweight individuals. Data are from Pounds off Digitally, a study testing the efficacy of two weight loss podcast interventions (a control podcast and a theory-based podcast). Path models were constructed (n = 66). The IPTs—elaboration likelihood model, information control theory, and cognitive load theory—mediated the effect of the theory-based podcast on weight loss. The intervention was significantly associated with all IPTs. Information control theory and cognitive load theory were related to elaboration, and elaboration was associated with weight loss. Social cognitive theory constructs did not mediate weight loss. Future podcast interventions grounded in theory may be effective in promoting weight loss. PMID:24082027
Deafness, thought bubbles, and theory-of-mind development.
Wellman, Henry M; Peterson, Candida C
2013-12-01
The processes and mechanisms of theory-of-mind development were examined via a training study of false-belief conceptions in deaf children of hearing parents (N = 43). In comparison to 2 different control conditions, training based on thought-bubble instruction about beliefs was linked with improved false-belief understanding as well as progress on a broader theory-of-mind scale. By combining intervention, microgenetic, and developmental scaling methods, the findings provide informative data about the nature and mechanisms of theory-of-mind change in deaf children, as well as an initial demonstration of a useful intervention for enhancing social cognition in deaf children of hearing parents. The methods and results also point to possible avenues for the study of conceptual change more generally. PsycINFO Database Record (c) 2013 APA, all rights reserved.
Song, Misoon; Choi, Suyoung; Kim, Se-An; Seo, Kyoungsan; Lee, Soo Jin
2015-01-01
Development of behavior theory-based health promotion programs is encouraged with the paradigm shift from contents to behavior outcomes. This article describes the development process of a diabetes self-management program for older Koreans (DSME-OK) using the intervention mapping (IM) protocol. The IM protocol includes needs assessment, defining goals and objectives, identifying theory and determinants, developing a matrix to form change objectives, selecting strategies and methods, structuring the program, and planning for evaluation and pilot testing. The DSME-OK adopted seven behavior objectives developed by the American Association of Diabetes Educators as behavioral outcomes. The program applied the information-motivation-behavioral skills (IMB) model, and interventions targeted its 3 determinants to change health behaviors. Specific methods were selected to achieve each objective, guided by the IM protocol. As the final step, program evaluation, including a pilot test, was planned. The DSME-OK was structured so that the 3 determinants of the IMB model were addressed to achieve the behavior objectives in each session. The program has 12 weekly 90-min sessions tailored for older adults. Using the IM protocol in developing a theory-based self-management program was beneficial in terms of providing a systematic guide to developing theory-based and behavior-outcome-focused health education programs.
An Improved Evidential-IOWA Sensor Data Fusion Approach in Fault Diagnosis
Zhou, Deyun; Zhuang, Miaoyan; Fang, Xueyi; Xie, Chunhe
2017-01-01
As an important tool of information fusion, Dempster–Shafer evidence theory is widely applied in handling uncertain information in fault diagnosis. However, an incorrect result may be obtained if the combined evidence is highly conflicting, which may lead to failure in locating the fault. To deal with this problem, an improved evidential-Induced Ordered Weighted Averaging (IOWA) sensor data fusion approach is proposed in the frame of Dempster–Shafer evidence theory. In the new method, the IOWA operator is used to determine the weight of each sensor data source; in determining the parameters of the IOWA operator, both the distance of evidence and the belief entropy are taken into consideration. First, based on the global distance of evidence and the global belief entropy, the α value of the IOWA is obtained, and a weight vector is given based on the maximum entropy model. Then, according to the IOWA operator, the evidence is modified before applying Dempster's combination rule. The proposed method has a better performance in conflict management and fault diagnosis because the information volume of each piece of evidence is taken into consideration. A numerical example and a case study in fault diagnosis are presented to show the rationality and efficiency of the proposed method. PMID:28927017
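The weighting idea can be sketched as follows; the L2 evidence distance and the scalar support-times-volume weights are simplified stand-ins for the paper's IOWA machinery, with Deng entropy as the belief entropy.

import numpy as np

def deng_entropy(m, card):
    # Deng (belief) entropy; card[F] is the cardinality |F| of focal set F
    return -sum(v * np.log2(v / (2 ** card[F] - 1)) for F, v in m.items() if v > 0)

def bpa_distance(m1, m2, keys):
    # L2 proxy for the distance of evidence (the Jousselme distance would also
    # weight by set intersections; omitted for brevity)
    return np.sqrt(0.5 * sum((m1.get(k, 0) - m2.get(k, 0)) ** 2 for k in keys))

def weighted_average_evidence(bpas, keys, card):
    # Support = closeness to the other bodies of evidence;
    # information volume = 2 ** Deng entropy
    n = len(bpas)
    sup = np.array([sum(1 - bpa_distance(bpas[i], bpas[j], keys)
                        for j in range(n) if j != i) for i in range(n)])
    vol = np.array([2.0 ** deng_entropy(m, card) for m in bpas])
    w = sup * vol
    w /= w.sum()
    return {k: float(sum(w[i] * bpas[i].get(k, 0) for i in range(n))) for k in keys}

keys, card = ['a', 'b', 'ab'], {'a': 1, 'b': 1, 'ab': 2}
e1 = {'a': 0.7, 'b': 0.1, 'ab': 0.2}
e2 = {'a': 0.0, 'b': 0.9, 'ab': 0.1}   # conflicting sensor, down-weighted
e3 = {'a': 0.6, 'b': 0.2, 'ab': 0.2}
print(weighted_average_evidence([e1, e2, e3], keys, card))

In the full scheme, the averaged evidence is then combined with itself n-1 times via Dempster's rule before the diagnosis is read off.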
2014-05-01
DISTRIBUTION A. Approved for public release: distribution unlimited. Information, Understanding, and Influence: An Agency Theory Strategy for Air Base Communications and Cyberspace Support. …the report analyzes present communications and cyberspace support organizations. Next, it introduces a strategy based on this analysis to bring information, understanding…
Semiclassical Path Integral Calculation of Nonlinear Optical Spectroscopy.
Provazza, Justin; Segatta, Francesco; Garavelli, Marco; Coker, David F
2018-02-13
Computation of nonlinear optical response functions allows for an in-depth connection between theory and experiment. Experimentally recorded spectra provide a high density of information, but to objectively disentangle overlapping signals and to reach a detailed and reliable understanding of the system dynamics, measurements must be integrated with theoretical approaches. Here, we present a new, highly accurate and efficient trajectory-based semiclassical path integral method for computing higher order nonlinear optical response functions for non-Markovian open quantum systems. The approach is, in principle, applicable to general Hamiltonians and does not require any restrictions on the form of the intrasystem or system-bath couplings. This method is systematically improvable and is shown to be valid in parameter regimes where perturbation theory-based methods qualitatively break down. As a test of the methodology presented here, we study a system-bath model for a coupled dimer for which we compare against numerically exact results and standard approximate perturbation theory-based calculations. Additionally, we study a monomer with discrete vibronic states that serves as the starting point for future investigation of vibronic signatures in nonlinear electronic spectroscopy.
A Monte-Carlo game theoretic approach for Multi-Criteria Decision Making under uncertainty
NASA Astrophysics Data System (ADS)
Madani, Kaveh; Lund, Jay R.
2011-05-01
Game theory provides a useful framework for studying Multi-Criteria Decision Making problems. This paper suggests modeling Multi-Criteria Decision Making problems as strategic games and solving them using non-cooperative game theory concepts. The suggested method can be used to prescribe non-dominated solutions and also to predict the outcome of a decision making problem. Non-cooperative stability definitions for solving the games allow consideration of non-cooperative behaviors, often neglected by other methods, which assume perfect cooperation among decision makers. To deal with the uncertainty in input variables, a Monte-Carlo Game Theory (MCGT) approach is suggested which maps the stochastic problem into many deterministic strategic games. The games are solved using non-cooperative stability definitions, and the results include the possible effects of uncertainty in input variables on outcomes. The method can handle multi-criteria, multi-decision-maker problems with uncertainty. The suggested method does not require criteria weighting, the development of a compound decision objective, or accurate quantitative (cardinal) information, as it simplifies the decision analysis by solving problems based on qualitative (ordinal) information, reducing the computational burden substantially. The MCGT method is applied to analyze California's Sacramento-San Joaquin Delta problem. The suggested method provides insights, identifies non-dominated alternatives, and predicts likely decision outcomes.
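A compact sketch of the MCGT loop: sample payoff realizations, solve each resulting deterministic game for its pure-strategy Nash equilibria, and tally the outcome frequencies; the 2x2 payoffs and noise model are invented for illustration.

import numpy as np
from itertools import product

def pure_nash(payoffs):
    # Pure-strategy Nash equilibria; payoffs[p][cell] is player p's payoff
    shape = payoffs[0].shape
    eq = []
    for cell in product(*(range(s) for s in shape)):
        stable = True
        for p, pay in enumerate(payoffs):
            for dev in range(shape[p]):
                alt = list(cell); alt[p] = dev
                if pay[tuple(alt)] > pay[cell]:
                    stable = False
                    break
            if not stable:
                break
        if stable:
            eq.append(cell)
    return eq

rng = np.random.default_rng(7)
mean0 = np.array([[3.0, 0.0], [5.0, 1.0]])   # decision maker 1's mean payoffs
mean1 = np.array([[3.0, 5.0], [0.0, 1.0]])   # decision maker 2's mean payoffs
counts = {}
for _ in range(1000):                        # one deterministic game per draw
    game = [mean0 + rng.normal(size=(2, 2)), mean1 + rng.normal(size=(2, 2))]
    for e in pure_nash(game):
        counts[e] = counts.get(e, 0) + 1
print(counts)    # frequency of each stable outcome under payoff uncertainty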
Classification of weld defect based on information fusion technology for radiographic testing system
Jiang, Hongquan; Liang, Zeming; Gao, Jianmin; Dang, Changying
2016-03-01
Improving the efficiency and accuracy of weld defect classification is an important technical problem in developing radiographic testing systems. This paper proposes a novel weld defect classification method based on information fusion technology, namely Dempster-Shafer evidence theory. First, to characterize weld defects and improve the accuracy of their classification, 11 weld defect features were defined based on the sub-pixel level edges of radiographic images, four of which are presented for the first time in this paper. Second, we applied information fusion technology to combine different features for weld defect classification, including a mass function defined from the weld defect feature information and a quartile-method-based calculation of the standard weld defect classes, which addresses the problem of a limited number of training samples. A steam turbine weld defect classification case study is also presented herein to illustrate our technique. The results show that the proposed method can increase the correct classification rate with limited training samples and address the uncertainties associated with weld defect classification.
Change detection of bitemporal multispectral images based on FCM and D-S theory
NASA Astrophysics Data System (ADS)
Shi, Aiye; Gao, Guirong; Shen, Shaohong
2016-12-01
In this paper, we propose a change detection method for bitemporal multispectral images based on D-S theory and the fuzzy c-means (FCM) algorithm. Firstly, the uncertainty and certainty regions are determined by a thresholding method applied to the magnitudes of the difference image (MDI) and the spectral angle information (SAI) of the bitemporal images. Secondly, the FCM algorithm is applied to the MDI and SAI in the uncertainty region, respectively. Then, the basic probability assignment (BPA) functions of the changed and unchanged classes are obtained from the fuzzy membership values of the FCM algorithm. In addition, the optimal value of the fuzzy exponent of FCM is adaptively determined by the conflict degree between the MDI and SAI in the uncertainty region. Finally, D-S theory is applied to obtain a new fuzzy partition matrix for the uncertainty region, from which the change map is obtained. Experiments on bitemporal Landsat TM images and bitemporal SPOT images validate that the proposed method is effective.
A variable-order laminated plate theory based on the variational-asymptotical method
NASA Technical Reports Server (NTRS)
Lee, Bok W.; Sutyrin, Vladislav G.; Hodges, Dewey H.
1993-01-01
The variational-asymptotical method is a mathematical technique by which the three-dimensional analysis of laminated plate deformation can be split into a linear, one-dimensional, through-the-thickness analysis and a nonlinear, two-dimensional, plate analysis. The elastic constants used in the plate analysis are obtained from the through-the-thickness analysis, along with approximate, closed-form three-dimensional distributions of displacement, strain, and stress. In this paper, a theory based on this technique is developed which is capable of approximating three-dimensional elasticity to any accuracy desired. The asymptotical method allows for the approximation of the through-the-thickness behavior in terms of the eigenfunctions of a certain Sturm-Liouville problem associated with the thickness coordinate. These eigenfunctions contain all the necessary information about the nonhomogeneities along the thickness coordinate of the plate and thus possess the appropriate discontinuities in the derivatives of displacement. The theory is presented in this paper along with numerical results for the eigenfunctions of various laminated plates.
Time delayed Ensemble Nudging Method
NASA Astrophysics Data System (ADS)
An, Zhe; Abarbanel, Henry
Optimal nudging methods based on time-delayed embedding theory have shown potential for analysis and data assimilation in the previous literature. To extend the application and promote practical implementation, a new nudging assimilation method based on the time-delayed embedding space is presented, and its connection with other standard assimilation methods is studied. Results show that incorporating information from the time series of data can reduce the number of observations needed to preserve the quality of numerical prediction, making it a potential alternative in the field of data assimilation for large geophysical models.
The Effects of Single and Dual Coded Multimedia Instructional Methods on Chinese Character Learning
ERIC Educational Resources Information Center
Wang, Ling
2013-01-01
Learning Chinese characters is a difficult task for adult English native speakers due to the significant differences between the Chinese and English writing system. The visuospatial properties of Chinese characters have inspired the development of instructional methods using both verbal and visual information based on the Dual Coding Theory. This…
ERIC Educational Resources Information Center
Wonacott, Michael E.
Both face-to-face and distance learning methods are currently being used in adult education and career and technical education. In theory, the advantages of face-to-face and distance learning methods complement each other. In practice, however, both face-to-face and information and communications technology (ICT)-based distance programs often rely…
Anticipated detection of favorable periods for wind energy production by means of information theory
NASA Astrophysics Data System (ADS)
Vogel, Eugenio; Saravia, Gonzalo; Kobe, Sigismund; Schumann, Rolf; Schuster, Rolf
Managing the electric power produced by different sources requires mixing the different response times they present. Thus, for instance, coal burning presents large time lags until operational conditions are reached, while hydroelectric generation can react in a matter of seconds or a few minutes to reach the desired productivity. Wind energy production (WEP) can be instantaneously fed to the network to save fuels with low thermal inertia (gas burning, for instance), but this source presents sudden variations within a few hours. We report here, for the first time, a method based on information theory to handle WEP. This method has been successful in detecting dynamical changes in magnetic transitions and variations of stock markets. An algorithm called wlzip, based on information recognition, is used to recognize the information content of a time series. We make use of publicly available energy data in Germany to simulate real applications. After a calibration process, the system can recognize directly on the WEP data the onset of favorable periods of a desired strength. Optimization can lead to a few hours of anticipation, which is enough to control the mixture of WEP with other energy sources, thus saving fuels.
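A rough stand-in for the approach, with zlib compression replacing the authors' wlzip recognizer and invented production levels: coarse-grain the WEP series against fixed thresholds and track the compressed size of a sliding window; a sustained rise flags the onset of a variable, favorable period.

import zlib
import numpy as np

def information_content(window, bins=(1.0, 3.0, 5.0)):
    # Quantize production into fixed levels; the compressed byte count of the
    # symbol string serves as a proxy for the window's information content
    symbols = np.digitize(window, bins).astype(np.uint8)
    return len(zlib.compress(bytes(symbols)))

rng = np.random.default_rng(3)
calm = np.abs(rng.normal(2.0, 0.1, 256))    # steady, low production
gusty = np.abs(rng.normal(5.0, 2.0, 256))   # variable, favorable period
print(information_content(calm), information_content(gusty))

Sliding the window along the production record and watching for a sustained jump in this measure gives the few hours of anticipation described above.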
Discovery of Empirical Components by Information Theory
2016-08-10
AFRL-AFOSR-VA-TR-2016-0289. Amit Singer, Trustees of Princeton University, 1 Nassau Hall. Dates covered: 15 Feb 2013 to 14 Feb 2016. …they draw not only from traditional linear algebra based numerical analysis or approximation theory, but also from information theory, graph theory…
Renewed roles for librarians in problem-based learning in the medical curriculum.
Mi, Misa
2011-01-01
Problem-based learning (PBL) is a teaching-learning process or method of instruction that is widely used in medical education curricula. Librarians play important roles as facilitators for PBL as well as guides for information resources. Involvement in PBL activities presents unique opportunities to incorporate library resources and instruction into the medical curriculum. This article reviews the problem-based learning method within the conceptual framework of the learning theory of constructivism. It describes how a medical librarian at a U.S. medical school used emerging technologies to facilitate PBL small group case discussions, guide students to quality information resources, and enhance the learning environment for the PBL process.
Systematic searching for theory to inform systematic reviews: is it feasible? Is it desirable?
Booth, Andrew; Carroll, Christopher
2015-09-01
In recognising the potential value of theory for understanding how interventions work comes a challenge: how to make the identification of theory less haphazard? We explored the feasibility of systematically identifying theory. We searched PubMed for published reviews (1998-2012) that had explicitly sought to identify theory. Systematic searching may be characterised by a structured question, methodological filters and an itemised search procedure. We constructed a template (BeHEMoTh - Behaviour of interest; Health context; Exclusions; Models or Theories) for use when systematically identifying theory, and tested it within two systematic reviews. Of 34 systematic reviews, only 12 (35%) reported a method for identifying theory; nineteen did not specify how they identified studies containing theory, and data were unavailable for three reviews. Candidate terms include concept(s)/conceptual, framework(s), model(s), and theory/theories/theoretical. Information professionals must overcome inadequate reporting and the use of theory out of context; the review team faces the additional concern of a lack of 'theory fidelity'. Based on experience with two systematic reviews, the BeHEMoTh template and procedure offer a feasible and useful approach for the identification of theory. Applications include realist synthesis, framework synthesis and reviews of complex interventions. The procedure requires rigorous evaluation. © 2015 Health Libraries Group.
Research on fatigue driving pre-warning system based on multi-information fusion
NASA Astrophysics Data System (ADS)
Zhao, Xuyang; Ye, Wenwu
2018-05-01
With the development of science and technology, the transportation network has grown rapidly, but the number of traffic accidents due to fatigue driving has grown as well, and fatigue driving has become one of the main causes of traffic accidents. It is therefore indispensable to study the detection of fatigue driving to improve driving safety. There are numerous discrimination approaches, each with its own theoretical basis, but the disadvantages of traditional fatigue driving detection methods become apparent when the physiological and psychological features of fatigued drivers are studied. We therefore set up a new system based on multi-information fusion and pattern recognition theory. In this paper, the fatigue driving pre-warning system discriminates fatigue by analyzing characteristic parameters derived from the steering wheel angle, the driver's grip force, and the heart rate. The data analysis system is established based on fuzzy C-means clustering theory. Finally, a KNN classifier is used to establish the relation between the feature indexes and the fatigue degree. A confirmatory experiment verified that the system has good accuracy, agility, and robustness.
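A toy sketch of the final classification stage, with synthetic placeholder features (steering-angle variability, grip force, heart rate); the paper's fuzzy C-means stage and calibrated indexes are not reproduced.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
# Rows: [steering-angle std (deg), grip force (N), heart rate (bpm)]; 0 = alert, 1 = fatigued
alert = np.column_stack([rng.normal(2, 0.5, 80), rng.normal(30, 3, 80), rng.normal(75, 5, 80)])
tired = np.column_stack([rng.normal(5, 1.0, 80), rng.normal(20, 3, 80), rng.normal(62, 5, 80)])
X = np.vstack([alert, tired])
y = np.r_[np.zeros(80), np.ones(80)]

scaler = StandardScaler().fit(X)
knn = KNeighborsClassifier(n_neighbors=5).fit(scaler.transform(X), y)
sample = scaler.transform([[4.8, 21.0, 63.0]])   # one fused observation
print(knn.predict(sample))                       # [1.] -> pre-warning triggered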
Ford, John A; Jones, Andrew P; Wong, Geoff; Clark, Allan B; Porter, Tom; Shakespeare, Tom; Swart, Ann Marie; Steel, Nicholas
2015-09-18
The UK has an ageing population, especially in rural areas, where deprivation is high among older people. Previous research has identified this group as at high risk of poor access to healthcare. The aim of this study is to generate a theory of how socioeconomically disadvantaged older people from rural areas access primary care, to develop an intervention based on this theory and test it in a feasibility trial. On the basis of the MRC Framework for Developing and Evaluating Complex Interventions, three methods will be used to generate the theory. First, a realist review will elucidate the patient pathway based on existing literature. Second, an analysis of the English Longitudinal Study of Ageing will be completed using structural equation modelling. Third, 15 semistructured interviews will be undertaken with patients and four focus groups with health professionals. A triangulation protocol will be used to allow each of these methods to inform and be informed by each other, and to integrate data into one overall realist theory. Based on this theory, an intervention will be developed in discussion with stakeholders to ensure that the intervention is feasible and practical. The intervention will be tested within a feasibility trial, the design of which will depend on the intervention. Lessons from the feasibility trial will be used to refine the intervention and gather the information needed for a definitive trial. Ethics approval from the regional ethics committee has been granted for the focus groups with health professionals and interviews with patients. Ethics approval will be sought for the feasibility trial after the intervention has been designed. Findings will be disseminated to the key stakeholders involved in intervention development, to researchers, clinicians and health planners through peer-reviewed journal articles and conference publications, and locally through a dissemination event. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Entropy in sound and vibration: towards a new paradigm.
Le Bot, A
2017-01-01
This paper describes a discussion of the method and the status of a statistical theory of sound and vibration, called statistical energy analysis (SEA). SEA is a simple theory of sound and vibration in elastic structures that applies when the vibrational energy is diffusely distributed. We show that SEA is a thermodynamical theory of sound and vibration, based on a law of exchange of energy analogous to the Clausius principle. We further investigate the notion of entropy in this context and discuss its meaning. We show that entropy is a measure of the information lost in the passage from the classical theory of sound and vibration to SEA, its thermodynamical counterpart.
Jiang, Quansheng; Shen, Yehu; Li, Hua; Xu, Fengyu
2018-01-24
Feature recognition and fault diagnosis play an important role in equipment safety and the stable operation of rotating machinery. In order to cope with the complexity of the vibration signals of rotating machinery, a feature fusion model based on information entropy and a probabilistic neural network is proposed in this paper. The new method first uses information entropy theory to extract three kinds of characteristic entropy from the vibration signals, namely, singular spectrum entropy, power spectrum entropy, and approximate entropy. A feature fusion model is then constructed to classify and diagnose the fault signals. The proposed approach combines comprehensive information from different aspects and is more sensitive to the fault features. Experimental results on simulated fault signals verified the better performance of the proposed approach. On real two-span rotor data, the fault detection accuracy of the new method is more than 10% higher than that of the methods using the three kinds of information entropy separately. The new approach is proved to be an effective fault recognition method for rotating machinery.
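Two of the three entropy features can be written compactly (singular spectrum entropy is analogous, computed from the singular values of a trajectory matrix); the tolerance and embedding dimension of the approximate entropy follow common defaults rather than the paper's settings.

import numpy as np

def power_spectrum_entropy(x):
    # Shannon entropy of the normalized power spectrum
    p = np.abs(np.fft.rfft(x - np.mean(x))) ** 2
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def approximate_entropy(x, m=2, r=None):
    # Pincus ApEn: (ir)regularity of length-m patterns within tolerance r
    x = np.asarray(x, dtype=float)
    r = 0.2 * x.std() if r is None else r
    def phi(mm):
        emb = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        d = np.max(np.abs(emb[:, None] - emb[None, :]), axis=2)
        return np.log((d <= r).mean(axis=1)).mean()
    return float(phi(m) - phi(m + 1))

t = np.linspace(0, 1, 512)
healthy = np.sin(2 * np.pi * 50 * t)
faulty = healthy + 0.5 * np.random.default_rng(2).normal(size=512)  # broadband fault
for sig in (healthy, faulty):
    print(power_spectrum_entropy(sig), approximate_entropy(sig))

Both entropies rise for the noisier, fault-like signal, which is what makes them useful inputs to the fusion model.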
Similarity Theory of Withdrawn Water Temperature Experiment
2015-01-01
Selective withdrawal from a thermally stratified reservoir has been widely utilized in managing reservoir water withdrawal. Besides theoretical analysis and numerical simulation, model tests are also necessary in studying the temperature of withdrawn water. However, information on the similarity theory of the withdrawn water temperature model remains lacking. Considering the flow features of selective withdrawal, the similarity theory of the withdrawn water temperature model was analyzed theoretically based on the modification of the governing equations, the Boussinesq approximation, and some simplifications. The similarity conditions between the model and the prototype are suggested, and the conversion of withdrawn water temperature between the model and the prototype is proposed. Meanwhile, the fundamental theory of temperature distribution conversion is proposed for the first time, which can significantly improve experimental efficiency when the basic temperature of the model differs from that of the prototype. Based on the similarity theory, an experiment was performed on the withdrawn water temperature and verified by a numerical method. PMID:26065020
Fiber tracking of brain white matter based on graph theory.
Lu, Meng
2015-01-01
Brain white matter tractography is reconstructed from diffusion-weighted magnetic resonance images. Due to the complex structure of brain white matter fiber bundles, fiber crossing and fiber branching are abundant in the human brain, and regular methods based on diffusion tensor imaging (DTI) cannot accurately handle these problems, which are among the biggest challenges in brain tractography. Therefore, this paper presents a novel brain white matter tractography method based on graph theory, in which fiber tracking between two voxels is transformed into locating the shortest path in a graph. Besides, the presented method uses Q-ball imaging (QBI) as the source data instead of DTI, because QBI can provide accurate information about multiple fiber crossings and branchings in one voxel using the orientation distribution function (ODF). Experiments showed that the presented method can accurately handle the problem of brain white matter fiber crossing and branching, and reconstruct brain tractography in both phantom data and real brain data.
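The shortest-path core of such a method is a standard Dijkstra search over a voxel graph; in the sketch below a scalar cost field stands in for the ODF-derived edge weights, a simplification of the QBI-based weighting.

import heapq
import numpy as np

def shortest_fiber_path(cost, start, goal):
    # Dijkstra on a 2-D voxel grid; cost[y, x] > 0 is the price of entering a voxel
    h, w = cost.shape
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:
            break
        if d > dist.get(u, np.inf):
            continue
        y, x = u
        for v in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
            if 0 <= v[0] < h and 0 <= v[1] < w and d + cost[v] < dist.get(v, np.inf):
                dist[v], prev[v] = d + cost[v], u
                heapq.heappush(heap, (dist[v], v))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

cost = np.ones((5, 5))
cost[2, 1:4] = 0.1                                # a low-cost "fiber" corridor
print(shortest_fiber_path(cost, (2, 0), (2, 4)))  # the track hugs the corridor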
Control Theory based Shape Design for the Incompressible Navier-Stokes Equations
NASA Astrophysics Data System (ADS)
Cowles, G.; Martinelli, L.
2003-12-01
A design method for shape optimization in incompressible turbulent viscous flow has been developed and validated for inverse design. The gradient information is determined using a control theory based algorithm. With such an approach, the cost of computing the gradient is negligible. An additional adjoint system must be solved which requires the cost of a single steady state flow solution. Thus, this method has an enormous advantage over traditional finite-difference based algorithms. The method of artificial compressibility is utilized to solve both the flow and adjoint systems. An algebraic turbulence model is used to compute the eddy viscosity. The method is validated using several inverse wing design test cases. In each case, the program must modify the shape of the initial wing such that its pressure distribution matches that of the target wing. Results are shown for the inversion of both finite thickness wings as well as zero thickness wings which can be considered a model of yacht sails.
Teram, Eli; Schachter, Candice L; Stalker, Carol A
2005-10-01
Grounded theory and participatory action research methods are distinct approaches to qualitative inquiry. Although grounded theory has been conceptualized in constructivist terms, it has elements of positivist thinking with an image of neutral search for objective truth through rigorous data collection and analysis. Participatory action research is based on a critique of this image and calls for more inclusive research processes. It questions the possibility of objective social sciences and aspires to engage people actively in all stages of generating knowledge. The authors applied both approaches in a project designed to explore the experiences of female survivors of childhood sexual abuse with physical therapy and subsequently develop a handbook on sensitive practice for clinicians that takes into consideration the needs and perspectives of these clients. Building on this experience, they argue that the integration of grounded theory and participatory action research can empower clients to inform professional practice.
Theory-based interventions for contraception.
Lopez, Laureen M; Tolley, Elizabeth E; Grimes, David A; Chen-Mok, Mario
2009-01-21
The explicit use of theory in research helps expand the knowledge base. Theories and models have been used extensively in HIV-prevention research and in interventions for preventing sexually transmitted infections (STIs). The health behavior field uses many theories or models of change. However, educational interventions addressing contraception often have no stated theoretical base. Review randomized controlled trials that tested a theoretical approach to inform contraceptive choice; encourage contraceptive use; or promote adherence to, or continuation of, a contraceptive regimen. We searched computerized databases for trials that tested a theory-based intervention for improving contraceptive use (MEDLINE, POPLINE, CENTRAL, PsycINFO, EMBASE, ClinicalTrials.gov, and ICTRP). We also wrote to researchers to find other trials. Trials tested a theory-based intervention for improving contraceptive use. We excluded trials focused on high-risk groups. Interventions addressed the use of one or more contraceptive methods. The reports provided evidence that the intervention was based on a specific theory or model. The primary outcomes were pregnancy, contraceptive choice, initiating or changing contraceptive use, contraceptive regimen adherence, and contraception continuation. The primary author evaluated abstracts for eligibility. Two authors extracted data from included studies. We calculated the odds ratio for dichotomous outcomes and the mean difference for continuous data. No meta-analysis was conducted due to intervention differences. Of 26 trials, 12 interventions addressed contraception (other than condoms), while 14 focused on condom use for preventing HIV or STIs. In 2 of 10 trials with pregnancy or birth data, a theory-based group showed better results. Four of nine trials with contraceptive use (other than condoms) showed better outcomes in an experimental group. For condom use, a theory-based group had favorable results in 14 of 20 trials, but the number was halved in a subgroup analysis. Social Cognitive Theory was the main theoretical basis for 12 trials, and 10 showed positive results. Of the other 14 trials, favorable results were shown for other social cognition models (N=2), motivational interviewing (N=5), and the AIDS Risk Reduction Model (N=2). No major patterns were detected by type of theory, intervention, or target population. Family planning researchers and practitioners could apply the relevant theories and effective interventions from HIV and STI prevention. More thorough use of single theories would help inform the field about what works. Better reporting is needed on research design and intervention implementation.
The application of foraging theory to the information searching behaviour of general practitioners.
Dwairy, Mai; Dowell, Anthony C; Stahl, Jean-Claude
2011-08-23
General Practitioners (GPs) employ strategies to identify and retrieve medical evidence for clinical decision making which take workload and time constraints into account. Optimal Foraging Theory (OFT), initially developed to study animal foraging for food, is used to explore the information searching behaviour of General Practitioners; this study is the first to apply foraging theory in this context. The study objectives were: 1. To identify the sequence and steps deployed in identifying and retrieving evidence for clinical decision making. 2. To utilise Optimal Foraging Theory to assess the effectiveness and efficiency of General Practitioner information searching. GPs from the Wellington region of New Zealand were asked to document in a pre-formatted logbook the steps and outcomes of an information search linked to their clinical decision making, and to fill in a questionnaire about their personal, practice and information-searching backgrounds. A total of 115/155 eligible GPs returned a background questionnaire, and 71 completed their information search logbook. GPs spent an average of 17.7 minutes addressing their search for clinical information. Their preferred information sources were discussions with colleagues (38% of sources) and books (22%). These were the two most profitable information foraging sources (15.9 min and 9.5 min search time per answer, compared to 34.3 minutes in databases). GPs nearly always accessed another source when unsuccessful (95% after the 1st source), and frequently when successful (43% after the 2nd source). Use of multiple sources accounted for 41% of searches, and increased search success from 70% to 89%. By consulting, in foraging terms, the most 'profitable' sources of information (colleagues, books), rapidly switching sources when unsuccessful, and frequently double checking, GPs achieve an efficient trade-off between maximizing search success and information reliability, and minimizing searching time. As predicted by foraging theory, GPs trade time-consuming evidence-based (electronic) information sources for sources with a higher information reward per unit of time searched. Evidence-based practice must accommodate these 'real world' foraging pressures, and Internet resources should evolve to deliver information as effectively as traditional methods of information gathering.
Network Learning for Educational Change. Professional Learning
ERIC Educational Resources Information Center
Veugelers, Wiel, Ed.; O'Hair, Mary John, Ed.
2005-01-01
School-university networks are becoming an important method to enhance educational renewal and student achievement. Networks go beyond tensions of top-down versus bottom-up, school development and professional development of individuals, theory and practice, and formal and informal organizational structures. The theoretical base of networking…
A decision method based on uncertainty reasoning of linguistic truth-valued concept lattice
NASA Astrophysics Data System (ADS)
Yang, Li; Xu, Yang
2010-04-01
Decision making with linguistic information is currently a research hotspot. This paper begins by establishing the theoretical basis for linguistic information processing, constructs the linguistic truth-valued concept lattice for a decision information system, and further utilises uncertainty reasoning to make the decision. That is, we first utilise the linguistic truth-valued lattice implication algebra to unify the different kinds of linguistic expressions; second, we construct the linguistic truth-valued concept lattice and the decision concept lattice according to the concrete decision information system; and third, we establish internal and external uncertainty reasoning methods and discuss their rationality. We apply these uncertainty reasoning methods to decision making and present some generation methods for decision rules. In the end, we give an application of this decision method through an example.
Entropy evolution of moving mirrors and the information loss problem
NASA Astrophysics Data System (ADS)
Chen, Pisin; Yeom, Dong-han
2017-07-01
We investigate the entanglement entropy and the information flow of two-dimensional moving mirrors. Here we point out that various mirror trajectories can help to mimic different candidate resolutions to the information loss paradox following the semiclassical quantum field theory: (i) a suddenly stopping mirror corresponds to the assertion that all information is attached to the last burst, (ii) a slowly stopping mirror corresponds to the assertion that thermal Hawking radiation carries information, and (iii) a long propagating mirror corresponds to the remnant scenario. Based on this analogy, we find that the last burst of a black hole cannot contain enough information, while slowly emitting radiation can restore unitarity. For all cases, there is an apparent inconsistency between the picture based on quantum entanglements and that based on the semiclassical quantum field theory. Based on the quantum entanglement theory, a stopping mirror will generate a firewall-like violent emission which is in conflict with notions based on the semiclassical quantum field theory.
IMMAN: free software for information theory-based chemometric analysis.
Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo
2015-05-01
The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of the Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, and comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between the IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA supervised algorithms. A graphical representation of Shannon's distribution is provided for MD-calculating software.
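Two of the supervised parameters, information gain and symmetrical uncertainty, under the same equal-interval discretization IMMAN uses, can be sketched as follows (the bin count and data are illustrative).

import numpy as np

def entropy(labels):
    _, c = np.unique(labels, return_counts=True)
    p = c / c.sum()
    return float(-(p * np.log2(p)).sum())

def info_gain(feature, y, n_bins=5):
    # Equal-interval discretization, then IG = H(y) - H(y | binned feature)
    edges = np.linspace(feature.min(), feature.max(), n_bins + 1)[1:-1]
    f = np.digitize(feature, edges)
    h_cond = sum((f == b).mean() * entropy(y[f == b]) for b in np.unique(f))
    return entropy(y) - h_cond

def symmetrical_uncertainty(feature, y, n_bins=5):
    edges = np.linspace(feature.min(), feature.max(), n_bins + 1)[1:-1]
    f = np.digitize(feature, edges)
    return 2 * info_gain(feature, y, n_bins) / (entropy(f) + entropy(y))

rng = np.random.default_rng(4)
y = rng.integers(0, 2, 300)                  # class labels
informative = y + rng.normal(0, 0.3, 300)    # a descriptor tracking the class
noise = rng.normal(size=300)                 # an uninformative descriptor
for f in (informative, noise):
    print(round(info_gain(f, y), 3), round(symmetrical_uncertainty(f, y), 3))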
Multiclassifier information fusion methods for microarray pattern recognition
NASA Astrophysics Data System (ADS)
Braun, Jerome J.; Glina, Yan; Judson, Nicholas; Herzig-Marx, Rachel
2004-04-01
This paper addresses automatic recognition of microarray patterns, a capability that could have major significance for medical diagnostics, enabling the development of diagnostic tools for automatic discrimination of specific diseases. The paper presents multiclassifier information fusion methods for microarray pattern recognition. An input space partitioning approach is investigated, based on fitness measures that constitute an a-priori gauging of classification efficacy for each subspace. Methods for the generation of fitness measures, the generation of input subspaces, and their use in the multiclassifier fusion architecture are presented. In particular, a two-level quantification of fitness is described that accounts for the quality of each subspace as well as the quality of individual neighborhoods within the subspace. Individual-subspace classifiers are Support Vector Machine based. The decision fusion stage fuses the information from multiple SVMs along with the multi-level fitness information. Final decision fusion stage techniques, including weighted fusion as well as Dempster-Shafer theory based fusion, are investigated. It should be noted that while the above methods are discussed in the context of microarray pattern recognition, they are applicable to a broader range of discrimination problems, in particular to problems involving a large number of information sources irreducible to a low-dimensional feature space.
Kinematic Determination of an Unmodeled Serial Manipulator by Means of an IMU
NASA Astrophysics Data System (ADS)
Ciarleglio, Constance A.
Kinematic determination for an unmodeled manipulator is usually done through a-priori knowledge of the manipulator's physical characteristics or external sensor information. The mathematics of kinematic estimation, often based on the Denavit-Hartenberg convention, is complex and computationally demanding, in addition to being unique to the manipulator for which the method is developed. Analytical methods that can compute kinematics on-the-fly have the potential to be highly beneficial in dynamic environments where different configurations and variable manipulator types are often required. This thesis derives a new screw-theory-based method of kinematic determination, using a single inertial measurement unit (IMU), for use with any serial, revolute manipulator. The method allows the expansion of reconfigurable manipulator design and simplifies the kinematic process for existing manipulators. A simulation is presented in which the theory of the method is verified and its error characterized. The method is then implemented on an existing manipulator as a verification of functionality.
Petri net-based method for the analysis of the dynamics of signal propagation in signaling pathways.
Hardy, Simon; Robillard, Pierre N
2008-01-15
Cellular signaling networks are dynamic systems that propagate and process information and, ultimately, cause phenotypical responses. Understanding the circuitry of the information flow in cells is one of the keys to understanding complex cellular processes. The development of computational quantitative models is a promising avenue for attaining this goal. Not only does the analysis of simulation data based on the concentration variations of biological compounds yield information about systemic state changes, but it is also very helpful for obtaining information about the dynamics of signal propagation. This article introduces a new method for analyzing the dynamics of signal propagation in signaling pathways using Petri net theory. The method is demonstrated with the Ca(2+)/calmodulin-dependent protein kinase II (CaMKII) regulation network. The results constitute temporal information about signal propagation in the network, a simplified graphical representation of the network and of the signal propagation dynamics, and a characterization of some signaling routes as regulation motifs.
NASA Astrophysics Data System (ADS)
Keum, Jongho; Coulibaly, Paulin
2017-07-01
Adequate and accurate hydrologic information from optimal hydrometric networks is an essential part of effective water resources management. Although the key hydrologic processes in the water cycle are interconnected, hydrometric networks (e.g., streamflow, precipitation, groundwater level) have routinely been designed individually. A decision support framework is proposed for the integrated design of multivariable hydrometric networks. The proposed method is applied to design optimal precipitation and streamflow networks simultaneously. The epsilon-dominance hierarchical Bayesian optimization algorithm was combined with the Shannon entropy of information theory to design and evaluate the hydrometric networks. Specifically, the joint entropy of the combined networks was maximized to provide the most information, and the total correlation was minimized to reduce redundant information. To further optimize the efficiency between the networks, they were designed by maximizing the conditional entropy of the streamflow network given the information of the precipitation network. Compared to the traditional individual-variable design approach, the integrated multivariable design method was able to determine more efficient optimal networks by avoiding redundant stations. Additionally, four quantization cases were compared to evaluate their effects on the entropy calculations and the determination of the optimal networks. The evaluation results indicate that quantization methods should be selected with careful consideration for each design problem, since the station rankings and the optimal networks can change accordingly.
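The two entropy objectives can be illustrated with a short sketch on already-quantized station records (synthetic data, not the study's algorithm): joint entropy is the quantity to maximize, total correlation the redundancy to minimize.

    import numpy as np

    def joint_entropy(records):
        # records: array of shape (n_samples, n_stations), already quantized.
        _, counts = np.unique(records, axis=0, return_counts=True)
        p = counts / counts.sum()
        return -(p * np.log2(p)).sum()

    def total_correlation(records):
        # Sum of marginal entropies minus joint entropy; measures redundancy.
        marginal = sum(joint_entropy(records[:, [j]])
                       for j in range(records.shape[1]))
        return marginal - joint_entropy(records)

    rng = np.random.default_rng(1)
    a = rng.integers(0, 4, size=500)               # quantized series, station A
    b = (a + rng.integers(0, 2, size=500)) % 4     # partly redundant station B
    net = np.column_stack([a, b])
    print(joint_entropy(net), total_correlation(net))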
Low-complexity video encoding method for wireless image transmission in capsule endoscope.
Takizawa, Kenichi; Hamaguchi, Kiyoshi
2010-01-01
This paper presents a low-complexity video encoding method applicable to wireless image transmission in capsule endoscopes. The encoding method is based on Wyner-Ziv theory, in which information available at the receiver is treated as side information, so the encoder does not need it. Complex processes in video encoding, such as motion-vector estimation, are therefore moved to the receiver side, which has a larger-capacity battery. As a result, the encoding process reduces to decimating the original data coded through channel coding. We provide a performance evaluation of a low-density parity-check (LDPC) coding method in the AWGN channel.
An information-based network approach for protein classification
Wan, Xiaogeng; Zhao, Xin; Yau, Stephen S. T.
2017-01-01
Protein classification is one of the critical problems in bioinformatics. Early studies used geometric distances and phylogenetic trees to classify proteins, representing the classification as binary trees. In this paper, we propose a new protein classification method, whereby theories of information and networks are used to classify the multivariate relationships of proteins. In this study, the protein universe is modeled as an undirected network, where proteins are classified according to their connections. Our method is unsupervised, multivariate, and alignment-free. It can be applied to the classification of both protein sequences and structures. Nine examples are used to demonstrate the efficiency of our new method. PMID:28350835
Shannon information entropy in heavy-ion collisions
NASA Astrophysics Data System (ADS)
Ma, Chun-Wang; Ma, Yu-Gang
2018-03-01
The general idea of information entropy provided by C.E. Shannon "hangs over everything we do" and can be applied to a great variety of problems once the connection between a distribution and the quantities of interest is found. The Shannon information entropy essentially quantifies the information of a quantity with its specific distribution, and information-entropy-based methods have been developed extensively in many scientific areas, including physics. The dynamical properties of the heavy-ion collision (HIC) process make the nuclear matter and its evolution difficult and complex to study; here, Shannon information entropy theory can provide new methods and observables for understanding the physical phenomena both theoretically and experimentally. To better understand the processes of HICs, the main characteristics of typical models, including quantum molecular dynamics models, thermodynamics models, and statistical models, are briefly introduced. Typical applications of Shannon information theory in HICs are collected, covering the chaotic behavior in the branching process of hadron collisions, the liquid-gas phase transition in HICs, and the isobaric difference scaling phenomenon for intermediate-mass fragments produced in HICs of neutron-rich systems. Even though the present applications in heavy-ion collision physics are still relatively simple, they shed light on the key questions being pursued. It is suggested to further develop information entropy methods in nuclear reaction models, as well as to develop new analysis methods to study the properties of nuclear matter in HICs, especially the evolution of the dynamical system.
A Two-Stage Composition Method for Danger-Aware Services Based on Context Similarity
NASA Astrophysics Data System (ADS)
Wang, Junbo; Cheng, Zixue; Jing, Lei; Ota, Kaoru; Kansen, Mizuo
Context-aware systems detect a user's physical and social contexts based on sensor networks and provide services that adapt to the user accordingly. Representing, detecting, and managing contexts are important issues in context-aware systems. Composition of contexts is a useful method for these tasks, since it can detect a context by automatically composing small pieces of information to discover a service. Danger-aware services are a kind of context-aware service that needs descriptions of the relations between a user and his/her surrounding objects, and between users. However, existing composition methods, when applied to danger-aware services, show the following shortcomings: (1) they do not provide an explicit method for representing the composition of multiple users' contexts, and (2) they lack a flexible reasoning mechanism based on context similarity, so they can only provide services that exactly follow the predefined context reasoning rules. Therefore, in this paper, we propose a two-stage composition method based on context similarity to solve the above problems. The first stage is composition of the useful information to represent the context of a single user. The second stage is composition of multiple users' contexts to provide services by considering the relations between users. Finally, the danger degree of the detected context is computed by using the context similarity between the detected context and the predefined context. Context is dynamically represented based on two-stage composition rules and a Situation-theory-based Ontology, which combines the advantages of Ontology and Situation theory. We implement the system in an indoor ubiquitous environment and evaluate it through two experiments with the support of subjects. The experimental results show that the method is effective and that the accuracy of danger detection is acceptable for a danger-aware system.
Theory development for situational awareness in multi-casualty incidents.
Busby, Steven; Witucki-Brown, Janet
2011-09-01
Nurses and other field-level providers will increasingly be called on to respond to both natural and manmade situations that involve multiple casualties. Situational Awareness (SA) is necessary for managing these complicated incidents. The purpose of the study was to create new knowledge by discovering the process of SA in multi-casualty incidents (MCI) and to develop substantive theory with regard to field-level SA for use by emergency response nurses and other providers. A qualitative, grounded theory approach was used to develop the first substantive theory of SA for MCI. The sample included 15 emergency response providers from the Southeastern United States. One pilot interview was conducted to trial and refine the semi-structured interview questions. Following Institutional Review Board approval, data collection and analysis occurred from September 2008 through January 2009. The grounded theory methods of Corbin and Strauss (2008) and Charmaz (2006) informed this study. Transcribed participant interviews constituted the bulk of the data, with additional data provided by field notes and extensive memos. Multiple levels of coding, theoretical sampling, and theoretical sensitivity were used to develop and relate concepts, resulting in emerging theory. Multiple methods were used to maintain the rigor of the study. The process of SA in MCI involves emergency responders establishing and maintaining control of dynamic, contextually based situations. Against the backdrop of experience and other preparatory interval actions, responders handle various types of information and manage resources, roles, relationships and human emotion. The goal is to provide an environment of relative safety in which patient care is provided. SA in MCI is an ongoing and iterative process, with each piece of information informing new actions. Analysis culminated in the development of the Busby Theory of Situational Awareness in Multi-casualty Incidents. SA in MCI is a growing need at local, national and international levels. The newly developed theory provides a useful model for understanding SA in the context of MCI, thereby improving practice and providing a tool for education. The theory also provides a catalyst for further research to refine and test the theory and to study larger-scale incidents. Copyright © 2011 Emergency Nurses Association. Published by Mosby, Inc. All rights reserved.
ERIC Educational Resources Information Center
Pettersson, Rune
2014-01-01
Information design has practical and theoretical components. As an academic discipline, we may view information design as a combined discipline, a practical theory, or a theoretical practice. So far information design has incorporated facts, influences, methods, practices, principles, processes, strategies, and tools from a large number of…
Geoid Recovery Using Geophysical Inverse Theory Applied to Satellite to Satellite Tracking Data
NASA Technical Reports Server (NTRS)
Gaposchkin, E. M.
2000-01-01
This report describes a new method for determination of the geopotential, or the equivalent geoid. It is based on Satellite-to-Satellite Tracking (SST) of two co-orbiting low Earth satellites separated by a few hundred kilometers. The analysis is aimed at the GRACE mission, though it is generally applicable to any SST data. It is proposed that SST be viewed as a mapping mission; that is, the result will be maps of the geoid or gravity, as contrasted with determination of spherical harmonic or Fourier coefficients. A method has been developed, based on Geophysical Inverse Theory (GIT), that can provide maps at a prescribed (desired) resolution, and the corresponding error map, from the SST data. This computation can be done area by area, avoiding simultaneous recovery of all the geopotential information. The necessary elements of potential theory, celestial mechanics, and Geophysical Inverse Theory are described, a computation architecture is described, and the results of several simulations are presented. Centimeter-accuracy geoids with 50 to 100 km resolution can be recovered with a 30 to 60 day mission.
A study on locating the sonic source of sinusoidal magneto-acoustic signals using a vector method.
Zhang, Shunqi; Zhou, Xiaoqing; Ma, Ren; Yin, Tao; Liu, Zhipeng
2015-01-01
Methods based on the magneto-acoustic effect are of great significance in studying the electrical imaging properties of biological tissues and currents. The commonly used continuous-wave method can only detect the current amplitude, not the sound source position. Although the pulse mode adopted in magneto-acoustic imaging can locate the sonic source, low measurement accuracy and low SNR have limited its application. In this study, a vector method was used to solve and analyze the magneto-acoustic signal based on the continuous sine-wave mode. This study includes theoretical modeling of the vector method, simulations of the line model, and experiments with wire samples to analyze magneto-acoustic (MA) signal characteristics. The results showed that the amplitude and phase of the MA signal contain the location information of the sonic source, and that they obey the vector theory in the complex plane. This study sets a foundation for a new technique to locate sonic sources for biomedical imaging of tissue conductivity. It also aids in studying biological current detection and reconstruction based on the magneto-acoustic effect.
Entropy in sound and vibration: towards a new paradigm
2017-01-01
This paper describes a discussion of the method and status of a statistical theory of sound and vibration, called statistical energy analysis (SEA). SEA is a simple theory of sound and vibration in elastic structures that applies when the vibrational energy is diffusely distributed. We show that SEA is a thermodynamical theory of sound and vibration, based on a law of exchange of energy analogous to the Clausius principle. We further investigate the notion of entropy in this context and discuss its meaning. We show that entropy is a measure of the information lost in the passage from the classical theory of sound and vibration to SEA, its thermodynamical counterpart. PMID:28265190
The Philosophy of Information as an Underlying and Unifying Theory of Information Science
ERIC Educational Resources Information Center
Tomic, Taeda
2010-01-01
Introduction: Philosophical analyses of theoretical principles underlying these sub-domains reveal the philosophy of information as an underlying meta-theory of information science. Method: Conceptual research on the knowledge sub-domains in information science and philosophy and analysis of their mutual connection. Analysis: Similarities between…
On long-only information-based portfolio diversification framework
NASA Astrophysics Data System (ADS)
Santos, Raphael A.; Takada, Hellinton H.
2014-12-01
Using concepts from information theory, it is possible to improve the traditional frameworks for long-only asset allocation. In modern portfolio theory, the investor has two basic procedures: the choice of a portfolio that maximizes its risk-adjusted excess return, or a mixed allocation between the maximum Sharpe ratio portfolio and the risk-free asset. In the literature, the first procedure has already been addressed using information theory. One contribution of this paper is the consideration of the second procedure in the information theory context. The performance of these approaches was compared with three traditional asset allocation methodologies: Markowitz's mean-variance, the resampled mean-variance, and the equally weighted portfolio. Using simulated and real data, the information theory-based methodologies were verified to be more robust when dealing with estimation errors.
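For reference, the second procedure in its classical (non-information-theoretic) form can be sketched as follows; the tangency-portfolio formula and the numbers are textbook illustrations, and the long-only constraint and robust estimation addressed in the paper are ignored here.

    import numpy as np

    def tangency_weights(mu, cov, rf):
        # Maximum Sharpe ratio (tangency) portfolio:
        # w is proportional to inv(cov) @ (mu - rf), normalized to sum to one.
        w = np.linalg.solve(cov, mu - rf)
        return w / w.sum()

    mu = np.array([0.08, 0.05])                     # expected returns
    cov = np.array([[0.040, 0.006],
                    [0.006, 0.010]])                # return covariance
    rf = 0.02                                       # risk-free rate
    w = tangency_weights(mu, cov, rf)
    alpha = 0.7                                     # fraction in risky assets
    print(w, alpha * w, 1 - alpha)                  # mix with risk-free asset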
BP network identification technology of infrared polarization based on fuzzy c-means clustering
NASA Astrophysics Data System (ADS)
Zeng, Haifang; Gu, Guohua; He, Weiji; Chen, Qian; Yang, Wei
2011-08-01
Infrared detection systems are frequently employed in surveillance and reconnaissance missions to detect particular targets of interest in both civilian and military communities. By incorporating the polarization of light as supplementary information, target discrimination performance can be enhanced. This paper therefore proposes an infrared target identification method based on fuzzy theory and a neural network, using the polarization properties of targets. The paper utilizes polarization degree and light intensity to advance the unsupervised KFCM (kernel fuzzy C-means) clustering method, and establishes a database of the polarization properties of different materials. In the built network, the system can output the probability distribution over material types for any input polarization degree, such as 10°, 15°, 20°, 25°, or 30°. KFCM, which has stronger robustness and accuracy than FCM, introduces the kernel idea and gives noise points and invalid values different but intuitively reasonable weights. Because of differences in the characterization of material properties, there will be some conflicts in the classification results, and D-S evidence theory is used to combine the polarization and intensity information. Related results show that the KFCM clustering precision and operation rate are higher than those of the FCM clustering method. The artificial neural network method realizes material identification, reasonably solving the problems of the complexity of infrared polarization environmental information and the inadequacy of background knowledge and inference rules. This method of polarization identification is fast in speed, good in self-adaptation, and high in resolution.
Templeman, Kate; Robinson, Anske; McKenna, Lisa
2016-09-01
The aim of this study was to identify Australian medical students' complementary medicine information needs. Thirty medical students from 10 medical education faculties across Australian universities were recruited. Data were generated using in-depth semi-structured interviews, and a constructivist grounded theory method was used to analyze and construct the data. Students sought complementary medicine information from a range of inadequate sources, such as pharmacological texts, Internet searches, peer-reviewed medical journals, and drug databases. The students identified that many complementary medicine resources may not be regarded as objective, reliable, differentiated, or comprehensive, leaving much for medical education to address. Most students sought succinct, easily accessible, evidence-based information to inform safe and appropriate clinical decisions about complementary medicines. A number of preferred resources were identified that can be recommended and actively promoted to medical students. Therefore, specific, evidence-based complementary medicine databases and secondary resources should be subscribed to and recommended by medical schools, to assist students in meeting professional responsibilities regarding complementary medicines. These findings may help inform the development of appropriate medical information resources regarding complementary medicines. © 2016 John Wiley & Sons Australia, Ltd.
Mobile Phone-Based Behavioural Interventions for Health: A Systematic Review
ERIC Educational Resources Information Center
Buhi, Eric R.; Trudnak, Tara E.; Martinasek, Mary P.; Oberne, Alison B.; Fuhrmann, Hollie J.; McDermott, Robert J.
2013-01-01
Objective: To perform a systematic review of the literature concerning behavioural mobile health (mHealth) and summarize points related to health topic, use of theory, audience, purpose, design, intervention components, and principal results that can inform future health education applications. Design: A systematic review of the literature. Method:…
Educational Applications of the Dialectic: Theory and Research.
ERIC Educational Resources Information Center
Slife, Brent D.
The field of education has largely ignored the concept of the dialectic, except in the Socratic teaching method, and even there bipolar meaning or reasoning has not been recognized. Mainstream educational psychology bases its assumptions about human reasoning and learning on current demonstrative concepts of information processing and levels of…
Cognitive load reducing in destination decision system
NASA Astrophysics Data System (ADS)
Wu, Chunhua; Wang, Cong; Jiang, Qien; Wang, Jian; Chen, Hong
2007-12-01
With limited cognitive resources, the quantity of information that can be processed by a person is limited. If this limit is exceeded, the whole cognitive process is affected, and so is the final decision. Research on effective ways to reduce cognitive load has been launched from two directions: cutting down the number of alternatives, and directing the user to allocate his limited attention resources based on selective visual attention theory. Decision-making is such a complex process that people usually have difficulty expressing their requirements completely. An effective method to elicit the user's hidden requirements is put forward in this paper. With more requirements captured, the destination decision system can filter out more inappropriate alternatives. Different pieces of information have different utilities; if the information with high utility gets attention easily, the decision can be made more easily. After analyzing the current selective visual attention theory, a new presentation style based on the user's visual attention is also put forward in this paper. This model arranges information presentation according to the movement of the sightline. Through visual attention, users can focus their limited attention resources on the important information. Eliciting hidden requirements and presenting information based on selective visual attention are effective ways to reduce cognitive load.
Hazavehei, Seyed Mohammad Mehdi; Afshari, Maryam
2016-08-01
The consumption of fruit and vegetables in old age is particularly important, as appropriate consumption leads to a reduction in the risk of chronic diseases. To increase the consumption of fruit and vegetables and modify consumption patterns in the elderly, training programs and appropriate interventions can be designed and implemented. This study was done to assess and compare nutritional training interventions based on theories and health education models with those not based on them, for the consumption of fruits and vegetables in the elderly. An electronic search using keywords of Country Review Information Bank (Magiran), Scientific Information Database, PubMed, ScienceDirect, Science, and BioMed Central from the beginning of March 2014 to the end of April 2015 was performed. Ten interventional studies were assessed in this systematic study. The interventions were divided into two groups: in five studies, theories and health education models were the basis of the training intervention, and in the other five studies the interventions were carried out without the use of theories and health education models. Of the ten interventional studies, three were performed as before-and-after studies and seven with intervention and control groups. The results showed that education based on theories and health education models has a greater impact on the consumption of fruit and vegetables in the elderly. The duration and delivery method of interventions, environmental factors, and educational programs using appropriate models and theories are important for the effectiveness of interventions to increase the consumption of fruit and vegetables in the elderly.
McCoy, Lisa K; Hermos, John A; Bokhour, Barbara G; Frayne, Susan M
2004-09-01
Faith-based substance abuse rehabilitation programs provide residential treatment for many substance abusers. To determine the key governing concepts of such programs, we conducted semi-structured interviews with a sample of eleven clinical and administrative staff referred to us by program directors at six Evangelical Christian, faith-based, residential rehabilitation programs representing two large, nationwide networks. Qualitative analysis using grounded theory methods examined how spirituality is incorporated into treatment and elicited key theories of addiction and recovery. Although the programs contain comprehensive secular components, their core activities are strongly rooted in a Christian belief system that informs their understanding of addiction and recovery and drives the treatment format. These governing conceptions, namely that addiction stems from attempts to fill a spiritual void through substance use and that recovery comes through salvation and a long-term relationship with God, provide an explicit, theory-driven model upon which they base their core treatment activities. Knowledge of these core concepts and practices should be helpful to clinicians in considering referrals to faith-based recovery programs.
An evidential link prediction method and link predictability based on Shannon entropy
NASA Astrophysics Data System (ADS)
Yin, Likang; Zheng, Haoyang; Bian, Tian; Deng, Yong
2017-09-01
Predicting missing links is of both theoretical value and practical interest in network science. In this paper, we empirically investigate a new similarity-based link prediction method and compare nine well-known local similarity measures on nine real networks. Most previous studies focus on accuracy; however, it is crucial to consider link predictability as an intrinsic property of the network itself. Hence, this paper proposes a new link prediction approach called the evidential measure (EM), based on Dempster-Shafer theory. Moreover, it proposes a new method to measure link predictability via local information and Shannon entropy.
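As a crude illustration of the two ingredients, local similarity and entropy (a toy stand-in, not the paper's Dempster-Shafer-based evidential measure), the sketch below scores non-adjacent pairs by common neighbors and takes the Shannon entropy of the normalized scores as a predictability proxy: the flatter the score distribution, the harder the missing links are to rank.

    import math

    adj = {1: {2, 3}, 2: {1, 3, 4}, 3: {1, 2, 5}, 4: {2}, 5: {3}}

    def common_neighbors(u, v):
        # Local similarity: size of the shared neighborhood.
        return len(adj[u] & adj[v])

    pairs = [(u, v) for u in adj for v in adj if u < v and v not in adj[u]]
    scores = [common_neighbors(u, v) for u, v in pairs]
    total = sum(scores)
    probs = [s / total for s in scores if s > 0]
    print(-sum(p * math.log2(p) for p in probs))   # entropy of the score profile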
NASA Astrophysics Data System (ADS)
Sun, Y.; Li, Y. P.; Huang, G. H.
2012-06-01
In this study, a queuing-theory-based interval-fuzzy robust two-stage programming (QB-IRTP) model is developed by introducing queuing theory into an interval-fuzzy robust two-stage (IRTP) optimization framework. The developed QB-IRTP model can not only address highly uncertain information for the lower and upper bounds of interval parameters but also be used for analysing a variety of policy scenarios that are associated with different levels of economic penalties when the promised targets are violated. Moreover, it can reflect uncertainties in queuing theory problems. The developed method has been applied to a case of long-term municipal solid waste (MSW) management planning. Interval solutions associated with different waste-generation rates, different waiting costs and different arrival rates have been obtained. They can be used for generating decision alternatives and thus help managers to identify desired MSW management policies under various economic objectives and system reliability constraints.
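For intuition about the queuing component, a crisp M/M/1 example is sketched below with invented numbers; the actual QB-IRTP model treats such parameters as interval and fuzzy quantities rather than the fixed values assumed here.

    def mm1_metrics(arrival_rate, service_rate):
        # Steady-state M/M/1 queue: utilization, mean number in system,
        # and mean time in system (via Little's law).
        rho = arrival_rate / service_rate
        assert rho < 1, "queue is unstable"
        L = rho / (1 - rho)
        W = L / arrival_rate
        return rho, L, W

    lam, mu, waiting_cost = 8.0, 10.0, 5.0   # trucks/h, served/h, $ per truck-hour
    rho, L, W = mm1_metrics(lam, mu)
    print(rho, L, W, waiting_cost * lam * W)  # expected waiting cost per hour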
ERIC Educational Resources Information Center
Mursu, Anja; Luukkonen, Irmeli; Toivanen, Marika; Korpela, Mikko
2007-01-01
Introduction: The purpose of information systems is to facilitate work activities: here we consider how Activity Theory can be applied in information systems development. Method. The requirements for an analytical model for emancipatory, work-oriented information systems research and practice are specified. Previous research work in Activity…
The research on user behavior evaluation method for network state
NASA Astrophysics Data System (ADS)
Zhang, Chengyuan; Xu, Haishui
2017-08-01
Based on the correlation between user behavior and network running state, this paper proposes a method for evaluating user behavior in terms of network state. Drawing on analysis and evaluation methods from other fields of study, we introduce the theory and tools of data mining. Using the network status information provided by the trusted network view, the user behavior data and the network state data are analysed. Finally, we construct user behavior evaluation indices and weights, on the basis of which the degree to which specific behaviors of different users influence changes in the network running state can be accurately quantified, so as to provide a basis for user behavior control decisions.
A Systematic Review of Rural, Theory-based Physical Activity Interventions.
Walsh, Shana M; Meyer, M Renée Umstattd; Gamble, Abigail; Patterson, Megan S; Moore, Justin B
2017-05-01
This systematic review synthesized the scientific literature on theory-based physical activity (PA) interventions in rural populations. PubMed, PsycINFO, and Web of Science databases were searched to identify studies with a rural study sample, PA as a primary outcome, use of a behavioral theory or model, randomized or quasi-experimental research design, and application at the primary and/or secondary level of prevention. Thirty-one studies met our inclusion criteria. The Social Cognitive Theory (N = 14) and Transtheoretical Model (N = 10) were the most frequently identified theories; however, most intervention studies were informed by theory but lacked higher-level theoretical application and testing. Interventions largely took place in schools (N = 10) and with female-only samples (N = 8). Findings demonstrated that theory-based PA interventions are mostly successful at increasing PA in rural populations but require improvement. Future studies should incorporate higher levels of theoretical application, and should explore adapting or developing rural-specific theories. Study designs should employ more rigorous research methods to decrease bias and increase validity of findings. Follow-up assessments to determine behavioral maintenance and/or intervention sustainability are warranted. Finally, funding agencies and journals are encouraged to adopt rural-urban commuting area codes as the standard for defining rural.
Explaining Michigan: Developing an Ex Post Theory of a Quality Improvement Program
Dixon-Woods, Mary; Bosk, Charles L; Aveling, Emma Louise; Goeschel, Christine A; Pronovost, Peter J
2011-01-01
Context: Understanding how and why programs work—not simply whether they work—is crucial. Good theory is indispensable to advancing the science of improvement. We argue for the usefulness of ex post theorization of programs. Methods: We propose an approach, located within the broad family of theory-oriented methods, for developing ex post theories of interventional programs. We use this approach to develop an ex post theory of the Michigan Intensive Care Unit (ICU) project, which attracted international attention by successfully reducing rates of central venous catheter bloodstream infections (CVC-BSIs). The procedure used to develop the ex post theory was (1) identify program leaders’ initial theory of change and learning from running the program; (2) enhance this with new information in the form of theoretical contributions from social scientists; (3) synthesize prior and new information to produce an updated theory. Findings: The Michigan project achieved its effects by (1) generating isomorphic pressures for ICUs to join the program and conform to its requirements; (2) creating a densely networked community with strong horizontal links that exerted normative pressures on members; (3) reframing CVC-BSIs as a social problem and addressing it through a professional movement combining “grassroots” features with a vertically integrating program structure; (4) using several interventions that functioned in different ways to shape a culture of commitment to doing better in practice; (5) harnessing data on infection rates as a disciplinary force; and (6) using “hard edges.” Conclusions: Updating program theory in the light of experience from program implementation is essential to improving programs’ generalizability and transferability, although it is not a substitute for concurrent evaluative fieldwork. Future iterations of programs based on the Michigan project, and improvement science more generally, may benefit from the updated theory presented here. PMID:21676020
Clinical overview: a framework for analysis.
Bossen, Claus; Jensen, Lotte G
2013-01-01
In this presentation, we investigate concepts and theories for analysing how healthcare professionals achieve overview of patient cases. By 'overview' we mean the situation in which a healthcare professional, with sufficient certainty and in concrete situations, knows how to proceed based on the available information about a patient. Achieving overview is central to the efficient and safe use of healthcare IT systems, and to realizing the potential improvements in healthcare that motivate investments in such systems. We focus on the theories of decision-making, sensemaking, narratives, ethnomethodology and distributed cognition. Whereas decision-making theory tends to be sequential and normative, we find the concept of 'functional deployment' in sensemaking theory, 'emplotment' in narrative theory, the focus on 'members' methods' in ethnomethodology and the inclusion of 'computational artifacts' in distributed cognition helpful.
ERIC Educational Resources Information Center
Wang, Lin
2013-01-01
Background: Cultural-historical activity theory is an important theory in modern psychology. In recent years, it has drawn more attention from related disciplines including information science. Argument: This paper argues that activity theory and domain analysis which uses the theory as one of its bases could bring about some important…
NASA Astrophysics Data System (ADS)
Xu, Pengcheng; Wang, Dong; Singh, Vijay P.; Wang, Yuankun; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Chen, Yuanfang; Chen, Xi; Liu, Jiufu; Zou, Ying; He, Ruimin
2017-12-01
Hydrometeorological data are needed for obtaining point and areal means, quantifying the spatial variability of hydrometeorological variables, and calibrating and verifying hydrometeorological models. Hydrometeorological networks are utilized to collect such data. Since data collection is expensive, it is essential to design an optimal network based on the minimal number of hydrometeorological stations in order to reduce costs. This study proposes a two-phase copula entropy-based multiobjective optimization approach that includes: (1) copula entropy-based directional information transfer (CDIT) for clustering the potential hydrometeorological gauges into several groups, and (2) a multiobjective method for selecting the optimal combination of gauges for the regionalized groups. Although entropy theory has been employed for network design before, the joint histogram method used for mutual information estimation has several limitations. The copula entropy-based mutual information (MI) estimation method is shown to be more effective for quantifying the uncertainty of redundant information than the joint histogram (JH) method. The effectiveness of this approach is verified by applying it to one type of hydrometeorological gauge network, with the use of three model evaluation measures: the Nash-Sutcliffe Coefficient (NSC), the arithmetic mean of the negative copula entropy (MNCE), and MNCE/NSC. Results indicate that the two-phase copula entropy-based multiobjective technique is capable of evaluating the performance of regional hydrometeorological networks and can enable decision makers to develop strategies for water resources management.
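One common copula-based MI estimator, shown here under a Gaussian-copula assumption (an illustrative sketch, not necessarily the authors' exact estimator), rank-transforms each series and applies the analytic Gaussian formula, avoiding the joint-histogram binning criticized above:

    import numpy as np
    from scipy.stats import norm, rankdata

    def copula_mi(x, y):
        # Rank-transform to uniforms, map to standard normals, and use the
        # analytic MI of a bivariate Gaussian: I = -0.5 * ln(1 - r^2).
        u = rankdata(x) / (len(x) + 1)
        v = rankdata(y) / (len(y) + 1)
        r = np.corrcoef(norm.ppf(u), norm.ppf(v))[0, 1]
        return -0.5 * np.log(1 - r**2)

    rng = np.random.default_rng(2)
    p = rng.normal(size=1000)                   # precipitation gauge (synthetic)
    q = 0.6 * p + 0.8 * rng.normal(size=1000)   # correlated streamflow gauge
    print(copula_mi(p, q))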
NASA Astrophysics Data System (ADS)
Gafurov, O.; Gafurov, D.; Syryamkin, V.
2018-05-01
The paper analyses a field of computer science formed at the intersection of such areas of natural science as artificial intelligence, mathematical statistics, and database theory, which is referred to as "Data Mining" (discovery of knowledge in data). The theory of neural networks is applied along with classical methods of mathematical analysis and numerical simulation. The paper describes the technique protected by a patent of the Russian Federation for the invention "A Method for Determining Location of Production Wells during the Development of Hydrocarbon Fields" [1–3] and implemented using the geoinformation system NeuroInformGeo. There are no analogues in domestic and international practice. The paper gives an example comparing the forecast of oil reservoir quality made by a geophysicist interpreter using standard methods with the forecast made using this technology. The technical result achieved shows an increase in the efficiency, effectiveness, and ecological compatibility of the development of mineral deposits, and the discovery of a new oil deposit.
Evaluating hydrological model performance using information theory-based metrics
USDA-ARS?s Scientific Manuscript database
Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use the information theory-based metrics to see whether they can be used as complementary tool for hydrologic m...
Denehy, Mel; Crawford, Gemma; Leavy, Justine; Nimmo, Lauren; Jancey, Jonine
2016-01-01
Introduction Worldwide, children under the age of 5 years are at particular risk of drowning. Responding to this need requires the development of evidence-informed drowning prevention strategies. Historically, drowning prevention strategies have included denying access, learning survival skills and providing supervision, as well as education and information which includes the use of mass media. Interventions underpinned by behavioural theory and formative evaluation tend to be more effective, yet few practical examples exist in the drowning and/or injury prevention literature. The Health Belief Model and Social Cognitive Theory will be used to explore participants' perspectives regarding proposed mass media messaging. This paper describes a qualitative protocol to undertake formative research to develop theory-based messages for a child drowning prevention campaign. Methods and analysis The primary data source will be focus group interviews with parents and caregivers of children under 5 years of age in metropolitan and regional Western Australia. Qualitative content analysis will be used to analyse the data. Ethics and dissemination This study will contribute to the drowning prevention literature to inform the development of future child drowning prevention mass media campaigns. Findings from the study will be disseminated to practitioners, policymakers and researchers via international conferences, peer and non-peer-reviewed journals and evidence summaries. The study was submitted and approved by the Curtin University Human Research Ethics Committee. PMID:27207621
Towards Information Polycentricity Theory--Investigation of a Hospital Revenue Cycle
ERIC Educational Resources Information Center
Singh, Rajendra
2011-01-01
This research takes steps towards developing a new theory of organizational information management based on the ideas that, first, information creates ordering effects in transactions and, second, that there are multiple centers of authority in organizations. The rationale for developing this theory is the empirical observation that hospitals have…
ERIC Educational Resources Information Center
Boekkooi-Timminga, Ellen
Nine methods for automated test construction are described. All are based on the concepts of information from item response theory. Two general kinds of methods for the construction of parallel tests are presented: (1) sequential test design; and (2) simultaneous test design. Sequential design implies that the tests are constructed one after the…
Statistical Lamb wave localization based on extreme value theory
NASA Astrophysics Data System (ADS)
Harley, Joel B.
2018-04-01
Guided wave localization methods based on delay-and-sum imaging, matched field processing, and other techniques have been designed and researched to create images that locate and describe structural damage. The maximum value of such an image typically represents an estimated damage location. Yet it is often unclear whether this maximum value, or any other value in the image, is a statistically significant indicator of damage. Furthermore, there are currently few, if any, approaches for assessing the statistical significance of guided wave localization images. As a result, we present statistical delay-and-sum and statistical matched field processing localization methods to create statistically significant images of damage. Our framework uses constant false alarm rate statistics and extreme value theory to detect damage with little prior information. We demonstrate our methods with in situ guided wave data from an aluminum plate to detect two 0.75 cm diameter holes. Our results show an expected improvement in statistical significance as the number of sensors increases. With seventeen sensors, both methods successfully detect damage with statistical significance.
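A bare-bones delay-and-sum localizer is sketched below on a synthetic one-way arrival geometry (illustrative only; the paper's methods add matched field processing and the CFAR/extreme-value statistical layer on top of such an image):

    import numpy as np

    def delay_and_sum(signals, fs, sensors, grid, speed):
        # signals: (n_sensors, n_samples) wave envelopes. For each grid point,
        # sum each sensor's sample at the expected arrival time; peaks in the
        # resulting image indicate likely source/damage locations.
        image = np.zeros(len(grid))
        for i, point in enumerate(grid):
            delays = np.linalg.norm(sensors - point, axis=1) / speed
            idx = np.minimum((delays * fs).astype(int), signals.shape[1] - 1)
            image[i] = signals[np.arange(len(sensors)), idx].sum()
        return image

    fs, speed = 1e6, 5000.0                        # 1 MHz sampling, 5 km/s waves
    sensors = np.array([[0.0, 0.0], [0.5, 0.0], [0.0, 0.5]])
    target = np.array([0.3, 0.2])
    signals = np.zeros((3, 2000))
    for k, s in enumerate(sensors):                # synthetic impulsive arrivals
        signals[k, int(np.linalg.norm(s - target) / speed * fs)] = 1.0
    grid = np.array([[x, y] for x in np.linspace(0, 0.5, 26)
                            for y in np.linspace(0, 0.5, 26)])
    image = delay_and_sum(signals, fs, sensors, grid, speed)
    print(grid[image.argmax()])                    # should be near (0.3, 0.2)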
Ink Wash Painting Style Rendering With Physically-based Ink Dispersion Model
NASA Astrophysics Data System (ADS)
Wang, Yifan; Li, Weiran; Zhu, Qing
2018-04-01
This paper presents a real-time rendering method based on the GPU programmable pipeline for rendering 3D scenes in ink wash painting style. The method is divided into three main parts: first, render the ink properties of the 3D model by calculating its vertex curvature; then, cache the ink properties in a paper structure and use an ink dispersion model, defined with reference to the theory of porous media, to simulate the dispersion of ink; finally, convert the ink properties to pixel color information and render it to the screen. This method achieves better visual quality than previous methods.
From bed to bench: bridging from informatics practice to theory: an exploratory analysis.
Haux, R; Lehmann, C U
2014-01-01
In 2009, Applied Clinical Informatics (ACI)--focused on applications in clinical informatics--was launched as a companion journal to Methods of Information in Medicine (MIM). Both journals are official journals of the International Medical Informatics Association. Our objective was to explore which congruencies and interdependencies exist in publications from theory to practice and from practice to theory, and to determine existing gaps. Major topics discussed in ACI and MIM were analyzed. We explored whether the intention of publishing companion journals--to provide an information bridge from informatics theory to informatics practice and vice versa--could be supported by this model. In this manuscript we report on congruencies and interdependences from practice to theory and on major topics in MIM. This was a retrospective, prolective observational study of recent publications in ACI and MIM. All publications of the years 2012 and 2013 were indexed and analyzed. One hundred and ninety-six publications were analyzed (ACI 87, MIM 109). In MIM publications, modelling aspects as well as methodological and evaluation approaches for the analysis of data, information, and knowledge in biomedicine and health care were frequently raised, often from an interdisciplinary point of view. Important themes were ambient-assisted living, anatomic spatial relations, biomedical informatics as a scientific discipline, boosting, coding, computerized physician order entry, data analysis, grid and cloud computing, health care systems and services, health-enabling technologies, health information search, health information systems, imaging, knowledge-based decision support, patient records, signal analysis, and web science. Congruencies between the journals could be found in themes, but with a different focus on content. Interdependencies from practice to theory found in these publications were limited. Bridging from informatics theory to practice and vice versa remains a major component of successful research and practice, as well as a major challenge.
A Classification Scheme for Adult Education. Education Libraries Bulletin, Supplement Twelve.
ERIC Educational Resources Information Center
Greaves, Monica A., Comp.
This classification scheme, based on the 'facet formula' theory of Ranganathan, is designed primarily for the library of the National Institute of Adult Education in London, England. Kinds of persons being educated (educands), methods and problems of education, specific countries, specific organizations, and forms in which the information is…
Decoding 2D-PAGE complex maps: relevance to proteomics.
Pietrogrande, Maria Chiara; Marchetti, Nicola; Dondi, Francesco; Righetti, Pier Giorgio
2006-03-20
This review describes two mathematical approaches useful for decoding the complex signal of 2D-PAGE maps of protein mixtures. These methods are helpful for interpreting the large amount of data in each 2D-PAGE map by extracting the analytical information hidden by spot overlapping. Here the basic theory and its application to 2D-PAGE maps are reviewed: the means for extracting information from the experimental data and their relevance to proteomics are discussed. One method is based on the quantitative theory of the statistical model of peak overlapping (SMO), using the spot experimental data (intensity and spatial coordinates). The second method is based on the study of the 2D autocovariance function (2D-ACVF) computed on the experimental digitised map. They are two independent methods that are able to extract equal and complementary information from the 2D-PAGE map. Both methods make it possible to obtain fundamental information on sample complexity and separation performance and to single out ordered patterns present in spot positions: the availability of two independent procedures to compute the same separation parameters is a powerful tool for estimating the reliability of the obtained results. The SMO procedure is a unique tool for quantitatively estimating the degree of spot overlapping present in the map, while the 2D-ACVF method is particularly powerful in singling out the presence of order in spot positions, i.e., spot trains, from the complexity of the whole 2D map. The procedures were validated by extensive numerical computation on computer-generated maps describing experimental 2D-PAGE gels of protein mixtures. Their applicability to real samples was tested on reference maps obtained from literature sources. The review describes the most relevant information for proteomics: sample complexity, separation performance, overlapping extent, and identification of spot trains related to post-translational modifications (PTMs).
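A minimal sketch of the 2D-ACVF computation via the Wiener-Khinchin theorem, applied to a synthetic map containing a spot train (an illustration of the idea, not the authors' implementation):

    import numpy as np

    def acvf2d(image):
        # 2D autocovariance: inverse FFT of the power spectrum of the
        # mean-centred map (circular boundary conditions).
        z = image - image.mean()
        power = np.abs(np.fft.fft2(z)) ** 2
        acvf = np.fft.ifft2(power).real / z.size
        return np.fft.fftshift(acvf)   # zero lag moved to the centre

    # Synthetic map with a horizontal spot train of period 8 pixels:
    # the ACVF shows periodic peaks along that axis.
    img = np.zeros((64, 64))
    img[32, ::8] = 1.0
    a = acvf2d(img)
    print(a[32, 32], a[32, 40])   # zero-lag peak and the 8-pixel-lag peak

The periodic peaks of the ACVF along one axis are exactly the signature used to single out spot trains.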
Bishop, Felicity L; Greville-Harris, Maddy; Bostock, Jennifer; Din, Amy; Graham, Cynthia A; Lewith, George; Liossi, Christina; O'Riordan, Tim; Ryves, Rachel; White, Peter; Yardley, Lucy
2016-08-01
Potential acupuncture patients seek out information about acupuncture from various sources including websites, many of which are unreliable. We aimed to create an informative, scientifically accurate and engaging website to educate patients about acupuncture for back pain and modify their beliefs in a way that might enhance its clinical effects. We used psychological theory and techniques to design an evidence-based website, incorporating multimedia elements. We conducted qualitative "think aloud" audio-recorded interviews to elicit user views of the website. A convenience sample of ten participants (4 male; aged 21-64 years from the local community) looked at the website in the presence of a researcher and spoke their thoughts out loud. Comments were categorised by topic. The website comprises 11 main pages and addresses key topics of interest to potential acupuncture patients, including beneficial and adverse effects, mechanisms of action, safety, practicalities, and patients' experiences of acupuncture. It provides information through text, evidence summaries and audio-clips of four patients' stories and two acupuncturists' descriptions of their practice, and three short films. Evidence from the think aloud study was used to identify opportunities to make the website more informative, engaging, and user-friendly. Using a combination of psychological theory and qualitative interviews enabled us to produce a user-friendly, evidence-based website that is likely to change patients' beliefs about acupuncture for back pain. Before using the website in clinical settings it is necessary to test its effects on key outcomes including patients' beliefs and capacity for making informed choices about acupuncture.
Approximating Reflectance and Transmittance of Vegetation Using Multiple Spectral Invariants
NASA Astrophysics Data System (ADS)
Mottus, M.
2011-12-01
Canopy spectral invariants, eigenvalues of the radiative transfer equation, and photon recollision probability are some of the new theoretical tools that have been applied in remote sensing of vegetation and the atmosphere. The theoretical approach based on spectral invariants, informally also referred to as the p-theory, owes its attractiveness to several factors. Firstly, it provides a rapid and physically based way of describing canopy scattering. Secondly, the p-theory aims at parameterizing canopy structure in reflectance models using a simple and intuitive concept which can be applied at various structural levels, from shoot to tree crown. The theory has already been applied at scales from the molecular level to forest stands. The most important shortcoming of the p-theory lies in its inability to predict the directionality of scattering. The theory is currently based on only one physical parameter, the photon recollision probability p. It is evident that one parameter cannot contain enough information to reasonably predict the observed complex reflectance patterns produced by natural vegetation canopies. Without estimating scattering directionality, however, the theory cannot be compared with even the simplest (and well-tested) two-stream vegetation reflectance models. In this study, we evaluate the possibility of using additional parameters to fit the measured reflectance and transmittance of a vegetation stand. As a first step, the parameters are applied to separate canopy scattering into reflectance and transmittance. New parameters are introduced following the general approach of eigenvector expansion; thus, the new parameters are coined higher-order spectral invariants. Calculation of the higher-order invariants is based on separating first-order scattering from total scattering. The method thus explicitly accounts for different view geometries with different fractions of visible sunlit canopy (e.g., the hot spot). It additionally allows different irradiation levels on leaf surfaces to be produced for direct and diffuse incidence, thus (in theory) allowing more accurate calculation of potential photosynthesis rates. Similarly to the p-theory, the use of multiple spectral invariants facilitates easy parametrization of canopy structure and scaling between different structural levels (leaf-shoot-stand). Spectral invariant-based remote sensing approaches are well suited to relatively large pixels, even when no detailed ground truth information is available. In a case study, the theory of multiple spectral invariants was applied to measured canopy scattering. Spectral reflectance and transmittance measurements were carried out in a gray alder (Alnus incana) plantation at Tartu Observatory, Estonia, in August 2006. The equations produced by the theory of spectral invariants were fitted to the measured radiation fluxes. Preliminary results indicate that quantities with invariant-like behavior may indeed be used to approximate canopy scattering directionality.
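For reference, the single-parameter relation that the higher-order invariants generalize is the classical recollision-probability formula for total canopy scattering, sketched here with illustrative albedo and p values:

    def canopy_scattering(leaf_albedo, p):
        # Classical spectral-invariant relation: total canopy scattering as a
        # function of leaf single-scattering albedo w and recollision
        # probability p:  s = w * (1 - p) / (1 - p * w).
        w = leaf_albedo
        return w * (1 - p) / (1 - p * w)

    for band, w in [("red", 0.15), ("NIR", 0.85)]:   # illustrative leaf albedos
        print(band, canopy_scattering(w, p=0.6))

Splitting this total scattering into reflectance and transmittance is precisely what the single parameter p cannot do, which motivates the additional invariants introduced above.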
Reducing Interpolation Artifacts for Mutual Information Based Image Registration
Soleimani, H.; Khosravifard, M.A.
2011-01-01
Medical image registration methods that use mutual information as a similarity measure have been improved in recent decades. Mutual information is a basic concept of information theory which quantifies the dependency of two random variables (or two images). In order to evaluate the mutual information of two images, their joint probability distribution is required. Several interpolation methods, such as Partial Volume (PV) and bilinear, are used to estimate the joint probability distribution. Both of these methods introduce artifacts in the mutual information function. The Partial Volume-Hanning window (PVH) and Generalized Partial Volume (GPV) methods have been introduced to remove such artifacts. In this paper we show that the acceptable performance of these methods is not due to their kernel function but to the number of pixels incorporated in the interpolation. Since using more pixels requires a more complex and time-consuming interpolation process, we propose a new interpolation method which uses only four pixels (the same as PV and bilinear interpolation) and removes most of the artifacts. Experimental results on the registration of Computed Tomography (CT) images show the superiority of the proposed scheme. PMID:22606673
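A minimal joint-histogram MI estimator is sketched below with hard (nearest-bin) binning, the baseline scheme whose artifacts PV-style kernels are designed to smooth; the images are synthetic and the binning choice is illustrative:

    import numpy as np

    def mutual_information(img_a, img_b, bins=32):
        # Estimate MI from the joint intensity histogram:
        # I(A;B) = H(A) + H(B) - H(A,B).
        joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
        p_ab = joint / joint.sum()
        p_a, p_b = p_ab.sum(axis=1), p_ab.sum(axis=0)
        def h(p):
            p = p[p > 0]
            return -(p * np.log2(p)).sum()
        return h(p_a) + h(p_b) - h(p_ab.ravel())

    rng = np.random.default_rng(3)
    a = rng.random((128, 128))
    print(mutual_information(a, a),                        # identical: high MI
          mutual_information(a, rng.random((128, 128))))   # independent: near 0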
Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory
Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong
2016-01-01
Sensor data fusion plays an important role in fault diagnosis. Dempster-Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is efficient at combining evidence from different sensors. However, when the evidence highly conflicts, it may produce a counterintuitive result. To address this issue, a new method is proposed in this paper. Not only the static sensor reliability, but also the dynamic sensor reliability is taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method has better performance in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is illustrated to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to existing methods. PMID:26797611
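For context, classic Dempster combination without any reliability weighting is sketched below with invented masses; the strong conflict between the two sensor reports illustrates the counterintuitive behavior that the proposed reliability-based weighting corrects:

    def dempster_combine(m1, m2):
        # Dempster's rule for two mass functions over frozenset focal elements;
        # the conflict mass K is discarded and the remainder renormalized.
        combined, conflict = {}, 0.0
        for a, p in m1.items():
            for b, q in m2.items():
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + p * q
                else:
                    conflict += p * q
        return {k: v / (1 - conflict) for k, v in combined.items()}, conflict

    F = frozenset
    m1 = {F({"fault1"}): 0.9, F({"fault1", "fault2"}): 0.1}   # sensor report 1
    m2 = {F({"fault2"}): 0.8, F({"fault1", "fault2"}): 0.2}   # conflicting report
    print(dempster_combine(m1, m2))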
Recoverability in quantum information theory
NASA Astrophysics Data System (ADS)
Wilde, Mark
The fact that the quantum relative entropy is non-increasing with respect to quantum physical evolutions lies at the core of many optimality theorems in quantum information theory and has applications in other areas of physics. In this work, we establish improvements of this entropy inequality in the form of physically meaningful remainder terms. One of the main results can be summarized informally as follows: if the decrease in quantum relative entropy between two quantum states after a quantum physical evolution is relatively small, then it is possible to perform a recovery operation, such that one can perfectly recover one state while approximately recovering the other. This can be interpreted as quantifying how well one can reverse a quantum physical evolution. Our proof method is elementary, relying on the method of complex interpolation, basic linear algebra, and the recently introduced Renyi generalization of a relative entropy difference. The theorem has a number of applications in quantum information theory, which have to do with providing physically meaningful improvements to many known entropy inequalities. This is based on arXiv:1505.04661, now accepted for publication in Proceedings of the Royal Society A. I acknowledge support from startup funds from the Department of Physics and Astronomy at LSU, the NSF under Award No. CCF-1350397, and the DARPA Quiness Program through US Army Research Office award W31P4Q-12-1-0019.
Fu, Yongqing; Li, Xingyuan; Li, Yanan; Yang, Wei; Song, Hailiang
2013-03-01
Chaotic communication has aroused general interest in recent years, but its performance has been limited by the need for chaos synchronization. In this paper a new chaotic M-ary digital modulation and demodulation method is proposed. A zone mapping method is developed that exploits the region-controllable characteristics of the spatiotemporal chaotic Hamilton map in the phase plane and the characteristic sensitivity of chaos to initial values. It establishes a mapping between M-ary digital information and regions of the Hamilton map phase plane, thereby realizing chaotic modulation of M-ary information. In addition, a zone partition demodulation method is proposed based on the structural characteristics of the Hamilton-modulated information; it separates the M-ary information from the phase trajectory of the chaotic Hamilton map, and a theoretical analysis of the zone partition demodulator's boundary range is given. Finally, a communication system based on the two methods is constructed on a personal computer. Simulations show that in high-speed transmission without chaos synchronization, the proposed chaotic M-ary modulation and demodulation method outperforms some conventional M-ary modulation methods, such as quadrature phase shift keying and M-ary pulse amplitude modulation, in bit error rate. It also improves bandwidth efficiency, transmission efficiency, and noise robustness, while keeping system complexity low and the chaotic signal easy to generate.
Efficient Regressions via Optimally Combining Quantile Information*
Zhao, Zhibiao; Xiao, Zhijie
2014-01-01
We develop a generally applicable framework for constructing efficient estimators of regression models via quantile regressions. The proposed method is based on optimally combining information over multiple quantiles and can be applied to a broad range of parametric and nonparametric settings. When combining information over a fixed number of quantiles, we derive an upper bound on the distance between the efficiency of the proposed estimator and the Fisher information. As the number of quantiles increases, this upper bound decreases and the asymptotic variance of the proposed estimator approaches the Cramér-Rao lower bound under appropriate conditions. In the case of non-regular statistical estimation, the proposed estimator leads to super-efficient estimation. We illustrate the proposed method for several widely used regression models. Both asymptotic theory and Monte Carlo experiments show the superior performance over existing methods. PMID:25484481
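A minimal sketch of the combining idea, assuming statsmodels' QuantReg and simple equal weights across quantiles; the paper derives the optimal combination weights, which are not reproduced here.

```python
import numpy as np
from statsmodels.regression.quantile_regression import QuantReg

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(0, 10, n)
y = 1.0 + 2.0 * x + rng.standard_t(df=3, size=n)   # heavy-tailed errors
X = np.column_stack([np.ones(n), x])               # intercept + slope design

taus = [0.1, 0.25, 0.5, 0.75, 0.9]
betas = np.array([QuantReg(y, X).fit(q=t).params for t in taus])

# Equal-weight combination across quantiles (illustrative; the paper's
# optimal weights depend on the error distribution).
beta_combined = betas.mean(axis=0)
print(beta_combined)
```

With heavy-tailed errors, pooling quantile estimates like this is typically more efficient than least squares, which is the intuition the paper formalizes against the Cramér-Rao bound.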
Presseau, Justin; Nicholas Angl, Emily; Jokhio, Iffat; Schwalm, JD; Grimshaw, Jeremy M; Bosiak, Beth; Natarajan, Madhu K; Ivers, Noah M
2017-01-01
Background Taking all recommended secondary prevention cardiac medications and fully participating in a formal cardiac rehabilitation program significantly reduces mortality and morbidity in the year following a heart attack. However, many people who have had a heart attack stop taking some or all of their recommended medications prematurely and many do not complete a formal cardiac rehabilitation program. Objective The objective of our study was to develop a user-centered, theory-based, scalable intervention of printed educational materials to encourage and support people who have had a heart attack to use recommended secondary prevention cardiac treatments. Methods Prior to the design process, we conducted theory-based interviews and surveys with patients who had had a heart attack to identify key determinants of secondary prevention behaviors. Our interdisciplinary research team then partnered with a patient advisor and design firm to undertake an iterative, theory-informed, user-centered design process to operationalize techniques to address these determinants. User-centered design requires considering users’ needs, goals, strengths, limitations, context, and intuitive processes; designing prototypes adapted to users accordingly; observing how potential users respond to the prototype; and using those data to refine the design. To accomplish these tasks, we conducted user research to develop personas (archetypes of potential users), developed a preliminary prototype using behavior change theory to map behavior change techniques to identified determinants of medication adherence, and conducted 2 design cycles, testing materials via think-aloud and semistructured interviews with a total of 11 users (10 patients who had experienced a heart attack and 1 caregiver). We recruited participants at a single cardiac clinic using purposive sampling informed by our personas. We recorded sessions with users and extracted key themes from transcripts. We held interdisciplinary team discussions to interpret findings in the context of relevant theory-based evidence and iteratively adapted the intervention accordingly. Results Through our iterative development and testing, we identified 3 key tensions: (1) evidence from theory-based studies versus users’ feelings, (2) informative versus persuasive communication, and (3) logistical constraints for the intervention versus users’ desires or preferences. We addressed these by (1) identifying root causes for users’ feelings and addressing those to better incorporate theory- and evidence-based features, (2) accepting that our intervention was ethically justified in being persuasive, and (3) making changes to the intervention where possible, such as attempting to match imagery in the materials to patients’ self-images. Conclusions Theory-informed interventions must be operationalized in ways that fit with user needs. Tensions between users’ desires or preferences and health care system goals and constraints must be identified and addressed to the greatest extent possible. A cluster randomized controlled trial of the final intervention is currently underway. PMID:28249831
Selecting Organization Development Theory from an HRD Perspective
ERIC Educational Resources Information Center
Lynham, Susan A.; Chermack, Thomas J.; Noggle, Melissa A.
2004-01-01
As is true for human resource development (HRD), the field of organization development (OD) draws from numerous disciplines to inform its theory base. However, the identification and selection of theory to inform improved practice remains a challenge and begs the question of what can be used to inform and guide one in the identification and…
Detecting spatial regimes in ecosystems
Sundstrom, Shana M.; Eason, Tarsha; Nelson, R. John; Angeler, David G.; Barichievy, Chris; Garmestani, Ahjond S.; Graham, Nicholas A.J.; Granholm, Dean; Gunderson, Lance; Knutson, Melinda; Nash, Kirsty L.; Spanbauer, Trisha; Stow, Craig A.; Allen, Craig R.
2017-01-01
Research on early warning indicators has generally focused on assessing temporal transitions with limited application of these methods to detecting spatial regimes. Traditional spatial boundary detection procedures that result in ecoregion maps are typically based on ecological potential (i.e. potential vegetation), and often fail to account for ongoing changes due to stressors such as land use change and climate change and their effects on plant and animal communities. We use Fisher information, an information theory-based method, on both terrestrial and aquatic animal data (U.S. Breeding Bird Survey and marine zooplankton) to identify ecological boundaries, and compare our results to traditional early warning indicators, conventional ecoregion maps and multivariate analyses such as nMDS and cluster analysis. We successfully detected spatial regimes and transitions in both terrestrial and aquatic systems using Fisher information. Furthermore, Fisher information provided explicit spatial information about community change that is absent from other multivariate approaches. Our results suggest that defining spatial regimes based on animal communities may better reflect ecological reality than do traditional ecoregion maps, especially in our current era of rapid and unpredictable ecological change.
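A minimal sketch of what such a spatial Fisher information scan might look like, using one common discrete estimator over binned states in a moving window. The binning, window length, and the synthetic two-regime transect are illustrative assumptions, not the study's actual computation on Breeding Bird Survey or zooplankton data.

```python
import numpy as np

def fisher_information(window, bins=8):
    """One common discrete Fisher information estimator:
    FI ~ 4 * sum of squared increments of sqrt(state probability)."""
    counts, _ = np.histogram(window, bins=bins)
    p = counts / counts.sum()
    q = np.sqrt(p)
    return 4.0 * float(np.sum(np.diff(q) ** 2))

rng = np.random.default_rng(1)
# Synthetic spatial transect with two "regimes" in a community metric.
transect = np.concatenate([rng.normal(0, 1, 200), rng.normal(4, 1, 200)])
fi_scan = [fisher_information(transect[i:i + 50])
           for i in range(0, len(transect) - 50, 10)]
# Abrupt shifts in fi_scan flag candidate spatial regime boundaries.
```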
Bohmian Photonics for Independent Control of the Phase and Amplitude of Waves
NASA Astrophysics Data System (ADS)
Yu, Sunkyu; Piao, Xianji; Park, Namkyoo
2018-05-01
The de Broglie-Bohm theory is one of the nonstandard interpretations of quantum phenomena that focuses on reintroducing definite positions of particles, in contrast to the indeterminism of the Copenhagen interpretation. In spite of intense debate on its measurement and nonlocality, the de Broglie-Bohm theory based on the reformulation of the Schrödinger equation allows for the description of quantum phenomena as deterministic trajectories embodied in the modified Hamilton-Jacobi mechanics. Here, we apply the Bohmian reformulation to Maxwell's equations to achieve the independent manipulation of optical phase evolution and energy confinement. After establishing the deterministic design method based on the Bohmian approach, we investigate the condition of optical materials enabling scattering-free light with bounded or random phase evolutions. We also demonstrate a unique form of optical confinement and annihilation that preserves the phase information of incident light. Our separate tailoring of wave information extends the notion and range of artificial materials.
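The quantum-mechanical decomposition being transplanted to optics can be written compactly in its standard Madelung/Bohmian form; the paper applies the analogous split to Maxwell's equations.

```latex
% Polar (Madelung) ansatz for the wavefunction:
\psi(\mathbf{r},t) = R(\mathbf{r},t)\, e^{i S(\mathbf{r},t)/\hbar}.
% Substitution into the Schrodinger equation yields a continuity equation
% for the density R^2 and a modified Hamilton-Jacobi equation:
\frac{\partial S}{\partial t} + \frac{|\nabla S|^{2}}{2m} + V + Q = 0,
\qquad Q = -\frac{\hbar^{2}}{2m}\,\frac{\nabla^{2} R}{R},
```

where Q is the quantum potential. The amplitude R and phase S appear as separately addressable quantities, which is what enables the independent control of energy confinement and phase evolution described in the abstract.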
Spectra of empirical autocorrelation matrices: A random-matrix-theory-inspired perspective
NASA Astrophysics Data System (ADS)
Jamali, Tayeb; Jafari, G. R.
2015-07-01
We construct an autocorrelation matrix of a time series and analyze it based on the random-matrix theory (RMT) approach. The autocorrelation matrix is capable of extracting information which is not easily accessible by direct analysis of the autocorrelation function. In order to draw precise conclusions from the information extracted from the autocorrelation matrix, the results must first be evaluated; in other words, they need to be compared with a criterion that provides a basis for suitable and applicable conclusions. In the present study, the criterion is chosen to be the well-known fractional Gaussian noise (fGn). We illustrate the applicability of our method in the context of stock markets: despite the non-Gaussianity of stock-market returns, a remarkable agreement with the fGn is achieved.
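A minimal numpy sketch of the construction, assuming the autocorrelation matrix is the Toeplitz matrix built from the sample autocorrelation function; the input series, lag depth, and the fGn comparison step are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import toeplitz

def autocorrelation_matrix(x, max_lag):
    """Toeplitz matrix built from the sample autocorrelation function."""
    x = (x - x.mean()) / x.std()
    n = len(x)
    acf = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(max_lag)])
    return toeplitz(acf)

rng = np.random.default_rng(0)
series = rng.standard_normal(5000)        # stand-in for log-returns
eigvals = np.linalg.eigvalsh(autocorrelation_matrix(series, 100))
# Compare this eigenvalue spectrum against one computed from simulated
# fGn with a chosen Hurst exponent to judge deviations from the benchmark.
```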
Application of fuzzy system theory in addressing the presence of uncertainties
NASA Astrophysics Data System (ADS)
Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.; Ariffin, A. K.
2015-02-01
In this paper, combinations of fuzzy system theory with finite element methods are presented and discussed as a way to deal with uncertainties. Uncertainties must be addressed in order to prevent material failure in engineering. There are three types of uncertainty: stochastic, epistemic, and error. In this paper, epistemic uncertainty is considered; it arises from incomplete information and a lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than a statistical approach for interpreting uncertainty when data are lacking. Fuzzy system theory comprises several steps, starting with converting crisp inputs to fuzzy inputs through fuzzification, followed by the main step, known as mapping, which establishes the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on the extension principle method. In the final stage, defuzzification is implemented; this important step converts the fuzzy outputs into crisp outputs. Several illustrative examples are given, and the simulations show that the proposed method produces more conservative results compared with the conventional finite element method.
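A toy Python sketch of the fuzzification-mapping-defuzzification pipeline via α-cuts of a triangular fuzzy number. The membership shape, α-level sampling, and the response function are illustrative assumptions standing in for the paper's finite element model.

```python
import numpy as np

def alpha_cut(tri, alpha):
    """Interval (alpha-cut) of a triangular fuzzy number (a, m, b)."""
    a, m, b = tri
    return a + alpha * (m - a), b - alpha * (b - m)

def propagate(f, tri, alphas=np.linspace(0, 1, 11)):
    """Extension principle via interval sampling: push each alpha-cut
    through f and keep the min/max envelope as the fuzzy output."""
    out = []
    for alpha in alphas:
        lo, hi = alpha_cut(tri, alpha)
        vals = f(np.linspace(lo, hi, 101))
        out.append((alpha, vals.min(), vals.max()))
    return out

# Fuzzy input: a Young's modulus known only as "about 200, between 180 and 220".
fuzzy_E = (180.0, 200.0, 220.0)
response = propagate(lambda E: 1.0e3 / E, fuzzy_E)  # toy displacement model
```

A defuzzification step (e.g. centroid of the output membership) would then return the single crisp value the abstract refers to.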
2002-08-01
…the measurement noise, as well as the physical model of the forward-scattered electric field. The Bayesian algorithms for the Uncertain Permittivity… received at multiple sensors. In this research project a tissue-model-based signal-detection theory approach for the detection of mammary tumors in the… oriented information processors.
2014-01-01
Background Evidence indicates that post-stroke rehabilitation improves function, independence and quality of life. A key aspect of rehabilitation is the provision of appropriate information and feedback to the learner. Advances in information and communications technology (ICT) have allowed for the development of various systems to complement stroke rehabilitation that could be used in the home setting. These systems may increase the amount of rehabilitation a stroke survivor receives and carries out, as well as providing a learning platform that facilitates long-term self-managed rehabilitation and behaviour change. This paper describes the application of an innovative evaluative methodology to explore the utilisation of feedback for post-stroke upper-limb rehabilitation in the home. Methods Using the principles of realistic evaluation, this study aimed to test and refine intervention theories by exploring the complex interactions of contexts, mechanisms and outcomes that arise from technology deployment in the home. Methods included focus groups followed by multi-method case studies (n = 5) before, during and after the use of computer-based equipment. Data were analysed in relation to the context-mechanism-outcome hypotheses case by case. This was followed by a synthesis of the findings to answer the question, ‘what works for whom and in what circumstances and respects?’ Results Data analysis reveals that to achieve desired outcomes through the use of ICT, key elements of computer feedback, such as accuracy, measurability, rewarding feedback, adaptability, and knowledge of results feedback, are required to trigger the theory-driven mechanisms underpinning the intervention. In addition, the pre-existing context and the personal and environmental contexts, such as previous experience of service delivery, personal goals, trust in the technology, and social circumstances may also enable or constrain the underpinning theory-driven mechanisms. Conclusions Findings suggest that the theory-driven mechanisms underpinning the utilisation of feedback from computer-based technology for home-based upper-limb post-stroke rehabilitation are dependent on key elements of computer feedback and the personal and environmental context. The identification of these elements may therefore inform the development of technology; therapy education and the subsequent adoption of technology and a self-management paradigm; long-term self-managed rehabilitation; and importantly, improvements in the physical and psychosocial aspects of recovery. PMID:24903401
Using information theory to assess the communicative capacity of circulating microRNA.
Finn, Nnenna A; Searles, Charles D
2013-10-11
The discovery of extracellular microRNAs (miRNAs) and their transport modalities (i.e., microparticles, exosomes, proteins and lipoproteins) has sparked theories regarding their role in intercellular communication. Here, we assessed the information transfer capacity of different miRNA transport modalities in human serum by utilizing basic principles of information theory. Zipf statistics were calculated for each of the miRNA transport modalities identified in human serum. Our analyses revealed that miRNA-mediated information transfer is redundant, as evidenced by negative Zipf statistics with magnitudes greater than one. In healthy subjects, the potential communicative capacity of miRNA in complex with circulating proteins was significantly lower than that of miRNA encapsulated in circulating microparticles and exosomes. Moreover, the presence of coronary heart disease significantly lowered the communicative capacity of all circulating miRNA transport modalities. To assess the internal organization of circulating miRNA signals, Shannon's zero- and first-order entropies were calculated. Microparticles (MPs) exhibited the lowest Shannon entropic slope, indicating a relatively high capacity for information transfer. Furthermore, compared to the other miRNA transport modalities, MPs appeared to be the most efficient at transferring miRNA to cultured endothelial cells. Taken together, these findings suggest that although all transport modalities have the capacity for miRNA-based information transfer, MPs may be the simplest and most robust way to achieve miRNA-based signal transduction in sera. This study presents a novel method for analyzing the quantitative capacity of miRNA-mediated information transfer while providing insight into the communicative characteristics of distinct circulating miRNA transport modalities. Published by Elsevier Inc.
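A small Python sketch of the two descriptive quantities used here: Zipf's statistic (the slope of log-frequency versus log-rank) and zero-order Shannon entropy. The read counts are hypothetical and the paper's exact estimators may differ in detail.

```python
import numpy as np

def zipf_statistic(counts):
    """Slope of log(frequency) vs log(rank); per the abstract, negative
    values with magnitude > 1 indicate redundant (repetitive) signals."""
    freq = np.sort(np.asarray(counts, float))[::-1]
    freq = freq[freq > 0]
    ranks = np.arange(1, len(freq) + 1)
    slope, _ = np.polyfit(np.log(ranks), np.log(freq), 1)
    return slope

def shannon_entropy(counts):
    """Zero-order Shannon entropy in bits per symbol."""
    p = np.asarray(counts, float)
    p = p[p > 0] / p.sum()
    return -float(np.sum(p * np.log2(p)))

mirna_counts = [812, 400, 310, 122, 60, 33, 20, 9, 4]  # hypothetical reads
print(zipf_statistic(mirna_counts), shannon_entropy(mirna_counts))
```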
Theory-based interventions for contraception.
Lopez, Laureen M; Tolley, Elizabeth E; Grimes, David A; Chen, Mario; Stockton, Laurie L
2013-08-07
The explicit use of theory in research helps expand the knowledge base. Theories and models have been used extensively in HIV-prevention research and in interventions for preventing sexually transmitted infections (STIs). The health behavior field uses many theories or models of change. However, educational interventions addressing contraception often have no stated theoretical base. Our objective was to review randomized controlled trials (RCTs) that tested a theoretical approach to inform contraceptive choice; encourage contraceptive use; or promote adherence to, or continuation of, a contraceptive regimen. Through June 2013, we searched computerized databases for trials that tested a theory-based intervention for improving contraceptive use (MEDLINE, POPLINE, CENTRAL, PsycINFO, ClinicalTrials.gov, and ICTRP). Previous searches also included EMBASE. For the initial review, we wrote to investigators to find other trials. Trials tested a theory-based intervention for improving contraceptive use. We excluded trials focused on high-risk groups and preventing sexually transmitted infections or HIV. Interventions addressed the use of one or more contraceptive methods. The reports provided evidence that the intervention was based on a specific theory or model. The primary outcomes were pregnancy, contraceptive choice or use, and contraceptive adherence or continuation. The primary author evaluated abstracts for eligibility. Two authors extracted data from included studies. For the dichotomous outcomes, the Mantel-Haenszel odds ratio (OR) with 95% CI was calculated using a fixed-effect model. Cluster randomized trials used various methods of accounting for the clustering, such as multilevel modeling. Most reports did not provide information to calculate the effective sample size. Therefore, we presented the results as reported by the investigators. No meta-analysis was conducted due to differences in interventions and outcome measures. We included three new trials for a total of 17. Ten trials randomly assigned individuals and seven were cluster-randomized. Eight trials showed some intervention effect. Two of 12 trials with pregnancy or birth data showed some effect. A theory-based group was less likely than the comparison group to have a second birth (OR 0.41; 95% CI 0.17 to 1.00) or to report a pregnancy (OR 0.24 (95% CI 0.10 to 0.56); OR 0.27 (95% CI 0.11 to 0.66)). The theoretical bases were social cognitive theory (SCT) and another social cognition model. Of 12 trials with data on contraceptive use (non-condom), six showed some effect. A theory-based group was more likely to consistently use oral contraceptives (OR 1.41; 95% CI 1.06 to 1.87), hormonal contraceptives (reported relative risk (RR) 1.30; 95% CI 1.06 to 1.58) or dual methods (reported RR 1.36; 95% CI 1.01 to 1.85); to use an effective contraceptive method (reported effect size 1.76; OR 2.04 (95% CI 1.47 to 2.83)) or use more habitual contraception (reported P < 0.05); and were less likely to use ineffective contraception (OR 0.56; 95% CI 0.31 to 0.98). Theories and models included the Health Belief Model (HBM), SCT, SCT plus another theory, other social cognition, and motivational interviewing (MI). For condom use, a theory-based group had favorable results in 5 of 11 trials.
The main differences were reporting more consistent condom use (reported RR 1.57; 95% CI 1.28 to 1.94) and more condom use during last sex (reported results: risk ratio 1.47 (95% CI 1.12 to 1.93); effect size 1.68; OR 2.12 (95% CI 1.24 to 3.56); OR 1.45 (95% CI 1.03 to 2.03)). The theories were SCT, SCT plus another theory, and HBM. Nearly all trials provided multiple sessions or contacts. SCT provided the basis for seven trials focused on adolescents, of which five reported some effectiveness. Two others based on other social cognition models had favorable results with adolescents. Of six trials including adult women, five provided individual sessions. Some effect was seen in two using MI and one using the HBM. Two based on the Transtheoretical Model did not show any effect. Eight trials provided evidence of high or moderate quality. Family planning researchers and practitioners could adapt the effective interventions, although most provided group sessions for adolescents. Three were conducted outside the USA. Clinics and low-resource settings need high-quality evidence on changing behavior. Thorough use of single theories would help in identifying what works, as would better reporting on research design and intervention implementation.
Distillation of Greenberger-Horne-Zeilinger states by selective information manipulation.
Cohen, O; Brun, T A
2000-06-19
Methods for distilling Greenberger-Horne-Zeilinger (GHZ) states from arbitrary entangled tripartite pure states are described. These techniques work for virtually any input state. Each technique has two stages which we call primary and secondary distillations. Primary distillation produces a GHZ state with some probability, so that when applied to an ensemble of systems a certain percentage is discarded. Secondary distillation produces further GHZs from the discarded systems. These protocols are developed with the help of an approach to quantum information theory based on absolutely selective information, which has other potential applications.
Identification of open quantum systems from observable time traces
Zhang, Jun; Sarovar, Mohan
2015-05-27
Estimating the parameters that dictate the dynamics of a quantum system is an important task for quantum information processing and quantum metrology, as well as fundamental physics. In our paper we develop a method for parameter estimation for Markovian open quantum systems using a temporal record of measurements on the system. Furthermore, the method is based on system realization theory and is a generalization of our previous work on identification of Hamiltonian parameters.
Electronic Structure Methods Based on Density Functional Theory
2010-01-01
…chapter in the ASM Handbook, Volume 22A: Fundamentals of Modeling for Metals Processing, 2010. PAO Case Number: 88ABW-2009-3258; Clearance Date: 16 Jul… …are represented using a linear combination, or basis, of plane waves. Over time several methods were developed to avoid the large number of plane waves…
ERIC Educational Resources Information Center
Marcoulides, Katerina M.
2018-01-01
This study examined the use of Bayesian analysis methods for the estimation of item parameters in a two-parameter logistic item response theory model. Using simulated data under various design conditions with both informative and non-informative priors, the parameter recovery of Bayesian analysis methods were examined. Overall results showed that…
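For reference, the two-parameter logistic (2PL) model under study, with a minimal grid-based Bayesian update shown for the ability parameter. The item parameters, prior, and grid are illustrative assumptions; the study itself concerns the recovery of item parameters under informative and non-informative priors.

```python
import numpy as np

rng = np.random.default_rng(42)

def p_2pl(theta, a, b):
    """Two-parameter logistic item response function."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Simulate one examinee answering items with known (hypothetical) parameters.
a = np.array([1.2, 0.8, 1.5, 1.0])   # discriminations
b = np.array([-0.5, 0.0, 0.5, 1.0])  # difficulties
theta_true = 0.7
resp = rng.random(4) < p_2pl(theta_true, a, b)

# Grid posterior over theta with a standard-normal prior.
grid = np.linspace(-4, 4, 401)
probs = p_2pl(grid[:, None], a, b)
like = np.prod(np.where(resp, probs, 1 - probs), axis=1)
post = like * np.exp(-grid**2 / 2)
post /= post.sum()
print(grid[np.argmax(post)])  # posterior mode for ability
```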
Weighted score-level feature fusion based on Dempster-Shafer evidence theory for action recognition
NASA Astrophysics Data System (ADS)
Zhang, Guoliang; Jia, Songmin; Li, Xiuzhi; Zhang, Xiangyin
2018-01-01
The majority of human action recognition methods use a multifeature fusion strategy to improve classification performance, but the contribution of different features to a specific action has not received enough attention. We present an extendible and universal weighted score-level feature fusion method using the Dempster-Shafer (DS) evidence theory based on the bag-of-visual-words pipeline. First, partially distinctive samples in the training set are selected to construct the validation set. Then, local spatiotemporal features and pose features are extracted from these samples to obtain evidence information. The DS evidence theory and the proposed rule of survival of the fittest are employed to combine evidence and calculate optimal weight vectors for every feature type belonging to each action class. Finally, the recognition results are deduced via a weighted summation strategy. The performance of the established recognition framework is evaluated on the Penn Action dataset and a subset of the Joint-annotated Human Motion Database (sub-JHMDB). The experimental results demonstrate that the proposed feature fusion method can adequately exploit the complementarity among multiple features and improves upon most state-of-the-art algorithms on the Penn Action and sub-JHMDB datasets.
Streamflow Prediction based on Chaos Theory
NASA Astrophysics Data System (ADS)
Li, X.; Wang, X.; Babovic, V. M.
2015-12-01
Chaos theory is a popular method in hydrologic time series prediction. The local model (LM) based on this theory uses time-delay embedding to reconstruct the phase-space diagram. The efficacy of this method depends on the embedding parameters, i.e., the embedding dimension, time lag, and number of nearest neighbors, so optimal estimation of these parameters is critical to the application of the local model. However, these embedding parameters are conventionally estimated separately, using Average Mutual Information (AMI) and False Nearest Neighbors (FNN). This may lead to a merely local optimum and thus limit prediction accuracy. Considering these limitations, this paper applies a local model combined with simulated annealing (SA) to find the global optimum of the embedding parameters, and compares it with another global optimization approach, the Genetic Algorithm (GA). The proposed hybrid methods are examined on daily and monthly streamflow time series. The results show that global optimization enables the local model to provide more accurate predictions than local optimization, and the LM combined with SA shows additional advantages in computational efficiency. The proposed scheme can also be applied to other fields, such as prediction of hydro-climatic time series and error correction.
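A minimal sketch of the local model itself, assuming time-delay embedding and k-nearest-neighbor one-step prediction. In the scheme described here, simulated annealing (or a GA) would search jointly over (dim, lag, k) using the prediction error of such a function as the objective.

```python
import numpy as np

def embed(x, dim, lag):
    """Time-delay embedding of a scalar series into phase space."""
    n = len(x) - (dim - 1) * lag
    return np.column_stack([x[i * lag: i * lag + n] for i in range(dim)])

def local_predict(x, dim, lag, k):
    """One-step local model: average the successors of the k nearest
    phase-space neighbors of the current state."""
    Y = embed(x, dim, lag)
    current, history = Y[-1], Y[:-1]
    dist = np.linalg.norm(history - current, axis=1)
    idx = np.argsort(dist)[:k]
    return x[idx + (dim - 1) * lag + 1].mean()

# Example: one-step-ahead forecast of a streamflow-like series.
rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 60, 1200)) + 0.1 * rng.standard_normal(1200)
print(local_predict(x, dim=3, lag=2, k=5))
```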
NASA Astrophysics Data System (ADS)
Zendejas, Gerardo; Chiasson, Mike
This paper will propose and explore a method to enhance focal actors' abilities to enroll and control the many social and technical components interacting during the initiation, production, and diffusion of innovations. The reassembling and stabilizing of such components is the challenging goal of the focal actors involved in these processes. To address this possibility, a healthcare project involving the initiation, production, and diffusion of an IT-based innovation will be influenced by the researcher, using concepts from actor network theory (ANT), within an action research methodology (ARM). The experiences using this method, and the nature of enrolment and translation during its use, will highlight if and how ANT can provide a problem-solving method to help assemble the social and technical actants involved in the diffusion of an innovation. Finally, the paper will discuss the challenges and benefits of implementing such methods to attain widespread diffusion.
Adaptation to sensory-motor reflex perturbations is blind to the source of errors.
Hudson, Todd E; Landy, Michael S
2012-01-06
In the study of visual-motor control, perhaps the most familiar findings involve adaptation to externally imposed movement errors. Theories of visual-motor adaptation based on optimal information processing suppose that the nervous system identifies the sources of errors to effect the most efficient adaptive response. We report two experiments using a novel perturbation based on stimulating a visually induced reflex in the reaching arm. Unlike adaptation to an external force, our method induces a perturbing reflex within the motor system itself, i.e., perturbing forces are self-generated. This novel method allows a test of the theory that error source information is used to generate an optimal adaptive response. If the self-generated source of the visually induced reflex perturbation is identified, the optimal response will be via reflex gain control. If the source is not identified, a compensatory force should be generated to counteract the reflex. Gain control is the optimal response to reflex perturbation, both because energy cost and movement errors are minimized. Energy is conserved because neither reflex-induced nor compensatory forces are generated. Precision is maximized because endpoint variance is proportional to force production. We find evidence against source-identified adaptation in both experiments, suggesting that sensory-motor information processing is not always optimal.
Training needs for toxicity testing in the 21st century: a survey-informed analysis.
Lapenna, Silvia; Gabbert, Silke; Worth, Andrew
2012-12-01
Current training needs on the use of alternative methods in predictive toxicology, including new approaches based on mode-of-action (MoA) and adverse outcome pathway (AOP) concepts, are expected to evolve rapidly. In order to gain insight into stakeholder preferences for training, the European Commission's Joint Research Centre (JRC) conducted a single-question survey with twelve experts in regulatory agencies, industry, national research organisations, NGOs and consultancies. Stakeholder responses were evaluated by means of theory-based qualitative data analysis. Overall, a set of training topics were identified that relate both to general background information and to guidance for applying alternative testing methods. In particular, for the use of in silico methods, stakeholders emphasised the need for training on data integration and evaluation, in order to increase confidence in applying these methods for regulatory purposes. Although the survey does not claim to offer an exhaustive overview of the training requirements, its findings support the conclusion that the development of well-targeted and tailor-made training opportunities that inform about the usefulness of alternative methods, in particular those that offer practical experience in the application of in silico methods, deserves more attention. This should be complemented by transparent information and guidance on the interpretation of the results generated by these methods and software tools. 2012 FRAME.
A New Understanding for the Rain Rate retrieval of Attenuating Radars Measurement
NASA Astrophysics Data System (ADS)
Koner, P.; Battaglia, A.; Simmer, C.
2009-04-01
The retrieval of rain rate from an attenuating radar (e.g. the Cloud Profiling Radar on board CloudSat, in orbit since June 2006) is a challenging problem. L'Ecuyer and Stephens [1] underlined this difficulty (for rain rates larger than 1.5 mm/h) and suggested the need for additional information (such as path-integrated attenuation (PIA) derived from surface reference techniques, or precipitation water path estimated from a co-located passive microwave radiometer) to constrain the retrieval. Based on optimal estimation theory, it is generally argued that in cases of appreciable attenuation there is no solution without constraining the problem, because there is not enough information content to solve it. However, when the problem is constrained by the additional measurement of PIA, a reasonable solution exists. This raises an obvious question: is all the information contained in this additional measurement? It also seems to contradict information theory, because one measurement can introduce only one degree of freedom into the retrieval. Why is one degree of freedom so important in this problem? These questions cannot be answered using the estimation and information theory underlying OEM. On the other hand, Koner and Drummond [2] argued that OEM is basically a regularization method, in which the a-priori covariance is used as a stabilizer and the regularization strength is determined by the choices of the a-priori and error covariance matrices. Regularization is required to reduce the condition number of the Jacobian, which drives the noise injection from the measurement and inversion spaces into the state space in an ill-posed inversion. In this work, the questions above are discussed in terms of regularization theory, error mitigation, and eigenvalue mathematics. References: 1. L'Ecuyer TS, Stephens G. An estimation-based precipitation retrieval algorithm for attenuating radars. J. Appl. Met., 2002, 41, 272-85. 2. Koner PK, Drummond JR. A comparison of regularization techniques for atmospheric trace gas retrievals. JQSRT 2008; 109:514-26.
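The regularization argument can be made concrete with a small numpy sketch, assuming a generic ill-conditioned Jacobian: adding a Tikhonov term (the role played by the a-priori covariance in OEM) lowers the condition number and thereby limits noise injection into the state space.

```python
import numpy as np

rng = np.random.default_rng(0)
# Ill-conditioned Jacobian: column scales span ten orders of magnitude.
J = rng.standard_normal((50, 10)) @ np.diag(10.0 ** -np.arange(10))
x_true = rng.standard_normal(10)
y = J @ x_true + 1e-3 * rng.standard_normal(50)   # noisy measurements

lam = 1e-4  # regularization strength (stands in for the a-priori covariance)
A = J.T @ J + lam * np.eye(10)
x_reg = np.linalg.solve(A, J.T @ y)   # Tikhonov-regularized solution

print(np.linalg.cond(J.T @ J), np.linalg.cond(A))  # condition number drops
```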
The difference between a dynamic and mechanical approach to stroke treatment.
Helgason, Cathy M
2007-06-01
The current classification of stroke is based on causation, also called pathogenesis, and relies on binary logic faithful to the Aristotelian tradition. Accordingly, a pathology is or is not the cause of the stroke, is considered independent of others, and is the target for treatment. It is the subject for large double-blind randomized clinical therapeutic trials. The scientific view behind clinical trials is the fundamental concept that information is statistical, and causation is determined by probabilities. Therefore, the cause and effect relation will be determined by probability-theory-based statistics. This is the basis of evidence-based medicine, which calls for the results of such trials to be the basis for physician decisions regarding diagnosis and treatment. However, there are problems with the methodology behind evidence-based medicine. Calculations using probability-theory-based statistics regarding cause and effect are performed within an automatic system where there are known inputs and outputs. This method of research provides a framework of certainty with no surprise elements or outcomes. However, it is not a system or method that will come up with previously unknown variables, concepts, or universal principles; it is not a method that will give a new outcome; and it is not a method that allows for creativity, expertise, or new insight for problem solving.
The subject of pedagogy from theory to practice--the view of newly registered nurses.
Ivarsson, Bodil; Nilsson, Gunilla
2009-07-01
The aim was to describe, from the newly registered nurses' perspective, specific events when using their pedagogical knowledge in their everyday clinical practice. The design was qualitative and the critical incident technique was used. Data was collected via interviews with ten newly registered nurses who graduated from the same University program 10 months earlier and are now employed at a university hospital. Two categories emerged in the analyses. The first category was "Pedagogical methods in theory" with the sub-categories Theory and the application of the course in practice, Knowledge of pedagogy and Information as a professional competence. The second category was "Pedagogical methods in everyday clinical practice" with sub-categories Factual knowledge versus pedagogical knowledge, Information and relatives, Difficulties when giving information, Understanding information received, Pedagogical tools, Collaboration in teams in pedagogical situations, and Time and giving information. By identifying specific events regarding pedagogical methods the findings can be useful for everyone from teachers and health-care managers to nurse students and newly registered nurses, to improve teaching methods in nurse education.
Effects of atmospheric aerosols on scattering reflected visible light from earth resource features
NASA Technical Reports Server (NTRS)
Noll, K. E.; Tschantz, B. A.; Davis, W. T.
1972-01-01
The vertical variations in atmospheric light attenuation under ambient conditions were identified, and a method through which aerial photographs of earth features might be corrected to yield quantitative information about the actual features was provided. A theoretical equation was developed based on the Bouguer-Lambert extinction law and basic photographic theory.
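The extinction law in question takes the standard form below; the report's photographic correction builds on it.

```latex
% Bouguer-Lambert extinction along a viewing path through the atmosphere:
I = I_0 \, e^{-\tau}, \qquad \tau = \int_{0}^{z} k_{\mathrm{ext}}(z')\, dz',
```

where k_ext(z) is the height-dependent extinction coefficient, so correcting an aerial photograph amounts to estimating the optical depth τ of the viewing path.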
ERIC Educational Resources Information Center
McGair, Charles D.
2012-01-01
Many theories, methods, and practices are utilized to evaluate teachers with the intention of determining teacher effectiveness to better inform decisions about retention, tenure, certification and performance-based pay. In the 21st century there has been a renewed emphasis on teacher evaluation in public schools, largely due to federal "Race…
Upper-Bound Estimates Of SEU in CMOS
NASA Technical Reports Server (NTRS)
Edmonds, Larry D.
1990-01-01
Theory of single-event upsets (SEU) (changes in logic state caused by energetic charged subatomic particles) in complementary metal oxide/semiconductor (CMOS) logic devices extended to provide upper-bound estimates of rates of SEU when limited experimental information available and configuration and dimensions of SEU-sensitive regions of devices unknown. Based partly on chord-length-distribution method.
Sebire, Simon J.; Kesten, Joanna M.; Edwards, Mark J.; May, Thomas; Banfield, Kathryn; Tomkinson, Keeley; Blair, Peter S.; Bird, Emma L.; Powell, Jane E.; Jago, Russell
2016-01-01
Objectives To report the theory-based process evaluation of the Bristol Girls' Dance Project, a cluster-randomised controlled trial to increase adolescent girls' physical activity. Design A mixed-method process evaluation of the intervention's self-determination theory components comprising lesson observations, post-intervention interviews and focus groups. Method Four intervention dance lessons per dance instructor were observed, audio recorded and rated to estimate the use of need-supportive teaching strategies. Intervention participants (n = 281) reported their dance instructors' provision of autonomy-support. Semi-structured interviews with the dance instructors (n = 10) explored fidelity to the theory and focus groups were conducted with participants (n = 59) in each school to explore their receipt of the intervention and views on the dance instructors' motivating style. Results Although instructors accepted the theory-based approach, intervention fidelity was variable. Relatedness support was the most commonly observed need-supportive teaching behaviour, provision of structure was moderate and autonomy-support was comparatively low. The qualitative findings identified how instructors supported competence and developed trusting relationships with participants. Fidelity was challenged where autonomy provision was limited to option choices rather than input into the pace or direction of lessons and where controlling teaching styles were adopted, often to manage disruptive behaviour. Conclusion The successes and challenges to achieving theoretical fidelity in the Bristol Girls' Dance Project may help explain the intervention effects and can more broadly inform the design of theory-based complex interventions aimed at increasing young people's physical activity in after-school settings. PMID:27175102
Chiasson, Mike; Reddy, Madhu; Kaplan, Bonnie; Davidson, Elizabeth
2007-06-01
The effective use of information technology (IT) is a crucial component for the delivery of effective services in health care. Current approaches to medical informatics (MI) research have significantly contributed to the success of IT use in health care but important challenges remain to be addressed. We believe that expanding the multi-disciplinary basis for MI research is important to meeting these research challenges. In this paper, we outline theories and methods used in information systems (IS) research that we believe can inform our understanding of health care IT applications and outcomes. To do so, we discuss some general differences in the focus and methods of MI and IS research to identify broad opportunities. We then review conceptual and methodological approaches in IS that have been applied in health care IT research. These include: technology-use mediation, collaborative work, genre theory, interpretive research, action research, and modeling. Examples of these theories and methods in healthcare IS research are illustrated.
NASA Astrophysics Data System (ADS)
Cheng, Yayun; Qi, Bo; Liu, Siyuan; Hu, Fei; Gui, Liangqi; Peng, Xiaohui
2016-10-01
Polarimetric measurements can provide additional information compared to unpolarized ones. In this paper, the linear polarization ratio (LPR) is introduced as a feature discriminator. The LPR properties of several materials are investigated using Fresnel theory. The theoretical results show that LPR is sensitive to the material type (metal or dielectric). A linear polarization ratio-based (LPR-based) method is then presented to distinguish between metal and dielectric materials. To apply this method in practical settings, the optimal range of incident angles is discussed. Typical outdoor experiments involving various objects, such as an aluminum plate, grass, concrete, soil, and wood, were conducted to validate the presented classification method.
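A minimal Python sketch of the Fresnel computation behind such a discriminator, assuming LPR is taken as the ratio of horizontally to vertically polarized reflectivity and using illustrative refractive indices; the paper's exact LPR definition and material constants are not reproduced here.

```python
import numpy as np

def fresnel_reflectivity(n2, theta_i, n1=1.0):
    """Power reflectivities for s (horizontal) and p (vertical) polarization."""
    ci = np.cos(theta_i)
    st = n1 * np.sin(theta_i) / n2          # Snell's law
    ct = np.sqrt(1 - st**2 + 0j)            # complex-safe for lossy media
    rs = (n1 * ci - n2 * ct) / (n1 * ci + n2 * ct)
    rp = (n2 * ci - n1 * ct) / (n2 * ci + n1 * ct)
    return abs(rs)**2, abs(rp)**2

theta = np.deg2rad(50.0)
for name, n in [("aluminum-like", 30 + 30j), ("soil-like", 2.5 + 0.1j)]:
    Rh, Rv = fresnel_reflectivity(n, theta)
    print(name, "LPR =", Rh / Rv)  # near 1 for metals, well above 1 for dielectrics
```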
Patterns-Based IS Change Management in SMEs
NASA Astrophysics Data System (ADS)
Makna, Janis; Kirikova, Marite
The majority of information systems change management guidelines and standards are either too abstract or too bureaucratic to be easily applicable in small enterprises. This chapter proposes an approach, a method, and a prototype designed especially for information systems change management in small and medium enterprises. The approach is based on proven patterns of changes in the set of information systems elements. The set of elements was obtained by theoretical analysis of information systems and business process definitions and enterprise architectures. The patterns were derived from a number of information systems theories and tested in 48 information systems change management projects. The prototype presents and helps to handle three basic change patterns, which help to anticipate the overall scope of changes related to particular elementary changes in an enterprise information system. Using the prototype requires only basic knowledge of organizational business processes and information management.
Creativity, information, and consciousness: The information dynamics of thinking.
Wiggins, Geraint A
2018-05-07
This paper presents a theory of the basic operation of mind, Information Dynamics of Thinking, which is intended for computational implementation and thence empirical testing. It is based on the information theory of Shannon, and treats the mind/brain as an information processing organ that aims to be information-efficient, in that it predicts its world, so as to use information efficiently, and regularly re-represents it, so as to store information efficiently. The theory is presented in context of a background review of various research areas that impinge upon its development. Consequences of the theory and testable hypotheses arising from it are discussed. Copyright © 2018. Published by Elsevier B.V.
Teachers' and Researchers' Beliefs of Learning and the use of Learning Progressions
NASA Astrophysics Data System (ADS)
Clapp, Francis Neely
In the last decade, science education reform in the United States has emphasized the exploration of cognitive learning pathways, which are theories of how a person learns a particular science subject matter. These theories are based, in part, on Piagetian developmental theory. One such model, called Learning Progressions (LP), has become prominent within science education reform. Science education researchers design LPs, which in turn are used by science educators to sequence their curricula. The new national science standards released in April 2013 (Next Generation Science Standards) are, in part, grounded in the LP model. Understanding how teachers apply and use LPs is therefore valuable, because professional development programs are likely to use this model given the federal attention LPs have received in science education reform. I sought to identify the beliefs and discourse that both LP developers and intended LP implementers have around student learning, teaching, and learning progressions. However, studies measuring beliefs or perspectives within LP-focused projects are absent from the published literature. A qualitative study is therefore warranted to explore this rather uncharted research area. Research questions were examined through the use of an instrumental case study. A case study approach was selected over other methodologies because the research problem is, in part, bound within a clearly identifiable case (a professional development experience centering on a single LP model). One of the broadest definitions of a case study is noted by Becker (1968), who stated that the goals of case studies are "to arrive at a comprehensive understanding of the groups under study" and to develop "general theoretical statements about regularities in social structure and process" (p. 233). Based on Merriam (1985), the general consensus in the case study literature is that the assumptions underlying this method are common to naturalistic inquiry, with research conducted primarily in the field with little control of variables. Beyond this similarity, different researchers define case studies differently. Merriam (1985) provided a summary of the delineations among the varying types of case studies, dividing the various case study methods by their functions, with a marked divide between theory-building and non-theory-building methods. Non-theory-building case studies are generally descriptive, and interpretive methods that apply theory to a case or context allow researchers to better understand the phenomena observed (Lijphart, 1971; Merriam, 1985). Conversely, theory-building case studies focus on hypothesis generation, theory confirming, theory informing, or theory refuting (Lijphart, 1971; Merriam, 1985). Though there are many definitions and methods labeled as 'case studies,' for the purpose of this study, Yin's (1981) definition of a case study is used. Yin (1981) defined a case study as a method to examine "(a) a contemporary phenomenon in its real-life context, especially when (b) the boundaries between phenomenon and context are not clearly evident" (p. 59). My study seeks to apply theory and study phenomena in their context, as I examine teachers' practice in the context of their respective classrooms. This study focuses on the lived experiences of both the teacher and researcher stakeholders within the study. Specifically, I interviewed teachers who participated in a year-long teacher-in-residence (TiR) program.
In addition, researchers/content experts who conceptualized the LP were also interviewed. Because the TiR experience was a form of professional development, I studied the impact it had on participants' perceptions of the LP and any teacher-reported changes in their respective classrooms. However, because beliefs influence the language we use to describe phenomena (such as learning and teaching), it is also informative to describe patterns in how LP developers explain learning and teaching. Subsequently, the results of this study will inform the literature on both science teacher professional development and LP theory-to-practice.
LeVine, Michael V.; Weinstein, Harel
2014-01-01
Complex networks of interacting residues and microdomains in the structures of biomolecular systems underlie the reliable propagation of information from an input signal, such as the concentration of a ligand, to sites that generate the appropriate output signal, such as enzymatic activity. This information transduction often carries the signal across relatively large distances at the molecular scale in a form of allostery that is essential for the physiological functions performed by biomolecules. While allosteric behaviors have been documented from experiments and computation, the mechanism of this form of allostery proved difficult to identify at the molecular level. Here, we introduce a novel analysis framework, called N-body Information Theory (NbIT) analysis, which is based on information theory and uses measures of configurational entropy in a biomolecular system to identify microdomains and individual residues that act as (i)-channels for long-distance information sharing between functional sites, and (ii)-coordinators that organize dynamics within functional sites. Application of the new method to molecular dynamics (MD) trajectories of the occluded state of the bacterial leucine transporter LeuT identifies a channel of allosteric coupling between the functionally important intracellular gate and the substrate binding sites known to modulate it. NbIT analysis is shown also to differentiate residues involved primarily in stabilizing the functional sites, from those that contribute to allosteric couplings between sites. NbIT analysis of MD data thus reveals rigorous mechanistic elements of allostery underlying the dynamics of biomolecular systems. PMID:24785005
Neuropsychology 3.0: Evidence-Based Science and Practice
Bilder, Robert M.
2011-01-01
Neuropsychology is poised for transformations of its concepts and methods, leveraging advances in neuroimaging, the human genome project, psychometric theory, and information technologies. It is argued that a paradigm shift towards evidence-based science and practice can be enabled by innovations, including: (1) formal definition of neuropsychological concepts and tasks in cognitive ontologies; (2) creation of collaborative neuropsychological knowledgebases; and (3) design of web-based assessment methods that permit free development, large-sample implementation, and dynamic refinement of neuropsychological tests and the constructs these aim to assess. This article considers these opportunities, highlights selected obstacles, and offers suggestions for stepwise progress towards these goals. PMID:21092355
REGIME CHANGES IN ECOLOGICAL SYSTEMS: AN INFORMATION THEORY APPROACH
We present our efforts at characterizing regime changes in ecological systems using information theory. We derive an expression for Fisher information based on sampling of the system trajectory as it evolves in the state space. The Fisher information index as we have derived it captures the characte...
Evidence conflict measure based on OWA operator in open world
Wang, Shiyu; Liu, Xiang; Zheng, Hanqing; Wei, Boya
2017-01-01
Dempster-Shafer evidence theory has been extensively used in many information fusion systems since it was proposed by Dempster and extended by Shafer. Many studies have been conducted on conflict management in Dempster-Shafer evidence theory in past decades. However, how to determine a potent parameter for measuring evidence conflict when the given environment is an open world, namely when the frame of discernment is incomplete, is still an open issue. In this paper, a new method is presented that combines the generalized conflict coefficient, the generalized evidence distance, and the generalized interval correlation coefficient, based on the ordered weighted averaging (OWA) operator, to measure the conflict of evidence. Through an ordered weighted average of these three parameters, the combined coefficient can still measure conflict effectively when one or two parameters are not valid. Several numerical examples demonstrate the effectiveness of the proposed method. PMID:28542271
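The OWA aggregation step itself is simple; a minimal Python sketch follows, with hypothetical conflict indicators and weights standing in for the paper's three generalized parameters.

```python
import numpy as np

def owa(values, weights):
    """Ordered weighted averaging: sort inputs in descending order,
    then take the dot product with a fixed weight vector."""
    w = np.asarray(weights, float)
    assert np.isclose(w.sum(), 1.0) and np.all(w >= 0)
    return float(np.dot(np.sort(values)[::-1], w))

# Three hypothetical conflict indicators between two bodies of evidence
# (coefficient-, distance-, and correlation-based, respectively):
indicators = [0.70, 0.40, 0.55]
print(owa(indicators, [0.5, 0.3, 0.2]))
```

Because the weights attach to ranks rather than to particular indicators, a single invalid indicator shifts but does not break the aggregate, which is the robustness property the abstract emphasizes.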
Bootstrapping conformal field theories with the extremal functional method.
El-Showk, Sheer; Paulos, Miguel F
2013-12-13
The existence of a positive linear functional acting on the space of (differences between) conformal blocks has been shown to rule out regions in the parameter space of conformal field theories (CFTs). We argue that at the boundary of the allowed region the extremal functional contains, in principle, enough information to determine the dimensions and operator product expansion (OPE) coefficients of an infinite number of operators appearing in the correlator under analysis. Based on this idea we develop the extremal functional method (EFM), a numerical procedure for deriving the spectrum and OPE coefficients of CFTs lying on the boundary (of solution space). We test the EFM by using it to rederive the low lying spectrum and OPE coefficients of the two-dimensional Ising model based solely on the dimension of a single scalar quasiprimary--no Virasoro algebra required. Our work serves as a benchmark for applications to more interesting, less known CFTs in the near future.
Pangenesis as a source of new genetic information. The history of a now disproven theory.
Bergman, Gerald
2006-01-01
Evolution is based on natural selection of existing biological phenotypic traits. Natural selection can only eliminate traits. It cannot create new ones, requiring a theory to explain the origin of new genetic information. The theory of pangenesis was a major attempt to explain the source of new genetic information required to produce phenotypic variety. This theory, advocated by Darwin as the main source of genetic variety, has now been empirically disproved. It is currently a theory mainly of interest to science historians.
Information Theory and the Earth's Density Distribution
NASA Technical Reports Server (NTRS)
Rubincam, D. P.
1979-01-01
An argument is presented for using the information theory approach as an inference technique in solid earth geophysics. A spherically symmetric density distribution is derived as an example of the method. A simple model of the earth, plus knowledge of its mass and moment of inertia, leads to a density distribution that is surprisingly close to the optimum distribution. Future directions for the information theory approach in solid earth geophysics, as well as its strengths and weaknesses, are discussed.
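Schematically, the information-theoretic inference has the standard maximum-entropy form sketched below; this is a generic setup consistent with the abstract, and the paper's exact functional and constraints may differ.

```latex
% Maximize the entropy of the density distribution
S[\rho] = -\int \rho(\mathbf{r}) \ln \rho(\mathbf{r}) \, dV
% subject to the observed mass and moment of inertia,
M = \int \rho \, dV, \qquad I = \int \rho \, r_{\perp}^{2} \, dV,
% which by Lagrange multipliers yields an exponential-family profile
\rho(\mathbf{r}) \propto \exp\!\left(-\lambda_{1} - \lambda_{2} \, r_{\perp}^{2}\right).
```

The multipliers are then fixed numerically so that the profile reproduces the measured mass and moment of inertia.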
Using Rasch Analysis to Inform Rating Scale Development
ERIC Educational Resources Information Center
Van Zile-Tamsen, Carol
2017-01-01
The use of surveys, questionnaires, and rating scales to measure important outcomes in higher education is pervasive, but reliability and validity information is often based on problematic Classical Test Theory approaches. Rasch Analysis, based on Item Response Theory, provides a better alternative for examining the psychometric quality of rating…
Concepts and Measurements for Manpower and Occupational Analysis.
ERIC Educational Resources Information Center
Scoville, James G.
This volume contains information on occupational data and their uses, jobs-theories, case studies, and improved data bases. A survey was made of current applications of occupational information data and conceptual bases and practical shortcomings of the more frequently used classification systems. In addition, an economic theory was developed to…
Scandurra, I; Hägglund, M; Koch, S
2008-08-01
This paper presents a new multi-disciplinary method for user needs analysis and requirements specification in the context of health information systems based on established theories from the fields of participatory design and computer supported cooperative work (CSCW). Whereas conventional methods imply a separate, sequential needs analysis for each profession, the "multi-disciplinary thematic seminar" (MdTS) method uses a collaborative design process. Application of the method in elderly homecare resulted in prototypes that were well adapted to the intended user groups. Vital information in the points of intersection between different care professions was elicited and a holistic view of the entire care process was obtained. Health informatics-usability specialists and clinical domain experts are necessary to apply the method. Although user needs acquisition can be time-consuming, MdTS was perceived to efficiently identify in-context user needs, and transformed these directly into requirements specifications. Consequently the method was perceived to expedite the entire ICT implementation process.
Momeni, Ali; Rouhi, Kasra; Rajabalipanah, Hamid; Abdolali, Ali
2018-04-18
Inspired by information theory, a new concept of re-programmable encrypted graphene-based coding metasurfaces was investigated at terahertz frequencies. A channel-coding function was proposed to convolutionally record an arbitrary information message onto unrecognizable but recoverable parity beams generated by a phase-encrypted coding metasurface. A single graphene-based reflective cell with dual-mode biasing voltages was designed to act as the "0" and "1" meta-atoms, providing broadband opposite reflection phases. By exploiting graphene tunability, the proposed scheme enabled an unprecedented degree of freedom in the real-time mapping of information messages onto multiple parity beams which could not be damaged, altered, or reverse-engineered. Various encryption types such as mirroring, anomalous reflection, multi-beam generation, and scattering diffusion can be dynamically attained via the multifunctional metasurface. Besides, contrary to conventional time-consuming, optimization-based methods, this paper offers a fast, straightforward, and efficient design of diffusion metasurfaces of arbitrarily large size. Rigorous full-wave simulations corroborated the results: the phase-encrypted metasurfaces exhibited a polarization-insensitive reflectivity of less than -10 dB over a broadband frequency range from 1 THz to 1.7 THz. This work reveals new opportunities for the extension of re-programmable THz coding metasurfaces and may be of interest for reflection-type security systems, computational imaging, and camouflage technology.
Morphology-Induced Information Transfer in Bat Sonar
NASA Astrophysics Data System (ADS)
Reijniers, Jonas; Vanderelst, Dieter; Peremans, Herbert
2010-10-01
It has been argued that an important part of understanding bat echolocation comes down to understanding the morphology of the bat sound processing apparatus. In this Letter we present a method based on information theory that allows us to assess target localization performance of bat sonar, without a priori knowledge on the position, size, or shape of the reflecting target. We demonstrate this method using simulated directivity patterns of the frequency-modulated bat Micronycteris microtis. The results of this analysis indicate that the morphology of this bat’s sound processing apparatus has evolved to be a compromise between sensitivity and accuracy with the pinnae and the noseleaf playing different roles.
The cell monolayer trajectory from the system state point of view.
Stys, Dalibor; Vanek, Jan; Nahlik, Tomas; Urban, Jan; Cisar, Petr
2011-10-01
Time-lapse microscopic movies are increasingly utilized for understanding the derivation of cell states and predicting future cell behavior. Often, fluorescence and other types of labeling are not available or desirable, and cell-state definitions based on observable structures must be used. We present a methodology for cell behavior recognition and prediction based on short-term analysis of recurrent cell behavior. This approach has theoretical justification in nonlinear dynamics theory. The methodology is based on general stochastic systems theory, which allows us to define the cell states, the trajectory and the system itself. As a first step, we introduce a novel image content descriptor for cell-state characterization based on the information contribution (gain) of each image point. The linkage between the method and general systems theory is presented as a general frame for interpreting cell behavior. We also discuss extended cell description, system theory and methodology for future development. This methodology may be used for many practical purposes, ranging from advanced, medically relevant, precise cell culture diagnostics to utilitarian cell recognition against a noisy or uneven image background. In addition, the results are theoretically justified.
Error-based Extraction of States and Energy Landscapes from Experimental Single-Molecule Time-Series
NASA Astrophysics Data System (ADS)
Taylor, J. Nicholas; Li, Chun-Biu; Cooper, David R.; Landes, Christy F.; Komatsuzaki, Tamiki
2015-03-01
Characterization of states, the essential components of the underlying energy landscapes, is one of the most intriguing subjects in single-molecule (SM) experiments due to the existence of noise inherent to the measurements. Here we present a method to extract the underlying state sequences from experimental SM time-series. Taking into account empirical error and the finite sampling of the time-series, the method extracts a steady-state network which provides an approximation of the underlying effective free energy landscape. The core of the method is the application of rate-distortion theory from information theory, allowing the individual data points to be assigned to multiple states simultaneously. We demonstrate the method's proficiency in its application to simulated trajectories as well as to experimental SM fluorescence resonance energy transfer (FRET) trajectories obtained from isolated agonist binding domains of the AMPA receptor, an ionotropic glutamate receptor that is prevalent in the central nervous system.
A Multi-level Fuzzy Evaluation Method for Smart Distribution Network Based on Entropy Weight
NASA Astrophysics Data System (ADS)
Li, Jianfang; Song, Xiaohui; Gao, Fei; Zhang, Yu
2017-05-01
Smart distribution networks are considered the future of the distribution network. In order to comprehensively evaluate the construction level of smart distribution networks and to guide the practice of smart distribution construction, a multi-level fuzzy evaluation method based on entropy weight is proposed. Firstly, focusing on both the conventional characteristics of distribution networks and new characteristics of smart distribution networks such as self-healing and interaction, a multi-level evaluation index system covering power supply capability, power quality, economy, reliability and interaction is established. Then, a combination weighting method based on the Delphi method and the entropy weight method is put forward, which takes into account not only the importance of each evaluation index in the experts' subjective view but also the objective information carried by the index values. Thirdly, a multi-level evaluation method based on fuzzy theory is put forward. Lastly, an example based on the statistical data of some cities' distribution networks is conducted, and the evaluation method is shown to be effective and rational.
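A minimal sketch of the objective half of that weighting scheme (the entropy weight step) is shown below; the decision matrix, its shape, and the assumption that all indices are positive benefit-type values are illustrative, and the subjective Delphi weights would be combined with these in a second step.

```python
import numpy as np

# Entropy weight method: indices whose values diverge more across
# alternatives carry more objective information and get larger weights.
def entropy_weights(X):
    m, n = X.shape
    P = X / X.sum(axis=0)                    # column-wise proportions
    # Shannon entropy of each index, normalized to [0, 1] by ln(m).
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(P > 0, P * np.log(P), 0.0)
    e = -plogp.sum(axis=0) / np.log(m)
    d = 1.0 - e                              # degree of divergence
    return d / d.sum()                       # entropy weights

X = np.array([[0.82, 0.65, 0.90],
              [0.75, 0.80, 0.60],
              [0.90, 0.70, 0.75]])           # toy index values
print(entropy_weights(X))
```

A simple combination rule, for example w_j proportional to (Delphi weight) times (entropy weight) renormalized to sum to one, would then merge the subjective and objective weightings.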
Eigencentrality based on dissimilarity measures reveals central nodes in complex networks
Alvarez-Socorro, A. J.; Herrera-Almarza, G. C.; González-Díaz, L. A.
2015-01-01
One of the most important problems in complex network theory is locating the entities that are essential or play a main role within the network. For this purpose, the use of dissimilarity measures (from classification theory and data mining) to enrich the centrality measures of complex networks is proposed. The centrality method used is eigencentrality, which is based on the heuristic that the centrality of a node depends on how central its immediate neighbours are (a rich-get-richer phenomenon). This can be described by an eigenvalue problem; however, the information about the neighbourhood and the connections between neighbours is not taken into account, neglecting its relevance when one evaluates the centrality/importance/influence of a node. The contribution calculated by the dissimilarity measure is parameter independent, making the proposed method parameter independent as well. Finally, we perform a comparative study of our method versus other methods reported in the literature, obtaining more accurate and computationally cheaper results in most cases. PMID:26603652
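One plausible reading of this construction is sketched below, under the assumption that each edge is re-weighted by the Euclidean dissimilarity between its endpoints' neighbourhood vectors (the abstract leaves the exact dissimilarity open); all names and data are illustrative.

```python
import numpy as np

# Dissimilarity-enriched eigencentrality: re-weight existing edges by
# how different the two endpoints' neighbourhoods are, then take the
# leading eigenvector by power iteration. O(n^2) pairwise distances,
# so this form is only suitable for small graphs.
def dissim_eigencentrality(A, iters=200, tol=1e-10):
    n = A.shape[0]
    D = np.linalg.norm(A[:, None, :] - A[None, :, :], axis=2)
    W = A * D                     # keep only existing edges, re-weighted
    x = np.ones(n) / n
    for _ in range(iters):        # power iteration for leading eigenvector
        x_new = W @ x
        x_new /= np.linalg.norm(x_new)
        if np.linalg.norm(x_new - x) < tol:
            break
        x = x_new
    return x

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)    # toy undirected graph
print(dissim_eigencentrality(A))
```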
Towards a Formal Basis for Modular Safety Cases
NASA Technical Reports Server (NTRS)
Denney, Ewen; Pai, Ganesh
2015-01-01
Safety assurance using argument-based safety cases is accepted best practice in many safety-critical sectors. Goal Structuring Notation (GSN), which is widely used for presenting safety arguments graphically, provides a notion of modular arguments to support the goal of incremental certification. Despite efforts at standardization, GSN remains an informal notation, and the GSN standard contains appreciable ambiguity, especially concerning modular extensions. This, in turn, presents challenges when developing tools and methods to intelligently manipulate modular GSN arguments. This paper develops the elements of a theory of modular safety cases, leveraging our previous work on formalizing GSN arguments. Using example argument structures, we highlight some ambiguities arising from the existing guidance, present the intuition underlying the theory, clarify syntax, and address modular arguments, contracts, and the well-formedness and well-scopedness of modules. Based on this theory, we have a preliminary implementation of modular arguments in our toolset, AdvoCATE.
An ambiguity of information content and error in an ill-posed satellite inversion
NASA Astrophysics Data System (ADS)
Koner, Prabhat
According to Rodgers (2000, stochastic approach), the averaging kernel (AK) is the representational matrix for understanding the information content of a stochastic inversion. In the deterministic approach, the corresponding object is the model resolution matrix (MRM; Menke 1989). Analysis of the AK/MRM can only give some understanding of how much regularization is imposed on the inverse problem. The trace of the AK/MRM matrix is the so-called degrees of freedom for signal (DFS; stochastic) or degrees of freedom in retrieval (DFR; deterministic). The literature offers no physical/mathematical explanation of why the trace of this matrix is a valid way to calculate this quantity. We will present an ambiguity between information and error using a real-life problem: SST retrieval from GOES13. The stochastic information content calculation rests on a linearity assumption, and the validity of this mathematics for satellite inversion will be questioned, because such inversion involves nonlinear radiative transfer and ill-conditioned inverse problems. References: Menke, W., 1989: Geophysical Data Analysis: Discrete Inverse Theory. San Diego: Academic Press. Rodgers, C.D., 2000: Inverse Methods for Atmospheric Soundings: Theory and Practice. Singapore: World Scientific.
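For concreteness, a small numerical sketch of the quantity under discussion follows, using the standard linear optimal-estimation expressions from Rodgers (2000); the Jacobian and covariance matrices are arbitrary stand-ins, not GOES13 values.

```python
import numpy as np

# For a linear(ized) retrieval with Jacobian K, noise covariance S_e
# and prior covariance S_a, the averaging kernel is
#   A = (K^T S_e^-1 K + S_a^-1)^-1 K^T S_e^-1 K
# and DFS = trace(A) (Rodgers 2000). Matrices here are illustrative.
rng = np.random.default_rng(0)
K = rng.normal(size=(8, 4))            # 8 channels, 4 state elements
S_e = 0.01 * np.eye(8)                 # measurement noise covariance
S_a = np.eye(4)                        # prior covariance

G = np.linalg.solve(K.T @ np.linalg.inv(S_e) @ K + np.linalg.inv(S_a),
                    K.T @ np.linalg.inv(S_e))      # gain matrix
A = G @ K                                          # averaging kernel
print("DFS =", np.trace(A))
```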
An improved resource management model based on MDS
NASA Astrophysics Data System (ADS)
Yuan, Man; Sun, Changying; Li, Pengfei; Sun, Yongdong; He, Rui
2005-11-01
Grid technology provides a convenient means of managing grid resources through the so-called Monitoring and Discovery Service (MDS) proposed by the Globus Alliance. In a grid environment, resources of all kinds, such as computational resources and storage resources, can be organized according to MDS specifications. However, MDS is a theoretical framework, and in a small-world intranet with limited resources it has its own limitations. Based on MDS, an improved lightweight method for managing corporate computational and storage resources in an intranet (IMDS) is proposed. Firstly, in MDS, resource description information is stored in LDAP; although LDAP is a lightweight directory access protocol, in practice programmers rarely master accessing and storing resource information in an LDAP store, which limits the adoption of MDS. In an intranet, this resource description information can instead be stored in an RDBMS, where programmers and users can access it with standard SQL. Secondly, in MDS, how the various grid resources are monitored is not transparent to programmers and users, which limits its scope of application; since resource monitoring based on SNMP is widely employed in intranets, an SNMP-based resource monitoring method is integrated into MDS. Finally, all kinds of resources in the intranet are described in XML, their description information is stored in an RDBMS such as MySQL and retrieved with standard SQL, and dynamic resource information is sent to the resource store via SNMP. A prototype of this resource description and monitoring scheme is designed and implemented in an intranet.
Riddlesperger, K L; Beard, M; Flowers, D L; Hisley, S M; Pfeifer, K A; Stiller, J J
1996-09-01
Since the 1980s the electronic domain has become the primary method for academic and professional communication of research and information. Papers relating to theory construction in nursing are a frequently occurring phenomenon within the electronic domain. Theory construction provides the underpinning for the advancement of professional nursing, facilitating the conceptualization of nursing actions leading to theory-based practice and research. The purpose of this study was to address the research question, 'What are the similarities and differences among theory construction papers that are accessible electronically in nursing literature today?' The Cumulative Index to Nursing and Allied Health Literature (CINAHL) was accessed to obtain a listing of papers from which an overall description of the type of theory construction papers being published in the nursing literature today could be determined. A literature search was conducted using the description 'theory construction'. Papers were limited to publication years from 1990 onwards. A total of 125 papers were obtained and read by one of the six authors. Using grounded theory, categories emerged by identification of similarities and differences among the papers. The findings are discussed here along with suggestions for further study. A second purpose of this paper was to present both traditional and non-traditional methods of tapping into the electronic domain when searching for assistance with theory construction.
Aligning observed and modelled behaviour based on workflow decomposition
NASA Astrophysics Data System (ADS)
Wang, Lu; Du, YuYue; Liu, Wei
2017-09-01
When business processes are mostly supported by information systems, the availability of event logs generated from these systems, as well as the requirement for appropriate process models, is increasing. Business processes can be discovered, monitored and enhanced by extracting process-related information. However, some events cannot be correctly identified because of the explosion in the volume of event logs. Therefore, a new process mining technique based on a workflow decomposition method is proposed in this paper. Petri nets (PNs) are used to describe business processes, and conformance checking of event logs against process models is investigated. A decomposition approach is proposed to divide large process models and event logs into several separate parts that can be analysed independently, while an alignment approach based on the state equation method in PN theory enhances the performance of conformance checking. Both approaches are implemented in the process mining framework ProM. The correctness and effectiveness of the proposed methods are illustrated through experiments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Man, Jun; Zhang, Jiangjiang; Li, Weixuan
2016-10-01
The ensemble Kalman filter (EnKF) has been widely used in parameter estimation for hydrological models. The focus of most previous studies was to develop more efficient analysis (estimation) algorithms. On the other hand, it is intuitively understandable that a well-designed sampling (data-collection) strategy should provide more informative measurements and subsequently improve the parameter estimation. In this work, a Sequential Ensemble-based Optimal Design (SEOD) method, coupled with EnKF, information theory and sequential optimal design, is proposed to improve the performance of parameter estimation. Based on the first-order and second-order statistics, different information metrics including the Shannon entropy difference (SD), degrees of freedom for signal (DFS) and relative entropy (RE) are used to design the optimal sampling strategy, respectively. The effectiveness of the proposed method is illustrated by synthetic one-dimensional and two-dimensional unsaturated flow case studies. It is shown that the designed sampling strategies can provide more accurate parameter estimation and state prediction compared with conventional sampling strategies. Optimal sampling designs based on various information metrics perform similarly in our cases. The effect of ensemble size on the optimal design is also investigated. Overall, larger ensemble size improves the parameter estimation and convergence of optimal sampling strategy. Although the proposed method is applied to unsaturated flow problems in this study, it can be equally applied in any other hydrological problems.
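As an illustration of one of those metrics, here is a minimal sketch of the Shannon entropy difference between prior and posterior parameter ensembles under a Gaussian (first- and second-order statistics) approximation; the ensembles below are synthetic stand-ins.

```python
import numpy as np

# SD = 0.5 * (ln det C_prior - ln det C_post) for Gaussian ensembles;
# a larger value means the candidate design is more informative.
def shannon_entropy_difference(prior, posterior):
    C0 = np.cov(prior, rowvar=False)       # rows = ensemble members
    C1 = np.cov(posterior, rowvar=False)
    _, logdet0 = np.linalg.slogdet(C0)
    _, logdet1 = np.linalg.slogdet(C1)
    return 0.5 * (logdet0 - logdet1)

rng = np.random.default_rng(0)
prior = rng.normal(size=(500, 3))                  # wide prior ensemble
posterior = 0.4 * rng.normal(size=(500, 3))        # shrunk after update
print(shannon_entropy_difference(prior, posterior))
```

In a sequential design loop, the candidate measurement maximizing such a metric would be selected at each step before the next EnKF update.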
The Global Positioning System: Theory and operation
NASA Astrophysics Data System (ADS)
Tucker, Lester Plunkett
Scope and method of study. The purpose of this study is to document the theory, development, and training needs of the Global Positioning System for the United States Air Force. Very little information existed in this subject area, and assessing the United States Air Force's training needs required an investigation into the existing training on the Global Positioning System. The United States Air Force has only one place from which to obtain the data: Headquarters Air Education and Training Command. Findings and conclusion. The United States Air Force, at the time of this study, did not have a theory and operations course dealing with this newest technological advancement in world navigation. Although the new technology is being provided on aircraft in the form of new navigation hardware, no official course of study dealing with its theory and operation is provided by the United States Air Force to its pilots and navigators. Based on the latest reports dealing with the Global Positioning System, a course on the Global Positioning System was developed in the Instructional Systems Design format to provide background information and understanding of this new technology. Readers of this study must be aware that the information contained here is very dynamic; technology is advancing so fast in this area that it may render this information obsolete in a short amount of time.
NASA Astrophysics Data System (ADS)
Saracco, Ginette; Moreau, Frédérique; Mathé, Pierre-Etienne; Hermitte, Daniel; Michel, Jean-Marie
2007-10-01
We have previously developed a method for characterizing and localizing 'homogeneous' buried sources from the measurement of potential anomalies at a fixed height above ground (magnetic, electric and gravity). This method is based on potential theory and uses the properties of the Poisson kernel (real by definition) and continuous wavelet theory. Here, we relax the assumption on sources and introduce a method that we call 'multiscale tomography'. Our approach is based on the harmonic extension of the observed magnetic field to produce a complex source by use of a complex Poisson kernel solution of the Laplace equation for a complex potential field. A phase and modulus are defined. We show that the phase provides additional information on the total magnetic inclination and the structure of sources, while the modulus allows us to characterize their spatial location, depth and 'effective degree'. This method is compared to 'complex dipolar tomography', an extension of the Patella method that we previously developed. We applied both methods, together with classical electrical resistivity tomography, to detect and localize buried archaeological structures such as antique ovens from magnetic measurements at the Fox-Amphoux site (France). The estimates are then compared with the results of excavations.
Voxel- and Graph-Based Point Cloud Segmentation of 3D Scenes Using Perceptual Grouping Laws
NASA Astrophysics Data System (ADS)
Xu, Y.; Hoegner, L.; Tuttas, S.; Stilla, U.
2017-05-01
Segmentation is the fundamental step for recognizing and extracting objects from the point cloud of a 3D scene. In this paper, we present a strategy for point cloud segmentation using a voxel structure and graph-based clustering with perceptual grouping laws, which allows a learning-free, completely automatic, but parametric solution for segmenting 3D point clouds. More precisely, two segmentation methods utilizing voxel and supervoxel structures are reported and tested. The voxel-based data structure increases the efficiency and robustness of the segmentation process, suppressing the negative effects of noise, outliers, and uneven point densities. The clustering of voxels and supervoxels is carried out using graph theory on the basis of local contextual information, whereas conventional clustering algorithms commonly use merely pairwise information. By the use of perceptual laws, our method conducts the segmentation in a purely geometric way, avoiding the use of RGB color and intensity information, so that it can be applied to more general applications. Experiments using different datasets have demonstrated that our proposed methods achieve good results, especially for complex scenes and nonplanar object surfaces. Quantitative comparisons between our methods and other representative segmentation methods also confirm the effectiveness and efficiency of our proposals.
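The voxel structure that underpins both variants can be sketched in a few lines; the voxel size and the hashing scheme below are illustrative choices, not the paper's exact parameters.

```python
import numpy as np
from collections import defaultdict

# Hash points into cubic voxels; the occupied voxels later serve as
# graph nodes for the perceptual-grouping-based clustering.
def voxelize(points, voxel_size=0.1):
    keys = np.floor(points / voxel_size).astype(np.int64)
    voxels = defaultdict(list)
    for idx, key in enumerate(map(tuple, keys)):
        voxels[key].append(idx)       # voxel -> indices of member points
    return voxels

points = np.random.rand(1000, 3)      # toy point cloud in a unit cube
voxels = voxelize(points, 0.1)
print(len(voxels), "occupied voxels")
```

A region adjacency graph over occupied voxels, with edge weights derived from grouping cues such as proximity and similarity of estimated normals, would then feed the graph-based clustering stage.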
Application of fuzzy system theory in addressing the presence of uncertainties
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.
In this paper, combinations of fuzzy system theory with finite element methods are presented and discussed as a way of dealing with uncertainties. Uncertainties must be addressed in order to prevent the failure of materials in engineering. There are three types of uncertainty: stochastic, epistemic and error uncertainties. In this paper, epistemic uncertainties are considered; epistemic uncertainty exists as a result of incomplete information and a lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than a statistical approach for interpreting uncertainty when data are lacking. Fuzzy system theory involves a number of processes, starting with the conversion of crisp inputs to fuzzy inputs through fuzzification, followed by the main process, known as mapping; mapping here means the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on the extension principle method. In the final stage, defuzzification is implemented; defuzzification is the important process that converts the fuzzy outputs back into crisp outputs. Several illustrative examples are given, and the simulations show that the proposed method produces more conservative results than the conventional finite element method.
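To make the extension-principle step concrete, here is a minimal numerical sketch using alpha-cuts of a triangular fuzzy input; the fuzzy modulus, the toy response function, and the sampling density are all assumptions of the sketch, not the paper's model.

```python
import numpy as np

# Propagate a triangular fuzzy number through a function via the
# extension principle: for each alpha level, push the alpha-cut
# interval through the function by dense sampling and take min/max.
def alpha_cut(tri, alpha):
    a, b, c = tri                      # triangular number (left, peak, right)
    return a + alpha * (b - a), c - alpha * (c - b)

def propagate(tri, func, alphas=np.linspace(0, 1, 11), samples=1001):
    cuts = []
    for alpha in alphas:
        lo, hi = alpha_cut(tri, alpha)
        x = np.linspace(lo, hi, samples)
        y = func(x)                    # image of the alpha-cut interval
        cuts.append((alpha, y.min(), y.max()))
    return cuts

# Toy fuzzy Young's modulus through a toy displacement model.
E = (190e9, 200e9, 215e9)
for alpha, lo, hi in propagate(E, lambda e: 1.0e6 / e):
    print(f"alpha={alpha:.1f}: [{lo:.3e}, {hi:.3e}]")
```

A defuzzification rule such as the centroid of the resulting membership function would then return a crisp output.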
ERIC Educational Resources Information Center
Holden, Richard J.; Karsh, Ben-Tzion
2009-01-01
Primary objective: much research and practice related to the design and implementation of information technology in health care has been atheoretical. It is argued that using extant theory to develop testable models of health information technology (HIT) benefits both research and practice. Methods and procedures: several theories of motivation,…
ERIC Educational Resources Information Center
Roberts, Leah; González Alonso, Jorge; Pliatsikas, Christos; Rothman, Jason
2018-01-01
This special issue is a testament to the recent burgeoning interest by theoretical linguists, language acquisitionists and teaching practitioners in the neuroscience of language. It offers a highly valuable, state-of-the-art overview of the neurophysiological methods that are currently being applied to questions in the field of second language…
ERIC Educational Resources Information Center
FRANKLIN, PAULA; FRANKLIN, RICHARD
This National Training Laboratories (NTL) conference, departing somewhat from its usual experience-based learning programs, focused laboratory training methods on the decision-making process in urban community problem solving. The conference presented theory, information, and opinion on the nature of cities and their decision-making processes.…
Promoting Food Safety Awareness for Older Adults by Using Online Education Modules
ERIC Educational Resources Information Center
Roy, Amber; Francis, Sarah L.; Shaw, Angela; Rajagopal, Lakshman
2016-01-01
Older adults are susceptible to and at greater risk for food-borne illness in comparison to those in other adult age groups. Online education is an underused method for the delivery of food safety information to this population. Three online mini-modules, based on social marketing theory (SMT), were created for and pilot-tested with older adults.…
Chiral limit of N = 4 SYM and ABJM and integrable Feynman graphs
NASA Astrophysics Data System (ADS)
Caetano, João; Gürdoğan, Ömer; Kazakov, Vladimir
2018-03-01
We consider a special double scaling limit, recently introduced by two of the authors, combining weak coupling and large imaginary twist, for the γ-twisted N = 4 SYM theory. We also establish the analogous limit for ABJM theory. The resulting non-gauge chiral 4D and 3D theories of interacting scalars and fermions are integrable in the planar limit. In spite of the breakdown of conformality by double-trace interactions, most of the correlators for local operators of these theories are conformal, with non-trivial anomalous dimensions defined by specific, integrable Feynman diagrams. We discuss the details of this diagrammatics. We construct the doubly-scaled asymptotic Bethe ansatz (ABA) equations for multi-magnon states in these theories. Each entry of the mixing matrix of local conformal operators in the simplest of these theories — the bi-scalar model in 4D and tri-scalar model in 3D — is given by a single Feynman diagram at any given loop order. The related diagrams are in principle computable, up to a few scheme dependent constants, by integrability methods (quantum spectral curve or ABA). These constants should be fixed from direct computations of a few simplest graphs. This integrability-based method is advocated to be able to provide information about some high loop order graphs which are hardly computable by other known methods. We exemplify our approach with specific five-loop graphs.
Rigorous Electromagnetic Analysis of the Focusing Action of Refractive Cylindrical Microlens
NASA Astrophysics Data System (ADS)
Liu, Juan; Gu, Ben-Yuan; Dong, Bi-Zhen; Yang, Guo-Zhen
The focusing action of a refractive cylindrical microlens is investigated based on rigorous electromagnetic theory with the use of the boundary element method. The focusing behaviors of refractive microlenses with continuous and multilevel surface envelopes are characterized in terms of total electric-field patterns, the electric-field intensity distributions on the focal plane, and the diffraction efficiencies at the focal spots. The obtained results are also compared with those obtained by Kirchhoff's scalar diffraction theory. The present numerical and graphical results may provide useful information for the analysis and design of refractive elements in micro-optics.
NASA Technical Reports Server (NTRS)
Slemp, Wesley C. H.; Kapania, Rakesh K.; Tessler, Alexander
2010-01-01
Computation of interlaminar stresses from the higher-order shear and normal deformable beam theory and the refined zigzag theory was performed using the Sinc method based on Interpolation of Highest Derivative. The Sinc method based on Interpolation of Highest Derivative was proposed as an efficient method for determining through-the-thickness variations of interlaminar stresses from one- and two-dimensional analysis by integration of the equilibrium equations of three-dimensional elasticity. However, the use of traditional equivalent single layer theories often results in inaccuracies near the boundaries and when the laminae have extremely large differences in material properties. Interlaminar stresses in symmetric cross-ply laminated beams were obtained by solving the higher-order shear and normal deformable beam theory and the refined zigzag theory with the Sinc method based on Interpolation of Highest Derivative. Interlaminar stresses and bending stresses from the present approach were compared with a detailed finite element solution obtained by ABAQUS/Standard. The results illustrate the ease with which the Sinc method based on Interpolation of Highest Derivative can be used to obtain the through-the-thickness distributions of interlaminar stresses from the beam theories. Moreover, the results indicate that the refined zigzag theory is a substantial improvement over the Timoshenko beam theory due to the piecewise continuous displacement field which more accurately represents interlaminar discontinuities in the strain field. The higher-order shear and normal deformable beam theory more accurately captures the interlaminar stresses at the ends of the beam because it allows transverse normal strain. However, the continuous nature of the displacement field requires a large number of monomial terms before the interlaminar stresses are computed as accurately as the refined zigzag theory.
Segmentation of human face using gradient-based approach
NASA Astrophysics Data System (ADS)
Baskan, Selin; Bulut, M. Mete; Atalay, Volkan
2001-04-01
This paper describes a method for automatic segmentation of facial features such as eyebrows, eyes, nose, mouth and ears in color images. This work is an initial step for a wide range of applications based on feature-based approaches, such as face recognition, lip-reading, gender estimation, facial expression analysis, etc. The human face can be characterized by its skin color and nearly elliptical shape. For this purpose, face detection is performed using color and shape information. Uniform illumination is assumed. No restrictions on glasses, make-up, beard, etc. are imposed. Facial features are extracted using vertically and horizontally oriented gradient projections. The gradient of a minimum with respect to its neighboring maxima gives the boundaries of a facial feature. Each facial feature has a different horizontal characteristic. These characteristics were derived by extensive experimentation with many face images. Using fuzzy set theory, the similarity between the candidate and the feature characteristic under consideration is calculated. The gradient-based method is accompanied by anthropometrical information for robustness. Ear detection is performed using contour-based shape descriptors. This method detects the facial features and circumscribes each facial feature with the smallest rectangle possible. The AR database is used for testing. The developed method is also suitable for real-time systems.
Wang, Feng; Kaplan, Jess L.; Gold, Benjamin D.; Bhasin, Manoj K.; Ward, Naomi L.; Kellermayer, Richard; Kirschner, Barbara S.; Heyman, Melvin B.; Dowd, Scot E.; Cox, Stephen B.; Dogan, Haluk; Steven, Blaire; Ferry, George D.; Cohen, Stanley A.; Baldassano, Robert N.; Moran, Christopher J.; Garnett, Elizabeth A.; Drake, Lauren; Otu, Hasan H.; Mirny, Leonid A.; Libermann, Towia A.; Winter, Harland S.; Korolev, Kirill
2016-01-01
The relationship between the host and its microbiota is challenging to understand because both microbial communities and their environment are highly variable. We developed a set of techniques to address this challenge based on population dynamics and information theory. These methods identified additional bacterial taxa associated with pediatric Crohn's disease and could detect significant changes in microbial communities with fewer samples than previous statistical approaches. We also substantially improved the accuracy of the diagnosis based on the microbiota from stool samples and found that the ecological niche of a microbe predicts its role in Crohn's disease. Bacteria typically residing in the lumen of healthy patients decrease in disease, while bacteria typically residing on the mucosa of healthy patients increase in disease. Our results also show that the associations with Crohn's disease are evolutionarily conserved and provide a mutual-information-based method to visualize dysbiosis. PMID:26804920
Vihstadt, Corrie; Evans, Roni
2015-01-01
Background: Online educational interventions to teach evidence-based practice (EBP) are a promising mechanism for overcoming some of the barriers to incorporating research into practice. However, attention must be paid to aligning strategies with adult learning theories to achieve optimal outcomes. Methods: We describe the development of a series of short self-study modules, each covering a small set of learning objectives. Our approach, informed by design-based research (DBR), involved 6 phases: analysis, design, design evaluation, redesign, development/implementation, and evaluation. Participants were faculty and students in 3 health programs at a complementary and integrative educational institution. Results: We chose a reusable learning object approach that allowed us to apply 4 main learning theories: events of instruction, cognitive load, dual processing, and ARCS (attention, relevance, confidence, satisfaction). A formative design evaluation suggested that the identified theories and instructional approaches were likely to facilitate learning and motivation. Summative evaluation was based on a student survey (N=116) that addressed how these theories supported learning. Results suggest that, overall, the selected theories helped students learn. Conclusion: The DBR approach allowed us to evaluate the specific intervention and theories for general applicability. This process also helped us define and document the intervention at a level of detail that covers almost all the proposed Guideline for Reporting Evidence-based practice Educational intervention and Teaching (GREET) items. This thorough description will facilitate the interpretation of future research and implementation of the intervention. Our approach can also serve as a model for others considering online EBP intervention development. PMID:26421233
Petrillo, Jennifer; Cano, Stefan J; McLeod, Lori D; Coon, Cheryl D
2015-01-01
To provide comparisons and a worked example of item- and scale-level evaluations based on three psychometric methods used in patient-reported outcome development (classical test theory (CTT), item response theory (IRT), and Rasch measurement theory (RMT)) in an analysis of the National Eye Institute Visual Functioning Questionnaire (VFQ-25). Baseline VFQ-25 data from 240 participants with diabetic macular edema from a randomized, double-masked, multicenter clinical trial were used to evaluate the VFQ at the total score level. CTT, RMT, and IRT evaluations were conducted, and results were assessed in a head-to-head comparison. Results were similar across the three methods, with IRT and RMT providing more detailed diagnostic information on how to improve the scale. CTT led to the identification of two problematic items that threaten the validity of the overall scale score, sets of redundant items, and skewed response categories. IRT and RMT additionally identified poor fit for one item, many locally dependent items, poor targeting, and disordering of over half the response categories. Selection of a psychometric approach depends on many factors. Researchers should justify their evaluation method and consider the intended audience. If the instrument is being developed for descriptive purposes and on a restricted budget, a cursory examination of the CTT-based psychometric properties may be all that is possible. In a high-stakes situation, such as the development of a patient-reported outcome instrument for consideration in pharmaceutical labeling, however, a thorough psychometric evaluation including IRT or RMT should be considered, with final item-level decisions made on the basis of both quantitative and qualitative results. Copyright © 2015. Published by Elsevier Inc.
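To make the CTT side of such a comparison concrete, here is a minimal sketch (not the authors' code) of two common scale-level diagnostics, Cronbach's alpha and corrected item-total correlations, run on randomly generated stand-in Likert data.

```python
import numpy as np

# Classical test theory diagnostics: internal consistency of the total
# score, and item-total correlations for flagging problematic items.
def cronbach_alpha(X):
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1)
    total_var = X.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def item_total_correlations(X):
    total = X.sum(axis=1)
    # Correlate each item with the total score excluding that item.
    return np.array([np.corrcoef(X[:, j], total - X[:, j])[0, 1]
                     for j in range(X.shape[1])])

X = np.random.randint(1, 6, size=(240, 25)).astype(float)  # toy responses
print(cronbach_alpha(X))
print(item_total_correlations(X).round(2))
```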
Francis, Jillian J; Grimshaw, Jeremy M; Zwarenstein, Merrick; Eccles, Martin P; Shiller, Susan; Godin, Gaston; Johnston, Marie; O'Rourke, Keith; Presseau, Justin; Tetroe, Jacqueline
2007-01-01
Background: A challenge for implementation researchers is to develop principles that could generate testable hypotheses that apply across a range of clinical contexts, thus leading to generalisability of findings. Such principles may be provided by systematically developed theories. The opportunity has arisen to test some of these theoretical principles in the Ontario Printed Educational Materials (OPEM) trial by conducting a sub-trial within the existing trial structure. OPEM is a large factorial cluster-randomised trial evaluating the effects of short directive and long discursive educational messages embedded into informed, an evidence-based newsletter produced in Canada by the Institute for Clinical Evaluative Sciences (ICES) and mailed to all primary care physicians in Ontario. The content of educational messages in the sub-trial will be constructed using both standard methods and methods inspired by psychological theory. The aim of this study is to test the effectiveness of the TheoRY-inspired MEssage ('TRY-ME') compared with the 'standard' message in changing prescribing behaviour. Methods: The OPEM trial participants randomised to receive the short directive message attached to the outside of informed (an 'outsert') will be sub-randomised to receive either a standard message or a message informed by the theory of planned behaviour (TPB) using a two (long insert or no insert) by three (theory-based outsert or standard outsert or no outsert) design. The messages will relate to prescription of thiazide diuretics as first-line drug treatment for hypertension (described in the accompanying protocol, "The Ontario Printed Educational Materials trial"). The short messages will be developed independently by two research teams. The primary outcome is prescription of thiazide diuretics, measured by routinely collected data available within ICES. The study is designed to answer the question: is there any difference in guideline adherence (i.e., thiazide prescription rates) between physicians in the six groups? A process evaluation survey instrument based on the TPB will be administered pre- and post-intervention (described in the accompanying protocol, "Looking inside the black box"). The second research question concerns processes that may underlie observed differences in prescribing behaviour. We expect that effects of the messages on prescribing behaviour will be mediated through changes in physicians' cognitions. Trial registration number: Current controlled trial ISRCTN72772651 PMID:18039363
Non-contact detection of cardiac rate based on visible light imaging device
NASA Astrophysics Data System (ADS)
Zhu, Huishi; Zhao, Yuejin; Dong, Liquan
2012-10-01
We have developed a non-contact method to detect the human cardiac rate at a distance under general lighting conditions. Using the video signal of the human face region captured by a webcam, we acquire the cardiac rate based on photoplethysmography (PPG) theory. The cardiac rate detection method mainly exploits the different absorptivities of blood at various light wavelengths. Firstly, we decompose the video signal into the R, G and B color channels and choose the face region as the region of interest, over which the average gray value is computed. Then, we draw three gray-mean curves, one for each color channel, with time as the variable. When the imaging device has good color fidelity, the green channel signal shows the photoplethysmographic information most clearly, but the red and blue channel signals can provide additional physiological information on account of the light absorption characteristics of blood. We divide the red channel signal by the green channel signal to acquire the pulse wave. Filtering the pulse wave signal with a passband from 0.67 Hz to 3 Hz and applying a frequency-spectrum superposition algorithm, we design a frequency extraction algorithm to obtain the cardiac rate. Finally, we experiment with 30 volunteers of different genders and ages. The results of the experiments are all in good agreement; the difference is about 2 bpm. From the experiments, we conclude that visible-light photoplethysmography can also be used to detect other physiological information.
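A minimal sketch of the rate-extraction pipeline described above (channel ratio, 0.67-3 Hz band-pass, spectral peak) is given below; simple FFT peak-picking stands in for the paper's frequency-spectrum superposition step, and the synthetic signals are stand-ins for real per-frame face-region means.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Estimate beats per minute from per-frame mean gray values of the
# face region in the red and green channels.
def cardiac_rate(red_mean, green_mean, fps):
    signal = red_mean / green_mean            # pulse-carrying ratio signal
    signal = signal - signal.mean()
    b, a = butter(3, [0.67, 3.0], btype="band", fs=fps)
    filtered = filtfilt(b, a, signal)
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    band = (freqs >= 0.67) & (freqs <= 3.0)
    f_peak = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * f_peak                      # beats per minute

fps = 30.0
t = np.arange(0, 30, 1 / fps)                 # 30 s toy recording
red = 100 + 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.randn(len(t))
green = 100 + np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.randn(len(t))
print(cardiac_rate(red, green, fps))          # ~72 bpm for a 1.2 Hz pulse
```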
McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W
2015-03-27
Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
Implications for Application of Qualitative Methods to Library and Information Science Research.
ERIC Educational Resources Information Center
Grover, Robert; Glazier, Jack
1985-01-01
Presents conceptual framework for library and information science research and analyzes research methodology that has application for information science, using as example results of study conducted by authors. Rationale for use of qualitative research methods in theory building is discussed and qualitative and quantitative research methods are…
Establishing a research agenda for scientific and technical information (STI) - Focus on the user
NASA Technical Reports Server (NTRS)
Pinelli, Thomas E.
1992-01-01
This report addresses the relationship between library science and information science theory and practice, between the development of conceptual understanding, and the practical competence of information professionals. Consideration is given to the concept of research, linking theory with practice, and the reality of theory based practice. Attention is given to the need for research and research priorities, focus on the user and information-seeking behavior, and a user-oriented research agenda for STI.
Information Theory for Gabor Feature Selection for Face Recognition
NASA Astrophysics Data System (ADS)
Shen, Linlin; Bai, Li
2006-12-01
A discriminative and robust feature—the kernel-enhanced informative Gabor feature—is proposed in this paper for face recognition. Mutual information is applied to select a set of informative and nonredundant Gabor features, which are then further enhanced by kernel methods for recognition. Compared with one of the top performing methods in the 2004 Face Verification Competition (FVC2004), our methods demonstrate a clear advantage over existing methods in accuracy, computation efficiency, and memory cost. The proposed method has been fully tested on the FERET database using the FERET evaluation protocol. Significant improvements on three of the test data sets are observed. Compared with classical Gabor wavelet-based approaches using a huge number of features, our method requires less than 4 milliseconds to retrieve a few hundred features. Due to the substantially reduced feature dimension, only 4 seconds are required to recognize 200 face images. The paper also unifies different Gabor filter definitions and proposes a training sample generation algorithm to reduce the effects caused by the unbalanced number of samples available in different classes.
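The selection stage can be sketched as a simple mutual-information ranking; the histogram-based MI estimator, bin count, and toy data below are assumptions of the sketch, and the nonredundancy screening between already-selected features described in the abstract is omitted here.

```python
import numpy as np

# Estimate I(feature; class) by discretizing each feature into bins
# and computing MI from the joint histogram with the class labels.
def mutual_information(x, y, bins=8):
    x_binned = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    joint = np.zeros((bins, y.max() + 1))
    for xi, yi in zip(x_binned, y):
        joint[xi, yi] += 1
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return (joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum()

def rank_features(X, y, k=10):
    scores = [mutual_information(X[:, j], y) for j in range(X.shape[1])]
    return np.argsort(scores)[::-1][:k]       # top-k informative features

X = np.random.randn(300, 40)                  # toy Gabor responses
y = (X[:, 5] + 0.1 * np.random.randn(300) > 0).astype(int)
print(rank_features(X, y, k=5))               # feature 5 should rank high
```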
ERIC Educational Resources Information Center
Han, Gang; Newell, Jay
2014-01-01
This study explores the adoption of the team-based learning (TBL) method in knowledge-based and theory-oriented journalism and mass communication (J&MC) courses. It first reviews the origin and concept of TBL, the relevant theories, and then introduces the TBL method and implementation, including procedures and assessments, employed in an…
Developmental Relationships between Language and Theory of Mind
ERIC Educational Resources Information Center
Miller, Carol A.
2006-01-01
Purpose: This tutorial is intended to inform readers about the development of theory of mind (understanding of mental states) and to discuss relationships between theory of mind and language development. Method: A narrative review of selected literature on language and theory of mind is presented. Theory of mind is defined, and commonly used…
Chen, Zhao; Cao, Yanfeng; He, Shuaibing; Qiao, Yanjiang
2018-01-01
Action (" gongxiao " in Chinese) of traditional Chinese medicine (TCM) is the high recapitulation for therapeutic and health-preserving effects under the guidance of TCM theory. TCM-defined herbal properties (" yaoxing " in Chinese) had been used in this research. TCM herbal property (TCM-HP) is the high generalization and summary for actions, both of which come from long-term effective clinical practice in two thousands of years in China. However, the specific relationship between TCM-HP and action of TCM is complex and unclear from a scientific perspective. The research about this is conducive to expound the connotation of TCM-HP theory and is of important significance for the development of the TCM-HP theory. One hundred and thirty-three herbs including 88 heat-clearing herbs (HCHs) and 45 blood-activating stasis-resolving herbs (BAHRHs) were collected from reputable TCM literatures, and their corresponding TCM-HPs/actions information were collected from Chinese pharmacopoeia (2015 edition). The Kennard-Stone (K-S) algorithm was used to split 133 herbs into 100 calibration samples and 33 validation samples. Then, machine learning methods including supported vector machine (SVM), k-nearest neighbor (kNN) and deep learning methods including deep belief network (DBN), convolutional neutral network (CNN) were adopted to develop action classification models based on TCM-HP theory, respectively. In order to ensure robustness, these four classification methods were evaluated by using the method of tenfold cross validation and 20 external validation samples for prediction. As results, 72.7-100% of 33 validation samples including 17 HCHs and 16 BASRHs were correctly predicted by these four types of methods. Both of the DBN and CNN methods gave out the best results and their sensitivity, specificity, precision, accuracy were all 100.00%. Especially, the predicted results of external validation set showed that the performance of deep learning methods (DBN, CNN) were better than traditional machine learning methods (kNN, SVM) in terms of their sensitivity, specificity, precision, accuracy. Moreover, the distribution patterns of TCM-HPs of HCHs and BASRHs were also analyzed to detect the featured TCM-HPs of these two types of herbs. The result showed that the featured TCM-HPs of HCHs were cold, bitter, liver and stomach meridians entered, while those of BASRHs were warm, bitter and pungent, liver meridian entered. The performance on validation set and external validation set of deep learning methods (DBN, CNN) were better than machine learning models (kNN, SVM) in sensitivity, specificity, precision, accuracy when predicting the actions of heat-clearing and blood-activating stasis-resolving based on TCM-HP theory. The deep learning classification methods owned better generalization ability and accuracy when predicting the actions of heat-clearing and blood-activating stasis-resolving based on TCM-HP theory. Besides, the methods of deep learning would help us to improve our understanding about the relationship between herbal property and action, as well as to enrich and develop the theory of TCM-HP scientifically.
Wave field restoration using three-dimensional Fourier filtering method.
Kawasaki, T; Takai, Y; Ikuta, T; Shimizu, R
2001-11-01
A wave field restoration method in transmission electron microscopy (TEM) was mathematically derived based on three-dimensional (3D) image formation theory. Wave field restoration using this method together with spherical aberration correction was experimentally confirmed in through-focus images of an amorphous tungsten thin film, and the resolution of the reconstructed phase image was successfully improved from the Scherzer resolution limit to the information limit. In an application of this method to a crystalline sample, the surface structure of Au(110) was observed in profile-imaging mode. The processed phase image quantitatively showed the atomic relaxation of the topmost layer.
Design of Restoration Method Based on Compressed Sensing and TwIST Algorithm
NASA Astrophysics Data System (ADS)
Zhang, Fei; Piao, Yan
2018-04-01
In order to improve the subjective and objective quality of degraded images at low sampling rates, save storage space, and reduce computational complexity, this paper proposes a joint restoration algorithm combining compressed sensing with two-step iterative shrinkage/thresholding (TwIST). The algorithm brings the TwIST algorithm, originally used in image restoration, into the compressed sensing framework. A small amount of sparse high-frequency information is first obtained in the frequency domain; the TwIST algorithm based on compressed sensing theory is then used to accurately reconstruct the high-frequency image. Experimental results show that the proposed algorithm achieves better subjective visual quality and objective quality of degraded images while accurately restoring them.
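A minimal sketch of the TwIST recursion on a generic compressed-sensing recovery problem follows; the sensing matrix, regularization weight, and the two-step parameters alpha and beta are illustrative choices (in practice they are set from spectral bounds, as in the original TwIST paper), not this paper's configuration.

```python
import numpy as np

# TwIST solves y = A x + noise with sparse x via the two-step update
#   x_{t+1} = (1 - alpha) x_{t-1} + (alpha - beta) x_t + beta G(x_t),
# where G is a gradient step followed by soft thresholding.
def soft(v, lam):
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def twist(A, y, lam=0.02, alpha=1.8, beta=1.0, iters=300):
    x_prev = np.zeros(A.shape[1])
    x = soft(A.T @ y, lam)                         # IST warm start
    for _ in range(iters):
        grad_step = x + A.T @ (y - A @ x)          # Landweber step
        x_next = ((1 - alpha) * x_prev
                  + (alpha - beta) * x
                  + beta * soft(grad_step, lam))   # two-step update
        x_prev, x = x, x_next
    return x

rng = np.random.default_rng(1)
n, m, k = 200, 80, 8
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
A = rng.normal(size=(m, n))
A /= np.linalg.norm(A, 2)                          # unit spectral norm
y = A @ x_true + 0.01 * rng.normal(size=m)
x_hat = twist(A, y)
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```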
Information theory in systems biology. Part I: Gene regulatory and metabolic networks.
Mousavian, Zaynab; Kavousi, Kaveh; Masoudi-Nejad, Ali
2016-03-01
"A Mathematical Theory of Communication", was published in 1948 by Claude Shannon to establish a framework that is now known as information theory. In recent decades, information theory has gained much attention in the area of systems biology. The aim of this paper is to provide a systematic review of those contributions that have applied information theory in inferring or understanding of biological systems. Based on the type of system components and the interactions between them, we classify the biological systems into 4 main classes: gene regulatory, metabolic, protein-protein interaction and signaling networks. In the first part of this review, we attempt to introduce most of the existing studies on two types of biological networks, including gene regulatory and metabolic networks, which are founded on the concepts of information theory. Copyright © 2015 Elsevier Ltd. All rights reserved.
Simultaneous computation of jet turbulence and noise
NASA Technical Reports Server (NTRS)
Berman, C. H.; Ramos, J. I.
1989-01-01
The existing flow computation methods, wave computation techniques, and theories based on noise source models are reviewed in order to assess the capabilities of numerical techniques to compute jet turbulence noise and understand the physical mechanisms governing it over a range of subsonic and supersonic nozzle exit conditions. In particular, attention is given to (1) methods for extrapolating near field information, obtained from flow computations, to the acoustic far field and (2) the numerical solution of the time-dependent Lilley equation.
ERIC Educational Resources Information Center
Cunningham, Jennifer L.
2013-01-01
The purpose of this sequential, explanatory mixed methods research study was to understand what factors influenced African American maternal intentions to get their daughters aged 9 years to 12 years vaccinated in Alabama. In the first, quantitative phase of the study, the research questions focused on identifying the predictive power of eleven…
Method for PE Pipes Fusion Jointing Based on TRIZ Contradictions Theory
NASA Astrophysics Data System (ADS)
Sun, Jianguang; Tan, Runhua; Gao, Jinyong; Wei, Zihui
The core of the TRIZ theories is the detection and solution of contradictions. TRIZ provides various methods for contradiction solution, but they are not systematized. Combined with the concept of a technique system, this paper summarizes an integrated solution method for contradictions based on TRIZ contradiction theory. According to the method, a flowchart of the integrated solution method for contradictions is given. As a case study, a method for fusion jointing of PE pipes is analyzed.
Demystifying theory and its use in improvement
Davidoff, Frank; Dixon-Woods, Mary; Leviton, Laura; Michie, Susan
2015-01-01
The role and value of theory in improvement work in healthcare has been seriously underrecognised. We join others in proposing that more informed use of theory can strengthen improvement programmes and facilitate the evaluation of their effectiveness. Many professionals, including improvement practitioners, are unfortunately mystified—and alienated—by theory, which discourages them from using it in their work. In an effort to demystify theory we make the point in this paper that, far from being discretionary or superfluous, theory (‘reason-giving’), both informal and formal, is intimately woven into virtually all human endeavour. We explore the special characteristics of grand, mid-range and programme theory; consider the consequences of misusing theory or failing to use it; review the process of developing and applying programme theory; examine some emerging criteria of ‘good’ theory; and emphasise the value, as well as the challenge, of combining informal experience-based theory with formal, publicly developed theory. We conclude that although informal theory is always at work in improvement, practitioners are often not aware of it or do not make it explicit. The germane issue for improvement practitioners, therefore, is not whether they use theory but whether they make explicit the particular theory or theories, informal and formal, they actually use. PMID:25616279
Introduction: Ecological knowledge, theory and information in space and time [Chapter 1
Samuel A. Cushman; Falk Huettmann
2010-01-01
A central theme of this book is that there is a strong mutual dependence between explanatory theory, available data and analytical method in determining the lurching progress of ecological knowledge (Fig. 1.1). The two central arguments are first that limits in each of theory, data and method have continuously constrained advances in understanding ecological systems...
NASA Astrophysics Data System (ADS)
Luzzi, R.; Vasconcellos, A. R.; Ramos, J. G.; Rodrigues, C. G.
2018-01-01
We describe the formalism of statistical irreversible thermodynamics constructed based on Zubarev's nonequilibrium statistical operator (NSO) method, which is a powerful and universal tool for investigating the most varied physical phenomena. We present brief overviews of the statistical ensemble formalism and statistical irreversible thermodynamics. The first can be constructed either based on a heuristic approach or in the framework of information theory in the Jeffreys-Jaynes scheme of scientific inference; Zubarev and his school used both approaches in formulating the NSO method. We describe the main characteristics of statistical irreversible thermodynamics and discuss some particular considerations of several authors. We briefly describe how Rosenfeld, Bohr, and Prigogine proposed to derive a thermodynamic uncertainty principle.
Garcia, Fernando; Lopez, Francisco J; Cano, Carlos; Blanco, Armando
2009-01-01
Background Regulatory motifs describe sets of related transcription factor binding sites (TFBSs) and can be represented as position frequency matrices (PFMs). De novo identification of TFBSs is a crucial problem in computational biology which includes the issue of comparing putative motifs with one another and with motifs that are already known. The relative importance of each nucleotide within a given position in the PFMs should be considered in order to compute PFM similarities. Furthermore, biological data are inherently noisy and imprecise. Fuzzy set theory is particularly suitable for modeling imprecise data, whereas fuzzy integrals are highly appropriate for representing the interaction among different information sources. Results We propose FISim, a new similarity measure between PFMs, based on the fuzzy integral of the distance of the nucleotides with respect to the information content of the positions. Unlike existing methods, FISim is designed to consider the higher contribution of better conserved positions to the binding affinity. FISim provides excellent results when dealing with sets of randomly generated motifs, and outperforms the remaining methods when handling real datasets of related motifs. Furthermore, we propose a new cluster methodology based on kernel theory together with FISim to obtain groups of related motifs potentially bound by the same TFs, providing more robust results than existing approaches. Conclusion FISim corrects a design flaw of the most popular methods, whose measures favour similarity of low information content positions. We use our measure to successfully identify motifs that describe binding sites for the same TF and to solve real-life problems. In this study the reliability of fuzzy technology for motif comparison tasks is proven. PMID:19615102
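To make the weighting idea concrete, here is a minimal Python sketch of a FISim-like score: positions of two aligned PFMs are compared, and a Choquet-style fuzzy integral weights better-conserved positions (higher information content) more heavily. This is an illustrative simplification with an additive fuzzy measure, not the authors' exact formulation; the function names and the 4xL matrix convention are assumptions.

```python
import numpy as np

def information_content(col):
    """Shannon information content of a PFM column, in bits (max 2 for DNA)."""
    p = col[col > 0]
    return 2.0 + np.sum(p * np.log2(p))

def choquet_integral(values, weights):
    """Discrete Choquet integral w.r.t. an additive measure built from weights.

    With an additive measure this reduces to a weighted mean; it is written in
    Choquet form to mirror the fuzzy-integral formulation."""
    order = np.argsort(values)                      # ascending values
    v, w = values[order], weights[order] / weights.sum()
    g = np.cumsum(w[::-1])[::-1]                    # measure of {positions with value >= v_i}
    v_prev = np.concatenate(([0.0], v[:-1]))
    return np.sum((v - v_prev) * g)

def fisim_like(pfm_a, pfm_b):
    """Similarity of two aligned 4xL PFMs, favouring conserved positions."""
    pos_sim = 1.0 - 0.5 * np.abs(pfm_a - pfm_b).sum(axis=0)      # per position, in [0, 1]
    ic = np.array([information_content(0.5 * (pfm_a[:, j] + pfm_b[:, j]))
                   for j in range(pfm_a.shape[1])])
    return choquet_integral(pos_sim, ic + 1e-9)

pfm1 = np.array([[0.97, 0.25], [0.01, 0.25], [0.01, 0.25], [0.01, 0.25]])
pfm2 = np.array([[0.90, 0.10], [0.04, 0.40], [0.03, 0.25], [0.03, 0.25]])
print(fisim_like(pfm1, pfm2))   # the conserved first column dominates the score
```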
Bastien, Olivier; Ortet, Philippe; Roy, Sylvaine; Maréchal, Eric
2005-03-10
Popular methods to reconstruct molecular phylogenies are based on multiple sequence alignments, in which addition or removal of data may change the resulting tree topology. We have sought a representation of homologous proteins that would conserve the information of pair-wise sequence alignments, respect probabilistic properties of Z-scores (Monte Carlo methods applied to pair-wise comparisons) and be the basis for a novel method of consistent and stable phylogenetic reconstruction. We have built up a spatial representation of protein sequences using concepts from particle physics (configuration space) and respecting a frame of constraints deduced from pair-wise alignment score properties in information theory. The obtained configuration space of homologous proteins (CSHP) allows the representation of real and shuffled sequences, and thereupon an expression of the TULIP theorem for Z-score probabilities. Based on the CSHP, we propose a phylogeny reconstruction using Z-scores. Deduced trees, called TULIP trees, are consistent with multiple-alignment based trees. Furthermore, the TULIP tree reconstruction method provides a solution for some previously reported incongruent results, such as the apicomplexan enolase phylogeny. The CSHP is a unified model that conserves mutual information between proteins in the way physical models conserve energy. Applications include the reconstruction of evolutionary consistent and robust trees, the topology of which is based on a spatial representation that is not reordered after addition or removal of sequences. The CSHP and its assigned phylogenetic topology, provide a powerful and easily updated representation for massive pair-wise genome comparisons based on Z-score computations.
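The Monte Carlo Z-score underlying this construction can be illustrated independently of the CSHP itself: score a pair of sequences, re-score against many shuffled versions of one sequence, and standardize. A hedged Python sketch with a toy identity score (a real application would use a full alignment score):

```python
import random

def toy_score(a, b):
    """Toy pairwise score: identities between equal-length sequences."""
    return sum(x == y for x, y in zip(a, b))

def z_score(seq_a, seq_b, n_shuffles=500, seed=0):
    """Monte Carlo Z-score: how far the true score lies above the
    distribution of scores against shuffled versions of seq_b."""
    rng = random.Random(seed)
    s_true = toy_score(seq_a, seq_b)
    chars, null = list(seq_b), []
    for _ in range(n_shuffles):
        rng.shuffle(chars)
        null.append(toy_score(seq_a, chars))
    mu = sum(null) / len(null)
    var = sum((s - mu) ** 2 for s in null) / (len(null) - 1)
    return (s_true - mu) / (var ** 0.5 + 1e-12)

print(z_score("ACDEFGHIKLMNPQRS", "ACDEFGHIKLMNPQRT"))  # high Z: clear homology signal
```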
ERIC Educational Resources Information Center
Ko, Linda K.; Turner-McGrievy, Gabrielle M.; Campbell, Marci K.
2014-01-01
Podcasting is an emerging technology, and previous interventions have shown promising results using theory-based podcast for weight loss among overweight and obese individuals. This study investigated whether constructs of social cognitive theory and information processing theories (IPTs) mediate the effect of a podcast intervention on weight loss…
Discovering the intelligence in molecular biology.
Uberbacher, E
1995-12-01
The Third International Conference on Intelligent Systems in Molecular Biology was truly an outstanding event. Computational methods in molecular biology have reached a new level of maturity and utility, resulting in many high-impact applications. The success of this meeting bodes well for the rapid and continuing development of computational methods, intelligent systems and information-based approaches for the biosciences. The basic technology, originally most often applied to 'feasibility' problems, is now dealing effectively with the most difficult real-world problems. Significant progress has been made in understanding protein-structure information, structural classification, and how functional information and the relevant features of active-site geometry can be gleaned from structures by automated computational approaches. The value and limits of homology-based methods, and the ability to classify proteins by structure in the absence of homology, have reached a new level of sophistication. New methods for covariation analysis in the folding of large structures such as RNAs have shown remarkably good results, indicating the long-term potential to understand very complicated molecules and multimolecular complexes using computational means. Novel methods, such as HMMs, context-free grammars and the uses of mutual information theory, have taken center stage as highly valuable tools in our quest to represent and characterize biological information. A focus on creative uses of intelligent systems technologies and the trend toward biological application will undoubtedly continue and grow at the 1996 ISMB meeting in St Louis.
Bayesian Methods for Effective Field Theories
NASA Astrophysics Data System (ADS)
Wesolowski, Sarah
Microscopic predictions of the properties of atomic nuclei have reached a high level of precision in the past decade. This progress mandates improved uncertainty quantification (UQ) for a robust comparison of experiment with theory. With the uncertainty from many-body methods under control, calculations are now sensitive to the input inter-nucleon interactions. These interactions include parameters that must be fit to experiment, inducing both uncertainty from the fit and from missing physics in the operator structure of the Hamiltonian. Furthermore, the implementation of the inter-nucleon interactions is not unique, which presents the additional problem of assessing results using different interactions. Effective field theories (EFTs) take advantage of a separation of high- and low-energy scales in the problem to form a power-counting scheme that allows the organization of terms in the Hamiltonian based on their expected contribution to observable predictions. This scheme gives a natural framework for quantification of uncertainty due to missing physics. The free parameters of the EFT, called the low-energy constants (LECs), must be fit to data, but in a properly constructed EFT these constants will be natural-sized, i.e., of order unity. The constraints provided by the EFT, namely the size of the systematic uncertainty from truncation of the theory and the natural size of the LECs, are assumed information even before a calculation is performed or a fit is done. Bayesian statistical methods provide a framework for treating uncertainties that naturally incorporates prior information as well as putting stochastic and systematic uncertainties on an equal footing. For EFT UQ Bayesian methods allow the relevant EFT properties to be incorporated quantitatively as prior probability distribution functions (pdfs). Following the logic of probability theory, observable quantities and underlying physical parameters such as the EFT breakdown scale may be expressed as pdfs that incorporate the prior pdfs. Problems of model selection, such as distinguishing between competing EFT implementations, are also natural in a Bayesian framework. In this thesis we focus on two complementary topics for EFT UQ using Bayesian methods--quantifying EFT truncation uncertainty and parameter estimation for LECs. Using the order-by-order calculations and underlying EFT constraints as prior information, we show how to estimate EFT truncation uncertainties. We then apply the result to calculating truncation uncertainties on predictions of nucleon-nucleon scattering in chiral effective field theory. We apply model-checking diagnostics to our calculations to ensure that the statistical model of truncation uncertainty produces consistent results. A framework for EFT parameter estimation based on EFT convergence properties and naturalness is developed which includes a series of diagnostics to ensure the extraction of the maximum amount of available information from data to estimate LECs with minimal bias. We develop this framework using model EFTs and apply it to the problem of extrapolating lattice quantum chromodynamics results for the nucleon mass. We then apply aspects of the parameter estimation framework to perform case studies in chiral EFT parameter estimation, investigating a possible operator redundancy at fourth order in the chiral expansion and the appropriate inclusion of truncation uncertainty in estimating LECs.
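As a rough illustration of the truncation-uncertainty idea described here (not the thesis's actual Bayesian machinery), the sketch below extracts dimensionless expansion coefficients from order-by-order results and estimates the first omitted term. The naturalness estimate via an rms coefficient and all numbers are assumptions; a Bayesian treatment would place a prior on the coefficient scale and report a credible interval.

```python
import numpy as np

def truncation_error(orders, y_ref, Q, k_max):
    """Naive first-omitted-term estimate of EFT truncation uncertainty.

    orders: observable computed at successive EFT orders (length k_max + 1)
    y_ref : reference scale of the observable
    Q     : expansion parameter (ratio of low to breakdown scale)
    """
    y = np.asarray(orders, dtype=float)
    shifts = np.diff(y, prepend=0.0)                   # contribution of each order
    c = shifts / (y_ref * Q ** np.arange(len(y)))      # dimensionless coefficients
    cbar = np.sqrt(np.mean(c[1:] ** 2)) if len(c) > 1 else abs(c[0])
    return y_ref * cbar * Q ** (k_max + 1)             # size of first omitted term

# e.g. some observable computed at four successive orders, with Q = 0.3:
print(truncation_error([10.0, 12.1, 11.6, 11.75], y_ref=10.0, Q=0.3, k_max=3))
```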
GIS Application Management for Disabled People
NASA Astrophysics Data System (ADS)
Tongkaw, Sasalak
2017-08-01
This research aimed to develop and design a Geographical Information System (GIS) to assist disabled people by presenting useful accessibility information on Google Maps. The map provides information for different types of disability, such as blindness, deafness, and impaired physical movement. The research employed the Multiview 2 theory and method to plan the system and identify problems in the real-world situation, and used data-structure design methods such as Data Flow Diagrams and ER Diagrams. The work was organized into two parts, the server side and the client side, the latter including the interface for the Web-based application. Clear information about accessibility on the map helps disabled people find what they need, and it also provides specialized data that companies and government officers can use to manage and plan local facilities for disabled people in cities. Disabled users can access the system over the Internet at any time using mobile or portable devices.
Information Processing Theory and Conceptual Development.
ERIC Educational Resources Information Center
Schroder, H. M.
An educational program based upon information processing theory has been developed at Southern Illinois University. The integrating theme was the development of conceptual ability for coping with social and personal problems. It utilized student information search and concept formation as foundations for discussion and judgment and was organized…
Authorship attribution based on Life-Like Network Automata.
Machicao, Jeaneth; Corrêa, Edilson A; Miranda, Gisele H B; Amancio, Diego R; Bruno, Odemir M
2018-01-01
Authorship attribution is a problem of considerable practical and technical interest. Several methods have been designed to infer the authorship of disputed documents in multiple contexts. While traditional statistical methods based solely on word counts and related measurements have provided a simple yet effective solution in particular cases, they are prone to manipulation. Recently, texts have been successfully modeled as networks, where words are represented by nodes linked according to textual similarity measurements. Such models are useful for identifying informative topological patterns for the authorship recognition task. However, there is no consensus on which measurements should be used. We therefore propose a novel method to characterize text networks that considers both topological and dynamical aspects. Using concepts and methods from cellular automata theory, we devised a strategy to extract informative spatio-temporal patterns from this model. Our experiments show that it outperforms structural analysis relying only on topological measurements such as the clustering coefficient, betweenness, and shortest paths. The optimized results obtained here pave the way for a better characterization of textual networks.
Lu, Chao; Zheng, Yefeng; Birkbeck, Neil; Zhang, Jingdan; Kohlberger, Timo; Tietjen, Christian; Boettger, Thomas; Duncan, James S; Zhou, S Kevin
2012-01-01
In this paper, we present a novel method that incorporates information theory into a learning-based approach for automatic and accurate pelvic organ segmentation (including the prostate, bladder and rectum). We target 3D CT volumes generated using different scanning protocols (e.g., contrast and non-contrast, with and without an implant in the prostate, various resolutions and positions) and coming from largely diverse sources (e.g., diseased in different organs). Three key ingredients are combined to solve this challenging segmentation problem. First, marginal space learning (MSL) is applied to efficiently and effectively localize the multiple organs in the largely diverse CT volumes. Second, learning-based techniques with steerable features are applied for robust boundary detection, which enables the handling of highly heterogeneous texture patterns. Third, a novel information-theoretic scheme is incorporated into the boundary inference process: the Jensen-Shannon divergence drives the mesh to the best fit of the image, thus improving segmentation performance. The proposed approach is tested on a challenging dataset containing 188 volumes from diverse sources. It not only produces excellent segmentation accuracy, but also runs about eighty times faster than previous state-of-the-art solutions. The method can be applied to CT images to provide visual guidance to physicians during computer-aided diagnosis, treatment planning and image-guided radiotherapy to treat cancers in the pelvic region.
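The Jensen-Shannon divergence used in the boundary inference step is easy to state in isolation. A minimal sketch illustrating the measure only, not the segmentation pipeline:

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discrete distributions (bits).

    Symmetric, bounded in [0, 1], and zero iff p == q - properties that make
    it a convenient fit measure between intensity histograms."""
    p = np.asarray(p, float); p = p / p.sum()
    q = np.asarray(q, float); q = q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log2((a + eps) / (b + eps)))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# e.g. intensity histograms inside vs. outside a candidate boundary
print(js_divergence([0.1, 0.4, 0.5], [0.2, 0.3, 0.5]))
```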
A four stage approach for ontology-based health information system design.
Kuziemsky, Craig E; Lau, Francis
2010-11-01
To describe and illustrate a four stage methodological approach to capture user knowledge in a biomedical domain area, use that knowledge to design an ontology, and then implement and evaluate the ontology as a health information system (HIS). A hybrid participatory design-grounded theory (GT-PD) method was used to obtain data and code them for ontology development. Prototyping was used to implement the ontology as a computer-based tool. Usability testing evaluated the computer-based tool. An empirically derived domain ontology and set of three problem-solving approaches were developed as a formalized model of the concepts and categories from the GT coding. The ontology and problem-solving approaches were used to design and implement a HIS that tested favorably in usability testing. The four stage approach illustrated in this paper is useful for designing and implementing an ontology as the basis for a HIS. The approach extends existing ontology development methodologies by providing an empirical basis for theory incorporated into ontology design. Copyright © 2010 Elsevier B.V. All rights reserved.
A new approach to preserve privacy data mining based on fuzzy theory in numerical database
NASA Astrophysics Data System (ADS)
Cui, Run; Kim, Hyoung Joong
2014-01-01
With the rapid development of information techniques, data mining has become one of the most important tools for discovering deep associations among tuples in large-scale databases. Protecting private information during the mining procedure is therefore a major challenge. In this paper, a new method for privacy protection based on fuzzy theory is proposed. Traditional fuzzy approaches in this area apply fuzzification to the data without considering readability; here, a new style of obscured data expression is introduced that provides more detail about the subsets without reducing readability. We also adopt an approach that balances privacy level and utility when forming suitable subgroups. An experiment shows that the approach supports classification without loss of accuracy. In the future, thanks to the low computational complexity of the fuzzy function, the approach can be adapted to data streams with suitable modification.
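A minimal sketch of the fuzzification idea: a numeric value is published as membership degrees over linguistic labels, keeping the data readable while obscuring the exact value. The triangular membership functions, label names, and ranges are illustrative assumptions, not the paper's expression scheme:

```python
def triangular(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Linguistic labels for, e.g., a salary attribute (ranges are illustrative).
LABELS = {
    "low":    (0, 20_000, 45_000),
    "medium": (30_000, 55_000, 80_000),
    "high":   (65_000, 90_000, 120_000),
}

def fuzzify(value):
    """Publish membership degrees instead of the raw value: still readable
    ('mostly medium'), yet the exact number is no longer recoverable."""
    return {name: round(triangular(value, *abc), 3) for name, abc in LABELS.items()}

print(fuzzify(50_000))  # {'low': 0.0, 'medium': 0.8, 'high': 0.0}
```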
Grey-Theory-Based Optimization Model of Emergency Logistics Considering Time Uncertainty.
Qiu, Bao-Jian; Zhang, Jiang-Hua; Qi, Yuan-Tao; Liu, Yang
2015-01-01
Natural disasters have occurred frequently in recent years, causing huge casualties and property losses, and people are paying more and more attention to emergency logistics problems. This paper studies the emergency logistics problem with multiple centers, multiple commodities, and a single affected point. Considering that paths near the disaster point may be damaged, that information on the state of the paths is incomplete, and that travel time is uncertain, we establish a nonlinear programming model whose objective function maximizes the time-satisfaction degree. To overcome these drawbacks (incomplete information and uncertain travel time), the paper first evaluates the multiple roads of the transportation network based on grey theory and selects the reliable and optimal path. The original model is then simplified under the scenario that the vehicle follows only the optimal path from the emergency logistics center to the affected point, and Lingo software is used to solve it. Numerical experiments show the feasibility and effectiveness of the proposed method. PMID:26417946
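The path-evaluation step can be illustrated with standard grey relational analysis, which ranks alternatives under incomplete information by their closeness to an ideal reference. A hedged sketch (the attribute names, normalization, and distinguishing coefficient are conventional assumptions, not the paper's exact model):

```python
import numpy as np

def grey_relational_grades(paths, rho=0.5):
    """Rank alternatives by grey relational grade against an ideal reference.

    paths: rows = candidate paths, columns = normalized benefit attributes
           in [0, 1] (e.g. road condition, expected speed, reliability).
    rho  : distinguishing coefficient, conventionally 0.5."""
    X = np.asarray(paths, float)
    ref = X.max(axis=0)                                  # ideal path: best value per attribute
    delta = np.abs(X - ref)
    d_min, d_max = delta.min(), delta.max()
    xi = (d_min + rho * d_max) / (delta + rho * d_max)   # relational coefficients
    return xi.mean(axis=1)                               # grade = mean over attributes

paths = [[0.9, 0.6, 0.8],   # path 1
         [0.7, 0.9, 0.5],   # path 2
         [0.8, 0.8, 0.9]]   # path 3
print(grey_relational_grades(paths))   # highest grade = most reliable path
```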
Information theory in systems biology. Part II: protein-protein interaction and signaling networks.
Mousavian, Zaynab; Díaz, José; Masoudi-Nejad, Ali
2016-03-01
Since its development by Claude Shannon in 1948 to address problems of data storage and data communication over (noisy) channels, information theory has been successfully applied in many other research areas, such as bioinformatics and systems biology. In this manuscript, we review some of the existing literature in systems biology that uses information theory measures in its calculations. Having reviewed most of the existing information-theoretic methods in gene regulatory and metabolic networks in the first part of the review, in this second part we survey the application of information theory to other types of biological networks, including protein-protein interaction and signaling networks. Copyright © 2015 Elsevier Ltd. All rights reserved.
An imprecise probability approach for squeal instability analysis based on evidence theory
NASA Astrophysics Data System (ADS)
Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie
2017-01-01
An imprecise probability approach based on evidence theory is proposed for squeal instability analysis of uncertain disc brakes in this paper. First, the squeal instability of the finite element (FE) model of a disc brake is investigated and its dominant unstable eigenvalue is detected by running two typical numerical simulations, i.e., complex eigenvalue analysis (CEA) and transient dynamical analysis. Next, the uncertainty mainly caused by contact and friction is taken into account and some key parameters of the brake are described as uncertain parameters. All these uncertain parameters are usually involved with imprecise data such as incomplete information and conflict information. Finally, a squeal instability analysis model considering imprecise uncertainty is established by integrating evidence theory, Taylor expansion, subinterval analysis and surrogate model. In the proposed analysis model, the uncertain parameters with imprecise data are treated as evidence variables, and the belief measure and plausibility measure are employed to evaluate system squeal instability. The effectiveness of the proposed approach is demonstrated by numerical examples and some interesting observations and conclusions are summarized from the analyses and discussions. The proposed approach is generally limited to the squeal problems without too many investigated parameters. It can be considered as a potential method for squeal instability analysis, which will act as the first step to reduce squeal noise of uncertain brakes with imprecise information.
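The belief and plausibility measures used here to evaluate squeal instability come from Dempster-Shafer evidence theory and can be shown in a few lines. A toy sketch with an assumed mass assignment over a two-outcome frame (the numbers are illustrative, not brake data):

```python
def belief(mass, event):
    """Bel(A) = sum of masses of all focal sets contained in A."""
    return sum(m for s, m in mass.items() if s <= event)

def plausibility(mass, event):
    """Pl(A) = sum of masses of all focal sets intersecting A."""
    return sum(m for s, m in mass.items() if s & event)

# Frame of discernment: the brake is 'stable' or 'unstable'; mass on the full
# frame encodes ignorance arising from imprecise contact/friction data.
mass = {
    frozenset({"unstable"}): 0.5,
    frozenset({"stable"}):   0.2,
    frozenset({"stable", "unstable"}): 0.3,   # unassigned / ignorance
}
A = frozenset({"unstable"})
print(belief(mass, A), plausibility(mass, A))   # 0.5 0.8 -> the [Bel, Pl] interval
```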
ERIC Educational Resources Information Center
Ong, Chiek Pin; Tasir, Zaidatun
2015-01-01
The aim of the research is to study the information retention among trainee teachers using a self-instructional printed module based on Cognitive Load Theory for learning spreadsheet software. Effective pedagogical considerations integrating the theoretical concepts related to cognitive load are reflected in the design and development of the…
NASA Astrophysics Data System (ADS)
Pazderin, A. V.; Sof'in, V. V.; Samoylenko, V. O.
2015-11-01
Efforts to improve energy efficiency in all branches of the fuel and energy complex should begin with setting up a high-tech automated system for monitoring and accounting energy resources. Malfunctions and failures in the measurement and information parts of this system may distort commercial measurements of energy resources and create financial risks for power supply organizations. In addition, measurement errors may stem from intentional distortion of measurements to reduce payments on the consumer's side, which leads to commercial losses of energy resources. The article presents a universal mathematical method, based on state estimation theory, for verifying the validity of measurement information in networks that transport energy resources such as electricity, heat, petroleum, and gas. The transportation network is represented by a graph whose nodes correspond to producers and consumers and whose branches stand for transportation mains (power lines, pipelines, and heat network elements). The main idea of state estimation is to obtain calculated analogs of the energy resources for all available measurements. Unlike "raw" measurements, which contain inaccuracies, the calculated flows of energy resources, called estimates, fully satisfy all state equations describing the transportation network: the state equations written in terms of the estimates are free from residuals. The difference between a measurement and its calculated analog (estimate) is called, in estimation theory, an estimation remainder; large estimation remainders indicate large errors in particular measurements. The presented method makes it possible to improve the validity of energy resource measurements, estimate the observability of the transportation network, eliminate imbalances in measured energy flows, and filter invalid measurements at the data acquisition and processing stage of an automated energy resource monitoring and accounting system.
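The estimation-remainder idea can be demonstrated on a toy network: estimate flows by least squares in a parametrization that enforces conservation, then compare the estimates with the raw meter readings. A minimal sketch with assumed readings (one meter deliberately biased):

```python
import numpy as np

# One supply line feeding two consumers: conservation says f0 = f1 + f2.
# Parametrize the state by the consumer flows u = (f1, f2); then all three
# measured quantities are H @ u, and conservation holds by construction.
H = np.array([[1.0, 1.0],    # measured supply = f1 + f2
              [1.0, 0.0],    # measured flow to consumer 1
              [0.0, 1.0]])   # measured flow to consumer 2

z = np.array([103.0, 60.0, 38.0])   # raw meter readings (supply reads ~5 high)

u_hat, *_ = np.linalg.lstsq(H, z, rcond=None)
z_hat = H @ u_hat                    # conservation-consistent flow estimates
residuals = z - z_hat                # estimation remainders flag suspect meters

for name, r in zip(["supply", "consumer1", "consumer2"], residuals):
    print(f"{name}: estimation remainder = {r:+.2f}")
```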
1991-10-01
SUBJECT TERMS: engineering management information systems; method formalization; information engineering; process modeling; information systems requirements definition methods; knowledge acquisition methods; systems engineering. Cited reference: Corynen, G. C., 1975, A Mathematical Theory of Modeling and Simulation, Ph.D. Dissertation.
Ranking streamflow model performance based on Information theory metrics
NASA Astrophysics Data System (ADS)
Martinez, Gonzalo; Pachepsky, Yakov; Pan, Feng; Wagener, Thorsten; Nicholson, Thomas
2016-04-01
Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to test whether information theory-based metrics can serve as a complementary tool for hydrologic model evaluation and selection. We simulated 10-year streamflow time series in five watersheds located in Texas, North Carolina, Mississippi, and West Virginia. Eight models of different complexity were applied. The information theory-based metrics were obtained after representing the time series as strings of symbols, where different symbols corresponded to different quantiles of the probability distribution of streamflow. Three metrics were computed for those strings: mean information gain, which measures the randomness of the signal; effective measure complexity, which characterizes predictability; and fluctuation complexity, which characterizes the presence of a pattern in the signal. The observed streamflow time series had smaller information content and larger complexity metrics than the precipitation time series: streamflow was less random and more complex than precipitation, reflecting the fact that the watershed acts as an information filter in the hydrologic conversion from precipitation to streamflow. The Nash-Sutcliffe efficiency increased with model complexity, but in many cases several models had efficiency values that were not statistically distinguishable from each other. In such cases, ranking models by the closeness of the information theory-based metrics of simulated and measured streamflow time series can provide an additional criterion for evaluating hydrologic model performance.
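A hedged sketch of the symbolization step and one of the metrics: streamflow is mapped to quantile symbols, and mean information gain is computed here as the conditional entropy of the next symbol given the current one (a simplification; the metrics above use longer symbol blocks, and a random walk stands in for streamflow):

```python
import numpy as np
from collections import Counter

def symbolize(x, n_symbols=4):
    """Map a series to quantile-based symbols 0..n_symbols-1."""
    qs = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.searchsorted(qs, x)

def mean_information_gain(symbols):
    """H(next | current): randomness left after seeing the current symbol."""
    pairs = Counter(zip(symbols[:-1], symbols[1:]))
    singles = Counter(symbols[:-1])
    n = sum(pairs.values())
    h_joint = -sum((c / n) * np.log2(c / n) for c in pairs.values())
    h_curr = -sum((c / n) * np.log2(c / n) for c in singles.values())
    return h_joint - h_curr            # H(X_{t+1}, X_t) - H(X_t)

rng = np.random.default_rng(1)
flow = np.cumsum(rng.normal(size=3650)) + 100     # stand-in for daily streamflow
print(mean_information_gain(symbolize(flow)))     # lower than for white noise
```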
Cognitive Load Theory and the Effects of Transient Information on the Modality Effect
ERIC Educational Resources Information Center
Leahy, Wayne; Sweller, John
2016-01-01
Based on cognitive load theory and the "transient information effect," this paper investigated the "modality effect" while interpreting a contour map. The length and complexity of auditory and visual text instructions were manipulated. Experiment 1 indicated that longer audio text information within a presentation was inferior…
de Bock, Élodie; Hardouin, Jean-Benoit; Blanchin, Myriam; Le Neel, Tanguy; Kubis, Gildas; Sébille, Véronique
2015-01-01
The purpose of this study was to identify the most adequate strategy for group comparison of longitudinal patient-reported outcomes in the presence of possibly informative intermittent missing data. Models coming from classical test theory (CTT) and item response theory (IRT) were compared. Two groups of patients' responses to dichotomous items with three times of assessment were simulated. Different cases were considered: presence or absence of a group effect and/or a time effect, a total of 100 or 200 patients, 4 or 7 items and two different values for the correlation coefficient of the latent trait between two consecutive times (0.4 or 0.9). Cases including informative and non-informative intermittent missing data were compared at different rates (15, 30 %). These simulated data were analyzed with CTT using score and mixed model (SM) and with IRT using longitudinal Rasch mixed model (LRM). The type I error, the power and the bias of the group effect estimations were compared between the two methods. This study showed that LRM performs better than SM. When the rate of missing data rose to 30 %, estimations were biased with SM mainly for informative missing data. Otherwise, LRM and SM methods were comparable concerning biases. However, regardless of the rate of intermittent missing data, power of LRM was higher compared to power of SM. In conclusion, LRM should be favored when the rate of missing data is higher than 15 %. For other cases, SM and LRM provide similar results.
Aksiuta, E F; Ostashev, A V; Sergeev, E V; Aksiuta, V E
1997-01-01
The methods of the information (entropy) theory of errors were used to perform a metrological analysis of well-known commercial measuring systems for timing an anticipative reaction (AR) to the position of a moving object, systems based on electromechanical, gas-discharge, and electronic principles. It was ascertained that the required measurement accuracy is achieved only by systems based on the electronic principle of moving-object simulation and AR measurement.
Jenkins, Emily K; Kothari, Anita; Bungay, Vicky; Johnson, Joy L; Oliffe, John L
2016-08-30
Much of the research and theorising in the knowledge translation (KT) field has focused on clinical settings, providing little guidance to those working in community settings. In this study, we build on previous research in community-based KT by detailing the theory driven and empirically-informed CollaboraKTion framework. A case study design and ethnographic methods were utilised to gain an in-depth understanding of the processes for conducting a community-based KT study as a means to distilling the CollaboraKTion framework. Drawing on extensive field notes describing fieldwork observations and interactions as well as evidence from the participatory research and KT literature, we detail the processes and steps undertaken in this community-based KT study as well as their rationale and the challenges encountered. In an effort to build upon existing knowledge, Kitson and colleagues' co-KT framework, which provides guidance for conducting KT aimed at addressing population-level health, was applied as a coding structure to inform the current analysis. This approach was selected because it (1) supported the application of an existing community-based KT framework to empirical data and (2) provided an opportunity to contribute to the theory and practice gaps in the community-based KT literature through an inductively derived empirical example. Analysis revealed that community-based KT is an iterative process that can be viewed as comprising five overarching processes: (1) contacting and connecting; (2) deepening understandings; (3) adapting and applying the knowledge base; (4) supporting and evaluating continued action; and (5) transitioning and embedding as well as several key elements within each of these processes (e.g. building on existing knowledge, establishing partnerships). These empirically informed theory advancements in KT and participatory research traditions are summarised in the CollaboraKTion framework. We suggest that community-based KT researchers place less emphasis on enhancing uptake of specific interventions and focus on collaboratively identifying and creating changes to the contextual factors that influence health outcomes. The CollaboraKTion framework can be used to guide the development, implementation and evaluation of contextually relevant, evidence-informed initiatives aimed at improving population health, amid providing a foundation to leverage future research and practice in this emergent KT area.
Identity, Attribution, and the Challenge of Targeting in the Cyberdomain
2018-03-07
icts, hybrid warfare, Islamic State, terrorism, biometrics, network analysis, big data, activity-based intelligence, high-value individuals. One of... environment, combatant identity and pattern-of-life information became crucial elements of high-value targeting and the process of removing... alytical methods deeply influenced by social network theory and targeting processes specifically designed for engaging high-value individuals and
ERIC Educational Resources Information Center
Yettick, Holly
2015-01-01
Most members of the American public will never read this article. Instead, they will obtain much of their information about education from the news media. Yet little academic research has examined the type or quality of education research and expertise they will find there. Through the lens of gatekeeping theory, this mixed-methods study aims to…
NASA Astrophysics Data System (ADS)
Alfonso, Leonardo; Chacon, Juan; Solomatine, Dimitri
2016-04-01
The EC-FP7 WeSenseIt project proposes the development of a Citizen Observatory of Water, aiming at enhancing environmental monitoring and forecasting with the help of citizens equipped with low-cost sensors and personal devices such as smartphones and smart umbrellas. Citizen Observatories may complement the limited availability of data in terms of spatial and temporal density, which is of interest, among other areas, for improving hydraulic and hydrological models. At this point, the following question arises: how can citizens who are part of a citizen observatory be optimally guided so that the data they collect and send is useful for improving modelling and water management? This research proposes a new methodology to identify the optimal location and timing of potential observations coming from moving sensors of hydrological variables. The methodology is based on Information Theory, which has been widely used in hydrometric monitoring design [1-4]. In particular, it uses the concept of Joint Entropy as a measure of the amount of information contained in a set of random variables, which, in our case, correspond to the time series of hydrological variables captured at given locations in a catchment. The methodology is a step forward in the state of the art because it solves the multiobjective optimisation problem of simultaneously finding the minimum number of informative and non-redundant sensors needed at a given time, so that the best configuration of monitoring sites is found at every particular moment. To this end, existing algorithms have been improved for efficiency. The method is applied to cases in The Netherlands, UK and Italy and proves to have great potential to complement existing in-situ monitoring networks. [1] Alfonso, L., A. Lobbrecht, and R. Price (2010a), Information theory-based approach for location of monitoring water level gauges in polders, Water Resour. Res., 46(3), W03528. [2] Alfonso, L., A. Lobbrecht, and R. Price (2010b), Optimization of water level monitoring network in polder systems using information theory, Water Resour. Res., 46(12), W12553, 10.1029/2009WR008953. [3] Alfonso, L., L. He, A. Lobbrecht, and R. Price (2013), Information theory applied to evaluate the discharge monitoring network of the Magdalena River, Journal of Hydroinformatics, 15(1), 211-228. [4] Alfonso, L., E. Ridolfi, S. Gaytan-Aguilar, F. Napolitano, and F. Russo (2014), Ensemble Entropy for Monitoring Network Design, Entropy, 16(3), 1365-1375.
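The joint-entropy criterion can be sketched with a greedy selection over quantized candidate series: at each step, add the sensor that most increases the joint entropy of the selected set, which automatically penalizes redundant gauges. This simplified sketch ignores the moving-sensor and timing aspects of the methodology, and all data are synthetic:

```python
import numpy as np
from collections import Counter

def joint_entropy(series_list):
    """Shannon entropy (bits) of the joint quantized signal across sensors."""
    tuples = list(zip(*series_list))
    n = len(tuples)
    return -sum((c / n) * np.log2(c / n) for c in Counter(tuples).values())

def greedy_sensor_selection(candidates, k):
    """Pick k candidate locations that together maximize joint entropy,
    implicitly discarding redundant (mutually informative) sensors."""
    chosen, remaining = [], list(range(len(candidates)))
    for _ in range(k):
        best = max(remaining,
                   key=lambda i: joint_entropy([candidates[j] for j in chosen + [i]]))
        chosen.append(best)
        remaining.remove(best)
    return chosen

rng = np.random.default_rng(0)
base = rng.integers(0, 4, 1000)                      # quantized water levels
cands = [base,                                       # gauge 0
         base,                                       # gauge 1: redundant copy of 0
         rng.integers(0, 4, 1000),                   # gauge 2: independent
         (base + rng.integers(0, 2, 1000)) % 4]      # gauge 3: partly redundant
print(greedy_sensor_selection(cands, k=2))           # expect gauges 0 and 2
```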
Chilisa, Bagele; Mohiemang, Irene; Mpeta, Kolentino Nyamadzapasi; Malinga, Tumane; Ntshwarang, Poloko; Koyabe, Bramwell Walela; Heeren, G Anita
2016-01-01
Culture and tradition influence behaviour. Multiple and concurrent sexual partnerships are held responsible for the increase of HIV infection in Sub-Saharan Africa. A contextualized "Theory of Planned Behaviour" was used to identify predictors of the intention to practice monogamy. A mixed-methods design was used: qualitative data came from focus groups and stories, and a survey provided quantitative data. The qualitative data added a socio-cultural belief domain to the behavioural beliefs; attitudes, subjective norms, and perceived behavioural control predicted the intention to practice monogamy. The adolescents showed a tendency towards having more than one sexual partner. The normative beliefs and the socio-cultural beliefs also predicted intentions, while hedonistic beliefs and partner reaction did not. In contextualizing theory-based interventions, it is important to draw from the stories and the language that circulate in a community about a given behaviour. More studies are needed on ways to combine qualitative with quantitative approaches to inform the development of theory-based, culturally appropriate and context-specific intervention strategies to reduce the risk of HIV. PMID:28090169
Gluons for (almost) nothing, gravitons for free
NASA Astrophysics Data System (ADS)
Carrasco, John Joseph M.
2013-07-01
In this talk I describe a new method for organizing Yang-Mills scattering amplitudes that allows the definition of an entire multi-loop scattering amplitude in terms of a small number of "master" graphs. A small amount of information is required from the theory, and constraints propagate this information to the full amplitude. When the amplitude is organized in such a way, corresponding gravitational amplitudes are found trivially. This talk is based on work [1-4] done in collaboration with Zvi Bern, Lance Dixon, Henrik Johansson, and Radu Roiban, and follows closely the presentation given in ref. [5].
NASA Astrophysics Data System (ADS)
Wang, C.; Rubin, Y.
2014-12-01
The spatial distribution of the compression modulus Es, an important geotechnical parameter, contributes considerably to understanding the underlying geological processes and to adequately assessing the effects of Es on the differential settlement of large continuous structure foundations. Such analyses should use an assimilation approach that combines in-situ static cone penetration tests (CPT) with borehole experiments. To achieve this, the Es distribution of a silty clay stratum in region A of the China Expo Center (Shanghai) is studied using the Bayesian maximum entropy method, which rigorously and efficiently integrates geotechnical investigations of different precision and sources of uncertainty. Single CPT soundings were modeled as rational probability density curves by maximum entropy theory. The spatial prior multivariate probability density function (PDF) and the likelihood PDF of the CPT positions were built from the borehole experiments and the potential value of the prediction point; then, after numerical integration over the CPT probability density curves, the posterior probability density curve of the prediction point was calculated in the Bayesian reverse interpolation framework. The results were compared between Gaussian sequential stochastic simulation and the Bayesian method. The differences between single CPT soundings modeled as normal distributions and as simulated probability density curves based on maximum entropy theory were also discussed. It is shown that the study of Es spatial distributions can be improved by properly incorporating CPT sampling variation into the interpolation process, and that more informative estimates are generated by considering CPT uncertainty at the estimation points. The calculations illustrate the significance of stochastic Es characterization in a stratum and identify limitations associated with inadequate geostatistical interpolation techniques. The characterization results provide a multi-precision information assimilation method for other geotechnical parameters.
Artificial retina model for the retinally blind based on wavelet transform
NASA Astrophysics Data System (ADS)
Zeng, Yan-an; Song, Xin-qiang; Jiang, Fa-gang; Chang, Da-ding
2007-01-01
An artificial retina aims to stimulate the remaining retinal neurons in patients with degenerated photoreceptors. Microelectrode arrays have been developed for this purpose as part of the stimulator. Designing such microelectrode arrays first requires a suitable mathematical method for human retinal information processing. In this paper, a flexible and adjustable model for extracting human visual information is presented, based on the wavelet transform. Given the flexibility of the wavelet transform for image information processing and its consistency with human visual information extraction, wavelet transform theory is applied to the artificial retina model for the retinally blind. The response of the model to a synthetic image is shown. The simulated experiment demonstrates that the model behaves in a manner qualitatively similar to biological retinas and thus may serve as a basis for the development of an artificial retina.
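A minimal sketch of the kind of multi-scale decomposition such a model builds on, using the PyWavelets package (assumed available); it illustrates a one-level 2D wavelet transform of a toy image, not the authors' retina model:

```python
import numpy as np
import pywt

# Toy "retinal image": a bright disc on a dark background.
y, x = np.mgrid[0:64, 0:64]
image = ((x - 32) ** 2 + (y - 32) ** 2 < 100).astype(float)

# One-level 2D discrete wavelet transform: a coarse approximation plus
# horizontal/vertical/diagonal detail bands - a crude analogue of the
# multi-scale feature extraction the retina model performs.
cA, (cH, cV, cD) = pywt.dwt2(image, "haar")

print(cA.shape)                            # (32, 32): downsampled approximation
print(np.abs(cH).max(), np.abs(cV).max())  # edge responses along the disc rim
```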
Validating Coherence Measurements Using Aligned and Unaligned Coherence Functions
NASA Technical Reports Server (NTRS)
Miles, Jeffrey Hilton
2006-01-01
This paper describes a novel approach, based on coherence functions and statistical theory, for sensor validation in a harsh environment. Using aligned and unaligned coherence functions together with statistical theory, one can test for sensor degradation, total sensor failure, or changes in the signal. The diagnostic approach and the data processing methodology discussed provide a single number that conveys this information. This number, calculated with standard statistical procedures for comparing the means of two distributions, is compared with results obtained using Yuen's robust statistical method to create confidence intervals. Examination of experimental data from Kulite pressure transducers mounted in a Pratt & Whitney PW4098 combustor, using spectrum analysis of aligned and unaligned time histories, has verified the effectiveness of the proposed method. All the procedures produce good results, demonstrating the robustness of the technique.
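The aligned-versus-unaligned comparison can be sketched with SciPy: two sensors sharing a broadband signal show high coherence when their records are time-aligned and near-zero coherence after one record is shifted. All signal parameters below are illustrative assumptions, not the paper's combustor data:

```python
import numpy as np
from scipy import signal

fs = 1024.0
rng = np.random.default_rng(2)
n = 16 * 1024

# Broadband pressure fluctuation seen by both sensors, plus independent noise.
b, a = signal.butter(4, [0.1, 0.2], btype="band")
common = signal.lfilter(b, a, rng.normal(size=n))
x = common + 0.3 * rng.normal(size=n)
y = common + 0.3 * rng.normal(size=n)

f, c_aligned = signal.coherence(x, y, fs=fs, nperseg=1024)
# "Unaligned": shift one record so the shared content no longer lines up;
# a healthy sensor pair then shows near-zero coherence.
f, c_unaligned = signal.coherence(x, np.roll(y, n // 2), fs=fs, nperseg=1024)

band = (f > 0.1 * fs / 2) & (f < 0.2 * fs / 2)
print(c_aligned[band].mean(), c_unaligned[band].mean())  # e.g. ~0.9 vs ~0.0
```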
Statistical mechanics of broadcast channels using low-density parity-check codes.
Nakamura, Kazutaka; Kabashima, Yoshiyuki; Morelos-Zaragoza, Robert; Saad, David
2003-03-01
We investigate the use of Gallager's low-density parity-check (LDPC) codes in a degraded broadcast channel, one of the fundamental models in network information theory. Combining linear codes is a standard technique in practical network communication schemes and is known to provide better performance than simple time sharing methods when algebraic codes are used. The statistical physics based analysis shows that the practical performance of the suggested method, achieved by employing the belief propagation algorithm, is superior to that of LDPC based time sharing codes while the best performance, when received transmissions are optimally decoded, is bounded by the time sharing limit.
Brief Instrumental School-Based Mentoring for Middle School Students: Theory and Impact
ERIC Educational Resources Information Center
McQuillin, Samuel D.; Lyons, Michael D.
2016-01-01
This study evaluated the efficacy of an intentionally brief school-based mentoring program. This academic goal-focused mentoring program was developed through a series of iterative randomized controlled trials, and is informed by research in social cognitive theory, cognitive dissonance theory, motivational interviewing, and research in academic…
Baker-Ericzén, Mary J.; Jenkins, Melissa M.; Park, Soojin; Garland, Ann F.
2014-01-01
Background Mental health professionals’ decision-making practice is an area of increasing interest and importance, especially in the pediatric research and clinical communities. Objective The present study explored the role of prior training in evidence-based treatments on clinicians’ assessment and treatment formulations using case vignettes. Specifically, study aims included using the Naturalistic Decision Making (NDM) cognitive theory to 1) examine potential associations between EBT training and decision-making processes (novice versus expert type), and 2) explore how client and family contextual information affects clinical decision-making. Methods Forty-eight clinicians across two groups (EBT trained=14, Not EBT trained=34) participated. Clinicians were comparable on professional experience, demographics, and discipline. The quasi-experimental design used an analog “think aloud” method where clinicians read case vignettes about a child with disruptive behavior problems and verbalized case conceptualization and treatment planning out-loud. Responses were coded according to NDM theory. Results MANOVA results were significant for EBT training status such that EBT trained clinicians’ displayed cognitive processes more closely aligned with “expert” decision-makers and non-EBT trained clinicians’ decision processes were more similar to “novice” decision-makers, following NDM theory. Non-EBT trained clinicians assigned significantly more diagnoses, provided less detailed treatment plans and discussed fewer EBTs. Parent/family contextual information also appeared to influence decision-making. Conclusion This study offers a preliminary investigation of the possible broader impacts of EBT training and potential associations with development of expert decision-making skills. Targeting clinicians’ decision-making may be an important avenue to pursue within dissemination-implementation efforts in mental health practice. PMID:25892901
Liver segmentation from CT images using a sparse priori statistical shape model (SP-SSM).
Wang, Xuehu; Zheng, Yongchang; Gan, Lan; Wang, Xuan; Sang, Xinting; Kong, Xiangfeng; Zhao, Jie
2017-01-01
This study proposes a new liver segmentation method based on a sparse a priori statistical shape model (SP-SSM). First, mark points are selected in the liver a priori model and the original image. Then, the a priori shape and its mark points are used to obtain a dictionary for the liver boundary information. Second, the sparse coefficient is calculated based on the correspondence between mark points in the original image and those in the a priori model, and then the sparse statistical model is established by combining the sparse coefficients and the dictionary. Finally, the intensity energy and boundary energy models are built based on the intensity information and the specific boundary information of the original image. Then, the sparse matching constraint model is established based on the sparse coding theory. These models jointly drive the iterative deformation of the sparse statistical model to approximate and accurately extract the liver boundaries. This method can solve the problems of deformation model initialization and a priori method accuracy using the sparse dictionary. The SP-SSM can achieve a mean overlap error of 4.8% and a mean volume difference of 1.8%, whereas the average symmetric surface distance and the root mean square symmetric surface distance can reach 0.8 mm and 1.4 mm, respectively.
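The sparse-representation step can be illustrated in isolation: a new shape (a stacked landmark vector) is approximated as a sparse combination of training shapes from a dictionary. A hedged sketch using orthogonal matching pursuit on synthetic data (not the paper's dictionary construction or energy models):

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(3)

# Dictionary: columns are training shapes (stacked landmark coordinates).
n_landmarks, n_shapes = 60, 40
D = rng.normal(size=(2 * n_landmarks, n_shapes))

# A "new patient" shape that truly is a mix of three training shapes + noise.
coef_true = np.zeros(n_shapes)
coef_true[[4, 17, 31]] = [0.6, 0.3, 0.1]
shape = D @ coef_true + 0.01 * rng.normal(size=2 * n_landmarks)

# Sparse coefficients: the statistical shape prior keeps the estimate on the
# manifold spanned by a few plausible training shapes.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=3).fit(D, shape)
print(np.nonzero(omp.coef_)[0])       # recovers indices {4, 17, 31}
recon = D @ omp.coef_ + omp.intercept_
print(np.linalg.norm(shape - recon))  # small reconstruction residual
```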
Sharifirad, Gholamreza; Mostafavi, Firoozeh; Reisi, Mahnouush; Mahaki, Behzad; Javadzade, Homamodin; Heydarabadi, Akbar Babaei; Esfahani, Mahmoud Nasr
2015-01-01
Background: Health literacy is one of the most important priorities for improving health care quality through enhanced patient-provider communication. Implementing health literacy strategies enables nurses to provide information and instructions to patients in a manner that is more commensurate with their abilities and more understandable. The purpose of this study was to investigate the factors affecting nurses' intention to implement health literacy strategies in patient education, based on the theory of planned behavior. Methods: A cross-sectional, descriptive-analytic study was conducted with 148 nurse practitioners of the AL-Zahra educational hospital in Isfahan, Iran. Data were collected via a standardized questionnaire based on the theory of planned behavior and analyzed with SPSS v.17 using ANOVA, independent t-tests, Pearson correlation, and linear regression. Results: There were statistically significant correlations between the use of health literacy strategies and marital status, attendance at retraining courses, employment type, job history, and job status. Perceived behavioral control was the most powerful predictor of intention (β=0.417) and of nurses' use of health literacy strategies in patient education (β=0.33). Conclusion: According to the findings of this study, perceived behavioral control is a powerful determinant of nurses' intention and behavior in using health literacy strategies in patient education. We therefore recommend that nurse educators pay special attention to the constructs of this theory, mainly perceived behavioral control, in retraining courses on patient education and health literacy strategies. PMID:25945078
A pilot randomized, controlled trial of an active video game physical activity intervention.
Peng, Wei; Pfeiffer, Karin A; Winn, Brian; Lin, Jih-Hsuan; Suton, Darijan
2015-12-01
Active video games (AVGs) transform the sedentary screen time of video gaming into active screen time and have great potential to serve as a "gateway" tool to a more active lifestyle for the least active individuals. This pilot randomized trial was conducted to explore the potential of theory-guided active video games for increasing moderate-to-vigorous physical activity (MVPA) among young adults. In this pilot 4-week intervention, participants were randomly assigned to one of the following groups: an AVG group with all the self-determination theory (SDT)-based game features turned off, an AVG group with all the SDT-based game features turned on, a passive gameplay group with all the SDT-based game features turned on, and a control group. Physical activity was measured using ActiGraph GT3X accelerometers. Other outcomes included attendance and perceived satisfaction of the needs for autonomy, competence, and relatedness. Playing the SDT-supported AVG resulted in greater MVPA compared with the control group immediately postintervention. The AVG with the theory-supported features also resulted in greater attendance and psychological need satisfaction than the non-theory-supported one. An AVG designed with motivation-theory-informed features positively impacted attendance and MVPA immediately postintervention, suggesting that including such features may be a way to address common problems with adherence and to increase the effectiveness of active gaming. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
Parker, Jack; Mawson, Susan; Mountain, Gail; Nasr, Nasrin; Zheng, Huiru
2014-06-05
Evidence indicates that post-stroke rehabilitation improves function, independence and quality of life. A key aspect of rehabilitation is the provision of appropriate information and feedback to the learner.Advances in information and communications technology (ICT) have allowed for the development of various systems to complement stroke rehabilitation that could be used in the home setting. These systems may increase the provision of rehabilitation a stroke survivor receives and carries out, as well as providing a learning platform that facilitates long-term self-managed rehabilitation and behaviour change. This paper describes the application of an innovative evaluative methodology to explore the utilisation of feedback for post-stroke upper-limb rehabilitation in the home. Using the principles of realistic evaluation, this study aimed to test and refine intervention theories by exploring the complex interactions of contexts, mechanisms and outcomes that arise from technology deployment in the home. Methods included focus groups followed by multi-method case studies (n = 5) before, during and after the use of computer-based equipment. Data were analysed in relation to the context-mechanism-outcome hypotheses case by case. This was followed by a synthesis of the findings to answer the question, 'what works for whom and in what circumstances and respects?' Data analysis reveals that to achieve desired outcomes through the use of ICT, key elements of computer feedback, such as accuracy, measurability, rewarding feedback, adaptability, and knowledge of results feedback, are required to trigger the theory-driven mechanisms underpinning the intervention. In addition, the pre-existing context and the personal and environmental contexts, such as previous experience of service delivery, personal goals, trust in the technology, and social circumstances may also enable or constrain the underpinning theory-driven mechanisms. Findings suggest that the theory-driven mechanisms underpinning the utilisation of feedback from computer-based technology for home-based upper-limb post-stroke rehabilitation are dependent on key elements of computer feedback and the personal and environmental context. The identification of these elements may therefore inform the development of technology; therapy education and the subsequent adoption of technology and a self-management paradigm; long-term self-managed rehabilitation; and importantly, improvements in the physical and psychosocial aspects of recovery.
Priming methods in semantics and pragmatics.
Maldonado, Mora; Spector, Benjamin; Chemla, Emmanuel
2017-01-01
Structural priming is a powerful method to inform linguistic theories. We argue that this method extends nicely beyond syntax to theories of meaning. Priming, however, should still be seen as only one of the tools available for linguistic data collection. Specifically, because priming can occur at different, potentially conflicting levels, it cannot detect every aspect of linguistic representations.
Cartographic generalization of urban street networks based on gravitational field theory
NASA Astrophysics Data System (ADS)
Liu, Gang; Li, Yongshu; Li, Zheng; Guo, Jiawei
2014-05-01
The automatic generalization of urban street networks is a constant and important aspect of geographical information science. Previous studies show that the dual graph for street-street relationships more accurately reflects the overall morphological properties and importance of streets than do other methods. In this study, we construct a dual graph to represent street-street relationship and propose an approach to generalize street networks based on gravitational field theory. We retain the global structural properties and topological connectivity of an original street network and borrow from gravitational field theory to define the gravitational force between nodes. The concept of multi-order neighbors is introduced and the gravitational force is taken as the measure of the importance contribution between nodes. The importance of a node is defined as the result of the interaction between a given node and its multi-order neighbors. Degree distribution is used to evaluate the level of maintaining the global structure and topological characteristics of a street network and to illustrate the efficiency of the suggested method. Experimental results indicate that the proposed approach can be used in generalizing street networks and retaining their density characteristics, connectivity and global structure.
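A minimal sketch of the gravity-style importance measure described above, assuming masses are taken as node degrees in the dual graph and distances as neighbor orders (both illustrative choices; the paper's exact mass and normalization terms may differ):

import numpy as np
from collections import deque

def neighbor_orders(adj, src):
    """BFS distances (neighbor orders) from src in an unweighted dual graph."""
    n = len(adj)
    dist = [-1] * n
    dist[src] = 0
    q = deque([src])
    while q:
        u = q.popleft()
        for v in np.flatnonzero(adj[u]):
            if dist[v] < 0:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def node_importance(adj, max_order=3):
    """Gravity-style importance: mass = degree, distance = neighbor order."""
    n = len(adj)
    deg = adj.sum(axis=1)
    imp = np.zeros(n)
    for i in range(n):
        dist = neighbor_orders(adj, i)
        for j in range(n):
            if i != j and 0 < dist[j] <= max_order:
                imp[i] += deg[i] * deg[j] / dist[j] ** 2  # F = m_i * m_j / d^2
    return imp

# toy dual graph: 5 streets, edges = intersections between streets
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0],
              [1, 1, 0, 0, 1],
              [0, 1, 0, 0, 1],
              [0, 0, 1, 1, 0]])
print(node_importance(A))  # keep the highest-scoring streets when generalizing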
Sociocultural Learning Theory in Practice: Implications for Athletic Training Educators
Peer, Kimberly S.; McClendon, Ronald C.
2002-01-01
Objective: To discuss cognitive and sociocultural learning theory literature related to athletic training instructional and evaluation strategies while providing support for the application of these practices in the didactic and clinical components of athletic training education programs. Data Sources: We searched Educational Resources Information Center (ERIC) and Education Abstracts from 1975–2001 using the key words social cognitive, sociocultural learning theory, constructivism, and athletic training education. Current literature in the fields of educational psychology and athletic training education provides the foundation for applying theory to practice with specific emphasis on the theoretic framework and application of sociocultural learning theory strategies in athletic training education. Data Synthesis: Athletic training educators must have a strong fundamental knowledge of learning theory and a commitment to incorporate theory into educational practice. We integrate literature from both fields to generate practical strategies for using sociocultural learning theory in athletic training education. Conclusions/Recommendations: Social cognitive and sociocultural learning theory advocates a constructive, self-regulated, and goal-oriented environment with the student at the center of the educational process. Although a shift exists in athletic training education toward more active instructional strategies with the implementation of competency-based education, many educational environments are still dominated by traditional didactic instructional methods promoting student passivity. As athletic training education programs strive to increase accountability, educators in the field must critically analyze teaching and evaluation methods and integrate new material to ensure that learning is maximized. PMID:12937534
NASA Astrophysics Data System (ADS)
Olsen, Jolie; Sen, Sandip
2014-04-01
Steven Brams's [(1994). Theory of moves. Cambridge University Press] Theory of Moves (TOM) is an alternative to the traditional game-theoretic treatment of real-life interactions, in which players choose strategies based on analysis of future moves and counter-moves that arise if game play commences at a specified start state and either player can choose to move first. In repeated play, players using TOM rationality arrive at nonmyopic equilibria. One advantage of TOM is its ability to model scenarios in which power asymmetries exist between players. In particular, threat power, i.e. the ability of one player to threaten and sustain immediate, globally disadvantageous outcomes to force a desirable result long term, can be utilised to induce Pareto optimal states in games such as Prisoner's Dilemma, which result in Pareto-dominated outcomes using traditional methods. Unfortunately, prior work on TOM is limited by an assumption of complete information. This paper presents a mechanism that can be used by a player to utilise threat power when playing a strict, ordinal 2×2 game under incomplete information. We also analyse the benefits of threat power and support this analysis with empirical evidence.
Plenoptic layer-based modeling for image based rendering.
Pearson, James; Brookes, Mike; Dragotti, Pier Luigi
2013-09-01
Image based rendering is an attractive alternative to model based rendering for generating novel views because of its lower complexity and potential for photo-realistic results. To reduce the number of images necessary for alias-free rendering, some geometric information for the 3D scene is normally necessary. In this paper, we present a fast automatic layer-based method for synthesizing an arbitrary new view of a scene from a set of existing views. Our algorithm takes advantage of the knowledge of the typical structure of multiview data to perform occlusion-aware layer extraction. In addition, the number of depth layers used to approximate the geometry of the scene is chosen based on plenoptic sampling theory with the layers placed non-uniformly to account for the scene distribution. The rendering is achieved using a probabilistic interpolation approach and by extracting the depth layer information on a small number of key images. Numerical results demonstrate that the algorithm is fast and yet is only 0.25 dB away from the ideal performance achieved with the ground-truth knowledge of the 3D geometry of the scene of interest. This indicates that there are measurable benefits from following the predictions of plenoptic theory and that they remain true when translated into a practical system for real world data.
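For intuition, plenoptic sampling analyses typically place depth layers uniformly in disparity (inverse depth) between the near and far planes; the paper adapts placement to the scene distribution, but the classical rule can be sketched as follows (function name and parameters are illustrative):

import numpy as np

def depth_layers(z_min, z_max, n_layers):
    """Place layers uniformly in disparity (1/z), as plenoptic sampling
    theory suggests, rather than uniformly in depth."""
    d = np.linspace(1.0 / z_min, 1.0 / z_max, n_layers)
    return 1.0 / d

print(depth_layers(2.0, 20.0, 5))  # layers crowd toward the near plane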
Summary of Work for Joint Research Interchanges with DARWIN Integrated Product Team 1998
NASA Technical Reports Server (NTRS)
Hesselink, Lambertus
1999-01-01
The intent of Stanford University's SciVis group is to develop technologies that enable comparative analysis and visualization techniques for simulated and experimental flow fields. These techniques would then be made available under the Joint Research Interchange for potential injection into the DARWIN Workspace Environment (DWE). In the past, we have focused on techniques that exploited feature-based comparisons such as shock and vortex extractions. Our current research effort focuses on finding a quantitative comparison of general vector fields based on topological features. Since the method relies on topological information, grid matching and vector alignment are not needed in the comparison; this is often a problem with many data comparison techniques. In addition, since only topology-based information is stored and compared for each field, there is a significant compression of information that enables large databases to be quickly searched. This report will briefly (1) describe current technologies in the area of comparison techniques, (2) describe the theory of our new method and finally (3) summarize a few of the results.
Autonomous entropy-based intelligent experimental design
NASA Astrophysics Data System (ADS)
Malakar, Nabin Kumar
2011-07-01
The aim of this thesis is to explore the application of probability and information theory in experimental design, and to do so in a way that combines what we know about inference and inquiry in a comprehensive and consistent manner. Present day scientific frontiers involve data collection at an ever-increasing rate. This requires that we find a way to collect the most relevant data in an automated fashion. By following the logic of the scientific method, we couple an inference engine with an inquiry engine to automate the iterative process of scientific learning. The inference engine involves Bayesian machine learning techniques to estimate model parameters based upon both prior information and previously collected data, while the inquiry engine implements data-driven exploration. By choosing an experiment whose distribution of expected results has the maximum entropy, the inquiry engine selects the experiment that maximizes the expected information gain. The coupled inference and inquiry engines constitute an autonomous learning method for scientific exploration. We apply it to a robotic arm to demonstrate the efficacy of the method. Optimizing inquiry involves searching for an experiment that promises, on average, to be maximally informative. If the set of potential experiments is described by many parameters, the search involves a high-dimensional entropy space. In such cases, a brute force search method will be slow and computationally expensive. We develop an entropy-based search algorithm, called nested entropy sampling, to select the most informative experiment. This helps to reduce the number of computations necessary to find the optimal experiment. We also extended the method of maximizing entropy, and developed a method of maximizing joint entropy so that it could be used as a principle of collaboration between two robots. This is a major achievement of this thesis, as it allows information-based collaboration between two robotic units toward the same goal in an automated fashion.
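A minimal sketch of the inquiry engine's selection rule as stated above: compute the predictive distribution of outcomes for each candidate experiment under the current posterior, and pick the experiment whose predictive distribution has maximum entropy. All names and the toy numbers are illustrative:

import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def select_experiment(posterior, likelihood):
    """posterior: (n_hypotheses,) weights over hypotheses.
    likelihood: (n_experiments, n_hypotheses, n_outcomes) = p(y | theta, e).
    Returns the index of the experiment whose predictive outcome
    distribution has maximum entropy."""
    predictive = np.einsum('k,eky->ey', posterior, likelihood)
    return int(np.argmax([entropy(p) for p in predictive]))

# toy: 3 hypotheses, 2 candidate experiments, binary outcome
post = np.array([0.5, 0.3, 0.2])
lik = np.array([[[0.9, 0.1], [0.8, 0.2], [0.7, 0.3]],   # exp 0: outcome nearly certain
                [[0.6, 0.4], [0.4, 0.6], [0.5, 0.5]]])  # exp 1: outcome uncertain
print(select_experiment(post, lik))  # -> 1, the more informative experiment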
A fuzzy MCDM approach for evaluating school performance based on linguistic information
NASA Astrophysics Data System (ADS)
Musani, Suhaina; Jemain, Abdul Aziz
2013-11-01
Decision making is the process of finding the best option among the feasible alternatives. This process should consider a variety of criteria, but this study focuses only on academic achievement. The data used are the percentages of candidates who obtained the Malaysian Certificate of Education (SPM) in Melaka, based on school academic achievement for each subject. The 57 secondary schools in Melaka listed by the Ministry of Education were involved in this study, so the school ranking can be done using MCDM (Multi-Criteria Decision Making) methods. The objective of this study is to develop a rational method for evaluating school performance based on linguistic information. Since the information on the level of academic achievement is provided in a linguistic manner, incomplete or uncertain data may arise. To overcome this, the information can be represented as fuzzy numbers, since fuzzy sets represent the uncertainty in human perceptions. In this research, VIKOR (Multi-Criteria Optimization and Compromise Solution) has been used as an MCDM tool for the school ranking process in a fuzzy environment. Results showed that fuzzy set theory can address the limitations of MCDM when uncertainty exists in the data.
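For reference, a crisp (defuzzified) VIKOR sketch is given below; the study itself works with fuzzy numbers, so treat the inputs here as defuzzified scores. Weights and data are illustrative:

import numpy as np

def vikor(F, w, v=0.5):
    """F: (m alternatives, n benefit criteria) scores; w: criterion weights.
    Returns Q, the compromise ranking score (lower = better)."""
    f_star, f_minus = F.max(axis=0), F.min(axis=0)
    norm = np.where(f_star - f_minus == 0, 1.0, f_star - f_minus)
    D = w * (f_star - F) / norm          # weighted regret per criterion
    S, R = D.sum(axis=1), D.max(axis=1)  # group utility, individual regret
    dS = float(S.max() - S.min()) or 1.0
    dR = float(R.max() - R.min()) or 1.0
    return v * (S - S.min()) / dS + (1 - v) * (R - R.min()) / dR

scores = np.array([[80., 70., 90.],    # school A: pass rates per subject
                   [75., 85., 60.],
                   [90., 65., 70.]])
weights = np.array([0.4, 0.3, 0.3])
print(vikor(scores, weights).round(3))  # rank schools by ascending Q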
Extracting information in spike time patterns with wavelets and information theory.
Lopes-dos-Santos, Vítor; Panzeri, Stefano; Kayser, Christoph; Diamond, Mathew E; Quian Quiroga, Rodrigo
2015-02-01
We present a new method to assess the information carried by temporal patterns in spike trains. The method first performs a wavelet decomposition of the spike trains, then uses Shannon information to select a subset of coefficients carrying information, and finally assesses timing information in terms of decoding performance: the ability to identify the presented stimuli from spike train patterns. We show that the method allows: 1) a robust assessment of the information carried by spike time patterns even when this is distributed across multiple time scales and time points; 2) an effective denoising of the raster plots that improves the estimate of stimulus tuning of spike trains; and 3) an assessment of the information carried by temporally coordinated spikes across neurons. Using simulated data, we demonstrate that the Wavelet-Information (WI) method performs better and is more robust to spike time-jitter, background noise, and sample size than well-established approaches, such as principal component analysis, direct estimates of information from digitized spike trains, or a metric-based method. Furthermore, when applied to real spike trains from monkey auditory cortex and from rat barrel cortex, the WI method allows extracting larger amounts of spike timing information. Importantly, the fact that the WI method incorporates multiple time scales makes it robust to the choice of partly arbitrary parameters such as temporal resolution, response window length, number of response features considered, and the number of available trials. These results highlight the potential of the proposed method for accurate and objective assessments of how spike timing encodes information. Copyright © 2015 the American Physiological Society.
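A schematic of the WI pipeline on toy data, assuming a Haar wavelet, mutual-information-based coefficient selection and a naive Bayes decoder as stand-ins for the paper's exact choices:

import numpy as np
import pywt
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
# toy data: 200 trials x 64 time bins, 2 stimuli differing in spike latency
labels = rng.integers(0, 2, 200)
trials = rng.poisson(0.1, (200, 64)).astype(float)
for i, s in enumerate(labels):
    trials[i, 10 + 20 * s] += rng.poisson(3)  # stimulus-locked spikes

# 1) wavelet decomposition of each trial across multiple time scales
coeffs = np.array([np.concatenate(pywt.wavedec(t, 'haar', level=4))
                   for t in trials])

# 2) keep the coefficients most informative about the stimulus
mi = mutual_info_classif(coeffs, labels, random_state=0)
best = np.argsort(mi)[-10:]

# 3) assess timing information as decoding performance
print(cross_val_score(GaussianNB(), coeffs[:, best], labels, cv=5).mean())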
Quantum Information as a Non-Kolmogorovian Generalization of Shannon's Theory
NASA Astrophysics Data System (ADS)
Holik, Federico; Bosyk, Gustavo; Bellomo, Guido
2015-10-01
In this article we discuss the formal structure of a generalized information theory based on the extension of the probability calculus of Kolmogorov to a (possibly) non-commutative setting. By studying this framework, we argue that quantum information can be considered as a particular case of a huge family of non-commutative extensions of its classical counterpart. In any conceivable information theory, the possibility of dealing with different kinds of information measures plays a key role. Here, we generalize a notion of state spectrum, allowing us to introduce a majorization relation and a new family of generalized entropic measures.
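For reference, the standard majorization relation the authors build on can be written as follows (a textbook definition, not notation quoted from the paper); with the components of probability vectors sorted in decreasing order,

x \prec y \iff \sum_{i=1}^{k} x_i^{\downarrow} \le \sum_{i=1}^{k} y_i^{\downarrow} \quad (k = 1, \dots, n-1), \qquad \sum_{i=1}^{n} x_i^{\downarrow} = \sum_{i=1}^{n} y_i^{\downarrow} = 1 .

Entropic measures in such families are Schur-concave, so x \prec y implies H(x) \ge H(y); a majorization relation on generalized state spectra therefore induces a consistent "mixedness" ordering across the whole family of generalized entropies.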
The application of machine vision in fire protection system
NASA Astrophysics Data System (ADS)
Rong, Jiang
2018-04-01
Building on previous research, this paper introduces wavelet theory into a fire protection system: video frames are collected by the surveillance system, and the key information needed for fire detection is computed from them. The algorithm extracts flame colour characteristics and smoke characteristics from the collected frames and processes them as feature information. The alarm system sets a corresponding alarm threshold; when this threshold is exceeded, the system raises an alarm. Combining flame colour characteristics with smoke characteristics improves both the accuracy and the efficiency of fire detection. Experiments show that the scheme is feasible.
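A rough sketch of the flame-colour step in such a pipeline, assuming an HSV threshold rule; the threshold values and the alarm fraction are illustrative, not taken from the paper:

import cv2
import numpy as np

def flame_mask(frame_bgr):
    """Rough flame-coloured region detector: bright red/orange/yellow pixels.
    Threshold values are illustrative placeholders."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower = np.array([0, 120, 180])    # hue ~ red/orange, saturated, bright
    upper = np.array([35, 255, 255])
    mask = cv2.inRange(hsv, lower, upper)
    # remove isolated pixels that would otherwise trigger false alarms
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))

def alarm(frame_bgr, threshold=0.02):
    """Raise an alarm when the flame-coloured fraction exceeds a threshold."""
    return flame_mask(frame_bgr).mean() / 255.0 > threshold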
Zare Sakhvidi, Mohammad Javad; Zare, Maryam; Mostaghaci, Mehrdad; Mehrparvar, Amir Houshang; Morowatisharifabad, Mohammad Ali; Naghshineh, Elham
2015-01-01
Background. The aim of this study was to describe the preventive behaviors of industrial workers and the factors influencing occupational cancer prevention behaviors, using protection motivation theory (PMT). Methods. A self-administered questionnaire was completed by 161 petrochemical workers in Iran in 2014; it consisted of three sections: background information, protection motivation theory measures, and occupational cancer preventive behaviors. Results. A statistically significant positive correlation was found between protection motivation (PM) and self-efficacy, response efficacy, and the cancer preventive behaviors. Meanwhile, statistically significant negative correlations were found between PM and cost and reward. Conclusions. Among the available PMT constructs, only self-efficacy and cost were significant predictors of preventive behaviors. Health promotion interventions based on the protection motivation model, with a focus on self-efficacy and cost, would be desirable for occupational cancer prevention.
Structural reliability analysis under evidence theory using the active learning kriging model
NASA Astrophysics Data System (ADS)
Yang, Xufeng; Liu, Yongshou; Ma, Panke
2017-11-01
Structural reliability analysis under evidence theory is investigated. It is rigorously proved that a surrogate model providing only correct sign prediction of the performance function can meet the accuracy requirement of evidence-theory-based reliability analysis. Accordingly, a method based on the active learning kriging model which only correctly predicts the sign of the performance function is proposed. Interval Monte Carlo simulation and a modified optimization method based on Karush-Kuhn-Tucker conditions are introduced to make the method more efficient in estimating the bounds of failure probability based on the kriging model. Four examples are investigated to demonstrate the efficiency and accuracy of the proposed method.
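A minimal sketch of sign-oriented active learning with a kriging (Gaussian process) surrogate, using the common U learning function; the toy performance function, pool sampling and stopping threshold are illustrative stand-ins for the paper's interval Monte Carlo machinery:

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def g(x):  # toy performance function: g <= 0 means failure
    return x[:, 0] ** 2 + x[:, 1] - 3.0

rng = np.random.default_rng(1)
pool = rng.uniform(-2, 2, (2000, 2))          # candidate points
idx = list(rng.choice(len(pool), 12, replace=False))

for _ in range(50):
    gp = GaussianProcessRegressor(kernel=RBF(1.0), normalize_y=True)
    gp.fit(pool[idx], g(pool[idx]))
    mu, sd = gp.predict(pool, return_std=True)
    U = np.abs(mu) / np.maximum(sd, 1e-12)    # low U = sign may be mispredicted
    if U.min() > 2.0:                          # sign is trusted everywhere
        break
    idx.append(int(np.argmin(U)))              # evaluate g where sign is least certain

print("failure fraction on pool:", (mu <= 0).mean())

The design choice mirrors the abstract's key point: only the sign of g must be predicted correctly, so samples are added where the surrogate's sign is least certain rather than where its value is least accurate.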
Using Wavelet Bases to Separate Scales in Quantum Field Theory
NASA Astrophysics Data System (ADS)
Michlin, Tracie L.
This thesis investigates the use of Daubechies wavelets to separate scales in local quantum field theory. Field theories have an infinite number of degrees of freedom on all distance scales. Quantum field theories are believed to describe the physics of subatomic particles, yet these theories have no known mathematically convergent approximation methods. Daubechies wavelet bases can be used to separate degrees of freedom on different distance scales. Volume and resolution truncations lead to mathematically well-defined truncated theories that can be treated using established methods. This work demonstrates that flow equation methods can be used to block diagonalize truncated field theoretic Hamiltonians by scale, eliminating the fine-scale degrees of freedom. This may lead to approximation methods and provide an understanding of how to formulate well-defined fine resolution limits.
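As a one-dimensional illustration of the scale separation idea (a classical signal, not a field configuration), a Daubechies multiresolution decomposition splits the degrees of freedom by scale and lets one truncate the fine scales:

import numpy as np
import pywt

t = np.linspace(0, 1, 512)
field = np.sin(2 * np.pi * 3 * t) + 0.3 * np.sin(2 * np.pi * 60 * t)

coeffs = pywt.wavedec(field, 'db4', level=5)   # Daubechies-4 multiresolution
coarse = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
smooth = pywt.waverec(coarse, 'db4')           # keep only coarse-scale dof

print(len(field), [len(c) for c in coeffs])    # dof split across scales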
ERIC Educational Resources Information Center
Abuhamdieh, Ayman H.; Harder, Joseph T.
2015-01-01
This paper proposes a meta-cognitive, systems-based, information structuring model (McSIS) to systematize online information search behavior based on literature review of information-seeking models. The General Systems Theory's (GST) propositions serve as its framework. Factors influencing information-seekers, such as the individual learning…
Sevick, Mary Ann; Woolf, Kathleen; Mattoo, Aditya; Katz, Stuart D; Li, Huilin; St-Jules, David E; Jagannathan, Ram; Hu, Lu; Pompeii, Mary Lou; Ganguzza, Lisa; Li, Zhi; Sierra, Alex; Williams, Stephen K; Goldfarb, David S
2018-01-01
Patients with complex chronic diseases usually must make multiple lifestyle changes to limit and manage their conditions. Numerous studies have shown that education alone is insufficient for engaging people in lifestyle behavior change, and that theory-based behavioral approaches also are necessary. However, even the most motivated individual may have difficulty with making lifestyle changes because of the information complexity associated with multiple behavior changes. The goal of the current Healthy Hearts and Kidneys study was to evaluate different mobile health (mHealth)-delivered intervention approaches for engaging individuals with type 2 diabetes (T2D) and concurrent chronic kidney disease (CKD) in behavior changes. Participants were randomized to 1 of 4 groups, receiving: (1) behavioral counseling, (2) technology-based self-monitoring to reduce information complexity, (3) combined behavioral counseling and technology-based self-monitoring, or (4) baseline advice. We will determine the impact of randomization assignment on weight loss success and 24-hour urinary excretion of sodium and phosphorus. With this report we describe the study design, methods, and approaches used to assure information security for this ongoing clinical trial. ClinicalTrials.gov Identifier: NCT02276742. Copyright © 2017. Published by Elsevier Inc.
Plowden, K O; Wenger, A F
2001-01-01
African Americans are facing a serious health crisis. They are disproportionately affected by most chronic illnesses. The disparity among ethnic groups as it relates to health and illness is related to psychosocial and biological factors within the African American culture. Many African Americans are reluctant to participate in studies. This article discusses the process of creating a caring community when conducting research within an African American community, based on the experience of the authors with two faith communities in a southern metropolitan area in the United States. The process is identified as unknowing, reflection, presence, and knowing. The process is based on Leininger's theory of culture care diversity and universality and her stranger to friend enabler. When the theory and method are used, the investigator moves from a stranger within the community to a trusted friend and begins to collect rich and valuable data for analysis from the informants' point of view.
An Isomorphism between Lyapunov Exponents and Shannon's Channel Capacity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedland, Gerald; Metere, Alfredo
We demonstrate that discrete Lyapunov exponents are isomorphic to numeric overflows of the capacity of an arbitrary noiseless and memoryless channel in a Shannon communication model with feedback. The isomorphism allows the understanding of Lyapunov exponents in terms of information theory, rather than the traditional definitions in chaos theory. The result also implies alternative approaches to the calculation of related quantities, such as the Kolmogorov-Sinai entropy, which has been linked to thermodynamic entropy. This work provides a bridge between fundamental physics and information theory. It suggests, among other things, that machine learning and other information theory methods can be employed at the core of physics simulations.
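To make the information-rate reading concrete, the base-2 Lyapunov exponent of the logistic map can be computed directly; reading it as bits per iteration echoes the paper's link between exponents and channel capacity (the map and parameters are a standard textbook example, not drawn from the paper):

import numpy as np

def lyapunov_bits(r=4.0, x0=0.3, n=100000, burn=1000):
    """Base-2 Lyapunov exponent of the logistic map x -> r x (1 - x):
    bits of initial-condition information lost per iteration."""
    x, acc = x0, 0.0
    for i in range(n + burn):
        if i >= burn:
            # |f'(x)| = |r (1 - 2x)|; guard against the rare zero derivative
            acc += np.log2(max(abs(r * (1 - 2 * x)), 1e-12))
        x = r * x * (1 - x)
    return acc / n

print(lyapunov_bits())  # ~1.0 bit/iteration for r = 4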
NASA Astrophysics Data System (ADS)
Yu, Haiyan; Fan, Jiulun
2017-12-01
Local thresholding methods for uneven lighting image segmentation always have the limitations that they are very sensitive to noise injection and that the performance relies largely upon the choice of the initial window size. This paper proposes a novel algorithm for segmenting uneven lighting images with strong noise injection based on non-local spatial information and intuitionistic fuzzy theory. We regard an image as a gray wave in three-dimensional space, which is composed of many peaks and troughs, and these peaks and troughs can divide the image into many local sub-regions in different directions. Our algorithm computes the relative characteristic of each pixel located in the corresponding sub-region based on fuzzy membership function and uses it to replace its absolute characteristic (its gray level) to reduce the influence of uneven light on image segmentation. At the same time, the non-local adaptive spatial constraints of pixels are introduced to avoid noise interference with the search of local sub-regions and the computation of local characteristics. Moreover, edge information is also taken into account to avoid false peak and trough labeling. Finally, a global method based on intuitionistic fuzzy entropy is employed on the wave transformation image to obtain the segmented result. Experiments on several test images show that the proposed method has excellent capability of decreasing the influence of uneven illumination on images and noise injection and behaves more robustly than several classical global and local thresholding methods.
The TEACH Method: An Interactive Approach for Teaching the Needs-Based Theories Of Motivation
ERIC Educational Resources Information Center
Moorer, Cleamon, Jr.
2014-01-01
This paper describes an interactive approach for explaining and teaching the Needs-Based Theories of Motivation. The acronym TEACH stands for Theory, Example, Application, Collaboration, and Having Discussion. This method can help business students to better understand and distinguish the implications of Maslow's Hierarchy of Needs,…
Kothari, Anita; Boyko, Jennifer A; Campbell-Davison, Andrea
2015-09-09
Informal knowledge is used in public health practice to make sense of research findings. Although knowledge translation theories highlight the importance of informal knowledge, it is not clear to what extent the same literature provides guidance in terms of how to use it in practice. The objective of this study was to address this gap by exploring what planned action theories suggest in terms of using three types of informal knowledge: local, experiential and expert. We carried out an exploratory secondary analysis of the planned action theories that informed the development of a popular knowledge translation theory. Our sample included twenty-nine (n = 29) papers. We extracted information from these papers about sources of and guidance for using informal knowledge, and then carried out a thematic analysis. We found that theories of planned action provide guidance (including sources of, methods for identifying, and suggestions for use) for using local, experiential and expert knowledge. This study builds on previous knowledge translation related work to provide insight into the practical use of informal knowledge. Public health practitioners can refer to the guidance summarized in this paper to inform their decision-making. Further research about how to use informal knowledge in public health practice is needed given the value being accorded to using informal knowledge in public health decision-making processes.
Using evaluation theory in priority setting and resource allocation.
Smith, Neale; Mitton, Craig; Cornelissen, Evelyn; Gibson, Jennifer; Peacock, Stuart
2012-01-01
Public sector interest in methods for priority setting and program or policy evaluation has grown considerably over the last several decades, given increased expectations for accountable and efficient use of resources and emphasis on evidence-based decision making as a component of good management practice. While there has been some occasional effort to conduct evaluation of priority setting projects, the literatures around priority setting and evaluation have largely evolved separately. In this paper, the aim is to bring them together. The contention is that evaluation theory is a means by which evaluators reflect upon what it is they are doing when they do evaluation work. Theories help to organize thinking, sort out relevant from irrelevant information, provide transparent grounds for particular implementation choices, and can help resolve problematic issues which may arise in the conduct of an evaluation project. A detailed review of three major branches of evaluation theory--methods, utilization, and valuing--identifies how such theories can guide the development of efforts to evaluate priority setting and resource allocation initiatives. Evaluation theories differ in terms of their guiding question, anticipated setting or context, evaluation foci, perspective from which benefits are calculated, and typical methods endorsed. Choosing a particular theoretical approach will structure the way in which any priority setting process is evaluated. The paper suggests that explicitly considering evaluation theory makes key aspects of the evaluation process more visible to all stakeholders, and can assist in the design of effective evaluation of priority setting processes; this should iteratively serve to improve the understanding of priority setting practices themselves.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDonnell, J. D.; Schunck, N.; Higdon, D.
2015-03-24
Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. As a result, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
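A toy sketch of the emulator-based workflow: calibrate a cheap stand-in model by chi-square, emulate the chi-square surface with a Gaussian process from a few "expensive" runs, then propagate the posterior to a predicted observable. The model, grids and noise level are all illustrative, not the nuclear DFT setup:

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def model(theta, x):           # stand-in for an expensive DFT calculation
    return theta * np.sin(x)

x_data = np.linspace(0.5, 3.0, 8)
y_data = model(1.7, x_data) + rng.normal(0, 0.05, x_data.size)

def chi2(theta):
    r = (y_data - model(theta, x_data)) / 0.05
    return np.sum(r * r)

# emulate chi^2 from a handful of "expensive" runs
theta_train = np.linspace(1.0, 2.5, 9)[:, None]
gp = GaussianProcessRegressor(kernel=RBF(0.5), normalize_y=True)
gp.fit(theta_train, [chi2(t[0]) for t in theta_train])

# posterior on a dense grid via the emulator, then propagate to a prediction
grid = np.linspace(1.0, 2.5, 400)[:, None]
c = gp.predict(grid)
post = np.exp(-0.5 * (c - c.min()))        # shift for numerical stability
post /= post.sum()
pred = model(grid[:, 0], 1.0)              # observable at a new point x = 1.0
mean = np.sum(post * pred)
sd = np.sqrt(np.sum(post * pred ** 2) - mean ** 2)
print("propagated prediction:", mean, "+/-", sd)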
Van de Velde, Dominique; Devisch, Ignaas; De Vriendt, Patricia
2016-08-01
Purpose To explore the perspectives of male clients in a neurological rehabilitation setting with regard to the occupational therapy they have received and the client-centred approach. Method This study involved a qualitative research design based on the grounded theory tradition. Individual in-depth interviews were used to collect data. Data were analysed using a constant comparative method. Seven male participants from an inpatient neurological setting were included using a theoretical sampling technique. Results Three themes emerged to describe the approach of the therapists to client-centred practice: (a) a shared biomedical focus as the start of the rehabilitation process, (b) the un-simultaneous shift from a biomedical towards a psycho-social focus and (c) the formal versus informal nature of gathering client information. Conclusion A client-centred approach entails a shift from the therapist focussing on recovery from the short-term neurological issues towards the long-term consequences of the disease. According to the client, this shift in reasoning must occur at a specific and highly subjective moment during the rehabilitation process. Identifying this moment could strengthen the client-centred approach. Implications for Rehabilitation Client-centred practice entails a shift from recovering the short-term neurological issues towards the long-term psycho-social consequences of the disease. To be effective in client-centred practice, clients expect the professional to be an authority with regard to biomedical issues and a partner with regard to psycho-social issues. Client-centred practice is most likely to be successful when the client is ready to discuss his psycho-social issues, and finding this moment is a challenge for the professional. Using formal methods for goal setting does not necessarily capture all the information needed for a client-centred therapy programme; rather, using informal methods could lead to a more valid image of the client.
Predicting activity approach based on new atoms similarity kernel function.
Abu El-Atta, Ahmed H; Moussa, M I; Hassanien, Aboul Ella
2015-07-01
Drug design is a high-cost and long-term process. To reduce the time and cost of drug discovery, new techniques are needed. The chemoinformatics field applies informational techniques from computer science, such as machine learning and graph theory, to discover chemical compound properties, such as toxicity or biological activity, by analyzing molecular structure (the molecular graph). There is thus an increasing need for algorithms that analyze and classify graph data to predict the activity of molecules. Kernel methods provide a powerful framework that combines machine learning with graph theory techniques, and they have led to impressive performance in several chemoinformatics problems such as biological activity prediction. This paper presents a new approach based on kernel functions to solve the activity prediction problem for chemical compounds. First, we encode all atoms depending on their neighbors; then we use these codes to find relationships between atoms; finally, we use the relations between different atoms to compute the similarity between chemical compounds. The proposed approach was compared with many other classification methods, and the results show competitive accuracy. Copyright © 2015 Elsevier Inc. All rights reserved.
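A minimal sketch of the idea, assuming one simple neighborhood encoding (element plus sorted neighbor elements) and a multiset Tanimoto similarity as the kernel; the paper's actual codes and kernel may differ:

from collections import Counter

def atom_codes(atoms, bonds):
    """Encode each atom by its element plus the sorted elements of its
    neighbors (one illustrative choice of neighborhood encoding)."""
    nbrs = {i: [] for i in range(len(atoms))}
    for a, b in bonds:
        nbrs[a].append(atoms[b])
        nbrs[b].append(atoms[a])
    return Counter(atoms[i] + '(' + ''.join(sorted(nbrs[i])) + ')'
                   for i in range(len(atoms)))

def tanimoto_kernel(c1, c2):
    """Multiset Tanimoto similarity between two compounds' code bags."""
    inter = sum((c1 & c2).values())
    union = sum((c1 | c2).values())
    return inter / union if union else 0.0

ethanol = atom_codes(['C', 'C', 'O'], [(0, 1), (1, 2)])
methanol = atom_codes(['C', 'O'], [(0, 1)])
print(tanimoto_kernel(ethanol, methanol))  # 0.25 for this toy encoding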
Kim, Og Yeon; Shim, Soonmi
2014-01-01
BACKGROUND/OBJECTIVES The purpose of this study was to identify how the level of information affected intention, using the Theory of Planned Behavior. SUBJECTS/METHODS The study surveyed diverse community centers and shopping malls in Seoul, which yielded N = 209 datasets. To compare processed food consumption behavior, we divided the sample into two groups based on the level of information about food additives (whether respondents felt that information on food additives was sufficient or not). We analyzed differences between the two groups in attitudes toward food additives and toward purchasing processed foods, subjective norms, perceived behavioral control, and behavioral intentions toward processed foods. RESULTS The results confirmed that more than 78% of respondents thought information on food additives was insufficient. However, the group who felt information was sufficient had more positive attitudes about consuming processed foods and stronger behavioral intentions than the group who thought information was inadequate. This study found that people who consider themselves to have sufficient information on food additives tend to have more positive attitudes toward processed foods and a stronger intention to consume them. CONCLUSIONS This study suggests an increasing need for nutrition education on the appropriate use of processed foods. Designing useful nutrition education requires a good understanding of the factors that influence processed food consumption. PMID:24944779
Information Dissemination of Public Health Emergency on Social Networks and Intelligent Computation
Hu, Hongzhi; Mao, Huajuan; Hu, Xiaohua; Hu, Feng; Sun, Xuemin; Jing, Zaiping; Duan, Yunsuo
2015-01-01
Due to its extensive social influence, public health emergency has attracted great attention in today's society. Booming social networks are becoming a main information dissemination platform for such events and have raised high concern in emergency management; a good prediction of information dissemination in social networks is necessary for estimating an event's social impact and making a proper strategy. However, information dissemination is largely affected by complex interactive activities and group behaviors in the social network, and existing methods and models achieve limited prediction results because of open, changeable social connections and uncertain information processing behaviors. ACP (artificial societies, computational experiments, and parallel execution) provides an effective way to simulate the real situation. In order to obtain better information dissemination prediction in social networks, this paper proposes an intelligent computation method under the framework of TDF (Theory-Data-Feedback) based on an ACP simulation system, which was successfully applied to the analysis of the A (H1N1) flu emergency. PMID:26609303
A feature selection approach towards progressive vector transmission over the Internet
NASA Astrophysics Data System (ADS)
Miao, Ru; Song, Jia; Feng, Min
2017-09-01
WebGIS has been widely applied for visualizing and sharing geospatial information over the Internet. In order to improve the efficiency of client applications, a web-based progressive vector transmission approach is proposed: important features should be selected and transferred first, so methods for measuring the importance of features must be considered in the progressive transmission. However, studies on progressive transmission of large-volume vector data have mostly focused on map generalization in the field of cartography and have rarely discussed the quantitative selection of geographic features. This paper applies information theory to measuring the feature importance of vector maps. A measurement model for the amount of information carried by vector features is defined to support feature selection; the model involves a geometry factor, a spatial distribution factor and a thematic attribute factor. Moreover, a real-time transport protocol (RTP)-based progressive transmission method is presented to improve the transmission of vector data. To demonstrate the essential methodology and key techniques, a prototype for web-based progressive vector transmission is presented, and an experiment on progressive selection and transmission of vector features is conducted. The experimental results indicate that our approach clearly improves the performance and end-user experience of delivering and manipulating large vector data over the Internet.
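A sketch of how the three factors could combine into a transmission order; the normalizations and weights here are placeholders, not the paper's calibrated model:

import numpy as np

def feature_importance(geom_complexity, local_density, thematic_weight,
                       w=(0.4, 0.3, 0.3)):
    """Combine the three factors named in the paper into one score.
    Normalization and weights are illustrative placeholders."""
    g = geom_complexity / geom_complexity.max()
    s = 1.0 - local_density / local_density.max()   # sparse areas first
    t = thematic_weight / thematic_weight.max()
    return w[0] * g + w[1] * s + w[2] * t

# toy: 5 features -> transmission order, most important first
g = np.array([120, 15, 60, 200, 40], float)   # e.g., vertex counts
s = np.array([0.8, 0.2, 0.5, 0.9, 0.1], float)
t = np.array([3, 1, 2, 3, 2], float)          # e.g., road class
order = np.argsort(-feature_importance(g, s, t))
print(order)  # send features in this order, refining the map progressively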
A framework for the evaluation of patient information leaflets
Garner, Mark; Ning, Zhenye; Francis, Jill
2011-01-01
Background. The provision of patient information leaflets (PILs) is an important part of health care. PILs require evaluation, but the frameworks that are used for evaluation are largely under‐informed by theory. Most evaluation to date has been based on indices of readability, yet several writers argue that readability is not enough. We propose a framework for evaluating PILs that reflects the central role of the patient perspective in communication and uses methods for evaluation based on simple linguistic principles. The proposed framework. The framework has three elements that give rise to three approaches to evaluation. Each element is a necessary but not sufficient condition for effective communication. Readability (focussing on text) may be assessed using existing well‐established procedures. Comprehensibility (focussing on reader and text) may be assessed using multiple‐choice questions based on the lexical and semantic features of the text. Communicative effectiveness (focussing on reader) explores the relationship between the emotional, cognitive and behavioural responses of the reader and the objectives of the PIL. Suggested methods for assessment are described, based on our preliminary empirical investigations. Conclusions. The tripartite model of communicative effectiveness is a patient‐centred framework for evaluating PILs. It may assist the field in moving beyond readability to broader indicators of the quality and appropriateness of printed information provided to patients. PMID:21332620
Extended Importance Sampling for Reliability Analysis under Evidence Theory
NASA Astrophysics Data System (ADS)
Yuan, X. K.; Chen, B.; Zhang, B. Q.
2018-05-01
In the early stages of engineering practice, the lack of data and information makes uncertainty difficult to deal with. Evidence theory has been proposed as an alternative to traditional probability theory for handling uncertainty with limited information. In this contribution, a simulation-based approach, called 'extended importance sampling', is proposed based on evidence theory to handle problems with epistemic uncertainty. The proposed approach stems from traditional importance sampling for reliability analysis under probability theory and is developed to handle problems with epistemic uncertainty. It first introduces a nominal instrumental probability density function (PDF) for every epistemic uncertainty variable, so that an 'equivalent' reliability problem under probability theory is obtained. Then samples of these variables are generated by importance sampling. Based on these samples, the plausibility and belief (upper and lower bounds of the failure probability) can be estimated more efficiently than with direct Monte Carlo simulation. Numerical and engineering examples are given to illustrate the efficiency and feasibility of the proposed approach.
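The belief/plausibility bookkeeping that such a method must reproduce can be sketched directly for a one-dimensional toy problem; here a dense scan stands in for both the instrumental-PDF sampling and the optimization over each focal element:

import numpy as np

def g(x):                        # toy performance function; failure when g(x) <= 0
    return 4.5 - x

# evidence structure on an epistemic variable: focal intervals with BPA masses
focal = [((1.0, 3.0), 0.5), ((2.5, 4.0), 0.3), ((3.5, 6.0), 0.2)]

bel = pl = 0.0
for (lo, hi), m in focal:
    gs = g(np.linspace(lo, hi, 1001))   # scan stands in for the optimization step
    if gs.max() <= 0:                   # the entire focal element fails -> belief
        bel += m
    if gs.min() <= 0:                   # some point of it fails -> plausibility
        pl += m
print("failure probability bounds: [%.2f, %.2f]" % (bel, pl))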
Multi-sensor information fusion method for vibration fault diagnosis of rolling bearing
NASA Astrophysics Data System (ADS)
Jiao, Jing; Yue, Jianhai; Pei, Di
2017-10-01
Bearings are key elements in high-speed electric multiple units (EMU), and any bearing defect can cause serious malfunction under high operating speed. This paper presents a new method for bearing fault diagnosis based on least squares support vector machine (LS-SVM) for feature-level fusion and Dempster-Shafer (D-S) evidence theory for decision-level fusion, addressing the low detection accuracy, the difficulty of extracting sensitive characteristics and the instability of single-sensor diagnosis in rolling bearing fault diagnosis. A wavelet de-noising technique was used to remove signal noise. LS-SVM was used for pattern recognition of the bearing vibration signal, and fusion was then performed according to D-S evidence theory, so as to realize recognition of the bearing fault. The results indicated that the data fusion method significantly improved the performance of the intelligent approach to rolling bearing fault detection. Moreover, the results showed that this method can efficiently improve the accuracy of fault diagnosis.
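For reference, the decision-level fusion step, Dempster's rule of combination, can be sketched compactly; the mass values below are illustrative, not from the paper:

def dempster(m1, m2):
    """Combine two mass functions over frozenset focal elements
    (Dempster's rule with conflict normalization)."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# two sensors' evidence over fault hypotheses {inner race, outer race}
F = frozenset
m_vib = {F({'inner'}): 0.6, F({'outer'}): 0.1, F({'inner', 'outer'}): 0.3}
m_tmp = {F({'inner'}): 0.5, F({'outer'}): 0.2, F({'inner', 'outer'}): 0.3}
print(dempster(m_vib, m_tmp))  # fused masses sharpen toward 'inner'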
ERIC Educational Resources Information Center
Finke, Erinn H.; Hickerson, Benjamin; McLaughlin, Eileen
2015-01-01
Purpose: The purpose of this study was to determine parental attitudes regarding engagement with video games by their children with autism spectrum disorder (ASD) and whether attitudes vary based on ASD symptom severity. Method: Online survey methodology was used to gather information from parents of children with ASD between the ages of 8 and 12…
Liu, Jiaen; Zhang, Xiaotong; Schmitter, Sebastian; Van de Moortele, Pierre-Francois; He, Bin
2014-01-01
Purpose To develop high-resolution electrical properties tomography (EPT) methods and investigate a gradient-based EPT (gEPT) approach which aims to reconstruct the electrical properties (EP), including conductivity and permittivity, of an imaged sample from experimentally measured B1 maps with improved boundary reconstruction and robustness against measurement noise. Theory and Methods Using a multi-channel transmit/receive stripline head coil with acquired B1 maps for each coil element, and assuming a negligible Bz component compared with the transverse B1 components, a theory describing the relationship between the B1 field, the EP values and their spatial gradients is proposed. The final EP images were obtained through spatial integration over the reconstructed EP gradient. Numerical simulation, physical phantom and in vivo human experiments at 7 T have been conducted to evaluate the performance of the proposed methods. Results Reconstruction results were compared with target EP values in both simulations and phantom experiments. Human experimental results were compared with EP values in the literature. Satisfactory agreement was observed with improved boundary reconstruction. Importantly, the proposed gEPT method proved to be more robust against noise when compared to previously described non-gradient-based EPT approaches. Conclusion The proposed gEPT approach holds promise to improve EP mapping quality by recovering boundary information and enhancing robustness against noise. PMID:25213371
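For orientation, the classical homogeneous-medium EPT relation that gradient-based methods refine can be written as follows (the textbook form derived from the Helmholtz equation, not a formula quoted from this paper; sign conventions vary with the assumed time dependence):

\epsilon_c(\mathbf{r}) \;=\; \epsilon(\mathbf{r}) - \frac{i\,\sigma(\mathbf{r})}{\omega} \;\approx\; -\,\frac{\nabla^2 B_1^{+}(\mathbf{r})}{\mu_0\,\omega^2\,B_1^{+}(\mathbf{r})} .

gEPT additionally solves for the spatial gradient \nabla\epsilon_c from the multi-channel B1 data and then integrates that gradient over space, which is how the boundary detail lost by the homogeneous assumption is recovered.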
Theory-based interventions for contraception.
Lopez, Laureen M; Tolley, Elizabeth E; Grimes, David A; Chen-Mok, Mario
2011-03-16
The explicit use of theory in research helps expand the knowledge base. Theories and models have been used extensively in HIV-prevention research and in interventions for preventing sexually transmitted infections (STIs). The health behavior field uses many theories or models of change. However, educational interventions addressing contraception often have no stated theoretical base. Our objective was to review randomized controlled trials (RCTs) that tested a theoretical approach to inform contraceptive choice; encourage contraceptive use; or promote adherence to, or continuation of, a contraceptive regimen. We searched computerized databases for trials that tested a theory-based intervention for improving contraceptive use (MEDLINE, POPLINE, CENTRAL, PsycINFO, EMBASE, ClinicalTrials.gov, and ICTRP). We also wrote to researchers to find other trials. Included trials tested a theory-based intervention for improving contraceptive use. We excluded trials focused on high-risk groups and preventing sexually transmitted infections or HIV. Interventions addressed the use of one or more contraceptive methods for contraception. The reports provided evidence that the intervention was based on a specific theory or model. The primary outcomes were pregnancy, contraceptive choice, initiating or changing contraceptive use, contraceptive regimen adherence, and contraception continuation. The primary author evaluated abstracts for eligibility. Two authors extracted data from included studies. We calculated the odds ratio for dichotomous outcomes. No meta-analysis was conducted due to intervention differences. Fourteen RCTs met our inclusion criteria. In 2 of 10 trials with pregnancy or birth data, a theory-based group showed better results. Four of 10 trials with contraceptive use data (other than condoms) showed better outcomes in an experimental group. For condom use, a theory-based group had favorable results in three of eight trials. Social Cognitive Theory was the main theoretical basis for five trials, of which three showed positive results. Two based on other social cognition models had favorable results, as did two of four focused on motivational interviewing. Thirteen trials provided multiple sessions or contacts. Of seven effective interventions, five targeted adolescents, including four with group sessions. Three effective trials had individual sessions. Seven trials were rated as having high or moderate quality; three of those had favorable results. Family planning researchers and practitioners could adapt the effective interventions. Reproductive health needs high-quality research on behavior change, especially for clinical and low-resource settings. More thorough use of single theories would help, as would better reporting on research design and intervention implementation.
Using a web-based system for the continuous distance education in cytopathology.
Stergiou, Nikolaos; Georgoulakis, Giannis; Margari, Niki; Aninos, Dionisios; Stamataki, Melina; Stergiou, Efi; Pouliakis, Abraam; Karakitsos, Petros
2009-12-01
The evolution of information technologies and telecommunications has made the World Wide Web a low-cost and easily accessible tool for the dissemination of information and knowledge. Continuing Medical Education (CME) sites dedicated to cytopathology are rather scarce; they do not keep up with constant change, and they lack the ability to provide cytopathologists with a dynamic learning environment adaptable to the development of cytopathology. Learning methods that build skills such as decision making, reasoning and problem solving are critical in the development of such a learning environment. The objectives of this study are (1) to demonstrate, on the basis of a web-based training system, the successful application of traditional learning theories and methods, and (2) to effectively evaluate users' perception of the educational program using a combination of observers, theories and methods. Trainees are given the opportunity to browse through the educational material, collaborate in synchronous and asynchronous modes, practice their skills through problems and tasks, and test their knowledge using the self-evaluation tool. The trainers, in turn, are responsible for editing learning material, monitoring students' progress and organizing the problem-based and task-based scenarios. The implementation of the web-based training system is based on a three-tier architecture and uses an Apache Tomcat web server and a MySQL database server. By December 2008, CytoTrainer's learning environment contained two courses in cytopathology, Gynaecological Cytology and Thyroid Cytology, offering about 2000 digital images and 20 case sessions. Our evaluation method combines qualitative and quantitative approaches to explore how the various parts of the system and students' attitudes work together. Trainees approved of the course's content, methodology and learning activities. The triangulation of evaluation methods revealed that the training program is suitable for continuous distance education in cytopathology and that it has improved the trainees' skills in diagnostic cytopathology. The web-based training system can be successfully involved in continuous distance education in cytopathology: it provides the opportunity to access learning material from any place at any time and supports the acquisition of diagnostic knowledge.
NASA Astrophysics Data System (ADS)
Zhu, Xiaohua; Li, Chuanrong; Tang, Lingli
2018-03-01
Leaf area index (LAI) is a key structural characteristic of vegetation and plays a significant role in global change research. Several methods and types of remotely sensed data have been evaluated for LAI estimation. This study aimed to evaluate the suitability of the look-up-table (LUT) approach for crop LAI retrieval from Satellite Pour l'Observation de la Terre (SPOT)-5 data and to establish an LUT approach for LAI inversion based on scale information. The LAI inversion result was validated with in situ LAI measurements, indicating that the LUT generated from the PROSAIL (PROSPECT + SAIL: properties spectra + scattering by arbitrarily inclined leaves) model was suitable for crop LAI estimation, with a root mean square error (RMSE) of ~0.31 m²/m² and a determination coefficient (R²) of 0.65. The scale effect of crop LAI was analyzed based on Taylor expansion theory, indicating that when the SPOT data were aggregated to 200 × 200 pixels, the relative error reached a significant 13.7%. Finally, an LUT method integrated with scale information is proposed in this article, improving the inversion accuracy to an RMSE of 0.20 m²/m² and an R² of 0.83.
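A minimal sketch of LUT inversion, with a toy canopy model standing in for PROSAIL; the band model, grid and noise level are illustrative:

import numpy as np

rng = np.random.default_rng(0)

def canopy_model(lai):
    """Stand-in for a PROSAIL run: reflectance in 3 bands vs. LAI."""
    return np.stack([0.05 + 0.30 * (1 - np.exp(-0.4 * lai)),   # NIR-like
                     0.25 - 0.18 * (1 - np.exp(-0.5 * lai)),   # red-like
                     0.18 - 0.10 * (1 - np.exp(-0.5 * lai))], -1)

# 1) build the look-up table over the LAI range
lai_grid = np.linspace(0.0, 6.0, 601)
lut = canopy_model(lai_grid)

# 2) invert: pick the LUT entry minimizing an RMSE cost against the pixel
def invert(pixel_refl):
    cost = np.sqrt(np.mean((lut - pixel_refl) ** 2, axis=1))
    return lai_grid[np.argmin(cost)]

truth = 2.7
pixel = canopy_model(truth) + rng.normal(0, 0.005, 3)
print(invert(pixel))   # close to 2.7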
NASA Astrophysics Data System (ADS)
Parrott, Annette M.
Problem. Science teachers are charged with preparing students to become scientifically literate individuals. Teachers are given curriculum that specifies the knowledge that students should come away with; however, they are not necessarily aware of the knowledge with which the student arrives or how best to help them navigate between the two knowledge states. Educators must be aware, not only of where their students are conceptually, but of how their students move from their prior knowledge and naive theories to scientifically acceptable theories. The understanding of how students navigate this course has the potential to revolutionize educational practices. Methods. This study explored how five 9th grade biology students reconstructed their cognitive frameworks and navigated conceptual change from prior conception to consensual genetics knowledge. The research questions investigated were: (1) how do students in the process of changing their naive science theories to accepted science theories describe their journey from prior knowledge to current conception, and (2) what are the methods that students utilize to bridge the gap between alternate and consensual science conceptions to effect conceptual change. Qualitative and quantitative methods were employed to gather and analyze the data. In-depth, semi-structured interviews formed the primary data for probing the context and details of students' conceptual change experience. Primary interview data were coded by thematic analysis. Results and discussion. This study revealed information about students' perceived roles in learning, the role of articulation in the conceptual change process, and ways in which a community of learners aids conceptual change. It was ascertained that students see their role in learning primarily as repeating information until they can add that information to their knowledge. Students are more likely to consider challenges to their conceptual frameworks and be more motivated to become active participants in constructing their knowledge when they are working collaboratively with peers instead of receiving instruction from their teacher. Articulation was found to be instrumental in aiding learners in identifying their alternate conceptions as well as in revisiting, investigating and reconstructing their conceptual frameworks. Based on the assumptions generated, suggestions were offered to inform pedagogical practice in support of the conceptual change process.
A joint tracking method for NSCC based on WLS algorithm
NASA Astrophysics Data System (ADS)
Luo, Ruidan; Xu, Ying; Yuan, Hong
2017-12-01
The navigation signal based on compound carrier (NSCC) has a flexible multi-carrier scheme and various configurable scheme parameters, which give it significant navigation-augmentation efficiency in terms of spectral efficiency, tracking accuracy, multipath mitigation, and anti-jamming capability compared with legacy navigation signals. Meanwhile, its typical scheme characteristics can provide auxiliary information for the design of signal synchronization algorithms. Based on the characteristics of NSCC, this paper proposes a joint tracking method using the Weighted Least Squares (WLS) algorithm. In this method, the least-squares estimator jointly estimates each sub-carrier's frequency shift through the linear frequency-Doppler relationship, using the known sub-carrier frequencies. Moreover, the weighting matrix is set adaptively according to the sub-carrier powers to ensure estimation accuracy. Both theoretical analysis and simulation results show that the tracking accuracy and sensitivity of this method outperform those of the single-carrier algorithm at lower SNR.
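A minimal numerical sketch of such a power-weighted joint Doppler estimate follows. The sub-carrier frequencies, powers, and noise model are invented for illustration and are not the paper's signal parameters; the key idea is that each sub-carrier's frequency shift scales linearly with its frequency, so a single Doppler parameter can be solved by WLS.

```python
import numpy as np

rng = np.random.default_rng(1)

f_sub = np.array([1.0, 1.2, 1.4, 1.6]) * 1.5e9   # known sub-carrier frequencies (Hz)
f0 = 1.5e9                                       # reference carrier frequency
true_doppler = 2.0e3                             # Doppler at f0 (Hz)

a = f_sub / f0                                   # frequency-Doppler linear regressor
power = np.array([4.0, 2.0, 1.0, 0.5])           # sub-carrier powers
noise_std = 50.0 / np.sqrt(power)                # weaker carriers -> noisier shifts
y = a * true_doppler + rng.normal(0, noise_std)  # measured per-carrier shifts

W = np.diag(power)                               # weight by sub-carrier power
fd_hat = (a @ W @ y) / (a @ W @ a)               # closed-form scalar WLS solution
print(f"estimated Doppler = {fd_hat:.1f} Hz")
```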
Using fuzzy fractal features of digital images for the material surface analysis
NASA Astrophysics Data System (ADS)
Privezentsev, D. G.; Zhiznyakov, A. L.; Astafiev, A. V.; Pugin, E. V.
2018-01-01
Edge detection is an important task in image processing. Many approaches exist in this area, including the Sobel and Canny operators. One promising technique in image processing is the use of fuzzy logic and fuzzy set theory, which can increase processing quality by representing information in fuzzy form. Most existing fuzzy image processing methods switch to fuzzy sets at very late stages, which leads to a loss of useful information. In this paper, a novel method of edge detection based on fuzzy image representation and fuzzy pixels is proposed. With this approach, the image is converted to fuzzy form in the first step. Different approaches to this conversion are described. Several membership functions for fuzzy pixel description, together with requirements on their form, are given. A novel approach to edge detection based on the Sobel operator and fuzzy image representation is proposed. Experimental testing of the developed method was performed on remote sensing images.
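A rough sketch of the fuzzify-then-Sobel pipeline is shown below. The linear normalization used as the membership function and the gradient threshold are assumptions for illustration, not the authors' specific choices.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
img = np.zeros((64, 64)); img[20:44, 20:44] = 1.0
img += rng.normal(0, 0.05, img.shape)            # noisy synthetic test image

# Step 1: fuzzify -- map intensities to [0, 1] membership degrees.
mu = (img - img.min()) / (img.max() - img.min())

# Step 2: Sobel gradients computed on the fuzzy image.
gx = ndimage.sobel(mu, axis=1)
gy = ndimage.sobel(mu, axis=0)
grad = np.hypot(gx, gy)

# Step 3: defuzzify -- mark pixels whose gradient membership is high.
edges = grad > 0.5 * grad.max()
print(edges.sum(), "edge pixels")
```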
Research on the lesion segmentation of breast tumor MR images based on FCM-DS theory
NASA Astrophysics Data System (ADS)
Zhang, Liangbin; Ma, Wenjun; Shen, Xing; Li, Yuehua; Zhu, Yuemin; Chen, Li; Zhang, Su
2017-03-01
Magnetic resonance imaging (MRI) plays an important role in the treatment of breast tumors by high intensity focused ultrasound (HIFU). Doctors evaluate the extent, distribution, and benign or malignant status of a breast tumor by analyzing various MRI modalities, such as T2, DWI, and DCE images, to make an accurate preoperative treatment plan and to evaluate the effect of the operation. This paper presents a method for lesion segmentation of breast tumors based on FCM-DS theory. Fuzzy c-means (FCM) clustering combined with Dempster-Shafer (DS) theory is used to handle the uncertainty of the information, segmenting the lesion areas on the DWI and DCE modalities of MRI and reducing the extent of the uncertain parts. Experimental results show that FCM-DS can fuse the DWI and DCE images to achieve accurate segmentation and indicate the benign or malignant status of the lesion area via the Time-Intensity Curve (TIC), which could be beneficial in making a preoperative treatment plan and evaluating the effect of the therapy.
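The FCM step itself is compact enough to sketch from scratch. The following is a minimal fuzzy c-means on synthetic 1-D "intensity" data; the DS-theory fusion of DWI and DCE evidence described in the abstract is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(0.2, 0.05, 200), rng.normal(0.8, 0.05, 200)])

c, m, eps = 2, 2.0, 1e-5                          # clusters, fuzzifier, tolerance
u = rng.random((c, x.size)); u /= u.sum(axis=0)   # random initial memberships

for _ in range(100):
    um = u ** m
    centers = (um @ x) / um.sum(axis=1)           # membership-weighted centers
    d = np.abs(x[None, :] - centers[:, None]) + 1e-12
    p = 2.0 / (m - 1.0)
    # Standard FCM membership update: u_ik = 1 / sum_j (d_ik / d_jk)^p
    u_new = 1.0 / (d ** p * (1.0 / d ** p).sum(axis=0))
    if np.abs(u_new - u).max() < eps:
        u = u_new; break
    u = u_new

print("centers:", np.round(centers, 3))
```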
2016-07-08
Systems Using Automata Theory and Barrier Certificates. We developed a sound but incomplete method for the computational verification of specifications... The method merges ideas from automata-based model checking with those from control theory, including so-called barrier certificates and optimization-based... "Automata theory meets barrier certificates: Temporal logic verification of nonlinear systems," IEEE Transactions on Automatic Control, 2015. [J2] R
Nezarat, Amin; Dastghaibifard, GH
2015-01-01
One of the most complex issues in the cloud computing environment is the problem of resource allocation so that, on one hand, the cloud provider expects the most profitability and, on the other hand, users also expect to have the best resources at their disposal considering the budget constraints and time. In most previous work conducted, heuristic and evolutionary approaches have been used to solve this problem. Nevertheless, since the nature of this environment is based on economic methods, using such methods can decrease response time and reducing the complexity of the problem. In this paper, an auction-based method is proposed which determines the auction winner by applying game theory mechanism and holding a repetitive game with incomplete information in a non-cooperative environment. In this method, users calculate suitable price bid with their objective function during several round and repetitions and send it to the auctioneer; and the auctioneer chooses the winning player based the suggested utility function. In the proposed method, the end point of the game is the Nash equilibrium point where players are no longer inclined to alter their bid for that resource and the final bid also satisfies the auctioneer’s utility function. To prove the response space convexity, the Lagrange method is used and the proposed model is simulated in the cloudsim and the results are compared with previous work. At the end, it is concluded that this method converges to a response in a shorter time, provides the lowest service level agreement violations and the most utility to the provider. PMID:26431035
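The repeated-bidding idea can be sketched as a simple best-response loop that stops once no bidder wants to deviate, a Nash-like stopping condition. The valuations, bid increment, and tie-breaking below are invented and far simpler than the paper's utility functions.

```python
# Each user holds a private valuation for the resource and repeatedly
# best-responds to rivals' last bids, never bidding above their own value.
valuations = [10.0, 8.0, 6.0]
bids = [1.0, 1.0, 1.0]
eps = 0.1                                # minimum outbidding increment

for _ in range(200):
    changed = False
    for i, v in enumerate(valuations):
        top_rival = max(b for j, b in enumerate(bids) if j != i)
        best = min(v, top_rival + eps)   # outbid rivals, capped by own value
        if abs(best - bids[i]) > 1e-9:
            bids[i] = best
            changed = True
    if not changed:                      # equilibrium: no profitable deviation
        break

winner = max(range(len(bids)), key=lambda i: bids[i])
print("winner:", winner, "bids:", [round(b, 2) for b in bids])
```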
Hackman, Christine L; Knowlden, Adam P
2014-01-01
Background Childhood obesity has reached epidemic proportions in many nations around the world. The theory of planned behavior (TPB) and the theory of reasoned action (TRA) have been used to successfully plan and evaluate numerous interventions for many different behaviors. The aim of this study was to systematically review and synthesize TPB- and TRA-based dietary behavior interventions targeting adolescents and young adults. Methods The following databases were systematically searched to find articles for this review: Academic Search Premier; Cumulative Index to Nursing and Allied Health (CINAHL); Education Resources Information Center (ERIC); Health Source: Nursing/Academic Edition; Cochrane Central Register of Controlled Trials (CENTRAL); and MEDLINE. Inclusion criteria for articles were: 1) primary or secondary interventions, 2) with any quantitative design, 3) published in the English language, 4) between January 2003 and March 2014, 5) that targeted adolescents or young adults, 6) which included dietary change behavior as the outcome, and 7) utilized TPB or TRA. Results Of the eleven intervention studies evaluated, nine resulted in dietary behavior change that was attributed to the treatment. Additionally, all but one study found there to be a change in at least one construct of TRA or TPB, while one study did not measure constructs. All of the studies utilized some type of quantitative design, with two employing quasi-experimental and eight employing randomized controlled trial designs. Among the studies, four utilized technology including emails, social media posts, information on school websites, web-based activities, audio messages in classrooms, interactive DVDs, and health-related websites. Two studies incorporated goal setting and four employed persuasive communication. Conclusion Interventions directed toward changing dietary behaviors in adolescents should aim to incorporate multi-faceted, theory-based approaches. Future studies should consider utilizing randomized controlled trial designs and operationalizing variables. More research is needed to identify the optimal TPB and TRA modalities to modify dietary behaviors. PMID:24966710
ERIC Educational Resources Information Center
Hackman, Judith Dozier
Seven potentially useful maxims from the field of human information processing are proposed that may help institutional researchers prepare and present information for higher education decision-makers. The maxims, which are based on research and theory about how people cognitively process information, are as follows: (1) more may not be better;…
ERIC Educational Resources Information Center
Wiio, Osmo A.
A more unified approach to communication theory can evolve through systems modeling of information theory, communication modes, and mass media operations. Such systematic analysis proposes, as is the case here, that information models be based upon combinations of energy changes and exchanges and changes in receiver systems. The mass media is…
Linking theory with qualitative research through study of stroke caregiving families.
Pierce, Linda L; Steiner, Victoria; Cervantez Thompson, Teresa L; Friedemann, Marie-Luise
2014-01-01
This theoretical article outlines the deliberate process of applying a qualitative data analysis method rooted in Friedemann's Framework of Systemic Organization through the study of a web-based education and support intervention for stroke caregiving families. Directed by Friedemann's framework, the analytic method involved developing, refining, and using a coding rubric to explore interactive patterns between caregivers and care recipients from this 3-month feasibility study using this education and support intervention. Specifically, data were gathered from the intervention's web-based discussion component between caregivers and the nurse specialist, as well as from telephone caregiver interviews. A theoretical framework guided the process of developing and refining this coding rubric for the purpose of organizing data; but, more importantly, guided the investigators' thought processes, allowing them to extract rich information from the data set, as well as synthesize this information to generate a broad understanding of the caring situation. © 2013 Association of Rehabilitation Nurses.
A Nonlinear Diffusion Equation-Based Model for Ultrasound Speckle Noise Removal
NASA Astrophysics Data System (ADS)
Zhou, Zhenyu; Guo, Zhichang; Zhang, Dazhi; Wu, Boying
2018-04-01
Ultrasound images are contaminated by speckle noise, which complicates further image analysis and clinical diagnosis. In this paper, we address this problem from the perspective of nonlinear diffusion equation theory. We develop a nonlinear diffusion equation-based model that takes into account not only the gradient information of the image but also its gray-level information. By using a region indicator as the variable exponent, we can adaptively control the diffusion type, which alternates between Perona-Malik diffusion and Charbonnier diffusion according to the image gray levels. Furthermore, we analyze the theoretical and numerical properties of the proposed model. Experiments show that the proposed method achieves much better speckle suppression and edge preservation than traditional despeckling methods, especially in low gray level and low-contrast regions.
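One explicit iteration of such a gray-level-adaptive diffusion can be sketched as follows. The region indicator (a simple gray-level threshold), the contrast parameter K, and the time step are simplified assumptions rather than the paper's exact variable-exponent formulation.

```python
import numpy as np

def diffusion_step(u, K=0.1, dt=0.1, low_gray=0.3):
    gx, gy = np.gradient(u)                     # axis-0 and axis-1 gradients
    g2 = (gx**2 + gy**2) / K**2
    # Region indicator: darker regions get Charbonnier (stronger smoothing),
    # brighter regions get Perona-Malik (better edge preservation).
    charbonnier = 1.0 / np.sqrt(1.0 + g2)
    perona_malik = 1.0 / (1.0 + g2)
    g = np.where(u < low_gray, charbonnier, perona_malik)
    # Divergence of g * grad(u), via central differences.
    div = np.gradient(g * gx)[0] + np.gradient(g * gy)[1]
    return u + dt * div

rng = np.random.default_rng(4)
img = rng.random((32, 32))                      # stand-in for a speckled image
out = diffusion_step(img)
print(out.shape)
```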
Improving resolution of crosswell seismic section based on time-frequency analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, H.; Li, Y.
1994-12-31
According to signal theory, improving the resolution of a seismic section means extending the high-frequency band of the seismic signal. In a cross-well section, a sonic log can be regarded as a reliable source of high-frequency information for the trace near the borehole. In that case, the task is to introduce this high-frequency information into the whole section. However, neither traditional deconvolution algorithms nor newer inversion methods such as BCI (Broad Constraint Inversion) are satisfactory, because of high-frequency noise and the nonuniqueness of inversion results, respectively. To overcome these disadvantages, this paper presents a new algorithm based on Time-Frequency Analysis (TFA), a technology that has received increasing attention as a useful signal analysis tool. Practical applications show that the new method is a stable scheme for greatly improving the resolution of a cross-well seismic section without decreasing the signal-to-noise ratio (SNR).
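As a toy illustration of working in the time-frequency domain (not the paper's algorithm), the sketch below boosts the high-frequency band of a synthetic trace via the short-time Fourier transform; the band edge and gain factor stand in for information a sonic log would supply.

```python
import numpy as np
from scipy.signal import stft, istft

fs = 500.0
t = np.arange(0, 2.0, 1 / fs)
# Band-limited synthetic "trace": strong 30 Hz, weak 90 Hz component.
trace = np.sin(2 * np.pi * 30 * t) + 0.3 * np.sin(2 * np.pi * 90 * t)

f, tt, Z = stft(trace, fs=fs, nperseg=128)   # time-frequency representation
Z[f > 60] *= 2.0                             # amplify the high-frequency band
_, enhanced = istft(Z, fs=fs, nperseg=128)   # back to the time domain
print(enhanced.shape)
```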
Spatial Uncertainty Modeling of Fuzzy Information in Images for Pattern Classification
Pham, Tuan D.
2014-01-01
The modeling of the spatial distribution of image properties is important for many pattern recognition problems in science and engineering. Mathematical methods are needed to quantify the variability of this spatial distribution based on which a decision of classification can be made in an optimal sense. However, image properties are often subject to uncertainty due to both incomplete and imprecise information. This paper presents an integrated approach for estimating the spatial uncertainty of vagueness in images using the theory of geostatistics and the calculus of probability measures of fuzzy events. Such a model for the quantification of spatial uncertainty is utilized as a new image feature extraction method, based on which classifiers can be trained to perform the task of pattern recognition. Applications of the proposed algorithm to the classification of various types of image data suggest the usefulness of the proposed uncertainty modeling technique for texture feature extraction. PMID:25157744
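The probability-of-a-fuzzy-event calculus mentioned above has a compact discrete form, P(A) = sum over x of mu_A(x) * p(x). A small sketch follows; the "bright" membership function is an invented illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(5)
pixels = rng.integers(0, 256, size=10_000)             # synthetic gray levels

levels = np.arange(256)
p = np.bincount(pixels, minlength=256) / pixels.size   # empirical gray-level pmf
mu_bright = np.clip((levels - 128) / 127.0, 0.0, 1.0)  # fuzzy set "bright"

# Zadeh-style probability of the fuzzy event "the pixel is bright".
p_bright = float(np.sum(mu_bright * p))
print(f"P(bright) = {p_bright:.3f}")
```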
ERIC Educational Resources Information Center
Kwon, Nahyun
2017-01-01
Introduction: This study was conducted to investigate the characteristics of research and information activities of laboratory scientists in different work positions throughout a research lifecycle. Activity theory was applied as the conceptual and analytical framework. Method: Taking a qualitative research approach, in-depth interviews and field…
Information's role in the estimation of chaotic signals
NASA Astrophysics Data System (ADS)
Drake, Daniel Fred
1998-11-01
Researchers have proposed several methods designed to recover chaotic signals from noise-corrupted observations. While the methods vary, their qualitative performance does not: in low levels of noise all methods effectively recover the underlying signal; in high levels of noise no method can recover the underlying signal to any meaningful degree of accuracy. Of the methods proposed to date, all represent sub-optimal estimators. So: Is the inability to recover the signal in high noise levels simply a consequence of estimator sub-optimality? Or is estimator failure actually a manifestation of some intrinsic property of chaos itself? These questions are answered by deriving an optimal estimator for a class of chaotic systems and noting that it, too, fails in high levels of noise. An exact, closed-form expression for the estimator is obtained for a class of chaotic systems whose signals are solutions to a set of linear (but noncausal) difference equations. The existence of this linear description circumvents the difficulties normally encountered when manipulating the nonlinear (but causal) expressions that govern chaotic behavior. The reason why even the optimal estimator fails to recover underlying chaotic signals in high levels of noise has its roots in information theory. At such noise levels, the mutual information linking the corrupted observations to the underlying signal is essentially nil, reducing the estimator to a simple guessing strategy based solely on a priori statistics. Entropy, long the common bond between information theory and dynamical systems, is actually one aspect of a far more complete characterization of information sources: the rate distortion function. Determining the rate distortion function associated with the class of chaotic systems considered in this work provides bounds on estimator performance in high levels of noise. Finally, a slight modification of the linear description leads to a method of synthesizing, on limited-precision platforms, "pseudo-chaotic" sequences that mimic true chaotic behavior to any finite degree of precision and duration. The use of such a technique in spread-spectrum communications is considered.
Anthropometric Procedures for Protective Equipment Sizing and Design
Hsiao, Hongwei
2015-01-01
Objectives This article presented four anthropometric theories (univariate, bivariate/probability distribution, multivariate, and shape-based methods) for protective equipment design decisions. Background While the significance of anthropometric information for product design is well recognized, designers continue to face challenges in selecting efficient anthropometric data processing methods and translating the acquired information into effective product designs. Methods For this study, 100 farm tractor operators, 3,718 respirator users, 951 firefighters, and 816 civilian workers participated in four studies on the design of tractor roll-over protective structures (ROPS), respirator test panels, fire truck cabs, and fall-arrest harnesses, respectively. Their anthropometry and participant-equipment interfaces were evaluated. Results Study 1 showed a need to extend the 90-cm vertical clearance for tractor ROPS in the current industrial standards to 98.3 to 101.3 cm. Study 2 indicated that the current respirator test panel would have excluded 10% of the male firefighter population; a systematic adjustment to the boundaries of test panel cells was suggested. Study 3 provided 24 principal component analysis-based firefighter body models to facilitate fire truck cab design. Study 4 developed an improved gender-based fall-arrest harness sizing scheme to supplant the current unisex system. Conclusions This article presented four anthropometric approaches and a six-step design paradigm for ROPS, respirator test panel, fire truck cab, and fall-arrest harness applications, which demonstrated anthropometric theories and practices for defining protective equipment fit and sizing schemes. Applications The study provided a basis for equipment designers, standards writers, and industry manufacturers to advance anthropometric applications for product design and improve product efficacy. PMID:23516791
2011-01-01
Background The present study examines the structure and operation of social networks of information and advice and their role in making decisions as to whether to adopt new evidence-based practices (EBPs) among agency directors and other program professionals in 12 California counties participating in a large randomized controlled trial. Methods Interviews were conducted with 38 directors, assistant directors, and program managers of county probation, mental health, and child welfare departments. Grounded-theory analytic methods were used to identify themes related to EBP adoption and network influences. A web-based survey collected additional quantitative information on members of information and advice networks of study participants. A mixed-methods approach to data analysis was used to create a sociometric data set (n = 176) for examination of associations between advice seeking and network structure. Results Systems leaders develop and maintain networks of information and advice based on roles, responsibility, geography, and friendship ties. Networks expose leaders to information about EBPs and opportunities to adopt EBPs; they also influence decisions to adopt EBPs. Individuals in counties at the same stage of implementation accounted for 83% of all network ties. Networks in counties that decided not to implement a specific EBP had no extra-county ties. Implementation of EBPs at the two-year follow-up was associated with the size of county, urban versus rural counties, and in-degree centrality. Collaboration was viewed as critical to implementing EBPs, especially in small, rural counties where agencies have limited resources on their own. Conclusions Successful implementation of EBPs requires consideration and utilization of existing social networks of high-status systems leaders that often cut across service organizations and their geographic jurisdictions. Trial Registration NCT00880126 PMID:21958674
Theory of mind selectively predicts preschoolers’ knowledge-based selective word learning
Brosseau-Liard, Patricia; Penney, Danielle; Poulin-Dubois, Diane
2015-01-01
Children can selectively attend to various attributes of a model, such as past accuracy or physical strength, to guide their social learning. There is a debate regarding whether a relation exists between theory-of-mind skills and selective learning. We hypothesized that high performance on theory-of-mind tasks would predict preference for learning new words from accurate informants (an epistemic attribute), but not from physically strong informants (a non-epistemic attribute). Three- and 4-year-olds (N = 65) completed two selective learning tasks, and their theory of mind abilities were assessed. As expected, performance on a theory-of-mind battery predicted children’s preference to learn from more accurate informants but not from physically stronger informants. Results thus suggest that preschoolers with more advanced theory of mind have a better understanding of knowledge and apply that understanding to guide their selection of informants. This work has important implications for research on children’s developing social cognition and early learning. PMID:26211504
Patton, Deborah E; Hughes, Carmel M; Cadogan, Cathal A; Ryan, Cristín A
2017-02-01
Previous interventions have shown limited success in improving medication adherence in older adults, and this may be due to the lack of a theoretical underpinning. This review sought to determine the effectiveness of theory-based interventions aimed at improving medication adherence in older adults prescribed polypharmacy and to explore the extent to which psychological theory informed their development. Eight electronic databases were searched from inception to March 2015, and extensive hand-searching was conducted. Interventions delivered to older adults (populations with a mean/median age of ≥65 years) prescribed polypharmacy (four or more regular oral/non-oral medicines) were eligible. Studies had to report an underpinning theory and measure at least one adherence and one clinical/humanistic outcome. Data were extracted independently by two reviewers and included details of intervention content, delivery, providers, participants, outcomes and theories used. The theory coding scheme (TCS) was used to assess the extent of theory use. Five studies cited theory as the basis for intervention development (social cognitive theory, health belief model, transtheoretical model, self-regulation model). The extent of theory use and intervention effectiveness in terms of adherence and clinical/humanistic outcomes varied across studies. No study made optimal use of theory as recommended in the TCS. The heterogeneity observed and inclusion of pilot designs mean conclusions regarding effectiveness of theory-based interventions targeting older adults prescribed polypharmacy could not be drawn. Further primary research involving theory as a central component of intervention development is required. The review findings will help inform the design of future theory-based adherence interventions.
Sandars, John; Patel, Rakesh S; Goh, Poh Sun; Kokatailo, Patricia K; Lafferty, Natalie
2015-01-01
There is increasing use of technology for teaching and learning in medical education, but the use of educational theory to inform the design is often not made explicit. The educational theories, both normative and descriptive, used by medical educators determine how the technology is intended to facilitate learning and may explain why some interventions with technology are less effective than others. The aim of this study is to highlight the importance of medical educators making explicit the educational theories that inform their design of interventions using technology. Illustrative examples of the main educational theories are used to demonstrate the importance of theories in informing the design of interventions using technology, and the use of educational theories for theory-based and realistic evaluations of technology in medical education is highlighted. An explicit description of the educational theories used to inform the design of an intervention with technology can provide potentially useful insights into why some interventions with technology are more effective than others. An explicit description is also an important aspect of the scholarship of using technology in medical education.
Using Financial Information in Continuing Education. Accepted Methods and New Approaches.
ERIC Educational Resources Information Center
Matkin, Gary W.
This book, which is intended as a resource/reference guide for experienced financial managers and course planners, examines accepted methods and new approaches for using financial information in continuing education. The introduction reviews theory and practice, traditional and new methods, planning and organizational management, and technology.…
Evaluating Theory-Based Evaluation: Information, Norms, and Adherence
ERIC Educational Resources Information Center
Jacobs, W. Jake; Sisco, Melissa; Hill, Dawn; Malter, Frederic; Figueredo, Aurelio Jose
2012-01-01
Programmatic social interventions attempt to produce appropriate social-norm-guided behavior in an open environment. A marriage of applicable psychological theory, appropriate program evaluation theory, and outcome of evaluations of specific social interventions assures the acquisition of cumulative theory and the production of successful social…
Explaining Michigan: developing an ex post theory of a quality improvement program.
Dixon-Woods, Mary; Bosk, Charles L; Aveling, Emma Louise; Goeschel, Christine A; Pronovost, Peter J
2011-06-01
Understanding how and why programs work, not simply whether they work, is crucial. Good theory is indispensable to advancing the science of improvement. We argue for the usefulness of ex post theorization of programs. We propose an approach, located within the broad family of theory-oriented methods, for developing ex post theories of interventional programs. We use this approach to develop an ex post theory of the Michigan Intensive Care Unit (ICU) project, which attracted international attention by successfully reducing rates of central venous catheter bloodstream infections (CVC-BSIs). The procedure used to develop the ex post theory was (1) identify program leaders' initial theory of change and learning from running the program; (2) enhance this with new information in the form of theoretical contributions from social scientists; (3) synthesize prior and new information to produce an updated theory. The Michigan project achieved its effects by (1) generating isomorphic pressures for ICUs to join the program and conform to its requirements; (2) creating a densely networked community with strong horizontal links that exerted normative pressures on members; (3) reframing CVC-BSIs as a social problem and addressing it through a professional movement combining "grassroots" features with a vertically integrating program structure; (4) using several interventions that functioned in different ways to shape a culture of commitment to doing better in practice; (5) harnessing data on infection rates as a disciplinary force; and (6) using "hard edges." Updating program theory in the light of experience from program implementation is essential to improving programs' generalizability and transferability, although it is not a substitute for concurrent evaluative fieldwork. Future iterations of programs based on the Michigan project, and improvement science more generally, may benefit from the updated theory presented here. © 2011 Milbank Memorial Fund. Published by Wiley Periodicals Inc.
Prouty, Christine; Mohebbi, Shima; Zhang, Qiong
2018-06-15
Given the increasing vulnerability of communities to the negative impacts of untreated wastewater, resource recovery (RR) systems provide a paradigm shift away from a traditional approach of waste separation and treatment towards a productive recovery of water, energy and nutrients. The aim of this research is to understand the relationships between factors that influence the adoption and sustainability of wastewater-based RR systems to inform technology implementation strategies. The study presents a theory-informed, community-influenced system dynamics (SD) model to provide decision-makers with an adaptable tool that simulates system-level responses to the strategies that are developed for the coastal town of Placencia, Belize. The modeling framework is informed by literature-based theories such as the theory of diffusion of innovations (TDI) and the theory of planned behavior (TPB). Various methods, including surveys, interviews, participatory observations, and a water constituents mass balance analysis are used to validate relationships and numerically populate the model. The SD model was evaluated with field data and simulated to identify strategies that will improve the adoption and sustainability of RR systems. Site demonstrations (marketing strategy) made a significant impact on the stock of adopted RR systems. The stock of sustained RR systems is driven by the sustainability rate (i.e. economic and environmental viability) which can be improved by more site demonstrations and tank options (technical strategy). These strategies, however, only contributed to incremental improvements in the system's sustainability performance. This study shows that changing community behaviors (i.e. reporting the correct number of users and reclaiming resources), represented by structural change in the SD model, is the more significant way to influence the sustainable management of the community's wastewater resources. Copyright © 2018 Elsevier Ltd. All rights reserved.
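The stock-and-flow logic of such an SD model can be sketched with a short Euler integration: an adoption stock driven by demonstrations (external influence) and word of mouth, feeding a sustained-systems stock governed by a sustainability rate. All parameter values below are invented for illustration and are not calibrated to the Placencia study.

```python
N = 500                 # potential adopter households
p, q = 0.02, 0.3        # demonstration (external) and word-of-mouth coefficients
s_rate = 0.08           # fraction of adopted systems becoming "sustained" per step
adopted, sustained = 0.0, 0.0
dt = 1.0                # one month per step

for month in range(60):
    # Inflow to the adoption stock: Bass-style external + internal influence.
    adoption_flow = (p + q * adopted / N) * (N - adopted)
    # Inflow to the sustained stock: adopted-but-not-yet-sustained systems.
    sustain_flow = s_rate * (adopted - sustained)
    adopted += dt * adoption_flow
    sustained += dt * sustain_flow

print(f"adopted = {adopted:.0f}, sustained = {sustained:.0f}")
```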
Object-oriented recognition of high-resolution remote sensing image
NASA Astrophysics Data System (ADS)
Wang, Yongyan; Li, Haitao; Chen, Hong; Xu, Yuannan
2016-01-01
With the development of remote sensing imaging technology and the improvement of the resolution of multi-source imagery (satellite visible-light, multispectral, and hyperspectral), high resolution remote sensing images have been widely used in various fields, for example the military, surveying and mapping, geophysical prospecting, and the environment. In remote sensing images, the segmentation of ground targets, feature extraction, and automatic recognition technology are hotspots and difficulties in modern information technology research. This paper presents an object-oriented remote sensing image scene classification method. The method consists of typical-object (vehicle) classification generation, nonparametric density estimation, mean-shift segmentation, multi-scale corner detection, and template-based local shape matching. A remote sensing vehicle image classification software system was designed and implemented to meet these requirements.
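Mean-shift segmentation, one building block named above, can be sketched with an off-the-shelf implementation; the intensity/position feature scaling and the bandwidth quantile below are illustrative choices, not the paper's settings.

```python
import numpy as np
from sklearn.cluster import MeanShift, estimate_bandwidth

rng = np.random.default_rng(6)
img = np.zeros((40, 40)); img[:, 20:] = 1.0      # two-region synthetic scene
img += rng.normal(0, 0.05, img.shape)

# Per-pixel features: scaled intensity plus scaled (row, col) position.
yy, xx = np.mgrid[0:40, 0:40]
features = np.column_stack([img.ravel() * 10, yy.ravel() * 0.1, xx.ravel() * 0.1])

bw = estimate_bandwidth(features, quantile=0.2, n_samples=300)
labels = MeanShift(bandwidth=bw, bin_seeding=True).fit_predict(features)
print("segments:", len(np.unique(labels)))
```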
Trial-based economic evaluations in occupational health: principles, methods, and recommendations.
van Dongen, Johanna M; van Wier, Marieke F; Tompa, Emile; Bongers, Paulien M; van der Beek, Allard J; van Tulder, Maurits W; Bosmans, Judith E
2014-06-01
To allocate available resources as efficiently as possible, decision makers need information on the relative economic merits of occupational health and safety (OHS) interventions. Economic evaluations can provide this information by comparing the costs and consequences of alternatives. Nevertheless, only a few of the studies that consider the effectiveness of OHS interventions take the extra step of considering their resource implications. Moreover, the methodological quality of those that do is generally poor. Therefore, this study aims to help occupational health researchers conduct high-quality trial-based economic evaluations by discussing the theory and methodology that underlie them, and by providing recommendations for good practice regarding their design, analysis, and reporting. This study also helps consumers of this literature with understanding and critically appraising trial-based economic evaluations of OHS interventions.
Wang, Feng; Kaplan, Jess L; Gold, Benjamin D; Bhasin, Manoj K; Ward, Naomi L; Kellermayer, Richard; Kirschner, Barbara S; Heyman, Melvin B; Dowd, Scot E; Cox, Stephen B; Dogan, Haluk; Steven, Blaire; Ferry, George D; Cohen, Stanley A; Baldassano, Robert N; Moran, Christopher J; Garnett, Elizabeth A; Drake, Lauren; Otu, Hasan H; Mirny, Leonid A; Libermann, Towia A; Winter, Harland S; Korolev, Kirill S
2016-02-02
The relationship between the host and its microbiota is challenging to understand because both microbial communities and their environments are highly variable. We have developed a set of techniques based on population dynamics and information theory to address this challenge. These methods identify additional bacterial taxa associated with pediatric Crohn disease and can detect significant changes in microbial communities with fewer samples than previous statistical approaches required. We have also substantially improved the accuracy of the diagnosis based on the microbiota from stool samples, and we found that the ecological niche of a microbe predicts its role in Crohn disease. Bacteria typically residing in the lumen of healthy individuals decrease in disease, whereas bacteria typically residing on the mucosa of healthy individuals increase in disease. Our results also show that the associations with Crohn disease are evolutionarily conserved and provide a mutual information-based method to depict dysbiosis. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
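A mutual-information association score of the kind described can be computed per taxon against disease status. The synthetic abundances and quartile binning below are only for illustration; the paper's estimators operate on real microbiome profiles.

```python
import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(7)
disease = rng.integers(0, 2, 200)                  # 0 = healthy, 1 = Crohn
abundance = rng.normal(5, 1, 200) - 1.5 * disease  # taxon that drops in disease

bins = np.quantile(abundance, [0.25, 0.5, 0.75])
abundance_binned = np.digitize(abundance, bins)    # discretize to 4 levels

# MI between the taxon's abundance level and disease status (in nats).
mi = mutual_info_score(disease, abundance_binned)
print(f"MI = {mi:.3f} nats")
```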
Information fusion-based approach for studying influence on Twitter using belief theory.
Azaza, Lobna; Kirgizov, Sergey; Savonnet, Marinette; Leclercq, Éric; Gastineau, Nicolas; Faiz, Rim
2016-01-01
Influence on Twitter has recently become a hot research topic, since this micro-blogging service is widely used to share and disseminate information. Some users are more able than others to influence and persuade peers. Thus, studying the most influential users makes it possible to reach a large-scale information diffusion area, which is very useful in marketing or political campaigns. In this study, we propose a new approach for multi-level influence assessment on multi-relational networks such as Twitter. We define a social graph to model the relationships between users as a multiplex graph where users are represented by nodes and links model the different relations between them (e.g., retweets, mentions, and replies). We explore what the relations between nodes in this graph can reveal about influence degree and propose a generic computational model to assess the influence degree of a given node. This is based on the conjunctive combination rule from belief functions theory to combine different types of relations. We test the proposed method on a large amount of data gathered from Twitter during the European Elections 2014 and deduce the top influential candidates. The results show that our model is flexible enough to consider multiple combinations of interactions according to social scientists' needs or requirements and that the numerical results of the belief theory are accurate. We also evaluate the approach on the CLEF RepLab 2014 data set and show that our approach leads to quite interesting results.
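The conjunctive combination rule has a compact implementation over focal sets. Below, two mass functions (one per relation type) are combined over a two-element frame; the frame and mass values are invented for illustration, and conflict mass accumulates on the empty set (unnormalized Dempster rule).

```python
from itertools import product

frame = frozenset({"influential", "passive"})
# Evidence from retweets and from mentions, as mass functions on subsets.
m_retweets = {frozenset({"influential"}): 0.6, frame: 0.4}
m_mentions = {frozenset({"influential"}): 0.3,
              frozenset({"passive"}): 0.2, frame: 0.5}

combined = {}
for (a, ma), (b, mb) in product(m_retweets.items(), m_mentions.items()):
    inter = a & b                        # empty intersection = conflict mass
    combined[inter] = combined.get(inter, 0.0) + ma * mb

for focal, mass in combined.items():
    print(set(focal) or "conflict", round(mass, 3))
```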
DNA binding site characterization by means of Rényi entropy measures on nucleotide transitions.
Perera, A; Vallverdu, M; Claria, F; Soria, J M; Caminal, P
2008-06-01
In this work, parametric information-theory measures for the characterization of binding sites in DNA are extended with the use of transitional probabilities on the sequence. We propose the use of parametric uncertainty measures such as Rényi entropies obtained from the transition probabilities for the study of binding sites, in addition to nucleotide frequency-based Rényi measures. Results are reported comparing transition frequencies (i.e., dinucleotides) and base frequencies for Shannon and parametric Rényi entropies for a number of binding sites found in E. coli, lambda, and T7 organisms. We observe that the information provided by the two approaches is not redundant. Furthermore, in the presence of noise in the binding site matrix, we observe overall improved robustness of nucleotide transition-based algorithms compared with the nucleotide frequency-based method.
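The Rényi entropy family used here is H_a(p) = log2(sum_i p_i^a) / (1 - a), which recovers the Shannon entropy in the limit a -> 1. A small sketch on a made-up dinucleotide distribution (not E. coli data):

```python
import numpy as np

def renyi_entropy(p, alpha):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):                 # Shannon limit as alpha -> 1
        return float(-(p * np.log2(p)).sum())
    return float(np.log2((p ** alpha).sum()) / (1.0 - alpha))

dinucleotide_freqs = [0.4, 0.3, 0.2, 0.1]      # fabricated transition frequencies
for a in (0.5, 1.0, 2.0):
    print(f"H_{a} = {renyi_entropy(dinucleotide_freqs, a):.3f} bits")
```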
Item Response Theory Equating Using Bayesian Informative Priors.
ERIC Educational Resources Information Center
de la Torre, Jimmy; Patz, Richard J.
This paper seeks to extend the application of Markov chain Monte Carlo (MCMC) methods in item response theory (IRT) to include the estimation of equating relationships along with the estimation of test item parameters. A method is proposed that incorporates estimation of the equating relationship in the item calibration phase. Item parameters from…
Teaching Qualitative Research: Using Theory to Inform Practice
ERIC Educational Resources Information Center
Sallee, Margaret W.
2010-01-01
This article considers how theories of instructional scaffolding--which call for a skilled expert to teach a novice a new task by breaking it into smaller pieces--might be employed in graduate-level qualitative methods courses. The author discusses how she used instructional scaffolding in the design and delivery of a qualitative methods course…
Gooding, Kate; Makwinja, Regina; Nyirenda, Deborah; Vincent, Robin; Sambakunsi, Rodrick
2018-01-01
Background: Evaluation of community and public engagement in research is important to deepen understanding of how engagement works and to enhance its effectiveness. Theories of change have been recommended for evaluating community engagement, for their ability to make explicit intended outcomes and understandings of how engagement activities contribute to these outcomes. However, there are few documented examples of using theories of change for evaluation of engagement. This article reports experience of using theories of change to develop a framework for evaluating community engagement in research at a clinical research organisation in Malawi. We describe the steps used to develop theories of change, and the way theories of change were used to design data collection plans. Based on our experience, we reflect on the advantages and challenges of the theory of change approach. Methods: The theories of change and evaluation framework were developed through a series of workshops and meetings between engagement practitioners, monitoring and evaluation staff, and researchers. We first identified goals for engagement, then used ‘so that’ chains to clarify pathways and intermediate outcomes between engagement activities and goals. Further meetings were held to refine initial theories of change, identify priority information needs, and define feasible evaluation methods. Results: The theory of change approach had several benefits. In particular, it helped to construct an evaluation framework focused on relevant outcomes and not just activities. The process of reflecting on intended goals and pathways also helped staff to review the design of engagement activities. Challenges included practical considerations around time to consider evaluation plans among practitioners (a challenge for evaluation more generally regardless of method), and more fundamental difficulties related to identifying feasible and agreed outcomes. Conclusions: These experiences from Malawi provide lessons for other research organisations considering use of theories of change to support evaluation of community engagement. PMID:29560418
Facial expression recognition under partial occlusion based on fusion of global and local features
NASA Astrophysics Data System (ADS)
Wang, Xiaohua; Xia, Chen; Hu, Min; Ren, Fuji
2018-04-01
Facial expression recognition under partial occlusion is a challenging research problem. This paper proposes a novel framework for facial expression recognition under occlusion by fusing global and local features. In the global aspect, information entropy is first employed to locate the occluded region. Second, the Principal Component Analysis (PCA) method is adopted to reconstruct the occluded region of the image. After that, a replacement strategy is applied: the occluded region is replaced with the corresponding region of the best-matched image in the training set, and a Pyramid Weber Local Descriptor (PWLD) feature is then extracted. At last, the outputs of the SVM are fitted to the probabilities of the target class using a sigmoid function. For the local aspect, an overlapping block-based method is adopted to extract WLD features, each block is weighted adaptively by information entropy, and Chi-square distance and similar-block summation methods are then applied to obtain the probability of each emotion class. Finally, fusion at the decision level is employed for the data fusion of the global and local features based on the Dempster-Shafer theory of evidence. Experimental results on the Cohn-Kanade and JAFFE databases demonstrate the effectiveness and fault tolerance of this method.
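Block-wise information entropy, used above both to locate occlusion and to weight local blocks, can be sketched as follows. The block size, histogram binning, and flagging threshold are invented; a flat occluding patch shows up as a block with near-zero gray-level entropy.

```python
import numpy as np

def block_entropy(block, bins=16):
    hist, _ = np.histogram(block, bins=bins, range=(0, 1))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())      # Shannon entropy of gray levels

rng = np.random.default_rng(8)
face = rng.random((64, 64))                    # stand-in for a face image
face[16:32, 16:32] = 0.0                       # flat "occluded" patch

B = 16
entropies = np.array([[block_entropy(face[i:i+B, j:j+B])
                       for j in range(0, 64, B)] for i in range(0, 64, B)])
occluded = entropies < 0.5 * entropies.mean()  # flag abnormally low-entropy blocks
print(np.argwhere(occluded))
```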
SD-MSAEs: Promoter recognition in human genome based on deep feature extraction.
Xu, Wenxuan; Zhang, Li; Lu, Yaping
2016-06-01
The prediction and recognition of promoters in the human genome play an important role in DNA sequence analysis. Entropy, in the Shannon sense, has many uses in detailed bioinformatic analysis. Relative entropy estimator methods based on statistical divergence (SD) are used to extract meaningful features that distinguish different regions of DNA sequences. In this paper, we choose context features and use a set of SD methods to select the most effective n-mers distinguishing promoter regions from other DNA regions in the human genome. From the total possible combinations of n-mers, we obtain four sparse distributions based on promoter and non-promoter training samples. The informative n-mers are selected by optimizing the differentiating extents of these distributions. Specifically, we combine the advantages of statistical divergence and multiple sparse auto-encoders (MSAEs) in deep learning to extract deep features for promoter recognition. We then apply multiple SVMs and a decision model to construct a human promoter recognition method called SD-MSAEs. The framework is flexible in that it can freely integrate new feature extraction or classification models. Experimental results show that our method has high sensitivity and specificity. Copyright © 2016 Elsevier Inc. All rights reserved.
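The divergence-based selection step can be illustrated by scoring each n-mer by its contribution to the KL divergence between promoter and background frequency distributions, then ranking; the four-entry frequency vectors below are fabricated for illustration.

```python
import numpy as np

def kl_contributions(p, q, eps=1e-12):
    p, q = np.asarray(p) + eps, np.asarray(q) + eps
    p, q = p / p.sum(), q / q.sum()
    return p * np.log2(p / q)          # per-n-mer terms; their sum is D_KL(p||q)

promoter_freq = np.array([0.30, 0.25, 0.25, 0.20])    # fabricated n-mer frequencies
background_freq = np.array([0.10, 0.30, 0.30, 0.30])

scores = kl_contributions(promoter_freq, background_freq)
ranking = np.argsort(scores)[::-1]     # most discriminative n-mers first
print("ranked n-mer indices:", ranking, "D_KL =", round(scores.sum(), 3))
```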
Control Theory and Statistical Generalizations.
ERIC Educational Resources Information Center
Powers, William T.
1990-01-01
Contrasts modeling methods in control theory to the methods of statistical generalizations in empirical studies of human or animal behavior. Presents a computer simulation that predicts behavior based on variables (effort and rewards) determined by the invariable (desired reward). Argues that control theory methods better reflect relationships to…
Baker-Ericzén, Mary J; Jenkins, Melissa M; Park, Soojin; Garland, Ann F
2015-02-01
Mental health professionals' decision-making practice is an area of increasing interest and importance, especially in the pediatric research and clinical communities. The present study explored the role of prior training in evidence-based treatments on clinicians' assessment and treatment formulations using case vignettes. Specifically, study aims included using the Naturalistic Decision Making (NDM) cognitive theory to 1) examine potential associations between EBT training and decision-making processes (novice versus expert type), and 2) explore how client and family contextual information affects clinical decision-making. Forty-eight clinicians across two groups (EBT trained=14, Not EBT trained=34) participated. Clinicians were comparable on professional experience, demographics, and discipline. The quasi-experimental design used an analog "think aloud" method where clinicians read case vignettes about a child with disruptive behavior problems and verbalized case conceptualization and treatment planning out loud. Responses were coded according to NDM theory. MANOVA results were significant for EBT training status such that EBT-trained clinicians displayed cognitive processes more closely aligned with "expert" decision-makers and non-EBT-trained clinicians' decision processes were more similar to those of "novice" decision-makers, following NDM theory. Non-EBT-trained clinicians assigned significantly more diagnoses, provided less detailed treatment plans and discussed fewer EBTs. Parent/family contextual information also appeared to influence decision-making. This study offers a preliminary investigation of the possible broader impacts of EBT training and potential associations with development of expert decision-making skills. Targeting clinicians' decision-making may be an important avenue to pursue within dissemination-implementation efforts in mental health practice.
A novel image watermarking method based on singular value decomposition and digital holography
NASA Astrophysics Data System (ADS)
Cai, Zhishan
2016-10-01
According to information optics theory, a novel watermarking method based on Fourier-transform digital holography and singular value decomposition (SVD) is proposed in this paper. First, a watermark image is converted to a digital hologram using the Fourier transform. After that, the original image is divided into many non-overlapping blocks, and all the blocks and the hologram are decomposed using SVD. The singular value components of the hologram are then embedded into the singular value components of each block using an addition principle. Finally, the inverse SVD transformation is carried out on the blocks and hologram to generate the watermarked image. When extracting the watermark, the information embedded in each block is extracted first; an averaging operation is then carried out on the extracted information to generate the final watermark information. Finally, the algorithm is simulated. Furthermore, to test the watermarked image's robustness against attacks, various attack tests are carried out. The results show that the proposed algorithm is very robust against noise interference, image cropping, compression, brightness stretching, etc. In particular, when the image is rotated by a large angle, the watermark information can still be extracted correctly.
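The additive SVD embedding rule can be sketched in a few lines. The block size, embedding strength alpha, and the random matrix standing in for the Fourier hologram are invented, and the hologram-generation step of the paper is omitted.

```python
import numpy as np

rng = np.random.default_rng(9)
block = rng.random((8, 8))            # one non-overlapping host block
mark = rng.random((8, 8))             # stand-in for the Fourier hologram
alpha = 0.05                          # embedding strength

Ub, Sb, Vb = np.linalg.svd(block)     # Vb is V^H, as returned by numpy
Um, Sm, Vm = np.linalg.svd(mark)

S_embedded = Sb + alpha * Sm          # additive rule on singular values
watermarked = Ub @ np.diag(S_embedded) @ Vb

# Extraction reverses the addition, given the original singular values.
S_extracted = (np.linalg.svd(watermarked, compute_uv=False) - Sb) / alpha
print(np.allclose(S_extracted, Sm, atol=1e-8))
```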
Adaptive, Distributed Control of Constrained Multi-Agent Systems
NASA Technical Reports Server (NTRS)
Bieniawski, Stefan; Wolpert, David H.
2004-01-01
Product Distribution (PD) theory was recently developed as a broad framework for analyzing and optimizing distributed systems. Here we demonstrate its use for adaptive distributed control of Multi-Agent Systems (MASs), i.e., for distributed stochastic optimization using MASs. First we review one motivation of PD theory, as the information-theoretic extension of conventional full-rationality game theory to the case of bounded rational agents. In this extension the equilibrium of the game is the optimizer of a Lagrangian of the probability distribution on the joint state of the agents. When the game in question is a team game with constraints, that equilibrium optimizes the expected value of the team game utility, subject to those constraints. One common way to find that equilibrium is to have each agent run a Reinforcement Learning (RL) algorithm. PD theory reveals this to be a particular type of search algorithm for minimizing the Lagrangian. Typically that algorithm is quite inefficient. A more principled alternative is to use a variant of Newton's method to minimize the Lagrangian. Here we compare this alternative to RL-based search in three sets of computer experiments. These are the N Queens problem and bin-packing problem from the optimization literature, and the Bar problem from the distributed RL literature. Our results confirm that the PD-theory-based approach outperforms the RL-based scheme in all three domains.
Information dynamics in carcinogenesis and tumor growth.
Gatenby, Robert A; Frieden, B Roy
2004-12-21
The storage and transmission of information is vital to the function of normal and transformed cells. We use methods from information theory and Monte Carlo theory to analyze the role of information in carcinogenesis. Our analysis demonstrates that, during somatic evolution of the malignant phenotype, the accumulation of genomic mutations degrades intracellular information. However, the degradation is constrained by the Darwinian somatic ecology in which mutant clones proliferate only when the mutation confers a selective growth advantage. In that environment, genes that normally decrease cellular proliferation, such as tumor suppressor or differentiation genes, suffer maximum information degradation. Conversely, those that increase proliferation, such as oncogenes, are conserved or exhibit only gain of function mutations. These constraints shield most cellular populations from catastrophic mutator-induced loss of the transmembrane entropy gradient and, therefore, cell death. The dynamics of constrained information degradation during carcinogenesis cause the tumor genome to asymptotically approach a minimum information state that is manifested clinically as dedifferentiation and unconstrained proliferation. Extreme physical information (EPI) theory demonstrates that altered information flow from cancer cells to their environment will manifest in vivo as power law tumor growth with an exponent of 1.62. This prediction is based only on the assumption that tumor cells are at an absolute information minimum and are capable of "free field" growth; that is, they are unconstrained by external biological parameters. The prediction agrees remarkably well with several studies demonstrating power law growth in small human breast cancers with an exponent of 1.72 ± 0.24. This successful derivation of an analytic expression for cancer growth from EPI alone supports the conceptual model that carcinogenesis is a process of constrained information degradation and that malignant cells are minimum information systems. EPI theory also predicts that the estimated age of a clinically observed tumor is subject to a root-mean-square error of about 30%. This is due to information loss and tissue disorganization and probably manifests as a randomly variable lag phase in the growth pattern that has been observed experimentally. This difference between tumor size and age may impose a fundamental limit on the efficacy of screening based on early detection of small tumors. Independent of the EPI analysis, Monte Carlo methods are applied to predict statistical tumor growth due to perturbed information flow from the environment into transformed cells. A "simplest" Monte Carlo model is suggested by the findings in the EPI approach that tumor growth arises out of a minimally complex mechanism. The outputs of large numbers of simulations show that (a) about 40% of the populations do not survive the first two generations due to mutations in critical gene segments; but (b) those that do survive will experience power law growth identical to the predicted rate obtained from the independent EPI approach. The agreement between these two very different approaches to the problem strongly supports the idea that tumor cells regress to a state of minimum information during carcinogenesis, and that information dynamics are integrally related to tumor development and growth.
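The power-law growth claim suggests a simple numerical check: estimate the exponent by log-log regression of size against time. The synthetic sizes below are generated with exponent 1.62 purely for illustration; no real tumor data are used.

```python
import numpy as np

rng = np.random.default_rng(10)
t = np.linspace(1.0, 10.0, 30)                      # time since initiation (a.u.)
# Noisy power-law sizes: V(t) = c * t^1.62 with multiplicative noise.
size = 2.0 * t ** 1.62 * np.exp(rng.normal(0, 0.1, t.size))

# Fit log V = slope * log t + intercept; the slope estimates the exponent.
slope, intercept = np.polyfit(np.log(t), np.log(size), 1)
print(f"estimated exponent = {slope:.2f}")
```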
Use of empathy in psychiatric practice: constructivist grounded theory study
Watling, Chris
2017-01-01
Background Psychiatry has faced significant criticism for overreliance on the Diagnostic and Statistical Manual of Mental Disorders (DSM) and medications with purported disregard for empathetic, humanistic interventions. Aims To develop an empirically based qualitative theory explaining how psychiatrists use empathy in day-to-day practice, to inform practice and teaching approaches. Method This study used constructivist grounded theory methodology to ask (a) ‘How do psychiatrists understand and use empathetic engagement in the day-to-day practice of psychiatry?’ and (b) ‘How do psychiatrists learn and teach the skills of empathetic engagement?’ The authors interviewed 17 academic psychiatrists and 4 residents and developed a theory by iterative coding of the collected data. Results This constructivist grounded theory of empathetic engagement in psychiatric practice considered three major elements: relational empathy, transactional empathy and instrumental empathy. As one moves from relational empathy through transactional empathy to instrumental empathy, the actions of the psychiatrist become more deliberate and interventional. Conclusions Participants described empathy-based interventions, which are presented in a theory of ’empathetic engagement’. This is in contrast to a paradigm that sees psychiatry as purely based on neurobiological interventions, with psychotherapy and interpersonal interventions as completely separate activities from day-to-day psychiatric practice. Declaration of interest None. Copyright and usage © The Royal College of Psychiatrists 2017. This is an open access article distributed under the terms of the Creative Commons Non-Commercial, No Derivatives (CC BY-NC-ND) license. PMID:28243463
Theory research of seam recognition and welding torch pose control based on machine vision
NASA Astrophysics Data System (ADS)
Long, Qiang; Zhai, Peng; Liu, Miao; He, Kai; Wang, Chunyang
2017-03-01
At present, automation requirements for welding are becoming higher, so a method of extracting welding information with a vision sensor is proposed in this paper and simulated in MATLAB. Besides, in order to improve the quality of robotic automatic welding, an information retrieval method for welding torch pose control by visual sensor is attempted. Considering the demands of welding technology and engineering practice, the relative coordinate systems and variables are strictly defined, the mathematical model of the welding pose is established, and its feasibility is verified by MATLAB simulation. These works lay a foundation for the development of a welding off-line programming system with high precision and quality.
Analysis and design of nonlinear resonances via singularity theory
NASA Astrophysics Data System (ADS)
Cirillo, G. I.; Habib, G.; Kerschen, G.; Sepulchre, R.
2017-03-01
Bifurcation theory and continuation methods are well-established tools for the analysis of nonlinear mechanical systems subject to periodic forcing. We illustrate the added value and the complementary information provided by singularity theory with one distinguished parameter. While tracking bifurcations reveals the qualitative changes in the behaviour, tracking singularities reveals how structural changes are themselves organised in parameter space. The complementarity of that information is demonstrated in the analysis of detached resonance curves in a two-degree-of-freedom system.
Automatic Trading Agent. RMT Based Portfolio Theory and Portfolio Selection
NASA Astrophysics Data System (ADS)
Snarska, M.; Krzych, J.
2006-11-01
Portfolio theory is a very powerful tool in modern investment theory. It is helpful in estimating the risk of an investor's portfolio, arising from lack of information, uncertainty and incomplete knowledge of reality, which forbids a perfect prediction of future price changes. Despite its many advantages, this tool is not known and not widely used among investors on the Warsaw Stock Exchange. The main reason for abandoning this method is a high level of complexity and immense calculations. The aim of this paper is to introduce an automatic decision-making system, which allows a single investor to use complex methods of Modern Portfolio Theory (MPT). The key tool in MPT is the analysis of an empirical covariance matrix. This matrix, obtained from historical data, is biased by such a high amount of statistical uncertainty that it can be seen as random. By bringing into practice the ideas of Random Matrix Theory (RMT), the noise is removed or significantly reduced, so the future risk and return are better estimated and controlled. These concepts are applied to the Warsaw Stock Exchange Simulator {http://gra.onet.pl}. The result of the simulation is an 18% gain in comparison with the respective 10% loss of the Warsaw Stock Exchange main index WIG.
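A hedged sketch of the kind of RMT noise filtering the abstract alludes to: eigenvalues of an empirical correlation matrix that fall below the Marchenko-Pastur upper edge are treated as noise and clipped, which is one common cleaning recipe rather than necessarily the authors' exact procedure. The toy returns here are pure noise, invented for illustration.

```python
import numpy as np

# Toy returns: T observations of N assets (pure noise here, for illustration).
rng = np.random.default_rng(2)
T_obs, N = 500, 100
R = rng.standard_normal((T_obs, N))
C = np.corrcoef(R, rowvar=False)

# Marchenko-Pastur upper edge for a random correlation matrix.
q = N / T_obs
lam_max = (1 + np.sqrt(q))**2

vals, vecs = np.linalg.eigh(C)
noise = vals < lam_max
# Clip noisy eigenvalues to their average; any "signal" eigenvalues stay intact.
vals_clean = vals.copy()
vals_clean[noise] = vals[noise].mean()
C_clean = (vecs * vals_clean) @ vecs.T
# Restore unit diagonal so C_clean remains a correlation matrix.
d = np.sqrt(np.diag(C_clean))
C_clean = C_clean / np.outer(d, d)
```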
NASA Astrophysics Data System (ADS)
Chen, K.; Y Zhang, T.; Zhang, F.; Zhang, Z. R.
2017-12-01
Grey system theory takes as its research object uncertain systems in which information is partly known and partly unknown; it extracts useful information from the known part and thereby reveals the potential variation rule of the system. In order to research the applicability of this data-driven modelling method to fitting and predicting the melting peak temperature (T m) of polypropylene (PP) during ultraviolet radiation aging, the T m of homo-polypropylene after different ultraviolet radiation exposure times, investigated by differential scanning calorimetry, was fitted and predicted by the grey GM(1, 1) model based on grey system theory. The results show that the T m of PP declines as aging time is prolonged, and the fitting and prediction equation obtained by the grey GM(1, 1) model is T m = 166.567472exp(-0.00012t). The fitting effect of the above equation is excellent, and the maximum relative error between the predicted and actual values of T m is 0.32%. Grey system theory needs less original data, has high prediction accuracy, and can be used to predict the aging behaviour of PP.
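For readers unfamiliar with GM(1,1), the standard fitting recipe is short enough to sketch: accumulate the series, regress the differenced data on the background values to obtain the development coefficient a and grey input b, then invert the accumulation to recover fitted values. The exposure times below are hypothetical, merely shaped like the paper's reported equation, and equally spaced samples are assumed.

```python
import numpy as np

def gm11_fit(x0):
    """Fit a grey GM(1,1) model to a short positive series x0."""
    x1 = np.cumsum(x0)                           # accumulated generating operation (AGO)
    z = 0.5 * (x1[1:] + x1[:-1])                 # background values
    B = np.column_stack([-z, np.ones(len(z))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    return a, b

def gm11_predict(x0, a, b, n):
    """Fitted/predicted values of the original series for time indices k = 0..n-1."""
    k = np.arange(n)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    return np.concatenate([[x0[0]], np.diff(x1_hat)])   # inverse AGO

# Hypothetical, equally spaced exposure times (h) with T_m shaped like the
# paper's reported equation T_m = 166.567*exp(-0.00012*t).
t = np.array([0, 200, 400, 600, 800])
Tm = 166.567 * np.exp(-0.00012 * t)
a, b = gm11_fit(Tm)
print(gm11_predict(Tm, a, b, 6))                 # last value extrapolates one step ahead
```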
Implementation of Complexity Analyzing Based on Additional Effect
NASA Astrophysics Data System (ADS)
Zhang, Peng; Li, Na; Liang, Yanhong; Liu, Fang
According to Complexity Theory, there is complexity in a system when a functional requirement is not satisfied. Several studies have addressed Complexity Theory based on Axiomatic Design; however, they focus on reducing complexity, and none addresses a method of analyzing the complexity in a system. Therefore, this paper puts forth a method of analyzing complexity that seeks to make up for that deficiency. In order to discuss the method of analyzing complexity based on additional effect, this paper puts forth two concepts: ideal effect and additional effect. The method of analyzing complexity based on additional effect combines Complexity Theory with the Theory of Inventive Problem Solving (TRIZ). It is helpful for designers to analyze complexity by using additional effect. A case study shows the application of the process.
Rocchini, Duccio
2009-01-01
Measuring heterogeneity in satellite imagery is an important task. Most measures of spectral diversity have been based on Shannon information theory. However, this approach does not inherently address different scales, ranging from local (hereafter referred to as alpha diversity) to global scales (gamma diversity). The aim of this paper is to propose a method for measuring spectral heterogeneity at multiple scales based on rarefaction curves. An algorithmic solution of rarefaction applied to image pixel values (Digital Numbers, DNs) is provided and discussed. PMID:22389600
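A minimal sketch of rarefaction applied to pixel values, assuming a resampling approximation (repeated random subsampling) in place of the analytical hypergeometric formulation; the expected number of distinct DNs as subsample size grows plays the role of expected species richness, and the toy image is invented.

```python
import numpy as np

def rarefaction_curve(dns, max_n=None, trials=200, seed=0):
    """Expected number of distinct pixel values (DNs) in random subsamples
    of increasing size -- a spectral analogue of species rarefaction."""
    rng = np.random.default_rng(seed)
    dns = np.asarray(dns).ravel()
    max_n = max_n or len(dns)
    sizes = np.unique(np.linspace(1, max_n, 30, dtype=int))
    curve = []
    for n in sizes:
        richness = [len(np.unique(rng.choice(dns, size=n, replace=False)))
                    for _ in range(trials)]
        curve.append(np.mean(richness))
    return sizes, np.array(curve)

# Toy "image": a 32x32 patch of quantized DNs.
img = np.random.default_rng(1).integers(0, 64, size=(32, 32))
sizes, expected_richness = rarefaction_curve(img)
```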
NASA Technical Reports Server (NTRS)
Thomas, J. M.; Hanagud, S.
1974-01-01
The design criteria and test options for aerospace structural reliability were investigated. A decision methodology was developed for selecting a combination of structural tests and structural design factors. The decision method involves the use of Bayesian statistics and statistical decision theory. Procedures are discussed for obtaining and updating data-based probabilistic strength distributions for aerospace structures when test information is available and for obtaining subjective distributions when data are not available. The techniques used in developing the distributions are explained.
Moments of inclination error distribution computer program
NASA Technical Reports Server (NTRS)
Myler, T. R.
1981-01-01
A FORTRAN coded computer program is described which calculates orbital inclination error statistics using a closed-form solution. This solution uses a data base of trajectory errors from actual flights to predict the orbital inclination error statistics. The Scott flight history data base consists of orbit insertion errors in the trajectory parameters - altitude, velocity, flight path angle, flight azimuth, latitude and longitude. The methods used to generate the error statistics are of general interest since they have other applications. Program theory, user instructions, output definitions, subroutine descriptions and detailed FORTRAN coding information are included.
NASA Astrophysics Data System (ADS)
He, Di; Lim, Boon Pang; Yang, Xuesong; Hasegawa-Johnson, Mark; Chen, Deming
2018-06-01
Most mainstream Automatic Speech Recognition (ASR) systems consider all feature frames equally important. However, acoustic landmark theory is based on a contradictory idea, that some frames are more important than others. Acoustic landmark theory exploits quantal non-linearities in the articulatory-acoustic and acoustic-perceptual relations to define landmark times at which the speech spectrum abruptly changes or reaches an extremum; frames overlapping landmarks have been demonstrated to be sufficient for speech perception. In this work, we conduct experiments on the TIMIT corpus, with both GMM- and DNN-based ASR systems, and find that frames containing landmarks are more informative for ASR than others. We find that altering the level of emphasis on landmarks by re-weighting acoustic likelihood tends to reduce the phone error rate (PER). Furthermore, by leveraging the landmark as a heuristic, one of our hybrid DNN frame dropping strategies maintained a PER within 0.44% of optimal when scoring less than half (45.8% to be precise) of the frames. This hybrid strategy out-performs other non-heuristic-based methods and demonstrates the potential of landmarks for reducing computation.
NASA Astrophysics Data System (ADS)
Loschetter, Annick; Rohmer, Jérémy
2016-04-01
Standard and new-generation monitoring observations provide, in almost real time, important information about the evolution of a volcanic system. These observations are used to update the model and contribute to a better hazard assessment and to support decision making concerning potential evacuation. The framework BET_EF (based on a Bayesian Event Tree) developed by INGV enables the integration of information from monitoring with the prospect of decision making. Using this framework, the objectives of the present work are i. to propose a method to assess the added value of information from monitoring (within Value Of Information (VOI) theory); ii. to perform sensitivity analysis on the different parameters that influence the VOI from monitoring. VOI consists in assessing the possible increase in expected value provided by gathering information, for instance through monitoring. Basically, the VOI is the difference between the value with information and the value without additional information in a cost-benefit approach. This theory is well suited to situations that can be represented in the form of a decision tree, such as the BET_EF tool. Reference values and ranges of variation (for sensitivity analysis) were defined for input parameters, based on data from the MESIMEX exercise (performed at Vesuvio volcano in 2006). Complementary methods for sensitivity analysis were implemented: local, global using Sobol' indices, and regional using Contribution to Sample Mean and Variance plots. The results (specific to the case considered) obtained with the different techniques are in good agreement and enable answering the following questions: i. Which characteristics of monitoring are important for early warning (reliability)? ii. How do experts' opinions influence the hazard assessment and thus the decision? Concerning the characteristics of monitoring, the most influential parameters are the means rather than the variances for the case considered. For the parameters that concern expert settings, the weight attributed to monitoring measurement ω, the mean of thresholds, the economic context and the setting of the decision threshold are very influential. The interest of applying VOI theory (more precisely, the value of imperfect information) in the BET framework was demonstrated as support for helping experts set up the monitoring system and for helping managers decide on the installation of additional monitoring systems. Acknowledgments: This work was carried out in the framework of the project MEDSUV. This project is funded under the call FP7 ENV.2012.6.4-2: Long-term monitoring experiment in geologically active regions of Europe prone to natural hazards: the Supersite concept. Grant agreement n°308665.
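The core VOI computation is a small exercise in Bayesian decision analysis: compare the best expected outcome acting on the prior alone with the expected best outcome after observing an imperfect monitoring signal. The sketch below uses invented costs, prior, and hit/false-alarm rates, not MESIMEX values, and works with expected costs so the value of information appears as a cost reduction.

```python
import numpy as np

# Prior probability of an eruption (hypothetical number for illustration).
p_eruption = 0.1
# Costs (arbitrary units): evacuating is costly either way; staying put
# before an eruption is catastrophic.
cost = {("evacuate", True): 10, ("evacuate", False): 10,
        ("stay", True): 100, ("stay", False): 0}

def expected_cost(action, p):
    return p * cost[(action, True)] + (1 - p) * cost[(action, False)]

# Value without additional information: act on the prior alone.
v_prior = min(expected_cost(a, p_eruption) for a in ("evacuate", "stay"))

# Imperfect monitoring signal with assumed hit and false-alarm rates.
p_alarm_given_eruption, p_alarm_given_quiet = 0.9, 0.2
p_alarm = (p_alarm_given_eruption * p_eruption
           + p_alarm_given_quiet * (1 - p_eruption))
post_alarm = p_alarm_given_eruption * p_eruption / p_alarm
post_quiet = (1 - p_alarm_given_eruption) * p_eruption / (1 - p_alarm)

# Value with information: pick the best action after each possible signal.
v_info = (p_alarm * min(expected_cost(a, post_alarm) for a in ("evacuate", "stay"))
          + (1 - p_alarm) * min(expected_cost(a, post_quiet) for a in ("evacuate", "stay")))

voi = v_prior - v_info     # value of (imperfect) information from monitoring
print(voi)
```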
Statistics-based email communication security behavior recognition
NASA Astrophysics Data System (ADS)
Yi, Junkai; Su, Yueyang; Zhao, Xianghui
2017-08-01
With the development of information technology, e-mail has become a popular communication medium, and it is of great significance to determine the relationship between the two sides of a communication. Firstly, this paper analyses and processes the content and attachments of e-mails using steganalysis and malware analysis skills, followed by feature extraction and behaviour model establishment based on Naive Bayesian theory. Then a behaviour analysis method is employed to calculate and evaluate the communication security. Finally, some experiments on the accuracy of identifying the behavioural relationship of communication have been carried out. The results show that this method is effective, with a correctness of eighty-four percent.
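A hedged sketch of the Naive Bayesian step, assuming binary behaviour features and a hand-rolled Bernoulli model with Laplace smoothing; the feature matrix, labels, and feature meanings are all invented for illustration and do not reproduce the paper's feature set.

```python
import numpy as np

# Toy feature matrix: each row is an e-mail, each column a binary behaviour
# feature (e.g. "attachment flagged by steganalysis", "suspicious link").
# Labels: 1 = insecure communication behaviour, 0 = normal. All values invented.
X = np.array([[1, 1, 0], [1, 0, 1], [0, 0, 0], [0, 1, 0], [1, 1, 1], [0, 0, 1]])
y = np.array([1, 1, 0, 0, 1, 0])

def fit_bernoulli_nb(X, y, alpha=1.0):
    """Laplace-smoothed Bernoulli Naive Bayes: P(class) and P(feature|class)."""
    priors, likelihoods = {}, {}
    for c in (0, 1):
        Xc = X[y == c]
        priors[c] = len(Xc) / len(X)
        likelihoods[c] = (Xc.sum(axis=0) + alpha) / (len(Xc) + 2 * alpha)
    return priors, likelihoods

def predict(x, priors, likelihoods):
    """Return the class with the highest log-posterior for binary features x."""
    scores = {}
    for c in (0, 1):
        p = likelihoods[c]
        scores[c] = np.log(priors[c]) + np.sum(
            x * np.log(p) + (1 - x) * np.log(1 - p))
    return max(scores, key=scores.get)

priors, likelihoods = fit_bernoulli_nb(X, y)
print(predict(np.array([1, 0, 0]), priors, likelihoods))
```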
Authorship attribution based on Life-Like Network Automata
Machicao, Jeaneth; Corrêa, Edilson A.; Miranda, Gisele H. B.; Amancio, Diego R.
2018-01-01
The authorship attribution is a problem of considerable practical and technical interest. Several methods have been designed to infer the authorship of disputed documents in multiple contexts. While traditional statistical methods based solely on word counts and related measurements have provided a simple, yet effective solution in particular cases, they are prone to manipulation. Recently, texts have been successfully modeled as networks, where words are represented by nodes linked according to textual similarity measurements. Such models are useful to identify informative topological patterns for the authorship recognition task. However, there is no consensus on which measurements should be used. Thus, we proposed a novel method to characterize text networks, by considering both topological and dynamical aspects of networks. Using concepts and methods from cellular automata theory, we devised a strategy to grasp informative spatio-temporal patterns from this model. Our experiments revealed an outperformance over structural analysis relying only on topological measurements, such as clustering coefficient, betweenness and shortest paths. The optimized results obtained here pave the way for a better characterization of textual networks. PMID:29566100
NASA Technical Reports Server (NTRS)
Reuther, James; Alonso, Juan Jose; Rimlinger, Mark J.; Jameson, Antony
1996-01-01
This work describes the application of a control theory-based aerodynamic shape optimization method to the problem of supersonic aircraft design. The design process is greatly accelerated through the use of both control theory and a parallel implementation on distributed memory computers. Control theory is employed to derive the adjoint differential equations whose solution allows for the evaluation of design gradient information at a fraction of the computational cost required by previous design methods. The resulting problem is then implemented on parallel distributed memory architectures using a domain decomposition approach, an optimized communication schedule, and the MPI (Message Passing Interface) Standard for portability and efficiency. The final result achieves very rapid aerodynamic design based on higher order computational fluid dynamics methods (CFD). In our earlier studies, the serial implementation of this design method was shown to be effective for the optimization of airfoils, wings, wing-bodies, and complex aircraft configurations using both the potential equation and the Euler equations. In our most recent paper, the Euler method was extended to treat complete aircraft configurations via a new multiblock implementation. Furthermore, during the same conference, we also presented preliminary results demonstrating that this basic methodology could be ported to distributed memory parallel computing architectures. In this paper, our concern will be to demonstrate that the combined power of these new technologies can be used routinely in an industrial design environment by applying it to the case study of the design of typical supersonic transport configurations. A particular difficulty of this test case is posed by the propulsion/airframe integration.
2010-01-01
Background This paper presents the development of a study design built on the principles of theory-driven evaluation. The theory-driven evaluation approach was used to evaluate an adolescent sexual and reproductive health intervention in Mali, Burkina Faso and Cameroon to improve continuity of care through the creation of networks of social and health care providers. Methods/design Based on our experience and the existing literature, we developed a six-step framework for the design of theory-driven evaluations, which we applied in the ex-post evaluation of the networking component of the intervention. The protocol was drafted with the input of the intervention designer. The programme theory, the central element of theory-driven evaluation, was constructed on the basis of semi-structured interviews with designers, implementers and beneficiaries and an analysis of the intervention's logical framework. Discussion The six-step framework proved useful as it allowed for a systematic development of the protocol. We describe the challenges at each step. We found that there is little practical guidance in the existing literature, and also a mix-up of terminology across theory-driven evaluation approaches. There is a need for empirical methodological development in order to refine the tools to be used in theory-driven evaluation. We conclude that ex-post evaluations of programmes can be based on such an approach if the required information on context and mechanisms is collected during the programme. PMID:21118510
Design-based and model-based inference in surveys of freshwater mollusks
Dorazio, R.M.
1999-01-01
Well-known concepts in statistical inference and sampling theory are used to develop recommendations for planning and analyzing the results of quantitative surveys of freshwater mollusks. Two methods of inference commonly used in survey sampling (design-based and model-based) are described and illustrated using examples relevant in surveys of freshwater mollusks. The particular objectives of a survey and the type of information observed in each unit of sampling can be used to help select the sampling design and the method of inference. For example, the mean density of a sparsely distributed population of mollusks can be estimated with higher precision by using model-based inference or by using design-based inference with adaptive cluster sampling than by using design-based inference with conventional sampling. More experience with quantitative surveys of natural assemblages of freshwater mollusks is needed to determine the actual benefits of different sampling designs and inferential procedures.
Rationalising the 'irrational': a think aloud study of discrete choice experiment responses.
Ryan, Mandy; Watson, Verity; Entwistle, Vikki
2009-03-01
Stated preference methods assume respondents' preferences are consistent with utility theory, but many empirical studies report evidence of preferences that violate utility theory. This evidence is often derived from quantitative tests that occur naturally within, or are added to, stated preference tasks. In this study, we use qualitative methods to explore three axioms of utility theory: completeness, monotonicity, and continuity. We take a novel approach, adopting a 'think aloud' technique to identify violations of the axioms of utility theory and to consider how well the quantitative tests incorporated within a discrete choice experiment are able to detect these. Results indicate that quantitative tests classify respondents as being 'irrational' when qualitative statements would indicate they are 'rational'. In particular, 'non-monotonic' responses can often be explained by respondents inferring additional information beyond what is presented in the task, and individuals who appear to adopt non-compensatory decision-making strategies do so because they rate particular attributes very highly (they are not attempting to simplify the task). The results also provide evidence of 'cost-based responses': respondents assumed tests with higher costs would be of higher quality. The value of including in-depth qualitative validation techniques in the development of stated preference tasks is shown.
A Population Health Surveillance Theory
Bigras-Poulin, Michel; Michel, Pascal; Ravel, André
2012-01-01
OBJECTIVES Despite its extensive use, the term "Surveillance" often takes on various meanings in the scientific literature pertinent to public health and animal health. A critical appraisal of this literature also reveals ambiguities relating to the scope and necessary structural components underpinning the surveillance process. The authors hypothesized that these inconsistencies translate to real or perceived deficiencies in the conceptual framework of population health surveillance. This paper presents a population health surveillance theory framed upon an explicit conceptual system relative to health surveillance performed in human and animal populations. METHODS The population health surveillance theory reflects the authors' system of thinking and was based on a creative process. RESULTS Population health surveillance includes two broad components: one relating to the human organization (which includes expertise and the administrative program), and one relating to the system per se (which includes elements of design and method) and which can be viewed as a process. The population health surveillance process is made of five sequential interrelated steps: 1) a trigger or need, 2) problem formulation, 3) surveillance planning, 4) surveillance implementation, and 5) information communication and audit. CONCLUSIONS The population health surveillance theory provides a systematic way of understanding, organizing and evaluating the population health surveillance process. PMID:23251837
Jerusalem lectures on black holes and quantum information
NASA Astrophysics Data System (ADS)
Harlow, D.
2016-01-01
These lectures give an introduction to the quantum physics of black holes, including recent developments based on quantum information theory such as the firewall paradox and its various cousins. An introduction is also given to holography and the anti-de Sitter/conformal field theory (AdS/CFT) correspondence, focusing on those aspects which are relevant for the black hole information problem.
ERIC Educational Resources Information Center
Cole, Charles; Mandelblatt, Bertie
2000-01-01
Uses Kintsch's proposition-based construction-integration theory of discourse comprehension to detail the user coding operations that occur in each of the three subsystems (Perception, Comprehension, Application) in which users process an information retrieval system's (IRS) message. Describes an IRS device made up of two separate parts that enable…
What Density Functional Theory could do for Quantum Information
NASA Astrophysics Data System (ADS)
Mattsson, Ann
2015-03-01
The Hohenberg-Kohn theorem of Density Functional Theory (DFT), and extensions thereof, tells us that all properties of a system of electrons can be determined through their density, which uniquely determines the many-body wave-function. Given access to the appropriate, universal, functionals of the density we would, in theory, be able to determine all observables of any electronic system, without explicit reference to the wave-function. On the other hand, the wave-function is at the core of Quantum Information (QI), with the wave-function of a set of qubits being the central computational resource in a quantum computer. While there is seemingly little overlap between DFT and QI, reliance upon observables forms a key connection. Though the time-evolution of the wave-function and associated phase information is fundamental to quantum computation, the initial and final states of a quantum computer are characterized by observables of the system. While observables can be extracted directly from a system's wave-function, DFT tells us that we may be able to intuit a method for extracting them from its density. In this talk, I will review the fundamentals of DFT and how these principles connect to the world of QI. This will range from DFT's utility in the engineering of physical qubits, to the possibility of using it to efficiently (but approximately) simulate Hamiltonians at the logical level. The apparent paradox of describing algorithms based on the quantum mechanical many-body wave-function with a DFT-like theory based on observables will remain a focus throughout. The ultimate goal of this talk is to initiate a dialog about what DFT could do for QI, in theory and in practice. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
Inclusive Breakup Theory of Three-Body Halos
NASA Astrophysics Data System (ADS)
Hussein, Mahir S.; Souza, Lucas A.; Chimanski, Emanuel; Carlson, Brett; Frederico, Tobias
2017-11-01
We present a recently developed theory for the inclusive breakup of three-fragment projectiles within a four-body spectator model [1], for the treatment of elastic and inclusive non-elastic breakup reactions involving weakly bound three-cluster nuclei in A(a, b)X collisions, where a = x1 + x2 + b. The four-body theory is an extension of the three-body approaches developed in the 80's by Ichimura, Austern and Vincent (IAV) [2], Udagawa and Tamura (UT) [3] and Hussein and McVoy (HM) [4]. We expect that experimentalists will be encouraged to search for more information about the x1 + x2 system in the elastic breakup cross section, and that further developments and extensions of the surrogate method will be pursued, based on the inclusive non-elastic breakup part of the b spectrum.
Digital Game-Based Learning Supports Student Motivation, Cognitive Success, and Performance Outcomes
ERIC Educational Resources Information Center
Woo, Jeng-Chung
2014-01-01
Traditional multimedia learning is primarily based on the cognitive load concept of information processing theory. Recent digital game-based learning (DGBL) studies have focused on exploring content support for learning motivation and related game characteristics. Motivation, volition, and performance (MVP) theory indicates that cognitive load and…
ERIC Educational Resources Information Center
Penton, John
Designed to provide information about reading in New Zealand, this report offers an overview of theory and practice in that area. Among the topics discussed are: the current concern about reading standards; developmental reading; effective methods of reading instruction; research into the nature of the reading process; preparation of teachers of…
Information theory lateral density distribution for Earth inferred from global gravity field
NASA Technical Reports Server (NTRS)
Rubincam, D. P.
1981-01-01
Information Theory Inference, better known as the Maximum Entropy Method, was used to infer the lateral density distribution inside the Earth. The approach assumed that the Earth consists of indistinguishable Maxwell-Boltzmann particles populating infinitesimal volume elements, and followed the standard methods of statistical mechanics (maximizing the entropy function). The GEM 10B spherical harmonic gravity field coefficients, complete to degree and order 36, were used as constraints on the lateral density distribution. The spherically symmetric part of the density distribution was assumed to be known. The lateral density variation was assumed to be small compared to the spherically symmetric part. The resulting information theory density distribution for the cases of no crust removed, 30 km of compensated crust removed, and 30 km of uncompensated crust removed all gave broad density anomalies extending deep into the mantle, but with the density contrasts being the greatest towards the surface (typically +/-0.004 g cm^-3 in the first two cases and +/-0.04 g cm^-3 in the third). None of the density distributions resemble classical organized convection cells. The information theory approach may have use in choosing Standard Earth Models, but the inclusion of seismic data into the approach appears difficult.
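The "maximize entropy subject to observed constraints" machinery can be illustrated on a toy discrete problem: find the maximum-entropy distribution over six states that reproduces a given mean, solving for the single Lagrange multiplier by root finding. The paper's gravity-coefficient constraints play the role of this toy mean constraint; all numbers here are invented.

```python
import numpy as np
from scipy.optimize import brentq

# Maximum-entropy distribution over states 1..6 constrained to a target mean.
states = np.arange(1, 7)
target_mean = 4.5

def mean_given_lambda(lam):
    """Mean of the Boltzmann-form distribution p(s) ~ exp(-lam * s)."""
    w = np.exp(-lam * states)
    return (states * w).sum() / w.sum()

# The mean is monotone in the multiplier, so a bracketing root-finder works.
lam = brentq(lambda l: mean_given_lambda(l) - target_mean, -5, 5)
p = np.exp(-lam * states); p /= p.sum()
print(p, (states * p).sum())     # maxent probabilities reproducing the mean
```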
Decentralised consensus-based formation tracking of multiple differential drive robots
NASA Astrophysics Data System (ADS)
Chu, Xing; Peng, Zhaoxia; Wen, Guoguang; Rahmani, Ahmed
2017-11-01
This article investigates the control problem of formation tracking for multiple nonholonomic robots in a distributed manner, which means each robot needs only local information exchange. A class of general state and input transforms is introduced to convert the formation-tracking problem of multi-robot systems into a consensus-like problem with a time-varying reference. A distributed observer-based protocol with nonlinear dynamics is developed for each robot to achieve consensus tracking of the new system, which means a group of nonholonomic mobile robots can form the desired formation configuration with its centroid moving along the predefined reference trajectory. The finite-time stability of the observer and control law is analysed rigorously by using the Lyapunov direct method, algebraic graph theory and matrix analysis. Numerical examples are finally provided to illustrate the effectiveness of the theoretical results proposed in this paper.
2005-06-01
Remarkably, the Iraqis have used religion and language to define the rhythms of war, including new methods of deception. The Coalition Forces, led by the... to explain socio-cultural factors based on organizational theories. Organizational culture is a concept often used to describe shared corporate... is governed by battle space informational footprints. In this regard, Vygotsky (1978) notes that human cognition and decision-making develops in
The Evaluation of ERP Sandtable Simulation Based on AHP
NASA Astrophysics Data System (ADS)
Xu, Lan
Due to the trend of globalization, many enterprises have extended their business to operate globally. Enterprise resource planning (ERP) is a powerful management system providing the best business resources information. This paper introduces the theory of AHP and presents an ERP sandtable simulation evaluation to discuss how to make decisions using AHP. Using this method, enterprises can adequately consider the factors that influence enterprise operation, including feedback and dependence among the factors.
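A brief sketch of the standard AHP weight computation, assuming a hypothetical 3x3 pairwise-comparison matrix on Saaty's 1-9 scale: priorities come from the principal eigenvector, and the consistency ratio flags incoherent judgments.

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for three evaluation criteria;
# A[i, j] = importance of criterion i relative to criterion j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Priority weights = principal eigenvector of A, normalized to sum to 1.
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()

# Consistency check: CR = CI / RI, with CI = (lambda_max - n) / (n - 1).
n = A.shape[0]
CI = (vals[k].real - n) / (n - 1)
RI = 0.58                        # Saaty's random index for n = 3
CR = CI / RI
print(w, CR)                     # CR < 0.1 is conventionally acceptable
```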
Placing User-Generated Content on the Map with Confidence
2014-11-03
Keywords: geographic information retrieval, geolocation. We describe a method that places on the map short text... we collected using twitter4j, a Java library for the Twitter API. After filtering, there were 44,289 documents in the Twitter test set. We evaluate how...
2010-01-01
Background It is recognised as good practice to use qualitative methods to elicit users' views of internet-delivered health-care interventions during their development. This paper seeks to illustrate the advantages of combining usability testing with 'theoretical modelling', i.e. analyses that relate the findings of qualitative studies during intervention development to social science theory, in order to gain deeper insights into the reasons and context for how people respond to the intervention. This paper illustrates how usability testing may be enriched by theoretical modelling by means of two qualitative studies of users' views of the delivery of information in an internet-delivered intervention to help users decide whether they needed to seek medical care for their cold or flu symptoms. Methods In Study 1, 21 participants recruited from a city in southern England were asked to 'think aloud' while viewing draft web-pages presented in paper format. In Study 2, views of our prototype website were elicited, again using think aloud methods, in a sample of 26 participants purposively sampled for diversity in education levels. Both data-sets were analysed by thematic analysis. Results Study 1 revealed that although the information provided by the draft web-pages had many of the intended empowering benefits, users often felt overwhelmed by the quantity of information. Relating these findings to theory and research on factors influencing preferences for information-seeking we hypothesised that to meet the needs of different users (especially those with lower literacy levels) our website should be designed to provide only essential personalised advice, but with options to access further information. Study 2 showed that our website design did prove accessible to users with different literacy levels. However, some users seemed to want still greater control over how information was accessed. Conclusions Educational level need not be an insuperable barrier to appreciating web-based access to detailed health-related information, provided that users feel they can quickly gain access to the specific information they seek. PMID:20849599
Strickling, Kate; Payne, Hannah E; Jensen, Kayla C; West, Joshua H
2016-01-01
Background There is increasing interest in Pinterest as a method of disseminating health information. However, it is unclear whether the health information promoted on Pinterest is evidence-based or incorporates behavior change theory. Objectives The objective of the study was to determine the presence of health behavior theory (HBT) constructs in pins found on Pinterest and assess the relationship between various pin characteristics and the likelihood of inclusion of HBT. Methods A content analysis was conducted on pins collected from Pinterest identified with the search terms “nutrition infographic” and “healthy eating infographic.” The coding rubric included HBT constructs, pin characteristics, and visual communication tools. Each HBT construct was coded as present or not present (yes=1, no=0). A total theory score was calculated by summing the values for each of the 9 constructs (range 0-9). Adjusted regression analysis was used to identify factors associated with the inclusion of health behavior change theory in pins (P<.05). Results The mean total theory score was 2.03 (SD 1.2). Perceived benefits were present most often (170/236, 72%), followed by behavioral capability (123/238, 51.7%) and perceived severity (79/236, 33.5%). The construct that appeared the least was self-regulation/self-control (2/237, 0.8%). Pin characteristics associated with the inclusion of HBT included a large amount of text (P=.01), photographs of real people (P=.001), cartoon pictures of food (P=.01), and the presence of references (P=.001). The number of repins (P=.04), likes (P=.01), and comments (P=.01) were positively associated with the inclusion of HBT. Conclusions These findings suggest that current Pinterest infographics targeting healthy eating contain few HBT elements. Health professionals and organizations should create and disseminate infographics that contain more elements of HBT to better influence healthy eating behavior. This may be accomplished by creating pins that use both text and images of people and food in order to portray elements of HBT and convey nutritional information. PMID:27932316
Input-Based Approaches to Teaching Grammar: A Review of Classroom-Oriented Research.
ERIC Educational Resources Information Center
Ellis, Rod
1999-01-01
Examines the theoretical rationales (universal grammar, information-processing theories, skill-learning theories) for input-based grammar teaching and reviews classroom-oriented research (i.e., enriched-input studies, input-processing studies) that has integrated this option. (Author/VWL)
Planning the diffusion of a neck-injury prevention programme among community rugby union coaches.
Donaldson, Alex; Poulos, Roslyn G
2014-01-01
This paper describes the development of a theory-informed and evidence-informed, context-specific diffusion plan for the Mayday Safety Procedure (MSP) among community rugby coaches in regional New South Wales, Australia. Step 5 of Intervention Mapping was used to plan strategies to enhance MSP adoption and implementation. Coaches were identified as the primary MSP adopters and implementers within a system including administrators, players and referees. A local advisory group was established to ensure context relevance. Performance objectives (eg, attend MSP training for coaches) and determinants of adoption and implementation behaviour (eg, knowledge, beliefs, skills and environment) were identified, informed by Social Cognitive Theory. Adoption and implementation matrices were developed and change-objectives for coaches were identified (eg, skills to deliver MSP training to players). Finally, intervention methods and specific strategies (eg, coach education, social marketing and policy and by-law development) were identified based on advisory group member experience, evidence of effective coach safety behaviour-change interventions and Diffusion of Innovations theory. This is the first published example of a systematic approach to plan injury prevention programme diffusion in community sports. The key strengths of this approach were an effective researcher-practitioner partnership; actively engaging local sports administrators; targeting specific behaviour determinants, informed by theory and evidence; and taking context-related practical strengths and constraints into consideration. The major challenges were the time involved in using a systematic diffusion planning approach for the first time; and finding a planning language that was acceptable and meaningful to researchers and practitioners.
Bong Seok Park; Jin Bae Park; Yoon Ho Choi
2011-08-01
We present a leader-follower-based adaptive formation control method for electrically driven nonholonomic mobile robots with limited information. First, an adaptive observer is developed under the condition that velocity measurements are not available. With the proposed adaptive observer, the formation control part is designed to achieve the desired formation and guarantee collision avoidance. In addition, a neural network is employed to compensate for actuator saturation, and the projection algorithm is used to estimate the velocity information of the leader. It is shown, by using Lyapunov theory, that all errors of the closed-loop system are uniformly ultimately bounded. Simulation results are presented to illustrate the performance of the proposed control system.
NASA Astrophysics Data System (ADS)
Zhang, Yufeng; Long, Man; Luo, Sida; Bao, Yu; Shen, Hanxia
2015-12-01
The transit route choice model is a key technology in public transit systems planning and management. Traditional route choice models are mostly based on expected utility theory, which has an evident shortcoming: it cannot accurately portray travelers' subjective route choice behavior, because their risk preferences are not taken into consideration. Cumulative prospect theory (CPT) can be used to describe travelers' decision-making process under uncertainty of transit supply and the risk preferences of multiple types of travelers. The method used to calibrate the reference point, a key parameter of a CPT-based transit route choice model, determines the precision of the model to a great extent. In this paper, a new method is put forward to obtain the value of the reference point, combining theoretical calculation and field investigation results. Comparing the proposed method with the traditional method shows that the new method can improve the quality of the CPT-based model by improving the accuracy in simulating travelers' route choice behavior, based on a transit trip investigation from Nanjing City, China. The proposed method is of great significance to rational transit planning and management, and to some extent makes up for the defect that obtaining the reference point has been based solely on qualitative analysis.
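For orientation, the CPT ingredients are a reference-dependent value function and an inverse-S probability weighting function. The sketch below uses Tversky and Kahneman's 1992 parameter estimates and an assumed 30-minute reference point, and scores a route's travel-time outcomes without the full rank-dependent cumulation, so it illustrates the ingredients rather than the paper's calibrated model.

```python
# Tversky-Kahneman CPT pieces with their 1992 parameter estimates
# (illustrative; the paper calibrates its own reference point and parameters).
ALPHA, BETA, LAMBDA, GAMMA = 0.88, 0.88, 2.25, 0.61

def value(x):
    """Value of an outcome x measured relative to the reference point."""
    return x**ALPHA if x >= 0 else -LAMBDA * (-x)**BETA

def weight(p):
    """Inverse-S probability weighting function."""
    return p**GAMMA / (p**GAMMA + (1 - p)**GAMMA)**(1 / GAMMA)

def cpt_score(times, probs, reference):
    """Simplified CPT score of a route: shorter-than-reference travel times
    count as gains, longer ones as losses (no rank-dependent cumulation,
    to keep the sketch short)."""
    return sum(weight(p) * value(reference - t) for t, p in zip(times, probs))

# Route with uncertain travel times (minutes) vs. a 30-minute reference point.
print(cpt_score([25, 35], [0.6, 0.4], reference=30))
```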
Communication in diagnostic radiology: meeting the challenges of complexity.
Larson, David B; Froehle, Craig M; Johnson, Neil D; Towbin, Alexander J
2014-11-01
As patients and information flow through the imaging process, value is added step-by-step when information is acquired, interpreted, and communicated back to the referring clinician. However, radiology information systems are often plagued with communication errors and delays. This article presents theories and recommends strategies to continuously improve communication in the complex environment of modern radiology. Communication theories, methods, and systems that have proven their effectiveness in other environments can serve as models for radiology.
ISS method for coordination control of nonlinear dynamical agents under directed topology.
Wang, Xiangke; Qin, Jiahu; Yu, Changbin
2014-10-01
The problems of coordination of multiagent systems with second-order locally Lipschitz continuous nonlinear dynamics under directed interaction topology are investigated in this paper. A completely nonlinear input-to-state stability (ISS)-based framework, drawing on ISS methods, with the aid of results from graph theory, matrix theory, and the ISS cyclic-small-gain theorem, is proposed for the coordination problem under directed topology, which can effectively tackle the technical challenges caused by locally Lipschitz continuous dynamics. Two coordination problems, i.e., flocking with a virtual leader and containment control, are considered. For both problems, it is assumed that only a portion of the agents can obtain the information from the leader(s). For the first problem, the proposed strategy is shown to be effective in driving a group of nonlinear dynamical agents to reach the prespecified geometric pattern under the condition that at least one agent in each strongly connected component of the information-interconnection digraph with zero in-degree has access to the state information of the virtual leader; and the strategy proposed for the second problem can guarantee that the nonlinear dynamical agents move to the convex hull spanned by the positions of multiple leaders under the condition that for each agent there exists at least one leader that has a directed path to this agent.
Information distribution in distributed microprocessor based flight control systems
NASA Technical Reports Server (NTRS)
Montgomery, R. C.; Lee, P. S.
1977-01-01
This paper presents an optimal control theory that accounts for variable time intervals in the information distribution to control effectors in a distributed microprocessor based flight control system. The theory is developed using a linear process model for the aircraft dynamics and the information distribution process is modeled as a variable time increment process where, at the time that information is supplied to the control effectors, the control effectors know the time of the next information update only in a stochastic sense. An optimal control problem is formulated and solved that provides the control law that minimizes the expected value of a quadratic cost function. An example is presented where the theory is applied to the control of the longitudinal motions of the F8-DFBW aircraft. Theoretical and simulation results indicate that, for the example problem, the optimal cost obtained using a variable time increment Markov information update process where the control effectors know only the past information update intervals and the Markov transition mechanism is almost identical to that obtained using a known uniform information update interval.
ERIC Educational Resources Information Center
Allison, Dennis J.
A theory of memory is introduced, which seeks to respond to the shortcomings of existing theories based on metaphors. Memory is presented as a mechanism, a comparison process in which information held in some form of immediate storage (whether based on perception or previous cognition or both) is compared to previously stored long-term storage.…
Information theoretic analysis of edge detection in visual communication
NASA Astrophysics Data System (ADS)
Jiang, Bo; Rahman, Zia-ur
2010-08-01
Generally, the design of digital image processing algorithms and the design of image gathering devices remain separate. Consequently, the performance of digital image processing algorithms is evaluated without taking into account the artifacts introduced by the image gathering process. However, experiments show that the image gathering process profoundly impacts the performance of digital image processing and the quality of the resulting images. Huck et al. proposed a definitive theoretical analysis of visual communication channels, in which the different parts, such as image gathering, processing, and display, are assessed in an integrated manner using Shannon's information theory. In this paper, we perform an end-to-end information-theory-based system analysis to assess edge detection methods. We evaluate the performance of the different algorithms as a function of the characteristics of the scene and of the parameters, such as sampling and additive noise, that define the image gathering system. An edge detection algorithm is regarded as having high performance only if the information rate from the scene to the edge approaches the maximum possible. This goal can be achieved only by jointly optimizing all processes. People generally use subjective judgment to compare different edge detection methods; there has been no common tool for evaluating the performance of the different algorithms and guiding the selection of the best algorithm for a given system or scene. Our information-theoretic assessment becomes this new tool, which allows us to compare the different edge detection operators in a common environment.
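One way to operationalize "information rate from the scene to the edge" is the mutual information between a ground-truth edge map and the detector's output. A hedged sketch, assuming binary edge maps and an invented 10% error rate standing in for the degradations of a simulated image-gathering chain:

```python
import numpy as np

def mutual_information(a, b):
    """Mutual information (bits) between two binary maps via their joint histogram."""
    joint = np.histogram2d(a.ravel(), b.ravel(), bins=2)[0]
    pxy = joint / joint.sum()
    px, py = pxy.sum(1, keepdims=True), pxy.sum(0, keepdims=True)
    nz = pxy > 0
    return (pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum()

# Toy scene edges and a noisy "detected" edge map standing in for the output
# of an edge operator applied after a simulated image-gathering chain.
rng = np.random.default_rng(3)
truth = rng.integers(0, 2, size=(64, 64))
flip = rng.random(truth.shape) < 0.1          # 10% detection errors (assumed)
detected = np.where(flip, 1 - truth, truth)
print(mutual_information(truth, detected))    # higher = more informative detector
```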
ERIC Educational Resources Information Center
Heshi, Kamal Nosrati; Nasrabadi, Hassanali Bakhtiyar
2016-01-01
The present paper attempts to recognize principles and methods of education based on Wittgenstein's picture theory of language. This qualitative research utilized inferential analytical approach to review the related literature and extracted a set of principles and methods from his theory on picture language. Findings revealed that Wittgenstein…
Wei, Wei; Larrey-Lassalle, Pyrène; Faure, Thierry; Dumoulin, Nicolas; Roux, Philippe; Mathias, Jean-Denis
2016-03-01
A comparative decision-making process is widely used to identify which option (system, product, service, etc.) has the smaller environmental footprint and to provide recommendations that help stakeholders take future decisions. However, uncertainty complicates the comparison and the decision making. Probability-based decision support in LCA is a way to help stakeholders in their decision-making process. It calculates the decision confidence probability, which expresses the probability that one option has a smaller environmental impact than another. Here we apply reliability theory to approximate the decision confidence probability. We compare the traditional Monte Carlo method with a reliability method called the FORM method. The Monte Carlo method needs high computational time to calculate the decision confidence probability. The FORM method enables us to approximate the decision confidence probability with fewer simulations than the Monte Carlo method by approximating the response surface. Moreover, the FORM method calculates the associated importance factors, which correspond to a sensitivity analysis in relation to the probability. The importance factors allow stakeholders to determine which factors influence their decision. Our results clearly show that the reliability method provides additional useful information to stakeholders as well as reducing the computational time.
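A minimal sketch of FORM next to crude Monte Carlo, assuming a hypothetical linear limit-state function g(u) in standard normal space whose negative values mean one option has the larger impact: FORM locates the design point (the closest point to the origin on g(u) = 0) and estimates the probability as Φ(-β), which for a linear g matches Monte Carlo at a fraction of the evaluations.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Hypothetical limit-state function in standard normal space: g(u) < 0 means
# "option A has the larger impact" (an invented linear difference of scores).
def g(u):
    return 3.0 - u[0] - 0.5 * u[1]

# FORM: find the design point, the point on g(u) = 0 closest to the origin.
res = minimize(lambda u: u @ u, x0=np.array([1.0, 1.0]),
               constraints={"type": "eq", "fun": g})
beta = np.sqrt(res.fun)              # reliability index
p_form = norm.cdf(-beta)             # FORM estimate of P(g < 0)

# Crude Monte Carlo for comparison (needs many more g-evaluations).
u = np.random.default_rng(4).standard_normal((200_000, 2))
p_mc = np.mean(3.0 - u[:, 0] - 0.5 * u[:, 1] < 0)
print(p_form, p_mc)
```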
Theory-based interventions for contraception.
Lopez, Laureen M; Grey, Thomas W; Chen, Mario; Tolley, Elizabeth E; Stockton, Laurie L
2016-11-23
The explicit use of theory in research helps expand the knowledge base. Theories and models have been used extensively in HIV-prevention research and in interventions for preventing sexually transmitted infections (STIs). The health behavior field uses many theories or models of change. However, many educational interventions addressing contraception have no explicit theoretical base. To review randomized controlled trials (RCTs) that tested a theoretical approach to inform contraceptive choice and encourage or improve contraceptive use. To 1 November 2016, we searched for trials that tested a theory-based intervention for improving contraceptive use in PubMed, CENTRAL, POPLINE, Web of Science, ClinicalTrials.gov, and ICTRP. For the initial review, we wrote to investigators to find other trials. Included trials tested a theory-based intervention for improving contraceptive use. Interventions addressed the use of one or more methods for contraception. The reports provided evidence that the intervention was based on a specific theory or model. The primary outcomes were pregnancy and contraceptive choice or use. We assessed titles and abstracts identified during the searches. One author extracted and entered the data into Review Manager; a second author verified accuracy. We examined studies for methodological quality. For unadjusted dichotomous outcomes, we calculated the Mantel-Haenszel odds ratio (OR) with 95% confidence interval (CI). Cluster randomized trials used various methods of accounting for the clustering, such as multilevel modeling. Most reports did not provide information to calculate the effective sample size. Therefore, we presented the results as reported by the investigators. We did not conduct meta-analysis due to varied interventions and outcome measures. We included 10 new trials for a total of 25. Five were conducted outside the USA. Fifteen randomly assigned individuals and 10 randomized clusters. This section focuses on nine trials with high or moderate quality evidence and an intervention effect. Five based on social cognitive theory addressed preventing adolescent pregnancy and were one to two years long. The comparison was usual care or education. Adolescent mothers with a home-based curriculum had fewer second births in two years (OR 0.41, 95% CI 0.17 to 1.00). Twelve months after a school-based curriculum, the intervention group was more likely to report using an effective contraceptive method (adjusted OR 1.76 ± standard error (SE) 0.29) and using condoms during last intercourse (adjusted OR 1.68 ± SE 0.25). In alternative schools, after five months the intervention group reported more condom use during last intercourse (reported adjusted OR 2.12, 95% CI 1.24 to 3.56). After a school-based risk-reduction program, at three months the intervention group was less likely to report no condom use at last intercourse (adjusted OR 0.67, 95% CI 0.47 to 0.96). The risk avoidance group (abstinence-focused) was less likely to do so at 15 months (OR 0.61, 95% CI 0.45 to 0.85). At 24 months after a case management and peer-leadership program, the intervention group reported more consistent use of hormonal contraceptives (adjusted relative risk (RR) 1.30, 95% CI 1.06 to 1.58), condoms (RR 1.57, 95% CI 1.28 to 1.94), and dual methods (RR 1.36, 95% CI 1.01 to 1.85). Four of the nine trials used motivational interviewing (MI). In three studies, the comparison group received handouts.
The MI group more often reported effective contraception use at nine months (OR 2.04, 95% CI 1.47 to 2.83). In two studies, the MI group was less likely to report using ineffective contraception at three months (OR 0.31, 95% CI 0.12 to 0.77) and four months (OR 0.56, 95% CI 0.31 to 0.98), respectively. In the fourth trial, the MI group was more likely than a group with non-standard counseling to initiate long-acting reversible contraception (LARC) by one month (OR 3.99, 95% CI 1.36 to 11.68) and to report using LARC at three months (OR 3.38, 95% CI 1.06 to 10.71). The overall quality of evidence was moderate. Trials based on social cognitive theory focused on adolescents and provided multiple sessions. Those using motivational interviewing had a wider age range but specific populations. Sites with low resources need effective interventions adapted for their settings and their typical clients. Reports could be clearer about how the theory was used to design and implement the intervention.
[Forensic evidence-based medicine in computer communication networks].
Qiu, Yun-Liang; Peng, Ming-Qi
2013-12-01
As an important component of judicial expertise, forensic science is broad and highly specialized. With the development of network technology, the increase of information resources, and the improvement of people's legal consciousness, forensic scientists encounter many new problems and have been required to meet higher evidentiary standards in litigation. In view of this, an evidence-based concept should be established in forensic medicine. We should find the most suitable methods in the forensic science field and other related areas to solve specific problems in the evidence-based mode. Evidence-based practice can solve the problems in the legal medical field, and it will play a great role in promoting the progress and development of forensic science. This article reviews the basic theory of evidence-based medicine and its effect, way, method, and evaluation in forensic medicine in order to discuss the application value of forensic evidence-based medicine in computer communication networks.
Lord, Sarah Peregrine; Can, Doğan; Yi, Michael; Marin, Rebeca; Dunn, Christopher W.; Imel, Zac E.; Georgiou, Panayiotis; Narayanan, Shrikanth; Steyvers, Mark; Atkins, David C.
2014-01-01
The current paper presents novel methods for collecting MISC data and accurately assessing reliability of behavior codes at the level of the utterance. The MISC 2.1 was used to rate MI interviews from five randomized trials targeting alcohol and drug use. Sessions were coded at the utterance-level. Utterance-based coding reliability was estimated using three methods and compared to traditional reliability estimates of session tallies. Session-level reliability was generally higher compared to reliability using utterance-based codes, suggesting that typical methods for MISC reliability may be biased. These novel methods in MI fidelity data collection and reliability assessment provided rich data for therapist feedback and further analyses. Beyond implications for fidelity coding, utterance-level coding schemes may elucidate important elements in the counselor-client interaction that could inform theories of change and the practice of MI. PMID:25242192
How Cultural Evolutionary Theory Can Inform Social Psychology and Vice Versa
ERIC Educational Resources Information Center
Mesoudi, Alex
2009-01-01
Cultural evolutionary theory is an interdisciplinary field in which human culture is viewed as a Darwinian process of variation, competition, and inheritance, and the tools, methods, and theories developed by evolutionary biologists to study genetic evolution are adapted to study cultural change. It is argued here that an integration of the…
Lifting as We Climb: Developing Constellations of Learning within an Informal Online Radio Format
ERIC Educational Resources Information Center
Mistry, Margaret Egan
2012-01-01
This mixed-methods study combines Vygotsky's sociocultural research on thought and language, Mezirow's Transformational Learning Theory, and the situated learning theory of Rogoff, Lave, and Wenger to explore individual and group processes and the resulting products within an online university radio station system. The study…
Single underwater image enhancement based on color cast removal and visibility restoration
NASA Astrophysics Data System (ADS)
Li, Chongyi; Guo, Jichang; Wang, Bo; Cong, Runmin; Zhang, Yan; Wang, Jian
2016-05-01
Images taken under underwater conditions usually suffer from color cast and serious loss of contrast and visibility, which makes degraded underwater images inconvenient for observation and analysis. To address these problems, an underwater image-enhancement method is proposed. A simple yet effective underwater color cast removal algorithm is first presented, based on optimization theory. Then, based on the minimum-information-loss principle and the inherent relationship among the medium transmission maps of the three color channels in an underwater image, an effective visibility restoration algorithm is proposed to recover the visibility, contrast, and natural appearance of degraded underwater images. To evaluate the performance of the proposed method, qualitative comparisons, quantitative comparisons, and a color accuracy test are conducted. Experimental results demonstrate that the proposed method can effectively remove color cast, improve contrast and visibility, and recover the natural appearance of degraded underwater images, and that it is comparable to, and even better than, several state-of-the-art methods.
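The abstract does not spell out the optimization used for color cast removal, but the general idea of rebalancing color channels can be illustrated with a simple gray-world white balance. The sketch below is a generic stand-in, not the authors' algorithm; the function name and the synthetic test image are ours.

```python
import numpy as np

def gray_world_balance(img):
    """Generic gray-world color cast removal (not the authors' optimization
    method): scale each channel so its mean matches the global mean."""
    img = img.astype(np.float64)
    channel_means = img.reshape(-1, 3).mean(axis=0)   # mean of R, G, B
    gains = channel_means.mean() / channel_means      # per-channel gain
    balanced = img * gains                            # apply the gains
    return np.clip(balanced, 0, 255).astype(np.uint8)

# Example: a synthetic bluish "underwater" image
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64, 3)).astype(np.uint8)
img[..., 2] = np.clip(img[..., 2].astype(int) + 60, 0, 255)  # add blue cast
out = gray_world_balance(img)
print(out.reshape(-1, 3).mean(axis=0))  # channel means roughly equalized
```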
Cadastral data model established and perfected with 4S technology
NASA Astrophysics Data System (ADS)
He, Beijing; He, Jiang; He, Jianpeng
1998-08-01
Considering China's basic social system and the actual formation of cadastral information in urban and rural areas, and based on 4S technology and the theory and method of the canton-level GPS geodetic data bench developed by the authors, we investigate in depth several related technical problems in establishing and perfecting a multi-level microcosmic cadastral data model (hereafter, the model). The problems addressed include: cadastral, feature, and topographic information and the forms and methods for expressing it; classification and grading of the model; selection of the coordinate system; the data basis for the model; methods for collecting and digitizing information; the database's structural model; the mathematical model and the technology for establishing models of three or more dimensions; and dynamic monitoring, development, and application of the model. The domestic and overseas application prospects are then outlined, including the potential to enter the market in combination with 'data bench' technology or with the all-analysis digital surveying and mapping technology for RS image maps.
Implications of Information Theory for Computational Modeling of Schizophrenia.
Silverstein, Steven M; Wibral, Michael; Phillips, William A
2017-10-01
Information theory provides a formal framework within which information processing and its disorders can be described. However, information theory has rarely been applied to modeling aspects of the cognitive neuroscience of schizophrenia. The goal of this article is to highlight the benefits of an approach based on information theory, including its recent extensions, for understanding several disrupted neural goal functions as well as related cognitive and symptomatic phenomena in schizophrenia. We begin by demonstrating that foundational concepts from information theory (such as Shannon information, entropy, data compression, block coding, and strategies to increase the signal-to-noise ratio) can be used to provide novel understandings of cognitive impairments in schizophrenia and metrics to evaluate their integrity. We then describe more recent developments in information theory, including the concepts of infomax, coherent infomax, and coding with synergy, to demonstrate how these can be used to develop computational models of schizophrenia-related failures in the tuning of sensory neurons, gain control, perceptual organization, thought organization, selective attention, context processing, predictive coding, and cognitive control. Throughout, we demonstrate how disordered mechanisms may explain both perceptual/cognitive changes and symptom emergence in schizophrenia. Finally, we demonstrate that there is consistency between some information-theoretic concepts and recent discoveries in neurobiology, especially involving the existence of distinct sites for the accumulation of driving input and contextual information prior to their interaction. This convergence can be used to guide future theory, experiment, and treatment development.
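As a concrete anchor for the foundational concepts this abstract invokes, here is a minimal Shannon entropy sketch (our own example, not taken from the article): a sharply tuned response distribution carries low entropy, a noisy one carries high entropy.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i log2 p_i, in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # 0 log 0 = 0 by convention
    return -np.sum(p * np.log2(p))

# A sharply tuned (low-entropy) vs. a noisy (high-entropy) response
print(shannon_entropy([0.9, 0.05, 0.05]))   # ~0.57 bits
print(shannon_entropy([1/3, 1/3, 1/3]))     # ~1.58 bits
```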
Sawin, Kathleen J; Weiss, Marianne E; Johnson, Norah; Gralton, Karen; Malin, Shelly; Klingbeil, Carol; Lerret, Stacee M; Thompson, Jamie J; Zimmanck, Kim; Kaul, Molly; Schiffman, Rachel F
2017-03-01
Parents of hospitalized children, especially parents of children with complex and chronic health conditions, report not being adequately prepared for self-management of their child's care at home after discharge. No theory-based discharge intervention exists to guide pediatric nurses' preparation of parents for discharge. To develop a theory-based conversation guide to optimize nurses' preparation of parents for discharge and self-management of their child at home following hospitalization. Two frameworks and one method influenced the development of the intervention: the Individual and Family Self-Management Theory, Tanner's Model of Clinical Judgment, and the Teach-Back method. A team of nurse scientists, nursing leaders, nurse administrators, and clinical nurses developed and field tested the electronic version of a nine-domain conversation guide for use in acute care pediatric hospitals. The theory-based intervention operationalized self-management concepts, added components of nursing clinical judgment, and integrated the Teach-Back method. Development of a theory-based intervention, the translation of theoretical knowledge to clinical innovation, is an important step toward testing the effectiveness of the theory in guiding clinical practice. Clinical nurses will establish the practice relevance through future use and refinement of the intervention. © 2017 Sigma Theta Tau International.
Daly, Louise; McCarron, Mary; Higgins, Agnes; McCallion, Philip
2013-02-01
This paper presents a theory explaining the processes used by informal carers of people with dementia to manage alterations to their own, and people with dementia's, relationships with and places within their social worlds. Informal carers provide the majority of care to people with dementia. A great deal of international informal dementia care research is available, much of which elucidates the content, impacts and consequences of the informal caring role and the coping mechanisms that carers use. However, the socially situated experiences and processes integral to informal caring in dementia have not yet been robustly accounted for. A classic grounded theory approach was used as it is designed for research enquiries that aim to generate theory illustrating social patterns of action used to address an identified problem. Thirty interviews were conducted with 31 participants between 2006 and 2008. The theory was conceptualised from the data using the concurrent methods of theoretical sampling, constant comparative analysis, memo writing and theoretical sensitivity. Informal carers' main concern was identified as 'Living on the fringes', which was stimulated by dementia-related stigma and living a different life. The theory of 'Sustaining Place' explains the social pattern of actions employed by informal carers to manage this problem on behalf of themselves and the person with dementia. The theory of 'Sustaining Place' identifies an imperative for nurses, other formal carers and society to engage in actions to support and enable social connectedness, social inclusion and citizenship for informal carers and people with dementia. 'Sustaining Place' facilitates enhanced understanding of the complex and socially situated nature of informal dementia care through its portrayal of informal carers as social agents and can be used to guide nurses to better support those who live with dementia. © 2012 Blackwell Publishing Ltd.
The care pathway: concepts and theories: an introduction.
Schrijvers, Guus; van Hoorn, Arjan; Huiskes, Nicolette
2012-01-01
This article first addresses the definition of a (care) pathway, followed by a description of theories since the 1950s. It ends with a discussion of the theoretical advantages and disadvantages of care pathways for patients and professionals. The objective of this paper is to provide a theoretical base for empirical studies on care pathways. The knowledge for this chapter is based on several books on pathways, which we found by searching the digital encyclopedia Wikipedia. Although this is not usual in scientific publications, this method was used because books are not indexed in databases such as PubMed. From 2005 onward, we performed literature searches in PubMed and other literature databases with the keywords integrated care pathway, clinical pathway, critical pathway, theory, research, and evaluation. One of the inspirational sources was the website of the European Pathway Association (EPA) and its journal, International Journal of Care Pathways. The authors visited several sites for this paper; these are mentioned as illustrations of a concept or theory. Most of them have English websites with more information. The URLs of these websites are not given as references, because their content changes fast, sometimes every day.
Carrell, David; Malin, Bradley; Aberdeen, John; Bayer, Samuel; Clark, Cheryl; Wellner, Ben; Hirschman, Lynette
2013-01-01
Secondary use of clinical text is impeded by a lack of highly effective, low-cost de-identification methods. Both manual and automated methods for removing protected health information are known to leave behind residual identifiers. The authors propose a novel approach for addressing the residual identifier problem based on the theory of Hiding In Plain Sight (HIPS). HIPS relies on obfuscation to conceal residual identifiers. According to this theory, replacing the detected identifiers with realistic but synthetic surrogates should collectively render the few 'leaked' identifiers difficult to distinguish from the synthetic surrogates. The authors conducted a pilot study to test this theory on clinical narrative de-identified by an automated system. Test corpora included 31 oncology and 50 family practice progress notes read by two trained chart abstractors and an informaticist. Experimental results suggest approximately 90% of residual identifiers can be effectively concealed by the HIPS approach in text containing average and high densities of personal identifying information. This pilot test suggests HIPS is feasible but requires further evaluation. The results need to be replicated on larger corpora of diverse origin under a range of detection scenarios. Error analyses also suggest areas where surrogate generation techniques can be refined to improve efficacy. If these results generalize to existing high-performing de-identification systems with recall rates of 94-98%, HIPS could increase the effective de-identification rates of these systems to levels above 99% without further advancements in system recall. Additional and more rigorous assessment of the HIPS approach is warranted.
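The core HIPS move, replacing detected identifiers with realistic surrogates so that leaked ones blend in, can be sketched in a few lines. This is a hypothetical toy, not the authors' system: identifiers are assumed to be pre-tagged as [NAME], and the surrogate list is invented.

```python
import random
import re

# Hypothetical illustration of the HIPS idea: detected identifiers (here,
# pre-tagged spans like [NAME]) are replaced with realistic surrogates so
# that any identifier the detector missed hides among the look-alikes.
SURROGATE_NAMES = ["Alice Moore", "James Carter", "Dana Whitfield"]  # invented

def hide_in_plain_sight(text, rng=random.Random(0)):
    return re.sub(r"\[NAME\]", lambda m: rng.choice(SURROGATE_NAMES), text)

note = "Pt [NAME] seen today; spouse [NAME] present. John Doe was missed."
print(hide_in_plain_sight(note))
# The residual identifier "John Doe" is now hard to tell from surrogates.
```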
Wu, Tailai; Deng, Zhaohua; Zhang, Donglan; Buchanan, Paula R; Zha, Dongqing; Wang, Ruoxi
2018-07-01
The aim of this study is to investigate how doctor-consumer interaction in social media influences consumers' health information seeking and usage intention. Based on professional-client interaction theory and expectation confirmation theory, we propose that doctor-consumer interaction can be divided into instrumental interaction and affective interaction. These two types of interaction influence consumers' health information seeking and usage intention through consumer satisfaction and trust towards doctors. To validate our proposed research model, we employed the survey method. The measurement instruments for all constructs were developed based on previous literature, and 352 valid responses were collected using these instruments. Our results reveal that consumers' intention to seek health information significantly predicts their intention to use health information from social media. Meanwhile, both consumer satisfaction and trust towards doctors influence consumers' health information seeking and usage intention significantly. With regard to the impact of the interaction between doctors and consumers, the results show that both types of doctor-consumer interaction significantly affect consumer satisfaction and trust towards doctors. The mediation analysis confirms the mediating role of consumer satisfaction and trust towards doctors. Compared with many intentional intervention programs, doctor-consumer interaction can be treated as an effective, low-cost intervention to promote consumers' health information seeking and usage. Meanwhile, both instrumental and affective interaction should be highlighted for the best interaction results. Finally, consumer satisfaction and trust towards doctors can be considered important working mechanisms for the effect of doctor-consumer interaction. Copyright © 2018 Elsevier B.V. All rights reserved.
Combining a dispersal model with network theory to assess habitat connectivity.
Lookingbill, Todd R; Gardner, Robert H; Ferrari, Joseph R; Keller, Cherry E
2010-03-01
Assessing the potential for threatened species to persist and spread within fragmented landscapes requires the identification of core areas that can sustain resident populations and dispersal corridors that can link these core areas with isolated patches of remnant habitat. We developed a set of GIS tools, simulation methods, and network analysis procedures to assess potential landscape connectivity for the Delmarva fox squirrel (DFS; Sciurus niger cinereus), an endangered species inhabiting forested areas on the Delmarva Peninsula, USA. Information on the DFS's life history and dispersal characteristics, together with data on the composition and configuration of land cover on the peninsula, were used as input data for an individual-based model to simulate dispersal patterns of millions of squirrels. Simulation results were then assessed using methods from graph theory, which quantifies habitat attributes associated with local and global connectivity. Several bottlenecks to dispersal were identified that were not apparent from simple distance-based metrics, highlighting specific locations for landscape conservation, restoration, and/or squirrel translocations. Our approach links simulation models, network analysis, and available field data in an efficient and general manner, making these methods useful and appropriate for assessing the movement dynamics of threatened species within landscapes being altered by human and natural disturbances.
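A minimal sketch of the graph-theoretic step described above, assuming patches become nodes and an edge joins any two patches within the species' maximum dispersal distance (coordinates and threshold are invented; the authors' GIS tools and individual-based simulations are not reproduced here):

```python
import itertools
import math
import networkx as nx

# Patches become nodes; an edge joins patches closer than the species'
# maximum dispersal distance. Coordinates are invented for illustration.
patches = {"A": (0, 0), "B": (3, 1), "C": (9, 2), "D": (10, 3)}
max_dispersal = 4.0

G = nx.Graph()
G.add_nodes_from(patches)
for (u, pu), (v, pv) in itertools.combinations(patches.items(), 2):
    if math.dist(pu, pv) <= max_dispersal:
        G.add_edge(u, v)

# Connected components reveal clusters of mutually reachable habitat;
# a split between components signals a dispersal bottleneck.
print(list(nx.connected_components(G)))   # [{'A', 'B'}, {'C', 'D'}]
```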
Hesitant Fuzzy Thermodynamic Method for Emergency Decision Making Based on Prospect Theory.
Ren, Peijia; Xu, Zeshui; Hao, Zhinan
2017-09-01
Due to the timeliness of emergency response and the large amount of unknown information in emergency situations, this paper proposes a method for emergency decision making that comprehensively reflects the emergency decision making process. By utilizing hesitant fuzzy elements to represent the fuzziness of the objects and the hesitant thought of the experts, this paper introduces the negative exponential function into prospect theory so as to portray the psychological behaviors of the experts, which transforms the hesitant fuzzy decision matrix into the hesitant fuzzy prospect decision matrix (HFPDM) according to the expectation-levels. Then, this paper applies the energy and the entropy from thermodynamics to take the quantity and the quality of the decision values into account, and defines thermodynamic decision making parameters based on the HFPDM. Accordingly, a whole procedure for emergency decision making is constructed. What is more, some experiments are designed to demonstrate the validity of the emergency decision making procedure. Last but not least, this paper presents a case study of emergency decision making for the fire and explosions at the Port Group in Tianjin Binhai New Area, which manifests the effectiveness and practicability of the proposed method.
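The abstract's key ingredient, a negative exponential value function inside prospect theory, can be sketched as follows. The exact functional form and the parameter values are our illustrative assumptions, not the paper's.

```python
import math

def prospect_value(x, ref=0.0, alpha=0.5, theta=0.5, lam=2.25):
    """Negative-exponential prospect value relative to a reference point
    (expectation-level). Parameter choices here are illustrative only."""
    d = x - ref                      # gain (d >= 0) or loss (d < 0)
    if d >= 0:
        return 1.0 - math.exp(-alpha * d)          # concave for gains
    return -lam * (1.0 - math.exp(theta * d))      # steeper, convex for losses

# Loss aversion: a loss of 1 hurts more than a gain of 1 pleases
print(prospect_value(1.0), prospect_value(-1.0))
```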
NASA Astrophysics Data System (ADS)
Johnson, David T.
Quantum mechanics is an extremely successful and accurate physical theory, yet since its inception, it has been afflicted with numerous conceptual difficulties. The primary subject of this thesis is the theory of entropic quantum dynamics (EQD), which seeks to avoid these conceptual problems by interpreting quantum theory from an informational perspective. We begin by reviewing Cox's work in describing probability theory as a means of rationally and consistently quantifying uncertainties. We then discuss how probabilities can be updated according to either Bayes' theorem or the extended method of maximum entropy (ME). After that discussion, we review the work of Caticha and Giffin that shows that Bayes' theorem is a special case of ME. This important result demonstrates that the ME method is the general method for updating probabilities. We then review some motivating difficulties in quantum mechanics before discussing Caticha's work in deriving quantum theory from the approach of entropic dynamics, which concludes our review. After entropic dynamics is introduced, we develop the concepts of symmetries and transformations from an informational perspective. The primary result is the formulation of a symmetry condition that any transformation must satisfy in order to qualify as a symmetry in EQD. We then proceed to apply this condition to the extended Galilean transformation. This transformation is of interest as it exhibits features of both special and general relativity. The transformation yields a gravitational potential that arises from an equivalence of information. We conclude the thesis with a discussion of the measurement problem in quantum mechanics. We discuss the difficulties that arise in the standard quantum mechanical approach to measurement before developing our theory of entropic measurement. In entropic dynamics, position is the only observable. We show how a theory built on this one observable can account for the multitude of measurements present in quantum theory. Furthermore, we show that the Born rule need not be postulated, but can be derived in EQD. Finally, we show how the wave function can be updated by the ME method as the phase is constructed purely in terms of probabilities.
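As a pointer to the Caticha-Giffin result mentioned above (that Bayes' theorem is a special case of ME updating), here is the statement in outline, in our own notation; details are in the cited works.

```latex
% Relative entropy of a candidate joint posterior p with respect to the
% joint prior q over data x and parameters \theta:
S[p,q] = -\int \mathrm{d}x\,\mathrm{d}\theta\;
         p(x,\theta)\,\log\frac{p(x,\theta)}{q(x,\theta)}
% Maximizing S subject to the data constraint p(x) = \delta(x - x')
% (the value x' was observed) selects the posterior
p(\theta) = q(\theta \mid x') = \frac{q(\theta)\,q(x' \mid \theta)}{q(x')}
% i.e., Bayes' theorem emerges as a special case of the ME method.
```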
NASA Astrophysics Data System (ADS)
Dong, Yumin; Xiao, Shufen; Ma, Hongyang; Chen, Libo
2016-12-01
Cloud computing and big data have become the developing engine of current information technology (IT) as a result of the rapid development of IT. However, security protection has become increasingly important for cloud computing and big data, and has become a problem that must be solved to develop cloud computing. The theft of identity authentication information remains a serious threat to the security of cloud computing. In this process, attackers intrude into cloud computing services through identity authentication information, thereby threatening the security of data from multiple perspectives. Therefore, this study proposes a model for cloud computing protection and management based on quantum authentication, introduces the principle of quantum authentication, and deduces the quantum authentication process. In theory, quantum authentication technology can be applied in cloud computing for security protection. This technology cannot be cloned; thus, it is more secure and reliable than classical methods.
Empowering Older Patients to Engage in Self Care: Designing an Interactive Robotic Device
Tiwari, Priyadarshi; Warren, Jim; Day, Karen
2011-01-01
Objectives: To develop and test an interactive robot-mounted computing device to support medication management as an example of a complex self-care task in older adults. Method: A Grounded Theory (GT), Participatory Design (PD) approach was used within three Action Research (AR) cycles to understand design requirements and test the design configuration addressing the unique task requirements. Results: By the end of the first cycle a conceptual framework had been developed. The second cycle informed architecture and interface design. By the end of the third cycle residents successfully interacted with the dialogue system and were generally satisfied with the robot. The results informed further refinement of the prototype. Conclusion: An interactive, touch-screen-based, robot-mounted information tool can be developed to support the healthcare needs of older people. Qualitative methods such as the hybrid GT-PD-AR approach may be particularly helpful for innovating and articulating design requirements in challenging situations. PMID:22195203
Höfener, Sebastian; Gomes, André Severo Pereira; Visscher, Lucas
2012-01-28
In this article, we present a consistent derivation of a density functional theory (DFT) based embedding method which encompasses wave-function theory-in-DFT (WFT-in-DFT) and the DFT-based subsystem formulation of response theory (DFT-in-DFT) by Neugebauer [J. Neugebauer, J. Chem. Phys. 131, 084104 (2009)] as special cases. This formulation, which is based on the time-averaged quasi-energy formalism, makes use of variational Lagrangian techniques to allow the use of non-variational (in particular: coupled cluster) wave-function-based methods. We show how, in the time-independent limit, we naturally obtain expressions for the ground-state DFT-in-DFT and WFT-in-DFT embedding via a local potential. We furthermore provide working equations for the special case in which coupled cluster theory is used to obtain the density and excitation energies of the active subsystem. A sample application is given to demonstrate the method. © 2012 American Institute of Physics
New Aspects of Probabilistic Forecast Verification Using Information Theory
NASA Astrophysics Data System (ADS)
Tödter, Julian; Ahrens, Bodo
2013-04-01
This work deals with information-theoretical methods in probabilistic forecast verification, particularly concerning ensemble forecasts. Recent findings concerning the "Ignorance Score" are briefly reviewed, and a consistent generalization to continuous forecasts is then motivated. For ensemble-generated forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are prominent verification measures for probabilistic forecasts. Particularly, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up a natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can also be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The useful properties of the conceptually appealing CRIGN are illustrated, together with an algorithm to evaluate its components reliability, resolution, and uncertainty for ensemble-generated forecasts. This algorithm can also be used to calculate the decomposition of the more traditional CRPS exactly. The applicability of the "new" measures is demonstrated in a small evaluation study of ensemble-based precipitation forecasts.
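A minimal sketch contrasting the Brier Score with the Ignorance Score for binary forecasts (our own toy numbers; the decompositions and the ranked/continuous variants discussed above are not implemented here):

```python
import numpy as np

def brier(p, o):
    """Brier score for event probabilities p and binary outcomes o."""
    return np.mean((np.asarray(p) - np.asarray(o)) ** 2)

def ignorance(p, o):
    """Ignorance score: mean negative log2 probability of what occurred."""
    p, o = np.asarray(p, float), np.asarray(o, int)
    return -np.mean(np.log2(np.where(o == 1, p, 1.0 - p)))

p = [0.9, 0.7, 0.2, 0.1]   # forecast probabilities of precipitation
o = [1, 1, 0, 1]           # observed outcomes (the last forecast is poor)
print(brier(p, o), ignorance(p, o))
# IGN punishes the confident miss (p=0.1, o=1) much harder than BS does.
```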
NASA Astrophysics Data System (ADS)
Gagatsos, Christos N.; Karanikas, Alexandros I.; Kordas, Georgios; Cerf, Nicolas J.
2016-02-01
In spite of their simple description in terms of rotations or symplectic transformations in phase space, quadratic Hamiltonians such as those modelling the most common Gaussian operations on bosonic modes remain poorly understood in terms of entropy production. For instance, determining the quantum entropy generated by a Bogoliubov transformation is notably a hard problem, with generally no known analytical solution, while it is vital to the characterisation of quantum communication via bosonic channels. Here we overcome this difficulty by adapting the replica method, a tool borrowed from statistical physics and quantum field theory. We exhibit a first application of this method to continuous-variable quantum information theory, where it enables accessing entropies in an optical parametric amplifier. As an illustration, we determine the entropy generated by amplifying a binary superposition of the vacuum and a Fock state, which yields a surprisingly simple, previously unknown analytical expression.
NASA Astrophysics Data System (ADS)
van Haver, Sven; Janssen, Olaf T. A.; Braat, Joseph J. M.; Janssen, Augustus J. E. M.; Urbach, H. Paul; Pereira, Silvania F.
2008-03-01
In this paper we introduce a new mask imaging algorithm that is based on the source point integration method (or Abbe method). The method presented here distinguishes itself from existing methods by exploiting the through-focus imaging feature of the Extended Nijboer-Zernike (ENZ) theory of diffraction. An introduction to ENZ theory and its application in general imaging is provided, after which we describe the mask imaging scheme that can be derived from it. The remainder of the paper is devoted to illustrating the advantages of the new method over existing (Hopkins-based) methods. To this end, several simulation results are included that illustrate advantages arising from: the accurate incorporation of isolated structures, the rigorous treatment of the object (mask topography), and the fully vectorial through-focus image formation of the ENZ-based algorithm.
Nolan, Jim
2014-01-01
This paper suggests a novel clustering method for analyzing the National Incident-Based Reporting System (NIBRS) data, which include the determination of correlation of different crime types, the development of a likelihood index for crimes to occur in a jurisdiction, and the clustering of jurisdictions based on crime type. The method was tested by using the 2005 assault data from 121 jurisdictions in Virginia as a test case. The analyses of these data show that some different crime types are correlated and some different crime parameters are correlated with different crime types. The analyses also show that certain jurisdictions within Virginia share certain crime patterns. This information assists with constructing a pattern for a specific crime type and can be used to determine whether a jurisdiction may be more likely to see this type of crime occur in their area. PMID:24778585
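The abstract does not give the clustering algorithm's details; as a generic illustration of clustering jurisdictions by crime-type profiles, here is a k-means sketch with invented per-capita rates (not the paper's method or data):

```python
import numpy as np
from sklearn.cluster import KMeans

# Illustrative only: cluster jurisdictions by per-capita rates of several
# crime types. The numbers below are invented.
#                 [assault, burglary, fraud]
rates = np.array([[5.1, 2.0, 0.5],
                  [4.8, 2.2, 0.4],
                  [1.0, 0.8, 3.1],
                  [1.2, 0.9, 2.8]])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(rates)
print(labels)   # jurisdictions sharing a crime pattern get the same label
```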
Pattern Activity Clustering and Evaluation (PACE)
NASA Astrophysics Data System (ADS)
Blasch, Erik; Banas, Christopher; Paul, Michael; Bussjager, Becky; Seetharaman, Guna
2012-06-01
With the vast amount of network information available on activities of people (e.g., motions, transportation routes, and site visits), there is a need to explore the salient properties of data that detect and discriminate the behavior of individuals. Recent machine learning approaches include methods of data mining, statistical analysis, clustering, and estimation that support activity-based intelligence. We seek to explore contemporary methods in activity analysis using machine learning techniques that discover and characterize behaviors that enable grouping, anomaly detection, and adversarial intent prediction. To evaluate these methods, we describe the mathematics and potential information theory metrics to characterize behavior. A scenario is presented to demonstrate the concept and metrics that could be useful for layered sensing behavior pattern learning and analysis. We leverage work on group tracking, learning and clustering approaches, as well as utilize information-theoretical metrics for classification, behavioral and event pattern recognition, and activity and entity analysis. The performance evaluation of activity analysis supports high-level information fusion of user alerts, data queries and sensor management for data extraction, relations discovery, and situation analysis of existing data.
Implementation of Performance-Based Acquisition in Non-Western Countries
2009-03-01
narratives, phenomenologies, ethnographies, grounded theory studies, or case studies. The researcher collects... are biography, phenomenological study, grounded theory study, ethnography, and case study. The approach used for qualitative data collection method... qualitative methods, such as the grounded theory approach to
Reed, Frances M; Fitzgerald, Les; Rae, Melanie
2016-01-01
To highlight philosophical and theoretical considerations for planning a mixed methods research design that can inform a practice model to guide rural district nursing end of life care. Conceptual models of nursing in the community are general and lack guidance for rural district nursing care. A combination of pragmatism and nurse agency theory can provide a framework for ethical considerations in mixed methods research in the private world of rural district end of life care. Reflection on experience gathered in a two-stage qualitative research phase, involving rural district nurses who use advocacy successfully, can inform a quantitative phase for testing and complementing the data. Ongoing data analysis and integration result in generalisable inferences to achieve the research objective. Mixed methods research that creatively combines philosophical and theoretical elements to guide design in the particular ethical situation of community end of life care can be used to explore an emerging field of interest and test the findings for evidence to guide quality nursing practice. Combining philosophy and nursing theory to guide mixed methods research design increases the opportunity for sound research outcomes that can inform a nursing model of care.
Fair, Kayla; Hong, Y Alicia; Beaudoin, Christopher E; Pulczinski, Jairus; Ory, Marcia G
2015-01-01
Background Thousands of mobile health apps are now available for use on mobile phones for a variety of uses and conditions, including cancer survivorship. Many of these apps appear to deliver health behavior interventions but may fail to incorporate design considerations grounded in human-computer interface and health behavior change theories. Objective This study is designed to assess the presence of, and manner in which, health behavior change and health communication theories are applied in mobile phone cancer survivorship apps. Methods The research team selected a set of criteria-based health apps for mobile phones and assessed each app using qualitative coding methods to assess the application of health behavior change and communication theories. Each app was assessed using a coding scheme derived from the taxonomy of 26 health behavior change techniques by Abraham and Michie, with a few important changes based on the characteristics of mHealth apps specific to information processing and human-computer interaction, such as control theory and feedback systems. Results A total of 68 mobile phone apps and games built on the iOS and Android platforms were coded, with 65 being unique. Using Cohen's kappa, the inter-rater reliability for the iOS apps was 86.1 (P<.001) and for the Android apps, 77.4 (P<.001). For the most part, the scores for inclusion of theory-based health behavior change characteristics in the iOS platform cancer survivorship apps were consistently higher than those of the Android platform apps. For personalization and tailoring, 67% of the iOS apps (24/36) had these elements as compared to 38% of the Android apps (12/32). In the area of prompting for intention formation, 67% of the iOS apps (34/36) indicated these elements as compared to 16% (5/32) of the Android apps. Conclusions Mobile apps are rapidly emerging as a way to deliver health behavior change interventions that can be tailored or personalized for individuals. As these apps and games continue to evolve and include interactive and adaptive sensors and other forms of dynamic feedback, their content and interventional elements need to be grounded in human-computer interface design and health behavior and communication theory and practice. PMID:25830810
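Since the study reports inter-rater reliability as Cohen's kappa, a small self-contained sketch of the statistic may be useful (the rating data below are invented, not the study's):

```python
from collections import Counter

def cohen_kappa(r1, r2):
    """Cohen's kappa for two raters' nominal codes."""
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    expected = sum(c1[k] * c2[k] for k in c1) / n**2   # chance agreement
    return (observed - expected) / (1 - expected)

# Two coders rating the same apps for one technique (1 = present, 0 = absent)
coder_a = [1, 1, 0, 1, 0, 1, 1, 0]
coder_b = [1, 1, 0, 1, 1, 1, 0, 0]
print(round(cohen_kappa(coder_a, coder_b), 3))
```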
ERIC Educational Resources Information Center
Lim, Melvyn H.
2015-01-01
The aim of this research was to understand and develop theory concerning how teachers in a pioneer "School of the Future" in Singapore deal with information and communication technology (ICT) integration, utilising grounded theory methods, within the interpretivist paradigm. Findings revealed that teachers tended not to make radical…
Kim, Scott Y. H.; Wall, Ian F.; Stanczyk, Aimee; Vries, Raymond De
2010-01-01
In a liberal democracy, policy decisions regarding ethical controversies, including those in research ethics, should incorporate the opinions of its citizens. Eliciting informed and well-considered ethical opinions can be challenging. The issues may not be widely familiar and they may involve complex scientific, legal, historical, and ethical dimensions. Traditional surveys risk eliciting superficial and uninformed opinions that may be of dubious quality for policy formation. We argue that the theory and practice of deliberative democracy (DD) is especially useful in overcoming such inadequacies. We explain DD theory and practice, discuss the rationale for using DD methods in research ethics, and illustrate in depth the use of a DD method for a long-standing research ethics controversy involving research based on surrogate consent. The potential pitfalls of DD and the means of minimizing them as well as future research directions are also discussed. PMID:19919315
Solution to the Phase Problem Using Multibeam X-Ray Diffraction.
NASA Astrophysics Data System (ADS)
Shen, Qun
Multi-beam x-ray diffraction, especially the asymmetry effect in the virtual Bragg scattering case, has been shown to provide useful phase information on the structure factors involved in the scattering process. A perturbation theory has been developed to provide an analytical expression for the diffracted wave field in virtual Bragg scattering situations, which explains the physical origin of the asymmetry effect. Two experiments on the (202) reflection of benzil, using 3.5 keV x-rays, have shown that the asymmetry effect is visible in a mosaic non-centrosymmetric organic crystal. The results do not depend on the shape of the crystal, demonstrating that the method is universally applicable. A practical method to obtain arbitrary values of the phase triplet, based on the perturbation theory, has been developed and shown to work in the case of non-centrosymmetric crystals like benzil.
Cybersemiotics: a transdisciplinary framework for information studies.
Brier, S
1998-04-01
This paper summarizes recent attempts by this author to create a transdisciplinary, non-Cartesian and non-reductionistic framework for information studies in natural, social, and technological systems. To confront, in a scientific way, the problems of modern information technology, where phenomenological man is dealing with socially constructed texts in algorithmically based digital bit-machines, we need a theoretical framework spanning from physics over biology and technological design to the phenomenological and social production of signification and meaning. I am working with such pragmatic theories as second-order cybernetics (coupled with autopoiesis theory), Lakoff's biologically oriented cognitive semantics, Peirce's triadic semiotics, and Wittgenstein's pragmatic language game theory. A coherent synthesis of these theories is what the cybersemiotic framework attempts to accomplish.
Critical social theory as a model for the informatics curriculum for nursing.
Wainwright, P; Jones, P G
2000-01-01
It is widely acknowledged that the education and training of nurses in information management and technology is problematic. Drawing from recent research this paper presents a theoretical framework within which the nature of the problems faced by nurses in the use of information may be analyzed. This framework, based on the critical social theory of Habermas, also provides a model for the informatics curriculum. The advantages of problem based learning and multi-media web-based technologies for the delivery of learning materials within this area are also discussed.
Biases and power for groups comparison on subjective health measurements.
Hamel, Jean-François; Hardouin, Jean-Benoit; Le Neel, Tanguy; Kubis, Gildas; Roquelaure, Yves; Sébille, Véronique
2012-01-01
Subjective health measurements are increasingly used in clinical research, particularly for comparisons of patient groups. Two main types of analytical strategies can be used for such data: so-called classical test theory (CTT), which relies on observed scores, and models from item response theory (IRT), which rely on a response model relating the item responses to a latent parameter, often called the latent trait. Whether IRT or CTT is the most appropriate method to compare two independent groups of patients on a patient-reported outcomes measurement remains unknown and was investigated using simulations. For CTT-based analyses, groups comparison was performed using a t-test on the scores. For IRT-based analyses, several methods were compared, according to whether the Rasch model was considered with random effects or with fixed effects, and whether the group effect was included as a covariate or not. Individual latent trait values were estimated using either a deterministic method or stochastic approaches. Latent traits were then compared with a t-test. Finally, a two-step method was performed to compare the latent trait distributions, and a Wald test was performed to test the group effect in the Rasch model including group covariates. The only unbiased IRT-based method was the Wald test of the group covariate, performed on the random-effects Rasch model. This model displayed the highest observed power, which was similar to the power of the t-test on scores. These results need to be extended to the case, frequently encountered in practice, where data are missing and possibly informative.
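A minimal sketch of the kind of simulation the study describes, assuming a Rasch model with a latent group shift and a CTT-style t-test on sum scores (item difficulties, sample sizes, and the effect size are our illustrative choices, not the study's design):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, items = 200, 10
difficulty = np.linspace(-2, 2, items)

def simulate_rasch(theta):
    """Dichotomous responses under the Rasch model P = 1/(1+exp(-(theta-b)))."""
    p = 1 / (1 + np.exp(-(theta[:, None] - difficulty[None, :])))
    return (rng.random((len(theta), items)) < p).astype(int)

theta_a = rng.normal(0.0, 1.0, n)          # reference group
theta_b = rng.normal(0.5, 1.0, n)          # group with a 0.5 latent shift
scores_a = simulate_rasch(theta_a).sum(axis=1)
scores_b = simulate_rasch(theta_b).sum(axis=1)

# CTT-style comparison: t-test on the raw sum scores
print(stats.ttest_ind(scores_a, scores_b))
```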
The theory behind the full scattering profile
NASA Astrophysics Data System (ADS)
Feder, Idit; Duadi, Hamootal; Fixler, Dror
2018-02-01
Optical methods for extracting properties of tissues are commonly used. These methods are non-invasive, cause no harm to the patient, and are characterized by high speed. Human tissue is a turbid medium, and hence poses a challenge for different optical methods. In addition, the analysis of the emitted light requires calibration to achieve accurate information. Most methods analyze the reflected light based on its phase and amplitude, or the transmitted light. We suggest a new optical method for extracting optical properties of cylindrical tissues based on their full scattering profile (FSP), meaning the angular distribution of the reemitted light. The FSP of cylindrical tissues is relevant for biomedical measurement of fingers, earlobes or pinched tissues. We found the iso-pathlength (IPL) point, a point on the surface of the cylindrical medium where the light intensity remains constant and does not depend on the reduced scattering coefficient of the medium, but rather on the spatial structure and the cylindrical geometry. However, a similar behavior was also previously reported in reflection from a semi-infinite medium. Moreover, we present a linear dependency between the radius of the tissue and the point's location. This point can be used as a self-calibration point and thus improve the accuracy of optical tissue measurements. This natural phenomenon has not been investigated before. We show it theoretically, based on the diffusion theory, supported by our Monte Carlo simulation results.
NASA Astrophysics Data System (ADS)
Korotey, E. V.; Sinyavskii, N. Ya.
2007-07-01
A new method for determining the rheological parameters of liquid crystals with zero anisotropy of diamagnetic susceptibility is proposed, based on the measurement of the quadrupole splitting line of the 2H NMR spectrum. With shear flow excluded from consideration, the method provides higher experimental information content than the classical Leslie-Ericksen theory. A comparison with experiment is performed, the coefficients of anisotropic viscosity of lecithin/D2O/cyclohexane are determined, and a conclusion is drawn concerning the domain shapes.
Masters, Kevin S; Ross, Kaile M; Hooker, Stephanie A; Wooldridge, Jennalee L
2018-05-18
There has been a notable disconnect between theories of behavior change and behavior change interventions. Because few interventions are both explicitly and adequately theory-based, investigators cannot assess the impact of theory on intervention effectiveness. Theory-based interventions, designed to deliberately engage the theory's proposed mechanisms of change, are needed to adequately test theories. Thus, systematic approaches to theory-based intervention development are needed. This article will introduce and discuss the psychometric method of developing theory-based interventions. The psychometric approach to intervention development utilizes basic psychometric principles at each step of the intervention development process in order to build a theoretically driven intervention to, subsequently, be tested in process (mechanism) and outcome studies. Five stages of intervention development are presented as follows: (i) Choice of theory; (ii) Identification and characterization of key concepts and expected relations; (iii) Intervention construction; (iv) Initial testing and revision; and (v) Empirical testing of the intervention. Examples of this approach from the Colorado Meaning-Activity Project (COMAP) are presented. Based on self-determination theory integrated with meaning or purpose, and utilizing a motivational interviewing approach, the COMAP intervention is individually based with an initial interview followed by smart phone-delivered interventions for increasing daily activity. The psychometric approach to intervention development is one method to ensure careful consideration of theory in all steps of intervention development. This structured approach supports developing a research culture that endorses deliberate and systematic operationalization of theory into behavior change intervention from the outset of intervention development.
Bogg, Tim; Finn, Peter R
2009-05-01
Using insights from Ecological Systems Theory and Reinforcement Sensitivity Theory, the current study assessed the utility of a series of hypothetical role-based alcohol-consumption scenarios that varied in their presentation of rewarding and punishing information. The scenarios, along with measures of impulsive sensation seeking and a self-report of weekly alcohol consumption, were administered to a sample of alcohol-dependent and non-alcohol-dependent college-age individuals (N = 170). The results showed scenario attendance decisions were largely unaffected by alcohol-dependence status and variations in contextual reward and punishment information. In contrast to the attendance findings, the results for the alcohol-consumption decisions showed alcohol-dependent individuals reported a greater frequency of deciding to drink, as well as indicating greater alcohol consumption in the contexts of complementary rewarding or nonpunishing information. Regression results provided evidence for the criterion-related validity of scenario outcomes in an account of diagnostic alcohol problems. The results are discussed in terms of the conceptual and predictive gains associated with an assessment approach to alcohol-consumption decision making that combines situational information organized and balanced through the frameworks of Ecological Systems Theory and Reinforcement Sensitivity Theory.
Lyon, Aaron R; Lewis, Cara C; Melvin, Abigail; Boyd, Meredith; Nicodimos, Semret; Liu, Freda F; Jungbluth, Nathaniel
2016-09-22
Health information technologies (HIT) have become nearly ubiquitous in the contemporary healthcare landscape, but information about HIT development, functionality, and implementation readiness is frequently siloed. Theory-driven methods of compiling, evaluating, and integrating information from the academic and commercial sectors are necessary to guide stakeholder decision-making surrounding HIT adoption and to develop pragmatic HIT research agendas. This article presents the Health Information Technologies-Academic and Commercial Evaluation (HIT-ACE) methodology, a structured, theory-driven method for compiling and evaluating information from multiple sectors. As an example demonstration of the methodology, we apply HIT-ACE to mental and behavioral health measurement feedback systems (MFS). MFS are a specific class of HIT that support the implementation of routine outcome monitoring, an evidence-based practice. HIT-ACE is guided by theories and frameworks related to user-centered design and implementation science. The methodology involves four phases: (1) coding academic and commercial materials, (2) developer/purveyor interviews, (3) linking putative implementation mechanisms to HIT capabilities, and (4) experimental testing of capabilities and mechanisms. In the current demonstration, phase 1 included a systematic process to identify MFS in mental and behavioral health using academic literature and commercial websites. Using user-centered design, implementation science, and feedback frameworks, the HIT-ACE coding system was developed, piloted, and used to review each identified system for the presence of 38 capabilities and 18 additional characteristics via a consensus coding process. Bibliometric data were also collected to examine the representation of the systems in the scientific literature. As an example, results are presented for the application of HIT-ACE phase 1 to MFS wherein 49 separate MFS were identified, reflecting a diverse array of characteristics and capabilities. Preliminary findings demonstrate the utility of HIT-ACE to represent the scope and diversity of a given class of HIT beyond what can be identified in the academic literature. Phase 2 data collection is expected to confirm and expand the information presented, and phases 3 and 4 will provide more nuanced information about the impact of specific HIT capabilities. In all, HIT-ACE is expected to support adoption decisions and additional HIT development and implementation research.
Non-ad-hoc decision rule for the Dempster-Shafer method of evidential reasoning
NASA Astrophysics Data System (ADS)
Cheaito, Ali; Lecours, Michael; Bosse, Eloi
1998-03-01
This paper is concerned with the fusion of identity information through the use of statistical analysis rooted in the Dempster-Shafer theory of evidence to provide automatic identification aboard a platform. An identity information process for a baseline Multi-Source Data Fusion (MSDF) system is defined. The MSDF system is applied to information sources which include a number of radars, IFF systems, an ESM system, and a remote track source. We use a comprehensive Platform Data Base (PDB) containing all the possible identity values that the potential target may take, and we use fuzzy logic strategies which enable the fusion of subjective attribute information from sensors and the PDB, making the derivation of target identity quicker, more precise, and accompanied by statistically quantifiable measures of confidence. The conventional Dempster-Shafer method lacks a formal basis upon which decisions can be made in the face of ambiguity. We define a non-ad-hoc decision rule based on the expected utility interval for pruning the 'unessential' propositions which would otherwise overload real-time data fusion systems. An example is presented to demonstrate the implementation of our modified Dempster-Shafer method of evidential reasoning.
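Dempster's rule of combination, the engine behind the fusion described above, can be sketched compactly. The mass assignments below are hypothetical, and the paper's expected-utility-interval pruning rule is not implemented here.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: combine two mass functions over frozenset focal
    elements; conflicting mass (empty intersections) is renormalized away."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    return {k: v / (1 - conflict) for k, v in combined.items()}

# Two sensors reporting on platform identity (hypothetical masses)
F, H = frozenset({"friend"}), frozenset({"hostile"})
FH = F | H                                  # ignorance: "friend or hostile"
m_radar = {F: 0.6, FH: 0.4}
m_esm   = {F: 0.3, H: 0.5, FH: 0.2}
print(dempster_combine(m_radar, m_esm))     # friend ~0.60, hostile ~0.29
```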
Do Peer Tutors Help Teach ESL Students to Learn English as a Second Language More Successfully?
ERIC Educational Resources Information Center
Lyttle, LeighAnne
2011-01-01
This research study tries to understand the information processing model and social learning theory with regard to teaching English as a Second Language (ESL) to Spanish speakers by using peer teaching methods. This study will examine each theory's concepts and frameworks to better comprehend what teaching methods support English language learning.…
Ekberg, Joakim; Ericson, Leni; Timpka, Toomas; Eriksson, Henrik; Nordfeldt, Sam; Hanberger, Lena; Ludvigsson, Johnny
2010-04-01
Self-directed learning denotes that the individual is in command of what should be learned and why it is important. In this study, guidelines for the design of Web 2.0 systems for supporting diabetic adolescents' everyday learning needs are examined in light of theories about information behaviour and social learning. A Web 2.0 system was developed to support a community of practice, and social learning structures were created to support the building of relations between members on several levels in the community. The features of the system included access to participation in the culture of diabetes management practice, entry to information about the community and about what needs to be learned to be a full practitioner or respected member in the community, and free sharing of information, narratives and experience-based knowledge. After integration with the key elements derived from theories of information behaviour, a preliminary design guideline document was formulated.
2013-09-01
right time. The CCS will use this to create task-oriented, role-based gestalt views of the patient that the ICU clinical team can understand and rely on... these artifacts, such as diagrams, organize crucial information to assist cognitive work, from perception to decision making and outcome assessment... In: D. Silverman, ed. Qualitative research: Theory, method and practice. London: Sage: 161-82. Nemeth, C., O'Connor, M., Klock, P.A., and Cook
Strengthening Theoretical Testing in Criminology Using Agent-based Modeling.
Johnson, Shane D; Groff, Elizabeth R
2014-07-01
The Journal of Research in Crime and Delinquency (JRCD) has published important contributions to both criminological theory and associated empirical tests. In this article, we consider some of the challenges associated with traditional approaches to social science research, and discuss a complementary approach that is gaining popularity, agent-based computational modeling, which may offer new opportunities to strengthen theories of crime and develop insights into phenomena of interest. Two literature reviews are completed. The aim of the first is to identify those articles published in JRCD that have been the most influential and to classify the theoretical perspectives taken. The second is intended to identify those studies that have used an agent-based model (ABM) to examine criminological theories and to identify which theories have been explored. Ecological theories of crime pattern formation have received the most attention from researchers using ABMs, but many other criminological theories are amenable to testing using such methods. Traditional methods of theory development and testing suffer from a number of potential issues that a more systematic use of ABMs, not without its own issues, may help to overcome. ABMs should become another method in the criminologist's toolbox to aid theory testing and falsification.
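As a flavor of what an ABM for crime can look like, here is a toy sketch in the spirit of routine activity theory (entirely our own illustration, not a model from the reviewed literature): offenders move randomly on a grid, and an offense occurs when an offender lands on an unguarded target.

```python
import random

# Toy agent-based sketch: an offender walks a torus grid; an offense occurs
# on contact with an unguarded target, echoing the offender/target/guardian
# triad of routine activity theory. All parameters are invented.
random.seed(0)
SIZE, STEPS = 10, 1000
targets  = {(random.randrange(SIZE), random.randrange(SIZE)) for _ in range(15)}
guarded  = {t for t in targets if random.random() < 0.4}
offender = (5, 5)
offenses = []

for _ in range(STEPS):
    dx, dy = random.choice([(0, 1), (0, -1), (1, 0), (-1, 0)])
    offender = ((offender[0] + dx) % SIZE, (offender[1] + dy) % SIZE)
    if offender in targets and offender not in guarded:
        offenses.append(offender)

print(len(offenses), "offenses; hotspots emerge at unguarded targets")
```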
Sebire, Simon J; Kesten, Joanna M; Edwards, Mark J; May, Thomas; Banfield, Kathryn; Tomkinson, Keeley; Blair, Peter S; Bird, Emma L; Powell, Jane E; Jago, Russell
2016-05-01
To report the theory-based process evaluation of the Bristol Girls' Dance Project, a cluster-randomised controlled trial to increase adolescent girls' physical activity. A mixed-method process evaluation of the intervention's self-determination theory components comprising lesson observations, post-intervention interviews and focus groups. Four intervention dance lessons per dance instructor were observed, audio recorded and rated to estimate the use of need-supportive teaching strategies. Intervention participants (n = 281) reported their dance instructors' provision of autonomy-support. Semi-structured interviews with the dance instructors (n = 10) explored fidelity to the theory and focus groups were conducted with participants (n = 59) in each school to explore their receipt of the intervention and views on the dance instructors' motivating style. Although instructors accepted the theory-based approach, intervention fidelity was variable. Relatedness support was the most commonly observed need-supportive teaching behaviour, provision of structure was moderate and autonomy-support was comparatively low. The qualitative findings identified how instructors supported competence and developed trusting relationships with participants. Fidelity was challenged where autonomy provision was limited to option choices rather than input into the pace or direction of lessons and where controlling teaching styles were adopted, often to manage disruptive behaviour. The successes and challenges to achieving theoretical fidelity in the Bristol Girls' Dance Project may help explain the intervention effects and can more broadly inform the design of theory-based complex interventions aimed at increasing young people's physical activity in after-school settings.
Practical theories for service life prediction of critical aerospace structural components
NASA Technical Reports Server (NTRS)
Ko, William L.; Monaghan, Richard C.; Jackson, Raymond H.
1992-01-01
A new second-order theory was developed for predicting the service lives of aerospace structural components. The predictions based on this new theory were compared with those based on the Ko first-order theory and the classical theory of service life predictions. The new theory gives very accurate service life predictions. An equivalent constant-amplitude stress cycle method was proposed for representing the random load spectrum for crack growth calculations. This method predicts the most conservative service life. The proposed use of the minimum detectable crack size, instead of the proof-load-established crack size, as the initial crack size for crack growth calculations could give a more realistic service life.
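The report's second-order theory itself is not reproduced here. As a rough sketch of the crack-growth integration that such service-life predictions feed into, the following assumes the standard Paris law, da/dN = C(ΔK)^m, driven by an equivalent constant-amplitude stress range; the material constants, geometry factor, and crack sizes are illustrative, not taken from the report.

```python
import math

# Paris-law crack growth (illustrative constants, not the report's model):
#   da/dN = C * (dK)^m,   dK = Y * dS * sqrt(pi * a)
C, m = 1e-11, 3.0            # material constants (m/cycle, MPa*sqrt(m) units assumed)
Y, dS = 1.12, 100.0          # geometry factor; equivalent constant-amplitude stress range (MPa)
a, a_crit = 0.5e-3, 10e-3    # initial (e.g. minimum detectable) and critical crack sizes (m)

cycles = 0
while a < a_crit:
    dK = Y * dS * math.sqrt(math.pi * a)
    a += C * dK ** m         # one-cycle Euler step of the growth law
    cycles += 1

print(f"predicted service life: {cycles} cycles")
```

Starting the integration from the minimum detectable crack size, as the report proposes, simply changes the lower limit of this integration and hence the predicted life.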
George, Nika; MacDougall, Megan
2016-01-01
Background Women are disproportionately likely to assist aging family members; approximately 53 million in the United States are involved with the health care of aging parents, in-laws, or other relatives. The busy schedules of “sandwich generation” women who care for older relatives require accessible and flexible health education, including Web-based approaches. Objective This paper describes the development and implementation of a Web-based health education intervention, The Sandwich Generation Diner, as a tool for intergenerational caregivers of older adults with physical and cognitive impairments. Methods We used Bartholomew’s Intervention Mapping (IM) process to develop our theory-based health education program. Bandura’s (1997) self-efficacy theory provided the overarching theoretical model. Results The Sandwich Generation Diner website features four modules that address specific health care concerns. Our research involves randomly assigning caregiver participants to one of two experimental conditions that are identical in the type of information provided, but vary significantly in the presentation. In addition to structured Web-based assessments, specific website usage data are recorded. Conclusions The Sandwich Generation Diner was developed to address some of the informational and self-efficacy needs of intergenerational female caregivers. The next step is to demonstrate that this intervention is: (1) attractive and effective with families assisting older adults, and (2) feasible to embed within routine home health services for older adults. PMID:27269632
A smartphone app to communicate child passenger safety: an application of theory to practice.
Gielen, A C; McDonald, E M; Omaki, E; Shields, W; Case, J; Aitken, M
2015-10-01
Child passenger safety remains an important public health problem because motor vehicle crashes are the leading cause of death for children, and the majority of children ride improperly restrained. Using a mobile app to communicate with parents about injury prevention offers promise but little information is available on how to create such a tool. The purpose of this article is to illustrate a theory-based approach to developing a tailored, smartphone app for communicating child passenger safety information to parents. The theoretical basis for the tailoring is the elaboration likelihood model, and we utilized the precaution adoption process model (PAPM) to reflect the stage-based nature of behavior change. We created assessment items (written at ≤6th grade reading level) to determine the child's proper type of car seat, the parent's PAPM stage and beliefs on selected constructs designed to facilitate stage movement according to the theory. A message library and template were created to provide a uniform structure for the tailored feedback. We demonstrate how messages derived in this way can be delivered through new m-health technology and conclude with recommendations for the utility of the methods used here for other m-health, patient education interventions. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.
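A minimal sketch of how such stage-tailored feedback can be structured in code, assuming a hypothetical message library keyed by PAPM stage and a deliberately simplified restraint-recommendation rule; the stage names, message text, and age/weight thresholds are invented, not the app's actual content.

```python
# Hypothetical message library keyed by PAPM stage; {child} and {seat}
# are filled from the assessment (all content invented for illustration).
MESSAGES = {
    "unaware":   "Did you know? For a child {child}'s size, a {seat} is safest.",
    "undecided": "Many parents of children {child}'s size choose a {seat}.",
    "decided":   "Great decision! Here is how to install a {seat} correctly.",
    "acting":    "You're using a {seat} -- check that the harness sits at shoulder level.",
}

def recommend_seat(age_years: float, weight_lb: float) -> str:
    """Very simplified restraint guidance (illustrative thresholds only)."""
    if age_years < 2:
        return "rear-facing car seat"
    if weight_lb < 40:
        return "forward-facing car seat with harness"
    if weight_lb < 80:
        return "booster seat"
    return "seat belt"

def tailored_message(stage: str, child: str, age: float, weight: float) -> str:
    return MESSAGES[stage].format(child=child, seat=recommend_seat(age, weight))

print(tailored_message("undecided", "Ava", 3.0, 32.0))
```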
Applying Item Response Theory Methods to Design a Learning Progression-Based Science Assessment
ERIC Educational Resources Information Center
Chen, Jing
2012-01-01
Learning progressions are used to describe how students' understanding of a topic progresses over time and to classify the progress of students into steps or levels. This study applies Item Response Theory (IRT) based methods to investigate how to design learning progression-based science assessments. The research questions of this study are: (1)…
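For readers unfamiliar with the machinery involved, here is a minimal sketch of a two-parameter logistic (2PL) item response function together with an illustrative cut-score rule mapping an ability estimate onto learning-progression levels; the item parameters and cut scores are invented, not taken from the study.

```python
import math

def p_correct(theta: float, a: float, b: float) -> float:
    """2PL IRT model: probability of a correct response at ability theta,
    with discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Illustrative cut scores on the ability scale for progression levels.
CUTS = [(-1.0, "Level 1"), (0.0, "Level 2"), (1.0, "Level 3")]

def progression_level(theta: float) -> str:
    level = "Level 0"
    for cut, name in CUTS:
        if theta >= cut:
            level = name
    return level

theta = 0.4
print(round(p_correct(theta, a=1.2, b=0.0), 3))  # ~0.618
print(progression_level(theta))                   # Level 2
```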
Lunar-base construction equipment and methods evaluation
NASA Technical Reports Server (NTRS)
Boles, Walter W.; Ashley, David B.; Tucker, Richard L.
1993-01-01
A process for evaluating lunar-base construction equipment and methods concepts is presented. The process is driven by the need for more quantitative, systematic, and logical methods for assessing further research and development requirements in an area where uncertainties are high, dependence upon terrestrial heuristics is questionable, and quantitative methods are seldom applied. Decision theory concepts are used in determining the value of accurate information, and the process is structured as a construction-equipment-and-methods selection methodology. Total construction-related earth-launch mass is the measure of merit chosen for mathematical modeling purposes. The work is based upon the scope of the lunar base as described in the National Aeronautics and Space Administration's Office of Exploration's 'Exploration Studies Technical Report, FY 1989 Status'. Nine sets of conceptually designed construction equipment are selected as alternative concepts. It is concluded that the evaluation process is well suited for assisting in the establishment of research agendas in an approach that is first broad, with a low level of detail, followed by more-detailed investigations into areas that are identified as critical due to high degrees of uncertainty and sensitivity.
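A minimal sketch of the value-of-information idea such an evaluation rests on: the expected value of perfect information (EVPI) for a choice between two hypothetical equipment concepts, scored by total earth-launch mass (lower is better). The scenarios, probabilities, and masses are all invented.

```python
# EVPI sketch: how much launch mass could be saved, in expectation, by
# resolving an uncertainty before committing to an equipment concept.
P = {"benign_regolith": 0.6, "hard_regolith": 0.4}   # state probabilities (invented)
MASS = {                                             # launch mass (t) per concept and state
    "concept_A": {"benign_regolith": 50, "hard_regolith": 90},
    "concept_B": {"benign_regolith": 65, "hard_regolith": 70},
}

def expected_mass(concept: str) -> float:
    return sum(P[s] * MASS[concept][s] for s in P)

best_now = min(MASS, key=expected_mass)              # decide under uncertainty
e_now = expected_mass(best_now)

# With perfect information, the best concept is chosen in each state first.
e_perfect = sum(P[s] * min(MASS[c][s] for c in MASS) for s in P)

evpi = e_now - e_perfect
print(best_now, e_now, e_perfect, evpi)              # concept_A 66.0 58.0 8.0
```

A large EVPI relative to the cost of further study flags exactly the critical areas of uncertainty that the evaluation process is meant to prioritize.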
Ghosh, Preetam; Ghosh, Samik; Basu, Kalyan; Das, Sajal K; Zhang, Chaoyang
2010-12-01
The challenge today is to develop a modeling and simulation paradigm that integrates structural, molecular and genetic data for a quantitative understanding of physiology and behavior of biological processes at multiple scales. This modeling method requires techniques that maintain a reasonable accuracy of the biological process and also reduce the computational overhead. This objective motivates the use of new methods that can transform the problem from energy- and affinity-based modeling to information-theory-based modeling. To achieve this, we transform all dynamics within the cell into a random event time, which is specified through an information domain measure like a probability distribution. This allows us to use the "in silico" stochastic event-based modeling approach to find the molecular dynamics of the system. In this paper, we present the discrete event simulation concept using the example of the signal transduction cascade triggered by extra-cellular Mg2+ concentration in the two-component PhoPQ regulatory system of Salmonella Typhimurium. We also present a model to compute the information domain measure of the molecular transport process by estimating the statistical parameters of the inter-arrival time between molecules/ions coming to a cell receptor as an external signal. This model transforms the diffusion process into the information theory measure of stochastic event completion time to get the distribution of the Mg2+ departure events. Using these molecular transport models, we next study the in silico effects of this external trigger on the PhoPQ system. Our results illustrate the accuracy of the proposed diffusion models in explaining the molecular/ionic transport processes inside the cell. Also, the proposed simulation framework can incorporate the stochasticity in cellular environments to a certain degree of accuracy. We expect that this scalable simulation platform will be able to model more complex biological systems with reasonable accuracy to understand their temporal dynamics.
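A minimal sketch of the discrete-event idea described above, assuming Mg2+ arrivals at a receptor are reduced to event times drawn from an exponential inter-arrival distribution; the rate constant is invented, whereas the paper estimates such distributions from a transport model rather than assuming them.

```python
import heapq, random

# Discrete-event sketch: instead of simulating diffusion, external Mg2+
# arrivals are modeled as timestamped events with random inter-arrival
# times (exponential with an invented rate).
random.seed(42)
RATE, T_END = 2.0, 10.0      # mean arrivals per time unit; simulation horizon

events = []                  # (time, kind) priority queue
t = random.expovariate(RATE)
while t < T_END:
    heapq.heappush(events, (t, "mg_arrival"))
    t += random.expovariate(RATE)

arrivals = 0
while events:
    t, kind = heapq.heappop(events)   # process events in time order
    if kind == "mg_arrival":
        arrivals += 1                 # here: update receptor/PhoPQ state

print(f"{arrivals} Mg2+ arrival events in {T_END} time units")
```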
Improving Bridging from Informatics Practice to Theory.
Lehmann, C U; Gundlapalli, A V
2015-01-01
In 1962, Methods of Information in Medicine (MIM) began to publish papers on the methodology and scientific fundamentals of organizing, representing, and analyzing data, information, and knowledge in biomedicine and health care. Considered a companion journal, Applied Clinical Informatics (ACI) was launched in 2009 with a mission to establish a platform that allows sharing of knowledge between clinical medicine and health IT specialists as well as to bridge gaps between visionary design and successful and pragmatic deployment of clinical information systems. Both journals are official journals of the International Medical Informatics Association. As a follow-up to prior work, we set out to explore congruencies and interdependencies in publications of ACI and MIM. The objectives were to describe the major topics discussed in articles published in ACI in 2014 and to determine if there was evidence that theory in 2014 MIM publications was informed by practice described in ACI publications in any year. We also set out to describe lessons learned in the context of bridging informatics practice and theory and to offer opinions on how ACI editorial policies could evolve to foster and improve such bridging. We conducted a retrospective observational study and reviewed all articles published in ACI during the calendar year 2014 (Volume 5) for their main theme, conclusions, and key words. We then reviewed the citations of all MIM papers from 2014 to determine if there were references to ACI articles from any year. Lessons learned in the context of bridging informatics practice and theory and opinions on ACI editorial policies were developed by consensus between the two authors. A total of 70 articles were published in ACI in 2014. Clinical decision support, clinical documentation, usability, Meaningful Use, health information exchange, patient portals, and clinical research informatics emerged as major themes. Only one MIM article from 2014 cited an ACI article. There are several lessons learned, including the possibility that there may not be direct links between MIM theory and ACI practice articles. ACI editorial policies will continue to evolve to reflect the breadth and depth of the practice of clinical informatics and the articles received for publication. Efforts to encourage bridging of informatics practice and theory may be considered by the ACI editors. The lack of direct links from informatics theory-based papers published in MIM in 2014 to papers published in ACI continues, as was described for papers published during 2012 to 2013 in the two companion journals. Thus, there is little evidence that theory in MIM has been informed by practice in ACI.
Time-reversal MUSIC imaging of extended targets.
Marengo, Edwin A; Gruber, Fred K; Simonetti, Francesco
2007-08-01
This paper develops, within a general framework that is applicable to rather arbitrary electromagnetic and acoustic remote sensing systems, a theory of time-reversal "MUltiple Signal Classification" (MUSIC)-based imaging of extended (non-point-like) scatterers (targets). The general analysis applies to arbitrary remote sensing geometry and sheds light on how the singular system of the scattering matrix relates to the geometrical and propagation characteristics of the entire transmitter-target-receiver system and how to use this effect for imaging. All the developments are derived within exact scattering theory, which includes multiple scattering effects. The derived time-reversal MUSIC methods include both interior sampling and exterior sampling (or enclosure) approaches. For simplicity of presentation, particular attention is given to the time-harmonic case, where the informational wave modes employed for target interrogation are purely spatial, but the corresponding generalization to broadband fields is also given. This paper includes computer simulations illustrating the derived theory and algorithms.
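A minimal sketch of time-reversal MUSIC for the simpler point-scatterer case (the paper's extended-target theory is not reproduced): the multistatic response matrix is decomposed by SVD, and the pseudospectrum peaks where a trial steering vector is nearly orthogonal to the noise subspace. The geometry, wavenumber, and noise level are illustrative.

```python
import numpy as np

# Time-reversal MUSIC sketch for two point scatterers in 2D (illustrative).
rng = np.random.default_rng(0)
k = 2 * np.pi / 1.0                       # wavenumber (wavelength = 1)
array_x = np.linspace(-8, 8, 17)          # coincident transmit/receive array at y = 0
scatterers = np.array([[2.0, 6.0], [-3.0, 9.0]])

def steering(pt):
    """Green's-function-like vector from each array element to point pt."""
    r = np.hypot(array_x - pt[0], pt[1])
    return np.exp(1j * k * r) / np.sqrt(r)

# Multistatic response matrix for point scatterers (Born approximation).
K = sum(np.outer(steering(s), steering(s)) for s in scatterers)
K = K + 1e-6 * rng.standard_normal(K.shape)   # small measurement noise

U, _, _ = np.linalg.svd(K)
noise = U[:, 2:]                          # noise subspace (two significant singular values)

# Pseudospectrum: large where the steering vector has no noise-subspace component.
xs, ys = np.linspace(-6, 6, 61), np.linspace(3, 12, 46)
image = np.empty((len(ys), len(xs)))
for i, y in enumerate(ys):
    for j, x in enumerate(xs):
        g = steering((x, y))
        g = g / np.linalg.norm(g)
        image[i, j] = 1.0 / np.linalg.norm(noise.conj().T @ g) ** 2

peak = np.unravel_index(image.argmax(), image.shape)
print("brightest pixel:", xs[peak[1]], ys[peak[0]])   # near a true scatterer
```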
Sarkadi, Anna; Fabian, Helena
2017-01-01
Evidence-based methods to identify behavioural problems among children are not regularly used within Swedish child healthcare. A new procedure was therefore introduced to assess children through parent and preschool-teacher reports using the Strengths and Difficulties Questionnaire (SDQ). This study aims to explore nurses', preschool teachers' and parents' perspectives on this new information-sharing model. Using grounded theory methodology, semi-structured interviews with nurses (n = 10) at child health clinics, preschool teachers (n = 13) and parents (n = 11) of 3-, 4- and 5-year-old children were conducted and analysed between March 2014 and June 2014. The analysis used the constant comparative method. The participants were sampled purposively within a larger trial in Sweden. Results indicate that all stakeholders shared a desire to have a complete picture of the child's health. The perceptions that explain why the stakeholders were in favour of the new procedure (the 'causal conditions' in a grounded theory model) included: (1) nurses thought that visits after 18 months were unsatisfactory, (2) preschool teachers wanted to identify children with difficulties and (3) parents viewed preschool teachers as being qualified to assess children. However, all stakeholders had doubts as to whether there was a reliable way to assess children's behaviour. Although nurses found the SDQ useful for their clinical evaluation, they noticed that not all parents chose to participate. Both teachers and parents acknowledged benefits of information sharing; however, the former had concerns about parental reactions to their assessments and the latter about how personal information was handled. The theoretical model developed describes how the causal conditions and the current context of child healthcare in many respects support the introduction of information sharing. However, successful implementation requires considerable work to address barriers: for preschool teachers, the tension between normative thinking and helping children with developmental problems; for parents, dealing with privacy issues and inequity in participation. PMID:28076401