Ranking streamflow model performance based on Information theory metrics
NASA Astrophysics Data System (ADS)
Martinez, Gonzalo; Pachepsky, Yakov; Pan, Feng; Wagener, Thorsten; Nicholson, Thomas
2016-04-01
Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to determine whether information theory-based metrics can be used as a complementary tool for hydrologic model evaluation and selection. We simulated 10-year streamflow time series in five watersheds located in Texas, North Carolina, Mississippi, and West Virginia. Eight models of different complexity were applied. The information theory-based metrics were obtained after representing the time series as strings of symbols, where different symbols corresponded to different quantiles of the probability distribution of streamflow. A fixed symbol alphabet was used. Three metrics were computed for those strings: mean information gain, which measures the randomness of the signal; effective measure complexity, which characterizes predictability; and fluctuation complexity, which characterizes the presence of a pattern in the signal. The observed streamflow time series had smaller information content and larger complexity metrics than the precipitation time series. Watersheds served as information filters, and streamflow time series were less random and more complex than those of precipitation. This reflects the fact that the watershed acts as an information filter in the hydrologic conversion of precipitation to streamflow. The Nash-Sutcliffe efficiency increased as model complexity increased, but in many cases several models had efficiency values that were not statistically different from each other. In such cases, ranking models by the closeness of the information theory-based metrics of simulated and measured streamflow time series can provide an additional criterion for the evaluation of hydrologic model performance.
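For readers who want to prototype metrics of this kind, the calculation is straightforward. The sketch below (Python, not the authors' code) assumes quantile-based symbolization of the series and words of length one; the synthetic "flow-like" and "noise" series are placeholders for real streamflow and precipitation records.

```python
# Minimal sketch of symbolic information-theory metrics for a time series:
# quantile-based symbolization, then mean information gain and fluctuation
# complexity from word-transition probabilities (word length is a free parameter).
import numpy as np
from collections import Counter

def symbolize(series, n_symbols=4):
    """Map each value to a quantile-based symbol 0..n_symbols-1."""
    edges = np.quantile(series, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(series, edges)

def block_entropy(symbols, length):
    """Shannon entropy (bits) of overlapping words of the given length."""
    words = [tuple(symbols[i:i + length]) for i in range(len(symbols) - length + 1)]
    counts = np.array(list(Counter(words).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def mean_information_gain(symbols, length=1):
    """Average information gained by reading one more symbol after a word."""
    return block_entropy(symbols, length + 1) - block_entropy(symbols, length)

def fluctuation_complexity(symbols, length=1):
    """Mean-square fluctuation of net information gain between successive words."""
    words = [tuple(symbols[i:i + length]) for i in range(len(symbols) - length + 1)]
    p_word = {w: c / len(words) for w, c in Counter(words).items()}
    pairs = list(zip(words[:-1], words[1:]))
    p_pair = {pr: c / len(pairs) for pr, c in Counter(pairs).items()}
    return sum(p * np.log2(p_word[a] / p_word[b]) ** 2 for (a, b), p in p_pair.items())

# Example: a noisy periodic proxy for streamflow versus white noise.
rng = np.random.default_rng(0)
flow = np.sin(np.linspace(0, 60, 3650)) + 0.3 * rng.standard_normal(3650)
noise = rng.standard_normal(3650)
for name, x in [("flow-like", flow), ("white noise", noise)]:
    s = symbolize(x)
    print(name, round(mean_information_gain(s), 3), round(fluctuation_complexity(s), 3))
```

A more patterned series should show lower mean information gain and higher fluctuation complexity than white noise, which is the qualitative behavior the abstract reports for streamflow relative to precipitation.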
Information-theoretic metamodel of organizational evolution
NASA Astrophysics Data System (ADS)
Sepulveda, Alfredo
2011-12-01
Social organizations are abstractly modeled by holarchies---self-similar connected networks---and intelligent complex adaptive multiagent systems---large networks of autonomous reasoning agents interacting via scaled processes. However, little is known of how information shapes evolution in such organizations, a gap that can lead to misleading analytics. The research problem addressed in this study was the ineffective manner in which classical model-predict-control methods used in business analytics attempt to define organization evolution. The purpose of the study was to construct an effective metamodel for organization evolution based on a proposed complex adaptive structure---the info-holarchy. Theoretical foundations of this study were holarchies, complex adaptive systems, evolutionary theory, and quantum mechanics, among other recently developed physical and information theories. Research questions addressed how information evolution patterns gleaned from the study's inductive metamodel more aptly explained volatility in organizations. In this study, a hybrid grounded theory based on abstract inductive extensions of information theories was utilized as the research methodology. An overarching heuristic metamodel was framed from the theoretical analysis of the properties of these extension theories and applied to business, neural, and computational entities. This metamodel resulted in the synthesis of a metaphor for, and generalization of, organization evolution, serving as the recommended and appropriate analytical tool to view business dynamics for future applications. This study may manifest positive social change through a fundamental understanding of complexity in business from general information theories, resulting in more effective management.
NASA Astrophysics Data System (ADS)
Pan, Feng; Pachepsky, Yakov A.; Guber, Andrey K.; McPherson, Brian J.; Hill, Robert L.
2012-01-01
Understanding streamflow patterns in space and time is important for improving flood and drought forecasting, water resources management, and predictions of ecological changes. Objectives of this work are (a) to characterize the spatial and temporal patterns of streamflow using information theory-based measures at two thoroughly-monitored agricultural watersheds located in different hydroclimatic zones with similar land use, and (b) to elucidate and quantify temporal and spatial scale effects on those measures. We selected two USDA experimental watersheds to serve as case study examples, including the Little River experimental watershed (LREW) in Tifton, Georgia and the Sleepers River experimental watershed (SREW) in North Danville, Vermont. Both watersheds possess several nested sub-watersheds and more than 30 years of continuous data records of precipitation and streamflow. Information content measures (metric entropy and mean information gain) and complexity measures (effective measure complexity and fluctuation complexity) were computed based on the binary encoding of 5-year streamflow and precipitation time series data. We quantified patterns of streamflow using probabilities of joint or sequential appearances of the binary symbol sequences. Results of our analysis illustrate that information content measures of streamflow time series are much smaller than those for precipitation data, and the streamflow data also exhibit higher complexity, suggesting that the watersheds effectively act as filters of the precipitation information that leads to the observed additional complexity in streamflow measures. Correlation coefficients between the information-theory-based measures and time intervals are close to 0.9, demonstrating the significance of temporal scale effects on streamflow patterns. Moderate spatial scale effects on streamflow patterns are observed with absolute values of correlation coefficients between the measures and sub-watershed area varying from 0.2 to 0.6 in the two watersheds. We conclude that temporal effects must be evaluated and accounted for when the information theory-based methods are used for performance evaluation and comparison of hydrological models.
Cognitive Load Theory and the Effects of Transient Information on the Modality Effect
ERIC Educational Resources Information Center
Leahy, Wayne; Sweller, John
2016-01-01
Based on cognitive load theory and the "transient information effect," this paper investigated the "modality effect" while interpreting a contour map. The length and complexity of auditory and visual text instructions were manipulated. Experiment 1 indicated that longer audio text information within a presentation was inferior…
Sustainable System Management with Fisher Information based Objectives
Sustainable ecosystem management that integrates ecological, economic and social perspectives is a complex task where simultaneous persistence of human and natural components of the system must be ensured. Given the complexity of this task, systems theory approaches based on soun...
ERIC Educational Resources Information Center
Williamson, David J.
2011-01-01
The specific problem addressed in this study was the low success rate of information technology (IT) projects in the U.S. Due to the abstract nature and inherent complexity of software development, IT projects are among the most complex projects encountered. Most existing schools of project management theory are based on the rational systems…
Cognitive Tools for Assessment and Learning in a High Information Flow Environment.
ERIC Educational Resources Information Center
Lajoie, Susanne P.; Azevedo, Roger; Fleiszer, David M.
1998-01-01
Describes the development of a simulation-based intelligent tutoring system for nurses working in a surgical intensive care unit. Highlights include situative learning theories and models of instruction, modeling expertise, complex decision making, linking theories of learning to the design of computer-based learning environments, cognitive task…
Information theory applications for biological sequence analysis.
Vinga, Susana
2014-05-01
Information theory (IT) addresses the analysis of communication systems and has been widely applied in molecular biology. In particular, alignment-free sequence analysis and comparison have greatly benefited from concepts derived from IT, such as entropy and mutual information. This review covers several aspects of IT applications, ranging from genome global analysis and comparison, including block-entropy estimation and resolution-free metrics based on iterative maps, to local analysis, comprising the classification of motifs, prediction of transcription factor binding sites and sequence characterization based on linguistic complexity and entropic profiles. IT has also been applied to high-level correlations that combine DNA, RNA or protein features with sequence-independent properties, such as gene mapping and phenotype analysis, and has provided models based on communication systems theory to describe information transmission channels at the cell level and during evolutionary processes. While not exhaustive, this review attempts to categorize existing methods and to indicate their relation with broader transversal topics such as genomic signatures, data compression and complexity, time series analysis and phylogenetic classification, providing a resource for future developments in this promising area.
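Two of the alignment-free measures mentioned above are easy to illustrate. The following sketch is an assumption-laden toy example (a short made-up DNA string, natural estimators with no finite-sample correction) of block entropy and one common formulation of linguistic complexity.

```python
# Block entropy of the observed k-mer distribution and a simple linguistic
# complexity score (distinct substrings observed / maximum possible).
from collections import Counter
from math import log2

def block_entropy(seq, k):
    """Shannon entropy (bits) of the observed k-mer distribution."""
    kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    n = len(kmers)
    return -sum((c / n) * log2(c / n) for c in Counter(kmers).values())

def linguistic_complexity(seq, max_k=6):
    """Ratio of observed distinct k-mers to the maximum possible, k = 1..max_k."""
    observed = possible = 0
    for k in range(1, max_k + 1):
        observed += len({seq[i:i + k] for i in range(len(seq) - k + 1)})
        possible += min(4 ** k, len(seq) - k + 1)   # alphabet- or length-limited
    return observed / possible

seq = "ATGCGATACGCTTAGGCTAACGTTAGCGATCGATCGGATTACA"   # toy sequence
print([round(block_entropy(seq, k), 3) for k in (1, 2, 3)])
print(round(linguistic_complexity(seq), 3))
```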
Learner-Centred Pedagogy for Swim Coaching: A Complex Learning Theory-Informed Approach
ERIC Educational Resources Information Center
Light, Richard
2014-01-01
While constructivist theories of learning have been widely drawn on to understand and explain learning in games when using game-based approaches, their use to inform pedagogy beyond games is limited. In particular, there has been little interest in applying constructivist perspectives on learning to sports in which technique is of prime importance.…
A study of the spreading scheme for viral marketing based on a complex network model
NASA Astrophysics Data System (ADS)
Yang, Jianmei; Yao, Canzhong; Ma, Weicheng; Chen, Guanrong
2010-02-01
Buzzword-based viral marketing, known also as digital word-of-mouth marketing, is a marketing mode attached to some carriers on the Internet, which can rapidly copy marketing information at a low cost. Viral marketing actually uses a pre-existing social network; however, the scale of the pre-existing network is believed to be so large and so random that its theoretical analysis is intractable and unmanageable. There are very few reports in the literature on how to design a spreading scheme for viral marketing on real social networks according to traditional marketing theory or the relatively new network marketing theory. Complex network theory provides a new model for the study of large-scale complex systems, using the latest developments of graph theory and computing techniques. From this perspective, the present paper extends complex network theory and modeling into the research of general viral marketing and develops a specific spreading scheme for viral marketing and an approach to designing the scheme based on a real complex network on the QQ instant messaging system. This approach is shown to be rather universal and can be further extended to the design of various spreading schemes for viral marketing based on different instant messaging systems.
A theory-based approach to teaching young children about health: A recipe for understanding
Nguyen, Simone P.; McCullough, Mary Beth; Noble, Ashley
2011-01-01
The theory-theory account of conceptual development posits that children’s concepts are integrated into theories. Concept learning studies have documented the central role that theories play in children’s learning of experimenter-defined categories, but have yet to extensively examine complex, real-world concepts such as health. The present study examined whether providing young children with coherent and causally-related information in a theory-based lesson would facilitate their learning about the concept of health. This study used a pre-test/lesson/post-test design, plus a five month follow-up. Children were randomly assigned to one of three conditions: theory (i.e., 20 children received a theory-based lesson); nontheory (i.e., 20 children received a nontheory-based lesson); and control (i.e., 20 children received no lesson). Overall, the results showed that children in the theory condition had a more accurate conception of health than children in the nontheory and control conditions, suggesting the importance of theories in children’s learning of complex, real-world concepts. PMID:21894237
Cresswell, Kathrin M; Worth, Allison; Sheikh, Aziz
2010-11-01
Actor-Network Theory (ANT) is an increasingly influential, but still deeply contested, approach to understand humans and their interactions with inanimate objects. We argue that health services research, and in particular evaluations of complex IT systems in health service organisations, may benefit from being informed by Actor-Network Theory perspectives. Despite some limitations, an Actor-Network Theory-based approach is conceptually useful in helping to appreciate the complexity of reality (including the complexity of organisations) and the active role of technology in this context. This can prove helpful in understanding how social effects are generated as a result of associations between different actors in a network. Of central importance in this respect is that Actor-Network Theory provides a lens through which to view the role of technology in shaping social processes. Attention to this shaping role can contribute to a more holistic appreciation of the complexity of technology introduction in healthcare settings. It can also prove practically useful in providing a theoretically informed approach to sampling (by drawing on informants that are related to the technology in question) and analysis (by providing a conceptual tool and vocabulary that can form the basis for interpretations). We draw on existing empirical work in this area and our ongoing work investigating the integration of electronic health record systems introduced as part of England's National Programme for Information Technology to illustrate salient points. Actor-Network Theory needs to be used pragmatically with an appreciation of its shortcomings. Our experiences suggest it can be helpful in investigating technology implementations in healthcare settings.
Theories of how the school environment impacts on student health: systematic review and synthesis.
Bonell, C P; Fletcher, A; Jamal, F; Wells, H; Harden, A; Murphy, S; Thomas, J
2013-11-01
Public-health interventions informed by theory can be more effective but complex interventions often use insufficiently complex theories. We systematically reviewed theories of how school environments influence health. We included 37 reports drawing on 24 theories. Narrative synthesis summarised and categorised theories. We then produced an integrated theory of school environment influences on student health. This integrated theory could inform complex interventions such as health promoting schools programmes. Using systematic reviews to develop theories of change might be useful for other types of 'complex' public-health interventions addressing risks at the individual and community levels. © 2013 Published by Elsevier Ltd.
ERIC Educational Resources Information Center
Yamagata-Lynch, Lisa C.
2007-01-01
Understanding human activity in real-world situations often involves complicated data collection, analysis, and presentation methods. This article discusses how Cultural-Historical Activity Theory (CHAT) can inform design-based research practices that focus on understanding activity in real-world situations. I provide a sample data set with…
Analyzing complex networks evolution through Information Theory quantifiers
NASA Astrophysics Data System (ADS)
Carpi, Laura C.; Rosso, Osvaldo A.; Saco, Patricia M.; Ravetti, Martín Gómez
2011-01-01
A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed: the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease, and a climate network for the Tropical Pacific region to study the El Niño/Southern Oscillation (ENSO) dynamics. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.
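As an illustration of the two quantifiers, the sketch below applies them to degree histograms of synthetic random graphs; it is not the authors' network-evolution pipeline, and the normalization constant in the MPR complexity follows the usual convention for the Jensen-Shannon disequilibrium against the uniform distribution.

```python
# Square root of the Jensen-Shannon divergence between two distributions and the
# MPR statistical complexity (normalized entropy times normalized disequilibrium).
import numpy as np

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def jensen_shannon_div(p, q):
    m = 0.5 * (p + q)
    return shannon(m) - 0.5 * shannon(p) - 0.5 * shannon(q)

def mpr_complexity(p):
    n = len(p)
    h_norm = shannon(p) / np.log2(n)                      # normalized Shannon entropy
    uniform = np.full(n, 1.0 / n)
    q0 = -2.0 / (((n + 1) / n) * np.log2(n + 1) - 2 * np.log2(2 * n) + np.log2(n))
    disequilibrium = q0 * jensen_shannon_div(p, uniform)  # normalized JS distance to uniform
    return disequilibrium * h_norm

def degree_distribution(adjacency, bins):
    hist, _ = np.histogram(adjacency.sum(axis=1), bins=bins)
    return hist / hist.sum()

rng = np.random.default_rng(1)

def random_graph(n, p):
    a = np.triu((rng.random((n, n)) < p).astype(int), 1)
    return a + a.T

bins = np.arange(0, 61)
p_dist = degree_distribution(random_graph(200, 0.05), bins)
q_dist = degree_distribution(random_graph(200, 0.15), bins)
print("sqrt(JSD):", round(float(np.sqrt(max(jensen_shannon_div(p_dist, q_dist), 0.0))), 4))
print("MPR complexity:", round(float(mpr_complexity(p_dist)), 4))
```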
Managing for resilience: an information theory-based approach to assessing ecosystems
Ecosystems are complex and multivariate; hence, methods to assess the dynamics of ecosystems should have the capacity to evaluate multiple indicators simultaneously. Most research on identifying leading indicators of regime shifts has focused on univariate methods and simple models, which have limited utility when evaluating real ecosystems, particularly because drivers are often unknown. We discuss some common univariate and multivariate approaches for detecting critical transitions in ecosystems and demonstrate their capabilities via case studies. Synthesis and applications. We illustrate the utility of an information theory-based index for assessing ecosystem dynamics. Trends in this index also provide a sentinel of both abrupt and gradual transitions in ecosystems. In response to the need to identify leading indicators of regime shifts in ecosystems, our research compares traditional indicators and Fisher information, an information theory-based method, by examining four case study systems. Results demonstrate the utility of the methods and offer great promise for quantifying and managing for resilience.
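The Fisher information index referred to above is often approximated from binned state probabilities within a sliding window. The following is a simplified univariate sketch of that idea (the index in the paper is multivariate); the synthetic step-change series is only an illustration.

```python
# Sliding-window Fisher information, FI = 4 * sum_i (sqrt(p_i) - sqrt(p_{i+1}))^2,
# where p_i are probabilities of binned states within the window; abrupt changes
# in the index accompany reorganization of the system's states.
import numpy as np

def fisher_information(window, n_bins=8):
    hist, _ = np.histogram(window, bins=n_bins)
    p = hist / hist.sum()
    amp = np.sqrt(p)
    return 4.0 * np.sum(np.diff(amp) ** 2)

def sliding_fi(series, win=100, step=20, n_bins=8):
    return [fisher_information(series[i:i + win], n_bins)
            for i in range(0, len(series) - win + 1, step)]

# Example: a noisy series with an abrupt regime shift at t = 700.
rng = np.random.default_rng(2)
t = np.arange(1000)
x = np.where(t < 700, 0.0, 3.0) + 0.5 * rng.standard_normal(1000)
fi = sliding_fi(x)
print([round(v, 2) for v in fi[:5]], "...", [round(v, 2) for v in fi[-5:]])
```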
Preliminary Characterization of Erythrocytes Deformability on the Entropy-Complexity Plane
Korol, Ana M; D’Arrigo, Mabel; Foresto, Patricia; Pérez, Susana; Martín, Maria T; Rosso, Osvaldo A
2010-01-01
We present an application of wavelet-based Information Theory quantifiers (Normalized Total Shannon Entropy, MPR Statistical Complexity and the Entropy-Complexity plane) to the characterization of red blood cell membrane viscoelasticity. These quantifiers exhibit important localization advantages provided by Wavelet Theory. The present approach produces a clear characterization of this dynamical system, revealing an evident manifestation of a random process in the red cell samples of healthy individuals and a sharp reduction of randomness when analyzing a human haematological disease such as β-thalassaemia minor. PMID:21611139
When ICT Meets Schools: Differentiation, Complexity and Adaptability
ERIC Educational Resources Information Center
Tubin, Dorit
2007-01-01
Purpose: The purpose of this study is to explore the interaction between information communication technology (ICT) and the school's organizational structure, and propose an analytical model based both on Luhmann's system theory and empirical findings. Design/methodology/approach: The approach of building a theory from a case study research along…
Prospect Theory and Interval-Valued Hesitant Set for Safety Evacuation Model
NASA Astrophysics Data System (ADS)
Kou, Meng; Lu, Na
2018-01-01
The study applies research results from prospect theory and multi-attribute decision-making theory, accounts for the complexity, uncertainty and multifactor influences of the underground mine fire system, and takes decision makers' psychological behavior of emotion and intuition into full account to establish an intuitionistic fuzzy multiple-attribute decision-making method based on prospect theory. The model established by this method can explain decision makers' safety evacuation behavior in the complex system of an underground mine fire, given the uncertainty of the environment, imperfect information, human psychological behavior and other factors.
On uncertainty in information and ignorance in knowledge
NASA Astrophysics Data System (ADS)
Ayyub, Bilal M.
2010-05-01
This paper provides an overview of working definitions of knowledge, ignorance, information and uncertainty, and summarises a formalised philosophical and mathematical framework for their analysis. It provides a comparative examination of the generalised information theory and the generalised theory of uncertainty. It summarises foundational bases for assessing the reliability of knowledge constructed as a collective set of justified true beliefs. It discusses system complexity for ancestor simulation potentials. It offers value-driven means of communicating knowledge and contrarian knowledge using memes and memetics.
A novel approach to characterize information radiation in complex networks
NASA Astrophysics Data System (ADS)
Wang, Xiaoyang; Wang, Ying; Zhu, Lin; Li, Chao
2016-06-01
Traditional research on information dissemination is mostly based on virus-spreading models in which information is spread with a certain probability, which does not match reality well, because the information that we receive is always more or less than what was sent. In order to quantitatively describe variations in the amount of information during the spreading process, this article proposes a safety information radiation model on the basis of communication theory, combined with relevant theories of complex networks. This model comprehensively considers the various factors that influence safety information radiation in the network, and introduces some concepts from the communication theory perspective, such as the radiation gain function, receiving gain function, information retaining capacity and information second reception capacity, to describe the safety information radiation process between nodes and dynamically investigate the states of network nodes. On a micro level, this article analyzes the influence of various initial conditions and parameters on safety information radiation through simulations of the new model. The simulations reveal that this novel approach can reflect the variation of the safety information quantity of each node in the complex network, and that the scale-free network has better "radiation explosive power", while the small-world network has better "radiation staying power". The results also show that it is efficient to improve the overall performance of network security by selecting nodes with high degrees as the information source, refining and simplifying the information, increasing the information second reception capacity and decreasing noise. In a word, this article lays the foundation for further research on the interactions of information and energy between internal components within complex systems.
Using Target Network Modelling to Increase Battlespace Agility
2013-06-01
Moffat, James. (2003) Complexity Theory and Network Centric Warfare. Washington DC: CCRP. Moore, David T. Sensemaking: A Structure for an Intelligence... Ted Hopf's "Promise of Constructivism in International Relations Theory" presented in International Security in 1998; and Adler 1998. Look to... of warfighting within a doctrinal framework. Based on 10 years of research informed by social theory, experimentation, NATO doctrinal studies and
Van Beurden, Eric K; Kia, Annie M; Zask, Avigdor; Dietrich, Uta; Rose, Lauren
2013-03-01
Health promotion addresses issues from the simple (with well-known cause/effect links) to the highly complex (webs and loops of cause/effect with unpredictable, emergent properties). Yet there is no conceptual framework within its theory base to help identify approaches appropriate to the level of complexity. The default approach favours reductionism--the assumption that reducing a system to its parts will inform whole system behaviour. Such an approach can yield useful knowledge, yet is inadequate where issues have multiple interacting causes, such as social determinants of health. To address complex issues, there is a need for a conceptual framework that helps choose action that is appropriate to context. This paper presents the Cynefin Framework, informed by complexity science--the study of Complex Adaptive Systems (CAS). It introduces key CAS concepts and reviews the emergence and implications of 'complex' approaches within health promotion. It explains the framework and its use with examples from contemporary practice, and sets it within the context of related bodies of health promotion theory. The Cynefin Framework, especially when used as a sense-making tool, can help practitioners understand the complexity of issues, identify appropriate strategies and avoid the pitfalls of applying reductionist approaches to complex situations. The urgency to address critical issues such as climate change and the social determinants of health calls for us to engage with complexity science. The Cynefin Framework helps practitioners make the shift, and enables those already engaged in complex approaches to communicate the value and meaning of their work in a system that privileges reductionist approaches.
NASA Astrophysics Data System (ADS)
Li, Shu-Bin; Cao, Dan-Ni; Dang, Wen-Xiu; Zhang, Lin
As a new cross-discipline, complexity science has penetrated every field of the economy and society. With the arrival of big data, research in complexity science has reached a new peak. In recent years, it has offered a new perspective on traffic control through complex network theory. The interaction of various kinds of information in the traffic system forms a huge complex system. A new mesoscopic traffic flow model incorporating variable speed limits (VSL) is developed, and the simulation process is designed based on complex network theory combined with the proposed model. This paper studies the effect of VSL on dynamic traffic flow and then analyzes the optimal VSL control strategy in different network topologies. The conclusions of this research are useful for putting forward reasonable transportation plans and developing effective traffic management and control measures to assist traffic management departments.
Information Resources Usage in Project Management Digital Learning System
ERIC Educational Resources Information Center
Davidovitch, Nitza; Belichenko, Margarita; Kravchenko, Yurii
2017-01-01
The article combines a theoretical approach to structuring knowledge, based on the integrated use of fuzzy semantic network theory predicates, Boolean functions and the theory of complexity of network structures, with some practical aspects to be considered in distance learning at the university. The paper proposes a methodological approach that…
Chandler, Jacqueline; Rycroft-Malone, Jo; Hawkes, Claire; Noyes, Jane
2016-02-01
To examine the application of core concepts from Complexity Theory to explain the findings from a process evaluation undertaken in a trial evaluating implementation strategies for recommendations about reducing surgical fasting times. The proliferation of evidence-based guidance requires a greater focus on its implementation. Theory is required to explain the complex processes across the multiple healthcare organizational levels. This social healthcare context involves the interaction between professionals, patients and the organizational systems in care delivery. Complexity Theory may provide an explanatory framework to explain the complexities inherent in implementation in social healthcare contexts. A secondary thematic analysis of qualitative process evaluation data informed by Complexity Theory. Seminal texts applying Complexity Theory to the social context were annotated, key concepts extracted and core Complexity Theory concepts identified. These core concepts were applied as a theoretical lens to provide an explanation of themes from a process evaluation of a trial evaluating the implementation of strategies to reduce surgical fasting times. Sampled substantive texts provided a representative spread of theoretical development and application of Complexity Theory from the late 1990s to 2013 in social science, healthcare, management and philosophy. Five core Complexity Theory concepts were extracted: 'self-organization', 'interaction', 'emergence', 'system history' and 'temporality'. Application of these concepts suggests routine surgical fasting practice is habituated in the social healthcare system and therefore cannot easily be reversed. A reduction in fasting times requires an incentivised new approach to emerge in the surgical system's priority of completing the operating list. The application of Complexity Theory provides a useful explanation for resistance to change fasting practice. Its utility in implementation research warrants further attention and evaluation. © 2015 John Wiley & Sons Ltd.
Situated learning theory: adding rate and complexity effects via Kauffman's NK model.
Yuan, Yu; McKelvey, Bill
2004-01-01
For many firms, producing information, knowledge, and enhancing learning capability have become the primary basis of competitive advantage. A review of organizational learning theory identifies two approaches: (1) those that treat symbolic information processing as fundamental to learning, and (2) those that view the situated nature of cognition as fundamental. After noting that the former is inadequate because it focuses primarily on behavioral and cognitive aspects of individual learning, this paper argues the importance of studying learning as interactions among people in the context of their environment. It contributes to organizational learning in three ways. First, it argues that situated learning theory is to be preferred over traditional behavioral and cognitive learning theories, because it treats organizations as complex adaptive systems rather than mere information processors. Second, it adds rate and nonlinear learning effects. Third, following model-centered epistemology, it uses an agent-based computational model, in particular a "humanized" version of Kauffman's NK model, to study the situated nature of learning. Using simulation results, we test eight hypotheses extending situated learning theory in new directions. The paper ends with a discussion of possible extensions of the current study to better address key issues in situated learning.
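Kauffman's NK model at the core of the study is compact enough to sketch. The code below is an illustrative implementation (random interaction neighborhoods, greedy one-bit adaptive walks) with arbitrary parameter values, not the "humanized" agent-based model used in the paper.

```python
# NK fitness landscape: each of n loci contributes a fitness value that depends
# on its own state and the states of k randomly chosen other loci; agents do
# adaptive walks that keep only fitness-improving one-bit flips.
import numpy as np

rng = np.random.default_rng(3)

class NKLandscape:
    def __init__(self, n=12, k=3):
        self.n, self.k = n, k
        self.neighbors = [np.append(i, rng.choice([j for j in range(n) if j != i],
                                                   size=k, replace=False))
                          for i in range(n)]
        # One random contribution per local bit configuration of size k+1.
        self.tables = [rng.random(2 ** (k + 1)) for _ in range(n)]

    def fitness(self, genome):
        total = 0.0
        for i in range(self.n):
            bits = genome[self.neighbors[i]]
            idx = int("".join(map(str, bits)), 2)
            total += self.tables[i][idx]
        return total / self.n

def adaptive_walk(landscape, steps=200):
    genome = rng.integers(0, 2, landscape.n)
    fit = landscape.fitness(genome)
    for _ in range(steps):
        trial = genome.copy()
        trial[rng.integers(landscape.n)] ^= 1      # flip one random bit
        trial_fit = landscape.fitness(trial)
        if trial_fit > fit:                         # keep only improving moves
            genome, fit = trial, trial_fit
    return fit

land = NKLandscape(n=12, k=3)
print("final fitness of 5 walkers:", [round(adaptive_walk(land), 3) for _ in range(5)])
```

Larger k makes the landscape more rugged, so independent walkers tend to get stuck on different local peaks; that ruggedness is what the paper exploits to add rate and nonlinearity effects to situated learning.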
Thinking graphically: Connecting vision and cognition during graph comprehension.
Ratwani, Raj M; Trafton, J Gregory; Boehm-Davis, Deborah A
2008-03-01
Task analytic theories of graph comprehension account for the perceptual and conceptual processes required to extract specific information from graphs. Comparatively, the processes underlying information integration have received less attention. We propose a new framework for information integration that highlights visual integration and cognitive integration. During visual integration, pattern recognition processes are used to form visual clusters of information; these visual clusters are then used to reason about the graph during cognitive integration. In 3 experiments, the processes required to extract specific information and to integrate information were examined by collecting verbal protocol and eye movement data. Results supported the task analytic theories for specific information extraction and the processes of visual and cognitive integration for integrative questions. Further, the integrative processes scaled up as graph complexity increased, highlighting the importance of these processes for integration in more complex graphs. Finally, based on this framework, design principles to improve both visual and cognitive integration are described. PsycINFO Database Record (c) 2008 APA, all rights reserved
An information theory criteria based blind method for enumerating active users in DS-CDMA system
NASA Astrophysics Data System (ADS)
Samsami Khodadad, Farid; Abed Hodtani, Ghosheh
2014-11-01
In this paper, a new blind algorithm for active user enumeration in asynchronous direct-sequence code division multiple access (DS-CDMA) in a multipath channel scenario is proposed. The proposed method is based on information theory criteria. There are two main categories of information criteria widely used in active user enumeration: the Akaike Information Criterion (AIC) and the Minimum Description Length (MDL) criterion. The main difference between these two criteria is their penalty functions. Due to this difference, MDL is a consistent enumerator with better performance at higher signal-to-noise ratios (SNR), whereas AIC is preferred at lower SNRs. We therefore propose an SNR-compliant method based on subspace analysis and a training genetic algorithm to attain the performance of both. Moreover, our method uses only a single antenna, in contrast to previous methods, which decreases hardware complexity. Simulation results show that the proposed method is capable of estimating the number of active users without any prior knowledge, and demonstrate the efficiency of the method.
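The AIC and MDL criteria discussed above are classically applied to the eigenvalues of a sample covariance matrix. The sketch below follows that textbook (Wax-Kailath style) formulation rather than the paper's single-antenna, subspace-plus-genetic-algorithm method, and uses a simulated multi-sensor scenario purely for illustration.

```python
# Enumerate sources from covariance eigenvalues with AIC and MDL; the two
# criteria share the likelihood term and differ only in the penalty.
import numpy as np

def enumerate_sources(eigvals, n_snapshots):
    """Return (k_AIC, k_MDL) estimates of the number of sources."""
    lam = np.sort(eigvals)[::-1]
    m = len(lam)
    aic, mdl = [], []
    for k in range(m):
        tail = lam[k:]
        log_ratio = np.sum(np.log(tail)) - (m - k) * np.log(np.mean(tail))
        aic.append(-2 * n_snapshots * log_ratio + 2 * k * (2 * m - k))
        mdl.append(-n_snapshots * log_ratio + 0.5 * k * (2 * m - k) * np.log(n_snapshots))
    return int(np.argmin(aic)), int(np.argmin(mdl))

# Example: 3 sources observed on an 8-sensor array over 500 snapshots.
rng = np.random.default_rng(4)
m, n, k_true = 8, 500, 3
steering = rng.standard_normal((m, k_true)) + 1j * rng.standard_normal((m, k_true))
symbols = rng.standard_normal((k_true, n)) + 1j * rng.standard_normal((k_true, n))
noise = 0.3 * (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n)))
x = steering @ symbols + noise
cov = x @ x.conj().T / n
print(enumerate_sources(np.linalg.eigvalsh(cov), n))
```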
A resource management architecture based on complex network theory in cloud computing federation
NASA Astrophysics Data System (ADS)
Zhang, Zehua; Zhang, Xuejie
2011-10-01
Cloud Computing Federation is a main trend of Cloud Computing. Resource management has a significant effect on the design, realization, and efficiency of Cloud Computing Federation. Cloud Computing Federation has the typical characteristics of a complex system; therefore, we propose a resource management architecture based on complex network theory for Cloud Computing Federation (abbreviated as RMABC) in this paper, with a detailed design of the resource discovery and resource announcement mechanisms. Compared with existing resource management mechanisms in distributed computing systems, a Task Manager in RMABC can use historical information and current state data obtained from other Task Managers for the evolution of the complex network composed of Task Managers, and thus has advantages in resource discovery speed, fault tolerance and adaptive ability. The results of the model experiment confirmed the advantages of RMABC in resource discovery performance.
NASA Astrophysics Data System (ADS)
Doyle, Laurance R.; McCowan, Brenda; Hanser, Sean F.
2002-01-01
Information theory allows a quantification of the complexity of a given signaling system. We are applying information theory to dolphin whistle vocalizations, humpback whale songs, squirrel monkey chuck calls, and several other animal communication systems in order to develop a quantitative and objective way to compare the complexity of communication systems across species. Once signaling units have been correctly classified, the communication system must obey certain statistical distributions in order to contain complexity, whether it is human language, dolphin whistle vocalizations, or even a system of communication signals received from an extraterrestrial source.
ERIC Educational Resources Information Center
Kuhn, John R., Jr.
2009-01-01
Drawing upon the theories of complexity and complex adaptive systems and the Singerian Inquiring System from C. West Churchman's seminal work "The Design of Inquiring Systems" the dissertation herein develops a systems design theory for continuous auditing systems. The dissertation consists of discussion of the two foundational theories,…
Salako, Solomon E
2011-03-01
The desirability of obtaining freely given consent is universally accepted. The point, however, is that there is no unanimity on the definition of informed consent or its application in bioethics. Whether informed consent is based on principlism or casuistry or the virtue theory, the problem is how to handle the ethically complex situation created at the interface between informed consent and social justice under international biomedical instruments. This article will proceed by offering detailed historical and critical analyses of informed consent under the European Convention on Human Rights and Biomedicine 1997 and the UNESCO Universal Declaration on Bioethics and Human Rights 2005. Three conceptions of justice will be utilised to show that the doctrine of informed consent has driven the ethos of research on human beings and shaped the physician-patient relationship; and that casuistry and virtue theory are consistent with, and not rivals of, a principle-based account of informed consent.
Sevick, Mary Ann; Woolf, Kathleen; Mattoo, Aditya; Katz, Stuart D; Li, Huilin; St-Jules, David E; Jagannathan, Ram; Hu, Lu; Pompeii, Mary Lou; Ganguzza, Lisa; Li, Zhi; Sierra, Alex; Williams, Stephen K; Goldfarb, David S
2018-01-01
Patients with complex chronic diseases usually must make multiple lifestyle changes to limit and manage their conditions. Numerous studies have shown that education alone is insufficient for engaging people in lifestyle behavior change, and that theory-based behavioral approaches also are necessary. However, even the most motivated individual may have difficulty with making lifestyle changes because of the information complexity associated with multiple behavior changes. The goal of the current Healthy Hearts and Kidneys study was to evaluate different mobile health (mHealth)-delivered intervention approaches for engaging individuals with type 2 diabetes (T2D) and concurrent chronic kidney disease (CKD) in behavior changes. Participants were randomized to 1 of 4 groups, receiving: (1) behavioral counseling, (2) technology-based self-monitoring to reduce information complexity, (3) combined behavioral counseling and technology-based self-monitoring, or (4) baseline advice. We will determine the impact of randomization assignment on weight loss success and 24-hour urinary excretion of sodium and phosphorus. With this report we describe the study design, methods, and approaches used to assure information security for this ongoing clinical trial. ClinicalTrials.gov Identifier: NCT02276742. Copyright © 2017. Published by Elsevier Inc.
Goel, Nidhi; Singh, Udai P
2013-10-10
Four new acid-base complexes using picric acid [(OH)(NO2)3C6H2] (PA) and N-heterocyclic bases (1,10-phenanthroline (phen)/2,2';6',2"-terpyridine (terpy)/hexamethylenetetramine (hmta)/2,4,6-tri(2-pyridyl)-1,3,5-triazine (tptz)) were prepared and characterized by elemental analysis, IR, NMR and X-ray crystallography. The crystal structures provide detailed information on the noncovalent interactions present in the different complexes. The optimized structures of the complexes were calculated using density functional theory. The thermolysis of these complexes was investigated by TG-DSC and ignition delay measurements. Model-free isoconversional and model-fitting kinetic approaches were applied to the isothermal TG data to investigate the kinetics of thermal decomposition of these complexes.
Li, Jingchao; Cao, Yunpeng; Ying, Yulong; Li, Shuying
2016-01-01
Bearing failure is one of the dominant causes of failure and breakdowns in rotating machinery, leading to huge economic loss. Aiming at the nonstationary and nonlinear characteristics of bearing vibration signals as well as the complexity of condition-indicating information distribution in the signals, a novel rolling element bearing fault diagnosis method based on multifractal theory and gray relation theory was proposed in the paper. Firstly, a generalized multifractal dimension algorithm was developed to extract the characteristic vectors of fault features from the bearing vibration signals, which can offer more meaningful and distinguishing information reflecting different bearing health status in comparison with conventional single fractal dimension. After feature extraction by multifractal dimensions, an adaptive gray relation algorithm was applied to implement an automated bearing fault pattern recognition. The experimental results show that the proposed method can identify various bearing fault types as well as severities effectively and accurately. PMID:28036329
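The two ingredients of the method, generalized (multifractal) dimensions and grey relational matching, can be sketched as follows. This is an illustration on synthetic signals with assumed scales and parameters, not the authors' diagnosis pipeline.

```python
# Generalized (Renyi) dimensions D(q) estimated from a box-counting partition of
# a signal's amplitude measure, plus a grey relational grade for matching a test
# feature vector against a reference template.
import numpy as np

def generalized_dimensions(signal, q_values, scales=(4, 8, 16, 32, 64)):
    measure = np.abs(signal) / np.sum(np.abs(signal))     # normalized "mass"
    dims = []
    for q in q_values:
        logs_eps, logs_z = [], []
        for n_boxes in scales:
            p = np.array([b.sum() for b in np.array_split(measure, n_boxes)])
            p = p[p > 0]
            if abs(q - 1.0) < 1e-9:
                log_z = np.sum(p * np.log(p))             # q -> 1 limit (information dimension)
            else:
                log_z = np.log(np.sum(p ** q)) / (q - 1.0)
            logs_eps.append(np.log(1.0 / n_boxes))
            logs_z.append(log_z)
        slope, _ = np.polyfit(logs_eps, logs_z, 1)        # D(q) from the scaling slope
        dims.append(slope)
    return np.array(dims)

def grey_relational_grade(test, reference, rho=0.5):
    delta = np.abs(test - reference)
    coeff = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
    return coeff.mean()

rng = np.random.default_rng(5)
healthy = rng.standard_normal(4096)
faulty = healthy + 0.8 * np.sign(np.sin(np.linspace(0, 400 * np.pi, 4096)))  # crude impulsive fault proxy
qs = np.arange(-2, 5)
f_healthy = generalized_dimensions(healthy, qs)
f_faulty = generalized_dimensions(faulty, qs)
print("grade vs healthy template:", round(grey_relational_grade(f_faulty, f_healthy), 3))
```

In a diagnosis setting, the test vector would be compared against one template per fault class and the class with the highest grade selected.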
Information Security Analysis Using Game Theory and Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schlicher, Bob G; Abercrombie, Robert K
Information security analysis can be performed using game theory implemented in dynamic simulations of Agent Based Models (ABMs). Such simulations can be verified with the results from game theory analysis and further used to explore larger scale, real world scenarios involving multiple attackers, defenders, and information assets. Our approach addresses imperfect information and scalability, which allows us to address previous limitations of current stochastic game models. Such models consider only perfect information, assuming that the defender is always able to detect attacks, that the state transition probabilities are fixed before the game, and that the players' actions are always synchronous; moreover, most models are not scalable with the size and complexity of the systems under consideration. Our use of ABMs yields results of selected experiments that demonstrate our proposed approach and provides a quantitative measure for realistic information systems and their related security scenarios.
NASA Astrophysics Data System (ADS)
Brier, Soren
2003-06-01
It is argued that a true transdisciplinary information science, going from physical information to phenomenological understanding, needs a metaphysical framework. Three different kinds of causality are implied: efficient, formal and final. And at least five different levels of existence are needed: 1. The quantum vacuum fields with entangled causation. 2. The physical level with its energy- and force-based efficient causation. 3. The informational-chemical level with its formal causation based on pattern fitting. 4. The biological-semiotic level with its non-conscious final causation and 5. The social-linguistic level of self-consciousness with its conscious goal-oriented final causation. To integrate these consistently in an evolutionary theory as emergent levels, neither mechanical determinism nor complexity theory is sufficient, because they cannot be a foundation for a theory of lived meaning. C. S. Peirce's triadic semiotic philosophy, combined with a cybernetic and systemic view like N. Luhmann's, could create the framework I call Cybersemiotics.
Hydrated Cations in the General Chemistry Course.
ERIC Educational Resources Information Center
Kauffman, George B.; Baxter, John F., Jr.
1981-01-01
Presents selected information regarding the descriptive chemistry of the common metal ions and their compounds, including the concepts of process of solution, polar molecules, ionic size and charge, complex ions, coordination number, and the Bronsted-Lowry acid-base theory. (CS)
Arts-Based Learning: A New Approach to Nursing Education Using Andragogy.
Nguyen, Megan; Miranda, Joyal; Lapum, Jennifer; Donald, Faith
2016-07-01
Learner-oriented strategies focusing on learning processes are needed to prepare nursing students for complex practice situations. An arts-based learning approach uses art to nurture cognitive and emotional learning. Knowles' theory of andragogy aims to develop the skill of learning and can inform the process of implementing arts-based learning. This article explores the use and evaluation of andragogy-informed arts-based learning for teaching nursing theory at the undergraduate level. Arts-based learning activities were implemented and then evaluated by students and instructors using anonymous questionnaires. Most students reported that the activities promoted learning. All instructors indicated an interest in integrating arts-based learning into the curricula. Facilitators and barriers to mainstreaming arts-based learning were highlighted. Findings stimulate implications for prospective research and education. Findings suggest that arts-based learning approaches enhance learning by supporting deep inquiry and different learning styles. Further exploration of andragogy-informed arts-based learning in nursing and other disciplines is warranted. [J Nurs Educ. 2016;55(7):407-410.]. Copyright 2016, SLACK Incorporated.
Spectral Entropies as Information-Theoretic Tools for Complex Network Comparison
NASA Astrophysics Data System (ADS)
De Domenico, Manlio; Biamonte, Jacob
2016-10-01
Any physical system can be viewed from the perspective that information is implicitly represented in its state. However, the quantification of this information when it comes to complex networks has remained largely elusive. In this work, we use techniques inspired by quantum statistical mechanics to define an entropy measure for complex networks and to develop a set of information-theoretic tools, based on network spectral properties, such as Rényi q entropy, generalized Kullback-Leibler and Jensen-Shannon divergences, the latter allowing us to define a natural distance measure between complex networks. First, we show that by minimizing the Kullback-Leibler divergence between an observed network and a parametric network model, inference of model parameter(s) by means of maximum-likelihood estimation can be achieved and model selection can be performed with appropriate information criteria. Second, we show that the information-theoretic metric quantifies the distance between pairs of networks and we can use it, for instance, to cluster the layers of a multilayer system. By applying this framework to networks corresponding to sites of the human microbiome, we perform hierarchical cluster analysis and recover with high accuracy existing community-based associations. Our results imply that spectral-based statistical inference in complex networks results in demonstrably superior performance as well as a conceptual backbone, filling a gap towards a network information theory.
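A minimal sketch of the spectral quantities described above is given below, assuming small undirected graphs, a density matrix of the form exp(-beta L) normalized by its trace with beta = 1, and entropies measured in bits.

```python
# Von Neumann (spectral) entropy of a network density matrix and the
# Jensen-Shannon distance between two networks built from their Laplacians.
import numpy as np

def density_matrix(adjacency, beta=1.0):
    laplacian = np.diag(adjacency.sum(axis=1)) - adjacency
    vals, vecs = np.linalg.eigh(laplacian)
    rho = vecs @ np.diag(np.exp(-beta * vals)) @ vecs.T
    return rho / np.trace(rho)

def von_neumann_entropy(rho):
    eigs = np.linalg.eigvalsh(rho)
    eigs = eigs[eigs > 1e-12]
    return -np.sum(eigs * np.log2(eigs))

def js_distance(rho, sigma):
    mix = 0.5 * (rho + sigma)
    jsd = von_neumann_entropy(mix) - 0.5 * (von_neumann_entropy(rho) + von_neumann_entropy(sigma))
    return np.sqrt(max(jsd, 0.0))

rng = np.random.default_rng(6)

def random_graph(n, p):
    a = np.triu((rng.random((n, n)) < p).astype(float), 1)
    return a + a.T

g1, g2 = random_graph(50, 0.1), random_graph(50, 0.4)
r1, r2 = density_matrix(g1), density_matrix(g2)
print("entropies:", round(von_neumann_entropy(r1), 3), round(von_neumann_entropy(r2), 3))
print("JS distance:", round(js_distance(r1, r2), 3))
```

The pairwise distance matrix produced this way can be fed to any standard hierarchical clustering routine, which is how the abstract describes grouping the layers of a multilayer system.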
Gonzalez Bernaldo de Quiros, Fernan; Dawidowski, Adriana R; Figar, Silvana
2017-02-01
In this study, we aimed: 1) to conceptualize the theoretical challenges facing health information systems (HIS) to represent patients' decisions about health and medical treatments in everyday life; 2) to suggest approaches for modeling these processes. The conceptualization of the theoretical and methodological challenges was discussed in 2015 during a series of interdisciplinary meetings attended by health informatics staff, epidemiologists and health professionals working in quality management and primary and secondary prevention of chronic diseases of the Hospital Italiano de Buenos Aires, together with sociologists, anthropologists and e-health stakeholders. HIS are facing the need and challenge to represent social human processes based on constructivist and complexity theories, which are the current frameworks of human sciences for understanding human learning and socio-cultural changes. Computer systems based on these theories can model processes of social construction of concrete and subjective entities and the interrelationships between them. These theories could be implemented, among other ways, through the mapping of health assets, analysis of social impact through community trials and modeling of complexity with system simulation tools. This analysis suggested the need to complement the traditional linear causal explanations of disease onset (and treatments) that are the bases for models of analysis of HIS with constructivist and complexity frameworks. Both may enlighten the complex interrelationships among patients, health services and the health system. The aim of this strategy is to clarify people's decision making processes to improve the efficiency, quality and equity of the health services and the health system.
Maidment, Ian; Booth, Andrew; Mullan, Judy; McKeown, Jane; Bailey, Sylvia; Wong, Geoffrey
2017-07-03
Medication-related adverse events have been estimated to be responsible for 5700 deaths and cost the UK £750 million annually. This burden falls disproportionately on older people. Outcomes from interventions to optimise medication management are caused by multiple context-sensitive mechanisms. The MEdication Management in Older people: REalist Approaches BAsed on Literature and Evaluation (MEMORABLE) project uses realist synthesis to understand how, why, for whom and in what context interventions, to improve medication management in older people on complex medication regimes residing in the community, work. This realist synthesis uses secondary data and primary data from interviews to develop the programme theory. A realist logic of analysis will synthesise data both within and across the two data sources to inform the design of a complex intervention(s) to help improve medication management in older people. 1. Literature review The review (using realist synthesis) contains five stages to develop an initial programme theory to understand why processes are more or less successful and under which situations: focussing of the research question; developing the initial programme theory; developing the search strategy; selection and appraisal based on relevance and rigour; and data analysis/synthesis to develop and refine the programme theory and context, intervention and mechanism configurations. 2. Realist interviews Realist interviews will explore and refine our understanding of the programme theory developed from the realist synthesis. Up to 30 older people and their informal carers (15 older people with multi-morbidity, 10 informal carers and 5 older people with dementia), and 20 care staff will be interviewed. 3. Developing framework for the intervention(s) Data from the realist synthesis and interviews will be used to refine the programme theory for the intervention(s) to identify: the mechanisms that need to be 'triggered', and the contexts related to these mechanisms. Intervention strategies that change the contexts so the mechanisms are triggered to produce desired outcomes will be developed. Feedback on these strategies will be obtained. This realist synthesis aims to develop a framework (underpinned by our programme theory) for a novel multi-disciplinary, multi-agency intervention(s), to improve medication management in community-dwelling older people on complex medication regimens. PROSPERO CRD42016043506.
Quantum Information Biology: From Theory of Open Quantum Systems to Adaptive Dynamics
NASA Astrophysics Data System (ADS)
Asano, Masanari; Basieva, Irina; Khrennikov, Andrei; Ohya, Masanori; Tanaka, Yoshiharu; Yamato, Ichiro
This chapter reviews quantum(-like) information biology (QIB). Here biology is treated widely as even covering cognition and its derivatives: psychology and decision making, sociology, and behavioral economics and finances. QIB provides an integrative description of information processing by bio-systems at all scales of life: from proteins and cells to cognition, ecological and social systems. Mathematically QIB is based on the theory of adaptive quantum systems (which also covers open quantum systems). Ideologically QIB is based on the quantum-like (QL) paradigm: complex bio-systems process information in accordance with the laws of quantum information and probability. This paradigm is supported by plenty of statistical bio-data collected at all bio-scales. QIB reflects two fundamental principles: a) adaptivity; and b) openness (bio-systems are fundamentally open). In addition, quantum adaptive dynamics provides the most general possible mathematical representation of these principles.
NASA Astrophysics Data System (ADS)
Li, Weiyao; Huang, Guanhua; Xiong, Yunwu
2016-04-01
The complexity of the spatial structure of porous media and the randomness of groundwater recharge and discharge (rainfall, runoff, etc.) make groundwater movement complex, and physical and chemical interactions between groundwater and porous media make solute transport in the medium even more complicated. An appropriate method to describe this complexity is essential when studying solute transport and conversion in porous media. Information entropy can measure uncertainty and disorder; we therefore attempted to investigate complexity and to explore the connection between information entropy and the complexity of solute transport in heterogeneous porous media using information entropy theory. Based on Markov theory, a two-dimensional stochastic field of hydraulic conductivity (K) was generated by transition probability. Flow and solute transport models were established under four conditions (instantaneous point source, continuous point source, instantaneous line source and continuous line source). The spatial and temporal complexity of the solute transport process was characterized and evaluated using spatial moments and information entropy. Results indicated that the entropy increased as the complexity of the solute transport process increased. For the point source, the one-dimensional entropy of solute concentration increased at first and then decreased along the X and Y directions. As time increased, the entropy peak value remained basically unchanged, while the peak position migrated along the flow direction (X direction) and approximately coincided with the centroid position. With increasing time, the spatial variability and complexity of solute concentration increase, which results in increases of the second-order spatial moment and the two-dimensional entropy. The information entropy of the line source was higher than that of the point source, and the solute entropy obtained from continuous input was higher than that from instantaneous input. As the average length of lithofacies increased, media continuity increased, the complexity of flow and solute transport weakened, and the corresponding information entropy also decreased. Longitudinal macrodispersivity declined slightly at early times and then rose. The solute spatial and temporal distribution had significant impacts on the information entropy, and information entropy could reflect changes in the solute distribution. Information entropy thus appears to be a tool for characterizing the spatial and temporal complexity of solute migration and provides a reference for future research.
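The combination of spatial moments and information entropy described above can be illustrated on a synthetic two-dimensional Gaussian plume; the sketch below is not the paper's stochastic-field simulations, and the geometry and dispersivities are arbitrary.

```python
# Spatial moments and information entropy of a synthetic advecting-dispersing plume.
import numpy as np

def plume(nx=100, ny=60, t=1.0, velocity=0.5, disp_l=0.8, disp_t=0.2):
    x, y = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
    x0, y0 = 10 + velocity * t, ny / 2              # advected centroid
    return np.exp(-((x - x0) ** 2) / (4 * disp_l * t) - ((y - y0) ** 2) / (4 * disp_t * t))

def spatial_moments(c):
    x, y = np.meshgrid(np.arange(c.shape[0]), np.arange(c.shape[1]), indexing="ij")
    mass = c.sum()
    xc, yc = (x * c).sum() / mass, (y * c).sum() / mass
    sxx = ((x - xc) ** 2 * c).sum() / mass          # second central moment along flow
    return xc, yc, sxx

def field_entropy(c):
    p = (c / c.sum()).ravel()
    p = p[p > 1e-15]
    return -np.sum(p * np.log2(p))

for t in (5.0, 20.0, 80.0):
    c = plume(t=t)
    xc, yc, sxx = spatial_moments(c)
    print(f"t={t:5.1f}  centroid_x={xc:6.2f}  Sxx={sxx:7.2f}  entropy={field_entropy(c):6.2f} bits")
```

As the plume spreads, both the second moment and the field entropy grow while the centroid migrates downstream, which mirrors the qualitative behavior reported in the abstract.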
Implementation of Complexity Analyzing Based on Additional Effect
NASA Astrophysics Data System (ADS)
Zhang, Peng; Li, Na; Liang, Yanhong; Liu, Fang
According to Complexity Theory, there is complexity in a system when a functional requirement is not satisfied. Several studies have addressed Complexity Theory based on Axiomatic Design. However, they focus on reducing complexity, and none focuses on a method for analyzing the complexity in a system. Therefore, this paper puts forth a method of analyzing complexity that seeks to make up for this deficiency in the research. In order to discuss the method of analyzing complexity based on additional effect, this paper puts forth two concepts: ideal effect and additional effect. The method of analyzing complexity based on additional effect combines Complexity Theory with the Theory of Inventive Problem Solving (TRIZ). It helps designers analyze complexity by using additional effects. A case study shows the application of the process.
Managing for resilience: an information theory-based approach to assessing ecosystems
Ecosystems are complex and multivariate; hence, methods to assess the dynamics of ecosystems should have the capacity to evaluate multiple indicators simultaneously. Most research on identifying leading indicators of regime shifts has focused on univariate methods and simple mod...
2007-03-01
partners for their mutual benefit. Unfortunately, based on government reports, FEMA did not have adequate control of its supply chain information ... is one attractor. “Edge of chaos” systems have two to eight attractors, and chaotic systems have many attractors. Some are called strange attractors ... investigates whether chaos theory, part of complexity science, can extract information from Katrina contracting data to help managers make better logistics
Low-complexity video encoding method for wireless image transmission in capsule endoscope.
Takizawa, Kenichi; Hamaguchi, Kiyoshi
2010-01-01
This paper presents a low-complexity video encoding method applicable to wireless image transmission in capsule endoscopes. The encoding method is based on Wyner-Ziv theory, in which information correlated with the source is exploited as side information at the receiver rather than at the transmitter. Complex processes in video encoding, such as motion-vector estimation, are therefore moved to the receiver side, which has a larger-capacity battery. As a result, the encoding process reduces to decimating the original data that have been coded through channel coding. We provide a performance evaluation of a low-density parity-check (LDPC) coding method in the AWGN channel.
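A minimal sketch of the syndrome-based compression idea behind Wyner-Ziv style coding (illustrative only; the paper's actual LDPC code, rate, and channel model are not reproduced, and the (7,4) Hamming code below is our own toy choice): the encoder transmits only the syndrome of the source bits with respect to a parity-check matrix, and the receiver recovers the bits by combining the syndrome with correlated side information.

    import numpy as np

    # Parity-check matrix of the (7,4) Hamming code: column j is the binary representation of j+1
    H = np.array([[1, 0, 1, 0, 1, 0, 1],
                  [0, 1, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])

    x = np.array([1, 0, 1, 1, 0, 0, 1])   # 7 source bits (e.g., one bit-plane of a frame block)
    s = H @ x % 2                          # encoder output: only these 3 syndrome bits are sent

    y = x.copy(); y[4] ^= 1                # receiver's side information: x corrupted in one position
    pos = (H @ y % 2 + s) % 2              # H(y + x) = H e points to the error column
    error_index = int(pos[0] + 2 * pos[1] + 4 * pos[2]) - 1
    x_hat = y.copy()
    if error_index >= 0:
        x_hat[error_index] ^= 1
    print(np.array_equal(x_hat, x))        # True: x recovered from 3 bits plus side information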
Guan, Jun; Xu, Xiaoyu; Wu, Shan; Xing, Lizhi
2018-01-01
The input-output table describes national economic systems comprehensively and in detail, with abundant economic relationships that contain supply and demand information among industrial sectors. Complex network theory, a method for measuring the structure of a complex system, can depict the structural characteristics of the object under study by measuring structural indicators of social and economic systems, revealing the complex relationships between internal hierarchies and external economic functions. In this paper, the functions of industrial sectors in the global value chain are distinguished using bipartite graph theory, and inter-sector competitive relationships are extracted through a resource allocation process. Furthermore, quantitative analysis indices are proposed from a complex network perspective and used to simulate the variation tendencies of economies' status under different trade situations. Finally, a new econophysics analytical framework for international trade is established.
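A minimal sketch of a resource-allocation projection that extracts one-mode (sector-to-sector) relationships from a bipartite graph (our own illustration of the generic technique; the paper's actual input-output construction and index definitions are not given in the abstract):

    import numpy as np

    # Toy bipartite incidence matrix B: rows = industrial sectors, columns = products they supply
    B = np.array([[1, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 1, 1]], dtype=float)

    k_sector = B.sum(axis=1)    # sector degrees
    k_product = B.sum(axis=0)   # product degrees

    # One-mode projection by resource allocation: sector j spreads one unit of resource evenly
    # over its products, and each product spreads what it received evenly over its sectors.
    # W[i, j] is the share of sector j's resource that ends up at sector i.
    W = (B / k_product[None, :]) @ B.T / k_sector[None, :]
    print(W)
    print(W.sum(axis=0))        # each column sums to 1: the unit resource is fully redistributed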
Fuzzy connectedness and object definition
NASA Astrophysics Data System (ADS)
Udupa, Jayaram K.; Samarasekera, Supun
1995-04-01
Approaches to object information extraction from images should attempt to use the fact that images are fuzzy. In past image segmentation research, the notion of `hanging togetherness' of image elements specified by their fuzzy connectedness has been lacking. We present a theory of fuzzy objects for n-dimensional digital spaces based on a notion of fuzzy connectedness of image elements. Although our definitions lead to problems of enormous combinatorial complexity, the theoretical results allow us to reduce this dramatically. We demonstrate the utility of the theory and algorithms in image segmentation based on several practical examples.
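A minimal sketch of the core computation in fuzzy connectedness, under the standard definitions rather than the authors' implementation (the affinity values below are toy numbers): the strength of a path is the smallest pairwise affinity along it, the connectedness of a pixel to a seed is the largest strength over all paths, and it can be computed with a Dijkstra-style max-min propagation.

    import heapq

    def fuzzy_connectedness(affinity, seed):
        # affinity: dict mapping node -> list of (neighbor, mu) with mu in [0, 1].
        # Returns the max-min path strength from `seed` to every node.
        conn = {node: 0.0 for node in affinity}
        conn[seed] = 1.0
        heap = [(-1.0, seed)]                      # max-heap via negated strengths
        while heap:
            neg_s, u = heapq.heappop(heap)
            s = -neg_s
            if s < conn[u]:
                continue                           # stale heap entry
            for v, mu in affinity[u]:
                strength = min(s, mu)              # path strength = weakest link
                if strength > conn[v]:
                    conn[v] = strength
                    heapq.heappush(heap, (-strength, v))
        return conn

    # Toy 4-pixel image graph
    aff = {'a': [('b', 0.9), ('c', 0.2)],
           'b': [('a', 0.9), ('d', 0.7)],
           'c': [('a', 0.2), ('d', 0.4)],
           'd': [('b', 0.7), ('c', 0.4)]}
    print(fuzzy_connectedness(aff, 'a'))   # d reaches 0.7 via a-b-d rather than 0.2 via a-c-d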
Prompt comprehension in UNIX command production.
Doane, S M; McNamara, D S; Kintsch, W; Polson, P G; Clawson, D M
1992-07-01
We hypothesize that a cognitive analysis based on the construction-integration theory of comprehension (Kintsch, 1988) can predict what is difficult about generating complex composite commands in the UNIX operating system. We provide empirical support for assumptions of the Doane, Kintsch, and Polson (1989, 1990) construction-integration model for generating complex commands in UNIX. We asked users whose UNIX experience varied to produce complex UNIX commands, and then provided help prompts whenever the commands that they produced were erroneous. The help prompts were designed to assist subjects with respect to both the knowledge and the memory processes that our UNIX modeling efforts have suggested are lacking in less expert users. First, experts appear to respond to different prompts than do novices: expert performance is helped by the presentation of abstract information, whereas novice and intermediate performance is modified by the presentation of concrete information. Second, while the presentation of specific prompts helps less expert subjects, it does not provide sufficient information to produce correct performance. Our analyses suggest that information about the ordering of commands is required to help the less expert with both knowledge and memory load problems in a manner consistent with skill acquisition theories.
Advancing Models and Theories for Digital Behavior Change Interventions.
Hekler, Eric B; Michie, Susan; Pavel, Misha; Rivera, Daniel E; Collins, Linda M; Jimison, Holly B; Garnett, Claire; Parral, Skye; Spruijt-Metz, Donna
2016-11-01
To be suitable for informing digital behavior change interventions, theories and models of behavior change need to capture individual variation and changes over time. The aim of this paper is to provide recommendations for the development of models and theories that are informed by, and can inform, digital behavior change interventions, based on discussions by international experts, including behavioral, computer, and health scientists and engineers. The proposed framework stipulates the use of a state-space representation to define when, where, for whom, and in what state for that person an intervention will produce a targeted effect. The "state" is that of the individual based on multiple variables that define the "space" in which a mechanism of action may produce the effect. A state-space representation can be used to help guide theorizing and identify cross-disciplinary methodologic strategies for improving measurement, experimental design, and analysis that can feasibly match the complexity of real-world behavior change via digital behavior change interventions. Copyright © 2016 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
An Additive Definition of Molecular Complexity.
Böttcher, Thomas
2016-03-28
A framework for molecular complexity is established that is based on information theory and consistent with chemical knowledge. The resulting complexity index Cm is derived from abstracting the information content of a molecule by the degrees of freedom in the microenvironments on a per-atom basis, allowing the molecular complexity to be calculated in a simple and additive way. This index allows the complexity of any molecule to be universally assessed and is sensitive to stereochemistry, heteroatoms, and symmetry. The performance of this complexity index is evaluated and compared against the current state of the art. Its additive character gives consistent values also for very large molecules and supports direct comparisons of chemical reactions. Finally, this approach may provide a useful tool for medicinal chemistry in drug design and lead selection, as demonstrated by correlating molecular complexities of antibiotics with compound-specific parameters.
Noyes, Jane; Brenner, Maria; Fox, Patricia; Guerin, Ashleigh
2014-05-01
To report a novel review to develop a health systems model of successful transition of children with complex healthcare needs from hospital to home. Children with complex healthcare needs commonly experience an expensive, ineffectual and prolonged nurse-led discharge process. Children gain no benefit from prolonged hospitalization and are exposed to significant harm. Research to enable intervention development and process evaluation across the entire health system is lacking. Design: novel mixed-method integrative review informed by health systems theory. Data sources: CINAHL, PsychInfo, EMBASE, PubMed, citation searching, personal contact. Review methods: informed by consultation with experts. English language studies, opinion/discussion papers reporting research, best practice and experiences of children, parents and healthcare professionals and purposively selected policies/guidelines from 2002-December 2012 were abstracted using Framework synthesis, followed by iterative theory development. Seven critical factors derived from thirty-four sources across five health system levels explained successful discharge (new programme theory). All seven factors are required in an integrated care pathway, with a dynamic communication loop to facilitate effective discharge (new programme logic). Current health system responses were frequently static and critical success factors were commonly absent, thereby explaining ineffectual discharge. The novel evidence-based model, which reconceptualizes 'discharge' as a highly complex longitudinal health system intervention, makes a significant contribution to global knowledge to drive practice development. Research is required to develop process and outcome measures at different time points in the discharge process and future trials are needed to determine the effectiveness of integrated health system discharge models. © 2013 John Wiley & Sons Ltd.
An application of information theory to stochastic classical gravitational fields
NASA Astrophysics Data System (ADS)
Angulo, J.; Angulo, J. C.; Angulo, J. M.
2018-06-01
The objective of this study is to incorporate concepts developed in information theory (entropy, complexity, etc.) in order to quantify the variation of the uncertainty associated with a stochastic physical system residing in a spatiotemporal region. As an example of application, a relativistic classical gravitational field has been considered, with a stochastic behavior resulting from the effect induced by one or several external perturbation sources. One of the key concepts of the study is the covariance kernel between two points within the chosen region. Using this concept and the appropriate criteria, a methodology is proposed to evaluate the change of uncertainty at a given spatiotemporal point, based on available information and efficiently applying the diverse methods that information theory provides. For illustration, a stochastic version of the Einstein equation with an added Gaussian Langevin term is analyzed.
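As a rough illustration of how a covariance kernel quantifies uncertainty (our own sketch; the paper's criteria and relativistic setting are not reproduced), the differential entropy of a zero-mean Gaussian field observed at n spatiotemporal points with covariance matrix Sigma is h = (1/2) ln[(2*pi*e)^n det(Sigma)]:

    import numpy as np

    def gaussian_field_entropy(points, kernel):
        # Differential entropy (nats) of a zero-mean Gaussian field at the given points.
        n = len(points)
        cov = np.array([[kernel(p, q) for q in points] for p in points])
        _, logdet = np.linalg.slogdet(cov)
        return 0.5 * (n * np.log(2 * np.pi * np.e) + logdet)

    # Squared-exponential covariance kernel over spacetime points (t, x)
    kernel = lambda p, q: np.exp(-np.sum((np.array(p) - np.array(q))**2) / 2.0)

    pts = [(0.0, 0.0), (0.5, 0.0), (1.0, 1.0)]
    print(gaussian_field_entropy(pts, kernel))   # uncertainty of the field restricted to these points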
The future (and past) of quantum theory after the Higgs boson: a quantum-informational viewpoint.
Plotnitsky, Arkady
2016-05-28
Taking as its point of departure the discovery of the Higgs boson, this article considers quantum theory, including quantum field theory, which predicted the Higgs boson, through the combined perspective of quantum information theory and the idea of technology, while also adopting a non-realist interpretation, in 'the spirit of Copenhagen', of quantum theory and quantum phenomena themselves. The article argues that the 'events' in question in fundamental physics, such as the discovery of the Higgs boson (a particularly complex and dramatic, but not essentially different, case), are made possible by the joint workings of three technologies: experimental technology, mathematical technology and, more recently, digital computer technology. The article will consider the role of and the relationships among these technologies, focusing on experimental and mathematical technologies, in quantum mechanics (QM), quantum field theory (QFT) and finite-dimensional quantum theory, with which quantum information theory has been primarily concerned thus far. It will do so, in part, by reassessing the history of quantum theory, beginning with Heisenberg's discovery of QM, in quantum-informational and technological terms. This history, the article argues, is defined by the discoveries of increasingly complex configurations of observed phenomena and the emergence of the increasingly complex mathematical formalism accounting for these phenomena, culminating in the standard model of elementary-particle physics, defining the current state of QFT. © 2016 The Author(s).
ERIC Educational Resources Information Center
Calhoun, Shawn P.
2012-01-01
Information literacy is a complex knowledge domain. Cognitive processing theory describes the effects an instructional subject and the learning environment have on working memory. Essential processing is one component of cognitive processing theory that explains the inherent complexity of knowledge domains such as information literacy. Prior…
The syntactic complexity of Russian relative clauses
Fedorenko, Evelina; Gibson, Edward
2012-01-01
Although syntactic complexity has been investigated across dozens of studies, the available data still greatly underdetermine relevant theories of processing difficulty. Memory-based and expectation-based theories make opposite predictions regarding the fine-grained time course of processing difficulty in syntactically constrained contexts, and each class of theory receives support from results on some constructions in some languages. Here we report four self-paced reading experiments on the online comprehension of Russian relative clauses together with related corpus studies, taking advantage of Russian’s flexible word order to disentangle predictions of competing theories. We find support for key predictions of memory-based theories in reading times at RC verbs, and for key predictions of expectation-based theories in processing difficulty at RC-initial accusative noun phrase (NP) objects, which corpus data suggest should be highly unexpected. These results suggest that a complete theory of syntactic complexity must integrate insights from both expectation-based and memory-based theories. PMID:24711687
Symmetry, Contingency, Complexity: Accommodating Uncertainty in Public Relations Theory.
ERIC Educational Resources Information Center
Murphy, Priscilla
2000-01-01
Explores the potential of complexity theory as a unifying theory in public relations, where scholars have recently raised problems involving flux, uncertainty, adaptiveness, and loss of control. Describes specific complexity-based methodologies and their potential for public relations studies. Offers an account of complexity theory, its…
NASA Astrophysics Data System (ADS)
Karakatsanis, L. P.; Pavlos, G. P.; Iliopoulos, A. C.; Pavlos, E. G.; Clark, P. M.; Duke, J. L.; Monos, D. S.
2018-09-01
This study combines two independent domains of science, the high throughput DNA sequencing capabilities of Genomics and complexity theory from Physics, to assess the information encoded by the different genomic segments of exonic, intronic and intergenic regions of the Major Histocompatibility Complex (MHC) and identify possible interactive relationships. The dynamic and non-extensive statistical characteristics of two well characterized MHC sequences from the homozygous cell lines, PGF and COX, in addition to two other genomic regions of comparable size, used as controls, have been studied using the reconstructed phase space theorem and the non-extensive statistical theory of Tsallis. The results reveal similar non-linear dynamical behavior with respect to complexity and self-organization features. In particular, the low-dimensional deterministic nonlinear chaotic and non-extensive statistical character of the DNA sequences was verified, with strong multifractal characteristics and long-range correlations. The nonlinear indices repeatedly verified that MHC sequences, whether exonic, intronic or intergenic, include varying levels of information and reveal an interaction of the genes with intergenic regions, whereby the lower the number of genes in a region, the less the complexity and information content of the intergenic region. Finally, we showed the significance of the intergenic regions in the production of the DNA dynamics. The findings reveal interesting information content in all three genomic elements and interactive relationships of the genes with the intergenic regions. The results most likely are relevant to the whole genome and not only to the MHC. These findings are consistent with the ENCODE project, which has now established that the non-coding regions of the genome remain to be of relevance, as they are functionally important and play a significant role in the regulation of expression of genes and coordination of the many biological processes of the cell.
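A minimal sketch of the non-extensive (Tsallis) entropy and the phase-space reconstruction step used in this kind of analysis (illustrative only; the study's symbolization, q values, and embedding parameters are not given in the abstract): S_q = (1 - sum_i p_i^q)/(q - 1), which recovers the Shannon entropy as q tends to 1, together with a simple time-delay embedding.

    import numpy as np

    def tsallis_entropy(p, q):
        # Non-extensive Tsallis entropy of a probability vector p.
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        if np.isclose(q, 1.0):
            return -np.sum(p * np.log(p))          # Shannon limit as q -> 1
        return (1.0 - np.sum(p ** q)) / (q - 1.0)

    def delay_embed(x, dim, tau):
        # Reconstruct a phase space from a scalar series via time-delay embedding.
        x = np.asarray(x, dtype=float)
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

    rng = np.random.default_rng(1)
    seq = rng.integers(0, 4, size=1000)            # toy 4-symbol "sequence"
    p = np.bincount(seq, minlength=4) / len(seq)
    print(tsallis_entropy(p, 1.0), tsallis_entropy(p, 1.8))
    print(delay_embed(np.sin(np.linspace(0, 20, 200)), dim=3, tau=5).shape)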
Information theory, animal communication, and the search for extraterrestrial intelligence
NASA Astrophysics Data System (ADS)
Doyle, Laurance R.; McCowan, Brenda; Johnston, Simon; Hanser, Sean F.
2011-02-01
We present ongoing research in the application of information theory to animal communication systems with the goal of developing additional detectors and estimators for possible extraterrestrial intelligent signals. Regardless of the species, for intelligence (i.e., complex knowledge) to be transmitted certain rules of information theory must still be obeyed. We demonstrate some preliminary results of applying information theory to socially complex marine mammal species (bottlenose dolphins and humpback whales) as well as arboreal squirrel monkeys, because they almost exclusively rely on vocal signals for their communications, producing signals which can be readily characterized by signal analysis. Metrics such as Zipf's Law and higher-order information-entropic structure are emerging as indicators of the communicative complexity characteristic of an "intelligent message" content within these animals' signals, perhaps not surprising given these species' social complexity. In addition to human languages, for comparison we also apply these metrics to pulsar signals—perhaps (arguably) the most "organized" of stellar systems—as an example of astrophysical systems that would have to be distinguished from an extraterrestrial intelligence message by such information theoretic filters. We also look at a message transmitted from Earth (Arecibo Observatory) that contains a lot of meaning but little information in the mathematical sense we define it here. We conclude that the study of non-human communication systems on our own planet can make a valuable contribution to the detection of extraterrestrial intelligence by providing quantitative general measures of communicative complexity. Studying the complex communication systems of other intelligent species on our own planet may also be one of the best ways to deprovincialize our thinking about extraterrestrial communication systems in general.
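A minimal sketch of the Zipf's-law check often applied to such signal repertoires (our own illustration; the study's signal segmentation and higher-order entropy estimates are not reproduced): rank the symbol frequencies, fit log(frequency) against log(rank), and compute the first-order Shannon entropy.

    import numpy as np
    from collections import Counter

    def zipf_slope_and_entropy(symbols):
        counts = np.array(sorted(Counter(symbols).values(), reverse=True), dtype=float)
        ranks = np.arange(1, len(counts) + 1)
        slope, _ = np.polyfit(np.log(ranks), np.log(counts), 1)   # near -1 for Zipf-like repertoires
        p = counts / counts.sum()
        entropy = -np.sum(p * np.log2(p))                          # first-order entropy in bits
        return slope, entropy

    # Toy repertoire whose symbol frequencies fall off roughly as 1/rank
    symbols = sum([[s] * int(round(100 / r)) for r, s in enumerate("abcdefghij", start=1)], [])
    print(zipf_slope_and_entropy(symbols))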
Econophysics: from Game Theory and Information Theory to Quantum Mechanics
NASA Astrophysics Data System (ADS)
Jimenez, Edward; Moya, Douglas
2005-03-01
Rationality is the universal invariant across human behavior, the physical laws of the universe, and ordered, complex biological systems. Econophysics is both the use of physical concepts in finance and economics and the use of information economics in physics. In particular, we will show that it is possible to obtain the principles of quantum mechanics using information theory and game theory.
Being in Community: A Food Security Themed Approach to Public Scholarship
ERIC Educational Resources Information Center
Harrison, Barbara; Nelson, Connie; Stroink, Mirella
2013-01-01
For six years the Food Security Research Network at Lakehead University, Canada, has been engaged in an interdisciplinary theme-based service-learning initiative focusing on food security. Informed by complexity theory, the contextual fluidity partnership model brings community partners, students, and faculty into a nexus through which new…
Acquisition Risks in a World of Joint Capabilities: A Study of Interdependency Complexity
2013-04-01
key to benefit attainment (Comfort, 1994), whereas others claim that more information leads to a false sense of security (Hall, Ariss, & Todorov ... knowledge-based theory of the firm. Strategic Management Journal, 17, 109–122. Hall, C. C., Ariss, L., & Todorov, A. (2007). The illusion of
NASA Astrophysics Data System (ADS)
Clemens, Joshua William
Game theory has application across multiple fields, spanning from economic strategy to optimal control of an aircraft and missile on an intercept trajectory. The idea of game theory is fascinating in that we can actually mathematically model real-world scenarios and determine optimal decision making. It may not always be easy to mathematically model certain real-world scenarios, nonetheless, game theory gives us an appreciation for the complexity involved in decision making. This complexity is especially apparent when the players involved have access to different information upon which to base their decision making (a nonclassical information pattern). Here we will focus on the class of adversarial two-player games (sometimes referred to as pursuit-evasion games) with nonclassical information pattern. We present a two-sided (simultaneous) optimization solution method for the two-player linear quadratic Gaussian (LQG) multistage game. This direct solution method allows for further interpretation of each player's decision making (strategy) as compared to previously used formal solution methods. In addition to the optimal control strategies, we present a saddle point proof and we derive an expression for the optimal performance index value. We provide some numerical results in order to further interpret the optimal control strategies and to highlight real-world application of this game-theoretic optimal solution.
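A minimal one-stage illustration of a saddle point in a zero-sum linear quadratic game (our own toy example with full-state information, not the multistage LQG solution developed in this work): the pursuer chooses u to minimize and the evader chooses v to maximize J = |Ax + Bu + Cv|^2 + u'Ru u - v'Rv v, and the stationarity conditions give a linear system for the saddle-point strategies.

    import numpy as np

    A = np.array([[1.0, 0.1], [0.0, 1.0]])
    B = np.array([[0.0], [1.0]])
    C = np.array([[0.0], [0.5]])
    Ru = np.array([[1.0]])
    Rv = np.array([[2.0]])          # must dominate the evader's quadratic gain for a saddle point
    x0 = np.array([1.0, -0.5])

    # Stationarity of J with respect to u and v:
    #   B'(A x + B u + C v) + Ru u = 0
    #   C'(A x + B u + C v) - Rv v = 0
    M = np.block([[B.T @ B + Ru, B.T @ C],
                  [C.T @ B,      C.T @ C - Rv]])
    rhs = -np.concatenate([B.T @ A @ x0, C.T @ A @ x0])
    sol = np.linalg.solve(M, rhs)
    u_star, v_star = sol[:1], sol[1:]
    print(u_star, v_star)           # saddle-point controls for the pursuer and the evader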
The simplicity complex: exploring simplified health messages in a complex world.
Zarcadoolas, Christina
2011-09-01
A challenge in individual and public health at the start of the 21st century is to effectively communicate health and science information about disease and complex emergencies. The low health literacy of millions of adults in the USA has been referred to as a 'silent killer'. A popular approach to improving health communication and health promotion to low health literate consumers has been to simplify the language of health information. The expected result has been that individuals and groups will better understand information and will then make informed decisions about their health and behaviors. This expectation has grown to include the belief that the public will be better prepared to take appropriate action in complex natural and man-made emergencies. Demonstrating the efficacy of this approach remains, in large part, uninvestigated. And it is becoming more evident that health literacy itself is complex and multifaceted. This article applies linguistic and sociolinguistic models in order to better articulate the role of simplification in health communication and health promotion. Focusing on two models from sociolinguistics-pragmatics and text theory-the article discusses their usefulness in rethinking message simplification. The discussion proposes that a richer, more theory-based understanding of text structures and functions, along with other powerful constructs, including cultural appropriateness, relevancy and context, are needed to close the gaps between health messages, health messengers and patients/the public. The article concludes by making recommendations for future study to empirically test the strengths and limitations of these models and constructs.
A new perspective on the perceptual selectivity of attention under load.
Giesbrecht, Barry; Sy, Jocelyn; Bundesen, Claus; Kyllingsbaek, Søren
2014-05-01
The human attention system helps us cope with a complex environment by supporting the selective processing of information relevant to our current goals. Understanding the perceptual, cognitive, and neural mechanisms that mediate selective attention is a core issue in cognitive neuroscience. One prominent model of selective attention, known as load theory, offers an account of how task demands determine when information is selected and an account of the efficiency of the selection process. However, load theory has several critical weaknesses that suggest that it is time for a new perspective. Here we review the strengths and weaknesses of load theory and offer an alternative biologically plausible computational account that is based on the neural theory of visual attention. We argue that this new perspective provides a detailed computational account of how bottom-up and top-down information is integrated to provide efficient attentional selection and allocation of perceptual processing resources. © 2014 New York Academy of Sciences.
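A minimal sketch of the rate equation at the heart of the Bundesen-style (neural) theory of visual attention referred to here (our own illustration with made-up parameters; the review's model fits are not reproduced): the rate of processing the categorization "object x belongs to category i" is v(x, i) = eta(x, i) * beta_i * w_x / sum_z w_z, so attentional weights determine how limited processing capacity is divided among objects.

    import numpy as np

    # Toy display: 3 objects, 2 perceptual categories ("target-defining" and "other")
    eta = np.array([[5.0, 1.0],          # sensory evidence eta(x, i) for object x and category i
                    [1.0, 4.0],
                    [1.5, 3.0]])
    beta = np.array([0.8, 0.3])          # decision bias for each category
    pertinence = np.array([1.0, 0.2])    # pi_j: task pertinence of each category

    w = eta @ pertinence                 # attentional weight of each object: w_x = sum_j eta(x, j) * pi_j
    v = eta * beta[None, :] * (w / w.sum())[:, None]   # processing rate v(x, i)
    print(np.round(v, 3))                # higher-weight objects receive a larger share of capacity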
Communication in diagnostic radiology: meeting the challenges of complexity.
Larson, David B; Froehle, Craig M; Johnson, Neil D; Towbin, Alexander J
2014-11-01
As patients and information flow through the imaging process, value is added step-by-step when information is acquired, interpreted, and communicated back to the referring clinician. However, radiology information systems are often plagued with communication errors and delays. This article presents theories and recommends strategies to continuously improve communication in the complex environment of modern radiology. Communication theories, methods, and systems that have proven their effectiveness in other environments can serve as models for radiology.
Broadening conceptions of learning in medical education: the message from teamworking.
Bleakley, Alan
2006-02-01
There is a mismatch between the broad range of learning theories offered in the wider education literature and a relatively narrow range of theories privileged in the medical education literature. The latter are usually described under the heading of 'adult learning theory'. This paper critically addresses the limitations of the current dominant learning theories informing medical education. An argument is made that such theories, which address how an individual learns, fail to explain how learning occurs in dynamic, complex and unstable systems such as fluid clinical teams. Models of learning that take into account distributed knowing, learning through time as well as space, and the complexity of a learning environment including relationships between persons and artefacts, are more powerful in explaining and predicting how learning occurs in clinical teams. Learning theories may be privileged for ideological reasons, such as medicine's concern with autonomy. Where an increasing amount of medical education occurs in workplace contexts, sociocultural learning theories offer a best-fit exploration and explanation of such learning. We need to continue to develop testable models of learning that inform safe work practice. One type of learning theory will not inform all practice contexts and we need to think about a range of fit-for-purpose theories that are testable in practice. Exciting current developments include dynamicist models of learning drawing on complexity theory.
Retinal Connectomics: Towards Complete, Accurate Networks
Marc, Robert E.; Jones, Bryan W.; Watt, Carl B.; Anderson, James R.; Sigulinsky, Crystal; Lauritzen, Scott
2013-01-01
Connectomics is a strategy for mapping complex neural networks based on high-speed automated electron optical imaging, computational assembly of neural data volumes, web-based navigational tools to explore 10^12–10^15 byte (terabyte to petabyte) image volumes, and annotation and markup tools to convert images into rich networks with cellular metadata. These collections of network data and associated metadata, analyzed using tools from graph theory and classification theory, can be merged with classical systems theory, giving a more completely parameterized view of how biologic information processing systems are implemented in retina and brain. Networks have two separable features: topology and connection attributes. The first findings from connectomics strongly validate the idea that the topologies of complete retinal networks are far more complex than the simple schematics that emerged from classical anatomy. In particular, connectomics has permitted an aggressive refactoring of the retinal inner plexiform layer, demonstrating that network function cannot be simply inferred from stratification; exposing the complex geometric rules for inserting different cells into a shared network; revealing unexpected bidirectional signaling pathways between mammalian rod and cone systems; documenting selective feedforward systems, novel candidate signaling architectures, new coupling motifs, and the highly complex architecture of the mammalian AII amacrine cell. This is but the beginning, as the underlying principles of connectomics are readily transferrable to non-neural cell complexes and provide new contexts for assessing intercellular communication. PMID:24016532
Decoding 2D-PAGE complex maps: relevance to proteomics.
Pietrogrande, Maria Chiara; Marchetti, Nicola; Dondi, Francesco; Righetti, Pier Giorgio
2006-03-20
This review describes two mathematical approaches useful for decoding the complex signal of 2D-PAGE maps of protein mixtures. These methods are helpful for interpreting the large amount of data of each 2D-PAGE map by extracting all the analytical information hidden therein by spot overlapping. Here the basic theory and application to 2D-PAGE maps are reviewed: the means for extracting information from the experimental data and their relevance to proteomics are discussed. One method is based on the quantitative theory of the statistical model of peak overlapping (SMO), using the spot experimental data (intensity and spatial coordinates). The second method is based on the study of the 2D-autocovariance function (2D-ACVF) computed on the experimental digitised map. They are two independent methods that are able to extract equal and complementary information from the 2D-PAGE map. Both methods make it possible to obtain fundamental information on the sample complexity and the separation performance and to single out ordered patterns present in spot positions: the availability of two independent procedures to compute the same separation parameters is a powerful tool to estimate the reliability of the obtained results. The SMO procedure is a unique tool to quantitatively estimate the degree of spot overlapping present in the map, while the 2D-ACVF method is particularly powerful in simply singling out the presence of order in the spot positions from the complexity of the whole 2D map, i.e., spot trains. The procedures were validated by extensive numerical computation on computer-generated maps describing experimental 2D-PAGE gels of protein mixtures. Their applicability to real samples was tested on reference maps obtained from literature sources. The review describes the most relevant information for proteomics: sample complexity, separation performance, overlapping extent, identification of spot trains related to post-translational modifications (PTMs).
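A minimal sketch of the 2D autocovariance computation (our own FFT-based illustration; the published procedure involves additional normalization and statistical modeling not shown here): subtract the mean from the digitized map and correlate it with itself, so that regularly spaced spot trains appear as periodic peaks in the 2D-ACVF.

    import numpy as np

    def acvf_2d(image):
        # Biased 2D autocovariance of a digitized map via FFT (wrap-around correlation).
        z = image - image.mean()
        F = np.fft.fft2(z)
        acvf = np.fft.ifft2(F * np.conj(F)).real / z.size
        return np.fft.fftshift(acvf)           # put the zero lag at the center

    # Toy map: a train of equally spaced Gaussian "spots" along one axis
    y, x = np.mgrid[0:128, 0:128]
    spots = sum(np.exp(-((x - cx)**2 + (y - 64)**2) / (2 * 2.0**2)) for cx in range(16, 128, 16))
    acvf = acvf_2d(spots)
    print(acvf.shape, acvf[64, 64])             # zero-lag value equals the map variance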
Filter-based multiscale entropy analysis of complex physiological time series.
Xu, Yuesheng; Zhao, Liang
2013-08-01
Multiscale entropy (MSE) has been widely and successfully used in analyzing the complexity of physiological time series. We reinterpret the averaging process in MSE as filtering a time series by a filter of a piecewise constant type. From this viewpoint, we introduce filter-based multiscale entropy (FME), which filters a time series to generate multiple frequency components, and then we compute the blockwise entropy of the resulting components. By choosing filters adapted to the feature of a given time series, FME is able to better capture its multiscale information and to provide more flexibility for studying its complexity. Motivated by the heart rate turbulence theory, which suggests that the human heartbeat interval time series can be described in piecewise linear patterns, we propose piecewise linear filter multiscale entropy (PLFME) for the complexity analysis of the time series. Numerical results from PLFME are more robust to data of various lengths than those from MSE. The numerical performance of the adaptive piecewise constant filter multiscale entropy without prior information is comparable to that of PLFME, whose design takes prior information into account.
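A minimal sketch of the classic coarse-graining step that FME generalizes, paired with a compact (slightly simplified) sample entropy estimator (our own illustration; the piecewise-linear filters and parameter choices of the paper are not reproduced):

    import numpy as np

    def coarse_grain(x, scale):
        # Average non-overlapping windows of length `scale` (the MSE "piecewise constant filter").
        n = len(x) // scale
        return np.asarray(x[:n * scale]).reshape(n, scale).mean(axis=1)

    def sample_entropy(x, m=2, r=0.2):
        # Sample entropy with tolerance r, given in units of the series' standard deviation.
        x = np.asarray(x, dtype=float)
        r = r * x.std()
        def count_matches(m):
            templates = np.array([x[i:i + m] for i in range(len(x) - m)])
            d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
            return (d <= r).sum() - len(templates)    # matching pairs, excluding self-matches
        B, A = count_matches(m), count_matches(m + 1)
        return -np.log(A / B)

    rng = np.random.default_rng(0)
    noise = rng.standard_normal(1000)
    print([round(sample_entropy(coarse_grain(noise, s)), 2) for s in (1, 2, 4)])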
The use of information theory in evolutionary biology.
Adami, Christoph
2012-05-01
Information is a key concept in evolutionary biology. Information stored in a biological organism's genome is used to generate the organism and to maintain and control it. Information is also that which evolves. When a population adapts to a local environment, information about this environment is fixed in a representative genome. However, when an environment changes, information can be lost. At the same time, information is processed by animal brains to survive in complex environments, and the capacity for information processing also evolves. Here, I review applications of information theory to the evolution of proteins and to the evolution of information processing in simulated agents that adapt to perform a complex task. © 2012 New York Academy of Sciences.
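A minimal sketch of one standard calculation in this literature (our own illustration, not the review's analysis): the information content of a protein alignment column, measured as the reduction in entropy relative to the maximum possible for 20 amino acids.

    import numpy as np
    from collections import Counter

    def column_information(column, alphabet_size=20):
        # Information content (bits) of one alignment column: H_max - H_observed.
        counts = np.array(list(Counter(column).values()), dtype=float)
        p = counts / counts.sum()
        h = -np.sum(p * np.log2(p))
        return np.log2(alphabet_size) - h

    print(column_information("LLLLLLLL"))   # fully conserved column: about log2(20) = 4.32 bits
    print(column_information("LIVMAFKR"))   # highly variable column: about 4.32 - 3 = 1.32 bits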
Do complexity-informed health interventions work? A scoping review.
Brainard, Julii; Hunter, Paul R
2016-09-20
The lens of complexity theory is widely advocated to improve health care delivery. However, empirical evidence that this lens has been useful in designing health care remains elusive. This review assesses whether it is possible to reliably capture evidence for efficacy in results or process within interventions that were informed by complexity science and closely related conceptual frameworks. Systematic searches of scientific and grey literature were undertaken in late 2015/early 2016. Titles and abstracts were screened for interventions (A) delivered by the health services, (B) that explicitly stated that complexity science provided theoretical underpinning, and (C) also reported specific outcomes. Outcomes had to relate to changes in actual practice, service delivery or patient clinical indicators. Data extraction and detailed analysis was undertaken for studies in three developed countries: Canada, UK and USA. Data were extracted for intervention format, barriers encountered and quality aspects (thoroughness or possible biases) of evaluation and reporting. From 5067 initial finds in scientific literature and 171 items in grey literature, 22 interventions described in 29 articles were selected. Most interventions relied on facilitating collaboration to find solutions to specific or general problems. Many outcomes were very positive. However, some outcomes were measured only subjectively, one intervention was designed with complexity theory in mind but did not reiterate this in subsequent evaluation and other interventions were credited as compatible with complexity science but reported no relevant theoretical underpinning. Articles often omitted discussion on implementation barriers or unintended consequences, which suggests that complexity theory was not widely used in evaluation. It is hard to establish cause and effect when attempting to leverage complex adaptive systems and perhaps even harder to reliably find evidence that confirms whether complexity-informed interventions are usually effective. While it is possible to show that interventions that are compatible with complexity science seem efficacious, it remains difficult to show that explicit planning with complexity in mind was particularly valuable. Recommendations are made to improve future evaluation reports, to establish a better evidence base about whether this conceptual framework is useful in intervention design and implementation.
Rosso, Osvaldo A; Ospina, Raydonal; Frery, Alejandro C
2016-01-01
We present a new approach for handwritten signature classification and verification based on descriptors stemming from time causal information theory. The proposal uses the Shannon entropy, the statistical complexity, and the Fisher information evaluated over the Bandt and Pompe symbolization of the horizontal and vertical coordinates of signatures. These six features are easy and fast to compute, and they are the input to a One-Class Support Vector Machine classifier. The results are better than state-of-the-art online techniques that employ higher-dimensional feature spaces, which often require specialized software and hardware. We assess the consistency of our proposal with respect to the size of the training sample, and we also use it to classify the signatures into meaningful groups.
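A minimal sketch of the Bandt and Pompe symbolization and the entropy part of such a feature set (our own illustration; the statistical complexity and Fisher information used in the paper involve further definitions not reproduced here): each window of D consecutive samples is mapped to the permutation that sorts it, and the Shannon entropy of the pattern histogram gives the permutation entropy.

    import math
    import numpy as np
    from collections import Counter

    def permutation_entropy(x, d=3, tau=1, normalize=True):
        # Map each window of d samples (lag tau) to its ordinal pattern (Bandt-Pompe),
        # then take the Shannon entropy of the pattern histogram.
        x = np.asarray(x, dtype=float)
        patterns = [tuple(np.argsort(x[i: i + d * tau: tau])) for i in range(len(x) - (d - 1) * tau)]
        counts = np.array(list(Counter(patterns).values()), dtype=float)
        p = counts / counts.sum()
        h = -np.sum(p * np.log2(p))
        return h / np.log2(math.factorial(d)) if normalize else h

    rng = np.random.default_rng(0)
    print(permutation_entropy(rng.standard_normal(2000)))                  # close to 1 for white noise
    print(permutation_entropy(np.sin(np.linspace(0, 40 * np.pi, 2000))))   # much lower for a smooth, stroke-like signal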
A computational framework for modeling targets as complex adaptive systems
NASA Astrophysics Data System (ADS)
Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh
2017-05-01
Modeling large military targets is a challenge as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extending across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and more importantly various emergent behaviors, displayed by the targets. However, a key stumbling block is incorporating information from various intelligence, surveillance and reconnaissance (ISR) sources, while dealing with the inherent uncertainty, incompleteness and time criticality of real world information. To overcome these challenges, we present a probabilistic reasoning network based framework called complex adaptive Bayesian Knowledge Base (caBKB). caBKB is a rigorous, overarching and axiomatic framework that models two key processes, namely information aggregation and information composition. While information aggregation deals with the union, merger and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, it provides unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we will describe how our framework can be used for modeling targets with a focus on methodologies for quantifying NCO performance metrics.
Joly, Elizabeth
2016-06-01
To present a discussion of a theoretical perspective developed through integrating Meleis' Transition Theory and Bronfenbrenner's Bioecological Theory of Human Development to inform nursing and advanced nursing practice supporting the transition to adulthood for young people with medical complexity. Theoretical perspectives to inform nursing practice in supporting successful transition are limited, yet nurses frequently encounter young people with medical complexity during the transition to adulthood. Discussion paper. A literature search of CINAHL and Medline was conducted in 2014 and included articles from 2003-2014; informal discussions with families; the author's experiences in a transition program. The integrated theoretical perspective described in this paper can inform nurses and advanced practice nurses on contextual influences, program and intervention development across spheres of influence and outcomes for the transition to adulthood for young people with medical complexity. Young people and their families require effective reciprocal interactions with individuals and services across sectors to successfully transition to adulthood and become situated in the adult world. Intervention must also extend beyond the young person to include providers, services and health and social policy. Nurses can take a leadership role in supporting the transition to adulthood for young people with medical complexity through direct care, case management, education and research. It is integral that nurses holistically consider developmental processes, complexity and contextual conditions that promote positive outcomes during and beyond the transition to adulthood. © 2016 John Wiley & Sons Ltd.
Rate-distortion theory and human perception.
Sims, Chris R
2016-07-01
The fundamental goal of perception is to aid in the achievement of behavioral objectives. This requires extracting and communicating useful information from noisy and uncertain sensory signals. At the same time, given the complexity of sensory information and the limitations of biological information processing, it is necessary that some information must be lost or discarded in the act of perception. Under these circumstances, what constitutes an 'optimal' perceptual system? This paper describes the mathematical framework of rate-distortion theory as the optimal solution to the problem of minimizing the costs of perceptual error subject to strong constraints on the ability to communicate or transmit information. Rate-distortion theory offers a general and principled theoretical framework for developing computational-level models of human perception (Marr, 1982). Models developed in this framework are capable of producing quantitatively precise explanations for human perceptual performance, while yielding new insights regarding the nature and goals of perception. This paper demonstrates the application of rate-distortion theory to two benchmark domains where capacity limits are especially salient in human perception: discrete categorization of stimuli (also known as absolute identification) and visual working memory. A software package written for the R statistical programming language is described that aids in the development of models based on rate-distortion theory. Copyright © 2016 The Author. Published by Elsevier B.V. All rights reserved.
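A minimal sketch of the Blahut-Arimoto iteration that computes one point on a rate-distortion curve (our own illustration, not the R package described in the paper; the binary source, Hamming distortion, and trade-off parameter beta are toy choices):

    import numpy as np

    def blahut_arimoto(p_x, d, beta, n_iter=500):
        # p_x: source pmf; d[x, xhat]: distortion matrix; beta: trade-off parameter
        # (larger beta favors lower distortion at the cost of a higher rate).
        n_x, n_xhat = d.shape
        q = np.full(n_xhat, 1.0 / n_xhat)              # output marginal
        for _ in range(n_iter):
            Q = q[None, :] * np.exp(-beta * d)         # unnormalized conditional Q(xhat | x)
            Q /= Q.sum(axis=1, keepdims=True)
            q = p_x @ Q
        rate = np.sum(p_x[:, None] * Q * np.log2(Q / q[None, :]))
        dist = np.sum(p_x[:, None] * Q * d)
        return rate, dist

    # Binary symmetric source with Hamming distortion
    p_x = np.array([0.5, 0.5])
    d = 1.0 - np.eye(2)
    print(blahut_arimoto(p_x, d, beta=3.0))   # compare with R(D) = 1 - H(D) for this case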
Reading Assessment: A Primer for Teachers and Tutors.
ERIC Educational Resources Information Center
Caldwell, JoAnne Schudt
This primer provides the basic information that teachers and tutors need to get started on the complex process of reading assessment. Designed for maximum utility in today's standards-driven classroom, the primer presents simple, practical assessment strategies that are based on theory and research. It takes teachers step by step through learning…
A Critical Analysis of Hypermedia and Virtual Learning Environments.
ERIC Educational Resources Information Center
Oliver, Kevin M.
The use of hypermedia in education is supported by cognitive flexibility theory which indicates transfer of knowledge to real-world settings is improved when that material is learned in a case-based, associative network emphasizing complexity and links to related information. Hypermedia is further assumed to benefit education, because it resembles…
Communication complexity and information complexity
NASA Astrophysics Data System (ADS)
Pankratov, Denis
Information complexity enables the use of information-theoretic tools in communication complexity theory. Prior to the results presented in this thesis, information complexity was mainly used for proving lower bounds and direct-sum theorems in the setting of communication complexity. We present three results that demonstrate new connections between information complexity and communication complexity. In the first contribution we thoroughly study the information complexity of the smallest nontrivial two-party function: the AND function. While computing the communication complexity of AND is trivial, computing its exact information complexity presents a major technical challenge. In overcoming this challenge, we reveal that information complexity gives rise to rich geometrical structures. Our analysis of information complexity relies on new analytic techniques and new characterizations of communication protocols. We also uncover a connection of information complexity to the theory of elliptic partial differential equations. Once we compute the exact information complexity of AND, we can compute the exact communication complexity of several related functions on n-bit inputs with some additional technical work. Previous combinatorial and algebraic techniques could only prove bounds of the form Θ(n). Interestingly, this level of precision is typical in the area of information theory, so our result demonstrates that this meta-property of precise bounds carries over to information complexity and in certain cases even to communication complexity. Our result not only strengthens the lower bound on the communication complexity of disjointness by making it more exact, but it also shows that information complexity provides the exact upper bound on communication complexity. In fact, this result is more general and applies to a whole class of communication problems. In the second contribution, we use self-reduction methods to prove strong lower bounds on the information complexity of two of the most studied functions in the communication complexity literature: Gap Hamming Distance (GHD) and Inner Product mod 2 (IP). In our first result we affirm the conjecture that the information complexity of GHD is linear even under the uniform distribution. This strengthens the Ω(n) bound shown by Kerenidis et al. (2012) and answers an open problem posed by Chakrabarti et al. (2012). We also prove that the information complexity of IP is arbitrarily close to the trivial upper bound n as the permitted error tends to zero, again strengthening the Ω(n) lower bound proved by Braverman and Weinstein (2011). More importantly, our proofs demonstrate that self-reducibility makes the connection between information complexity and communication complexity lower bounds a two-way connection. Whereas numerous results in the past used information complexity techniques to derive new communication complexity lower bounds, we explore a generic way in which communication complexity lower bounds imply information complexity lower bounds in a black-box manner. In the third contribution we consider the roles that private and public randomness play in the definition of information complexity. In communication complexity, private randomness can be trivially simulated by public randomness. Moreover, the communication cost of simulating public randomness with private randomness is well understood due to Newman's theorem (1991).
In information complexity, the roles of public and private randomness are reversed: public randomness can be trivially simulated by private randomness. However, the information cost of simulating private randomness with public randomness is not understood. We show that protocols that use only public randomness admit a rather strong compression. In particular, efficient simulation of private randomness by public randomness would imply a version of a direct sum theorem in the setting of communication complexity. This establishes a yet another connection between the two areas. (Abstract shortened by UMI.).
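For readers unfamiliar with the central quantity, the internal information cost of a protocol pi on inputs (X, Y) drawn from a distribution mu, and the information complexity of a function f, are standardly defined as follows (a textbook definition, not a statement of this thesis's specific theorems):

    \[
    \mathrm{IC}_\mu(\pi) = I(\Pi ; X \mid Y) + I(\Pi ; Y \mid X),
    \qquad
    \mathrm{IC}_\mu(f,\varepsilon) = \inf_{\pi \ \text{computing}\ f\ \text{with error at most}\ \varepsilon} \mathrm{IC}_\mu(\pi),
    \]

where Pi denotes the transcript of the protocol (including public randomness).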
MaxEnt-Based Ecological Theory: A Template for Integrated Catchment Theory
NASA Astrophysics Data System (ADS)
Harte, J.
2017-12-01
The maximum information entropy procedure (MaxEnt) is both a powerful tool for inferring least-biased probability distributions from limited data and a framework for the construction of complex systems theory. The maximum entropy theory of ecology (METE) describes remarkably well widely observed patterns in the distribution, abundance and energetics of individuals and taxa in relatively static ecosystems. An extension to ecosystems undergoing change in response to disturbance or natural succession (DynaMETE) is in progress. I describe the structure of both the static and the dynamic theory and show a range of comparisons with census data. I then propose a generalization of the MaxEnt approach that could provide a framework for a predictive theory of both static and dynamic, fully-coupled, eco-socio-hydrological catchment systems.
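A minimal sketch of the MaxEnt inference step underlying this framework (our own illustration using scipy.optimize.brentq; METE's specific state variables and constraint structure are not reproduced): subject to a constraint on the mean of n, the least-biased distribution has the form p(n) proportional to exp(-lambda*n), with lambda chosen so that the constraint is satisfied.

    import numpy as np
    from scipy.optimize import brentq

    def maxent_mean_constrained(states, target_mean):
        # Maximum-entropy pmf over `states` subject to a fixed mean (Boltzmann/geometric form).
        def mean_minus_target(lam):
            w = np.exp(-lam * states)
            p = w / w.sum()
            return np.sum(p * states) - target_mean
        lam = brentq(mean_minus_target, -1.0, 10.0)    # bracket suited to this toy example
        w = np.exp(-lam * states)
        return w / w.sum()

    # Abundances 1..100 with a constrained mean abundance of 5 (toy numbers)
    states = np.arange(1, 101)
    p = maxent_mean_constrained(states, 5.0)
    print(p[:5], (p * states).sum())    # geometric-like decay; the mean matches the constraint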
Systematic searching for theory to inform systematic reviews: is it feasible? Is it desirable?
Booth, Andrew; Carroll, Christopher
2015-09-01
Recognising the potential value of theory in understanding how interventions work brings a challenge: how can the identification of theory be made less haphazard? The objective was to explore the feasibility of systematic identification of theory. We searched PubMed for published reviews (1998-2012) that had explicitly sought to identify theory. Systematic searching may be characterised by a structured question, methodological filters and an itemised search procedure. We constructed a template (BeHEMoTh - Behaviour of interest; Health context; Exclusions; Models or Theories) for use when systematically identifying theory. The authors tested the template within two systematic reviews. Of 34 systematic reviews, only 12 reviews (35%) reported a method for identifying theory. Nineteen did not specify how they identified studies containing theory. Data were unavailable for three reviews. Candidate terms include concept(s)/conceptual, framework(s), model(s), and theory/theories/theoretical. Information professionals must overcome inadequate reporting and the use of theory out of context. The review team faces an additional concern in the lack of 'theory fidelity'. Based on experience with two systematic reviews, the BeHEMoTh template and procedure offer a feasible and useful approach for the identification of theory. Applications include realist synthesis, framework synthesis or review of complex interventions. The procedure requires rigorous evaluation. © 2015 Health Libraries Group.
ERIC Educational Resources Information Center
Yuan, Rui; Zhang, Jia; Yu, Shulin
2018-01-01
Although research on teacher collaboration has proliferated in the last few decades, scant attention has been paid to the development of teacher collaboration in school contexts. Informed by the perspective of complexity theory, this study investigates the complex process of teacher collaboration through qualitative interviews in an English…
Trenholm, Susan; Ferlie, Ewan
2013-09-01
We employ complexity theory to analyse the English National Health Service (NHS)'s organisational response to resurgent tuberculosis across London. Tennison (2002) suggests that complexity theory could fruitfully explore a healthcare system's response to this complex and emergent phenomenon: we explore this claim here. We also bring in established New Public Management principles to enhance our empirical analysis, which is based on data collected between late 2009 and mid-2011. We find that the operation of complexity theory based features, especially self-organisation, are significantly impacted by the macro context of a New Public Management-based regime which values control, measurement and risk management more than innovation, flexibility and lateral system building. We finally explore limitations and suggest perspectives for further research. Copyright © 2012 Elsevier Ltd. All rights reserved.
A Methodological Review and Critique of the "Intergenerational Transmission of Violence" Literature.
Haselschwerdt, Megan L; Savasuk-Luxton, Rachel; Hlavaty, Kathleen
2017-01-01
Exposure to interpersonal or interparental violence (EIPV) and child abuse and maltreatment (CAM) are associated with an increased risk of maladaptive outcomes, including later involvement in adulthood intimate partner violence (IPV; often referred to as the theory of intergenerational transmission of violence). Recent meta-analyses, however, have documented a weak effect size when examining this association. By focusing on young adulthood, a developmental stage in which identity development and romantic relationship formation are salient tasks, we can provide insight into the association between EIPV, CAM, and IPV. Guided by the methodological critiques from the IPV and EIPV literatures, the present study reviewed the methodology used in 16 studies (published between 2002 and 2016) that tested the theory of intergenerational transmission of violence. The review focused on how EIPV, CAM, and young adult dating violence were measured and analyzed, with the initial goal of better understanding how methodological decisions informed each study's findings. Ultimately, we determined that there was simply too much methodological variability and yet too little methodological complexity to truly inform a review and discussion of the results; therefore, our review solely focused on the studies' methodological decisions. Based on our review, we suggest that both of these challenges, too much variability and too little complexity, hinder our ability to examine the theory of intergenerational transmission of violence. Future research must strike a balance between methodological consistency and complexity to better understand the intricate nuances of IPV experiences and inform practice.
A game theory-based trust measurement model for social networks.
Wang, Yingjie; Cai, Zhipeng; Yin, Guisheng; Gao, Yang; Tong, Xiangrong; Han, Qilong
2016-01-01
In social networks, trust is a complex social concept. Participants in online social networks want to share information and experiences with as many reliable users as possible. However, the modeling of trust is complicated and application dependent. Modeling trust needs to consider interaction history, recommendations, user behaviors and so on. Therefore, modeling trust is an important focus for online social networks. We propose a game theory-based trust measurement model for social networks. The trust degree is calculated from three aspects (service reliability, feedback effectiveness, and recommendation credibility) to obtain a more accurate result. In addition, to alleviate the free-riding problem, we propose a game theory-based punishment mechanism for specific trust and global trust, respectively. We prove that the proposed trust measurement model is effective. The free-riding problem can be resolved effectively by adding the proposed punishment mechanism.
TOWARD A THEORY OF SUSTAINABLE SYSTEMS
While there is tremendous interest in sustainability, a fundamental theory of sustainability does not exist. We present our efforts at constructing such a theory using Physics, Information Theory, Economics and Ecology. We discuss the state of complex sustainable systems that i...
From the Phenomenology to the Mechanisms of Consciousness: Integrated Information Theory 3.0
Tononi, Giulio
2014-01-01
This paper presents Integrated Information Theory (IIT) of consciousness 3.0, which incorporates several advances over previous formulations. IIT starts from phenomenological axioms: information says that each experience is specific – it is what it is by how it differs from alternative experiences; integration says that it is unified – irreducible to non-interdependent components; exclusion says that it has unique borders and a particular spatio-temporal grain. These axioms are formalized into postulates that prescribe how physical mechanisms, such as neurons or logic gates, must be configured to generate experience (phenomenology). The postulates are used to define intrinsic information as “differences that make a difference” within a system, and integrated information as information specified by a whole that cannot be reduced to that specified by its parts. By applying the postulates both at the level of individual mechanisms and at the level of systems of mechanisms, IIT arrives at an identity: an experience is a maximally irreducible conceptual structure (MICS, a constellation of concepts in qualia space), and the set of elements that generates it constitutes a complex. According to IIT, a MICS specifies the quality of an experience and integrated information ΦMax its quantity. From the theory follow several results, including: a system of mechanisms may condense into a major complex and non-overlapping minor complexes; the concepts that specify the quality of an experience are always about the complex itself and relate only indirectly to the external environment; anatomical connectivity influences complexes and associated MICS; a complex can generate a MICS even if its elements are inactive; simple systems can be minimally conscious; complicated systems can be unconscious; there can be true “zombies” – unconscious feed-forward systems that are functionally equivalent to conscious complexes. PMID:24811198
From naive to sophisticated behavior in multiagent-based financial market models
NASA Astrophysics Data System (ADS)
Mansilla, R.
2000-09-01
The behavior of the physical complexity and the mutual information function of the output of a model of heterogeneous, inductively rational agents inspired by the El Farol Bar problem and the Minority Game is studied. The first quantity is a measure rooted in Kolmogorov-Chaitin theory and the second is a measure related to Shannon's information entropy. Extensive computer simulations were performed, on the basis of which an ansatz for the physical complexity of the form C(l) = l^α is proposed, and the dependence of the exponent α on the model parameters is established. The accuracy of our results and their relationship to the behavior of the mutual information function as a measure of the time correlation of the agents' choices are discussed.
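A minimal Python sketch of the second of these quantities, a lag-dependent mutual information function of a symbolic sequence, is given below; the random binary sequence and the lag values are placeholders, and the estimator is the plain plug-in one rather than anything specific to the cited model.

    # Plug-in estimate of the mutual information I(x_t; x_{t+l}) of a symbolic
    # sequence as a function of the lag l, a Shannon-entropy-based measure of
    # time correlation. The random binary sequence below is only a placeholder.
    from collections import Counter
    from math import log2
    import random

    def mutual_information(seq, lag):
        pairs = list(zip(seq[:-lag], seq[lag:]))
        n = len(pairs)
        p_xy = {k: v / n for k, v in Counter(pairs).items()}
        p_x = {k: v / n for k, v in Counter(x for x, _ in pairs).items()}
        p_y = {k: v / n for k, v in Counter(y for _, y in pairs).items()}
        return sum(p * log2(p / (p_x[x] * p_y[y])) for (x, y), p in p_xy.items())

    if __name__ == "__main__":
        random.seed(0)
        seq = [random.randint(0, 1) for _ in range(10000)]
        for lag in (1, 2, 5, 10):
            print(lag, round(mutual_information(seq, lag), 5))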
Ospina, Raydonal; Frery, Alejandro C.
2016-01-01
We present a new approach for handwritten signature classification and verification based on descriptors stemming from time causal information theory. The proposal uses the Shannon entropy, the statistical complexity, and the Fisher information evaluated over the Bandt and Pompe symbolization of the horizontal and vertical coordinates of signatures. These six features are easy and fast to compute, and they are the input to a One-Class Support Vector Machine classifier. The results are better than state-of-the-art online techniques that employ higher-dimensional feature spaces, which often require specialized software and hardware. We assess the consistency of our proposal with respect to the size of the training sample, and we also use it to classify the signatures into meaningful groups. PMID:27907014
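The following Python sketch shows the Bandt and Pompe symbolization of a single coordinate series and the normalized Shannon (permutation) entropy computed from it; the embedding order and the toy series are assumptions, and the remaining descriptors (statistical complexity, Fisher information) and the One-Class SVM step are not reproduced here.

    # Bandt-Pompe symbolization of a 1-D series into ordinal patterns and the
    # normalized Shannon (permutation) entropy of their distribution. The
    # embedding order D and the toy series are assumptions for illustration.
    from collections import Counter
    from math import factorial, log2

    def bandt_pompe_probs(series, order=3):
        """Probabilities of ordinal patterns of length `order`."""
        patterns = Counter(
            tuple(sorted(range(order), key=lambda k: series[i + k]))
            for i in range(len(series) - order + 1)
        )
        total = sum(patterns.values())
        return {p: c / total for p, c in patterns.items()}

    def permutation_entropy(series, order=3):
        probs = bandt_pompe_probs(series, order)
        h = -sum(p * log2(p) for p in probs.values())
        return h / log2(factorial(order))  # normalize to [0, 1]

    if __name__ == "__main__":
        x = [0.1, 0.5, 0.3, 0.9, 0.7, 0.2, 0.8, 0.4, 0.6, 0.05]
        print("normalized permutation entropy:", round(permutation_entropy(x), 3))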
TOWARD A THEORY OF SUSTAINABLE SYSTEMS
While there is tremendous interest in the topic of sustainability, a fundamental theory of sustainability does not exist. We present our efforts at constructing such a theory starting with Information Theory and ecological models. We discuss the state of complex sustainable syste...
ERIC Educational Resources Information Center
Ledbetter, Mary Lee; Campbell, A. Malcolm
2005-01-01
Reasonable people disagree about how to introduce undergraduate students to the marvels and complexities of the biological sciences. With intrinsically varied subdisciplines within biology, exponentially growing bases of information, and new unifying theories rising regularly, introduction to the curriculum is a challenge. Some decide to focus…
ERIC Educational Resources Information Center
Velez-Rubio, Miguel
2013-01-01
Teaching computer programming to freshmen students in Computer Sciences and other Information Technology areas has been identified as a complex activity. Different approaches have been studied in search of one that could improve this teaching process. A proposed approach was implemented which is based on the language immersion…
Suppressed neural complexity during ketamine- and propofol-induced unconsciousness.
Wang, Jisung; Noh, Gyu-Jeong; Choi, Byung-Moon; Ku, Seung-Woo; Joo, Pangyu; Jung, Woo-Sung; Kim, Seunghwan; Lee, Heonsoo
2017-07-13
Ketamine and propofol have distinctively different molecular mechanisms of action and neurophysiological features, although both induce loss of consciousness. Therefore, identifying a common feature of ketamine- and propofol-induced unconsciousness would provide insight into the underlying mechanism of losing consciousness. In this study we search for a common feature by applying the concept of type-II complexity, and argue that neural complexity is essential for a brain to maintain consciousness. To test this hypothesis, we show that complexity is suppressed during loss of consciousness induced by ketamine or propofol. We analyzed the randomness (type-I complexity) and complexity (type-II complexity) of electroencephalogram (EEG) signals before and after bolus injection of ketamine or propofol. For the analysis, we use Mean Information Gain (MIG) and Fluctuation Complexity (FC), which are information-theory-based measures that quantify disorder and complexity of dynamics respectively. Both ketamine and propofol reduced the complexity of the EEG signal, but ketamine increased the randomness of the signal and propofol decreased it. The finding supports our claim and suggests EEG complexity as a candidate for a consciousness indicator. Copyright © 2017 Elsevier B.V. All rights reserved.
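As a hedged illustration, the sketch below computes Mean Information Gain and Fluctuation Complexity for a binarized signal using the word-transition formulation commonly found in the symbolic-dynamics literature; the median-based binarization, the word length, and the Gaussian toy signal are assumptions, not the preprocessing used in the study.

    # Hedged sketch of the two measures named above in the usual word-transition
    # formulation: mean information gain (MIG), the conditional entropy of the
    # next word given the current one, and fluctuation complexity (FC), the mean
    # squared net information gain over transitions. The median binarization and
    # the word length are assumptions.
    from collections import Counter
    from math import log2

    def symbolize(signal, word_len=2):
        med = sorted(signal)[len(signal) // 2]
        bits = [1 if v > med else 0 for v in signal]
        return [tuple(bits[i:i + word_len]) for i in range(len(bits) - word_len + 1)]

    def mig_and_fc(signal, word_len=2):
        words = symbolize(signal, word_len)
        trans = list(zip(words[:-1], words[1:]))
        p_w = {w: c / len(words) for w, c in Counter(words).items()}
        p_t = {t: c / len(trans) for t, c in Counter(trans).items()}
        mig = -sum(p * log2(p / p_w[i]) for (i, j), p in p_t.items())
        fc = sum(p * (log2(p_w[i] / p_w[j])) ** 2 for (i, j), p in p_t.items())
        return mig, fc

    if __name__ == "__main__":
        import random
        random.seed(1)
        x = [random.gauss(0, 1) for _ in range(5000)]
        print("MIG = %.3f bits, FC = %.3f" % mig_and_fc(x))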
Complexity Leadership: A Theoretical Perspective
ERIC Educational Resources Information Center
Baltaci, Ali; Balci, Ali
2017-01-01
Complex systems are social networks composed of interactive employees interconnected through collaborative, dynamic ties such as shared goals, perspectives and needs. Complex systems are largely based on "the complex system theory". The complex system theory focuses mainly on finding out and developing strategies and behaviours that…
ERIC Educational Resources Information Center
Sung, Dia; You, Yeongmahn; Song, Ji Hoon
2008-01-01
The purpose of this research is to explore the possibility of viable learning organizations based on identifying viable organizational learning mechanisms. Two theoretical foundations, complex system theory and viable system theory, have been integrated to provide the rationale for building the sustainable organizational learning mechanism. The…
Syntactic Recursion Facilitates and Working Memory Predicts Recursive Theory of Mind
Arslan, Burcu; Hohenberger, Annette; Verbrugge, Rineke
2017-01-01
In this study, we focus on the possible roles of second-order syntactic recursion and working memory in terms of simple and complex span tasks in the development of second-order false belief reasoning. We tested 89 Turkish children in two age groups, one younger (4;6–6;5 years) and one older (6;7–8;10 years). Although second-order syntactic recursion is significantly correlated with the second-order false belief task, results of ordinal logistic regressions revealed that the main predictor of second-order false belief reasoning is complex working memory span. Unlike simple working memory and second-order syntactic recursion tasks, the complex working memory task required processing information serially with additional reasoning demands that require complex working memory strategies. Based on our results, we propose that children’s second-order theory of mind develops when they have efficient reasoning rules to process embedded beliefs serially, thus overcoming a possible serial processing bottleneck. PMID:28072823
Using measures of information content and complexity of time series as hydrologic metrics
USDA-ARS?s Scientific Manuscript database
Information theory has previously been used to develop metrics that characterize temporal patterns in soil moisture dynamics and that evaluate and compare the performance of soil water flow models. The objective of this study was to apply information and complexity measures to characte...
NASA Astrophysics Data System (ADS)
Saracco, Ginette; Moreau, Frédérique; Mathé, Pierre-Etienne; Hermitte, Daniel; Michel, Jean-Marie
2007-10-01
We have previously developed a method for characterizing and localizing 'homogeneous' buried sources from measurements of potential anomalies (magnetic, electric and gravity) at a fixed height above ground. This method is based on potential theory and uses the properties of the Poisson kernel (real by definition) and continuous wavelet theory. Here, we relax the assumption on sources and introduce a method that we call 'multiscale tomography'. Our approach is based on the harmonic extension of the observed magnetic field to produce a complex source by use of a complex Poisson kernel solution of the Laplace equation for a complex potential field. A phase and a modulus are defined. We show that the phase provides additional information on the total magnetic inclination and the structure of sources, while the modulus allows us to characterize their spatial location, depth and 'effective degree'. This method is compared to the 'complex dipolar tomography', an extension of the Patella method that we previously developed. We applied both methods and a classical electrical resistivity tomography to detect and localize buried archaeological structures such as antique ovens from magnetic measurements at the Fox-Amphoux site (France). The estimates are then compared with the results of excavations.
Efficient Deterministic Finite Automata Minimization Based on Backward Depth Information.
Liu, Desheng; Huang, Zhiping; Zhang, Yimeng; Guo, Xiaojun; Su, Shaojing
2016-01-01
Obtaining a minimal automaton is a fundamental issue in the theory and practical implementation of deterministic finite automatons (DFAs). A minimization algorithm is presented in this paper that consists of two main phases. In the first phase, the backward depth information is built, and the state set of the DFA is partitioned into many blocks. In the second phase, the state set is refined using a hash table. The minimization algorithm has a lower time complexity O(n) than a naive comparison of transitions O(n²). Few states need to be refined by the hash table, because most states have been partitioned by the backward depth information in the coarse partition. This method achieves greater generality than previous methods because building the backward depth information is independent of the topological complexity of the DFA. The proposed algorithm can be applied not only to the minimization of acyclic automata or simple cyclic automata, but also to automata with high topological complexity. Overall, the proposal has three advantages: lower time complexity, greater generality, and scalability. A comparison to Hopcroft's algorithm demonstrates experimentally that the algorithm runs faster than traditional algorithms.
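The Python sketch below illustrates the general two-phase idea, a coarse partition by backward depth followed by refinement based on transition signatures; it is a simplified Moore-style illustration, not a reproduction of the published algorithm or its hash-table refinement.

    # Simplified illustration of the two-phase idea: (1) coarse partition of DFA
    # states by backward depth (shortest distance to an accepting state over
    # reversed transitions) plus acceptance status, (2) iterative refinement by
    # transition signatures. A sketch of the general approach only.
    from collections import deque, defaultdict

    def backward_depth(states, accepting, delta):
        rev = defaultdict(set)
        for (s, a), t in delta.items():
            rev[t].add(s)
        depth = {s: 0 for s in accepting}
        queue = deque(accepting)
        while queue:
            t = queue.popleft()
            for s in rev[t]:
                if s not in depth:
                    depth[s] = depth[t] + 1
                    queue.append(s)
        return depth  # states that cannot reach an accepting state are absent

    def minimize(states, alphabet, accepting, delta):
        depth = backward_depth(states, accepting, delta)
        block = {s: (s in accepting, depth.get(s, -1)) for s in states}
        while True:
            sig = {s: (block[s],) + tuple(block.get(delta.get((s, a)))
                                          for a in sorted(alphabet))
                   for s in states}
            ids = {v: i for i, v in enumerate(sorted(set(sig.values()), key=repr))}
            new_block = {s: ids[sig[s]] for s in states}
            if len(set(new_block.values())) == len(set(block.values())):
                return new_block  # state -> equivalence-class id
            block = new_block

    if __name__ == "__main__":
        states = {0, 1, 2, 3}
        alphabet = {"a", "b"}
        accepting = {3}
        delta = {(0, "a"): 1, (0, "b"): 2, (1, "a"): 3, (1, "b"): 2,
                 (2, "a"): 3, (2, "b"): 1, (3, "a"): 3, (3, "b"): 3}
        print(minimize(states, alphabet, accepting, delta))  # states 1 and 2 merge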
A framework for designing and analyzing binary decision-making strategies in cellular systems
Porter, Joshua R.; Andrews, Burton W.; Iglesias, Pablo A.
2015-01-01
Cells make many binary (all-or-nothing) decisions based on noisy signals gathered from their environment and processed through noisy decision-making pathways. Reducing the effect of noise to improve the fidelity of decision-making comes at the expense of increased complexity, creating a tradeoff between performance and metabolic cost. We present a framework based on rate distortion theory, a branch of information theory, to quantify this tradeoff and design binary decision-making strategies that balance low cost and accuracy in optimal ways. With this framework, we show that several observed behaviors of binary decision-making systems, including random strategies, hysteresis, and irreversibility, are optimal in an information-theoretic sense for various situations. This framework can also be used to quantify the goals around which a decision-making system is optimized and to evaluate the optimality of cellular decision-making systems by a fundamental information-theoretic criterion. As proof of concept, we use the framework to quantify the goals of the externally triggered apoptosis pathway. PMID:22370552
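A generic Blahut-Arimoto iteration for a binary source with Hamming distortion, sketched below, illustrates the rate-distortion tradeoff that such a framework builds on; the source probabilities and the trade-off parameter beta are assumptions, and this is not the authors' cellular decision-making model.

    # Generic Blahut-Arimoto sketch for a binary source with Hamming distortion,
    # tracing the rate-distortion tradeoff. Source probabilities and beta values
    # are illustrative assumptions.
    from math import exp, log2

    def blahut_arimoto(p_x, dist, beta, iters=200):
        nx, ny = len(p_x), len(dist[0])
        q_y = [1.0 / ny] * ny
        for _ in range(iters):
            # conditional p(y|x) proportional to q(y) * exp(-beta * d(x, y))
            p_yx = []
            for x in range(nx):
                row = [q_y[y] * exp(-beta * dist[x][y]) for y in range(ny)]
                z = sum(row)
                p_yx.append([r / z for r in row])
            q_y = [sum(p_x[x] * p_yx[x][y] for x in range(nx)) for y in range(ny)]
        rate = sum(p_x[x] * p_yx[x][y] * log2(p_yx[x][y] / q_y[y])
                   for x in range(nx) for y in range(ny) if p_yx[x][y] > 0)
        distortion = sum(p_x[x] * p_yx[x][y] * dist[x][y]
                         for x in range(nx) for y in range(ny))
        return rate, distortion

    if __name__ == "__main__":
        p_x = [0.5, 0.5]                 # binary input signal
        dist = [[0, 1], [1, 0]]          # Hamming distortion for a binary decision
        for beta in (0.5, 2.0, 8.0):
            r, d = blahut_arimoto(p_x, dist, beta)
            print("beta=%.1f  rate=%.3f bits  distortion=%.3f" % (beta, r, d))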
Spreading Effect in Industrial Complex Network Based on Revised Structural Holes Theory
Ye, Qing; Guan, Jun
2016-01-01
This paper analyzes the spreading effect of industrial sectors with a complex network model from the perspective of econophysics. Input-output analysis, as an important research tool, focuses more on static analysis. However, the fundamental aim of industry analysis is to figure out how the interaction between different industries affects economic development, which is a dynamic process. Thus, an industrial complex network based on input-output tables from WIOD is proposed as a bridge connecting accurate static quantitative analysis and comparable dynamic analysis. With the application of revised structural holes theory, flow betweenness and random walk centrality were chosen to evaluate industrial sectors' long-term and short-term spreading effects, respectively. The results show that industries with higher flow betweenness or random walk centrality bring about a more intensive spreading effect along the industrial chains they stand in, because the value-stream transmission of an industrial sector depends on how many products or services it can obtain from the others, and such sectors act as brokers with greater information superiority and more intermediary interests. PMID:27218468
Spreading Effect in Industrial Complex Network Based on Revised Structural Holes Theory.
Xing, Lizhi; Ye, Qing; Guan, Jun
2016-01-01
This paper analyzes the spreading effect of industrial sectors with a complex network model from the perspective of econophysics. Input-output analysis, as an important research tool, focuses more on static analysis. However, the fundamental aim of industry analysis is to figure out how the interaction between different industries affects economic development, which is a dynamic process. Thus, an industrial complex network based on input-output tables from WIOD is proposed as a bridge connecting accurate static quantitative analysis and comparable dynamic analysis. With the application of revised structural holes theory, flow betweenness and random walk centrality were chosen to evaluate industrial sectors' long-term and short-term spreading effects, respectively. The results show that industries with higher flow betweenness or random walk centrality bring about a more intensive spreading effect along the industrial chains they stand in, because the value-stream transmission of an industrial sector depends on how many products or services it can obtain from the others, and such sectors act as brokers with greater information superiority and more intermediary interests.
Eigencentrality based on dissimilarity measures reveals central nodes in complex networks
Alvarez-Socorro, A. J.; Herrera-Almarza, G. C.; González-Díaz, L. A.
2015-01-01
One of the most important problems in complex network theory is the location of the entities that are essential or play a main role within the network. For this purpose, the use of dissimilarity measures (specific to the theory of classification and data mining) to enrich the centrality measures in complex networks is proposed. The centrality method used is eigencentrality, which is based on the heuristic that the centrality of a node depends on how central the nodes in its immediate neighbourhood are (akin to the rich-get-richer phenomenon). This can be described as an eigenvalue problem; however, the information about the neighbourhood and the connections between neighbours is not taken into account, neglecting their relevance when one evaluates the centrality/importance/influence of a node. The contribution calculated by the dissimilarity measure is parameter independent, making the proposed method parameter independent as well. Finally, we perform a comparative study of our method versus other methods reported in the literature, obtaining more accurate and computationally less expensive results in most cases. PMID:26603652
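A hedged Python sketch of the underlying idea follows: eigenvector centrality by power iteration on an adjacency matrix whose links are reweighted by a dissimilarity between the endpoints' neighbourhoods; the Jaccard-style dissimilarity and the toy graph are illustrative choices, not the exact measure proposed by the authors.

    # Eigenvector centrality by power iteration on an adjacency matrix reweighted
    # by a neighbourhood dissimilarity. The Jaccard-based dissimilarity and the
    # toy graph are illustrative assumptions only.

    def neighbour_dissimilarity(adj, i, j):
        ni = {k for k, v in enumerate(adj[i]) if v}
        nj = {k for k, v in enumerate(adj[j]) if v}
        union = ni | nj
        return 1.0 - len(ni & nj) / len(union) if union else 0.0

    def eigencentrality(adj, iters=200):
        n = len(adj)
        w = [[adj[i][j] * (1.0 + neighbour_dissimilarity(adj, i, j))
              for j in range(n)] for i in range(n)]
        x = [1.0] * n
        for _ in range(iters):
            y = [sum(w[i][j] * x[j] for j in range(n)) for i in range(n)]
            norm = sum(v * v for v in y) ** 0.5
            x = [v / norm for v in y]
        return x

    if __name__ == "__main__":
        adj = [[0, 1, 1, 0],
               [1, 0, 1, 1],
               [1, 1, 0, 0],
               [0, 1, 0, 0]]
        print([round(v, 3) for v in eigencentrality(adj)])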
Cognitive load reducing in destination decision system
NASA Astrophysics Data System (ADS)
Wu, Chunhua; Wang, Cong; Jiang, Qien; Wang, Jian; Chen, Hong
2007-12-01
With limited cognitive resources, the quantity of information that a person can process is limited. If this limitation is exceeded, the whole cognitive process is affected, and so is the final decision. Research into effective ways of reducing cognitive load is pursued here from two aspects: cutting down the number of alternatives, and directing the user to allocate limited attention resources based on selective visual attention theory. Decision-making is such a complex process that people usually have difficulty expressing their requirements completely. An effective method to elicit the user's hidden requirements is put forward in this paper; with more requirements captured, the destination decision system can filter out a larger quantity of inappropriate alternatives. Different pieces of information have different utility, and if information with high utility attracts attention easily, the decision can be made more easily. After analyzing current selective visual attention theory, a new presentation style based on the user's visual attention is also put forward in this paper. This model arranges the presentation of information according to the movement of the sightline. Through visual attention, users can put their limited attention resources on the important information. Capturing hidden requirements and presenting information based on selective visual attention are effective ways of reducing cognitive load.
Assessment of Student Learning in Virtual Spaces, Using Orders of Complexity in Levels of Thinking
ERIC Educational Resources Information Center
Capacho, Jose
2017-01-01
This paper aims at showing a new methodology to assess student learning in virtual spaces supported by Information and Communications Technology (ICT). The methodology is based on the Conceptual Pedagogy Theory, and is supported both on knowledge instruments (KI) and intellectual operations (IO). KI are made up of teaching materials embedded in the…
Bai, Xiao-ping; Zhang, Xi-wei
2013-01-01
Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process in which many indexes need to be considered to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses a quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing the score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for risk computation in building construction projects.
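One common way information entropy enters this kind of multi-index evaluation is the classic entropy-weight step sketched below, in which criteria whose scores vary more across schemes receive larger weights; the index matrix values are placeholders and the sketch is not the paper's complete procedure.

    # Classic entropy-weight method: compute the entropy of each index column,
    # derive weights from (1 - entropy), and combine them into a synthesis score
    # per scheme. The index matrix is a placeholder, not the paper's case data.
    from math import log

    def entropy_weights(matrix):
        n_schemes, n_idx = len(matrix), len(matrix[0])
        weights = []
        for j in range(n_idx):
            col = [matrix[i][j] for i in range(n_schemes)]
            total = sum(col)
            p = [v / total for v in col]
            e = -sum(v * log(v) for v in p if v > 0) / log(n_schemes)
            weights.append(1.0 - e)
        s = sum(weights)
        return [w / s for w in weights]

    def synthesis_scores(matrix, weights):
        return [sum(w * v for w, v in zip(weights, row)) for row in matrix]

    if __name__ == "__main__":
        # rows: candidate schemes; columns: cost, progress, quality, safety scores
        m = [[0.8, 0.6, 0.9, 0.7],
             [0.6, 0.9, 0.7, 0.8],
             [0.9, 0.7, 0.6, 0.9]]
        w = entropy_weights(m)
        print("weights:", [round(x, 3) for x in w])
        print("scores :", [round(x, 3) for x in synthesis_scores(m, w)])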
The Evolution of ICT Markets: An Agent-Based Model on Complex Networks
NASA Astrophysics Data System (ADS)
Zhao, Liangjie; Wu, Bangtao; Chen, Zhong; Li, Li
Information and communication technology (ICT) products exhibit positive network effects. The dynamic process of ICT market evolution has two intrinsic characteristics: (1) customers are influenced by each other's purchasing decisions; (2) customers are intelligent agents with bounded rationality. Guided by complex systems theory, we construct an agent-based model and simulate it on complex networks to examine how market evolution can arise from the interaction of customers, which occurs when they form expectations about the future installed base of a product from the fraction of neighbors in their personal network who are using the same product. We demonstrate that network effects play an important role in the evolution of market share, and can make even an inferior product dominate the whole market. We also find that the intensity of customers' communication can influence whether the best initial strategy for firms is to improve product quality or to expand their installed base.
Information Theoretic Characterization of Physical Theories with Projective State Space
NASA Astrophysics Data System (ADS)
Zaopo, Marco
2015-08-01
Probabilistic theories are a natural framework to investigate the foundations of quantum theory and possible alternative or deeper theories. In a generic probabilistic theory, states of a physical system are represented as vectors of outcome probabilities and state spaces are convex cones. In this picture the physics of a given theory is related to the geometric shape of the cone of states. In quantum theory, for instance, the shape of the cone of states corresponds to a projective space over the complex numbers. In this paper we investigate geometric constraints on the state space of a generic theory imposed by the following information-theoretic requirements: every state of a system that is not completely mixed is perfectly distinguishable from some other state in a single-shot measurement; the information capacity of physical systems is conserved under taking mixtures of states. These assumptions guarantee that a generic physical system satisfies a natural principle asserting that the more mixed a state of the system is, the less information can be stored in the system using that state as a logical value. We show that all theories satisfying the above assumptions are such that the shape of their cones of states is that of a projective space over a generic field of numbers. Remarkably, these theories constitute generalizations of quantum theory in which the superposition principle holds with coefficients pertaining to a generic field of numbers in place of the complex numbers. If the field of numbers is trivial and contains only one element we obtain classical theory. This result shows that the superposition principle is quite common among probabilistic theories, while its absence points to either classical theory or an implausible theory.
Feasibility study of molecular memory device based on DNA using methylation to store information
NASA Astrophysics Data System (ADS)
Jiang, Liming; Qiu, Wanzhi; Al-Dirini, Feras; Hossain, Faruque M.; Evans, Robin; Skafidas, Efstratios
2016-07-01
DNA, because of its robustness and dense information storage capability, has been proposed as a potential candidate for next-generation storage media. However, encoding information into the DNA sequence requires molecular synthesis technology, which to date is costly and prone to synthesis errors. Reading the DNA strand information is also complex. Ideally, DNA storage will provide methods for modifying stored information. Here, we conduct a feasibility study investigating the use of the DNA 5-methylcytosine (5mC) methylation state as a molecular memory to store information. We propose a new 1-bit memory device and study, based on the density functional theory and non-equilibrium Green's function method, the feasibility of electrically reading the information. Our results show that changes to methylation states lead to changes in the peak of negative differential resistance which can be used to interrogate memory state. Our work demonstrates a new memory concept based on methylation state which can be beneficial in the design of next generation DNA based molecular electronic memory devices.
Reliability analysis in interdependent smart grid systems
NASA Astrophysics Data System (ADS)
Peng, Hao; Kan, Zhe; Zhao, Dandan; Han, Jianmin; Lu, Jianfeng; Hu, Zhaolong
2018-06-01
Complex network theory is a useful way to study many real complex systems. In this paper, a reliability analysis model based on complex network theory is introduced for interdependent smart grid systems. We focus on understanding the structure of smart grid systems and studying the underlying network model, the interactions and relationships between components, and how cascading failures occur in interdependent smart grid systems. We propose a practical model for interdependent smart grid systems using complex network theory. In addition, based on percolation theory, we study the cascading failure effect and present a detailed mathematical analysis of failure propagation in such systems. We analyze the reliability of the proposed model under random attacks or failures by calculating the size of the giant functioning component in interdependent smart grid systems. Our simulation results also show that there exists a threshold for the proportion of faulty nodes, beyond which the smart grid systems collapse, and we determine the critical values for different system parameters. In this way, the reliability analysis model based on complex network theory can be effectively utilized for anti-attack and protection purposes in interdependent smart grid systems.
Perspective: Sloppiness and emergent theories in physics, biology, and beyond.
Transtrum, Mark K; Machta, Benjamin B; Brown, Kevin S; Daniels, Bryan C; Myers, Christopher R; Sethna, James P
2015-07-07
Large scale models of physical phenomena demand the development of new statistical and computational tools in order to be effective. Many such models are "sloppy," i.e., exhibit behavior controlled by a relatively small number of parameter combinations. We review an information theoretic framework for analyzing sloppy models. This formalism is based on the Fisher information matrix, which is interpreted as a Riemannian metric on a parameterized space of models. Distance in this space is a measure of how distinguishable two models are based on their predictions. Sloppy model manifolds are bounded with a hierarchy of widths and extrinsic curvatures. The manifold boundary approximation can extract the simple, hidden theory from complicated sloppy models. We attribute the success of simple effective models in physics to their likewise emerging from complicated processes exhibiting a low effective dimensionality. We discuss the ramifications and consequences of sloppy models for biochemistry and science more generally. We suggest that the reason our complex world is understandable is due to the same fundamental reason: simple theories of macroscopic behavior are hidden inside complicated microscopic processes.
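A minimal sketch of the basic sloppiness diagnostic follows: build the Fisher information matrix as J^T J from the Jacobian of model predictions with respect to log-parameters and inspect its eigenvalue spectrum, which for sloppy models spans many decades; the toy sum-of-exponentials model is an assumption.

    # Fisher information matrix J^T J from a finite-difference Jacobian of model
    # predictions with respect to log-parameters; sloppy models show eigenvalues
    # spread over many orders of magnitude. The toy exponential model is an
    # assumption for illustration.
    import numpy as np

    def predictions(log_params, t):
        k = np.exp(log_params)
        return np.exp(-k[0] * t) + np.exp(-k[1] * t) + np.exp(-k[2] * t)

    def fisher_information(log_params, t, eps=1e-6):
        base = predictions(log_params, t)
        J = np.empty((len(t), len(log_params)))
        for i in range(len(log_params)):
            p = log_params.copy()
            p[i] += eps
            J[:, i] = (predictions(p, t) - base) / eps
        return J.T @ J

    if __name__ == "__main__":
        t = np.linspace(0.1, 5.0, 40)
        theta = np.log(np.array([1.0, 1.3, 1.7]))  # nearly degenerate rates
        print("FIM eigenvalues:", np.linalg.eigvalsh(fisher_information(theta, t)))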
Information Processing by Schizophrenics When Task Complexity Increases
ERIC Educational Resources Information Center
Hirt, Michael; And Others
1977-01-01
The performance of hospitalized paranoid schizophrenics, nonparanoids, and hospitalized controls was compared on motor, perceptual, and cognitive tasks of increasing complexity. The data were examined within the context of comparing differential predictions made by input and central processing theories of information-processing deficit. (Editor)
Srigley, J A; Corace, K; Hargadon, D P; Yu, D; MacDonald, T; Fabrigar, L; Garber, G
2015-11-01
Despite the importance of hand hygiene in preventing transmission of healthcare-associated infections, compliance rates are suboptimal. Hand hygiene is a complex behaviour and psychological frameworks are promising tools to influence healthcare worker (HCW) behaviour. (i) To review the effectiveness of interventions based on psychological theories of behaviour change to improve HCW hand hygiene compliance; (ii) to determine which frameworks have been used to predict HCW hand hygiene compliance. Multiple databases and reference lists of included studies were searched for studies that applied psychological theories to improve and/or predict HCW hand hygiene. All steps in selection, data extraction, and quality assessment were performed independently by two reviewers. The search yielded 918 citations; seven met eligibility criteria. Four studies evaluated hand hygiene interventions based on psychological frameworks. Interventions were informed by goal setting, control theory, operant learning, positive reinforcement, change theory, the theory of planned behaviour, and the transtheoretical model. Three predictive studies employed the theory of planned behaviour, the transtheoretical model, and the theoretical domains framework. Interventions to improve hand hygiene adherence demonstrated efficacy but studies were at moderate to high risk of bias. For many studies, it was unclear how theories of behaviour change were used to inform the interventions. Predictive studies had mixed results. Behaviour change theory is a promising tool for improving hand hygiene; however, these theories have not been extensively examined. Our review reveals a significant gap in the literature and indicates possible avenues for novel research. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
Saposnik, Gustavo; Johnston, S Claiborne
2016-04-01
Acute stroke care represents a challenge for decision makers. Decisions based on erroneous assessments may generate false expectations of patients and their family members, and potentially inappropriate medical advice. Game theory is the analysis of interactions between individuals to study how conflict and cooperation affect our decisions. We reviewed principles of game theory that could be applied to medical decisions under uncertainty. Medical decisions in acute stroke care are usually made under constraints: a short period of time, imperfect clinical information, and limited understanding of patients' and families' values and beliefs. Game theory offers some strategies to help us manage complex medical situations under uncertainty. For example, it offers a different perspective by encouraging the consideration of different alternatives through the understanding of patients' preferences and the careful evaluation of cognitive distortions when applying 'real-world' data. The stag-hunt game teaches us the importance of trust in strengthening cooperation for a successful patient-physician interaction that goes beyond a good or poor clinical outcome. The application of game theory to stroke care may improve our understanding of complex medical situations and help clinicians make practical decisions under uncertainty. © 2016 World Stroke Organization.
Curral, Luis; Marques-Quinteiro, Pedro; Gomes, Catarina; Lind, Pedro G
2016-01-01
Recent theoretical contributions have suggested a theory of leadership that is grounded in complexity theory, hence regarding leadership as a complex process (i.e., nonlinear; emergent). This article tests if complexity leadership theory promotes efficiency in work groups. 40 groups of five participants each had to complete four decision making tasks using the city simulation game SimCity4. Before engaging in the four decision making tasks, participants received information regarding what sort of leadership behaviors were more adequate to help them perform better. Results suggest that if complexity leadership theory is applied, groups can achieve higher efficiency over time, when compared with other groups where complexity leadership is not applied. This study goes beyond traditional views of leadership as a centralized form of control, and presents new evidence suggesting that leadership is a collective and emergent phenomenon, anchored in simple rules of behavior.
Marques-Quinteiro, Pedro; Gomes, Catarina; Lind, Pedro G.
2016-01-01
Recent theoretical contributions have suggested a theory of leadership that is grounded in complexity theory, hence regarding leadership as a complex process (i.e., nonlinear; emergent). This article tests if complexity leadership theory promotes efficiency in work groups. 40 groups of five participants each had to complete four decision making tasks using the city simulation game SimCity4. Before engaging in the four decision making tasks, participants received information regarding what sort of leadership behaviors were more adequate to help them perform better. Results suggest that if complexity leadership theory is applied, groups can achieve higher efficiency over time, when compared with other groups where complexity leadership is not applied. This study goes beyond traditional views of leadership as a centralized form of control, and presents new evidence suggesting that leadership is a collective and emergent phenomenon, anchored in simple rules of behavior. PMID:27973596
Assessing Proposals for Interagency Reorganization
2005-05-26
is useful to have a single entity responsible for operations. Though postmodernist theory is based on a diffusion of knowledge there is an... of knowledge …[for] the general good of mankind." Their research tends to focus on technological solutions to complex information management issues... from an institutional perspective different from that of CSIS. The Markle Foundation was created in 1927 "to promote the advancement and diffusion
Experimental generation of complex noisy photonic entanglement
NASA Astrophysics Data System (ADS)
Dobek, K.; Karpiński, M.; Demkowicz-Dobrzański, R.; Banaszek, K.; Horodecki, P.
2013-02-01
We present an experimental scheme based on spontaneous parametric down-conversion to produce multiple-photon pairs in maximally entangled polarization states using an arrangement of two type-I nonlinear crystals. By introducing correlated polarization noise in the paths of the generated photons we prepare mixed-entangled states whose properties illustrate fundamental results obtained recently in quantum information theory, in particular those concerning bound entanglement and privacy.
Minimization of Dependency Length in Written English
ERIC Educational Resources Information Center
Temperley, David
2007-01-01
Gibson's Dependency Locality Theory (DLT) [Gibson, E. 1998. "Linguistic complexity: locality of syntactic dependencies." "Cognition," 68, 1-76; Gibson, E. 2000. "The dependency locality theory: A distance-based theory of linguistic complexity." In A. Marantz, Y. Miyashita, & W. O'Neil (Eds.), "Image,…
Modeling Conditional Probabilities in Complex Educational Assessments. CSE Technical Report.
ERIC Educational Resources Information Center
Mislevy, Robert J.; Almond, Russell; Dibello, Lou; Jenkins, Frank; Steinberg, Linda; Yan, Duanli; Senturk, Deniz
An active area in psychometric research is coordinated task design and statistical analysis built around cognitive models. Compared with classical test theory and item response theory, there is often less information from observed data about the measurement-model parameters. On the other hand, there is more information from the grounding…
Noyes, Jane; Hendry, Maggie; Booth, Andrew; Chandler, Jackie; Lewin, Simon; Glenton, Claire; Garside, Ruth
2016-07-01
To identify examples of how social theories are used in systematic reviews of complex interventions to inform production of Cochrane guidance. Secondary analysis of published/unpublished examples of theories of social phenomena for use in reviews of complex interventions identified through scoping searches, engagement with key authors and methodologists supplemented by snowballing and reference searching. Theories were classified (low-level, mid-range, grand). Over 100 theories were identified with evidence of proliferation over the last 5 years. New low-level theories (tools, taxonomies, etc) have been developed for classifying and reporting complex interventions. Numerous mid-range theories are used; one example demonstrated how control theory had changed the review's findings. Review-specific logic models are increasingly used, but these can be challenging to develop. New low-level and mid-range psychological theories of behavior change are evolving. No reviews using grand theory (e.g., feminist theory) were identified. We produced a searchable Wiki, Mendeley Inventory, and Cochrane guidance. Use of low-level theory is common and evolving; incorporation of mid-range theory is still the exception rather than the norm. Methodological work is needed to evaluate the contribution of theory. Choice of theory reflects personal preference; application of theory is a skilled endeavor. Crown Copyright © 2016. Published by Elsevier Inc. All rights reserved.
2003-04-01
generally considered to be passive data. Instead the genetic material should be capable of being algorithmic information, that is, program code or...
Complex-energy approach to sum rules within nuclear density functional theory
Hinohara, Nobuo; Kortelainen, Markus; Nazarewicz, Witold; ...
2015-04-27
The linear response of the nucleus to an external field contains unique information about the effective interaction, correlations governing the behavior of the many-body system, and properties of its excited states. To characterize the response, it is useful to use its energy-weighted moments, or sum rules. By comparing computed sum rules with experimental values, the information content of the response can be utilized in the optimization process of the nuclear Hamiltonian or nuclear energy density functional (EDF). But the additional information comes at a price: compared to the ground state, computation of excited states is more demanding. To establish an efficient framework to compute energy-weighted sum rules of the response that is adaptable to the optimization of the nuclear EDF and large-scale surveys of collective strength, we have developed a new technique within the complex-energy finite-amplitude method (FAM) based on the quasiparticle random-phase approximation. The proposed sum-rule technique based on the complex-energy FAM is a tool of choice when optimizing effective interactions or energy functionals. The method is very efficient and well-adaptable to parallel computing. As a result, the FAM formulation is especially useful when standard theorems based on commutation relations involving the nuclear Hamiltonian and external field cannot be used.
Information Theory Applied to Animal Communication Systems and Its Possible Application to SETI
NASA Astrophysics Data System (ADS)
Hanser, Sean F.; Doyle, Laurance R.; McCowan, Brenda; Jenkins, Jon M.
2004-06-01
Information theory, as first introduced by Claude Shannon (Shannon & Weaver 1949), quantitatively evaluates the organizational complexity of communication systems. At the same time George Zipf was examining linguistic structure in a way that was mathematically similar to the components of the Shannon first-order entropy (Zipf 1949). Both Shannon's and Zipf's mathematical procedures have been applied to animal communication and have recently been providing insightful results. The Zipf plot is a useful tool for a first estimate of the characterization of a communication system's complexity (which can later be examined for complex structure at deeper levels using Shannon entropic analysis). In this paper we shall discuss some of the applications and pitfalls of using the Zipf distribution as a preliminary evaluator of the communication complexity of a signaling system.
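The sketch below computes the two first-pass quantities discussed here, the slope of the Zipf rank-frequency plot and the first-order Shannon entropy, from a sequence of signal labels; the toy repertoire of call labels is a placeholder for a real data set.

    # Slope of the Zipf rank-frequency plot (log frequency vs. log rank, by least
    # squares) and first-order Shannon entropy of a signal repertoire. The toy
    # sequence of call labels is a placeholder.
    from collections import Counter
    from math import log2

    def zipf_slope_and_entropy(signals):
        counts = sorted(Counter(signals).values(), reverse=True)
        n = sum(counts)
        xs = [log2(rank) for rank in range(1, len(counts) + 1)]
        ys = [log2(c / n) for c in counts]
        mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
        slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
                sum((x - mx) ** 2 for x in xs)
        entropy = -sum((c / n) * log2(c / n) for c in counts)
        return slope, entropy

    if __name__ == "__main__":
        calls = list("AAAAABBBBCCCDDE")  # toy repertoire with a Zipf-like profile
        s, h = zipf_slope_and_entropy(calls)
        print("Zipf slope ~ %.2f, first-order entropy ~ %.2f bits" % (s, h))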
Control theory based airfoil design using the Euler equations
NASA Technical Reports Server (NTRS)
Jameson, Antony; Reuther, James
1994-01-01
This paper describes the implementation of optimization techniques based on control theory for airfoil design. In our previous work it was shown that control theory could be employed to devise effective optimization procedures for two-dimensional profiles by using the potential flow equation with either a conformal mapping or a general coordinate system. The goal of our present work is to extend the development to treat the Euler equations in two-dimensions by procedures that can readily be generalized to treat complex shapes in three-dimensions. Therefore, we have developed methods which can address airfoil design through either an analytic mapping or an arbitrary grid perturbation method applied to a finite volume discretization of the Euler equations. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented for both the inverse problem and drag minimization problem.
Efficient Deterministic Finite Automata Minimization Based on Backward Depth Information
Liu, Desheng; Huang, Zhiping; Zhang, Yimeng; Guo, Xiaojun; Su, Shaojing
2016-01-01
Obtaining a minimal automaton is a fundamental issue in the theory and practical implementation of deterministic finite automatons (DFAs). A minimization algorithm is presented in this paper that consists of two main phases. In the first phase, the backward depth information is built, and the state set of the DFA is partitioned into many blocks. In the second phase, the state set is refined using a hash table. The minimization algorithm has a lower time complexity O(n) than a naive comparison of transitions O(n²). Few states need to be refined by the hash table, because most states have been partitioned by the backward depth information in the coarse partition. This method achieves greater generality than previous methods because building the backward depth information is independent of the topological complexity of the DFA. The proposed algorithm can be applied not only to the minimization of acyclic automata or simple cyclic automata, but also to automata with high topological complexity. Overall, the proposal has three advantages: lower time complexity, greater generality, and scalability. A comparison to Hopcroft’s algorithm demonstrates experimentally that the algorithm runs faster than traditional algorithms. PMID:27806102
Design of Restoration Method Based on Compressed Sensing and TwIST Algorithm
NASA Astrophysics Data System (ADS)
Zhang, Fei; Piao, Yan
2018-04-01
In order to effectively improve the subjective and objective quality of degraded images at low sampling rates, save storage space, and reduce computational complexity at the same time, this paper proposes a joint restoration algorithm combining compressed sensing and two-step iterative shrinkage/thresholding (TwIST). The algorithm applies the TwIST algorithm, which is used in image restoration, to compressed sensing theory. A small amount of sparse high-frequency information is then obtained in the frequency domain, and the TwIST algorithm based on compressed sensing theory is used to accurately reconstruct the high-frequency image. The experimental results show that the proposed algorithm achieves better subjective visual effects and objective quality while accurately restoring degraded images.
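A minimal sketch of sparse recovery with a TwIST-style two-step iterative shrinkage/thresholding recursion is given below; the regularization weight, the step parameters, and the random sensing setup are assumptions, and with the default parameters the recursion reduces to plain iterative shrinkage/thresholding rather than a tuned TwIST.

    # Sparse recovery from compressed measurements with the two-step recursion
    # x_{k+1} = (1-alpha)*x_{k-1} + (alpha-beta)*x_k + beta*S(x_k + A^T(y - A x_k)),
    # where S is soft thresholding. Parameters and the sensing matrix are
    # assumptions for illustration, not the paper's configuration.
    import numpy as np

    def soft_threshold(v, lam):
        return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

    def twist_recover(A, y, lam=0.01, alpha=1.0, beta=1.0, iters=500):
        """With alpha = beta = 1 this reduces to plain IST; TwIST's speed-up comes
        from tuning alpha and beta to the spectrum of A^T A, not attempted here."""
        x_prev = np.zeros(A.shape[1])
        x = soft_threshold(A.T @ y, lam)
        for _ in range(iters):
            grad_step = soft_threshold(x + A.T @ (y - A @ x), lam)
            x_prev, x = x, (1 - alpha) * x_prev + (alpha - beta) * x + beta * grad_step
        return x

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        n, m, k = 200, 80, 5                      # signal length, measurements, sparsity
        x_true = np.zeros(n)
        x_true[rng.choice(n, k, replace=False)] = rng.normal(0.0, 1.0, k)
        A = np.linalg.qr(rng.normal(size=(n, m)))[0].T  # orthonormal rows: stable unit step
        y = A @ x_true
        x_hat = twist_recover(A, y)
        print("relative reconstruction error:",
              np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))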
Moore, Graham F; Evans, Rhiannon E
2017-12-01
Recent years have seen a growing emphasis on the value of building and testing middle range theory throughout the development and evaluation of complex population health interventions. We agree that a coherent theoretical basis for intervention development, and use of evaluation to test key causal assumptions and build theory, are crucial. However, in this editorial, we argue that such recommendations have often been operationalised in somewhat simplistic terms with potentially perverse consequences, and that an uncritical assumption that an intervention explicitly based on theory is inherently superior carries significant risks. We first argue that the drive for theory-based approaches may have exacerbated a propensity to select 'off-the-shelf' theories, leading to the selection of inappropriate theories which distract attention from the mechanisms through which a problem is actually sustained. Second, we discuss a tendency toward over-reliance on individual-level theorising. Finally, we discuss the relatively slow progress of population health intervention research in attending to issues of context, and the ecological fit of interventions with the systems whose functioning they attempt to change. We argue that while researchers should consider a broad range of potential theoretical perspectives on a given population health problem, citing a popular off-the-shelf theory as having informed an intervention and its evaluation does not inherently make for better science. Before identifying or developing a theory of change, researchers should develop a clear understanding of how the problem under consideration is created and sustained in context. A broader conceptualisation of theory that reaches across disciplines is vital if theory is to enhance, rather than constrain, the contribution of intervention research. Finally, intervention researchers need to move away from viewing interventions as discrete packages of components which can be described in isolation from their contexts, and better understand the systems into which change is being introduced.
Javorka, Michal; Krohova, Jana; Czippelova, Barbora; Turianikova, Zuzana; Lazarova, Zuzana; Wiszt, Radovan; Faes, Luca
2018-07-01
Cardiovascular complexity is a feature of healthy physiological regulation, which stems from the simultaneous activity of several cardiovascular reflexes and other non-reflex physiological mechanisms. It is manifested in the rich dynamics characterizing the spontaneous heart rate and blood pressure variability (HRV and BPV). The present study faces the challenge of disclosing the origin of short-term HRV and BPV from the statistical perspective offered by information theory. To dissect the physiological mechanisms giving rise to cardiovascular complexity in different conditions, measures of predictive information, information storage, information transfer and information modification were applied to the beat-to-beat variability of heart period (HP), systolic arterial pressure (SAP) and respiratory volume signal recorded non-invasively in 61 healthy young subjects at supine rest and during head-up tilt (HUT) and mental arithmetics (MA). Information decomposition enabled to assess simultaneously several expected and newly inferred physiological phenomena, including: (i) the decreased complexity of HP during HUT and the increased complexity of SAP during MA; (ii) the suppressed cardiorespiratory information transfer, related to weakened respiratory sinus arrhythmia, under both challenges; (iii) the altered balance of the information transferred along the two arms of the cardiovascular loop during HUT, with larger baroreflex involvement and smaller feedforward mechanical effects; and (iv) an increased importance of direct respiratory effects on SAP during HUT, and on both HP and SAP during MA. We demonstrate that a decomposition of the information contained in cardiovascular oscillations can reveal subtle changes in system dynamics and improve our understanding of the complexity changes during physiological challenges. Copyright © 2018. Published by Elsevier Ltd.
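As a hedged illustration of one member of this family of measures, the sketch below estimates transfer entropy between two series with a coarse histogram (plug-in) estimator; the two-bin discretization and the coupled toy series are assumptions, and the study's own estimator is not reproduced here.

    # Histogram-based plug-in estimate of transfer entropy TE(X -> Y) =
    # sum p(y_{t+1}, y_t, x_t) log2[ p(y_{t+1}|y_t,x_t) / p(y_{t+1}|y_t) ].
    # Two-bin discretization and the coupled toy series are assumptions.
    from collections import Counter
    from math import log2
    import random

    def discretize(series, bins=2):
        cuts = sorted(series)
        thresholds = [cuts[int(len(cuts) * i / bins)] for i in range(1, bins)]
        return [sum(v > t for t in thresholds) for v in series]

    def transfer_entropy(x, y, bins=2):
        xs, ys = discretize(x, bins), discretize(y, bins)
        triples = list(zip(ys[1:], ys[:-1], xs[:-1]))  # (y_{t+1}, y_t, x_t)
        n = len(triples)
        p_xyz = Counter(triples)
        p_yy = Counter((yp, yc) for yp, yc, _ in triples)
        p_yx = Counter((yc, xc) for _, yc, xc in triples)
        p_y = Counter(yc for _, yc, _ in triples)
        te = 0.0
        for (yp, yc, xc), c in p_xyz.items():
            p1 = c / p_yx[(yc, xc)]          # p(y_{t+1} | y_t, x_t)
            p2 = p_yy[(yp, yc)] / p_y[yc]    # p(y_{t+1} | y_t)
            te += (c / n) * log2(p1 / p2)
        return te

    if __name__ == "__main__":
        random.seed(2)
        x = [random.gauss(0, 1)]
        y = [random.gauss(0, 1)]
        for _ in range(5000):                # y is driven by the past of x
            x.append(0.5 * x[-1] + random.gauss(0, 1))
            y.append(0.4 * y[-1] + 0.6 * x[-2] + random.gauss(0, 0.5))
        print("TE(x -> y) = %.3f bits" % transfer_entropy(x, y))
        print("TE(y -> x) = %.3f bits" % transfer_entropy(y, x))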
Knowledge Theories Can Inform Evaluation Practice: What Can a Complexity Lens Add?
ERIC Educational Resources Information Center
Hawe, Penelope; Bond, Lyndal; Butler, Helen
2009-01-01
Programs and policies invariably contain new knowledge. Theories about knowledge utilization, diffusion, implementation, transfer, and knowledge translation theories illuminate some mechanisms of change processes. But more often than not, when it comes to understanding patterns about change processes, "the foreground" is privileged more…
Using health psychology to help patients: theories of behaviour change.
Barley, Elizabeth; Lawson, Victoria
2016-09-08
Behaviour change theories and related research evidence highlight the complexity of making and sticking to health-related behaviour changes. These theories make explicit factors that influence behaviour change, such as health beliefs, past behaviour, intention, social influences, perceived control and the context of the behaviour. Nurses can use this information to understand why a particular patient may find making recommended health behaviour changes difficult and to determine factors that may help them. This article outlines five well-established theories of behaviour change: the health belief model, the theory of planned behaviour, the stages of change model, self-determination theory, and temporal self-regulation theory. The evidence for interventions that are informed by these theories is then explored and appraised. The extent and quality of evidence varies depending on the type of behaviour and patients targeted, but evidence from randomised controlled trials indicates that interventions informed by theory can result in behaviour change.
Using information theory to assess the communicative capacity of circulating microRNA.
Finn, Nnenna A; Searles, Charles D
2013-10-11
The discovery of extracellular microRNAs (miRNAs) and their transport modalities (i.e., microparticles, exosomes, proteins and lipoproteins) has sparked theories regarding their role in intercellular communication. Here, we assessed the information transfer capacity of different miRNA transport modalities in human serum by utilizing basic principles of information theory. Zipf Statistics were calculated for each of the miRNA transport modalities identified in human serum. Our analyses revealed that miRNA-mediated information transfer is redundant, as evidenced by negative Zipf's Statistics with magnitudes greater than one. In healthy subjects, the potential communicative capacity of miRNA in complex with circulating proteins was significantly lower than that of miRNA encapsulated in circulating microparticles and exosomes. Moreover, the presence of coronary heart disease significantly lowered the communicative capacity of all circulating miRNA transport modalities. To assess the internal organization of circulating miRNA signals, Shannon's zero- and first-order entropies were calculated. Microparticles (MPs) exhibited the lowest Shannon entropic slope, indicating a relatively high capacity for information transfer. Furthermore, compared to the other miRNA transport modalities, MPs appeared to be the most efficient at transferring miRNA to cultured endothelial cells. Taken together, these findings suggest that although all transport modalities have the capacity for miRNA-based information transfer, MPs may be the simplest and most robust way to achieve miRNA-based signal transduction in sera. This study presents a novel method for analyzing the quantitative capacity of miRNA-mediated information transfer while providing insight into the communicative characteristics of distinct circulating miRNA transport modalities. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Loepp, Susan; Wootters, William K.
2006-09-01
For many everyday transmissions, it is essential to protect digital information from noise or eavesdropping. This undergraduate introduction to error correction and cryptography is unique in devoting several chapters to quantum cryptography and quantum computing, thus providing a context in which ideas from mathematics and physics meet. By covering such topics as Shor's quantum factoring algorithm, this text informs the reader about current thinking in quantum information theory and encourages an appreciation of the connections between mathematics and science. Of particular interest are the potential impacts of quantum physics: (i) a quantum computer, if built, could crack our currently used public-key cryptosystems; and (ii) quantum cryptography promises to provide an alternative to these cryptosystems, basing its security on the laws of nature rather than on computational complexity. No prior knowledge of quantum mechanics is assumed, but students should have a basic knowledge of complex numbers, vectors, and matrices. The book is accessible to readers familiar with matrix algebra, vector spaces and complex numbers; it is the first undergraduate text to cover cryptography, error correction, and quantum computation together; and it features exercises designed to enhance understanding, including a number of computational problems, available from www.cambridge.org/9780521534765.
Renmans, Dimitri; Holvoet, Nathalie; Criel, Bart
2017-09-03
Increased attention on "complexity" in health systems evaluation has resulted in many different methodological responses. Theory-driven evaluations and systems thinking are two such responses that aim for better understanding of the mechanisms underlying given outcomes. Here, we studied the implementation of a performance-based financing intervention by the Belgian Technical Cooperation in Western Uganda to illustrate a methodological strategy of combining these two approaches. We utilized a systems dynamics tool called causal loop diagramming (CLD) to generate hypotheses feeding into a theory-driven evaluation. Semi-structured interviews were conducted with 30 health workers from two districts (Kasese and Kyenjojo) and with 16 key informants. After CLD, we identified three relevant hypotheses: "success to the successful", "growth and underinvestment", and "supervision conundrum". The first hypothesis leads to increasing improvements in performance, as better performance leads to more incentives, which in turn leads to better performance. The latter two hypotheses point to potential bottlenecks. Thus, the proposed methodological strategy was a useful tool for identifying hypotheses that can inform a theory-driven evaluation. The hypotheses are represented in a comprehensible way while highlighting the underlying assumptions, and are more easily falsifiable than hypotheses identified without using CLD.
STUDY OF TURBULENT ENERGY OVER COMPLEX TERRAIN: STATE, 1978
The complex structure of the earth's surface influenced atmospheric parameters pertinent to modeling the diffusion process during the 1978 'STATE' field study. The Information Theory approach of statistics proved useful for analyzing the complex structures observed in the radiome...
NASA Astrophysics Data System (ADS)
Lanzalaco, Felix; Pissanetzky, Sergio
2013-12-01
A recent theory of physical information based on the fundamental principles of causality and thermodynamics has proposed that a large number of observable life and intelligence signals can be described in terms of Causal Mathematical Logic (CML), which is proposed to encode the natural principles of intelligence across any physical domain and substrate. We attempt to expound the current definition of CML, the "action functional", as a theory in terms of its explanatory power for the neuroscientific data used to measure the mammalian brain's "intelligence" processes at their most general biophysical level. Brain simulation projects define their success partly in terms of the emergence of "non-explicitly programmed" complex biophysical signals such as self-oscillation and spreading cortical waves. Here we propose to extend the causal theory to predict and guide the understanding of these more complex emergent "intelligence signals". To achieve this we review whether causal logic is consistent with, and can explain and predict, the function of complete perceptual processes associated with intelligence. Primarily these are defined as the range of Event Related Potentials (ERP), including their primary subcomponents, Event Related Desynchronization (ERD) and Event Related Synchronization (ERS). This approach aims at a universal and predictive logic for neurosimulation and AGI. The investigation has produced a general "Information Engine" model from translation of the ERD and ERS. The CML algorithm, run in terms of action cost, predicts ERP signal contents and is consistent with the fundamental laws of thermodynamics. A working substrate-independent natural information logic would be a major asset. An information theory consistent with fundamental physics can be an AGI; it can also operate within the genetic information space and provides a roadmap to understand the live biophysical operation of the phenotype.
NASA Astrophysics Data System (ADS)
Gotoda, Hiroshi; Kinugawa, Hikaru; Tsujimoto, Ryosuke; Domen, Shohei; Okuno, Yuta
2017-04-01
Complex-network theory has attracted considerable attention for nearly a decade, and it enables us to encompass our understanding of nonlinear dynamics in complex systems in a wide range of fields, including applied physics and mechanical, chemical, and electrical engineering. We conduct an experimental study using a pragmatic online detection methodology based on complex-network theory to prevent a limiting unstable state such as blowout in a confined turbulent combustion system. This study introduces a modified version of the natural visibility algorithm based on the idea of a visibility limit to serve as a pragmatic online detector. The average degree of the modified version of the natural visibility graph allows us to detect the onset of blowout, resulting in online prevention.
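The following sketch builds a natural visibility graph restricted to a finite time window, a simple reading of the visibility-limit idea, and reports the average degree as the indicator; the window length and the toy series are assumptions and this is not the authors' exact detector.

    # Natural visibility graph restricted to a finite time window, with the
    # average degree as an online indicator. Window length and toy series are
    # assumptions for illustration.

    def visible(series, a, b):
        """Natural visibility criterion between samples a < b."""
        ya, yb = series[a], series[b]
        return all(series[c] < yb + (ya - yb) * (b - c) / (b - a)
                   for c in range(a + 1, b))

    def average_degree(series, visibility_limit=10):
        n = len(series)
        degree = [0] * n
        for a in range(n - 1):
            for b in range(a + 1, min(a + visibility_limit + 1, n)):
                if visible(series, a, b):
                    degree[a] += 1
                    degree[b] += 1
        return sum(degree) / n

    if __name__ == "__main__":
        import math, random
        random.seed(3)
        periodic = [math.sin(0.3 * t) for t in range(300)]
        noisy = [random.gauss(0, 1) for _ in range(300)]
        print("average degree (periodic):", round(average_degree(periodic), 2))
        print("average degree (noisy)   :", round(average_degree(noisy), 2))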
Ko, Linda K; Turner-McGrievy, Gabrielle M; Campbell, Marci K
2014-04-01
Podcasting is an emerging technology, and previous interventions have shown promising results using theory-based podcast for weight loss among overweight and obese individuals. This study investigated whether constructs of social cognitive theory and information processing theories (IPTs) mediate the effect of a podcast intervention on weight loss among overweight individuals. Data are from Pounds off Digitally, a study testing the efficacy of two weight loss podcast interventions (control podcast and theory-based podcast). Path models were constructed (n = 66). The IPTs, elaboration likelihood model, information control theory, and cognitive load theory mediated the effect of a theory-based podcast on weight loss. The intervention was significantly associated with all IPTs. Information control theory and cognitive load theory were related to elaboration, and elaboration was associated with weight loss. Social cognitive theory constructs did not mediate weight loss. Future podcast interventions grounded in theory may be effective in promoting weight loss.
Dietrich, Ariana B; Hu, Xiaoqing; Rosenfeld, J Peter
2014-03-01
In the first of two experiments, we compared the accuracy of the P300 concealed information test protocol as a function of numbers of trials experienced by subjects and ERP averages analyzed by investigators. Contrary to Farwell et al. (Cogn Neurodyn 6(2):115-154, 2012), we found no evidence that 100 trial based averages are more accurate than 66 or 33 trial based averages (all numbers led to accuracies of 84-94 %). There was actually a trend favoring the lowest trial numbers. The second study compared numbers of irrelevant stimuli recalled and recognized in the 3-stimulus protocol versus the complex trial protocol (Rosenfeld in Memory detection: theory and application of the concealed information test, Cambridge University Press, New York, pp 63-89, 2011). Again, in contrast to expectations from Farwell et al. (Cogn Neurodyn 6(2):115-154, 2012), there were no differences between protocols, although there were more irrelevant stimuli recognized than recalled, and irrelevant 4-digit number group stimuli were neither recalled nor recognized as well as irrelevant city name stimuli. We therefore conclude that stimulus processing in the P300-based complex trial protocol-with no more than 33 sweep averages-is adequate to allow accurate detection of concealed information.
Enhancing Undergraduate Mathematics Curriculum via Coding Theory and Cryptography
ERIC Educational Resources Information Center
Aydin, Nuh
2009-01-01
The theory of error-correcting codes and cryptography are two relatively recent applications of mathematics to information and communication systems. The mathematical tools used in these fields generally come from algebra, elementary number theory, and combinatorics, including concepts from computational complexity. It is possible to introduce the…
Complexity of possibly gapped histogram and analysis of histogram.
Fushing, Hsieh; Roy, Tania
2018-02-01
We demonstrate that gaps and distributional patterns embedded within real-valued measurements are inseparable biological and mechanistic information contents of the system. Such patterns are discovered through a data-driven possibly gapped histogram, which further leads to the geometry-based analysis of histogram (ANOHT). Constructing a possibly gapped histogram is a complex problem of statistical mechanics due to the ensemble of candidate histograms being captured by a two-layer Ising model. This construction is also a distinctive problem of Information Theory from the perspective of data compression via uniformity. By defining a Hamiltonian (or energy) as the sum of total coding lengths of boundaries and total decoding errors within bins, this issue of computing the minimum-energy macroscopic states is surprisingly resolved by applying the hierarchical clustering algorithm. Thus, a possibly gapped histogram corresponds to a macro-state. The first phase of ANOHT is then developed for simultaneous comparison of multiple treatments, while the second phase is developed, based on classical empirical process theory, for a tree geometry that can check the authenticity of branches of the treatment tree. The well-known Iris data are used to illustrate our technical developments. Also, a large baseball pitching dataset and a heavily right-censored divorce dataset are analysed to showcase the existential gaps and utilities of ANOHT.
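The clustering step behind a possibly gapped histogram can be sketched as follows. The choice of single linkage, the bin count, and the synthetic bimodal data are illustrative assumptions; the coding-length Hamiltonian and the two ANOHT phases from the paper are not reproduced here.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def possibly_gapped_bins(x, n_bins=8):
    """Cut a single-linkage dendrogram of 1-D data into bins.

    Bins are contiguous clusters of sorted values; intervals between consecutive
    clusters that contain no data show up as gaps.  Returns (low, high) bin edges."""
    x = np.sort(np.asarray(x, dtype=float))
    Z = linkage(x.reshape(-1, 1), method="single")
    labels = fcluster(Z, t=n_bins, criterion="maxclust")
    bins = []
    for lab in np.unique(labels):
        vals = x[labels == lab]
        bins.append((vals.min(), vals.max()))
    return sorted(bins)

# usage: two well-separated modes produce a visible gap between bins
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(0, 1, 300), rng.normal(8, 0.5, 200)])
for lo, hi in possibly_gapped_bins(data, n_bins=6):
    print(f"[{lo:.2f}, {hi:.2f}]")
```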
Plasma Parameters From Reentry Signal Attenuation
Statom, T. K.
2018-02-27
This study presents the application of a theoretically developed method that provides plasma parameter solution space information from measured RF attenuation that occurs during reentry. The purpose is to provide reentry plasma parameter information from the communication signal attenuation. The theoretical development centers around the attenuation and the complex index of refraction. The methodology uses an imaginary-index-of-refraction matching algorithm with a tolerance to find suitable solutions that satisfy the theory. The imaginary matching terms are then used to determine the real index of refraction, resulting in the complex index of refraction. Then a filter is used to reject nonphysical solutions. Signal attenuation-based plasma parameter properties investigated include the complex index of refraction, plasma frequency, electron density, collision frequency, propagation constant, attenuation constant, phase constant, complex plasma conductivity, and electron mobility. RF plasma thickness attenuation is investigated and compared to the literature. Finally, similar plasma thicknesses for a specific signal attenuation can have different plasma properties.
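The solution-space idea can be illustrated with a schematic grid search, assuming an unmagnetized, collisional (Drude-type) plasma slab and neglecting interface reflections. The dispersion model, parameter ranges, tolerance, and function name below are assumptions for this sketch, not the paper's algorithm.

```python
import numpy as np

def candidate_plasma_parameters(atten_db, thickness_m, f_hz, tol_db=0.5, n_grid=200):
    """Grid search for (plasma frequency, collision frequency) pairs whose Drude-model
    complex refractive index reproduces a measured slab attenuation within a tolerance."""
    w = 2 * np.pi * f_hz
    wp = 2 * np.pi * np.logspace(8, 11, n_grid)   # candidate plasma frequencies (rad/s)
    nu = np.logspace(7, 11, n_grid)               # candidate collision frequencies (1/s)
    WP, NU = np.meshgrid(wp, nu)
    eps = 1.0 - WP**2 / (w * (w - 1j * NU))       # Drude relative permittivity
    n_complex = np.sqrt(eps)
    kappa = np.abs(n_complex.imag)                # imaginary part of the index
    alpha = w * kappa / 3.0e8                     # attenuation constant, Np/m
    atten_model_db = 8.686 * alpha * thickness_m  # Np -> dB through the slab
    mask = np.abs(atten_model_db - atten_db) < tol_db
    return WP[mask] / (2 * np.pi), NU[mask]       # plasma freq (Hz), collision freq (1/s)

fp, nu = candidate_plasma_parameters(atten_db=20.0, thickness_m=0.05, f_hz=2.3e9)
print(len(fp), "candidate parameter pairs")
```

The non-uniqueness of the returned pairs mirrors the paper's observation that similar attenuation can correspond to different plasma properties.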
Lessard, Chantale; Contandriopoulos, André-Pierre; Beaulieu, Marie-Dominique
2010-06-01
Despite increasing interest in health economic evaluation, investigations have shown limited use by micro (clinical) level decision-makers. A considerable amount of health decisions take place daily at the point of the clinical encounter; especially in primary care. Since every decision has an opportunity cost, ignoring economic information in family physicians' (FPs) decision-making may have a broad impact on health care efficiency. Knowledge translation of economic evaluation is often based on taken-for-granted assumptions about actors' interests and interactions, neglecting much of the complexity of social reality. Health economics literature frequently assumes a rational and linear decision-making process. Clinical decision-making is in fact a complex social, dynamic, multifaceted process, involving relationships and contextual embeddedness. FPs are embedded in complex social networks that have a significant impact on skills, attitudes, knowledge, practices, and on the information being used. Because of their socially constructed nature, understanding preferences, professional culture, practices, and knowledge translation requires serious attention to social reality. There has been little exploration by health economists of whether the problem may be more fundamental and reside in a misunderstanding of the process of decision-making. There is a need to enhance our understanding of the role of economic evaluation in decision-making from a disciplinary perspective different than health economics. This paper argues for a different conceptualization of the role of economic evaluation in FPs' decision-making, and proposes Bourdieu's sociological theory as a research framework. Bourdieu's theory of practice illustrates how the context-sensitive nature of practice must be understood as a socially constituted practical knowledge. The proposed approach could substantially contribute to a more complex understanding of the role of economic evaluation in FPs' decision-making. Copyright 2010 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Tomasino, Arthur P.
2013-01-01
In spite of the best efforts of researchers and practitioners, Information Systems (IS) developers are having problems "getting it right". IS developments are challenged by the emergence of unanticipated IS characteristics undermining managers ability to predict and manage IS change. Because IS are complex, development formulas, best…
The design of dual-mode complex signal processors based on quadratic modular number codes
NASA Astrophysics Data System (ADS)
Jenkins, W. K.; Krogmeier, J. V.
1987-04-01
It has been known for a long time that quadratic modular number codes admit an unusual representation of complex numbers which leads to complete decoupling of the real and imaginary channels, thereby simplifying complex multiplication and providing error isolation between the real and imaginary channels. This paper first presents a tutorial review of the theory behind the different types of complex modular rings (fields) that result from particular parameter selections, and then presents a theory for a 'dual-mode' complex signal processor based on the choice of augmented power-of-2 moduli. It is shown how a diminished-1 binary code, used by previous designers for the realization of Fermat number transforms, also leads to efficient realizations for dual-mode complex arithmetic for certain augmented power-of-2 moduli. Then a design is presented for a recursive complex filter based on a ROM/ACCUMULATOR architecture and realized in an augmented power-of-2 quadratic code, and a computer-generated example of a complex recursive filter is shown to illustrate the principles of the theory.
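The channel decoupling can be illustrated with a small quadratic-residue number system (QRNS) sketch. For simplicity it uses the Fermat prime 257 with j = 16 rather than the augmented power-of-2 moduli and diminished-1 code discussed in the paper; function names are illustrative.

```python
def qrns_encode(a, b, p, j):
    """Map a complex residue a + b*i to the decoupled pair (z, z*) mod p, where j*j ≡ -1 (mod p)."""
    return ((a + j * b) % p, (a - j * b) % p)

def qrns_decode(z, zs, p, j):
    """Recover (a, b) from the decoupled pair (uses Python 3.8+ modular inverses)."""
    inv2 = pow(2, -1, p)
    inv2j = pow(2 * j, -1, p)
    return ((z + zs) * inv2 % p, (z - zs) * inv2j % p)

def qrns_multiply(x, y, p, j):
    """Complex modular multiplication with the real/imaginary channels decoupled:
    each channel needs only one modular multiply, and errors do not cross channels."""
    z1, z1s = qrns_encode(*x, p, j)
    z2, z2s = qrns_encode(*y, p, j)
    return qrns_decode(z1 * z2 % p, z1s * z2s % p, p, j)

# p = 257 = 2^8 + 1 (a Fermat prime); j = 16 since 16*16 = 256 ≡ -1 (mod 257)
p, j = 257, 16
print(qrns_multiply((3, 4), (5, 6), p, j))         # decoupled result
print(((3 * 5 - 4 * 6) % p, (3 * 6 + 4 * 5) % p))  # direct (a+bi)(c+di) check
```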
Feasibility study of molecular memory device based on DNA using methylation to store information
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Liming; Al-Dirini, Feras; Center for Neural Engineering
DNA, because of its robustness and dense information storage capability, has been proposed as a potential candidate for next-generation storage media. However, encoding information into the DNA sequence requires molecular synthesis technology, which to date is costly and prone to synthesis errors. Reading the DNA strand information is also complex. Ideally, DNA storage will provide methods for modifying stored information. Here, we conduct a feasibility study investigating the use of the DNA 5-methylcytosine (5mC) methylation state as a molecular memory to store information. We propose a new 1-bit memory device and study, based on the density functional theory and non-equilibrium Green's function method, the feasibility of electrically reading the information. Our results show that changes to methylation states lead to changes in the peak of negative differential resistance which can be used to interrogate memory state. Our work demonstrates a new memory concept based on methylation state which can be beneficial in the design of next generation DNA based molecular electronic memory devices.
Mathematical biodescriptors of proteomics maps: background and applications.
Basak, Subhash C; Gute, Brian D
2008-05-01
This article reviews recent developments in the formulation and application of biodescriptors to characterize proteomics maps. Such biodescriptors can be derived by applying techniques from discrete mathematics (graph theory, linear algebra and information theory). This review focuses on the development of biodescriptors for proteomics maps derived from 2D gel electrophoresis. Preliminary results demonstrated that such descriptors have a reasonable ability to differentiate between proteomics patterns that result from exposure to closely related individual chemicals and complex mixtures, such as the jet fuel JP-8. Further research is required to evaluate the utility of these proteomics-based biodescriptors for drug discovery and predictive toxicology.
Approximating Reflectance and Transmittance of Vegetation Using Multiple Spectral Invariants
NASA Astrophysics Data System (ADS)
Mottus, M.
2011-12-01
Canopy spectral invariants, eigenvalues of the radiative transfer equation and photon recollision probability are some of the new theoretical tools that have been applied in remote sensing of vegetation and atmosphere. The theoretical approach based on spectral invariants, informally also referred to as the p-theory, owes its attractiveness to several factors. Firstly, it provides a rapid and physically-based way of describing canopy scattering. Secondly, the p-theory aims at parameterizing canopy structure in reflectance models using a simple and intuitive concept which can be applied at various structural levels, from shoot to tree crown. The theory has already been applied at scales from the molecular level to forest stands. The most important shortcoming of the p-theory lies in its inability to predict the directionality of scattering. The theory is currently based on only one physical parameter, the photon recollision probability p. It is evident that one parameter cannot contain enough information to reasonably predict the observed complex reflectance patterns produced by natural vegetation canopies. Without estimating scattering directionality, however, the theory cannot be compared with even the simplest (and well-tested) two-stream vegetation reflectance models. In this study, we evaluate the possibility of using additional parameters to fit the measured reflectance and transmittance of a vegetation stand. As a first step, the parameters are applied to separate canopy scattering into reflectance and transmittance. New parameters are introduced following the general approach of eigenvector expansion. Thus, the new parameters are coined higher-order spectral invariants. Calculation of higher-order invariants is based on separating first-order scattering from total scattering. Thus, the method explicitly accounts for different view geometries with different fractions of visible sunlit canopy (e.g., hot-spot). It additionally allows producing different irradiation levels on leaf surfaces for direct and diffuse incidence, thus (in theory) allowing more accurate calculation of potential photosynthesis rates. Similarly to the p-theory, the use of multiple spectral invariants facilitates easy parametrization of canopy structure and scaling between different structural levels (leaf-shoot-stand). Spectral invariant-based remote sensing approaches are well suited for relatively large pixels even when no detailed ground truth information is available. In a case study, the theory of multiple spectral invariants was applied to measured canopy scattering. Spectral reflectance and transmittance measurements were carried out in a gray alder (Alnus incana) plantation at Tartu Observatory, Estonia, in August 2006. The equations produced by the theory of spectral invariants were fitted to measured radiation fluxes. Preliminary results indicate that quantities with invariant-like behavior may indeed be used to approximate canopy scattering directionality.
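As a rough illustration of the p-theory building block being extended here, the sketch below computes total canopy scattering from the recollision probability and splits it with one extra partitioning parameter. The parameter q_up is a hypothetical stand-in for the higher-order invariants, not the formulation used in the study.

```python
def canopy_scattering(omega_leaf, p):
    """Total canopy scattering coefficient from the photon recollision probability p:
    each interaction scatters a fraction omega_leaf, and a scattered photon hits the
    canopy again with probability p, giving a geometric series omega*(1-p)/(1-p*omega)."""
    return omega_leaf * (1.0 - p) / (1.0 - p * omega_leaf)

def reflectance_transmittance(omega_leaf, p, q_up, i0=1.0):
    """Split canopy scattering into reflectance and transmittance using one extra
    invariant-like parameter q_up, the fraction of escaping photons leaving upward;
    i0 is the canopy interceptance of the incoming radiation."""
    scattered = i0 * canopy_scattering(omega_leaf, p)
    return q_up * scattered, (1.0 - q_up) * scattered

# example: near-infrared leaf albedo ~0.9, p ~0.6, 55 % of escaping photons go upward
print(reflectance_transmittance(0.9, 0.6, 0.55, i0=0.85))
```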
LeVine, Michael V.; Weinstein, Harel
2014-01-01
Complex networks of interacting residues and microdomains in the structures of biomolecular systems underlie the reliable propagation of information from an input signal, such as the concentration of a ligand, to sites that generate the appropriate output signal, such as enzymatic activity. This information transduction often carries the signal across relatively large distances at the molecular scale in a form of allostery that is essential for the physiological functions performed by biomolecules. While allosteric behaviors have been documented from experiments and computation, the mechanism of this form of allostery proved difficult to identify at the molecular level. Here, we introduce a novel analysis framework, called N-body Information Theory (NbIT) analysis, which is based on information theory and uses measures of configurational entropy in a biomolecular system to identify microdomains and individual residues that act as (i)-channels for long-distance information sharing between functional sites, and (ii)-coordinators that organize dynamics within functional sites. Application of the new method to molecular dynamics (MD) trajectories of the occluded state of the bacterial leucine transporter LeuT identifies a channel of allosteric coupling between the functionally important intracellular gate and the substrate binding sites known to modulate it. NbIT analysis is shown also to differentiate residues involved primarily in stabilizing the functional sites, from those that contribute to allosteric couplings between sites. NbIT analysis of MD data thus reveals rigorous mechanistic elements of allostery underlying the dynamics of biomolecular systems. PMID:24785005
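The basic information-theoretic building block of such an analysis can be sketched as a Gaussian (quasi-harmonic) mutual-information estimate between two coordinate groups. NbIT as published uses more elaborate estimators and higher-order co-information terms, so this is only an illustrative approximation with a synthetic "trajectory".

```python
import numpy as np

def gaussian_mutual_information(X, Y):
    """Mutual information (nats) between two coordinate groups from a trajectory, under a
    multivariate-Gaussian approximation of their configurational densities:
    I(X;Y) = 0.5 * (log|Cov_X| + log|Cov_Y| - log|Cov_XY|)."""
    XY = np.hstack([X, Y])
    _, ld_x = np.linalg.slogdet(np.cov(X, rowvar=False))
    _, ld_y = np.linalg.slogdet(np.cov(Y, rowvar=False))
    _, ld_xy = np.linalg.slogdet(np.cov(XY, rowvar=False))
    return 0.5 * (ld_x + ld_y - ld_xy)

# toy "trajectory": 5000 frames, two 3-D sites with a shared fluctuation mode
rng = np.random.default_rng(2)
shared = rng.standard_normal((5000, 1))
site_a = rng.standard_normal((5000, 3)) + shared
site_b = rng.standard_normal((5000, 3)) + 0.8 * shared
print(gaussian_mutual_information(site_a, site_b))
```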
Objects and processes: Two notions for understanding biological information.
Mercado-Reyes, Agustín; Padilla-Longoria, Pablo; Arroyo-Santos, Alfonso
2015-09-07
In spite of being ubiquitous in life sciences, the concept of information is harshly criticized. Uses of the concept other than those derived from Shannon's theory are denounced as metaphoric. We perform a computational experiment to explore whether Shannon's information is adequate to describe the uses of said concept in commonplace scientific practice. Our results show that semantic sequences do not have unique complexity values different from the value of meaningless sequences. This result suggests that quantitative theoretical frameworks do not account fully for the complex phenomenon that the term "information" refers to. We propose a restructuring of the concept into two related, but independent notions, and conclude that a complete theory of biological information must account completely not only for both notions, but also for the relationship between them. Copyright © 2015 Elsevier Ltd. All rights reserved.
A new approach to preserve privacy data mining based on fuzzy theory in numerical database
NASA Astrophysics Data System (ADS)
Cui, Run; Kim, Hyoung Joong
2014-01-01
With the rapid development of information techniques, data mining approaches have become one of the most important tools to discover in-depth associations of tuples in large-scale databases. Hence, protecting private information, especially during the data mining procedure, is a major challenge. In this paper, a new method for privacy protection based on fuzzy theory is proposed. Traditional fuzzy approaches in this area apply fuzzification to the data without considering its readability. A new style of obscured data expression is introduced to provide more details of the subsets without reducing readability. We also adopt an approach that balances the privacy level and utility when forming suitable subgroups. An experiment shows that this approach supports classification without loss of accuracy. In the future, given the low computational complexity of the fuzzy function, this approach can be adapted to data streams with suitable modification.
The Complexity of Language Learning
ERIC Educational Resources Information Center
Nelson, Charles
2011-01-01
This paper takes a complexity theory approach to looking at language learning, an approach that investigates how language learners adapt to and interact with people and their environment. Based on interviews with four graduate students, it shows how complexity theory can help us understand both the situatedness of language learning and also…
Automatic Trading Agent. RMT Based Portfolio Theory and Portfolio Selection
NASA Astrophysics Data System (ADS)
Snarska, M.; Krzych, J.
2006-11-01
Portfolio theory is a very powerful tool in modern investment theory. It is helpful in estimating the risk of an investor's portfolio, arising from lack of information, uncertainty and incomplete knowledge of reality, which forbids a perfect prediction of future price changes. Despite its many advantages, this tool is not known and not widely used among investors on the Warsaw Stock Exchange. The main reason for abandoning this method is a high level of complexity and immense calculations. The aim of this paper is to introduce an automatic decision-making system, which allows a single investor to use complex methods of Modern Portfolio Theory (MPT). The key tool in MPT is the analysis of an empirical covariance matrix. This matrix, obtained from historical data, is biased by such a high amount of statistical uncertainty that it can be seen as random. By bringing into practice the ideas of Random Matrix Theory (RMT), the noise is removed or significantly reduced, so the future risk and return are better estimated and controlled. These concepts are applied to the Warsaw Stock Exchange Simulator {http://gra.onet.pl}. The result of the simulation is an 18% gain, compared with a 10% loss of the Warsaw Stock Exchange main index WIG over the same period.
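The covariance-cleaning step can be sketched as eigenvalue clipping at the Marchenko-Pastur edge. The clipping rule, function name, and synthetic returns below are illustrative assumptions; the simulator integration and trading logic from the paper are not reproduced.

```python
import numpy as np

def rmt_cleaned_covariance(returns):
    """Clip correlation-matrix eigenvalues below the Marchenko-Pastur upper edge.

    returns: (T, N) array of asset returns.  Eigenvalues inside the random bulk
    bounded above by (1 + sqrt(N/T))^2 are treated as noise and replaced by their
    average (preserving the trace); eigenvalues above the edge are kept as signal."""
    T, N = returns.shape
    std = returns.std(axis=0)
    corr = np.corrcoef(returns, rowvar=False)
    lam, vec = np.linalg.eigh(corr)
    lam_plus = (1 + np.sqrt(N / T)) ** 2        # Marchenko-Pastur upper edge
    noise = lam < lam_plus
    lam_clean = lam.copy()
    if noise.any():
        lam_clean[noise] = lam[noise].mean()    # flatten the noise bulk
    corr_clean = vec @ np.diag(lam_clean) @ vec.T
    np.fill_diagonal(corr_clean, 1.0)
    return corr_clean * np.outer(std, std)      # back to a covariance matrix

# usage: feed the cleaned covariance to a Markowitz minimum-variance optimizer
rng = np.random.default_rng(3)
cov_clean = rmt_cleaned_covariance(rng.standard_normal((500, 100)))
print(cov_clean.shape)
```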
Systemic Operational Design: Epistemological Bumpf or the Way Ahead for Operational Design?
2006-05-25
"...facilitating the design of such architectural frames (meta-concepts), they are doomed to be trapped in a simplistic structuralist approach." ...systems theory and complexity theory. SOD emerged and evolved in response to inherent challenges in the contemporary Israeli security environment ...discussed in subsequent chapters. Theory is critical to this examination of the CEOD approach and SOD because theory underpins and informs
Macfarlane, Fraser; Greenhalgh, Trish; Humphrey, Charlotte; Hughes, Jane; Butler, Ceri; Pawson, Ray
2011-01-01
This paper seeks to describe the exploration of human resource issues in one large-scale program of innovation in healthcare. It is informed by established theories of management in the workplace and a multi-level model of diffusion of innovations. A realist approach was used based on interviews, ethnographic observation and documentary analysis. Five main approaches ("theories of change") were adopted to develop and support the workforce: recruiting staff with skills in service transformation; redesigning roles and creating new roles; enhancing workforce planning; linking staff development to service needs; creating opportunities for shared learning and knowledge exchange. Each had differing levels of success. The paper includes HR implications for the modernisation of a complex service organisation. This is the first time a realist evaluation of a complex health modernisation initiative has been undertaken.
Granular support vector machines with association rules mining for protein homology prediction.
Tang, Yuchun; Jin, Bo; Zhang, Yan-Qing
2005-01-01
Protein homology prediction between protein sequences is one of the critical problems in computational biology. Such a complex classification problem is common in medical or biological information processing applications. How to build a model with superior generalization capability from training samples is an essential issue for mining knowledge to accurately predict/classify unseen new samples and to effectively support human experts in making correct decisions. A new learning model called granular support vector machines (GSVM) is proposed based on our previous work. GSVM systematically and formally combines the principles from statistical learning theory and granular computing theory and thus provides an interesting new mechanism to address complex classification problems. It works by building a sequence of information granules and then building support vector machines (SVM) in some of these information granules on demand. A good granulation method to find suitable granules is crucial for modeling a GSVM with good performance. In this paper, we also propose an association rules-based granulation method. For the granules induced by association rules with high enough confidence and significant support, we leave them as they are because of their high "purity" and significant effect on simplifying the classification task. For every other granule, a SVM is modeled to discriminate the corresponding data. In this way, a complex classification problem is divided into multiple smaller problems so that the learning task is simplified. The proposed algorithm, here named GSVM-AR, is compared with SVM on KDDCUP04 protein homology prediction data. The experimental results show that finding the splitting hyperplane is not a trivial task (we should be careful in selecting the association rules to avoid overfitting) and that GSVM-AR does show significant improvement compared to building one single SVM in the whole feature space. Another advantage of GSVM-AR is that it is easy to implement. More importantly and more interestingly, GSVM provides a new mechanism to address complex classification problems.
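A toy sketch of the granule-then-SVM idea follows. The rule representation (plain predicates standing in for mined association rules), the confidence threshold, class names, and synthetic data are assumptions, not the GSVM-AR implementation from the paper.

```python
import numpy as np
from sklearn.svm import SVC

class GSVMARSketch:
    """Toy granular SVM: rule-induced granules that are pure enough on the training data
    are labelled directly; a single SVM handles every sample no rule covers."""

    def __init__(self, rules, min_confidence=0.95):
        # rules: list of (predicate, label); predicates stand in for mined rules
        self.rules = rules
        self.min_confidence = min_confidence
        self.svm = SVC(kernel="rbf", gamma="scale")

    def fit(self, X, y):
        covered = np.zeros(len(X), dtype=bool)
        self.active_rules = []
        for pred, label in self.rules:
            hit = np.array([pred(x) for x in X])
            # keep the rule only if its granule is sufficiently pure
            if hit.any() and (y[hit] == label).mean() >= self.min_confidence:
                self.active_rules.append((pred, label))
                covered |= hit
        if (~covered).any():
            self.svm.fit(X[~covered], y[~covered])   # SVM for the residual granule
        return self

    def predict(self, X):
        out = np.full(len(X), -1, dtype=int)
        remaining = np.ones(len(X), dtype=bool)
        for pred, label in self.active_rules:
            hit = np.array([pred(x) for x in X]) & remaining
            out[hit], remaining[hit] = label, False
        if remaining.any():
            out[remaining] = self.svm.predict(X[remaining])
        return out

# usage with one hypothetical mined rule on feature 0
rng = np.random.default_rng(8)
X = rng.standard_normal((400, 5)); y = (X[:, 0] > 1.0).astype(int)
model = GSVMARSketch(rules=[(lambda x: x[0] > 2.0, 1)]).fit(X, y)
print((model.predict(X) == y).mean())
```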
Taxonomy for complexity theory in the context of maternity care.
Nieuwenhuijze, Marianne; Downe, Soo; Gottfreðsdóttir, Helga; Rijnders, Marlies; du Preez, Antoinette; Vaz Rebelo, Piedade
2015-09-01
The linear focus of 'normal science' is unable to adequately take account of the complex interactions that direct health care systems. There is a turn towards complexity theory as a more appropriate framework for understanding system behaviour. However, a comprehensive taxonomy for complexity theory in the context of health care is lacking. This paper aims to build a taxonomy based on the key complexity theory components that have been used in publications on complexity theory and health care, and to explore their explanatory power for health care system behaviour, specifically for maternity care. A search strategy was devised in PubMed and 31 papers were identified as relevant for the taxonomy. The final taxonomy for complexity theory included and defined 11 components. The use of waterbirth and the impact of the Term Breech trial showed that each of the components of our taxonomy has utility in helping to understand how these techniques became widely adopted. It is not just the components themselves that characterise a complex system but also the dynamics between them. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Domercant, Jean Charles
The combination of today's national security environment and mandated acquisition policies makes it necessary for military systems to interoperate with each other to greater degrees. This growing interdependency results in complex Systems-of-Systems (SoS) that only continue to grow in complexity to meet evolving capability needs. Thus, timely and affordable acquisition becomes more difficult, especially in the face of mounting budgetary pressures. To counter this, architecting principles must be applied to SoS design. The research objective is to develop an Architecture Real Options Complexity-Based Valuation Methodology (ARC-VM) suitable for acquisition-level decision making, where there is a stated desire for more informed tradeoffs between cost, schedule, and performance during the early phases of design. First, a framework is introduced to measure architecture complexity as it directly relates to military SoS. Development of the framework draws upon a diverse set of disciplines, including Complexity Science, software architecting, measurement theory, and utility theory. Next, a Real Options based valuation strategy is developed using techniques established for financial stock options that have recently been adapted for use in business and engineering decisions. The derived complexity measure provides architects with an objective measure of complexity that focuses on relevant complex system attributes. These attributes are related to the organization and distribution of SoS functionality and the sharing and processing of resources. The use of Real Options provides the necessary conceptual and visual framework to quantifiably and traceably combine measured architecture complexity, time-valued performance levels, as well as programmatic risks and uncertainties. An example suppression of enemy air defenses (SEAD) capability demonstrates the development and usefulness of the resulting architecture complexity & Real Options based valuation methodology. Different portfolios of candidate system types are used to generate an array of architecture alternatives that are then evaluated using an engagement model. This performance data is combined with both measured architecture complexity and programmatic data to assign an acquisition value to each alternative. This proves useful when selecting alternatives most likely to meet current and future capability needs.
A Holoinformational Model of the Physical Observer
NASA Astrophysics Data System (ADS)
di Biase, Francisco
2013-09-01
The author proposes a holoinformational view of the observer based on the holonomic theory of brain/mind function and quantum brain dynamics developed by Karl Pribram, Sir John Eccles, R.L. Amoroso, Hameroff, Jibu and Yasue, and on the quantum-holographic and holomovement theory of David Bohm. This conceptual framework is integrated with the nonlocal information properties of the Quantum Field Theory of Umesawa, with the concepts of negentropy, order, and organization developed by Shannon, Wiener, Szilard and Brillouin, and with the theories of self-organization and complexity of Prigogine, Atlan, Jantsch and Kauffman. Wheeler's "it from bit" concept of a participatory universe, and the developments of the physics of information made by Zureck and others with the concepts of statistical entropy and algorithmic entropy, related to the number of bits being processed in the mind of the observer, are also considered. This new synthesis gives a self-organizing quantum nonlocal informational basis for a new model of awareness in a participatory universe. In this synthesis, awareness is conceived as meaningful quantum nonlocal information interconnecting the brain and the cosmos through a holoinformational unified field integrating the nonlocal holistic (quantum) and the local (Newtonian). We propose that the cosmology of the physical observer is this unified nonlocal quantum-holographic cosmos manifesting itself through awareness, interconnecting the human mind-brain, in a participatory, holistic and indivisible way, to all levels of the self-organizing holographic anthropic multiverse.
Introducing Evidence Through Research "Push": Using Theory and Qualitative Methods.
Morden, Andrew; Ong, Bie Nio; Brooks, Lauren; Jinks, Clare; Porcheret, Mark; Edwards, John J; Dziedzic, Krysia S
2015-11-01
A multitude of factors can influence the uptake and implementation of complex interventions in health care. A plethora of theories and frameworks recognize the need to establish relationships, understand organizational dynamics, address context and contingency, and engage key decision makers. Less attention is paid to how theories that emphasize relational contexts can actually be deployed to guide the implementation of an intervention. The purpose of the article is to demonstrate the potential role of qualitative research aligned with theory to inform complex interventions. We detail a study underpinned by theory and qualitative research that (a) ensured key actors made sense of the complex intervention at the earliest stage of adoption and (b) aided initial engagement with the intervention. We conclude that using theoretical approaches aligned with qualitative research can provide insights into the context and dynamics of health care settings that in turn can be used to aid intervention implementation. © The Author(s) 2015.
Quantum-like Viewpoint on the Complexity and Randomness of the Financial Market
NASA Astrophysics Data System (ADS)
Choustova, Olga
In economics and financial theory, analysts use random walk and more general martingale techniques to model behavior of asset prices, in particular share prices on stock markets, currency exchange rates and commodity prices. This practice has its basis in the presumption that investors act rationally and without bias, and that at any moment they estimate the value of an asset based on future expectations. Under these conditions, all existing information affects the price, which changes only when new information comes out. By definition, new information appears randomly and influences the asset price randomly. Corresponding continuous time models are based on stochastic processes (this approach was initiated in the thesis of [4]), see, e.g., the books of [33] and [37] for historical and mathematical details.
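A minimal geometric Brownian motion sketch of this random-walk/martingale view is shown below; the drift, volatility, horizon, and function name are arbitrary illustrative values rather than anything from the cited literature.

```python
import numpy as np

def simulate_gbm(s0=100.0, mu=0.05, sigma=0.2, dt=1 / 252, n_steps=252, n_paths=1000, seed=4):
    """Geometric Brownian motion paths: the standard continuous-time embodiment of the
    random-walk/martingale picture in which only new (random) information moves prices."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal((n_paths, n_steps))
    log_increments = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
    return s0 * np.exp(np.cumsum(log_increments, axis=1))

paths = simulate_gbm()
print(paths[:, -1].mean())   # ≈ s0 * exp(mu * 1 year) under the assumed drift
```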
Global Complexity: Information, Chaos, and Control at ASIS 1996 Annual Meeting.
ERIC Educational Resources Information Center
Jacob, M. E. L.
1996-01-01
Discusses proceedings of the 1996 ASIS (American Society for Information Science) annual meeting in Baltimore (Maryland), including chaos theory; electronic universities; distance education; intellectual property, including information privacy on the Internet; the need for leadership in libraries and information centers; information warfare and…
NASA Astrophysics Data System (ADS)
Tacnet, Jean-Marc; Dupouy, Guillaume; Carladous, Simon; Dezert, Jean; Batton-Hubert, Mireille
2017-04-01
In mountain areas, natural phenomena such as snow avalanches, debris-flows and rock-falls put people and objects at risk with sometimes dramatic consequences. Risk is classically considered as a combination of hazard, i.e. the combination of the intensity and frequency of the phenomenon, and vulnerability, which corresponds to the consequences of the phenomenon on exposed people and material assets. Risk management consists in identifying the risk level as well as choosing the best strategies for risk prevention, i.e. mitigation. In the context of natural phenomena in mountainous areas, technical and scientific knowledge is often lacking. Risk management decisions are therefore based on imperfect information. This information comes from more or less reliable sources ranging from historical data and expert assessments to numerical simulations. Finally, risk management decisions are the result of complex knowledge management and reasoning processes. Tracing the information and propagating information quality from data acquisition to decisions are therefore important steps in the decision-making process. One major goal today is therefore to assist decision-making while considering the availability, quality and reliability of information content and sources. A global integrated framework is proposed to improve the risk management process in a context of information imperfection provided by more or less reliable sources: uncertainty as well as imprecision, inconsistency and incompleteness are considered. Several methods are used and associated in an original way: sequential decision context description, development of specific multi-criteria decision-making methods, imperfection propagation in numerical modeling and information fusion. This framework not only assists in decision-making but also traces the process and evaluates the impact of information quality on decision-making. We focus on and present two main developments. The first relates to uncertainty and imprecision propagation in numerical modeling, using both a classical Monte-Carlo probabilistic approach and a so-called hybrid approach using possibility theory. The second deals with new multi-criteria decision-making methods that consider information imperfection, source reliability, importance and conflict, using fuzzy sets as well as possibility and belief function theories. Implemented methods consider information imperfection propagation and information fusion in total aggregation methods such as AHP (Saaty, 1980) or partial aggregation methods such as the Electre outranking method (see Soft Electre Tri), or decisions in certain but also risky or uncertain contexts (see the new COWA-ER and FOWA-ER: Cautious and Fuzzy Ordered Weighted Averaging-Evidential Reasoning). For example, the ER-MCDA methodology considers expert assessment as a multi-criteria decision process based on imperfect information provided by more or less heterogeneous, reliable and conflicting sources: it mixes AHP, fuzzy sets theory, possibility theory and belief function theory using the DSmT (Dezert-Smarandache Theory) framework, which provides powerful fusion rules.
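To make the belief-function machinery concrete, the sketch below implements the classical Dempster rule of combination on a toy hazard frame. The framework described above relies on DSmT fusion rules and richer models, so this is only the simplest building block, with made-up masses and source names.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions over frozenset focal elements.

    m1, m2: dict mapping frozenset -> mass (each sums to 1).  Mass assigned to
    conflicting (empty-intersection) pairs is removed and the rest renormalised."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# two hypothetical sources assessing avalanche hazard level over the frame {low, high}
m_expert = {frozenset({"high"}): 0.6, frozenset({"low", "high"}): 0.4}
m_model = {frozenset({"high"}): 0.3, frozenset({"low"}): 0.5, frozenset({"low", "high"}): 0.2}
print(dempster_combine(m_expert, m_model))
```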
Theory of attosecond delays in molecular photoionization.
Baykusheva, Denitsa; Wörner, Hans Jakob
2017-03-28
We present a theoretical formalism for the calculation of attosecond delays in molecular photoionization. It is shown how delays relevant to one-photon-ionization, also known as Eisenbud-Wigner-Smith delays, can be obtained from the complex dipole matrix elements provided by molecular quantum scattering theory. These results are used to derive formulae for the delays measured by two-photon attosecond interferometry based on an attosecond pulse train and a dressing femtosecond infrared pulse. These effective delays are first expressed in the molecular frame where maximal information about the molecular photoionization dynamics is available. The effects of averaging over the emission direction of the electron and the molecular orientation are introduced analytically. We illustrate this general formalism for the case of two polyatomic molecules. N2O serves as an example of a polar linear molecule characterized by complex photoionization dynamics resulting from the presence of molecular shape resonances. H2O illustrates the case of a non-linear molecule with comparably simple photoionization dynamics resulting from a flat continuum. Our theory establishes the foundation for interpreting measurements of the photoionization dynamics of all molecules by attosecond metrology.
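The Eisenbud-Wigner-Smith delay extracted from complex dipole matrix elements can be sketched as a numerical phase derivative. The Breit-Wigner-like toy matrix element, the atomic-unit grid, and the function name below are assumptions for illustration only, not the formalism of the paper.

```python
import numpy as np

HBAR_AU = 1.0  # atomic units

def wigner_delay(energies, dipole_elements):
    """Eisenbud-Wigner-Smith delay from complex dipole matrix elements:
    tau(E) = hbar * d/dE [arg d(E)], computed by numerically differentiating
    the unwrapped phase on the given energy grid (atomic units throughout)."""
    phase = np.unwrap(np.angle(dipole_elements))
    return HBAR_AU * np.gradient(phase, energies)

# toy matrix element with a resonance-like phase jump around E = 1.0 a.u.
E = np.linspace(0.5, 2.0, 500)
d = np.exp(1j * np.arctan2(0.05, 1.0 - E)) / np.sqrt((E - 1.0) ** 2 + 0.05**2)
print(wigner_delay(E, d)[250])  # delay near the resonance, in atomic units of time
```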
Vazart, Fanny; Calderini, Danilo; Puzzarini, Cristina; Skouteris, Dimitrios
2017-01-01
We propose an integrated computational strategy aimed at providing reliable thermochemical and kinetic information on the formation processes of astrochemical complex organic molecules. The approach involves state-of-the-art quantum-mechanical computations, second-order vibrational perturbation theory, and kinetic models based on capture and transition state theory together with the master equation approach. Notably, tunneling, quantum reflection, and leading anharmonic contributions are accounted for in our model. Formamide has been selected as a case study in view of its interest as a precursor in the abiotic amino acid synthesis. After validation of the level of theory chosen for describing the potential energy surface, we have investigated several pathways of the OH+CH2NH and NH2+HCHO reaction channels. Our results indicate that both reaction channels are essentially barrier-less (in the sense that all relevant transition states lie below or only marginally above the reactants) and can, therefore, occur under the low temperature conditions of interstellar objects provided that tunneling is taken into the proper account. PMID:27689448
NASA Astrophysics Data System (ADS)
Mrugalla, Florian; Kast, Stefan M.
2016-09-01
Complex formation between molecules in solution is the key process by which molecular interactions are translated into functional systems. These processes are governed by the binding or free energy of association which depends on both direct molecular interactions and the solvation contribution. A design goal frequently addressed in pharmaceutical sciences is the optimization of chemical properties of the complex partners in the sense of minimizing their binding free energy with respect to a change in chemical structure. Here, we demonstrate that liquid-state theory in the form of the solute-solute equation of the reference interaction site model provides all necessary information for such a task with high efficiency. In particular, computing derivatives of the potential of mean force (PMF), which defines the free-energy surface of complex formation, with respect to potential parameters can be viewed as a means to define a direction in chemical space toward better binders. We illustrate the methodology in the benchmark case of alkali ion binding to the crown ether 18-crown-6 in aqueous solution. In order to examine the validity of the underlying solute-solute theory, we first compare PMFs computed by different approaches, including explicit free-energy molecular dynamics simulations as a reference. Predictions of an optimally binding ion radius based on free-energy derivatives are then shown to yield consistent results for different ion parameter sets and to compare well with earlier, orders-of-magnitude more costly explicit simulation results. This proof-of-principle study, therefore, demonstrates the potential of liquid-state theory for molecular design problems.
Perfect gas effects in compressible rapid distortion theory
NASA Technical Reports Server (NTRS)
Kerschen, E. J.; Myers, M. R.
1987-01-01
The governing equations presented for small amplitude unsteady disturbances imposed on steady, compressible mean flows that are two-dimensional and nearly uniform have their basis in the perfect gas equations of state, and therefore generalize previous results based on tangent gas theory. While these equations are more complex, this complexity is required for adequate treatment of high frequency disturbances, especially when the base flow Mach number is large; under such circumstances, the simplifying assumptions of tangent gas theory are not applicable.
Aspects of Complexity in Sleep Analysis
NASA Astrophysics Data System (ADS)
Leitão, José M. N.; Da Rosa, Agostinho C.
The paper presents a selection of sleep analysis problems where some aspects and concepts of complexity come about. Emphasis is given to the electroencephalogram (EEG) as the most important sleep-related variable. The conception of the EEG as a message to be deciphered stresses the importance of the communication and information theories in this field. An optimal detector of K complexes and vertex sharp waves based on a stochastic model of sleep EEG is considered. Besides detecting, the algorithm is also able to follow the evolution of the basic ongoing activity. It is shown that both the macrostructure and microstructure of sleep can be described in terms of symbols and interpreted as sentences of a language. Syntactic models and Markov chain representations play in this context an important role.
Experimental econophysics: Complexity, self-organization, and emergent properties
NASA Astrophysics Data System (ADS)
Huang, J. P.
2015-03-01
Experimental econophysics is concerned with statistical physics of humans in the laboratory, and it is based on controlled human experiments developed by physicists to study some problems related to economics or finance. It relies on controlled human experiments in the laboratory together with agent-based modeling (for computer simulations and/or analytical theory), with an attempt to reveal the general cause-effect relationship between specific conditions and emergent properties of real economic/financial markets (a kind of complex adaptive systems). Here I review the latest progress in the field, namely, stylized facts, herd behavior, contrarian behavior, spontaneous cooperation, partial information, and risk management. Also, I highlight the connections between such progress and other topics of traditional statistical physics. The main theme of the review is to show diverse emergent properties of the laboratory markets, originating from self-organization due to the nonlinear interactions among heterogeneous humans or agents (complexity).
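A standard agent-based building block from this literature is the minority game. The sketch below follows the textbook formulation (random strategy tables, best-scoring strategy play, minority side wins) rather than the controlled laboratory experiments reviewed here; all parameters are illustrative.

```python
import numpy as np

def minority_game(n_agents=301, memory=3, n_strategies=2, n_steps=2000, seed=6):
    """Minimal minority game: each agent holds fixed random strategies (lookup tables
    from the last `memory` outcomes to +1/-1), plays its currently best-scoring one,
    and the minority side wins.  The per-agent volatility of the aggregate action A(t)
    is a classic emergent, self-organized quantity."""
    rng = np.random.default_rng(seed)
    n_hist = 2 ** memory
    strategies = rng.choice([-1, 1], size=(n_agents, n_strategies, n_hist))
    scores = np.zeros((n_agents, n_strategies))
    history = rng.integers(0, n_hist)            # encoded bit-string of past winning sides
    A = np.empty(n_steps)
    for t in range(n_steps):
        best = scores.argmax(axis=1)
        actions = strategies[np.arange(n_agents), best, history]
        A[t] = actions.sum()
        winner = -np.sign(A[t]) or 1             # minority action (+1 if tied)
        scores += strategies[:, :, history] * winner   # reward strategies that picked it
        history = ((history << 1) | (winner > 0)) % n_hist
    return A.var() / n_agents                    # volatility per agent, sigma^2 / N

print(minority_game())
```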
An imprecise probability approach for squeal instability analysis based on evidence theory
NASA Astrophysics Data System (ADS)
Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie
2017-01-01
An imprecise probability approach based on evidence theory is proposed for squeal instability analysis of uncertain disc brakes in this paper. First, the squeal instability of the finite element (FE) model of a disc brake is investigated and its dominant unstable eigenvalue is detected by running two typical numerical simulations, i.e., complex eigenvalue analysis (CEA) and transient dynamical analysis. Next, the uncertainty mainly caused by contact and friction is taken into account and some key parameters of the brake are described as uncertain parameters. All these uncertain parameters usually involve imprecise data such as incomplete information and conflicting information. Finally, a squeal instability analysis model considering imprecise uncertainty is established by integrating evidence theory, Taylor expansion, subinterval analysis and a surrogate model. In the proposed analysis model, the uncertain parameters with imprecise data are treated as evidence variables, and the belief measure and plausibility measure are employed to evaluate system squeal instability. The effectiveness of the proposed approach is demonstrated by numerical examples, and some interesting observations and conclusions are summarized from the analyses and discussions. The proposed approach is generally limited to squeal problems without too many investigated parameters. It can be considered as a potential method for squeal instability analysis, which will act as the first step to reduce squeal noise of uncertain brakes with imprecise information.
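The belief and plausibility measures used to bracket system behavior can be computed directly from a Dempster-Shafer mass function, as in the short sketch below. The 'stable/unstable' frame and the masses are made-up illustrations, not values from the analysis model.

```python
def belief_plausibility(mass, proposition):
    """Belief and plausibility of a proposition under a Dempster-Shafer mass function.

    mass: dict mapping frozenset focal elements -> mass.  Bel(A) sums the masses of
    focal elements contained in A; Pl(A) sums those intersecting A, so [Bel, Pl]
    brackets the imprecise probability of A (here: 'the brake is unstable')."""
    prop = frozenset(proposition)
    bel = sum(w for s, w in mass.items() if s <= prop)
    pl = sum(w for s, w in mass.items() if s & prop)
    return bel, pl

# hypothetical evidence about squeal instability, including an ignorance mass
m = {frozenset({"unstable"}): 0.5,
     frozenset({"stable"}): 0.2,
     frozenset({"stable", "unstable"}): 0.3}
print(belief_plausibility(m, {"unstable"}))    # (0.5, 0.8)
```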
Theory-Based Evaluation Meets Ambiguity: The Role of Janus Variables
ERIC Educational Resources Information Center
Dahler-Larsen, Peter
2018-01-01
As theory-based evaluation (TBE) engages in situations where multiple stakeholders help develop complex program theory about dynamic phenomena in politically contested settings, it becomes difficult to develop and use program theory without ambiguity. The purpose of this article is to explore ambiguity as a fruitful perspective that helps TBE face…
The Complex Economic System of Supply Chain Financing
NASA Astrophysics Data System (ADS)
Zhang, Lili; Yan, Guangle
Supply Chain Financing (SCF) refers to a series of innovative and complicated financial services based on the supply chain. The SCF set-up is a complex system, in which supply chain management and the financing services for Small and Medium Enterprises (SMEs) interpenetrate systematically. This paper establishes the organization structure of the SCF system and presents two financing models, with and without the participation of a third-party logistics provider (3PL). Using Information Economics and Game Theory, the interrelationship among diverse economic sectors is analyzed, and the economic mechanism of development and existence of the SCF system is demonstrated. New thoughts and approaches to solve the SMEs' financing problem are given.
A Combined Theoretical and Experimental Study for Silver Electroplating
Liu, Anmin; Ren, Xuefeng; An, Maozhong; Zhang, Jinqiu; Yang, Peixia; Wang, Bo; Zhu, Yongming; Wang, Chong
2014-01-01
A novel method combining theoretical and experimental study for environmentally friendly silver electroplating is introduced. Quantum chemical calculations and molecular dynamics (MD) simulations were employed for predicting the behaviour and function of the complexing agents. Electronic properties, orbital information, and single point energies of 5,5-dimethylhydantoin (DMH), nicotinic acid (NA), as well as their silver(I) complexes were provided by quantum chemical calculations based on density functional theory (DFT). Adsorption behaviors of the agents on copper and silver surfaces were investigated using MD simulations. Based on the data of the quantum chemical calculations and MD simulations, we believed that DMH and NA could be promising complexing agents for silver electroplating. The experimental results, including electrochemical measurements and silver electroplating, further confirmed the above prediction. This efficient and versatile method thus opens a new window to study or design complexing agents for generalized metal electroplating and will substantially advance research in this area. PMID:24452389
Enabling Controlling Complex Networks with Local Topological Information.
Li, Guoqi; Deng, Lei; Xiao, Gaoxi; Tang, Pei; Wen, Changyun; Hu, Wuhua; Pei, Jing; Shi, Luping; Stanley, H Eugene
2018-03-15
Complex networks characterize the nature of internal/external interactions in real-world systems including social, economic, biological, ecological, and technological networks. Two issues remain obstacles to achieving full control of large-scale networks: structural controllability, which describes the ability to guide a dynamical system from any initial state to any desired final state in finite time, with a suitable choice of inputs; and optimal control, which is a typical control approach to minimize the cost for driving the network to a predefined state with a given number of control inputs. For large complex networks without global information of network topology, both problems remain essentially open. Here we combine graph theory and control theory for tackling the two problems in one go, using only local network topology information. For the structural controllability problem, a distributed local-game matching method is proposed, where every node plays a simple Bayesian game with local information and local interactions with adjacent nodes, ensuring a suboptimal solution at a linear complexity. Starting from any structural controllability solution, a minimizing-longest-control-path method can efficiently reach a good solution for optimal control in large networks. Our results provide solutions for distributed complex network control and demonstrate a way to link structural controllability and optimal control together.
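For context, the classical (centralized) solution to the structural controllability part is a maximum matching on the bipartite out/in representation of the directed network, which the local-game method described above approximates distributively. A sketch of the centralized matching approach is given below; the toy network, node naming scheme, and function name are assumptions for illustration.

```python
import networkx as nx
from networkx.algorithms import bipartite

def driver_nodes(edges, nodes):
    """Minimum driver-node set for structural controllability via maximum matching:
    nodes whose 'in' copy is unmatched in a maximum matching of the bipartite
    out/in representation must receive independent control inputs."""
    B = nx.Graph()
    B.add_nodes_from((f"out_{u}" for u in nodes), bipartite=0)
    B.add_nodes_from((f"in_{v}" for v in nodes), bipartite=1)
    B.add_edges_from((f"out_{u}", f"in_{v}") for u, v in edges)
    top = {f"out_{u}" for u in nodes}
    matching = bipartite.maximum_matching(B, top_nodes=top)
    matched_in = {m for m in matching if str(m).startswith("in_")}
    drivers = [v for v in nodes if f"in_{v}" not in matched_in]
    return drivers or [nodes[0]]   # a perfectly matched network still needs one input

edges = [(1, 2), (2, 3), (3, 4), (2, 5)]   # small directed chain with a branch
print(driver_nodes(edges, nodes=[1, 2, 3, 4, 5]))
```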
2014-05-01
Information, Understanding, and Influence: An Agency Theory Strategy for Air Base Communications and Cyberspace Support. ...present communications and cyberspace support organizations. Next, it introduces a strategy based on this analysis to bring information, understanding
SDIA: A dynamic situation driven information fusion algorithm for cloud environment
NASA Astrophysics Data System (ADS)
Guo, Shuhang; Wang, Tong; Wang, Jian
2017-09-01
Information fusion is an important issue in the information integration domain. In order to form an extensive information fusion technology for complex and diverse situations, a new information fusion algorithm is proposed. Firstly, a fuzzy evaluation model of tag utility is proposed that can be used to compute the tag entropy. Secondly, a ubiquitous situation tag tree model is proposed to define the multidimensional structure of an information situation. Thirdly, the similarity matching between situation models is classified into three types: tree inclusion, tree embedding, and tree compatibility. Next, in order to reduce the time complexity of the tree-compatibility matching algorithm, a fast and ordered tree matching algorithm based on node entropy is proposed, which is used to support information fusion by ubiquitous situation. Since the algorithm derives from graph-theoretic unordered tree matching, it can improve the recall rate and precision rate of information fusion in the situation. The information fusion algorithm is compared with the star and the random tree matching algorithms, and the difference between the three algorithms is analyzed from the viewpoint of isomorphism, which demonstrates the novelty and applicability of the algorithm.
ERIC Educational Resources Information Center
Speirs, Samantha J.; Rinehart, Nicole J.; Robinson, Stephen R.; Tonge, Bruce J.; Yelland, Gregory W.
2014-01-01
Autism spectrum disorders (ASD) are characterised by a unique pattern of preserved abilities and deficits within and across cognitive domains. The Complex Information Processing Theory proposes this pattern reflects an altered capacity to respond to cognitive demands. This study compared how complexity induced by time constraints on processing…
ERIC Educational Resources Information Center
Pigott, Julian
2012-01-01
In this paper I give an overview of recent developments in the L2 motivation field, in particular the movement away from quantitative, questionnaire-based methodologies toward smaller-scale qualitative studies incorporating concepts from complexity theory. While complexity theory provides useful concepts for exploring motivation in new ways, it…
Galas, David J; Sakhanenko, Nikita A; Skupin, Alexander; Ignac, Tomasz
2014-02-01
Context dependence is central to the description of complexity. Keying on the pairwise definition of "set complexity," we use an information theory approach to formulate general measures of systems complexity. We examine the properties of multivariable dependency starting with the concept of interaction information. We then present a new measure for unbiased detection of multivariable dependency, "differential interaction information." This quantity for two variables reduces to the pairwise "set complexity" previously proposed as a context-dependent measure of information in biological systems. We generalize it here to an arbitrary number of variables. Critical limiting properties of the "differential interaction information" are key to the generalization. This measure extends previous ideas about biological information and provides a more sophisticated basis for the study of complexity. The properties of "differential interaction information" also suggest new approaches to data analysis. Given a data set of system measurements, differential interaction information can provide a measure of collective dependence, which can be represented in hypergraphs describing complex system interaction patterns. We investigate this kind of analysis using simulated data sets. The conjoining of a generalized set complexity measure, multivariable dependency analysis, and hypergraphs is our central result. While our focus is on complex biological systems, our results are applicable to any complex system.
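For readers unfamiliar with interaction information, the following sketch computes it for three discrete variables from plug-in entropy estimates; the variable names and the XOR example are illustrative only, and the paper's differential interaction information generalizes beyond this pairwise/triplet case.

    import numpy as np
    from collections import Counter

    def entropy(*columns):
        # Joint Shannon entropy (in bits) of one or more discrete columns.
        joint = list(zip(*columns))
        counts = np.array(list(Counter(joint).values()), dtype=float)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    def mutual_information(x, y):
        return entropy(x) + entropy(y) - entropy(x, y)

    def interaction_information(x, y, z):
        # I(X;Y;Z) = I(X;Y) - I(X;Y|Z), written here via joint entropies.
        return (entropy(x) + entropy(y) + entropy(z)
                - entropy(x, y) - entropy(x, z) - entropy(y, z)
                + entropy(x, y, z))

    rng = np.random.default_rng(0)
    x = rng.integers(0, 2, 1000)
    y = rng.integers(0, 2, 1000)
    z = x ^ y   # XOR: gives roughly -1 bit (negative = synergy under this sign convention)
    print(mutual_information(x, y), interaction_information(x, y, z))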
Cognitive performance modeling based on general systems performance theory.
Kondraske, George V
2010-01-01
General Systems Performance Theory (GSPT) was initially motivated by problems associated with quantifying different aspects of human performance. It has proved to be invaluable for measurement development and understanding quantitative relationships between human subsystem capacities and performance in complex tasks. It is now desired to bring focus to the application of GSPT to modeling of cognitive system performance. Previous studies involving two complex tasks (i.e., driving and performing laparoscopic surgery) and incorporating measures that are clearly related to cognitive performance (information processing speed and short-term memory capacity) were revisited. A GSPT-derived method of task analysis and performance prediction termed Nonlinear Causal Resource Analysis (NCRA) was employed to determine the demand on basic cognitive performance resources required to support different levels of complex task performance. This approach is presented as a means to determine a cognitive workload profile and the subsequent computation of a single number measure of cognitive workload (CW). Computation of CW may be a viable alternative to measuring it. Various possible "more basic" performance resources that contribute to cognitive system performance are discussed. It is concluded from this preliminary exploration that a GSPT-based approach can contribute to defining cognitive performance models that are useful for both individual subjects and specific groups (e.g., military pilots).
NASA Astrophysics Data System (ADS)
Suo, M. Q.; Li, Y. P.; Huang, G. H.
2011-09-01
In this study, an inventory-theory-based interval-parameter two-stage stochastic programming (IB-ITSP) model is proposed by integrating inventory theory into an interval-parameter two-stage stochastic optimization framework. This method can not only address system uncertainties with complex presentation but also reflect the transferring batch (the quantity transferred at once) and period (the corresponding cycle time) in decision-making problems. A case of water allocation in water resources management planning is studied to demonstrate the applicability of this method. Under different flow levels, different transferring measures are generated by this method when the promised water cannot be met. Moreover, interval solutions associated with different transferring costs have also been provided. They can be used for generating decision alternatives and thus help water resources managers identify desired policies. Compared with the ITSP method, the IB-ITSP model can provide a positive measure for solving water shortage problems and afford useful information for decision makers under uncertainty.
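As a toy illustration of the two-stage idea (not the IB-ITSP formulation itself), the sketch below solves a deterministic-equivalent water-allocation LP at both ends of an interval-valued shortage penalty; all numbers, bounds and the scipy-based solver choice are assumptions for illustration.

    import numpy as np
    from scipy.optimize import linprog

    benefit = 5.0                        # net benefit per unit of promised water delivered
    penalty_interval = (6.0, 30.0)       # interval-valued shortage penalty [lower, upper]
    flows = np.array([10.0, 20.0, 30.0]) # low / medium / high seasonal flow scenarios
    probs = np.array([0.2, 0.6, 0.2])

    def solve(penalty):
        # Decision vector: [x, y_low, y_med, y_high]; maximize benefit*x - E[penalty*y].
        c = np.concatenate(([-benefit], probs * penalty))
        # Shortage constraints: x - y_s <= flow_s for every scenario s.
        A_ub = np.hstack((np.ones((3, 1)), -np.eye(3)))
        res = linprog(c, A_ub=A_ub, b_ub=flows,
                      bounds=[(0, 35)] + [(0, None)] * 3, method="highs")
        return res.x[0], -res.fun

    for p in penalty_interval:
        target, expected_net_benefit = solve(p)
        print(f"penalty={p}: promise {target:.1f} units, "
              f"expected net benefit {expected_net_benefit:.1f}")

A lower shortage penalty makes it worthwhile to promise more water and accept occasional shortfalls, which is the kind of penalty-dependent transferring decision the interval solutions describe.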
Andrews, Kristin
2017-01-01
I suggest that the Stereotype Rationality Hypothesis (Jussim 2012) is only partially right. I agree it is rational to rely on stereotypes, but in the complexity of real world social interactions, most of our individuating information invokes additional stereotypes. Despite assumptions to the contrary, there is reason to think theory of mind is not accurate, and social psychology's denial of stereotype accuracy led us toward mindreading/theory of mind - a less accurate account of how we understand other people.
Ford, John A; Jones, Andrew P; Wong, Geoff; Clark, Allan B; Porter, Tom; Shakespeare, Tom; Swart, Ann Marie; Steel, Nicholas
2015-01-01
Introduction The UK has an ageing population, especially in rural areas, where deprivation is high among older people. Previous research has identified this group as at high risk of poor access to healthcare. The aim of this study is to generate a theory of how socioeconomically disadvantaged older people from rural areas access primary care, to develop an intervention based on this theory and test it in a feasibility trial. Methods and analysis On the basis of the MRC Framework for Developing and Evaluating Complex Interventions, three methods will be used to generate the theory. First, a realist review will elucidate the patient pathway based on existing literature. Second, an analysis of the English Longitudinal Study of Ageing will be completed using structural equation modelling. Third, 15 semistructured interviews will be undertaken with patients and four focus groups with health professionals. A triangulation protocol will be used to allow each of these methods to inform and be informed by each other, and to integrate data into one overall realist theory. Based on this theory, an intervention will be developed in discussion with stakeholders to ensure that the intervention is feasible and practical. The intervention will be tested within a feasibility trial, the design of which will depend on the intervention. Lessons from the feasibility trial will be used to refine the intervention and gather the information needed for a definitive trial. Ethics and dissemination Ethics approval from the regional ethics committee has been granted for the focus groups with health professionals and interviews with patients. Ethics approval will be sought for the feasibility trial after the intervention has been designed. Findings will be disseminated to the key stakeholders involved in intervention development, to researchers, clinicians and health planners through peer-reviewed journal articles and conference publications, and locally through a dissemination event. PMID:26384728
Towards a general theory of neural computation based on prediction by single neurons.
Fiorillo, Christopher D
2008-10-01
Although there has been tremendous progress in understanding the mechanics of the nervous system, there has not been a general theory of its computational function. Here I present a theory that relates the established biophysical properties of single generic neurons to principles of Bayesian probability theory, reinforcement learning and efficient coding. I suggest that this theory addresses the general computational problem facing the nervous system. Each neuron is proposed to mirror the function of the whole system in learning to predict aspects of the world related to future reward. According to the model, a typical neuron receives current information about the state of the world from a subset of its excitatory synaptic inputs, and prior information from its other inputs. Prior information would be contributed by synaptic inputs representing distinct regions of space, and by different types of non-synaptic, voltage-regulated channels representing distinct periods of the past. The neuron's membrane voltage is proposed to signal the difference between current and prior information ("prediction error" or "surprise"). A neuron would apply a Hebbian plasticity rule to select those excitatory inputs that are the most closely correlated with reward but are the least predictable, since unpredictable inputs provide the neuron with the most "new" information about future reward. To minimize the error in its predictions and to respond only when excitation is "new and surprising," the neuron selects amongst its prior information sources through an anti-Hebbian rule. The unique inputs of a mature neuron would therefore result from learning about spatial and temporal patterns in its local environment, and by extension, the external world. Thus the theory describes how the structure of the mature nervous system could reflect the structure of the external world, and how the complexity and intelligence of the system might develop from a population of undifferentiated neurons, each implementing similar learning algorithms.
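Purely as a toy, and not the author's actual biophysical model, the sketch below has a single unit signal the difference between its current drive and a prediction built from "prior" inputs, with the prior weights tuned so that only unpredicted (surprising) excitation survives; the sizes, learning rate and delta-rule form are my own assumptions.

    import numpy as np

    rng = np.random.default_rng(1)
    T = 5000
    prior_inputs = rng.normal(size=(T, 5))            # e.g. recent history / context signals
    current_input = (prior_inputs @ np.array([0.5, -0.2, 0.1, 0.3, 0.0])
                     + 0.1 * rng.normal(size=T))      # partly predictable excitatory drive

    w = np.zeros(5)        # weights on prior information sources
    lr = 0.01
    errors = np.empty(T)
    for t in range(T):
        prediction = w @ prior_inputs[t]
        error = current_input[t] - prediction         # "prediction error" / surprise
        w += lr * error * prior_inputs[t]             # tune prior sources to cancel predictable drive
        errors[t] = error

    print("mean |error|, first vs last 500 steps:",
          np.abs(errors[:500]).mean(), np.abs(errors[-500:]).mean())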
Plenoptic layer-based modeling for image based rendering.
Pearson, James; Brookes, Mike; Dragotti, Pier Luigi
2013-09-01
Image based rendering is an attractive alternative to model based rendering for generating novel views because of its lower complexity and potential for photo-realistic results. To reduce the number of images necessary for alias-free rendering, some geometric information for the 3D scene is normally necessary. In this paper, we present a fast automatic layer-based method for synthesizing an arbitrary new view of a scene from a set of existing views. Our algorithm takes advantage of the knowledge of the typical structure of multiview data to perform occlusion-aware layer extraction. In addition, the number of depth layers used to approximate the geometry of the scene is chosen based on plenoptic sampling theory with the layers placed non-uniformly to account for the scene distribution. The rendering is achieved using a probabilistic interpolation approach and by extracting the depth layer information on a small number of key images. Numerical results demonstrate that the algorithm is fast and yet is only 0.25 dB away from the ideal performance achieved with the ground-truth knowledge of the 3D geometry of the scene of interest. This indicates that there are measurable benefits from following the predictions of plenoptic theory and that they remain true when translated into a practical system for real world data.
Using complex networks towards information retrieval and diagnostics in multidimensional imaging
NASA Astrophysics Data System (ADS)
Banerjee, Soumya Jyoti; Azharuddin, Mohammad; Sen, Debanjan; Savale, Smruti; Datta, Himadri; Dasgupta, Anjan Kr; Roy, Soumen
2015-12-01
We present a fresh and broad yet simple approach towards information retrieval in general and diagnostics in particular by applying the theory of complex networks on multidimensional, dynamic images. We demonstrate a successful use of our method with the time series generated from high content thermal imaging videos of patients suffering from the aqueous deficient dry eye (ADDE) disease. Remarkably, network analyses of thermal imaging time series of contact lens users and patients upon whom Laser-Assisted in situ Keratomileusis (Lasik) surgery has been conducted, exhibit pronounced similarity with results obtained from ADDE patients. We also propose a general framework for the transformation of multidimensional images to networks for futuristic biometry. Our approach is general and scalable to other fluctuation-based devices where network parameters derived from fluctuations, act as effective discriminators and diagnostic markers.
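The abstract does not spell out the exact time-series-to-network mapping; a common choice for fluctuation data is the natural visibility graph, sketched below with a synthetic signal and networkx (both the construction and the signal are assumptions for illustration).

    import numpy as np
    import networkx as nx

    def natural_visibility_graph(series):
        g = nx.Graph()
        n = len(series)
        g.add_nodes_from(range(n))
        for i in range(n):
            for j in range(i + 1, n):
                # Nodes i and j are linked if every sample between them lies
                # below the straight line joining (i, y_i) and (j, y_j).
                visible = all(
                    series[k] < series[j] + (series[i] - series[j]) * (j - k) / (j - i)
                    for k in range(i + 1, j)
                )
                if visible:
                    g.add_edge(i, j)
        return g

    signal = (np.sin(np.linspace(0, 8 * np.pi, 200))
              + 0.3 * np.random.default_rng(0).normal(size=200))
    g = natural_visibility_graph(signal)
    print(g.number_of_edges(), nx.average_clustering(g))

Network parameters such as the clustering coefficient computed on graphs like this one are the kind of fluctuation-derived discriminators the abstract refers to.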
NASA Astrophysics Data System (ADS)
Murphy, Glen; Salomone, Sonia
2013-03-01
While highly cohesive groups are potentially advantageous they are also often correlated with the emergence of knowledge and information silos based around those same functional or occupational clusters. Consequently, an essential challenge for engineering organisations wishing to overcome informational silos is to implement mechanisms that facilitate, encourage and sustain interactions between otherwise disconnected groups. This paper acts as a primer for those seeking to gain an understanding of the design, functionality and utility of a suite of software tools generically termed social media technologies in the context of optimising the management of tacit engineering knowledge. Underpinned by knowledge management theory and using detailed case examples, this paper explores how social media technologies achieve such goals, allowing for the transfer of knowledge by tapping into the tacit and explicit knowledge of disparate groups in complex engineering environments.
Discovery of Empirical Components by Information Theory
2016-08-10
AFRL-AFOSR-VA-TR-2016-0289. Discovery of Empirical Components by Information Theory. Amit Singer, Trustees of Princeton University. Dates covered: 15 Feb 2013 to 14 Feb 2016. The methods developed draw not only from traditional linear-algebra-based numerical analysis and approximation theory, but also from information theory and graph theory.
Fuzzy logic of Aristotelian forms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perlovsky, L.I.
1996-12-31
Model-based approaches to pattern recognition and machine vision have been proposed to overcome the exorbitant training requirements of earlier computational paradigms. However, uncertainties in data were found to lead to a combinatorial explosion of the computational complexity. This issue is related here to the roles of a priori knowledge vs. adaptive learning. What is the a priori knowledge representation that supports learning? I introduce Modeling Field Theory (MFT), a model-based neural network whose adaptive learning is based on a priori models. These models combine deterministic, fuzzy, and statistical aspects to account for a priori knowledge, its fuzzy nature, and data uncertainties. In the process of learning, a priori fuzzy concepts converge to crisp or probabilistic concepts. The MFT is a convergent dynamical system of only linear computational complexity. Fuzzy logic turns out to be essential for reducing the combinatorial complexity to a linear one. I will discuss the relationship of the new computational paradigm to two theories due to Aristotle: the theory of Forms and logic. While the theory of Forms argued that the mind cannot be based on ready-made a priori concepts, Aristotelian logic operated with just such concepts. I discuss an interpretation of MFT suggesting that its fuzzy logic, combining a-priority and adaptivity, implements the Aristotelian theory of Forms (theory of mind). Thus, 2300 years after Aristotle, a logic is developed that is suitable for his theory of mind.
Physical Complexity and Cognitive Evolution
NASA Astrophysics Data System (ADS)
Jedlicka, Peter
Our intuition tells us that there is a general trend in the evolution of nature, a trend towards greater complexity. However, there are several definitions of complexity and hence it is difficult to argue for or against the validity of this intuition. Christoph Adami has recently introduced a novel measure called physical complexity that assigns low complexity to both ordered and random systems and high complexity to those in between. Physical complexity measures the amount of information that an organism stores in its genome about the environment in which it evolves. The theory of physical complexity predicts that evolution increases the amount of `knowledge' an organism accumulates about its niche. It might be fruitful to generalize Adami's concept of complexity to evolution as a whole (including the evolution of man). Physical complexity fits nicely into the philosophical framework of cognitive biology, which considers biological evolution as a progressing process of accumulation of knowledge (a gradual increase of epistemic complexity). According to this paradigm, evolution is a cognitive `ratchet' that pushes organisms unidirectionally towards higher complexity. A dynamic environment continually creates problems to be solved. To survive in the environment means to solve the problem, and the solution is an embodied knowledge. Cognitive biology (as well as the theory of physical complexity) uses the concepts of information and entropy and views evolution from both the information-theoretical and thermodynamical perspectives. Concerning humans as conscious beings, it seems necessary to postulate the emergence of a new kind of knowledge - a self-aware and self-referential knowledge. The appearance of self-reflection in evolution indicates that the human brain has reached a new qualitative level of epistemic complexity.
Trierweiller, Andréa Cristina; Peixe, Blênio César Severo; Tezza, Rafael; Pereira, Vera Lúcia Duarte do Valle; Pacheco, Waldemar; Bornia, Antonio Cezar; de Andrade, Dalton Francisco
2012-01-01
The aim of this paper is to measure the effectiveness of Information and Communication Technology (ICT) organizations from the point of view of the manager, using Item Response Theory (IRT). There is a need to verify the effectiveness of these organizations, which are normally associated with complex, dynamic, and competitive environments. In the academic literature, there is disagreement surrounding the concept of organizational effectiveness and its measurement. A construct was elaborated based on dimensions of effectiveness and used to build the items of a questionnaire, which was submitted to specialists for evaluation. The approach proved viable for measuring the organizational effectiveness of ICT companies from a manager's point of view using the Two-Parameter Logistic Model (2PLM) of IRT. This modeling makes it possible to evaluate the quality and properties of each item and to place items and respondents on a single scale, which is not possible with other similar tools.
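A minimal sketch of the 2PLM item response function referred to above; the discrimination and difficulty values, and the reading of the latent trait as "ICT effectiveness", are illustrative assumptions.

    import numpy as np

    def p_endorse(theta, a, b):
        # Probability that a respondent with latent trait `theta` endorses an
        # item with discrimination `a` and difficulty `b` (2PL model).
        return 1.0 / (1.0 + np.exp(-a * (theta - b)))

    thetas = np.linspace(-3, 3, 7)   # latent "ICT effectiveness" scale
    print(p_endorse(thetas, a=1.2, b=0.5))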
Marchal, Bruno; Van Belle, Sara; De Brouwere, Vincent; Witter, Sophie
2013-11-08
The importance of complexity in health care policy-making and interventions, as well as in research and evaluation, is now widely acknowledged, but conceptual confusion reigns and few applications of complexity concepts in research design have been published. Taking user fee exemption policies as an entry point, we explore the methodological consequences of 'complexity' for health policy research and evaluation. We first discuss the difference between simple, complicated and complex and introduce key concepts of complex adaptive systems theory. We then apply these to fee exemption policies. We describe how the FEMHealth research project attempts to address the challenges of complexity in its evaluation of fee exemption policies for maternal care. We present how the development of a programme theory for fee exemption policies was used to structure the overall design. This allowed for structured discussions on the hypotheses held by the researchers and helped to structure, integrate and monitor the sub-studies. We then show how the choice of data collection methods and tools for each sub-study was informed by the overall design. Applying key concepts from complexity theory proved useful in broadening our view on fee exemption policies and in developing the overall research design. However, we encountered a number of challenges, including maintaining adaptiveness of the design during the evaluation, and ensuring cohesion in the disciplinary diversity of the research teams. Whether the programme theory can fulfil its claimed potential to help in making sense of the findings is yet to be tested. Experience from other studies allows for some moderate optimism. However, the biggest challenge complexity throws at health system researchers may be to deal with the unknown unknowns and the consequence that complex issues can only be understood in retrospect. From a complexity theory point of view, only plausible explanations can be developed, not predictive theories. Yet here, theory-driven approaches may help.
Graph Theory-Based Pinning Synchronization of Stochastic Complex Dynamical Networks.
Li, Xiao-Jian; Yang, Guang-Hong
2017-02-01
This paper is concerned with the adaptive pinning synchronization problem of stochastic complex dynamical networks (CDNs). Based on algebraic graph theory and Lyapunov theory, pinning controller design conditions are derived, and the rigorous convergence analysis of synchronization errors in the probability sense is also conducted. Compared with the existing results, the topology structures of stochastic CDN are allowed to be unknown due to the use of graph theory. In particular, it is shown that the selection of nodes for pinning depends on the unknown lower bounds of coupling strengths. Finally, an example on a Chua's circuit network is given to validate the effectiveness of the theoretical results.
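The following is a deliberately simplified, deterministic illustration of the pinning idea (the paper's setting is stochastic, adaptive and topology-unknown): a few nodes receive feedback toward a reference state and diffusive coupling pulls the remaining nodes along; the graph size, pinning gains and Euler integration are assumptions.

    import numpy as np

    rng = np.random.default_rng(2)
    n, s, c, dt = 10, 1.0, 1.0, 0.01
    A = (rng.random((n, n)) < 0.3).astype(float)
    A = np.triu(A, 1)
    A = A + A.T                                   # random undirected coupling graph
    L = np.diag(A.sum(axis=1)) - A                # graph Laplacian
    d = np.zeros(n)
    d[:3] = 5.0                                   # pin (feedback-control) the first 3 nodes

    x = rng.normal(size=n)
    for _ in range(2000):
        x = x + dt * (-c * L @ x - d * (x - s))   # Euler step of the pinned network
    print(np.max(np.abs(x - s)))                  # small if the pinned set suffices for this graph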
Computation in generalised probabilistic theories
NASA Astrophysics Data System (ADS)
Lee, Ciarán M.; Barrett, Jonathan
2015-08-01
From the general difficulty of simulating quantum systems using classical systems, and in particular the existence of an efficient quantum algorithm for factoring, it is likely that quantum computation is intrinsically more powerful than classical computation. At present, the best upper bound known for the power of quantum computation is that BQP ⊆ AWPP, where AWPP is a classical complexity class (known to be included in PP, hence PSPACE). This work investigates limits on computational power that are imposed by simple physical, or information theoretic, principles. To this end, we define a circuit-based model of computation in a class of operationally-defined theories more general than quantum theory, and ask: what is the minimal set of physical assumptions under which the above inclusions still hold? We show that given only an assumption of tomographic locality (roughly, that multipartite states and transformations can be characterized by local measurements), efficient computations are contained in AWPP. This inclusion still holds even without assuming a basic notion of causality (where the notion is, roughly, that probabilities for outcomes cannot depend on future measurement choices). Following Aaronson, we extend the computational model by allowing post-selection on measurement outcomes. Aaronson showed that the corresponding quantum complexity class, PostBQP, is equal to PP. Given only the assumption of tomographic locality, the inclusion in PP still holds for post-selected computation in general theories. Hence in a world with post-selection, quantum theory is optimal for computation in the space of all operational theories. We then consider whether one can obtain relativized complexity results for general theories. It is not obvious how to define a sensible notion of a computational oracle in the general framework that reduces to the standard notion in the quantum case. Nevertheless, it is possible to define computation relative to a ‘classical oracle’. Then, we show there exists a classical oracle relative to which efficient computation in any theory satisfying the causality assumption does not include NP.
2011-04-08
The work draws on economics, information theory and computer science, psychology, sociology, evolutionary biology, physics (quantum mechanics) and cosmology; it considers knowledge and the definition of "self" (as "self" is part of the environment), the shared experience and perspective of others, and concepts including information, entropy, quantum behavior, and cosmological progress.
Unpacking the Complexity of Patient Handoffs Through the Lens of Cognitive Load Theory.
Young, John Q; Ten Cate, Olle; O'Sullivan, Patricia S; Irby, David M
2016-01-01
The transfer of a patient from one clinician to another is a high-risk event. Errors are common and lead to patient harm. More effective methods for learning how to give and receive sign-out are an important public health priority. Performing a handoff is a complex task. Trainees must simultaneously apply and integrate clinical, communication, and systems skills into one time-limited and highly constrained activity. The task demands can easily exceed the information-processing capacity of the trainee, resulting in impaired learning and performance. Appreciating the limits of working memory can help identify the challenges that instructional techniques and research must then address. Cognitive load theory (CLT) identifies three types of load that impact working memory: intrinsic (task-essential), extraneous (not essential to task), and germane (learning related). The authors generated a list of factors that affect a trainee's learning and performance of a handoff based on CLT. The list was revised based on feedback from experts in medical education and in handoffs. By consensus, the authors associated each factor with the type of cognitive load it primarily affects. The authors used this analysis to build a conceptual model of handoffs through the lens of CLT. The resulting conceptual model unpacks the complexity of handoffs and identifies testable hypotheses for educational research and instructional design. The model identifies features of a handoff that drive extraneous, intrinsic, and germane load for both the sender and the receiver. The model highlights the importance of reducing extraneous load, matching intrinsic load to the developmental stage of the learner and optimizing germane load. Specific CLT-informed instructional techniques for handoffs are explored. Intrinsic and germane load are especially important to address and include factors such as knowledge of the learner, number of patients, time constraints, clinical uncertainties, overall patient/panel complexity, interacting comorbidities or therapeutics, experience or specialty gradients between the sender and receiver, the maturity of the evidence base for the patient's disease, and the use of metacognitive techniques. Research that identifies which cognitive load factors most significantly affect the learning and performance of handoffs can lead to novel, contextually adapted instructional techniques and handoff protocols. The application of CLT to handoffs may also help with the further development of CLT as a learning theory.
NASA Technical Reports Server (NTRS)
Alexandrov, Natalia (Technical Monitor); Kuby, Michael; Tierney, Sean; Roberts, Tyler; Upchurch, Christopher
2005-01-01
This report reviews six classes of models that are used for studying transportation network topologies. The report is motivated by two main questions. First, what can the "new science" of complex networks (scale-free, small-world networks) contribute to our understanding of transport network structure, compared to more traditional methods? Second, how can geographic information systems (GIS) contribute to studying transport networks? The report defines terms that can be used to classify different kinds of models by their function, composition, mechanism, spatial and temporal dimensions, certainty, linearity, and resolution. Six broad classes of models for analyzing transport network topologies are then explored: GIS; static graph theory; complex networks; mathematical programming; simulation; and agent-based modeling. Each class of models is defined and classified according to the attributes introduced earlier. The paper identifies some typical types of research questions about network structure that have been addressed by each class of model in the literature.
Petri net based model of the body iron homeostasis.
Formanowicz, Dorota; Sackmann, Andrea; Formanowicz, Piotr; Błazewicz, Jacek
2007-10-01
Body iron homeostasis is a complex process that is not yet fully understood. Despite the fact that some components of this process have been described in the literature, a complete model of the whole process has not been proposed. In this paper a Petri net based model of the body iron homeostasis is presented. Recently, Petri nets have been used for describing and analyzing various biological processes since they allow modeling the system under consideration very precisely. The main result presented in the paper is twofold: an informal description of the main part of the whole iron homeostasis process is given, and it is then formulated in the formal language of Petri net theory. This model allows for a possible simulation of the process, since Petri net theory provides a lot of established analysis techniques.
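To make the place/transition formalism concrete, here is a minimal token-game simulator in the spirit of such models; the places, transitions and initial marking are invented for illustration and are not the iron-homeostasis net from the paper.

    import random

    places = ["Fe_plasma", "Fe_stores", "Fe_loss"]
    transitions = {
        # name: (tokens consumed per place, tokens produced per place)
        "uptake":  ({}, {"Fe_plasma": 1}),
        "storage": ({"Fe_plasma": 1}, {"Fe_stores": 1}),
        "release": ({"Fe_stores": 1}, {"Fe_plasma": 1}),
        "loss":    ({"Fe_plasma": 1}, {"Fe_loss": 1}),
    }
    marking = {"Fe_plasma": 2, "Fe_stores": 0, "Fe_loss": 0}

    def enabled(t):
        pre, _ = transitions[t]
        return all(marking[p] >= k for p, k in pre.items())

    def fire(t):
        pre, post = transitions[t]
        for p, k in pre.items():
            marking[p] -= k
        for p, k in post.items():
            marking[p] = marking.get(p, 0) + k

    random.seed(0)
    for _ in range(20):
        fire(random.choice([t for t in transitions if enabled(t)]))
    print(marking)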
Predicting Deformation Limits of Dual-Phase Steels Under Complex Loading Paths
Cheng, G.; Choi, K. S.; Hu, X.; ...
2017-04-05
Here in this study, the deformation limits of various DP980 steels are examined with the deformation instability theory. Under uniaxial tension, overall stress–strain curves of the material are estimated based on a simple rule of mixture (ROM) with both iso-strain and iso-stress assumptions. Under complex loading paths, an actual microstructure-based finite element (FE) method is used to resolve the deformation incompatibilities explicitly between the soft ferrite and hard martensite phases. The results show that, for uniaxial tension, the deformation instability theory with iso-strain-based ROM can be used to provide the lower bound estimate of the uniform elongation (UE) for the various DP980 considered. Under complex loading paths, the deformation instability theory with the microstructure-based FE method can be used in examining the effects of various microstructural features on the deformation limits of DP980 steels.
Predicting Deformation Limits of Dual-Phase Steels Under Complex Loading Paths
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheng, G.; Choi, K. S.; Hu, X.
The deformation limits of various DP980 steels are examined in this study with deformation instability theory. Under uniaxial tension, overall stress-strain curves of the material are estimated based on a simple rule of mixture (ROM) with both iso-strain and iso-stress assumptions. Under complex loading paths, an actual microstructure-based finite element (FE) method is used to explicitly resolve the deformation incompatibilities between the soft ferrite and hard martensite phases. The results show that, for uniaxial tension, the deformation instability theory with iso-strain-based ROM can be used to provide the lower bound estimate of the uniform elongation (UE) for the various DP980 considered. Under complex loading paths, the deformation instability theory with the microstructure-based FE method can be used in examining the effects of various microstructural features on the deformation limits of DP980 steels.
Fractal Complexity-Based Feature Extraction Algorithm of Communication Signals
NASA Astrophysics Data System (ADS)
Wang, Hui; Li, Jingchao; Guo, Lili; Dou, Zheng; Lin, Yun; Zhou, Ruolin
How to analyze and identify the characteristics of radiation sources and estimate the threat level by means of detection, interception and location has been a central issue of electronic support in electronic warfare, and communication signal recognition is one of the key points in solving this issue. Aiming at accurately extracting the individual characteristics of a radiation source in an increasingly complex communication electromagnetic environment, a novel feature extraction algorithm for individual characteristics of the communication radiation source, based on the fractal complexity of the signal, is proposed. According to the complexity of the received signal and the level of environmental noise, fractal dimension characteristics of different complexity are used to describe the subtle features of the signal and to establish a characteristic database; different broadcasting stations are then identified using grey relational analysis. The simulation results demonstrate that the algorithm can achieve a recognition rate of 94% even in an environment with an SNR of -10 dB, and this provides an important theoretical basis for the accurate identification of the subtle features of a signal at low SNR in the field of information confrontation.
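As an example of the kind of fractal-complexity feature involved, the sketch below estimates the Higuchi fractal dimension of a signal; this is a standard estimator chosen for illustration, not necessarily the specific fractal measure used in the paper, and the test signal is synthetic.

    import numpy as np

    def higuchi_fd(x, k_max=10):
        x = np.asarray(x, dtype=float)
        n = len(x)
        ks = range(1, k_max + 1)
        log_lengths = []
        for k in ks:
            lk = []
            for m in range(k):
                idx = np.arange(m, n, k)
                if len(idx) < 2:
                    continue
                # normalized curve length for offset m and delay k
                lm = np.sum(np.abs(np.diff(x[idx]))) * (n - 1) / ((len(idx) - 1) * k) / k
                lk.append(lm)
            log_lengths.append(np.log(np.mean(lk)))
        # slope of log L(k) versus log(1/k) gives the fractal dimension estimate
        slope, _ = np.polyfit(np.log(1.0 / np.array(list(ks))), log_lengths, 1)
        return slope

    rng = np.random.default_rng(0)
    print(higuchi_fd(np.cumsum(rng.normal(size=2000))))   # roughly 1.5 for Brownian-like noise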
Note on transmitted complexity for quantum dynamical systems
NASA Astrophysics Data System (ADS)
Watanabe, Noboru; Muto, Masahiro
2017-10-01
Transmitted complexity (mutual entropy) is one of the important measures in quantum information theory and has recently been developed in several ways. We review the fundamental concepts of the Kossakowski, Ohya and Watanabe entropy and define a transmitted complexity for quantum dynamical systems. This article is part of the themed issue `Second quantum revolution: foundational questions'.
Complexity in language acquisition.
Clark, Alexander; Lappin, Shalom
2013-01-01
Learning theory has frequently been applied to language acquisition, but discussion has largely focused on information-theoretic problems, in particular on the absence of direct negative evidence. Such arguments typically neglect the probabilistic nature of cognition and learning in general. We argue first that these arguments, and analyses based on them, suffer from a major flaw: they systematically conflate the hypothesis class and the learnable concept class. As a result, they do not allow one to draw significant conclusions about the learner. Second, we claim that the real problem for language learning is the computational complexity of constructing a hypothesis from input data. Studying this problem allows for a more direct approach to the object of study, the language acquisition device, rather than the learnable class of languages, which is epiphenomenal and possibly hard to characterize. The learnability results informed by complexity studies are much more insightful. They strongly suggest that target grammars need to be objective, in the sense that the primitive elements of these grammars are based on objectively definable properties of the language itself. These considerations support the view that language acquisition proceeds primarily through data-driven learning of some form. Copyright © 2013 Cognitive Science Society, Inc.
Understanding Financial Market States Using an Artificial Double Auction Market.
Yim, Kyubin; Oh, Gabjin; Kim, Seunghwan
2016-01-01
The ultimate value of theories describing the fundamental mechanisms behind asset prices in financial systems is reflected in the capacity of such theories to understand these systems. Although the models that explain the various states of financial markets offer substantial evidence from the fields of finance, mathematics, and even physics, previous theories that attempt to address the complexities of financial markets in full have been inadequate. We propose an artificial double auction market as an agent-based model to study the origin of complex states in financial markets by characterizing important parameters with an investment strategy that can cover the dynamics of the financial market. The investment strategies of chartist traders in response to new market information should reduce market stability based on the price fluctuations of risky assets. However, fundamentalist traders strategically submit orders based on fundamental value and thereby stabilize the market. We construct a continuous double auction market and find that the market is controlled by the proportion of chartists, Pc. We show that a state mimicking real financial markets, as observed in real financial systems, emerges within the range Pc = 0.40 to Pc = 0.85, whereas a state mimicking the efficient market hypothesis is generated for values less than Pc = 0.40. In particular, we observe that a state mimicking a market collapse is created for values greater than Pc = 0.85, at which point a liquidity shortage occurs, and the phase transition behavior is described at Pc = 0.85.
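A heavily simplified sketch of the kind of chartist/fundamentalist double auction described above; the order rules, parameter values and the single-unit order book below are my own assumptions, and the paper's model is considerably richer.

    import random
    import numpy as np

    random.seed(0)
    Pc, fundamental, steps = 0.6, 100.0, 5000
    bids, asks, last_price, prices = [], [], fundamental, []

    for t in range(steps):
        if random.random() < Pc:                         # chartist: extrapolates the recent trend
            trend = prices[-1] - prices[-20] if len(prices) >= 20 else 0.0
            quote = last_price + 0.5 * trend + random.gauss(0, 1)
        else:                                            # fundamentalist: anchors on value
            quote = fundamental + random.gauss(0, 1)
        if random.choice(("buy", "sell")) == "buy":
            if asks and quote >= min(asks):              # crosses the best ask: trade
                last_price = min(asks)
                asks.remove(last_price)
            else:
                bids.append(quote)
        else:
            if bids and quote <= max(bids):              # crosses the best bid: trade
                last_price = max(bids)
                bids.remove(last_price)
            else:
                asks.append(quote)
        prices.append(last_price)

    print("price std:", np.std(prices))

Varying Pc in a toy like this changes how far prices wander from the fundamental value, which is the kind of regime dependence the abstract quantifies.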
A systems-based food safety evaluation: an experimental approach.
Higgins, Charles L; Hartfield, Barry S
2004-11-01
Food establishments are complex systems with inputs, subsystems, underlying forces that affect the system, outputs, and feedback. Building on past exploration of the hazard analysis critical control point concept and Ludwig von Bertalanffy General Systems Theory, the National Park Service (NPS) is attempting to translate these ideas into a realistic field assessment of food service establishments and to use information gathered by these methods in efforts to improve food safety. Over the course of the last two years, an experimental systems-based methodology has been drafted, developed, and tested by the NPS Public Health Program. This methodology is described in this paper.
Mathematical morphology-based shape feature analysis for Chinese character recognition systems
NASA Astrophysics Data System (ADS)
Pai, Tun-Wen; Shyu, Keh-Hwa; Chen, Ling-Fan; Tai, Gwo-Chin
1995-04-01
This paper proposes an efficient technique of shape feature extraction based on the application of mathematical morphology theory. A new shape complexity index for the preclassification of machine-printed Chinese Character Recognition (CCR) is also proposed. For characters represented in different fonts/sizes or in a low-resolution environment, a more stable local feature such as shape structure is preferred for character recognition. Morphological valley extraction filters are applied to extract the protrusive strokes from the four sides of an input Chinese character. The number of extracted local strokes reflects the shape complexity of each side. These shape features of characters are encoded as corresponding shape complexity indices. Based on the shape complexity index, the database can be classified into 16 groups prior to recognition. Incorporating the shape feature analysis reclaims several characters from misrecognized character sets and results in an average improvement of 3.3% in the recognition rate of an existing recognition system. In addition to enhancing recognition performance, the extracted stroke information can be further analyzed and classified by stroke type. Therefore, the combination of extracted strokes from each side provides a means for database clustering based on radical or subword components. It is one of the best solutions for recognizing high-complexity characters such as Chinese characters, which are divided into more than 200 different categories and consist of more than 13,000 characters.
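To illustrate what a morphological valley extraction filter does, the sketch below applies a black top-hat (grey closing minus the original) to a synthetic 1D profile; the profile values, the structuring-element size and the scipy-based implementation are assumptions, not the paper's filters.

    import numpy as np
    from scipy.ndimage import grey_closing

    profile = np.array([5, 5, 5, 1, 1, 5, 5, 2, 5, 5, 5, 0, 5, 5], dtype=float)
    valleys = grey_closing(profile, size=5) - profile   # large where the profile dips
    print(np.flatnonzero(valleys > 0))                  # indices inside detected valleys

Applied along each side of a character image, the positions and counts of such valleys play the role of the protrusive-stroke features described above.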
Thinking on building the network cardiovasology of Chinese medicine.
Yu, Gui; Wang, Jie
2012-11-01
With advances in complex network theory, the thinking and methods regarding complex systems have changed dramatically. Network biology and network pharmacology were built by applying network-based approaches in biomedical research. The cardiovascular system may be regarded as a complex network, and cardiovascular diseases may be taken as damage to the structure and function of the cardiovascular network. Although Chinese medicine (CM) is effective in treating cardiovascular diseases, its mechanisms are still unclear. With the guidance of complex network theory, network biology and network pharmacology, network-based approaches could be used in the study of CM for preventing and treating cardiovascular diseases. A new discipline, the network cardiovasology of CM, was therefore developed. In this paper, complex network theory, network biology and network pharmacology are introduced and the connotation of "disease-syndrome-formula-herb" is illustrated from the network angle. Network biology could be used to analyze cardiovascular diseases and syndromes, and network pharmacology could be used to analyze CM formulas and herbs. The "network-network"-based approaches could provide a new view for elucidating the mechanisms of CM treatment.
Entropy evolution of moving mirrors and the information loss problem
NASA Astrophysics Data System (ADS)
Chen, Pisin; Yeom, Dong-han
2017-07-01
We investigate the entanglement entropy and the information flow of two-dimensional moving mirrors. Here we point out that various mirror trajectories can help to mimic different candidate resolutions to the information loss paradox following the semiclassical quantum field theory: (i) a suddenly stopping mirror corresponds to the assertion that all information is attached to the last burst, (ii) a slowly stopping mirror corresponds to the assertion that thermal Hawking radiation carries information, and (iii) a long propagating mirror corresponds to the remnant scenario. Based on such analogy, we find that the last burst of a black hole cannot contain enough information, while slowly emitting radiation can restore unitarity. For all cases, there is an apparent inconsistency between the picture based on quantum entanglements and that based on the semiclassical quantum field theory. Based on the quantum entanglement theory, a stopping mirror will generate a firewall-like violent emission which is in conflict with notions based on the semiclassical quantum field theory.
NASA Astrophysics Data System (ADS)
Pries-Heje, Jan; Baskerville, Richard L.
This paper elaborates a design science approach for management planning anchored to the concept of a management design theory. Unlike the notions of design theories arising from information systems, management design theories can appear as a system of technological rules, much as a system of hypotheses or propositions can embody scientific theories. The paper illustrates this form of management design theories with three grounded cases. These grounded cases include a software process improvement study, a user involvement study, and an organizational change study. Collectively these studies demonstrate how design theories founded on technological rules can not only improve the design of information systems, but that these concepts have great practical value for improving the framing of strategic organizational design decisions about such systems. Each case is either grounded in an empirical sense, that is to say, actual practice, or it is grounded to practices described extensively in the practical literature. Such design theories will help managers more easily approach complex, strategic decisions.
Graphic and cultural aspects of pictograms: an information ergonomics viewpoint.
Spinillo, Carla Galvão
2012-01-01
The use of pictograms is discussed considering their information content, graphic complexity and cultural dimension. The resemblance and illusion theories are highlighted to define the pictogram as a salience-based representation system whose communicational efficacy depends upon historical and cultural aspects of its interpretation. Thus, competence in interpreting pictograms is considered relative to users' acquaintance with the pictorial system and with the referents. The notion of the pictogram as a general/neutral visual statement is questioned, pointing out the cultural and gender attributes added to pictures to represent people, professions and social events. As a result of this discussion, some critical points in the standardization of pictograms are presented.
Random Matrix Theory Approach to Chaotic Coherent Perfect Absorbers
NASA Astrophysics Data System (ADS)
Li, Huanan; Suwunnarat, Suwun; Fleischmann, Ragnar; Schanz, Holger; Kottos, Tsampikos
2017-01-01
We employ random matrix theory in order to investigate coherent perfect absorption (CPA) in lossy systems with complex internal dynamics. The loss strength γCPA and energy ECPA, for which a CPA occurs, are expressed in terms of the eigenmodes of the isolated cavity—thus carrying over the information about the chaotic nature of the target—and their coupling to a finite number of scattering channels. Our results are tested against numerical calculations using complex networks of resonators and chaotic graphs as CPA cavities.
Discrimination of Complex Human Behavior by Pigeons (Columba livia) and Humans
Qadri, Muhammad A. J.; Sayde, Justin M.; Cook, Robert G.
2014-01-01
The cognitive and neural mechanisms for recognizing and categorizing behavior are not well understood in non-human animals. In the current experiments, pigeons and humans learned to categorize two non-repeating, complex human behaviors (“martial arts” vs. “Indian dance”). Using multiple video exemplars of a digital human model, pigeons discriminated these behaviors in a go/no-go task and humans in a choice task. Experiment 1 found that pigeons already experienced with discriminating the locomotive actions of digital animals acquired the discrimination more rapidly when action information was available than when only pose information was available. Experiments 2 and 3 found this same dynamic superiority effect with naïve pigeons and human participants. Both species used the same combination of immediately available static pose information and more slowly perceived dynamic action cues to discriminate the behavioral categories. Theories based on generalized visual mechanisms, as opposed to embodied, species-specific action networks, offer a parsimonious account of how these different animals recognize behavior across and within species. PMID:25379777
Reflections in the light of the complexity theory and nursing education.
Cruz, Ronny Anderson de Oliveira; Araujo, Elidianne Layanne Medeiros de; Nascimento, Neyce de Matos; Lima, Raquel Janyne de; França, Jael Rúbia Figueiredo de Sá; Oliveira, Jacira Dos Santos
2017-01-01
The aim is to reflect on nursing education, taking into account the principles of complex thinking proposed by Morin. The method is a reflection based on the principles of the complexity theory of Edgar Morin. The application of complexity in teaching proposes an emancipatory education based on questioning and social transformation. It concerns the education of nurses, whose work is characterized by interaction with others. It is necessary to prepare students to develop critical and reflective attitudes and actions to overcome the fragmentation and linearity of knowledge. Nursing care has been based on reductionist assistance, reflecting the Cartesian model. Thus, nursing education seeks to bring together shared knowledge and experiences so that no subject or professional overpowers another, accepting the uniqueness of professionals and patients.
Abou, Seraphin C
2012-03-01
In this paper, a new interpretation of intuitionistic fuzzy sets in the advanced framework of the Dempster-Shafer theory of evidence is extended to monitor safety-critical systems' performance. Not only is the proposed approach more effective, but it also takes into account the fuzzy rules that deal with imperfect knowledge/information and, therefore, is different from the classical Takagi-Sugeno fuzzy system, which assumes that the rule (the knowledge) is perfect. We provide an analytical solution to the practical and important problem of the conceptual probabilistic approach for formal ship safety assessment using the fuzzy set theory that involves uncertainties associated with the reliability input data. Thus, the overall safety of the ship engine is investigated as an object of risk analysis using the fuzzy mapping structure, which considers uncertainty and partial truth in the input-output mapping. The proposed method integrates direct evidence of the frame of discernment and is demonstrated through references to examples where fuzzy set models are informative. These simple applications illustrate how to assess the conflict of sensor information fusion for a sufficient cooling power system of vessels under extreme operation conditions. It was found that propulsion engine safety systems are not only a function of many environmental and operation profiles but are also dynamic and complex. Copyright © 2011 Elsevier Ltd. All rights reserved.
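For readers unfamiliar with the evidence-combination step in Dempster-Shafer theory, the sketch below applies Dempster's rule to two sensor mass functions over a tiny frame of discernment; the frame, the mass values and the "sensor" names are illustrative assumptions, not the paper's ship-safety model.

    from itertools import product

    FRAME = frozenset({"safe", "unsafe"})

    def combine(m1, m2):
        # Dempster's rule of combination for two mass functions over subsets of FRAME.
        combined, conflict = {}, 0.0
        for (a, p), (b, q) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + p * q
            else:
                conflict += p * q
        # Normalize by the non-conflicting mass.
        return {s: v / (1.0 - conflict) for s, v in combined.items()}

    sensor1 = {frozenset({"safe"}): 0.7, FRAME: 0.3}
    sensor2 = {frozenset({"unsafe"}): 0.5, FRAME: 0.5}
    print(combine(sensor1, sensor2))

The normalizing term makes the amount of conflict between the two sources explicit, which is exactly the quantity one inspects when assessing sensor-fusion conflict as described above.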
The Interplay of News Frames on Cognitive Complexity
ERIC Educational Resources Information Center
Shah, Dhavan V.; Kwak, Nojin; Schmierbach, Mike; Zubric, Jessica
2004-01-01
This research considers how distinct news frames work in combination to influence information processing. It extends framing research grounded in prospect theory (Tversky & Kahneman, 1981) and attribution theory (Iyengar, 1991) to study conditional framing effects on associative memory. Using a 2 x 3 experimental design embedded within a…
Understanding Vygotsky for the Classroom: Is It Too Late?
ERIC Educational Resources Information Center
Gredler, Margaret E.
2012-01-01
Determining the capability of Vygotsky's cultural-historical theory to fulfill key functions of educational theory (such as revealing the complexity of apparently simple events) has been hindered primarily by the following factors: (a) inaccurate information about a minor discussion, the zone of proximal development (ZPD), attracted attention…
Economic Thinking for Strategic Leaders
2011-03-24
…unprepared to analyze certain complex, ambiguous issues and craft informed decisions. Key terms: Behavioral Economics, Public Choice Theory, Army Profession. The paper surveys various economic fields, including Identity Economics, Neoclassical Economics, Behavioral Economics, and Public Choice Economics.
On Evaluating Human Problem Solving of Computationally Hard Problems
ERIC Educational Resources Information Center
Carruthers, Sarah; Stege, Ulrike
2013-01-01
This article is concerned with how computer science, and more exactly computational complexity theory, can inform cognitive science. In particular, we suggest factors to be taken into account when investigating how people deal with computational hardness. This discussion will address the two upper levels of Marr's Level Theory: the computational…
Coordination of Knowledge in Judging Animated Motion
ERIC Educational Resources Information Center
Thaden-Koch, Thomas C.; Dufresne, Robert J.; Mestre, Jose P.
2006-01-01
Coordination class theory is used to explain college students' judgments about animated depictions of moving objects. diSessa's coordination class theory models a "concept" as a complex knowledge system that can reliably determine a particular type of information in widely varying situations. In the experiment described here, fifty individually…
Coding Theory Information Theory and Radar
2005-01-01
The work concerns the design and synthesis of artificial multiagent systems and the understanding of human decision-making processes, including the altruism that may exist in a complex society. SGT derives its ability to account simultaneously for both group and individual interests from its structure, and satisficing decision theory is examined as a model of human decision making in multi-attribute decision problems.
ERIC Educational Resources Information Center
Borgos, Jill E.
2013-01-01
This article applies the theoretical framework of principal-agent theory in order to better understand the complex organisational relationships emerging between entities invested in the establishment and monitoring of cross-border international branch campus medical schools. Using the key constructs of principal-agent theory, information asymmetry…
NASA Astrophysics Data System (ADS)
Nordebo, Sven; Dalarsson, Mariana; Khodadad, Davood; Müller, Beat; Waldmann, Andreas D.; Becher, Tobias; Frerichs, Inez; Sophocleous, Louiza; Sjöberg, Daniel; Seifnaraghi, Nima; Bayford, Richard
2018-05-01
Classical homogenization theory based on the Hashin–Shtrikman coated ellipsoids is used to model the changes in the complex-valued conductivity (or admittivity) of a lung during tidal breathing. Here, the lung is modeled as a two-phase composite material in which the alveolar air filling corresponds to the inclusion phase. The theory predicts a linear relationship between the real and imaginary parts of the change in the complex-valued conductivity of a lung during tidal breathing, where the loss cotangent of the change is approximately the same as that of the effective background conductivity and hence easy to estimate. The theory is illustrated with numerical examples based on realistic parameter values and frequency ranges used with electrical impedance tomography (EIT). The theory may be potentially useful for imaging and clinical evaluations in connection with lung EIT for respiratory management and control.
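Read literally, the predicted relation can be written compactly; the following is a hedged restatement only (the symbols are ours, and "loss cotangent" is assumed here to mean the ratio of the real to the imaginary part).

```latex
% Hedged restatement of the abstract's claim, with assumed notation:
% \Delta\gamma = change in complex admittivity during tidal breathing,
% \gamma_b    = effective background admittivity.
\frac{\operatorname{Re}\Delta\gamma}{\operatorname{Im}\Delta\gamma}
\;\approx\;
\frac{\operatorname{Re}\gamma_b}{\operatorname{Im}\gamma_b},
\qquad\text{i.e.}\qquad
\operatorname{Re}\Delta\gamma \;\approx\; \cot\delta_b\,\operatorname{Im}\Delta\gamma .
```

Under this reading, a scatter plot of the real against the imaginary part of the tidal change should fall on a straight line whose slope is fixed by the background admittivity, which is why the abstract describes it as easy to estimate.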
Petri net-based method for the analysis of the dynamics of signal propagation in signaling pathways.
Hardy, Simon; Robillard, Pierre N
2008-01-15
Cellular signaling networks are dynamic systems that propagate and process information and, ultimately, cause phenotypical responses. Understanding the circuitry of information flow in cells is one of the keys to understanding complex cellular processes. The development of computational quantitative models is a promising avenue for attaining this goal. Not only does the analysis of simulation data based on the concentration variations of biological compounds yield information about systemic state changes, but it is also very helpful for obtaining information about the dynamics of signal propagation. This article introduces a new method for analyzing the dynamics of signal propagation in signaling pathways using Petri net theory. The method is demonstrated with the Ca(2+)/calmodulin-dependent protein kinase II (CaMKII) regulation network. The results constitute temporal information about signal propagation in the network, a simplified graphical representation of the network and of the signal propagation dynamics, and a characterization of some signaling routes as regulation motifs.
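To make the Petri-net idea concrete, here is a minimal sketch of token firing in a toy place/transition net; the places, transitions, and marking are invented for illustration and are not taken from the CaMKII model.

```python
# Minimal place/transition Petri net: a transition fires when every input
# place holds at least one token; firing moves tokens from inputs to outputs.
places = {"signal": 1, "kinase_inactive": 1, "kinase_active": 0, "substrate_P": 0}

# Each transition lists its input and output places (arc weights of 1 assumed).
transitions = {
    "activate_kinase": (["signal", "kinase_inactive"], ["kinase_active"]),
    "phosphorylate":   (["kinase_active"],             ["kinase_active", "substrate_P"]),
}

def enabled(name):
    inputs, _ = transitions[name]
    return all(places[p] >= 1 for p in inputs)

def fire(name):
    inputs, outputs = transitions[name]
    for p in inputs:
        places[p] -= 1
    for p in outputs:
        places[p] += 1

# Fire enabled transitions in a fixed order and record when the output place first
# receives a token -- a crude stand-in for "signal propagation time" in firing steps.
for step in range(1, 5):
    for t in list(transitions):
        if enabled(t):
            fire(t)
    if places["substrate_P"] >= 1:
        print(f"substrate phosphorylated after {step} step(s)")
        break
```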
Particularism and the retreat from theory in the archaeology of agricultural origins
Gremillion, Kristen J.; Barton, Loukas; Piperno, Dolores R.
2014-01-01
The introduction of new analytic methods and expansion of research into previously untapped regions have greatly increased the scale and resolution of data relevant to the origins of agriculture (OA). As a result, the recognition of varied historical pathways to agriculture and the continuum of management strategies have complicated the search for general explanations for the transition to food production. In this environment, higher-level theoretical frameworks are sometimes rejected on the grounds that they force conclusions that are incompatible with real-world variability. Some of those who take this position argue instead that OA should be explained in terms of local and historically contingent factors. This retreat from theory in favor of particularism is based on the faulty beliefs that complex phenomena such as agricultural origins demand equally complex explanations and that explanation is possible in the absence of theoretically based assumptions. The same scholars who are suspicious of generalization are reluctant to embrace evolutionary approaches to human behavior on the grounds that they are ahistorical, overly simplistic, and dismissive of agency and intent. We argue that these criticisms are misplaced and explain why a coherent theory of human behavior that acknowledges its evolutionary history is essential to advancing understanding of OA. Continued progress depends on the integration of human behavior and culture into the emerging synthesis of evolutionary developmental biology that informs contemporary research into plant and animal domestication. PMID:24753601
Particularism and the retreat from theory in the archaeology of agricultural origins.
Gremillion, Kristen J; Barton, Loukas; Piperno, Dolores R
2014-04-29
The introduction of new analytic methods and expansion of research into previously untapped regions have greatly increased the scale and resolution of data relevant to the origins of agriculture (OA). As a result, the recognition of varied historical pathways to agriculture and the continuum of management strategies have complicated the search for general explanations for the transition to food production. In this environment, higher-level theoretical frameworks are sometimes rejected on the grounds that they force conclusions that are incompatible with real-world variability. Some of those who take this position argue instead that OA should be explained in terms of local and historically contingent factors. This retreat from theory in favor of particularism is based on the faulty beliefs that complex phenomena such as agricultural origins demand equally complex explanations and that explanation is possible in the absence of theoretically based assumptions. The same scholars who are suspicious of generalization are reluctant to embrace evolutionary approaches to human behavior on the grounds that they are ahistorical, overly simplistic, and dismissive of agency and intent. We argue that these criticisms are misplaced and explain why a coherent theory of human behavior that acknowledges its evolutionary history is essential to advancing understanding of OA. Continued progress depends on the integration of human behavior and culture into the emerging synthesis of evolutionary developmental biology that informs contemporary research into plant and animal domestication.
Kruse, Clemens Scott; DeShazo, Jonathan; Kim, Forest; Fulton, Lawrence
2014-05-23
The Health Information Technology for Economic and Clinical Health Act (HITECH) allocated $19.2 billion to incentivize adoption of the electronic health record (EHR). Since 2009, Meaningful Use Criteria have dominated information technology (IT) strategy. Health care organizations have struggled to meet expectations and avoid penalties to reimbursements from the Center for Medicare and Medicaid Services (CMS). Organizational theories attempt to explain factors that influence organizational change, and many theories address changes in organizational strategy. However, due to the complexities of the health care industry, existing organizational theories fall short of demonstrating association with significant health care IT implementations. There is no organizational theory for health care that identifies, groups, and analyzes both internal and external factors of influence for large health care IT implementations like adoption of the EHR. The purpose of this systematic review is to identify a full spectrum of both internal organizational and external environmental factors associated with the adoption of health information technology (HIT), specifically the EHR. The result is a conceptual model that is commensurate with the complexity of the health care sector. We performed a systematic literature search in PubMed (restricted to English), EBSCO Host, and Google Scholar for both empirical studies and theory-based writing from 1993-2013 that demonstrated association between influential factors and three modes of HIT: EHR, electronic medical record (EMR), and computerized provider order entry (CPOE). We also examined published books on organizational theories. We took notes and identified trends in adoption factors, which were then grouped as adoption factors associated with the various modes of EHR adoption. The resulting conceptual model summarizes the diversity of independent variables (IVs) and dependent variables (DVs) used in articles, editorials, books, as well as quantitative and qualitative studies (n=83). As of 2009, only 16.30% (815/4999) of nonfederal, acute-care hospitals had adopted a fully interoperable EHR. Of the 83 articles reviewed in this study, 16/83 (19%) identified internal organizational factors and 9/83 (11%) identified external environmental factors associated with adoption of the EHR, EMR, or CPOE. The conceptual model for EHR adoption associates each variable with the work that identified it. Commonalities exist in the literature for internal organizational and external environmental factors associated with the adoption of the EHR and/or CPOE. The conceptual model for EHR adoption associates internal and external factors, specific to the health care industry, with adoption of the EHR. It becomes apparent that these factors have some level of association, but the association is not consistently calculated individually or in combination. To better understand effective adoption strategies, empirical studies should be performed from this conceptual model to quantify the positive or negative effect of each factor.
Application of a theoretical framework to foster a cardiac-diabetes self-management programme.
Wu, C-J Jo; Chang, A M
2014-09-01
This paper analyses and illustrates the application of Bandura's self-efficacy construct to an innovative self-management programme for patients with both type 2 diabetes and coronary heart disease. Using theory as a framework for any health intervention provides a solid and valid foundation for aspects of planning and delivering such an intervention; however, it is reported that many health behaviour intervention programmes are not based upon theory and are consequently limited in their applicability to different populations. The cardiac-diabetes self-management programme has been specifically developed for patients with dual conditions with the strategies for delivering the programme based upon Bandura's self-efficacy theory. This patient group is at greater risk of negative health outcomes than that with a single chronic condition and therefore requires appropriate intervention programmes with solid theoretical foundations that can address the complexity of care required. The cardiac-diabetes self-management programme has been developed incorporating theory, evidence and practical strategies. This paper provides explicit knowledge of the theoretical basis and components of a cardiac-diabetes self-management programme. Such detail enhances the ability to replicate or adopt the intervention in similar or differing populations and/or cultural contexts as it provides in-depth understanding of each element within the intervention. Knowledge of the concepts alone is not sufficient to deliver a successful health programme. Supporting patients to master skills of self-care is essential in order for patients to successfully manage two complex, chronic illnesses. Valuable information has been provided to close the theory-practice gap for more consistent health outcomes, engaging with patients for promoting holistic care within organizational and cultural contexts. © 2014 International Council of Nurses.
Doubravsky, Karel; Dohnal, Mirko
2015-01-01
Complex decision making tasks of different natures, e.g. economics, safety engineering, ecology and biology, are based on vague, sparse, partially inconsistent and subjective knowledge. Moreover, decision-making economists/engineers are usually not willing to invest too much time into the study of complex formal theories. They require decisions which can be (re)checked by human-like common sense reasoning. One important problem related to realistic decision making tasks is the incomplete data sets required by the chosen decision making algorithm. This paper presents a relatively simple algorithm for how some missing III (input information items) can be generated using mainly decision tree topologies and integrated into incomplete data sets. The algorithm is based on an easy-to-understand heuristic, e.g. a longer decision tree sub-path is less probable. This heuristic can solve decision problems under total ignorance, i.e. when the decision tree topology is the only information available. In practice, however, isolated information items, e.g. some vaguely known probabilities (e.g. fuzzy probabilities), are usually available. This means that a realistic problem is analysed under partial ignorance. The proposed algorithm reconciles topology-related heuristics and additional fuzzy sets using fuzzy linear programming. The case study, represented by a tree with six lotteries and one fuzzy probability, is presented in detail. PMID:26158662
Doubravsky, Karel; Dohnal, Mirko
2015-01-01
Complex decision making tasks of different natures, e.g. economics, safety engineering, ecology and biology, are based on vague, sparse, partially inconsistent and subjective knowledge. Moreover, decision-making economists/engineers are usually not willing to invest too much time into the study of complex formal theories. They require decisions which can be (re)checked by human-like common sense reasoning. One important problem related to realistic decision making tasks is the incomplete data sets required by the chosen decision making algorithm. This paper presents a relatively simple algorithm for how some missing III (input information items) can be generated using mainly decision tree topologies and integrated into incomplete data sets. The algorithm is based on an easy-to-understand heuristic, e.g. a longer decision tree sub-path is less probable. This heuristic can solve decision problems under total ignorance, i.e. when the decision tree topology is the only information available. In practice, however, isolated information items, e.g. some vaguely known probabilities (e.g. fuzzy probabilities), are usually available. This means that a realistic problem is analysed under partial ignorance. The proposed algorithm reconciles topology-related heuristics and additional fuzzy sets using fuzzy linear programming. The case study, represented by a tree with six lotteries and one fuzzy probability, is presented in detail.
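A minimal sketch of the length-based heuristic described above follows; the tree, its branch lengths, and the inverse-length weighting are assumptions made for illustration, and the paper's actual reconciliation step uses fuzzy linear programming, which is not reproduced here.

```python
# Toy decision tree: each terminal lottery is reached by a sub-path of known
# length (number of edges). Heuristic: a longer sub-path is less probable, so
# give each lottery a weight inversely proportional to its path length and
# normalise the weights into probabilities.
path_lengths = {"L1": 1, "L2": 2, "L3": 2, "L4": 3, "L5": 3, "L6": 4}

weights = {k: 1.0 / n for k, n in path_lengths.items()}
total = sum(weights.values())
probabilities = {k: w / total for k, w in weights.items()}

# A partially known probability (here a point estimate standing in for a fuzzy
# probability) can then be imposed and the remaining mass re-normalised.
known = {"L6": 0.05}  # illustrative value only
rest = {k: p for k, p in probabilities.items() if k not in known}
scale = (1.0 - sum(known.values())) / sum(rest.values())
reconciled = {**known, **{k: p * scale for k, p in rest.items()}}

print({k: round(p, 3) for k, p in sorted(reconciled.items())})
```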
Holographic complexity and noncommutative gauge theory
NASA Astrophysics Data System (ADS)
Couch, Josiah; Eccles, Stefan; Fischler, Willy; Xiao, Ming-Lei
2018-03-01
We study the holographic complexity of noncommutative field theories. The four-dimensional N=4 noncommutative super Yang-Mills theory with Moyal algebra along two of the spatial directions has a well-known holographic dual as a type IIB supergravity theory with a stack of D3 branes and non-trivial NS-NS B fields. We start from this example and find that the late-time holographic complexity growth rate, based on the "complexity equals action" conjecture, experiences an enhancement when the noncommutativity is turned on. This enhancement saturates a new limit which is exactly 1/4 larger than the commutative value. We then attempt to give a quantum mechanical explanation of the enhancement. The finite-time behavior of the complexity growth rate is also studied. Inspired by this non-trivial result, we move on to a more general setup in string theory with a stack of Dp branes and a non-zero B field. Multiple noncommutative directions are considered in the higher-p cases.
Gasche, Loïc; Mahévas, Stéphanie; Marchal, Paul
2013-01-01
Ecosystems are usually complex, nonlinear and strongly influenced by poorly known environmental variables. Among these systems, marine ecosystems have high uncertainties: marine populations in general are known to exhibit large levels of natural variability and the intensity of fishing efforts can change rapidly. These uncertainties are a source of risks that threaten the sustainability of both fish populations and fishing fleets targeting them. Appropriate management measures have to be found in order to reduce these risks and decrease sensitivity to uncertainties. Methods have been developed within decision theory that aim at allowing decision making under severe uncertainty. One of these methods is the information-gap decision theory. The info-gap method has started to permeate ecological modelling, with recent applications to conservation. However, these practical applications have so far been restricted to simple models with analytical solutions. Here we implement a deterministic approach based on decision theory in a complex model of the Eastern English Channel. Using the ISIS-Fish modelling platform, we model populations of sole and plaice in this area. We test a wide range of values for ecosystem, fleet and management parameters. From these simulations, we identify management rules controlling fish harvesting that allow reaching management goals recommended by ICES (International Council for the Exploration of the Sea) working groups while providing the highest robustness to uncertainties on ecosystem parameters. PMID:24204873
Gasche, Loïc; Mahévas, Stéphanie; Marchal, Paul
2013-01-01
Ecosystems are usually complex, nonlinear and strongly influenced by poorly known environmental variables. Among these systems, marine ecosystems have high uncertainties: marine populations in general are known to exhibit large levels of natural variability and the intensity of fishing efforts can change rapidly. These uncertainties are a source of risks that threaten the sustainability of both fish populations and fishing fleets targeting them. Appropriate management measures have to be found in order to reduce these risks and decrease sensitivity to uncertainties. Methods have been developed within decision theory that aim at allowing decision making under severe uncertainty. One of these methods is the information-gap decision theory. The info-gap method has started to permeate ecological modelling, with recent applications to conservation. However, these practical applications have so far been restricted to simple models with analytical solutions. Here we implement a deterministic approach based on decision theory in a complex model of the Eastern English Channel. Using the ISIS-Fish modelling platform, we model populations of sole and plaice in this area. We test a wide range of values for ecosystem, fleet and management parameters. From these simulations, we identify management rules controlling fish harvesting that allow reaching management goals recommended by ICES (International Council for the Exploration of the Sea) working groups while providing the highest robustness to uncertainties on ecosystem parameters.
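Info-gap robustness can be illustrated with a deliberately simple, hypothetical harvest model; none of the numbers or functional forms below come from the ISIS-Fish study, they only show the mechanics of the method.

```python
# Info-gap robustness sketch: how much can an uncertain parameter deviate from
# its nominal value before a management rule stops meeting its requirement?
import numpy as np

r_nominal = 0.8                          # nominal (best-estimate) population growth rate
B0, quota, B_min = 100.0, 20.0, 70.0     # initial biomass, annual catch, biomass target

def worst_case_biomass(alpha, years=10):
    """Worst-case final biomass when r may deviate downward by a fraction alpha."""
    r = r_nominal * (1.0 - alpha)        # worst case inside the info-gap uncertainty set
    B = B0
    for _ in range(years):
        B = max(B + r * B * (1 - B / 150.0) - quota, 0.0)   # logistic growth minus catch
    return B

# Robustness: the largest alpha for which the requirement B >= B_min still holds.
alphas = np.linspace(0.0, 1.0, 201)
feasible = [a for a in alphas if worst_case_biomass(a) >= B_min]
print(f"robustness of this quota: alpha_hat = {max(feasible):.3f}" if feasible
      else "requirement not met even at the nominal estimate")
```

Comparing alpha_hat across candidate quotas is the basic way an info-gap analysis ranks management rules by their robustness to parameter uncertainty.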
Stochastic cycle selection in active flow networks.
Woodhouse, Francis G; Forrow, Aden; Fawcett, Joanna B; Dunkel, Jörn
2016-07-19
Active biological flow networks pervade nature and span a wide range of scales, from arterial blood vessels and bronchial mucus transport in humans to bacterial flow through porous media or plasmodial shuttle streaming in slime molds. Despite their ubiquity, little is known about the self-organization principles that govern flow statistics in such nonequilibrium networks. Here we connect concepts from lattice field theory, graph theory, and transition rate theory to understand how topology controls dynamics in a generic model for actively driven flow on a network. Our combined theoretical and numerical analysis identifies symmetry-based rules that make it possible to classify and predict the selection statistics of complex flow cycles from the network topology. The conceptual framework developed here is applicable to a broad class of biological and nonbiological far-from-equilibrium networks, including actively controlled information flows, and establishes a correspondence between active flow networks and generalized ice-type models.
Stochastic cycle selection in active flow networks
NASA Astrophysics Data System (ADS)
Woodhouse, Francis; Forrow, Aden; Fawcett, Joanna; Dunkel, Jorn
2016-11-01
Active biological flow networks pervade nature and span a wide range of scales, from arterial blood vessels and bronchial mucus transport in humans to bacterial flow through porous media or plasmodial shuttle streaming in slime molds. Despite their ubiquity, little is known about the self-organization principles that govern flow statistics in such non-equilibrium networks. By connecting concepts from lattice field theory, graph theory and transition rate theory, we show how topology controls dynamics in a generic model for actively driven flow on a network. Through theoretical and numerical analysis we identify symmetry-based rules to classify and predict the selection statistics of complex flow cycles from the network topology. Our conceptual framework is applicable to a broad class of biological and non-biological far-from-equilibrium networks, including actively controlled information flows, and establishes a new correspondence between active flow networks and generalized ice-type models.
Governance of environmental risk: new approaches to managing stakeholder involvement.
Benn, Suzanne; Dunphy, Dexter; Martin, Andrew
2009-04-01
Disputes concerning industrial legacies such as the disposal of toxic wastes illustrate changing pressures on corporations and governments. Business and governments are now confronted with managing the expectations of a society increasingly aware of the social and environmental impacts and risks associated with economic development and demanding more equitable distribution and democratic management of such risks. The closed managerialist decision-making of the powerful bureaucracies and corporations of the industrial era is informed by traditional management theory which cannot provide a framework for the adequate governance of these risks. Recent socio-political theories have conceptualised some key themes that must be addressed in a more fitting approach to governance. We identify more recent management and governance theory which addresses these themes and develop a process-based approach to governance of environmental disputes that allows for the evolving nature of stakeholder relations in a highly complex multiple stakeholder arena.
Stochastic cycle selection in active flow networks
Woodhouse, Francis G.; Forrow, Aden; Fawcett, Joanna B.; Dunkel, Jörn
2016-01-01
Active biological flow networks pervade nature and span a wide range of scales, from arterial blood vessels and bronchial mucus transport in humans to bacterial flow through porous media or plasmodial shuttle streaming in slime molds. Despite their ubiquity, little is known about the self-organization principles that govern flow statistics in such nonequilibrium networks. Here we connect concepts from lattice field theory, graph theory, and transition rate theory to understand how topology controls dynamics in a generic model for actively driven flow on a network. Our combined theoretical and numerical analysis identifies symmetry-based rules that make it possible to classify and predict the selection statistics of complex flow cycles from the network topology. The conceptual framework developed here is applicable to a broad class of biological and nonbiological far-from-equilibrium networks, including actively controlled information flows, and establishes a correspondence between active flow networks and generalized ice-type models. PMID:27382186
NASA Astrophysics Data System (ADS)
Daminelli, Simone; Thomas, Josephine Maria; Durán, Claudio; Vittorio Cannistraci, Carlo
2015-11-01
Bipartite networks are powerful descriptions of complex systems characterized by two different classes of nodes and connections allowed only across but not within the two classes. Unveiling physical principles, building theories and suggesting physical models to predict bipartite links such as product-consumer connections in recommendation systems or drug-target interactions in molecular networks can provide priceless information to improve e-commerce or to accelerate pharmaceutical research. The prediction of nonobserved connections starting from those already present in the topology of a network is known as the link-prediction problem. It represents an important subject both in many-body interaction theory in physics and in new algorithms for applied tools in computer science. The rationale is that the existing connectivity structure of a network can suggest where new connections can appear with higher likelihood in an evolving network, or where nonobserved connections are missing in a partially known network. Surprisingly, current complex network theory presents a theoretical bottleneck: a general framework for local-based link prediction directly in the bipartite domain is missing. Here, we overcome this theoretical obstacle and present a formal definition of the common neighbour index and the local-community-paradigm (LCP) for bipartite networks. As a consequence, we are able to introduce the first node-neighbourhood-based and LCP-based models for topological link prediction that utilize the bipartite domain. We performed link prediction evaluations in several networks of different size and of disparate origin, including technological, social and biological systems. Our models significantly improve topological prediction in many bipartite networks because they exploit local physical driving forces that participate in the formation and organization of many real-world bipartite networks. Furthermore, we present a local-based formalism that allows neighbourhood-based link prediction to be implemented intuitively and entirely in the bipartite domain.
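Below is a hedged sketch of neighbourhood-based link prediction in a bipartite graph; the score used here (counting length-3 paths between a candidate cross-class pair) is a generic local index chosen for illustration and is not claimed to be the exact index defined in the paper.

```python
# Bipartite link prediction by local neighbourhoods. Nodes in class U (e.g. drugs)
# connect only to nodes in class V (e.g. targets). For a candidate pair (u, v),
# count length-3 paths u - v' - u' - v as a simple local similarity score.
from itertools import product

edges = {("d1", "t1"), ("d1", "t2"), ("d2", "t2"), ("d2", "t3"), ("d3", "t3")}
U = {u for u, _ in edges}
V = {v for _, v in edges}
nbr_u = {u: {v for uu, v in edges if uu == u} for u in U}   # targets of each drug
nbr_v = {v: {u for u, vv in edges if vv == v} for v in V}   # drugs of each target

def path3_score(u, v):
    """Number of u - v' - u' - v paths with u' != u and v' != v."""
    return sum(1 for vp in nbr_u[u] - {v}
                 for up in nbr_v[vp] - {u}
                 if v in nbr_u[up])

candidates = [(u, v) for u, v in product(sorted(U), sorted(V)) if (u, v) not in edges]
for u, v in sorted(candidates, key=lambda p: -path3_score(*p)):
    print(u, v, path3_score(u, v))
```

Higher-scoring non-observed pairs are the ones a purely local, bipartite-domain predictor would propose first.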
Oudkerk Pool, Andrea; Govaerts, Marjan J B; Jaarsma, Debbie A D C; Driessen, Erik W
2018-05-01
While portfolios are increasingly used to assess competence, the validity of such portfolio-based assessments has hitherto remained unconfirmed. The purpose of the present research is therefore to further our understanding of how assessors form judgments when interpreting the complex data included in a competency-based portfolio. Eighteen assessors appraised one of three competency-based mock portfolios while thinking aloud, before taking part in semi-structured interviews. A thematic analysis of the think-aloud protocols and interviews revealed that assessors reached judgments through a 3-phase cyclical cognitive process of acquiring, organizing, and integrating evidence. Upon conclusion of the first cycle, assessors reviewed the remaining portfolio evidence to look for confirming or disconfirming evidence. Assessors were inclined to stick to their initial judgments even when confronted with seemingly disconfirming evidence. Although assessors reached similar final (pass-fail) judgments of students' professional competence, they differed in their information-processing approaches and the reasoning behind their judgments. Differences sprung from assessors' divergent assessment beliefs, performance theories, and inferences about the student. Assessment beliefs refer to assessors' opinions about what kind of evidence gives the most valuable and trustworthy information about the student's competence, whereas assessors' performance theories concern their conceptualizations of what constitutes professional competence and competent performance. Even when using the same pieces of information, assessors furthermore differed with respect to inferences about the student as a person as well as a (future) professional. Our findings support the notion that assessors' reasoning in judgment and decision-making varies and is guided by their mental models of performance assessment, potentially impacting feedback and the credibility of decisions. Our findings also lend further credence to the assertion that portfolios should be judged by multiple assessors who should, moreover, thoroughly substantiate their judgments. Finally, it is suggested that portfolios be designed in such a way that they facilitate the selection of and navigation through the portfolio evidence.
Expertise facilitates the transfer of anticipation skill across domains.
Rosalie, Simon M; Müller, Sean
2014-02-01
It is unclear whether perceptual-motor skill transfer is based upon similarity between the learning and transfer domains per identical elements theory, or facilitated by an understanding of underlying principles in accordance with general principle theory. Here, the predictions of identical elements theory, general principle theory, and aspects of a recently proposed model for the transfer of perceptual-motor skill with respect to expertise in the learning and transfer domains are examined. The capabilities of expert karate athletes, near-expert karate athletes, and novices to anticipate and respond to stimulus skills derived from taekwondo and Australian football were investigated in ecologically valid contexts using an in situ temporal occlusion paradigm and complex whole-body perceptual-motor skills. Results indicated that the karate experts and near-experts are as capable of using visual information to anticipate and guide motor skill responses as domain experts and near-experts in the taekwondo transfer domain, but only karate experts could perform like domain experts in the Australian football transfer domain. Findings suggest that transfer of anticipation skill is based upon expertise and an understanding of principles but may be supplemented by similarities that exist between the stimulus and response elements of the learning and transfer domains.
ERIC Educational Resources Information Center
Li, Feifei
2017-01-01
An information-correction method for testlet-based tests is introduced. This method takes advantage of both generalizability theory (GT) and item response theory (IRT). The measurement error for the examinee proficiency parameter is often underestimated when a unidimensional conditional-independence IRT model is specified for a testlet dataset. By…
Recoverability in quantum information theory
NASA Astrophysics Data System (ADS)
Wilde, Mark
The fact that the quantum relative entropy is non-increasing with respect to quantum physical evolutions lies at the core of many optimality theorems in quantum information theory and has applications in other areas of physics. In this work, we establish improvements of this entropy inequality in the form of physically meaningful remainder terms. One of the main results can be summarized informally as follows: if the decrease in quantum relative entropy between two quantum states after a quantum physical evolution is relatively small, then it is possible to perform a recovery operation, such that one can perfectly recover one state while approximately recovering the other. This can be interpreted as quantifying how well one can reverse a quantum physical evolution. Our proof method is elementary, relying on the method of complex interpolation, basic linear algebra, and the recently introduced Renyi generalization of a relative entropy difference. The theorem has a number of applications in quantum information theory, which have to do with providing physically meaningful improvements to many known entropy inequalities. This is based on arXiv:1505.04661, now accepted for publication in Proceedings of the Royal Society A. I acknowledge support from startup funds from the Department of Physics and Astronomy at LSU, the NSF under Award No. CCF-1350397, and the DARPA Quiness Program through US Army Research Office award W31P4Q-12-1-0019.
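To convey the shape of the main result, here is a hedged schematic only; the notation is ours, and the precise theorem in the cited preprint involves an optimization over a family of rotated Petz recovery maps, which is omitted here.

```latex
% Schematic: a small decrease in relative entropy under a channel N implies the
% existence of a recovery map R that exactly restores sigma and nearly restores rho.
D(\rho\,\Vert\,\sigma) - D\big(\mathcal{N}(\rho)\,\Vert\,\mathcal{N}(\sigma)\big)
\;\gtrsim\; -2\log F\big(\rho,\;(\mathcal{R}\circ\mathcal{N})(\rho)\big),
\qquad
(\mathcal{R}\circ\mathcal{N})(\sigma)=\sigma ,
```

where F denotes the quantum fidelity: when the left-hand side is small, the fidelity must be close to one, which is the recoverability statement described in the abstract.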
Statistical dielectronic recombination rates for multielectron ions in plasma
NASA Astrophysics Data System (ADS)
Demura, A. V.; Leont'iev, D. S.; Lisitsa, V. S.; Shurygin, V. A.
2017-10-01
We describe the general analytic derivation of the dielectronic recombination (DR) rate coefficient for multielectron ions in a plasma based on the statistical theory of an atom in terms of the spatial distribution of the atomic electron density. The dielectronic recombination rates for complex multielectron tungsten ions are calculated numerically over a wide range of plasma temperatures, which is important for modern nuclear fusion studies. The results of the statistical theory are compared with the data obtained using the level-by-level codes ADPAK, FAC, and HULLAC, and with experimental results. We consider different statistical DR models based on the Thomas-Fermi distribution, viz., integral and differential with respect to the orbital angular momenta of the ion core and the trapped electron, as well as the Rost model, which is an analog of the Frank-Condon model as applied to atomic structures. In view of its universality and relative simplicity, the statistical approach can be used to obtain express estimates of the dielectronic recombination rate coefficients in complex calculations of the parameters of thermonuclear plasmas. The application of statistical methods also provides the dielectronic recombination rates with much smaller computational expenditure than the available level-by-level codes.
Perspectives on hand function in girls and women with Rett syndrome.
Downs, Jenny; Parkinson, Stephanie; Ranelli, Sonia; Leonard, Helen; Diener, Pamela; Lotan, Meir
2014-06-01
Rett syndrome is a rare neurodevelopmental disorder that is usually associated with a mutation on the X-linked MECP2 gene. Hand function is particularly affected and we discuss theoretical and practical perspectives for optimising hand function in Rett syndrome. We reviewed the literature pertaining to hand function and stereotypies in Rett syndrome and developed a toolkit for their assessment and treatment. There is little published information on management of hand function in Rett syndrome. We suggest assessment and treatment strategies based on available literature, clinical experience and grounded in theories of motor control and motor learning. Additional studies are needed to determine the best treatments for hand function in Rett syndrome. Meanwhile, clinical needs can be addressed by supplementing the evidence base with an understanding of the complexities of Rett syndrome, clinical experience, environmental enrichment animal studies and theories of motor control and motor learning.
Fiber tracking of brain white matter based on graph theory.
Lu, Meng
2015-01-01
Brain white matter tractography is reconstructed from diffusion-weighted magnetic resonance images. Due to the complex structure of brain white matter fiber bundles, fiber crossing and fiber branching are abundant in the human brain, and regular methods based on diffusion tensor imaging (DTI) cannot handle this problem accurately; it is one of the biggest problems in brain tractography. Therefore, this paper presents a novel brain white matter tractography method based on graph theory, in which the fiber tracking between two voxels is transformed into locating the shortest path in a graph. Besides, the presented method uses Q-ball imaging (QBI) as the source data instead of DTI, because QBI can provide accurate information about multiple fiber crossings and branchings in one voxel using the orientation distribution function (ODF). Experiments showed that the presented method can accurately handle the problem of brain white matter fiber crossing and branching, and reconstruct brain tractography both in phantom data and real brain data.
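A minimal sketch of the graph-theoretic idea (shortest path between two voxels) follows; the grid, the edge weights, and the way ODF alignment would enter those weights are all assumptions made for illustration, not the paper's actual cost function.

```python
# Shortest-path "fiber tracking" on a toy voxel graph with Dijkstra's algorithm.
# Each voxel is a node; an edge weight is meant to be low when the step between
# neighbouring voxels is well supported by the local fiber orientation (here the
# weights are simply made up to keep the example self-contained).
import heapq

graph = {  # node -> {neighbour: weight}
    "A": {"B": 1.0, "D": 4.0},
    "B": {"A": 1.0, "C": 1.5, "D": 0.5},
    "C": {"B": 1.5, "E": 1.0},
    "D": {"A": 4.0, "B": 0.5, "E": 2.0},
    "E": {"C": 1.0, "D": 2.0},
}

def dijkstra(start, goal):
    dist, prev = {start: 0.0}, {}
    queue = [(0.0, start)]
    while queue:
        d, u = heapq.heappop(queue)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(queue, (nd, v))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]

print(dijkstra("A", "E"))   # candidate "fiber" between two seed voxels
```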
Combining complex networks and data mining: Why and how
NASA Astrophysics Data System (ADS)
Zanin, M.; Papo, D.; Sousa, P. A.; Menasalvas, E.; Nicchi, A.; Kubik, E.; Boccaletti, S.
2016-05-01
The increasing power of computer technology does not dispense with the need to extract meaningful information out of data sets of ever growing size, and indeed typically exacerbates the complexity of this task. To tackle this general problem, two methods have emerged, at chronologically different times, that are now commonly used in the scientific community: data mining and complex network theory. Not only do complex network analysis and data mining share the same general goal, that of extracting information from complex systems to ultimately create a new compact quantifiable representation, but they also often address similar problems too. In the face of that, a surprisingly low number of researchers turn out to resort to both methodologies. One may then be tempted to conclude that these two fields are either largely redundant or totally antithetic. The starting point of this review is that this state of affairs should be put down to contingent rather than conceptual differences, and that these two fields can in fact advantageously be used in a synergistic manner. An overview of both fields is first provided, some fundamental concepts of which are illustrated. A variety of contexts in which complex network theory and data mining have been used in a synergistic manner are then presented. Contexts in which the appropriate integration of complex network metrics can lead to improved classification rates with respect to classical data mining algorithms and, conversely, contexts in which data mining can be used to tackle important issues in complex network theory applications are illustrated. Finally, ways to achieve a tighter integration between complex networks and data mining, and open lines of research are discussed.
Perceptual Learning, Cognition, and Expertise
ERIC Educational Resources Information Center
Kellman, Philip J.; Massey, Christine M.
2013-01-01
Recent research indicates that perceptual learning (PL)--experience-induced changes in the way perceivers extract information--plays a larger role in complex cognitive tasks, including abstract and symbolic domains, than has been understood in theory or implemented in instruction. Here, we describe the involvement of PL in complex cognitive tasks…
On long-only information-based portfolio diversification framework
NASA Astrophysics Data System (ADS)
Santos, Raphael A.; Takada, Hellinton H.
2014-12-01
Using the concepts from information theory, it is possible to improve the traditional frameworks for long-only asset allocation. In modern portfolio theory, the investor has two basic procedures: the choice of a portfolio that maximizes its risk-adjusted excess return or the mixed allocation between the maximum Sharpe portfolio and the risk-free asset. In the literature, the first procedure was already addressed using information theory. One contribution of this paper is the consideration of the second procedure in the information theory context. The performance of these approaches was compared with three traditional asset allocation methodologies: the Markowitz's mean-variance, the resampled mean-variance and the equally weighted portfolio. Using simulated and real data, the information theory-based methodologies were verified to be more robust when dealing with the estimation errors.
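A brief sketch of the second procedure mentioned above (mixing the maximum-Sharpe portfolio with the risk-free asset) follows; the expected returns, covariance matrix, and mixing weight are illustrative assumptions, and the information-theoretic refinement studied in the paper is not reproduced here.

```python
# Maximum-Sharpe (tangency) portfolio and a mix with the risk-free asset.
import numpy as np

mu = np.array([0.08, 0.10, 0.12])          # assumed expected asset returns
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])       # assumed return covariance matrix
rf = 0.02                                  # assumed risk-free rate

excess = mu - rf
w = np.linalg.solve(cov, excess)           # unnormalised tangency weights
w_tangency = w / w.sum()                   # long-only here only because of the chosen inputs

lam = 0.6                                  # fraction allocated to the risky portfolio
w_mixed = lam * w_tangency                 # the remaining 1 - lam sits in the risk-free asset

sharpe = (w_tangency @ excess) / np.sqrt(w_tangency @ cov @ w_tangency)
print("tangency weights:", np.round(w_tangency, 3))
print("mixed allocation (risky part):", np.round(w_mixed, 3), "risk-free:", 1 - lam)
print("tangency Sharpe ratio:", round(float(sharpe), 3))
```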
ERIC Educational Resources Information Center
Blackmon, Marilyn Hughes
2012-01-01
This paper draws from cognitive psychology and cognitive neuroscience to develop a preliminary similarity-choice theory of how people allocate attention among information patches on webpages while completing search tasks in complex informational websites. Study 1 applied stepwise multiple regression to a large dataset and showed that success rate…
Decomposition Theory in the Teaching of Elementary Linear Algebra.
ERIC Educational Resources Information Center
London, R. R.; Rogosinski, H. P.
1990-01-01
Described is a decomposition theory from which the Cayley-Hamilton theorem, the diagonalizability of complex square matrices, and functional calculus can be developed. The theory and its applications are based on elementary polynomial algebra. (KR)
Dorazio, R.M.; Johnson, F.A.
2003-01-01
Bayesian inference and decision theory may be used in the solution of relatively complex problems of natural resource management, owing to recent advances in statistical theory and computing. In particular, Markov chain Monte Carlo algorithms provide a computational framework for fitting models of adequate complexity and for evaluating the expected consequences of alternative management actions. We illustrate these features using an example based on management of waterfowl habitat.
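As a concrete and deliberately tiny illustration of the MCMC machinery the authors refer to, here is a hedged random-walk Metropolis sketch for a binomial detection-probability problem; all numbers are invented, and this is not the waterfowl-habitat model from the paper.

```python
# Random-walk Metropolis sampler for the posterior of a detection probability p
# given x successes in n trials, with a flat prior on (0, 1).
import math, random

random.seed(1)
x, n = 7, 20                      # invented survey data
def log_post(p):
    if not 0.0 < p < 1.0:
        return -math.inf
    return x * math.log(p) + (n - x) * math.log(1.0 - p)

samples, p = [], 0.5
for _ in range(20000):
    prop = p + random.gauss(0.0, 0.1)              # symmetric proposal
    if math.log(random.random()) < log_post(prop) - log_post(p):
        p = prop
    samples.append(p)

post = samples[5000:]                              # drop burn-in
print("posterior mean of p:", round(sum(post) / len(post), 3))
```

The same accept/reject loop, with a richer likelihood and priors, is what lets models "of adequate complexity" be fitted and alternative management actions be compared by their expected consequences.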
Can You Lead Me Now? Leading in the Complex World of Homeland Security
2007-09-01
Theories of Formal Social Systems (Thousand Oaks, California: Sage Publications, 1999); Richard N. Osborn, James G. Hunt, and Lawrence Jauch… midst of an organizational crisis, or when executives and employees use social networks to gather information and initiate solutions for survival… GROUNDED THEORY: Grounded theory offers a different way of knowing and understanding social interactions and patterns. The positivistic and deductive…
Organizational Change at the Edge of Chaos: A Complexity Theory Perspective of Autopoietic Systems
ERIC Educational Resources Information Center
Susini, Domenico, III.
2010-01-01
This qualitative phenomenological study includes explorations of organizational change phenomena from the vantage point of complexity theory as experienced through the lived experiences of eight senior level managers and executives based in Northern N.J. who have experienced crisis situations in their organizations. Concepts from the natural…
Generalization of the Activated Complex Theory of Reaction Rates. II. Classical Mechanical Treatment
DOE R&D Accomplishments Database
Marcus, R. A.
1964-01-01
In its usual classical form activated complex theory assumes a particular expression for the kinetic energy of the reacting system -- one associated with a rectilinear motion along the reaction coordinate. The derivation of the rate expression given in the present paper is based on the general kinetic energy expression.
Martinez-Lavin, Manuel; Infante, Oscar; Lerma, Claudia
2008-02-01
Modern clinicians are often frustrated by their inability to understand fibromyalgia and similar maladies, since these illnesses cannot be explained by the prevailing linear-reductionist medical paradigm. This article proposes that new concepts derived from Complexity Theory may help in understanding the pathogenesis of fibromyalgia, chronic fatigue syndrome, and Gulf War syndrome. This hypothesis is based on the recent recognition of chaos, fractals, and complex systems in human physiology. These nonlinear dynamics concepts offer a different perspective on the notions of homeostasis and disease. They propose that the essence of disease is dysfunction and not structural damage. Studies using novel nonlinear instruments have shown that fibromyalgia and similar maladies may be caused by the degraded performance of our main complex adaptive system. This dysfunction explains the multifaceted manifestations of these entities. To understand and alleviate the suffering associated with these complex illnesses, a paradigm shift from reductionism to holism based on Complexity Theory is suggested. This shift perceives health as resilient adaptation and some chronic illnesses as rigid dysfunction.
Pernar, Luise I M; Ashley, Stanley W; Smink, Douglas S; Zinner, Michael J; Peyre, Sarah E
2012-01-01
Practicing within the Halstedian model of surgical education, academic surgeons serve dual roles as physicians to their patients and educators of their trainees. Despite this significant responsibility, few surgeons receive formal training in educational theory to inform their practice. The goal of this work was to gain an understanding of how master surgeons approach teaching uncommon and highly complex operations and to determine the educational constructs that frame their teaching philosophies and approaches. Individuals included in the study were queried using electronically distributed open-ended, structured surveys. Responses to the surveys were analyzed and grouped using grounded theory and were examined for parallels to concepts of learning theory. Academic teaching hospital. Twenty-two individuals identified as master surgeons. Twenty-one (95.5%) individuals responded to the survey. Two primary thematic clusters were identified: global approach to teaching (90.5% of respondents) and approach to intraoperative teaching (76.2%). Many of the emergent themes paralleled principles of transfer learning theory outlined in the psychology and education literature. Key elements included: conferring graduated responsibility (57.1%), encouraging development of a mental set (47.6%), fostering or expecting deliberate practice (42.9%), deconstructing complex tasks (38.1%), vertical transfer of information (33.3%), and identifying general principles to structure knowledge (9.5%). Master surgeons employ many of the principles of learning theory when teaching uncommon and highly complex operations. The findings may hold significant implications for faculty development in surgical education. Copyright © 2012 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Van Gerven, Pascal W. M.; Paas, Fred G. W. C.; Van Merrienboer, Jeroen J. G.; Schmidt, Henk G.
2000-01-01
Cognitive load (CL) theory suggests minimizing extraneous CL and maximizing germane CL in order not to overload working memory. Instructional design for older adults should therefore include goal-free problems, worked examples, and different modalities and avoid splitting attention and including redundant information. (SK)
Aging and the complexity of cardiovascular dynamics
NASA Technical Reports Server (NTRS)
Kaplan, D. T.; Furman, M. I.; Pincus, S. M.; Ryan, S. M.; Lipsitz, L. A.; Goldberger, A. L.
1991-01-01
Biomedical signals often vary in a complex and irregular manner. Analysis of variability in such signals generally does not address directly their complexity, and so may miss potentially useful information. We analyze the complexity of heart rate and beat-to-beat blood pressure using two methods motivated by nonlinear dynamics (chaos theory). A comparison of a group of healthy elderly subjects with healthy young adults indicates that the complexity of cardiovascular dynamics is reduced with aging. This suggests that complexity of variability may be a useful physiological marker.
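One widely used complexity measure in this line of research is approximate entropy; the minimal implementation below is a generic sketch (the parameter choices m = 2 and r = 0.2·SD are conventional defaults, not values taken from this study).

```python
# Approximate entropy (ApEn) of a short physiological series, e.g. RR intervals.
import numpy as np

def apen(x, m=2, r_frac=0.2):
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()

    def phi(m):
        n = len(x) - m + 1
        templates = np.array([x[i:i + m] for i in range(n)])
        # Chebyshev distance between every pair of templates
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        counts = (dist <= r).sum(axis=1) / n
        return np.mean(np.log(counts))

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 20 * np.pi, 300))          # highly regular signal
irregular = rng.normal(size=300)                            # irregular signal
print("ApEn regular  :", round(apen(regular), 3))
print("ApEn irregular:", round(apen(irregular), 3))         # expected to be larger
```

Lower values indicate more regular, less complex dynamics, which is the direction of change the abstract reports with aging.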
Differences in game reading between selected and non-selected youth soccer players.
Den Hartigh, Ruud J R; Van Der Steen, Steffie; Hakvoort, Bas; Frencken, Wouter G P; Lemmink, Koen A P M
2018-02-01
Applying an established theory of cognitive development, Skill Theory, the current study compares the game-reading skills of youth players selected for a soccer school of a professional soccer club (n = 49) and their non-selected peers (n = 38). Participants described the actions taking place in videos of soccer game plays, and their verbalisations were coded using Skill Theory. Compared to the non-selected players, the selected players generally demonstrated higher levels of complexity in their game-reading, and structured the information of game elements (primarily the player, teammate and field) at higher complexity levels. These results demonstrate how Skill Theory can be used to assess and distinguish the game-reading of youth players with different expertise, a skill important for soccer, but also for other sports.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Yaru; Xing, Zhiyan; Zhang, Xiao
To systematically explore the influence of inorganic anions on building coordination complexes, five novel complexes based on 1-(benzotriazole-1-methyl)-2-propylimidazole (bpmi), [Cu(bpmi)₂(Ac)₂]·H₂O (1), [Cu(bpmi)₂(H₂O)₂]·2NO₃·2H₂O (2), [Cu(bpmi)(N₃)₂] (3), [Ag(bpmi)(NO₃)] (4) and [Cu₃(bpmi)₂(SCN)₄(DMF)] (5) (Ac⁻ = CH₃COO⁻, DMF = N,N-dimethylformamide) are synthesized by rationally introducing Cu(II) salts and an Ag(I) salt with different inorganic anions. X-ray single-crystal analyses reveal that these complexes show interesting structural features from mononuclear (1), one-dimensional (2 and 3), and two-dimensional (4) to three-dimensional (5) under the influence of inorganic anions with different basicities. The structural variation can be explained by the hard-soft-acid-base (HSAB) theory. Magnetic susceptibility measurement indicates that complex 3 exhibits antiferromagnetic coupling between adjacent Cu(II) ions. - Graphical abstract: Five new Cu(II)/Ag(I) complexes show interesting structural features from mononuclear, one-dimensional, and two-dimensional to three-dimensional under the influence of inorganic anions. The structural variation can be explained by the HSAB theory. - Highlights: • Five inorganic anion-dependent complexes are synthesized. • Structural variation can be explained by the hard-soft-acid-base (HSAB) theory. • The magnetic property of the complex has been studied.
Protein-Protein Interface and Disease: Perspective from Biomolecular Networks.
Hu, Guang; Xiao, Fei; Li, Yuqian; Li, Yuan; Vongsangnak, Wanwipa
Protein-protein interactions are involved in many important biological processes and molecular mechanisms of disease association. Structural studies of interfacial residues in protein complexes provide information on protein-protein interactions. Characterizing protein-protein interfaces, including binding sites and allosteric changes, thus poses an imminent challenge. With special focus on protein complexes, approaches based on network theory are proposed to meet this challenge. In this review we pay attention to protein-protein interfaces from the perspective of biomolecular networks and their roles in disease. We first describe the different roles of protein complexes in disease through several structural aspects of interfaces. We then discuss some recent advances in predicting hot spots and communication pathway analysis in terms of amino acid networks. Finally, we highlight possible future aspects of this area with respect to both methodology development and applications for disease treatment.
Huff, Emily Silver; Leahy, Jessica E.; Hiebeler, David; Weiskittel, Aaron R.; Noblet, Caroline L.
2015-01-01
Privately owned woodlands are an important source of timber and ecosystem services in North America and worldwide. Impacts of management on these ecosystems and timber supply from these woodlands are difficult to estimate because complex behavioral theory informs the owner’s management decisions. The decision-making environment consists of exogenous market factors, internal cognitive processes, and social interactions with fellow landowners, foresters, and other rural community members. This study seeks to understand how social interactions, information flow, and peer-to-peer networks influence timber harvesting behavior using an agent-based model. This theoretical model includes forested polygons in various states of ‘harvest readiness’ and three types of agents: forest landowners, foresters, and peer leaders (individuals trained in conservation who use peer-to-peer networking). Agent rules, interactions, and characteristics were parameterized with values from existing literature and an empirical survey of forest landowner attitudes, intentions, and demographics. The model demonstrates that as trust in foresters and peer leaders increases, the percentage of the forest that is harvested sustainably increases. Furthermore, peer leaders can serve to increase landowner trust in foresters. Model output and equations will inform forest policy and extension/outreach efforts. The model also serves as an important testing ground for new theories of landowner decision making and behavior. PMID:26562429
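A toy sketch of the kind of agent-based dynamic described above follows; the trust-update rule, probabilities, and agent counts are all invented for illustration and are not the parameter values of the published model.

```python
# Toy agent-based model: landowner trust in foresters raises the chance that a
# parcel is harvested sustainably; peer leaders nudge trust upward over time.
import random

random.seed(42)
N_OWNERS, YEARS, PEER_LEADER_REACH = 200, 20, 0.1

owners = [{"trust": random.uniform(0.1, 0.6), "harvested_sustainably": 0}
          for _ in range(N_OWNERS)]

for year in range(YEARS):
    for o in owners:
        if random.random() < PEER_LEADER_REACH:          # contacted by a peer leader this year
            o["trust"] = min(1.0, o["trust"] + 0.05)     # trust in foresters grows
        if random.random() < 0.15:                       # parcel is ready for harvest
            if random.random() < o["trust"]:             # trust -> forester advice is followed
                o["harvested_sustainably"] += 1

share = sum(o["harvested_sustainably"] > 0 for o in owners) / N_OWNERS
mean_trust = sum(o["trust"] for o in owners) / N_OWNERS
print(f"mean trust after {YEARS} years: {mean_trust:.2f}; "
      f"share of owners with >=1 sustainable harvest: {share:.2f}")
```

Raising PEER_LEADER_REACH in this sketch increases both mean trust and the sustainably harvested share, mirroring the qualitative result reported in the abstract.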
Hajdukiewicz, John R; Vicente, Kim J
2002-01-01
Ecological interface design (EID) is a theoretical framework that aims to support worker adaptation to change and novelty in complex systems. Previous evaluations of EID have emphasized representativeness to enhance generalizability of results to operational settings. The research presented here is complementary, emphasizing experimental control to enhance theory building. Two experiments were conducted to test the impact of functional information and emergent feature graphics on adaptation to novelty and change in a thermal-hydraulic process control microworld. Presenting functional information in an interface using emergent features encouraged experienced participants to become perceptually coupled to the interface and thereby to exhibit higher-level control and more successful adaptation to unanticipated events. The absence of functional information or of emergent features generally led to lower-level control and less success at adaptation, the exception being a minority of participants who compensated by relying on analytical reasoning. These findings may have practical implications for shaping coordination in complex systems and fundamental implications for the development of a general unified theory of coordination for the technical, human, and social sciences. Actual or potential applications of this research include the design of human-computer interfaces that improve safety in complex sociotechnical systems.
Harmonic analysis of electric locomotive and traction power system based on wavelet singular entropy
NASA Astrophysics Data System (ADS)
Dun, Xiaohong
2018-05-01
With the rapid development of high-speed railway and heavy-haul transport, the locomotive and traction power system has become the main harmonic source of China's power grid. In response to this phenomenon, the system's power quality issues need timely monitoring, assessment and governance. Wavelet singular entropy is an organic combination of wavelet transform, singular value decomposition and information entropy theory, which combines the unique advantages of the three in signal processing: the time-frequency localization of the wavelet transform, the ability of singular value decomposition to explore the basic modal characteristics of the data, and the quantification of feature data by information entropy. Based on the theory of singular value decomposition, the wavelet coefficient matrix obtained after the wavelet transform is decomposed into a series of singular values that reflect the basic characteristics of the original coefficient matrix. The statistical properties of information entropy are then used to analyze the uncertainty of the singular value set, so as to give a definite measurement of the complexity of the original signal. It can be said that wavelet entropy has good application prospects in fault detection, classification and protection. The MATLAB simulation shows that wavelet singular entropy is effective for harmonic analysis of the locomotive and traction power system.
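Here is a hedged sketch of the wavelet singular entropy computation as described above; the wavelet choice, decomposition level, and the way the coefficient matrix is assembled from the detail coefficients are assumptions made for illustration.

```python
# Wavelet singular entropy: wavelet transform -> SVD of the coefficient matrix
# -> Shannon entropy of the normalised singular values.
import numpy as np
import pywt

fs = 1000
t = np.arange(1024) / fs          # 1024 samples so the level-4 SWT length rule holds
# Synthetic "traction current": fundamental plus 3rd and 5th harmonics and noise.
x = (np.sin(2 * np.pi * 50 * t) + 0.3 * np.sin(2 * np.pi * 150 * t)
     + 0.2 * np.sin(2 * np.pi * 250 * t)
     + 0.05 * np.random.default_rng(0).normal(size=t.size))

level = 4
coeffs = pywt.swt(x, "db4", level=level)          # stationary transform: equal-length levels
W = np.array([cD for _, cD in coeffs])            # (levels x samples) coefficient matrix

s = np.linalg.svd(W, compute_uv=False)            # singular values of the coefficient matrix
p = s / s.sum()
wse = -np.sum(p * np.log(p + 1e-12))              # Shannon entropy of normalised singular values
print(f"wavelet singular entropy: {wse:.3f}")
```

A signal dominated by a single clean component concentrates energy in few singular values (low entropy), while rich harmonic content or disturbances spread the spectrum and raise the entropy, which is what makes the index useful for detection and classification.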
ERIC Educational Resources Information Center
Eaton, Paul William
2016-01-01
This article examines the limitations and possibilities of the emerging competency-based movement in student affairs. Using complexity theory and postmodern educational theory as guiding frameworks, examination of the competency-based movement will raise questions about overapplication of competencies in graduate preparation programs and…
Spatiotemporal Dynamics and Fitness Analysis of Global Oil Market: Based on Complex Network
Wang, Minggang; Fang, Guochang; Shao, Shuai
2016-01-01
We study the overall topological structure properties of the global oil trade network, such as degree, strength, cumulative distribution, information entropy and weight clustering. The structural evolution of the network is investigated as well. We find that the global oil import and export networks do not show a typical scale-free distribution, but display a disassortative property. Furthermore, based on the monthly data of oil import values during 2005.01–2014.12, by applying random matrix theory we investigate the complex spatiotemporal dynamics at the country level and the fitness evolution of the global oil market from a demand-side analysis. Abundant information about the global oil market can be obtained from the deviating eigenvalues. The results show that the oil market has experienced five different periods, which is consistent with the evolution of country clusters. Moreover, we find that the changing trend of the fitness function agrees with that of gross domestic product (GDP), and suggest that the fitness evolution of the oil market can be predicted by forecasting GDP values. To conclude, some suggestions are provided according to the results. PMID:27706147
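A minimal sketch of the random-matrix step (comparing correlation eigenvalues with the Marchenko-Pastur bounds to isolate "deviating" eigenvalues) follows; the synthetic data and the common-factor construction are illustrative only and are not the oil-import dataset.

```python
# Deviating eigenvalues of a correlation matrix via the Marchenko-Pastur bounds.
import numpy as np

rng = np.random.default_rng(3)
N, T = 30, 120                                  # e.g. 30 countries, 120 monthly observations
common = rng.normal(size=T)                     # a shared "market/demand" factor
data = 0.4 * common + rng.normal(size=(N, T))   # country series = common factor + noise

# Standardise each series, build the correlation matrix, take its spectrum.
z = (data - data.mean(axis=1, keepdims=True)) / data.std(axis=1, keepdims=True)
C = z @ z.T / T
eigvals = np.linalg.eigvalsh(C)

# Marchenko-Pastur bounds for pure noise with quality factor Q = T / N.
Q = T / N
lam_minus, lam_plus = (1 - np.sqrt(1 / Q)) ** 2, (1 + np.sqrt(1 / Q)) ** 2
deviating = eigvals[eigvals > lam_plus]
print(f"MP upper bound: {lam_plus:.2f}; deviating eigenvalues: {np.round(deviating, 2)}")
```

Eigenvalues above the noise band carry the market-wide and cluster structure; tracking how they change over rolling windows is one way the periods and country clusters described above can be identified.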
Zhang, Bing; Jin, Rui; Huang, Jianmei; Liu, Xiaoqing; Xue, Chunmiao; Lin, Zhijian
2012-08-01
Traditional Chinese medicine (TCM) property theory is believed to be a key and difficult point in basic theory studies of TCM. The complex concepts, components and characteristics of TCM property have long puzzled researchers and urged them to develop new angles and approaches. From the viewpoint of cognitive science, TCM property theory is a cognitive process of storing, extracting, rebuilding and summarizing sensory information about TCMs and their effects during medical practice against diseases, under the guidance of traditional Chinese philosophical thinking. The cognitive process of TCM property has particular cognitive elements and strategies. Taking into account the clinical application characteristics of TCMs, this study defines these particular cognitive elements. Combining research methods from modern chemistry, biology and mathematics, and building on five years of early-stage work, we have built a TCM property cognition model based on three elements and applied it to drugs with pungent and hot properties as an example, in the hope of interpreting TCM properties with modern science and providing insight into the nature of medicinal properties and guidance for rational clinical prescription.
NASA Astrophysics Data System (ADS)
Alfonso, Leonardo
2013-04-01
The role of decision-makers is to take the outputs from hydrological and hydraulic analyses and, to some extent, use them as inputs to make decisions related to the planning, design and operation of water systems. However, the use of these technical analyses is frequently limited, since there are other non-hydrological issues that must be considered, which may lead to very different solutions than those envisaged by purely technical analyses. One possibility for accounting for the nature of human decisions under uncertainty is to explore concepts from decision theory and behavioural economics, such as Value of Information and Prospect Theory, and embed them into the methodologies we use in hydrological practice. Three examples are presented to illustrate these multidisciplinary interactions. The first, for monitoring network design, uses Value of Information within a methodology to locate water level stations in a complex network of canals in the Netherlands. The second example, for operation, shows how the Value of Information concept can be used to formulate alternative methods to evaluate flood risk according to the set of options available for decision-making during a flood event. The third example, for planning, uses Prospect Theory concepts to understand how the "losses hurt more than gains feel good" effect can determine the final decision on whether or not to urbanise a flood-prone area. It is demonstrated that decision theory and behavioural economics principles are promising for evaluating the complex decision-making process in water-related issues.
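To make the Value of Information idea concrete, the minimal sketch below computes the expected value of perfect information for a toy flood-protection decision; the probabilities and costs are invented for illustration and are not taken from the examples described above.

import numpy as np

p_state = np.array([0.2, 0.8])          # P(flood), P(no flood)
# Cost matrix: rows = actions (protect, do nothing), columns = states (flood, no flood).
cost = np.array([[10.0, 10.0],
                 [100.0, 0.0]])

# Without extra information: choose the action that minimizes expected cost.
prior_cost = (cost * p_state).sum(axis=1).min()
# With perfect information: choose the best action in each state, then average over states.
perfect_cost = (cost.min(axis=0) * p_state).sum()
value_of_information = prior_cost - perfect_cost
print(value_of_information)   # the most a rational decision-maker would pay for the information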
Information and material flows in complex networks
NASA Astrophysics Data System (ADS)
Helbing, Dirk; Armbruster, Dieter; Mikhailov, Alexander S.; Lefeber, Erjen
2006-04-01
In this special issue, an overview of the Thematic Institute (TI) on Information and Material Flows in Complex Systems is given. The TI was carried out within EXYSTENCE, the first EU Network of Excellence in the area of complex systems. Its motivation, research approach and subjects are presented here. Among the various methods used are many-particle and statistical physics, nonlinear dynamics, as well as complex systems, network and control theory. The contributions are relevant for complex systems as diverse as vehicle and data traffic in networks, logistics, production, and material flows in biological systems. The key disciplines involved are socio-, econo-, traffic- and bio-physics, and a new research area that could be called “biologistics”.
Cukras, Janusz; Kauczor, Joanna; Norman, Patrick; Rizzo, Antonio; Rikken, Geert L J A; Coriani, Sonia
2016-05-21
A computational protocol for magneto-chiral dichroism and magneto-chiral birefringence dispersion is presented within the framework of damped response theory, also known as complex polarization propagator theory, at the level of time-dependent Hartree-Fock and time-dependent density functional theory. Magneto-chiral dichroism and magneto-chiral birefringence spectra in the (resonant) frequency region below the first ionization threshold of R-methyloxirane and l-alanine are presented and compared with the corresponding results obtained for both the electronic circular dichroism and the magnetic circular dichroism. The additional information content yielded by the magneto-chiral phenomena, as well as their potential experimental detectability for the selected species, is discussed.
A Review of Computer-Based Human Behavior Representations and Their Relation to Military Simulations
2003-08-01
[Table-of-contents excerpt: the review covers, among others, the approach described by Emery and Trist (1960), activity theory (introduced by Vygotsky in the 1930s and formalized by Leont'ev, 1979), situated cognition theory, Adaptive Resonance Theory (ART), and Cognitive Complexity Theory (CCT).]
Information content and acoustic structure of male African elephant social rumbles
Stoeger, Angela S.; Baotic, Anton
2016-01-01
Until recently, the prevailing theory about male African elephants (Loxodonta africana) was that, once adult and sexually mature, males are solitary and focused only on finding estrous females. While this is true during the state of ‘musth’ (a condition characterized by aggressive behavior and elevated androgen levels), ‘non-musth’ males exhibit a social system seemingly based on companionship, dominance and established hierarchies. Research on elephant vocal communication has so far focused on females, and very little is known about the acoustic structure and the information content of male vocalizations. Using the source and filter theory approach, we analyzed social rumbles of 10 male African elephants. Our results reveal that male rumbles encode information about individuality and maturity (age and size), with formant frequencies and absolute fundamental frequency values having the most informative power. This first comprehensive study on male elephant vocalizations gives important indications of their potential functional relevance for male-male and male-female communication. Our results suggest that, similar to the highly social females, future research on male elephant vocal behavior will reveal a complex communication system in which social knowledge, companionship, hierarchy, reproductive competition and the need to communicate over long distances play key roles. PMID:27273586
Ecological scenarios analyzed and evaluated by a shallow lake model.
Kardaetz, Sascha; Strube, Torsten; Brüggemann, Rainer; Nützmann, Gunnar
2008-07-01
We applied the complex ecosystem model EMMO, which was adapted to the shallow lake Müggelsee (Germany), in order to evaluate a large set of ecological scenarios. By means of EMMO, 33 scenarios and 17 indicators were defined to characterize their effects on the lake ecosystem. The indicators were based on model outputs of EMMO and can be separated into biological indicators, such as chlorophyll-a and cyanobacteria, and hydro-chemical indicators, such as phosphorus. The questions to be solved were: what is the ranking of the scenarios based on their characterization by these 17 indicators, and how can we handle large quantities of complex data within evaluation procedures? The scenario evaluation was performed by partial order theory which, however, did not provide a clear result. By subsequently applying the hierarchical cluster analysis (complete linkage) it was possible to reduce the data matrix to indicator and scenario representatives. Even though this step implies a loss of information, it simplifies the application of partial order theory and the post processing by METEOR. METEOR is derived from partial order theory and allows the stepwise aggregation of indicators, which subsequently leads to a distinct and clear decision. In the final evaluation result the best scenario was the one which defines a minimum nutrient input and no phosphorus release from the sediment, while the worst scenario is characterized by a maximum nutrient input and extensive phosphorus release from the sediment. The reasonable and comprehensive results show that the combination of partial order, cluster analysis and METEOR can handle large amounts of data in a very clear and transparent way, and is therefore ideal in the context of complex ecosystem models like the one we applied.
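The core of the partial-order step described above is a simple dominance check between scenarios across all indicators: one scenario is ranked above another only if it is at least as good on every indicator and strictly better on at least one, otherwise the two remain incomparable. A minimal sketch with an invented 4-scenario by 3-indicator matrix (higher = better), not the EMMO outputs:

import numpy as np

scores = np.array([[0.9, 0.8, 0.7],   # scenario A
                   [0.6, 0.9, 0.5],   # scenario B
                   [0.4, 0.3, 0.2],   # scenario C
                   [0.5, 0.2, 0.3]])  # scenario D
names = ["A", "B", "C", "D"]

def dominates(x, y):
    # x dominates y if x >= y on every indicator and > y on at least one.
    return np.all(x >= y) and np.any(x > y)

cover = [(names[i], names[j])
         for i in range(len(names)) for j in range(len(names))
         if i != j and dominates(scores[i], scores[j])]
print(cover)   # pairs (better, worse); incomparable pairs simply do not appear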
Chen, Ying; Pham, Tuan D
2013-05-15
We apply for the first time the sample entropy (SampEn) and the regularity dimension model for measuring signal complexity to quantify the structural complexity of the brain on MRI. The concept of the regularity dimension is based on the theory of chaos for studying nonlinear dynamical systems, where power laws and entropy measures are adopted to develop the regularity dimension, modeling a mathematical relationship between the frequencies with which information about signal regularity changes across scales. The sample entropy and regularity dimension of MRI-based brain structural complexity are computed for older adults with early Alzheimer's disease (AD) and age- and gender-matched non-demented controls, as well as for a wide range of ages from young people to older adults. A significantly higher global cortical structure complexity is detected in AD individuals (p<0.001). SampEn and the regularity dimension are also found to increase with aging, which might indicate an age-related exacerbation of cortical structural irregularity. The model can potentially be used as an imaging biomarker for early prediction of AD and age-related cognitive decline.
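Sample entropy itself is a short, well-defined algorithm: count template matches of length m and of length m+1 within a tolerance r, and take the negative logarithm of their ratio. The minimal, unoptimized sketch below illustrates it; m = 2 and r = 0.2 times the standard deviation are common defaults, not necessarily the settings used in the study.

import numpy as np

def sample_entropy(x, m=2, r=None):
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    n = len(x)

    def count_matches(k):
        # All overlapping templates of length k.
        templates = np.array([x[i:i + k] for i in range(n - k)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance from template i to every later template.
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d <= r)
        return count

    b = count_matches(m)       # matches of length m
    a = count_matches(m + 1)   # matches of length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(0)
print(sample_entropy(rng.standard_normal(500)))   # white noise gives a relatively high value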
ERIC Educational Resources Information Center
Wang, Lin
2013-01-01
Background: Cultural-historical activity theory is an important theory in modern psychology. In recent years, it has drawn more attention from related disciplines including information science. Argument: This paper argues that activity theory and domain analysis which uses the theory as one of its bases could bring about some important…
Argument Complexity: Teaching Undergraduates to Make Better Arguments
ERIC Educational Resources Information Center
Kelly, Matthew A.; West, Robert L.
2017-01-01
The task of turning undergrads into academics requires teaching them to reason about the world in a more complex way. We present the Argument Complexity Scale, a tool for analysing the complexity of argumentation, based on the Integrative Complexity and Conceptual Complexity Scales from, respectively, political psychology and personality theory.…
del Moral, F; Vázquez, J A; Ferrero, J J; Willisch, P; Ramírez, R D; Teijeiro, A; López Medina, A; Andrade, B; Vázquez, J; Salvador, F; Medal, D; Salgado, M; Muñoz, V
2009-09-01
Modern radiotherapy uses complex treatments that necessitate more complex quality assurance procedures. As a continuous medium, GafChromic EBT films offer suitable features for such verification. However, their sensitometric curve is not fully understood in terms of classical theoretical models. In fact, measured optical densities and those predicted by the classical models differ significantly. This difference increases systematically with wider dose ranges. Thus, achieving the accuracy required for intensity-modulated radiotherapy (IMRT) by classical methods is not possible, precluding their use. As a result, experimental parametrizations, such as polynomial fits, are replacing phenomenological expressions in modern investigations. This article focuses on identifying new theoretical ways to describe sensitometric curves and on evaluating the quality of fit for experimental data based on four proposed models. A whole mathematical formalism starting with a geometrical version of the classical theory is used to develop new expressions for the sensitometric curves. General results from the percolation theory are also used. A flat-bed-scanner-based method was chosen for the film analysis. Different tests were performed, such as consistency of the numeric results for the proposed model and double examination using data from independent researchers. Results show that the percolation-theory-based model provides the best theoretical explanation for the sensitometric behavior of GafChromic films. The different sizes of active centers or monomer crystals of the film are the basis of this model, allowing acquisition of information about the internal structure of the films. Values for the mean size of the active centers were obtained in accordance with technical specifications. In this model, the dynamics of the interaction between the active centers of GafChromic film and radiation is also characterized by means of its interaction cross-section value. The percolation model fulfills the accuracy requirements for quality-control procedures when large ranges of doses are used and offers a physical explanation for the film response.
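As a concrete illustration of fitting a sensitometric curve to measured points, the sketch below fits a simple saturating function with SciPy. Both the functional form and the dose/optical-density values are invented for illustration; this is not the percolation-theory expression derived in the article.

import numpy as np
from scipy.optimize import curve_fit

def net_od(dose, a, b):
    # Simple saturation model: optical density grows and levels off with dose.
    return a * (1.0 - np.exp(-b * dose))

dose = np.array([0, 50, 100, 200, 400, 800], dtype=float)   # cGy (made up)
od = np.array([0.00, 0.12, 0.22, 0.37, 0.55, 0.70])         # net optical density (made up)

params, cov = curve_fit(net_od, dose, od, p0=[1.0, 0.005])
print(params)                  # fitted (a, b)
print(np.sqrt(np.diag(cov)))   # rough one-sigma uncertainties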
Daly, Louise; McCarron, Mary; Higgins, Agnes; McCallion, Philip
2013-02-01
This paper presents a theory explaining the processes used by informal carers of people with dementia to manage alterations to their own, and the person with dementia's, relationships with and places within their social worlds. Informal carers provide the majority of care to people with dementia. A great deal of international informal dementia care research is available, much of which elucidates the content, impacts and consequences of the informal caring role and the coping mechanisms that carers use. However, the socially situated experiences and processes integral to informal caring in dementia have not yet been robustly accounted for. A classic grounded theory approach was used as it is designed for research enquiries that aim to generate theory illustrating social patterns of action used to address an identified problem. Thirty interviews were conducted with 31 participants between 2006 and 2008. The theory was conceptualised from the data using the concurrent methods of theoretical sampling, constant comparative analysis, memo writing and theoretical sensitivity. Informal carers' main concern was identified as 'Living on the fringes', which was stimulated by dementia-related stigma and living a different life. The theory of 'Sustaining Place' explains the social pattern of actions employed by informal carers to manage this problem on behalf of themselves and the person with dementia. The theory of 'Sustaining Place' identifies an imperative for nurses, other formal carers and society to engage in actions to support and enable social connectedness, social inclusion and citizenship for informal carers and people with dementia. 'Sustaining Place' facilitates enhanced understanding of the complex and socially situated nature of informal dementia care through its portrayal of informal carers as social agents and can be used to guide nurses to better support those who live with dementia.
Altered cerebral blood flow velocity features in fibromyalgia patients in resting-state conditions
Rodríguez, Alejandro; Tembl, José; Mesa-Gresa, Patricia; Muñoz, Miguel Ángel; Montoya, Pedro
2017-01-01
The aim of this study is to characterize in resting-state conditions the cerebral blood flow velocity (CBFV) signals of fibromyalgia patients. The anterior and middle cerebral arteries of both hemispheres from 15 women with fibromyalgia and 15 healthy women were monitored using Transcranial Doppler (TCD) during a 5-minute eyes-closed resting period. Several signal processing methods based on time, information theory, frequency and time-frequency analyses were used in order to extract different features to characterize the CBFV signals in the different vessels. Main results indicated that, in comparison with control subjects, fibromyalgia patients showed a higher complexity of the envelope CBFV and a different distribution of the power spectral density. In addition, it has been observed that complexity and spectral features show correlations with clinical pain parameters and emotional factors. The characterization features were used in a linear model to discriminate between fibromyalgia patients and healthy controls, providing a high accuracy. These findings indicate that CBFV signals, specifically their complexity and spectral characteristics, contain information that may be relevant for the assessment of fibromyalgia patients in resting-state conditions. PMID:28700720
Altered cerebral blood flow velocity features in fibromyalgia patients in resting-state conditions.
Rodríguez, Alejandro; Tembl, José; Mesa-Gresa, Patricia; Muñoz, Miguel Ángel; Montoya, Pedro; Rey, Beatriz
2017-01-01
The aim of this study is to characterize in resting-state conditions the cerebral blood flow velocity (CBFV) signals of fibromyalgia patients. The anterior and middle cerebral arteries of both hemispheres from 15 women with fibromyalgia and 15 healthy women were monitored using Transcranial Doppler (TCD) during a 5-minute eyes-closed resting period. Several signal processing methods based on time, information theory, frequency and time-frequency analyses were used in order to extract different features to characterize the CBFV signals in the different vessels. Main results indicated that, in comparison with control subjects, fibromyalgia patients showed a higher complexity of the envelope CBFV and a different distribution of the power spectral density. In addition, it has been observed that complexity and spectral features show correlations with clinical pain parameters and emotional factors. The characterization features were used in a linear model to discriminate between fibromyalgia patients and healthy controls, providing a high accuracy. These findings indicate that CBFV signals, specifically their complexity and spectral characteristics, contain information that may be relevant for the assessment of fibromyalgia patients in resting-state conditions.
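As an example of the kind of spectral features referred to in both records above, the sketch below computes a Welch power spectral density, a normalized spectral entropy and the dominant frequency for a synthetic signal; the sampling rate and the signal itself are invented and are not TCD data.

import numpy as np
from scipy.signal import welch

fs = 100.0                               # assumed sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)
cbfv = np.sin(2 * np.pi * 1.0 * t) + 0.3 * np.random.default_rng(0).standard_normal(t.size)

freqs, psd = welch(cbfv, fs=fs, nperseg=1024)
p = psd / psd.sum()
p = p[p > 0]
spectral_entropy = -np.sum(p * np.log(p)) / np.log(len(p))   # normalized to [0, 1]
dominant_freq = freqs[np.argmax(psd)]
print(spectral_entropy, dominant_freq)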
How Does an Activity Theory Model Help to Know Better about Teaching with Electronic-Exercise-Bases?
ERIC Educational Resources Information Center
Abboud-Blanchard, Maha; Cazes, Claire
2012-01-01
The research presented in this paper relies on Activity Theory and particularly on Engestrom's model, to better understand the use of Electronic-Exercise-Bases (EEB) by mathematics teachers. This theory provides a holistic approach to illustrate the complexity of the EEB integration. The results highlight reasons and ways of using EEB and show…
The evolutionary basis of human social learning
Morgan, T. J. H.; Rendell, L. E.; Ehn, M.; Hoppitt, W.; Laland, K. N.
2012-01-01
Humans are characterized by an extreme dependence on culturally transmitted information. Such dependence requires the complex integration of social and asocial information to generate effective learning and decision making. Recent formal theory predicts that natural selection should favour adaptive learning strategies, but relevant empirical work is scarce and rarely examines multiple strategies or tasks. We tested nine hypotheses derived from theoretical models, running a series of experiments investigating factors affecting when and how humans use social information, and whether such behaviour is adaptive, across several computer-based tasks. The number of demonstrators, consensus among demonstrators, confidence of subjects, task difficulty, number of sessions, cost of asocial learning, subject performance and demonstrator performance all influenced subjects' use of social information, and did so adaptively. Our analysis provides strong support for the hypothesis that human social learning is regulated by adaptive learning rules. PMID:21795267
The evolutionary basis of human social learning.
Morgan, T J H; Rendell, L E; Ehn, M; Hoppitt, W; Laland, K N
2012-02-22
Humans are characterized by an extreme dependence on culturally transmitted information. Such dependence requires the complex integration of social and asocial information to generate effective learning and decision making. Recent formal theory predicts that natural selection should favour adaptive learning strategies, but relevant empirical work is scarce and rarely examines multiple strategies or tasks. We tested nine hypotheses derived from theoretical models, running a series of experiments investigating factors affecting when and how humans use social information, and whether such behaviour is adaptive, across several computer-based tasks. The number of demonstrators, consensus among demonstrators, confidence of subjects, task difficulty, number of sessions, cost of asocial learning, subject performance and demonstrator performance all influenced subjects' use of social information, and did so adaptively. Our analysis provides strong support for the hypothesis that human social learning is regulated by adaptive learning rules.
Cameron, Linda D.; Biesecker, Barbara Bowles; Peters, Ellen; Taber, Jennifer M.; Klein, William M. P.
2017-01-01
Advances in theory and research on self-regulation and decision-making processes have yielded important insights into how cognitive, emotional, and social processes shape risk perceptions and risk-related decisions. We examine how self-regulation theory can be applied to inform our understanding of decision-making processes within the context of genomic testing, a clinical arena in which individuals face complex risk information and potentially life-altering decisions. After presenting key principles of self-regulation, we present a genomic testing case example to illustrate how principles related to risk representations, approach and avoidance motivations, emotion regulation, defensive responses, temporal construals, and capacities such as numeric abilities can shape decisions and psychological responses during the genomic testing process. We conclude with implications for using self-regulation theory to advance science within genomic testing and opportunities for how this research can inform further developments in self-regulation theory. PMID:29225669
Cameron, Linda D; Biesecker, Barbara Bowles; Peters, Ellen; Taber, Jennifer M; Klein, William M P
2017-05-01
Advances in theory and research on self-regulation and decision-making processes have yielded important insights into how cognitive, emotional, and social processes shape risk perceptions and risk-related decisions. We examine how self-regulation theory can be applied to inform our understanding of decision-making processes within the context of genomic testing, a clinical arena in which individuals face complex risk information and potentially life-altering decisions. After presenting key principles of self-regulation, we present a genomic testing case example to illustrate how principles related to risk representations, approach and avoidance motivations, emotion regulation, defensive responses, temporal construals, and capacities such as numeric abilities can shape decisions and psychological responses during the genomic testing process. We conclude with implications for using self-regulation theory to advance science within genomic testing and opportunities for how this research can inform further developments in self-regulation theory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beigi, Salman
Sandwiched (quantum) α-Rényi divergence has been recently defined in the independent works of Wilde et al. [“Strong converse for the classical capacity of entanglement-breaking channels,” preprint http://arxiv.org/abs/arXiv:1306.1586 (2013)] and Müller-Lennert et al. [“On quantum Rényi entropies: a new definition, some properties and several conjectures,” preprint http://arxiv.org/abs/arXiv:1306.3142v1 (2013)]. This new quantum divergence has already found applications in quantum information theory. Here we further investigate properties of this new quantum divergence. In particular, we show that sandwiched α-Rényi divergence satisfies the data processing inequality for all values of α > 1. Moreover we prove that α-Holevo information, a variant of Holevo information defined in terms of sandwiched α-Rényi divergence, is super-additive. Our results are based on Hölder's inequality, the Riesz-Thorin theorem and ideas from the theory of complex interpolation. We also employ Sion's minimax theorem.
The mourning before: can anticipatory grief theory inform family care in adult intensive care?
Coombs, Maureen A
2010-12-01
Although anticipatory grief is a much-debated and critiqued bereavement concept, it does offer a way of understanding and exploring expected loss that may be helpful in certain situations. In end-of-life care in adult intensive care units, families often act as proxy decision makers for patients in the transition from curative treatment efforts to planned treatment withdrawal. Despite there being a developed evidence base to inform care of families at this time, few of the clinical studies that provided this evidence were underpinned by bereavement theory. Focusing on end-of-life intensive care practices, this paper integrates work on anticipatory grief and family interventions to present a family-centred framework of care. Through this it is argued that the complex needs of families must be more comprehensively understood by doctors and nurses and that interventions must be more systematically planned to improve quality end-of-life care for families in this setting.
Self-organization of meaning and the reflexive communication of information
Leydesdorff, Loet; Petersen, Alexander M.; Ivanova, Inga
2017-01-01
Following a suggestion from Warren Weaver, we extend the Shannon model of communication piecemeal into a complex systems model in which communication is differentiated both vertically and horizontally. This model enables us to bridge the divide between Niklas Luhmann’s theory of the self-organization of meaning in communications and empirical research using information theory. First, we distinguish between communication relations and correlations among patterns of relations. The correlations span a vector space in which relations are positioned and can be provided with meaning. Second, positions provide reflexive perspectives. Whereas the different meanings are integrated locally, each instantiation opens global perspectives – ‘horizons of meaning’ – along eigenvectors of the communication matrix. These next-order codifications of meaning can be expected to generate redundancies when interacting in instantiations. Increases in redundancy indicate new options and can be measured as local reduction of prevailing uncertainty (in bits). The systemic generation of new options can be considered as a hallmark of the knowledge-based economy. PMID:28232771
Sandwiched Rényi divergence satisfies data processing inequality
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beigi, Salman
2013-12-15
Sandwiched (quantum) α-Rényi divergence has been recently defined in the independent works of Wilde et al. [“Strong converse for the classical capacity of entanglement-breaking channels,” preprint http://arxiv.org/abs/arXiv:1306.1586 (2013)] and Müller-Lennert et al. [“On quantum Rényi entropies: a new definition, some properties and several conjectures,” preprint http://arxiv.org/abs/arXiv:1306.3142v1 (2013)]. This new quantum divergence has already found applications in quantum information theory. Here we further investigate properties of this new quantum divergence. In particular, we show that sandwiched α-Rényi divergence satisfies the data processing inequality for all values of α > 1. Moreover we prove that α-Holevo information, a variant of Holevo information defined in terms of sandwiched α-Rényi divergence, is super-additive. Our results are based on Hölder's inequality, the Riesz-Thorin theorem and ideas from the theory of complex interpolation. We also employ Sion's minimax theorem.
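For reference, the sandwiched α-Rényi divergence of two full-rank states can be evaluated numerically straight from its definition, D_alpha(rho||sigma) = (1/(alpha-1)) * log Tr[(sigma^((1-alpha)/(2*alpha)) rho sigma^((1-alpha)/(2*alpha)))^alpha]. The sketch below does this for two invented qubit density matrices; it illustrates the definition only, not the paper's proofs.

import numpy as np
from scipy.linalg import fractional_matrix_power as mpow

def sandwiched_renyi(rho, sigma, alpha):
    s = mpow(sigma, (1 - alpha) / (2 * alpha))
    inner = s @ rho @ s
    return np.log(np.trace(mpow(inner, alpha)).real) / (alpha - 1)

rho = np.array([[0.7, 0.2], [0.2, 0.3]])      # a valid qubit density matrix (made up)
sigma = np.array([[0.5, 0.0], [0.0, 0.5]])    # the maximally mixed qubit state
print(sandwiched_renyi(rho, sigma, alpha=2.0))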
Evaluating hydrological model performance using information theory-based metrics
USDA-ARS's Scientific Manuscript database
Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can be used as a complementary tool for hydrologic m...
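The information-theory metrics referred to here are computed on a symbolized version of the streamflow series, with each value mapped to a quantile bin. A minimal sketch of that symbolization and of a mean-information-gain-style block-entropy difference follows; the quantile count, word length and synthetic series are illustrative assumptions, not the study's settings.

import numpy as np

def symbolize(series, n_symbols=4):
    # Map each value to the index of its quantile bin (0 .. n_symbols - 1).
    edges = np.quantile(series, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(series, edges)

def mean_information_gain(symbols, L=2):
    # H(words of length L) - H(words of length L - 1): average new information per symbol.
    def block_entropy(k):
        words, counts = np.unique(
            [tuple(symbols[i:i + k]) for i in range(len(symbols) - k + 1)],
            axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))
    return block_entropy(L) - block_entropy(L - 1)

rng = np.random.default_rng(0)
flow = np.cumsum(rng.standard_normal(3650)) + 100.0   # synthetic 10-year daily "streamflow"
print(mean_information_gain(symbolize(flow)))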
Ford, John A; Jones, Andrew P; Wong, Geoff; Clark, Allan B; Porter, Tom; Shakespeare, Tom; Swart, Ann Marie; Steel, Nicholas
2015-09-18
The UK has an ageing population, especially in rural areas, where deprivation is high among older people. Previous research has identified this group as at high risk of poor access to healthcare. The aim of this study is to generate a theory of how socioeconomically disadvantaged older people from rural areas access primary care, to develop an intervention based on this theory and test it in a feasibility trial. On the basis of the MRC Framework for Developing and Evaluating Complex Interventions, three methods will be used to generate the theory. First, a realist review will elucidate the patient pathway based on existing literature. Second, an analysis of the English Longitudinal Study of Ageing will be completed using structural equation modelling. Third, 15 semistructured interviews will be undertaken with patients and four focus groups with health professionals. A triangulation protocol will be used to allow each of these methods to inform and be informed by each other, and to integrate data into one overall realist theory. Based on this theory, an intervention will be developed in discussion with stakeholders to ensure that the intervention is feasible and practical. The intervention will be tested within a feasibility trial, the design of which will depend on the intervention. Lessons from the feasibility trial will be used to refine the intervention and gather the information needed for a definitive trial. Ethics approval from the regional ethics committee has been granted for the focus groups with health professionals and interviews with patients. Ethics approval will be sought for the feasibility trial after the intervention has been designed. Findings will be disseminated to the key stakeholders involved in intervention development, to researchers, clinicians and health planners through peer-reviewed journal articles and conference publications, and locally through a dissemination event.
NASA Technical Reports Server (NTRS)
Hargittai, M.
1980-01-01
The structural chemistry of complexes between aluminum chloride and other metal chlorides is important both for practice and theory. Condensed-phase as well as vapor-phase complexes are of interest. Structural information on such complexes is reviewed. The first emphasis is given to the molten state because of its practical importance. Aluminum chloride forms volatile complexes with other metal chlorides and these vapor-phase complexes are dealt with in the second part. Finally, the variations in molecular shape and geometrical parameters are summarized.
Towards Information Polycentricity Theory--Investigation of a Hospital Revenue Cycle
ERIC Educational Resources Information Center
Singh, Rajendra
2011-01-01
This research takes steps towards developing a new theory of organizational information management based on the ideas that, first, information creates ordering effects in transactions and, second, that there are multiple centers of authority in organizations. The rationale for developing this theory is the empirical observation that hospitals have…
Fast ITTBC using pattern code on subband segmentation
NASA Astrophysics Data System (ADS)
Koh, Sung S.; Kim, Hanchil; Lee, Kooyoung; Kim, Hongbin; Jeong, Hun; Cho, Gangseok; Kim, Chunghwa
2000-06-01
Iterated Transformation Theory-Based Coding suffers from very high computational complexity in the encoding phase, due to its exhaustive search. In this paper, the proposed image coding algorithm preprocesses the original image into a subband segmentation image by wavelet transform before coding, in order to reduce encoding complexity. Similar blocks are searched using 24 block pattern codes, which encode the edge information of image blocks, over the domain pool of the subband segmentation. Numerical data show that the encoding time of the proposed method can be reduced to 98.82% of that of Jacquin's method, while the loss in quality relative to Jacquin's is about 0.28 dB in PSNR, which is visually negligible.
ERIC Educational Resources Information Center
Clarke, Simon; Wildy, Helen
2010-01-01
This paper examines the theories of organisation that have informed our understanding of schools as complex social worlds and the practice of school leadership that seems to be required in such environments. This understanding has, in turn, determined the research approach we have adopted for investigating principals' work and for ascertaining the…
Research on application of intelligent computation based LUCC model in urbanization process
NASA Astrophysics Data System (ADS)
Chen, Zemin
2007-06-01
Global change study is an interdisciplinary and comprehensive research activity, carried out with international cooperation, that arose in the 1980s and has a very broad scope. The interaction between land use and cover change (LUCC), as a research field at the crossing of natural science and social science, has become one of the core subjects of global change study as well as its frontier and focus. It is necessary to develop research on land use and cover change in the urbanization process and to build a simulation model of urbanization in order to describe, simulate and analyze the dynamic behaviors of urban development change and to understand the basic characteristics and rules of the urbanization process. This has positive practical and theoretical significance for formulating urban and regional sustainable development strategies. The effect of urbanization on land use and cover change is mainly embodied in the change of the quantity structure and spatial structure of urban space, and the LUCC model in the urbanization process has been an important research subject of urban geography and urban planning. In this paper, based upon previous research achievements, the writer systematically analyzes the research on land use/cover change in the urbanization process with the theories of complexity science and intelligent computation; builds a model for simulating and forecasting the dynamic evolution of urban land use and cover change on the basis of the cellular automaton model of complexity science and multi-agent theory; expands the Markov model, the traditional CA model and the agent model; introduces complexity science theory and intelligent computation theory into the LUCC research model to build an intelligent-computation-based LUCC model for simulation research on land use and cover change in urbanization; and performs case research. The concrete contents are as follows: 1. Complexity of LUCC research in the urbanization process. The urbanization process is analyzed in combination with the contents of complexity science and the conception of complexity features to reveal the complexity of LUCC research in the urbanization process. The urban space system is a complex economic and cultural phenomenon as well as a social process; it is the comprehensive characterization of urban society, economy and culture, and a complex spatial system formed by society, economy and nature. It has dissipative structure characteristics such as openness, dynamics, self-organization and non-equilibrium. Traditional models cannot simulate these social, economic and natural driving forces of LUCC, including the main feedback relation from LUCC to the driving forces. 2. Establishment of an extended Markov model for LUCC simulation research in the urbanization process. First, the traditional LUCC research model is used to compute the change speed of regional land use by calculating the dynamic degree, exploitation degree and consumption degree of land use; the theory of fuzzy sets is used to rewrite the traditional Markov model, establish the structure transfer matrix of land use, forecast and analyze the dynamic change and development trend of land use, and present noticeable problems and corresponding measures in the urbanization process according to the research results. 3. Application of intelligent computation and complexity science research methods in the LUCC simulation model of the urbanization process. On the basis of a detailed elaboration of the theory and the model of LUCC research in the urbanization process, the problems of existing models used in LUCC research are analyzed (namely, the difficulty of resolving many complexity phenomena in the complex urban space system), and possible structural realization forms of LUCC simulation research are discussed in combination with the theories of intelligent computation and complexity science. Application analyses are performed on the BP artificial neural network and genetic algorithms of intelligent computation and on the CA model and MAS technology of complexity science; their theoretical origins and characteristics are discussed in detail, their feasibility in LUCC simulation research is elaborated, and improvement methods and measures for the existing problems of this kind of model are brought forward. 4. Establishment of a LUCC simulation model of the urbanization process based on the theories of intelligent computation and complexity science. Based on the research on the abovementioned BP artificial neural network, genetic algorithms, CA model and multi-agent technology, improvement methods and application assumptions for their extension to geography are put forward; a LUCC simulation model of the urbanization process is built based on the CA model and the agent model; the learning mechanism of the BP artificial neural network is combined with fuzzy logic reasoning, the rules are expressed with explicit formulas, and the initial rules are amended through self-learning; the network structure of the LUCC simulation model and the methods and procedures for the model parameters are optimized with genetic algorithms. In this paper, I introduce the research theory and methods of complexity science into LUCC simulation research and present a LUCC simulation model based upon the CA model and MAS theory. Meanwhile, I carry out a corresponding expansion of the traditional Markov model and introduce the theory of fuzzy sets into the data screening and parameter amendment of the improved model to improve the accuracy and feasibility of the Markov model in research on land use/cover change.
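To illustrate the Markov component of such a CA-Markov land-use model, the sketch below projects land-use area shares forward with a transition matrix; the three classes, transition probabilities and initial shares are invented for illustration and are not taken from the study.

import numpy as np

classes = ["urban", "agriculture", "forest"]
# P[i, j] = probability that a cell of class i converts to class j in one time step.
P = np.array([[0.98, 0.01, 0.01],
              [0.10, 0.85, 0.05],
              [0.05, 0.05, 0.90]])

area = np.array([20.0, 50.0, 30.0])   # initial area shares (%)
for step in range(5):                 # project five time steps ahead
    area = area @ P
print(dict(zip(classes, np.round(area, 1))))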
Noncovalent Interactions of DNA Bases with Naphthalene and Graphene.
Cho, Yeonchoo; Min, Seung Kyu; Yun, Jeonghun; Kim, Woo Youn; Tkatchenko, Alexandre; Kim, Kwang S
2013-04-09
The complexes of a DNA base bound to graphitic systems are studied. Considering naphthalene as the simplest graphitic system, DNA base-naphthalene complexes are scrutinized at high levels of ab initio theory including coupled cluster theory with singles, doubles, and perturbative triples excitations [CCSD(T)] at the complete basis set (CBS) limit. The stacked configurations are the most stable, where the CCSD(T)/CBS binding energies of guanine, adenine, thymine, and cytosine are 9.31, 8.48, 8.53, 7.30 kcal/mol, respectively. The energy components are investigated using symmetry-adapted perturbation theory based on density functional theory including the dispersion energy. We compared the CCSD(T)/CBS results with several density functional methods applicable to periodic systems. Considering accuracy and availability, the optB86b nonlocal functional and the Tkatchenko-Scheffler functional are used to study the binding energies of nucleobases on graphene. The predicted values are 18-24 kcal/mol, though many-body effects on screening and energy need to be further considered.
Agent-Based Models in Social Physics
NASA Astrophysics Data System (ADS)
Quang, Le Anh; Jung, Nam; Cho, Eun Sung; Choi, Jae Han; Lee, Jae Woo
2018-06-01
We review agent-based models (ABMs) in social physics, including econophysics. An ABM consists of agents, a system space, and an external environment. Each agent is autonomous and decides its behavior by interacting with its neighbors or the external environment according to rules of behavior. Agents are only boundedly rational because they have limited information when they make decisions; they adapt by learning from past memories. Agents have various attributes and are heterogeneous. An ABM is a non-equilibrium complex system that exhibits various emergent phenomena. Social complexity ABMs describe human behavioral characteristics. Among ABMs of econophysics, we introduce the Sugarscape model and artificial market models. We review minority games and majority games in ABMs of game theory. Social flow ABMs address crowding, evacuation, traffic congestion, and pedestrian dynamics. We also review ABMs for opinion dynamics and the voter model. We discuss the features, advantages, and disadvantages of NetLogo, Repast, Swarm, and MASON, which are representative platforms for implementing ABMs.
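As a minimal example of one of the models mentioned above, the sketch below runs a voter model on a ring of agents, each repeatedly copying the opinion of a randomly chosen neighbour; the population size and number of updates are arbitrary.

import numpy as np

rng = np.random.default_rng(1)
n = 100
opinions = rng.integers(0, 2, size=n)          # each agent holds opinion 0 or 1

for _ in range(5000):
    i = rng.integers(n)                        # pick a random agent...
    j = (i + rng.choice([-1, 1])) % n          # ...and a random neighbour on the ring
    opinions[i] = opinions[j]                  # the agent copies the neighbour's opinion

print(opinions.mean())                         # fraction of agents holding opinion 1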
Emergence of grouping in multi-resource minority game dynamics
NASA Astrophysics Data System (ADS)
Huang, Zi-Gang; Zhang, Ji-Qiang; Dong, Jia-Qi; Huang, Liang; Lai, Ying-Cheng
2012-10-01
Complex systems arising in a modern society typically have many resources and strategies available for their dynamical evolution. To explore quantitatively the behaviors of such systems, we propose a class of models to investigate Minority Game (MG) dynamics with multiple strategies. In particular, agents tend to choose the least used strategies based on available local information. A striking finding is the emergence of grouping states defined in terms of distinct strategies. We develop an analytic theory based on the mean-field framework to understand the "bifurcations" of the grouping states. The grouping phenomenon has also been identified in the Shanghai stock market, and we discuss its prevalence in other real-world systems. Our work demonstrates that complex systems obeying the MG rules can spontaneously self-organize into certain divided states, and our model represents a basic and general mathematical framework to address this kind of phenomenon in social, economic and political systems.
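For readers unfamiliar with the basic Minority Game payoff rule (agents on the less crowded side win), the toy sketch below uses a simple shared reinforcement rule rather than the canonical strategy tables or the multi-resource extension studied here; all parameters are arbitrary.

import numpy as np

rng = np.random.default_rng(0)
n_agents, n_rounds = 101, 500                   # odd number of agents, so a minority always exists
propensity = np.zeros((n_agents, 2))            # running score of each side, per agent
attendance = []

for _ in range(n_rounds):
    # Probability of picking side 1 grows with how often side 1 has been the minority.
    p_one = 1.0 / (1.0 + np.exp(propensity[:, 0] - propensity[:, 1]))
    choices = (rng.random(n_agents) < p_one).astype(int)
    minority = int(choices.sum() < n_agents / 2)    # side chosen by fewer agents
    propensity[:, minority] += 1                    # agents reinforce the winning side
    attendance.append(choices.sum())

print("std of attendance:", np.std(attendance))     # fluctuation around n_agents / 2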
Aventin, Áine; Lohan, Maria; O'Halloran, Peter; Henderson, Marion
2015-04-01
Following the UK Medical Research Council's (MRC) guidelines for the development and evaluation of complex interventions, this study aimed to design, develop and optimise an educational intervention about young men and unintended teenage pregnancy based around an interactive film. The process involved identification of the relevant evidence base, development of a theoretical understanding of the phenomenon of unintended teenage pregnancy in relation to young men, and exploratory mixed methods research. The result was an evidence-based, theory-informed, user-endorsed intervention designed to meet the much neglected pregnancy education needs of teenage men and intended to increase both boys' and girls' intentions to avoid an unplanned pregnancy during adolescence. In prioritising the development phase, this paper addresses a gap in the literature on the processes of research-informed intervention design. It illustrates the application of the MRC guidelines in practice while offering a critique and additional guidance to programme developers on the MRC prescribed processes of developing interventions. Key lessons learned were: (1) know and engage the target population and engage gatekeepers in addressing contextual complexities; (2) know the targeted behaviours and model a process of change; and (3) look beyond development to evaluation and implementation.
Reasoning Backwards by Design: Commentary on "Moral Reasoning among HEC Members".
Stephens, Ashley L; Heitman, Elizabeth
2015-01-01
Empirical assessment of the practice of clinical ethics is made difficult by the limited standardization of settings, structures, processes, roles, and training for ethics consultation, as well as by whether individual ethics consultants or hospital ethics committees (HECs) provide consultation. Efforts to study the relationship between theory and practice in the work of HECs likewise require the spelling out of assumptions and definition of key variables, based in knowledge of the core concepts of clinical ethics and logistics of clinical consultation. The survey of HEC members reported by Wasserman and colleagues illustrates the difficulty of such research and calls attention to the need for studies of real-time, complex decision making to inform conclusions about how theory affects practice.
Sebire, Simon J; Kesten, Joanna M; Edwards, Mark J; May, Thomas; Banfield, Kathryn; Tomkinson, Keeley; Blair, Peter S; Bird, Emma L; Powell, Jane E; Jago, Russell
2016-05-01
To report the theory-based process evaluation of the Bristol Girls' Dance Project, a cluster-randomised controlled trial to increase adolescent girls' physical activity. A mixed-method process evaluation of the intervention's self-determination theory components comprising lesson observations, post-intervention interviews and focus groups. Four intervention dance lessons per dance instructor were observed, audio recorded and rated to estimate the use of need-supportive teaching strategies. Intervention participants (n = 281) reported their dance instructors' provision of autonomy-support. Semi-structured interviews with the dance instructors (n = 10) explored fidelity to the theory and focus groups were conducted with participants (n = 59) in each school to explore their receipt of the intervention and views on the dance instructors' motivating style. Although instructors accepted the theory-based approach, intervention fidelity was variable. Relatedness support was the most commonly observed need-supportive teaching behaviour, provision of structure was moderate and autonomy-support was comparatively low. The qualitative findings identified how instructors supported competence and developed trusting relationships with participants. Fidelity was challenged where autonomy provision was limited to option choices rather than input into the pace or direction of lessons and where controlling teaching styles were adopted, often to manage disruptive behaviour. The successes and challenges to achieving theoretical fidelity in the Bristol Girls' Dance Project may help explain the intervention effects and can more broadly inform the design of theory-based complex interventions aimed at increasing young people's physical activity in after-school settings.
Sebire, Simon J.; Kesten, Joanna M.; Edwards, Mark J.; May, Thomas; Banfield, Kathryn; Tomkinson, Keeley; Blair, Peter S.; Bird, Emma L.; Powell, Jane E.; Jago, Russell
2016-01-01
Objectives: To report the theory-based process evaluation of the Bristol Girls' Dance Project, a cluster-randomised controlled trial to increase adolescent girls' physical activity. Design: A mixed-method process evaluation of the intervention's self-determination theory components comprising lesson observations, post-intervention interviews and focus groups. Method: Four intervention dance lessons per dance instructor were observed, audio recorded and rated to estimate the use of need-supportive teaching strategies. Intervention participants (n = 281) reported their dance instructors' provision of autonomy-support. Semi-structured interviews with the dance instructors (n = 10) explored fidelity to the theory and focus groups were conducted with participants (n = 59) in each school to explore their receipt of the intervention and views on the dance instructors' motivating style. Results: Although instructors accepted the theory-based approach, intervention fidelity was variable. Relatedness support was the most commonly observed need-supportive teaching behaviour, provision of structure was moderate and autonomy-support was comparatively low. The qualitative findings identified how instructors supported competence and developed trusting relationships with participants. Fidelity was challenged where autonomy provision was limited to option choices rather than input into the pace or direction of lessons and where controlling teaching styles were adopted, often to manage disruptive behaviour. Conclusion: The successes and challenges to achieving theoretical fidelity in the Bristol Girls' Dance Project may help explain the intervention effects and can more broadly inform the design of theory-based complex interventions aimed at increasing young people's physical activity in after-school settings. PMID:27175102
Revisiting the European sovereign bonds with a permutation-information-theory approach
NASA Astrophysics Data System (ADS)
Fernández Bariviera, Aurelio; Zunino, Luciano; Guercio, María Belén; Martinez, Lisana B.; Rosso, Osvaldo A.
2013-12-01
In this paper we study the evolution of the informational efficiency in its weak form for seventeen European sovereign bond time series. We aim to assess the impact of two specific economic situations on the hypothetical random behavior of these time series: the establishment of a common currency and a wide and deep financial crisis. In order to evaluate the informational efficiency we use permutation quantifiers derived from information theory. Specifically, time series are ranked according to two metrics that measure the intrinsic structure of their correlations: permutation entropy and permutation statistical complexity. These measures provide the rectangular coordinates of the complexity-entropy causality plane; the planar location of the time series in this representation space reveals the degree of informational efficiency. According to our results, the currency union contributed to homogenizing the stochastic characteristics of the time series and produced synchronization in their random behavior. Additionally, the 2008 financial crisis uncovered differences within the apparently homogeneous European sovereign markets and revealed country-specific characteristics that were partially hidden during the monetary union heyday.
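The permutation entropy quantifier used here follows the Bandt-Pompe recipe: count the ordinal patterns of consecutive values and take the (normalized) Shannon entropy of their distribution. The sketch below is a minimal illustration; the embedding dimension and the test series are arbitrary choices rather than the paper's settings.

import numpy as np
from itertools import permutations
from math import factorial

def permutation_entropy(x, d=4, normalize=True):
    x = np.asarray(x)
    patterns = {p: 0 for p in permutations(range(d))}
    for i in range(len(x) - d + 1):
        patterns[tuple(np.argsort(x[i:i + d]))] += 1   # ordinal pattern of this window
    counts = np.array([c for c in patterns.values() if c > 0], dtype=float)
    p = counts / counts.sum()
    h = -np.sum(p * np.log(p))
    return h / np.log(factorial(d)) if normalize else h

rng = np.random.default_rng(0)
print(permutation_entropy(rng.standard_normal(2000)))         # near 1 for white noise (efficient)
print(permutation_entropy(np.sin(np.linspace(0, 40, 2000))))  # well below 1 for a regular signal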
Pattern formation based on complex coupling mechanism in dielectric barrier discharge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Weibo; College of Aeronautical Engineering, Binzhou University, Binzhou 256603; Dong, Lifang, E-mail: donglfhbu@163.com, E-mail: pyy1616@163.com
2016-08-15
The pattern formation of cinque-dice square superlattice pattern (CDSSP) is investigated based on the complex coupling mechanism in a dielectric barrier discharge (DBD) system. The spatio-temporal structure of CDSSP obtained by using an intensified charge-coupled device indicates that CDSSP is an interleaving of two kinds of subpatterns (mixture of rectangle and square, and dot-line square) which discharge twice in one half voltage, respectively. Selected by the complex coupling of two subpatterns, the CDSSP can be formed and shows good stability. This investigation based on gas discharge theory together with nonlinear theory may provide a deeper understanding for the nonlinear characteristics and even the formation mechanism of patterns in DBD.
Skelton, JA; Buehler, C; Irby, MB; Grzywacz, JG
2014-01-01
Family-based approaches to pediatric obesity treatment are considered the ‘gold-standard,’ and are recommended for facilitating behavior change to improve child weight status and health. If family-based approaches are to be truly rooted in the family, clinicians and researchers must consider family process and function in designing effective interventions. To bring a better understanding of family complexities to family-based treatment, two relevant reviews were conducted and are presented: (1) a review of prominent and established theories of the family that may provide a more comprehensive and in-depth approach for addressing pediatric obesity; and (2) a systematic review of the literature to identify the use of prominent family theories in pediatric obesity research, which found little use of theories in intervention studies. Overlapping concepts across theories include: families are a system, with interdependence of units; the idea that families are goal-directed and seek balance; and the physical and social environment imposes demands on families. Family-focused theories provide valuable insight into the complexities of families. Increased use of these theories in both research and practice may identify key leverage points in family process and function to prevent the development of or more effectively treat obesity. The field of family studies provides an innovative approach to the difficult problem of pediatric obesity, building on the long-established approach of family-based treatment. PMID:22531090
Toward a Definition of Complexity for Quantum Field Theory States.
Chapman, Shira; Heller, Michal P; Marrochio, Hugo; Pastawski, Fernando
2018-03-23
We investigate notions of complexity of states in continuous many-body quantum systems. We focus on Gaussian states which include ground states of free quantum field theories and their approximations encountered in the context of the continuous version of the multiscale entanglement renormalization ansatz. Our proposal for quantifying state complexity is based on the Fubini-Study metric. It leads to counting the number of applications of each gate (infinitesimal generator) in the transformation, subject to a state-dependent metric. We minimize the defined complexity with respect to momentum-preserving quadratic generators which form su(1,1) algebras. On the manifold of Gaussian states generated by these operations, the Fubini-Study metric factorizes into hyperbolic planes with minimal complexity circuits reducing to known geodesics. Despite working with quantum field theories far outside the regime where Einstein gravity duals exist, we find striking similarities between our results and those of holographic complexity proposals.
Toward a Definition of Complexity for Quantum Field Theory States
NASA Astrophysics Data System (ADS)
Chapman, Shira; Heller, Michal P.; Marrochio, Hugo; Pastawski, Fernando
2018-03-01
We investigate notions of complexity of states in continuous many-body quantum systems. We focus on Gaussian states which include ground states of free quantum field theories and their approximations encountered in the context of the continuous version of the multiscale entanglement renormalization ansatz. Our proposal for quantifying state complexity is based on the Fubini-Study metric. It leads to counting the number of applications of each gate (infinitesimal generator) in the transformation, subject to a state-dependent metric. We minimize the defined complexity with respect to momentum-preserving quadratic generators which form su(1,1) algebras. On the manifold of Gaussian states generated by these operations, the Fubini-Study metric factorizes into hyperbolic planes with minimal complexity circuits reducing to known geodesics. Despite working with quantum field theories far outside the regime where Einstein gravity duals exist, we find striking similarities between our results and those of holographic complexity proposals.
Graph-based linear scaling electronic structure theory.
Niklasson, Anders M N; Mniszewski, Susan M; Negre, Christian F A; Cawkwell, Marc J; Swart, Pieter J; Mohd-Yusof, Jamal; Germann, Timothy C; Wall, Michael E; Bock, Nicolas; Rubensson, Emanuel H; Djidjev, Hristo
2016-06-21
We show how graph theory can be combined with quantum theory to calculate the electronic structure of large complex systems. The graph formalism is general and applicable to a broad range of electronic structure methods and materials, including challenging systems such as biomolecules. The methodology combines well-controlled accuracy, low computational cost, and natural low-communication parallelism. This combination addresses substantial shortcomings of linear scaling electronic structure theory, in particular with respect to quantum-based molecular dynamics simulations.
Graph-based linear scaling electronic structure theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Niklasson, Anders M. N., E-mail: amn@lanl.gov; Negre, Christian F. A.; Cawkwell, Marc J.
2016-06-21
We show how graph theory can be combined with quantum theory to calculate the electronic structure of large complex systems. The graph formalism is general and applicable to a broad range of electronic structure methods and materials, including challenging systems such as biomolecules. The methodology combines well-controlled accuracy, low computational cost, and natural low-communication parallelism. This combination addresses substantial shortcomings of linear scaling electronic structure theory, in particular with respect to quantum-based molecular dynamics simulations.
Ng, Stella L
2013-05-01
The discipline of audiology has the opportunity to embark on research in education from an informed perspective, learning from professions that began this journey decades ago. The goal of this article is to position our discipline as a new member in the academic field of health professional education (HPE), with much to learn and contribute. In this article, I discuss the need for theory in informing HPE research. I also stress the importance of balancing our research goals by selecting appropriate methodologies for relevant research questions, to ensure that we respect the complexity of social processes inherent in HPE. Examples of relevant research questions are used to illustrate the need to consider alternative methodologies and to rethink the traditional hierarchy of evidence. I also provide an example of the thought processes and decisions that informed the design of an educational research study using a constructivist grounded theory methodology. As audiology enters the scholarly field of HPE, we need to arm ourselves with some of the knowledge and perspective that informs the field. Thus, we need to broaden our conceptions of what we consider to be appropriate styles of academic writing, relevant research questions, and valid evidence. Also, if we are to embark on qualitative inquiry into audiology education (or other audiology topics), we need to ensure that we conduct this research with an adequate understanding of the theories and methodologies informing such approaches. We must strive to conduct high quality, rigorous qualitative research more often than uninformed, generic qualitative research. These goals are imperative to the advancement of the theoretical landscape of audiology education and evolving the place of audiology in the field of HPE. American Academy of Audiology.
Complexity theory and physical unification: From microscopic to macroscopic level
NASA Astrophysics Data System (ADS)
Pavlos, G. P.; Iliopoulos, A. C.; Karakatsanis, L. P.; Tsoutsouras, V. G.; Pavlos, E. G.
During the last two decades, low dimensional chaotic or self-organized criticality (SOC) processes have been observed by our group in many different physical systems such as space plasmas, the solar or the magnetospheric dynamics, the atmosphere, earthquakes, the brain activity as well as in informational systems. All these systems are complex systems living far from equilibrium with strong self-organization and phase transition character. The theoretical interpretation of these natural phenomena needs a deeper insight into the fundamentals of complexity theory. In this study, we try to give a synoptic description of complexity theory both at the microscopic and at the macroscopic level of the physical reality. Also, we propose that the self-organization observed macroscopically is a phenomenon that reveals the strong unifying character of the complex dynamics which includes thermodynamical and dynamical characteristics in all levels of the physical reality. From this point of view, macroscopic deterministic and stochastic processes are closely related to the microscopical chaos and self-organization. In this study the scientific work of scientists such as Wilson, Nicolis, Prigogine, Hooft, Nottale, El Naschie, Castro, Tsallis, Chang and others is used for the development of a unified physical comprehension of complex dynamics from the microscopic to the macroscopic level.
How Decision Support Systems Can Benefit from a Theory of Change Approach.
Allen, Will; Cruz, Jennyffer; Warburton, Bruce
2017-06-01
Decision support systems are now mostly computer and internet-based information systems designed to support land managers with complex decision-making. However, there is concern that many environmental and agricultural decision support systems remain underutilized and ineffective. Recent efforts to improve decision support systems use have focused on enhancing stakeholder participation in their development, but a mismatch between stakeholders' expectations and the reality of decision support systems outputs continues to limit uptake. Additional challenges remain in problem-framing and evaluation. We propose using an outcomes-based approach called theory of change in conjunction with decision support systems development to support both wider problem-framing and outcomes-based monitoring and evaluation. The theory of change helps framing by placing the decision support systems within a wider context. It highlights how decision support systems use can "contribute" to long-term outcomes, and helps align decision support systems outputs with these larger goals. We illustrate the benefits of linking decision support systems development and application with a theory of change approach using an example of pest rabbit management in Australia. We develop a theory of change that outlines the activities required to achieve the outcomes desired from an effective rabbit management program, and two decision support systems that contribute to specific aspects of decision making in this wider problem context. Using a theory of change in this way should increase acceptance of the role of decision support systems by end-users, clarify their limitations and, importantly, increase effectiveness of rabbit management. The use of a theory of change should benefit those seeking to improve decision support systems design, use, and evaluation.
Brand, Sarah L.; Fleming, Lora E.; Wyatt, Katrina M.
2015-01-01
Many healthy workplace interventions have been developed for healthcare settings to address the consistently low scores of healthcare professionals on assessments of mental and physical well-being. Complex healthcare settings present challenges for the scale-up and spread of successful interventions from one setting to another. Despite general agreement regarding the importance of the local setting in affecting intervention success across different settings, there is no consensus on what it is about a local setting that needs to be taken into account to design healthy workplace interventions appropriate for different local settings. Complexity theory principles were used to understand a workplace as a complex adaptive system and to create a framework of eight domains (system characteristics) that affect the emergence of system-level behaviour. This Workplace of Well-being (WoW) framework is responsive and adaptive to local settings and allows a shared understanding of the enablers and barriers to behaviour change by capturing local information for each of the eight domains. We use the results of applying the WoW framework to one workplace, a UK National Health Service ward, to describe the utility of this approach in informing design of setting-appropriate healthy workplace interventions that create workplaces conducive to healthy behaviour change. PMID:26380358
Selecting Organization Development Theory from an HRD Perspective
ERIC Educational Resources Information Center
Lynham, Susan A.; Chermack, Thomas J.; Noggle, Melissa A.
2004-01-01
As is true for human resource development (HRD), the field of organization development (OD) draws from numerous disciplines to inform its theory base. However, the identification and selection of theory to inform improved practice remains a challenge and begs the question of what can be used to inform and guide one in the identification and…
Synchronization invariance under network structural transformations
NASA Astrophysics Data System (ADS)
Arola-Fernández, Lluís; Díaz-Guilera, Albert; Arenas, Alex
2018-06-01
Synchronization processes are ubiquitous despite the many connectivity patterns that complex systems can show. Usually, the emergence of synchrony is a macroscopic observable; however, the microscopic details of the system, such as the underlying network of interactions, are often partially or totally unknown. We already know that different interaction structures can give rise to a common functionality, understood as a common macroscopic observable. Building upon this fact, here we propose network transformations that keep the collective behavior of a large system of Kuramoto oscillators invariant. We derive a method, based on information theory principles, that allows us to adjust the weights of the structural interactions to map random homogeneous in-degree networks into random heterogeneous networks and vice versa, keeping synchronization values invariant. The results of the proposed transformations reveal an interesting principle: heterogeneous networks can be mapped to homogeneous ones with local information, but the reverse process needs to exploit higher-order information. The formalism provides analytical insight to tackle real complex scenarios when dealing with uncertainty in the measurements of the underlying connectivity structure.
How Students Learn: Information Processing, Intellectual Development and Confrontation
ERIC Educational Resources Information Center
Entwistle, Noel
1975-01-01
A model derived from information processing theory is described, which helps to explain the complex verbal learning of students and suggests implications for lecturing techniques. Other factors affecting learning, which are not covered by the model, are discussed in relationship to it: student's intellectual development and effects of individual…
Jiang, Quansheng; Shen, Yehu; Li, Hua; Xu, Fengyu
2018-01-24
Feature recognition and fault diagnosis play an important role in equipment safety and the stable operation of rotating machinery. In order to cope with the complexity of the vibration signal of rotating machinery, a feature fusion model based on information entropy and a probabilistic neural network is proposed in this paper. The new method first uses information entropy theory to extract three kinds of characteristic entropy from vibration signals, namely singular spectrum entropy, power spectrum entropy, and approximate entropy. Then the feature fusion model is constructed to classify and diagnose the fault signals. The proposed approach combines comprehensive information from different aspects and is more sensitive to the fault features. Experimental results on simulated fault signals verified the better performance of the proposed approach. On real two-span rotor data, the fault detection accuracy of the new method is more than 10% higher than that of methods using the three kinds of information entropy separately. The new approach proves to be an effective fault recognition method for rotating machinery.
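As an illustration of the kind of entropy features described above, the sketch below computes power spectrum entropy and singular spectrum entropy for a vibration signal with NumPy. The exact definitions, window lengths, and normalizations used by the authors are not given in the abstract, so these are standard textbook variants.

```python
import numpy as np

def power_spectrum_entropy(x):
    """Shannon entropy of the normalized FFT power spectrum of signal x."""
    psd = np.abs(np.fft.rfft(x)) ** 2
    p = psd / psd.sum()                      # normalize to a probability distribution
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def singular_spectrum_entropy(x, m=20):
    """Shannon entropy of the normalized singular values of a trajectory matrix."""
    n = len(x) - m + 1
    traj = np.stack([x[i:i + m] for i in range(n)])   # n x m delay embedding
    s = np.linalg.svd(traj, compute_uv=False)
    p = s / s.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Example: a noisy vibration-like test signal
t = np.linspace(0, 1, 2048)
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.random.randn(t.size)
print(power_spectrum_entropy(x), singular_spectrum_entropy(x))
```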
Using quantum theory to simplify input-output processes
NASA Astrophysics Data System (ADS)
Thompson, Jayne; Garner, Andrew J. P.; Vedral, Vlatko; Gu, Mile
2017-02-01
All natural things process and transform information. They receive environmental information as input, and transform it into appropriate output responses. Much of science is dedicated to building models of such systems: algorithmic abstractions of their input-output behavior that allow us to simulate how such systems can behave in the future, conditioned on what has transpired in the past. Here, we show that classical models cannot avoid inefficiency, storing past information that is unnecessary for correct future simulation. We construct quantum models that mitigate this waste, whenever it is physically possible to do so. This suggests that the complexity of general input-output processes depends fundamentally on what sort of information theory we use to describe them.
2002-08-01
This research project develops a tissue-model-based, signal-detection theory approach for the detection of mammary tumors, using Bayesian algorithms that account for uncertain permittivity, measurement noise, and the physical model of the forward-scattered electric field received at multiple sensors.
New strategy for protein interactions and application to structure-based drug design
NASA Astrophysics Data System (ADS)
Zou, Xiaoqin
One of the greatest challenges in computational biophysics is to predict interactions between biological molecules, which play critical roles in biological processes and rational design of therapeutic drugs. Biomolecular interactions involve delicate interplay between multiple interactions, including electrostatic interactions, van der Waals interactions, solvent effect, and conformational entropic effect. Accurate determination of these complex and subtle interactions is challenging. Moreover, a biological molecule such as a protein usually consists of thousands of atoms, and thus occupies a huge conformational space. The large degrees of freedom pose further challenges for accurate prediction of biomolecular interactions. Here, I will present our development of physics-based theory and computational modeling on protein interactions with other molecules. The major strategy is to extract microscopic energetics from the information embedded in the experimentally-determined structures of protein complexes. I will also present applications of the methods to structure-based therapeutic design. Supported by NSF CAREER Award DBI-0953839, NIH R01GM109980, and the American Heart Association (Midwest Affiliate) [13GRNT16990076].
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spanjers, Charles S.; Guillo, Pascal; Tilley, T. Don
X-ray absorption near-edge structure (XANES) is a common technique for elucidating oxidation state and first shell coordination geometry in transition metal complexes, among many other materials. However, the structural information obtained from XANES is often limited to the first coordination sphere. In this study, we show how XANES can be used to differentiate between C, Si, and Ge in the second coordination shell of Ti–O–(C, Si, Ge) molecular complexes based on differences in their Ti K-edge XANES spectra. Experimental spectra were compared with theoretical spectra calculated using density functional theory structural optimization and ab initio XANES calculations. The unique features for second shell C, Si, and Ge present in the Ti K pre-edge XANES are attributed to the interaction between the Ti center and the O–X (X = C, Si, or Ge) antibonding orbitals.
Unraveling dynamics of human physical activity patterns in chronic pain conditions
NASA Astrophysics Data System (ADS)
Paraschiv-Ionescu, Anisoara; Buchser, Eric; Aminian, Kamiar
2013-06-01
Chronic pain is a complex disabling experience that negatively affects cognitive, affective and physical functions as well as behavior. Although the interaction between chronic pain and physical functioning is a well-accepted paradigm in clinical research, the understanding of how pain affects individuals' daily life behavior remains a challenging task. Here we develop a methodological framework that allows us to objectively document disruptive, pain-related interference with real-life physical activity. The results reveal that meaningful information is contained in the temporal dynamics of activity patterns and that an analytical model based on the theory of bivariate point processes can be used to describe physical activity behavior. The model parameters capture the dynamic interdependence between periods and events and determine a 'signature' of the activity pattern. The study is likely to contribute to the clinical understanding of complex pain/disease-related behaviors and establish a unified mathematical framework to quantify the complex dynamics of various human activities.
NASA Astrophysics Data System (ADS)
Rosso, Osvaldo A.; Craig, Hugh; Moscato, Pablo
2009-03-01
We introduce novel Information Theory quantifiers in a computational linguistic study that involves a large corpus of English Renaissance literature. The 185 texts studied (136 plays and 49 poems in total), with first editions that range from 1580 to 1640, form a representative set of its period. Our data set includes 30 texts unquestionably attributed to Shakespeare; in addition we also included A Lover’s Complaint, a poem which generally appears in Shakespeare collected editions but whose authorship is currently in dispute. Our statistical complexity quantifiers combine the power of Jensen-Shannon’s divergence with the entropy variations as computed from a probability distribution function of the observed word use frequencies. Our results show, among other things, that for a given entropy poems display higher complexity than plays, that Shakespeare’s work falls into two distinct clusters in entropy, and that his work is remarkable for its homogeneity and for its closeness to overall means.
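A minimal sketch of the core quantities behind such an analysis (the Shannon entropy of a word-frequency distribution and the Jensen-Shannon divergence between two texts); the corpus handling and the specific complexity quantifier combining the two are the authors' own, so the code below only shows the standard definitions on toy text fragments.

```python
import numpy as np
from collections import Counter

def word_distribution(text, vocab):
    """Normalized word-frequency distribution of a text over a fixed vocabulary."""
    counts = Counter(text.lower().split())
    freqs = np.array([counts[w] for w in vocab], dtype=float)
    return freqs / freqs.sum()

def shannon_entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def jensen_shannon(p, q):
    """Jensen-Shannon divergence between two distributions, in bits."""
    m = 0.5 * (p + q)
    return shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))

text_a = "to be or not to be that is the question"
text_b = "the lady doth protest too much methinks"
vocab = sorted(set(text_a.split()) | set(text_b.split()))
p, q = word_distribution(text_a, vocab), word_distribution(text_b, vocab)
print(shannon_entropy(p), jensen_shannon(p, q))
```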
NASA Astrophysics Data System (ADS)
Shafaatian, Bita; Mousavi, S. Sedighe; Afshari, Sadegh
2016-11-01
New dimeric complexes of zinc(II), copper(II) and nickel(II) were synthesized using a Schiff base ligand formed by the condensation of 2-aminothiophenol and 2-hydroxy-5-methyl benzaldehyde. This tridentate Schiff base ligand was coordinated to the metal ions through the NSO donor atoms. In order to prevent oxidation of the thiol group during the formation of the Schiff base and its complexes, all of the reactions were carried out under an inert atmosphere of argon. The X-ray structure of the Schiff base ligand showed that in the crystalline form the SH groups were oxidized to produce a disulfide Schiff base as a new double Schiff base ligand. The molar conductivity values of the complexes in dichloromethane implied the presence of non-electrolyte species. The fluorescence properties of the Schiff base ligand and its complexes were also studied in dichloromethane. The products were characterized by FT-IR, 1H NMR, UV/Vis spectroscopies, elemental analysis, and conductometry. The crystal structure of the double Schiff base was determined by single crystal X-ray diffraction. Furthermore, density functional theory (DFT) calculations were performed at the B3LYP/6-31G(d,p) level of theory to determine the optimized structures of the Schiff base complexes.
Lu, Ruipeng; Mucaki, Eliseos J; Rogan, Peter K
2017-03-17
Data from ChIP-seq experiments can be used to derive the genome-wide binding specificities of transcription factors (TFs) and other regulatory proteins. We analyzed 765 ENCODE ChIP-seq peak datasets of 207 human TFs with a novel motif discovery pipeline based on recursive, thresholded entropy minimization. This approach, while obviating the need to compensate for skewed nucleotide composition, distinguishes true binding motifs from noise, quantifies the strengths of individual binding sites based on computed affinity and detects adjacent cofactor binding sites that coordinate with the targets of primary, immunoprecipitated TFs. We obtained contiguous and bipartite information theory-based position weight matrices (iPWMs) for 93 sequence-specific TFs, discovered 23 cofactor motifs for 127 TFs and revealed six high-confidence novel motifs. The reliability and accuracy of these iPWMs were determined via four independent validation methods, including the detection of experimentally proven binding sites, explanation of effects of characterized SNPs, comparison with previously published motifs and statistical analyses. We also predict previously unreported TF coregulatory interactions (e.g. TF complexes). These iPWMs constitute a powerful tool for predicting the effects of sequence variants in known binding sites, performing mutation analysis on regulatory SNPs and predicting previously unrecognized binding sites and target genes. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
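By way of illustration only (the authors' recursive, thresholded entropy-minimization pipeline is considerably more involved), the sketch below builds a simple information-theoretic position weight matrix from aligned binding sites and scores a candidate sequence; the example sites and parameters are hypothetical.

```python
import numpy as np

BASES = "ACGT"

def information_pwm(sites, pseudocount=0.5, background=0.25):
    """Per-position log2(frequency/background) weights from aligned binding sites."""
    L = len(sites[0])
    counts = np.full((L, 4), pseudocount)
    for s in sites:
        for i, b in enumerate(s):
            counts[i, BASES.index(b)] += 1
    freqs = counts / counts.sum(axis=1, keepdims=True)
    return np.log2(freqs / background)        # information weights in bits

def score(pwm, seq):
    """Additive PWM score of a candidate binding site."""
    return sum(pwm[i, BASES.index(b)] for i, b in enumerate(seq))

sites = ["TGACGTCA", "TGACGTAA", "TTACGTCA", "TGACGCCA"]   # hypothetical aligned sites
pwm = information_pwm(sites)
print(score(pwm, "TGACGTCA"), score(pwm, "AAAAAAAA"))
```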
ERIC Educational Resources Information Center
Light, Richard L.; Kentel, Jeanne Adéle
2015-01-01
Background: Interest in the use of learning theory to inform sport and physical-education pedagogy over the past decade beyond games and team sports has been limited. Purpose: Following on from recent interest within the literature in Eastern philosophic traditions, this article draws on the Japanese concept of "mushin" and complex…
ERIC Educational Resources Information Center
Hogan, Vivienne
2012-01-01
This article investigates how feminist pedagogy and poststructuralist theory can inform both teacher and student in the teaching and learning of gender in relation to teacher education. With reference to the author's own experience of teaching student teachers in early childhood education the article attempts to unravel the complex interface…
Learning quadratic receptive fields from neural responses to natural stimuli.
Rajan, Kanaka; Marre, Olivier; Tkačik, Gašper
2013-07-01
Models of neural responses to stimuli with complex spatiotemporal correlation structure often assume that neurons are selective for only a small number of linear projections of a potentially high-dimensional input. In this review, we explore recent modeling approaches where the neural response depends on the quadratic form of the input rather than on its linear projection, that is, the neuron is sensitive to the local covariance structure of the signal preceding the spike. To infer this quadratic dependence in the presence of arbitrary (e.g., naturalistic) stimulus distribution, we review several inference methods, focusing in particular on two information theory-based approaches (maximization of stimulus energy and of noise entropy) and two likelihood-based approaches (Bayesian spike-triggered covariance and extensions of generalized linear models). We analyze the formal relationship between the likelihood-based and information-based approaches to demonstrate how they lead to consistent inference. We demonstrate the practical feasibility of these procedures by using model neurons responding to a flickering variance stimulus.
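As a concrete example of one of the classical methods reviewed, the sketch below computes the spike-triggered average and spike-triggered covariance from a stimulus matrix and a spike count vector; a simulated quadratic neuron serves as stand-in data, and the simpler white-noise case (rather than an arbitrary naturalistic stimulus distribution) is assumed.

```python
import numpy as np

rng = np.random.default_rng(0)
T, D = 50000, 20                      # time bins, stimulus dimensions
X = rng.standard_normal((T, D))       # white-noise stimulus

# Simulated neuron sensitive to the stimulus energy along one hidden direction
w = rng.standard_normal(D); w /= np.linalg.norm(w)
rate = 0.05 * (X @ w) ** 2
spikes = rng.poisson(rate)

def spike_triggered_moments(X, spikes):
    weights = spikes / spikes.sum()
    sta = weights @ X                                   # spike-triggered average
    Xc = X - sta
    stc = (Xc * weights[:, None]).T @ Xc                # spike-triggered covariance
    return sta, stc

sta, stc = spike_triggered_moments(X, spikes)
# Eigenvectors of STC minus the prior covariance (identity here) give quadratic filters
eigvals, eigvecs = np.linalg.eigh(stc - np.eye(D))
print(np.abs(eigvecs[:, np.argmax(np.abs(eigvals))] @ w))   # close to 1 if recovered
```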
Numerical implementation of multiple peeling theory and its application to spider web anchorages.
Brely, Lucas; Bosia, Federico; Pugno, Nicola M
2015-02-06
Adhesion of spider web anchorages has been studied in recent years, including the specific functionalities achieved through different architectures. To better understand the delamination mechanisms of these and other biological or artificial fibrillar adhesives, and how their adhesion can be optimized, we develop a novel numerical model to simulate the multiple peeling of structures with arbitrary branching and adhesion angles, including complex architectures. The numerical model is based on a recently developed multiple peeling theory, which extends the energy-based single peeling theory of Kendall, and can be applied to arbitrarily complex structures. In particular, we numerically show that a multiple peeling problem can be treated as the superposition of single peeling configurations even for complex structures. Finally, we apply the developed numerical approach to study spider web anchorages, showing how their function is achieved through optimal geometrical configurations.
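For orientation, the energy-based single-peeling relation of Kendall that multiple peeling theory generalizes can be written as follows (standard form for an elastic tape of width b, thickness t, and Young's modulus E peeled at angle θ against an adhesion energy R per unit area; the notation is the conventional one, not taken from the paper).

```latex
% Kendall (1975) single-peeling equilibrium condition for the peel force F:
\frac{1}{2\,E\,t}\left(\frac{F}{b}\right)^{2} \;+\; \frac{F}{b}\,\bigl(1-\cos\theta\bigr) \;-\; R \;=\; 0 .
% Multiple peeling theory applies this energy balance to each branch of a
% branched (e.g. V-shaped) adhesive structure with its own peeling angle.
```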
Director of nursing and midwifery leadership: informed through the lens of critical social science.
Solman, Annette
2010-05-01
Highlight the use of critical social science theories, practice development principles and a situational leadership framework within transformational leadership to inform Directors of Nursing and Midwifery (DoNM) practices as leaders. Healthcare is constantly changing, unpredictable, strives for quality service and cost containment, which can result in stress and crisis for healthcare workers. DoNM leadership is critical to supporting and leading staff through these complex times within healthcare. Understanding theories, frameworks and their application to real-world practice can assist in supporting individuals and teams to navigate through the changing healthcare environment. Blending critical social science theories with practice development principles and the situational leadership framework can assist the DoNM to enact transformational leadership to support the development of individuals and teams to meet the complex healthcare needs of patients within the clinical setting. IMPLICATIONS FOR NURSE MANAGEMENT: This article contributes through the practical application of critical social science theories, practice development principles and situational leadership framework within transformational leadership as an approach for enacting DoNM leadership. To further understand and develop in the role of the contemporary DoNM in leadership, these directors are encouraged to publish their work.
Planning in Higher Education and Chaos Theory: A Model, a Method.
ERIC Educational Resources Information Center
Cutright, Marc
This paper proposes a model, based on chaos theory, that explores strategic planning in higher education. It notes that chaos theory was first developed in the physical sciences to explain how apparently random activity was, in fact, complexly patterned. The paper goes on to describe how chaos theory has subsequently been applied to the social…
Collective learning modeling based on the kinetic theory of active particles
NASA Astrophysics Data System (ADS)
Burini, D.; De Lillo, S.; Gibelli, L.
2016-03-01
This paper proposes a systems approach to the theory of perception and learning in populations composed of many living entities. Starting from a phenomenological description of these processes, a mathematical structure is derived which is deemed to incorporate their complexity features. The modeling is based on a generalization of kinetic theory methods where interactions are described by theoretical tools of game theory. As an application, the proposed approach is used to model the learning processes that take place in a classroom.
Simpson, Sharon A; Butler, Christopher C; Hood, Kerry; Cohen, David; Dunstan, Frank; Evans, Meirion R; Rollnick, Stephen; Moore, Laurence; Hare, Monika; Bekkers, Marie-Jet; Evans, John
2009-01-01
Background After some years of a downward trend, antibiotic prescribing rates in the community have tended to level out in many countries. There is also wide variation in antibiotic prescribing between general practices, and between countries. There are still considerable further gains that could be made in reducing inappropriate antibiotic prescribing, but complex interventions are required. Studies to date have generally evaluated the effect of interventions on antibiotic prescribing in a single consultation and pragmatic evaluations that assess maintenance of new skills are rare. This paper describes the protocol for a pragmatic, randomized evaluation of a complex intervention aimed at reducing antibiotic prescribing by primary care clinicians. Methods and design We developed a Social Learning Theory based, blended learning program (on-line learning, a practice based seminar, and context bound learning) called the STAR Educational Program. The 'why of change' is addressed by providing clinicians in general practice with information on antibiotic resistance in urine samples submitted by their practice and their antibiotic prescribing data, and facilitating a practice-based seminar on the implications of this data. The 'how of change' is addressed through context-bound communication skills training and information on antibiotic indication and choice. This intervention will be evaluated in a trial involving 60 general practices, with general practice as the unit of randomization (clinicians from each practice to either receive the STAR Educational Program or not) and analysis. The primary outcome will be the number of antibiotic items dispensed over one year. An economic and process evaluation will also be conducted. Discussion This trial will be the first to evaluate the effectiveness of this type of theory-based, blended learning intervention aimed at reducing antibiotic prescribing by primary care clinicians. Novel aspects include feedback of practice level data on antimicrobial resistance and prescribing, use of principles from motivational interviewing, training in enhanced communication skills that incorporates context-bound experience and reflection, and using antibiotic dispensing over one year (as opposed to antibiotic prescribing in a single consultation) as the main outcome. Trial registration Current Controlled Trials ISRCTN63355948. PMID:19309493
Mechanism of Benzene Tribopolymerization on the RuO2 (110) Surface
NASA Astrophysics Data System (ADS)
Yang, J.; Qi, Y.; Kim, H. D.; Rappe, A. M.
2018-04-01
A tribopolymer formed on the contacts of microelectromechanical and nanoelectromechanical system (MEMS-NEMS) devices is a major concern hampering their practical use in information technology. Conductive metal oxides, such as RuO2 and ReO3 , have been regarded as promising candidate materials for MEMS-NEMS contacts due to their conductivity, hardness, and relatively chemically inert surfaces. However, recent experimental works demonstrate that trace amounts of a polymer could still form on RuO2 surfaces. We demonstrate the mechanism of this class of unexpected tribopolymer formation by conducting density-functional-theory-based computational compression experiments with benzene as the contamination gas. First, mechanical force during compression changes the benzene molecules from slightly physisorbed to strongly chemisorbed. Further compression causes deformation and chemical linkage of the benzene molecules. Finally, the two contacts detach, with one having a complex organic molecule attached and the other a more reactive surface. The complex organic molecule, which has an oxabicyclic segment, can be viewed as the rudiment of a tribopolymer, and the more reactive surface can trigger the next adsorption-reaction-tribopolymer formation cycle. Based on these results, we also predict tribopolymer formation rates by using transition-state theory and the second-order rate law. We promote a deeper understanding of tribopolymer formation (especially on metal oxides) and provide strategies for suppressing tribopolymerization.
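The rate estimate mentioned at the end of the abstract presumably combines the Eyring transition-state expression with a second-order rate law; in generic form (with ΔG‡ the computed activation free energy, symbols standard rather than taken from the paper):

```latex
% Eyring transition-state theory rate constant:
k \;=\; \frac{k_{B} T}{h}\,\exp\!\left(-\frac{\Delta G^{\ddagger}}{k_{B} T}\right),
% combined with a second-order rate law for two reacting species A and B:
\frac{d[\mathrm{P}]}{dt} \;=\; k\,[\mathrm{A}]\,[\mathrm{B}] .
```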
CR-Calculus and adaptive array theory applied to MIMO random vibration control tests
NASA Astrophysics Data System (ADS)
Musella, U.; Manzato, S.; Peeters, B.; Guillaume, P.
2016-09-01
Performing Multiple-Input Multiple-Output (MIMO) tests to reproduce the vibration environment in a user-defined number of control points of a unit under test is necessary in applications where a realistic environment replication has to be achieved. MIMO tests require vibration control strategies to calculate the required drive signal vector that gives an acceptable replication of the target. This target is a (complex) vector with magnitude and phase information at the control points for MIMO Sine Control tests while in MIMO Random Control tests, in the most general case, the target is a complete spectral density matrix. The idea behind this work is to tailor a MIMO random vibration control approach that can be generalized to other MIMO tests, e.g. MIMO Sine and MIMO Time Waveform Replication. In this work the approach is to use gradient-based procedures over the complex space, applying the so called CR-Calculus and the adaptive array theory. With this approach it is possible to better control the process performances allowing the step-by-step Jacobian Matrix update. The theoretical bases behind the work are followed by an application of the developed method to a two-exciter two-axis system and by performance comparisons with standard methods.
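To make the CR-calculus idea concrete, the sketch below shows the textbook complex-gradient (Wirtinger) steepest-descent update for an adaptive-array weight vector, which is the kind of gradient step over the complex space that the paper builds on; the actual MIMO control law and step-by-step Jacobian update are specific to the paper and are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
D, T, mu = 4, 5000, 0.01

w_true = rng.standard_normal(D) + 1j * rng.standard_normal(D)   # unknown system
w = np.zeros(D, dtype=complex)                                   # adaptive weights

for _ in range(T):
    x = rng.standard_normal(D) + 1j * rng.standard_normal(D)     # complex input snapshot
    d = np.conj(w_true) @ x                                       # desired response
    e = d - np.conj(w) @ x                                        # a-priori error
    # Wirtinger gradient of |e|^2 w.r.t. conj(w) is -x*conj(e); steepest-descent step:
    w = w + mu * x * np.conj(e)

print(np.linalg.norm(w - w_true))   # should be small after convergence
```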
A spread willingness computing-based information dissemination model.
Huang, Haojing; Cui, Zhiming; Zhang, Shukui
2014-01-01
This paper constructs a kind of spread willingness computing based on information dissemination model for social network. The model takes into account the impact of node degree and dissemination mechanism, combined with the complex network theory and dynamics of infectious diseases, and further establishes the dynamical evolution equations. Equations characterize the evolutionary relationship between different types of nodes with time. The spread willingness computing contains three factors which have impact on user's spread behavior: strength of the relationship between the nodes, views identity, and frequency of contact. Simulation results show that different degrees of nodes show the same trend in the network, and even if the degree of node is very small, there is likelihood of a large area of information dissemination. The weaker the relationship between nodes, the higher probability of views selection and the higher the frequency of contact with information so that information spreads rapidly and leads to a wide range of dissemination. As the dissemination probability and immune probability change, the speed of information dissemination is also changing accordingly. The studies meet social networking features and can help to master the behavior of users and understand and analyze characteristics of information dissemination in social network.
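As an illustration of the epidemic-style dynamics such a dissemination model builds on (the abstract's actual evolution equations and the three-factor willingness term are not given, so the sketch below is a generic SIR-like spread with a single willingness parameter scaling the spreading probability; all parameter values are hypothetical):

```python
import numpy as np

def simulate_dissemination(beta=0.3, willingness=0.6, mu=0.1, steps=100, n=10000, seeds=10):
    """SIR-like information spread where the effective rate is beta * willingness."""
    S, I, R = np.zeros(steps), np.zeros(steps), np.zeros(steps)
    S[0], I[0], R[0] = n - seeds, seeds, 0
    for t in range(steps - 1):
        new_spreaders = beta * willingness * S[t] * I[t] / n   # willingness damps spreading
        new_immune = mu * I[t]                                  # spreaders lose interest
        S[t + 1] = S[t] - new_spreaders
        I[t + 1] = I[t] + new_spreaders - new_immune
        R[t + 1] = R[t] + new_immune
    return S, I, R

S, I, R = simulate_dissemination()
print(f"peak spreaders: {I.max():.0f}, total reached: {R[-1]:.0f}")
```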
ERIC Educational Resources Information Center
Imhof, Margarete; Starker, Ulrike; Spaude, Elena
2016-01-01
Building on Dörner's (1996) theory of complex problem-solving, a learning scenario for teacher students was created and tested. Classroom management is interpreted as a complex problem, which requires the integration of competing interests and tackling multiple, simultaneous tasks under time pressure and with limited information. In addition,…
Dehydration-driven evolution of topological complexity in ethylammonium uranyl selenates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gurzhiy, Vladislav V., E-mail: vladgeo17@mail.ru; Krivovichev, Sergey V.; Tananaev, Ivan G.
Single crystals of four novel uranyl selenate and selenite-selenate oxysalts with protonated ethylamine molecules, (C2H8N)2[(UO2)(SeO4)2(H2O)](H2O) (I), (C2H8N)3[(UO2)(SeO4)2(HSeO4)] (II), (C2H8N)[(UO2)(SeO4)(HSeO3)] (III), and (C2H8N)(H3O)[(UO2)(SeO4)2(H2O)] (IV), have been prepared by isothermal evaporation from aqueous solutions. Uranyl-containing 1D and 2D units have been investigated using a topological approach and information-based complexity measurements that demonstrate the evolution of structural units and the increase of topological complexity with the decrease of H2O content. - Graphical abstract: Single crystals of four novel uranyl selenate and selenite-selenate oxysalts with protonated ethylamine molecules have been prepared by isothermal evaporation from aqueous solutions. Structural analysis and information-based topological complexity calculations point to the possible sequence of crystalline phase formation, showing both topological and structural branches of evolution. - Highlights: • Single crystals of four novel uranyl oxysalts were prepared by the evaporation method. • Graph theory was used for investigation of the topologies of structural units. • Dehydration processes drive the evolution of topological complexity of 1D and 2D structural units.
THE COGNITIVE NEUROSCIENCE OF WORKING MEMORY
D’Esposito, Mark; Postle, Bradley R.
2015-01-01
For over 50 years, psychologists and neuroscientists have recognized the importance of a “working memory” to coordinate processing when multiple goals are active, and to guide behavior with information that is not present in the immediate environment. In recent years, psychological theory and cognitive neuroscience data have converged on the idea that information is encoded into working memory via the allocation of attention to internal representations – be they semantic long-term memory (e.g., letters, digits, words), sensory, or motoric. Thus, information-based multivariate analyses of human functional MRI data typically find evidence for the temporary representation of stimuli in regions that also process this information in nonworking-memory contexts. The prefrontal cortex, on the other hand, exerts control over behavior by biasing the salience of mnemonic representations, and adjudicating among competing, context-dependent rules. The “control of the controller” emerges from a complex interplay between PFC and striatal circuits, and ascending dopaminergic neuromodulatory signals. PMID:25251486
Cannistraci, Carlo Vittorio; Alanis-Lobato, Gregorio; Ravasi, Timothy
2013-01-01
Growth and remodelling impact the network topology of complex systems, yet a general theory explaining how new links arise between existing nodes has been lacking, and little is known about the topological properties that facilitate link-prediction. Here we investigate the extent to which the connectivity evolution of a network might be predicted by mere topological features. We show how a link/community-based strategy triggers substantial prediction improvements because it accounts for the singular topology of several real networks organised in multiple local communities - a tendency here named local-community-paradigm (LCP). We observe that LCP networks are mainly formed by weak interactions and characterise heterogeneous and dynamic systems that use self-organisation as a major adaptation strategy. These systems seem designed for global delivery of information and processing via multiple local modules. Conversely, non-LCP networks have steady architectures formed by strong interactions, and seem designed for systems in which information/energy storage is crucial. PMID:23563395
Extending and expanding the Darwinian synthesis: the role of complex systems dynamics.
Weber, Bruce H
2011-03-01
Darwinism is defined here as an evolving research tradition based upon the concepts of natural selection acting upon heritable variation articulated via background assumptions about systems dynamics. Darwin's theory of evolution was developed within a context of the background assumptions of Newtonian systems dynamics. The Modern Evolutionary Synthesis, or neo-Darwinism, successfully joined Darwinian selection and Mendelian genetics by developing population genetics informed by background assumptions of Boltzmannian systems dynamics. Currently the Darwinian Research Tradition is changing as it incorporates new information and ideas from molecular biology, paleontology, developmental biology, and systems ecology. This putative expanded and extended synthesis is most perspicuously deployed using background assumptions from complex systems dynamics. Such attempts seek to not only broaden the range of phenomena encompassed by the Darwinian Research Tradition, such as neutral molecular evolution, punctuated equilibrium, as well as developmental biology, and systems ecology more generally, but to also address issues of the emergence of evolutionary novelties as well as of life itself. Copyright © 2010 Elsevier Ltd. All rights reserved.
Structural bases for neurophysiological investigations of amygdaloid complex of the brain
NASA Astrophysics Data System (ADS)
Kalimullina, Liliya B.; Kalkamanov, Kh. A.; Akhmadeev, Azat V.; Zakharov, Vadim P.; Sharafullin, Ildus F.
2015-11-01
The amygdala (Am), as part of the limbic system of the brain, underlies such important functions as the adaptive behavior of animals, the formation of emotions and memory, and the regulation of endocrine and visceral functions. Using mathematical modelling based on pattern recognition theory, we worked out principles for organizing neurophysiological and neuromorphological studies of Am nuclei that take into account the existing heterogeneity of its formations and substantially optimize the protocol for carrying out such investigations. The resulting scheme for studying the structural-functional organization of the Am at its most informative sections can be used as a guide for the precise placement of electrodes, cannulae, and microsensors into a particular Am nucleus, with registration of not only the nucleus itself but also its extensions. This information is also important for defining the number of slices covering specific Am nuclei that must be investigated to reveal the physiological role of a particular part of the amygdaloid complex.
On the formal definition of the systems' interoperability capability: an anthropomorphic approach
NASA Astrophysics Data System (ADS)
Zdravković, Milan; Luis-Ferreira, Fernando; Jardim-Goncalves, Ricardo; Trajanović, Miroslav
2017-03-01
The extended view of enterprise information systems in the Internet of Things (IoT) introduces additional complexity to the interoperability problems. In response to this, the problem of systems' interoperability is revisited by taking into account the different aspects of philosophy, psychology, linguistics and artificial intelligence, namely by analysing the potential analogies between the processes of human and system communication. Then, the capability to interoperate, as a property of the system, is defined as a complex ability to seamlessly sense and perceive a stimulus from its environment (assumingly, a message from any other system), make an informed decision about this perception and, consequently, articulate a meaningful and useful action or response based on this decision. Although this capability is defined on the basis of the existing interoperability theories, the proposed approach to its definition excludes the assumption on the awareness of co-existence of two interoperating systems. Thus, it establishes the links between the research of interoperability of systems and intelligent software agents, as one of the systems' digital identities.
Analyzing the causation of a railway accident based on a complex network
NASA Astrophysics Data System (ADS)
Ma, Xin; Li, Ke-Ping; Luo, Zi-Yan; Zhou, Jin
2014-02-01
In this paper, a new model is constructed for the causation analysis of railway accident based on the complex network theory. In the model, the nodes are defined as various manifest or latent accident causal factors. By employing the complex network theory, especially its statistical indicators, the railway accident as well as its key causations can be analyzed from the overall perspective. As a case, the “7.23” China—Yongwen railway accident is illustrated based on this model. The results show that the inspection of signals and the checking of line conditions before trains run played an important role in this railway accident. In conclusion, the constructed model gives a theoretical clue for railway accident prediction and, hence, greatly reduces the occurrence of railway accidents.
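A minimal sketch of the kind of statistical indicators such a causation-network model relies on, using NetworkX on a toy graph of hypothetical causal factors (the nodes and edges are illustrative only, not the actual factors of the "7.23" accident):

```python
import networkx as nx

# Hypothetical directed causation graph: edges point from cause to consequence
edges = [
    ("signal inspection lapse", "faulty signal state"),
    ("line condition not checked", "faulty signal state"),
    ("faulty signal state", "dispatcher misinformation"),
    ("dispatcher misinformation", "train collision"),
    ("inadequate emergency procedure", "train collision"),
]
G = nx.DiGraph(edges)

# Statistical indicators used to rank causal factors by their role in the network
degree = dict(G.degree())
betweenness = nx.betweenness_centrality(G)
for node in G.nodes:
    print(f"{node}: degree={degree[node]}, betweenness={betweenness[node]:.2f}")
```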
Human systems dynamics: Toward a computational model
NASA Astrophysics Data System (ADS)
Eoyang, Glenda H.
2012-09-01
A robust and reliable computational model of complex human systems dynamics could support advancements in theory and practice for social systems at all levels, from intrapersonal experience to global politics and economics. Models of human interactions have evolved from traditional, Newtonian systems assumptions, which served a variety of practical and theoretical needs of the past. Another class of models has been inspired and informed by models and methods from nonlinear dynamics, chaos, and complexity science. None of the existing models, however, is able to represent the open, high dimension, and nonlinear self-organizing dynamics of social systems. An effective model will represent interactions at multiple levels to generate emergent patterns of social and political life of individuals and groups. Existing models and modeling methods are considered and assessed against characteristic pattern-forming processes in observed and experienced phenomena of human systems. A conceptual model, CDE Model, based on the conditions for self-organizing in human systems, is explored as an alternative to existing models and methods. While the new model overcomes the limitations of previous models, it also provides an explanatory base and foundation for prospective analysis to inform real-time meaning making and action taking in response to complex conditions in the real world. An invitation is extended to readers to engage in developing a computational model that incorporates the assumptions, meta-variables, and relationships of this open, high dimension, and nonlinear conceptual model of the complex dynamics of human systems.
Community outreach: from measuring the difference to making a difference with health information*
Ottoson, Judith M.; Green, Lawrence W.
2005-01-01
Background: Community-based outreach seeks to move libraries beyond their traditional institutional boundaries to improve both access to and effectiveness of health information. The evaluation of such outreach needs to involve the community in assessing the program's process and outcomes. Purpose: Evaluation of community-based library outreach programs benefits from a participatory approach. To explain this premise of the paper, three components of evaluation theory are paired with relevant participatory strategies. Concepts: The first component of evaluation theory is also a standard of program evaluation: use. Evaluation is intended to be useful for stakeholders to make decisions. A useful evaluation is credible, timely, and of adequate scope. Participatory approaches to increase use of evaluation findings include engaging end users early in planning the program itself and in deciding on the outcomes of the evaluation. A second component of evaluation theory seeks to understand what is being evaluated, such as specific aspects of outreach programs. A transparent understanding of the ways outreach achieves intended goals, its activities and linkages, and the context in which it operates precedes any attempt to measure it. Participatory approaches to evaluating outreach include having end users, such as health practitioners in other community-based organizations, identify what components of the outreach program are most important to their work. A third component of evaluation theory is concerned with the process by which value is placed on outreach. What will count as outreach success or failure? Who decides? Participatory approaches to valuing include assuring end-user representation in the formulation of evaluation questions and in the interpretation of evaluation results. Conclusions: The evaluation of community-based outreach is a complex process that is not made easier by a participatory approach. Nevertheless, a participatory approach is more likely to make the evaluation findings useful, ensure that program knowledge is shared, and make outreach valuing transparent. PMID:16239958
Analytical Tools Interface for Landscape Assessments
Environmental management practices are trending away from simple, local-scale assessments toward complex, multiple-stressor regional assessments. Landscape ecology provides the theory behind these assessments while geographic information systems (GIS) supply the tools to implemen...
Processing of oil palm empty fruit bunch as filler material of polymer recycles
NASA Astrophysics Data System (ADS)
Saepulloh, D. R.; Nikmatin, S.; Hardhienata, H.
2017-05-01
Oil palm empty fruit bunches (OPEFB) are a waste product of crude palm oil (CPO) processing plants. This research aims to process OPEFB into a reinforcement filler for recycled polymer by mechanical milling and to identify the resulting molecular formations using orbital hybridization theory. OPEFB fibers were processed by mechanical milling down to short-fiber and microfiber sizes, and biocomposite granules were then produced with a single-screw extruder. TAPPI chemical testing showed an α-cellulose content of 41.68%. Based on density, the optimum composition was 15% filler at microfiber size. SEM morphology showed that the OPEFB filler was fairly evenly distributed in the matrix. The molecular interaction between the matrix and the OPEFB fiber is described by orbital hybridization theory; however, explaining bond formation in more complex molecules such as these from the standpoint of molecular orbital theory requires complete information about the hybrid levels.
Inconclusive quantum measurements and decisions under uncertainty
NASA Astrophysics Data System (ADS)
Yukalov, Vyacheslav; Sornette, Didier
2016-04-01
We give a mathematical definition for the notion of inconclusive quantum measurements. In physics, such measurements occur at intermediate stages of a complex measurement procedure, with the final measurement result being operationally testable. Since the mathematical structure of Quantum Decision Theory has been developed in analogy with the theory of quantum measurements, the inconclusive quantum measurements correspond, in Quantum Decision Theory, to intermediate stages of decision making in the process of taking decisions under uncertainty. The general form of the quantum probability for a composite event is the sum of a utility factor, describing a rational evaluation of the considered prospect, and of an attraction factor, characterizing irrational, subconscious attitudes of the decision maker. Despite the involved irrationality, the probability of prospects can be evaluated. This is equivalent to the possibility of calculating quantum probabilities without specifying hidden variables. We formulate a general way of evaluation, based on the use of non-informative priors. As an example, we suggest the explanation of the decoy effect. Our quantitative predictions are in very good agreement with experimental data.
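The decomposition described above can be written compactly for a set of prospects {π_n}; the notation below is generic and follows the abstract's verbal description rather than the paper's own symbols:

```latex
% Quantum probability of a prospect as a utility factor plus an attraction factor:
p(\pi_n) \;=\; f(\pi_n) \;+\; q(\pi_n), \qquad
\sum_n p(\pi_n) = 1, \quad \sum_n f(\pi_n) = 1, \quad \sum_n q(\pi_n) = 0,
% with f(\pi_n) the rational (utility) evaluation of the prospect and q(\pi_n) the
% attraction factor, estimated from non-informative priors when no further
% information on the decision maker is available.
```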
Information and complexity measures in the interface of a metal and a superconductor
NASA Astrophysics Data System (ADS)
Moustakidis, Ch. C.; Panos, C. P.
2018-06-01
Fisher information, Shannon information entropy and statistical complexity are calculated for the interface of a normal metal and a superconductor, as functions of temperature for several materials. The order parameter Ψ(r) derived from Ginzburg-Landau theory is used as input, together with experimental values of the critical transition temperature Tc and the superconducting coherence length ξ0. Analytical expressions are obtained for the information and complexity measures; thus Tc is related in a simple way to disorder and complexity. An analytical relation is also found between the Fisher information and the energy profile of superconductivity, i.e. the ratio of the surface free energy to the bulk free energy. We verify that a simple relation holds between Shannon and Fisher information, i.e. a decomposition of a global information quantity (Shannon) in terms of two local ones (Fisher information), previously derived and verified for atoms and molecules by Liu et al. Finally, we find analytical expressions for generalized information measures such as the Tsallis entropy and Fisher information. We conclude that the proper value of the non-extensivity parameter is q ≃ 1, in agreement with previous work using a different model, where q ≃ 1.005.
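A minimal numerical sketch of the kind of calculation the abstract describes is given below. It assumes the textbook Ginzburg-Landau interface profile f(x) = tanh(x/(√2 ξ)) and, as an illustrative assumption (not necessarily the paper's construction), builds a normalizable density from the local suppression of the order parameter, ρ(x) ∝ 1 − f(x)², before evaluating Shannon entropy and Fisher information numerically.

```python
import numpy as np

# Hedged sketch: information measures for a Ginzburg-Landau interface profile.
# The density rho(x) ~ 1 - f(x)^2 is an illustrative choice, not the paper's
# analytical construction; units and coherence lengths are arbitrary.

def interface_measures(xi, x_max=50.0, n=200001):
    x = np.linspace(-x_max, x_max, n)
    f = np.tanh(x / (np.sqrt(2.0) * xi))      # GL order parameter across the interface
    rho = 1.0 - f**2                          # suppression of superconductivity
    rho /= np.trapz(rho, x)                   # normalize to a probability density
    shannon = -np.trapz(rho * np.log(rho + 1e-300), x)   # differential Shannon entropy
    drho = np.gradient(rho, x)
    fisher = np.trapz(drho**2 / (rho + 1e-300), x)       # Fisher information
    return shannon, fisher

for xi in (1.0, 2.0, 5.0):                    # coherence length, arbitrary units
    S, I = interface_measures(xi)
    print(f"xi={xi}: Shannon={S:.3f}, Fisher={I:.3f}")
```

As expected for such localized densities, a larger coherence length spreads the interface, increasing the Shannon entropy and decreasing the Fisher information.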
Wolfe, Christopher R.; Reyna, Valerie F.; Widmer, Colin L.; Cedillos, Elizabeth M.; Fisher, Christopher R.; Brust-Renck, Priscila G.; Weil, Audrey M.
2014-01-01
Background Many healthy women consider genetic testing for breast cancer risk, yet BRCA testing issues are complex. Objective To determine whether an intelligent tutor, BRCA Gist, grounded in fuzzy-trace theory (FTT), increases gist comprehension and knowledge about genetic testing for breast cancer risk, improving decision-making. Design In two experiments, 410 healthy undergraduate women were randomly assigned to one of three groups: the first completed an online module delivered by a web-based tutoring system (BRCA Gist) that uses artificial intelligence technology, the second read highly similar content from the NCI web site, and the third completed an unrelated tutorial. Intervention BRCA Gist applied fuzzy-trace theory and was designed to help participants develop gist comprehension of topics relevant to decisions about BRCA genetic testing, including how breast cancer spreads, inherited genetic mutations, and base rates. Measures We measured content knowledge, gist comprehension of decision-relevant information, interest in testing, and genetic risk and testing judgments. Results Control knowledge scores ranged from 54% to 56%, NCI improved significantly to 65% and 70%, and BRCA Gist improved significantly more, to 75% and 77%, p<.0001. BRCA Gist scored higher on gist comprehension than NCI and control, p<.0001. The control genetic risk-assessment mean was 48% correct; BRCA Gist (61%) and NCI (56%) were significantly higher, p<.0001. BRCA Gist participants recommended less testing for women without risk factors (not good candidates) (24% and 19%) than controls (50%, both experiments) and NCI (32%, Experiment 2), p<.0001. BRCA Gist testing interest was lower than controls', p<.0001. Limitations BRCA Gist has not been tested with older women from diverse groups. Conclusions Intelligent tutors such as BRCA Gist are scalable, cost-effective ways of helping people understand complex issues, improving decision-making. PMID:24829276
Creativity, information, and consciousness: The information dynamics of thinking.
Wiggins, Geraint A
2018-05-07
This paper presents a theory of the basic operation of mind, Information Dynamics of Thinking, which is intended for computational implementation and thence empirical testing. It is based on the information theory of Shannon, and treats the mind/brain as an information processing organ that aims to be information-efficient, in that it predicts its world, so as to use information efficiently, and regularly re-represents it, so as to store information efficiently. The theory is presented in context of a background review of various research areas that impinge upon its development. Consequences of the theory and testable hypotheses arising from it are discussed. Copyright © 2018. Published by Elsevier B.V.
1985-09-01
appropriately, known as Paragraph Completion Test) (both requiring subjects to respond with several sentences to presented conflict dilemmas), along with... to better integrate and differentiate complex information. Procedure: A literature review in the area of cognitive complexity is presented. The... treated in an overview fashion, and more detailed reviews of post-1977 studies are then presented. This review updates the previous major review in this
Kinematic Determination of an Unmodeled Serial Manipulator by Means of an IMU
NASA Astrophysics Data System (ADS)
Ciarleglio, Constance A.
Kinematic determination for an unmodeled manipulator is usually done through a priori knowledge of the manipulator's physical characteristics or external sensor information. The mathematics of the kinematic estimation, often based on the Denavit-Hartenberg convention, are complex and have high computation requirements, in addition to being unique to the manipulator for which the method is developed. Analytical methods that can compute kinematics on-the-fly have the potential to be highly beneficial in dynamic environments where different configurations and variable manipulator types are often required. This thesis derives a new screw-theory-based method of kinematic determination, using a single inertial measurement unit (IMU), for use with any serial, revolute manipulator. The method allows the expansion of reconfigurable manipulator design and simplifies the kinematic process for existing manipulators. A simulation is presented in which the theory of the method is verified and its error characterized. The method is then implemented on an existing manipulator as a verification of functionality.
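For readers unfamiliar with the screw-theory machinery the thesis builds on, the sketch below shows the standard product-of-exponentials forward kinematics for a planar two-revolute-joint arm; the link lengths and joint angles are arbitrary illustrative values, and the IMU-based identification itself is not reproduced here.

```python
import numpy as np
from scipy.linalg import expm

# Minimal product-of-exponentials (screw theory) forward kinematics for a planar
# 2R arm.  This illustrates the formalism only; it is not the thesis's IMU method.

def twist_matrix(omega, v):
    """4x4 matrix form of the twist (omega, v)."""
    wx, wy, wz = omega
    skew = np.array([[0, -wz, wy],
                     [wz, 0, -wx],
                     [-wy, wx, 0]])
    S = np.zeros((4, 4))
    S[:3, :3] = skew
    S[:3, 3] = v
    return S

L1, L2 = 0.5, 0.3                               # illustrative link lengths (m)
S1 = twist_matrix([0, 0, 1], [0, 0, 0])         # joint 1: revolute about z at the origin
S2 = twist_matrix([0, 0, 1], [0, -L1, 0])       # joint 2 at (L1, 0, 0): v = -omega x q
M = np.eye(4); M[0, 3] = L1 + L2                # home configuration of the end effector

def forward_kinematics(theta1, theta2):
    return expm(S1 * theta1) @ expm(S2 * theta2) @ M

t1, t2 = 0.4, -0.7
T = forward_kinematics(t1, t2)
# Cross-check against the planar closed-form expression.
x = L1 * np.cos(t1) + L2 * np.cos(t1 + t2)
y = L1 * np.sin(t1) + L2 * np.sin(t1 + t2)
print(T[:2, 3], (x, y))
```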
Aspects of black holes and the information paradox
NASA Astrophysics Data System (ADS)
Levi, Thomas S.
In this thesis we explore various aspects of string theory and the black hole information paradox. The thesis is divided into two parts. In the first part, we examine black holes in the context of the AdS/CFT correspondence and holography. We show how the correspondence is formulated in a time dependent background when multiple vacua exist. We explain how particle production and Hawking radiation is expressed in the dual field theory. We then investigate the rotating BTZ black hole using AdS/CFT. We show how to compute field theory correlation functions in two ways. The first involves integration over the region up to and including the inner (Cauchy) horizon. The second integrates over only the region outside the outer (event) horizon, but over a contour in the complex time plane. We then show that the inner horizon is unstable to generic perturbations and how this instability can be detected in the dual field theory. We conjecture that signatures in the complex time plane might encode information behind the horizon in the dual field theory. In the second part of the thesis we turn to the "fuzzball" conjecture where black holes are seen as emergent phenomena that arise from a coarse-graining over many smooth microstates. We present a solution generating technique for general three-charge spacetimes that are candidate microstates for finite area black holes and rings. We show these microstates have the same asymptotic behavior as black holes or black rings, but in the interior are characterized by an intricate geometry of 2-cycles we call spacetime foam.
REGIME CHANGES IN ECOLOGICAL SYSTEMS: AN INFORMATION THEORY APPROACH
We present our efforts at developing an ecological system using Information Theory. We derive an expression for Fisher Information based on sampling of the system trajectory as it evolves in the state space. The Fisher Information index as we have derived it captures the characte...
Shannon information entropy in heavy-ion collisions
NASA Astrophysics Data System (ADS)
Ma, Chun-Wang; Ma, Yu-Gang
2018-03-01
The general idea of information entropy provided by C.E. Shannon "hangs over everything we do" and can be applied to a great variety of problems once the connection between a distribution and the quantities of interest is found. The Shannon information entropy essentially quantifies the information of a quantity with its specific distribution, and information-entropy-based methods have been deeply developed in many scientific areas including physics. The dynamical properties of the heavy-ion collision (HIC) process make it difficult and complex to study nuclear matter and its evolution, and here Shannon information entropy theory can provide new methods and observables to understand the physical phenomena both theoretically and experimentally. To better understand the processes of HICs, the main characteristics of typical models, including quantum molecular dynamics models, thermodynamics models, and statistical models, etc., are briefly introduced. The typical applications of Shannon information theory in HICs are collected, which cover the chaotic behavior in the branching process of hadron collisions, the liquid-gas phase transition in HICs, and the isobaric difference scaling phenomenon for intermediate mass fragments produced in HICs of neutron-rich systems. Even though the present applications in heavy-ion collision physics are still relatively simple, they shed light on the key questions we are seeking to answer. It is suggested to further develop the information entropy methods in nuclear reaction models, as well as to develop new analysis methods to study the properties of nuclear matter in HICs, especially the evolution of the dynamical system.
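The basic operation the review relies on, the Shannon entropy of a quantity once its distribution is known, can be illustrated in a few lines. The "yields" below are made-up stand-ins for, e.g., fragment multiplicities, not data from any HIC model.

```python
import numpy as np

# Hedged illustration: Shannon information entropy of a normalized distribution.
# The counts are invented placeholders for a measured or simulated distribution.

def shannon_entropy(counts):
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()          # normalize to a probability distribution
    return -np.sum(p * np.log(p))   # entropy in nats

narrow = [0, 5, 990, 5, 0]          # sharply peaked distribution
broad  = [200, 200, 200, 200, 200]  # uniform distribution
print(shannon_entropy(narrow), shannon_entropy(broad))  # the broad distribution carries more entropy
```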
Faithful Squashed Entanglement
NASA Astrophysics Data System (ADS)
Brandão, Fernando G. S. L.; Christandl, Matthias; Yard, Jon
2011-09-01
Squashed entanglement is a measure for the entanglement of bipartite quantum states. In this paper we present a lower bound for squashed entanglement in terms of a distance to the set of separable states. This implies that squashed entanglement is faithful, that is, it is strictly positive if and only if the state is entangled. We derive the lower bound on squashed entanglement from a lower bound on the quantum conditional mutual information which is used to define squashed entanglement. The quantum conditional mutual information corresponds to the amount by which strong subadditivity of von Neumann entropy fails to be saturated. Our result therefore sheds light on the structure of states that almost satisfy strong subadditivity with equality. The proof is based on two recent results from quantum information theory: the operational interpretation of the quantum mutual information as the optimal rate for state redistribution and the interpretation of the regularised relative entropy of entanglement as an error exponent in hypothesis testing. The distance to the set of separable states is measured in terms of the LOCC norm, an operationally motivated norm giving the optimal probability of distinguishing two bipartite quantum states, each shared by two parties, using any protocol formed by local quantum operations and classical communication (LOCC) between the parties. A similar result for the Frobenius or Euclidean norm follows as an immediate consequence. The result has two applications in complexity theory. The first application is a quasipolynomial-time algorithm solving the weak membership problem for the set of separable states in LOCC or Euclidean norm. The second application concerns quantum Merlin-Arthur games. Here we show that multiple provers are not more powerful than a single prover when the verifier is restricted to LOCC operations thereby providing a new characterisation of the complexity class QMA.
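For reference, the quantity the lower bound concerns is the standard definition of squashed entanglement in terms of the quantum conditional mutual information, which the abstract invokes:

```latex
E_{\mathrm{sq}}(\rho_{AB}) \;=\; \inf_{\rho_{ABE}} \tfrac{1}{2}\, I(A;B|E)_{\rho},
\qquad
I(A;B|E) \;=\; S(AE) + S(BE) - S(ABE) - S(E),
```

where the infimum runs over all extensions ρ_ABE of ρ_AB and S denotes the von Neumann entropy; I(A;B|E) ≥ 0 is exactly the strong subadditivity inequality mentioned in the abstract.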
Information Dissemination of Public Health Emergency on Social Networks and Intelligent Computation
Hu, Hongzhi; Mao, Huajuan; Hu, Xiaohua; Hu, Feng; Sun, Xuemin; Jing, Zaiping; Duan, Yunsuo
2015-01-01
Due to its extensive social influence, the public health emergency has attracted great attention in today's society. Booming social networks are becoming a main information dissemination platform for such events and have raised high concern in emergency management, where a good prediction of information dissemination in social networks is necessary for estimating an event's social impact and making a proper strategy. However, information dissemination is largely affected by complex interactive activities and group behaviors in social networks, and the existing methods and models are limited in achieving satisfactory prediction results due to open, changeable social connections and uncertain information-processing behaviors. ACP (artificial societies, computational experiments, and parallel execution) provides an effective way to simulate the real situation. In order to obtain better information dissemination prediction in social networks, this paper proposes an intelligent computation method under the framework of TDF (Theory-Data-Feedback) based on an ACP simulation system, which was successfully applied to the analysis of the A (H1N1) Flu emergency. PMID:26609303
Quantum Algorithms for Fermionic Quantum Field Theories
2014-04-28
...operators of momentum modes. (The choice between these forms of measurement depends on the application.) 2.3 Complexity. In this section we bound the
ERIC Educational Resources Information Center
Hardeman, Wendy; Sutton, Stephen; Griffin, Simon; Johnston, Marie; White, Anthony; Wareham, Nicholas J.; Kinmonth, Ann Louise
2005-01-01
Theory-based intervention programmes to support health-related behaviour change aim to increase health impact and improve understanding of mechanisms of behaviour change. However, the science of intervention development remains at an early stage. We present a causal modelling approach to developing complex interventions for evaluation in…
Pangenesis as a source of new genetic information. The history of a now disproven theory.
Bergman, Gerald
2006-01-01
Evolution is based on natural selection of existing biological phenotypic traits. Natural selection can only eliminate traits. It cannot create new ones, requiring a theory to explain the origin of new genetic information. The theory of pangenesis was a major attempt to explain the source of new genetic information required to produce phenotypic variety. This theory, advocated by Darwin as the main source of genetic variety, has now been empirically disproved. It is currently a theory mainly of interest to science historians.
'In the eye of the beholder': perceptions of local impact in English Health Action Zones.
Sullivan, Helen; Judge, Ken; Sewel, Kate
2004-10-01
Contemporary efforts to promote population health improvement and to reduce inequalities in the UK are characterised by their complexity as they engage with a multiplicity of agencies and sectors. Additionally, the emphasis on promoting evidence-based practice has challenged evaluators tasked with collecting and interpreting evidence of impact in complex local health economies. National policy makers, local implementers and other stakeholders will have varying perspectives on impact and the Labour Government's centralising tendencies have acted to 'crowd out' local voices from the policy process. Drawing on the national evaluation of Health Action Zones (HAZ) this article 'gives voice' to local stakeholders and their perceptions of impact. Informed by a Theories of Change perspective, we explore HAZ interventions to articulate the nature of impact and its limits. We analyse the claims made by local HAZs with reference to the evidence base and examine their significance in the context of overall HAZ objectives. We conclude that local implementer perspectives are no less sophisticated than those at the policy centre of central government, but that they are informed by three important factors: the local context, a need to be pragmatic and the limited potency of evidence in the public policy system.
Using Rasch Analysis to Inform Rating Scale Development
ERIC Educational Resources Information Center
Van Zile-Tamsen, Carol
2017-01-01
The use of surveys, questionnaires, and rating scales to measure important outcomes in higher education is pervasive, but reliability and validity information is often based on problematic Classical Test Theory approaches. Rasch Analysis, based on Item Response Theory, provides a better alternative for examining the psychometric quality of rating…
Concepts and Measurements for Manpower and Occupational Analysis.
ERIC Educational Resources Information Center
Scoville, James G.
This volume contains information on occupational data and their uses, jobs-theories, case studies, and improved data bases. A survey was made of current applications of occupational information data and conceptual bases and practical shortcomings of the more frequently used classification systems. In addition, an economic theory was developed to…
EPR & Klein Paradoxes in Complex Hamiltonian Dynamics and Krein Space Quantization
NASA Astrophysics Data System (ADS)
Payandeh, Farrin
2015-07-01
Negative energy states are applied in the Krein space quantization approach to achieve a naturally renormalized theory. For example, by taking the full set of Dirac solutions, this theory is able to remove the divergences of the propagator Green function and, automatically and without any normal ordering, to make the expectation value of the vacuum state energy vanish. However, since it is a purely mathematical theory, the results are under debate and some efforts are devoted to including more physics in the concept. Whereas Krein quantization is a purely mathematical approach, complex quantum Hamiltonian dynamics is based on the strong foundations of Hamilton-Jacobi (H-J) equations and therefore on classical dynamics. In complex quantum Hamilton-Jacobi theory, complex spacetime is a natural consequence of including quantum effects in relativistic mechanics and is a bridge connecting causality in special relativity and non-locality in quantum mechanics, i.e. extending special relativity to the complex domain leads to relativistic quantum mechanics. Thus, considering both relativistic and quantum effects, the Klein-Gordon equation can be derived as a special form of the Hamilton-Jacobi equation. Characterizing the complex time involved in an entangled energy state and writing the general form of the energy with the quantum potential included, two sets of positive and negative energies are obtained. The new states enable us to study spacetime in a relativistic entangled "space-time" state, leading to 12 extra wave functions beyond the four solutions of the Dirac equation for a free particle. Treating the particle and antiparticle as entangled leads to a contradiction with experiments. So, in order to correct the results, and in line with a previous investigation [1], we treat particles and antiparticles as physical entities with positive energy instead of considering antiparticles with negative energy. As an application of the modified description of entangled (space-time) states, the original version of the EPR paradox can be discussed and the correct answer verified based on the strongly rooted complex quantum Hamilton-Jacobi theory [2-27]; as another example, the negative energy states can be used to remove Klein's paradox without the need for any further explanations or justifications such as backwardly moving electrons. Finally, comparing the two approaches, we point out the existence of a connection between quantum Hamiltonian dynamics, standard quantum field theory, and Krein space quantization [28-43].
Sender–receiver systems and applying information theory for quantitative synthetic biology
Barcena Menendez, Diego; Senthivel, Vivek Raj; Isalan, Mark
2015-01-01
Sender–receiver (S–R) systems abound in biology, with communication systems sending information in various forms. Information theory provides a quantitative basis for analysing these processes and is being applied to study natural genetic, enzymatic and neural networks. Recent advances in synthetic biology are providing us with a wealth of artificial S–R systems, giving us quantitative control over networks with a finite number of well-characterised components. Combining the two approaches can help to predict how to maximise signalling robustness, and will allow us to make increasingly complex biological computers. Ultimately, pushing the boundaries of synthetic biology will require moving beyond engineering the flow of information and towards building more sophisticated circuits that interpret biological meaning. PMID:25282688
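The quantitative basis referred to here is, at bottom, the mutual information between sender and receiver. A minimal sketch for a two-state discrete channel is given below; the input distribution and the 10% read-out error are illustrative choices, not values from any characterized circuit.

```python
import numpy as np

# Hedged sketch: mutual information I(S;R) between a sender S and a receiver R
# for a simple discrete channel.  All probabilities are illustrative.

def mutual_information(p_s, channel):
    """p_s: P(S); channel[i, j] = P(R=j | S=i).  Returns I(S;R) in bits."""
    p_joint = p_s[:, None] * channel
    p_r = p_joint.sum(axis=0)
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.where(p_joint > 0, p_joint / (p_s[:, None] * p_r[None, :]), 1.0)
    return float(np.sum(p_joint * np.log2(ratio)))

p_s = np.array([0.5, 0.5])                     # sender states, e.g. inducer off/on
noisy = np.array([[0.9, 0.1],
                  [0.1, 0.9]])                 # receiver read-out with 10% error
print(mutual_information(p_s, noisy))          # ~0.53 bits out of a possible 1 bit
```

Maximizing this quantity over input distributions gives the channel capacity, which is one way of framing the "signalling robustness" the abstract mentions.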
A study on locating the sonic source of sinusoidal magneto-acoustic signals using a vector method.
Zhang, Shunqi; Zhou, Xiaoqing; Ma, Ren; Yin, Tao; Liu, Zhipeng
2015-01-01
Methods based on the magneto-acoustic effect are of great significance in studying the electrical imaging properties of biological tissues and currents. The commonly used continuous wave method can only detect the current amplitude, not the sound source position. Although the pulse mode adopted in magneto-acoustic imaging can locate the sonic source, its low measuring accuracy and low SNR have limited its application. In this study, a vector method was used to solve and analyze the magneto-acoustic signal based on the continuous sine wave mode. The study includes theoretical modeling of the vector method, simulations of a line model, and experiments with wire samples to analyze the magneto-acoustic (MA) signal characteristics. The results showed that the amplitude and phase of the MA signal contained the location information of the sonic source, and that the amplitude and phase obeyed vector theory in the complex plane. This study sets a foundation for a new technique to locate sonic sources for biomedical imaging of tissue conductivity. It also aids in studying biological current detection and reconstruction based on the magneto-acoustic effect.
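The core idea, that a continuous-wave signal can be treated as a phasor whose phase encodes the acoustic travel time and hence the source distance, can be sketched as follows. The sound speed, frequency and distance are illustrative values, not the paper's experimental parameters, and a single-frequency phase only fixes the distance modulo one wavelength.

```python
import numpy as np

# Hedged sketch of the phasor idea behind the vector method.  All numbers are
# illustrative placeholders, not the paper's setup.

c = 1500.0          # assumed speed of sound, m/s
f = 1.0e6           # excitation frequency, Hz
d_true = 0.0123     # assumed source-to-transducer distance, m

phase = 2 * np.pi * f * d_true / c          # phase delay accumulated over travel
signal = 0.8 * np.exp(1j * phase)           # measured complex amplitude (phasor)

wavelength = c / f
d_est = (np.angle(signal) % (2 * np.pi)) * c / (2 * np.pi * f)
print(d_true % wavelength, d_est)           # distance recovered modulo one wavelength
```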
Naval Postgraduate School Research. Volume 8, Number 3, October 1998
1998-10-01
the Bangor Submarine Base: "Understanding Racism" and "Understanding Sexism." These two-day workshops are part of a four-workshop series on Managing... organization theory and complexity theory and shaping them into design guidelines for mapping command and control processes to the needs of specific missions... Intranet-based decision support for the ACE. The methodology combines systems development life cycle (SDLC) practices, command and control theory, an
Complex basis functions for molecular resonances: Methodology and applications
NASA Astrophysics Data System (ADS)
White, Alec; McCurdy, C. William; Head-Gordon, Martin
The computation of positions and widths of metastable electronic states is a challenge for molecular electronic structure theory because, in addition to the difficulty of the many-body problem, such states obey scattering boundary conditions. These resonances cannot be addressed with naïve application of traditional bound state electronic structure theory. Non-Hermitian electronic structure methods employing complex basis functions is one way that we may rigorously treat resonances within the framework of traditional electronic structure theory. In this talk, I will discuss our recent work in this area including the methodological extension from single determinant SCF-based approaches to highly correlated levels of wavefunction-based theory such as equation of motion coupled cluster and many-body perturbation theory. These approaches provide a hierarchy of theoretical methods for the computation of positions and widths of molecular resonances. Within this framework, we may also examine properties of resonances including the dependence of these parameters on molecular geometry. Some applications of these methods to temporary anions and dianions will also be discussed.
Information properties of morphologically complex words modulate brain activity during word reading
Hultén, Annika; Lehtonen, Minna; Lagus, Krista; Salmelin, Riitta
2018-01-01
Abstract Neuroimaging studies of the reading process point to functionally distinct stages in word recognition. Yet, current understanding of the operations linked to those various stages is mainly descriptive in nature. Approaches developed in the field of computational linguistics may offer a more quantitative approach for understanding brain dynamics. Our aim was to evaluate whether a statistical model of morphology, with well‐defined computational principles, can capture the neural dynamics of reading, using the concept of surprisal from information theory as the common measure. The Morfessor model, created for unsupervised discovery of morphemes, is based on the minimum description length principle and attempts to find optimal units of representation for complex words. In a word recognition task, we correlated brain responses to word surprisal values derived from Morfessor and from other psycholinguistic variables that have been linked with various levels of linguistic abstraction. The magnetoencephalography data analysis focused on spatially, temporally and functionally distinct components of cortical activation observed in reading tasks. The early occipital and occipito‐temporal responses were correlated with parameters relating to visual complexity and orthographic properties, whereas the later bilateral superior temporal activation was correlated with whole‐word based and morphological models. The results show that the word processing costs estimated by the statistical Morfessor model are relevant for brain dynamics of reading during late processing stages. PMID:29524274
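The common measure used in the study, surprisal, is simply the negative log-probability of a unit under some statistical model. The toy frequency counts below stand in for a real corpus model such as Morfessor; they are not the study's data.

```python
import math

# Hedged sketch: word surprisal (in bits) from a toy frequency model.
# The counts are invented; a real analysis would use Morfessor or a corpus model.

corpus_counts = {"cat": 900, "cats": 90, "catlike": 9, "catness": 1}
total = sum(corpus_counts.values())

def surprisal_bits(word):
    return -math.log2(corpus_counts[word] / total)

for w in corpus_counts:
    print(w, round(surprisal_bits(w), 2))   # rarer / more complex forms -> higher surprisal
```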
NASA Astrophysics Data System (ADS)
Xing, Lizhi; Dong, Xianlei; Guan, Jun
2017-04-01
The input-output table describes the national economic system comprehensively and in detail, capturing the supply and demand relationships among industrial sectors. Complex network theory, a framework for measuring the structure of complex systems, can describe the internal structural characteristics of a research object through structural indicators of the social and economic system, revealing the relationship between its inner hierarchy and its external economic function. This paper builds GIVCN-WIOT models based on the World Input-Output Database in order to depict the topological structure of the Global Value Chain (GVC), and assumes that the competitive advantage of a nation equals the overall impact of its domestic sectors on the GVC. From an econophysics perspective, the Global Industrial Impact Coefficient (GIIC) is proposed to measure national competitiveness in gaining information superiority and intermediate interests. Analysis of the GIVCN-WIOT models yields several insights, including the following: (1) sectors with higher Random Walk Centrality contribute more to transmitting value streams within the global economic system; (2) the Half-Value Ratio can be used to measure the robustness of open-economy macroeconomics in the process of globalization; and (3) the positive correlation between GIIC and GDP indicates that a country's global industrial impact reveals its international competitive advantage.
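The first step of such an analysis, turning an input-output table into a weighted directed network, can be sketched as below. The three-sector table is invented, and PageRank is used purely as an illustrative proxy for a sector-impact measure; it is not the paper's Random Walk Centrality or GIIC.

```python
import numpy as np
import networkx as nx

# Hedged sketch: build a weighted directed network from a toy input-output table
# and rank sectors with PageRank as a stand-in impact measure (not the GIIC).

sectors = ["Agriculture", "Manufacturing", "Services"]
Z = np.array([[10.0, 40.0,  5.0],     # Z[i, j] = intermediate flow from sector i to sector j
              [20.0, 60.0, 30.0],
              [ 5.0, 25.0, 15.0]])

G = nx.DiGraph()
for i, src in enumerate(sectors):
    for j, dst in enumerate(sectors):
        if Z[i, j] > 0:
            G.add_edge(src, dst, weight=Z[i, j])

impact = nx.pagerank(G, weight="weight")
for s in sorted(impact, key=impact.get, reverse=True):
    print(s, round(impact[s], 3))
```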
Information properties of morphologically complex words modulate brain activity during word reading.
Hakala, Tero; Hultén, Annika; Lehtonen, Minna; Lagus, Krista; Salmelin, Riitta
2018-06-01
Neuroimaging studies of the reading process point to functionally distinct stages in word recognition. Yet, current understanding of the operations linked to those various stages is mainly descriptive in nature. Approaches developed in the field of computational linguistics may offer a more quantitative approach for understanding brain dynamics. Our aim was to evaluate whether a statistical model of morphology, with well-defined computational principles, can capture the neural dynamics of reading, using the concept of surprisal from information theory as the common measure. The Morfessor model, created for unsupervised discovery of morphemes, is based on the minimum description length principle and attempts to find optimal units of representation for complex words. In a word recognition task, we correlated brain responses to word surprisal values derived from Morfessor and from other psycholinguistic variables that have been linked with various levels of linguistic abstraction. The magnetoencephalography data analysis focused on spatially, temporally and functionally distinct components of cortical activation observed in reading tasks. The early occipital and occipito-temporal responses were correlated with parameters relating to visual complexity and orthographic properties, whereas the later bilateral superior temporal activation was correlated with whole-word based and morphological models. The results show that the word processing costs estimated by the statistical Morfessor model are relevant for brain dynamics of reading during late processing stages. © 2018 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.
Multiscale structure in eco-evolutionary dynamics
NASA Astrophysics Data System (ADS)
Stacey, Blake C.
In a complex system, the individual components are neither so tightly coupled or correlated that they can all be treated as a single unit, nor so uncorrelated that they can be approximated as independent entities. Instead, patterns of interdependency lead to structure at multiple scales of organization. Evolution excels at producing such complex structures. In turn, the existence of these complex interrelationships within a biological system affects the evolutionary dynamics of that system. I present a mathematical formalism for multiscale structure, grounded in information theory, which makes these intuitions quantitative, and I show how dynamics defined in terms of population genetics or evolutionary game theory can lead to multiscale organization. For complex systems, "more is different," and I address this from several perspectives. Spatial host--consumer models demonstrate the importance of the structures which can arise due to dynamical pattern formation. Evolutionary game theory reveals the novel effects which can result from multiplayer games, nonlinear payoffs and ecological stochasticity. Replicator dynamics in an environment with mesoscale structure relates to generalized conditionalization rules in probability theory. The idea of natural selection "acting at multiple levels" has been mathematized in a variety of ways, not all of which are equivalent. We will face down the confusion, using the experience developed over the course of this thesis to clarify the situation.
NASA Astrophysics Data System (ADS)
Takemura, Kazuhiro; Guo, Hao; Sakuraba, Shun; Matubayasi, Nobuyuki; Kitao, Akio
2012-12-01
We propose a method to evaluate binding free energy differences among distinct protein-protein complex model structures through all-atom molecular dynamics simulations in explicit water using the solution theory in the energy representation. Complex model structures are generated from a pair of monomeric structures using the rigid-body docking program ZDOCK. After structure refinement by side chain optimization and all-atom molecular dynamics simulations in explicit water, complex models are evaluated based on the sum of their conformational and solvation free energies, the latter calculated from the energy distribution functions obtained from relatively short molecular dynamics simulations of the complex in water and of pure water based on the solution theory in the energy representation. We examined protein-protein complex model structures of two protein-protein complex systems, bovine trypsin/CMTI-1 squash inhibitor (PDB ID: 1PPE) and RNase SA/barstar (PDB ID: 1AY7), for which both complex and monomer structures were determined experimentally. For each system, we calculated the energies for the crystal complex structure and twelve generated model structures including the model most similar to the crystal structure and very different from it. In both systems, the sum of the conformational and solvation free energies tended to be lower for the structure similar to the crystal. We concluded that our energy calculation method is useful for selecting low energy complex models similar to the crystal structure from among a set of generated models.
Formulation and closure of compressible turbulence equations in the light of kinetic theory
NASA Technical Reports Server (NTRS)
Tsuge, S.; Sagara, K.
1976-01-01
Fluid-dynamic moment equations, based on a kinetic hierarchy system, are derived governing the interaction between turbulent and thermal fluctuations. The kinetic theory is shown to reduce the inherent complexity of the conventional formalism of compressible turbulence theory and to minimize arbitrariness in formulating the closure condition.
Hankey, Alex
2015-12-01
In the late 19th century Husserl studied our internal sense of time passing, maintaining that its deep connections into experience represent prima facie evidence for it as the basis for all investigations in the sciences: Phenomenology was born. Merleau-Ponty focused on perception pointing out that any theory of experience must accord with established aspects of biology i.e. be embodied. Recent analyses suggest that theories of experience require non-reductive, integrative information, together with a specific property connecting them to experience. Here we elucidate a new class of information states with just such properties found at the loci of control of complex biological systems, including nervous systems. Complexity biology concerns states satisfying self-organized criticality. Such states are located at critical instabilities, commonly observed in biological systems, and thought to maximize information diversity and processing, and hence to optimize regulation. Major results for biology follow: why organisms have unusually low entropies; and why they are not merely mechanical. Criticality states form singular self-observing systems, which reduce wave packets by processes of perfect self-observation associated with feedback gain g = 1. Analysis of their information properties leads to identification of a new kind of information state with high levels of internal coherence, and feedback loops integrated into their structure. The major idea presented here is that the integrated feedback loops are responsible for our 'sense of self', and also the feeling of continuity in our sense of time passing. Long-range internal correlations guarantee a unique kind of non-reductive, integrative information structure enabling such states to naturally support phenomenal experience. Being founded in complexity biology, they are 'embodied'; they also fulfill the statement that 'The self is a process', a singular process. High internal correlations and René Thom-style catastrophes support non-digital forms of information, gestalt cognition, and information transfer via quantum teleportation. Criticality in complexity biology can 'embody' cognitive states supporting gestalts, and phenomenology's senses of 'self,' time passing, existence and being. Copyright © 2015. Published by Elsevier Ltd.
Collective learning modeling based on the kinetic theory of active particles.
Burini, D; De Lillo, S; Gibelli, L
2016-03-01
This paper proposes a systems approach to the theory of perception and learning in populations composed of many living entities. Starting from a phenomenological description of these processes, a mathematical structure is derived which is deemed to incorporate their complexity features. The modeling is based on a generalization of kinetic theory methods where interactions are described by theoretical tools of game theory. As an application, the proposed approach is used to model the learning processes that take place in a classroom. Copyright © 2015 Elsevier B.V. All rights reserved.
Modeling of Wall-Bounded Complex Flows and Free Shear Flows
NASA Technical Reports Server (NTRS)
Shih, Tsan-Hsing; Zhu, Jiang; Lumley, John L.
1994-01-01
Various wall-bounded flows with complex geometries and free shear flows have been studied with a newly developed realizable Reynolds stress algebraic equation model. The model development is based on the invariant theory in continuum mechanics. This theory enables us to formulate a general constitutive relation for the Reynolds stresses. Pope was the first to introduce this kind of constitutive relation to turbulence modeling. In our study, realizability is imposed on the truncated constitutive relation to determine the coefficients so that, unlike the standard k-ε eddy viscosity model, the present model will not produce negative normal stresses in any situations of rapid distortion. The calculations based on the present model have shown an encouraging success in modeling complex turbulent flows.
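As a point of reference (not the paper's full nonlinear algebraic relation), the simplest member of this constitutive family is the linear eddy-viscosity form, and realizability is the requirement that the modeled normal stresses remain non-negative:

```latex
\overline{u_i u_j} \;=\; \tfrac{2}{3}\,k\,\delta_{ij} \;-\; 2\,\nu_t\,S_{ij},
\qquad
S_{ij} \;=\; \tfrac{1}{2}\!\left(\frac{\partial U_i}{\partial x_j} + \frac{\partial U_j}{\partial x_i}\right),
```

with $\overline{u_\alpha u_\alpha} \ge 0$ (no summation) for any rate of strain; the realizable algebraic model constrains its coefficients so that this condition holds even under rapid distortion, which the standard linear form does not guarantee.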
NASA Astrophysics Data System (ADS)
Long, Nicholas James
This thesis develops a preliminary foundational methodology for evaluating the static complexity of future lunar oxygen production systems when extensive information is not yet available about the various systems under consideration. Evaluating static complexity, as part of an overall system complexity analysis, is an important consideration in ultimately selecting a process to be used in a lunar base. When system complexity is higher, there is generally an overall increase in risk, which could impact the safety of astronauts and the economic performance of the mission. To evaluate static complexity in lunar oxygen production, static complexity is simplified and defined in terms of its essential components. First, three essential dimensions of static complexity are investigated: interconnective complexity, strength of connections, and complexity in variety. Then a set of methods is developed with which to evaluate each dimension separately. Q-connectivity analysis is proposed as a means to evaluate interconnective complexity and strength of connections. The law of requisite variety, originating from cybernetic theory, is suggested to interpret complexity in variety. Second, a means to aggregate the results of each analysis is proposed to create a holistic measurement of static complexity using the Single Multi-Attribute Ranking Technique (SMART). Each method of static complexity analysis and the aggregation technique is demonstrated using notional data for four lunar oxygen production processes.
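The aggregation step can be sketched as a SMART-style weighted sum over normalized scores for the three static-complexity dimensions. The process names, scores and weights below are invented placeholders, not the thesis's notional data.

```python
# Hedged sketch of a SMART-style aggregation of three complexity-dimension scores.
# All names, weights and scores are illustrative assumptions.

weights = {"interconnective": 0.4, "strength": 0.3, "variety": 0.3}   # sum to 1

processes = {
    "Process A": {"interconnective": 0.7, "strength": 0.5, "variety": 0.6},
    "Process B": {"interconnective": 0.4, "strength": 0.6, "variety": 0.3},
}

def smart_score(scores):
    return sum(weights[d] * scores[d] for d in weights)

for name, scores in processes.items():
    print(name, round(smart_score(scores), 3))   # lower aggregate = less static complexity
```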
New infinite-dimensional hidden symmetries for heterotic string theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao Yajun
The symmetry structures of two-dimensional heterotic string theory are studied further. A (2d+n)x(2d+n) matrix complex H-potential is constructed and the field equations are extended into a complex matrix formulation. A pair of Hauser-Ernst-type linear systems are established. Based on these linear systems, explicit formulations of new hidden symmetry transformations for the considered theory are given and then these symmetry transformations are verified to constitute infinite-dimensional Lie algebras: the semidirect product of the Kac-Moody algebra ô(d, d+n) and the Virasoro algebra (without center charges). These results demonstrate that the heterotic string theory under consideration possesses more and richer symmetry structures than previously expected.
Some thoughts about consciousness: from a quantum mechanics perspective.
Gargiulo, Gerald J
2013-08-01
The article explores some of the basic findings of quantum physics and information theory and their possible usefulness in offering new vistas for understanding psychoanalysis and the patient-analyst interchange. Technical terms are explained and placed in context, and examples of applying quantum models to clinical experience are offered. Given the complexity of the findings of quantum mechanics and information theory, the article aims only to introduce some of the major concepts from these disciplines. Within this framework the article also briefly addresses the question of mind as well as the problematic of reducing the experience of consciousness to neurological brain functioning.
An integrative health information systems approach for facilitating strategic planning in hospitals.
Killingsworth, Brenda; Newkirk, Henry E; Seeman, Elaine
2006-01-01
This article presents a framework for developing strategic information systems (SISs) for hospitals. It proposes a SIS formulation process which incorporates complexity theory, strategic/organizational analysis theory, and conventional MIS development concepts. Within the formulation process, four dimensions of SIS are proposed as well as an implementation plan. A major contribution of this article is the development of a hospital SIS framework which permits an organization to fluidly respond to external, interorganizational, and intraorganizational influences. In addition, this article offers a checklist which managers can utilize in developing an SIS in health care.
A scale-free systems theory of motivation and addiction.
Chambers, R Andrew; Bickel, Warren K; Potenza, Marc N
2007-01-01
Scale-free organizations, characterized by uneven distributions of linkages between nodal elements, describe the structure and function of many life-based complex systems developing under evolutionary pressures. We explore motivated behavior as a scale-free map toward a comprehensive translational theory of addiction. Motivational and behavioral repertoires are reframed as link and nodal element sets, respectively, comprising a scale-free structure. These sets are generated by semi-independent information-processing streams within cortical-striatal circuits that cooperatively provide decision-making and sequential processing functions necessary for traversing maps of motivational links connecting behavioral nodes. Dopamine modulation of cortical-striatal plasticity serves a central-hierarchical mechanism for survival-adaptive sculpting and development of motivational-behavioral repertoires by guiding a scale-free design. Drug-induced dopamine activity promotes drug taking as a highly connected behavioral hub at the expense of natural-adaptive motivational links and behavioral nodes. Conceptualizing addiction as pathological alteration of scale-free motivational-behavioral repertoires unifies neurobiological, neurocomputational and behavioral research while addressing addiction vulnerability in adolescence and psychiatric illness. This model may inform integrative research in defining more effective prevention and treatment strategies for addiction.
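The structural claim, that a scale-free organization has an uneven distribution of linkages dominated by a few highly connected hubs, can be illustrated with a synthetic preferential-attachment graph. The graph below is generated with networkx and is not derived from any behavioral data; it simply shows the hub-versus-majority asymmetry the model invokes when describing drug taking as a dominant behavioral hub.

```python
import networkx as nx
from collections import Counter

# Hedged illustration: degree statistics of a Barabasi-Albert scale-free network.
# Synthetic graph only; parameters n and m are arbitrary choices.

G = nx.barabasi_albert_graph(n=1000, m=2, seed=42)
degrees = [d for _, d in G.degree()]
hist = Counter(degrees)
print("max degree (hub):", max(degrees))
print("median degree:", sorted(degrees)[len(degrees) // 2])
print("nodes with the minimum degree of 2:", hist[2])   # the large majority of nodes
```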
Renewal Processes in the Critical Brain
NASA Astrophysics Data System (ADS)
Allegrini, Paolo; Paradisi, Paolo; Menicucci, Danilo; Gemignani, Angelo
We describe herein multidisciplinary research that develops and applies concepts of the theory of complexity, in turn stemming from recent advancements of statistical physics, to cognitive neuroscience. We discuss (define) complexity, and how the human brain is a paradigm of it. We discuss how the hypothesis that brain activity dynamically behaves as a critical system is gaining momentum in the literature, and then focus on a feature of critical systems (hence of the brain), which is the intermittent passage between metastable states, marked by events that locally reset the memory but give rise to correlation functions with infinite correlation times. The events, extracted from multi-channel ElectroEncephaloGrams, mark (are interpreted as) a birth/death process of cooperation, namely of system elements being recruited into collective states. Finally we discuss a recently discovered form of control (in the form of a new Linear Response Theory) that allows optimized information transmission between complex systems, named Complexity Matching.
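The intermittent, memory-resetting event statistics described here are those of a renewal process with heavy-tailed waiting times. A minimal sketch of generating such a process is given below; the power-law exponent and sample size are illustrative choices, not values fitted to EEG data.

```python
import numpy as np

# Hedged sketch: a renewal process with power-law waiting times psi(t) ~ t**(-mu).
# Exponent and sample size are illustrative; this is not an EEG analysis.

rng = np.random.default_rng(0)
mu = 2.1                               # waiting-time exponent
n_events = 100_000

# Inverse-transform sampling from psi(t) = (mu - 1) * t**(-mu), t >= 1
u = 1.0 - rng.random(n_events)         # uniform in (0, 1]
waits = u ** (-1.0 / (mu - 1.0))
event_times = np.cumsum(waits)         # the renewal (event) times

print("mean waiting time:", waits.mean())     # finite only for mu > 2
print("longest waiting time:", waits.max())   # heavy tail: rare, very long pauses
```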
Interprofessional communication and medical error: a reframing of research questions and approaches.
Varpio, Lara; Hall, Pippa; Lingard, Lorelei; Schryer, Catherine F
2008-10-01
Progress toward understanding the links between interprofessional communication and issues of medical error has been slow. Recent research proposes that this delay may result from overlooking the complexities involved in interprofessional care. Medical education initiatives in this domain tend to simplify the complexities of team membership fluidity, rotation, and use of communication tools. A new theoretically informed research approach is required to take into account these complexities. To generate such an approach, we review two theories from the social sciences: Activity Theory and Knotworking. Using these perspectives, we propose that research into interprofessional communication and medical error can develop better understandings of (1) how and why medical errors are generated and (2) how and why gaps in team defenses occur. Such complexities will have to be investigated if students and practicing clinicians are to be adequately prepared to work safely in interprofessional teams.
Establishing a research agenda for scientific and technical information (STI) - Focus on the user
NASA Technical Reports Server (NTRS)
Pinelli, Thomas E.
1992-01-01
This report addresses the relationship between library science and information science theory and practice, between the development of conceptual understanding, and the practical competence of information professionals. Consideration is given to the concept of research, linking theory with practice, and the reality of theory based practice. Attention is given to the need for research and research priorities, focus on the user and information-seeking behavior, and a user-oriented research agenda for STI.
Poltev, V I; Anisimov, V M; Sanchez, C; Deriabina, A; Gonzalez, E; Garcia, D; Rivas, F; Polteva, N A
2016-01-01
It is generally accepted that the important characteristic features of the Watson-Crick duplex originate from the molecular structure of its subunits. However, it still remains to elucidate what properties of each subunit are responsible for the significant characteristic features of the DNA structure. Computations of desoxydinucleoside monophosphate complexes with Na-ions using density functional theory revealed a pivotal role of the conformational properties of single-chain minimal DNA fragments in the development of the unique features of the Watson-Crick duplex. We found that the directionality of the sugar-phosphate backbone and the preferable ranges of its torsion angles, combined with the difference between purine and pyrimidine ring bases, define the dependence of the three-dimensional structure of the Watson-Crick duplex on the nucleotide base sequence. In this work, we extended these density functional theory computations to the minimal fragments of the DNA duplex, complementary desoxydinucleoside monophosphate complexes with Na-ions. Using several computational methods and various functionals, we performed a search for energy minima of the BI-conformation for complementary desoxydinucleoside monophosphate complexes with different nucleoside sequences. Two sequences were optimized using the ab initio method at the MP2/6-31++G** level of theory. The analysis of torsion angles, sugar ring puckering and mutual base positions of the optimized structures demonstrates that the conformational characteristics of complementary desoxydinucleoside monophosphate complexes with Na-ions remain within BI ranges and become closer to the corresponding characteristics of Watson-Crick duplex crystals. Qualitatively, the main characteristic features of each studied complementary desoxydinucleoside monophosphate complex remain invariant when different computational methods are used, although the quantitative values of some conformational parameters can vary within the limits typical for the corresponding family. We observe that popular functionals in density functional theory calculations lead to overestimated distances between base pairs, while MP2 computations and the newer complex functionals produce structures with too-close atom-atom contacts. A detailed study of some complementary desoxydinucleoside monophosphate complexes with Na-ions highlights the existence of several energy minima corresponding to BI-conformations, in other words, the complexity of the relief pattern of the potential energy surface of complementary desoxydinucleoside monophosphate complexes. This accounts for the variability of the conformational parameters of duplex fragments with the same base sequence. The popular molecular mechanics force fields AMBER and CHARMM reproduce most of the conformational characteristics of desoxydinucleoside monophosphates and their complementary complexes with Na-ions but fail to reproduce some details of the dependence of the Watson-Crick duplex conformation on the nucleotide sequence.
The Role of Materiality in Apprenticeships: The Case of the Suame Magazine, Kumasi, Ghana
ERIC Educational Resources Information Center
Jaarsma, Thomas; Maat, Harro; Richards, Paul; Wals, Arjen
2011-01-01
Although the concept of the apprenticeship seems to be universal, its institutional form and status differ around the world. This article discusses informal apprenticeship training as it occurs among car mechanics in the informal industrial complex of the Suame Magazine, Kumasi, Ghana. Using on-site research and theories of social learning and…
Understanding the Online Informal Learning of English as a Complex Dynamic System: An Emic Approach
ERIC Educational Resources Information Center
Sockett, Geoffrey
2013-01-01
Research into the online informal learning of English has already shown it to be a widespread phenomenon involving a range of comprehension and production activities such as viewing original version television series, listening to music on demand and social networking with other English users. Dynamic systems theory provides a suitable framework…
An Interdisciplinary Network Making Progress on Climate Change Communication
NASA Astrophysics Data System (ADS)
Spitzer, W.; Anderson, J. C.; Bales, S.; Fraser, J.; Yoder, J. A.
2012-12-01
Public understanding of climate change continues to lag far behind the scientific consensus not merely because the public lacks information, but because there is in fact too much complex and contradictory information available. Fortunately, we can now (1) build on careful empirical cognitive and social science research to understand what people already value, believe, and understand; and then (2) design and test strategies for translating complex science so that people can examine evidence, make well-informed inferences, and embrace science-based solutions. Informal science education institutions can help bridge the gap between climate scientists and the public. In the US, more than 1,500 informal science venues (science centers, museums, aquariums, zoos, nature centers, national parks, etc.) are visited annually by 61% of the population. Extensive research shows that these visitors are receptive to learning about climate change and trust these institutions as reliable sources. Ultimately, we need to take a strategic approach to the way climate change is communicated. An interdisciplinary approach is needed to bring together three key areas of expertise (as recommended by Pidgeon and Fischhoff, 2011): 1. Climate and decision science experts - who can summarize and explain what is known, characterize risks, and describe appropriate mitigation and adaptation strategies; 2. Social scientists - who can bring to bear research, theory, and best practices from cognitive, communication, knowledge acquisition, and social learning theory; and 3. Informal educators and program designers - who bring a practitioner perspective and can exponentially facilitate a learning process for additional interpreters. With support from an NSF CCEP Phase I grant, we have tested this approach, bringing together Interdisciplinary teams of colleagues for a five month "study circles" to develop skills to communicate climate change based on research in the social and cognitive sciences. In 2011, social scientists, Ph.D. students studying oceanography, and staff from more than 20 institutions that teach science to the public came together in these learning groups. Most participants were motivated to create new or revised training or public programs based on lessons learned together. The success of this program rests on a twofold approach that combines collaborative learning with a cognitive and social sciences research based approach to communications. The learning process facilitated trust and experimentation among co-learners to practice applications for communications that has continued beyond the study circle experience through the networks established during the process. Examples drawn from the study circle outputs suggest that this approach could have a transformative impact on informal science education on a broad scale. Ultimately, we envision informal science interpreters as "vectors" for effective science communication, ocean and climate scientists with enhanced communication skills, and increased public demand for explanation and dialogue about global issues.
ERIC Educational Resources Information Center
Fielding-Wells, Jill; O'Brien, Mia; Makar, Katie
2017-01-01
Inquiry-based learning (IBL) is a pedagogical approach in which students address complex, ill-structured problems set in authentic contexts. While IBL is gaining ground in Australia as an instructional practice, there has been little research that considers implications for student motivation and engagement. Expectancy-value theory (Eccles and…
Reducing the Requirements and Cost of Astronomical Telescopes
NASA Technical Reports Server (NTRS)
Smith, W. Scott; Whitakter, Ann F. (Technical Monitor)
2002-01-01
Limits on astronomical telescope apertures are being rapidly approached. These limits result from logistics, increasing complexity, and finally budgetary constraints. From a historical perspective, great strides have been made in the areas of aperture, adaptive optics, wavefront sensors, detectors, stellar interferometers, and image reconstruction. What will be the next advances? Emerging data analysis techniques based on communication theory hold the promise of yielding more information from observational data through significant computer post-processing. This paper explores some of the current telescope limitations and ponders the possibility of increasing the yield of scientific data through the migration of computer post-processing techniques to higher dimensions. Some of these processes hold the promise of reducing the requirements on the basic telescope hardware, making the next generation of instruments more affordable.
Complete information acquisition in scanning probe microscopy
Belianinov, Alex; Kalinin, Sergei V.; Jesse, Stephen
2015-03-13
In the last three decades, scanning probe microscopy (SPM) has emerged as a primary tool for exploring and controlling the nanoworld. A critical part of SPM measurements is the information transfer from the tip-surface junction to a macroscopic measurement system. This process reduces the many degrees of freedom of a vibrating cantilever to relatively few parameters recorded as images. Similarly, the details of dynamic cantilever response at sub-microsecond time scales of transients, higher-order eigenmodes and harmonics are averaged out by transitioning to the millisecond time scale of pixel acquisition. Hence, the amount of information available to the external observer is severely limited, and its selection is biased by the chosen data processing method. Here, we report a fundamentally new approach for SPM imaging based on information theory-type analysis of the data stream from the detector. This approach allows full exploration of complex tip-surface interactions, spatial mapping of the multidimensional variability of material properties and their mutual interactions, and SPM imaging at the information channel capacity limit.
Hybrid modeling and empirical analysis of automobile supply chain network
NASA Astrophysics Data System (ADS)
Sun, Jun-yan; Tang, Jian-ming; Fu, Wei-ping; Wu, Bing-ying
2017-05-01
Based on a connection mechanism in which nodes automatically select upstream and downstream agents, a simulation model of the dynamic evolutionary process of a consumer-driven automobile supply chain is established by integrating agent-based modeling (ABM) and discrete modeling on a GIS-based map. First, the model's soundness is verified by analyzing the consistency of sales and of changes in various agent parameters between the simulation model and a real automobile supply chain. Second, using complex network theory, the hierarchical structures of the model and the relationships among networks at different levels are analyzed to calculate characteristic parameters such as the mean distance, mean clustering coefficient, and degree distribution; these analyses verify that the model is a typical scale-free, small-world network. Finally, the motion law of the model is analyzed from the perspective of complex adaptive systems. The chaotic state of the simulation system is verified, which suggests that the system has typical nonlinear characteristics. The model not only macroscopically illustrates the dynamic evolution of the complex network of an automobile supply chain but also microcosmically reflects the business process of each agent. Moreover, constructing and simulating the system by combining CAS theory and complex networks supplies a novel method for supply chain analysis, as well as a theoretical basis and practical experience for the supply chain analysis of auto companies.
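As an illustration of the network statistics named above (mean distance, mean clustering coefficient, degree distribution), the following minimal sketch computes them with the networkx library; the Barabasi-Albert generator is only a stand-in for the simulated supply-chain network and is an assumption, not the authors' model.

```python
# Sketch: computing the characteristic network parameters for a stand-in graph.
# The Barabasi-Albert graph is a placeholder, not the simulated supply chain.
import networkx as nx
from collections import Counter

G = nx.barabasi_albert_graph(n=500, m=2, seed=42)

mean_distance = nx.average_shortest_path_length(G)   # characteristic path length
mean_clustering = nx.average_clustering(G)            # mean clustering coefficient
degree_counts = Counter(d for _, d in G.degree())     # empirical degree distribution
degree_distribution = {k: c / G.number_of_nodes()
                       for k, c in sorted(degree_counts.items())}

print(f"mean distance:   {mean_distance:.3f}")
print(f"mean clustering: {mean_clustering:.3f}")
print("P(k) for the smallest degrees:", dict(list(degree_distribution.items())[:5]))
```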
Information theory in systems biology. Part I: Gene regulatory and metabolic networks.
Mousavian, Zaynab; Kavousi, Kaveh; Masoudi-Nejad, Ali
2016-03-01
"A Mathematical Theory of Communication", was published in 1948 by Claude Shannon to establish a framework that is now known as information theory. In recent decades, information theory has gained much attention in the area of systems biology. The aim of this paper is to provide a systematic review of those contributions that have applied information theory in inferring or understanding of biological systems. Based on the type of system components and the interactions between them, we classify the biological systems into 4 main classes: gene regulatory, metabolic, protein-protein interaction and signaling networks. In the first part of this review, we attempt to introduce most of the existing studies on two types of biological networks, including gene regulatory and metabolic networks, which are founded on the concepts of information theory. Copyright © 2015 Elsevier Ltd. All rights reserved.
Community Detection for Correlation Matrices
NASA Astrophysics Data System (ADS)
MacMahon, Mel; Garlaschelli, Diego
2015-04-01
A challenging problem in the study of complex systems is that of resolving, without prior information, the emergent, mesoscopic organization determined by groups of units whose dynamical activity is more strongly correlated internally than with the rest of the system. The existing techniques to filter correlations are not explicitly oriented towards identifying such modules and can suffer from an unavoidable information loss. A promising alternative is that of employing community detection techniques developed in network theory. Unfortunately, this approach has focused predominantly on replacing network data with correlation matrices, a procedure that we show to be intrinsically biased because of its inconsistency with the null hypotheses underlying the existing algorithms. Here, we introduce, via a consistent redefinition of null models based on random matrix theory, the appropriate correlation-based counterparts of the most popular community detection techniques. Our methods can filter out both unit-specific noise and system-wide dependencies, and the resulting communities are internally correlated and mutually anticorrelated. We also implement multiresolution and multifrequency approaches revealing hierarchically nested subcommunities with "hard" cores and "soft" peripheries. We apply our techniques to several financial time series and identify mesoscopic groups of stocks which are irreducible to a standard, sectorial taxonomy; detect "soft stocks" that alternate between communities; and discuss implications for portfolio optimization and risk management.
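A rough, hedged sketch of the random-matrix-theory ingredient behind these redefined null models is given below: the eigenvalues of an empirical correlation matrix are compared against the Marchenko-Pastur bound, and only the above-noise, non-market modes are kept as the group-structure component. The synthetic data, the planted blocks, and the identification of the largest mode with a "market" mode are assumptions made for illustration; this is not the authors' complete algorithm.

```python
# Sketch: separating noise, market-wide, and group components of a correlation
# matrix with Marchenko-Pastur bounds. Synthetic data with a common factor and
# two planted blocks stand in for real financial time series.
import numpy as np

rng = np.random.default_rng(1)
T, N = 1000, 50                                    # time points x series (synthetic)
X = rng.normal(size=(T, N))
X += 0.6 * rng.normal(size=(T, 1))                 # common "market-like" factor
X[:, :10] += rng.normal(size=(T, 1))               # planted correlated block 1
X[:, 10:20] += rng.normal(size=(T, 1))             # planted correlated block 2
C = np.corrcoef(X, rowvar=False)

q = N / T
lam_max = (1 + np.sqrt(q)) ** 2                    # Marchenko-Pastur noise bound
vals, vecs = np.linalg.eigh(C)                     # ascending eigenvalues

above_noise = [i for i, v in enumerate(vals) if v > lam_max]
group_modes = above_noise[:-1]                     # drop the largest ("market") mode
C_group = sum(vals[i] * np.outer(vecs[:, i], vecs[:, i]) for i in group_modes)

print("eigenvalues above the noise bound:", len(above_noise))
print("group component built from", len(group_modes), "modes")
```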
Cadogan, Cathal A; Ryan, Cristín; Gormley, Gerard J; Francis, Jill J; Passmore, Peter; Kerse, Ngaire; Hughes, Carmel M
2018-01-01
A general practitioner (GP)-targeted intervention aimed at improving the prescribing of appropriate polypharmacy for older people was previously developed using a systematic, theory-based approach based on the UK Medical Research Council's complex intervention framework. The primary intervention component comprised a video demonstration of a GP prescribing appropriate polypharmacy during a consultation with an older patient. The video was delivered to GPs online and included feedback emphasising the positive outcomes of performing the behaviour. As a complementary intervention component, patients were invited to scheduled medication review consultations with GPs. This study aimed to test the feasibility of the intervention and study procedures (recruitment, data collection). GPs from two general practices were given access to the video, and reception staff scheduled consultations with older patients receiving polypharmacy (≥4 medicines). Primary feasibility study outcomes were the usability and acceptability of the intervention to GPs. Feedback was collected from GP and patient participants using structured questionnaires. Clinical data were also extracted from recruited patients' medical records (baseline and 1 month post-consultation). The feasibility of applying validated assessment of prescribing appropriateness (STOPP/START criteria, Medication Appropriateness Index) and medication regimen complexity (Medication Regimen Complexity Index) to these data was investigated. Data analysis was descriptive, providing an overview of participants' feedback and clinical assessment findings. Four GPs and ten patients were recruited across two practices. The intervention was considered usable and acceptable by GPs. Some reservations were expressed by GPs as to whether the video truly reflected resource and time pressures encountered in the general practice working environment. Patient feedback on the scheduled consultations was positive. Patients welcomed the opportunity to have their medications reviewed. Due to the short time to follow-up and a lack of detailed clinical information in patient records, it was not feasible to detect any prescribing changes or to apply the assessment tools to patients' clinical data. The findings will help to further refine the intervention and study procedures (including time to follow-up) which will be tested in a randomised pilot study that will inform the design of a definitive trial to evaluate the intervention's effectiveness. ISRCTN18176245.
Booth, Andrew; Harris, Janet; Croot, Elizabeth; Springett, Jane; Campbell, Fiona; Wilkins, Emma
2013-09-28
Systematic review methodologies can be harnessed to help researchers to understand and explain how complex interventions may work. Typically, when reviewing complex interventions, a review team will seek to understand the theories that underpin an intervention and the specific context for that intervention. A single published report from a research project does not typically contain this required level of detail. A review team may find it more useful to examine a "study cluster": a group of related papers that explore and explain various features of a single project and thus supply necessary detail relating to theory and/or context. We sought to conduct a preliminary investigation, from a single case study review, of techniques required to identify a cluster of related research reports, to document the yield from such methods, and to outline a systematic methodology for cluster searching. In a systematic review of community engagement we identified a relevant project - the Gay Men's Task Force. From a single "key pearl citation" we conducted a series of related searches to find contextually or theoretically proximate documents. We followed up Citations, traced Lead authors, identified Unpublished materials, searched Google Scholar, tracked Theories, undertook ancestry searching for Early examples and followed up Related projects (embodied in the CLUSTER mnemonic). Our structured, formalised procedure for cluster searching identified useful reports that are not typically identified from topic-based searches on bibliographic databases. Items previously rejected by an initial sift were subsequently found to inform our understanding of underpinning theory (for example Diffusion of Innovations Theory), context or both. Relevant material included book chapters, a Web-based process evaluation, and peer reviewed reports of projects sharing a common ancestry. We used these reports to understand the context for the intervention and to explore explanations for its relative lack of success. Additional data helped us to challenge simplistic assumptions on the homogeneity of the target population. A single case study suggests the potential utility of cluster searching, particularly for reviews that depend on an understanding of context, e.g. realist synthesis. The methodology is transparent, explicit and reproducible. There is no reason to believe that cluster searching is not generalizable to other review topics. Further research should examine the contribution of the methodology beyond improved yield, to the final synthesis and interpretation, possibly by utilizing qualitative sensitivity analysis.
Demystifying theory and its use in improvement
Davidoff, Frank; Dixon-Woods, Mary; Leviton, Laura; Michie, Susan
2015-01-01
The role and value of theory in improvement work in healthcare has been seriously underrecognised. We join others in proposing that more informed use of theory can strengthen improvement programmes and facilitate the evaluation of their effectiveness. Many professionals, including improvement practitioners, are unfortunately mystified—and alienated—by theory, which discourages them from using it in their work. In an effort to demystify theory we make the point in this paper that, far from being discretionary or superfluous, theory (‘reason-giving’), both informal and formal, is intimately woven into virtually all human endeavour. We explore the special characteristics of grand, mid-range and programme theory; consider the consequences of misusing theory or failing to use it; review the process of developing and applying programme theory; examine some emerging criteria of ‘good’ theory; and emphasise the value, as well as the challenge, of combining informal experience-based theory with formal, publicly developed theory. We conclude that although informal theory is always at work in improvement, practitioners are often not aware of it or do not make it explicit. The germane issue for improvement practitioners, therefore, is not whether they use theory but whether they make explicit the particular theory or theories, informal and formal, they actually use. PMID:25616279
Quigley, Laura; Lacombe-Duncan, Ashley; Adams, Sherri; Hepburn, Charlotte Moore; Cohen, Eyal
2014-06-30
Children with medical complexity (CMC) are characterized by substantial family-identified service needs, chronic and severe conditions, functional limitations, and high health care use. Information exchange is critically important in high quality care of complex patients at high risk for poor care coordination. Written care plans for CMC are an excellent test case for how well information sharing is currently occurring. The purpose of this study was to identify the barriers to and facilitators of information sharing for CMC across providers, care settings, and families. A qualitative study design with data analysis informed by a grounded theory approach was utilized. Two independent coders conducted secondary analysis of interviews with parents of CMC and health care professionals involved in the care of CMC, collected from two studies of healthcare service delivery for this population. Additional interviews were conducted with privacy officers of associated organizations to supplement these data. Emerging themes related to barriers and facilitators to information sharing were identified by the two coders and the research team, and a theory of facilitators and barriers to information exchange evolved. Barriers to information sharing were related to one of three major themes: 1) the lack of an integrated, accessible, secure platform on which summative health care information is stored, 2) fragmentation of the current health system, and 3) the lack of consistent policies, standards, and organizational priorities across organizations for information sharing. Facilitators of information sharing were related to improving accessibility to a common document, expanding the use of technology, and improving upon a structured communication plan. Findings informed a model of how various barriers to information sharing interact to prevent optimal information sharing both within and across organizations and how the use of technology to improve communication and access to information can act as a solution.
Towards understanding the behavior of physical systems using information theory
NASA Astrophysics Data System (ADS)
Quax, Rick; Apolloni, Andrea; Sloot, Peter M. A.
2013-09-01
One of the goals of complex network analysis is to identify the most influential nodes, i.e., the nodes that dictate the dynamics of other nodes. In the case of autonomous systems or transportation networks, highly connected hubs play a preeminent role in diffusing the flow of information and viruses; in contrast, in language evolution most linguistic norms come from the peripheral nodes who have only few contacts. Clearly a topological analysis of the interactions alone is not sufficient to identify the nodes that drive the state of the network. Here we show how information theory can be used to quantify how the dynamics of individual nodes propagate through a system. We interpret the state of a node as a storage of information about the state of other nodes, which is quantified in terms of Shannon information. This information is transferred through interactions and lost due to noise, and we calculate how far it can travel through a network. We apply this concept to a model of opinion formation in a complex social network to calculate the impact of each node by measuring how long its opinion is remembered by the network. Counter-intuitively we find that the dynamics of opinions are not determined by the hubs or peripheral nodes, but rather by nodes with an intermediate connectivity.
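The sketch below illustrates the general idea in simplified form: it estimates how much Shannon information the state of one node carries about nodes at increasing network distance, here for a toy voter-model dynamics sampled over time. The graph, the dynamics, and the mutual-information estimator are illustrative assumptions rather than the authors' opinion-formation model or their information-dissipation measure.

```python
# Sketch: mutual information between one node's binary opinion and the opinions
# of nodes at each network distance, estimated from snapshots of a simple
# voter-model run. All modelling choices here are illustrative assumptions.
import numpy as np
import networkx as nx
from collections import defaultdict
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(2)
G = nx.watts_strogatz_graph(n=200, k=6, p=0.1, seed=2)
state = rng.integers(0, 2, size=G.number_of_nodes())

snapshots = []
for step in range(20000):
    i = rng.integers(G.number_of_nodes())
    j = rng.choice(list(G.neighbors(i)))
    state[i] = state[j]                       # voter-model update: copy a neighbour
    if step % 20 == 0:
        snapshots.append(state.copy())
S = np.array(snapshots)

source = 0
dist = nx.single_source_shortest_path_length(G, source)
mi_by_distance = defaultdict(list)
for node, d in dist.items():
    if node != source:
        mi_by_distance[d].append(mutual_info_score(S[:, source], S[:, node]))
for d in sorted(mi_by_distance):
    print(f"distance {d}: mean MI = {np.mean(mi_by_distance[d]):.4f}")
```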
Ormandy, Paula
2011-03-01
Key policy drivers worldwide include optimizing patients' roles in managing their care; focusing services around patients' needs and preferences; and providing information to support patients' contributions and choices. The term information need penetrates many policy documents. Information need is espoused as the foundation from which to develop patient-centred or patient-led services. Yet there is no clear definition as to what the term means or how patients' information needs inform and shape information provision and patient care. The assimilation of complex theories originating from information science has much to offer considerations of patient information need within the context of health care. Health-related research often focuses on the content of information patients prefer, not why they need information. This paper extends and applies knowledge of information behaviour to considerations of information need in health, exposing a working definition for patient information need that reiterates the importance of considering the patient's goals and understanding the patient's context/situation. A patient information need is defined as 'recognition that their knowledge is inadequate to satisfy a goal, within the context/situation that they find themselves in at a specific point in time'. This typifies the key concepts of national/international health policy, the centrality and importance of the patient. The proposed definition of patient information need provides a conceptual framework to guide health-care practitioners on what to consider and why when meeting the information needs of patients in practice. This creates a solid foundation from which to inform future research. © 2010 The Author. Health Expectations © 2010 Blackwell Publishing Ltd.
ERIC Educational Resources Information Center
Ko, Linda K.; Turner-McGrievy, Gabrielle M.; Campbell, Marci K.
2014-01-01
Podcasting is an emerging technology, and previous interventions have shown promising results using theory-based podcast for weight loss among overweight and obese individuals. This study investigated whether constructs of social cognitive theory and information processing theories (IPTs) mediate the effect of a podcast intervention on weight loss…
Complex adaptive systems: A new approach for understanding health practices.
Gomersall, Tim
2018-06-22
This article explores the potential of complex adaptive systems theory to inform behaviour change research. A complex adaptive system describes a collection of heterogeneous agents interacting within a particular context, adapting to each other's actions. In practical terms, this implies that behaviour change is 1) socially and culturally situated; 2) highly sensitive to small baseline differences in individuals, groups, and intervention components; and 3) determined by multiple components interacting "chaotically". Two approaches to studying complex adaptive systems are briefly reviewed. Agent-based modelling is a computer simulation technique that allows researchers to investigate "what if" questions in a virtual environment. Applied qualitative research techniques, on the other hand, offer a way to examine what happens when an intervention is pursued in real-time, and to identify the sorts of rules and assumptions governing social action. Although these represent very different approaches to complexity, there may be scope for mixing these methods - for example, by grounding models in insights derived from qualitative fieldwork. Finally, I will argue that the concept of complex adaptive systems offers one opportunity to gain a deepened understanding of health-related practices, and to examine the social psychological processes that produce health-promoting or damaging actions.
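To make the agent-based option concrete, here is a minimal, hedged "what if" sketch: heterogeneous agents on a random contact network adopt a health behaviour once the adopting fraction among their neighbours exceeds an individual threshold. The network, the thresholds, and the seed fraction are illustrative assumptions, not parameters taken from the article.

```python
# Sketch: a minimal agent-based model of behaviour adoption with heterogeneous
# thresholds. All parameters are placeholders chosen for illustration.
import numpy as np
import networkx as nx

rng = np.random.default_rng(3)
G = nx.erdos_renyi_graph(n=300, p=0.03, seed=3)
threshold = rng.uniform(0.1, 0.6, size=G.number_of_nodes())   # agent heterogeneity
adopted = rng.random(G.number_of_nodes()) < 0.05               # small seed group

for t in range(30):
    new = adopted.copy()
    for node in G.nodes:
        nbrs = list(G.neighbors(node))
        if nbrs and not adopted[node]:
            frac = np.mean([adopted[n] for n in nbrs])
            if frac >= threshold[node]:        # adapt to neighbours' behaviour
                new[node] = True
    adopted = new
    print(f"step {t:2d}: {adopted.mean():.2%} adopted")
```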
Tilson, Julie K; Mickan, Sharon
2014-06-25
There is a need for theoretically grounded and evidence-based interventions that enhance the use of research evidence in physical therapist practice. This paper and its companion paper introduce the Physical therapist-driven Education for Actionable Knowledge translation (PEAK) program, an educational program designed to promote physical therapists' integration of research evidence into clinical decision-making. The pedagogical foundations for the PEAK educational program include Albert Bandura's social cognitive theory and Malcolm Knowles's adult learning theory. Additionally, two complementary frameworks of knowledge translation, the Promoting Action on Research Implementation in Health Services (PARiHS) and Knowledge to Action (KTA) Cycle, were used to inform the organizational elements of the program. Finally, the program design was influenced by evidence from previous attempts to facilitate the use of research in practice at the individual and organizational levels. The 6-month PEAK program consisted of four consecutive and interdependent components. First, leadership support was secured and electronic resources were acquired and distributed to participants. Next, a two-day training workshop consisting of didactic and small group activities was conducted that addressed the five steps of evidence based practice. For five months following the workshop, participants worked in small groups to review and synthesize literature around a group-selected area of common clinical interest. Each group contributed to the generation of a "Best Practices List" - a list of locally generated, evidence-based, actionable behaviors relevant to the groups' clinical practice. Ultimately, participants agreed to implement the Best Practices List in their clinical practice. This, first of two companion papers, describes the underlying pedagogical theories, knowledge translation frameworks, and research evidence used to derive the PEAK program - an educational program designed to promote the use of research evidence to inform physical therapist practice. The four components of the program are described in detail. The companion paper reports the results of a mixed methods feasibility analysis of this complex educational intervention.
Learning to manage complexity through simulation: students' challenges and possible strategies.
Gormley, Gerard J; Fenwick, Tara
2016-06-01
Many have called for medical students to learn how to manage complexity in healthcare. This study examines the nuances of students' challenges in coping with a complex simulation learning activity, using concepts from complexity theory, and suggests strategies to help them better understand and manage complexity.Wearing video glasses, participants took part in a simulation ward-based exercise that incorporated characteristics of complexity. Video footage was used to elicit interviews, which were transcribed. Using complexity theory as a theoretical lens, an iterative approach was taken to identify the challenges that participants faced and possible coping strategies using both interview transcripts and video footage.Students' challenges in coping with clinical complexity included being: a) unprepared for 'diving in', b) caught in an escalating system, c) captured by the patient, and d) unable to assert boundaries of acceptable practice.Many characteristics of complexity can be recreated in a ward-based simulation learning activity, affording learners an embodied and immersive experience of these complexity challenges. Possible strategies for managing complexity themes include: a) taking time to size up the system, b) attuning to what emerges, c) reducing complexity, d) boundary practices, and e) working with uncertainty. This study signals pedagogical opportunities for recognizing and dealing with complexity.
OPTIMAL CONTROL THEORY FOR SUSTAINABLE ENVIRONMENTAL MANAGEMENT
Sustainable management of the human and natural systems, taking into account their interactions, has become paramount. To achieve this complex multidisciplinary objective, systems theory based techniques prove useful. The proposed work is a step in that direction. Taking a food w...
Information Processing Theory and Conceptual Development.
ERIC Educational Resources Information Center
Schroder, H. M.
An educational program based upon information processing theory has been developed at Southern Illinois University. The integrating theme was the development of conceptual ability for coping with social and personal problems. It utilized student information search and concept formation as foundations for discussion and judgment and was organized…
Complex Intelligent Systems: Juxtaposition of Foundational Notions and a Research Agenda
NASA Astrophysics Data System (ADS)
Gelepithis, Petros A.
2001-11-01
The cardinality of the class, C, of complex intelligent systems, i.e., systems of intelligent systems and their resources, is steadily increasing. Such an increase, whether designed or not, sometimes changes the structure of C significantly and fundamentally. Recently, the study of members of C and of its structure has come under a variety of multidisciplinary headings, the most prominent of which include General Systems Theory, Complexity Science, Artificial Life, and Cybernetics. Their common characteristic is the quest for a unified theory of a certain class of systems, such as a living system or an organisation. So far, the only candidate for a general theory of intelligent systems is Newell's Soar. To my knowledge there is presently no candidate theory of C except Newell's claimed extensibility of Soar. This paper juxtaposes the elements of Newell's conceptual basis with those of an alternative conceptual framework based on the thesis that communication and understanding are the primary processes shaping the structure of C and its members. It is patently obvious that a research agenda for the study of C can be extremely varied and long. The third section of this paper presents a highly selective research agenda that aims to provoke discussion among complexity theory scientists.
Tunable magnetism in metal adsorbed fluorinated nanoporous graphene
Kumar, Pankaj; Sharma, Vinit; Reboredo, Fernando A.; ...
2016-08-24
Developing nanostructures with tunable magnetic states is crucial for designing novel data storage and quantum information devices. Using density functional theory, we study the thermodynamic stability and magnetic properties of tungsten adsorbed on tri-vacancy fluorinated (TVF) graphene. We demonstrate a strong structure-property relationship and its response to external stimuli via defect engineering in graphene-based materials. A complex interplay between defect states and the chemisorbed atom results in a large magnetic moment of 7 μB along with a high in-plane magneto-crystalline anisotropy energy (MAE) of 17 meV. Under the influence of an electric field, a spin crossover effect accompanied by a change in the MAE is observed. The ascribed change in spin configuration is caused by the modification of exchange coupling between defect states and a change in the occupation of d-orbitals of the metal complex. In conclusion, our predictions open a promising way towards controlling magnetic properties in graphene-based spintronic and non-volatile memory devices.
Effects of individual popularity on information spreading in complex networks
NASA Astrophysics Data System (ADS)
Gao, Lei; Li, Ruiqi; Shu, Panpan; Wang, Wei; Gao, Hui; Cai, Shimin
2018-01-01
In the real world, human activities often exhibit a preferential selection mechanism based on the popularity of individuals. However, this mechanism has seldom been taken into account in previous studies of spreading dynamics on networks. In this work, an information spreading model is therefore proposed in which preferential selection is based on individuals' current popularity, defined as the number of an individual's cumulative contacts with informed neighbors. A mean-field theory is developed to analyze the spreading model. By systematically studying the information spreading dynamics on uncorrelated configuration networks as well as real-world networks, we find that the popularity preference has a great impact on information spreading. On the one hand, information spreading is facilitated, i.e., the final prevalence of information is larger and the outbreak threshold smaller, if nodes with low popularity are preferentially selected. In this situation, the effective contacts between informed nodes and susceptible nodes are increased, and nodes have almost uniform probabilities of obtaining the information. On the other hand, if nodes with high popularity are preferentially selected, the final prevalence of information is reduced, the outbreak threshold is increased, and the information may not break out at all. In addition, the heterogeneity of the degree distribution and the structure of real-world networks do not qualitatively affect the results. Our research can provide theoretical support for the promotion of spreading processes such as information, health-related behaviors, and new products.
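The following sketch conveys the mechanism in simplified form: informed nodes contact susceptible neighbours with probability weighted by the neighbours' cumulative popularity raised to an exponent alpha, so negative alpha prefers low-popularity nodes and positive alpha high-popularity nodes. The network, the parameters, and the update scheme are simplifying assumptions, not the paper's exact model or its mean-field theory.

```python
# Sketch: information spreading with popularity-weighted preferential selection.
# Popularity is the cumulative number of contacts a node has received from
# informed neighbours; alpha tunes the preference direction. Parameters and the
# Barabasi-Albert graph are illustrative placeholders.
import numpy as np
import networkx as nx

def spread(alpha, lam=0.2, seed=4):
    rng = np.random.default_rng(seed)
    G = nx.barabasi_albert_graph(500, 3, seed=seed)
    informed = {0}
    popularity = np.zeros(G.number_of_nodes())     # cumulative informed contacts
    for _ in range(200):
        newly = set()
        for u in informed:
            nbrs = [v for v in G.neighbors(u) if v not in informed]
            if not nbrs:
                continue
            w = (popularity[nbrs] + 1.0) ** alpha   # preferential selection weight
            v = rng.choice(nbrs, p=w / w.sum())
            popularity[v] += 1
            if rng.random() < lam:                  # transmission probability
                newly.add(v)
        informed |= newly
    return len(informed) / G.number_of_nodes()

print("final prevalence, low-popularity preference :", spread(alpha=-1.0))
print("final prevalence, high-popularity preference:", spread(alpha=+1.0))
```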
Testing the criterion for correct convergence in the complex Langevin method
NASA Astrophysics Data System (ADS)
Nagata, Keitaro; Nishimura, Jun; Shimasaki, Shinji
2018-05-01
Recently the complex Langevin method (CLM) has been attracting attention as a solution to the sign problem, which occurs in Monte Carlo calculations when the effective Boltzmann weight is not real positive. An undesirable feature of the method, however, was that it can happen in some parameter regions that the method yields wrong results even if the Langevin process reaches equilibrium without any problem. In our previous work, we proposed a practical criterion for correct convergence based on the probability distribution of the drift term that appears in the complex Langevin equation. Here we demonstrate the usefulness of this criterion in two solvable theories with many dynamical degrees of freedom, i.e., two-dimensional Yang-Mills theory with a complex coupling constant and the chiral Random Matrix Theory for finite density QCD, which were studied by the CLM before. Our criterion can indeed tell the parameter regions in which the CLM gives correct results.
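A minimal, hedged illustration of the criterion on a one-variable Gaussian toy model (not one of the two theories studied in the paper) is sketched below: the complex Langevin drift is recorded along the trajectory and its magnitude is histogrammed; under the proposed criterion, a tail that falls off faster than any power signals correct convergence.

```python
# Sketch: drift-magnitude criterion for a toy complex Gaussian action
# S(z) = sigma * z**2 / 2, for which the holomorphic drift is -sigma * z.
# Step size, trajectory length, and the action itself are illustrative choices.
import numpy as np

sigma = 1.0 + 1.0j
dt, nsteps, ntherm = 1e-3, 200000, 10000
rng = np.random.default_rng(5)

z = 0.0 + 0.0j
drift_mags = []
for step in range(nsteps):
    drift = -sigma * z                                    # holomorphic drift term
    z = z + drift * dt + np.sqrt(2 * dt) * rng.normal()   # real Gaussian noise
    if step > ntherm:
        drift_mags.append(abs(drift))

drift_mags = np.array(drift_mags)
hist, edges = np.histogram(drift_mags, bins=50, density=True)
# A rapidly (e.g. exponentially) decaying tail indicates correct convergence;
# a power-law tail would signal a potential failure of the method.
print("largest |drift| observed:", drift_mags.max())
print("tail of normalized histogram:", hist[-5:])
```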
On the four-dimensional holoraumy of the 4D, 𝒩 = 1 complex linear supermultiplet
NASA Astrophysics Data System (ADS)
Caldwell, Wesley; Diaz, Alejandro N.; Friend, Isaac; Gates, S. James; Harmalkar, Siddhartha; Lambert-Brown, Tamar; Lay, Daniel; Martirosova, Karina; Meszaros, Victor A.; Omokanwaye, Mayowa; Rudman, Shaina; Shin, Daeljuck; Vershov, Anthony
2018-04-01
We present arguments to support the existence of weight spaces for supersymmetric field theories and identify the calculations of information about supermultiplets to define such spaces via the concept of “holoraumy.” For the first time, this is extended to the complex linear superfield by a calculation of the commutator of supercovariant derivatives on all of its component fields.
The mixed reality of things: emerging challenges for human-information interaction
NASA Astrophysics Data System (ADS)
Spicer, Ryan P.; Russell, Stephen M.; Rosenberg, Evan Suma
2017-05-01
Virtual and mixed reality technology has advanced tremendously over the past several years. This nascent medium has the potential to transform how people communicate over distance, train for unfamiliar tasks, operate in challenging environments, and how they visualize, interact, and make decisions based on complex data. At the same time, the marketplace has experienced a proliferation of network-connected devices and generalized sensors that are becoming increasingly accessible and ubiquitous. As the "Internet of Things" expands to encompass a predicted 50 billion connected devices by 2020, the volume and complexity of information generated in pervasive and virtualized environments will continue to grow exponentially. The convergence of these trends demands a theoretically grounded research agenda that can address emerging challenges for human-information interaction (HII). Virtual and mixed reality environments can provide controlled settings where HII phenomena can be observed and measured, new theories developed, and novel algorithms and interaction techniques evaluated. In this paper, we describe the intersection of pervasive computing with virtual and mixed reality, identify current research gaps and opportunities to advance the fundamental understanding of HII, and discuss implications for the design and development of cyber-human systems for both military and civilian use.
A new theory of development: the generation of complexity in ontogenesis.
Barbieri, Marcello
2016-03-13
Today there is a very wide consensus on the idea that embryonic development is the result of a genetic programme and of epigenetic processes. Many models have been proposed in this theoretical framework to account for the various aspects of development, and virtually all of them have one thing in common: they do not acknowledge the presence of organic codes (codes between organic molecules) in ontogenesis. Here it is argued instead that embryonic development is a convergent increase in complexity that necessarily requires organic codes and organic memories, and a few examples of such codes are described. This is the code theory of development, a theory that was originally inspired by an algorithm that is capable of reconstructing structures from incomplete information, an algorithm that here is briefly summarized because it makes it intuitively appealing how a convergent increase in complexity can be achieved. The main thesis of the new theory is that the presence of organic codes in ontogenesis is not only a theoretical necessity but, first and foremost, an idea that can be tested and that has already been found to be in agreement with the evidence. © 2016 The Author(s).
Academic Primer Series: Eight Key Papers about Education Theory.
Gottlieb, Michael; Boysen-Osborn, Megan; Chan, Teresa M; Krzyzaniak, Sara M; Pineda, Nicolas; Spector, Jordan; Sherbino, Jonathan
2017-02-01
Many teachers adopt instructional methods based on assumptions of best practices without attention to or knowledge of supporting education theory. Familiarity with a variety of theories informs education that is efficient, strategic, and evidence-based. As part of the Academic Life in Emergency Medicine Faculty Incubator Program, a list of key education theories for junior faculty was developed. A list of key papers on theories relevant to medical education was generated using an expert panel, a virtual community of practice synthetic discussion, and a social media call for resources. A three-round, Delphi-informed voting methodology including novice and expert educators produced a rank order of the top papers. These educators identified 34 unique papers. Eleven papers described the general use of education theory, while 23 papers focused on a specific theory. The top three papers on general education theories and top five papers on specific education theory were selected and summarized. The relevance of each paper for junior faculty and faculty developers is also presented. This paper presents a reading list of key papers for junior faculty in medical education roles. Three papers about general education theories and five papers about specific educational theories are identified and annotated. These papers may help provide foundational knowledge in education theory to inform junior faculty teaching practice.
Academic Primer Series: Eight Key Papers about Education Theory
Gottlieb, Michael; Boysen-Osborn, Megan; Chan, Teresa M.; Krzyzaniak, Sara M.; Pineda, Nicolas; Spector, Jordan; Sherbino, Jonathan
2017-01-01
Introduction Many teachers adopt instructional methods based on assumptions of best practices without attention to or knowledge of supporting education theory. Familiarity with a variety of theories informs education that is efficient, strategic, and evidence-based. As part of the Academic Life in Emergency Medicine Faculty Incubator Program, a list of key education theories for junior faculty was developed. Methods A list of key papers on theories relevant to medical education was generated using an expert panel, a virtual community of practice synthetic discussion, and a social media call for resources. A three-round, Delphi-informed voting methodology including novice and expert educators produced a rank order of the top papers. Results These educators identified 34 unique papers. Eleven papers described the general use of education theory, while 23 papers focused on a specific theory. The top three papers on general education theories and top five papers on specific education theory were selected and summarized. The relevance of each paper for junior faculty and faculty developers is also presented. Conclusion This paper presents a reading list of key papers for junior faculty in medical education roles. Three papers about general education theories and five papers about specific educational theories are identified and annotated. These papers may help provide foundational knowledge in education theory to inform junior faculty teaching practice. PMID:28210367
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ridolfi, E.; Napolitano, F., E-mail: francesco.napolitano@uniroma1.it; Alfonso, L.
2016-06-08
The description of river topography has a crucial role in accurate one-dimensional (1D) hydraulic modelling. Specifically, cross-sectional data define the riverbed elevation, the flood-prone area, and thus the hydraulic behavior of the river. Here, the problem of optimal cross-sectional spacing is solved through an information theory-based concept. The optimal subset of locations is the one with the maximum information content and the minimum amount of redundancy. The original contribution is the introduction of a methodology to sample river cross sections in the presence of bridges. The approach is tested on the Grosseto River (IT) and is compared to existing guidelines. The results show that the information theory-based approach can support traditional methods to estimate rivers' cross-sectional spacing.
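A simplified sketch of the selection principle (maximum information content, minimum redundancy) follows, using binned entropy and mutual information on synthetic water-level series; the data, the number of sections retained, and the redundancy weight are placeholders, not the calibrated values or the exact objective used for the Grosseto River.

```python
# Sketch: greedy selection of cross-sections that adds information content
# while penalizing redundancy with already-selected sections. Synthetic,
# strongly correlated stage series stand in for real survey data.
import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(6)
n_sections, n_times = 30, 500
base = rng.normal(size=n_times)
levels = np.array([base * (1 - k / 60) + 0.3 * rng.normal(size=n_times)
                   for k in range(n_sections)])          # synthetic stage series

def bin_series(x, bins=10):
    return np.digitize(x, np.histogram_bin_edges(x, bins))

def entropy(x):
    _, counts = np.unique(bin_series(x), return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log(p)).sum()

selected, candidates = [], list(range(n_sections))
beta = 0.8                                                # redundancy penalty weight
for _ in range(5):                                        # retain 5 cross-sections
    scores = []
    for c in candidates:
        info = entropy(levels[c])
        redundancy = sum(mutual_info_score(bin_series(levels[c]), bin_series(levels[s]))
                         for s in selected)
        scores.append(info - beta * redundancy)
    best = candidates[int(np.argmax(scores))]
    selected.append(best)
    candidates.remove(best)
print("selected cross-section indices:", selected)
```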
Empowering Older Patients to Engage in Self Care: Designing an Interactive Robotic Device
Tiwari, Priyadarshi; Warren, Jim; Day, Karen
2011-01-01
Objectives: To develop and test an interactive robot mounted computing device to support medication management as an example of a complex self-care task in older adults. Method: A Grounded Theory (GT), Participatory Design (PD) approach was used within three Action Research (AR) cycles to understand design requirements and test the design configuration addressing the unique task requirements. Results: At the end of the first cycle a conceptual framework was evolved. The second cycle informed architecture and interface design. By the end of third cycle residents successfully interacted with the dialogue system and were generally satisfied with the robot. The results informed further refinement of the prototype. Conclusion: An interactive, touch screen based, robot-mounted information tool can be developed to support healthcare needs of older people. Qualitative methods such as the hybrid GT-PD-AR approach may be particularly helpful for innovating and articulating design requirements in challenging situations. PMID:22195203
Empowering older patients to engage in self care: designing an interactive robotic device.
Tiwari, Priyadarshi; Warren, Jim; Day, Karen
2011-01-01
To develop and test an interactive robot mounted computing device to support medication management as an example of a complex self-care task in older adults. A Grounded Theory (GT), Participatory Design (PD) approach was used within three Action Research (AR) cycles to understand design requirements and test the design configuration addressing the unique task requirements. At the end of the first cycle a conceptual framework was evolved. The second cycle informed architecture and interface design. By the end of third cycle residents successfully interacted with the dialogue system and were generally satisfied with the robot. The results informed further refinement of the prototype. An interactive, touch screen based, robot-mounted information tool can be developed to support healthcare needs of older people. Qualitative methods such as the hybrid GT-PD-AR approach may be particularly helpful for innovating and articulating design requirements in challenging situations.
Coherent multiscale image processing using dual-tree quaternion wavelets.
Chan, Wai Lam; Choi, Hyeokho; Baraniuk, Richard G
2008-07-01
The dual-tree quaternion wavelet transform (QWT) is a new multiscale analysis tool for geometric image features. The QWT is a near shift-invariant tight frame representation whose coefficients sport a magnitude and three phases: two phases encode local image shifts while the third contains image texture information. The QWT is based on an alternative theory for the 2-D Hilbert transform and can be computed using a dual-tree filter bank with linear computational complexity. To demonstrate the properties of the QWT's coherent magnitude/phase representation, we develop an efficient and accurate procedure for estimating the local geometrical structure of an image. We also develop a new multiscale algorithm for estimating the disparity between a pair of images that is promising for image registration and flow estimation applications. The algorithm features multiscale phase unwrapping, linear complexity, and sub-pixel estimation accuracy.
Neuroscience of water molecules: a salute to professor Linus Carl Pauling.
Nakada, Tsutomu
2009-04-01
More than 35 years ago, double Nobel laureate Linus Carl Pauling published a powerful model of the molecular mechanism of general anesthesia, generally referred to as the hydrate-microcrystal (aqueous-phase) theory. This hypothesis, based on the molecular behavior of water molecules, did not receive serious attention during Pauling's lifetime, when scientific tools for examining complex systems such as the brain were still in their infancy. The situation has since drastically changed, and now, in the twenty-first century, many scientific tools are available for examining different types of complex systems. The discovery of aquaporin-4, a subtype of water channel abundantly expressed in glial systems, further highlighted the concept that the dynamics of water molecules in the cerebral cortex play an important role in physiological brain functions including consciousness and information processing.
Zhang, Dan; Wang, Qing-Guo; Srinivasan, Dipti; Li, Hongyi; Yu, Li
2018-05-01
This paper is concerned with the asynchronous state estimation for a class of discrete-time switched complex networks with communication constraints. An asynchronous estimator is designed to overcome the difficulty that each node cannot access to the topology/coupling information. Also, the event-based communication, signal quantization, and the random packet dropout problems are studied due to the limited communication resource. With the help of switched system theory and by resorting to some stochastic system analysis method, a sufficient condition is proposed to guarantee the exponential stability of estimation error system in the mean-square sense and a prescribed performance level is also ensured. The characterization of the desired estimator gains is derived in terms of the solution to a convex optimization problem. Finally, the effectiveness of the proposed design approach is demonstrated by a simulation example.
ERIC Educational Resources Information Center
Ong, Chiek Pin; Tasir, Zaidatun
2015-01-01
The aim of the research is to study the information retention among trainee teachers using a self-instructional printed module based on Cognitive Load Theory for learning spreadsheet software. Effective pedagogical considerations integrating the theoretical concepts related to cognitive load are reflected in the design and development of the…
Chen, Huey T
2016-12-01
Theories of program and theories of evaluation form the foundation of program evaluation theories. Theories of program reflect assumptions on how to conceptualize an intervention program for evaluation purposes, while theories of evaluation reflect assumptions on how to design useful evaluations. These two types of theories are related but often discussed separately. This paper attempts to use three theoretical perspectives (reductionism, systems thinking, and pragmatic synthesis) to interface them and discuss the implications for evaluation practice. Reductionism proposes that an intervention program can be broken into crucial components for rigorous analysis; systems thinking views an intervention program as dynamic and complex, requiring a holistic examination. In spite of their contributions, reductionism and systems thinking represent the extreme ends of a theoretical spectrum; many real-world programs, however, may fall in the middle. Pragmatic synthesis is being developed to serve these moderate-complexity programs. These three theoretical perspectives have their own strengths and challenges. Knowledge of these three perspectives and their evaluation implications can provide a better guide for designing fruitful evaluations, improving the quality of evaluation practice, informing potential areas for developing cutting-edge evaluation approaches, and contributing to advancing program evaluation toward a mature applied science. Copyright © 2016 Elsevier Ltd. All rights reserved.
Measuring the intangibles: a metrics for the economic complexity of countries and products.
Cristelli, Matthieu; Gabrielli, Andrea; Tacchella, Andrea; Caldarelli, Guido; Pietronero, Luciano
2013-01-01
We investigate a recent methodology we have proposed to extract valuable information on the competitiveness of countries and complexity of products from trade data. Standard economic theories predict a high level of specialization of countries in specific industrial sectors. However, a direct analysis of the official databases of exported products by all countries shows that the actual situation is very different. Countries commonly considered as developed ones are extremely diversified, exporting a large variety of products from very simple to very complex. At the same time countries generally considered as less developed export only the products also exported by the majority of countries. This situation calls for the introduction of a non-monetary and non-income-based measure for country economy complexity which uncovers the hidden potential for development and growth. The statistical approach we present here consists of coupled non-linear maps relating the competitiveness/fitness of countries to the complexity of their products. The fixed point of this transformation defines a metrics for the fitness of countries and the complexity of products. We argue that the key point to properly extract the economic information is the non-linearity of the map which is necessary to bound the complexity of products by the fitness of the less competitive countries exporting them. We present a detailed comparison of the results of this approach directly with those of the Method of Reflections by Hidalgo and Hausmann, showing the better performance of our method and a more solid economic, scientific and consistent foundation.
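For concreteness, the coupled non-linear map described above can be written as F_c proportional to the sum over products p of M_cp * Q_p, and Q_p proportional to the reciprocal of the sum over countries c of M_cp / F_c, iterated with a normalization at each step. The sketch below runs this map on a tiny random binary country-product matrix; the matrix and the fixed iteration count are placeholders, not trade data or the authors' convergence criteria.

```python
# Sketch: fitness-complexity iteration on a random binary country-product
# matrix M (M[c, p] = 1 if country c exports product p). The matrix is a
# synthetic placeholder; normalization by the mean follows the usual scheme.
import numpy as np

rng = np.random.default_rng(7)
n_countries, n_products = 8, 12
M = (rng.random((n_countries, n_products)) < 0.4).astype(float)
M[np.arange(n_countries), np.arange(n_countries)] = 1.0   # every country exports something
M[rng.integers(0, n_countries, size=n_products), np.arange(n_products)] = 1.0  # every product has an exporter

F = np.ones(n_countries)        # country fitness
Q = np.ones(n_products)         # product complexity
for _ in range(200):
    F_new = M @ Q                               # fitness grows with complexity of exports
    Q_new = 1.0 / (M.T @ (1.0 / F))             # complexity bounded by the least fit exporters
    F = F_new / F_new.mean()                    # normalize each iteration
    Q = Q_new / Q_new.mean()

print("countries ranked by fitness   :", np.argsort(-F))
print("products ranked by complexity :", np.argsort(-Q))
```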
Measuring the Intangibles: A Metrics for the Economic Complexity of Countries and Products
Cristelli, Matthieu; Gabrielli, Andrea; Tacchella, Andrea; Caldarelli, Guido; Pietronero, Luciano
2013-01-01
We investigate a recent methodology we have proposed to extract valuable information on the competitiveness of countries and complexity of products from trade data. Standard economic theories predict a high level of specialization of countries in specific industrial sectors. However, a direct analysis of the official databases of exported products by all countries shows that the actual situation is very different. Countries commonly considered as developed ones are extremely diversified, exporting a large variety of products from very simple to very complex. At the same time countries generally considered as less developed export only the products also exported by the majority of countries. This situation calls for the introduction of a non-monetary and non-income-based measure for country economy complexity which uncovers the hidden potential for development and growth. The statistical approach we present here consists of coupled non-linear maps relating the competitiveness/fitness of countries to the complexity of their products. The fixed point of this transformation defines a metrics for the fitness of countries and the complexity of products. We argue that the key point to properly extract the economic information is the non-linearity of the map which is necessary to bound the complexity of products by the fitness of the less competitive countries exporting them. We present a detailed comparison of the results of this approach directly with those of the Method of Reflections by Hidalgo and Hausmann, showing the better performance of our method and a more solid economic, scientific and consistent foundation. PMID:23940633
An On-Demand Optical Quantum Random Number Generator with In-Future Action and Ultra-Fast Response
Stipčević, Mario; Ursin, Rupert
2015-01-01
Random numbers are essential for our modern information-based society, e.g. in cryptography. Unlike frequently used pseudo-random generators, physical random number generators do not depend on complex algorithms but rather on a physical process to provide true randomness. Quantum random number generators (QRNG) rely on a process which, even in principle, can only be described by a probabilistic theory. Here we present a conceptually simple implementation which offers 100% efficiency in producing a random bit upon request and simultaneously exhibits ultra-low latency. A careful technical and statistical analysis demonstrates its robustness against imperfections of the actually implemented technology and enables the randomness of very long sequences to be estimated quickly. The generated random numbers pass standard statistical tests without any post-processing. The setup described, as well as the theory presented here, demonstrate the maturity and overall understanding of the technology. PMID:26057576
Kim, Scott Y. H.; Wall, Ian F.; Stanczyk, Aimee; Vries, Raymond De
2010-01-01
In a liberal democracy, policy decisions regarding ethical controversies, including those in research ethics, should incorporate the opinions of its citizens. Eliciting informed and well-considered ethical opinions can be challenging. The issues may not be widely familiar and they may involve complex scientific, legal, historical, and ethical dimensions. Traditional surveys risk eliciting superficial and uninformed opinions that may be of dubious quality for policy formation. We argue that the theory and practice of deliberative democracy (DD) is especially useful in overcoming such inadequacies. We explain DD theory and practice, discuss the rationale for using DD methods in research ethics, and illustrate in depth the use of a DD method for a long-standing research ethics controversy involving research based on surrogate consent. The potential pitfalls of DD and the means of minimizing them as well as future research directions are also discussed. PMID:19919315
Kim, Scott Y H; Wall, Ian F; Stanczyk, Aimee; De Vries, Raymond
2009-12-01
In a liberal democracy, policy decisions regarding ethical controversies, including those in research ethics, should incorporate the opinions of its citizens. Eliciting informed and well-considered ethical opinions can be challenging. The issues may not be widely familiar and they may involve complex scientific, legal, historical, and ethical dimensions. Traditional surveys risk eliciting superficial and uninformed opinions that may be of dubious quality for policy formation. We argue that the theory and practice of deliberative democracy (DD) is especially useful in overcoming such inadequacies. We explain DD theory and practice, discuss the rationale for using DD methods in research ethics, and illustrate in depth the use of a DD method for a longstanding research ethics controversy involving research based on surrogate consent. The potential pitfalls of DD and the means of minimizing them as well as future research directions are also discussed.
Nezarat, Amin; Dastghaibifard, GH
2015-01-01
One of the most complex issues in the cloud computing environment is the problem of resource allocation: on one hand, the cloud provider expects the most profitability and, on the other hand, users expect to have the best resources at their disposal considering budget and time constraints. In most previous work, heuristic and evolutionary approaches have been used to solve this problem. Nevertheless, since the nature of this environment is based on economic methods, using such methods can decrease response time and reduce the complexity of the problem. In this paper, an auction-based method is proposed which determines the auction winner by applying a game theory mechanism and holding a repetitive game with incomplete information in a non-cooperative environment. In this method, users calculate a suitable price bid with their objective function over several rounds and repetitions and send it to the auctioneer, and the auctioneer chooses the winning player based on the suggested utility function. In the proposed method, the end point of the game is the Nash equilibrium point, where players are no longer inclined to alter their bid for the resource and the final bid also satisfies the auctioneer's utility function. To prove the convexity of the response space, the Lagrange method is used; the proposed model is simulated in CloudSim and the results are compared with previous work. In conclusion, this method converges to a response in a shorter time, provides the lowest service level agreement violations, and delivers the most utility to the provider. PMID:26431035
Nezarat, Amin; Dastghaibifard, G H
2015-01-01
One of the most complex issues in the cloud computing environment is the problem of resource allocation: on one hand, the cloud provider expects the most profitability and, on the other hand, users expect to have the best resources at their disposal considering budget and time constraints. In most previous work, heuristic and evolutionary approaches have been used to solve this problem. Nevertheless, since the nature of this environment is based on economic methods, using such methods can decrease response time and reduce the complexity of the problem. In this paper, an auction-based method is proposed which determines the auction winner by applying a game theory mechanism and holding a repetitive game with incomplete information in a non-cooperative environment. In this method, users calculate a suitable price bid with their objective function over several rounds and repetitions and send it to the auctioneer, and the auctioneer chooses the winning player based on the suggested utility function. In the proposed method, the end point of the game is the Nash equilibrium point, where players are no longer inclined to alter their bid for the resource and the final bid also satisfies the auctioneer's utility function. To prove the convexity of the response space, the Lagrange method is used; the proposed model is simulated in CloudSim and the results are compared with previous work. In conclusion, this method converges to a response in a shorter time, provides the lowest service level agreement violations, and delivers the most utility to the provider.
Photoactivatable metal complexes: from theory to applications in biotechnology and medicine.
Smith, Nichola A; Sadler, Peter J
2013-07-28
This short review highlights some of the exciting new experimental and theoretical developments in the field of photoactivatable metal complexes and their applications in biotechnology and medicine. The examples chosen are based on some of the presentations at the Royal Society Discussion Meeting in June 2012, many of which are featured in more detail in other articles in this issue. This is a young field. Even the photochemistry of well-known systems such as metal-carbonyl complexes is still being elucidated. Striking are the recent developments in theory and computation (e.g. time-dependent density functional theory) and in ultrafast-pulsed radiation techniques which allow photochemical reactions to be followed and their mechanisms to be revealed on picosecond/nanosecond time scales. Not only do some metal complexes (e.g. those of Ru and Ir) possess favourable emission properties which allow functional imaging of cells and tissues (e.g. DNA interactions), but metal complexes can also provide spatially controlled photorelease of bioactive small molecules (e.g. CO and NO)--a novel strategy for site-directed therapy. This extends to cancer therapy, where metal-based precursors offer the prospect of generating excited-state drugs with new mechanisms of action that complement and augment those of current organic photosensitizers.
The Evolution of Biological Complexity in Digital Organisms
NASA Astrophysics Data System (ADS)
Ofria, Charles
2013-03-01
When Darwin first proposed his theory of evolution by natural selection, he realized that it had a problem explaining the origins of traits of ``extreme perfection and complication'' such as the vertebrate eye. Critics of Darwin's theory have latched onto this perceived flaw as a proof that Darwinian evolution is impossible. In anticipation of this issue, Darwin described the perfect data needed to understand this process, but lamented that such data are ``scarcely ever possible'' to obtain. In this talk, I will discuss research where we use populations of digital organisms (self-replicating and evolving computer programs) to elucidate the genetic and evolutionary processes by which new, highly complex traits arise, drawing inspiration directly from Darwin's wistful thinking and hypotheses. During the process of evolution in these fully transparent computational environments we can measure the incorporation of new information into the genome, a process akin to a natural Maxwell's demon, and identify the original source of any such information. We show that, as Darwin predicted, much of the information used to encode a complex trait was already in the genome as part of simpler evolved traits, and that many routes must be possible for a new complex trait to have a high probability of successfully evolving. In even more extreme examples of the evolution of complexity, we are now using these same principles to examine the evolutionary dynamics that drive major transitions in evolution, that is, transitions to higher levels of organization, which are some of the most complex evolutionary events to occur in nature. Finally, I will explore some of the implications of this research for other aspects of evolutionary biology, as well as ways that these evolutionary principles can be applied toward solving computational and engineering problems.
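A toy sketch in the spirit of "measuring the incorporation of information into the genome" (this is not the digital-organism system itself): a population of bit-string genomes evolves under selection for a target pattern that stands in for a complex trait, and per-site information is estimated as the reduction of per-site entropy across the population. All parameters and the fitness function are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
L, N, GENS, MU = 32, 200, 300, 0.005          # genome length, population size, generations, mutation rate
TARGET = rng.integers(0, 2, L)                # stand-in for a "complex trait"

def fitness(genomes):
    return (genomes == TARGET).sum(axis=1)    # number of sites matching the target

def genome_information(genomes):
    """Information stored in the population genome: sum over sites of 1 - H(site), in bits."""
    p1 = np.clip(genomes.mean(axis=0), 1e-12, 1 - 1e-12)
    H = -(p1 * np.log2(p1) + (1 - p1) * np.log2(1 - p1))
    return float(np.sum(1.0 - H))

pop = rng.integers(0, 2, (N, L))
for g in range(GENS):
    f = fitness(pop).astype(float)
    parents = pop[rng.choice(N, size=N, p=f / f.sum())]   # fitness-proportional selection
    flips = rng.random(parents.shape) < MU                 # point mutations
    pop = np.where(flips, 1 - parents, parents)
    if g % 100 == 0:
        print(f"gen {g:3d}: mean fitness {f.mean():.1f}, genome information {genome_information(pop):.1f} bits")
```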
Lakshman, Rajalakshmi; Griffin, Simon; Hardeman, Wendy; Schiff, Annie; Kinmonth, Ann Louise; Ong, Ken K
2014-01-01
We describe our experience of using the Medical Research Council framework on complex interventions to guide the development and evaluation of an intervention to prevent obesity by modifying infant feeding behaviours. We reviewed the epidemiological evidence on early life risk factors for obesity and interventions to prevent obesity in this age group. The review suggested prevention of excess weight gain in bottle-fed babies and appropriate weaning as intervention targets; hence we undertook systematic reviews to further our understanding of these behaviours. We chose theory and behaviour change techniques that demonstrated evidence of effectiveness in altering dietary behaviours. We subsequently developed intervention materials and evaluation tools and conducted qualitative studies with mothers (intervention recipients) and healthcare professionals (intervention deliverers) to refine them. We developed a questionnaire to assess maternal attitudes and feeding practices to understand the mechanism of any intervention effects. In addition to informing development of our specific intervention and evaluation materials, use of the Medical Research Council framework has helped to build a generalisable evidence base for early life nutritional interventions. However, the process is resource intensive and prolonged, and this should be taken into account by public health research funders. This trial is registered with ISRCTN 20814693 (Baby Milk Trial).
Hirose, Tomoyasu; Maita, Nobuo; Gouda, Hiroaki; Koseki, Jun; Yamamoto, Tsuyoshi; Sugawara, Akihiro; Nakano, Hirofumi; Hirono, Shuichi; Shiomi, Kazuro; Watanabe, Takeshi; Taniguchi, Hisaaki; Sharpless, K. Barry; Ōmura, Satoshi; Sunazuka, Toshiaki
2013-01-01
The Huisgen cycloaddition of azides and alkynes, accelerated by target biomolecules, termed “in situ click chemistry,” has been successfully exploited to discover highly potent enzyme inhibitors. We have previously reported a specific Serratia marcescens chitinase B (SmChiB)-templated syn-triazole inhibitor generated in situ from an azide-bearing inhibitor and an alkyne fragment. Several in situ click chemistry studies have been reported. Although some mechanistic evidence has been obtained, such as X-ray analysis of [protein]–[“click ligand”] complexes, indicating that proteins act as both mold and template between unique pairs of azide and alkyne fragments, to date, observations have been based solely on “postclick” structural information. Here, we describe crystal structures of SmChiB complexed with an azide ligand and an O-allyl oxime fragment as a mimic of a click partner, revealing a mechanism for accelerating syn-triazole formation, which allows generation of its own distinct inhibitor. We have also performed density functional theory calculations based on the X-ray structure to explore the acceleration of the Huisgen cycloaddition by SmChiB. The density functional theory calculations reasonably support that SmChiB plays a role by the cage effect during the pretranslation and posttranslation states of selective syn-triazole click formation. PMID:24043811
The Structure of Scientific Evolution
2013-01-01
Science is the construction and testing of systems that bind symbols to sensations according to rules. Material implication is the primary rule, providing the structure of definition, elaboration, delimitation, prediction, explanation, and control. The goal of science is not to secure truth, which is a binary function of accuracy, but rather to increase the information about data communicated by theory. This process is symmetric and thus entails an increase in the information about theory communicated by data. Important components in this communication are the elevation of data to the status of facts, the descent of models under the guidance of theory, and their close alignment through the evolving retroductive process. The information mutual to theory and data may be measured as the reduction in the entropy, or complexity, of the field of data given the model. It may also be measured as the reduction in the entropy of the field of models given the data. This symmetry explains the important status of parsimony (how thoroughly the data exploit what the model can say) alongside accuracy (how thoroughly the model represents what can be said about the data). Mutual information is increased by increasing model accuracy and parsimony, and by enlarging and refining the data field under purview. PMID:28018043
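The symmetric entropy-reduction reading of mutual information described above can be made concrete with a small numeric sketch: for a joint distribution over models M and data outcomes D, I(M;D) = H(D) - H(D|M) = H(M) - H(M|D). The joint distribution below is a made-up example, not from the paper.

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint distribution over models M (rows) and data outcomes D (columns).
joint = np.array([[0.30, 0.05, 0.05],
                  [0.05, 0.30, 0.05],
                  [0.05, 0.05, 0.10]])

p_m = joint.sum(axis=1)                   # marginal over models
p_d = joint.sum(axis=0)                   # marginal over data
H_joint = entropy(joint.ravel())
H_d_given_m = H_joint - entropy(p_m)      # chain rule: H(D|M) = H(M,D) - H(M)
H_m_given_d = H_joint - entropy(p_d)

I_md = entropy(p_d) - H_d_given_m         # reduction in data entropy given the model
I_dm = entropy(p_m) - H_m_given_d         # reduction in model entropy given the data
print(f"I(M;D) = {I_md:.4f} bits = {I_dm:.4f} bits (symmetric)")
```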
Complex networks for data-driven medicine: the case of Class III dentoskeletal disharmony
NASA Astrophysics Data System (ADS)
Scala, A.; Auconi, P.; Scazzocchio, M.; Caldarelli, G.; McNamara, JA; Franchi, L.
2014-11-01
In the last decade, the availability of innovative algorithms derived from complexity theory has inspired the development of highly detailed models in various fields, including physics, biology, ecology, economy, and medicine. Due to the availability of novel and ever more sophisticated diagnostic procedures, all biomedical disciplines face the problem of using the increasing amount of information concerning each patient to improve diagnosis and prevention. In particular, in the discipline of orthodontics the current diagnostic approach based on clinical and radiographic data is problematic due to the complexity of craniofacial features and to the numerous interacting co-dependent skeletal and dentoalveolar components. In this study, we demonstrate the capability of computational methods such as network analysis and module detection to extract organizing principles in 70 patients with excessive mandibular skeletal protrusion with underbite, a condition known in orthodontics as Class III malocclusion. Our results could possibly constitute a template framework for organising the increasing amount of medical data available for patients’ diagnosis.
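A minimal sketch of the kind of network analysis and module detection mentioned above, assuming a correlation network built from patient measurements and using networkx's greedy modularity communities. The variable names, threshold, and random data are illustrative; this is not the authors' pipeline.

```python
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(1)
features = [f"var{i}" for i in range(12)]          # hypothetical clinical variables
data = rng.normal(size=(70, len(features)))        # stand-in for 70 patients' measurements
corr = np.corrcoef(data, rowvar=False)

# Build a graph whose edges are feature pairs with |correlation| above a threshold.
G = nx.Graph()
G.add_nodes_from(features)
thr = 0.15
for i in range(len(features)):
    for j in range(i + 1, len(features)):
        if abs(corr[i, j]) > thr:
            G.add_edge(features[i], features[j], weight=abs(corr[i, j]))

connected = G.subgraph([n for n in G if G.degree(n) > 0])   # drop isolated variables
modules = greedy_modularity_communities(connected, weight="weight")
for k, module in enumerate(modules):
    print(f"module {k}: {sorted(module)}")
```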
A quantum theoretical approach to information processing in neural networks
NASA Astrophysics Data System (ADS)
Barahona da Fonseca, José; Barahona da Fonseca, Isabel; Suarez Araujo, Carmen Paz; Simões da Fonseca, José
2000-05-01
A reinterpretation of experimental data on learning was used to formulate a law on data acquisition similar to the Hamiltonian of a mechanical system. A matrix of costs in decision making specifies values attributable to a barrier that opposes hypothesis formation about decision making. The interpretation of the encoding costs as frequencies of oscillatory phenomena leads to a quantum paradigm based on the models of the photoelectric effect as well as of a particle against a potential barrier. Cognitive processes are envisaged as complex phenomena represented by structures linked by valence bonds. This metaphor is used to find some prerequisites to certain types of conscious experience as well as to find an explanation for some pathological distortions of cognitive operations as they are represented in the context of the isolobal model. Those quantum phenomena are understood as representing an analogue programming for specific special-purpose computations. The formation of complex chemical structures within the context of isolobal theory is understood as an analog quantum paradigm for complex cognitive computations.
NASA Astrophysics Data System (ADS)
Nemati Aram, Tahereh; Ernzerhof, Matthias; Asgari, Asghar; Mayou, Didier
2017-01-01
We discuss the effects of charge carrier interaction and recombination on the operation of molecular photocells. Molecular photocells are devices where the energy conversion process takes place in a single molecular donor-acceptor complex attached to electrodes. Our investigation is based on the quantum scattering theory, in particular on the Lippmann-Schwinger equation; this minimizes the complexity of the problem while providing useful and non-trivial insight into the mechanism governing photocell operation. In this study, both exciton pair creation and dissociation are treated in the energy domain, and therefore there is access to detailed spectral information, which can be used as a framework to interpret the charge separation yield. We demonstrate that the charge carrier separation is a complex process that is affected by different parameters, such as the strength of the electron-hole interaction and the non-radiative recombination rate. Our analysis helps to optimize the charge separation process and the energy transfer in organic solar cells and in molecular photocells.
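For reference, the Lippmann-Schwinger equation invoked above can be written in its standard operator form (this is the textbook equation, not a detail specific to the photocell model):

```latex
% Lippmann-Schwinger equation for the scattering state |\psi^{\pm}\rangle generated
% from the free state |\phi\rangle (eigenstate of H_0) by the interaction V:
\lvert \psi^{\pm} \rangle \;=\; \lvert \phi \rangle
  \;+\; \frac{1}{E - H_{0} \pm i\varepsilon}\, V \,\lvert \psi^{\pm} \rangle ,
\qquad H_{0}\lvert \phi \rangle = E \lvert \phi \rangle .
```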
Parker, Jack; Mawson, Susan; Mountain, Gail; Nasr, Nasrin; Zheng, Huiru
2014-06-05
Evidence indicates that post-stroke rehabilitation improves function, independence and quality of life. A key aspect of rehabilitation is the provision of appropriate information and feedback to the learner.Advances in information and communications technology (ICT) have allowed for the development of various systems to complement stroke rehabilitation that could be used in the home setting. These systems may increase the provision of rehabilitation a stroke survivor receives and carries out, as well as providing a learning platform that facilitates long-term self-managed rehabilitation and behaviour change. This paper describes the application of an innovative evaluative methodology to explore the utilisation of feedback for post-stroke upper-limb rehabilitation in the home. Using the principles of realistic evaluation, this study aimed to test and refine intervention theories by exploring the complex interactions of contexts, mechanisms and outcomes that arise from technology deployment in the home. Methods included focus groups followed by multi-method case studies (n = 5) before, during and after the use of computer-based equipment. Data were analysed in relation to the context-mechanism-outcome hypotheses case by case. This was followed by a synthesis of the findings to answer the question, 'what works for whom and in what circumstances and respects?' Data analysis reveals that to achieve desired outcomes through the use of ICT, key elements of computer feedback, such as accuracy, measurability, rewarding feedback, adaptability, and knowledge of results feedback, are required to trigger the theory-driven mechanisms underpinning the intervention. In addition, the pre-existing context and the personal and environmental contexts, such as previous experience of service delivery, personal goals, trust in the technology, and social circumstances may also enable or constrain the underpinning theory-driven mechanisms. Findings suggest that the theory-driven mechanisms underpinning the utilisation of feedback from computer-based technology for home-based upper-limb post-stroke rehabilitation are dependent on key elements of computer feedback and the personal and environmental context. The identification of these elements may therefore inform the development of technology; therapy education and the subsequent adoption of technology and a self-management paradigm; long-term self-managed rehabilitation; and importantly, improvements in the physical and psychosocial aspects of recovery.
Saeed, Faisal; Salim, Naomie; Abdo, Ammar
2013-07-01
Many consensus clustering methods have been applied in different areas such as pattern recognition, machine learning, information theory and bioinformatics. However, few methods have been used for clustering chemical compounds. In this paper, an information theory- and voting-based algorithm, the Adaptive Cumulative Voting-based Aggregation Algorithm (A-CVAA), was examined for combining multiple clusterings of chemical structures. The effectiveness of the clusterings was evaluated based on the ability of the clustering method to separate active from inactive molecules in each cluster, and the results were compared with Ward's method. The chemical dataset MDL Drug Data Report (MDDR) and the Maximum Unbiased Validation (MUV) dataset were used. Experiments suggest that the adaptive cumulative voting-based consensus method can improve the effectiveness of combining multiple clusterings of chemical structures. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
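The general idea of voting-based consensus clustering can be sketched as follows (this is a generic co-association voting scheme, not the A-CVAA algorithm itself): each pair of compounds receives a vote for every base clustering that groups them together, and the resulting vote matrix is then clustered hierarchically. The example labelings are hypothetical.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def consensus_cluster(labelings, n_clusters):
    """Combine several clusterings by voting: build a co-association matrix
    (fraction of clusterings that put each pair together), then cluster it."""
    labelings = np.asarray(labelings)           # shape: (n_clusterings, n_items)
    n = labelings.shape[1]
    coassoc = np.zeros((n, n))
    for labels in labelings:
        coassoc += (labels[:, None] == labels[None, :])
    coassoc /= len(labelings)
    dist = 1.0 - coassoc                        # pairs voted together often are "close"
    np.fill_diagonal(dist, 0.0)
    Z = linkage(squareform(dist, checks=False), method="average")
    return fcluster(Z, t=n_clusters, criterion="maxclust")

# Three hypothetical clusterings of six compounds (e.g. from different fingerprints).
runs = [[0, 0, 0, 1, 1, 1],
        [0, 0, 1, 1, 1, 1],
        [0, 0, 0, 0, 1, 1]]
print(consensus_cluster(runs, n_clusters=2))
```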
Complexity Theory and Network Centric Warfare
2003-09-01
Defence thinkers everywhere are searching for the science and alchemy that will deliver operational success.
BP network identification technology of infrared polarization based on fuzzy c-means clustering
NASA Astrophysics Data System (ADS)
Zeng, Haifang; Gu, Guohua; He, Weiji; Chen, Qian; Yang, Wei
2011-08-01
Infrared detection systems are frequently employed in surveillance and reconnaissance missions to detect particular targets of interest in both civilian and military communities. By incorporating the polarization of light as supplementary information, target discrimination performance can be enhanced. This paper therefore proposes an infrared target identification method based on fuzzy theory and a neural network that uses the polarization properties of targets. The degree of polarization and the light intensity are used to advance the unsupervised KFCM (kernel fuzzy c-means) clustering method, and a database of the polarization properties of different materials is established. For any input polarization degree (e.g., 10°, 15°, 20°, 25°, 30°), the trained network outputs a probability distribution over the corresponding material types. KFCM, which is more robust and accurate than FCM, introduces the kernel idea and assigns noise points and invalid values different but intuitively reasonable weights. Because material properties are characterized differently, some conflicts arise in the classification results, so Dempster-Shafer (D-S) evidence theory is used to combine the polarization and intensity information. Results show that the clustering precision and speed of KFCM are higher than those of FCM. The artificial neural network realizes material identification, reasonably addressing the complexity of the environmental information in infrared polarization and the inadequacy of background knowledge and inference rules. This polarization identification method is fast, self-adaptive, and high in resolution.
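A minimal sketch of plain fuzzy c-means (FCM), the baseline that KFCM kernelizes, applied to synthetic (degree of polarization, intensity) features. The kernelized variant, the D-S fusion step, and the neural network are not shown; the data and parameters are made up.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, iters=100, tol=1e-6, seed=0):
    """Plain fuzzy c-means. X: (n_samples, n_features); returns (centers, memberships)."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)                    # fuzzy memberships sum to 1
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]   # weighted cluster centers
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U_new = 1.0 / (d ** (2 / (m - 1)))
        U_new /= U_new.sum(axis=1, keepdims=True)        # standard FCM membership update
        if np.max(np.abs(U_new - U)) < tol:
            U = U_new
            break
        U = U_new
    return centers, U

# Synthetic (degree of polarization, normalized intensity) samples for two materials.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([0.2, 0.4], 0.03, (50, 2)),
               rng.normal([0.6, 0.7], 0.03, (50, 2))])
centers, U = fuzzy_c_means(X, c=2)
print("cluster centers:\n", np.round(centers, 3))
```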
Du, Yongzhao; Fu, Yuqing; Zheng, Lixin
2016-12-20
A real-time complex amplitude reconstruction method for determining the dynamic beam quality M2 factor, based on a Mach-Zehnder self-referencing interferometer wavefront sensor, is developed. Using the proposed complex amplitude reconstruction method, full characterization of the laser beam, including amplitude (intensity profile) and phase information, can be reconstructed from a single interference pattern with the Fourier fringe pattern analysis method in a one-shot measurement. With the reconstructed complex amplitude, the beam field at any position z along the propagation direction can be obtained using diffraction integral theory. The beam quality M2 factor of the dynamic beam is then calculated according to the method specified in the ISO 11146 standard. The feasibility of the proposed method is demonstrated by theoretical analysis and experiment, including static and dynamic beam processes. The method is simple, fast, operates without movable parts, and allows the laser beam to be investigated in conditions that are inaccessible to existing methods.
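The core of Fourier fringe pattern analysis can be sketched in one dimension: Fourier transform the interferogram, isolate the carrier sideband, shift it to baseband, inverse transform, and take the angle to recover the phase. The synthetic fringe below and all parameters are illustrative; the real system is two-dimensional and also computes the M2 factor, which is not shown.

```python
import numpy as np

N = 1024
x = np.arange(N)
carrier_bin = 80
f0 = carrier_bin / N                                     # carrier (fringe) frequency, cycles/pixel
true_phase = 2.0 * np.sin(2 * np.pi * x / N)             # hypothetical object phase
fringe = 1.0 + 0.8 * np.cos(2 * np.pi * f0 * x + true_phase)   # recorded interferogram (1-D cut)

F = np.fft.fft(fringe)
freqs = np.fft.fftfreq(N)
sideband = np.where((freqs > f0 / 2) & (freqs < 3 * f0 / 2), F, 0)  # keep the +1 order only
analytic = np.fft.ifft(np.roll(sideband, -carrier_bin))  # shift the carrier to zero frequency
recovered = np.unwrap(np.angle(analytic))                # wrapped -> continuous phase
recovered -= recovered.mean()                            # drop the arbitrary piston term
print("max phase error (rad):", np.max(np.abs(recovered - true_phase)))
```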
Dynamic Analysis of the Carotid-Kundalini Map
NASA Astrophysics Data System (ADS)
Wang, Xingyuan; Liang, Qingyong; Meng, Juan
The nature of the fixed points of the Carotid-Kundalini (C-K) map is studied, and the boundary equation of the first bifurcation of the C-K map in the parameter plane is presented. Using the quantitative criteria and rules of chaotic systems, the paper reveals the general features of the C-K map as it transforms from regularity to chaos. The following conclusions are obtained: (i) chaotic patterns of the C-K map may emerge out of period-doubling bifurcation; (ii) chaotic crisis phenomena are found. The authors also analyzed the orbit of the critical point of the complex C-K map and put forward a definition of the Mandelbrot-Julia set of the complex C-K map. The authors generalized Welstead and Cromer's periodic scanning technique and used it to construct a series of Mandelbrot-Julia sets of the complex C-K map. Based on the experimental mathematics approach of combining the theory of analytic functions of one complex variable with computer-aided drawing, we investigated the symmetry of the Mandelbrot-Julia sets, studied the topological inflexibility of the distribution of the periodic regions in the Mandelbrot set, and found that the Mandelbrot set contains abundant information about the structure of Julia sets, recovering the whole portrait of the Julia sets qualitatively from the Mandelbrot set.
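The exact complex C-K map is not reproduced here; the sketch below only shows the generic escape-time construction used to draw Mandelbrot-type sets for any one-parameter complex map, iterated from the critical point. The quadratic map z -> z^2 + c is a placeholder that would be swapped for the complex C-K map.

```python
import numpy as np

def escape_time(f, z0, c, max_iter=200, radius=4.0):
    """Number of iterations of z <- f(z, c) before |z| exceeds the escape radius."""
    z = z0
    for k in range(max_iter):
        z = f(z, c)
        if abs(z) > radius:
            return k
    return max_iter

quadratic = lambda z, c: z * z + c       # placeholder; replace with the complex C-K map

# Mandelbrot-type set: scan the parameter c, iterating from the critical point z0 = 0.
re = np.linspace(-2.0, 1.0, 60)
im = np.linspace(-1.2, 1.2, 30)
for y in im:
    row = "".join("#" if escape_time(quadratic, 0.0, complex(xr, y)) == 200 else "."
                  for xr in re)
    print(row)
```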
Cooperative Localization for Multi-AUVs Based on GM-PHD Filters and Information Entropy Theory
Zhang, Lichuan; Wang, Tonghao; Xu, Demin
2017-01-01
Cooperative localization (CL) is considered a promising method for underwater localization with respect to multiple autonomous underwater vehicles (multi-AUVs). In this paper, we proposed a CL algorithm based on information entropy theory and the probability hypothesis density (PHD) filter, aiming to enhance the global localization accuracy of the follower. In the proposed framework, the follower carries lower cost navigation systems, whereas the leaders carry better ones. Meanwhile, the leaders acquire the followers’ observations, including both measurements and clutter. Then, the PHD filters are utilized on the leaders and the results are communicated to the followers. The followers then perform weighted summation based on all received messages and obtain a final positioning result. Based on the information entropy theory and the PHD filter, the follower is able to acquire a precise knowledge of its position. PMID:28991191
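The final fusion step described above (the follower's weighted summation of the leaders' results) can be sketched generically: each estimate is weighted by a decreasing function of its entropy, so that more certain estimates contribute more. The exp(-entropy) weighting, Gaussian estimates, and numbers below are illustrative assumptions, not the paper's exact GM-PHD pipeline.

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy of a k-dimensional Gaussian with covariance cov (nats)."""
    k = cov.shape[0]
    return 0.5 * np.log(((2 * np.pi * np.e) ** k) * np.linalg.det(cov))

def entropy_weighted_fusion(means, covs):
    """Fuse position estimates by weighted summation; lower-entropy estimates weigh more."""
    H = np.array([gaussian_entropy(C) for C in covs])
    w = np.exp(-H)
    w /= w.sum()
    fused = sum(wi * m for wi, m in zip(w, means))
    return fused, w

# Hypothetical follower-position estimates reported by three leaders (metres).
means = [np.array([10.2, 5.1]), np.array([9.8, 4.7]), np.array([10.6, 5.6])]
covs = [np.diag([0.2, 0.2]), np.diag([0.05, 0.05]), np.diag([1.0, 1.0])]
fused, w = entropy_weighted_fusion(means, covs)
print("weights:", np.round(w, 3), "fused position:", np.round(fused, 2))
```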
Education: a complex and empowering social work intervention at the end of life.
Cagle, John G; Kovacs, Pamela J
2009-02-01
Education is a frequently used social work intervention. Yet it seems to be an underappreciated and a deceptively complex intervention that social workers may not be adequately prepared to use. Reliable, accessible information is essential as it helps prevent unnecessary crises, facilitates coping, and promotes self-determination. This article conceptualizes education as a fundamental social work intervention and discusses the role social workers play in providing information that is both empowering and culturally sensitive. In particular, this article focuses on social workers working with patients and families facing life-threatening situations, including those in hospice and other end-of-life care settings. After reviewing the relevant literature and theory and exploring the inherent complexities of educational interventions, the authors recommend strategies for more effectively helping patients and families access the information they need.
Nathan, Mitchell J; Walkington, Candace
2017-01-01
We develop a theory of grounded and embodied mathematical cognition (GEMC) that draws on action-cognition transduction for advancing understanding of how the body can support mathematical reasoning. GEMC proposes that participants' actions serve as inputs capable of driving the cognition-action system toward associated cognitive states. This occurs through a process of transduction that promotes valuable mathematical insights by eliciting dynamic depictive gestures that enact spatio-temporal properties of mathematical entities. Our focus here is on pre-college geometry proof production. GEMC suggests that action alone can foster insight but is insufficient for valid proof production if action is not coordinated with language systems for propositionalizing general properties of objects and space. GEMC guides the design of a video game-based learning environment intended to promote students' mathematical insights and informal proofs by eliciting dynamic gestures through in-game directed actions. GEMC generates several hypotheses that contribute to theories of embodied cognition and to the design of science, technology, engineering, and mathematics (STEM) education interventions. Pilot study results with a prototype video game tentatively support theory-based predictions regarding the role of dynamic gestures for fostering insight and proof-with-insight, and for the role of action coupled with language to promote proof-with-insight. But the pilot yields mixed results for deriving in-game interventions intended to elicit dynamic gesture production. Although our central purpose is an explication of GEMC theory and the role of action-cognition transduction, the theory-based video game design reveals the potential of GEMC to improve STEM education, and highlights the complex challenges of connecting embodiment research to education practices and learning environment design.
A Quantum-Based Similarity Method in Virtual Screening.
Al-Dabbagh, Mohammed Mumtaz; Salim, Naomie; Himmat, Mubarak; Ahmed, Ali; Saeed, Faisal
2015-10-02
One of the most widely used techniques for ligand-based virtual screening is similarity searching. This study adopts concepts from quantum mechanics to present a state-of-the-art similarity method for molecules inspired by quantum theory. The representation of molecular compounds in a mathematical quantum space plays a vital role in the development of the quantum-based similarity approach. One of the key concepts of quantum theory is the use of complex numbers. Hence, this study proposes three techniques for embedding and re-representing molecular compounds in complex-number format. The quantum-based similarity method developed in this study, which relies on the complex pure Hilbert space of molecules, is called Standard Quantum-Based (SQB). The recall of retrieved active molecules was measured at the top 1% and top 5%, and a significance test was used to evaluate the proposed methods. The MDL Drug Data Report (MDDR), Maximum Unbiased Validation (MUV) and Directory of Useful Decoys (DUD) data sets were used for the experiments and were represented by 2D fingerprints. Simulated virtual screening experiments show that the effectiveness of the SQB method increased significantly, compared with the Tanimoto benchmark similarity measure, owing to the representational power of molecular compounds in complex-number form.
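The SQB measure itself is not reproduced here; the sketch below only shows the Tanimoto benchmark referred to above, computed on binary 2D fingerprints, which is the standard baseline for ranking a screening library against a query. The fingerprints are made up.

```python
import numpy as np

def tanimoto(fp_a, fp_b):
    """Tanimoto coefficient for binary fingerprints: |A & B| / |A | B|."""
    a = np.asarray(fp_a, dtype=bool)
    b = np.asarray(fp_b, dtype=bool)
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 0.0

query   = [1, 0, 1, 1, 0, 0, 1, 0]
library = {"mol1": [1, 0, 1, 0, 0, 0, 1, 0],
           "mol2": [0, 1, 0, 1, 1, 0, 0, 1],
           "mol3": [1, 0, 1, 1, 0, 1, 1, 0]}
ranked = sorted(library, key=lambda m: tanimoto(query, library[m]), reverse=True)
for m in ranked:
    print(m, round(tanimoto(query, library[m]), 3))
```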
Ellis, Beverley; Herbert, Stuart Ian
2011-01-01
To identify key elements and characteristics of complex adaptive systems (CAS) relevant to implementing clinical governance, drawing on lessons from quality improvement programmes and the use of informatics in primary care. The research strategy includes a literature review to develop theoretical models of clinical governance of quality improvement in primary care organisations (PCOs) and a survey of PCOs. Complex adaptive system theories are a valuable tool to help make sense of natural phenomena, which include human responses to problem solving within the sampled PCOs. The research commenced with a survey; 76% (n = 16) of respondents preferred to support the implementation of clinical governance initiatives guided by outputs from general practice electronic health records. There was considerable variation in the way in which consultation data was captured, recorded and organised. Incentivised information sharing led to consensus on coding policies and models of data recording ahead of national contractual requirements. Informatics was acknowledged as a mechanism to link electronic health record outputs, quality improvement and resources. Investment in informatics was identified as a development priority in order to embed clinical governance principles in practice. Complex adaptive system theory usefully describes evolutionary change processes, providing insight into how the origins of quality assurance were predicated on rational reductionism and linearity. New forms of governance do not neutralise previous models, but add further dimensions to them. Clinical governance models have moved from deterministic and 'objective' factors to incorporate cultural aspects with feedback about quality enabled by informatics. The socio-technical lessons highlighted should inform healthcare management.
A diffusion of innovations model of physician order entry.
Ash, J S; Lyman, J; Carpenter, J; Fournier, L
2001-01-01
To interpret the results of a cross-site study of physician order entry (POE) in hospitals using a diffusion of innovations theory framework. Qualitative study using observation, focus groups, and interviews. Data were analyzed by an interdisciplinary team of researchers using a grounded approach to identify themes. Themes were then interpreted using classical Diffusion of Innovations (DOI) theory as described by Rogers [1]. Four high level themes were identified: organizational issues; clinical and professional issues; technology implementation issues; and issues related to the organization of information and knowledge. Further analysis using the DOI framework indicated that POE is an especially complex information technology innovation when one considers communication, time, and social system issues in addition to attributes of the innovation itself. Implementation strategies for POE should be designed to account for its complex nature. The ideal would be a system that is both customizable and integrated with other parts of the information system, is implemented with maximum involvement of users and high levels of support, and is surrounded by an atmosphere of trust and collaboration.
Life in the Hive: Supporting Inquiry into Complexity Within the Zone of Proximal Development
NASA Astrophysics Data System (ADS)
Danish, Joshua A.; Peppler, Kylie; Phelps, David; Washington, Dianna
2011-10-01
Research into students' understanding of complex systems typically ignores young children because of misinterpretations of young children's competencies. Furthermore, studies that do recognize young children's competencies tend to focus on what children can do in isolation. As an alternative, we propose an approach to designing for young children that is grounded in the notion of the Zone of Proximal Development (Vygotsky 1978) and leverages Activity Theory to design learning environments. In order to highlight the benefits of this approach, we describe our process for using Activity Theory to inform the design of new software and curricula in a way that is productive for young children to learn concepts that we might have previously considered to be "developmentally inappropriate". As an illuminative example, we then present a discussion of the design of the BeeSign simulation software and accompanying curriculum, which was specifically designed from an Activity Theory perspective to engage young children in learning about complex systems (Danish 2009a, b). Furthermore, to illustrate the benefits of this approach, we will present findings from a new study where 40 first- and second-grade students participated in the BeeSign curriculum to learn about how honeybees collect nectar from a complex systems perspective. We conclude with some practical suggestions for how such an approach to using Activity Theory for research and design might be adopted by other science educators and designers.
Modeling and dynamical topology properties of VANET based on complex networks theory
NASA Astrophysics Data System (ADS)
Zhang, Hong; Li, Jie
2015-01-01
Vehicular Ad hoc Network (VANET) is a special subset of multi-hop Mobile Ad hoc Networks in which vehicles can not only communicate with each other but also with the fixed equipment along the roads through wireless interfaces. Recently, it has been discovered that essential real-world systems share similar properties when they are regarded as networks; among these, the dynamic topology structure of the VANET system is an important issue. Many real-world networks actually grow with preferential attachment, like the Internet, transportation systems and telephone networks. These phenomena suggest that a strategy for calibrating and controlling the topology parameters could help uncover the rules governing VANET topology change, relieving traffic jams, preventing traffic accidents and improving traffic safety. VANET is a typical complex network with its own basic characteristics. In this paper, we focus on the macroscopic Vehicle-to-Infrastructure (V2I) and Vehicle-to-Vehicle (V2V) inter-vehicle communication network using complex network theory. In particular, this paper is the first to propose a method for analyzing the topological structure and performance of VANET and to present the communications in VANET from a new perspective. Accordingly, we use the degree distribution, clustering coefficient and shortest path length of complex networks to implement our strategy through numerical examples and simulation. All the results demonstrate that VANET shows small-world network features and is characterized by a truncated scale-free, power-law degree distribution. The average path length of the network is simulated numerically, which indicates that the network shows the small-world property and is rarely affected by randomness. What's more, we carry out extensive simulations of information propagation and mathematically prove the power-law property when γ > 2. The results of this study provide useful information for VANET optimization from a macroscopic perspective.
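The three topology measures named above are readily computed with networkx. The sketch below uses a random geometric graph as a stand-in for radio-range connectivity between vehicles; the paper builds its topology from vehicular mobility instead, so the graph and parameters here are illustrative only.

```python
import networkx as nx
from collections import Counter

# Stand-in topology: nodes are vehicles/roadside units, edges join pairs within radio range.
G = nx.random_geometric_graph(n=200, radius=0.12, seed=42)

degree_hist = Counter(dict(G.degree()).values())             # degree distribution
avg_clustering = nx.average_clustering(G)                     # clustering coefficient

# Average shortest path length is only defined on a connected graph,
# so compute it on the largest connected component.
giant = G.subgraph(max(nx.connected_components(G), key=len))
avg_path = nx.average_shortest_path_length(giant)

print("degree distribution:", dict(sorted(degree_hist.items())))
print(f"average clustering coefficient: {avg_clustering:.3f}")
print(f"average shortest path length (giant component): {avg_path:.3f}")
```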
Brief Instrumental School-Based Mentoring for Middle School Students: Theory and Impact
ERIC Educational Resources Information Center
McQuillin, Samuel D.; Lyons, Michael D.
2016-01-01
This study evaluated the efficacy of an intentionally brief school-based mentoring program. This academic goal-focused mentoring program was developed through a series of iterative randomized controlled trials, and is informed by research in social cognitive theory, cognitive dissonance theory, motivational interviewing, and research in academic…
Equilibria, information and frustration in heterogeneous network games with conflicting preferences
NASA Astrophysics Data System (ADS)
Mazzoli, M.; Sánchez, A.
2017-11-01
Interactions between people are the basis on which the structure of our society arises as a complex system and, at the same time, are the starting point of any physical description of it. In the last few years, much theoretical research has addressed this issue by combining the physics of complex networks with a description of interactions in terms of evolutionary game theory. We here take this research a step further by introducing a most salient societal factor such as the individuals’ preferences, a characteristic that is key to understanding much of the social phenomenology these days. We consider a heterogeneous, agent-based model in which agents interact strategically with their neighbors, but their preferences and payoffs for the possible actions differ. We study how such a heterogeneous network behaves under evolutionary dynamics and different strategic interactions, namely coordination games and best shot games. With this model we study the emergence of the equilibria predicted analytically in random graphs under best response dynamics, and we extend this test to unexplored contexts like proportional imitation and scale free networks. We show that some theoretically predicted equilibria do not arise in simulations with incomplete information, and we demonstrate the importance of the graph topology and the payoff function parameters for some games. Finally, we discuss our results with the available experimental evidence on coordination games, showing that our model agrees better with the experiment than standard economic theories, and draw hints as to how to maximize social efficiency in situations of conflicting preferences.
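A minimal sketch of best-response dynamics for a coordination game on a network with heterogeneous preferences, in the spirit of the setting above: each agent earns a bonus for playing its preferred action plus one unit per neighbour it matches, and agents update asynchronously until no one deviates. The payoff weights, graph, and schedule are illustrative assumptions, not the paper's exact model.

```python
import networkx as nx
import random

random.seed(3)
G = nx.erdos_renyi_graph(n=30, p=0.15, seed=3)
pref = {i: random.choice([0, 1]) for i in G}       # each agent's preferred action
alpha = 0.8                                         # weight of the idiosyncratic preference
state = {i: random.choice([0, 1]) for i in G}       # initial actions

def payoff(i, action):
    """Coordination payoff: preference bonus plus one per neighbour playing the same action."""
    return alpha * (action == pref[i]) + sum(state[j] == action for j in G[i])

changed, rounds = True, 0
while changed and rounds < 100:                     # asynchronous best-response sweeps
    changed = False
    rounds += 1
    for i in sorted(G):
        best = max((0, 1), key=lambda a: payoff(i, a))
        if best != state[i]:
            state[i], changed = best, True

satisfied = sum(state[i] == pref[i] for i in G)
print(f"converged after {rounds} sweeps; {satisfied}/{len(G)} agents play their preferred action")
```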
NASA Astrophysics Data System (ADS)
Zhou, Renjie; So, Peter T. C.; Yaqoob, Zahid; Jin, Di; Hosseini, Poorya; Kuang, Cuifang; Singh, Vijay Raj; Kim, Yang-Hyo; Dasari, Ramachandra R.
2017-02-01
Most quantitative phase microscopy systems are unable to provide depth-resolved information for measuring complex biological structures. Optical diffraction tomography provides a non-trivial solution by reconstructing the object in 3D from multiple measurements acquired through different realizations. Previously, our lab developed a reflection-mode dynamic speckle-field phase microscopy (DSPM) technique, which can be used to perform depth-resolved measurements in a single shot. Thus, this system is suitable for measuring dynamics in a layer of interest in the sample. DSPM can also be used for tomographic imaging, which promises to solve the long-standing "missing cone" problem in 3D imaging. However, the 3D imaging theory for this type of system has not been developed in the literature. Recently, we have developed an inverse scattering model to rigorously describe the imaging physics in DSPM. Our model is based on diffraction tomography theory and speckle statistics. Using our model, we first precisely calculated the defocus response and the depth resolution of our system. Then, we further calculated the 3D coherence transfer function to link the 3D object structural information with the axially scanned imaging data. From this transfer function, we found that in reflection mode an excellent sectioning effect exists in the low lateral spatial frequency region, thus allowing us to solve the "missing cone" problem. Currently, we are working on using this coherence transfer function to reconstruct layered structures and complex cells.
A hydrodynamic model for cooperating solidary countries
NASA Astrophysics Data System (ADS)
De Luca, Roberto; Di Mauro, Marco; Falzarano, Angelo; Naddeo, Adele
2017-07-01
The goal of international trade theories is to explain the exchange of goods and services between different countries, aiming to benefit from it. Although the idea is very simple and has been known since ancient times, smart policy and business strategies need to be implemented by each party, resulting in a complex and far from obvious interplay. In order to understand such complexity, different theories have been developed since the sixteenth century, and today new ideas still continue to enter the game. Among them, the so-called classical theories are country-based and range from the Absolute and Comparative Advantage theories of A. Smith and D. Ricardo to the Factor Proportions theory of E. Heckscher and B. Ohlin. In this work we build a simple hydrodynamic model able to reproduce the main conclusions of Comparative Advantage theory in its simplest setup, i.e. a two-country world with country A and country B exchanging two goods within a genuine exchange-based economy and a trade flow ruled only by market forces. The model is further generalized by introducing money in order to discuss its role in shaping trade patterns. Advantages and drawbacks of the model are also discussed, together with perspectives for its improvement.
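The Comparative Advantage conclusion that the hydrodynamic model reproduces can be illustrated with a small made-up Ricardian example: two countries, two goods, unit labour requirements, and each country specializing in the good for which its opportunity cost is lower. The numbers below are invented for illustration.

```python
# Unit labour requirements (hours per unit of output), a made-up Ricardian example:
# country A is absolutely more productive in both goods, yet trade still pays.
labour = {
    "A": {"cloth": 1.0, "wine": 2.0},
    "B": {"cloth": 6.0, "wine": 3.0},
}

def opportunity_cost(country, good, other_good):
    """Units of other_good forgone per extra unit of good."""
    return labour[country][good] / labour[country][other_good]

oc = {c: opportunity_cost(c, "cloth", "wine") for c in labour}   # cost of cloth, in wine
cloth_maker = min(oc, key=oc.get)        # lower opportunity cost -> comparative advantage in cloth
wine_maker = max(oc, key=oc.get)
print("opportunity cost of cloth (in wine):", oc)
print(f"{cloth_maker} should specialize in cloth, {wine_maker} in wine;")
print(f"any cloth-for-wine price between {oc[cloth_maker]} and {oc[wine_maker]} "
      "units of wine per unit of cloth benefits both countries.")
```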
Menolascina, Filippo; Bellomo, Domenico; Maiwald, Thomas; Bevilacqua, Vitoantonio; Ciminelli, Caterina; Paradiso, Angelo; Tommasi, Stefania
2009-10-15
Mechanistic models are becoming more and more popular in Systems Biology; identification and control of models underlying biochemical pathways of interest in oncology is a primary goal in this field. Unfortunately, the scarce availability of data still limits our understanding of the intrinsic characteristics of complex pathologies like cancer: acquiring information for a system understanding of complex reaction networks is time consuming and expensive. Stimulus response experiments (SRE) have been used to gain a deeper insight into the details of biochemical mechanisms underlying cell life and functioning. Optimisation of the input time-profile, however, still remains a major area of research due to the complexity of the problem and its relevance for the task of information retrieval in systems biology-related experiments. We have addressed the problem of quantifying the information associated with an experiment using the Fisher Information Matrix and we have proposed an optimal experimental design strategy based on an evolutionary algorithm to cope with the problem of information gathering in Systems Biology. On the basis of the theoretical results obtained in the field of control systems theory, we have studied the dynamical properties of the signals to be used in cell stimulation. The results of this study have been used to develop a microfluidic device for the automation of the process of cell stimulation for system identification. We have applied the proposed approach to the Epidermal Growth Factor Receptor pathway and we observed that it minimises the amount of parametric uncertainty associated with the identified model. A statistical framework based on Monte-Carlo estimations of the uncertainty ellipsoid confirmed the superiority of optimally designed experiments over canonical inputs. The proposed approach can be easily extended to multiobjective formulations that can also take advantage of identifiability analysis. Moreover, the availability of fully automated microfluidic platforms explicitly developed for the task of biochemical model identification will hopefully reduce the effects of the 'data rich--data poor' paradox in Systems Biology.
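The information-quantification step can be sketched generically: under additive Gaussian noise the Fisher Information Matrix is FIM = S^T S / sigma^2, where S is the sensitivity matrix of model outputs with respect to parameters, and a scalar design criterion such as log det FIM (D-optimality) compares candidate experiments. The exponential-decay model below is a stand-in, not the EGFR pathway model, and the designs and numbers are illustrative.

```python
import numpy as np

def model(theta, t):
    """Hypothetical stand-in model: y(t) = a * exp(-k t)."""
    a, k = theta
    return a * np.exp(-k * t)

def fisher_information(theta, t, sigma, eps=1e-6):
    """FIM = S^T S / sigma^2 with S[i, j] = d y(t_i) / d theta_j (finite differences)."""
    theta = np.asarray(theta, dtype=float)
    y0 = model(theta, t)
    S = np.empty((len(t), len(theta)))
    for j in range(len(theta)):
        perturbed = theta.copy()
        perturbed[j] += eps
        S[:, j] = (model(perturbed, t) - y0) / eps
    return S.T @ S / sigma**2

theta, sigma = [2.0, 0.5], 0.05
designs = {"sparse": np.linspace(0.0, 10.0, 5),    # two candidate sampling schedules
           "dense": np.linspace(0.0, 10.0, 25)}
for name, t in designs.items():
    fim = fisher_information(theta, t, sigma)
    print(f"{name} design: log det FIM = {np.log(np.linalg.det(fim)):.2f}")
```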
Demystifying theory and its use in improvement.
Davidoff, Frank; Dixon-Woods, Mary; Leviton, Laura; Michie, Susan
2015-03-01
The role and value of theory in improvement work in healthcare has been seriously underrecognised. We join others in proposing that more informed use of theory can strengthen improvement programmes and facilitate the evaluation of their effectiveness. Many professionals, including improvement practitioners, are unfortunately mystified-and alienated-by theory, which discourages them from using it in their work. In an effort to demystify theory we make the point in this paper that, far from being discretionary or superfluous, theory ('reason-giving'), both informal and formal, is intimately woven into virtually all human endeavour. We explore the special characteristics of grand, mid-range and programme theory; consider the consequences of misusing theory or failing to use it; review the process of developing and applying programme theory; examine some emerging criteria of 'good' theory; and emphasise the value, as well as the challenge, of combining informal experience-based theory with formal, publicly developed theory. We conclude that although informal theory is always at work in improvement, practitioners are often not aware of it or do not make it explicit. The germane issue for improvement practitioners, therefore, is not whether they use theory but whether they make explicit the particular theory or theories, informal and formal, they actually use. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Invariant resolutions for several Fueter operators
NASA Astrophysics Data System (ADS)
Colombo, Fabrizio; Souček, Vladimir; Struppa, Daniele C.
2006-07-01
A proper generalization of complex function theory to higher dimensions is Clifford analysis, and an analogue of holomorphic functions of several complex variables was recently described as the space of solutions of several Dirac equations. The four-dimensional case has special features and is closely connected to functions of quaternionic variables. In this paper we present an approach to the Dolbeault sequence for several quaternionic variables based on symmetries and representation theory. In particular we prove that the resolution of the Cauchy-Fueter system obtained algebraically, via Gröbner bases techniques, is equivalent to the one obtained by R.J. Baston (J. Geom. Phys. 1992).
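For orientation, the Cauchy-Fueter operator whose null solutions play the role of holomorphic functions in the quaternionic setting has the standard form below; in the several-variable system one such equation is imposed per quaternionic variable. This is the textbook definition, not a result specific to the paper.

```latex
% Cauchy-Fueter operator acting on f : H -> H, with q = x_0 + i x_1 + j x_2 + k x_3;
% (Fueter-)regular functions satisfy \overline{\partial}_q f = 0.
\overline{\partial}_{q} f \;=\;
  \frac{\partial f}{\partial x_{0}}
  + i\,\frac{\partial f}{\partial x_{1}}
  + j\,\frac{\partial f}{\partial x_{2}}
  + k\,\frac{\partial f}{\partial x_{3}} \;=\; 0 .
```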
De Silva, Mary J; Breuer, Erica; Lee, Lucy; Asher, Laura; Chowdhary, Neerja; Lund, Crick; Patel, Vikram
2014-07-05
The Medical Research Councils' framework for complex interventions has been criticized for not including theory-driven approaches to evaluation. Although the framework does include broad guidance on the use of theory, it contains little practical guidance for implementers and there have been calls to develop a more comprehensive approach. A prospective, theory-driven process of intervention design and evaluation is required to develop complex healthcare interventions which are more likely to be effective, sustainable and scalable. We propose a theory-driven approach to the design and evaluation of complex interventions by adapting and integrating a programmatic design and evaluation tool, Theory of Change (ToC), into the MRC framework for complex interventions. We provide a guide to what ToC is, how to construct one, and how to integrate its use into research projects seeking to design, implement and evaluate complex interventions using the MRC framework. We test this approach by using ToC within two randomized controlled trials and one non-randomized evaluation of complex interventions. Our application of ToC in three research projects has shown that ToC can strengthen key stages of the MRC framework. It can aid the development of interventions by providing a framework for enhanced stakeholder engagement and by explicitly designing an intervention that is embedded in the local context. For the feasibility and piloting stage, ToC enables the systematic identification of knowledge gaps to generate research questions that strengthen intervention design. ToC may improve the evaluation of interventions by providing a comprehensive set of indicators to evaluate all stages of the causal pathway through which an intervention achieves impact, combining evaluations of intervention effectiveness with detailed process evaluations into one theoretical framework. Incorporating a ToC approach into the MRC framework holds promise for improving the design and evaluation of complex interventions, thereby increasing the likelihood that the intervention will be ultimately effective, sustainable and scalable. We urge researchers developing and evaluating complex interventions to consider using this approach, to evaluate its usefulness and to build an evidence base to further refine the methodology. Clinical trials.gov: NCT02160249.
Satake, S; Park, J-K; Sugama, H; Kanno, R
2011-07-29
Neoclassical toroidal viscosities (NTVs) in tokamaks are investigated using a δf Monte Carlo simulation, and are successfully verified against a combined analytic theory over a wide range of collisionality. A Monte Carlo simulation has been required in the study of NTV because the complexities in the guiding-center orbits of particles and their collisions cannot be fully investigated by analytic theories alone. The results yielded the details of the complex NTV dependence on particle precessions and collisions, which were only roughly predicted by the combined analytic theory. Both numerical and analytic methods can be utilized and extended based on these successful verifications.
Social cognitive theory, metacognition, and simulation learning in nursing education.
Burke, Helen; Mancuso, Lorraine
2012-10-01
Simulation learning encompasses simple, introductory scenarios requiring response to patients' needs during basic hygienic care and during situations demanding complex decision making. Simulation integrates principles of social cognitive theory (SCT) into an interactive approach to learning that encompasses the core principles of intentionality, forethought, self-reactiveness, and self-reflectiveness. Effective simulation requires an environment conducive to learning and introduces activities that foster symbolic coding operations and mastery of new skills; debriefing builds self-efficacy and supports self-regulation of behavior. Tailoring the level of difficulty to students' mastery level supports successful outcomes and motivation to set higher standards. Mindful selection of simulation complexity and structure matches course learning objectives and supports progressive development of metacognition. Theory-based facilitation of simulated learning optimizes efficacy of this learning method to foster maturation of cognitive processes of SCT, metacognition, and self-directedness. Examples of metacognition that are supported through mindful, theory-based implementation of simulation learning are provided. Copyright 2012, SLACK Incorporated.
NASA Astrophysics Data System (ADS)
Thurner, Stefan; Corominas-Murtra, Bernat; Hanel, Rudolf
2017-09-01
There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure for information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems, is degenerate, $H(p) = -\sum_i p_i \log p_i$. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes, the three entropy concepts lead to different functional forms of entropy, which we will refer to as $S_{\mathrm{EXT}}$ for extensive entropy, $S_{\mathrm{IT}}$ for the source information rate in information theory, and $S_{\mathrm{MEP}}$ for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: for Pólya urn processes, which are simple self-reinforcing processes, for sample-space-reducing (SSR) processes, which are simple history-dependent processes that are associated with power-law statistics, and finally for multinomial mixture processes.
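A minimal simulation of a sample-space-reducing (SSR) process of the kind referred to above, under the usual definition: starting from state N, each step jumps uniformly to a strictly lower state until state 1 is reached (the start state itself is not counted). The empirical visit distribution is then compared with Zipf's law and its Shannon entropy, the functional $H(p)$ above, is computed. Parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N, RUNS = 100, 20000
visits = np.zeros(N + 1)

for _ in range(RUNS):                       # one SSR "relaxation" per run
    state = N
    while state > 1:
        state = rng.integers(1, state)      # jump uniformly to a strictly lower state
        visits[state] += 1

p = visits[1:] / visits[1:].sum()           # empirical visit distribution over states 1..N
zipf = 1.0 / np.arange(1, N + 1)
zipf /= zipf.sum()

H = -np.sum(p[p > 0] * np.log(p[p > 0]))    # Shannon entropy functional H(p)
print(f"Shannon entropy of the visit distribution: {H:.3f} nats")
print("empirical p(1), p(2), p(3):", np.round(p[:3], 4), " Zipf:", np.round(zipf[:3], 4))
```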
DOE Office of Scientific and Technical Information (OSTI.GOV)
Escudero, Daniel, E-mail: escudero@kofo.mpg.de; Thiel, Walter, E-mail: thiel@kofo.mpg.de
2014-05-21
We report an assessment of the performance of density functional theory-based multireference configuration interaction (DFT/MRCI) calculations for a set of 3d- and 4d-transition metal (TM) complexes. The DFT/MRCI results are compared to published reference data from reliable high-level multi-configurational ab initio studies. The assessment covers the relative energies of different ground-state minima of the highly correlated CrF{sub 6} complex, the singlet and triplet electronically excited states of seven typical TM complexes (MnO{sub 4}{sup −}, Cr(CO){sub 6}, [Fe(CN){sub 6}]{sup 4−}, four larger Fe and Ru complexes), and the corresponding electronic spectra (vertical excitation energies and oscillator strengths). It includes comparisons with results from different flavors of time-dependent DFT (TD-DFT) calculations using pure, hybrid, and long-range corrected functionals. The DFT/MRCI method is found to be superior to the tested TD-DFT approaches and is thus recommended for exploring the excited-state properties of TM complexes.
Information-theoretic measures of hydrogen-like ions in weakly coupled Debye plasmas
NASA Astrophysics Data System (ADS)
Zan, Li Rong; Jiao, Li Guang; Ma, Jia; Ho, Yew Kam
2017-12-01
Recent developments in information theory provide researchers an alternative and useful tool to quantitatively investigate how the electronic structure varies when atoms interact with an external environment. In this work, we make systematic studies of information-theoretic measures for hydrogen-like ions immersed in weakly coupled plasmas modeled by the Debye-Hückel potential. Shannon entropy, Fisher information, and Fisher-Shannon complexity in both position and momentum spaces are quantified to high accuracy for the hydrogen atom in a large number of stationary states. The plasma screening effect on embedded atoms can significantly affect the electronic density distributions in both conjugate spaces, and it is quantified by the variation of the information quantities. It is shown that the composite quantities (the Shannon entropy sum and the Fisher information product in the combined spaces, and the Fisher-Shannon complexity in the individual spaces) give a more comprehensive description of the atomic structure information than the single quantities. The nodes of the wave functions play a significant role in the changes of the composite information quantities caused by plasmas. With continuously increasing screening strength, all composite quantities in circular states increase monotonically, while in higher-lying excited states where nodal structures exist, they first decrease to a minimum and then increase rapidly before the bound state approaches the continuum limit. The minimum represents the greatest reduction of the uncertainty properties of the atom in the plasma. The lower bounds for the uncertainty product of the system based on composite information quantities are discussed. Our research presents a comprehensive survey of information-theoretic measures for simple atoms embedded in Debye model plasmas.
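As a minimal illustration of the kind of quantity tabulated in that work, the position-space Shannon entropy of the unscreened hydrogen 1s state can be computed by a one-line radial quadrature. The sketch below is not the authors' code: it assumes zero Debye screening (the free-atom limit), atomic units, and a finite integration cutoff, and it only reproduces the known analytic value S_r = 3 + ln(pi).

    import numpy as np
    from scipy.integrate import quad

    # Position-space density of the (unscreened) hydrogen 1s state in atomic units:
    # rho(r) = |psi_100(r)|^2 = exp(-2 r) / pi
    def rho(r):
        return np.exp(-2.0 * r) / np.pi

    # Shannon entropy S_r = -int rho ln(rho) d^3r, reduced to a radial integral
    def integrand(r):
        return -rho(r) * np.log(rho(r)) * 4.0 * np.pi * r**2

    S_r, _ = quad(integrand, 0.0, 50.0)
    print(f"S_r = {S_r:.4f}")  # ~4.1447, matching the analytic value 3 + ln(pi)

For a screened ion the same quadrature applies, but the orbital first has to be obtained numerically from the radial Schrödinger equation with the Debye-Hückel potential V(r) = -exp(-r/lambda_D)/r; that step is beyond this short sketch.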
Automatic Adviser on Mobile Objects Status Identification and Classification
NASA Astrophysics Data System (ADS)
Shabelnikov, A. N.; Liabakh, N. N.; Gibner, Ya M.; Saryan, A. S.
2018-05-01
A mobile object status identification task is defined within the framework of pattern recognition (image discrimination) theory. It is proposed to classify objects into three classes: the object is operating normally; the object requires maintenance; the object should be removed from the production process. Two methods were developed to construct the separating boundaries between the designated classes: (a) using statistical information on the movements executed by the objects under study, and (b) based on regulatory documents and expert commentary. A complex for simulating the operation of the Automatic Adviser and analyzing its results was synthesized. The research results are illustrated with a specific example of cuts rolling from the hump yard. The work was supported by the Russian Fundamental Research Fund, project No. 17-20-01040.
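The abstract does not specify how the statistical boundaries of method (a) are constructed, so the following Python sketch only illustrates one standard way to separate three such status classes from observed movement statistics: linear discriminant analysis on synthetic features. The feature names, class means, and data are hypothetical and are not taken from the paper.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(1)

    # Hypothetical features of observed cut movements (e.g. rolling-speed deviation
    # and braking-distance error); the three classes follow the abstract's scheme:
    # 0 = normal operation, 1 = maintenance required, 2 = remove from the process.
    X = np.vstack([
        rng.normal([0.0, 0.0], 0.3, size=(100, 2)),   # normal operation
        rng.normal([1.0, 0.5], 0.3, size=(100, 2)),   # maintenance required
        rng.normal([2.0, 1.5], 0.3, size=(100, 2)),   # removal required
    ])
    y = np.repeat([0, 1, 2], 100)

    # Method (a): separating boundaries estimated from statistical movement data
    clf = LinearDiscriminantAnalysis().fit(X, y)
    print(clf.predict([[0.9, 0.4]]))  # -> class 1 (maintenance required)

Method (b), boundaries derived from regulatory documents and expert commentary, would instead be encoded as fixed thresholds or rules and is not sketched here.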
Understanding parenting in Manitoba First nations: implications for program development.
Eni, Rachel; Rowe, Gladys
2011-01-01
This qualitative study introduced the "Manitoba First Nation Strengthening Families Maternal Child Health Pilot Project" program and its evaluation methodologies. The study provided a knowledge base for programmers, evaluators, and communities to develop relevant health promotion, prevention, and intervention programming to help meet the health needs of pregnant women and young families. Sixty-five open-ended, semistructured interviews were completed in 13 communities. Data were analyzed using grounded theory. Three major themes emerged from the data: interpersonal support and relationships; socioeconomic factors; and community initiatives. Complex structural and historical events compromise parenting; capacity and resilience are supported through informal and formal health and social supports.
Khakzad, Nima; Khan, Faisal; Amyotte, Paul
2015-07-01
Compared to the remarkable progress in the risk analysis of normal accidents, the risk analysis of major accidents is not as well established, partly because of the complexity of such accidents and partly because of the low probabilities involved. The issue of low probabilities normally arises from the scarcity of relevant data, since major accidents are few and far between. In this work, knowing that major accidents are frequently preceded by accident precursors, a novel precursor-based methodology has been developed for likelihood modeling of major accidents in critical infrastructures, based on a unique combination of accident precursor data, information theory, and approximate reasoning. For this purpose, we have introduced an innovative application of information analysis to identify the most informative near accident of a major accident. The observed data of the near accident were then used to establish predictive scenarios to foresee the occurrence of the major accident. We verified the methodology using offshore blowouts in the Gulf of Mexico, and then demonstrated its application to dam breaches in the United States. © 2015 Society for Risk Analysis.
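The abstract does not spell out the information-analysis step, so the Python sketch below only illustrates one common way to rank candidate precursors by informativeness: computing the mutual information between each precursor's occurrence and the major-accident outcome. The precursor names and joint counts are hypothetical placeholders, not data from the study.

    import numpy as np

    def entropy(p):
        """Shannon entropy in bits of a probability vector (zero entries ignored)."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # Hypothetical joint counts: rows = precursor observed / not observed,
    # columns = major accident occurred / did not occur.
    precursors = {
        "well_kick":   np.array([[4, 40], [1, 200]]),
        "gas_release": np.array([[2, 60], [3, 180]]),
    }

    for name, joint in precursors.items():
        p_joint = joint / joint.sum()
        p_outcome = p_joint.sum(axis=0)    # marginal over the accident outcome
        p_precursor = p_joint.sum(axis=1)  # marginal over the precursor state
        # Mutual information I(precursor; outcome) = H(outcome) + H(precursor) - H(joint)
        mi = entropy(p_outcome) + entropy(p_precursor) - entropy(p_joint.ravel())
        print(f"{name}: I = {mi:.4f} bits")

Under this reading, the precursor with the largest mutual information would be treated as the most informative near accident; the paper's actual procedure may differ in detail.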
Complexity: the organizing principle at the interface of biological (dis)order.
Bhat, Ramray; Pally, Dharma
2017-07-01
The term complexity means several things to biologists. When qualifying morphological phenotype, on the one hand, it is used to signify the sheer complicatedness of living systems, especially as a result of the multicomponent aspect of biological form. On the other hand, it has been used to represent the intricate nature of the connections between the constituents that make up form: a more process-based explanation. In the context of evolutionary arguments, complexity has been defined, in a quantifiable fashion, as the amount of information that an informatic template, such as a sequence of nucleotides or amino acids, stores about its environment. In this perspective, we begin with a brief review of the history of complexity theory. We then introduce a developmental and an evolutionary understanding of what it means for biological systems to be complex. We propose that the complexity of living systems can be understood through two interdependent structural properties: the multiscalarity of interconstituent mechanisms and the excitability of the biological materials. The answer to whether a system becomes more or less complex over time depends on the potential for its constituents to interact in novel ways and combinations to give rise to new structures and functions, as well as on the evolution of excitable properties that would facilitate the exploration of interconstituent organization in the context of their microenvironments and macroenvironments.