Death of a Simulated Pediatric Patient: Toward a More Robust Theoretical Framework.
McBride, Mary E; Schinasi, Dana Aronson; Moga, Michael Alice; Tripathy, Shreepada; Calhoun, Aaron
2017-12-01
A theoretical framework was recently proposed that encapsulates learner responses to simulated death due to action or inaction in the pediatric context. This framework, however, was developed at an institution that allows simulated death and thus does not address the experience of those centers at which this technique is not used. To address this, we performed a parallel qualitative study with the intent of augmenting the initial framework. We conducted focus groups with physicians and nurses who had experienced a simulated cardiac arrest, using a constructivist grounded theory approach. The participants were recruited via e-mail. Transcripts were analyzed by coders blinded to the original framework to generate a list of provisional themes that were iteratively refined. These themes were then compared with the themes from the original article and used to derive a consensus model that incorporated the most relevant features of each. Focus group data yielded 7 themes. Six were similar to those developed in the original framework. One important exception was noted, however: learners not exposed to patient death due to action or inaction often felt that the mannequin's survival was artificial. This additional theme was incorporated into a revised framework. The original framework addresses most aspects of learner reactions to simulated death. Our work suggests that adding the theme pertaining to the lack of realism that can be perceived when the mannequin is unexpectedly saved results in a more robust theoretical framework transferable to centers that do not allow mannequin death.
Doubly robust nonparametric inference on the average treatment effect.
Benkeser, D; Carone, M; Laan, M J Van Der; Gilbert, P B
2017-12-01
Doubly robust estimators are widely used to draw inference about the average effect of a treatment. Such estimators are consistent for the effect of interest if either one of two nuisance parameters is consistently estimated. However, if flexible, data-adaptive estimators of these nuisance parameters are used, double robustness does not readily extend to inference. We present a general theoretical study of the behaviour of doubly robust estimators of an average treatment effect when one of the nuisance parameters is inconsistently estimated. We contrast different methods for constructing such estimators and investigate the extent to which they may be modified to also allow doubly robust inference. We find that while targeted minimum loss-based estimation can be used to solve this problem very naturally, common alternative frameworks appear to be inappropriate for this purpose. We provide a theoretical study and a numerical evaluation of the alternatives considered. Our simulations highlight the need for and usefulness of these approaches in practice, while our theoretical developments have broad implications for the construction of estimators that permit doubly robust inference in other problems.
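The doubly robust construction discussed above can be illustrated with the standard augmented inverse-probability-weighted (AIPW) estimator of the average treatment effect. The sketch below is a generic textbook version in Python on simulated data with simple parametric nuisance models; it is not the authors' targeted minimum loss-based estimator, and all data and modelling choices are illustrative assumptions.

```python
# Minimal AIPW (doubly robust) estimator of the average treatment effect.
# Illustrative sketch only: parametric nuisance models on simulated data,
# not the targeted minimum loss-based estimator discussed in the abstract.
import numpy as np
from sklearn.linear_model import LogisticRegression, LinearRegression

rng = np.random.default_rng(0)
n = 5000
X = rng.normal(size=(n, 2))                      # baseline covariates
p = 1 / (1 + np.exp(-(0.5 * X[:, 0] - 0.25 * X[:, 1])))
A = rng.binomial(1, p)                           # treatment assignment
Y = 1.0 * A + X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n)  # true ATE = 1.0

# Nuisance 1: propensity score  e(X) = P(A = 1 | X)
e_hat = LogisticRegression().fit(X, A).predict_proba(X)[:, 1]

# Nuisance 2: outcome regressions  Q(a, X) = E[Y | A = a, X]
outcome = LinearRegression().fit(np.column_stack([X, A]), Y)
Q1 = outcome.predict(np.column_stack([X, np.ones(n)]))
Q0 = outcome.predict(np.column_stack([X, np.zeros(n)]))

# AIPW estimator: consistent if either nuisance model is correct
psi = (Q1 - Q0
       + A * (Y - Q1) / e_hat
       - (1 - A) * (Y - Q0) / (1 - e_hat))
ate = psi.mean()
se = psi.std(ddof=1) / np.sqrt(n)                # influence-function-based SE
print(f"ATE estimate: {ate:.3f} +/- {1.96 * se:.3f}")
```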
Theory and applications of structured light single pixel imaging
NASA Astrophysics Data System (ADS)
Stokoe, Robert J.; Stockton, Patrick A.; Pezeshki, Ali; Bartels, Randy A.
2018-02-01
Many single-pixel imaging techniques have been developed in recent years. Though the methods of image acquisition vary considerably, they share unifying features that make general analysis possible. Furthermore, the methods developed thus far are based on intuitive processes that enable simple and physically motivated reconstruction algorithms; however, this approach may not leverage the full potential of single-pixel imaging. We present a general theoretical framework of single-pixel imaging based on frame theory, which enables general, mathematically rigorous analysis. We apply our theoretical framework to existing single-pixel imaging techniques, and provide a foundation for developing more advanced methods of image acquisition and reconstruction. The proposed frame-theoretic framework for single-pixel imaging results in improved noise robustness and a decrease in acquisition time, and can take advantage of special properties of the specimen under study. By building on this framework, new methods of imaging with a single-element detector can be developed to realize the full potential associated with single-pixel imaging.
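To make the measurement model concrete, the sketch below simulates single-pixel acquisition with orthogonal Hadamard illumination patterns and reconstructs the scene by applying the pattern transpose. This is a deliberately simplified illustration of the linear measurement process, not the frame-theoretic reconstruction proposed in the abstract; the scene size, patterns, noise level and use of scipy are assumptions.

```python
# Toy single-pixel imaging simulation: each "measurement" is the inner product
# of the scene with one structured illumination pattern. Hadamard patterns are
# orthogonal, so reconstruction here is just the (scaled) transpose; this is a
# simplified illustration, not the frame-theoretic method of the abstract.
import numpy as np
from scipy.linalg import hadamard

side = 16                       # 16 x 16 scene -> 256 pixels (power of two)
n_pix = side * side
H = hadamard(n_pix)             # rows are +/-1 illumination patterns

# A synthetic scene: a bright square on a dark background
scene = np.zeros((side, side))
scene[4:10, 6:12] = 1.0
x = scene.ravel()

# Single-pixel detector: one scalar measurement per pattern, with noise
noise = 0.05 * np.random.default_rng(1).normal(size=n_pix)
y = H @ x + noise

# Orthogonality of Hadamard patterns: H^T H = n_pix * I
x_rec = (H.T @ y) / n_pix
print("reconstruction error (relative L2):",
      np.linalg.norm(x_rec - x) / np.linalg.norm(x))
```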
Sparse distributed memory: understanding the speed and robustness of expert memory
Brogliato, Marcelo S.; Chada, Daniel M.; Linhares, Alexandre
2014-01-01
How can experts, sometimes in exacting detail, almost immediately and very precisely recall memory items from a vast repertoire? The problem in which we will be interested concerns models of theoretical neuroscience that could explain the speed and robustness of an expert's recollection. The approach is based on Sparse Distributed Memory, which has been shown to be plausible, both in a neuroscientific and in a psychological manner, in a number of ways. A crucial characteristic concerns the limits of human recollection, the "tip-of-the-tongue" memory event, which is found at a non-linearity in the model. We expand the theoretical framework, deriving an optimization formula to solve this non-linearity. Numerical results demonstrate how a higher frequency of rehearsal, through work or study, immediately increases the robustness and speed associated with expert memory. PMID:24808842
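A minimal Kanerva-style sparse distributed memory can be sketched in a few lines: hard locations with random binary addresses, writes that superimpose data on all locations within a Hamming radius, and reads that sum counters and threshold. The dimensions and radius below are arbitrary illustrative assumptions, not the parameters analysed in the paper.

```python
# Minimal sparse distributed memory (Kanerva) sketch: write/read binary vectors
# into hard locations selected by Hamming distance. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_bits, n_locations, radius = 256, 2000, 112

addresses = rng.integers(0, 2, size=(n_locations, n_bits))   # hard locations
counters = np.zeros((n_locations, n_bits), dtype=int)         # storage

def activated(address):
    """Hard locations within the Hamming radius of the query address."""
    dist = np.count_nonzero(addresses != address, axis=1)
    return dist <= radius

def write(address, data):
    bipolar = 2 * data - 1                       # 0/1 bits -> -1/+1 counters
    counters[activated(address)] += bipolar

def read(address):
    summed = counters[activated(address)].sum(axis=0)
    return (summed > 0).astype(int)              # majority threshold

pattern = rng.integers(0, 2, size=n_bits)
write(pattern, pattern)                          # autoassociative storage

noisy = pattern.copy()
flip = rng.choice(n_bits, size=16, replace=False)
noisy[flip] ^= 1                                 # corrupt 16 bits of the cue
recalled = read(noisy)
print("bits recovered correctly:",
      np.count_nonzero(recalled == pattern), "/", n_bits)
```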
A Framework for Information Theoretic Cooperative Sensing and Predictive Control
2012-09-11
Miroslav Barić and Francesco Borelli, Decentralized Robust Control Invariance for a Network of Integrators, Proceeding of American Control...from http://www.mpc.berkeley.edu. P4 Miroslav Barić and Francesco Borelli, Distributed Averaging with Flow Constraints, Proceeding of American Control
Establishing an Explanatory Model for Mathematics Identity
ERIC Educational Resources Information Center
Cribbs, Jennifer D.; Hazari, Zahra; Sonnert, Gerhard; Sadler, Philip M.
2015-01-01
This article empirically tests a previously developed theoretical framework for mathematics identity based on students' beliefs. The study employs data from more than 9,000 college calculus students across the United States to build a robust structural equation model. While it is generally thought that students' beliefs about their own competence…
Breakdown of interdependent directed networks.
Liu, Xueming; Stanley, H Eugene; Gao, Jianxi
2016-02-02
Increasing evidence shows that real-world systems interact with one another via dependency connectivities. Failing connectivities are the mechanism behind the breakdown of interacting complex systems, e.g., blackouts caused by the interdependence of power grids and communication networks. Previous research analyzing the robustness of interdependent networks has been limited to undirected networks. However, most real-world networks are directed, their in-degrees and out-degrees may be correlated, and they are often coupled to one another as interdependent directed networks. To understand the breakdown and robustness of interdependent directed networks, we develop a theoretical framework based on generating functions and percolation theory. We find that for interdependent Erdős-Rényi networks the directionality within each network increases their vulnerability and exhibits hybrid phase transitions. We also find that the percolation behavior of interdependent directed scale-free networks with and without degree correlations is so complex that two criteria are needed to quantify and compare their robustness: the percolation threshold and the integrated size of the giant component during an entire attack process. Interestingly, we find that the in-degree and out-degree correlations in each network layer increase the robustness of interdependent degree-heterogeneous networks (which most real networks are), but decrease the robustness of interdependent networks with homogeneous degree distributions and strong coupling strengths. Moreover, by applying our theoretical analysis to real interdependent international trade networks, we find that the robustness of these real-world systems increases with the in-degree and out-degree correlations, confirming our theoretical analysis.
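The flavour of such robustness analyses can be reproduced numerically. The sketch below removes a random fraction of nodes from a single directed Erdős-Rényi graph and tracks the giant strongly connected component, a much-simplified stand-in for the interdependent, generating-function framework described above; the use of networkx and all parameter values are assumptions.

```python
# Simplified robustness experiment on a *single* directed ER network:
# remove a random fraction of nodes and measure the giant strongly connected
# component (GSCC). Illustrative only; the abstract's framework couples two
# directed networks and solves the problem analytically with generating functions.
import random
import networkx as nx

n, mean_degree = 5000, 4.0

def gscc_fraction(G):
    """Size of the giant strongly connected component as a fraction of n."""
    if G.number_of_nodes() == 0:
        return 0.0
    largest = max(nx.strongly_connected_components(G), key=len)
    return len(largest) / n

G0 = nx.gnp_random_graph(n, mean_degree / n, seed=1, directed=True)

random.seed(1)
for f in [0.0, 0.2, 0.4, 0.6, 0.7, 0.8]:
    G = G0.copy()
    G.remove_nodes_from(random.sample(list(G0.nodes()), int(f * n)))
    print(f"removed fraction {f:.1f} -> GSCC fraction {gscc_fraction(G):.3f}")
```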
Martins Pereira, Sandra; de Sá Brandão, Patrícia Joana; Araújo, Joana; Carvalho, Ana Sofia
2017-01-01
Introduction: Antimicrobial resistance (AMR) is a challenging global and public health issue, raising bioethical challenges, considerations and strategies. Objectives: This research protocol presents a conceptual model leading to formulating an empirically based bioethics framework for antibiotic use, AMR and designing ethically robust strategies to protect human health. Methods: Mixed methods research will be used and operationalized into five substudies. The bioethical framework will encompass and integrate two theoretical models: global bioethics and ethical decision-making. Results: Being a study protocol, this article reports on planned and ongoing research. Conclusions: Based on data collection, future findings and using a comprehensive, integrative, evidence-based approach, a step-by-step bioethical framework will be developed for (i) responsible use of antibiotics in healthcare and (ii) design of strategies to decrease AMR. This will entail the analysis and interpretation of approaches from several bioethical theories, including deontological and consequentialist approaches, and the implications of uncertainty to these approaches. PMID:28459355
Robustness and Vulnerability of Networks with Dynamical Dependency Groups.
Bai, Ya-Nan; Huang, Ning; Wang, Lei; Wu, Zhi-Xi
2016-11-28
The dependency property and self-recovery of failure nodes both have great effects on the robustness of networks during the cascading process. Existing investigations focused mainly on the failure mechanism of static dependency groups without considering the time-dependency of interdependent nodes and the recovery mechanism in reality. In this study, we present an evolving network model consisting of failure mechanisms and a recovery mechanism to explore network robustness, where the dependency relations among nodes vary over time. Based on generating function techniques, we provide an analytical framework for random networks with arbitrary degree distribution. In particular, we theoretically find that an abrupt percolation transition exists corresponding to the dynamical dependency groups for a wide range of topologies after initial random removal. Moreover, when the abrupt transition point is above the failure threshold of dependency groups, the evolving network with the larger dependency groups is more vulnerable; when below it, the larger dependency groups make the network more robust. Numerical simulations employing the Erdős-Rényi network and Barabási-Albert scale free network are performed to validate our theoretical results.
A Study on the Security Levels of Spread-Spectrum Embedding Schemes in the WOA Framework.
Wang, Yuan-Gen; Zhu, Guopu; Kwong, Sam; Shi, Yun-Qing
2017-08-23
Security analysis is a very important issue for digital watermarking. Several years ago, according to Kerckhoffs' principle, the famous four security levels, namely insecurity, key security, subspace security, and stego-security, were defined for spread-spectrum (SS) embedding schemes in the framework of watermarked-only attack. However, up to now there has been little application of the definition of these security levels to the theoretical analysis of the security of SS embedding schemes, due to the difficulty of the theoretical analysis. In this paper, based on the security definition, we present a theoretical analysis to evaluate the security levels of five typical SS embedding schemes: the classical SS, the improved SS (ISS), the circular extension of ISS, and the nonrobust and robust natural watermarking. The theoretical analysis of these typical SS schemes is successfully performed by taking advantage of the convolution of probability distributions to derive the probabilistic models of watermarked signals. Moreover, simulations are conducted to illustrate and validate our theoretical analysis. We believe that the theoretical and practical analysis presented in this paper can bridge the gap between the definition of the four security levels and its application to the theoretical analysis of SS embedding schemes.
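For readers unfamiliar with the schemes analysed, the sketch below implements the two simplest of them, classical additive SS embedding and one common form of improved SS (ISS), together with correlation decoding. The host-signal model and parameter values are illustrative assumptions, and the security-level analysis itself is not reproduced.

```python
# Classical spread-spectrum (SS) embedding and a common form of improved SS (ISS),
# with simple correlation decoding. Parameters and host model are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N = 4096                               # host signal length
x = rng.normal(0.0, 10.0, size=N)      # host signal (e.g. transform coefficients)
u = rng.choice([-1.0, 1.0], size=N)    # secret spreading carrier
b = 1                                  # hidden message bit in {-1, +1}
alpha = 1.0                            # embedding strength

# Classical SS: add the modulated carrier to the host
y_ss = x + alpha * b * u

# ISS (Malvar & Florencio style): additionally cancel part of the host
# interference along the carrier direction (lambda_ controls the cancellation).
lambda_ = 0.8
host_proj = (x @ u) / N                # host component along the carrier
y_iss = x + (alpha * b - lambda_ * host_proj) * u

# Correlation detector: sign of the normalized correlation with the carrier
for name, y in [("SS ", y_ss), ("ISS", y_iss)]:
    corr = (y @ u) / N
    print(f"{name}: correlation = {corr:+.3f}, decoded bit = {int(np.sign(corr))}")
```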
Spatio-temporal Granger causality: a new framework
Luo, Qiang; Lu, Wenlian; Cheng, Wei; Valdes-Sosa, Pedro A.; Wen, Xiaotong; Ding, Mingzhou; Feng, Jianfeng
2015-01-01
That physiological oscillations of various frequencies are present in fMRI signals is the rule, not the exception. Herein, we propose a novel theoretical framework, spatio-temporal Granger causality, which allows us to more reliably and precisely estimate the Granger causality from experimental datasets possessing time-varying properties caused by physiological oscillations. Within this framework, Granger causality is redefined as a global index measuring the directed information flow between two time series with time-varying properties. Both theoretical analyses and numerical examples demonstrate that Granger causality is a monotonically increasing function of the temporal resolution used in the estimation. This is consistent with the general principle of coarse graining, which causes information loss by smoothing out very fine-scale details in time and space. Our results confirm that the Granger causality at the finer spatio-temporal scales considerably outperforms the traditional approach in terms of an improved consistency between two resting-state scans of the same subject. To optimally estimate the Granger causality, the proposed theoretical framework is implemented through a combination of several approaches, such as dividing the optimal time window and estimating the parameters at the fine temporal and spatial scales. Taken together, our approach provides a novel and robust framework for estimating the Granger causality from fMRI, EEG, and other related data. PMID:23643924
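The plain (non-spatio-temporal) Granger causality computation that the framework builds on can be sketched directly: fit restricted and full autoregressive models by least squares and compare residual variances. The simulated coupling, model order and data below are assumptions for illustration; the spatio-temporal extension and fine-scale windowing of the paper are not reproduced.

```python
# Plain bivariate Granger causality (y -> x) via least-squares AR fits:
# GC = ln( var(residual | past of x) / var(residual | past of x and y) ).
# Illustrative only; the abstract's spatio-temporal framework builds on this.
import numpy as np

rng = np.random.default_rng(0)
T, p = 5000, 2                                    # samples, model order

# Simulate a pair of series in which y drives x with a one-step lag
y = np.zeros(T)
x = np.zeros(T)
for t in range(1, T):
    y[t] = 0.6 * y[t - 1] + rng.normal()
    x[t] = 0.5 * x[t - 1] + 0.4 * y[t - 1] + rng.normal()

def lagged_design(series_list, p):
    """Stack p lags of each series into a regression design matrix."""
    T = len(series_list[0])
    cols = [s[p - k - 1:T - k - 1] for s in series_list for k in range(p)]
    return np.column_stack(cols)

target = x[p:]
X_restricted = lagged_design([x], p)              # past of x only
X_full = lagged_design([x, y], p)                 # past of x and y

def residual_var(X, tgt):
    beta, *_ = np.linalg.lstsq(X, tgt, rcond=None)
    return (tgt - X @ beta).var()

gc_y_to_x = np.log(residual_var(X_restricted, target) /
                   residual_var(X_full, target))
print(f"Granger causality y -> x: {gc_y_to_x:.3f}")
```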
Berke, Ethan M; Vernez-Moudon, Anne
2014-06-01
As research examining the effect of the built environment on health accelerates, it is critical for health and planning researchers to conduct studies and make recommendations in the context of a robust theoretical framework. We propose a framework for built environment change (BEC) related to improving health. BEC consists of elements of the built environment, how people are exposed to and interact with them perceptually and functionally, and how this exposure may affect health-related behaviours. Integrated into this framework are the legal and regulatory mechanisms and instruments that are commonly used to effect change in the built environment. This framework would be applicable to medical research as well as to issues of policy and community planning.
Liu, Changxin; Gao, Jian; Li, Huiping; Xu, Demin
2018-05-01
The event-triggered control is a promising solution to cyber-physical systems, such as networked control systems, multiagent systems, and large-scale intelligent systems. In this paper, we propose an event-triggered model predictive control (MPC) scheme for constrained continuous-time nonlinear systems with bounded disturbances. First, a time-varying tightened state constraint is computed to achieve robust constraint satisfaction, and an event-triggered scheduling strategy is designed in the framework of dual-mode MPC. Second, the sufficient conditions for ensuring feasibility and closed-loop robust stability are developed, respectively. We show that robust stability can be ensured and communication load can be reduced with the proposed MPC algorithm. Finally, numerical simulations and comparison studies are performed to verify the theoretical results.
Wang, Ding; Liu, Derong; Zhang, Yun; Li, Hongyi
2018-01-01
In this paper, we aim to tackle the neural robust tracking control problem for a class of nonlinear systems using the adaptive critic technique. The main contribution is that a neural-network-based robust tracking control scheme is established for nonlinear systems involving matched uncertainties. The augmented system considering the tracking error and the reference trajectory is formulated and then addressed under adaptive critic optimal control formulation, where the initial stabilizing controller is not needed. The approximate control law is derived via solving the Hamilton-Jacobi-Bellman equation related to the nominal augmented system, followed by closed-loop stability analysis. The robust tracking control performance is guaranteed theoretically via Lyapunov approach and also verified through simulation illustration. Copyright © 2017 Elsevier Ltd. All rights reserved.
A Robust Framework for Microbial Archaeology
Warinner, Christina; Herbig, Alexander; Mann, Allison; Yates, James A. Fellows; Weiß, Clemens L.; Burbano, Hernán A.; Orlando, Ludovic; Krause, Johannes
2017-01-01
Microbial archaeology is flourishing in the era of high-throughput sequencing, revealing the agents behind devastating historical plagues, identifying the cryptic movements of pathogens in prehistory, and reconstructing the ancestral microbiota of humans. Here, we introduce the fundamental concepts and theoretical framework of the discipline, then discuss applied methodologies for pathogen identification and microbiome characterization from archaeological samples. We give special attention to the process of identifying, validating, and authenticating ancient microbes using high-throughput DNA sequencing data. Finally, we outline standards and precautions to guide future research in the field. PMID:28460196
NASA Technical Reports Server (NTRS)
Saleeb, Atef F.; Li, Wei
1995-01-01
This two-part report is concerned with the development of a general framework for the implicit time-stepping integrators for the flow and evolution equations in generalized viscoplastic models. The primary goal is to present a complete theoretical formulation, and to address in detail the algorithmic and numerical analysis aspects involved in its finite element implementation, as well as to critically assess the numerical performance of the developed schemes in a comprehensive set of test cases. On the theoretical side, the general framework is developed on the basis of the unconditionally-stable, backward-Euler difference scheme as a starting point. Its mathematical structure is of sufficient generality to allow a unified treatment of different classes of viscoplastic models with internal variables. In particular, two specific models of this type, which are representative of the present state-of-the-art in metal viscoplasticity, are considered in applications reported here; i.e., fully associative (GVIPS) and non-associative (NAV) models. The matrix forms developed for both these models are directly applicable for both initially isotropic and anisotropic materials, in general (three-dimensional) situations as well as subspace applications (i.e., plane stress/strain, axisymmetric, generalized plane stress in shells). On the computational side, issues related to efficiency and robustness are emphasized in developing the (local) iterative algorithm. In particular, closed-form expressions for residual vectors and (consistent) material tangent stiffness arrays are given explicitly for both GVIPS and NAV models, with their maximum sizes 'optimized' to depend only on the number of independent stress components (but independent of the number of viscoplastic internal state parameters). Significant robustness of the local iterative solution is provided by complementing the basic Newton-Raphson scheme with a line-search strategy for convergence. In the present first part of the report, we focus on the theoretical developments and discussions of the results of numerical-performance studies using the integration schemes for GVIPS and NAV models.
Robustness of Oscillatory Behavior in Correlated Networks
Sasai, Takeyuki; Morino, Kai; Tanaka, Gouhei; Almendral, Juan A.; Aihara, Kazuyuki
2015-01-01
Understanding network robustness against failures of network units is useful for preventing large-scale breakdowns and damages in real-world networked systems. The tolerance of networked systems whose functions are maintained by collective dynamical behavior of the network units has recently been analyzed in the framework called dynamical robustness of complex networks. The effect of network structure on the dynamical robustness has been examined with various types of network topology, but the role of network assortativity, or degree–degree correlations, is still unclear. Here we study the dynamical robustness of correlated (assortative and disassortative) networks consisting of diffusively coupled oscillators. Numerical analyses for the correlated networks with Poisson and power-law degree distributions show that network assortativity enhances the dynamical robustness of the oscillator networks but the impact of network disassortativity depends on the detailed network connectivity. Furthermore, we theoretically analyze the dynamical robustness of correlated bimodal networks with two-peak degree distributions and show the positive impact of the network assortativity. PMID:25894574
The robustness of multiplex networks under layer node-based attack
Zhao, Da-wei; Wang, Lian-hai; Zhi, Yong-feng; Zhang, Jun; Wang, Zhen
2016-01-01
From transportation networks to complex infrastructures, and to social and economic networks, a large variety of systems can be described in terms of multiplex networks formed by a set of nodes interacting through different network layers. Network robustness, as one of the most successful application areas of complex networks, has attracted great interest in a myriad of research realms. In this regard, how multiplex networks respond to potential attack is still an open issue. Here we study the robustness of multiplex networks under layer node-based random or targeted attack, which means that nodes just suffer attacks in a given layer yet no additional influence to their connections beyond this layer. A theoretical analysis framework is proposed to calculate the critical threshold and the size of giant component of multiplex networks when nodes are removed randomly or intentionally. Via numerous simulations, it is unveiled that the theoretical method can accurately predict the threshold and the size of giant component, irrespective of attack strategies. Moreover, we also compare the robustness of multiplex networks under multiplex node-based attack and layer node-based attack, and find that layer node-based attack makes multiplex networks more vulnerable, regardless of average degree and underlying topology. PMID:27075870
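The attack model described here can also be simulated directly. The sketch below builds a two-layer multiplex on a shared node set, strips the layer-1 edges of a randomly attacked node fraction, and finds the mutually connected giant component by iterative pruning; the use of networkx, the mutual-connectivity definition adopted, and all parameters are simplifying assumptions rather than the paper's analytical framework.

```python
# Layer node-based random attack on a two-layer multiplex network: nodes lose
# their edges in layer 1 only, then the mutually connected giant component
# (MCGC) is found by iteratively intersecting the giant components of the two
# layers. Simulation sketch only; the paper derives this analytically.
import random
import networkx as nx

def mcgc_size(layers, alive):
    """Iteratively prune to the largest mutually connected component."""
    alive = set(alive)
    while True:
        for G in layers:
            sub = G.subgraph(alive)
            if sub.number_of_nodes() == 0:
                return 0
            giant = max(nx.connected_components(sub), key=len)
            if len(giant) < len(alive):
                alive = set(giant)
                break
        else:
            return len(alive)

n, k = 3000, 4.0
random.seed(2)
layer1 = nx.gnp_random_graph(n, k / n, seed=11)
layer2 = nx.gnp_random_graph(n, k / n, seed=22)

for f in [0.0, 0.2, 0.4, 0.5, 0.6]:
    attacked = set(random.sample(range(n), int(f * n)))
    l1 = layer1.copy()
    l1.remove_edges_from(list(l1.edges(attacked)))   # attack layer 1 only
    size = mcgc_size([l1, layer2], range(n))
    print(f"attacked fraction {f:.1f} in layer 1 -> MCGC fraction {size / n:.3f}")
```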
3D face recognition under expressions, occlusions, and pose variations.
Drira, Hassen; Ben Amor, Boulbaba; Srivastava, Anuj; Daoudi, Mohamed; Slama, Rim
2013-09-01
We propose a novel geometric framework for analyzing 3D faces, with the specific goals of comparing, matching, and averaging their shapes. Here we represent facial surfaces by radial curves emanating from the nose tips and use elastic shape analysis of these curves to develop a Riemannian framework for analyzing shapes of full facial surfaces. This representation, along with the elastic Riemannian metric, seems natural for measuring facial deformations and is robust to challenges such as large facial expressions (especially those with open mouths), large pose variations, missing parts, and partial occlusions due to glasses, hair, and so on. This framework is shown to be promising from both empirical and theoretical perspectives. In terms of the empirical evaluation, our results match or improve upon the state-of-the-art methods on three prominent databases: FRGCv2, GavabDB, and Bosphorus, each posing a different type of challenge. From a theoretical perspective, this framework allows for formal statistical inferences, such as the estimation of missing facial parts using PCA on tangent spaces and computing average shapes.
Carey, Mariko; Jefford, Michael; Schofield, Penelope; Kelly, Siobhan; Krishnasamy, Meinir; Aranda, Sanchia
2006-04-01
Based on a theoretical framework, we developed an audiovisual resource to promote self-management of eight common chemotherapy side-effects. A patient needs analysis identified content domains; best evidence for preparing patients for threatening medical procedures and a systematic review of effective self-care strategies informed script content. Patients and health professionals were invited to complete a written evaluation of the video. A 25-min video was produced. Fifty health professionals and 37 patients completed the evaluation. All considered the video informative and easy to understand. The majority believed the video would reduce anxiety and help patients prepare for chemotherapy. Underpinned by a robust theoretical framework, we have developed an evidence-based resource that is perceived by both patients and health professionals as likely to enhance preparedness for chemotherapy.
A new framework for comprehensive, robust, and efficient global sensitivity analysis: 2. Application
NASA Astrophysics Data System (ADS)
Razavi, Saman; Gupta, Hoshin V.
2016-01-01
Based on the theoretical framework for sensitivity analysis called "Variogram Analysis of Response Surfaces" (VARS), developed in the companion paper, we develop and implement a practical "star-based" sampling strategy (called STAR-VARS) for the application of VARS to real-world problems. We also develop a bootstrap approach to provide confidence level estimates for the VARS sensitivity metrics and to evaluate the reliability of inferred factor rankings. The effectiveness, efficiency, and robustness of STAR-VARS are demonstrated via two real-data hydrological case studies (a 5-parameter conceptual rainfall-runoff model and a 45-parameter land surface scheme hydrology model), and a comparison with the "derivative-based" Morris and "variance-based" Sobol approaches is provided. Our results show that STAR-VARS provides reliable and stable assessments of "global" sensitivity across the full range of scales in the factor space, while being 1-2 orders of magnitude more efficient than the Morris or Sobol approaches.
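The core idea of a directional variogram of the response surface can be illustrated in a few lines: estimate gamma_i(h) = 0.5 * E[(f(x + h*e_i) - f(x))^2] for each factor from a set of sample points. The sketch below conveys only this flavour of VARS on a toy function; it is not the STAR-VARS sampling strategy or its bootstrap confidence estimates, and the test function and settings are assumptions.

```python
# Simplified flavour of variogram-based sensitivity analysis (VARS):
# for each factor, estimate the directional variogram
#   gamma_i(h) = 0.5 * E[ (f(x + h*e_i) - f(x))^2 ]
# from random "star centres". Toy function and settings are illustrative;
# this is not the full STAR-VARS sampling or its IVARS metrics.
import numpy as np

def model(x):
    # Toy response surface on [0, 1]^3 with very different factor sensitivities
    return np.sin(2 * np.pi * x[..., 0]) + 5.0 * x[..., 1] ** 2 + 0.1 * x[..., 2]

rng = np.random.default_rng(0)
n_centres, n_factors, h = 200, 3, 0.1

centres = rng.uniform(0.0, 1.0 - h, size=(n_centres, n_factors))
f_centre = model(centres)

for i in range(n_factors):
    perturbed = centres.copy()
    perturbed[:, i] += h                       # step along factor i only
    gamma = 0.5 * np.mean((model(perturbed) - f_centre) ** 2)
    print(f"factor {i + 1}: directional variogram gamma({h}) = {gamma:.4f}")
```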
Self-determination theory: a framework for clubhouse psychosocial rehabilitation research.
Raeburn, Toby; Schmied, Virginia; Hungerford, Catherine; Cleary, Michelle
2015-02-01
The Clubhouse model is a widely used approach to psychosocial rehabilitation that has been a pioneer in supporting recovery-oriented programmes. Little consideration has been given, however, to the theories that guide research of the recovery practices used by Clubhouses. In this paper, we provide a description of self-determination theory, including its philosophical background, followed by an explanation of its relevance to health care and Clubhouse contexts. We argue that self-determination theory provides a robust social constructionist theoretical framework that is well-suited to informing research related to psychosocial rehabilitation, recovery-oriented practices and the Clubhouse model.
Methodological development of the interactive INTERLINKS Framework for Long-term Care
Billings, Jenny; Leichsenring, Kai
2014-01-01
There is increasing international research into health and social care services for older people in need of long-term care (LTC), but problems remain with respect to acquiring robust comparative information to enable judgements to be made regarding the most beneficial and cost-effective approaches. The project ‘INTERLINKS’ (‘Health systems and LTC for older people in Europe’) funded by the EU 7th Framework programme was developed to address the challenges associated with the accumulation and comparison of evidence in LTC across Europe. It developed a concept and method to describe and analyse LTC and its links with the health and social care system through the accumulation of policy and practice examples on an interactive web-based framework for LTC. This paper provides a critical overview of the theoretical and methodological approaches used to develop and implement the INTERLINKS Framework for LTC, with the aim of providing some guidance to researchers in this area. INTERLINKS has made a significant contribution to knowledge but robust evidence and comparability across European countries remain problematic due to the current and growing complexity and diversity of integrated LTC implementation. PMID:25120413
A new framework for comprehensive, robust, and efficient global sensitivity analysis: 1. Theory
NASA Astrophysics Data System (ADS)
Razavi, Saman; Gupta, Hoshin V.
2016-01-01
Computer simulation models are continually growing in complexity with increasingly more factors to be identified. Sensitivity Analysis (SA) provides an essential means for understanding the role and importance of these factors in producing model responses. However, conventional approaches to SA suffer from (1) an ambiguous characterization of sensitivity, and (2) poor computational efficiency, particularly as the problem dimension grows. Here, we present a new and general sensitivity analysis framework (called VARS), based on an analogy to "variogram analysis," that provides an intuitive and comprehensive characterization of sensitivity across the full spectrum of scales in the factor space. We prove, theoretically, that Morris (derivative-based) and Sobol (variance-based) methods and their extensions are special cases of VARS, and that their SA indices can be computed as by-products of the VARS framework. Synthetic functions that resemble actual model response surfaces are used to illustrate the concepts, and show VARS to be as much as two orders of magnitude more computationally efficient than the state-of-the-art Sobol approach. In a companion paper, we propose a practical implementation strategy, and demonstrate the effectiveness, efficiency, and reliability (robustness) of the VARS framework on real-data case studies.
Risk, Robustness and Water Resources Planning Under Uncertainty
NASA Astrophysics Data System (ADS)
Borgomeo, Edoardo; Mortazavi-Naeini, Mohammad; Hall, Jim W.; Guillod, Benoit P.
2018-03-01
Risk-based water resources planning is based on the premise that water managers should invest up to the point where the marginal benefit of risk reduction equals the marginal cost of achieving that benefit. However, this cost-benefit approach may not guarantee robustness under uncertain future conditions, for instance under climatic changes. In this paper, we expand risk-based decision analysis to explore possible ways of enhancing robustness in engineered water resources systems under different risk attitudes. Risk is measured as the expected annual cost of water use restrictions, while robustness is interpreted in the decision-theoretic sense as the ability of a water resource system to maintain performance, expressed as a tolerable risk of water use restrictions, under a wide range of possible future conditions. Linking risk attitudes with robustness allows stakeholders to explicitly trade off incremental increases in robustness with investment costs for a given level of risk. We illustrate the framework through a case study of London's water supply system, using state-of-the-art regional climate simulations to inform the estimation of risk and robustness.
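A toy calculation of the two metrics defined above, risk as the expected annual cost of restrictions and robustness as the share of plausible futures in which that cost stays tolerable, is sketched below for two hypothetical investment options. The scenario costs, thresholds and options are entirely synthetic assumptions, not results from the paper's London case study.

```python
# Toy illustration of the two metrics used above: "risk" as the expected annual
# cost of water-use restrictions, and "robustness" as the fraction of plausible
# future scenarios in which that cost stays below a tolerable level. The
# scenario costs and options are entirely synthetic assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_scenarios = 1000
tolerable_cost = 5.0                     # tolerable annual restriction cost

# Annual restriction costs per scenario for two hypothetical portfolios:
# option B invests more, shifting and narrowing the cost distribution.
cost_option_a = rng.gamma(shape=2.0, scale=3.0, size=n_scenarios)
cost_option_b = rng.gamma(shape=2.0, scale=1.5, size=n_scenarios)

for name, cost, capex in [("A (cheap) ", cost_option_a, 10.0),
                          ("B (robust)", cost_option_b, 40.0)]:
    risk = cost.mean()                                   # expected annual cost
    robustness = np.mean(cost <= tolerable_cost)         # share of scenarios OK
    print(f"option {name}: capex {capex:5.1f}, risk {risk:5.2f}, "
          f"robustness {robustness:.2f}")
```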
Impact of self-healing capability on network robustness
NASA Astrophysics Data System (ADS)
Shang, Yilun
2015-04-01
A wide spectrum of real-life systems ranging from neurons to botnets display spontaneous recovery ability. Using the generating function formalism applied to static uncorrelated random networks with arbitrary degree distributions, the microscopic mechanism underlying the depreciation-recovery process is characterized and the effect of varying self-healing capability on network robustness is revealed. It is found that the self-healing capability of nodes has a profound impact on the phase transition in the emergence of percolating clusters, and that salient difference exists in upholding network integrity under random failures and intentional attacks. The results provide a theoretical framework for quantitatively understanding the self-healing phenomenon in varied complex systems.
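A simulation in the same spirit is easy to set up: fail a random fraction of nodes, let each failed node recover independently with probability r, and measure the giant component. The sketch below does this for an Erdős-Rényi graph; it is an illustrative Monte Carlo stand-in for the generating-function treatment in the abstract, with arbitrary parameter choices and networkx assumed.

```python
# Toy simulation of how a self-healing (recovery) probability changes network
# robustness to random failures: each initially failed node recovers
# independently with probability r, and we track the giant component.
# Illustrative only; the abstract treats this analytically with generating functions.
import random
import networkx as nx

def giant_fraction(G, failed):
    sub = G.subgraph(set(G.nodes()) - failed)
    if sub.number_of_nodes() == 0:
        return 0.0
    return len(max(nx.connected_components(sub), key=len)) / G.number_of_nodes()

n, k = 5000, 4.0
G = nx.gnp_random_graph(n, k / n, seed=3)
random.seed(3)

f = 0.8                                        # fraction of nodes that initially fail
for r in [0.0, 0.25, 0.5, 0.75]:               # self-healing probability
    failed = set(random.sample(range(n), int(f * n)))
    healed = {v for v in failed if random.random() < r}
    print(f"failure {f:.1f}, recovery prob {r:.2f} -> "
          f"giant component {giant_fraction(G, failed - healed):.3f}")
```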
Bogaerts, Thomas; Van Yperen-De Deyne, Andy; Liu, Ying-Ya; Lynen, Frederic; Van Speybroeck, Veronique; Van Der Voort, Pascal
2013-09-21
An enantioselective catalyst, consisting of a chiral Mn(III)salen complex entrapped in the MIL-101 metal organic framework, is reported. For the first time, we assemble a robust MOF-cage around a chiral complex. The heterogeneous catalyst shows the same selectivity as the homogeneous complex and is fully recyclable. Theoretical calculations provide insight into this retention of selectivity.
A General Framework of Persistence Strategies for Biological Systems Helps Explain Domains of Life
Yafremava, Liudmila S.; Wielgos, Monica; Thomas, Suravi; Nasir, Arshan; Wang, Minglei; Mittenthal, Jay E.; Caetano-Anollés, Gustavo
2012-01-01
The nature and cause of the division of organisms in superkingdoms is not fully understood. Assuming that environment shapes physiology, here we construct a novel theoretical framework that helps identify general patterns of organism persistence. This framework is based on Jacob von Uexküll’s organism-centric view of the environment and James G. Miller’s view of organisms as matter-energy-information processing molecular machines. Three concepts describe an organism’s environmental niche: scope, umwelt, and gap. Scope denotes the entirety of environmental events and conditions to which the organism is exposed during its lifetime. Umwelt encompasses an organism’s perception of these events. The gap is the organism’s blind spot, the scope that is not covered by umwelt. These concepts bring organisms of different complexity to a common ecological denominator. Ecological and physiological data suggest organisms persist using three strategies: flexibility, robustness, and economy. All organisms use umwelt information to flexibly adapt to environmental change. They implement robustness against environmental perturbations within the gap generally through redundancy and reliability of internal constituents. Both flexibility and robustness improve survival. However, they also incur metabolic matter-energy processing costs, which otherwise could have been used for growth and reproduction. Lineages evolve unique tradeoff solutions among strategies in the space of what we call “a persistence triangle.” Protein domain architecture and other evidence support the preferential use of flexibility and robustness properties. Archaea and Bacteria gravitate toward the triangle’s economy vertex, with Archaea biased toward robustness. Eukarya trade economy for survivability. Protista occupy a saddle manifold separating akaryotes from multicellular organisms. Plants and the more flexible Fungi share an economic stratum, and Metazoa are locked in a positive feedback loop toward flexibility. PMID:23443991
Quantum theory as plausible reasoning applied to data obtained by robust experiments.
De Raedt, H; Katsnelson, M I; Michielsen, K
2016-05-28
We review recent work that employs the framework of logical inference to establish a bridge between data gathered through experiments and their objective description in terms of human-made concepts. It is shown that logical inference applied to experiments for which the observed events are independent and for which the frequency distribution of these events is robust with respect to small changes of the conditions under which the experiments are carried out yields, without introducing any concept of quantum theory, the quantum theoretical description in terms of the Schrödinger or the Pauli equation, the Stern-Gerlach or Einstein-Podolsky-Rosen-Bohm experiments. The extraordinary descriptive power of quantum theory then follows from the fact that it is plausible reasoning, that is common sense, applied to reproducible and robust experimental data. © 2016 The Author(s).
Identification of nonlinear modes using phase-locked-loop experimental continuation and normal form
NASA Astrophysics Data System (ADS)
Denis, V.; Jossic, M.; Giraud-Audine, C.; Chomette, B.; Renault, A.; Thomas, O.
2018-06-01
In this article, we address the model identification of nonlinear vibratory systems, with a specific focus on systems modeled with distributed nonlinearities, such as geometrically nonlinear mechanical structures. The proposed strategy theoretically relies on the concept of nonlinear modes of the underlying conservative unforced system and the use of normal forms. Within this framework, it is shown that, without internal resonance, a valid reduced order model for a nonlinear mode is a single Duffing oscillator. We then propose an efficient experimental strategy to measure the backbone curve of a particular nonlinear mode and we use it to identify the free parameters of the reduced order model. The experimental part relies on a Phase-Locked Loop (PLL) and enables a robust and automatic measurement of backbone curves as well as forced responses. It is theoretically and experimentally shown that the PLL is able to stabilize the unstable part of Duffing-like frequency responses, thus enabling its robust experimental measurement. Finally, the whole procedure is tested on three experimental systems: a circular plate, a Chinese gong and a piezoelectric cantilever beam. This enables the procedure to be validated by comparison with available theoretical models as well as with other experimental identification methods.
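The reduced-order model mentioned above, a single Duffing oscillator per nonlinear mode, has a simple first-order backbone relation that can be evaluated directly, as in the sketch below. The coefficient values are arbitrary illustrative assumptions, not parameters identified in the paper's experiments.

```python
# Backbone curve of a single Duffing oscillator, the reduced-order model used
# for one nonlinear mode in the identification strategy described above:
#   x'' + 2*xi*w0*x' + w0^2 * x + Gamma * x^3 = F cos(Omega t)
# To first order, the free (backbone) frequency depends on amplitude a as
#   w_nl(a) ~= w0 * (1 + 3*Gamma*a^2 / (8*w0^2)).
# Parameter values below are arbitrary assumptions for illustration.
import numpy as np

w0 = 2 * np.pi * 100.0                    # linear natural frequency [rad/s]
Gamma = 5.0e9                             # cubic (geometric) nonlinearity coefficient
amplitudes = np.linspace(0.0, 2e-3, 6)    # modal amplitude [m]

for a in amplitudes:
    w_nl = w0 * (1.0 + 3.0 * Gamma * a**2 / (8.0 * w0**2))
    print(f"a = {a * 1e3:5.2f} mm -> backbone frequency {w_nl / (2 * np.pi):8.2f} Hz")
```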
Halse, Meghan E; Procacci, Barbara; Henshaw, Sarah-Louise; Perutz, Robin N; Duckett, Simon B
2017-05-01
We recently reported a pump-probe method that uses a single laser pulse to introduce parahydrogen (p-H2) into a metal dihydride complex and then follows the time-evolution of the p-H2-derived nuclear spin states by NMR. We present here a theoretical framework to describe the oscillatory behaviour of the resultant hyperpolarised NMR signals using a product operator formalism. We consider the cases where the p-H2-derived protons form part of an AX, AXY, AXYZ or AA'XX' spin system in the product molecule. We use this framework to predict the patterns for 2D pump-probe NMR spectra, where the indirect dimension represents the evolution during the pump-probe delay and the positions of the cross-peaks depend on the difference in chemical shift of the p-H2-derived protons and the difference in their couplings to other nuclei. The evolution of the NMR signals of the p-H2-derived protons, as well as the transfer of hyperpolarisation to other NMR-active nuclei in the product, is described. The theoretical framework is tested experimentally for a set of ruthenium dihydride complexes representing the different spin systems. Theoretical predictions and experimental results agree to within experimental error for all features of the hyperpolarised 1H and 31P pump-probe NMR spectra. Thus we establish the laser pump, NMR probe approach as a robust way to directly observe and quantitatively analyse the coherent evolution of p-H2-derived spin order over micro-to-millisecond timescales. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Synthetic biology and regulatory networks: where metabolic systems biology meets control engineering
He, Fei; Murabito, Ettore; Westerhoff, Hans V.
2016-01-01
Metabolic pathways can be engineered to maximize the synthesis of various products of interest. With the advent of computational systems biology, this endeavour is usually carried out through in silico theoretical studies with the aim to guide and complement further in vitro and in vivo experimental efforts. Clearly, what counts is the result in vivo, not only in terms of maximal productivity but also robustness against environmental perturbations. Engineering an organism towards an increased production flux, however, often compromises that robustness. In this contribution, we review and investigate how various analytical approaches used in metabolic engineering and synthetic biology are related to concepts developed by systems and control engineering. While trade-offs between production optimality and cellular robustness have already been studied diagnostically and statically, the dynamics also matter. Integration of the dynamic design aspects of control engineering with the more diagnostic aspects of metabolic, hierarchical control and regulation analysis is leading to the new, conceptual and operational framework required for the design of robust and productive dynamic pathways. PMID:27075000
Exploiting structure: Introduction and motivation
NASA Technical Reports Server (NTRS)
Xu, Zhong Ling
1993-01-01
Research activities performed during the period of 29 June 1993 through 31 Aug. 1993 are summarized. Work on the robust stability of systems whose transfer function or characteristic polynomial is a multilinear affine function of the parameters of interest was developed in two directions, algorithmic and theoretical. In the algorithmic direction, a new approach that reduces the computational burden of checking the robust stability of systems with multilinear uncertainty was found. This technique, called 'stability by linear process,' directly yields an algorithm. On the analysis side, we obtained a robustness criterion for the family of polynomials with coefficients that are multilinear affine functions in the coefficient space, and also obtained a result on the robust stability of diamond families of polynomials with complex coefficients. We obtained limited results for SPR design and provide a framework for solving ACS. Finally, copies of the outline of our results are provided in the appendix; an administrative issue is also included there.
Keshavan, J; Gremillion, G; Escobar-Alvarez, H; Humbert, J S
2014-06-01
Safe, autonomous navigation by aerial microsystems in less-structured environments is a difficult challenge to overcome with current technology. This paper presents a novel visual-navigation approach that combines bioinspired wide-field processing of optic flow information with control-theoretic tools for synthesis of closed loop systems, resulting in robustness and performance guarantees. Structured singular value analysis is used to synthesize a dynamic controller that provides good tracking performance in uncertain environments without resorting to explicit pose estimation or extraction of a detailed environmental depth map. Experimental results with a quadrotor demonstrate the vehicle's robust obstacle-avoidance behaviour in a straight line corridor, an S-shaped corridor and a corridor with obstacles distributed in the vehicle's path. The computational efficiency and simplicity of the current approach offers a promising alternative to satisfying the payload, power and bandwidth constraints imposed by aerial microsystems.
Siegert, Richard J; McPherson, Kathryn M; Taylor, William J
2004-10-21
The aim of this article is to argue that self-regulation theory might offer a useful model for clinical practice, theory-building and empirical research on goal-setting in rehabilitation. Relevant literature on goal-setting and motivation in rehabilitation is considered and some problematic issues for current practice and future research are highlighted. Carver and Scheier's self-regulation theory and its application to rehabilitation research is examined. It is argued that self-regulation theory offers a robust theoretical framework for goal-setting and one in which the salient concepts of motivation and emotion are prominent. Self-regulation theory offers a potentially useful heuristic framework for rehabilitation research.
Thermoacoustics of solids: A pathway to solid state engines and refrigerators
NASA Astrophysics Data System (ADS)
Hao, Haitian; Scalo, Carlo; Sen, Mihir; Semperlotti, Fabio
2018-01-01
Thermoacoustic oscillations were one of the most exciting discoveries in the physics of fluids in the 19th century. Since their discovery, scientists have formulated a comprehensive theoretical explanation of the basic phenomenon, which later found several practical applications in engineering devices. To date, all studies have concentrated on the thermoacoustics of fluid media, where this fascinating mechanism was exclusively believed to exist. Our study shows theoretical and numerical evidence of the existence of thermoacoustic instabilities in solid media. Although the underlying physical mechanism exhibits some interesting similarities with its counterpart in fluids, the theoretical framework highlights relevant differences that have important implications for the ability to trigger and sustain the thermoacoustic response. This mechanism could pave the way to the development of highly robust and reliable solid-state thermoacoustic engines and refrigerators.
Synchrony and entrainment properties of robust circadian oscillators
Bagheri, Neda; Taylor, Stephanie R.; Meeker, Kirsten; Petzold, Linda R.; Doyle, Francis J.
2008-01-01
Systems theoretic tools (i.e. mathematical modelling, control, and feedback design) advance the understanding of robust performance in complex biological networks. We highlight phase entrainment as a key performance measure used to investigate dynamics of a single deterministic circadian oscillator for the purpose of generating insight into the behaviour of a population of (synchronized) oscillators. More specifically, the analysis of phase characteristics may facilitate the identification of appropriate coupling mechanisms for the ensemble of noisy (stochastic) circadian clocks. Phase also serves as a critical control objective to correct mismatch between the biological clock and its environment. Thus, we introduce methods of investigating synchrony and entrainment in both stochastic and deterministic frameworks, and as a property of a single oscillator or population of coupled oscillators. PMID:18426774
Gould, Natalie J; Lorencatto, Fabiana; Stanworth, Simon J; Michie, Susan; Prior, Maria E; Glidewell, Liz; Grimshaw, Jeremy M; Francis, Jill J
2014-07-29
Audits of blood transfusion demonstrate around 20% transfusions are outside national recommendations and guidelines. Audit and feedback is a widely used quality improvement intervention but effects on clinical practice are variable, suggesting potential for enhancement. Behavioural theory, theoretical frameworks of behaviour change and behaviour change techniques provide systematic processes to enhance intervention. This study is part of a larger programme of work to promote the uptake of evidence-based transfusion practice. The objectives of this study are to design two theoretically enhanced audit and feedback interventions; one focused on content and one on delivery, and investigate the feasibility and acceptability. Study A (Content): A coding framework based on current evidence regarding audit and feedback, and behaviour change theory and frameworks will be developed and applied as part of a structured content analysis to specify the key components of existing feedback documents. Prototype feedback documents with enhanced content and also a protocol, describing principles for enhancing feedback content, will be developed. Study B (Delivery): Individual semi-structured interviews with healthcare professionals and observations of team meetings in four hospitals will be used to specify, and identify views about, current audit and feedback practice. Interviews will be based on a topic guide developed using the Theoretical Domains Framework and the Consolidated Framework for Implementation Research. Analysis of transcripts based on these frameworks will form the evidence base for developing a protocol describing an enhanced intervention that focuses on feedback delivery. Study C (Feasibility and Acceptability): Enhanced interventions will be piloted in four hospitals. Semi-structured interviews, questionnaires and observations will be used to assess feasibility and acceptability. This intervention development work reflects the UK Medical Research Council's guidance on development of complex interventions, which emphasises the importance of a robust theoretical basis for intervention design and recommends systematic assessment of feasibility and acceptability prior to taking interventions to evaluation in a full-scale randomised study. The work-up includes specification of current practice so that, in the trials to be conducted later in this programme, there will be a clear distinction between the control (usual practice) conditions and the interventions to be evaluated.
Kwok, T; Smith, K A
2000-09-01
The aim of this paper is to study both the theoretical and experimental properties of chaotic neural network (CNN) models for solving combinatorial optimization problems. Previously we have proposed a unifying framework which encompasses the three main model types, namely, Chen and Aihara's chaotic simulated annealing (CSA) with decaying self-coupling, Wang and Smith's CSA with decaying timestep, and the Hopfield network with chaotic noise. Each of these models can be represented as a special case under the framework for certain conditions. This paper combines the framework with experimental results to provide new insights into the effect of the chaotic neurodynamics of each model. By solving the N-queen problem of various sizes with computer simulations, the CNN models are compared in different parameter spaces, with optimization performance measured in terms of feasibility, efficiency, robustness and scalability. Furthermore, characteristic chaotic neurodynamics crucial to effective optimization are identified, together with a guide to choosing the corresponding model parameters.
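The transiently chaotic dynamics referred to above can be illustrated with a single Chen-Aihara-style neuron whose self-coupling decays over time. The sketch below uses commonly quoted illustrative parameter values and a constant input; it is only a one-neuron demonstration under these assumptions, not the coupled networks used to solve the N-queen problem in the paper.

```python
# Single-neuron illustration of chaotic simulated annealing with decaying
# self-coupling (Chen & Aihara style): while the self-feedback term z(t) is
# large the internal state keeps switching irregularly (the transient-chaos
# phase of the original model); as z(t) decays the dynamics settle towards a
# fixed point. All parameter values here are illustrative assumptions only;
# the models compared in the paper couple many such neurons to encode a
# combinatorial problem such as N-queens.
import math

k, alpha, I0, epsilon = 0.9, 0.015, 0.65, 0.004   # damping, gain, bias, steepness
z, beta = 0.08, 0.01                               # initial self-coupling, decay rate
a = 0.5                                            # constant external input

y = 0.283                                          # internal state
for t in range(300):
    x = 1.0 / (1.0 + math.exp(-y / epsilon))       # neuron output
    y = k * y + alpha * a - z * (x - I0)           # state update with self-feedback
    z *= (1.0 - beta)                              # annealing: self-coupling decays
    if t % 50 == 0:
        print(f"t={t:3d}  z={z:.4f}  output x={x:.3f}")
```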
Sekhon, Mandeep; Cartwright, Martin; Francis, Jill J
2017-01-26
It is increasingly acknowledged that 'acceptability' should be considered when designing, evaluating and implementing healthcare interventions. However, the published literature offers little guidance on how to define or assess acceptability. The purpose of this study was to develop a multi-construct theoretical framework of acceptability of healthcare interventions that can be applied to assess prospective (i.e. anticipated) and retrospective (i.e. experienced) acceptability from the perspective of intervention deliverers and recipients. Two methods were used to select the component constructs of acceptability. 1) An overview of reviews was conducted to identify systematic reviews that claim to define, theorise or measure acceptability of healthcare interventions. 2) Principles of inductive and deductive reasoning were applied to theorise the concept of acceptability and develop a theoretical framework. Steps included (1) defining acceptability; (2) describing its properties and scope and (3) identifying component constructs and empirical indicators. From the 43 reviews included in the overview, none explicitly theorised or defined acceptability. Measures used to assess acceptability focused on behaviour (e.g. dropout rates) (23 reviews), affect (i.e. feelings) (5 reviews), cognition (i.e. perceptions) (7 reviews) or a combination of these (8 reviews). From the methods described above we propose a definition: Acceptability is a multi-faceted construct that reflects the extent to which people delivering or receiving a healthcare intervention consider it to be appropriate, based on anticipated or experienced cognitive and emotional responses to the intervention. The theoretical framework of acceptability (TFA) consists of seven component constructs: affective attitude, burden, perceived effectiveness, ethicality, intervention coherence, opportunity costs, and self-efficacy. Despite frequent claims that healthcare interventions have assessed acceptability, it is evident that acceptability research could be more robust. The proposed definition of acceptability and the TFA can inform assessment tools and evaluations of the acceptability of new or existing interventions.
Project risk management in the construction of high-rise buildings
NASA Astrophysics Data System (ADS)
Titarenko, Boris; Hasnaoui, Amir; Titarenko, Roman; Buzuk, Liliya
2018-03-01
This paper presents project risk management methods that allow risks in the construction of high-rise buildings to be identified more effectively and managed throughout the life cycle of the project. One of the project risk management processes is a quantitative analysis of risks. The quantitative analysis usually includes the assessment of the potential impact of project risks and their probabilities. This paper reviews the most popular methods of risk probability assessment and seeks to indicate the advantages of the robust approach over traditional methods. Within the framework of the project risk management model, the robust approach of P. Huber is applied and extended to the tasks of regression analysis of project data. The suggested algorithms used to estimate the parameters in statistical models yield reliable estimates. A review of the theoretical problems in developing robust models built on the methodology of minimax estimates was carried out, and an algorithm for the situation of asymmetric "contamination" was developed.
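The kind of robust regression referred to above can be sketched with Huber M-estimation solved by iteratively reweighted least squares, so that a few contaminated records are down-weighted rather than dominating the fit. The data, tuning constant and scale estimate below are illustrative assumptions, not the paper's project data or its extended algorithm.

```python
# Huber M-estimation of a linear regression by iteratively reweighted least
# squares (IRLS): observations with large residuals are down-weighted, so a few
# "contaminated" project records do not dominate the fit. Data, tuning constant
# and scale estimate are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 10, size=n)
y = 2.0 + 0.8 * x + rng.normal(0, 0.5, size=n)
y[:10] += 15.0                                # a few grossly contaminated records

X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]   # ordinary least squares start
c = 1.345                                     # Huber tuning constant

for _ in range(50):
    r = y - X @ beta
    scale = 1.4826 * np.median(np.abs(r - np.median(r)))   # robust scale (MAD)
    u = np.abs(r) / (scale + 1e-12)
    w = np.where(u <= c, 1.0, c / u)          # Huber weights
    W = np.diag(w)
    beta_new = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    if np.allclose(beta_new, beta, atol=1e-8):
        break
    beta = beta_new

print("OLS fit       :", np.round(np.linalg.lstsq(X, y, rcond=None)[0], 3))
print("Huber IRLS fit:", np.round(beta, 3))
```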
Robust cubature Kalman filter for GNSS/INS with missing observations and colored measurement noise.
Cui, Bingbo; Chen, Xiyuan; Tang, Xihua; Huang, Haoqian; Liu, Xiao
2018-01-01
In order to improve the accuracy of GNSS/INS working in a GNSS-denied environment, a robust cubature Kalman filter (RCKF) is developed by considering colored measurement noise and missing observations. First, an improved cubature Kalman filter (CKF) is derived by considering colored measurement noise, where the time-differencing approach is applied to yield new observations. Then, after analyzing the disadvantages of existing methods, the measurement augmentation used in processing colored noise is translated into processing the uncertainties of the CKF, and a new sigma-point update framework is utilized to account for the bounded model uncertainties. By reusing the diffused sigma points and the approximation residual in the prediction stage of the CKF, the RCKF is developed and its error performance is analyzed theoretically. Results of a numerical experiment and a field test reveal that the RCKF is more robust than the CKF and the extended Kalman filter (EKF); compared with the EKF, the heading error of the land vehicle is reduced by about 72.4%. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
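The time-differencing step mentioned above is the classical way to turn first-order (AR(1)) colored measurement noise into white noise before filtering. The sketch below demonstrates the construction on a toy constant-velocity model: the differenced pseudo-measurement z*_k = z_{k+1} - Psi z_k has measurement matrix H* = H F - Psi H and (nearly) white residuals. The matrices, noise levels and AR coefficient are illustrative assumptions, not the GNSS/INS model of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
F   = np.array([[1.0, 1.0], [0.0, 1.0]])   # toy constant-velocity transition
H   = np.array([[1.0, 0.0]])               # position-only measurement
Psi = np.array([[0.8]])                    # AR(1) coefficient of colored noise

# simulate colored measurement noise v_k = Psi v_{k-1} + white
T = 500
x = np.zeros(2); v = np.zeros(1)
Z = np.zeros((T, 1)); Xs = np.zeros((T, 2))
for k in range(T):
    x = F @ x + rng.normal(scale=0.01, size=2)
    v = Psi @ v + rng.normal(scale=0.1, size=1)
    Xs[k], Z[k] = x, H @ x + v

# time-differenced pseudo-measurement z*_k = z_{k+1} - Psi z_k, whose noise is
# white; the matching pseudo-measurement matrix is H* = H F - Psi H
H_star = H @ F - Psi @ H
Z_star = Z[1:] - (Psi @ Z[:-1].T).T
resid  = Z_star - Xs[:-1] @ H_star.T        # should now be (nearly) white
print(np.corrcoef(resid[1:, 0], resid[:-1, 0])[0, 1])  # small lag-1 correlation
```

The remaining correlation between the new measurement noise and the process noise is what a practical filter such as the RCKF still has to account for.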
Shoemark, Helen; Rimmer, Jo; Bower, Janeen; Tucquet, Belinda; Miller, Lauren; Fisher, Michelle; Ogburn, Nicholas; Dun, Beth
2018-03-09
This article reports on a project at the Royal Children's Hospital Melbourne in which the music therapy team synthesized their practice and related theories to propose a new conceptual framework for music therapy in their acute pediatric setting. The impetus for the project was the realization that, in the process of producing key statements about the non-musical benefits of music therapy, the cost was often the suppression of information about the patient's unique musical potential as the major (mediating) pathway from referral reason, to music therapy, and to effective outcomes. The purpose of the project was to articulate how this team of clinicians conceive of the patient's musical self as the major theoretical pathway for music therapy in an evidence-based acute medical setting. The clinicians' shared reflexive process across six months involved robust directed discussion, annotation of shared reading, and documentation of all engagement in words and diagrams. The outcome was a consensus framework comprising three constructs: the Place of Music in the Life of the Infant, Child and Young Person; Culture and Context; and Musical Manifestations. The constructs were tested in a clinical audit and found to be robustly inclusive. In addition to the conceptual framework, this project serves to demonstrate a process by which clinical teams may reflect together on their individual practice and theory to create a consensus stance for the overall service they provide in the one setting.
Decision-Making Under Risk: Integrating Perspectives From Biology, Economics, and Psychology.
Mishra, Sandeep
2014-08-01
Decision-making under risk has been variably characterized and examined in many different disciplines. However, interdisciplinary integration has not been forthcoming. Classic theories of decision-making have not been amply revised in light of greater empirical data on actual patterns of decision-making behavior. Furthermore, the meta-theoretical framework of evolution by natural selection has been largely ignored in theories of decision-making under risk in the human behavioral sciences. In this review, I critically examine four of the most influential theories of decision-making from economics, psychology, and biology: expected utility theory, prospect theory, risk-sensitivity theory, and heuristic approaches. I focus especially on risk-sensitivity theory, which offers a framework for understanding decision-making under risk that explicitly involves evolutionary considerations. I also review robust empirical evidence for individual differences and environmental/situational factors that predict actual risky decision-making that any general theory must account for. Finally, I offer steps toward integrating various theoretical perspectives and empirical findings on risky decision-making. © 2014 by the Society for Personality and Social Psychology, Inc.
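The competing frameworks surveyed here make concretely different predictions about simple gambles. The toy calculation below (all numbers and parameter values are illustrative assumptions) contrasts the risk-neutral expected value, a prospect-theory-style value with loss aversion, and the kind of need-based comparison emphasized by risk-sensitivity theory.

```python
import numpy as np

def pt_value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function: concave for gains, steeper for
    losses (loss aversion lam > 1). Parameters are illustrative."""
    x = np.asarray(x, dtype=float)
    mag = np.abs(x) ** alpha
    return np.where(x >= 0, mag, -lam * mag)

# mixed gamble: 50% chance of +100, 50% chance of -100 (toy numbers)
outcomes, probs = np.array([100.0, -100.0]), np.array([0.5, 0.5])
print("expected value        :", probs @ outcomes)             # 0 -> risk-neutral indifference
print("prospect-theory value :", probs @ pt_value(outcomes))   # < 0 -> gamble rejected

# risk-sensitivity-style comparison: two options with equal mean but
# different variance, evaluated against a survival requirement of 60 units
need = 60.0
low_var  = np.array([50.0, 50.0])          # sure 50
high_var = np.array([100.0, 0.0])          # 50/50 gamble
print("P(meet need | low variance) :", np.mean(low_var  >= need))
print("P(meet need | high variance):", np.mean(high_var >= need))
```

The last two lines illustrate the risk-sensitivity (energy-budget) logic: when the sure option cannot meet the requirement, the high-variance option is preferred despite the equal means.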
Control of Multilayer Networks
Menichetti, Giulia; Dall’Asta, Luca; Bianconi, Ginestra
2016-01-01
The controllability of a network is a theoretical problem of relevance in a variety of contexts ranging from financial markets to the brain. Until now, network controllability has been characterized only on isolated networks, while the vast majority of complex systems are formed by multilayer networks. Here we build a theoretical framework for the linear controllability of multilayer networks by mapping the problem into a combinatorial matching problem. We find that correlating the external signals in the different layers can significantly reduce the robustness of the multiplex network to node removal, an effect that can be seen in conjunction with a hybrid phase transition occurring in interacting Poisson networks. Moreover, we observe that multilayer networks can stabilize the fully controllable multiplex network configuration, which can remain stable even when full controllability of the single network is not stable. PMID:26869210
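The matching formulation invoked above generalizes the single-layer result that the minimum number of driver nodes equals N minus the size of a maximum matching of the directed network. The sketch below illustrates that single-layer building block with networkx; it is a hedged, simplified illustration of the matching idea, not the multilayer construction of the paper.

```python
import networkx as nx
from networkx.algorithms import bipartite

def n_driver_nodes(digraph):
    """Minimum number of driver nodes for structural controllability of a
    directed network, computed from a maximum matching of its bipartite
    (out-copy / in-copy) representation. Single-layer illustration only."""
    B = nx.Graph()
    out_nodes = [("out", u) for u in digraph.nodes()]
    in_nodes  = [("in", v) for v in digraph.nodes()]
    B.add_nodes_from(out_nodes, bipartite=0)
    B.add_nodes_from(in_nodes, bipartite=1)
    B.add_edges_from((("out", u), ("in", v)) for u, v in digraph.edges())
    matching = bipartite.maximum_matching(B, top_nodes=out_nodes)
    matched_edges = len(matching) // 2          # dict contains both directions
    return max(len(digraph) - matched_edges, 1)

G = nx.gnp_random_graph(50, 0.05, directed=True, seed=3)
print("driver nodes needed:", n_driver_nodes(G))
```

Extending this to a multiplex network amounts to building the bipartite representation across layers and matching there, which is broadly the kind of combinatorial problem the abstract refers to.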
Tarzia, Laura; May, Carl; Hegarty, Kelsey
2016-11-24
Domestic violence shares many features with chronic disease, including ongoing physical and mental health problems and eroded self-efficacy. Given the challenges around help-seeking for women experiencing domestic violence, it is essential that they be given support to 'self-manage' their condition. The growing popularity of web-based applications for chronic disease self-management suggests that there may be opportunities to use them as an intervention strategy for women experiencing domestic violence; as yet, however, little is known about whether this might work in practice. It is critical that interventions for domestic violence, whether web-based or otherwise, promote agency and capacity for action rather than adding to the 'workload' of already stressed and vulnerable women. Although randomised controlled trials are vital to determine the effectiveness of interventions, robust theoretical frameworks can complement them as a way of examining the feasibility of implementing an intervention in practice. To date, no such frameworks have been developed for the domestic violence context. Consequently, in this paper we propose that it may be useful to appraise interventions for domestic violence using frameworks developed to help understand the barriers and facilitators around self-management of chronic conditions. Using a case study of an online healthy relationship tool and safety decision aid developed in Australia (I-DECIDE), this paper adapts and applies two theories, Burden of Treatment Theory and Normalisation Process Theory, to assess whether the intervention might increase women's agency and capacity for action. In doing so, it proposes a new theoretical model with which the practical application of domestic violence interventions could be appraised in conjunction with other evaluation frameworks. This paper argues that theoretical frameworks for chronic disease are appropriate for assessing the feasibility of implementing interventions for domestic violence in practice. The use of the modified Burden of Treatment/Normalisation Process Theory framework developed in this paper strengthens the case for I-DECIDE and other web-based applications as a way of supporting women experiencing domestic violence.
Wilson, Rhonda L; Wilson, G Glenn; Usher, Kim
2015-09-01
The mental health of people in rural communities is influenced by the robustness of the mental health ecosystem within each community. Theoretical approaches such as social ecology and social capital are useful when applied to the practical context of promoting environmental conditions that maximise mental health helping capital, enhancing resilience and reducing vulnerability as a buffer against mental illness. This paper explores the ecological conditions that affect the mental health and illness of people in rural communities. It proposes a new mental health social ecology framework that makes full use of the locally available, unique social capital and that is sufficiently flexible to facilitate mental health helping capital best suited to mental health service delivery for rural people in an Australian context.
NASA Technical Reports Server (NTRS)
Li, Wei; Saleeb, Atef F.
1995-01-01
This two-part report is concerned with the development of a general framework for the implicit time-stepping integrators for the flow and evolution equations in generalized viscoplastic models. The primary goal is to present a complete theoretical formulation, and to address in detail the algorithmic and numerical analysis aspects involved in its finite element implementation, as well as to critically assess the numerical performance of the developed schemes in a comprehensive set of test cases. On the theoretical side, the general framework is developed on the basis of the unconditionally-stable, backward-Euler difference scheme as a starting point. Its mathematical structure is of sufficient generality to allow a unified treatment of different classes of viscoplastic models with internal variables. In particular, two specific models of this type, which are representative of the present state of the art in metal viscoplasticity, are considered in applications reported here; i.e., fully associative (GVIPS) and non-associative (NAV) models. The matrix forms developed for both these models are directly applicable for both initially isotropic and anisotropic materials, in general (three-dimensional) situations as well as subspace applications (i.e., plane stress/strain, axisymmetric, generalized plane stress in shells). On the computational side, issues related to efficiency and robustness are emphasized in developing the (local) iterative algorithm. In particular, closed-form expressions for residual vectors and (consistent) material tangent stiffness arrays are given explicitly for both GVIPS and NAV models, with their maximum sizes 'optimized' to depend only on the number of independent stress components (but independent of the number of viscoplastic internal state parameters). Significant robustness of the local iterative solution is provided by complementing the basic Newton-Raphson scheme with a line-search strategy for convergence. In the present second part of the report, we focus on the specific details of the numerical schemes, and associated computer algorithms, for the finite-element implementation of GVIPS and NAV models.
Testing the Grossman model of medical spending determinants with macroeconomic panel data.
Hartwig, Jochen; Sturm, Jan-Egbert
2018-02-16
Michael Grossman's human capital model of the demand for health has been argued to be one of the major achievements in theoretical health economics. Attempts to test this model empirically have been sparse, however, and have produced mixed results. These attempts have so far relied on (mostly cross-sectional) micro data from household surveys. For the first time in the literature, we bring in macroeconomic panel data for 29 OECD countries over the period 1970-2010 to test the model. To check the robustness of the results for the determinants of medical spending identified by the model, we include additional covariates in an extreme bounds analysis (EBA) framework. The preferred model specifications (including the robust covariates) do not lend much empirical support to the Grossman model. This is in line with the mixed results of earlier studies.
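Extreme bounds analysis, as used above, asks whether a focus coefficient keeps its sign and significance when the regression is re-run over many combinations of "doubtful" covariates. The sketch below is a crude, hedged version of that loop using statsmodels; the variable names, the data and the two-standard-error bound are assumptions for illustration, not the paper's specification.

```python
import itertools
import numpy as np
import statsmodels.api as sm

def extreme_bounds(y, focus, doubtful, max_extra=2):
    """Crude extreme bounds analysis: regress y on the focus variable plus
    every combination of up to `max_extra` doubtful covariates and track the
    extreme bounds (beta -/+ 2 s.e.) of the focus coefficient."""
    lower, upper = np.inf, -np.inf
    idx = range(doubtful.shape[1])
    for k in range(max_extra + 1):
        for combo in itertools.combinations(idx, k):
            X = np.column_stack([focus] + [doubtful[:, j] for j in combo])
            res = sm.OLS(y, sm.add_constant(X)).fit()
            b, se = res.params[1], res.bse[1]      # focus variable is column 1
            lower, upper = min(lower, b - 2 * se), max(upper, b + 2 * se)
    return lower, upper   # 'robust' if the interval excludes zero

# toy data standing in for income, ageing, technology, etc. (assumed names)
rng = np.random.default_rng(4)
n = 200
income   = rng.normal(size=n)
doubtful = rng.normal(size=(n, 4))                 # other candidate determinants
spending = 0.4 * income + 0.2 * doubtful[:, 0] + rng.normal(size=n)
print(extreme_bounds(spending, income, doubtful))
```

A determinant would be called robust in the EBA sense if the extreme bounds returned here lie entirely on one side of zero across all covariate combinations.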
Developing a theoretical framework for complex community-based interventions.
Angeles, Ricardo N; Dolovich, Lisa; Kaczorowski, Janusz; Thabane, Lehana
2014-01-01
Applying existing theories to research, in the form of a theoretical framework, is necessary to advance knowledge from what is already known toward the next steps to be taken. This article proposes a guide on how to develop a theoretical framework for complex community-based interventions using the Cardiovascular Health Awareness Program as an example. Developing a theoretical framework starts with identifying the intervention's essential elements. Subsequent steps include the following: (a) identifying and defining the different variables (independent, dependent, mediating/intervening, moderating, and control); (b) postulating mechanisms by which the independent variables lead to the dependent variables; (c) identifying existing theoretical models supporting the theoretical framework under development; (d) scripting the theoretical framework into a figure or sets of statements as a series of hypotheses, if/then logic statements, or a visual model; (e) content and face validation of the theoretical framework; and (f) revising the theoretical framework. In our example, we combined the "diffusion of innovation theory" and the "health belief model" to develop our framework. Using the Cardiovascular Health Awareness Program as the model, we demonstrated a stepwise process of developing a theoretical framework. The challenges encountered are described, and an overview of the strategies employed to overcome these challenges is presented.
Robustness and structure of complex networks
NASA Astrophysics Data System (ADS)
Shao, Shuai
This dissertation covers the two major parts of my PhD research on statistical physics and complex networks: i) modeling a new type of attack -- localized attack, and investigating the robustness of complex networks under this type of attack; ii) discovering the clustering structure in complex networks and its influence on the robustness of coupled networks. Complex networks appear in every aspect of our daily life and are widely studied in Physics, Mathematics, Biology, and Computer Science. One important property of complex networks is their robustness under attacks, which depends crucially on the nature of the attacks and the structure of the networks themselves. Previous studies have focused on two types of attack: random attack and targeted attack, which, however, are insufficient to describe many real-world damages. Here we propose a new type of attack -- localized attack, and study the robustness of complex networks under this type of attack, both analytically and via simulation. On the other hand, we also study the clustering structure in the network, and its influence on the robustness of a complex network system. In the first part, we propose a theoretical framework to study the robustness of complex networks under localized attack based on percolation theory and the generating function method. We investigate the percolation properties, including the critical threshold of the phase transition p_c and the size of the giant component P_infinity. We compare localized attack with random attack and find that while random regular (RR) networks are more robust against localized attack, Erdős-Rényi (ER) networks are equally robust under both types of attack. As for scale-free (SF) networks, their robustness depends crucially on the degree exponent λ. The simulation results show perfect agreement with theoretical predictions. We also test our model on two real-world networks: a peer-to-peer computer network and an airline network, and find that the real-world networks are much more vulnerable to localized attack compared with random attack. In the second part, we extend the tree-like generating function method to incorporate clustering structure in complex networks. We study the robustness of a complex network system, especially a network of networks (NON), with clustering structure in each network. We find that the system becomes less robust as we increase the clustering coefficient of each network. For a partially dependent network system, we also find that the influence of the clustering coefficient on network robustness decreases as we decrease the coupling strength, and the critical coupling strength q_c, at which the first-order phase transition changes to second-order, increases as we increase the clustering coefficient.
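For readers who want to reproduce the qualitative behaviour described here, the following hedged simulation sketch compares a localized attack (removing a random seed node and then successive shells of its neighbours) with a random attack on an Erdős-Rényi graph, tracking the relative size of the giant component. It illustrates the attack model only; the analytical generating-function treatment of the dissertation is not reproduced.

```python
import random
import networkx as nx

def giant_fraction(H, n_original):
    """Size of the largest connected component relative to the original N."""
    if H.number_of_nodes() == 0:
        return 0.0
    return max(len(c) for c in nx.connected_components(H)) / n_original

def attacked_giant(G, p, localized=True, seed=5):
    """Keep a fraction p of nodes: remove 1-p either shell-by-shell around a
    random seed (localized attack) or uniformly at random (random attack)."""
    rng = random.Random(seed)
    n = G.number_of_nodes()
    n_remove = int(round((1 - p) * n))
    if localized:
        start = rng.choice(list(G.nodes()))
        order = [start] + [v for _, v in nx.bfs_edges(G, start)]
        seen = set(order)
        order += [v for v in G.nodes() if v not in seen]   # disconnected leftovers
        to_remove = order[:n_remove]
    else:
        to_remove = rng.sample(list(G.nodes()), n_remove)
    H = G.copy()
    H.remove_nodes_from(to_remove)
    return giant_fraction(H, n)

G = nx.erdos_renyi_graph(2000, 4 / 2000, seed=5)     # ER graph, mean degree ~4
for p in (0.9, 0.7, 0.5, 0.3):
    print(p, attacked_giant(G, p, localized=True), attacked_giant(G, p, localized=False))
```

On ER graphs the two attack types should give roughly similar curves, whereas on random regular or real-world topologies the localized attack typically departs from the random one, in line with the results summarized above.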
Warm glow, free-riding and vehicle neutrality in a health-related contingent valuation study.
Hackl, Franz; Pruckner, Gerald J
2005-03-01
Criticism of contingent valuation (CV) stresses warm glow and free-riding as possible causes of biased willingness-to-pay (WTP) figures. We present an empirical framework to study the existence of warm glow and free-riding in hypothetical WTP answers, based on a CV survey for the measurement of health-related Red Cross services. In both conventional double-bounded and spike models we find no indication of warm-glow phenomena or free-riding behaviour. The results are very robust and insensitive to the applied payment vehicles. Theoretical objections against CV thus do not find sufficient empirical support.
Loops in hierarchical channel networks
NASA Astrophysics Data System (ADS)
Katifori, Eleni; Magnasco, Marcelo
2012-02-01
Nature provides us with many examples of planar distribution and structural networks having dense sets of closed loops. An archetype of this form of network organization is the vasculature of dicotyledonous leaves, which showcases a hierarchically-nested architecture. Although a number of methods have been proposed to measure aspects of the structure of such networks, a robust metric to quantify their hierarchical organization is still lacking. We present an algorithmic framework that allows mapping loopy networks to binary trees, preserving in the connectivity of the trees the architecture of the original graph. We apply this framework to investigate computer generated and natural graphs extracted from digitized images of dicotyledonous leaves and animal vasculature. We calculate various metrics on the corresponding trees and discuss the relationship of these quantities to the architectural organization of the original graphs. This algorithmic framework decouples the geometric information from the metric topology (connectivity and edge weight) and it ultimately allows us to perform a quantitative statistical comparison between predictions of theoretical models and naturally occurring loopy graphs.
Chen, Bor-Sen; Lin, Ying-Po
2013-01-01
Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties observed in biological systems at different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be enough to confer intrinsic robustness in order to tolerate intrinsic parameter fluctuations, genetic robustness for buffering genetic variations, and environmental robustness for resisting environmental disturbances. With this, the phenotypic stability of biological network can be maintained, thus guaranteeing phenotype robustness. This paper presents a survey on biological systems and then develops a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation in systems and evolutionary biology. Further, from the unifying mathematical framework, it was discovered that the phenotype robustness criterion for biological networks at different levels relies upon intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness. When this is true, the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in systems and evolutionary biology can also be investigated through their corresponding phenotype robustness criterion from the systematic point of view. PMID:23515240
The need for international nursing diagnosis research and a theoretical framework.
Lunney, Margaret
2008-01-01
To describe the need for nursing diagnosis research and a theoretical framework for such research. A linguistics theory served as the foundation for the theoretical framework. Reasons for additional nursing diagnosis research are: (a) file names are needed for implementation of electronic health records, (b) international consensus is needed for an international classification, and (c) continuous changes occur in clinical practice. A theoretical framework used by the author is explained. Theoretical frameworks provide support for nursing diagnosis research. Linguistics theory served as an appropriate exemplar theory to support nursing research. Additional nursing diagnosis studies based upon a theoretical framework are needed and linguistics theory can provide an appropriate structure for this research.
Data Curation and Visualization for MuSIASEM Analysis of the Nexus
NASA Astrophysics Data System (ADS)
Renner, Ansel
2017-04-01
A novel software-based approach to relational analysis applying recent theoretical advancements of the Multi-Scale Integrated Analysis of Societal and Ecosystem Metabolism (MuSIASEM) accounting framework is presented. This research explores and explains underutilized ways in which software can assist complex system analysis across the stages of data collection, exploration, analysis and dissemination, in a transparent and collaborative manner. This work is being conducted as part of, and in support of, the four-year European Commission H2020 project Moving Towards Adaptive Governance in Complexity: Informing Nexus Security (MAGIC). In MAGIC, theoretical advancements to MuSIASEM propose a powerful new approach to spatial-temporal WEFC relational analysis in accordance with a structural-functional scaling mechanism appropriate for biophysically relevant complex system analyses. The software is designed primarily with JavaScript, using the Angular2 model-view-controller framework and the Data-Driven Documents (D3) library. These design choices clarify and modularize data flow, simplify research practitioners' work, allow for and assist stakeholder involvement, and advance collaboration at all stages. Data requirements and scalable, robust yet lightweight structuring are explained first; algorithms to process these data are then explored; data interfaces and data visualization approaches are lastly presented and described.
High-Energy Cosmic Rays from Supernovae
NASA Astrophysics Data System (ADS)
Morlino, Giovanni
Cosmic rays are charged relativistic particles that reach the Earth with extremely high energies, providing striking evidence of the existence of effective accelerators in the Universe. Below an energy of around ~10^17 eV, cosmic rays are believed to be produced in the Milky Way, while above that energy their origin is probably extragalactic. In the early 1930s, supernovae were already identified as possible sources of the galactic component of cosmic rays. After the 1970s this idea gained more and more credibility, thanks to the development of diffusive shock acceleration theory, which provides a robust theoretical framework for particle energization in astrophysical environments. Afterward, mostly in recent years, much observational evidence has been gathered in support of this framework, converting a speculative idea into a real paradigm. In this chapter the basic pillars of this paradigm are illustrated. These include the acceleration mechanism, the nonlinear effects produced by accelerated particles on the shock dynamics needed to reach the highest energies, the escape process from the sources, and the transport of cosmic rays through the Galaxy. The theoretical picture is corroborated by discussing several observations which support the idea that supernova remnants are effective cosmic-ray factories.
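The core test-particle prediction of diffusive shock acceleration mentioned above can be stated in two lines of algebra: the shock compression ratio r fixes the accelerated-particle spectrum f(p) ∝ p^(-q) with q = 3r/(r-1), so a strong shock (r → 4) gives q → 4, i.e. N(E) ∝ E^(-2). The short sketch below simply evaluates these standard textbook relations for a few Mach numbers; it is a generic illustration, not a result specific to this chapter.

```python
def compression_ratio(mach, gamma=5.0 / 3.0):
    """Rankine-Hugoniot compression ratio of an adiabatic shock of given Mach number."""
    return (gamma + 1) * mach**2 / ((gamma - 1) * mach**2 + 2)

def dsa_momentum_index(mach, gamma=5.0 / 3.0):
    """Test-particle diffusive shock acceleration: f(p) ~ p**(-q) with
    q = 3r/(r-1). For a strong shock r -> 4 and q -> 4, i.e. N(E) ~ E**-2."""
    r = compression_ratio(mach, gamma)
    return 3 * r / (r - 1)

for mach in (2, 5, 10, 100):
    print(mach, compression_ratio(mach), dsa_momentum_index(mach))
```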
Design and performance frameworks for constructing problem-solving simulations.
Stevens, Ron; Palacio-Cayetano, Joycelin
2003-01-01
Rapid advancements in hardware, software, and connectivity are helping to shorten the times needed to develop computer simulations for science education. These advancements, however, have not been accompanied by corresponding theories of how best to design and use these technologies for teaching, learning, and testing. Such design frameworks ideally would be guided less by the strengths/limitations of the presentation media and more by cognitive analyses detailing the goals of the tasks, the needs and abilities of students, and the resulting decision outcomes needed by different audiences. This article describes a problem-solving environment and associated theoretical framework for investigating how students select and use strategies as they solve complex science problems. A framework is first described for designing on-line problem spaces that highlights issues of content, scale, cognitive complexity, and constraints. While this framework was originally designed for medical education, it has proven robust and has been successfully applied to learning environments from elementary school through medical school. Next, a similar framework is detailed for collecting student performance and progress data that can provide evidence of students' strategic thinking and that could potentially be used to accelerate student progress. Finally, experimental validation data are presented that link strategy selection and use with other metrics of scientific reasoning and student achievement.
A Game Theoretic Framework for Analyzing Re-Identification Risk
Wan, Zhiyu; Vorobeychik, Yevgeniy; Xia, Weiyi; Clayton, Ellen Wright; Kantarcioglu, Murat; Ganta, Ranjit; Heatherly, Raymond; Malin, Bradley A.
2015-01-01
Given the potential wealth of insights that large databases of personal data can provide, many organizations aim to share data while protecting privacy by sharing de-identified data, but are concerned because various demonstrations show that such data can be re-identified. Yet these investigations focus on how attacks can be perpetrated, not the likelihood that they will be realized. This paper introduces a game theoretic framework that enables a publisher to balance re-identification risk with the value of sharing data, leveraging a natural assumption that a recipient only attempts re-identification if its potential gains outweigh the costs. We apply the framework to a real case study, where the value of the data to the publisher is the actual grant funding dollar amounts from a national sponsor and the re-identification gain of the recipient is the fine paid to a regulator for violation of federal privacy rules. There are three notable findings: 1) it is possible to achieve zero risk, in that the recipient never gains from re-identification, while sharing almost as much data as the optimal solution that allows for a small amount of risk; 2) the zero-risk solution enables sharing much more data than a commonly invoked de-identification policy of the U.S. Health Insurance Portability and Accountability Act (HIPAA); and 3) a sensitivity analysis demonstrates these findings are robust to order-of-magnitude changes in player losses and gains. In combination, these findings provide support that such a framework can enable pragmatic policy decisions about de-identified data sharing. PMID:25807380
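The economic logic of the framework (attack only when the expected gain exceeds the cost; publish the most valuable policy given that best response) can be illustrated with a toy grid search. Every number below, including the policy levels, success probabilities, gains, costs and penalty, is made up for illustration and does not correspond to the paper's case study.

```python
import numpy as np

# toy policy levels: more suppression -> lower data value and lower
# re-identification success probability (all numbers are illustrative)
suppression     = np.array([0.0, 0.2, 0.4, 0.6, 0.8])
data_value      = np.array([100., 90., 75., 55., 30.])   # value to publisher
p_reid          = np.array([0.50, 0.30, 0.15, 0.05, 0.01])
gain_per_record = 40.0      # recipient's gain from a successful re-identification
attack_cost     = 8.0       # recipient's cost of attempting the attack
penalty         = 120.0     # publisher's loss (e.g. fine) if re-identified

attacks     = p_reid * gain_per_record > attack_cost       # recipient best response
publisher_u = data_value - np.where(attacks, p_reid * penalty, 0.0)
best        = int(np.argmax(publisher_u))
print("chosen suppression level:", suppression[best],
      "attack attempted:", bool(attacks[best]))
```

In this toy instance a moderate amount of suppression is enough to deter the attack while retaining most of the data's value, mirroring the kind of zero-risk yet high-value solution reported in the abstract.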
Panaceas, uncertainty, and the robust control framework in sustainability science
Anderies, John M.; Rodriguez, Armando A.; Janssen, Marco A.; Cifdaloz, Oguzhan
2007-01-01
A critical challenge faced by sustainability science is to develop strategies to cope with highly uncertain social and ecological dynamics. This article explores the use of the robust control framework toward this end. After briefly outlining the robust control framework, we apply it to the traditional Gordon–Schaefer fishery model to explore fundamental performance–robustness and robustness–vulnerability trade-offs in natural resource management. We find that the classic optimal control policy can be very sensitive to parametric uncertainty. By exploring a large class of alternative strategies, we show that there are no panaceas: even mild robustness properties are difficult to achieve, and increasing robustness to some parameters (e.g., biological parameters) results in decreased robustness with respect to others (e.g., economic parameters). On the basis of this example, we extract some broader themes for better management of resources under uncertainty and for sustainability science in general. Specifically, we focus attention on the importance of a continual learning process and the use of robust control to inform this process. PMID:17881574
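The performance-robustness trade-off discussed here can be seen even in the steady-state Gordon-Schaefer model without the full robust-control machinery. The sketch below compares the profit-maximizing constant effort tuned to a nominal growth rate with a deliberately backed-off effort when the true growth rate turns out to be lower; parameter values are illustrative assumptions, not those of the article.

```python
def steady_profit(E, r, K=1.0, q=1.0, price=1.0, cost=0.3):
    """Long-run profit of the Gordon-Schaefer fishery at constant effort E:
    x* = K(1 - qE/r), yield = q E x*, profit = price*yield - cost*E."""
    x_star = max(K * (1 - q * E / r), 0.0)
    return price * q * E * x_star - cost * E

def optimal_effort(r, K=1.0, q=1.0, price=1.0, cost=0.3):
    """Effort maximizing steady-state profit under the assumed parameters."""
    return (r / (2 * q)) * (1 - cost / (price * q * K))

r_nominal = 1.0
E_opt = optimal_effort(r_nominal)          # 'optimal control' tuned to r = 1
E_conservative = 0.7 * E_opt               # a deliberately backed-off policy

for r_true in (1.0, 0.7, 0.5):             # growth rate possibly lower than assumed
    print(r_true,
          round(steady_profit(E_opt, r_true), 4),
          round(steady_profit(E_conservative, r_true), 4))
```

When the true r matches the nominal value the "optimal" effort wins slightly, but when r has been overestimated its profit collapses while the conservative policy degrades gracefully, which is the qualitative sensitivity that the robust-control analysis formalizes.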
Shi, Zhenyu; Wedd, Anthony G.; Gras, Sally L.
2013-01-01
The development of synthetic biology requires rapid batch construction of large gene networks from combinations of smaller units. Despite the availability of computational predictions for well-characterized enzymes, the optimization of most synthetic biology projects requires combinational constructions and tests. A new building-brick-style parallel DNA assembly framework for simple and flexible batch construction is presented here. It is based on robust recombination steps and allows a variety of DNA assembly techniques to be organized for complex constructions (with or without scars). The assembly of five DNA fragments into a host genome was performed as an experimental demonstration. PMID:23468883
Toward a theory of organisms: Three founding principles in search of a useful integration
SOTO, ANA M.; LONGO, GIUSEPPE; MIQUEL, PAUL-ANTOINE; MONTEVIL, MAËL; MOSSIO, MATTEO; PERRET, NICOLE; POCHEVILLE, ARNAUD; SONNENSCHEIN, CARLOS
2016-01-01
Organisms, be they uni- or multi-cellular, are agents capable of creating their own norms; they are continuously harmonizing their ability to create novelty and stability, that is, they combine plasticity with robustness. Here we articulate the three principles for a theory of organisms proposed in this issue, namely: the default state of proliferation with variation and motility, the principle of variation and the principle of organization. These principles profoundly change both biological observables and their determination with respect to the theoretical framework of physical theories. This radical change opens up the possibility of anchoring mathematical modeling in biologically proper principles. PMID:27498204
1986-06-01
A Theoretical Framework for Examining Geographical Variability in the Microphysical Mechanisms of Precipitation Development (final report, SWS Contract Report 391). Only a fragment of the abstract is recoverable from the scanned report form: other key parameters include the degree of entrainment and the stability of the environment.
Simeonov, Plamen L; Ehresmann, Andrée C
2017-12-01
Forty-two years ago, Capra published "The Tao of Physics" (Capra, 1975). In this book (page 17) he writes: "The exploration of the atomic and subatomic world in the twentieth century has …. necessitated a radical revision of many of our basic concepts" and that, unlike 'classical' physics, the sub-atomic and quantum "modern physics" shows resonances with Eastern thoughts and "leads us to a view of the world which is very similar to the views held by mystics of all ages and traditions." This article stresses an analogous situation in biology with respect to a new theoretical approach for studying living systems, Integral Biomathics (IB), which also exhibits some resonances with Eastern thought. Building on earlier research in cybernetics and theoretical biology, IB has been developed since 2011 by over 100 scientists from a number of disciplines who have been exploring a substantial set of theoretical frameworks. From that effort, the need for a robust core model utilizing advanced mathematics and computation adequate for understanding the behavior of organisms as dynamic wholes was identified. To this end, the authors of this article have proposed WLIMES (Ehresmann and Simeonov, 2012), a formal theory for modeling living systems integrating both the Memory Evolutive Systems (Ehresmann and Vanbremeersch, 2007) and the Wandering Logic Intelligence (Simeonov, 2002b). Its principles are recalled here with respect to their resonances with Eastern thought. Copyright © 2017 Elsevier Ltd. All rights reserved.
Chen, Bor-Sen; Lin, Ying-Po
2013-01-01
Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties that are observed in biological systems at many different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be large enough to confer: intrinsic robustness for tolerating intrinsic parameter fluctuations; genetic robustness for buffering genetic variations; and environmental robustness for resisting environmental disturbances. Network robustness is needed so phenotype stability of biological network can be maintained, guaranteeing phenotype robustness. Synthetic biology is foreseen to have important applications in biotechnology and medicine; it is expected to contribute significantly to a better understanding of functioning of complex biological systems. This paper presents a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation for synthetic gene networks in synthetic biology. Further, from the unifying mathematical framework, we found that the phenotype robustness criterion for synthetic gene networks is the following: if intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness, then the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in synthetic biology can also be investigated through corresponding phenotype robustness criteria from the systematic point of view. Finally, a robust synthetic design that involves network evolution algorithms with desired behavior under intrinsic parameter fluctuations, genetic variations, and environmental disturbances, is also proposed, together with a simulation example. PMID:23515190
Chen, Bor-Sen; Lin, Ying-Po
2013-01-01
In ecological networks, network robustness should be large enough to confer intrinsic robustness for tolerating intrinsic parameter fluctuations, as well as environmental robustness for resisting environmental disturbances, so that the phenotype stability of ecological networks can be maintained, thus guaranteeing phenotype robustness. However, it is difficult to analyze the network robustness of ecological systems because they are complex nonlinear partial differential stochastic systems. This paper develops a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance sensitivity in ecological networks. We found that the phenotype robustness criterion for ecological networks is that if intrinsic robustness + environmental robustness ≦ network robustness, then the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations and environmental disturbances. These results in robust ecological networks are similar to that in robust gene regulatory networks and evolutionary networks even they have different spatial-time scales. PMID:23515112
The predictive mind and the experience of visual art work
Kesner, Ladislav
2014-01-01
Among the main challenges of the predictive brain/mind concept is how to link prediction at the neural level to prediction at the cognitive-psychological level and finding conceptually robust and empirically verifiable ways to harness this theoretical framework toward explaining higher-order mental and cognitive phenomena, including the subjective experience of aesthetic and symbolic forms. Building on the tentative prediction error account of visual art, this article extends the application of the predictive coding framework to the visual arts. It does so by linking this theoretical discussion to a subjective, phenomenological account of how a work of art is experienced. In order to engage more deeply with a work of art, viewers must be able to tune or adapt their prediction mechanism to recognize art as a specific class of objects whose ontological nature defies predictability, and they must be able to sustain a productive flow of predictions from low-level sensory, recognitional to abstract semantic, conceptual, and affective inferences. The affective component of the process of predictive error optimization that occurs when a viewer enters into dialog with a painting is constituted both by activating the affective affordances within the image and by the affective consequences of prediction error minimization itself. The predictive coding framework also has implications for the problem of the culturality of vision. A person’s mindset, which determines what top–down expectations and predictions are generated, is co-constituted by culture-relative skills and knowledge, which form hyperpriors that operate in the perception of art. PMID:25566111
NASA Astrophysics Data System (ADS)
McPhail, C.; Maier, H. R.; Kwakkel, J. H.; Giuliani, M.; Castelletti, A.; Westra, S.
2018-02-01
Robustness is being used increasingly for decision analysis in relation to deep uncertainty and many metrics have been proposed for its quantification. Recent studies have shown that the application of different robustness metrics can result in different rankings of decision alternatives, but there has been little discussion of what potential causes for this might be. To shed some light on this issue, we present a unifying framework for the calculation of robustness metrics, which assists with understanding how robustness metrics work, when they should be used, and why they sometimes disagree. The framework categorizes the suitability of metrics to a decision-maker based on (1) the decision-context (i.e., the suitability of using absolute performance or regret), (2) the decision-maker's preferred level of risk aversion, and (3) the decision-maker's preference toward maximizing performance, minimizing variance, or some higher-order moment. This article also introduces a conceptual framework describing when relative robustness values of decision alternatives obtained using different metrics are likely to agree and disagree. This is used as a measure of how "stable" the ranking of decision alternatives is when determined using different robustness metrics. The framework is tested on three case studies, including water supply augmentation in Adelaide, Australia, the operation of a multipurpose regulated lake in Italy, and flood protection for a hypothetical river based on a reach of the river Rhine in the Netherlands. The proposed conceptual framework is confirmed by the case study results, providing insight into the reasons for disagreements between rankings obtained using different robustness metrics.
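The disagreement between robustness metrics described above is easy to reproduce on a small decision-versus-scenario performance table. The sketch below computes three common metrics (maximin, minimax regret, and a mean-minus-standard-deviation score) and shows that they need not rank alternatives identically; the performance values are invented for illustration, and any of the many other metrics in the literature could be added in the same way.

```python
import numpy as np

# rows: decision alternatives, columns: plausible future scenarios
# (toy performance values, higher is better; all numbers illustrative)
performance = np.array([
    [8.0, 7.5, 3.0, 6.0],   # alternative A
    [6.5, 6.0, 5.5, 6.0],   # alternative B
    [9.0, 4.0, 2.0, 7.0],   # alternative C
])

maximin        = performance.min(axis=1)                  # worst-case performance
regret         = performance.max(axis=0) - performance    # regret per scenario
minimax_regret = regret.max(axis=1)                       # worst-case regret
mean_minus_std = performance.mean(axis=1) - performance.std(axis=1)

print("maximin ranking        :", np.argsort(-maximin))
print("minimax-regret ranking :", np.argsort(minimax_regret))
print("mean-variance ranking  :", np.argsort(-mean_minus_std))
```

Which ranking is appropriate depends on exactly the three dimensions of the framework above: whether regret or absolute performance matters, how risk averse the decision-maker is, and which moments of the performance distribution they care about.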
Analysis of Implicit Uncertain Systems. Part 1: Theoretical Framework
Paganini, Fernando; Doyle, John
1994-12-07
In Part I of this paper a theoretical framework is proposed for the analysis of implicit uncertain systems, built around a model and a number of constraints relevant to the analysis problem under consideration. Only fragments of the abstract are recoverable from the scanned report form.
Schalock, Robert L; Luckasson, Ruth; Tassé, Marc J; Verdugo, Miguel Angel
2018-04-01
This article describes a holistic theoretical framework that can be used to explain intellectual disability (ID) and organize relevant information into a usable roadmap to guide understanding and application. Developing the framework involved analyzing the four current perspectives on ID and synthesizing this information into a holistic theoretical framework. Practices consistent with the framework are described, and examples are provided of how multiple stakeholders can apply the framework. The article concludes with a discussion of the advantages and implications of a holistic theoretical approach to ID.
Robust model predictive control for multi-step short range spacecraft rendezvous
NASA Astrophysics Data System (ADS)
Zhu, Shuyi; Sun, Ran; Wang, Jiaolong; Wang, Jihe; Shao, Xiaowei
2018-07-01
This work presents a robust model predictive control (MPC) approach for the multi-step short range spacecraft rendezvous problem. During the specific short range phase concerned, the chaser is supposed to be initially outside the line-of-sight (LOS) cone. Therefore, the rendezvous process naturally includes two steps: the first step is to transfer the chaser into the LOS cone and the second step is to transfer the chaser into the aimed region with its motion confined within the LOS cone. A novel MPC framework named Mixed MPC (M-MPC) is proposed, which combines the Variable-Horizon MPC (VH-MPC) framework and the Fixed-Instant MPC (FI-MPC) framework. The M-MPC framework enables the optimization for the two steps to be implemented jointly rather than being separated artificially, and its computational workload is acceptable for the usually low-power processors onboard spacecraft. Then, considering that disturbances including modeling error, sensor noise and thrust uncertainty may induce undesired constraint violations, a robust technique is developed and attached to the M-MPC framework to form a robust M-MPC approach. The robust technique is based on the chance-constrained idea, which ensures that constraints are satisfied with a prescribed probability. It improves on the robust technique proposed by Gavilan et al. because it eliminates unnecessary conservativeness by explicitly incorporating known statistical properties of the navigation uncertainty. The efficacy of the robust M-MPC approach is shown in a simulation study.
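The chance-constrained idea referred to above has a standard deterministic reformulation for Gaussian additive uncertainty: a half-space constraint that must hold with probability 1 - eps is tightened by a margin proportional to the standard deviation of the constraint value. The sketch below computes that tightened bound; the constraint vector, covariance and probability level are illustrative assumptions and do not represent the paper's LOS-cone model.

```python
import numpy as np
from scipy.stats import norm

def tighten(a, b, Sigma, eps=0.05):
    """Deterministic tightening of the chance constraint
    P(a^T (x + w) <= b) >= 1 - eps  with  w ~ N(0, Sigma):
    enforce a^T x <= b - z_{1-eps} * sqrt(a^T Sigma a)."""
    margin = norm.ppf(1 - eps) * np.sqrt(a @ Sigma @ a)
    return b - margin

# toy line-of-sight-style half-space constraint on relative position (assumed numbers)
a     = np.array([1.0, -2.0, 0.0])
b     = 5.0
Sigma = np.diag([0.05, 0.05, 0.02])     # navigation-error covariance (illustrative)
print("tightened bound:", tighten(a, b, Sigma, eps=0.05))
```

Because the tightening uses the actual covariance of the navigation uncertainty rather than a worst-case bound, it avoids the extra conservativeness mentioned in the abstract.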
Upscaling species richness and abundances in tropical forests
Tovo, Anna; Suweis, Samir; Formentin, Marco; Favretti, Marco; Volkov, Igor; Banavar, Jayanth R.; Azaele, Sandro; Maritan, Amos
2017-01-01
The quantification of tropical tree biodiversity worldwide remains an open and challenging problem. More than two-fifths of the number of worldwide trees can be found either in tropical or in subtropical forests, but only ≈0.000067% of species identities are known. We introduce an analytical framework that provides robust and accurate estimates of species richness and abundances in biodiversity-rich ecosystems, as confirmed by tests performed on both in silico–generated and real forests. Our analysis shows that the approach outperforms other methods. In particular, we find that upscaling methods based on the log-series species distribution systematically overestimate the number of species and abundances of the rare species. We finally apply our new framework on 15 empirical tropical forest plots and quantify the minimum percentage cover that should be sampled to achieve a given average confidence interval in the upscaled estimate of biodiversity. Our theoretical framework confirms that the forests studied are comprised of a large number of rare or hyper-rare species. This is a signature of critical-like behavior of species-rich ecosystems and can provide a buffer against extinction. PMID:29057324
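As a point of comparison for the upscaling problem described here, the sketch below implements the classical Fisher log-series baseline: fit α from the sampled richness and abundance via S = α ln(1 + N/α), then extrapolate to the total abundance of the forest. The sample and total sizes are made-up numbers; note that, according to the abstract, this log-series upscaling is exactly the kind of method the authors find to overestimate richness.

```python
import numpy as np
from scipy.optimize import brentq

def fisher_alpha(S_obs, N_obs):
    """Solve Fisher's log-series relation S = alpha * ln(1 + N/alpha)
    for alpha, given observed species richness and abundance."""
    f = lambda a: a * np.log1p(N_obs / a) - S_obs
    return brentq(f, 1e-6, 1e6)

def logseries_upscale(S_obs, N_obs, N_total):
    """Predicted species richness at a larger total abundance, assuming the
    same log-series holds at all scales."""
    a = fisher_alpha(S_obs, N_obs)
    return a * np.log1p(N_total / a)

# toy sample: 300 species among 10,000 stems in the surveyed plots,
# upscaled to an (assumed) 10 million stems in the whole forest
print(logseries_upscale(300, 1e4, 1e7))
```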
Use of theoretical and conceptual frameworks in qualitative research.
Green, Helen Elise
2014-07-01
To debate the definition and use of theoretical and conceptual frameworks in qualitative research. There is a paucity of literature to help the novice researcher to understand what theoretical and conceptual frameworks are and how they should be used. This paper acknowledges the interchangeable usage of these terms and researchers' confusion about the differences between the two. It discusses how researchers have used theoretical and conceptual frameworks and the notion of conceptual models. Detail is given about how one researcher incorporated a conceptual framework throughout a research project, the purpose for doing so and how this led to a resultant conceptual model. Concepts from Abbott (1988) and Witz (1992) were used to provide a framework for research involving two case study sites. The framework was used to determine research questions and give direction to interviews and discussions to focus the research. Some research methods do not overtly use a theoretical framework or conceptual framework in their design, but this is implicit and underpins the method design, for example in grounded theory. Other qualitative methods use one or the other to frame the design of a research project or to explain the outcomes. An example is given of how a conceptual framework was used throughout a research project. Theoretical and conceptual frameworks are terms that are regularly used in research but rarely explained. Textbooks should discuss what they are and how they can be used, so novice researchers understand how they can help with research design. Theoretical and conceptual frameworks need to be more clearly understood by researchers and correct terminology used to ensure clarity for novice researchers.
The Emergence of Relationship-based Cooperation.
Xu, Bo; Wang, Jianwei
2015-11-16
This paper investigates the emergence of relationship-based cooperation by coupling two simple mechanisms into the model: a tie-strength-based investment preference and a homophily assumption. We construct the model by categorizing game participants into four types: prosocialists (players who prefer to invest in their intimate friends), antisocialists (players who prefer to invest in strangers), egoists (players who never cooperate) and altruists (players who cooperate indifferently with anyone). We show that relationship-based cooperation (prosocialists) is favored throughout the evolution if we assume that players of the same type have stronger ties than players of different types. Moreover, we discover that strengthening the internal bonds within the strategic clusters further promotes the competitiveness of prosocialists and therefore facilitates the emergence of relationship-based cooperation in our proposed scenarios. The robustness of the model is also tested under different strategy updating rules and network structures. The results show that this argument is robust against variations of the initial conditions and can therefore be considered a fundamental theoretical framework to study relationship-based cooperation in reality.
Chaisangmongkon, Warasinee; Swaminathan, Sruthi K.; Freedman, David J.; Wang, Xiao-Jing
2017-01-01
Decision making involves dynamic interplay between internal judgements and external perception, which has been investigated in delayed match-to-category (DMC) experiments. Our analysis of neural recordings shows that, during DMC tasks, LIP and PFC neurons demonstrate mixed, time-varying, and heterogeneous selectivity, but previous theoretical work has not established the link between these neural characteristics and population-level computations. We trained a recurrent network model to perform DMC tasks and found that the model can remarkably reproduce key features of neuronal selectivity at the single-neuron and population levels. Analysis of the trained networks elucidates that robust transient trajectories of the neural population are the key driver of sequential categorical decisions. The directions of trajectories are governed by the network's self-organized connectivity, defining a 'neural landscape' consisting of a task-tailored arrangement of slow states and dynamical tunnels. With this model, we can identify functionally relevant circuit motifs and generalize the framework to solve other categorization tasks. PMID:28334612
The Emergence of Relationship-based Cooperation
Xu, Bo; Wang, Jianwei
2015-01-01
This paper investigates the emergence of relationship-based cooperation by coupling two simple mechanisms into the model: a tie-strength-based investment preference and a homophily assumption. We construct the model by categorizing game participants into four types: prosocialists (players who prefer to invest in their intimate friends), antisocialists (players who prefer to invest in strangers), egoists (players who never cooperate) and altruists (players who cooperate indifferently with anyone). We show that relationship-based cooperation (prosocialists) is favored throughout the evolution if we assume that players of the same type have stronger ties than players of different types. Moreover, we discover that strengthening the internal bonds within the strategic clusters further promotes the competitiveness of prosocialists and therefore facilitates the emergence of relationship-based cooperation in our proposed scenarios. The robustness of the model is also tested under different strategy updating rules and network structures. The results show that this argument is robust against variations in initial conditions and can therefore be considered a fundamental theoretical framework for studying relationship-based cooperation in reality. PMID:26567904
Cracking the chocolate egg problem: polymeric films coated on curved substrates
NASA Astrophysics Data System (ADS)
Brun, Pierre-Thomas; Lee, Anna; Marthelot, Joel; Balestra, Gioele; Gallaire, François; Reis, Pedro
2015-11-01
Inspired by the traditional chocolate egg recipe, we show that pouring a polymeric solution onto spherical molds yields a simple and robust path for fabricating thin elastic curved shells. The drainage dynamics naturally leads to uniform coatings that are frozen in time as the polymer cures and are subsequently peeled off their mold. We show how the polymer curing affects the drainage dynamics and eventually selects the shell thickness and sets its uniformity. To this end, we perform coating experiments using silicone-based elastomers, vinylpolysiloxane (VPS) and polydimethylsiloxane (PDMS). These results are rationalized by combining numerical simulations of the lubrication flow field with a theoretical model of the dynamics, yielding an analytical prediction of the formed shell characteristics. In particular, the robustness of the coating technique and its flexibility, two critical features for providing a generic framework for future studies, are shown to be an inherent consequence of the flow field (memory loss). The shell structure is both independent of initial conditions and tailorable by changing a single experimental parameter.
Light transport on path-space manifolds
NASA Astrophysics Data System (ADS)
Jakob, Wenzel Alban
The pervasive use of computer-generated graphics in our society has led to strict demands on their visual realism. Generally, users of rendering software want their images to look, in various ways, "real", which has been a key driving force towards methods that are based on the physics of light transport. Until recently, industrial practice has relied on a different set of methods that had comparatively little rigorous grounding in physics---but within the last decade, advances in rendering methods and computing power have come together to create a sudden and dramatic shift, in which physics-based methods that were formerly thought impractical have become the standard tool. As a consequence, considerable attention is now devoted towards making these methods as robust as possible. In this context, robustness refers to an algorithm's ability to process arbitrary input without large increases of the rendering time or degradation of the output image. One particularly challenging aspect of robustness entails simulating the precise interaction of light with all the materials that comprise the input scene. This dissertation focuses on one specific group of materials that has fundamentally been the most important source of difficulties in this process. Specular materials, such as glass windows, mirrors or smooth coatings (e.g. on finished wood), account for a significant percentage of the objects that surround us every day. It is perhaps surprising, then, that it is not well-understood how they can be accommodated within the theoretical framework that underlies some of the most sophisticated rendering methods available today. Many of these methods operate using a theoretical framework known as path space integration. But this framework makes no provisions for specular materials: to date, it is not clear how to write down a path space integral involving something as simple as a piece of glass. Although implementations can in practice still render these materials by side-stepping limitations of the theory, they often suffer from unusably slow convergence; improvements to this situation have been hampered by the lack of a thorough theoretical understanding. We address these problems by developing a new theory of path-space light transport which, for the first time, cleanly incorporates specular scattering into the standard framework. Most of the results obtained in the analysis of the ideally smooth case can also be generalized to rendering of glossy materials and volumetric scattering so that this dissertation also provides a powerful new set of tools for dealing with them. The basis of our approach is that each specular material interaction locally collapses the dimension of the space of light paths so that all relevant paths lie on a submanifold of path space. We analyze the high-dimensional differential geometry of this submanifold and use the resulting information to construct an algorithm that is able to "walk" around on it using a simple and efficient equation-solving iteration. This manifold walking algorithm then constitutes the key operation of a new type of Markov Chain Monte Carlo (MCMC) rendering method that computes lighting through very general families of paths that can involve arbitrary combinations of specular, near-specular, glossy, and diffuse surface interactions as well as isotropic or highly anisotropic volume scattering. We demonstrate our implementation on a range of challenging scenes and evaluate it against previous methods.
A Riemannian framework for orientation distribution function computing.
Cheng, Jian; Ghosh, Aurobrata; Jiang, Tianzi; Deriche, Rachid
2009-01-01
Compared with Diffusion Tensor Imaging (DTI), High Angular Resolution Diffusion Imaging (HARDI) can better explore the complex microstructure of white matter. The Orientation Distribution Function (ODF) is used to describe the probability of the fiber direction. The Fisher information metric has been constructed for probability density families in Information Geometry theory and has been successfully applied to tensor computing in DTI. In this paper, we present a state-of-the-art Riemannian framework for ODF computing based on Information Geometry and sparse representation of orthonormal bases. In this Riemannian framework, the exponential map, logarithmic map and geodesic have closed forms, and the weighted Fréchet mean exists uniquely on this manifold. We also propose a novel scalar measurement, named Geometric Anisotropy (GA), which is the Riemannian geodesic distance between the ODF and the isotropic ODF. The Rényi entropy H1/2 of the ODF can be computed from the GA. Moreover, we present an Affine-Euclidean framework and a Log-Euclidean framework so that we can work in a Euclidean space. As an application, Lagrange interpolation on the ODF field is proposed based on the weighted Fréchet mean. We validate our methods on synthetic and real data experiments. Compared with existing Riemannian frameworks on ODF, our framework is model-free. The estimation of the parameters, i.e., the Riemannian coordinates, is robust and linear. Moreover, it should be noted that our theoretical results can be used for any probability density function (PDF) under an orthonormal basis representation.
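The closed-form geodesic operations described above can be illustrated numerically. The sketch below is an assumption-laden illustration rather than the authors' implementation: it uses the square-root representation of discretized ODFs (so that they lie on a unit sphere under a weighted L2 inner product) and computes the geodesic distance as the arc length given by the arccosine of their inner product; the distance to the isotropic ODF then plays the role of a GA-like quantity. Function names and quadrature weights are illustrative.

```python
import numpy as np

def sqrt_map(odf, weights):
    """Map a discretized ODF (nonnegative, integrating to 1) to its square root,
    a unit vector under the weighted L2 inner product."""
    psi = np.sqrt(np.maximum(odf, 0.0))
    return psi / np.sqrt(np.sum(weights * psi ** 2))

def geodesic_distance(odf1, odf2, weights):
    """Arc-length distance on the unit Hilbert sphere between square-root ODFs."""
    p1, p2 = sqrt_map(odf1, weights), sqrt_map(odf2, weights)
    inner = np.clip(np.sum(weights * p1 * p2), -1.0, 1.0)
    return float(np.arccos(inner))

# Toy example on a small set of directions with equal quadrature weights.
n_dirs = 100
w = np.full(n_dirs, 1.0 / n_dirs)             # quadrature weights on the sphere
iso = np.ones(n_dirs)                          # isotropic ODF
peaked = np.exp(-np.linspace(0, 5, n_dirs))    # a toy anisotropic profile
iso /= np.sum(w * iso)
peaked /= np.sum(w * peaked)

ga_like = geodesic_distance(peaked, iso, w)    # distance to the isotropic ODF
print(f"geodesic distance to isotropic ODF: {ga_like:.3f}")
```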
Cane, James; O'Connor, Denise; Michie, Susan
2012-04-24
An integrative theoretical framework, developed for cross-disciplinary implementation and other behaviour change research, has been applied across a wide range of clinical situations. This study tests the validity of this framework. Validity was investigated by behavioural experts sorting 112 unique theoretical constructs using closed and open sort tasks. The extent of replication was tested by Discriminant Content Validation and Fuzzy Cluster Analysis. There was good support for a refinement of the framework comprising 14 domains of theoretical constructs (average silhouette value 0.29): 'Knowledge', 'Skills', 'Social/Professional Role and Identity', 'Beliefs about Capabilities', 'Optimism', 'Beliefs about Consequences', 'Reinforcement', 'Intentions', 'Goals', 'Memory, Attention and Decision Processes', 'Environmental Context and Resources', 'Social Influences', 'Emotions', and 'Behavioural Regulation'. The refined Theoretical Domains Framework has a strengthened empirical base and provides a method for theoretically assessing implementation problems, as well as professional and other health-related behaviours as a basis for intervention development.
Group percolation in interdependent networks
NASA Astrophysics Data System (ADS)
Wang, Zexun; Zhou, Dong; Hu, Yanqing
2018-03-01
In many real network systems, nodes usually cooperate with each other and form groups to enhance their robustness to risks. This motivates us to study an alternative type of percolation, group percolation, in interdependent networks under attack. In this model, nodes belonging to the same group survive or fail together. We develop a theoretical framework for this group percolation and find that the formation of groups can improve the resilience of interdependent networks significantly. However, the percolation transition is always of first order, regardless of the distribution of group sizes. As an application, we map the interdependent networks with intersimilarity structures, which have attracted much attention recently, onto the group percolation and confirm the nonexistence of continuous phase transitions.
Bayesian classification theory
NASA Technical Reports Server (NTRS)
Hanson, Robin; Stutz, John; Cheeseman, Peter
1991-01-01
The task of inferring a set of classes and class descriptions most likely to explain a given data set can be placed on a firm theoretical foundation using Bayesian statistics. Within this framework and using various mathematical and algorithmic approximations, the AutoClass system searches for the most probable classifications, automatically choosing the number of classes and complexity of class descriptions. A simpler version of AutoClass has been applied to many large real data sets, has discovered new independently-verified phenomena, and has been released as a robust software package. Recent extensions allow attributes to be selectively correlated within particular classes, and allow classes to inherit or share model parameters through a class hierarchy. We summarize the mathematical foundations of AutoClass.
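A minimal sketch of the underlying idea, not of AutoClass itself: fit finite mixture models for several candidate numbers of classes and prefer the one with the best approximate marginal likelihood. Here scikit-learn's GaussianMixture and its BIC score stand in for AutoClass's more elaborate Bayesian search, and the data set is synthetic.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic data drawn from three latent classes.
data = np.vstack([
    rng.normal(loc, 0.5, size=(200, 2))
    for loc in ([0, 0], [4, 0], [2, 3])
])

# Search over the number of classes; BIC trades goodness of fit against
# model complexity, roughly approximating a marginal-likelihood comparison.
scores = {}
for k in range(1, 7):
    gmm = GaussianMixture(n_components=k, n_init=5, random_state=0).fit(data)
    scores[k] = gmm.bic(data)

best_k = min(scores, key=scores.get)
print("BIC by number of classes:", {k: round(v, 1) for k, v in scores.items()})
print("most probable number of classes (by BIC):", best_k)
```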
Classification of large-sized hyperspectral imagery using fast machine learning algorithms
NASA Astrophysics Data System (ADS)
Xia, Junshi; Yokoya, Naoto; Iwasaki, Akira
2017-07-01
We present a framework of fast machine learning algorithms in the context of large-sized hyperspectral image classification, from the theoretical to the practical viewpoint. In particular, we assess the performance of random forest (RF), rotation forest (RoF), and extreme learning machine (ELM), as well as ensembles of RF and ELM. These classifiers are applied to two large-sized hyperspectral images and compared to support vector machines. To give a quantitative analysis, we pay attention to comparing these methods when working with high input dimensions and a limited/sufficient training set. Moreover, other important issues such as computational cost and robustness against noise are also discussed.
Penetration with Long Rods: A Theoretical Framework and Comparison with Instrumented Impacts,
1980-06-01
A theoretical framework for an experimental program is described. The theory of one-dimensional wave propagation is used to show how data from instrumented long rods and targets may be fitted together to give a...the theoretical framework. In the final section the results to date are discussed.
ERIC Educational Resources Information Center
Palmer, Zsuzsanna Bacsa
2013-01-01
The effects of globalization on communication products and processes have resulted in document features and interactional practices that are sometimes difficult to describe within current theoretical frameworks of inter/transcultural technical communication. Although it has been recognized in our field that the old theoretical frameworks and…
Navigation in unfamiliar cities: a review of the literature and a theoretical framework
Schraagen, J. M. C.
1989-10-02
Report No. IZF 1989-36. The theoretical framework sketched in the review suggests that some people may be better at encoding spatial information than others.
A simple theoretical framework for understanding heterogeneous differentiation of CD4+ T cells
2012-01-01
Background CD4+ T cells have several subsets of functional phenotypes, which play critical yet diverse roles in the immune system. Pathogen-driven differentiation of these subsets of cells is often heterogeneous in terms of the induced phenotypic diversity. In vitro recapitulation of heterogeneous differentiation under homogeneous experimental conditions indicates some highly regulated mechanisms by which multiple phenotypes of CD4+ T cells can be generated from a single population of naïve CD4+ T cells. Therefore, conceptual understanding of induced heterogeneous differentiation will shed light on the mechanisms controlling the response of populations of CD4+ T cells under physiological conditions. Results We present a simple theoretical framework to show how heterogeneous differentiation in a two-master-regulator paradigm can be governed by a signaling network motif common to all subsets of CD4+ T cells. With this motif, a population of naïve CD4+ T cells can integrate the signals from their environment to generate a functionally diverse population with robust commitment of individual cells. Notably, two positive feedback loops in this network motif govern three bistable switches, which in turn, give rise to three types of heterogeneous differentiated states, depending upon particular combinations of input signals. We provide three prototype models illustrating how to use this framework to explain experimental observations and make specific testable predictions. Conclusions The process in which several types of T helper cells are generated simultaneously to mount complex immune responses upon pathogenic challenges can be highly regulated, and a simple signaling network motif can be responsible for generating all possible types of heterogeneous populations with respect to a pair of master regulators controlling CD4+ T cell differentiation. The framework provides a mathematical basis for understanding the decision-making mechanisms of CD4+ T cells, and it can be helpful for interpreting experimental results. Mathematical models based on the framework make specific testable predictions that may improve our understanding of this differentiation system. PMID:22697466
Schlägel, Ulrike E; Lewis, Mark A
2016-12-01
Discrete-time random walks and their extensions are common tools for analyzing animal movement data. In these analyses, resolution of temporal discretization is a critical feature. Ideally, a model both mirrors the relevant temporal scale of the biological process of interest and matches the data sampling rate. Challenges arise when the resolution of the data is too coarse due to technological constraints, or when we wish to extrapolate results or compare results obtained from data with different resolutions. Drawing loosely on the concept of robustness in statistics, we propose a rigorous mathematical framework for studying movement models' robustness against changes in temporal resolution. In this framework, we define varying levels of robustness as formal model properties, focusing on random walk models with a spatially explicit component. With the new framework, we can investigate whether models can validly be applied to data across varying temporal resolutions and how we can account for these different resolutions in statistical inference results. We apply the new framework to movement-based resource selection models, demonstrating both analytical and numerical calculations, as well as a Monte Carlo simulation approach. While exact robustness is rare, the concept of approximate robustness provides a promising new direction for analyzing movement models.
Examining Neuronal Connectivity and Its Role in Learning and Memory
NASA Astrophysics Data System (ADS)
Gala, Rohan
Learning and long-term memory formation are accompanied by changes in the patterns and weights of synaptic connections in the underlying neuronal network. However, the fundamental rules that drive connectivity changes, and the precise structure-function relationships within neuronal networks, remain elusive. Technological improvements over the last few decades have enabled the observation of large but specific subsets of neurons and their connections in unprecedented detail. Devising robust and automated computational methods is critical to distill information from ever-increasing volumes of raw experimental data. Moreover, statistical models and theoretical frameworks are required to interpret the data and assemble evidence into an understanding of brain function. In this thesis, I first describe computational methods to reconstruct connectivity based on light microscopy imaging experiments. Next, I use these methods to quantify structural changes in connectivity based on in vivo time-lapse imaging experiments. Finally, I present a theoretical model of associative learning that can explain many stereotypical features of experimentally observed connectivity.
Chimera states in spatiotemporal systems: Theory and Applications
NASA Astrophysics Data System (ADS)
Yao, Nan; Zheng, Zhigang
2016-03-01
In this paper, we present a retrospective summary of recent studies of chimera states. Chimera states are striking inhomogeneous spatiotemporal patterns that emerge in homogeneous systems through unexpected spontaneous symmetry breaking; the resulting patterns are composed of coexisting coherence and incoherence domains, characterized respectively by the synchronized and desynchronized motions of oscillators. Since the discovery of chimera states by Kuramoto and others, this striking collective behavior has attracted a great deal of research interest in physics and related interdisciplinary fields, from both theoretical and experimental viewpoints. Recent works exploring chimera states have observed rich phenomena, such as spiral wave chimeras, multiple-cluster chimeras, and amplitude chimeras, in various types of model systems. Theoretical frameworks based on the self-consistency approach and the Ott-Antonsen approach have been proposed to further understand this symmetry-breaking-induced behavior. The stability and robustness of chimera states have also been discussed. More importantly, experiments ranging from optical and chemical to mechanical designs have confirmed the existence of chimera states.
Performance Enhancement of a High Speed Jet Impingement System for Nonvolatile Residue Removal
NASA Technical Reports Server (NTRS)
Klausner, James F.; Mei, Renwei; Near, Steve; Stith, Rex
1996-01-01
A high speed jet impingement cleaning facility has been developed to study the effectiveness of nonvolatile residue removal. The facility includes a high pressure air compressor which charges the k-bottles to supply high pressure air, an air heating section to vary the temperature of the high pressure air, an air-water mixing chamber to meter the water flow and generate small droplets, and a converging-diverging nozzle to deliver the supersonic air-droplet mixture flow to the cleaning surface. To reliably quantify the cleanliness of the surface, a simple procedure for measurement and calibration is developed to relate the amount of residue on the surface to the relative change in reflectivity between a clean surface and the greased surface. This calibration procedure is economical, simple, reliable, and robust. A theoretical framework is developed to provide qualitative guidance for the design of the test and interpretation of the experimental results. The results documented in this report support the theoretical considerations.
A Social-Cognitive Theoretical Framework for Examining Music Teacher Identity
ERIC Educational Resources Information Center
McClellan, Edward
2017-01-01
The purpose of the study was to examine a diverse range of research literature to provide a social-cognitive theoretical framework as a foundation for definition of identity construction in the music teacher education program. The review of literature may reveal a theoretical framework based around tenets of commonly studied constructs in the…
Penetration with Long Rods: A Theoretical Framework and Comparison with Instrumented Impacts
1981-05-01
program to begin probing the details of the interaction process. The theoretical framework underlying such a program is explained in detail. The theory of...of the time sequence of events during penetration. Data from one series of experiments, reported in detail elsewhere, is presented and discussed within the theoretical framework.
Theoretical and Conceptual Frameworks Used in Research on Family-School Partnerships
ERIC Educational Resources Information Center
Yamauchi, Lois A.; Ponte, Eva; Ratliffe, Katherine T.; Traynor, Kevin
2017-01-01
This study investigated the theoretical frameworks used to frame research on family-school partnerships over a five-year period. Although many researchers have described their theoretical approaches, little has been written about the diversity of frameworks used and how they are applied. Coders analyzed 215 journal articles published from 2007 to…
Kaack, Lorraine; Bender, Miriam; Finch, Michael; Borns, Linda; Grasham, Katherine; Avolio, Alice; Clausen, Shawna; Terese, Nadine A; Johnstone, Diane; Williams, Marjory
The Veterans Health Administration (VHA) Office of Nursing Services (ONS) was an early adopter of Clinical Nurse Leader (CNL) practice, generating some of the earliest pilot data of CNL practice effectiveness. In 2011 the VHA ONS CNL Implementation & Evaluation Service (CNL I&E) piloted a curriculum to facilitate CNL transition to effective practice at local VHA settings. In 2015, the CNL I&E and local VHA setting stakeholders collaborated to refine the program, based on lessons learned at the national and local level. The workgroup reviewed the literature to identify theoretical frameworks for CNL practice and practice development. The workgroup selected Benner et al.'s Novice-to-Expert model as the defining framework for CNL practice development, and Bender et al.'s CNL Practice Model as the defining framework for CNL practice integration. The selected frameworks were cross-walked against existing curriculum elements to identify and clarify additional practice development needs. The work generated key insights into: core stages of transition to effective practice; CNL progress and expectations for each stage; and organizational support structures necessary for CNL success at each stage. The refined CNL development model is a robust tool that can be applied to support consistent and effective integration of CNL practice into care delivery. Published by Elsevier Inc.
SDE decomposition and A-type stochastic interpretation in nonequilibrium processes
NASA Astrophysics Data System (ADS)
Yuan, Ruoshi; Tang, Ying; Ao, Ping
2017-12-01
An innovative theoretical framework for stochastic dynamics based on the decomposition of a stochastic differential equation (SDE) into a dissipative component, a detailed-balance-breaking component, and a dual-role potential landscape has been developed, which has fruitful applications in physics, engineering, chemistry, and biology. It introduces the A-type stochastic interpretation of the SDE beyond the traditional Ito or Stratonovich interpretation or even the α-type interpretation for multidimensional systems. The potential landscape serves as a Hamiltonian-like function in nonequilibrium processes without detailed balance, which extends this important concept from equilibrium statistical physics to the nonequilibrium region. A question on the uniqueness of the SDE decomposition was recently raised. Our review of both the mathematical and physical aspects shows that uniqueness is guaranteed. The demonstration leads to a better understanding of the robustness of the novel framework. In addition, we discuss related issues including the limitations of an approach to obtaining the potential function from a steady-state distribution.
Theoretical and Empirical Analysis of a Spatial EA Parallel Boosting Algorithm.
Kamath, Uday; Domeniconi, Carlotta; De Jong, Kenneth
2018-01-01
Many real-world problems involve massive amounts of data. Under these circumstances learning algorithms often become prohibitively expensive, making scalability a pressing issue to be addressed. A common approach is to perform sampling to reduce the size of the dataset and enable efficient learning. Alternatively, one customizes learning algorithms to achieve scalability. In either case, the key challenge is to obtain algorithmic efficiency without compromising the quality of the results. In this article we discuss a meta-learning algorithm (PSBML) that combines concepts from spatially structured evolutionary algorithms (SSEAs) with concepts from ensemble and boosting methodologies to achieve the desired scalability property. We present both theoretical and empirical analyses which show that PSBML preserves a critical property of boosting, specifically, convergence to a distribution centered around the margin. We then present additional empirical analyses showing that this meta-level algorithm provides a general and effective framework that can be used in combination with a variety of learning classifiers. We perform extensive experiments to investigate the trade-off achieved between scalability and accuracy, and robustness to noise, on both synthetic and real-world data. These empirical results corroborate our theoretical analysis, and demonstrate the potential of PSBML in achieving scalability without sacrificing accuracy.
Seven Basic Steps to Solving Ethical Dilemmas in Special Education: A Decision-Making Framework
ERIC Educational Resources Information Center
Stockall, Nancy; Dennis, Lindsay R.
2015-01-01
This article presents a seven-step framework for decision making to solve ethical issues in special education. The authors developed the framework from the existing literature and theoretical frameworks of justice, critique, care, and professionalism. The authors briefly discuss each theoretical framework and then describe the decision-making…
2009-08-05
Socio-cultural data acquisition, extraction, and management. First the idea of a theoretical framework will be very briefly discussed as well as... Subject terms: human behavior, theoretical framework, hypothesis development, experimental design, ethical research, statistical power, human laboratory.
ERIC Educational Resources Information Center
Ornek, Funda
2008-01-01
One or more theoretical frameworks or orientations are used in qualitative education research. In this paper, the main tenets, the background and the appropriateness of phenomenography, which is one of the theoretical frameworks used in qualitative research, will be depicted. Further, the differences among phenomenography, phenomenology and…
Using a Theoretical Framework of Institutional Culture to Analyse an Institutional Strategy Document
ERIC Educational Resources Information Center
Jacobs, Anthea Hydi Maxine
2016-01-01
This paper builds on a conceptual analysis of institutional culture in higher education. A theoretical framework was proposed to analyse institutional documents of two higher education institutions in the Western Cape, for the period 2002 to 2012 (Jacobs 2012). The elements of this theoretical framework are "shared values and beliefs",…
ERIC Educational Resources Information Center
Asiri, Mohammed J. Sherbib; Mahmud, Rosnaini bt; Bakar, Kamariah Abu; Ayub, Ahmad Fauzi bin Mohd
2012-01-01
The purpose of this paper is to present the theoretical framework underlying a research on factors that influence utilization of the Jusur Learning Management System (Jusur LMS) in Saudi Arabian public universities. Development of the theoretical framework was done based on library research approach. Initially, the existing literature relevant to…
NASA Astrophysics Data System (ADS)
Fernandes, Geraldo W. Rocha; Rodrigues, António M.; Ferreira, Carlos Alberto
2018-03-01
This article aims to characterise research on science teachers' professional development programs that support the use of Information and Communication Technologies (ICTs), and the main trends in the theoretical frameworks (theoretical foundation, literature review or background) that underpin these studies. Through a systematic review of the literature, 76 articles were found and organised along two axes, on training science teachers and on the use of digital technologies, each with its own categories. The first axis (characterisation of the articles) presents the key features that characterise the selected articles (major subjects, training and actions for professional development, and major ICT tools and digital resources). The second axis (trends in theoretical frameworks) has three categories of theoretical frameworks, emphasising (a) digital technologies, (b) prospects of curricular renewal and (c) cognitive processes. The review also identified a group of articles whose theoretical frameworks contain multiple elements without deepening them, or that even lack a theoretical framework supporting the study. In this review, we found that many professional development programs for teachers still use inadequate strategies for bringing about change in teacher practices. New professional development proposals are emerging with the objective of minimising such difficulties, and this analysis could be a helpful tool for restructuring those proposals.
A theoretical framework to support research of health service innovation.
Fox, Amanda; Gardner, Glenn; Osborne, Sonya
2015-02-01
Health service managers and policy makers are increasingly concerned about the sustainability of innovations implemented in health care settings. The increasing demand on health services requires that innovations are both effective and sustainable; however, research in this field is limited, with multiple disciplines, approaches and paradigms influencing the field. These variations prevent a cohesive approach, and therefore the accumulation of research findings, in the development of a body of knowledge. The purpose of this paper is to provide a thorough examination of the research findings and an appropriate theoretical framework for examining the sustainability of health service innovation. This paper presents an integrative review of the available literature on the sustainability of health service innovation and develops a theoretical framework based on integration and synthesis of that literature. A theoretical framework serves to guide research, determine variables, influence data analysis and is central to the quest for ongoing knowledge development. This research outlines the sustainability of innovation framework, a theoretical framework suitable for examining the sustainability of health service innovation. If left unaddressed, health services research will continue in an ad hoc manner, preventing full utilisation of outcomes, recommendations and knowledge for the effective provision of health services. The sustainability of innovation theoretical framework provides an operational basis upon which reliable future research can be conducted.
Cross contrast multi-channel image registration using image synthesis for MR brain images.
Chen, Min; Carass, Aaron; Jog, Amod; Lee, Junghoon; Roy, Snehashis; Prince, Jerry L
2017-02-01
Multi-modal deformable registration is important for many medical image analysis tasks such as atlas alignment, image fusion, and distortion correction. Whereas a conventional method would register images with different modalities using modality independent features or information theoretic metrics such as mutual information, this paper presents a new framework that addresses the problem using a two-channel registration algorithm capable of using mono-modal similarity measures such as sum of squared differences or cross-correlation. To make it possible to use these same-modality measures, image synthesis is used to create proxy images for the opposite modality as well as intensity-normalized images from each of the two available images. The new deformable registration framework was evaluated by performing intra-subject deformation recovery, intra-subject boundary alignment, and inter-subject label transfer experiments using multi-contrast magnetic resonance brain imaging data. Three different multi-channel registration algorithms were evaluated, revealing that the framework is robust to the multi-channel deformable registration algorithm that is used. With a single exception, all results demonstrated improvements when compared against single channel registrations using the same algorithm with mutual information. Copyright © 2016 Elsevier B.V. All rights reserved.
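The core of the two-channel idea can be illustrated in a few lines. The sketch below assumes that synthesized proxy images in the opposite contrast already exist (the paper obtains them by image synthesis) and scores a candidate translation with a mono-modal similarity, the sum of squared differences, summed over both channels. All names are illustrative, and a real deformable registration would optimize a dense displacement field rather than a single integer shift.

```python
import numpy as np

def ssd(a, b):
    """Sum of squared differences between two images of equal shape."""
    return float(np.sum((a - b) ** 2))

def two_channel_cost(fixed, moving, fixed_proxy, moving_proxy, shift):
    """Mono-modal cost summed over two channels:
    channel 1: fixed (contrast A)  vs. proxy of moving synthesized into contrast A
    channel 2: moving (contrast B) vs. proxy of fixed synthesized into contrast B
    `shift` is a simple integer translation applied to the moving-side images."""
    moved_proxy = np.roll(moving_proxy, shift, axis=(0, 1))
    moved_image = np.roll(moving, shift, axis=(0, 1))
    return ssd(fixed, moved_proxy) + ssd(fixed_proxy, moved_image)

# Toy usage: pick the translation that minimizes the combined cost.
rng = np.random.default_rng(1)
fixed = rng.random((64, 64))                              # contrast A
moving = 1.0 - np.roll(fixed, (3, -2), axis=(0, 1))       # contrast B, misaligned
moving_proxy = 1.0 - moving                               # stand-in for synthesis into contrast A
fixed_proxy = 1.0 - fixed                                 # stand-in for synthesis into contrast B

candidates = [(dy, dx) for dy in range(-5, 6) for dx in range(-5, 6)]
best = min(candidates,
           key=lambda s: two_channel_cost(fixed, moving, fixed_proxy, moving_proxy, s))
print("estimated shift:", best)   # expect (-3, 2), which undoes the misalignment
```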
Hypercrosslinked phenolic polymers with well developed mesoporous frameworks
Zhang, Jinshui; Qiao, Zhen-An; Mahurin, Shannon Mark, ...
2015-02-12
A soft chemistry synthetic strategy based on a Friedel-Crafts alkylation reaction is developed for the textural engineering of phenolic resin (PR) with a robust mesoporous framework to avoid serious framework shrinkage and maximize retention of organic functional moieties. By taking advantage of the structural benefits of molecular bridges, the resultant sample maintains a bimodal micro-mesoporous architecture with well-preserved organic functional groups, which is effective for carbon capture. Furthermore, this soft chemistry synthetic protocol can be further extended to nanotexture other aromatic-based polymers with robust frameworks.
Petrou, Panagiotis; Talias, Michael A
2014-01-01
The continuing increase of pharmaceutical expenditure calls for new approaches to the pricing and reimbursement of pharmaceuticals. Value-based pricing of pharmaceuticals is emerging as a useful tool and possesses theoretical attributes that can help health systems cope with rising pharmaceutical expenditure. The aim was to assess the feasibility of introducing a value-based pricing scheme for pharmaceuticals in Cyprus and to explore an integrative framework. A probabilistic Markov chain Monte Carlo model was created to simulate progression of advanced renal cell cancer, comparing sorafenib to standard best supportive care. A literature review was performed and efficacy data were transferred from a published landmark trial, while official price lists and clinical guidelines from the Cyprus Ministry of Health were utilised for cost calculation. Based on the proposed willingness-to-pay threshold, the maximum price of sorafenib for the indication of second-line renal cell cancer was assessed. Sorafenib's value-based price was found to be significantly lower than its current reference price. The feasibility of value-based pricing is documented, and pharmacoeconomic modelling can lead to robust results. Integration of value and affordability into the price are its main advantages, which have to be weighed against the lack of documentation for several theoretical parameters that influence the outcome. Smaller countries such as Cyprus may experience adversities in establishing and sustaining the essential structures for this scheme.
Miller, Warren B.; Rodgers, Joseph Lee; Pasta, David J.
2010-01-01
We examine how the motivational sequence that leads to childbearing predicts fertility outcomes across reproductive careers. Using a motivational traits-desires-intentions theoretical framework, we test a structural equation model using prospective male and female data from the National Longitudinal Survey of Youth. Specifically, we take motivational data collected during the 1979–1982 period, when the youths were in their teens and early twenties, to predict the timing of the next child born after 1982 and the total number of children born by 2002. Separate models were estimated for males and females but with equality constraints imposed unless relaxing these constraints improved the overall model fit. The results indicate substantial explanatory power of fertility motivations for both short-term and long-term fertility outcomes. They also reveal the effects of both gender role attitude and educational intentions on these outcomes. Although some sex differences in model pathways occurred, the primary hypothesized pathways were essentially the same across the sexes. Two validity substudies support the soundness of the results. A third substudy comparing the male and female models across the sample split on the basis of previous childbearing revealed a number of pattern differences within the four sex-by-previous childbearing groups. Several of the more robust of these pattern differences offer interesting insights and support the validity and usefulness of our theoretical framework. PMID:20463915
Li, Zukui; Floudas, Christodoulos A.
2012-01-01
Probabilistic guarantees on constraint satisfaction for robust counterpart optimization are studied in this paper. The robust counterpart optimization formulations studied are derived from box, ellipsoidal, polyhedral, “interval+ellipsoidal” and “interval+polyhedral” uncertainty sets (Li, Z., Ding, R., and Floudas, C.A., A Comparative Theoretical and Computational Study on Robust Counterpart Optimization: I. Robust Linear and Robust Mixed Integer Linear Optimization, Ind. Eng. Chem. Res, 2011, 50, 10567). For those robust counterpart optimization formulations, the corresponding probability bounds on constraint satisfaction are derived for different types of uncertainty characteristics (i.e., bounded or unbounded uncertainty, with or without detailed probability distribution information). The findings of this work extend the results in the literature and provide greater flexibility for robust optimization practitioners in choosing tighter probability bounds so as to find less conservative robust solutions. Extensive numerical studies are performed to compare the tightness of the different probability bounds and the conservatism of different robust counterpart optimization formulations. Guiding rules for the selection of robust counterpart optimization models and for the determination of the size of the uncertainty set are discussed. Applications in production planning and process scheduling problems are presented. PMID:23329868
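To make the uncertainty-set idea concrete, here is a small illustrative sketch (not the paper's formulations) for a single linear constraint a·x ≤ b whose coefficients vary as a_j = ā_j + â_j ξ_j with |ξ_j| ≤ 1. The box counterpart protects against a box of size Ψ, the ellipsoidal counterpart against ‖ξ‖₂ ≤ Ω, and the printed exp(-Ω²/2) figure is the classical Ben-Tal and Nemirovski style bound on the violation probability for independent, symmetric, bounded perturbations; the exact bounds compared in the paper differ by formulation.

```python
import numpy as np

def box_counterpart_lhs(a_bar, a_hat, x, psi):
    """Worst-case left-hand side of a.x <= b over the box |xi_j| <= psi."""
    return float(a_bar @ x + psi * np.sum(np.abs(a_hat * x)))

def ellipsoidal_counterpart_lhs(a_bar, a_hat, x, omega):
    """Worst-case left-hand side over the ellipsoid ||xi||_2 <= omega."""
    return float(a_bar @ x + omega * np.linalg.norm(a_hat * x))

a_bar = np.array([2.0, 3.0, 1.0])   # nominal coefficients
a_hat = np.array([0.2, 0.3, 0.1])   # perturbation half-widths
x = np.array([1.0, 1.0, 2.0])       # a candidate solution
b = 10.0

for omega in (0.5, 1.0, 2.0):
    lhs = ellipsoidal_counterpart_lhs(a_bar, a_hat, x, omega)
    bound = np.exp(-omega ** 2 / 2.0)   # classical violation-probability bound
    print(f"Omega={omega}: worst-case lhs={lhs:.2f} (robustly feasible: {lhs <= b}), "
          f"violation probability bound={bound:.3f}")

print("box Psi=1 worst-case lhs:", box_counterpart_lhs(a_bar, a_hat, x, 1.0))
```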
Inquiry-based Learning and Digital Libraries in Undergraduate Science Education
NASA Astrophysics Data System (ADS)
Apedoe, Xornam S.; Reeves, Thomas C.
2006-12-01
The purpose of this paper is twofold: to describe robust rationales for integrating inquiry-based learning into undergraduate science education, and to propose that digital libraries are potentially powerful technological tools that can support inquiry-based learning goals in undergraduate science courses. Overviews of constructivism and situated cognition are provided with regard to how these two theoretical perspectives have influenced current science education reform movements, especially those that involve inquiry-based learning. The role that digital libraries can play in inquiry-based learning environments is discussed. Finally, the importance of alignment among critical pedagogical dimensions of an inquiry-based pedagogical framework is stressed in the paper, and an example of how this can be done is presented using earth science education as a context.
Autoclass: An automatic classification system
NASA Technical Reports Server (NTRS)
Stutz, John; Cheeseman, Peter; Hanson, Robin
1991-01-01
The task of inferring a set of classes and class descriptions most likely to explain a given data set can be placed on a firm theoretical foundation using Bayesian statistics. Within this framework, and using various mathematical and algorithmic approximations, the AutoClass System searches for the most probable classifications, automatically choosing the number of classes and complexity of class descriptions. A simpler version of AutoClass has been applied to many large real data sets, has discovered new independently-verified phenomena, and has been released as a robust software package. Recent extensions allow attributes to be selectively correlated within particular classes, and allow classes to inherit, or share, model parameters through a class hierarchy. The mathematical foundations of AutoClass are summarized.
Adaptive PID formation control of nonholonomic robots without leader's velocity information.
Shen, Dongbin; Sun, Weijie; Sun, Zhendong
2014-03-01
This paper proposes an adaptive proportional-integral-derivative (PID) algorithm to solve a formation control problem in the leader-follower framework, where the leader robot's velocities are unknown to the follower robots. The main idea is first to design a proper ideal control law for the formation system to obtain the required performance, and then to use the adaptive PID methodology to approximate the ideal controller. As a result, the formation is achieved with significantly enhanced robustness. The stability of the closed-loop system is theoretically proved by the Lyapunov method. Both numerical simulations and physical vehicle experiments are presented to verify the effectiveness of the proposed adaptive PID algorithm. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
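The flavor of a PID-type formation law can be sketched in a short toy simulation. This is a hedged illustration, not the paper's adaptive algorithm: a follower with simple velocity-command kinematics regulates its offset from a leader using fixed PID gains, and the leader's (unknown) velocity appears as a disturbance absorbed mainly by the integral term. Gains, dynamics, and the desired offset are illustrative.

```python
import numpy as np

def simulate(kp=1.5, ki=0.4, kd=0.2, dt=0.05, steps=600):
    """Toy leader-follower formation keeping with a PID law on the offset error.
    Follower kinematics: p_f' = u (velocity command). The leader's velocity is
    unknown to the controller and acts as a disturbance to be rejected."""
    desired_offset = np.array([-1.0, 0.5])     # where the follower should sit relative to the leader
    leader = np.zeros(2)
    leader_vel = np.array([0.6, 0.2])          # unknown to the follower
    follower = np.array([-3.0, 2.0])
    integral = np.zeros(2)
    prev_error = desired_offset - (follower - leader)
    errors = []
    for _ in range(steps):
        leader = leader + leader_vel * dt
        error = desired_offset - (follower - leader)   # formation error
        integral += error * dt
        derivative = (error - prev_error) / dt
        u = kp * error + ki * integral + kd * derivative   # PID velocity command
        follower = follower + u * dt
        prev_error = error
        errors.append(float(np.linalg.norm(error)))
    return errors

errors = simulate()
print(f"initial formation error {errors[0]:.2f} -> final formation error {errors[-1]:.4f}")
```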
Combined invariants to similarity transformation and to blur using orthogonal Zernike moments
Beijing, Chen; Shu, Huazhong; Zhang, Hui; Coatrieux, Gouenou; Luo, Limin; Coatrieux, Jean-Louis
2011-01-01
The derivation of moment invariants has been extensively investigated in the past decades. In this paper, we construct a set of invariants derived from Zernike moments that are simultaneously invariant to similarity transformation and to convolution with a circularly symmetric point spread function (PSF). Two main contributions are provided: the theoretical framework for deriving the Zernike moments of a blurred image, and the way to construct the combined geometric-blur invariants. The performance of the proposed descriptors is evaluated with various PSFs and similarity transformations. A comparison of the proposed method with existing ones is also provided in terms of pattern recognition accuracy, template matching and robustness to noise. Experimental results show that the proposed descriptors perform better overall. PMID:20679028
What Is Robustness?: Problem Framing Challenges for Water Systems Planning Under Change
NASA Astrophysics Data System (ADS)
Herman, J. D.; Reed, P. M.; Zeff, H. B.; Characklis, G. W.
2014-12-01
Water systems planners have long recognized the need for robust solutions capable of withstanding deviations from the conditions for which they were designed. Faced with a set of alternatives to choose from—for example, resulting from a multi-objective optimization—existing analysis frameworks offer competing definitions of robustness under change. Robustness analyses have moved from expected utility to exploratory "bottom-up" approaches in which vulnerable scenarios are identified prior to assigning likelihoods; examples include Robust Decision Making (RDM), Decision Scaling, Info-Gap, and Many-Objective Robust Decision Making (MORDM). We propose a taxonomy of robustness frameworks to compare and contrast these approaches, based on their methods of (1) alternative selection, (2) sampling of states of the world, (3) quantification of robustness measures, and (4) identification of key uncertainties using sensitivity analysis. Using model simulations from recent work in multi-objective urban water supply portfolio planning, we illustrate the decision-relevant consequences that emerge from each of these choices. Results indicate that the methodological choices in the taxonomy lead to substantially different planning alternatives, underscoring the importance of an informed definition of robustness. We conclude with a set of recommendations for problem framing: that alternatives should be searched rather than prespecified; dominant uncertainties should be discovered rather than assumed; and that a multivariate satisficing measure of robustness allows stakeholders to achieve their problem-specific performance requirements. This work highlights the importance of careful problem formulation, and provides a common vocabulary to link the robustness frameworks widely used in the field of water systems planning.
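As an illustration of the satisficing measure recommended above, the sketch below computes, for each planning alternative, the fraction of sampled states of the world in which all stakeholder-defined performance thresholds are met (a domain-criterion style metric). The data here are random stand-ins; in practice the performance array would come from simulating each alternative under each sampled state of the world, and the thresholds would be the stakeholders' problem-specific requirements.

```python
import numpy as np

def satisficing_robustness(performance, thresholds, senses):
    """Fraction of sampled states of the world in which an alternative meets
    every performance requirement.
    performance: array (n_alternatives, n_states, n_objectives)
    thresholds:  array (n_objectives,) of requirement levels
    senses:      list of 'min' (value must be >= threshold) or 'max' (must be <= threshold)
    """
    ok = np.ones(performance.shape[:2], dtype=bool)
    for j, (t, sense) in enumerate(zip(thresholds, senses)):
        if sense == 'min':
            ok &= performance[:, :, j] >= t
        else:
            ok &= performance[:, :, j] <= t
    return ok.mean(axis=1)   # robustness score per alternative

# Toy example: 3 alternatives, 1000 sampled states of the world, 2 objectives
# (reliability to be kept above 0.98, cost to be kept below 100).
rng = np.random.default_rng(42)
perf = np.stack([
    np.stack([rng.uniform(0.95, 1.00, 1000), rng.uniform(60, 140, 1000)], axis=1)
    for _ in range(3)
])
print(satisficing_robustness(perf, thresholds=np.array([0.98, 100.0]), senses=['min', 'max']))
```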
Gradient descent for robust kernel-based regression
NASA Astrophysics Data System (ADS)
Guo, Zheng-Chu; Hu, Ting; Shi, Lei
2018-06-01
In this paper, we study the gradient descent algorithm generated by a robust loss function over a reproducing kernel Hilbert space (RKHS). The loss function is defined by a windowing function G and a scale parameter σ, and can include a wide range of commonly used robust losses for regression. There is still a gap between the theoretical analysis and the optimization process of empirical risk minimization based on such losses: the estimator needs to be globally optimal in the theoretical analysis, while the optimization method cannot ensure the global optimality of its solutions. In this paper, we aim to fill this gap by developing a novel theoretical analysis of the performance of estimators generated by the gradient descent algorithm. We demonstrate that, with an appropriately chosen scale parameter σ, the gradient update with early stopping rules can approximate the regression function. Our error analysis leads to convergence in the standard L2 norm and the strong RKHS norm, both of which are optimal in the minimax sense. We show that the scale parameter σ plays an important role in providing robustness as well as fast convergence. Numerical experiments on synthetic examples and a real data set also support our theoretical results.
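A minimal numerical sketch of the kind of algorithm analyzed above, not the paper's exact setting: gradient descent in an RKHS with a Gaussian kernel and a Welsch-type windowed loss, with early stopping acting as regularization. The loss l(r) = (σ²/2)(1 - exp(-r²/σ²)) is one common choice of windowing; the kernel width, step size, stopping time, and outlier mechanism are all illustrative.

```python
import numpy as np

def gaussian_kernel(X, Y, width=0.2):
    return np.exp(-((X[:, None] - Y[None, :]) ** 2) / (2 * width ** 2))

rng = np.random.default_rng(0)
n = 200
X = rng.uniform(0, 1, n)
y = np.sin(2 * np.pi * X) + 0.1 * rng.normal(size=n)
y[rng.choice(n, 10, replace=False)] += 5.0      # gross outliers

K = gaussian_kernel(X, X)
alpha = np.zeros(n)     # f(x) = sum_j alpha_j K(x_j, x)
sigma = 1.0             # scale parameter of the windowing function
eta = 0.5               # step size
T = 300                 # early-stopping iteration; acts as the regularizer

for _ in range(T):
    residual = K @ alpha - y
    # Derivative of the Welsch-type loss (sigma^2/2)*(1 - exp(-r^2/sigma^2)):
    # psi(r) = r * exp(-r^2 / sigma^2); large residuals (outliers) are damped.
    psi = residual * np.exp(-(residual ** 2) / sigma ** 2)
    alpha -= eta * (K @ psi) / n

fit = K @ alpha
rmse = np.sqrt(np.mean((fit - np.sin(2 * np.pi * X)) ** 2))
print(f"RMSE of the fit against the noise-free target: {rmse:.3f}")
```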
2012-01-01
Background An integrative theoretical framework, developed for cross-disciplinary implementation and other behaviour change research, has been applied across a wide range of clinical situations. This study tests the validity of this framework. Methods Validity was investigated by behavioural experts sorting 112 unique theoretical constructs using closed and open sort tasks. The extent of replication was tested by Discriminant Content Validation and Fuzzy Cluster Analysis. Results There was good support for a refinement of the framework comprising 14 domains of theoretical constructs (average silhouette value 0.29): ‘Knowledge’, ‘Skills’, ‘Social/Professional Role and Identity’, ‘Beliefs about Capabilities’, ‘Optimism’, ‘Beliefs about Consequences’, ‘Reinforcement’, ‘Intentions’, ‘Goals’, ‘Memory, Attention and Decision Processes’, ‘Environmental Context and Resources’, ‘Social Influences’, ‘Emotions’, and ‘Behavioural Regulation’. Conclusions The refined Theoretical Domains Framework has a strengthened empirical base and provides a method for theoretically assessing implementation problems, as well as professional and other health-related behaviours as a basis for intervention development. PMID:22530986
Zhao, Junbo; Wang, Shaobu; Mili, Lamine; ...
2018-01-08
This paper develops a robust power system state estimation framework that considers measurement correlations and imperfect synchronization. In the framework, correlations of SCADA and phasor measurement unit (PMU) measurements are calculated separately through the unscented transformation and a vector auto-regression (VAR) model. In particular, PMU measurements during the waiting period between two SCADA measurement scans are buffered to develop the VAR model, whose parameters are robustly estimated using the projection statistics approach. The latter takes into account the temporal and spatial correlations of PMU measurements and provides redundant measurements to suppress bad data and mitigate imperfect synchronization. In cases where the SCADA and PMU measurements are not time synchronized, either the forecasted PMU measurements or the prior SCADA measurements from the last estimation run are leveraged to restore system observability. Then, a robust generalized maximum-likelihood (GM) estimator is extended to integrate measurement error correlations and to handle outliers in the SCADA and PMU measurements. Simulation results that stem from a comprehensive comparison with other alternatives under various conditions demonstrate the benefits of the proposed framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, Junbo; Wang, Shaobu; Mili, Lamine
This paper develops a robust power system state estimation framework that considers measurement correlations and imperfect synchronization. In the framework, correlations of SCADA and phasor measurement unit (PMU) measurements are calculated separately through the unscented transformation and a vector auto-regression (VAR) model. In particular, PMU measurements during the waiting period between two SCADA measurement scans are buffered to develop the VAR model, whose parameters are robustly estimated using the projection statistics approach. The latter takes into account the temporal and spatial correlations of PMU measurements and provides redundant measurements to suppress bad data and mitigate imperfect synchronization. In cases where the SCADA and PMU measurements are not time synchronized, either the forecasted PMU measurements or the prior SCADA measurements from the last estimation run are leveraged to restore system observability. Then, a robust generalized maximum-likelihood (GM) estimator is extended to integrate measurement error correlations and to handle outliers in the SCADA and PMU measurements. Simulation results that stem from a comprehensive comparison with other alternatives under various conditions demonstrate the benefits of the proposed framework.
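A minimal sketch of the robust estimation step only, not the full framework with PMU buffering, VAR forecasting, or projection statistics: a linear measurement model z = Hx + e solved by iteratively reweighted least squares with Huber-type weights, which downweights bad data relative to ordinary least squares. The threshold, scale estimate, and toy measurement matrix are illustrative.

```python
import numpy as np

def huber_weights(residuals, scale, k=1.5):
    """Huber-type weights: 1 inside the threshold, k*scale/|r| outside."""
    r = np.abs(residuals) / max(scale, 1e-12)
    w = np.ones_like(r)
    large = r > k
    w[large] = k / r[large]
    return w

def robust_state_estimate(H, z, iters=20):
    """IRLS state estimation: weighted least squares with Huber-type reweighting."""
    x = np.linalg.lstsq(H, z, rcond=None)[0]                  # ordinary LS start
    for _ in range(iters):
        r = z - H @ x
        scale = 1.4826 * np.median(np.abs(r - np.median(r)))  # robust scale (MAD)
        W = np.diag(huber_weights(r, scale))
        x = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)
    return x

# Toy example: 2 states, 8 measurements, one gross bad data point.
rng = np.random.default_rng(3)
H = rng.normal(size=(8, 2))
x_true = np.array([1.0, -2.0])
z = H @ x_true + 0.01 * rng.normal(size=8)
z[4] += 5.0                                                   # bad data
print("ordinary LS :", np.linalg.lstsq(H, z, rcond=None)[0])
print("robust IRLS :", robust_state_estimate(H, z))
```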
Many-objective robust decision making for water allocation under climate change.
Yan, Dan; Ludwig, Fulco; Huang, He Qing; Werners, Saskia E
2017-12-31
Water allocation is facing profound challenges due to climate change uncertainties. To identify adaptive water allocation strategies that are robust to these uncertainties, a model framework combining many-objective robust decision making and biophysical modeling is developed for large rivers. The framework was applied to the Pearl River basin (PRB), China, where sufficient flow to the delta is required to reduce saltwater intrusion in the dry season. Before identifying and assessing robust water allocation plans for the future, the performance of ten state-of-the-art MOEAs (multi-objective evolutionary algorithms) is evaluated on the water allocation problem in the PRB. The Borg multi-objective evolutionary algorithm (Borg MOEA), a self-adaptive optimization algorithm, has the best performance over the historical periods; it is therefore selected to generate new water allocation plans for the future (2079-2099). This study shows that robust decision making using carefully selected MOEAs can help limit saltwater intrusion in the Pearl River Delta. However, the framework could perform poorly under larger-than-expected climate change impacts on water availability. Results also show that subjective design choices by the researchers and/or water managers could limit the model framework's effectiveness and cause the most robust water allocation plans to fail under future climate change. Developing robust allocation plans in a river basin suffering from increasing water shortage requires researchers and water managers to carefully characterize future climate change in the study region and the vulnerabilities of their tools. Copyright © 2017 Elsevier B.V. All rights reserved.
Robustness Analysis of Integrated LPV-FDI Filters and LTI-FTC System for a Transport Aircraft
NASA Technical Reports Server (NTRS)
Khong, Thuan H.; Shin, Jong-Yeob
2007-01-01
This paper proposes an analysis framework for robustness analysis of a nonlinear dynamic system that can be represented by a polynomial linear parameter-varying (PLPV) system with constant bounded uncertainty. The proposed analysis framework contains three key tools: 1) a function substitution method, which can convert a nonlinear system in polynomial form into a PLPV system; 2) a matrix-based linear fractional transformation (LFT) modeling approach, which can convert a PLPV system into an LFT system whose delta block includes the key uncertainty and scheduling parameters; and 3) mu-analysis, which is a well-known robustness analysis tool for linear systems. The proposed analysis framework is applied to evaluating the performance of the LPV fault detection and isolation (FDI) filters in the closed-loop system of a transport aircraft in the presence of unmodeled actuator dynamics and sensor gain uncertainty. The robustness analysis results are compared with nonlinear time simulations.
Theory on the Dynamics of Oscillatory Loops in the Transcription Factor Networks
Murugan, Rajamanickam
2014-01-01
We develop a detailed theoretical framework for various types of transcription factor gene oscillators. We further demonstrate that one can build genetic oscillators that are tunable and robust against perturbations in the critical control parameters by coupling two or more independent Goodwin-Griffith oscillators through either -OR- or -AND- type logic. Most of the coupled oscillators constructed in the literature so far seem to be of the -OR- type. When there are transient perturbations in one of the -OR- type coupled oscillators, the overall period of the system remains constant (period buffering), whereas in the case of -AND- type coupling the overall period of the system moves towards that of the perturbed oscillator. Although there is period buffering, the amplitudes of oscillators coupled through -OR- type logic are more sensitive to perturbations in the parameters associated with the promoter state dynamics than those coupled through -AND- type logic. Further analysis shows that the period of -AND- type coupled dual-feedback oscillators can be tuned without compromising their amplitudes. Using these results, we derive the basic design principles governing robust and tunable synthetic gene oscillators that do not compromise on their amplitudes. PMID:25111803
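To make the -OR- versus -AND- coupling concrete, here is a toy ODE sketch of two Goodwin-type transcription factor oscillators whose transcription terms are coupled through the repressors of both loops. The equations, gains, Hill coefficient, and coupling forms are illustrative stand-ins chosen to favor oscillations, not the models analyzed in the paper, and the parameters may need tuning for a sustained limit cycle.

```python
import numpy as np
from scipy.integrate import solve_ivp

def coupled_goodwin(t, y, logic="AND", a=10.0, b=0.5, b2=0.55, n=10):
    """Two 3-variable Goodwin-type oscillators (mRNA m, enzyme e, repressor p)
    whose transcription is driven by the repression states of BOTH loops."""
    m1, e1, p1, m2, e2, p2 = y
    r1, r2 = 1.0 / (1.0 + p1 ** n), 1.0 / (1.0 + p2 ** n)
    if logic == "AND":
        drive = r1 * r2              # both promoters must be de-repressed
    else:                            # "OR"-like coupling
        drive = 0.5 * (r1 + r2)      # either de-repressed promoter suffices
    dm1, de1, dp1 = a * drive - b * m1, m1 - b * e1, e1 - b * p1
    dm2, de2, dp2 = a * drive - b2 * m2, m2 - b2 * e2, e2 - b2 * p2
    return [dm1, de1, dp1, dm2, de2, dp2]

y0 = [0.1, 0.2, 0.3, 0.4, 0.1, 0.2]
sol = solve_ivp(coupled_goodwin, (0, 200), y0, args=("AND",), max_step=0.1)
p1 = sol.y[2]
print("repressor 1 range over the run:", float(p1.min()), "to", float(p1.max()))
```

Switching args to ("OR",) swaps the coupling logic, which is the comparison the abstract is concerned with.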
The HTM Spatial Pooler-A Neocortical Algorithm for Online Sparse Distributed Coding.
Cui, Yuwei; Ahmad, Subutai; Hawkins, Jeff
2017-01-01
Hierarchical temporal memory (HTM) provides a theoretical framework that models several key computational principles of the neocortex. In this paper, we analyze an important component of HTM, the HTM spatial pooler (SP). The SP models how neurons learn feedforward connections and form efficient representations of the input. It converts arbitrary binary input patterns into sparse distributed representations (SDRs) using a combination of competitive Hebbian learning rules and homeostatic excitability control. We describe a number of key properties of the SP, including fast adaptation to changing input statistics, improved noise robustness through learning, efficient use of cells, and robustness to cell death. In order to quantify these properties we develop a set of metrics that can be directly computed from the SP outputs. We show how the properties are met using these metrics and targeted artificial simulations. We then demonstrate the value of the SP in a complete end-to-end real-world HTM system. We discuss the relationship with neuroscience and previous studies of sparse coding. The HTM spatial pooler represents a neurally inspired algorithm for learning sparse representations from noisy data streams in an online fashion.
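A stripped-down sketch of the spatial pooler's core loop, not the full HTM implementation with topology, duty cycles, and boosting schedules: compute overlaps between a binary input and each column's connected synapses, pick the top-k columns as the sparse distributed output, and apply the Hebbian permanence update that strengthens synapses aligned with active input bits and weakens the rest (applied here to all of a winning column's synapses for simplicity). Sizes, thresholds, and increments are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_columns = 256, 128
sparsity = 0.02                      # roughly 2% of columns active per step
perm = rng.uniform(0.0, 0.6, size=(n_columns, n_inputs))   # synapse permanences
connected_thresh, p_inc, p_dec = 0.5, 0.05, 0.02

def spatial_pooler_step(x):
    """One SP step: overlap -> k-winners-take-all -> Hebbian permanence update."""
    connected = perm >= connected_thresh          # boolean connected-synapse matrix
    overlaps = connected @ x                       # overlap score per column
    k = max(1, int(sparsity * n_columns))
    active = np.argsort(overlaps)[-k:]             # indices of winning columns
    for c in active:
        # Strengthen synapses to active input bits, weaken the others; clip to [0, 1].
        perm[c] += np.where(x > 0, p_inc, -p_dec)
        np.clip(perm[c], 0.0, 1.0, out=perm[c])
    sdr = np.zeros(n_columns, dtype=int)
    sdr[active] = 1
    return sdr

# Feed a stream of noisy versions of one binary pattern; the SDR tends to stabilize.
base = (rng.random(n_inputs) < 0.1).astype(float)
for _ in range(50):
    noisy = base.copy()
    flip = rng.choice(n_inputs, 5, replace=False)
    noisy[flip] = 1.0 - noisy[flip]
    sdr = spatial_pooler_step(noisy)
print("active columns in the final SDR:", np.nonzero(sdr)[0])
```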
A bargaining game analysis of international climate negotiations
NASA Astrophysics Data System (ADS)
Smead, Rory; Sandler, Ronald L.; Forber, Patrick; Basl, John
2014-06-01
Climate negotiations under the United Nations Framework Convention on Climate Change have so far failed to achieve a robust international agreement to reduce greenhouse gas emissions. Game theory has been used to investigate possible climate negotiation solutions and strategies for accomplishing them. Negotiations have been primarily modelled as public goods games such as the Prisoner's Dilemma, though coordination games or games of conflict have also been used. Many of these models have solutions, in the form of equilibria, corresponding to possible positive outcomes--that is, agreements with the requisite emissions reduction commitments. Other work on large-scale social dilemmas suggests that it should be possible to resolve the climate problem. It therefore seems that equilibrium selection may be a barrier to successful negotiations. Here we use an N-player bargaining game in an agent-based model with learning dynamics to examine the past failures of and future prospects for a robust international climate agreement. The model suggests reasons why the desirable solutions identified in previous game-theoretic models have not yet been accomplished in practice and what mechanisms might be used to achieve these solutions.
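As an illustration of the kind of agent-based model the abstract describes, the sketch below runs an N-player Nash-demand-style bargaining game with Roth-Erev reinforcement learning. The payoff structure, demand grid, and learning rule are assumptions; the paper's actual model details are not given in the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

N_PLAYERS, ROUNDS = 5, 20000
DEMANDS = np.array([0.1, 0.2, 0.3, 0.4])      # discrete shares each agent can claim

# Roth-Erev style reinforcement learning: one propensity per possible demand.
propensity = np.ones((N_PLAYERS, len(DEMANDS)))

for _ in range(ROUNDS):
    probs = propensity / propensity.sum(axis=1, keepdims=True)
    choice = np.array([rng.choice(len(DEMANDS), p=p) for p in probs])
    demands = DEMANDS[choice]
    # Bargaining over a unit resource: everyone gets their demand only if compatible.
    payoffs = demands if demands.sum() <= 1.0 else np.zeros(N_PLAYERS)
    propensity[np.arange(N_PLAYERS), choice] += payoffs    # reinforce successful demands

print("modal demand per agent:", DEMANDS[propensity.argmax(axis=1)])
print("sum of modal demands:", DEMANDS[propensity.argmax(axis=1)].sum())
```

Which of the many compatible demand profiles the agents settle on depends on the learning history, which is the equilibrium-selection problem the article highlights.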
Pinchevsky, Gillian M
2016-05-22
This study fills a gap in the literature by exploring the utility of contemporary courtroom theoretical frameworks (uncertainty avoidance, causal attribution, and focal concerns) for explaining decision-making in specialized domestic violence courts. Using data from two specialized domestic violence courts, this study explores the predictors of prosecutorial and judicial decision-making and the extent to which these factors are congruent with theoretical frameworks often used in studies of court processing. Findings suggest that these theoretical frameworks only partially help explain decision-making in the courts under study. A discussion of the findings and implications for future research is provided. © The Author(s) 2016.
New robust statistical procedures for the polytomous logistic regression models.
Castilla, Elena; Ghosh, Abhik; Martin, Nirian; Pardo, Leandro
2018-05-17
This article derives a new family of estimators, namely the minimum density power divergence estimators, as a robust generalization of the maximum likelihood estimator for the polytomous logistic regression model. Based on these estimators, a family of Wald-type test statistics for linear hypotheses is introduced. Robustness properties of both the proposed estimators and the test statistics are theoretically studied through the classical influence function analysis. Appropriate real-life examples are presented to justify the requirement of suitable robust statistical procedures in place of the likelihood-based inference for the polytomous logistic regression model. The validity of the theoretical results established in the article is further confirmed empirically through suitable simulation studies. Finally, an approach for the data-driven selection of the robustness tuning parameter is proposed with empirical justifications. © 2018, The International Biometric Society.
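For readers who want to see what such an estimator minimizes, a minimal sketch of the empirical density power divergence objective for a reference-category multinomial logit follows; alpha = 0 recovers the negative log-likelihood, while larger alpha down-weights observations the model finds implausible. The data, optimizer, and tuning value are illustrative assumptions; the paper's own algorithm and tuning-parameter selection are not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def dpd_objective(beta_flat, X, y, n_classes, alpha):
    """Empirical density power divergence objective for a multinomial logit model."""
    n, p = X.shape
    beta = beta_flat.reshape(n_classes - 1, p)              # last class as reference
    logits = np.column_stack([X @ beta.T, np.zeros(n)])
    probs = softmax(logits)
    f_y = probs[np.arange(n), y]
    if alpha == 0.0:
        return -np.log(f_y).mean()                          # maximum likelihood limit
    return ((probs ** (1.0 + alpha)).sum(axis=1)
            - (1.0 + 1.0 / alpha) * f_y ** alpha).mean()

# toy data with a few mislabeled (outlying) points
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(300), rng.normal(size=(300, 2))])
true_beta = np.array([[1.0, 2.0, -1.0], [-0.5, -2.0, 1.0]])
y = np.array([rng.choice(3, p=p)
              for p in softmax(np.column_stack([X @ true_beta.T, np.zeros(300)]))])
y[:15] = (y[:15] + 1) % 3                                    # contamination

for alpha in (0.0, 0.5):
    res = minimize(dpd_objective, np.zeros(2 * 3), args=(X, y, 3, alpha), method="BFGS")
    print(f"alpha={alpha}:", np.round(res.x.reshape(2, 3), 2))
```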
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Jinshui; Qiao, Zhen-An; Mahurin, Shannon Mark
A soft chemistry synthetic strategy based on a Friedel-Crafts alkylation reaction is developed for the textural engineering of phenolic resin (PR) with a robust mesoporous framework to avoid serious framework shrinkage and maximize retention of organic functional moieties. By taking advantage of the structural benefits of molecular bridges, the resultant sample maintains a bimodal micro-mesoporous architecture with well-preserved organic functional groups, which is effective for carbon capture. Furthermore, this soft chemistry synthetic protocol can be further extended to nanotexture other aromatic-based polymers with robust frameworks.
Conceptualizing and assessing improvement capability: a review
Boaden, Ruth; Walshe, Kieran
2017-01-01
Purpose: The literature is reviewed to examine how ‘improvement capability’ is conceptualized and assessed and to identify future areas for research. Data sources: An iterative and systematic search of the literature was carried out across all sectors including healthcare. The search was limited to literature written in English. Data extraction: The study identifies and analyses 70 instruments and frameworks for assessing or measuring improvement capability. Information about the source of the instruments, the sectors in which they were developed or used, the measurement constructs or domains they employ, and how they were tested was extracted. Results of data synthesis: The instruments and framework constructs are very heterogeneous, demonstrating the ambiguity of improvement capability as a concept, and the difficulties involved in its operationalisation. Two-thirds of the instruments and frameworks have been subject to tests of reliability and half to tests of validity. Many instruments have little apparent theoretical basis and do not seem to have been used widely. Conclusion: The assessment and development of improvement capability needs clearer and more consistent conceptual and terminological definition, used consistently across disciplines and sectors. There is scope to learn from existing instruments and frameworks, and this study proposes a synthetic framework of eight dimensions of improvement capability. Future instruments need robust testing for reliability and validity. This study contributes to practice and research by presenting the first review of the literature on improvement capability across all sectors including healthcare. PMID:28992146
Intelligent and robust optimization frameworks for smart grids
NASA Astrophysics Data System (ADS)
Dhansri, Naren Reddy
A smart grid implies a cyberspace real-time distributed power control system to optimally deliver electricity based on varying consumer characteristics. Although smart grids solve many of the contemporary problems, they give rise to new control and optimization problems with the growing role of renewable energy sources such as wind or solar energy. Under the highly dynamic nature of distributed power generation and the varying consumer demand and cost requirements, the total power output of the grid should be controlled such that the load demand is met by giving a higher priority to renewable energy sources. Hence, the power generated from renewable energy sources should be optimized while minimizing the generation from non-renewable energy sources. This research develops a demand-based automatic generation control and optimization framework for real-time smart grid operations by integrating conventional and renewable energy sources under varying consumer demand and cost requirements. Focusing on the renewable energy sources, the intelligent and robust control frameworks optimize the power generation by tracking the consumer demand in a closed-loop control framework, yielding superior economic and ecological benefits, circumventing nonlinear model complexities, and handling uncertainties for superior real-time operation. The proposed intelligent system framework optimizes the smart grid power generation for maximum economic and ecological benefit under an uncertain renewable wind energy source. The numerical results demonstrate that the proposed framework is a viable approach to integrating various energy sources for real-time smart grid implementations. The robust optimization framework results demonstrate the effectiveness of the robust controllers under bounded power plant model uncertainties and exogenous wind input excitation while maximizing economic and ecological performance objectives. Therefore, the proposed framework offers a new worst-case deterministic optimization algorithm for smart grid automatic generation control.
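A toy merit-order sketch of the renewables-first dispatch priority described above; the function, names, and numbers are assumptions, and the dissertation's actual controllers are closed-loop and handle uncertainty explicitly.

```python
def dispatch(demand_mw, wind_mw, solar_mw, conventional_capacity_mw):
    """Greedy merit-order dispatch: renewables serve the load first, conventional units fill the gap."""
    renewable_used = min(demand_mw, wind_mw + solar_mw)
    residual = demand_mw - renewable_used
    conventional_used = min(residual, conventional_capacity_mw)
    unserved = residual - conventional_used
    curtailed = (wind_mw + solar_mw) - renewable_used
    return {"renewable": renewable_used, "conventional": conventional_used,
            "unserved": unserved, "curtailed": curtailed}

print(dispatch(demand_mw=900.0, wind_mw=350.0, solar_mw=120.0, conventional_capacity_mw=600.0))
```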
How Robust is Your System Resilience?
NASA Astrophysics Data System (ADS)
Homayounfar, M.; Muneepeerakul, R.
2017-12-01
Robustness and resilience are concepts in systems thinking that have grown in importance and popularity. For many complex social-ecological systems, however, robustness and resilience are difficult to quantify and the connections and trade-offs between them difficult to study. Most studies have either focused on qualitative approaches to discuss their connections or considered only one of them under particular classes of disturbances. In this study, we present an analytical framework to address the linkage between robustness and resilience more systematically. Our analysis is based on a stylized dynamical model that operationalizes a widely used conceptual framework for social-ecological systems. The model enables us to rigorously define robustness and resilience and consequently investigate their connections. The results reveal the tradeoffs among performance, robustness, and resilience. They also show how the nature of such tradeoffs varies with the choices of certain policies (e.g., taxation and investment in public infrastructure), internal stresses, and external disturbances.
A Theoretically Consistent Framework for Modelling Lagrangian Particle Deposition in Plant Canopies
NASA Astrophysics Data System (ADS)
Bailey, Brian N.; Stoll, Rob; Pardyjak, Eric R.
2018-06-01
We present a theoretically consistent framework for modelling Lagrangian particle deposition in plant canopies. The primary focus is on describing the probability of particles encountering canopy elements (i.e., potential deposition), and the framework provides a consistent means for including the effects of imperfect deposition through any appropriate sub-model for deposition efficiency. Some aspects of the framework draw upon an analogy to radiation propagation through a turbid medium to develop the model theory. The present method is compared against one of the most commonly used heuristic Lagrangian frameworks, namely that originally developed by Legg and Powell (Agricultural Meteorology, 1979, Vol. 20, 47-67), which is shown to be theoretically inconsistent. A recommendation is made to discontinue the use of this heuristic approach in favour of the theoretically consistent framework developed herein, which is no more difficult to apply under equivalent assumptions. The proposed framework has the additional advantage that it can be applied to arbitrary canopy geometries given readily measurable parameters describing vegetation structure.
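A hedged sketch of the turbid-medium analogy for potential deposition: interception probability accumulates along the trajectory as 1 - exp(-G a(z) ds). The projection coefficient, leaf area density profile, and particle path below are assumptions; the paper's full formulation, including the deposition-efficiency sub-model, is not shown.

```python
import numpy as np

def encounter_probability(path_z, leaf_area_density, G=0.5):
    """Probability that a Lagrangian particle encounters a canopy element along its path.

    Treats the canopy as a turbid medium: over each path increment ds the interception
    probability is 1 - exp(-G * a(z) * ds), with a(z) the one-sided leaf area density
    (m^2 m^-3) and G an assumed projection coefficient. Survival multiplies along the path.
    """
    survival = 1.0
    for z0, z1 in zip(path_z[:-1], path_z[1:]):
        ds = abs(z1 - z0)                        # path increment travelled inside the canopy
        a = leaf_area_density(0.5 * (z0 + z1))   # leaf area density at the segment midpoint
        survival *= np.exp(-G * a * ds)
    return 1.0 - survival   # potential deposition; multiply by a deposition-efficiency sub-model if needed

# Example: triangular leaf area density peaking mid-canopy (canopy height 10 m, values assumed).
lad = lambda z: 0.6 * max(0.0, 1.0 - abs(z - 5.0) / 5.0)
path = np.linspace(9.0, 1.0, 160)                # a particle settling through the canopy
print("potential deposition probability:", round(encounter_probability(path, lad), 3))
```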
ERIC Educational Resources Information Center
Kohli, Nidhi; Koran, Jennifer; Henn, Lisa
2015-01-01
There are well-defined theoretical differences between the classical test theory (CTT) and item response theory (IRT) frameworks. It is understood that in the CTT framework, person and item statistics are test- and sample-dependent. This is not the perception with IRT. For this reason, the IRT framework is considered to be theoretically superior…
Effect of interaction strength on robustness of controlling edge dynamics in complex networks
NASA Astrophysics Data System (ADS)
Pang, Shao-Peng; Hao, Fei
2018-05-01
Robustness plays a critical role in the controllability of complex networks to withstand failures and perturbations. Recent advances in edge controllability show that the interaction strength among edges plays a more important role than network structure. Therefore, we focus on the effect of interaction strength on the robustness of edge controllability. Using a classification of all edges into three categories to quantify robustness, we develop a universal framework to evaluate and analyze the robustness of complex networks with arbitrary structures and interaction strengths. Applying our framework to a large number of model and real-world networks, we find that the interaction strength is a dominant factor for the robustness of undirected networks. Meanwhile, the strongest robustness and the optimal edge controllability in undirected networks can be achieved simultaneously. Unlike the case of undirected networks, the robustness of directed networks is determined jointly by the interaction strength and the network's degree distribution. Moreover, a stronger robustness is usually associated with a larger number of driver nodes required to maintain full control in directed networks. This prompts us to provide an optimization method that adjusts the interaction strength to optimize the robustness of edge controllability.
LOX droplet vaporization in a supercritical forced convective environment
NASA Technical Reports Server (NTRS)
Hsiao, Chia-Chun; Yang, Vigor
1993-01-01
Modern liquid rocket engines often use liquid oxygen (LOX) and liquid hydrogen (LH2) as propellants to achieve high performance, with the engine operational conditions in the supercritical regimes of the propellants. Once the propellant exceeds its critical state, it essentially becomes a puff of dense fluid. The entire field becomes a continuous medium, and no distinct interfacial boundary between the liquid and gas exists. Although several studies have been undertaken to investigate the supercritical droplet behavior at quiescent conditions, very little effort has been made to address the fundamental mechanisms associated with LOX droplet vaporization in a supercritical, forced convective environment. The purpose is to establish a theoretical framework within which supercritical droplet dynamics and vaporization can be studied systematically by means of an efficient and robust numerical algorithm.
Establishing an Explanatory Model for Mathematics Identity.
Cribbs, Jennifer D; Hazari, Zahra; Sonnert, Gerhard; Sadler, Philip M
2015-04-01
This article empirically tests a previously developed theoretical framework for mathematics identity based on students' beliefs. The study employs data from more than 9,000 college calculus students across the United States to build a robust structural equation model. While it is generally thought that students' beliefs about their own competence in mathematics directly impact their identity as a "math person," findings indicate that students' self-perceptions related to competence and performance have an indirect effect on their mathematics identity, primarily by association with students' interest and external recognition in mathematics. Thus, the model indicates that students' competence and performance beliefs are not sufficient for their mathematics identity development, and it highlights the roles of interest and recognition. © 2015 The Authors. Child Development © 2015 Society for Research in Child Development, Inc.
ERIC Educational Resources Information Center
Styres, Sandra D.; Zinga, Dawn M.
2013-01-01
This article introduces an emergent research theoretical framework, the community-first Land-centred research framework. Carefully examining the literature within Indigenous educational research, we noted the limited approaches for engaging in culturally aligned and relevant research within Indigenous communities. The community-first Land-centred…
An e-Learning Theoretical Framework
ERIC Educational Resources Information Center
Aparicio, Manuela; Bacao, Fernando; Oliveira, Tiago
2016-01-01
E-learning systems have witnessed a usage and research increase in the past decade. This article presents the e-learning concepts ecosystem. It summarizes the various scopes on e-learning studies. Here we propose an e-learning theoretical framework. This theory framework is based upon three principal dimensions: users, technology, and services…
Threshold Capabilities: Threshold Concepts and Knowledge Capability Linked through Variation Theory
ERIC Educational Resources Information Center
Baillie, Caroline; Bowden, John A.; Meyer, Jan H. F.
2013-01-01
The Threshold Capability Integrated Theoretical Framework (TCITF) is presented as a framework for the design of university curricula, aimed at developing graduates' capability to deal with previously unseen situations in their professional, social, and personal lives. The TCITF is a new theoretical framework derived from, and heavily dependent…
A metamorphic inorganic framework that can be switched between eight single-crystalline states
NASA Astrophysics Data System (ADS)
Zhan, Caihong; Cameron, Jamie M.; Gabb, David; Boyd, Thomas; Winter, Ross S.; Vilà-Nadal, Laia; Mitchell, Scott G.; Glatzel, Stefan; Breternitz, Joachim; Gregory, Duncan H.; Long, De-Liang; MacDonell, Andrew; Cronin, Leroy
2017-02-01
The design of highly flexible framework materials requires organic linkers, whereas inorganic materials are more robust but inflexible. Here, by using linkable inorganic rings made up of tungsten oxide (P₈W₄₈O₁₈₄) building blocks, we synthesized an inorganic single-crystal material that can undergo at least eight different crystal-to-crystal transformations, with gigantic crystal volume contraction and expansion changes ranging from -2,170 to +1,720 Å³ with no reduction in crystallinity. Not only does this material undergo the largest single crystal-to-single crystal volume transformation thus far reported (to the best of our knowledge), the system also shows conformational flexibility while maintaining robustness over several cycles in the reversible uptake and release of guest molecules, switching the crystal between different metamorphic states. This material combines the robustness of inorganic materials with the flexibility of organic frameworks, thereby challenging the notion that flexibility and robustness are mutually exclusive.
Robust, Efficient Depth Reconstruction With Hierarchical Confidence-Based Matching.
Sun, Li; Chen, Ke; Song, Mingli; Tao, Dacheng; Chen, Gang; Chen, Chun
2017-07-01
In recent years, taking photos and capturing videos with mobile devices have become increasingly popular. Emerging applications based on the depth reconstruction technique have been developed, such as Google lens blur. However, depth reconstruction is difficult due to occlusions, non-diffuse surfaces, repetitive patterns, and textureless surfaces, and it has become more difficult due to the unstable image quality and uncontrolled scene condition in the mobile setting. In this paper, we present a novel hierarchical framework with multi-view confidence-based matching for robust, efficient depth reconstruction in uncontrolled scenes. Particularly, the proposed framework combines local cost aggregation with global cost optimization in a complementary manner that increases efficiency and accuracy. A depth map is efficiently obtained in a coarse-to-fine manner by using an image pyramid. Moreover, confidence maps are computed to robustly fuse multi-view matching cues, and to constrain the stereo matching on a finer scale. The proposed framework has been evaluated with challenging indoor and outdoor scenes, and has achieved robust and efficient depth reconstruction.
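A single-level sketch of confidence-based matching: brute-force SAD block matching with a best/second-best confidence map. In the paper's framework this would run coarse-to-fine on an image pyramid, with the confidence maps fusing multi-view cues and constraining finer levels; the toy images and thresholds here are assumptions.

```python
import numpy as np

def match_scanline(left, right, max_disp=16, patch=5):
    """SAD block matching on rectified grayscale images with a simple confidence map:
    1 - best_cost / second_best_cost, which is high when the match is unambiguous."""
    h, w = left.shape
    half = patch // 2
    disparity = np.zeros((h, w), dtype=np.float32)
    confidence = np.zeros((h, w), dtype=np.float32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            ref = left[y - half:y + half + 1, x - half:x + half + 1]
            costs = np.array([np.abs(ref - right[y - half:y + half + 1,
                                                  x - d - half:x - d + half + 1]).sum()
                              for d in range(max_disp)])
            order = np.argsort(costs)
            disparity[y, x] = order[0]
            confidence[y, x] = 1.0 - costs[order[0]] / (costs[order[1]] + 1e-6)
    return disparity, confidence

# toy rectified pair: random texture shifted by 6 px (true disparity = 6)
rng = np.random.default_rng(0)
left = rng.random((40, 80))
right = np.zeros_like(left)
right[:, :-6] = left[:, 6:]
disp, conf = match_scanline(left, right)
valid = conf > 0.5
print("median disparity in confident region:", float(np.median(disp[valid])))
```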
Robustness Analysis and Optimally Robust Control Design via Sum-of-Squares
NASA Technical Reports Server (NTRS)
Dorobantu, Andrei; Crespo, Luis G.; Seiler, Peter J.
2012-01-01
A control analysis and design framework is proposed for systems subject to parametric uncertainty. The underlying strategies are based on sum-of-squares (SOS) polynomial analysis and nonlinear optimization to design an optimally robust controller. The approach determines a maximum uncertainty range for which the closed-loop system satisfies a set of stability and performance requirements. These requirements, defined as inequality constraints on several metrics, are restricted to polynomial functions of the uncertainty. To quantify robustness, SOS analysis is used to prove that the closed-loop system complies with the requirements for a given uncertainty range. The maximum uncertainty range, calculated by assessing a sequence of increasingly larger ranges, serves as a robustness metric for the closed-loop system. To optimize the control design, nonlinear optimization is used to enlarge the maximum uncertainty range by tuning the controller gains. Hence, the resulting controller is optimally robust to parametric uncertainty. This approach balances the robustness margins corresponding to each requirement in order to maximize the aggregate system robustness. The proposed framework is applied to a simple linear short-period aircraft model with uncertain aerodynamic coefficients.
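The "sequence of increasingly larger ranges" step can be read as a growth-then-bisection search wrapped around an SOS feasibility solve. The sketch below uses a hypothetical certify_with_sos callback in place of the actual SOS program, which is the expensive part not reproduced here.

```python
def max_certified_range(certify_with_sos, delta_hi=1.0, tol=1e-3):
    """Largest uncertainty range [-d, d] for which an SOS certificate exists.

    `certify_with_sos(d)` is a placeholder: it should return True when an SOS
    feasibility problem proves all stability/performance requirements hold for
    every uncertainty value in [-d, d].
    """
    lo, hi = 0.0, delta_hi
    if not certify_with_sos(lo):
        return 0.0
    # grow the candidate range until certification fails, then bisect
    while certify_with_sos(hi):
        lo, hi = hi, 2.0 * hi
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if certify_with_sos(mid) else (lo, mid)
    return lo

# stand-in certificate: pretend the requirements hold up to |delta| = 0.37
print(max_certified_range(lambda d: d <= 0.37))
```

Wrapping this search inside a nonlinear optimization over the controller gains yields the "optimally robust" design the abstract describes.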
Calcium (Ca²⁺) waves data calibration and analysis using image processing techniques
2013-01-01
Background: Calcium (Ca²⁺) propagates within tissues serving as an important information carrier. In particular, cilia beat frequency in oviduct cells is partially regulated by Ca²⁺ changes. Thus, measuring the calcium density and characterizing the traveling wave plays a key role in understanding biological phenomena. However, current methods to measure propagation velocities and other wave characteristics involve several manual or time-consuming procedures. This limits the amount of information that can be extracted, and the statistical quality of the analysis. Results: Our work provides a framework based on image processing procedures that enables a fast, automatic and robust characterization of data from two-filter fluorescence Ca²⁺ experiments. We calculate the mean velocity of the wave-front, and use theoretical models to extract meaningful parameters like wave amplitude, decay rate and time of excitation. Conclusions: Measurements done by different operators showed a high degree of reproducibility. This framework is also extended to single-filter fluorescence experiments, allowing higher sampling rates, and thus an increased accuracy in velocity measurements. PMID:23679062
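A minimal sketch of the two measurements named above, wave-front velocity and decay rate, run on a synthetic kymograph; the threshold, geometry, and exponential decay model are illustrative assumptions rather than the paper's calibrated pipeline.

```python
import numpy as np

def wavefront_velocity(frames, dt, dx, threshold=0.5):
    """Mean wave-front speed from a (time, position) fluorescence kymograph.

    For every position, the arrival time is the first frame whose normalised intensity
    exceeds `threshold`; a straight-line fit of position vs. arrival time gives the speed.
    """
    norm = (frames - frames.min(0)) / (np.ptp(frames, axis=0) + 1e-12)
    arrival = np.argmax(norm > threshold, axis=0).astype(float) * dt
    pos = np.arange(frames.shape[1]) * dx
    slope, _ = np.polyfit(arrival, pos, 1)
    return slope                                      # e.g. um/s

def decay_rate(trace, dt):
    """Exponential decay rate fitted to the falling phase of a single-pixel trace."""
    peak = int(np.argmax(trace))
    tail = trace[peak:] - trace.min() + 1e-12
    t = np.arange(tail.size) * dt
    k, _ = np.polyfit(t, np.log(tail), 1)
    return -k

# synthetic kymograph: a front travelling at 25 um/s with exponential decay behind it
dt, dx = 0.05, 2.0
t = np.arange(0, 8, dt)[:, None]
x = np.arange(0, 200, dx)[None, :]
arrival = x / 25.0
kymo = np.where(t >= arrival, np.exp(-(t - arrival) / 1.5), 0.0)
print("velocity ~", round(wavefront_velocity(kymo, dt, dx), 1), "um/s")
print("decay rate ~", round(decay_rate(kymo[:, 50], dt), 2), "1/s")
```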
Probabilistic inference in discrete spaces can be implemented into networks of LIF neurons.
Probst, Dimitri; Petrovici, Mihai A; Bytschok, Ilja; Bill, Johannes; Pecevski, Dejan; Schemmel, Johannes; Meier, Karlheinz
2015-01-01
The means by which cortical neural networks are able to efficiently solve inference problems remains an open question in computational neuroscience. Recently, abstract models of Bayesian computation in neural circuits have been proposed, but they lack a mechanistic interpretation at the single-cell level. In this article, we describe a complete theoretical framework for building networks of leaky integrate-and-fire neurons that can sample from arbitrary probability distributions over binary random variables. We test our framework for a model inference task based on a psychophysical phenomenon (the Knill-Kersten optical illusion) and further assess its performance when applied to randomly generated distributions. As the local computations performed by the network strongly depend on the interaction between neurons, we compare several types of couplings mediated by either single synapses or interneuron chains. Due to its robustness to substrate imperfections such as parameter noise and background noise correlations, our model is particularly interesting for implementation on novel, neuro-inspired computing architectures, which can thereby serve as a fast, low-power substrate for solving real-world inference problems.
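At the network level, the computation being implemented is sampling from a Boltzmann-type distribution over binary variables; the sketch below does this with textbook Gibbs updates and omits the LIF-specific mechanics (membrane dynamics, synaptic kernels, couplings via interneuron chains) that the paper actually maps onto. Weights and biases are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: p(z) proportional to exp(z^T W z / 2 + b^T z) over binary z, W symmetric, zero diagonal.
W = np.array([[0.0, 1.2, -0.8],
              [1.2, 0.0,  0.5],
              [-0.8, 0.5, 0.0]])
b = np.array([-0.3, 0.1, 0.2])

def gibbs_sample(n_steps=60000, burn_in=1000):
    z = rng.integers(0, 2, size=3).astype(float)
    samples = []
    for step in range(n_steps):
        k = step % 3                                   # update one unit at a time
        u = W[k] @ z + b[k]                            # abstract "membrane potential"
        z[k] = float(rng.random() < 1.0 / (1.0 + np.exp(-u)))
        if step >= burn_in:
            samples.append(z.copy())
    return np.array(samples)

samples = gibbs_sample()
states, counts = np.unique(samples, axis=0, return_counts=True)
empirical = counts / counts.sum()

# exact distribution for comparison
zs = np.array([[i, j, k] for i in (0, 1) for j in (0, 1) for k in (0, 1)], dtype=float)
energies = np.array([z @ W @ z / 2 + b @ z for z in zs])
exact = np.exp(energies) / np.exp(energies).sum()
for z, p in zip(zs, exact):
    match = empirical[(states == z).all(axis=1)]
    print(z.astype(int), "exact", round(p, 3),
          "sampled", round(float(match[0]) if match.size else 0.0, 3))
```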
Understanding Collective Activities of People from Videos.
Wongun Choi; Savarese, Silvio
2014-06-01
This paper presents a principled framework for analyzing collective activities at different levels of semantic granularity from videos. Our framework is capable of jointly tracking multiple individuals, recognizing activities performed by individuals in isolation (i.e., atomic activities such as walking or standing), recognizing the interactions between pairs of individuals (i.e., interaction activities), as well as understanding the activities of groups of individuals (i.e., collective activities). A key property of our work is that it can coherently combine bottom-up information stemming from detections or fragments of tracks (or tracklets) with top-down evidence. Top-down evidence is provided by a newly proposed descriptor that captures the coherent behavior of groups of individuals in a spatial-temporal neighborhood of the sequence. Top-down evidence provides contextual information for establishing accurate associations between detections or tracklets across frames and, thus, for obtaining more robust tracking results. Bottom-up evidence percolates upwards so as to automatically infer collective activity labels. Experimental results on two challenging data sets demonstrate our theoretical claims and indicate that our model achieves enhanced tracking results and the best collective classification results to date.
Robust Expandable Carbon Nanotube Scaffold for Ultrahigh-Capacity Lithium-Metal Anodes.
Sun, Zhaowei; Jin, Song; Jin, Hongchang; Du, Zhenzhen; Zhu, Yanwu; Cao, Anyuan; Ji, Hengxing; Wan, Li-Jun
2018-06-19
There has been a renewed interest in using lithium (Li) metal as an anode material for rechargeable batteries owing to its high theoretical capacity of 3860 mA h g⁻¹. Despite extensive research, modifications to effectively inhibit Li dendrite growth still result in decreased Li loading and Li utilization. As a result, real capacities are often lower than expected values if the total mass of the electrode is taken into consideration. Herein, a lightweight yet mechanically robust carbon nanotube (CNT) paper is demonstrated as a freestanding framework to accommodate Li metal with a Li mass fraction of 80.7 wt%. The highly conductive network made of sp²-hybridized carbon effectively inhibits formation of Li dendrites and affords a favorable coulombic efficiency of >97.5%. Moreover, the Li/CNT electrode retains practical areal and gravimetric capacities of 10 mA h cm⁻² and 2830 mA h g⁻¹ (vs the mass of electrode), respectively, with 90.9% Li utilization for 1000 cycles at a current density of 10 mA cm⁻². It is demonstrated that the robust and expandable nature is a distinguishing feature of the CNT paper as compared to other 3D scaffolds, and is a key factor that leads to the improved electrochemical performance of the Li/CNT anodes. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
2014-01-01
Background: The continuing increase of pharmaceutical expenditure calls for new approaches to pricing and reimbursement of pharmaceuticals. Value-based pricing of pharmaceuticals is emerging as a useful tool and possesses theoretical attributes that can help health systems cope with rising pharmaceutical expenditure. Aim: To assess the feasibility of introducing a value-based pricing scheme for pharmaceuticals in Cyprus and explore the integrative framework. Methods: A probabilistic Markov chain Monte Carlo model was created to simulate progression of advanced renal cell cancer for comparison of sorafenib to standard best supportive care. A literature review was performed and efficacy data were transferred from a published landmark trial, while official pricelists and clinical guidelines from the Cyprus Ministry of Health were utilised for cost calculation. Based on a proposed willingness-to-pay threshold, the maximum price of sorafenib for the indication of second-line renal cell cancer was assessed. Results: The value-based price of sorafenib was found to be significantly lower than its current reference price. Conclusion: The feasibility of value-based pricing is documented, and pharmacoeconomic modelling can lead to robust results. Integration of value and affordability in the price is its main advantage, which has to be weighed against the lack of documentation for several theoretical parameters that influence the outcome. Smaller countries such as Cyprus may face difficulties in establishing and sustaining the essential structures for this scheme. PMID:24910539
Levin, Yulia; Tzelgov, Joseph
2016-01-01
The present study suggests that the idea that Stroop interference originates from multiple components may gain theoretically from integrating two independent frameworks. The first framework is represented by the well-known notion of "semantic gradient" of interference and the second one is the distinction between two types of conflict - the task and the informational conflict - giving rise to the interference (MacLeod and MacDonald, 2000; Goldfarb and Henik, 2007). The proposed integration led to the conclusion that two (i.e., orthographic and lexical components) of the four theoretically distinct components represent task conflict, and the other two (i.e., indirect and direct informational conflict components) represent informational conflict. The four components were independently estimated in a series of experiments. The results confirmed the contribution of task conflict (estimated by a robust orthographic component) and of informational conflict (estimated by a strong direct informational conflict component) to Stroop interference. However, the performed critical review of the relevant literature (see General Discussion), as well as the results of the experiments reported, showed that the other two components expressing each type of conflict (i.e., the lexical component of task conflict and the indirect informational conflict) were small and unstable. The present analysis refines our knowledge of the origins of Stroop interference by providing evidence that each type of conflict has its major and minor contributions. The implications for cognitive control of an automatic reading process are also discussed.
Reiter-Theil, Stella; Mertz, Marcel; Schürmann, Jan; Stingelin Giles, Nicola; Meyer-Zehnder, Barbara
2011-09-01
In this paper we assume that 'theory' is important for Clinical Ethics Support Services (CESS). We will argue that the underlying implicit theory should be reflected. Moreover, we suggest that the theoretical components on which any clinical ethics support (CES) relies should be explicitly articulated in order to enhance the quality of CES. A theoretical framework appropriate for CES will be necessarily complex and should include ethical (both descriptive and normative), metaethical and organizational components. The various forms of CES that exist in North-America and in Europe show their underlying theory more or less explicitly, with most of them referring to some kind of theoretical components including 'how-to' questions (methodology), organizational issues (implementation), problem analysis (phenomenology or typology of problems), and related ethical issues such as end-of-life decisions (major ethical topics). In order to illustrate and explain the theoretical framework that we are suggesting for our own CES project METAP, we will outline this project which has been established in a multi-centre context in several healthcare institutions. We conceptualize three 'pillars' as the major components of our theoretical framework: (1) evidence, (2) competence, and (3) discourse. As a whole, the framework is aimed at developing a foundation of our CES project METAP. We conclude that this specific integration of theoretical components is a promising model for the fruitful further development of CES. © 2011 Blackwell Publishing Ltd.
Quantifying loopy network architectures.
Katifori, Eleni; Magnasco, Marcelo O
2012-01-01
Biology presents many examples of planar distribution and structural networks having dense sets of closed loops. An archetype of this form of network organization is the vasculature of dicotyledonous leaves, which showcases a hierarchically-nested architecture containing closed loops at many different levels. Although a number of approaches have been proposed to measure aspects of the structure of such networks, a robust metric to quantify their hierarchical organization is still lacking. We present an algorithmic framework, the hierarchical loop decomposition, that allows mapping loopy networks to binary trees, preserving in the connectivity of the trees the architecture of the original graph. We apply this framework to investigate computer generated graphs, such as artificial models and optimal distribution networks, as well as natural graphs extracted from digitized images of dicotyledonous leaves and vasculature of rat cerebral neocortex. We calculate various metrics based on the asymmetry, the cumulative size distribution and the Strahler bifurcation ratios of the corresponding trees and discuss the relationship of these quantities to the architectural organization of the original graphs. This algorithmic framework decouples the geometric information (exact location of edges and nodes) from the metric topology (connectivity and edge weight) and it ultimately allows us to perform a quantitative statistical comparison between predictions of theoretical models and naturally occurring loopy graphs.
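Two of the tree metrics mentioned above, Strahler orders and bifurcation ratios, are easy to state in code; the toy tree below stands in for one output of the hierarchical loop decomposition, which is not itself implemented here.

```python
import numpy as np

def strahler(tree, node):
    """Strahler order of `node` in a rooted tree given as {parent: [children, ...]}."""
    children = tree.get(node, [])
    if not children:
        return 1
    orders = [strahler(tree, c) for c in children]
    top = max(orders)
    return top + 1 if orders.count(top) >= 2 else top

def bifurcation_ratios(tree, root):
    """Ratio of the number of branches of order k to those of order k + 1."""
    orders = {}
    stack = [root]
    while stack:
        n = stack.pop()
        orders[n] = strahler(tree, n)
        stack.extend(tree.get(n, []))
    counts = np.bincount(list(orders.values()))[1:]
    return counts[:-1] / counts[1:]

# a small binary tree standing in for one hierarchical loop decomposition
tree = {0: [1, 2], 1: [3, 4], 2: [5, 6], 3: [7, 8]}
print("root Strahler order:", strahler(tree, 0))
print("bifurcation ratios:", bifurcation_ratios(tree, 0))
```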
A Public Health Grid (PHGrid): Architecture and value proposition for 21st century public health.
Savel, T; Hall, K; Lee, B; McMullin, V; Miles, M; Stinn, J; White, P; Washington, D; Boyd, T; Lenert, L
2010-07-01
This manuscript describes the value of and proposal for a high-level architectural framework for a Public Health Grid (PHGrid), which the authors feel has the capability to afford the public health community a robust technology infrastructure for secure and timely data, information, and knowledge exchange, not only within the public health domain, but between public health and the overall health care system. The CDC facilitated multiple Proof-of-Concept (PoC) projects, leveraging an open-source-based software development methodology, to test four hypotheses with regard to this high-level framework. The outcomes of the four PoCs in combination with the use of the Federal Enterprise Architecture Framework (FEAF) and the newly emerging Federal Segment Architecture Methodology (FSAM) was used to develop and refine a high-level architectural framework for a Public Health Grid infrastructure. The authors were successful in documenting a robust high-level architectural framework for a PHGrid. The documentation generated provided a level of granularity needed to validate the proposal, and included examples of both information standards and services to be implemented. Both the results of the PoCs as well as feedback from selected public health partners were used to develop the granular documentation. A robust high-level cohesive architectural framework for a Public Health Grid (PHGrid) has been successfully articulated, with its feasibility demonstrated via multiple PoCs. In order to successfully implement this framework for a Public Health Grid, the authors recommend moving forward with a three-pronged approach focusing on interoperability and standards, streamlining the PHGrid infrastructure, and developing robust and high-impact public health services. Published by Elsevier Ireland Ltd.
Theoretical Framework of Leadership in Higher Education of England and Wales
ERIC Educational Resources Information Center
Mukan, Nataliya; Havrylyuk, Marianna; Stolyarchuk, Lesia
2015-01-01
In the article the theoretical framework of leadership in higher education of England and Wales has been studied. The main objectives of the article are defined as analysis of scientific and pedagogical literature, which highlights different aspects of the problem under research; characteristic of the theoretical fundamentals of educational…
Towards Developing a Theoretical Framework for Measuring Public Sector Managers' Career Success
ERIC Educational Resources Information Center
Rasdi, Roziah Mohd; Ismail, Maimunah; Uli, Jegak; Noah, Sidek Mohd
2009-01-01
Purpose: The purpose of this paper is to develop a theoretical framework for measuring public sector managers' career success. Design/methodology/approach: The theoretical foundation used in this study is social cognitive career theory. To conduct a literature search, several keywords were identified, i.e. career success, objective and subjective…
The Importance of Theoretical Frameworks and Mathematical Constructs in Designing Digital Tools
ERIC Educational Resources Information Center
Trinter, Christine
2016-01-01
The increase in availability of educational technologies over the past few decades has not only led to new practice in teaching mathematics but also to new perspectives in research, methodologies, and theoretical frameworks within mathematics education. Hence, the amalgamation of theoretical and pragmatic considerations in digital tool design…
Transition metal complexes supported on metal-organic frameworks for heterogeneous catalysts
Farha, Omar K.; Hupp, Joseph T.; Delferro, Massimiliano; Klet, Rachel C.
2017-02-07
A robust mesoporous metal-organic framework comprising a hafnium-based metal-organic framework and a single-site zirconium-benzyl species is provided. The hafnium, zirconium-benzyl metal-organic framework is useful as a catalyst for the polymerization of an alkene.
NASA Astrophysics Data System (ADS)
Kim, Y.; Chung, E. S.
2014-12-01
This study suggests a robust prioritization framework for climate change adaptation strategies under multiple climate change scenarios with a case study of selecting sites for reusing treated wastewater (TWW) in a Korean urban watershed. The framework utilizes various multi-criteria decision-making techniques, including the VIKOR method and Shannon entropy-based weights. In this case study, the sustainability of TWW use is quantified with indicator-based approaches under the DPSIR framework, which considers both hydro-environmental and socio-economic aspects of watershed management. Under the various climate change scenarios, the hydro-environmental responses to reusing TWW in potential alternative sub-watersheds are determined using the Hydrological Simulation Program-FORTRAN (HSPF). The socio-economic indicators are obtained from statistical databases. Sustainability scores for multiple scenarios are estimated individually and then integrated with the proposed approach. Finally, the suggested framework allows us to prioritize adaptation strategies in a robust manner with varying levels of compromise between utility-based and regret-based strategies.
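A compact sketch of the two MCDM ingredients named in the abstract, Shannon entropy weights and VIKOR scores, applied to an assumed benefit-type indicator matrix; the DPSIR indicators, HSPF outputs, and multi-scenario integration are not reproduced.

```python
import numpy as np

def entropy_weights(X):
    """Shannon entropy-based objective weights (rows = alternatives, columns = benefit criteria)."""
    P = X / X.sum(axis=0)
    m = X.shape[0]
    e = -(P * np.log(P + 1e-12)).sum(axis=0) / np.log(m)
    d = 1.0 - e
    return d / d.sum()

def vikor(X, w, v=0.5):
    """VIKOR compromise scores Q (smaller = better) for a benefit-type decision matrix."""
    f_best, f_worst = X.max(axis=0), X.min(axis=0)
    norm = (f_best - X) / (f_best - f_worst + 1e-12)
    S = (w * norm).sum(axis=1)                      # group utility
    R = (w * norm).max(axis=1)                      # individual regret
    Q = v * (S - S.min()) / (S.max() - S.min() + 1e-12) \
        + (1 - v) * (R - R.min()) / (R.max() - R.min() + 1e-12)
    return Q

# rows = candidate sub-watersheds, columns = sustainability indicators (values assumed)
X = np.array([[0.62, 0.40, 0.71],
              [0.55, 0.80, 0.50],
              [0.90, 0.35, 0.60],
              [0.45, 0.60, 0.85]])
w = entropy_weights(X)
Q = vikor(X, w)
print("entropy weights:", np.round(w, 3))
print("ranking (best first):", np.argsort(Q))
```

The weight v in the VIKOR step is what shifts the result between the utility-based and regret-based ends of the compromise mentioned in the abstract.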
A Theoretically Grounded Framework for Integrating the Scholarship of Teaching and Learning
ERIC Educational Resources Information Center
Walls, Jill K.
2016-01-01
SoTL scholars have written about the importance and utility of teaching from a guiding theoretical framework. In this paper, ecological theory and specifically Bronfenbrenner's bioecological model, is examined as a potential framework for synthesizing SoTL research findings to inform teaching and learning scholarship at the college level. A…
Carmena, Jose M.
2016-01-01
Much progress has been made in brain-machine interfaces (BMI) using decoders such as Kalman filters and finding their parameters with closed-loop decoder adaptation (CLDA). However, current decoders do not model the spikes directly, and hence may limit the processing time-scale of BMI control and adaptation. Moreover, while specialized CLDA techniques for intention estimation and assisted training exist, a unified and systematic CLDA framework that generalizes across different setups is lacking. Here we develop a novel closed-loop BMI training architecture that allows for processing, control, and adaptation using spike events, enables robust control and extends to various tasks. Moreover, we develop a unified control-theoretic CLDA framework within which intention estimation, assisted training, and adaptation are performed. The architecture incorporates an infinite-horizon optimal feedback-control (OFC) model of the brain’s behavior in closed-loop BMI control, and a point process model of spikes. The OFC model infers the user’s motor intention during CLDA—a process termed intention estimation. OFC is also used to design an autonomous and dynamic assisted training technique. The point process model allows for neural processing, control and decoder adaptation with every spike event and at a faster time-scale than current decoders; it also enables dynamic spike-event-based parameter adaptation unlike current CLDA methods that use batch-based adaptation on much slower adaptation time-scales. We conducted closed-loop experiments in a non-human primate over tens of days to dissociate the effects of these novel CLDA components. The OFC intention estimation improved BMI performance compared with current intention estimation techniques. OFC assisted training allowed the subject to consistently achieve proficient control. Spike-event-based adaptation resulted in faster and more consistent performance convergence compared with batch-based methods, and was robust to parameter initialization. Finally, the architecture extended control to tasks beyond those used for CLDA training. These results have significant implications towards the development of clinically-viable neuroprosthetics. PMID:27035820
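The OFC ingredient can be illustrated with an infinite-horizon discrete-time LQR: during CLDA the target is known, so the control the optimal policy would issue toward it serves as the intention estimate. The cursor dynamics, cost weights, and state layout below are assumptions; the paper's point-process spike model and spike-event-based adaptation are not shown.

```python
import numpy as np
from scipy.linalg import solve_discrete_are

def lqr_gain(A, B, Q, R):
    """Infinite-horizon discrete-time LQR gain, the optimal-feedback-control building block."""
    P = solve_discrete_are(A, B, Q, R)
    return np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)

dt = 0.1
# 2-D cursor with position/velocity states per axis: x = [px, py, vx, vy] (assumed dynamics)
A = np.block([[np.eye(2), dt * np.eye(2)],
              [np.zeros((2, 2)), 0.9 * np.eye(2)]])
B = np.vstack([np.zeros((2, 2)), dt * np.eye(2)])
Q = np.diag([1.0, 1.0, 0.1, 0.1])
R = 0.01 * np.eye(2)
K = lqr_gain(A, B, Q, R)

def ofc_intention(state, target):
    """OFC-style intention estimate: the control the optimal policy would issue
    to drive the cursor toward the known target during CLDA training."""
    err = state - np.concatenate([target, np.zeros(2)])
    return -K @ err

print(np.round(ofc_intention(np.array([0.2, -0.1, 0.0, 0.0]), np.array([0.0, 0.0])), 3))
```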
Persistent model order reduction for complex dynamical systems using smooth orthogonal decomposition
NASA Astrophysics Data System (ADS)
Ilbeigi, Shahab; Chelidze, David
2017-11-01
Full-scale complex dynamic models are not effective for parametric studies due to the inherent constraints on available computational power and storage resources. A persistent reduced order model (ROM) that is robust, stable, and provides high-fidelity simulations for a relatively wide range of parameters and operating conditions can provide a solution to this problem. The fidelity of a new framework for persistent model order reduction of large and complex dynamical systems is investigated. The framework is validated using several numerical examples including a large linear system and two complex nonlinear systems with material and geometrical nonlinearities. While the framework is used for identifying the robust subspaces obtained from both proper and smooth orthogonal decompositions (POD and SOD, respectively), the results show that SOD outperforms POD in terms of stability, accuracy, and robustness.
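A minimal sketch contrasting POD (SVD of the snapshot matrix) with SOD (a generalized eigenproblem of the state covariance against the covariance of the time derivatives) on a synthetic snapshot set; the test signal and mode counts are arbitrary, and the persistence/robustness machinery of the framework itself is not reproduced.

```python
import numpy as np
from scipy.linalg import eigh, subspace_angles

def pod_modes(X, r):
    """Proper orthogonal decomposition: dominant left singular vectors of the snapshot matrix."""
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :r], s

def sod_modes(X, dt, r):
    """Smooth orthogonal decomposition: generalized eigenproblem of the state covariance
    against the covariance of finite-difference time derivatives; large eigenvalues
    correspond to the smoothest, most energetic structures."""
    V = np.gradient(X, dt, axis=1)
    Sx, Sv = X @ X.T / X.shape[1], V @ V.T / V.shape[1]
    vals, vecs = eigh(Sx, Sv)
    order = np.argsort(vals)[::-1]
    return vecs[:, order[:r]], vals[order]

# snapshot matrix: 50 spatial points, 400 time samples of two modes plus noise
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)[:, None]
t = np.linspace(0, 10, 400)[None, :]
X = np.sin(2 * np.pi * (x - 0.1 * t)) + 0.3 * np.sin(6 * np.pi * x) * np.cos(3 * t)
X += 0.05 * rng.normal(size=X.shape)

Phi_pod, s = pod_modes(X, r=4)
Phi_sod, lam = sod_modes(X, dt=10 / 399, r=4)
print("energy captured by 4 POD modes:", round(float((s[:4] ** 2).sum() / (s ** 2).sum()), 3))
print("largest principal angle between POD and SOD subspaces (deg):",
      round(float(np.degrees(subspace_angles(Phi_pod, Phi_sod).max())), 1))
```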
Robust nonlinear control of vectored thrust aircraft
NASA Technical Reports Server (NTRS)
Doyle, John C.; Murray, Richard; Morris, John
1993-01-01
An interdisciplinary program in robust control for nonlinear systems with applications to a variety of engineering problems is outlined. Major emphasis will be placed on flight control, with both experimental and analytical studies. This program builds on recent new results in control theory for stability, stabilization, robust stability, robust performance, synthesis, and model reduction in a unified framework using Linear Fractional Transformations (LFTs), Linear Matrix Inequalities (LMIs), and the structured singular value μ. Most of these new advances have been accomplished by the Caltech controls group independently or in collaboration with researchers in other institutions. These recent results offer a new and remarkably unified framework for all aspects of robust control, but what is particularly important for this program is that they also have important implications for system identification and control of nonlinear systems. This combines well with Caltech's expertise in nonlinear control theory, both in geometric methods and methods for systems with constraints and saturations.
ERIC Educational Resources Information Center
Mpofu, Vongai; Otulaja, Femi S.; Mushayikwa, Emmanuel
2014-01-01
A theoretical framework is an important component of a research study. It grounds the study and guides the methodological design. It also forms a reference point for the interpretation of the research findings. This paper conceptually examines the process of constructing a multi-focal theoretical lens for guiding studies that aim to accommodate…
NASA Astrophysics Data System (ADS)
Alexandridis, Konstantinos T.
This dissertation adopts a holistic and detailed approach to modeling spatially explicit agent-based artificial intelligent systems, using the Multi Agent-based Behavioral Economic Landscape (MABEL) model. The research questions it addresses stem from the need to understand and analyze the real-world patterns and dynamics of land use change from a coupled human-environmental systems perspective. It describes the systemic, mathematical, statistical, socio-economic and spatial dynamics of the MABEL modeling framework, and provides a wide array of cross-disciplinary modeling applications within the research, decision-making and policy domains. It establishes the symbolic properties of the MABEL model as a Markov decision process, analyzes the decision-theoretic utility and optimization attributes of agents towards comprising statistically and spatially optimal policies and actions, and explores the probabilistic character of the agents' decision-making and inference mechanisms via the use of Bayesian belief and decision networks. It develops and describes a Monte Carlo methodology for experimental replications of agents' decisions regarding complex spatial parcel acquisition and learning. It recognizes the gap in spatially explicit accuracy assessment techniques for complex spatial models, and proposes an ensemble of statistical tools designed to address this problem. Advanced information assessment techniques such as the Receiver-Operator Characteristic curve, the impurity entropy and Gini functions, and the Bayesian classification functions are proposed. The theoretical foundation for modular Bayesian inference in spatially explicit multi-agent artificial intelligent systems, and the ensembles of cognitive and scenario assessment modular tools built for the MABEL model, are provided. It emphasizes modularity and robustness as valuable qualitative modeling attributes, and examines the role of robust intelligent modeling as a tool for improving policy decisions related to land use change. Finally, the major contributions to the science are presented along with valuable directions for future research.
NASA Astrophysics Data System (ADS)
Sturtevant, C.; Hackley, S.; Lee, R.; Holling, G.; Bonarrigo, S.
2017-12-01
Quality assurance and control (QA/QC) is one of the most important yet challenging aspects of producing research-quality data. Data quality issues are multi-faceted, including sensor malfunctions, unmet theoretical assumptions, and measurement interference from humans or the natural environment. Tower networks such as AmeriFlux, ICOS, and NEON continue to grow in size and sophistication, yet tools for robust, efficient, scalable QA/QC have lagged. Quality control remains a largely manual process heavily relying on visual inspection of data. In addition, notes of measurement interference are often recorded on paper without an explicit pathway to data flagging. As such, an increase in network size requires a near-proportional increase in personnel devoted to QA/QC, quickly stressing the human resources available. We present a scalable QA/QC framework in development for NEON that combines the efficiency and standardization of automated checks with the power and flexibility of human review. This framework includes fast-response monitoring of sensor health, a mobile application for electronically recording maintenance activities, traditional point-based automated quality flagging, and continuous monitoring of quality outcomes and longer-term holistic evaluations. This framework maintains the traceability of quality information along the entirety of the data generation pipeline, and explicitly links field reports of measurement interference to quality flagging. Preliminary results show that data quality can be effectively monitored and managed for a multitude of sites with a small group of QA/QC staff. Several components of this framework are open-source, including an R Shiny application for efficiently monitoring, synthesizing, and investigating data quality issues.
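The "point-based automated quality flagging" step typically means plausibility tests of the following kind; the sketch implements generic range, step, and persistence checks with made-up thresholds, not NEON's actual tests.

```python
import numpy as np

def quality_flags(x, valid_range=(-20.0, 40.0), max_step=5.0, max_flatline=12):
    """Plausibility-style automated flags for a sensor time series (True = suspect).

    Three classic point checks: out-of-range values, implausibly large jumps between
    consecutive points, and 'flatlined' runs where the value never changes.
    """
    x = np.asarray(x, dtype=float)
    range_flag = (x < valid_range[0]) | (x > valid_range[1])
    step_flag = np.zeros_like(range_flag)
    step_flag[1:] = np.abs(np.diff(x)) > max_step
    flat_flag = np.zeros_like(range_flag)
    run = 0
    for i in range(1, x.size):
        run = run + 1 if x[i] == x[i - 1] else 0
        if run >= max_flatline:
            flat_flag[i - run:i + 1] = True
    return {"range": range_flag, "step": step_flag, "persistence": flat_flag,
            "final": range_flag | step_flag | flat_flag}

x = np.concatenate([15 + np.random.default_rng(0).normal(0, 0.5, 50),
                    np.full(20, 17.0),             # stuck sensor
                    [60.0, 16.0]])                 # spike
flags = quality_flags(x)
print("flagged points:", int(flags["final"].sum()), "of", x.size)
```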
Robust flow stability: Theory, computations and experiments in near wall turbulence
NASA Astrophysics Data System (ADS)
Bobba, Kumar Manoj
Helmholtz established the field of hydrodynamic stability with his pioneering work in 1868. From then on, hydrodynamic stability became an important tool in understanding various fundamental fluid flow phenomena in engineering (mechanical, aeronautics, chemical, materials, civil, etc.) and science (astrophysics, geophysics, biophysics, etc.), and turbulence in particular. However, there are many discrepancies between classical hydrodynamic stability theory and experiments. In this thesis, the limitations of traditional hydrodynamic stability theory are shown and a framework for robust flow stability theory is formulated. A host of new techniques like gramians, singular values, and operator norms are introduced to understand the role of various kinds of uncertainty. An interesting feature of this framework is the close interplay between theory and computations. It is shown that a subset of the Navier-Stokes equations is globally nonlinearly stable for all Reynolds numbers. Yet, invoking this new theory, it is shown that these equations produce structures (vortices and streaks) as seen in the experiments. The experiments are done in a zero-pressure-gradient transitional boundary layer on a flat plate in a free-surface tunnel. Digital particle image velocimetry and MEMS-based laser Doppler velocimeters and shear stress sensors have been used to make quantitative measurements of the flow. Various theoretical and computational predictions are in excellent agreement with the experimental data. A closely related topic of modeling, simulation and complexity reduction of large mechanics problems with multiple spatial and temporal scales is also studied. A method that rigorously quantifies the important scales and automatically gives models of the problem to various levels of accuracy is introduced. Computations done using spectral methods are presented.
Valentijn, Pim P; Bruijnzeels, Marc A; de Leeuw, Rob J; Schrijvers, Guus J.P
2012-01-01
Purpose: Capacity problems and political pressures have led to a rapid change in the organization of primary care from monodisciplinary small businesses to complex inter-organizational relationships. It is assumed that inter-organizational collaboration is the driving force for achieving integrated (primary) care. Despite the importance of collaboration and integration of services in primary care, there is no unambiguous definition of either concept. The purpose of this study is to examine and link the conceptualisation and validation of the terms inter-organizational collaboration and integrated primary care using a theoretical framework. Theory: The theoretical framework is based on the complex collaboration process of negotiation among multiple stakeholder groups in primary care. Methods: A literature review of health sciences and business databases, and targeted grey literature sources. Based on the literature review we operationalized the constructs of inter-organizational collaboration and integrated primary care in a theoretical framework. The framework is being validated in an explorative study of 80 primary care projects in the Netherlands. Results and conclusions: Integrated primary care is considered a multidimensional construct based on a continuum of integration, extending from segregation to integration. The synthesis of current theories and concepts of inter-organizational collaboration is insufficient to deal with the complexity of collaborative issues in primary care. One coherent and integrated theoretical framework was found that could make the complex collaboration process in primary care transparent. The theoretical framework presented in this study is a first step towards understanding the patterns of successful collaboration and integration in primary care services. These patterns can give insight into the organizational forms needed to create a well-functioning integrated (primary) care system that fits the local needs of a population. Preliminary data on the patterns of collaboration and integration will be presented.
IEP goals for school-age children with speech sound disorders.
Farquharson, Kelly; Tambyraja, Sherine R; Justice, Laura M; Redle, Erin E
2014-01-01
The purpose of the current study was to describe the current state of practice for writing Individualized Education Program (IEP) goals for children with speech sound disorders (SSDs). IEP goals for 146 children receiving services for SSDs within public school systems across two states were coded for their dominant theoretical framework and overall quality. A dichotomous scheme was used for theoretical framework coding: cognitive-linguistic or sensory-motor. Goal quality was determined by examining 7 specific indicators outlined by an empirically tested rating tool. In total, 147 long-term and 490 short-term goals were coded. The results revealed no dominant theoretical framework for long-term goals, whereas short-term goals largely reflected a sensory-motor framework. In terms of quality, the majority of speech production goals were functional and generalizable in nature, but were not able to be easily targeted during common daily tasks or by other members of the IEP team. Short-term goals were consistently rated higher in quality domains when compared to long-term goals. The current state of practice for writing IEP goals for children with SSDs indicates that theoretical framework may be eclectic in nature and likely written to support the individual needs of children with speech sound disorders. Further investigation is warranted to determine the relations between goal quality and child outcomes. (1) Identify two predominant theoretical frameworks and discuss how they apply to IEP goal writing. (2) Discuss quality indicators as they relate to IEP goals for children with speech sound disorders. (3) Discuss the relationship between long-term goals level of quality and related theoretical frameworks. (4) Identify the areas in which business-as-usual IEP goals exhibit strong quality.
Zheng, Wenming; Lin, Zhouchen; Wang, Haixian
2014-04-01
A novel discriminant analysis criterion is derived in this paper under the theoretical framework of Bayes optimality. In contrast to the conventional Fisher discriminant criterion, the major novelty of the proposed criterion is the use of the L1 norm rather than the L2 norm, which makes it less sensitive to outliers. With the L1-norm discriminant criterion, we propose a new linear discriminant analysis (L1-LDA) method for the linear feature extraction problem. To solve the L1-LDA optimization problem, we propose an efficient iterative algorithm in which a novel surrogate convex function is introduced, so that each iteration reduces to a convex programming problem with a guaranteed closed-form solution. Moreover, we generalize the L1-LDA method to nonlinear robust feature extraction via the kernel trick, yielding the L1-norm kernel discriminant analysis (L1-KDA) method. Extensive experiments on simulated and real data sets are conducted to evaluate the effectiveness of the proposed method in comparison with state-of-the-art methods.
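The following is a minimal sketch, not the authors' algorithm, that contrasts the L2 and L1 forms of a Fisher-style discriminant ratio for a fixed projection vector; the toy data, function names and the outlier construction are illustrative assumptions chosen to show why the L1 form is less sensitive to gross outliers.

```python
import numpy as np

def discriminant_ratios(X, y, w):
    """Evaluate L2- and L1-norm Fisher-style criteria for a projection w.

    X: (n_samples, n_features), y: integer class labels, w: (n_features,).
    Returns (J_l2, J_l1): between-class spread over within-class spread,
    measured with squared deviations (L2) and absolute deviations (L1).
    """
    w = w / np.linalg.norm(w)
    z = X @ w                               # projected samples
    grand_mean = z.mean()
    between_l2 = between_l1 = 0.0
    within_l2 = within_l1 = 0.0
    for c in np.unique(y):
        zc = z[y == c]
        mc = zc.mean()
        between_l2 += len(zc) * (mc - grand_mean) ** 2
        between_l1 += len(zc) * abs(mc - grand_mean)
        within_l2 += np.sum((zc - mc) ** 2)
        within_l1 += np.sum(np.abs(zc - mc))
    return between_l2 / within_l2, between_l1 / within_l1

# toy data with one gross outlier to illustrate the reduced sensitivity of the L1 ratio
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
X[0] += 40                                  # gross outlier in class 0
y = np.array([0] * 50 + [1] * 50)
print(discriminant_ratios(X, y, np.array([1.0, 1.0])))
```

The outlier inflates the within-class L2 term far more than the L1 term, which is the intuition behind replacing the norm in the criterion.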
Transfer Learning for Class Imbalance Problems with Inadequate Data.
Al-Stouhi, Samir; Reddy, Chandan K
2016-07-01
A fundamental problem in data mining is to effectively build robust classifiers in the presence of skewed data distributions. Class imbalance classifiers are trained specifically for skewed-distribution datasets. Existing methods assume an ample supply of training examples as a fundamental prerequisite for constructing an effective classifier. However, when sufficient data are not readily available, developing a representative classification algorithm becomes even more difficult because of the unequal distribution between classes. We provide a unified framework that takes advantage of auxiliary data through a transfer learning mechanism and simultaneously builds a robust classifier to tackle the imbalance issue when few training samples are available in a particular target domain of interest. Transfer learning methods use auxiliary data to augment learning when training examples are insufficient; in this paper we develop a method that is optimized to simultaneously augment the training data and induce balance into skewed datasets. We propose a novel boosting-based instance-transfer classifier with a label-dependent update mechanism that simultaneously compensates for class imbalance and incorporates samples from an auxiliary domain to improve classification. We provide theoretical and empirical validation of our method and apply it to healthcare and text classification applications.
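As a rough illustration of the general recipe (not the authors' exact update rule), the sketch below shows one TrAdaBoost-flavoured boosting round in which source-domain instances are down-weighted when misclassified, while target-domain errors are up-weighted with a label-dependent scale; the factors beta_src, the minority-class scale of 1.5, and the toy data are all illustrative assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def boost_round(X_src, y_src, X_tgt, y_tgt, w_src, w_tgt):
    """One illustrative instance-transfer boosting round (TrAdaBoost-flavoured).

    Source-domain weights shrink when source instances are misclassified
    (they look irrelevant to the target task); target-domain weights grow on
    errors, with extra growth on the minority class to counteract imbalance.
    """
    X = np.vstack([X_src, X_tgt])
    y = np.concatenate([y_src, y_tgt])
    w = np.concatenate([w_src, w_tgt])
    clf = DecisionTreeClassifier(max_depth=3).fit(X, y, sample_weight=w / w.sum())
    err = clf.predict(X) != y
    n_src = len(y_src)
    eps = max(1e-10, np.average(err[n_src:], weights=w_tgt / w_tgt.sum()))
    beta_tgt = eps / (1.0 - eps)            # AdaBoost-style factor on target data
    beta_src = 0.5                          # illustrative fixed down-weighting of source errors
    class_scale = np.where(y == 1, 1.5, 1.0)  # assumed minority class: label 1
    w_new = w.copy()
    w_new[:n_src][err[:n_src]] *= beta_src
    w_new[n_src:][err[n_src:]] *= (1.0 / beta_tgt) * class_scale[n_src:][err[n_src:]]
    return clf, w_new[:n_src], w_new[n_src:]

# toy usage: uniform initial weights over a small synthetic source/target split
rng = np.random.default_rng(0)
Xs, ys = rng.normal(size=(100, 5)), rng.integers(0, 2, 100)
Xt, yt = rng.normal(size=(30, 5)), rng.integers(0, 2, 30)
clf, ws, wt = boost_round(Xs, ys, Xt, yt, np.ones(100), np.ones(30))
```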
Multi-criteria robustness analysis of metro networks
NASA Astrophysics Data System (ADS)
Wang, Xiangrong; Koç, Yakup; Derrible, Sybil; Ahmad, Sk Nasir; Pino, Willem J. A.; Kooij, Robert E.
2017-05-01
Metros (heavy rail transit systems) are integral parts of urban transportation systems. Failures in their operations can have serious impacts on urban mobility, and measuring their robustness is therefore critical. Moreover, as physical networks, metros can be viewed as topological entities, and as such they possess measurable network properties. In this article, by using network science and graph theory, we investigate ten theoretical and four numerical robustness metrics and their performance in quantifying the robustness of 33 metro networks under random failures or targeted attacks. We find that the ten theoretical metrics capture two distinct aspects of robustness of metro networks. First, several metrics place an emphasis on alternative paths. Second, other metrics place an emphasis on the length of the paths. To account for all aspects, we standardize all ten indicators and plot them on radar diagrams to assess the overall robustness for metro networks. Overall, we find that Tokyo and Rome are the most robust networks. Rome benefits from short transfers, and Tokyo has a significant number of transfer stations, both in the city center and in the peripheral area of the city, promoting both a higher number of alternative paths and relatively short overall path lengths.
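A minimal sketch of the workflow described here, assuming networkx and two toy graphs in place of real metro topologies: compute a few topological indicators that separate the "alternative paths" and "path length" aspects, then z-score them so that different networks can share a common radar-style comparison. The specific indicators below are common graph metrics, not necessarily the ten used in the study.

```python
import numpy as np
import networkx as nx

def topological_indicators(G):
    """A few common robustness-related indicators for an undirected network."""
    n, m = G.number_of_nodes(), G.number_of_edges()
    return {
        "avg_degree": 2.0 * m / n,
        "cyclomatic": m - n + nx.number_connected_components(G),  # alternative paths
        "avg_shortest_path": nx.average_shortest_path_length(G),  # path length
        "algebraic_connectivity": nx.algebraic_connectivity(G),
    }

def standardize(rows):
    """z-score each indicator across networks so they can share one radar plot."""
    keys = list(rows[0].keys())
    mat = np.array([[r[k] for k in keys] for r in rows], dtype=float)
    z = (mat - mat.mean(axis=0)) / mat.std(axis=0)
    return [dict(zip(keys, row)) for row in z]

# two toy "metros": a ring (alternative paths) vs. a star (short paths, hub-fragile)
nets = {"ring": nx.cycle_graph(12), "star": nx.star_graph(11)}
rows = [topological_indicators(G) for G in nets.values()]
for name, z in zip(nets, standardize(rows)):
    print(name, {k: round(v, 2) for k, v in z.items()})
```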
Intellect: a theoretical framework for personality traits related to intellectual achievements.
Mussel, Patrick
2013-05-01
The present article develops a theoretical framework for the structure of personality traits related to intellectual achievements. We postulate a 2-dimensional model, differentiating between 2 processes (Seek and Conquer) and 3 operations (Think, Learn, and Create). The framework was operationalized by a newly developed measure, which was validated based on 2 samples. Subsequently, in 3 studies (overall N = 1,478), the 2-dimensional structure of the Intellect framework was generally supported. Additionally, subdimensions of the Intellect framework specifically predicted conceptually related criteria, including scholastic performance, vocational interest, and leisure activities. Furthermore, results from multidimensional scaling and higher order confirmatory factor analyses show that the framework allows for the incorporation of several constructs that have been proposed on different theoretical backgrounds, such as need for cognition, typical intellectual engagement, curiosity, intrinsic motivation, goal orientation, and openness to ideas. It is concluded that based on the Intellect framework, these constructs, which have been researched separately in the literature, can be meaningfully integrated.
Giordano, Bruno L.; Kayser, Christoph; Rousselet, Guillaume A.; Gross, Joachim; Schyns, Philippe G.
2016-01-01
We begin by reviewing the statistical framework of information theory as applicable to neuroimaging data analysis. A major factor hindering wider adoption of this framework in neuroimaging is the difficulty of estimating information theoretic quantities in practice. We present a novel estimation technique that combines the statistical theory of copulas with the closed form solution for the entropy of Gaussian variables. This results in a general, computationally efficient, flexible, and robust multivariate statistical framework that provides effect sizes on a common meaningful scale, allows for unified treatment of discrete, continuous, unidimensional and multidimensional variables, and enables direct comparisons of representations from behavioral and brain responses across any recording modality. We validate the use of this estimate as a statistical test within a neuroimaging context, considering both discrete stimulus classes and continuous stimulus features. We also present examples of analyses facilitated by these developments, including application of multivariate analyses to MEG planar magnetic field gradients, and pairwise temporal interactions in evoked EEG responses. We show the benefit of considering the instantaneous temporal derivative together with the raw values of M/EEG signals as a multivariate response, how we can separately quantify modulations of amplitude and direction for vector quantities, and how we can measure the emergence of novel information over time in evoked responses. Open-source Matlab and Python code implementing the new methods accompanies this article. Hum Brain Mapp 38:1541–1573, 2017. PMID:27860095
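A minimal sketch of the copula/Gaussian-entropy idea for two one-dimensional variables, assuming numpy and scipy: rank-transform each variable onto standard-normal scores (the Gaussian copula transform) and use the closed-form mutual information of a bivariate Gaussian. The full toolbox accompanying the article also covers multivariate and discrete cases, which this sketch does not.

```python
import numpy as np
from scipy.stats import norm, rankdata

def copula_normalize(x):
    """Map a 1-D sample onto standard-normal scores via its empirical CDF (copula transform)."""
    u = rankdata(x) / (len(x) + 1.0)        # empirical CDF values in (0, 1)
    return norm.ppf(u)

def gaussian_copula_mi(x, y):
    """Mutual-information estimate (in bits) from the Gaussian copula correlation."""
    gx, gy = copula_normalize(x), copula_normalize(y)
    rho = np.corrcoef(gx, gy)[0, 1]
    return -0.5 * np.log2(1.0 - rho ** 2)   # closed form for bivariate Gaussian MI

rng = np.random.default_rng(1)
x = rng.normal(size=2000)
y = np.tanh(x) + 0.3 * rng.normal(size=2000)   # monotonic, non-linear dependence
print(f"MI estimate: {gaussian_copula_mi(x, y):.2f} bits")
```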
Hutchinson, Marie; Jackson, Debra; Daly, John; Usher, Kim
2015-05-01
Intelligent, robust and courageous nursing leadership is essential in all areas of nursing, including mental health. However, in the nursing leadership literature, the theoretical discourse regarding how leaders recognise the need for action and make the choice to act with moral purpose is currently limited. Little has been written about the cognitions, capabilities and contextual factors that enable leader courage. In particular, the interplay between leader values and actions that are characterised as good or moral remains underexplored in the nursing leadership literature. In this article, through a discursive literature synthesis we seek to distill a more detailed understanding of leader moral courage; specifically, what factors contribute to leaders' ability to act with moral courage, what factors impede such action, and what factors do leaders need to foster within themselves and others to enable action that is driven by moral courage. From the analysis, we distilled a multi-level framework that identifies a range of individual characteristics and capabilities, and enabling contextual factors that underpin leader moral courage. The framework suggests leader moral courage is more complex than often posited in theories of leadership, as it comprises elements that shape moral thought and conduct. Given the complexity and challenges of nursing work, the framework for moral action derived from our analysis provides insight and suggestions for strengthening individual and group capacity to assist nurse leaders and mental health nurses to act with integrity and courage.
Scheydt, Stefan; Needham, Ian; Behrens, Johann
2017-01-01
Background: Within the scope of a research project on sensory overload and stimulus regulation, a theoretical framework model of the nursing care of patients with sensory overload in psychiatry was developed. In a second step, this theoretical model was to be empirically examined, consolidated and, if necessary, modified. Aim: Empirical verification as well as modification, enhancement and theoretical consolidation of the framework model of nursing care of patients with sensory overload in psychiatry. Method: Analysis of 8 expert interviews using summarizing and structuring content analysis methods based on Meuser and Nagel (2009) and Mayring (2010). Results: The developed framework model (Scheydt et al., 2016b) could be empirically verified, theoretically consolidated and extended by one category (perception modulation). Thus, four categories of nursing care of patients with sensory overload can be described in inpatient psychiatry: removal from stimuli, modulation of environmental factors, perception modulation, and support for self-help and coping. Conclusions: Based on the methodological approach, a relatively well-saturated, credible conceptualization of a theoretical model describing the nursing care of patients with sensory overload in inpatient psychiatry could be worked out. In further steps, these measures have to be further developed, implemented and evaluated with regard to their efficacy.
Unified framework for automated iris segmentation using distantly acquired face images.
Tan, Chun-Wei; Kumar, Ajay
2012-09-01
Remote human identification using iris biometrics has important civilian and surveillance applications, and its success requires the development of robust segmentation algorithms to automatically extract the iris region. This paper presents a new iris segmentation framework which can robustly segment iris images acquired under near-infrared or visible illumination. The proposed approach exploits multiple higher-order local pixel dependencies to robustly classify the eye region pixels into iris and non-iris regions. Face and eye detection modules have been incorporated into the unified framework to automatically provide the localized eye region from the facial image for iris segmentation. We develop a robust post-processing algorithm to effectively mitigate the noisy pixels caused by misclassification. Experimental results presented in this paper suggest significant improvement in the average segmentation errors over previously proposed approaches, i.e., 47.5%, 34.1%, and 32.6% on the UBIRIS.v2, FRGC, and CASIA.v4 at-a-distance databases, respectively. The usefulness of the proposed approach is also ascertained from recognition experiments on three different publicly available databases.
ERIC Educational Resources Information Center
Harjunen, Elina
2012-01-01
In this theoretical paper the role of power in classroom interactions is examined in terms of a dominance continuum to advance a theoretical framework justifying the emergence of three ways of distributing power when it comes to dealing with the control over the teaching-studying-learning (TSL) "pattern of teacher domination," "pattern of…
Robustness results in LQG based multivariable control designs
NASA Technical Reports Server (NTRS)
Lehtomaki, N. A.; Sandell, N. R., Jr.; Athans, M.
1980-01-01
The robustness of control systems with respect to model uncertainty is considered using simple frequency domain criteria. Results are derived under a common framework in which the minimum singular value of the return difference transfer matrix is the key quantity. In particular, the LQ and LQG robustness results are discussed.
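The key quantity mentioned here, the minimum singular value of the return difference matrix I + L(jw), is easy to sketch numerically. The loop transfer matrix below is an illustrative stand-in, not one of the systems analyzed in the report; small values of the minimum singular value flag frequencies at which modest model uncertainty could destabilize the closed loop.

```python
import numpy as np

def min_return_difference(L_of_jw, omegas):
    """Minimum singular value of the return difference I + L(jw) over a frequency grid.

    L_of_jw: callable returning the (m x m) loop transfer matrix at s = j*omega.
    """
    worst = np.inf
    for w in omegas:
        Ljw = L_of_jw(1j * w)
        M = np.eye(Ljw.shape[0]) + Ljw
        worst = min(worst, np.linalg.svd(M, compute_uv=False)[-1])
    return worst

# illustrative 2x2 loop transfer matrix (assumed example, not from the paper)
def L_example(s):
    return np.array([[1.0 / (s + 1.0), 0.2 / (s + 2.0)],
                     [0.1 / (s + 1.0), 2.0 / (s * (s + 3.0))]])

omegas = np.logspace(-2, 2, 400)
print(f"min sigma_min(I + L(jw)) over grid: {min_return_difference(L_example, omegas):.3f}")
```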
Upping the "Anti-": The Value of an Anti-Racist Theoretical Framework in Music Education
ERIC Educational Resources Information Center
Hess, Juliet
2015-01-01
In a time that some have argued is "postracial" following the election and reelection of Barack Obama (see Wise 2010, for discussion), this paper argues that antiracism is a crucial theoretical framework for music education. I explore three areas of music education, in which such a framework can push toward change. The first area speaks…
Interplay of migratory and division forces as a generic mechanism for stem cell patterns
NASA Astrophysics Data System (ADS)
Hannezo, Edouard; Coucke, Alice; Joanny, Jean-François
2016-02-01
In many adult tissues, stem cells and differentiated cells are not homogeneously distributed: stem cells are arranged in periodic "niches," and differentiated cells are constantly produced and migrate out of these niches. In this article, we provide a general theoretical framework to study mixtures of dividing and actively migrating particles, which we apply to biological tissues. We show in particular that the interplay between the stresses arising from active cell migration and stem cell division gives rise to robust stem cell patterns. The instability of the tissue leads to spatial patterns which are either steady or oscillating in time. The wavelength of the instability has an order of magnitude consistent with the biological observations. We also discuss the implications of these results for future in vitro and in vivo experiments.
NASA Astrophysics Data System (ADS)
Fallarino, Lorenzo; Berger, Andreas; Binek, Christian
2015-02-01
A Landau-theoretical approach is utilized to model the magnetic field induced reversal of the antiferromagnetic order parameter in thin films of magnetoelectric antiferromagnets. A key ingredient of this peculiar switching phenomenon is the presence of a robust spin polarized state at the surface of the antiferromagnetic films. Surface or boundary magnetization is symmetry allowed in magnetoelectric antiferromagnets and experimentally established for chromia thin films. It couples rigidly to the antiferromagnetic order parameter and its Zeeman energy creates a pathway to switch the antiferromagnet via magnetic field application. In the framework of a minimalist Landau free energy expansion, the temperature dependence of the switching field and the field dependence of the transition width are derived. Least-squares fits to magnetometry data of (0001)-textured chromia thin films strongly support this model of the magnetic reversal mechanism.
Robust Decision-making Applied to Model Selection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hemez, Francois M.
2012-08-06
The scientific and engineering communities are relying more and more on numerical models to simulate ever-increasingly complex phenomena. Selecting a model, from among a family of models that meets the simulation requirements, presents a challenge to modern-day analysts. To address this concern, a framework is adopted anchored in info-gap decision theory. The framework proposes to select models by examining the trade-offs between prediction accuracy and sensitivity to epistemic uncertainty. The framework is demonstrated on two structural engineering applications by asking the following question: Which model, of several numerical models, approximates the behavior of a structure when parameters that define each of those models are unknown? One observation is that models that are nominally more accurate are not necessarily more robust, and their accuracy can deteriorate greatly depending upon the assumptions made. It is posited that, as reliance on numerical models increases, establishing robustness will become as important as demonstrating accuracy.
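A minimal one-dimensional sketch of the info-gap robustness function behind this framework: the robustness of a model is the largest uncertainty horizon alpha for which its worst-case prediction error still meets the requirement. The two candidate models, the error functions and the requirement value below are illustrative stand-ins, not the structural models of the report; they are chosen so that the nominally more accurate model turns out to be the less robust one, mirroring the observation above.

```python
import numpy as np

def robustness(worst_case_error, alphas, requirement):
    """Info-gap robustness: largest horizon alpha whose worst-case error stays acceptable."""
    feasible = [a for a in alphas if worst_case_error(a) <= requirement]
    return max(feasible) if feasible else 0.0

theta_nom = 1.0                       # nominal value of an uncertain parameter
truth_at_nominal = 2.0                # assumed reference response

def worst_err_model_A(alpha):         # accurate at the nominal, but sensitive to theta
    thetas = np.linspace(theta_nom - alpha, theta_nom + alpha, 201)
    return np.max(np.abs(2.0 * thetas ** 2 - truth_at_nominal))

def worst_err_model_B(alpha):         # cruder at the nominal, but flatter in theta
    thetas = np.linspace(theta_nom - alpha, theta_nom + alpha, 201)
    return np.max(np.abs((1.8 * thetas + 0.4) - truth_at_nominal))

alphas = np.linspace(0.0, 1.0, 101)
for name, f in [("A", worst_err_model_A), ("B", worst_err_model_B)]:
    print(f"model {name}: robustness = {robustness(f, alphas, requirement=0.5):.2f}")
```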
Ilott, Irene; Gerrish, Kate; Booth, Andrew; Field, Becky
2013-10-01
There is an international imperative to implement research into clinical practice to improve health care. Understanding the dynamics of change requires knowledge from theoretical and empirical studies. This paper presents a novel approach to testing a new meta-theoretical framework: the Consolidated Framework for Implementation Research. The utility of the Framework was evaluated using a post hoc, deductive analysis of 11 narrative accounts of innovation in health care services and practice from England, collected in 2010. A matrix, comprising the five domains and 39 constructs of the Framework, was developed to examine the coherence of the terminology, to compare results across contexts and to identify new theoretical developments. The Framework captured the complexity of implementation across 11 diverse examples, offering theoretically informed, comprehensive coverage. The Framework drew attention to relevant points in individual cases together with patterns across cases; for example, all were internally developed innovations that brought direct or indirect patient advantage. In 10 cases, the change was led by clinicians. Most initiatives had been maintained for several years and there was evidence of spread in six examples. Areas for further development within the Framework include sustainability and patient/public engagement in implementation. Our analysis suggests that this conceptual framework has the potential to offer useful insights, whether as part of a situational analysis or by developing context-specific propositions for hypothesis testing. Such studies are vital now that innovation is being promoted as core business for health care. © 2012 John Wiley & Sons Ltd.
Wuest, J
1997-01-01
While research exploring diverse groups enhances understanding of their unique perspectives and experiences, it also contributes to the exclusion of such groups from mainstream frameworks and solutions. The feminist grounded theory method allows for inclusion of marginalized groups through theoretical sensitivity to feminist theory and theoretical sampling. This paper demonstrates how this approach results in an explanatory framework that accounts for diverse realities in a study of women's caring. Fraying connections were identified as women's initial response to competing and changing caring demands. The range of dimensions and properties of fraying connections was identified through theoretical sampling guided by the emerging themes and theoretical sensitivity to issues of gender, culture, age, ability, class, and sexual orientation.
Theoretical framework to study exercise motivation for breast cancer risk reduction.
Wood, Maureen E
2008-01-01
To identify an appropriate theoretical framework to study exercise motivation for breast cancer risk reduction among high-risk women. An extensive review of the literature was conducted to gather relevant information pertaining to the Health Promotion Model, self-determination theory, social cognitive theory, Health Belief Model, Transtheoretical Model, theory of planned behavior, and protection motivation theory. An iterative approach was used to summarize the literature related to exercise motivation within each theoretical framework. Protection motivation theory could be used to examine the effects of perceived risk and self-efficacy in motivating women to exercise to facilitate health-related behavioral change. Evidence-based research within a chosen theoretical model can aid practitioners when making practical recommendations to reduce breast cancer risk.
Innovation value chain capability in Malaysian-owned company: A theoretical framework
NASA Astrophysics Data System (ADS)
Abidin, Norkisme Zainal; Suradi, Nur Riza Mohd
2014-09-01
Good quality products or services are no longer adequate to guarantee the sustainability of a company in today's competitive business environment. Prior research has developed various innovation models in the hope of better understanding the innovativeness of companies. Owing to the countless definitions, indicators, factors, parameters and approaches in the study of innovation, it is difficult to determine which will best capture the innovativeness of Malaysian-owned companies. This paper aims to provide a theoretical background to support the framework of innovation value chain capability in Malaysian-owned companies. The theoretical framework was based on literature reviews, expert interviews and a focus group study. The framework will be used to predict and assess the innovation value chain capability in Malaysian-owned companies.
A framework for biodynamic feedthrough analysis--part I: theoretical foundations.
Venrooij, Joost; van Paassen, Marinus M; Mulder, Mark; Abbink, David A; Mulder, Max; van der Helm, Frans C T; Bulthoff, Heinrich H
2014-09-01
Biodynamic feedthrough (BDFT) is a complex phenomenon, which has been studied for several decades. However, there is little consensus on how to approach the BDFT problem in terms of definitions, nomenclature, and mathematical descriptions. In this paper, a framework for biodynamic feedthrough analysis is presented. The goal of this framework is two-fold. First, it provides some common ground between the seemingly large range of different approaches existing in the BDFT literature. Second, the framework itself allows for gaining new insights into BDFT phenomena. It will be shown how relevant signals can be obtained from measurement, how different BDFT dynamics can be derived from them, and how these different dynamics are related. Using the framework, BDFT can be dissected into several dynamical relationships, each relevant in understanding BDFT phenomena in more detail. The presentation of the BDFT framework is divided into two parts. This paper, Part I, addresses the theoretical foundations of the framework. Part II, which is also published in this issue, addresses the validation of the framework. The work is presented in two separate papers to allow for a detailed discussion of both the framework's theoretical background and its validation.
Jennings, Karen M
Using a nursing theoretical framework to understand, elucidate, and propose nursing research is fundamental to knowledge development. This article presents the Roy Adaptation Model as a theoretical framework to better understand individuals with anorexia nervosa during acute treatment, and the role of nursing assessments and interventions in the promotion of weight restoration. Nursing assessments and interventions situated within the Roy Adaptation Model take into consideration how weight restoration does not occur in isolation but rather reflects an adaptive process within external and internal environments, and has the potential for more holistic care.
National Strategic Planning: Linking DIMEFIL/PMESII to a Theory of Victory
2009-04-01
The paper identifies two problems, one theoretical and one practical, and both are interlinked. The theoretical problem is the lack of a mental framework tying the desired end state (usually broadly stated) to the activities undertaken with the instruments of national power.
ERIC Educational Resources Information Center
Fransson, Goran; Holmberg, Jorgen
2012-01-01
This paper describes a self-study research project that focused on our experiences when planning, teaching, and evaluating a course in initial teacher education. The theoretical framework of technological pedagogical content knowledge (TPACK) was used as a conceptual structure for the self-study. Our understanding of the framework in relation to…
ERIC Educational Resources Information Center
Smith, Sedef Uzuner; Hayes, Suzanne; Shea, Peter
2017-01-01
After presenting a brief overview of the key elements that underpin Etienne Wenger's communities of practice (CoP) theoretical framework, one of the most widely cited and influential conceptions of social learning, this paper reviews extant empirical work grounded in this framework to investigate online/blended learning in higher education and in…
Heat-Passing Framework for Robust Interpretation of Data in Networks
Fang, Yi; Sun, Mengtian; Ramani, Karthik
2015-01-01
Researchers are regularly interested in interpreting the multipartite structure of data entities according to their functional relationships. Data is often heterogeneous with intricately hidden inner structure. With limited prior knowledge, researchers are likely to confront the problem of transforming this data into knowledge. We develop a new framework, called heat-passing, which exploits intrinsic similarity relationships within noisy and incomplete raw data, and constructs a meaningful map of the data. The proposed framework is able to rank, cluster, and visualize the data all at once. The novelty of this framework is derived from an analogy between the process of data interpretation and that of heat transfer, in which all data points contribute simultaneously and globally to reveal intrinsic similarities between regions of data, meaningful coordinates for embedding the data, and exemplar data points that lie at optimal positions for heat transfer. We demonstrate the effectiveness of the heat-passing framework for robustly partitioning the complex networks, analyzing the globin family of proteins and determining conformational states of macromolecules in the presence of high levels of noise. The results indicate that the methodology is able to reveal functionally consistent relationships in a robust fashion with no reference to prior knowledge. The heat-passing framework is very general and has the potential for applications to a broad range of research fields, for example, biological networks, social networks and semantic analysis of documents. PMID:25668316
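Not the authors' heat-passing algorithm itself, but a minimal sketch of the underlying heat-diffusion analogy on a graph, assuming networkx and scipy: the heat kernel exp(-tL) of the graph Laplacian gives diffusion-based affinities that can be used both to rank nodes and to expose community structure, which is the intuition the framework builds on.

```python
import numpy as np
import networkx as nx
from scipy.linalg import expm

def heat_kernel_affinities(G, t=1.0):
    """Diffusion-based affinity matrix H = exp(-t * L) for an undirected graph."""
    L = nx.laplacian_matrix(G).toarray().astype(float)
    return expm(-t * L)

# two loosely connected cliques; heat stays mostly within a community
G = nx.connected_caveman_graph(2, 5)
H = heat_kernel_affinities(G, t=0.5)

retention = np.diag(H)                  # heat kept at each node; low values = well connected
print("most connected nodes:", np.argsort(retention)[:5].tolist())
print("within- vs between-community affinity:", round(H[2, 3], 3), round(H[2, 7], 3))
```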
Towards a theory of PACS deployment: an integrative PACS maturity framework.
van de Wetering, Rogier; Batenburg, Ronald
2014-06-01
Owing to large financial investments that go along with the picture archiving and communication system (PACS) deployments and inconsistent PACS performance evaluations, there is a pressing need for a better understanding of the implications of PACS deployment in hospitals. We claim that there is a gap in the research field, both theoretically and empirically, to explain the success of the PACS deployment and maturity in hospitals. Theoretical principles are relevant to the PACS performance; maturity and alignment are reviewed from a system and complexity perspective. A conceptual model to explain the PACS performance and a set of testable hypotheses are then developed. Then, structural equation modeling (SEM), i.e. causal modeling, is applied to validate the model and hypotheses based on a research sample of 64 hospitals that use PACS, i.e. 70 % of all hospitals in the Netherlands. Outcomes of the SEM analyses substantiate that the measurements of all constructs are reliable and valid. The PACS alignment-modeled as a higher-order construct of five complementary organizational dimensions and maturity levels-has a significant positive impact on the PACS performance. This result is robust and stable for various sub-samples and segments. This paper presents a conceptual model that explains how alignment in deploying PACS in hospitals is positively related to the perceived performance of PACS. The conceptual model is extended with tools as checklists to systematically identify the improvement areas for hospitals in the PACS domain. The holistic approach towards PACS alignment and maturity provides a framework for clinical practice.
Feedback, Mass Conservation and Reaction Kinetics Impact the Robustness of Cellular Oscillations
Baum, Katharina; Kofahl, Bente; Steuer, Ralf; Wolf, Jana
2016-01-01
Oscillations occur in a wide variety of cellular processes, for example in calcium and p53 signaling responses, in metabolic pathways or within gene-regulatory networks, e.g. the circadian system. Since it is of central importance to understand the influence of perturbations on the dynamics of these systems a number of experimental and theoretical studies have examined their robustness. The period of circadian oscillations has been found to be very robust and to provide reliable timing. For intracellular calcium oscillations the period has been shown to be very sensitive and to allow for frequency-encoded signaling. We here apply a comprehensive computational approach to study the robustness of period and amplitude of oscillatory systems. We employ different prototype oscillator models and a large number of parameter sets obtained by random sampling. This framework is used to examine the effect of three design principles on the sensitivities towards perturbations of the kinetic parameters. We find that a prototype oscillator with negative feedback has lower period sensitivities than a prototype oscillator relying on positive feedback, but on average higher amplitude sensitivities. For both oscillator types, the use of Michaelis-Menten instead of mass action kinetics in all degradation and conversion reactions leads to an increase in period as well as amplitude sensitivities. We observe moderate changes in sensitivities if replacing mass conversion reactions by purely regulatory reactions. These insights are validated for a set of established models of various cellular rhythms. Overall, our work highlights the importance of reaction kinetics and feedback type for the variability of period and amplitude and therefore for the establishment of predictive models. PMID:28027301
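The core computation behind this kind of analysis is a sensitivity of period and amplitude to kinetic parameters. The sketch below uses central finite differences on a toy limit-cycle oscillator (the Brusselator) as a stand-in for the paper's prototype feedback models; the parameter values, step size and simulation horizon are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.signal import find_peaks

def simulate(A, B, t_end=200.0):
    """Brusselator limit-cycle oscillator; returns (period, amplitude) of x(t)."""
    rhs = lambda t, z: [A + z[0] ** 2 * z[1] - (B + 1.0) * z[0],
                        B * z[0] - z[0] ** 2 * z[1]]
    sol = solve_ivp(rhs, (0.0, t_end), [1.0, 1.0], max_step=0.05)
    t, x = sol.t, sol.y[0]
    keep = t > t_end / 2.0                      # discard the transient
    t, x = t[keep], x[keep]
    peaks, _ = find_peaks(x)
    return np.mean(np.diff(t[peaks])), x.max() - x.min()

def relative_sensitivity(param_name, base, delta=0.01):
    """d(ln output)/d(ln parameter) for period and amplitude, by central differences."""
    up, dn = dict(base), dict(base)
    up[param_name] *= 1.0 + delta
    dn[param_name] *= 1.0 - delta
    p_up, a_up = simulate(**up)
    p_dn, a_dn = simulate(**dn)
    return ((np.log(p_up) - np.log(p_dn)) / (2.0 * delta),
            (np.log(a_up) - np.log(a_dn)) / (2.0 * delta))

base = {"A": 1.0, "B": 3.0}                     # oscillatory regime (B > 1 + A^2)
print("relative sensitivity of (period, amplitude) to B:", relative_sensitivity("B", base))
```

Repeating this over many randomly sampled parameter sets, as the study does, yields the sensitivity distributions from which the feedback- and kinetics-dependent conclusions are drawn.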
Tejedor, Alejandro; Longjas, Anthony; Zaliapin, Ilya; Ambroj, Samuel; Foufoula-Georgiou, Efi
2017-08-17
Network robustness against attacks has been widely studied in fields as diverse as the Internet, power grids and human societies. But the current definition of robustness accounts for only half of the story: the connectivity of the nodes unaffected by the attack. Here we propose a new framework to assess network robustness in which the connectivity of the affected nodes is also taken into consideration, acknowledging that it plays a crucial role in properly evaluating the overall network robustness in terms of its future recovery from the attack. Specifically, we propose a dual-perspective approach wherein at any instant in the network evolution under attack, two distinct networks are defined: (i) the Active Network (AN) composed of the unaffected nodes and (ii) the Idle Network (IN) composed of the affected nodes. The proposed robustness metric considers both the efficiency of destroying the AN and that of building up the IN. We show, via analysis of well-known prototype networks and real-world data, that trade-offs between the efficiency of Active and Idle Network dynamics give rise to surprising robustness crossovers and re-rankings, which can have significant implications for decision making.
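An illustrative sketch of the dual-perspective idea, assuming networkx: after removing a set of attacked nodes, measure connectivity of both the surviving (Active) and the removed (Idle) subnetworks and combine the two views. The largest-connected-component fraction and the equal weighting used here are illustrative choices, not the metric defined in the paper.

```python
import random
import networkx as nx

def dual_robustness(G, attacked, weight=0.5):
    """Blend connectivity of the Active (unaffected) and Idle (affected) networks."""
    attacked = set(attacked)
    active = G.subgraph(n for n in G if n not in attacked)
    idle = G.subgraph(attacked)

    def lcc_frac(H):
        # relative size of the largest connected component of the induced subgraph
        return max((len(c) for c in nx.connected_components(H)), default=0) / max(len(H), 1)

    return weight * lcc_frac(active) + (1.0 - weight) * lcc_frac(idle)

G = nx.barabasi_albert_graph(200, 2, seed=0)
hubs = sorted(G, key=G.degree, reverse=True)[:20]          # targeted attack on hubs
random.seed(1)
rand_nodes = random.sample(list(G), 20)                    # random attack of the same size
print("hub attack:", round(dual_robustness(G, hubs), 3))
print("random attack:", round(dual_robustness(G, rand_nodes), 3))
```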
A Theoretical Framework for Studying Adolescent Contraceptive Use.
ERIC Educational Resources Information Center
Urberg, Kathryn A.
1982-01-01
Presents a theoretical framework for viewing adolescent contraceptive usage. The problem-solving process is used for developmentally examining the competencies that must be present for effective contraceptive use, including: problem recognition, motivation, generation of alternatives, decision making and implementation. Each aspect is discussed…
Toward high-energy-density, high-efficiency, and moderate-temperature chip-scale thermophotovoltaics
Chan, Walker R.; Bermel, Peter; Pilawa-Podgurski, Robert C. N.; Marton, Christopher H.; Jensen, Klavs F.; Senkevich, Jay J.; Joannopoulos, John D.; Soljačić, Marin; Celanovic, Ivan
2013-01-01
The challenging problem of ultra-high-energy-density, high-efficiency, and small-scale portable power generation is addressed here using a distinctive thermophotovoltaic energy conversion mechanism and chip-based system design, which we name the microthermophotovoltaic (μTPV) generator. The approach is predicted to be capable of up to 32% efficient heat-to-electricity conversion within a millimeter-scale form factor. Although considerable technological barriers need to be overcome to reach full performance, we have performed a robust experimental demonstration that validates the theoretical framework and the key system components. Even with a much-simplified μTPV system design with theoretical efficiency prediction of 2.7%, we experimentally demonstrate 2.5% efficiency. The μTPV experimental system that was built and tested comprises a silicon propane microcombustor, an integrated high-temperature photonic crystal selective thermal emitter, four 0.55-eV GaInAsSb thermophotovoltaic diodes, and an ultra-high-efficiency maximum power-point tracking power electronics converter. The system was demonstrated to operate up to 800 °C (silicon microcombustor temperature) with an input thermal power of 13.7 W, generating 344 mW of electric power over a 1-cm2 area. PMID:23440220
Information flow dynamics in the brain
NASA Astrophysics Data System (ADS)
Rabinovich, Mikhail I.; Afraimovich, Valentin S.; Bick, Christian; Varona, Pablo
2012-03-01
Timing and dynamics of information in the brain is a hot field in modern neuroscience. The analysis of the temporal evolution of brain information is crucially important for the understanding of higher cognitive mechanisms in normal and pathological states. From the perspective of information dynamics, in this review we discuss working memory capacity, language dynamics, goal-dependent behavior programming and other functions of brain activity. In contrast with the classical description of information theory, which is mostly algebraic, brain flow information dynamics deals with problems such as the stability/instability of information flows, their quality, the timing of sequential processing, the top-down cognitive control of perceptual information, and information creation. In this framework, different types of information flow instabilities correspond to different cognitive disorders. On the other hand, the robustness of cognitive activity is related to the control of the information flow stability. We discuss these problems using both experimental and theoretical approaches, and we argue that brain activity is better understood considering information flows in the phase space of the corresponding dynamical model. In particular, we show how theory helps to understand intriguing experimental results in this matter, and how recent knowledge inspires new theoretical formalisms that can be tested with modern experimental techniques.
GLOBALLY ADAPTIVE QUANTILE REGRESSION WITH ULTRA-HIGH DIMENSIONAL DATA
Zheng, Qi; Peng, Limin; He, Xuming
2015-01-01
Quantile regression has become a valuable tool to analyze heterogeneous covariate-response associations that are often encountered in practice. The development of quantile regression methodology for high dimensional covariates primarily focuses on examination of model sparsity at a single or multiple quantile levels, which are typically prespecified ad hoc by the users. The resulting models may be sensitive to the specific choices of the quantile levels, leading to difficulties in interpretation and erosion of confidence in the results. In this article, we propose a new penalization framework for quantile regression in the high dimensional setting. We employ adaptive L1 penalties, and more importantly, propose a uniform selector of the tuning parameter for a set of quantile levels to avoid some of the potential problems with model selection at individual quantile levels. Our proposed approach achieves consistent shrinkage of regression quantile estimates across a continuous range of quantile levels, enhancing the flexibility and robustness of the existing penalized quantile regression methods. Our theoretical results include the oracle rate of uniform convergence and weak convergence of the parameter estimators. We also use numerical studies to confirm our theoretical findings and illustrate the practical utility of our proposal. PMID:26604424
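For orientation, scikit-learn's QuantileRegressor fits an L1-penalized quantile regression; sharing one penalty level across a set of quantile levels, as in the sketch below, loosely mirrors the idea of a uniform tuning-parameter selector. The adaptive weights and the paper's actual uniform selection rule are not reproduced here, and the data and alpha value are illustrative.

```python
import numpy as np
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.normal(size=(n, p))
# heteroscedastic response: only x0 shifts the location, x1 widens the tails
y = 2.0 * X[:, 0] + (1.0 + np.abs(X[:, 1])) * rng.normal(size=n)

taus = [0.1, 0.25, 0.5, 0.75, 0.9]
alpha = 0.05                                   # one shared penalty level for all quantiles
for tau in taus:
    fit = QuantileRegressor(quantile=tau, alpha=alpha, solver="highs").fit(X, y)
    active = np.flatnonzero(np.abs(fit.coef_) > 1e-8)
    print(f"tau={tau:4.2f}  selected covariates: {active.tolist()}")
```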
Connell, Louise A; McMahon, Naoimh E; Tyson, Sarah F; Watkins, Caroline L; Eng, Janice J
2016-09-30
Despite best evidence demonstrating the effectiveness of increased intensity of exercise after stroke, current levels of therapy continue to be below those required to optimise motor recovery. We developed and tested an implementation intervention that aims to increase arm exercise in stroke rehabilitation. The aim of this study was to illustrate the use of a behaviour change framework, the Behaviour Change Wheel, to identify the mechanisms of action that explain how the intervention produced change. We implemented the intervention at three stroke rehabilitation units in the United Kingdom. A purposive sample of therapy team members were recruited to participate in semi-structured interviews to explore their perceptions of how the intervention produced change at their work place. Audio recordings were transcribed and imported into NVivo 10 for content analysis. Two coders separately analysed the transcripts and coded emergent mechanisms. Mechanisms were categorised using the Theoretical Domains Framework (TDF) (an extension of the Capability, Opportunity, Motivation and Behaviour model (COM-B) at the hub of the Behaviour Change Wheel). We identified five main mechanisms of action: 'social/professional role and identity', 'intentions', 'reinforcement', 'behavioural regulation' and 'beliefs about consequences'. At the outset, participants viewed the research team as an external influence for whom they endeavoured to complete the study activities. The study design, with a focus on implementation in real world settings, influenced participants' intentions to implement the intervention components. Monthly meetings between the research and therapy teams were central to the intervention and acted as prompt or reminder to sustain implementation. The phased approach to introducing and implementing intervention components influenced participants' beliefs about the feasibility of implementation. The Behaviour Change Wheel, and in particular the Theoretical Domains Framework, were used to investigate mechanisms of action of an implementation intervention. This approach allowed for consideration of a range of possible mechanisms, and allowed us to categorise these mechanisms using an established behaviour change framework. Identification of the mechanisms of action, following testing of the intervention in a number of settings, has resulted in a refined and more robust intervention programme theory for future testing.
The Dynamic Multiprocess Framework: Evidence from Prospective Memory with Contextual Variability
Scullin, Michael K.; McDaniel, Mark A.; Shelton, Jill Talley
2013-01-01
The ability to remember to execute delayed intentions is referred to as prospective memory. Previous theoretical and empirical work has focused on isolating whether a particular prospective memory task is supported either by effortful monitoring processes or by cue-driven spontaneous processes. In the present work, we advance the Dynamic Multiprocess Framework, which contends that both monitoring and spontaneous retrieval may be utilized dynamically to support prospective remembering. To capture the dynamic interplay between monitoring and spontaneous retrieval we had participants perform many ongoing tasks and told them that their prospective memory cue may occur in any context. Following either a 20-min or a 12-hr retention interval, the prospective memory cues were presented infrequently across three separate ongoing tasks. The monitoring patterns (measured as ongoing task cost relative to a between-subjects control condition) were consistent and robust across the three contexts. There was no evidence for monitoring prior to the initial prospective memory cue; however, individuals who successfully spontaneously retrieved the prospective memory intention, thereby realizing that prospective memory cues could be expected within that context, subsequently monitored. These data support the Dynamic Multiprocess Framework, which contends that individuals will engage monitoring when prospective memory cues are expected, disengage monitoring when cues are not expected, and that when monitoring is disengaged, a probabilistic spontaneous retrieval mechanism can support prospective remembering. PMID:23916951
Bowleg, Lisa
2012-07-01
Intersectionality is a theoretical framework that posits that multiple social categories (e.g., race, ethnicity, gender, sexual orientation, socioeconomic status) intersect at the micro level of individual experience to reflect multiple interlocking systems of privilege and oppression at the macro, social-structural level (e.g., racism, sexism, heterosexism). Public health's commitment to social justice makes it a natural fit with intersectionality's focus on multiple historically oppressed populations. Yet despite a plethora of research focused on these populations, public health studies that reflect intersectionality in their theoretical frameworks, designs, analyses, or interpretations are rare. Accordingly, I describe the history and central tenets of intersectionality, address some theoretical and methodological challenges, and highlight the benefits of intersectionality for public health theory, research, and policy. PMID:22594719
Adopting Health Behavior Change Theory throughout the Clinical Practice Guideline Process
ERIC Educational Resources Information Center
Ceccato, Natalie E.; Ferris, Lorraine E.; Manuel, Douglas; Grimshaw, Jeremy M.
2007-01-01
Adopting a theoretical framework throughout the clinical practice guideline (CPG) process (development, dissemination, implementation, and evaluation) can be useful in systematically identifying, addressing, and explaining behavioral influences impacting CPG uptake and effectiveness. This article argues that using a theoretical framework should…
Fly-by-feel aeroservoelasticity
NASA Astrophysics Data System (ADS)
Suryakumar, Vishvas Samuel
Recent experiments have suggested a strong correlation between local flow features on the airfoil surface, such as the leading edge stagnation point (LESP), transition or the flow separation point, and global integrated quantities such as aerodynamic lift. "Fly-By-Feel" refers to a physics-based sensing and control framework where local flow features are tracked in real-time to determine aerodynamic loads. This formulation offers possibilities for the development of robust, low-order flight control architectures. An essential contribution towards this objective is the theoretical development showing the direct relationship of the LESP with circulation for small-amplitude, unsteady, airfoil maneuvers. The theory is validated through numerical simulations and wind tunnel tests. With the availability of an aerodynamic observable, a low-order, energy-based control formulation is derived for aeroelastic stabilization and gust load alleviation. The sensing and control framework is implemented on the Nonlinear Aeroelastic Test Apparatus at Texas A&M University. The LESP is located using hot-film sensors distributed around the wing leading edge. Stabilization of limit cycle oscillations exhibited by a nonlinear wing section is demonstrated in the presence of gusts. Aeroelastic stabilization is also demonstrated on a flying wing configuration exhibiting body freedom flutter through numerical simulations.
Pairwise registration of TLS point clouds using covariance descriptors and a non-cooperative game
NASA Astrophysics Data System (ADS)
Zai, Dawei; Li, Jonathan; Guo, Yulan; Cheng, Ming; Huang, Pengdi; Cao, Xiaofei; Wang, Cheng
2017-12-01
It is challenging to automatically register TLS point clouds with noise, outliers and varying overlap. In this paper, we propose a new method for pairwise registration of TLS point clouds. We first generate covariance matrix descriptors with an adaptive neighborhood size from point clouds to find candidate correspondences, then construct a non-cooperative game to isolate mutually compatible correspondences, which are considered as true positives. The method was tested on three models acquired by two different TLS systems. Experimental results demonstrate that our proposed adaptive covariance (ACOV) descriptor is invariant to rigid transformation and robust to noise and varying resolutions. The average registration errors achieved on three models are 0.46 cm, 0.32 cm and 1.73 cm, respectively. The computational costs on these models are about 288 s, 184 s and 903 s, respectively. Moreover, our registration framework using ACOV descriptors and a game theoretic method is superior to the state-of-the-art methods in terms of both registration error and computational time. The experiment on a large outdoor scene further demonstrates the feasibility and effectiveness of our proposed pairwise registration framework.
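As a rough illustration of the descriptor side of this pipeline (the adaptive neighborhood sizing and the game-theoretic matching stage are not shown), the sketch below builds a covariance descriptor from simple rotation-invariant per-point features of a neighborhood and compares descriptors with a log-Euclidean distance. The chosen features are illustrative stand-ins, not the feature set behind the paper's ACOV descriptor.

```python
import numpy as np

def covariance_descriptor(points):
    """Covariance of simple rotation-invariant per-point features in a 3-D neighborhood."""
    centered = points - points.mean(axis=0)
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)   # local principal axes
    d = np.linalg.norm(centered, axis=1)                      # distance to centroid
    unit = centered / np.maximum(d[:, None], 1e-12)
    # |cosine| to the first/last principal axes is unchanged by rigid motion of the patch
    feats = np.column_stack([d, np.abs(unit @ Vt[0]), np.abs(unit @ Vt[-1])])
    c = feats - feats.mean(axis=0)
    return c.T @ c / (len(points) - 1)

def descriptor_distance(C1, C2, eps=1e-9):
    """Log-Euclidean distance between two symmetric positive (semi)definite descriptors."""
    def logm(C):
        w, V = np.linalg.eigh(C + eps * np.eye(C.shape[0]))
        return V @ np.diag(np.log(w)) @ V.T
    return np.linalg.norm(logm(C1) - logm(C2), ord="fro")

rng = np.random.default_rng(0)
patch = rng.normal(size=(80, 3)) * np.array([2.0, 0.5, 0.1])   # flat, elongated patch
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])
C_a = covariance_descriptor(patch)
C_b = covariance_descriptor(patch @ R.T + np.array([5.0, -2.0, 1.0]))  # rotated + translated scan
print("descriptor distance after rigid motion:", round(descriptor_distance(C_a, C_b), 6))
```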
Shemesh, Noam; Ozarslan, Evren; Basser, Peter J; Cohen, Yoram
2010-01-21
NMR observable nuclei undergoing restricted diffusion within confining pores are important reporters for microstructural features of porous media including, inter-alia, biological tissues, emulsions and rocks. Diffusion NMR, and especially the single-pulsed field gradient (s-PFG) methodology, is one of the most important noninvasive tools for studying such opaque samples, enabling extraction of important microstructural information from diffusion-diffraction phenomena. However, when the pores are not monodisperse and are characterized by a size distribution, the diffusion-diffraction patterns disappear from the signal decay, and the relevant microstructural information is mostly lost. A recent theoretical study predicted that the diffusion-diffraction patterns in double-PFG (d-PFG) experiments have unique characteristics, such as zero-crossings, that make them more robust with respect to size distributions. In this study, we theoretically compared the signal decay arising from diffusion in isolated cylindrical pores characterized by lognormal size distributions in both s-PFG and d-PFG methodologies using a recently presented general framework for treating diffusion in NMR experiments. We showed the gradual loss of diffusion-diffraction patterns in broadening size distributions in s-PFG and the robustness of the zero-crossings in d-PFG even for very large standard deviations of the size distribution. We then performed s-PFG and d-PFG experiments on well-controlled size distribution phantoms in which the ground-truth is well-known a priori. We showed that the microstructural information, as manifested in the diffusion-diffraction patterns, is lost in the s-PFG experiments, whereas in d-PFG experiments the zero-crossings of the signal persist from which relevant microstructural information can be extracted. This study provides a proof of concept that d-PFG may be useful in obtaining important microstructural features in samples characterized by size distributions.
Petrini, Carlo
2015-01-01
The "Framework for the Ethical Conduct of Public Health Initiatives", developed by Public Health Ontario, is a practical guide for assessing the ethical implications of evidence-generating public health initiatives, whether research or non-research activities, involving people, their biological materials or their personal information. The Framework is useful not only to those responsible for determining the ethical acceptability of an initiative, but also to investigators planning new public health initiatives. It is informed by a theoretical approach that draws on widely shared bioethical principles. Two considerations emerge from both the theoretical framework and its practical application: the line between practice and research is often blurred; public health ethics and biomedical research ethics are based on the same common heritage of values.
The role of language in learning physics
NASA Astrophysics Data System (ADS)
Brookes, David T.
Many studies in PER suggest that language poses a serious difficulty for students learning physics. These difficulties are mostly attributed to misunderstanding of specialized terminology. This terminology often assigns new meanings to everyday terms used to describe physical models and phenomena. In this dissertation I present a novel approach to analyzing the role of language in learning physics. This approach is based on the analysis of the historical development of physics ideas, the language of modern physicists, and students' difficulties in the areas of quantum mechanics, classical mechanics, and thermodynamics. These data are analyzed using linguistic tools borrowed from cognitive linguistics and systemic functional grammar. Specifically, I combine the idea of conceptual metaphor and grammar to build a theoretical framework that accounts for: (1) the role and function that language serves for physicists when they speak and reason about physical ideas and phenomena, (2) specific features of students' reasoning and difficulties that may be related to or derived from language that students read or hear. The theoretical framework is developed using the methodology of a grounded theoretical approach. The theoretical framework allows us to make predictions about the relationship between student discourse and their conceptual and problem solving difficulties. Tests of the theoretical framework are presented in the context of "heat" in thermodynamics and "force" in dynamics. In each case the language that students use to reason about the concepts of "heat" and "force" is analyzed using the theoretical framework. The results of this analysis show that language is very important in students' learning. In particular, students are (1) using features of physicists' conceptual metaphors to reason about physical phenomena, often overextending and misapplying these features, (2) drawing cues from the grammar of physicists' speech and writing to categorize physics concepts; this categorization of physics concepts plays a key role in students' ability to solve physics problems. In summary, I present a theoretical framework that provides a possible explanation of the role that language plays in learning physics. The framework also attempts to account for how and why physicists' language influences students in the way that it does.
The CRISP theory of hippocampal function in episodic memory
Cheng, Sen
2013-01-01
Over the past four decades, a “standard framework” has emerged to explain the neural mechanisms of episodic memory storage. This framework has been instrumental in driving hippocampal research forward and now dominates the design and interpretation of experimental and theoretical studies. It postulates that cortical inputs drive plasticity in the recurrent cornu ammonis 3 (CA3) synapses to rapidly imprint memories as attractor states in CA3. Here we review a range of experimental studies and argue that the evidence against the standard framework is mounting, notwithstanding the considerable evidence in its support. We propose CRISP as an alternative theory to the standard framework. CRISP is based on Context Reset by dentate gyrus (DG), Intrinsic Sequences in CA3, and Pattern completion in cornu ammonis 1 (CA1). Compared to previous models, CRISP uses a radically different mechanism for storing episodic memories in the hippocampus. Neural sequences are intrinsic to CA3, and inputs are mapped onto these intrinsic sequences through synaptic plasticity in the feedforward projections of the hippocampus. Hence, CRISP does not require plasticity in the recurrent CA3 synapses during the storage process. Like in other theories, DG and CA1 play supporting roles; however, their functions in CRISP have distinct implications. For instance, CA1 performs pattern completion in the absence of CA3, and DG contributes to episodic memory retrieval, increasing the speed, precision, and robustness of retrieval. We propose the conceptual theory, discuss its implications for experimental results and suggest testable predictions. It appears that CRISP not only accounts for those experimental results that are consistent with the standard framework, but also for results that are at odds with the standard framework. We therefore suggest that CRISP is a viable, and perhaps superior, theory for the hippocampal function in episodic memory. PMID:23653597
Effective real-time vehicle tracking using discriminative sparse coding on local patches
NASA Astrophysics Data System (ADS)
Chen, XiangJun; Ye, Feiyue; Ruan, Yaduan; Chen, Qimei
2016-01-01
A visual tracking framework that provides an object detector and tracker, which focuses on effective and efficient visual tracking in surveillance for real-world intelligent transport system applications, is proposed. The framework casts the tracking task as problems of object detection, feature representation, and classification, which is different from appearance model-matching approaches. Through a feature representation of discriminative sparse coding on local patches, called DSCLP, which trains a dictionary on local clustered patches sampled from both positive and negative datasets, the discriminative power and robustness have been improved remarkably, which makes our method more robust in complex, realistic settings with various kinds of degraded image quality. Moreover, by catching objects through one-time background subtraction, along with offline dictionary training, computation time is dramatically reduced, which enables our framework to achieve real-time tracking performance even in a high-definition sequence with heavy traffic. Experimental results show that our work outperforms some state-of-the-art methods in terms of speed, accuracy, and robustness and exhibits increased robustness in a complex real-world scenario with degraded image quality caused by vehicle occlusion, image blur of rain or fog, and change in viewpoint or scale.
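An illustrative sketch of the general recipe behind this kind of representation, assuming scikit-learn and synthetic stand-in patches: learn a dictionary on patches drawn from both positive and negative examples, sparse-code the patches, and feed the codes to a linear classifier. This is not the exact DSCLP formulation, and the patch statistics and hyperparameters are assumptions.

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning, sparse_encode
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
patch_dim, n_per_class = 64, 300                         # e.g. flattened 8x8 grey patches
pos = rng.normal(0.6, 1.0, (n_per_class, patch_dim))     # stand-in "vehicle" patches
neg = rng.normal(0.0, 1.0, (n_per_class, patch_dim))     # stand-in "background" patches
X = np.vstack([pos, neg])
y = np.array([1] * n_per_class + [0] * n_per_class)

# dictionary trained on patches from both classes, then sparse codes as features
dico = MiniBatchDictionaryLearning(n_components=32, alpha=1.0, random_state=0).fit(X)
codes = sparse_encode(X, dico.components_, algorithm="lasso_lars", alpha=1.0)

clf = LogisticRegression(max_iter=1000).fit(codes, y)
print("training accuracy on sparse codes:", round(clf.score(codes, y), 3))
```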
Understanding the Role of Numeracy in Health: Proposed Theoretical Framework and Practical Insights
Lipkus, Isaac M.; Peters, Ellen
2009-01-01
Numeracy, that is how facile people are with mathematical concepts and their applications, is gaining importance in medical decision making and risk communication. This paper proposes six critical functions of health numeracy. These functions are integrated into a theoretical framework on health numeracy that has implications for risk-communication and medical-decision-making processes. We examine practical underpinnings for targeted interventions aimed at improving such processes as a function of health numeracy. It is hoped that the proposed functions and theoretical framework will spur more research to determine how an understanding of health numeracy can lead to more effective communication and decision outcomes. PMID:19834054
Towards a Theoretical Framework for Educational Simulations.
ERIC Educational Resources Information Center
Winer, Laura R.; Vazquez-Abad, Jesus
1981-01-01
Discusses the need for a sustained and systematic effort toward establishing a theoretical framework for educational simulations, proposes the adaptation of models borrowed from the natural and applied sciences, and describes three simulations based on such a model adapted using Brunerian learning theory. Sixteen references are listed. (LLS)
Evolution or Revolution: Mobility Requirements for the AirLand Battle Future Concept
1991-02-20
analysis and the model, a theoretical framework for tactical mobility is established. The considerations for tactical mobility on the future battlefield are...examined in the context of the theoretical framework. Finally, using the criteria of sufficiency, feasibility, and the time/space continuum, the
2008-11-01
is particularly important in order to design a network that is realistically deployable. The goal of this project is the design of a theoretical ... framework to assess and predict the effectiveness and performance of networks and their loads.
School District Organization and Student Dropout.
ERIC Educational Resources Information Center
Engelhard, George, Jr.
The purpose of this study was to develop and test a theoretical framework that would examine the structural relationships between select organizational and environmental variables and school district effectiveness in Michigan. The theoretical framework was derived from organizational theory and represents a social-ecological approach to the study…
Educational Communities of Inquiry: Theoretical Framework, Research and Practice
ERIC Educational Resources Information Center
Akyol, Zehra; Garrison, D. Randy
2013-01-01
Communications technologies have been continuously integrated into learning and training environments, which has revealed the need for a clear understanding of the process. The Community of Inquiry (COI) Theoretical Framework has a philosophical foundation which provides planned guidelines and principles to develop useful learning environments…
Exploring How Globalization Shapes Education: Methodology and Theoretical Framework
ERIC Educational Resources Information Center
Pan, Su-Yan
2010-01-01
This is a commentary on some major issues raised in Carter and Dediwalage's "Globalisation and science education: The case of "Sustainability by the bay"" (this issue), particularly their methodology and theoretical framework for understanding how globalisation shapes education (including science education). While acknowledging the authors'…
A Computational Framework to Control Verification and Robustness Analysis
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.
2010-01-01
This paper presents a methodology for evaluating the robustness of a controller based on its ability to satisfy the design requirements. The framework proposed is generic since it allows for high-fidelity models, arbitrary control structures and arbitrary functional dependencies between the requirements and the uncertain parameters. The cornerstone of this contribution is the ability to bound the region of the uncertain parameter space where the degradation in closed-loop performance remains acceptable. The size of this bounding set, whose geometry can be prescribed according to deterministic or probabilistic uncertainty models, is a measure of robustness. The robustness metrics proposed herein are the parametric safety margin, the reliability index, the failure probability and upper bounds to this probability. The performance observed at the control verification setting, where the assumptions and approximations used for control design may no longer hold, will fully determine the proposed control assessment.
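As a rough illustration of estimating one of these metrics, the failure probability, by sampling the uncertain parameter space, consider this hypothetical sketch; a scalar toy plant and a stability requirement stand in for the paper's high-fidelity models, requirement functions, and bounding-set construction.

```python
# Hypothetical sketch only: Monte Carlo estimation of a failure probability over
# an uncertain parameter box.
import numpy as np

rng = np.random.default_rng(1)

def violates(a, b, k=0.8):
    """Requirement: the closed-loop pole of x[t+1] = a*x[t] + b*u[t], u = -k*x
    must stay strictly inside the unit circle (discrete-time stability)."""
    return abs(a - b * k) >= 1.0

# Uncertainty model: plant parameters vary uniformly about the nominal (a, b) = (1.2, 0.5).
samples = rng.uniform([1.0, 0.3], [1.4, 0.7], size=(5000, 2))
p_fail = np.mean([violates(a, b) for a, b in samples])
print(f"estimated failure probability: {p_fail:.3f}")
# A bounding-set analysis in the spirit of the paper would instead search for
# the largest region around the nominal parameters that contains no violation.
```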
NASA Astrophysics Data System (ADS)
Herman, Jonathan D.; Zeff, Harrison B.; Reed, Patrick M.; Characklis, Gregory W.
2014-10-01
While optimality is a foundational mathematical concept in water resources planning and management, "optimal" solutions may be vulnerable to failure if deeply uncertain future conditions deviate from those assumed during optimization. These vulnerabilities may produce severely asymmetric impacts across a region, making it vital to evaluate the robustness of management strategies as well as their impacts for regional stakeholders. In this study, we contribute a multistakeholder many-objective robust decision making (MORDM) framework that blends many-objective search and uncertainty analysis tools to discover key tradeoffs between water supply alternatives and their robustness to deep uncertainties (e.g., population pressures, climate change, and financial risks). The proposed framework is demonstrated for four interconnected water utilities representing major stakeholders in the "Research Triangle" region of North Carolina, U.S. The utilities supply well over one million customers and have the ability to collectively manage drought via transfer agreements and shared infrastructure. We show that water portfolios for this region that compose optimal tradeoffs (i.e., Pareto-approximate solutions) under expected future conditions may suffer significantly degraded performance with only modest changes in deeply uncertain hydrologic and economic factors. We then use the Patient Rule Induction Method (PRIM) to identify which uncertain factors drive the individual and collective vulnerabilities for the four cooperating utilities. Our framework identifies key stakeholder dependencies and robustness tradeoffs associated with cooperative regional planning, which are critical to understanding the tensions between individual versus regional water supply goals. Cooperative demand management was found to be the key factor controlling the robustness of regional water supply planning, dominating other hydroclimatic and economic uncertainties through the 2025 planning horizon. Results suggest that a modest reduction in the projected rate of demand growth (from approximately 3% per year to 2.4%) will substantially improve the utilities' robustness to future uncertainty and reduce the potential for regional tensions. The proposed multistakeholder MORDM framework offers critical insights into the risks and challenges posed by rising water demands and hydrological uncertainties, providing a planning template for regions now forced to confront rapidly evolving water scarcity risks.
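A heavily simplified sketch of the scenario re-evaluation step behind such robustness estimates might look as follows; the toy capacity/demand portfolio model, the satisficing threshold, and the sampled growth rates are illustrative assumptions, not the study's coupled utility simulations, many-objective search, or PRIM analysis.

```python
# Hypothetical sketch only: scenario re-evaluation in a MORDM-style robustness analysis.
import numpy as np

rng = np.random.default_rng(7)

def reliability(capacity, demand_growth, years=15, base_demand=1.0):
    """Fraction of years in which capacity covers a demand growing at the sampled rate."""
    demand = base_demand * (1.0 + demand_growth) ** np.arange(years)
    return np.mean(demand <= capacity)

# Candidate portfolios (e.g., Pareto-approximate under expected conditions),
# summarized here by a single installed-capacity number.
portfolios = {"status quo": 1.3, "modest expansion": 1.6, "large expansion": 2.0}

# Deeply uncertain states of the world: annual demand growth between 1% and 4%.
growth_samples = rng.uniform(0.01, 0.04, size=1000)

for name, cap in portfolios.items():
    robust_frac = np.mean([reliability(cap, g) >= 0.98 for g in growth_samples])
    print(f"{name:17s} meets the reliability target in {robust_frac:.0%} of sampled futures")
```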
An Expanded Theoretical Framework of Care Coordination Across Transitions in Care Settings.
Radwin, Laurel E; Castonguay, Denise; Keenan, Carolyn B; Hermann, Cherice
2016-01-01
For many patients, high-quality, patient-centered, and cost-effective health care requires coordination among multiple clinicians and settings. Ensuring optimal care coordination requires a clear understanding of how clinician activities and continuity during transitions affect patient-centeredness and quality outcomes. This article describes an expanded theoretical framework to better understand care coordination. The framework provides clear articulation of concepts. Examples are provided of ways to measure the concepts.
Fabrication of slender elastic shells by the coating of curved surfaces
NASA Astrophysics Data System (ADS)
Lee, A.; Brun, P.-T.; Marthelot, J.; Balestra, G.; Gallaire, F.; Reis, P. M.
2016-04-01
Various manufacturing techniques exist to produce double-curvature shells, including injection, rotational and blow molding, as well as dip coating. However, these industrial processes are typically geared for mass production and are not directly applicable to laboratory research settings, where adaptable, inexpensive and predictable prototyping tools are desirable. Here, we study the rapid fabrication of hemispherical elastic shells by coating a curved surface with a polymer solution that yields a nearly uniform shell, upon polymerization of the resulting thin film. We experimentally characterize how the curing of the polymer affects its drainage dynamics and eventually selects the shell thickness. The coating process is then rationalized through a theoretical analysis that predicts the final thickness, in quantitative agreement with experiments and numerical simulations of the lubrication flow field. This robust fabrication framework should be invaluable for future studies on the mechanics of thin elastic shells and their intrinsic geometric nonlinearities.
Redundant imprinting of information in non-ideal environments: Quantum Darwinism via a noisy channel
NASA Astrophysics Data System (ADS)
Zwolak, Michael; Quan, Haitao; Zurek, Wojciech
2011-03-01
Quantum Darwinism provides an information-theoretic framework for the emergence of the classical world from the quantum substrate. It recognizes that we - the observers - acquire our information about the ``systems of interest'' indirectly from their imprints on the environment. Objectivity, a key property of the classical world, arises via the proliferation of redundant information into the environment where many observers can then intercept it and independently determine the state of the system. While causing a system to decohere, environments that remain nearly invariant under the Hamiltonian dynamics, such as very mixed states, have a diminished ability to transmit information about the system, yet can still acquire redundant information about the system [1,2]. Our results show that Quantum Darwinism is robust with respect to non-ideal initial states of the environment. This research is supported by the U.S. Department of Energy through the LANL/LDRD Program.
A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules
NASA Astrophysics Data System (ADS)
Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.
2012-08-01
Integrating economic and groundwater models for groundwater-management can help improve understanding of trade-offs involved between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
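For illustration, the block bootstrap step mentioned above can be sketched in a few lines of Python; the block length and the synthetic AR(1) record are assumptions made only for this example.

```python
# Hypothetical sketch only: moving-block bootstrap resampling of a time series.
import numpy as np

rng = np.random.default_rng(0)

def block_bootstrap(series, block_len, n_resamples):
    """Concatenate randomly chosen contiguous blocks, preserving short-range
    temporal dependence within each block."""
    n = len(series)
    starts = np.arange(n - block_len + 1)
    n_blocks = int(np.ceil(n / block_len))
    out = []
    for _ in range(n_resamples):
        chosen = rng.choice(starts, size=n_blocks, replace=True)
        out.append(np.concatenate([series[s:s + block_len] for s in chosen])[:n])
    return np.array(out)

# Toy AR(1) series standing in for, e.g., an annual recharge or price record.
x = np.zeros(200)
for t in range(1, 200):
    x[t] = 0.7 * x[t - 1] + rng.normal()

means = block_bootstrap(x, block_len=10, n_resamples=500).mean(axis=1)
print("95% block-bootstrap interval for the mean:", np.percentile(means, [2.5, 97.5]).round(2))
```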
A distributed model predictive control scheme for leader-follower multi-agent systems
NASA Astrophysics Data System (ADS)
Franzè, Giuseppe; Lucia, Walter; Tedesco, Francesco
2018-02-01
In this paper, we present a novel receding horizon control scheme for solving the formation problem of leader-follower configurations. The algorithm is based on set-theoretic ideas and is tuned for agents described by linear time-invariant (LTI) systems subject to input and state constraints. The novelty of the proposed framework relies on the capability to jointly use sequences of one-step controllable sets and polyhedral piecewise state-space partitions in order to online apply the 'better' control action in a distributed receding horizon fashion. Moreover, we prove that the design of both robust positively invariant sets and one-step-ahead controllable regions is achieved in a distributed sense. Simulations and numerical comparisons with respect to centralised and local-based strategies are finally performed on a group of mobile robots to demonstrate the effectiveness of the proposed control strategy.
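As a toy illustration of the one-step controllable sets that such schemes chain together, the following hypothetical sketch computes them for a single scalar constrained agent, ignoring coupling, disturbances, and the leader-follower structure of the paper.

```python
# Hypothetical sketch only: one-step controllable sets for x+ = a*x + b*u, |u| <= u_max.
def one_step_controllable_interval(a, b, u_max, target):
    """States from which some admissible input drives the state into the target
    interval in one step (assumes a > 0 and b > 0 in this scalar toy)."""
    t_lo, t_hi = target
    return (t_lo - b * u_max) / a, (t_hi + b * u_max) / a

a, b, u_max = 1.2, 0.5, 1.0
sets = [(-0.2, 0.2)]                      # small terminal (invariant-like) interval
for _ in range(4):                        # backward recursion of controllable sets
    sets.append(one_step_controllable_interval(a, b, u_max, sets[-1]))

for k, (lo, hi) in enumerate(sets):
    print(f"controllable set {k}: [{lo:.2f}, {hi:.2f}]")
```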
Performance analysis of improved iterated cubature Kalman filter and its application to GNSS/INS.
Cui, Bingbo; Chen, Xiyuan; Xu, Yuan; Huang, Haoqian; Liu, Xiao
2017-01-01
In order to improve the accuracy and robustness of the GNSS/INS navigation system, an improved iterated cubature Kalman filter (IICKF) is proposed by considering state-dependent noise and system uncertainty. First, a simplified framework for the iterated Gaussian filter is derived using a damped Newton-Raphson algorithm and an online noise estimator. Then the effect of state-dependent noise arising from the iterated update is analyzed theoretically, and an augmented form of the CKF algorithm is applied to improve the estimation accuracy. The performance of IICKF is verified by field test and numerical simulation, and results reveal that, compared with the non-iterated filter, the iterated filter is less sensitive to system uncertainty, and IICKF improves the accuracy of yaw, roll and pitch by 48.9%, 73.1% and 83.3%, respectively, compared with the traditional iterated KF. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
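For orientation, the cubature-point propagation that a CKF prediction step rests on can be sketched as follows; the transition model is an assumption for this example, and the paper's iterated, damped measurement update and online noise estimator are not reproduced.

```python
# Hypothetical sketch only: third-degree cubature-point propagation (CKF prediction step).
import numpy as np

def cubature_points(mean, cov):
    """2n points at mean +/- sqrt(n) * (Cholesky columns), with equal weights."""
    n = len(mean)
    S = np.linalg.cholesky(cov)
    offsets = np.sqrt(n) * np.hstack([S, -S])              # shape (n, 2n)
    return mean[:, None] + offsets, np.full(2 * n, 1.0 / (2 * n))

def predict(mean, cov, f, Q):
    """Propagate mean and covariance through a nonlinear transition f."""
    pts, w = cubature_points(mean, cov)
    prop = np.apply_along_axis(f, 0, pts)                   # f applied to each column
    m_pred = prop @ w
    diff = prop - m_pred[:, None]
    return m_pred, (diff * w) @ diff.T + Q

# Toy transition: position integrates velocity, velocity decays slightly.
f = lambda x: np.array([x[0] + 0.1 * x[1], 0.99 * x[1]])
m, P = predict(np.array([0.0, 1.0]), 0.1 * np.eye(2), f, Q=0.01 * np.eye(2))
print("predicted mean:", m)
print("predicted covariance:\n", P)
```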
The global susceptibility of coastal forage fish to competition by large jellyfish
Mariani, Patrizio
2016-01-01
Competition between large jellyfish and forage fish for zooplankton prey is both a possible cause of jellyfish increases and a concern for the management of marine ecosystems and fisheries. Identifying principal factors affecting this competition is therefore important for marine management, but the lack of both good quality data and a robust theoretical framework has prevented general global analyses. Here, we present a general mechanistic food web model that considers fundamental differences in feeding modes and predation pressure between fish and jellyfish. The model predicts forage fish dominance at low primary production, and a shift towards jellyfish with increasing productivity, turbidity and fishing. We present an index of global ecosystem susceptibility to shifts in fish–jellyfish dominance that compares well with data on jellyfish distributions and trends. The results are a step towards better understanding the processes that govern jellyfish occurrences globally and highlight the advantage of considering feeding traits in ecosystem models. PMID:28120793
Ultra-fast consensus of discrete-time multi-agent systems with multi-step predictive output feedback
NASA Astrophysics Data System (ADS)
Zhang, Wenle; Liu, Jianchang
2016-04-01
This article addresses the ultra-fast consensus problem of high-order discrete-time multi-agent systems based on a unified consensus framework. A novel multi-step predictive output mechanism is proposed under a directed communication topology containing a spanning tree. By predicting the outputs of a network several steps ahead and adding this information into the consensus protocol, it is shown that the asymptotic convergence factor is improved by a power of q + 1 compared to the routine consensus. The difficult problem of selecting the optimal control gain is solved well by introducing a variable called convergence step. In addition, the ultra-fast formation achievement is studied on the basis of this new consensus protocol. Finally, the ultra-fast consensus with respect to a reference model and robust consensus is discussed. Some simulations are performed to illustrate the effectiveness of the theoretical results.
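The convergence-factor claim can be illustrated with a small hypothetical example comparing the second-largest eigenvalue magnitude of a weight matrix P with that of P^(q+1); the matrix below is an assumption, not the paper's protocol.

```python
# Hypothetical sketch only: if routine consensus applies a row-stochastic matrix P
# once per step, feeding q-step predicted outputs into the protocol acts, in the
# idealized case, like applying P^(q+1) per step, raising the asymptotic
# convergence factor to the power q + 1.
import numpy as np

P = np.array([[0.6, 0.4, 0.0],           # row-stochastic weights of a directed
              [0.2, 0.6, 0.2],           # graph containing a spanning tree
              [0.0, 0.4, 0.6]])
q = 2

def convergence_factor(M):
    """Second-largest eigenvalue magnitude governs the asymptotic consensus rate."""
    return np.sort(np.abs(np.linalg.eigvals(M)))[-2]

rho = convergence_factor(P)
rho_pred = convergence_factor(np.linalg.matrix_power(P, q + 1))
print(f"routine factor {rho:.3f}, predictive factor {rho_pred:.3f} (= {rho**(q+1):.3f})")
```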
A New Standard for Assessing the Performance of High Contrast Imaging Systems
NASA Astrophysics Data System (ADS)
Jensen-Clem, Rebecca; Mawet, Dimitri; Gomez Gonzalez, Carlos A.; Absil, Olivier; Belikov, Ruslan; Currie, Thayne; Kenworthy, Matthew A.; Marois, Christian; Mazoyer, Johan; Ruane, Garreth; Tanner, Angelle; Cantalloube, Faustine
2018-01-01
As planning for the next generation of high contrast imaging instruments (e.g., WFIRST, HabEx, and LUVOIR, TMT-PFI, EELT-EPICS) matures and second-generation ground-based extreme adaptive optics facilities (e.g., VLT-SPHERE, Gemini-GPI) finish their principal surveys, it is imperative that the performance of different designs, post-processing algorithms, observing strategies, and survey results be compared in a consistent, statistically robust framework. In this paper, we argue that the current industry standard for such comparisons—the contrast curve—falls short of this mandate. We propose a new figure of merit, the “performance map,” that incorporates three fundamental concepts in signal detection theory: the true positive fraction, the false positive fraction, and the detection threshold. By supplying a theoretical basis and recipe for generating the performance map, we hope to encourage the widespread adoption of this new metric across subfields in exoplanet imaging.
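As a minimal illustration of the three ingredients, the sketch below sweeps a detection threshold over toy Gaussian detection statistics to obtain true and false positive fractions; real performance maps are built from injected companions in post-processed data and binned over separation and contrast.

```python
# Hypothetical sketch only: TPF and FPF as a function of the detection threshold.
import numpy as np

rng = np.random.default_rng(3)

# Detection-statistic values at locations with an injected companion vs. without.
planet_scores = rng.normal(loc=4.0, scale=1.0, size=500)
empty_scores = rng.normal(loc=0.0, scale=1.0, size=5000)

# Sweeping the detection threshold traces out the TPF/FPF trade-off.
for t in (1.0, 2.0, 3.0, 4.0):
    tpf = (planet_scores >= t).mean()     # completeness at this threshold
    fpf = (empty_scores >= t).mean()      # fraction of false alarms
    print(f"threshold {t:.0f}: TPF = {tpf:.2f}, FPF = {fpf:.2e}")
```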
Kirk, Megan A; Rhodes, Ryan E
2011-07-01
Preschoolers with developmental delay (DD) are at risk for poor fundamental movement skills (FMS), but a paucity of early FMS interventions exists. The purpose of this review was to critically appraise the existing interventions to establish direction for future trials targeting preschoolers with DD. A total of 11 studies met the inclusion criteria. Major findings were summarized based on common subtopics of overall intervention effect, locomotor skill outcomes, object-control outcomes, and gender differences. Trials ranged from 8 to 24 weeks and offered 540-1700 min of instruction. The majority of trials (n = 9) significantly improved FMS of preschoolers with DD, with a large intervention effect (η² = 0.57-0.85). This review supports the utility of interventions to improve FMS of preschoolers with DD. Future researchers are encouraged to include more robust designs, a theoretical framework, and involvement of parents and teachers in the delivery of the intervention.
Switching LPV Control with Double-Layer LPV Model for Aero-Engines
NASA Astrophysics Data System (ADS)
Tang, Lili; Huang, Jinquan; Pan, Muxuan
2017-11-01
To cover the whole range of operating conditions of an aero-engine, a double-layer LPV model is built so as to take into account the variability due to flight altitude, Mach number, and rotational speed. Within this framework, the problem of designing a robust LPV state-feedback controller that guarantees desired bounds on both H_∞ and H_2 performance is considered. In addition, to reduce the conservativeness caused by using a single LPV controller over the whole flight envelope and by the common Lyapunov function method, a new method is proposed to design a family of switching LPV controllers. The switching LPV controllers ensure that the closed-loop system remains stable in the sense of Lyapunov under arbitrary switching logic, and that the controller parameters change smoothly. The validity and performance of the theoretical results are demonstrated through a numerical example.
Absorbate-induced piezochromism in a porous molecular crystal
Hendon, Christopher H.; Wittering, Kate E.; Chen, Teng -Hao; ...
2015-02-23
Atmospherically stable porous frameworks and materials are interesting for heterogeneous solid–gas applications. One motivation is the direct and selective uptake of pollutant/hazardous gases, where the material produces a measurable response in the presence of the analyte. In this report, we present a combined experimental and theoretical rationalization for the piezochromic response of a robust and porous molecular crystal built from an extensively fluorinated trispyrazole. The electronic response of the material is directly determined by analyte uptake, which provokes a subtle lattice contraction and an observable bathochromic shift in the optical absorption onset. Selectivity for fluorinated absorbates is demonstrated, and toluene is also found to crystallize within the pore. Lastly, we demonstrate the application of electronic structure calculations to predict a physicochemical response, providing the foundations for the design of electronically tunable porous solids with the chemical properties required for development of novel gas-uptake media.
Social Context of First Birth Timing in a Rapidly Changing Rural Setting
Ghimire, Dirgha J.
2016-01-01
This article examines the influence of social context on the rate of first birth. Drawing on socialization models, I develop a theoretical framework to explain how different aspects of social context (i.e., neighbors) may affect the rate of first birth. Neighbors, who in the study setting comprise individuals’ immediate social context, have an important influence on the rate of first birth. To test my hypotheses, I leverage a setting, measures, and analytical techniques designed to study the impact of macro-level social contexts on micro-level individual behavior. The results show that neighbors’ age at first birth, travel to the capital city and media exposure tend to reduce the first birth rate, while neighbors’ non-family work experience increases first birth rate. These effects are independent of neighborhood characteristics and are robust against several key variations in model specifications. PMID:27886737
Filtering Based Adaptive Visual Odometry Sensor Framework Robust to Blurred Images
Zhao, Haiying; Liu, Yong; Xie, Xiaojia; Liao, Yiyi; Liu, Xixi
2016-01-01
Visual odometry (VO) estimation from blurred images is a challenging problem in practical robot applications, as blurred images severely reduce the estimation accuracy of the VO. In this paper, we address the problem of visual odometry estimation from blurred images and present an adaptive visual odometry estimation framework that is robust to blur. Our approach employs an objective measure of images, named the small image gradient distribution (SIGD), to evaluate the blurring degree of an image; an adaptive blurred-image classification algorithm is then proposed to recognize blurred images; and finally we propose an anti-blur key-frame selection algorithm that makes the VO robust to blurred images. We also carried out a variety of comparative experiments to evaluate the performance of VO algorithms with our anti-blur framework on differently blurred images, and the experimental results show that our approach achieves superior performance compared to state-of-the-art methods on blurred images while adding little computational cost to the original VO algorithms. PMID:27399704
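A generic gradient-based blur proxy, offered only as a hypothetical stand-in for SIGD, could gate key-frame selection along the following lines; the statistic and thresholds here are illustrative assumptions, not the paper's measure.

```python
# Hypothetical sketch only: a generic gradient-based blur proxy for key-frame gating.
import numpy as np

def blur_score(img, grad_thresh=0.05):
    """Fraction of pixels with a small gradient magnitude; blurred images
    concentrate their gradient distribution near zero, raising this fraction."""
    gy, gx = np.gradient(img.astype(float))
    return float(np.mean(np.hypot(gx, gy) < grad_thresh))

rng = np.random.default_rng(0)
sharp = rng.random((120, 160))
blurred = (sharp + np.roll(sharp, 1, 0) + np.roll(sharp, -1, 0)
           + np.roll(sharp, 1, 1) + np.roll(sharp, -1, 1)) / 5.0   # crude smoothing

print(f"blur score: sharp = {blur_score(sharp):.2f}, blurred = {blur_score(blurred):.2f}")
# A key-frame selector would discard frames whose blur score exceeds a tuned
# threshold before feeding them to pose estimation.
```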
Ince, Robin A A; Giordano, Bruno L; Kayser, Christoph; Rousselet, Guillaume A; Gross, Joachim; Schyns, Philippe G
2017-03-01
We begin by reviewing the statistical framework of information theory as applicable to neuroimaging data analysis. A major factor hindering wider adoption of this framework in neuroimaging is the difficulty of estimating information theoretic quantities in practice. We present a novel estimation technique that combines the statistical theory of copulas with the closed form solution for the entropy of Gaussian variables. This results in a general, computationally efficient, flexible, and robust multivariate statistical framework that provides effect sizes on a common meaningful scale, allows for unified treatment of discrete, continuous, unidimensional and multidimensional variables, and enables direct comparisons of representations from behavioral and brain responses across any recording modality. We validate the use of this estimate as a statistical test within a neuroimaging context, considering both discrete stimulus classes and continuous stimulus features. We also present examples of analyses facilitated by these developments, including application of multivariate analyses to MEG planar magnetic field gradients, and pairwise temporal interactions in evoked EEG responses. We show the benefit of considering the instantaneous temporal derivative together with the raw values of M/EEG signals as a multivariate response, how we can separately quantify modulations of amplitude and direction for vector quantities, and how we can measure the emergence of novel information over time in evoked responses. Open-source Matlab and Python code implementing the new methods accompanies this article. Hum Brain Mapp 38:1541-1573, 2017. © 2016 Wiley Periodicals, Inc. 2016 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.
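The core copula-based estimator can be illustrated for two scalar variables as follows; this hypothetical sketch omits the multivariate, discrete, and conditional extensions provided by the authors' toolbox.

```python
# Hypothetical sketch only: Gaussian-copula mutual information for two scalars.
import numpy as np
from scipy.special import ndtri
from scipy.stats import rankdata

def copula_normalise(x):
    """Map samples to standard-normal marginals while preserving rank order."""
    return ndtri(rankdata(x) / (len(x) + 1.0))

def gcmi_cc(x, y):
    """Gaussian-copula mutual information (in bits) between two continuous scalars."""
    r = np.corrcoef(copula_normalise(x), copula_normalise(y))[0, 1]
    return -0.5 * np.log2(1.0 - r ** 2)

rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y = 0.6 * x + 0.8 * rng.normal(size=2000)         # correlated "response"
print(f"estimated MI: {gcmi_cc(x, y):.3f} bits")  # analytic value is about 0.32 bits
```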
Thematic Processes in the Comprehension of Technical Prose.
1982-02-20
theoretical framework for this process is that the important content of a passage is constructed by the reader based on the semantic content of the...against actual reader behavior. These models represent the general theoretical framework in a highly specific way, and thus summarize the major results of the project. (Author)
1990-08-01
evidence for a surprising degree of long-term skill retention. We formulated a theoretical framework, focusing on the importance of procedural reinstatement...considerable forgetting over even relatively short retention intervals. We have been able to place these studies in the same general theoretical framework developed
Time, Space, and Mass at the Operational Level of War: The Dynamics of the Culminating Point,
1988-04-28
theoretical framework for operational culmination and then examining the theory as reflected in recent history. This paper focuses on the concept of...the paper first examines key definitions and provides a theoretical framework for understanding culmination. Next, it considers the application of the
Strategic Innovation in HE: The Roles of Academic Middle Managers
ERIC Educational Resources Information Center
Kallenberg, Ton
2007-01-01
This article explains the development of, and presents a theoretical framework for, harnessing the roles of the academic middle manager in strategic innovation in Dutch higher education, thereby increasing higher education's ability to learn, innovate and develop a competitive advantage. The framework is developed from theoretical models of role…
Implicit Theoretical Leadership Frameworks of Higher Education Administrators.
ERIC Educational Resources Information Center
Lees, Kimberly; And Others
Colleges and universities have a unique organizational culture that influences the decision-making processes used by leaders of higher education. This paper presents findings of a study that attempted to identify the theoretical frameworks that administrators of higher education use to guide their decision-making processes. The following…
NLPIR: A Theoretical Framework for Applying Natural Language Processing to Information Retrieval.
ERIC Educational Resources Information Center
Zhou, Lina; Zhang, Dongsong
2003-01-01
Proposes a theoretical framework called NLPIR that integrates natural language processing (NLP) into information retrieval (IR) based on the assumption that there exists representation distance between queries and documents. Discusses problems in traditional keyword-based IR, including relevance, and describes some existing NLP techniques.…
ERIC Educational Resources Information Center
Aquino, Katherine C.
2016-01-01
Disability is often viewed as an obstacle to postsecondary inclusion, but not a characteristic of student diversity. Additionally, current theoretical frameworks isolate disability from other student diversity characteristics. In response, a new conceptual framework, the Disability-Diversity (Dis)Connect Model (DDDM), was created to address…
A Theoretical Framework towards Understanding of Emotional and Behavioural Difficulties
ERIC Educational Resources Information Center
Poulou, Maria S.
2014-01-01
Children's emotional and behavioural difficulties are the result of multiple individual, social and contextual factors working in concert. The current paper proposes a theoretical framework to interpret students' emotional and behavioural difficulties in schools, by taking into consideration teacher-student relationships, students'…
Couples coping with cancer: exploration of theoretical frameworks from dyadic studies.
Regan, Tim W; Lambert, Sylvie D; Kelly, Brian; Falconier, Mariana; Kissane, David; Levesque, Janelle V
2015-12-01
A diagnosis of cancer and subsequent treatment are distressing not only for the person directly affected, but also for their intimate partner. The aim of this review is to (a) identify the main theoretical frameworks underpinning research addressing dyadic coping among couples affected by cancer, (b) summarise the evidence supporting the concepts described in these theoretical frameworks, and (c) examine the similarities and differences between these theoretical perspectives. A literature search was undertaken to identify descriptive studies published between 1990 and 2013 (English and French) that examined the interdependence of patients' and partners' coping, and the impact of coping on psychosocial outcomes. Data were extracted using a standardised form and reviewed by three of the authors. Twenty-three peer-reviewed manuscripts were identified, from which seven theoretical perspectives were derived: Relationship-Focused Coping, Transactional Model of Stress and Coping, Systemic-Transactional Model (STM) of dyadic coping, Collaborative Coping, Relationship Intimacy model, Communication models, and Coping Congruence. Although these theoretical perspectives emphasised different aspects of coping, a number of conceptual commonalities were noted. This review identified key theoretical frameworks of dyadic coping used in cancer. Evidence indicates that responses within the couple that inhibit open communication between partner and patient are likely to have an adverse impact on psychosocial outcomes. Models that incorporate the interdependence of emotional responses and coping behaviours within couples have an emerging evidence base in psycho-oncology and may have greatest validity and clinical utility in this setting. Copyright © 2015 John Wiley & Sons, Ltd.
Francis, Jill J; O'Connor, Denise; Curran, Janet
2012-04-24
Behaviour change is key to increasing the uptake of evidence into healthcare practice. Designing behaviour-change interventions first requires problem analysis, ideally informed by theory. Yet the large number of partly overlapping theories of behaviour makes it difficult to select the most appropriate theory. The need for an overarching theoretical framework of behaviour change was addressed in research in which 128 explanatory constructs from 33 theories of behaviour were identified and grouped. The resulting Theoretical Domains Framework (TDF) appears to be a helpful basis for investigating implementation problems. Research groups in several countries have conducted TDF-based studies. It seems timely to bring together the experience of these teams in a thematic series to demonstrate further applications and to report key developments. This overview article describes the TDF, provides a brief critique of the framework, and introduces this thematic series. In a brief review to assess the extent of TDF-based research, we identified 133 papers that cite the framework. Of these, 17 used the TDF as the basis for empirical studies to explore health professionals’ behaviour. The identified papers provide evidence of the impact of the TDF on implementation research. Two major strengths of the framework are its theoretical coverage and its capacity to elicit beliefs that could signify key mediators of behaviour change. The TDF provides a useful conceptual basis for assessing implementation problems, designing interventions to enhance healthcare practice, and understanding behaviour-change processes. We discuss limitations and research challenges and introduce papers in this series. PMID:22531601
NASA Astrophysics Data System (ADS)
Lian, Tao; Shen, Zheqi; Ying, Jun; Tang, Youmin; Li, Junde; Ling, Zheng
2018-03-01
A new criterion was proposed recently to measure the influence of internal variations on secular trends in a time series. When the magnitude of the trend is greater than a theoretical threshold that scales the influence from internal variations, the sign of the estimated trend can be interpreted as the underlying long-term change. Otherwise, the sign may depend on the period chosen. An improved least squares method is developed here to further reduce the theoretical threshold and is applied to eight sea surface temperature (SST) data sets covering the period 1881-2013 to investigate whether there are robust trends in global SSTs. It is found that the warming trends in the western boundary regions, the South Atlantic, and the tropical and southern-most Indian Ocean are robust. However, robust trends are not found in the North Pacific, the North Atlantic, or the South Indian Ocean. The globally averaged SST and Indian Ocean Dipole indices are found to have robustly increased, whereas trends in the zonal SST gradient across the equatorial Pacific, Niño 3.4 SST, and the Atlantic Multidecadal Oscillation indices are within the uncertainty range associated with internal variations. These results indicate that great care is required when interpreting SST trends using the available records in certain regions and indices. It is worth noting that the theoretical threshold can be strongly influenced by low-frequency oscillations, and the above conclusions are based on the assumption that trends are linear. Caution should be exercised when applying the theoretical threshold criterion to real data.
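As a rough illustration of weighing a fitted trend against internal variability, the hypothetical sketch below uses a textbook surrogate threshold (twice the autocorrelation-corrected standard error of the trend) rather than the paper's improved least squares criterion; the synthetic record is an assumption.

```python
# Hypothetical sketch only: linear trend vs. a noise-based robustness threshold.
import numpy as np

def trend_with_threshold(t, x):
    slope, intercept = np.polyfit(t, x, 1)
    resid = x - (slope * t + intercept)
    r1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]           # lag-1 autocorrelation
    n_eff = len(t) * (1 - r1) / (1 + r1)                    # effective sample size
    se = np.sqrt(np.sum(resid ** 2) / (n_eff - 2) / np.sum((t - t.mean()) ** 2))
    return slope, 2.0 * se

rng = np.random.default_rng(42)
years = np.arange(1881, 2014, dtype=float)
noise = np.zeros(len(years))                                # red internal variability
for i in range(1, len(years)):
    noise[i] = 0.6 * noise[i - 1] + rng.normal(scale=0.15)
sst = 0.005 * (years - years[0]) + noise                    # weak imposed warming

slope, threshold = trend_with_threshold(years, sst)
verdict = "robust" if abs(slope) > threshold else "within internal variability"
print(f"trend = {100 * slope:.2f} degC/century, threshold = {100 * threshold:.2f}; {verdict}")
```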
Scalable large format 3D displays
NASA Astrophysics Data System (ADS)
Chang, Nelson L.; Damera-Venkata, Niranjan
2010-02-01
We present a general framework for the modeling and optimization of scalable large format 3-D displays using multiple projectors. Based on this framework, we derive algorithms that can robustly optimize the visual quality of an arbitrary combination of projectors (e.g. tiled, superimposed, combinations of the two) without manual adjustment. The framework creates for the first time a new unified paradigm that is agnostic to a particular configuration of projectors yet robustly optimizes for the brightness, contrast, and resolution of that configuration. In addition, we demonstrate that our algorithms support high resolution stereoscopic video at real-time interactive frame rates achieved on commodity graphics hardware. Through complementary polarization, the framework creates high quality multi-projector 3-D displays at low hardware and operational cost for a variety of applications including digital cinema, visualization, and command-and-control walls.
Omery, A
1991-09-01
The purposes of this article were to provide insight into the process of ethics and ethical inquiry and to explore the ethical issues of culpability and pain management/control. Critical care nurses who currently care for vascular patients identified these issues as occurring frequently in their practice. Authors in critical care nursing generally have limited the process of ethical inquiry to a theoretical framework built around an ethic of principles. The message many critical care nurses heard was that this one type of theoretical ethical framework was the totality of ethics. The application of these principles was ethical inquiry. For some nurses, the ethic of principles is sufficient. For others, an ethic of principles is either incomplete or foreign. This second group of nurses may believe that they have no moral voice if the language of ethics is only the language of principles. The language of principles, however, is not the only theoretical framework available. There is also the ethic of care, and ethical inquiry can include the application of that framework. Indeed, the language of the ethic of care may give a voice to nurses who previously felt morally mute. In fact, these two theoretical frameworks are not the only frameworks available to nurses. There is also virtue ethics, a framework not discussed in this article. A multiplicity of ethical frameworks is available for nurses to use in analyzing their professional and personal dilemmas. Recognizing that multiplicity, nurses can analyze their ethical dilemmas more comprehensively and effectively. Applying differing ethical frameworks can result in the same conclusions. This was the case for the issue of culpability.(ABSTRACT TRUNCATED AT 250 WORDS)
LMI-Based Generation of Feedback Laws for a Robust Model Predictive Control Algorithm
NASA Technical Reports Server (NTRS)
Acikmese, Behcet; Carson, John M., III
2007-01-01
This technical note provides a mathematical proof of Corollary 1 from the paper 'A Nonlinear Model Predictive Control Algorithm with Proven Robustness and Resolvability' that appeared in the 2006 Proceedings of the American Control Conference. The proof was omitted for brevity in the publication. The paper was based on algorithms developed for the FY2005 R&TD (Research and Technology Development) project for Small-body Guidance, Navigation, and Control [2]. The framework established by the Corollary is for a robustly stabilizing MPC (model predictive control) algorithm for uncertain nonlinear systems that guarantees the resolvability of the associated finite-horizon optimal control problem in a receding-horizon implementation. Additional details of the framework are available in the publication.
Barnes-Holmes, Dermot; Hussey, Ian
2016-02-01
The functional-cognitive meta-theoretical framework has been offered as a conceptual basis for facilitating greater communication and cooperation between the functional/behavioural and cognitive traditions within psychology, thus leading to benefits for both scientific communities. The current article is written from the perspective of two functional researchers, who are also proponents of the functional-cognitive framework, and attended the "Building Bridges between the Functional and Cognitive Traditions" meeting at Ghent University in the summer of 2014. The article commences with a brief summary of the functional approach to theory, followed by our reflections upon the functional-cognitive framework in light of that meeting. In doing so, we offer three ways in which the framework could be clarified: (a) effective communication between the two traditions is likely to be found at the level of behavioural observations rather than effects or theory, (b) not all behavioural observations will be deemed to be of mutual interest to both traditions, and (c) observations of mutual interest will be those that serve to elaborate and extend existing theorising in the functional and/or cognitive traditions. The article concludes with a summary of what we perceive to be the strengths and weaknesses of the framework, and a suggestion that there is a need to determine if the framework is meta-theoretical or is in fact a third theoretical approach to doing psychological science. © 2015 International Union of Psychological Science.
An Affective Neuroscience Framework for the Molecular Study of Internet Addiction.
Montag, Christian; Sindermann, Cornelia; Becker, Benjamin; Panksepp, Jaak
2016-01-01
Internet addiction represents an emerging global health issue. Increasing efforts have been made to characterize risk factors for the development of Internet addiction and consequences of excessive Internet use. During the last years, classic research approaches from psychology considering personality variables as a vulnerability factor, especially in conjunction with neuroscience approaches such as brain imaging, have led to coherent theoretical conceptualizations of Internet addiction. Although such conceptualizations can be a valuable aid, the research field is currently lacking a comprehensive framework for determining brain-based and neurochemical markers of Internet addiction. The present work aims at providing a framework on the molecular level as a basis for future research on the neural and behavioral level, in order to facilitate a comprehensive neurobiological model of Internet addiction and its clinical symptomatology. To help establish such a molecular framework for the study of Internet addiction, we investigated in N = 680 participants associations between individual differences in tendencies toward Internet addiction measured by the Generalized Problematic Internet Use Scale-2 (GPIUS-2) and individual differences in primary emotional systems as assessed by the Affective Neuroscience Personality Scales (ANPS). Regression analysis revealed that the ANPS scales FEAR and SADNESS were the ANPS scales most robustly positively linked to several (sub)scales of the GPIUS-2. The scales SEEKING, CARE and PLAY also explain variance in some of the GPIUS-2 subscales; these scales are negatively linked to the GPIUS-2 subscales. As the ANPS has been constructed on substantial available brain data, including an extensive molecular body of work with respect to evolutionarily highly conserved emotional circuitry in the ancient mammalian brain, the present study gives first ideas on putative molecular mechanisms underlying different facets of Internet addiction, as derived from associations between tendencies toward Internet addiction and individual differences in primary emotional systems. For example, as SADNESS is linked to the overall GPIUS-2 score, and the neuropeptide oxytocin is known to downregulate SADNESS, it is conceivable that this neuropeptide might play a role in Internet addiction on the molecular level. Our findings provide a theoretical framework potentially illuminating the molecular underpinnings of Internet addiction. Finally, we also present data on the ANPS and smartphone addiction at the end of the paper. Similar to the reported associations between the ANPS and the GPIUS-2, these correlations might provide an initial outline for a framework guiding future studies that aim to address the molecular basis of smartphone addiction. PMID:28018255
Model-theoretic framework for sensor data fusion
NASA Astrophysics Data System (ADS)
Zavoleas, Kyriakos P.; Kokar, Mieczyslaw M.
1993-09-01
The main goal of our research in sensory data fusion (SDF) is the development of a systematic approach (a methodology) to designing systems for interpreting sensory information and for reasoning about the situation based upon this information and upon available data bases and knowledge bases. To achieve such a goal, two kinds of subgoals have been set: (1) develop a theoretical framework in which rational design/implementation decisions can be made, and (2) design a prototype SDF system along the lines of the framework. Our initial design of the framework has been described in our previous papers. In this paper we concentrate on the model-theoretic aspects of this framework. We postulate that data are embedded in data models, and information processing mechanisms are embedded in model operators. The paper is devoted to analyzing the classes of model operators and their significance in SDF. We investigate transformation, abstraction, and fusion operators. A prototype SDF system, fusing data from range and intensity sensors, is presented, exemplifying the structures introduced. Our framework is justified by the fact that it provides modularity, traceability of information flow, and a basis for a specification language for SDF.
Optimization-Based Robust Nonlinear Control
2006-08-01
ABSTRACT New control algorithms were developed for robust stabilization of nonlinear dynamical systems. Novel, linear matrix inequality-based synthesis...was to further advance optimization-based robust nonlinear control design, for general nonlinear systems (especially in discrete time), for linear...Teel, IEEE Transactions on Control Systems Technology, vol. 14, no. 3, p. 398-407, May 2006. 3. "A unified framework for input-to-state stability in
Olbert, Charles M; Gala, Gary J; Tupler, Larry A
2014-05-01
Heterogeneity within psychiatric disorders is both theoretically and practically problematic: For many disorders, it is possible for 2 individuals to share very few or even no symptoms in common yet share the same diagnosis. Polythetic diagnostic criteria have long been recognized to contribute to this heterogeneity, yet no unified theoretical understanding of the coherence of symptom criteria sets currently exists. A general framework for analyzing the logical and mathematical structure, coherence, and diversity of Diagnostic and Statistical Manual diagnostic categories (DSM-5 and DSM-IV-TR) is proposed, drawing from combinatorial mathematics, set theory, and information theory. Theoretical application of this framework to 18 diagnostic categories indicates that in most categories, 2 individuals with the same diagnosis may share no symptoms in common, and that any 2 theoretically possible symptom combinations will share on average less than half their symptoms. Application of this framework to 2 large empirical datasets indicates that patients who meet symptom criteria for major depressive disorder and posttraumatic stress disorder tend to share approximately three-fifths of symptoms in common. For both disorders in each of the datasets, pairs of individuals who shared no common symptoms were observed. Any 2 individuals with either diagnosis were unlikely to exhibit identical symptomatology. The theoretical and empirical results stemming from this approach have substantive implications for etiological research into, and measurement of, psychiatric disorders.
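The combinatorial flavor of the analysis can be illustrated with a small sketch for a generic "at least k of m symptoms" criterion; it treats criteria as simple binary symptoms, ignoring compound and disjunctive criteria, so it only approximates the paper's fuller analysis.

```python
# Hypothetical sketch only: qualifying profiles and pairwise overlap for a
# polythetic "at least k of m symptoms" criterion.
from itertools import combinations
from math import comb

def polythetic_summary(m, k):
    n_profiles = sum(comb(m, j) for j in range(k, m + 1))    # qualifying profiles
    min_overlap = max(0, 2 * k - m)                          # pigeonhole lower bound
    minimal = list(combinations(range(m), k))                # profiles with exactly k symptoms
    shared = [len(set(a) & set(b)) for a in minimal for b in minimal]
    return n_profiles, min_overlap, sum(shared) / len(shared) / k

# A major depressive disorder-style criterion: at least 5 of 9 symptoms.
n, lo, frac = polythetic_summary(m=9, k=5)
print(f"{n} qualifying symptom profiles; minimum symptoms shared by two cases: {lo}; "
      f"mean shared fraction among minimal profiles: {frac:.2f}")
```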
Health information systems: a survey of frameworks for developing countries.
Marcelo, A B
2010-01-01
The objective of this paper is to perform a survey of excellent research on health information systems (HIS) analysis and design, and their underlying theoretical frameworks. It classifies these frameworks along major themes, and analyzes the different approaches to HIS development that are practical in resource-constrained environments. Literature review based on PubMed citations and conference proceedings, as well as Internet searches on information systems in general, and health information systems in particular. The field of health information systems development has been studied extensively. Despite this, failed implementations are still common. Theoretical frameworks for HIS development are available that can guide implementers. As awareness, acceptance, and demand for health information systems increase globally, the variety of approaches and strategies will also follow. For developing countries with scarce resources, a trial-and-error approach can be very costly. Lessons from the successes and failures of initial HIS implementations have been abstracted into theoretical frameworks. These frameworks organize complex HIS concepts into methodologies that standardize techniques in implementation. As globalization continues to impact healthcare in the developing world, demand for more responsive health systems will become urgent. More comprehensive frameworks and practical tools to guide HIS implementers will be imperative.
HYPOTHESIS SETTING AND ORDER STATISTIC FOR ROBUST GENOMIC META-ANALYSIS.
Song, Chi; Tseng, George C
2014-01-01
Meta-analysis techniques have been widely developed and applied in genomic applications, especially for combining multiple transcriptomic studies. In this paper, we propose an order statistic of p-values ( r th ordered p-value, rOP) across combined studies as the test statistic. We illustrate different hypothesis settings that detect gene markers differentially expressed (DE) "in all studies", "in the majority of studies", or "in one or more studies", and specify rOP as a suitable method for detecting DE genes "in the majority of studies". We develop methods to estimate the parameter r in rOP for real applications. Statistical properties such as its asymptotic behavior and a one-sided testing correction for detecting markers of concordant expression changes are explored. Power calculation and simulation show better performance of rOP compared to classical Fisher's method, Stouffer's method, minimum p-value method and maximum p-value method under the focused hypothesis setting. Theoretically, rOP is found connected to the naïve vote counting method and can be viewed as a generalized form of vote counting with better statistical properties. The method is applied to three microarray meta-analysis examples including major depressive disorder, brain cancer and diabetes. The results demonstrate rOP as a more generalizable, robust and sensitive statistical framework to detect disease-related markers.
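For a single gene across K studies, the rOP statistic and its contrast with Fisher's method can be sketched as follows; the p-values are made up for illustration, and the data-driven choice of r and the full inference procedure are described in the paper and omitted here.

```python
# Hypothetical sketch only: r-th ordered p-value (rOP) vs. Fisher's method for one gene.
import numpy as np
from scipy import stats

def rop_statistic(pvals, r):
    """r-th smallest p-value; small values suggest differential expression in
    at least r (e.g., the majority) of the studies."""
    return np.sort(pvals)[r - 1]

def fisher_statistic(pvals):
    """Fisher's combined statistic, chi-square with 2K df under the null."""
    return -2.0 * np.sum(np.log(pvals))

# One gene measured in K = 5 studies: strong signal in four, none in one.
p = np.array([1e-4, 3e-3, 2e-3, 8e-4, 0.72])
r, K = 4, len(p)

rop = rop_statistic(p, r)
fs = fisher_statistic(p)
print(f"rOP (r={r}) = {rop:.3g}, "
      f"p = {stats.beta.cdf(rop, r, K - r + 1):.2e}")        # Beta(r, K-r+1) null
print(f"Fisher statistic = {fs:.1f}, p = {stats.chi2.sf(fs, df=2 * K):.2e}")
```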
Health administrative data can be used to define a shared care typology for people with HIV.
Kendall, Claire E; Younger, Jaime; Manuel, Douglas G; Hogg, William; Glazier, Richard H; Taljaard, Monica
2015-11-01
Building on an existing theoretical shared primary care/specialist care framework to (1) develop a unique typology of care for people living with human immunodeficiency virus (HIV) in Ontario, (2) assess sensitivity of the typology by varying typology definitions, and (3) describe characteristics of typology categories. Retrospective population-based observational study from April 1, 2009, to March 31, 2012. A total of 13,480 eligible patients with HIV and receiving publicly funded health care in Ontario. We derived a typology of care by linking patients to usual family physicians and to HIV specialists with five possible patterns of care. Patient and physician characteristics and outpatient visits for HIV-related and non-HIV-related care were used to assess the robustness and characteristics of the typology. Five possible patterns of care were described as low engagement (8.6%), exclusively primary care (52.7%), family physician-dominated comanagement (10.0%), specialist-dominated comanagement (30.5%), and exclusively specialist care (5.2%). Sensitivity analyses demonstrated robustness of typology assignments. Visit patterns varied in ways that conform to typology assignments. We anticipate this typology can be used to assess the impact of care patterns on the quality of primary care for people living with HIV. Copyright © 2015 Elsevier Inc. All rights reserved.
Tough Hydrogel Robots: High-Speed, High-Force and Opto-sonically Invisible in Water
NASA Astrophysics Data System (ADS)
Zhao, Xuanhe
Sea animals such as leptocephali develop tissues and organs composed of active transparent hydrogels to achieve agile motions and natural camouflage in water. Hydrogel-based actuators that can imitate the capabilities of leptocephali will enable new applications in diverse fields. However, existing hydrogel actuators, mostly osmotically driven, are intrinsically low-speed and/or low-force, and their camouflage capabilities have not been explored. Here we show that hydraulic actuation of tough hydrogels with designed structures and properties can give soft actuators and robots that are high-speed, high-force, and optically and sonically camouflaged in water. We invent a simple method capable of assembling physically-crosslinked hydrogel parts followed by covalent crosslinking to fabricate large-scale hydraulic hydrogel actuators and robots with robust bodies and interfaces. The hydrogel actuators and robots can maintain their robustness and functionality over multiple actuation cycles, owing to the anti-fatigue property of the hydrogel under moderate stresses. A multiscale theoretical framework has been developed to guide the design and optimization of the hydrogel robots. We further demonstrate that the agile and transparent hydrogel actuators and robots perform extraordinary functions including swimming, kicking rubber balls and catching a live fish in water. The work was supported by NSF (No. CMMI-1253495) and ONR (No. N00014-14-1-0528).
Demographic noise can reverse the direction of deterministic selection
Constable, George W. A.; Rogers, Tim; McKane, Alan J.; Tarnita, Corina E.
2016-01-01
Deterministic evolutionary theory robustly predicts that populations displaying altruistic behaviors will be driven to extinction by mutant cheats that absorb common benefits but do not themselves contribute. Here we show that when demographic stochasticity is accounted for, selection can in fact act in the reverse direction to that predicted deterministically, instead favoring cooperative behaviors that appreciably increase the carrying capacity of the population. Populations that exist in larger numbers experience a selective advantage by being more stochastically robust to invasions than smaller populations, and this advantage can persist even in the presence of reproductive costs. We investigate this general effect in the specific context of public goods production and find conditions for stochastic selection reversal leading to the success of public good producers. This insight, developed here analytically, is missed by the deterministic analysis as well as by standard game theoretic models that enforce a fixed population size. The effect is found to be amplified by space; in this scenario we find that selection reversal occurs within biologically reasonable parameter regimes for microbial populations. Beyond the public good problem, we formulate a general mathematical framework for models that may exhibit stochastic selection reversal. In this context, we describe a stochastic analog to r−K theory, by which small populations can evolve to higher densities in the absence of disturbance. PMID:27450085
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Raedt, Hans; Katsnelson, Mikhail I.; Donker, Hylke C.
It is shown that the Pauli equation and the concept of spin naturally emerge from logical inference applied to experiments on a charged particle under the conditions that (i) space is homogeneous, (ii) the observed events are logically independent, and (iii) the observed frequency distributions are robust with respect to small changes in the conditions under which the experiment is carried out. The derivation does not take recourse to concepts of quantum theory and is based on the same principles which have already been shown to lead to e.g. the Schrödinger equation and the probability distributions of pairs of particles in the singlet or triplet state. Application to Stern–Gerlach experiments with chargeless, magnetic particles provides additional support for the thesis that quantum theory follows from logical inference applied to a well-defined class of experiments. - Highlights: • The Pauli equation is obtained through logical inference applied to robust experiments on a charged particle. • The concept of spin appears as an inference resulting from the treatment of two-valued data. • The same reasoning yields the quantum theoretical description of neutral magnetic particles. • Logical inference provides a framework to establish a bridge between objective knowledge gathered through experiments and their description in terms of concepts.
Mahmood, Zohaib; McDaniel, Patrick; Guérin, Bastien; Keil, Boris; Vester, Markus; Adalsteinsson, Elfar; Wald, Lawrence L; Daniel, Luca
2016-07-01
In a coupled parallel transmit (pTx) array, the power delivered to a channel is partially distributed to other channels because of coupling. This power is dissipated in circulators, resulting in a significant reduction in power efficiency. In this study, a technique for designing robust decoupling matrices interfaced between the RF amplifiers and the coils is proposed. The decoupling matrices ensure that most forward power is delivered to the load without loss of encoding capabilities of the pTx array. The decoupling condition requires that the impedance matrix seen by the power amplifiers is a diagonal matrix whose entries match the characteristic impedance of the power amplifiers. In this work, the impedance matrix of the coupled coils is diagonalized by successive multiplication by its eigenvectors. A general design procedure and software are developed to automatically generate the hardware that implements the diagonalization using passive components. The general design method is demonstrated by decoupling two example parallel transmit arrays. Our decoupling matrices achieve better than -20 dB decoupling in both cases. A robust framework for designing decoupling matrices for pTx arrays is presented and validated. The proposed decoupling strategy theoretically scales to an arbitrary number of channels. Magn Reson Med 76:329-339, 2016. © 2015 Wiley Periodicals, Inc.
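For readers unfamiliar with the eigenvector-based decoupling described above, the following minimal numerical sketch (not the authors' implementation; the 2x2 impedance matrix and the final matching stage are illustrative assumptions) shows how diagonalizing a coupled impedance matrix by its eigenvectors removes the off-diagonal coupling terms, and how an additional scaling can match the resulting modes to a 50-ohm characteristic impedance.

```python
import numpy as np

# Illustrative 2-channel coupled impedance matrix (ohms): diagonal entries are
# the self-impedances, off-diagonal entries model mutual coupling between coils.
Z_coupled = np.array([[50.0, 12.0],
                      [12.0, 50.0]])

# For a reciprocal (symmetric) array the eigenvector matrix V diagonalizes Z,
# so driving the eigenmodes instead of the individual channels removes coupling.
eigvals, V = np.linalg.eigh(Z_coupled)
Z_modes = V.T @ Z_coupled @ V          # diagonal matrix of mode impedances

# An additional (idealized) scaling stage matches every mode to Z0 = 50 ohm.
Z0 = 50.0
S = np.diag(np.sqrt(Z0 / eigvals))
Z_matched = S.T @ Z_modes @ S          # ~ Z0 * identity

print(np.round(Z_modes, 3))            # off-diagonal terms ~ 0 -> decoupled
print(np.round(Z_matched, 3))          # ~ 50 ohm on the diagonal
```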
Conceptual models for cumulative risk assessment.
Linder, Stephen H; Sexton, Ken
2011-12-01
In the absence of scientific consensus on an appropriate theoretical framework, cumulative risk assessment and related research have relied on speculative conceptual models. We argue for the importance of theoretical backing for such models and discuss 3 relevant theoretical frameworks, each supporting a distinctive "family" of models. Social determinant models postulate that unequal health outcomes are caused by structural inequalities; health disparity models envision social and contextual factors acting through individual behaviors and biological mechanisms; and multiple stressor models incorporate environmental agents, emphasizing the intermediary role of these and other stressors. The conclusion is that more careful reliance on established frameworks will lead directly to improvements in characterizing cumulative risk burdens and accounting for disproportionate adverse health effects.
Robust MOE Detector for DS-CDMA Systems with Signature Waveform Mismatch
NASA Astrophysics Data System (ADS)
Lin, Tsui-Tsai
In this letter, a decision-directed MOE detector with excellent robustness against signature waveform mismatch is proposed for DS-CDMA systems. Both the theoretical analysis and computer simulation results demonstrate that the proposed detector provides better SINR performance than conventional detectors.
Learning Physical Domains: Toward a Theoretical Framework.
ERIC Educational Resources Information Center
Forbus, Kenneth D.; Gentner, Dedre
People use and extend their knowledge of the physical world constantly. Understanding how this fluency is achieved would be an important milestone in understanding human learning and intelligence, as well as a useful guide for constructing machines that learn. This paper presents a theoretical framework that is being developed in an attempt to…
Memory and the Self in Autism: A Review and Theoretical Framework
ERIC Educational Resources Information Center
Lind, Sophie E.
2010-01-01
This article reviews research on (a) autobiographical episodic and semantic memory, (b) the self-reference effect, (c) memory for the actions of self versus other (the self-enactment effect), and (d) non-autobiographical episodic memory in autism spectrum disorder (ASD), and provides a theoretical framework to account for the bidirectional…
Approximation Methods for Inverse Problems Governed by Nonlinear Parabolic Systems
1999-12-17
We present a rigorous theoretical framework for approximation of nonlinear parabolic systems with delays in the context of inverse least squares... numerical results demonstrating the convergence are given for a model of dioxin uptake and elimination in a distributed liver model that is a special case of the general theoretical framework.
A general theoretical framework for decoherence in open and closed systems
NASA Astrophysics Data System (ADS)
Castagnino, Mario; Fortin, Sebastian; Laura, Roberto; Lombardi, Olimpia
2008-08-01
A general theoretical framework for decoherence is proposed, which encompasses formalisms originally devised to deal just with open or closed systems. The conditions for decoherence are clearly stated and the relaxation and decoherence times are compared. Finally, the spin-bath model is developed in detail from the new perspective.
21st Century Pedagogical Content Knowledge and Science Teaching and Learning
ERIC Educational Resources Information Center
Slough, Scott; Chamblee, Gregory
2017-01-01
Technological Pedagogical Content Knowledge (TPACK) is a theoretical framework that has enjoyed widespread applications as it applies to the integration of technology in the teaching and learning process. This paper reviews the background for TPACK, discusses some of its limitations, and reviews and introduces a new theoretical framework, 21st…
Analysing Theoretical Frameworks of Moral Education through Lakatos's Philosophy of Science
ERIC Educational Resources Information Center
Han, Hyemin
2014-01-01
The structure of studies of moral education is basically interdisciplinary; it includes moral philosophy, psychology, and educational research. This article systematically analyses the structure of studies of moral education from the vantage points of philosophy of science. Among the various theoretical frameworks in the field of philosophy of…
Applying the Grossman et al. Theoretical Framework: The Case of Reading
ERIC Educational Resources Information Center
Kucan, Linda; Palincsar, Annemarie Sullivan; Busse, Tracy; Heisey, Natalie; Klingelhofer, Rachel; Rimbey, Michelle; Schutz, Kristine
2011-01-01
Background/Context: This article describes the application of the theoretical framework proposed by Grossman and her colleagues to a research effort focusing on text-based discussion as a context for comprehension instruction. According to Grossman and her colleagues, a useful way to consider the teaching of complex practices to candidates is to…
Unifying Different Theories of Learning: Theoretical Framework and Empirical Evidence
ERIC Educational Resources Information Center
Phan, Huy Phuong
2008-01-01
The main aim of this research study was to test out a conceptual model encompassing the theoretical frameworks of achievement goals, study processing strategies, effort, and reflective thinking practice. In particular, it was postulated that the causal influences of achievement goals on academic performance are direct and indirect through study…
Internet Use and Cognitive Development: A Theoretical Framework
ERIC Educational Resources Information Center
Johnson, Genevieve
2006-01-01
The number of children and adolescents accessing the Internet as well as the amount of time online are steadily increasing. The most common online activities include playing video games, accessing web sites, and communicating via chat rooms, email, and instant messaging. A theoretical framework for understanding the effects of Internet use on…
Growth in Mathematical Understanding While Learning How To Teach: A Theoretical Perspective.
ERIC Educational Resources Information Center
Cavey, Laurie O.
This theoretical paper outlines a conceptual framework for examining growth in prospective teachers' mathematical understanding as they engage in thinking about and planning for the mathematical learning of others. The framework is based on the Pirie-Kieren (1994) Dynamical Theory for the Growth of Mathematical Understanding and extends into the…
Design-Based Research: Case of a Teaching Sequence on Mechanics
ERIC Educational Resources Information Center
Tiberghien, Andree; Vince, Jacques; Gaidioz, Pierre
2009-01-01
Design-based research, and particularly its theoretical status, is a subject of debate in the science education community. In the first part of this paper, a theoretical framework drawn up to develop design-based research will be presented. This framework is mainly based on epistemological analysis of physics modelling, learning and teaching…
ERIC Educational Resources Information Center
Koh, Kyungwon
2011-01-01
Contemporary young people are engaged in a variety of information behaviors, such as information seeking, using, sharing, and creating. The ways youth interact with information have transformed in the shifting digital information environment; however, relatively little empirical research exists and no theoretical framework adequately explains…
ERIC Educational Resources Information Center
Mecoli, Storey
2013-01-01
Pedagogical Content Knowledge, Lee S. Shulman's theoretical framework, has had a substantial influence on research in preservice teacher education, and consequently, schools of education. This review builds from Grossman's case studies that concluded that beginning teachers provided with excellent teacher education developed more substantial PCK…
"Theorizing Teacher Mobility": A Critical Review of Literature
ERIC Educational Resources Information Center
Vagi, Robert; Pivovarova, Margarita
2017-01-01
In this critical review of literature, we summarize the major theoretical frameworks that have been used to study teacher mobility. In total we identified 40 teacher mobility studies that met our inclusion criteria. We conclude that relatively few theoretical frameworks have been used to study teacher mobility and those that have been used are…
Utilizing the Theoretical Framework of Collective Identity to Understand Processes in Youth Programs
ERIC Educational Resources Information Center
Futch, Valerie A.
2016-01-01
This article explores collective identity as a useful theoretical framework for understanding social and developmental processes that occur in youth programs. Through narrative analysis of past participant interviews (n = 21) from an after-school theater program, known as "The SOURCE", it was found that participants very clearly describe…
ERIC Educational Resources Information Center
Schalock, Robert L.; Luckasson, Ruth; Tassé, Marc J.; Verdugo, Miguel Angel
2018-01-01
This article describes a holistic theoretical framework that can be used to explain intellectual disability (ID) and organize relevant information into a usable roadmap to guide understanding and application. Developing the framework involved analyzing the four current perspectives on ID and synthesizing this information into a holistic…
Network planning under uncertainties
NASA Astrophysics Data System (ADS)
Ho, Kwok Shing; Cheung, Kwok Wai
2008-11-01
One of the main focuses of network planning is the optimization of the network resources required to build a network under a given traffic demand projection. Traditionally, the inputs to this type of network planning problem are treated as deterministic. In reality, varying traffic requirements and fluctuations in network resources can introduce uncertainties into the decision models. Failure to include these uncertainties in the network design process can severely affect the feasibility and economics of the network. Therefore, it is essential to find a solution that is insensitive to the uncertain conditions during the network planning process. As early as the 1960s, network planning problems with traffic requirements that vary over time had been studied. This kind of network planning problem is still being actively researched, especially for VPN network design. Another kind of network planning problem under uncertainty that has been actively studied in the past decade addresses fluctuations in network resources. One such hotly pursued research topic is survivable network planning, which considers the design of a network under uncertainties brought about by fluctuations in topology, with the requirement that the network remain intact for up to a certain number of faults occurring anywhere in the network. Recently, the authors proposed a new planning methodology called the Generalized Survivable Network that tackles the network design problem under both varying traffic requirements and fluctuations of topology. Although all of the above network planning problems handle various kinds of uncertainty, it is hard to find a generic framework for more general uncertainty conditions that allows the problems to be solved in a more systematic way. With a unified framework, the seemingly diverse models and algorithms can be closely related, and more insights and improvements can potentially be brought out for solving the problem. This motivates us to seek a generic framework for solving network planning problems under uncertainties. In addition to reviewing the various network planning problems involving uncertainties, we propose that a unified framework based on robust optimization can be used to solve a rather large segment of network planning problems under uncertainty. Robust optimization was first introduced in the operations research literature and is a framework that incorporates information about the uncertainty sets of the parameters in the optimization model. Even though robust optimization originated from tackling uncertainty in the optimization process, it can serve as a comprehensive and suitable framework for tackling generic network planning problems under uncertainties. In this paper, we begin by explaining the main ideas behind the robust optimization approach. Then we demonstrate the capabilities of the proposed framework by giving examples of how the robust optimization framework can be applied to common network planning problems in uncertain environments. Next, we list some practical considerations for solving the network planning problem under uncertainties with the proposed framework. Finally, we conclude this article with some thoughts on future directions for applying this framework to other network planning problems.
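As a concrete illustration of the robust-optimization idea sketched in this abstract, the toy capacity-planning example below (all numbers, link costs and the box uncertainty set are hypothetical, not taken from the paper) shows how an interval uncertainty on demand can be folded into the constraints, so that the robust plan is simply the deterministic plan solved against the worst-case demand in the uncertainty set.

```python
import numpy as np
from scipy.optimize import linprog

# Toy robust capacity planning: two parallel links serve one uncertain demand.
# The demand lies in a box (interval) uncertainty set [d0 - delta, d0 + delta].
d0, delta = 10.0, 3.0
cost = np.array([1.0, 1.5])        # per-unit capacity cost of links 1 and 2
cap_max = np.array([8.0, 20.0])    # physical capacity limit of each link

def plan(required_demand):
    """Minimize capacity cost subject to x1 + x2 >= required_demand."""
    # linprog minimizes cost @ x with A_ub @ x <= b_ub, so the coverage
    # constraint is written as -x1 - x2 <= -required_demand.
    res = linprog(cost, A_ub=[[-1.0, -1.0]], b_ub=[-required_demand],
                  bounds=list(zip([0.0, 0.0], cap_max)))
    return res.x, res.fun

# Deterministic plan covers only the nominal demand; the robust plan must be
# feasible for every demand in the box, which reduces to the worst case.
x_nom, c_nom = plan(d0)
x_rob, c_rob = plan(d0 + delta)
print("nominal plan:", x_nom, "cost:", c_nom)
print("robust  plan:", x_rob, "cost:", c_rob)   # extra cost buys feasibility
```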
Drolet, Marie-Josée; Hudon, Anne
2015-02-01
In the past, several researchers in the field of physiotherapy have asserted that physiotherapy clinicians rarely use ethical knowledge to solve ethical issues raised by their practice. Does this assertion still hold true? Do the theoretical frameworks used by researchers and clinicians allow them to analyze thoroughly the ethical issues they encounter in their everyday practice? In our quest for answers, we conducted a literature review and analyzed the ethical theoretical frameworks used by physiotherapy researchers and clinicians to discuss the ethical issues raised by private physiotherapy practice. Our final analysis corpus consisted of thirty-nine texts. Our main finding is that researchers and clinicians in physiotherapy rarely use ethical knowledge to analyze the ethical issues raised in their practice and that gaps exist in the theoretical frameworks currently used to analyze these issues. Consequently, we developed, for ethical analysis, a four-part prism which we have called the Quadripartite Ethical Tool (QET). This tool can be incorporated into existing theoretical frameworks to enable professionals to integrate ethical knowledge into their ethical analyses. The innovative particularity of the QET is that it encompasses three ethical theories (utilitarianism, deontologism, and virtue ethics) and axiological ontology (professional values) and also draws on both deductive and inductive approaches. It is our hope that this new tool will help researchers and clinicians integrate ethical knowledge into their analysis of ethical issues and contribute to fostering ethical analyses that are grounded in relevant philosophical and axiological foundations.
Tremblay, Marie-Claude; Martin, Debbie H; Macaulay, Ann C; Pluye, Pierre
2017-06-01
A long-standing challenge in community-based participatory research (CBPR) has been to anchor practice and evaluation in a relevant and comprehensive theoretical framework of community change. This study describes the development of a multidimensional conceptual framework that builds on social movement theories to identify key components of CBPR processes. Framework synthesis was used as a general literature search and analysis strategy. An initial conceptual framework was developed from the theoretical literature on social movement. A literature search performed to identify illustrative CBPR projects yielded 635 potentially relevant documents, from which eight projects (corresponding to 58 publications) were retained after record and full-text screening. Framework synthesis was used to code and organize data from these projects, ultimately providing a refined framework. The final conceptual framework maps key concepts of CBPR mobilization processes, such as the pivotal role of the partnership; resources and opportunities as necessary components feeding the partnership's development; the importance of framing processes; and a tight alignment between the cause (partnership's goal), the collective action strategy, and the system changes targeted. The revised framework provides a context-specific model to generate a new, innovative understanding of CBPR mobilization processes, drawing on existing theoretical foundations. © 2017 The Authors American Journal of Community Psychology published by Wiley Periodicals, Inc. on behalf of Society for Community Research and Action.
Optimality conditions for the numerical solution of optimization problems with PDE constraints
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aguilo Valentin, Miguel Alejandro; Ridzal, Denis
2014-03-01
A theoretical framework for the numerical solution of partial differential equation (PDE) constrained optimization problems is presented in this report. This theoretical framework embodies the fundamental infrastructure required to efficiently implement and solve this class of problems. Detailed derivations of the optimality conditions required to accurately solve several parameter identification and optimal control problems are also provided in this report. This will allow the reader to further understand how the theoretical abstraction presented in this report translates to the application.
A Robust Absorbing Boundary Condition for Compressible Flows
NASA Technical Reports Server (NTRS)
Loh, Ching Y.; Jorgenson, Philip C. E.
2005-01-01
An absorbing non-reflecting boundary condition (NRBC) for practical computations in fluid dynamics and aeroacoustics is presented with theoretical proof. This paper is a continuation and improvement of a previous paper by the author. The absorbing NRBC technique is based on a first principle of non-reflection, which contains the essential physics that a plane wave solution of the Euler equations remains intact across the boundary. The technique is theoretically shown to work for a large class of finite volume approaches. When combined with the hyperbolic conservation laws, the NRBC is simple, robust and truly multi-dimensional; no additional implementation is needed except the prescribed physical boundary conditions. Several numerical examples in multi-dimensional spaces using two different finite volume schemes are illustrated to demonstrate its robustness in practical computations. Limitations and remedies of the technique are also discussed.
Chung, Eun-Sung; Kim, Yeonjoo
2014-12-15
This study proposed a robust prioritization framework to identify priorities among treated wastewater (TWW) use locations while accounting for the various uncertainties inherent in climate change scenarios and in the decision-making process. First, a fuzzy concept was applied because forecast precipitation and the associated hydrological impact results displayed significant variance across climate change scenarios and long periods (e.g., 2010-2099). Second, various multi-criteria decision making (MCDM) techniques, including the weighted sum method (WSM), the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), and fuzzy TOPSIS, were introduced for robust prioritization because different MCDM methods embody different decision philosophies. Third, decision making under complete uncertainty (DMCU) approaches, including maximin, maximax, minimax regret, Hurwicz, and equal likelihood, were used to obtain robust final rankings. The framework was then applied to a Korean urban watershed. As a result, clearly different rankings appeared between fuzzy TOPSIS and the non-fuzzy MCDM methods (e.g., WSM and TOPSIS) because the inter-annual variability in effectiveness was considered only by fuzzy TOPSIS. Robust prioritizations were then derived based on 18 rankings from nine decadal periods of RCP4.5 and RCP8.5. For still more robust rankings, the five DMCU approaches were applied to the rankings from fuzzy TOPSIS. This framework, combining fuzzy TOPSIS with DMCU approaches, can render prioritization less controversial among stakeholders under the complete uncertainty of changing environments. Copyright © 2014 Elsevier Ltd. All rights reserved.
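The sketch below illustrates, under simplifying assumptions, the kind of two-stage procedure the abstract describes: a crisp (non-fuzzy) TOPSIS ranking is computed per climate scenario, and a maximin DMCU rule is then applied across the scenario rankings. The alternatives, criteria, weights and scenario scores are fabricated for illustration; the paper additionally uses fuzzy TOPSIS and further DMCU rules.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Closeness scores of alternatives (rows) against criteria (columns)."""
    norm = matrix / np.linalg.norm(matrix, axis=0)      # vector normalization
    v = norm * weights                                   # weighted matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)

# Hypothetical effectiveness of 4 candidate TWW-use sites under 3 climate
# scenarios, scored on 2 benefit criteria (larger is better for both).
scenarios = [np.random.default_rng(s).uniform(0.2, 1.0, size=(4, 2))
             for s in range(3)]
weights = np.array([0.6, 0.4])
benefit = np.array([True, True])

# Rank sites within each scenario (rank 1 = best), then apply the maximin
# rule: keep each site's worst rank across scenarios and prefer the best of those.
ranks = np.array([(-topsis(m, weights, benefit)).argsort().argsort() + 1
                  for m in scenarios])
worst_rank = ranks.max(axis=0)
print("ranks per scenario:\n", ranks)
print("robust (maximin) ordering of sites:", worst_rank.argsort())
```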
NASA Astrophysics Data System (ADS)
Tuminaro, Jonathan
Many introductory, algebra-based physics students perform poorly on mathematical problem solving tasks in physics. There are at least two possible, distinct reasons for this poor performance: (1) students simply lack the mathematical skills needed to solve problems in physics, or (2) students do not know how to apply the mathematical skills they have to particular problem situations in physics. While many students do lack the requisite mathematical skills, a major finding from this work is that the majority of students possess the requisite mathematical skills, yet fail to use or interpret them in the context of physics. In this thesis I propose a theoretical framework to analyze and describe students' mathematical thinking in physics. In particular, I attempt to answer two questions. What are the cognitive tools involved in formal mathematical thinking in physics? And, why do students make the kinds of mistakes they do when using mathematics in physics? According to the proposed theoretical framework there are three major theoretical constructs: mathematical resources, which are the knowledge elements that are activated in mathematical thinking and problem solving; epistemic games, which are patterns of activities that use particular kinds of knowledge to create new knowledge or solve a problem; and frames, which are structures of expectations that determine how individuals interpret situations or events. The empirical basis for this study comes from videotaped sessions of college students solving homework problems. The students are enrolled in an algebra-based introductory physics course. The videotapes were transcribed and analyzed using the aforementioned theoretical framework. Two important results from this work are: (1) the construction of a theoretical framework that offers researchers a vocabulary (ontological classification of cognitive structures) and grammar (relationship between the cognitive structures) for understanding the nature and origin of mathematical use in the context of physics, and (2) a detailed understanding, in terms of the proposed theoretical framework, of the errors that students make when using mathematics in the context of physics.
Sparse alignment for robust tensor learning.
Lai, Zhihui; Wong, Wai Keung; Xu, Yong; Zhao, Cairong; Sun, Mingming
2014-10-01
Multilinear/tensor extensions of manifold learning based algorithms have been widely used in computer vision and pattern recognition. This paper first provides a systematic analysis of the multilinear extensions for the most popular methods by using alignment techniques, thereby obtaining a general tensor alignment framework. From this framework, it is easy to show that the manifold learning based tensor learning methods are intrinsically different from the alignment techniques. Based on the alignment framework, a robust tensor learning method called sparse tensor alignment (STA) is then proposed for unsupervised tensor feature extraction. Different from the existing tensor learning methods, L1- and L2-norms are introduced to enhance the robustness in the alignment step of the STA. The advantage of the proposed technique is that the difficulty in selecting the size of the local neighborhood can be avoided in the manifold learning based tensor feature extraction algorithms. Although STA is an unsupervised learning method, the sparsity encodes the discriminative information in the alignment step and provides the robustness of STA. Extensive experiments on the well-known image databases as well as action and hand gesture databases by encoding object images as tensors demonstrate that the proposed STA algorithm gives the most competitive performance when compared with the tensor-based unsupervised learning methods.
Cheng, Zhongtao; Liu, Dong; Luo, Jing; Yang, Yongying; Zhou, Yudi; Zhang, Yupeng; Duan, Lulin; Su, Lin; Yang, Liming; Shen, Yibing; Wang, Kaiwei; Bai, Jian
2015-05-04
A field-widened Michelson interferometer (FWMI) is developed to act as the spectral discriminator in high-spectral-resolution lidar (HSRL). This realization is motivated by the wide-angle Michelson interferometer (WAMI) which has been used broadly in the atmospheric wind and temperature detection. This paper describes an independent theoretical framework about the application of the FWMI in HSRL for the first time. In the framework, the operation principles and application requirements of the FWMI are discussed in comparison with that of the WAMI. Theoretical foundations for designing this type of interferometer are introduced based on these comparisons. Moreover, a general performance estimation model for the FWMI is established, which can provide common guidelines for the performance budget and evaluation of the FWMI in the both design and operation stages. Examples incorporating many practical imperfections or conditions that may degrade the performance of the FWMI are given to illustrate the implementation of the modeling. This theoretical framework presents a complete and powerful tool for solving most of theoretical or engineering problems encountered in the FWMI application, including the designing, parameter calibration, prior performance budget, posterior performance estimation, and so on. It will be a valuable contribution to the lidar community to develop a new generation of HSRLs based on the FWMI spectroscopic filter.
A theoretical framework for psychiatric nursing practice.
Onega, L L
1991-01-01
Traditionally, specific theoretical frameworks which are congruent with psychiatric nursing practice have been poorly articulated. The purpose of this paper is to identify and discuss a philosophical base, a theoretical framework, application to psychiatric nursing, and issues related to psychiatric nursing knowledge development and practice. A philosophical framework that is likely to be congruent with psychiatric nursing, which is based on the nature of human beings, health, psychiatric nursing and reality, is identified. Aaron Antonovsky's Salutogenic Model is discussed and applied to psychiatric nursing. This model provides a helpful way for psychiatric nurses to organize their thinking processes and ultimately improve the health care services that they offer to their clients. Goal setting and nursing interventions using this model are discussed. Additionally, application of the use of Antonovsky's model is made to nursing research areas such as hardiness, uncertainty, suffering, empathy and literary works. Finally, specific issues related to psychiatric nursing are addressed.
Labyrinthine flows across multilayer graphene-based membranes
NASA Astrophysics Data System (ADS)
Yoshida, Hiroaki
Graphene-based materials have recently found extremely wide applications for fluidic purposes thanks to remarkable developments in micro-/nano-fabrication techniques. In particular, high permeability and specific selectivity have been reported for these graphene-based membranes, such as graphene-oxide membranes, albeit with controversial experimental results. There is therefore an urgent need for a theoretical framework of fluid transport in these architectures in order to rationalize the experimental results. In this presentation, we report a theoretical study of mass transport across multilayer graphene-based membranes, which we benchmark against atomic-scale molecular dynamics. Specifically, we consider the water flow across multiple graphene layers with an inter-layer distance ranging from sub-nanometer to a few nanometers. The graphene layers have nanoslits aligned in a staggered fashion, and thus the water flows involve multiple twists and turns. We compare the continuum model predictions for the permeability with lattice Boltzmann calculations and molecular dynamics simulations. The highlight is that, in spite of extreme confinement, the permeability across the graphene-based membrane is quantitatively predicted on the basis of a properly designed continuum model. The framework of this study constitutes a benchmark against which we favourably compare published experimental data. In addition, flow properties of a water-ethanol mixture are presented, demonstrating the possibility of a novel separation technique. While the membrane is permeable to both pure liquids, it exhibits a counter-intuitive "self-semi-permeability" to water in the presence of the mixture. This suggests a robust and versatile membrane-based separation method built on a pressure-driven reverse-osmosis process, which is considerably less energy consuming than distillation processes. The author acknowledges the ERC project Micromegas and the ANR projects BlueEnergy and Equip@Meso.
NASA Astrophysics Data System (ADS)
Razavi, Saman; Gupta, Hoshin; Haghnegahdar, Amin
2016-04-01
Global sensitivity analysis (GSA) is a systems theoretic approach to characterizing the overall (average) sensitivity of one or more model responses across the factor space, by attributing the variability of those responses to different controlling (but uncertain) factors (e.g., model parameters, forcings, and boundary and initial conditions). GSA can be very helpful to improve the credibility and utility of Earth and Environmental System Models (EESMs), as these models are continually growing in complexity and dimensionality with continuous advances in understanding and computing power. However, conventional approaches to GSA suffer from (1) an ambiguous characterization of sensitivity, and (2) poor computational efficiency, particularly as the problem dimension grows. Here, we identify several important sensitivity-related characteristics of response surfaces that must be considered when investigating and interpreting the "global sensitivity" of a model response (e.g., a metric of model performance) to its parameters/factors. Accordingly, we present a new and general sensitivity and uncertainty analysis framework, Variogram Analysis of Response Surfaces (VARS), based on an analogy to variogram analysis, that characterizes a comprehensive spectrum of information on sensitivity. We prove, theoretically, that Morris (derivative-based) and Sobol (variance-based) methods and their extensions are special cases of VARS, and that their SA indices are contained within the VARS framework. We also present a practical strategy for the application of VARS to real-world problems, called STAR-VARS, including a new sampling strategy, called "star-based sampling". Our results across several case studies show the STAR-VARS approach to provide reliable and stable assessments of "global" sensitivity, while being at least 1-2 orders of magnitude more efficient than the benchmark Morris and Sobol approaches.
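A minimal sketch of the variogram idea underlying VARS is given below (this is not the STAR-VARS algorithm itself, and the response surface is a toy function chosen for illustration): the directional variogram gamma(h) along each parameter is estimated by pairing sample points separated by a lag h, and a larger gamma at a given lag indicates stronger sensitivity at that perturbation scale.

```python
import numpy as np

# Toy two-parameter response surface: x2 drives more small-scale variability,
# so its directional variogram should rise faster with the lag h.
def f(x1, x2):
    return np.sin(2 * np.pi * x1) + 3.0 * np.sin(6 * np.pi * x2)

def directional_variogram(func, axis, lags, n=2000, seed=0):
    """gamma(h) = 0.5 * E[(f(x + h*e_axis) - f(x))^2] with x uniform in [0,1]^2."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(0.0, 1.0 - max(lags), size=(n, 2))   # keep x + h in range
    gam = []
    for h in lags:
        x_shift = x.copy()
        x_shift[:, axis] += h
        gam.append(0.5 * np.mean((func(*x_shift.T) - func(*x.T)) ** 2))
    return np.array(gam)

lags = np.array([0.05, 0.1, 0.2, 0.3])
print("gamma along x1:", np.round(directional_variogram(f, 0, lags), 3))
print("gamma along x2:", np.round(directional_variogram(f, 1, lags), 3))
# A larger gamma at a given lag indicates stronger sensitivity at that scale.
```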
A new way to improve the robustness of complex communication networks by allocating redundancy links
NASA Astrophysics Data System (ADS)
Shi, Chunhui; Peng, Yunfeng; Zhuo, Yue; Tang, Jieying; Long, Keping
2012-03-01
We investigate improving the robustness of complex communication networks by allocating redundancy links. The protecting key nodes (PKN) strategy is proposed to improve the robustness of complex communication networks against intentional attacks. Our numerical simulations show that allocating a few redundant links among key nodes using the PKN strategy significantly increases the robustness of scale-free complex networks. We have also theoretically proved and demonstrated the effectiveness of the PKN strategy. We expect that our work will help achieve a better understanding of communication networks.
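The sketch below shows, under stated assumptions, the kind of targeted-attack experiment used to quantify such robustness gains: the giant-component fraction of a scale-free network is tracked while the highest-degree nodes are removed, for the original network and for a copy carrying a few extra redundancy links. For simplicity the extra links here are placed at random rather than by the PKN rule, so the numbers only illustrate the measurement procedure, not the paper's result.

```python
import networkx as nx

def targeted_attack_curve(G, steps=10, frac_per_step=0.02):
    """Giant-component fraction while repeatedly removing top-degree nodes."""
    H = G.copy()
    n0 = G.number_of_nodes()
    per_step = max(1, int(frac_per_step * n0))
    curve = []
    for _ in range(steps):
        by_degree = sorted(H.degree, key=lambda kv: kv[1], reverse=True)
        H.remove_nodes_from(n for n, _ in by_degree[:per_step])
        if H.number_of_nodes() == 0:
            curve.append(0.0)
            continue
        giant = max(nx.connected_components(H), key=len)
        curve.append(len(giant) / n0)
    return curve

# Compare an unmodified scale-free network with a copy carrying 50 extra
# redundancy links (placed at random here; the PKN rule would place them
# among the key nodes instead).
G = nx.barabasi_albert_graph(1000, 2, seed=1)
G_extra = G.copy()
G_extra.add_edges_from(nx.gnm_random_graph(1000, 50, seed=2).edges())

print("baseline :", [round(x, 2) for x in targeted_attack_curve(G)])
print("redundant:", [round(x, 2) for x in targeted_attack_curve(G_extra)])
```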
Conceptualizing and Measuring Working Memory and its Relationship to Aphasia
Wright, Heather Harris; Fergadiotis, Gerasimos
2011-01-01
Background General agreement exists in the literature that individuals with aphasia can exhibit a working memory deficit that contributes to their language processing impairments. Though conceptualized within different working memory frameworks, researchers have suggested that individuals with aphasia have limited working memory capacity, impaired attention-control processes as well as impaired inhibitory mechanisms. However, across studies investigating working memory ability in individuals with aphasia, different measures have been used to quantify their working memory ability and identify the relationship between working memory and language performance. Aims The primary objectives of this article are to (1) review current working memory theoretical frameworks, (2) review tasks used to measure working memory, and (3) discuss findings from studies that have investigated working memory as they relate to language processing in aphasia. Main Contribution Though findings have been consistent across studies investigating working memory ability in individuals with aphasia, discussion of how working memory is conceptualized and defined is often missing, as is discussion of results within a theoretical framework. This is critical, as working memory is conceptualized differently across the different theoretical frameworks. They differ in explaining what limits capacity and the source of individual differences as well as how information is encoded, maintained, and retrieved. When test methods are considered within a theoretical framework, specific hypotheses can be tested and stronger conclusions that are less susceptible to different interpretations can be made. Conclusions Working memory ability has been investigated in numerous studies with individuals with aphasia. To better understand the underlying cognitive constructs that contribute to the language deficits exhibited by individuals with aphasia, future investigations should operationally define the cognitive constructs of interest and discuss findings within theoretical frameworks. PMID:22639480
Tavender, Emma J; Bosch, Marije; Gruen, Russell L; Green, Sally E; Michie, Susan; Brennan, Sue E; Francis, Jill J; Ponsford, Jennie L; Knott, Jonathan C; Meares, Sue; Smyth, Tracy; O'Connor, Denise A
2015-05-25
Despite the availability of evidence-based guidelines for the management of mild traumatic brain injury in the emergency department (ED), variations in practice exist. Interventions designed to implement recommended behaviours can reduce this variation. Using theory to inform intervention development is advocated; however, there is no consensus on how to select or apply theory. Integrative theoretical frameworks, based on syntheses of theories and theoretical constructs relevant to implementation, have the potential to assist in the intervention development process. This paper describes the process of applying two theoretical frameworks to investigate the factors influencing recommended behaviours and the choice of behaviour change techniques and modes of delivery for an implementation intervention. A stepped approach was followed: (i) identification of locally applicable and actionable evidence-based recommendations as targets for change, (ii) selection and use of two theoretical frameworks for identifying barriers to and enablers of change (Theoretical Domains Framework and Model of Diffusion of Innovations in Service Organisations) and (iii) identification and operationalisation of intervention components (behaviour change techniques and modes of delivery) to address the barriers and enhance the enablers, informed by theory, evidence and feasibility/acceptability considerations. We illustrate this process in relation to one recommendation, prospective assessment of post-traumatic amnesia (PTA) by ED staff using a validated tool. Four recommendations for managing mild traumatic brain injury were targeted with the intervention. The intervention targeting the PTA recommendation consisted of 14 behaviour change techniques and addressed 6 theoretical domains and 5 organisational domains. The mode of delivery was informed by six Cochrane reviews. It was delivered via five intervention components: (i) local stakeholder meetings, (ii) identification of local opinion leader teams, (iii) a train-the-trainer workshop for appointed local opinion leaders, (iv) local training workshops for delivery by trained local opinion leaders and (v) provision of tools and materials to prompt recommended behaviours. Two theoretical frameworks were used in a complementary manner to inform intervention development in managing mild traumatic brain injury in the ED. The effectiveness and cost-effectiveness of the developed intervention is being evaluated in a cluster randomised trial, part of the Neurotrauma Evidence Translation (NET) program.
ERIC Educational Resources Information Center
Brymer, Eric; Davids, Keith
2013-01-01
This paper proposes how the theoretical framework of ecological dynamics can provide an influential model of the learner and the learning process to pre-empt effective behaviour changes. Here we argue that ecological dynamics supports a well-established model of the learner ideally suited to the environmental education context because of its…
An Exploration of E-Learning Benefits for Saudi Arabia: Toward Policy Reform
ERIC Educational Resources Information Center
Alrashidi, Abdulaziz
2013-01-01
Purpose: The purpose of this study was to examine policies and solutions addressing (a) improving education for citizens of the Kingdom of Saudi Arabia and (b) providing alternative instructional delivery methods, including e-learning for those living in remote areas. Theoretical Framework: The theoretical framework of this study was based on the…
Applying a Conceptual Design Framework to Study Teachers' Use of Educational Technology
ERIC Educational Resources Information Center
Holmberg, Jörgen
2017-01-01
Theoretical outcomes of design-based research (DBR) are often presented in the form of local theory design principles. This article suggests a complementary theoretical construction in DBR, in the form of a "design framework" at a higher abstract level, to study and inform educational design with ICT in different situated contexts.…
A Theoretical Framework to Guide the Re-Engineering of Technology Education
ERIC Educational Resources Information Center
Kelley, Todd; Kellam, Nadia
2009-01-01
Before leaders in technology education are able to identify a theoretical framework upon which a curriculum is to stand, they must first grapple with two opposing views of the purpose of technology education--education for all learners or career/technical education. Dakers (2006) identifies two opposing philosophies that can serve as a framework…
ERIC Educational Resources Information Center
Martin, James L.
This paper reports on attempts by the author to construct a theoretical framework of adult education participation using a theory development process and the corresponding multivariate statistical techniques. Two problems are identified: the lack of theoretical framework in studying problems, and the limiting of statistical analysis to univariate…
ERIC Educational Resources Information Center
Thomas, Amanda Garland
2009-01-01
The purpose of this study was to understand the extent to which students' psychological sense of community was influenced by IM use using the psychological sense of community theoretical framework created by McMillan and Chavis (1986), and the student development theoretical frameworks created by Schlossberg (1989) and Astin (1984). Thus, this…
Proverbs as Theoretical Frameworks for Lifelong Learning in Indigenous African Education
ERIC Educational Resources Information Center
Avoseh, Mejai B. M.
2013-01-01
Every aspect of a community's life and values in indigenous Africa provide the theoretical framework for education. The holistic worldview of the traditional system places a strong emphasis on the centrality of the human element and orature in the symmetrical relationship between life and learning. This article focuses on proverbs and the words…
ERIC Educational Resources Information Center
Gade, Sharada
2015-01-01
Long association with a mathematics teacher at a Grade 4-6 school in Sweden, is basis for reporting a case of teacher-researcher collaboration. Three theoretical frameworks used to study its development over time are relational knowing, relational agency and cogenerative dialogue. While relational knowing uses narrative perspectives to explore the…
A Theoretical Framework for Organizing the Effect of the Internet on Cognitive Development
ERIC Educational Resources Information Center
Johnson, Genevieve Marie
2006-01-01
The number of children and adolescents accessing the Internet as well as the amount of time online are steadily increasing. The most common online activities include playing video games, navigating web sites, and communicating via chat rooms, email, and instant messaging. A theoretical framework for understanding the effects of Internet use on…
ERIC Educational Resources Information Center
Rooney, Pauline
2012-01-01
It is widely acknowledged that digital games can provide an engaging, motivating and "fun" experience for students. However an entertaining game does not necessarily constitute a meaningful, valuable learning experience. For this reason, experts espouse the importance of underpinning serious games with a sound theoretical framework which…
ERIC Educational Resources Information Center
Bussey, Thomas J.; Orgill, MaryKay; Crippen, Kent J.
2013-01-01
Instructors are constantly baffled by the fact that two students who are sitting in the same class, who have access to the same materials, can come to understand a particular chemistry concept differently. Variation theory offers a theoretical framework from which to explore possible variations in experience and the resulting differences in…
ERIC Educational Resources Information Center
Cooper, Susan M.; Wilkerson, Trena L.; Montgomery, Mark; Mechell, Sara; Arterbury, Kristin; Moore, Sherrie
2012-01-01
In 2007, a group of mathematics educators and researchers met to examine rational numbers and why children have such an issue with them. An extensive review of the literature on fractional understanding was conducted. The ideas in that literature were then consolidated into a theoretical framework for examining fractions. Once that theoretical…
Network control principles predict neuron function in the Caenorhabditis elegans connectome
Yan, Gang; Vértes, Petra E.; Towlson, Emma K.; Chew, Yee Lian; Walker, Denise S.; Schafer, William R.; Barabási, Albert-László
2017-01-01
Recent studies on the controllability of complex systems offer a powerful mathematical framework to systematically explore the structure-function relationship in biological, social and technological networks. Despite theoretical advances, we lack direct experimental proof of the validity of these widely used control principles. Here we fill this gap by applying a control framework to the connectome of the nematode C. elegans, allowing us to predict the involvement of each C. elegans neuron in locomotor behaviours. We predict that control of the muscles or motor neurons requires twelve neuronal classes, which include neuronal groups previously implicated in locomotion by laser ablation, as well as one previously uncharacterised neuron, PDB. We validate this prediction experimentally, finding that the ablation of PDB leads to a significant loss of dorsoventral polarity in large body bends. Importantly, control principles also allow us to investigate the involvement of individual neurons within each neuronal class. For example, we predict that, within the class of DD motor neurons, only three (DD04, DD05, or DD06) should affect locomotion when ablated individually. This prediction is also confirmed, with single-cell ablations of DD04 or DD05, but not DD02 or DD03, specifically affecting posterior body movements. Our predictions are robust to deletions of weak connections, missing connections, and rewired connections in the current connectome, indicating the potential applicability of this analytical framework to larger and less well-characterised connectomes. PMID:29045391
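As background to the control framework mentioned above, the sketch below implements the standard maximum-matching recipe for identifying driver nodes in structural controllability of directed networks, shown on a toy graph. This is a simplification chosen for illustration; the connectome analysis in the paper additionally handles muscles, neuronal classes and edge properties.

```python
import networkx as nx
from networkx.algorithms import bipartite

def driver_nodes(G):
    """Nodes whose 'in' copy is unmatched in a maximum matching of the
    bipartite representation of a directed network; these are the driver
    nodes of the standard structural-controllability recipe."""
    B = nx.Graph()
    out_nodes = [("out", u) for u in G]
    B.add_nodes_from(out_nodes, bipartite=0)
    B.add_nodes_from((("in", v) for v in G), bipartite=1)
    B.add_edges_from((("out", u), ("in", v)) for u, v in G.edges())
    matching = bipartite.maximum_matching(B, top_nodes=out_nodes)
    matched = {node[1] for node in matching if node[0] == "in"}
    drivers = set(G) - matched
    return drivers if drivers else {next(iter(G))}   # at least one driver

# Toy directed network: a chain with a branch (1 -> 2 -> 3, 2 -> 4 -> 5).
G = nx.DiGraph([(1, 2), (2, 3), (2, 4), (4, 5)])
print(sorted(driver_nodes(G)))   # two driver nodes, e.g. [1, 3] or [1, 4]
```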
Algorithms and Array Design Criteria for Robust Imaging in Interferometry
NASA Astrophysics Data System (ADS)
Kurien, Binoy George
Optical interferometry is a technique for obtaining high-resolution imagery of a distant target by interfering light from multiple telescopes. Image restoration from interferometric measurements poses a unique set of challenges. The first challenge is that the measurement set provides only a sparse-sampling of the object's Fourier Transform and hence image formation from these measurements is an inherently ill-posed inverse problem. Secondly, atmospheric turbulence causes severe distortion of the phase of the Fourier samples. We develop array design conditions for unique Fourier phase recovery, as well as a comprehensive algorithmic framework based on the notion of redundant-spaced-calibration (RSC), which together achieve reliable image reconstruction in spite of these challenges. Within this framework, we see that classical interferometric observables such as the bispectrum and closure phase can limit sensitivity, and that generalized notions of these observables can improve both theoretical and empirical performance. Our framework leverages techniques from lattice theory to resolve integer phase ambiguities in the interferometric phase measurements, and from graph theory, to select a reliable set of generalized observables. We analyze the expected shot-noise-limited performance of our algorithm for both pairwise and Fizeau interferometric architectures and corroborate this analysis with simulation results. We apply techniques from the field of compressed sensing to perform image reconstruction from the estimates of the object's Fourier coefficients. The end result is a comprehensive strategy to achieve well-posed and easily-predictable reconstruction performance in optical interferometry.
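To make the role of the closure phase concrete, the short numerical sketch below (synthetic values, not the thesis algorithm) demonstrates why summing measured baseline phases around a closed telescope triangle cancels the per-telescope atmospheric piston errors, leaving an observable that depends only on the object.

```python
import numpy as np

rng = np.random.default_rng(0)

# True object phases (radians, synthetic) on the three baselines of a
# telescope triangle: (1,2), (2,3) and (3,1).
phi_12, phi_23, phi_31 = 0.7, -1.1, 0.4

# Atmospheric piston above each telescope corrupts every baseline phase as
# measured phi_ij = phi_ij + (eps_i - eps_j).
eps = rng.normal(scale=2.0, size=3)
m_12 = phi_12 + eps[0] - eps[1]
m_23 = phi_23 + eps[1] - eps[2]
m_31 = phi_31 + eps[2] - eps[0]

# Closure phase: summing around the closed loop cancels the per-telescope
# errors, so the measured and true closure phases agree.
closure_measured = np.angle(np.exp(1j * (m_12 + m_23 + m_31)))
closure_true = np.angle(np.exp(1j * (phi_12 + phi_23 + phi_31)))
print(closure_measured, closure_true)   # equal up to floating-point error
```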
Devkota, Jagannath; Kim, Ki-Joong; Ohodnicki, Paul R.; ...
2018-01-01
The integration of nanoporous materials such as metal organic frameworks (MOFs) with sensitive transducers can result in robust sensing platforms for monitoring gases and chemical vapors for a range of applications.
Deep Neural Networks for Speech Separation With Application to Robust Speech Recognition
acoustic-phonetic features. The second objective is integration of spectrotemporal context for improved separation performance. Conditional random fields... will be used to encode contextual constraints. The third objective is to achieve robust ASR in the DNN framework through integrated acoustic modeling
Robust infrared targets tracking with covariance matrix representation
NASA Astrophysics Data System (ADS)
Cheng, Jian
2009-07-01
Robust infrared target tracking is an important and challenging research topic in many military and security applications, such as infrared imaging guidance, infrared reconnaissance, and scene surveillance. To effectively tackle the nonlinear and non-Gaussian state estimation problem, particle filtering is introduced to construct the theoretical framework of infrared target tracking. Under this framework, the observation probabilistic model is one of the main factors determining infrared target tracking performance. In order to improve tracking performance, covariance matrices are introduced to represent infrared targets with multiple features. The observation probabilistic model can be constructed by computing the distance between the reference target's covariance matrix and those of the target samples. Because the covariance matrix provides a natural tool for integrating multiple features, and is scale and illumination independent, target representation with covariance matrices offers strong discriminating ability and robustness. Two experiments demonstrate that the proposed method is effective and robust for different infrared target tracking scenarios, such as sensor ego-motion and sea clutter.
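A minimal sketch of covariance-based target representation is given below, assuming a simple per-pixel feature set (pixel coordinates, intensity, and gradient magnitudes, chosen here for illustration rather than taken from the paper): a region covariance descriptor is computed for a template and two candidate patches, and patches are compared with a generalized-eigenvalue distance, so that an appearance close to the template yields a smaller distance.

```python
import numpy as np
from scipy.linalg import eigh

def region_covariance(patch):
    """Covariance of per-pixel features [x, y, intensity, |dI/dy|, |dI/dx|]."""
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    gy, gx = np.gradient(patch.astype(float))
    feats = np.stack([xs, ys, patch, np.abs(gy), np.abs(gx)], axis=-1)
    f = feats.reshape(-1, feats.shape[-1])
    return np.cov(f, rowvar=False) + 1e-6 * np.eye(f.shape[1])   # regularized

def covariance_distance(C1, C2):
    """sqrt(sum(ln^2 lambda_i)) over the generalized eigenvalues of (C1, C2);
    small values indicate similar appearance."""
    lam = eigh(C1, C2, eigvals_only=True)
    return float(np.sqrt(np.sum(np.log(lam) ** 2)))

rng = np.random.default_rng(0)
template = rng.uniform(size=(32, 32))                     # textured reference
similar = template + 0.05 * rng.normal(size=(32, 32))     # same target + noise
different = np.tile(np.linspace(0.0, 1.0, 32), (32, 1))   # smooth ramp texture

C_t = region_covariance(template)
print(covariance_distance(C_t, region_covariance(similar)))    # small
print(covariance_distance(C_t, region_covariance(different)))  # much larger
```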
Seward, Kirsty; Wolfenden, Luke; Wiggers, John; Finch, Meghan; Wyse, Rebecca; Oldmeadow, Christopher; Presseau, Justin; Clinton-McHarg, Tara; Yoong, Sze Lin
2017-04-04
While there are a number of frameworks which focus on supporting the implementation of evidence-based approaches, few psychometrically valid measures exist to assess constructs within these frameworks. This study aimed to develop and psychometrically assess a scale measuring each domain of the Theoretical Domains Framework for use in assessing the implementation of dietary guidelines within a non-health-care setting (childcare services). A 75-item, 14-domain Theoretical Domains Framework Questionnaire (TDFQ) was developed and administered via telephone interview to 202 centre-based childcare service cooks who had a role in planning the service menu. Confirmatory factor analysis (CFA) was undertaken to assess the reliability, discriminant validity and goodness of fit of the 14-domain Theoretical Domains Framework measure. For the CFA, five iterative processes of adjustment were undertaken in which 14 items were removed, resulting in a final measure consisting of 14 domains and 61 items. For the final measure, the chi-square goodness-of-fit statistic was 3447.19, the Standardized Root Mean Square Residual (SRMR) was 0.070, the Root Mean Square Error of Approximation (RMSEA) was 0.072, and the Comparative Fit Index (CFI) was 0.78. While only one of the three indices supported goodness of fit of the measurement model tested, the 14-domain model with 61 items showed good discriminant validity and internally consistent items. Future research should aim to assess the psychometric properties of the developed TDFQ in other community-based settings.
Obesity in sub-Saharan Africa: development of an ecological theoretical framework.
Scott, Alison; Ejikeme, Chinwe Stella; Clottey, Emmanuel Nii; Thomas, Joy Goens
2013-03-01
The prevalence of overweight and obesity is increasing in sub-Saharan Africa (SSA). There is a need for theoretical frameworks to catalyze further research and to inform the development of multi-level, context-appropriate interventions. In this commentary, we propose a preliminary ecological theoretical framework to conceptualize factors that contribute to increases in overweight and obesity in SSA. The framework is based on a Causality Continuum model [Coreil et al. Social and Behavioral Foundations of Public Health. Sage Publications, Thousand Oaks] that considers distant, intermediate and proximate influences. The influences incorporated in the model include globalization and urbanization as distant factors; occupation, social relationships, built environment and cultural perceptions of weight as intermediate factors and caloric intake, physical inactivity and genetics as proximate factors. The model illustrates the interaction of factors along a continuum, from the individual to the global marketplace, in shaping trends in overweight and obesity in SSA. The framework will be presented, each influence elucidated and implications for research and intervention development discussed. There is a tremendous need for further research on obesity in SSA. An improved evidence base will serve to validate and develop the proposed framework further.
A robust nonparametric framework for reconstruction of stochastic differential equation models
NASA Astrophysics Data System (ADS)
Rajabzadeh, Yalda; Rezaie, Amir Hossein; Amindavar, Hamidreza
2016-05-01
In this paper, we employ a nonparametric framework to robustly estimate the functional forms of drift and diffusion terms from discrete stationary time series. The proposed method significantly improves the accuracy of the parameter estimation. In this framework, drift and diffusion coefficients are modeled through orthogonal Legendre polynomials. We employ the least squares regression approach along with the Euler-Maruyama approximation method to learn coefficients of stochastic model. Next, a numerical discrete construction of mean squared prediction error (MSPE) is established to calculate the order of Legendre polynomials in drift and diffusion terms. We show numerically that the new method is robust against the variation in sample size and sampling rate. The performance of our method in comparison with the kernel-based regression (KBR) method is demonstrated through simulation and real data. In case of real dataset, we test our method for discriminating healthy electroencephalogram (EEG) signals from epilepsy ones. We also demonstrate the efficiency of the method through prediction in the financial data. In both simulation and real data, our algorithm outperforms the KBR method.
Robust detection-isolation-accommodation for sensor failures
NASA Technical Reports Server (NTRS)
Weiss, J. L.; Pattipati, K. R.; Willsky, A. S.; Eterno, J. S.; Crawford, J. T.
1985-01-01
The results of a one-year study to: (1) develop a theory for Robust Failure Detection and Identification (FDI) in the presence of model uncertainty, (2) develop a design methodology which utilizes the robust FDI theory, (3) apply the methodology to a sensor FDI problem for the F-100 jet engine, and (4) demonstrate the application of the theory to the evaluation of alternative FDI schemes are presented. Theoretical results in statistical discrimination are used to evaluate the robustness of residual signals (or parity relations) in terms of their usefulness for FDI. Furthermore, optimally robust parity relations are derived through the optimization of robustness metrics. The result is viewed as a decentralization of the FDI process. A general structure for decentralized FDI is proposed and robustness metrics are used for determining various parameters of the algorithm.
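For readers unfamiliar with parity relations, the toy sketch below shows the basic mechanism on three redundant sensors (this is illustrative only; it is not the report's robustness-metric optimization, and the threshold is an assumed noise allowance):

    import numpy as np

    def parity_residuals(y):
        """Pairwise parity relations r = [y1 - y2, y2 - y3, y1 - y3]."""
        y1, y2, y3 = y
        return np.array([y1 - y2, y2 - y3, y1 - y3])

    def isolate_faulty_sensor(y, threshold):
        flags = np.abs(parity_residuals(y)) > threshold
        # A single faulty sensor trips exactly the two residuals it enters.
        if flags[0] and flags[2] and not flags[1]:
            return 1
        if flags[0] and flags[1] and not flags[2]:
            return 2
        if flags[1] and flags[2] and not flags[0]:
            return 3
        return None  # no isolable single fault

    print(isolate_faulty_sensor([10.0, 10.1, 12.5], threshold=0.5))  # -> 3

Robust FDI in the report's sense amounts to choosing such relations and thresholds so that model uncertainty does not masquerade as a fault.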
2018-04-01
Reports an error in "Robust, replicable, and theoretically-grounded: A response to Brown and Coyne's (2017) commentary on the relationship between emodiversity and health" by Jordi Quoidbach, Moïra Mikolajczak, June Gruber, Ilios Kotsou, Aleksandr Kogan and Michael I. Norton (Journal of Experimental Psychology: General, 2018[Mar], Vol 147[3], 451-458). In the article, there is an error in the byline for the first author due to a printer error. The complete, correct institutional affiliation for Jordi Quoidbach is ESADE Business School, Ramon Llull University. The online version of this article has been corrected. (The following abstract of the original article appeared in record 2018-06787-002.) In 2014 in the Journal of Experimental Psychology: General, we reported 2 studies demonstrating that the diversity of emotions that people experience, as measured by the Shannon-Wiener entropy index, was an independent predictor of mental and physical health, over and above the effect of mean levels of emotion. Brown and Coyne (2017) questioned both our use of Shannon's entropy and our analytic approach. We thank Brown and Coyne for their interest in our research; however, neither their theoretical nor their empirical critiques undermine the central theoretical tenets and empirical findings of our research. We present an in-depth examination that reveals that our findings are statistically robust, replicable, and reflect a theoretically grounded phenomenon with real-world implications. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
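For concreteness, the Shannon-Wiener entropy index of emodiversity can be computed from how often (or how intensely) each emotion category is reported; the counts below are invented purely to show the calculation:

    import numpy as np

    def emodiversity(counts):
        p = np.asarray(counts, dtype=float)
        p = p[p > 0] / p.sum()
        return float(-np.sum(p * np.log(p)))

    # Two respondents reporting 9 emotion categories with equal totals:
    print(emodiversity([5, 5, 5, 5, 5, 5, 5, 5, 5]))   # even mix -> higher entropy
    print(emodiversity([37, 1, 1, 1, 1, 1, 1, 1, 1]))  # dominated by one -> lower entropy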
Joint graph cut and relative fuzzy connectedness image segmentation algorithm.
Ciesielski, Krzysztof Chris; Miranda, Paulo A V; Falcão, Alexandre X; Udupa, Jayaram K
2013-12-01
We introduce an image segmentation algorithm, called GC_sum^max, which combines, in a novel manner, the strengths of two popular algorithms: Relative Fuzzy Connectedness (RFC) and (standard) Graph Cut (GC). We show, both theoretically and experimentally, that GC_sum^max preserves the robustness of RFC with respect to the seed choice (thus avoiding the "shrinking problem" of GC), while keeping GC's stronger control over the problem of "leaking through poorly defined boundary segments." The analysis of GC_sum^max is greatly facilitated by our recent theoretical results that RFC can be described within the framework of Generalized GC (GGC) segmentation algorithms. In our implementation of GC_sum^max we use, as a subroutine, a version of the RFC algorithm (based on the Image Foresting Transform) that runs (provably) in linear time with respect to the image size. This results in GC_sum^max running in a time close to linear. Experimental comparison of GC_sum^max to GC, an iterative version of RFC (IRFC), and power watershed (PW), based on a variety of medical and non-medical images, indicates superior accuracy performance of GC_sum^max over these other methods, resulting in a rank ordering of GC_sum^max > PW ∼ IRFC > GC. Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Herman, J. D.; Zeff, H. B.; Reed, P. M.; Characklis, G. W.
2013-12-01
In the Eastern United States, water infrastructure and institutional frameworks have evolved in a historically water-rich environment. However, large regional droughts over the past decade combined with continuing population growth have marked a transition to a state of water scarcity, for which current planning paradigms are ill-suited. Significant opportunities exist to improve the efficiency of water infrastructure via regional coordination, namely, regional 'portfolios' of water-related assets such as reservoirs, conveyance, conservation measures, and transfer agreements. Regional coordination offers the potential to improve reliability, cost, and environmental impact in the expected future state of the world, and, with informed planning, to improve robustness to future uncertainty. In support of this challenge, this study advances a multi-agent many-objective robust decision making (multi-agent MORDM) framework that blends novel computational search and uncertainty analysis tools to discover flexible, robust regional portfolios. Our multi-agent MORDM framework is demonstrated for four water utilities in the Research Triangle region of North Carolina, USA. The utilities supply nearly two million customers and have the ability to interact with one another via transfer agreements and shared infrastructure. We show that strategies for this region which are Pareto-optimal in the expected future state of the world remain vulnerable to performance degradation under alternative scenarios of deeply uncertain hydrologic and economic factors. We then apply the Patient Rule Induction Method (PRIM) to identify which of these uncertain factors drives the individual and collective vulnerabilities for the four cooperating utilities. Our results indicate that clear multi-agent tradeoffs emerge for attaining robustness across the utilities. Furthermore, the key factor identified for improving the robustness of the region's water supply is cooperative demand reduction. This type of approach is critically important given the risks and challenges posed by rising supply development costs, limits on new infrastructure, growing water demands and the underlying uncertainties associated with climate change. The proposed framework serves as a planning template for other historically water-rich regions which must now confront the reality of impending water scarcity.
Designing effective human-automation-plant interfaces: a control-theoretic perspective.
Jamieson, Greg A; Vicente, Kim J
2005-01-01
In this article, we propose the application of a control-theoretic framework to human-automation interaction. The framework consists of a set of conceptual distinctions that should be respected in automation research and design. We demonstrate how existing automation interface designs in some nuclear plants fail to recognize these distinctions. We further show the value of the approach by applying it to modes of automation. The design guidelines that have been proposed in the automation literature are evaluated from the perspective of the framework. This comparison shows that the framework reveals insights that are frequently overlooked in this literature. A new set of design guidelines is introduced that builds upon the contributions of previous research and draws complementary insights from the control-theoretic framework. The result is a coherent and systematic approach to the design of human-automation-plant interfaces that will yield more concrete design criteria and a broader set of design tools. Applications of this research include improving the effectiveness of human-automation interaction design and the relevance of human-automation interaction research.
Stochastic Robust Mathematical Programming Model for Power System Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Cong; Changhyeok, Lee; Haoyong, Chen
2016-01-01
This paper presents a stochastic robust framework for two-stage power system optimization problems with uncertainty. The model optimizes the probabilistic expectation of different worst-case scenarios with different uncertainty sets. A case study of unit commitment shows the effectiveness of the proposed model and algorithms.
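The objective structure described above can be illustrated with a toy enumeration (the commitments, probabilities, and costs below are made-up placeholders, not the paper's case study): for each candidate first-stage decision, take the probability-weighted expectation over scenarios of the worst-case recourse cost within that scenario's uncertainty set.

    import numpy as np

    def expected_worst_case(first_stage_cost, scenarios):
        """scenarios: list of (probability, recourse costs over that scenario's
        uncertainty set); the inner max is the per-scenario worst case."""
        total = first_stage_cost
        for prob, recourse_costs in scenarios:
            total += prob * max(recourse_costs)
        return total

    candidates = {"commitment A": 100.0, "commitment B": 120.0}
    scenarios = {"commitment A": [(0.6, [30, 55, 48]), (0.4, [80, 95])],
                 "commitment B": [(0.6, [20, 25, 22]), (0.4, [35, 40])]}
    best = min(candidates,
               key=lambda u: expected_worst_case(candidates[u], scenarios[u]))
    print(best)  # the commitment minimizing the stochastic-robust objective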
NASA Astrophysics Data System (ADS)
Sutrisno, Agung; Gunawan, Indra; Vanany, Iwan
2017-11-01
In spite of being an integral part of risk-based quality improvement efforts, studies improving the quality of corrective action prioritization using the FMEA technique are still limited in the literature, and none of the existing work considers robustness and risk when selecting among competing improvement initiatives. This study proposes a theoretical model to select among risk-based competing corrective actions by considering the robustness and risk of the competing corrective actions. We incorporated the principle of robust design in computing the preference score among corrective action candidates. Along with considering the cost and benefit of competing corrective actions, we also incorporate their risk and robustness. An example is provided to demonstrate the applicability of the proposed model.
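One way such a preference score could be assembled (the weights, scales, and the use of a Taguchi-style signal-to-noise ratio as the robustness term are assumptions of this sketch, not the paper's exact formulation) is a weighted combination of benefit, cost, risk, and robustness:

    import numpy as np

    def snr_smaller_is_better(defect_rates):
        """Taguchi signal-to-noise ratio for a smaller-is-better response."""
        y = np.asarray(defect_rates, dtype=float)
        return -10.0 * np.log10(np.mean(y ** 2))

    def preference_score(benefit, cost, risk_rpn, defect_rates,
                         w=(0.4, 0.2, 0.2, 0.2)):
        robustness = snr_smaller_is_better(defect_rates)
        return w[0] * benefit - w[1] * cost - w[2] * risk_rpn + w[3] * robustness

    # Hypothetical corrective-action candidates with defect rates over trial runs.
    actions = {
        "retrain operators": preference_score(70, 15, 24, [0.040, 0.050, 0.060]),
        "add poka-yoke jig": preference_score(85, 40, 12, [0.010, 0.012, 0.011]),
    }
    print(max(actions, key=actions.get))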
Liaw, Siaw-Teng; Pearce, Christopher; Liyanage, Harshana; Liaw, Gladys S S; de Lusignan, Simon
2014-01-01
Increasing investment in eHealth aims to improve cost effectiveness and safety of care. Data extraction and aggregation can create new data products to improve professional practice and provide feedback to improve the quality of source data. A previous systematic review concluded that locally relevant clinical indicators and use of clinical record systems could support clinical governance. We aimed to extend and update the review with a theoretical framework. We searched PubMed, Medline, Web of Science, ABI Inform (Proquest) and Business Source Premier (EBSCO) using the terms curation, information ecosystem, data quality management (DQM), data governance, information governance (IG) and data stewardship. We focused on and analysed the scope of DQM and IG processes, theoretical frameworks, and determinants of the processing, quality assurance, presentation and sharing of data across the enterprise. There are good theoretical reasons for integrated governance, but there is variable alignment of DQM, IG and health system objectives across the health enterprise. Ethical constraints exist that require health information ecosystems to process data in ways that are aligned with improving health and system efficiency and ensuring patient safety. Despite an increasingly 'big-data' environment, DQM and IG in health services are still fragmented across the data production cycle. We extend current work on DQM and IG with a theoretical framework for integrated IG across the data cycle. The dimensions of this theory-based framework would require testing with qualitative and quantitative studies to examine the applicability and utility, along with an evaluation of its impact on data quality across the health enterprise.
Robust boundary treatment for open-channel flows in divergence-free incompressible SPH
NASA Astrophysics Data System (ADS)
Pahar, Gourabananda; Dhar, Anirban
2017-03-01
A robust Incompressible Smoothed Particle Hydrodynamics (ISPH) framework is developed to simulate specified inflow and outflow boundary conditions for open-channel flow. Being purely divergence-free, the framework offers a smoothed and structured pressure distribution. An implicit treatment of the pressure Poisson equation and a Dirichlet boundary condition applied on the free surface minimize the error in velocity divergence. Beyond the inflow and outflow thresholds, multiple layers of dummy particles are created according to the specified boundary condition. The inflow boundary acts as a soluble wave-maker. Fluid particles beyond the outflow threshold are removed and replaced with dummy particles with the specified boundary velocity. The framework is validated against different cases of open-channel flow with different boundary conditions. The model can efficiently capture flow evolution and vortex generation for random geometry and variable boundary conditions.
Pandemic influenza preparedness: an ethical framework to guide decision-making.
Thompson, Alison K; Faith, Karen; Gibson, Jennifer L; Upshur, Ross E G
2006-12-04
Planning for the next pandemic influenza outbreak is underway in hospitals across the world. The global SARS experience has taught us that ethical frameworks to guide decision-making may help to reduce collateral damage and increase trust and solidarity within and between health care organisations. Good pandemic planning requires reflection on values because science alone cannot tell us how to prepare for a public health crisis. In this paper, we present an ethical framework for pandemic influenza planning. The ethical framework was developed with expertise from clinical, organisational and public health ethics and validated through a stakeholder engagement process. The ethical framework includes both substantive and procedural elements for ethical pandemic influenza planning. The incorporation of ethics into pandemic planning can be helped by senior hospital administrators sponsoring its use, by having stakeholders vet the framework, and by designing or identifying decision review processes. We discuss the merits and limits of an applied ethical framework for hospital decision-making, as well as the robustness of the framework. The need for reflection on the ethical issues raised by the spectre of a pandemic influenza outbreak is great. Our efforts to address the normative aspects of pandemic planning in hospitals have generated interest from other hospitals and from the governmental sector. The framework will require re-evaluation and refinement and we hope that this paper will generate feedback on how to make it even more robust.
ERIC Educational Resources Information Center
Tripuraneni, Vinaya L.
2010-01-01
Purpose: The purpose of this study is to identify the leadership orientation of the academic library leader considered ideal by faculty, administrators and librarians in private, non-profit, doctoral universities in Southern California. Theoretical Framework: The theoretical framework used for this study was Bolman and Deal's Leadership…
ERIC Educational Resources Information Center
Qandile, Yasine A.; Al-Qasim, Wajeeh Q.
2014-01-01
The purpose of this study is to construct a clear instructional philosophy for Salman bin Abdulaziz University as a fundamental basis for teaching and training as well as a theoretical framework for curriculum design and development. The study attempts to answer the main questions pertaining to the basic structure of contemporary higher…
ERIC Educational Resources Information Center
Quinn, Frances; Pegg, John; Panizzon, Debra
2009-01-01
Meiosis is a biological concept that is both complex and important for students to learn. This study aims to explore first-year biology students' explanations of the process of meiosis, using an explicit theoretical framework provided by the Structure of the Observed Learning Outcome (SOLO) model. The research was based on responses of 334…
ERIC Educational Resources Information Center
Byerlee, Derek; Eicher, Carl K.
Employment problems in Africa were examined with special emphasis on rural employment and migration within the context of overall economic development. A framework was provided for analyzing rural employment in development; that framework was used to analyze empirical information from Africa; and theoretical issues were raised in analyzing rural…
ERIC Educational Resources Information Center
Haj-Yahia, Muhammad M.; Uysal, Aynur
2011-01-01
An integrative theoretical framework was tested as the basis for explaining beliefs about wife beating among Turkish nursing students. Based on a survey design, 406 nursing students (404 females) in all 4 years of undergraduate studies completed a self-administered questionnaire. Questionnaires were distributed and collected from the participants…
ERIC Educational Resources Information Center
Barnett, Janet Heine; Lodder, Jerry; Pengelley, David
2014-01-01
We analyze our method of teaching with primary historical sources within the context of theoretical frameworks for the role of history in teaching mathematics developed by Barbin, Fried, Jahnke, Jankvist, and Kjeldsen and Blomhøj, and more generally from the perspective of Sfard's theory of learning as communication. We present case studies…
ERIC Educational Resources Information Center
Grant, Cynthia; Osanloo, Azadeh
2014-01-01
The theoretical framework is one of the most important aspects in the research process, yet is often misunderstood by doctoral candidates as they prepare their dissertation research study. The importance of theory-driven thinking and acting is emphasized in relation to the selection of a topic, the development of research questions, the…
International Voluntary Health Networks (IVHNs). A social-geographical framework.
Reid, Benet; Laurie, Nina; Smith, Matt Baillie
2018-03-01
Trans-national medicine, historically associated with colonial politics, is now central to discourses of global health and development, thrust into mainstream media by catastrophic events (earthquakes, disease epidemics), and enshrined in the 2015 Sustainable Development Goals. Volunteer human-resource is an important contributor to international health-development work. International Voluntary Health Networks (IVHNs, that connect richer and poorer countries through healthcare) are situated at a meeting-point between geographies and sociologies of health. More fully developed social-geographic understandings will illuminate this area, currently dominated by instrumental health-professional perspectives. The challenge we address is to produce a geographically and sociologically-robust conceptual framework that appropriately recognises IVHNs' potentials for valuable impacts, while also unlocking spaces of constructive critique. We examine the importance of the social in health geography, and geographical potentials in health sociology (focusing on professional knowledge construction, inequality and capital, and power), to highlight the mutual interests of these two fields in relation to IVHNs. We propose some socio-geographical theories of IVHNs that do not naturalise inequality, that understand health as a form of capital, prioritise explorations of power and ethical practice, and acknowledge the more-than-human properties of place. This sets an agenda for theoretically-supported empirical work on IVHNs. Copyright © 2018 Elsevier Ltd. All rights reserved.
Water age and stream solute dynamics at the Hubbard Brook Experimental Forest (US)
NASA Astrophysics Data System (ADS)
Botter, Gianluca; Benettin, Paolo; McGuire, Kevin; Rinaldo, Andrea
2016-04-01
The contribution discusses experimental and modeling results from a headwater catchment at the Hubbard Brook Experimental Forest (New Hampshire, USA) to explore the link between stream solute dynamics and water age. A theoretical framework based on water age dynamics, which represents a general basis for characterizing solute transport at the catchment scale, is used to model both conservative and weathering-derived solutes. Based on the available information about the hydrology of the site, an integrated transport model was developed and used to estimate the relevant hydrochemical fluxes. The model was designed to reproduce the deuterium content of streamflow and allowed for the estimation of catchment water storage and dynamic travel time distributions (TTDs). Within this framework, dissolved silicon and sodium concentrations in streamflow were simulated by implementing first-order chemical kinetics based explicitly on dynamic TTDs, thus upscaling local geochemical processes to the catchment scale. Our results highlight the key role of water stored within the subsoil glacial material in both the short-term and long-term solute circulation at Hubbard Brook. The analysis of the results provided by the calibrated model allowed a robust estimate of the emerging concentration-discharge relationship, streamflow age distributions (including the fraction of event water) and storage size, and their evolution in time due to hydrologic variability.
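The upscaling step can be written compactly: the stream concentration of a weathering-derived solute is the first-order kinetic expression averaged over the travel time distribution. The sketch below assumes an equilibrium concentration, a rate constant, and a gamma-shaped TTD purely for illustration; the study's TTDs are time-variable and model-derived.

    import numpy as np
    from scipy.stats import gamma

    def stream_concentration(c_eq, k, ttd_pdf, tau_max=5000.0, n=20000):
        """C = integral over travel time tau of c_eq*(1 - exp(-k*tau)) * p(tau)."""
        tau = np.linspace(0.0, tau_max, n)
        kinetics = c_eq * (1.0 - np.exp(-k * tau))
        dtau = tau[1] - tau[0]
        return float(np.sum(kinetics * ttd_pdf(tau)) * dtau)

    ttd = gamma(a=1.5, scale=200.0)   # travel times in days (assumed shape)
    print(stream_concentration(c_eq=6.0, k=0.01, ttd_pdf=ttd.pdf))

Young water (small tau) contributes dilute, far-from-equilibrium concentrations, while old water approaches the equilibrium value, which is how variability in the TTD propagates into concentration-discharge behaviour.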
Network reconstruction via graph blending
NASA Astrophysics Data System (ADS)
Estrada, Rolando
2016-05-01
Graphs estimated from empirical data are often noisy and incomplete due to the difficulty of faithfully observing all the components (nodes and edges) of the true graph. This problem is particularly acute for large networks, where the number of components may far exceed available surveillance capabilities. Errors in the observed graph can render subsequent analyses invalid, so it is vital to develop robust methods that can minimize these observational errors. Errors in the observed graph may include missing and spurious components, as well as fused (multiple nodes are merged into one) and split (a single node is misinterpreted as many) nodes. Traditional graph reconstruction methods are only able to identify missing or spurious components (primarily edges, and to a lesser degree nodes), so we developed a novel graph blending framework that allows us to cast the full estimation problem as a simple edge addition/deletion problem. Armed with this framework, we systematically investigate the viability of various topological graph features, such as the degree distribution or the clustering coefficients, and existing graph reconstruction methods for tackling the full estimation problem. Our experimental results suggest that incorporating any topological feature as a source of information actually hinders reconstruction accuracy. We provide a theoretical analysis of this phenomenon and suggest several avenues for improving this estimation problem.
Approximate kernel competitive learning.
Wu, Jian-Sheng; Zheng, Wei-Shi; Lai, Jian-Huang
2015-03-01
Kernel competitive learning has been successfully used to achieve robust clustering. However, kernel competitive learning (KCL) is not scalable for large-scale data processing, because (1) it has to calculate and store the full kernel matrix, which is too large to be computed and kept in memory, and (2) it cannot be computed in parallel. In this paper we develop a framework of approximate kernel competitive learning for processing large-scale datasets. The proposed framework consists of two parts. First, it derives an approximate kernel competitive learning (AKCL) method, which learns kernel competitive learning in a subspace via sampling. We provide solid theoretical analysis on why the proposed approximation modelling works for kernel competitive learning, and furthermore, we show that the computational complexity of AKCL is largely reduced. Second, we propose a pseudo-parallelled approximate kernel competitive learning (PAKCL) method based on a set-based kernel competitive learning strategy, which overcomes the obstacle of using parallel programming in kernel competitive learning and significantly accelerates the approximate kernel competitive learning for large-scale clustering. The empirical evaluation on publicly available datasets shows that the proposed AKCL and PAKCL perform comparably to KCL, with a large reduction in computational cost. Also, the proposed methods achieve more effective clustering performance in terms of clustering precision against related approximate clustering approaches. Copyright © 2014 Elsevier Ltd. All rights reserved.
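The general flavour of learning in a sampled subspace can be illustrated with a Nystrom-style landmark approximation (this generic sketch is not the paper's exact AKCL update; the kernel, landmark count, and data are assumptions): the full n-by-n kernel matrix is never formed, and clustering can then proceed on the resulting low-dimensional features.

    import numpy as np

    def rbf(A, B, gamma=0.5):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    def nystrom_features(X, n_landmarks=50, gamma=0.5, seed=0):
        rng = np.random.default_rng(seed)
        idx = rng.choice(len(X), size=n_landmarks, replace=False)
        L = X[idx]
        W = rbf(L, L, gamma)              # small landmark-landmark kernel
        C = rbf(X, L, gamma)              # n-by-m cross kernel
        evals, evecs = np.linalg.eigh(W)
        evals = np.clip(evals, 1e-12, None)
        # Feature map Z with Z @ Z.T approximating the full kernel matrix.
        return C @ evecs @ np.diag(evals ** -0.5)

    X = np.random.default_rng(1).normal(size=(2000, 10))
    print(nystrom_features(X).shape)      # (2000, 50): competitive learning can run here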
From field notes to data portal - An operational QA/QC framework for tower networks
NASA Astrophysics Data System (ADS)
Sturtevant, C.; Hackley, S.; Meehan, T.; Roberti, J. A.; Holling, G.; Bonarrigo, S.
2016-12-01
Quality assurance and control (QA/QC) is one of the most important yet challenging aspects of producing research-quality data. This is especially so for environmental sensor networks collecting numerous high-frequency measurement streams at distributed sites. Here, the quality issues are multi-faceted, including sensor malfunctions, unmet theoretical assumptions, and measurement interference from the natural environment. To complicate matters, there are often multiple personnel managing different sites or different steps in the data flow. For large, centrally managed sensor networks such as NEON, the separation of field and processing duties is in the extreme. Tower networks such as Ameriflux, ICOS, and NEON continue to grow in size and sophistication, yet tools for robust, efficient, scalable QA/QC have lagged. Quality control remains a largely manual process relying on visual inspection of the data. In addition, notes of observed measurement interference or visible problems are often recorded on paper without an explicit pathway to data flagging during processing. As such, an increase in network size requires a near-proportional increase in personnel devoted to QA/QC, quickly stressing the human resources available. There is a need for a scalable, operational QA/QC framework that combines the efficiency and standardization of automated tests with the power and flexibility of visual checks, and includes an efficient communication pathway from field personnel to data processors to end users. Here we propose such a framework and an accompanying set of tools in development, including a mobile application template for recording tower maintenance and an R/shiny application for efficiently monitoring and synthesizing data quality issues. This framework seeks to incorporate lessons learned from the Ameriflux community and provide tools to aid continued network advancements.
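As a small example of the automated layer of such a framework (thresholds and flag codes here are placeholders, not network standards), plausibility tests can attach machine-readable quality flags to each sample before any visual review:

    import numpy as np

    def qc_flags(values, valid_min, valid_max, max_step):
        v = np.asarray(values, dtype=float)
        range_fail = (v < valid_min) | (v > valid_max) | np.isnan(v)
        step = np.abs(np.diff(v, prepend=v[0]))
        spike_fail = step > max_step
        flags = np.zeros(v.shape, dtype=int)   # 0 = pass
        flags[spike_fail] = 1                  # 1 = suspect (spike/step test)
        flags[range_fail] = 2                  # 2 = fail (range/missing test)
        return flags

    temps = [21.3, 21.4, 35.9, 21.5, np.nan, 21.6]   # air temperature, deg C
    print(qc_flags(temps, valid_min=-40, valid_max=50, max_step=5.0))

Field notes recorded in the mobile application could then be attached to the same timestamps, so that human-observed interference and automated flags travel together to the data portal.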
Su, Bi-ying; Liu, Shao-nan; Li, Xiao-yan
2011-11-01
To study the train of thought and procedures for developing the theoretical framework and the item pool of the peri-operative recovery scale for integrative medicine, thus making preparation for the development of this scale and its psychometric testing. Under the guidance of Chinese medicine theories and of the principles for developing psychometric scales, the theoretical framework and the item pool of the scale were initially laid out through literature retrieval, expert consultation, and related methods. The scale covered the domains of physical function, mental function, activity function, pain, and general assessment. In addition, social function is involved, which is suitable for pre-operative testing and long-term therapeutic efficacy testing after discharge from hospital. Each domain should cover correlated Zang-Fu organs, qi, blood, and the patient-reported outcomes. A total of 122 items were initially included in the item pool according to the theoretical framework of the scale. The peri-operative recovery scale of integrative medicine is the embodiment of the combination of Chinese medicine theories and patient-reported outcome concepts. The scale could reasonably assess the peri-operative recovery outcomes of patients treated by integrative medicine.
Phillips, Cameron J; Marshall, Andrea P; Chaves, Nadia J; Jankelowitz, Stacey K; Lin, Ivan B; Loy, Clement T; Rees, Gwyneth; Sakzewski, Leanne; Thomas, Susie; To, The-Phung; Wilkinson, Shelley A; Michie, Susan
2015-01-01
The Theoretical Domains Framework (TDF) is an integrative framework developed from a synthesis of psychological theories as a vehicle to help apply theoretical approaches to interventions aimed at behavior change. This study explores experiences of TDF use by professionals from multiple disciplines across diverse clinical settings. Mixed methods were used to examine experiences, attitudes, and perspectives of health professionals in using the TDF in health care implementation projects. Individual interviews were conducted with ten health care professionals from six disciplines who used the TDF in implementation projects. Deductive content and thematic analysis were used. Three main themes and associated subthemes were identified including: 1) reasons for use of the TDF (increased confidence, broader perspective, and theoretical underpinnings); 2) challenges using the TDF (time and resources, operationalization of the TDF) and; 3) future use of the TDF. The TDF provided a useful, flexible framework for a diverse group of health professionals working across different clinical settings for the assessment of barriers and targeting resources to influence behavior change for implementation projects. The development of practical tools and training or support is likely to aid the utility of TDF.
Hunter, Teressa Sanders; Tilley, Donna Scott
2015-01-01
This review of the literature identifies themes, variables, goals, and gaps in the literature related to HIV and AIDS among African American women. Black Feminist Epistemology and symbolic interactionism are used as a theoretical perspective and philosophical framework to examine the experiences and social behaviors of African American women and to guide a framework for explaining the findings from the literature. This theoretical perspective and philosophical framework can also be used in understanding the processes used by African American women in behavioral, social, and intimate interactions.
Mitchell, Brett G; Gardner, Anne
2014-03-01
To present a discussion on theoretical frameworks in infection prevention and control. Infection prevention and control programmes have been in place for several years in response to the incidence of healthcare-associated infections and their associated morbidity and mortality. Theoretical frameworks play an important role in formalizing the understanding of infection prevention activities. Discussion paper. A literature search using electronic databases was conducted for published articles in English addressing theoretical frameworks in infection prevention and control between 1980 and 2012. Nineteen papers that included a reference to frameworks were identified in the review. A narrative analysis of these papers was completed. Two models were identified and neither included the role of surveillance. To reduce the risk of acquiring a healthcare-associated infection, a multifaceted approach to infection prevention is required. One key component in this approach is surveillance. The review identified two infection prevention and control frameworks, yet these are rarely applied in infection prevention and control programmes. Only one framework considered the multifaceted approach required for infection prevention. It did not, however, incorporate the role of surveillance. We present a framework that incorporates the role of surveillance into a biopsychosocial approach to infection prevention and control. Infection prevention and control programmes and associated research are led primarily by nurses. There is a need for an explicit infection prevention and control framework incorporating the important role that surveillance has in infection prevention activities. This study presents one framework for further critique and discussion. © 2013 John Wiley & Sons Ltd.
Decision support models for solid waste management: Review and game-theoretic approaches
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karmperis, Athanasios C., E-mail: athkarmp@mail.ntua.gr; Army Corps of Engineers, Hellenic Army General Staff, Ministry of Defence; Aravossis, Konstantinos
Highlights: ► The mainly used decision support frameworks for solid waste management are reviewed. ► The LCA, CBA and MCDM models are presented and their strengths, weaknesses, similarities and possible combinations are analyzed. ► The game-theoretic approach in a solid waste management context is presented. ► The waste management bargaining game is introduced as a specific decision support framework. ► Cooperative and non-cooperative game-theoretic approaches to decision support for solid waste management are discussed. - Abstract: This paper surveys decision support models that are commonly used in the solid waste management area. Most models are mainly developed within three decision support frameworks, which are the life-cycle assessment, the cost–benefit analysis and the multi-criteria decision-making. These frameworks are reviewed and their strengths and weaknesses as well as their critical issues are analyzed, while their possible combinations and extensions are also discussed. Furthermore, the paper presents how cooperative and non-cooperative game-theoretic approaches can be used for the purpose of modeling and analyzing decision-making in situations with multiple stakeholders. Specifically, since a waste management model is sustainable when considering not only environmental and economic but also social aspects, the waste management bargaining game is introduced as a specific decision support framework in which future models can be developed.
Why do children and adolescents bully their peers? A critical review of key theoretical frameworks.
Thomas, Hannah J; Connor, Jason P; Scott, James G
2018-05-01
Bullying is a significant public health problem for children and adolescents worldwide. Evidence suggests that both being bullied (bullying victimisation) and bullying others (bullying perpetration) are associated with concurrent and future mental health problems. The onset and course of bullying perpetration are influenced by individual as well as systemic factors. Identifying effective solutions to address bullying requires a fundamental understanding of why it occurs. Drawing from multi-disciplinary domains, this review provides a summary and synthesis of the key theoretical frameworks applied to understanding and intervening on the issue of bullying. A number of explanatory models have been used to elucidate the dynamics of bullying, and broadly these correspond with either system (e.g., social-ecological, family systems, peer-group socialisation) or individual-level (e.g., developmental psychopathology, genetic, resource control, social-cognitive) frameworks. Each theory adds a unique perspective; however, no single framework comprehensively explains why bullying occurs. This review demonstrates that the integration of theoretical perspectives achieves a more nuanced understanding of bullying which is necessary for strengthening evidence-based interventions. Future progress requires researchers to integrate both the systems and individual-level theoretical frameworks to further improve current interventions. More effective intervention across different systems as well as tailoring interventions to the specific needs of the individuals directly involved in bullying will reduce exposure to a key risk factor for mental health problems.
DOT National Transportation Integrated Search
2012-05-01
EnableATIS is looking ahead to a future operational environment that will support and enable an advanced, transformational traveler information services framework. This future framework is envisioned to be enabled with a much more robust pool of real...
Robust group-wise rigid registration of point sets using t-mixture model
NASA Astrophysics Data System (ADS)
Ravikumar, Nishant; Gooya, Ali; Frangi, Alejandro F.; Taylor, Zeike A.
2016-03-01
A probabilistic framework is proposed for robust, group-wise rigid alignment of point sets using a mixture of Student's t-distributions, especially when the point sets are of varying lengths, are corrupted by an unknown degree of outliers, or contain missing data. Medical images (in particular magnetic resonance (MR) images), their segmentations, and consequently the point sets generated from these are highly susceptible to corruption by outliers. This poses a problem for robust correspondence estimation and accurate alignment of shapes, necessary for training statistical shape models (SSMs). To address these issues, this study proposes to use a t-mixture model (TMM) to approximate the underlying joint probability density of a group of similar shapes and align them to a common reference frame. The heavy-tailed nature of t-distributions provides a more robust registration framework in comparison to state-of-the-art algorithms. Significant reduction in alignment errors is achieved in the presence of outliers, using the proposed TMM-based group-wise rigid registration method, in comparison to its Gaussian mixture model (GMM) counterparts. The proposed TMM framework is compared with a group-wise variant of the well-known Coherent Point Drift (CPD) algorithm and two other group-wise methods using GMMs, using both synthetic and real data sets. Rigid alignment errors for groups of shapes are quantified using the Hausdorff distance (HD) and quadratic surface distance (QSD) metrics.
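The mechanism behind the robustness claim can be shown in a few lines: in the E-step of a t-mixture fit, each point is down-weighted according to its Mahalanobis distance, so outliers barely influence the alignment. The degrees of freedom and the toy data below are assumptions of the sketch.

    import numpy as np

    def t_weights(X, mean, cov, nu):
        """E-step weights u_i = (nu + d) / (nu + Mahalanobis^2) under a Student's t model."""
        d = X.shape[1]
        diff = X - mean
        maha2 = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(cov), diff)
        return (nu + d) / (nu + maha2)

    rng = np.random.default_rng(2)
    X = np.vstack([rng.normal(0, 1, size=(100, 2)),    # inlier shape points
                   rng.normal(8, 1, size=(5, 2))])     # gross outliers
    w = t_weights(X, X.mean(0), np.cov(X, rowvar=False), nu=3.0)
    print(w[:3], w[-3:])   # outliers receive much smaller weights

A GMM corresponds to the limit of large nu, where these weights become uniform, which is why Gaussian mixtures are more easily dragged by outliers.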
Unified Framework for Development, Deployment and Robust Testing of Neuroimaging Algorithms
Joshi, Alark; Scheinost, Dustin; Okuda, Hirohito; Belhachemi, Dominique; Murphy, Isabella; Staib, Lawrence H.; Papademetris, Xenophon
2011-01-01
Developing both graphical and command-line user interfaces for neuroimaging algorithms requires considerable effort. Neuroimaging algorithms can meet their potential only if they can be easily and frequently used by their intended users. Deployment of a large suite of such algorithms on multiple platforms requires consistency of user interface controls, consistent results across various platforms and thorough testing. We present the design and implementation of a novel object-oriented framework that allows for rapid development of complex image analysis algorithms with many reusable components and the ability to easily add graphical user interface controls. Our framework also allows for simplified yet robust nightly testing of the algorithms to ensure stability and cross platform interoperability. All of the functionality is encapsulated into a software object requiring no separate source code for user interfaces, testing or deployment. This formulation makes our framework ideal for developing novel, stable and easy-to-use algorithms for medical image analysis and computer assisted interventions. The framework has been both deployed at Yale and released for public use in the open source multi-platform image analysis software—BioImage Suite (bioimagesuite.org). PMID:21249532
Luo, Xiongbiao; Wan, Ying; He, Xiangjian
2015-04-01
Electromagnetically guided endoscopic procedures, which aim at accurately and robustly localizing the endoscope, involve multimodal sensory information during interventions. However, it remains challenging to integrate this information for precise and stable endoscopic guidance. To tackle this challenge, this paper proposes a new framework on the basis of an enhanced particle swarm optimization method to effectively fuse this information for accurate and continuous endoscope localization. The authors use the particle swarm optimization method, one of the stochastic evolutionary computation algorithms, to effectively fuse the multimodal information, including preoperative information (i.e., computed tomography images) as a frame of reference, endoscopic camera videos, and positional sensor measurements (i.e., electromagnetic sensor outputs). Since evolutionary computation methods are usually limited by possible premature convergence and fixed evolutionary factors, the authors introduce the current (endoscopic camera and electromagnetic sensor) observation to boost the particle swarm optimization and also adaptively update the evolutionary parameters in accordance with spatial constraints and the current observation, resulting in advantageous performance of the enhanced algorithm. The experimental results demonstrate that the authors' proposed method provides a more accurate and robust endoscopic guidance framework than state-of-the-art methods. The average guidance accuracy of the authors' framework was about 3.0 mm and 5.6°, while the previous methods showed at least 3.9 mm and 7.0°. The average position and orientation smoothness of their method was 1.0 mm and 1.6°, which is significantly better than the other methods (at least 2.0 mm and 2.6°). Additionally, the average visual quality of the endoscopic guidance was improved to 0.29. A robust electromagnetically guided endoscopy framework was proposed on the basis of an enhanced particle swarm optimization method using the current observation information and adaptive evolutionary factors. The authors' proposed framework greatly reduced the guidance errors from (4.3 mm, 7.8°) to (3.0 mm, 5.6°), compared to state-of-the-art methods.
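A bare-bones particle swarm sketch of this fusion idea is given below; the cost terms, weights, and three-dimensional pose parameterization are placeholders rather than the authors' formulation, and the image-similarity term is a stand-in function.

    import numpy as np

    def cost(pose, em_measurement, image_term, w_em=0.5, w_img=0.5):
        em_term = np.linalg.norm(pose - em_measurement)   # agreement with EM sensor
        return w_em * em_term + w_img * image_term(pose)  # plus camera similarity

    def pso(em_measurement, image_term, n_particles=30, iters=50, seed=0):
        rng = np.random.default_rng(seed)
        x = em_measurement + rng.normal(0, 5.0, size=(n_particles, 3))
        v = np.zeros_like(x)
        pbest = x.copy()
        pbest_cost = np.array([cost(p, em_measurement, image_term) for p in x])
        gbest = pbest[pbest_cost.argmin()].copy()
        for _ in range(iters):
            r1, r2 = rng.random((2, n_particles, 1))
            v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
            x = x + v
            c = np.array([cost(p, em_measurement, image_term) for p in x])
            better = c < pbest_cost
            pbest[better], pbest_cost[better] = x[better], c[better]
            gbest = pbest[pbest_cost.argmin()].copy()
        return gbest

    true_pos = np.array([10.0, 5.0, 2.0])
    image_term = lambda p: np.linalg.norm(p - true_pos)          # stand-in similarity
    print(pso(em_measurement=true_pos + [1.5, -1.0, 0.5], image_term=image_term))

The enhancement described in the abstract would, in addition, re-seed or re-weight particles using the current observation and adapt the inertia and acceleration factors, which this sketch keeps fixed.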
The necessity of a theory of biology for tissue engineering: metabolism-repair systems.
Ganguli, Suman; Hunt, C Anthony
2004-01-01
Since there is no widely accepted global theory of biology, tissue engineering and bioengineering lack a theoretical understanding of the systems being engineered. By default, tissue engineering operates with a "reductionist" theoretical approach, inherited from traditional engineering of non-living materials. Long term, that approach is inadequate, since it ignores essential aspects of biology. Metabolism-repair systems are a theoretical framework which explicitly represents two "functional" aspects of living organisms: self-repair and self-replication. Since repair and replication are central to tissue engineering, we advance metabolism-repair systems as a potential theoretical framework for tissue engineering. We present an overview of the framework, and indicate directions to pursue for extending it to the context of tissue engineering. We focus on biological networks, both metabolic and cellular, as one such direction. The construction of these networks, in turn, depends on biological protocols. Together these concepts may help point the way to a global theory of biology appropriate for tissue engineering.
ERIC Educational Resources Information Center
Winkle-Wagner, Rachelle
2012-01-01
This article examines the psychological theoretical foundations of college student development theory and the theoretical assumptions of this framework. A complementary, sociological perspective and the theoretical assumptions of this approach are offered. The potential limitations of the overuse of each perspective are considered. The conclusion…
Robustness and percolation of holes in complex networks
NASA Astrophysics Data System (ADS)
Zhou, Andu; Maletić, Slobodan; Zhao, Yi
2018-07-01
Efficient robustness and fault tolerance of a complex network are significantly influenced by its connectivity, commonly modeled by the structure of pairwise relations between network elements, i.e., nodes. Nevertheless, aggregations of nodes build higher-order structures embedded in the complex network, which may be more vulnerable when a fraction of nodes is removed. The structure of higher-order aggregations of nodes can be naturally modeled by simplicial complexes, and the removal of nodes affects the values of topological invariants, like the number of higher-dimensional holes quantified by Betti numbers. Following the methodology of percolation theory, as the fraction of removed nodes grows, new holes appear which act as mergers of already present holes. In the present article, the relationship between the robustness and the homological properties of a complex network is studied by relating graph-theoretical signatures of robustness to quantities derived from topological invariants. The simulation results of random failures and intentional attacks on networks suggest that changes in the graph-theoretical signatures of robustness are accompanied by differences in the distribution of the number of holes per cluster under different attack strategies. In the broader sense, the results indicate the importance of research on topological invariants for obtaining further insights into the dynamics taking place over complex networks.
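A simple graph-theoretical robustness signature of the kind the article relates to its homological quantities is the size of the largest connected component as nodes are removed; the sketch below contrasts random failures with a degree-targeted attack (the Betti-number computation on the induced simplicial complexes needs additional machinery and is not shown).

    import random
    import networkx as nx

    def largest_component_curve(G, removal_order):
        H = G.copy()
        sizes = []
        for node in removal_order:
            H.remove_node(node)
            sizes.append(max((len(c) for c in nx.connected_components(H)), default=0))
        return sizes

    G = nx.barabasi_albert_graph(500, 3, seed=1)
    random_order = random.Random(1).sample(list(G.nodes), 250)                   # failures
    attack_order = [n for n, _ in sorted(G.degree, key=lambda t: -t[1])][:250]   # attack
    print(largest_component_curve(G, random_order)[-1],
          largest_component_curve(G, attack_order)[-1])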
Cusack, Lynette; Smith, Morgan; Hegney, Desley; Rees, Clare S; Breen, Lauren J; Witt, Regina R; Rogers, Cath; Williams, Allison; Cross, Wendy; Cheung, Kin
2016-01-01
Building nurses' resilience to complex and stressful practice environments is necessary to keep skilled nurses in the workplace and to ensure safe patient care. A unified theoretical framework, titled the Health Services Workplace Environmental Resilience Model (HSWERM), is presented to explain the environmental factors in the workplace that promote nurses' resilience. The framework builds on a previously published theoretical model of individual resilience, which identified the key constructs of psychological resilience as self-efficacy, coping and mindfulness, but did not examine environmental factors in the workplace that promote nurses' resilience. This unified theoretical framework was developed using a literary synthesis drawing on data from international studies and literature reviews on the nursing workforce in hospitals. The most frequent workplace environmental factors were identified, extracted and clustered in alignment with key constructs for psychological resilience. Six major organizational concepts emerged that related to a positive resilience-building workplace and formed the foundation of the theoretical model. Three concepts related to nursing staff support (professional, practice, personal) and three related to nursing staff development (professional, practice, personal) within the workplace environment. The unified theoretical model incorporates these concepts within the workplace context, linking to the nurse and then impacting on personal resilience and workplace outcomes, and its use has the potential to increase staff retention and quality of patient care.
Prescott, Sarah; Fleming, Jennifer; Doig, Emmah
2017-06-11
The aim of this study was to explore clinicians' experiences of implementing goal setting with community dwelling clients with acquired brain injury, to develop a goal setting practice framework. Grounded theory methodology was employed. Clinicians, representing six disciplines across seven services, were recruited and interviewed until theoretical saturation was achieved. A total of 22 clinicians were interviewed. A theoretical framework was developed to explain how clinicians support clients to actively engage in goal setting in routine practice. The framework incorporates three phases: a needs identification phase, a goal operationalisation phase, and an intervention phase. Contextual factors, including personal and environmental influences, also affect how clinicians and clients engage in this process. Clinicians use additional strategies to support clients with impaired self-awareness. These include structured communication and metacognitive strategies to operationalise goals. For clients with emotional distress, clinicians provide additional time and intervention directed at new identity development. The goal setting practice framework may guide clinician's understanding of how to engage in client-centred goal setting in brain injury rehabilitation. There is a predilection towards a client-centred goal setting approach in the community setting, however, contextual factors can inhibit implementation of this approach. Implications for Rehabilitation The theoretical framework describes processes used to develop achievable client-centred goals with people with brain injury. Building rapport is a core strategy to engage clients with brain injury in goal setting. Clients with self-awareness impairment benefit from additional metacognitive strategies to participate in goal setting. Clients with emotional distress may need additional time for new identity development.
Shembel, Adrianna C; Sandage, Mary J; Verdolini Abbott, Katherine
2017-01-01
The purposes of this literature review were (1) to identify and assess frameworks for clinical characterization of episodic laryngeal breathing disorders (ELBD) and their subtypes, (2) to integrate concepts from these frameworks into a novel theoretical paradigm, and (3) to provide a preliminary algorithm to classify clinical features of ELBD for future study of its clinical manifestations and underlying pathophysiological mechanisms. This is a literature review. Peer-reviewed literature from 1983 to 2015 pertaining to models for ELBD was searched using Pubmed, Ovid, Proquest, Cochrane Database of Systematic Reviews, and Google Scholar. Theoretical models for ELBD were identified, evaluated, and integrated into a novel comprehensive framework. Consensus across three salient models provided a working definition and inclusionary criteria for ELBD within the new framework. Inconsistencies and discrepancies within the models provided an analytic platform for future research. Comparison among three conceptual models-(1) Irritable larynx syndrome, (2) Dichotomous triggers, and (3) Periodic occurrence of laryngeal obstruction-showed that the models uniformly consider ELBD to involve episodic laryngeal obstruction causing dyspnea. The models differed in their description of source of dyspnea, in their inclusion of corollary behaviors, in their inclusion of other laryngeal-based behaviors (eg, cough), and types of triggers. The proposed integrated theoretical framework for ELBD provides a preliminary systematic platform for the identification of key clinical feature patterns indicative of ELBD and associated clinical subgroups. This algorithmic paradigm should evolve with better understanding of this spectrum of disorders and its underlying pathophysiological mechanisms. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
Uher, Jana
2011-09-01
Animal researchers are increasingly interested in individual differences in behavior. Their interpretation as meaningful differences in behavioral strategies stable over time and across contexts, adaptive, heritable, and acted upon by natural selection has triggered new theoretical developments. However, the analytical approaches used to explore behavioral data still address population-level phenomena, and statistical methods suitable to analyze individual behavior are rarely applied. I discuss fundamental investigative principles and analytical approaches to explore whether, in what ways, and under which conditions individual behavioral differences are actually meaningful. I elaborate the meta-theoretical ideas underlying common theoretical concepts and integrate them into an overarching meta-theoretical and methodological framework. This unravels commonalities and differences, and shows that assumptions of analogy to concepts of human personality are not always warranted and that some theoretical developments may be based on methodological artifacts. Yet, my results also highlight possible directions for new theoretical developments in animal behavior research. Copyright © 2011 Wiley Periodicals, Inc.
An index-based robust decision making framework for watershed management in a changing climate.
Kim, Yeonjoo; Chung, Eun-Sung
2014-03-01
This study developed an index-based robust decision making framework for watershed management dealing with water quantity and quality issues in a changing climate. It consists of two parts: management alternative development and alternative analysis. The first part, alternative development, consists of six steps: 1) understand the watershed components and processes using the HSPF model; 2) identify the spatial vulnerability ranking using two indices, potential streamflow depletion (PSD) and potential water quality deterioration (PWQD); 3) quantify the residents' preferences on water management demands and calculate the watershed evaluation index, which is the weighted combination of PSD and PWQD; 4) set the quantitative targets for water quantity and quality; 5) develop a list of feasible alternatives; and 6) eliminate the unacceptable alternatives. The second part, alternative analysis, has three steps: 7) analyze all selected alternatives with a hydrologic simulation model considering various climate change scenarios; 8) quantify the alternative evaluation index, which includes social and hydrologic criteria, utilizing multi-criteria decision analysis methods; and 9) prioritize all options based on a minimax regret strategy for robust decisions. This framework addresses the uncertainty inherent in climate models and climate change scenarios by utilizing the minimax regret strategy, a decision making strategy under deep uncertainty, and the procedure thus derives a robust prioritization based on the multiple utilities of alternatives from various scenarios. In this study, the proposed procedure was applied to a Korean urban watershed, which has suffered from streamflow depletion and water quality deterioration. Our application shows that the framework provides a useful watershed management tool for incorporating quantitative and qualitative information into the evaluation of various policies with regard to water resource planning and management. Copyright © 2013 Elsevier B.V. All rights reserved.
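Step 9 above reduces to a small calculation once each alternative's utility has been evaluated under every scenario; the numbers in this sketch are invented for illustration only:

    import numpy as np

    # Rows: management alternatives; columns: climate-change scenarios; entries: utility.
    utilities = np.array([[0.72, 0.55, 0.60],
                          [0.65, 0.62, 0.58],
                          [0.80, 0.40, 0.52]])
    regret = utilities.max(axis=0) - utilities      # regret against each scenario's best
    max_regret = regret.max(axis=1)                 # worst regret per alternative
    priority = np.argsort(max_regret)               # minimax-regret priority order
    print(max_regret, priority)

An alternative that is never far from the best in any scenario receives a small maximum regret and hence a high priority, which is the sense in which the prioritization is robust to deep uncertainty.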
Cloud computing strategic framework (FY13 - FY15).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arellano, Lawrence R.; Arroyo, Steven C.; Giese, Gerald J.
This document presents an architectural framework (plan) and roadmap for the implementation of a robust Cloud Computing capability at Sandia National Laboratories. It is intended to be a living document and serve as the basis for detailed implementation plans, project proposals and strategic investment requests.
The Fundamentals of Care Framework as a Point-of-Care Nursing Theory.
Kitson, Alison L
Nursing theories have attempted to shape the everyday practice of clinical nurses and patient care. However, many theories-because of their level of abstraction and distance from everyday caring activity-have failed to help nurses undertake the routine practical aspects of nursing care in a theoretically informed way. The purpose of the paper is to present a point-of-care theoretical framework, called the fundamentals of care (FOC) framework, which explains, guides, and potentially predicts the quality of care nurses provide to patients, their carers, and family members. The theoretical framework is presented: person-centered fundamental care (PCFC), which is both the outcome for the patient and the nurse and the goal of the FOC framework, is achieved through the active management of the practice process, which involves the nurse and the patient working together to integrate three core dimensions: establishing the nurse-patient relationship, integrating the FOC into the patient's care plan, and ensuring that the setting or context where care is transacted and coordinated is conducive to achieving PCFC outcomes. Each dimension has multiple elements and subelements, which require unique assessment for each nurse-patient encounter. The FOC framework is presented along with two scenarios to demonstrate its usefulness. The dimensions, elements, and subelements are described, and next steps in its development are articulated.
Robust estimation approach for blind denoising.
Rabie, Tamer
2005-11-01
This work develops a new robust statistical framework for blind image denoising. Robust statistics addresses the problem of estimation when the idealized assumptions about a system are occasionally violated. The contaminating noise in an image is considered as a violation of the assumption of spatial coherence of the image intensities and is treated as an outlier random variable. A denoised image is estimated by fitting a spatially coherent stationary image model to the available noisy data using a robust estimator-based regression method within an optimal-size adaptive window. The robust formulation aims at eliminating the noise outliers while preserving the edge structures in the restored image. Several examples demonstrating the effectiveness of this robust denoising technique are reported and a comparison with other standard denoising filters is presented.
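As a rough sketch of the underlying idea (fit a locally coherent model with a robust estimator so that noise outliers receive little weight), the following code estimates each pixel as a Huber M-estimate of the intensities in a fixed window. It is a simplified stand-in for the adaptive-window robust regression described in the abstract; the window size, tuning constant, and constant local model are assumptions.

```python
import numpy as np

def huber_location(values, c=1.345, iters=10):
    """Iteratively reweighted Huber M-estimate of a 1-D sample's location."""
    mu = np.median(values)
    scale = np.median(np.abs(values - mu)) + 1e-8        # robust scale (MAD)
    for _ in range(iters):
        r = (values - mu) / scale
        w = np.minimum(1.0, c / np.maximum(np.abs(r), 1e-12))   # Huber weights
        mu = np.sum(w * values) / np.sum(w)
    return mu

def robust_denoise(image, half=2):
    """Denoise by replacing each pixel with a robust local location estimate."""
    padded = np.pad(image, half, mode="reflect")
    out = np.empty_like(image, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            window = padded[i:i + 2 * half + 1, j:j + 2 * half + 1].ravel()
            out[i, j] = huber_location(window)
    return out

# Tiny usage example: a smooth ramp corrupted by sparse, large-amplitude outliers.
rng = np.random.default_rng(0)
img = np.tile(np.linspace(0.0, 1.0, 32), (32, 1))
noisy = img + np.where(rng.random(img.shape) < 0.05, rng.normal(0, 2, img.shape), 0.0)
denoised = robust_denoise(noisy)
```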
NASA Astrophysics Data System (ADS)
Zhang, Lucy
In this talk, we present a robust numerical framework to model and simulate gas-liquid-solid three-phase flows. The overall algorithm adopts a non-boundary-fitted approach that avoids frequent mesh-updating procedures by defining independent meshes and explicit interfacial points to represent each phase. In this framework, we couple the immersed finite element method (IFEM) and the connectivity-free front tracking (CFFT) method, which model fluid-solid and gas-liquid interactions, respectively, for the three-phase models. The CFFT, used here to simulate gas-liquid multi-fluid flows, represents the gas-liquid interface with explicit interfacial points and easily handles changes in interface topology. Instead of defining different levels simultaneously as used in level sets, an indicator function naturally couples the two methods together to represent and track each of the three phases. Several 2-D and 3-D testing cases are performed to demonstrate the robustness and capability of the coupled numerical framework in dealing with complex three-phase problems, in particular free surfaces interacting with deformable solids. The solution technique offers accuracy and stability, which provides a means to simulate various engineering applications. The author would like to acknowledge the support from NIH/DHHS R01-2R01DC005642-10A1 and the National Natural Science Foundation of China (NSFC) 11550110185.
Motion field estimation for a dynamic scene using a 3D LiDAR.
Li, Qingquan; Zhang, Liang; Mao, Qingzhou; Zou, Qin; Zhang, Pin; Feng, Shaojun; Ochieng, Washington
2014-09-09
This paper proposes a novel motion field estimation method based on a 3D light detection and ranging (LiDAR) sensor for motion sensing for intelligent driverless vehicles and active collision avoidance systems. Unlike multiple target tracking methods, which estimate the motion state of detected targets, such as cars and pedestrians, motion field estimation regards the whole scene as a motion field in which each little element has its own motion state. Compared to multiple target tracking, segmentation errors and data association errors have much less significance in motion field estimation, making it more accurate and robust. This paper presents an intact 3D LiDAR-based motion field estimation method, including pre-processing, a theoretical framework for the motion field estimation problem and practical solutions. The 3D LiDAR measurements are first projected to small-scale polar grids, and then, after data association and Kalman filtering, the motion state of every moving grid is estimated. To reduce computing time, a fast data association algorithm is proposed. Furthermore, considering the spatial correlation of motion among neighboring grids, a novel spatial-smoothing algorithm is also presented to optimize the motion field. The experimental results using several data sets captured in different cities indicate that the proposed motion field estimation is able to run in real-time and performs robustly and effectively.
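A minimal per-grid-cell Kalman filter with a constant-velocity model conveys the idea of estimating a motion state for every small element of the scene; the matrices and noise levels below are illustrative assumptions, not the parameters used in the paper.

```python
import numpy as np

dt = 0.1                                   # LiDAR frame interval (assumed)
F = np.array([[1, 0, dt, 0],               # state: [x, y, vx, vy]
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0],                # only the cell position is observed
              [0, 1, 0, 0]], dtype=float)
Q = 0.01 * np.eye(4)                       # process noise (assumed)
R = 0.05 * np.eye(2)                       # measurement noise (assumed)

def kf_step(x, P, z):
    """One predict/update cycle for a single occupied grid cell."""
    x = F @ x                              # predict state
    P = F @ P @ F.T + Q
    y = z - H @ x                          # innovation from the associated measurement
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y                          # update
    P = (np.eye(4) - K @ H) @ P
    return x, P                            # x[2:4] is the cell's motion estimate
```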
Crises and Collective Socio-Economic Phenomena: Simple Models and Challenges
NASA Astrophysics Data System (ADS)
Bouchaud, Jean-Philippe
2013-05-01
Financial and economic history is strewn with bubbles and crashes, booms and busts, crises and upheavals of all sorts. Understanding the origin of these events is arguably one of the most important problems in economic theory. In this paper, we review recent efforts to include heterogeneities and interactions in models of decision-making. We argue that the so-called Random Field Ising model (RFIM) provides a unifying framework to account for many collective socio-economic phenomena that lead to sudden ruptures and crises. We discuss different models that can capture potentially destabilizing self-referential feedback loops, induced either by herding, i.e. reference to peers, or trending, i.e. reference to the past, and that account for some of the phenomenology missing in the standard models. We discuss some empirically testable predictions of these models, for example robust signatures of RFIM-like herding effects, or the logarithmic decay of spatial correlations of voting patterns. One of the most striking results, inspired by statistical physics methods, is that Adam Smith's invisible hand can fail badly at solving simple coordination problems. We also insist on the issue of time-scales, which can be extremely long in some cases, and prevent socially optimal equilibria from being reached. As a theoretical challenge, the study of so-called "detailed-balance"-violating decision rules is needed to decide whether conclusions based on current models (that all assume detailed-balance) are indeed robust and generic.
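A bare-bones simulation of RFIM-type binary decisions with herding illustrates the sudden collective switches described above; the update rule is standard zero-temperature dynamics with mean-field imitation, and the parameter values are arbitrary rather than calibrated to any economic data.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 2_000
field = rng.normal(0.0, 1.0, N)      # idiosyncratic preferences f_i
J = 1.2                              # herding (imitation) strength
s = -np.ones(N)                      # initial binary decisions (-1 / +1)

def sweep(s, h):
    """Zero-temperature update: each agent aligns with its local incentive."""
    for i in rng.permutation(N):
        incentive = field[i] + J * s.mean() + h   # private + social + external
        s[i] = 1.0 if incentive >= 0 else -1.0
    return s

# Slowly ramp the external incentive h and record the average opinion;
# for large enough J the response exhibits an abrupt, crisis-like jump.
for h in np.linspace(-2, 2, 41):
    s = sweep(s, h)
    print(f"h = {h:+.2f}  mean opinion = {s.mean():+.3f}")
```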
Collaborative Research: Robust Climate Projections and Stochastic Stability of Dynamical Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghil, Michael; McWilliams, James; Neelin, J. David
The project was completed along the lines of the original proposal, with additional elements arising as new results were obtained. The originally proposed three thrusts were expanded to include an additional, fourth one. (i) The effects of stochastic perturbations on climate models have been examined at the fundamental level by using the theory of deterministic and random dynamical systems, in both finite and infinite dimensions. (ii) The theoretical results have been implemented first on a delay-differential equation (DDE) model of the El-Niño/Southern-Oscillation (ENSO) phenomenon. (iii) More detailed, physical aspects of model robustness have been considered, as proposed, within the stripped-down ICTP-AGCM (formerly SPEEDY) climate model. This aspect of the research has been complemented by both observational and intermediate-model aspects of mid-latitude and tropical climate. (iv) An additional thrust of the research relied on new and unexpected results of (i) and involved reduced-modeling strategies and associated prediction aspects, which have been tested within the team's empirical model reduction (EMR) framework. Finally, more detailed, physical aspects have been considered within the stripped-down SPEEDY climate model. The results of each of these four complementary efforts are presented in the next four sections, organized by topic and by the team members concentrating on the topic under discussion.
Moorkanikkara, Srinivas Nageswaran; Blankschtein, Daniel
2010-12-21
How does one design a surfactant mixture using a set of available surfactants such that it exhibits a desired adsorption kinetics behavior? The traditional approach used to address this design problem involves conducting trial-and-error experiments with specific surfactant mixtures. This approach is typically time-consuming and resource-intensive and becomes increasingly challenging when the number of surfactants that can be mixed increases. In this article, we propose a new theoretical framework to identify a surfactant mixture that most closely meets a desired adsorption kinetics behavior. Specifically, the new theoretical framework involves (a) formulating the surfactant mixture design problem as an optimization problem using an adsorption kinetics model and (b) solving the optimization problem using a commercial optimization package. The proposed framework aims to identify the surfactant mixture that most closely satisfies the desired adsorption kinetics behavior subject to the predictive capabilities of the chosen adsorption kinetics model. Experiments can then be conducted at the identified surfactant mixture condition to validate the predictions. We demonstrate the reliability and effectiveness of the proposed theoretical framework through a realistic case study by identifying a nonionic surfactant mixture consisting of up to four alkyl poly(ethylene oxide) surfactants (C(10)E(4), C(12)E(5), C(12)E(6), and C(10)E(8)) such that it most closely exhibits a desired dynamic surface tension (DST) profile. Specifically, we use the Mulqueen-Stebe-Blankschtein (MSB) adsorption kinetics model (Mulqueen, M.; Stebe, K. J.; Blankschtein, D. Langmuir 2001, 17, 5196-5207) to formulate the optimization problem as well as the SNOPT commercial optimization solver to identify a surfactant mixture consisting of these four surfactants that most closely exhibits the desired DST profile. Finally, we compare the experimental DST profile measured at the surfactant mixture condition identified by the new theoretical framework with the desired DST profile and find good agreement between the two profiles.
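The structure of the design problem (choose a composition minimizing the mismatch between a model-predicted dynamic surface tension profile and a desired one) can be sketched as a constrained least-squares problem. The predict_dst function below is a crude placeholder; in the paper that role is played by the MSB adsorption kinetics model, and the optimization is solved with SNOPT rather than SciPy. All numbers are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

times = np.linspace(0.1, 100.0, 50)                       # measurement times (s), illustrative
target_dst = 70.0 - 25.0 * (1 - np.exp(-times / 20.0))    # desired DST profile (mN/m), illustrative

def predict_dst(x, t):
    """Placeholder kinetics model: each surfactant lowers tension at its own rate."""
    rates = np.array([5.0, 15.0, 30.0, 60.0])             # assumed time constants
    drops = np.array([20.0, 24.0, 26.0, 28.0])            # assumed equilibrium tension drops
    return 70.0 - np.sum(x[:, None] * drops[:, None] * (1 - np.exp(-t / rates[:, None])), axis=0)

def mismatch(x):
    return np.sum((predict_dst(x, times) - target_dst) ** 2)

x0 = np.full(4, 0.25)                                      # equal initial composition
res = minimize(mismatch, x0,
               bounds=[(0, 1)] * 4,
               constraints=[{"type": "eq", "fun": lambda x: x.sum() - 1.0}])
print(res.x)   # composition that most closely reproduces the desired profile
```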
Patient Autonomy in a High-Tech Care Context - A Theoretical Framework.
Lindberg, Catharina; Fagerström, Cecilia; Willman, Ania
2018-06-12
To synthesise and interpret previous findings with the aim of developing a theoretical framework for patient autonomy in a high-tech care context. Putting the somewhat abstract concept of patient autonomy into practice can prove difficult since when it is highlighted in healthcare literature the patient perspective is often invisible. Autonomy presumes that a person has experience, education, self-discipline and decision-making capacity. Reference to autonomy in relation to patients in high-tech care environments could therefore be considered paradoxical, as in most cases these persons are vulnerable, with impaired physical and/or metacognitive capacity, thus making extended knowledge of patient autonomy for these persons even more important. Theory development. The basic approaches in theory development by Walker and Avant were used to create a theoretical framework through an amalgamation of the results from three qualitative studies conducted previously by the same research group. A theoretical framework - the control-partnership-transition framework - was delineated disclosing different parts co-creating the prerequisites for patient autonomy in high-tech care environments. Assumptions and propositional statements that guide theory development were also outlined, as were guiding principles for use in day-to-day nursing care. Four strategies used by patients were revealed: the strategy of control, the strategy of partnership, the strategy of trust, and the strategy of transition. An extended knowledge base, founded on theoretical reasoning about patient autonomy, could facilitate nursing care that would allow people to remain/become autonomous in the role of patient in high-tech care environments. The control-partnership-transition framework would be of help in supporting and defending patient autonomy when caring for individual patients, as it provides an understanding of the strategies employed by patients to achieve autonomy in high-tech care contexts. The guiding principles for patient autonomy presented could be used in nursing guidelines. This article is protected by copyright. All rights reserved.
Multi-scale theoretical investigation of hydrogen storage in covalent organic frameworks.
Tylianakis, Emmanuel; Klontzas, Emmanouel; Froudakis, George E
2011-03-01
The quest for efficient hydrogen storage materials has been the limiting step towards the commercialization of hydrogen as an energy carrier and has attracted a lot of attention from the scientific community. Sophisticated multi-scale theoretical techniques have been considered a valuable tool for the prediction of materials storage properties. Such techniques have also been used for the investigation of hydrogen storage in a novel category of porous materials known as Covalent Organic Frameworks (COFs). These framework materials consist of light elements and are characterized by exceptional physicochemical properties such as large surface areas and pore volumes. Combinations of ab initio, Molecular Dynamics (MD) and Grand Canonical Monte-Carlo (GCMC) calculations have been performed to investigate the hydrogen adsorption in these ultra-light materials. The purpose of the present review is to summarize the theoretical hydrogen storage studies that have been published after the discovery of COFs. Experimental and theoretical studies have proven that COFs have comparable or better hydrogen storage abilities than other competitive materials such as MOFs. The key factors that can lead to the improvement of the hydrogen storage properties of COFs are highlighted, accompanied by some recently presented theoretical multi-scale studies concerning these factors.
Teaching for clinical reasoning - helping students make the conceptual links.
McMillan, Wendy Jayne
2010-01-01
Dental educators complain that students struggle to apply what they have learnt theoretically in the clinical context. This paper is premised on the assumption that there is a relationship between conceptual thinking and clinical reasoning. The paper provides a theoretical framework for understanding the relationship between conceptual learning and clinical reasoning. A review of current literature is used to explain the way in which conceptual understanding influences clinical reasoning and the transfer of theoretical understandings to the clinical context. The paper argues that the connections made between concepts are what is significant about conceptual understanding. From this point of departure the paper describes teaching strategies that facilitate the kinds of learning opportunities that students need in order to develop conceptual understanding and to be able to transfer knowledge from theoretical to clinical contexts. Along with a variety of teaching strategies, the value of concept maps is discussed. The paper provides a framework for understanding the difficulties that students have in developing conceptual networks appropriate for later clinical reasoning. In explaining how students learn for clinical application, the paper provides a theoretical framework that can inform how dental educators facilitate the conceptual learning, and later clinical reasoning, of their students.
Sparse coding for flexible, robust 3D facial-expression synthesis.
Lin, Yuxu; Song, Mingli; Quynh, Dao Thi Phuong; He, Ying; Chen, Chun
2012-01-01
Computer animation researchers have been extensively investigating 3D facial-expression synthesis for decades. However, flexible, robust production of realistic 3D facial expressions is still technically challenging. A proposed modeling framework applies sparse coding to synthesize 3D expressive faces, using specified coefficients or expression examples. It also robustly recovers facial expressions from noisy and incomplete data. This approach can synthesize higher-quality expressions in less time than the state-of-the-art techniques.
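A generic sparse-coding reconstruction illustrates the mechanism: represent an observed (possibly noisy or incomplete) expression vector as a sparse combination of basis expressions and reuse the recovered coefficients for synthesis. The random dictionary below stands in for a real 3D face basis, and the Lasso penalty is one common way to obtain the sparse code; none of this is the authors' exact model.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
D = rng.normal(size=(300, 20))               # dictionary: 20 example expression vectors
D /= np.linalg.norm(D, axis=0)               # normalize dictionary atoms

coeffs_true = np.zeros(20)
coeffs_true[[2, 7, 11]] = [0.8, -0.5, 0.3]    # a face built from a few basis expressions
y = D @ coeffs_true + 0.01 * rng.normal(size=300)   # noisy observation

lasso = Lasso(alpha=0.01, max_iter=10_000)    # L1 penalty enforces a sparse code
lasso.fit(D, y)
x_hat = lasso.coef_                           # recovered sparse coefficients
reconstruction = D @ x_hat                    # synthesized expression vector
```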
Efficient Computation of Info-Gap Robustness for Finite Element Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stull, Christopher J.; Hemez, Francois M.; Williams, Brian J.
2012-07-05
A recent research effort at LANL proposed info-gap decision theory as a framework by which to measure the predictive maturity of numerical models. Info-gap theory explores the trade-offs between accuracy, that is, the extent to which predictions reproduce the physical measurements, and robustness, that is, the extent to which predictions are insensitive to modeling assumptions. Both accuracy and robustness are necessary to demonstrate predictive maturity. However, conducting an info-gap analysis can present a formidable challenge, from the standpoint of the required computational resources. This is because a robustness function requires the resolution of multiple optimization problems. This report offers an alternative, adjoint methodology to assess the info-gap robustness of Ax = b-like numerical models solved for a solution x. Two situations that can arise in structural analysis and design are briefly described and contextualized within the info-gap decision theory framework. The treatments of the info-gap problems using the adjoint methodology are outlined in detail, and the latter problem is solved for four separate finite element models. As compared to statistical sampling, the proposed methodology offers highly accurate approximations of info-gap robustness functions for the finite element models considered in the report, at a small fraction of the computational cost. It is noted that this report considers only linear systems; a natural follow-on study would extend the methodologies described herein to include nonlinear systems.
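In info-gap terms, the robustness function is the largest uncertainty horizon alpha for which the worst-case performance over all models within alpha of the nominal one still meets the requirement. The sketch below evaluates it by bisection with a sampled (rather than adjoint or exact) inner worst case, for a small Ax = b system with interval uncertainty on the entries of A; it conveys the structure of the calculation, not the report's adjoint methodology, and all values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
A0 = np.array([[4.0, 1.0], [1.0, 3.0]])     # nominal stiffness-like matrix
b = np.array([1.0, 2.0])
x_nom = np.linalg.solve(A0, b)

def worst_error(alpha, samples=2000):
    """Approximate worst-case deviation of x over |dA_ij| <= alpha (by sampling)."""
    worst = 0.0
    for _ in range(samples):
        dA = rng.uniform(-alpha, alpha, A0.shape)
        x = np.linalg.solve(A0 + dA, b)
        worst = max(worst, np.linalg.norm(x - x_nom))
    return worst

def robustness(r_crit, lo=0.0, hi=2.0, tol=1e-3):
    """Largest alpha whose worst-case error stays below the critical value r_crit."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if worst_error(mid) <= r_crit else (lo, mid)
    return lo

print(robustness(r_crit=0.2))
```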
A century of Gestalt psychology in visual perception: II. Conceptual and theoretical foundations.
Wagemans, Johan; Feldman, Jacob; Gepshtein, Sergei; Kimchi, Ruth; Pomerantz, James R; van der Helm, Peter A; van Leeuwen, Cees
2012-11-01
Our first review article (Wagemans et al., 2012) on the occasion of the centennial anniversary of Gestalt psychology focused on perceptual grouping and figure-ground organization. It concluded that further progress requires a reconsideration of the conceptual and theoretical foundations of the Gestalt approach, which is provided here. In particular, we review contemporary formulations of holism within an information-processing framework, allowing for operational definitions (e.g., integral dimensions, emergent features, configural superiority, global precedence, primacy of holistic/configural properties) and a refined understanding of its psychological implications (e.g., at the level of attention, perception, and decision). We also review 4 lines of theoretical progress regarding the law of Prägnanz-the brain's tendency of being attracted towards states corresponding to the simplest possible organization, given the available stimulation. The first considers the brain as a complex adaptive system and explains how self-organization solves the conundrum of trading between robustness and flexibility of perceptual states. The second specifies the economy principle in terms of optimization of neural resources, showing that elementary sensors working independently to minimize uncertainty can respond optimally at the system level. The third considers how Gestalt percepts (e.g., groups, objects) are optimal given the available stimulation, with optimality specified in Bayesian terms. Fourth, structural information theory explains how a Gestaltist visual system that focuses on internal coding efficiency yields external veridicality as a side effect. To answer the fundamental question of why things look as they do, a further synthesis of these complementary perspectives is required.
NASA Astrophysics Data System (ADS)
Oh, W. S.; Yu, D. J.; Davis, T.; Hillis, V.; Waring, T. M.
2017-12-01
One ongoing challenge to socio-hydrology is the problem of generalization: to what extent do common human-water co-evolutions exist across distinct cases, and what are the underlying mechanisms behind them? This problem stems in part from a lack of unifying theories in socio-hydrology, which hinders the explanation and generalization of results between cases in different regions. Theories help an analyst to make assumptions that are necessary to diagnose a specific phenomenon, to explain the general mechanisms of causation, and, thus, to predict future outcomes. To help address the issue, this study introduces two theories that are increasingly used in the fields of sustainability science and social-ecological systems research: robustness-fragility tradeoff (RFTO) and cultural multi-level selection (CMLS). We apply each of these theories to two distinct cases (water management issues in southwest Bangladesh and the Kissimmee River Basin, Florida) and interpret the phenomena of the levee effect and the adaptation effect. CMLS and RFTO focus on complementary aspects of socio-hydrological phenomena. The theory of RFTO, which is mostly about inherent tradeoffs associated with infrastructure improvements, explains how efforts to increase system robustness can generate hidden endogenous risks. CMLS theory, rooted in the broader theory of cultural evolution, concerns how human cultural dynamics can act as an endogenous driver of system change across multiple levels of social organization. Using the applied examples, we demonstrate that these two theories can provide an effective way to study social-hydrological systems and to overcome the generalization problem. Our work shows that multiple theories can be synthesized to give a richer understanding of diverse socio-hydrological patterns.
Baker, Phillip; Hawkes, Corinna; Wingrove, Kate; Demaio, Alessandro Rhyl; Parkhurst, Justin; Thow, Anne Marie; Walls, Helen
2018-01-01
Generating country-level political commitment will be critical to driving forward action throughout the United Nations Decade of Action on Nutrition (2016-2025). In this review of the empirical nutrition policy literature, we ask: what factors generate, sustain and constrain political commitment for nutrition, how and under what circumstances? Our aim is to inform strategic 'commitment-building' actions. We adopted a framework synthesis method and realist review protocol. An initial framework was derived from relevant theory and then populated with empirical evidence to test and modify it. Five steps were undertaken: initial theoretical framework development; search for relevant empirical literature; study selection and quality appraisal; data extraction, analysis and synthesis and framework modification. 75 studies were included. We identified 18 factors that drive commitment, organised into five categories: actors; institutions; political and societal contexts; knowledge, evidence and framing; and, capacities and resources. Irrespective of country-context, effective nutrition actor networks, strong leadership, civil society mobilisation, supportive political administrations, societal change and focusing events, cohesive and resonant framing, and robust data systems and available evidence were commitment drivers. Low-income and middle-income country studies also frequently reported international actors, empowered institutions, vertical coordination and capacities and resources. In upper-middle-income and high-income country studies, private sector interference frequently undermined commitment. Political commitment is not something that simply exists or emerges accidentally; it can be created and strengthened over time through strategic action. Successfully generating commitment will likely require a core set of actions with some context-dependent adaptations. Ultimately, it will necessitate strategic actions by cohesive, resourced and strongly led nutrition actor networks that are responsive to the multifactorial, multilevel and dynamic political systems in which they operate and attempt to influence. Accelerating the formation and effectiveness of such networks over the Nutrition Decade should be a core task for all actors involved.
Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis
ERIC Educational Resources Information Center
Young, Cristobal; Holsteen, Katherine
2017-01-01
Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…
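The core computation behind multimodel analysis (re-estimate a focal coefficient under every combination of candidate control variables and summarize the resulting distribution of estimates) can be sketched in a few lines. The data here are synthetic and the summary is deliberately minimal; the authors' software adds weighting, influence measures, and graphics beyond this.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)                       # focal predictor
controls = rng.normal(size=(n, 4))           # candidate control variables
y = 0.5 * x + controls @ np.array([0.3, 0.0, -0.2, 0.1]) + rng.normal(size=n)

estimates = []
for k in range(controls.shape[1] + 1):
    for subset in itertools.combinations(range(controls.shape[1]), k):
        X = np.column_stack([np.ones(n), x, controls[:, list(subset)]])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        estimates.append(beta[1])            # coefficient on the focal predictor

estimates = np.array(estimates)
print(f"{len(estimates)} models: mean={estimates.mean():.3f}, "
      f"min={estimates.min():.3f}, max={estimates.max():.3f}, "
      f"sign-stable={np.all(np.sign(estimates) == np.sign(estimates[0]))}")
```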
Robust and efficient anomaly detection using heterogeneous representations
NASA Astrophysics Data System (ADS)
Hu, Xing; Hu, Shiqiang; Xie, Jinhua; Zheng, Shiyou
2015-05-01
Various approaches have been proposed for video anomaly detection. Yet these approaches typically suffer from one or more limitations: they often characterize the pattern using its internal information, but ignore its external relationship which is important for local anomaly detection. Moreover, the high-dimensionality and the lack of robustness of pattern representation may lead to problems, including overfitting, increased computational cost and memory requirements, and high false alarm rate. We propose a video anomaly detection framework which relies on a heterogeneous representation to account for both the pattern's internal information and external relationship. The internal information is characterized by slow features learned by slow feature analysis from low-level representations, and the external relationship is characterized by the spatial contextual distances. The heterogeneous representation is compact, robust, efficient, and discriminative for anomaly detection. Moreover, both the pattern's internal information and external relationship can be taken into account in the proposed framework. Extensive experiments demonstrate the robustness and efficiency of our approach by comparison with the state-of-the-art approaches on the widely used benchmark datasets.
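A minimal linear slow feature analysis step (whiten the inputs, then keep the directions whose temporal derivative has least variance) can be written directly in NumPy. This generic sketch is not the authors' full pipeline, which applies SFA to low-level video descriptors and combines the result with spatial contextual distances.

```python
import numpy as np

def linear_sfa(X, n_features=2):
    """X: (T, D) multivariate time series. Returns the n slowest linear features."""
    X = X - X.mean(axis=0)
    # Whiten the signal.
    cov = np.cov(X, rowvar=False)
    d, E = np.linalg.eigh(cov)
    W_white = E @ np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12))) @ E.T
    Z = X @ W_white
    # Minimize the variance of the time derivative in the whitened space.
    dZ = np.diff(Z, axis=0)
    d2, U = np.linalg.eigh(np.cov(dZ, rowvar=False))
    W = W_white @ U[:, :n_features]          # eigenvectors with the smallest eigenvalues
    return X @ W                             # slow features, slowest first

# Example: a slow sinusoid hidden in a random mixture of faster signals.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 2000)
sources = np.column_stack([np.sin(0.5 * t), np.sin(7 * t), rng.normal(size=t.size)])
mixed = sources @ rng.normal(size=(3, 3))
slow = linear_sfa(mixed, n_features=1)
```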
Fuzzy robust credibility-constrained programming for environmental management and planning.
Zhang, Yimei; Hang, Guohe
2010-06-01
In this study, a fuzzy robust credibility-constrained programming (FRCCP) method is developed and applied to the planning for waste management systems. It incorporates the concepts of credibility-based chance-constrained programming and robust programming within an optimization framework. The developed method can reflect uncertainties presented as possibility-density by fuzzy-membership functions. Fuzzy credibility constraints are transformed to the crisp equivalents with different credibility levels, and ordinary fuzzy inclusion constraints are determined by their robust deterministic constraints by setting α-cut levels. The FRCCP method can provide different system costs under different credibility levels (λ). From the results of sensitivity analyses, the operation cost of the landfill is a critical parameter. For the management, any factors that would induce cost fluctuation during landfilling operation would deserve serious observation and analysis. By FRCCP, useful solutions can be obtained to provide decision-making support for long-term planning of solid waste management systems. It could be further enhanced through incorporating methods of inexact analysis into its framework. It can also be applied to other environmental management problems.
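For a triangular fuzzy coefficient, the credibility measure has a closed form, so a constraint of the type Cr{a ≤ b} ≥ λ can be checked or converted to its crisp equivalent directly. The helper below assumes the standard definition Cr = (possibility + necessity)/2 for a triangular fuzzy variable (l, m, r); the numbers are illustrative, not taken from the study.

```python
def credibility_leq(l, m, r, x):
    """Cr{xi <= x} for a triangular fuzzy variable xi = (l, m, r), with l <= m <= r."""
    if x <= l:
        return 0.0
    if x <= m:
        return (x - l) / (2.0 * (m - l))
    if x <= r:
        return (x - 2.0 * m + r) / (2.0 * (r - m))
    return 1.0

# Crisp check of a credibility constraint Cr{cost_coeff <= budget} >= lambda_level
cost_coeff = (80.0, 100.0, 130.0)      # fuzzy unit cost (illustrative)
budget = 115.0
lambda_level = 0.8
print(credibility_leq(*cost_coeff, budget) >= lambda_level)
```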
NASA Astrophysics Data System (ADS)
Che, Chang; Yu, Xiaoyang; Sun, Xiaoming; Yu, Boyang
2017-12-01
In recent years, the Scalable Vocabulary Tree (SVT) has been shown to be effective in image retrieval. However, for general images where the foreground is the object to be recognized while the background is cluttered, the performance of the current SVT framework is restricted. In this paper, a new image retrieval framework that incorporates a robust distance metric and information fusion is proposed, which improves the retrieval performance relative to the baseline SVT approach. First, the visual words that represent the background are diminished by using a robust Hausdorff distance between different images. Second, image matching results based on three image signature representations are fused, which enhances the retrieval precision. We conducted intensive experiments on small-scale to large-scale image datasets: Corel-9, Corel-48, and PKU-198, where the proposed Hausdorff metric and information fusion outperform the state-of-the-art methods by about 13, 15, and 15%, respectively.
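The idea of a robust Hausdorff distance (replace the maximum over nearest-neighbour distances with a statistic that is less sensitive to outlying feature points) can be sketched with SciPy. The quantile variant below is one common robust modification, shown for illustration; the paper's exact metric may differ.

```python
import numpy as np
from scipy.spatial.distance import cdist

def robust_hausdorff(A, B, q=0.9):
    """Directed robust Hausdorff distance between point sets A (n,d) and B (m,d):
    the q-th quantile (instead of the maximum) of nearest-neighbour distances."""
    nn = cdist(A, B).min(axis=1)             # distance from each point in A to B
    return np.quantile(nn, q)

def symmetric_robust_hausdorff(A, B, q=0.9):
    return max(robust_hausdorff(A, B, q), robust_hausdorff(B, A, q))

# Two similar point clouds, one with a gross outlier that the classic
# Hausdorff distance (q = 1.0) would latch onto.
rng = np.random.default_rng(0)
A = rng.normal(size=(200, 2))
B = np.vstack([A + 0.05 * rng.normal(size=A.shape), [[25.0, 25.0]]])
print(symmetric_robust_hausdorff(A, B, q=1.0), symmetric_robust_hausdorff(A, B, q=0.9))
```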
Theoretical Models and Operational Frameworks in Public Health Ethics
Petrini, Carlo
2010-01-01
The article is divided into three sections: (i) an overview of the main ethical models in public health (theoretical foundations); (ii) a summary of several published frameworks for public health ethics (practical frameworks); and (iii) a few general remarks. Rather than maintaining the superiority of one position over the others, the main aim of the article is to summarize the basic approaches proposed thus far concerning the development of public health ethics by describing and comparing the various ideas in the literature. With this in mind, an extensive list of references is provided. PMID:20195441
Interdependent networks - Topological percolation research and application in finance
NASA Astrophysics Data System (ADS)
Zhou, Di
This dissertation covers the two major parts of my Ph.D. research: i) developing a theoretical framework of complex networks and applying simulation and numerical methods to study the robustness of the network system, and ii) applying statistical physics concepts and methods to quantitatively analyze complex systems and applying the theoretical framework to study real-world systems. In part I, we focus on developing theories of interdependent networks as well as building computer simulation models, which includes three parts: 1) We report on the effects of topology on failure propagation for a model system consisting of two interdependent networks. We find that the internal node correlations in each of the networks significantly change the critical density of failures, which can trigger the total disruption of the two-network system. Specifically, we find that the assortativity within a single network decreases the robustness of the entire system. 2) We study the percolation behavior of two interdependent scale-free (SF) networks under random failure of a 1-p fraction of nodes. We find that as the coupling strength q between the two networks reduces from 1 (fully coupled) to 0 (no coupling), there exist two critical coupling strengths q1 and q2, which separate the behaviors of the giant component as a function of p into three different regions, and for q2 < q < q1, we observe a hybrid order phase transition phenomenon. 3) We study the robustness of n interdependent networks with a partially support-dependent relationship both analytically and numerically. We study a starlike network of n Erdős-Rényi (ER), SF networks and a looplike network of n ER networks, and we find that for starlike networks, their phase transition regions change with n, but for looplike networks the phase regions change with the average degree k. In part II, we apply concepts and methods developed in statistical physics to study economic systems. We analyze stock market indices and foreign exchange daily returns for 60 countries over the period of 1999-2012. We build a multi-layer network model based on different correlation measures, and introduce a dynamic network model to simulate and analyze the initializing and spreading of financial crisis. Using different computational approaches and econometric tests, we find atypical behavior of the cross correlations and community formations in the financial networks that we study during the financial crisis of 2008. For example, the overall correlation of the stock market increases during a crisis, while the correlation between the stock market and the foreign exchange market decreases. The dramatic increase in correlations between a specific nation and other nations may indicate that this nation could trigger a global financial crisis. Specifically, core countries that have higher correlations with other countries and larger Gross Domestic Product (GDP) values spread financial crises quite effectively, yet some countries with small GDPs like Greece and Cyprus are also effective in propagating systemic risk and spreading global financial crisis.
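The cascading-failure mechanism for two fully interdependent networks can be reproduced in a short simulation: after an initial random failure, a node stays functional only if it lies in the giant component of its own network and its one-to-one partner in the other network is still functional. The NetworkX sketch below illustrates this mechanism on small Erdős-Rényi graphs; it does not reproduce the dissertation's analytical results, and the sizes and degrees are arbitrary.

```python
import networkx as nx
import numpy as np

rng = np.random.default_rng(0)
N, k = 2000, 4.0
GA = nx.erdos_renyi_graph(N, k / N, seed=1)
GB = nx.erdos_renyi_graph(N, k / N, seed=2)   # node i in A depends on node i in B

def giant_component(G, alive):
    sub = G.subgraph(alive)
    if sub.number_of_nodes() == 0:
        return set()
    return max(nx.connected_components(sub), key=len)

def cascade(p):
    """Fraction of mutually functional nodes after removing a 1-p fraction of nodes."""
    alive = set(rng.choice(N, size=int(p * N), replace=False).tolist())
    while True:
        alive_new = giant_component(GA, alive) & giant_component(GB, alive)
        if alive_new == alive:
            return len(alive) / N
        alive = alive_new

for p in (0.6, 0.7, 0.8, 0.9):
    print(p, cascade(p))
```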
Unsupervised active learning based on hierarchical graph-theoretic clustering.
Hu, Weiming; Hu, Wei; Xie, Nianhua; Maybank, Steve
2009-10-01
Most existing active learning approaches are supervised. Supervised active learning has the following problems: inefficiency in dealing with the semantic gap between the distribution of samples in the feature space and their labels, lack of ability in selecting new samples that belong to new categories that have not yet appeared in the training samples, and lack of adaptability to changes in the semantic interpretation of sample categories. To tackle these problems, we propose an unsupervised active learning framework based on hierarchical graph-theoretic clustering. In the framework, two promising graph-theoretic clustering algorithms, namely, dominant-set clustering and spectral clustering, are combined in a hierarchical fashion. Our framework has some advantages, such as ease of implementation, flexibility in architecture, and adaptability to changes in the labeling. Evaluations on data sets for network intrusion detection, image classification, and video classification have demonstrated that our active learning framework can effectively reduce the workload of manual classification while maintaining a high accuracy of automatic classification. It is shown that, overall, our framework outperforms the support-vector-machine-based supervised active learning, particularly in terms of dealing much more efficiently with new samples whose categories have not yet appeared in the training samples.
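One of the two graph-theoretic building blocks, spectral clustering, can be sketched from scratch: embed the samples with the leading eigenvectors of a normalized affinity matrix and run k-means in that embedding. This is a generic normalized-Laplacian sketch, not the authors' hierarchical combination with dominant-set clustering, and the affinity bandwidth is an assumption.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def spectral_clustering(X, n_clusters=2, sigma=1.0):
    """X: (n, d) samples. Returns cluster labels from a normalized spectral embedding."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-sq / (2 * sigma ** 2))          # Gaussian affinity graph
    np.fill_diagonal(W, 0.0)
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d + 1e-12))
    L_sym = np.eye(len(X)) - D_inv_sqrt @ W @ D_inv_sqrt
    vals, vecs = np.linalg.eigh(L_sym)
    emb = vecs[:, :n_clusters]                  # eigenvectors with smallest eigenvalues
    emb /= np.linalg.norm(emb, axis=1, keepdims=True) + 1e-12
    _, labels = kmeans2(emb, n_clusters, minit="++", seed=0)
    return labels

# Two concentric rings that plain k-means on the raw coordinates would not separate.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 400)
radius = np.r_[np.full(200, 1.0), np.full(200, 3.0)] + 0.05 * rng.normal(size=400)
X = np.column_stack([radius * np.cos(theta), radius * np.sin(theta)])
labels = spectral_clustering(X, n_clusters=2, sigma=0.5)
```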
Pandemic influenza preparedness: an ethical framework to guide decision-making
Thompson, Alison K; Faith, Karen; Gibson, Jennifer L; Upshur, Ross EG
2006-01-01
Background Planning for the next pandemic influenza outbreak is underway in hospitals across the world. The global SARS experience has taught us that ethical frameworks to guide decision-making may help to reduce collateral damage and increase trust and solidarity within and between health care organisations. Good pandemic planning requires reflection on values because science alone cannot tell us how to prepare for a public health crisis. Discussion In this paper, we present an ethical framework for pandemic influenza planning. The ethical framework was developed with expertise from clinical, organisational and public health ethics and validated through a stakeholder engagement process. The ethical framework includes both substantive and procedural elements for ethical pandemic influenza planning. The incorporation of ethics into pandemic planning can be helped by senior hospital administrators sponsoring its use, by having stakeholders vet the framework, and by designing or identifying decision review processes. We discuss the merits and limits of an applied ethical framework for hospital decision-making, as well as the robustness of the framework. Summary The need for reflection on the ethical issues raised by the spectre of a pandemic influenza outbreak is great. Our efforts to address the normative aspects of pandemic planning in hospitals have generated interest from other hospitals and from the governmental sector. The framework will require re-evaluation and refinement and we hope that this paper will generate feedback on how to make it even more robust. PMID:17144926
Hutchings, Maggie; Scammell, Janet; Quinney, Anne
2013-09-01
While there is growing evidence of theoretical perspectives adopted in interprofessional education, learning theories tend to foreground the individual, focusing on psycho-social aspects of individual differences and professional identity to the detriment of considering social-structural factors at work in social practices. Conversely socially situated practice is criticised for being context-specific, making it difficult to draw generalisable conclusions for improving interprofessional education. This article builds on a theoretical framework derived from earlier research, drawing on the dynamics of Dewey's experiential learning theory and Archer's critical realist social theory, to make a case for a meta-theoretical framework enabling social-constructivist and situated learning theories to be interlinked and integrated through praxis and reflexivity. Our current analysis is grounded in an interprofessional curriculum initiative mediated by a virtual community peopled by health and social care users. Student perceptions, captured through quantitative and qualitative data, suggest three major disruptive themes, creating opportunities for congruence and disjuncture and generating a model of zones of interlinked praxis associated with professional differences and identity, pedagogic strategies and technology-mediated approaches. This model contributes to a framework for understanding the complexity of interprofessional learning and offers bridges between individual and structural factors for engaging with the enablements and constraints at work in communities of practice and networks for interprofessional education.
Supervision of Facilitators in a Multisite Study: Goals, Process, and Outcomes
2010-01-01
Objective To describe the aims, implementation, and desired outcomes of facilitator supervision for both interventions (treatment and control) in Project Eban and to present the Eban Theoretical Framework for Supervision that guided the facilitators’ supervision. The qualifications and training of supervisors and facilitators are also described. Design This article provides a detailed description of supervision in a multisite behavioral intervention trial. The Eban Theoretical Framework for Supervision is guided by 3 theories: cognitive behavior therapy, the Life-long Model of Supervision, and “Empowering supervisees to empower others: a culturally responsive supervision model.” Methods Supervision is based on the Eban Theoretical Framework for Supervision, which provides guidelines for implementing both interventions using goals, process, and outcomes. Results Because of effective supervision, the interventions were implemented with fidelity to the protocol and were standard across the multiple sites. Conclusions Supervision of facilitators is a crucial aspect of multisite intervention research quality assurance. It provides them with expert advice, optimizes the effectiveness of facilitators, and increases adherence to the protocol across multiple sites. Based on the experience in this trial, some of the challenges that arise when conducting a multisite randomized control trial and how they can be handled by implementing the Eban Theoretical Framework for Supervision are described. PMID:18724192
Foo, Mathias; Sawlekar, Rucha; Kulkarni, Vishwesh V; Bates, Declan G
2016-08-01
The use of abstract chemical reaction networks (CRNs) as a modelling and design framework for the implementation of computing and control circuits using enzyme-free, entropy driven DNA strand displacement (DSD) reactions is starting to garner widespread attention in the area of synthetic biology. Previous work in this area has demonstrated the theoretical plausibility of using this approach to design biomolecular feedback control systems based on classical proportional-integral (PI) controllers, which may be constructed from CRNs implementing gain, summation and integrator operators. Here, we propose an alternative design approach that utilises the abstract chemical reactions involved in cellular signalling cycles to implement a biomolecular controller - termed a signalling-cycle (SC) controller. We compare the performance of the PI and SC controllers in closed-loop with a nonlinear second-order chemical process. Our results show that the SC controller outperforms the PI controller in terms of both performance and robustness, and also requires fewer abstract chemical reactions to implement, highlighting its potential usefulness in the construction of biomolecular control circuits.
Women, Demography, and Politics: How Lower Fertility Rates Lead to Democracy.
Sommer, Udi
2018-04-01
Where connections between demography and politics are examined in the literature, it is largely in the context of the effects of male aspects of demography on phenomena such as political violence. This project aims to place the study of demographic variables' influence on politics, particularly on democracy, squarely within the scope of political and social sciences, and to focus on the effects of woman-related demographics-namely, fertility rate. I test the hypothesis that demographic variables-female-related predictors, in particular-have an independent effect on political structure. Comparing countries over time, this study finds a growth in democracy when fertility rates decline. In the theoretical framework developed, it is family structure as well as the economic and political status of women that account for this change at the macro and micro levels. Findings based on data for more than 140 countries over three decades are robust when controlling not only for alternative effects but also for reverse causality and data limitations.
Generalized species sampling priors with latent Beta reinforcements
Airoldi, Edoardo M.; Costa, Thiago; Bassetti, Federico; Leisen, Fabrizio; Guindani, Michele
2014-01-01
Many popular Bayesian nonparametric priors can be characterized in terms of exchangeable species sampling sequences. However, in some applications, exchangeability may not be appropriate. We introduce a novel and probabilistically coherent family of non-exchangeable species sampling sequences characterized by a tractable predictive probability function with weights driven by a sequence of independent Beta random variables. We compare their theoretical clustering properties with those of the Dirichlet Process and the two-parameter Poisson-Dirichlet process. The proposed construction provides a complete characterization of the joint process, unlike existing work. We then propose the use of such a process as a prior distribution in a hierarchical Bayes modeling framework, and we describe a Markov Chain Monte Carlo sampler for posterior inference. We evaluate the performance of the prior and the robustness of the resulting inference in a simulation study, providing a comparison with popular Dirichlet Process mixtures and Hidden Markov Models. Finally, we develop an application to the detection of chromosomal aberrations in breast cancer by leveraging array CGH data. PMID:25870462
Transformational leadership in nursing: towards a more critical interpretation.
Hutchinson, Marie; Jackson, Debra
2013-03-01
Effective nurse leadership is positioned as an essential factor in achieving optimal patient outcomes and workplace enhancement. Over the last two decades, writing and research on nursing leadership has been dominated by one conceptual theory, that of transformational leadership. This theoretical framework has provided insight into various leader characteristics, with research findings presented as persuasive evidence. While elsewhere there has been robust debate on the merits of the transformational model of leadership, in the nursing literature, there has been little critical review of the model and the commonly used assessment instruments. In this article, we critically review more than a decade of nursing scholarship on the transformational model of leadership and its empirical evidence. Applying a critical lens to the literature, the conceptual and methodological weaknesses of much nursing research on this topic, we question whether the uncritical adoption of the transformational model has resulted in a limited interpretation of nursing leadership. Given the limitations of the model, we advocate embracing new ways of thinking about nursing leadership. © 2012 Blackwell Publishing Ltd.
Slow rupture of frictional interfaces
NASA Astrophysics Data System (ADS)
Bar Sinai, Yohai; Brener, Efim A.; Bouchbinder, Eran
2012-02-01
The failure of frictional interfaces and the spatiotemporal structures that accompany it are central to a wide range of geophysical, physical and engineering systems. Recent geophysical and laboratory observations indicated that interfacial failure can be mediated by slow slip rupture phenomena which are distinct from ordinary, earthquake-like, fast rupture. These discoveries have influenced the way we think about frictional motion, yet the nature and properties of slow rupture are not completely understood. We show that slow rupture is an intrinsic and robust property of simple non-monotonic rate-and-state friction laws. It is associated with a new velocity scale cmin, determined by the friction law, below which steady state rupture cannot propagate. We further show that rupture can occur in a continuum of states, spanning a wide range of velocities from cmin to elastic wave-speeds, and predict different properties for slow rupture and ordinary fast rupture. Our results are qualitatively consistent with recent high-resolution laboratory experiments and may provide a theoretical framework for understanding slow rupture phenomena along frictional interfaces.
Semantic richness effects in lexical decision: The role of feedback.
Yap, Melvin J; Lim, Gail Y; Pexman, Penny M
2015-11-01
Across lexical processing tasks, it is well established that words with richer semantic representations are recognized faster. This suggests that the lexical system has access to meaning before a word is fully identified, and is consistent with a theoretical framework based on interactive and cascaded processing. Specifically, semantic richness effects are argued to be produced by feedback from semantic representations to lower-level representations. The present study explores the extent to which richness effects are mediated by feedback from lexical- to letter-level representations. In two lexical decision experiments, we examined the joint effects of stimulus quality and four semantic richness dimensions (imageability, number of features, semantic neighborhood density, semantic diversity). With the exception of semantic diversity, robust additive effects of stimulus quality and richness were observed for the targeted dimensions. Our results suggest that semantic feedback does not typically reach earlier levels of representation in lexical decision, and further reinforces the idea that task context modulates the processing dynamics of early word recognition processes.
Dahmen, Jessamyn; Cook, Diane J; Wang, Xiaobo; Honglei, Wang
2017-08-01
Smart home design has undergone a metamorphosis in recent years. The field has evolved from designing theoretical smart home frameworks and performing scripted tasks in laboratories. Instead, we now find robust smart home technologies that are commonly used by large segments of the population in a variety of settings. Recent smart home applications are focused on activity recognition, health monitoring, and automation. In this paper, we take a look at another important role for smart homes: security. We first explore the numerous ways smart homes can and do provide protection for their residents. Next, we provide a comparative analysis of the alternative tools and research that have been developed for this purpose. We not only investigate existing commercial products but also discuss the numerous research efforts focused on detecting and identifying potential threats. Finally, we close with open challenges and ideas for future research that will keep individuals secure and healthy while in their own homes.
The global susceptibility of coastal forage fish to competition by large jellyfish.
Schnedler-Meyer, Nicolas Azaña; Mariani, Patrizio; Kiørboe, Thomas
2016-11-16
Competition between large jellyfish and forage fish for zooplankton prey is both a possible cause of jellyfish increases and a concern for the management of marine ecosystems and fisheries. Identifying principal factors affecting this competition is therefore important for marine management, but the lack of both good quality data and a robust theoretical framework have prevented general global analyses. Here, we present a general mechanistic food web model that considers fundamental differences in feeding modes and predation pressure between fish and jellyfish. The model predicts forage fish dominance at low primary production, and a shift towards jellyfish with increasing productivity, turbidity and fishing. We present an index of global ecosystem susceptibility to shifts in fish-jellyfish dominance that compares well with data on jellyfish distributions and trends. The results are a step towards better understanding the processes that govern jellyfish occurrences globally and highlight the advantage of considering feeding traits in ecosystem models. © 2016 The Author(s).
Optimal stomatal behaviour around the world
NASA Astrophysics Data System (ADS)
Lin, Yan-Shih; Medlyn, Belinda E.; Duursma, Remko A.; Prentice, I. Colin; Wang, Han; Baig, Sofia; Eamus, Derek; de Dios, Victor Resco; Mitchell, Patrick; Ellsworth, David S.; de Beeck, Maarten Op; Wallin, Göran; Uddling, Johan; Tarvainen, Lasse; Linderson, Maj-Lena; Cernusak, Lucas A.; Nippert, Jesse B.; Ocheltree, Troy W.; Tissue, David T.; Martin-Stpaul, Nicolas K.; Rogers, Alistair; Warren, Jeff M.; de Angelis, Paolo; Hikosaka, Kouki; Han, Qingmin; Onoda, Yusuke; Gimeno, Teresa E.; Barton, Craig V. M.; Bennie, Jonathan; Bonal, Damien; Bosc, Alexandre; Löw, Markus; Macinins-Ng, Cate; Rey, Ana; Rowland, Lucy; Setterfield, Samantha A.; Tausz-Posch, Sabine; Zaragoza-Castells, Joana; Broadmeadow, Mark S. J.; Drake, John E.; Freeman, Michael; Ghannoum, Oula; Hutley, Lindsay B.; Kelly, Jeff W.; Kikuzawa, Kihachiro; Kolari, Pasi; Koyama, Kohei; Limousin, Jean-Marc; Meir, Patrick; Lola da Costa, Antonio C.; Mikkelsen, Teis N.; Salinas, Norma; Sun, Wei; Wingate, Lisa
2015-05-01
Stomatal conductance (gs) is a key land-surface attribute as it links transpiration, the dominant component of global land evapotranspiration, and photosynthesis, the driving force of the global carbon cycle. Despite the pivotal role of gs in predictions of global water and carbon cycle changes, a global-scale database and an associated globally applicable model of gs that allow predictions of stomatal behaviour are lacking. Here, we present a database of globally distributed gs obtained in the field for a wide range of plant functional types (PFTs) and biomes. We find that stomatal behaviour differs among PFTs according to their marginal carbon cost of water use, as predicted by the theory underpinning the optimal stomatal model and the leaf and wood economics spectrum. We also demonstrate a global relationship with climate. These findings provide a robust theoretical framework for understanding and predicting the behaviour of gs across biomes and across PFTs that can be applied to regional, continental and global-scale modelling of ecosystem productivity, energy balance and ecohydrological processes in a future changing climate.
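The optimal stomatal model referred to here is commonly written in the unified form of Medlyn and colleagues, in which a single fitted slope parameter reflects the marginal carbon cost of water use. The following is a minimal sketch of that form; the parameter values are illustrative assumptions, not values drawn from the database described above.

```python
import numpy as np

def stomatal_conductance(A, D, Ca, g0=0.0, g1=4.0):
    """Unified optimal stomatal model in the commonly used Medlyn form.

    A  : net assimilation rate (umol m-2 s-1)
    D  : leaf-to-air vapour pressure deficit (kPa)
    Ca : atmospheric CO2 concentration (umol mol-1)
    g0 : residual conductance (mol m-2 s-1)
    g1 : fitted slope, reflecting the marginal carbon cost of water use
    """
    return g0 + 1.6 * (1.0 + g1 / np.sqrt(D)) * A / Ca

# Purely illustrative numbers for a well-watered broadleaf tree at midday.
print(stomatal_conductance(A=12.0, D=1.5, Ca=400.0, g1=4.5))
```

Differences in stomatal behaviour among plant functional types then show up as differences in the fitted g1 slope.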
Adaptive fixed-time trajectory tracking control of a stratospheric airship.
Zheng, Zewei; Feroskhan, Mir; Sun, Liang
2018-05-01
This paper addresses the fixed-time trajectory tracking control problem of a stratospheric airship. By extending the method of adding a power integrator to a novel adaptive fixed-time control method, the convergence of a stratospheric airship to its reference trajectory is guaranteed to be achieved within a fixed time. The control algorithm is firstly formulated without the consideration of external disturbances to establish the stability of the closed-loop system in fixed-time and demonstrate that the convergence time of the airship is essentially independent of its initial conditions. Subsequently, a smooth adaptive law is incorporated into the proposed fixed-time control framework to provide the system with robustness to external disturbances. Theoretical analyses demonstrate that under the adaptive fixed-time controller, the tracking errors will converge towards a residual set in fixed-time. The results of a comparative simulation study with other recent methods illustrate the remarkable performance and superiority of the proposed control method. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
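The fixed-time property, convergence within a time bound that does not depend on the initial condition, can be illustrated with a scalar toy system. The sketch below uses a generic fixed-time stabilizing law of the standard two-power form, not the airship controller developed in the paper; the gains and exponents are arbitrary assumptions.

```python
import numpy as np

def settling_time(x0, k1=2.0, k2=2.0, a=0.5, b=1.5, dt=1e-4, tol=1e-3, t_max=10.0):
    """Euler simulation of dx/dt = -k1*|x|^a*sgn(x) - k2*|x|^b*sgn(x), 0 < a < 1 < b.
    Returns the time at which |x| first drops below tol."""
    x, t = float(x0), 0.0
    while abs(x) > tol and t < t_max:
        u = -k1 * abs(x) ** a * np.sign(x) - k2 * abs(x) ** b * np.sign(x)
        x += dt * u
        t += dt
    return t

# The settling time stays bounded even as the initial condition grows by orders of magnitude.
for x0 in (1.0, 10.0, 1000.0):
    print(x0, round(settling_time(x0), 3))
```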
Limited Bandwidth Recognition of Collective Behaviors in Bio-Inspired Swarms
2014-05-01
Selective adsorption of sulfur dioxide in a robust metal-organic framework material
Savage, Mathew; Cheng, Yongqiang; Easun, Timothy L.; ...
2016-08-16
Here, selective adsorption of SO2 is realized in a porous metal–organic framework material, and in-depth structural and spectroscopic investigations using X-rays, infrared, and neutrons define the underlying interactions that cause SO2 to bind more strongly than CO2 and N2.
Keller, Brad M; Oustimov, Andrew; Wang, Yan; Chen, Jinbo; Acciavatti, Raymond J; Zheng, Yuanjie; Ray, Shonket; Gee, James C; Maidment, Andrew D A; Kontos, Despina
2015-04-01
An analytical framework is presented for evaluating the equivalence of parenchymal texture features across different full-field digital mammography (FFDM) systems using a physical breast phantom. Phantom images (FOR PROCESSING) are acquired from three FFDM systems using their automated exposure control setting. A panel of texture features, including gray-level histogram, co-occurrence, run length, and structural descriptors, are extracted. To identify features that are robust across imaging systems, a series of equivalence tests are performed on the feature distributions, in which the extent of their intersystem variation is compared to their intrasystem variation via the Hodges-Lehmann test statistic. Overall, histogram and structural features tend to be most robust across all systems, and certain features, such as edge enhancement, tend to be more robust to intergenerational differences between detectors of a single vendor than to intervendor differences. Texture features extracted from larger regions of interest (i.e., [Formula: see text]) and with a larger offset length (i.e., [Formula: see text]), when applicable, also appear to be more robust across imaging systems. This framework and observations from our experiments may benefit applications utilizing mammographic texture analysis on images acquired in multivendor settings, such as in multicenter studies of computer-aided detection and breast cancer risk assessment.
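The Hodges-Lehmann statistic used above to compare inter- and intrasystem variation can be computed as the median of all pairwise differences between two samples. Below is a minimal sketch of that estimator applied to two hypothetical feature distributions; the feature values and system labels are invented for illustration.

```python
import numpy as np

def hodges_lehmann_shift(x, y):
    """Two-sample Hodges-Lehmann location-shift estimate:
    the median of all pairwise differences y_j - x_i."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return np.median(np.subtract.outer(y, x))

rng = np.random.default_rng(0)
# Hypothetical values of one texture feature measured on two FFDM systems.
feat_sys_a = rng.normal(loc=10.0, scale=1.0, size=200)
feat_sys_b = rng.normal(loc=10.4, scale=1.0, size=200)
print(hodges_lehmann_shift(feat_sys_a, feat_sys_b))  # close to the 0.4 shift built in
```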
A constrained robust least squares approach for contaminant release history identification
NASA Astrophysics Data System (ADS)
Sun, Alexander Y.; Painter, Scott L.; Wittmeyer, Gordon W.
2006-04-01
Contaminant source identification is an important type of inverse problem in groundwater modeling and is subject to both data and model uncertainty. Model uncertainty was rarely considered in previous studies. In this work, a robust framework for solving contaminant source recovery problems is introduced. The contaminant source identification problem is first cast into one of solving uncertain linear equations, where the response matrix is constructed using a superposition technique. The formulation presented here is general and is applicable to any porous media flow and transport solvers. The robust least squares (RLS) estimator, which originated in the field of robust identification, directly accounts for errors arising from model uncertainty and has been shown to significantly reduce the sensitivity of the optimal solution to perturbations in model and data. In this work, a new variant of RLS, the constrained robust least squares (CRLS), is formulated for solving uncertain linear equations. CRLS allows for additional constraints, such as nonnegativity, to be imposed. The performance of CRLS is demonstrated through one- and two-dimensional test problems. When the system is ill-conditioned and uncertain, CRLS is found to give much better performance than its classical counterpart, the nonnegative least squares method. The source identification framework developed in this work thus constitutes a reliable tool for recovering source release histories in real applications.
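The problem has the generic shape of recovering a nonnegative release history s from observations b = A s when the response matrix A is ill-conditioned. The sketch below is not the CRLS formulation itself, but a simplified nonnegative, Tikhonov-regularized least-squares stand-in; the response matrix, noise level and regularization weight are all invented for illustration.

```python
import numpy as np
from scipy.optimize import nnls

def regularized_nnls(A, b, lam=1e-2):
    """Nonnegative least squares with Tikhonov regularization:
    min ||A s - b||^2 + lam * ||s||^2  subject to  s >= 0,
    solved by augmenting the system and calling SciPy's NNLS."""
    n = A.shape[1]
    A_aug = np.vstack([A, np.sqrt(lam) * np.eye(n)])
    b_aug = np.concatenate([b, np.zeros(n)])
    s, _ = nnls(A_aug, b_aug)
    return s

# Hypothetical smoothing (ill-conditioned) response matrix mapping a release
# history s to downstream concentration observations b.
rng = np.random.default_rng(1)
A = np.exp(-0.5 * (np.subtract.outer(np.arange(30), np.arange(20)) / 3.0) ** 2)
s_true = np.zeros(20)
s_true[5], s_true[6] = 1.0, 0.5
b = A @ s_true + 0.01 * rng.standard_normal(30)
print(np.round(regularized_nnls(A, b), 2))
```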
NASA Astrophysics Data System (ADS)
Kim, J.
2016-12-01
Considering high levels of uncertainty, epistemological conflicts over facts and values, and a sense of urgency, normal paradigm-driven science will be insufficient to mobilize people and nations toward sustainability. The conceptual framework needed to bridge societal system dynamics with the dynamics of the natural ecosystems in which humanity operates remains deficient. The key to understanding their coevolution is to understand `self-organization.' An information-theoretic approach may provide a framework that not only bridges humans and nature but also generates useful knowledge for understanding and sustaining the integrity of ecological-societal systems. How can information theory help us understand the interface between ecological systems and social systems? How can self-organizing processes be delineated and steered toward sustainability? How can the flow of information from data through models to decision-makers be evaluated? These are the core questions posed by sustainability science, in which visioneering (i.e., the engineering of vision) is an essential framework. Yet visioneering has neither a quantitative measure nor an information-theoretic framework to work with and teach. This presentation is an attempt to accommodate the framework of self-organizing hierarchical open systems, together with visioneering, within a common information-theoretic framework. A case study is presented with the UN/FAO's communal vision of climate-smart agriculture (CSA), which pursues a trilemma of efficiency, mitigation, and resilience. Challenges of delineating and facilitating self-organizing systems are discussed using transdisciplinary tools such as complex systems thinking, dynamic process network analysis and multi-agent systems modeling. Acknowledgments: This study was supported by the Korea Meteorological Administration Research and Development Program under Grant KMA-2012-0001-A (WISE project).
Li, Zukui; Ding, Ran; Floudas, Christodoulos A.
2011-01-01
Robust counterpart optimization techniques for linear optimization and mixed integer linear optimization problems are studied in this paper. Different uncertainty sets, including those studied in literature (i.e., interval set; combined interval and ellipsoidal set; combined interval and polyhedral set) and new ones (i.e., adjustable box; pure ellipsoidal; pure polyhedral; combined interval, ellipsoidal, and polyhedral set) are studied in this work and their geometric relationship is discussed. For uncertainty in the left hand side, right hand side, and objective function of the optimization problems, robust counterpart optimization formulations induced by those different uncertainty sets are derived. Numerical studies are performed to compare the solutions of the robust counterpart optimization models and applications in refinery production planning and batch process scheduling problem are presented. PMID:21935263
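For the simplest of the uncertainty sets discussed here, an interval (box) set on the constraint coefficients, the robust counterpart of a linear constraint simply tightens the coefficients by their half-widths when the decision variables are nonnegative. The sketch below compares the nominal and box-robust solutions of a small invented LP; it only illustrates the counterpart construction, not the refinery planning or batch scheduling applications.

```python
import numpy as np
from scipy.optimize import linprog

# Toy LP: maximize 3*x1 + 2*x2 subject to a^T x <= 10, x >= 0,
# where the nominal coefficient vector a0 has interval (box) uncertainty a0 +/- d.
c = np.array([-3.0, -2.0])          # linprog minimizes, so negate the objective
a0 = np.array([2.0, 1.0])
d = np.array([0.4, 0.2])            # half-widths of the uncertainty intervals
b = 10.0

nominal = linprog(c, A_ub=[a0], b_ub=[b], bounds=[(0, None)] * 2)

# Box-set robust counterpart: because x >= 0, the worst case is a = a0 + d,
# so the constraint simply tightens to (a0 + d)^T x <= b.
robust = linprog(c, A_ub=[a0 + d], b_ub=[b], bounds=[(0, None)] * 2)

print("nominal objective:", -nominal.fun, "x =", nominal.x)
print("robust  objective:", -robust.fun, "x =", robust.x)
```

Ellipsoidal and polyhedral sets lead to different tightening terms (second-order cone and additional linear constraints, respectively), which is what drives the trade-off between conservatism and tractability discussed in the paper.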
Cell-type-specific modelling of intracellular calcium signalling: a urothelial cell model.
Appleby, Peter A; Shabir, Saqib; Southgate, Jennifer; Walker, Dawn
2013-09-06
Calcium signalling plays a central role in regulating a wide variety of cell processes. A number of calcium signalling models exist in the literature that are capable of reproducing a variety of experimentally observed calcium transients. These models have been used to examine in more detail the mechanisms underlying calcium transients, but very rarely has a model been directly linked to a particular cell type and experimentally verified. It is important to show that this can be achieved within the general theoretical framework adopted by these models. Here, we develop a framework designed specifically for modelling cytosolic calcium transients in urothelial cells. Where possible, we draw upon existing calcium signalling models, integrating descriptions of components known to be important in this cell type from a number of studies in the literature. We then add descriptions of several additional pathways that play a specific role in urothelial cell signalling, including an explicit ionic influx term and an active pumping mechanism that drives the cytosolic calcium concentration to a target equilibrium. The resulting one-pool model of endoplasmic reticulum (ER)-dependent calcium signalling relates the cytosolic, extracellular and ER calcium concentrations and can generate a wide range of calcium transients, including spikes, bursts, oscillations and sustained elevations in the cytosolic calcium concentration. Using single-variate robustness and multivariate sensitivity analyses, we quantify how varying each of the parameters of the model leads to changes in key features of the calcium transient, such as initial peak amplitude and the frequency of bursting or spiking, and in the transitions between bursting- and plateau-dominated modes. We also show that, novel to our urothelial cell model, the ionic and purinergic P2Y pathways make distinct contributions to the calcium transient. We then validate the model using human bladder epithelial cells grown in monolayer cell culture and show that the model robustly captures the key features of the experimental data in a way that is not possible using more generic calcium models from the literature.
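As a rough indication of what a one-pool, ER-dependent model looks like in code, the sketch below integrates a generic cytosolic/ER calcium pair with influx, plasma-membrane extrusion, SERCA uptake and a calcium-dependent release term. The rate expressions and parameter values are generic textbook-style assumptions, not the urothelial model developed in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

def one_pool_rhs(t, y, J_in=0.05, v_pmca=0.6, k_pmca=0.3,
                 v_serca=0.9, k_serca=0.2, k_leak=0.02, k_rel=0.3, k_cicr=0.3):
    """Generic one-pool model: cytosolic (c) and ER (ce) calcium, in arbitrary uM units.
    Influx and plasma-membrane extrusion exchange with the outside; SERCA uptake,
    a passive leak and a calcium-dependent release term exchange with the ER store."""
    c, ce = y
    j_pmca = v_pmca * c**2 / (k_pmca**2 + c**2)                       # plasma-membrane pump
    j_serca = v_serca * c**2 / (k_serca**2 + c**2)                    # uptake into the ER
    j_rel = (k_leak + k_rel * c**2 / (k_cicr**2 + c**2)) * (ce - c)   # leak + CICR-like release
    return [J_in - j_pmca + j_rel - j_serca, j_serca - j_rel]

sol = solve_ivp(one_pool_rhs, (0.0, 200.0), y0=[0.1, 4.0], max_step=0.05)
print("final cytosolic Ca:", round(sol.y[0, -1], 3), "final ER Ca:", round(sol.y[1, -1], 3))
```

Varying the pump and release parameters in a structure like this is what single-variate robustness and multivariate sensitivity analyses of the kind described above operate on.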
Cusack, Lynette; Smith, Morgan; Hegney, Desley; Rees, Clare S.; Breen, Lauren J.; Witt, Regina R.; Rogers, Cath; Williams, Allison; Cross, Wendy; Cheung, Kin
2016-01-01
Building nurses' resilience to complex and stressful practice environments is necessary to keep skilled nurses in the workplace and to ensure safe patient care. A unified theoretical framework, titled the Health Services Workplace Environmental Resilience Model (HSWERM), is presented to explain the environmental factors in the workplace that promote nurses' resilience. The framework builds on a previously published theoretical model of individual resilience, which identified the key constructs of psychological resilience as self-efficacy, coping and mindfulness, but did not examine environmental factors in the workplace that promote nurses' resilience. This unified theoretical framework was developed using a literary synthesis drawing on data from international studies and literature reviews on the nursing workforce in hospitals. The most frequent workplace environmental factors were identified, extracted and clustered in alignment with key constructs for psychological resilience. Six major organizational concepts emerged that related to a positive resilience-building workplace and formed the foundation of the theoretical model. Three concepts related to nursing staff support (professional, practice, personal) and three related to nursing staff development (professional, practice, personal) within the workplace environment. The unified theoretical model incorporates these concepts within the workplace context, linking to the nurse, and then impacting on personal resilience and workplace outcomes, and its use has the potential to increase staff retention and quality of patient care. PMID:27242567
Robustness of continuous-time adaptive control algorithms in the presence of unmodeled dynamics
NASA Technical Reports Server (NTRS)
Rohrs, C. E.; Valavani, L.; Athans, M.; Stein, G.
1985-01-01
This paper examines the robustness properties of existing adaptive control algorithms to unmodeled plant high-frequency dynamics and unmeasurable output disturbances. It is demonstrated that there exist two infinite-gain operators in the nonlinear dynamic system which determines the time-evolution of output and parameter errors. The pragmatic implications of the existence of such infinite-gain operators are that: (1) sinusoidal reference inputs at specific frequencies and/or (2) sinusoidal output disturbances at any frequency (including dc) can cause the loop gain to increase without bound, thereby exciting the unmodeled high-frequency dynamics and yielding an unstable control system. Hence, it is concluded that existing adaptive control algorithms, as they are presented in the literature referenced in this paper, cannot be used with confidence in practical designs where the plant contains unmodeled dynamics, because instability is likely to result. Further understanding is required to ascertain how the currently implemented adaptive systems differ from the theoretical systems studied here and how further theoretical development can improve the robustness of adaptive controllers.
Towards a neuro-computational account of prism adaptation.
Petitet, Pierre; O'Reilly, Jill X; O'Shea, Jacinta
2017-12-14
Prism adaptation has a long history as an experimental paradigm used to investigate the functional and neural processes that underlie sensorimotor control. In the neuropsychology literature, prism adaptation behaviour is typically explained by reference to a traditional cognitive psychology framework that distinguishes putative functions, such as 'strategic control' versus 'spatial realignment'. This theoretical framework lacks conceptual clarity, quantitative precision and explanatory power. Here, we advocate for an alternative computational framework that offers several advantages: 1) an algorithmic explanatory account of the computations and operations that drive behaviour; 2) expressed in quantitative mathematical terms; 3) embedded within a principled theoretical framework (Bayesian decision theory, state-space modelling); 4) that offers a means to generate and test quantitative behavioural predictions. This computational framework offers a route towards mechanistic neurocognitive explanations of prism adaptation behaviour. Thus it constitutes a conceptual advance compared to the traditional theoretical framework. In this paper, we illustrate how Bayesian decision theory and state-space models offer principled explanations for a range of behavioural phenomena in the field of prism adaptation (e.g. visual capture, magnitude of visual versus proprioceptive realignment, spontaneous recovery and dynamics of adaptation memory). We argue that this explanatory framework can advance understanding of the functional and neural mechanisms that implement prism adaptation behaviour, by enabling quantitative tests of hypotheses that go beyond merely descriptive mapping claims that 'brain area X is (somehow) involved in psychological process Y'. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.
Wang, Leimin; Shen, Yi; Sheng, Yin
2016-04-01
This paper is concerned with the finite-time robust stabilization of delayed neural networks (DNNs) in the presence of discontinuous activations and parameter uncertainties. By using the nonsmooth analysis and control theory, a delayed controller is designed to realize the finite-time robust stabilization of DNNs with discontinuous activations and parameter uncertainties, and the upper bound of the settling time functional for stabilization is estimated. Finally, two examples are provided to demonstrate the effectiveness of the theoretical results. Copyright © 2016 Elsevier Ltd. All rights reserved.
Human Factors of CC-130 Operations. Volume 5: Human Factors in Decision Making
1998-02-01
known about human information processing and decision making. Topics for HFDM training come directly from this theoretical framework. The proposed training can be distinguished from other approaches with similar goals (either explicit or implicit) by its base within a theoretical framework of human information processing. The differences lie less in the content than in the way the material is organized and shaped by theory.
Toward a Theoretical Framework for the Study of Humor in Literature and the Other Arts
ERIC Educational Resources Information Center
Farber, Jerry
2007-01-01
With a clearer understanding of the way humor works, individuals might be better able to give it the attention it deserves when they study and teach the arts. But where do they turn to find a theoretical framework for the study of humor--one that will help them clarify the role that humor plays in the arts and that will help them as well to…
Extended physics as a theoretical framework for systems biology?
Miquel, Paul-Antoine
2011-08-01
In this essay we examine whether a theoretical and conceptual framework for systems biology could be built from the Bailly and Longo (2008, 2009) proposal. These authors aim to understand life as a coherent critical structure, and propose to develop an extended physical approach of evolution, as a diffusion of biomass in a space of complexity. Their attempt leads to a simple mathematical reconstruction of Gould's assumption (1989) concerning the bacterial world as a "left wall of least complexity" that we will examine. Extended physical systems are characterized by their constructive properties. Time is acting and new properties emerge by their history that can open the list of their initial properties. This conceptual and theoretical framework is nothing more than a philosophical assumption, but as such it provides a new and exciting approach concerning the evolution of life, and the transition between physics and biology. Copyright © 2011 Elsevier Ltd. All rights reserved.
Hartnell, Chad A; Ou, Amy Yi; Kinicki, Angelo
2011-07-01
We apply Quinn and Rohrbaugh's (1983) competing values framework (CVF) as an organizing taxonomy to meta-analytically test hypotheses about the relationship between 3 culture types and 3 major indices of organizational effectiveness (employee attitudes, operational performance [i.e., innovation and product and service quality], and financial performance). The paper also tests theoretical suppositions undergirding the CVF by investigating the framework's nomological validity and proposed internal structure (i.e., interrelationships among culture types). Results based on data from 84 empirical studies with 94 independent samples indicate that clan, adhocracy, and market cultures are differentially and positively associated with the effectiveness criteria, though not always as hypothesized. The findings provide mixed support for the CVF's nomological validity and fail to support aspects of the CVF's proposed internal structure. We propose an alternative theoretical approach to the CVF and delineate directions for future research.
Toward a theoretical framework for trustworthy cyber sensing
NASA Astrophysics Data System (ADS)
Xu, Shouhuai
2010-04-01
Cyberspace is an indispensable part of the economy and society, but has been "polluted" with many compromised computers that can be abused to launch further attacks against others. Since it is likely that there will always be compromised computers, it is important to be aware of the (dynamic) cyber security situation, which is nevertheless challenging because cyberspace is an extremely large-scale complex system. Our project aims to investigate a theoretical framework for trustworthy cyber sensing. With the perspective of treating cyberspace as a large-scale complex system, the core question we aim to address is: What would be a competent theoretical (mathematical and algorithmic) framework for designing, analyzing, deploying, managing, and adapting cyber sensor systems so as to provide trustworthy information or input to the higher layer of cyber situation-awareness management, even in the presence of sophisticated malicious attacks against the cyber sensor systems?
Mixing Categories and Modal Logics in the Quantum Setting
NASA Astrophysics Data System (ADS)
Cinà, Giovanni
The study of the foundations of Quantum Mechanics, especially after the advent of Quantum Computation and Information, has benefited from the application of category-theoretic tools and modal logics to the analysis of Quantum processes: we witness a wealth of theoretical frameworks cast in either of the two languages. This paper explores the interplay of the two formalisms in the peculiar context of Quantum Theory. After a review of some influential abstract frameworks, we show how different modal logic frames can be extracted from the category of finite dimensional Hilbert spaces, connecting the Categorical Quantum Mechanics approach to some modal logics that have been proposed for Quantum Computing. We then apply a general version of the same technique to two other categorical frameworks, the `topos approach' of Doering and Isham and the sheaf-theoretic work on contextuality by Abramsky and Brandenburger, suggesting how some key features can be expressed with modal languages.
A Generally Robust Approach for Testing Hypotheses and Setting Confidence Intervals for Effect Sizes
ERIC Educational Resources Information Center
Keselman, H. J.; Algina, James; Lix, Lisa M.; Wilcox, Rand R.; Deering, Kathleen N.
2008-01-01
Standard least squares analysis of variance methods suffer from poor power under arbitrarily small departures from normality and fail to control the probability of a Type I error when standard assumptions are violated. This article describes a framework for robust estimation and testing that uses trimmed means with an approximate degrees of…
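The robust estimation framework alluded to here builds on trimmed means, commonly paired with winsorized variances, in place of ordinary means and variances. The sketch below shows only those building blocks on an invented, heavy-tailed sample; it is not the full Welch-James-type procedure described in the article.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Invented heavy-tailed sample: a few extreme scores inflate the ordinary mean.
scores = np.concatenate([rng.normal(50, 10, 95), rng.normal(120, 5, 5)])

print("ordinary mean   :", round(scores.mean(), 2))
print("20% trimmed mean:", round(stats.trim_mean(scores, 0.2), 2))

# Winsorized standard deviation, the usual companion to the trimmed mean
# in robust test statistics of the Yuen-Welch type.
wins = np.asarray(stats.mstats.winsorize(scores, limits=(0.2, 0.2)))
print("winsorized SD   :", round(wins.std(ddof=1), 2))
```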
The Robust Learning Model (RLM): A Comprehensive Approach to a New Online University
ERIC Educational Resources Information Center
Neumann, Yoram; Neumann, Edith F.
2010-01-01
This paper outlines the components of the Robust Learning Model (RLM) as a conceptual framework for creating a new online university offering numerous degree programs at all degree levels. The RLM is a multi-factorial model based on the basic belief that successful learning outcomes depend on multiple factors employed together in a holistic…
ERIC Educational Resources Information Center
McGee, Ebony O.
2015-01-01
I introduce the construct of fragile and robust identities for the purpose of exploring the experiences that influenced the mathematical and racial identities of high-achieving Black college students in mathematics and engineering. These students maintained high levels of academic achievement in these fields while enduring marginalization,…
Phenotypic Robustness and the Assortativity Signature of Human Transcription Factor Networks
Pechenick, Dov A.; Payne, Joshua L.; Moore, Jason H.
2014-01-01
Many developmental, physiological, and behavioral processes depend on the precise expression of genes in space and time. Such spatiotemporal gene expression phenotypes arise from the binding of sequence-specific transcription factors (TFs) to DNA, and from the regulation of nearby genes that such binding causes. These nearby genes may themselves encode TFs, giving rise to a transcription factor network (TFN), wherein nodes represent TFs and directed edges denote regulatory interactions between TFs. Computational studies have linked several topological properties of TFNs — such as their degree distribution — with the robustness of a TFN's gene expression phenotype to genetic and environmental perturbation. Another important topological property is assortativity, which measures the tendency of nodes with similar numbers of edges to connect. In directed networks, assortativity comprises four distinct components that collectively form an assortativity signature. We know very little about how a TFN's assortativity signature affects the robustness of its gene expression phenotype to perturbation. While recent theoretical results suggest that increasing one specific component of a TFN's assortativity signature leads to increased phenotypic robustness, the biological context of this finding is currently limited because the assortativity signatures of real-world TFNs have not been characterized. It is therefore unclear whether these earlier theoretical findings are biologically relevant. Moreover, it is not known how the other three components of the assortativity signature contribute to the phenotypic robustness of TFNs. Here, we use publicly available DNaseI-seq data to measure the assortativity signatures of genome-wide TFNs in 41 distinct human cell and tissue types. We find that all TFNs share a common assortativity signature and that this signature confers phenotypic robustness to model TFNs. Lastly, we determine the extent to which each of the four components of the assortativity signature contributes to this robustness. PMID:25121490
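An assortativity signature for a directed network consists of the four degree correlations taken over edges, one for each pairing of source (out/in) and target (out/in) degree. A minimal sketch using networkx on a randomly generated stand-in network is shown below; the network itself is hypothetical and not one of the DNaseI-seq derived TFNs analyzed in the paper.

```python
import networkx as nx

# Hypothetical directed network standing in for a TFN; edges point from a
# regulating TF to the TF it regulates.
G = nx.gnp_random_graph(200, 0.03, seed=1, directed=True)

# Assortativity signature of a directed network: the four degree correlations
# over edges, one for each pairing of source (out/in) and target (out/in) degree.
signature = {
    (x, y): nx.degree_assortativity_coefficient(G, x=x, y=y)
    for x in ("out", "in")
    for y in ("out", "in")
}
for component, value in signature.items():
    print(component, round(value, 3))
```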
Theoretical Grounding: The "Missing Link" in Suicide Research.
ERIC Educational Resources Information Center
Rogers, James R.
2001-01-01
Discusses the strengths and limitations of the current pragmatic focus of research in suicidology and presents an argument for theoretical grounding as a precursor for continued advancement in this area. Presents an existential-constructivist framework of "meaning creation" as a theoretical heuristic for understanding suicide. Outlines general…
A Future-Oriented Retirement Transition Adjustment Framework
ERIC Educational Resources Information Center
Hesketh, Beryl; Griffin, Barbara; Loh, Vanessa
2011-01-01
This theoretical paper presents a person-environment fit framework that extends the Minnesota Theory of Work Adjustment to retirement transition and adjustment. The proposed Retirement Transition and Adjustment Framework (RTAF) also accommodates dynamic intra-individual and environment change over time, configural combinations of variables, and an…
French, Simon D; Green, Sally E; O'Connor, Denise A; McKenzie, Joanne E; Francis, Jill J; Michie, Susan; Buchbinder, Rachelle; Schattner, Peter; Spike, Neil; Grimshaw, Jeremy M
2012-04-24
There is little systematic operational guidance about how best to develop complex interventions to reduce the gap between practice and evidence. This article is one in a Series of articles documenting the development and use of the Theoretical Domains Framework (TDF) to advance the science of implementation research. The intervention was developed considering three main components: theory, evidence, and practical issues. We used a four-step approach, consisting of guiding questions, to direct the choice of the most appropriate components of an implementation intervention: Who needs to do what, differently? Using a theoretical framework, which barriers and enablers need to be addressed? Which intervention components (behaviour change techniques and mode(s) of delivery) could overcome the modifiable barriers and enhance the enablers? And how can behaviour change be measured and understood? A complex implementation intervention was designed that aimed to improve acute low back pain management in primary care. We used the TDF to identify the barriers and enablers to the uptake of evidence into practice and to guide the choice of intervention components. These components were then combined into a cohesive intervention. The intervention was delivered via two facilitated interactive small group workshops. We also produced a DVD to distribute to all participants in the intervention group. We chose outcome measures in order to assess the mediating mechanisms of behaviour change. We have illustrated a four-step systematic method for developing an intervention designed to change clinical practice based on a theoretical framework. The method of development provides a systematic framework that could be used by others developing complex implementation interventions. While this framework should be iteratively adjusted and refined to suit other contexts and settings, we believe that the four-step process should be maintained as the primary framework to guide researchers through a comprehensive intervention development process.
2012-01-01
Background There is little systematic operational guidance about how best to develop complex interventions to reduce the gap between practice and evidence. This article is one in a Series of articles documenting the development and use of the Theoretical Domains Framework (TDF) to advance the science of implementation research. Methods The intervention was developed considering three main components: theory, evidence, and practical issues. We used a four-step approach, consisting of guiding questions, to direct the choice of the most appropriate components of an implementation intervention: Who needs to do what, differently? Using a theoretical framework, which barriers and enablers need to be addressed? Which intervention components (behaviour change techniques and mode(s) of delivery) could overcome the modifiable barriers and enhance the enablers? And how can behaviour change be measured and understood? Results A complex implementation intervention was designed that aimed to improve acute low back pain management in primary care. We used the TDF to identify the barriers and enablers to the uptake of evidence into practice and to guide the choice of intervention components. These components were then combined into a cohesive intervention. The intervention was delivered via two facilitated interactive small group workshops. We also produced a DVD to distribute to all participants in the intervention group. We chose outcome measures in order to assess the mediating mechanisms of behaviour change. Conclusions We have illustrated a four-step systematic method for developing an intervention designed to change clinical practice based on a theoretical framework. The method of development provides a systematic framework that could be used by others developing complex implementation interventions. While this framework should be iteratively adjusted and refined to suit other contexts and settings, we believe that the four-step process should be maintained as the primary framework to guide researchers through a comprehensive intervention development process. PMID:22531013
NASA Astrophysics Data System (ADS)
Shen, Ji; Sung, Shannon; Zhang, Dongmei
2015-11-01
Students need to think and work across disciplinary boundaries in the twenty-first century. However, it is unclear what interdisciplinary thinking means and how to analyze interdisciplinary interactions in teamwork. In this paper, drawing on multiple theoretical perspectives and empirical analysis of discourse contents, we formulate a theoretical framework that helps analyze interdisciplinary reasoning and communication (IRC) processes in interdisciplinary collaboration. Specifically, we propose four interrelated IRC processes (integration, translation, transfer, and transformation) and develop a corresponding analytic framework. We apply the framework to analyze two meetings of a project that aims to develop interdisciplinary science assessment items. The results illustrate that the framework can help interpret the interdisciplinary meeting dynamics and patterns. Our coding process and results also suggest that these IRC processes can be further examined in terms of interconnected sub-processes. We also discuss the implications of using the framework in conceptualizing, practicing, and researching interdisciplinary learning and teaching in science education.
Analysis of poetic literature using B. F. Skinner's theoretical framework from verbal behavior
Luke, Nicole M.
2003-01-01
This paper examines Skinner's work on verbal behavior in the context of literature as a particular class of written verbal behavior. It looks at contemporary literary theory and analysis and the contributions that Skinner's theoretical framework can make. Two diverse examples of poetic literature are chosen and analyzed following Skinner's framework, examining the dynamic interplay between writer and reader that takes place within the bounds of the work presented. It is concluded that Skinner's hypotheses about verbal behavior and the functional approach to understanding it have much to offer literary theorists in their efforts to understand literary works and should be more carefully examined.
A Theoretical Framework for Defense Acquisition Analysis
1989-09-01
Designing a Broadband Pump for High-Quality Micro-Lasers via Modified Net Radiation Method.
Nechayev, Sergey; Reusswig, Philip D; Baldo, Marc A; Rotschild, Carmel
2016-12-07
High-quality micro-lasers are key ingredients in non-linear optics, communication, sensing and low-threshold solar-pumped lasers. However, such micro-lasers exhibit negligible absorption of free-space broadband pump light. Recently, this limitation was lifted by cascade energy transfer, in which the absorption and quality factor are modulated with wavelength, enabling non-resonant pumping of high-quality micro-lasers and solar-pumped laser to operate at record low solar concentration. Here, we present a generic theoretical framework for modeling the absorption, emission and energy transfer of incoherent radiation between cascade sensitizer and laser gain media. Our model is based on linear equations of the modified net radiation method and is therefore robust, fast converging and has low complexity. We apply this formalism to compute the optimal parameters of low-threshold solar-pumped lasers. It is revealed that the interplay between the absorption and self-absorption of such lasers defines the optimal pump absorption below the maximal value, which is in contrast to conventional lasers for which full pump absorption is desired. Numerical results are compared to experimental data on a sensitized Nd3+:YAG cavity, and quantitative agreement with theoretical models is found. Our work modularizes the gain and sensitizing components and paves the way for the optimal design of broadband-pumped high-quality micro-lasers and efficient solar-pumped lasers.
Designing a Broadband Pump for High-Quality Micro-Lasers via Modified Net Radiation Method
Nechayev, Sergey; Reusswig, Philip D.; Baldo, Marc A.; Rotschild, Carmel
2016-01-01
High-quality micro-lasers are key ingredients in non-linear optics, communication, sensing and low-threshold solar-pumped lasers. However, such micro-lasers exhibit negligible absorption of free-space broadband pump light. Recently, this limitation was lifted by cascade energy transfer, in which the absorption and quality factor are modulated with wavelength, enabling non-resonant pumping of high-quality micro-lasers and solar-pumped laser to operate at record low solar concentration. Here, we present a generic theoretical framework for modeling the absorption, emission and energy transfer of incoherent radiation between cascade sensitizer and laser gain media. Our model is based on linear equations of the modified net radiation method and is therefore robust, fast converging and has low complexity. We apply this formalism to compute the optimal parameters of low-threshold solar-pumped lasers. It is revealed that the interplay between the absorption and self-absorption of such lasers defines the optimal pump absorption below the maximal value, which is in contrast to conventional lasers for which full pump absorption is desired. Numerical results are compared to experimental data on a sensitized Nd3+:YAG cavity, and quantitative agreement with theoretical models is found. Our work modularizes the gain and sensitizing components and paves the way for the optimal design of broadband-pumped high-quality micro-lasers and efficient solar-pumped lasers. PMID:27924844
Lopez, Andrea M; Bourgois, Philippe; Wenger, Lynn D; Lorvick, Jennifer; Martinez, Alexis N; Kral, Alex H
2013-03-01
Research with injection drug users (IDUs) benefits from interdisciplinary theoretical and methodological innovation because drug use is illegal, socially sanctioned and often hidden. Despite the increasing visibility of interdisciplinary, mixed methods research projects with IDUs, qualitative components are often subordinated to quantitative approaches and page restrictions in top addiction journals limit detailed reports of complex data collection and analysis logistics, thus minimizing the fuller scientific potential of genuine mixed methods. We present the methodological logistics and conceptual approaches of four mixed-methods research projects that our interdisciplinary team conducted in San Francisco with IDUs over the past two decades. These projects include combinations of participant-observation ethnography, in-depth qualitative interviewing, epidemiological surveys, photo-documentation, and geographic mapping. We adapted Greene et al.'s framework for combining methods in a single research project through: data triangulation, methodological complementarity, methodological initiation, and methodological expansion. We argue that: (1) flexible and self-reflexive methodological procedures allowed us to seize strategic opportunities to document unexpected and sometimes contradictory findings as they emerged to generate new research questions, (2) iteratively mixing methods increased the scope, reliability, and generalizability of our data, and (3) interdisciplinary collaboration contributed to a scientific "value added" that allowed for more robust theoretical and practical findings about drug use and risk-taking. Copyright © 2013 Elsevier B.V. All rights reserved.
Lopez, Andrea; Bourgois, Philippe; Wenger, Lynn; Lorvick, Jennifer; Martinez, Alexis; Kral, Alex H.
2013-01-01
Research with injection drug users (IDUs) benefits from interdisciplinary theoretical and methodological innovation because drug use is illegal, socially sanctioned and often hidden. Despite the increasing visibility of interdisciplinary, mixed methods research projects with IDUs, qualitative components are often subordinated to quantitative approaches and page restrictions in top addiction journals limit detailed reports of complex data collection and analysis logistics, thus minimizing the fuller scientific potential of genuine mixed methods. We present the methodological logistics and conceptual approaches of four mixed-methods research projects that our interdisciplinary team conducted in San Francisco with IDUs over the past two decades. These projects include combinations of participant-observation ethnography, in-depth qualitative interviewing, epidemiological surveys, photo-documentation, and geographic mapping. We adapted Greene et al.’s framework for combining methods in a single research project through: data triangulation, methodological complementarity, methodological initiation, and methodological expansion. We argue that: (1) flexible and self-reflexive methodological procedures allowed us to seize strategic opportunities to document unexpected and sometimes contradictory findings as they emerged to generate new research questions, (2) iteratively mixing methods increased the scope, reliability, and generalizability of our data, and (3) interdisciplinary collaboration contributed to a scientific “value added” that allowed for more robust theoretical and practical findings about drug use and risk-taking. PMID:23312109
A Survey and Analysis of Frameworks and Framework Issues for Information Fusion Applications
NASA Astrophysics Data System (ADS)
Llinas, James
This paper was stimulated by the proposed project for the Santander Bank-sponsored "Chairs of Excellence" program in Spain, of which the author is a recipient. That project involves research on characterizing a robust, problem-domain-agnostic framework in which Information Fusion (IF) processes of all descriptions, including artificial intelligence processes and techniques, could be developed. The paper describes the IF process and its requirements, a literature survey on IF frameworks, and a new proposed framework that will be implemented and evaluated at Universidad Carlos III de Madrid, Colmenarejo Campus.
NASA Astrophysics Data System (ADS)
Reed, P. M.
2013-12-01
Water resources planning and management has always required the consideration of uncertainties and the associated system vulnerabilities that they may cause. Despite the long legacy of these issues, the decision support frameworks that have dominated the literature over the past 50 years have struggled with the strongly multiobjective and deeply uncertain nature of water resources systems. The term deep uncertainty (or Knightian uncertainty) refers to factors in planning that strongly shape system risks but that may be unknown or, even if known, lack consensus on their likelihoods over decadal planning horizons (population growth, financial stability, valuation of resources, ecosystem requirements, evolving water institutions, regulations, etc.). In this presentation, I will propose and demonstrate the many-objective robust decision making (MORDM) framework for water resources management under deep uncertainty. The MORDM framework will be demonstrated using an urban water portfolio management test case. In the test case, a city in the Lower Rio Grande Valley managing population and drought pressures must cost-effectively maintain the reliability of its water supply by blending permanent rights to reservoir inflows with alternative strategies for purchasing water within the region's water market. The case study illustrates the significant potential pitfalls in the classic Cost-Reliability conception of the problem. Moreover, the proposed MORDM framework exploits recent advances in multiobjective search, visualization, and sensitivity analysis to better expose these pitfalls en route to identifying highly robust water planning alternatives.
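One common ingredient of frameworks of this kind is a scenario-based robustness metric computed for each candidate alternative across many deeply uncertain futures. The sketch below computes a simple satisficing metric, the fraction of sampled futures in which a reliability threshold is met, for a few hypothetical portfolios; all numbers, thresholds and portfolio labels are invented for illustration and are not taken from the Lower Rio Grande Valley test case.

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical reliability of three candidate supply portfolios evaluated over
# 500 deeply uncertain futures (rows: futures, columns: portfolios).
reliability = np.clip(
    rng.normal(loc=[0.95, 0.90, 0.85], scale=[0.06, 0.03, 0.01], size=(500, 3)), 0.0, 1.0
)
cost = np.array([120.0, 90.0, 70.0])   # invented annualized costs

# Simple satisficing robustness: fraction of futures meeting a 90% reliability threshold.
robustness = (reliability >= 0.90).mean(axis=0)
for i, (c, r) in enumerate(zip(cost, robustness)):
    print(f"portfolio {i}: cost = {c:.0f}, robustness = {r:.2f}")
```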
In the Rearview Mirror: Social Skill Development in Deaf Youth, 1990-2015.
Cawthon, Stephanie W; Fink, Bentley; Schoffstall, Sarah; Wendel, Erica
2018-01-01
Social skills are a vehicle by which individuals negotiate important relationships. The present article presents historical data on how social skills in deaf students were conceptualized and studied empirically during the period 1990-2015. Using a structured literature review approach, the researchers coded 266 articles for theoretical frameworks used and constructs studied. The vast majority of articles did not explicitly align with a specific theoretical framework. Of the 37 that did, most focused on socioemotional and cognitive frameworks, while a minority drew from frameworks focusing on attitudes, developmental theories, or ecological systems theory. In addition, 315 social-skill constructs were coded across the data set; the majority focused on socioemotional functioning. Trends in findings across the past quarter century and implications for research and practice are examined.
Robust model predictive control for constrained continuous-time nonlinear systems
NASA Astrophysics Data System (ADS)
Sun, Tairen; Pan, Yongping; Zhang, Jun; Yu, Haoyong
2018-02-01
In this paper, a robust model predictive control (MPC) is designed for a class of constrained continuous-time nonlinear systems with bounded additive disturbances. The robust MPC consists of a nonlinear feedback control and a continuous-time model-based dual-mode MPC. The nonlinear feedback control guarantees that the actual trajectory remains contained in a tube centred at the nominal trajectory. The dual-mode MPC is designed to ensure asymptotic convergence of the nominal trajectory to zero. This paper extends current results on discrete-time model-based tube MPC and linear system model-based tube MPC to continuous-time nonlinear model-based tube MPC. The feasibility and robustness of the proposed robust MPC have been demonstrated by theoretical analysis and applications to a cart-damper-spring system and a one-link robot manipulator.
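The core of the tube idea is to steer a nominal model with the planned input and apply ancillary feedback on the deviation of the true, disturbed state from that nominal trajectory. The sketch below is a deliberately simplified discrete-time, linear illustration (with the nominal MPC replaced by an LQR law for brevity); the system matrices, gains and disturbance bound are assumptions, and it is not the continuous-time nonlinear controller proposed in the paper.

```python
import numpy as np
from scipy.linalg import solve_discrete_are

# Discrete double integrator (dt = 0.1) standing in for the nominal model.
dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.5 * dt**2], [dt]])

# Ancillary LQR feedback that keeps the true state inside a tube around the
# nominal trajectory despite bounded additive disturbances.
Q, R = np.eye(2), np.array([[0.1]])
P = solve_discrete_are(A, B, Q, R)
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)

rng = np.random.default_rng(0)
z = np.array([1.0, 0.0])   # nominal state
x = np.array([1.0, 0.0])   # true state, subject to disturbances
max_dev = 0.0
for _ in range(200):
    v = -(K @ z)                      # nominal input (LQR stands in for the nominal MPC)
    u = v - K @ (x - z)               # tube controller: nominal input plus ancillary feedback
    w = rng.uniform(-0.02, 0.02, 2)   # bounded additive disturbance
    z = A @ z + B @ v
    x = A @ x + B @ u + w
    max_dev = max(max_dev, np.linalg.norm(x - z))
print("largest deviation from the nominal trajectory:", round(max_dev, 3))
```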
Fingeret, Michelle Cororve; Nipomnick, Summer; Crosby, Melissa A.; Reece, Gregory P.
2013-01-01
Within the field of breast reconstruction there is increasing focus on patient-reported outcomes related to satisfaction, body image, and quality of life. These outcomes are deemed highly relevant because the primary goal of breast reconstruction is to recreate the appearance of a breast (or breasts) that is satisfying to the patient. Prominent researchers have suggested the need to develop improved standards for outcome evaluation which can ultimately benefit patients as well as physicians. The purpose of this article is to summarize key findings in the area of patient-reported outcomes for breast reconstruction and introduce a theoretical framework for advancing research in this field. We conducted an extensive literature review of outcome studies for breast reconstruction focusing on patient-reported results. We developed a theoretical framework illustrating core patient-reported outcomes related to breast reconstruction and factors associated with these outcomes. Our theoretical model highlights domains and distinguishing features of patient satisfaction, body image, and quality of life outcomes for women undergoing breast reconstruction. This model further identifies a broad range of variables (e.g., historical/premorbid influences, disease and treatment-related factors) that have been found to influence patient-reported outcomes and need to be taken into consideration when designing future research in this area. Additional attention is given to examining the relationship between patient reported outcomes and outside evaluation of breast reconstruction. Our proposed theoretical framework suggests key opportunities to expand research in this area with the goal of optimizing body image adjustment, satisfaction, and psychosocial outcomes for the individual patient. PMID:23380309
Tomasone, Jennifer R; Arbour-Nicitopoulos, Kelly P; Pila, Eva; Lamontagne, Marie-Eve; Cummings, Isabelle; Latimer-Cheung, Amy E; Routhier, François
2017-06-01
In Canada, two counseling services are offered to facilitate physical activity participation among persons with physical disabilities, yet both have encountered concerns related to the recruitment and retention of clients. The purpose of this paper is to explore factors related to service adoption among nonusers, and the barriers and facilitators to maintaining service participation among adopters. Individuals who had never enrolled in the services (nonusers, n = 13) as well as current/previous service clients (adopters, n = 26) participated in interviews based on the Theoretical Domains Framework. Transcripts were subjected to deductive thematic analysis according to participant group. Fifteen themes relating to service adoption within 10 of the 12 theoretical domains were identified for nonusers, while 23 themes relating to maintenance of service participation were identified across all 12 theoretical domains for adopters. The findings provide strategies to improve recruitment, adoption, and retention of clients in counseling services and to enhance the experiences of targeted service users. Implications for Rehabilitation: Peer support and education for equipment use should be built into physical activity programs to encourage participation among persons with physical disabilities. Programs that encourage physical activity among individuals with disabilities should be designed by practitioners to be responsive to a variety of needs, which are addressed in the program's advertisements and offerings. The Theoretical Domains Framework is a useful framework for providing valuable insight about clients' experiences of adoption and maintenance of a behavior change service, suggesting merit in other rehabilitation settings.
Neurofeedback and the Neural Representation of Self: Lessons From Awake State and Sleep.
Ioannides, Andreas A
2018-01-01
Neurofeedback has been around for half a century, but despite some promising results it is not yet widely appreciated. Recently, some of the concerns about neurofeedback have been addressed with functional magnetic resonance imaging and magnetoencephalography adding their contributions to the long history of neurofeedback with electroencephalography. Attempts to address other concerns related to methodological issues with new experiments and meta-analysis of earlier studies have opened up new questions about its efficacy. A key concern about neurofeedback is the missing framework to explain how improvements in very different and apparently unrelated conditions are achieved. Recent advances in neuroscience begin to address this concern. A particularly promising approach is the analysis of resting state of fMRI data, which has revealed robust covariations in brain networks that maintain their integrity in sleep and even anesthesia. Aberrant activity in three brain wide networks (i.e., the default mode, central executive and salience networks) has been associated with a number of psychiatric disorders. Recent publications have also suggested that neurofeedback guides the restoration of "normal" activity in these three networks. Using very recent results from our analysis of whole night MEG sleep data together with key concepts from developmental psychology, cloaked in modern neuroscience terms, a theoretical framework is proposed for a neural representation of the self, located at the core of a double onion-like structure of the default mode network. This framework fits a number of old and recent neuroscientific findings, and unites the way attention and memory operate in awake state and during sleep. In the process, safeguards are uncovered, put in place by evolution, before any interference with the core representation of self can proceed. Within this framework, neurofeedback is seen as a set of methods for restoration of aberrant activity in large scale networks. The framework also admits quantitative measures of improvements to be made by personalized neurofeedback protocols. Finally, viewed through the framework developed, neurofeedback's safe nature is revealed while raising some concerns for interventions that attempt to alter the neural self-representation bypassing the safeguards evolution has put in place.
Gale, Nicola K; Shapiro, Jonathan; McLeod, Hugh S T; Redwood, Sabi; Hewison, Alistair
2014-08-20
Organizational culture is considered by policy-makers, clinicians, health service managers and researchers to be a crucial mediator in the success of implementing health service redesign. It is a challenge to find a method to capture cultural issues that is both theoretically robust and meaningful to those working in the organizations concerned. As part of a comparative study of service redesign in three acute hospital organizations in England, UK, a framework for collecting data reflective of culture was developed that was informed by previous work in the field and social and cultural theory. As part of a larger mixed method comparative case study of hospital service redesign, informed by realist evaluation, the authors developed a framework for researching organisational culture during health service redesign and change. This article documents the development of the model, which involved an iterative process of data analysis, critical interdisciplinary discussion in the research team, and feedback from staff in the partner organisations. Data from semi-structured interviews with 77 key informants are used to illustrate the model. In workshops with NHS partners to share and debate the early findings of the study, organizational culture was identified as a key concept to explore because it was perceived to underpin the whole redesign process. The Patients-People-Place framework for studying culture focuses on three thematic areas ('domains') and three levels of culture in which the data could be organised. The framework can be used to help explain the relationship between observable behaviours and cultural artefacts, the values and habits of social actors and the basic assumptions underpinning an organization's culture in each domain. This paper makes a methodological contribution to the study of culture in health care organizations. It offers guidance and a practical approach to investigating the inherently complex phenomenon of culture in hospital organizations. The Patients-People-Place framework could be applied in other settings as a means of ensuring the three domains and three levels that are important to an organization's culture are addressed in future health service research.
Neurofeedback and the Neural Representation of Self: Lessons From Awake State and Sleep
Ioannides, Andreas A.
2018-01-01
Neurofeedback has been around for half a century, but despite some promising results it is not yet widely appreciated. Recently, some of the concerns about neurofeedback have been addressed with functional magnetic resonance imaging and magnetoencephalography adding their contributions to the long history of neurofeedback with electroencephalography. Attempts to address other concerns related to methodological issues with new experiments and meta-analysis of earlier studies have opened up new questions about its efficacy. A key concern about neurofeedback is the missing framework to explain how improvements in very different and apparently unrelated conditions are achieved. Recent advances in neuroscience begin to address this concern. A particularly promising approach is the analysis of resting state of fMRI data, which has revealed robust covariations in brain networks that maintain their integrity in sleep and even anesthesia. Aberrant activity in three brain wide networks (i.e., the default mode, central executive and salience networks) has been associated with a number of psychiatric disorders. Recent publications have also suggested that neurofeedback guides the restoration of “normal” activity in these three networks. Using very recent results from our analysis of whole night MEG sleep data together with key concepts from developmental psychology, cloaked in modern neuroscience terms, a theoretical framework is proposed for a neural representation of the self, located at the core of a double onion-like structure of the default mode network. This framework fits a number of old and recent neuroscientific findings, and unites the way attention and memory operate in awake state and during sleep. In the process, safeguards are uncovered, put in place by evolution, before any interference with the core representation of self can proceed. Within this framework, neurofeedback is seen as a set of methods for restoration of aberrant activity in large scale networks. The framework also admits quantitative measures of improvements to be made by personalized neurofeedback protocols. Finally, viewed through the framework developed, neurofeedback’s safe nature is revealed while raising some concerns for interventions that attempt to alter the neural self-representation bypassing the safeguards evolution has put in place. PMID:29755332
Rycroft-Malone, Jo; Seers, Kate; Chandler, Jackie; Hawkes, Claire A; Crichton, Nicola; Allen, Claire; Bullock, Ian; Strunin, Leo
2013-03-09
The case has been made for more and better theory-informed process evaluations within trials in an effort to facilitate insightful understandings of how interventions work. In this paper, we provide an explanation of implementation processes from one of the first national implementation research randomized controlled trials with embedded process evaluation conducted within acute care, and a proposed extension to the Promoting Action on Research Implementation in Health Services (PARIHS) framework. The PARIHS framework was prospectively applied to guide decisions about intervention design, data collection, and analysis processes in a trial focussed on reducing peri-operative fasting times. In order to capture a holistic picture of implementation processes, the same data were collected across 19 participating hospitals irrespective of allocation to intervention. This paper reports on findings from data collected from a purposive sample of 151 staff and patients pre- and post-intervention. Data were analysed using content analysis within, and then across, data sets. A robust and uncontested evidence base was a necessary, but not sufficient, condition for practice change, in that individual staff and patient responses such as caution influenced decision making. The implementation context was challenging, in which individuals and teams were bounded by professional issues, communication challenges, power and a lack of clarity about the authority and responsibility for practice change. Progress was made in sites where processes were aligned with existing initiatives. Additionally, facilitators reported engaging in many intervention implementation activities, some of which resulted in practice changes but not in significant improvements to outcomes. This study provided an opportunity for reflection on the comprehensiveness of the PARIHS framework. Consistent with the underlying tenet of PARIHS, a multi-faceted and dynamic story of implementation was evident. However, the prominent role that individuals played as part of the interaction between evidence and context is not currently explicit within the framework. We propose that successful implementation of evidence into practice is a planned facilitated process involving an interplay between individuals, evidence, and context to promote evidence-informed practice. This proposal will enhance the potential of the PARIHS framework for explanation, and ensure theoretical development both informs and responds to the evidence base for implementation.
Batista Ferrer, Harriet; Audrey, Suzanne; Trotter, Caroline; Hickman, Matthew
2015-01-01
Background Interventions to increase uptake of Human Papillomavirus (HPV) vaccination by young women may be more effective if they are underpinned by an appropriate theoretical model or framework. The aims of this review were to describe the theoretical models or frameworks used to explain behaviours in relation to HPV vaccination of young women, and to consider the appropriateness of the theoretical models or frameworks used for informing the development of interventions to increase uptake. Methods Primary studies were identified through a comprehensive search of databases from inception to December 2013. Results Thirty-four relevant studies were identified, of which 31 incorporated psychological health behaviour models or frameworks and three used socio-cultural models or theories. The primary studies used a variety of approaches to measure a diverse range of outcomes in relation to behaviours of professionals, parents, and young women. The majority appeared to use theory appropriately throughout. About half of the quantitative studies presented data in relation to goodness-of-fit tests and the proportion of the variability in the data. Conclusion Due to diverse approaches and inconsistent findings across studies, the current contribution of theory to understanding and promoting HPV vaccination uptake is difficult to assess. Ecological frameworks encourage the integration of individual and social approaches by encouraging exploration of the intrapersonal, interpersonal, organisational, community and policy levels when examining public health issues. Given the small number of studies using such an approach, combined with the importance of these factors in predicting behaviour, more research in this area is warranted. PMID:26314783
Zhang, Chengwei; Li, Xiaohong; Li, Shuxin; Feng, Zhiyong
2017-09-20
The biological environment is uncertain, and its dynamics resemble those of a multiagent environment; research results from the multiagent systems area can therefore provide valuable insights into the understanding of biology and are of great significance for the study of biology. Learning in a multiagent environment is highly dynamic, since the environment is no longer stationary and each agent's behavior changes adaptively in response to other coexisting learners, and vice versa. The dynamics become even more unpredictable when we move from fixed-agent interaction environments to a multiagent social learning framework. Analytical understanding of the underlying dynamics is important and challenging. In this work, we present a social learning framework with homogeneous learners (e.g., Policy Hill Climbing (PHC) learners) and model the behavior of players in the social learning framework as a hybrid dynamical system. By analyzing the dynamical system, we obtain conditions for convergence or non-convergence. We experimentally verify the predictive power of our model using a number of representative games. Experimental results confirm the theoretical analysis. Under the multiagent social learning framework, we modeled the behavior of agents in a biological environment and theoretically analyzed the dynamics of the model. We present some sufficient conditions for convergence or non-convergence and prove them theoretically. The model can be used to predict the convergence of the system.
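To make the type of learner concrete, the sketch below shows a bare-bones Policy Hill Climbing (PHC) pair playing an assumed 2x2 matrix game; it is a generic textbook-style PHC loop, not the authors' hybrid dynamical-system model, and the payoffs, learning rate, and policy step size are illustrative choices.

```python
# Minimal PHC sketch (assumptions throughout): two learners, one 2x2 matching-pennies-style game.
import numpy as np

rng = np.random.default_rng(0)

A = np.array([[1.0, -1.0], [-1.0, 1.0]])   # row player's payoffs (assumed example game)
B = -A                                     # column player's payoffs

alpha, delta = 0.1, 0.01                   # Q-learning rate and policy step size (assumed)

def make_agent(n_actions=2):
    return {"Q": np.zeros(n_actions), "pi": np.full(n_actions, 1.0 / n_actions)}

def act(agent):
    return rng.choice(len(agent["pi"]), p=agent["pi"])

def phc_update(agent, action, reward):
    Q, pi = agent["Q"], agent["pi"]
    Q[action] += alpha * (reward - Q[action])      # stateless Q-value update
    greedy = int(np.argmax(Q))
    for a in range(len(pi)):                       # hill-climb the policy toward the greedy action
        pi[a] += delta if a == greedy else -delta / (len(pi) - 1)
    np.clip(pi, 0.0, None, out=pi)
    pi /= pi.sum()

p1, p2 = make_agent(), make_agent()
for _ in range(20000):
    a1, a2 = act(p1), act(p2)
    phc_update(p1, a1, A[a1, a2])
    phc_update(p2, a2, B[a1, a2])

print("player 1 policy:", np.round(p1["pi"], 3))
print("player 2 policy:", np.round(p2["pi"], 3))
```

In this zero-sum example the joint policies tend to cycle rather than settle, which is exactly the kind of convergence/non-convergence behavior the abstract's dynamical-system analysis is meant to predict.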
Lee, Tak Yan
2011-01-01
This is a theoretical paper with an aim to construct an integrated conceptual framework for the prevention of adolescents' use and abuse of psychotropic drugs. This paper first reports the subjective reasons for adolescents' drug use and abuse in Hong Kong and reviews the theoretical underpinnings. Theories of drug use and abuse, including neurological, pharmacological, genetic predisposition, psychological, and sociological theories, were reviewed. It provides a critical re-examination of crucial factors that support the construction of a conceptual framework for primary prevention of adolescents' drug use and abuse building on, with minor revision, the model of victimization and substance abuse among women presented by Logan et al. This revised model provides a comprehensive and coherent framework synthesized from theories of drug abuse. This paper then provides empirical support for integrating a positive youth development perspective in the revised model. It further explains how the 15 empirically sound constructs identified by Catalano et al. and used in a positive youth development program, the Project P.A.T.H.S., relate generally to the components of the revised model to formulate an integrated positive youth development conceptual framework for primary prevention of adolescent drug use. Theoretical and practical implications as well as limitations and recommendations are discussed. PMID:22194671
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yuan, Chris, E-mail: cyuan@uwm.edu; Wang, Endong; Zhai, Qiang
Temporal homogeneity of inventory data is one of the major problems in life cycle assessment (LCA). Addressing temporal homogeneity of life cycle inventory data is important in reducing the uncertainties and improving the reliability of LCA results. This paper attempts to present a critical review and discussion on the fundamental issues of temporal homogeneity in conventional LCA and propose a theoretical framework for temporal discounting in LCA. Theoretical perspectives for temporal discounting in life cycle inventory analysis are discussed first based on the key elements of a scientific mechanism for temporal discounting. Then generic procedures for performing temporal discounting in LCA are derived and proposed based on the nature of the LCA method and the identified key elements of a scientific temporal discounting method. A five-step framework is proposed and reported in detail based on the technical methods and procedures needed to perform temporal discounting in life cycle inventory analysis. Challenges and possible solutions are also identified and discussed for the technical procedure and scientific accomplishment of each step within the framework. - Highlights: • A critical review of the temporal homogeneity problem of life cycle inventory data • A theoretical framework for performing temporal discounting on inventory data • Methods provided to accomplish each step of the temporal discounting framework.
2004-12-01
handling using the X10 home automation protocol. Each 3D graphics client renders its scene according to an assigned virtual camera position. By having...control protocol. DMX is a versatile and robust framework which overcomes limitations of the X10 home automation protocol which we are currently using
ERIC Educational Resources Information Center
Koedinger, Kenneth R.; Corbett, Albert T.; Perfetti, Charles
2012-01-01
Despite the accumulation of substantial cognitive science research relevant to education, there remains confusion and controversy in the application of research to educational practice. In support of a more systematic approach, we describe the Knowledge-Learning-Instruction (KLI) framework. KLI promotes the emergence of instructional principles of…
Rangarajan, Srinivas; Maravelias, Christos T.; Mavrikakis, Manos
2017-11-09
Here, we present a general optimization-based framework for (i) ab initio and experimental data driven mechanistic modeling and (ii) optimal catalyst design of heterogeneous catalytic systems. Both cases are formulated as a nonlinear optimization problem that is subject to a mean-field microkinetic model and thermodynamic consistency requirements as constraints, for which we seek sparse solutions through a ridge (L2 regularization) penalty. The solution procedure involves an iterative sequence of forward simulation of the differential algebraic equations pertaining to the microkinetic model using a numerical tool capable of handling stiff systems, sensitivity calculations using linear algebra, and gradient-based nonlinear optimization. A multistart approach is used to explore the solution space, and a hierarchical clustering procedure is implemented for statistically classifying potentially competing solutions. An example of methanol synthesis through hydrogenation of CO and CO2 on a Cu-based catalyst is used to illustrate the framework. The framework is fast, is robust, and can be used to comprehensively explore the model solution and design space of any heterogeneous catalytic system.
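As a loose, self-contained illustration of the general idea (fitting kinetic parameters to data via forward ODE simulation plus an L2 penalty), the following sketch uses a single assumed reversible reaction; it is not the authors' DAE-based microkinetic formulation, multistart procedure, or clustering step, and all numbers are placeholders.

```python
# Schematic ridge-penalized kinetic fit: minimize data mismatch from a forward ODE
# simulation plus an L2 penalty on the rate constants (toy A <-> B system, assumed).
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

t_obs = np.linspace(0.0, 5.0, 20)
true_k = np.array([1.2, 0.4])                       # "true" forward/backward rate constants (assumed)

def simulate(k):
    def rhs(t, c):                                  # c = [A], [B]
        r = k[0] * c[0] - k[1] * c[1]
        return [-r, r]
    sol = solve_ivp(rhs, (0.0, 5.0), [1.0, 0.0], t_eval=t_obs, rtol=1e-8)
    return sol.y

rng = np.random.default_rng(1)
data = simulate(true_k) + 0.01 * rng.standard_normal((2, t_obs.size))   # synthetic noisy data

lam = 1e-3                                          # ridge weight (assumed)
def objective(log_k):
    k = np.exp(log_k)                               # enforce positivity via log-parameterization
    resid = simulate(k) - data
    return np.sum(resid**2) + lam * np.sum(k**2)

fit = minimize(objective, x0=np.log([0.5, 0.5]), method="Nelder-Mead")
print("estimated rate constants:", np.round(np.exp(fit.x), 3))
```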
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, Xiongbiao, E-mail: xluo@robarts.ca; Wan, Ying, E-mail: Ying.Wan@student.uts.edu.au; He, Xiangjian
Purpose: Electromagnetically guided endoscopic procedures, which aim at accurately and robustly localizing the endoscope, involve multimodal sensory information during interventions. However, it remains challenging to integrate this information for precise and stable endoscopic guidance. To tackle this challenge, this paper proposes a new framework on the basis of an enhanced particle swarm optimization method to effectively fuse this information for accurate and continuous endoscope localization. Methods: The authors use the particle swarm optimization method, which is one of the stochastic evolutionary computation algorithms, to effectively fuse the multimodal information including preoperative information (i.e., computed tomography images) as a frame of reference, endoscopic camera videos, and positional sensor measurements (i.e., electromagnetic sensor outputs). Since the evolutionary computation method is usually limited by possible premature convergence and its evolutionary factors, the authors introduce the current (endoscopic camera and electromagnetic sensor) observation to boost the particle swarm optimization and also adaptively update evolutionary parameters in accordance with spatial constraints and the current observation, resulting in advantageous performance of the enhanced algorithm. Results: The experimental results demonstrate that the authors' proposed method provides a more accurate and robust endoscopic guidance framework than state-of-the-art methods. The average guidance accuracy of the authors' framework was about 3.0 mm and 5.6°, while the previous methods showed at least 3.9 mm and 7.0°. The average position and orientation smoothness of their method was 1.0 mm and 1.6°, which is significantly better than the other methods (at least 2.0 mm and 2.6°). Additionally, the average visual quality of the endoscopic guidance was improved to 0.29. Conclusions: A robust electromagnetically guided endoscopy framework was proposed on the basis of an enhanced particle swarm optimization method using the current observation information and adaptive evolutionary factors. The authors' proposed framework greatly reduced the guidance errors from (4.3 mm, 7.8°) to (3.0 mm, 5.6°), compared to state-of-the-art methods.
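For readers unfamiliar with the underlying optimizer, a plain particle swarm optimization loop is sketched below; the pose-error objective, bounds, and coefficients are invented stand-ins, and the authors' observation-boosted, adaptively tuned variant is not reproduced.

```python
# Generic PSO loop minimizing a hypothetical 6-D pose-error surrogate (all values assumed).
import numpy as np

rng = np.random.default_rng(42)

def cost(x):
    # Hypothetical stand-in for the image/EM-sensor disagreement of a candidate endoscope pose.
    return np.sum((x - np.array([3.0, -1.0, 2.0, 0.1, 0.2, -0.3]))**2)

n_particles, dim, iters = 30, 6, 200
w, c1, c2 = 0.7, 1.5, 1.5                      # inertia and acceleration weights (assumed)

pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([cost(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(iters):
    r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    vals = np.array([cost(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("best pose estimate:", np.round(gbest, 3))
```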
Organizational strategy, structure, and process.
Miles, R E; Snow, C C; Meyer, A D; Coleman, H J
1978-07-01
Organizational adaptation is a topic that has received only limited and fragmented theoretical treatment. Any attempt to examine organizational adaptation is difficult, since the process is highly complex and changeable. The proposed theoretical framework deals with alternative ways in which organizations define their product-market domains (strategy) and construct mechanisms (structures and processes) to pursue these strategies. The framework is based on interpretation of existing literature and continuing studies in four industries (college textbook publishing, electronics, food processing, and health care).
The theoretical tools of experimental gravitation
NASA Technical Reports Server (NTRS)
Will, C. M.
1972-01-01
Theoretical frameworks for testing relativistic gravity are presented in terms of a system for analyzing theories of gravity invented as alternatives to Einstein. The parametrized post-Newtonian (PPN) formalism, based on the Dicke framework and the Eotvos-Dicke-Braginsky experiment, is discussed in detail. The metric theories of gravity, and their post-Newtonian limits are reviewed, and PPN equations of motion are derived. These equations are used to analyze specific effects and experimental tests in the solar system.
Less can be more: How to make operations more flexible and robust with fewer resources
NASA Astrophysics Data System (ADS)
Haksöz, Çağrı; Katsikopoulos, Konstantinos; Gigerenzer, Gerd
2018-06-01
We review empirical evidence from practice and general theoretical conditions, under which simple rules of thumb can help to make operations flexible and robust. An operation is flexible when it responds adaptively to adverse events such as natural disasters; an operation is robust when it is less affected by adverse events in the first place. We illustrate the relationship between flexibility and robustness in the context of supply chain risk. In addition to increasing flexibility and robustness, simple rules simultaneously reduce the need for resources such as time, money, information, and computation. We illustrate the simple-rules approach with an easy-to-use graphical aid for diagnosing and managing supply chain risk. More generally, we recommend a four-step process for determining the amount of resources that decision makers should invest in so as to increase flexibility and robustness.
Harnessing Sparse and Low-Dimensional Structures for Robust Clustering of Imagery Data
ERIC Educational Resources Information Center
Rao, Shankar Ramamohan
2009-01-01
We propose a robust framework for clustering data. In practice, data obtained from real measurement devices can be incomplete, corrupted by gross errors, or not correspond to any assumed model. We show that, by properly harnessing the intrinsic low-dimensional structure of the data, these kinds of practical problems can be dealt with in a uniform…
A model to assess the Mars Telecommunications Network relay robustness
NASA Technical Reports Server (NTRS)
Girerd, Andre R.; Meshkat, Leila; Edwards, Charles D., Jr.; Lee, Charles H.
2005-01-01
The relatively long mission durations and compatible radio protocols of current and projected Mars orbiters have enabled the gradual development of a heterogeneous constellation providing proximity communication services for surface assets. The current and forecasted capability of this evolving network has reached the point that designers of future surface missions consider complete dependence on it. Such designers, along with those architecting network requirements, have a need to understand the robustness of projected communication service. A model has been created to identify the robustness of the Mars Network as a function of surface location and time. Due to the decade-plus time horizon considered, the network will evolve, with emerging productive nodes and nodes that cease or fail to contribute. The model is a flexible framework to holistically process node information into measures of capability robustness that can be visualized for maximum understanding. Outputs from JPL's Telecom Orbit Analysis Simulation Tool (TOAST) provide global telecom performance parameters for current and projected orbiters. Probabilistic estimates of orbiter fuel life are derived from orbit keeping burn rates, forecasted maneuver tasking, and anomaly resolution budgets. Orbiter reliability is estimated probabilistically. A flexible scheduling framework accommodates the projected mission queue as well as potential alterations.
On decentralized adaptive full-order sliding mode control of multiple UAVs.
Xiang, Xianbo; Liu, Chao; Su, Housheng; Zhang, Qin
2017-11-01
In this study, a novel decentralized adaptive full-order sliding mode control framework is proposed for the robust synchronized formation motion of multiple unmanned aerial vehicles (UAVs) subject to system uncertainty. First, a full-order sliding mode surface is designed in a decentralized manner to incorporate both the individual position tracking error and the synchronized formation error while the UAV group is engaged in building a certain desired geometric pattern in three-dimensional space. Second, a decentralized virtual plant controller is constructed which allows the embedded low-pass filter to attain the chattering-free property of the sliding mode controller. In addition, a robust adaptive technique is integrated into the decentralized chattering-free sliding control design in order to handle unknown bounded uncertainties, without requiring a priori knowledge of bounds on the system uncertainties as stated in conventional chattering-free control methods. Subsequently, system robustness as well as stability of the decentralized full-order sliding mode control of multiple UAVs is synthesized. Numerical simulation results illustrate the effectiveness of the proposed control framework to achieve robust 3D formation flight of the multi-UAV system. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
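The sketch below illustrates, in a heavily simplified first-order form, how an individual tracking error and a neighbour-synchronization error can be combined into one sliding variable; it is not the paper's full-order, chattering-free, adaptive design, and the double-integrator model, gains, and two-UAV topology are assumptions.

```python
# Generic sketch: sliding variable mixing own tracking error with a formation-synchronization
# error, driven by a smooth reaching law on double-integrator dynamics (all values assumed).
import numpy as np

dt, T = 0.01, 20.0
lam, beta, k = 2.0, 1.0, 5.0                   # surface and reaching-law gains (assumed)
offsets = np.array([[0.0, 0.0], [2.0, 0.0]])   # desired formation offsets for 2 UAVs (assumed)

def reference(t):                              # shared reference trajectory (assumed circle)
    return (np.array([np.cos(0.2 * t), np.sin(0.2 * t)]),
            0.2 * np.array([-np.sin(0.2 * t), np.cos(0.2 * t)]))

pos = np.array([[0.5, -0.5], [1.0, 1.0]])
vel = np.zeros((2, 2))

for step in range(int(T / dt)):
    t = step * dt
    p_ref, v_ref = reference(t)
    e = pos - (p_ref + offsets)                # individual tracking errors
    de = vel - v_ref
    for i in range(2):
        j = 1 - i                              # the single neighbour in this 2-UAV example
        sync = e[i] - e[j]                     # synchronized formation error
        s = de[i] + lam * e[i] + beta * sync   # combined sliding variable
        u = -k * np.tanh(5.0 * s)              # smooth reaching law (tanh instead of sign)
        vel[i] += u * dt                       # double-integrator dynamics
    pos += vel * dt

print("final tracking error norms:", np.round(np.linalg.norm(e, axis=1), 3))
```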
NASA Astrophysics Data System (ADS)
Anees, Asim; Aryal, Jagannath; O'Reilly, Małgorzata M.; Gale, Timothy J.; Wardlaw, Tim
2016-12-01
A robust non-parametric framework, based on multiple Radial Basis Function (RBF) kernels, is proposed in this study for detecting land/forest cover changes using Landsat 7 ETM+ images. One of the widely used frameworks is to find change vectors (a difference image) and use a supervised classifier to differentiate between change and no-change. Bayesian classifiers, e.g., the Maximum Likelihood Classifier (MLC) and Naive Bayes (NB), are widely used probabilistic classifiers that assume parametric models, e.g., a Gaussian function, for the class-conditional distributions. However, their performance can be limited if the data set deviates from the assumed model. The proposed framework exploits the useful properties of the Least Squares Probabilistic Classifier (LSPC) formulation, i.e., its non-parametric and probabilistic nature, to model class posterior probabilities of the difference image using a linear combination of a large number of Gaussian kernels. To this end, a simple technique, based on 10-fold cross-validation, is also proposed for tuning model parameters automatically instead of selecting a (possibly) suboptimal combination from pre-specified lists of values. The proposed framework has been tested and compared with the Support Vector Machine (SVM) and NB for detection of defoliation, caused by leaf beetles (Paropsisterna spp.), in Eucalyptus nitens and Eucalyptus globulus plantations of two test areas in Tasmania, Australia, using raw bands and band combination indices of Landsat 7 ETM+. It was observed that, owing to its multi-kernel non-parametric formulation and probabilistic nature, the LSPC outperforms the parametric NB with a Gaussian assumption in the change detection framework, with Overall Accuracy (OA) ranging from 93.6% (κ = 0.87) to 97.4% (κ = 0.94) against 85.3% (κ = 0.69) to 93.4% (κ = 0.85), and is more robust to changing data distributions. Its performance was comparable to SVM, with the added advantages of being probabilistic and capable of handling multi-class problems naturally with its original formulation.
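A numpy-only sketch of an LSPC-style classifier is given below: class posteriors are modelled as clipped, normalized linear combinations of Gaussian kernels fitted by regularized least squares. The toy two-band "difference image" data and the bandwidth/regularization values (which the paper tunes by 10-fold cross-validation) are assumptions, not the study's settings.

```python
# Schematic LSPC-style classifier on a synthetic two-band change/no-change problem.
import numpy as np

rng = np.random.default_rng(0)

X_change = rng.normal(loc=1.5, scale=0.6, size=(200, 2))   # synthetic "change" pixels
X_stable = rng.normal(loc=0.0, scale=0.6, size=(200, 2))   # synthetic "no-change" pixels
X = np.vstack([X_change, X_stable])
y = np.array([1] * 200 + [0] * 200)

def gauss_kernel(A, B, sigma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

sigma, lam = 0.5, 1e-2                      # would be tuned by 10-fold CV in practice (assumed here)
K = gauss_kernel(X, X, sigma)               # kernel centres at all training points
theta = {}
for cls in (0, 1):
    target = (y == cls).astype(float)
    theta[cls] = np.linalg.solve(K.T @ K + lam * np.eye(len(X)), K.T @ target)

def posterior(X_new):
    k = gauss_kernel(X_new, X, sigma)
    scores = np.stack([np.clip(k @ theta[c], 0.0, None) for c in (0, 1)], axis=1)
    return scores / (scores.sum(axis=1, keepdims=True) + 1e-12)   # clip negatives, then normalize

probs = posterior(np.array([[1.4, 1.6], [0.1, -0.2]]))
print("P(change):", np.round(probs[:, 1], 3))
```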
Robustness surfaces of complex networks
NASA Astrophysics Data System (ADS)
Manzano, Marc; Sahneh, Faryad; Scoglio, Caterina; Calle, Eusebi; Marzo, Jose Luis
2014-09-01
Although the robustness of complex networks has been extensively studied in the last decade, a unifying framework able to embrace all the proposed metrics is still lacking. In the literature there are two open issues related to this gap: (a) how to dimension several metrics to allow their summation and (b) how to weight each of the metrics. In this work we propose a solution to the two aforementioned problems by defining the R*-value and introducing the concept of the robustness surface (Ω). The rationale of our proposal is to make use of Principal Component Analysis (PCA). We first set the initial robustness of a network to 1. Second, we find the most informative robustness metric under a specific failure scenario. Then, we repeat the process for several percentages of failures and different realizations of the failure process. Lastly, we join these values to form the robustness surface, which allows the visual assessment of network robustness variability. Results show that a network presents different robustness surfaces (i.e., dissimilar shapes) depending on the failure scenario and the set of metrics. In addition, the robustness surface allows the robustness of different networks to be compared.
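A rough sketch of the recipe described above follows: several normalized robustness metrics are collected over failure percentages and realizations, and the first principal component summarizes them into R*-values; the random "metric" values below are placeholders, not measurements on real networks.

```python
# Schematic R*-value / robustness-surface computation from placeholder metric values.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
failure_fracs = np.linspace(0.05, 0.5, 10)
n_realizations, n_metrics = 20, 4           # e.g. efficiency, largest component size, ... (assumed)

# metrics[i, j, m]: metric m after removing failure_fracs[i] of nodes, realization j,
# already normalized so the intact network scores 1 (placeholder values here).
metrics = np.clip(1.0 - failure_fracs[:, None, None]
                  * rng.uniform(0.5, 1.5, (len(failure_fracs), n_realizations, n_metrics)),
                  0.0, 1.0)

flat = metrics.reshape(-1, n_metrics)
pca = PCA(n_components=1).fit(flat)
r_star = pca.transform(flat).reshape(len(failure_fracs), n_realizations)

# r_star is the "robustness surface": one value per (failure level, realization),
# which could be visualized as a surface over those two axes.
print("mean R*-value per failure level:", np.round(r_star.mean(axis=1), 3))
```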
Using the TPACK Framework to Facilitate Decision Making on Instructional Technologies
ERIC Educational Resources Information Center
Sobel, Karen; Grotti, Margaret G.
2013-01-01
Technological pedagogical content knowledge ("TPACK") is a theoretical framework used primarily in the field of education to facilitate the integration of technology into educational endeavors. This framework can be particularly valuable to librarians, who are heavy users of technology, and can provide a structure that can help…
Metacognition, Positioning and Emotions in Mathematical Activities
ERIC Educational Resources Information Center
Daher, Wajeeh; Anabousy, Ahlam; Jabarin, Roqaya
2018-01-01
Researchers of mathematics education have been paying attention to the affective aspect of learning mathematics for more than one decade. Different theoretical frameworks have been suggested to analyze this aspect, where we utilize in the present research the discursive framework of Evans, Morgan and Tsatsaroni. This framework enables to link…
Theoretical Framework of Researcher Knowledge Development in Mathematics Education
ERIC Educational Resources Information Center
Kontorovich, Igor'
2016-01-01
The goal of this paper is to present a framework of researcher knowledge development in conducting a study in mathematics education. The key components of the framework are: knowledge germane to conducting a particular study, processes of knowledge accumulation, and catalyzing filters that influence a researcher's decision making. The components…
Adventure Learning and Learner-Engagement: Frameworks for Designers and Educators
ERIC Educational Resources Information Center
Henrickson, Jeni; Doering, Aaron
2013-01-01
There is a recognized need for theoretical frameworks that can guide designers and educators in the development of engagement-rich learning experiences that incorporate emerging technologies in pedagogically sound ways. This study investigated one such promising framework, adventure learning (AL). Data were gathered via surveys, interviews, direct…
Developmental Implications of the Levels of Processing Memory Framework.
ERIC Educational Resources Information Center
Naus, Mary J.
The levels of processing framework for understanding memory development has generated little empirical or theoretical work that furthers an understanding of the developmental memory system. Although empirical studies by those testing the levels of processing framework have demonstrated that mnemonic strategies employed by children are the critical…
Peer-Formativity: A Framework for Academic Writing
ERIC Educational Resources Information Center
Murray, Rowena; Thow, Morag
2014-01-01
The system currently deployed to assess research outputs in higher education can influence what, how and for whom academics write; for some it may determine whether or not they write at all. This article offers a framework for negotiating this performative context--the writing meeting. This framework uses the established theoretical underpinning…
A Competency Approach to Developing Leaders--Is This Approach Effective?
ERIC Educational Resources Information Center
Richards, Patricia
2008-01-01
This paper examines the underlying assumptions that competency-based frameworks are based upon in relation to leadership development. It examines the impetus for this framework becoming the prevailing theoretical base for developing leaders and tracks the historical path to this phenomenon. Research suggests that a competency-based framework may…
Practical robustness measures in multivariable control system analysis. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Lehtomaki, N. A.
1981-01-01
The robustness of the stability of multivariable linear time-invariant feedback control systems with respect to model uncertainty is considered using frequency domain criteria. Available robustness tests are unified under a common framework based on the nature and structure of model errors. These results are derived using a multivariable version of Nyquist's stability theorem in which the minimum singular value of the return difference transfer matrix is shown to be the multivariable generalization of the distance to the critical point on a single-input, single-output Nyquist diagram. Using the return difference transfer matrix, a very general robustness theorem is presented from which all of the robustness tests dealing with specific model errors may be derived. The robustness tests that explicitly utilize model error structure are able to guarantee feedback system stability in the face of model errors of larger magnitude than those robustness tests that do not. The robustness of linear quadratic Gaussian control systems is analyzed.
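A small numerical illustration of the central quantity is given below: the minimum singular value of the return difference matrix I + L(jω) evaluated over a frequency grid. The 2x2 loop transfer function is an invented example, not one from the thesis.

```python
# Compute sigma_min(I + L(jw)) over frequency for an assumed 2x2 loop transfer matrix.
import numpy as np

def loop_transfer(s):
    # Hypothetical 2x2 loop transfer function L(s), chosen only for illustration.
    return np.array([[2.0 / (s + 1.0), 0.5 / (s + 2.0)],
                     [0.2 / (s + 1.0), 1.0 / (s + 3.0)]])

omegas = np.logspace(-2, 2, 400)
sigma_min = []
for w in omegas:
    return_difference = np.eye(2) + loop_transfer(1j * w)
    sigma_min.append(np.linalg.svd(return_difference, compute_uv=False)[-1])

# The smallest value over frequency bounds how large a (suitably structured) model error
# can be before closed-loop stability can no longer be guaranteed.
print("min over frequency of sigma_min(I + L(jw)):", round(min(sigma_min), 4))
```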
Redesigning Orientation in an Intensive Care Unit Using 2 Theoretical Models.
Kozub, Elizabeth; Hibanada-Laserna, Maribel; Harget, Gwen; Ecoff, Laurie
2015-01-01
To accommodate a higher demand for critical care nurses, an orientation program in a surgical intensive care unit was revised and streamlined. Two theoretical models served as a foundation for the revision and resulted in clear clinical benchmarks for orientation progress evaluation. The purpose of the project was to integrate theoretical frameworks into practice to improve the unit orientation program. Performance improvement methods served as a framework for the revision, and outcomes were measured before and after implementation. The revised orientation program increased 1- and 2-year nurse retention and decreased turnover. Critical care knowledge increased after orientation for both the preintervention and postintervention groups. Incorporating a theoretical basis for orientation has been shown to be successful in increasing the number of nurses completing orientation and improving retention, turnover rates, and knowledge gained.
NASA Astrophysics Data System (ADS)
Tryfonidis, Michail
It has been observed that during orbital spaceflight the absence of gravitation related sensory inputs causes incongruence between the expected and the actual sensory feedback resulting from voluntary movements. This incongruence results in a reinterpretation or neglect of gravity-induced sensory input signals. Over time, new internal models develop, gradually compensating for the loss of spatial reference. The study of adaptation of goal-directed movements is the main focus of this thesis. The hypothesis is that during the adaptive learning process the neural connections behave in ways that can be described by an adaptive control method. The investigation presented in this thesis includes two different sets of experiments. A series of dart throwing experiments took place onboard the space station Mir. Experiments also took place at the Biomechanics lab at MIT, where the subjects performed a series of continuous trajectory tracking movements while a planar robotic manipulandum exerted external torques on the subjects' moving arms. The experimental hypothesis for both experiments is that during the first few trials the subjects will perform poorly trying to follow a prescribed trajectory, or trying to hit a target. A theoretical framework is developed that is a modification of the sliding control method used in robotics. The new control framework is an attempt to explain the adaptive behavior of the subjects. Numerical simulations of the proposed framework are compared with experimental results and predictions from competitive models. The proposed control methodology extends the results of the sliding mode theory to human motor control. The resulting adaptive control model of the motor system is robust to external dynamics, even those of negative gain, uses only position and velocity feedback, and achieves bounded steady-state error without explicit knowledge of the system's nonlinearities. In addition, the experimental and modeling results demonstrate that visuomotor learning is important not only for error correction through internal model adaptation on ground or in microgravity, but also for the minimization of the total mean-square error in the presence of random variability. Thus human intelligent decision displays certain attributes that seem to conform to Bayesian statistical games. (Copies available exclusively from MIT Libraries, Rm. 14-0551, Cambridge, MA 02139-4307. Ph. 617-253-5668; Fax 617-253-1690.)
Atlas-based liver segmentation and hepatic fat-fraction assessment for clinical trials.
Yan, Zhennan; Zhang, Shaoting; Tan, Chaowei; Qin, Hongxing; Belaroussi, Boubakeur; Yu, Hui Jing; Miller, Colin; Metaxas, Dimitris N
2015-04-01
Automated assessment of hepatic fat-fraction is clinically important. A robust and precise segmentation would enable accurate, objective and consistent measurement of hepatic fat-fraction for disease quantification, therapy monitoring and drug development. However, segmenting the liver in clinical trials is a challenging task due to the variability of liver anatomy as well as the diverse sources from which the images were acquired. In this paper, we propose an automated and robust framework for liver segmentation and assessment. It uses a single statistical atlas registration to initialize a robust deformable model and obtain a fine segmentation. The fat-fraction map is computed using a chemical-shift-based method in the delineated liver region. The proposed method is validated on 14 abdominal magnetic resonance (MR) volumetric scans. The qualitative and quantitative comparisons show that our proposed method can achieve better segmentation accuracy with less variance compared with two other atlas-based methods. Experimental results demonstrate the promise of our assessment framework. Copyright © 2014 Elsevier Ltd. All rights reserved.
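As a minimal illustration of the final measurement step, the sketch below computes a voxel-wise signal fat-fraction F/(W+F) inside a liver mask, assuming separated water and fat images are already available; the arrays are synthetic placeholders, and this is not the paper's processing pipeline.

```python
# Toy chemical-shift (Dixon-style) fat-fraction computation inside an assumed liver mask.
import numpy as np

rng = np.random.default_rng(7)
shape = (64, 64)
water = np.abs(rng.normal(100.0, 10.0, shape))     # synthetic water-only signal
fat = np.abs(rng.normal(15.0, 5.0, shape))         # synthetic fat-only signal
liver_mask = np.zeros(shape, dtype=bool)
liver_mask[16:48, 16:48] = True                    # stand-in for the segmented liver region

fat_fraction = fat / (water + fat + 1e-9)          # voxel-wise signal fat-fraction
mean_ff = fat_fraction[liver_mask].mean()
print(f"mean hepatic fat-fraction in mask: {100 * mean_ff:.1f}%")
```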
Robust all-source positioning of UAVs based on belief propagation
NASA Astrophysics Data System (ADS)
Chen, Xi; Gao, Wenyun; Wang, Jiabo
2013-12-01
For unmanned air vehicles (UAVs) to survive hostile operational environments, it is always preferable to utilize all available wireless positioning sources to fuse a robust position. While belief propagation is a well-established method for all-source data fusion, handling all the mathematics involved is not straightforward. In this work, a comprehensive mathematical framework for belief propagation-based all-source positioning of UAVs is developed, taking into account wireless sources including Global Navigation Satellite Systems (GNSS) space vehicles, peer UAVs, ground control stations, and signals of opportunity. Based on the mathematical framework, a positioning algorithm named Belief propagation-based Opportunistic Positioning of UAVs (BOPU) is proposed, with an unscented particle filter for Bayesian approximation. The robustness of the proposed BOPU is evaluated using a fictitious scenario in which a group of formation-flying UAVs encounters GNSS countermeasures en route. Four different configurations of measurement availability are simulated. The results show that the performance of BOPU varies only slightly with different measurement availability.
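To make the "sample, weight, resample" fusion idea concrete, a bare-bones bootstrap particle filter using simulated ranges to known anchors is sketched below; it is far simpler than the unscented particle filter and belief-propagation scheme of BOPU, and the anchors, motion model, and noise levels are assumptions.

```python
# Bootstrap particle filter fusing simulated range measurements to known anchors (all values assumed).
import numpy as np

rng = np.random.default_rng(5)
anchors = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])   # known anchor positions (assumed)
true_pos = np.array([30.0, 40.0])
sigma_r = 2.0                                                  # range noise std in metres (assumed)

n = 2000
particles = rng.uniform(0, 100, size=(n, 2))                   # initial position hypotheses
weights = np.full(n, 1.0 / n)

for _ in range(10):                                            # ten measurement epochs
    true_pos = true_pos + np.array([1.0, 0.5])                 # assumed constant motion
    particles += np.array([1.0, 0.5]) + rng.normal(0, 0.5, (n, 2))   # propagate with process noise
    ranges = np.linalg.norm(anchors - true_pos, axis=1) + rng.normal(0, sigma_r, 3)
    pred = np.linalg.norm(particles[:, None, :] - anchors[None, :, :], axis=2)
    loglik = -0.5 * np.sum(((pred - ranges) / sigma_r) ** 2, axis=1)
    weights *= np.exp(loglik - loglik.max())                   # weight by measurement likelihood
    weights /= weights.sum()
    idx = rng.choice(n, size=n, p=weights)                     # multinomial resampling
    particles, weights = particles[idx], np.full(n, 1.0 / n)

print("estimated position:", np.round(particles.mean(axis=0), 1), "true:", np.round(true_pos, 1))
```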
Methods for compressible multiphase flows and their applications
NASA Astrophysics Data System (ADS)
Kim, H.; Choe, Y.; Kim, H.; Min, D.; Kim, C.
2018-06-01
This paper presents an efficient and robust numerical framework to deal with multiphase real-fluid flows and their broad spectrum of engineering applications. A homogeneous mixture model incorporated with a real-fluid equation of state and a phase change model is considered to calculate complex multiphase problems. As robust and accurate numerical methods to handle multiphase shocks and phase interfaces over a wide range of flow speeds, the AUSMPW+_N and RoeM_N schemes with a system preconditioning method are presented. These methods are assessed by extensive validation problems with various types of equation of state and phase change models. Representative realistic multiphase phenomena, including the flow inside a thermal vapor compressor, pressurization in a cryogenic tank, and unsteady cavitating flow around a wedge, are then investigated as application problems. With appropriate physical modeling followed by robust and accurate numerical treatments, compressible multiphase flow physics such as phase changes, shock discontinuities, and their interactions are well captured, confirming the suitability of the proposed numerical framework to wide engineering applications.
A Theoretical Framework for a Virtual Diabetes Self-Management Community Intervention
Vorderstrasse, Allison; Shaw, Ryan J.; Blascovich, Jim; Johnson, Constance M.
2015-01-01
Due to its high prevalence, chronic nature, potential complications, and self-management challenges for patients, diabetes presents significant health education and support issues. We developed and pilot-tested a virtual community for adults with type 2 diabetes to promote self-management education and provide social support. Although digital-based programs such as virtual environments can address significant barriers to reaching patients (i.e., child care, transportation, location), they must be strongly grounded in a theoretical basis to be well-developed and effective. In this article, we discuss how we synthesized behavioral and virtual environment theoretical frameworks to guide the development of SLIDES (Second Life Impacts Diabetes Education and Support). PMID:24451083
Hunt, Louise A; McGee, Paula; Gutteridge, Robin; Hughes, Malcolm
2016-04-01
This study was undertaken in response to concerns that mentors who assessed practical competence were reluctant to fail student nurses which generated doubts about the fitness to practise of some registered nurses. Limited evidence was available about the experiences of mentors who had failed underperforming students and what had helped them to do this. To investigate what enabled some mentors to fail underperforming students when it was recognised that many were hesitant to do so. An ethically approved, grounded theory approach was used to explore thirty-one nurses' experiences of failing student nurses in practical assessments in England. Participants were recruited using theoretical sampling techniques. Semi-structured interviews were conducted. Analysis was undertaken using iterative, constant comparative techniques and reflexive processes. The theoretical framework which emerged had strong resonance with professionals. Five categories emerged from the findings: (1) Braving the assessment vortex; (2) Identifying the 'gist' of underperformance; (3) Tempering Reproach; (4) Standing up to scrutiny; and (5) Drawing on an interpersonal network. These categories together revealed that mentors needed to feel secure to fail a student nurse in a practical assessment and that they used a three stage decision making process to ascertain if this was the case. Many of the components which helped mentors to feel secure were informal in nature and functioned on goodwill and local arrangements rather than on timely, formal, organisational systems. The mentor's partner/spouse and practice education facilitator or link lecturer were identified as the key people who provided essential emotional support during this challenging experience. This study contributes to understanding of the combined supportive elements required for robust practical assessment. It presents a new explanatory framework about how mentors formulate the decision to fail a student nurse and the supportive structures which are necessary for this to occur. Copyright © 2016 Elsevier Ltd. All rights reserved.
Liao, Qiuyan; Cowling, Benjamin J; Lam, Wendy Wing Tak; Fielding, Richard
2011-06-01
Understanding population responses to influenza helps optimize public health interventions. Relevant theoretical frameworks remain nascent. The aim was to model associations between trust in information, perceived hygiene effectiveness, knowledge about the causes of influenza, perceived susceptibility and worry, and personal hygiene practices (PHPs) associated with influenza. Cross-sectional household telephone surveys on avian influenza A/H5N1 (2006) and pandemic influenza A/H1N1 (2009) gathered comparable data on trust in formal and informal sources of influenza information, influenza-related knowledge, perceived hygiene effectiveness, worry, perceived susceptibility, and PHPs. Exploratory factor analysis confirmed domain content, while confirmatory factor analysis was used to evaluate the extracted factors. The hypothesized model, compiled from different theoretical frameworks, was optimized with structural equation modelling using the A/H5N1 data. The optimized model was then tested against the A/H1N1 dataset. The model was robust across datasets, though corresponding path weights differed. Trust in formal information was positively associated with perceived hygiene effectiveness, which was positively associated with PHPs in both datasets. Trust in formal information was positively associated with influenza worry in the A/H5N1 data, and with knowledge of influenza cause in the A/H1N1 data, both variables being positively associated with PHPs. Trust in informal information was positively associated with influenza worry in both datasets. Independent of information trust, perceived influenza susceptibility was associated with influenza worry. Worry was associated with PHPs in the A/H5N1 data only. Knowledge of influenza cause and perceived PHP effectiveness were associated with PHPs. Improving trust in formal information should increase PHPs. Worry was significantly associated with PHPs in A/H5N1.
NASA Astrophysics Data System (ADS)
Herrera, Jorge M.; Chapanoff, Miguel
2017-12-01
In the field of maritime archaeology, the use of maritime, coastal, riverine, and lacustrine spaces by past societies has been perceived in different and changing viewpoints. These perspectives have flourished in dynamic and varying ways in many countries, and under different theoretical constructs. If in the 1970s the subject was perhaps not recognized as a central research subject by much of our community, it is now not only accepted but it has become a robust area of interest in maritime research. Two concepts in Latin America have been accepted that have had widespread application and influence, namely the regional maritime context and the maritorio. The points of contact between both are so intense that it is possible to speak about a single alternative with two possible names. In this article, their origins, applications, and theoretical influences are presented in a way that unifies these two concepts into a single approach (the maritorium), and examines how these ideas have been applied to research carried out in Mexico, Chile, and Uruguay. These applications are wide ranging, as they include the interconnected complexity between land and sea as used and inhabited by past societies. They have been applied in the study of ship traps, whole fleets, sites of maritime conflict and warfare, exploration activities, and ethnographic research. These will also be presented in light of other concepts of similar interest in the international sphere, such as the widespread concept of the Maritime Cultural Landscape, and also in view of other theoretical frameworks coming from the wider sphere of the profession, such as Landscape Archaeology and Phenomenological Archaeology.
2010-01-01
Background The measurement of healthcare provider performance is becoming more widespread. Physicians have been guarded about performance measurement, in part because the methodology for comparative measurement of care quality is underdeveloped. Comprehensive quality improvement will require comprehensive measurement, implying the aggregation of multiple quality metrics into composite indicators. Objective To present a conceptual framework to develop comprehensive, robust, and transparent composite indicators of pediatric care quality, and to highlight aspects specific to quality measurement in children. Methods We reviewed the scientific literature on composite indicator development, health systems, and quality measurement in the pediatric healthcare setting. Frameworks were selected for explicitness and applicability to a hospital-based measurement system. Results We synthesized various frameworks into a comprehensive model for the development of composite indicators of quality of care. Among its key premises, the model proposes identifying structural, process, and outcome metrics for each of the Institute of Medicine's six domains of quality (safety, effectiveness, efficiency, patient-centeredness, timeliness, and equity) and presents a step-by-step framework for embedding the quality of care measurement model into composite indicator development. Conclusions The framework presented offers researchers an explicit path to composite indicator development. Without a scientifically robust and comprehensive approach to measurement of the quality of healthcare, performance measurement will ultimately fail to achieve its quality improvement goals. PMID:20181129
Understanding HIV disclosure: A review and application of the Disclosure Processes Model
Chaudoir, Stephenie R.; Fisher, Jeffrey D.; Simoni, Jane M.
2014-01-01
HIV disclosure is a critical component of HIV/AIDS prevention and treatment efforts, yet the field lacks a comprehensive theoretical framework with which to study how HIV-positive individuals make decisions about disclosing their serostatus and how these decisions affect them. Recent theorizing in the context of the Disclosure Processes Model has suggested that the disclosure process consists of antecedent goals, the disclosure event itself, mediating processes and outcomes, and a feedback loop. In this paper, we apply this new theoretical framework to HIV disclosure in order to review the current state of the literature, identify gaps in existing research, and highlight the implications of the framework for future work in this area. PMID:21514708
Social energy exchange theory for postpartum depression.
Posmontier, Bobbie; Waite, Roberta
2011-01-01
Postpartum depression (PPD), a significant health problem affecting about 19.4% of postpartum women worldwide, may result in long-term cognitive and behavior problems in children, spousal depression, widespread family dysfunction, and chronic and increasingly severe maternal depression. Although current theoretical frameworks provide a rich context for studying PPD, none provides a framework that specifically addresses the dynamic relationship of the inner personal experience with the social and cultural context of PPD. The authors propose the social energy exchange theory for postpartum depression to understand how PPD impedes this dynamic relationship and suggest it as a theoretical framework for the study of interventions that would target intra- and interpersonal disturbance within the social and cultural context.
Akimbekov, Zamirbek; Katsenis, Athanassios D; Nagabhushana, G P; Ayoub, Ghada; Arhangelskis, Mihails; Morris, Andrew J; Friščić, Tomislav; Navrotsky, Alexandra
2017-06-14
We provide the first combined experimental and theoretical evaluation of how differences in ligand structure and framework topology affect the relative stabilities of isocompositional (i.e., true polymorph) metal-organic frameworks (MOFs). We used solution calorimetry and periodic DFT calculations to analyze the thermodynamics of two families of topologically distinct polymorphs of zinc zeolitic imidazolate frameworks (ZIFs) based on 2-methyl- and 2-ethylimidazolate linkers, demonstrating a correlation between measured thermodynamic stability and density, and a pronounced effect of the ligand substituent on their stability. The results show that mechanochemical syntheses and transformations of ZIFs are consistent with Ostwald's rule of stages and proceed toward thermodynamically increasingly stable, more dense phases.
Jack, Leonard; Liburd, Leandris; Spencer, Tirzah; Airhihenbuwa, Collins O
2004-06-01
Eight studies included in a recent systematic review of the efficacy of diabetes self-management education were qualitatively reexamined to determine the presence of theoretical frameworks, methods used to ensure cultural appropriateness, and the quality of the instrument. Theoretical frameworks that help to explain complex pathways that produce health outcomes were lacking; culture indices were not incorporated into diabetes self-management education; and the instruments used to measure outcomes were inadequate. We provide recommendations to improve research on diabetes self-management education in community settings through use of a contextual framework that encourages targeting multiple levels of influence--individual, family, organizational, community, and policy.
Theoretical Grounds for the Propagation of Uncertainties in Monte Carlo Particle Transport
NASA Astrophysics Data System (ADS)
Saracco, Paolo; Pia, Maria Grazia; Batic, Matej
2014-04-01
We introduce a theoretical framework for the calculation of uncertainties affecting observables produced by Monte Carlo particle transport, which derive from uncertainties in physical parameters input into simulation. The theoretical developments are complemented by a heuristic application, which illustrates the method of calculation in a streamlined simulation environment.
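A toy example of the basic mechanism follows: an uncertain input parameter (here a slab attenuation coefficient) is repeatedly sampled, a simple analog Monte Carlo transmission calculation is rerun, and the induced spread of the scored observable is reported; all numbers are placeholders and this is not the authors' formalism.

```python
# Toy propagation of an input-parameter uncertainty through a Monte Carlo transport observable.
import numpy as np

rng = np.random.default_rng(11)

def transmission(mu, thickness=2.0, n_particles=20000):
    # Analog MC: a particle crosses the slab if its sampled free path exceeds the slab thickness.
    free_paths = rng.exponential(1.0 / mu, n_particles)
    return np.mean(free_paths > thickness)

mu_mean, mu_rel_unc = 0.8, 0.05            # assumed parameter value and 5% relative uncertainty
samples = [transmission(rng.normal(mu_mean, mu_rel_unc * mu_mean)) for _ in range(200)]

print(f"transmission = {np.mean(samples):.4f} +/- {np.std(samples):.4f} "
      "(spread induced by the parameter uncertainty plus MC statistics)")
```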
Some New Theoretical Issues in Systems Thinking Relevant for Modelling Corporate Learning
ERIC Educational Resources Information Center
Minati, Gianfranco
2007-01-01
Purpose: The purpose of this paper is to describe fundamental concepts and theoretical challenges with regard to systems, and to build on these in proposing new theoretical frameworks relevant to learning, for example in so-called learning organizations. Design/methodology/approach: The paper focuses on some crucial fundamental aspects introduced…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Devkota, Jagannath; Kim, Ki-Joong; Ohodnicki, Paul R.
The integration of nanoporous materials such as metal organic frameworks (MOFs) with sensitive transducers can result in robust sensing platforms for monitoring gases and chemical vapors for a range of applications.
Darcy's law predicts widespread forest mortality under climate warming
NASA Astrophysics Data System (ADS)
Allen, C. D.; McDowell, N. G.
2015-12-01
Drought- and heat-induced tree mortality is accelerating in many forest biomes as a consequence of a warming climate, resulting in a threat to global forests unlike any in recorded history. Forests store the majority of terrestrial carbon, thus their loss may have significant and sustained impacts on the global carbon cycle. We used a hydraulic corollary to Darcy's law, a core principle of vascular plant physiology, to predict characteristics of plants that will survive and die during drought under warmer future climates. Plants that are tall with isohydric stomatal regulation, low hydraulic conductance, and high leaf area are most likely to die from future drought stress. Thus, tall trees of old-growth forests are at the greatest risk of loss, which has ominous implications for terrestrial carbon storage. This application of Darcy's law indicates that today's forests generally should be replaced by shorter and more xeric plants, owing to future warmer droughts and associated wildfires and pest attacks. The corollary to Darcy's law also provides a simple, robust framework for informing forest management interventions needed to promote the survival of current forests. There are assumptions and omissions in this theoretical prediction, as well as new evidence supporting its predictions, both of which we review here. Given the robustness of Darcy's law for predictions of vascular plant function, we conclude with high certainty that today's forests are going to be subject to continued increases in mortality rates that will result in substantial reorganization of their structure and carbon storage.
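For orientation, one commonly quoted form of the hydraulic corollary referred to above is written out below; the proportionality and symbols are an assumed paraphrase of the standard plant-hydraulics relation, not a verbatim reproduction of the authors' equation.

```latex
% Assumed paraphrase of the hydraulic corollary to Darcy's law:
%   G_s : canopy conductance per unit leaf area
%   k_s : sapwood-specific hydraulic conductivity
%   A_s/A_l : sapwood-to-leaf area ratio
%   \Delta\Psi : soil-to-leaf water potential difference
%   h : tree height,  \eta : water viscosity
\begin{equation*}
  G_s \;\propto\; \frac{k_s \,\left(A_s / A_l\right)\, \Delta\Psi}{h\,\eta}
\end{equation*}
```

Read this way, taller trees (larger h) with high leaf area relative to sapwood face lower conductance per unit leaf area under a given water potential gradient, which is the basis of the mortality prediction summarized above.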
Category-length and category-strength effects using images of scenes.
Baumann, Oliver; Vromen, Joyce M G; Boddy, Adam C; Crawshaw, Eloise; Humphreys, Michael S
2018-06-21
Global matching models have provided an important theoretical framework for recognition memory. Key predictions of this class of models are that (1) increasing the number of occurrences in a study list of some items affects the performance on other items (list-strength effect) and that (2) adding new items results in a deterioration of performance on the other items (list-length effect). Experimental confirmation of these predictions has been difficult, and the results have been inconsistent. A review of the existing literature, however, suggests that robust length and strength effects do occur when sufficiently similar hard-to-label items are used. In an effort to investigate this further, we had participants study lists containing one or more members of visual scene categories (bathrooms, beaches, etc.). Experiments 1 and 2 replicated and extended previous findings showing that the study of additional category members decreased accuracy, providing confirmation of the category-length effect. Experiment 3 showed that repeating some category members decreased the accuracy of nonrepeated members, providing evidence for a category-strength effect. Experiment 4 eliminated a potential challenge to these results. Taken together, these findings provide robust support for global matching models of recognition memory. The overall list lengths, the category sizes, and the number of repetitions used demonstrated that scene categories are well-suited to testing the fundamental assumptions of global matching models. These include (A) interference from memories for similar items and contexts, (B) nondestructive interference, and (C) that conjunctive information is made available through a matching operation.
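The sketch below is a toy summed-similarity ("global matching") simulation included only to make the list-length prediction concrete: adding studied items increases the variance of the summed match signal, so d' falls. The random ±1 vector representation, encoding noise, and decision index are illustrative assumptions rather than the specific models evaluated in the paper.

```python
# Toy global-matching simulation showing a list-length effect on d'.
import numpy as np

rng = np.random.default_rng(2)
dim, encoding_noise = 100, 0.5             # representation size and trace noise (assumed)

def dprime(list_length, n_trials=4000):
    hits, false_alarms = [], []
    for _ in range(n_trials):
        items = rng.choice([-1.0, 1.0], size=(list_length, dim))
        traces = items + rng.normal(0, encoding_noise, items.shape)       # noisy stored traces
        def familiarity(probe):
            return float(np.sum(traces @ probe) / dim)                    # summed match to all traces
        hits.append(familiarity(items[0]))                                # studied ("old") probe
        false_alarms.append(familiarity(rng.choice([-1.0, 1.0], size=dim)))  # new probe
    hits, false_alarms = np.array(hits), np.array(false_alarms)
    pooled_sd = np.sqrt(0.5 * (hits.var() + false_alarms.var()))
    return (hits.mean() - false_alarms.mean()) / pooled_sd

print("d' with a 4-item list :", round(dprime(4), 2))
print("d' with a 16-item list:", round(dprime(16), 2))
```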
ERIC Educational Resources Information Center
Huber, Elaine
2017-01-01
Scholarly evaluation practices in learning and teaching projects are under-reported in the literature. In order for robust evaluative measures to be implemented, a project requires a well-designed evaluation plan. This research study describes the development of a practical evaluation planning framework through an action research approach, using…
Metzger, Marc J.; Bunce, Robert G.H.; Jongman, Rob H.G.; Sayre, Roger G.; Trabucco, Antonio; Zomer, Robert
2013-01-01
Main conclusions: The GEnS provides a robust spatial analytical framework for the aggregation of local observations, identification of gaps in current monitoring efforts and systematic design of complementary and new monitoring and research. The dataset is available for non-commercial use through the GEO portal (http://www.geoportal.org).
ERIC Educational Resources Information Center
Karisan, Dilek; Zeidler, Dana L.
2017-01-01
The aim of this paper is to examine the importance of contextualizing Nature of Science (NOS) within the Socioscientific Issues (SSI) framework, given its importance to science education. The emphasis on advancing scientific literacy is contingent upon a robust understanding and appreciation of NOS, as well as the acquisition of…
ERIC Educational Resources Information Center
Manning, Patrick R.
2012-01-01
While the U.S. Bishops' Doctrinal Elements of a Curriculum Framework provides robust content guidelines for a national high school Religion curriculum, its successful implementation will depend largely on concurrent development of, and training in, pedagogy suited to Christian education. This paper directs educators to existing catechetical…
Reciprocity Between Robustness of Period and Plasticity of Phase in Biological Clocks
NASA Astrophysics Data System (ADS)
Hatakeyama, Tetsuhiro S.; Kaneko, Kunihiko
2015-11-01
Circadian clocks exhibit robustness of period and plasticity of phase against environmental changes such as temperature and nutrient conditions. Thus far, however, it has been unclear how both are simultaneously achieved. By investigating distinct models of circadian clocks, we demonstrate reciprocity between robustness and plasticity: higher robustness in the period implies higher plasticity in the phase, where changes in period and in phase follow a linear relationship with a negative coefficient. The robustness of period is achieved by adaptation on the limit cycle via a concentration change of a buffer molecule, whose temporal change leads to a phase shift following a shift of the limit-cycle orbit in phase space. The generality of this reciprocity in clocks with the adaptation mechanism is confirmed by theoretical analysis of simple models, and its biological significance is discussed.
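A hedged restatement of the reported relationship, in notation assumed here rather than taken from the paper:

```latex
% For a given environmental change, the induced shift in period (\Delta T) and
% in phase (\Delta\phi) are reported to be linearly related with a negative slope:
\Delta\phi \;\approx\; \phi_0 - c\,\Delta T, \qquad c > 0,
% so a clock whose period is strongly buffered (small |\Delta T|) shows a
% comparatively large phase response, and vice versa.
```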
Guo, Yu; Dong, Daoyi; Shu, Chuan-Cun
2018-04-04
Achieving fast and efficient quantum state transfer is a fundamental task in physics, chemistry, and quantum information science. However, successful implementation of perfect quantum state transfer also requires robustness against practically inevitable perturbative defects. Here, we demonstrate how optimal and robust quantum state transfer can be achieved by shaping the spectral phase of an ultrafast laser pulse within the framework of frequency-domain quantum optimal control theory. Our numerical simulations of a single dibenzoterrylene molecule and of atomic rubidium show that optimal and robust quantum state transfer via spectral-phase-modulated laser pulses can be achieved by incorporating a filtering function of the frequency into the optimization algorithm, which in turn has potential applications for ultrafast robust control of photochemical reactions.
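As a hedged, self-contained sketch of the underlying idea that shaping a pulse's spectral phase can buy robustness in state transfer, the simulation below compares a transform-limited resonant pi-pulse with a strongly chirped pulse of the same bandwidth acting on a two-level system. This is a generic rapid-adiabatic-passage illustration, not the authors' frequency-domain optimal-control algorithm, and every parameter (pulse duration, chirp, pulse areas, amplitude errors) is invented for illustration.

```python
# Hedged illustration (not the authors' method): spectral-phase shaping of a pulse
# and its effect on the robustness of population transfer in a resonant two-level
# system (rotating-wave approximation, hbar = 1, time in units of the
# transform-limited pulse duration TAU).
import numpy as np

TAU = 1.0                   # transform-limited (unchirped) pulse duration
N = 2 ** 14                 # time samples
T_WINDOW = 200.0            # total simulated time window
t = np.linspace(-T_WINDOW / 2, T_WINDOW / 2, N, endpoint=False)
dt = t[1] - t[0]
omega = 2 * np.pi * np.fft.fftfreq(N, d=dt)   # frequencies relative to the carrier

def shaped_rabi(peak_rabi_tl, chirp):
    """Complex Rabi frequency Omega(t) built from a Gaussian spectrum with a
    quadratic spectral phase 0.5*chirp*omega**2 (a linear chirp). peak_rabi_tl is
    the peak Rabi frequency the unchirped pulse would have; the pulse energy is
    the same for every chirp value."""
    spectrum = np.exp(-0.5 * (omega * TAU) ** 2)
    shaped = spectrum * np.exp(0.5j * chirp * omega ** 2)
    envelope = np.fft.fftshift(np.fft.ifft(shaped))
    reference = np.fft.fftshift(np.fft.ifft(spectrum))
    return peak_rabi_tl * envelope / np.abs(reference).max()

def transfer_probability(rabi):
    """Propagate the ground state under H(t) = 0.5*[[0, Omega],[conj(Omega), 0]]
    and return the final excited-state population; each step applies the exact
    2x2 propagator for a piecewise-constant Omega."""
    c_g, c_e = 1.0 + 0j, 0.0 + 0j
    for om in rabi:
        mag = abs(om)
        if mag < 1e-12:
            continue
        theta = 0.5 * mag * dt
        u = om / mag
        c_g, c_e = (np.cos(theta) * c_g - 1j * np.sin(theta) * u * c_e,
                    np.cos(theta) * c_e - 1j * np.sin(theta) * np.conj(u) * c_g)
    return abs(c_e) ** 2

# A resonant Gaussian pi-pulse has peak Rabi frequency pi / (sqrt(2*pi) * TAU).
PI_PEAK = np.pi / (np.sqrt(2 * np.pi) * TAU)
CHIRP = 20.0                # strong quadratic spectral phase (invented value)
CHIRPED_PEAK_TL = 3.0       # larger pulse area so the chirped pulse stays adiabatic

print("amplitude error   pi-pulse   chirped pulse")
for scale in (0.8, 1.0, 1.2):
    p_pi = transfer_probability(scale * shaped_rabi(PI_PEAK, 0.0))
    p_ch = transfer_probability(scale * shaped_rabi(CHIRPED_PEAK_TL, CHIRP))
    print(f"   x{scale:.1f}          {p_pi:.4f}      {p_ch:.4f}")
```

In this toy setting the chirped pulse typically maintains near-complete transfer across the amplitude errors while the pi-pulse degrades to roughly 0.90 at a 20 percent error, illustrating how spectral phase can be traded for robustness; the paper's filtered frequency-domain optimization pursues the same goal in a far more general and controlled way.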
Argumentation in Science Education: A Model-based Framework
NASA Astrophysics Data System (ADS)
Böttcher, Florian; Meisert, Anke
2011-02-01
The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed that arguments be understood as reasons for the appropriateness of a theoretical model that explains a certain phenomenon. Argumentation is considered to be the process of critically evaluating such a model, if necessary in relation to alternative models. Second, some methodological details are exemplified for the use of a model-based analysis in the concrete classroom context. Third, the application of the approach is compared with other analytical models to demonstrate the explanatory power and depth of the model-based perspective. Primarily, Toulmin's framework for the structural analysis of arguments is contrasted with the approach presented here, and it is demonstrated how common methodological and theoretical problems associated with Toulmin's framework can be overcome through a model-based perspective. Additionally, a second, more complex argumentative sequence is analysed according to the proposed analytical scheme to give a broader impression of its potential in practical use.