DOE Office of Scientific and Technical Information (OSTI.GOV)
Campbell, Philip LaRoche
At the end of his life, Stephen Jay Kline, longtime professor of mechanical engineering at Stanford University, completed a book on how to address complex systems. The title of the book is 'Conceptual Foundations of Multi-Disciplinary Thinking' (1995), but the topic of the book is systems. Kline first establishes certain limits that are characteristic of our conscious minds. Kline then establishes a complexity measure for systems and uses that complexity measure to develop a hierarchy of systems. Kline then argues that our minds, due to their characteristic limitations, are unable to model the complex systems in that hierarchy. Computers are of no help to us here. Our attempts at modeling these complex systems are based on the way we successfully model some simple systems, in particular, 'inert, naturally-occurring' objects and processes, such as what is the focus of physics. But complex systems overwhelm such attempts. As a result, the best we can do in working with these complex systems is to use a heuristic, what Kline calls the 'Guideline for Complex Systems.' Kline documents the problems that have developed due to 'oversimple' system models and from the inappropriate application of a system model from one domain to another. One prominent such problem is the Procrustean attempt to make the disciplines that deal with complex systems be 'physics-like.' Physics deals with simple systems, not complex ones, using Kline's complexity measure. The models that physics has developed are inappropriate for complex systems. Kline documents a number of the wasteful and dangerous fallacies of this type.
Research on complex 3D tree modeling based on L-system
NASA Astrophysics Data System (ADS)
Gang, Chen; Bin, Chen; Yuming, Liu; Hui, Li
2018-03-01
The L-system, as a fractal iterative system, can simulate complex geometric patterns. Based on field observation data of trees and the knowledge of forestry experts, this paper extracted modeling constraint rules and obtained an L-system rule set. Using self-developed L-system modeling software, the rule set was parsed to generate complex 3D tree models. The results showed that the geometric modeling method based on L-systems can describe the morphological structure of complex trees and generate 3D tree models.
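The abstract does not reproduce the forestry-derived rule set, but the core L-system mechanism it relies on, parallel string rewriting, is compact enough to sketch. The rules below are the classic "fractal plant" example, not the paper's rules:

```python
def expand(axiom, rules, iterations):
    """Apply L-system production rules to every symbol in parallel,
    'iterations' times; symbols without a rule are copied unchanged."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Classic "fractal plant" rules (illustrative only; the paper's rule set is
# not given in the abstract). By convention '[' / ']' push and pop turtle
# state, '+' / '-' turn, 'F' draws a segment, 'X' is a structural placeholder.
plant_rules = {"X": "F+[[X]-X]-F[-FX]+X", "F": "FF"}

derivation = expand("X", plant_rules, 2)
print(len(derivation))  # the string grows rapidly with each iteration
```

A 3D modeler then interprets the final string as turtle-graphics drawing commands to produce branch geometry.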
Complex systems as lenses on learning and teaching
NASA Astrophysics Data System (ADS)
Hurford, Andrew C.
From metaphors to mathematized models, the complexity sciences are changing the ways disciplines view their worlds, and ideas borrowed from complexity are increasingly being used to structure conversations and guide research on teaching and learning. The purpose of this corpus of research is to further those conversations and to extend complex systems ideas, theories, and modeling to curricula and to research on learning and teaching. A review of the literatures of learning and of complexity science and a discussion of the intersections between those disciplines are provided. The work reported represents an evolving model of learning qua complex system and that evolution is the result of iterative cycles of design research. One of the signatures of complex systems is the presence of scale invariance and this line of research furnishes empirical evidence of scale invariant behaviors in the activity of learners engaged in participatory simulations. The offered discussion of possible causes for these behaviors and chaotic phase transitions in human learning favors real-time optimization of decision-making as the means for producing such behaviors. Beyond theoretical development and modeling, this work includes the development of teaching activities intended to introduce pre-service mathematics and science teachers to complex systems. While some of the learning goals for this activity focused on the introduction of complex systems as a content area, we also used complex systems to frame perspectives on learning. Results of scoring rubrics and interview responses from students illustrate attributes of the proposed model of complex systems learning and also how these pre-service teachers made sense of the ideas. Correlations between established theories of learning and a complex adaptive systems model of learning are established and made explicit, and a means for using complex systems ideas for designing instruction is offered. 
It is a fundamental assumption of this research and researcher that complex systems ideas and understandings can be appropriated from more complexity-developed disciplines and put to use modeling and building increasingly productive understandings of learning and teaching.
On the Way to Appropriate Model Complexity
NASA Astrophysics Data System (ADS)
Höge, M.
2016-12-01
When statistical models are used to represent natural phenomena, they are often too simple or too complex - this much is known. But what exactly is model complexity? Among many other definitions, the complexity of a model can be conceptualized as a measure of statistical dependence between observations and parameters (Van der Linde, 2014). However, several issues remain when working with model complexity: A unique definition for model complexity is missing. Assuming a definition is accepted, how can model complexity be quantified? How can we use quantified complexity to improve modeling? Generally defined, "complexity is a measure of the information needed to specify the relationships between the elements of organized systems" (Bawden & Robinson, 2015). The complexity of a system changes as the knowledge about the system changes. For models this means that complexity is not a static concept: With more data or higher spatio-temporal resolution of parameters, the complexity of a model changes. There are essentially three categories into which all commonly used complexity measures can be classified: (1) An explicit representation of model complexity as "degrees of freedom" of a model, e.g. the effective number of parameters. (2) Model complexity as code length, a.k.a. "Kolmogorov complexity": The longer the shortest model code, the higher its complexity (e.g. in bits). (3) Complexity defined via the information entropy of parametric or predictive uncertainty. Preliminary results show that Bayes' theorem allows for incorporating all parts of the non-static concept of model complexity, such as data quality and quantity or parametric uncertainty. Therefore, we test how different approaches for measuring model complexity perform in comparison to a fully Bayesian model selection procedure. Ultimately, we want to find a measure that helps to assess the most appropriate model.
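Category (2), complexity as code length, can be illustrated with a compression proxy: true Kolmogorov complexity is uncomputable, but the compressed length of a model's description gives an upper bound. The model strings below are hypothetical examples, not from the paper:

```python
import zlib

def description_length_bits(model_code: str) -> int:
    """Proxy for Kolmogorov complexity: the length in bits of the
    zlib-compressed model description. Kolmogorov complexity itself is
    uncomputable; a compressor only yields an upper bound."""
    return 8 * len(zlib.compress(model_code.encode()))

# Two hypothetical model descriptions of differing structural richness.
simple_model = "y = a*x + b"
richer_model = "y = a*x**3 + b*x**2 + c*x + d + e*sin(f*x) + g*exp(h*x)"

print(description_length_bits(simple_model),
      description_length_bits(richer_model))
```

By this measure, the model needing the longer (incompressible) description is the more complex one, matching the intuition behind minimum-description-length model selection.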
Greek, Ray; Hansen, Lawrence A
2013-11-01
We surveyed the scientific literature regarding amyotrophic lateral sclerosis, the SOD1 mouse model, complex adaptive systems, evolution, drug development, animal models, and philosophy of science in an attempt to analyze the SOD1 mouse model of amyotrophic lateral sclerosis in the context of evolved complex adaptive systems. Humans and animals are examples of evolved complex adaptive systems. It is difficult to predict the outcome from perturbations to such systems because of the characteristics of complex systems. Modeling even one complex adaptive system in order to predict outcomes from perturbations is difficult. Predicting outcomes for one evolved complex adaptive system based on outcomes from a second, especially when the perturbation occurs at higher levels of organization, is even more problematic. Using animal models to predict human outcomes from perturbations such as disease and drugs should therefore have a very low predictive value. We present empirical evidence confirming this and suggest a theory to explain this phenomenon. We analyze the SOD1 mouse model of amyotrophic lateral sclerosis in order to illustrate this position. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.
Pattern-oriented modeling of agent-based complex systems: Lessons from ecology
Grimm, Volker; Revilla, Eloy; Berger, Uta; Jeltsch, Florian; Mooij, Wolf M.; Railsback, Steven F.; Thulke, Hans-Hermann; Weiner, Jacob; Wiegand, Thorsten; DeAngelis, Donald L.
2005-01-01
Agent-based complex systems are dynamic networks of many interacting agents; examples include ecosystems, financial markets, and cities. The search for general principles underlying the internal organization of such systems often uses bottom-up simulation models such as cellular automata and agent-based models. No general framework for designing, testing, and analyzing bottom-up models has yet been established, but recent advances in ecological modeling have come together in a general strategy we call pattern-oriented modeling. This strategy provides a unifying framework for decoding the internal organization of agent-based complex systems and may lead toward unifying algorithmic theories of the relation between adaptive behavior and system complexity.
Some Approaches to Modeling Complex Information Systems.
ERIC Educational Resources Information Center
Rao, V. Venkata; Zunde, Pranas
1982-01-01
Brief discussion of state-of-the-art of modeling complex information systems distinguishes between macrolevel and microlevel modeling of such systems. Network layout and hierarchical system models, simulation, information acquisition and dissemination, databases and information storage, and operating systems are described and assessed. Thirty-four…
Modeling Complex Cross-Systems Software Interfaces Using SysML
NASA Technical Reports Server (NTRS)
Mandutianu, Sanda; Morillo, Ron; Simpson, Kim; Liepack, Otfrid; Bonanne, Kevin
2013-01-01
The complex flight and ground systems for NASA human space exploration are designed, built, operated and managed as separate programs and projects. However, each system relies on one or more of the other systems in order to accomplish specific mission objectives, creating a complex, tightly coupled architecture. Thus, there is a fundamental need to understand how each system interacts with the others. To determine whether a model-based systems engineering approach could be utilized to assist with understanding the complex system interactions, the NASA Engineering and Safety Center (NESC) sponsored a task to develop an approach for performing cross-system behavior modeling. This paper presents the results of applying Model Based Systems Engineering (MBSE) principles using the Systems Modeling Language (SysML) to define cross-system behaviors and how they map to cross-system software interfaces documented in system-level Interface Control Documents (ICDs).
Acquisition of Complex Systemic Thinking: Mental Models of Evolution
ERIC Educational Resources Information Center
d'Apollonia, Sylvia T.; Charles, Elizabeth S.; Boyd, Gary M.
2004-01-01
We investigated the impact of introducing college students to complex adaptive systems on their subsequent mental models of evolution compared to those of students taught in the same manner but with no reference to complex systems. The students' mental models (derived from similarity ratings of 12 evolutionary terms using the pathfinder algorithm)…
Reliability analysis in interdependent smart grid systems
NASA Astrophysics Data System (ADS)
Peng, Hao; Kan, Zhe; Zhao, Dandan; Han, Jianmin; Lu, Jianfeng; Hu, Zhaolong
2018-06-01
Complex network theory is a useful way to study many real complex systems. In this paper, a reliability analysis model based on complex network theory is introduced for interdependent smart grid systems. We focus on understanding the structure of smart grid systems, their underlying network model, their interactions and relationships, and how cascading failures occur in interdependent smart grid systems. We propose a practical model for interdependent smart grid systems using complex network theory. Based on percolation theory, we also study the cascading failure effect and give a detailed mathematical analysis of failure propagation in such systems. We analyze the reliability of our proposed model under random attacks or failures by calculating the size of the giant functioning component in interdependent smart grid systems. Our simulation results show that there exists a threshold for the proportion of faulty nodes, beyond which the smart grid systems collapse, and we determine the critical values for different system parameters. In this way, the reliability analysis model based on complex network theory can be effectively utilized for anti-attack and protection purposes in interdependent smart grid systems.
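The abstract does not give the grid topology or exact parameters, so the following is only an assumption-level sketch of mutual percolation on two interdependent Erdos-Renyi graphs (in the style of Buldyrev-type models), illustrating the abrupt collapse beyond a failure threshold that the authors report:

```python
import random

def er_graph(n, avg_degree, rng):
    """Adjacency sets for an Erdos-Renyi G(n, p) random graph."""
    p = avg_degree / (n - 1)
    adj = [set() for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def giant_component(adj, alive):
    """Largest connected component restricted to the 'alive' node set."""
    seen, best = set(), set()
    for s in alive:
        if s in seen:
            continue
        comp, stack = {s}, [s]
        while stack:
            u = stack.pop()
            for v in adj[u]:
                if v in alive and v not in comp:
                    comp.add(v)
                    stack.append(v)
        seen |= comp
        if len(comp) > len(best):
            best = comp
    return best

def surviving_fraction(adj_a, adj_b, fail_fraction, rng):
    """Fail a random fraction of nodes in network A, then cascade:
    node i in A depends on node i in B and vice versa; only nodes in
    the mutual giant component keep functioning."""
    n = len(adj_a)
    alive_a = set(rng.sample(range(n), round(n * (1 - fail_fraction))))
    alive_b = set(range(n))
    prev = (-1, -1)
    while (len(alive_a), len(alive_b)) != prev:
        prev = (len(alive_a), len(alive_b))
        alive_a = giant_component(adj_a, alive_a)
        alive_b &= alive_a            # B loses nodes whose A partner failed
        alive_b = giant_component(adj_b, alive_b)
        alive_a &= alive_b            # and vice versa
    return len(alive_a) / n

rng = random.Random(1)
net_a, net_b = er_graph(300, 4.0, rng), er_graph(300, 4.0, rng)
s_small = surviving_fraction(net_a, net_b, 0.10, rng)
s_large = surviving_fraction(net_a, net_b, 0.60, rng)
print(s_small, s_large)  # small failures leave a large functioning core;
                         # past the threshold the coupled system collapses
```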
Moving alcohol prevention research forward-Part I: introducing a complex systems paradigm.
Apostolopoulos, Yorghos; Lemke, Michael K; Barry, Adam E; Lich, Kristen Hassmiller
2018-02-01
The drinking environment is a complex system consisting of a number of heterogeneous, evolving and interacting components, which exhibit circular causality and emergent properties. These characteristics reduce the efficacy of commonly used research approaches, which typically do not account for the underlying dynamic complexity of alcohol consumption and the interdependent nature of diverse factors influencing misuse over time. We use alcohol misuse among college students in the United States as an example for framing our argument for a complex systems paradigm. A complex systems paradigm, grounded in socio-ecological and complex systems theories and computational modeling and simulation, is introduced. Theoretical, conceptual, methodological and analytical underpinnings of this paradigm are described in the context of college drinking prevention research. The proposed complex systems paradigm can transcend limitations of traditional approaches, thereby fostering new directions in alcohol prevention research. By conceptualizing student alcohol misuse as a complex adaptive system, computational modeling and simulation methodologies and analytical techniques can be used. Moreover, use of participatory model-building approaches to generate simulation models can further increase stakeholder buy-in, understanding and policymaking. A complex systems paradigm for research into alcohol misuse can provide a holistic understanding of the underlying drinking environment and its long-term trajectory, which can elucidate high-leverage preventive interventions. © 2017 Society for the Study of Addiction.
NASA Technical Reports Server (NTRS)
Kavi, K. M.
1984-01-01
There have been a number of simulation packages developed for the purpose of designing, testing and validating computer systems, digital systems and software systems. Complex analytical tools based on Markov and semi-Markov processes have been designed to estimate the reliability and performance of simulated systems. Petri nets have received wide acceptance for modeling complex and highly parallel computers. In this research, data flow models for computer systems are investigated. Data flow models can be used to simulate both software and hardware in a uniform manner. Data flow simulation techniques provide the computer systems designer with a CAD environment which enables highly parallel complex systems to be defined, evaluated at all levels and finally implemented in either hardware or software. Inherent in the data flow concept is the hierarchical handling of complex systems. In this paper we describe how data flow can be used to model computer systems.
Mathematical Models to Determine Stable Behavior of Complex Systems
NASA Astrophysics Data System (ADS)
Sumin, V. I.; Dushkin, A. V.; Smolentseva, T. E.
2018-05-01
The paper analyzes a possibility to predict functioning of a complex dynamic system with a significant amount of circulating information and a large number of random factors impacting its functioning. Functioning of the complex dynamic system is described as a chaotic state, self-organized criticality and bifurcation. This problem may be resolved by modeling such systems as dynamic ones, without applying stochastic models and taking into account strange attractors.
Designing novel cellulase systems through agent-based modeling and global sensitivity analysis.
Apte, Advait A; Senger, Ryan S; Fong, Stephen S
2014-01-01
Experimental techniques allow engineering of biological systems to modify functionality; however, there still remains a need to develop tools to prioritize targets for modification. In this study, agent-based modeling (ABM) was used to build stochastic models of complexed and non-complexed cellulose hydrolysis, including enzymatic mechanisms for endoglucanase, exoglucanase, and β-glucosidase activity. Modeling results were consistent with experimental observations of higher efficiency in complexed systems than non-complexed systems and established relationships between specific cellulolytic mechanisms and overall efficiency. Global sensitivity analysis (GSA) of model results identified key parameters for improving overall cellulose hydrolysis efficiency including: (1) the cellulase half-life, (2) the exoglucanase activity, and (3) the cellulase composition. Overall, the following parameters were found to significantly influence cellulose consumption in a consolidated bioprocess (CBP): (1) the glucose uptake rate of the culture, (2) the bacterial cell concentration, and (3) the nature of the cellulase enzyme system (complexed or non-complexed). Broadly, these results demonstrate the utility of combining modeling and sensitivity analysis to identify key parameters and/or targets for experimental improvement.
Urbina, Angel; Mahadevan, Sankaran; Paez, Thomas L.
2012-03-01
Here, performance assessment of complex systems is ideally accomplished through system-level testing, but because such tests are expensive, they are seldom performed. On the other hand, for economic reasons, data from tests on individual components that are parts of complex systems are more readily available. The lack of system-level data leads to a need to build computational models of systems and use them for performance prediction in lieu of experiments. Because of their complexity, models are sometimes built in a hierarchical manner, starting with simple components, progressing to collections of components, and finally, to the full system. Quantification of uncertainty in the predicted response of a system model is required in order to establish confidence in the representation of actual system behavior. This paper proposes a framework for the complex, but very practical problem of quantification of uncertainty in system-level model predictions. It is based on Bayes networks and uses the available data at multiple levels of complexity (i.e., components, subsystem, etc.). Because epistemic sources of uncertainty were shown to be secondary in this application, only aleatory uncertainty is included in the present uncertainty quantification. An example showing application of the techniques to uncertainty quantification of measures of response of a real, complex aerospace system is included.
Risk Modeling of Interdependent Complex Systems of Systems: Theory and Practice.
Haimes, Yacov Y
2018-01-01
The emergence of the complexity characterizing our systems of systems (SoS) requires a reevaluation of the way we model, assess, manage, communicate, and analyze the risk thereto. Current models for risk analysis of emergent complex SoS are insufficient because too often they rely on the same risk functions and models used for single systems. These models commonly fail to incorporate the complexity derived from the networks of interdependencies and interconnectedness (I-I) characterizing SoS. There is a need to reevaluate currently practiced risk analysis to respond to this reality by examining, and thus comprehending, what makes emergent SoS complex. The key to evaluating the risk to SoS lies in understanding the genesis of the I-I characterizing systems, manifested through shared states and other essential entities within and among the systems that constitute SoS. The term "essential entities" includes shared decisions, resources, functions, policies, decision-makers, stakeholders, organizational setups, and others. This undertaking can be accomplished by building on state-space theory, which is fundamental to systems engineering and process control. This article presents a theoretical and analytical framework for modeling the risk to SoS with two case studies performed with the MITRE Corporation and demonstrates the pivotal contributions made by shared states and other essential entities to modeling and analysis of the risk to complex SoS. A third case study highlights the multifarious representations of SoS, which require harmonizing the risk analysis process currently applied to single systems when applied to complex SoS. © 2017 Society for Risk Analysis.
An Exploratory Study of the Butterfly Effect Using Agent-Based Modeling
NASA Technical Reports Server (NTRS)
Khasawneh, Mahmoud T.; Zhang, Jun; Shearer, Nevan E. N.; Rodriquez-Velasquez, Elkin; Bowling, Shannon R.
2010-01-01
This paper provides insights into the behavior of chaotic complex systems and the sensitive dependence of such systems on their initial starting conditions. How much does a small change in the initial conditions of a complex system affect it in the long term? Do complex systems exhibit what is called the "butterfly effect"? This paper uses an agent-based modeling approach to address these questions. An existing model from the NetLogo library was extended in order to compare chaotic complex systems with near-identical initial conditions. Results show that small changes in initial starting conditions can have a huge impact on the behavior of chaotic complex systems. The term "butterfly effect" is attributed to the work of Edward Lorenz [1]. It is used to describe the sensitive dependence of the behavior of chaotic complex systems on the initial conditions of these systems. The metaphor refers to the notion that a butterfly flapping its wings somewhere may cause extreme changes in the ecological system's behavior in the future, such as a hurricane.
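The paper's NetLogo model is not reproduced here, but the same sensitive dependence can be shown with the logistic map, a standard minimal example of chaos: at r = 4, two trajectories starting 1e-10 apart diverge to an order-one separation within a few dozen steps.

```python
def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the chaotic logistic map x -> r*x*(1-x) from x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

traj_a = logistic_trajectory(0.2)
traj_b = logistic_trajectory(0.2 + 1e-10)   # near-identical initial condition

# Separation between the two trajectories at each step.
gap = [abs(x - y) for x, y in zip(traj_a, traj_b)]
print(gap[0], gap[25], gap[50])  # roughly doubles per step until saturating
```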
Challenges in Developing Models Describing Complex Soil Systems
NASA Astrophysics Data System (ADS)
Simunek, J.; Jacques, D.
2014-12-01
Quantitative mechanistic models that consider basic physical, mechanical, chemical, and biological processes have the potential to be powerful tools for integrating our understanding of complex soil systems, and the soil science community has often called for models that include a large number of these diverse processes. However, once attempts have been made to develop such models, the response from the community has not always been overwhelming, especially once it was discovered that these models are consequently highly complex: they require a large number of parameters, not all of which can be easily (or at all) measured or identified and many of which carry large uncertainties, and they demand from their users deep knowledge of most or all of the implemented physical, mechanical, chemical, and biological processes. The real, or perceived, complexity of these models then discourages users from applying them even to relatively simple problems for which they would be perfectly adequate. Due to the nonlinear nature and chemical/biological complexity of soil systems, it is also virtually impossible to verify these types of models analytically, raising doubts about their applicability. Code inter-comparison, which is then likely the most suitable method to assess code capabilities and model performance, requires the existence of multiple models with similar or overlapping capabilities, which may not always exist. It is thus a challenge not only to develop models describing complex soil systems, but also to persuade the soil science community to use them. As a result, complex quantitative mechanistic models remain an underutilized tool in soil science research. We will demonstrate some of the challenges discussed above using our own efforts in developing quantitative mechanistic models (such as HP1/2) for complex soil systems.
Schryver, Jack; Nutaro, James; Shankar, Mallikarjun
2015-10-30
An agent-based simulation model hierarchy emulating disease states and behaviors critical to the progression of type 2 diabetes was designed and implemented in the DEVS framework. The models are translations of basic elements of an established system dynamics model of diabetes, which mimics diabetes progression over an aggregated U.S. population; in this model hierarchy it was disaggregated and reconstructed bottom-up at the individual (agent) level. Four levels of model complexity were defined in order to systematically evaluate which parameters are needed to mimic outputs of the system dynamics model. The four estimated models attempted to replicate stock counts representing disease states in the system dynamics model, while estimating the impacts of an elderliness factor, an obesity factor, and health-related behavioral parameters. Health-related behavior was modeled as a simple realization of the Theory of Planned Behavior: a joint function of individual attitude and the diffusion of social norms spreading over each agent's social network. Although the most complex agent-based simulation model contained 31 adjustable parameters, all models were considerably less complex than the system dynamics model, which required numerous time series inputs to make its predictions. All three elaborations of the baseline model provided significantly improved fits to the output of the system dynamics model. The performances of the baseline agent-based model and its extensions illustrate a promising approach to translating complex system dynamics models into agent-based alternatives that are both conceptually simpler and capable of capturing the main effects of complex local agent-agent interactions.
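As an assumption-level illustration (the paper's calibrated parameters, weights, and network are not given), the behavioral rule described above, adoption probability as a joint function of individual attitude and the social norm among network neighbors, can be sketched as:

```python
import random

# Toy Theory-of-Planned-Behavior diffusion on a ring network. All weights,
# rates, and the topology are hypothetical choices for illustration.
rng = random.Random(42)
N = 100
attitude = [rng.random() for _ in range(N)]          # fixed individual attitude
adopted = [rng.random() < 0.05 for _ in range(N)]    # a few early adopters
neighbors = [((i - 1) % N, (i + 1) % N) for i in range(N)]  # ring network

W_ATTITUDE, W_NORM, RATE = 0.3, 0.7, 0.1
for step in range(50):
    # Social norm: fraction of an agent's neighbors who have adopted.
    norm = [sum(adopted[j] for j in neighbors[i]) / 2 for i in range(N)]
    prob = [W_ATTITUDE * attitude[i] + W_NORM * norm[i] for i in range(N)]
    # Adoption is absorbing: once adopted, an agent stays adopted.
    adopted = [adopted[i] or rng.random() < RATE * prob[i] for i in range(N)]

adoption_rate = sum(adopted) / N
print(adoption_rate)  # behavior spreads well beyond the initial adopters
```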
Application of Complex Adaptive Systems in Portfolio Management
ERIC Educational Resources Information Center
Su, Zheyuan
2017-01-01
Simulation-based methods are becoming a promising research tool in financial markets. A general Complex Adaptive System can be tailored to different application scenarios. Based on the current research, we built two models that would benefit portfolio management by utilizing Complex Adaptive Systems (CAS) in an Agent-Based Modeling (ABM) approach.…
ADAM: analysis of discrete models of biological systems using computer algebra.
Hinkelmann, Franziska; Brandon, Madison; Guang, Bonny; McNeill, Rustin; Blekherman, Grigoriy; Veliz-Cuba, Alan; Laubenbacher, Reinhard
2011-07-20
Many biological systems are modeled qualitatively with discrete models, such as probabilistic Boolean networks, logical models, Petri nets, and agent-based models, to gain a better understanding of them. The computational complexity to analyze the complete dynamics of these models grows exponentially in the number of variables, which impedes working with complex models. There exist software tools to analyze discrete models, but they either lack the algorithmic functionality to analyze complex models deterministically or they are inaccessible to many users as they require understanding the underlying algorithm and implementation, do not have a graphical user interface, or are hard to install. Efficient analysis methods that are accessible to modelers and easy to use are needed. We propose a method for efficiently identifying attractors and introduce the web-based tool Analysis of Dynamic Algebraic Models (ADAM), which provides this and other analysis methods for discrete models. ADAM converts several discrete model types automatically into polynomial dynamical systems and analyzes their dynamics using tools from computer algebra. Specifically, we propose a method to identify attractors of a discrete model that is equivalent to solving a system of polynomial equations, a long-studied problem in computer algebra. Based on extensive experimentation with both discrete models arising in systems biology and randomly generated networks, we found that the algebraic algorithms presented in this manuscript are fast for systems with the structure maintained by most biological systems, namely sparseness and robustness. For a large set of published complex discrete models, ADAM identified the attractors in less than one second. Discrete modeling techniques are a useful tool for analyzing complex biological systems and there is a need in the biological community for accessible efficient analysis tools. 
ADAM provides analysis methods based on mathematical algorithms as a web-based tool for several different input formats, and it makes analysis of complex models accessible to a larger community, as it is platform independent as a web-service and does not require understanding of the underlying mathematics.
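The polynomial conversion described in the abstract can be sketched in miniature. Over GF(2), AND becomes multiplication, OR(x, y) becomes x + y + xy, NOT(x) becomes 1 + x, and the steady states of the network are exactly the solutions of f(x) = x. A toy 3-node network (our illustration, not ADAM's implementation):

```python
from itertools import product

# Toy 3-node Boolean network written as polynomials over GF(2):
# AND -> x*y, OR -> x + y + x*y, NOT -> 1 + x (all mod 2).
def f(x):
    x1, x2, x3 = x
    return (
        (x2 * x3) % 2,            # x1' = x2 AND x3
        (x1 + x3 + x1 * x3) % 2,  # x2' = x1 OR x3
        (x1 + x2 + x1 * x2) % 2,  # x3' = x1 OR x2
    )

# Fixed points (steady states) satisfy f(x) = x; brute force is fine for a
# toy model, while ADAM solves the polynomial system symbolically instead.
fixed_points = [x for x in product((0, 1), repeat=3) if f(x) == x]
print(fixed_points)  # → [(0, 0, 0), (1, 1, 1)]
```

ADAM avoids the brute-force enumeration over all 2^n states used here, which is what allows it to handle large published models in under a second.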
Development of structural model of adaptive training complex in ergatic systems for professional use
NASA Astrophysics Data System (ADS)
Obukhov, A. D.; Dedov, D. L.; Arkhipov, A. E.
2018-03-01
The article considers the structural model of the adaptive training complex (ATC), which reflects the interrelations between the hardware, software and mathematical models of the ATC and describes the processes in this subject area. A description of the main components of the software and hardware complex, their interaction and their functioning within the common system is given. The article also gives a brief description of the mathematical models of personnel activity, the technical system and external influences, whose interactions formalize the regularities of ATC functioning. The study of the main objects of training complexes and of the connections between them makes practical implementation of ATCs in ergatic systems for professional use possible.
Strategies and Rubrics for Teaching Complex Systems Theory to Novices (Invited)
NASA Astrophysics Data System (ADS)
Fichter, L. S.
2010-12-01
Bifurcation. Self-similarity. Fractals. Sensitive dependence. Agents. Self-organized criticality. Avalanche behavior. Power laws. Strange attractors. Emergence. The language of complexity is fundamentally different from the language of equilibrium. If students do not know these phenomena, and what they tell us about the pulse of dynamic systems, complex systems will be opaque. A complex system is a group of agents (individual interacting units, like birds in a flock, sand grains in a ripple, or individual friction units along a fault zone) existing far from equilibrium, interacting through positive and negative feedbacks, following simple rules, and forming interdependent, dynamic, evolutionary networks. Complex systems produce behaviors that cannot be predicted deductively from knowledge of the behaviors of the individual components themselves; they must be experienced. What complexity theory demonstrates is that, by following simple rules, all the agents end up coordinating their behavior (self-organizing) so that what emerges is not chaos, but meaningful patterns. How can we introduce freshman, non-science, general-education students to complex systems theories in 3 to 5 classes, in a way that they really get it and can use the principles to understand real systems? Complex systems theories are not a series of unconnected or disconnected equations or models; they are developed as narratives that make sense of how all the pieces and properties are interrelated. The principles of complex systems must be taught as deliberately and systematically as the equilibrium principles normally taught, like the systematic progression from pre-algebra and geometry to algebra. We have developed a sequence of logically connected narratives (strategies and rubrics) that introduce complex systems principles using models that can be simulated on a computer, in class, in real time. The learning progression has a series of 12 models (e.g. logistic systems, bifurcation diagrams, genetic algorithms, etc.) leading to 19 learning outcomes that encompass most of the universality properties that characterize complex systems. They are developed in a specific order to achieve specific ends of understanding. We use these models in various depths and formats in courses ranging from gen-ed courses, to evolutionary systems and environmental systems, to upper-level geology courses. Depending on the goals of a course, the learning outcomes can be applied to understanding many other complex systems, e.g. oscillating chemical reactions (reaction-diffusion and activator-inhibitor systems), autocatalytic networks, hysteresis (bistable) systems, networks, and the rise and collapse of complex societies. We use these and other complex systems concepts in various classes to talk about the origin of life, ecosystem organization, game theory, extinction events, and environmental system behaviors. The applications are almost endless. The complete learning progression with models, computer programs, experiments, and learning outcomes is available at: www.jmu.edu/geology/ComplexEvolutionarySystems/
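The first model in the progression, the logistic map, is small enough to run live in class; a minimal sketch (the parameter values are ours, chosen for illustration):

```python
def logistic_orbit(r, x0=0.5, transient=500, keep=8):
    """Iterate x -> r*x*(1-x), discard the transient, return the attractor."""
    x = x0
    for _ in range(transient):
        x = r * x * (1 - x)
    orbit = []
    for _ in range(keep):
        x = r * x * (1 - x)
        orbit.append(round(x, 4))
    return orbit

# r = 2.8: single fixed point at 1 - 1/r; r = 3.2: period-2 cycle
# (first bifurcation); r = 3.9: chaos. Same rule, three behaviors.
for r in (2.8, 3.2, 3.9):
    print(r, sorted(set(logistic_orbit(r))))
```

Sweeping r and plotting the attractor values gives the bifurcation diagram mentioned in the abstract.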
An Efficient Model-based Diagnosis Engine for Hybrid Systems Using Structural Model Decomposition
NASA Technical Reports Server (NTRS)
Bregon, Anibal; Narasimhan, Sriram; Roychoudhury, Indranil; Daigle, Matthew; Pulido, Belarmino
2013-01-01
Complex hybrid systems are present in a large range of engineering applications, like mechanical systems, electrical circuits, or embedded computation systems. The behavior of these systems combines continuous dynamics and discrete-event dynamics, which makes accurate and timely online fault diagnosis difficult. The Hybrid Diagnosis Engine (HyDE) offers flexibility to the diagnosis application designer to choose the modeling paradigm and the reasoning algorithms. The HyDE architecture supports the use of multiple modeling paradigms at the component and system level. However, HyDE faces some problems regarding performance in terms of complexity and time. Our focus in this paper is on developing efficient model-based methodologies for online fault diagnosis in complex hybrid systems. To do this, we propose a diagnosis framework where structural model decomposition is integrated within the HyDE diagnosis framework to reduce the computational complexity associated with the fault diagnosis of hybrid systems. As a case study, we apply our approach to a diagnostic testbed, the Advanced Diagnostics and Prognostics Testbed (ADAPT), using real data.
Stochastic simulation of multiscale complex systems with PISKaS: A rule-based approach.
Perez-Acle, Tomas; Fuenzalida, Ignacio; Martin, Alberto J M; Santibañez, Rodrigo; Avaria, Rodrigo; Bernardin, Alejandro; Bustos, Alvaro M; Garrido, Daniel; Dushoff, Jonathan; Liu, James H
2018-03-29
Computational simulation is a widely employed methodology to study the dynamic behavior of complex systems. Although common approaches are based either on ordinary differential equations or on stochastic differential equations, these techniques make several assumptions which, when it comes to biological processes, often lead to unrealistic models. Among other limitations, model approaches based on differential equations entangle kinetics and causality, fail as complexity increases, separate knowledge from models, and assume that the average behavior of the population encompasses any individual deviation. To overcome these limitations, simulations based on the Stochastic Simulation Algorithm (SSA) appear as a suitable approach to model complex biological systems. In this work, we review three different models executed in PISKaS: a rule-based framework to produce multiscale stochastic simulations of complex systems. These models span multiple time and spatial scales, ranging from gene regulation up to game theory. In the first example, we describe a model of the core regulatory network of gene expression in Escherichia coli, highlighting the continuous model-improvement capabilities of PISKaS. The second example describes a hypothetical outbreak of the Ebola virus occurring in a compartmentalized environment resembling cities and highways. Finally, in the last example, we illustrate a stochastic model of the prisoner's dilemma, a common approach from the social sciences describing complex interactions involving trust within human populations. As a whole, these models demonstrate the capabilities of PISKaS, providing fertile scenarios in which to explore the dynamics of complex systems. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
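The SSA underlying rule-based simulators of this kind can be illustrated with Gillespie's direct method on a toy birth-death process; the rates below are illustrative and not taken from any of the three models:

```python
import random

def gillespie_birth_death(k_birth=10.0, k_death=0.1, x0=0, t_end=100.0, seed=1):
    """Direct-method SSA for 0 -> X (rate k_birth) and X -> 0 (rate k_death*x)."""
    random.seed(seed)
    t, x = 0.0, x0
    while t < t_end:
        a_birth = k_birth
        a_death = k_death * x
        a_total = a_birth + a_death
        # Time to the next reaction is exponential with rate a_total.
        t += random.expovariate(a_total)
        # Choose which reaction fired, proportionally to its propensity.
        if random.random() * a_total < a_birth:
            x += 1
        else:
            x -= 1
    return x

# The stationary mean is k_birth / k_death = 100; any single stochastic
# run fluctuates around it, unlike the single trajectory of an ODE model.
print(gillespie_birth_death())
```

Repeating the run with different seeds yields the distribution of outcomes that deterministic ODE models average away.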
Hierarchical modeling and robust synthesis for the preliminary design of large scale complex systems
NASA Astrophysics Data System (ADS)
Koch, Patrick Nathan
Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation these issues are addressed through the development of a method for hierarchical robust preliminary design exploration, which facilitates concurrent system and subsystem design exploration and the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: (1) hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts and allowing integration of subproblems for system synthesis; (2) statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration; and (3) noise modeling techniques for implementing robust preliminary design when approximate models are employed. The method and associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system; the turbofan system-level problem is partitioned into engine cycle and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation.
Behavior of the gypsy moth life system model and development of synoptic model formulations
J. J. Colbert; Xu Rumei
1991-01-01
Aims of the research: The gypsy moth life system model (GMLSM) is a complex model which incorporates numerous components (both biotic and abiotic) and ecological processes. It is a detailed simulation model which has much biological reality. However, it has not yet been tested with life system data. For such complex models, evaluation and testing cannot be adequately...
Research Area 3: Mathematics (3.1 Modeling of Complex Systems)
2017-10-31
RESEARCH AREA 3: MATHEMATICS (3.1 Modeling of Complex Systems). Proposals should be directed to Dr. John Lavery. Sponsoring/monitoring agency: U.S. Army Research Office, P.O. Box 12211, Research ...
Modeling of the Human - Operator in a Complex System Functioning Under Extreme Conditions
NASA Astrophysics Data System (ADS)
Getzov, Peter; Hubenova, Zoia; Yordanov, Dimitar; Popov, Wiliam
2013-12-01
Problems related to the explication of sophisticated control systems for objects operating under extreme conditions are examined, along with the impact of the effectiveness of the operator's activity on the system as a whole. The necessity of creating complex simulation models reflecting operator activity is discussed. The organizational and technical system of an unmanned aviation complex is described as a sophisticated ergatic system. A computer realization of the main subsystems of the algorithmic model of man as a controlling system is implemented, and specialized software for data processing and analysis is developed. An original computer model of man as a tracking system has been implemented. A model of an unmanned complex for operator training and for the formation of a mental model in emergency situations, implemented in the Matlab-Simulink environment, has been synthesized. As a unit of the control loop, the pilot (operator) is viewed, in simplified form, as an autocontrol system consisting of three main interconnected subsystems: sensitive organs (perception sensors), the central nervous system, and executive organs (muscles of the arms, legs and back). A theoretical-data model for predicting the level of operator information load in ergatic systems is proposed. It allows the assessment and prediction of the effectiveness of a real working operator. A simulation model of operator activity during takeoff, based on Petri nets, has been synthesized.
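The "man as a tracking system" view is often approximated, in its crudest form, as a proportional corrector acting on a delayed error signal, the delay standing in for the reaction lag of perception and the central nervous system. A sketch with made-up constants (not the authors' model):

```python
def track(target, gain=0.2, delay=3, steps=200):
    """Operator model: respond to the tracking error observed `delay` steps ago."""
    y = 0.0
    errors = [0.0] * delay          # errors perceived but not yet acted on
    out = []
    for k in range(steps):
        errors.append(target(k) - y)
        y += gain * errors.pop(0)   # act on the oldest (delayed) error
        out.append(y)
    return out

# Step input: the operator overshoots, oscillates, then settles on the target.
final = track(lambda k: 1.0)[-1]
print(round(final, 3))  # → 1.0
```

Raising the gain or the delay past a threshold makes this loop unstable, which is one reason operator reaction time matters in ergatic systems.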
Multifaceted Modelling of Complex Business Enterprises
2015-01-01
We formalise and present a new generic multifaceted complex system approach for modelling complex business enterprises. Our method has a strong focus on integrating the various data types available in an enterprise, which represent the diverse perspectives of various stakeholders. We explain the challenges faced and define a novel approach to converting diverse data types into usable Bayesian probability forms. The data types that can be integrated include historic data, survey data, management planning data, expert knowledge and incomplete data. The structural complexities of the complex system modelling process, based on various decision contexts, are also explained along with a solution. This new application of complex system models as a management tool for decision making is demonstrated using a railway transport case study. The case study demonstrates how the new approach can be utilised to develop a customised decision support model for a specific enterprise. Various decision scenarios are also provided to illustrate the versatility of the decision model at different phases of enterprise operations such as planning and control. PMID:26247591
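The conversion of diverse data types into "usable Bayesian probability forms" can be illustrated with a conjugate update that fuses an expert-elicited prior with historic count data; the railway numbers below are hypothetical, not from the case study:

```python
# Hypothetical illustration (not the paper's model): fuse an expert-elicited
# prior with historic count data via a Beta-Binomial conjugate update.
def beta_binomial_update(alpha, beta, successes, trials):
    """Posterior Beta parameters after observing `successes` in `trials`."""
    return alpha + successes, beta + trials - successes

# Expert survey suggests roughly 20% of trains are delayed: Beta(2, 8) prior.
# Historic data: 30 delays observed in 100 departures.
a, b = beta_binomial_update(2.0, 8.0, 30, 100)
posterior_mean = a / (a + b)
print(round(posterior_mean, 3))  # → 0.291
```

Both sources end up on a common probabilistic footing: the survey sets the prior, the historic data supply the likelihood, and the posterior feeds the decision model.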
Multifaceted Modelling of Complex Business Enterprises.
Chakraborty, Subrata; Mengersen, Kerrie; Fidge, Colin; Ma, Lin; Lassen, David
2015-01-01
Quantitative computational models of molecular self-assembly in systems biology
Thomas, Marcus; Schwartz, Russell
2017-01-01
Molecular self-assembly is the dominant form of chemical reaction in living systems, yet efforts at systems biology modeling are only beginning to appreciate the need for and challenges to accurate quantitative modeling of self-assembly. Self-assembly reactions are essential to nearly every important process in cell and molecular biology and handling them is thus a necessary step in building comprehensive models of complex cellular systems. They present exceptional challenges, however, to standard methods for simulating complex systems. While the general systems biology world is just beginning to deal with these challenges, there is an extensive literature dealing with them for more specialized self-assembly modeling. This review will examine the challenges of self-assembly modeling, nascent efforts to deal with these challenges in the systems modeling community, and some of the solutions offered in prior work on self-assembly specifically. The review concludes with some consideration of the likely role of self-assembly in the future of complex biological system models more generally. PMID:28535149
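The simplest quantitative self-assembly model is reversible dimerization under mass-action kinetics; a minimal forward-Euler sketch (the rate constants are illustrative, not from the review):

```python
def dimerization(a0=1.0, kf=1.0, kr=0.1, dt=1e-3, steps=20000):
    """Mass-action kinetics for A + A <-> A2 (forward rate kf, reverse kr)."""
    a, d = a0, 0.0  # monomer and dimer concentrations
    for _ in range(steps):
        rate = kf * a * a - kr * d  # net dimer formation rate
        a -= 2 * rate * dt          # each dimer consumes two monomers
        d += rate * dt
    return a, d

a, d = dimerization()
# Equilibrium satisfies kf*a^2 = kr*d with total mass a + 2d conserved,
# giving a = 0.2, d = 0.4 for these constants.
print(round(a, 3), round(d, 3))  # → 0.2 0.4
```

Real assembly models chain many such reactions together (nucleation, elongation, and higher-order intermediates), which is where the combinatorial difficulty described in the abstract comes from.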
Quantitative computational models of molecular self-assembly in systems biology.
Thomas, Marcus; Schwartz, Russell
2017-05-23
Human systems dynamics: Toward a computational model
NASA Astrophysics Data System (ADS)
Eoyang, Glenda H.
2012-09-01
A robust and reliable computational model of complex human systems dynamics could support advancements in theory and practice for social systems at all levels, from intrapersonal experience to global politics and economics. Models of human interactions have evolved from traditional, Newtonian systems assumptions, which served a variety of practical and theoretical needs of the past. Another class of models has been inspired and informed by models and methods from nonlinear dynamics, chaos, and complexity science. None of the existing models, however, is able to represent the open, high-dimensional, and nonlinear self-organizing dynamics of social systems. An effective model will represent interactions at multiple levels to generate emergent patterns of the social and political life of individuals and groups. Existing models and modeling methods are considered and assessed against characteristic pattern-forming processes in observed and experienced phenomena of human systems. A conceptual model, the CDE Model, based on the conditions for self-organizing in human systems, is explored as an alternative to existing models and methods. While the new model overcomes the limitations of previous models, it also provides an explanatory base and a foundation for prospective analysis to inform real-time meaning making and action taking in response to complex conditions in the real world. An invitation is extended to readers to engage in developing a computational model that incorporates the assumptions, meta-variables, and relationships of this open, high-dimensional, and nonlinear conceptual model of the complex dynamics of human systems.
Hybrid estimation of complex systems.
Hofbaur, Michael W; Williams, Brian C
2004-10-01
Modern automated systems evolve both continuously and discretely, and hence require estimation techniques that go well beyond the capability of a typical Kalman filter. Multiple-model (MM) estimation schemes track these system evolutions by applying a bank of filters, one for each discrete system mode. Modern systems, however, are often composed of many interconnected components that exhibit rich behaviors due to complex, system-wide interactions. Modeling these systems leads to complex stochastic hybrid models that capture the large number of operational and failure modes. This large number of modes makes a typical MM estimation approach infeasible for online estimation. This paper analyzes the shortcomings of MM estimation and then introduces an alternative hybrid estimation scheme that can efficiently estimate complex systems with a large number of modes. It utilizes search techniques from the toolkit of model-based reasoning in order to focus the estimation on the set of most likely modes, without missing symptoms that might be hidden amongst the system noise. In addition, we present a novel approach to hybrid estimation in the presence of unknown behavioral modes. This leads to an overall hybrid estimation scheme for complex systems that robustly copes with unforeseen situations in a degraded, but fail-safe, manner.
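The mode-weighting step at the heart of any MM scheme is a Bayesian update of mode probabilities by the likelihood of the measurement residual under each mode's filter; a single-step sketch with toy numbers (not the authors' estimator):

```python
import math

def gaussian_pdf(x, mean, var):
    """Density of a scalar Gaussian, used as the residual likelihood."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def update_mode_probs(priors, predictions, measurement, var=1.0):
    """Reweight mode probabilities by how well each mode predicted the data."""
    likelihoods = [gaussian_pdf(measurement, p, var) for p in predictions]
    posts = [pr * lk for pr, lk in zip(priors, likelihoods)]
    total = sum(posts)
    return [p / total for p in posts]

# Two modes (nominal, failed) predict different outputs; a measurement of
# 4.8 makes the initially unlikely failure mode dominate after one update.
probs = update_mode_probs([0.9, 0.1], [0.0, 5.0], 4.8)
print([round(p, 4) for p in probs])
```

With thousands of modes, computing this for every mode at every step is what becomes infeasible, motivating the search-based focusing described in the abstract.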
Jetha, Arif; Pransky, Glenn; Hettinger, Lawrence J
2016-01-01
Work disability (WD) is characterized by variable and occasionally undesirable outcomes. The underlying determinants of WD outcomes include patterns of dynamic relationships among health, personal, organizational and regulatory factors that have been challenging to characterize, and inadequately represented by contemporary WD models. System dynamics modeling (SDM) methodology applies a sociotechnical systems thinking lens to view WD systems as comprising a range of influential factors linked by feedback relationships. SDM can potentially overcome limitations in contemporary WD models by uncovering causal feedback relationships, and conceptualizing dynamic system behaviors. It employs a collaborative and stakeholder-based model building methodology to create a visual depiction of the system as a whole. SDM can also enable researchers to run dynamic simulations to provide evidence of anticipated or unanticipated outcomes that could result from policy and programmatic intervention. SDM may advance rehabilitation research by providing greater insights into the structure and dynamics of WD systems while helping to understand inherent complexity. Challenges related to data availability, determining validity, and the extensive time and technical skill requirements for model building may limit SDM's use in the field and should be considered. Contemporary work disability (WD) models provide limited insight into complexity associated with WD processes. System dynamics modeling (SDM) has the potential to capture complexity through a stakeholder-based approach that generates a simulation model consisting of multiple feedback loops. SDM may enable WD researchers and practitioners to understand the structure and behavior of the WD system as a whole, and inform development of improved strategies to manage straightforward and complex WD cases.
ADAM: Analysis of Discrete Models of Biological Systems Using Computer Algebra
2011-01-01
PMID:21774817
Derivative Free Optimization of Complex Systems with the Use of Statistical Machine Learning Models
2015-09-12
AFRL-AFOSR-VA-TR-2015-0278: Derivative Free Optimization of Complex Systems with the Use of Statistical Machine Learning Models. Katya Scheinberg. Grant number: FA9550-11-1-0239. Subject terms: optimization, derivative-free optimization, statistical machine learning.
Mathematical and Computational Modeling in Complex Biological Systems
Li, Wenyang; Zhu, Xiaoliang
2017-01-01
The biological processes and molecular functions involved in cancer progression remain difficult for biologists and clinical doctors to understand. Recent developments in high-throughput technologies push systems biology toward more precise models for complex diseases. Computational and mathematical models are gradually being used to help us understand the omics data produced by high-throughput experimental techniques. The use of computational models in systems biology allows us to explore the pathogenesis of complex diseases, improve our understanding of the latent molecular mechanisms, and promote treatment strategy optimization and new drug discovery. Currently, it is urgent to bridge the gap between the development of high-throughput technologies and the systemic modeling of biological processes in cancer research. In this review, we first examine several typical mathematical modeling approaches for biological systems at different scales and analyze their characteristics, advantages, applications, and limitations. Next, three potential research directions in systems modeling are summarized. To conclude, this review provides an update on important solutions using computational modeling approaches in systems biology. PMID:28386558
Mathematical and Computational Modeling in Complex Biological Systems.
Ji, Zhiwei; Yan, Ke; Li, Wenyang; Hu, Haigen; Zhu, Xiaoliang
2017-01-01
DOT National Transportation Integrated Search
2013-01-01
The ability to model and understand the complex dynamics of intelligent agents as they interact within a transportation system could lead to revolutionary advances in transportation engineering and intermodal surface transportation in the United Stat...
ERIC Educational Resources Information Center
Brown, Callum
2008-01-01
Understanding the dynamic behaviour of organisations is challenging and this study uses a model of complex adaptive systems as a generative metaphor to address this challenge. The research question addressed is: How might a conceptual model of complex adaptive systems be used to assist in understanding the dynamic nature of organisations? Using an…
Tools and techniques for developing policies for complex and uncertain systems.
Bankes, Steven C
2002-05-14
Agent-based models (ABM) are examples of complex adaptive systems, which can be characterized as those systems for which no model less complex than the system itself can accurately predict in detail how the system will behave at future times. Consequently, the standard tools of policy analysis, based as they are on devising policies that perform well on some best estimate model of the system, cannot be reliably used for ABM. This paper argues that policy analysis by using ABM requires an alternative approach to decision theory. The general characteristics of such an approach are described, and examples are provided of its application to policy analysis.
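A minimal agent-based model makes the paper's point concrete: even with trivially simple rules, the aggregate outcome is easier to simulate than to predict in detail. A toy opinion model on a ring (our illustration, not from the paper):

```python
import random

# Agents on a ring adopt the majority opinion among themselves and their two
# neighbours. Local rules, global pattern: opinions settle into blocks whose
# final layout depends sensitively on the initial configuration.
def step(states):
    n = len(states)
    return [
        1 if states[i - 1] + states[i] + states[(i + 1) % n] >= 2 else 0
        for i in range(n)
    ]

random.seed(0)
states = [random.randint(0, 1) for _ in range(50)]
for _ in range(30):
    states = step(states)
print(sum(states), "of 50 agents hold opinion 1 after 30 steps")
```

Policy analysis with such a model proceeds by running many initial conditions and parameter settings and asking which policies perform acceptably across the ensemble, rather than optimizing against one "best estimate" run.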
NASA Astrophysics Data System (ADS)
Haghnevis, Moeed
The main objective of this research is to develop an integrated method to study emergent behavior and the consequences of evolution and adaptation in engineered complex adaptive systems (ECASs). A multi-layer conceptual framework and modeling approach, including behavioral and structural aspects, is provided to describe the structure of a class of engineered complex systems and predict their future adaptive patterns. The approach allows the examination of complexity in the structure and behavior of components as a result of their connections and in relation to their environment. This research describes and uses the major differences between natural complex adaptive systems (CASs) and artificial/engineered CASs to build a framework and platform for ECASs. While this framework focuses on the critical factors of an engineered system, it also enables one to synthetically employ engineering and mathematical models to analyze and measure complexity in such systems. In this way, concepts of complex systems science are adapted to management science and system-of-systems engineering. In particular, an integrated consumer-based optimization and agent-based modeling (ABM) platform is presented that enables managers to predict and partially control patterns of behavior in ECASs. Demonstrated on the U.S. electricity markets, the ABM is integrated with normative and subjective decision behavior recommended by the U.S. Department of Energy (DOE) and the Federal Energy Regulatory Commission (FERC). The approach integrates social networks, social science, complexity theory, and diffusion theory. Furthermore, it makes a unique and significant contribution by exploring and representing concrete managerial insights for ECASs and offering new optimized actions and modeling paradigms for agent-based simulation.
2016-04-30
also that we have started building in a domain where structural patterns matter, especially for large projects. Complex Systems Complexity has been...through minimalistic thinking and parsimony” and perceived elegance, which “hides systemic or organizational complexity from the user.” If the system
Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling
NASA Technical Reports Server (NTRS)
Hojnicki, Jeffrey S.; Rusick, Jeffrey J.
2005-01-01
Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
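The shift from a deterministic, single-valued result to a probabilistic one can be sketched as plain Monte Carlo propagation of input uncertainty through a fixed model. The toy solar-array power model and the input distributions below are assumptions for illustration only; they bear no relation to the SPACE model:

```python
import random

def solar_array_power(area_m2, efficiency, insolation_w_m2, degradation):
    """Toy deterministic power-capability model: one single-valued evaluation."""
    return area_m2 * efficiency * insolation_w_m2 * (1.0 - degradation)

def monte_carlo_capability(n_samples=10000, seed=42):
    """Propagate assumed input uncertainties through the deterministic model,
    turning the single-valued answer into a distribution with error bounds."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_samples):
        efficiency = rng.gauss(0.14, 0.005)    # assumed cell-efficiency spread
        degradation = rng.gauss(0.05, 0.01)    # assumed degradation spread
        samples.append(solar_array_power(100.0, efficiency, 1367.0, degradation))
    samples.sort()
    mean = sum(samples) / n_samples
    p05 = samples[int(0.05 * n_samples)]       # empirical 5th percentile
    p95 = samples[int(0.95 * n_samples)]       # empirical 95th percentile
    return mean, p05, p95

mean, p05, p95 = monte_carlo_capability()
print(f"mean {mean:.0f} W, 90% interval [{p05:.0f}, {p95:.0f}] W")
```

The advanced techniques mentioned in the abstract are far faster than brute-force sampling on thousand-node models, but the output they deliver is of this kind: a capability estimate with quantified variation rather than a point value.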
Contribution to the meaning and understanding of anticipatory systems
NASA Astrophysics Data System (ADS)
Kljajić, Miroljub
2001-06-01
The present article discusses the cybernetic method in the modelling and understanding of complex systems from the epistemological, semantic and psychological points of view. Biological and organisational systems are the most important among complex systems. According to Rosen [1], 'anticipatory systems' is another name for complex systems because, in a way, they function to anticipate future states in order to preserve their structure and functioning. This paper demonstrates a strong analogy between Rosen's modified definition of anticipatory systems [2] and decision-making through simulation in organisational systems. The possible meaning of several models modified in the anticipatory mode is also discussed, for example: a) the modified Verhulst Model and its anticipatory modification in the case of the description of human behavior, b) the Prey-Predator Model, and c) the Evans Market Model under different conditions of the demand and supply functions.
NASA Astrophysics Data System (ADS)
Takemura, Kazuhiro; Guo, Hao; Sakuraba, Shun; Matubayasi, Nobuyuki; Kitao, Akio
2012-12-01
We propose a method to evaluate binding free energy differences among distinct protein-protein complex model structures through all-atom molecular dynamics simulations in explicit water using the solution theory in the energy representation. Complex model structures are generated from a pair of monomeric structures using the rigid-body docking program ZDOCK. After structure refinement by side chain optimization and all-atom molecular dynamics simulations in explicit water, complex models are evaluated based on the sum of their conformational and solvation free energies, the latter calculated from the energy distribution functions obtained from relatively short molecular dynamics simulations of the complex in water and of pure water based on the solution theory in the energy representation. We examined protein-protein complex model structures of two protein-protein complex systems, bovine trypsin/CMTI-1 squash inhibitor (PDB ID: 1PPE) and RNase SA/barstar (PDB ID: 1AY7), for which both complex and monomer structures were determined experimentally. For each system, we calculated the energies for the crystal complex structure and twelve generated model structures including the model most similar to the crystal structure and very different from it. In both systems, the sum of the conformational and solvation free energies tended to be lower for the structure similar to the crystal. We concluded that our energy calculation method is useful for selecting low energy complex models similar to the crystal structure from among a set of generated models.
Trends in modeling Biomedical Complex Systems
Milanesi, Luciano; Romano, Paolo; Castellani, Gastone; Remondini, Daniel; Liò, Petro
2009-01-01
In this paper we provide an introduction to the techniques for multi-scale complex biological systems, from the single bio-molecule to the cell, combining theoretical modeling, experiments, informatics tools and technologies suitable for biological and biomedical research, which are becoming increasingly multidisciplinary, multidimensional and information-driven. The most important concepts on mathematical modeling methodologies and statistical inference, bioinformatics and standards tools to investigate complex biomedical systems are discussed and the prominent literature useful to both the practitioner and the theoretician are presented. PMID:19828068
Using SysML to model complex systems for security.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cano, Lester Arturo
2010-08-01
As security systems integrate more information technology, their design has tended to become more complex. Some of the most difficult issues in designing Complex Security Systems (CSS) are capturing requirements, defining hardware interfaces, defining software interfaces, and integrating technologies such as radio systems, Voice over IP systems, and situational awareness systems.
Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems.
Williams, Richard A; Timmis, Jon; Qwarnstrom, Eva E
2016-01-01
Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model, is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model), that acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model.
A Complex Systems Model Approach to Quantified Mineral Resource Appraisal
Gettings, M.E.; Bultman, M.W.; Fisher, F.S.
2004-01-01
For federal and state land management agencies, mineral resource appraisal has evolved from value-based to outcome-based procedures wherein the consequences of resource development are compared with those of other management options. Complex systems modeling is proposed as a general framework in which to build models that can evaluate outcomes. Three frequently used methods of mineral resource appraisal (subjective probabilistic estimates, weights of evidence modeling, and fuzzy logic modeling) are discussed to obtain insight into methods of incorporating complexity into mineral resource appraisal models. Fuzzy logic and weights of evidence are most easily utilized in complex systems models. A fundamental product of new appraisals is the production of reusable, accessible databases and methodologies so that appraisals can easily be repeated with new or refined data. The data are representations of complex systems and must be so regarded if all of their information content is to be utilized. The proposed generalized model framework is applicable to mineral assessment and other geoscience problems. We begin with a (fuzzy) cognitive map using (+1,0,-1) values for the links and evaluate the map for various scenarios to obtain a ranking of the importance of various links. Fieldwork and modeling studies identify important links and help identify unanticipated links. Next, the links are given membership functions in accordance with the data. Finally, processes are associated with the links; ideally, the controlling physical and chemical events and equations are found for each link. After calibration and testing, this complex systems model is used for predictions under various scenarios.
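The first step of the proposed framework, evaluating a (fuzzy) cognitive map whose links carry (+1, 0, -1) values under a scenario, can be sketched as an iterated update until the concept states stabilize. The three-concept map and the clipping update rule below are hypothetical illustrations, not the authors' implementation:

```python
def evaluate_fcm(weights, state, steps=20):
    """Iterate a cognitive map to a fixed point (or for `steps` rounds).
    weights[i][j]: influence of concept i on concept j, in {+1, 0, -1}.
    state: current activation of each concept, clipped to [-1, 1]."""
    n = len(state)
    for _ in range(steps):
        new = []
        for j in range(n):
            total = state[j] + sum(weights[i][j] * state[i] for i in range(n))
            new.append(max(-1, min(1, total)))   # clip to [-1, 1]
        if new == state:                          # fixed point reached
            break
        state = new
    return state

# Hypothetical 3-concept map: concept 0 promotes 1, and 1 promotes 2
links = [[0, 1, 0],
         [0, 0, 1],
         [0, 0, 0]]
print(evaluate_fcm(links, [1, 0, 0]))   # scenario: activate concept 0
```

Running alternative scenarios (different initial activations) and comparing the resulting fixed points is one simple way to rank the importance of individual links, as the abstract describes.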
Marshall, Deborah A; Burgos-Liz, Lina; IJzerman, Maarten J; Osgood, Nathaniel D; Padula, William V; Higashi, Mitchell K; Wong, Peter K; Pasupathy, Kalyan S; Crown, William
2015-01-01
Health care delivery systems are inherently complex, consisting of multiple tiers of interdependent subsystems and processes that are adaptive to changes in the environment and behave in a nonlinear fashion. Traditional health technology assessment and modeling methods often neglect the wider health system impacts that can be critical for achieving desired health system goals and are often of limited usefulness when applied to complex health systems. Researchers and health care decision makers can either underestimate or fail to consider the interactions among the people, processes, technology, and facility designs. Health care delivery system interventions need to incorporate the dynamics and complexities of the health care system context in which the intervention is delivered. This report provides an overview of common dynamic simulation modeling methods and examples of health care system interventions in which such methods could be useful. Three dynamic simulation modeling methods are presented to evaluate system interventions for health care delivery: system dynamics, discrete event simulation, and agent-based modeling. In contrast to conventional evaluations, a dynamic systems approach incorporates the complexity of the system and anticipates the upstream and downstream consequences of changes in complex health care delivery systems. This report assists researchers and decision makers in deciding whether these simulation methods are appropriate to address specific health system problems through an eight-point checklist referred to as the SIMULATE (System, Interactions, Multilevel, Understanding, Loops, Agents, Time, Emergence) tool. It is a primer for researchers and decision makers working in health care delivery and implementation sciences who face complex challenges in delivering effective and efficient care that can be addressed with system interventions. 
On reviewing this report, readers should be able to identify whether these simulation modeling methods are appropriate to answer the problem they are addressing, and to recognize how these methods differ from other modeling approaches typically used in health technology assessment applications.
NASA Astrophysics Data System (ADS)
Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.
2014-12-01
Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represents for the decision process, it should not be avoided, or the value and science behind the models will be undermined. These two issues, i.e., the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of the model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of the model development, so as to quantify not only the uncertainty introduced by the addition of new environmental components, but also the effect that these new components have on existing components (interactions, non-linear responses).
Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping points) in the face of environmental and anthropogenic change (Perz, Muñoz-Carpena, Kiker and Holt, 2013), and, through Monte Carlo mapping, to identify potential management activities over the most important factors or processes that steer the system towards behavioral (desirable) outcomes (Chu-Agor, Muñoz-Carpena et al., 2012).
Intelligent classifier for dynamic fault patterns based on hidden Markov model
NASA Astrophysics Data System (ADS)
Xu, Bo; Feng, Yuguang; Yu, Jinsong
2006-11-01
It is difficult to build precise mathematical models for complex engineering systems because of the complexity of their structure and dynamic characteristics. Intelligent fault diagnosis introduces artificial intelligence and works in a different way, without building an analytical mathematical model of the diagnostic object, so it is a practical approach to solving the diagnostic problems of complex systems. This paper presents an intelligent fault diagnosis method: an integrated fault-pattern classifier based on the Hidden Markov Model (HMM). The classifier consists of a dynamic time warping (DTW) algorithm, a self-organizing feature mapping (SOFM) network, and a Hidden Markov Model. First, after the dynamic observation vector in measuring space is processed by DTW, an error vector containing the fault features of the system under test is obtained. Then a SOFM network is used as a feature extractor and vector quantization processor. Finally, fault diagnosis is realized by classifying fault patterns with the Hidden Markov Model classifier. The introduction of dynamic time warping solves the problem of extracting features from the dynamic process vectors of complex systems such as aeroengines, and makes it possible to diagnose complex systems using dynamic process information. Simulation experiments show that the diagnosis model is easy to extend, and that the fault pattern classifier is efficient and convenient for detecting and diagnosing new faults.
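The DTW stage of such a classifier rests on the classic dynamic-time-warping recurrence, which aligns two sequences that evolve at different rates. A minimal sketch of the plain DTW distance between two scalar sequences follows; the paper's observation vectors, SOFM quantizer, and HMM stages are not reproduced:

```python
def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two scalar sequences.
    d[i][j] = cost of the best alignment of a[:i] with b[:j]."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the alignment by insertion, deletion, or match
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

# A time-stretched copy of a signal still matches it exactly under DTW
print(dtw_distance([1, 2, 3], [1, 2, 2, 3]))   # 0.0
```

This rate-invariance is precisely what lets DTW compare dynamic process records of differing duration before classification.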
Time Factor in the Theory of Anthropogenic Risk Prediction in Complex Dynamic Systems
NASA Astrophysics Data System (ADS)
Ostreikovsky, V. A.; Shevchenko, Ye N.; Yurkov, N. K.; Kochegarov, I. I.; Grishko, A. K.
2018-01-01
The article overviews anthropogenic risk models that take into consideration the development over time of different factors that influence a complex system. Three classes of mathematical models have been analyzed for use in assessing the anthropogenic risk of complex dynamic systems. These models take the time factor into consideration in determining prospective safety changes of critical systems. The originality of the study lies in the analysis of five time postulates in the theory of anthropogenic risk and the safety of highly important objects. It has to be stressed that the given postulates are still rarely used in practical assessment of the equipment service life of critically important systems. That is why the results of the study presented in the article can be used in safety engineering and in the analysis of critically important complex technical systems.
ERIC Educational Resources Information Center
Haugwitz, Marion; Sandmann, Angela
2010-01-01
Understanding biological structures and functions is often difficult because of their complexity and micro-structure. For example, the vascular system is a complex and only partly visible system. Constructing models to better understand biological functions is seen as a suitable learning method. Models function as simplified versions of real…
Managing Complex Interoperability Solutions using Model-Driven Architecture
2011-06-01
such as Oracle or MySQL. Each data model for a specific RDBMS is a distinct PSM. Or the system may want to exchange information with other C2...reduced number of transformations, e.g., from an RDBMS physical schema to the corresponding SQL script needed to instantiate the tables in a relational...tance of models. In engineering, a model serves several purposes: 1. It presents an abstract view of a complex system or of a complex information
Using multi-criteria analysis of simulation models to understand complex biological systems
Maureen C. Kennedy; E. David Ford
2011-01-01
Scientists frequently use computer-simulation models to help solve complex biological problems. Typically, such models are highly integrated, they produce multiple outputs, and standard methods of model analysis are ill suited for evaluating them. We show how multi-criteria optimization with Pareto optimality allows for model outputs to be compared to multiple system...
Persistent model order reduction for complex dynamical systems using smooth orthogonal decomposition
NASA Astrophysics Data System (ADS)
Ilbeigi, Shahab; Chelidze, David
2017-11-01
Full-scale complex dynamic models are not effective for parametric studies due to the inherent constraints on available computational power and storage resources. A persistent reduced order model (ROM) that is robust, stable, and provides high-fidelity simulations for a relatively wide range of parameters and operating conditions can provide a solution to this problem. The fidelity of a new framework for persistent model order reduction of large and complex dynamical systems is investigated. The framework is validated using several numerical examples including a large linear system and two complex nonlinear systems with material and geometrical nonlinearities. While the framework is used for identifying the robust subspaces obtained from both proper and smooth orthogonal decompositions (POD and SOD, respectively), the results show that SOD outperforms POD in terms of stability, accuracy, and robustness.
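The proper orthogonal decomposition used as a baseline here can be sketched via the SVD of a snapshot matrix: the leading left singular vectors span the dominant subspace onto which the full model is projected. The synthetic 50-dimensional system below is an assumption for illustration; SOD, which the paper finds more robust, additionally uses velocity (time-derivative) information and is not shown:

```python
import numpy as np

def pod_basis(snapshots, r):
    """POD: dominant r-dimensional subspace of a snapshot matrix
    (columns are full-order states sampled over time)."""
    u, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    return u[:, :r], s

def reduce_and_reconstruct(snapshots, r):
    phi, _ = pod_basis(snapshots, r)
    q = phi.T @ snapshots          # reduced (modal) coordinates
    return phi @ q                 # lift back to the full state space

# Synthetic snapshots lying exactly in a 2-D subspace of a 50-D state space
rng = np.random.default_rng(0)
modes = rng.standard_normal((50, 2))
snapshots = modes @ rng.standard_normal((2, 200))
approx = reduce_and_reconstruct(snapshots, r=2)
err = np.linalg.norm(snapshots - approx) / np.linalg.norm(snapshots)
print(f"relative reconstruction error with r=2: {err:.2e}")
```

A ROM built this way is only as good as the snapshots it was trained on, which is exactly the persistence (robustness across parameters) issue the paper investigates.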
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsao, Jeffrey Y.; Trucano, Timothy G.; Kleban, Stephen D.
This report contains the written footprint of a Sandia-hosted workshop held in Albuquerque, New Mexico, June 22-23, 2016 on “Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation and Uncertainty Quantification,” as well as of pre-work that fed into the workshop. The workshop’s intent was to explore and begin articulating research opportunities at the intersection between two important Sandia communities: the complex systems (CS) modeling community, and the verification, validation and uncertainty quantification (VVUQ) community. The overarching research opportunity (and challenge) that we ultimately hope to address is: how can we quantify the credibility of knowledge gained from complex systems models, knowledge that is often incomplete and interim, but will nonetheless be used, sometimes in real-time, by decision makers?
The clinical educator and complexity: a review.
Schoo, Adrian; Kumar, Koshila
2018-02-08
Complexity science perspectives have helped in examining fundamental assumptions about learning and teaching in the health professions. The implications of complexity thinking for how we understand the role and development of the clinical educator are less well articulated. This review article outlines: the key principles of complexity science; a conceptual model that situates the clinical educator in a complex system; and the implications for the individual, organisation and the system. Our conceptual model situates the clinical educator at the centre of a complex and dynamic system spanning four domains and multiple levels. The four domains are: personal (encompassing personal/professional needs and expectations); health services (health agencies and their consumers); educational (educational institutions and their health students); and societal (local community/region and government). The system also comprises: micro or individual, meso or organisational, and macro or socio-political levels. Our model highlights that clinical educators are situated within a complex system comprising different agents and connections. It emphasises that individuals, teams and organisations need to recognise and be responsive to the unpredictability, interconnectedness and evolving nature of this system. Importantly, our article also calls for an epistemological shift from faculty development to capacity building in health professions education, aimed at developing individual, team, organisational and system capabilities to work with(in) complexity.
Classrooms as Complex Adaptive Systems: A Relational Model
ERIC Educational Resources Information Center
Burns, Anne; Knox, John S.
2011-01-01
In this article, we describe and model the language classroom as a complex adaptive system (see Logan & Schumann, 2005). We argue that linear, categorical descriptions of classroom processes and interactions do not sufficiently explain the complex nature of classrooms, and cannot account for how classroom change occurs (or does not occur), over…
A toolbox for discrete modelling of cell signalling dynamics.
Paterson, Yasmin Z; Shorthouse, David; Pleijzier, Markus W; Piterman, Nir; Bendtsen, Claus; Hall, Benjamin A; Fisher, Jasmin
2018-06-18
In an age where the volume of data regarding biological systems exceeds our ability to analyse it, many researchers are looking towards systems biology and computational modelling to help unravel the complexities of gene and protein regulatory networks. In particular, the use of discrete modelling allows generation of signalling networks in the absence of full quantitative descriptions of systems, which are necessary for ordinary differential equation (ODE) models. In order to make such techniques more accessible to mainstream researchers, tools such as the BioModelAnalyzer (BMA) have been developed to provide a user-friendly graphical interface for discrete modelling of biological systems. Here we use the BMA to build a library of discrete target functions of known canonical molecular interactions, translated from ordinary differential equations (ODEs). We then show that these BMA target functions can be used to reconstruct complex networks, which can correctly predict many known genetic perturbations. This new library supports the accessibility ethos behind the creation of BMA, providing a toolbox for the construction of complex cell signalling models without the need for extensive experience in computer programming or mathematical modelling, and allows for construction and simulation of complex biological systems with only small amounts of quantitative data.
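The discrete-update style of modelling that tools like BMA expose can be sketched as a synchronous update over per-node target functions, iterated until a stable attractor is reached. The three-node cascade below is hypothetical and written in plain Python, not in BMA's own model format:

```python
def step(state, rules):
    """Synchronous update of a discrete signalling network.
    state: dict node -> discrete level; rules: dict node -> target function."""
    return {node: rule(state) for node, rule in rules.items()}

def simulate(state, rules, max_steps=50):
    """Iterate until a fixed point (stable attractor) or max_steps."""
    trajectory = []
    for _ in range(max_steps):
        trajectory.append(state)
        nxt = step(state, rules)
        if nxt == state:
            return state, trajectory
        state = nxt
    return state, trajectory

# Hypothetical 3-node cascade: ligand activates receptor, receptor activates tf
rules = {
    "ligand":   lambda s: s["ligand"],      # external input, held constant
    "receptor": lambda s: s["ligand"],
    "tf":       lambda s: s["receptor"],
}
final, _ = simulate({"ligand": 1, "receptor": 0, "tf": 0}, rules)
print(final)
```

A genetic perturbation is simulated by editing a rule (e.g. pinning "receptor" to 0 for a knockout) and comparing the resulting attractor with the wild type, which is the kind of prediction the abstract describes.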
ERIC Educational Resources Information Center
Doskey, Steven Craig
2014-01-01
This research presents an innovative means of gauging Systems Engineering effectiveness through a Systems Engineering Relative Effectiveness Index (SE REI) model. The SE REI model uses a Bayesian Belief Network to map causal relationships in government acquisitions of Complex Information Systems (CIS), enabling practitioners to identify and…
Research in Optical Symbolic Tasks
1989-11-29
November 1989. Specifically, we have concentrated on the following topics: complexity studies for optical neural and digital systems, architecture and...1989. Specifically, we have concentrated on the following topics: complexity studies for optical neural and digital systems, architecture and models for...Digital Systems 1.1 Digital Optical Parallel System Complexity Our study of digital optical system complexity has included a comparison of optical and
A modeling framework for exposing risks in complex systems.
Sharit, J
2000-08-01
This article introduces and develops a modeling framework for exposing risks in the form of human errors and adverse consequences in high-risk systems. The modeling framework is based on two components: a two-dimensional theory of accidents in systems developed by Perrow in 1984, and the concept of multiple system perspectives. The theory of accidents differentiates systems on the basis of two sets of attributes. One set characterizes the degree to which systems are interactively complex; the other emphasizes the extent to which systems are tightly coupled. The concept of multiple perspectives provides alternative descriptions of the entire system that serve to enhance insight into system processes. The usefulness of these two model components derives from a modeling framework that cross-links them, enabling a variety of work contexts to be exposed and understood that would otherwise be very difficult or impossible to identify. The model components and the modeling framework are illustrated in the case of a large and comprehensive trauma care system. In addition to its general utility in the area of risk analysis, this methodology may be valuable in applications of current methods of human and system reliability analysis in complex and continually evolving high-risk systems.
Anatomy and Physiology of Multiscale Modeling and Simulation in Systems Medicine.
Mizeranschi, Alexandru; Groen, Derek; Borgdorff, Joris; Hoekstra, Alfons G; Chopard, Bastien; Dubitzky, Werner
2016-01-01
Systems medicine is the application of systems biology concepts, methods, and tools to medical research and practice. It aims to integrate data and knowledge from different disciplines into biomedical models and simulations for the understanding, prevention, cure, and management of complex diseases. Complex diseases arise from the interactions among disease-influencing factors across multiple levels of biological organization from the environment to molecules. To tackle the enormous challenges posed by complex diseases, we need a modeling and simulation framework capable of capturing and integrating information originating from multiple spatiotemporal and organizational scales. Multiscale modeling and simulation in systems medicine is an emerging methodology and discipline that has already demonstrated its potential in becoming this framework. The aim of this chapter is to present some of the main concepts, requirements, and challenges of multiscale modeling and simulation in systems medicine.
Understanding Complex Natural Systems by Articulating Structure-Behavior-Function Models
ERIC Educational Resources Information Center
Vattam, Swaroop S.; Goel, Ashok K.; Rugaber, Spencer; Hmelo-Silver, Cindy E.; Jordan, Rebecca; Gray, Steven; Sinha, Suparna
2011-01-01
Artificial intelligence research on creative design has led to Structure-Behavior-Function (SBF) models that emphasize functions as abstractions for organizing understanding of physical systems. Empirical studies on understanding complex systems suggest that novice understanding is shallow, typically focusing on their visible structures and…
A systems-based approach for integrated design of materials, products and design process chains
NASA Astrophysics Data System (ADS)
Panchal, Jitesh H.; Choi, Hae-Jin; Allen, Janet K.; McDowell, David L.; Mistree, Farrokh
2007-12-01
The concurrent design of materials and products provides designers with flexibility to achieve design objectives that were not previously accessible. However, the improved flexibility comes at a cost of increased complexity of the design process chains and the materials simulation models used for executing the design chains. Efforts to reduce the complexity generally result in increased uncertainty. We contend that a systems based approach is essential for managing both the complexity and the uncertainty in design process chains and simulation models in concurrent material and product design. Our approach is based on simplifying the design process chains systematically such that the resulting uncertainty does not significantly affect the overall system performance. Similarly, instead of striving for accurate models for multiscale systems (that are inherently complex), we rely on making design decisions that are robust to uncertainties in the models. Accordingly, we pursue hierarchical modeling in the context of design of multiscale systems. In this paper our focus is on design process chains. We present a systems based approach, premised on the assumption that complex systems can be designed efficiently by managing the complexity of design process chains. The approach relies on (a) the use of reusable interaction patterns to model design process chains, and (b) consideration of design process decisions using value-of-information based metrics. The approach is illustrated using a Multifunctional Energetic Structural Material (MESM) design example. Energetic materials store considerable energy which can be released through shock-induced detonation; conventionally, they are not engineered for strength properties. The design objectives for the MESM in this paper include both sufficient strength and energy release characteristics. The design is carried out by using models at different length and time scales that simulate different aspects of the system. 
Finally, by applying the method to the MESM design problem, we show that the integrated design of materials and products can be carried out more efficiently by explicitly accounting for design process decisions with the hierarchy of models.
Flood vulnerability evaluation in complex urban areas
NASA Astrophysics Data System (ADS)
Giosa, L.; Pascale, S.; Sdao, F.; Sole, A.; Cantisani, A.
2009-04-01
This paper deals with the conception, development and subsequent validation of an integrated numerical model for the assessment of systemic vulnerability in complex, urbanized areas subject to flood risk. The proposed methodology is based on the concept of "systemic vulnerability": the model is a mathematical decision model intended to estimate the vulnerability of a complex territorial system during a flood event. The model uses a group of "pressure indicators" to define, qualitatively and quantitatively, the influence exerted on the territorial system by physical, social, economic and other factors. The model evaluates the exposure to flood risk of the elements that belong to the system. The proposed model, which is based on the studies of Tamura et al., 2000; Minciardi et al., 2004; and Pascale et al., 2008, considers vulnerability not as a characteristic of a particular element at risk, but as a peculiarity of a complex territorial system in which the different elements are reciprocally linked in a functional way. The model identifies the elements with the greatest functional loss and those that make the whole system critical. This characteristic makes the proposed model able to support correct territorial planning and suitable management of the emergency following natural disasters such as floods. The proposed approach was tested on a study area in the city of Potenza, southern Italy.
The spatiotemporal system dynamics of acquired resistance in an engineered microecology.
Datla, Udaya Sree; Mather, William H; Chen, Sheng; Shoultz, Isaac W; Täuber, Uwe C; Jones, Caroline N; Butzin, Nicholas C
2017-11-22
Great strides have been made in the understanding of complex networks; however, our understanding of natural microecologies is limited. Modelling of complex natural ecological systems has allowed for new findings, but these models typically ignore the constant evolution of species. Due to the complexity of natural systems, unanticipated interactions may lead to erroneous conclusions concerning the role of specific molecular components. To address this, we use a synthetic system to understand the spatiotemporal dynamics of growth and to study acquired resistance in vivo. Our system differs from earlier synthetic systems in that it focuses on the evolution of a microecology from a killer-prey relationship to coexistence using two different non-motile Escherichia coli strains. Using empirical data, we developed the first ecological model emphasising the concept of the constant evolution of species, where the survival of the prey species is dependent on location (distance from the killer) or the evolution of resistance. Our simple model, when expanded to complex microecological association studies under varied spatial and nutrient backgrounds may help to understand the complex relationships between multiple species in intricate natural ecological networks. This type of microecological study has become increasingly important, especially with the emergence of antibiotic-resistant pathogens.
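The distance-or-resistance survival condition described above can be sketched in a few lines. The decay law, threshold, and parameter values below are illustrative assumptions, not the authors' empirical model:

```python
# Illustrative sketch (not the authors' model) of prey survival in a
# killer-prey microecology: toxin pressure decays with distance from the
# killer, and prey that have evolved resistance ignore it entirely.

def prey_survival(distance, resistant, toxin_at_source=1.0, decay=0.5,
                  lethal_threshold=0.2):
    """Return True if prey at `distance` from the killer survives."""
    if resistant:
        return True  # acquired resistance trumps location
    toxin = toxin_at_source * (decay ** distance)  # exponential decay with distance
    return toxin < lethal_threshold

# Enumerate which (distance, resistant) combinations survive.
survivors = [(d, r) for d in range(6) for r in (False, True)
             if prey_survival(d, r)]
```

With these assumed parameters, non-resistant prey survive only beyond a critical distance, while resistant prey survive everywhere, mirroring the location-or-evolution dependence the abstract describes.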
Modeling complexity in engineered infrastructure system: Water distribution network as an example
NASA Astrophysics Data System (ADS)
Zeng, Fang; Li, Xiang; Li, Ke
2017-02-01
The complex topology and adaptive behavior of infrastructure systems are driven both by self-organization of demand and by rigid engineering solutions. Engineering complex systems therefore requires a method that balances holism and reductionism. To model the growth of water distribution networks, a complex network model was developed that combines local optimization rules with engineering considerations. Demand-node generation is dynamic and follows the scaling law of urban growth. The proposed model can generate a water distribution network (WDN) similar to reported real-world WDNs in several structural properties. Comparison with different modeling approaches indicates that a realistic demand-node distribution and the co-evolution of demand nodes and network are important for simulating real complex networks. The simulation results indicate that the efficiency of water distribution networks is exponentially affected by the urban growth pattern. In contrast, the improvement in efficiency achievable through engineering optimization is limited and relatively insignificant. Redundancy and robustness, on the other hand, can be significantly improved through engineering methods.
Structural model of control system for hydraulic stepper motor complex
NASA Astrophysics Data System (ADS)
Obukhov, A. D.; Dedov, D. L.; Kolodin, A. N.
2018-03-01
The article considers the problem of developing a structural model of the control system for a complex of hydraulic stepper drives. A comparative analysis of stepper drives and an assessment of the applicability of hydraulic stepper motors (HSM) to problems requiring accurate displacement in space with subsequent positioning of the object are carried out. The presented structural model of the automated control system of the multi-spindle complex of hydraulic stepper drives reflects the main components of the system, as well as the process of controlling it through the transfer of control signals to the solenoid valves by the controller. The models and methods described in the article can be used to formalize the control process in technical systems based on hydraulic stepper drives and allow switching from mechanical to automated control.
Tutoring and Multi-Agent Systems: Modeling from Experiences
ERIC Educational Resources Information Center
Bennane, Abdellah
2010-01-01
Tutoring systems become complex and are offering varieties of pedagogical software as course modules, exercises, simulators, systems online or offline, for single user or multi-user. This complexity motivates new forms and approaches to the design and the modelling. Studies and research in this field introduce emergent concepts that allow the…
Model-Based Compositional Reasoning for Complex Systems of Systems (SoS)
2016-11-01
AFRL-RQ-WP-TR-2016-0172. M. Anthony Aiello, Benjamin D. Rodes. Air Force Research Laboratory, Aerospace Systems Directorate, Wright-Patterson Air Force Base, OH 45433-7541. The report describes a more structured approach for finding flaws and weaknesses in systems as the system is updated, either in response to a found flaw or new…
Automated reverse engineering of nonlinear dynamical systems
Bongard, Josh; Lipson, Hod
2007-01-01
Complex nonlinear dynamics arise in many fields of science and engineering, but uncovering the underlying differential equations directly from observations poses a challenging task. The ability to symbolically model complex networked systems is key to understanding them, an open problem in many disciplines. Here we introduce for the first time a method that can automatically generate symbolic equations for a nonlinear coupled dynamical system directly from time series data. This method is applicable to any system that can be described using sets of ordinary nonlinear differential equations, and assumes that the (possibly noisy) time series of all variables are observable. Previous automated symbolic modeling approaches of coupled physical systems produced linear models or required a nonlinear model to be provided manually. The advance presented here is made possible by allowing the method to model each (possibly coupled) variable separately, intelligently perturbing and destabilizing the system to extract its less observable characteristics, and automatically simplifying the equations during modeling. We demonstrate this method on four simulated and two real systems spanning mechanics, ecology, and systems biology. Unlike numerical models, symbolic models have explanatory value, suggesting that automated “reverse engineering” approaches for model-free symbolic nonlinear system identification may play an increasing role in our ability to understand progressively more complex systems in the future. PMID:17553966
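The task the abstract describes, recovering governing equations from time series, can be illustrated with a much simpler stand-in: fitting coefficients over a fixed library of candidate terms by least squares. This is not the authors' stochastic symbolic-regression method; the candidate library and synthetic data below are assumptions made for the sketch:

```python
# Simplified stand-in for symbolic system identification: given a sampled
# trajectory x(t), estimate coefficients c_k in dx/dt = sum_k c_k * f_k(x)
# over a user-supplied library of candidate terms f_k.
import numpy as np

def fit_ode_terms(t, x, candidates):
    """Fit dx/dt ~ sum_k c_k * f_k(x) from a sampled trajectory."""
    dxdt = np.gradient(x, t)                        # numerical derivative
    library = np.column_stack([f(x) for f in candidates])
    coeffs, *_ = np.linalg.lstsq(library, dxdt, rcond=None)
    return coeffs

# Synthetic data from dx/dt = -0.5 x, so the true coefficients over the
# library [x, x^2] are [-0.5, 0].
t = np.linspace(0.0, 5.0, 400)
x = np.exp(-0.5 * t)
coeffs = fit_ode_terms(t, x, [lambda v: v, lambda v: v**2])
```

Unlike the authors' method, this linear-regression sketch requires the candidate terms to be chosen in advance; its point is only to show how a symbolic model (term names plus fitted coefficients) carries explanatory value that a black-box numerical model does not.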
Model-Based Engineering for Supply Chain Risk Management
2015-09-30
Software Engineering Institute. Expanded use of commercial components has increased the complexity of system assurance and verification. Model-based engineering (MBE) offers a means to design, develop, analyze, and maintain a complex system architecture.
Enhancing metaproteomics-The value of models and defined environmental microbial systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herbst, Florian-Alexander; Lünsmann, Vanessa; Kjeldal, Henrik
Metaproteomics, the large-scale characterization of the entire protein complement of environmental microbiota at a given point in time, has provided new means to study complex microbial communities and unravel these black boxes. New technical challenges arose that were not an issue for classical proteome analytics and that can be tackled by the application of different model systems. Here, we review different current and future model systems for metaproteome analysis. Following a short introduction to microbial communities and metaproteomics, we introduce model systems for clinical and biotechnological research questions, including acid mine drainage, anaerobic digesters, and activated sludge. Model systems are useful to evaluate the challenges encountered within (but not limited to) metaproteomics, including species complexity and coverage, biomass availability, and reliable protein extraction. Moreover, the implementation of model systems can be considered a step forward towards a better understanding of microbial community responses and of the ecological functions of single member organisms. In the future, improvements are necessary to fully explore complex environmental systems by metaproteomics.
NASA Astrophysics Data System (ADS)
Henriot, abel; Blavoux, bernard; Travi, yves; Lachassagne, patrick; Beon, olivier; Dewandel, benoit; Ladouche, bernard
2013-04-01
The Evian Natural Mineral Water (NMW) aquifer is a highly heterogeneous complex of Quaternary glacial deposits composed of three main units, from bottom to top: - The "Inferior Complex", mainly composed of basal, impermeable till lying on the Alpine rocks. It outcrops only at the highest altitudes but is known at depth through drill holes. - The "Gavot Plateau Complex", an interstratified complex of mainly basal and lateral till up to 400 m thick. It outcrops between approximately 850 and 1200 m a.m.s.l. over a 30 km² area and is the main known recharge area of the hydromineral system. - The "Terminal Complex", from which the Evian NMW emerges at 410 m a.m.s.l. It is composed of sand-and-gravel kame terraces that allow water to flow from the deep permeable layers of the "Gavot Plateau Complex" to the "Terminal Complex". A thick, impermeable terminal till caps and seals the system, so the aquifer is confined in its downstream area. Because of the heterogeneity and complexity of this hydrosystem, distributed modelling tools are difficult to implement at the whole-system scale: important hypotheses would have to be made about geometry, hydraulic properties and boundary conditions, for example, and extrapolation would undoubtedly lead to unacceptable errors. Consequently, a modelling strategy is being developed that also improves the conceptual model of the hydrosystem. Lumped models, mainly based on tritium time series, allow the whole hydrosystem to be modelled by combining in series an exponential model (superficial aquifers of the "Gavot Plateau Complex"), a dispersive model (the Gavot Plateau interstratified complex) and a piston-flow model (sand and gravel of the kame terraces), with mean transit times of 8, 60 and 2.5 years, respectively. These models provide insight into the governing parameters of the whole mineral aquifer. They help improve the current conceptual model and are to be refined with other environmental tracers such as CFCs and SF6.
A deterministic approach (a distributed flow-and-transport model) is applied at the scale of the Terminal Complex. The geometry of the system is quite well known from drill holes, and the aquifer properties are derived from the processing of hydraulic-head data and the interpretation of pumping tests. A multidisciplinary approach (hydrodynamics, hydrochemistry, geology, isotopes) in the recharge area (Gavot Plateau Complex) aims to better constrain the upstream boundary of the distributed model. Moreover, the perfect-tracer modelling approach strongly constrains the fitting of this distributed model. The result is a high-resolution conceptual model leading to a future operational management tool for the aquifer.
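The lumped transit-time models mentioned above can be sketched generically: the outlet tracer concentration is the inlet history weighted by a transit-time distribution. The sketch below uses the exponential model with the 8-year mean transit time quoted in the abstract; the input series and discretisation are illustrative assumptions, not Evian data:

```python
# Generic lumped-parameter sketch: outlet concentration as the inlet tracer
# history weighted by a transit-time distribution (exponential model shown;
# the dispersive and piston-flow variants differ only in the distribution).
import math

def exponential_ttd(mean_tt, n_years):
    """Discrete exponential transit-time distribution g(tau), tau in years."""
    g = [math.exp(-tau / mean_tt) / mean_tt for tau in range(n_years)]
    total = sum(g)
    return [v / total for v in g]  # renormalise the truncated tail

def outlet_concentration(inputs, mean_tt):
    g = exponential_ttd(mean_tt, len(inputs))
    # inputs[-1] is the current year; inputs[-1 - tau] entered tau years ago.
    return sum(inputs[-1 - tau] * g[tau] for tau in range(len(inputs)))

# Hypothetical tracer input: a step change in the last 5 years.
tritium_input = [10.0] * 30 + [100.0] * 5
c_out = outlet_concentration(tritium_input, mean_tt=8.0)
```

The outlet value lies between the old and new input levels, weighted by how much of the water is younger than the step, which is exactly the kind of constraint tritium series place on the mean transit times quoted above.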
PeTTSy: a computational tool for perturbation analysis of complex systems biology models.
Domijan, Mirela; Brown, Paul E; Shulgin, Boris V; Rand, David A
2016-03-10
Over the last decade sensitivity analysis techniques have been shown to be very useful to analyse complex and high dimensional Systems Biology models. However, many of the currently available toolboxes have either used parameter sampling, focused on a restricted set of model observables of interest, studied optimisation of an objective function, or have not dealt with multiple simultaneous model parameter changes where the changes can be permanent or temporary. Here we introduce our new, freely downloadable toolbox, PeTTSy (Perturbation Theory Toolbox for Systems). PeTTSy is a package for MATLAB which implements a wide array of techniques for the perturbation theory and sensitivity analysis of large and complex ordinary differential equation (ODE) based models. PeTTSy is a comprehensive modelling framework that introduces a number of new approaches and that fully addresses analysis of oscillatory systems. It examines sensitivity analysis of the models to perturbations of parameters, where the perturbation timing, strength, length and overall shape can be controlled by the user. This can be done in a system-global setting, namely, the user can determine how many parameters to perturb, by how much and for how long. PeTTSy also offers the user the ability to explore the effect of the parameter perturbations on many different types of outputs: period, phase (timing of peak) and model solutions. PeTTSy can be employed on a wide range of mathematical models including free-running and forced oscillators and signalling systems. To enable experimental optimisation using the Fisher Information Matrix it efficiently allows one to combine multiple variants of a model (i.e. a model with multiple experimental conditions) in order to determine the value of new experiments. It is especially useful in the analysis of large and complex models involving many variables and parameters.
PeTTSy is a comprehensive tool for analysing large and complex models of regulatory and signalling systems. It allows for simulation and analysis of models under a variety of environmental conditions and for experimental optimisation of complex combined experiments. With its unique set of tools it makes a valuable addition to the current library of sensitivity analysis toolboxes. We believe that this software will be of great use to the wider biological, systems biology and modelling communities.
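The core perturbation idea behind such toolboxes can be illustrated with plain forward finite differences on a toy ODE. This is a generic sketch, not PeTTSy's MATLAB implementation or API:

```python
# Generic parameter-sensitivity sketch: perturb one parameter, re-simulate,
# and difference the outputs (forward finite differences on a toy ODE).

def euler_solve(f, y0, params, t_end=10.0, dt=0.01):
    """Crude forward-Euler integration of dy/dt = f(y, params)."""
    y, t, ys = y0, 0.0, [y0]
    while t < t_end:
        y += dt * f(y, params)
        t += dt
        ys.append(y)
    return ys

def sensitivity(f, y0, params, name, eps=1e-6):
    """d(final state)/d(parameter) by forward finite differences."""
    base = euler_solve(f, y0, params)[-1]
    bumped = dict(params, **{name: params[name] + eps})
    return (euler_solve(f, y0, bumped)[-1] - base) / eps

# Toy model: exponential decay dy/dt = -k*y; the final state should become
# smaller as k grows, so the sensitivity is negative.
decay = lambda y, p: -p["k"] * y
s = sensitivity(decay, 1.0, {"k": 0.5}, "k")
```

Toolboxes like the one described add far more on top of this (time-varying perturbation shapes, oscillator phase and period outputs, Fisher-information-based experiment design), but the perturb-and-compare loop is the common core.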
Sustainability, Complexity and Learning: Insights from Complex Systems Approaches
ERIC Educational Resources Information Center
Espinosa, A.; Porter, T.
2011-01-01
Purpose: The purpose of this research is to explore core contributions from two different approaches to complexity management in organisations aiming to improve their sustainability: the Viable Systems Model (VSM) and Complex Adaptive Systems (CAS). It is proposed to perform this by summarising the main insights each approach offers to…
The Limitations of Model-Based Experimental Design and Parameter Estimation in Sloppy Systems.
White, Andrew; Tolman, Malachi; Thames, Howard D; Withers, Hubert Rodney; Mason, Kathy A; Transtrum, Mark K
2016-12-01
We explore the relationship among experimental design, parameter estimation, and systematic error in sloppy models. We show that the approximate nature of mathematical models poses challenges for experimental design in sloppy models. In many models of complex biological processes it is unknown which physical mechanisms must be included to explain system behaviors. As a consequence, models are often overly complex, with many practically unidentifiable parameters. Furthermore, which mechanisms are relevant or irrelevant varies among experiments. By selecting complementary experiments, experimental design may inadvertently make details that were omitted from the model relevant. When this occurs, the model will have a large systematic error and fail to give a good fit to the data. We use a simple hyper-model of model error to quantify a model's discrepancy and apply it to two models of complex biological processes (EGFR signaling and DNA repair) with optimally selected experiments. We find that although parameters may be accurately estimated, the discrepancy in the model renders it less predictive than it was in the sloppy regime where systematic error is small. We introduce the concept of a sloppy system: a sequence of models of increasing complexity that become sloppy in the limit of microscopic accuracy. We explore the limits of accurate parameter estimation in sloppy systems and argue that identifying underlying mechanisms controlling system behavior is better approached by considering a hierarchy of models of varying detail rather than focusing on parameter estimation in a single model.
Apostolopoulos, Yorghos; Lemke, Michael K; Barry, Adam E; Lich, Kristen Hassmiller
2018-02-01
Given the complexity of factors contributing to alcohol misuse, appropriate epistemologies and methodologies are needed to understand and intervene meaningfully. We aimed to (1) provide an overview of computational modeling methodologies, with an emphasis on system dynamics modeling; (2) explain how community-based system dynamics modeling can forge new directions in alcohol prevention research; and (3) present a primer on how to build alcohol misuse simulation models using system dynamics modeling, with an emphasis on stakeholder involvement, data sources and model validation. Throughout, we use alcohol misuse among college students in the United States as a heuristic example for demonstrating these methodologies. System dynamics modeling employs a top-down aggregate approach to understanding dynamically complex problems. Its three foundational properties (stocks, flows and feedbacks) capture non-linearity, time-delayed effects and other system characteristics. As a methodological choice, system dynamics modeling is amenable to participatory approaches; in particular, community-based system dynamics modeling has been used to build impactful models for addressing dynamically complex problems. The process of community-based system dynamics modeling consists of numerous stages: (1) creating model boundary charts, behavior-over-time graphs and preliminary system dynamics models using group model-building techniques; (2) model formulation; (3) model calibration; (4) model testing and validation; and (5) model simulation using learning-laboratory techniques. Community-based system dynamics modeling can provide powerful tools for policy and intervention decisions that can result ultimately in sustainable changes in research and action in alcohol misuse prevention. © 2017 Society for the Study of Addiction.
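A stock-and-flow model of the kind described above can be sketched with simple Euler integration; the two stocks, flow rates and parameter values below are illustrative assumptions for the college-drinking example, not calibrated values from the study:

```python
# Minimal system dynamics sketch: two stocks (non-misusers, misusers), two
# flows (uptake, recovery), and a feedback in which uptake depends on the
# current misuser stock (social influence). All rates are hypothetical.

def simulate(steps=520, dt=0.1):
    """Euler-integrate the two-stock model; returns the misuser trajectory."""
    non_misusers, misusers = 950.0, 50.0  # hypothetical initial stocks
    history = []
    for _ in range(steps):
        # Flows: social-influence uptake (feedback through the misuser
        # stock) and recovery back into the non-misuser stock.
        total = non_misusers + misusers
        uptake = 0.05 * non_misusers * misusers / total
        recovery = 0.02 * misusers
        non_misusers += dt * (recovery - uptake)
        misusers += dt * (uptake - recovery)
        history.append(misusers)
    return history

trajectory = simulate()
```

Even this toy version shows the non-linearity and feedback the abstract highlights: the misuser stock grows toward an equilibrium set by the balance of the two flows rather than linearly.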
Fault management for the Space Station Freedom control center
NASA Technical Reports Server (NTRS)
Clark, Colin; Jowers, Steven; Mcnenny, Robert; Culbert, Chris; Kirby, Sarah; Lauritsen, Janet
1992-01-01
This paper describes model-based reasoning for fault isolation in complex systems using automated digraph analysis. It discusses the use of the digraph representation as the paradigm for modeling physical systems and a method for executing these failure models to provide real-time failure analysis. It also discusses the generality, ease of development and maintenance, complexity management, and susceptibility to verification and validation of digraph failure models. It specifically describes how a NASA-developed digraph evaluation tool and an automated process working with that tool can identify failures in a monitored system when supplied with one or more fault indications. This approach is well suited to commercial applications of real-time failure analysis in complex systems because it is both powerful and cost effective.
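The digraph failure-model idea can be sketched as reachability analysis: a node is a candidate fault source if every observed indication lies downstream of it. The graph and node names below are hypothetical, not from the NASA-developed tool:

```python
# Sketch of digraph-based fault isolation: failures propagate along directed
# edges, and observed fault indications are explained by any upstream node
# that can reach all of them.
from collections import defaultdict

def build_reachability(edges):
    """Map each node to the set of nodes its failure could affect."""
    graph = defaultdict(list)
    for src, dst in edges:
        graph[src].append(dst)

    def reachable(node, seen=None):
        seen = set() if seen is None else seen
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                reachable(nxt, seen)
        return seen

    nodes = {n for e in edges for n in e}
    return {n: reachable(n) | {n} for n in nodes}

def candidate_sources(edges, indications):
    """Nodes whose failure would explain *all* observed indications."""
    reach = build_reachability(edges)
    return {n for n, downstream in reach.items()
            if set(indications) <= downstream}

# Hypothetical failure digraph for a small fluid system.
edges = [("pump", "valve"), ("valve", "sensor_A"), ("valve", "sensor_B"),
         ("power", "pump"), ("power", "sensor_B")]
candidates = candidate_sources(edges, ["sensor_A", "sensor_B"])
```

When both sensors report faults, every node upstream of both remains a suspect; additional indications (or their absence) prune the candidate set, which is the essence of executing the failure model in real time.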
Evaluation of HardSys/HardDraw, An Expert System for Electromagnetic Interactions Modelling
1993-05-01
This report describes HardSys/HardDraw, an expert system for the modelling of electromagnetic interactions in complex systems, and reviews the main concepts used in its design. The system consists of two main components: HardSys and HardDraw. HardSys is the advisor part of the expert system; it is knowledge-based, that is, it contains a database of models and properties for various types of…
NASA Technical Reports Server (NTRS)
Torres-Pomales, Wilfredo
2014-01-01
A system is safety-critical if its failure can endanger human life or cause significant damage to property or the environment. State-of-the-art computer systems on commercial aircraft are highly complex, software-intensive, functionally integrated, and network-centric systems of systems. Ensuring that such systems are safe and comply with existing safety regulations is costly and time-consuming as the level of rigor in the development process, especially the validation and verification activities, is determined by considerations of system complexity and safety criticality. A significant degree of care and deep insight into the operational principles of these systems is required to ensure adequate coverage of all design implications relevant to system safety. Model-based development methodologies, methods, tools, and techniques facilitate collaboration and enable the use of common design artifacts among groups dealing with different aspects of the development of a system. This paper examines the application of model-based development to complex and safety-critical aircraft computer systems. Benefits and detriments are identified and an overall assessment of the approach is given.
Hay, L.; Knapp, L.
1996-01-01
Investigating natural, potential, and man-induced impacts on hydrological systems commonly requires complex modelling with overlapping data requirements, and massive amounts of one- to four-dimensional data at multiple scales and formats. Given the complexity of most hydrological studies, the requisite software infrastructure must incorporate many components including simulation modelling, spatial analysis and flexible, intuitive displays. There is a general requirement for a set of capabilities to support scientific analysis which, at this time, can only come from an integration of several software components. Integration of geographic information systems (GISs) and scientific visualization systems (SVSs) is a powerful technique for developing and analysing complex models. This paper describes the integration of an orographic precipitation model, a GIS and an SVS. The combination of these individual components provides a robust infrastructure which allows the scientist to work with the full dimensionality of the data and to examine the data in a more intuitive manner.
Complex networks as an emerging property of hierarchical preferential attachment.
Hébert-Dufresne, Laurent; Laurence, Edward; Allard, Antoine; Young, Jean-Gabriel; Dubé, Louis J
2015-12-01
Real complex systems are not rigidly structured; no clear rules or blueprints exist for their construction. Yet, amidst their apparent randomness, complex structural properties universally emerge. We propose that an important class of complex systems can be modeled as an organization of many embedded levels (potentially infinite in number), all of them following the same universal growth principle known as preferential attachment. We give examples of such hierarchy in real systems, for instance, in the pyramid of production entities of the film industry. More importantly, we show how real complex networks can be interpreted as a projection of our model, from which their scale independence, their clustering, their hierarchy, their fractality, and their navigability naturally emerge. Our results suggest that complex networks, viewed as growing systems, can be quite simple, and that the apparent complexity of their structure is largely a reflection of their unobserved hierarchical nature.
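The growth principle this model builds on can be sketched in its flat, single-level form; the authors' contribution is the hierarchical embedding of many such levels, which this sketch omits:

```python
# Flat (single-level) preferential attachment: each new node attaches to an
# existing node with probability proportional to that node's current degree.
import random

def preferential_attachment(n_nodes, seed=42):
    rng = random.Random(seed)
    # Start from a single edge. `stubs` lists each node once per unit of
    # degree, so uniform sampling from it is degree-proportional sampling.
    stubs = [0, 1]
    degree = {0: 1, 1: 1}
    for new in range(2, n_nodes):
        target = rng.choice(stubs)   # rich-get-richer choice
        stubs += [new, target]
        degree[new] = 1
        degree[target] += 1
    return degree

deg = preferential_attachment(2000)
```

The resulting degree sequence is heavy-tailed (a few hubs, many leaves); in the hierarchical version described above, the same rule is applied at every embedded level, which is what yields clustering, fractality and navigability in addition to scale independence.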
Modeling relations in nature and eco-informatics: a practical application of rosennean complexity.
Kineman, John J
2007-10-01
The purpose of eco-informatics is to communicate critical information about organisms and ecosystems. To accomplish this, it must reflect the complexity of natural systems. Present information systems are designed around mechanistic concepts that do not capture complexity. Robert Rosen's relational theory offers a way of representing complexity in terms of information entailments that are part of an ontologically implicit 'modeling relation'. This relation has corresponding epistemological components that can be captured empirically, the components being structure (associated with model encoding) and function (associated with model decoding). Relational complexity, thus, provides a long-awaited theoretical underpinning for these concepts that ecology has found indispensable. Structural information pertains to the material organization of a system, which can be represented by data. Functional information specifies potential change, which can be inferred from experiment and represented as models or descriptions of state transformations. Contextual dependency (of structure or function) implies meaning. Biological functions imply internalized or system-dependent laws. Complexity can be represented epistemologically by relating structure and function in two different ways. One expresses the phenomenal relation that exists in any present or past instance, and the other draws the ontology of a system into the empirical world in terms of multiple potentials subject to natural forms of selection and optimality. These act as system attractors. Implementing these components and their theoretical relations in an informatics system will provide more-complete ecological informatics than is possible from a strictly mechanistic point of view. This approach will enable many new possibilities for supporting science and decision making.
New approaches in agent-based modeling of complex financial systems
NASA Astrophysics Data System (ADS)
Chen, Ting-Ting; Zheng, Bo; Li, Yan; Jiang, Xiong-Fei
2017-12-01
Agent-based modeling is a powerful simulation technique to understand the collective behavior and microscopic interaction in complex financial systems. Recently, the concept of determining the key parameters of agent-based models from empirical data, instead of setting them artificially, was suggested. We first review several agent-based models and the new approaches to determine the key model parameters from historical market data. Based on the agents' behaviors with heterogeneous personal preferences and interactions, these models are successful in explaining the microscopic origination of the temporal and spatial correlations of financial markets. We then present a novel paradigm combining big-data analysis with agent-based modeling. Specifically, from internet query and stock market data, we extract the information driving forces and develop an agent-based model to simulate the dynamic behaviors of complex financial systems.
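A minimal agent-based market of the kind reviewed above can be sketched as herding agents whose net demand moves the price. The imitation and price-impact parameters below are illustrative assumptions, not values determined from empirical data as the reviewed approaches advocate:

```python
# Toy agent-based market: each agent holds a buy/sell view, imitates a
# randomly chosen peer (herding feedback), and the aggregate excess demand
# moves the price each step. All parameters are hypothetical.
import random

def simulate_market(n_agents=200, n_steps=500, seed=7):
    rng = random.Random(seed)
    price = 100.0
    prices = [price]
    sentiment = [rng.choice([-1, 1]) for _ in range(n_agents)]  # +1 buy, -1 sell
    for _ in range(n_steps):
        for i in range(n_agents):
            j = rng.randrange(n_agents)
            if rng.random() < 0.7:
                sentiment[i] = sentiment[j]      # imitate a random peer
            elif rng.random() < 0.1:
                sentiment[i] = -sentiment[i]     # occasional independent flip
        excess_demand = sum(sentiment) / n_agents
        price *= 1.0 + 0.01 * excess_demand      # linear price impact
        prices.append(price)
    return prices

prices = simulate_market()
```

Even this crude rule set produces bursts of correlated buying and selling; the models the review covers replace the arbitrary imitation and impact parameters with values inferred from historical market and internet-query data.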
NASA Astrophysics Data System (ADS)
Sell, K.; Herbert, B.; Schielack, J.
2004-05-01
Students organize scientific knowledge and reason about environmental issues through the manipulation of mental models. The nature of the environmental sciences, which focus on the study of complex, dynamic systems, may present cognitive difficulties to students in their development of authentic, accurate mental models of environmental systems. This inquiry project seeks to develop and assess the coupling of information technology (IT)-based learning with physical models in order to foster rich mental model development of environmental systems in undergraduate geoscience students. The manipulation of multiple representations, the development and testing of conceptual models based on available evidence, and exposure to authentic, complex, and ill-constrained problems were the components of investigation used to reach the learning goals. Upper-level undergraduate students enrolled in an environmental geology course at Texas A&M University participated in this research, which served as a pilot study. Rubric evaluations interpreted by principal component analysis suggest that students' understanding of the nature of scientific inquiry is limited and that the ability to cross scales and link systems proved problematic. Results were categorized into content knowledge and cognitive processes, with reasoning, critical thinking, and cognitive load the driving factors behind difficulties in student learning. Students' mental models revealed multiple misconceptions and lacked the complexity and completeness needed to represent the studied systems. Further, the positive learning impacts of the implemented modules favored the physical model over the IT-based learning projects, likely due to cognitive-load issues. This study illustrates the need to better understand student difficulties in solving complex problems when using IT, so that appropriate scaffolding can be implemented to enhance student learning of the earth system sciences.
NASA Astrophysics Data System (ADS)
Dirnbeck, Matthew R.
Biological systems pose a challenge both for learners and teachers because they are complex systems mediated by feedback loops; networks of cause-effect relationships; and non-linear, hierarchical, and emergent properties. Teachers and scientists routinely use models to communicate ideas about complex systems. Model-based pedagogies engage students in model construction as a means of practicing higher-order reasoning skills. One such modeling paradigm describes systems in terms of their structures, behaviors, and functions (SBF). The SBF framework is a simple modeling language that has been used to teach about complex biological systems. Here, we used student-generated SBF models to assess students' causal reasoning in the context of a novel biological problem on an exam. We compared students' performance on the modeling problem, their performance on a set of knowledge/comprehension questions, and their performance on a set of scientific reasoning questions. We found that students who performed well on knowledge and understanding questions also constructed more networked, higher quality models. Previous studies have shown that learners' mental maps increase in complexity with increased expertise. We wanted to investigate if biology students with varying levels of training in biology showed a similar pattern when constructing system models. In a pilot study, we administered the same modeling problem to two additional groups of students: 1) an animal physiology course for students pursuing a major in biology (n=37) and 2) an exercise physiology course for non-majors (n=27). We found that there was no significant difference in model organization across the three student populations, but there was a significant difference in the ability to represent function between the three populations. 
Among the three groups, the non-majors had the lowest function scores, the introductory majors the middle function scores, and the upper-division majors the highest function scores.
From Cybernetics to Plectics: A Practical Approach to Systems Enquiry in Engineering
NASA Astrophysics Data System (ADS)
Pátkai, Béla; Tar, József K.; Rudas, Imre J.
The most prominent systems theories of the 20th century are reviewed in this chapter, and support is given to the arguments of complex-system theorists who use the term “plectics” in place of the overused and ambiguous “systems science” and “systems theory”. It is claimed that the measurement of complex systems cannot be separated from their modelling, as the boundaries between the specific steps of the scientific method are necessarily blurred. A critical and extended interpretation of the complex-system modelling method is provided, and the importance of discipline-specific paradigms and their systematic interdisciplinary transfer is proposed.
QMU as an approach to strengthening the predictive capabilities of complex models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gray, Genetha Anne.; Boggs, Paul T.; Grace, Matthew D.
2010-09-01
Complex systems are made up of multiple interdependent parts, and the behavior of the entire system cannot always be directly inferred from the behavior of the individual parts: such systems are nonlinear, and system responses are not necessarily additive. Examples of complex systems include energy, cyber, and telecommunication infrastructures; human and animal social structures; and biological structures such as cells. To meet the goals of infrastructure development, maintenance, and protection for cyber-related complex systems, novel modeling and simulation (M&S) technology is needed. Sandia has shown success using M&S in the nuclear weapons (NW) program. However, complex systems represent a significant challenge and a relative departure from classical M&S exercises, and many of the scientific and mathematical M&S processes must be re-envisioned. Specifically, in the NW program, requirements and acceptable margins for performance, resilience, and security are well-defined and given quantitatively from the start. The Quantification of Margins and Uncertainties (QMU) process helps to assess whether these safety, reliability, and performance requirements have been met after a system has been developed. In this sense, QMU is used as a check that requirements have been met once the development process is completed. In contrast, performance requirements and margins may not have been defined a priori for many complex systems (e.g., the Internet, electrical distribution grids), particularly not in quantitative terms. This project addresses this fundamental difference by investigating the use of QMU at the start of the design process for complex systems. Three major tasks were completed. First, the characteristics of the cyber infrastructure problem were collected and considered in the context of QMU-based tools. Second, UQ methodologies for the quantification of model discrepancies were considered in the context of statistical models of cyber activity.
Third, Bayesian methods for optimal testing in the QMU framework were developed. The completion of this project represents an increased understanding of how to apply and use the QMU process as a means of improving model predictions of the behavior of complex systems.
ERIC Educational Resources Information Center
Guevara, Porfirio
2014-01-01
This article identifies elements and connections that seem to be relevant to explain persistent aggregate behavioral patterns in educational systems when using complex dynamical systems modeling and simulation approaches. Several studies have shown what factors are at play in educational fields, but confusion still remains about the underlying…
Transdisciplinary Application of Cross-Scale Resilience ...
The cross-scale resilience model was developed in ecology to explain the emergence of resilience from the distribution of ecological functions within and across scales, and as a tool to assess resilience. We propose that the model and the underlying discontinuity hypothesis are relevant to other complex adaptive systems, and can be used to identify and track changes in system parameters related to resilience. We explain the theory behind the cross-scale resilience model, review the cases where it has been applied to non-ecological systems, and discuss some examples of social-ecological, archaeological/anthropological, and economic systems where a cross-scale resilience analysis could add a quantitative dimension to our current understanding of system dynamics and resilience. We argue that the scaling and diversity parameters suitable for a resilience analysis of ecological systems are appropriate for a broad suite of systems where non-normative quantitative assessments of resilience are desired. Our planet is currently characterized by fast environmental and social change, and the cross-scale resilience model has the potential to quantify resilience across many types of complex adaptive systems. Comparative analyses of complex systems have, in fact, demonstrated commonalities among distinctly different types of systems (Schneider & Kay 1994; Holling 2001; Lansing 2003; Foster 2005; Bullmore et al. 2009). Both biological and non-biological complex systems appear t
A Chemical Engineer's Perspective on Health and Disease
Androulakis, Ioannis P.
2014-01-01
Chemical process systems engineering considers complex supply chains, which are coupled networks of dynamically interacting systems. The quest to optimize the supply chain while meeting robustness and flexibility constraints in the face of ever-changing environments necessitated the development of theoretical and computational tools for the analysis, synthesis, and design of such complex engineered architectures. However, it was realized early on that optimality is a complex characteristic, required to achieve a proper balance between multiple, often competing, objectives. As we begin to unravel life's intricate complexities, we realize that living systems share similar structural and dynamic characteristics; hence much can be learned about biological complexity from engineered systems. In this article, we draw analogies between concepts in process systems engineering and conceptual models of health and disease; establish connections between these concepts and physiologic modeling; and describe how these mirror onto the physiological counterparts of engineered systems. PMID:25506103
McGovern, Eimear; Kelleher, Eoin; Snow, Aisling; Walsh, Kevin; Gadallah, Bassem; Kutty, Shelby; Redmond, John M; McMahon, Colin J
2017-09-01
In recent years, three-dimensional printing has demonstrated reliable reproducibility of several organs including hearts with complex congenital cardiac anomalies. This represents the next step in advanced image processing and can be used to plan surgical repair. In this study, we describe three children with complex univentricular hearts and abnormal systemic or pulmonary venous drainage, in whom three-dimensional printed models based on CT data assisted with preoperative planning. For two children, after group discussion and examination of the models, a decision was made not to proceed with surgery. We extend the current clinical experience with three-dimensional printed modelling and discuss the benefits of such models in the setting of managing complex surgical problems in children with univentricular circulation and abnormal systemic or pulmonary venous drainage.
A Molecular Dynamic Modeling of Hemoglobin-Hemoglobin Interactions
NASA Astrophysics Data System (ADS)
Wu, Tao; Yang, Ye; Sheldon Wang, X.; Cohen, Barry; Ge, Hongya
2010-05-01
In this paper, we present a study of hemoglobin-hemoglobin interaction with model reduction methods. We begin with a simple spring-mass system with given parameters (mass and stiffness). With this known system, we compare the mode superposition method with Singular Value Decomposition (SVD)-based Principal Component Analysis (PCA). Through PCA we are able to recover the principal direction of this system, namely the modal direction. This modal direction will be matched with the eigenvector derived from mode superposition analysis. The same technique will be implemented in a much more complicated hemoglobin-hemoglobin molecule interaction model, in which thousands of atoms in hemoglobin molecules are coupled with tens of thousands of T3 water molecule models. In this model, complex inter-atomic and inter-molecular potentials are replaced by nonlinear springs. We employ the same method to get the most significant modes and their frequencies of this complex dynamical system. More complex physical phenomena can then be further studied by these coarse-grained models.
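The first step the abstract describes, checking that the PCA principal direction of simulated motion matches a known vibration mode, can be sketched in Python for a two-mass, fixed-fixed spring chain. This is a hypothetical stand-in for the paper's spring-mass system; all parameter values are invented.

```python
import math

def eig_sym2(a, b, c):
    """Eigenpairs (descending eigenvalue) of the symmetric 2x2 matrix [[a, b], [b, c]]."""
    tr = a + c
    disc = math.sqrt((a - c) ** 2 + 4 * b * b)
    pairs = []
    for lam in ((tr + disc) / 2, (tr - disc) / 2):
        if abs(b) > 1e-12:
            vx, vy = b, lam - a      # (A - lam*I) v = 0  =>  v proportional to (b, lam - a)
        else:
            vx, vy = (1.0, 0.0) if abs(lam - a) < abs(lam - c) else (0.0, 1.0)
        n = math.hypot(vx, vy)
        pairs.append((lam, (vx / n, vy / n)))
    return pairs

# Two unit masses between fixed walls, unit springs: stiffness K = [[2, -1], [-1, 2]]
modes = eig_sym2(2.0, -1.0, 2.0)     # eigenvalues 3 and 1
low_freq, low_vec = modes[1]         # lowest mode: in-phase shape (1, 1)/sqrt(2)

# Free vibration started purely in the lowest mode (omega = sqrt(lam/m) = 1)
samples = []
for i in range(200):
    amp = math.cos(1.0 * i * 0.05)
    samples.append((amp / math.sqrt(2), amp / math.sqrt(2)))

# PCA: the principal direction is the dominant eigenvector of the sample covariance
n = len(samples)
mx = sum(s[0] for s in samples) / n
my = sum(s[1] for s in samples) / n
cxx = sum((s[0] - mx) ** 2 for s in samples) / n
cyy = sum((s[1] - my) ** 2 for s in samples) / n
cxy = sum((s[0] - mx) * (s[1] - my) for s in samples) / n
_, pc1 = eig_sym2(cxx, cxy, cyy)[0]

# The principal component aligns (up to sign) with the excited mode shape
alignment = abs(pc1[0] * low_vec[0] + pc1[1] * low_vec[1])
```

Because the trajectory is excited purely in the lowest mode, the dominant principal component aligns with the in-phase mode shape; this is the kind of agreement the mode-superposition/PCA comparison checks for before moving to the full molecular model.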
Making classical ground-state spin computing fault-tolerant.
Crosson, I J; Bacon, D; Brown, K R
2010-09-01
We examine a model of classical deterministic computing in which the ground state of the classical system is a spatial history of the computation. This model is relevant to quantum dot cellular automata as well as to recent universal adiabatic quantum computing constructions. In its most primitive form, systems constructed in this model cannot compute in an error-free manner when working at nonzero temperature. However, by exploiting a mapping between the partition function for this model and probabilistic classical circuits we are able to show that it is possible to make this model effectively error-free. We achieve this by using techniques in fault-tolerant classical computing and the result is that the system can compute effectively error-free if the temperature is below a critical temperature. We further link this model to computational complexity and show that a certain problem concerning finite temperature classical spin systems is complete for the complexity class Merlin-Arthur. This provides an interesting connection between the physical behavior of certain many-body spin systems and computational complexity.
Snowden, Thomas J; van der Graaf, Piet H; Tindall, Marcus J
2017-07-01
Complex models of biochemical reaction systems have become increasingly common in the systems biology literature. The complexity of such models can present a number of obstacles for their practical use, often making problems difficult to intuit or computationally intractable. Methods of model reduction can be employed to alleviate the issue of complexity by seeking to eliminate those portions of a reaction network that have little or no effect upon the outcomes of interest, hence yielding simplified systems that retain an accurate predictive capacity. This review paper seeks to provide a brief overview of a range of such methods and their application in the context of biochemical reaction network models. To achieve this, we provide a brief mathematical account of the main methods including timescale exploitation approaches, reduction via sensitivity analysis, optimisation methods, lumping, and singular value decomposition-based approaches. Methods are reviewed in the context of large-scale systems biology type models, and future areas of research are briefly discussed.
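As a concrete illustration of one family the review covers, timescale-exploitation (quasi-steady-state) reduction, the following Python sketch compares a full A → B → C chain with a fast intermediate against its reduced one-step model. The rate constants and step sizes are invented; this is not an example taken from the review.

```python
def full_model(k1, k2, a0, dt, steps):
    """A -> B -> C with a fast intermediate B (k2 >> k1), explicit Euler."""
    a, b, c = a0, 0.0, 0.0
    for _ in range(steps):
        da = -k1 * a
        db = k1 * a - k2 * b
        dc = k2 * b
        a, b, c = a + da * dt, b + db * dt, c + dc * dt
    return a, b, c

def reduced_model(k1, a0, dt, steps):
    """Quasi-steady-state reduction: B is eliminated (B* = k1*A/k2), leaving A -> C."""
    a, c = a0, 0.0
    for _ in range(steps):
        flux = k1 * a * dt
        a, c = a - flux, c + flux
    return a, c

k1, k2, dt, steps = 1.0, 100.0, 0.001, 2000   # integrate to t = 2
a_full, b_full, c_full = full_model(k1, k2, 1.0, dt, steps)
a_red, c_red = reduced_model(k1, 1.0, dt, steps)
```

The reduced model drops one state variable and the stiff fast timescale, yet tracks the full model's product concentration to within roughly k1/k2, which is the sense in which a reduced system "retains an accurate predictive capacity."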
The Curriculum Prerequisite Network: Modeling the Curriculum as a Complex System
ERIC Educational Resources Information Center
Aldrich, Preston R.
2015-01-01
This article advances the prerequisite network as a means to visualize the hidden structure in an academic curriculum. Networks have been used to represent a variety of complex systems ranging from social systems to biochemical pathways and protein interactions. Here, I treat the academic curriculum as a complex system with nodes representing…
Lightweight approach to model traceability in a CASE tool
NASA Astrophysics Data System (ADS)
Vileiniskis, Tomas; Skersys, Tomas; Pavalkis, Saulius; Butleris, Rimantas; Butkiene, Rita
2017-07-01
The term "model-driven" is by no means a new buzzword within the system development community. Nevertheless, the ever-increasing complexity of model-driven approaches keeps fueling discussions around this paradigm and pushes researchers to develop new and more effective approaches to system development. With this increasing complexity, model traceability, and model management as a whole, become indispensable activities of the model-driven system development process. The main goal of this paper is to present the conceptual design and implementation of a practical lightweight approach to model traceability in a CASE tool.
NASA Astrophysics Data System (ADS)
Hrachowitz, M.; Fovet, O.; Ruiz, L.; Euser, T.; Gharari, S.; Nijzink, R.; Freer, J.; Savenije, H. H. G.; Gascuel-Odoux, C.
2014-09-01
Hydrological models frequently suffer from limited predictive power despite adequate calibration performances. This can indicate insufficient representations of the underlying processes. Thus, ways are sought to increase model consistency while satisfying the contrasting priorities of increased model complexity and limited equifinality. In this study, the value of a systematic use of hydrological signatures and expert knowledge for increasing model consistency was tested. It was found that a simple conceptual model, constrained by four calibration objective functions, was able to adequately reproduce the hydrograph in the calibration period. The model, however, could not reproduce a suite of hydrological signatures, indicating a lack of model consistency. Subsequently, testing 11 models, model complexity was increased in a stepwise way and counter-balanced by "prior constraints," inferred from expert knowledge to ensure a model which behaves well with respect to the modeler's perception of the system. We showed that, in spite of unchanged calibration performance, the most complex model setup exhibited increased performance in the independent test period and skill to better reproduce all tested signatures, indicating a better system representation. The results suggest that a model may be inadequate despite good performance with respect to multiple calibration objectives and that increasing model complexity, if counter-balanced by prior constraints, can significantly increase predictive performance of a model and its skill to reproduce hydrological signatures. The results strongly illustrate the need to balance automated model calibration with a more expert-knowledge-driven strategy of constraining models.
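One way to make the notion of "hydrological signatures" concrete: a signature is a summary statistic of the hydrograph evaluated independently of the calibration objective. A toy Python sketch (with invented synthetic flows, not the study's models or catchment data) shows how two models with identical mean flow can differ on a high-flow signature:

```python
def flow_duration_value(flows, exceedance):
    """Flow exceeded `exceedance` fraction of the time (a common signature)."""
    ranked = sorted(flows, reverse=True)
    idx = min(int(exceedance * len(ranked)), len(ranked) - 1)
    return ranked[idx]

# observed hydrograph and two synthetic model outputs with the same total runoff
obs     = [5, 4, 3, 9, 7, 3, 2, 2, 6, 4]
model_a = [4, 4, 4, 8, 7, 3, 3, 2, 6, 4]   # tracks peaks and recessions
model_b = [4.5] * 10                        # matches the mean flow only

q_high_obs = flow_duration_value(obs, 0.2)  # high-flow end of the duration curve
q_high_a = flow_duration_value(model_a, 0.2)
q_high_b = flow_duration_value(model_b, 0.2)
```

Both models reproduce the water balance, but only model_a reproduces the high-flow signature; a calibration objective based on mean behavior alone cannot tell them apart, which is why testing a suite of signatures can expose a lack of model consistency.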
Light, John M; Jason, Leonard A; Stevens, Edward B; Callahan, Sarah; Stone, Ariel
2016-03-01
The complex system conception of group social dynamics often involves not only changing individual characteristics, but also changing within-group relationships. Recent advances in stochastic dynamic network modeling allow these interdependencies to be modeled from data. This methodology is discussed within a context of other mathematical and statistical approaches that have been or could be applied to study the temporal evolution of relationships and behaviors within small- to medium-sized groups. An example model is presented, based on a pilot study of five Oxford House recovery homes, sober living environments for individuals following release from acute substance abuse treatment. This model demonstrates how dynamic network modeling can be applied to such systems, examines and discusses several options for pooling, and shows how results are interpreted in line with complex system concepts. Results suggest that this approach (a) is a credible modeling framework for studying group dynamics even with limited data, (b) improves upon the most common alternatives, and (c) is especially well-suited to complex system conceptions. Continuing improvements in stochastic models and associated software may finally lead to mainstream use of these techniques for the study of group dynamics, a shift already occurring in related fields of behavioral science.
Complexity reduction of biochemical rate expressions.
Schmidt, Henning; Madsen, Mads F; Danø, Sune; Cedersund, Gunnar
2008-03-15
The current trend in dynamical modelling of biochemical systems is to construct more and more mechanistically detailed and thus complex models. The complexity is reflected in the number of dynamic state variables and parameters, as well as in the complexity of the kinetic rate expressions. However, a greater level of complexity, or level of detail, does not necessarily imply better models, or a better understanding of the underlying processes. Data often does not contain enough information to discriminate between different model hypotheses, and such overparameterization makes it hard to establish the validity of the various parts of the model. Consequently, there is an increasing demand for model reduction methods. We present a new reduction method that reduces complex rational rate expressions, such as those often used to describe enzymatic reactions. The method is a novel term-based identifiability analysis, which is easy to use and allows for user-specified reductions of individual rate expressions in complete models. The method is one of the first methods to meet the classical engineering objective of improved parameter identifiability without losing the systems biology demand of preserved biochemical interpretation. The method has been implemented in the Systems Biology Toolbox 2 for MATLAB, which is freely available from http://www.sbtoolbox2.org. The Supplementary Material contains scripts that show how to use it by applying the method to the example models, discussed in this article.
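A minimal illustration of the general idea of dropping negligible terms from a rational rate expression, using the textbook Michaelis-Menten form rather than the paper's term-based identifiability method itself: when the substrate concentration is far below Km, the s term in the denominator contributes little and the rate collapses to first order.

```python
def mm_rate(vmax, km, s):
    """Full Michaelis-Menten rational rate expression."""
    return vmax * s / (km + s)

def mm_rate_linear(vmax, km, s):
    """Reduced form after dropping the s term in the denominator (valid for s << km)."""
    return (vmax / km) * s

vmax, km = 2.0, 1.0
s = km / 100   # regime where the reduction is valid
rel_err = abs(mm_rate_linear(vmax, km, s) - mm_rate(vmax, km, s)) / mm_rate(vmax, km, s)
# rel_err = s/km = 1% here; at s = 10*km the same reduction is off by a factor of 11
```

The reduced form has one identifiable parameter (vmax/km) instead of two, which is the kind of identifiability gain such reductions aim for, while the dropped term preserves a biochemical interpretation: the enzyme is far from saturation.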
NASA Astrophysics Data System (ADS)
Koh, E. H.; Lee, E.; Kaown, D.; Lee, K. K.; Green, C. T.
2017-12-01
Timing and magnitudes of nitrate contamination are determined by various factors, such as contaminant loading, recharge characteristics, and the geologic system. Information on the time elapsed since recharged water traveled to a given outlet location, defined as groundwater age, can provide an indirect interpretation of the hydrologic characteristics of the aquifer system. There are three major methods (apparent ages, lumped-parameter models, and numerical models) for dating groundwater, which differ in how they characterize the groundwater mixing that results from the various flow pathways in a heterogeneous aquifer system. In this study, we therefore compared the three age models in a complex aquifer system, using observed age-tracer data and a history of nitrate contamination reconstructed from long-term source loading. The 3H-3He and CFC-12 apparent ages, which do not account for groundwater mixing, yielded the most delayed response times, with the peak period of nitrate loading not yet arrived. The lumped-parameter model, in contrast, generated a more recent loading response, with the peak loading period influencing water quality. The numerical model could delineate the various groundwater mixing components and their different impacts on nitrate dynamics in the complex aquifer system. The different age-estimation methods thus lead to variations in the estimated contaminant-loading history, with the discrepancies most pronounced in the complex aquifer system.
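The qualitative difference between an apparent-age (piston-flow) interpretation and a lumped-parameter mixing model can be sketched in Python. This is an illustrative toy with an invented loading history and travel times, not the study's calibrated models: piston flow simply lags the input, while an exponential transit-time distribution mixes all past inputs, so it responds to a loading change earlier but more gradually.

```python
import math

def piston_flow(inputs, lag):
    """Apparent-age view: output is the input delayed by a single travel time."""
    return [inputs[max(t - lag, 0)] for t in range(len(inputs))]

def exponential_mixing(inputs, mean_age):
    """Lumped-parameter exponential model: output is a weighted average of all
    past inputs, with transit-time weights g(tau) = exp(-tau/T)/T."""
    out = []
    for t in range(len(inputs)):
        w_sum, c_sum = 0.0, 0.0
        for tau in range(t + 1):
            w = math.exp(-tau / mean_age) / mean_age
            w_sum += w
            c_sum += w * inputs[t - tau]
        out.append(c_sum / w_sum)
    return out

# step increase in nitrate loading at year 10 (arbitrary units)
loading = [0.0] * 10 + [1.0] * 40
piston = piston_flow(loading, lag=15)
mixed = exponential_mixing(loading, mean_age=15.0)
```

At year 20 the piston-flow output is still zero (the front has not arrived), while the mixing model already shows a partial response and is still below the full loading at year 49, mirroring the delayed-versus-gradual responses the study contrasts.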
Hybrid modeling and empirical analysis of automobile supply chain network
NASA Astrophysics Data System (ADS)
Sun, Jun-yan; Tang, Jian-ming; Fu, Wei-ping; Wu, Bing-ying
2017-05-01
Based on a connection mechanism in which nodes automatically select upstream and downstream agents, a simulation model of the dynamic evolutionary process of a consumer-driven automobile supply chain is established by integrating ABM and discrete modeling on a GIS-based map. First, the model's soundness is demonstrated by analyzing the consistency of sales, and of changes in various agent parameters, between the simulation model and a real automobile supply chain. Second, using complex network theory, the hierarchical structures of the model and the relationships among networks at different levels are analyzed to calculate characteristic parameters such as mean distance, mean clustering coefficient, and degree distribution; this verifies that the model is a typical scale-free and small-world network. Finally, the motion law of the model is analyzed from the perspective of complex adaptive systems, and the chaotic state of the simulation system is verified, which suggests that the system has typical nonlinear characteristics. This model not only macroscopically illustrates the dynamic evolution of the complex networks of an automobile supply chain but also microscopically reflects the business process of each agent. Moreover, constructing and simulating the system by combining CAS theory with complex networks supplies a novel method for supply chain analysis, as well as a theoretical basis and practical experience for the supply chain analysis of auto companies.
ERIC Educational Resources Information Center
Holder, Lauren N.; Scherer, Hannah H.; Herbert, Bruce E.
2017-01-01
Engaging students in problem-solving concerning environmental issues in near-surface complex Earth systems involves developing student conceptualization of the Earth as a system and applying that scientific knowledge to the problems using practices that model those used by professionals. In this article, we review geoscience education research…
1991 Annual report on scientific programs: A broad research program on the sciences of complexity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-01-01
1991 was a year of continued rapid growth for the Santa Fe Institute (SFI) as it broadened its interdisciplinary research into the organization, evolution, and operation of complex systems and searched deeply for the principles underlying their dynamic behavior. Research on complex systems--the focus of work at SFI--involves an extraordinary range of topics normally studied in seemingly disparate fields. Natural systems displaying complex behavior range upwards from proteins and DNA through cells and evolutionary systems to human societies. Research models exhibiting complexity include nonlinear equations, spin glasses, cellular automata, genetic algorithms, classifier systems, and an array of other computational models. Some of the major questions facing complex systems researchers are: (1) explaining how complexity arises from the nonlinear interaction of simple components, (2) describing the mechanisms underlying the high-level aggregate behavior of complex systems (such as the overt behavior of an organism, the flow of energy in an ecology, or the GNP of an economy), and (3) creating a theoretical framework that enables predictions about the likely behavior of such systems under various conditions. The importance of understanding such systems is enormous: many of the most serious challenges facing humanity--e.g., environmental sustainability, economic stability, the control of disease--as well as many of the hardest scientific questions--e.g., protein folding, the distinction between self and non-self in the immune system, the nature of intelligence, the origin of life--require a deep understanding of complex systems.
Deconstructing the core dynamics from a complex time-lagged regulatory biological circuit.
Eriksson, O; Brinne, B; Zhou, Y; Björkegren, J; Tegnér, J
2009-03-01
Complex regulatory dynamics is ubiquitous in molecular networks composed of genes and proteins. Recent progress in computational biology and its application to molecular data is generating a growing number of complex networks. Yet it has been difficult to understand the governing principles of these networks beyond graphical analysis or extensive numerical simulation. Here the authors exploit several simplifying biological circumstances that enable them to directly detect the underlying dynamical regularities driving periodic oscillations in a dynamical nonlinear computational model of a protein-protein network. System analysis is performed on the cell cycle, a mathematically well-described complex regulatory circuit driven by external signals. By introducing an explicit time delay and using a 'tearing-and-zooming' approach, the authors reduce the system to a piecewise-linear system with two variables that captures the dynamics of this complex network. A key step in the analysis is the identification of functional subsystems through the relations between state variables within the model. These functional subsystems are referred to as dynamical modules, operating as sensitive switches in the original complex model. Using reduced mathematical representations of the subsystems, the authors derive explicit conditions on how the cell cycle dynamics depend on system parameters and can, for the first time, analyse and prove global conditions for system stability. The approach, which includes exploiting simplifying biological conditions, identifying dynamical modules, and mathematically reducing model complexity, may be applicable to other well-characterised biological regulatory circuits. [Includes supplementary material].
Connections Matter: Social Networks and Lifespan Health in Primate Translational Models
McCowan, Brenda; Beisner, Brianne; Bliss-Moreau, Eliza; Vandeleest, Jessica; Jin, Jian; Hannibal, Darcy; Hsieh, Fushing
2016-01-01
Humans live in societies full of rich and complex relationships that influence health. The ability to improve human health requires a detailed understanding of the complex interplay of biological systems that contribute to disease processes, including the mechanisms underlying the influence of social contexts on these biological systems. A longitudinal computational systems science approach provides methods uniquely suited to elucidate the mechanisms by which social systems influence health and well-being by investigating how they modulate the interplay among biological systems across the lifespan. In the present report, we argue that nonhuman primate social systems are sufficiently complex to serve as model systems allowing for the development and refinement of both analytical and theoretical frameworks linking social life to health. Ultimately, developing systems science frameworks in nonhuman primate models will speed discovery of the mechanisms that subserve the relationship between social life and human health. PMID:27148103
On Convergence of Development Costs and Cost Models for Complex Spaceflight Instrument Electronics
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Patel, Umeshkumar D.; Kasa, Robert L.; Hestnes, Phyllis; Brown, Tammy; Vootukuru, Madhavi
2008-01-01
Development costs of a few recent spaceflight instrument electrical and electronics subsystems have diverged from the predictions of the respective heritage cost models. The cost models used are Grass Roots, Price-H and Parametric Model. These cost models originated in the military and industry around 1970 and were successfully adopted and patched by NASA on a mission-by-mission basis for years. Recently, however, the complexity of new instruments has grown rapidly, by orders of magnitude. This is most obvious in the complexity of a representative spaceflight instrument's electronics data system. It is now required to perform intermediate processing of digitized data apart from conventional processing of science phenomenon signals from multiple detectors. This involves on-board instrument formatting of computational operands from raw data (for example, images), multi-million operations per second on large volumes of data in reconfigurable hardware (in addition to processing on a general-purpose embedded or standalone instrument flight computer), as well as making decisions for on-board system adaptation and resource reconfiguration. The instrument data system is now tasked to perform more functions, such as forming packets and instrument-level data compression of more than one data stream, which are traditionally performed by the spacecraft command and data handling system. It is furthermore required that the electronics box for new complex instruments be developed for single-digit-watt power consumption, small size and light weight, while delivering super-computing capabilities. The conflict between the actual development cost of newer complex instruments and the heritage cost model predictions for their electronics components seems irreconcilable. This paper addresses the conflict and an approach to its resolution by determining complexity parameters and a complexity index, and by using them in an enhanced cost model.
Mathematical modeling of physiological systems: an essential tool for discovery.
Glynn, Patric; Unudurthi, Sathya D; Hund, Thomas J
2014-08-28
Mathematical models are invaluable tools for understanding the relationships between components of a complex system. In the biological context, mathematical models help us understand the complex web of interrelations between various components (DNA, proteins, enzymes, signaling molecules etc.) in a biological system, gain better understanding of the system as a whole, and in turn predict its behavior in an altered state (e.g. disease). Mathematical modeling has enhanced our understanding of multiple complex biological processes like enzyme kinetics, metabolic networks, signal transduction pathways, gene regulatory networks, and electrophysiology. With recent advances in high throughput data generation methods, computational techniques and mathematical modeling have become even more central to the study of biological systems. In this review, we provide a brief history and highlight some of the important applications of modeling in biological systems with an emphasis on the study of excitable cells. We conclude with a discussion about opportunities and challenges for mathematical modeling going forward. In a larger sense, the review is designed to help answer a simple but important question that theoreticians frequently face from interested but skeptical colleagues on the experimental side: "What is the value of a model?" Copyright © 2014 Elsevier Inc. All rights reserved.
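As a concrete example of the excitable-cell models the review emphasizes, the classic FitzHugh-Nagumo system (a standard two-variable reduction of Hodgkin-Huxley dynamics, chosen here as a generic illustration rather than taken from this review) can be integrated in a few lines:

```python
# FitzHugh-Nagumo: a classic two-variable excitable-cell model
# (standard textbook parameters; forward-Euler integration).

def fitzhugh_nagumo(I=0.5, a=0.7, b=0.8, eps=0.08, dt=0.01, t_end=200.0):
    v, w = -1.0, 1.0
    vs = []
    for _ in range(int(t_end / dt)):
        dv = v - v ** 3 / 3 - w + I   # fast, voltage-like variable
        dw = eps * (v + a - b * w)    # slow recovery variable
        v += dt * dv
        w += dt * dw
        vs.append(v)
    return vs

vs = fitzhugh_nagumo()
```

At this stimulus level the resting state is unstable and the model fires repetitively, the kind of behaviour a modeler checks against experiment before trusting predictions in an altered (e.g. disease) state.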
NASA Astrophysics Data System (ADS)
Gusev, Anatoly; Diansky, Nikolay; Zalesny, Vladimir
2010-05-01
An original program complex (software suite) is proposed for the ocean circulation sigma-model developed at the Institute of Numerical Mathematics (INM), Russian Academy of Sciences (RAS). The complex can be used in various curvilinear orthogonal coordinate systems. In addition to the ocean circulation model, the complex contains a sea ice dynamics and thermodynamics model, as well as an original system for implementing atmospheric forcing on the basis of both prescribed meteorological data and atmospheric model results. The complex can be used as the oceanic block of an Earth climate model, as well as for solving scientific and practical problems concerning the World Ocean and its separate oceans and seas. The program complex runs effectively on parallel shared-memory computational systems and on contemporary personal computers. On the basis of the proposed complex, an ocean general circulation model (OGCM) was developed. The model is realized in a curvilinear orthogonal coordinate system obtained by conformal transformation of the standard geographical grid, which allows the system singularities to be located outside the integration domain. The horizontal resolution of the OGCM is 1 degree in longitude and 0.5 degree in latitude, with 40 non-uniform sigma-levels in depth. The model was integrated for 100 years starting from the Levitus January climatology, using a realistic atmospheric annual cycle calculated from the CORE datasets. The experimental results show that the model adequately reproduces the basic characteristics of large-scale World Ocean dynamics, in good agreement with both observational data and results of the best climatic OGCMs. This OGCM is used as the oceanic component of the new version of the climate system model (CSM) developed at INM RAS.
The latter is now ready for new numerical experiments on modelling climate and its change according to IPCC (Intergovernmental Panel on Climate Change) scenarios within the scope of CMIP-5 (Coupled Model Intercomparison Project). On the basis of the proposed complex, an eddy-resolving Pacific Ocean circulation model was also realized. The integration domain covers the Pacific from the Equator to the Bering Strait. The model's horizontal resolution is 0.125 degree, with 20 non-uniform sigma-levels in depth. The model adequately reproduces the large-scale structure of the circulation and its variability: Kuroshio meandering, ocean synoptic eddies, frontal zones, etc., and demonstrates the high variability of the Kuroshio. The distribution of a contaminant assumed to be discharged near Petropavlovsk-Kamchatsky was simulated. The results show the structure of the contaminant distribution and provide insight into the processes of hydrological field formation in the North-West Pacific.
Systems Engineering Metrics: Organizational Complexity and Product Quality Modeling
NASA Technical Reports Server (NTRS)
Mog, Robert A.
1997-01-01
Innovative organizational complexity and product quality models applicable to performance metrics for NASA-MSFC's Systems Analysis and Integration Laboratory (SAIL) missions and objectives are presented. An intensive research effort focuses on the synergistic combination of stochastic process modeling, nodal and spatial decomposition techniques, organizational and computational complexity, systems science and metrics, chaos, and proprietary statistical tools for accelerated risk assessment. This is followed by the development of a preliminary model, which is uniquely applicable and robust for quantitative purposes. Exercise of the preliminary model using a generic system hierarchy and the AXAF-I architectural hierarchy is provided. The Kendall test for positive dependence provides an initial verification and validation of the model. Finally, the research and development of the innovation is revisited, prior to peer review. This research and development effort results in near-term, measurable SAIL organizational and product quality methodologies, enhanced organizational risk assessment and evolutionary modeling results, and improved statistical quantification of SAIL productivity interests.
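The Kendall test mentioned for verification is built on the Kendall rank correlation; a bare-bones version of the tau statistic, applied here to invented complexity/defect scores (not the study's data), looks like this:

```python
# Kendall tau: (concordant - discordant pairs) / total pairs, in [-1, 1].
# Positive tau indicates positive dependence between the two rankings.

def kendall_tau(x, y):
    n = len(x)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

# Hypothetical scores for five subsystems
complexity = [1, 2, 3, 4, 5]
defects    = [1, 3, 2, 4, 5]
tau = kendall_tau(complexity, defects)   # one swapped pair out of ten
```

Here nine of the ten pairs are concordant, giving tau = 0.8, i.e. strong positive dependence.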
Visual Complexity in Orthographic Learning: Modeling Learning across Writing System Variations
ERIC Educational Resources Information Center
Chang, Li-Yun; Plaut, David C.; Perfetti, Charles A.
2016-01-01
The visual complexity of orthographies varies across writing systems. Prior research has shown that complexity strongly influences the initial stage of reading development: the perceptual learning of grapheme forms. This study presents a computational simulation that examines the degree to which visual complexity leads to grapheme learning…
The new challenges of multiplex networks: Measures and models
NASA Astrophysics Data System (ADS)
Battiston, Federico; Nicosia, Vincenzo; Latora, Vito
2017-02-01
What do societies, the Internet, and the human brain have in common? They are all examples of complex relational systems, whose emerging behaviours are largely determined by the non-trivial networks of interactions among their constituents, namely individuals, computers, or neurons, rather than only by the properties of the units themselves. In the last two decades, network scientists have proposed models of increasing complexity to better understand real-world systems. Only recently have we realised that multiplexity, i.e. the coexistence of several types of interactions among the constituents of a complex system, is responsible for substantial qualitative and quantitative differences in the type and variety of behaviours that a complex system can exhibit. As a consequence, multilayer and multiplex networks have become a hot topic in complexity science. Here we provide an overview of some of the measures proposed so far to characterise the structure of multiplex networks, and a selection of models aiming at reproducing those structural properties and quantifying their statistical significance. Focusing on a subset of relevant topics, this brief review is a quite comprehensive introduction to the most basic tools for the analysis of multiplex networks observed in the real world. The wide applicability of multiplex networks as a framework to model complex systems in different fields, from biology to the social sciences, and the colloquial tone of the paper will make it an interesting read for researchers working on both theoretical and experimental analysis of networked systems.
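Two of the most basic multiplex measures such overviews cover, the overlapping degree of a node (its degree summed over all layers) and its degree in the aggregated single-layer network, can be computed directly; the layers and nodes below are invented for illustration:

```python
# Overlapping degree (sum of per-layer degrees) vs. aggregated degree.
# Layer names, node names, and edges are all hypothetical.

def degrees(edges, nodes):
    d = {n: 0 for n in nodes}
    for u, v in edges:
        d[u] += 1
        d[v] += 1
    return d

nodes = ["a", "b", "c", "d"]
layers = {
    "friendship": [("a", "b"), ("b", "c")],
    "work":       [("a", "b"), ("a", "d")],
}

# Overlapping degree: sum the degree of each node over every layer.
overlap = {n: 0 for n in nodes}
for edges in layers.values():
    for n, k in degrees(edges, nodes).items():
        overlap[n] += k

# Aggregated network: an edge exists if it appears in any layer.
agg = set()
for edges in layers.values():
    agg.update(frozenset(e) for e in edges)
agg_degree = degrees([tuple(e) for e in agg], nodes)
```

The two measures disagree exactly where multiplexity matters: node "a" has overlapping degree 3 but aggregated degree 2, because the a-b tie repeats across layers.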
Qualitative analysis of a discrete thermostatted kinetic framework modeling complex adaptive systems
NASA Astrophysics Data System (ADS)
Bianca, Carlo; Mogno, Caterina
2018-01-01
This paper deals with the derivation of a new discrete thermostatted kinetic framework for the modeling of complex adaptive systems subjected to external force fields (nonequilibrium system). Specifically, in order to model nonequilibrium stationary states of the system, the external force field is coupled to a dissipative term (thermostat). The well-posedness of the related Cauchy problem is investigated thus allowing the new discrete thermostatted framework to be suitable for the derivation of specific models and the related computational analysis. Applications to crowd dynamics and future research directions are also discussed within the paper.
NASA Astrophysics Data System (ADS)
De Domenico, Manlio
2018-03-01
Biological systems, from a cell to the human brain, are inherently complex. A powerful representation of such systems, described by an intricate web of relationships across multiple scales, is provided by complex networks. Recently, several studies have highlighted that simple networks, obtained by aggregating or neglecting the temporal or categorical description of biological data, are not able to account for the richness of information characterizing biological systems. More complex models, namely multilayer networks, are needed to account for the interdependencies, often varying across time, of biological interacting units within a cell, a tissue or parts of an organism.
The Limitations of Model-Based Experimental Design and Parameter Estimation in Sloppy Systems
Tolman, Malachi; Thames, Howard D.; Mason, Kathy A.
2016-01-01
We explore the relationship among experimental design, parameter estimation, and systematic error in sloppy models. We show that the approximate nature of mathematical models poses challenges for experimental design in sloppy models. In many models of complex biological processes it is unknown which physical mechanisms must be included to explain system behaviors. As a consequence, models are often overly complex, with many practically unidentifiable parameters. Furthermore, which mechanisms are relevant or irrelevant varies among experiments. By selecting complementary experiments, experimental design may inadvertently make details that were omitted from the model become relevant. When this occurs, the model will have a large systematic error and fail to give a good fit to the data. We use a simple hyper-model of model error to quantify a model's discrepancy and apply it to two models of complex biological processes (EGFR signaling and DNA repair) with optimally selected experiments. We find that although parameters may be accurately estimated, the discrepancy in the model renders it less predictive than it was in the sloppy regime where systematic error is small. We introduce the concept of a sloppy system: a sequence of models of increasing complexity that become sloppy in the limit of microscopic accuracy. We explore the limits of accurate parameter estimation in sloppy systems and argue that identifying the underlying mechanisms controlling system behavior is better approached by considering a hierarchy of models of varying detail rather than by focusing on parameter estimation in a single model. PMID:27923060
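Sloppiness is usually diagnosed from the eigenvalue spectrum of the Fisher information matrix. A toy sketch (not the paper's EGFR or DNA-repair models; all numbers invented): fitting a sum of two exponentials with nearly equal rates yields eigenvalues spanning orders of magnitude, so some parameter combinations are practically unidentifiable.

```python
# Sloppy spectrum of a toy Fisher information matrix for
# y(t) = exp(-p1*t) + exp(-p2*t) with nearly equal rates.
import math

p1, p2 = 1.0, 1.1
times = [0.1 * k for k in range(1, 51)]

# Sensitivities dy/dp_i = -t * exp(-p_i * t), one row per time point
J = [(-t * math.exp(-p1 * t), -t * math.exp(-p2 * t)) for t in times]

# Fisher information F = J^T J, a symmetric 2x2 matrix
a = sum(r[0] * r[0] for r in J)
b = sum(r[0] * r[1] for r in J)
c = sum(r[1] * r[1] for r in J)

# Closed-form eigenvalues of [[a, b], [b, c]]
disc = math.sqrt((a - c) ** 2 + 4 * b * b)
lam_max = (a + c + disc) / 2
lam_min = (a + c - disc) / 2
ratio = lam_max / lam_min   # a large spread is the signature of sloppiness
```

The stiff direction (roughly p1 + p2) is well constrained by data; the sloppy direction (roughly p1 - p2) is hundreds of times less so.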
Formal Verification of Complex Systems based on SysML Functional Requirements
2014-12-23
Mehrpouyan, Hoda; Tumer, Irem Y.; Hoyle, Chris; Giannakopoulou, Dimitra
This work addresses the formal verification of functional requirements for the design of complex engineered systems. The proposed approach combines a SysML modeling approach, used to document and structure safety requirements, with methods and tools that support the integration of safety into the design solution.
System Dynamics Modeling of Transboundary Systems: The Bear River Basin Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerald Sehlke; Jake Jacobson
2005-09-01
System dynamics is a computer-aided approach to evaluating the interrelationships of different components and activities within complex systems. Recently, system dynamics models have been developed in areas such as policy design, biological and medical modeling, energy and environmental analysis, and various other areas in the natural and social sciences. The Idaho National Engineering and Environmental Laboratory, a multi-purpose national laboratory managed by the Department of Energy, has developed a system dynamics model in order to evaluate its utility for modeling large, complex hydrological systems. We modeled the Bear River Basin, a transboundary basin that includes portions of Idaho, Utah and Wyoming. We found that system dynamics modeling is very useful for integrating surface water and groundwater data and for simulating the interactions between these sources within a given basin. In addition, we found system dynamics modeling useful for integrating complex hydrologic data with other information (e.g., policy, regulatory and management criteria) to produce a decision support system. Such decision support systems can allow managers and stakeholders to better visualize the key hydrologic elements and management constraints in the basin, which enables them to better understand the system via the simulation of multiple "what-if" scenarios. Although system dynamics models can be developed to conduct traditional hydraulic/hydrologic surface water or groundwater modeling, we believe that their strength lies in their ability to quickly evaluate trends and cause-effect relationships in large-scale hydrological systems; to integrate disparate data; to incorporate output from traditional hydraulic/hydrologic models; and to integrate interdisciplinary data, information and criteria to support better management decisions.
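The stock-and-flow bookkeeping at the heart of such system dynamics models can be sketched with a single reservoir stock (all numbers are hypothetical; a real basin model couples many such stocks):

```python
# Minimal stock-and-flow sketch: one reservoir stock updated each month by
# seasonal inflow, a demand-driven release, and spill at capacity.
# All parameters are invented for illustration.

def simulate_reservoir(months=24, storage=500.0, capacity=1000.0):
    inflow = [80.0 + 40.0 * (m % 12 < 6) for m in range(months)]  # wet/dry halves
    history = []
    for m in range(months):
        demand = 70.0
        release = min(storage + inflow[m], demand)        # cannot release more than is available
        spill = max(0.0, storage + inflow[m] - release - capacity)
        storage = storage + inflow[m] - release - spill   # stock update
        history.append(storage)
    return history

hist = simulate_reservoir()
```

Running such a model under different demand or inflow assumptions is exactly the "what-if" exercise the abstract describes, just at a toy scale.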
Stochastic model simulation using Kronecker product analysis and Zassenhaus formula approximation.
Caglar, Mehmet Umut; Pal, Ranadip
2013-01-01
Probabilistic models are regularly applied in genetic regulatory network modeling to capture the stochastic behavior observed in the generation of biological entities such as mRNA or proteins. Several approaches, including Stochastic Master Equations and Probabilistic Boolean Networks, have been proposed to model the stochastic behavior in genetic regulatory networks. It is generally accepted that the Stochastic Master Equation is a fundamental model that can describe the system being investigated in fine detail, but applying it is enormously expensive computationally. On the other hand, the Probabilistic Boolean Network captures only the coarse-scale stochastic properties of the system without modeling the detailed interactions. We propose a new approximation of the stochastic master equation model that is able to capture the finer details of the modeled system, including bistabilities and oscillatory behavior, and yet has a significantly lower computational complexity. In this new method, we represent the system using tensors and derive an identity to exploit the sparse connectivity of regulatory targets for complexity reduction. The algorithm involves an approximation based on the Zassenhaus formula to represent the exponential of a sum of matrices as a product of matrices. We derive upper bounds on the expected error of the proposed model distribution as compared to the stochastic master equation model distribution. Simulation results of the application of the model to four different biological benchmark systems illustrate performance comparable to detailed stochastic master equation models but with considerably lower computational complexity. The results also demonstrate the reduced complexity of the new approach as compared to the commonly used Stochastic Simulation Algorithm for equivalent accuracy.
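The Zassenhaus step can be seen in miniature on 2x2 matrices: truncating at second order gives exp(t(A+B)) ≈ exp(tA) exp(tB) exp(-(t^2/2)[A,B]), where [A,B] = AB - BA. The pure-Python check below (a toy, unrelated to the paper's tensor representation) compares this against the naive splitting exp(tA) exp(tB):

```python
# Second-order Zassenhaus check on 2x2 matrices (toy illustration only).

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matexp(X, terms=30):
    """exp(X) via its Taylor series; fine for the small matrices used here."""
    result = [[1.0, 0.0], [0.0, 1.0]]
    term = [[1.0, 0.0], [0.0, 1.0]]
    for n in range(1, terms):
        term = [[v / n for v in row] for row in matmul(term, X)]   # X^n / n!
        result = [[result[i][j] + term[i][j] for j in range(2)] for i in range(2)]
    return result

def scale(X, s):
    return [[s * v for v in row] for row in X]

def err(X, Y):
    return max(abs(X[i][j] - Y[i][j]) for i in range(2) for j in range(2))

A = [[0.0, 1.0], [0.0, 0.0]]
B = [[0.0, 0.0], [1.0, 0.0]]
t = 0.1

AB, BA = matmul(A, B), matmul(B, A)
comm = [[AB[i][j] - BA[i][j] for j in range(2)] for i in range(2)]  # [A, B]
S = [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

exact = matexp(scale(S, t))                              # exp(t(A+B))
naive = matmul(matexp(scale(A, t)), matexp(scale(B, t))) # exp(tA) exp(tB)
zass = matmul(naive, matexp(scale(comm, -t * t / 2)))    # with the correction
```

The commutator correction cuts the splitting error from O(t^2) to O(t^3), which is the mechanism the paper exploits at scale.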
Guyon, Hervé; Falissard, Bruno; Kop, Jean-Luc
2017-01-01
Network Analysis is considered as a new method that challenges Latent Variable models in inferring psychological attributes. With Network Analysis, psychological attributes are derived from a complex system of components without the need to call on any latent variables. But the ontological status of psychological attributes is not adequately defined with Network Analysis, because a psychological attribute is both a complex system and a property emerging from this complex system. The aim of this article is to reappraise the legitimacy of latent variable models by engaging in an ontological and epistemological discussion on psychological attributes. Psychological attributes relate to the mental equilibrium of individuals embedded in their social interactions, as robust attractors within complex dynamic processes with emergent properties, distinct from physical entities located in precise areas of the brain. Latent variables thus possess legitimacy, because the emergent properties can be conceptualized and analyzed on the sole basis of their manifestations, without exploring the upstream complex system. However, in opposition with the usual Latent Variable models, this article is in favor of the integration of a dynamic system of manifestations. Latent Variables models and Network Analysis thus appear as complementary approaches. New approaches combining Latent Network Models and Network Residuals are certainly a promising new way to infer psychological attributes, placing psychological attributes in an inter-subjective dynamic approach. Pragmatism-realism appears as the epistemological framework required if we are to use latent variables as representations of psychological attributes. PMID:28572780
Role of Microenvironment in Glioma Invasion: What We Learned from In Vitro Models
Manini, Ivana; Caponnetto, Federica; Bartolini, Anna; Ius, Tamara; Mariuzzi, Laura; Di Loreto, Carla; Cesselli, Daniela
2018-01-01
The invasion properties of glioblastoma hamper radical surgery and are responsible for its recurrence. Understanding the invasion mechanisms is thus critical to devising new therapeutic strategies, and the creation of in vitro models that enable these mechanisms to be studied represents a crucial step. Since in vitro models are an over-simplification of the in vivo system, in recent years attempts have been made to increase the level of complexity of in vitro assays to create models that better mimic the behaviour of the cells in vivo. These levels of complexity involve: 1. the dimension of the system, moving from two-dimensional to three-dimensional models; 2. the use of microfluidic systems; 3. the use of mixed cultures of tumour cells and cells of the tumour micro-environment, in order to mimic the complex cross-talk between tumour cells and their micro-environment; and 4. the source of cells used, in an attempt to move from commercial lines to patient-based models. In this review, we summarize the evidence obtained exploring these different levels of complexity, highlighting the advantages and limitations of each system used. PMID:29300332
Hettinger, Lawrence J.; Kirlik, Alex; Goh, Yang Miang; Buckle, Peter
2015-01-01
Accurate comprehension and analysis of complex sociotechnical systems is a daunting task. Empirically examining, or simply envisioning the structure and behaviour of such systems challenges traditional analytic and experimental approaches as well as our everyday cognitive capabilities. Computer-based models and simulations afford potentially useful means of accomplishing sociotechnical system design and analysis objectives. From a design perspective, they can provide a basis for a common mental model among stakeholders, thereby facilitating accurate comprehension of factors impacting system performance and potential effects of system modifications. From a research perspective, models and simulations afford the means to study aspects of sociotechnical system design and operation, including the potential impact of modifications to structural and dynamic system properties, in ways not feasible with traditional experimental approaches. This paper describes issues involved in the design and use of such models and simulations and describes a proposed path forward to their development and implementation. Practitioner Summary: The size and complexity of real-world sociotechnical systems can present significant barriers to their design, comprehension and empirical analysis. This article describes the potential advantages of computer-based models and simulations for understanding factors that impact sociotechnical system design and operation, particularly with respect to process and occupational safety. PMID:25761227
NASA Astrophysics Data System (ADS)
Azougagh, Yassine; Benhida, Khalid; Elfezazi, Said
2016-02-01
In this paper, the focus is on studying the performance of complex systems in a supply chain context by developing a structured modelling approach based on the ASDI methodology (Analysis, Specification, Design and Implementation), combining Petri net modelling with simulation in ARENA. The linear approach typically followed for this kind of problem runs into modelling difficulties due to the complexity and the number of parameters of concern. The approach used in this work therefore structures the modelling so as to cover all aspects of the performance study. The structured modelling approach is first introduced and then applied to the case of an industrial system in the phosphate industry. The performance indicators obtained from the developed models made it possible to test the behaviour and fluctuations of this system and to develop improved models of the current situation. The paper also shows how the Arena software can be used to simulate complex systems effectively. The method in this research can be applied to investigate various improvement scenarios and their consequences before implementing them in reality.
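The Petri-net side of such a model reduces to a token game: a transition fires when each of its input places holds enough tokens. A minimal sketch with invented places and transitions (not the paper's phosphate plant model):

```python
# Minimal Petri-net token game: places hold tokens; a transition fires when
# every input place has enough tokens. Names are hypothetical.

def enabled(marking, pre):
    return all(marking[p] >= n for p, n in pre.items())

def fire(marking, pre, post):
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n           # consume input tokens
    for p, n in post.items():
        m[p] = m.get(p, 0) + n   # produce output tokens
    return m

# "load" moves raw material into a free machine; "process" yields product.
marking = {"raw": 2, "machine_free": 1, "in_machine": 0, "product": 0}
load = ({"raw": 1, "machine_free": 1}, {"in_machine": 1})
process = ({"in_machine": 1}, {"product": 1, "machine_free": 1})

trace = []
for _ in range(4):
    for name, (pre, post) in (("load", load), ("process", process)):
        if enabled(marking, pre):
            marking = fire(marking, pre, post)
            trace.append(name)
```

The run consumes both raw tokens and produces two product tokens, then deadlocks for lack of input, exactly the kind of behaviour and bottleneck such performance studies probe.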
Structural Behavioral Study on the General Aviation Network Based on Complex Network
NASA Astrophysics Data System (ADS)
Zhang, Liang; Lu, Na
2017-12-01
The general aviation system is an open and dissipative system with a complex structure and behavioral features. This paper establishes a system model and a network model for general aviation. We analyze integral attributes and individual attributes by applying complex network theory, and conclude that the general aviation network has influential enterprise factors and node relations. By applying degree distribution functions, we check whether the network exhibits the small-world effect, the scale-free property and the centrality property that a complex network should have, and show that the general aviation network is a complex network. Therefore, we propose to drive the evolution of the general aviation industrial chain toward a collaborative innovation cluster of advanced-form industries by strengthening the network multiplication effect, stimulating innovation performance and spanning structural-hole paths.
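The degree-distribution check described can be illustrated on a small invented hub-and-spoke network, where P(k) is the fraction of nodes with degree k:

```python
# Degree distribution of a small hypothetical route network with one hub.
from collections import Counter

edges = [("hub", x) for x in ("a", "b", "c", "d", "e")] + [("a", "b"), ("c", "d")]

deg = Counter()
for u, v in edges:
    deg[u] += 1
    deg[v] += 1

n = len(deg)
pk = {k: cnt / n for k, cnt in Counter(deg.values()).items()}  # P(k)
```

The heavy tail (one node of degree 5 among six nodes) is the signature a scale-free test looks for at real-network scale, where P(k) is fit against a power law.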
Restricted Complexity Framework for Nonlinear Adaptive Control in Complex Systems
NASA Astrophysics Data System (ADS)
Williams, Rube B.
2004-02-01
Control law adaptation that includes implicit or explicit adaptive state estimation can be a fundamental underpinning for the success of intelligent control in complex systems, particularly during subsystem failures, where vital system states and parameters can be impractical or impossible to measure directly. A practical algorithm is proposed for adaptive state filtering and control in nonlinear dynamic systems when the state equations are unknown or are too complex to model analytically. The state equations and the inverse plant model are approximated using neural networks. A framework for a neural-network-based nonlinear dynamic inversion control law is proposed as an extrapolation of the previously developed restricted-complexity methodology used to formulate the adaptive state filter. Examples of adaptive filter performance are presented for an SSME simulation with a high-pressure turbine failure, to support extrapolation to adaptive control problems.
Plant metabolic modeling: achieving new insight into metabolism and metabolic engineering.
Baghalian, Kambiz; Hajirezaei, Mohammad-Reza; Schreiber, Falk
2014-10-01
Models are used to represent aspects of the real world for specific purposes, and mathematical models have opened up new approaches in studying the behavior and complexity of biological systems. However, modeling is often time-consuming and requires significant computational resources for data development, data analysis, and simulation. Computational modeling has been successfully applied as an aid for metabolic engineering in microorganisms. But such model-based approaches have only recently been extended to plant metabolic engineering, mainly due to greater pathway complexity in plants and their highly compartmentalized cellular structure. Recent progress in plant systems biology and bioinformatics has begun to disentangle this complexity and facilitate the creation of efficient plant metabolic models. This review highlights several aspects of plant metabolic modeling in the context of understanding, predicting and modifying complex plant metabolism. We discuss opportunities for engineering photosynthetic carbon metabolism, sucrose synthesis, and the tricarboxylic acid cycle in leaves and oil synthesis in seeds and the application of metabolic modeling to the study of plant acclimation to the environment. The aim of the review is to offer a current perspective for plant biologists without requiring specialized knowledge of bioinformatics or systems biology. © 2014 American Society of Plant Biologists. All rights reserved.
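At the kinetic end of such models, a single Michaelis-Menten reaction S -> P is the standard building block; a forward-Euler sketch with hypothetical parameter values:

```python
# One Michaelis-Menten reaction S -> P, integrated with forward Euler.
# Vmax, Km, and the initial substrate level are invented for illustration.

def michaelis_menten(S0=10.0, Vmax=1.0, Km=2.0, dt=0.01, t_end=40.0):
    S, P = S0, 0.0
    for _ in range(int(t_end / dt)):
        v = Vmax * S / (Km + S)   # saturating reaction rate
        S -= dt * v
        P += dt * v
    return S, P

S, P = michaelis_menten()
```

Mass is conserved by construction (S + P stays at the initial substrate level), and by t = 40 nearly all substrate has been converted; genome-scale metabolic models stitch thousands of such rate laws (or their flux-balance abstractions) together.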
Sulis, William H
2017-10-01
Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear, correlation-based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of mean and/or variance and conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.
An Integrated Crustal Dynamics Simulator
NASA Astrophysics Data System (ADS)
Xing, H. L.; Mora, P.
2007-12-01
Numerical modelling offers an outstanding opportunity to gain an understanding of crustal dynamics and complex crustal system behaviour. This presentation describes our long-term, ongoing effort in finite-element-based computational modelling and software development to simulate interacting fault systems for earthquake forecasting. An R-minimum-strategy-based finite element computational model and software tool, PANDAS, for modelling three-dimensional nonlinear frictional contact behaviour between multiple deformable bodies with an arbitrarily-shaped contact element strategy has been developed by the authors. It builds up a virtual laboratory to simulate interacting fault systems, including crustal boundary conditions and various nonlinearities (e.g. from frictional contact, materials, geometry and thermal coupling). It has been successfully applied to large-scale computing of complex nonlinear phenomena in non-continuum media involving nonlinear frictional instability, multiple material properties and complex geometries on supercomputers, including the South Australia (SA) interacting fault system, the Southern California fault model and the Sumatra subduction model. It has also been extended to simulate the hot fractured rock (HFR) geothermal reservoir system, in collaboration with Geodynamics Ltd, which is constructing the first geothermal reservoir system in Australia, and to model tsunami generation induced by earthquakes. Both efforts are supported by the Australian Research Council.
A new decision sciences for complex systems.
Lempert, Robert J
2002-05-14
Models of complex systems can capture much useful information but can be difficult to apply to real-world decision-making because the type of information they contain is often inconsistent with that required for traditional decision analysis. New approaches, which use inductive reasoning over large ensembles of computational experiments, now make possible the systematic comparison of alternative policy options using models of complex systems. This article describes Computer-Assisted Reasoning, an approach to decision-making under conditions of deep uncertainty that is ideally suited to applying models of complex systems to policy analysis. The article demonstrates the approach on the policy problem of global climate change, with a particular focus on the role of technology policies in a robust, adaptive strategy for greenhouse gas abatement.
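The ensemble-comparison idea can be sketched concretely. The toy example below is an illustrative stand-in, not the article's actual procedure: the policy names and payoff numbers are invented, and minimax regret is just one criterion commonly used in this style of robust decision analysis.

```python
def minimax_regret(payoffs):
    """payoffs: dict mapping policy name -> list of payoffs, one per
    ensemble scenario. Returns (robust policy, regret per policy)."""
    n = len(next(iter(payoffs.values())))
    # Best achievable payoff in each scenario over all candidate policies
    best = [max(vals[s] for vals in payoffs.values()) for s in range(n)]
    # A policy's regret in a scenario is its shortfall from that best payoff;
    # its overall score is the worst such shortfall across the ensemble
    regret = {pol: max(best[s] - vals[s] for s in range(n))
              for pol, vals in payoffs.items()}
    return min(regret, key=regret.get), regret

# Invented example: three abatement strategies over four future scenarios
payoffs = {
    "aggressive": [5, 9, 4, 8],
    "moderate":   [7, 7, 6, 6],
    "adaptive":   [6, 8, 6, 7],
}
choice, regret = minimax_regret(payoffs)
print(choice, regret[choice])  # the robust choice minimizes worst-case regret
```

Note how the adaptive strategy, mediocre in every single scenario, wins once performance is compared across the whole ensemble.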
Recommended Research Directions for Improving the Validation of Complex Systems Models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vugrin, Eric D.; Trucano, Timothy G.; Swiler, Laura Painton
Improved validation for models of complex systems has been a primary focus over the past year for the Resilience in Complex Systems Research Challenge. This document describes a set of research directions that are the result of distilling those ideas into three categories of research -- epistemic uncertainty, strong tests, and value of information. The content of this document can be used to transmit valuable information to future research activities, update the Resilience in Complex Systems Research Challenge's roadmap, inform the upcoming FY18 Laboratory Directed Research and Development (LDRD) call and research proposals, and facilitate collaborations between Sandia and external organizations. The recommended research directions can provide topics for collaborative research, development of proposals, workshops, and other opportunities.
A discrete control model of PLANT
NASA Technical Reports Server (NTRS)
Mitchell, C. M.
1985-01-01
A model of the PLANT system using the discrete control modeling techniques developed by Miller is described. Discrete control models attempt to represent in a mathematical form how a human operator might decompose a complex system into simpler parts and how the control actions and system configuration are coordinated so that acceptable overall system performance is achieved. Basic questions include knowledge representation, information flow, and decision making in complex systems. The structure of the model is a general hierarchical/heterarchical scheme which structurally accounts for coordination and dynamic focus of attention. Mathematically, the discrete control model is defined in terms of a network of finite state systems. Specifically, the discrete control model accounts for how specific control actions are selected from information about the controlled system, the environment, and the context of the situation. The objective is to provide a plausible and empirically testable accounting and, if possible, explanation of control behavior.
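A network-of-finite-state-systems formulation can be illustrated with a small transition table mapping (mode, event) pairs to (next mode, action) pairs. The modes, events, and actions below are invented for illustration, not taken from the PLANT model itself.

```python
# Minimal sketch of a discrete control model as a finite-state system:
# a control action is selected from the current mode and the observed event.
TRANSITIONS = {
    # (mode, event): (next_mode, action)
    ("monitor",  "alarm"):   ("diagnose", "inspect_subsystem"),
    ("monitor",  "nominal"): ("monitor",  "no_action"),
    ("diagnose", "fault"):   ("repair",   "reconfigure"),
    ("diagnose", "nominal"): ("monitor",  "no_action"),
    ("repair",   "fixed"):   ("monitor",  "resume"),
}

def run(events, mode="monitor"):
    """Step the controller through an event sequence, collecting actions."""
    actions = []
    for ev in events:
        mode, act = TRANSITIONS[(mode, ev)]
        actions.append(act)
    return mode, actions

mode, acts = run(["nominal", "alarm", "fault", "fixed"])
print(mode, acts)
```

A hierarchical model composes several such machines, with higher-level machines selecting which lower-level machine currently holds the focus of attention.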
SPARK: A Framework for Multi-Scale Agent-Based Biomedical Modeling.
Solovyev, Alexey; Mikheev, Maxim; Zhou, Leming; Dutta-Moscato, Joyeeta; Ziraldo, Cordelia; An, Gary; Vodovotz, Yoram; Mi, Qi
2010-01-01
Multi-scale modeling of complex biological systems remains a central challenge in the systems biology community. A method of dynamic knowledge representation known as agent-based modeling enables the study of higher level behavior emerging from discrete events performed by individual components. With the advancement of computer technology, agent-based modeling has emerged as an innovative technique to model the complexities of systems biology. In this work, the authors describe SPARK (Simple Platform for Agent-based Representation of Knowledge), a framework for agent-based modeling specifically designed for systems-level biomedical model development. SPARK is a stand-alone application written in Java. It provides a user-friendly interface, and a simple programming language for developing Agent-Based Models (ABMs). SPARK has the following features specialized for modeling biomedical systems: 1) continuous space that can simulate real physical space; 2) flexible agent size and shape that can represent the relative proportions of various cell types; 3) multiple spaces that can concurrently simulate and visualize multiple scales in biomedical models; 4) a convenient graphical user interface. Existing ABMs of diabetic foot ulcers and acute inflammation were implemented in SPARK. Models of identical complexity were run in both NetLogo and SPARK; the SPARK-based models ran two to three times faster.
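SPARK itself is a Java application, but the core idea of discrete agents in continuous space is easy to sketch. The toy model below is invented for illustration (it is not one of the cited ABMs): circular agents random-walk through a bounded 2-D continuous space, the kind of substrate on which cell-scale rules would be layered.

```python
import random

class Cell:
    """An agent with a position and size in continuous 2-D space."""
    def __init__(self, x, y, radius=1.0):
        self.x, self.y, self.radius = x, y, radius

    def step(self, rng, width, height):
        # Random walk, clipped to the boundaries of the continuous space
        self.x = min(max(self.x + rng.uniform(-1, 1), 0), width)
        self.y = min(max(self.y + rng.uniform(-1, 1), 0), height)

def run(n_agents=50, n_steps=100, width=20.0, height=20.0, seed=42):
    rng = random.Random(seed)
    agents = [Cell(rng.uniform(0, width), rng.uniform(0, height))
              for _ in range(n_agents)]
    for _ in range(n_steps):
        for a in agents:
            a.step(rng, width, height)
    return agents

agents = run()
print(len(agents))
```

System-level behavior in a real biomedical ABM emerges once the per-agent `step` rules encode interactions (binding, signaling, death) rather than a plain random walk.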
ERIC Educational Resources Information Center
Berland, Matthew; Wilensky, Uri
2015-01-01
Both complex systems methods (such as agent-based modeling) and computational methods (such as programming) provide powerful ways for students to understand new phenomena. To understand how to effectively teach complex systems and computational content to younger students, we conducted a study in four urban middle school classrooms comparing…
Scope Complexity Options Risks Excursions (SCORE) Factor Mathematical Description.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gearhart, Jared Lee; Samberson, Jonell Nicole; Shettigar, Subhasini
The purpose of the Scope, Complexity, Options, Risks, Excursions (SCORE) model is to estimate the relative complexity of design variants of future warhead options, resulting in scores. SCORE factors extend this capability by providing estimates of complexity relative to a base system (i.e., all design options are normalized to one weapon system). First, a clearly defined set of scope elements for a warhead option is established. The complexity of each scope element is estimated by Subject Matter Experts (SMEs), including a level of uncertainty, relative to a specific reference system. When determining factors, complexity estimates for a scope element can be directly tied to the base system or chained together via comparable scope elements in a string of reference systems that ends with the base system. The SCORE analysis process is a growing multi-organizational Nuclear Security Enterprise (NSE) effort, under the management of the NA-12 led Enterprise Modeling and Analysis Consortium (EMAC). Historically, it has provided the data elicitation, integration, and computation needed to support the out-year Life Extension Program (LEP) cost estimates included in the Stockpile Stewardship Management Plan (SSMP).
1983-09-01
GENERAL ELECTROMAGNETIC MODEL FOR THE ANALYSIS OF COMPLEX SYSTEMS (GEMACS) Computer Code Documentation (Version 3). The BDM Corporation. Final technical report, February 1981 - July 1983. [Remainder of the scanned text is garbled; a recoverable fragment concerns the electric field at a segment observation point due to a source patch, with components in the t1 and t2 directions on the patch.]
NASA Technical Reports Server (NTRS)
Johnson, Sally C.; Boerschlein, David P.
1995-01-01
Semi-Markov models can be used to analyze the reliability of virtually any fault-tolerant system. However, the process of delineating all the states and transitions in a complex system model can be devastatingly tedious and error prone. The Abstract Semi-Markov Specification Interface to the SURE Tool (ASSIST) computer program allows the user to describe the semi-Markov model in a high-level language. Instead of listing the individual model states, the user specifies the rules governing the behavior of the system, and these are used to generate the model automatically. A few statements in the abstract language can describe a very large, complex model. Because no assumptions are made about the system being modeled, ASSIST can be used to generate models describing the behavior of any system. The ASSIST program and its input language are described and illustrated by examples.
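The rule-based idea can be sketched as follows. This mimics the spirit of ASSIST's generation step, not its actual input language: transition rules for a hypothetical triplex processor system are written once, and the reachable state space is generated automatically instead of being listed by hand.

```python
from collections import deque

# A state is (working, failed_awaiting_repair) for 3 redundant processors.
def rules(state):
    working, failed = state
    if working > 0:                      # a working processor can fail
        yield (working - 1, failed + 1), "fail"
    if failed > 0:                       # a failed processor can be repaired
        yield (working + 1, failed - 1), "repair"

def generate(initial=(3, 0)):
    """Breadth-first expansion of the rules into states and transitions."""
    states, edges = {initial}, []
    queue = deque([initial])
    while queue:
        s = queue.popleft()
        for nxt, label in rules(s):
            edges.append((s, nxt, label))
            if nxt not in states:
                states.add(nxt)
                queue.append(nxt)
    return states, edges

states, edges = generate()
print(len(states), len(edges))
```

Four states and six transitions are generated from two rules; for realistic architectures the ratio of rules written to states generated is far more dramatic, which is the point of the approach.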
Recording information on protein complexes in an information management system
Savitsky, Marc; Diprose, Jonathan M.; Morris, Chris; Griffiths, Susanne L.; Daniel, Edward; Lin, Bill; Daenke, Susan; Bishop, Benjamin; Siebold, Christian; Wilson, Keith S.; Blake, Richard; Stuart, David I.; Esnouf, Robert M.
2011-01-01
The Protein Information Management System (PiMS) is a laboratory information management system (LIMS) designed for use with the production of proteins in a research environment. The software is distributed under the CCP4 licence, and so is available free of charge to academic laboratories. Like most LIMS, the underlying PiMS data model originally had no support for protein–protein complexes. To support the SPINE2-Complexes project the developers have extended PiMS to meet these requirements. The modifications to PiMS, described here, include data model changes, additional protocols, some user interface changes and functionality to detect when an experiment may have formed a complex. Example data are shown for the production of a crystal of a protein complex. Integration with SPINE2-Complexes Target Tracker application is also described. PMID:21605682
Multi-Agent-Based Simulation of a Complex Ecosystem of Mental Health Care.
Kalton, Alan; Falconer, Erin; Docherty, John; Alevras, Dimitris; Brann, David; Johnson, Kyle
2016-02-01
This paper discusses the creation of an Agent-Based Simulation that modeled the introduction of care coordination capabilities into a complex system of care for patients with Serious and Persistent Mental Illness. The model describes the engagement between patients and the medical, social and criminal justice services they interact with in a complex ecosystem of care. We outline the challenges involved in developing the model, including process mapping and the collection and synthesis of data to support parametric estimates, and describe the controls built into the model to support analysis of potential changes to the system. We also describe the approach taken to calibrate the model to an observable level of system performance. Preliminary results from application of the simulation are provided to demonstrate how it can provide insights into potential improvements deriving from introduction of care coordination technology.
Surface complexation modeling of Cu(II) adsorption on mixtures of hydrous ferric oxide and kaolinite
Lund, Tracy J; Koretsky, Carla M; Landry, Christopher J; Schaller, Melinda S; Das, Soumya
2008-01-01
Background The application of surface complexation models (SCMs) to natural sediments and soils is hindered by a lack of consistent models and data for large suites of metals and minerals of interest. Furthermore, the surface complexation approach has mostly been developed and tested for single solid systems. Few studies have extended the SCM approach to systems containing multiple solids. Results Cu adsorption was measured on pure hydrous ferric oxide (HFO), pure kaolinite (from two sources) and in systems containing mixtures of HFO and kaolinite over a wide range of pH, ionic strength, sorbate/sorbent ratios and, for the mixed solid systems, using a range of kaolinite/HFO ratios. Cu adsorption data measured for the HFO and kaolinite systems was used to derive diffuse layer surface complexation models (DLMs) describing Cu adsorption. Cu adsorption on HFO is reasonably well described using a 1-site or 2-site DLM. Adsorption of Cu on kaolinite could be described using a simple 1-site DLM with formation of a monodentate Cu complex on a variable charge surface site. However, for consistency with models derived for weaker sorbing cations, a 2-site DLM with a variable charge and a permanent charge site was also developed. Conclusion Component additivity predictions of speciation in mixed mineral systems based on DLM parameters derived for the pure mineral systems were in good agreement with measured data. Discrepancies between the model predictions and measured data were similar to those observed for the calibrated pure mineral systems. The results suggest that quantifying specific interactions between HFO and kaolinite in speciation models may not be necessary. However, before the component additivity approach can be applied to natural sediments and soils, the effects of aging must be further studied and methods must be developed to estimate reactive surface areas of solid constituents in natural samples. PMID:18783619
NASA Technical Reports Server (NTRS)
White, Allan L.; Palumbo, Daniel L.
1991-01-01
Semi-Markov processes have proved to be an effective and convenient tool for constructing models of systems that achieve reliability through redundancy and reconfiguration. These models can depict complex system architectures and capture the dynamics of fault arrival and system recovery. A disadvantage of the approach is that the models can be extremely large, which poses both a modeling and a computational problem, so techniques are needed to reduce model size. Because these systems are used in critical applications where failure can be expensive, there must be an analytically derived bound on the error produced by the model reduction technique. A model reduction technique called trimming is presented that can be applied to a popular class of systems. Automatic model generation programs were written to help the reliability analyst produce models of complex systems. Trimming is easy to implement and its error bound easy to compute; hence, the method lends itself to inclusion in an automatic model generator.
Modeling complex aquifer systems: a case study in Baton Rouge, Louisiana (USA)
NASA Astrophysics Data System (ADS)
Pham, Hai V.; Tsai, Frank T.-C.
2017-05-01
This study targets two challenges in groundwater model development: grid generation and model calibration for aquifer systems that are fluvial in origin. Realistic hydrostratigraphy can be developed using a large quantity of well log data to capture the complexity of an aquifer system. However, generating valid groundwater model grids to be consistent with the complex hydrostratigraphy is non-trivial. Model calibration can also become intractable for groundwater models that intend to match the complex hydrostratigraphy. This study uses the Baton Rouge aquifer system, Louisiana (USA), to illustrate a technical need to cope with grid generation and model calibration issues. A grid generation technique is introduced based on indicator kriging to interpolate 583 wireline well logs in the Baton Rouge area to derive a hydrostratigraphic architecture with fine vertical discretization. Then, an upscaling procedure is developed to determine a groundwater model structure with 162 layers that captures facies geometry in the hydrostratigraphic architecture. To handle model calibration for such a large model, this study utilizes a derivative-free optimization method in parallel computing to complete parameter estimation in a few months. The constructed hydrostratigraphy indicates the Baton Rouge aquifer system is fluvial in origin. The calibration result indicates hydraulic conductivity for Miocene sands is higher than that for Pliocene to Holocene sands and indicates the Baton Rouge fault and the Denham Springs-Scotlandville fault to be low-permeability leaky aquifers. The modeling result shows significantly low groundwater level in the "2,000-foot" sand due to heavy pumping, indicating potential groundwater upward flow from the "2,400-foot" sand.
Dobson, Ian; Carreras, Benjamin A; Lynch, Vickie E; Newman, David E
2007-06-01
We give an overview of a complex systems approach to large blackouts of electric power transmission systems caused by cascading failure. Instead of looking at the details of particular blackouts, we study the statistics and dynamics of series of blackouts with approximate global models. Blackout data from several countries suggest that the frequency of large blackouts is governed by a power law. The power law makes the risk of large blackouts consequential and is consistent with the power system being a complex system designed and operated near a critical point. Power system overall loading or stress relative to operating limits is a key factor affecting the risk of cascading failure. Power system blackout models and abstract models of cascading failure show critical points with power law behavior as load is increased. To explain why the power system is operated near these critical points and inspired by concepts from self-organized criticality, we suggest that power system operating margins evolve slowly to near a critical point and confirm this idea using a power system model. The slow evolution of the power system is driven by a steady increase in electric loading, economic pressures to maximize the use of the grid, and the engineering responses to blackouts that upgrade the system. Mitigation of blackout risk should account for dynamical effects in complex self-organized critical systems. For example, some methods of suppressing small blackouts could ultimately increase the risk of large blackouts.
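A toy load-transfer cascade, in the spirit of the abstract cascading-failure models the authors mention (though not their exact CASCADE or OPA formulations; all parameter values here are invented), shows how a single disturbance can produce failure events of wildly varying size when the system is loaded near its limits.

```python
import random

def cascade(n=100, load_transfer=0.03, rng=None):
    """One cascade: components carry random loads in [0, 1) and fail at 1.0;
    each failure shifts extra load onto the survivors, possibly triggering
    further failures. Returns the total number of failed components."""
    rng = rng or random.Random()
    loads = [rng.random() for _ in range(n)]
    loads[rng.randrange(n)] = 1.0          # initial disturbance fails one
    failed = set()
    while True:
        newly = [i for i in range(n) if i not in failed and loads[i] >= 1.0]
        if not newly:
            return len(failed)
        failed.update(newly)
        for i in range(n):                  # survivors pick up extra load
            if i not in failed:
                loads[i] += load_transfer * len(newly)

rng = random.Random(7)
sizes = [cascade(rng=rng) for _ in range(500)]
print(min(sizes), max(sizes))
```

As `load_transfer` (the stress level) is increased toward a critical value, the size distribution develops the heavy tail that makes large blackouts consequential.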
The Use of Behavior Models for Predicting Complex Operations
NASA Technical Reports Server (NTRS)
Gore, Brian F.
2010-01-01
Modeling and simulation (M&S) plays an important role when complex human-system notions are being proposed, developed and tested within the system design process. National Aeronautics and Space Administration (NASA) as an agency uses many different types of M&S approaches for predicting human-system interactions, especially when it is early in the development phase of a conceptual design. NASA Ames Research Center possesses a number of M&S capabilities ranging from airflow, flight path models, aircraft models, scheduling models, human performance models (HPMs), and bioinformatics models among a host of other kinds of M&S capabilities that are used for predicting whether the proposed designs will benefit the specific mission criteria. The Man-Machine Integration Design and Analysis System (MIDAS) is a NASA ARC HPM software tool that integrates many models of human behavior with environment models, equipment models, and procedural / task models. The challenge to model comprehensibility is heightened as the number of models that are integrated and the requisite fidelity of the procedural sets are increased. Model transparency is needed for some of the more complex HPMs to maintain comprehensibility of the integrated model performance. This will be exemplified in a recent MIDAS v5 application model and plans for future model refinements will be presented.
Quantum Approximate Methods for the Atomistic Modeling of Multicomponent Alloys. Chapter 7
NASA Technical Reports Server (NTRS)
Bozzolo, Guillermo; Garces, Jorge; Mosca, Hugo; Gargano, Pablo; Noebe, Ronald D.; Abel, Phillip
2007-01-01
This chapter describes the role of quantum approximate methods in the understanding of complex multicomponent alloys at the atomic level. The need to accelerate materials design programs based on economical and efficient modeling techniques provides the framework for the introduction of approximations and simplifications in otherwise rigorous theoretical schemes. As a promising example of the role that such approximate methods might have in the development of complex systems, the BFS method for alloys is presented and applied to Ru-rich Ni-base superalloys and also to the NiAl(Ti,Cu) system, highlighting the benefits that can be obtained from introducing simple modeling techniques to the investigation of such complex systems.
Juckem, Paul F.; Clark, Brian R.; Feinstein, Daniel T.
2017-05-04
The U.S. Geological Survey, National Water-Quality Assessment seeks to map estimated intrinsic susceptibility of the glacial aquifer system of the conterminous United States. Improved understanding of the hydrogeologic characteristics that explain spatial patterns of intrinsic susceptibility, commonly inferred from estimates of groundwater age distributions, is sought so that methods used for the estimation process are properly equipped. An important step beyond identifying relevant hydrogeologic datasets, such as glacial geology maps, is to evaluate how incorporation of these resources into process-based models using differing levels of detail could affect resulting simulations of groundwater age distributions and, thus, estimates of intrinsic susceptibility. This report describes the construction and calibration of three groundwater-flow models of northeastern Wisconsin that were developed with differing levels of complexity to provide a framework for subsequent evaluations of the effects of process-based model complexity on estimations of groundwater age distributions for withdrawal wells and streams. Preliminary assessments, which focused on the effects of model complexity on simulated water levels and base flows in the glacial aquifer system, illustrate that simulation of vertical gradients using multiple model layers improves simulated heads more in low-permeability units than in high-permeability units. Moreover, simulation of heterogeneous hydraulic conductivity fields in coarse-grained and some fine-grained glacial materials produced a larger improvement in simulated water levels in the glacial aquifer system compared with simulation of uniform hydraulic conductivity within zones. The relation between base flows and model complexity was less clear; however, the relation generally seemed to follow a similar pattern as water levels.
Although increased model complexity resulted in improved calibrations, future application of the models using simulated particle tracking is anticipated to evaluate if these model design considerations are similarly important for understanding the primary modeling objective - to simulate reasonable groundwater age distributions.
Watershed System Model: The Essentials to Model Complex Human-Nature System at the River Basin Scale
NASA Astrophysics Data System (ADS)
Li, Xin; Cheng, Guodong; Lin, Hui; Cai, Ximing; Fang, Miao; Ge, Yingchun; Hu, Xiaoli; Chen, Min; Li, Weiyue
2018-03-01
Watershed system models are urgently needed to understand complex watershed systems and to support integrated river basin management. Early watershed modeling efforts focused on the representation of hydrologic processes, while the next-generation watershed models should represent the coevolution of the water-land-air-plant-human nexus in a watershed and provide capability of decision-making support. We propose a new modeling framework and discuss the know-how approach to incorporate emerging knowledge into integrated models through data exchange interfaces. We argue that the modeling environment is a useful tool to enable effective model integration, as well as create domain-specific models of river basin systems. The grand challenges in developing next-generation watershed system models include but are not limited to providing an overarching framework for linking natural and social sciences, building a scientifically based decision support system, quantifying and controlling uncertainties, and taking advantage of new technologies and new findings in the various disciplines of watershed science. The eventual goal is to build transdisciplinary, scientifically sound, and scale-explicit watershed system models that are to be codesigned by multidisciplinary communities.
Optimal service using Matlab - simulink controlled Queuing system at call centers
NASA Astrophysics Data System (ADS)
Balaji, N.; Siva, E. P.; Chandrasekaran, A. D.; Tamilazhagan, V.
2018-04-01
This paper presents graphically integrated, model-based research on telephone call centres. It introduces an important feature of real systems: impatient customers who abandon the queue. The modern call centre is a complex socio-technical system, and queueing theory has become a standard tool in the telecom industry for providing better online services. Matlab/Simulink multi-queue structured models provide better solutions for complex situations at call centres, and service performance measures are analyzed at the optimal level through the Simulink queueing model.
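The abandonment feature can be sketched with a few lines of simulation logic. This is a generic M/M/c queue with exponentially impatient callers (the "Erlang-A" setting the abstract alludes to) under invented parameter values, not the paper's Simulink model.

```python
import heapq
import random

def simulate(n_calls=20000, lam=5.0, mu=1.0, c=4, theta=0.5, seed=1):
    """FIFO M/M/c with abandonment: a caller hangs up if the wait for the
    earliest-free server exceeds an exponential patience time."""
    rng = random.Random(seed)
    free_at = [0.0] * c                      # heap: when each server frees up
    t, served, abandoned = 0.0, 0, 0
    for _ in range(n_calls):
        t += rng.expovariate(lam)            # Poisson arrivals
        patience = rng.expovariate(theta)    # caller's tolerance for waiting
        start = max(t, free_at[0])           # FIFO: wait for earliest server
        if start - t > patience:
            abandoned += 1                   # hangs up before being served
        else:
            # Occupy that server until service (exponential) completes
            heapq.heapreplace(free_at, start + rng.expovariate(mu))
            served += 1
    return served, abandoned

served, abandoned = simulate()
print(served, abandoned, round(abandoned / (served + abandoned), 3))
```

With offered load lam/(c*mu) = 1.25 the system is overloaded, and abandonment (rather than an unbounded queue) is what keeps it stable; the abandonment fraction approaches the excess load in heavy traffic.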
Queueing Network Models for Parallel Processing of Task Systems: an Operational Approach
NASA Technical Reports Server (NTRS)
Mak, Victor W. K.
1986-01-01
Computer performance modeling of possibly complex computations running on highly concurrent systems is considered. Earlier works in this area either dealt with a very simple program structure or resulted in methods with exponential complexity. An efficient procedure is developed to compute the performance measures for series-parallel-reducible task systems using queueing network models. The procedure is based on the concept of hierarchical decomposition and a new operational approach. Numerical results for three test cases are presented and compared to those of simulations.
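The series-parallel reduction at the heart of such procedures can be illustrated with a much simpler performance measure than the paper's queueing quantities: the makespan of a series-parallel task graph, assuming enough processors that parallel branches never queue. The task durations below are invented.

```python
def makespan(node):
    """Reduce a series-parallel task graph bottom-up: serial composition
    sums durations, parallel composition takes the maximum."""
    kind = node[0]
    if kind == "task":
        return node[1]                      # ("task", duration)
    children = [makespan(ch) for ch in node[1:]]
    return sum(children) if kind == "series" else max(children)

# ("series", a, b) runs a then b; ("parallel", a, b) runs them concurrently
graph = ("series",
         ("task", 2.0),
         ("parallel",
          ("series", ("task", 3.0), ("task", 1.0)),
          ("task", 2.5)),
         ("task", 1.0))
print(makespan(graph))  # 2.0 + max(3.0 + 1.0, 2.5) + 1.0 = 7.0
```

The paper's contribution is in effect replacing these scalar durations with queueing-network performance measures while keeping the same hierarchical decomposition, which is what avoids the exponential blow-up of earlier methods.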
NASA Astrophysics Data System (ADS)
Liu, Y.; Gupta, H.; Wagener, T.; Stewart, S.; Mahmoud, M.; Hartmann, H.; Springer, E.
2007-12-01
Some of the most challenging issues facing contemporary water resources management are those typified by complex coupled human-environmental systems with poorly characterized uncertainties. In other words, major decisions regarding water resources have to be made in the face of substantial uncertainty and complexity. It has been suggested that integrated models can be used to coherently assemble information from a broad set of domains, and can therefore serve as an effective means for tackling the complexity of environmental systems. Further, well-conceived scenarios can effectively inform decision making, particularly when high complexity and poorly characterized uncertainties make the problem intractable via traditional uncertainty analysis methods. This presentation discusses the integrated modeling framework adopted by SAHRA, an NSF Science & Technology Center, to investigate stakeholder-driven water sustainability issues within the semi-arid southwestern US. The multi-disciplinary, multi-resolution modeling framework incorporates a formal scenario approach to analyze the impacts of plausible (albeit uncertain) alternative futures to support adaptive management of water resources systems. Some of the major challenges involved in, and lessons learned from, this effort will be discussed.
Simulating complex intracellular processes using object-oriented computational modelling.
Johnson, Colin G; Goldman, Jacki P; Gullick, William J
2004-11-01
The aim of this paper is to give an overview of computer modelling and simulation in cellular biology, in particular as applied to complex biochemical processes within the cell. This is illustrated by the use of the techniques of object-oriented modelling, where the computer is used to construct abstractions of objects in the domain being modelled, and these objects then interact within the computer to simulate the system and allow emergent properties to be observed. The paper also discusses the role of computer simulation in understanding complexity in biological systems, and the kinds of information which can be obtained about biology via simulation.
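The object-oriented idea can be made concrete with a deliberately tiny example (the species, rate constants, and update rule are invented, not drawn from the paper): receptor objects stochastically bind and release ligand, and the bound fraction emerges from the interacting population rather than being specified directly.

```python
import random

class Receptor:
    """An object abstraction of one cell-surface receptor."""
    def __init__(self):
        self.bound = False

    def step(self, ligand_conc, k_on, k_off, rng):
        # Per-step binding/unbinding probabilities (toy kinetics)
        if not self.bound and rng.random() < k_on * ligand_conc:
            self.bound = True
        elif self.bound and rng.random() < k_off:
            self.bound = False

def simulate(n=1000, steps=200, ligand=0.5, k_on=0.2, k_off=0.05, seed=11):
    rng = random.Random(seed)
    receptors = [Receptor() for _ in range(n)]
    for _ in range(steps):
        for r in receptors:
            r.step(ligand, k_on, k_off, rng)
    return sum(r.bound for r in receptors)

bound = simulate()
print(bound)
```

The bound count settles near the equilibrium fraction k_on*L / (k_on*L + k_off), about two-thirds of the population here; that agreement between an emergent observable and closed-form kinetics is exactly the kind of check used to validate object-oriented cell models.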
Complex Physical, Biophysical and Econophysical Systems
NASA Astrophysics Data System (ADS)
Dewar, Robert L.; Detering, Frank
1. Introduction to complex and econophysics systems: a navigation map / T. Aste and T. Di Matteo -- 2. An introduction to fractional diffusion / B. I. Henry, T.A.M. Langlands and P. Straka -- 3. Space plasmas and fusion plasmas as complex systems / R. O. Dendy -- 4. Bayesian data analysis / M. S. Wheatland -- 5. Inverse problems and complexity in earth system science / I. G. Enting -- 6. Applied fluid chaos: designing advection with periodically reoriented flows for micro to geophysical mixing and transport enhancement / G. Metcalfe -- 7. Approaches to modelling the dynamical activity of brain function based on the electroencephalogram / D. T. J. Liley and F. Frascoli -- 8. Jaynes' maximum entropy principle, Riemannian metrics and generalised least action bound / R. K. Niven and B. Andresen -- 9. Complexity, post-genomic biology and gene expression programs / R. B. H. Williams and O. J.-H. Luo -- 10. Tutorials on agent-based modelling with NetLogo and network analysis with Pajek / M. J. Berryman and S. D. Angus.
Configuration complexity assessment of convergent supply chain systems
NASA Astrophysics Data System (ADS)
Modrak, Vladimir; Marton, David
2014-07-01
System designers usually generate alternative configurations of supply chains (SCs) by varying fixed assets in particular to satisfy a desired production scope and rate. Such alternatives often differ in associated costs and in other facets, including degree of complexity. Hence, a measure of configuration complexity can serve as a tool for comparison and decision-making. This paper presents three approaches to the assessment of configuration complexity and their application to designing convergent SC systems. The approaches are conceptually distinct ways of measuring structural complexity based on different preconditions and circumstances of assembly systems, which are typical representatives of convergent SCs. Two of the approaches are similar but rest on different preconditions related to demand shares; the third imposes no special condition on the character of final-product demand. Subsequently, we propose a framework for modeling assembly SCs in which models are divided into classes.
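One simple structural-complexity index in this spirit, offered as an illustrative stand-in rather than a reproduction of the paper's three measures, is the Shannon entropy of the demand shares flowing through the supply nodes of a convergent chain: evenly spread demand maximizes the index, concentrated demand lowers it.

```python
import math

def config_entropy(shares):
    """Shannon entropy (bits) of demand shares across supply nodes."""
    total = sum(shares)
    probs = [s / total for s in shares if s > 0]
    return -sum(p * math.log2(p) for p in probs)

# Two candidate configurations with four suppliers each (invented demand data)
balanced = [25, 25, 25, 25]       # demand evenly spread
skewed   = [70, 10, 10, 10]       # one dominant supplier

print(round(config_entropy(balanced), 3))  # 2.0 bits: maximal for 4 nodes
print(round(config_entropy(skewed), 3))
```

A designer can then rank configurations of equal cost by such an index, preferring lower structural complexity when all else is equal.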
Systems for Teaching Complex Texts: A Proof-of-Concept Investigation
ERIC Educational Resources Information Center
Fisher, Douglas; Frey, Nancy
2016-01-01
In this article we investigate the systems that need to be in place for students to learn from increasingly complex texts. Our concept, drawn from past research, includes clear learning targets, teacher modeling, collaborative conversations, close reading, small group reading, and wide reading. Using a "proof of concept" model, we follow…
A Systems Approach to Training and Complexity
ERIC Educational Resources Information Center
Kennedy, Bob
2005-01-01
Purpose: This paper aims to explore the quality profession's fascination with various models to depict complex interactive systems. Building on these and the outcome of a four-year action research programme, it provides a model which has potential for use by other professions. It has been tailored here to suit training and learning systems.…
Calibration of an Unsteady Groundwater Flow Model for a Complex, Strongly Heterogeneous Aquifer
NASA Astrophysics Data System (ADS)
Curtis, Z. K.; Liao, H.; Li, S. G.; Phanikumar, M. S.; Lusch, D.
2016-12-01
Modeling of groundwater systems characterized by complex three-dimensional structure and heterogeneity remains a significant challenge. Most of today's groundwater models are developed based on relatively simple conceptual representations in favor of model calibratability. As more complexities are modeled, e.g., by adding more layers and/or zones, or introducing transient processes, more parameters have to be estimated, and issues related to ill-posed groundwater problems and non-unique calibration arise. Here, we explore the use of an alternative conceptual representation for groundwater modeling that is fully three-dimensional and can capture complex 3D heterogeneity (both systematic and "random") without over-parameterizing the aquifer system. In particular, we apply Transition Probability (TP) geostatistics on high resolution borehole data from a water well database to characterize the complex 3D geology. Different aquifer material classes, e.g., 'AQ' (aquifer material), 'MAQ' (marginal aquifer material), 'PCM' (partially confining material), and 'CM' (confining material), are simulated, with the hydraulic properties of each material type as tuning parameters during calibration. The TP-based approach is applied to simulate unsteady groundwater flow in a large, complex, and strongly heterogeneous glacial aquifer system in Michigan across multiple spatial and temporal scales. The resulting model is calibrated to observed static water level data over a time span of 50 years. The results show that the TP-based conceptualization enables much more accurate and robust calibration/simulation than that based on conventional deterministic layer/zone based conceptual representations.
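The core mechanism of transition-probability geostatistics can be sketched in one dimension as a Markov chain over material classes whose transition matrix would, in practice, be estimated from borehole logs. The class names follow the abstract; the transition probabilities below are invented for illustration.

```python
import random

CLASSES = ["AQ", "MAQ", "PCM", "CM"]
T = {  # T[c][c2] = probability that class c2 follows class c with depth
    "AQ":  {"AQ": 0.70, "MAQ": 0.15, "PCM": 0.10, "CM": 0.05},
    "MAQ": {"AQ": 0.20, "MAQ": 0.55, "PCM": 0.15, "CM": 0.10},
    "PCM": {"AQ": 0.10, "MAQ": 0.15, "PCM": 0.55, "CM": 0.20},
    "CM":  {"AQ": 0.05, "MAQ": 0.10, "PCM": 0.20, "CM": 0.65},
}

def simulate_column(start, n, rng=random):
    """Draw a vertical succession of n material classes from the chain."""
    col = [start]
    for _ in range(n - 1):
        probs = T[col[-1]]
        col.append(rng.choices(CLASSES, weights=[probs[c] for c in CLASSES])[0])
    return col
```

The full 3D TP method generalizes this idea with direction-dependent transition rates; the sketch only conveys how categorical heterogeneity is simulated without per-layer parameters.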
Chang, Ivan; Heiske, Margit; Letellier, Thierry; Wallace, Douglas; Baldi, Pierre
2011-01-01
Mitochondrial bioenergetic processes are central to the production of cellular energy, and a decrease in the expression or activity of enzyme complexes responsible for these processes can result in energetic deficit that correlates with many metabolic diseases and aging. Unfortunately, existing computational models of mitochondrial bioenergetics either lack relevant kinetic descriptions of the enzyme complexes, or incorporate mechanisms too specific to a particular mitochondrial system and are thus incapable of capturing the heterogeneity associated with these complexes across different systems and system states. Here we introduce a new composable rate equation, the chemiosmotic rate law, that expresses the flux of a prototypical energy transduction complex as a function of: the saturation kinetics of the electron donor and acceptor substrates; the redox transfer potential between the complex and the substrates; and the steady-state thermodynamic force-to-flux relationship of the overall electro-chemical reaction. Modeling of bioenergetics with this rate law has several advantages: (1) it minimizes the use of arbitrary free parameters while featuring biochemically relevant parameters that can be obtained through progress curves of common enzyme kinetics protocols; (2) it is modular and can adapt to various enzyme complex arrangements for both in vivo and in vitro systems via transformation of its rate and equilibrium constants; (3) it provides a clear association between the sensitivity of the parameters of the individual complexes and the sensitivity of the system's steady-state. To validate our approach, we conduct in vitro measurements of ETC complex I, III, and IV activities using rat heart homogenates, and construct an estimation procedure for the parameter values directly from these measurements. 
In addition, we show the theoretical connections of our approach to the existing models, and compare the predictive accuracy of the rate law with our experimentally fitted parameters to those of existing models. Finally, we present a complete perturbation study of these parameters to reveal how they can significantly and differentially influence global flux and operational thresholds, suggesting that this modeling approach could help enable the comparative analysis of mitochondria from different systems and pathological states. The procedures and results are available in Mathematica notebooks at http://www.igb.uci.edu/tools/sb/mitochondria-modeling.html. PMID:21931590
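The composable-rate-law idea can be caricatured as a flux function that multiplies donor and acceptor saturation kinetics by a thermodynamic force-to-flux factor that vanishes at equilibrium. The functional form, parameter names, and values below are assumptions for illustration, not the paper's chemiosmotic rate law.

```python
import math

R = 8.314   # gas constant, J/(mol K)
T = 310.0   # physiological temperature, K

def flux(vmax, donor, k_donor, acceptor, k_acceptor, dG):
    """Illustrative composable rate law: Michaelis-style saturation in the
    electron donor and acceptor, scaled by a thermodynamic factor that is
    zero at equilibrium (dG = 0) and reverses sign when dG > 0.
    A sketch only, not the chemiosmotic rate law of the paper."""
    saturation = (donor / (k_donor + donor)) * (acceptor / (k_acceptor + acceptor))
    force = 1.0 - math.exp(dG / (R * T))
    return vmax * saturation * force
```

The key design property the sketch shares with the abstract's description is that the steady-state force-to-flux relationship is built in: flux goes to zero exactly at thermodynamic equilibrium regardless of the kinetic parameters.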
Complexity and dynamics of switched human balance control during quiet standing.
Nema, Salam; Kowalczyk, Piotr; Loram, Ian
2015-10-01
In this paper, we use a combination of numerical simulations, time series analysis, and complexity measures to investigate the dynamics of switched systems with noise, which are often used as models of human balance control during quiet standing. We link the results with complexity measures found in experimental data of human sway motion during quiet standing. The control model ensuring balance is based on an act-and-wait control concept: a human controller is switched on when a certain sway angle is reached; otherwise, there is no active control present. Given time series data, we determine what a typical pattern of control strategy looks like in our model system. We detect the switched nonlinearity in the system using a frequency analysis method in the absence of noise, and we also analyse the effect of time delay on the existence of limit cycles in the absence of noise. We perform entropy and detrended fluctuation analyses with a view to linking the switchings (and the dead zone) with the occurrence of complexity in the model system in the presence of noise. Finally, we perform the same entropy and detrended fluctuation analyses on experimental data and link the results with the numerical findings in our model example.
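The act-and-wait switching described above can be sketched as an unstable upright posture that drifts freely inside a dead zone and is caught by PD control once the sway angle crosses a threshold. All parameters are illustrative, and the feedback delay and noise of the full model are omitted for brevity.

```python
def simulate(theta0=0.005, dt=0.001, steps=20000,
             a=1.0, kp=6.0, kd=3.0, threshold=0.01):
    """Dead-zone ("act-and-wait") balance sketch: control engages only when
    the sway angle exceeds the switching threshold."""
    theta, omega = theta0, 0.0
    trace = []
    for _ in range(steps):
        control_on = abs(theta) > threshold          # the switch
        torque = (kp * theta + kd * omega) if control_on else 0.0
        omega += (a * theta - torque) * dt           # inverted-pendulum drift
        theta += omega * dt
        trace.append(theta)
    return trace
```

With these gains the sway repeatedly escapes the dead zone and is pushed back, producing the bounded, intermittent sway pattern that the entropy and detrended fluctuation analyses in the paper are designed to characterize.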
Principal process analysis of biological models.
Casagranda, Stefano; Touzeau, Suzanne; Ropers, Delphine; Gouzé, Jean-Luc
2018-06-14
Understanding the dynamical behaviour of biological systems is challenged by their large number of components and interactions. While efforts have been made in this direction to reduce model complexity, they often prove insufficient to grasp which and when model processes play a crucial role. Answering these questions is fundamental to unravel the functioning of living organisms. We design a method for dealing with model complexity, based on the analysis of dynamical models by means of Principal Process Analysis. We apply the method to a well-known model of circadian rhythms in mammals. The knowledge of the system trajectories allows us to decompose the system dynamics into processes that are active or inactive with respect to a certain threshold value. Process activities are graphically represented by Boolean and Dynamical Process Maps. We detect model processes that are always inactive, or inactive on some time interval. Eliminating these processes reduces the complex dynamics of the original model to the much simpler dynamics of the core processes, in a succession of sub-models that are easier to analyse. We quantify by means of global relative errors the extent to which the simplified models reproduce the main features of the original system dynamics and apply global sensitivity analysis to test the influence of model parameters on the errors. The results obtained prove the robustness of the method. The analysis of the sub-model dynamics allows us to identify the source of circadian oscillations. We find that the negative feedback loop involving proteins PER, CRY, CLOCK-BMAL1 is the main oscillator, in agreement with previous modelling and experimental studies. In conclusion, Principal Process Analysis is a simple-to-use method, which constitutes an additional and useful tool for analysing the complex dynamical behaviour of biological systems.
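The core bookkeeping of Principal Process Analysis, writing each state derivative as a sum of named processes and flagging a process inactive when its relative weight falls below a threshold, can be sketched on a toy model. The toy system and threshold are illustrative assumptions, not those of the circadian-clock study.

```python
def ppa(trajectory, processes, delta=0.1):
    """processes: {name: f(state) -> contribution}; returns, per process,
    a Boolean activity trace along the trajectory (True = active)."""
    activity = {name: [] for name in processes}
    for state in trajectory:
        weights = {n: abs(f(state)) for n, f in processes.items()}
        total = sum(weights.values()) or 1.0
        for n in processes:
            activity[n].append(weights[n] / total >= delta)
    return activity

# Toy system x' = production - degradation, with production shut off late.
traj = [{"x": 0.1 * k, "t": k} for k in range(10)]
procs = {
    "production":  lambda s: 1.0 if s["t"] < 5 else 0.0,
    "degradation": lambda s: -0.5 * s["x"],
}
```

The resulting Boolean traces are exactly the raw material for the Boolean Process Maps mentioned in the abstract; dropping always-inactive processes yields the simplified sub-models.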
The highly intelligent virtual agents for modeling financial markets
NASA Astrophysics Data System (ADS)
Yang, G.; Chen, Y.; Huang, J. P.
2016-02-01
Researchers have borrowed many theories from statistical physics, such as ensembles and the Ising model, to study complex adaptive systems through agent-based modeling. However, one fundamental difference between entities (such as spins) in physics and micro-units in complex adaptive systems is that the latter usually possess high intelligence, such as investors in financial markets. Although highly intelligent virtual agents are essential if agent-based modeling is to play a full role in the study of complex adaptive systems, how to create such agents is still an open question. Hence, we propose three principles for designing high artificial intelligence in financial markets and then build a specific class of agents called iAgents based on these three principles. Finally, we evaluate the intelligence of iAgents through virtual index trading in two different stock markets. For comparison, we also include three other types of agents in this contest, namely, random traders, agents from the wealth game (modified from the famous minority game), and agents from an upgraded wealth game. As a result, iAgents perform the best, which lends good support to the three principles. This work offers a general framework for the further development of agent-based modeling for various kinds of complex adaptive systems.
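For readers unfamiliar with the minority game on which the wealth-game agents are based, here is a minimal sketch: agents repeatedly choose one of two sides, the minority side wins, and each agent plays whichever of its random strategies has scored better so far. Agent counts, memory length, and strategy counts are illustrative, not those of the paper.

```python
import random

def play(n_agents=31, memory=3, rounds=200, seed=0):
    """Minimal minority game: strategies map the recent win history
    (packed into an integer) to a side; agents use their better strategy."""
    rng = random.Random(seed)
    n_hist = 2 ** memory
    strategies = [[[rng.randrange(2) for _ in range(n_hist)] for _ in range(2)]
                  for _ in range(n_agents)]
    scores = [[0, 0] for _ in range(n_agents)]
    history = 0
    attendance = []
    for _ in range(rounds):
        choices = [strategies[i][scores[i][1] > scores[i][0]][history]
                   for i in range(n_agents)]
        ones = sum(choices)
        minority = 1 if ones < n_agents - ones else 0
        for i in range(n_agents):          # score every strategy, played or not
            for s in range(2):
                if strategies[i][s][history] == minority:
                    scores[i][s] += 1
        history = ((history << 1) | minority) % n_hist
        attendance.append(ones)
    return attendance
```

The attendance series is the standard observable of such games; the paper's iAgents replace these fixed lookup-table strategies with far more adaptive decision rules.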
Analysis of Immune Complex Structure by Statistical Mechanics and Light Scattering Techniques.
NASA Astrophysics Data System (ADS)
Busch, Nathan Adams
1995-01-01
The size and structure of immune complexes determine their behavior in the immune system. The chemical physics of complex formation is not well understood; this is due in part to inadequate characterization of the proteins involved, and in part to a lack of sufficiently well developed theoretical techniques. Understanding complex formation will permit rational design of strategies for inhibiting tissue deposition of the complexes. A statistical mechanical model of the proteins, based upon the theory of associating fluids, was developed. The multipole electrostatic potential for each protein used in this study was characterized by net protein charge, dipole moment magnitude, and dipole moment direction. The binding sites between the model antigen and antibodies were characterized by their net surface area, energy, and position relative to the dipole moment of the protein. The equilibrium binding graphs generated with the protein statistical mechanical model compare favorably with experimental data obtained from radioimmunoassay results. The isothermal compressibility predicted by the model agrees with results obtained from dynamic light scattering. The statistical mechanics model was used to investigate association between the model antigen and selected pairs of antibodies. It was found that, in accordance with expectations from thermodynamic arguments, the highest total binding energy yielded complex distributions skewed toward larger complex sizes. From examination of the simulated formation of ring structures from linear chain complexes, and from the joint shape probability surfaces, it was found that ring configurations form by the "folding" of linear chains until the ends are within binding distance. By comparing single antigen/two antibody systems that differ only in their binding site locations, it was found that binding site location influences complex size and shape distributions only when ring formation occurs.
The internal potential energy of a ring complex is considerably less than that of the non-associating system; the ring complexes are therefore quite stable and show no evidence of breaking apart and collapsing into smaller complexes. Ring formation will occur only in systems where the total free energy of each complex can be minimized. Thus, ring formation will occur, even though it produces entropically unfavorable conformations, if the total free energy can be minimized by doing so.
A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems
NASA Technical Reports Server (NTRS)
Hatanaka, Iwao
2000-01-01
The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.
Hot cheese: a processed Swiss cheese model.
Li, Y; Thimbleby, H
2014-01-01
James Reason's classic Swiss cheese model is a vivid and memorable way to visualise how patient harm happens only when all system defences fail. Although Reason's model has been criticised for its simplicity and static portrait of complex systems, its use has been growing, largely because of the direct clarity of its simple and memorable metaphor. A more general, more flexible and equally memorable model of accident causation in complex systems is needed. We present the hot cheese model, which is more realistic, particularly in portraying defence layers as dynamic and active - more defences may cause more hazards. The hot cheese model, being more flexible, encourages deeper discussion of incidents than the simpler Swiss cheese model permits.
NASA Technical Reports Server (NTRS)
Yliniemi, Logan; Agogino, Adrian K.; Tumer, Kagan
2014-01-01
Accurate simulation of the effects of integrating new technologies into a complex system is critical to the modernization of our antiquated air traffic system, where there exist many layers of interacting procedures, controls, and automation all designed to cooperate with human operators. Additions of even simple new technologies may result in unexpected emergent behavior due to complex human/machine interactions. One approach is to create high-fidelity human models, drawn from the field of human factors, that can simulate a rich set of behaviors. However, such models are difficult to produce, especially when the goal is to show unexpected emergent behavior arising from many human operators interacting simultaneously within a complex system. Instead of engineering complex human models, we directly model the emergent behavior by evolving goal-directed agents representing human users. Using evolution, we can predict how the agent representing the human user reacts given his/her goals. In this paradigm, each autonomous agent in a system pursues individual goals, and the behavior of the system emerges from the interactions, foreseen or unforeseen, between the agents/actors. We show that this method reflects the integration of new technologies in a historical case, and apply the same methodology to a possible future technology.
A Knowledge-Based and Model-Driven Requirements Engineering Approach to Conceptual Satellite Design
NASA Astrophysics Data System (ADS)
Dos Santos, Walter A.; Leonor, Bruno B. F.; Stephany, Stephan
Satellite systems are becoming even more complex, making technical issues a significant cost driver. The increasing complexity of these systems makes requirements engineering activities both more important and difficult. Additionally, today's competitive pressures and other market forces drive manufacturing companies to improve the efficiency with which they design and manufacture space products and systems. This imposes a heavy burden on systems-of-systems engineering skills and particularly on requirements engineering which is an important phase in a system's life cycle. When this is poorly performed, various problems may occur, such as failures, cost overruns and delays. One solution is to underpin the preliminary conceptual satellite design with computer-based information reuse and integration to deal with the interdisciplinary nature of this problem domain. This can be attained by taking a model-driven engineering approach (MDE), in which models are the main artifacts during system development. MDE is an emergent approach that tries to address system complexity by the intense use of models. This work outlines the use of SysML (Systems Modeling Language) and a novel knowledge-based software tool, named SatBudgets, to deal with these and other challenges confronted during the conceptual phase of a university satellite system, called ITASAT, currently being developed by INPE and some Brazilian universities.
Using emergent order to shape a space society
NASA Technical Reports Server (NTRS)
Graps, Amara L.
1993-01-01
A fast-growing movement in the scientific community is reshaping the way that we view the world around us. The short-hand name for this movement is 'chaos'. Chaos is a science of the global, nonlinear nature of systems. At the center of this set of ideas is the observation that simple, deterministic systems can breed complexity: systems as complex as the human body, an ecology, the mind, or a human society. While it is true that simple laws can breed complexity, the other side is that complex systems can breed order. It is the latter that I will focus on in this paper. In the past, nonlinear was nearly synonymous with unsolvable because no general analytic solutions exist. Mathematically, an essential difference exists between linear and nonlinear systems. For linear systems, you can break up the complicated system into many simple pieces and patch together the separate solutions for each piece to form a solution to the full problem. In contrast, solutions to a nonlinear system cannot be added to form a new solution; the system must be treated in its full complexity. While it is true that no general analytical approach exists for reducing a complex system such as a society, it can be modeled. The technique involves a mathematical construct called phase space. In this space, stable structures can appear, which I use as analogies for the stable structures that appear in a complex system such as an ecology, the mind, or a society. The common denominator in all of these systems is that they rely on a process called feedback loops. Feedback loops link the microscopic (individual) parts to the macroscopic (global) parts. The key, then, in shaping a space society is in effectively using feedback loops. This paper will illustrate how one can model a space society by using methods that chaoticists have developed over the last hundred years. And I will show that common threads exist in the modeling of biological, economical, philosophical, and sociological systems.
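The claim that simple, deterministic feedback can breed complexity is classically illustrated by the logistic map, a one-line feedback loop whose behavior ranges from a stable fixed point to chaos as the feedback strength grows. This standard textbook example is offered as a companion to the abstract, not as content from the paper itself.

```python
def logistic_orbit(r, x0=0.2, n=100):
    """Iterate the logistic map x -> r * x * (1 - x)."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

calm = logistic_orbit(2.8)    # settles toward the fixed point 1 - 1/r
wild = logistic_orbit(3.95)   # wanders chaotically over (0, 1)
```

The same equation, differing only in one parameter, produces either order or complexity, which is the phase-space intuition the paper applies to societies.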
NASA Astrophysics Data System (ADS)
Li, Shu-Bin; Cao, Dan-Ni; Dang, Wen-Xiu; Zhang, Lin
As a new cross-discipline, complexity science has penetrated every field of economy and society. With the arrival of big data, research in complexity science has reached a new peak. In recent years, it has offered a new perspective on traffic control through complex networks theory. The interaction of various kinds of information in a traffic system forms a huge complex system. A mesoscopic traffic flow model is improved with a variable speed limit (VSL), and the simulation process is designed based on complex networks theory combined with the proposed model. This paper studies the effect of VSL on dynamic traffic flow, and then analyzes the optimal control strategy of VSL in different network topologies. The conclusions of this research are meaningful for putting forward reasonable transportation plans and developing effective traffic management and control measures to support traffic management departments.
Darabi Sahneh, Faryad; Scoglio, Caterina; Riviere, Jim
2013-01-01
Nanoparticle-protein corona complex formation involves adsorption of protein molecules onto nanoparticle surfaces in a physiological environment. Understanding the corona formation process is crucial in predicting nanoparticle behavior in biological systems, including applications in nanotoxicology and the development of nano drug delivery platforms. This paper extends earlier modeling work to derive a mathematical model describing the dynamics of nanoparticle corona complex formation from population balance equations. We apply nonlinear dynamics techniques to derive analytical results for the composition of the nanoparticle-protein corona complex, and validate our results through numerical simulations. The model presented in this paper exhibits two phases of corona complex dynamics. In the first phase, proteins rapidly bind to the free surface of nanoparticles, leading to a metastable composition. During the second phase, continuous association and dissociation of protein molecules with nanoparticles slowly changes the composition of the corona complex. Given sufficient time, the composition of the corona complex reaches an equilibrium state of stable composition. We find approximate analytical formulae for the metastable and stable compositions of the corona complex. Our formulae are well structured and clearly identify the important parameters determining corona composition. The dynamics of biocorona formation constitute a vital aspect of interactions between nanoparticles and living organisms. Our results further the understanding of these dynamics through quantitation of experimental conditions, extending modeling results for in vitro systems to better predict behavior in vivo. One potential application would relate a single cell culture medium to a complex protein medium, such as blood or tissue fluid.
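The two-phase behavior described above can be sketched with a toy competitive-adsorption ODE: two proteins compete for free nanoparticle surface, one binding fast but weakly and the other slowly but tightly, so the fast binder dominates the early metastable composition while the tight binder takes over at equilibrium. The rate constants are invented for illustration, not fitted values from the paper.

```python
def corona(k_on, k_off, conc, dt=0.005, steps=20000):
    """Euler-integrate surface fractions theta_i of two competing proteins:
    d(theta_i)/dt = k_on_i * c_i * free - k_off_i * theta_i."""
    theta = [0.0, 0.0]
    early = None
    for step in range(steps):
        free = 1.0 - theta[0] - theta[1]          # uncovered surface fraction
        for i in (0, 1):
            theta[i] += (k_on[i] * conc[i] * free - k_off[i] * theta[i]) * dt
        if step == 20:                            # snapshot of metastable phase
            early = list(theta)
    return early, theta

# Protein 0: fast on, fast off.  Protein 1: slow on, very slow off.
early, final = corona(k_on=(10.0, 1.0), k_off=(1.0, 0.01), conc=(1.0, 1.0))
```

This reproduces, in miniature, the exchange from a kinetically selected to a thermodynamically selected corona composition that the paper analyzes from population balance equations.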
Enhancing implementation science by applying best principles of systems science.
Northridge, Mary E; Metcalf, Sara S
2016-10-04
Implementation science holds promise for better ensuring that research is translated into evidence-based policy and practice, but interventions often fail or even worsen the problems they are intended to solve due to a lack of understanding of real world structures and dynamic complexity. While systems science alone cannot possibly solve the major challenges in public health, systems-based approaches may contribute to changing the language and methods for conceptualising and acting within complex systems. The overarching goal of this paper is to improve the modelling used in dissemination and implementation research by applying best principles of systems science. Best principles, as distinct from the more customary term 'best practices', are used to underscore the need to extract the core issues from the context in which they are embedded in order to better ensure that they are transferable across settings. Toward meaningfully grappling with the complex and challenging problems faced in adopting and integrating evidence-based health interventions and changing practice patterns within specific settings, we propose and illustrate four best principles derived from our systems science experience: (1) model the problem, not the system; (2) pay attention to what is important, not just what is quantifiable; (3) leverage the utility of models as boundary objects; and (4) adopt a portfolio approach to model building. To improve our mental models of the real world, system scientists have created methodologies such as system dynamics, agent-based modelling, geographic information science and social network simulation. To understand dynamic complexity, we need the ability to simulate. Otherwise, our understanding will be limited. The practice of dynamic systems modelling, as discussed herein, is the art and science of linking system structure to behaviour for the purpose of changing structure to improve behaviour. 
A useful computer model creates a knowledge repository and a virtual library for internally consistent exploration of alternative assumptions. Among the benefits of systems modelling are iterative practice, participatory potential and possibility thinking. We trust that the best principles proposed here will resonate with implementation scientists; applying them to the modelling process may abet the translation of research into effective policy and practice.
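The "linking structure to behaviour" practice of dynamic systems modelling can be sketched with a minimal stock-and-flow model: a Bass-style adoption model in which a reinforcing word-of-mouth loop and a balancing market-saturation loop jointly produce S-shaped growth. The example and its parameters are illustrative assumptions, not drawn from the paper.

```python
def bass_adoption(population=10000, p=0.03, q=0.4, dt=0.25, steps=120):
    """One stock (adopters) and one flow (adoption rate); p is external
    influence, q is the word-of-mouth feedback strength."""
    adopters = [0.0]
    for _ in range(steps):
        a = adopters[-1]
        potential = population - a                      # remaining stock
        flow = (p + q * a / population) * potential     # adoption rate
        adopters.append(a + flow * dt)
    return adopters
```

Even this tiny model illustrates the principles in the abstract: the structure (two coupled loops) fully determines the behaviour (S-curve), and changing structure, not tuning outputs, is how behaviour is improved.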
Ontology patterns for complex topographic feature types
Varanka, Dalia E.
2011-01-01
Complex feature types are defined as integrated relations between basic features for a shared meaning or concept. The shared semantic concept is difficult to define in commonly used geographic information systems (GIS) and remote sensing technologies. The role of spatial relations between complex feature parts was recognized in early GIS literature, but had limited representation in the feature or coverage data models of GIS. Spatial relations are more explicitly specified in semantic technology. In this paper, semantics for topographic feature ontology design patterns (ODP) are developed as data models for the representation of complex features. In the context of topographic processes, component assemblages are supported by resource systems and are found on local landscapes. The topographic ontology is organized across six thematic modules that can account for basic feature types, resource systems, and landscape types. Types of complex feature attributes include location, generative processes and physical description. Node/edge networks model standard spatial relations and relations specific to topographic science to represent complex features. To demonstrate these concepts, data from The National Map of the U. S. Geological Survey was converted and assembled into ODP.
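A node/edge representation of a complex feature as semantic triples can be sketched as follows. The subject, predicate, and class names are invented for illustration and are not drawn from the USGS topographic ontology.

```python
# A complex feature as a small graph of (subject, predicate, object) triples:
# parts, spatial relations, a supporting resource system, and a landscape.
triples = {
    ("Campus", "rdf:type", "ComplexFeature"),
    ("Campus", "hasPart", "Building"),
    ("Campus", "hasPart", "Quad"),
    ("Building", "adjacentTo", "Quad"),
    ("Campus", "supportedBy", "WaterSystem"),
    ("Campus", "locatedOn", "FloodPlain"),
}

def objects(subject, predicate, graph=triples):
    """Query all objects for a (subject, predicate) pair."""
    return {o for s, p, o in graph if s == subject and p == predicate}
```

This makes explicit the point of the abstract: the shared concept ("Campus") lives in the relations between basic features, which coverage-style GIS data models cannot represent directly.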
Integration of systems biology with organs-on-chips to humanize therapeutic development
NASA Astrophysics Data System (ADS)
Edington, Collin D.; Cirit, Murat; Chen, Wen Li Kelly; Clark, Amanda M.; Wells, Alan; Trumper, David L.; Griffith, Linda G.
2017-02-01
"Mice are not little people" - a refrain becoming louder as the gaps between animal models and human disease become more apparent. At the same time, three emerging approaches are headed toward integration: powerful systems biology analysis of cell-cell and intracellular signaling networks in patient-derived samples; 3D tissue engineered models of human organ systems, often made from stem cells; and micro-fluidic and meso-fluidic devices that enable living systems to be sustained, perturbed and analyzed for weeks in culture. Integration of these rapidly moving fields has the potential to revolutionize development of therapeutics for complex, chronic diseases, including those that have weak genetic bases and substantial contributions from gene-environment interactions. Technical challenges in modeling complex diseases with "organs on chips" approaches include the need for relatively large tissue masses and organ-organ cross talk to capture systemic effects, such that current microfluidic formats often fail to capture the required scale and complexity for interconnected systems. These constraints drive development of new strategies for designing in vitro models, including perfusing organ models, as well as "mesofluidic" pumping and circulation in platforms connecting several organ systems, to achieve the appropriate physiological relevance.
Macroscopic description of complex adaptive networks coevolving with dynamic node states
NASA Astrophysics Data System (ADS)
Wiedermann, Marc; Donges, Jonathan F.; Heitzig, Jobst; Lucht, Wolfgang; Kurths, Jürgen
2015-05-01
In many real-world complex systems, the time evolution of the network's structure and the dynamic states of its nodes are closely entangled. Here we study opinion formation and imitation on an adaptive complex network, whose structure depends on the dynamic state of each node and vice versa, to model the coevolution of renewable resources with the dynamics of harvesting agents on a social network. The adaptive voter model is coupled to a set of identical logistic growth models, and we find that, in such systems, the rate of interactions between nodes as well as the adaptive rewiring probability are crucial parameters for controlling the sustainability of the system's equilibrium state. We derive a macroscopic description of the system in terms of ordinary differential equations, which provides a general framework for modeling and quantifying the influence of single-node dynamics on the macroscopic state of the network. The framework thus obtained is applicable to many fields of study, such as epidemic spreading, opinion formation, or socioecological modeling.
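The coevolutionary mechanism can be sketched at the microscopic level: agents on a network hold an opinion (high or low harvest rate), each exploits a private stock with logistic regrowth, and on each update a disagreeing link is either rewired or resolved by imitation. All parameters and the update rule below are illustrative simplifications, not the paper's model specification.

```python
import random

def step(opinions, stocks, edges, rng, phi=0.3, r=0.5, capacity=1.0):
    """One update: logistic regrowth minus harvest, then adaptive-voter move."""
    harvest = {0: 0.1, 1: 0.6}            # low vs. high harvest fraction
    for i, s in enumerate(stocks):
        stocks[i] = max(0.0, s + r * s * (1 - s / capacity)
                        - harvest[opinions[i]] * s)
    i, j = rng.choice(edges)
    if opinions[i] != opinions[j]:
        if rng.random() < phi:            # rewire away from disagreement
            k = rng.randrange(len(opinions))
            edges[edges.index((i, j))] = (i, k)
        else:                             # imitate the neighbor's opinion
            opinions[i] = opinions[j]

rng = random.Random(2)
n = 20
opinions = [rng.randrange(2) for _ in range(n)]
stocks = [1.0] * n
edges = [(i, (i + 1) % n) for i in range(n)]
for _ in range(500):
    step(opinions, stocks, edges, rng)
```

The macroscopic ODE description derived in the paper averages over exactly such microscopic update rules, with the rewiring probability (here `phi`) as a key control parameter.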
Computer modeling describes gravity-related adaptation in cell cultures.
Alexandrov, Ludmil B; Alexandrova, Stoyana; Usheva, Anny
2009-12-16
Questions about the changes of biological systems in response to hostile environmental factors are important but not easy to answer. Often, the traditional description with differential equations is difficult due to the overwhelming complexity of the living systems. Another way to describe complex systems is by simulating them with phenomenological models such as the well-known evolutionary agent-based model (EABM). Here we developed an EABM to simulate cell colonies as a multi-agent system that adapts to hyper-gravity in starvation conditions. In the model, the cell's heritable characteristics are generated and transferred randomly to offspring cells. After a qualitative validation of the model at normal gravity, we simulate cellular growth in hyper-gravity conditions. The obtained data are consistent with previously confirmed theoretical and experimental findings for bacterial behavior in environmental changes, including the experimental data from the microgravity Atlantis and the Hypergravity 3000 experiments. Our results demonstrate that it is possible to utilize an EABM with realistic qualitative description to examine the effects of hypergravity and starvation on complex cellular entities.
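The EABM idea (heritable traits randomly perturbed at division and filtered by the environment) can be illustrated with a minimal sketch. The trait semantics, death and division probabilities, and carrying-capacity cap below are hypothetical, not taken from the paper.

```python
import random

def simulate_colony(steps=50, stress=0.5, mutation=0.1, seed=1):
    """Minimal evolutionary agent-based model: each cell carries a
    heritable 'resistance' trait in [0, 1]; cells whose resistance
    exceeds the environmental stress divide, passing the trait to the
    offspring with a small random perturbation."""
    rng = random.Random(seed)
    cells = [rng.random() for _ in range(20)]      # initial trait values
    for _ in range(steps):
        next_gen = []
        for r in cells:
            if rng.random() < 0.1:                 # random death
                continue
            next_gen.append(r)
            if r > stress:                         # divide if stress is tolerable
                child = min(1.0, max(0.0, r + rng.uniform(-mutation, mutation)))
                next_gen.append(child)
        cells = next_gen[:500]                     # hard carrying-capacity cap
    return cells
```

Raising `stress` selects for high-resistance lineages over the run, which is the qualitative adaptation effect the abstract describes.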
Computer models of complex multiloop branched pipeline systems
NASA Astrophysics Data System (ADS)
Kudinov, I. V.; Kolesnikov, S. V.; Eremin, A. V.; Branfileva, A. N.
2013-11-01
This paper describes the principal theoretical concepts of the method used for constructing computer models of complex multiloop branched pipeline networks, which is based on the theory of graphs and Kirchhoff's two laws for electrical circuits. The models make it possible to calculate velocities, flow rates, and pressures of a fluid medium in any section of a pipeline network when the latter is considered as a single hydraulic system. On the basis of multivariant calculations, the reasons for existing problems can be identified, the least costly methods of eliminating them can be proposed, and recommendations can be made for planning the modernization of pipeline systems and the construction of new sections. The results obtained can be applied to complex pipeline systems intended for various purposes (water pipelines, petroleum pipelines, etc.). The operability of the model has been verified on the example of designing a unified computer model of the heat network for centralized heat supply of the city of Samara.
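For laminar flow, the electrical analogy invoked above reduces to a linear system: edge flows Q = C·(p_i − p_j) and Kirchhoff's first law (mass conservation) at every free node. A minimal solver sketch follows; the node numbering and conductance values are illustrative assumptions.

```python
def solve_network(conductances, fixed, n):
    """Solve steady laminar flow in a pipe network via the electrical
    analogy. `conductances` maps edge (i, j) -> C_ij; `fixed` maps
    boundary nodes to prescribed pressures; `n` is the node count."""
    free = [k for k in range(n) if k not in fixed]
    idx = {k: m for m, k in enumerate(free)}
    A = [[0.0] * len(free) for _ in free]
    b = [0.0] * len(free)
    for (i, j), c in conductances.items():
        for u, v in ((i, j), (j, i)):       # symmetric contributions
            if u in idx:
                A[idx[u]][idx[u]] += c
                if v in idx:
                    A[idx[u]][idx[v]] -= c
                else:
                    b[idx[u]] += c * fixed[v]
    m = len(free)
    # tiny Gaussian elimination (no pivoting; adequate for these SPD systems)
    for col in range(m):
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for cc in range(col, m):
                A[r][cc] -= f * A[col][cc]
            b[r] -= f * b[col]
    p = [0.0] * m
    for r in range(m - 1, -1, -1):
        s = b[r] - sum(A[r][cc] * p[cc] for cc in range(r + 1, m))
        p[r] = s / A[r][r]
    pressures = dict(fixed)
    for k in free:
        pressures[k] = p[idx[k]]
    return pressures
```

With two pipes of conductance 2 and 1 joining a 100-unit source and a 0-unit sink at one internal node, the solver returns the expected weighted-average pressure 200/3 at that node.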
Complexation behavior of oppositely charged polyelectrolytes: Effect of charge distribution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, Mingtian; Li, Baohui; Zhou, Jihan
Complexation behavior of oppositely charged polyelectrolytes in a solution is investigated using a combination of computer simulations and experiments, focusing on the influence of the polyelectrolyte charge distributions along the chains on the structure of the polyelectrolyte complexes. The simulations are performed using Monte Carlo with the replica-exchange algorithm for three model systems, each composed of a mixture of two types of oppositely charged model polyelectrolyte chains, (EGEG)₅/(KGKG)₅, (EEGG)₅/(KKGG)₅, and (EEGG)₅/(KGKG)₅, in a solution including explicit solvent molecules. Among the three model systems, only the charge distributions along the chains are not identical. Thermodynamic quantities are calculated as a function of temperature (or ionic strength), and the microscopic structures of the complexes are examined. It is found that the three systems have different transition temperatures and form complexes with different sizes, structures, and densities at a given temperature. The three mixture systems yield, respectively, complex microscopic structures with an alternating arrangement of one monolayer of E/K monomers and one monolayer of G monomers; with one bilayer of E and K monomers and one bilayer of G monomers; and with a mixture of monolayer and bilayer E/K monomers in a box shape and a trilayer of G monomers inside the box. The experiments are carried out for three systems, each composed of a mixture of two types of oppositely charged peptide chains in solution. Each peptide chain is composed of lysine (K) and glycine (G), or glutamate (E) and G; the chain length and amino acid sequences, and hence the charge distribution, are precisely controlled and identical to those of the corresponding model chains. The complexation behavior and complex structures are characterized through laser light scattering and atomic force microscopy measurements.
The order of the apparent weight-averaged molar mass and the order of density of the complexes observed in the three experimental systems are qualitatively in agreement with those predicted by the simulations.
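The replica-exchange algorithm used in the simulations can be demonstrated on a toy 1D energy landscape. The sketch below shows only the exchange mechanism itself; the temperatures, step sizes, and double-well energy are assumptions, not the paper's chain-and-solvent model.

```python
import math, random

def replica_exchange_mc(energy, temps, steps=2000, seed=0):
    """Minimal replica-exchange (parallel tempering) Monte Carlo:
    one Metropolis walker per temperature, with periodic swap
    attempts between adjacent replicas using the standard
    acceptance rule exp[(1/T_a - 1/T_b)(E_a - E_b)]."""
    rng = random.Random(seed)
    xs = [0.0] * len(temps)
    for step in range(steps):
        for r, T in enumerate(temps):              # local Metropolis moves
            trial = xs[r] + rng.uniform(-0.5, 0.5)
            dE = energy(trial) - energy(xs[r])
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                xs[r] = trial
        if step % 10 == 0:                         # neighbour swap attempts
            for r in range(len(temps) - 1):
                delta = (1.0 / temps[r] - 1.0 / temps[r + 1]) * \
                        (energy(xs[r]) - energy(xs[r + 1]))
                if delta >= 0 or rng.random() < math.exp(delta):
                    xs[r], xs[r + 1] = xs[r + 1], xs[r]
    return xs
```

Swaps let the cold replica inherit configurations equilibrated at high temperature, which is what allows sampling across the transition temperatures the abstract reports.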
Mathematical concepts for modeling human behavior in complex man-machine systems
NASA Technical Reports Server (NTRS)
Johannsen, G.; Rouse, W. B.
1979-01-01
Many human behavior (e.g., manual control) models have been found to be inadequate for describing processes in certain real complex man-machine systems. An attempt is made to find a way to overcome this problem by examining the range of applicability of existing mathematical models with respect to the hierarchy of human activities in real complex tasks. Automobile driving is chosen as a baseline scenario, and a hierarchy of human activities is derived by analyzing this task in general terms. A structural description leads to a block diagram and a time-sharing computer analogy.
Systems biology by the rules: hybrid intelligent systems for pathway modeling and discovery.
Bosl, William J
2007-02-15
Expert knowledge in journal articles is an important source of data for reconstructing biological pathways and creating new hypotheses. An important need for medical research is to integrate this data with high-throughput sources to build useful models that span several scales. Researchers traditionally use mental models of pathways to integrate information and develop new hypotheses. Unfortunately, the amount of information is often overwhelming, and mental models are inadequate for predicting the dynamic response of complex pathways. Hierarchical computational models that allow exploration of semi-quantitative dynamics are useful systems biology tools for theoreticians, experimentalists and clinicians, and may provide a means for cross-communication. A novel approach for biological pathway modeling based on hybrid intelligent systems, or soft computing technologies, is presented here. Hybrid intelligent systems, an umbrella term for several related computing methods such as fuzzy logic, neural nets, genetic algorithms, and statistical analysis, have become ubiquitous in engineering applications for complex control system modeling and design. Biological pathways may be considered to be complex control systems, which medicine tries to manipulate to achieve desired results. Thus, hybrid intelligent systems may provide a useful tool for modeling biological system dynamics and for computational exploration of new drug targets. A new modeling approach based on these methods is presented in the context of hedgehog regulation of the cell cycle in granule cells. Code and input files can be found at the Bionet website: www.chip.ord/~wbosl/Software/Bionet. This paper presents the algorithmic methods needed for modeling complicated biochemical dynamics using rule-based models to represent expert knowledge in the context of cell cycle regulation and tumor growth.
A notable feature of this modeling approach is that it allows biologists to build complex models from their knowledge base without the need to translate that knowledge into mathematical form. Dynamics on several levels, from molecular pathways to tissue growth, are seamlessly integrated. A number of common network motifs are examined and used to build a model of hedgehog regulation of the cell cycle in cerebellar neurons, which is believed to play a key role in the etiology of medulloblastoma, a devastating childhood brain cancer.
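The rule-based, fuzzy-logic style of modeling described above can be illustrated with a two-rule Mamdani-type sketch. The membership functions, output centroids, and the hedgehog-to-proliferation mapping are invented for illustration; they are not the paper's Bionet rules.

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def infer_proliferation(hh):
    """Two fuzzy rules (min-max, Mamdani style):
       IF hedgehog is LOW  THEN proliferation is LOW
       IF hedgehog is HIGH THEN proliferation is HIGH
    Defuzzified as a firing-strength-weighted average of the
    LOW / HIGH output centroids (0.2 and 0.8 on a 0..1 axis)."""
    low = tri(hh, -0.5, 0.0, 0.6)
    high = tri(hh, 0.4, 1.0, 1.5)
    den = low + high
    return (low * 0.2 + high * 0.8) / den if den else 0.5
```

The point of this style, as the abstract notes, is that a biologist can state the rules in words; the semi-quantitative dynamics fall out of the membership functions without an explicit differential-equation model.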
Hydrological model parameter dimensionality is a weak measure of prediction uncertainty
NASA Astrophysics Data System (ADS)
Pande, S.; Arkesteijn, L.; Savenije, H.; Bastidas, L. A.
2015-04-01
This paper shows that the instability of a hydrological system representation in response to different pieces of information, and the associated prediction uncertainty, is a function of model complexity. After demonstrating the connection between unstable model representation and model complexity, complexity is analyzed in a step-by-step manner. This is done by measuring differences between simulations of a model under different realizations of input forcings. Algorithms are then suggested to estimate model complexity. Model complexities of two model structures, SAC-SMA (Sacramento Soil Moisture Accounting) and its simplified version SIXPAR (Six Parameter Model), are computed on resampled input data sets from basins that span the continental US. The model complexities for SIXPAR are estimated for various parameter ranges. It is shown that the complexity of SIXPAR increases with lower storage capacity and/or higher recession coefficients. It is therefore argued that a conceptually simple model structure, such as SIXPAR, can be more complex than an intuitively more complex model structure, such as SAC-SMA, for certain parameter ranges. We therefore contend that the magnitudes of feasible model parameters influence the complexity of the model selection problem just as parameter dimensionality (the number of parameters) does, and that parameter dimensionality is an incomplete indicator of the stability of hydrological model selection and prediction problems.
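The step of measuring complexity as output spread under resampled forcings can be sketched as follows. The one-bucket model and the naive resampling below are illustrative stand-ins for SIXPAR and the paper's estimator, not reproductions of them.

```python
import random, statistics

def bucket_model(params, rain):
    """One-bucket runoff model: storage capped at `cap`, drained by a
    linear recession coefficient `k`. Returns mean outflow."""
    k, cap = params
    s, total = 0.0, 0.0
    for r in rain:
        s = min(cap, s + r)     # excess above capacity spills and is lost
        q = k * s
        s -= q
        total += q
    return total / len(rain)

def complexity_estimate(model, param_sets, forcings, n_resample=20, seed=0):
    """Heuristic in the spirit of the paper: resample the forcing
    series, rerun the model, and average the spread of outputs across
    parameter sets; larger spread indicates a less stable (hence
    'more complex') model selection problem."""
    rng = random.Random(seed)
    spreads = []
    for params in param_sets:
        outs = []
        for _ in range(n_resample):
            sample = [rng.choice(forcings) for _ in forcings]
            outs.append(model(params, sample))
        spreads.append(statistics.pstdev(outs))
    return statistics.mean(spreads)
```

Under this measure, lowering `cap` or raising `k` makes the toy model's output track the particular forcing realization more closely, echoing the SIXPAR result in the abstract.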
PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deelman, Ewa; Carothers, Christopher; Mandal, Anirban
Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.
POPEYE: A production rule-based model of multitask supervisory control (POPCORN)
NASA Technical Reports Server (NTRS)
Townsend, James T.; Kadlec, Helena; Kantowitz, Barry H.
1988-01-01
Recent studies of relationships between subjective ratings of mental workload, performance, and human operator and task characteristics have indicated that these relationships are quite complex. In order to study the various relationships and place subjective mental workload within a theoretical framework, we developed a production system model for the performance component of the complex supervisory task called POPCORN. The production system model is represented by a hierarchical structure of goals and subgoals, and the information flow is controlled by a set of condition-action rules. The implementation of this production system, called POPEYE, generates computer-simulated data under different task difficulty conditions which are comparable to those of human operators performing the task. This model is the performance aspect of an overall dynamic psychological model which we are developing to examine and quantify relationships between performance and psychological aspects in a complex environment.
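A production system of the kind described, condition-action rules firing over a working memory, can be sketched in a few lines. The rule format and conflict-resolution policy (first matching rule wins) are simplifying assumptions, not details of POPEYE.

```python
def run_production_system(rules, memory, limit=100):
    """Tiny forward-chaining production system: `rules` is a list of
    (condition, action) pairs over a working-memory dict. On each
    cycle the first rule whose condition holds fires; the system runs
    until quiescence (no rule matches) or a cycle limit."""
    for _ in range(limit):
        for condition, action in rules:
            if condition(memory):
                action(memory)
                break
        else:
            break              # quiescence: no condition matched
    return memory
```

A two-rule example that claims a task and then completes it, looping until the task queue is empty, shows the goal/subgoal flavor of such models:

```python
rules = [
    (lambda m: m["tasks"] > 0 and not m["busy"], lambda m: m.update(busy=True)),
    (lambda m: m["busy"],
     lambda m: m.update(busy=False, tasks=m["tasks"] - 1, done=m["done"] + 1)),
]
```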
Frequency analysis of stress relaxation dynamics in model asphalts
NASA Astrophysics Data System (ADS)
Masoori, Mohammad; Greenfield, Michael L.
2014-09-01
Asphalt is an amorphous or semi-crystalline material whose mechanical performance relies on viscoelastic responses to applied strain or stress. Chemical composition and its effect on the viscoelastic properties of model asphalts have been investigated here by computing the complex modulus from molecular dynamics simulation results for two different model asphalts whose compositions each resemble the Strategic Highway Research Program AAA-1 asphalt in different ways. For a model system that contains smaller molecules, simulation results for the storage and loss modulus at 443 K reach both the low and high frequency scaling limits of the Maxwell model. Results for a model system composed of larger molecules (molecular weights 300-900 g/mol) with longer branches show a quantitatively higher complex modulus that decreases significantly as temperature increases over 400-533 K. Simulation results for its loss modulus approach the low frequency scaling limit of the Maxwell model only at the highest temperature simulated. A Black plot or van Gurp-Palmen plot of complex modulus vs. phase angle for the system of larger molecules suggests some overlap among results at different temperatures at lower frequencies, with an interdependence consistent with the empirical Christensen-Anderson-Marasteanu model. Both model asphalts are thermorheologically complex at very high frequencies, where they show a loss peak that appears to be independent of temperature and density.
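The Maxwell-model scaling limits mentioned above follow from the single-element storage and loss moduli; a small helper makes the limits and the Black-plot phase angle explicit. The modulus and relaxation-time defaults are placeholders, not fitted values from the paper.

```python
import math

def maxwell_moduli(omega, g=1.0e9, tau=1.0e-3):
    """Storage and loss moduli of one Maxwell element:
         G'(w)  = G (w*tau)^2 / (1 + (w*tau)^2)   ~ w^2 at low w, -> G at high w
         G''(w) = G (w*tau)   / (1 + (w*tau)^2)   ~ w   at low w, -> 0 at high w
    """
    wt = omega * tau
    gp = g * wt * wt / (1.0 + wt * wt)
    gpp = g * wt / (1.0 + wt * wt)
    return gp, gpp

def phase_angle(omega, **kw):
    """Phase angle (degrees) for a Black / van Gurp-Palmen plot."""
    gp, gpp = maxwell_moduli(omega, **kw)
    return math.degrees(math.atan2(gpp, gp))
```

At the crossover frequency ω = 1/τ the two moduli are equal (each G/2) and the phase angle is 45°, a convenient check when reading simulated moduli against the Maxwell limits.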
Lim, Hooi Been; Baumann, Dirk; Li, Er-Ping
2011-03-01
Wireless body area network (WBAN) is a new enabling system with promising applications in areas such as remote health monitoring and interpersonal communication. Reliable and optimum design of a WBAN system relies on a good understanding and in-depth studies of the wave propagation around a human body. However, the human body is a very complex structure and is computationally demanding to model. This paper aims to investigate the effects of the numerical model's structure complexity and feature details on the simulation results. Depending on the application, a simplified numerical model that meets desired simulation accuracy can be employed for efficient simulations. Measurements of ultra wideband (UWB) signal propagation along a human arm are performed and compared to the simulation results obtained with numerical arm models of different complexity levels. The influence of the arm shape and size, as well as tissue composition and complexity is investigated.
Managing resource capacity using hybrid simulation
NASA Astrophysics Data System (ADS)
Ahmad, Norazura; Ghani, Noraida Abdul; Kamil, Anton Abdulbasah; Tahar, Razman Mat
2014-12-01
Due to the diversity of patient flows and the interdependency of the emergency department (ED) with other units in a hospital, the use of analytical models seems impractical for ED modeling. One effective approach to studying the dynamic complexity of ED problems is to develop a computer simulation model that can be used to understand the structure and behavior of the system. A holistic model built with discrete-event simulation (DES) alone would be too complex, while one using system dynamics (SD) alone would lack the detailed characteristics of the system. This paper discusses the combination of DES and SD to obtain a better representation of the actual system than either modeling paradigm provides on its own. The model is developed using AnyLogic software, which enables us to study patient flows and the complex interactions among hospital resources in ED operations. Results from the model show that patients' length of stay is influenced by laboratory turnaround time, bed occupancy rate, and ward admission rate.
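The DES-plus-SD coupling can be caricatured in a few lines: arrivals and treatment completions as discrete events, ward occupancy as a continuous stock drained between events. All rates, the bay count, and the exponential discharge law below are invented for illustration; the paper's AnyLogic model is far richer.

```python
import heapq, math, random

def hybrid_ed_model(horizon=480.0, seed=3):
    """Toy DES + SD hybrid: a 3-bay treatment area served by an
    event queue (DES), feeding a ward occupancy stock that decays
    continuously at a fractional discharge rate (SD)."""
    rng = random.Random(seed)
    t, busy, queue, ward = 0.0, 0, 0, 0.0
    lengths_of_stay = []
    events = [(rng.expovariate(1 / 10.0), "arrival", 0.0)]  # mean gap 10 min
    while events:
        t_new, kind, started = heapq.heappop(events)
        if t_new > horizon:
            break
        ward *= math.exp(-0.01 * (t_new - t))   # continuous ward discharge
        t = t_new
        if kind == "arrival":
            heapq.heappush(events, (t + rng.expovariate(1 / 10.0), "arrival", 0.0))
            if busy < 3:                        # free treatment bay
                busy += 1
                heapq.heappush(events, (t + rng.expovariate(1 / 8.0), "done", t))
            else:
                queue += 1
        else:                                   # treatment finished
            lengths_of_stay.append(t - started)
            ward += 1.0                         # admit to ward stock
            if queue > 0:
                queue -= 1
                heapq.heappush(events, (t + rng.expovariate(1 / 8.0), "done", t))
            else:
                busy -= 1
    return lengths_of_stay, ward
```

Even in this caricature, length of stay and ward occupancy are coupled through the shared event clock, which is the representational advantage the abstract claims for the hybrid approach.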
Social determinants of health inequalities: towards a theoretical perspective using systems science.
Jayasinghe, Saroj
2015-08-25
A systems approach offers a novel conceptualization of natural and social systems. In recent years, this has led to perceiving population health outcomes as an emergent property of a dynamic, open, complex adaptive system. The current paper explores these themes further and applies the principles of the systems approach and complexity science (i.e. systems science) to conceptualize the social determinants of health inequalities. The conceptualization can be done in two steps: viewing health inequalities from a systems approach, and extending it to include complexity science. The systems approach views health inequalities as patterns within the larger rubric of other facets of the human condition, such as educational outcomes and economic development. This analysis requires more sophisticated models, such as system dynamics models. An extension of the approach is to view systems as complex adaptive systems, i.e. systems that are 'open' and adapt to the environment. They consist of dynamic adapting subsystems that exhibit non-linear interactions, while being 'open' to a similarly dynamic environment of interconnected systems. They exhibit emergent properties that cannot be estimated with precision from the known interactions among their components (such as economic development, political freedom, the health system, culture, etc.). Different combinations of the same bundle of factors or determinants give rise to similar patterns or outcomes (the property of convergence), and minor variations in the initial conditions can give rise to widely divergent outcomes. Novel approaches using computer simulation models (e.g. agent-based models) would shed light on possible mechanisms by which factors or determinants interact and lead to emergent patterns of health inequalities in populations.
Complexity in Soil Systems: What Does It Mean and How Should We Proceed?
NASA Astrophysics Data System (ADS)
Faybishenko, B.; Molz, F. J.; Brodie, E.; Hubbard, S. S.
2015-12-01
The complex soil systems approach is fundamentally needed for the development of integrated, interdisciplinary methods to measure and quantify the physical, chemical and biological processes taking place in soil, and to determine the role of fine-scale heterogeneities. This presentation reviews concepts and observations concerning complexity and complex systems theory, including terminology, emergent complexity and simplicity, self-organization, and a general approach to the study of complex systems using Weaver's (1948) concept of "organized complexity." These concepts are used to provide an understanding of complex soil systems and to develop experimental and mathematical approaches to soil microbiological processes. The results of numerical simulations, observations and experiments are presented that indicate the presence of deterministic chaotic dynamics in soil microbial systems. What, then, are the implications for scientists who wish to develop mathematical models in the area of organized complexity, or to perform experiments to help clarify an aspect of an organized complex system? Modelers have to deal with coupled systems having at least three dependent variables, and they have to forgo making linear approximations to nonlinear phenomena. The analogous rule for experimentalists is that they need to perform experiments that involve measurement of at least three interacting entities (variables depending on time, space, and each other). These entities could be microbes in soil penetrated by roots. If a process being studied in a soil affects the soil properties, as biofilm formation does, then this effect has to be measured and included. The mathematical implications of this viewpoint are examined, and results of numerical solutions to a system of equations demonstrating deterministic chaotic behavior are discussed using time series and 3D strange attractors.
Empirical modeling for intelligent, real-time manufacture control
NASA Technical Reports Server (NTRS)
Xu, Xiaoshu
1994-01-01
Artificial neural systems (ANS), also known as neural networks, are an attempt to develop computer systems that emulate the neural reasoning behavior of biological neural systems (e.g., the human brain). As such, they are loosely based on biological neural networks. An ANS consists of a series of nodes (neurons) and weighted connections (axons) that, when presented with a specific input pattern, can associate specific output patterns. It is essentially a highly complex, nonlinear, mathematical relationship or transform. These constructs have two significant properties that have proven useful to the authors in signal processing and process modeling: noise tolerance and complex pattern recognition. Specifically, the authors have developed a new network learning algorithm that has resulted in the successful application of ANSs to high-speed signal processing and to developing models of highly complex processes. Two of the applications, the Weld Bead Geometry Control System and the Welding Penetration Monitoring System, are discussed in the body of this paper.
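The node/weighted-connection transform described above can be written out directly. The sketch below is a generic two-layer sigmoid network forward pass, not the authors' learning algorithm (which the abstract does not detail).

```python
import math

def forward(x, weights1, bias1, weights2, bias2):
    """Forward pass of a minimal two-layer network: each node takes a
    weighted sum of its inputs (the 'axons') and applies a sigmoid
    nonlinearity; the whole net is one nonlinear input-output transform."""
    sig = lambda v: 1.0 / (1.0 + math.exp(-v))
    hidden = [sig(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(weights1, bias1)]
    return [sig(sum(w * hi for w, hi in zip(row, hidden)) + b)
            for row, b in zip(weights2, bias2)]
```

With all weights and biases at zero, every node outputs sigmoid(0) = 0.5; training consists of adjusting the weights so the transform associates the desired input-output patterns.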
Why Bother to Calibrate? Model Consistency and the Value of Prior Information
NASA Astrophysics Data System (ADS)
Hrachowitz, Markus; Fovet, Ophelie; Ruiz, Laurent; Euser, Tanja; Gharari, Shervan; Nijzink, Remko; Savenije, Hubert; Gascuel-Odoux, Chantal
2015-04-01
Hydrological models frequently suffer from limited predictive power despite adequate calibration performances. This can indicate insufficient representations of the underlying processes. Thus ways are sought to increase model consistency while satisfying the contrasting priorities of increased model complexity and limited equifinality. In this study the value of a systematic use of hydrological signatures and expert knowledge for increasing model consistency was tested. It was found that a simple conceptual model, constrained by 4 calibration objective functions, was able to adequately reproduce the hydrograph in the calibration period. The model, however, could not reproduce 20 hydrological signatures, indicating a lack of model consistency. Subsequently, testing 11 models, model complexity was increased in a stepwise way and counter-balanced by using prior information about the system to impose "prior constraints", inferred from expert knowledge and to ensure a model which behaves well with respect to the modeller's perception of the system. We showed that, in spite of unchanged calibration performance, the most complex model set-up exhibited increased performance in the independent test period and skill to reproduce all 20 signatures, indicating a better system representation. The results suggest that a model may be inadequate despite good performance with respect to multiple calibration objectives and that increasing model complexity, if efficiently counter-balanced by available prior constraints, can increase predictive performance of a model and its skill to reproduce hydrological signatures. The results strongly illustrate the need to balance automated model calibration with a more expert-knowledge driven strategy of constraining models.
Why Bother and Calibrate? Model Consistency and the Value of Prior Information.
NASA Astrophysics Data System (ADS)
Hrachowitz, M.; Fovet, O.; Ruiz, L.; Euser, T.; Gharari, S.; Nijzink, R.; Freer, J. E.; Savenije, H.; Gascuel-Odoux, C.
2014-12-01
Hydrological models frequently suffer from limited predictive power despite adequate calibration performances. This can indicate insufficient representations of the underlying processes. Thus ways are sought to increase model consistency while satisfying the contrasting priorities of increased model complexity and limited equifinality. In this study the value of a systematic use of hydrological signatures and expert knowledge for increasing model consistency was tested. It was found that a simple conceptual model, constrained by 4 calibration objective functions, was able to adequately reproduce the hydrograph in the calibration period. The model, however, could not reproduce 20 hydrological signatures, indicating a lack of model consistency. Subsequently, testing 11 models, model complexity was increased in a stepwise way and counter-balanced by using prior information about the system to impose "prior constraints", inferred from expert knowledge and to ensure a model which behaves well with respect to the modeller's perception of the system. We showed that, in spite of unchanged calibration performance, the most complex model set-up exhibited increased performance in the independent test period and skill to reproduce all 20 signatures, indicating a better system representation. The results suggest that a model may be inadequate despite good performance with respect to multiple calibration objectives and that increasing model complexity, if efficiently counter-balanced by available prior constraints, can increase predictive performance of a model and its skill to reproduce hydrological signatures. The results strongly illustrate the need to balance automated model calibration with a more expert-knowledge driven strategy of constraining models.
Dispersion Modeling in Complex Urban Systems
Models are used to represent real systems in an understandable way. They take many forms. A conceptual model explains the way a system works. In environmental studies, for example, a conceptual model may delineate all the factors and parameters for determining how a particle move...
The semiotics of control and modeling relations in complex systems.
Joslyn, C
2001-01-01
We provide a conceptual analysis of ideas and principles from the systems theory discourse which underlie Pattee's semantic or semiotic closure, which is itself foundational for a school of theoretical biology derived from systems theory and cybernetics, and is now being related to biological semiotics and explicated in the relational biological school of Rashevsky and Rosen. Atomic control systems and models are described as the canonical forms of semiotic organization, sharing measurement relations, but differing topologically in that control systems are circularly and models linearly related to their environments. Computation in control systems is introduced, motivating hierarchical decomposition, hybrid modeling and control systems, and anticipatory or model-based control. The semiotic relations in complex control systems are described in terms of relational constraints, and rules and laws are distinguished as contingent and necessary functional entailments, respectively. Finally, selection as a meta-level of constraint is introduced as the necessary condition for semantic relations in control systems and models.
Cx-02 Program, workshop on modeling complex systems
Mossotti, Victor G.; Barragan, Jo Ann; Westergard, Todd D.
2003-01-01
This publication contains the abstracts and program for the workshop on complex systems that was held on November 19-21, 2002, in Reno, Nevada. Complex systems are ubiquitous within the realm of the earth sciences. Geological systems consist of a multiplicity of linked components with nested feedback loops; the dynamics of these systems are non-linear, iterative, multi-scale, and operate far from equilibrium. That notwithstanding, it appears that, with the exception of papers on seismic studies, geology and geophysics work has been disproportionately underrepresented at regional and national meetings on complex systems relative to papers in the life sciences. This is somewhat puzzling because geologists and geophysicists are, in many ways, preadapted to thinking in terms of complex-system mechanisms. Geologists and geophysicists think about processes involving large volumes of rock below the sunlit surface of Earth, the accumulated consequence of processes extending hundreds of millions of years into the past. Not only do geologists think in the abstract by virtue of the vast time spans; most of the evidence is out of sight. A primary goal of this workshop is to begin to bridge the gap between the Earth sciences and life sciences through demonstration of the universality of complex systems science, both philosophically and in model structures.
Elementary Teachers' Selection and Use of Visual Models
ERIC Educational Resources Information Center
Lee, Tammy D.; Jones, M. Gail
2018-01-01
As science grows in complexity, science teachers face an increasing challenge of helping students interpret models that represent complex science systems. Little is known about how teachers select and use models when planning lessons. This mixed methods study investigated the pedagogical approaches and visual models used by elementary in-service…
Challenges in the analysis of complex systems: introduction and overview
NASA Astrophysics Data System (ADS)
Hastings, Harold M.; Davidsen, Jörn; Leung, Henry
2017-12-01
One of the main challenges of modern physics is to provide a systematic understanding of systems far from equilibrium exhibiting emergent behavior. Prominent examples of such complex systems include, but are not limited to the cardiac electrical system, the brain, the power grid, social systems, material failure and earthquakes, and the climate system. Due to the technological advances over the last decade, the amount of observations and data available to characterize complex systems and their dynamics, as well as the capability to process that data, has increased substantially. The present issue discusses a cross section of the current research on complex systems, with a focus on novel experimental and data-driven approaches to complex systems that provide the necessary platform to model the behavior of such systems.
Documentation Driven Development for Complex Real-Time Systems
2004-12-01
This paper presents a novel approach to the development of complex real-time systems, called the documentation-driven development (DDD) approach. This ... time systems. DDD will also support automated software generation based on a computational model and some relevant techniques. DDD includes two main ... stakeholders to be easily involved in development processes and, therefore, significantly improve the agility of software development for complex real-time systems.
NASA Astrophysics Data System (ADS)
Ganiev, R. F.; Reviznikov, D. L.; Rogoza, A. N.; Slastushenskiy, Yu. V.; Ukrainskiy, L. E.
2017-03-01
A comprehensive approach to the investigation of nonlinear wave processes in the human cardiovascular system is described, based on a combination of high-precision methods for measuring a pulse wave, mathematical methods for processing the empirical data, and direct numerical modeling of hemodynamic processes in an arterial tree.
General and craniofacial development are complex adaptive processes influenced by diversity.
Brook, A H; O'Donnell, M Brook; Hone, A; Hart, E; Hughes, T E; Smith, R N; Townsend, G C
2014-06-01
Complex systems are present in such diverse areas as social systems, economies, ecosystems and biology and, therefore, are highly relevant to dental research, education and practice. A Complex Adaptive System in biological development is a dynamic process in which, from interacting components at a lower level, higher level phenomena and structures emerge. Diversity makes substantial contributions to the performance of complex adaptive systems. It enhances the robustness of the process, allowing multiple responses to external stimuli as well as internal changes. From diversity comes variation in outcome and the possibility of major change; outliers in the distribution enhance the tipping points. The development of the dentition is a valuable, accessible model with extensive and reliable databases for investigating the role of complex adaptive systems in craniofacial and general development. The general characteristics of such systems are seen during tooth development: self-organization; bottom-up emergence; multitasking; self-adaptation; variation; tipping points; critical phases; and robustness. Dental findings are compatible with the Random Network Model, the Threshold Model and also with the Scale Free Network Model which has a Power Law distribution. In addition, dental development shows the characteristics of Modularity and Clustering to form Hierarchical Networks. The interactions between the genes (nodes) demonstrate Small World phenomena, Subgraph Motifs and Gene Regulatory Networks. Genetic mechanisms are involved in the creation and evolution of variation during development. The genetic factors interact with epigenetic and environmental factors at the molecular level and form complex networks within the cells. From these interactions emerge the higher level tissues, tooth germs and mineralized teeth. 
Approaching development in this way allows investigation of why there can be variations in phenotypes from identical genotypes; the phenotype is the outcome of perturbations in the cellular systems and networks, as well as of the genotype. Understanding and applying complexity theory will bring about substantial advances not only in dental research and education but also in the organization and delivery of oral health care. © 2014 Australian Dental Association.
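The scale-free network model invoked above can be made concrete with a small simulation. The sketch below is illustrative only: it grows a network by preferential attachment, the standard mechanism that yields a power-law degree distribution with a few highly connected hubs; the size and growth parameters are arbitrary and are not drawn from dental data.

```python
import random

def barabasi_albert(n, m, seed=42):
    """Grow a scale-free network by preferential attachment: each new
    node attaches to m existing nodes with probability proportional to
    their current degree. Returns a node -> degree mapping."""
    rng = random.Random(seed)
    degree = {i: m for i in range(m + 1)}   # start from a complete core
    repeated = []                            # node appears once per degree unit
    for i in range(m + 1):
        repeated.extend([i] * m)
    for new in range(m + 1, n):
        chosen = set()
        while len(chosen) < m:               # degree-proportional sampling
            chosen.add(rng.choice(repeated))
        degree[new] = m
        for t in chosen:
            degree[t] += 1
            repeated.extend([new, t])
    return degree

deg = barabasi_albert(2000, 2)
mean_k = sum(deg.values()) / len(deg)
max_k = max(deg.values())
print(mean_k, max_k)  # hubs far exceed the mean degree in a power-law network
```

The heavy upper tail (a maximum degree many times the mean) is the signature that distinguishes the Scale Free Network Model from the Random Network and Threshold Models mentioned in the abstract.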
OFMTutor: An operator function model intelligent tutoring system
NASA Technical Reports Server (NTRS)
Jones, Patricia M.
1989-01-01
The design, implementation, and evaluation of an Operator Function Model intelligent tutoring system (OFMTutor) is presented. OFMTutor is intended to provide intelligent tutoring in the context of complex dynamic systems for which an operator function model (OFM) can be constructed. The human operator's role in such complex, dynamic, and highly automated systems is that of a supervisory controller whose primary responsibilities are routine monitoring and fine-tuning of system parameters and occasional compensation for system abnormalities. The automated systems must support the human operator. One potentially useful form of support is the use of intelligent tutoring systems to teach the operator about the system and how to function within that system. Previous research on intelligent tutoring systems (ITS) is considered. The proposed design for OFMTutor is presented, and an experimental evaluation is described.
Cognitive engineering models in space systems
NASA Technical Reports Server (NTRS)
Mitchell, Christine M.
1992-01-01
NASA space systems, including mission operations on the ground and in space, are complex, dynamic, predominantly automated systems in which the human operator is a supervisory controller. The human operator monitors and fine-tunes computer-based control systems and is responsible for ensuring safe and efficient system operation. In such systems, the potential consequences of human mistakes and errors may be very large, even though the probability of such events is low. Thus, models of cognitive functions in complex systems are needed to describe human performance and form the theoretical basis of operator workstation design, including displays, controls, and decision support aids. The operator function model represents normative operator behavior: the operator activities expected given the current system state. The extension of the theoretical structure of the operator function model and its application to NASA Johnson mission operations and space station applications is discussed.
Best geoscience approach to complex systems in environment
NASA Astrophysics Data System (ADS)
Mezemate, Yacine; Tchiguirinskaia, Ioulia; Schertzer, Daniel
2017-04-01
The environment is a social issue that continues to grow in importance. Its complexity, both cross-disciplinary and multi-scale, has given rise to a large number of scientific and technological obstacles that complex systems approaches can resolve. Significant challenges must be met to achieve an understanding of complex environmental systems. Their study should proceed in steps in which the use of data and models is crucial: - Exploration, observation and basic data acquisition - Identification of correlations, patterns, and mechanisms - Modelling - Model validation, implementation and prediction - Construction of a theory. Since e-learning has become a powerful tool for sharing knowledge and best practice, we use it to teach environmental complexity and systems. In this presentation we promote an e-learning course intended for a broad audience (undergraduates, graduates, PhD students and young scientists) which gathers, and puts in coherence, different pedagogical materials on complex systems and environmental studies. The course describes complex processes using numerous illustrations, examples and tests that make the learning process easy to enjoy. For the sake of simplicity, the course is divided into modules, and at the end of each module a set of exercises and program codes is provided for practice. The graphical user interface (GUI), built with the open-source tool Opale Scenari, offers simple navigation through the different modules. The course treats the complex systems found in the environment and their observables; we particularly highlight the extreme variability of these observables over a wide range of scales. Using the multifractal formalism in different applications (turbulence, precipitation, hydrology), we demonstrate how such extreme variability of geophysical/biological fields can be exploited in solving everyday (geo-)environmental challenges.
NASA Astrophysics Data System (ADS)
Montero, J. T.; Lintz, H. E.; Sharp, D.
2013-12-01
Do emergent properties that result from models of complex systems match emergent properties from real systems? This question targets a type of uncertainty that we argue requires more attention in system modeling and validation efforts. We define an 'emergent property' to be an attribute or behavior of a modeled or real system that can be surprising or unpredictable and result from complex interactions among the components of a system. For example, thresholds are common across diverse systems and scales and can represent emergent system behavior that is difficult to predict. Thresholds or other types of emergent system behavior can be characterized by their geometry in state space (where state space is the space containing the set of all states of a dynamic system). One way to expedite our growing mechanistic understanding of how emergent properties emerge from complex systems is to compare the geometry of surfaces in state space between real and modeled systems. Here, we present an index (threshold strength) that can quantify a geometric attribute of a surface in state space. We operationally define threshold strength as how strongly a surface in state space resembles a step or an abrupt transition between two system states. First, we validated the index for application in greater than three dimensions of state space using simulated data. Then, we demonstrated application of the index in measuring geometric state space uncertainty between a real system and a deterministic, modeled system. In particular, we looked at geometric state space uncertainty between climate behavior in the 20th century and modeled climate behavior simulated by global climate models (GCMs) in the Coupled Model Intercomparison Project phase 5 (CMIP5). Surfaces from the climate models came from running the models over the same domain as the real data. 
We also created response surfaces from real climate data using an empirical model that produces a geometric surface of predicted values in state space. We used a kernel regression method designed to capture the geometry of the real data pattern without imposing shape assumptions a priori on the data; this kernel regression method is known as Non-parametric Multiplicative Regression (NPMR). We found that quantifying and comparing a geometric attribute in more than three dimensions of state space can discern whether the emergent nature of complex interactions in modeled systems matches that of real systems. Further, this method has potentially wider application in contexts where searching for abrupt change or 'action' in any hyperspace is desired.
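The paper's threshold-strength index is not reproduced here, but a hypothetical one-dimensional analogue conveys the idea: score how much better a two-level step function fits a response than a straight line, so that abrupt transitions score near 1 and smooth ramps near 0. The function name and the exact formula below are illustrative assumptions, not the published index.

```python
import numpy as np

def threshold_strength(x, y):
    """Illustrative analogue of a threshold-strength index: compare how
    well the response y(x) is described by a two-level step function
    versus a least-squares straight line."""
    order = np.argsort(x)
    x, y = np.asarray(x)[order], np.asarray(y)[order]
    # best two-level step fit over all split points
    sse_step = min(
        ((y[:k] - y[:k].mean()) ** 2).sum() + ((y[k:] - y[k:].mean()) ** 2).sum()
        for k in range(1, len(y))
    )
    a, b = np.polyfit(x, y, 1)                 # least-squares line fit
    sse_line = ((y - (a * x + b)) ** 2).sum()
    return max(0.0, 1.0 - sse_step / sse_line) if sse_line > 0 else 0.0

x = np.linspace(0, 1, 200)
abrupt = np.where(x < 0.5, 0.0, 1.0)   # step-like transition: index near 1
gradual = x                            # smooth ramp: index near 0
print(threshold_strength(x, abrupt), threshold_strength(x, gradual))
```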
Hu, Jin; Wang, Jun
2015-06-01
In recent years, complex-valued recurrent neural networks have been developed and analysed in depth, given their good modelling performance for applications involving complex-valued elements. In implementing continuous-time dynamical systems for simulation or computational purposes, it is often necessary to utilize a discrete-time model that is an analogue of the continuous-time system. In this paper, we analyse a discrete-time complex-valued recurrent neural network model and obtain sufficient conditions for its global exponential periodicity and exponential stability. Simulation results for several numerical examples are presented to illustrate the theoretical results, and an application to associative memory is also given. Copyright © 2015 Elsevier Ltd. All rights reserved.
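A minimal sketch of the kind of model analysed follows. The weights, the amplitude-phase activation, and the contraction argument are illustrative assumptions rather than the paper's stability conditions: with the weight norms kept small enough that the update map is a contraction, trajectories from different initial states converge exponentially, as in the globally exponentially stable case.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = 0.3 * np.eye(n)                      # linear leakage term
W = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
W = 0.4 * W / np.linalg.norm(W, 2)       # scale so the map is contractive

def f(z):
    # amplitude-phase activation: squashes the modulus, preserves the phase
    return z / (1.0 + np.abs(z))

def step(z, u):
    # one discrete-time update of the complex-valued recurrent network
    return A @ z + W @ f(z) + u

u = rng.standard_normal(n) + 1j * rng.standard_normal(n)
z1 = rng.standard_normal(n) + 1j * rng.standard_normal(n)
z2 = 10 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
for _ in range(200):
    z1, z2 = step(z1, u), step(z2, u)
print(np.linalg.norm(z1 - z2))  # trajectories from different starts converge
```

Since the activation is 1-Lipschitz and the operator norms of A and W sum to 0.7 < 1, the update is a contraction, so the distance between the two trajectories shrinks geometrically.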
STEPS: Modeling and Simulating Complex Reaction-Diffusion Systems with Python
Wils, Stefan; Schutter, Erik De
2008-01-01
We describe how the use of the Python language improved the user interface of the program STEPS. STEPS is a simulation platform for modeling and stochastic simulation of coupled reaction-diffusion systems with complex 3-dimensional boundary conditions. Setting up such models is a complicated process that consists of many phases. Initial versions of STEPS relied on a static input format that did not cleanly separate these phases, limiting modelers in how they could control the simulation and becoming increasingly complex as new features and new simulation algorithms were added. We solved all of these problems by tightly integrating STEPS with Python, using SWIG to expose our existing simulation code. PMID:19623245
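STEPS exposes its solvers through a Python API; the toy below is not STEPS code. As an illustration of the stochastic kinetics underlying such simulators (with the spatial diffusion and 3-dimensional boundaries omitted), here is a minimal Gillespie simulation of a single bimolecular reaction.

```python
import math, random

def ssa(a0, b0, k, t_end, seed=1):
    """Minimal Gillespie SSA for the reaction A + B -> C in a well-mixed
    volume. Real simulators such as STEPS add many reaction channels,
    diffusion on tetrahedral meshes, and complex 3D boundary conditions."""
    rng = random.Random(seed)
    t, a, b, c = 0.0, a0, b0, 0
    while a > 0 and b > 0:
        propensity = k * a * b
        # exponential waiting time to the next reaction event
        t += -math.log(1.0 - rng.random()) / propensity
        if t > t_end:
            break
        a, b, c = a - 1, b - 1, c + 1
    return a, b, c

a, b, c = ssa(100, 80, 0.01, 50.0)
print(a, b, c)  # mass is conserved: a + c == 100 and b + c == 80
```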
The Ontologies of Complexity and Learning about Complex Systems
ERIC Educational Resources Information Center
Jacobson, Michael J.; Kapur, Manu; So, Hyo-Jeong; Lee, June
2011-01-01
This paper discusses a study of students learning core conceptual perspectives from recent scientific research on complexity using a hypermedia learning environment in which different types of scaffolding were provided. Three comparison groups used a hypermedia system with agent-based models and scaffolds for problem-based learning activities that…
Synchronisation and Circuit Realisation of Chaotic Hartley System
NASA Astrophysics Data System (ADS)
Varan, Metin; Akgül, Akif; Güleryüz, Emre; Serbest, Kasım
2018-06-01
The Hartley chaotic system is topologically the simplest, but its dynamical behaviours are very rich, and its synchronisation has not previously been reported in the literature. This paper aims to introduce a simple chaotic system which can be used as an alternative to classical chaotic systems in synchronisation applications. Time series, phase portraits, and bifurcation diagrams reveal the dynamics of the system. The chaotic Hartley model is also supported with electronic circuit simulations. Its exponential dynamics are hard to realise in a circuit model; this paper is the first in the literature to handle such a complex modelling problem. Modelling, synchronisation, and circuit realisation of the Hartley system are implemented in MATLAB-Simulink and ORCAD environments, respectively. The effectiveness of the applied synchronisation method is assessed via numerical methods, and the results are discussed. The results show that this complex chaotic system can be used in secure communication fields.
Vernon, Ian; Liu, Junli; Goldstein, Michael; Rowe, James; Topping, Jen; Lindsey, Keith
2018-01-02
Many mathematical models have now been employed across every area of systems biology. These models increasingly involve large numbers of unknown parameters, have complex structure which can result in substantial evaluation time relative to the needs of the analysis, and need to be compared to observed data of various forms. The correct analysis of such models usually requires a global parameter search, over a high-dimensional parameter space, that incorporates and respects the most important sources of uncertainty. This can be an extremely difficult task, but it is essential for any meaningful inference or prediction to be made about any biological system. It hence represents a fundamental challenge for the whole of systems biology. Bayesian statistical methodology for the uncertainty analysis of complex models is introduced, designed to address the high-dimensional global parameter search problem. Bayesian emulators that mimic the systems biology model but are extremely fast to evaluate are embedded within an iterative history match: an efficient method to search high-dimensional spaces within a more formal statistical setting, while incorporating major sources of uncertainty. The approach is demonstrated via application to a model of hormonal crosstalk in Arabidopsis root development, which has 32 rate parameters, for which we identify the sets of rate parameter values that lead to acceptable matches between model output and observed trend data. The multiple insights into the model's structure that this analysis provides are discussed. The methodology is applied to a second related model, and the biological consequences of the resulting comparison, including the evaluation of gene functions, are described. Bayesian uncertainty analysis for complex models using both emulators and history matching is shown to be a powerful technique that can greatly aid the study of a large class of systems biology models. 
It both provides insight into model behaviour and identifies the sets of rate parameters of interest.
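The core of history matching is the implausibility measure: a standardized distance between an emulator's prediction and the observed data, with parameter settings ruled out when it exceeds a cutoff (conventionally 3). The sketch below uses a cheap stand-in function in place of a trained emulator; the model, the observation, and the variance values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

def model(theta):
    # hypothetical stand-in for an emulator's mean prediction
    return np.sin(3 * theta) + theta

z_obs = model(0.7)                     # synthetic observation at theta = 0.7
var_obs, var_emu = 0.01**2, 0.02**2    # assumed observation/emulator variances

def implausibility(theta):
    """History-matching implausibility I(theta); I > 3 rules theta out."""
    return abs(model(theta) - z_obs) / np.sqrt(var_obs + var_emu)

candidates = rng.uniform(0.0, 2.0, 5000)
I = np.array([implausibility(t) for t in candidates])
not_ruled_out = candidates[I <= 3]
print(len(not_ruled_out), len(candidates))  # most of the space is ruled out
```

In the full methodology this cut is applied in waves over a 32-dimensional space, with the emulator (and its variance) refitted inside the surviving region at each wave.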
Protein-Protein Interactions of Azurin Complex by Coarse-Grained Simulations with a Gō-Like Model
NASA Astrophysics Data System (ADS)
Rusmerryani, Micke; Takasu, Masako; Kawaguchi, Kazutomo; Saito, Hiroaki; Nagao, Hidemi
Proteins usually perform their biological functions by forming complexes with other proteins. It is very important to study protein-protein interactions since these interactions are crucial in many processes of a living organism. In this study, we develop a coarse-grained model to simulate a protein complex in a liquid system. We carry out molecular dynamics simulations with topology-based potential interactions to simulate dynamical properties of Pseudomonas aeruginosa azurin complex systems. Azurin is known to play an essential role as an anticancer agent and to bind many important intracellular molecules. Some physical properties are monitored during the simulation to give a better understanding of the influence of protein-protein interactions on azurin complex dynamics. These studies will provide valuable insights for further investigation of protein-protein interactions in more realistic systems.
A Spatially Continuous Model of Carbohydrate Digestion and Transport Processes in the Colon
Moorthy, Arun S.; Brooks, Stephen P. J.; Kalmokoff, Martin; Eberl, Hermann J.
2015-01-01
A spatially continuous mathematical model of transport processes, anaerobic digestion, and microbial complexity as would be expected in the human colon is presented. The model is a system of first-order partial differential equations with a context-determined number of dependent variables and stiff, non-linear source terms. Numerical simulation of the model is used to elucidate information about the colon-microbiota complex. It is found that the composition of material at the model outflow does not accurately describe the composition of material at other model locations, and inferences drawn from outflow data vary with the model's reactor representation. Additionally, increased microbial complexity allows the total microbial community to withstand major system perturbations in diet and community structure. However, the distribution of strains and functional groups within the microbial community can be modified depending on perturbation length and microbial kinetic parameters. Preliminary model extensions and potential investigative opportunities using the computational model are discussed. PMID:26680208
Complexity and chaos control in a discrete-time prey-predator model
NASA Astrophysics Data System (ADS)
Din, Qamar
2017-08-01
We investigate the complex behavior and chaos control in a discrete-time prey-predator model. Taking into account the Leslie-Gower prey-predator model, we propose a discrete-time prey-predator system with predator partially dependent on prey and investigate the boundedness, existence and uniqueness of positive equilibrium and bifurcation analysis of the system by using center manifold theorem and bifurcation theory. Various feedback control strategies are implemented for controlling the bifurcation and chaos in the system. Numerical simulations are provided to illustrate theoretical discussion.
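The feedback idea can be illustrated on a simpler chaotic map than the paper's Leslie-Gower system; the logistic map is used here as a stand-in, and the gain value is an assumption. A proportional controller blends each iterate a fraction of the way toward the unstable fixed point, turning chaotic wandering into exponential convergence.

```python
mu = 3.9                      # chaotic regime of the logistic map
x_star = 1 - 1 / mu           # unstable fixed point to be stabilized

def step(x, K=0.0):
    """One iteration with proportional feedback pulling the next state
    a fraction K of the way toward the target fixed point."""
    fx = mu * x * (1 - x)
    return (1 - K) * fx + K * x_star

x_free, x_ctrl = 0.2, 0.2
for _ in range(300):
    x_free, x_ctrl = step(x_free), step(x_ctrl, K=0.8)
print(abs(x_free - x_star), abs(x_ctrl - x_star))
```

With K = 0.8 the controlled map's derivative at the fixed point has modulus 0.38, so the fixed point becomes attracting; the uncontrolled iterate keeps wandering chaotically over the unit interval.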
Chakrabarti, C G; Ghosh, Koyel
2013-10-01
In the present paper we first introduce a measure of dynamical entropy of an ecosystem on the basis of a dynamical model of the system. The dynamical entropy, which depends on the eigenvalues of the community matrix of the system, leads to a consistent measure of complexity of the ecosystem, characterizing dynamical behaviours such as stability, instability and periodicity around the stationary states of the system. We illustrate the theory with some model ecosystems. Copyright © 2013 Elsevier Inc. All rights reserved.
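Since the proposed entropy depends on the eigenvalues of the community matrix, the computational starting point can be sketched as follows; the two-species coefficients below are invented for illustration, and the entropy formula itself is not reproduced. The sketch builds the Jacobian of a Lotka-Volterra system at its interior equilibrium and classifies stability from the real parts of the eigenvalues.

```python
import numpy as np

def community_matrix(A, x_eq):
    """Jacobian (community matrix) of the Lotka-Volterra system
    dx_i/dt = x_i * (r_i + (A x)_i) at an interior equilibrium,
    where r + A x_eq = 0, so J = diag(x_eq) A."""
    return np.diag(x_eq) @ A

# two-species predator-prey with self-limitation (illustrative coefficients)
r = np.array([1.0, -0.5])
A = np.array([[-0.2, -1.0],
              [ 0.5, -0.1]])
x_eq = np.linalg.solve(A, -r)          # interior equilibrium densities
eig = np.linalg.eigvals(community_matrix(A, x_eq))
stable = bool(np.all(eig.real < 0))
print(x_eq, eig, stable)
```

Negative real parts give a stable stationary state; nonzero imaginary parts indicate oscillatory (spiral or periodic) behaviour around it, which is the information the entropy measure summarizes.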
System-level simulation of liquid filling in microfluidic chips.
Song, Hongjun; Wang, Yi; Pant, Kapil
2011-06-01
Liquid filling in microfluidic channels is a complex process that depends on a variety of geometric, operating, and material parameters such as microchannel geometry, flow velocity/pressure, liquid surface tension, and contact angle of the channel surface. Accurate analysis of the filling process can provide key insights into the filling time, air bubble trapping, and dead zone formation, help evaluate trade-offs among the various design parameters, and lead to optimal chip design. However, efficient modeling of liquid filling in complex microfluidic networks continues to be a significant challenge. High-fidelity computational methods, such as the volume of fluid method, are prohibitively expensive from a computational standpoint. Analytical models, on the other hand, are primarily applicable to idealized geometries and, hence, are unable to accurately capture chip-level behavior of complex microfluidic systems. This paper presents a parametrized dynamic model for the system-level analysis of liquid filling in three-dimensional (3D) microfluidic networks. In our approach, a complex microfluidic network is deconstructed into a set of commonly used components, such as reservoirs, microchannels, and junctions. The components are then assembled according to their spatial layout and operating rationale to achieve a rapid system-level model. A dynamic model based on the transient momentum equation is developed to track the liquid front in the microchannels. The principle of mass conservation at the junction is used to link the fluidic parameters in the microchannels emanating from the junction. Assembly of these component models yields a set of differential and algebraic equations, which upon integration provides temporal information of the liquid filling process, particularly liquid front propagation (i.e., the arrival time). 
The models are used to simulate the transient liquid filling process in a variety of microfluidic constructs and in a multiplexer, representing a complex microfluidic network. The accuracy (relative error less than 7%) and orders-of-magnitude speedup (30,000x to 4,000,000x) of our system-level models are verified by comparison against 3D high-fidelity numerical studies. Our findings clearly establish the utility of our models and simulation methodology for fast, reliable analysis of liquid filling to guide the design optimization of complex microfluidic networks.
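In the same spirit as those component models, a one-channel version of the front-tracking idea can be sketched in a few lines. This assumes a parallel-plate channel and quasi-steady Poiseuille flow with inertia neglected (the classical Washburn limit), not the paper's full transient momentum model, and the fluid parameters are illustrative values for water.

```python
import math

# assumed parameters for a water-filled parallel-plate microchannel
mu = 1.0e-3                  # viscosity, Pa*s
h = 50e-6                    # gap height, m
sigma = 0.072                # surface tension, N/m
theta = math.radians(30)     # contact angle
dP = 2 * sigma * math.cos(theta) / h   # capillary driving pressure, Pa

def front_position(t, dt=1e-7, L0=1e-6):
    """Quasi-steady 1-D front tracking: the Poiseuille resistance of the
    filled length L balances the capillary pressure (inertia neglected)."""
    L = L0
    for _ in range(int(t / dt)):
        dLdt = dP * h * h / (12 * mu * L)   # parallel-plate Washburn law
        L += dt * dLdt
    return L

t = 0.01
numeric = front_position(t)
analytic = math.sqrt(dP * h * h * t / (6 * mu))  # closed-form Washburn solution
print(numeric, analytic)
```

The numerically tracked front follows the analytic square-root-of-time law; the paper's system-level approach chains many such component models through junction mass balances.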
Enhancing metaproteomics-The value of models and defined environmental microbial systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herbst, Florian-Alexander; Lünsmann, Vanessa; Kjeldal, Henrik
2016-01-21
Metaproteomics - the large-scale characterization of the entire protein complement of environmental microbiota at a given point in time - has added unique features and possibilities for studying environmental microbial communities and unraveling these “black boxes”. New technical challenges arose that were not an issue for classical proteome analytics, and choosing the appropriate model system for the research question can be difficult. Here, we review different model systems for metaproteome analysis. Following a short introduction to microbial communities and systems, we discuss the most used systems, ranging from technical systems through rhizospheric models to systems for the medical field. These include acid mine drainage, anaerobic digesters, activated sludge, planted fixed-bed reactors, gastrointestinal simulators, and in vivo models. Model systems are useful for evaluating the challenges encountered within (but not limited to) metaproteomics, including species complexity and coverage, biomass availability, and reliable protein extraction. The implementation of model systems can be considered a step forward in understanding microbial responses and the ecological distribution of member organisms. In the future, novel improvements will be necessary to fully engage complex environmental systems.
Wu, Naiqi; Zhou, MengChu
2005-12-01
An automated manufacturing system (AMS) contains a number of versatile machines (or workstations), buffers, and an automated material handling system (MHS), and is computer-controlled. An effective and flexible alternative for implementing the MHS is an automated guided vehicle (AGV) system. The deadlock issue in AMS is very important to its operation and has been studied extensively. Deadlock problems have traditionally been treated separately for parts in production and parts in transportation, with many techniques developed for each problem. However, such treatment does not take advantage of the flexibility offered by multiple AGVs. In general, it is intractable to obtain a maximally permissive control policy for either problem. Instead, this paper investigates the two problems in an integrated way. First, we model the AGV system and the part-processing processes by resource-oriented Petri nets, and then integrate the two models using macro transitions. Based on the combined model, a novel control policy for deadlock avoidance is proposed. It is shown to be maximally permissive with computational complexity O(n²), where n is the number of machines in the AMS, if the complexity of controlling part transportation by AGVs is not considered. Thus, the complexity of deadlock avoidance for the whole system is bounded by the complexity of controlling the AGV system. An illustrative example shows its application and power.
Mostafa, Salama A; Mustapha, Aida; Mohammed, Mazin Abed; Ahmad, Mohd Sharifuddin; Mahmoud, Moamin A
2018-04-01
Autonomous agents are being widely used in many systems, such as ambient assisted-living systems, to perform tasks on behalf of humans. However, these systems usually operate in complex environments that entail uncertain, highly dynamic, or irregular workload. In such environments, autonomous agents tend to make decisions that lead to undesirable outcomes. In this paper, we propose a fuzzy-logic-based adjustable autonomy (FLAA) model to manage the autonomy of multi-agent systems that are operating in complex environments. This model aims to facilitate the autonomy management of agents and help them make competent autonomous decisions. The FLAA model employs fuzzy logic to quantitatively measure and distribute autonomy among several agents based on their performance. We implement and test this model in the Automated Elderly Movements Monitoring (AEMM-Care) system, which uses agents to monitor the daily movement activities of elderly users and perform fall detection and prevention tasks in a complex environment. The test results show that the FLAA model improves the accuracy and performance of these agents in detecting and preventing falls. Copyright © 2018 Elsevier B.V. All rights reserved.
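The FLAA internals are not specified in the abstract, but the general fuzzy-logic mechanism can be sketched: map each agent's measured performance through membership functions into rule activations, then defuzzify to an autonomy level. The membership shapes, rule outputs, and agent names below are all hypothetical.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def autonomy_level(performance):
    """Hypothetical fuzzy rules: low performance -> supervised, medium ->
    semi-autonomous, high -> fully autonomous. Defuzzified by a weighted
    centroid of the crisp rule outputs."""
    mu_low = tri(performance, -0.01, 0.0, 0.5)
    mu_med = tri(performance, 0.2, 0.5, 0.8)
    mu_high = tri(performance, 0.5, 1.0, 1.01)
    levels = {0.1: mu_low, 0.5: mu_med, 0.9: mu_high}  # crisp output per rule
    total = sum(levels.values())
    return sum(l * m for l, m in levels.items()) / total if total else 0.5

# illustrative agents with measured performance scores in [0, 1]
agents = {"fall_detector": 0.92, "reminder": 0.55, "new_sensor": 0.15}
print({name: round(autonomy_level(p), 2) for name, p in agents.items()})
```

The better an agent performs, the more autonomy it is granted, which is the quantitative distribution idea the abstract describes.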
Modeling the modified drug release from curved shape drug delivery systems - Dome Matrix®.
Caccavo, D; Barba, A A; d'Amore, M; De Piano, R; Lamberti, G; Rossi, A; Colombo, P
2017-12-01
The controlled drug release from hydrogel-based drug delivery systems is a topic of large interest for research in pharmacology. Mathematical modeling of the behavior of these systems is a tool of emerging relevance, since simulations can be of use in the design of novel systems, in particular complex-shaped tablets. In this work a previously developed model was applied to complex-shaped oral drug delivery systems based on hydrogels (Dome Matrix®). Furthermore, the model was successfully adopted to describe drug release from partially accessible Dome Matrix® systems (systems with some surfaces coated). In these simulations, the erosion rate was used as a fitting parameter, and its dependence upon the surface area/volume ratio and the local fluid dynamics was discussed. The model parameters were determined by comparison with the drug release profile from a cylindrical tablet; the model was then successfully used to predict drug release from a Dome Matrix® system, both in the simple module configuration and in assembled (void and piled) module configurations. It was also demonstrated that, given the same initial S/V ratio, drug release is independent of tablet shape and is influenced only by the evolution of S/V. The model thus proves able to describe the observed phenomena and can be of use in the design of oral drug delivery systems, even complex-shaped ones. Copyright © 2017 Elsevier B.V. All rights reserved.
Adaptive selection and validation of models of complex systems in the presence of uncertainty
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farrell-Maupin, Kathryn; Oden, J. T.
This study describes versions of OPAL, the Occam-Plausibility Algorithm in which the use of Bayesian model plausibilities is replaced with information theoretic methods, such as the Akaike Information Criterion and the Bayes Information Criterion. Applications to complex systems of coarse-grained molecular models approximating atomistic models of polyethylene materials are described. All of these model selection methods take into account uncertainties in the model, the observational data, the model parameters, and the predicted quantities of interest. A comparison of the models chosen by Bayesian model selection criteria and those chosen by the information-theoretic criteria is given.
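The information criteria named above trade goodness of fit against parameter count. A self-contained illustration on a toy regression problem (not the coarse-grained polyethylene models) shows the mechanism; the Gaussian-likelihood form of AIC/BIC used here is standard, but the data and candidate models are invented.

```python
import numpy as np

rng = np.random.default_rng(7)
x = np.linspace(-1, 1, 60)
y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(0, 0.1, x.size)  # truth: quadratic

def aic_bic(y, y_hat, k, n):
    """Gaussian-likelihood AIC and BIC (up to an additive constant) for a
    model with k parameters; smaller is better, BIC penalizes k harder."""
    rss = ((y - y_hat) ** 2).sum()
    fit = n * np.log(rss / n)
    return fit + 2 * k, fit + k * np.log(n)

scores = {}
for degree in range(6):                        # nested candidate models
    y_hat = np.polyval(np.polyfit(x, y, degree), x)
    scores[degree] = aic_bic(y, y_hat, degree + 1, x.size)

best_aic = min(scores, key=lambda d: scores[d][0])
best_bic = min(scores, key=lambda d: scores[d][1])
print(best_aic, best_bic)  # both criteria reject the underfitting degrees 0-1
```

Because BIC's per-parameter penalty exceeds AIC's for this sample size, BIC never selects a more complex model than AIC, mirroring the Occam-style preference for parsimony that OPAL formalizes.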
The application of sensitivity analysis to models of large scale physiological systems
NASA Technical Reports Server (NTRS)
Leonard, J. I.
1974-01-01
A survey of the literature on sensitivity analysis as it applies to biological systems is reported, along with a brief development of sensitivity theory. A simple population model and a more complex thermoregulatory model illustrate the investigatory techniques and the interpretation of parameter sensitivity analysis. The role of sensitivity analysis in validating and verifying models, in identifying relative parameter influence, and in estimating errors in model behavior due to uncertainty in input data is presented. This analysis is valuable to the simulationist and the experimentalist in allocating resources for data collection. A method is presented for reducing highly complex, nonlinear models to simple linear algebraic models that can be useful for making rapid, first-order calculations of system behavior.
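As a concrete miniature of the technique surveyed, the sketch below computes normalized finite-difference sensitivities for a logistic population model; the model choice, parameter values, and evaluation time are assumptions for illustration, and the normalization is the standard relative-sensitivity coefficient (fractional change in output per fractional change in parameter).

```python
import math

def logistic(t, r, K, N0=10.0):
    """Closed-form logistic population model N(t)."""
    return K / (1 + (K / N0 - 1) * math.exp(-r * t))

def relative_sensitivity(param, base, t=5.0, eps=1e-6, **fixed):
    """Central finite-difference relative sensitivity (dN/N)/(dp/p) of
    N(t) with respect to the named parameter, holding the others fixed."""
    lo = logistic(t, **{**fixed, param: base * (1 - eps)})
    hi = logistic(t, **{**fixed, param: base * (1 + eps)})
    N = logistic(t, **{**fixed, param: base})
    return ((hi - lo) / N) / (2 * eps)

s_r = relative_sensitivity("r", 0.8, K=1000.0)   # sensitivity to growth rate
s_K = relative_sensitivity("K", 1000.0, r=0.8)   # sensitivity to capacity
print(s_r, s_K)
```

During the growth phase chosen here the output is far more sensitive to the growth rate than to the carrying capacity, exactly the kind of ranking used to allocate data-collection resources.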
Entering an Era of Synthesis of Modeling
NASA Astrophysics Data System (ADS)
Guerin, Stephen
First, I believe we're entering an era of synthesis of modeling. Over the past 20 years, we've seen the proliferation of many isolated complex systems models. I think we now need tools for researchers, policy makers and the public to share models. Sharing could happen by stacking different layers of spatial agent-based models in geographic information systems and projecting interactive visualization out onto shared surfaces. Further, we need to make model authoring tools much more accessible to the point where motivated policy makers can author on their own. With the increased ability to author and share models, I believe this will allow us to scale our research to understand and manage the many interacting systems that make up our complex world...
Representing Operational Modes for Situation Awareness
NASA Astrophysics Data System (ADS)
Kirchhübel, Denis; Lind, Morten; Ravn, Ole
2017-01-01
Operating complex plants is an increasingly demanding task for human operators. Diagnosis of and reaction to on-line events requires the interpretation of real-time data. Vast amounts of sensor data as well as operational knowledge about the state and design of the plant are necessary to deduce reasonable reactions to abnormal situations. Intelligent computational support tools can make the operator's task easier, but they require knowledge about the overall system in the form of some model. While fault-tolerant control design tools based on physical principles and relations are valuable for designing robust systems, the models become too complex when considering the interactions on a plant-wide level. The alarm systems meant to support human operators in the diagnosis of the plant-wide situation, on the other hand, fail regularly in situations where these interactions of systems lead to many related alarms, overloading the operator with alarm floods. Functional modelling can provide a middle way to reduce the complexity of plant-wide models by abstracting from physical details to more general functions and behaviours. Based on functional models, the propagation of failures through the interconnected systems can be inferred and alarm floods can potentially be reduced to their root cause. However, the desired behaviour of a complex system changes due to operating procedures that require more than one physical and functional configuration. In this paper, a consistent representation of possible configurations is deduced from the analysis of an exemplary start-up procedure by functional models. The proposed interpretation of the modelling concepts simplifies the functional modelling of distinct modes. The analysis further reveals relevant links between the quantitative sensor data and the qualitative perspective of the diagnostics tool based on functional models.
This will form the basis for the ongoing development of a novel real-time diagnostics system based on the on-line adaptation of the underlying MFM model.
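The root-cause reduction of alarm floods described above can be illustrated with a toy propagation graph. This is a hedged sketch, not Multilevel Flow Modelling (MFM): the plant functions, the edges, and the root-cause rule below are invented for illustration.

```python
# Hypothetical functional model: edges point from a function to the
# functions that depend on it, so failures propagate along edges.
influences = {
    "feed_pump": ["heater"],
    "heater": ["reactor_temp"],
    "cooling": ["reactor_temp"],
    "reactor_temp": ["product_quality"],
}

def root_causes(alarms, influences):
    """An alarmed function is a root-cause candidate if no alarmed
    predecessor in the propagation graph can explain it."""
    upstream = {n: set() for n in alarms}
    for src, dsts in influences.items():
        for dst in dsts:
            if dst in alarms and src in alarms:
                upstream[dst].add(src)
    return sorted(n for n in alarms if not upstream[n])

# An alarm flood of three related alarms reduces to one root cause.
print(root_causes({"feed_pump", "heater", "reactor_temp"}, influences))  # → ['feed_pump']
```

The operator then sees one candidate cause instead of three alarms; a real functional-model tool would of course reason over richer function and mode semantics than this graph.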
Sensitivity analysis of dynamic biological systems with time-delays.
Wu, Wu Hsiung; Wang, Feng Sheng; Chang, Maw Shang
2010-10-15
Mathematical modeling has been applied to the study and analysis of complex biological systems for a long time. Some processes in biological systems, such as gene expression and feedback control in signal transduction networks, involve a time delay. These systems are represented as delay differential equation (DDE) models. Numerical sensitivity analysis of a DDE model by the direct method requires the solutions of model and sensitivity equations with time-delays. The major effort is the computation of the Jacobian matrix when computing the solution of sensitivity equations. The computation of partial derivatives of complex equations either by the analytic method or by symbolic manipulation is time consuming, inconvenient, and prone to human error. To address this problem, an automatic approach to obtain the derivatives of complex functions efficiently and accurately is necessary. We have proposed an efficient algorithm with adaptive step size control to compute the solution and dynamic sensitivities of biological systems described by ordinary differential equations (ODEs). The adaptive direct-decoupled algorithm is extended here to compute the solution and dynamic sensitivities of time-delay systems described by DDEs. To save human effort and avoid human error in the computation of partial derivatives, an automatic differentiation technique is embedded in the extended algorithm to evaluate the Jacobian matrix. The extended algorithm is implemented and applied to two realistic models with time-delays: the cardiovascular control system and the TNF-α signal transduction network. The results show that the extended algorithm is a good tool for dynamic sensitivity analysis on DDE models with less user intervention. A theoretical comparison with direct-coupled methods shows that the extended algorithm is efficient, accurate, and easy to use for end users without a programming background to do dynamic sensitivity analysis on complex biological systems with time-delays.
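The automatic differentiation technique mentioned above can be illustrated with a minimal forward-mode implementation using dual numbers. This is not the authors' code: the `Dual` class and the toy right-hand side below are a generic sketch of how a Jacobian can be evaluated without symbolic or hand-coded derivatives.

```python
class Dual:
    """Forward-mode automatic differentiation via dual numbers: each value
    carries a derivative, so Jacobians need no symbolic or hand derivation."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def _lift(self, o):
        return o if isinstance(o, Dual) else Dual(o)
    def __add__(self, o):
        o = self._lift(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __sub__(self, o):
        o = self._lift(o)
        return Dual(self.val - o.val, self.der - o.der)
    def __mul__(self, o):
        o = self._lift(o)
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def jacobian(f, x):
    """Jacobian of f: R^n -> R^m by seeding one input direction at a time."""
    m = len(f([Dual(v) for v in x]))
    cols = []
    for i in range(len(x)):
        seeded = [Dual(v, 1.0 if j == i else 0.0) for j, v in enumerate(x)]
        cols.append([y.der for y in f(seeded)])
    return [[cols[i][k] for i in range(len(x))] for k in range(m)]

# Toy right-hand side with feedback terms: f(x) = [x1*x2 - x0, x0 - x1*x1].
def rhs(x):
    return [x[1] * x[2] - x[0], x[0] - x[1] * x[1]]

print(jacobian(rhs, [1.0, 2.0, 3.0]))  # → [[-1.0, 3.0, 2.0], [1.0, -4.0, 0.0]]
```

A DDE sensitivity solver would evaluate such a Jacobian inside the integrator at both current and delayed states; the mechanism for obtaining exact derivatives without symbolic algebra is the same.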
Butler, Samuel D; Nauyoks, Stephen E; Marciniak, Michael A
2015-06-01
Of the many classes of bidirectional reflectance distribution function (BRDF) models, two popular classes of models are the microfacet model and the linear systems diffraction model. The microfacet model has the benefit of speed and simplicity, as it uses geometric optics approximations, while linear systems theory uses a diffraction approach to compute the BRDF, at the expense of greater computational complexity. In this Letter, nongrazing BRDF measurements of rough and polished surface-reflecting materials at multiple incident angles are scaled by the microfacet cross section conversion term, but in the linear systems direction cosine space, resulting in great alignment of BRDF data at various incident angles in this space. This results in a predictive BRDF model for surface-reflecting materials at nongrazing angles, while avoiding some of the computational complexities in the linear systems diffraction model.
A probabilistic process model for pelagic marine ecosystems informed by Bayesian inverse analysis
Marine ecosystems are complex systems with multiple pathways that produce feedback cycles, which may lead to unanticipated effects. Models abstract this complexity and allow us to predict, understand, and hypothesize. In ecological models, however, the paucity of empirical data...
POD Model Reconstruction for Gray-Box Fault Detection
NASA Technical Reports Server (NTRS)
Park, Han; Zak, Michail
2007-01-01
Proper orthogonal decomposition (POD) is the mathematical basis of a method of constructing low-order mathematical models for the "gray-box" fault-detection algorithm that is a component of a diagnostic system known as beacon-based exception analysis for multi-missions (BEAM). POD has been successfully applied in reducing computational complexity by generating simple models that can be used for control and simulation of complex systems such as fluid flows. In the present application to BEAM, POD brings the same benefits to automated diagnosis. BEAM is a method of real-time or offline, automated diagnosis of a complex dynamic system. The gray-box approach makes it possible to utilize incomplete or approximate knowledge of the dynamics of the system that one seeks to diagnose. In the gray-box approach, a deterministic model of the system is used to filter a time series of system sensor data to remove the deterministic components of the time series from further examination. What is left after the filtering operation is a time series of residual quantities that represent the unknown (or at least unmodeled) aspects of the behavior of the system. Stochastic modeling techniques are then applied to the residual time series. The procedure for detecting abnormal behavior of the system then becomes one of looking for statistical differences between the residual time series and the predictions of the stochastic model.
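The core POD step, extracting a low-order basis from snapshots of system behavior, can be sketched independently of BEAM. The snapshot data and the power-iteration shortcut below are illustrative assumptions, not the BEAM implementation.

```python
# Minimal proper-orthogonal-decomposition (POD) sketch: snapshots of the
# system state form a data matrix; the dominant eigenvector of its
# covariance is the leading POD mode, found here by power iteration
# instead of a full SVD.

def matvec(A, v):
    return [sum(a * b for a, b in zip(row, v)) for row in A]

def pod_leading_mode(snapshots, iters=200):
    n = len(snapshots[0])
    # covariance-like matrix C = sum of outer products of the snapshots
    C = [[sum(s[i] * s[j] for s in snapshots) for j in range(n)] for i in range(n)]
    v = [1.0] * n
    for _ in range(iters):
        w = matvec(C, v)
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Two snapshots that are scalar multiples of the same spatial shape:
# a rank-one data set, so a single POD mode reconstructs everything.
snaps = [[1.0, 2.0, 2.0], [2.0, 4.0, 4.0]]
mode = pod_leading_mode(snaps)
print([round(x, 3) for x in mode])  # → [0.333, 0.667, 0.667]
```

Projecting the full state onto a few such modes is what yields the low-order deterministic model used to filter the sensor time series.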
A Systems Approach to Vaccine Decision Making
Lee, Bruce Y.; Mueller, Leslie E.; Tilchin, Carla G.
2016-01-01
Vaccines reside in a complex multiscale system that includes biological, clinical, behavioral, social, operational, environmental, and economical relationships. Not accounting for these systems when making decisions about vaccines can result in changes that have little effect rather than solutions, lead to unsustainable solutions, miss indirect (e.g., secondary, tertiary, and beyond) effects, cause unintended consequences, and lead to wasted time, effort, and resources. Mathematical and computational modeling can help better understand and address complex systems by representing all or most of the components, relationships, and processes. Such models can serve as “virtual laboratories” to examine how a system operates and test the effects of different changes within the system. Here are ten lessons learned from using computational models to bring more of a systems approach to vaccine decision making: (i) traditional single measure approaches may overlook opportunities; (ii) there is complex interplay among many vaccine, population, and disease characteristics; (iii) accounting for perspective can identify synergies; (iv) the distribution system should not be overlooked; (v) target population choice can have secondary and tertiary effects; (vi) potentially overlooked characteristics can be important; (vii) characteristics of one vaccine can affect other vaccines; (viii) the broader impact of vaccines is complex; (ix) vaccine administration extends beyond the provider level; (x) and the value of vaccines is dynamic. PMID:28017430
A systems approach to vaccine decision making.
Lee, Bruce Y; Mueller, Leslie E; Tilchin, Carla G
2017-01-20
Vaccines reside in a complex multiscale system that includes biological, clinical, behavioral, social, operational, environmental, and economical relationships. Not accounting for these systems when making decisions about vaccines can result in changes that have little effect rather than solutions, lead to unsustainable solutions, miss indirect (e.g., secondary, tertiary, and beyond) effects, cause unintended consequences, and lead to wasted time, effort, and resources. Mathematical and computational modeling can help better understand and address complex systems by representing all or most of the components, relationships, and processes. Such models can serve as "virtual laboratories" to examine how a system operates and test the effects of different changes within the system. Here are ten lessons learned from using computational models to bring more of a systems approach to vaccine decision making: (i) traditional single measure approaches may overlook opportunities; (ii) there is complex interplay among many vaccine, population, and disease characteristics; (iii) accounting for perspective can identify synergies; (iv) the distribution system should not be overlooked; (v) target population choice can have secondary and tertiary effects; (vi) potentially overlooked characteristics can be important; (vii) characteristics of one vaccine can affect other vaccines; (viii) the broader impact of vaccines is complex; (ix) vaccine administration extends beyond the provider level; and (x) the value of vaccines is dynamic. Copyright © 2016 Elsevier Ltd. All rights reserved.
Improving Systems Engineering Effectiveness in Rapid Response Development Environments
2012-06-02
Systems engineering is often ineffective in development environments where large, complex, brownfield systems of systems are evolved through parallel development of new capabilities in response to external, time...
Darabi Sahneh, Faryad; Scoglio, Caterina; Riviere, Jim
2013-01-01
Background Nanoparticle-protein corona complex formation involves adsorption of protein molecules onto nanoparticle surfaces in a physiological environment. Understanding the corona formation process is crucial in predicting nanoparticle behavior in biological systems, including applications of nanotoxicology and development of nano drug delivery platforms. Method This paper extends previous modeling work to derive a mathematical model describing the dynamics of nanoparticle corona complex formation from population balance equations. We apply nonlinear dynamics techniques to derive analytical results for the composition of the nanoparticle-protein corona complex, and validate our results through numerical simulations. Results The model presented in this paper exhibits two phases of corona complex dynamics. In the first phase, proteins rapidly bind to the free surface of nanoparticles, leading to a metastable composition. During the second phase, continuous association and dissociation of protein molecules with nanoparticles slowly changes the composition of the corona complex. Given sufficient time, the composition of the corona complex reaches an equilibrium state of stable composition. We find approximate analytical formulae for the metastable and stable compositions of the corona complex. Our formulae are well structured to clearly identify the important parameters determining corona composition. Conclusion The dynamics of biocorona formation constitute a vital aspect of interactions between nanoparticles and living organisms. Our results further the understanding of these dynamics through quantitation of experimental conditions, relating modeling results for in vitro systems to better predict behavior in vivo. One potential application would involve relating a simple cell culture medium to a complex protein medium, such as blood or tissue fluid. PMID:23741371
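The two-phase dynamics described in the Results can be reproduced qualitatively with a toy competitive-adsorption model. This is a hedged sketch, not the paper's population balance model: the two protein species and all rate constants below are invented.

```python
def corona(t_end, dt=1e-3):
    """Euler integration of two proteins competing for nanoparticle surface:
    A binds fast but weakly, B binds slowly but tightly (assumed rates).
    Returns the fractional surface coverages (theta_A, theta_B)."""
    ka_A, kd_A = 10.0, 1.0     # A: fast on, fast off
    ka_B, kd_B = 0.5, 0.001    # B: slow on, very slow off
    thA = thB = 0.0
    for _ in range(int(t_end / dt)):
        free = 1.0 - thA - thB
        thA += dt * (ka_A * free - kd_A * thA)
        thB += dt * (ka_B * free - kd_B * thB)
    return thA, thB

early = corona(1.0)            # metastable composition: A-rich corona
late = corona(500.0, dt=0.01)  # near-equilibrium: B has displaced A
print(early, late)
```

Even this crude sketch shows the paper's qualitative picture: a rapidly formed metastable corona whose composition slowly relaxes, via exchange, toward a stable equilibrium dominated by the tighter binder.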
Demonstration of a Safety Analysis on a Complex System
NASA Technical Reports Server (NTRS)
Leveson, Nancy; Alfaro, Liliana; Alvarado, Christine; Brown, Molly; Hunt, Earl B.; Jaffe, Matt; Joslyn, Susan; Pinnell, Denise; Reese, Jon; Samarziya, Jeffrey;
1997-01-01
For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards are done using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components in these systems. They also need to include a variety of analysis techniques and orthogonal approaches: there exists no single safety analysis or evaluation technique that can handle all aspects of complex systems. Applying only one or two may make us feel satisfied, but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.
Pope, Bernard J; Fitch, Blake G; Pitman, Michael C; Rice, John J; Reumann, Matthias
2011-01-01
Future multiscale and multiphysics models must use the power of high performance computing (HPC) systems to enable research into human disease, translational medical science, and treatment. Previously we showed that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message passing processes (e.g. the message passing interface (MPI)) with multithreading (e.g. OpenMP, POSIX pthreads). The objective of this work is to compare the performance of such hybrid programming models when applied to the simulation of a lightweight multiscale cardiac model. Our results show that the hybrid models do not perform favourably when compared to an implementation using only MPI, which is in contrast to our results using complex physiological models. Thus, with regards to lightweight multiscale cardiac models, the user may not need to increase programming complexity by using a hybrid programming approach. However, considering that model complexity will increase along with HPC system size, in both node count and number of cores per node, it is still foreseeable that we will achieve faster-than-real-time multiscale cardiac simulations on these systems using hybrid programming models.
Wiemuth, M; Junger, D; Leitritz, M A; Neumann, J; Neumuth, T; Burgert, O
2017-08-01
Medical processes can be modeled using different methods and notations. Currently used modeling systems like Business Process Model and Notation (BPMN) are not capable of describing highly flexible and variable medical processes in sufficient detail. We combined two modeling approaches, Business Process Management (BPM) and Adaptive Case Management (ACM), to be able to model non-deterministic medical processes, using the new standards Case Management Model and Notation (CMMN) and Decision Model and Notation (DMN). First, we explain how CMMN, DMN, and BPMN can be used to model non-deterministic medical processes. We applied this methodology to model 79 cataract operations provided by University Hospital Leipzig, Germany, and four cataract operations provided by University Eye Hospital Tuebingen, Germany. Our model consists of 85 tasks and about 20 decisions in BPMN. We were able to expand the system with more complex situations that might appear during an intervention. Effective modeling of the cataract intervention is possible using the combination of BPM and ACM, which makes it possible to depict complex processes with complex decisions. This combination offers a significant advantage for modeling perioperative processes.
Model-order reduction of lumped parameter systems via fractional calculus
NASA Astrophysics Data System (ADS)
Hollkamp, John P.; Sen, Mihir; Semperlotti, Fabio
2018-04-01
This study investigates the use of fractional order differential models to simulate the dynamic response of non-homogeneous discrete systems and to achieve efficient and accurate model order reduction. The traditional integer order approach to the simulation of non-homogeneous systems dictates the use of numerical solutions and often imposes stringent compromises between accuracy and computational performance. Fractional calculus provides an alternative approach where complex dynamical systems can be modeled with compact fractional equations that not only can still guarantee analytical solutions, but can also enable high levels of order reduction without compromising on accuracy. Different approaches are explored in order to transform the integer order model into a reduced order fractional model able to match the dynamic response of the initial system. Analytical and numerical results show that, under certain conditions, an exact match is possible and the resulting fractional differential models have both a complex and frequency-dependent order of the differential operator. The implications of this type of approach for both model order reduction and model synthesis are discussed.
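A standard way to discretize a fractional-order operator of the kind discussed above is the Grünwald-Letnikov approximation. The sketch below is generic, not the authors' method; it checks the half-derivative of f(t) = t against its known closed form 2 sqrt(t / pi).

```python
import math

def gl_weights(alpha, n):
    """Grunwald-Letnikov weights w_k = (-1)^k C(alpha, k), by recurrence."""
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (k - 1 - alpha) / k)
    return w

def frac_derivative(samples, alpha, h):
    """GL approximation of the order-alpha derivative at the last sample:
    D^alpha f(t) ~ h**(-alpha) * sum_k w_k * f(t - k h)."""
    w = gl_weights(alpha, len(samples))
    return sum(wk * fk for wk, fk in zip(w, reversed(samples))) / h ** alpha

# Half-derivative of f(t) = t at t = 1, sampled on [0, 1] with step h.
h = 1e-3
samples = [k * h for k in range(int(1 / h) + 1)]  # f(t) = t
approx = frac_derivative(samples, 0.5, h)
exact = 2.0 * math.sqrt(1.0 / math.pi)
print(approx, exact)
```

The long memory visible in the weighted sum over the entire history is exactly what lets a compact fractional equation stand in for many coupled integer-order states.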
NASA Astrophysics Data System (ADS)
Chang, Ni-Bin; Weng, Yu-Chi
2013-03-01
Short-term predictions of potential impacts from accidental release of various radionuclides at nuclear power plants are acutely needed, especially after the Fukushima accident in Japan. An integrated modeling system that provides expert services to assess the consequences of accidental or intentional releases of radioactive materials to the atmosphere has received wide attention. These scenarios can be initiated either by accident due to human, software, or mechanical failures, or by intentional acts such as sabotage and radiological dispersal devices. Stringent action might be required just minutes after the occurrence of an accidental or intentional release. Previous studies seldom consider the suitability of air pollutant dispersion models, or the connectivity between source term, dispersion, and exposure assessment models, in a holistic context for decision support when designing emergency preparedness and response systems. The Gaussian plume and puff models, which are suitable only for describing neutral air pollutants over flat terrain under limited meteorological situations, are nonetheless frequently used to predict the impact of accidental releases from industrial sources. In situations with complex terrain or special meteorological conditions, the resulting proposed emergency response actions might be questionable and even intractable to decision-makers responsible for maintaining public health and environmental quality. This study is a preliminary effort to integrate source term, dispersion, and exposure assessment models into a Spatial Decision Support System (SDSS) to tackle these complex issues for short-term emergency response planning and risk assessment at nuclear power plants. Through a series of model screening procedures, we found that the diagnostic (objective) wind field model, with the aid of sufficient on-site meteorological monitoring data, was the most applicable model to promptly address the trend of local wind field patterns.
However, most of the hazardous materials being released into the environment from nuclear power plants are not neutral pollutants, so the particle and multi-segment puff models can be regarded as the most suitable models to incorporate into the output of the diagnostic wind field model in a modern emergency preparedness and response system. The proposed SDSS illustrates the state-of-the-art system design based on the situation of complex terrain in South Taiwan. This system design of SDSS with 3-dimensional animation capability using a tailored source term model in connection with ArcView® Geographical Information System map layers and remote sensing images is useful for meeting the design goal of nuclear power plants located in complex terrain.
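The flat-terrain Gaussian plume model whose limitations the study emphasizes can be stated compactly, which makes those limitations concrete: the formula assumes steady wind, flat ground, and a neutral pollutant. All parameter values below are illustrative, not from the study.

```python
import math

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Ground-reflected Gaussian plume concentration for a continuous point
    source. Q: emission rate (g/s), u: wind speed (m/s), H: effective release
    height (m); sigma_y, sigma_z: dispersion lengths (m) at the downwind
    distance of interest. Valid only for neutral pollutants, flat terrain,
    and steady wind, which is exactly the limitation noted in the text."""
    lateral = math.exp(-y * y / (2.0 * sigma_y ** 2))
    vertical = (math.exp(-(z - H) ** 2 / (2.0 * sigma_z ** 2))
                + math.exp(-(z + H) ** 2 / (2.0 * sigma_z ** 2)))
    return Q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Centerline ground-level concentration; the sigma values would normally come
# from stability-class curves at roughly 1 km downwind (numbers illustrative).
c = gaussian_plume(Q=100.0, u=5.0, y=0.0, z=0.0, H=50.0, sigma_y=80.0, sigma_z=40.0)
print(c)  # g/m^3
```

Nothing in this formula knows about terrain, buoyancy, or shifting winds, which is why the study instead couples particle and multi-segment puff models to a diagnostic wind field.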
Rusoja, Evan; Haynie, Deson; Sievers, Jessica; Mustafee, Navonil; Nelson, Fred; Reynolds, Martin; Sarriot, Eric; Swanson, Robert Chad; Williams, Bob
2018-01-30
As the Sustainable Development Goals are rolled out worldwide, development leaders will be looking to the experiences of the past to improve implementation in the future. Systems thinking and complexity science (ST/CS) propose that health and the health system are composed of dynamic actors constantly evolving in response to each other and their context. While offering practical guidance for steering the next development agenda, there is no consensus as to how these important ideas are discussed in relation to health. This systematic review sought to identify and describe some of the key terms, concepts, and methods in recent ST/CS literature. Using the search terms "systems thinkin* AND health OR complexity theor* AND health OR complex adaptive system* AND health," we identified 516 relevant full texts out of 3982 titles across the search period (2002-2015). The peak number of articles (83) was published in 2014, with journals specifically focused on medicine/healthcare (265), and particularly the Journal of Evaluation in Clinical Practice (37), representing the largest number by volume. Dynamic/dynamical systems (n = 332), emergence (n = 294), complex adaptive system(s) (n = 270), and interdependent/interconnected (n = 263) were the most common terms, with system dynamics modelling (58) and agent-based modelling (43) as the most common methods. The review offered several important conclusions. First, while there was no core ST/CS "canon," certain terms appeared frequently across the reviewed texts. Second, even as these ideas are gaining traction in academic and practitioner communities, most are concentrated in a few journals. Finally, articles on ST/CS remain largely theoretical, illustrating the need for further study and practical application. Given the challenge posed by the next phase of development, gaining a better understanding of ST/CS ideas and their use may lead to improvements in the implementation and practice of the Sustainable Development Goals.
Key messages: Systems thinking and complexity science, theories that acknowledge the dynamic, connected, and context-dependent nature of health, are highly relevant to the post-Millennium Development Goal era, yet lack consensus on their use in relation to health. Although heterogeneous, terms and concepts like emergence, dynamic/dynamical systems, nonlinear(ity), and interdependent/interconnected, as well as methods like system dynamics modelling and agent-based modelling, comprise systems thinking and complexity science in the health literature and are shared across an increasing number of publications within medical/healthcare disciplines. Planners, practitioners, and theorists who can better understand these key systems thinking and complexity science concepts will be better equipped to tackle the challenges of the upcoming development goals. © 2018 John Wiley & Sons, Ltd.
Epidemic modeling in complex realities.
Colizza, Vittoria; Barthélemy, Marc; Barrat, Alain; Vespignani, Alessandro
2007-04-01
In our global world, the increasing complexity of social relations and transport infrastructures is a key factor in the spread of epidemics. In recent years, the increasing availability of computer power has made it possible both to obtain reliable data quantifying the complexity of the networks on which epidemics may propagate and to envision computational tools able to tackle the analysis of such propagation phenomena. These advances have exposed the limits of homogeneous assumptions and simple spatial diffusion approaches, and have stimulated the inclusion of complex features and heterogeneities relevant to the description of epidemic diffusion. In this paper, we review recent progress that integrates complex systems and network analysis with epidemic modelling, and focus on the impact of the various complex features of real systems on the dynamics of epidemic spreading.
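The contrast with homogeneous-mixing assumptions can be made concrete with a stochastic SIR process on an explicit contact network. The network, rates, and random seed below are illustrative assumptions, not from the review.

```python
import random

def sir_on_network(adj, seed_node, beta, gamma, steps, rng):
    """Discrete-time stochastic SIR on a contact network: each infected node
    infects each susceptible neighbor with probability beta per step and
    recovers with probability gamma per step. Degree heterogeneity (here, a
    hub) shapes the outbreak in a way mean-field models cannot capture."""
    state = {node: "S" for node in adj}
    state[seed_node] = "I"
    for _ in range(steps):
        new = dict(state)
        for node, s in state.items():
            if s == "I":
                for nb in adj[node]:
                    if state[nb] == "S" and rng.random() < beta:
                        new[nb] = "I"
                if rng.random() < gamma:
                    new[node] = "R"
        state = new
    return state

# A star network: one hub, five leaves; an infected hub is a superspreader.
adj = {0: [1, 2, 3, 4, 5]}
for leaf in range(1, 6):
    adj[leaf] = [0]

final = sir_on_network(adj, seed_node=0, beta=0.9, gamma=0.1, steps=20,
                       rng=random.Random(42))
n_ever = sum(1 for s in final.values() if s != "S")
print(n_ever)  # number of nodes ever infected
```

Seeding the same process at a leaf instead of the hub typically produces a much smaller outbreak, which is the kind of structural effect the reviewed network approaches capture.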
2016-04-30
Panel 16. Improving Governance of Complex Systems Acquisition. Thursday, May 5, 2016, 11:15 a.m.-12:45 p.m. Chair: Rear Admiral David Gale, USN, Program Executive Officer, SHIPS. Complex System Governance for Acquisition: Joseph Bradley, President, Leading Change, LLC; Bryan Moser, Lecturer, MIT; John Dickmann, Vice President, Sonalysts Inc. A Complex Systems Perspective of Risk Mitigation and Modeling in...
Mathematical Modeling of Intestinal Iron Absorption Using Genetic Programming
Colins, Andrea; Gerdtzen, Ziomara P.; Nuñez, Marco T.; Salgado, J. Cristian
2017-01-01
Iron is a trace metal, key for the development of living organisms. Its absorption process is complex and highly regulated at the transcriptional, translational and systemic levels. Recently, the internalization of the DMT1 transporter has been proposed as an additional regulatory mechanism at the intestinal level, associated to the mucosal block phenomenon. The short-term effect of iron exposure in apical uptake and initial absorption rates was studied in Caco-2 cells at different apical iron concentrations, using both an experimental approach and a mathematical modeling framework. This is the first report of short-term studies for this system. A non-linear behavior in the apical uptake dynamics was observed, which does not follow the classic saturation dynamics of traditional biochemical models. We propose a method for developing mathematical models for complex systems, based on a genetic programming algorithm. The algorithm is aimed at obtaining models with a high predictive capacity, and considers an additional parameter fitting stage and an additional Jackknife stage for estimating the generalization error. We developed a model for the iron uptake system with a higher predictive capacity than classic biochemical models. This was observed both with the apical uptake dataset used for generating the model and with an independent initial rates dataset used to test the predictive capacity of the model. The model obtained is a function of time and the initial apical iron concentration, with a linear component that captures the global tendency of the system, and a non-linear component that can be associated to the movement of DMT1 transporters. The model presented in this paper allows the detailed analysis, interpretation of experimental data, and identification of key relevant components for this complex biological process. 
This general method holds great potential for application to the elucidation of biological mechanisms and their key components in other complex systems. PMID:28072870
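The Jackknife stage for estimating generalization error, mentioned above, can be sketched on a model far simpler than the paper's genetic-programming models: a least-squares line refit with each data point held out in turn. The data set below is invented for illustration.

```python
def fit_line(pts):
    """Ordinary least-squares fit y = a + b x."""
    n = len(pts)
    sx = sum(x for x, _ in pts)
    sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts)
    sxy = sum(x * y for x, y in pts)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

def jackknife_error(pts):
    """Mean squared error on held-out points: each point is left out in turn,
    the model is refit, and the omitted point is predicted. This estimates
    generalization error rather than mere goodness of fit."""
    errs = []
    for i, (x, y) in enumerate(pts):
        a, b = fit_line(pts[:i] + pts[i + 1:])
        errs.append((y - (a + b * x)) ** 2)
    return sum(errs) / len(errs)

data = [(0.0, 0.1), (1.0, 0.9), (2.0, 2.2), (3.0, 2.8), (4.0, 4.1)]
print(jackknife_error(data))
```

In the paper's setting the refit model is a candidate expression from the genetic-programming search, and the Jackknife score is what penalizes expressions that merely memorize the uptake data.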
Formalizing the Role of Agent-Based Modeling in Causal Inference and Epidemiology
Marshall, Brandon D. L.; Galea, Sandro
2015-01-01
Calls for the adoption of complex systems approaches, including agent-based modeling, in the field of epidemiology have largely centered on the potential for such methods to examine complex disease etiologies, which are characterized by feedback behavior, interference, threshold dynamics, and multiple interacting causal effects. However, considerable theoretical and practical issues impede the capacity of agent-based methods to examine and evaluate causal effects and thus illuminate new areas for intervention. We build on this work by describing how agent-based models can be used to simulate counterfactual outcomes in the presence of complexity. We show that these models are of particular utility when the hypothesized causal mechanisms exhibit a high degree of interdependence between multiple causal effects and when interference (i.e., one person's exposure affects the outcome of others) is present and of intrinsic scientific interest. Although not without challenges, agent-based modeling (and complex systems methods broadly) represent a promising novel approach to identify and evaluate complex causal effects, and they are thus well suited to complement other modern epidemiologic methods of etiologic inquiry. PMID:25480821
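The counterfactual simulation strategy described above can be sketched with a deliberately small agent-based model. Everything below (the ring network, transmission probability, and "firebreak" vaccination) is an invented illustration, not a model from the paper; the point is that the with/without-intervention contrast captures interference effects such as indirect protection.

```python
import random

def outbreak(vaccinated, seed):
    """Toy agent-based outbreak on a ring of 30 agents: infection passes to
    ring neighbors with probability 0.6 per sweep and never resolves.
    Returns the number of non-vaccinated agents ever infected."""
    rng = random.Random(seed)
    n = 30
    state = ["S"] * n
    for v in vaccinated:
        state[v] = "V"
    state[0] = "I"  # index case (agent 0 is never vaccinated in this example)
    for _ in range(60):
        for i in range(n):
            if state[i] == "I":
                for nb in ((i - 1) % n, (i + 1) % n):
                    if state[nb] == "S" and rng.random() < 0.6:
                        state[nb] = "I"
    return sum(1 for s in state if s == "I")

# Same stochastic world with and without vaccinating two "firebreak" agents.
# The difference counts infections averted among the UNtreated, so it
# includes indirect (spillover) protection, not only direct protection.
factual = outbreak(vaccinated={10, 20}, seed=7)
counterfactual = outbreak(vaccinated=set(), seed=7)
print(counterfactual - factual)
```

Because one person's exposure here affects others' outcomes, no individual-level regression recovers this contrast; simulating both arms of the counterfactual is what makes the total effect visible.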
The Social Process of Analyzing Real Water Resource Systems Plans and Management Policies
NASA Astrophysics Data System (ADS)
Loucks, Daniel
2016-04-01
Developing and applying systems analysis methods for improving the development and management of real-world water resource systems, I have learned, is primarily a social process. This talk is a call for more recognition of this reality in the modeling approaches we propose in the papers and books we publish. The mathematical models we see in many of our journals, designed to inform planners and managers of water systems, often seem more complex than they need to be. They also often seem less connected to reality than they could be. While it may be easier to publish descriptions of complex models than simpler ones, and while adding complexity might make models better able to mimic the actual complexity of the real physical and/or social systems being analyzed, the usefulness of such models is often an illusion. Sometimes the important features of reality that concern or interest those who make decisions can be adequately captured using relatively simple models. Finding the right balance for the particular issues being addressed, or the particular decisions that need to be made, is an art. When applied to real-world problems or issues in specific basins or regions, systems modeling projects often involve more attention to the social aspects than the mathematical ones. Mathematical models addressing connected, interacting, interdependent components of complex water systems are in fact some of the most useful methods we have to study and better understand the systems we manage around us. They can help us identify and evaluate possible alternative solutions to problems facing humanity today. The study of real-world systems of interacting components using mathematical models is commonly called applied systems analysis. Performing such analyses with decision makers rather than of decision makers is critical if the needed trust between project personnel and their clients is to be developed.
Using examples from recent and ongoing modeling projects in different parts of the world, this talk will attempt to show the dependence of the degree of project success on the degree of attention given to communication among project personnel, stakeholders, and decision-making institutions. It will also highlight how initial project terms of reference and expected outcomes can change, sometimes in surprising ways, during the course of such projects. Changing project objectives often result from changing stakeholder values, emphasizing the need for analyses that can adapt to this uncertainty.
Qualitative models and experimental investigation of chaotic NOR gates and set/reset flip-flops
NASA Astrophysics Data System (ADS)
Rahman, Aminur; Jordan, Ian; Blackmore, Denis
2018-01-01
It has been observed through experiments and SPICE simulations that logical circuits based upon Chua's circuit exhibit complex dynamical behaviour. This behaviour can be used to design analogues of more complex logic families and some properties can be exploited for electronics applications. Some of these circuits have been modelled as systems of ordinary differential equations. However, as the number of components in newer circuits increases so does the complexity. This renders continuous dynamical systems models impractical and necessitates new modelling techniques. In recent years, some discrete dynamical models have been developed using various simplifying assumptions. To create a robust modelling framework for chaotic logical circuits, we developed both deterministic and stochastic discrete dynamical models, which exploit the natural recurrence behaviour, for two chaotic NOR gates and a chaotic set/reset flip-flop. This work presents a complete applied mathematical investigation of logical circuits. Experiments on our own designs of the above circuits are modelled and the models are rigorously analysed and simulated showing surprisingly close qualitative agreement with the experiments. Furthermore, the models are designed to accommodate dynamics of similarly designed circuits. This will allow researchers to develop ever more complex chaotic logical circuits with a simple modelling framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rizzo, Davinia B.; Blackburn, Mark R.
As systems become more complex, systems engineers rely on experts to inform decisions. There are few experts and limited data in many complex new technologies. This challenges systems engineers as they strive to plan activities such as qualification in an environment where technical constraints are coupled with the traditional cost, risk, and schedule constraints. Bayesian network (BN) models provide a framework to aid systems engineers in planning qualification efforts with complex constraints by harnessing expert knowledge and incorporating technical factors. By quantifying causal factors, a BN model can provide data about the risk of implementing a decision supplemented with information on driving factors. This allows a systems engineer to make informed decisions and examine “what-if” scenarios. This paper discusses a novel process developed to define a BN model structure based primarily on expert knowledge supplemented with extremely limited data (25 data sets or less). The model was developed to aid qualification decisions—specifically to predict the suitability of six degrees of freedom (6DOF) vibration testing for qualification. The process defined the model structure with expert knowledge in an unbiased manner. Finally, validation during the process execution and of the model provided evidence that the process may be an effective tool in harnessing expert knowledge for a BN model.
2018-03-30
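The core of the BN approach the abstract describes can be illustrated with a deliberately tiny example. This is a hedged sketch under assumed, hypothetical node names and probabilities (a two-node network updated by Bayes' rule), not the authors' qualification model:

```python
# Illustrative sketch only: a two-node Bayesian network with expert-elicited
# CPTs. Node names and probabilities are hypothetical, not from the paper.

# Prior elicited from experts: P(HighComplexity)
p_high_complexity = 0.3

# Expert-elicited CPT: P(TestSuitable | Complexity); True = high complexity
p_suitable_given = {True: 0.4, False: 0.9}

def posterior_high_complexity(test_suitable: bool) -> float:
    """P(HighComplexity | TestSuitable = test_suitable) via Bayes' rule."""
    likelihoods = {
        True: p_suitable_given[True] if test_suitable else 1 - p_suitable_given[True],
        False: p_suitable_given[False] if test_suitable else 1 - p_suitable_given[False],
    }
    joint_high = likelihoods[True] * p_high_complexity
    joint_low = likelihoods[False] * (1 - p_high_complexity)
    return joint_high / (joint_high + joint_low)

print(round(posterior_high_complexity(True), 3))
```

A real qualification model would have many more nodes and expert-weighted links; the point here is only the mechanics of propagating evidence through expert-supplied probabilities.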
Integrated System Health Management Development Toolkit
NASA Technical Reports Server (NTRS)
Figueroa, Jorge; Smith, Harvey; Morris, Jon
2009-01-01
This software toolkit is designed to model complex systems for the implementation of embedded Integrated System Health Management (ISHM) capability, which focuses on determining the condition (health) of every element in a complex system (detect anomalies, diagnose causes, and predict future anomalies), and to provide data, information, and knowledge (DIaK) to control systems for safe and effective operation.
Context in Models of Human-Machine Systems
NASA Technical Reports Server (NTRS)
Callantine, Todd J.; Null, Cynthia H. (Technical Monitor)
1998-01-01
All human-machine systems models represent context. This paper proposes a theory of context through which models may be usefully related and integrated for design. The paper presents examples of context representation in various models, describes an application to developing models for the Crew Activity Tracking System (CATS), and advances context as a foundation for integrated design of complex dynamic systems.
NASA Astrophysics Data System (ADS)
Ma, Junhai; Li, Ting; Ren, Wenbo
2017-06-01
This paper examines the optimal decisions of a dual-channel game model considering the inputs of retailing service. We analyze how the adjustment speed of service inputs affects system complexity and market performance, and explore the stability of the equilibrium points using parameter basin diagrams. Chaos control is realized by the variable feedback method. Numerical simulation shows that complex behavior, such as period-doubling bifurcation and chaos, can trigger the system to become unstable. We measure the performance of the model in different periods by analyzing the variation of an average profit index. The theoretical results show that the percentage share of demand and the cross-service coefficients have an important influence on the stability of the system and its feasible basin of attraction.
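The variable-feedback idea can be sketched on a generic chaotic map. The logistic map and all parameter values below are illustrative stand-ins, not the paper's dual-channel price-adjustment dynamics:

```python
# Hedged sketch of linear feedback control of a chaotic one-dimensional map.
# The logistic map stands in for the (far richer) dual-channel model.

def iterate(r, k, x0=0.3, n=500):
    """Iterate x_{n+1} = f(x_n) + k*(x_n - f(x_n)), f the logistic map."""
    x = x0
    for _ in range(n):
        fx = r * x * (1 - x)
        x = fx + k * (x - fx)   # k = 0 recovers the uncontrolled chaotic map
    return x

# At r = 3.9 the uncontrolled orbit (k = 0) is chaotic; with feedback gain
# k = 0.5 the orbit settles toward the fixed point x* = 1 - 1/r.
print(round(iterate(3.9, 0.5), 4))
```

The feedback term shrinks the slope of the map at the fixed point (here to |0.5·f'(x*) + 0.5| < 1), which is the generic mechanism by which variable feedback stabilizes an otherwise unstable equilibrium.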
Complex adaptive systems: A new approach for understanding health practices.
Gomersall, Tim
2018-06-22
This article explores the potential of complex adaptive systems theory to inform behaviour change research. A complex adaptive system describes a collection of heterogeneous agents interacting within a particular context, adapting to each other's actions. In practical terms, this implies that behaviour change is 1) socially and culturally situated; 2) highly sensitive to small baseline differences in individuals, groups, and intervention components; and 3) determined by multiple components interacting "chaotically". Two approaches to studying complex adaptive systems are briefly reviewed. Agent-based modelling is a computer simulation technique that allows researchers to investigate "what if" questions in a virtual environment. Applied qualitative research techniques, on the other hand, offer a way to examine what happens when an intervention is pursued in real-time, and to identify the sorts of rules and assumptions governing social action. Although these represent very different approaches to complexity, there may be scope for mixing these methods - for example, by grounding models in insights derived from qualitative fieldwork. Finally, I will argue that the concept of complex adaptive systems offers one opportunity to gain a deepened understanding of health-related practices, and to examine the social psychological processes that produce health-promoting or damaging actions.
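A minimal agent-based sketch of the kind of sensitivity the article describes, with entirely hypothetical adoption rules and parameters (not a model from the article):

```python
# Toy agent-based model: each agent adopts a health practice with a
# probability that rises with the current share of adopters, illustrating
# socially situated behaviour change and sensitivity to baseline conditions.
import random

def simulate(n_agents=100, steps=50, initial_adopters=5, influence=0.1, seed=1):
    random.seed(seed)
    adopted = [i < initial_adopters for i in range(n_agents)]
    for _ in range(steps):
        share = sum(adopted) / n_agents
        for i in range(n_agents):
            if not adopted[i] and random.random() < influence * share:
                adopted[i] = True   # adoption pressure scales with prevalence
    return sum(adopted)

# Two runs differing only in the number of initial adopters can diverge widely.
print(simulate(initial_adopters=5), simulate(initial_adopters=10))
```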
System Dynamics (SD) models are useful for holistic integration of data to evaluate indirect and cumulative effects and inform decisions. Complex SD models can provide key insights into how decisions affect the three interconnected pillars of sustainability. However, the complexi...
Sparse dynamical Boltzmann machine for reconstructing complex networks with binary dynamics
NASA Astrophysics Data System (ADS)
Chen, Yu-Zhong; Lai, Ying-Cheng
2018-03-01
Revealing the structure and dynamics of complex networked systems from observed data is a problem of current interest. Is it possible to develop a completely data-driven framework to decipher the network structure and different types of dynamical processes on complex networks? We develop a model named the sparse dynamical Boltzmann machine (SDBM) as a structural estimator for complex networks that host binary dynamical processes. The SDBM attains its topology according to that of the original system and is capable of simulating the original binary dynamical process. We develop a fully automated method based on compressive sensing and a clustering algorithm to construct the SDBM. We demonstrate, for a variety of representative dynamical processes on model and real-world complex networks, that the equivalent SDBM can recover the network structure of the original system and simulate its dynamical behavior with high precision.
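A toy analogue of the reconstruction problem, not the SDBM itself: here linear (rather than binary) dynamics and plain thresholded least squares stand in for the paper's compressive-sensing machinery, but the goal is the same, recovering a sparse coupling topology purely from observed time series:

```python
# Observe x_{t+1} = A x_t + noise on a small network and recover the sparse
# coupling matrix A by least squares, thresholding small entries. The true
# network and noise level are synthetic.
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[0.0, 0.5, 0.0],
              [0.0, 0.0, 0.5],
              [0.5, 0.0, 0.0]])          # true sparse coupling matrix

X = [rng.normal(size=3)]
for _ in range(2000):                    # simulate the observed time series
    X.append(A @ X[-1] + 0.01 * rng.normal(size=3))
X = np.array(X)

B, *_ = np.linalg.lstsq(X[:-1], X[1:], rcond=None)   # X[t] @ B ≈ X[t+1]
A_hat = B.T
A_hat[np.abs(A_hat) < 0.1] = 0.0         # threshold to expose the sparse links

print((A_hat != 0).astype(int))
```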
Hierarchical Modeling and Robust Synthesis for the Preliminary Design of Large Scale Complex Systems
NASA Technical Reports Server (NTRS)
Koch, Patrick N.
1997-01-01
Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation these issues are addressed through the development of a method for hierarchical robust preliminary design exploration, which facilitates concurrent system and subsystem design exploration and the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts and allowing integration of subproblems for system synthesis; statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration; and noise modeling techniques for implementing robust preliminary design when approximate models are employed. Hierarchical partitioning and modeling techniques including intermediate responses, linking variables, and compatibility constraints are incorporated within a hierarchical compromise decision support problem formulation for synthesizing subproblem solutions for a partitioned system. Experimentation and approximation techniques are employed for concurrent investigation and modeling of partitioned subproblems.
A modified composite experiment is introduced for fitting better predictive models across the ranges of the factors, and an approach for constructing partitioned response surfaces is developed to reduce the computational expense of experimentation for fitting models in a large number of factors. Noise modeling techniques are compared and recommendations are offered for the implementation of robust design when approximate models are sought. These techniques, approaches, and recommendations are incorporated within the method developed for hierarchical robust preliminary design exploration. This method, as well as the associated approaches, is illustrated through application to the preliminary design of a commercial turbofan propulsion system. The case study is developed in collaboration with Allison Engine Company, Rolls-Royce Aerospace, and is based on the existing Allison AE3007 engine designed for midsize commercial and regional business jets. For this case study, the turbofan system-level problem is partitioned into engine cycle design and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation. The fan and low-pressure turbine subsystems are also modeled, but in less detail. Given the defined partitioning, these subproblems are investigated independently and concurrently, and response surface models are constructed to approximate the responses of each. These response models are then incorporated within a commercial turbofan hierarchical compromise decision support problem formulation. Five design scenarios are investigated, and robust solutions are identified. The method and the solutions identified are verified by comparison with the AE3007 engine. The solutions obtained are similar to the AE3007 cycle and configuration, but are better with respect to many of the requirements.
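The response-surface step can be sketched with synthetic data; the subsystem response below is a hypothetical stand-in, not one of the turbofan models:

```python
# Fit a second-order polynomial response surface
#   y ≈ b0 + b1*x1 + b2*x2 + b3*x1*x2 + b4*x1^2 + b5*x2^2
# to sampled responses from a (cheap, synthetic) "subsystem analysis".
import numpy as np

def subsystem_response(x1, x2):
    """Stand-in for an expensive subsystem analysis (hypothetical)."""
    return 2.0 + 1.5 * x1 - 0.8 * x2 + 0.5 * x1 * x2

# A small full-factorial design over the two (coded) factors.
g1, g2 = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
x1, x2 = g1.ravel(), g2.ravel()
y = subsystem_response(x1, x2)

# Least-squares fit of the quadratic basis.
basis = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(basis, y, rcond=None)
print(np.round(coef, 3))
```

Once fitted, such a surrogate is evaluated thousands of times in design exploration at negligible cost, which is the motivation the abstract gives for approximation techniques.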
ERIC Educational Resources Information Center
Sung, Dia; You, Yeongmahn; Song, Ji Hoon
2008-01-01
The purpose of this research is to explore the possibility of viable learning organizations based on identifying viable organizational learning mechanisms. Two theoretical foundations, complex system theory and viable system theory, have been integrated to provide the rationale for building the sustainable organizational learning mechanism. The…
Cook, Daniel L; Farley, Joel F; Tapscott, Stephen J
2001-01-01
Background: We propose that a computerized, internet-based graphical description language for systems biology will be essential for describing, archiving and analyzing complex problems of biological function in health and disease. Results: We outline here a conceptual basis for designing such a language and describe BioD, a prototype language that we have used to explore the utility and feasibility of this approach to functional biology. Using example models, we demonstrate that a rather limited lexicon of icons and arrows suffices to describe complex cell-biological systems as discrete models that can be posted and linked on the internet. Conclusions: Given available computer and internet technology, BioD may be implemented as an extensible, multidisciplinary language that can be used to archive functional systems knowledge and be extended to support both qualitative and quantitative functional analysis. PMID:11305940
Dynamical systems in economics
NASA Astrophysics Data System (ADS)
Stanojević, Jelena; Kukić, Katarina
2018-01-01
In the last few decades, much attention has been given to explaining the complex behaviour of very large systems, such as weather, economic, biological, and demographic systems. In this paper we give a short overview of the basic notions in the field of dynamical systems that are relevant for understanding the complex nature of some economic models.
The complexity of air quality modeling systems, air quality monitoring data make ad-hoc systems for model evaluation important aids to the modeling community. Among those are the ENSEMBLE system developed by the EC-Joint Research Center, and the AMET software developed by the US-...
Model Checking for Verification of Interactive Health IT Systems
Butler, Keith A.; Mercer, Eric; Bahrami, Ali; Tao, Cui
2015-01-01
Rigorous methods for design and verification of health IT systems have lagged far behind their proliferation. The inherent technical complexity of healthcare, combined with the added complexity of health information technology, makes their resulting behavior unpredictable and introduces serious risk. We propose to mitigate this risk by formalizing the relationship between HIT and the conceptual work that increasingly typifies modern care. We introduce new techniques for modeling clinical workflows and the conceptual products within them that allow established, powerful model checking technology to be applied to interactive health IT systems. The new capability can evaluate the workflows of a new HIT system performed by clinicians and computers to improve safety and reliability. We demonstrate the method on a patient contact system, showing that model checking is effective for interactive systems and that much of it can be automated. PMID:26958166
Systems Modeling in Developmental Toxicity
An individual starts off as a single cell, the progeny of which form complex structures that are themselves integrated into progressively larger systems. Developmental biology is concerned with how this cellular complexity and patterning arises through orchestration of cell divi...
Applying Model Based Systems Engineering to NASA's Space Communications Networks
NASA Technical Reports Server (NTRS)
Bhasin, Kul; Barnes, Patrick; Reinert, Jessica; Golden, Bert
2013-01-01
System engineering practices for complex systems and networks now require that requirement, architecture, and concept-of-operations product development teams simultaneously harmonize their activities to provide timely, useful, and cost-effective products. When dealing with complex systems of systems, traditional systems engineering methodology quickly falls short of achieving project objectives. That approach is encumbered by the use of a number of disparate hardware and software tools, spreadsheets, and documents to grasp the concept of the network design and operation. In the case of NASA's space communication networks, both the networks and their subject matter experts are geographically distributed, so the team is challenged to create a common language and tools to produce its products. Using Model Based Systems Engineering methods and tools allows for a unified representation of the system in a model that enables a highly interrelated level of detail. To date, the Program System Engineering (PSE) team has been able to model each network from its top-level operational activities and system functions down to the atomic level through relational modeling decomposition. These models allow for a better understanding of the relationships between NASA's stakeholders, internal organizations, and the impacts to all related entities due to integration and sustainment of existing systems. Understanding the existing systems is essential to an accurate and detailed study of the integration options being considered. In this paper, we identify the challenges the PSE team faced in its quest to unify complex legacy space communications networks and their operational processes. We describe the initial approaches undertaken and the evolution toward model based system engineering applied to produce Space Communication and Navigation (SCaN) PSE products.
We will demonstrate the practice of Model Based System Engineering applied to integrating space communication networks and the summary of its results and impact. We will highlight the insights gained by applying the Model Based System Engineering and provide recommendations for its applications and improvements.
Unsilencing Critical Conversations in Social-Studies Teacher Education Using Agent-Based Modeling
ERIC Educational Resources Information Center
Hostetler, Andrew; Sengupta, Pratim; Hollett, Ty
2018-01-01
In this article, we argue that when complex sociopolitical issues such as ethnocentrism and racial segregation are represented as complex, emergent systems using agent-based computational models (in short agent-based models or ABMs), discourse about these representations can disrupt social studies teacher candidates' dispositions of teaching…
From Complex to Simple: Interdisciplinary Stochastic Models
ERIC Educational Resources Information Center
Mazilu, D. A.; Zamora, G.; Mazilu, I.
2012-01-01
We present two simple, one-dimensional, stochastic models that lead to a qualitative understanding of very complex systems from biology, nanoscience and social sciences. The first model explains the complicated dynamics of microtubules, stochastic cellular highways. Using the theory of random walks in one dimension, we find analytical expressions…
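The one-dimensional random-walk machinery the abstract invokes can be sketched numerically; this is only an illustration of the standard result that the mean squared displacement of an unbiased walk grows linearly with the number of steps, not the microtubule model itself:

```python
# Unbiased 1-D random walk: after n ±1 steps, the mean displacement is 0
# and the mean squared displacement is n. Verified here by simulation.
import random

random.seed(3)

def walk(n_steps):
    return sum(random.choice((-1, 1)) for _ in range(n_steps))

n, trials = 100, 20_000
msd = sum(walk(n) ** 2 for _ in range(trials)) / trials
print(round(msd / n, 2))   # should be close to 1
```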
McNamara, C; Naddy, B; Rohan, D; Sexton, J
2003-10-01
The Monte Carlo computational system for stochastic modelling of dietary exposure to food chemicals and nutrients is presented. This system was developed through a European Commission-funded research project. It is accessible as a Web-based application service. The system allows and supports very significant complexity in the data sets used as the model input, but provides a simple, general purpose, linear kernel for model evaluation. Specific features of the system include the ability to enter (arbitrarily) complex mathematical or probabilistic expressions at each and every input data field, automatic bootstrapping on subjects and on subject food intake diaries, and custom kernels to apply brand information such as market share and loyalty to the calculation of food and chemical intake.
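The core intake calculation can be sketched as follows; the distributions, brands, market shares, and concentrations are all hypothetical placeholders, not the system's actual data model or kernel:

```python
# Hedged Monte Carlo sketch: sample daily food intake from a distribution,
# pick a brand by market share, multiply by that brand's chemical
# concentration, and accumulate an exposure distribution.
import random

random.seed(7)

MARKET_SHARE = {"brand_a": 0.6, "brand_b": 0.4}    # assumed market shares
CONCENTRATION = {"brand_a": 2.0, "brand_b": 5.0}   # mg chemical / kg food

def simulated_daily_exposure():
    intake_kg = random.lognormvariate(-1.5, 0.5)   # assumed intake model, kg
    brand = random.choices(list(MARKET_SHARE), weights=MARKET_SHARE.values())[0]
    return intake_kg * CONCENTRATION[brand]        # mg of chemical per day

exposures = sorted(simulated_daily_exposure() for _ in range(10_000))
print(round(exposures[len(exposures) // 2], 3))    # median daily exposure, mg
```

The brand-weighting step is where market share and loyalty information enters the calculation, as the abstract describes for the system's custom kernels.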
Multiscale Mathematics for Biomass Conversion to Renewable Hydrogen
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plechac, Petr
2016-03-01
The overall objective of this project was to develop multiscale models for understanding and eventually designing complex processes for renewables. To the best of our knowledge, our work is the first attempt at modeling complex reacting systems whose performance relies on underlying multiscale mathematics, and at developing rigorous mathematical techniques and computational algorithms to study such models. Our specific application lies at the heart of the biofuels initiatives of DOE and entails modeling of catalytic systems, to enable economic, environmentally benign, and efficient conversion of biomass into either hydrogen or valuable chemicals.
U.S. Geological Survey Groundwater Modeling Software: Making Sense of a Complex Natural Resource
Provost, Alden M.; Reilly, Thomas E.; Harbaugh, Arlen W.; Pollock, David W.
2009-01-01
Computer models of groundwater systems simulate the flow of groundwater, including water levels, and the transport of chemical constituents and thermal energy. Groundwater models afford hydrologists a framework on which to organize their knowledge and understanding of groundwater systems, and they provide insights water-resources managers need to plan effectively for future water demands. Building on decades of experience, the U.S. Geological Survey (USGS) continues to lead in the development and application of computer software that allows groundwater models to address scientific and management questions of increasing complexity.
Use of system identification techniques for improving airframe finite element models using test data
NASA Technical Reports Server (NTRS)
Hanagud, Sathya V.; Zhou, Weiyu; Craig, James I.; Weston, Neil J.
1991-01-01
A method for using system identification techniques to improve airframe finite element models was developed and demonstrated. The method uses linear sensitivity matrices to relate changes in selected physical parameters to changes in total system matrices. The values for these physical parameters were determined using constrained optimization with singular value decomposition. The method was confirmed using both simple and complex finite element models for which pseudo-experimental data was synthesized directly from the finite element model. The method was then applied to a real airframe model which incorporated all the complexities and details of a large finite element model and for which extensive test data was available. The method was shown to work, and the differences between the identified model and the measured results were considered satisfactory.
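The sensitivity-matrix update at the heart of the method can be sketched with synthetic numbers; the matrix and discrepancy vector below are hypothetical, not taken from the airframe model:

```python
# Relate changes in physical parameters dp to changes in measured modal
# quantities dy via a linear sensitivity matrix S, then solve for the
# minimum-norm least-squares update dp using the SVD-based pseudoinverse.
import numpy as np

# Assumed sensitivity of 3 measured quantities to 2 stiffness parameters.
S = np.array([[0.8, 0.1],
              [0.2, 0.9],
              [0.5, 0.4]])

dy = np.array([0.145, 0.255, 0.175])   # measured-minus-model discrepancies

dp = np.linalg.pinv(S) @ dy            # pinv is computed via the SVD
print(np.round(dp, 3))
```

In the actual method this solve is embedded in a constrained optimization loop, with the sensitivity matrix relating parameter changes to changes in the total system matrices.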
Activity Diagrams for DEVS Models: A Case Study Modeling Health Care Behavior
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ozmen, Ozgur; Nutaro, James J
Discrete Event Systems Specification (DEVS) is a widely used formalism for modeling and simulation of discrete and continuous systems. While DEVS provides a sound mathematical representation of discrete systems, its practical use can suffer when models become complex. Five main functions, which constitute the core of atomic models in DEVS, realize the behaviors that modelers want to represent. The integration of these functions is handled by the simulation routine; however, modelers can implement each function in various ways. Therefore, there is a need for graphical representations of complex models to simplify their implementation and facilitate their reproduction. In this work, we illustrate the use of activity diagrams for this purpose in the context of a health care behavior model, which is developed with an agent-based modeling paradigm.
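The five core functions of a DEVS atomic model can be sketched as a plain class; this is a generic pedagogical skeleton (a trivial periodic emitter), not the health care model from the paper:

```python
# Minimal DEVS-style atomic model showing the five core functions:
# delta_int, delta_ext, delta_con, the output function (lambda), and the
# time advance ta.
class Ping:
    """Atomic model that emits 'ping' every `period` time units."""

    def __init__(self, period=5.0):
        self.period = period
        self.sigma = period              # time until next internal event

    def ta(self):                        # time-advance function
        return self.sigma

    def output(self):                    # output function (lambda), fired before delta_int
        return "ping"

    def delta_int(self):                 # internal transition: reschedule
        self.sigma = self.period

    def delta_ext(self, elapsed, msg):   # external transition: reset the clock
        self.sigma = self.period

    def delta_con(self, msg):            # confluent: internal then external
        self.delta_int()
        self.delta_ext(0.0, msg)

# A hand-rolled simulation loop driving three internal events.
m = Ping(period=5.0)
t, outputs = 0.0, []
for _ in range(3):
    t += m.ta()
    outputs.append((t, m.output()))
    m.delta_int()
print(outputs)
```

It is precisely the freedom in how each of these functions is implemented that motivates the paper's use of activity diagrams to document them.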
Illustrations of mathematical modeling in biology: epigenetics, meiosis, and an outlook.
Richards, D; Berry, S; Howard, M
2012-01-01
In the past few years, mathematical modeling approaches in biology have begun to fulfill their promise by assisting in the dissection of complex biological systems. Here, we review two recent examples of predictive mathematical modeling in plant biology. The first involves the quantitative epigenetic silencing of the floral repressor gene FLC in Arabidopsis, mediated by a Polycomb-based system. The second involves the spatiotemporal dynamics of telomere bouquet formation in wheat-rye meiosis. Although both the biology and the modeling framework of the two systems are different, both exemplify how mathematical modeling can help to accelerate discovery of the underlying mechanisms in complex biological systems. In both cases, the models developed were relatively minimal, including only essential features, but both nevertheless yielded fundamental insights. We also briefly review the current state of mathematical modeling in biology, the difficulties inherent in its application, and its potential future development.
Structured analysis and modeling of complex systems
NASA Technical Reports Server (NTRS)
Strome, David R.; Dalrymple, Mathieu A.
1992-01-01
The Aircrew Evaluation Sustained Operations Performance (AESOP) facility at Brooks AFB, Texas, combines the realism of an operational environment with the control of a research laboratory. In recent studies we collected extensive data from the Airborne Warning and Control Systems (AWACS) Weapons Directors subjected to high and low workload Defensive Counter Air Scenarios. A critical and complex task in this environment involves committing a friendly fighter against a hostile fighter. Structured Analysis and Design techniques and computer modeling systems were applied to this task as tools for analyzing subject performance and workload. This technology is being transferred to the Man-Systems Division of NASA Johnson Space Center for application to complex mission related tasks, such as manipulating the Shuttle grappler arm.
Qualitative Fault Isolation of Hybrid Systems: A Structural Model Decomposition-Based Approach
NASA Technical Reports Server (NTRS)
Bregon, Anibal; Daigle, Matthew; Roychoudhury, Indranil
2016-01-01
Quick and robust fault diagnosis is critical to ensuring safe operation of complex engineering systems. A large number of techniques are available to provide fault diagnosis in systems with continuous dynamics. However, many systems in aerospace and industrial environments are best represented as hybrid systems that consist of discrete behavioral modes, each with its own continuous dynamics. These hybrid dynamics make the on-line fault diagnosis task computationally more complex due to the large number of possible system modes and the existence of autonomous mode transitions. This paper presents a qualitative fault isolation framework for hybrid systems based on structural model decomposition. The fault isolation is performed by analyzing the qualitative information of the residual deviations. However, in hybrid systems this process becomes complex due to the possible existence of observation delays, which can cause observed deviations to be inconsistent with the expected deviations for the current mode of the system. The great advantage of structural model decomposition is that (i) it allows the design of residuals that respond to only a subset of the faults, and (ii) every time a mode change occurs, only a subset of the residuals needs to be reconfigured, thus reducing the complexity of the reasoning process for isolation purposes. To demonstrate and test the validity of our approach, we use an electric circuit simulation as the case study.
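Qualitative analysis of residual deviations can be sketched generically; the residuals, threshold, and fault-signature table below are hypothetical illustrations, not the paper's structural-decomposition framework:

```python
# Reduce each residual (observed minus predicted) to a qualitative symbol
# (+, -, 0) and match the symbol tuple against a fault-signature table.
def qualitative_residual(observed, predicted, threshold=0.05):
    r = observed - predicted
    if r > threshold:
        return "+"
    if r < -threshold:
        return "-"
    return "0"

# Hypothetical fault-signature table over two residuals.
SIGNATURES = {("+", "0"): "sensor bias", ("-", "-"): "leak", ("0", "0"): "nominal"}

symbols = (qualitative_residual(1.30, 1.20),   # residual 1 deviates high
           qualitative_residual(0.48, 0.50))   # residual 2 within threshold
print(SIGNATURES.get(symbols, "unknown fault"))
```

In the paper's setting each residual responds to only a subset of faults by construction, which keeps such signature tables small even as the system grows.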
NASA Astrophysics Data System (ADS)
Amancio, Diego Raphael
2014-12-01
Concepts and methods of complex networks have been applied to probe the properties of a myriad of real systems [1]. The finding that written texts modeled as graphs share several properties of other completely different real systems has inspired the study of language as a complex system [2]. Indeed, language can be represented as a complex network at its several levels of complexity. As a consequence, morphological, syntactic and semantic properties have been employed in the construction of linguistic networks [3]. Even the character level has been useful to unfold particular patterns [4,5]. In the review by Cong and Liu [6], the authors emphasize the need to use the topological information of complex networks modeling the various spheres of language to better understand its origins, evolution and organization. In addition, the authors cite the use of networks in applications aiming at holistic typology and stylistic variations. In this context, I will discuss some possible directions that could be followed in future research directed towards the understanding of language via topological characterization of complex linguistic networks. In addition, I will comment on the use of network models for language processing applications. Additional prospects for future practical research lines will also be discussed in this comment.
FLAME: A platform for high performance computing of complex systems, applied for three case studies
Kiran, Mariam; Bicak, Mesude; Maleki-Dizaji, Saeedeh; ...
2011-01-01
FLAME allows complex models to be automatically parallelised on High Performance Computing (HPC) grids, enabling large numbers of agents to be simulated over short periods of time. Modellers are hindered by the complexity of porting models to parallel platforms and by the time taken to run large simulations on a single machine, both of which FLAME overcomes. Three case studies from different disciplines were modelled using FLAME, and are presented along with their performance results on a grid.
Methodology and Results of Mathematical Modelling of Complex Technological Processes
NASA Astrophysics Data System (ADS)
Mokrova, Nataliya V.
2018-03-01
The methodology of system analysis allows us to derive a mathematical model of a complex technological process. A mathematical description of the plasma-chemical process is proposed. The importance of the quenching rate and of the initial temperature decrease time was confirmed for producing the maximum amount of the target product. The results of numerical integration of the system of differential equations can be used to describe reagent concentrations, plasma jet rate and temperature in order to achieve the optimal mode of hardening. Such models are applicable both for solving control problems and for predicting future states of sophisticated technological systems.
An evaluative model of system performance in manned teleoperational systems
NASA Technical Reports Server (NTRS)
Haines, Richard F.
1989-01-01
Manned teleoperational systems are used in aerospace operations in which humans must interact with machines remotely. Manual guidance of remotely piloted vehicles, controlling a wind tunnel, and carrying out a scientific procedure remotely are examples of teleoperations. A four-input-parameter throughput (Tp) model is presented which can be used to evaluate complex, manned, teleoperations-based systems and make critical comparisons among candidate control systems. The first two parameters of this model deal with nominal (A) and off-nominal (B) predicted events, while the last two focus on measured events of two types, human performance (C) and system performance (D). Digital simulations showed that the expression A(1-B)/(C+D) produced the greatest homogeneity of variance and distribution symmetry. Results from a recently completed manned life science telescience experiment will be used to further validate the model. Complex, interacting teleoperational systems may be systematically evaluated using this expression, much like a computer benchmark is used.
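The throughput expression above is simple enough to sketch directly. The numeric values are illustrative only, not data from the AESOP or telescience studies:

```python
def throughput(a, b, c, d):
    """Tp = A(1 - B) / (C + D): nominal (A) and off-nominal (B) predicted
    events combined with measured human (C) and system (D) performance."""
    return a * (1 - b) / (c + d)

# Invented values: 90% nominal coverage, 10% off-nominal rate,
# human and system performance scores of 0.5 and 0.3.
print(throughput(0.9, 0.1, 0.5, 0.3))  # ≈ 1.0125
```

Used like a benchmark score, higher Tp favors candidate control systems with good predicted-event coverage relative to the measured performance cost.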
Border Security: A Conceptual Model of Complexity
2013-12-01
This research applies complexity and system dynamics theory to the idea of border security, culminating in the development of... alternative policy options. E. LIMITATIONS OF RESEARCH AND MODEL This research explores whether border security is a living system. In other words, whether... border inspections. Washington State, for example, experienced a 50% drop in tourism and lost over $100 million in local revenue because of the
ERIC Educational Resources Information Center
Greene, Jeffrey Alan; Azevedo, Roger
2009-01-01
In this study, we used think-aloud verbal protocols to examine how various macro-level processes of self-regulated learning (SRL; e.g., planning, monitoring, strategy use, handling of task difficulty and demands) were associated with the acquisition of a sophisticated mental model of a complex biological system. Numerous studies examine how…
NASA Astrophysics Data System (ADS)
Yoon, J.; Klassert, C. J. A.; Lachaut, T.; Selby, P. D.; Knox, S.; Gorelick, S.; Rajsekhar, D.; Tilmant, A.; Avisse, N.; Harou, J. J.; Gawel, E.; Klauer, B.; Mustafa, D.; Talozi, S.; Sigel, K.
2015-12-01
Our work focuses on the development of a multi-agent, hydroeconomic model for water policy evaluation in Jordan. The model adopts a modular approach, integrating biophysical modules that simulate natural and engineered phenomena with human modules that represent behavior at multiple levels of decision making. The hydrologic modules are developed using spatially-distributed groundwater and surface water models, which are translated into compact simulators for efficient integration into the multi-agent model. For the groundwater model, we adopt a response matrix method approach in which a 3-dimensional MODFLOW model of a complex regional groundwater system is converted into a linear simulator of groundwater response by pre-processing drawdown results from several hundred numerical simulation runs. Surface water models for each major surface water basin in the country are developed in SWAT and similarly translated into simple rainfall-runoff functions for integration with the multi-agent model. The approach balances physically-based, spatially-explicit representation of hydrologic systems with the efficiency required for integration into a complex multi-agent model that is computationally amenable to robust scenario analysis. For the multi-agent model, we explicitly represent human agency at multiple levels of decision making, with agents representing riparian, management, supplier, and water user groups. The agents' decision-making models incorporate both rule-based heuristics and economic optimization. The model is programmed in Python using Pynsim, a generalizable, open-source object-oriented code framework for modeling network-based water resource systems. The Jordan model is one of the first applications of Pynsim to a real-world water management case study.
Preliminary results from a tanker market scenario run through year 2050 are presented in which several salient features of the water system are investigated: competition between urban and private farmer agents, the emergence of a private tanker market, disparities in economic wellbeing to different user groups caused by unique supply conditions, and response of the complex system to various policy interventions.
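The response matrix idea, expressing drawdown at an observation point as a linear superposition of pre-computed unit responses, can be sketched as follows. All well names and coefficients are invented; in practice the coefficients would be harvested from the pre-processed MODFLOW runs described above:

```python
# Hypothetical unit responses: drawdown (m) per unit pumping rate,
# pre-computed once from many numerical simulation runs.
UNIT_RESPONSE = {
    "obs_A": {"well_1": 0.02, "well_2": 0.005},
    "obs_B": {"well_1": 0.004, "well_2": 0.03},
}

def drawdown(obs, pumping):
    """Linear drawdown estimate at `obs` for a dict of pumping rates."""
    row = UNIT_RESPONSE[obs]
    return sum(row[well] * rate for well, rate in pumping.items())

# Superposition makes each scenario evaluation a cheap dot product.
print(drawdown("obs_A", {"well_1": 100.0, "well_2": 200.0}))  # 3.0
```

Replacing each MODFLOW call with a dot product is what makes the groundwater module cheap enough to embed inside a multi-agent scenario loop.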
Alam, Maksudul; Deng, Xinwei; Philipson, Casandra; Bassaganya-Riera, Josep; Bisset, Keith; Carbo, Adria; Eubank, Stephen; Hontecillas, Raquel; Hoops, Stefan; Mei, Yongguo; Abedi, Vida; Marathe, Madhav
2015-01-01
Agent-based models (ABM) are widely used to study immune systems, providing a procedural and interactive view of the underlying system. The interaction of components and the behavior of individual objects is described procedurally as a function of the internal states and the local interactions, which are often stochastic in nature. Such models typically have complex structures and consist of a large number of modeling parameters. Determining the key modeling parameters which govern the outcomes of the system is very challenging. Sensitivity analysis plays a vital role in quantifying the impact of modeling parameters in massively interacting systems, including large complex ABM. The high computational cost of executing simulations impedes running experiments with exhaustive parameter settings. Existing techniques of analyzing such a complex system typically focus on local sensitivity analysis, i.e. one parameter at a time, or a close “neighborhood” of particular parameter settings. However, such methods are not adequate to measure the uncertainty and sensitivity of parameters accurately because they overlook the global impacts of parameters on the system. In this article, we develop novel experimental design and analysis techniques to perform both global and local sensitivity analysis of large-scale ABMs. The proposed method can efficiently identify the most significant parameters and quantify their contributions to outcomes of the system. We demonstrate the proposed methodology for ENteric Immune SImulator (ENISI), a large-scale ABM environment, using a computational model of immune responses to Helicobacter pylori colonization of the gastric mucosa. PMID:26327290
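The contrast between one-at-a-time and global analysis can be illustrated with a toy model: sample the whole parameter space at random and rank parameters by how strongly they drive the output. The model, parameter bounds, and the crude covariance-based index below are assumptions for illustration, not the experimental design method the article proposes:

```python
import random

def model(p):
    # Toy ABM surrogate: x1 dominates the outcome, x2 barely matters.
    return 3.0 * p["x1"] + 0.1 * p["x2"]

def global_sensitivity(model, bounds, n=2000, seed=1):
    """Rank parameters by scaled covariance with the output over
    random samples of the full parameter space."""
    rng = random.Random(seed)
    samples = [{k: rng.uniform(*b) for k, b in bounds.items()} for _ in range(n)]
    ys = [model(s) for s in samples]
    ybar = sum(ys) / n
    sens = {}
    for k in bounds:
        xs = [s[k] for s in samples]
        xbar = sum(xs) / n
        cov = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / n
        var = sum((x - xbar) ** 2 for x in xs) / n
        sens[k] = abs(cov) / var ** 0.5  # crude global index
    return sens

s = global_sensitivity(model, {"x1": (0, 1), "x2": (0, 1)})
print(s["x1"] > s["x2"])  # True: the global view correctly ranks x1 first
```

A local, one-at-a-time scan around a single operating point could miss such rankings when parameters interact; sampling the whole space is the point of the global approach.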
Vižintin, Goran; Ravbar, Nataša; Janež, Jože; Koren, Eva; Janež, Naško; Zini, Luca; Treu, Francesco; Petrič, Metka
2018-04-01
Due to the intrinsic characteristics of aquifers, groundwater frequently passes between various types of aquifers without hindrance. The complex connection of underground water paths enables flow regardless of administrative boundaries. This can cause problems in water resources management. Numerical modelling is an important tool for the understanding, interpretation and management of aquifers. Useful and reliable methods of numerical modelling differ with regard to the type of aquifer, but their connection in a single hydrodynamic model is rare. The purpose of this study was to connect different models into an integrated system that enables determination of water travel time from the point of contamination to water sources. The worst-case scenario is considered. The system was applied in the basin of the Soča/Isonzo, a transboundary river in Slovenia and Italy, where there is a complex contact of karst and intergranular aquifers and surface flows over bedrock with low permeability. Time cell models were first elaborated separately for individual hydrogeological units. These were the result of numerical hydrological modelling (intergranular aquifer and surface flow) or complex GIS analysis taking into account the vulnerability map and tracer test results (karst aquifer). The obtained cellular models provide the basis of a contamination early-warning system, since they allow an estimation of when, and in which water sources, contaminants can be expected to appear. The system shows that contaminants spread rapidly through karst aquifers and via surface flows, and more slowly through intergranular aquifers. For this reason, karst water sources are more at risk from one-off contamination incidents, while water sources in intergranular aquifers are more at risk in cases of long-term contamination.
The system that has been developed is the basis for a single system of protection, action and quality monitoring in the areas of complex aquifer systems within or on the borders of administrative units.
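A minimal sketch of the cell-based travel-time lookup such a warning system rests on follows. Path names and cell travel times are invented; the real system derives them from the numerical and GIS models described above:

```python
# Hypothetical worst-case travel times (hours) through the chain of
# cells from a contamination point to each water source.
PATHS = {
    ("spill_site", "karst_spring"): [2.0, 1.5, 0.5],      # karst: fast
    ("spill_site", "alluvial_well"): [40.0, 60.0, 24.0],  # intergranular: slow
}

def arrival_time(source, target):
    """Worst-case arrival time: sum of cell travel times along the path."""
    return sum(PATHS[(source, target)])

print(arrival_time("spill_site", "karst_spring"))   # 4.0
print(arrival_time("spill_site", "alluvial_well"))  # 124.0
```

The same lookup structure supports the two risk profiles in the abstract: short karst arrival times matter for one-off spills, long intergranular ones for chronic contamination.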
Sundaram, Meera V.; Buechner, Matthew
2016-01-01
The excretory system of the nematode Caenorhabditis elegans is a superb model of tubular organogenesis involving a minimum of cells. The system consists of just three unicellular tubes (canal, duct, and pore), a secretory gland, and two associated neurons. Just as in more complex organs, cells of the excretory system must first adopt specific identities and then coordinate diverse processes to form tubes of appropriate topology, shape, connectivity, and physiological function. The unicellular topology of excretory tubes, their varied and sometimes complex shapes, and the dynamic reprogramming of cell identity and remodeling of tube connectivity that occur during larval development are particularly fascinating features of this organ. The physiological roles of the excretory system in osmoregulation and other aspects of the animal’s life cycle are only beginning to be explored. The cellular mechanisms and molecular pathways used to build and shape excretory tubes appear similar to those used in both unicellular and multicellular tubes in more complex organs, such as the vertebrate vascular system and kidney, making this simple organ system a useful model for understanding disease processes. PMID:27183565
Robust Fixed-Structure Controller Synthesis
NASA Technical Reports Server (NTRS)
Corrado, Joseph R.; Haddad, Wassim M.; Gupta, Kajal (Technical Monitor)
2000-01-01
The ability to develop an integrated control system design methodology for robust high performance controllers satisfying multiple design criteria and real-world hardware constraints constitutes a challenging task. The increasingly stringent performance specifications required for controlling such systems necessitate a trade-off between controller complexity and robustness. The principal challenge of minimal-complexity robust control design is to arrive at a tractable control design formulation in spite of the extreme complexity of such systems. Hence, the design of minimal-complexity robust controllers for systems in the face of modeling errors has been a major preoccupation of system and control theorists and practitioners for the past several decades.
Complex Moving Parts: Assessment Systems and Electronic Portfolios
ERIC Educational Resources Information Center
Larkin, Martha J.; Robertson, Royce L.
2013-01-01
The largest college within an online university of over 50,000 students invested significant resources in translating a complex assessment system focused on continuous improvement and national accreditation into an effective and efficient electronic portfolio (ePortfolio). The team building the system needed a model to address problems met…
An R Package for Open, Reproducible Analysis of Urban Water Systems, With Application to Chicago
Urban water systems consist of natural and engineered flows of water interacting in complex ways. System complexity can be understood via mass conservative models that account for the interrelationships among all major flows and storages. We have developed a generic urban water s...
Governing Education in a Complex World. Educational Research and Innovation
ERIC Educational Resources Information Center
Burns, Tracey, Ed.; Köster, Florian, Ed.
2016-01-01
What models of governance are effective in complex education systems? In all systems an increasing number of stakeholders are involved in designing, delivering, and monitoring education. Like our societies, education systems are increasingly diverse regarding students, teachers, and communities, as well as the values and identities we expect…
Environmental Uncertainty and Communication Network Complexity: A Cross-System, Cross-Cultural Test.
ERIC Educational Resources Information Center
Danowski, James
An infographic model is proposed to account for the operation of systems within their information environments. Infographics is a communication paradigm used to indicate the clustering of information processing variables in communication systems. Four propositions concerning environmental uncertainty and internal communication network complexity,…
Elementary Teachers' Selection and Use of Visual Models
NASA Astrophysics Data System (ADS)
Lee, Tammy D.; Gail Jones, M.
2018-02-01
As science grows in complexity, science teachers face an increasing challenge of helping students interpret models that represent complex science systems. Little is known about how teachers select and use models when planning lessons. This mixed methods study investigated the pedagogical approaches and visual models used by elementary in-service and preservice teachers in the development of a science lesson about a complex system (e.g., water cycle). Sixty-seven elementary in-service and 69 elementary preservice teachers completed a card sort task designed to document the types of visual models (e.g., images) that teachers choose when planning science instruction. Quantitative and qualitative analyses were conducted to analyze the card sort task. Semistructured interviews were conducted with a subsample of teachers to elicit the rationale for image selection. Results from this study showed that both experienced in-service teachers and novice preservice teachers tended to select similar models and use similar rationales for images to be used in lessons. Teachers tended to select models that were aesthetically pleasing and simple in design and illustrated specific elements of the water cycle. The results also showed that teachers were not likely to select images that represented the less obvious dimensions of the water cycle. Furthermore, teachers selected visual models more as a pedagogical tool to illustrate specific elements of the water cycle and less often as a tool to promote student learning related to complex systems.
An Event-driven, Value-based, Pull Systems Engineering Scheduling Approach
2012-03-01
engineering in rapid response environments has been difficult, particularly where large, complex brownfield systems or systems of systems exist and are constantly being updated with both short and long term software enhancements...
Predictive Surface Complexation Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sverjensky, Dimitri A.
Surface complexation plays an important role in the equilibria and kinetics of processes controlling the compositions of soilwaters and groundwaters, the fate of contaminants in groundwaters, and the subsurface storage of CO2 and nuclear waste. Over the last several decades, many dozens of individual experimental studies have addressed aspects of surface complexation that have contributed to an increased understanding of its role in natural systems. However, there has been no previous attempt to develop a model of surface complexation that can be used to link all the experimental studies in order to place them on a predictive basis. Overall, my research has successfully integrated the results of the work of many experimentalists published over several decades. For the first time in studies of the geochemistry of the mineral-water interface, a practical predictive capability for modeling has become available. The predictive correlations developed in my research now enable extrapolations of experimental studies to provide estimates of surface chemistry for systems not yet studied experimentally and for natural and anthropogenically perturbed systems.
Bai, Shuming; Song, Kai; Shi, Qiang
2015-05-21
Observations of oscillatory features in the 2D spectra of several photosynthetic complexes have led to divergent opinions on their origins, including electronic coherence, vibrational coherence, and vibronic coherence. In this work, the effects of these different types of quantum coherence on ultrafast pump-probe polarization anisotropy are investigated and distinguished. We first simulate the isotropic pump-probe signal and anisotropy decay of the Fenna-Matthews-Olson (FMO) complex using a model with only electronic coherence at low temperature and obtain the same coherence time as in the previous experiment. Then, three model dimer systems with different prespecified quantum coherence are simulated, and the results show that their different spectral characteristics can be used to determine the type of coherence during the spectral process. Finally, we simulate model systems with different electronic-vibrational couplings and reveal the condition in which long-time vibronic coherence can be observed in systems like the FMO complex.
Carney, Timothy Jay; Shea, Christopher Michael
2017-01-01
Public health informatics is an evolving domain in which practices constantly change to meet the demands of a highly complex public health and healthcare delivery system. Given the emergence of various concepts, such as learning health systems, smart health systems, and adaptive complex health systems, health informatics professionals would benefit from a common set of measures and capabilities to inform our modeling, measuring, and managing of health system “smartness.” Here, we introduce the concepts of organizational complexity, problem/issue complexity, and situational awareness as three codependent drivers of smart public health systems characteristics. We also propose seven smart public health systems measures and capabilities that are important in a public health informatics professional's toolkit. PMID:28167999
Systems Genetics as a Tool to Identify Master Genetic Regulators in Complex Disease.
Moreno-Moral, Aida; Pesce, Francesco; Behmoaras, Jacques; Petretto, Enrico
2017-01-01
Systems genetics stems from systems biology and similarly employs integrative modeling approaches to describe the perturbations and phenotypic effects observed in a complex system. However, in the case of systems genetics the main source of perturbation is naturally occurring genetic variation, which can be analyzed at the systems-level to explain the observed variation in phenotypic traits. In contrast with conventional single-variant association approaches, the success of systems genetics has been in the identification of gene networks and molecular pathways that underlie complex disease. In addition, systems genetics has proven useful in the discovery of master trans-acting genetic regulators of functional networks and pathways, which in many cases revealed unexpected gene targets for disease. Here we detail the central components of a fully integrated systems genetics approach to complex disease, starting from assessment of genetic and gene expression variation, linking DNA sequence variation to mRNA (expression QTL mapping), gene regulatory network analysis and mapping the genetic control of regulatory networks. By summarizing a few illustrative (and successful) examples, we highlight how different data-modeling strategies can be effectively integrated in a systems genetics study.
A Hardware Model Validation Tool for Use in Complex Space Systems
NASA Technical Reports Server (NTRS)
Davies, Misty Dawn; Gundy-Burlet, Karen L.; Limes, Gregory L.
2010-01-01
One of the many technological hurdles that must be overcome in future missions is the challenge of validating as-built systems against the models used for design. We propose a technique composed of intelligent parameter exploration in concert with automated failure analysis as a scalable method for the validation of complex space systems. The technique is impervious to discontinuities and linear dependencies in the data, and can handle dimensionalities consisting of hundreds of variables over tens of thousands of experiments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dress, W.B.
Rosen's modeling relation is embedded in Popper's three worlds to provide a heuristic tool for model building and a guide for thinking about complex systems. The utility of this construct is demonstrated by suggesting a solution to the problem of pseudo-science and a resolution of the famous Bohr-Einstein debates. A theory of bizarre systems is presented by analogy with the entangled particles of quantum mechanics. This theory underscores the poverty of present-day computational systems (e.g., computers) for creating complex and bizarre entities by distinguishing between mechanism and organism.
Model verification of mixed dynamic systems. [POGO problem in liquid propellant rockets
NASA Technical Reports Server (NTRS)
Chrostowski, J. D.; Evensen, D. A.; Hasselman, T. K.
1978-01-01
A parameter-estimation method is described for verifying the mathematical model of mixed (combined interactive components from various engineering fields) dynamic systems against pertinent experimental data. The model verification problem is divided into two separate parts: defining a proper model and evaluating the parameters of that model. The main idea is to use differences between measured and predicted behavior (response) to adjust automatically the key parameters of a model so as to minimize response differences. To achieve the goal of modeling flexibility, the method combines the convenience of automated matrix generation with the generality of direct matrix input. The equations of motion are treated in first-order form, allowing for nonsymmetric matrices, modeling of general networks, and complex-mode analysis. The effectiveness of the method is demonstrated for an example problem involving a complex hydraulic-mechanical system.
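The core idea, adjusting model parameters to minimize the difference between measured and predicted response, can be sketched with a one-parameter least-squares fit. The data and the plain gradient-descent loop are assumptions for illustration; the paper's method adjusts the matrices of a first-order model of a mixed hydraulic-mechanical system:

```python
# Invented (input, measured response) pairs generated by a gain near 2.
measured = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]

def fit_gain(data, iters=200, lr=0.05):
    """Adjust gain k of the model y = k*x to minimize the squared
    difference between measured and predicted response."""
    k = 0.0
    for _ in range(iters):
        grad = sum(2 * (k * x - y) * x for x, y in data)
        k -= lr * grad / len(data)
    return k

k = fit_gain(measured)
print(round(k, 2))  # ≈ 2.04, the least-squares gain for this data
```

The verification loop in the paper works the same way in spirit: response differences drive automatic adjustment of the key parameters until predicted behavior matches the measurements.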
Models for the modern power grid
NASA Astrophysics Data System (ADS)
Nardelli, Pedro H. J.; Rubido, Nicolas; Wang, Chengwei; Baptista, Murilo S.; Pomalaza-Raez, Carlos; Cardieri, Paulo; Latva-aho, Matti
2014-10-01
This article reviews different kinds of models for the electric power grid that can be used to understand the modern power system, the smart grid. From the physical network to abstract energy markets, we identify in the literature different aspects that co-determine the spatio-temporal multilayer dynamics of the power system. We start our review by showing how the generation, transmission and distribution characteristics of traditional power grids are already subject to complex behaviour appearing as a result of the interplay between the dynamics of the nodes and the topology, namely synchronisation and cascade effects. When dealing with smart grids, the system complexity increases even more: on top of the physical network of power lines and controllable sources of electricity, modernisation brings information networks, renewable intermittent generation, market liberalisation, prosumers, among other aspects. In this case, we forecast a dynamical co-evolution of the smart grid and other kinds of networked systems that cannot be understood in isolation. This review compiles recent results that model electric power grids as complex systems, going beyond purely technological aspects. From this perspective, we then indicate possible ways to incorporate the diverse co-evolving systems into the smart grid model using, for example, network theory and multi-agent simulation.
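The synchronisation aspect mentioned above is often introduced through the Kuramoto model of coupled oscillators; a tiny all-to-all sketch follows. Frequencies, coupling strength, and topology are invented, and real grid studies typically use swing equations on the actual network rather than this toy:

```python
import math

def order_parameter(phases):
    """Kuramoto order parameter r: 1.0 means perfect synchrony."""
    n = len(phases)
    re = sum(math.cos(p) for p in phases) / n
    im = sum(math.sin(p) for p in phases) / n
    return math.hypot(re, im)

def simulate(omegas, coupling, steps=4000, dt=0.01):
    """Euler-integrate all-to-all coupled phase oscillators."""
    phases = [0.1 * i for i in range(len(omegas))]
    for _ in range(steps):
        new = []
        for i, p in enumerate(phases):
            drive = sum(math.sin(q - p) for q in phases) / len(phases)
            new.append(p + dt * (omegas[i] + coupling * drive))
        phases = new
    return phases

# Three "generators" with slightly different natural frequencies.
phases = simulate([1.0, 1.05, 0.95], coupling=2.0)
print(order_parameter(phases) > 0.99)  # True: strong coupling locks them
```

Weakening the coupling (or widening the frequency spread) breaks the locked state, which is the toy analogue of a grid losing frequency synchrony.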
Generative complexity of Gray-Scott model
NASA Astrophysics Data System (ADS)
Adamatzky, Andrew
2018-03-01
In the Gray-Scott reaction-diffusion system, one reactant is constantly fed into the system; another reactant is reproduced by consuming the supplied reactant and is also converted to an inert product. The rate of feeding one reactant into the system and the rate of removing another reactant from the system determine the configurations of concentration profiles: stripes, spots, waves. We calculate the generative complexity (a morphological complexity of concentration profiles grown from a point-wise perturbation of the medium) of the Gray-Scott system for a range of the feeding and removal rates. The morphological complexity is evaluated using Shannon entropy, Simpson diversity, an approximation of Lempel-Ziv complexity, and expressivity (Shannon entropy divided by space-filling). We analyse the behaviour of the systems with the highest values of generative morphological complexity and show that the Gray-Scott systems expressing the highest levels of complexity are composed of wave-fragments (similar to wave-fragments in sub-excitable media) and travelling localisations (similar to quasi-dissipative solitons and gliders in Conway's Game of Life).
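Two of the named measures, Shannon entropy and Simpson diversity, are straightforward to compute on a symbolised concentration profile; the profile string below is invented for illustration:

```python
import math
from collections import Counter

def shannon(symbols):
    """Shannon entropy (bits) of a sequence of discrete symbols."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def simpson(symbols):
    """Simpson diversity: probability two random picks differ."""
    counts = Counter(symbols)
    n = len(symbols)
    return 1 - sum((c / n) ** 2 for c in counts.values())

# Concentration levels discretised into three symbols.
profile = "aaabbbabababccc"
print(round(shannon(profile), 3))  # 1.522
print(round(simpson(profile), 3))  # 0.64
```

Applied over a grid of feeding and removal rates, such measures pick out the parameter regions where the grown patterns are most heterogeneous.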
Modelling the urban water cycle as an integrated part of the city: a review.
Urich, Christian; Rauch, Wolfgang
2014-01-01
In contrast to common perceptions, the urban water infrastructure system is a complex and dynamic system that is constantly evolving and adapting to changes in the urban environment, to sustain existing services and provide additional ones. Instead of simplifying urban water infrastructure to a static system that is decoupled from its urban context, new management strategies use the complexity of the system to their advantage by integrating centralised with decentralised solutions and explicitly embedding water systems into their urban form. However, to understand and test possible adaptation strategies, urban water modelling tools are required to support exploration of their effectiveness as the human-technology-environment system coevolves under different future scenarios. The urban water modelling community has taken first steps to developing these new modelling tools. This paper critically reviews the historical development of urban water modelling tools and provides a summary of the current state of integrated modelling approaches. It reflects on the challenges that arise through the current practice of coupling urban water management tools with urban development models and discusses a potential pathway towards a new generation of modelling tools.
NASA Technical Reports Server (NTRS)
Hayden, Jeffrey L.; Jeffries, Alan
2012-01-01
The JPSS Ground System is a flexible system of systems responsible for telemetry, tracking & command (TT&C), data acquisition, routing and data processing services for a varied fleet of satellites to support weather prediction, modeling and climate modeling. To assist in this engineering effort, architecture modeling tools are being employed to translate the former NPOESS baseline to the new JPSS baseline. The paper will focus on the methodology for the system engineering process and the use of these architecture modeling tools within that process. The Department of Defense Architecture Framework version 2.0 (DoDAF 2.0) viewpoints and views that are being used to describe the JPSS GS architecture are discussed. The Unified Profile for DoDAF and MODAF (UPDM) and the Systems Modeling Language (SysML), as provided by extensions to the MagicDraw UML modeling tool, are used to develop the diagrams and tables that make up the architecture model. The model development process and structure are discussed, examples are shown, and details of handling the complexities of a large System of Systems (SoS), such as the JPSS GS, with an equally complex modeling tool are described.
ERIC Educational Resources Information Center
Scherer, Hannah H.; Holder, Lauren; Herbert, Bruce
2017-01-01
Engaging students in authentic problem solving concerning environmental issues in near-surface complex Earth systems involves both developing student conceptualization of Earth as a system and applying that scientific knowledge using techniques that model those used by professionals. In this first paper of a two-part series, we review the state of…
Modeling and Simulation of Lab-on-a-Chip Systems
2005-08-12
Complex chip geometries (including multiple turns) are considered, and variations of sample concentration profiles in laminar diffusion-based micromixers are derived. [Fragmentary table-of-contents excerpt: Chapter 6, "Modeling of Laminar Diffusion-Based Complex Electrokinetic Passive Micromixers"; Section 6.4.4, "Multi-Stream (Inter-Digital) Micromixers".]
Statistical Physics of Cascading Failures in Complex Networks
NASA Astrophysics Data System (ADS)
Panduranga, Nagendra Kumar
Systems such as the power grid, the World Wide Web (WWW), and the Internet are categorized as complex systems because of the presence of a large number of interacting elements. For example, the WWW is estimated to have a billion webpages, and understanding the dynamics of such a large number of individual agents (whose individual interactions might not be fully known) is a challenging task. Complex network representations of these systems have proved to be of great utility. Statistical physics is the study of the emergence of macroscopic properties of systems from the characteristics of the interactions between individual molecules. Hence, statistical physics of complex networks has been an effective approach to study these systems. In this dissertation, I have used statistical physics to study two distinct phenomena in complex systems: i) cascading failures and ii) shortest paths in complex networks. Understanding cascading failures is considered to be one of the "holy grails" in the study of complex systems such as the power grid, transportation networks, and economic systems. Studying failures of these systems as percolation on complex networks has proved to be insightful. Previously, cascading failures have been studied extensively using two different models: k-core percolation and interdependent networks. The first part of this work combines the two models into a general model, solves it analytically, and validates the theoretical predictions through extensive computer simulations. The phase diagram of the percolation transition has been systematically studied as one varies the average local k-core threshold and the coupling between networks. The phase diagram of the combined processes is very rich and includes novel features that do not appear in the models which study each of the processes separately.
For example, the phase diagram consists of first- and second-order transition regions separated by two tricritical lines that merge together and enclose a two-stage transition region. In the two-stage transition, the size of the giant component undergoes a first-order jump at a certain occupation probability, followed by a continuous second-order transition at a smaller occupation probability. Furthermore, at certain fixed interdependencies, the percolation transition cycles from first-order to second-order to two-stage to first-order as the k-core threshold is increased. We set up the analytical equations describing the phase boundaries of the two-stage transition region and derive the critical exponents for each type of transition. Understanding the shortest paths between individual elements in systems like communication networks and social media networks is important in the study of information cascades in these systems. Often, large heterogeneity can be present in the connections between nodes in these networks. Certain sets of nodes can be more highly connected among themselves than with nodes from other sets. These sets of nodes are often referred to as 'communities'. The second part of this work studies the effect of the presence of communities on the distribution of shortest paths in a network using a modular Erdős-Rényi network model. In this model, the number of communities and the degree of modularity of the network can be tuned using the parameters of the model. We find that the model reaches a percolation threshold while tuning the degree of modularity of the network, and that the distribution of the shortest paths in the network can be used as an indicator of how the communities are connected.
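The k-core pruning that underlies k-core percolation can be illustrated with a short sketch: nodes whose degree falls below the threshold k are removed repeatedly until none remain. This is a generic illustration of the pruning rule only, not the authors' combined model with interdependent networks.

```python
from collections import deque

def k_core(adj, k):
    """Return the k-core of an undirected graph.

    adj: dict mapping node -> set of neighbours.
    Nodes of degree < k are pruned iteratively; removals may push
    further nodes below the threshold, so a work queue is used.
    """
    adj = {u: set(nbrs) for u, nbrs in adj.items()}   # defensive copy
    queue = deque(u for u in adj if len(adj[u]) < k)
    removed = set()
    while queue:
        u = queue.popleft()
        if u in removed:
            continue
        removed.add(u)
        for w in adj[u]:
            adj[w].discard(u)
            if w not in removed and len(adj[w]) < k:
                queue.append(w)   # cascading removal
        adj[u].clear()
    return {u: nbrs for u, nbrs in adj.items() if u not in removed}
```

Tracking the size of the surviving core as nodes are randomly occupied or removed is the basic numerical experiment behind the percolation phase diagrams discussed above.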
Dense power-law networks and simplicial complexes
NASA Astrophysics Data System (ADS)
Courtney, Owen T.; Bianconi, Ginestra
2018-05-01
There is increasing evidence that dense networks occur in online social networks, recommendation networks, and the brain. In addition to being dense, these networks are often also scale-free, i.e., their degree distributions follow P(k) ∝ k^(−γ) with γ ∈ (1, 2]. Models of growing networks have been successfully employed to produce scale-free networks using preferential attachment; however, these models can only produce sparse networks, as the number of links and nodes added at each time step is constant. Here we present a modeling framework which produces networks that are both dense and scale-free. The mechanism by which the networks grow in this model is based on the Pitman-Yor process. Variations on the model are able to produce undirected scale-free networks with exponent γ = 2 or directed networks with power-law out-degree distribution with tunable exponent γ ∈ (1, 2). We also extend the model to directed two-dimensional simplicial complexes. Simplicial complexes are generalizations of networks that can encode the many-body interactions between the parts of a complex system, and as such are becoming increasingly popular for characterizing different data sets ranging from social interacting systems to the brain. Our model produces dense directed simplicial complexes with a power-law distribution of the generalized out-degrees of the nodes.
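The Pitman-Yor mechanism the model builds on can be illustrated in isolation: an existing category i (standing in here for an already-seen node) is reinforced with weight (n_i − a), while a new category appears with weight (b + aK), where t is the number of draws so far and K the number of categories. The discount a ∈ (0, 1) is what produces heavy-tailed occupancy counts. The parameter values below are arbitrary illustrative choices, not those of the paper.

```python
import random

def pitman_yor_draws(n, a=0.5, b=1.0, seed=0):
    """Sample n draws from a Pitman-Yor process and return category counts.

    At draw t, existing category i is chosen with probability
    (counts[i] - a) / (t + b); a new category is created with the
    remaining probability (b + a * K) / (t + b).
    """
    rng = random.Random(seed)
    counts = []   # counts[i] = occupancy of category i
    for t in range(n):
        r = rng.random() * (t + b)
        acc = 0.0
        for i, c in enumerate(counts):
            acc += c - a
            if r < acc:
                counts[i] += 1
                break
        else:
            counts.append(1)   # new category with weight b + a*K
    return counts
```

With a > 0 the number of categories grows like t^a, which is the root of the density and the tunable power-law exponent described in the abstract.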
Formal Requirements-Based Programming for Complex Systems
NASA Technical Reports Server (NTRS)
Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis
2005-01-01
Computer science as a field has not yet produced a general method to mechanically transform complex computer system requirements into a provably equivalent implementation. Such a method would be one major step towards dealing with complexity in computing, yet it remains the elusive holy grail of system development. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that such tools and methods leave unfilled is that the formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of complex systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations. While other techniques are available, this method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. We illustrate the application of the method to an example procedure from the Hubble Robotic Servicing Mission currently under study and preliminary formulation at NASA Goddard Space Flight Center.
Modeling of BN Lifetime Prediction of a System Based on Integrated Multi-Level Information
Wang, Xiaohong; Wang, Lizhi
2017-01-01
Predicting system lifetime is important to ensure safe and reliable operation of products, which requires integrated modeling based on multi-level, multi-sensor information. However, lifetime characteristics of equipment in a system are different and failure mechanisms are inter-coupled, which leads to complex logical correlations and the lack of a uniform lifetime measure. Based on a Bayesian network (BN), a lifetime prediction method for systems that combine multi-level sensor information is proposed. The method considers the correlation between accidental failures and degradation failure mechanisms, and achieves system modeling and lifetime prediction under complex logic correlations. This method is applied in the lifetime prediction of a multi-level solar-powered unmanned system, and the predicted results can provide guidance for the improvement of system reliability and for the maintenance and protection of the system. PMID:28926930
Modeling of BN Lifetime Prediction of a System Based on Integrated Multi-Level Information.
Wang, Jingbin; Wang, Xiaohong; Wang, Lizhi
2017-09-15
Predicting system lifetime is important to ensure safe and reliable operation of products, which requires integrated modeling based on multi-level, multi-sensor information. However, lifetime characteristics of equipment in a system are different and failure mechanisms are inter-coupled, which leads to complex logical correlations and the lack of a uniform lifetime measure. Based on a Bayesian network (BN), a lifetime prediction method for systems that combine multi-level sensor information is proposed. The method considers the correlation between accidental failures and degradation failure mechanisms, and achieves system modeling and lifetime prediction under complex logic correlations. This method is applied in the lifetime prediction of a multi-level solar-powered unmanned system, and the predicted results can provide guidance for the improvement of system reliability and for the maintenance and protection of the system.
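As an illustration of the idea of coupling accidental and degradation failure mechanisms in a probabilistic model, the toy calculation below enumerates a two-variable network in which an accidental failure becomes more likely once degradation is present, and the system fails if either mechanism fires. The structure and the probabilities are invented for illustration and are not the paper's Bayesian network.

```python
def system_failure_probability(p_deg=0.1, p_acc_given_deg=0.5, p_acc=0.2):
    """P(system fails) by full enumeration of a tiny two-node network.

    deg: degradation failure, prior p_deg.
    acc: accidental failure, whose probability is elevated to
         p_acc_given_deg when degradation is present (a simple
         stand-in for inter-coupled failure mechanisms).
    The system-level node is an OR gate over the two mechanisms.
    """
    p = 0.0
    for deg in (True, False):
        p_d = p_deg if deg else 1.0 - p_deg
        p_a_if_true = p_acc_given_deg if deg else p_acc
        for acc in (True, False):
            p_a = p_a_if_true if acc else 1.0 - p_a_if_true
            if deg or acc:          # OR gate: either mechanism fails the system
                p += p_d * p_a
    return p
```

With the defaults, P(fail) = 0.1 + 0.9 × 0.2 = 0.28; a real BN replaces the enumeration with structured inference over many such coupled nodes.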
UNH Data Cooperative: A Cyber Infrastructure for Earth System Studies
NASA Astrophysics Data System (ADS)
Braswell, B. H.; Fekete, B. M.; Prusevich, A.; Gliden, S.; Magill, A.; Vorosmarty, C. J.
2007-12-01
Earth system scientists and managers have a continuously growing demand for a wide array of earth observations derived from various data sources, including (a) modern satellite retrievals, (b) "in-situ" records, (c) various simulation outputs, and (d) assimilated data products combining model results with observational records. The sheer quantity of data and formatting inconsistencies make it difficult for users to take full advantage of this important information resource, so the system could benefit from a thorough retooling of our current data processing procedures and infrastructure. Emerging technologies like OPeNDAP and OGC map services, open standard data formats (NetCDF, HDF), and data cataloging systems (NASA-Echo, Global Change Master Directory, etc.) are providing the basis for a new approach to data management and processing, in which web services are increasingly designed to serve computer-to-computer communication without human interaction, and complex analysis can be carried out over distributed computer resources interconnected via cyber infrastructure. The UNH Earth System Data Collaborative is designed to utilize these emerging web technologies to offer new means of access to earth system data. While the UNH Data Collaborative serves a wide array of data, ranging from weather station data (Climate Portal) to ocean buoy records and ship tracks (Portsmouth Harbor Initiative) to land cover characteristics, the underlying data architecture shares common components for data mining and data dissemination via web services. Perhaps the most unique element of the UNH Data Cooperative's IT infrastructure is its prototype modeling environment for regional ecosystem surveillance over the Northeast corridor, which allows the integration of complex earth system model components with the Cooperative's data services.
While the complexity of the IT infrastructure needed to perform complex computations is continuously increasing, scientists are often forced to spend a considerable amount of time solving basic data management and preprocessing tasks and dealing with low-level computational design problems such as parallelization of model codes. Our modeling infrastructure is designed to handle the bulk of the common tasks found in complex earth system models, such as I/O handling, computational domain and time management, and parallel execution of the modeling tasks. The modeling infrastructure allows scientists to focus on the numerical implementation of the physical processes on single computational objects (typically grid cells), while the framework takes care of the preprocessing of input data, the establishment of data exchange between computational objects, and the execution of the science code. In our presentation, we will discuss the key concepts of our modeling infrastructure. We will demonstrate integration of our modeling framework with data services offered by the UNH Earth System Data Collaborative via web interfaces. We will lay out the road map for turning our prototype modeling environment into a truly community framework for a wide range of earth system scientists and environmental managers.
Complexity for Survival of Living Systems
NASA Technical Reports Server (NTRS)
Zak, Michail
2009-01-01
A logical connection between the survivability of living systems and the complexity of their behavior (equivalently, mental complexity) has been established. This connection is an important intermediate result of continuing research on mathematical models that could constitute a unified representation of the evolution of both living and non-living systems. Earlier results of this research were reported in several prior NASA Tech Briefs articles, the two most relevant being "Characteristics of Dynamics of Intelligent Systems" (NPO-21037), NASA Tech Briefs, Vol. 26, No. 12 (December 2002), page 48; and "Self-Supervised Dynamical Systems" (NPO-30634), NASA Tech Briefs, Vol. 27, No. 3 (March 2003), page 72. As used here, "living systems" is synonymous with "active systems" and "intelligent systems." The quoted terms can signify artificial agents (e.g., suitably programmed computers) or natural biological systems ranging from single-cell organisms at one extreme to the whole of human society at the other extreme. One of the requirements that must be satisfied in mathematical modeling of living systems is reconciliation of the evolution of life with the second law of thermodynamics. In the approach followed in this research, this reconciliation is effected by means of a model, inspired partly by quantum mechanics, in which the quantum potential is replaced with an information potential. The model captures the most fundamental property of life: the ability to evolve from disorder to order without any external interference. The model incorporates the equations of classical dynamics, including Newton's equations of motion and equations for random components caused by uncertainties in initial conditions and by Langevin forces. The equations of classical dynamics are coupled with corresponding Liouville or Fokker-Planck equations that describe the evolution of probability densities that represent the uncertainties.
The coupling is effected by fictitious information-based forces that are gradients of the information potential, which, in turn, is a function of the probability densities. The probability densities are associated with mental images, both self-images and nonself images (images of external objects that can include other agents). The evolution of the probability densities represents mental dynamics. The interaction between the physical and mental aspects of behavior is then implemented by feedback from mental to motor dynamics, as represented by the aforementioned fictitious forces. The interaction of a system with its self and nonself images affords unlimited capacity for increase of complexity. There is a biological basis for this model of mental dynamics in the discovery of mirror neurons that learn by imitation. The levels of complexity attained by use of this model match those observed in living systems. To establish a mechanism for increasing the complexity of dynamics of an active system, the model enables exploitation of a chain of reflections exemplified by questions of the form, "What do you think that I think that you think...?" Mathematically, each level of reflection is represented in the form of an attractor performing the corresponding level of abstraction, with more details removed at higher levels. The model can be used to describe the behaviors not only of biological systems, but also of ecological, social, and economic ones.
Toward a multiscale modeling framework for understanding serotonergic function
Wong-Lin, KongFatt; Wang, Da-Hui; Moustafa, Ahmed A; Cohen, Jeremiah Y; Nakamura, Kae
2017-01-01
Despite its importance in regulating emotion and mental wellbeing, the complex structure and function of the serotonergic system present formidable challenges toward understanding its mechanisms. In this paper, we review studies investigating the interactions between serotonergic and related brain systems and their behavior at multiple scales, with a focus on biologically-based computational modeling. We first discuss serotonergic intracellular signaling and neuronal excitability, followed by neuronal circuit and systems levels. At each level of organization, we will discuss the experimental work accompanied by related computational modeling work. We then suggest that a multiscale modeling approach that integrates the various levels of neurobiological organization could potentially transform the way we understand the complex functions associated with serotonin. PMID:28417684
NASA Technical Reports Server (NTRS)
Wise, Stephen A.; Holt, James M.
2002-01-01
The complexity of International Space Station (ISS) systems modeling often necessitates the concurrence of various dissimilar, parallel analysis techniques to validate modeling. This was the case with a feasibility and performance study of the ISS Node 3 Regenerative Heat Exchanger (RHX). A thermo-hydraulic network model was created and analyzed in SINDA/FLUINT. A less complex, closed-form solution of the system dynamics was created using an Excel spreadsheet. The purpose of this paper is to provide a brief description of the modeling processes utilized, and the results and benefits of each to the ISS Node 3 RHX study.
NASA Technical Reports Server (NTRS)
Wise, Stephen A.; Holt, James M.; Turner, Larry D. (Technical Monitor)
2001-01-01
The complexity of International Space Station (ISS) systems modeling often necessitates the concurrence of various dissimilar, parallel analysis techniques to validate modeling. This was the case with a feasibility and performance study of the ISS Node 3 Regenerative Heat Exchanger (RHX). A thermo-hydraulic network model was created and analyzed in SINDA/FLUINT. A less complex, closed form solution of the system dynamics was created using Excel. The purpose of this paper is to provide a brief description of the modeling processes utilized, the results and benefits of each to the ISS Node 3 RHX study.
Use of paired simple and complex models to reduce predictive bias and quantify uncertainty
NASA Astrophysics Data System (ADS)
Doherty, John; Christensen, Steen
2011-12-01
Modern environmental management and decision-making are based on the use of increasingly complex numerical models. Such models have the advantage of allowing representation of complex processes and heterogeneous system property distributions inasmuch as these are understood at any particular study site. The latter are often represented stochastically, reflecting knowledge of the character of system heterogeneity at the same time as it reflects a lack of knowledge of its spatial details. Unfortunately, however, complex models are often difficult to calibrate because of their long run times and sometimes questionable numerical stability. Analysis of predictive uncertainty is also a difficult undertaking when using models such as these. Such analysis must reflect a lack of knowledge of spatial hydraulic property details. At the same time, it must be subject to constraints on the spatial variability of these details born of the necessity for model outputs to replicate observations of historical system behavior. In contrast, the rapid run times and general numerical reliability of simple models often facilitate good calibration and ready implementation of sophisticated methods of calibration-constrained uncertainty analysis. Unfortunately, however, many system and process details on which uncertainty may depend are, by design, omitted from simple models. This can lead to underestimation of the uncertainty associated with many predictions of management interest. The present paper proposes a methodology that attempts to overcome the problems associated with complex models on the one hand and simple models on the other, while allowing access to the benefits each of them offers. It provides a theoretical analysis of the simplification process from a subspace point of view, yielding insights into the costs of model simplification and into how some of these costs may be reduced.
It then describes a methodology for paired model usage through which predictive bias of a simplified model can be detected and corrected, and postcalibration predictive uncertainty can be quantified. The methodology is demonstrated using a synthetic example based on groundwater modeling environments commonly encountered in northern Europe and North America.
Multi-Agent Strategic Modeling in a Specific Environment
NASA Astrophysics Data System (ADS)
Gams, Matjaz; Bezek, Andraz
Multi-agent modeling in ambient intelligence (AmI) is concerned with the following task [19]: How can external observations of multi-agent systems in the ambient be used to analyze, model, and direct agent behavior? The main purpose is to obtain knowledge about activity in the environment, thus enabling proper actions of the AmI systems [1]. Analysis of such systems must therefore capture complex world-state representations and asynchronous agent activities. Instead of studying basic numerical data, researchers often use more complex data structures, such as rules and decision trees. Some methods are extremely useful for characterizing the state space, but lack the ability to clearly represent temporal state changes caused by agent actions. To comprehend simultaneous agent actions and complex changes of the state space, a combination of graphical and symbolic representation most often performs better in terms of human understanding and performance.
Condition-based diagnosis of mechatronic systems using a fractional calculus approach
NASA Astrophysics Data System (ADS)
Gutiérrez-Carvajal, Ricardo Enrique; Flávio de Melo, Leonimer; Maurício Rosário, João; Tenreiro Machado, J. A.
2016-07-01
While fractional calculus (FC) is as old as integer calculus, its application has been mainly restricted to mathematics. However, many real systems are better described using FC equations than with integer models. FC is a suitable tool for describing systems characterised by their fractal nature, long-term memory and chaotic behaviour. It is a promising methodology for failure analysis and modelling, since the behaviour of a failing system depends on factors that increase the model's complexity. This paper explores the proficiency of FC in modelling complex behaviour by tuning only a few parameters. This work proposes a novel two-step strategy for diagnosis, first modelling common failure conditions and, second, by comparing these models with real machine signals and using the difference to feed a computational classifier. Our proposal is validated using an electrical motor coupled with a mechanical gear reducer.
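A common numerical entry point to FC models of this kind is the Grünwald-Letnikov definition of the fractional derivative, which generalizes the backward difference to non-integer order via binomial-coefficient weights. The sketch below implements its truncated-sum approximation; the step size and truncation length are illustrative choices, not values from the paper.

```python
def gl_fractional_derivative(f, t, alpha, h=1e-3, n_terms=2000):
    """Grunwald-Letnikov approximation of the order-alpha derivative of f at t.

    D^alpha f(t) ~= h^(-alpha) * sum_k c_k * f(t - k*h),
    where c_0 = 1 and c_k = c_{k-1} * (1 - (alpha + 1) / k) are the
    signed generalized binomial coefficients. For alpha = 1 this
    reduces exactly to the backward difference (f(t) - f(t-h)) / h.
    """
    total = 0.0
    c = 1.0
    for k in range(n_terms):
        if t - k * h < 0:
            break   # assume f is defined on [0, t]
        total += c * f(t - k * h)
        c *= 1.0 - (alpha + 1.0) / (k + 1)
    return total / h ** alpha
```

The long tail of non-zero weights c_k is the "long-term memory" the abstract refers to: the fractional derivative at t depends on the whole history of f, not just its local slope.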
1999-03-01
...mates) and base their behaviors on this interactive information. This alone defines the nature of a complex adaptive system and it is based on this... world policy initiatives. 2.3.4. User Interaction: Building the model with extensive user interaction gives the entire system a more appealing feel... complex behavior that hopefully mimics trends observed in reality. User interaction also allows for easier justification of assumptions used within
NASA Technical Reports Server (NTRS)
Al-Jaar, Robert Y.; Desrochers, Alan A.
1989-01-01
The main objective of this research is to develop a generic modeling methodology with a flexible and modular framework to aid in the design and performance evaluation of integrated manufacturing systems using a unified model. After a thorough examination of the available modeling methods, the Petri Net approach was adopted. The concurrent and asynchronous nature of manufacturing systems is easily captured by Petri Net models. Three basic modules were developed: machine, buffer, and Decision Making Unit. The machine and buffer modules are used for modeling transfer lines and production networks. The Decision Making Unit models the functions of a computer node in a complex Decision Making Unit architecture. The underlying model is a Generalized Stochastic Petri Net (GSPN) that can be used for performance evaluation and structural analysis. GSPNs were chosen because they help manage the complexity of modeling large manufacturing systems. There is no need to enumerate all the possible states of the Markov chain, since they are automatically generated from the GSPN model.
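The machine and buffer modules can be illustrated with a minimal, untimed place-transition net; the sketch below omits the stochastic firing delays of a full GSPN, and the place and transition names are invented for the example.

```python
class PetriNet:
    """Minimal place-transition net: a transition is enabled when every
    input place holds a token; firing consumes one token per input place
    and deposits one token in each output place."""

    def __init__(self, marking):
        self.marking = dict(marking)   # place -> token count
        self.transitions = {}          # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (tuple(inputs), tuple(outputs))

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) > 0 for p in inputs)

    def fire(self, name):
        inputs, outputs = self.transitions[name]
        if not self.enabled(name):
            raise ValueError(f"{name} is not enabled")
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1
```

A machine-plus-buffer line can then be assembled from a `start` transition that consumes `machine_idle` and `raw` tokens, and a `finish` transition that returns the machine to idle while depositing a part in `buffer`; a GSPN additionally attaches exponential firing rates to such transitions.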
A Model-Based Approach to Engineering Behavior of Complex Aerospace Systems
NASA Technical Reports Server (NTRS)
Ingham, Michel; Day, John; Donahue, Kenneth; Kadesch, Alex; Kennedy, Andrew; Khan, Mohammed Omair; Post, Ethan; Standley, Shaun
2012-01-01
One of the most challenging yet poorly defined aspects of engineering a complex aerospace system is behavior engineering: definition, specification, design, implementation, and verification and validation of the system's behaviors. This is especially true for behaviors of highly autonomous and intelligent systems. Behavior engineering is more of an art than a science. As a process it is generally ad hoc, poorly specified, and inconsistently applied from one project to the next. It uses largely informal representations and results in system behavior being documented in a wide variety of disparate documents. To address this problem, JPL has undertaken a pilot project to apply its institutional capabilities in Model-Based Systems Engineering to the challenge of specifying complex spacecraft system behavior. This paper describes the results of the work in progress on this project. In particular, we discuss our approach to modeling spacecraft behavior, including 1) requirements and design flowdown from system level to subsystem level, 2) patterns for behavior decomposition, 3) allocation of behaviors to physical elements in the system, and 4) patterns for capturing V&V activities associated with behavioral requirements. We provide examples of interesting behavior specification patterns and discuss findings from the pilot project.
Complexity and Hopf Bifurcation Analysis on a Kind of Fractional-Order IS-LM Macroeconomic System
NASA Astrophysics Data System (ADS)
Ma, Junhai; Ren, Wenbo
On the basis of our previous research, we deepen and complete a kind of macroeconomic IS-LM model using fractional-order calculus theory, which is a good reflection of the memory characteristics of economic variables. We also focus on the influence of the variables on the real system, and improve the analysis capabilities of the traditional economic models to suit the actual macroeconomic environment. The conditions for Hopf bifurcation in fractional-order system models are briefly demonstrated, and the fractional order at which Hopf bifurcation occurs is calculated, showing the inherent complex dynamic characteristics of the system. With numerical simulation, bifurcation, strange attractors, limit cycles, waveforms and other complex dynamic characteristics are given, and the order condition is obtained with respect to time. We find that the system order has an important influence on the running state of the system. The system has a periodic motion when the order meets the conditions of Hopf bifurcation; the fractional-order system gradually stabilizes with the change of the order and parameters, while the corresponding integer-order system diverges. This study has certain significance for policy-making about macroeconomic regulation and control.
NASA Astrophysics Data System (ADS)
McCaskill, John
There can be large spatial and temporal separation of cause and effect in policy making. Determining the correct linkage between policy inputs and outcomes can be highly impractical in the complex environments faced by policy makers. In attempting to foresee and plan for probable outcomes, standard linear models often overlook, ignore, or are unable to predict catastrophic events that seem improbable only because of multiple feedback loops. Several issues with the makeup and behaviors of complex systems explain the difficulty many mathematical models (factor analysis, structural equation modeling) have in dealing with non-linear effects in complex systems. This chapter highlights those problem issues and offers insights into the usefulness of agent-based modeling (ABM) in dealing with non-linear effects in complex policy-making environments.
Four Single-Page Learning Models.
ERIC Educational Resources Information Center
Hlynka, Denis
1979-01-01
Identifies four models of single-page learning systems that can streamline lengthy, complex prose: Information Mapping, Focal Press Model, Behavioral Objectives Model, and School Mathematics Model. (CMV)
Adsorption Equilibrium and Kinetics at Goethite-Water and Related Interfaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katz, Lynn Ellen
This research study is an important component of a broader comprehensive project, "Geochemistry of Interfaces: From Surfaces to Interlayers to Clusters," which sought to identify and evaluate the critical molecular phenomena at metal-oxide interfaces that control many geochemical and environmental processes. The primary goal of this research study was to better understand and predict adsorption of metal ions at mineral/water surfaces. Macroscopic data from traditional batch experiments were used to develop predictive models that characterize sorption in complex systems containing a wide range of background solution compositions. Our studies focused on systems involving alkaline earth metal (Mg2+, Ca2+, Sr2+, Ba2+) and heavy metal (Hg2+, Co2+, Cd2+, Cu2+, Zn2+, Pb2+) cations. The anions we selected for study included Cl-, NO3-, ClO4-, SO42-, CO32- and SeO32-, and the background electrolyte cations we examined included Na+, K+, Rb+ and Cs+, because these represent a range of ion sizes and have varying potentials for forming ion-pairs or ternary complexes with the metal ions studied. The research led to the development of a modified titration congruency approach for estimating site densities for mineral oxides such as goethite. The CD-MUSIC version of the surface complexation modeling approach was applied to potentiometric titration data and macroscopic adsorption data for single-solute heavy metals, oxyanions, alkaline earth metals and background electrolytes over a range of pH and ionic strength. The model was capable of predicting sorption in bi-solute systems containing multiple cations, cations and oxyanions, and transition metal cations and alkaline earth metal ions. Incorporation of ternary complexes was required for modeling Pb(II)-Se(IV) and Cd(II)-Se(IV) systems. Both crystal face contributions and capacitance values were shown to be sensitive to varying specific surface area but were successfully accounted for in the modeling strategy. The insights gained from the macroscopic, spectroscopic and CD-MUSIC modeling developed in this study can be used to guide the implementation of less complex models which may be more applicable to field conditions. The findings of this research suggest that surface complexation models can be used as a predictive tool for fate and transport modeling of metal ions and oxyanions in fresh and saline systems typical of energy production waters and wastewaters.
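As a far simpler illustration than the CD-MUSIC surface complexation model used in the study, macroscopic sorption data of this kind are often summarized with a single-site Langmuir isotherm; the equilibrium constant and site capacity below are invented for the sketch:

```python
# Toy macroscopic sorption model: a single-site Langmuir isotherm.
# This is NOT the CD-MUSIC model; K and q_max are hypothetical values.

def langmuir_sorbed(c, K=2.0, q_max=1.0):
    """Sorbed amount q for aqueous concentration c (Langmuir isotherm)."""
    return q_max * K * c / (1.0 + K * c)

# Coverage approaches the site capacity q_max as concentration grows.
isotherm = [round(langmuir_sorbed(c), 3) for c in (0.1, 1.0, 10.0, 100.0)]
print(isotherm)  # → [0.167, 0.667, 0.952, 0.995]
```

The saturating shape is why site-density estimates (here, q_max) matter so much in the fitting step described above.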
Undecidability and Irreducibility Conditions for Open-Ended Evolution and Emergence.
Hernández-Orozco, Santiago; Hernández-Quiroz, Francisco; Zenil, Hector
2018-01-01
Is undecidability a requirement for open-ended evolution (OEE)? Using methods derived from algorithmic complexity theory, we propose robust computational definitions of open-ended evolution and the adaptability of computable dynamical systems. Within this framework, we show that decidability imposes absolute limits on the stable growth of complexity in computable dynamical systems. Conversely, systems that exhibit (strong) open-ended evolution must be undecidable, establishing undecidability as a requirement for such systems. Complexity is assessed in terms of three measures: sophistication, coarse sophistication, and busy beaver logical depth. These three complexity measures assign low complexity values to random (incompressible) objects. As time grows, the stated complexity measures allow for the existence of complex states during the evolution of a computable dynamical system. We show, however, that finding these states involves undecidable computations. We conjecture that for similar complexity measures that assign low complexity values, decidability imposes comparable limits on the stable growth of complexity, and that such behavior is necessary for nontrivial evolutionary systems. We show that the undecidability of adapted states imposes novel and unpredictable behavior on the individuals or populations being modeled. Such behavior is irreducible. Finally, we offer an example of a system, first proposed by Chaitin, that exhibits strong OEE.
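The measures named above (sophistication, coarse sophistication, busy beaver logical depth) are uncomputable. A crude, computable stand-in for plain descriptive complexity is a real compressor's output length; note the deliberate contrast with the paper's measures, which assign low complexity to random strings, whereas a compression ratio assigns them high values:

```python
import random
import zlib

def compression_ratio(data: bytes) -> float:
    """Compressed size over original size: a crude, computable proxy for
    descriptive (Kolmogorov) complexity -- unlike sophistication or
    logical depth, it rates random data as maximally complex."""
    return len(zlib.compress(data, 9)) / len(data)

random.seed(0)
periodic = b"ab" * 500
noise = bytes(random.randrange(256) for _ in range(1000))
print(round(compression_ratio(periodic), 3))  # highly compressible
print(round(compression_ratio(noise), 3))     # essentially incompressible
```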
Petri net-based dependability modeling methodology for reconfigurable field programmable gate arrays
NASA Astrophysics Data System (ADS)
Graczyk, Rafał; Orleański, Piotr; Poźniak, Krzysztof
2015-09-01
Dependability modeling is an important issue for aerospace and space equipment designers. From a system-level perspective, one has to choose from a multitude of possible architectures, redundancy levels, and component combinations so as to meet the desired dependability properties while fitting within the required cost and time budgets. Modeling such systems is becoming harder as their complexity grows, together with the demand for more functional, flexible, and available systems that govern ever more crucial parts of our civilization's infrastructure (aerospace transport systems, telecommunications, exploration probes). In this article, a promising method of modeling complex systems using Petri nets is introduced in the context of qualitative and quantitative dependability analysis. Despite some limitations and drawbacks, the method offers a convenient visual formal means of describing system behavior at different levels (functional, timing, random events) and corresponds directly to an underlying mathematical engine well suited to simulation and engineering support.
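A minimal sketch of the token-game semantics underlying such Petri-net dependability models; the fail/repair example, with its place and transition names, is invented and not taken from the article:

```python
# Minimal Petri net: places hold tokens, a transition fires when every
# input place holds enough tokens, consuming inputs and producing outputs.

class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)   # place -> token count
        self.transitions = {}          # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

# Hypothetical dependability fragment: a unit that can fail and be
# repaired by consuming one spare from a pool.
net = PetriNet({"operational": 1, "spares": 2, "failed": 0})
net.add_transition("fail", {"operational": 1}, {"failed": 1})
net.add_transition("repair", {"failed": 1, "spares": 1}, {"operational": 1})
net.fire("fail")
net.fire("repair")
print(net.marking)  # → {'operational': 1, 'spares': 1, 'failed': 0}
```

Qualitative analysis then amounts to exploring which markings are reachable; quantitative analysis attaches rates or delays to transitions.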
How rare is complex life in the Milky Way?
Bounama, Christine; von Bloh, Werner; Franck, Siegfried
2007-10-01
An integrated Earth system model was applied to calculate the number of habitable Earth-analog planets that are likely to have developed primitive (unicellular) and complex (multicellular) life in extrasolar planetary systems. The model is based on the global carbon cycle mediated by life and driven by increasing stellar luminosity and plate tectonics. We assumed that the hypothetical primitive and complex life forms differed in their temperature limits and CO2 tolerances. Though complex life would be more vulnerable to environmental stress, its presence would amplify weathering processes on a terrestrial planet. The model allowed us to calculate the average number of Earth-analog planets that may harbor such life by using the formation rate of Earth-like planets in the Milky Way as well as the size of a habitable zone that could support primitive and complex life forms. The number of planets predicted to bear complex life was found to be approximately 2 orders of magnitude lower than the number predicted for primitive life forms. Our model predicted a maximum abundance of such planets around 1.8 Ga ago and allowed us to calculate the average distance between potentially habitable planets in the Milky Way. If the model predictions are accurate, the future missions DARWIN (up to a probability of 65%) and TPF (up to 20%) are likely to detect at least one planet with a biosphere composed of complex life.
NASA Astrophysics Data System (ADS)
Bosikov, I. I.; Klyuev, R. V.; Revazov, V. Ch; Pilieva, D. E.
2018-03-01
The article presents research and analysis of hazardous processes occurring in a natural-industrial system, and an assessment of the effectiveness of its functioning using mathematical models. Studies of the regularities governing the functioning of natural-industrial systems are becoming increasingly relevant in connection with the task of modernizing production and the economy of Russia as a whole. Because a significant amount of the available data is poorly structured, it is difficult to establish regulations for the effective functioning of production processes and of social and natural complexes under which sustainable development of the natural-industrial system of a mining and processing complex would be ensured. Scientific and applied problems whose solution makes it possible to formalize the hidden structural patterns in the functioning of the natural-industrial system, and to make organizational and technological management decisions that improve the efficiency of the system, are therefore highly relevant.
A computational framework for modeling targets as complex adaptive systems
NASA Astrophysics Data System (ADS)
Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh
2017-05-01
Modeling large military targets is a challenge as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extending across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and more importantly various emergent behaviors, displayed by the targets. However, a key stumbling block is incorporating information from various intelligence, surveillance and reconnaissance (ISR) sources, while dealing with the inherent uncertainty, incompleteness and time criticality of real world information. To overcome these challenges, we present a probabilistic reasoning network based framework called complex adaptive Bayesian Knowledge Base (caBKB). caBKB is a rigorous, overarching and axiomatic framework that models two key processes, namely information aggregation and information composition. While information aggregation deals with the union, merger and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, it provides unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we will describe how our framework can be used for modeling targets with a focus on methodologies for quantifying NCO performance metrics.
Using Complexity Theory to Guide Medical School Evaluations.
Jorm, Christine; Roberts, Chris
2018-03-01
Contemporary medical school evaluations are narrow in focus and often do not consider the wider systems implications of the relationship between learning and teaching, research, clinical care, and community engagement. The result is graduates who lack the necessary knowledge and skills for the modern health care system and an educational system that is limited in its ability to learn and change. To address this issue, the authors apply complexity theory to medical school evaluation, using four key factors: nesting, diversity, self-organization, and emergent outcomes. To help medical educators apply this evaluation approach in their own settings, the authors offer two tools: a modified program logic model and sensemaking. In sensemaking, they use the organic metaphor of the medical school as a neuron situated within a complex neural network to enable medical educators to reframe the way they think about program evaluation. The authors then offer practical guidance for applying this model, including describing the example of addressing graduates' engagement in the health care system. The authors consider the input of teachers, the role of culture and curriculum, and the clinical care system in this example. In this model, medical school evaluation is reframed as an improvement science for complex social interventions (a medical school being such an intervention). With complexity theory's focus on emergent outcomes, evaluation takes on a new focus, reimagining medical students as reaching their future potential as change agents who transform health systems and the lives of patients.
A geometric modeler based on a dual-geometry representation polyhedra and rational b-splines
NASA Technical Reports Server (NTRS)
Klosterman, A. L.
1984-01-01
For speed and data base reasons, solid geometric modeling of large complex practical systems is usually approximated by a polyhedra representation. Precise parametric surface and implicit algebraic modelers are available but it is not yet practical to model the same level of system complexity with these precise modelers. In response to this contrast the GEOMOD geometric modeling system was built so that a polyhedra abstraction of the geometry would be available for interactive modeling without losing the precise definition of the geometry. Part of the reason that polyhedra modelers are effective is that all bounded surfaces can be represented in a single canonical format (i.e., sets of planar polygons). This permits a very simple and compact data structure. Nonuniform rational B-splines are currently the best representation to describe a very large class of geometry precisely with one canonical format. The specific capabilities of the modeler are described.
Approaching human language with complex networks
NASA Astrophysics Data System (ADS)
Cong, Jin; Liu, Haitao
2014-12-01
The interest in modeling and analyzing human language with complex networks is on the rise in recent years and a considerable body of research in this area has already been accumulated. We survey three major lines of linguistic research from the complex network approach: 1) characterization of human language as a multi-level system with complex network analysis; 2) linguistic typological research with the application of linguistic networks and their quantitative measures; and 3) relationships between the system-level complexity of human language (determined by the topology of linguistic networks) and microscopic linguistic (e.g., syntactic) features (as the traditional concern of linguistics). We show that the models and quantitative tools of complex networks, when exploited properly, can constitute an operational methodology for linguistic inquiry, which contributes to the understanding of human language and the development of linguistics. We conclude our review with suggestions for future linguistic research from the complex network approach: 1) relationships between the system-level complexity of human language and microscopic linguistic features; 2) expansion of research scope from the global properties to other levels of granularity of linguistic networks; and 3) combination of linguistic network analysis with other quantitative studies of language (such as quantitative linguistics).
Conceptual Modeling in Systems Biology Fosters Empirical Findings: The mRNA Lifecycle
Dori, Dov; Choder, Mordechai
2007-01-01
One of the main obstacles to understanding complex biological systems is the extent and rapid evolution of information, way beyond the capacity of individuals to manage and comprehend. Current modeling approaches and tools lack adequate capacity to concurrently model the structure and behavior of biological systems. Here we propose Object-Process Methodology (OPM), a holistic conceptual modeling paradigm, as a means to model biological systems both diagrammatically and textually, formally and intuitively, at any desired number of levels of detail. OPM combines objects, e.g., proteins, and processes, e.g., transcription, in a way that is simple and easily comprehensible to researchers and scholars. As a case in point, we modeled the yeast mRNA lifecycle, which involves mRNA synthesis in the nucleus, mRNA transport to the cytoplasm, and its subsequent translation and degradation therein. Recent studies have identified specific cytoplasmic foci, termed processing bodies, that contain large complexes of mRNAs and decay factors. Our OPM model of this cellular subsystem, presented here, led to the discovery of a new constituent of these complexes, the translation termination factor eRF3. Association of eRF3 with processing bodies is observed after a long-term starvation period. We suggest that OPM can eventually serve as a comprehensive evolvable model of the entire living cell system. The model would serve as a research and communication platform, highlighting unknown and uncertain aspects that can be addressed empirically and updated accordingly while maintaining consistency. PMID:17849002
NASA Astrophysics Data System (ADS)
Box, Paul W.
GIS and spatial analysis are suited mainly to static pictures of the landscape, but many of the processes that need exploring are dynamic in nature. Dynamic processes can be complex when put in a spatial context; our ability to study such processes will probably come with advances in understanding complex systems in general. Cellular automata and agent-based models are two prime candidates for exploring complex spatial systems, but they are difficult to implement. Innovative tools that help build complex simulations will create larger user communities, who will probably find novel solutions for understanding complexity. A significant source of such innovations is likely to be the collective efforts of hobbyists and part-time programmers, who have been dubbed ``garage-band scientists'' in the popular press.
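As a concrete taste of the cellular automata mentioned above, an elementary (one-dimensional, two-state) automaton such as Wolfram's rule 110 fits in a few lines; the grid size and rule choice here are purely illustrative:

```python
# Elementary cellular automaton: each cell's next state is looked up from
# the 3-cell neighborhood, using the bits of the rule number as the table.

def step(cells, rule=110):
    """One synchronous update with circular (wrap-around) boundaries."""
    n = len(cells)
    out = []
    for i in range(n):
        idx = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        out.append((rule >> idx) & 1)   # bit `idx` of the rule number
    return out

row = [0] * 11
row[5] = 1                              # a single live cell
for _ in range(5):
    print("".join(".#"[c] for c in row))
    row = step(row)
```

Agent-based models follow the same pattern, with richer per-site state and update rules in place of the 8-entry lookup table.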
On the dimension of complex responses in nonlinear structural vibrations
NASA Astrophysics Data System (ADS)
Wiebe, R.; Spottswood, S. M.
2016-07-01
The ability to accurately model engineering systems under extreme dynamic loads would prove a major breakthrough in many aspects of aerospace, mechanical, and civil engineering. Extreme loads frequently induce both nonlinearities and coupling which increase the complexity of the response and the computational cost of finite element models. Dimension reduction has recently gained traction and promises the ability to distill dynamic responses down to a minimal dimension without sacrificing accuracy. In this context, the dimensionality of a response is related to the number of modes needed in a reduced order model to accurately simulate the response. Thus, an important step is characterizing the dimensionality of complex nonlinear responses of structures. In this work, the dimensionality of the nonlinear response of a post-buckled beam is investigated. Significant detail is dedicated to carefully introducing the experiment, the verification of a finite element model, and the dimensionality estimation algorithm as it is hoped that this system may help serve as a benchmark test case. It is shown that with minor modifications, the method of false nearest neighbors can quantitatively distinguish between the response dimension of various snap-through, non-snap-through, random, and deterministic loads. The state-space dimension of the nonlinear system in question increased from 2 to 10 as the system response moved from simple, low-level harmonic to chaotic snap-through. Beyond the problem studied herein, the techniques developed will serve as a prescriptive guide in developing fast and accurate dimensionally reduced models of nonlinear systems, and eventually as a tool for adaptive dimension-reduction in numerical modeling.
The results are especially relevant in the aerospace industry for the design of thin structures such as beams, panels, and shells, which are all capable of spatio-temporally complex dynamic responses that are difficult and computationally expensive to model.
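A sketch of the method of false nearest neighbors referred to above, applied to a simple sine signal rather than the paper's snap-through data; the delay, tolerance, and signal are invented for illustration:

```python
import numpy as np

def false_nearest_fraction(x, dim, tau=1, rtol=10.0):
    """Fraction of nearest neighbours in dim-dimensional delay space that
    jump far apart when the embedding is extended by one coordinate."""
    n = len(x) - dim * tau
    emb = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
    nxt = x[dim * tau : dim * tau + n]       # the extra (dim+1)th coordinate
    false = 0
    for i in range(n):
        d = np.linalg.norm(emb - emb[i], axis=1)
        d[i] = np.inf                        # exclude the point itself
        j = int(np.argmin(d))
        if abs(nxt[i] - nxt[j]) / max(d[j], 1e-12) > rtol:
            false += 1
    return false / n

t = np.arange(0.0, 60.0, 0.1)
signal = np.sin(t)                           # unfolds in a 2-D embedding
fractions = {dim: false_nearest_fraction(signal, dim, tau=5)
             for dim in (1, 2, 3)}
for dim, frac in fractions.items():
    print(dim, round(frac, 3))
```

The fraction drops sharply once the embedding dimension is sufficient, which is how the method flags the 2-to-10 dimensional transition reported above.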
Use of system identification techniques for improving airframe finite element models using test data
NASA Technical Reports Server (NTRS)
Hanagud, Sathya V.; Zhou, Weiyu; Craig, James I.; Weston, Neil J.
1993-01-01
A method for using system identification techniques to improve airframe finite element models using test data was developed and demonstrated. The method uses linear sensitivity matrices to relate changes in selected physical parameters to changes in the total system matrices. The values for these physical parameters were determined using constrained optimization with singular value decomposition. The method was confirmed using both simple and complex finite element models for which pseudo-experimental data was synthesized directly from the finite element model. The method was then applied to a real airframe model which incorporated all of the complexities and details of a large finite element model and for which extensive test data was available. The method was shown to work, and the differences between the identified model and the measured results were considered satisfactory.
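The core linear-algebra step described, relating changes in physical parameters to changes in measured responses through a sensitivity matrix and solving via singular value decomposition, can be sketched on synthesized noise-free data; the matrices are invented, and the constraints of the actual method are omitted:

```python
import numpy as np

# Hypothetical sketch: residuals dy between test data and the finite
# element model are mapped to parameter corrections p through a linear
# sensitivity matrix S, solved with an SVD pseudoinverse.

rng = np.random.default_rng(1)
S = rng.normal(size=(6, 3))          # sensitivities d(response)/d(parameter)
p_true = np.array([0.5, -1.2, 2.0])  # "unknown" parameter corrections
dy = S @ p_true                      # synthesized measurement residuals

U, s, Vt = np.linalg.svd(S, full_matrices=False)
p_est = Vt.T @ ((U.T @ dy) / s)      # minimum-norm least-squares solution

print(np.allclose(p_est, p_true))    # → True (noise-free, full-rank case)
```

With noisy data or a rank-deficient S, small singular values would be truncated before inverting, which is the practical reason SVD is preferred over a direct solve.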
NASA Astrophysics Data System (ADS)
Frenken, Koen
2001-06-01
The biological evolution of complex organisms, in which the functioning of genes is interdependent, has been analyzed as "hill-climbing" on NK fitness landscapes through random mutation and natural selection. In evolutionary economics, NK fitness landscapes have been used to simulate the evolution of complex technological systems containing elements that are interdependent in their functioning. In these models, economic agents search randomly for new technological designs by trial and error and run the risk of ending up in sub-optimal solutions due to interdependencies between the elements of a complex system. Such models of random search are legitimate for reasons of modeling simplicity, but they remain limited because they ignore the fact that agents can apply heuristics. A specific heuristic is one that sequentially optimises functions according to their ranking by users of the system. To model this heuristic, a generalized NK-model is developed in which core elements that influence many functions can be distinguished from peripheral elements that affect few functions. The concept of paradigmatic search can then be analytically defined as search that leaves core elements intact while concentrating on improving functions by mutation of peripheral elements.
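A sketch of the baseline model the abstract builds on: a standard NK fitness landscape searched by random mutation and hill-climbing. This is not the generalized core/periphery model the paper develops, and all parameters are illustrative:

```python
import random

def nk_landscape(N, K, seed=0):
    """Random NK landscape: element i's fitness contribution depends on
    itself and K other, randomly chosen elements (values drawn lazily)."""
    rng = random.Random(seed)
    deps = [[i] + rng.sample([j for j in range(N) if j != i], K)
            for i in range(N)]
    tables = [{} for _ in range(N)]
    def fitness(genome):
        total = 0.0
        for i in range(N):
            key = tuple(genome[j] for j in deps[i])
            if key not in tables[i]:
                tables[i][key] = rng.random()
            total += tables[i][key]
        return total / N
    return fitness

def hill_climb(fitness, N, steps=200, seed=1):
    """Trial-and-error search: flip one element, keep the change unless
    fitness decreases. Interdependence (K > 0) creates local optima."""
    rng = random.Random(seed)
    genome = [rng.randrange(2) for _ in range(N)]
    best = fitness(genome)
    for _ in range(steps):
        i = rng.randrange(N)
        genome[i] ^= 1
        f = fitness(genome)
        if f >= best:
            best = f
        else:
            genome[i] ^= 1      # reject the downhill move
    return best

fit = nk_landscape(N=10, K=3)
best_fit = hill_climb(fit, 10)
print(round(best_fit, 3))       # a (possibly local) optimum
```

The paper's generalization would replace the uniform random `deps` with a structured pattern in which a few core elements enter many contribution tables.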
NASA Astrophysics Data System (ADS)
Hoepfer, Matthias
Over the last two decades, computer modeling and simulation have evolved as the tools of choice for the design and engineering of dynamic systems. With increased system complexities, modeling and simulation become essential enablers for the design of new systems. Among the advantages of modeling and simulation-based system design are the replacement of physical tests to ensure product performance, reliability, and quality; the shortening of design cycles due to the reduced need for physical prototyping; design for mission scenarios; the invoking of technologies that do not yet exist; and the reduction of technological and financial risks. Traditionally, dynamic systems are modeled in a monolithic way. Such monolithic models include all the data, relations, and equations necessary to represent the underlying system. With the increased complexity of these models, the monolithic approach reaches certain limits regarding, for example, model handling and maintenance. Furthermore, while the available computer power has been steadily increasing according to Moore's Law (a doubling in computational power roughly every two years), the ever-increasing complexity of new models has negated the increased resources available. Lastly, modern systems and design processes are interdisciplinary, enforcing the necessity to make models flexible enough to incorporate different modeling and design approaches. The solution for bypassing the shortcomings of monolithic models is co-simulation. In a very general sense, co-simulation addresses the issue of linking together different dynamic sub-models into a model that represents the overall, integrated dynamic system. It is therefore an important enabler for the design of interdisciplinary, interconnected, highly complex dynamic systems. While a basic co-simulation setup can be very easy, complications can arise when sub-models display behaviors such as algebraic loops, singularities, or constraints.
This work frames the co-simulation approach to modeling and simulation. It lays out the general approach to dynamic system co-simulation, and gives a comprehensive overview of what co-simulation is and what it is not. It creates a taxonomy of the requirements and limits of co-simulation, and the issues arising with co-simulating sub-models. Possible solutions towards resolving the stated problems are investigated to a certain depth. A particular focus is given to the issue of time stepping. It will be shown that for dynamic models, the selection of the simulation time step is a crucial issue with respect to computational expense, simulation accuracy, and error control. The reasons for this are discussed in depth, and a time stepping algorithm for co-simulation with unknown dynamic sub-models is proposed. Motivations and suggestions for the further treatment of selected issues are presented.
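A minimal co-simulation sketch under strong assumptions: two invented first-order sub-models advance with a fixed macro time step and exchange outputs only at communication points (Jacobi-type coupling), the simplest setting in which the time-stepping and coupling-error issues discussed arise:

```python
# Two coupled sub-models, dx/dt = -x + y and dy/dt = x - y, integrated
# separately with explicit Euler. Each uses the *previous* output of the
# other within a macro step -- the source of co-simulation coupling error.

def simulate(h=0.01, t_end=1.0):
    x, y = 1.0, 0.0                    # states of sub-models A and B
    for _ in range(round(t_end / h)):
        x_new = x + h * (-x + y)       # sub-model A's macro step
        y_new = y + h * (x - y)        # sub-model B's macro step
        x, y = x_new, y_new
    return x, y

for h in (0.1, 0.01, 0.001):
    x, y = simulate(h)
    print(h, round(x, 4), round(x + y, 6))  # x + y is conserved by this scheme
```

Shrinking h trades computational expense for accuracy; adaptive selection of this macro step when the sub-models' internals are unknown is exactly the problem the proposed time-stepping algorithm addresses.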
Supporting Space Systems Design via Systems Dependency Analysis Methodology
NASA Astrophysics Data System (ADS)
Guariniello, Cesare
The increasing size and complexity of space systems and space missions pose severe challenges to space systems engineers. When complex systems and Systems-of-Systems are involved, the behavior of the whole entity is due not only to that of the individual systems involved but also to the interactions and dependencies between them. Dependencies can be varied and complex, and designers usually do not analyze the impact of dependencies at the level of complex systems; or such analysis involves excessive computational cost; or it occurs at a later stage of the design process, after designers have already set detailed requirements following a bottom-up approach. While classical systems engineering attempts to integrate the perspectives involved across the variety of engineering disciplines and the objectives of multiple stakeholders, there is still a need for more effective tools and methods capable of identifying, analyzing, and quantifying properties of the complex system as a whole and of explicitly modeling the effect of some of the features that characterize complex systems. This research describes the development and usage of Systems Operational Dependency Analysis and Systems Developmental Dependency Analysis, two methods based on parametric models of the behavior of complex systems, one in the operational domain and one in the developmental domain. The parameters of the developed models have intuitive meaning, are usable with subjective and quantitative data alike, and give direct insight into the causes of observed, and possibly emergent, behavior. The approach proposed in this dissertation combines models of one-to-one dependencies among systems and between systems and capabilities to analyze and evaluate the impact of failures or delays on the outcome of the whole complex system. The analysis accounts for cascading effects, partial operational failures, multiple failures or delays, and partial developmental dependencies.
The user of these methods can assess the behavior of each system based on its internal status and on the topology of its dependencies on systems connected to it. Designers and decision makers can therefore quickly analyze and explore the behavior of complex systems and evaluate different architectures under various working conditions. The methods support educated decision making both in the design and in the update process of systems architecture, reducing the need to execute extensive simulations. In particular, in the phase of concept generation and selection, the information given by the methods can be used to identify promising architectures to be further tested and improved, while discarding architectures that do not show the required level of global features. The methods, when used in conjunction with appropriate metrics, also allow for improved reliability and risk analysis, as well as for automatic scheduling and re-scheduling based on the features of the dependencies and on the accepted level of risk. This dissertation illustrates the use of the two methods in sample aerospace applications, both in the operational and in the developmental domain. The applications show how to use the developed methodology to evaluate the impact of failures, assess the criticality of systems, quantify metrics of interest, quantify the impact of delays, support informed decision making when scheduling the development of systems and evaluate the achievement of partial capabilities. A larger, well-framed case study illustrates how the Systems Operational Dependency Analysis method and the Systems Developmental Dependency Analysis method can support analysis and decision making, at the mid and high level, in the design process of architectures for the exploration of Mars. The case study also shows how the methods do not replace the classical systems engineering methodologies, but support and improve them.
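A hypothetical sketch of how one-to-one operational dependencies might propagate a partial failure through a small system-of-systems. The fixed-point update rule and all numbers are invented for illustration and are not the actual Systems Operational Dependency Analysis formulation:

```python
# Each system's status in [0, 1] is capped by its own internal status and
# degraded by the worst status among its suppliers, scaled by a dependency
# strength in [0, 1]; iterate to a fixed point to capture cascades.

def propagate(internal, deps, rounds=10):
    """internal: system -> self-status; deps: system -> [(supplier, strength)]."""
    status = dict(internal)
    for _ in range(rounds):
        status = {
            s: min([internal[s]] +
                   [1 - k * (1 - status[d]) for d, k in deps.get(s, [])])
            for s in internal
        }
    return status

# Invented example: a partial power failure cascades to comms and science.
internal = {"power": 0.2, "comms": 1.0, "science": 1.0}
deps = {"comms": [("power", 0.9)],
        "science": [("comms", 0.5), ("power", 0.8)]}
print({k: round(v, 2) for k, v in propagate(internal, deps).items()})
# → {'power': 0.2, 'comms': 0.28, 'science': 0.36}
```

Even this toy version shows the method's appeal for architecture trades: changing a single dependency strength immediately reprices the downstream impact without re-running a full simulation.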
Intermittent dynamics in complex systems driven to depletion.
Escobar, Juan V; Pérez Castillo, Isaac
2018-03-19
When complex systems are driven to depletion by some external factor, their non-stationary dynamics can present intermittent behaviour between relative tranquility and bursts of activity whose consequences are often catastrophic. To understand and ultimately be able to predict such dynamics, we propose an underlying mechanism based on sharp thresholds of a local generalized energy density that naturally leads to negative feedback. We find a transition from a continuous regime to an intermittent one, in which avalanches can be predicted despite the stochastic nature of the process. This model may have applications in many natural and social complex systems where a rapid depletion of resources or generalized energy drives the dynamics. In particular, we show how this model accurately describes the time evolution and avalanches present in a real social system.
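A toy version of the proposed mechanism can be sketched as a driven, dissipative threshold model in which sites release a generalized energy to their neighbours once a sharp local threshold is crossed; all parameters are invented and the model is only loosely inspired by the abstract:

```python
import random

# Sites accumulate "generalized energy" under a slow external drive; a
# site crossing the threshold topples, passing energy to neighbours with
# some dissipation, which can trigger avalanches of further topplings.

def run(n=50, threshold=1.0, drive=0.002, steps=20000, seed=3):
    rng = random.Random(seed)
    e = [rng.uniform(0, threshold) for _ in range(n)]
    avalanches = []
    for _ in range(steps):
        e[rng.randrange(n)] += drive            # slow external driving
        size = 0
        active = [i for i in range(n) if e[i] >= threshold]
        while active:                           # relax until all stable
            i = active.pop()
            if e[i] < threshold:
                continue
            size += 1
            e[i] -= threshold
            if e[i] >= threshold:
                active.append(i)                # may topple again
            for j in ((i - 1) % n, (i + 1) % n):
                e[j] += 0.45 * threshold        # 10% dissipated per toppling
                if e[j] >= threshold:
                    active.append(j)
        if size:
            avalanches.append(size)
    return avalanches

sizes = run()
print(len(sizes), max(sizes))                   # quiet steps vs. bursts
```

Most drive steps produce no activity at all, punctuated by occasional multi-site avalanches, which is the tranquility/burst intermittency the abstract describes.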
Models-3 is a flexible software system designed to simplify the development and use of environmental assessment and other decision support tools. It is designed for applications ranging from regulatory and policy analysis to understanding the complex interactions of atmospheri...
A protocol for parameterization and calibration of RZWQM2 in field research
USDA-ARS?s Scientific Manuscript database
Use of agricultural system models in field research requires a full understanding of both the model and the system it simulates. Since the 1960s, agricultural system models have increased tremendously in their complexity due to greater understanding of the processes simulated, their application to r...
Advances In High Temperature (Viscoelastoplastic) Material Modeling for Thermal Structural Analysis
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Saleeb, Atef F.
2005-01-01
Typical high temperature applications demand high performance materials: 1) complex thermomechanical loading; 2) complex material response, requiring time-dependent/hereditary (viscoelastic/viscoplastic) models; and 3) comprehensive characterization (tensile, creep, relaxation) for a variety of material systems.
A program code generator for multiphysics biological simulation using markup languages.
Amano, Akira; Kawabata, Masanari; Yamashita, Yoshiharu; Rusty Punzalan, Florencio; Shimayoshi, Takao; Kuwabara, Hiroaki; Kunieda, Yoshitoshi
2012-01-01
To cope with the complexity of biological function simulation models, model representation with description languages is becoming popular. However, the simulation software itself becomes complex in these environments, making it difficult to modify the simulation conditions, the target computation resources, or the calculation methods. Complex biological function simulation software comprises 1) model equations, 2) boundary conditions, and 3) calculation schemes. A model description file is useful for the first point, and partly for the second, but the third is difficult to handle because of the various calculation schemes required by simulation models constructed from two or more elementary models. We introduce a simulation-software generation system that uses a description-language-based specification of the coupling calculation scheme together with a cell model description file. With this software, we can easily generate biological simulation code with a variety of coupling calculation schemes. To show the efficiency of our system, an example of a coupling calculation scheme with three elementary models is shown.
How do precision medicine and system biology response to human body's complex adaptability?
Yuan, Bing
2016-12-01
In the field of life sciences, although systems biology and "precision medicine" introduce some complex scientific methods and techniques, they are still based, as a whole, on the "analysis-reconstruction" concept of reductionist theory. The adaptability of a complex system increases the uncertainty of system behaviour as well as the difficulty of precise identification and control, and it also puts systems biology research in difficulty. To grasp the behaviour and characteristics of organisms fundamentally, systems biology has to abandon the "analysis-reconstruction" concept. In accordance with the guidelines of complexity science, systems biology should build organism models at the holistic level, as Chinese medicine does in dealing with the human body and disease. When we study the living body at the holistic level, we find that the adaptability of a complex system is not an obstacle that increases the difficulty of problem solving; it is the "exceptional" "right-hand man" that helps us deal with the complexity of life more effectively.
Can Models Capture the Complexity of the Systems Engineering Process?
NASA Astrophysics Data System (ADS)
Boppana, Krishna; Chow, Sam; de Weck, Olivier L.; Lafon, Christian; Lekkakos, Spyridon D.; Lyneis, James; Rinaldi, Matthew; Wang, Zhiyong; Wheeler, Paul; Zborovskiy, Marat; Wojcik, Leonard A.
Many large-scale, complex systems engineering (SE) programs have been problematic; a few examples are listed below (Bar-Yam, 2003; Cullen, 2004), and many others have been late, well over budget, or have failed: the Hilton/Marriott/American Airlines system for hotel reservations and flights, 1988-1992, $125 million, "scrapped"
Metainference: A Bayesian inference method for heterogeneous systems.
Bonomi, Massimiliano; Camilloni, Carlo; Cavalli, Andrea; Vendruscolo, Michele
2016-01-01
Modeling a complex system is almost invariably a challenging task. The incorporation of experimental observations can be used to improve the quality of a model and thus to obtain better predictions about the behavior of the corresponding system. This approach, however, is affected by a variety of different errors, especially when a system simultaneously populates an ensemble of different states and experimental data are measured as averages over such states. To address this problem, we present a Bayesian inference method, called "metainference," that is able to deal with errors in experimental measurements and with experimental measurements averaged over multiple states. To achieve this goal, metainference models a finite sample of the distribution of models using a replica approach, in the spirit of the replica-averaging modeling based on the maximum entropy principle. To illustrate the method, we present its application to a heterogeneous model system and to the determination of an ensemble of structures corresponding to the thermal fluctuations of a protein molecule. Metainference thus provides an approach to modeling complex systems with heterogeneous components and interconverting between different states by taking into account all possible sources of errors.
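The replica idea behind metainference can be illustrated with a minimal, self-contained sketch (not the authors' implementation; the two-state system, replica count, and error levels below are invented for illustration). A small set of replicas is sampled by Metropolis Monte Carlo, with a Gaussian restraint coupling the replica average to a noisy, state-averaged measurement:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "experiment": the system interconverts between two states (values
# 1.0 and 3.0); the measurement is the ensemble average plus noise.
true_states = np.array([1.0, 3.0])
data = true_states.mean() + rng.normal(0.0, 0.05)

N = 8                 # replicas modelling the state distribution
sigma_data = 0.1      # assumed total error (measurement + averaging)
prior_width = 2.0     # weak prior keeping replicas in a plausible range

def neg_log_posterior(x):
    # Gaussian likelihood on the replica average vs the averaged datum,
    # plus an independent Gaussian prior on each replica.
    like = 0.5 * ((x.mean() - data) / sigma_data) ** 2
    prior = 0.5 * np.sum((x / prior_width) ** 2)
    return like + prior

# Metropolis sampling over the joint replica space
x = rng.normal(2.0, 1.0, size=N)
E = neg_log_posterior(x)
samples = []
for step in range(20000):
    prop = x.copy()
    prop[rng.integers(N)] += rng.normal(0.0, 0.3)
    E_new = neg_log_posterior(prop)
    if np.log(rng.random()) < E - E_new:
        x, E = prop, E_new
    if step >= 5000:
        samples.append(x.mean())

posterior_mean = float(np.mean(samples))
print(posterior_mean)
```

Because the likelihood acts on the replica average rather than on each replica individually, the sampled ensemble remains free to spread over states while its mean tracks the averaged datum, which is the essence of the replica-averaging scheme.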
NASA Astrophysics Data System (ADS)
Wlodarczyk, Jakub; Kierdaszuk, Borys
2005-08-01
Decays of tyrosine fluorescence in protein-ligand complexes are described by a model with a continuous distribution of fluorescence lifetimes. The resulting analytical power-like decay function provides good fits to highly complex fluorescence kinetics. Moreover, it is a manifestation of the so-called Tsallis q-exponential function, which is suitable for describing systems with long-range interactions, memory effects, and fluctuations of the characteristic fluorescence lifetime. The proposed decay functions were applied to the analysis of tyrosine fluorescence decays in a protein, the enzyme purine nucleoside phosphorylase from E. coli (the product of the deoD gene), free in aqueous solution and in a complex with formycin A (an inhibitor) and orthophosphate (a co-substrate). The power-like function provides new information about enzyme-ligand complex formation based on a physically justified heterogeneity parameter directly related to the lifetime distribution. A measure of the heterogeneity parameter in the enzyme systems is provided by the variance of the fluorescence lifetime distribution. The possible number of deactivation channels and the excited-state mean lifetime can easily be derived without a priori knowledge of the complexity of the studied system. Moreover, the proposed model is simpler than the traditional multi-exponential one and better describes the heterogeneous nature of the studied systems.
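A q-exponential decay of this kind can be fitted with standard tools. The sketch below uses synthetic data, invented lifetimes, and a common parameterization of the Tsallis q-exponential (not necessarily the authors' exact function), recovering a heterogeneity parameter q > 1 from a bi-exponential, i.e. heterogeneous, decay:

```python
import numpy as np
from scipy.optimize import curve_fit

def q_decay(t, I0, tau, q):
    # Tsallis q-exponential decay; recovers I0 * exp(-t / tau) as q -> 1.
    return I0 * (1.0 + (q - 1.0) * t / tau) ** (-1.0 / (q - 1.0))

# Synthetic heterogeneous decay: a sum of two exponential lifetimes stands
# in for a continuous lifetime distribution.
t = np.linspace(0.0, 20.0, 400)
signal = 0.6 * np.exp(-t / 1.5) + 0.4 * np.exp(-t / 4.0)

popt, _ = curve_fit(q_decay, t, signal, p0=(1.0, 2.0, 1.2),
                    bounds=([0.1, 0.1, 1.001], [10.0, 10.0, 3.0]))
I0, tau, q = popt
# q > 1 signals lifetime heterogeneity: the variance of the implied
# lifetime distribution grows with (q - 1).
print(I0, tau, q)
```

The fitted q rises above 1 precisely because the underlying decay mixes lifetimes, mirroring the abstract's point that the heterogeneity parameter tracks the variance of the lifetime distribution.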
Ward, Marie; McDonald, Nick; Morrison, Rabea; Gaynor, Des; Nugent, Tony
2010-02-01
Aircraft maintenance is a highly regulated, safety critical, complex and competitive industry. There is a need to develop innovative solutions to address process efficiency without compromising safety and quality. This paper presents the case that in order to improve a highly complex system such as aircraft maintenance, it is necessary to develop a comprehensive and ecologically valid model of the operational system, which represents not just what is meant to happen, but what normally happens. This model then provides the backdrop against which to change or improve the system. A performance report, the Blocker Report, specific to aircraft maintenance and related to the model was developed gathering data on anything that 'blocks' task or check performance. A Blocker Resolution Process was designed to resolve blockers and improve the current check system. Significant results were obtained for the company in the first trial and implications for safety management systems and hazard identification are discussed. Statement of Relevance: Aircraft maintenance is a safety critical, complex, competitive industry with a need to develop innovative solutions to address process and safety efficiency. This research addresses this through the development of a comprehensive and ecologically valid model of the system linked with a performance reporting and resolution system.
Meta II: Multi-Model Language Suite for Cyber Physical Systems
2013-03-01
AVM (META) projects have developed tools for designing cyber-physical (CPS, or mechatronic) systems. These systems are increasingly complex, take much... Exemplified by modern amphibious and ground military... and parametric interface of Simulink models and defines associations with CyPhy components and component interfaces. 2. Embedded Systems Modeling
A survey of fuzzy logic monitoring and control utilisation in medicine.
Mahfouf, M; Abbod, M F; Linkens, D A
2001-01-01
Intelligent systems have appeared in many technical areas, such as consumer electronics, robotics and industrial control systems. Many of these intelligent systems are based on fuzzy control strategies, which describe the mathematical models of complex systems in terms of linguistic rules. Since the 1980s new techniques have emerged, and fuzzy logic has been applied extensively in medical systems. The justification for such intelligent-system-driven solutions is that biological systems are so complex that developing computerised systems within such environments is not always a straightforward exercise. In practice, a precise model of a biological system may not exist, or the system may be too difficult to model. In most cases fuzzy logic is considered an ideal tool, as human minds work from approximate data, extract meaningful information and produce crisp solutions. This paper surveys the utilisation of fuzzy logic control and monitoring in the medical sciences, with an analysis of its possible future penetration.
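A minimal Mamdani-style fuzzy controller illustrates how linguistic rules replace a precise model; the membership functions, the two rules, and the blood-pressure framing below are all invented for illustration:

```python
def tri(x, a, b, c):
    # Triangular membership function peaking at b.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def infer(error):
    # Two linguistic rules relating a (normalized) blood-pressure error to
    # a drug infusion rate -- hypothetical shapes and rules:
    #   IF error is LOW  THEN rate is HIGH
    #   IF error is HIGH THEN rate is LOW
    mu_low = tri(error, -1.0, 0.0, 1.0)
    mu_high = tri(error, 0.0, 1.0, 2.0)
    # Mamdani inference: clip each output set by its rule strength, then
    # defuzzify the aggregate with a discrete centroid over [0, 1].
    num = den = 0.0
    for i in range(101):
        r = i / 100.0
        mu = max(min(mu_low, tri(r, 0.5, 1.0, 1.5)),     # HIGH rate set
                 min(mu_high, tri(r, -0.5, 0.0, 0.5)))   # LOW rate set
        num += r * mu
        den += mu
    return num / den if den else 0.0

print(infer(0.2), infer(0.9))  # small error -> higher rate than large error
```

No model equations appear anywhere: the controller's behaviour is determined entirely by the rule base and membership shapes, which is why the approach suits systems that are too complex to model precisely.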
Simulations of Sea Level Rise Effects on Complex Coastal Systems
NASA Astrophysics Data System (ADS)
Niedoroda, A. W.; Ye, M.; Saha, B.; Donoghue, J. F.; Reed, C. W.
2009-12-01
It is now established that complex coastal systems with elements such as beaches, inlets, bays, and rivers adjust their morphologies according to time-varying balances between the processes that control the exchange of sediment. Accelerated sea level rise introduces a major perturbation into these sediment-sharing systems. A modeling framework based on the new SL-PR model, an advanced version of the aggregate-scale CST Model, combined with the event-scale CMS-2D and CMS-Wave models, has been used to simulate the recent evolution of a portion of the Florida panhandle coast. This combination of models provides a method to evaluate coefficients in the aggregate-scale model that were previously treated as fitted parameters. That is, by carrying out simulations of a complex coastal system with runs of the event-scale model spanning more than a year, it is now possible to directly relate the coefficients in the large-scale SL-PR model to measurable physical parameters in the current and wave fields. This cross-scale modeling procedure has been used to simulate the shoreline evolution at Santa Rosa Island, a long barrier island housing significant military infrastructure on the northern Gulf Coast. The model has been used to simulate 137 years of measured shoreline change and to extend these results to predictions of future rates of shoreline migration.
THE EPA MULTIMEDIA INTEGRATED MODELING SYSTEM SOFTWARE SUITE
The U.S. EPA is developing a Multimedia Integrated Modeling System (MIMS) framework that will provide a software infrastructure or environment to support constructing, composing, executing, and evaluating complex modeling studies. The framework will include (1) common software ...
On the impact of communication complexity in the design of parallel numerical algorithms
NASA Technical Reports Server (NTRS)
Gannon, D.; Vanrosendale, J.
1984-01-01
This paper describes two models of the cost of data movement in parallel numerical algorithms. One model is a generalization of an approach due to Hockney, and is suitable for shared memory multiprocessors where each processor has vector capabilities. The other model is applicable to highly parallel nonshared memory MIMD systems. In the second model, algorithm performance is characterized in terms of the communication network design. Techniques used in VLSI complexity theory are also brought in, and algorithm independent upper bounds on system performance are derived for several problems that are important to scientific computation.
On the impact of communication complexity on the design of parallel numerical algorithms
NASA Technical Reports Server (NTRS)
Gannon, D. B.; Van Rosendale, J.
1984-01-01
This paper describes two models of the cost of data movement in parallel numerical algorithms. One model is a generalization of an approach due to Hockney, and is suitable for shared memory multiprocessors where each processor has vector capabilities. The other model is applicable to highly parallel nonshared memory MIMD systems. In this second model, algorithm performance is characterized in terms of the communication network design. Techniques used in VLSI complexity theory are also brought in, and algorithm-independent upper bounds on system performance are derived for several problems that are important to scientific computation.
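The Hockney-style cost model referred to above is easy to state concretely. In the sketch below the latency and bandwidth values are assumed, not taken from the paper:

```python
# Hockney-style model of one message transfer: T(n) = t0 + n / r_inf,
# with start-up latency t0 and asymptotic bandwidth r_inf (values assumed).
t0 = 5.0e-6        # seconds
r_inf = 1.0e9      # bytes / second

def transfer_time(n_bytes):
    return t0 + n_bytes / r_inf

def effective_bandwidth(n_bytes):
    return n_bytes / transfer_time(n_bytes)

# n_half: the message size at which half the asymptotic bandwidth is
# reached; it summarizes how badly latency penalizes short messages.
n_half = r_inf * t0
print(n_half, effective_bandwidth(n_half) / r_inf)
```

For these assumed values n_half is 5000 bytes: messages much shorter than this are latency-dominated, which is exactly the effect that favors algorithms exchanging fewer, larger messages.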
NASA Astrophysics Data System (ADS)
Nuh, M. Z.; Nasir, N. F.
2017-08-01
Biodiesel is a fuel comprised of mono-alkyl esters of long-chain fatty acids derived from renewable lipid feedstocks, such as vegetable oils and animal fats. Biodiesel production is a complex process which needs systematic design and optimization. However, no case study has applied the process systems engineering (PSE) element of superstructure optimization to the batch process, which involves complex problems and uses mixed-integer nonlinear programming (MINLP). PSE offers a solution to complex engineering systems by enabling the use of viable tools and techniques to better manage and comprehend the complexity of the system. This study aims, first, to apply PSE tools to the simulation and optimization of the biodiesel process and to develop mathematical models for the components of the plant for cases A, B and C using published kinetic data; secondly, to determine an economic analysis for biodiesel production, focusing on heterogeneous catalysts; and finally, to develop the superstructure for biodiesel production using a heterogeneous catalyst. The mathematical models are developed from the superstructure, and the resulting mixed-integer nonlinear model is solved and the economic analysis estimated using MATLAB software. The optimization, with the objective function of minimizing the annual production cost of the batch process, yields 23.2587 million USD for case C. Overall, this process systems engineering study has optimized the modelling, design and cost estimation, helping to solve the complexity of batch biodiesel production and processing.
Quantifying the Adaptive Cycle
The adaptive cycle was proposed as a conceptual model to portray patterns of change in complex systems. Despite the model having potential for elucidating change across systems, it has been used mainly as a metaphor, describing system dynamics qualitatively. We use a quantitative...
The deconvolution of complex spectra by artificial immune system
NASA Astrophysics Data System (ADS)
Galiakhmetova, D. I.; Sibgatullin, M. E.; Galimullin, D. Z.; Kamalova, D. I.
2017-11-01
An application of the artificial immune system method to the decomposition of complex spectra is presented. The decomposition of a model contour consisting of three Gaussian components is demonstrated. The artificial immune system is an optimization method based on the behaviour of the biological immune system, and belongs to the modern family of heuristic search and optimization methods.
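A clonal-selection sketch (one common artificial-immune-system variant; the population sizes, mutation amplitudes, and three-Gaussian test contour below are invented) shows such an optimization loop applied to decomposing a model contour:

```python
import numpy as np

rng = np.random.default_rng(1)

# Model contour: a sum of three Gaussian components, as in the abstract.
x = np.linspace(0.0, 10.0, 200)

def contour(p):
    y = np.zeros_like(x)
    for A, m, s in p.reshape(3, 3):      # p = [A1, m1, s1, A2, m2, s2, ...]
        y += A * np.exp(-0.5 * ((x - m) / s) ** 2)
    return y

target = contour(np.array([1.0, 2.0, 0.6, 0.8, 5.0, 0.8, 0.5, 7.5, 0.5]))

def error(p):
    return float(np.sum((contour(p) - target) ** 2))

# Clonal selection: rank antibodies, clone the best, hypermutate the clones
# (better-ranked parents mutate less), and let good clones replace the worst.
pop = rng.uniform([0.0, 0.0, 0.2] * 3, [2.0, 10.0, 2.0] * 3, size=(30, 9))
init_err = min(error(p) for p in pop)
for gen in range(200):
    pop = pop[np.argsort([error(p) for p in pop])]
    clones = np.array([parent + rng.normal(0.0, 0.3 * (rank + 1) / 10.0, 9)
                       for rank, parent in enumerate(pop[:10])
                       for _ in range(3)])
    best_clones = clones[np.argsort([error(p) for p in clones])[:20]]
    pop[-20:] = best_clones              # elitist: the best parents survive

final_err = min(error(p) for p in pop)
print(init_err, final_err)
```

The elitist replacement guarantees the best antibody never degrades, so the fit error decreases monotonically while the hypermutated clones explore around the current best decompositions.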
System Thinking and Feeding Relations: Learning with a Live Ecosystem Model
ERIC Educational Resources Information Center
Eilam, Billie
2012-01-01
Considering well-documented difficulties in mastering ecology concepts and system thinking, the aim of the study was to examine 9th graders' understanding of the complex, multilevel, systemic construct of feeding relations, nested within a larger system of a live model. Fifty students interacted with the model and manipulated a variable within it…
Human performance cognitive-behavioral modeling: a benefit for occupational safety.
Gore, Brian F
2002-01-01
Human Performance Modeling (HPM) is a computer-aided job analysis software methodology used to generate predictions of complex human-automation integration and system flow patterns with the goal of improving operator and system safety. The use of HPM tools has recently been increasing due to reductions in computational cost, augmentations in the tools' fidelity, and usefulness in the generated output. An examination of an Air Man-machine Integration Design and Analysis System (Air MIDAS) model evaluating complex human-automation integration currently underway at NASA Ames Research Center will highlight the importance to occupational safety of considering both cognitive and physical aspects of performance when researching human error.
Human performance cognitive-behavioral modeling: a benefit for occupational safety
NASA Technical Reports Server (NTRS)
Gore, Brian F.
2002-01-01
Human Performance Modeling (HPM) is a computer-aided job analysis software methodology used to generate predictions of complex human-automation integration and system flow patterns with the goal of improving operator and system safety. The use of HPM tools has recently been increasing due to reductions in computational cost, augmentations in the tools' fidelity, and usefulness in the generated output. An examination of an Air Man-machine Integration Design and Analysis System (Air MIDAS) model evaluating complex human-automation integration currently underway at NASA Ames Research Center will highlight the importance to occupational safety of considering both cognitive and physical aspects of performance when researching human error.
NASA Astrophysics Data System (ADS)
Christensen, Claire Petra
Across diverse fields ranging from physics to biology, sociology, and economics, the technological advances of the past decade have engendered an unprecedented explosion of data on highly complex systems with thousands, if not millions of interacting components. These systems exist at many scales of size and complexity, and it is becoming ever-more apparent that they are, in fact, universal, arising in every field of study. Moreover, they share fundamental properties---chief among these, that the individual interactions of their constituent parts may be well-understood, but the characteristic behaviour produced by the confluence of these interactions---by these complex networks---is unpredictable; in a nutshell, the whole is more than the sum of its parts. There is, perhaps, no better illustration of this concept than the discoveries being made regarding complex networks in the biological sciences. In particular, though the sequencing of the human genome in 2003 was a remarkable feat, scientists understand that the "cellular-level blueprints" for the human being are cellular-level parts lists, but they say nothing (explicitly) about cellular-level processes. The challenge of modern molecular biology is to understand these processes in terms of the networks of parts---in terms of the interactions among proteins, enzymes, genes, and metabolites---as it is these processes that ultimately differentiate animate from inanimate, giving rise to life! It is the goal of systems biology---an umbrella field encapsulating everything from molecular biology to epidemiology in social systems---to understand processes in terms of fundamental networks of core biological parts, be they proteins or people. By virtue of the fact that there are literally countless complex systems, not to mention tools and techniques used to infer, simulate, analyze, and model these systems, it is impossible to give a truly comprehensive account of the history and study of complex systems. 
The author's own publications have contributed network inference, simulation, modeling, and analysis methods to the much larger body of work in systems biology, and indeed, in network science. The aim of this thesis is therefore twofold: to present this original work in the historical context of network science, but also to provide sufficient review and reference regarding complex systems (with an emphasis on complex networks in systems biology) and tools and techniques for their inference, simulation, analysis, and modeling, such that the reader will be comfortable in seeking out further information on the subject. The review-like Chapters 1, 2, and 4 are intended to convey the co-evolution of network science and the slow but noticeable breakdown of boundaries between disciplines in academia as research and comparison of diverse systems has brought to light the shared properties of these systems. It is the author's hope that these chapters impart some sense of the remarkable and rapid progress in complex systems research that has led to this unprecedented academic synergy. Chapters 3 and 5 detail the author's original work in the context of complex systems research. Chapter 3 presents the methods and results of a two-stage modeling process that generates candidate gene-regulatory networks of the bacterium B. subtilis from experimentally obtained, yet mathematically underdetermined microchip array data. These networks are then analyzed from a graph theoretical perspective, and their biological viability is critiqued by comparing the networks' graph theoretical properties to those of other biological systems. The results of topological perturbation analyses revealing commonalities in behavior at multiple levels of complexity are also presented, and are shown to be an invaluable means by which to ascertain the level of complexity to which the network inference process is robust to noise. 
Chapter 5 outlines a learning algorithm for the development of a realistic, evolving social network (a city) into which a disease is introduced. The results of simulations in populations spanning two orders of magnitude are compared to prevaccine era measles data for England and Wales and demonstrate that the simulations are able to capture the quantitative and qualitative features of epidemics in populations as small as 10,000 people. The work presented in Chapter 5 validates the utility of network simulation in concurrently probing contact network dynamics and disease dynamics.
NASA Astrophysics Data System (ADS)
Gosses, Moritz; Nowak, Wolfgang; Wöhling, Thomas
2017-04-01
Physically-based modeling is a widespread tool in the understanding and management of natural systems. With the high complexity of many such models and the huge number of model runs necessary for parameter estimation and uncertainty analysis, overall run times can be prohibitively long even on modern computer systems. An encouraging strategy to tackle this problem is model reduction. In this contribution, we compare different proper orthogonal decomposition (POD, Siade et al. (2010)) methods and their potential applications to groundwater models. The POD method performs a singular value decomposition on system states as simulated by the complex (e.g., PDE-based) groundwater model taken at several time-steps, so-called snapshots. The singular vectors with the highest information content resulting from this decomposition are then used as a basis for projection of the system of model equations onto a subspace of much lower dimensionality than the original complex model, thereby greatly reducing complexity and accelerating run times. In its original form, this method is only applicable to linear problems. Many real-world groundwater models are non-linear, though. These non-linearities are introduced either through model structure (unconfined aquifers) or boundary conditions (certain Cauchy boundaries, such as rivers with variable connection to the groundwater table). To date, applications of POD have focused on groundwater models simulating pumping tests in confined aquifers with constant-head boundaries. In contrast, POD model reduction either loses much of its accuracy or does not significantly reduce model run time when the above-mentioned non-linearities are introduced. We have also found that variable Dirichlet boundaries are problematic for POD model reduction. An extension to the POD method, called POD-DEIM, has been developed for non-linear groundwater models by Stanko et al. (2016). 
This method uses spatial interpolation points to build the equation system in the reduced model space, allowing the recalculation of system matrices at every time-step, as required for non-linear models, while retaining the speed of the reduced model. This makes POD-DEIM applicable to groundwater models simulating unconfined aquifers. However, in our analysis, the method struggled to reproduce variable river boundaries accurately and gave no advantage over the original POD method for variable Dirichlet boundaries. We have developed another extension to POD that aims to address these remaining problems by performing a second POD operation on the model matrix on the left-hand side of the equation. The method aims at least to reproduce the accuracy of the other methods where they are applicable, while outperforming them for setups with changing river boundaries or variable Dirichlet boundaries. We compared the new extension with the original POD and POD-DEIM for different combinations of model structures and boundary conditions. The new method shows the potential of POD extensions for applications to non-linear groundwater systems and complex boundary conditions that go beyond the current, relatively limited range of applications. References: Siade, A. J., Putti, M., and Yeh, W. W.-G. (2010). Snapshot selection for groundwater model reduction using proper orthogonal decomposition. Water Resour. Res., 46(8):W08539. Stanko, Z. P., Boyce, S. E., and Yeh, W. W.-G. (2016). Nonlinear model reduction of unconfined groundwater flow using POD and DEIM. Advances in Water Resources, 97:130-143.
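The core POD step described above (collect snapshots, take an SVD, project onto the leading singular vectors) can be sketched on a toy linear problem; the 1-D diffusion model below merely stands in for a linear groundwater model, and all numbers are illustrative:

```python
import numpy as np

# Toy linear "groundwater" model: 1-D diffusion of an initial head mound,
# solved with a stable explicit scheme (all values illustrative).
n, steps, dt, dx, D = 100, 400, 0.1, 1.0, 0.4
h = np.exp(-0.5 * ((np.arange(n) - 30.0) / 5.0) ** 2)
snapshots = []
for t in range(steps):
    h[1:-1] += D * dt / dx**2 * (h[2:] - 2.0 * h[1:-1] + h[:-2])
    if t % 10 == 0:
        snapshots.append(h.copy())
S = np.array(snapshots).T                      # n x m snapshot matrix

# POD: SVD of the snapshot matrix; columns of U are the POD modes.
U, s, Vt = np.linalg.svd(S, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
k = int(np.searchsorted(energy, 0.9999)) + 1   # modes for 99.99% "energy"
Phi = U[:, :k]                                 # reduced projection basis

# Project a full state onto the k-dimensional basis and reconstruct it.
full_state = S[:, -1]
recon = Phi @ (Phi.T @ full_state)
err = np.linalg.norm(recon - full_state) / np.linalg.norm(full_state)
print(k, err)
```

For this smooth linear problem the singular values decay rapidly, so a handful of modes captures the dynamics almost exactly; the non-linearities discussed in the abstract break precisely this low-rank structure, which is what POD-DEIM and the proposed extension try to recover.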
Model Based Autonomy for Robust Mars Operations
NASA Technical Reports Server (NTRS)
Kurien, James A.; Nayak, P. Pandurang; Williams, Brian C.; Lau, Sonie (Technical Monitor)
1998-01-01
Space missions have historically relied upon a large ground staff, numbering in the hundreds for complex missions, to maintain routine operations. When an anomaly occurs, this small army of engineers attempts to identify and work around the problem. A piloted Mars mission, with its multiyear duration, cost pressures, half-hour communication delays and two-week blackouts, cannot be closely controlled by a battalion of engineers on Earth. Flight crew involvement in routine system operations must also be minimized to maximize science return. It may also be unrealistic to require that the crew have the expertise in each mission subsystem needed to diagnose a system failure and effect a timely repair, as engineers did for Apollo 13. Enter model-based autonomy, which allows complex systems to autonomously maintain operation despite failures or anomalous conditions, contributing to safe, robust, and minimally supervised operation of spacecraft, life support, In Situ Resource Utilization (ISRU) and power systems. Autonomous reasoning is central to the approach. A reasoning algorithm uses a logical or mathematical model of a system to infer how to operate the system, diagnose failures and generate appropriate behavior to repair or reconfigure the system in response. The 'plug and play' nature of the models enables low-cost development of autonomy for multiple platforms. Declarative, reusable models capture relevant aspects of the behavior of simple devices (e.g. valves or thrusters). Reasoning algorithms combine device models to create a model of the system-wide interactions and behavior of a complex, unique artifact such as a spacecraft. Rather than requiring engineers to anticipate all possible interactions and failures at design time or to perform analysis during the mission, the reasoning engine generates the appropriate response to the current situation, taking into account its system-wide knowledge, the current state, and even sensor failures or unexpected behavior.
Analysis hierarchical model for discrete event systems
NASA Astrophysics Data System (ADS)
Ciortea, E. M.
2015-11-01
This paper presents a hierarchical model based on discrete-event networks for robotic systems. In the hierarchical approach, the Petri net is analysed as a network spanning the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for the modelling and control of complex robotic systems. Such a system is structured, controlled and analysed in this paper using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented and analysed on computers using specialized programs. Implementation of the hierarchical discrete-event model as a real-time operating system on a computer network connected via a serial bus is possible, with each computer dedicated to the local Petri model of one subsystem of the global robotic system. Since Petri models are simple to apply on general-purpose computers, the analysis, modelling and control of complex manufacturing systems can be achieved using Petri nets. Discrete-event modelling is a pragmatic tool for industrial systems, and Petri nets suit our system because it is discrete-event in nature. To highlight auxiliary times, the Petri model of the transport stream is divided into hierarchical levels and its sections are analysed successively. Simulating the proposed robotic system with timed Petri nets offers the opportunity to observe the timing of the robotic system. From spot measurements of the transport and transmission times of goods, graphics are obtained showing the average time for the transport activity for sets of finished products.
NASA Astrophysics Data System (ADS)
Adabanija, M. A.; Omidiora, E. O.; Olayinka, A. I.
2008-05-01
A linguistic fuzzy logic system (LFLS)-based expert system model has been developed for the assessment of aquifers for the location of productive water boreholes in a crystalline basement complex. The model design employed a multiple input/single output (MISO) approach with geoelectrical parameters and topographic features as input variables and control crisp value as the output. The application of the method to the data acquired in Khondalitic terrain, a basement complex in Vizianagaram District, south India, shows that potential groundwater resource zones that have control output values in the range 0.3295-0.3484 have a yield greater than 6,000 liters per hour (LPH). The range 0.3174-0.3226 gives a yield less than 4,000 LPH. The validation of the control crisp value using data acquired from Oban Massif, a basement complex in southeastern Nigeria, indicates a yield less than 3,000 LPH for control output values in the range 0.2938-0.3065. This validation corroborates the ability of control output values to predict a yield, thereby vindicating the applicability of linguistic fuzzy logic system in siting productive water boreholes in a basement complex.
A self-cognizant dynamic system approach for prognostics and health management
NASA Astrophysics Data System (ADS)
Bai, Guangxing; Wang, Pingfeng; Hu, Chao
2015-03-01
Prognostics and health management (PHM) is an emerging engineering discipline that diagnoses and predicts how and when a system will degrade in performance and lose partial or complete functionality. Due to the complexity and invisibility of the rules and states of most dynamic systems, developing an effective approach to track evolving system states is a major challenge. This paper presents a new self-cognizant dynamic system (SCDS) approach that incorporates artificial intelligence into dynamic system modeling for PHM. A feed-forward neural network (FFNN) is selected to approximate the complex system response, a challenging task in general because the underlying system physics is inaccessible. The trained FFNN model is then embedded into a dual extended Kalman filter algorithm to track the system dynamics. A recursive computation technique for updating the FFNN model with online measurements is also derived. To validate the proposed SCDS approach, a battery dynamic system is considered as an experimental application. After modeling the battery system with a FFNN model and a state-space model, the state-of-charge (SoC) and state-of-health (SoH) are estimated by updating the FFNN model using the proposed approach. Experimental results suggest that the proposed approach improves the efficiency and accuracy of battery health management.
Modeling complex tone perception: grouping harmonics with combination-sensitive neurons.
Medvedev, Andrei V; Chiao, Faye; Kanwal, Jagmeet S
2002-06-01
Perception of complex communication sounds is a major function of the auditory system. To create a coherent percept of these sounds, the auditory system may instantaneously group or bind multiple harmonics within complex sounds. This perceptual strategy simplifies further processing of complex sounds and facilitates their meaningful integration with other sensory inputs. Based on experimental data and a realistic model, we propose that associative learning of combinations of harmonic frequencies and nonlinear facilitation of responses to those combinations, also referred to as "combination-sensitivity," are important for spectral grouping. For our model, we simulated combination sensitivity using Hebbian and associative types of synaptic plasticity in auditory neurons. We also provided a parallel tonotopic input that converges and diverges within the network. Neurons in higher-order layers of the network exhibited an emergent property of multifrequency tuning that is consistent with experimental findings. Furthermore, this network had the capacity to "recognize" the pitch or fundamental frequency of a harmonic tone complex even when the fundamental frequency itself was missing.
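The proposed mechanism (Hebbian strengthening of co-occurring harmonic inputs yielding combination sensitivity and missing-fundamental recognition) can be caricatured in a few lines; the channel coding, learning rates, and decay term below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n_chan = 40                        # tonotopic input channels

def harmonic_stack(f0):
    # Activate every channel at an integer multiple of f0 (the channel
    # index stands in for frequency); a purely illustrative coding.
    x = np.zeros(n_chan)
    x[np.arange(f0, n_chan, f0)] = 1.0
    return x

w = np.full(n_chan, 0.05)          # initial synaptic weights
eta, decay = 0.02, 0.005
for _ in range(300):               # repeated exposure to an f0 = 4 complex
    x = harmonic_stack(4) + rng.normal(0.0, 0.01, n_chan)
    y = max(0.0, float(w @ x))     # rectified postsynaptic response
    w += eta * y * x - decay * w   # Hebbian growth with passive decay
    w = np.clip(w, 0.0, 1.0)

# The neuron now responds to the learned harmonic combination even with
# its fundamental removed ("missing fundamental"), but only weakly to an
# unrelated harmonic stack.
missing_f0 = harmonic_stack(4)
missing_f0[4] = 0.0
print(float(w @ missing_f0), float(w @ harmonic_stack(7)))
```

After training, the weights onto the co-active harmonic channels saturate while the rest decay, so the unit is tuned to the combination rather than to any single frequency, which is the essence of combination sensitivity.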
Safety Metrics for Human-Computer Controlled Systems
NASA Technical Reports Server (NTRS)
Leveson, Nancy G; Hatanaka, Iwao
2000-01-01
The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis proposes a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model is developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.
A Computational Workflow for the Automated Generation of Models of Genetic Designs.
Misirli, Göksel; Nguyen, Tramy; McLaughlin, James Alastair; Vaidyanathan, Prashant; Jones, Timothy S; Densmore, Douglas; Myers, Chris; Wipat, Anil
2018-06-05
Computational models are essential to engineer predictable biological systems and to scale up this process for complex systems. Computational modeling often requires expert knowledge and data to build models. Clearly, manual creation of models is not scalable for large designs. Despite several automated model construction approaches, computational methodologies to bridge knowledge in design repositories and the process of creating computational models have still not been established. This paper describes a workflow for automatic generation of computational models of genetic circuits from data stored in design repositories using existing standards. This workflow leverages the software tool SBOLDesigner to build structural models that are then enriched by the Virtual Parts Repository API using Systems Biology Open Language (SBOL) data fetched from the SynBioHub design repository. The iBioSim software tool is then utilized to convert this SBOL description into a computational model encoded using the Systems Biology Markup Language (SBML). Finally, this SBML model can be simulated using a variety of methods. This workflow provides synthetic biologists with easy-to-use tools to create predictable biological systems, hiding away the complexity of building computational models. This approach can further be incorporated into other computational workflows for design automation.
Elsawah, Sondoss; Guillaume, Joseph H A; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J
2015-03-15
This paper aims to contribute to developing better ways for incorporating essential human elements in decision making processes for modelling of complex socio-ecological systems. It presents a step-wise methodology for integrating perceptions of stakeholders (qualitative) into formal simulation models (quantitative) with the ultimate goal of improving understanding and communication about decision making in complex socio-ecological systems. The methodology integrates cognitive mapping and agent-based modelling. It cascades through a sequence of qualitative/soft and numerical methods comprising: (1) interviews to elicit mental models; (2) cognitive maps to represent and analyse individual and group mental models; (3) time-sequence diagrams to chronologically structure the decision making process; (4) an all-encompassing conceptual model of decision making; and (5) a computational (in this case agent-based) model. We apply the proposed methodology (labelled ICTAM) in a case study of viticulture irrigation in South Australia. Finally, we use strengths-weaknesses-opportunities-threats (SWOT) analysis to reflect on the methodology. Results show that the methodology leverages the use of cognitive mapping to capture the richness of decision making and mental models, and provides a combination of divergent and convergent analysis methods leading to the construction of an agent-based model. Copyright © 2014 Elsevier Ltd. All rights reserved.
Gonzalez Bernaldo de Quiros, Fernan; Dawidowski, Adriana R; Figar, Silvana
2017-02-01
In this study, we aimed: 1) to conceptualize the theoretical challenges facing health information systems (HIS) to represent patients' decisions about health and medical treatments in everyday life; 2) to suggest approaches for modeling these processes. The conceptualization of the theoretical and methodological challenges was discussed in 2015 during a series of interdisciplinary meetings attended by health informatics staff, epidemiologists and health professionals working in quality management and primary and secondary prevention of chronic diseases of the Hospital Italiano de Buenos Aires, together with sociologists, anthropologists and e-health stakeholders. HIS are facing the need and challenge to represent social human processes based on constructivist and complexity theories, which are the current frameworks of human sciences for understanding human learning and socio-cultural changes. Computer systems based on these theories can model processes of social construction of concrete and subjective entities and the interrelationships between them. These theories could be implemented, among other ways, through the mapping of health assets, analysis of social impact through community trials and modeling of complexity with system simulation tools. This analysis suggested the need to complement the traditional linear causal explanations of disease onset (and treatments), which are the bases for HIS analysis models, with constructivist and complexity frameworks. Both may enlighten the complex interrelationships among patients, health services and the health system. The aim of this strategy is to clarify people's decision making processes to improve the efficiency, quality and equity of the health services and the health system.
Multiagent model and mean field theory of complex auction dynamics
NASA Astrophysics Data System (ADS)
Chen, Qinghua; Huang, Zi-Gang; Wang, Yougui; Lai, Ying-Cheng
2015-09-01
Recent years have witnessed a growing interest in analyzing a variety of socio-economic phenomena using methods from statistical and nonlinear physics. We study a class of complex systems arising from economics, lowest unique bid auction (LUBA) systems, a recently emerged class of online auction games. Through analyzing large, empirical data sets of LUBA, we identify a general feature of the bid price distribution: an inverted J-shaped function with exponential decay in the large bid price region. To account for the distribution, we propose a multi-agent model in which each agent bids stochastically in the field of the winner's attractiveness, and develop a theoretical framework to obtain analytic solutions of the model based on mean field analysis. The theory produces bid-price distributions that are in excellent agreement with those from the real data. Our model and theory capture the essential features of human behaviors in the competitive environment as exemplified by LUBA, and may provide significant quantitative insights into complex socio-economic phenomena.
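A minimal agent-based sketch of one LUBA round follows: each agent samples a bid from an exponentially decaying price distribution, a crude stand-in for the attractiveness-field dynamics of the paper. The decay rate `beta`, the price grid, and the agent count are illustrative assumptions, not values from the study.

```python
import math
import random
from collections import Counter

def luba_round(n_agents, max_bid, beta=0.05, seed=1):
    """One lowest-unique-bid auction round with stochastic bidders."""
    rng = random.Random(seed)
    prices = list(range(1, max_bid + 1))
    weights = [math.exp(-beta * p) for p in prices]   # exponential decay
    bids = rng.choices(prices, weights=weights, k=n_agents)
    counts = Counter(bids)
    unique = [b for b in sorted(counts) if counts[b] == 1]
    return bids, (unique[0] if unique else None)      # lowest unique wins
```

Histogramming the bids over many rounds reproduces the qualitative inverted-J shape: low prices are heavily contested, so winning bids tend to sit just past the crowded region.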
Modeling microbial community structure and functional diversity across time and space.
Larsen, Peter E; Gibbons, Sean M; Gilbert, Jack A
2012-07-01
Microbial communities exhibit exquisitely complex structure. Many aspects of this complexity, from the number of species to the total number of interactions, are currently very difficult to examine directly. However, extraordinary efforts are being made to make these systems accessible to scientific investigation. While recent advances in high-throughput sequencing technologies have improved accessibility to the taxonomic and functional diversity of complex communities, monitoring the dynamics of these systems over time and space - using appropriate experimental design - is still expensive. Fortunately, modeling can be used as a lens to focus low-resolution observations of community dynamics to enable mathematical abstractions of functional and taxonomic dynamics across space and time. Here, we review the approaches for modeling bacterial diversity at both the very large and the very small scales at which microbial systems interact with their environments. We show that modeling can help to connect biogeochemical processes to specific microbial metabolic pathways. © 2012 Federation of European Microbiological Societies. Published by Blackwell Publishing Ltd. All rights reserved.
Modeling fluctuations in default-mode brain network using a spiking neural network.
Yamanishi, Teruya; Liu, Jian-Qin; Nishimura, Haruhiko
2012-08-01
Recently, numerous attempts have been made to understand the dynamic behavior of complex brain systems using neural network models. The fluctuations in blood-oxygen-level-dependent (BOLD) brain signals at less than 0.1 Hz have been observed by functional magnetic resonance imaging (fMRI) for subjects in a resting state. This phenomenon is referred to as a "default-mode brain network." In this study, we model the default-mode brain network by functionally connecting neural communities composed of spiking neurons in a complex network. Through computational simulations of the model, including transmission delays and complex connectivity, the network dynamics of the neural system and its behavior are discussed. The results show that the power spectrum of the modeled fluctuations in the neuron firing patterns is consistent with the default-mode brain network's BOLD signals when transmission delays, a characteristic property of the brain, have finite values in a given range.
NASA Technical Reports Server (NTRS)
Gore, Brian F.
2011-01-01
As automation and advanced technologies are introduced into transport systems ranging from the Next Generation Air Transportation System termed NextGen, to the advanced surface transportation systems as exemplified by the Intelligent Transportation Systems, to future systems designed for space exploration, there is an increased need to validly predict how the future systems will be vulnerable to error given the demands imposed by the assistive technologies. One formalized approach to study the impact of assistive technologies on the human operator in a safe and non-obtrusive manner is through the use of human performance models (HPMs). HPMs play an integral role when complex human-system designs are proposed, developed, and tested. One HPM tool termed the Man-machine Integration Design and Analysis System (MIDAS) is a NASA Ames Research Center HPM software tool that has been applied to predict human-system performance in various domains since 1986. MIDAS is a dynamic, integrated HPM and simulation environment that facilitates the design, visualization, and computational evaluation of complex man-machine system concepts in simulated operational environments. The paper will discuss a range of aviation-specific applications including an approach used to model human error for NASA's Aviation Safety Program, and what-if analyses to evaluate flight deck technologies for NextGen operations. This chapter will culminate by raising two challenges for the field of predictive HPMs for complex human-system designs that evaluate assistive technologies: that of (1) model transparency and (2) model validation.
Topics in Complexity: Dynamical Patterns in the Cyberworld
NASA Astrophysics Data System (ADS)
Qi, Hong
Quantitative understanding of mechanism in complex systems is a common "difficult" problem across many fields such as the physical, biological, social and economic sciences. Investigation of the underlying dynamics of complex systems and the building of individual-based models have recently been fueled by big data resulting from advancing information technology. This thesis investigates complex systems in social science, focusing on civil unrests on streets and relevant activities online. The investigation consists of collecting data on unrests from open digital sources, characterizing the underlying dynamical patterns, making predictions, and constructing models. A simple law governing the progress of two-sided confrontations is proposed using micro-level activity data. Unraveling the connections between organizing activity online and the outburst of unrests on streets gives rise to a further meso-level pattern of human behavior, through which adversarial groups evolve online and hyper-escalate ahead of real-world uprisings. Based on the patterns found, noticeable improvement in the prediction of civil unrests is achieved. Meanwhile, a novel model created by combining mobility dynamics in the cyberworld with a traditional contagion model can better capture the characteristics of modern civil unrests and other contagion-like phenomena than the original one.
Yang, Guanxue; Wang, Lin; Wang, Xiaofan
2017-06-07
Reconstruction of networks underlying complex systems is one of the most crucial problems in many areas of engineering and science. In this paper, rather than identifying parameters of complex systems governed by pre-defined models or taking some polynomial and rational functions as prior information for subsequent model selection, we put forward a general framework for nonlinear causal network reconstruction from time-series with limited observations. Obtaining multi-source datasets based on a data-fusion strategy, we propose a novel method to handle the nonlinearity and directionality of complex networked systems, namely group lasso nonlinear conditional Granger causality. Specifically, our method can exploit different sets of radial basis functions to approximate the nonlinear interactions between each pair of nodes and integrate sparsity into grouped variable selection. The performance of our approach is first assessed with two types of simulated datasets from a nonlinear vector autoregressive model and nonlinear dynamic models, and then verified on the benchmark datasets from DREAM3 Challenge4. Effects of data size and noise intensity are also discussed. All of the results demonstrate that the proposed method performs better in terms of a higher area under the precision-recall curve.
Shen, Weifeng; Jiang, Libing; Zhang, Mao; Ma, Yuefeng; Jiang, Guanyu; He, Xiaojun
2014-01-01
To review the research methods of mass casualty incident (MCI) systematically and introduce the concept and characteristics of complexity science and the artificial systems, computational experiments and parallel execution (ACP) method. We searched PubMed, Web of Knowledge, China Wanfang and China Biology Medicine (CBM) databases for relevant studies. Searches were performed without year or language restrictions and used combinations of the following key words: "mass casualty incident", "MCI", "research method", "complexity science", "ACP", "approach", "science", "model", "system" and "response". Articles were searched using the above keywords and only those involving the research methods of mass casualty incident (MCI) were included. Research methods of MCI have increased markedly over the past few decades. At present, the dominant research methods for MCI are the theory-based approach, the empirical approach, evidence-based science, mathematical modeling and computer simulation, simulation experiments, experimental methods, the scenario approach and complexity science. This article provides an overview of the development of research methodology for MCI. The progress of routine research approaches and complexity science is briefly presented in this paper. Furthermore, the authors conclude that the reductionism underlying the exact sciences is not suitable for MCI complex systems, and the only feasible alternative is complexity science. Finally, this summary is followed by a review arguing that the ACP method, combining artificial systems, computational experiments and parallel execution, provides a new way to address research on complex MCI systems.
Drewes, Rich; Zou, Quan; Goodman, Philip H
2009-01-01
Neuroscience modeling experiments often involve multiple complex neural network and cell model variants, complex input stimuli and input protocols, followed by complex data analysis. Coordinating all this complexity becomes a central difficulty for the experimenter. The Python programming language, along with its extensive library packages, has emerged as a leading "glue" tool for managing all sorts of complex programmatic tasks. This paper describes a toolkit called Brainlab, written in Python, that leverages Python's strengths for the task of managing the general complexity of neuroscience modeling experiments. Brainlab was also designed to overcome the major difficulties of working with the NCS (NeoCortical Simulator) environment in particular. Brainlab is an integrated model-building, experimentation, and data analysis environment for the powerful parallel spiking neural network simulator system NCS.
NASA Astrophysics Data System (ADS)
Zhang, Yali; Wang, Jun
2017-09-01
In an attempt to investigate the nonlinear complex evolution of financial dynamics, a new financial price model, the multitype range-intensity contact (MRIC) financial model, is developed based on the multitype range-intensity interacting contact system, in which the interaction and transmission of different types of investment attitudes in a stock market are simulated by viruses spreading. Two new random visibility graph (VG) based analyses and Lempel-Ziv complexity (LZC) are applied to study the complex behaviors of return time series and the corresponding random sorted series. The VG method is based on complex network theory, and the LZC is a non-parametric measure of complexity reflecting the rate of new pattern generation of a series. In this work, real stock market indices are considered and comparatively studied with the simulation data of the proposed model. Further, the numerical empirical study shows similar complexity behaviors between the model and the real markets, confirming that the financial model is reasonable to some extent.
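Lempel-Ziv complexity, one of the two measures used in this abstract, admits a compact implementation: the Kaspar-Schuster parsing of a symbol string into distinct phrases. Binarizing the return series (e.g. by the sign of returns) is left to the caller; this sketch shows only the counting step.

```python
def lempel_ziv_complexity(s):
    """Number of distinct phrases in the LZ76 parsing of string s.
    Low values indicate regularity; random strings keep spawning phrases."""
    n, i, c = len(s), 0, 0
    while i < n:
        l = 1
        # extend the current phrase while it already occurs in the prefix
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1          # one new phrase found
        i += l
    return c
```

A constant series parses into just 2 phrases and a periodic one into 3, whereas a disordered series of the same length yields a much higher count, which is why LZC serves as a proxy for the rate of new-pattern generation.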
Decreasing the temporal complexity for nonlinear, implicit reduced-order models by forecasting
Carlberg, Kevin; Ray, Jaideep; van Bloemen Waanders, Bart
2015-02-14
Implicit numerical integration of nonlinear ODEs requires solving a system of nonlinear algebraic equations at each time step. Each of these systems is often solved by a Newton-like method, which incurs a sequence of linear-system solves. Most model-reduction techniques for nonlinear ODEs exploit knowledge of the system's spatial behavior to reduce the computational complexity of each linear-system solve. However, the number of linear-system solves for the reduced-order simulation often remains roughly the same as that for the full-order simulation. We propose exploiting knowledge of the model's temporal behavior to (1) forecast the unknown variable of the reduced-order system of nonlinear equations at future time steps, and (2) use this forecast as an initial guess for the Newton-like solver during the reduced-order-model simulation. To compute the forecast, we propose using the Gappy POD technique. As a result, the goal is to generate an accurate initial guess so that the Newton solver requires many fewer iterations to converge, thereby decreasing the number of linear-system solves in the reduced-order-model simulation.
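The idea can be illustrated on a scalar implicit Euler step for y' = -y^3, comparing total Newton iterations when the initial guess is the previous state versus a linear extrapolation of the two most recent states. The extrapolation is a cheap one-dimensional stand-in for the Gappy POD forecast proposed in the paper; the ODE, step size, and tolerance are illustrative.

```python
def newton(f, df, x, tol=1e-12, max_it=50):
    """Scalar Newton iteration; returns the root and iteration count."""
    it = 0
    while abs(f(x)) > tol and it < max_it:
        x -= f(x) / df(x)
        it += 1
    return x, it

def backward_euler(y0, dt, steps, forecast=False):
    """Implicit Euler for y' = -y**3, counting total Newton iterations."""
    ys, total_its = [y0], 0
    for _ in range(steps):
        yk = ys[-1]
        g = lambda y, yk=yk: y - yk + dt * y ** 3   # implicit-step residual
        dg = lambda y: 1.0 + 3.0 * dt * y ** 2
        if forecast and len(ys) >= 2:
            guess = 2 * ys[-1] - ys[-2]             # linear forecast
        else:
            guess = yk                              # standard initial guess
        y_new, its = newton(g, dg, guess)
        ys.append(y_new)
        total_its += its
    return ys, total_its
```

For a decaying trajectory the forecast guess lands much closer to each step's solution, so the Newton loop needs fewer iterations in total — exactly the saving the paper targets, since each avoided iteration is an avoided linear-system solve in the reduced-order setting.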
Transdisciplinary application of the cross-scale resilience model
Sundstrom, Shana M.; Angeler, David G.; Garmestani, Ahjond S.; Garcia, Jorge H.; Allen, Craig R.
2014-01-01
The cross-scale resilience model was developed in ecology to explain the emergence of resilience from the distribution of ecological functions within and across scales, and as a tool to assess resilience. We propose that the model and the underlying discontinuity hypothesis are relevant to other complex adaptive systems, and can be used to identify and track changes in system parameters related to resilience. We explain the theory behind the cross-scale resilience model, review the cases where it has been applied to non-ecological systems, and discuss some examples of social-ecological, archaeological/anthropological, and economic systems where a cross-scale resilience analysis could add a quantitative dimension to our current understanding of system dynamics and resilience. We argue that the scaling and diversity parameters suitable for a resilience analysis of ecological systems are appropriate for a broad suite of systems where non-normative quantitative assessments of resilience are desired. Our planet is currently characterized by fast environmental and social change, and the cross-scale resilience model has the potential to quantify resilience across many types of complex adaptive systems.
NASA Technical Reports Server (NTRS)
Schmidt, R. J.; Dodds, R. H., Jr.
1985-01-01
The dynamic analysis of complex structural systems using the finite element method and multilevel substructured models is presented. The fixed-interface method is selected for substructure reduction because of its efficiency, accuracy, and adaptability to restart and reanalysis. This method is extended to reduction of substructures which are themselves composed of reduced substructures. The implementation and performance of the method in a general purpose software system is emphasized. Solution algorithms consistent with the chosen data structures are presented. It is demonstrated that successful finite element software requires the use of software executives to supplement the algorithmic language. The complexity of the implementation of restart and reanalysis procedures illustrates the need for executive systems to support the noncomputational aspects of the software. It is shown that significant computational efficiencies can be achieved through proper use of substructuring and reduction techniques without sacrificing solution accuracy. The restart and reanalysis capabilities and the flexible procedures for multilevel substructured modeling give economical yet accurate analyses of complex structural systems.
Modeling and Verification of Dependable Electronic Power System Architecture
NASA Astrophysics Data System (ADS)
Yuan, Ling; Fan, Ping; Zhang, Xiao-fang
The electronic power system can be viewed as a system composed of a set of concurrently interacting subsystems to generate, transmit, and distribute electric power. The complex interaction among subsystems makes the design of the electronic power system complicated. Furthermore, in order to guarantee the safe generation and distribution of electric power, fault tolerant mechanisms are incorporated in the system design to satisfy high reliability requirements. As a result, this incorporation makes the design of such systems more complicated. We propose a dependable electronic power system architecture, which can provide a generic framework to guide the development of electronic power systems and ease the development complexity. In order to provide common idioms and patterns to system designers, we formally model the electronic power system architecture by using the PVS formal language. Based on the PVS model of this system architecture, we formally verify the fault tolerant properties of the system architecture by using the PVS theorem prover, which can guarantee that the system architecture satisfies the high reliability requirements.
NASA Technical Reports Server (NTRS)
Frisch, Harold P.
2007-01-01
Engineers who design systems using text specification documents focus their work upon the completed system to meet performance, time, and budget goals. Consistency and integrity are difficult to maintain within text documents for a single complex system and more difficult to maintain as several systems are combined into higher-level systems, are maintained over decades, and evolve technically and in performance through updates. This system design approach frequently results in major changes during the system integration and test phase, and in time and budget overruns. Engineers who build system specification documents within a model-based systems environment go a step further and aggregate all of the data. They interrelate all of the data to ensure consistency and integrity. After the model is constructed, the various system specification documents are prepared, all from the same database. The consistency and integrity of the model is assured; therefore the consistency and integrity of the various specification documents is ensured. This article attempts to define model-based systems relative to such an environment. The intent is to expose the complexity of the enabling problem by outlining what is needed, why it is needed and how needs are being addressed by international standards writing teams.
A finite element model of rigid body structures actuated by dielectric elastomer actuators
NASA Astrophysics Data System (ADS)
Simone, F.; Linnebach, P.; Rizzello, G.; Seelecke, S.
2018-06-01
This paper presents finite element (FE) modeling and simulation of dielectric elastomer actuators (DEAs) coupled with articulated structures. DEAs have proven to represent an effective transduction technology for the realization of large deformation, low-power consuming, and fast mechatronic actuators. However, the complex dynamic behavior of the material, characterized by nonlinearities and rate-dependent phenomena, makes it difficult to accurately model and design DEA systems. The problem is further complicated when the DEA is used to activate articulated structures, which increase both system complexity and the implementation effort of numerical simulation models. In this paper, we present a model-based tool that allows complex articulated systems actuated by DEAs to be effectively implemented and simulated. A first prototype of a compact switch actuated by DEA membranes is chosen as a reference study to introduce the methodology. The commercially available FE software COMSOL is used for implementing and coupling a physics-based dynamic model of the DEA with the external structure, i.e., the switch. The model is then experimentally calibrated and validated in both quasi-static and dynamic loading conditions. Finally, preliminary results on how to use the simulation tool to optimize the design are presented.
Model identification of signal transduction networks from data using a state regulator problem.
Gadkar, K G; Varner, J; Doyle, F J
2005-03-01
Advances in molecular biology provide an opportunity to develop detailed models of biological processes that can be used to obtain an integrated understanding of the system. However, development of useful models from the available knowledge of the system and experimental observations still remains a daunting task. In this work, a model identification strategy for complex biological networks is proposed. The approach includes a state regulator problem (SRP) that provides estimates of all the component concentrations and the reaction rates of the network using the available measurements. The full set of the estimates is utilised for model parameter identification for the network of known topology. An a priori model-complexity test is developed to indicate whether the proposed algorithm can be expected to perform well. Fisher information matrix (FIM) theory is used to address model identifiability issues. Two signalling pathway case studies, the caspase function in apoptosis and the MAP kinase cascade system, are considered. The MAP kinase cascade, with measurements restricted to protein complex concentrations, fails the a priori test and the SRP estimates are poor as expected. The apoptosis network structure used in this work has moderate complexity and is suitable for application of the proposed tools. Using a measurement set of seven protein concentrations, accurate estimates for all unknowns are obtained. Furthermore, the effects of measurement sampling frequency and quality of information in the measurement set on the performance of the identified model are described.
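The FIM-based identifiability check mentioned above can be sketched with finite-difference output sensitivities. The two-parameter decay model, sampling times, and noise level below are illustrative assumptions for the example, not the signalling networks studied in the paper.

```python
import math

def fisher_information(model, params, times, sigma=1.0, h=1e-6):
    """F = S^T S / sigma^2, where S_kj = d y(t_k) / d p_j is
    approximated by forward finite differences."""
    p = len(params)
    S = []
    for t in times:
        y0 = model(params, t)
        row = []
        for j in range(p):
            shifted = list(params)
            shifted[j] += h                 # perturb one parameter
            row.append((model(shifted, t) - y0) / h)
        S.append(row)
    return [[sum(r[i] * r[j] for r in S) / sigma ** 2 for j in range(p)]
            for i in range(p)]

decay = lambda p, t: p[0] * math.exp(-p[1] * t)   # hypothetical model
F = fisher_information(decay, [2.0, 0.5], times=[0.0, 1.0, 2.0, 4.0])
```

A nonsingular F (positive determinant here) indicates that both parameters are locally identifiable from the chosen sampling times; a near-singular F flags the kind of estimation failure the abstract reports for the restricted MAP kinase measurement set.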
Data-Driven Modeling of Complex Systems by means of a Dynamical ANN
NASA Astrophysics Data System (ADS)
Seleznev, A.; Mukhin, D.; Gavrilov, A.; Loskutov, E.; Feigin, A.
2017-12-01
The data-driven methods for modeling and prognosis of complex dynamical systems are becoming more and more popular in various fields due to the growth of high-resolution data. We distinguish two basic steps in such an approach: (i) determining the phase subspace of the system, or embedding, from available time series and (ii) constructing an evolution operator acting in this reduced subspace. In this work we suggest a novel approach combining these two steps by means of the construction of an artificial neural network (ANN) with special topology. The proposed ANN-based model, on the one hand, projects the data onto a low-dimensional manifold, and, on the other hand, models a dynamical system on this manifold. Effectively, this is a recurrent multilayer ANN which has internal dynamics and is capable of generating time series. A very important point of the proposed methodology is the optimization of the model, allowing us to avoid overfitting: we use a Bayesian criterion to optimize the ANN structure and estimate both the degree of evolution operator nonlinearity and the complexity of the nonlinear manifold onto which the data are projected. The proposed modeling technique will be applied to the analysis of high-dimensional dynamical systems: the Lorenz'96 model of atmospheric turbulence, producing high-dimensional space-time chaos, and a quasi-geostrophic three-layer model of the Earth's atmosphere with natural orography, describing the dynamics of synoptical vortexes as well as mesoscale blocking systems. The possibility of applying the proposed methodology to analyze real measured data is also discussed. The study was supported by the Russian Science Foundation (grant #16-12-10198).
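The abstract does not spell out the ANN architecture, but step (i), recovering a phase subspace from a scalar record, is classically done with time-delay embedding, which the following sketch illustrates as a point of comparison (the embedding dimension and lag below are illustrative, not values from the study):

```python
def delay_embed(x, dim, tau):
    """Takens-style delay embedding: map a scalar series x to points
    X_t = (x_t, x_{t-tau}, ..., x_{t-(dim-1)*tau}) in R^dim."""
    start = (dim - 1) * tau          # first index with a full history
    return [[x[t - j * tau] for j in range(dim)]
            for t in range(start, len(x))]
```

The ANN approach of the paper replaces this fixed linear construction with a learned nonlinear projection, and the Bayesian criterion plays the role that heuristic choices of `dim` and `tau` play here.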
NASA Technical Reports Server (NTRS)
Alexandrov, Natalia (Technical Monitor); Kuby, Michael; Tierney, Sean; Roberts, Tyler; Upchurch, Christopher
2005-01-01
This report reviews six classes of models that are used for studying transportation network topologies. The report is motivated by two main questions. First, what can the "new science" of complex networks (scale-free, small-world networks) contribute to our understanding of transport network structure, compared to more traditional methods? Second, how can geographic information systems (GIS) contribute to studying transport networks? The report defines terms that can be used to classify different kinds of models by their function, composition, mechanism, spatial and temporal dimensions, certainty, linearity, and resolution. Six broad classes of models for analyzing transport network topologies are then explored: GIS; static graph theory; complex networks; mathematical programming; simulation; and agent-based modeling. Each class of models is defined and classified according to the attributes introduced earlier. The report identifies some typical types of research questions about network structure that have been addressed by each class of model in the literature.
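Two of the standard small-world diagnostics behind the complex-networks class of models, mean shortest-path length and clustering coefficient, are easy to compute for a small transport graph. This is a pure-Python sketch; the adjacency dictionaries in the usage below are made-up example networks, not data from the report.

```python
from collections import deque

def mean_path_length(adj):
    """Average BFS distance over all ordered, connected node pairs."""
    total = pairs = 0
    for src in adj:
        dist, q = {src: 0}, deque([src])
        while q:                          # breadth-first search from src
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(d for n, d in dist.items() if n != src)
        pairs += len(dist) - 1
    return total / pairs

def mean_clustering(adj):
    """Average local clustering: fraction of a node's neighbour pairs
    that are themselves linked."""
    coeffs = []
    for u, nbrs in adj.items():
        k = len(nbrs)
        links = sum(1 for v in nbrs for w in nbrs if v < w and w in adj[v])
        coeffs.append(2 * links / (k * (k - 1)) if k > 1 else 0.0)
    return sum(coeffs) / len(coeffs)
```

A small-world network combines a short mean path (as in random graphs) with high clustering (as in lattices); comparing these two numbers against randomized baselines is the usual first test applied to a transport network's topology.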
Non-Archimedean reaction-ultradiffusion equations and complex hierarchic systems
NASA Astrophysics Data System (ADS)
Zúñiga-Galindo, W. A.
2018-06-01
We initiate the study of non-Archimedean reaction-ultradiffusion equations and their connections with models of complex hierarchic systems. From a mathematical perspective, the equations studied here are the p-adic counterpart of the integro-differential models for phase separation introduced by Bates and Chmaj. Our equations are also generalizations of the ultradiffusion equations on trees studied in the 1980s by Ogielski, Stein, Bachas, Huberman, among others, and also generalizations of the master equations of the Avetisov et al models, which describe certain complex hierarchic systems. From a physical perspective, our equations are gradient flows of non-Archimedean free energy functionals and their solutions describe the macroscopic density profile of a bistable material whose space of states has an ultrametric structure. Some of our results are p-adic analogs of some well-known results in the Archimedean setting, however, the mechanism of diffusion is completely different due to the fact that it occurs in an ultrametric space.
Learning from Evidence in a Complex World
Sterman, John D.
2006-01-01
Policies to promote public health and welfare often fail or worsen the problems they are intended to solve. Evidence-based learning should prevent such policy resistance, but learning in complex systems is often weak and slow. Complexity hinders our ability to discover the delayed and distal impacts of interventions, generating unintended “side effects.” Yet learning often fails even when strong evidence is available: common mental models lead to erroneous but self-confirming inferences, allowing harmful beliefs and behaviors to persist and undermining implementation of beneficial policies. Here I show how systems thinking and simulation modeling can help expand the boundaries of our mental models, enhance our ability to generate and learn from evidence, and catalyze effective change in public health and beyond. PMID:16449579
NASA Astrophysics Data System (ADS)
Abdel-Aty, Mahmoud
2016-07-01
The modeling of a complex system requires the analysis of all microscopic constituents and, in particular, of their interactions [1]. Interest in this research field has also grown in light of recent developments in the information sciences. However, interaction among scholars working in various fields of the applied sciences can be considered the true driving force behind the definition of a general framework for the analysis of complex systems. In particular, biological systems constitute the platform on which many scientists have decided to collaborate in order to gain a global description of the system. Among others, cancer-immune system competition (see [2] and the review papers [3,4]) has attracted much attention.
Reasoning from non-stationarity
NASA Astrophysics Data System (ADS)
Struzik, Zbigniew R.; van Wijngaarden, Willem J.; Castelo, Robert
2002-11-01
Complex real-world (biological) systems often exhibit intrinsically non-stationary behaviour in their temporal characteristics. We discuss local measures of scaling which can capture and reveal changes in a system's behaviour. Such measures offer increased insight into a system's behaviour and are superior to global, spectral characteristics like the multifractal spectrum. They are, however, often inadequate for fully understanding and modelling the phenomenon. We illustrate an attempt to capture complex model characteristics by analysing (multiple-order) correlations in a high-dimensional space of parameters of the (biological) system being studied. Both temporal information, including local scaling information, and external descriptors/parameters that may influence the system's state are used to span the search space investigated for the presence of a (sub-)optimal model. As an example, we use the fetal heartbeat monitored during labour.
Modeling and simulation of a direct ethanol fuel cell: An overview
NASA Astrophysics Data System (ADS)
Abdullah, S.; Kamarudin, S. K.; Hasran, U. A.; Masdar, M. S.; Daud, W. R. W.
2014-09-01
The commercialization of Direct Ethanol Fuel Cells (DEFCs) is still hindered for economic and technical reasons. Fundamental scientific research is required to more completely understand the complex electrochemical behavior and engineering technology of DEFCs. To use the DEFC system in real-world applications, fast, reliable, and cost-effective methods are needed to explore this complex phenomenon and to predict the performance of different system designs. Thus, modeling and simulation play an important role in examining the DEFC system as well as in designing an optimized DEFC system. The current DEFC literature shows that modeling studies on DEFCs are still in their early stages and are not yet able to describe the DEFC system as a whole. Potential DEFC applications and their current status are also presented.
Surfing on Protein Waves: Proteophoresis as a Mechanism for Bacterial Genome Partitioning
NASA Astrophysics Data System (ADS)
Walter, J.-C.; Dorignac, J.; Lorman, V.; Rech, J.; Bouet, J.-Y.; Nollmann, M.; Palmeri, J.; Parmeggiani, A.; Geniet, F.
2017-07-01
Efficient bacterial chromosome segregation typically requires the coordinated action of a three-component machinery, fueled by adenosine triphosphate, called the partition complex. We present a phenomenological model accounting for the dynamic activity of this system that is also relevant for the physics of catalytic particles in active environments. The model is obtained by coupling simple linear reaction-diffusion equations with a proteophoresis, or "volumetric" chemophoresis, force field that arises from protein-protein interactions and provides a physically viable mechanism for complex translocation. This minimal description captures most known experimental observations: dynamic oscillations of complex components, complex separation, and subsequent symmetrical positioning. The predictions of our model are in phenomenological agreement with and provide substantial insight into recent experiments. From a nonlinear physics viewpoint, this system explores the active separation of matter at micrometric scales with a dynamical instability between static positioning and traveling wave regimes triggered by the dynamical spontaneous breaking of rotational symmetry.
On the robustness of complex heterogeneous gene expression networks.
Gómez-Gardeñes, Jesús; Moreno, Yamir; Floría, Luis M
2005-04-01
We analyze a continuous gene expression model on the underlying topology of a complex heterogeneous network. Numerical simulations aimed at studying the chaotic and periodic dynamics of the model are performed. The results clearly indicate that there is a region in which the dynamical and structural complexity of the system avoid chaotic attractors. However, contrary to what has been reported for Random Boolean Networks, the chaotic phase cannot be completely suppressed, which has important bearings on network robustness and gene expression modeling.
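The abstract above does not state the model equations. As a hedged illustration only (the sigmoidal interaction rule, the gain, the network size, and the helper names `heterogeneous_network` and `step` are assumptions for this sketch, not the authors' model), a continuous-state gene-expression update on a preferentially grown, heavy-tailed network can be sketched as:

```python
import math
import random

def heterogeneous_network(n, m):
    """Grow a graph by preferential attachment (Barabasi-Albert style),
    giving the heavy-tailed degree distribution typical of complex
    heterogeneous topologies. Returns an adjacency dict."""
    targets = list(range(m))
    edges = set()
    repeated = []               # nodes repeated proportionally to degree
    for v in range(m, n):
        for t in targets:
            edges.add((v, t))
        repeated.extend(targets)
        repeated.extend([v] * m)
        targets = random.sample(repeated, m)
    neighbors = {v: set() for v in range(n)}
    for a, b in edges:
        neighbors[a].add(b)
        neighbors[b].add(a)
    return neighbors

def step(x, neighbors, gain=5.0):
    """One synchronous update of a continuous-state map: each gene's next
    expression level is a sigmoid of its neighbors' mean level."""
    new = {}
    for v, nbrs in neighbors.items():
        drive = sum(x[u] for u in nbrs) / max(len(nbrs), 1)
        new[v] = 1.0 / (1.0 + math.exp(-gain * (drive - 0.5)))
    return new

random.seed(1)
net = heterogeneous_network(200, 2)
x = {v: random.random() for v in net}   # random initial expression levels
for _ in range(100):
    x = step(x, net)
levels = sorted(x.values())
```

Iterating the map and inspecting `levels` (or a trajectory of them) is the kind of numerical experiment one would use to distinguish periodic from chaotic regimes, though the paper's actual dynamics and parameters differ.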
An agent-based hydroeconomic model to evaluate water policies in Jordan
NASA Astrophysics Data System (ADS)
Yoon, J.; Gorelick, S.
2014-12-01
Modern water systems can be characterized by a complex network of institutional and private actors that represent competing sectors and interests. Identifying solutions to enhance water security in such systems calls for analysis that can adequately account for this level of complexity and interaction. Our work focuses on the development of a hierarchical, multi-agent, hydroeconomic model that attempts to realistically represent complex interactions between hydrologic and multi-faceted human systems. The model is applied to Jordan, one of the most water-poor countries in the world. In recent years, the water crisis in Jordan has escalated due to an ongoing drought and an influx of refugees from regional conflicts. We adopt a modular approach in which biophysical modules simulate natural and engineering phenomena, and human modules represent behavior at multiple scales of decision making. The human modules employ agent-based modeling, in which agents act as autonomous decision makers at the transboundary, state, organizational, and user levels. A systematic nomenclature and conceptual framework are used to characterize model agents and modules. Concepts from the Unified Modeling Language (UML) are adopted to promote clear conceptualization of model classes and process sequencing, establishing a foundation for full deployment of the integrated model in a scalable object-oriented programming environment. Although the framework is applied to the Jordanian water context, it is generalizable to other regional human-natural freshwater supply systems.
Ecological systems are generally considered among the most complex because they are characterized by a large number of diverse components, nonlinear interactions, scale multiplicity, and spatial heterogeneity. Hierarchy theory, as well as empirical evidence, suggests that comp...
NASA Astrophysics Data System (ADS)
Givens, J.; Padowski, J.; Malek, K.; Guzman, C.; Boll, J.; Adam, J. C.; Witinok-Huber, R.
2017-12-01
In the face of climate change and multi-scalar governance objectives, achieving resilience of food-energy-water (FEW) systems requires interdisciplinary approaches. Through coordinated modeling and management efforts, we study "Innovations in the Food-Energy-Water Nexus (INFEWS)" through a case-study in the Columbia River Basin. Previous research on FEW system management and resilience includes some attention to social dynamics (e.g., economic, governance); however, more research is needed to better address social science perspectives. Decisions ultimately taken in this river basin would occur among stakeholders encompassing various institutional power structures including multiple U.S. states, tribal lands, and sovereign nations. The social science lens draws attention to the incompatibility between the engineering definition of resilience (i.e., return to equilibrium or a singular stable state) and the ecological and social system realities, more explicit in the ecological interpretation of resilience (i.e., the ability of a system to move into a different, possibly more resilient state). Social science perspectives include but are not limited to differing views on resilience as normative, system persistence versus transformation, and system boundary issues. To expand understanding of resilience and objectives for complex and dynamic systems, concepts related to inequality, heterogeneity, power, agency, trust, values, culture, history, conflict, and system feedbacks must be more tightly integrated into FEW research. We identify gaps in knowledge and data, and the value and complexity of incorporating social components and processes into systems models. We posit that socio-biophysical system resilience modeling would address important complex, dynamic social relationships, including non-linear dynamics of social interactions, to offer an improved understanding of sustainable management in FEW systems. 
The conceptual modeling presented in our study represents a starting point for a continued research agenda that incorporates social dynamics into FEW system resilience and management.
Vinnakota, Kalyan C; Beard, Daniel A; Dash, Ranjan K
2009-01-01
Identification of a complex biochemical system model requires appropriate experimental data. Models constructed on the basis of data from the literature often contain parameters that are not identifiable with high sensitivity and therefore require additional experimental data to identify those parameters. Here we report the application of a local sensitivity analysis to design experiments that will improve the identifiability of previously unidentifiable model parameters in a model of mitochondrial oxidative phosphorylation and the tricarboxylic acid cycle. Experiments were designed based on measurable biochemical reactants in a dilute suspension of purified cardiac mitochondria with experimentally feasible perturbations to this system. Experimental perturbations and variables yielding the greatest number of parameters above a 5% sensitivity level are presented and discussed.
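The 5% sensitivity criterion above can be illustrated with a toy calculation. This is a sketch under stated assumptions: `model` is a made-up two-exponential observable, not the mitochondrial oxidative-phosphorylation model, and the parameter names and values are hypothetical; only the finite-difference recipe for a normalized local sensitivity is generic.

```python
import math

def model(params, t):
    """Toy observable: a two-exponential decay standing in for a measurable
    biochemical reactant (NOT the paper's actual model)."""
    a, k1, k2 = params["a"], params["k1"], params["k2"]
    return a * math.exp(-k1 * t) + (1 - a) * math.exp(-k2 * t)

def normalized_sensitivity(params, name, t, h=1e-6):
    """Local relative sensitivity |(dy/dp) * (p/y)| via central differences,
    so parameters on different scales can be compared directly."""
    up = dict(params); up[name] = params[name] * (1 + h)
    dn = dict(params); dn[name] = params[name] * (1 - h)
    dy = model(up, t) - model(dn, t)
    dp = 2 * h * params[name]
    return abs((dy / dp) * (params[name] / model(params, t)))

params = {"a": 0.7, "k1": 1.5, "k2": 0.1}   # hypothetical values
times = [0.5, 1.0, 2.0, 5.0]                # candidate measurement times
threshold = 0.05                            # the 5% level from the abstract

# A parameter is usefully constrained by this design if some measurement
# time pushes its normalized sensitivity above the threshold.
identifiable = {
    name for name in params
    if any(normalized_sensitivity(params, name, t) > threshold for t in times)
}
```

In this toy design all three parameters clear the 5% level; an experiment-design loop would search over perturbations and sampling times to maximize how many parameters do so.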
NASA Technical Reports Server (NTRS)
Mog, Robert A.
1999-01-01
Unique and innovative graph theory, neural network, organizational modeling, and genetic algorithm techniques are applied to the design and evolution of programmatic and organizational architectures. Graph theory representations of programs and organizations increase modeling capabilities and flexibility, while illuminating preferable programmatic/organizational design features. Treating programs and organizations as neural networks results in better system synthesis and more robust data modeling. Organizational modeling using covariance structures enhances the determination of organizational risk factors. Genetic algorithms improve programmatic evolution characteristics, while shedding light on rulebase requirements for achieving specified technological readiness levels, given budget and schedule resources. This program of research improves the robustness and verifiability of systems synthesis tools, including the Complex Organizational Metric for Programmatic Risk Environments (COMPRE).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arion is a library and tool set that enables researchers to holistically define test system models. Defining a complex system for testing an algorithm or control requires expertise across multiple domains. Simulating a complex system requires the integration of multiple simulators and test hardware, each with its own specification languages and concepts, which demands an extensive set of knowledge and capabilities. Arion was developed to alleviate this challenge. Arion is a set of Java libraries that abstracts the concepts of the supported simulators into a cohesive model language, allowing users to build models to their needed level of fidelity and expertise. Arion is also a software tool that translates the user's model back into the specification languages of the simulators and test hardware needed for execution.
Acceleration techniques for dependability simulation. M.S. Thesis
NASA Technical Reports Server (NTRS)
Barnette, James David
1995-01-01
As computer systems increase in complexity, the need to project system performance from the earliest design and development stages increases. We have to employ simulation for detailed dependability studies of large systems. However, as the complexity of the simulation model increases, the time required to obtain statistically significant results also increases. This paper discusses an approach that is application independent and can be readily applied to any process-based simulation model. Topics include background on classical discrete event simulation and techniques for random variate generation and statistics gathering to support simulation.
Method of fuzzy inference for one class of MISO-structure systems with non-singleton inputs
NASA Astrophysics Data System (ADS)
Sinuk, V. G.; Panchenko, M. V.
2018-03-01
In fuzzy modeling, the inputs of the simulated system can receive both crisp (singleton) values and non-singleton fuzzy values. The computational complexity of fuzzy inference with non-singleton fuzzy inputs is exponential. This paper describes a new method of inference based on a theorem on the decomposition of a multidimensional fuzzy implication and a fuzzy truth value. The method handles fuzzy inputs with polynomial complexity, which makes it possible to use it for modeling large-dimensional MISO-structure systems.
Origin of Complexity in Multicellular Organisms
NASA Astrophysics Data System (ADS)
Furusawa, Chikara; Kaneko, Kunihiko
2000-06-01
Through extensive studies of dynamical systems modeling cellular growth and reproduction, we find evidence that complexity arises in multicellular organisms naturally through evolution. Without any elaborate control mechanism, these systems can exhibit complex pattern formation with spontaneous cell differentiation. Such systems employ a "cooperative" use of resources and maintain a larger growth speed than simple cell systems, which exist in a homogeneous state and behave "selfishly." The relevance of the diversity of chemicals and reaction dynamics to the growth of a multicellular organism is demonstrated. Chaotic biochemical dynamics are found to provide the multipotency of stem cells.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ward, Gregory; Mistrick, Ph.D., Richard; Lee, Eleanor
2011-01-21
We describe two methods which rely on bidirectional scattering distribution functions (BSDFs) to model the daylighting performance of complex fenestration systems (CFS), enabling greater flexibility and accuracy in evaluating arbitrary assemblies of glazing, shading, and other optically-complex coplanar window systems. Two tools within Radiance enable a) efficient annual performance evaluations of CFS, and b) accurate renderings of CFS despite the loss of spatial resolution associated with low-resolution BSDF datasets for inhomogeneous systems. Validation, accuracy, and limitations of the methods are discussed.
Studying the HIT-Complexity Interchange.
Kuziemsky, Craig E; Borycki, Elizabeth M; Kushniruk, Andre W
2016-01-01
The design and implementation of health information technology (HIT) is challenging, particularly when it is being introduced into complex settings. While complex adaptive systems (CASs) can be a valuable means of understanding relationships between users, HIT and tasks, much of the existing work using CASs is descriptive in nature. This paper addresses that issue by integrating a model for analyzing task complexity with approaches for HIT evaluation and systems analysis. The resulting framework classifies HIT-user tasks and issues as simple, complicated or complex, and provides insight on how to study them.
Archetypes for Organisational Safety
NASA Technical Reports Server (NTRS)
Marais, Karen; Leveson, Nancy G.
2003-01-01
We propose a framework using system dynamics to model the dynamic behavior of organizations in accident analysis. Most current accident analysis techniques are event-based and do not adequately capture the dynamic complexity and non-linear interactions that characterize accidents in complex systems. In this paper we propose a set of system safety archetypes that model common safety culture flaws in organizations, i.e., the dynamic behavior of organizations that often leads to accidents. As accident analysis and investigation tools, the archetypes can be used to develop dynamic models that describe the systemic and organizational factors contributing to the accident. The archetypes help clarify why safety-related decisions do not always result in the desired behavior, and how independent decisions in different parts of the organization can combine to impact safety.
Anharmonic Vibrational Spectroscopy on Transition Metal Complexes
NASA Astrophysics Data System (ADS)
Latouche, Camille; Bloino, Julien; Barone, Vincenzo
2014-06-01
Advances in hardware performance and the availability of efficient and reliable computational models have made possible the application of computational spectroscopy to ever larger molecular systems, facilitating the systematic interpretation of experimental data and the full characterization of complex molecules. Focusing on vibrational spectroscopy, several approaches have been proposed to simulate spectra beyond the double-harmonic approximation, so that more details become available. However, routine use of such tools requires the preliminary definition of a valid protocol with the most appropriate combination of electronic structure and nuclear calculation models. Several benchmarks of anharmonic frequency calculations have been performed on organic molecules. Nevertheless, benchmarks of organometallic or inorganic metal complexes at this level are strongly lacking, despite the interest in these systems for their strong emission and vibrational properties. Herein we report a benchmark study of anharmonic calculations on simple metal complexes, along with some pilot applications to systems of direct technological or biological interest.
An integrative model of evolutionary covariance: a symposium on body shape in fishes.
Walker, Jeffrey A
2010-12-01
A major direction of current and future biological research is to understand how multiple, interacting functional systems coordinate in producing a body that works. This understanding is complicated by the fact that organisms need to work well in multiple environments, with both predictable and unpredictable environmental perturbations. Furthermore, organismal design reflects a history of past environments and not a plan for future environments. How complex, interacting functional systems evolve, then, is a truly grand challenge. In accepting the challenge, an integrative model of evolutionary covariance is developed. The model combines quantitative genetics, functional morphology/physiology, and functional ecology. The model is used to convene scientists ranging from geneticists, to physiologists, to ecologists, to engineers to facilitate the emergence of body shape in fishes as a model system for understanding how complex, interacting functional systems develop and evolve. Body shape of fish is a complex morphology that (1) results from many developmental paths and (2) functions in many different behaviors. Understanding the coordination and evolution of the many paths from genes to body shape, body shape to function, and function to a working fish body in a dynamic environment is now possible given new technologies from genetics to engineering and new theoretical models that integrate the different levels of biological organization (from genes to ecology).
Engineering education as a complex system
NASA Astrophysics Data System (ADS)
Gattie, David K.; Kellam, Nadia N.; Schramski, John R.; Walther, Joachim
2011-12-01
This paper presents a theoretical basis for cultivating engineering education as a complex system that will prepare students to think critically and make decisions with regard to poorly understood, ill-structured issues. Integral to this theoretical basis is a solution space construct developed and presented as a benchmark for evaluating problem-solving orientations that emerge within students' thinking as they progress through an engineering curriculum. It is proposed that the traditional engineering education model, while analytically rigorous, is characterised by properties that, although necessary, are insufficient for preparing students to address complex issues of the twenty-first century. A Synthesis and Design Studio model for engineering education is proposed, which maintains the necessary rigor of analysis within a uniquely complex yet sufficiently structured learning environment.
Slow dynamics in glasses: A comparison between theory and experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phillips, J. C.
Minimalist theories of complex systems are broadly of two kinds: mean field and axiomatic. So far, all theories of complex properties absent from simple systems and intrinsic to glasses are axiomatic. Stretched Exponential Relaxation (SER) is the prototypical complex temporal property of glasses, discovered by Kohlrausch 150 years ago, and now observed almost universally in microscopically homogeneous, complex nonequilibrium materials, including luminescent electronic Coulomb glasses. A critical comparison of alternative axiomatic theories with both numerical simulations and experiments strongly favors channeled dynamical trap models over static percolative or energy landscape models. The topics discussed cover those reported since the author's review article in 1996, with an emphasis on parallels between channel bifurcation in electronic and molecular relaxation.
A cardiovascular system model for lower-body negative pressure response
NASA Technical Reports Server (NTRS)
Mitchell, B. A., Jr.; Giese, R. P.
1971-01-01
Mathematical models used to study complex physiological control systems are discussed. Efforts were made to modify a model of the cardiovascular system for use in studying lower body negative pressure. A computer program was written which allows orderly, straightforward expansion to include exercise, metabolism (thermal stress), respiration, and other body functions.
NASA Astrophysics Data System (ADS)
Gorlov, A. P.; Averchenkov, V. I.; Rytov, M. Yu; Eryomenko, V. T.
2017-01-01
The article is concerned with mathematical simulation of the protection-level assessment of complex organizational and technical systems of industrial enterprises through the creation of an automated system whose main functions are: information security (IS) audit, formation of the enterprise threat model, recommendations concerning creation of the information protection system, and generation of a set of organizational and administrative documentation.
Large-scale systems: Complexity, stability, reliability
NASA Technical Reports Server (NTRS)
Siljak, D. D.
1975-01-01
After showing that a complex dynamic system with a competitive structure has highly reliable stability, a class of noncompetitive dynamic systems for which competitive models can be constructed is defined. It is shown that such a construction is possible in the context of the hierarchic stability analysis. The scheme is based on the comparison principle and vector Liapunov functions.
Use of Mechanistic Models to Improve Understanding: Differential, mass balance, process-based Spatial and temporal resolution Necessary simplifications of system complexity Combining field monitoring and modeling efforts Balance between capturing complexity and maintaining...
Modeling software systems by domains
NASA Technical Reports Server (NTRS)
Dippolito, Richard; Lee, Kenneth
1992-01-01
The Software Architectures Engineering (SAE) Project at the Software Engineering Institute (SEI) has developed engineering modeling techniques that both reduce the complexity of software for domain-specific computer systems and result in systems that are easier to build and maintain. These techniques allow maximum freedom for system developers to apply their domain expertise to software. We have applied these techniques to several types of applications, including training simulators operating in real time, engineering simulators operating in non-real time, and real-time embedded computer systems. Our modeling techniques result in software that mirrors both the complexity of the application and the domain knowledge requirements. We submit that the proper measure of software complexity reflects neither the number of software component units nor the code count, but the locus of and amount of domain knowledge. As a result of using these techniques, domain knowledge is isolated by fields of engineering expertise and removed from the concern of the software engineer. In this paper, we will describe kinds of domain expertise, describe engineering by domains, and provide relevant examples of software developed for simulator applications using the techniques.
An Education for Peace Model That Centres on Belief Systems: The Theory behind The Model
ERIC Educational Resources Information Center
Willis, Alison
2017-01-01
The education for peace model (EFPM) presented in this paper was developed within a theoretical framework of complexity science and critical theory and was derived from a review of an empirical research project conducted in a conflict affected environment. The model positions belief systems at the centre and is socioecologically systemic in design…
Simulating the Interactions Among Land Use, Transportation ...
In most transportation studies, computer models that forecast travel behavior statistics for a future year use static projections of the spatial distribution of future population and employment growth as inputs. As a result, they are unable to account for the temporally dynamic and non-linear interactions among transportation, land use, and socioeconomic systems. System dynamics (SD) provides a common framework for modeling the complex interactions among transportation and other related systems. This study uses a SD model to simulate the cascading impacts of a proposed light rail transit (LRT) system in central North Carolina, USA. The Durham-Orange Light Rail Project (D-O LRP) SD model incorporates relationships among the land use, transportation, and economy sectors to simulate the complex feedbacks that give rise to the travel behavior changes forecasted by the region’s transportation model. This paper demonstrates the sensitivity of changes in travel behavior to the proposed LRT system and the assumptions that went into the transportation modeling, and compares those results to the impacts of an alternative fare-free transit system. SD models such as the D-O LRP SD model can complement transportation studies by providing valuable insight into the interdependent community systems that collectively contribute to travel behavior changes. Presented at the 35th International Conference of the System Dynamics Society in Cambridge, MA, July 18th, 2017
Reliability models applicable to space telescope solar array assembly system
NASA Technical Reports Server (NTRS)
Patil, S. A.
1986-01-01
A complex system may consist of a number of subsystems with several components in series, parallel, or a combination of both series and parallel. In order to predict how well the system will perform, it is necessary to know the reliabilities of the subsystems and the reliability of the whole system. The objective of the present study is to develop mathematical models of reliability which are applicable to complex systems. The models are determined by assuming k failures out of n components in a subsystem. By taking k = 1 and k = n, these models reduce to parallel and series models; hence, the models can be specialized to parallel, series combination systems. The models are developed by assuming the failure rates of the components to be functions of time and, as such, can be applied to processes with or without aging effects. The reliability models are further specialized to the Space Telescope Solar Array (STSA) System. The STSA consists of 20 identical solar panel assemblies (SPA's). The reliabilities of the SPA's are determined by the reliabilities of solar cell strings, interconnects, and diodes. The estimates of the reliability of the system for one to five years are calculated by using the reliability estimates of solar cells and interconnects given in ESA documents. Aging effects in relation to breaks in interconnects are discussed.
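The k-failures-out-of-n structure described above reduces to a binomial sum when components are independent and identical. A minimal sketch, assuming a constant failure probability q per component (the report's models use time-dependent failure rates, and `subsystem_reliability` is a hypothetical helper name):

```python
from math import comb

def subsystem_reliability(n, k, q):
    """Probability that fewer than k of n independent components fail,
    i.e. the subsystem survives unless k or more components fail.
    k=1 recovers the series model, k=n the parallel model."""
    return sum(comb(n, i) * q**i * (1 - q)**(n - i) for i in range(k))

q = 0.05  # illustrative per-component failure probability
series = subsystem_reliability(4, 1, q)    # all 4 must work: (1-q)**4
parallel = subsystem_reliability(4, 4, q)  # fails only if all 4 fail: 1 - q**4
```

Because the survival probability is monotone in k, intermediate values of k interpolate between the harsh series limit and the forgiving parallel limit, which is what makes the k-out-of-n family useful for mixed series/parallel subsystems.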
The implementation of a comprehensive PBPK modeling approach resulted in ERDEM, a complex PBPK modeling system. ERDEM provides a scalable and user-friendly environment that enables researchers to focus on data input values rather than writing program code. ERDEM efficiently m...
System Models and Aging: A Driving Example.
ERIC Educational Resources Information Center
Melichar, Joseph F.
Chronological age is a marker in time but it fails to measure accurately the performance or behavioral characteristics of individuals. This paper models the complexity of aging by using a system model and a human function paradigm. These models help facilitate representation of older adults, integrate research agendas, and enhance remediative…
MODELS-3 INSTALLATION PROCEDURES FOR A PC WITH AN NT OPERATING SYSTEM (MODELS-3 VERSION 4.0)
Models-3 is a flexible software system designed to simplify the development and use of air quality models and other environmental decision support tools. It is designed for applications ranging from regulatory and policy analysis to understanding the complex interactions of at...
Models-3 is a flexible system designed to simplify the development and use of air quality models and other environmental decision support tools. It is designed for applications ranging from regulatory and policy analysis to understanding the complex interactions of atmospheric...
USDA-ARS?s Scientific Manuscript database
The complexity of the hydrologic system challenges the development of models. One issue faced during the model development stage is the uncertainty involved in model parameterization. Using a single optimized set of parameters (one snapshot) to represent baseline conditions of the system limits the ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marzouk, Youssef
Predictive simulation of complex physical systems increasingly rests on the interplay of experimental observations with computational models. Key inputs, parameters, or structural aspects of models may be incomplete or unknown, and must be developed from indirect and limited observations. At the same time, quantified uncertainties are needed to qualify computational predictions in the support of design and decision-making. In this context, Bayesian statistics provides a foundation for inference from noisy and limited data, but at prohibitive computational expense. This project intends to make rigorous predictive modeling feasible in complex physical systems, via accelerated and scalable tools for uncertainty quantification, Bayesian inference, and experimental design. Specific objectives are as follows: 1. Develop adaptive posterior approximations and dimensionality reduction approaches for Bayesian inference in high-dimensional nonlinear systems. 2. Extend accelerated Bayesian methodologies to large-scale sequential data assimilation, fully treating nonlinear models and non-Gaussian state and parameter distributions. 3. Devise efficient surrogate-based methods for Bayesian model selection and the learning of model structure. 4. Develop scalable simulation/optimization approaches to nonlinear Bayesian experimental design, for both parameter inference and model selection. 5. Demonstrate these inferential tools on chemical kinetic models in reacting flow, constructing and refining thermochemical and electrochemical models from limited data. Demonstrate Bayesian filtering on canonical stochastic PDEs and in the dynamic estimation of inhomogeneous subsurface properties and flow fields.
Network theory and its applications in economic systems
NASA Astrophysics Data System (ADS)
Huang, Xuqing
This dissertation covers the two major parts of my Ph.D. research: i) developing a theoretical framework of complex networks; and ii) applying complex network models to quantitatively analyze economic systems. In part I, we focus on developing theories of interdependent networks, which includes two chapters: 1) We develop a mathematical framework to study the percolation of interdependent networks under targeted attack and find that when the highly connected nodes are protected and have lower probability to fail, in contrast to single scale-free (SF) networks where the percolation threshold pc = 0, coupled SF networks are significantly more vulnerable, with pc significantly larger than zero. 2) We analytically demonstrate that clustering, which quantifies the propensity for two neighbors of the same vertex to also be neighbors of each other, significantly increases the vulnerability of the system. In part II, we apply complex network models to study economic systems, which also includes two chapters: 1) We study the US corporate governance network, in which nodes represent directors and links connect directors who serve on common company boards, and propose a quantitative measure of information and influence transformation in the network. Thus we are able to identify the most influential directors in the network. 2) We propose a bipartite network model to simulate the risk propagation process among commercial banks during a financial crisis. With empirical bank balance-sheet data from 2007 as input to the model, we find that our model efficiently identifies a significant portion of the actual failed banks reported by the Federal Deposit Insurance Corporation during the financial crisis between 2008 and 2011. The results suggest that complex network models could be useful for systemic risk stress testing of financial systems.
The model also identifies that commercial rather than residential real estate assets are major culprits for the failure of over 350 US commercial banks during 2008 - 2011.
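The vulnerability results above hinge on percolation thresholds. The basic quantity involved can be illustrated with a stdlib-only sketch that measures the giant component of a single Erdős-Rényi graph after random node failures. This is an illustrative toy, not the dissertation's analytical framework for coupled scale-free networks; the graph size and failure fractions are arbitrary choices:

```python
import random
from collections import deque

def er_graph(n, avg_deg, seed=0):
    """Erdos-Renyi random graph as an adjacency dict."""
    rnd = random.Random(seed)
    p = avg_deg / (n - 1)
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if rnd.random() < p:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def giant_fraction_after_failure(adj, fail_frac, seed=0):
    """Fraction of all nodes in the largest component after random failures."""
    rnd = random.Random(seed)
    alive = {v for v in adj if rnd.random() > fail_frac}
    seen, best = set(), 0
    for start in alive:
        if start in seen:
            continue
        queue, size = deque([start]), 0
        seen.add(start)
        while queue:                      # breadth-first component sweep
            v = queue.popleft()
            size += 1
            for w in adj[v]:
                if w in alive and w not in seen:
                    seen.add(w)
                    queue.append(w)
        best = max(best, size)
    return best / len(adj)

g = er_graph(2000, 4.0)
mild = giant_fraction_after_failure(g, 0.10)    # well below the threshold
severe = giant_fraction_after_failure(g, 0.90)  # far past the threshold
```

For a mean degree of 4, percolation theory predicts the giant component of a single random graph collapses only once roughly 75% of nodes have failed; the point of the interdependent-network results above is that coupled networks collapse far sooner.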
Spatial operator algebra for flexible multibody dynamics
NASA Technical Reports Server (NTRS)
Jain, A.; Rodriguez, G.
1993-01-01
This paper presents an approach to modeling the dynamics of flexible multibody systems such as flexible spacecraft and limber space robotic systems. A large number of degrees of freedom and complex dynamic interactions are typical in these systems. This paper uses spatial operators to develop efficient recursive algorithms for the dynamics of these systems. This approach very efficiently manages complexity by means of a hierarchy of mathematical operations.
T-MATS Toolbox for the Modeling and Analysis of Thermodynamic Systems
NASA Technical Reports Server (NTRS)
Chapman, Jeffryes W.
2014-01-01
The Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) is a MATLAB/Simulink (The MathWorks Inc.) plug-in for creating and simulating thermodynamic systems and controls. The package contains generic parameterized components that can be combined with a variable input iterative solver and optimization algorithm to create complex system models, such as gas turbines.
From Brown-Peterson to continual distractor via operation span: A SIMPLE account of complex span.
Neath, Ian; VanWormer, Lisa A; Bireta, Tamra J; Surprenant, Aimée M
2014-09-01
Three memory tasks (Brown-Peterson, complex span, and continual distractor) all alternate presentation of a to-be-remembered item and a distractor activity, but each task is associated with a different memory system: short-term memory, working memory, and long-term memory, respectively. SIMPLE, a relative local distinctiveness model, has previously been fit to data from both the Brown-Peterson and continual distractor tasks; here we use the same version of the model to fit data from a complex span task. Despite the many differences between the tasks, including unpredictable list length, SIMPLE fit the data well. Because SIMPLE posits a single memory system, these results constitute yet another demonstration that performance on tasks originally thought to tap different memory systems can be explained without invoking multiple memory systems.
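The relative local distinctiveness computation at the heart of SIMPLE can be sketched in a few lines. This is a simplified illustration of the model's log-scale similarity rule, not the authors' fitted implementation; the value of `c` and the retention intervals are arbitrary:

```python
import math

def simple_distinctiveness(times_since_study, c=10.0):
    """SIMPLE-style relative local distinctiveness of each list item.

    times_since_study: temporal distance (s) of each item at recall.
    c: distinctiveness parameter (value here is arbitrary, not a fit).
    """
    logs = [math.log(t) for t in times_since_study]
    out = []
    for i, li in enumerate(logs):
        sims = [math.exp(-c * abs(li - lj)) for lj in logs]
        out.append(sims[i] / sum(sims))  # self-similarity / total similarity
    return out

# Six items studied 2 s apart and probed right after the last one:
d = simple_distinctiveness([12.0, 10.0, 8.0, 6.0, 4.0, 2.0])
# Log compression makes the most recent item highly distinct (recency).
```

Because items are compared on a logarithmic time scale, recently studied items are well separated from their neighbors while older items crowd together, which is how a single mechanism can span tasks with very different retention intervals.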
Adjoint equations and analysis of complex systems: Application to virus infection modelling
NASA Astrophysics Data System (ADS)
Marchuk, G. I.; Shutyaev, V.; Bocharov, G.
2005-12-01
Recent development of applied mathematics is characterized by ever increasing attempts to apply modelling and computational approaches across various areas of the life sciences. The need for a rigorous analysis of complex system dynamics in immunology has been recognized for more than three decades. The aim of the present paper is to draw attention to the method of adjoint equations. The methodology enables one to obtain information about physical processes and examine the sensitivity of complex dynamical systems. This provides a basis for a better understanding of the causal relationships between the immune system's performance and its parameters and helps to improve the experimental design in the solution of applied problems. We show how the adjoint equations can be used to explain the changes in hepatitis B virus infection dynamics between individual patients.
The Difference between Uncertainty and Information, and Why This Matters
NASA Astrophysics Data System (ADS)
Nearing, G. S.
2016-12-01
Earth science investigation and arbitration (for decision making) are very often organized around a concept of uncertainty. It seems relatively straightforward that the purpose of our science is to reduce uncertainty about how environmental systems will react and evolve under different conditions. I propose here that approaching a science of complex systems as a process of quantifying and reducing uncertainty is a mistake, and specifically a mistake that is rooted in certain rather historic logical errors. Instead I propose that we should be asking questions about information. I argue here that an information-based perspective facilitates almost trivial answers to environmental science questions that are either difficult or theoretically impossible to answer when posed as questions about uncertainty. In particular, I propose that an information-centric perspective leads to: (1) coherent and non-subjective hypothesis tests for complex system models; (2) process-level diagnostics for complex systems models; (3) methods for building complex systems models that allow for inductive inference without the need for a priori specification of likelihood functions or ad hoc error metrics; and (4) asymptotically correct quantification of epistemic uncertainty. To put this in slightly more basic terms, I propose that an information-theoretic philosophy of science has the potential to resolve certain important aspects of the Demarcation Problem and the Duhem-Quine Problem, and that Hydrology and other Earth Systems Sciences can immediately capitalize on this to address some of our most difficult and persistent problems.
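One concrete information-theoretic quantity behind this line of argument is the divergence between a model-conditioned distribution and an uninformed baseline, sketched here with hypothetical numbers (the flow classes and probabilities are illustrative assumptions, not from the abstract):

```python
import math

def kl_divergence_bits(p, q):
    """Information gained (bits) when distribution p replaces baseline q."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

climatology = [0.25, 0.25, 0.25, 0.25]  # uninformed baseline over 4 flow classes
forecast    = [0.70, 0.20, 0.05, 0.05]  # distribution conditioned on a model

gain = kl_divergence_bits(forecast, climatology)  # about 0.74 bits
```

Framed this way, the question "how much does the model tell us?" has a direct numerical answer in bits, which is the kind of trivially computable diagnostic the abstract contrasts with uncertainty-based hypothesis testing.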
Statistical Analysis of Complexity Generators for Cost Estimation
NASA Technical Reports Server (NTRS)
Rowell, Ginger Holmes
1999-01-01
Predicting the cost of cutting edge new technologies involved with spacecraft hardware can be quite complicated. A new feature of the NASA Air Force Cost Model (NAFCOM), called the Complexity Generator, is being developed to model the complexity factors that drive the cost of space hardware. This parametric approach is also designed to account for the differences in cost, based on factors that are unique to each system and subsystem. The cost driver categories included in this model are weight, inheritance from previous missions, technical complexity, and management factors. This paper explains the Complexity Generator framework, the statistical methods used to select the best model within this framework, and the procedures used to find the region of predictability and the prediction intervals for the cost of a mission.
Observation-Driven Configuration of Complex Software Systems
NASA Astrophysics Data System (ADS)
Sage, Aled
2010-06-01
The ever-increasing complexity of software systems makes them hard to comprehend, predict and tune due to emergent properties and non-deterministic behaviour. Complexity arises from the size of software systems and the wide variety of possible operating environments: the increasing choice of platforms and communication policies leads to ever more complex performance characteristics. In addition, software systems exhibit different behaviour under different workloads. Many software systems are designed to be configurable so that policies can be chosen to meet the needs of various stakeholders. For complex software systems it can be difficult to accurately predict the effects of a change and to know which configuration is most appropriate. This thesis demonstrates that it is useful to run automated experiments that measure a selection of system configurations. Experiments can find configurations that meet the stakeholders' needs, find interesting behavioural characteristics, and help produce predictive models of the system's behaviour. The design and use of ACT (Automated Configuration Tool) for running such experiments is described, in combination with a number of search strategies for deciding on the configurations to measure. Design Of Experiments (DOE) is discussed, with emphasis on Taguchi Methods. These statistical methods have been used extensively in manufacturing, but have not previously been used for configuring software systems. The novel contribution here is an industrial case study, applying the combination of ACT and Taguchi Methods to DC-Directory, a product from Data Connection Ltd (DCL). The case study investigated the applicability of Taguchi Methods for configuring complex software systems. Taguchi Methods were found to be useful for modelling and configuring DC-Directory, making them a valuable addition to the techniques available to system administrators and developers.
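Taguchi-style experimentation rests on orthogonal arrays, which cover factor combinations with very few runs. A minimal sketch using the standard L4 array and a main-effects analysis shows the idea; the throughput figures are hypothetical, and this is not the ACT implementation:

```python
# L4 orthogonal array for three two-level factors: in every pair of
# columns, each level combination appears equally often.
L4 = [
    (0, 0, 0),
    (0, 1, 1),
    (1, 0, 1),
    (1, 1, 0),
]

def main_effects(array, responses):
    """Average response difference between level 1 and level 0, per factor."""
    k = len(array[0])
    effects = []
    for f in range(k):
        hi = [r for row, r in zip(array, responses) if row[f] == 1]
        lo = [r for row, r in zip(array, responses) if row[f] == 0]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

# Hypothetical measured throughput for the four configurations:
throughput = [120.0, 150.0, 90.0, 110.0]
effects = main_effects(L4, throughput)
# Here factor 0 hurts throughput, factor 1 helps, factor 2 is near-neutral.
```

Four runs instead of the 2^3 = 8 of a full factorial is the whole appeal when each "run" is an expensive configuration-and-measure cycle of a software system.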
Fuzzy Edge Connectivity of Graphical Fuzzy State Space Model in Multi-connected System
NASA Astrophysics Data System (ADS)
Harish, Noor Ainy; Ismail, Razidah; Ahmad, Tahir
2010-11-01
Structured networks of interacting components illustrate complex structure in a direct or intuitive way. Graph theory provides a mathematical modeling for studying interconnection among elements in natural and man-made systems. On the other hand, a directed graph is useful to define and interpret the interconnection structure underlying the dynamics of the interacting subsystems. Fuzzy theory provides important tools for dealing with various aspects of complexity, imprecision and fuzziness of the network structure of a multi-connected system. Initial development of systems of the Fuzzy State Space Model (FSSM) and a fuzzy algorithm approach were introduced with the purpose of solving inverse problems in multivariable systems. In this paper, the fuzzy algorithm is adapted in order to determine the fuzzy edge connectivity between subsystems, in particular in an interconnected system of the Graphical Representation of FSSM. This new approach will simplify the schematic diagram of interconnection of subsystems in a multi-connected system.
Systemic Analysis Approaches for Air Transportation
NASA Technical Reports Server (NTRS)
Conway, Sheila
2005-01-01
Air transportation system designers have had only limited success using traditional operations research and parametric modeling approaches in their analyses of innovations. They need a systemic methodology for modeling of safety-critical infrastructure that is comprehensive, objective, and sufficiently concrete, yet simple enough to be used with reasonable investment. The methodology must also be amenable to quantitative analysis so issues of system safety and stability can be rigorously addressed. However, air transportation has proven itself an extensive, complex system whose behavior is difficult to describe, much less predict. There is a wide range of system analysis techniques available, but some are more appropriate for certain applications than others. Specifically in the area of complex system analysis, the literature suggests that both agent-based models and network analysis techniques may be useful. This paper discusses the theoretical basis for each approach in these applications, and explores their historic and potential further use for air transportation analysis.
Designing To Learn about Complex Systems.
ERIC Educational Resources Information Center
Hmelo, Cindy E.; Holton, Douglas L.; Kolodner, Janet L.
2000-01-01
Notes that complex structural, behavioral, and functional relations are central to understanding complex systems. Reports on a design experiment in which 6th grade children learned about the human respiratory system by designing artificial lungs and building partial working models. Makes suggestions for successful learning from design activities. (Contains 44…
Profiling Bioactivity of the ToxCast Chemical Library Using BioMAP Primary Human Cell Systems
The complexity of human biology has made prediction of health effects as a consequence of exposure to environmental chemicals especially challenging. Complex cell systems, such as the Biologically Multiplexed Activity Profiling (BioMAP) primary, human, cell-based disease models, ...
Hierarchical Control Using Networks Trained with Higher-Level Forward Models
Wayne, Greg; Abbott, L.F.
2015-01-01
We propose and develop a hierarchical approach to network control of complex tasks. In this approach, a low-level controller directs the activity of a “plant,” the system that performs the task. However, the low-level controller may only be able to solve fairly simple problems involving the plant. To accomplish more complex tasks, we introduce a higher-level controller that controls the lower-level controller. We use this system to direct an articulated truck to a specified location through an environment filled with static or moving obstacles. The final system consists of networks that have memorized associations between the sensory data they receive and the commands they issue. These networks are trained on a set of optimal associations that are generated by minimizing cost functions. Cost function minimization requires predicting the consequences of sequences of commands, which is achieved by constructing forward models, including a model of the lower-level controller. The forward models and cost minimization are only used during training, allowing the trained networks to respond rapidly. In general, the hierarchical approach can be extended to larger numbers of levels, dividing complex tasks into more manageable sub-tasks. The optimization procedure and the construction of the forward models and controllers can be performed in similar ways at each level of the hierarchy, which allows the system to be modified to perform other tasks, or to be extended for more complex tasks without retraining lower levels. PMID:25058706
Promoting evaluation capacity building in a complex adaptive system.
Lawrenz, Frances; Kollmann, Elizabeth Kunz; King, Jean A; Bequette, Marjorie; Pattison, Scott; Nelson, Amy Grack; Cohn, Sarah; Cardiel, Christopher L B; Iacovelli, Stephanie; Eliou, Gayra Ostgaard; Goss, Juli; Causey, Lauren; Sinkey, Anne; Beyer, Marta; Francisco, Melanie
2018-04-10
This study provides results from an NSF-funded, four-year case study about evaluation capacity building in a complex adaptive system, the Nanoscale Informal Science Education Network (NISE Net). The results of the Complex Adaptive Systems as a Model for Network Evaluations (CASNET) project indicate that complex adaptive system concepts help to explain evaluation capacity building in a network. The NISE Network was found to be a complex learning system that was supportive of evaluation capacity building through feedback loops that provided for information sharing and interaction. Participants in the system had different levels and sources of evaluation knowledge. To be successful at building capacity, the system needed to have a balance between centralized and decentralized control, coherence, redundancy, and diversity. Embeddedness of individuals within the system also provided support and moved the capacity of the system forward. Finally, success depended on attention being paid to the control of resources. Implications of these findings are discussed. Copyright © 2018 Elsevier Ltd. All rights reserved.
Socio-Environmental Resilience and Complex Urban Systems Modeling
NASA Astrophysics Data System (ADS)
Deal, Brian; Petri, Aaron; Pan, Haozhi; Goldenberg, Romain; Kalantari, Zahra; Cvetkovic, Vladimir
2017-04-01
The increasing pressure of climate change has inspired two normative agendas: socio-technical transitions and socio-ecological resilience, both sharing a complex-systems epistemology (Gillard et al. 2016). Socio-technical solutions include a continuous, massive data gathering exercise now underway in urban places under the guise of developing a 'smart'(er) city. This has led to the creation of data-rich environments where large data sets have become central to monitoring and forming a response to anomalies. Some have argued that these kinds of data sets can help in planning for resilient cities (Norberg and Cumming 2008; Batty 2013). In this paper, we focus on a more nuanced, ecologically based, socio-environmental perspective of resilience planning that is often given less consideration. Here, we broadly discuss (and model) the tightly linked, mutually influenced social and biophysical subsystems that are critical for understanding urban resilience. We argue for the need to incorporate these subsystem linkages into the resilience planning lexicon through the integration of systems models and planning support systems. We make our case by first providing a context for urban resilience from a socio-ecological and planning perspective. We highlight the data needs for this type of resilience planning and compare them to currently collected data streams in various smart city efforts. This helps to define an approach for operationalizing socio-environmental resilience planning using robust systems models and planning support systems. For this, we draw from our experiences in coupling a spatio-temporal land use model (the Landuse Evolution and impact Assessment Model (LEAM)) with water quality and quantity models in Stockholm, Sweden. We describe the coupling of these systems models using a robust Planning Support System (PSS) structural framework.
We use the coupled model simulations and PSS to analyze the connection between urban land use transformation (social) and water (environmental) systems within the context of planning for a more resilient Stockholm. This work shows that complex urban systems models can help bridge the divide between socio-technological and socio-environmental systems knowledge and achieving resilient urban areas.
NASA Astrophysics Data System (ADS)
Hrachowitz, Markus; Fovet, Ophelie; Ruiz, Laurent; Gascuel-Odoux, Chantal; Savenije, Hubert
2014-05-01
Hydrological models are frequently characterized by what is often considered to be adequate calibration performance. In many cases, however, these models experience a substantial decrease in performance and increase in uncertainty in validation periods, thus resulting in poor predictive power. Besides the likely presence of data errors, this observation can point towards wrong or insufficient representations of the underlying processes and their heterogeneity. In other words, right results are generated for the wrong reasons. Thus ways are sought to increase model consistency and to thereby satisfy the contrasting priorities of the need (a) to increase model complexity and (b) to limit model equifinality. In this study a stepwise model development approach is chosen to test the value of an exhaustive and systematic combined use of hydrological signatures, expert knowledge and readily available, yet anecdotal and rarely exploited, hydrological information for increasing model consistency towards generating the right answer for the right reasons. A simple 3-box, 7-parameter conceptual HBV-type model, constrained by 4 calibration objective functions, was able to adequately reproduce the hydrograph with comparatively high values for the 4 objective functions in the 5-year calibration period. However, closer inspection of the results showed a dramatic decrease of model performance in the 5-year validation period. In addition, assessing the model's skill to reproduce a range of 20 hydrological signatures including, amongst others, the flow duration curve, the autocorrelation function and the rising limb density, showed that it could not adequately reproduce the vast majority of these signatures, indicating a lack of model consistency. Subsequently, model complexity was increased in a stepwise way to allow for more process heterogeneity.
To limit model equifinality, the increase in complexity was counter-balanced by a stepwise application of "realism constraints", inferred from expert knowledge (e.g. the unsaturated storage capacity of hillslopes should exceed that of wetlands) and anecdotal hydrological information (e.g. long-term estimates of actual evaporation obtained from the Budyko framework and long-term estimates of baseflow contribution) to ensure that the model is well behaved with respect to the modeller's perception of the system. A total of 11 model set-ups with increased complexity and an increased number of realism constraints were tested. It could be shown that, in spite of largely unchanged calibration performance compared to the simplest set-up, the most complex model set-up (12 parameters, 8 constraints) exhibited significantly increased performance in the validation period while uncertainty did not increase. In addition, the most complex model was characterized by a substantially increased skill to reproduce all 20 signatures, indicating a more suitable representation of the system. The results suggest that a model "well" constrained by 4 calibration objective functions may still be an inadequate representation of the system and that increasing model complexity, if counter-balanced by realism constraints, can indeed increase the predictive performance of a model and its skill to reproduce a range of hydrological signatures, without necessarily increasing uncertainty. The results also strongly illustrate the need to move away from automated model calibration towards a more general expert-knowledge-driven strategy of constraining models if a certain level of model consistency is to be achieved.
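A conceptual bucket model of the kind extended stepwise in this study can be sketched minimally as follows. This is a generic one-reservoir toy, far simpler than the 3-box HBV-type model described, and all parameter values are arbitrary:

```python
def run_bucket(precip, pet, s_max=100.0, k=0.05, s0=20.0):
    """Minimal one-bucket conceptual rainfall-runoff sketch.

    precip, pet: daily precipitation and potential evaporation (mm).
    s_max: storage capacity (mm); k: linear outflow coefficient (1/day).
    Returns daily discharge (mm/day).
    """
    s, q_out = s0, []
    for p, e in zip(precip, pet):
        s += p
        s -= min(e, s)            # actual evaporation limited by storage
        overflow = max(0.0, s - s_max)
        s = min(s, s_max)
        q = k * s + overflow      # slow drainage plus saturation excess
        s -= k * s
        q_out.append(q)
    return q_out

q = run_bucket([0.0, 30.0, 0.0, 0.0, 10.0], [2.0, 2.0, 2.0, 2.0, 2.0])
```

Each added box or constraint in the study corresponds to splitting such a reservoir (e.g. hillslope vs. wetland storage) or bounding its parameters, which is why complexity and equifinality must be traded off explicitly.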
NASA Technical Reports Server (NTRS)
Torres-Pomales, Wilfredo
2015-01-01
This report documents a case study on the application of Reliability Engineering techniques to achieve an optimal balance between performance and robustness by tuning the functional parameters of a complex non-linear control system. For complex systems with intricate and non-linear patterns of interaction between system components, analytical derivation of a mathematical model of system performance and robustness in terms of functional parameters may not be feasible or cost-effective. The demonstrated approach is simple, structured, effective, repeatable, and cost and time efficient. This general approach is suitable for a wide range of systems.
NASA Technical Reports Server (NTRS)
Handley, Thomas H., Jr.; Preheim, Larry E.
1990-01-01
Data systems requirements in the Earth Observing System (EOS) Space Station Freedom (SSF) eras indicate increasing data volume, increased discipline interplay, higher complexity and broader data integration and interpretation. A response to the needs of the interdisciplinary investigator is proposed, considering the increasing complexity and rising costs of scientific investigation. The EOS Data Information System, conceived to be a widely distributed system with reliable communication links between central processing and the science user community, is described. Details are provided on information architecture, system models, intelligent data management of large complex databases, and standards for archiving ancillary data, using a research library, a laboratory and collaboration services.
Biocellion: accelerating computer simulation of multicellular biological system models
Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya
2014-01-01
Motivation: Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. Results: We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Availability and implementation: Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. Contact: seunghwa.kang@pnnl.gov PMID:25064572
Virtual Construction of Space Habitats: Connecting Building Information Models (BIM) and SysML
NASA Technical Reports Server (NTRS)
Polit-Casillas, Raul; Howe, A. Scott
2013-01-01
Current trends in design, construction and management of complex projects make use of Building Information Models (BIM), connecting different types of data to geometrical models. This information model allows different types of analysis beyond pure graphical representations. Space habitats, regardless of their size, are also complex systems that require the synchronization of many types of information and disciplines beyond mass, volume, power or other basic volumetric parameters. For this, state-of-the-art model-based systems engineering languages and processes - for instance SysML - represent a solid way to tackle this problem from a programmatic point of view. Nevertheless, integrating this with a powerful geometrical architectural design tool with BIM capabilities could represent a change in the workflow and paradigm of space habitat design, applicable to other complex aerospace systems. This paper shows some general findings and overall conclusions based on ongoing research to create a design protocol and method that practically connects a systems engineering approach with BIM-based architectural and engineering design as a complete Model Based Engineering approach. Therefore, one hypothetical example is created and followed during the design process. To make this possible, this research also tackles the application of IFC categories and parameters in the aerospace field, starting with space habitat design as a way to understand the information flow between disciplines and tools. By building virtual space habitats we can potentially improve, in the near future, the way more complex designs are developed from very little initial detail, from concept to manufacturing.
A causal framework for integrating contemporary and Vedic holism.
Kineman, John J
2017-12-01
Whereas the last century of science was characterized by epistemological uncertainty, the current century will likely be characterized by ontological complexity (Gorban and Yablonsky, 2013). Advances in Systems Theory by mathematical biologist Robert Rosen suggest an elegant way forward (Rosen, 2013). "R-theory" (Kineman, 2012) is a synthesis of Rosen's theories explaining complexity and life through a meta-model for 'whole' systems (and their fractions) in terms of "5th-order holons". Such holons are Rosen "modeling relations" relating system-dependent processes with their formative contexts via closed cycles of four archetypal (Aristotelian) causes. This approach has post-predicted the three most basic taxa of life, plus a quasi-organismic form that may describe proto, component, and ecosystemic life. R-theory thus suggests a fundamentally complex ontology of existence, inverting the current view that complexity arises from simple mechanisms. This model of cyclical causality corresponds to the ancient meta-model described in the Vedas and Upanishads of India. Part I of this discussion (Kineman, 2016a) presented a case for associating Vedic philosophy with Harappan civilization, allowing interpretation of ancient concepts of "cosmic order" (Rta) in the Rig Veda, nonduality (advaita), seven-fold beingness (saptanna) and other forms of holism appearing later in the Upanishads. By deciphering the model of wholeness that was applied and tested in ancient times, it is possible to compare, test, and confirm the holon model as a mathematical definition of life, systemic wholeness, and sustainability that may be applied today in modern terms, even as a foundation for holistic science. Copyright © 2017 Elsevier Ltd. All rights reserved.
Real-time biomimetic Central Pattern Generators in an FPGA for hybrid experiments
Ambroise, Matthieu; Levi, Timothée; Joucla, Sébastien; Yvert, Blaise; Saïghi, Sylvain
2013-01-01
This investigation of the leech heartbeat neural network system led to the development of a low-resource, real-time, biomimetic digital hardware for use in hybrid experiments. The leech heartbeat neural network is one of the simplest central pattern generators (CPGs). In biology, CPGs provide the rhythmic bursts of spikes that form the basis for all muscle contraction orders (heartbeat) and locomotion (walking, running, etc.). The leech neural network system was previously investigated and this CPG formalized using the Hodgkin–Huxley neural model (HH), the most complex devised to date. However, the resources required for a neural model are proportional to its complexity. In response to this issue, this article describes a biomimetic implementation of a network of 240 CPGs in an FPGA (Field Programmable Gate Array), using a simple model (Izhikevich), and proposes a new synapse model: the activity-dependent depression synapse. The network implementation architecture operates on a single computation core. This digital system works in real time, requires few resources, and has the same bursting activity behavior as the complex model. The implementation of this CPG was initially validated by comparing it with a simulation of the complex model. Its activity was then matched with pharmacological data from rat spinal cord activity. This digital system opens the way for future hybrid experiments and represents an important step toward hybridization of biological tissue and artificial neural networks. This CPG network is also likely to be useful for mimicking the locomotion activity of various animals and developing hybrid experiments for neuroprosthesis development. PMID:24319408
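The Izhikevich model chosen here for its low resource cost is defined by two coupled equations plus a reset rule; a forward-Euler sketch shows why it is so cheap compared with Hodgkin-Huxley. The regular-spiking parameters follow Izhikevich's 2003 formulation, but the input currents and time step are illustrative choices, and this is not the article's FPGA implementation:

```python
def izhikevich(I, a=0.02, b=0.2, c=-65.0, d=8.0, t_max=200.0, dt=0.5):
    """Izhikevich (2003) neuron, regular-spiking parameters, forward Euler.

    I: constant input current (dimensionless model units).
    Returns spike times in ms.
    """
    v, u = -65.0, b * -65.0          # membrane potential and recovery variable
    spikes = []
    for n in range(int(t_max / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                # spike detected: apply the reset rule
            spikes.append(n * dt)
            v, u = c, u + d
    return spikes

quiet = izhikevich(I=0.0)    # subthreshold input: no spikes
active = izhikevich(I=10.0)  # sustained input: repetitive firing
```

Two multiply-accumulate updates per time step per neuron, versus the stiff multi-variable gating equations of HH, is what makes 240 such units on one computation core plausible.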
Modeling of Wildlife-Associated Zoonoses: Applications and Caveats
Lewis, Bryan L.; Marathe, Madhav; Eubank, Stephen; Blackburn, Jason K.
2012-01-01
Wildlife species are identified as an important source of emerging zoonotic disease. Accordingly, public health programs have attempted to expand in scope to include a greater focus on wildlife and its role in zoonotic disease outbreaks. Zoonotic disease transmission dynamics involving wildlife are complex and nonlinear, presenting a number of challenges. First, empirical characterization of wildlife host species and pathogen systems is often lacking, and insight into one system may have little application to another involving the same host species and pathogen. Characterizing pathogen transmission is difficult because the population size and density of wildlife hosts change over time. Infectious disease itself may influence wildlife population demographics through compensatory responses that may evolve, such as decreased age at reproduction. Furthermore, wildlife reservoir dynamics can be complex, involving various host species and populations that may vary in their contribution to pathogen transmission and persistence over space and time. Mathematical models can provide an important tool for engaging these complex systems, and there is an urgent need for increased computational focus on the coupled dynamics that underlie pathogen spillover at the human–wildlife interface. Often, however, scientists conducting empirical studies on emerging zoonotic disease do not have the necessary skill base to choose, develop, and apply models to evaluate these complex systems. How do modeling frameworks differ, and what considerations are important when applying modeling tools to the study of zoonotic disease? Using zoonotic disease examples, we provide an overview of several common approaches and general considerations important in the modeling of wildlife-associated zoonoses. PMID:23199265
Information modeling system for blast furnace control
NASA Astrophysics Data System (ADS)
Spirin, N. A.; Gileva, L. Y.; Lavrov, V. V.
2016-09-01
Modern iron and steel works are, as a rule, equipped with powerful distributed control systems (DCS) and databases. A DCS solves the problems of storing, controlling, protecting, entering, editing, and retrieving information, as well as generating the required reporting data. The most advanced and promising approach is to use decision-support information technologies based on a complex of mathematical models. A model-based decision support system for control of blast furnace smelting has been designed and put into operation. The basis of the system is a complex of mathematical models created using the principle of natural mathematical modeling, which provides for the construction of models at two levels. The first-level model is a basic state model that makes it possible to assess the vector of system parameters using field data and blast furnace operation results; it is also used to calculate the adjustment (adaptation) coefficients of the system's predictive block. The second-level model is a predictive model designed to assess the design parameters of the blast furnace process when melting conditions change relative to the current state. The tasks for which software has been developed are described, and the characteristics of the main subsystems of the blast furnace process as an object of modeling and control are presented: the thermal state of the furnace and the blast, gas-dynamic, and slag conditions of blast furnace smelting.
Batch-mode Reinforcement Learning for improved hydro-environmental systems management
NASA Astrophysics Data System (ADS)
Castelletti, A.; Galelli, S.; Restelli, M.; Soncini-Sessa, R.
2010-12-01
Despite the great progress made in recent decades, the optimal management of hydro-environmental systems remains a very active and challenging research area. The combination of multiple, often conflicting interests, strong non-linearities in the physical processes and the management objectives, strong uncertainties in the inputs, and a high-dimensional state makes the problem challenging and intriguing. Stochastic Dynamic Programming (SDP) is one of the most suitable methods for designing (Pareto-)optimal management policies while preserving the original problem complexity. However, it suffers from a dual curse which, de facto, prevents its practical application to even reasonably complex water systems. (i) Computational requirements grow exponentially with the state and control dimensions (Bellman's curse of dimensionality), so SDP cannot be used for water systems whose state vector includes more than a few (2-3) components. (ii) An explicit model of each system component is required (the curse of modelling) to anticipate the effects of system transitions: any information included in the SDP framework must be either a state variable described by a dynamic model or a stochastic disturbance, independent in time, with an associated pdf. Exogenous information that could effectively improve system operation cannot be explicitly considered in taking the management decision unless a dynamic model is identified for each additional piece of information, which adds to the problem complexity through the curse of dimensionality (additional state variables). To mitigate this dual curse, the combined use of batch-mode Reinforcement Learning (bRL) and Dynamic Model Reduction (DMR) techniques is explored in this study. bRL overcomes the curse of modelling by replacing explicit modelling with an external simulator and/or historical observations. The curse of dimensionality is averted by approximating the SDP value function with suitable non-linear regressors. DMR reduces the complexity and the associated computational requirements of non-linear, distributed, process-based models, making them suitable for inclusion in optimization schemes. Results from real-world applications of the approach are also presented, including reservoir operation with both quality and quantity targets.
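The SDP framework that the abstract takes as its starting point can be illustrated with plain value iteration on a toy single-reservoir problem. Everything below (the storage levels, the unit inflow, the reward equal to the release) is an invented miniature, not the authors' bRL/DMR method, which precisely replaces this kind of explicit model with a simulator and regressors:

```python
# Value iteration for a toy deterministic reservoir (illustrative only).
def value_iteration(n_states, actions, reward, step, gamma=0.9, iters=200):
    """Bellman backups until (approximate) convergence; returns the value list."""
    V = [0.0] * n_states
    for _ in range(iters):
        V = [max(reward(s, a) + gamma * V[step(s, a)] for a in actions(s))
             for s in range(n_states)]
    return V

# Storage s in 0..4, unit inflow each period; releasing a units earns reward a,
# and releases are capped at min(s, 2).
V = value_iteration(
    n_states=5,
    actions=lambda s: range(min(s, 2) + 1),
    reward=lambda s, a: float(a),
    step=lambda s, a: min(s - a + 1, 4),
)
```

The curse of dimensionality the abstract describes appears as soon as the single storage variable becomes a vector: the state list `range(n_states)` becomes a Cartesian product whose size grows exponentially with the number of reservoirs.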
Saving Human Lives: What Complexity Science and Information Systems can Contribute
NASA Astrophysics Data System (ADS)
Helbing, Dirk; Brockmann, Dirk; Chadefaux, Thomas; Donnay, Karsten; Blanke, Ulf; Woolley-Meza, Olivia; Moussaid, Mehdi; Johansson, Anders; Krause, Jens; Schutte, Sebastian; Perc, Matjaž
2015-02-01
We discuss models and data of crowd disasters, crime, terrorism, war and disease spreading to show that conventional recipes, such as deterrence strategies, are often not effective and sufficient to contain them. Many common approaches do not provide a good picture of the actual system behavior, because they neglect feedback loops, instabilities and cascade effects. The complex and often counter-intuitive behavior of social systems and their macro-level collective dynamics can be better understood by means of complexity science. We highlight that a suitable system design and management can help to stop undesirable cascade effects and to enable favorable kinds of self-organization in the system. In such a way, complexity science can help to save human lives.
Kinetics and mechanism of olefin catalytic hydroalumination by organoaluminum compounds
NASA Astrophysics Data System (ADS)
Koledina, K. F.; Gubaidullin, I. M.
2016-05-01
The complex reaction mechanism of α-olefin catalytic hydroalumination by alkylalanes is investigated via mathematical modeling, which involves constructing kinetic models for the individual reactions that make up the complex system and studying their behavior separately. Kinetic parameters of olefin catalytic hydroalumination are estimated. Activation energies of the possible steps in the proposed complex reaction mechanisms are compared, and possible reaction pathways are determined.
Some aspects of mathematical and chemical modeling of complex chemical processes
NASA Technical Reports Server (NTRS)
Nemes, I.; Botar, L.; Danoczy, E.; Vidoczy, T.; Gal, D.
1983-01-01
Some theoretical questions involved in the mathematical modeling of the kinetics of complex chemical processes are discussed. The analysis is carried out for the homogeneous oxidation of ethylbenzene in the liquid phase. Particular attention is given to the determination of the general characteristics of chemical systems from an analysis of mathematical models developed on the basis of linear algebra.
Approaching human language with complex networks.
Cong, Jin; Liu, Haitao
2014-12-01
Interest in modeling and analyzing human language with complex networks has risen in recent years, and a considerable body of research in this area has accumulated. We survey three major lines of linguistic research from the complex network approach: 1) characterization of human language as a multi-level system with complex network analysis; 2) linguistic typological research with the application of linguistic networks and their quantitative measures; and 3) relationships between the system-level complexity of human language (determined by the topology of linguistic networks) and microscopic linguistic (e.g., syntactic) features (the traditional concern of linguistics). We show that the models and quantitative tools of complex networks, when exploited properly, can constitute an operational methodology for linguistic inquiry, which contributes to the understanding of human language and the development of linguistics. We conclude our review with suggestions for future linguistic research from the complex network approach: 1) relationships between the system-level complexity of human language and microscopic linguistic features; 2) expansion of research scope from the global properties to other levels of granularity of linguistic networks; and 3) combination of linguistic network analysis with other quantitative studies of language (such as quantitative linguistics). Copyright © 2014 Elsevier B.V. All rights reserved.
State Machine Modeling of the Space Launch System Solid Rocket Boosters
NASA Technical Reports Server (NTRS)
Harris, Joshua A.; Patterson-Hine, Ann
2013-01-01
The Space Launch System is a Shuttle-derived heavy-lift vehicle currently in development to serve as NASA's premier launch vehicle for space exploration. The Space Launch System is a multistage rocket with two Solid Rocket Boosters and multiple payloads, including the Multi-Purpose Crew Vehicle. Planned Space Launch System destinations include near-Earth asteroids, the Moon, Mars, and Lagrange points. The Space Launch System is a complex system with many subsystems, requiring considerable systems engineering and integration. To this end, state machine analysis offers a method to support engineering and operational efforts, identify and avert undesirable or potentially hazardous system states, and evaluate system requirements. Finite State Machines model a system as a finite number of states, with transitions between states controlled by state-based and event-based logic. State machines are a useful tool for understanding complex system behaviors and evaluating "what-if" scenarios. This work contributes to a state machine model of the Space Launch System developed at NASA Ames Research Center. The Space Launch System Solid Rocket Booster avionics and ignition subsystems are modeled using MATLAB/Stateflow software. This model is integrated into a larger model of Space Launch System avionics used for verification and validation of Space Launch System operating procedures and design requirements. This includes testing both nominal and off-nominal system states and command sequences.
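The event-driven finite state machine described in this abstract can be sketched in a few lines. The states and events below ("safed", "armed", "ignite") are invented for illustration and are not taken from the actual SLS avionics model, which is built in MATLAB/Stateflow:

```python
# Minimal event-driven finite state machine: a transition table plus a guard
# that rejects events with no defined transition (an "undesirable state" check).
class StateMachine:
    def __init__(self, initial, transitions):
        # transitions maps (state, event) -> next_state
        self.state = initial
        self.transitions = transitions

    def fire(self, event):
        key = (self.state, event)
        if key not in self.transitions:
            raise ValueError(f"illegal event {event!r} in state {self.state!r}")
        self.state = self.transitions[key]
        return self.state

# Hypothetical booster ignition sequence (not the real SLS command set).
booster = StateMachine("safed", {
    ("safed", "arm"): "armed",
    ("armed", "disarm"): "safed",
    ("armed", "ignite"): "ignition",
})
booster.fire("arm")
booster.fire("ignite")  # booster.state is now "ignition"
```

Exhaustively walking such a table with nominal and off-nominal event sequences is the "what-if" analysis the abstract refers to: any sequence that raises is a command ordering the design must either forbid or handle.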
SAINT: A combined simulation language for modeling man-machine systems
NASA Technical Reports Server (NTRS)
Seifert, D. J.
1979-01-01
SAINT (Systems Analysis of Integrated Networks of Tasks) is a network modeling and simulation technique for the design and analysis of complex man-machine systems. SAINT provides the conceptual framework for representing systems that consist of discrete task elements, continuous state variables, and interactions between them. It also provides a mechanism for combining human performance models and dynamic system behaviors in a single modeling structure. The SAINT technique is described and applications of SAINT are discussed.
A hierarchical approach for simulating northern forest dynamics
Don C. Bragg; David W. Roberts; Thomas R. Crow
2004-01-01
Complexity in ecological systems has challenged forest simulation modelers for years, resulting in a number of approaches with varying degrees of success. Arguments in favor of hierarchical modeling are made, especially for considering a complex environmental issue like widespread eastern hemlock regeneration failure. We present the philosophy and basic framework for...
Teaching Complex Concepts in the Geosciences by Integrating Analytical Reasoning with GIS
ERIC Educational Resources Information Center
Houser, Chris; Bishop, Michael P.; Lemmons, Kelly
2017-01-01
Conceptual models have long served as a means for physical geographers to organize their understanding of feedback mechanisms and complex systems. Analytical reasoning provides undergraduate students with an opportunity to develop conceptual models based upon their understanding of surface processes and environmental conditions. This study…
Xu, Wei
2007-12-01
This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated from these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily with automation operations that were more goal directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.
Sheng, Xiao -Lan; Batista, Enrique Ricardo; Duan, Yi -Xiang; ...
2016-11-01
Previous studies suggested that in Nishibayashi’s homogeneous catalytic systems based on molybdenum (Mo) complexes, the bimetallic structure facilitates dinitrogen-to-ammonia conversion relative to the corresponding monometallic complexes, likely due to through-bond interactions between the two Mo centers. However, more detailed model systems are necessary to support this bimetallic hypothesis and to elucidate the multi-metallic effects on the catalytic mechanism. In this work, we computationally examined the effects of dimension as well as of the type of bridging ligand on the catalytic activity of molybdenum-dinitrogen complexes, using a set of extended model systems based on Nishibayashi’s bimetallic structure. Polynuclear chains containing four ([Mo]4) or more Mo centers were found to drastically enhance the catalytic performance compared with both the monometallic and bimetallic complexes. Carbide ([:C≡C:]2–) was found to be a more effective bridging ligand than N2 in terms of dispersing electronic charge between metal centers, thereby facilitating reactions in the catalytic cycle. Furthermore, the mechanistic modelling suggests that, in principle, a more efficient catalytic system for N2-to-NH3 transformation might be obtained by extending the polynuclear chain to a proper size in combination with an effective bridging ligand for charge dispersion.
Wang, Jiguang; Sun, Yidan; Zheng, Si; Zhang, Xiang-Sun; Zhou, Huarong; Chen, Luonan
2013-01-01
Synergistic interactions among transcription factors (TFs) and their cofactors collectively determine gene expression in complex biological systems. In this work, we develop a novel graphical model, called the Active Protein-Gene (APG) network model, to quantify regulatory signals of transcription in complex biomolecular networks by integrating both TF upstream-regulation and downstream-regulation high-throughput data. First, we theoretically and computationally demonstrate the effectiveness of APG by comparing it with the traditional strategy based only on TF downstream-regulation information. We then apply this model to study spontaneous type 2 diabetic Goto-Kakizaki (GK) and Wistar control rats. Our biological experiments validate the theoretical results. In particular, SP1 is found to be a hidden TF with changed regulatory activity, and the loss of SP1 activity contributes to the increased glucose production during diabetes development. The APG model provides a theoretical basis for quantitatively elucidating transcriptional regulation by modelling TF combinatorial interactions and exploiting multilevel high-throughput information.
Large/Complex Antenna Performance Validation for Spaceborne Radar/Radiometric Instruments
NASA Technical Reports Server (NTRS)
Focardi, Paolo; Harrell, Jefferson; Vacchione, Joseph
2013-01-01
Over the past decade, Earth observing missions which employ spaceborne combined radar & radiometric instruments have been developed and implemented. These instruments include the use of large and complex deployable antennas whose radiation characteristics need to be accurately determined over 4π steradians. Given the size and complexity of these antennas, the performance of the flight units cannot be readily measured. In addition, the radiation performance is impacted by the presence of the instrument's service platform which cannot easily be included in any measurement campaign. In order to meet the system performance knowledge requirements, a two-pronged approach has been employed. The first is to use modeling tools to characterize the system and the second is to build a scale model of the system and use RF measurements to validate the results of the modeling tools. This paper demonstrates the resulting level of agreement between scale model and numerical modeling for two recent missions: (1) the earlier Aquarius instrument currently in Earth orbit and (2) the upcoming Soil Moisture Active Passive (SMAP) mission. The results from two modeling approaches, Ansoft's High Frequency Structure Simulator (HFSS) and TICRA's General RF Applications Software Package (GRASP), were compared with measurements of approximately 1/10th scale models of the Aquarius and SMAP systems. Generally good agreement was found between the three methods but each approach had its shortcomings as will be detailed in this paper.
NASA Astrophysics Data System (ADS)
Farmer, J. Doyne; Gallegati, M.; Hommes, C.; Kirman, A.; Ormerod, P.; Cincotti, S.; Sanchez, A.; Helbing, D.
2012-11-01
We outline a vision for an ambitious program to understand the economy and financial markets as a complex evolving system of coupled networks of interacting agents. This is a completely different vision from that currently used in most economic models. This view implies new challenges and opportunities for policy and managing economic crises. The dynamics of such models inherently involve sudden and sometimes dramatic changes of state. Further, the tools and approaches we use emphasize the analysis of crises rather than of calm periods. In this they respond directly to the calls of Governors Bernanke and Trichet for new approaches to macroeconomic modelling.
The noisy voter model on complex networks.
Carro, Adrián; Toral, Raúl; San Miguel, Maxi
2016-04-20
We propose a new analytical method to study stochastic, binary-state models on complex networks. Moving beyond the usual mean-field theories, this alternative approach is based on an annealed approximation for uncorrelated networks, which allows the network structure to be treated as parametric heterogeneity. As an illustration, we study the noisy voter model, a modification of the original voter model that includes random changes of state. The proposed method is able to unfold the dependence of the model not only on the mean degree (the mean-field prediction) but also on more complex averages over the degree distribution. In particular, we find that the degree heterogeneity (the variance of the underlying degree distribution) has a strong influence on the location of the critical point of a noise-induced, finite-size transition occurring in the model, on the local ordering of the system, and on the functional form of its temporal correlations. Finally, we show how this latter point opens the possibility of inferring the degree heterogeneity of the underlying network by observing only the aggregate behavior of the system as a whole, an issue of interest for systems where only macroscopic, population-level variables can be measured.
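The dynamics the abstract analyzes are simple to state: at each update a random node either adopts a random state with the noise probability, or copies a uniformly chosen neighbor. A Monte Carlo sketch on a ring network (the network, noise level, and seed below are our illustrative choices; the paper studies general uncorrelated networks analytically):

```python
import random

def noisy_voter_step(states, neighbors, a, rng):
    """One asynchronous update of the noisy voter model:
    with probability a the chosen node flips to a random state (noise),
    otherwise it copies a uniformly chosen neighbor (imitation)."""
    i = rng.randrange(len(states))
    if rng.random() < a:
        states[i] = rng.choice((0, 1))
    else:
        states[i] = states[rng.choice(neighbors[i])]

# Ring network of 50 nodes; noise parameter a = 0.05 (illustrative values).
n = 50
neighbors = [[(i - 1) % n, (i + 1) % n] for i in range(n)]
states = [i % 2 for i in range(n)]
rng = random.Random(1)
for _ in range(10_000):
    noisy_voter_step(states, neighbors, a=0.05, rng=rng)
```

Without the noise term (a = 0) the model eventually freezes in consensus; the noise keeps it fluctuating, which is what produces the finite-size transition discussed in the abstract.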
Using VCL as an Aspect-Oriented Approach to Requirements Modelling
NASA Astrophysics Data System (ADS)
Amálio, Nuno; Kelsen, Pierre; Ma, Qin; Glodt, Christian
Software systems are becoming larger and more complex. By tackling the modularisation of crosscutting concerns, aspect orientation draws attention to modularity as a means to address the problems of scalability, complexity and evolution in software systems development. Aspect-oriented modelling (AOM) applies aspect-orientation to the construction of models. Most existing AOM approaches are designed without a formal semantics, and use multi-view partial descriptions of behaviour. This paper presents an AOM approach based on the Visual Contract Language (VCL): a visual language for abstract and precise modelling, designed with a formal semantics, and comprising a novel approach to visual behavioural modelling based on design by contract where behavioural descriptions are total. By applying VCL to a large case study of a car-crash crisis management system, the paper demonstrates how modularity of VCL's constructs, at different levels of granularity, help to tackle complexity. In particular, it shows how VCL's package construct and its associated composition mechanisms are key in supporting separation of concerns, coarse-grained problem decomposition and aspect-orientation. The case study's modelling solution has a clear and well-defined modular structure; the backbone of this structure is a collection of packages encapsulating local solutions to concerns.
Han, Bumsoo; Qu, Chunjing; Park, Kinam; Konieczny, Stephen F.; Korc, Murray
2016-01-01
Targeted delivery aims to distribute drugs selectively to tumor tissue but not to healthy tissue. This can address many clinical challenges by maximizing the efficacy while minimizing the toxicity of anti-cancer drugs. However, the complex tumor microenvironment poses various barriers hindering the transport of drugs and drug delivery systems. New tumor models that allow for the systematic study of these complex environments are highly desired to provide reliable test beds for developing drug delivery systems for targeted delivery. Recently, research efforts have yielded new in vitro tumor models, the so-called tumor-microenvironment-on-chip, that recapitulate certain characteristics of the tumor microenvironment. These new models show benefits over conventional tumor models and have the potential to accelerate drug discovery and enable precision medicine. However, further research is warranted to overcome their limitations and to properly interpret the data obtained from these models. In this article, key features of the in vivo tumor microenvironment relevant to drug transport processes for targeted delivery are discussed, and the current status of and challenges in developing in vitro transport model systems are reviewed. PMID:26688098
The use of discrete-event simulation modelling to improve radiation therapy planning processes.
Werker, Greg; Sauré, Antoine; French, John; Shechter, Steven
2009-07-01
The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.
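At the core of any discrete-event simulation like the Arena model described above is a clock plus a time-ordered queue of pending events. A bare-bones sketch of that engine, with an invented two-step "planning" pipeline as the example workload (the real model's inputs came from historical data and staff interviews):

```python
import heapq
import itertools

class Simulator:
    """Bare-bones discrete-event engine: a clock and a time-ordered event queue."""
    def __init__(self):
        self.clock = 0.0
        self._queue = []
        self._ids = itertools.count()  # tie-breaker so the heap never compares callables

    def schedule(self, delay, action):
        """Schedule action(sim) to run `delay` time units from the current clock."""
        heapq.heappush(self._queue, (self.clock + delay, next(self._ids), action))

    def run(self):
        """Process events in time order, advancing the clock to each event."""
        while self._queue:
            self.clock, _, action = heapq.heappop(self._queue)
            action(self)

# Toy pipeline (invented durations): contouring takes 2 h, then dosimetry 3 h.
done = []
sim = Simulator()
sim.schedule(2.0, lambda s: s.schedule(3.0, lambda s2: done.append(s2.clock)))
sim.run()
print(done)  # -> [5.0]
```

Stochastic service times, resource contention, and priority rules are layered on top of this skeleton; that layering is what lets a DES model expose which delays (here, the oncologist-related ones) dominate the total planning time.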
NASA Astrophysics Data System (ADS)
Bezruchko, Konstantin; Davidov, Albert
2009-01-01
This article describes a scientific and technical complex for the modeling, research, and testing of rocket-space vehicles' power installations, created in the Power Source Laboratory of the National Aerospace University "KhAI". The complex makes it possible to replace full-scale tests with model tests and to reduce the financial and time costs of modeling, researching, and testing power installations. Using it, problems of designing and studying rocket-space vehicles' power installations can be solved efficiently, and experimental studies of physical processes, as well as tests of the solar and chemical batteries of rocket-space complexes and space vehicles, can be carried out. The complex also supports accelerated testing, diagnostics, lifetime monitoring, and restoration of the chemical accumulators used in rocket-space vehicles' power supply systems.
Parameter Estimation in Epidemiology: from Simple to Complex Dynamics
NASA Astrophysics Data System (ADS)
Aguiar, Maíra; Ballesteros, Sebastién; Boto, João Pedro; Kooi, Bob W.; Mateus, Luís; Stollenwerk, Nico
2011-09-01
We revisit the parameter estimation framework for population biological dynamical systems and apply it to calibrate various models in epidemiology against empirical time series, namely for influenza and dengue fever. For more complex models, such as the multi-strain dynamics used to describe the virus-host interaction in dengue fever, even the most recently developed parameter estimation techniques, like maximum likelihood iterated filtering, reach their computational limits. However, first results of parameter estimation with data on dengue fever from Thailand indicate a subtle interplay between stochasticity and the deterministic skeleton. The deterministic system on its own already displays complex dynamics, up to deterministic chaos and coexistence of multiple attractors.
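The simplest member of the model family this abstract calibrates is the SIR compartment model, whose trajectory is what gets fitted against the empirical time series. A toy forward simulation (the rates below are invented for illustration and are not the influenza or dengue estimates from the study; the multi-strain dengue models add further compartments per serotype):

```python
# Euler-integrated SIR model with population fractions s + i + r = 1.
# beta = transmission rate, gamma = recovery rate (both per day, invented values).
def sir(beta, gamma, s0, i0, days, dt=0.1):
    """Return the infected-fraction trajectory sampled every dt days."""
    s, i, r = s0, i0, 0.0
    traj = []
    for _ in range(int(days / dt)):
        new_inf = beta * s * i * dt   # S -> I transitions this step
        new_rec = gamma * i * dt      # I -> R transitions this step
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        traj.append(i)
    return traj

# With beta/gamma > 1 the infected fraction first grows, peaks, then declines.
traj = sir(beta=0.5, gamma=0.2, s0=0.99, i0=0.01, days=100)
```

Parameter estimation then amounts to searching (beta, gamma, ...) so that `traj` matches observed case counts; the abstract's point is that for multi-strain extensions this search strains even iterated-filtering likelihood methods.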
NASA Astrophysics Data System (ADS)
Keilis-Borok, V. I.; Soloviev, A. A.
2010-09-01
Socioeconomic and natural complex systems persistently generate extreme events, also known as disasters, crises, or critical transitions. Here we analyze patterns of background activity preceding extreme events in four complex systems: economic recessions, surges in homicides in a megacity, magnetic storms, and strong earthquakes. We use as a starting point the indicators describing the system's behavior and identify changes in an indicator's trend. Those changes constitute our background events (BEs). We demonstrate a premonitory pattern common to all four systems considered: relatively large-magnitude BEs become more frequent before an extreme event. A premonitory change of scaling has been found in various models and observations. Here we demonstrate this change in the scaling of uniformly defined BEs in four real complex systems, their enormous differences notwithstanding.
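The notion of a background event, a change in an indicator's local trend, can be sketched concretely. The window length, noise level, and magnitude definition below are illustrative assumptions, not the authors' actual protocol.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(300)
# A hypothetical noisy indicator with several trend reversals.
indicator = np.sin(t / 15) + 0.2 * rng.standard_normal(300)

def background_events(x, w=10):
    """Flag points where the fitted linear trend reverses sign between
    the preceding and following windows; BE magnitude = |slope change|."""
    events = []
    for i in range(w, len(x) - w):
        before = np.polyfit(np.arange(w), x[i - w:i], 1)[0]
        after = np.polyfit(np.arange(w), x[i:i + w], 1)[0]
        if before * after < 0:  # trend reversal
            events.append((i, abs(after - before)))
    return events

events = background_events(indicator)
threshold = np.median([m for _, m in events])
large = [e for e in events if e[1] > threshold]
```

A premonitory analysis in the spirit of the abstract would then track whether the rate of large-magnitude events rises as an extreme event approaches.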
A dynamic fault tree model of a propulsion system
NASA Technical Reports Server (NTRS)
Xu, Hong; Dugan, Joanne Bechta; Meshkat, Leila
2006-01-01
We present a dynamic fault tree model of the benchmark propulsion system, and solve it using Galileo. Dynamic fault trees (DFT) extend traditional static fault trees with special gates to model spares and other sequence dependencies. Galileo solves DFT models using a judicious combination of automatically generated Markov and Binary Decision Diagram models. Galileo easily handles the complexities exhibited by the benchmark problem. In particular, Galileo is designed to model phased mission systems.
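The sequence dependency that distinguishes a dynamic fault tree from a static one can be shown with a Monte Carlo sketch of a hypothetical two-pump system with a cold spare. The failure rates, mission time, and tree structure are invented for illustration; Galileo itself solves such models analytically via Markov chains and binary decision diagrams rather than by simulation.

```python
import random

def simulate_once(mission=1000.0, lam_primary=1e-3,
                  lam_spare=1e-3, lam_valve=5e-4):
    """One Monte Carlo trial of a tiny DFT: the system fails if the valve
    fails, or if the primary pump and its cold spare both fail.  The spare
    only starts accumulating stress after it takes over -- the sequence
    dependency a static fault tree cannot express."""
    t_primary = random.expovariate(lam_primary)
    t_valve = random.expovariate(lam_valve)
    # Cold spare begins aging only when the primary dies.
    t_pumps = t_primary + random.expovariate(lam_spare)
    return min(t_pumps, t_valve) <= mission

random.seed(1)
trials = 20000
p_fail = sum(simulate_once() for _ in range(trials)) / trials
```

With these made-up rates the analytic unreliability is about 0.55 (valve: 1 - e^-0.5; pump pair: an Erlang-2 failure time), so the simulated estimate should land nearby.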
Computing Systems | High-Performance Computing | NREL
Researchers use NREL's high-performance computing systems to investigate, build, and test models of complex phenomena or entire integrated systems that cannot be directly observed or manipulated in the lab, or that would be too expensive or time consuming to study directly.
A Telecommunications Industry Primer: A Systems Model.
ERIC Educational Resources Information Center
Obermier, Timothy R.; Tuttle, Ronald H.
2003-01-01
Describes the Telecommunications Systems Model to help technical educators and students understand the increasingly complex telecommunications infrastructure. Specifically looks at ownership and regulatory status, service providers, transport medium, network protocols, and end-user services. (JOW)
Exploring model based engineering for large telescopes: getting started with descriptive models
NASA Astrophysics Data System (ADS)
Karban, R.; Zamparelli, M.; Bauvir, B.; Koehler, B.; Noethe, L.; Balestra, A.
2008-07-01
Large telescopes pose a continuous challenge to systems engineering because of their complexity in terms of requirements, operational modes, long duty lifetime, interfaces, and number of components. A multitude of decisions must be taken throughout the life cycle of a new system, and a prime means of coping with complexity and uncertainty is to use models as a decision aid. The potential of descriptive models based on the OMG Systems Modeling Language (OMG SysML™) is examined in different areas: a comprehensive model serves as the basis for subsequent activities of requirements solicitation and review, analysis, and design alike. Furthermore, a model is an effective communication instrument against the misinterpretation pitfalls that are typical of cross-disciplinary activities conducted with natural language or free-format diagrams alone. Modeling the essential characteristics of the system, such as its interfaces, structure, and behavior, addresses important system-level issues. Also shown is how to use a model as an analysis tool to describe the relationships among disturbances, opto-mechanical effects, and control decisions, and to refine the control use cases. Considerations on the scalability of the model structure and organization, its impact on the development process, the relation to document-centric structures, style and usage guidelines, and the required tool chain are presented.
Intrinsic Uncertainties in Modeling Complex Systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cooper, Curtis S; Bramson, Aaron L.; Ames, Arlo L.
2014-09-01
Models are built to understand and predict the behaviors of both natural and artificial systems. Because it is always necessary to abstract away aspects of any non-trivial system being modeled, we know models can potentially leave out important, even critical elements. This reality of the modeling enterprise forces us to consider the prospective impacts of those effects completely left out of a model - either intentionally or unconsidered. Insensitivity to new structure is an indication of diminishing returns. In this work, we represent a hypothetical unknown effect on a validated model as a finite perturbation whose amplitude is constrained within a control region. We find robustly that without further constraints, no meaningful bounds can be placed on the amplitude of a perturbation outside of the control region. Thus, forecasting into unsampled regions is a very risky proposition. We also present inherent difficulties with proper time discretization of models and representing inherently discrete quantities. We point out potentially worrisome uncertainties, arising from mathematical formulation alone, which modelers can inadvertently introduce into models of complex systems. Acknowledgements: This work has been funded under early-career LDRD project #170979, entitled "Quantifying Confidence in Complex Systems Models Having Structural Uncertainties", which ran from 04/2013 to 09/2014. We wish to express our gratitude to the many researchers at Sandia who contributed ideas to this work, as well as feedback on the manuscript. In particular, we would like to mention George Barr, Alexander Outkin, Walt Beyeler, Eric Vugrin, and Laura Swiler for providing invaluable advice and guidance through the course of the project. We would also like to thank Steven Kleban, Amanda Gonzales, Trevor Manzanares, and Sarah Burwell for their assistance in managing project tasks and resources.
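The report's central point, that a perturbation constrained only inside a control region admits no meaningful bound outside it, has a classical illustration: scaled Chebyshev polynomials stay tiny on the sampled interval yet explode just beyond it. The interval, degree, and amplitude below are arbitrary choices for the demonstration, not quantities from the report.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# eps * T_12(u): bounded by eps for |u| <= 1, growing roughly like
# (3 + 2*sqrt(2))**12 once |u| exceeds 1.
n, eps = 12, 1e-3
coeffs = np.zeros(n + 1)
coeffs[n] = eps

def u(x):
    """Map the hypothetical control region [0, 1] onto [-1, 1]."""
    return 2 * x - 1

# Largest amplitude anywhere on a dense sample of the control region.
inside = np.max(np.abs(C.chebval(u(np.linspace(0, 1, 200)), coeffs)))
# Amplitude one interval-length beyond the sampled data.
outside = abs(C.chebval(u(2.0), coeffs))
```

Here `inside` stays at the imposed bound of about 1e-3, while `outside` exceeds 1e4 despite the perturbation being "validated" everywhere it was sampled, which is exactly why forecasting into unsampled regions is risky.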
St-Maurice, Justin D; Burns, Catherine M
2017-07-28
Health care is a complex sociotechnical system. Patient treatment is evolving and needs to incorporate the use of technology and new patient-centered treatment paradigms. Cognitive work analysis (CWA) is an effective framework for understanding complex systems, and work domain analysis (WDA) is useful for understanding complex ecologies. Although previous applications of CWA have described patient treatment, due to their scope patients were characterized as biomedical machines rather than as actors involved in their own care. An abstraction hierarchy that characterizes patients as beings with complex social values and priorities is needed. This can help better understand treatment in a modern approach to care. The purpose of this study was to perform a WDA to represent the treatment of patients with medical records. The methods used to develop this model included the analysis of written texts and collaboration with subject matter experts. Our WDA represents the ecology through its functional purposes, abstract functions, generalized functions, physical functions, and physical forms. Compared with other work domain models, this model is able to articulate the nuanced balance between medical treatment, patient education, and limited health care resources. Concepts in the analysis were similar to the modeling choices of other WDAs but combined them into a comprehensive, systematic, and contextual overview. The model is helpful for understanding user competencies and needs. Future models could be developed to model the patient's domain and enable the exploration of the shared decision-making (SDM) paradigm. Our work domain model links treatment goals, decision-making constraints, and task workflows. This model can be used by system developers who would like to use ecological interface design (EID) to improve systems. Our hierarchy is the first in a future set that could explore new treatment paradigms. 
Future hierarchies could model the patient as a controller and could be useful for mobile app development. ©Justin D St-Maurice, Catherine M Burns. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 28.07.2017.
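An abstraction hierarchy of this kind is, structurally, an ordered set of levels connected by means-ends relations. A minimal sketch, with entirely hypothetical entries standing in for the paper's actual model:

```python
# Five WDA levels, top (purposes) to bottom (forms); entries are
# hypothetical placeholders, not the authors' published hierarchy.
wda = {
    "functional_purposes": ["effective treatment", "patient education",
                            "stewardship of limited resources"],
    "abstract_functions": ["balance of care quality against cost"],
    "generalized_functions": ["diagnosis", "record keeping", "care planning"],
    "physical_functions": ["order entry", "chart review"],
    "physical_forms": ["electronic medical record", "clinic room"],
}

def means_ends_links(model):
    """Pair each level with the one below it: the lower level is the
    'means' by which the upper level's 'ends' are achieved."""
    levels = list(model)
    return list(zip(levels, levels[1:]))

links = means_ends_links(wda)
```

Walking these links downward answers "how?" and walking them upward answers "why?", which is what makes the hierarchy useful for ecological interface design.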
PMID:28754650
Maximally Expressive Modeling of Operations Tasks
NASA Technical Reports Server (NTRS)
Jaap, John; Richardson, Lea; Davis, Elizabeth
2002-01-01
Planning and scheduling systems organize "tasks" into a timeline or schedule. The tasks are defined within the scheduling system in logical containers called models. The dictionary might define a model of this type as "a system of things and relations satisfying a set of rules that, when applied to the things and relations, produce certainty about the tasks that are being modeled." One challenging domain for a planning and scheduling system is the operation of on-board experiments for the International Space Station. In these experiments, the equipment used is among the most complex hardware ever developed, the information sought is at the cutting edge of scientific endeavor, and the procedures are intricate and exacting. Scheduling is made more difficult by a scarcity of station resources. The models to be fed into the scheduler must describe both the complexity of the experiments and procedures (to ensure a valid schedule) and the flexibilities of the procedures and the equipment (to effectively utilize available resources). Clearly, scheduling International Space Station experiment operations calls for a "maximally expressive" modeling schema.
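What an expressive task model must carry, fixed structure plus declared flexibility, can be sketched with a toy greedy scheduler: each task states a duration, a resource demand, and a flexible start window, and the scheduler exploits the window to fit tasks under a resource cap. Task names, numbers, and the capacity are invented for illustration; a real ISS scheduler handles far richer constraints.

```python
def schedule(tasks, capacity):
    """Greedy sketch: place each task at the earliest start in its
    window [earliest, latest] at which cumulative resource demand
    stays within capacity for every occupied time slot."""
    usage = {}   # time slot -> resource units in use
    placed = {}  # task name -> chosen start slot
    for name, dur, demand, earliest, latest in tasks:
        for start in range(earliest, latest + 1):
            slots = range(start, start + dur)
            if all(usage.get(s, 0) + demand <= capacity for s in slots):
                for s in slots:
                    usage[s] = usage.get(s, 0) + demand
                placed[name] = start
                break
    return placed

# Hypothetical experiment operations: (name, duration, demand, window).
tasks = [("crystal_growth", 4, 2, 0, 6),
         ("centrifuge_run", 3, 2, 0, 6),
         ("downlink", 2, 1, 2, 8)]
plan = schedule(tasks, capacity=3)
```

The centrifuge run cannot overlap the crystal growth (2 + 2 exceeds the cap of 3), so its flexible window lets it slide to slot 4, while the low-demand downlink fits alongside the growth at slot 2.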
NASA Astrophysics Data System (ADS)
In 1992 the Santa Fe Institute hosted more than 100 short- and long-term research visitors who conducted a total of 212 person-months of residential research in complex systems. To date this 1992 work has resulted in more than 50 SFI Working Papers and nearly 150 publications in the scientific literature. The Institute's book series in the sciences of complexity continues to grow, now numbering more than 20 volumes. The fifth annual complex systems summer school brought nearly 60 graduate students and postdoctoral fellows to Santa Fe for an intensive introduction to the field. Research on complex systems - the focus of work at SFI - involves an extraordinary range of topics normally studied in seemingly disparate fields. Natural systems displaying complex adaptive behavior range upwards from DNA through cells and evolutionary systems to human societies. Research models exhibiting complex behavior include spin glasses, cellular automata, and genetic algorithms. Some of the major questions facing complex systems researchers are: (1) explaining how complexity arises from the nonlinear interaction of simple components; (2) describing the mechanisms underlying high-level aggregate behavior of complex systems (such as the overt behavior of an organism, the flow of energy in an ecology, and the Gross National Product (GNP) of an economy); and (3) creating a theoretical framework to enable predictions about the likely behavior of such systems in various conditions.
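Question (1), how complexity arises from the nonlinear interaction of simple components, is classically illustrated by an elementary cellular automaton: a one-line local rule that generates intricate, aperiodic structure from a single seed. The grid size and rule number below are arbitrary choices for the demonstration.

```python
def step(cells, rule=30):
    """One update of an elementary cellular automaton: each cell's next
    state depends only on itself and its two neighbors (wrap-around).
    The 3-bit neighborhood indexes into the 8-bit rule number."""
    n = len(cells)
    return [(rule >> (cells[(i - 1) % n] << 2
                      | cells[i] << 1
                      | cells[(i + 1) % n])) & 1
            for i in range(n)]

cells = [0] * 40
cells[20] = 1                # single seed cell
history = [cells]
for _ in range(20):
    cells = step(cells)
    history.append(cells)
```

Printing `history` row by row (e.g. mapping 1 to `#` and 0 to a space) shows the characteristic irregular triangle of rule 30, a standard example of complex aggregate behavior emerging from trivially simple parts.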