Implementation of Complexity Analyzing Based on Additional Effect
NASA Astrophysics Data System (ADS)
Zhang, Peng; Li, Na; Liang, Yanhong; Liu, Fang
According to Complexity Theory, complexity exists in a system when a functional requirement is not satisfied. Several studies have addressed Complexity Theory on the basis of Axiomatic Design. However, they focus on reducing complexity, and none of them addresses a method for analyzing the complexity in a system. Therefore, this paper puts forth a method of analyzing complexity that seeks to make up for this deficiency. To develop the method of analyzing complexity based on additional effect, this paper introduces two concepts: ideal effect and additional effect. The method combines Complexity Theory with the Theory of Inventive Problem Solving (TRIZ) and helps designers analyze complexity by means of additional effects. A case study shows the application of the process.
Analyzing SystemC Designs: SystemC Analysis Approaches for Varying Applications
Stoppe, Jannis; Drechsler, Rolf
2015-01-01
The complexity of hardware designs is still increasing according to Moore's law. With embedded systems being more and more intertwined and working together not only with each other, but also with their environments as cyber physical systems (CPSs), more streamlined development workflows are employed to handle the increasing complexity during a system's design phase. SystemC is a C++ library for the design of hardware/software systems, enabling the designer to quickly prototype, e.g., a distributed CPS without having to decide about particular implementation details (such as whether to implement a feature in hardware or in software) early in the design process. Thereby, this approach reduces the initial implementation's complexity by offering an abstract layer with which to build a working prototype. However, as SystemC is based on C++, analyzing designs becomes a difficult task due to the complex language features that are available to the designer. Several fundamentally different approaches for analyzing SystemC designs have been suggested. This work illustrates several different SystemC analysis approaches, including their specific advantages and shortcomings, allowing designers to pick the right tools to assist them with a specific problem during the design of a system using SystemC. PMID:25946632
ADAM: analysis of discrete models of biological systems using computer algebra.
Hinkelmann, Franziska; Brandon, Madison; Guang, Bonny; McNeill, Rustin; Blekherman, Grigoriy; Veliz-Cuba, Alan; Laubenbacher, Reinhard
2011-07-20
Many biological systems are modeled qualitatively with discrete models, such as probabilistic Boolean networks, logical models, Petri nets, and agent-based models, to gain a better understanding of them. The computational complexity to analyze the complete dynamics of these models grows exponentially in the number of variables, which impedes working with complex models. There exist software tools to analyze discrete models, but they either lack the algorithmic functionality to analyze complex models deterministically or they are inaccessible to many users as they require understanding the underlying algorithm and implementation, do not have a graphical user interface, or are hard to install. Efficient analysis methods that are accessible to modelers and easy to use are needed. We propose a method for efficiently identifying attractors and introduce the web-based tool Analysis of Dynamic Algebraic Models (ADAM), which provides this and other analysis methods for discrete models. ADAM converts several discrete model types automatically into polynomial dynamical systems and analyzes their dynamics using tools from computer algebra. Specifically, we propose a method to identify attractors of a discrete model that is equivalent to solving a system of polynomial equations, a long-studied problem in computer algebra. Based on extensive experimentation with both discrete models arising in systems biology and randomly generated networks, we found that the algebraic algorithms presented in this manuscript are fast for systems with the structure maintained by most biological systems, namely sparseness and robustness. For a large set of published complex discrete models, ADAM identified the attractors in less than one second. Discrete modeling techniques are a useful tool for analyzing complex biological systems and there is a need in the biological community for accessible efficient analysis tools. ADAM provides analysis methods based on mathematical algorithms as a web-based tool for several different input formats, and it makes analysis of complex models accessible to a larger community, as it is platform independent as a web-service and does not require understanding of the underlying mathematics.
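The abstract's key idea, converting a discrete model into a polynomial dynamical system over F2 and finding attractors by solving polynomial equations, can be illustrated with a minimal sketch. The three-node network, its update rules, and the brute-force enumeration below are hypothetical simplifications for illustration only (ADAM itself relies on computer-algebra solvers rather than enumeration).

```python
from itertools import product

# Toy 3-node Boolean network written as polynomials over F2 (arithmetic mod 2):
# AND -> x*y, OR -> x + y + x*y, NOT -> 1 + x, XOR -> x + y.
# f1, f2, f3 give the next state of nodes x1, x2, x3 (hypothetical update rules).
def f(state):
    x1, x2, x3 = state
    f1 = (x2 * x3) % 2                      # x1' = x2 AND x3
    f2 = (1 + x1) % 2                       # x2' = NOT x1
    f3 = (x1 + x2 + x1 * x2) % 2            # x3' = x1 OR x2
    return (f1, f2, f3)

# Steady states (fixed-point attractors) satisfy the polynomial system f_i(x) = x_i over F2.
# For a 3-variable toy model, exhaustive enumeration over F2^3 is enough to show the idea.
fixed_points = [s for s in product((0, 1), repeat=3) if f(s) == s]
print("Fixed-point attractors:", fixed_points)

# Longer attractors (limit cycles) can be found by iterating the map until a state repeats.
def attractor_from(state):
    seen = []
    while state not in seen:
        seen.append(state)
        state = f(state)
    return seen[seen.index(state):]         # the cycle reached from this initial state

def canonical(cycle):
    i = cycle.index(min(cycle))             # rotate so each cycle has one representative
    return tuple(cycle[i:] + cycle[:i])

cycles = {canonical(attractor_from(s)) for s in product((0, 1), repeat=3)}
print("Attractors reached from all initial states:", cycles)
```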
Greek, Ray; Hansen, Lawrence A
2013-11-01
We surveyed the scientific literature regarding amyotrophic lateral sclerosis, the SOD1 mouse model, complex adaptive systems, evolution, drug development, animal models, and philosophy of science in an attempt to analyze the SOD1 mouse model of amyotrophic lateral sclerosis in the context of evolved complex adaptive systems. Humans and animals are examples of evolved complex adaptive systems. It is difficult to predict the outcome from perturbations to such systems because of the characteristics of complex systems. Modeling even one complex adaptive system in order to predict outcomes from perturbations is difficult. Predicting outcomes to one evolved complex adaptive system based on outcomes from a second, especially when the perturbation occurs at higher levels of organization, is even more problematic. Using animal models to predict human outcomes to perturbations such as disease and drugs should have a very low predictive value. We present empirical evidence confirming this and suggest a theory to explain this phenomenon. We analyze the SOD1 mouse model of amyotrophic lateral sclerosis in order to illustrate this position. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.
Issues in Big-Data Database Systems
2014-06-01
Berman, Jules K. (2013). Principles of Big Data: Preparing, Sharing, and Analyzing Complex Information. New York: Elsevier. 261 pp.
Visualizing Teacher Education as a Complex System: A Nested Simplex System Approach
ERIC Educational Resources Information Center
Ludlow, Larry; Ell, Fiona; Cochran-Smith, Marilyn; Newton, Avery; Trefcer, Kaitlin; Klein, Kelsey; Grudnoff, Lexie; Haigh, Mavis; Hill, Mary F.
2017-01-01
Our purpose is to provide an exploratory statistical representation of initial teacher education as a complex system comprised of dynamic influential elements. More precisely, we reveal what the system looks like for differently-positioned teacher education stakeholders based on our framework for gathering, statistically analyzing, and graphically…
Mathematical Models to Determine Stable Behavior of Complex Systems
NASA Astrophysics Data System (ADS)
Sumin, V. I.; Dushkin, A. V.; Smolentseva, T. E.
2018-05-01
The paper analyzes the possibility of predicting the functioning of a complex dynamic system with a significant amount of circulating information and a large number of random factors affecting its behavior. The functioning of such a system is described in terms of chaotic states, self-organized criticality, and bifurcations. The problem may be resolved by modeling such systems as dynamic rather than stochastic ones, taking strange attractors into account.
Phase locking route behind complex periodic windows in a forced oscillator
NASA Astrophysics Data System (ADS)
Jan, Hengtai; Tsai, Kuo-Ting; Kuo, Li-wei
2013-09-01
Chaotic systems have complex reactions against an external driving force; even in cases with low-dimension oscillators, the routes to synchronization are diverse. We proposed a stroboscope-based method for analyzing driven chaotic systems in their phase space. According to two statistic quantities generated from time series, we could realize the system state and the driving behavior simultaneously. We demonstrated our method in a driven bi-stable system, which showed complex period windows under a proper driving force. With increasing periodic driving force, a route from interior periodic oscillation to phase synchronization through the chaos state could be found. Periodic windows could also be identified and the circumstances under which they occurred distinguished. Statistical results were supported by conditional Lyapunov exponent analysis to show the power in analyzing the unknown time series.
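As a rough illustration of the stroboscope idea described above, the sketch below integrates a periodically driven bi-stable (double-well Duffing-type) oscillator and samples its state once per driving period. The equation, parameter values, and the two summary statistics are hypothetical stand-ins for the quantities used in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Driven bi-stable (double-well Duffing) oscillator: x'' + g*x' - x + x**3 = F*cos(w*t).
# Parameter values below are hypothetical, chosen only to illustrate the sampling scheme.
g, F, w = 0.3, 0.5, 1.2

def rhs(t, y):
    x, v = y
    return [v, -g * v + x - x**3 + F * np.cos(w * t)]

# Stroboscopic sampling: record the state once per driving period T = 2*pi/w.
T = 2 * np.pi / w
n_periods, transient = 800, 200
t_samples = np.arange(n_periods) * T
sol = solve_ivp(rhs, (0.0, t_samples[-1]), [0.1, 0.0], t_eval=t_samples,
                rtol=1e-9, atol=1e-9)
strobe = sol.y[:, transient:]              # discard transient periods

# Two simple statistics of the stroboscopic section: a phase-locked (periodic) response
# collapses onto a few points (small spread), while a chaotic response fills a region.
spread_x = np.std(strobe[0])
n_distinct = len({(round(x, 3), round(v, 3)) for x, v in strobe.T})
print(f"std of strobed x: {spread_x:.4f}, distinct strobed states: {n_distinct}")
```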
ERIC Educational Resources Information Center
Yoon, Susan A.; Goh, Sao-Ee; Park, Miyoung
2018-01-01
The study of complex systems has been highlighted in recent science education policy in the United States and has been the subject of important real-world scientific investigation. Because of this, research on complex systems in K-12 science education has shown a marked increase over the past two decades. In this systematic review, we analyzed 75…
Model-Based Engineering for Supply Chain Risk Management
2015-09-30
Software Engineering Institute. Expanded use of commercial components has increased the complexity of system assurance ... verification. Model-based engineering (MBE) offers a means to design, develop, analyze, and maintain a complex system architecture.
Complex Systems and Educational Change: Towards a New Research Agenda
ERIC Educational Resources Information Center
Lemke, Jay L.; Sabelli, Nora H.
2008-01-01
How might we usefully apply concepts and procedures derived from the study of other complex dynamical systems to analyzing systemic change in education? In this article, we begin to define possible agendas for research toward developing systematic frameworks and shared terminology for such a project. We illustrate the plausibility of defining such…
An empirical comparison of a dynamic software testability metric to static cyclomatic complexity
NASA Technical Reports Server (NTRS)
Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffrey E.
1993-01-01
This paper compares the dynamic testability prediction technique termed 'sensitivity analysis' to the static testability technique termed cyclomatic complexity. The application that we chose in this empirical study is a CASE generated version of a B-737 autoland system. For the B-737 system we analyzed, we isolated those functions that we predict are more prone to hide errors during system/reliability testing. We also analyzed the code with several other well-known static metrics. This paper compares and contrasts the results of sensitivity analysis to the results of the static metrics.
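For reference, cyclomatic complexity is straightforward to compute from a control-flow graph via M = E - N + 2P. The sketch below uses a small hypothetical control-flow graph, not the B-737 autoland code analyzed in the paper.

```python
# McCabe cyclomatic complexity of a control-flow graph: M = E - N + 2P,
# where E = edges, N = nodes, P = connected components (1 for a single routine).
# The graph below is a hypothetical toy control-flow graph, not the B-737 code.
cfg = {
    "entry":  ["cond1"],
    "cond1":  ["then1", "else1"],     # an if/else adds one decision
    "then1":  ["loop"],
    "else1":  ["loop"],
    "loop":   ["body", "exit"],       # a loop adds another decision
    "body":   ["loop"],
    "exit":   [],
}

n_nodes = len(cfg)
n_edges = sum(len(succ) for succ in cfg.values())
n_components = 1                       # a single routine
m = n_edges - n_nodes + 2 * n_components
print(f"nodes={n_nodes}, edges={n_edges}, cyclomatic complexity M={m}")
# Equivalently, for a single-entry/single-exit routine, M = number of decisions + 1.
```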
NASA Astrophysics Data System (ADS)
Ma, Junhai; Li, Ting; Ren, Wenbo
2017-06-01
This paper examines the optimal decisions of a dual-channel game model considering the inputs of retailing service. We analyze how the adjustment speed of service inputs affects system complexity and market performance, and explore the stability of the equilibrium points via parameter basin diagrams. Chaos control is realized by the variable feedback method. The numerical simulation shows that complex behavior, such as period-doubling bifurcation and chaos, would drive the system unstable. We measure the performance of the model in different periods by analyzing the variation of the average profit index. The theoretical results show that the percentage share of demand and the cross-service coefficients have an important influence on the stability of the system and its feasible basin of attraction.
Hybrid modeling and empirical analysis of automobile supply chain network
NASA Astrophysics Data System (ADS)
Sun, Jun-yan; Tang, Jian-ming; Fu, Wei-ping; Wu, Bing-ying
2017-05-01
Based on the connection mechanism of nodes that automatically select upstream and downstream agents, a simulation model for the dynamic evolutionary process of a consumer-driven automobile supply chain is established by integrating ABM and discrete modeling in a GIS-based map. First, the rationality of the model is demonstrated by analyzing the consistency of sales and of changes in various agent parameters between the simulation model and a real automobile supply chain. Second, through complex network theory, hierarchical structures of the model and relationships of networks at different levels are analyzed to calculate various characteristic parameters such as mean distance, mean clustering coefficients, and degree distributions; this verifies that the model is a typical scale-free and small-world network. Finally, the motion law of the model is analyzed from the perspective of complex self-adaptive systems. The chaotic state of the simulation system is verified, which suggests that the system has typical nonlinear characteristics. The model not only macroscopically illustrates the dynamic evolution of complex networks of the automobile supply chain but also microcosmically reflects the business process of each agent. Moreover, constructing and simulating the system by combining CAS theory and complex networks supplies a novel method for supply chain analysis, as well as theoretical bases and practical experience for the supply chain analysis of auto companies.
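The network metrics named in the abstract (mean distance, mean clustering coefficient, degree distribution) can be computed with standard tools. A minimal sketch follows, using a generated scale-free graph as a hypothetical stand-in for the automobile supply chain network.

```python
import collections
import networkx as nx

# Hypothetical supply-chain network: nodes are agents (suppliers, assemblers, dealers),
# edges are supply relationships. A scale-free topology is generated here only to
# illustrate the characteristic parameters mentioned in the abstract.
G = nx.barabasi_albert_graph(n=500, m=2, seed=42)

mean_distance = nx.average_shortest_path_length(G)
mean_clustering = nx.average_clustering(G)
degree_counts = collections.Counter(d for _, d in G.degree())

print(f"mean distance: {mean_distance:.2f}")
print(f"mean clustering coefficient: {mean_clustering:.3f}")
print("degree distribution (degree: count, lowest degrees):",
      dict(sorted(degree_counts.items())[:10]))

# Rough heuristics: a short mean distance relative to network size plus a heavy-tailed
# degree distribution are the small-world and scale-free signatures analyzed above.
```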
Computer-Assisted Monitoring Of A Complex System
NASA Technical Reports Server (NTRS)
Beil, Bob J.; Mickelson, Eric M.; Sterritt, John M.; Costantino, Rob W.; Houvener, Bob C.; Super, Mike A.
1995-01-01
Propulsion System Advisor (PSA) computer-based system assists engineers and technicians in analyzing masses of sensory data indicative of operating conditions of space shuttle propulsion system during pre-launch and launch activities. Designed solely for monitoring; does not perform any control functions. Although PSA developed for highly specialized application, serves as prototype of noncontrolling, computer-based subsystems for monitoring other complex systems like electric-power-distribution networks and factories.
Cook, Daniel L; Farley, Joel F; Tapscott, Stephen J
2001-01-01
Background: We propose that a computerized, internet-based graphical description language for systems biology will be essential for describing, archiving and analyzing complex problems of biological function in health and disease. Results: We outline here a conceptual basis for designing such a language and describe BioD, a prototype language that we have used to explore the utility and feasibility of this approach to functional biology. Using example models, we demonstrate that a rather limited lexicon of icons and arrows suffices to describe complex cell-biological systems as discrete models that can be posted and linked on the internet. Conclusions: Given available computer and internet technology, BioD may be implemented as an extensible, multidisciplinary language that can be used to archive functional systems knowledge and be extended to support both qualitative and quantitative functional analysis. PMID:11305940
VoroTop: Voronoi cell topology visualization and analysis toolkit
NASA Astrophysics Data System (ADS)
Lazar, Emanuel A.
2018-01-01
This paper introduces a new open-source software program called VoroTop, which uses Voronoi topology to analyze local structure in atomic systems. Strengths of this approach include its abilities to analyze high-temperature systems and to characterize complex structure such as grain boundaries. This approach enables the automated analysis of systems and mechanisms previously not possible.
Attempt to generalize fractional-order electric elements to complex-order ones
NASA Astrophysics Data System (ADS)
Si, Gangquan; Diao, Lijie; Zhu, Jianwei; Lei, Yuhang; Zhang, Yanbin
2017-06-01
The complex derivative D^{α ± jβ}, with α, β ∈ R⁺, is a generalization of the integer derivative, which corresponds to α = 1, β = 0. Fractional-order electric elements and circuits are becoming more and more attractive. In this paper, the concept of complex-order electric elements is proposed for the first time, and the complex-order elements are modeled and analyzed. Interesting phenomena are found: for both the complex-order capacitor and the complex-order memristor, the real part of the order affects the phase of the output signal, while the imaginary part affects the amplitude. More interestingly, the complex-order capacitor performs well when fitting electrochemical impedance spectra. The complex-order memristor is also analyzed: the area inside its hysteresis loops increases with the imaginary part of the order and decreases with the real part. Finally, some more complex cases of complex-order memristor hysteresis loops are analyzed, in which the loops have touching points away from the origin of the coordinate system.
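A minimal numerical sketch of the qualitative behavior described above, under the assumption that a complex-order capacitor has impedance Z(ω) = 1/(C (jω)^(α+jβ)); the capacitance and order values are hypothetical.

```python
import numpy as np

# Assumed impedance of a complex-order capacitor: Z(w) = 1 / (C * (j*w)**(alpha + j*beta)).
# This form is an assumption for illustration; values of C, alpha, beta are hypothetical.
def impedance(w, C=1e-3, alpha=1.0, beta=0.0):
    order = alpha + 1j * beta
    return 1.0 / (C * (1j * w) ** order)

w = 2 * np.pi * 50.0     # evaluate at 50 Hz

for alpha, beta in [(1.0, 0.0), (0.8, 0.0), (1.0, 0.1), (1.0, 0.3)]:
    Z = impedance(w, alpha=alpha, beta=beta)
    print(f"alpha={alpha}, beta={beta}: |Z|={abs(Z):.3f} ohm, "
          f"phase={np.degrees(np.angle(Z)):.1f} deg")
# The printed magnitudes and phases show how the real and imaginary parts of the order
# each reshape the response (the abstract attributes phase changes mainly to the real
# part and amplitude changes to the imaginary part).
```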
Studying the HIT-Complexity Interchange.
Kuziemsky, Craig E; Borycki, Elizabeth M; Kushniruk, Andre W
2016-01-01
The design and implementation of health information technology (HIT) is challenging, particularly when it is being introduced into complex settings. While complex adaptive systems (CASs) can be a valuable means of understanding relationships between users, HIT and tasks, much of the existing work using CASs is descriptive in nature. This paper addresses that issue by integrating a model for analyzing task complexity with approaches for HIT evaluation and systems analysis. The resulting framework classifies HIT-user tasks and issues as simple, complicated or complex, and provides insight on how to study them.
Chang, Le; Baseggio, Oscar; Sementa, Luca; Cheng, Daojian; Fronzoni, Giovanna; Toffoli, Daniele; Aprà, Edoardo; Stener, Mauro; Fortunelli, Alessandro
2018-06-13
We introduce Individual Component Maps of Rotatory Strength (ICM-RS) and Rotatory Strength Density (RSD) plots as analysis tools of chiro-optical linear response spectra deriving from time-dependent density functional theory (TDDFT) simulations. ICM-RS and RSD allow one to visualize the origin of chiro-optical response in momentum or real space, including signed contributions and therefore highlighting cancellation terms that are ubiquitous in chirality phenomena, and should be especially useful in analyzing the spectra of complex systems. As test cases, we use ICM-RS and RSD to analyze circular dichroism spectra of selected (Ag-Au)30(SR)18 monolayer-protected metal nanoclusters, showing the potential of the proposed tools to derive insight and understanding, and eventually rational design, in chiro-optical studies of complex systems.
Pattern-Oriented Modeling of Agent-Based Complex Systems: Lessons from Ecology
NASA Astrophysics Data System (ADS)
Grimm, Volker; Revilla, Eloy; Berger, Uta; Jeltsch, Florian; Mooij, Wolf M.; Railsback, Steven F.; Thulke, Hans-Hermann; Weiner, Jacob; Wiegand, Thorsten; DeAngelis, Donald L.
2005-11-01
Agent-based complex systems are dynamic networks of many interacting agents; examples include ecosystems, financial markets, and cities. The search for general principles underlying the internal organization of such systems often uses bottom-up simulation models such as cellular automata and agent-based models. No general framework for designing, testing, and analyzing bottom-up models has yet been established, but recent advances in ecological modeling have come together in a general strategy we call pattern-oriented modeling. This strategy provides a unifying framework for decoding the internal organization of agent-based complex systems and may lead toward unifying algorithmic theories of the relation between adaptive behavior and system complexity.
The Effectiveness of an Electronic Security Management System in a Privately Owned Apartment Complex
ERIC Educational Resources Information Center
Greenberg, David F.; Roush, Jeffrey B.
2009-01-01
Poisson and negative binomial regression methods are used to analyze the monthly time series data to determine the effects of introducing an integrated security management system including closed-circuit television (CCTV), door alarm monitoring, proximity card access, and emergency call boxes to a large privately-owned complex of apartment…
Wind Energy System Time-domain (WEST) analyzers using hybrid simulation techniques
NASA Technical Reports Server (NTRS)
Hoffman, J. A.
1979-01-01
Two stand-alone analyzers constructed for real time simulation of the complex dynamic characteristics of horizontal-axis wind energy systems are described. Mathematical models for an aeroelastic rotor, including nonlinear aerodynamic and elastic loads, are implemented with high speed digital and analog circuitry. Models for elastic supports, a power train, a control system, and a rotor gimbal system are also included. Limited correlation efforts show good comparisons between results produced by the analyzers and results produced by a large digital simulation. The digital simulation results correlate well with test data.
In recent years, a new class of enclosed, closed-path gas analyzers suitable for eddy covariance applications has come to market, designed to combine the advantages of traditional closed-path systems (small density corrections, good performance in poor weather) and open-path syst...
Verification of Space Station Secondary Power System Stability Using Design of Experiment
NASA Technical Reports Server (NTRS)
Karimi, Kamiar J.; Booker, Andrew J.; Mong, Alvin C.; Manners, Bruce
1998-01-01
This paper describes analytical methods used in verification of large DC power systems with applications to the International Space Station (ISS). Large DC power systems contain many switching power converters with negative resistor characteristics. The ISS power system presents numerous challenges with respect to system stability such as complex sources and undefined loads. The Space Station program has developed impedance specifications for sources and loads. The overall approach to system stability consists of specific hardware requirements coupled with extensive system analysis and testing. Testing of large complex distributed power systems is not practical due to size and complexity of the system. Computer modeling has been extensively used to develop hardware specifications as well as to identify system configurations for lab testing. The statistical method of Design of Experiments (DoE) is used as an analysis tool for verification of these large systems. DoE reduces the number of computer runs which are necessary to analyze the performance of a complex power system consisting of hundreds of DC/DC converters. DoE also provides valuable information about the effect of changes in system parameters on the performance of the system. DoE provides information about various operating scenarios and identification of the ones with potential for instability. In this paper we will describe how we have used computer modeling to analyze a large DC power system. A brief description of DoE is given. Examples using applications of DoE to analysis and verification of the ISS power system are provided.
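A minimal sketch of the DoE idea described above: a two-level factorial design over a few hypothetical source/load factors with a made-up stability-margin response, used to estimate main effects. It only illustrates the screening logic, not the ISS models or tools.

```python
from itertools import product

# Two-level full factorial design over three hypothetical factors affecting stability:
# source output impedance, load input filter capacitance, and cable inductance.
# The "stability margin" response below is a made-up toy function for illustration only.
levels = {"Zsource": (-1, +1), "Cfilter": (-1, +1), "Lcable": (-1, +1)}

def stability_margin(z, c, l):
    # Toy response: margin shrinks with source impedance and cable inductance,
    # grows with filter capacitance, with a small interaction term.
    return 6.0 - 1.5 * z + 1.0 * c - 0.8 * l - 0.4 * z * l

runs = list(product(*levels.values()))
responses = [stability_margin(*run) for run in runs]

# Main effect of a factor = mean response at +1 minus mean response at -1.
for i, name in enumerate(levels):
    hi = [y for run, y in zip(runs, responses) if run[i] == +1]
    lo = [y for run, y in zip(runs, responses) if run[i] == -1]
    print(f"main effect of {name}: {sum(hi)/len(hi) - sum(lo)/len(lo):+.2f}")

# With many more factors, a fractional factorial design keeps the run count manageable
# while still exposing which parameters push the system toward instability.
```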
NASA Technical Reports Server (NTRS)
Lindvall, Mikael; Godfrey, Sally; Ackermann, Chris; Ray, Arnab; Yonkwa, Lyly; Ganesan, Dharma; Stratton, William C.; Sibol, Deane E.
2008-01-01
This work analyzes, visualizes, and evaluates structure and behavior using static and dynamic information, for individual systems as well as systems of systems. Next steps: refine software tool support, apply to other systems, and apply earlier in the system life cycle.
Functional complexity and ecosystem stability: an experimental approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Voris, P.; O'Neill, R.V.; Shugart, H.H.
1978-01-01
The complexity-stability hypothesis was experimentally tested using intact terrestrial microcosms. Functional complexity was defined as the number and significance of component interactions (i.e., population interactions, physical-chemical reactions, biological turnover rates) influenced by nonlinearities, feedbacks, and time delays. It was postulated that functional complexity could be nondestructively measured through analysis of a signal generated from the system. Power spectral analysis of hourly CO/sub 2/ efflux, from eleven old-field microcosms, was analyzed for the number of low frequency peaks and used to rank the functional complexity of each system. Ranking of ecosystem stability was based on the capacity of the system to retain essential nutrients and was measured by net loss of Ca after the system was stressed. Rank correlation supported the hypothesis that increasing ecosystem functional complexity leads to increasing ecosystem stability. The results indicated that complex functional dynamics can serve to stabilize the system. The results also demonstrated that microcosms are useful tools for system-level investigations.
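A minimal sketch of the spectral analysis described above, applied to a synthetic hourly CO2-efflux series; the signal, peak-detection threshold, and frequency cutoff are hypothetical. It shows how low-frequency peaks might be counted to rank functional complexity.

```python
import numpy as np
from scipy.signal import periodogram, find_peaks

# Hypothetical hourly CO2-efflux record: a daily cycle, a slower multi-day component,
# and noise, standing in for the microcosm measurements described in the abstract.
rng = np.random.default_rng(0)
hours = np.arange(24 * 60)                           # 60 days of hourly data
co2 = (1.0 * np.sin(2 * np.pi * hours / 24)          # diurnal component
       + 0.5 * np.sin(2 * np.pi * hours / (24 * 7))  # slower weekly component
       + 0.3 * rng.standard_normal(hours.size))

# Power spectral density of the hourly signal (sampling frequency = 1/hour).
freqs, psd = periodogram(co2, fs=1.0)

# Count low-frequency peaks (periods longer than ~12 hours), which the study used
# to rank the functional complexity of each microcosm.
low = freqs < 1.0 / 12.0
peaks, _ = find_peaks(psd[low], height=np.median(psd[low]) * 10)
print(f"low-frequency spectral peaks found: {len(peaks)}")
print("peak periods (hours):",
      [round(1.0 / freqs[low][p], 1) for p in peaks if freqs[low][p] > 0])
```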
Predictability of Extreme Climate Events via a Complex Network Approach
NASA Astrophysics Data System (ADS)
Muhkin, D.; Kurths, J.
2017-12-01
We analyse climate dynamics from a complex network approach. This leads to an inverse problem: Is there a backbone-like structure underlying the climate system? For this we propose a method to reconstruct and analyze a complex network from data generated by a spatio-temporal dynamical system. This approach enables us to uncover relations to global circulation patterns in oceans and atmosphere. This concept is then applied to Monsoon data; in particular, we develop a general framework to predict extreme events by combining a non-linear synchronization technique with complex networks. Applying this method, we uncover a new mechanism of extreme floods in the eastern Central Andes which could be used for operational forecasts. Moreover, we analyze the Indian Summer Monsoon (ISM) and identify two regions of high importance. By estimating an underlying critical point, this leads to an improved prediction of the onset of the ISM; this scheme was successful in 2016 and 2017.
NASA Astrophysics Data System (ADS)
Wang, Rongxi; Gao, Xu; Gao, Jianmin; Gao, Zhiyong; Kang, Jiani
2018-02-01
As one of the most important approaches for analyzing the mechanism of fault pervasion, fault root cause tracing is a powerful and useful tool for detecting the fundamental causes of faults so as to prevent any further propagation and amplification. To address the problems arising from the lack of systematic and comprehensive integration, a novel information-transfer-based, data-driven framework for fault root cause tracing of complex electromechanical systems in the processing industry was proposed, taking into consideration the experience and qualitative analysis of conventional fault root cause tracing methods. Firstly, an improved symbolic transfer entropy method was presented to construct a directed, weighted information model for a specific complex electromechanical system based on the information flow. Secondly, considering the feedback mechanisms in complex electromechanical systems, a method for determining the threshold values of the weights was developed to explore the laws of fault propagation. Lastly, an iterative method was introduced to identify the fault development process. The fault root cause was traced by analyzing the changes in information transfer between the nodes along the fault propagation pathway. An actual fault root cause tracing application of a complex electromechanical system is used to verify the effectiveness of the proposed framework. A unique fault root cause is obtained regardless of the choice of the initial variable. Thus, the proposed framework can be flexibly and effectively used in fault root cause tracing for complex electromechanical systems in the processing industry, and forms the foundation for system vulnerability analysis and condition prediction, as well as other engineering applications.
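A minimal sketch of a basic symbolic transfer entropy computation (ordinal-pattern symbolization, history length 1) on two synthetic, unidirectionally coupled series. This is a generic illustration of the information-transfer idea, not the paper's improved method; the coupling rule and all parameters are hypothetical.

```python
import numpy as np
from collections import Counter
from itertools import permutations

def symbolize(x, m=3):
    """Map a series to ordinal-pattern symbols of embedding dimension m."""
    patterns = {p: i for i, p in enumerate(permutations(range(m)))}
    return np.array([patterns[tuple(np.argsort(x[i:i + m]))]
                     for i in range(len(x) - m + 1)])

def transfer_entropy(sym_x, sym_y):
    """Basic symbolic transfer entropy TE(X -> Y) in bits, history length 1."""
    triples = Counter(zip(sym_y[1:], sym_y[:-1], sym_x[:-1]))   # (y_next, y, x)
    pairs_yy = Counter(zip(sym_y[1:], sym_y[:-1]))              # (y_next, y)
    pairs_yx = Counter(zip(sym_y[:-1], sym_x[:-1]))             # (y, x)
    singles_y = Counter(sym_y[:-1])                             # (y,)
    n = len(sym_y) - 1
    te = 0.0
    for (y_next, y, x), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y, x)]
        p_cond_self = pairs_yy[(y_next, y)] / singles_y[y]
        te += p_joint * np.log2(p_cond_full / p_cond_self)
    return te

# Synthetic example: y is driven by x with a one-step delay, so TE(x->y) should
# exceed TE(y->x). The coupling below is hypothetical, purely for illustration.
rng = np.random.default_rng(1)
x = rng.standard_normal(5000)
y = np.zeros_like(x)
for t in range(1, len(x)):
    y[t] = 0.6 * x[t - 1] + 0.4 * rng.standard_normal()

sx, sy = symbolize(x), symbolize(y)
print(f"TE(x->y) = {transfer_entropy(sx, sy):.3f} bits")
print(f"TE(y->x) = {transfer_entropy(sy, sx):.3f} bits")
```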
NASA Astrophysics Data System (ADS)
Keilis-Borok, V. I.; Soloviev, A. A.
2010-09-01
Socioeconomic and natural complex systems persistently generate extreme events, also known as disasters, crises, or critical transitions. Here we analyze patterns of background activity preceding extreme events in four complex systems: economic recessions, surges in homicides in a megacity, magnetic storms, and strong earthquakes. We use as a starting point the indicators describing the system's behavior and identify changes in an indicator's trend. Those changes constitute our background events (BEs). We demonstrate a premonitory pattern common to all four systems considered: relatively large-magnitude BEs become more frequent before an extreme event. A premonitory change of scaling has been found in various models and observations. Here we demonstrate this change in the scaling of uniformly defined BEs in four real complex systems, their enormous differences notwithstanding.
Refined two-index entropy and multiscale analysis for complex system
NASA Astrophysics Data System (ADS)
Bian, Songhan; Shang, Pengjian
2016-10-01
As a fundamental concept for describing complex systems, the entropy measure has been proposed in various forms, such as Boltzmann-Gibbs (BG) entropy, one-index entropy, two-index entropy, sample entropy, permutation entropy, etc. This paper proposes a new two-index entropy S_{q,δ}, which we find applicable to measuring the complexity of a wide range of systems in terms of randomness and fluctuation range. For a more complex system, the value of the two-index entropy is smaller and the correlation between the parameter δ and the entropy S_{q,δ} is weaker. By combining the refined two-index entropy S_{q,δ} with the scaling exponent h(δ), this paper analyzes the complexity of simulated series and effectively classifies several financial markets in various regions of the world.
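Since the paper's exact definition of S_{q,δ} is not given here, the sketch below assumes a Tsallis-type two-parameter form, S_{q,δ} = Σ_i p_i (ln_q(1/p_i))^δ, purely to illustrate how such a two-index entropy responds to distributions of different randomness.

```python
import numpy as np

def ln_q(x, q):
    """q-logarithm: ln_q(x) = (x**(1-q) - 1) / (1-q), reducing to ln(x) as q -> 1."""
    if np.isclose(q, 1.0):
        return np.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def two_index_entropy(p, q=1.1, delta=1.2):
    """
    Two-parameter generalized entropy S_{q,delta} = sum_i p_i * (ln_q(1/p_i))**delta.
    This Tsallis-type form is an assumption used only for illustration; the refined
    entropy of the cited paper may be defined differently.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(np.sum(p * ln_q(1.0 / p, q) ** delta))

# Compare a uniform (more random) distribution with a concentrated one.
uniform = np.full(10, 0.1)
peaked = np.array([0.82] + [0.02] * 9)
for name, dist in [("uniform", uniform), ("peaked", peaked)]:
    print(f"{name:8s}: S_(1,1) = {two_index_entropy(dist, 1.0, 1.0):.3f}, "
          f"S_(1.1,1.2) = {two_index_entropy(dist, 1.1, 1.2):.3f}")
```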
NASA Astrophysics Data System (ADS)
Gong, Tao; Shuai, Lan; Wu, Yicheng
2014-12-01
By analyzing complex networks constructed from authentic language data, Cong and Liu [1] advance linguistics research into the big data era. The network approach has revealed many intrinsic generalities and crucial differences at both the macro and micro scales between human languages. The axiom behind this research is that language is a complex adaptive system [2]. Although many lexical, semantic, or syntactic features have been discovered by means of analyzing the static and dynamic linguistic networks of world languages, available network-based language studies have not explicitly addressed the evolutionary dynamics of language systems and the correlations between language and human cognition. This commentary aims to provide some insights on how to use the network approach to study these issues.
NASA Astrophysics Data System (ADS)
Glick, Aaron; Carr, Lincoln; Calarco, Tommaso; Montangero, Simone
2014-03-01
In order to investigate the emergence of complexity in quantum systems, we present a quantum game of life, inspired by Conway's classic game of life. Through Matrix Product State (MPS) calculations, we simulate the evolution of quantum systems, dictated by a Hamiltonian that defines the rules of our quantum game. We analyze the system through a number of measures which elicit the emergence of complexity in terms of spatial organization, system dynamics, and non-local mutual information within the network. Funded by NSF
Covian, Raul; Chess, David; Balaban, Robert S
2012-12-01
Native gel electrophoresis allows the separation of very small amounts of protein complexes while retaining aspects of their activity. In-gel enzymatic assays are usually performed by using reaction-dependent deposition of chromophores or light-scattering precipitates quantified at fixed time points after gel removal and fixation, limiting the ability to analyze the enzyme reaction kinetics. Herein, we describe a custom reaction chamber with reaction medium recirculation and filtering and an imaging system that permits the continuous monitoring of in-gel enzymatic activity even in the presence of turbidity. Images were continuously collected using time-lapse high-resolution digital imaging, and processing routines were developed to obtain kinetic traces of the in-gel activities and analyze reaction time courses. This system also permitted the evaluation of enzymatic activity topology within the protein bands of the gel. This approach was used to analyze the reaction kinetics of two mitochondrial complexes in native gels. Complex IV kinetics showed a short initial linear phase in which catalytic rates could be calculated, whereas Complex V activity revealed a significant lag phase followed by two linear phases. The utility of monitoring the entire kinetic behavior of these reactions in native gels, as well as the general application of this approach, is discussed. Published by Elsevier Inc.
Systems thinking: what business modeling can do for public health.
Williams, Warren; Lyalin, David; Wingo, Phyllis A
2005-01-01
Today's public health programs are complex business systems with multiple levels of collaborating federal, state, and local entities. The use of proven systems engineering modeling techniques to analyze, align, and streamline public health operations is in the beginning stages. The authors review the initial business modeling efforts in immunization and cancer registries and present a case to broadly apply business modeling approaches to analyze and improve public health processes.
Theorems and application of local activity of CNN with five state variables and one port.
Xiong, Gang; Dong, Xisong; Xie, Li; Yang, Thomas
2012-01-01
Coupled nonlinear dynamical systems have been widely studied recently. However, the dynamical properties of these systems are difficult to deal with. The local activity of the cellular neural network (CNN) has provided a powerful tool for studying the emergence of complex patterns in a homogeneous lattice, which is composed of coupled cells. In this paper, the analytical criteria for local activity in a reaction-diffusion CNN with five state variables and one port are presented; they consist of four theorems comprising a series of inequalities involving the CNN parameters. These theorems can be used for calculating the bifurcation diagram to determine or analyze the emergence of complex dynamic patterns, such as chaos. As a case study, a reaction-diffusion CNN of a hepatitis B virus (HBV) mutation-selection model is analyzed and simulated, and its bifurcation diagram is calculated. Using the diagram, numerical simulations of this CNN model provide reasonable explanations of complex mutant phenomena during therapy. Therefore, it is demonstrated that the local activity of CNN provides a practical tool for studying the complex dynamics of some coupled nonlinear systems.
The Security of Machine Learning
2008-04-24
Machine learning has become a fundamental tool for computer security, since it can rapidly evolve to changing and complex situations. That...adaptability is also a vulnerability: attackers can exploit machine learning systems. We present a taxonomy identifying and analyzing attacks against machine ...We use our framework to survey and analyze the literature of attacks against machine learning systems. We also illustrate our taxonomy by showing
Kitsukawa, Takashi; Nagata, Masatoshi; Yanagihara, Dai; Tomioka, Ryohei; Utsumi, Hideko; Kubota, Yasuo; Yagi, Takeshi; Graybiel, Ann M; Yamamori, Tetsuo
2011-07-01
Motor control is critical in daily life as well as in artistic and athletic performance and thus is the subject of intense interest in neuroscience. Mouse models of movement disorders have proven valuable for many aspects of investigation, but adequate methods for analyzing complex motor control in mouse models have not been fully established. Here, we report the development of a novel running-wheel system that can be used to evoke simple and complex stepping patterns in mice. The stepping patterns are controlled by spatially organized pegs, which serve as footholds that can be arranged in adjustable, ladder-like configurations. The mice run as they drink water from a spout, providing reward, while the wheel turns at a constant speed. The stepping patterns of the mice can thus be controlled not only spatially, but also temporally. A voltage sensor to detect paw touches is attached to each peg, allowing precise registration of footfalls. We show that this device can be used to analyze patterns of complex motor coordination in mice. We further demonstrate that it is possible to measure patterns of neural activity with chronically implanted tetrodes as the mice engage in vigorous running bouts. We suggest that this instrumented multipeg running wheel (which we name the Step-Wheel System) can serve as an important tool in analyzing motor control and motor learning in mice.
Time Factor in the Theory of Anthropogenic Risk Prediction in Complex Dynamic Systems
NASA Astrophysics Data System (ADS)
Ostreikovsky, V. A.; Shevchenko, Ye N.; Yurkov, N. K.; Kochegarov, I. I.; Grishko, A. K.
2018-01-01
The article reviews anthropogenic risk models that take into consideration how the factors influencing a complex system develop over time. Three classes of mathematical models have been analyzed for use in assessing the anthropogenic risk of complex dynamic systems. These models take the time factor into account when determining how the safety of critical systems may change. The originality of the study lies in the analysis of five time postulates in the theory of anthropogenic risk and the safety of highly important objects. It has to be stressed that these postulates are still rarely used in the practical assessment of the equipment service life of critically important systems. That is why the results of the study presented in the article can be used in safety engineering and the analysis of critically important complex technical systems.
Structural Behavioral Study on the General Aviation Network Based on Complex Network
NASA Astrophysics Data System (ADS)
Zhang, Liang; Lu, Na
2017-12-01
The general aviation system is an open and dissipative system with complex structures and behavioral features. This paper establishes the system model and the network model for general aviation. We have analyzed integral attributes and individual attributes by applying complex network theory and concluded that the general aviation network has influential enterprise factors and node relations. Using degree distribution functions, we have checked whether the network exhibits the small-world effect, the scale-free property, and the centrality property that a complex network should have, and shown that the general aviation network system is indeed a complex network. On this basis, we propose to drive the evolution of the general aviation industrial chain toward a collaborative innovation cluster of advanced industries by strengthening the network multiplication effect, stimulating innovation performance, and spanning structural holes.
Supporting Space Systems Design via Systems Dependency Analysis Methodology
NASA Astrophysics Data System (ADS)
Guariniello, Cesare
The increasing size and complexity of space systems and space missions pose severe challenges to space systems engineers. When complex systems and Systems-of-Systems are involved, the behavior of the whole entity is not only due to that of the individual systems involved but also to the interactions and dependencies between the systems. Dependencies can be varied and complex, and designers usually do not perform analysis of the impact of dependencies at the level of complex systems, or this analysis involves excessive computational cost, or occurs at a later stage of the design process, after designers have already set detailed requirements, following a bottom-up approach. While classical systems engineering attempts to integrate the perspectives involved across the variety of engineering disciplines and the objectives of multiple stakeholders, there is still a need for more effective tools and methods capable to identify, analyze and quantify properties of the complex system as a whole and to model explicitly the effect of some of the features that characterize complex systems. This research describes the development and usage of Systems Operational Dependency Analysis and Systems Developmental Dependency Analysis, two methods based on parametric models of the behavior of complex systems, one in the operational domain and one in the developmental domain. The parameters of the developed models have intuitive meaning, are usable with subjective and quantitative data alike, and give direct insight into the causes of observed, and possibly emergent, behavior. The approach proposed in this dissertation combines models of one-to-one dependencies among systems and between systems and capabilities, to analyze and evaluate the impact of failures or delays on the outcome of the whole complex system. The analysis accounts for cascading effects, partial operational failures, multiple failures or delays, and partial developmental dependencies. The user of these methods can assess the behavior of each system based on its internal status and on the topology of its dependencies on systems connected to it. Designers and decision makers can therefore quickly analyze and explore the behavior of complex systems and evaluate different architectures under various working conditions. The methods support educated decision making both in the design and in the update process of systems architecture, reducing the need to execute extensive simulations. In particular, in the phase of concept generation and selection, the information given by the methods can be used to identify promising architectures to be further tested and improved, while discarding architectures that do not show the required level of global features. The methods, when used in conjunction with appropriate metrics, also allow for improved reliability and risk analysis, as well as for automatic scheduling and re-scheduling based on the features of the dependencies and on the accepted level of risk. This dissertation illustrates the use of the two methods in sample aerospace applications, both in the operational and in the developmental domain. The applications show how to use the developed methodology to evaluate the impact of failures, assess the criticality of systems, quantify metrics of interest, quantify the impact of delays, support informed decision making when scheduling the development of systems and evaluate the achievement of partial capabilities. 
A larger, well-framed case study illustrates how the Systems Operational Dependency Analysis method and the Systems Developmental Dependency Analysis method can support analysis and decision making, at the mid and high level, in the design process of architectures for the exploration of Mars. The case study also shows how the methods do not replace the classical systems engineering methodologies, but support and improve them.
Reliability analysis in interdependent smart grid systems
NASA Astrophysics Data System (ADS)
Peng, Hao; Kan, Zhe; Zhao, Dandan; Han, Jianmin; Lu, Jianfeng; Hu, Zhaolong
2018-06-01
Complex network theory is a useful way to study many real complex systems. In this paper, a reliability analysis model based on complex network theory is introduced for interdependent smart grid systems. We focus on understanding the structure of smart grid systems, the underlying network model, their interactions and relationships, and how cascading failures occur in interdependent smart grid systems. We propose a practical model for interdependent smart grid systems using complex network theory. In addition, based on percolation theory, we study the effect of cascading failures and provide a detailed mathematical analysis of failure propagation in such systems. We analyze the reliability of the proposed model under random attacks or failures by calculating the size of the giant functioning component in interdependent smart grid systems. Our simulation results show that there exists a threshold for the proportion of faulty nodes beyond which the smart grid systems collapse, and we determine the critical values for different system parameters. In this way, the reliability analysis model based on complex network theory can be effectively utilized for anti-attack and protection purposes in interdependent smart grid systems.
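A minimal percolation-style sketch of the reliability analysis described above: remove a random fraction of nodes from a random graph and track the giant component. The topology is hypothetical and the interdependence between grid layers is not modeled in this simplified illustration.

```python
import random
import networkx as nx

# Percolation-style reliability sketch: remove a random fraction of nodes and measure
# the relative size of the giant connected component. The topology here is a
# hypothetical random graph, not a real smart-grid model.
random.seed(0)
G = nx.erdos_renyi_graph(n=2000, p=0.002, seed=0)   # mean degree ~4

def giant_fraction(graph, removal_fraction):
    g = graph.copy()
    remove = random.sample(list(g.nodes()), int(removal_fraction * g.number_of_nodes()))
    g.remove_nodes_from(remove)
    if g.number_of_nodes() == 0:
        return 0.0
    giant = max(nx.connected_components(g), key=len)
    return len(giant) / graph.number_of_nodes()

for f in (0.0, 0.2, 0.4, 0.6, 0.7, 0.8, 0.9):
    print(f"removed fraction {f:.1f} -> giant component fraction {giant_fraction(G, f):.3f}")
# For an Erdos-Renyi graph with mean degree <k>, the percolation threshold sits at a
# removed fraction of roughly 1 - 1/<k>; beyond it, the giant component collapses.
```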
Wind energy system time-domain (WEST) analyzers
NASA Technical Reports Server (NTRS)
Dreier, M. E.; Hoffman, J. A.
1981-01-01
A portable analyzer which simulates in real time the complex nonlinear dynamics of horizontal axis wind energy systems was constructed. Math models for an aeroelastic rotor featuring nonlinear aerodynamic and inertial terms were implemented with high speed digital controllers and analog calculation. This model was combined with other math models of elastic supports, control systems, a power train and gimballed rotor kinematics. A stroboscopic display system graphically depicting distributed blade loads, motion, and other aerodynamic functions on a cathode ray tube is included. Limited correlation efforts showed good comparison between the results of this analyzer and other sophisticated digital simulations. The digital simulation results were successfully correlated with test data.
Thinking on building the network cardiovasology of Chinese medicine.
Yu, Gui; Wang, Jie
2012-11-01
With advances in complex network theory, the thinking and methods regarding complex systems have changed radically. Network biology and network pharmacology were built by applying network-based approaches in biomedical research. The cardiovascular system may be regarded as a complex network, and cardiovascular diseases may be taken as damage to the structure and function of the cardiovascular network. Although Chinese medicine (CM) is effective in treating cardiovascular diseases, its mechanisms are still unclear. With the guidance of complex network theory, network biology and network pharmacology, network-based approaches could be used in the study of CM in preventing and treating cardiovascular diseases. A new discipline, the network cardiovasology of CM, was therefore developed. In this paper, complex network theory, network biology and network pharmacology were introduced and the connotation of "disease-syndrome-formula-herb" was illustrated from the network angle. Network biology could be used to analyze cardiovascular diseases and syndromes, and network pharmacology could be used to analyze CM formulas and herbs. The "network-network"-based approaches could provide a new view for elucidating the mechanisms of CM treatment.
Multiagent model and mean field theory of complex auction dynamics
NASA Astrophysics Data System (ADS)
Chen, Qinghua; Huang, Zi-Gang; Wang, Yougui; Lai, Ying-Cheng
2015-09-01
Recent years have witnessed a growing interest in analyzing a variety of socio-economic phenomena using methods from statistical and nonlinear physics. We study a class of complex systems arising from economics, the lowest unique bid auction (LUBA) systems, which is a recently emerged class of online auction game systems. Through analyzing large, empirical data sets of LUBA, we identify a general feature of the bid price distribution: an inverted J-shaped function with exponential decay in the large bid price region. To account for the distribution, we propose a multi-agent model in which each agent bids stochastically in the field of winner’s attractiveness, and develop a theoretical framework to obtain analytic solutions of the model based on mean field analysis. The theory produces bid-price distributions that are in excellent agreement with those from the real data. Our model and theory capture the essential features of human behaviors in the competitive environment as exemplified by LUBA, and may provide significant quantitative insights into complex socio-economic phenomena.
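A minimal agent-based sketch of a lowest unique bid auction round and of tallying the empirical bid-price distribution; the bidding rule (a geometric-like draw) and all parameters are hypothetical simplifications of the model described in the abstract.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(7)

def luba_round(n_agents=200, max_bid=50, decay=0.12):
    """One lowest-unique-bid auction round with a simple stochastic bidding rule."""
    # Hypothetical rule: each agent draws a bid from a geometric-like distribution,
    # so low prices are attractive but crowded (a stand-in for the model's
    # "field of winner's attractiveness").
    weights = np.exp(-decay * np.arange(1, max_bid + 1))
    weights /= weights.sum()
    bids = rng.choice(np.arange(1, max_bid + 1), size=n_agents, p=weights)
    counts = Counter(bids)
    unique = sorted(b for b, c in counts.items() if c == 1)
    winner_bid = unique[0] if unique else None
    return bids, winner_bid

# Aggregate the empirical bid-price distribution and the winning bids over many rounds.
all_bids, winners = Counter(), Counter()
for _ in range(500):
    bids, w = luba_round()
    all_bids.update(bids.tolist())
    if w is not None:
        winners[w] += 1

print("most common bid prices:", all_bids.most_common(5))
print("most common winning bids:", winners.most_common(5))
# The abstract reports an inverted-J-shaped bid-price distribution with an exponential
# tail; tallies like these are how such an empirical distribution would be examined.
```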
Analysis of Students' Conceptions of Basic Magnetism from a Complex Systems Perspective
NASA Astrophysics Data System (ADS)
Lemmer, Miriam; Kriek, Jeanne; Erasmus, Benita
2018-03-01
This study established whether 12 South African secondary school physics students had incorrect conceptions of basic magnetism and, if they had, to what extent they consistently applied such conceptions. Different scenarios in the form of thought experiments were presented in a clinical interview approach. A complex systems perspective underpinned the study and was firstly used to analyze the 12 students' conceptions in terms of intuitive fragments of knowledge elements, structured misconceptions, and theory-like systems of knowledge elements. Secondly, coherence in each student's ideas expressed across ten themes using thought experiments was analyzed in an effort to determine variations or coherence in responses. Examples of student explanations and sketches are discussed in the paper to illustrate the conceptual structures they applied. Most of the students in this study used a variety of knowledge elements in accord with a complex systems perspective, but three students seemed to prefer a specific perspective. One student's ideas tended to be mainly fragmented, a second exposed a number of structured misconceptions, while another student's reasoning can be described as a theory-like system of knowledge elements. Accordingly, the emphasis of physics education research should no longer be on the compilation of a list of misconceptions that have to be remedied or replaced, but on the conceptual connections students make and their associative reasoning patterns (i.e., the knowledge systems revealed). It remains for the teacher to use the complex systems perspective as a framework to facilitate students' conceptual development and understanding, building on their existing knowledge systems.
Multivariate multiscale entropy of financial markets
NASA Astrophysics Data System (ADS)
Lu, Yunfan; Wang, Jun
2017-11-01
In current efforts to quantify the dynamical properties of complex phenomena in financial market systems, multivariate financial time series have attracted wide attention. In this work, considering the shortcomings and limitations of univariate multiscale entropy in analyzing multivariate time series, the multivariate multiscale sample entropy (MMSE), which can evaluate the complexity in multiple data channels over different timescales, is applied to quantify the complexity of financial markets. Its effectiveness and advantages are verified through numerical simulations on two well-known synthetic noise signals. For the first time, the complexity of four trivariate return series generated for each stock trading hour in the Chinese stock markets is quantified through the interdisciplinary application of this method. We find that the complexity of the trivariate return series in each hour shows a significant decreasing trend as the trading day progresses. Further, the shuffled multivariate return series and the absolute multivariate return series are also analyzed. As another new attempt, the complexity of global stock markets (Asia, Europe and America) is quantified by analyzing the multivariate returns from them. Finally, we utilize the multivariate multiscale entropy to assess the relative complexity of normalized multivariate return volatility series with different degrees.
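As a hedged illustration of the building blocks behind multiscale entropy, the following sketch implements the univariate coarse-graining and sample entropy steps that MMSE generalizes to several data channels; the embedding dimension m, tolerance r, and scale range are common defaults rather than values taken from the paper.

```python
import numpy as np

def coarse_grain(x, scale):
    """Non-overlapping averaging used in multiscale entropy."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r=0.15):
    """Univariate sample entropy SampEn(m, r) with tolerance r * std(x)."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def count_matches(order):
        # embed the series into 'order'-dimensional template vectors
        templ = np.array([x[i:i + order] for i in range(len(x) - order)])
        count = 0
        for i in range(len(templ)):
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            count += np.sum(d <= tol)
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return np.inf if a == 0 or b == 0 else -np.log(a / b)

def multiscale_entropy(x, scales=range(1, 11), m=2, r=0.15):
    """Sample entropy of the coarse-grained series at each time scale."""
    return [sample_entropy(coarse_grain(np.asarray(x, float), s), m, r)
            for s in scales]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    white = rng.normal(size=3000)   # white noise: entropy decreases with scale
    print([round(v, 3) for v in multiscale_entropy(white, scales=range(1, 6))])
```

The multivariate extension forms composite delay vectors across channels before matching templates; the univariate estimator above is only the common starting point.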
Vacuum system design and tritium inventory for the TFTR charge exchange diagnostic
DOE Office of Scientific and Technical Information (OSTI.GOV)
Medley, S.S.
The charge exchange diagnostic for the TFTR is comprised of two analyzer systems, which contain a total of twenty independent mass/energy analyzers, and one diagnostic neutral beam tentatively rated at 80 keV, 15 A. The associated vacuum systems were analyzed using the Vacuum System Transient Simulator (VSTS) computer program, which models the transient transport of multi-gas species through complex networks of ducts, valves, traps, vacuum pumps, and other related vacuum system components. In addition to providing improved design performance at reduced cost, the analysis yields estimates for the exchange of tritium from the torus to the diagnostic components and of the diagnostic working gases to the torus.
NASA Astrophysics Data System (ADS)
Sachko, A. V.; Zakordonskii, V. P.; Voloshinovskii, A. S.; Golod, T. Yu.
2009-07-01
A complex of physicochemical methods (light scattering, potentiometry, conductometry, viscometry, tensiometry, and fluorescence spectroscopy) was used to show the possibility of formation of intermolecular associates/complexes in systems with like-charged components. The driving forces of such interactions were analyzed, and a possible scheme of complex formation between polymethacrylic acid and sodium dodecylbenzenesulfonate was suggested.
NASA Astrophysics Data System (ADS)
Kolmogorov, Yu. P.; Mezentsev, N. A.; Mironov, A. G.; Parkhomenko, V. S.; Spiridonov, A. M.; Shaporenko, A. D.; Yusupov, T. S.; Zhmodik, S. M.; Zolotarev, K. V.; Anoshin, G. N.
2009-05-01
A system of methods for detecting the platinum group elements (PGE), Re, Au, and Ag in hard-to-analyze rocks and complex ores has been developed. It applies SRXRF for Ru, Rh, Pd, and Ag and the INAA method for Os, Ir, Pt, and Ag, and involves mechanical activation of the samples under study. Measurements of standard samples of carbonaceous rocks and ores for PGE, gold, and silver confirm the possibility of detecting some of the above-listed elements with a detection limit of 10 ppb.
Symmetric and Asymmetric Tendencies in Stable Complex Systems
Tan, James P. L.
2016-01-01
A commonly used approach to study stability in a complex system is by analyzing the Jacobian matrix at an equilibrium point of a dynamical system. The equilibrium point is stable if all eigenvalues have negative real parts. Here, by obtaining eigenvalue bounds of the Jacobian, we show that stable complex systems will favor mutualistic and competitive relationships that are asymmetrical (non-reciprocative) and trophic relationships that are symmetrical (reciprocative). Additionally, we define a measure called the interdependence diversity that quantifies how distributed the dependencies are between the dynamical variables in the system. We find that increasing interdependence diversity has a destabilizing effect on the equilibrium point, and the effect is greater for trophic relationships than for mutualistic and competitive relationships. These predictions are consistent with empirical observations in ecology. More importantly, our findings suggest stabilization algorithms that can apply very generally to a variety of complex systems. PMID:27545722
Symmetric and Asymmetric Tendencies in Stable Complex Systems.
Tan, James P L
2016-08-22
A commonly used approach to study stability in a complex system is by analyzing the Jacobian matrix at an equilibrium point of a dynamical system. The equilibrium point is stable if all eigenvalues have negative real parts. Here, by obtaining eigenvalue bounds of the Jacobian, we show that stable complex systems will favor mutualistic and competitive relationships that are asymmetrical (non-reciprocative) and trophic relationships that are symmetrical (reciprocative). Additionally, we define a measure called the interdependence diversity that quantifies how distributed the dependencies are between the dynamical variables in the system. We find that increasing interdependence diversity has a destabilizing effect on the equilibrium point, and the effect is greater for trophic relationships than for mutualistic and competitive relationships. These predictions are consistent with empirical observations in ecology. More importantly, our findings suggest stabilization algorithms that can apply very generally to a variety of complex systems.
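The stability criterion used in the abstract, namely that an equilibrium is stable when every eigenvalue of the Jacobian has a negative real part, can be explored numerically. The sketch below builds random Jacobians whose off-diagonal pairs have either the same sign or opposite signs and compares the largest real part of their eigenvalues; the Gaussian interaction strengths and unit self-regulation are illustrative assumptions, not the ensemble analyzed by the author.

```python
import numpy as np

def random_jacobian(n, pair_sign, sigma=0.2, rng=None):
    """Build a random Jacobian with -1 on the diagonal (self-regulation).

    pair_sign = +1 gives (J_ij, J_ji) pairs with the same sign
                (mutualism or competition);
    pair_sign = -1 gives pairs with opposite signs (trophic, predator-prey).
    Gaussian magnitudes and unit self-regulation are illustrative choices.
    """
    rng = np.random.default_rng() if rng is None else rng
    J = -np.eye(n)
    for i in range(n):
        for j in range(i + 1, n):
            a = sigma * abs(rng.normal())
            b = sigma * abs(rng.normal())
            s = rng.choice([-1.0, 1.0])        # overall sign of the pair
            J[i, j] = s * a
            J[j, i] = pair_sign * s * b
    return J

def max_real_eig(J):
    """Stability indicator: the equilibrium is stable iff this is negative."""
    return np.max(np.linalg.eigvals(J).real)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    same = [max_real_eig(random_jacobian(40, +1, rng=rng)) for _ in range(50)]
    opp = [max_real_eig(random_jacobian(40, -1, rng=rng)) for _ in range(50)]
    print("same-sign pairs,     mean max Re(lambda): %.3f" % np.mean(same))
    print("opposite-sign pairs, mean max Re(lambda): %.3f" % np.mean(opp))
```

Comparing the two means illustrates how the sign structure of reciprocal interactions shifts the leading eigenvalue, which is the quantity the eigenvalue bounds in the paper constrain analytically.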
Risk analysis with a fuzzy-logic approach of a complex installation
NASA Astrophysics Data System (ADS)
Peikert, Tim; Garbe, Heyno; Potthast, Stefan
2016-09-01
This paper introduces a procedural method based on fuzzy logic to systematically analyze the risk to an electronic system in an intentional electromagnetic environment (IEME). The method analyzes the susceptibility of a complex electronic installation with respect to intentional electromagnetic interference (IEMI). It combines the advantages of well-known techniques such as fault tree analysis (FTA), electromagnetic topology (EMT) and Bayesian networks (BN), and extends them with an approach for handling uncertainty. This approach uses fuzzy sets, membership functions and fuzzy logic to handle the uncertainty through probability functions and linguistic terms. The linguistic terms add expert knowledge of the investigated system or environment to the risk analysis.
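A minimal sketch of the fuzzy machinery the abstract refers to is given below: triangular membership functions map a crisp susceptibility score onto linguistic terms, and a toy min-AND rule combines two antecedents. The term breakpoints and the rule itself are assumptions for illustration, not values from the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b on the support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Linguistic terms for a normalized score in [0, 1];
# the breakpoints are illustrative assumptions, not values from the paper.
terms = {
    "low":    lambda x: tri(x, -0.01, 0.0, 0.4),
    "medium": lambda x: tri(x, 0.2, 0.5, 0.8),
    "high":   lambda x: tri(x, 0.6, 1.0, 1.01),
}

def fuzzify(x):
    """Degree of membership of a crisp value in each linguistic term."""
    return {name: mf(x) for name, mf in terms.items()}

def risk_rule(susceptibility, exposure):
    """Toy Mamdani-style rule: risk activation is the weaker (min-AND) of the
    antecedents 'susceptibility is high' and 'exposure is high'."""
    return min(fuzzify(susceptibility)["high"], fuzzify(exposure)["high"])

if __name__ == "__main__":
    print(fuzzify(0.7))                       # partial membership in medium and high
    print("rule activation:", risk_rule(0.7, 0.9))
```

In a full analysis the activations of many such rules would be aggregated and defuzzified into a single risk level, which is where the combination with FTA, EMT and BN described above comes in.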
Johnston, Lee M; Matteson, Carrie L; Finegood, Diane T
2014-07-01
We demonstrate the use of a systems-based framework to assess solutions to complex health problems such as obesity. We coded 12 documents published between 2004 and 2013 aimed at influencing obesity planning for complex systems design (9 reports from US and Canadian governmental or health authorities, 1 Cochrane review, and 2 Institute of Medicine reports). We sorted data using the intervention-level framework (ILF), a novel solutions-oriented approach to complex problems. An in-depth comparison of 3 documents provides further insight into complexity and systems design in obesity policy. The majority of strategies focused mainly on changing the determinants of energy imbalance (food intake and physical activity). ILF analysis brings to the surface actions aimed at higher levels of system function and points to a need for more innovative policy design. Although many policymakers acknowledge obesity as a complex problem, many strategies stem from the paradigm of individual choice and are limited in scope. The ILF provides a template to encourage natural systems thinking and more strategic policy design grounded in complexity science.
Optimal service using Matlab - simulink controlled Queuing system at call centers
NASA Astrophysics Data System (ADS)
Balaji, N.; Siva, E. P.; Chandrasekaran, A. D.; Tamilazhagan, V.
2018-04-01
This paper presents graphical integrated-model-based academic research on telephone call centres. It introduces the important features of impatient customers and abandonment in the queuing system. The modern call centre is a complex socio-technical system, and queuing theory has become a suitable tool in the telecom industry for providing better online services. Matlab-Simulink multi-queue structured models provide better solutions for complex situations at call centres. Service performance measures are analyzed at the optimal level through the Simulink queuing model.
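A rough, time-stepped Monte Carlo approximation of a multi-server queue with impatient customers, the situation the abstract models in Matlab-Simulink, is sketched below in Python; all rates and the time step are placeholder values, not parameters from the paper.

```python
import numpy as np

def simulate_call_center(lam=5.0, mu=1.0, theta=0.5, servers=6,
                         horizon=2000.0, dt=0.01, seed=0):
    """Time-stepped approximation of an M/M/c queue with exponentially
    distributed caller patience (abandonment rate theta).

    All rates are per unit time; dt must be small relative to 1/rates.
    Returns the fraction of arrivals that abandoned and the mean queue length.
    """
    rng = np.random.default_rng(seed)
    busy, queue = 0, 0
    arrivals = abandoned = 0
    queue_area = 0.0
    for _ in range(int(horizon / dt)):
        new = rng.poisson(lam * dt)                    # arrivals in this slice
        arrivals += new
        queue += new
        done = rng.binomial(busy, min(mu * dt, 1.0))   # service completions
        busy -= done
        lost = rng.binomial(queue, min(theta * dt, 1.0))  # abandonments
        queue -= lost
        abandoned += lost
        move = min(queue, servers - busy)              # seat waiting callers
        queue -= move
        busy += move
        queue_area += queue * dt
    return abandoned / max(arrivals, 1), queue_area / horizon

if __name__ == "__main__":
    frac_abandoned, mean_queue = simulate_call_center()
    print(f"abandonment fraction: {frac_abandoned:.3f}, mean queue: {mean_queue:.2f}")
```

Sweeping the number of servers in such a simulation gives the kind of service-level versus staffing trade-off that the Simulink model in the paper is built to analyze.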
Intelligent control of a planning system for astronaut training.
Ortiz, J; Chen, G
1999-07-01
This work intends to design, analyze and solve, from the systems control perspective, a complex, dynamic, and multiconstrained planning system for generating training plans for crew members of the NASA-led International Space Station. Various intelligent planning systems have been developed within the framework of artificial intelligence. These planning systems generally lack a rigorous mathematical formalism to allow a reliable and flexible methodology for their design, modeling, and performance analysis in a dynamical, time-critical, and multiconstrained environment. Formulating the planning problem in the domain of discrete-event systems under a unified framework such that it can be modeled, designed, and analyzed as a control system will provide a self-contained theory for such planning systems. This will also provide a means to certify various planning systems for operations in the dynamical and complex environments in space. The work presented here completes the design, development, and analysis of an intricate, large-scale, and representative mathematical formulation for intelligent control of a real planning system for Space Station crew training. This planning system has been tested and used at NASA-Johnson Space Center.
Multifractality and heteroscedastic dynamics: An application to time series analysis
NASA Astrophysics Data System (ADS)
Nascimento, C. M.; Júnior, H. B. N.; Jennings, H. D.; Serva, M.; Gleria, Iram; Viswanathan, G. M.
2008-01-01
An increasingly important problem in physics concerns scale invariance symmetry in diverse complex systems, often characterized by heteroscedastic dynamics. We investigate the nature of the relationship between the heteroscedastic and fractal aspects of the dynamics of complex systems, by analyzing the sensitivity to heteroscedasticity of the scaling properties of weakly nonstationary time series. By using multifractal detrended fluctuation analysis, we study the singularity spectra of currency exchange rate fluctuations, after partially or completely eliminating n-point correlations via data shuffling techniques. We conclude that heteroscedasticity can significantly increase multifractality and interpret these findings in the context of self-organizing and adaptive complex systems.
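The shuffling step mentioned in the abstract can be illustrated with a minimal monofractal detrended fluctuation analysis (DFA); multifractal DFA generalizes this by varying the moment order, which the sketch omits. Window sizes and the test signal are illustrative choices, not the currency data studied in the paper.

```python
import numpy as np

def dfa(x, window_sizes=None, order=1):
    """Detrended fluctuation analysis: returns window sizes s and F(s).

    The scaling exponent is the slope of log F(s) versus log s.
    """
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())                  # integrated series
    n = len(profile)
    if window_sizes is None:
        window_sizes = np.unique(np.logspace(1, np.log10(n // 4), 20).astype(int))
    fluct = []
    for s in window_sizes:
        n_win = n // s
        segs = profile[:n_win * s].reshape(n_win, s)
        t = np.arange(s)
        rms = []
        for seg in segs:
            coeffs = np.polyfit(t, seg, order)         # local polynomial trend
            rms.append(np.sqrt(np.mean((seg - np.polyval(coeffs, t)) ** 2)))
        fluct.append(np.mean(rms))
    return np.asarray(window_sizes), np.asarray(fluct)

def scaling_exponent(x, **kw):
    s, f = dfa(x, **kw)
    return np.polyfit(np.log(s), np.log(f), 1)[0]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # correlated surrogate: the cumulative sum of white noise has exponent ~1.5
    series = np.cumsum(rng.normal(size=5000))
    shuffled = rng.permutation(series)                 # shuffling destroys correlations
    print("original :", round(scaling_exponent(series), 2))
    print("shuffled :", round(scaling_exponent(shuffled), 2))
```

The shuffled series loses its temporal correlations (exponent near 0.5) while keeping the value distribution, which is exactly the property the authors exploit to separate correlation-driven from distribution-driven multifractality.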
Power-rate-distortion analysis for wireless video communication under energy constraint
NASA Astrophysics Data System (ADS)
He, Zhihai; Liang, Yongfang; Ahmad, Ishfaq
2004-01-01
In video coding and streaming over a wireless communication network, the power-demanding video encoding operates on mobile devices with a limited energy supply. To analyze, control, and optimize the rate-distortion (R-D) behavior of the wireless video communication system under the energy constraint, we need to develop a power-rate-distortion (P-R-D) analysis framework, which extends the traditional R-D analysis by including another dimension, the power consumption. Specifically, in this paper, we analyze the encoding mechanism of typical video encoding systems and develop a parametric video encoding architecture which is fully scalable in computational complexity. Using dynamic voltage scaling (DVS), a hardware technology recently developed in CMOS circuit design, the complexity scalability can be translated into power consumption scalability of the video encoder. We investigate the rate-distortion behaviors of the complexity control parameters and establish an analytic framework to explore the P-R-D behavior of the video encoding system. Both theoretically and experimentally, we show that, using this P-R-D model, the encoding system is able to automatically adjust its complexity control parameters to match the available energy supply of the mobile device while maximizing the picture quality. The P-R-D model provides a theoretical guideline for system design and performance optimization in wireless video communication under energy constraints, especially over wireless video sensor networks.
Threshold transitions in a regional urban system
In this paper we analyze the evolution of city size distributions over time in a regional urban system. This urban complex system is in constant flux with changing groups and city migration across existing and newly created groups. Using group formation as an emergent property, t...
ERIC Educational Resources Information Center
Shabes, Vladimir; Troshchenkova, Ekaterina; Potapova, Tamara; Ivarsson, Lena; Damber, Ulla; Bostedt, Goran
2012-01-01
In the article on the basis of the psycholinguistic experimental data obtained in 2009-2010 from Russian and Swedish students, we consider internal features of several complex values ("Harmony", "Freedom", "Democracy", "Tolerance" and "Patriotism") and analyze their external systemic organization,…
Automated Instructional Monitors for Complex Operational Tasks. Final Report.
ERIC Educational Resources Information Center
Feurzeig, Wallace
A computer-based instructional system is described which incorporates diagnosis of students' difficulties in acquiring complex concepts and skills. A computer automatically generated a simulated display. It then monitored and analyzed a student's work in the performance of assigned training tasks. Two major tasks were studied. The first,…
NASA Astrophysics Data System (ADS)
Sherwin, Jason
At the start of the 21st century, the topic of complexity remains a formidable challenge in engineering, science and other aspects of our world. It seems that when disaster strikes it is because some complex and unforeseen interaction causes the unfortunate outcome. Why did the financial system of the world meltdown in 2008--2009? Why are global temperatures on the rise? These questions and other ones like them are difficult to answer because they pertain to contexts that require lengthy descriptions. In other words, these contexts are complex. But we as human beings are able to observe and recognize this thing we call 'complexity'. Furthermore, we recognize that there are certain elements of a context that form a system of complex interactions---i.e., a complex system. Many researchers have even noted similarities between seemingly disparate complex systems. Do sub-atomic systems bear resemblance to weather patterns? Or do human-based economic systems bear resemblance to macroscopic flows? Where do we draw the line in their resemblance? These are the kinds of questions that are asked in complex systems research. And the ability to recognize complexity is not only limited to analytic research. Rather, there are many known examples of humans who, not only observe and recognize but also, operate complex systems. How do they do it? Is there something superhuman about these people or is there something common to human anatomy that makes it possible to fly a plane? Or to drive a bus? Or to operate a nuclear power plant? Or to play Chopin's etudes on the piano? In each of these examples, a human being operates a complex system of machinery, whether it is a plane, a bus, a nuclear power plant or a piano. What is the common thread running through these abilities? The study of situational awareness (SA) examines how people do these types of remarkable feats. It is not a bottom-up science though because it relies on finding general principles running through a host of varied human activities. Nevertheless, since it is not constrained by computational details, the study of situational awareness provides a unique opportunity to approach complex tasks of operation from an analytical perspective. In other words, with SA, we get to see how humans observe, recognize and react to complex systems on which they exert some control. Reconciling this perspective on complexity with complex systems research, it might be possible to further our understanding of complex phenomena if we can probe the anatomical mechanisms by which we, as humans, do it naturally. At this unique intersection of two disciplines, a hybrid approach is needed. So in this work, we propose just such an approach. In particular, this research proposes a computational approach to the situational awareness (SA) of complex systems. Here we propose to implement certain aspects of situational awareness via a biologically-inspired machine-learning technique called Hierarchical Temporal Memory (HTM). In doing so, we will use either simulated or actual data to create and to test computational implementations of situational awareness. This will be tested in two example contexts, one being more complex than the other. The ultimate goal of this research is to demonstrate a possible approach to analyzing and understanding complex systems. By using HTM and carefully developing techniques to analyze the SA formed from data, it is believed that this goal can be obtained.
A system for programming experiments and for recording and analyzing data automatically
Herrick, Robert M.; Denelsbeck, John S.
1963-01-01
A system designed for use in complex operant conditioning experiments is described. Some of its key features are: (a) plugboards that permit the experimenter to change either from one program to another or from one analysis to another in less than a minute, (b) time-sharing of permanently-wired, electronic logic components, (c) recordings suitable for automatic analyses. Included are flow diagrams of the system and sample logic diagrams for programming experiments and for analyzing data. PMID:14055967
Micro-Macro Analysis of Complex Networks
Marchiori, Massimo; Possamai, Lino
2015-01-01
Complex systems have attracted considerable interest because of their wide range of applications, and are often studied via a “classic” approach: study a specific system, find a complex network behind it, and analyze the corresponding properties. This simple methodology has produced a great deal of interesting results, but relies on an often implicit underlying assumption: the level of detail on which the system is observed. However, in many situations, physical or abstract, the level of detail can be one out of many, and might also depend on intrinsic limitations in viewing the data with a different level of abstraction or precision. So, a fundamental question arises: do properties of a network depend on its level of observability, or are they invariant? If there is a dependence, then an apparently correct network modeling could in fact just be a bad approximation of the true behavior of a complex system. In order to answer this question, we propose a novel micro-macro analysis of complex systems that quantitatively describes how the structure of complex networks varies as a function of the detail level. To this extent, we have developed a new telescopic algorithm that abstracts from the local properties of a system and reconstructs the original structure according to a fuzziness level. This way we can study what happens when passing from a fine level of detail (“micro”) to a different scale level (“macro”), and analyze the corresponding behavior in this transition, obtaining a deeper spectrum analysis. The obtained results show that many important properties are not universally invariant with respect to the level of detail, but instead strongly depend on the specific level on which a network is observed. Therefore, caution should be taken in every situation where a complex network is considered, if its context allows for different levels of observability. PMID:25635812
Design Features of the Neutral Particle Diagnostic System for the ITER Tokamak
NASA Astrophysics Data System (ADS)
Petrov, S. Ya.; Afanasyev, V. I.; Melnik, A. D.; Mironov, M. I.; Navolotsky, A. S.; Nesenevich, V. G.; Petrov, M. P.; Chernyshev, F. V.; Kedrov, I. V.; Kuzmin, E. G.; Lyublin, B. V.; Kozlovski, S. S.; Mokeev, A. N.
2017-12-01
The control of the deuterium-tritium (DT) fuel isotopic ratio has to ensure the best performance of the ITER thermonuclear fusion reactor. The diagnostic system described in this paper allows the measurement of this ratio by analyzing the hydrogen isotope fluxes (performing neutral particle analysis (NPA)). The development and supply of the NPA diagnostics for ITER was delegated to the Russian Federation. The diagnostics is being developed at the Ioffe Institute. The system consists of two analyzers, viz., LENPA (Low Energy Neutral Particle Analyzer) with a 10-200 keV energy range and HENPA (High Energy Neutral Particle Analyzer) with a 0.1-4.0 MeV energy range. Simultaneous operation of both analyzers in different energy ranges enables researchers to measure the DT fuel ratio both in the central burning plasma (thermonuclear burn zone) and at the edge. When developing the diagnostic complex, it was necessary to account for the impact of several factors: high levels of neutron and gamma radiation, the direct vacuum connection to the ITER vessel (implying high tritium containment), strict requirements on the reliability of all units and mechanisms, and the limited space available for accommodation of the diagnostic hardware at the ITER tokamak. The paper describes the design of the diagnostic complex and the engineering solutions that make it possible to conduct measurements under tokamak reactor conditions. The proposed engineering solutions provide a common vacuum channel, safe with respect to thermal and mechanical loads, for hydrogen isotope atoms to pass to the analyzers; ensure efficient shielding of the analyzers from the ITER stray magnetic field (up to 1 kG); provide remote control of the NPA diagnostic complex, in particular connection/disconnection of the NPA vacuum beamline from the ITER vessel; meet the ITER radiation safety requirements; and ensure measurements of the fuel isotopic ratio under high levels of neutron and gamma radiation.
Hybrid estimation of complex systems.
Hofbaur, Michael W; Williams, Brian C
2004-10-01
Modern automated systems evolve both continuously and discretely, and hence require estimation techniques that go well beyond the capability of a typical Kalman filter. Multiple model (MM) estimation schemes track these system evolutions by applying a bank of filters, one for each discrete system mode. Modern systems, however, are often composed of many interconnected components that exhibit rich behaviors due to complex, system-wide interactions. Modeling these systems leads to complex stochastic hybrid models that capture the large number of operational and failure modes. This large number of modes makes a typical MM estimation approach infeasible for online estimation. This paper analyzes the shortcomings of MM estimation and then introduces an alternative hybrid estimation scheme that can efficiently estimate complex systems with a large number of modes. It utilizes search techniques from the toolkit of model-based reasoning in order to focus the estimation on the set of most likely modes, without missing symptoms that might be hidden amongst the system noise. In addition, we present a novel approach to hybrid estimation in the presence of unknown behavioral modes. This leads to an overall hybrid estimation scheme for complex systems that robustly copes with unforeseen situations in a degraded but fail-safe manner.
Structured analysis and modeling of complex systems
NASA Technical Reports Server (NTRS)
Strome, David R.; Dalrymple, Mathieu A.
1992-01-01
The Aircrew Evaluation Sustained Operations Performance (AESOP) facility at Brooks AFB, Texas, combines the realism of an operational environment with the control of a research laboratory. In recent studies we collected extensive data from the Airborne Warning and Control Systems (AWACS) Weapons Directors subjected to high and low workload Defensive Counter Air Scenarios. A critical and complex task in this environment involves committing a friendly fighter against a hostile fighter. Structured Analysis and Design techniques and computer modeling systems were applied to this task as tools for analyzing subject performance and workload. This technology is being transferred to the Man-Systems Division of NASA Johnson Space Center for application to complex mission related tasks, such as manipulating the Shuttle grappler arm.
Hybrid fuel cell/diesel generation total energy system, part 2
NASA Astrophysics Data System (ADS)
Blazek, C. F.
1982-11-01
Meeting the Goldstone Deep Space Communications Complex (GDSCC) electrical and thermal requirements with the existing system was compared with using fuel cells. Fuel cell technology selection was based on a 1985 time frame for installation. The most cost-effective fuel feedstock for fuel cell application was identified. Fuels considered included diesel oil, natural gas, methanol and coal. These fuel feedstocks were evaluated not only on the cost and efficiency of the fuel conversion process, but also on the complexity and integration of the fuel processor with system operation and thermal energy availability. After a review of fuel processor technology, catalytic steam reformer technology was selected based on the ease of integration and the economics of hydrogen production. The phosphoric acid fuel cell was selected for application at the GDSCC due to its commercial readiness for near-term application. Fuel cell systems were analyzed for both natural gas and methanol feedstocks. The subsequent economic analysis indicated that a natural gas fueled system was the most cost effective of the cases analyzed.
Hybrid fuel cell/diesel generation total energy system, part 2
NASA Technical Reports Server (NTRS)
Blazek, C. F.
1982-01-01
Meeting the Goldstone Deep Space Communications Complex (GDSCC) electrical and thermal requirements with the existing system was compared with using fuel cells. Fuel cell technology selection was based on a 1985 time frame for installation. The most cost-effective fuel feedstock for fuel cell application was identified. Fuels considered included diesel oil, natural gas, methanol and coal. These fuel feedstocks were evaluated not only on the cost and efficiency of the fuel conversion process, but also on the complexity and integration of the fuel processor with system operation and thermal energy availability. After a review of fuel processor technology, catalytic steam reformer technology was selected based on the ease of integration and the economics of hydrogen production. The phosphoric acid fuel cell was selected for application at the GDSCC due to its commercial readiness for near-term application. Fuel cell systems were analyzed for both natural gas and methanol feedstocks. The subsequent economic analysis indicated that a natural gas fueled system was the most cost effective of the cases analyzed.
ERIC Educational Resources Information Center
Lane, Jason E., Ed.; Johnstone, D. Bruce, Ed.
2013-01-01
This thought-provoking volume brings together scholars and system leaders to analyze some of the most pressing and complex issues now facing higher education systems and society. Higher Education Systems 3.0 focuses on the remaking of higher education coordination in an era of increased accountability, greater calls for productivity, and…
Vocal repertoire of the social giant otter.
Leuchtenberger, Caroline; Sousa-Lima, Renata; Duplaix, Nicole; Magnusson, William E; Mourão, Guilherme
2014-11-01
According to the "social intelligence hypothesis," species with complex social interactions have more sophisticated communication systems. Giant otters (Pteronura brasiliensis) live in groups with complex social interactions. It is likely that the vocal communication of giant otters is more sophisticated than previous studies suggest. The objectives of the current study were to describe the airborne vocal repertoire of giant otters in the Pantanal area of Brazil, to analyze call types within different behavioral contexts, and to correlate vocal complexity with level of sociability of mustelids to verify whether or not the result supports the social intelligence hypothesis. The behavior of nine giant otters groups was observed. Vocalizations recorded were acoustically and statistically analyzed to describe the species' repertoire. The repertoire was comprised by 15 sound types emitted in different behavioral contexts. The main behavioral contexts of each sound type were significantly associated with the acoustic variable ordination of different sound types. A strong correlation between vocal complexity and sociability was found for different species, suggesting that the communication systems observed in the family mustelidae support the social intelligence hypothesis.
An Asymmetry-to-Symmetry Switch in Signal Transmission by the Histidine Kinase Receptor for TMAO
Moore, Jason O.; Hendrickson, Wayne A.
2012-01-01
The osmoregulator trimethylamine-N-oxide (TMAO), commonplace in aquatic organisms, is used as the terminal electron acceptor for respiration in many bacterial species. The TMAO reductase (Tor) pathway for respiratory catalysis is controlled by a receptor system that comprises the TMAO-binding protein TorT, the sensor histidine kinase TorS and the response regulator TorR. Here we study the TorS/TorT sensor system to gain mechanistic insight into signaling by histidine kinase receptors. We determined crystal structures for complexes of TorS sensor domains with apo TorT and with TorT(TMAO); we characterized TorS sensor associations with TorT in solution; we analyzed the thermodynamics of TMAO binding to TorT-TorS complexes; and we analyzed in vivo responses to TMAO through the TorT/TorS/TorR system to test structure-inspired hypotheses. TorS-TorT(apo) is an asymmetric 2:2 complex that binds TMAO with negative cooperativity to form a symmetric active kinase. PMID:22483119
An Asymmetry-to-Symmetry Switch in Signal Transmission by the Histidine Kinase Receptor for TMAO
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, Jason O.; Hendrickson, Wayne A.
2012-06-28
The osmoregulator trimethylamine-N-oxide (TMAO), commonplace in aquatic organisms, is used as the terminal electron acceptor for respiration in many bacterial species. The TMAO reductase (Tor) pathway for respiratory catalysis is controlled by a receptor system that comprises the TMAO-binding protein TorT, the sensor histidine kinase TorS, and the response regulator TorR. Here we study the TorS/TorT sensor system to gain mechanistic insight into signaling by histidine kinase receptors. We determined crystal structures for complexes of TorS sensor domains with apo TorT and with TorT (TMAO); we characterized TorS sensor associations with TorT in solution; we analyzed the thermodynamics of TMAO binding to TorT-TorS complexes; and we analyzed in vivo responses to TMAO through the TorT/TorS/TorR system to test structure-inspired hypotheses. TorS-TorT(apo) is an asymmetric 2:2 complex that binds TMAO with negative cooperativity to form a symmetric active kinase.
An asymmetry-to-symmetry switch in signal transmission by the histidine kinase receptor for TMAO.
Moore, Jason O; Hendrickson, Wayne A
2012-04-04
The osmoregulator trimethylamine-N-oxide (TMAO), commonplace in aquatic organisms, is used as the terminal electron acceptor for respiration in many bacterial species. The TMAO reductase (Tor) pathway for respiratory catalysis is controlled by a receptor system that comprises the TMAO-binding protein TorT, the sensor histidine kinase TorS, and the response regulator TorR. Here we study the TorS/TorT sensor system to gain mechanistic insight into signaling by histidine kinase receptors. We determined crystal structures for complexes of TorS sensor domains with apo TorT and with TorT (TMAO); we characterized TorS sensor associations with TorT in solution; we analyzed the thermodynamics of TMAO binding to TorT-TorS complexes; and we analyzed in vivo responses to TMAO through the TorT/TorS/TorR system to test structure-inspired hypotheses. TorS-TorT(apo) is an asymmetric 2:2 complex that binds TMAO with negative cooperativity to form a symmetric active kinase.
Mathematical concepts for modeling human behavior in complex man-machine systems
NASA Technical Reports Server (NTRS)
Johannsen, G.; Rouse, W. B.
1979-01-01
Many human behavior (e.g., manual control) models have been found to be inadequate for describing processes in certain real complex man-machine systems. An attempt is made to find a way to overcome this problem by examining the range of applicability of existing mathematical models with respect to the hierarchy of human activities in real complex tasks. Automobile driving is chosen as a baseline scenario, and a hierarchy of human activities is derived by analyzing this task in general terms. A structural description leads to a block diagram and a time-sharing computer analogy.
Automatic acquisition of motion trajectories: tracking hockey players
NASA Astrophysics Data System (ADS)
Okuma, Kenji; Little, James J.; Lowe, David
2003-12-01
Computer systems that have the capability of analyzing complex and dynamic scenes play an essential role in video annotation. Scenes can be complex in such a way that there are many cluttered objects with different colors, shapes and sizes, and can be dynamic with multiple interacting moving objects and a constantly changing background. In reality, there are many scenes that are complex, dynamic, and challenging enough for computers to describe. These scenes include games of sports, air traffic, car traffic, street intersections, and cloud transformations. Our research is about the challenge of inventing a descriptive computer system that analyzes scenes of hockey games where multiple moving players interact with each other on a constantly moving background due to camera motions. Ultimately, such a computer system should be able to acquire reliable data by extracting the players' motion as their trajectories, query them by analyzing the descriptive information of the data, and predict the motions of some hockey players based on the result of the query. Among these three major aspects of the system, we primarily focus on the visual information of the scenes, that is, how to automatically acquire motion trajectories of hockey players from video. More precisely, we automatically analyze the hockey scenes by estimating parameters (i.e., pan, tilt, and zoom) of the broadcast cameras, tracking hockey players in those scenes, and constructing a visual description of the data by displaying trajectories of those players. Many technical problems in vision, such as fast and unpredictable player motions and rapid camera motions, make our challenge worth tackling. To the best of our knowledge, no automatic video annotation systems for hockey have been developed in the past. Although there are many obstacles to overcome, our efforts and accomplishments will hopefully establish the infrastructure of an automatic hockey annotation system and become a milestone for research in automatic video annotation in this domain.
How Unstable Are Complex Financial Systems? Analyzing an Inter-bank Network of Credit Relations
NASA Astrophysics Data System (ADS)
Sinha, Sitabhra; Thess, Maximilian; Markose, Sheri
The recent worldwide economic crisis of 2007-09 has focused attention on the need to analyze systemic risk in complex financial networks. We investigate the problem of robustness of such systems in the context of the general theory of dynamical stability in complex networks and, in particular, how the topology of connections influences the risk that the failure of a single institution triggers a cascade of successive collapses propagating through the network. We use data on bilateral liabilities (or exposures) in the derivatives market between 202 financial intermediaries based in the USA and Europe in the last quarter of 2009 to empirically investigate the network structure of the over-the-counter (OTC) derivatives market. We observe that the network exhibits both heterogeneity in node properties and the existence of communities. It also has a prominent core-periphery organization and can resist large-scale collapse when subjected to individual bank defaults (however, failure of any bank in the core may result in localized collapse of the innermost core with substantial loss of capital), but it is vulnerable to system-wide breakdown as a result of an accompanying liquidity crisis.
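A generic default-cascade stress test on a bilateral exposure matrix, in the spirit of the "single failure triggering successive collapses" question studied above, can be sketched as follows; the loss-given-default, capital buffers, and random exposures are placeholders, not the 2009 OTC derivatives data.

```python
import numpy as np

def default_cascade(exposure, capital, initially_failed, lgd=1.0):
    """Propagate defaults through a bilateral exposure network.

    exposure[i, j]  : amount bank i is owed by bank j (i's exposure to j)
    capital[i]      : loss-absorbing capital of bank i
    initially_failed: iterable of bank indices assumed to default first
    lgd             : loss given default (fraction of exposure written off)

    Returns the set of banks that end up in default.
    """
    n = len(capital)
    failed = set(initially_failed)
    losses = np.zeros(n)
    frontier = set(failed)
    while frontier:
        new_frontier = set()
        for j in frontier:
            for i in range(n):
                if i in failed:
                    continue
                losses[i] += lgd * exposure[i, j]      # write off claims on j
                if losses[i] >= capital[i]:
                    new_frontier.add(i)
        failed |= new_frontier
        frontier = new_frontier
    return failed

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    n = 20
    exposure = rng.exponential(1.0, size=(n, n)) * (rng.random((n, n)) < 0.2)
    np.fill_diagonal(exposure, 0.0)
    capital = np.full(n, 2.0)                          # placeholder capital buffers
    print("defaults triggered by bank 0:",
          sorted(default_cascade(exposure, capital, [0])))
```

Repeating the test for every possible initial default and for different network topologies is the basic numerical experiment behind topology-dependent systemic-risk statements like the one in the abstract.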
Analyzing Distributed Functions in an Integrated Hazard Analysis
NASA Technical Reports Server (NTRS)
Morris, A. Terry; Massie, Michael J.
2010-01-01
Large scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult as compared to centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large scale distributed systems, the paper will also describe how to analyze secondary functions to secondary functions through the use of channelization.
Phase synchronization based on a Dual-Tree Complex Wavelet Transform
NASA Astrophysics Data System (ADS)
Ferreira, Maria Teodora; Domingues, Margarete Oliveira; Macau, Elbert E. N.
2016-11-01
In this work, we show the applicability of our Discrete Complex Wavelet Approach (DCWA) to verify the phenomenon of the phase synchronization transition in two coupled chaotic Lorenz systems. DCWA is based on the phase assignment from complex wavelet coefficients obtained by using a Dual-Tree Complex Wavelet Transform (DT-CWT). We analyzed two coupled chaotic Lorenz systems, aiming to detect the transition from non-phase synchronization to phase synchronization. In addition, we check how well the method detects periods of 2π phase-slips. In all experiments, DCWA is compared with classical phase detection methods, such as those based on the arctangent and the Hilbert transform, showing a much better performance.
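The DT-CWT phase assignment itself is not reproduced here; the sketch below shows the classical Hilbert-transform baseline against which DCWA is compared in the abstract: extract instantaneous phases of two signals and measure the mean phase coherence of their difference. The test signals are illustrative, not the coupled Lorenz systems of the paper.

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_phase(x):
    """Instantaneous phase of a roughly narrow-band signal via the analytic
    signal obtained from the Hilbert transform."""
    return np.unwrap(np.angle(hilbert(x - np.mean(x))))

def phase_sync_index(x, y):
    """Mean phase coherence R in [0, 1]: R near 1 indicates 1:1 phase
    synchronization, R near 0 indicates no phase locking."""
    dphi = instantaneous_phase(x) - instantaneous_phase(y)
    return np.abs(np.mean(np.exp(1j * dphi)))

if __name__ == "__main__":
    t = np.linspace(0, 100, 20000)
    rng = np.random.default_rng(0)
    x = np.sin(2 * np.pi * 1.0 * t) + 0.1 * rng.normal(size=t.size)
    y_locked = np.sin(2 * np.pi * 1.0 * t + 0.7) + 0.1 * rng.normal(size=t.size)
    y_free = np.sin(2 * np.pi * 1.3 * t) + 0.1 * rng.normal(size=t.size)
    print("locked pair   R =", round(phase_sync_index(x, y_locked), 3))
    print("unlocked pair R =", round(phase_sync_index(x, y_free), 3))
```

Counting jumps of about 2π in the unwrapped phase difference is the standard way to detect the phase slips that the DCWA method is reported to resolve more reliably.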
NASA Technical Reports Server (NTRS)
Stehle, Roy H.; Ogier, Richard G.
1993-01-01
Alternatives for realizing a packet-based network switch for use on a frequency division multiple access/time division multiplexed (FDMA/TDM) geostationary communication satellite were investigated. Each of the eight downlink beams supports eight directed dwells. The design needed to accommodate multicast packets with very low probability of loss due to contention. Three switch architectures were designed and analyzed. An output-queued, shared bus system yielded a functionally simple system, utilizing a first-in, first-out (FIFO) memory per downlink dwell, but at the expense of a large total memory requirement. A shared memory architecture offered the most efficiency in memory requirements, requiring about half the memory of the shared bus design. The processing requirement for the shared-memory system adds system complexity that may offset the benefits of the smaller memory. An alternative design using a shared memory buffer per downlink beam decreases circuit complexity through a distributed design, and requires at most 1000 packets of memory more than the completely shared memory design. Modifications to the basic packet switch designs were proposed to accommodate circuit-switched traffic, which must be served on a periodic basis with minimal delay. Methods for dynamically controlling the downlink dwell lengths were developed and analyzed. These methods adapt quickly to changing traffic demands, and do not add significant complexity or cost to the satellite and ground station designs. Methods for reducing the memory requirement by not requiring the satellite to store full packets were also proposed and analyzed. In addition, optimal packet and dwell lengths were computed as functions of memory size for the three switch architectures.
Weck, P J; Schaffner, D A; Brown, M R; Wicks, R T
2015-02-01
The Bandt-Pompe permutation entropy and the Jensen-Shannon statistical complexity are used to analyze fluctuating time series of three different turbulent plasmas: the magnetohydrodynamic (MHD) turbulence in the plasma wind tunnel of the Swarthmore Spheromak Experiment (SSX), drift-wave turbulence of ion saturation current fluctuations in the edge of the Large Plasma Device (LAPD), and fully developed turbulent magnetic fluctuations of the solar wind taken from the Wind spacecraft. The entropy and complexity values are presented as coordinates on the CH plane for comparison among the different plasma environments and other fluctuation models. The solar wind is found to have the highest permutation entropy and lowest statistical complexity of the three data sets analyzed. Both laboratory data sets have larger values of statistical complexity, suggesting that these systems have fewer degrees of freedom in their fluctuations, with SSX magnetic fluctuations having slightly less complexity than the LAPD edge I(sat). The CH plane coordinates are compared to the shape and distribution of a spectral decomposition of the wave forms. These results suggest that fully developed turbulence (solar wind) occupies the lower-right region of the CH plane, and that other plasma systems considered to be turbulent have less permutation entropy and more statistical complexity. This paper presents use of this statistical analysis tool on solar wind plasma, as well as on an MHD turbulent experimental plasma.
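The two CH-plane coordinates used above can be computed from an ordinal (Bandt-Pompe) symbolization of the time series. The sketch below implements the normalized permutation entropy H and the Jensen-Shannon statistical complexity C with common defaults for the embedding dimension and delay; these defaults are assumptions, not necessarily the settings used in the paper.

```python
import numpy as np
from itertools import permutations
from math import log, factorial

def ordinal_distribution(x, d=5, tau=1):
    """Bandt-Pompe distribution of ordinal patterns of dimension d, delay tau."""
    x = np.asarray(x, dtype=float)
    patterns = {p: 0 for p in permutations(range(d))}
    n = len(x) - (d - 1) * tau
    for i in range(n):
        window = x[i:i + d * tau:tau]
        patterns[tuple(np.argsort(window))] += 1
    p = np.array(list(patterns.values()), dtype=float)
    return p / p.sum()

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def entropy_complexity(x, d=5, tau=1):
    """CH-plane coordinates: normalized permutation entropy H and
    Jensen-Shannon statistical complexity C."""
    p = ordinal_distribution(x, d, tau)
    n_states = factorial(d)
    u = np.full(n_states, 1.0 / n_states)          # uniform reference distribution
    h = shannon(p) / log(n_states)
    # Jensen-Shannon divergence between p and the uniform distribution
    js = shannon((p + u) / 2) - shannon(p) / 2 - shannon(u) / 2
    # normalization constant: the maximum possible divergence (p concentrated)
    q0 = -2 * (((n_states + 1) / n_states) * log(n_states + 1)
               - 2 * log(2 * n_states) + log(n_states)) ** (-1)
    return h, q0 * js * h

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    noise = rng.normal(size=20000)                 # white noise: high H, low C
    print(tuple(round(v, 3) for v in entropy_complexity(noise)))
```

Plotting (H, C) pairs for different data sets reproduces the kind of CH-plane comparison used above to separate fully developed turbulence from less stochastic laboratory fluctuations.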
Transparent Information Systems through Gateways, Front Ends, Intermediaries, and Interfaces.
ERIC Educational Resources Information Center
Williams, Martha E.
1986-01-01
Provides overview of design requirements for transparent information retrieval (implies that user sees through complexity of retrieval activities sequence). Highlights include need for transparent systems; history of transparent retrieval research; information retrieval functions (automated converters, routers, selectors, evaluators/analyzers);…
Reliability-Based Model to Analyze the Performance and Cost of a Transit Fare Collection System.
DOT National Transportation Integrated Search
1985-06-01
The collection of transit system fares has become more sophisticated in recent years, with more flexible structures requiring more sophisticated fare collection equipment to process tickets and admit passengers. However, this new and complex equipmen...
[Review of the active locomotion system for capsule endoscope].
Zhao, Dechun; Guo, Yijun; Peng, Chenglin
2010-02-01
This review summarized the progress of research on active locomotion systems for the capsule endoscope, analyzed the moving and controlling principles of different locomotion systems, and compared their merits and shortcomings. Owing to the complexity of the human intestine and the limits imposed by the capsule endoscope on the size and power consumption of the locomotion system, no active locomotion system is currently used in clinical practice. A locomotion system driven by an external rotating magnetic field could improve the commercial capsule endoscope, although controlling its motion through the magnetic field is complex. Active locomotion systems driven by shape memory alloys will be a development direction and a focus of research in the future.
Application of the GERTS II simulator in the industrial environment.
NASA Technical Reports Server (NTRS)
Whitehouse, G. E.; Klein, K. I.
1971-01-01
GERT was originally developed to aid in the analysis of stochastic networks. GERT can be used to graphically model and analyze complex systems. Recently a simulator model, GERTS II, has been developed to solve GERT Networks. The simulator language used in the development of this model was GASP II A. This paper discusses the possible application of GERTS II to model and analyze (1) assembly line operations, (2) project management networks, (3) conveyor systems and (4) inventory systems. Finally, an actual application dealing with a job shop loading problem is presented.
NASA Astrophysics Data System (ADS)
Nawani, Jigna; Rixius, Julia; Neuhaus, Birgit J.
2016-08-01
Empirical analysis of secondary biology classrooms revealed that, on average, 68% of teaching time in Germany revolved around processing tasks. Quality of instruction can thus be assessed by analyzing the quality of tasks used in classroom discourse. This quasi-experimental study analyzed how teachers used tasks in 38 videotaped biology lessons on the topic 'blood and circulatory system'. Two fundamental characteristics used to analyze tasks are: (1) the required cognitive level of processing (e.g. low-level information processing: repetition, summary, define, classify; and high-level information processing: interpret or analyze data, formulate hypotheses, etc.) and (2) the complexity of task content (e.g. whether tasks require use of factual, linking or concept-level content). Additionally, students' cognitive knowledge structure about the topic 'blood and circulatory system' was measured using student-drawn concept maps (N = 970 students). Finally, linear multilevel models were created with high-level cognitive processing tasks and higher content complexity tasks as class-level predictors and students' prior knowledge, students' interest in biology, and students' interest in biology activities as control covariates. Results showed a positive influence of high-level cognitive processing tasks (β = 0.07; p < .01) on students' cognitive knowledge structure. However, there was no observed effect of higher content complexity tasks on students' cognitive knowledge structure. The presented findings encourage the use of high-level cognitive processing tasks in biology instruction.
NASA Technical Reports Server (NTRS)
Niebur, Dagmar
1995-01-01
Electric power systems represent complex systems involving many electrical components whose operation has to be planned, analyzed, monitored and controlled. The time-scale of tasks in electric power systems extends from long-term planning years ahead to milliseconds in the area of control. The behavior of power systems is highly non-linear. Monitoring and control involves several hundred variables which are only partly available by measurements.
Results of medical studies during long-term manned flights on the orbital Salyut-6 and Soyuz complex
NASA Technical Reports Server (NTRS)
Yegorov, A. D. (Compiler)
1979-01-01
Results of tests made on the crews of the Salyut-6 and Soyuz complex are presented. The basic results of studies made before, during and after 96-day and 140-day flights are presented in 5 sections: characteristics of flight conditions in the orbital complex; the cardiovascular system; the motor sphere and vestibular analyzer; biochemical, hematologic and immunologic studies; and recovery measures in the readaptation period.
Animal models and conserved processes
2012-01-01
Background The concept of conserved processes presents unique opportunities for using nonhuman animal models in biomedical research. However, the concept must be examined in the context that humans and nonhuman animals are evolved, complex, adaptive systems. Given that nonhuman animals are examples of living systems that are differently complex from humans, what does the existence of a conserved gene or process imply for inter-species extrapolation? Methods We surveyed the literature including philosophy of science, biological complexity, conserved processes, evolutionary biology, comparative medicine, anti-neoplastic agents, inhalational anesthetics, and drug development journals in order to determine the value of nonhuman animal models when studying conserved processes. Results Evolution through natural selection has employed components and processes both to produce the same outcomes among species but also to generate different functions and traits. Many genes and processes are conserved, but new combinations of these processes or different regulation of the genes involved in these processes have resulted in unique organisms. Further, there is a hierarchy of organization in complex living systems. At some levels, the components are simple systems that can be analyzed by mathematics or the physical sciences, while at other levels the system cannot be fully analyzed by reducing it to a physical system. The study of complex living systems must alternate between focusing on the parts and examining the intact whole organism while taking into account the connections between the two. Systems biology aims for this holism. We examined the actions of inhalational anesthetic agents and anti-neoplastic agents in order to address what the characteristics of complex living systems imply for inter-species extrapolation of traits and responses related to conserved processes. Conclusion We conclude that even the presence of conserved processes is insufficient for inter-species extrapolation when the trait or response being studied is located at higher levels of organization, is in a different module, or is influenced by other modules. However, when the examination of the conserved process occurs at the same level of organization or in the same module, and hence is subject to study solely by reductionism, then extrapolation is possible. PMID:22963674
NASA Technical Reports Server (NTRS)
Alexandrov, Natalia (Technical Monitor); Kuby, Michael; Tierney, Sean; Roberts, Tyler; Upchurch, Christopher
2005-01-01
This report reviews six classes of models that are used for studying transportation network topologies. The report is motivated by two main questions. First, what can the "new science" of complex networks (scale-free, small-world networks) contribute to our understanding of transport network structure, compared to more traditional methods? Second, how can geographic information systems (GIS) contribute to studying transport networks? The report defines terms that can be used to classify different kinds of models by their function, composition, mechanism, spatial and temporal dimensions, certainty, linearity, and resolution. Six broad classes of models for analyzing transport network topologies are then explored: GIS; static graph theory; complex networks; mathematical programming; simulation; and agent-based modeling. Each class of models is defined and classified according to the attributes introduced earlier. The paper identifies some typical types of research questions about network structure that have been addressed by each class of model in the literature.
The State of the Field: Qualitative Analyses of Text Complexity
ERIC Educational Resources Information Center
Pearson, P. David; Hiebert, Elfrieda H.
2014-01-01
The purpose of this article is to understand the function, logic, and impact of qualitative systems for analyzing text complexity, focusing on their benefits and imperfections. We identified two primary functions for their use: (a) to match texts to reader ability so that readers read books that are within their grasp, and (b) to unearth, and then…
Single Event Testing on Complex Devices: Test Like You Fly versus Test-Specific Design Structures
NASA Technical Reports Server (NTRS)
Berg, Melanie; LaBel, Kenneth A.
2014-01-01
We present a framework for evaluating complex digital systems targeted for harsh radiation environments such as space. The focus is limited to analyzing the single event upset (SEU) susceptibility of designs implemented inside Field Programmable Gate Array (FPGA) devices. Tradeoffs are presented between application-specific and test-specific test structures.
NASA Astrophysics Data System (ADS)
Ma, Junhai; Ren, Wenbo; Zhan, Xueli
2017-04-01
Building on previous studies at home and abroad, this paper improves the three-dimensional IS-LM model in macroeconomics, analyzes the equilibrium point of the system and its stability conditions, and focuses on the parameters and complex dynamic characteristics when a Hopf bifurcation occurs in the three-dimensional IS-LM macroeconomic system. In order to analyze the stability of the limit cycles that appear at the Hopf bifurcation, this paper further introduces the first Lyapunov coefficient to judge the limit cycles, i.e., the business cycle viewed from a practical standpoint. Numerical simulation results show that within most of the parameter range the limit cycle of the 3D IS-LM macroeconomic system is stable, that is, the business cycle is stable; as the parameters increase, the limit cycle becomes unstable, and the parameter range for which this occurs is small. The results of this paper provide useful guidance for the analysis of macroeconomic systems.
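The abstract does not reproduce the model equations, so the sketch below only shows the generic numerical test behind this type of analysis: follow the equilibrium as a parameter varies and watch the real part of the complex-conjugate eigenvalue pair of the Jacobian, which changes sign at a Hopf bifurcation. The Lorenz system is used purely as a stand-in; the placeholder function should be replaced with the paper's IS-LM equations.

```python
import numpy as np
from scipy.optimize import fsolve

def numerical_jacobian(f, x, eps=1e-6):
    """Finite-difference Jacobian of f at x."""
    x = np.asarray(x, dtype=float)
    J = np.zeros((len(x), len(x)))
    fx = np.asarray(f(x))
    for j in range(len(x)):
        dx = np.zeros(len(x))
        dx[j] = eps
        J[:, j] = (np.asarray(f(x + dx)) - fx) / eps
    return J

def hopf_scan(f, x_guess, params):
    """For each parameter value, find the equilibrium and report the largest
    real part among the complex eigenvalues of the Jacobian; a sign change
    between consecutive parameter values locates a candidate Hopf bifurcation."""
    results = []
    for a in params:
        g = lambda x: f(x, a)
        x_eq = fsolve(g, x_guess)
        eig = np.linalg.eigvals(numerical_jacobian(g, x_eq))
        complex_pair = eig[np.abs(eig.imag) > 1e-9]
        results.append((a, complex_pair.real.max() if complex_pair.size else np.nan))
        x_guess = x_eq                          # continue from the last equilibrium
    return results

# Stand-in system (NOT the paper's IS-LM model): the Lorenz equations, whose
# non-trivial equilibria lose stability in a Hopf bifurcation near rho ~ 24.74
# for the classical sigma and beta.
def lorenz(x, rho, sigma=10.0, beta=8.0 / 3.0):
    u, v, w = x
    return np.array([sigma * (v - u), u * (rho - w) - v, u * v - beta * w])

if __name__ == "__main__":
    for a, re_max in hopf_scan(lorenz, [7.0, 7.0, 19.0], np.linspace(20.0, 28.0, 9)):
        print(f"rho = {a:5.1f}   max Re(lambda) of complex pair = {re_max:+.3f}")
```

For the classical Lorenz parameters the sign change appears between rho = 24 and rho = 25, consistent with the known Hopf point near rho of about 24.74; applied to the IS-LM equations, the same scan identifies the parameter values at which the business cycle emerges.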
TDRSS telecommunications system, PN code analysis
NASA Technical Reports Server (NTRS)
Dixon, R.; Gold, R.; Kaiser, F.
1976-01-01
The pseudo noise (PN) codes required to support the TDRSS telecommunications services are analyzed and the impact of alternate coding techniques on the user transponder equipment, the TDRSS equipment, and all factors that contribute to the acquisition and performance of these telecommunication services is assessed. Possible alternatives to the currently proposed hybrid FH/direct sequence acquisition procedures are considered and compared relative to acquisition time, implementation complexity, operational reliability, and cost. The hybrid FH/direct sequence technique is analyzed and rejected in favor of a recommended approach which minimizes acquisition time and user transponder complexity while maximizing probability of acquisition and overall link reliability.
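As background for the direct-sequence codes discussed above, the sketch below generates maximal-length (m-) sequences with a simple linear-feedback shift register and combines two of them into a Gold-type code. The register lengths and tap sets are textbook examples, not the code parameters recommended in the report.

```python
def lfsr_sequence(taps, length, state=None):
    """Generate a binary sequence from a Fibonacci LFSR.

    taps  : feedback tap positions, e.g. (5, 3) for x^5 + x^3 + 1
    length: number of output chips
    state : initial register contents (any non-zero list of 0/1 bits)
    """
    degree = max(taps)
    reg = list(state) if state is not None else [1] * degree
    out = []
    for _ in range(length):
        out.append(reg[-1])                    # output the last stage
        feedback = 0
        for t in taps:
            feedback ^= reg[t - 1]
        reg = [feedback] + reg[:-1]            # shift, insert feedback bit
    return out

def gold_code(length=31):
    """Combine two maximal-length sequences by XOR to obtain one Gold-type code.

    Whether this particular tap-set pair is a 'preferred pair' in the formal
    sense is not asserted here; it is used only to illustrate the construction.
    """
    a = lfsr_sequence((5, 3), length)
    b = lfsr_sequence((5, 4, 3, 2), length)
    return [x ^ y for x, y in zip(a, b)]

if __name__ == "__main__":
    m_seq = lfsr_sequence((5, 3), 31)
    print("m-sequence :", "".join(map(str, m_seq)))
    print("balance    :", sum(m_seq), "ones out of", len(m_seq))
    print("gold code  :", "".join(map(str, gold_code())))
```

XOR-ing different relative shifts of the two component sequences yields the other members of the code family, which is what makes such constructions attractive when many users need distinct codes with controlled cross-correlation.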
Koorehdavoudi, Hana; Bogdan, Paul
2016-01-01
Biological systems are frequently categorized as complex systems due to their capabilities of generating spatio-temporal structures from apparently random decisions. In spite of research on analyzing biological systems, we lack a quantifiable framework for measuring their complexity. To fill this gap, in this paper we develop a new paradigm to study a collective group of N agents moving and interacting in a three-dimensional space. Our paradigm helps to identify the spatio-temporal states of the motion of the group and their associated transition probabilities. This framework enables the estimation of the free energy landscape corresponding to the identified states. Based on the energy landscape, we quantify missing information, emergence, self-organization and complexity for a collective motion. We show that the collective motion of the group of agents evolves to reach the most probable state, with the relatively lowest energy level and lowest missing information compared to other possible states. Our analysis demonstrates that a natural group of animals exhibits a higher degree of emergence, self-organization and complexity over time. Consequently, this algorithm can be integrated into new frameworks to engineer collective motions that achieve certain degrees of emergence, self-organization and complexity. PMID:27297496
NASA Astrophysics Data System (ADS)
Koorehdavoudi, Hana; Bogdan, Paul
2016-06-01
Biological systems are frequently categorized as complex systems due to their capabilities of generating spatio-temporal structures from apparently random decisions. In spite of research on analyzing biological systems, we lack a quantifiable framework for measuring their complexity. To fill this gap, in this paper we develop a new paradigm to study a collective group of N agents moving and interacting in a three-dimensional space. Our paradigm helps to identify the spatio-temporal states of the motion of the group and their associated transition probabilities. This framework enables the estimation of the free energy landscape corresponding to the identified states. Based on the energy landscape, we quantify missing information, emergence, self-organization and complexity for a collective motion. We show that the collective motion of the group of agents evolves to reach the most probable state, with the relatively lowest energy level and lowest missing information compared to other possible states. Our analysis demonstrates that a natural group of animals exhibits a higher degree of emergence, self-organization and complexity over time. Consequently, this algorithm can be integrated into new frameworks to engineer collective motions that achieve certain degrees of emergence, self-organization and complexity.
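The bookkeeping described above, estimating state probabilities, transition probabilities and a pseudo free-energy landscape from a sequence of discretized group states, can be sketched as follows; the discretization itself and the paper's specific emergence and self-organization measures are not reproduced, and the entropy-based proxy shown is an assumption for illustration only.

```python
import numpy as np
from collections import Counter

def state_statistics(states):
    """Occupation probabilities, transition matrix and pseudo free energies
    for a sequence of discrete state labels."""
    labels = sorted(set(states))
    index = {s: i for i, s in enumerate(labels)}
    counts = Counter(states)
    p = np.array([counts[s] for s in labels], dtype=float)
    p /= p.sum()
    T = np.zeros((len(labels), len(labels)))
    for a, b in zip(states[:-1], states[1:]):
        T[index[a], index[b]] += 1
    T /= np.maximum(T.sum(axis=1, keepdims=True), 1)
    free_energy = -np.log(p)                   # landscape up to an additive constant
    return labels, p, T, free_energy

def normalized_entropy(p):
    """Shannon entropy of the state occupation, normalized to [0, 1];
    1 - H is one simple proxy for self-organization."""
    n = len(p)
    q = p[p > 0]
    return float(-np.sum(q * np.log(q)) / np.log(n)) if n > 1 else 0.0

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # toy state sequence: the group mostly stays in state 'A' (ordered motion)
    seq = rng.choice(list("AABC"), size=5000).tolist()
    labels, p, T, F = state_statistics(seq)
    print("states:", labels, "p:", np.round(p, 3), "F:", np.round(F, 3))
    print("normalized entropy:", round(normalized_entropy(p), 3))
```

The lowest-free-energy state in this bookkeeping is simply the most frequently occupied state, which is the sense in which the abstract speaks of the group evolving toward the most probable, lowest-energy state.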
NASA Applications for Computational Electromagnetic Analysis
NASA Technical Reports Server (NTRS)
Lewis, Catherine C.; Trout, Dawn H.; Krome, Mark E.; Perry, Thomas A.
2011-01-01
Computational Electromagnetic Software is used by NASA to analyze the compatibility of systems too large or too complex for testing. Recent advances in software packages and computer capabilities have made it possible to determine the effects of a transmitter inside a launch vehicle fairing, better analyze environmental threats, and perform on-orbit replacements with assured electromagnetic compatibility.
Statistically Validated Networks in Bipartite Complex Systems
Tumminello, Michele; Miccichè, Salvatore; Lillo, Fabrizio; Piilo, Jyrki; Mantegna, Rosario N.
2011-01-01
Many complex systems present an intrinsic bipartite structure where elements of one set link to elements of the second set. In these complex systems, such as the system of actors and movies, elements of one set are qualitatively different from elements of the other set. The properties of these complex systems are typically investigated by constructing and analyzing a projected network on one of the two sets (for example the actor network or the movie network). Complex systems are often very heterogeneous in the number of relationships that the elements of one set establish with the elements of the other set, and this heterogeneity makes it very difficult to discriminate links of the projected network that merely reflect the system's heterogeneity from links that are relevant for unveiling the properties of the system. Here we introduce an unsupervised method to statistically validate each link of a projected network against a null hypothesis that takes into account system heterogeneity. We apply the method to a biological, an economic and a social complex system. The method we propose is able to detect network structures which are very informative about the organization and specialization of the investigated systems, and identifies those relationships between elements of the projected network that cannot be explained simply by system heterogeneity. We also show that our method applies to bipartite systems in which different relationships might have a different qualitative nature, generating statistically validated networks in which such differences are preserved. PMID:21483858
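A minimal sketch of the link-validation idea for a single pair of nodes, under the simplifying assumption of a hypergeometric null model (the full method also handles heterogeneous degrees across all pairs and a principled multiple-hypothesis correction over every tested link); all numbers below are invented.

from scipy.stats import hypergeom

# Made-up inputs: two actors appear in n_a and n_b of N movies and co-star in n_ab of them.
# Under the null hypothesis that co-occurrences arise only from the two degrees,
# n_ab follows a hypergeometric distribution; a small p-value validates the link.
N, n_a, n_b, n_ab = 1000, 40, 60, 20

p_value = hypergeom.sf(n_ab - 1, N, n_a, n_b)    # P(X >= n_ab)
threshold = 0.01 / (N * (N - 1) / 2)             # crude Bonferroni over all possible pairs

print(f"p-value = {p_value:.3g}, validated = {p_value < threshold}")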
From embodied mind to embodied robotics: humanities and system theoretical aspects.
Mainzer, Klaus
2009-01-01
After an introduction (1) the article analyzes the evolution of the embodied mind (2), the innovation of embodied robotics (3), and finally discusses conclusions of embodied robotics for human responsibility (4). Considering the evolution of the embodied mind (2), we start with an introduction of complex systems and nonlinear dynamics (2.1), apply this approach to neural self-organization (2.2), distinguish degrees of complexity of the brain (2.3), explain the emergence of cognitive states by complex systems dynamics (2.4), and discuss criteria for modeling the brain as complex nonlinear system (2.5). The innovation of embodied robotics (3) is a challenge of future technology. We start with the distinction of symbolic and embodied AI (3.1) and explain embodied robots as dynamical systems (3.2). Self-organization needs self-control of technical systems (3.3). Cellular neural networks (CNN) are an example of self-organizing technical systems offering new avenues for neurobionics (3.4). In general, technical neural networks support different kinds of learning robots (3.5). Finally, embodied robotics aim at the development of cognitive and conscious robots (3.6).
Developing an Approach for Analyzing and Verifying System Communication
NASA Technical Reports Server (NTRS)
Stratton, William C.; Lindvall, Mikael; Ackermann, Chris; Sibol, Deane E.; Godfrey, Sally
2009-01-01
This slide presentation reviews a project for developing an approach for analyzing and verifying inter-system communications. The motivation for the study was that software systems in the aerospace domain are inherently complex and operate under tight resource constraints, so systems of systems must communicate with each other to fulfill their tasks. These systems of systems require reliable communication. The technical approach was to develop a system, DynSAVE, that detects communication problems among the systems. The project enhanced the proven Software Architecture Visualization and Evaluation (SAVE) tool to create Dynamic SAVE (DynSAVE). The approach monitors and records low-level network traffic, converts it into meaningful messages, and displays those messages in a way that allows issues to be detected.
Complexity in electronic negotiation support systems.
Griessmair, Michele; Strunk, Guido; Vetschera, Rudolf; Koeszegi, Sabine T
2011-10-01
It is generally acknowledged that the medium influences the way we communicate, and negotiation research directs considerable attention to the impact of different electronic communication modes on the negotiation process and outcomes. Complexity theories offer models and methods that allow the investigation of how patterns and temporal sequences unfold over time in negotiation interactions. By focusing on the dynamic and interactive quality of negotiations as well as the information, choice, and uncertainty contained in the negotiation process, the complexity perspective addresses several issues of central interest in classical negotiation research. In the present study we compare the complexity of the negotiation communication process among synchronous and asynchronous negotiations (IM vs. e-mail) as well as an electronic negotiation support system that includes a decision support system (DSS). For this purpose, transcripts of 145 negotiations were coded and analyzed using Shannon entropy and grammar complexity. Our results show that negotiating asynchronously via e-mail, as well as including a DSS, significantly reduces the complexity of the negotiation process. Furthermore, a reduction in complexity increases the probability of reaching an agreement.
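As a toy illustration of the entropy side of such an analysis (the coding scheme and symbols below are invented, not the study's actual category system), Shannon entropy can be computed over a coded transcript, with lower values indicating a more predictable, less complex communication process.

from collections import Counter
from math import log2

def shannon_entropy(sequence):
    """Shannon entropy (in bits) of the symbol distribution of a coded transcript."""
    counts = Counter(sequence)
    total = len(sequence)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Hypothetical categories: offer (O), argument (A), question (Q), relational talk (R).
email_like = list("OOOAAOOOAAOOAAOO")      # toy asynchronous transcript
im_like    = list("OAQRAOQARQOARQAO")      # toy synchronous transcript
print(shannon_entropy(email_like), shannon_entropy(im_like))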
Analysis of Power System Low Frequency Oscillation Based on Energy Shift Theory
NASA Astrophysics Data System (ADS)
Zhang, Junfeng; Zhang, Chunwang; Ma, Daqing
2018-01-01
In this paper, a new method based on energy coefficients for analyzing low-frequency oscillation between areas is proposed. The concept of the energy coefficient is introduced by constructing an energy function, and low-frequency oscillation is analyzed according to the energy coefficient under the current operating conditions; meanwhile, the concept of model energy is proposed to analyze the energy exchange behavior between two generators. Not only does this method provide an explanation of low-frequency oscillation from the energy point of view, but it also helps further reveal the dynamic behavior of complex power systems. Case studies of a four-machine two-area system and the Jilin Power Grid demonstrate the correctness and effectiveness of the proposed method for low-frequency oscillation analysis of power systems.
NASA Technical Reports Server (NTRS)
Simons, S. N.; Maag, W. L.
1978-01-01
The electrical and thermal energy utilization efficiencies of a 500 unit apartment complex are analyzed and compared for each of three energy supply systems. Two on-site integrated energy systems, one powered by diesel engines and the other by phosphoric-acid fuel cells were compared with a conventional system which uses purchased electricity and on-site boilers for heating. All fuels consumed on-site are clean, synthetic fuels (distillate fuel oil or pipeline quality gas) derived from coal. Purchased electricity was generated from coal at a central station utility. The relative energy consumption and economics of the three systems are analyzed and compared.
Analyzing Software Requirements Errors in Safety-Critical, Embedded Systems
NASA Technical Reports Server (NTRS)
Lutz, Robyn R.
1993-01-01
This paper analyzes the root causes of safety-related software errors in safety-critical, embedded systems. The results show that software errors identified as potentially hazardous to the system tend to be produced by different error mechanisms than non-safety-related software errors. Safety-related software errors are shown to arise most commonly from (1) discrepancies between the documented requirements specifications and the requirements needed for correct functioning of the system and (2) misunderstandings of the software's interface with the rest of the system. The paper uses these results to identify methods by which requirements errors can be prevented. The goal is to reduce safety-related software errors and to enhance the safety of complex, embedded systems.
NASA Astrophysics Data System (ADS)
Lucas, Iris; Cotsaftis, Michel; Bertelle, Cyrille
2017-12-01
Multiagent systems (MAS) provide a useful tool for exploring the complex dynamics and behavior of financial markets, and the MAS approach is now widely implemented and documented in the empirical literature. This paper introduces the implementation of an innovative multi-scale mathematical model for a computational agent-based financial market. The paper develops a method to quantify the degree of self-organization that emerges in the system and shows that the capacity for self-organization is maximized when the agent behaviors are heterogeneous. Numerical results are presented and analyzed, showing how the global market behavior emerges from specific individual behavior interactions.
The Complex Economic System of Supply Chain Financing
NASA Astrophysics Data System (ADS)
Zhang, Lili; Yan, Guangle
Supply Chain Financing (SCF) refers to a series of innovative and complicated financial services based on the supply chain. The SCF set-up is a complex system in which supply chain management and financing services for Small and Medium Enterprises (SMEs) interpenetrate systematically. This paper establishes the organization structure of the SCF system and presents two financing models, with and without the participation of a third-party logistics provider (3PL). Using Information Economics and Game Theory, the interrelationships among the diverse economic sectors are analyzed, and the economic mechanisms underlying the development and existence of the SCF system are demonstrated. New thoughts and approaches to solving the SME financing problem are given.
Complexity analysis of the Next Gen Air Traffic Management System: trajectory based operations.
Lyons, Rhonda
2012-01-01
According to Federal Aviation Administration traffic predictions, our Air Traffic Management (ATM) system is currently operating at 150 percent of capacity, and traffic is forecast to increase to a staggering 250 percent within the next two decades [17]. This will require a major redesign of our system. Today's ATM system is complex. It is designed to provide air traffic services safely, economically, and efficiently through the cost-effective provision of facilities and seamless services in collaboration with multiple agents; however, contrary to this vision, the system is loosely integrated and suffers tremendously from antiquated equipment and saturated airways. The new Next Generation (Next Gen) ATM system is designed to transform the current system into an agile, robust and responsive set of operations designed to safely manage the growing needs of a projected, increasingly complex and diverse set of air transportation system users and massive projected worldwide traffic rates. This new, revolutionary, technology-centric system is dynamically complex and much more sophisticated than its soon-to-be predecessor. ATM system failures could yield large-scale catastrophic consequences, as it is a safety-critical system. This work will attempt to describe complexity and the complex nature of the NextGen ATM system and Trajectory Based Operations (TBO). Complex human factors interactions within Next Gen will be analyzed using a proposed dual experimental approach designed to identify hazards and gaps and to elicit emergent hazards that would not be visible if the analyses were conducted in isolation. Suggestions will be made, along with a proposal for future human factors research in the TBO safety-critical Next Gen environment.
NASA Astrophysics Data System (ADS)
Zhang, Lin; Lu, Jian; Zhou, Jialin; Zhu, Jinqing; Li, Yunxuan; Wan, Qian
2018-03-01
Didi Dache is the most popular taxi-ordering mobile app in China, providing online taxi-hailing service. The large database obtained from this app can be used to analyze the day-to-day dynamic evolution of the complexity of the Didi taxi trip network (DTTN) at the level of complex network dynamics. First, this paper proposes data cleaning and modeling methods for expressing Nanjing's DTTN as a complex network. Second, data from three consecutive weeks are cleaned to establish 21 DTTNs based on the proposed big-data processing technology. Then, multiple topology measures that characterize the day-to-day dynamic evolution of these networks' complexity are provided. Third, these measures of the 21 DTTNs are calculated and explained in terms of their practical implications. They are used as a training set for a BP neural network designed to predict the evolution of DTTN complexity. Finally, the reliability of the designed BP neural network is verified by comparing its predictions with the actual data and with the results obtained from the ARIMA method. Because network complexity is the basis for modeling cascading failures and conducting link prediction in complex systems, the proposed research framework not only provides a novel perspective for analyzing the DTTN at the level of aggregated system behavior, but can also be used to improve DTTN management.
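The prediction step can be sketched, under assumed data, as a small back-propagation network (here scikit-learn's MLPRegressor) that forecasts the next value of one topology measure from a sliding window of previous days; the synthetic weekly-patterned series below stands in for the daily DTTN measurements.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
series = 10 + np.sin(np.arange(60) * 2 * np.pi / 7) + 0.1 * rng.standard_normal(60)

window = 7
X = np.array([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
model.fit(X[:-7], y[:-7])                        # hold out the last week for testing
print("test MAE:", np.abs(model.predict(X[-7:]) - y[-7:]).mean())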
NASA Astrophysics Data System (ADS)
Haghnevis, Moeed
The main objective of this research is to develop an integrated method to study emergent behavior and the consequences of evolution and adaptation in engineered complex adaptive systems (ECASs). A multi-layer conceptual framework and modeling approach including behavioral and structural aspects is provided to describe the structure of a class of engineered complex systems and predict their future adaptive patterns. The approach allows the examination of complexity in the structure and the behavior of components as a result of their connections and in relation to their environment. This research describes and uses the major differences between natural complex adaptive systems (CASs) and artificial/engineered CASs to build a framework and platform for ECASs. While this framework focuses on the critical factors of an engineered system, it also enables one to synthetically employ engineering and mathematical models to analyze and measure complexity in such systems. In this way, concepts of complex systems science are adapted to management science and system-of-systems engineering. In particular, an integrated consumer-based optimization and agent-based modeling (ABM) platform is presented that enables managers to predict and partially control patterns of behavior in ECASs. Demonstrated on the U.S. electricity markets, ABM is integrated with normative and subjective decision behavior recommended by the U.S. Department of Energy (DOE) and Federal Energy Regulatory Commission (FERC). The approach integrates social networks, social science, complexity theory, and diffusion theory. Furthermore, it makes a unique and significant contribution by exploring and representing concrete managerial insights for ECASs and offering new optimized actions and modeling paradigms in agent-based simulation.
Singh, Juswinder; Deng, Zhan; Narale, Gaurav; Chuaqui, Claudio
2006-01-01
The combination of advances in structure-based drug design efforts in the pharmaceutical industry in parallel with structural genomics initiatives in the public domain has led to an explosion in the number of structures of protein-small molecule complexes. This information is critically important both to understanding the structural basis for molecular recognition in biological systems and to the design of better drugs. A significant challenge exists in managing this vast amount of data and fully leveraging it. Here, we review our work to develop a simple, fast way to store, organize, mine, and analyze large numbers of protein-small molecule complexes. We illustrate the utility of the approach to the management of inhibitor complexes from the protein kinase family. Finally, we describe our recent efforts in applying this method to the design of target-focused chemical libraries.
NASA Astrophysics Data System (ADS)
Li, Shu-Bin; Cao, Dan-Ni; Dang, Wen-Xiu; Zhang, Lin
As a new cross-disciplinary field, complexity science has penetrated every field of the economy and society. With the arrival of big data, research in complexity science has reached a new peak. In recent years, it has offered a new perspective on traffic control through the use of complex network theory. The interaction of various kinds of information in the traffic system forms a huge complex system. A new mesoscopic traffic flow model is improved with variable speed limits (VSL), and a simulation process based on complex network theory combined with the proposed model is designed. This paper studies the effect of VSL on dynamic traffic flow and then analyzes the optimal VSL control strategy in different network topologies. The conclusions of this research can help traffic management departments put forward reasonable transportation plans and develop effective traffic management and control measures.
Simulations of Instabilities in Complex Valve and Feed Systems
NASA Technical Reports Server (NTRS)
Ahuja, Vineet; Hosangadi, Ashvin; Shipman, Jeremy; Cavallo, Peter A.
2006-01-01
CFD analyses are playing an increasingly important role in identifying and characterizing flow-induced instabilities in rocket engine test facilities and flight systems. In this paper, we analyze instability mechanisms that range from turbulent pressure fluctuations due to vortex shedding in structurally complex valve systems, to flow resonance in plug cavities, to large-scale pressure fluctuations due to the collapse of cavitation-induced vapor clouds. Furthermore, we discuss simulations of transient behavior related to valve motion that can serve as guidelines for valve scheduling. Such predictions of valve response to varying flow conditions are of crucial importance to engine operation and testing.
Systems Genetics as a Tool to Identify Master Genetic Regulators in Complex Disease.
Moreno-Moral, Aida; Pesce, Francesco; Behmoaras, Jacques; Petretto, Enrico
2017-01-01
Systems genetics stems from systems biology and similarly employs integrative modeling approaches to describe the perturbations and phenotypic effects observed in a complex system. However, in the case of systems genetics the main source of perturbation is naturally occurring genetic variation, which can be analyzed at the systems-level to explain the observed variation in phenotypic traits. In contrast with conventional single-variant association approaches, the success of systems genetics has been in the identification of gene networks and molecular pathways that underlie complex disease. In addition, systems genetics has proven useful in the discovery of master trans-acting genetic regulators of functional networks and pathways, which in many cases revealed unexpected gene targets for disease. Here we detail the central components of a fully integrated systems genetics approach to complex disease, starting from assessment of genetic and gene expression variation, linking DNA sequence variation to mRNA (expression QTL mapping), gene regulatory network analysis and mapping the genetic control of regulatory networks. By summarizing a few illustrative (and successful) examples, we highlight how different data-modeling strategies can be effectively integrated in a systems genetics study.
Generation of two-dimensional binary mixtures in complex plasmas
NASA Astrophysics Data System (ADS)
Wieben, Frank; Block, Dietmar
2016-10-01
Complex plasmas are an excellent model system for strong coupling phenomena. Under certain conditions the dust particles immersed in the plasma form crystals which can be analyzed in terms of structure and dynamics. Previous experiments focussed mostly on monodisperse particle systems, whereas dusty plasmas in nature and technology are polydisperse. Thus, a first and important step towards experiments in polydisperse systems is binary mixtures. Recent experiments on binary mixtures under microgravity conditions observed a phase separation of particle species with different radii even for small size disparities. This contradicts several numerical studies of 2D binary mixtures. Therefore, dedicated experiments are required to gain more insight into the physics of polydisperse systems. In this contribution, first ground-based experiments on two-dimensional binary mixtures are presented. Particular attention is paid to the requirements for the generation of such systems, which involve consideration of the temporal evolution of the particle properties. Furthermore, the structure of these two-component crystals is analyzed and compared to simulations. This work was supported by the Deutsche Forschungsgemeinschaft DFG in the framework of the SFB TR24 Greifswald Kiel, Project A3b.
TechWriter: An Evolving System for Writing Assistance for Advanced Learners of English
ERIC Educational Resources Information Center
Napolitano, Diane M.; Stent, Amanda
2009-01-01
Writing assistance systems, from simple spelling checkers to more complex grammar and readability analyzers, can be helpful aids to nonnative writers of English. However, many writing assistance systems have two disadvantages. First, they are not designed to encourage skills learning and independence in their users; instead, users may begin to use…
ART/Ada design project, phase 1. Task 2 report: Detailed design
NASA Technical Reports Server (NTRS)
Allen, Bradley P.
1988-01-01
Various issues are studied in the context of the design of an Ada-based expert system building tool. Using an existing successful design as a starting point, the impact of the Ada language and Ada development methodologies on that design is analyzed, the system is redesigned in Ada, and its performance is analyzed using both complexity-theoretic and empirical techniques. The algorithms specified in the overall design are refined, resolving and documenting any open design issues, identifying each system module, documenting the internal architecture and control logic, and describing the primary data structures involved in each module.
Integrated Safety Analysis Teams
NASA Technical Reports Server (NTRS)
Wetherholt, Jonathan C.
2008-01-01
Today's complex systems require understanding beyond any one person's capability to comprehend. Each system requires a team to divide the system into understandable subsystems, which can then be analyzed with an Integrated Hazard Analysis. The team must have both specific experiences and diversity of experience. Safety experience and system understanding are not always manifested in one individual. Group dynamics make the difference between success and failure, as well as the difference between a difficult task and a rewarding experience. There are examples in the news which demonstrate the need to connect the pieces of a system into a complete picture. The Columbia disaster is now a standard example: a hazard considered low-consequence in one part of the system, the External Tank, was a catastrophic hazard cause for a companion subsystem, the Space Shuttle Orbiter. The interaction between the hardware, the manufacturing process, the handling, and the operations contributed to the problem. Each of these had analysis performed, but who constituted the team which integrated this analysis together? This paper will explore some of the methods used for dividing up a complex system and how one integration team has analyzed the parts. How this analysis has been documented in one particular space launch vehicle case will also be discussed.
Approaching human language with complex networks
NASA Astrophysics Data System (ADS)
Cong, Jin; Liu, Haitao
2014-12-01
The interest in modeling and analyzing human language with complex networks is on the rise in recent years and a considerable body of research in this area has already been accumulated. We survey three major lines of linguistic research from the complex network approach: 1) characterization of human language as a multi-level system with complex network analysis; 2) linguistic typological research with the application of linguistic networks and their quantitative measures; and 3) relationships between the system-level complexity of human language (determined by the topology of linguistic networks) and microscopic linguistic (e.g., syntactic) features (as the traditional concern of linguistics). We show that the models and quantitative tools of complex networks, when exploited properly, can constitute an operational methodology for linguistic inquiry, which contributes to the understanding of human language and the development of linguistics. We conclude our review with suggestions for future linguistic research from the complex network approach: 1) relationships between the system-level complexity of human language and microscopic linguistic features; 2) expansion of research scope from the global properties to other levels of granularity of linguistic networks; and 3) combination of linguistic network analysis with other quantitative studies of language (such as quantitative linguistics).
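A toy example of the kind of linguistic network construction surveyed here: link adjacent words in a short text and read off a few global measures with networkx (real studies use large corpora and finer-grained relations such as syntactic dependencies).

import networkx as nx

text = "the cat sat on the mat and the dog sat on the cat"
words = text.split()

G = nx.Graph()
G.add_edges_from(zip(words[:-1], words[1:]))     # adjacency co-occurrence edges

print("nodes:", G.number_of_nodes(), "edges:", G.number_of_edges())
print("average clustering:", round(nx.average_clustering(G), 3))
print("degree of 'the':", G.degree("the"))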
Complex Economies Have a Lateral Escape from the Poverty Trap
Pugliese, Emanuele; Chiarotti, Guido L.; Zaccaria, Andrea; Pietronero, Luciano
2017-01-01
We analyze the decisive role played by the complexity of economic systems at the onset of the industrialization process of countries over the past 50 years. Our analysis of the input growth dynamics, considering a further dimension through a recently introduced measure of economic complexity, reveals that more differentiated and more complex economies face a lower barrier (in terms of GDP per capita) when starting the transition towards industrialization. As a consequence, we can extend the classical concept of a one-dimensional poverty trap, by introducing a two-dimensional poverty trap: a country will start the industrialization process if it is rich enough (as in neo-classical economic theories), complex enough (using this new dimension and laterally escaping from the poverty trap), or a linear combination of the two. This naturally leads to the proposal of a Complex Index of Relative Development (CIRD) which shows, when analyzed as a function of the growth due to input, a shape of an upside down parabola similar to that expected from the standard economic theories when considering only the GDP per capita dimension. PMID:28072867
Complex Economies Have a Lateral Escape from the Poverty Trap.
Pugliese, Emanuele; Chiarotti, Guido L; Zaccaria, Andrea; Pietronero, Luciano
2017-01-01
We analyze the decisive role played by the complexity of economic systems at the onset of the industrialization process of countries over the past 50 years. Our analysis of the input growth dynamics, considering a further dimension through a recently introduced measure of economic complexity, reveals that more differentiated and more complex economies face a lower barrier (in terms of GDP per capita) when starting the transition towards industrialization. As a consequence, we can extend the classical concept of a one-dimensional poverty trap, by introducing a two-dimensional poverty trap: a country will start the industrialization process if it is rich enough (as in neo-classical economic theories), complex enough (using this new dimension and laterally escaping from the poverty trap), or a linear combination of the two. This naturally leads to the proposal of a Complex Index of Relative Development (CIRD) which shows, when analyzed as a function of the growth due to input, a shape of an upside down parabola similar to that expected from the standard economic theories when considering only the GDP per capita dimension.
Astakhov, Vadim
2009-01-01
Interest in the simulation of large-scale metabolic networks, species development, and the genesis of various diseases requires new simulation techniques to accommodate the high complexity of realistic biological networks. Information geometry and topological formalisms are proposed to analyze information processes. We analyze the complexity of large-scale biological networks as well as the transition of system functionality due to modifications in the system architecture, system environment, and system components. The dynamic core model is developed. The term dynamic core is used to define a set of causally related network functions. Delocalization of the dynamic core model provides a mathematical formalism to analyze the migration of specific functions in biosystems which undergo structural transitions induced by the environment. The term delocalization is used to describe these processes of migration. We constructed a holographic model with self-poietic dynamic cores which preserves functional properties under those transitions. Topological constraints such as Ricci flow and Pfaff dimension were found for statistical manifolds which represent biological networks. These constraints can provide insight into processes of degeneration and recovery which take place in large-scale networks. We would like to suggest that therapies which are able to effectively implement the estimated constraints will successfully adjust biological systems and recover altered functionality. Also, we mathematically formulate the hypothesis that there is a direct consistency between biological and chemical evolution: any set of causal relations within a biological network has its dual reimplementation in the chemistry of the system environment.
Performance Analysis Tool for HPC and Big Data Applications on Scientific Clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Wucherl; Koo, Michelle; Cao, Yu
Big data is prevalent in HPC computing. Many HPC projects rely on complex workflows to analyze terabytes or petabytes of data. These workflows often require running over thousands of CPU cores and performing simultaneous data accesses, data movements, and computation. It is challenging to analyze the performance involving terabytes or petabytes of workflow data or measurement data of the executions, from complex workflows over a large number of nodes and multiple parallel task executions. To help identify performance bottlenecks or debug the performance issues in large-scale scientific applications and scientific clusters, we have developed a performance analysis framework using state-of-the-art open-source big data processing tools. Our tool can ingest system logs and application performance measurements to extract key performance features, and apply the most sophisticated statistical tools and data mining methods on the performance data. It utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of the big data analysis framework, we conduct case studies on the workflows from an astronomy project known as the Palomar Transient Factory (PTF) and the job logs from the genome analysis scientific cluster. Our study processed many terabytes of system logs and application performance measurements collected on the HPC systems at NERSC. The implementation of our tool is generic enough to be used for analyzing the performance of other HPC systems and Big Data workflows.
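A minimal, purely illustrative sketch of the ingestion step: parse timestamped log records into a dataframe and aggregate a simple performance feature; the log format and field names are invented and are not those of the NERSC systems or the PTF workflow.

import io
import pandas as pd

raw = io.StringIO(
    "2015-03-01 10:00:01,read,1048576\n"
    "2015-03-01 10:00:31,write,524288\n"
    "2015-03-01 10:01:02,read,2097152\n"
)
df = pd.read_csv(raw, names=["timestamp", "op", "bytes"], parse_dates=["timestamp"])

per_minute = df.set_index("timestamp").resample("1min")["bytes"].sum()   # bytes moved per minute
print(per_minute)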
Complexities, Catastrophes and Cities: Emergency Dynamics in Varying Scenarios and Urban Topologies
NASA Astrophysics Data System (ADS)
Narzisi, Giuseppe; Mysore, Venkatesh; Byeon, Jeewoong; Mishra, Bud
Complex Systems are often characterized by agents capable of interacting with each other dynamically, often in non-linear and non-intuitive ways. Trying to characterize their dynamics often results in partial differential equations that are difficult, if not impossible, to solve. A large city or a city-state is an example of such an evolving and self-organizing complex environment that efficiently adapts to different and numerous incremental changes to its social, cultural and technological infrastructure [1]. One powerful technique for analyzing such complex systems is Agent-Based Modeling (ABM) [9], which has seen an increasing number of applications in social science, economics and also biology. The agent-based paradigm facilitates easier transfer of domain-specific knowledge into a model. ABM provides a natural way to describe systems in which the overall dynamics can be described as the result of the behavior of populations of autonomous components: agents with a fixed set of rules based on local information and possibly some central control. As part of the NYU Center for Catastrophe Preparedness and Response (CCPR), we have been exploring how ABM can serve as a powerful simulation technique for analyzing large-scale urban disasters. The central problem in Disaster Management is that it is not immediately apparent whether the current emergency plans are robust against such sudden, rare and punctuated catastrophic events.
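A deliberately minimal agent-based sketch, unrelated to the CCPR models, showing the flavor of such simulations: agents move toward the nearer of two exits with small random jostling, and the number reaching an exit is tallied at the end.

import numpy as np

rng = np.random.default_rng(42)
agents = rng.uniform(0, 100, size=(200, 2))          # 200 agents on a 100 x 100 plane
exits = np.array([[0.0, 50.0], [100.0, 50.0]])       # two exits on opposite walls

for step in range(100):
    d = np.linalg.norm(agents[:, None, :] - exits[None, :, :], axis=2)
    target = exits[d.argmin(axis=1)]                 # each agent heads for its nearest exit
    direction = target - agents
    direction /= np.linalg.norm(direction, axis=1, keepdims=True) + 1e-9
    agents += direction + rng.normal(0, 0.3, size=agents.shape)   # unit step plus random jostling

final_d = np.linalg.norm(agents[:, None, :] - exits[None, :, :], axis=2).min(axis=1)
print("agents within 2 units of an exit after 100 steps:", int((final_d < 2).sum()))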
On the robustness of complex heterogeneous gene expression networks.
Gómez-Gardeñes, Jesús; Moreno, Yamir; Floría, Luis M
2005-04-01
We analyze a continuous gene expression model on the underlying topology of a complex heterogeneous network. Numerical simulations aimed at studying the chaotic and periodic dynamics of the model are performed. The results clearly indicate that there is a region in which the dynamical and structural complexity of the system avoids chaotic attractors. However, contrary to what has been reported for Random Boolean Networks, the chaotic phase cannot be completely suppressed, which has important bearings on network robustness and gene expression modeling.
Dintner, Sebastian; Heermann, Ralf; Fang, Chong; Jung, Kirsten; Gebhard, Susanne
2014-10-03
Resistance against antimicrobial peptides in many Firmicutes bacteria is mediated by detoxification systems that are composed of a two-component regulatory system (TCS) and an ATP-binding cassette (ABC) transporter. The histidine kinases of these systems depend entirely on the transporter for sensing of antimicrobial peptides, suggesting a novel mode of signal transduction where the transporter constitutes the actual sensor. The aim of this study was to investigate the molecular mechanisms of this unusual signaling pathway in more detail, using the bacitracin resistance system BceRS-BceAB of Bacillus subtilis as an example. To analyze the proposed communication between TCS and the ABC transporter, we characterized their interactions by bacterial two-hybrid analyses and could show that the permease BceB and the histidine kinase BceS interact directly. In vitro pulldown assays confirmed this interaction, which was found to be independent of bacitracin. Because it was unknown whether BceAB-type transporters could detect their substrate peptides directly or instead recognized the peptide-target complex in the cell envelope, we next analyzed substrate binding by the transport permease, BceB. Direct and specific binding of bacitracin by BceB was demonstrated by surface plasmon resonance spectroscopy. Finally, in vitro signal transduction assays indicated that complex formation with the transporter influenced the autophosphorylation activity of the histidine kinase. Taken together, our findings clearly show the existence of a sensory complex composed of TCS and ABC transporters and provide the first functional insights into the mechanisms of stimulus perception, signal transduction, and antimicrobial resistance employed by Bce-like detoxification systems.
Dintner, Sebastian; Heermann, Ralf; Fang, Chong; Jung, Kirsten; Gebhard, Susanne
2014-01-01
Resistance against antimicrobial peptides in many Firmicutes bacteria is mediated by detoxification systems that are composed of a two-component regulatory system (TCS) and an ATP-binding cassette (ABC) transporter. The histidine kinases of these systems depend entirely on the transporter for sensing of antimicrobial peptides, suggesting a novel mode of signal transduction where the transporter constitutes the actual sensor. The aim of this study was to investigate the molecular mechanisms of this unusual signaling pathway in more detail, using the bacitracin resistance system BceRS-BceAB of Bacillus subtilis as an example. To analyze the proposed communication between TCS and the ABC transporter, we characterized their interactions by bacterial two-hybrid analyses and could show that the permease BceB and the histidine kinase BceS interact directly. In vitro pulldown assays confirmed this interaction, which was found to be independent of bacitracin. Because it was unknown whether BceAB-type transporters could detect their substrate peptides directly or instead recognized the peptide-target complex in the cell envelope, we next analyzed substrate binding by the transport permease, BceB. Direct and specific binding of bacitracin by BceB was demonstrated by surface plasmon resonance spectroscopy. Finally, in vitro signal transduction assays indicated that complex formation with the transporter influenced the autophosphorylation activity of the histidine kinase. Taken together, our findings clearly show the existence of a sensory complex composed of TCS and ABC transporters and provide the first functional insights into the mechanisms of stimulus perception, signal transduction, and antimicrobial resistance employed by Bce-like detoxification systems. PMID:25118291
NASA Astrophysics Data System (ADS)
Zhang, Yunpeng; Li, En; Zhang, Jing; Yu, Chengyong; Zheng, Hu; Guo, Gaofeng
2018-02-01
A microwave test system to measure the complex permittivity of solid and powder materials as a function of temperature has been developed. The system is based on a TM0n0 multi-mode cylindrical cavity with a slotting structure, which provides purer test modes compared to a traditional cavity. To ensure safety, effectiveness, and longevity, heating and testing are carried out separately, and the sample can move between the two functional areas through an Alundum tube. Induction heating and a pneumatic platform are employed to shorten the heating and cooling times of the sample, respectively. The single-trigger function of the vector network analyzer is added to the test software to suppress the drift of the resonance peak during testing. Complex permittivity is calculated using a rigorous field-theoretical solution that accounts for multilayer media loading. The variation of the cavity's equivalent radius caused by the sample insertion holes is discussed in detail, and its influence on the test result is analyzed. The calibration method for the complex permittivity of the Alundum tube and the quartz vial (used for loading powder samples), which vary with temperature, is given. The feasibility of the system has been verified by measuring different samples over a wide range of relative permittivity and loss tangent, and variable-temperature test results for fused quartz and SiO2 powder up to 1500 °C are compared with published data. The results indicate that the presented system is reliable and accurate. The stability of the system is verified by repeated and long-term tests, and an error analysis is presented to estimate the error incurred due to uncertainties in the different error sources.
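The paper extracts permittivity from a rigorous multilayer field solution; as a much simpler, hedged illustration of how a resonance shift and Q-factor change map to complex permittivity, the classic first-order cavity-perturbation relations can be coded as below. The form factors A and B are mode-dependent and are treated here as assumed values rather than calibrated constants, and the example numbers are invented.

def perturbation_permittivity(f_empty, f_loaded, Q_empty, Q_loaded,
                              v_cavity, v_sample, A=2.0, B=4.0):
    """First-order cavity-perturbation estimate of complex relative permittivity."""
    eps_real = 1.0 + A * (f_empty - f_loaded) / f_loaded * v_cavity / v_sample
    eps_imag = B * (1.0 / Q_loaded - 1.0 / Q_empty) * v_cavity / v_sample
    return eps_real, eps_imag

# Invented example: 2.45 GHz cavity, small low-loss sample at the E-field maximum.
print(perturbation_permittivity(2.450e9, 2.447e9, 9000, 6000,
                                v_cavity=1.2e-4, v_sample=8.0e-8))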
A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems
NASA Technical Reports Server (NTRS)
Hatanaka, Iwao
2000-01-01
The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.
Safety Metrics for Human-Computer Controlled Systems
NASA Technical Reports Server (NTRS)
Leveson, Nancy G; Hatanaka, Iwao
2000-01-01
The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.
The Applied Mathematics for Power Systems (AMPS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chertkov, Michael
2012-07-24
Increased deployment of new technologies, e.g., renewable generation and electric vehicles, is rapidly transforming electrical power networks by crossing previously distinct spatiotemporal scales and invalidating many traditional approaches for designing, analyzing, and operating power grids. This trend is expected to accelerate over the coming years, bringing the disruptive challenge of complexity, but also opportunities to deliver unprecedented efficiency and reliability. Our Applied Mathematics for Power Systems (AMPS) Center will discover, enable, and solve emerging mathematics challenges arising in power systems and, more generally, in complex engineered networks. We will develop foundational applied mathematics resulting in rigorous algorithms and simulation toolboxes for modern and future engineered networks. The AMPS Center deconstruction/reconstruction approach 'deconstructs' complex networks into sub-problems within non-separable spatiotemporal scales, a missing step in 20th century modeling of engineered networks. These sub-problems are addressed within the appropriate AMPS foundational pillar - complex systems, control theory, and optimization theory - and merged or 'reconstructed' at their boundaries into more general mathematical descriptions of complex engineered networks where important new questions are formulated and attacked. These two steps, iterated multiple times, will bridge the growing chasm between the legacy power grid and its future as a complex engineered network.
Applications of complex systems theory in nursing education, research, and practice.
Clancy, Thomas R; Effken, Judith A; Pesut, Daniel
2008-01-01
The clinical and administrative processes in today's healthcare environment are becoming increasingly complex. Multiple providers, new technology, competition, and the growing ubiquity of information all contribute to the notion of health care as a complex system. A complex system (CS) is characterized by a highly connected network of entities (e.g., physical objects, people or groups of people) from which higher order behavior emerges. Research in the transdisciplinary field of CS has focused on the use of computational modeling and simulation as a methodology for analyzing CS behavior. The creation of virtual worlds through computer simulation allows researchers to analyze multiple variables simultaneously and begin to understand behaviors that are common regardless of the discipline. The application of CS principles, mediated through computer simulation, informs nursing practice of the benefits and drawbacks of new procedures, protocols and practices before having to actually implement them. The inclusion of new computational tools and their applications in nursing education is also gaining attention. For example, education in CSs and applied computational applications has been endorsed by The Institute of Medicine, the American Organization of Nurse Executives and the American Association of Colleges of Nursing as essential training of nurse leaders. The purpose of this article is to review current research literature regarding CS science within the context of expert practice and implications for the education of nurse leadership roles. The article focuses on 3 broad areas: CS defined, literature review and exemplars from CS research and applications of CS theory in nursing leadership education. The article also highlights the key role nursing informaticists play in integrating emerging computational tools in the analysis of complex nursing systems.
Mathematical and Computational Modeling in Complex Biological Systems
Li, Wenyang; Zhu, Xiaoliang
2017-01-01
The biological process and molecular functions involved in cancer progression remain difficult to understand for biologists and clinical doctors. Recent developments in high-throughput technologies urge systems biology to achieve more precise models for complex diseases. Computational and mathematical models are gradually being used to help us understand the omics data produced by high-throughput experimental techniques. The use of computational models in systems biology allows us to explore the pathogenesis of complex diseases, improve our understanding of the latent molecular mechanisms, and promote treatment strategy optimization and new drug discovery. Currently, it is urgent to bridge the gap between the developments of high-throughput technologies and systemic modeling of the biological process in cancer research. In this review, we first studied several typical mathematical modeling approaches of biological systems at different scales and deeply analyzed their characteristics, advantages, applications, and limitations. Next, three potential research directions in systems modeling were summarized. To conclude, this review provides an update of important solutions using computational modeling approaches in systems biology. PMID:28386558
Mathematical and Computational Modeling in Complex Biological Systems.
Ji, Zhiwei; Yan, Ke; Li, Wenyang; Hu, Haigen; Zhu, Xiaoliang
2017-01-01
The biological process and molecular functions involved in cancer progression remain difficult to understand for biologists and clinical doctors. Recent developments in high-throughput technologies urge systems biology to achieve more precise models for complex diseases. Computational and mathematical models are gradually being used to help us understand the omics data produced by high-throughput experimental techniques. The use of computational models in systems biology allows us to explore the pathogenesis of complex diseases, improve our understanding of the latent molecular mechanisms, and promote treatment strategy optimization and new drug discovery. Currently, it is urgent to bridge the gap between the developments of high-throughput technologies and systemic modeling of the biological process in cancer research. In this review, we first studied several typical mathematical modeling approaches of biological systems at different scales and deeply analyzed their characteristics, advantages, applications, and limitations. Next, three potential research directions in systems modeling were summarized. To conclude, this review provides an update of important solutions using computational modeling approaches in systems biology.
Using machine-learning methods to analyze economic loss function of quality management processes
NASA Astrophysics Data System (ADS)
Dzedik, V. A.; Lontsikh, P. A.
2018-05-01
When quality management systems are analyzed, their economic component often receives insufficient attention. To overcome this issue, it is necessary to take the concept of economic loss functions beyond tolerance-based thinking and address it directly. Input data about economic losses in processes have a complex form; thus, solving this problem with standard tools is complicated. The use of machine learning techniques allows one to obtain precise models of the economic loss function based on even the most complex input data. The results of such analysis contain data about the true efficiency of a process and can be used to make investment decisions.
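An illustrative sketch, not the authors' model: learn an economic loss function L(deviation) directly from noisy process records instead of assuming the symmetric quadratic, tolerance-style form; the synthetic data encode an asymmetric loss in which deviations above target are far costlier than those below.

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(3)
deviation = rng.uniform(-1.0, 1.0, size=(500, 1))
loss = np.where(deviation[:, 0] > 0, 8.0, 2.0) * deviation[:, 0] ** 2   # asymmetric true loss
loss += 0.2 * rng.standard_normal(500)                                  # measurement noise

model = GradientBoostingRegressor(random_state=0).fit(deviation, loss)
for d in (-0.5, 0.0, 0.5):
    print(f"deviation {d:+.1f} -> predicted loss {model.predict([[d]])[0]:.2f}")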
Complex and unexpected dynamics in simple genetic regulatory networks
NASA Astrophysics Data System (ADS)
Borg, Yanika; Ullner, Ekkehard; Alagha, Afnan; Alsaedi, Ahmed; Nesbeth, Darren; Zaikin, Alexey
2014-03-01
One aim of synthetic biology is to construct increasingly complex genetic networks from interconnected simpler ones to address challenges in medicine and biotechnology. However, as systems increase in size and complexity, emergent properties lead to unexpected and complex dynamics due to nonlinear and nonequilibrium properties from component interactions. We focus on four different studies of biological systems which exhibit complex and unexpected dynamics. Using simple synthetic genetic networks, small and large populations of phase-coupled quorum sensing repressilators, Goodwin oscillators, and bistable switches, we review how coupled and stochastic components can result in clustering, chaos, noise-induced coherence and speed-dependent decision making. A system of repressilators exhibits oscillations, limit cycles, steady states or chaos depending on the nature and strength of the coupling mechanism. In large repressilator networks, rich dynamics can also be exhibited, such as clustering and chaos. In populations of Goodwin oscillators, noise can induce coherent oscillations. In bistable systems, the speed with which incoming external signals reach steady state can bias the network towards particular attractors. These studies showcase the range of dynamical behavior that simple synthetic genetic networks can exhibit. In addition, they demonstrate the ability of mathematical modeling to analyze nonlinearity and inhomogeneity within these systems.
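One of the motifs mentioned above, the three-gene repressilator, can be simulated in its standard dimensionless ODE form; the parameter values used here are the commonly quoted textbook ones, chosen only so that the sketch produces sustained oscillations.

import numpy as np
from scipy.integrate import solve_ivp

alpha, alpha0, beta, n = 216.0, 0.216, 5.0, 2.0

def repressilator(t, y):
    m = y[:3]            # mRNA concentrations
    p = y[3:]            # protein concentrations
    dm = -m + alpha / (1.0 + np.roll(p, 1) ** n) + alpha0   # gene i repressed by the previous protein in the cycle
    dp = -beta * (p - m)
    return np.concatenate([dm, dp])

sol = solve_ivp(repressilator, (0, 100), [1, 2, 3, 0, 0, 0], dense_output=True)
t = np.linspace(50, 100, 500)
p1 = sol.sol(t)[3]
print("protein 1 oscillates between", round(p1.min(), 1), "and", round(p1.max(), 1))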
Three perspectives on complexity: entropy, compression, subsymmetry
NASA Astrophysics Data System (ADS)
Nagaraj, Nithin; Balasubramanian, Karthi
2017-12-01
There is no single universally accepted definition of 'Complexity'. There are several perspectives on complexity and on what constitutes complex behaviour or complex systems, as opposed to regular, predictable behaviour and simple systems. In this paper, we explore the following perspectives on complexity: effort-to-describe (Shannon entropy H, Lempel-Ziv complexity LZ), effort-to-compress (ETC complexity) and degree-of-order (Subsymmetry or SubSym). While Shannon entropy and LZ are very popular and widely used, ETC is a relatively new complexity measure. In this paper, we also propose a novel normalized complexity measure, SubSym, based on the existing idea of counting the number of subsymmetries or palindromes within a sequence. We compare the performance of these complexity measures on the following tasks: (A) characterizing the complexity of short binary sequences of lengths 4 to 16, (B) distinguishing periodic and chaotic time series from the 1D logistic map and the 2D Hénon map, (C) analyzing the complexity of stochastic time series generated from 2-state Markov chains, and (D) distinguishing between tonic and irregular spiking patterns generated from the 'Adaptive exponential integrate-and-fire' neuron model. Our study reveals that each perspective has its own advantages and uniqueness, while also overlapping with the others.
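Two of these perspectives can be illustrated on symbol sequences from the logistic map: Shannon entropy of the symbol distribution and a simplified LZ78-style phrase count used here as a stand-in for the LZ76 measure. The chaotic regime (r = 4.0) should score higher on both than the periodic regime (r = 3.2); thresholds and lengths below are illustrative choices.

import numpy as np

def shannon_entropy(bits):
    """Shannon entropy (bits) of the 0/1 symbol distribution."""
    p = np.bincount(bits, minlength=2) / len(bits)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def lz_phrase_count(s):
    """Count distinct phrases in a simple incremental (LZ78-style) parse of s."""
    i, count, phrases = 0, 0, set()
    while i < len(s):
        j = i + 1
        while s[i:j] in phrases and j <= len(s):
            j += 1
        phrases.add(s[i:j])
        count += 1
        i = j
    return count

def logistic_bits(r, n=2000, x0=0.4):
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(1 if x > 0.5 else 0)
    return np.array(out)

for r in (3.2, 4.0):   # periodic vs fully chaotic regime
    b = logistic_bits(r)
    s = "".join(map(str, b))
    print(f"r={r}: H={shannon_entropy(b):.3f} bits, LZ phrases={lz_phrase_count(s)}")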
Risk Modeling of Interdependent Complex Systems of Systems: Theory and Practice.
Haimes, Yacov Y
2018-01-01
The emergence of the complexity characterizing our systems of systems (SoS) requires a reevaluation of the way we model, assess, manage, communicate, and analyze the risk thereto. Current models for risk analysis of emergent complex SoS are insufficient because too often they rely on the same risk functions and models used for single systems. These models commonly fail to incorporate the complexity derived from the networks of interdependencies and interconnectedness (I-I) characterizing SoS. There is a need to reevaluate currently practiced risk analysis to respond to this reality by examining, and thus comprehending, what makes emergent SoS complex. The key to evaluating the risk to SoS lies in understanding the genesis of characterizing I-I of systems manifested through shared states and other essential entities within and among the systems that constitute SoS. The term "essential entities" includes shared decisions, resources, functions, policies, decisionmakers, stakeholders, organizational setups, and others. This undertaking can be accomplished by building on state-space theory, which is fundamental to systems engineering and process control. This article presents a theoretical and analytical framework for modeling the risk to SoS with two case studies performed with the MITRE Corporation and demonstrates the pivotal contributions made by shared states and other essential entities to modeling and analysis of the risk to complex SoS. A third case study highlights the multifarious representations of SoS, which require harmonizing the risk analysis process currently applied to single systems when applied to complex SoS. © 2017 Society for Risk Analysis.
Maja Valley and the Chryse outflow complex sites
NASA Technical Reports Server (NTRS)
Rice, Jim W.
1994-01-01
This candidate landing site is located at 19 deg N, 53.5 deg W, near the mouth of a major outflow channel, Maja Valles, and two 'valley network' channel systems, Maumee and Vedra Valles. The following objectives are to be analyzed in this region: (1) the origin and paleohydrology of outflow and valley network channels; (2) the composition of the fan delta complex (the deposit located in this area is one of the few identified at the mouths of any channels on the planet); and (3) analysis of any paleolake sediments (carbonates, evaporites). The primary objectives of the Chryse Outflow Complex region (Ares, Tiu, Mawrth, Simud, and Shalbatana Valles) would be the outflow channel dynamics (paleohydrology) of five different channel systems.
Coexisting multiple attractors and riddled basins of a memristive system.
Wang, Guangyi; Yuan, Fang; Chen, Guanrong; Zhang, Yu
2018-01-01
In this paper, a new memristor-based chaotic system is designed, analyzed, and implemented. Multistability, multiple attractors, and complex riddled basins are observed from the system, which are investigated along with other dynamical behaviors such as equilibrium points and their stabilities, symmetrical bifurcation diagrams, and sustained chaotic states. With different sets of system parameters, the system can also generate various multi-scroll attractors. Finally, the system is realized by experimental circuits.
PumpKin: A tool to find principal pathways in plasma chemical models
NASA Astrophysics Data System (ADS)
Markosyan, A. H.; Luque, A.; Gordillo-Vázquez, F. J.; Ebert, U.
2014-10-01
PumpKin is a software package to find all principal pathways, i.e. the dominant reaction sequences, in chemical reaction systems. Although many tools are available to numerically integrate arbitrarily complex chemical reaction systems, few tools exist to analyze the results and interpret them in relatively simple terms. In particular, due to the large disparity in the lifetimes of the interacting components, it is often useful to group reactions into pathways that recycle the fastest species. This allows a researcher to focus on the slow chemical dynamics, eliminating the shortest timescales. Based on the algorithm described by Lehmann (2004), PumpKin automates the process of finding such pathways, allowing the user to analyze complex kinetics and to understand the consumption and production of a certain species of interest. We designed PumpKin with an emphasis on plasma chemical systems, but it can also be applied to atmospheric modeling and to industrial applications such as plasma medicine and plasma-assisted combustion.
Berge, J A; Gramstad, L; Grimnes, S
1995-05-01
Modern anaesthetic machines are equipped with several safety components to prevent delivery of hypoxic mixtures. However, such a technical development has increased the complexity of the equipment. We report a reconstructed anaesthetic machine in which a paramagnetic oxygen analyzer has provided the means to simplify the apparatus. The new machine is devoid of several components conventionally included to prevent hypoxic mixtures: oxygen failure protection device, reservoir O2 alarm, N2O/air selector, and proportioning system for oxygen/nitrous oxide delivery. These devices have been replaced by a simple safety system using a paramagnetic oxygen analyzer at the common gas outlet, which in a feed-back system cuts off the supply of nitrous oxide whenever the oxygen concentration falls below 25%. The simplified construction of the anaesthetic machine has important consequences for safety, cost and user-friendliness. Reducing the complexity of the construction also simplifies the pre-use checkout procedure, and an efficient 5-point check list is presented for the new machine.
Kacang Cerdik: A Conceptual Design of an Idea Management System
ERIC Educational Resources Information Center
Murah, Mohd Zamri; Abdullah, Zuraidah; Hassan, Rosilah; Bakar, Marini Abu; Mohamed, Ibrahim; Amin, Hazilah Mohd
2013-01-01
An idea management system is where ideas are stored and then can be evaluated and analyzed. It provides the structure and the platform for users to contribute ideas for innovation and creativity. Designing and developing an idea management system is a complex task because it involves many users and a lot of ideas. Some of the critical features for…
A Statistical Dynamic Approach to Structural Evolution of Complex Capital Market Systems
NASA Astrophysics Data System (ADS)
Shao, Xiao; Chai, Li H.
As an important part of modern financial systems, the capital market has played a crucial role in diverse social resource allocations and economic exchanges. Going beyond traditional models and/or theories based on neoclassical economics, and considering capital markets as typical complex open systems, this paper attempts to develop a new approach to overcome some shortcomings of the available research. By defining the generalized entropy of capital market systems, a theoretical model and a nonlinear dynamic equation for the operation of capital markets are proposed from a statistical dynamic perspective. The US security market from 1995 to 2001 is then simulated and analyzed as a typical case. Some instructive results are discussed and summarized.
NASA Astrophysics Data System (ADS)
Zhang, Wanli; Li, Chuandong; Huang, Tingwen; Huang, Junjian
2018-02-01
This paper investigates the fixed-time synchronization of complex networks (CNs) with nonidentical nodes and stochastic noise perturbations. By designing new controllers, constructing Lyapunov functions, and using the properties of the Wiener process, different synchronization criteria are derived according to whether the node systems in the CNs or the goal system satisfies the corresponding conditions. Moreover, the role of the designed controllers is analyzed in great detail by constructing a suitable comparison system, and a new method is presented to estimate the settling time by utilizing the comparison system. The results of this paper can be applied to both directed and undirected weighted networks. Numerical simulations are offered to verify the effectiveness of the new results.
Analysis of chaos in high-dimensional wind power system.
Wang, Cong; Zhang, Hongli; Fan, Wenhui; Ma, Ping
2018-01-01
A comprehensive analysis of chaos in a high-dimensional wind power system is performed in this study. A high-dimensional wind power system is more complex than most power systems. An 11-dimensional wind power system proposed by Huang, which has not been analyzed in previous studies, is investigated. The chaotic dynamics of the wind power system are analyzed when the system is affected by external disturbances, including single-parameter and periodic disturbances, or when its parameters change, and the parameter ranges over which chaos occurs are obtained. The existence of chaos is confirmed by calculating and analyzing the Lyapunov exponents of all state variables and the state variable sequence diagram. Theoretical analysis and numerical simulations show that chaos will occur in the wind power system when parameter variations and external disturbances change to a certain degree.
Dynamical complexity changes during two forms of meditation
NASA Astrophysics Data System (ADS)
Li, Jin; Hu, Jing; Zhang, Yinhong; Zhang, Xiaofeng
2011-06-01
Detection of dynamical complexity changes in natural and man-made systems has deep scientific and practical meaning. We use the base-scale entropy method to analyze dynamical complexity changes for heart rate variability (HRV) series during specific traditional forms of Chinese Chi and Kundalini Yoga meditation techniques in healthy young adults. The results show that dynamical complexity decreases in meditation states for both forms of meditation. Meanwhile, we detected changes in the probability distribution of m-words during meditation and explained these changes using the probability distribution of the sine function. The base-scale entropy method may be used on a wider range of physiologic signals.
NASA Astrophysics Data System (ADS)
Aggarwal, Anil Kr.; Kumar, Sanjeev; Singh, Vikram
2017-03-01
The binary-state (i.e., success or failure) assumptions used in conventional reliability are inappropriate for reliability analysis of complex industrial systems due to lack of sufficient probabilistic information. For large complex systems, the uncertainty of each individual parameter enhances the uncertainty of the system reliability. In this paper, the concept of fuzzy reliability has been used for reliability analysis of the system, and the effect of coverage factor, failure rates, and repair rates of subsystems on fuzzy availability for a fault-tolerant crystallization system of a sugar plant is analyzed. Mathematical modeling of the system is carried out using the mnemonic rule to derive the Chapman-Kolmogorov differential equations. These governing differential equations are solved with the fourth-order Runge-Kutta method.
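As an illustration of the final step, a minimal sketch (in Python) of integrating Chapman-Kolmogorov state-probability equations with the fourth-order Runge-Kutta method; the two-state up/down model and the failure and repair rates used here are illustrative assumptions, not the crystallization-system model of the paper.

    import numpy as np

    # Hypothetical two-state model: P[0] = P(up), P[1] = P(down).
    lam, mu = 0.02, 0.5   # assumed failure and repair rates (1/h)

    def dP_dt(P):
        # Chapman-Kolmogorov equations for the two-state Markov model
        return np.array([-lam * P[0] + mu * P[1],
                          lam * P[0] - mu * P[1]])

    def rk4_step(P, h):
        k1 = dP_dt(P)
        k2 = dP_dt(P + 0.5 * h * k1)
        k3 = dP_dt(P + 0.5 * h * k2)
        k4 = dP_dt(P + h * k3)
        return P + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

    P, h = np.array([1.0, 0.0]), 0.1
    for _ in range(int(1000 / h)):      # integrate to t = 1000 h
        P = rk4_step(P, h)
    print("steady-state availability ~", P[0])   # approaches mu / (lam + mu)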
X-ray emission spectroscopy of biomimetic Mn coordination complexes
Jensen, Scott C.; Davis, Katherine M.; Sullivan,
2017-05-19
Understanding the function of Mn ions in biological and chemical redox catalysis requires precise knowledge of their electronic structure. X-ray emission spectroscopy (XES) is an emerging technique with a growing application to biological and biomimetic systems. Here, we report an improved, cost-effective spectrometer used to analyze two biomimetic coordination compounds, [Mn IV(OH) 2(Me 2EBC)] 2+ and [Mn IV(O)(OH)(Me 2EBC)] +, the second of which contains a key Mn IV=O structural fragment. Despite having the same formal oxidation state (Mn IV) and tetradentate ligands, XES spectra from these two compounds demonstrate different electronic structures. Experimental measurements and DFT calculations yield different localized spin densities for the two complexes resulting from Mn IV–OH conversion to Mn IV=O. The relevance of the observed spectroscopic changes is discussed for applications in analyzing complex biological systems such as photosystem II. In conclusion, a model of the S 3 intermediate state of photosystem II containing a Mn IV=O fragment is compared to recent time-resolved X-ray diffraction data of the same state.
X-ray Emission Spectroscopy of Biomimetic Mn Coordination Complexes.
Jensen, Scott C; Davis, Katherine M; Sullivan, Brendan; Hartzler, Daniel A; Seidler, Gerald T; Casa, Diego M; Kasman, Elina; Colmer, Hannah E; Massie, Allyssa A; Jackson, Timothy A; Pushkar, Yulia
2017-06-15
Understanding the function of Mn ions in biological and chemical redox catalysis requires precise knowledge of their electronic structure. X-ray emission spectroscopy (XES) is an emerging technique with a growing application to biological and biomimetic systems. Here, we report an improved, cost-effective spectrometer used to analyze two biomimetic coordination compounds, [Mn IV (OH) 2 (Me 2 EBC)] 2+ and [Mn IV (O)(OH)(Me 2 EBC)] + , the second of which contains a key Mn IV ═O structural fragment. Despite having the same formal oxidation state (Mn IV ) and tetradentate ligands, XES spectra from these two compounds demonstrate different electronic structures. Experimental measurements and DFT calculations yield different localized spin densities for the two complexes resulting from Mn IV -OH conversion to Mn IV ═O. The relevance of the observed spectroscopic changes is discussed for applications in analyzing complex biological systems such as photosystem II. A model of the S 3 intermediate state of photosystem II containing a Mn IV ═O fragment is compared to recent time-resolved X-ray diffraction data of the same state.
Boundaries of mass resolution in native mass spectrometry.
Lössl, Philip; Snijder, Joost; Heck, Albert J R
2014-06-01
Over the last two decades, native mass spectrometry (MS) has emerged as a valuable tool to study intact proteins and noncovalent protein complexes. Studied experimental systems range from small-molecule (drug)-protein interactions, to nanomachineries such as the proteasome and ribosome, to even virus assembly. In native MS, ions attain high m/z values, requiring special mass analyzers for their detection. Depending on the particular mass analyzer used, instrumental mass resolution does often decrease at higher m/z but can still be above a couple of thousand at m/z 5000. However, the mass resolving power obtained on charge states of protein complexes in this m/z region is experimentally found to remain well below the inherent instrument resolution of the mass analyzers employed. Here, we inquire into reasons for this discrepancy and ask how native MS would benefit from higher instrumental mass resolution. To answer this question, we discuss advantages and shortcomings of mass analyzers used to study intact biomolecules and biomolecular complexes in their native state, and we review which other factors determine mass resolving power in native MS analyses. Recent examples from the literature are given to illustrate the current status and limitations.
Problem-solving tools for analyzing system problems. The affinity map and the relationship diagram.
Lepley, C J
1998-12-01
The author describes how to use two management tools, an affinity map and a relationship diagram, to define and analyze aspects of a complex problem in a system. The affinity map identifies the key influencing elements of the problem, whereas the relationship diagram helps to identify the area that is the most important element of the issue. Managers can use the tools to draw a map of problem drivers, graphically display the drivers in a diagram, and use the diagram to develop a cause-and-effect relationship.
Artificial intelligence applied to process signal analysis
NASA Technical Reports Server (NTRS)
Corsberg, Dan
1988-01-01
Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.
Quality Management and Key Performance Indicators in Oncologic Esophageal Surgery.
Gockel, Ines; Ahlbrand, Constantin Johannes; Arras, Michael; Schreiber, Elke Maria; Lang, Hauke
2015-12-01
Ranking systems and comparisons of quality and performance indicators will be of increasing relevance for complex "high-risk" procedures such as esophageal cancer surgery. The identification of evidence-based standards relevant for key performance indicators in esophageal surgery is essential for establishing monitoring systems and furthermore a requirement to enhance treatment quality. In the course of this review, we analyze the key performance indicators case volume, radicality of resection, and postoperative morbidity and mortality, leading to continuous quality improvement. Ranking systems established on this basis will gain increased relevance in highly complex procedures within the national and international comparison and furthermore improve the treatment of patients with esophageal carcinoma.
Real-time color image processing for forensic fiber investigations
NASA Astrophysics Data System (ADS)
Paulsson, Nils
1995-09-01
This paper describes a system for automatic fiber debris detection based on color identification. The properties of the system are fast analysis and high selectivity, a necessity when analyzing forensic fiber samples. An ordinary investigation separates the material into well above 100,000 video images to analyze. The system is based on standard techniques such as a CCD camera, a motorized sample table, and an IBM-compatible PC/AT with add-on boards for video frame digitalization and stepping motor control as the main parts. It is possible to operate the instrument at full video rate (25 images/s) with the aid of the HSI color system (hue-saturation-intensity) and software optimization. High selectivity is achieved by separating the analysis into several steps. The first step is fast, direct color identification of objects in the analyzed video images; the second step analyzes the detected objects in a more complex and time-consuming stage of the investigation to identify single fiber fragments for subsequent analysis with more selective techniques.
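As an illustration of the first, fast color-screening step, a minimal sketch in Python; the hue window, saturation and intensity thresholds, and the use of the standard-library HSV conversion as a stand-in for HSI are assumptions for illustration, not the original real-time implementation.

    import colorsys

    def is_target_fiber(rgb, hue_window=(0.55, 0.70), min_sat=0.3, min_val=0.2):
        """Fast first-pass color test for one pixel (R, G, B in 0..255)."""
        r, g, b = (c / 255.0 for c in rgb)
        h, s, v = colorsys.rgb_to_hsv(r, g, b)   # HSV used here as a stand-in for HSI
        return hue_window[0] <= h <= hue_window[1] and s >= min_sat and v >= min_val

    # First step: flag candidate pixels; detected objects would then go to the
    # slower, more selective second-stage analysis described in the abstract.
    pixels = [(30, 60, 200), (200, 30, 30)]
    candidates = [p for p in pixels if is_target_fiber(p)]
    print(candidates)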
Securing Information with Complex Optical Encryption Networks
2015-08-11
Keywords: Network Security, Network Vulnerability, Multi-dimensional Processing, optoelectronic devices.
Analyzing Change in Students' Gene-to-Evolution Models in College-Level Introductory Biology
ERIC Educational Resources Information Center
Dauer, Joseph T.; Momsen, Jennifer L.; Speth, Elena Bray; Makohon-Moore, Sasha C.; Long, Tammy M.
2013-01-01
Research in contemporary biology has become increasingly complex and organized around understanding biological processes in the context of systems. To better reflect the ways of thinking required for learning about systems, we developed and implemented a pedagogical approach using box-and-arrow models (similar to concept maps) as a foundational…
Lumber Grading With A Computer Vision System
Richard W. Conners; Tai-Hoon Cho; Philip A. Araman
1989-01-01
Over the past few years significant progress has been made in developing a computer vision system for locating and identifying defects on surfaced hardwood lumber. Unfortunately, until September of 1988 little research had gone into developing methods for analyzing rough lumber. This task is arguably more complex than the analysis of surfaced lumber. The prime...
Method and apparatus for transfer function simulator for testing complex systems
NASA Technical Reports Server (NTRS)
Kavaya, M. J. (Inventor)
1985-01-01
A method and apparatus for testing the operation of a complex stabilization circuit in a closed-loop system is presented. The method comprises a programmed analog or digital computing system that implements the transfer function of a load, thereby providing a predictable load. The digital computing system employs a table stored in a microprocessor in which precomputed values of the load transfer function are stored for values of the input signal from the stabilization circuit over the range of interest. This technique may be used not only for isolating faults in the stabilization circuit, but also for analyzing a fault in a faulty load by varying parameters of the computing system so as to simulate operation of the actual load with the fault.
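A minimal sketch of the table-lookup idea in Python: precomputed values of an assumed load transfer function are stored over the input range of interest and interpolated at run time; the tanh response and the grid are illustrative assumptions.

    import numpy as np

    # Precompute a table of the (assumed) load transfer function over the input range.
    u_grid = np.linspace(-10.0, 10.0, 201)          # input signal range of interest
    y_grid = 2.0 * np.tanh(0.5 * u_grid)            # hypothetical static load response

    def simulated_load(u):
        """Table lookup with linear interpolation, as a microprocessor might do."""
        return np.interp(u, u_grid, y_grid)

    print(simulated_load(1.25))   # response fed back to the stabilization circuit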
The BiolAD-DB system : an informatics system for clinical and genetic data.
Nielsen, David A; Leidner, Marty; Haynes, Chad; Krauthammer, Michael; Kreek, Mary Jeanne
2007-01-01
The Biology of Addictive Diseases-Database (BiolAD-DB) system is a research bioinformatics system for archiving, analyzing, and processing of complex clinical and genetic data. The database schema employs design principles for handling complex clinical information, such as response items in genetic questionnaires. Data access and validation is provided by the BiolAD-DB client application, which features a data validation engine tightly coupled to a graphical user interface. Data integrity is provided by the password-protected BiolAD-DB SQL compliant server and database. BiolAD-DB tools further provide functionalities for generating customized reports and views. The BiolAD-DB system schema, client, and installation instructions are freely available at http://www.rockefeller.edu/biolad-db/.
NASA Astrophysics Data System (ADS)
Wang, Yi Jiao; Feng, Qing Yi; Chai, Li He
As one of the most important financial markets and one of the main parts of the economic system, the stock market has become a research focus in economics. The stock market is a typical complex open system far from equilibrium. Many available models that have made major contributions to market research are strong in describing the market; however, they ignore strong nonlinear interactions among active agents and are weak in revealing the underlying dynamic mechanisms of the structural evolution of markets. From an econophysical perspective, this paper analyzes the complex interactions among agents and defines the generalized entropy in stock markets. A nonlinear evolutionary dynamic equation for the stock market is then derived from the Maximum Generalized Entropy Principle. Simulations are accordingly conducted for a typical case with the given data, by which the structural evolution of the stock market system is demonstrated. Some discussions and implications are finally provided.
NASA Technical Reports Server (NTRS)
Dill, Evan T.; Young, Steven D.
2015-01-01
In the constant drive to further the safety and efficiency of air travel, the complexity of avionics-related systems, and the procedures for interacting with these systems, appear to be on an ever-increasing trend. While this growing complexity often yields productive results with respect to system capabilities and flight efficiency, it can place a larger burden on pilots to manage increasing amounts of information and to understand intricate system designs. Evidence supporting this observation is becoming widespread, yet has been largely anecdotal or the result of subjective analysis. One way to gain more insight into this issue is through experimentation using more objective measures or indicators. This study utilizes and analyzes eye-tracking data obtained during a high-fidelity flight simulation study wherein many of the complexities of current flight decks, as well as those planned for the next generation air transportation system (NextGen), were emulated. The following paper presents the findings of this study with a focus on electronic flight bag (EFB) usage, system state awareness (SSA) and events involving suspected inattentional blindness (IB).
Birefringence measurement in complex optical systems
NASA Astrophysics Data System (ADS)
Knell, Holger; Heuck, Hans-Martin
2017-06-01
State-of-the-art optical systems are becoming more complex: more lenses are required in the optical design, and optical coatings have more layers. These complex designs are prone to induce more thermal stress into the optical system, which causes birefringence. In addition, certain degrees of freedom are required to meet optical specifications during the assembly process. The mechanical fixation of these degrees of freedom can also lead to mechanical stress in the optical system and therefore to birefringence. To be able to distinguish these two types of stress, a method to image the birefringence in the optical system is required. In the proposed setup, light is polarized by a circular polarization filter and then transmitted through a rotatable linear retarder and the tested optical system. The light is then reflected along the same path by a mirror. After the light passes the circular polarization filter on the way back, the intensity is recorded. When the rotatable retarder is rotated, the recorded intensity is modulated depending on the birefringence of the tested optical system. This modulation can be analyzed in the Fourier domain, and the linear retardance between the slow and the fast axis as well as the angle of the fast axis can be calculated. The retardance distribution over the pupil of the optical system can then be analyzed using Zernike decomposition. From the Zernike decomposition, the origin of the birefringence can be identified. Since it is required to quantify small amounts of retardance well below 10 nm, the birefringence of the measurement system must be characterized before the measurement and considered in the calculation of the resulting birefringence. Temperature changes of the measurement system can still produce measurement artifacts in the calculated result, which must also be compensated for.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Chan-Joong; Kim, Jimin; Hong, Taehoon
Climate change has become one of the most significant environmental issues, and about 40% of emissions come from the building sector. In particular, complex building projects with various functions have increased, which should be managed from a program-level perspective. Therefore, this study aimed to develop a program-level management system for the life-cycle environmental and economic assessment of complex building projects. The developed system consists of three parts: (i) input part: database server and input data; (ii) analysis part: life cycle assessment and life cycle cost; and (iii) result part: microscopic analysis and macroscopic analysis. To analyze the applicability of the developed system, this study selected 'U' University, a complex building project consisting of a research facility and a residential facility. Through value engineering with experts, a total of 137 design alternatives were established. Based on these alternatives, the macroscopic analysis results were as follows: (i) at the program level, the life-cycle environmental and economic costs of 'U' University were reduced by 6.22% and 2.11%, respectively; (ii) at the project level, the life-cycle environmental and economic costs of the research facility were reduced by 6.01% and 1.87%, respectively, and those of the residential facility by 12.01% and 3.83%, respectively; and (iii) for the mechanical work at the work-type level, the initial cost was increased by 2.9%, but the cost in the operation and maintenance phase was reduced by 20.0%. As a result, the developed system can allow facility managers to establish operation and maintenance strategies for the environmental and economic aspects from a program-level perspective. - Highlights: • A program-level management system for complex building projects was developed. • Life-cycle environmental and economic assessment can be conducted using the system. • The design alternatives can be analyzed from the microscopic perspective. • The system can be used to establish the optimal O&M strategy at the program-level. • It can be applied to any other country or sector in the global environment.
XAP, a program for deconvolution and analysis of complex X-ray spectra
Quick, James E.; Haleby, Abdul Malik
1989-01-01
The X-ray analysis program (XAP) is a spectral-deconvolution program written in BASIC and specifically designed to analyze complex spectra produced by energy-dispersive X-ray analytical systems (EDS). XAP compensates for spectrometer drift, utilizes digital filtering to remove background from spectra, and solves for element abundances by least-squares, multiple-regression analysis. Rather than basing analyses on only a few channels, broad spectral regions of a sample are reconstructed from standard reference spectra. The effects of this approach are (1) elimination of tedious spectrometer adjustments, (2) removal of background independent of sample composition, and (3) automatic correction for peak overlaps. Although the program was written specifically to operate a KEVEX 7000 X-ray fluorescence analytical system, it could be adapted (with minor modifications) to analyze spectra produced by scanning electron microscopes and electron microprobes, and X-ray diffractometer patterns obtained from whole-rock powders.
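A minimal sketch (Python/NumPy) of the core least-squares step: a background-subtracted sample spectrum is fit as a linear combination of standard reference spectra; the placeholder arrays stand in for real reference and sample spectra, and drift compensation and digital background filtering are omitted.

    import numpy as np

    # Columns of A are background-subtracted reference spectra (one per element);
    # y is the background-subtracted sample spectrum over the same channels.
    rng = np.random.default_rng(0)
    A = rng.random((1024, 3))          # placeholder reference spectra
    true_abund = np.array([2.0, 0.5, 1.0])
    y = A @ true_abund + 0.01 * rng.standard_normal(1024)

    abundances, *_ = np.linalg.lstsq(A, y, rcond=None)
    print(abundances)                  # least-squares element abundances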
Multi-Agent Strategic Modeling in a Specific Environment
NASA Astrophysics Data System (ADS)
Gams, Matjaz; Bezek, Andraz
Multi-agent modeling in ambient intelligence (AmI) is concerned with the following task [19]: How can external observations of multi-agent systems in the ambient be used to analyze, model, and direct agent behavior? The main purpose is to obtain knowledge about acts in the environment thus enabling proper actions of the AmI systems [1]. Analysis of such systems must thus capture complex world state representation and asynchronous agent activities. Instead of studying basic numerical data, researchers often use more complex data structures, such as rules and decision trees. Some methods are extremely useful when characterizing state space, but lack the ability to clearly represent temporal state changes occurred by agent actions. To comprehend simultaneous agent actions and complex changes of state space, most often a combination of graphical and symbolical representation performs better in terms of human understanding and performance.
Primordial Evolution in the Finitary Process Soup
NASA Astrophysics Data System (ADS)
Görnerup, Olof; Crutchfield, James P.
A general and basic model of primordial evolution—a soup of reacting finitary and discrete processes—is employed to identify and analyze fundamental mechanisms that generate and maintain complex structures in prebiotic systems. The processes—ɛ-machines as defined in computational mechanics—and their interaction networks both provide well defined notions of structure. This enables us to quantitatively demonstrate hierarchical self-organization in the soup in terms of complexity. We found that replicating processes evolve the strategy of successively building higher levels of organization by autocatalysis. Moreover, this is facilitated by local components that have low structural complexity, but high generality. In effect, the finitary process soup spontaneously evolves a selection pressure that favors such components. In light of the finitary process soup's generality, these results suggest a fundamental law of hierarchical systems: global complexity requires local simplicity.
Performance analysis of a generalized upset detection procedure
NASA Technical Reports Server (NTRS)
Blough, Douglas M.; Masson, Gerald M.
1987-01-01
A general procedure for upset detection in complex systems, called the data block capture and analysis upset monitoring process is described and analyzed. The process consists of repeatedly recording a fixed amount of data from a set of predetermined observation lines of the system being monitored (i.e., capturing a block of data), and then analyzing the captured block in an attempt to determine whether the system is functioning correctly. The algorithm which analyzes the data blocks can be characterized in terms of the amount of time it requires to examine a given length data block to ascertain the existence of features/conditions that have been predetermined to characterize the upset-free behavior of the system. The performance of linear, quadratic, and logarithmic data analysis algorithms is rigorously characterized in terms of three performance measures: (1) the probability of correctly detecting an upset; (2) the expected number of false alarms; and (3) the expected latency in detecting upsets.
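For orientation, under the simplifying assumption (not taken from the paper) that successive data blocks are analyzed independently, with per-block probability p_d of detecting an existing upset and per-block false-alarm probability p_f, the three measures take elementary forms:

    P(\text{detect within } k \text{ blocks}) = 1 - (1 - p_d)^k, \qquad
    E[\text{latency}] = 1/p_d \ \text{blocks}, \qquad
    E[\text{false alarms over } N \text{ upset-free blocks}] = N p_f .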
Characteristics of a Four-Nozzle, Slotted Short Mixing Stack with Shroud, Gas Eductor System.
1982-03-01
…system becomes extremely complex. The other method, which was chosen here, analyzes the overall performance of the eductor system and is not…
Performance analysis of Integrated Communication and Control System networks
NASA Technical Reports Server (NTRS)
Halevi, Y.; Ray, A.
1990-01-01
This paper presents statistical analysis of delays in Integrated Communication and Control System (ICCS) networks that are based on asynchronous time-division multiplexing. The models are obtained in closed form for analyzing control systems with randomly varying delays. The results of this research are applicable to ICCS design for complex dynamical processes like advanced aircraft and spacecraft, autonomous manufacturing plants, and chemical and processing plants.
Approaching human language with complex networks.
Cong, Jin; Liu, Haitao
2014-12-01
The interest in modeling and analyzing human language with complex networks is on the rise in recent years and a considerable body of research in this area has already been accumulated. We survey three major lines of linguistic research from the complex network approach: 1) characterization of human language as a multi-level system with complex network analysis; 2) linguistic typological research with the application of linguistic networks and their quantitative measures; and 3) relationships between the system-level complexity of human language (determined by the topology of linguistic networks) and microscopic linguistic (e.g., syntactic) features (as the traditional concern of linguistics). We show that the models and quantitative tools of complex networks, when exploited properly, can constitute an operational methodology for linguistic inquiry, which contributes to the understanding of human language and the development of linguistics. We conclude our review with suggestions for future linguistic research from the complex network approach: 1) relationships between the system-level complexity of human language and microscopic linguistic features; 2) expansion of research scope from the global properties to other levels of granularity of linguistic networks; and 3) combination of linguistic network analysis with other quantitative studies of language (such as quantitative linguistics). Copyright © 2014 Elsevier B.V. All rights reserved.
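As a small illustration of the first line of research, a sketch in Python that builds a word co-occurrence network from a toy sentence and computes a few standard network measures with networkx; the sentence and the adjacency (bigram) linking rule are assumptions for illustration.

    import networkx as nx

    text = "the cat sat on the mat and the dog sat on the cat"
    words = text.split()

    G = nx.Graph()
    G.add_edges_from(zip(words, words[1:]))   # adjacent words become linked nodes

    print("nodes:", G.number_of_nodes(), "edges:", G.number_of_edges())
    print("average clustering:", nx.average_clustering(G))
    print("average shortest path:", nx.average_shortest_path_length(G))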
Faults Discovery By Using Mined Data
NASA Technical Reports Server (NTRS)
Lee, Charles
2005-01-01
Fault discovery in complex systems consists of model-based reasoning, fault tree analysis, rule-based inference methods, and other approaches. Model-based reasoning builds models for the systems either from mathematical formulations or from experimental models. Fault tree analysis shows the possible causes of a system malfunction by enumerating the suspect components and their respective failure modes that may have induced the problem. Rule-based inference builds the model based on expert knowledge. Those models and methods have one thing in common: they presume some prior conditions. Complex systems often use fault trees to analyze the faults. Fault diagnosis, when an error occurs, is performed by engineers and analysts through extensive examination of all data gathered during the mission. The International Space Station (ISS) control center operates on the data fed back from the system, and decisions are made based on threshold values by using fault trees. Since those decision-making tasks are safety critical and must be done promptly, the engineers who manually analyze the data face a time challenge. To automate this process, this paper presents an approach that uses decision trees to discover faults from data in real time and captures the contents of fault trees as the initial state of the trees.
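A minimal sketch (Python, scikit-learn) of discovering fault conditions from data with a decision tree; the telemetry features, thresholds, and labels are invented for illustration and do not reflect ISS data or the paper's actual trees.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier, export_text

    # Hypothetical telemetry: columns = [temperature, pressure], label 1 = fault.
    X = np.array([[20, 100], [21, 98], [45, 95], [47, 60], [22, 55], [46, 99]])
    y = np.array([0, 0, 1, 1, 1, 1])

    tree = DecisionTreeClassifier(max_depth=2).fit(X, y)
    print(export_text(tree, feature_names=["temperature", "pressure"]))
    print(tree.predict([[23, 97]]))    # classify a new data block in real time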
NASA Astrophysics Data System (ADS)
Sheikh, Alireza; Amat, Alexandre Graell i.; Liva, Gianluigi
2017-12-01
We analyze the achievable information rates (AIRs) for coded modulation schemes with QAM constellations with both bit-wise and symbol-wise decoders, corresponding to the case where a binary code is used in combination with a higher-order modulation using the bit-interleaved coded modulation (BICM) paradigm and to the case where a nonbinary code over a field matched to the constellation size is used, respectively. In particular, we consider hard decision decoding, which is the preferable option for fiber-optic communication systems where decoding complexity is a concern. Recently, Liga et al. analyzed the AIRs for bit-wise and symbol-wise decoders considering what the authors called a hard decision decoder which, however, exploits soft information from the transition probabilities of the discrete-input discrete-output channel resulting from the hard detection. As such, the complexity of the decoder is essentially the same as the complexity of a soft decision decoder. In this paper, we analyze instead the AIRs for the standard hard decision decoder, commonly used in practice, where the decoding is based on the Hamming distance metric. We show that if standard hard decision decoding is used, bit-wise decoders yield significantly higher AIRs than symbol-wise decoders. As a result, contrary to the conclusion by Liga et al., binary decoders together with the BICM paradigm are preferable for spectrally-efficient fiber-optic systems. We also design binary and nonbinary staircase codes and show that, in agreement with the AIRs, binary codes yield better performance.
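For orientation, a standard relation under simplifying assumptions (not a result quoted from the paper): if each of the m bit levels of a 2^m-ary QAM constellation is modeled after hard detection as a binary symmetric channel with crossover probability p_i, the bit-wise (BICM) rate with Hamming-metric decoding is

    \mathrm{AIR}_{\mathrm{bw}} = \sum_{i=1}^{m}\bigl(1 - H_b(p_i)\bigr), \qquad
    H_b(p) = -p\log_2 p - (1-p)\log_2(1-p),

while treating the hard-detected 2^m-ary channel as a symmetric channel with symbol error probability P_s gives the symbol-wise rate

    \mathrm{AIR}_{\mathrm{sw}} = m - H_b(P_s) - P_s \log_2\bigl(2^m - 1\bigr).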
Preliminary Characterization of Erythrocytes Deformability on the Entropy-Complexity Plane
Korol, Ana M; D’Arrigo, Mabel; Foresto, Patricia; Pérez, Susana; Martín, Maria T; Rosso, Osualdo A
2010-01-01
We present an application of wavelet-based Information Theory quantifiers (Normalized Total Shannon Entropy, MPR Statistical Complexity, and the Entropy-Complexity plane) to the characterization of red blood cell membrane viscoelasticity. These quantifiers exhibit important localization advantages provided by Wavelet Theory. The present approach produces a clear characterization of this dynamical system, revealing an evident manifestation of a random process in the red cell samples of healthy individuals and a sharp reduction of randomness when analyzing a human haematological disease, such as β-thalassaemia minor. PMID:21611139
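For reference, these quantifiers are commonly defined as follows (a standard formulation, not quoted from the paper): for a probability distribution P = {p_j}, j = 1..N, obtained from the wavelet decomposition of the signal,

    H[P] = S[P]/S_{\max}, \qquad S[P] = -\sum_j p_j \ln p_j, \qquad S_{\max} = \ln N,
    C_{\mathrm{MPR}}[P] = Q_J[P, P_e]\, H[P],

where P_e is the uniform distribution and Q_J[P, P_e] is the Jensen-Shannon divergence between P and P_e normalized to [0, 1]; the entropy-complexity plane then plots C_MPR against H.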
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yakimov, M.A.; Nosova, N.F.; Degtyarev, A.Ya.
1963-01-01
Solubility in the ternary systems TlNO3-UO2(NO3)2-H2O and CsNO3-UO2(NO3)2-H2O at 0 to 25°C was studied by the isothermal method. The first system did not form solid-phase compounds; the second system formed two compounds, Cs2UO2(NO3)4 and CsUO2(NO3)3, at 25°C. Measurements of water vapor pressure over the systems at 25°C showed that the water activity in the ternary systems at certain concentrations does not exceed the water activity in the binary uranyl nitrate-water system (at identical uranyl nitrate concentrations), confirming the observed complex formation in the solution. The mechanism of complex formation was analyzed and extended to alkali metal-metal salt-complexing agent-water systems. (R.V.J.)
Analyzing system safety in lithium-ion grid energy storage
NASA Astrophysics Data System (ADS)
Rosewater, David; Williams, Adam
2015-12-01
As grid energy storage systems become more complex, it grows more difficult to design them for safe operation. This paper first reviews the properties of lithium-ion batteries that can produce hazards in grid scale systems. Then the conventional safety engineering technique Probabilistic Risk Assessment (PRA) is reviewed to identify its limitations in complex systems. To address this gap, new research is presented on the application of Systems-Theoretic Process Analysis (STPA) to a lithium-ion battery based grid energy storage system. STPA is anticipated to fill the gaps recognized in PRA for designing complex systems and hence be more effective or less costly to use during safety engineering. It was observed that STPA is able to capture causal scenarios for accidents not identified using PRA. Additionally, STPA enabled a more rational assessment of uncertainty (all that is not known) thereby promoting a healthy skepticism of design assumptions. We conclude that STPA may indeed be more cost effective than PRA for safety engineering in lithium-ion battery systems. However, further research is needed to determine if this approach actually reduces safety engineering costs in development, or improves industry safety standards.
Maron, Bradley A; Leopold, Jane A
2016-09-30
Reductionist theory proposes that analyzing complex systems according to their most fundamental components is required for problem resolution, and has served as the cornerstone of scientific methodology for more than four centuries. However, technological gains in the current scientific era now allow for the generation of large datasets that profile the proteomic, genomic, and metabolomic signatures of biological systems across a range of conditions. The accessibility of data on such a vast scale has, in turn, highlighted the limitations of reductionism, which is not conducive to analyses that consider multiple and contemporaneous interactions between intermediates within a pathway or across constructs. Systems biology has emerged as an alternative approach to analyze complex biological systems. This methodology is based on the generation of scale-free networks and, thus, provides a quantitative assessment of relationships between multiple intermediates, such as protein-protein interactions, within and between pathways of interest. In this way, systems biology is well positioned to identify novel targets implicated in the pathogenesis or treatment of diseases. In this review, the historical root and fundamental basis of systems biology, as well as the potential applications of this methodology are discussed with particular emphasis on integration of these concepts to further understanding of cardiovascular disorders such as coronary artery disease and pulmonary hypertension.
Analysis of Selected Enhancements to the En Route Central Computing Complex
DOT National Transportation Integrated Search
1981-09-01
This report analyzes selected hardware enhancements that could improve the performance of the 9020 computer systems, which are used to provide en route air traffic control services. These enhancements could be implemented quickly, would be relatively...
NASA Astrophysics Data System (ADS)
Liu, Y.; Gupta, H.; Wagener, T.; Stewart, S.; Mahmoud, M.; Hartmann, H.; Springer, E.
2007-12-01
Some of the most challenging issues facing contemporary water resources management are those typified by complex coupled human-environmental systems with poorly characterized uncertainties. In other words, major decisions regarding water resources have to be made in the face of substantial uncertainty and complexity. It has been suggested that integrated models can be used to coherently assemble information from a broad set of domains, and can therefore serve as an effective means for tackling the complexity of environmental systems. Further, well-conceived scenarios can effectively inform decision making, particularly when high complexity and poorly characterized uncertainties make the problem intractable via traditional uncertainty analysis methods. This presentation discusses the integrated modeling framework adopted by SAHRA, an NSF Science & Technology Center, to investigate stakeholder-driven water sustainability issues within the semi-arid southwestern US. The multi-disciplinary, multi-resolution modeling framework incorporates a formal scenario approach to analyze the impacts of plausible (albeit uncertain) alternative futures to support adaptive management of water resources systems. Some of the major challenges involved in, and lessons learned from, this effort will be discussed.
Distributed Cooperation Solution Method of Complex System Based on MAS
NASA Astrophysics Data System (ADS)
Weijin, Jiang; Yuhui, Xu
To adapt reconfigurable fault-diagnosis models to dynamic environments and to fully meet the needs of solving the tasks of complex systems, this paper introduces multi-agent and related technologies into complicated fault diagnosis and studies an integrated intelligent control system. Based on a hierarchical structure of diagnostic decisions in modeling and a multi-layer decomposition strategy for the diagnosis task, a multi-agent synchronous diagnosis federation integrating different knowledge representation modes and inference mechanisms is presented. The functions of the management agent, diagnosis agent, and decision agent are analyzed, the organization and evolution of agents in the system are proposed, and the corresponding conflict resolution algorithm is given. A layered structure of abstract agents with public attributes is built, and the system architecture is realized based on a MAS distributed layered blackboard. A real-world application shows that the proposed control structure successfully solves the fault diagnosis problem of a complex plant and has particular advantages in the distributed domain.
A survey of noninteractive zero knowledge proof system and its applications.
Wu, Huixin; Wang, Feng
2014-01-01
The zero knowledge proof system, which has received extensive attention since it was proposed, is an important branch of cryptography and computational complexity theory. Among its variants, the noninteractive zero knowledge proof system contains only one message, sent by the prover to the verifier. It is widely used in the construction of various types of cryptographic protocols and cryptographic algorithms because of its good privacy, authentication, and lower interactive complexity. This paper reviews and analyzes the basic principles of the noninteractive zero knowledge proof system and summarizes the research progress achieved on the following aspects: the definition and related models of the noninteractive zero knowledge proof system, noninteractive zero knowledge proof systems for NP problems, noninteractive statistical and perfect zero knowledge, the connections between noninteractive zero knowledge proof systems, interactive zero knowledge proof systems, and zaps, and specific applications of noninteractive zero knowledge proof systems. This paper also points out future research directions.
Fontanesi, John; Martinez, Anthony; Boyo, Toritsesan O; Gish, Robert
2015-01-01
Although demands for greater access to hepatology services that are less costly and achieve better outcomes have led to numerous quality improvement initiatives, traditional quality management methods may be inappropriate for hepatology. We empirically tested a model for conducting quality improvement in an academic hepatology program using methods developed to analyze and improve complex adaptive systems. We achieved a 25% increase in volume using 15% more clinical sessions with no change in staff or faculty FTEs, generating a positive margin of 50%. Wait times for next available appointments were reduced from five months to two weeks; unscheduled appointment slots dropped from 7% to less than 1%; "no-show" rates dropped to less than 10%; Press-Ganey scores increased to the 100th percentile. We conclude that framing hepatology as a complex adaptive system may improve our understanding of the complex, interdependent actions required to improve quality of care, patient satisfaction, and cost-effectiveness.
Bilodeau, Angèle; Beauchemin, Jean; Bourque, Denis; Galarneau, Marilène
2013-02-11
Based on a theory of intervention as a complex action system, this study analyzed collaboration among partners in Montréal's sexually transmitted and blood-borne infections (STBBI) prevention program to identify the main operational problems and possible scenarios for change to achieve better outcomes. A descriptive study was conducted using three data sources - public policies and programs, system management documents, and interviews with three types of partners. The results were validated with stakeholders. Five main operational problems affecting the capacity of the system to provide expected services were identified, as well as the strategies the partners use to address them. Two scenarios for system change to increase its effectiveness in achieving program goals are discussed.
Massive Multi-Agent Systems Control
NASA Technical Reports Server (NTRS)
Campagne, Jean-Charles; Gardon, Alain; Collomb, Etienne; Nishida, Toyoaki
2004-01-01
In order to build massive multi-agent systems, considered as complex and dynamic systems, one needs a method to analyze and control the system. We suggest an approach using morphology to represent and control the state of large organizations composed of a great number of light software agents. Morphology is understood as representing the state of the multi-agent system as shapes in an abstract geometrical space, this notion is close to the notion of phase space in physics.
Cheng, Guanhui; Huang, Guohe; Dong, Cong; Xu, Ye; Chen, Xiujuan; Chen, Jiapei
2017-03-01
Due to the existence of complexities of heterogeneities, hierarchy, discreteness, and interactions in municipal solid waste management (MSWM) systems such as Beijing, China, a series of socio-economic and eco-environmental problems may emerge or worsen and result in irredeemable damages in the following decades. Meanwhile, existing studies, especially ones focusing on MSWM in Beijing, could hardly reflect these complexities in system simulations and provide reliable decision support for management practices. Thus, a framework of distributed mixed-integer fuzzy hierarchical programming (DMIFHP) is developed in this study for MSWM under these complexities. Beijing is selected as a representative case. The Beijing MSWM system is comprehensively analyzed in many aspects such as socio-economic conditions, natural conditions, spatial heterogeneities, treatment facilities, and system complexities, building a solid foundation for system simulation and optimization. Correspondingly, the MSWM system in Beijing is discretized as 235 grids to reflect spatial heterogeneity. A DMIFHP model which is a nonlinear programming problem is constructed to parameterize the Beijing MSWM system. To enable scientific solving of it, a solution algorithm is proposed based on coupling of fuzzy programming and mixed-integer linear programming. Innovations and advantages of the DMIFHP framework are discussed. The optimal MSWM schemes and mechanism revelations will be discussed in another companion paper due to length limitation.
The principle of superposition and its application in ground-water hydraulics
Reilly, T.E.; Franke, O.L.; Bennett, G.D.
1984-01-01
The principle of superposition, a powerful mathematical technique for analyzing certain types of complex problems in many areas of science and technology, has important applications in ground-water hydraulics and modeling of ground-water systems. The principle of superposition states that solutions to individual problems can be added together to obtain solutions to complex problems. This principle applies to linear systems governed by linear differential equations. This report introduces the principle of superposition as it applies to ground-water hydrology and provides background information, discussion, illustrative problems with solutions, and problems to be solved by the reader. (USGS)
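Stated compactly: if the governing operator L is linear and L(h_1) = f_1, L(h_2) = f_2, then

    L(a_1 h_1 + a_2 h_2) = a_1 f_1 + a_2 f_2 ,

so, for example, the drawdown at a point produced by several wells pumping simultaneously in a confined aquifer is the sum of the drawdowns each well would produce alone, s(x, t) = \sum_i s_i(x, t) (a textbook illustration, not an excerpt from the report).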
Complex groundwater flow systems as traveling agent models
Padilla, Pablo; Escolero, Oscar; González, Tomas; Morales-Casique, Eric; Osorio-Olvera, Luis
2014-01-01
Analyzing field data from pumping tests, we show that, as with many other natural phenomena, groundwater flow exhibits complex dynamics described by a 1/f power spectrum. This result is theoretically studied within an agent perspective. Using a traveling agent model, we prove that this statistical behavior emerges when the medium is complex. Some heuristic reasoning is provided to justify both spatial and dynamic complexity, as the result of the superposition of an infinite number of stochastic processes. Moreover, we show that this implies that non-Kolmogorovian probability is needed for its study, and we provide a set of new partial differential equations for groundwater flow. PMID:25337455
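A minimal sketch (Python/NumPy) of checking a time series for 1/f-type behavior by estimating its power spectrum and fitting the log-log slope; the synthetic series below is only a placeholder for real pumping-test data.

    import numpy as np

    rng = np.random.default_rng(1)
    x = np.cumsum(rng.standard_normal(4096))       # placeholder signal (not field data)

    freqs = np.fft.rfftfreq(x.size, d=1.0)[1:]     # drop the zero frequency
    psd = np.abs(np.fft.rfft(x - x.mean()))[1:] ** 2

    slope, _ = np.polyfit(np.log10(freqs), np.log10(psd), 1)
    print("spectral exponent beta ~", -slope)      # beta near 1 suggests 1/f behavior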
A Crowdsourcing Framework for Medical Data Sets.
Ye, Cheng; Coco, Joseph; Epishova, Anna; Hajaj, Chen; Bogardus, Henry; Novak, Laurie; Denny, Joshua; Vorobeychik, Yevgeniy; Lasko, Thomas; Malin, Bradley; Fabbri, Daniel
2018-01-01
Crowdsourcing services like Amazon Mechanical Turk allow researchers to ask questions to crowds of workers and quickly receive high quality labeled responses. However, crowds drawn from the general public are not suitable for labeling sensitive and complex data sets, such as medical records, due to various concerns. Major challenges in building and deploying a crowdsourcing system for medical data include, but are not limited to: managing access rights to sensitive data and ensuring data privacy controls are enforced; identifying workers with the necessary expertise to analyze complex information; and efficiently retrieving relevant information in massive data sets. In this paper, we introduce a crowdsourcing framework to support the annotation of medical data sets. We further demonstrate a workflow for crowdsourcing clinical chart reviews including (1) the design and decomposition of research questions; (2) the architecture for storing and displaying sensitive data; and (3) the development of tools to support crowd workers in quickly analyzing information from complex data sets.
Mobility and Position Error Analysis of a Complex Planar Mechanism with Redundant Constraints
NASA Astrophysics Data System (ADS)
Sun, Qipeng; Li, Gangyan
2018-03-01
Nowadays, mechanisms with redundant constraints have been created and have attracted much attention for their merits. The role of redundant constraints in a mechanical system is analyzed in this paper. An analysis method for planar linkages with a repetitive structure is proposed to obtain the number and type of constraints. According to differences in application and constraint characteristics, redundant constraints are divided into theoretical planar redundant constraints and space-planar redundant constraints. A calculation formula for the number of redundant constraints and a method for judging their type are derived. A complex mechanism with redundant constraints is then analyzed to determine the influence of redundant constraints on mechanical performance. Combining theoretical derivation with simulation, an analysis method is put forward for the position error of a complex mechanism with redundant constraints. It points out how to eliminate or reduce the influence of redundant constraints.
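For context, a standard textbook relation (not the paper's specific derivation): the mobility of a planar mechanism with n links, p_l lower pairs, and p_h higher pairs, when ν redundant constraints are present, is

    M = 3(n - 1) - 2 p_l - p_h + \nu ,

so counting ν correctly is what keeps the computed mobility consistent with the mechanism's actual degrees of freedom.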
Graph Theory at the Service of Electroencephalograms.
Iakovidou, Nantia D
2017-04-01
The brain is one of the largest and most complex organs in the human body and EEG is a noninvasive electrophysiological monitoring method that is used to record the electrical activity of the brain. Lately, the functional connectivity in human brain has been regarded and studied as a complex network using EEG signals. This means that the brain is studied as a connected system where nodes, or units, represent different specialized brain regions and links, or connections, represent communication pathways between the nodes. Graph theory and theory of complex networks provide a variety of measures, methods, and tools that can be useful to efficiently model, analyze, and study EEG networks. This article is addressed to computer scientists who wish to be acquainted and deal with the study of EEG data and also to neuroscientists who would like to become familiar with graph theoretic approaches and tools to analyze EEG data.
Electromagnetic disturbance of electric drive system signal is extracted based on PLS
NASA Astrophysics Data System (ADS)
Wang, Yun; Wang, Chuanqi; Yang, Weidong; Zhang, Xu; Jiang, Li; Hou, Shuai; Chen, Xichen
2018-05-01
At present, the electromagnetic immunity tests specified by ISO 11452 and GB/T 33014 use narrowband electromagnetic radiation, but the electromagnetic radiation we are exposed to in everyday environments is not only narrowband: it also includes broadband radiation and even more complex electromagnetic environments. For electric vehicles, the electric drive system is a complex source of electromagnetic disturbance, containing not only narrowband signals but also many broadband signals. This paper puts forward a partial least squares (PLS) data processing method to analyze the electromagnetic disturbance of the electric drive system; the data extracted by this method can provide reliable support for future standards.
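A minimal sketch (Python, scikit-learn) of the kind of PLS extraction suggested above; the disturbance matrix X and reference responses Y are random placeholders, not measured drive-system data.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 50))            # placeholder broadband disturbance spectra
    Y = X[:, :3] @ np.array([[1.0], [0.5], [-0.2]]) + 0.1 * rng.standard_normal((200, 1))

    pls = PLSRegression(n_components=3).fit(X, Y)
    scores = pls.transform(X)                     # latent components of the disturbance
    print(scores.shape, pls.score(X, Y))          # R^2 of the latent-variable model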
NASA Technical Reports Server (NTRS)
Wise, Stephen A.; Holt, James M.
2002-01-01
The complexity of International Space Station (ISS) systems modeling often necessitates the concurrence of various dissimilar, parallel analysis techniques to validate modeling. This was the case with a feasibility and performance study of the ISS Node 3 Regenerative Heat Exchanger (RHX). A thermo-hydraulic network model was created and analyzed in SINDA/FLUINT. A less complex, closed form solution of the systems dynamics was created using an Excel Spreadsheet. The purpose of this paper is to provide a brief description of the modeling processes utilized, the results and benefits of each to the ISS Node 3 RHX study.
NASA Technical Reports Server (NTRS)
Wise, Stephen A.; Holt, James M.; Turner, Larry D. (Technical Monitor)
2001-01-01
The complexity of International Space Station (ISS) systems modeling often necessitates the concurrence of various dissimilar, parallel analysis techniques to validate modeling. This was the case with a feasibility and performance study of the ISS Node 3 Regenerative Heat Exchanger (RHX). A thermo-hydraulic network model was created and analyzed in SINDA/FLUINT. A less complex, closed form solution of the system dynamics was created using Excel. The purpose of this paper is to provide a brief description of the modeling processes utilized, the results and benefits of each to the ISS Node 3 RHX study.
Xu, Wei
2007-12-01
This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated from these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily with automation operations that were more goal directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.
Sensor fusion for laparoscopic surgery skill acquisition.
Anderson, Fraser; Birch, Daniel W; Boulanger, Pierre; Bischof, Walter F
2012-01-01
Surgical techniques are becoming more complex and require substantial training to master. The development of automated, objective methods to analyze and evaluate surgical skill is necessary to provide trainees with reliable and accurate feedback during their training programs. We present a system to capture, visualize, and analyze the movements of a laparoscopic surgeon for the purposes of skill evaluation. The system records the upper body movement of the surgeon, the position, and orientation of the instruments, and the force and torque applied to the instruments. An empirical study was conducted using the system to record the performances of a number of surgeons with a wide range of skill. The study validated the usefulness of the system, and demonstrated the accuracy of the measurements.
NASA Technical Reports Server (NTRS)
Masiulaniec, K. C.; Keith, T. G., Jr.; Dewitt, K. J.
1984-01-01
A numerical procedure is presented for analyzing a wide variety of heat conduction problems in multilayered bodies having complex geometry. The method is based on a finite difference solution of the heat conduction equation using a body fitted coordinate system transformation. Solution techniques are described for steady and transient problems with and without internal energy generation. Results are found to compare favorably with several well known solutions.
Zedler, Linda; Guthmuller, Julien; Rabelo de Moraes, Inês; Kupfer, Stephan; Krieck, Sven; Schmitt, Michael; Popp, Jürgen; Rau, Sven; Dietzek, Benjamin
2014-05-25
The sequential order of photoinduced charge transfer processes and accompanying structure changes were analyzed by UV-vis and resonance Raman spectroscopy of intermediates of a Ru(II)-based photocatalytic hydrogen evolving system obtained by electrochemical reduction.
Overexpression of MutSα Complex Proteins Predicts Poor Prognosis in Oral Squamous Cell Carcinoma.
Wagner, Vivian Petersen; Webber, Liana Preto; Salvadori, Gabriela; Meurer, Luise; Fonseca, Felipe Paiva; Castilho, Rogério Moraes; Squarize, Cristiane Helena; Vargas, Pablo Agustin; Martins, Manoela Domingues
2016-05-01
The DNA mismatch repair (MMR) system is responsible for the detection and correction of errors created during DNA replication, thereby avoiding the incorporation of mutations in dividing cells. The prognostic value of alterations in the MMR system has not previously been analyzed in oral squamous cell carcinoma (OSCC). The study comprised 115 cases of OSCC diagnosed between 1996 and 2010. The specimens collected were constructed into tissue microarray blocks. Immunohistochemical staining for MutSα complex proteins hMSH2 and hMSH6 was performed. The slides were subsequently scanned into high-resolution images, and nuclear staining of hMSH2 and hMSH6 was analyzed using the Nuclear V9 algorithm. Univariable and multivariable Cox proportional hazard regression models were performed to evaluate the prognostic value of hMSH2 and hMSH6 in OSCC. All cases in the present cohort were positive for hMSH2 and hMSH6, and a direct correlation was found between the expression of the proteins (P < 0.05). The mean number of positive cells for hMSH2 and hMSH6 was 64.44 ± 15.21 and 31.46 ± 22.38, respectively. These values were used as cutoff points to determine high protein expression. Cases with high expression of both proteins simultaneously were classified as having high MutSα complex expression. In the multivariable analysis, high expression of the MutSα complex was an independent prognostic factor for poor overall survival (hazard ratio: 2.75, P = 0.02). This study provides a first insight into the prognostic value of alterations in the MMR system in OSCC. We found that the MutSα complex may constitute a molecular marker for the poor prognosis of OSCC.
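To illustrate how a multivariable survival analysis of this kind is typically set up, here is a minimal, hypothetical sketch in Python using the lifelines package; the column names, values, and covariates are invented for illustration and are not the study's data.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical per-patient table; column names and values are invented
# purely to illustrate the analysis setup, not the study's data.
df = pd.DataFrame({
    "overall_survival_months": [12, 48, 30, 7, 60, 22, 15, 40, 9, 55],
    "death_event":             [1, 0, 1, 1, 0, 1, 1, 0, 1, 0],
    "high_mutsa_expression":   [1, 0, 0, 1, 0, 0, 1, 1, 1, 0],
    "tumor_stage":             [3, 1, 2, 4, 1, 3, 2, 2, 4, 1],
})

# Multivariable Cox proportional hazards model; the hazard ratio for high
# MutSalpha expression is exp(coef) in the fitted summary.
cph = CoxPHFitter()
cph.fit(df, duration_col="overall_survival_months", event_col="death_event")
cph.print_summary()
```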
Exploring stability of entropy analysis for signal with different trends
NASA Astrophysics Data System (ADS)
Zhang, Yin; Li, Jin; Wang, Jun
2017-03-01
Because of environmental disturbances and the instrumentation itself, measured signals usually carry trends, which makes it difficult to capture signal complexity accurately. Choosing stable and effective analysis methods is therefore important. In this paper, we apply two entropy measures, base-scale entropy and approximate entropy, to analyze signal complexity, and we study the effect of trends on an ideal signal and on heart rate variability (HRV) signals, namely linear, periodic, and power-law trends, which are likely to occur in real signals. The results show that approximate entropy is unstable when different trends are embedded in the signals, so it is not suitable for analyzing signals with trends. The base-scale entropy, by contrast, shows good stability and accuracy for signals with different trends, and is therefore an effective method for analyzing real signals.
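As a minimal sketch of one of the measures discussed, the following Python snippet computes approximate entropy for a one-dimensional signal; the embedding dimension m = 2 and tolerance r = 0.2 × SD are common illustrative choices, not parameters reported in the study.

```python
import numpy as np

def approximate_entropy(x, m=2, r_factor=0.2):
    """Approximate entropy ApEn(m, r) of a 1-D signal.

    r is set to r_factor times the signal's standard deviation,
    a common (but here only illustrative) convention.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    r = r_factor * np.std(x)

    def phi(m):
        # All overlapping templates of length m.
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of templates.
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        # Fraction of templates within tolerance r (self-matches included).
        c = np.mean(dist <= r, axis=1)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

# Example: ApEn of a noisy sine, with and without an added linear trend.
t = np.linspace(0, 10, 500)
sig = np.sin(2 * np.pi * t) + 0.1 * np.random.randn(500)
print(approximate_entropy(sig), approximate_entropy(sig + 0.5 * t))
```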
A toolbox for discrete modelling of cell signalling dynamics.
Paterson, Yasmin Z; Shorthouse, David; Pleijzier, Markus W; Piterman, Nir; Bendtsen, Claus; Hall, Benjamin A; Fisher, Jasmin
2018-06-18
In an age where the volume of data regarding biological systems exceeds our ability to analyse it, many researchers are looking towards systems biology and computational modelling to help unravel the complexities of gene and protein regulatory networks. In particular, the use of discrete modelling allows generation of signalling networks in the absence of full quantitative descriptions of systems, which are necessary for ordinary differential equation (ODE) models. In order to make such techniques more accessible to mainstream researchers, tools such as the BioModelAnalyzer (BMA) have been developed to provide a user-friendly graphical interface for discrete modelling of biological systems. Here we use the BMA to build a library of discrete target functions of known canonical molecular interactions, translated from ordinary differential equations (ODEs). We then show that these BMA target functions can be used to reconstruct complex networks, which can correctly predict many known genetic perturbations. This new library supports the accessibility ethos behind the creation of BMA, providing a toolbox for the construction of complex cell signalling models without the need for extensive experience in computer programming or mathematical modelling, and allows for construction and simulation of complex biological systems with only small amounts of quantitative data.
Optimizing structure of complex technical system by heterogeneous vector criterion in interval form
NASA Astrophysics Data System (ADS)
Lysenko, A. V.; Kochegarov, I. I.; Yurkov, N. K.; Grishko, A. K.
2018-05-01
The article examines the methods of development and multi-criteria choice of the preferred structural variant of the complex technical system at the early stages of its life cycle in the absence of sufficient knowledge of parameters and variables for optimizing this structure. The suggested method takes into consideration the various fuzzy input data connected with the heterogeneous quality criteria of the designed system and the parameters set by their variation range. The suggested approach is based on the complex use of methods of interval analysis, fuzzy sets theory, and decision-making theory. As a result, a method for normalizing heterogeneous quality criteria has been developed on the basis of establishing preference relations in the interval form. The method of building preference relations in the interval form on the basis of the vector of heterogeneous quality criteria suggests the use of membership functions instead of the coefficients considering the criteria value. The former show the degree of proximity of the realization of the designed system to the efficient or Pareto-optimal variants. The study analyzes the example of choosing the optimal variant for the complex system using heterogeneous quality criteria.
NASA Technical Reports Server (NTRS)
Sterritt, Roy (Inventor); Hinchey, Michael G. (Inventor); Penn, Joaquin (Inventor)
2011-01-01
Systems, methods and apparatus are provided through which, in some embodiments, an agent-oriented specification modeled with MaCMAS is analyzed, flaws in the agent-oriented specification modeled with MaCMAS are corrected, and an implementation is derived from the corrected agent-oriented specification. Described herein are systems, methods and apparatus that produce fully (mathematically) tractable development of agent-oriented specification(s) modeled with the methodology fragment for analyzing complex multiagent systems (MaCMAS) and policies for autonomic systems from requirements through to code generation. The systems, methods and apparatus described herein are illustrated through an example showing how user-formulated policies can be translated into a formal model which can then be converted to code. The requirements-based programming systems, methods and apparatus described herein may provide faster, higher-quality development and maintenance of autonomic systems based on user formulation of policies.
Visual analysis and exploration of complex corporate shareholder networks
NASA Astrophysics Data System (ADS)
Tekušová, Tatiana; Kohlhammer, Jörn
2008-01-01
The analysis of large corporate shareholder network structures is an important task in corporate governance, in financing, and in financial investment domains. In a modern economy, large structures of cross-corporation, cross-border shareholder relationships exist, forming complex networks. These networks are often difficult to analyze with traditional approaches. An efficient visualization of the networks helps to reveal the interdependent shareholding formations and the controlling patterns. In this paper, we propose an effective visualization tool that supports the financial analyst in understanding complex shareholding networks. We develop an interactive visual analysis system by combining state-of-the-art visualization technologies with economic analysis methods. Our system is capable of revealing patterns in large corporate shareholder networks, allows the visual identification of the ultimate shareholders, and supports the visual analysis of integrated cash flow and control rights. We apply our system to an extensive real-world database of shareholder relationships, showing its usefulness for effective visual analysis.
Interdisciplinary analysis procedures in the modeling and control of large space-based structures
NASA Technical Reports Server (NTRS)
Cooper, Paul A.; Stockwell, Alan E.; Kim, Zeen C.
1987-01-01
The paper describes a computer software system called the Integrated Multidisciplinary Analysis Tool, IMAT, that has been developed at NASA Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven interactive executive program, IMAT links a relational database to commercial structural and controls analysis codes. The paper describes the procedures followed to analyze a complex satellite structure and control system. The codes used to accomplish the analysis are described, and an example is provided of an application of IMAT to the analysis of a reference space station subject to a rectangular pulse loading at its docking port.
Decision support systems and methods for complex networks
Huang, Zhenyu [Richland, WA; Wong, Pak Chung [Richland, WA; Ma, Jian [Richland, WA; Mackey, Patrick S [Richland, WA; Chen, Yousu [Richland, WA; Schneider, Kevin P [Seattle, WA
2012-02-28
Methods and systems for automated decision support in analyzing operation data from a complex network. Embodiments of the present invention utilize these algorithms and techniques not only to characterize the past and present condition of a complex network, but also to predict future conditions to help operators anticipate deteriorating and/or problem situations. In particular, embodiments of the present invention characterize network conditions from operation data using a state estimator. Contingency scenarios can then be generated based on those network conditions. For at least a portion of all of the contingency scenarios, risk indices are determined that describe the potential impact of each of those scenarios. Contingency scenarios with risk indices are presented visually as graphical representations in the context of a visual representation of the complex network. Analysis of the historical risk indices based on the graphical representations can then provide trends that allow for prediction of future network conditions.
Complexity and network dynamics in physiological adaptation: an integrated view.
Baffy, György; Loscalzo, Joseph
2014-05-28
Living organisms constantly interact with their surroundings and sustain internal stability against perturbations. This dynamic process follows three fundamental strategies (restore, explore, and abandon) articulated in historical concepts of physiological adaptation such as homeostasis, allostasis, and the general adaptation syndrome. These strategies correspond to elementary forms of behavior (ordered, chaotic, and static) in complex adaptive systems and invite a network-based analysis of the operational characteristics, allowing us to propose an integrated framework of physiological adaptation from a complex network perspective. Applicability of this concept is illustrated by analyzing molecular and cellular mechanisms of adaptation in response to the pervasive challenge of obesity, a chronic condition resulting from sustained nutrient excess that prompts chaotic exploration for system stability associated with tradeoffs and a risk of adverse outcomes such as diabetes, cardiovascular disease, and cancer. Deconstruction of this complexity holds the promise of gaining novel insights into physiological adaptation in health and disease. Published by Elsevier Inc.
Case Complexity and Quality Attestation for Clinical Ethics Consultants.
Spielman, Bethany; Craig, Jana; Gorka, Christine; Miller, Keith
2015-01-01
A proposal by the American Society for Bioethics and Humanities (ASBH) to identify individuals who are qualified to perform ethics consultations neglects case complexity in candidates' portfolios. To protect patients and healthcare organizations, and to be fair to candidates, a minimum case complexity level must be clearly and publicly articulated. This proof-of-concept study supports the feasibility of assessing case complexity. Using text analytics, we developed a complexity scoring system, and retrospectively analyzed more than 500 ethics summaries of consults performed at an academic medical center during 2013. We demonstrate its use with seven case summaries that range in complexity from uncomplicated to very complicated. We encourage the ASBH to require a minimum level of case complexity, and recommend that attestation portfolios include several cases of moderate complexity and at least one very complex case. Copyright 2015 The Journal of Clinical Ethics. All rights reserved.
Dobosz, Marina; Bocci, Chiara; Bonuglia, Margherita; Grasso, Cinzia; Merigioli, Sara; Russo, Alessandra; De Iuliis, Paolo
2010-01-01
Microsatellites have been used for parentage testing and individual identification in forensic science because they are highly polymorphic and show abundant sequences dispersed throughout most eukaryotic nuclear genomes. At present, genetic testing based on DNA technology is used for most domesticated animals, including horses, to confirm identity, to determine parentage, and to validate registration certificates. But if genetic data of one of the putative parents are missing, verifying a genealogy could be questionable. The aim of this paper is to illustrate a new approach to analyze complex cases of disputed relationship with microsatellites markers. These cases were solved by analyzing the genotypes of the offspring and other horses' genotypes in the pedigrees of the putative dam/sire with probabilistic expert systems (PESs). PES was especially efficient in supplying reliable, error-free Bayesian probabilities in complex cases with missing pedigree data. One of these systems was developed for forensic purposes (FINEX program) and is particularly valuable in human analyses. We applied this program to parentage analysis in horses, and we will illustrate how different cases have been successfully worked out.
Complex-network description of thermal quantum states in the Ising spin chain
NASA Astrophysics Data System (ADS)
Sundar, Bhuvanesh; Valdez, Marc Andrew; Carr, Lincoln D.; Hazzard, Kaden R. A.
2018-05-01
We use network analysis to describe and characterize an archetypal quantum system—an Ising spin chain in a transverse magnetic field. We analyze weighted networks for this quantum system, with link weights given by various measures of spin-spin correlations such as the von Neumann and Rényi mutual information, concurrence, and negativity. We analytically calculate the spin-spin correlations in the system at an arbitrary temperature by mapping the Ising spin chain to fermions, as well as numerically calculate the correlations in the ground state using matrix product state methods, and then analyze the resulting networks using a variety of network measures. We demonstrate that the network measures show some traits of complex networks already in this spin chain, arguably the simplest quantum many-body system. The network measures give insight into the phase diagram not easily captured by more typical quantities, such as the order parameter or correlation length. For example, the network structure varies with transverse field and temperature, and the structure in the quantum critical fan is different from the ordered and disordered phases.
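The construction described here, a weighted graph whose links carry spin-spin correlation measures, can be sketched in a few lines of Python; the matrix below is a randomly generated stand-in for the mutual-information values that the study obtains from exact or matrix product state calculations, and the chosen network measures are merely examples.

```python
import numpy as np
import networkx as nx

# Hypothetical spin-spin correlation (e.g., mutual-information) matrix for a
# short chain; in the study these weights come from exact or MPS calculations.
rng = np.random.default_rng(0)
N = 8
M = np.abs(rng.normal(size=(N, N)))
M = (M + M.T) / 2          # symmetrize
np.fill_diagonal(M, 0.0)   # no self-links

# Build the weighted network with link weights given by the correlations.
G = nx.Graph()
for i in range(N):
    for j in range(i + 1, N):
        G.add_edge(i, j, weight=M[i, j])

# A few standard weighted network measures.
strength = dict(G.degree(weight="weight"))        # node strength
clustering = nx.clustering(G, weight="weight")    # weighted clustering
print(strength, clustering)
```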
NASA Technical Reports Server (NTRS)
Mizell, Carolyn Barrett; Malone, Linda
2007-01-01
The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as human behavior because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics that utilizes continuous simulation. Each has unique strengths and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
NASA Technical Reports Server (NTRS)
Fields, Chris
1989-01-01
Continuous dynamical systems intuitively seem capable of more complex behavior than discrete systems. If analyzed in the framework of the traditional theory of computation, a continuous dynamical system with countably many quasistable states has at least the computational power of a universal Turing machine. Such an analysis assumes, however, the classical notion of measurement. If measurement is viewed nonclassically, a continuous dynamical system cannot, even in principle, exhibit behavior that cannot be simulated by a universal Turing machine.
NASA Technical Reports Server (NTRS)
Fields, Chris
1989-01-01
Continuous dynamical systems intuitively seem capable of more complex behavior than discrete systems. If analyzed in the framework of the traditional theory of computation, a continuous dynamical system with countably many quasistable states has at least the computational power of a universal Turing machine. Such an analysis assumes, however, the classical notion of measurement. If measurement is viewed nonclassically, a continuous dynamical system cannot, even in principle, exhibit behavior that cannot be simulated by a universal Turing machine.
Spectral simplicity of apparent complexity. II. Exact complexities and complexity spectra
NASA Astrophysics Data System (ADS)
Riechers, Paul M.; Crutchfield, James P.
2018-03-01
The meromorphic functional calculus developed in Part I overcomes the nondiagonalizability of linear operators that arises often in the temporal evolution of complex systems and is generic to the metadynamics of predicting their behavior. Using the resulting spectral decomposition, we derive closed-form expressions for correlation functions, finite-length Shannon entropy-rate approximates, asymptotic entropy rate, excess entropy, transient information, transient and asymptotic state uncertainties, and synchronization information of stochastic processes generated by finite-state hidden Markov models. This introduces analytical tractability to investigating information processing in discrete-event stochastic processes, symbolic dynamics, and chaotic dynamical systems. Comparisons reveal mathematical similarities between complexity measures originally thought to capture distinct informational and computational properties. We also introduce a new kind of spectral analysis via coronal spectrograms and the frequency-dependent spectra of past-future mutual information. We analyze a number of examples to illustrate the methods, emphasizing processes with multivariate dependencies beyond pairwise correlation. This includes spectral decomposition calculations for one representative example in full detail.
Ca K-Edge XAS as a Probe of Calcium Centers in Complex Systems
Martin-Diaconescu, Vlad; Gennari, Marcello; Gerey, Bertrand; ...
2014-12-10
Calcium K-edge pre-edges coupled with TD-DFT theoretical calculation of spectra provide a powerful approach for the characterization of complex calcium centers in inorganic and bioinorganic chemistry. Herein, Ca K-edge X-ray absorption spectroscopy (XAS) is developed as a means to characterize the local environment of calcium centers. The spectra for six-, seven-, and eight-coordinate inorganic and molecular calcium complexes were analyzed and determined to be primarily influenced by the coordination environment and site symmetry at the calcium center. The experimental results are closely correlated to time-dependent density functional theory (TD-DFT) calculations of the XAS spectra. The applicability of this methodology to complex systems was investigated using structural mimics of the oxygen-evolving complex (OEC) of PSII. It was found that Ca K-edge XAS is a sensitive probe for structural changes occurring in the cubane heterometallic cluster due to Mn oxidation. Future applications to the OEC are discussed.
NASA Astrophysics Data System (ADS)
Nagy, Julia; Eilert, Tobias; Michaelis, Jens
2018-03-01
Modern hybrid structural analysis methods have opened new possibilities to analyze and resolve flexible protein complexes where conventional crystallographic methods have reached their limits. Here, the Fast-Nano-Positioning System (Fast-NPS), a Bayesian parameter estimation-based analysis method and software, is an interesting method since it allows for the localization of unknown fluorescent dye molecules attached to macromolecular complexes based on single-molecule Förster resonance energy transfer (smFRET) measurements. However, the precision, accuracy, and reliability of structural models derived from results based on such complex calculation schemes are oftentimes difficult to evaluate. Therefore, we present two proof-of-principle benchmark studies where we use smFRET data to localize supposedly unknown positions on a DNA as well as on a protein-nucleic acid complex. Since we use complexes where structural information is available, we can compare Fast-NPS localization to the existing structural data. In particular, we compare different dye models and discuss how both accuracy and precision can be optimized.
Reliability Standards of Complex Engineering Systems
NASA Astrophysics Data System (ADS)
Galperin, E. M.; Zayko, V. A.; Gorshkalev, P. A.
2017-11-01
Production and manufacturing play an important role in modern society. Industrial production is nowadays characterized by increased and complex communications between its parts. The problem of preventing accidents in a large industrial enterprise becomes especially relevant. In these circumstances, the reliability of enterprise functioning is of particular importance. Potential damage caused by an accident at such an enterprise may lead to substantial material losses and, in some cases, can even cause a loss of human lives. That is why the reliable functioning of an industrial enterprise is immensely important. In terms of their reliability, industrial facilities (objects) are divided into simple and complex. Simple objects are characterized by only two conditions: operable and non-operable. A complex object exists in more than two conditions. The main characteristic here is the stability of its operation. This paper develops a reliability indicator combining the set-theory methodology and a state-space method. Both are widely used to analyze dynamically developing probability processes. The research also introduces a set of reliability indicators for complex technical systems.
Risk Management using Dependency Structure Matrix
NASA Astrophysics Data System (ADS)
Petković, Ivan
2011-09-01
An efficient method based on dependency structure matrix (DSM) analysis is given for ranking risks in a complex system or process whose entities are mutually dependent. This ranking is determined from the elements of the unique positive eigenvector corresponding to the spectral radius of the matrix that models the engineering system under consideration. For demonstration, the risk problem of NASA's robotic spacecraft is analyzed.
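A minimal numerical sketch of this Perron-eigenvector ranking, assuming a small hypothetical DSM with made-up dependency strengths:

```python
import numpy as np

# Hypothetical dependency structure matrix: entry D[i, j] > 0 means
# element i depends on (or is affected by) element j.
D = np.array([
    [0.0, 0.6, 0.0, 0.2],
    [0.3, 0.0, 0.5, 0.0],
    [0.0, 0.4, 0.0, 0.7],
    [0.1, 0.0, 0.3, 0.0],
])

# The positive (Perron) eigenvector associated with the spectral radius gives
# the relative risk ranking of the mutually dependent elements.
eigvals, eigvecs = np.linalg.eig(D)
k = np.argmax(eigvals.real)                 # index of the spectral radius
v = np.abs(eigvecs[:, k].real)
ranking = np.argsort(v)[::-1]               # most critical element first
print("spectral radius:", eigvals[k].real)
print("ranking:", ranking, "scores:", v / v.sum())
```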
A Multi-Pumping Flow System for In Situ Measurements of Dissolved Manganese in Aquatic Systems
Meyer, David; Prien, Ralf D.; Dellwig, Olaf; Waniek, Joanna J.; Schuffenhauer, Ingo; Donath, Jan; Krüger, Siegfried; Pallentin, Malte; Schulz-Bull, Detlef E.
2016-01-01
A METals In Situ analyzer (METIS) has been used to determine dissolved manganese (II) concentrations in the subhalocline waters of the Gotland Deep (central Baltic Sea). High-resolution in situ measurements of total dissolved Mn were obtained in near real-time by spectrophotometry using 1-(2-pyridylazo)-2-naphthol (PAN). PAN is a complexing agent of dissolved Mn and forms a wine-red complex with a maximum absorbance at a wavelength of 562 nm. Results are presented together with ancillary temperature, salinity, and dissolved O2 data. Lab calibration of the analyzer was performed in a pressure testing tank. A detection limit of 77 nM was obtained. For validation purposes, discrete water samples were taken by using a pump-CTD system. Dissolved Mn in these samples was determined by an independent laboratory based method (inductively coupled plasma–optical emission spectrometry, ICP-OES). Mn measurements from both METIS and ICP-OES analysis were in good agreement. The results showed that the in situ analysis of dissolved Mn is a powerful technique reducing dependencies on heavy and expensive equipment (pump-CTD system, ICP-OES) and is also cost and time effective. PMID:27916898
Zheng, Mengge; Chao, Chen; Yu, Jinglin; Copeland, Les; Wang, Shuo; Wang, Shujun
2018-02-28
The effects of chain length and degree of unsaturation of fatty acids (FAs) on structure and in vitro digestibility of starch-protein-FA complexes were investigated in model systems. Studies with the rapid visco analyzer (RVA) showed that the formation of ternary complex resulted in higher viscosities than those of binary complex during the cooling and holding stages. The results of differential scanning calorimetry (DSC), Raman, and X-ray diffraction (XRD) showed that the structural differences for ternary complexes were much less than those for binary complexes. Starch-protein-FA complexes presented lower in vitro enzymatic digestibility compared with starch-FAs complexes. We conclude that shorter chain and lower unsaturation FAs favor the formation of ternary complexes but decrease the thermal stability of these complexes. FAs had a smaller effect on the ordered structures of ternary complexes than on those of binary complexes and little effect on enzymatic digestibility of both binary and ternary complexes.
Literature Mining and Knowledge Discovery Tools for Virtual Tissues
Virtual Tissues (VTs) are in silico models that simulate the cellular fabric of tissues to analyze complex relationships and predict multicellular behaviors in specific biological systems such as the mature liver (v-Liver™) or developing embryo (v-Embryo™). VT models require inpu...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Medley, S.S.
The application of charge exchange analyzers for the measurement of ion temperature in fusion plasma experiments requires a direct connection between the diagnostic and plasma-discharge vacuum chambers. Differential pumping of the gas load from the diagnostic stripping cell, operated at ≳10^-3 Torr, is required to maintain the analyzer chamber at a pressure of ≲10^-6 Torr. The migration of gases between the diagnostic and plasma vacuum chambers must be minimized. In particular, introduction of the analyzer stripping cell gas into the plasma chamber, which has a base pressure of ≲10^-8 Torr, must be suppressed. The charge exchange diagnostic for the Tokamak Fusion Test Reactor (TFTR) comprises two analyzer systems designed to contain a total of 18 independent mass/energy analyzers and one diagnostic neutral beam rated at 80 keV, 15 A. The associated arrays of multiple, interconnected vacuum systems were analyzed using the Vacuum System Transient Simulator (Vsts) computer program, which models the transient transport of multiple gas species through complex networks of ducts, valves, traps, vacuum pumps, and other related vacuum system components. In addition to providing improved design performance at reduced cost, the analysis yields estimates for the exchange of tritium from the torus to the diagnostic components and of the diagnostic working gases to the torus.
Expert systems for space power supply - Design, analysis, and evaluation
NASA Technical Reports Server (NTRS)
Cooper, Ralph S.; Thomson, M. Kemer; Hoshor, Alan
1987-01-01
The feasibility of applying expert systems to the conceptual design, analysis, and evaluation of space power supplies in particular, and complex systems in general is evaluated. To do this, the space power supply design process and its associated knowledge base were analyzed and characterized in a form suitable for computer emulation of a human expert. The existing expert system tools and the results achieved with them were evaluated to assess their applicability to power system design. Some new concepts for combining program architectures (modular expert systems and algorithms) with information about the domain were applied to create a 'deep' system for handling the complex design problem. NOVICE, a code to solve a simplified version of a scoping study of a wide variety of power supply types for a broad range of missions, has been developed, programmed, and tested as a concrete feasibility demonstration.
Analysis of the faster-than-Nyquist optimal linear multicarrier system
NASA Astrophysics Data System (ADS)
Marquet, Alexandre; Siclet, Cyrille; Roque, Damien
2017-02-01
Faster-than-Nyquist signaling enables better spectral efficiency at the expense of increased computational complexity. Regarding multicarrier communications, previous work mainly relied on the study of non-linear systems exploiting coding and/or equalization techniques, with no particular optimization of the linear part of the system. In this article, we analyze the performance of the optimal linear multicarrier system when used together with non-linear receiving structures (iterative decoding and decision feedback equalization), or in a standalone fashion. We also investigate the limits of the normality assumption of the interference, used for implementing such non-linear systems. The use of this optimal linear system leads to a closed-form expression of the bit-error probability that can be used to predict the performance and help in the design of coded systems. Our work also highlights the great performance/complexity trade-off offered by decision feedback equalization in a faster-than-Nyquist context.
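To make the role of the normality assumption concrete, the following is a generic, illustrative form (not the paper's exact expression) that such a closed-form bit-error probability takes when the residual interference plus noise at the decision point is modeled as zero-mean Gaussian:

```latex
% Binary antipodal decision with useful amplitude \mu at the decision point,
% Gaussian interference of variance \sigma_I^2 and noise of variance \sigma_n^2:
P_b \;\approx\; Q\!\left(\frac{\mu}{\sqrt{\sigma_{\mathrm{I}}^{2}+\sigma_{n}^{2}}}\right),
\qquad
Q(x)=\frac{1}{\sqrt{2\pi}}\int_{x}^{\infty} e^{-t^{2}/2}\,dt .
```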
Information Flows? A Critique of Transfer Entropies
NASA Astrophysics Data System (ADS)
James, Ryan G.; Barnett, Nix; Crutchfield, James P.
2016-06-01
A central task in analyzing complex dynamics is to determine the loci of information storage and the communication topology of information flows within a system. Over the last decade and a half, diagnostics for the latter have come to be dominated by the transfer entropy. Via straightforward examples, we show that it and a derivative quantity, the causation entropy, do not, in fact, quantify the flow of information. At one and the same time they can overestimate flow or underestimate influence. We isolate why this is the case and propose several avenues to alternate measures for information flow. We also address an auxiliary consequence: The proliferation of networks as a now-common theoretical model for large-scale systems, in concert with the use of transferlike entropies, has shoehorned dyadic relationships into our structural interpretation of the organization and behavior of complex systems. This interpretation thus fails to include the effects of polyadic dependencies. The net result is that much of the sophisticated organization of complex systems may go undetected.
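For readers who want to reproduce the kind of diagnostic being critiqued, here is a minimal Python sketch of transfer entropy with history length 1 for discrete time series; the toy example (one series copying the other with a one-step delay) is ours, not the paper's.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, base=2):
    """Transfer entropy T_{X->Y} (history length 1) for discrete series."""
    x, y = np.asarray(x), np.asarray(y)
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_{t+1}, y_t)
    singles_y = Counter(y[:-1].tolist())            # y_t
    n = len(x) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_y1_given_yx = c / pairs_yx[(y0, x0)]
        p_y1_given_y = pairs_yy[(y1, y0)] / singles_y[y0]
        te += p_joint * np.log(p_y1_given_yx / p_y1_given_y) / np.log(base)
    return te

# Toy example: y copies x with a one-step delay, so information "flows" X -> Y.
rng = np.random.default_rng(1)
x = rng.integers(0, 2, 10000)
y = np.roll(x, 1)
print(transfer_entropy(x, y), transfer_entropy(y, x))   # ~1 bit vs ~0 bits
```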
NASA Astrophysics Data System (ADS)
Maldonado, Solvey; Findeisen, Rolf
2010-06-01
The modeling, analysis, and design of treatment therapies for bone disorders based on the paradigm of force-induced bone growth and adaptation is a challenging task. Mathematical models provide, in comparison with clinical, medical, and biological approaches, a structured alternative framework for understanding the concurrent effects of the multiple factors involved in bone remodeling. To date, there are few mathematical models describing the complex interactions that arise. Moreover, the resulting models are complex and difficult to analyze, due to the strong nonlinearities appearing in the equations, the wide range of variability of the states, and the uncertainties in parameters. In this work, we focus on analyzing the effects of changes in model structure and parameter/input variations on the overall steady-state behavior using systems-theoretical methods. Based on a briefly reviewed existing model that describes force-induced bone adaptation, the main objective of this work is to analyze the stationary behavior and to identify plausible treatment targets for remodeling-related bone disorders. Identifying plausible targets can help in the development of optimal treatments combining both physical activity and drug medication. Such treatments help to improve, maintain, or restore bone strength, which deteriorates under bone disorder conditions such as estrogen deficiency.
Novel physical constraints on implementation of computational processes
NASA Astrophysics Data System (ADS)
Wolpert, David; Kolchinsky, Artemy
Non-equilibrium statistical physics permits us to analyze computational processes, i.e., ways to drive a physical system such that its coarse-grained dynamics implements some desired map. It is now known how to implement any such desired computation without dissipating work, and what the minimal (dissipationless) work is that such a computation will require (the so-called generalized Landauer bound). We consider how these analyses change if we impose realistic constraints on the computational process. First, we analyze how many degrees of freedom of the system must be controlled, in addition to the ones specifying the information-bearing degrees of freedom, in order to avoid dissipating work during a given computation, when local detailed balance holds. We analyze this issue for deterministic computations, deriving a state-space vs. speed trade-off, and use our results to motivate a measure of the complexity of a computation. Second, we consider computations that are implemented with logic circuits, in which only a small number of degrees of freedom are coupled at a time. We show that the way a computation is implemented using circuits affects its minimal work requirements, and relate these minimal work requirements to information-theoretic measures of complexity.
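As a reminder of the bound being referred to, a commonly quoted form of the generalized Landauer bound is sketched below for the special case of degenerate energy levels over the information-bearing states; the notation here is ours, not the authors'.

```latex
% Minimal (dissipationless) work to implement a map taking the input
% distribution p_in over the information-bearing states to the output
% distribution p_out, assuming degenerate energy levels:
\langle W \rangle_{\min} \;=\; k_B T \,\bigl[\, S(p_{\mathrm{in}}) - S(p_{\mathrm{out}}) \,\bigr],
\qquad S(p) = -\sum_i p_i \ln p_i .
% Erasing one unbiased bit gives S(p_in) = ln 2, S(p_out) = 0,
% recovering the familiar k_B T ln 2.
```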
Qualitative Fault Isolation of Hybrid Systems: A Structural Model Decomposition-Based Approach
NASA Technical Reports Server (NTRS)
Bregon, Anibal; Daigle, Matthew; Roychoudhury, Indranil
2016-01-01
Quick and robust fault diagnosis is critical to ensuring safe operation of complex engineering systems. A large number of techniques are available to provide fault diagnosis in systems with continuous dynamics. However, many systems in aerospace and industrial environments are best represented as hybrid systems that consist of discrete behavioral modes, each with its own continuous dynamics. These hybrid dynamics make the on-line fault diagnosis task computationally more complex due to the large number of possible system modes and the existence of autonomous mode transitions. This paper presents a qualitative fault isolation framework for hybrid systems based on structural model decomposition. The fault isolation is performed by analyzing the qualitative information of the residual deviations. However, in hybrid systems this process becomes complex due to the possible existence of observation delays, which can cause observed deviations to be inconsistent with the expected deviations for the current mode in the system. The great advantage of structural model decomposition is that (i) it allows the design of residuals that respond to only a subset of the faults, and (ii) every time a mode change occurs, only a subset of the residuals will need to be reconfigured, thus reducing the complexity of the reasoning process for isolation purposes. To demonstrate and test the validity of our approach, we use an electric circuit simulation as the case study.
Odille, Fabrice G J; Jónsson, Stefán; Stjernqvist, Susann; Rydén, Tobias; Wärnmark, Kenneth
2007-01-01
A general mathematical model for the characterization of the dynamic (kinetically labile) association of supramolecular assemblies in solution is presented. It is an extension of the equal K (EK) model by the stringent use of linear algebra to allow for the simultaneous presence of an unlimited number of different units in the resulting assemblies. It allows for the analysis of highly complex dynamic equilibrium systems in solution, including both supramolecular homo- and copolymers, without recourse to extensive approximations, in a field in which other analytical methods are difficult. The derived mathematical methodology makes it possible to analyze dynamic systems such as supramolecular copolymers regarding, for instance, the degree of polymerization, the distribution of a given monomer in different copolymers, as well as its position in an aggregate. It is to date the only general means to characterize weak supramolecular systems. The model was fitted to NMR dilution titration data by using the program Matlab, and a detailed algorithm for the optimization of the different parameters has been developed. The methodology is applied to a case study, a hydrogen-bonded supramolecular system, salen 4 + porphyrin 5. The system is formally a two-component system but in reality a three-component system. This results in a complex dynamic system in which all monomers are associated with each other by hydrogen bonding with different association constants, resulting in homo- and copolymers 4n5m as well as cyclic structures 6 and 7, in addition to free 4 and 5. The system was analyzed by extensive NMR dilution titrations at variable temperatures. All chemical shifts observed at different temperatures were used in the fitting to obtain the ΔH° and ΔS° values producing the best global fit. From the derived general mathematical expressions, system 4 + 5 could be characterized with respect to the above-mentioned parameters.
Analyzing milestoning networks for molecular kinetics: definitions, algorithms, and examples.
Viswanath, Shruthi; Kreuzer, Steven M; Cardenas, Alfredo E; Elber, Ron
2013-11-07
Network representations are becoming increasingly popular for analyzing kinetic data from techniques like Milestoning, Markov State Models, and Transition Path Theory. Mapping continuous phase space trajectories into a relatively small number of discrete states helps in visualization of the data and in dissecting complex dynamics to concrete mechanisms. However, not only are molecular networks derived from molecular dynamics simulations growing in number, they are also getting increasingly complex, owing partly to the growth in computer power that allows us to generate longer and better converged trajectories. The increased complexity of the networks makes simple interpretation and qualitative insight of the molecular systems more difficult to achieve. In this paper, we focus on various network representations of kinetic data and algorithms to identify important edges and pathways in these networks. The kinetic data can be local and partial (such as the value of rate coefficients between states) or an exact solution to kinetic equations for the entire system (such as the stationary flux between vertices). In particular, we focus on the Milestoning method that provides fluxes as the main output. We proposed Global Maximum Weight Pathways as a useful tool for analyzing molecular mechanism in Milestoning networks. A closely related definition was made in the context of Transition Path Theory. We consider three algorithms to find Global Maximum Weight Pathways: Recursive Dijkstra's, Edge-Elimination, and Edge-List Bisection. The asymptotic efficiency of the algorithms is analyzed and numerical tests on finite networks show that Edge-List Bisection and Recursive Dijkstra's algorithms are most efficient for sparse and dense networks, respectively. Pathways are illustrated for two examples: helix unfolding and membrane permeation. Finally, we illustrate that networks based on local kinetic information can lead to incorrect interpretation of molecular mechanisms.
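To make the pathway-finding step concrete, here is a small Python sketch of a Dijkstra-like search, under the assumption that the "Global Maximum Weight Pathway" is the path whose smallest edge weight (flux) is as large as possible; the toy network and the choice of a bottleneck criterion are illustrative, not taken from the paper.

```python
import heapq

def widest_path(graph, source, target):
    """Dijkstra-like search for the path maximizing its minimum edge weight.

    graph: dict mapping node -> list of (neighbor, weight) with weight > 0.
    Returns (bottleneck_weight, path).
    """
    best = {source: float("inf")}
    prev = {}
    heap = [(-float("inf"), source)]          # max-heap via negated widths
    while heap:
        neg_width, u = heapq.heappop(heap)
        width = -neg_width
        if u == target:
            break
        if width < best.get(u, 0.0):          # stale heap entry
            continue
        for v, w in graph.get(u, []):
            cand = min(width, w)              # bottleneck along this extension
            if cand > best.get(v, 0.0):
                best[v] = cand
                prev[v] = u
                heapq.heappush(heap, (-cand, v))
    # Reconstruct the path by walking back from the target.
    path, node = [target], target
    while node != source:
        node = prev[node]
        path.append(node)
    return best[target], path[::-1]

# Toy flux network: two routes from "unfolded" (U) to "folded" (F).
g = {"U": [("I1", 5.0), ("I2", 2.0)],
     "I1": [("F", 1.0)],
     "I2": [("F", 3.0)]}
print(widest_path(g, "U", "F"))   # bottleneck 2.0 via U -> I2 -> F
```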
Bertti, Poliana; Tejada, Julian; Martins, Ana Paula Pinheiro; Dal-Cól, Maria Luiza Cleto; Terra, Vera Cristina; de Oliveira, José Antônio Cortes; Velasco, Tonicarlo Rodrigues; Sakamoto, Américo Ceiki; Garcia-Cairasco, Norberto
2014-09-01
Epileptic syndromes and seizures are the expression of complex brain systems. Because no analysis of complexity has been applied to epileptic seizure semiology, our goal was to apply neuroethology and graph analysis to the study of the complexity of behavioral manifestations of epileptic seizures in human frontal lobe epilepsy (FLE) and temporal lobe epilepsy (TLE). We analyzed the video recordings of 120 seizures of 18 patients with FLE and 28 seizures of 28 patients with TLE. All patients were seizure-free >1 year after surgery (Engel Class I). All patients' behavioral sequences were analyzed by means of a glossary containing all behaviors and subjected to neuroethological analysis (Ethomatic software). The same series were used for graph analysis (CYTOSCAPE). Behaviors, displayed as nodes, were connected by edges to other nodes according to their temporal sequence of appearance. Using neuroethology analysis, we confirmed data in the literature such as in FLE: brief/frequent seizures, complex motor behaviors, head and eye version, unilateral/bilateral tonic posturing, speech arrest, vocalization, and rapid postictal recovery and in the case of TLE: presence of epigastric aura, lateralized dystonias, impairment of consciousness/speech during ictal and postictal periods, and development of secondary generalization. The graph analysis metrics for FLE and TLE confirmed the data from the flowcharts. However, because of the algorithms we used, they highlighted more powerfully the connectivity and complex associations among behaviors in a quite selective manner, depending on the origin of the seizures. The algorithms we used are commonly employed to track brain connectivity from EEG and MRI sources, which makes our study very promising for future studies of complexity in this field. Copyright © 2014 Elsevier Inc. All rights reserved.
Anti-aliasing filter design on spaceborne digital receiver
NASA Astrophysics Data System (ADS)
Yu, Danru; Zhao, Chonghui
2009-12-01
In recent years, with the development of satellite observation technologies, more and more active remote sensing technologies have been adopted in spaceborne systems. A spaceborne precipitation radar depends heavily on high-performance digital processing to collect meaningful rain echo data. This increases the complexity of the spaceborne system and requires a high-performance, reliable digital receiver. This paper analyzes the frequency aliasing that arises in intermediate-frequency signal sampling during digital down conversion (DDC) in spaceborne radar and presents an effective digital filter. Through analysis and calculation, we choose suitable parameters for the half-band filters to suppress the frequency aliasing in the DDC. Compared with a traditional filter, the FPGA resource cost in our system is reduced by over 50%. This effectively reduces the complexity of the spaceborne digital receiver and improves the reliability of the system.
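As a rough illustration of why half-band filters are attractive here, the Python sketch below designs a simple half-band-style FIR low-pass with SciPy and shows that every second tap away from the centre is (near) zero, which is what saves multipliers in an FPGA; the sampling rate and tap count are invented for the example, not values from the paper.

```python
import numpy as np
from scipy import signal

fs = 100e6                       # hypothetical IF sampling rate
numtaps = 31                     # odd length keeps a well-defined centre tap

# Cutoff at a quarter of the sampling rate gives the half-band property:
# taps at even offsets from the centre are (essentially) zero.
h = signal.firwin(numtaps, cutoff=fs / 4, fs=fs)

centre = numtaps // 2
print(np.round(h[centre + 2::2], 6))   # every second tap away from the centre ~ 0

# Decimate-by-two stage: filter, then keep every second sample.
x = np.random.randn(4096)
y = signal.lfilter(h, 1.0, x)[::2]
```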
Castañeda, María; Odriozola, Adrián; Gómez, Javier; Zarrabeitia, María T
2013-07-01
We report the development of an effective system for analyzing X chromosome-linked mini short tandem repeat loci with reduced-size amplicons (less than 220 bp), useful for analyzing highly degraded DNA samples. To generate smaller amplicons, we redesigned primers for eight X-linked microsatellites (DXS7132, DXS10079, DXS10074, DXS10075, DXS6801, DXS6809, DXS6789, and DXS6799) and established efficient conditions for a multiplex PCR system (miniX). The validation tests confirmed that it has good sensitivity, requiring as little as 20 pg of DNA, and performs well with DNA from paraffin-embedded tissues, thus showing potential for improved analysis and identification of highly degraded and/or very limited DNA samples. Consequently, this system may help to solve complex forensic cases, particularly when autosomal markers convey insufficient information.
Quantum effects in energy and charge transfer in an artificial photosynthetic complex
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghosh, Pulak Kumar; Smirnov, Anatoly Yu.; Nori, Franco
2011-06-28
We investigate the quantum dynamics of energy and charge transfer in a wheel-shaped artificial photosynthetic antenna-reaction center complex. This complex consists of six light-harvesting chromophores and an electron-acceptor fullerene. To describe quantum effects on a femtosecond time scale, we derive the set of exact non-Markovian equations for the Heisenberg operators of this photosynthetic complex in contact with a Gaussian heat bath. With these equations we can analyze the regime of strong system-bath interactions, where reorganization energies are of the order of the intersite exciton couplings. We show that the energy of the initially excited antenna chromophores is efficiently funneled to the porphyrin-fullerene reaction center, where a charge-separated state is set up in a few picoseconds, with a quantum yield of the order of 95%. In the single-exciton regime, with one antenna chromophore being initially excited, we observe quantum beatings of energy between two resonant antenna chromophores with a decoherence time of ~100 fs. We also analyze the double-exciton regime, when two porphyrin molecules involved in the reaction center are initially excited. In this regime we obtain pronounced quantum oscillations of the charge on the fullerene molecule with a decoherence time of about 20 fs (at liquid nitrogen temperatures). These results show a way to directly detect quantum effects in artificial photosynthetic systems.
A System to Measure Both Inner and Outer Car Tire Temperatures "in situ"
NASA Astrophysics Data System (ADS)
Koštial, P.; Mokryšová, M.; Šišáková, J.; Mošková, Z.; Rusnáková, S.
2009-02-01
In this paper, a system for the complex analysis of the internal and external temperatures and the pressure of sports tires is presented. Tests were performed on the test circuit of a tire producer. The CTPA 05 measuring system (complex temperature-pressure analyzer) enables simultaneous measurements of the internal temperature and pressure in a passenger or sports tire. The CTPA 05 can also be used to independently measure the external temperature of the tire surface on the front driving wheels at three points. Measurements of both the internal tire temperature and pressure, as well as of the external tire temperature, are collected together with GPS (global positioning system) data. The measurement system is fully automatic and contactless. The obtained results are in very good agreement with those obtained by independent methods.
NASA Technical Reports Server (NTRS)
Johnson, Sally C.; Boerschlein, David P.
1995-01-01
Semi-Markov models can be used to analyze the reliability of virtually any fault-tolerant system. However, the process of delineating all the states and transitions in a complex system model can be devastatingly tedious and error prone. The Abstract Semi-Markov Specification Interface to the SURE Tool (ASSIST) computer program allows the user to describe the semi-Markov model in a high-level language. Instead of listing the individual model states, the user specifies the rules governing the behavior of the system, and these are used to generate the model automatically. A few statements in the abstract language can describe a very large, complex model. Because no assumptions are made about the system being modeled, ASSIST can be used to generate models describing the behavior of any system. The ASSIST program and its input language are described and illustrated by examples.
Is the destabilization of the Cournot equilibrium a good business strategy in Cournot-Puu duopoly?
Canovas, Jose S
2011-10-01
It is generally acknowledged that the medium influences the way we communicate, and negotiation research directs considerable attention to the impact of different electronic communication modes on the negotiation process and outcomes. Complexity theories offer models and methods that allow the investigation of how patterns and temporal sequences unfold over time in negotiation interactions. By focusing on the dynamic and interactive quality of negotiations as well as the information, choice, and uncertainty contained in the negotiation process, the complexity perspective addresses several issues of central interest in classical negotiation research. In the present study we compare the complexity of the negotiation communication process among synchronous and asynchronous negotiations (IM vs. e-mail) as well as an electronic negotiation support system including a decision support system (DSS). For this purpose, transcripts of 145 negotiations were coded and analyzed with Shannon entropy and grammar complexity. Our results show that negotiating asynchronously via e-mail, as well as including a DSS, significantly reduces the complexity of the negotiation process. Furthermore, a reduction of the complexity increases the probability of reaching an agreement.
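A minimal sketch of the Shannon-entropy part of such an analysis, assuming the transcript has already been coded into categorical message symbols (the categories below are invented for illustration):

```python
import math
from collections import Counter

def shannon_entropy(sequence):
    """Shannon entropy (bits per symbol) of a coded communication sequence."""
    counts = Counter(sequence)
    n = len(sequence)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Hypothetical coded negotiation transcript: each symbol is a message category.
transcript = ["offer", "question", "offer", "argument", "concession",
              "question", "offer", "agreement"]
print(shannon_entropy(transcript))
```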
Study of multi-level atomic systems with the application of magnetic field
NASA Astrophysics Data System (ADS)
Hu, Jianping; Roy, Subhankar; Ummal Momeen, M.
2018-04-01
The complexity of multiple energy levels associated with each atomic system determines the various processes related to light-matter interactions. It is necessary to understand the influence of the different levels in a given atomic system. In this work we focus on multi-level atomic schemes with an applied magnetic field. We analyze the different EIT windows which appear in the presence of a moderately high magnetic field strength (∼10 G).
NASA Technical Reports Server (NTRS)
1972-01-01
A definition of the expendable second stage and space shuttle booster separation system is presented. Modifications required on the reusable booster for expendable second stage/payload flight and the ground systems needed to operate the expendable second stage in conjunction with the space shuttle booster are described. The safety, reliability, and quality assurance program is explained. Launch complex operations and services are analyzed.
Reversible heart rhythm complexity impairment in patients with primary aldosteronism
NASA Astrophysics Data System (ADS)
Lin, Yen-Hung; Wu, Vin-Cent; Lo, Men-Tzung; Wu, Xue-Ming; Hung, Chi-Sheng; Wu, Kwan-Dun; Lin, Chen; Ho, Yi-Lwun; Stowasser, Michael; Peng, Chung-Kang
2015-08-01
Excess aldosterone secretion in patients with primary aldosteronism (PA) impairs their cardiovascular system. Heart rhythm complexity analysis, derived from heart rate variability (HRV), is a powerful tool to quantify the complex regulatory dynamics of human physiology. We prospectively analyzed 20 patients with aldosterone-producing adenoma (APA) who underwent adrenalectomy and 25 patients with essential hypertension (EH). The heart rate data were analyzed by conventional HRV and heart rhythm complexity analysis, including detrended fluctuation analysis (DFA) and multiscale entropy (MSE). We found APA patients had significantly decreased DFA α2 on DFA analysis and decreased area 1-5, area 6-15, and area 6-20 on MSE analysis (all p < 0.05). Area 1-5, area 6-15, and area 6-20 in the MSE study correlated significantly with log-transformed renin activity and log-transformed aldosterone-renin ratio (all p ≤ 0.01). The conventional HRV parameters were comparable between PA and EH patients. After adrenalectomy, all the altered DFA and MSE parameters improved significantly (all p < 0.05), whereas the conventional HRV parameters did not change. Our results suggest that heart rhythm complexity is impaired in APA patients and that this impairment is at least partially reversed by adrenalectomy.
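A minimal sketch of the DFA part of such an analysis in Python, using a surrogate RR-interval series; the window sizes, and the fitting of a single exponent rather than separate α1/α2 ranges, are simplifications for illustration only.

```python
import numpy as np

def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
    """Detrended fluctuation analysis: returns the scaling exponent alpha.

    x is an RR-interval (or any 1-D) series; a single exponent is fitted
    over the supplied window sizes for simplicity (the study distinguishes
    short-range alpha1 from long-range alpha2).
    """
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                 # integrated profile
    flucts = []
    for s in scales:
        n_win = len(y) // s
        f2 = []
        for k in range(n_win):
            seg = y[k * s:(k + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)        # local linear trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    alpha = np.polyfit(np.log(scales), np.log(flucts), 1)[0]
    return alpha

rr = np.random.randn(2000) * 0.05 + 0.8          # surrogate RR series (seconds)
print(dfa_alpha(rr))                             # ~0.5 for uncorrelated noise
```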
Analysis of the interface variability in NMR structure ensembles of protein-protein complexes.
Calvanese, Luisa; D'Auria, Gabriella; Vangone, Anna; Falcigno, Lucia; Oliva, Romina
2016-06-01
NMR structures consist of ensembles of conformers, all satisfying the experimental restraints, which exhibit a certain degree of structural variability. Here we analyzed the interface in NMR ensembles of protein-protein heterodimeric complexes and found it to span a wide range of different conservations. The different conservations exhibited do not simply correlate with the size of the systems/interfaces, and are most probably the result of an interplay between different factors, including the quality of the experimental data and the intrinsic flexibility of the complex. In any case, this information should not be missed when NMR structures of protein-protein complexes are analyzed, especially considering that, as we also show here, the first NMR conformer is usually not the one which best reflects the overall interface. To quantify the interface conservation and to analyze it, we used an approach originally conceived for the analysis and ranking of ensembles of docking models, which has now been extended to deal directly with NMR ensembles. We propose this approach, based on the conservation of the inter-residue contacts at the interface, both for the analysis of the interface in whole ensembles of NMR complexes and for the possible selection of a single conformer as the best representative of the overall interface. In order to make the analyses automatic and fast, we made the protocol available as a web tool at: https://www.molnac.unisa.it/BioTools/consrank/consrank-nmr.html. Copyright © 2016 Elsevier Inc. All rights reserved.
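The core idea of scoring interface conservation by inter-residue contact overlap can be sketched as follows in Python; the 5 Å cutoff, the Jaccard-style overlap score, and the random toy coordinates are illustrative choices, not the ConsRank implementation.

```python
import numpy as np
from itertools import combinations

def contacts(coords_a, coords_b, cutoff=5.0):
    """Set of inter-chain residue contacts (i, j) within a distance cutoff.

    coords_a, coords_b: (n_res, 3) representative-atom coordinates per chain.
    """
    d = np.linalg.norm(coords_a[:, None, :] - coords_b[None, :, :], axis=2)
    return {(i, j) for i, j in zip(*np.where(d <= cutoff))}

def interface_conservation(conformers_a, conformers_b):
    """Average pairwise fraction of shared inter-chain contacts in an ensemble."""
    contact_sets = [contacts(a, b) for a, b in zip(conformers_a, conformers_b)]
    scores = []
    for s1, s2 in combinations(contact_sets, 2):
        union = s1 | s2
        scores.append(len(s1 & s2) / len(union) if union else 1.0)
    return float(np.mean(scores))

# Toy ensemble: three conformers of a 20 + 20 residue "complex".
rng = np.random.default_rng(2)
ens_a = [rng.normal(size=(20, 3)) * 6 for _ in range(3)]
ens_b = [rng.normal(size=(20, 3)) * 6 for _ in range(3)]
print(interface_conservation(ens_a, ens_b))
```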
Cross-Modulated Amplitudes and Frequencies Characterize Interacting Components in Complex Systems
NASA Astrophysics Data System (ADS)
Gans, Fabian; Schumann, Aicko Y.; Kantelhardt, Jan W.; Penzel, Thomas; Fietze, Ingo
2009-03-01
The dynamics of complex systems is characterized by oscillatory components on many time scales. To study the interactions between these components we analyze the cross modulation of their instantaneous amplitudes and frequencies, separating synchronous and antisynchronous modulation. We apply our novel technique to brain-wave oscillations in the human electroencephalogram and show that interactions between the α wave and the δ or β wave oscillators as well as spatial interactions can be quantified and related with physiological conditions (e.g., sleep stages). Our approach overcomes the limitation to oscillations with similar frequencies and enables us to quantify directly nonlinear effects such as positive or negative frequency modulation.
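A small Python sketch of the basic ingredients, instantaneous amplitudes and frequencies obtained from band-limited analytic signals; the surrogate data, band edges, and the simple envelope-correlation index are illustrative and do not reproduce the paper's synchronous/antisynchronous modulation measure.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

fs = 250.0                                    # hypothetical EEG sampling rate
t = np.arange(0, 20, 1 / fs)

def band(x, lo, hi):
    b, a = butter(2, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

# Surrogate signal containing alpha (10 Hz) and delta (2 Hz) components,
# with the alpha amplitude modulated by the delta rhythm.
x = np.sin(2 * np.pi * 10 * t) * (1 + 0.5 * np.sin(2 * np.pi * 2 * t)) \
    + np.sin(2 * np.pi * 2 * t) + 0.2 * np.random.randn(len(t))

def amp_and_freq(sig):
    analytic = hilbert(sig)
    amp = np.abs(analytic)                                    # instantaneous amplitude
    phase = np.unwrap(np.angle(analytic))
    freq = np.diff(phase) * fs / (2 * np.pi)                  # instantaneous frequency
    return amp[1:], freq

a_alpha, f_alpha = amp_and_freq(band(x, 8, 12))
a_delta, f_delta = amp_and_freq(band(x, 1, 4))

# Simple (un-normalized) co-modulation index: correlation of the two amplitude
# envelopes; positive values indicate synchronous amplitude modulation.
print(np.corrcoef(a_alpha, a_delta)[0, 1])
```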
Analyzing system safety in lithium-ion grid energy storage
Rosewater, David; Williams, Adam
2015-10-08
As grid energy storage systems become more complex, it grows more difficult to design them for safe operation. This paper first reviews the properties of lithium-ion batteries that can produce hazards in grid scale systems. Then the conventional safety engineering technique Probabilistic Risk Assessment (PRA) is reviewed to identify its limitations in complex systems. To address this gap, new research is presented on the application of Systems-Theoretic Process Analysis (STPA) to a lithium-ion battery based grid energy storage system. STPA is anticipated to fill the gaps recognized in PRA for designing complex systems and hence be more effective or less costly to use during safety engineering. It was observed that STPA is able to capture causal scenarios for accidents not identified using PRA. Additionally, STPA enabled a more rational assessment of uncertainty (all that is not known), thereby promoting a healthy skepticism of design assumptions. Lastly, we conclude that STPA may indeed be more cost effective than PRA for safety engineering in lithium-ion battery systems. However, further research is needed to determine if this approach actually reduces safety engineering costs in development, or improves industry safety standards.
DOT National Transportation Integrated Search
2017-03-01
Transportation corridors are complex systems. Tradeoffs, particularly in terms of traffic mobility, transit performance, accessibility and pedestrian interactions, are not well understood. When the focus is on motorized vehicle mobility and through...
NASA Astrophysics Data System (ADS)
Rodriguez Lucatero, C.; Schaum, A.; Alarcon Ramos, L.; Bernal-Jaquez, R.
2014-07-01
In this study, the dynamics of decisions in complex networks subject to external fields are studied within a Markov process framework using nonlinear dynamical systems theory. A mathematical discrete-time model is derived using a set of basic assumptions regarding the convincement mechanisms associated with two competing opinions. The model is analyzed with respect to the multiplicity of critical points and the stability of extinction states. Sufficient conditions for extinction are derived in terms of the convincement probabilities and the maximum eigenvalues of the associated connectivity matrices. The influences of exogenous (e.g., mass media-based) effects on decision behavior are analyzed qualitatively. The current analysis predicts: (i) the presence of fixed-point multiplicity (with a maximum number of four different fixed points), multi-stability, and sensitivity with respect to the process parameters; and (ii) the bounded but significant impact of exogenous perturbations on the decision behavior. These predictions were verified using a set of numerical simulations based on a scale-free network topology.
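The paper states its extinction conditions in terms of convincement probabilities and the largest eigenvalue of the connectivity matrices, but the abstract does not give the exact inequality. The sketch below only evaluates a generic spectral threshold of the form p·λ_max < δ (an assumed form, analogous to SIS epidemic thresholds) on a scale-free network like the one used in the simulations.

```python
import numpy as np
import networkx as nx

# Scale-free network topology comparable to the one used for verification.
G = nx.barabasi_albert_graph(n=1000, m=3, seed=1)
A = nx.to_numpy_array(G)
lam_max = np.max(np.linalg.eigvalsh(A))   # largest eigenvalue of the (symmetric) connectivity matrix

# Generic spectral extinction check (assumed form; the paper derives its own
# sufficient conditions from the convincement probabilities of the two opinions).
p_convince, p_forget = 0.02, 0.10
print("lambda_max =", round(lam_max, 2))
print("extinction condition p*lambda_max < delta holds:",
      p_convince * lam_max < p_forget)
```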
Bush, Ian E.
1980-01-01
The lessons of the 70's with MIS were largely painful, often the same as those of the 60's, and were found in different phases on two continents. On examination this turns out to be true for many non-medical fields, true for systems programming, and thus a very general phenomenon. It is related to the functional complexity rather than to the sheer size of the software required, and above all to the relative neglect of human factors at all levels of software and hardware design. Simple hierarchical theory is a useful tool for analyzing complex systems and restoring the necessary dominance of common sense human factors. An example shows the very large effects of neglecting these factors on costs and benefits of MIS and their sub-systems.
Ahmadi, Samira; Wu, Christine; Sepehri, Nariman; Kantikar, Anuprita; Nankar, Mayur; Szturm, Tony
2018-01-01
Quantized dynamical entropy (QDE) has recently been proposed as a new measure to quantify the complexity of dynamical systems with the purpose of offering a better computational efficiency. This paper further investigates the viability of this method using five different human gait signals. These signals are recorded during normal walking and while performing secondary tasks among two age groups (young and older age groups). The results are compared with the outcomes of the previously established sample entropy (SampEn) measure for the same signals. We also study how analyzing segmented, spatially and temporally normalized signals differs from analyzing the whole data. Our findings show that human gait signals become more complex as people age and while they are cognitively loaded. Center of pressure (COP) displacement in the mediolateral direction is the best signal for showing the gait changes. Moreover, the results suggest that by segmenting data, more information about intrastride dynamical features is obtained. Most importantly, QDE is shown to be a reliable measure for human gait complexity analysis.
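For reference, the comparison measure SampEn can be sketched as below on a hypothetical mediolateral COP trace; the template length m=2, the 0.2·SD tolerance, and the test signal are conventional illustrative choices rather than the parameters used in the paper, and QDE itself is not reproduced here.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """SampEn(m, r): negative log of the conditional probability that
    sequences matching for m points also match for m+1 points."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)

    def match_count(dim):
        templates = np.array([x[i:i + dim] for i in range(len(x) - dim)])
        d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=-1)
        return np.sum(d <= r) - len(templates)   # exclude self-matches

    b = match_count(m)
    a = match_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# Hypothetical mediolateral COP displacement trace (arbitrary units).
cop_ml = np.cumsum(np.random.randn(1000)) * 0.1
print(f"SampEn = {sample_entropy(cop_ml):.2f}")
```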
Persistent topological features of dynamical systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maletić, Slobodan, E-mail: slobodan@hitsz.edu.cn; Institute of Nuclear Sciences Vinča, University of Belgrade, Belgrade; Zhao, Yi, E-mail: zhao.yi@hitsz.edu.cn
Inspired by an early work of Muldoon et al., Physica D 65, 1–16 (1993), we present a general method for constructing simplicial complex from observed time series of dynamical systems based on the delay coordinate reconstruction procedure. The obtained simplicial complex preserves all pertinent topological features of the reconstructed phase space, and it may be analyzed from topological, combinatorial, and algebraic aspects. In focus of this study is the computation of homology of the invariant set of some well known dynamical systems that display chaotic behavior. Persistent homology of simplicial complex and its relationship with the embedding dimensions are examined by studying the lifetime of topological features and topological noise. The consistency of topological properties for different dynamic regimes and embedding dimensions is examined. The obtained results shed new light on the topological properties of the reconstructed phase space and open up new possibilities for application of advanced topological methods. The method presented here may be used as a generic method for constructing simplicial complex from a scalar time series that has a number of advantages compared to the mapping of the same time series to a complex network.
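The first step of such a construction, delay-coordinate reconstruction of the phase space from a scalar series, can be sketched as follows; the Lorenz-type test signal, the embedding dimension of 3, and the delay of 10 samples are illustrative assumptions, and the subsequent simplicial-complex and persistent-homology computations (typically done with dedicated topology libraries) are not shown.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Delay-coordinate reconstruction: map a scalar series x(t) to points
    (x[t], x[t+tau], ..., x[t+(dim-1)*tau]) in R^dim."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def lorenz_x(n, dt=0.01, sigma=10.0, rho=28.0, beta=8 / 3):
    """Crude Euler integration of the Lorenz system; returns the x-coordinate
    as a stand-in chaotic observable."""
    s = np.array([1.0, 1.0, 1.0])
    out = np.empty(n)
    for i in range(n):
        dx = sigma * (s[1] - s[0])
        dy = s[0] * (rho - s[2]) - s[1]
        dz = s[0] * s[1] - beta * s[2]
        s = s + dt * np.array([dx, dy, dz])
        out[i] = s[0]
    return out

points = delay_embed(lorenz_x(5000), dim=3, tau=10)
print(points.shape)   # point cloud on which a Vietoris-Rips-type complex could be built
```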
A Process Management System for Networked Manufacturing
NASA Astrophysics Data System (ADS)
Liu, Tingting; Wang, Huifen; Liu, Linyan
With the development of computer, communication and network technologies, networked manufacturing has become one of the main manufacturing paradigms in the 21st century. Under the networked manufacturing environment, there exist a large number of cooperative tasks susceptible to alterations, conflicts caused by resources, and problems of cost and quality. This increases the complexity of administration. Process management is a technology used to design, enact, control, and analyze networked manufacturing processes. It supports efficient execution, effective management, conflict resolution, cost containment and quality control. In this paper we propose an integrated process management system for networked manufacturing. Requirements of process management are analyzed and the architecture of the system is presented. A process model considering process cost and quality is also developed. Finally a case study is provided to explain how the system runs efficiently.
Measuring the impact of final demand on global production system based on Markov process
NASA Astrophysics Data System (ADS)
Xing, Lizhi; Guan, Jun; Wu, Shan
2018-07-01
The input-output table is a comprehensive and detailed description of a national economic system, consisting of supply and demand information among various industrial sectors. The complex network, a theory and method for measuring the structure of complex systems, can depict the structural properties of social and economic systems, and reveal the complicated relationships between the inner hierarchies and the external macroeconomic functions. This paper tried to measure the globalization degree of industrial sectors on the global value chain. Firstly, it constructed inter-country input-output network models to reproduce the topological structure of the global economic system. Secondly, it regarded the propagation of intermediate goods on the global value chain as a Markov process and introduced counting first passage betweenness to quantify the added processing amount when global final demand stimulates this production system. Thirdly, it analyzed the features of globalization at both the global and country-sector levels.
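To make the idea of demand propagating through an input-output system concrete, the sketch below runs the textbook Leontief calculation on a toy three-sector coefficient matrix; the matrix, the demand vector, and the use of the Leontief inverse are generic illustrations, not the paper's inter-country model or its counting first passage betweenness measure.

```python
import numpy as np

# Toy technical-coefficient matrix A (column j = inputs required per unit of
# sector j's output); values are illustrative only.
A = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.05, 0.10],
              [0.05, 0.10, 0.20]])
final_demand = np.array([100.0, 50.0, 80.0])

# Total output needed to satisfy final demand: x = (I - A)^(-1) f, i.e. the
# sum of the Markov-like propagation A^0 f + A^1 f + A^2 f + ...
leontief_inverse = np.linalg.inv(np.eye(3) - A)
total_output = leontief_inverse @ final_demand
print(np.round(total_output, 1))
```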
Morphosemantic parsing of medical compound words: transferring a French analyzer to English.
Deléger, Louise; Namer, Fiammetta; Zweigenbaum, Pierre
2009-04-01
Medical language, as many technical languages, is rich with morphologically complex words, many of which take their roots in Greek and Latin--in which case they are called neoclassical compounds. Morphosemantic analysis can help generate definitions of such words. The similarity of structure of those compounds in several European languages has also been observed, which seems to indicate that the same linguistic analysis could be applied to neoclassical compounds from different languages with minor modifications. This paper reports work on the adaptation of a morphosemantic analyzer dedicated to French (DériF) to analyze English medical neoclassical compounds. It presents the principles of this transposition and its current performance. The analyzer was tested on a set of 1299 compounds extracted from the WHO-ART terminology. 859 could be decomposed and defined, 675 of them successfully. An advantage of this process is that complex linguistic analyses designed for French could be successfully transposed to the analysis of English medical neoclassical compounds, which confirmed our hypothesis of transferability. The fact that the method was successfully applied to a Germanic language such as English suggests that performance would be at least as high when experimenting with Romance languages such as Spanish. Finally, the resulting system can produce more complete analyses of English medical compounds than existing systems, including a hierarchical decomposition and semantic gloss of each word.
NASA Astrophysics Data System (ADS)
Christensen, Claire Petra
Across diverse fields ranging from physics to biology, sociology, and economics, the technological advances of the past decade have engendered an unprecedented explosion of data on highly complex systems with thousands, if not millions, of interacting components. These systems exist at many scales of size and complexity, and it is becoming ever-more apparent that they are, in fact, universal, arising in every field of study. Moreover, they share fundamental properties---chief among these, that the individual interactions of their constituent parts may be well-understood, but the characteristic behaviour produced by the confluence of these interactions---by these complex networks---is unpredictable; in a nutshell, the whole is more than the sum of its parts. There is, perhaps, no better illustration of this concept than the discoveries being made regarding complex networks in the biological sciences. In particular, though the sequencing of the human genome in 2003 was a remarkable feat, scientists understand that the "cellular-level blueprints" for the human being are cellular-level parts lists, but they say nothing (explicitly) about cellular-level processes. The challenge of modern molecular biology is to understand these processes in terms of the networks of parts---in terms of the interactions among proteins, enzymes, genes, and metabolites---as it is these processes that ultimately differentiate animate from inanimate, giving rise to life! It is the goal of systems biology---an umbrella field encapsulating everything from molecular biology to epidemiology in social systems---to understand processes in terms of fundamental networks of core biological parts, be they proteins or people. By virtue of the fact that there are literally countless complex systems, not to mention tools and techniques used to infer, simulate, analyze, and model these systems, it is impossible to give a truly comprehensive account of the history and study of complex systems. The author's own publications have contributed network inference, simulation, modeling, and analysis methods to the much larger body of work in systems biology, and indeed, in network science. The aim of this thesis is therefore twofold: to present this original work in the historical context of network science, but also to provide sufficient review and reference regarding complex systems (with an emphasis on complex networks in systems biology) and tools and techniques for their inference, simulation, analysis, and modeling, such that the reader will be comfortable in seeking out further information on the subject. The review-like Chapters 1, 2, and 4 are intended to convey the co-evolution of network science and the slow but noticeable breakdown of boundaries between disciplines in academia as research and comparison of diverse systems has brought to light the shared properties of these systems. It is the author's hope that these chapters impart some sense of the remarkable and rapid progress in complex systems research that has led to this unprecedented academic synergy. Chapters 3 and 5 detail the author's original work in the context of complex systems research. Chapter 3 presents the methods and results of a two-stage modeling process that generates candidate gene-regulatory networks of the bacterium B. subtilis from experimentally obtained, yet mathematically underdetermined microchip array data.
These networks are then analyzed from a graph theoretical perspective, and their biological viability is critiqued by comparing the networks' graph theoretical properties to those of other biological systems. The results of topological perturbation analyses revealing commonalities in behavior at multiple levels of complexity are also presented, and are shown to be an invaluable means by which to ascertain the level of complexity to which the network inference process is robust to noise. Chapter 5 outlines a learning algorithm for the development of a realistic, evolving social network (a city) into which a disease is introduced. The results of simulations in populations spanning two orders of magnitude are compared to prevaccine era measles data for England and Wales and demonstrate that the simulations are able to capture the quantitative and qualitative features of epidemics in populations as small as 10,000 people. The work presented in Chapter 5 validates the utility of network simulation in concurrently probing contact network dynamics and disease dynamics.
Dynamical Analysis and Visualization of Tornadoes Time Series
2015-01-01
In this paper we analyze the behavior of tornado time-series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series involving 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns. PMID:25790281
Dynamical analysis and visualization of tornadoes time series.
Lopes, António M; Tenreiro Machado, J A
2015-01-01
In this paper we analyze the behavior of tornado time-series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series involving 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns.
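A minimal version of the spectral step described above can be sketched as follows: build an impulse series, take its amplitude spectrum, and fit a power law on log-log axes. The synthetic event dates, Pareto-distributed sizes, daily binning, and least-squares fit are illustrative assumptions, not the tornado catalog or fitting procedure used in the paper.

```python
import numpy as np

# Event record modeled as a sequence of Dirac impulses: daily bins over
# ~64 years, impulse amplitude proportional to event size (synthetic here).
rng = np.random.default_rng(2)
n_days = 64 * 365
signal = np.zeros(n_days)
event_days = rng.choice(n_days, size=5000, replace=False)
signal[event_days] = rng.pareto(2.0, size=event_days.size) + 1.0

# Amplitude spectrum and power-law fit |X(f)| ~ c * f^b on log-log axes.
spectrum = np.abs(np.fft.rfft(signal))[1:]          # drop the DC term
freqs = np.fft.rfftfreq(n_days, d=1.0)[1:]
b, log_c = np.polyfit(np.log(freqs), np.log(spectrum), 1)
print(f"fitted spectral exponent ~ {b:.2f}")
```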
Software for occupational health and safety risk analysis based on a fuzzy model.
Stefanovic, Miladin; Tadic, Danijela; Djapan, Marko; Macuzic, Ivan
2012-01-01
Risk and safety management are very important issues in healthcare systems. Those are complex systems with many entities, hazards and uncertainties. In such an environment, it is very hard to introduce a system for evaluating and simulating significant hazards. In this paper, we analyzed different types of hazards in healthcare systems and we introduced a new fuzzy model for evaluating and ranking hazards. Finally, we presented a developed software solution, based on the suggested fuzzy model for evaluating and monitoring risk.
Patient safety - the role of human factors and systems engineering.
Carayon, Pascale; Wood, Kenneth E
2010-01-01
Patient safety is a global challenge that requires knowledge and skills in multiple areas, including human factors and systems engineering. In this chapter, numerous conceptual approaches and methods for analyzing, preventing and mitigating medical errors are described. Given the complexity of healthcare work systems and processes, we emphasize the need for increasing partnerships between the health sciences and human factors and systems engineering to improve patient safety. Those partnerships will be able to develop and implement the system redesigns that are necessary to improve healthcare work systems and processes for patient safety.
Improving the detection of wind fields from LIDAR aerosol backscatter using feature extraction
NASA Astrophysics Data System (ADS)
Bickel, Brady R.; Rotthoff, Eric R.; Walters, Gage S.; Kane, Timothy J.; Mayor, Shane D.
2016-04-01
The tracking of winds and atmospheric features has many applications, from predicting and analyzing weather patterns in the upper and lower atmosphere to monitoring air movement from pig and chicken farms. Doppler LIDAR systems exist to quantify the underlying wind speeds, but the cost of these systems can sometimes be relatively high, and processing limitations exist. The alternative is using an incoherent LIDAR system to analyze aerosol backscatter. Improving the detection and analysis of wind information from aerosol backscatter LIDAR systems will allow for the adoption of these relatively low-cost instruments in environments where the size, complexity, and cost of other options are prohibitive. Using data from a simple aerosol backscatter LIDAR system, we attempt to extend the processing capabilities by calculating wind vectors through image correlation techniques to improve the detection of wind features.
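One common form of the image-correlation step is sketched below: the displacement between two successive backscatter frames is taken from the peak of their 2-D cross-correlation, and a wind vector would follow from the spatial resolution and scan interval. The synthetic frames, the FFT-based correlation, and the single-peak estimator are assumptions for illustration, not the authors' processing chain.

```python
import numpy as np
from scipy.signal import fftconvolve

def displacement(frame1, frame2):
    """Estimate the (dy, dx) shift of frame2 relative to frame1 from the peak
    of their full 2-D cross-correlation (frames are mean-removed first)."""
    f1 = frame1 - frame1.mean()
    f2 = frame2 - frame2.mean()
    corr = fftconvolve(f2, f1[::-1, ::-1], mode="full")
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape))
    zero_lag = np.array(frame1.shape) - 1        # index of zero displacement
    return peak - zero_lag

# Synthetic aerosol feature advected by 3 pixels in x between scans.
rng = np.random.default_rng(0)
base = rng.normal(size=(128, 128))
frame_a = base + 0.1 * rng.normal(size=base.shape)
frame_b = np.roll(base, 3, axis=1) + 0.1 * rng.normal(size=base.shape)
dy, dx = displacement(frame_a, frame_b)
# wind vector = pixel displacement * (range resolution / scan interval)
print(f"estimated shift: dy={int(dy)}, dx={int(dx)} pixels")
```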
Skin-electrode circuit model for use in optimizing energy transfer in volume conduction systems.
Hackworth, Steven A; Sun, Mingui; Sclabassi, Robert J
2009-01-01
The X-Delta model for through-skin volume conduction systems is introduced and analyzed. This new model has advantages over our previous X model in that it explicitly represents current pathways in the skin. A vector network analyzer is used to take measurements on pig skin to obtain data for use in finding the model's impedance parameters. An optimization method for obtaining this more complex model's parameters is described. Results show the model to accurately represent the impedance behavior of the skin system, with errors generally less than one percent. Uses for the model include optimizing energy transfer across the skin in a volume conduction system with appropriate current exposure constraints, and exploring non-linear behavior of the electrode-skin system at moderate voltages (below ten volts) and frequencies (kilohertz to megahertz).
NASA Astrophysics Data System (ADS)
Li, Yuanyuan; Jin, Suoqin; Lei, Lei; Pan, Zishu; Zou, Xiufen
2015-03-01
The early diagnosis and investigation of the pathogenic mechanisms of complex diseases are the most challenging problems in the fields of biology and medicine. Network-based systems biology is an important technique for the study of complex diseases. The present study constructed dynamic protein-protein interaction (PPI) networks to identify dynamical network biomarkers (DNBs) and analyze the underlying mechanisms of complex diseases from a systems level. We developed a model-based framework for the construction of a series of time-sequenced networks by integrating high-throughput gene expression data into PPI data. By combining the dynamic networks and molecular modules, we identified significant DNBs for four complex diseases, including influenza caused by either H3N2 or H1N1, acute lung injury and type 2 diabetes mellitus, which can serve as warning signals for disease deterioration. Function and pathway analyses revealed that the identified DNBs were significantly enriched during key events in early disease development. Correlation and information flow analyses revealed that DNBs effectively discriminated between different disease processes and that dysfunctional regulation and disproportional information flow may contribute to the increased disease severity. This study provides a general paradigm for revealing the deterioration mechanisms of complex diseases and offers new insights into their early diagnoses.
Fundamental concepts of structural loading and load relief techniques for the space shuttle
NASA Technical Reports Server (NTRS)
Ryan, R. S.; Mowery, D. K.; Winder, S. W.
1972-01-01
The prediction of flight loads and their potential reduction, using various control system logics for the space shuttle vehicles, is discussed. Some factors not found on previous launch vehicles that increase the complexity are large lifting surfaces, unsymmetrical structure, unsymmetrical aerodynamics, trajectory control system coupling, and large aeroelastic effects. These load-producing factors and load-reducing techniques are analyzed.
Origin and Evolutionary Alteration of the Mitochondrial Import System in Eukaryotic Lineages
Fukasawa, Yoshinori; Oda, Toshiyuki; Tomii, Kentaro
2017-01-01
Protein transport systems are fundamentally important for maintaining mitochondrial function. Nevertheless, mitochondrial protein translocases such as the kinetoplastid ATOM complex have recently been shown to vary in eukaryotic lineages. Various evolutionary hypotheses have been formulated to explain this diversity. To resolve any contradiction, estimating the primitive state and clarifying changes from that state are necessary. Here, we present more likely primitive models of mitochondrial translocases, specifically the translocase of the outer membrane (TOM) and translocase of the inner membrane (TIM) complexes, using scrutinized phylogenetic profiles. We then analyzed the translocases’ evolution in eukaryotic lineages. Based on those results, we propose a novel evolutionary scenario for diversification of the mitochondrial transport system. Our results indicate that presequence transport machinery was mostly established in the last eukaryotic common ancestor, and that primitive translocases already had a pathway for transporting presequence-containing proteins. Moreover, secondary changes including convergent and migrational gains of a presequence receptor in TOM and TIM complexes, respectively, likely resulted from constrained evolution. The nature of a targeting signal can constrain alteration to the protein transport complex. PMID:28369657
A Survey of Noninteractive Zero Knowledge Proof System and Its Applications
Wu, Huixin; Wang, Feng
2014-01-01
The zero knowledge proof system, which has received extensive attention since it was proposed, is an important branch of cryptography and computational complexity theory. Among its variants, the noninteractive zero knowledge proof system contains only one message, sent by the prover to the verifier. It is widely used in the construction of various types of cryptographic protocols and cryptographic algorithms because of its good privacy, authentication, and lower interactive complexity. This paper reviews and analyzes the basic principles of the noninteractive zero knowledge proof system, and summarizes the research progress achieved by noninteractive zero knowledge proof systems on the following aspects: the definition and related models of the noninteractive zero knowledge proof system, noninteractive zero knowledge proof systems for NP problems, noninteractive statistical and perfect zero knowledge, the connections between noninteractive zero knowledge proof systems, interactive zero knowledge proof systems, and zaps, and the specific applications of noninteractive zero knowledge proof systems. This paper also points out the future research directions. PMID:24883407
NASA Technical Reports Server (NTRS)
Parsons-Wingerter, P.; Weitzel, Alexander; Vyas, R. J.; Murray, M. C.; Vickerman, M. B.; Bhattacharya, S.; Wyatt, S. E.
2016-01-01
One fundamental requirement shared by humans with all higher terrestrial life forms, including other vertebrates, insects, and higher land plants, is a complex, fractally branching vascular system. NASA's VESsel GENeration Analysis (VESGEN) software maps and quantifies vascular trees, networks, and tree-network composites according to weighted physiological rules such as vessel connectivity, tapering and bifurcational branching. According to fluid dynamics, successful vascular transport requires a complex distributed system of highly regulated laminar flow. Microvascular branching rules within vertebrates, dicot leaves and the other organisms therefore display many similarities. A unifying perspective is that vascular patterning offers a useful readout of molecular signaling that necessarily integrates these complex pathways. VESGEN has elucidated changes in vascular pattern resulting from inflammatory, developmental and other signaling within numerous tissues and major model organisms studied for Space Biology. For a new VESGEN systems approach, we analyzed differential gene expression in leaves of Arabidopsis thaliana reported by GeneLab (GLDS-7) for spaceflight. Vascular-related changes in leaf gene expression were identified that can potentially be phenocopied by mutants in ground-based experiments. To link transcriptional, protein and other molecular change with phenotype, alterations in the spatial and dynamic dimensions of vascular patterns for Arabidopsis leaves and other model species are being co-localized with signaling patterns of single molecular expression analyzed as information dimensions. Previously, Drosophila microarray data returned from space suggested significant changes in genes related to wing venation development that include EGF, Notch, Hedgehog, Wingless and Dpp signaling. Phenotypes of increasingly abnormal ectopic wing venation in the (non-spaceflight) Drosophila wing generated by overexpression of a Notch antagonist were analyzed by VESGEN. Other VESGEN research applications include the mouse retina, GI and coronary vessels, avian placental analogs and translational studies in the astronaut retina related to health challenges for long-duration missions.
NASA Technical Reports Server (NTRS)
Parsons-Wingerter, Patricia A.; Weitzel, Alexander; Vyas, Ruchi J.; Murray, Matthew C.; Wyatt, Sarah E.
2016-01-01
One fundamental requirement shared by humans with all higher terrestrial life forms, including insect wings, higher land plants and other vertebrates, is a complex, fractally branching vascular system. NASA's VESsel GENeration Analysis (VESGEN) software maps and quantifies vascular trees, networks, and tree-network composites according to weighted physiological rules such as vessel connectivity, tapering and bifurcational branching. According to fluid dynamics, successful vascular transport requires a complex distributed system of highly regulated laminar flow. Microvascular branching rules within vertebrates, dicot leaves and the other organisms therefore display many similarities. One unifying perspective is that vascular patterning offers a useful readout that necessarily integrates complex molecular signaling pathways. VESGEN has elucidated changes in vascular pattern resulting from inflammatory, stress response, developmental and other signaling within numerous tissues and major model organisms studied for Space Biology. For a new VESGEN systems approach, we analyzed differential gene expression in leaves of Arabidopsis thaliana reported by GeneLab (GLDS-7) for spaceflight. Vascular-related changes in leaf gene expression were identified that can potentially be phenocopied by mutants in ground-based experiments. To link transcriptional, protein and other molecular change with phenotype, alterations in the Euclidean and dynamic dimensions (x,y,t) of vascular patterns for Arabidopsis leaves and other model species are being co-localized with signaling patterns of single molecular expression analyzed as information dimensions (i,j,k,...). Previously, Drosophila microarray data returned from space suggested significant changes in genes related to wing venation development that include EGF, Notch, Hedgehog, Wingless and Dpp signaling. Phenotypes of increasingly abnormal ectopic wing venation in the (non-spaceflight) Drosophila wing generated by overexpression of a Notch antagonist were analyzed by VESGEN. Other VESGEN research applications include the mouse retina, GI and coronary vessels, avian placental analogs and translational studies in the astronaut retina related to health challenges for long-duration missions.
Advantages and challenges in automated apatite fission track counting
NASA Astrophysics Data System (ADS)
Enkelmann, E.; Ehlers, T. A.
2012-04-01
Fission track thermochronometer data are often a core element of modern tectonic and denudation studies. Soon after the development of the fission track methods, interest emerged in developing an automated counting procedure to replace the time-consuming labor of counting fission tracks under the microscope. Automated track counting became feasible in recent years with increasing improvements in computer software and hardware. One such example used in this study is the commercial automated fission track counting procedure from Autoscan Systems Pty that has been highlighted through several venues. We conducted experiments that are designed to reliably and consistently test the ability of this fully automated counting system to recognize fission tracks in apatite and a muscovite external detector. Fission tracks were analyzed in samples with a step-wise increase in sample complexity. The first set of experiments used a large (mm-size) slice of Durango apatite cut parallel to the prism plane. Second, samples with 80-200 μm large apatite grains of Fish Canyon Tuff were analyzed. This second sample set is characterized by complexities often found in apatites in different rock types. In addition to the automated counting procedure, the same samples were also analyzed using conventional counting procedures. We found for all samples that the fully automated fission track counting procedure using the Autoscan System yields a larger scatter in the measured fission track densities compared to conventional (manual) track counting. This scatter typically resulted from the false identification of tracks due to surface and mineralogical defects, regardless of the image filtering procedure used. Large differences between track densities analyzed with the automated counting persisted between different grains analyzed in one sample as well as between different samples. As a result of these differences, a manual correction of the fully automated fission track counts is necessary for each individual surface area and grain counted. This manual correction procedure significantly increases (up to four times) the time required to analyze a sample with the automated counting procedure compared to the conventional approach.
Innovation Study for Laser Cutting of Complex Geometries with Paper Materials
NASA Astrophysics Data System (ADS)
Happonen, A.; Stepanov, A.; Piili, H.; Salminen, A.
Even though technology for laser cutting of paper materials has existed for over 30 years, it seems that results of applications of this technology and the possibilities of laser cutting systems are not easily available. The aim of this study was to analyze the feasibility of complex-geometry laser cutting of paper materials and to analyze the innovation challenges and the potential that current laser cutting technologies offer. This research studied the potential and possible challenges in applying CO2 laser cutting technology for cutting of paper materials in current supply chains, which are trying to fulfil customers' changing needs with respect to shape and fast response within rapid delivery cycles. The study is focused on examining and analyzing the different possibilities of laser cutting of paper material in the application area of complex, low-volume geometry cutting. The goal of this case was to analyze the feasibility of laser cutting from technical, quality and implementation points of view and to discuss the availability of new business opportunities. It was noticed that there are new business models still available within laser technology applications in complex geometry cutting. Application of laser technology in business-to-consumer markets, in synergy with Internet service platforms, can widen the customer base and offer new value streams for technology and service companies. Because of this, existing markets and competition have to be identified, and an appropriate new and innovative business model needs to be developed. To be competitive in the markets, models like these need to include the earning logic and the stages from production to delivery, as discussed in the paper.
Integration of systems biology with organs-on-chips to humanize therapeutic development
NASA Astrophysics Data System (ADS)
Edington, Collin D.; Cirit, Murat; Chen, Wen Li Kelly; Clark, Amanda M.; Wells, Alan; Trumper, David L.; Griffith, Linda G.
2017-02-01
"Mice are not little people" - a refrain becoming louder as the gaps between animal models and human disease become more apparent. At the same time, three emerging approaches are headed toward integration: powerful systems biology analysis of cell-cell and intracellular signaling networks in patient-derived samples; 3D tissue engineered models of human organ systems, often made from stem cells; and micro-fluidic and meso-fluidic devices that enable living systems to be sustained, perturbed and analyzed for weeks in culture. Integration of these rapidly moving fields has the potential to revolutionize development of therapeutics for complex, chronic diseases, including those that have weak genetic bases and substantial contributions from gene-environment interactions. Technical challenges in modeling complex diseases with "organs on chips" approaches include the need for relatively large tissue masses and organ-organ cross talk to capture systemic effects, such that current microfluidic formats often fail to capture the required scale and complexity for interconnected systems. These constraints drive development of new strategies for designing in vitro models, including perfusing organ models, as well as "mesofluidic" pumping and circulation in platforms connecting several organ systems, to achieve the appropriate physiological relevance.
The motion and control of a complex three-body space tethered system
NASA Astrophysics Data System (ADS)
Shi, Gefei; Zhu, Zhanxia; Chen, Shiyu; Yuan, Jianping; Tang, Biwei
2017-11-01
This paper is mainly devoted to investigating the dynamics and stability control of a three-body tethered satellite system which contains a main satellite and two subsatellites connected by two straight, massless and inextensible tethers. Firstly, a detailed mathematical model is established in the central gravitational field. Then, the dynamic characteristics of the established system are investigated and analyzed. Based on the dynamic analysis, a novel sliding mode prediction model (SMPM) control strategy is proposed to suppress the motion of the built tethered system. The numerical results show that the proposed underactuated control law is highly effective in suppressing the attitude/libration motion of the underactuated three-body tethered system. Furthermore, cases of different target angles are also examined and analyzed. The simulation results reveal that even if the final equilibrium states differ for different selections of the target angles, the whole system can still be maintained in acceptable areas.
Dissipative quantum trajectories in complex space: Damped harmonic oscillator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chou, Chia-Chun, E-mail: ccchou@mx.nthu.edu.tw
Dissipative quantum trajectories in complex space are investigated in the framework of the logarithmic nonlinear Schrödinger equation. The logarithmic nonlinear Schrödinger equation provides a phenomenological description for dissipative quantum systems. Substituting the wave function expressed in terms of the complex action into the complex-extended logarithmic nonlinear Schrödinger equation, we derive the complex quantum Hamilton–Jacobi equation including the dissipative potential. It is shown that dissipative quantum trajectories satisfy a quantum Newtonian equation of motion in complex space with a friction force. Exact dissipative complex quantum trajectories are analyzed for the wave and solitonlike solutions to the logarithmic nonlinear Schrödinger equation for the damped harmonic oscillator. These trajectories converge to the equilibrium position as time evolves. It is indicated that dissipative complex quantum trajectories for the wave and solitonlike solutions are identical to dissipative complex classical trajectories for the damped harmonic oscillator. This study develops a theoretical framework for dissipative quantum trajectories in complex space.
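For orientation, one widely used form of the logarithmic nonlinear Schrödinger equation is Kostin's friction model, and the complex action enters through the substitution ψ = exp(iS/ħ); whether the paper uses exactly these conventions (signs, and the customary subtraction of the expectation value of the logarithmic term) is an assumption. A sketch in LaTeX:

```latex
% Kostin-type logarithmic nonlinear Schroedinger equation (one common convention)
i\hbar\,\frac{\partial \psi}{\partial t}
  = -\frac{\hbar^{2}}{2m}\,\frac{\partial^{2}\psi}{\partial x^{2}}
    + V(x)\,\psi
    + \frac{\gamma\hbar}{2i}\,\ln\!\frac{\psi}{\psi^{*}}\,\psi .

% Substituting \psi(x,t) = \exp[\, i S(x,t)/\hbar \,] with complex action S,
% the frictionless part reduces to the complex quantum Hamilton-Jacobi equation
\frac{\partial S}{\partial t}
  + \frac{1}{2m}\left(\frac{\partial S}{\partial x}\right)^{2}
  + V(x)
  = \frac{i\hbar}{2m}\,\frac{\partial^{2} S}{\partial x^{2}} ,
% to which the logarithmic term contributes the dissipative (friction) potential.
```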
Energy: Production, Consumption, and Consequences.
ERIC Educational Resources Information Center
Helm, John L., Ed.
Energy policy in the United States and much of the analysis behind those policies is largely incomplete according to many. Systems for energy production, distribution, and use have traditionally been analyzed by supply sector, yet such analyses cannot capture the complex interplay of technology, economics, public policy, and environmental concerns…
Held, Jürgen; Manser, Tanja
2005-02-01
This article outlines how a Palm- or Newton-based PDA (personal digital assistant) system for online event recording was used to record and analyze concurrent events. We describe the features of this PDA-based system, called the FIT-System (flexible interface technique), and its application to the analysis of concurrent events in complex behavioral processes--in this case, anesthesia work processes. The patented FIT-System has a unique user interface design allowing the user to design an interface template with a pencil and paper or using a transparency film. The template usually consists of a drawing or sketch that includes icons or symbols that depict the observer's representation of the situation to be observed. In this study, the FIT-System allowed us to create a design for fast, intuitive online recording of concurrent events using a set of 41 observation codes. An analysis of concurrent events leads to a description of action density, and our results revealed a characteristic distribution of action density during the administration of anesthesia in the operating room. This distribution indicated the central role of the overlapping operations in the action sequences of medical professionals as they deal with the varying requirements of this complex task. We believe that the FIT-System for online recording of concurrent events in complex behavioral processes has the potential to be useful across a broad spectrum of research areas.
Study of component technologies for fuel cell on-site integrated energy systems
NASA Technical Reports Server (NTRS)
Lee, W. D.; Mathias, S.
1980-01-01
Heating, ventilation and air conditioning equipment are integrated with three types of fuel cells. System design and computer simulations are developed to utilize the thermal energy discharge of the fuel cell in the most cost-effective manner. The fuel cell provides all of the electric needs, and a loss-of-load probability analysis is used to ensure adequate power plant reliability. Equipment cost is estimated for each of the systems analyzed. A levelized annual cost reflecting owning and operating costs, including the cost of money, was used to select the most promising integrated system configurations. Cash flows are presented for the most promising 16 systems. Several systems for the 96-unit apartment complex (a retail store was also studied) were cost-competitive with both gas- and electric-based conventional systems. Thermal storage is shown to be beneficial, and the optimum absorption chiller sizing (waste heat recovery) in connection with electric chillers is developed. Battery storage was analyzed since the system is not connected to the electric grid. Advanced absorption chillers were analyzed as well. Recommendations covering financing, technical development, and policy issues are given to accelerate the commercialization of the fuel cell for on-site power generation in buildings.
Towards a Net Zero Building Cluster Energy Systems Analysis for a Brigade Combat Team Complex
2010-05-01
of technologies, like cogeneration or combined heat and power, waste heat recovery, biomass, geothermal energy, solar heating (and cooling), and...peaks of individual buildings; thus the needed generation and back-up capacity is smaller. To develop the community energy concept, energy models...overall thermal energy system, a hydraulic flow model (Figure 5) should be used to analyze critical capacities and flows in the system. This material is
AOIPS water resources data management system
NASA Technical Reports Server (NTRS)
Vanwie, P.
1977-01-01
The text and computer-generated displays used to demonstrate the AOIPS (Atmospheric and Oceanographic Information Processing System) water resources data management system are investigated. The system was developed to assist hydrologists in analyzing the physical processes occurring in watersheds. It was designed to alleviate some of the problems encountered while investigating the complex interrelationships of variables such as land-cover type, topography, precipitation, snow melt, surface runoff, evapotranspiration, and streamflow rates. The system has an interactive image processing capability and a color video display to display results as they are obtained.
Loeb, Danielle F; Bayliss, Elizabeth A; Candrian, Carey; deGruy, Frank V; Binswanger, Ingrid A
2016-03-22
Complex patients are increasingly common in primary care and often have poor clinical outcomes. Healthcare system barriers to effective care for complex patients have been previously described, but less is known about the potential impact and meaning of caring for complex patients on a daily basis for primary care providers (PCPs). Our objective was to describe PCPs' experiences providing care for complex patients, including their experiences of health system barriers and facilitators and their strategies to enhance provision of effective care. Using a general inductive approach, our qualitative research study was guided by an interpretive epistemology, or way of knowing. Our method for understanding included semi-structured in-depth interviews with internal medicine PCPs from two university-based and three community health clinics. We developed an interview guide, which included questions on PCPs' experiences, perceived system barriers and facilitators, and strategies to improve their ability to effectively treat complex patients. To focus interviews on real cases, providers were asked to bring de-identified clinical notes from patients they considered complex to the interview. Interview transcripts were coded and analyzed to develop categories from the raw data, which were then conceptualized into broad themes after team-based discussion. PCPs (N = 15) described complex patients with multidimensional needs, such as socio-economic, medical, and mental health. A vision of optimal care emerged from the data, which included coordinating care, preventing hospitalizations, and developing patient trust. PCPs relied on professional values and individual care strategies to overcome local and system barriers. Team-based approaches were endorsed to improve the management of complex patients. Given the barriers to effective care described by PCPs, individual PCP efforts alone are unlikely to meet the needs of complex patients. To fulfill PCPs' expressed concepts of optimal care, implementation of effective systemic approaches should be considered.
The Social Process of Analyzing Real Water Resource Systems Plans and Management Policies
NASA Astrophysics Data System (ADS)
Loucks, Daniel
2016-04-01
Developing and applying systems analysis methods for improving the development and management of real world water resource systems, I have learned, is primarily a social process. This talk is a call for more recognition of this reality in the modeling approaches we propose in the papers and books we publish. The mathematical models designed to inform planners and managers of water systems that we see in many of our journals often seem more complex than they need be. They also often seem not as connected to reality as they could be. While it may be easier to publish descriptions of complex models than simpler ones, and while adding complexity to models might make them better able to mimic or resemble the actual complexity of the real physical and/or social systems or processes being analyzed, the usefulness of such models often can be an illusion. Sometimes the important features of reality that are of concern or interest to those who make decisions can be adequately captured using relatively simple models. Finding the right balance for the particular issues being addressed or the particular decisions that need to be made is an art. When applied to real world problems or issues in specific basins or regions, systems modeling projects often involve more attention to the social aspects than the mathematical ones. Mathematical models addressing connected interacting interdependent components of complex water systems are in fact some of the most useful methods we have to study and better understand the systems we manage around us. They can help us identify and evaluate possible alternative solutions to problems facing humanity today. The study of real world systems of interacting components using mathematical models is commonly called applied systems analysis. Performing such analyses with decision makers rather than of decision makers is critical if the needed trust between project personnel and their clients is to be developed. Using examples from recent and ongoing modeling projects in different parts of the world, this talk will attempt to show how the degree of project success depends on the degree of attention given to communication between project personnel, the stakeholders and decision-making institutions. It will also highlight how initial project terms-of-reference and expected outcomes can change, sometimes in surprising ways, during the course of such projects. Changing project objectives often result from changing stakeholder values, emphasizing the need for analyses that can adapt to this uncertainty.
NASA Astrophysics Data System (ADS)
Balaji, P. A.
1999-07-01
A cricket's ear is a directional acoustic sensor. It has a remarkable level of sensitivity to the direction of sound propagation in a narrow frequency bandwidth of 4-5 kHz. Because of its complexity, the directional sensitivity has long intrigued researchers. The cricket's ear is a four-acoustic-inputs/two-vibration-outputs system. In this dissertation, this system is examined in depth, both experimentally and theoretically, with a primary goal to understand the mechanics involved in directional hearing. Experimental identification of the system is done by using random signal processing techniques. Theoretical identification of the system is accomplished by analyzing sound transmission through the complex trachea of the ear. Finally, a description of how the cricket achieves directional hearing sensitivity is proposed. The fundamental principle involved in directional hearing of the cricket has been utilized to design a device to obtain a directional signal from non-directional inputs.
Modeling And Simulation Of Bar Code Scanners Using Computer Aided Design Software
NASA Astrophysics Data System (ADS)
Hellekson, Ron; Campbell, Scott
1988-06-01
Many optical systems have demanding requirements to package the system in a small 3 dimensional space. The use of computer graphic tools can be a tremendous aid to the designer in analyzing the optical problems created by smaller and less costly systems. The Spectra Physics grocery store bar code scanner employs an especially complex 3 dimensional scan pattern to read bar code labels. By using a specially written program which interfaces with a computer aided design system, we have simulated many of the functions of this complex optical system. In this paper we will illustrate how a recent version of the scanner has been designed. We will discuss the use of computer graphics in the design process including interactive tweaking of the scan pattern, analysis of collected light, analysis of the scan pattern density, and analysis of the manufacturing tolerances used to build the scanner.
Method for Evaluating Information to Solve Problems of Control, Monitoring and Diagnostics
NASA Astrophysics Data System (ADS)
Vasil'ev, V. A.; Dobrynina, N. V.
2017-06-01
The article describes a method for evaluating information to solve problems of control, monitoring and diagnostics. The method is needed to reduce the dimensionality of informational indicators of situations, bring them to relative units, calculate generalized information indicators on their basis, rank them by characteristic levels, and calculate the efficiency criterion of a system functioning in real time. On this basis, an information evaluation system has been designed that allows information about the object to be analyzed, processed and assessed. Such an object can be a complex technical, economic or social system. The method and the system based on it can find wide application in the analysis, processing and evaluation of information on the functioning of systems, regardless of their purpose, goals, tasks and complexity. For example, they can be used to assess the innovation capacities of industrial enterprises and management decisions.
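The abstract describes the steps (bring indicators to relative units, combine them into a generalized indicator, rank by levels) without giving formulas, so the sketch below is only a generic min-max normalization with a weighted sum; the indicator values, admissible ranges, weights, and level thresholds are all illustrative assumptions.

```python
import numpy as np

# Hypothetical raw indicators for one monitored object (different units).
raw = np.array([450.0, 0.03, 87.0])          # e.g. temperature, vibration, load
lo  = np.array([300.0, 0.00, 0.0])           # admissible minima
hi  = np.array([600.0, 0.10, 100.0])         # admissible maxima
weights = np.array([0.5, 0.3, 0.2])          # illustrative importance weights

relative = (raw - lo) / (hi - lo)            # indicators in relative units [0, 1]
generalized = float(weights @ relative)      # generalized information indicator

levels = [(0.75, "high"), (0.5, "medium"), (0.0, "low")]
level = next(name for threshold, name in levels if generalized >= threshold)
print(f"generalized indicator = {generalized:.2f} ({level})")
```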
Characterizing air quality data from complex network perspective.
Fan, Xinghua; Wang, Li; Xu, Huihui; Li, Shasha; Tian, Lixin
2016-02-01
Air quality depends mainly on changes in the emission of pollutants and their precursors. Understanding its characteristics is the key to predicting and controlling air quality. In this study, complex networks were built to analyze the topological characteristics of air quality data by the correlation coefficient method. Firstly, PM2.5 (particulate matter with aerodynamic diameter less than 2.5 μm) indexes of eight monitoring sites in Beijing were selected as samples from January 2013 to December 2014. Secondly, the C-C method was applied to determine the structure of the phase space. Points in the reconstructed phase space were taken as nodes of the mapped network. Edges were then created between nodes whose correlation exceeded a critical threshold. Three properties of the constructed networks, degree distribution, clustering coefficient, and modularity, were used to determine the optimal value of the critical threshold. Finally, by analyzing and comparing topological properties, we pointed out that similarities and differences in the constructed complex networks revealed influence factors and their different roles in the real air quality system.
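The mapping from a time series to a correlation-threshold network can be sketched as below; the synthetic PM2.5 series, the embedding parameters (which in the paper come from the C-C method), the 0.95 threshold, and the greedy modularity community detection are illustrative assumptions, not the study's settings.

```python
import numpy as np
import networkx as nx
from networkx.algorithms import community

def embed(x, dim, tau):
    """Delay embedding of a scalar series into dim-dimensional points."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

# Hypothetical daily PM2.5 index series for one monitoring site.
rng = np.random.default_rng(3)
pm25 = 80 + 30 * np.sin(np.arange(730) * 2 * np.pi / 365) + 15 * rng.standard_normal(730)

pts = embed(pm25, dim=4, tau=2)              # embedding parameters are assumptions
corr = np.corrcoef(pts)                      # correlation between phase-space points
threshold = 0.95                             # critical threshold (would be tuned via
G = nx.Graph()                               # degree / clustering / modularity criteria)
G.add_nodes_from(range(len(pts)))
for i in range(len(pts)):
    for j in range(i + 1, len(pts)):
        if corr[i, j] > threshold:
            G.add_edge(i, j)

comms = community.greedy_modularity_communities(G)
print("clustering:", round(nx.transitivity(G), 3),
      "modularity:", round(community.modularity(G, comms), 3))
```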
A Crowdsourcing Framework for Medical Data Sets
Ye, Cheng; Coco, Joseph; Epishova, Anna; Hajaj, Chen; Bogardus, Henry; Novak, Laurie; Denny, Joshua; Vorobeychik, Yevgeniy; Lasko, Thomas; Malin, Bradley; Fabbri, Daniel
2018-01-01
Crowdsourcing services like Amazon Mechanical Turk allow researchers to ask questions to crowds of workers and quickly receive high quality labeled responses. However, crowds drawn from the general public are not suitable for labeling sensitive and complex data sets, such as medical records, due to various concerns. Major challenges in building and deploying a crowdsourcing system for medical data include, but are not limited to: managing access rights to sensitive data and ensuring data privacy controls are enforced; identifying workers with the necessary expertise to analyze complex information; and efficiently retrieving relevant information in massive data sets. In this paper, we introduce a crowdsourcing framework to support the annotation of medical data sets. We further demonstrate a workflow for crowdsourcing clinical chart reviews including (1) the design and decomposition of research questions; (2) the architecture for storing and displaying sensitive data; and (3) the development of tools to support crowd workers in quickly analyzing information from complex data sets. PMID:29888085
Statistical and dynamical remastering of classic exoplanet systems
NASA Astrophysics Data System (ADS)
Nelson, Benjamin Earl
The most powerful constraints on planet formation will come from characterizing the dynamical state of complex multi-planet systems. Unfortunately, with that complexity comes a number of factors that make analyzing these systems a computationally challenging endeavor: the sheer number of model parameters, a wonky-shaped posterior distribution, and hundreds to thousands of time series measurements. In this dissertation, I will review our efforts to improve the statistical analyses of radial velocity (RV) data and their applications to some renowned, dynamically complex exoplanet systems. In the first project (Chapters 2 and 4), we develop a differential evolution Markov chain Monte Carlo (RUN DMC) algorithm to tackle the aforementioned difficult aspects of data analysis. We test the robustness of the algorithm in regards to the number of modeled planets (model dimensionality) and increasing dynamical strength. We apply RUN DMC to a couple classic multi-planet systems and one highly debated system from radial velocity surveys. In the second project (Chapter 5), we analyze RV data of 55 Cancri, a wide binary system known to harbor five planets orbiting the primary. We find the inner-most planet "e" must be coplanar to within 40 degrees of the outer planets, otherwise Kozai-like perturbations will cause the planet to enter the stellar photosphere through its periastron passage. We find the orbits of planets "b" and "c" are apsidally aligned and librating with low to median amplitude (50+/-6 10 degrees), but they are not orbiting in a mean-motion resonance. In the third project (Chapters 3, 4, 6), we analyze RV data of Gliese 876, a four-planet system with three participating in a multi-body resonance, i.e. a Laplace resonance. From a combined observational and statistical analysis computing Bayes factors, we find a four-planet model is favored over one with three planets. Conditioned on this preferred model, we meaningfully constrain the three-dimensional orbital architecture of all the planets orbiting Gliese 876 based on the radial velocity data alone. By demanding orbital stability, we find the resonant planets have low mutual inclinations phi so they must be roughly coplanar (phi_cb = 1.41 (+0.62/-0.57) degrees and phi_be = 3.87 (+1.99/-1.86) degrees). The three-dimensional Laplace argument librates chaotically with an amplitude of 50.5 (+7.9/-10.0) degrees, indicating significant past disk migration and ensuring long-term stability. In the final project (Chapter 7), we analyze the RV data for nu Octantis, a closely separated binary with an alleged planet orbiting interior and retrograde to the binary. Preliminary results place very tight constraints on the planet-binary mutual inclination but no model is dynamically stable beyond 10^5 years. These empirically derived models motivate the need for more sophisticated algorithms to analyze exoplanet data and will provide new challenges for planet formation models.
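RUN DMC itself is the authors' code; as a rough illustration of the underlying differential-evolution MCMC idea (proposals built from the difference of two other chains, in the style of ter Braak), here is a toy sketch on a 2-D Gaussian target. The target, chain count, scaling factor and jitter are illustrative assumptions, and none of the radial-velocity modeling is included.

```python
import numpy as np

def log_post(theta):
    """Toy log-posterior (correlated 2-D Gaussian) standing in for the
    real radial-velocity likelihood."""
    cov = np.array([[1.0, 0.8], [0.8, 1.0]])
    return -0.5 * theta @ np.linalg.solve(cov, theta)

rng = np.random.default_rng(4)
n_chains, n_dim, n_steps = 16, 2, 2000
gamma = 2.38 / np.sqrt(2 * n_dim)            # standard DE-MC scaling factor
chains = rng.normal(size=(n_chains, n_dim))
logp = np.array([log_post(c) for c in chains])
samples = []

for _ in range(n_steps):
    for i in range(n_chains):
        a, b = rng.choice([k for k in range(n_chains) if k != i], 2, replace=False)
        proposal = chains[i] + gamma * (chains[a] - chains[b]) \
                   + 1e-4 * rng.normal(size=n_dim)     # small jitter term
        lp = log_post(proposal)
        if np.log(rng.random()) < lp - logp[i]:        # Metropolis acceptance
            chains[i], logp[i] = proposal, lp
    samples.append(chains.copy())

print(np.vstack(samples[500:]).std(axis=0))            # posterior spread estimate
```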
Small-time Scale Network Traffic Prediction Based on Complex-valued Neural Network
NASA Astrophysics Data System (ADS)
Yang, Bin
2017-07-01
Accurate models play an important role in capturing the significant characteristics of network traffic, analyzing the network dynamics, and improving the forecasting accuracy for system dynamics. In this study, a complex-valued neural network (CVNN) model is proposed to further improve the accuracy of small-time scale network traffic forecasting. An artificial bee colony (ABC) algorithm is employed to optimize the complex-valued and real-valued parameters of the CVNN model. Small-time scale traffic measurement data, namely TCP traffic data, are used to test the performance of the CVNN model. Experimental results reveal that the CVNN model forecasts the small-time scale network traffic measurement data very accurately.
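As an illustration of what "complex-valued" means here, the sketch below implements a small two-layer complex-valued forward pass with a split tanh activation, a common CVNN convention. The weights are random placeholders; in the study, an artificial bee colony search would tune such parameters to minimize forecast error. It is not the paper's exact network.

```python
import numpy as np

def split_tanh(z):
    # "split" activation: apply tanh to real and imaginary parts separately
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def cvnn_forward(x, W1, b1, W2, b2):
    """Two-layer complex-valued network; returns a real-valued traffic forecast."""
    h = split_tanh(W1 @ x + b1)      # complex hidden layer
    y = W2 @ h + b2                  # complex output neuron
    return np.abs(y)                 # magnitude used as the predicted traffic volume

# toy usage: forecast the next sample from a window of 4 past measurements (synthetic data)
rng = np.random.default_rng(0)
x  = rng.random(4) + 1j * rng.random(4)
W1 = rng.standard_normal((8, 4)) + 1j * rng.standard_normal((8, 4))
b1 = np.zeros(8, dtype=complex)
W2 = rng.standard_normal((1, 8)) + 1j * rng.standard_normal((1, 8))
b2 = np.zeros(1, dtype=complex)
print(cvnn_forward(x, W1, b1, W2, b2))
```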
NASA Technical Reports Server (NTRS)
Berg, Melanie; Label, Kenneth; Campola, Michael; Xapsos, Michael
2017-01-01
We propose a method for the application of single event upset (SEU) data towards the analysis of complex systems using transformed reliability models (from the time domain to the particle fluence domain) and space environment data.
USDA-ARS?s Scientific Manuscript database
NMR-based metabolomics plays a major role studying complex living systems. However, very few studies describe the application of this technique to the evaluation of soil metabolome. Here, we introduce a protocol for analyzing the biochemical compounds from agricultural soils where the microbial comm...
Aquatic organisms are continuously exposed to complex mixtures of chemicals, many of which can interfere with their endocrine system, resulting in impaired reproduction, development or survival, among others. In order to analyze the effects and mechanisms of action of estrogen...
Rural Extension Services. Policy Research Working Paper.
ERIC Educational Resources Information Center
Anderson, Jock R.; Feder, Gershon
This paper analyzes the considerations that lead policy makers to undertake extension investments as a key public responsibility, as well as the complex set of factors and intra-agency incentives that explain variations in performance between different extension systems. The goals of extension include transferring knowledge from researchers to…
Decision Making/The Chesapeake Bay. An Interdisciplinary Environmental Education Curriculum Unit.
ERIC Educational Resources Information Center
Maryland Univ., College Park. Science Teaching Center.
This multidisciplinary, self-contained curriculum unit focuses on the management of the Chesapeake Bay, a threatened and complex environmental system. Major unit goals include identifying and analyzing conflicting interests, issues, and public policies concerning the Bay, and determining their effects on people and the environment. The unit…
DOT National Transportation Integrated Search
1976-12-01
This report describes a case study of an air quality analysis prepared by the U.S. Department of Transportation (DOT), Transportation Systems Center (TSC). The site analyzed was the proposed I-83/I-95 interchange in Baltimore, Maryland. This intercha...
Design consideration in constructing high performance embedded Knowledge-Based Systems (KBS)
NASA Technical Reports Server (NTRS)
Dalton, Shelly D.; Daley, Philip C.
1988-01-01
As the hardware trends for artificial intelligence (AI) involve more and more complexity, the process of optimizing the computer system design for a particular problem will also increase in complexity. Space applications of knowledge based systems (KBS) will often require an ability to perform both numerically intensive vector computations and real time symbolic computations. Although parallel machines can theoretically achieve the speeds necessary for most of these problems, if the application itself is not highly parallel, the machine's power cannot be utilized. A scheme is presented which will provide the computer systems engineer with a tool for analyzing machines with various configurations of array, symbolic, scalar, and multiprocessors. High speed networks and interconnections make customized, distributed, intelligent systems feasible for the application of AI in space. The method presented can be used to optimize such AI system configurations and to make comparisons between existing computer systems. It is an open question whether or not, for a given mission requirement, a suitable computer system design can be constructed for any amount of money.
An Integrated Solution for Performing Thermo-fluid Conjugate Analysis
NASA Technical Reports Server (NTRS)
Kornberg, Oren
2009-01-01
A method has been developed which integrates a fluid flow analyzer and a thermal analyzer to produce both steady state and transient results of 1-D, 2-D, and 3-D analysis models. The Generalized Fluid System Simulation Program (GFSSP) is a one dimensional, general purpose fluid analysis code which computes pressures and flow distributions in complex fluid networks. The MSC Systems Improved Numerical Differencing Analyzer (MSC.SINDA) is a one dimensional general purpose thermal analyzer that solves network representations of thermal systems. Both GFSSP and MSC.SINDA have graphical user interfaces which are used to build the respective model and prepare it for analysis. The SINDA/GFSSP Conjugate Integrator (SGCI) is a form-based graphical integration program used to set input parameters for the conjugate analyses and run the models. This paper describes SGCI and its thermo-fluid conjugate analysis techniques and capabilities by presenting results from several example models, including the cryogenic chill down of a copper pipe, a bar between two walls in a fluid stream, and a solid plate creating a phase change in a flowing fluid.
Systems and context modeling approach to requirements analysis
NASA Astrophysics Data System (ADS)
Ahuja, Amrit; Muralikrishna, G.; Patwari, Puneet; Subhrojyoti, C.; Swaminathan, N.; Vin, Harrick
2014-08-01
Ensuring completeness and correctness of the requirements for a complex system such as the SKA is challenging. Current system engineering practice includes developing a stakeholder needs definition, a concept of operations, and defining system requirements in terms of use cases and requirements statements. We present a method that enhances this current practice into a collection of system models with mutual consistency relationships. These include stakeholder goals, needs definition and system-of-interest models, together with a context model that participates in the consistency relationships among these models. We illustrate this approach by using it to analyze the SKA system requirements.
Xie, Mingxia; Wang, Jiayao; Chen, Ke
2017-01-01
This study investigates the basic characteristics and proposes a concept for the complex system of geographical conditions (CSGC). By analyzing the DPSIR model and its correlation with the index system, we selected indexes for geographical conditions according to the resources, ecology, environment, economy and society parameters to build a system. This system consists of four hierarchies: index, classification, element and target levels. We evaluated the elements or indexes of the complex system using the TOPSIS method and a general model coordinating multiple complex systems. On this basis, the coordination analysis experiment of geographical conditions is applied to cities in the Henan province in China. The following conclusions were reached: ① According to the pressure, state and impact of geographical conditions, relatively consistent measures are taken around the city, but with conflicting results. ② The coordination degree of geographical conditions is small among regions showing large differences in classification index value. The degree of coordination of such regions is prone to extreme values; however, the smaller the difference, the larger the coordination degree. ③ The coordinated development of geographical conditions in the Henan province is at the point-axis stage.
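The TOPSIS evaluation used above follows a standard recipe: normalize the decision matrix, weight it, and score each alternative by its distances to the ideal and anti-ideal solutions. A minimal numpy sketch is given below; the example matrix, weights, and benefit/cost flags are hypothetical, not the paper's Henan data.

```python
import numpy as np

def topsis(X, weights, benefit):
    """Rank alternatives (rows of X) against criteria (columns) with TOPSIS.

    X       : (m, n) decision matrix
    weights : (n,) criterion weights summing to 1
    benefit : (n,) booleans, True if larger values are better
    """
    R = X / np.linalg.norm(X, axis=0)          # vector-normalized matrix
    V = R * weights                            # weighted normalized matrix
    ideal      = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti_ideal = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_plus  = np.linalg.norm(V - ideal, axis=1)
    d_minus = np.linalg.norm(V - anti_ideal, axis=1)
    return d_minus / (d_plus + d_minus)        # closeness coefficient in [0, 1]

# hypothetical example: 3 cities scored on 4 geographical-condition indexes
X = np.array([[0.7, 120.0, 3.2, 0.55],
              [0.4, 300.0, 2.1, 0.70],
              [0.9,  80.0, 4.0, 0.35]])
scores = topsis(X, weights=np.array([0.3, 0.2, 0.3, 0.2]),
                benefit=np.array([True, False, True, True]))
print(scores.argsort()[::-1])                  # ranking, best city first
```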
ERIC Educational Resources Information Center
Danish, Joshua A.
2014-01-01
This article reports on a study in which activity theory was used to design, implement, and analyze a 10-week curriculum unit about how honeybees collect nectar with a particular focus on complex systems concepts. Students (n = 42) in a multi-year kindergarten and 1st-grade classroom participated in this study as part of their 10 regular classroom…
NREL's System Advisor Model Simplifies Complex Energy Analysis (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2015-01-01
NREL has developed a tool -- the System Advisor Model (SAM) -- that can help decision makers analyze cost, performance, and financing of any size grid-connected solar, wind, or geothermal power project. Manufacturers, engineering and consulting firms, research and development firms, utilities, developers, venture capital firms, and international organizations use SAM for end-to-end analysis that helps determine whether and how to make investments in renewable energy projects.
NASA Astrophysics Data System (ADS)
Mashkov, O. A.; Samborskiy, I. I.
2009-10-01
The body of papers dealing with functionally stable systems calls for an analysis of the obtained results and for their understanding in the general context of the development and applications of cybernetics. The paper describes this field of science and presents the main results and perspectives of the new theory of functional stability of dynamical systems, as applied to the engineering of remotely piloted aircraft using pseudosatellite technologies.
Trouvé, Hélène; Couturier, Yves; Etheridge, Francis; Saint-Jean, Olivier; Somme, Dominique
2010-06-30
The literature on integration indicates the need for an enhanced theorization of institutional integration. This article proposes path dependence as an analytical framework to study the systems in which integration takes place. PRISMA proposes a model for integrating health and social care services for older adults. This model was initially tested in Quebec. The PRISMA France study gave us an opportunity to analyze institutional integration in France. A qualitative approach was used. Analyses were based on semi-structured interviews with actors of all levels of decision-making, observations of advisory board meetings, and administrative documents. Our analyses revealed the complexity and fragmentation of institutional integration. The path dependency theory, which analyzes the change capacity of institutions by taking into account their historic structures, allows analysis of this situation. The path dependency to the Bismarckian system and the incomplete reforms of gerontological policies generate the coexistence and juxtaposition of institutional systems. In such a context, no institution has sufficient ability to determine gerontology policy and build institutional integration by itself. Using path dependence as an analytical framework helps to understand the reasons why institutional integration is critical to organizational and clinical integration, and the complex construction of institutional integration in France.
Bioengineering thermodynamics of biological cells.
Lucia, Umberto
2015-12-01
Cells are open complex thermodynamic systems. They can also be regarded as complex engines that execute a series of chemical reactions. Energy transformations, thermo-electro-chemical processes and transport phenomena can occur across the cell membranes. Moreover, cells can actively modify their behaviours in relation to changes in their environment. Different thermo-electro-biochemical behaviours occur between health and disease states. All living systems, however, waste heat, which is no more than the result of their internal irreversibility. This heat is dissipated into the environment. But this wasted heat also represents a sort of information, which flows out from the cell toward its environment and is completely accessible to any observer. The analysis of the irreversibility related to this wasted heat can represent a new approach to studying the behaviour of the cells themselves and to controlling their behaviours. This approach allows us to consider living systems as black boxes and to analyze only the inflows and outflows and their changes in relation to modifications of the environment. Therefore, information on the systems can be obtained by analyzing the changes in the cell heat wasted in relation to external perturbations. The bases of bioengineering thermodynamics are summarized and used to analyse possible controls of the cells' behaviours based on the control of the ion fluxes across the cell membranes.
Getting to the core of cadherin complex function in Caenorhabditis elegans.
Hardin, Jeff
2015-01-01
The classic cadherin-catenin complex (CCC) mediates cell-cell adhesion in metazoans. Although substantial insights have been gained by studying the CCC in vertebrate tissue culture, analyzing requirements for and regulation of the CCC in vertebrates remains challenging. Caenorhabditis elegans is a powerful system for connecting the molecular details of CCC function with functional requirements in a living organism. Recent data, using an "angstroms to embryos" approach, have elucidated functions for key residues, conserved across all metazoans, that mediate cadherin/β-catenin binding. Other recent work reveals a novel, potentially ancestral, role for the C. elegans p120ctn homologue in regulating polarization of blastomeres in the early embryo via Cdc42 and the partitioning-defective (PAR)/atypical protein kinase C (aPKC) complex. Finally, recent work suggests that the CCC is trafficked to the cell surface via the clathrin adaptor protein complex 1 (AP-1) in surprising ways. These studies continue to underscore the value of C. elegans as a model system for identifying conserved molecular mechanisms involving the CCC.
Chatterji, Madhabi
2016-12-01
This paper explores avenues for navigating evaluation design challenges posed by complex social programs (CSPs) and their environments when conducting studies that call for generalizable, causal inferences on the intervention's effectiveness. A definition is provided of a CSP drawing on examples from different fields, and an evaluation case is analyzed in depth to derive seven (7) major sources of complexity that typify CSPs, threatening assumptions of textbook-recommended experimental designs for performing impact evaluations. Theoretically-supported, alternative methodological strategies are discussed to navigate assumptions and counter the design challenges posed by the complex configurations and ecology of CSPs. Specific recommendations include: sequential refinement of the evaluation design through systems thinking, systems-informed logic modeling; and use of extended term, mixed methods (ETMM) approaches with exploratory and confirmatory phases of the evaluation. In the proposed approach, logic models are refined through direct induction and interactions with stakeholders. To better guide assumption evaluation, question-framing, and selection of appropriate methodological strategies, a multiphase evaluation design is recommended. Copyright © 2016 Elsevier Ltd. All rights reserved.
Complex-ordered patterns in shaken convection.
Rogers, Jeffrey L; Pesch, Werner; Brausch, Oliver; Schatz, Michael F
2005-06-01
We report and analyze complex patterns observed in a combination of two standard pattern forming experiments. These exotic states are composed of two distinct spatial scales, each displaying a different temporal dependence. The system is a fluid layer experiencing forcing from both a vertical temperature difference and vertical time-periodic oscillations. Depending on the parameters these forcing mechanisms produce fluid motion with either a harmonic or a subharmonic temporal response. Over a parameter range where these mechanisms have comparable influence the spatial scales associated with both responses are found to coexist, resulting in complex, yet highly ordered patterns. Phase diagrams of this region are reported and criteria to define the patterns as quasiperiodic crystals or superlattices are presented. These complex patterns are found to satisfy four-mode (resonant tetrad) conditions. The qualitative difference between the present formation mechanism and the resonant triads ubiquitously used to explain complex-ordered patterns in other nonequilibrium systems is discussed. The only exception to quantitative agreement between our analysis based on Boussinesq equations and laboratory investigations is found to be the result of breaking spatial symmetry in a small parameter region near onset.
AI mass spectrometers for space shuttle health monitoring
NASA Technical Reports Server (NTRS)
Adams, F. W.
1991-01-01
The facility Hazardous Gas Detection System (HGDS) at Kennedy Space Center (KSC) is a mass spectrometer based gas analyzer. Two instruments make up the HGDS, which is installed in a prime/backup arrangement, with the option of using both analyzers on the same sample line, or on two different lines simultaneously. It is used for monitoring the Shuttle during fuel loading, countdown, and drainback, if necessary. The use of complex instruments, operated over many shifts, has caused problems in tracking the status of the ground support equipment (GSE) and the vehicle. A requirement for overall system reliability has been a major force in the development of Shuttle GSE, and is the ultimate driver in the choice to pursue artificial intelligence (AI) techniques for Shuttle and Advanced Launch System (ALS) mass spectrometer systems. Shuttle applications of AI are detailed.
NASA Technical Reports Server (NTRS)
Mata, Carlos T.; Mata, Angel G.; Rakov, V. A.; Nag, A.; Saul, Jon
2012-01-01
A new comprehensive lightning instrumentation system has been designed for Launch Complex 39B (LC39B) at the Kennedy Space Center, Florida. This new instrumentation system includes six synchronized high-speed video cameras; current sensors installed on the nine downconductors of the new lightning protection system (LPS) for LC39B; four dH/dt, 3-axis measurement stations; and five dE/dt stations composed of two antennas each. The LPS received 8 direct lightning strikes (a total of 19 strokes) from March 31 through December 31, 2011. The measured peak currents and locations are compared to those reported by the CGLSS II and the NLDN. Results of the comparison are presented and analyzed in this paper.
Dynamical analysis of uterine cell electrical activity model.
Rihana, S; Santos, J; Mondie, S; Marque, C
2006-01-01
The uterus is a physiological system consisting of a large number of interacting smooth muscle cells. The uterine excitability changes remarkably with time: generally quiescent during pregnancy, the uterus exhibits forceful synchronized contractions at term, leading to fetus expulsion. These changes thus characterize a dynamical system that can be studied through formal mathematical tools. Multiple physiological factors are involved in the regulation process of this complex system. Our aim is to relate the physiological factors to the uterine cell dynamic behaviors. Building on previously presented work, in which the electrical activity of a uterine cell is described by a set of ordinary differential equations, we analyze the impact of physiological parameters on the response of the model, and identify the main subsystems generating the complex uterine electrical activity, with respect to physiological data.
Information of Complex Systems and Applications in Agent Based Modeling.
Bao, Lei; Fritchman, Joseph C
2018-04-18
Information about a system's internal interactions is important to modeling the system's dynamics. This study examines the finer categories of the information definition and explores the features of a type of local information that describes the internal interactions of a system. Based on the results, a dual-space agent and information modeling framework (AIM) is developed by explicitly distinguishing an information space from the material space. The two spaces can evolve both independently and interactively. The dual-space framework can provide new analytic methods for agent based models (ABMs). Three examples are presented including money distribution, individual's economic evolution, and artificial stock market. The results are analyzed in the dual-space, which more clearly shows the interactions and evolutions within and between the information and material spaces. The outcomes demonstrate the wide-ranging applicability of using the dual-space AIMs to model and analyze a broad range of interactive and intelligent systems.
Modulus spectroscopy of grain-grain boundary binary system
NASA Astrophysics Data System (ADS)
Cheng, Peng-Fei; Song, Jiang; Li, Sheng-Tao; Wang, Hui
2015-02-01
Understanding various polarization mechanisms in complex dielectric systems and specifying their physical origins are key issues in dielectric physics. In this paper, four different methods for representing dielectric properties were analyzed and compared. Depending on the details of the system under study, i.e., uniform or non-uniform, it was suggested that different representation approaches should be used to obtain more valuable information. In particular, for the grain-grain boundary binary non-uniform system, the dielectric response was analyzed in detail in terms of modulus spectroscopy (MS). Furthermore, it was found that through MS, the dielectric responses of uniform and non-uniform systems, of grain and grain boundary, and of Maxwell-Wagner polarization and intrinsic polarization can be distinguished. Finally, with the proposed model, the dielectric properties of CaCu3Ti4O12 (CCTO) ceramics were studied. The colossal dielectric constant of CCTO at low frequency was attributed to the pseudo relaxation process of the grain.
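Modulus spectroscopy rests on the relation M* = 1/ε* between the complex electric modulus and the complex permittivity, so M' = ε'/(ε'² + ε''²) and M'' = ε''/(ε'² + ε''²). The sketch below performs that conversion on a made-up Debye-like spectrum with a colossal low-frequency permittivity, loosely in the spirit of the CCTO discussion; the numbers are illustrative only.

```python
import numpy as np

def to_modulus(eps1, eps2):
    """Electric modulus M* = 1/eps* for eps* = eps' - j*eps'' (eps'' >= 0)."""
    denom = eps1 ** 2 + eps2 ** 2
    return eps1 / denom, eps2 / denom          # M', M''

# hypothetical Debye-like permittivity with a colossal low-frequency plateau
freq  = np.logspace(0, 6, 13)                  # Hz
w_tau = 2 * np.pi * freq * 1e-3                # omega * tau, with tau = 1 ms (assumed)
eps1  = 10.0 + 1e4 / (1 + w_tau ** 2)          # eps' (real permittivity)
eps2  = 1e4 * w_tau / (1 + w_tau ** 2)         # eps'' (dielectric loss)
M1, M2 = to_modulus(eps1, eps2)
print(M2.max())   # the M'' peak reveals the relaxation masked by the huge eps'
```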
Digital Signal Processing and Control for the Study of Gene Networks
NASA Astrophysics Data System (ADS)
Shin, Yong-Jun
2016-04-01
Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.
Digital Signal Processing and Control for the Study of Gene Networks.
Shin, Yong-Jun
2016-04-22
Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.
NASA Astrophysics Data System (ADS)
Kassem, M.; Soize, C.; Gagliardini, L.
2009-06-01
In this paper, an energy-density field approach applied to the vibroacoustic analysis of complex industrial structures in the low- and medium-frequency ranges is presented. This approach uses a statistical computational model. The analyzed system consists of an automotive vehicle structure coupled with its internal acoustic cavity. The objective of this paper is to make use of the statistical properties of the frequency response functions of the vibroacoustic system observed from previous experimental and numerical work. The frequency response functions are expressed in terms of a dimensionless matrix which is estimated using the proposed energy approach. Using this dimensionless matrix, a simplified vibroacoustic model is proposed.
Nonequilibrium transitions in complex networks: A model of social interaction
NASA Astrophysics Data System (ADS)
Klemm, Konstantin; Eguíluz, Víctor M.; Toral, Raúl; San Miguel, Maxi
2003-02-01
We analyze the nonequilibrium order-disorder transition of Axelrod’s model of social interaction in several complex networks. In a small-world network, we find a transition between an ordered homogeneous state and a disordered state. The transition point is shifted by the degree of spatial disorder of the underlying network, the network disorder favoring ordered configurations. In random scale-free networks the transition is only observed for finite size systems, showing system size scaling, while in the thermodynamic limit only ordered configurations are always obtained. Thus, in the thermodynamic limit the transition disappears. However, in structured scale-free networks, the phase transition between an ordered and a disordered phase is restored.
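For readers unfamiliar with Axelrod's model: each node carries a vector of F cultural features with q possible traits, and a node imitates one differing feature of a neighbor with probability equal to their current cultural overlap. A minimal sketch on a small-world substrate is shown below; the graph and parameter values are illustrative, not those of the paper.

```python
import numpy as np
import networkx as nx

def axelrod(G, F=5, q=10, steps=100_000, seed=0):
    """Run Axelrod's culture model on graph G; return the final feature matrix."""
    rng = np.random.default_rng(seed)
    nodes = list(G.nodes())
    index = {n: i for i, n in enumerate(nodes)}
    culture = rng.integers(0, q, size=(len(nodes), F))
    for _ in range(steps):
        i = rng.choice(len(nodes))
        neighbors = list(G.neighbors(nodes[i]))
        if not neighbors:
            continue
        j = index[neighbors[rng.choice(len(neighbors))]]
        shared = culture[i] == culture[j]
        overlap = shared.mean()
        # interact with probability equal to the cultural overlap
        if 0 < overlap < 1 and rng.uniform() < overlap:
            k = rng.choice(np.flatnonzero(~shared))   # copy one differing feature
            culture[i, k] = culture[j, k]
    return culture

# illustrative small-world substrate, as in the abstract's first case
G = nx.watts_strogatz_graph(n=200, k=4, p=0.1, seed=1)
final = axelrod(G)
print(len({tuple(row) for row in final}))   # number of surviving cultural variants
```

Counting the surviving distinct cultural vectors gives a rough order parameter: close to one dominant culture in the ordered phase, many small domains in the disordered phase.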
Digital Signal Processing and Control for the Study of Gene Networks
Shin, Yong-Jun
2016-01-01
Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks. PMID:27102828
Retinal Connectomics: Towards Complete, Accurate Networks
Marc, Robert E.; Jones, Bryan W.; Watt, Carl B.; Anderson, James R.; Sigulinsky, Crystal; Lauritzen, Scott
2013-01-01
Connectomics is a strategy for mapping complex neural networks based on high-speed automated electron optical imaging, computational assembly of neural data volumes, web-based navigational tools to explore 10^12–10^15 byte (terabyte to petabyte) image volumes, and annotation and markup tools to convert images into rich networks with cellular metadata. These collections of network data and associated metadata, analyzed using tools from graph theory and classification theory, can be merged with classical systems theory, giving a more completely parameterized view of how biologic information processing systems are implemented in retina and brain. Networks have two separable features: topology and connection attributes. The first findings from connectomics strongly validate the idea that the topologies of complete retinal networks are far more complex than the simple schematics that emerged from classical anatomy. In particular, connectomics has permitted an aggressive refactoring of the retinal inner plexiform layer, demonstrating that network function cannot be simply inferred from stratification; exposing the complex geometric rules for inserting different cells into a shared network; revealing unexpected bidirectional signaling pathways between mammalian rod and cone systems; documenting selective feedforward systems, novel candidate signaling architectures, new coupling motifs, and the highly complex architecture of the mammalian AII amacrine cell. This is but the beginning, as the underlying principles of connectomics are readily transferrable to non-neural cell complexes and provide new contexts for assessing intercellular communication. PMID:24016532
Magnetic Analyzer Mavr for Study of Exotic Weakly Bound Nuclei
NASA Astrophysics Data System (ADS)
Maslov, V. A.; Kazacha, V. I.; Kolesov, I. V.; Lukyanov, S. M.; Melnikov, V. N.; Osipov, N. F.; Penionzhkevich, Yu. E.; Skobelev, N. K.; Sobolev, Yu. G.; Voskoboinik, E. I.
2015-06-01
A project of the high-resolution magnetic analyzer MAVR is proposed. The analyzer will comprise new magnetic optical and detecting systems for separation and identification of reaction products in a wide range of masses (5-150) and charges (1-60). The magnetic optical system consists of the MSP-144 magnet and a doublet of quadrupole lenses. This will allow the solid angle of the spectrometer to be increased by an order of magnitude up to 30 msr. The magnetic analyzer will have a high momentum resolution (10^-4) and high focal-plane dispersion (1.9 m). It will allow products of nuclear reactions at energies up to 30 MeV/nucleon to be detected with the charge resolution ~1/60. Implementation of the project is divided into two stages: conversion of the magnetic analyzer proper and construction of the nuclear reaction products identification system. The MULTI detecting system is being developed for the MAVR magnetic analyzer to allow detection of nuclear reaction products and their identification by charge Q, atomic number Z, and mass A with a high absolute accuracy. The identification will be performed by measuring the energy loss (ΔE), time of flight (TOF), and total kinetic energy (TKE) of reaction products. The particle trajectories in the analyzer will also be determined using the drift chamber developed jointly with GANIL. The MAVR analyzer will operate in both primary beams of heavy ions and beams of radioactive nuclei produced by the U400 - U400M acceleration complex. It will also be used for measuring energy spectra of nuclear reaction products and as an energy monochromator.
Study of Exotic Weakly Bound Nuclei Using Magnetic Analyzer Mavr
NASA Astrophysics Data System (ADS)
Maslov, V. A.; Kazacha, V. I.; Kolesov, I. V.; Lukyanov, S. M.; Melnikov, V. N.; Osipov, N. F.; Penionzhkevich, Yu. E.; Skobelev, N. K.; Sobolev, Yu. G.; Voskoboinik, E. I.
2016-06-01
A project of the high-resolution magnetic analyzer MAVR is proposed. The analyzer will comprise new magnetic optical and detecting systems for separation and identification of reaction products in a wide range of masses (5-150) and charges (1-60). The magnetic optical system consists of the MSP-144 magnet and a doublet of quadrupole lenses. This will allow the solid angle of the spectrometer to be increased by an order of magnitude up to 30 msr. The magnetic analyzer will have a high momentum resolution (10^-4) and high focal-plane dispersion (1.9 m). It will allow products of nuclear reactions at energies up to 30 MeV/nucleon to be detected with the charge resolution ∼1/60. Implementation of the project is divided into two stages: conversion of the magnetic analyzer proper and construction of the nuclear reaction products identification system. The MULTI detecting system is being developed for the MAVR magnetic analyzer to allow detection of nuclear reaction products and their identification by charge Q, atomic number Z, and mass A with a high absolute accuracy. The identification will be performed by measuring the energy loss (ΔE), time of flight (TOF), and total kinetic energy (TKE) of reaction products. The particle trajectories in the analyzer will also be determined using the drift chamber developed jointly with GANIL. The MAVR analyzer will operate in both primary beams of heavy ions and beams of radioactive nuclei produced by the U400 - U400M acceleration complex. It will also be used for measuring energy spectra of nuclear reaction products and as an energy monochromator.
Phase transitions in Pareto optimal complex networks
NASA Astrophysics Data System (ADS)
Seoane, Luís F.; Solé, Ricard
2015-09-01
The organization of interactions in complex systems can be described by networks connecting different units. These graphs are useful representations of the local and global complexity of the underlying systems. The origin of their topological structure can be diverse, resulting from different mechanisms including multiplicative processes and optimization. In spatial networks or in graphs where cost constraints are at work, as it occurs in a plethora of situations from power grids to the wiring of neurons in the brain, optimization plays an important part in shaping their organization. In this paper we study network designs resulting from a Pareto optimization process, where different simultaneous constraints are the targets of selection. We analyze three variations on a problem, finding phase transitions of different kinds. Distinct phases are associated with different arrangements of the connections, but the need of drastic topological changes does not determine the presence or the nature of the phase transitions encountered. Instead, the functions under optimization do play a determinant role. This reinforces the view that phase transitions do not arise from intrinsic properties of a system alone, but from the interplay of that system with its external constraints.
Integrating a Genetic Algorithm Into a Knowledge-Based System for Ordering Complex Design Processes
NASA Technical Reports Server (NTRS)
Rogers, James L.; McCulley, Collin M.; Bloebaum, Christina L.
1996-01-01
The design cycle associated with large engineering systems requires an initial decomposition of the complex system into design processes which are coupled through the transference of output data. Some of these design processes may be grouped into iterative subcycles. In analyzing or optimizing such a coupled system, it is essential to be able to determine the best ordering of the processes within these subcycles to reduce design cycle time and cost. Many decomposition approaches assume the capability is available to determine what design processes and couplings exist and what order of execution will be imposed during the design cycle. Unfortunately, this is often a complex problem and beyond the capabilities of a human design manager. A new feature, a genetic algorithm, has been added to DeMAID (Design Manager's Aid for Intelligent Decomposition) to allow the design manager to rapidly examine many different combinations of ordering processes in an iterative subcycle and to optimize the ordering based on cost, time, and iteration requirements. Two sample test cases are presented to show the effects of optimizing the ordering with a genetic algorithm.
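The idea of letting a genetic algorithm search over process orderings can be sketched with a toy objective that simply counts feedback couplings (outputs that would have to flow backwards in the chosen order and so force iteration). The coupling matrix, operators, and fitness below are hypothetical stand-ins, not the DeMAID implementation, whose objective also folds in cost, time, and iteration requirements.

```python
import random

# coupling[i][j] == 1 means process j feeds data to process i (hypothetical matrix)
coupling = [[0, 1, 0, 0, 1],
            [0, 0, 1, 0, 0],
            [1, 0, 0, 1, 0],
            [0, 0, 0, 0, 1],
            [0, 1, 0, 0, 0]]
N = len(coupling)

def feedbacks(order):
    """Count couplings that point 'backwards' (iteration-causing) in this ordering."""
    pos = {p: k for k, p in enumerate(order)}
    return sum(coupling[i][j] for i in range(N) for j in range(N)
               if coupling[i][j] and pos[j] > pos[i])

def crossover(a, b):
    """Order crossover: keep a slice of parent a, fill the rest in parent b's order."""
    lo, hi = sorted(random.sample(range(N), 2))
    child = a[lo:hi]
    rest = [p for p in b if p not in child]
    return rest[:lo] + child + rest[lo:]

def mutate(order, p=0.2):
    """Occasionally swap two processes to keep the population diverse."""
    if random.random() < p:
        i, j = random.sample(range(N), 2)
        order[i], order[j] = order[j], order[i]
    return order

random.seed(0)
pop = [random.sample(range(N), N) for _ in range(30)]
for _ in range(200):                      # generations
    pop.sort(key=feedbacks)
    elite = pop[:10]
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(20)]
best = min(pop, key=feedbacks)
print(best, feedbacks(best))
```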
Metabolic Compartmentation – A System Level Property of Muscle Cells
Saks, Valdur; Beraud, Nathalie; Wallimann, Theo
2008-01-01
Problems of quantitative investigation of intracellular diffusion and compartmentation of metabolites are analyzed. Principal controversies in recently published analyses of these problems for the living cells are discussed. It is shown that the formal theoretical analysis of diffusion of metabolites based on Fick's equation and using fixed diffusion coefficients for diluted homogenous aqueous solutions, but applied for biological systems in vivo without any comparison with experimental results, may lead to misleading conclusions, which are contradictory to most biological observations. However, if the same theoretical methods are used for analysis of actual experimental data, the apparent diffusion constants obtained are orders of magnitude lower than those in diluted aqueous solutions. Thus, it can be concluded that local restrictions of diffusion of metabolites in a cell are a system-level property caused by the complex structural organization of the cells, macromolecular crowding, cytoskeletal networks and organization of metabolic pathways into multienzyme complexes and metabolons. This results in microcompartmentation of metabolites, their channeling between enzymes and in modular organization of cellular metabolic networks. The perspectives of further studies of these complex intracellular interactions in the framework of Systems Biology are discussed. PMID:19325782
The architecture of Newton, a general-purpose dynamics simulator
NASA Technical Reports Server (NTRS)
Cremer, James F.; Stewart, A. James
1989-01-01
The architecture for Newton, a general-purpose system for simulating the dynamics of complex physical objects, is described. The system automatically formulates and analyzes equations of motion, and performs automatic modification of the system of equations when necessitated by changes in kinematic relationships between objects. Impact and temporary contact are handled, although only using simple models. User-directed influence of simulations is achieved using Newton's module, which can be used to experiment with the control of many-degree-of-freedom articulated objects.
Study of the GEM detector performance in BM@N experiment
NASA Astrophysics Data System (ADS)
Bazylev, Sergei; Kapishin, Mikhail; Kapusniak, Kacper; Karjavine, Vladimir; Khabarov, Sergei; Kolesnikov, Alexander; Kulish, Elena; Lenivenko, Vasilisa; Makankin, Alexander; Maksymchuk, Anna; Mehl, Bertrand; De Oliveira, Rui; Palchik, Vladimir; Pokatashkin, Gleb; Rodriguez, A.; Rufanov, Igor; Shutov, Alexander; Slepnev, Ilya; Slepnev, Vyacheslav; Vasiliev, Sergei; Zinchenko, Alexander
2018-04-01
BM@N is the fixed target experiment at the accelerator complex NICA-Nuclotron aimed at studying nuclear matter in relativistic heavy ion collisions. Triple-GEM detectors were identified as appropriate for the BM@N tracking system located inside the analyzing magnet. Seven GEM chambers are integrated into the BM@N experimental setup and data acquisition system. The GEM construction, main characteristics, and first results of the GEM tracking system performance obtained in the technical run with the deuteron beam are briefly reviewed.
Transient Oscillations in Mechanical Systems of Automatic Control with Random Parameters
NASA Astrophysics Data System (ADS)
Royev, B.; Vinokur, A.; Kulikov, G.
2018-04-01
Transient oscillations in mechanical systems of automatic control with random parameters are a relevant but insufficiently studied issue. In this paper, a modified spectral method was applied to investigate the problem. The nature of the dynamic processes and the phase portraits are analyzed depending on the amplitude and frequency of the external influence. It is evident from the obtained results that the dynamic phenomena occurring in systems with random parameters under external influence are complex, and their study requires further investigation.
Dynamical singularities of glassy systems in a quantum quench.
Obuchi, Tomoyuki; Takahashi, Kazutaka
2012-11-01
We present a prototype of behavior of glassy systems driven by quantum dynamics in a quenching protocol by analyzing the random energy model in a transverse field. We calculate several types of dynamical quantum amplitude and find a freezing transition at some critical time. The behavior is understood by the partition-function zeros in the complex temperature plane. We discuss the properties of the freezing phase as a dynamical chaotic phase, which are contrasted to those of the spin-glass phase in the static system.
[Financing of inpatient orthopedics and trauma surgery in the G-DRG system 2010].
Franz, D; Schemmann, F; Roeder, N; Mahlke, L
2010-08-01
The German DRG (diagnosis-related groups) system forms the basis for billing inpatient hospital services. It includes not only the case groups (G-DRGs), but also additional and innovation payments. This paper analyzes and evaluates the relevant developments of the G-DRG System 2010 for orthopedics and traumatology from the medical and classification perspectives. Analyses of relevant diagnoses, medical procedures and G-DRGs in the versions 2009 and 2010, based on the publications of the German DRG institute (InEK) and the German Institute of Medical Documentation and Information (DIMDI), were carried out. The DRG catalog has grown by 8 DRGs to 1,200 G-DRGs. A number of codes for surgical measures have been newly established or modified. Here, the identification and the correct, performance-based mapping of complex and elaborate scenarios were again the focus of the restructuring of the G-DRG system. The G-DRG structure in orthopedics and traumatology has been changed, especially in the areas of spinal surgery and surgery of the upper and lower extremities. The actual impact of the changes may vary depending on the individual hospital's services. For the first time since the introduction of the G-DRG system, the purely numerical changes at the level of the DRGs themselves are so marginal that only part of the DRG users in the hospitals will register them. The changes implemented not only a high selectivity between complex and less complex scenarios, but in part also an unintended and unjustified revaluation of less complex measures. The G-DRG system has again gained complexity. In particular, the G-DRG allocation of spinal surgery and of multiple surgical interventions of the upper and/or lower extremities has reached such complexity that only a few DRG users can follow it.
Learning the organization: a model for health system analysis for new nurse administrators.
Clark, Mary Jo
2004-01-01
Health systems are large and complex organizations in which multiple components and processes influence system outcomes. In order to effectively position themselves in such organizations, nurse administrators new to a system must gain a rapid understanding of overall system operation. Such understanding is facilitated by use of a model for system analysis. The model presented here examines the dynamic interrelationships between and among internal and external elements as they affect system performance. External elements to be analyzed include environmental factors and characteristics of system clientele. Internal elements flow from the mission and goals of the system and include system culture, services, resources, and outcomes.
Analyzing and Detecting Problems in Systems of Systems
NASA Technical Reports Server (NTRS)
Lindvall, Mikael; Ackermann, Christopher; Stratton, William C.; Sibol, Deane E.; Godfrey, Sally
2008-01-01
Many software systems are evolving complex systems of systems (SoS) for which inter-system communication is mission-critical. Evidence indicates that transmission failures and performance issues are not uncommon occurrences. In a NASA-supported Software Assurance Research Program (SARP) project, we are researching a new approach addressing such problems. In this paper, we present an approach for analyzing inter-system communications with the goal of uncovering both transmission errors and performance problems. Our approach consists of a visualization and an evaluation component. While the visualization of the observed communication aims to facilitate understanding, the evaluation component automatically checks the conformance of an observed communication (actual) to a desired one (planned). The actual and the planned are represented as sequence diagrams. The evaluation algorithm checks the conformance of the actual to the planned diagram. We have applied our approach to the communication of aerospace systems and were successful in detecting and resolving even subtle and long existing transmission problems.
Learning surface molecular structures via machine vision
NASA Astrophysics Data System (ADS)
Ziatdinov, Maxim; Maksov, Artem; Kalinin, Sergei V.
2017-08-01
Recent advances in high resolution scanning transmission electron and scanning probe microscopies have allowed researchers to perform measurements of materials structural parameters and functional properties in real space with a picometre precision. In many technologically relevant atomic and/or molecular systems, however, the information of interest is distributed spatially in a non-uniform manner and may have a complex multi-dimensional nature. One of the critical issues, therefore, lies in being able to accurately identify ('read out') all the individual building blocks in different atomic/molecular architectures, as well as more complex patterns that these blocks may form, on a scale of hundreds and thousands of individual atomic/molecular units. Here we employ machine vision to read and recognize complex molecular assemblies on surfaces. Specifically, we combine a Markov random field model and convolutional neural networks to classify structural and rotational states of all individual building blocks in a molecular assembly on a metallic surface visualized in high-resolution scanning tunneling microscopy measurements. We show how the obtained full decoding of the system allows us to directly construct a pair density function (a centerpiece in the analysis of the disorder-property relationship paradigm), to analyze spatial correlations between multiple order parameters at the nanoscale, and to elucidate reaction pathways involving molecular conformation changes. The method represents a significant shift in our way of analyzing atomic and/or molecular resolved microscopic images and can be applied to a variety of other microscopic measurements of structural, electronic, and magnetic orders in different condensed matter systems.
How Complex, Probable, and Predictable is Genetically Driven Red Queen Chaos?
Duarte, Jorge; Rodrigues, Carla; Januário, Cristina; Martins, Nuno; Sardanyés, Josep
2015-12-01
Coevolution between two antagonistic species has been widely studied theoretically for both ecologically- and genetically-driven Red Queen dynamics. A typical outcome of these systems is an oscillatory behavior causing an endless series of one species' adaptations and the other's counter-adaptations. More recently, a mathematical model combining a three-species food chain system with an adaptive dynamics approach revealed genetically driven chaotic Red Queen coevolution. In the present article, we analyze this mathematical model mainly focusing on the impact of the species' rates of evolution (mutation rates) on the dynamics. Firstly, we analytically prove the boundedness of the trajectories of the chaotic attractor. The complexity of the coupling between the dynamical variables is quantified using observability indices. Using symbolic dynamics theory, we quantify the complexity of genetically driven Red Queen chaos by computing the topological entropy of existing one-dimensional iterated maps using Markov partitions. Codimension-two bifurcation diagrams are also built from the period ordering of the orbits of the maps. Then, we study the predictability of the Red Queen chaos, found in narrow regions of mutation rates. To extend the previous analyses, we also computed the likelihood of finding chaos in a given region of the parameter space while varying other model parameters simultaneously. Such analyses allowed us to compute a mean predictability measure for the system in the explored region of the parameter space. We found that genetically driven Red Queen chaos, although being restricted to small regions of the analyzed parameter space, might be highly unpredictable.
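The topological entropy mentioned above is, for a one-dimensional map admitting a Markov partition, the logarithm of the spectral radius of the partition's transition matrix. The short sketch below shows that computation for a hypothetical three-interval partition; it is not the food-chain model itself.

```python
import numpy as np

def topological_entropy(T):
    """log of the largest eigenvalue magnitude of a Markov-partition transition matrix."""
    return float(np.log(np.max(np.abs(np.linalg.eigvals(T)))))

# hypothetical 3-interval Markov partition of a unimodal return map
T = np.array([[1, 1, 0],
              [0, 0, 1],
              [1, 1, 0]])
print(topological_entropy(T))   # > 0 indicates positive-entropy (chaotic) dynamics
```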
Guyon, Hervé; Falissard, Bruno; Kop, Jean-Luc
2017-01-01
Network Analysis is considered a new method that challenges Latent Variable models in inferring psychological attributes. With Network Analysis, psychological attributes are derived from a complex system of components without the need to call on any latent variables. But the ontological status of psychological attributes is not adequately defined with Network Analysis, because a psychological attribute is both a complex system and a property emerging from this complex system. The aim of this article is to reappraise the legitimacy of latent variable models by engaging in an ontological and epistemological discussion on psychological attributes. Psychological attributes relate to the mental equilibrium of individuals embedded in their social interactions, as robust attractors within complex dynamic processes with emergent properties, distinct from physical entities located in precise areas of the brain. Latent variables thus possess legitimacy, because the emergent properties can be conceptualized and analyzed on the sole basis of their manifestations, without exploring the upstream complex system. However, in opposition to the usual Latent Variable models, this article is in favor of the integration of a dynamic system of manifestations. Latent Variable models and Network Analysis thus appear as complementary approaches. New approaches combining Latent Network Models and Network Residuals are certainly a promising new way to infer psychological attributes, placing psychological attributes in an inter-subjective dynamic approach. Pragmatism-realism appears as the epistemological framework required if we are to use latent variables as representations of psychological attributes. PMID:28572780
NASA Astrophysics Data System (ADS)
Varela, Consuelo; Tarquis, Ana M.; Blanco-Gutiérrez, Irene; Estebe, Paloma; Toledo, Marisol; Martorano, Lucieta
2015-04-01
Social-ecological systems are linked complex systems that represent interconnected human and biophysical processes evolving and adapting across temporal and spatial scales. In the real world, social-ecological systems pose substantial challenges for modeling. In this regard, Fuzzy Cognitive Maps (FCMs) have proven to be a useful method for capturing the functioning of this type of system. FCMs are a semi-quantitative type of cognitive map that represents a system composed of relevant factors and weighted links showing the strength and direction of cause-effect relationships among factors. Therefore, FCMs can be interpreted as complex system structures or complex networks. In this sense, recent research has applied complex network concepts to the analysis of FCMs that represent social-ecological systems. Key to the FCM tool is its potential to allow feedback loops and to include stakeholder knowledge in its construction. Previous research has also demonstrated the potential of FCMs to represent system dynamics and simulate the effects of changes in the system, such as policy interventions. To illustrate this analysis, we have developed a series of participatory FCMs for the study of the ecological and human systems related to biodiversity conservation in two case studies in the Amazonian region, the Bolivian lowlands of Guarayos and the Brazilian Tapajos National Forest. The research is carried out in the context of the EU project ROBIN and is based on a series of stakeholder workshops to analyze the current state of the socio-ecological environment in the Amazonian forest, reflecting conflicts and challenges for biodiversity conservation and human development. Stakeholders included all relevant actors in the local case studies, namely farmers, environmental groups, producer organizations, local and provincial authorities and scientists. In both case studies we illustrate the use of complex network concepts, such as the adjacency matrix and centrality properties (e.g., degree centrality, PageRank, betweenness centrality). Different measures of network centrality provide evidence that deforestation and loss of biodiversity are the most relevant factors in the FCMs of the two case studies analyzed. In both cases agricultural expansion emerges as a key driver of deforestation. The lack of policy coordination and weak implementation and enforcement are also highly influential factors. The analysis of the system's dynamics suggests that in the case of Bolivia forest fires and deforestation are likely to continue in the immediate future as illegal activities are maintained and poverty increases. In the case of Brazil a decrease in available viable economic activities is driving further deforestation and ecosystem services loss. Overall, the research shows how using FCMs together with complex network analysis can support policy development by identifying key elements and processes upon which policy makers and institutions can take action. Acknowledgements: The authors would like to acknowledge the EU project ROBIN (The Role of Biodiversity in Climate Change Mitigation, EC FP7, no 283093) and the Spanish project AL14-PID-12 (Biodiversidad y cambio climático en la Amazonía: Perspectivas socio-económicas y ambientales) of the UPM Latin America Cooperation Program for funding this research.
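Because an FCM is simply a weighted directed graph, the adjacency-matrix and centrality analysis described above can be reproduced with standard network tools. The sketch below uses networkx on a handful of hypothetical factors and weights, not the values elicited in the ROBIN workshops.

```python
import networkx as nx

# hypothetical FCM edges: (cause, effect, signed weight in [-1, 1])
edges = [
    ("agricultural expansion",  "deforestation",               0.8),
    ("deforestation",           "loss of biodiversity",        0.9),
    ("weak policy enforcement", "illegal activities",          0.6),
    ("illegal activities",      "deforestation",               0.5),
    ("deforestation",           "ecosystem services",         -0.7),
    ("ecosystem services",      "viable economic activities",  0.4),
]

G = nx.DiGraph()
G.add_weighted_edges_from(edges)

# use absolute weights as influence strengths for the centrality measures
absG = nx.DiGraph()
absG.add_edges_from((u, v, {"weight": abs(w)}) for u, v, w in edges)

print(nx.degree_centrality(G))                  # how connected each factor is
print(nx.pagerank(absG, weight="weight"))       # factors accumulating influence
print(nx.betweenness_centrality(absG))          # factors bridging cause-effect chains
```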
Using activity theory to study cultural complexity in medical education.
Frambach, Janneke M; Driessen, Erik W; van der Vleuten, Cees P M
2014-06-01
There is a growing need for research on culture, cultural differences and cultural effects of globalization in medical education, but these are complex phenomena to investigate. Socio-cultural activity theory seems a useful framework to study cultural complexity, because it matches current views on culture as a dynamic process situated in a social context, and has been valued in diverse fields for yielding rich understandings of complex issues and key factors involved. This paper explains how activity theory can be used in (cross-)cultural medical education research. We discuss activity theory's theoretical background and principles, and we show how these can be applied to the cultural research practice by discussing the steps involved in a cross-cultural study that we conducted, from formulating research questions to drawing conclusions. We describe how the activity system, the unit of analysis in activity theory, can serve as an organizing principle to grasp cultural complexity. We end with reflections on the theoretical and practical use of activity theory for cultural research and note that it is not a shortcut to capture cultural complexity: it is a challenge for researchers to determine the boundaries of their study and to analyze and interpret the dynamics of the activity system.
Patient Safety: The Role of Human Factors and Systems Engineering
Carayon, Pascale; Wood, Kenneth E.
2011-01-01
Patient safety is a global challenge that requires knowledge and skills in multiple areas, including human factors and systems engineering. In this chapter, numerous conceptual approaches and methods for analyzing, preventing and mitigating medical errors are described. Given the complexity of healthcare work systems and processes, we emphasize the need for increasing partnerships between the health sciences and human factors and systems engineering to improve patient safety. Those partnerships will be able to develop and implement the system redesigns that are necessary to improve healthcare work systems and processes for patient safety. PMID:20543237
Interesting examples of supervised continuous variable systems
NASA Technical Reports Server (NTRS)
Chase, Christopher; Serrano, Joe; Ramadge, Peter
1990-01-01
The authors analyze two simple deterministic flow models for multiple buffer servers which are examples of the supervision of continuous variable systems by a discrete controller. These systems exhibit what may be regarded as the two extremes of complexity of the closed loop behavior: one is eventually periodic, the other is chaotic. The first example exhibits chaotic behavior that could be characterized statistically. The dual system, the switched server system, exhibits very predictable behavior, which is modeled by a finite state automaton. This research has application to multimodal discrete time systems where the controller can choose from a set of transition maps to implement.
Low-Complexity Polynomial Channel Estimation in Large-Scale MIMO With Arbitrary Statistics
NASA Astrophysics Data System (ADS)
Shariati, Nafiseh; Bjornson, Emil; Bengtsson, Mats; Debbah, Merouane
2014-10-01
This paper considers pilot-based channel estimation in large-scale multiple-input multiple-output (MIMO) communication systems, also known as massive MIMO, where there are hundreds of antennas at one side of the link. Motivated by the fact that computational complexity is one of the main challenges in such systems, a set of low-complexity Bayesian channel estimators, coined Polynomial ExpAnsion CHannel (PEACH) estimators, are introduced for arbitrary channel and interference statistics. While the conventional minimum mean square error (MMSE) estimator has cubic complexity in the dimension of the covariance matrices, due to an inversion operation, our proposed estimators significantly reduce this to square complexity by approximating the inverse by an L-degree matrix polynomial. The coefficients of the polynomial are optimized to minimize the mean square error (MSE) of the estimate. We show numerically that near-optimal MSEs are achieved with low polynomial degrees. We also derive the exact computational complexity of the proposed estimators, in terms of floating-point operations (FLOPs), by which we prove that the proposed estimators outperform the conventional estimators in large-scale MIMO systems of practical dimensions while providing reasonable MSEs. Moreover, we show that L need not scale with the system dimensions to maintain a certain normalized MSE. By analyzing different interference scenarios, we observe that the relative MSE loss of using the low-complexity PEACH estimators is smaller in realistic scenarios with pilot contamination. On the other hand, PEACH estimators are not well suited for noise-limited scenarios with high pilot power; therefore, we also introduce the low-complexity diagonalized estimator that performs well in this regime. Finally, we ...
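The complexity saving comes from replacing the matrix inverse in the linear MMSE estimate h_hat = R (R + S)^{-1} y with a low-degree matrix polynomial evaluated through matrix-vector products. The sketch below uses a truncated Neumann series with uniform weights as a stand-in; the PEACH estimators of the paper instead optimize the polynomial coefficients for minimum MSE, and in practice the largest eigenvalue would be bounded cheaply rather than computed exactly. All dimensions and covariances are synthetic.

```python
import numpy as np

def poly_inv_apply(A, y, L, alpha=None):
    """Approximate A^{-1} y with a truncated Neumann series of degree L.

    Only matrix-vector products are used, so the cost is O(L * n^2)
    instead of the O(n^3) of a direct inversion.
    """
    if alpha is None:
        alpha = 1.0 / np.max(np.linalg.eigvalsh(A))   # ensures convergence for A > 0
    out = np.zeros_like(y)
    term = y.copy()
    for _ in range(L + 1):
        out += term
        term = term - alpha * (A @ term)               # multiply by (I - alpha*A)
    return alpha * out

# synthetic massive-MIMO style example: h_hat = R (R + S)^{-1} y
rng = np.random.default_rng(0)
n = 100
B = rng.standard_normal((n, n))
R = B @ B.T / n + np.eye(n)          # channel covariance (synthetic)
S = 0.1 * np.eye(n)                  # noise/interference covariance (synthetic)
y = rng.standard_normal(n)
h_poly  = R @ poly_inv_apply(R + S, y, L=16)
h_exact = R @ np.linalg.solve(R + S, y)
print(np.linalg.norm(h_poly - h_exact) / np.linalg.norm(h_exact))  # relative error
```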
Performance Prediction of a MongoDB-Based Traceability System in Smart Factory Supply Chains
Kang, Yong-Shin; Park, Il-Ha; Youm, Sekyoung
2016-01-01
In the future, with the advent of the smart factory era, manufacturing and logistics processes will become more complex, and the complexity and criticality of traceability will further increase. This research aims at developing a performance assessment method to verify scalability when implementing traceability systems based on key technologies for smart factories, such as the Internet of Things (IoT) and Big Data. To this end, based on existing research, we analyzed traceability requirements and an event schema for storing traceability data in MongoDB, a document-based Not Only SQL (NoSQL) database. Next, we analyzed the algorithm of the most representative traceability query and defined a query-level performance model, which is composed of response times for the components of the traceability query algorithm. Next, this performance model was solidified as a linear regression model, because a benchmark test showed that the response times increase linearly. Finally, for a case analysis, we applied the performance model to a virtual automobile parts logistics scenario. As a result of the case study, we verified the scalability of a MongoDB-based traceability system and predicted the point when data node servers should be expanded in this case. The traceability system performance assessment method proposed in this research can be used as a decision-making tool for hardware capacity planning during the initial stage of construction of traceability systems and during their operational phase. PMID:27983654
Performance Prediction of a MongoDB-Based Traceability System in Smart Factory Supply Chains.
Kang, Yong-Shin; Park, Il-Ha; Youm, Sekyoung
2016-12-14
In the future, with the advent of the smart factory era, manufacturing and logistics processes will become more complex, and the complexity and criticality of traceability will further increase. This research aims at developing a performance assessment method to verify scalability when implementing traceability systems based on key technologies for smart factories, such as the Internet of Things (IoT) and Big Data. To this end, based on existing research, we analyzed traceability requirements and an event schema for storing traceability data in MongoDB, a document-based Not Only SQL (NoSQL) database. Next, we analyzed the algorithm of the most representative traceability query and defined a query-level performance model, which is composed of response times for the components of the traceability query algorithm. Next, this performance model was solidified as a linear regression model, because a benchmark test showed that the response times increase linearly. Finally, for a case analysis, we applied the performance model to a virtual automobile parts logistics scenario. As a result of the case study, we verified the scalability of a MongoDB-based traceability system and predicted the point when data node servers should be expanded in this case. The traceability system performance assessment method proposed in this research can be used as a decision-making tool for hardware capacity planning during the initial stage of construction of traceability systems and during their operational phase.
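A toy version of the query-level performance model is easy to reproduce: fit an ordinary least-squares line to benchmark response times and read off when a service-level threshold would be crossed. Every number below (event counts, response times, the 10 s threshold) is invented for illustration; only the fitting and capacity-planning logic mirrors the approach described above.

```python
import numpy as np

# Hypothetical benchmark results: traceability-query response time (seconds)
# measured at increasing numbers of stored events (millions of documents).
events_millions = np.array([10, 20, 40, 80, 160], dtype=float)
response_s      = np.array([0.8, 1.5, 3.1, 6.2, 12.4])

# Query-level performance model: response time grows linearly with data volume,
# so an ordinary least-squares line is fitted to the benchmark points.
slope, intercept = np.polyfit(events_millions, response_s, deg=1)

def predicted_response(n_million_events):
    return slope * n_million_events + intercept

# Capacity-planning rule of thumb: expand the data nodes once the predicted
# response time crosses a service-level threshold (10 s here, an assumption).
threshold_s = 10.0
expansion_point = (threshold_s - intercept) / slope
print(f"model: t = {slope:.3f} * n + {intercept:.3f}")
print(f"expand data nodes at roughly {expansion_point:.0f} million events")
```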
ERIC Educational Resources Information Center
Wu, Jiun-Yu; Kwok, Oi-man
2012-01-01
Both ad-hoc robust sandwich standard error estimators (design-based approach) and multilevel analysis (model-based approach) are commonly used for analyzing complex survey data with nonindependent observations. Although these 2 approaches perform equally well on analyzing complex survey data with equal between- and within-level model structures…
A Design of Product Collaborative Online Configuration Model
NASA Astrophysics Data System (ADS)
Wang, Xiaoguo; Zheng, Jin; Zeng, Qian
According to the actual needs of mass customization, the personalization of products and their collaborative design, the paper analyzes and studies the working mechanism of modular-based product configuration technology and puts forward an information model of a modular product family. Combined with case-based reasoning (CBR) and constraint satisfaction problem (CSP) solving techniques, we design and study the algorithm for product configuration and analyze its time complexity. Taking a car chassis as the application object, we provide a prototype system for online configuration. Taking advantage of this system, designers can make appropriate changes to existing programs in accordance with demand. This will accelerate all aspects of product development and shorten the product cycle. The system will also provide strong technical support for enterprises to improve their market competitiveness.
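The constraint-satisfaction side of such a configurator can be sketched with a small backtracking search. The module families, options, and compatibility rules below are purely hypothetical stand-ins for a car-chassis example; in the worst case the search is bounded by the product of the option-domain sizes, which is the kind of time-complexity bound the paper analyzes.

```python
# Hypothetical module families for a car-chassis configuration; the option
# names and the compatibility constraints are illustrative only.
modules = {
    "frame":      ["steel", "aluminium"],
    "suspension": ["standard", "sport"],
    "engine":     ["1.6L", "2.0L"],
}

def compatible(cfg):
    """Constraint set (the CSP side): every rule must hold for a partial assignment."""
    if cfg.get("suspension") == "sport" and cfg.get("engine") == "1.6L":
        return False                      # sport suspension requires the larger engine
    if cfg.get("frame") == "aluminium" and cfg.get("engine") == "2.0L":
        return False                      # illustrative weight/packaging rule
    return True

def configure(remaining, cfg=None):
    """Depth-first backtracking over module options, pruning by the constraints."""
    cfg = dict(cfg or {})
    if not remaining:
        yield cfg
        return
    name, rest = remaining[0], remaining[1:]
    for option in modules[name]:
        cfg[name] = option
        if compatible(cfg):
            yield from configure(rest, cfg)

for solution in configure(list(modules)):
    print(solution)
```

In a CBR-plus-CSP configurator, a retrieved past case would seed the assignment and the search would only repair the parts that violate the new requirements.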
NASA Technical Reports Server (NTRS)
Chase, Christopher; Serrano, Joseph; Ramadge, Peter J.
1993-01-01
We analyze two examples of the discrete control of a continuous variable system. These examples exhibit what may be regarded as the two extremes of complexity of the closed-loop behavior: one is eventually periodic, the other is chaotic. Our examples are derived from sampled deterministic flow models. These are of interest in their own right but have also been used as models for certain aspects of manufacturing systems. In each case, we give a precise characterization of the closed-loop behavior.
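The chaotic member of this pair of flow models, commonly referred to as the switched arrival system, is simple to simulate event by event: a server fills one buffer at unit rate while every buffer drains at a fixed rate (the rates summing to one), and the server jumps to whichever buffer empties. The rates and initial levels below are made up; for typical parameters the visit sequence does not settle into a cycle, consistent with the chaotic regime described above.

```python
import numpy as np

rho = np.array([0.5, 0.3, 0.2])        # drain rates, sum to 1 (illustrative values)
x = np.array([0.2, 0.35, 0.45])        # buffer contents; total work is conserved
served = 0
visits = []
for _ in range(30):
    others = [i for i in range(3) if i != served]
    # time until the first non-served buffer drains empty
    dt = min(x[i] / rho[i] for i in others)
    nxt = min(others, key=lambda i: x[i] / rho[i])
    x[others] = x[others] - rho[others] * dt
    x[served] += (1 - rho[served]) * dt
    x[nxt] = 0.0
    served = nxt                       # switch the server to the buffer that emptied
    visits.append(served)
print(visits)                          # visit sequence; no short repeating cycle appears
```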
On the interplay between mathematics and biology. Hallmarks toward a new systems biology
NASA Astrophysics Data System (ADS)
Bellomo, Nicola; Elaiw, Ahmed; Althiabi, Abdullah M.; Alghamdi, Mohammed Ali
2015-03-01
This paper proposes a critical analysis of the existing literature on mathematical tools developed toward systems biology approaches and, out of this overview, develops a new approach whose main features can be briefly summarized as follows: derivation of mathematical structures suitable to capture the complexity of biological, hence living, systems, modeling, by appropriate mathematical tools, Darwinian type dynamics, namely mutations followed by selection and evolution. Moreover, multiscale methods to move from genes to cells, and from cells to tissue are analyzed in view of a new systems biology approach.
Understanding the complexity of human gait dynamics
NASA Astrophysics Data System (ADS)
Scafetta, Nicola; Marchi, Damiano; West, Bruce J.
2009-06-01
Time series of human gait stride intervals exhibit fractal and multifractal properties under several conditions. Records from subjects walking at normal, slow, and fast pace speed are analyzed to determine changes in the fractal scalings as a function of the stress condition of the system. Records from subjects with different age from children to elderly and patients suffering from neurodegenerative disease are analyzed to determine changes in the fractal scalings as a function of the physical maturation or degeneration of the system. A supercentral pattern generator model is presented to simulate the above two properties that are typically found in dynamical network performance: that is, how a dynamical network responds to stress and to evolution.
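One standard way to estimate the fractal scaling of stride-interval series of this kind is detrended fluctuation analysis (DFA); the abstract does not say which estimator the authors used, so the sketch below is only a generic illustration with an arbitrary choice of box sizes. An exponent near 0.5 indicates uncorrelated intervals, while values approaching 1 indicate the long-range correlations typically reported for healthy gait.

```python
import numpy as np

def dfa_exponent(x, scales=(4, 8, 16, 32, 64)):
    """Detrended fluctuation analysis of a 1-D series: returns the scaling
    exponent alpha from the log-log slope of fluctuation versus box size."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())               # integrated profile
    fluctuations = []
    for n in scales:
        n_boxes = len(y) // n
        f2 = []
        for b in range(n_boxes):
            seg = y[b * n:(b + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrending
            f2.append(np.mean((seg - trend) ** 2))
        fluctuations.append(np.sqrt(np.mean(f2)))
    alpha, _ = np.polyfit(np.log(scales), np.log(fluctuations), 1)
    return alpha

# sanity check: white noise should give alpha close to 0.5
print(dfa_exponent(np.random.default_rng(1).standard_normal(2048)))
```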
Using Space Weather for Enhanced, Extreme Terrestrial Weather Predictions.
NASA Astrophysics Data System (ADS)
McKenna, M. H.; Lee, T. A., III
2017-12-01
Considering the complexities of the Sun-Earth system, the impacts of space weather to weather here on Earth are not fully understood. This study attempts to analyze this interrelationship by providing a theoretical framework for studying the varied modalities of solar inclination and explores the extent to which they contribute, both in formation and intensity, to extreme terrestrial weather. Using basic topologic and ontology engineering concepts (TOEC), the transdisciplinary syntaxes of space physics, geophysics, and meteorology are analyzed as a seamless interrelated system. This paper reports this investigation's initial findings and examines the validity of the question "Does space weather contribute to extreme weather on Earth, and if so, to what degree?"
NASA Technical Reports Server (NTRS)
Feher, Kamilo
1993-01-01
The performance and implementation complexity of coherent and of noncoherent QPSK and GMSK modulation/demodulation techniques in a complex mobile satellite systems environment, including large Doppler shift, delay spread, and low C/I, are compared. We demonstrate that for large f(sub d)T(sub b) products, where f(sub d) is the Doppler shift and T(sub b) is the bit duration, noncoherent (discriminator detector or differential demodulation) systems have a lower BER floor than their coherent counterparts. For significant delay spreads, e.g., tau(sub rms) greater than 0.4 T(sub b), and low C/I, coherent systems outperform noncoherent systems. However, the synchronization time of coherent systems is longer than that of noncoherent systems. Spectral efficiency, overall capacity, and related hardware complexity issues of these systems are also analyzed. We demonstrate that coherent systems have a simpler overall architecture (IF filter implementation-cost versus carrier recovery) and are more robust in an RF frequency drift environment. Additionally, the prediction tools, computer simulations, and analysis of coherent systems are simpler. The threshold or capture effect in a low C/I interference environment is critical for noncoherent discriminator based systems. We conclude with a comparison of hardware architectures of coherent and of noncoherent systems, including recent trends in commercial VLSI technology and direct baseband to RF transmit, RF to baseband (0-IF) receiver implementation strategies.
NASA Astrophysics Data System (ADS)
Feher, Kamilo
The performance and implementation complexity of coherent and of noncoherent QPSK and GMSK modulation/demodulation techniques in a complex mobile satellite systems environment, including large Doppler shift, delay spread, and low C/I, are compared. We demonstrate that for large f(sub d)T(sub b) products, where f(sub d) is the Doppler shift and T(sub b) is the bit duration, noncoherent (discriminator detector or differential demodulation) systems have a lower BER floor than their coherent counterparts. For significant delay spreads, e.g., tau(sub rms) greater than 0.4 T(sub b), and low C/I, coherent systems outperform noncoherent systems. However, the synchronization time of coherent systems is longer than that of noncoherent systems. Spectral efficiency, overall capacity, and related hardware complexity issues of these systems are also analyzed. We demonstrate that coherent systems have a simpler overall architecture (IF filter implementation-cost versus carrier recovery) and are more robust in an RF frequency drift environment. Additionally, the prediction tools, computer simulations, and analysis of coherent systems are simpler. The threshold or capture effect in a low C/I interference environment is critical for noncoherent discriminator based systems. We conclude with a comparison of hardware architectures of coherent and of noncoherent systems, including recent trends in commercial VLSI technology and direct baseband to RF transmit, RF to baseband (0-IF) receiver implementation strategies.
Preparation of Some Novel Copper(I) Complexes and their Molar Conductances in Organic Solvents
NASA Astrophysics Data System (ADS)
Gill, Dip Singh; Rana, Dilbag
2009-04-01
Attempts have been made to prepare some novel copper(I) nitrate, sulfate, and perchlorate complexes. Molar conductances of these complexes have been measured in organic solvents like acetonitrile (AN), acetone (AC), methanol (MeOH), N,N-dimethylformamide (DMF), N,N-dimethylacetamide (DMA), and dimethylsulfoxide (DMSO) at 298 K. The molar conductance data have been analyzed to obtain limiting molar conductances (λ0) and ion association constants (KA) of the electrolytes. The results showed that all these complexes are strong electrolytes in all organic solvents. The limiting ionic molar conductances (λ0±) for various ions have been calculated using Bu4NBPh4 as the reference electrolyte. The actual radii of the copper(I) complex ions are very large and differ between solvents, indicating some solvation effects in each solvent system.
Signals and circuits in the purkinje neuron.
Abrams, Zéev R; Zhang, Xiang
2011-01-01
Purkinje neurons (PN) in the cerebellum have over 100,000 inputs organized in an orthogonal geometry, and a single output channel. As the sole output of the cerebellar cortex layer, their complex firing pattern has been associated with motor control and learning. As such they have been extensively modeled and measured using tools ranging from electrophysiology and neuroanatomy, to dynamic systems and artificial intelligence methods. However, there is an alternative approach to analyze and describe the neuronal output of these cells using concepts from electrical engineering, particularly signal processing and digital/analog circuits. By viewing the PN as an unknown circuit to be reverse-engineered, we can use the tools that provide the foundations of today's integrated circuits and communication systems to analyze the Purkinje system at the circuit level. We use Fourier transforms to analyze and isolate the inherent frequency modes in the PN and define three unique frequency ranges associated with the cells' output. Comparing the PN to a signal generator that can be externally modulated adds an entire level of complexity to the functional role of these neurons both in terms of data analysis and information processing, relying on Fourier analysis methods in place of statistical ones. We also re-describe some of the recent literature in the field, using the nomenclature of signal processing. Furthermore, by comparing the experimental data of the past decade with basic electronic circuitry, we can resolve the outstanding controversy in the field, by recognizing that the PN can act as a multivibrator circuit.
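The signal-processing view described above is easy to prototype: treat the cell's output as a discrete-time signal and use the Fourier transform to isolate its frequency modes. The spike train below is synthetic (a roughly 59 Hz regular firing pattern with a slow 1 Hz modulation, both invented numbers), so the peaks it produces only illustrate the kind of analysis, not actual Purkinje data.

```python
import numpy as np

# Synthetic stand-in for a Purkinje-cell output sampled at 1 kHz.
fs = 1000.0
t = np.arange(0, 10, 1 / fs)
spikes = np.zeros(t.size)
spikes[::17] = 1.0                                   # regular firing, ~1000/17 ≈ 59 Hz
spikes *= 1 + 0.5 * np.sin(2 * np.pi * 1.0 * t)      # slow 1 Hz modulation

# Fourier analysis isolates the frequency modes of the output: the spectrum
# shows the ~59 Hz firing mode, its 1 Hz sidebands and the slow mode itself.
spectrum = np.abs(np.fft.rfft(spikes - spikes.mean()))
freqs = np.fft.rfftfreq(spikes.size, d=1 / fs)
band = (freqs > 0) & (freqs < 100)
strongest = np.argsort(spectrum[band])[-5:]
for f, a in sorted(zip(freqs[band][strongest], spectrum[band][strongest])):
    print(f"{f:5.1f} Hz   amplitude {a:.1f}")
```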
Cortes, Pablo A; Bozinovic, Francisco; Blier, Pierre U
2018-07-01
Mammalian torpor is a phenotype characterized by a controlled decline of metabolic rate, generally followed by a reduction in body temperature. During arousal from torpor, both metabolic rate and body temperature rapidly return to resting levels. The metabolic rate reduction experienced by torpid animals is triggered by active suppression of mitochondrial respiration, which is rapidly reversed during the rewarming process. In this study, we analyzed the changes in the maximal activity of key enzymes related to the electron transport system (complexes I, III and IV) in six tissues of torpid, arousing and euthermic Chilean mouse-opossums (Thylamys elegans). We observed higher maximal activities of complexes I and IV during torpor in brain, heart and liver, the most metabolically active organs in mammals. On the contrary, higher enzymatic activities of complex III were observed during torpor in kidneys and lungs. Moreover, skeletal muscle was the only tissue without significant differences among stages in all complexes evaluated, suggesting no modulation of the oxidative capacities of electron transport system components in this thermogenic tissue. Overall, our data suggest that complex I and IV activity plays a major role in the initiation and maintenance of metabolic suppression during torpor in the Chilean mouse-opossum, whereas improvement of oxidative capacities in complex III might be critical to sustain the metabolic machinery in organs that remain metabolically active during torpor. Copyright © 2018 Elsevier Inc. All rights reserved.
Analyzing neuronal networks using discrete-time dynamics
NASA Astrophysics Data System (ADS)
Ahn, Sungwoo; Smith, Brian H.; Borisyuk, Alla; Terman, David
2010-05-01
We develop mathematical techniques for analyzing detailed Hodgkin-Huxley like models for excitatory-inhibitory neuronal networks. Our strategy for studying a given network is to first reduce it to a discrete-time dynamical system. The discrete model is considerably easier to analyze, both mathematically and computationally, and parameters in the discrete model correspond directly to parameters in the original system of differential equations. While these networks arise in many important applications, a primary focus of this paper is to better understand mechanisms that underlie temporally dynamic responses in early processing of olfactory sensory information. The models presented here exhibit several properties that have been described for olfactory codes in an insect’s Antennal Lobe. These include transient patterns of synchronization and decorrelation of sensory inputs. By reducing the model to a discrete system, we are able to systematically study how properties of the dynamics, including the complex structure of the transients and attractors, depend on factors related to connectivity and the intrinsic and synaptic properties of cells within the network.
NASA Astrophysics Data System (ADS)
Aji Hapsoro, Cahyo; Purqon, Acep; Srigutomo, Wahyu
2017-07-01
2-D Time Domain Electromagnetic (TDEM) modeling has been successfully conducted to illustrate the distribution of the electric field under the Earth's surface. The electric field, compared with the magnetic field, is used to analyze resistivity, and resistivity is one of the physical properties that is very important for determining potential reservoir areas of geothermal systems, a renewable energy source. In this modeling we used the Time Domain Electromagnetic method because it can solve EM field interaction problems with complex geometry and analyze transient problems. TDEM methods are used to model the electric and magnetic fields as functions of time, distance, and depth. The result of this modeling is the electric field intensity, which is capable of describing the structure of the Earth's subsurface. The result of this modeling can be applied to describe the Earth's subsurface resistivity values and determine the reservoir potential of geothermal systems.
Automation of complex assays: pharmacogenetics of warfarin dosing.
Wu, Whei-Kuo; Hujsak, Paul G; Kureshy, Fareed
2007-10-01
AutoGenomics, Inc. (Carlsbad, CA, USA) has developed a multiplex microarray assay for genotyping both VKORC1 and CYP2C9 using the INFINITI(™) Analyzer. Multiple alleles in each DNA sample are analyzed by polymerase chain reaction amplification, followed by detection primer extension using the INFINITI Analyzer. The INFINITI Analyzer performs single-nucleotide polymorphism (SNP) analysis using universal oligonucleotides immobilized on the biochip. To genotype broader ethnic groups, genomic DNA from whole blood was tested for nine SNPs for VKORC1 and six for CYP2C9 genotypes. Information related to all 15 SNPs is needed to determine dosing for populations of diverse ethnic origin. The INFINITI system provides genotyping information for same-day dosing of warfarin.
Wang, Jinling; Jiang, Haijun; Ma, Tianlong; Hu, Cheng
2018-05-01
This paper considers the delay-dependent stability of memristive complex-valued neural networks (MCVNNs). A novel linear mapping function is presented to transform the complex-valued system into a real-valued system. Under this mapping function, both continuous-time and discrete-time MCVNNs are analyzed in this paper. Firstly, when activation functions are continuous but not Lipschitz continuous, an extended matrix inequality is proved to ensure the stability of continuous-time MCVNNs. Furthermore, if activation functions are discontinuous, a discontinuous adaptive controller is designed to achieve stability by applying Lyapunov-Krasovskii functionals. Secondly, in contrast with the techniques used for continuous-time MCVNNs, the Halanay-type inequality and comparison principle are used for the first time to exploit the dynamical behaviors of discrete-time MCVNNs. Finally, the effectiveness of the theoretical results is illustrated through numerical examples. Copyright © 2018 Elsevier Ltd. All rights reserved.
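The abstract does not give the specific mapping function, but a common way to turn a complex-valued linear system into an equivalent real-valued one is to stack real and imaginary parts and replace A = B + iC with the block matrix [[B, -C], [C, B]]. The sketch below checks this equivalence on a random complex system; it is a generic illustration of the idea, not the paper's construction.

```python
import numpy as np

def realify(A):
    """Map a complex matrix A = B + iC to the real block matrix [[B, -C], [C, B]],
    so the complex-valued recursion z' = A z becomes an equivalent real-valued
    recursion in the stacked state [Re(z); Im(z)]."""
    B, C = A.real, A.imag
    return np.block([[B, -C], [C, B]])

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
z = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# one step of the complex-valued system versus the real-valued equivalent
z_next = A @ z
v_next = realify(A) @ np.concatenate([z.real, z.imag])
print(np.allclose(v_next, np.concatenate([z_next.real, z_next.imag])))   # True
```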
Computer simulations of dendrimer-polyelectrolyte complexes.
Pandav, Gunja; Ganesan, Venkat
2014-08-28
We carry out a systematic analysis of static properties of the clusters formed by complexation between charged dendrimers and linear polyelectrolyte (LPE) chains in a dilute solution under good solvent conditions. We use single chain in mean-field simulations and analyze the structure of the clusters through radial distribution functions of the dendrimer, cluster size, and charge distributions. The effects of LPE length, charge ratio between LPE and dendrimer, the influence of salt concentration, and the dendrimer generation number are examined. Systems with short LPEs showed a reduced propensity for aggregation with dendrimers, leading to formation of smaller clusters. In contrast, larger dendrimers and longer LPEs lead to larger clusters with significant bridging. Increasing salt concentration was seen to reduce aggregation between dendrimers as a result of screening of electrostatic interactions. Generally, maximum complexation was observed in systems with an equal amount of net dendrimer and LPE charges, whereas either excess LPE or dendrimer concentrations resulted in reduced clustering between dendrimers.
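A radial distribution function of dendrimer centers, one of the structural measures mentioned above, can be estimated with a generic pair-histogram routine like the sketch below (a standard estimator assuming a periodic cubic box and the minimum-image convention, not the authors' analysis code). On an ideal-gas configuration the resulting g(r) should fluctuate around one, which is a convenient sanity check.

```python
import numpy as np

def radial_distribution(positions, box, r_max, n_bins=100):
    """g(r) for particle centers in a cubic periodic box of side `box`."""
    n = len(positions)
    rho = n / box**3
    dr = r_max / n_bins
    hist = np.zeros(n_bins)
    for i in range(n - 1):
        d = positions[i + 1:] - positions[i]
        d -= box * np.round(d / box)               # minimum-image convention
        r = np.linalg.norm(d, axis=1)
        hist += np.histogram(r[r < r_max], bins=n_bins, range=(0, r_max))[0]
    shell_vol = 4 * np.pi * ((np.arange(1, n_bins + 1) * dr) ** 3 -
                             (np.arange(n_bins) * dr) ** 3) / 3
    # each pair counted once, so normalize by the ideal-gas pair count N*rho*V_shell/2
    g = hist / (0.5 * n * rho * shell_vol)
    return (np.arange(n_bins) + 0.5) * dr, g

# sanity check on an ideal gas: g(r) fluctuates around 1
rng = np.random.default_rng(2)
pos = rng.random((500, 3)) * 10.0
r, g = radial_distribution(pos, box=10.0, r_max=5.0)
print(g[10:15])
```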
Flight-deck automation - Promises and problems
NASA Technical Reports Server (NTRS)
Wiener, E. L.; Curry, R. E.
1980-01-01
The paper analyzes the role of human factors in flight-deck automation, identifies problem areas, and suggests design guidelines. Flight-deck automation using microprocessor technology and display systems improves performance and safety while leading to a decrease in size, cost, and power consumption. On the other hand negative factors such as failure of automatic equipment, automation-induced error compounded by crew error, crew error in equipment set-up, failure to heed automatic alarms, and loss of proficiency must also be taken into account. Among the problem areas discussed are automation of control tasks, monitoring of complex systems, psychosocial aspects of automation, and alerting and warning systems. Guidelines are suggested for designing, utilising, and improving control and monitoring systems. Investigation into flight-deck automation systems is important as the knowledge gained can be applied to other systems such as air traffic control and nuclear power generation, but the many problems encountered with automated systems need to be analyzed and overcome in future research.
Solve the Dilemma of Over-Simplification
NASA Astrophysics Data System (ADS)
Schmitt, Gerhard
Complexity science can help to understand the functioning and the interaction of the components of a city. In 1965, Christopher Alexander gave in his book A city is not a tree a description of the complex nature of urban organization. At this time, neither high-speed computers nor urban big data existed. Today, Luis Bettencourt et al. use complexity science to analyze data for countries, regions, or cities. The results can be used globally in other cities. Objectives of complexity science with regard to future cities are the observation and identification of tendencies and regularities in behavioral patterns, and to find correlations between them and spatial configurations. Complex urban systems cannot be understood in total yet. But research focuses on describing the system by finding some simple, preferably general and emerging patterns and rules that can be used for urban planning. It is important that the influencing factors are not just geo-spatial patterns but also consider variables which are important for the design quality. Complexity science is a way to solve the dilemma of oversimplification of insights from existing cities and their applications to new cities. An example: The effects of streets, public places and city structures on citizens and their behavior depend on how they are perceived. To describe this perception, it is not sufficient to consider only particular characteristics of the urban environment. Different aspects play a role and influence each other. Complexity science could take this fact into consideration and handle the non-linearity of the system...
Analysis of a fuel cell on-site integrated energy system for a residential complex
NASA Technical Reports Server (NTRS)
Simons, S. N.; Maag, W. L.
1979-01-01
Declining supplies of domestic oil and gas and the increased cost of energy resulted in a renewed emphasis on utilizing available resources in the most efficient manner possible. This, in turn, brought about a reassessment of a number of methods for converting fossil fuels to end uses at the highest practical efficiency. One of these is the on-site integrated energy system (OS/IES). This system provides electric power from an on-site power plant and recovers heat from the power plant that would normally be rejected to the environment. An OS/IES is potentially useful in any application that requires both electricity and heat. Several OS/IES are analyzed for a residential complex. The paper is divided into two sections; the first compares three energy supply systems, the second compares various designs for fuel cell OS/IES.
NASA Technical Reports Server (NTRS)
Mata, C. T.; Mata, A. G.; Rakov, V. A.; Nag, A.; Saul, J.
2012-01-01
A new comprehensive lightning instrumentation system has been designed for Launch Complex 39B (LC39B) at the Kennedy Space Center, Florida. This new instrumentation system includes seven synchronized high-speed video cameras, current sensors installed on the nine downconductors of the new lightning protection system (LPS) for LC39B; four dH/dt, 3-axis measurement stations; and five dE/dt stations composed of two antennas each. The LPS received 8 direct lightning strikes (a total of 19 strokes) from March 31 through December 31 2011. The measured peak currents and locations are compared to those reported by the Cloud-to-Ground Lightning Surveillance System (CGLSS II) and the National Lightning Detection Network (NLDN). Results of comparison are presented and analyzed in this paper.
Effect of Ammonium Bromide on the Dielectric Behavior of Alginate-Based Solid Biopolymer Electrolytes
NASA Astrophysics Data System (ADS)
Fuzlin, A. F.; Rasali, N. M. J.; Samsudin, A. S.
2018-04-01
This paper presents the development of a solid biopolymer electrolyte (SBE) system, which was accomplished by incorporating various compositions of an ionic dopant, namely ammonium bromide (NH4Br), into alginate via the solution casting method. The prepared SBE samples were analyzed via electrical impedance spectroscopy (EIS), which showed that the ionic conductivity at room temperature increased from 4.67 x 10-7 S cm-1 for the un-doped sample to an optimum value of 4.41 x 10-5 S cm-1 for the composition with 20 wt. % NH4Br. The SBEs system was found to obey Arrhenius characteristics with R2 ~ 1, where all samples are thermally activated with increasing temperature. The dielectric behavior of the alginate-NH4Br SBEs system was measured using the complex permittivity (ε*) and complex electrical modulus (M*) and showed non-Debye behavior, where no single relaxation time was found for the present SBEs system.
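Checking the Arrhenius behavior reported above amounts to fitting ln(σ) against 1/T and confirming R² close to 1; the slope then gives the activation energy. The temperature-conductivity values in the sketch below are invented for illustration (only the room-temperature value echoes the abstract), so the fitted activation energy is not a result of the paper.

```python
import numpy as np

k_B = 8.617e-5  # Boltzmann constant in eV/K

# Hypothetical conductivity-temperature data for an alginate-NH4Br SBE sample.
T = np.array([303, 313, 323, 333, 343], dtype=float)          # K
sigma = np.array([4.4e-5, 6.1e-5, 8.3e-5, 1.1e-4, 1.5e-4])    # S/cm

# Arrhenius law sigma = sigma0 * exp(-Ea / (k_B * T)) is a straight line in
# ln(sigma) versus 1/T; R^2 near 1 indicates thermally activated conduction.
x, y = 1.0 / T, np.log(sigma)
slope, intercept = np.polyfit(x, y, 1)
Ea = -slope * k_B                               # activation energy in eV
residuals = y - (slope * x + intercept)
r_squared = 1 - residuals.var() / y.var()
print(f"Ea ≈ {Ea:.3f} eV, R^2 = {r_squared:.4f}")
```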
Developments in the Tools and Methodologies of Synthetic Biology
Kelwick, Richard; MacDonald, James T.; Webb, Alexander J.; Freemont, Paul
2014-01-01
Synthetic biology is principally concerned with the rational design and engineering of biologically based parts, devices, or systems. However, biological systems are generally complex and unpredictable, and are therefore, intrinsically difficult to engineer. In order to address these fundamental challenges, synthetic biology is aiming to unify a “body of knowledge” from several foundational scientific fields, within the context of a set of engineering principles. This shift in perspective is enabling synthetic biologists to address complexity, such that robust biological systems can be designed, assembled, and tested as part of a biological design cycle. The design cycle takes a forward-design approach in which a biological system is specified, modeled, analyzed, assembled, and its functionality tested. At each stage of the design cycle, an expanding repertoire of tools is being developed. In this review, we highlight several of these tools in terms of their applications and benefits to the synthetic biology community. PMID:25505788
Toward a theoretical framework for trustworthy cyber sensing
NASA Astrophysics Data System (ADS)
Xu, Shouhuai
2010-04-01
Cyberspace is an indispensable part of the economy and society, but has been "polluted" with many compromised computers that can be abused to launch further attacks against the others. Since it is likely that there always are compromised computers, it is important to be aware of the (dynamic) cyber security-related situation, which is however challenging because cyberspace is an extremely large-scale complex system. Our project aims to investigate a theoretical framework for trustworthy cyber sensing. With the perspective of treating cyberspace as a large-scale complex system, the core question we aim to address is: What would be a competent theoretical (mathematical and algorithmic) framework for designing, analyzing, deploying, managing, and adapting cyber sensor systems so as to provide trustworthy information or input to the higher layer of cyber situation-awareness management, even in the presence of sophisticated malicious attacks against the cyber sensor systems?
Tool for simplifying the complex interactions within resilient communities
NASA Astrophysics Data System (ADS)
Stwertka, C.; Albert, M. R.; White, K. D.
2016-12-01
In recent decades, scientists have observed and documented impacts from climate change that will impact multiple sectors, will be impacted by decisions from multiple sectors, and will change over time. This complex human-engineered system has a large number of moving, interacting parts, which are interdependent and evolve over time towards their purpose. Many of the existing resilience frameworks and vulnerability frameworks focus on interactions between the domains, but do not include the structure of the interactions. We present an engineering systems approach to investigate the structural elements that influence a community's ability to be resilient. In this presentation we will present and analyze four common methods for building community resilience, utilizing our common framework. For several existing case studies we examine the stress points in the system and identify the impacts on the outcomes from the case studies. In ongoing research we will apply our system tool to a new case in the field.
Dependency visualization for complex system understanding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smart, J. Allison Cory
1994-09-01
With the volume of software in production use dramatically increasing, the importance of software maintenance has become strikingly apparent. Techniques are now sought and developed for reverse engineering and design extraction and recovery. At present, numerous commercial products and research tools exist which are capable of visualizing a variety of programming languages and software constructs. The list of new tools and services continues to grow rapidly. Although the scope of the existing commercial and academic product set is quite broad, these tools still share a common underlying problem. The ability of each tool to visually organize object representations is increasingly impaired as the number of components and component dependencies within systems increases. Regardless of how objects are defined, complex "spaghetti" networks result in nearly all large system cases. While this problem is immediately apparent in modern systems analysis involving large software implementations, it is not new. As will be discussed in Chapter 2, related problems involving the theory of graphs were identified long ago. This important theoretical foundation provides a useful vehicle for representing and analyzing complex system structures. While the utility of directed graph based concepts in software tool design has been demonstrated in the literature, these tools still lack the capabilities necessary for large system comprehension. This foundation must therefore be expanded with new organizational and visualization constructs necessary to meet this challenge. This dissertation addresses this need by constructing a conceptual model and a set of methods for interactively exploring, organizing, and understanding the structure of complex software systems.
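The directed-graph foundation mentioned above can be made concrete in a few lines: build a dependency digraph, peel off modules layer by layer, and flag whatever remains as part of, or blocked by, a cycle. The module names and edges below are hypothetical, and the layering uses a standard Kahn-style topological peeling rather than anything specific to the dissertation.

```python
from collections import defaultdict, deque

# Hypothetical module-dependency edges ("a depends on b"); a real tool would
# extract these from source code during reverse engineering or design recovery.
edges = [("ui", "core"), ("core", "util"), ("report", "core"),
         ("report", "util"), ("core", "report")]       # core <-> report form a cycle

deps = defaultdict(set)       # module -> modules it depends on
users = defaultdict(set)      # module -> modules that depend on it
nodes = set()
for a, b in edges:
    deps[a].add(b)
    users[b].add(a)
    nodes |= {a, b}

# Kahn-style layering: repeatedly peel off modules whose dependencies are all
# resolved. Modules that can never be peeled are in, or depend on, a cycle,
# which is exactly the "spaghetti" a visualization needs to untangle.
unresolved = {n: len(deps[n]) for n in nodes}
layer = deque(n for n in nodes if unresolved[n] == 0)
layers = []
while layer:
    layers.append(sorted(layer))
    nxt = deque()
    for n in layer:
        for m in users[n]:
            unresolved[m] -= 1
            if unresolved[m] == 0:
                nxt.append(m)
    layer = nxt

print("bottom-up dependency layers:", layers)
print("modules in or blocked by cycles:", sorted(n for n in nodes if unresolved[n] > 0))
```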
Stochastic tools hidden behind the empirical dielectric relaxation laws
NASA Astrophysics Data System (ADS)
Stanislavsky, Aleksander; Weron, Karina
2017-03-01
The paper is devoted to recent advances in stochastic modeling of anomalous kinetic processes observed in dielectric materials which are prominent examples of disordered (complex) systems. Theoretical studies of dynamical properties of ‘structures with variations’ (Goldenfield and Kadanoff 1999 Science 284 87-9) require application of such mathematical tools—by means of which their random nature can be analyzed and, independently of the details distinguishing various systems (dipolar materials, glasses, semiconductors, liquid crystals, polymers, etc), the empirical universal kinetic patterns can be derived. We begin with a brief survey of the historical background of the dielectric relaxation study. After a short outline of the theoretical ideas providing the random tools applicable to modeling of relaxation phenomena, we present probabilistic implications for the study of the relaxation-rate distribution models. In the framework of the probability distribution of relaxation rates we consider description of complex systems, in which relaxing entities form random clusters interacting with each other and single entities. Then we focus on stochastic mechanisms of the relaxation phenomenon. We discuss the diffusion approach and its usefulness for understanding of anomalous dynamics of relaxing systems. We also discuss extensions of the diffusive approach to systems under tempered random processes. Useful relationships among different stochastic approaches to the anomalous dynamics of complex systems allow us to get a fresh look at this subject. The paper closes with a final discussion on achievements of stochastic tools describing the anomalous time evolution of complex systems.
Climate Informatics: Accelerating Discovering in Climate Science with Machine Learning
NASA Technical Reports Server (NTRS)
Monteleoni, Claire; Schmidt, Gavin A.; McQuade, Scott
2014-01-01
The goal of climate informatics, an emerging discipline, is to inspire collaboration between climate scientists and data scientists, in order to develop tools to analyze complex and ever-growing amounts of observed and simulated climate data, and thereby bridge the gap between data and understanding. Given the impact of climate change, understanding the climate system is an international priority. Here, recent climate informatics work is presented, along with details of some of the field's remaining challenges.
NASA Technical Reports Server (NTRS)
Lucero, John M.
2003-01-01
A new optically based measuring capability that characterizes surface topography, geometry, and wear has been employed by NASA Glenn Research Center's Tribology and Surface Science Branch. To characterize complex parts in more detail, we are using a three-dimensional surface structure analyzer, the NewView5000, manufactured by Zygo Corporation (Middlefield, CT). This system provides graphical images and high-resolution numerical analyses to accurately characterize surfaces. Because of the inherent complexity of the various analyzed assemblies, the machine has been pushed to its limits. For example, special hardware fixtures and measuring techniques were developed to characterize Oil-Free thrust bearings specifically. We performed a more detailed wear analysis using scanning white light interferometry to image and measure the bearing structure and topography, enabling a further understanding of bearing failure causes.
Effective Capital Provision Within Government. Methodologies for Right-Sizing Base Infrastructure
2005-01-01
unknown distributions, since they more accurately represent the complexity of real-world problems. Forecasting uncertain future demand flows is critical to... ordering system with no time lags and no additional costs for instantaneous delivery, shortage and holding costs would be eliminated, because the... order a fixed quantity, Q. 4.1.4 Analyzed Time Step. Time is an important dimension in inventory models, since the way the system changes over time affects
Use of microstrip patch antennas in grain and pulverized materials permittivity measurement
El Sabbagh, M.A.; Ramahi, O.M.; Trabelsi, S.; Nelson, S.O.; Khan, L.
2003-01-01
A free-space microwave system developed for the measurement of the relative complex permittivity of granular materials and of pulverized materials was reported. The system consists of a transmitting antenna and a receiving antenna separated by a space filled by the sample to be characterized and a network analyzer for transmission measurement. The receiving antenna was mounted on a movable plate, which gives the flexibility of having different sample thicknesses.
Yu, Ming; Cao, Qi-chen; Su, Yu-xi; Sui, Xin; Yang, Hong-jun; Huang, Lu-qi; Wang, Wen-ping
2015-08-01
Malignant tumors are one of the main causes of death in the world at present, as well as a major disease that seriously harms human health and life and restricts social and economic development. There are many kinds of reports about traditional Chinese medicine patent prescriptions, empirical prescriptions and self-made prescriptions for treating cancer, and prescription rules have often been analyzed based on medication frequency. Such methods are suitable for discovering dominant experience but can hardly produce innovative discoveries and knowledge. In this paper, based on the traditional Chinese medicine inheritance assistance system, software integrating the mutual information improvement method, complex system entropy clustering and unsupervised entropy-level clustering data mining methods was adopted to analyze the rules of traditional Chinese medicine prescriptions for cancer. In total, 114 prescriptions were selected, the frequency of herbs in the prescriptions was determined, and 85 core combinations and 13 new prescriptions were identified. The traditional Chinese medicine inheritance assistance system, as a valuable traditional Chinese medicine research-supporting tool, can be used to record, manage, inquire and analyze prescription data.
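The first two steps of such prescription mining, medication frequency and scoring of herb pairs, can be sketched with plain counting plus pointwise mutual information. The prescriptions and herb names below are placeholders, and PMI is only a simple stand-in for the mutual information improvement method named above.

```python
from collections import Counter
from itertools import combinations
from math import log2

# Hypothetical prescriptions (herb names are placeholders); the study itself
# mined 114 anticancer prescriptions recorded in the inheritance system.
prescriptions = [
    {"huangqi", "baizhu", "fuling"},
    {"huangqi", "fuling", "banxia"},
    {"baizhu", "fuling", "chenpi"},
    {"huangqi", "baizhu", "fuling", "chenpi"},
]
n = len(prescriptions)

# Step 1: medication frequency of single herbs.
freq = Counter(h for p in prescriptions for h in p)

# Step 2: pointwise mutual information of herb pairs as a simple association score.
pair_counts = Counter(frozenset(c) for p in prescriptions
                      for c in combinations(sorted(p), 2))
def pmi(a, b):
    p_ab = pair_counts[frozenset((a, b))] / n
    return log2(p_ab / ((freq[a] / n) * (freq[b] / n))) if p_ab else float("-inf")

print(freq.most_common(3))
print(sorted(((pmi(a, b), a, b) for a, b in combinations(sorted(freq), 2)
              if pair_counts[frozenset((a, b))]), reverse=True)[:3])
```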
Ding, Hang
2014-01-01
Structures in recurrence plots (RPs), preserving the rich information of nonlinear invariants and trajectory characteristics, have been increasingly analyzed in dynamic discrimination studies. The conventional analysis of RPs is mainly focused on quantifying the overall diagonal and vertical line structures through a method, called recurrence quantification analysis (RQA). This study extensively explores the information in RPs by quantifying local complex RP structures. To do this, an approach was developed to analyze the combination of three major RQA variables: determinism, laminarity, and recurrence rate (DLR) in a metawindow moving over a RP. It was then evaluated in two experiments discriminating (1) ideal nonlinear dynamic series emulated from the Lorenz system with different control parameters and (2) data sets of human heart rate regulations with normal sinus rhythms (n = 18) and congestive heart failure (n = 29). Finally, the DLR was compared with seven major RQA variables in terms of discriminatory power, measured by standardized mean difference (DSMD). In the two experiments, DLR resulted in the highest discriminatory power with DSMD = 2.53 and 0.98, respectively, which were 7.41 and 2.09 times the best performance from RQA. The study also revealed that the optimal RP structures for the discriminations were neither typical diagonal structures nor vertical structures. These findings indicate that local complex RP structures contain some rich information unexploited by RQA. Therefore, future research to extensively analyze complex RP structures would potentially improve the effectiveness of the RP analysis in dynamic discrimination studies.
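A minimal version of the quantities combined in DLR can be computed directly from a thresholded distance matrix: the recurrence rate is the density of recurrence points, determinism the fraction lying on diagonal lines, and laminarity the fraction lying on vertical lines. The sketch below uses a scalar series without embedding and excludes the line of identity; evaluating dlr() on moving metawindows of the matrix would give the local values described above. It is a generic RQA-style computation, not the authors' implementation.

```python
import numpy as np

def recurrence_plot(x, eps):
    """Binary recurrence matrix R[i, j] = 1 when |x_i - x_j| < eps (scalar series)."""
    return (np.abs(x[:, None] - x[None, :]) < eps).astype(int)

def line_lengths(binary_1d):
    """Lengths of runs of ones in a 1-D binary array."""
    lengths, run = [], 0
    for v in binary_1d:
        if v:
            run += 1
        elif run:
            lengths.append(run); run = 0
    if run:
        lengths.append(run)
    return lengths

def dlr(R, l_min=2):
    """Determinism, laminarity and recurrence rate of a recurrence (sub)matrix."""
    rr = R.mean()
    diag = [l for k in range(-R.shape[0] + 1, R.shape[0]) if k != 0
            for l in line_lengths(np.diagonal(R, k))]          # line of identity excluded
    vert = [l for col in R.T for l in line_lengths(col)]
    det = sum(l for l in diag if l >= l_min) / max(sum(diag), 1)
    lam = sum(l for l in vert if l >= l_min) / max(sum(vert), 1)
    return det, lam, rr

# toy example: a noisy periodic signal gives high determinism
t = np.linspace(0, 20 * np.pi, 400)
x = np.sin(t) + 0.1 * np.random.default_rng(0).standard_normal(t.size)
print(dlr(recurrence_plot(x, eps=0.2)))
```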
Education Reform in New York City: Ambitious Change in the Nation's Most Complex School System
ERIC Educational Resources Information Center
O'Day, Jennifer A., Ed.; Bitter, Catherine S., Ed.; Gomez, Louis M., Ed.
2011-01-01
Written in an accessible style by highly respected scholars, the papers in this volume document and analyze particular components of the Children First reforms, including governance, community engagement, finance, accountability, and instruction. The education reforms in New York City's public schools begun under the administration of Mayor…
Observing Complex Systems Thinking in the Zone of Proximal Development
ERIC Educational Resources Information Center
Danish, Joshua; Saleh, Asmalina; Andrade, Alejandro; Bryan, Branden
2017-01-01
Our paper builds on the construct of the zone of proximal development (ZPD) (Vygotsky in Mind in society: the development of higher psychological processes, Harvard University Press, Cambridge, 1978) to analyze the relationship between students' answers and the help they receive as they construct them. We report on a secondary analysis of…
USDA-ARS?s Scientific Manuscript database
The impacts of projected temperature increases in agricultural ecosystems are complex, varying by region, cropping system, crop growth stage and humidity. We analyze the impacts of mid- century temperature increases on crops grown in five southwestern states: Arizona, California, New Mexico, Nevada ...
Service beyond Silos: Analyzing Data Trends to Inform the One-Stop Model
ERIC Educational Resources Information Center
Fifolt, Matthew
2010-01-01
Institutions of higher education, like other large organizations, can have complex and complicated administrative structures. Nowhere is this more true than in the area of student services. Internal systems and processes that have become almost second nature to the individuals who staff administrative units can seem confusing and frustrating to…
Student-Teachers' Use of "Google Earth" in Problem-Based Geology Learning
ERIC Educational Resources Information Center
Ratinen, Ilkka; Keinonen, Tuula
2011-01-01
Geographical Information Systems (GIS) are adequate for analyzing complex scientific and spatial phenomena in geography education. "Google Earth" is a geographic information tool for GIS-based learning. It allows students to engage in the lesson, explore the Earth, explain what they identify and evaluate the implications of what they are…
Origin and Evolutionary Alteration of the Mitochondrial Import System in Eukaryotic Lineages.
Fukasawa, Yoshinori; Oda, Toshiyuki; Tomii, Kentaro; Imai, Kenichiro
2017-07-01
Protein transport systems are fundamentally important for maintaining mitochondrial function. Nevertheless, mitochondrial protein translocases such as the kinetoplastid ATOM complex have recently been shown to vary in eukaryotic lineages. Various evolutionary hypotheses have been formulated to explain this diversity. To resolve any contradiction, estimating the primitive state and clarifying changes from that state are necessary. Here, we present more likely primitive models of mitochondrial translocases, specifically the translocase of the outer membrane (TOM) and translocase of the inner membrane (TIM) complexes, using scrutinized phylogenetic profiles. We then analyzed the translocases' evolution in eukaryotic lineages. Based on those results, we propose a novel evolutionary scenario for diversification of the mitochondrial transport system. Our results indicate that presequence transport machinery was mostly established in the last eukaryotic common ancestor, and that primitive translocases already had a pathway for transporting presequence-containing proteins. Moreover, secondary changes including convergent and migrational gains of a presequence receptor in TOM and TIM complexes, respectively, likely resulted from constrained evolution. The nature of a targeting signal can constrain alteration to the protein transport complex. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
Cardone, A.; Bornstein, A.; Pant, H. C.; Brady, M.; Sriram, R.; Hassan, S. A.
2015-01-01
A method is proposed to study protein-ligand binding in a system governed by specific and non-specific interactions. Strong associations lead to narrow distributions in the protein configuration space; weak and ultra-weak associations lead instead to broader distributions, a manifestation of non-specific, sparsely populated binding modes with multiple interfaces. The method is based on the notion that a discrete set of preferential first-encounter modes are metastable states from which stable (pre-relaxation) complexes at equilibrium evolve. The method can be used to explore alternative pathways of complexation with statistical significance and can be integrated into a general algorithm to study protein interaction networks. The method is applied to a peptide-protein complex. The peptide adopts several low-population conformers and binds in a variety of modes with a broad range of affinities. The system is thus well suited to analyze general features of binding, including conformational selection, multiplicity of binding modes, and nonspecific interactions, and to illustrate how the method can be applied to study these problems systematically. The equilibrium distributions can be used to generate biasing functions for simulations of multiprotein systems from which bulk thermodynamic quantities can be calculated. PMID:25782918
Automated in-line gel filtration for native state mass spectrometry.
Waitt, Greg M; Xu, Robert; Wisely, G Bruce; Williams, Jon D
2008-02-01
Characterization of protein-ligand complexes by nondenaturing mass spectrometry provides direct evidence of drug-like molecules binding with potential therapeutic targets. Typically, protein-ligand complexes to be analyzed contain buffer salts, detergents, and other additives to enhance protein solubility, all of which make the sample unable to be analyzed directly by electrospray ionization mass spectrometry. This work describes an in-line gel-filtration method that has been automated and optimized. Automation was achieved using commercial HPLC equipment. Gel column parameters that were optimized include: column dimensions, flow rate, packing material type, particle size, and molecular weight cut-off. Under optimal conditions, desalted protein ions are detected 4 min after injection and the analysis is completed in 20 min. The gel column retains good performance even after >200 injections. A demonstration for using the in-line gel-filtration system is shown for monitoring the exchange of fatty acids from the pocket of a nuclear hormone receptor, peroxisome proliferator activator-delta (PPARdelta) with a tool compound. Additional utilities of in-line gel-filtration mass spectrometry system will also be discussed.
Robust autoassociative memory with coupled networks of Kuramoto-type oscillators
NASA Astrophysics Data System (ADS)
Heger, Daniel; Krischer, Katharina
2016-08-01
Uncertain recognition success, unfavorable scaling of connection complexity, or dependence on complex external input impair the usefulness of current oscillatory neural networks for pattern recognition or restrict technical realizations to small networks. We propose a network architecture of coupled oscillators for pattern recognition which shows none of the mentioned flaws. Furthermore we illustrate the recognition process with simulation results and analyze the dynamics analytically: Possible output patterns are isolated attractors of the system. Additionally, simple criteria for recognition success are derived from a lower bound on the basins of attraction.
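An oscillator network in this spirit can be sketched with phase dynamics and Hebbian couplings: stored binary patterns, encoded as phase differences of 0 or π, become attractors, and a corrupted probe relaxes toward the stored pattern. The network size, patterns, step size and number of integration steps below are arbitrary choices, and the architecture is a generic Kuramoto-type associative memory rather than the specific design proposed by the authors.

```python
import numpy as np

rng = np.random.default_rng(3)
N, n_patterns = 64, 3
patterns = rng.choice([-1.0, 1.0], size=(n_patterns, N))     # stored binary patterns

# Hebbian phase couplings: with identical natural frequencies the stored patterns
# (phase differences of 0 or pi) are isolated attractors at low memory loading.
K = patterns.T @ patterns / N

def recall(theta, K, dt=0.05, steps=400):
    """Integrate dtheta_i/dt = sum_j K_ij * sin(theta_j - theta_i) (rotating frame)."""
    for _ in range(steps):
        phase_diff = theta[None, :] - theta[:, None]          # theta_j - theta_i
        theta = theta + dt * np.sum(K * np.sin(phase_diff), axis=1)
    return theta

# present a corrupted version of pattern 0 as the initial phases (0 or pi plus noise)
probe = patterns[0].copy()
probe[rng.choice(N, size=8, replace=False)] *= -1             # flip 8 of 64 bits
theta0 = np.pi * (probe < 0) + 0.1 * rng.standard_normal(N)
theta = recall(theta0, K)

# read the binary pattern back out of the relaxed phases (global phase removed)
readout = np.sign(np.cos(theta - theta[0]))
target = np.sign(patterns[0] * patterns[0][0])
print("overlap with stored pattern:", int((readout == target).sum()), "/", N)
```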
Preparation and luminescence properties of organogel doped with Eu(TTA)3phen complex
NASA Astrophysics Data System (ADS)
Cocca, M.; Di Lorenzo, M. L.; Avella, M.; Gentile, G.; Aubouy, L.; Della Pirreira, M.; Gutiérrez-Tauste, D.; Kennedy, M.; Doran, J.; Norton, B.
2012-07-01
In this contribution we report the preparation and the luminescence properties of Eu(TTA)3phen complex-doped toluene gels. Gels were prepared by using either a low molecular weight gelator, 12-hydroxystearic acid (HSA), or a macromolecular gelator, syndiotactic polymethylmethacrylate (s-PMMA). The gelation properties and the reversible behavior from solid-like to liquid systems have been investigated. In addition, photophysical investigations, as well as the morphology, thermal properties and ageing behavior of the gels, were analyzed as a function of gel composition.
Computer modeling and simulation of human movement. Applications in sport and rehabilitation.
Neptune, R R
2000-05-01
Computer modeling and simulation of human movement plays an increasingly important role in sport and rehabilitation, with applications ranging from sport equipment design to understanding pathologic gait. The complex dynamic interactions within the musculoskeletal and neuromuscular systems make analyzing human movement with existing experimental techniques difficult but computer modeling and simulation allows for the identification of these complex interactions and causal relationships between input and output variables. This article provides an overview of computer modeling and simulation and presents an example application in the field of rehabilitation.
Ashfaq, Muhammad; Hebert, Paul D N; Mirza, M Sajjad; Khan, Arif M; Mansoor, Shahid; Shah, Ghulam S; Zafar, Yusuf
2014-01-01
Although whiteflies (Bemisia tabaci complex) are an important pest of cotton in Pakistan, its taxonomic diversity is poorly understood. As DNA barcoding is an effective tool for resolving species complexes and analyzing species distributions, we used this approach to analyze genetic diversity in the B. tabaci complex and map the distribution of B. tabaci lineages in cotton growing areas of Pakistan. Sequence diversity in the DNA barcode region (mtCOI-5') was examined in 593 whiteflies from Pakistan to determine the number of whitefly species and their distributions in the cotton-growing areas of Punjab and Sindh provinces. These new records were integrated with another 173 barcode sequences for B. tabaci, most from India, to better understand regional whitefly diversity. The Barcode Index Number (BIN) System assigned the 766 sequences to 15 BINs, including nine from Pakistan. Representative specimens of each Pakistan BIN were analyzed for mtCOI-3' to allow their assignment to one of the putative species in the B. tabaci complex recognized on the basis of sequence variation in this gene region. This analysis revealed the presence of Asia II 1, Middle East-Asia Minor 1, Asia 1, Asia II 5, Asia II 7, and a new lineage "Pakistan". The first two taxa were found in both Punjab and Sindh, but Asia 1 was only detected in Sindh, while Asia II 5, Asia II 7 and "Pakistan" were only present in Punjab. The haplotype networks showed that most haplotypes of Asia II 1, a species implicated in transmission of the cotton leaf curl virus, occurred in both India and Pakistan. DNA barcodes successfully discriminated cryptic species in B. tabaci complex. The dominant haplotypes in the B. tabaci complex were shared by India and Pakistan. Asia II 1 was previously restricted to Punjab, but is now the dominant lineage in southern Sindh; its southward spread may have serious implications for cotton plantations in this region.
A Pub/Sub Message Distribution Architecture for Disruption Tolerant Networks
NASA Astrophysics Data System (ADS)
Carrilho, Sergio; Esaki, Hiroshi
Access to information is taken for granted in urban areas covered by a robust communication infrastructure. Nevertheless most of the areas in the world are not covered by such infrastructures. We propose a DTN publish and subscribe system called Hikari, which uses nodes' mobility in order to distribute messages without using a robust infrastructure. The area of Disruption/Delay Tolerant Networks (DTN) focuses on providing connectivity to locations separated by networks with disruptions and delays. The Hikari system does not use node identifiers for message forwarding, thus eliminating the complexity of routing associated with many forwarding schemes in DTN. Hikari uses nodes' path information, advertised by special nodes in the system or predicted by the system itself, for optimizing the message dissemination process. We have used the Paris subway system, due to its complexity, to validate Hikari and to analyze its performance. We have shown that Hikari achieves a superior delivery rate while keeping redundant messages in the system low, which is ideal when using devices with limited resources for message dissemination.
Development of a solenoid pumped in situ zinc analyzer for environmental monitoring
Chapin, T.P.; Wanty, R.B.
2005-01-01
A battery powered submersible chemical analyzer, the Zn-DigiScan (Zn Digital Submersible Chemical Analyzer), has been developed for near real-time, in situ monitoring of zinc in aquatic systems. Microprocessor controlled solenoid pumps propel sample and carrier through an anion exchange column to separate zinc from interferences, add colorimetric reagents, and propel the reaction complex through a simple photometric detector. The Zn-DigiScan is capable of self-calibration with periodic injections of standards and blanks. The detection limit with this approach was 30 μg L-1. Precision was 5-10% relative standard deviation (R.S.D.) below 100 μg L-1, improving to 1% R.S.D. at 1000 μg L-1. The linear range extended from 30 to 3000 μg L-1. In situ field results were in agreement with samples analyzed by inductively coupled plasma mass spectrometry (ICPMS). This pump technology is quite versatile and colorimetric methods with complex online manipulations such as column reduction, preconcentration, and dilution can be performed with the DigiScan. However, long-term field deployments in shallow high altitude streams were hampered by air bubble formation in the photometric detector. © 2005 Elsevier B.V. All rights reserved.
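The reported figures of merit (30 μg L-1 detection limit, linear range to 3000 μg L-1) follow from a standard calibration-curve treatment: fit the detector response against standard concentrations and take roughly three times the blank standard deviation over the slope as the detection limit. The response values and blank noise in the sketch below are invented, chosen only so the numbers land in the same ballpark as the abstract.

```python
import numpy as np

# Hypothetical calibration points for a colorimetric Zn method: detector response
# versus standard concentration (ug/L). Values are illustrative only.
conc = np.array([0, 100, 300, 1000, 3000], dtype=float)
signal = np.array([0.012, 0.055, 0.141, 0.462, 1.371])

slope, intercept = np.polyfit(conc, signal, 1)

# Self-calibration in the field uses periodic standards and blanks; the detection
# limit is commonly estimated as 3x the blank standard deviation over the slope.
blank_sd = 0.004                       # assumed standard deviation of repeated blanks
lod = 3 * blank_sd / slope

def concentration(measured_signal):
    return (measured_signal - intercept) / slope

print(f"sensitivity = {slope:.6f} per ug/L, LOD ≈ {lod:.0f} ug/L")
print(f"sample reading 0.20 -> {concentration(0.20):.0f} ug/L")
```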
NASA Astrophysics Data System (ADS)
Potirakis, Stelios M.; Contoyiannis, Yiannis; Kopanas, John; Kalimeris, Anastasios; Antonopoulos, George; Peratzakis, Athanasios; Eftaxias, Konstantinos; Nomicos, Costantinos
2014-05-01
When one considers a phenomenon that is "complex", one refers to a system whose phenomenological laws, which describe the global behavior of the system, are not necessarily directly related to the "microscopic" laws that regulate the evolution of its elementary parts. The field of study of complex systems considers that the dynamics of complex systems are founded on universal principles that may be used to describe disparate problems ranging from particle physics to the economies of societies. Several authors have suggested that earthquake (EQ) dynamics can be analyzed within mathematical frameworks similar to those used for economic dynamics and neurodynamics. A central property of the EQ preparation process is the occurrence of coherent large-scale collective behavior with a very rich structure, resulting from repeated nonlinear interactions among the constituents of the system. As a result, nonextensive statistics is an appropriate, physically meaningful tool for the study of EQ dynamics. Since fracture-induced electromagnetic (EM) precursors are observable manifestations of the underlying EQ preparation process, the analysis of a fracture-induced EM precursor observed prior to the occurrence of a large EQ can also be conducted within the nonextensive statistics framework. Within the frame of the investigation for universal principles that may hold for different dynamical systems related to the genesis of extreme events, we present here statistical similarities of the pre-earthquake EM emissions related to an EQ with the pre-ictal electrical brain activity related to an epileptic seizure, and with the pre-crisis economic observables related to the collapse of a share. It is demonstrated that the observables of all three dynamical systems can be analyzed in the frame of nonextensive statistical mechanics, while the frequency-size relations of appropriately defined "events" that precede the extreme event in each of these different systems present striking quantitative similarities. It is also demonstrated that, for the considered systems, the nonextensive parameter q increases as the extreme event approaches, which indicates that the strength of the long-memory / long-range interactions between the constituents of the system increases, characterizing the dynamics of the system.
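Within the nonextensive (Tsallis) framework referred to above, frequency-size relations are typically described with q-exponential functions, and the parameter q is read off from a fit to the empirical distribution of event sizes. The sketch below generates synthetic "event" amplitudes from a q-exponential with q = 1.4 and recovers q with a coarse grid fit of the complementary cumulative distribution; it only illustrates the fitting idea, not the authors' definition of events or their estimation procedure.

```python
import numpy as np

def q_exp(x, q, x0):
    """Tsallis q-exponential exp_q(-x/x0); reduces to exp(-x/x0) as q -> 1."""
    if abs(q - 1.0) < 1e-9:
        return np.exp(-x / x0)
    base = 1.0 + (q - 1.0) * x / x0
    return np.where(base > 0, base ** (1.0 / (1.0 - q)), 0.0)

# Synthetic event sizes drawn from a q-exponential survival function (q = 1.4),
# standing in for amplitudes of precursory "events"; purely illustrative data.
rng = np.random.default_rng(4)
u = rng.random(2000)
q_true, x0_true = 1.4, 1.0
sizes = x0_true * ((1 - u) ** (-(q_true - 1.0)) - 1.0) / (q_true - 1.0)   # inverse CDF

# Empirical complementary CDF (the frequency-size relation) and a coarse grid fit of q.
x = np.sort(sizes)
ccdf = 1.0 - np.arange(1, x.size + 1) / x.size
best = min(((np.mean((np.log(q_exp(x[:-1], q, x0) + 1e-12)
                      - np.log(ccdf[:-1] + 1e-12)) ** 2), q, x0)
            for q in np.arange(1.05, 1.8, 0.01)
            for x0 in np.arange(0.5, 2.0, 0.05)))
print("fitted q ≈ %.2f, x0 ≈ %.2f" % (best[1], best[2]))
```

An increase of the fitted q as the extreme event approaches is the signature the authors report for all three systems.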
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hartmann, Anja, E-mail: hartmann@ipk-gatersleben.de; Schreiber, Falk; Martin-Luther-University Halle-Wittenberg, Halle
The characterization of biological systems with respect to their behavior and functionality based on versatile biochemical interactions is a major challenge. To understand these complex mechanisms at the systems level, modeling approaches are investigated. Different modeling formalisms allow metabolic models to be analyzed depending on the question to be solved, the biochemical knowledge, and the availability of experimental data. Here, we describe a method for an integrative analysis of the structure and dynamics represented by qualitative and quantitative metabolic models. Using various formalisms, the metabolic model is analyzed from different perspectives. Determined structural and dynamic properties are visualized in the context of the metabolic model. Interaction techniques allow exploration and visual analysis, thereby leading to a broader understanding of the behavior and functionality of the underlying biological system. The System Biology Metabolic Model Framework (SBM² Framework) implements the developed method and, as an example, is applied for the integrative analysis of the crop plant potato.
FAME, a microprocessor based front-end analysis and modeling environment
NASA Technical Reports Server (NTRS)
Rosenbaum, J. D.; Kutin, E. B.
1980-01-01
Higher order software (HOS) is a methodology for the specification and verification of large-scale, complex, real-time systems. The HOS methodology was implemented as FAME (front end analysis and modeling environment), a microprocessor-based system for interactively developing, analyzing, and displaying system models in a low-cost, user-friendly environment. The nature of the model is such that, when completed, it can be the basis for projection to a variety of forms such as structured design diagrams, Petri nets, data flow diagrams, and PSL/PSA source code. The user's interface with the analyzer is easily recognized by any current user of a structured modeling approach; therefore extensive training is unnecessary. Furthermore, when all the system capabilities are used, one can check proper usage of data types, functions, and control structures, thereby adding a new dimension to the design process that will lead to better and more easily verified software designs.
The Prediction of Botulinum Toxin Structure Based on in Silico and in Vitro Analysis
NASA Astrophysics Data System (ADS)
Suzuki, Tomonori; Miyazaki, Satoru
2011-01-01
Many biological systems are mediated through protein-protein interactions, and knowledge of protein-protein complex structure is required for understanding their function. The experimental determination of huge, flexible protein-protein complex structures remains difficult, costly, and time-consuming; therefore, computational prediction of protein structures by homology modeling and docking studies is a valuable method. In addition, MD simulation is one of the most powerful methods for observing the real dynamics of proteins. Here, we predict the protein-protein complex structure of botulinum toxin to analyze its properties. These bioinformatics methods are useful for relating the flexibility of the backbone structure to the activity.
Alignment and integration of complex networks by hypergraph-based spectral clustering
NASA Astrophysics Data System (ADS)
Michoel, Tom; Nachtergaele, Bruno
2012-11-01
Complex networks possess a rich, multiscale structure reflecting the dynamical and functional organization of the systems they model. Often there is a need to analyze multiple networks simultaneously, to model a system by more than one type of interaction, or to go beyond simple pairwise interactions, but currently there is a lack of theoretical and computational methods to address these problems. Here we introduce a framework for clustering and community detection in such systems using hypergraph representations. Our main result is a generalization of the Perron-Frobenius theorem from which we derive spectral clustering algorithms for directed and undirected hypergraphs. We illustrate our approach with applications for local and global alignment of protein-protein interaction networks between multiple species, for tripartite community detection in folksonomies, and for detecting clusters of overlapping regulatory pathways in directed networks.
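The algorithm above rests on a generalization of the Perron-Frobenius theorem for hypergraph operators; the sketch below does not use that operator but the widely used normalized hypergraph Laplacian, shown only to illustrate the overall pipeline the abstract describes (incidence matrix, spectral embedding, clustering). The toy hyperedges and the choice of two clusters are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy undirected hypergraph: 6 nodes, 3 hyperedges (each hyperedge is a set of nodes).
hyperedges = [{0, 1, 2}, {2, 3}, {3, 4, 5}]
n_nodes, n_edges = 6, len(hyperedges)

# Incidence matrix H (nodes x hyperedges).
H = np.zeros((n_nodes, n_edges))
for j, e in enumerate(hyperedges):
    for v in e:
        H[v, j] = 1.0

w = np.ones(n_edges)                        # hyperedge weights
Dv = H @ w                                  # node degrees
De = H.sum(axis=0)                          # hyperedge sizes

# Normalized hypergraph Laplacian L = I - Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}.
Dv_inv_sqrt = np.diag(1.0 / np.sqrt(Dv))
theta = Dv_inv_sqrt @ H @ np.diag(w / De) @ H.T @ Dv_inv_sqrt
L = np.eye(n_nodes) - theta

# Spectral embedding from the smallest eigenvalues, then k-means on the embedded nodes.
vals, vecs = np.linalg.eigh(L)
embedding = vecs[:, :2]
labels = KMeans(n_clusters=2, n_init=10).fit_predict(embedding)
print(labels)
```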
The Many Faces of a Software Engineer in a Research Community
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marinovici, Maria C.; Kirkham, Harold
2013-10-14
The ability to gather, analyze and make decisions based on real world data is changing nearly every field of human endeavor. These changes are particularly challenging for software engineers working in a scientific community, designing and developing large, complex systems. To avoid the creation of a communications gap (almost a language barrier), the software engineers should possess an ‘adaptive’ skill. In the science and engineering research community, the software engineers must be responsible for more than creating mechanisms for storing and analyzing data. They must also develop a fundamental scientific and engineering understanding of the data. This paper looks at the many faces that a software engineer should have: developer, domain expert, business analyst, security expert, project manager, tester, user experience professional, etc. Observations made during work on a power-systems scientific software development are analyzed and extended to describe more generic software development projects.
A Generic Multibody Parachute Simulation Model
NASA Technical Reports Server (NTRS)
Neuhaus, Jason Richard; Kenney, Patrick Sean
2006-01-01
Flight simulation of dynamic atmospheric vehicles with parachute systems is a complex task that is not easily modeled in many simulation frameworks. In the past, the performance of vehicles with parachutes was analyzed by simulations dedicated to parachute operations and were generally not used for any other portion of the vehicle flight trajectory. This approach required multiple simulation resources to completely analyze the performance of the vehicle. Recently, improved software engineering practices and increased computational power have allowed a single simulation to model the entire flight profile of a vehicle employing a parachute.
Baggio, Jacopo A; BurnSilver, Shauna B; Arenas, Alex; Magdanz, James S; Kofinas, Gary P; De Domenico, Manlio
2016-11-29
Network analysis provides a powerful tool to analyze complex influences of social and ecological structures on community and household dynamics. Most network studies of social-ecological systems use simple, undirected, unweighted networks. We analyze multiplex, directed, and weighted networks of subsistence food flows collected in three small indigenous communities in Arctic Alaska potentially facing substantial economic and ecological changes. Our analysis of plausible future scenarios suggests that changes to social relations and key households have greater effects on community robustness than changes to specific wild food resources.
[Application prospect of human-artificial intelligence system in future manned space flight].
Wei, Jin-he
2003-01-01
To make manned space flight more efficient and safer, a concept of a human-artificial intelligence (AI) system is proposed in the present paper. The tasks of future manned space flight and the technical requirements for developing the human-AI system were analyzed. The main points are as follows: 1) astronauts and AI are functionally complementary to each other; 2) both symbolic AI and connectionist AI should be included in the human-AI system, with expert systems and Soar-like systems used mainly inside the cabin, and COG-like robots mainly assigned to EVA, either in LEO flight or on the surface of the Moon or Mars; 3) the human-AI system is hierarchical in nature, with the astronaut at the top level; 4) the complex interfaces between astronaut and AI are the key points for running the system reliably and efficiently. Given the importance of the human-AI system in future manned space flight and the complexity of the related technology, it is suggested that R&D should be planned as early as possible.
Orbital Architectures of Dynamically Complex Exoplanet Systems
NASA Astrophysics Data System (ADS)
Nelson, Benjamin E.
2015-01-01
The most powerful constraints on planet formation will come from characterizing the dynamical state of complex multi-planet systems. Unfortunately, with that complexity comes a number of factors that make analyzing these systems a computationally challenging endeavor: the sheer number of model parameters, a wonky shaped posterior distribution, and hundreds to thousands of time series measurements. We develop a differential evolution Markov chain Monte Carlo (RUN DMC) to tackle these difficult aspects of data analysis. We apply RUN DMC to two classic multi-planet systems from radial velocity surveys, 55 Cancri and GJ 876. For 55 Cancri, we find the inner-most planet "e" must be coplanar to within 40 degrees of the outer planets, otherwise Kozai-like perturbations will cause the planet's orbit to cross the stellar surface. We find the orbits of planets "b" and "c" are apsidally aligned and librating with low to median amplitude (50 +6/-10 degrees), but they are not orbiting in a mean-motion resonance. For GJ 876, we can meaningfully constrain the three-dimensional orbital architecture of all the planets based on the radial velocity data alone. By demanding orbital stability, we find the resonant planets have low mutual inclinations (Φ) so they must be roughly coplanar (Φcb = 1.41 +0.62/-0.57 degrees and Φbe = 3.87 +1.99/-1.86 degrees). The three-dimensional Laplace argument librates with an amplitude of 50.5 +7.9/-10.0 degrees, indicating significant past disk migration and ensuring long-term stability. These empirically derived models will provide new challenges for planet formation models and motivate the need for more sophisticated algorithms to analyze exoplanet data.
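RUN DMC itself is a purpose-built radial-velocity fitting code; as a loose illustration of the differential-evolution MCMC idea it builds on, here is a minimal ter Braak-style DE-MC sampler on a toy two-parameter posterior. The chain count, scaling factor, and toy log-posterior are illustrative assumptions, not the study's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_post(theta):
    # Toy posterior: a correlated 2-D Gaussian standing in for an RV likelihood.
    return -0.5 * (theta[0] ** 2 + (theta[1] - 0.5 * theta[0]) ** 2)

n_chains, n_dim, n_steps = 8, 2, 2000
gamma = 2.38 / np.sqrt(2 * n_dim)              # standard DE-MC scaling factor
chains = rng.normal(size=(n_chains, n_dim))    # initial population of chains
samples = []

for _ in range(n_steps):
    for i in range(n_chains):
        # Propose a jump along the difference of two other randomly chosen chains.
        a, b = rng.choice([j for j in range(n_chains) if j != i], size=2, replace=False)
        proposal = chains[i] + gamma * (chains[a] - chains[b]) + rng.normal(scale=1e-4, size=n_dim)
        if np.log(rng.uniform()) < log_post(proposal) - log_post(chains[i]):
            chains[i] = proposal
    samples.append(chains.copy())

samples = np.concatenate(samples[n_steps // 2:])   # discard burn-in
print(samples.mean(axis=0), samples.std(axis=0))
```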
Bifurcation Phenomena of Opinion Dynamics in Complex Networks
NASA Astrophysics Data System (ADS)
Guo, Long; Cai, Xu
In this paper, we study the opinion dynamics of the Improved Deffuant Model (IDM), where the convergence parameter μ is a function of the opposite agent's degree K according to the celebrity effect, on a small-world network (SWN) and a scale-free network (SFN). Generically, the system undergoes a phase transition from the plurality state to the polarization state and then to the consensus state as the confidence parameter ɛ increases. Furthermore, the evolution of the steady opinion s* as a function of ɛ, and the relation between the minority steady opinion s*min and the individual connectivity k, have also been analyzed. Our present work shows the crucial role of the confidence parameter and the complex system topology in the opinion dynamics of the IDM.
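The abstract does not give the exact functional form of μ(K); the sketch below assumes a simple decreasing dependence on the opposite agent's degree, only to illustrate how the celebrity-effect variant of the Deffuant update might be simulated on a scale-free network. The network size, ε, and the update count are arbitrary choices.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
G = nx.barabasi_albert_graph(500, 3, seed=1)            # scale-free network
opinion = rng.uniform(0.0, 1.0, G.number_of_nodes())
epsilon = 0.3                                            # confidence parameter

def mu(k, k_max):
    # Assumed celebrity effect: the more connected the opposite agent, the less one moves toward it.
    return 0.5 * (1.0 - k / (k_max + 1.0))

k_max = max(dict(G.degree()).values())
edges = list(G.edges())

for _ in range(200_000):
    i, j = edges[rng.integers(len(edges))]               # pick a random interacting pair
    oi, oj = opinion[i], opinion[j]
    if abs(oi - oj) < epsilon:                           # interact only within the confidence bound
        opinion[i] = oi + mu(G.degree(j), k_max) * (oj - oi)
        opinion[j] = oj + mu(G.degree(i), k_max) * (oi - oj)

print(np.round(np.sort(opinion)[::100], 3))              # coarse view of the opinion clusters
```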
Financial Markets during Highly Anxious Time: Multifractal Fluctuations in Asset Returns
NASA Astrophysics Data System (ADS)
Siokis, Fotios M.
Building on the notion that systems, and in particular complex systems such as stock exchange markets, reveal their structure better when they are under stress, we analyze the multifractal character and nonlinear properties of four major stock market indices during financial meltdowns by means of multifractal detrended fluctuation analysis (MF-DFA). The three distinct financial crises under investigation are Black Monday, the Dot-Com crash and the Great Recession. Scaling and Hurst exponents are derived as well as the singularity spectra. The results show that all indices exhibit strong multifractal properties. The complexity of the markets is higher under the Black Monday event, as revealed by the width of the singularity spectrum and the higher α0 parameter.
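A compact MF-DFA sketch follows, under assumptions: first-order detrending, a small set of window sizes, and synthetic white noise standing in for index returns. The study's actual scale range and detrending order are not reproduced; the code only shows how the fluctuation function F_q(s) and the generalized Hurst exponents h(q) are obtained.

```python
import numpy as np

def mfdfa(x, scales, q_values, order=1):
    """Return fluctuation functions F_q(s) and generalized Hurst exponents h(q)."""
    profile = np.cumsum(x - np.mean(x))
    Fq = np.zeros((len(q_values), len(scales)))
    for si, s in enumerate(scales):
        n_seg = len(profile) // s
        rms = []
        for v in range(n_seg):
            seg = profile[v * s:(v + 1) * s]
            t = np.arange(s)
            fit = np.polyval(np.polyfit(t, seg, order), t)     # local polynomial detrending
            rms.append(np.mean((seg - fit) ** 2))              # squared fluctuation per segment
        rms = np.asarray(rms)
        for qi, q in enumerate(q_values):
            if q == 0:
                Fq[qi, si] = np.exp(0.5 * np.mean(np.log(rms)))
            else:
                Fq[qi, si] = np.mean(rms ** (q / 2.0)) ** (1.0 / q)
    # h(q) is the log-log slope of F_q(s) versus s.
    h = [np.polyfit(np.log(scales), np.log(Fq[qi]), 1)[0] for qi in range(len(q_values))]
    return Fq, np.array(h)

returns = np.random.default_rng(7).standard_normal(5000)   # stand-in for daily log-returns
scales = np.array([16, 32, 64, 128, 256])
qs = np.arange(-4, 5)
_, hq = mfdfa(returns, scales, qs)
print(dict(zip(qs.tolist(), np.round(hq, 3))))              # a flat h(q) indicates monofractality
```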
Thermodynamics aspects of noise-induced phase synchronization
NASA Astrophysics Data System (ADS)
Pinto, Pedro D.; Oliveira, Fernando A.; Penna, André L. A.
2016-05-01
In this article, we present an approach for the thermodynamics of phase oscillators induced by an internal multiplicative noise. We analytically derive the free energy, entropy, internal energy, and specific heat. In this framework, the formulation of the first law of thermodynamics requires the definition of a synchronization field acting on the phase oscillators. By introducing the synchronization field, we have consistently obtained the susceptibility and analyzed its behavior. This allows us to characterize distinct phases in the system, which we have denoted as synchronized and parasynchronized phases, in analogy with magnetism. The system also shows a rich complex behavior, exhibiting ideal gas characteristics for low temperatures and susceptibility anomalies that are similar to those present in complex fluids such as water.
NASA Technical Reports Server (NTRS)
Johannsen, G.; Rouse, W. B.
1978-01-01
A hierarchy of human activities is derived by analyzing automobile driving in general terms. A structural description leads to a block diagram and a time-sharing computer analogy. The range of applicability of existing mathematical models is considered with respect to the hierarchy of human activities in actual complex tasks. Other mathematical tools so far not often applied to man-machine systems are also discussed. The mathematical descriptions at least briefly considered here include utility, estimation, control, queueing, and fuzzy set theory as well as artificial intelligence techniques. Some thoughts are given as to how these methods might be integrated and how further work might be pursued.
NASA Astrophysics Data System (ADS)
Fang, Fang; Xiao, Yan
2006-12-01
We consider an inhomogeneous optical fiber system described by the generalized cubic complex Ginzburg-Landau (CGL) equation with varying dispersion, nonlinearity, gain (loss), nonlinear gain (absorption) and the effect of spectral limitation. Exact chirped bright and dark soliton-like solutions of the CGL equation were found by using a suitable ansatz. Furthermore, we analyze the features of the solitons and consider the problem of stability of these soliton-like solutions under finite initial perturbations. It is shown by extensive numerical simulations that both bright and dark soliton-like solutions are stable in an inhomogeneous fiber system. Finally, the interaction between two chirped bright and dark soliton-like pulses is investigated numerically.
He, Awen; Wang, Wenyu; Prakash, N Tejo; Tinkov, Alexey A; Skalny, Anatoly V; Wen, Yan; Hao, Jingcan; Guo, Xiong; Zhang, Feng
2018-03-01
Chemical elements are closely related to human health. Extensive genomic profile data of complex diseases offer us a good opportunity to systematically investigate the relationships between elements and complex diseases/traits. In this study, we applied the gene set enrichment analysis (GSEA) approach to detect associations between elements and complex diseases/traits through integrating element-gene interaction datasets and genome-wide association study (GWAS) data of complex diseases/traits. To illustrate the performance of GSEA, the element-gene interaction datasets of 24 elements were extracted from the comparative toxicogenomics database (CTD). GWAS summary datasets of 24 complex diseases or traits were downloaded from the dbGaP or GEFOS websites. We observed significant associations between 7 elements and 13 complex diseases or traits (all false discovery rate (FDR) < 0.05), including reported relationships such as aluminum vs. Alzheimer's disease (FDR = 0.042), calcium vs. bone mineral density (FDR = 0.031), and magnesium vs. systemic lupus erythematosus (FDR = 0.012), as well as novel associations, such as nickel vs. hypertriglyceridemia (FDR = 0.002) and bipolar disorder (FDR = 0.027). Our study results are consistent with previous biological studies, supporting the good performance of GSEA. Our results based on the GSEA framework provide novel clues for discovering causal relationships between elements and complex diseases. © 2017 WILEY PERIODICALS, INC.
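The study applies GSEA to element-gene sets and GWAS summary data; the sketch below is a simpler over-representation test (hypergeometric p-values with Benjamini-Hochberg FDR), shown only to illustrate how element-gene sets could be screened against disease-associated genes. The gene sets, background, and thresholds are made up; this is not the authors' pipeline.

```python
import numpy as np
from scipy.stats import hypergeom

def overrepresentation_fdr(element_gene_sets, disease_genes, background):
    """Hypergeometric over-representation of each element's gene set among disease genes,
    with Benjamini-Hochberg FDR. A simplification of the GSEA approach used in the study."""
    N, K = len(background), len(disease_genes)
    names, pvals = [], []
    for element, genes in element_gene_sets.items():
        n = len(genes)
        k = len(genes & disease_genes)
        # P(overlap >= k) when n genes are drawn at random from the background.
        pvals.append(hypergeom.sf(k - 1, N, K, n))
        names.append(element)
    pvals = np.asarray(pvals)
    m = len(pvals)
    order = np.argsort(pvals)
    fdr = np.empty_like(pvals)
    prev = 1.0
    for rank, idx in enumerate(order[::-1]):           # step-up BH from the largest p-value
        i = m - rank
        prev = min(prev, pvals[idx] * m / i)
        fdr[idx] = prev
    return dict(zip(names, zip(pvals, fdr)))

# Hypothetical example with made-up gene identifiers.
background = set(range(20000))
element_sets = {"nickel": set(range(0, 200)), "calcium": set(range(100, 450))}
disease = set(range(150, 400))
print(overrepresentation_fdr(element_sets, disease, background))
```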
High-Resolution Magnetic Analyzer MAVR for the Study of Exotic Weakly-Bound Nuclei
NASA Astrophysics Data System (ADS)
Maslov, V. A.; Kazacha, V. I.; Kolesov, I. V.; Lukyanov, S. M.; Melnikov, V. N.; Osipov, N. F.; Penionzhkevich, Yu. E.; Skobelev, N. K.; Sobolev, Yu. G.; Voskoboinik, E. I.
2015-11-01
A project of the high-resolution magnetic analyzer MAVR is proposed. The analyzer will comprise new magnetic optical and detecting systems for separation and identification of reaction products in a wide range of masses (5-150) and charges (1-60). The magnetic optical system consists of the MSP-144 magnet and a doublet of quadrupole lenses. This will allow the solid angle of the spectrometer to be increased by an order of magnitude up to 30 msr. The magnetic analyzer will have a high momentum resolution (10-4) and high focal-plane dispersion (1.9 m). It will allow products of nuclear reactions at energies up to 30 MeV/nucleon to be detected with the charge resolution ~1/60. Implementation of the project is divided into two stages: conversion of the magnetic analyzer proper and construction of the nuclear reaction products identification system. The MULTI detecting system is being developed for the MAVR magnetic analyzer to allow detection of nuclear reaction products and their identification by charge Q, atomic number Z, and mass A with a high absolute accuracy. The identification will be performed by measuring the energy loss (ΔE), time of flight (TOF), and total kinetic energy (TKE) of reaction products. The particle trajectories in the analyzer will also be determined using the drift chamber developed jointly with GANIL. The MAVR analyzer will operate in both primary beams of heavy ions and beams of radioactive nuclei produced by the U400-U400M acceleration complex. It will also be used for measuring energy spectra of nuclear reaction products and as an energy monochromator.
Gonçalves-Araujo, Rafael; Wiegmann, Sonja; Torrecilla, Elena; Bardaji, Raul; Röttgers, Rüdiger; Bracher, Astrid; Piera, Jaume
2017-01-01
The detection and prediction of changes in coastal ecosystems require a better understanding of the complex physical, chemical and biological interactions, which requires that observations be performed continuously. For this reason, there is an increasing demand for small, simple and cost-effective in situ sensors to analyze complex coastal waters at a broad range of scales. In this context, this study seeks to explore the potential of beam attenuation spectra, c(λ), measured in situ with an advanced-technology optical transmissometer, for assessing temporal and spatial patterns in the complex estuarine waters of Alfacs Bay (NW Mediterranean) as a test site. In particular, the information contained in the spectral beam attenuation coefficient was assessed and linked with different biogeochemical variables. The attenuation at λ = 710 nm was used as a proxy for particle concentration, TSM, whereas a novel parameter was adopted as an optical indicator for chlorophyll a (Chl-a) concentration, based on the local maximum of c(λ) observed at the long-wavelength side of the red band Chl-a absorption peak. In addition, since coloured dissolved organic matter (CDOM) has an important influence on the beam attenuation spectral shape and complementary measurements of particle size distribution were available, the beam attenuation spectral slope was used to analyze the CDOM content. Results were successfully compared with optical and biogeochemical variables from laboratory analysis of collocated water samples, and statistically significant correlations were found between the attenuation proxies and the biogeochemical variables TSM, Chl-a and CDOM. This outcome demonstrated the potential of high-frequency beam attenuation measurements as a simple, continuous and cost-effective approach for rapid detection of changes and patterns in biogeochemical properties in complex coastal environments. PMID:28107539
Reframing the challenges to integrated care: a complex-adaptive systems perspective.
Tsasis, Peter; Evans, Jenna M; Owen, Susan
2012-01-01
Despite over two decades of international experience and research on health systems integration, integrated care has not developed widely. We hypothesized that part of the problem may lie in how we conceptualize the integration process and the complex systems within which integrated care is enacted. This study aims to contribute to discourse regarding the relevance and utility of a complex-adaptive systems (CAS) perspective on integrated care. In the Canadian province of Ontario, government mandated the development of fourteen Local Health Integration Networks in 2006. Against the backdrop of these efforts to integrate care, we collected focus group data from a diverse sample of healthcare professionals in the Greater Toronto Area using convenience and snowball sampling. A semi-structured interview guide was used to elicit participant views and experiences of health systems integration. We use a CAS framework to describe and analyze the data, and to assess the theoretical fit of a CAS perspective with the dominant themes in participant responses. Our findings indicate that integration is challenged by system complexity, weak ties and poor alignment among professionals and organizations, a lack of funding incentives to support collaborative work, and a bureaucratic environment based on a command and control approach to management. Using a CAS framework, we identified several characteristics of CAS in our data, including diverse, interdependent and semi-autonomous actors; embedded co-evolutionary systems; emergent behaviours and non-linearity; and self-organizing capacity. One possible explanation for the lack of systems change towards integration is that we have failed to treat the healthcare system as complex-adaptive. The data suggest that future integration initiatives must be anchored in a CAS perspective, and focus on building the system's capacity to self-organize. We conclude that integrating care requires policies and management practices that promote system awareness, relationship-building and information-sharing, and that recognize change as an evolving learning process rather than a series of programmatic steps.
Investigating dynamical complexity in the magnetosphere using various entropy measures
NASA Astrophysics Data System (ADS)
Balasis, Georgios; Daglis, Ioannis A.; Papadimitriou, Constantinos; Kalimeri, Maria; Anastasiadis, Anastasios; Eftaxias, Konstantinos
2009-09-01
The complex system of the Earth's magnetosphere corresponds to an open spatially extended nonequilibrium (input-output) dynamical system. The nonextensive Tsallis entropy has been recently introduced as an appropriate information measure to investigate dynamical complexity in the magnetosphere. The method has been employed for analyzing Dst time series and gave promising results, detecting the complexity dissimilarity among different physiological and pathological magnetospheric states (i.e., prestorm activity and intense magnetic storms, respectively). This paper explores the applicability and effectiveness of a variety of computable entropy measures (e.g., block entropy, Kolmogorov entropy, T complexity, and approximate entropy) to the investigation of dynamical complexity in the magnetosphere. We show that as the magnetic storm approaches there is clear evidence of significant lower complexity in the magnetosphere. The observed higher degree of organization of the system agrees with that inferred previously, from an independent linear fractal spectral analysis based on wavelet transforms. This convergence between nonlinear and linear analyses provides a more reliable detection of the transition from the quiet time to the storm time magnetosphere, thus showing evidence that the occurrence of an intense magnetic storm is imminent. More precisely, we claim that our results suggest an important principle: significant complexity decrease and accession of persistency in Dst time series can be confirmed as the magnetic storm approaches, which can be used as diagnostic tools for the magnetospheric injury (global instability). Overall, approximate entropy and Tsallis entropy yield superior results for detecting dynamical complexity changes in the magnetosphere in comparison to the other entropy measures presented herein. Ultimately, the analysis tools developed in the course of this study for the treatment of Dst index can provide convenience for space weather applications.
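Of the entropy measures listed, approximate entropy has a particularly compact definition; a minimal sketch follows, using the conventional m = 2 and r = 0.2·SD parameters (assumptions, not necessarily the settings used for the Dst index). Lower values indicate a more regular, lower-complexity signal, consistent with the behavior reported above as a storm approaches.

```python
import numpy as np

def approximate_entropy(x, m=2, r_factor=0.2):
    """Approximate entropy ApEn(m, r) of a 1-D time series (Pincus-style definition)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)

    def phi(m):
        n = len(x) - m + 1
        emb = np.array([x[i:i + m] for i in range(n)])            # delay embedding, lag 1
        # Chebyshev distance between all pairs of m-length templates.
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = np.mean(dist <= r, axis=1)                            # fraction of matching templates
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

# Synthetic comparison: irregular (noise-like) vs. more ordered (oscillatory) signals.
quiet = np.random.default_rng(0).normal(size=1000)
storm = np.sin(np.linspace(0, 60, 1000)) + 0.1 * np.random.default_rng(1).normal(size=1000)
print(approximate_entropy(quiet), approximate_entropy(storm))     # the second value is lower
```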
Multiscale entropy-based methods for heart rate variability complexity analysis
NASA Astrophysics Data System (ADS)
Silva, Luiz Eduardo Virgilio; Cabella, Brenno Caetano Troca; Neves, Ubiraci Pereira da Costa; Murta Junior, Luiz Otavio
2015-03-01
Physiologic complexity is an important concept for characterizing time series from biological systems, and, combined with multiscale analysis, it can contribute to the comprehension of many complex phenomena. Although multiscale entropy has been applied to physiological time series, it measures irregularity as a function of scale. In this study we propose and evaluate a set of three complexity metrics as functions of time scale. The complexity metrics are derived from nonadditive entropy supported by the generation of surrogate data, i.e., SDiff_qmax, q_max and q_zero. In order to assess the accuracy of the proposed complexity metrics, receiver operating characteristic (ROC) curves were built and the area under the curves was computed for three physiological situations. Heart rate variability (HRV) time series in normal sinus rhythm, atrial fibrillation, and congestive heart failure data sets were analyzed. Results show that the proposed complexity metrics are accurate and robust when compared to classic entropic irregularity metrics. Furthermore, SDiff_qmax is the most accurate for lower scales, whereas q_max and q_zero are the most accurate when higher time scales are considered. The multiscale complexity analysis described here showed potential to assess complex physiological time series and deserves further investigation in a wider context.
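The study's metrics (SDiff_qmax, q_max, q_zero) come from nonadditive entropy with surrogate data and are not reproduced here; the sketch below only illustrates the standard multiscale step the abstract contrasts against: coarse-graining an RR series at successive scales and computing sample entropy at each scale. The tolerance convention and the synthetic stand-in data are assumptions.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r) of a 1-D series (self-matches excluded)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)

    def count_matches(m):
        n = len(x) - m
        emb = np.array([x[i:i + m] for i in range(n)])
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        return (np.sum(dist <= r) - n) / 2.0              # exclude the diagonal self-matches

    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

def multiscale_entropy(x, max_scale=5):
    """Coarse-grain the series at each scale and compute SampEn with a fixed tolerance."""
    r = 0.2 * np.std(x)                                   # tolerance fixed from the original series
    out = []
    for tau in range(1, max_scale + 1):
        n = len(x) // tau
        coarse = np.mean(np.reshape(x[:n * tau], (n, tau)), axis=1)
        out.append(sample_entropy(coarse, r=r))
    return np.array(out)

rr = np.random.default_rng(2).normal(0.8, 0.05, 1500)     # stand-in RR interval series (seconds)
print(np.round(multiscale_entropy(rr), 3))
```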
Identification of hybrid node and link communities in complex networks
He, Dongxiao; Jin, Di; Chen, Zheng; Zhang, Weixiong
2015-01-01
Identifying communities in complex networks is an effective means for analyzing complex systems, with applications in diverse areas such as social science, engineering, biology and medicine. Finding communities of nodes and finding communities of links are two popular schemes for network analysis. These schemes, however, have inherent drawbacks and are inadequate to capture complex organizational structures in real networks. We introduce a new scheme and an effective approach for identifying complex mixture structures of node and link communities, called hybrid node-link communities. A central piece of our approach is a probabilistic model that accommodates node, link and hybrid node-link communities. Our extensive experiments on various real-world networks, including a large protein-protein interaction network and a large network of semantically associated words, illustrated that the scheme for hybrid communities is superior in revealing network characteristics. Moreover, the new approach outperformed the existing methods for finding node or link communities separately. PMID:25728010
T Cell Receptor Engineering and Analysis Using the Yeast Display Platform
Smith, Sheena N.; Harris, Daniel T.; Kranz, David M.
2017-01-01
The αβ heterodimeric T cell receptor (TCR) recognizes peptide antigens that are transported to the cell surface as a complex with a protein encoded by the major histocompatibility complex (MHC). T cells thus evolved a strategy to sense these intracellular antigens, and to respond either by eliminating the antigen-presenting cell (e.g. a virus-infected cell) or by secreting factors that recruit the immune system to the site of the antigen. The central role of the TCR in the binding of antigens as peptide-MHC (pepMHC) ligands has now been studied thoroughly. Interestingly, despite their exquisite sensitivity (e.g. T cell activation by as few as 1 to 3 pepMHC complexes on a single target cell), TCRs are known to have relatively low affinities for pepMHC, with KD values in the micromolar range. There has been interest in engineering the affinity of TCRs in order to use this class of molecules in ways similar to what is now done with antibodies. By doing so, it would be possible to harness the potential of TCRs as therapeutics against a much wider array of antigens that include essentially all intracellular targets. To engineer TCRs, and to analyze their binding features more rapidly, we have used a yeast display system as a platform. Expression and engineering of a single-chain form of the TCR, analogous to scFv fragments from antibodies, allow the TCR to be affinity matured with a variety of possible pepMHC ligands. In addition, the yeast display platform allows one to rapidly generate TCR variants with diverse binding affinities and to analyze specificity and affinity without the need for purification of soluble forms of the TCRs. The present chapter describes the methods for engineering and analyzing single-chain TCRs using yeast display. PMID:26060072
"Time-dependent flow-networks"
NASA Astrophysics Data System (ADS)
Tupikina, Liubov; Molkentin, Nora; Lopez, Cristobal; Hernandez-Garcia, Emilio; Marwan, Norbert; Kurths, Jürgen
2015-04-01
Complex networks have been successfully applied to various systems such as society, technology, and, recently, climate. Links in a climate network are defined between two geographical locations if the correlation between the time series of some climate variable is higher than a threshold. Network links are therefore considered to imply information or heat exchange. However, the relationship between the oceanic and atmospheric flows and the climate network's structure is still unclear. Recently, a theoretical approach verifying the correlation between ocean currents and surface air temperature networks has been introduced, in which Pearson correlation networks were constructed from advection-diffusion dynamics on an underlying flow. Since the continuous approach has its limitations, i.e., high computational complexity and a fixed variety of flows in the underlying system, we introduce a new flow-network method for velocity fields that change in time, including external forcing in the system, noise, and temperature decay. The flow-network construction can be divided into several steps: first, we obtain a linear recursive equation for the temperature time series; then we compute the correlation matrix of the time series, averaging the tensor product over all realizations of the noise, which we interpret as the weighted adjacency matrix of the flow-network and analyze using network measures. We apply the method to different types of moving flows with geographical relevance, such as a meandering flow. Analyzing the flow-networks with network measures, we find that our approach can highlight zones of high velocity by degree and transition zones by betweenness, while the combination of these measures can uncover how the flow propagates in time. Flow-networks can be a powerful tool for understanding the connection between a system's dynamics and its network topology, shedding light on different climatic phenomena.
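The pipeline described above (a linear recursive temperature update on a flow, a noise-averaged correlation matrix read as a weighted adjacency matrix, then network measures) can be reproduced in miniature. The 1-D toy flow, mixing coefficients, decay rate, and correlation threshold below are all illustrative assumptions; betweenness is computed unweighted for simplicity.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(3)
n_sites, n_steps = 40, 800
velocity = np.where(np.arange(n_sites) < 20, 2, 1)   # toy 1-D flow: fast upstream, slow downstream
mix = np.where(velocity == 2, 0.8, 0.4)              # stronger advective mixing where the flow is fast
decay = 0.1                                          # temperature decay per step

# Linear recursive temperature update: advection along the flow, decay, and forcing noise.
T = np.zeros((n_steps, n_sites))
for t in range(1, n_steps):
    advected = np.roll(T[t - 1], 1)                  # transport by one cell per step
    T[t] = (1 - decay) * ((1 - mix) * T[t - 1] + mix * advected) + rng.normal(size=n_sites)

# Correlation matrix of the site time series, used as a weighted adjacency matrix.
C = np.corrcoef(T[200:].T)                           # drop the initial transient
np.fill_diagonal(C, 0.0)
A = np.where(np.abs(C) > 0.3, np.abs(C), 0.0)        # keep only strong links

G = nx.from_numpy_array(A)
degree = dict(G.degree(weight="weight"))
betweenness = nx.betweenness_centrality(G)           # unweighted, for simplicity
print(max(degree, key=degree.get), max(betweenness, key=betweenness.get))
```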
Rodriguez, Javier; Voss, Andreas; Caminal, Pere; Bayes-Genis, Antoni; Giraldo, Beatriz F
2017-07-01
Cardiac death risk remains a major problem for an important part of the population, especially in elderly patients. In this study, we propose to characterize and analyze the cardiovascular and cardiorespiratory systems using the Poincaré plot. A total of 46 cardiomyopathy patients and 36 healthy subjects were analyzed. Left ventricular ejection fraction (LVEF) was used to stratify patients into low risk (LR: LVEF > 35%, 16 patients) and high risk (HR: LVEF ≤ 35%, 30 patients) of heart attack. RR, SBP and TTot time series were extracted from the ECG, blood pressure and respiratory flow signals, respectively. Parameters that describe the scatterplot of the Poincaré method, related to short- and long-term variabilities, acceleration and deceleration of the dynamic system, and the complex correlation index, were extracted. Linear discriminant analysis (LDA) and support vector machine (SVM) classification methods were used to analyze the extracted parameters. The results showed that the cardiac parameters discriminated best between the HR and LR groups, especially the complex correlation index (p = 0.009). Analyzing the interactions, the best result was obtained with the relation between the differences of the standard deviations of the cardiac and respiratory systems (p = 0.003). When comparing the HR and LR groups, the best classification was obtained with the SVM method using an ANOVA kernel, with an accuracy of 98.12%. An accuracy of 97.01% was obtained when comparing patients versus healthy subjects, with an SVM classifier and a Laplacian kernel. The morphology of the Poincaré plot introduces parameters that allow the characterization of cardiorespiratory system dynamics.
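The short- and long-term variability descriptors of the Poincaré scatterplot (commonly written SD1 and SD2) follow directly from successive RR pairs; a minimal sketch is below, using a synthetic RR series. The complex correlation index and the LDA/SVM classifiers from the study are not reproduced.

```python
import numpy as np

def poincare_descriptors(rr):
    """Short- and long-term variability (SD1, SD2) from the Poincaré plot of an RR series."""
    rr = np.asarray(rr, dtype=float)
    diff = rr[1:] - rr[:-1]                       # successive differences RR_{n+1} - RR_n
    sd1 = np.sqrt(np.var(diff) / 2.0)             # dispersion perpendicular to the identity line
    sd2 = np.sqrt(2.0 * np.var(rr) - np.var(diff) / 2.0)   # dispersion along the identity line
    return sd1, sd2

rr = np.random.default_rng(4).normal(0.85, 0.04, 500)       # stand-in RR intervals (seconds)
sd1, sd2 = poincare_descriptors(rr)
print(round(sd1, 4), round(sd2, 4), round(sd1 / sd2, 3))
```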
Gasco, Jaime; Braun, Jonathan D; McCutcheon, Ian E; Black, Peter M
2011-01-01
To objectively compare the complexity and diversity of the certification process in neurological surgery in member societies of the World Federation of Neurosurgical Societies. This study centers on continental Asia. We provide here an analysis based on the responses to a 13-item survey. The data received were analyzed, and three Regional Complexity Scores (RCS) were designed. To compare national board experience, eligibility requirements for access to the certification process, and the obligatory nature of the examinations, an RCS-Organizational score was created (20 points maximum). To analyze the complexity of the examination, an RCS-Components score was designed (20 points maximum). The sum of both is presented as a Global RCS score. Only those countries that responded to the survey and presented nationwide homogeneity in the conduct of neurosurgery examinations could be included within the scoring system. In addition, a descriptive summary of the certification process per responding society is also provided. On the basis of the data provided by our RCS system, the highest global RCS was achieved by South Korea and Malaysia (21/40 points), followed by the joint examination of Singapore and Hong Kong (FRCS-Ed) (20/40 points), Japan (17/40 points), the Philippines (15/40 points), and Taiwan (13 points). The experience from these leading countries should be of value to all countries within Asia. Copyright © 2011 Elsevier Inc. All rights reserved.
Fu, Wenjiang J.; Stromberg, Arnold J.; Viele, Kert; Carroll, Raymond J.; Wu, Guoyao
2009-01-01
Over the past two decades, there have been revolutionary developments in life science technologies characterized by high throughput, high efficiency, and rapid computation. Nutritionists now have the advanced methodologies for the analysis of DNA, RNA, protein, low-molecular-weight metabolites, as well as access to bioinformatics databases. Statistics, which can be defined as the process of making scientific inferences from data that contain variability, has historically played an integral role in advancing nutritional sciences. Currently, in the era of systems biology, statistics has become an increasingly important tool to quantitatively analyze information about biological macromolecules. This article describes general terms used in statistical analysis of large, complex experimental data. These terms include experimental design, power analysis, sample size calculation, and experimental errors (type I and II errors) for nutritional studies at population, tissue, cellular, and molecular levels. In addition, we highlighted various sources of experimental variations in studies involving microarray gene expression, real-time polymerase chain reaction, proteomics, and other bioinformatics technologies. Moreover, we provided guidelines for nutritionists and other biomedical scientists to plan and conduct studies and to analyze the complex data. Appropriate statistical analyses are expected to make an important contribution to solving major nutrition-associated problems in humans and animals (including obesity, diabetes, cardiovascular disease, cancer, ageing, and intrauterine fetal retardation). PMID:20233650
Cardea: Dynamic Access Control in Distributed Systems
NASA Technical Reports Server (NTRS)
Lepro, Rebekah
2004-01-01
Modern authorization systems span domains of administration, rely on many different authentication sources, and manage complex attributes as part of the authorization process. This paper presents Cardea, a distributed system that facilitates dynamic access control, as a valuable piece of an inter-operable authorization framework. First, the authorization model employed in Cardea and its functionality goals are examined. Next, critical features of the system architecture and its handling of the authorization process are examined. Then the SAML and XACML standards, as incorporated into the system, are analyzed. Finally, the future directions of this project are outlined and connection points with general components of an authorization system are highlighted.
NASA Technical Reports Server (NTRS)
Nelson, Gregory A.; Marshall, Tamara M.; Schubert, Wayne W.
1989-01-01
The effects of ionizing and nonionizing radiation on cell reproduction, differentiation, and mutation in vivo are studied using the nematode C. elegans. The relationships between fluence/dose and response, and between quality factor and linear energy transfer, are analyzed. The data reveal that there is a complex repair pathway in the nematode and that mutants can be used to direct the sensitivity of the system to specific mutagens/radiation types.
Yang, Li-Ping; Ding, Shun-Liang; Litak, Grzegorz; Song, En-Zhe; Ma, Xiu-Zhen
2015-01-01
The cycling combustion instabilities in a diesel engine have been analyzed based on chaos theory. The objective was to investigate the dynamical characteristics of combustion in diesel engine. In this study, experiments were performed under the entire operating range of a diesel engine (the engine speed was changed from 600 to 1400 rpm and the engine load rate was from 0% to 100%), and acquired real-time series of in-cylinder combustion pressure using a piezoelectric transducer installed on the cylinder head. Several methods were applied to identify and quantitatively analyze the combustion process complexity in the diesel engine including delay-coordinate embedding, recurrence plot (RP), Recurrence Quantification Analysis, correlation dimension (CD), and the largest Lyapunov exponent (LLE) estimation. The results show that the combustion process exhibits some determinism. If LLE is positive, then the combustion system has a fractal dimension and CD is no more than 1.6 and within the diesel engine operating range. We have concluded that the combustion system of diesel engine is a low-dimensional chaotic system and the maximum values of CD and LLE occur at the lowest engine speed and load. This means that combustion system is more complex and sensitive to initial conditions and that poor combustion quality leads to the decrease of fuel economy and the increase of exhaust emissions.
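Of the methods listed above, the Grassberger-Procaccia correlation-sum step is straightforward to sketch: delay-embed the series and fit the slope of log C(r) versus log r. The embedding dimension, lag, radius range, and the synthetic stand-in for an in-cylinder pressure signal below are assumptions, not the study's choices.

```python
import numpy as np

def correlation_dimension(x, dim=3, lag=5):
    """Grassberger-Procaccia estimate: slope of log C(r) vs log r for a delay embedding."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * lag
    emb = np.column_stack([x[i * lag:i * lag + n] for i in range(dim)])   # delay-coordinate vectors
    dists = np.sqrt(((emb[:, None, :] - emb[None, :, :]) ** 2).sum(axis=2))
    dists = dists[np.triu_indices(n, k=1)]                                # all pairwise distances
    radii = np.logspace(np.log10(np.percentile(dists, 5)),
                        np.log10(np.percentile(dists, 50)), 10)
    C = np.array([np.mean(dists < r) for r in radii])                     # correlation sum C(r)
    return np.polyfit(np.log(radii), np.log(C), 1)[0]

# Stand-in for an in-cylinder pressure series; a real analysis would use the measured data.
t = np.linspace(0, 120, 1200)
signal = np.sin(t) + 0.5 * np.sin(2.1 * t) + 0.05 * np.random.default_rng(5).normal(size=t.size)
print(round(correlation_dimension(signal), 2))
```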
Methods and tools for profiling and control of distributed systems
NASA Astrophysics Data System (ADS)
Sukharev, R.; Lukyanchikov, O.; Nikulchev, E.; Biryukov, D.; Ryadchikov, I.
2018-02-01
This article is devoted to the profiling and control of distributed systems. Distributed systems have a complex architecture: applications are distributed among various computing nodes, and many network operations are performed. Therefore, it is important today to develop methods and tools for profiling distributed systems. The article analyzes and standardizes methods for profiling distributed systems that focus on simulation to conduct experiments and build a graph model of the system. The theory of queueing networks is used for simulation modeling of distributed systems receiving and processing user requests. To automate the above method of profiling distributed systems, a software application with a modular structure, similar to a SCADA system, was developed.
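Queueing-network simulation is named above as the core of the profiling method; the sketch below simulates a single M/M/1 node, the simplest element such a network is composed of, and compares it against the analytic mean response time. The rates and job count are arbitrary; a real profiling model would connect several such nodes according to the system's graph.

```python
import random

def mm1_mean_response_time(arrival_rate, service_rate, n_jobs=50_000, seed=0):
    """Event-driven simulation of a single M/M/1 node, a basic building block
    of a queueing-network model of a distributed system."""
    rng = random.Random(seed)
    t_arrival, server_free_at, total_response = 0.0, 0.0, 0.0
    for _ in range(n_jobs):
        t_arrival += rng.expovariate(arrival_rate)       # next request arrives
        start = max(t_arrival, server_free_at)           # waits if the node is busy
        finish = start + rng.expovariate(service_rate)
        total_response += finish - t_arrival
        server_free_at = finish
    return total_response / n_jobs

# At utilization 0.8, queueing theory predicts a mean response time of 1/(mu - lambda) = 0.5 s.
print(round(mm1_mean_response_time(arrival_rate=8.0, service_rate=10.0), 3))
```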
Automated thermal mapping techniques using chromatic image analysis
NASA Technical Reports Server (NTRS)
Buck, Gregory M.
1989-01-01
Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.
A service-oriented data access control model
NASA Astrophysics Data System (ADS)
Meng, Wei; Li, Fengmin; Pan, Juchen; Song, Song; Bian, Jiali
2017-01-01
The development of mobile computing, cloud computing and distributed computing meets growing individual service needs. Faced with complex application systems, ensuring real-time, dynamic, and fine-grained data access control is an urgent problem. By analyzing common data access control models, and building on the mandatory access control model, the paper proposes a service-oriented access control model. By regarding system services as subjects and database data as objects, the model defines access levels and access identifications for subject and object, and ensures that system services access databases securely.
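A minimal sketch of such a mandatory-access-control-style check follows, with system services as subjects and database data as objects. The level values, the identification field, and the dominance-plus-matching rule are assumptions made for illustration; the paper's actual model may define these differently.

```python
from dataclasses import dataclass

@dataclass
class Subject:            # a system service
    name: str
    level: int            # clearance level assigned to the service
    ident: str            # access identification (e.g. the service's data category)

@dataclass
class DataObject:         # a database record or table
    name: str
    level: int            # sensitivity level of the data
    ident: str            # identification the data is tagged with

def may_read(service: Subject, data: DataObject) -> bool:
    """Assumed MAC-style rule: a service may read data only if its clearance dominates
    the data's sensitivity and the access identifications match."""
    return service.level >= data.level and service.ident == data.ident

billing = Subject("billing-service", level=2, ident="finance")
ledger = DataObject("ledger-table", level=2, ident="finance")
print(may_read(billing, ledger))   # True
```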
CPAD: Cyber-Physical Attack Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferragut, Erik M; Laska, Jason A
The CPAD technology relates to anomaly detection and more specifically to cyber physical attack detection. It infers underlying physical relationships between components by analyzing the sensor measurements of a system. It then uses these measurements to detect signs of a non-physically realizable state, which is indicative of an integrity attack on the system. CPAD can be used on any highly-instrumented cyber-physical system to detect integrity attacks and identify the component or components compromised. It has applications to power transmission and distribution, nuclear and industrial plants, and complex vehicles.
Trouvé, Hélène; Couturier, Yves; Etheridge, Francis; Saint-Jean, Olivier; Somme, Dominique
2010-01-01
Background: The literature on integration indicates the need for an enhanced theorization of institutional integration. This article proposes path dependence as an analytical framework to study the systems in which integration takes place. Purpose: PRISMA proposes a model for integrating health and social care services for older adults. This model was initially tested in Quebec. The PRISMA France study gave us an opportunity to analyze institutional integration in France. Methods: A qualitative approach was used. Analyses were based on semi-structured interviews with actors of all levels of decision-making, observations of advisory board meetings, and administrative documents. Results: Our analyses revealed the complexity and fragmentation of institutional integration. The path dependency theory, which analyzes the change capacity of institutions by taking into account their historic structures, allows analysis of this situation. The path dependency to the Bismarckian system and the incomplete reforms of gerontological policies generate the coexistence and juxtaposition of institutional systems. In such a context, no institution has sufficient ability to determine gerontology policy and build institutional integration by itself. Conclusion: Using path dependence as an analytical framework helps to understand the reasons why institutional integration is critical to organizational and clinical integration, and the complex construction of institutional integration in France. PMID:20689740
Learning surface molecular structures via machine vision
Ziatdinov, Maxim; Maksov, Artem; Kalinin, Sergei V.
2017-08-10
Recent advances in high resolution scanning transmission electron and scanning probe microscopies have allowed researchers to perform measurements of materials structural parameters and functional properties in real space with a picometre precision. In many technologically relevant atomic and/or molecular systems, however, the information of interest is distributed spatially in a non-uniform manner and may have a complex multi-dimensional nature. One of the critical issues, therefore, lies in being able to accurately identify (‘read out’) all the individual building blocks in different atomic/molecular architectures, as well as more complex patterns that these blocks may form, on a scale of hundreds and thousands of individual atomic/molecular units. Here we employ machine vision to read and recognize complex molecular assemblies on surfaces. Specifically, we combine a Markov random field model and convolutional neural networks to classify structural and rotational states of all individual building blocks in a molecular assembly on a metallic surface visualized in high-resolution scanning tunneling microscopy measurements. We show how the obtained full decoding of the system allows us to directly construct a pair density function—a centerpiece in analysis of the disorder-property relationship paradigm—as well as to analyze spatial correlations between multiple order parameters at the nanoscale, and elucidate reaction pathways involving molecular conformation changes. The method represents a significant shift in our way of analyzing atomic and/or molecular resolved microscopic images and can be applied to a variety of other microscopic measurements of structural, electronic, and magnetic orders in different condensed matter systems.
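Once every building block has been located and classified, a pair density function follows directly from the coordinates. The sketch below assumes detected 2-D positions are already available and computes a simple radial distribution g(r) without edge corrections; the field-of-view size and bin count are arbitrary.

```python
import numpy as np

def pair_density(positions, box_size, r_max, n_bins=50):
    """Radial pair density g(r) for 2-D positions detected in a square field of view."""
    positions = np.asarray(positions, dtype=float)
    n = len(positions)
    d = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=2)
    d = d[np.triu_indices(n, k=1)]                       # unique pairwise separations
    hist, edges = np.histogram(d[d < r_max], bins=n_bins, range=(0, r_max))
    r = 0.5 * (edges[1:] + edges[:-1])
    shell_area = 2.0 * np.pi * r * (edges[1] - edges[0])
    density = n / box_size ** 2
    # Normalize by the expected pair count for an uncorrelated (ideal-gas) arrangement.
    g = hist / (shell_area * density * n / 2.0)
    return r, g

pts = np.random.default_rng(6).uniform(0, 100, size=(400, 2))   # stand-in molecule positions (nm)
r, g = pair_density(pts, box_size=100.0, r_max=20.0)
print(np.round(g[:5], 2))
```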
Temporal-logic analysis of microglial phenotypic conversion with exposure to amyloid-β.
Anastasio, Thomas J
2015-02-01
Alzheimer Disease (AD) remains a leading killer with no adequate treatment. Ongoing research increasingly implicates the brain's immune system as a critical contributor to AD pathogenesis, but the complexity of the immune contribution poses a barrier to understanding. Here I use temporal logic to analyze a computational specification of the immune component of AD. Temporal logic is an extension of logic to propositions expressed in terms of time. It has traditionally been used to analyze computational specifications of complex engineered systems, but applications to complex biological systems are now appearing. The inflammatory component of AD involves the responses of microglia to the peptide amyloid-β (Aβ), which is an inflammatory stimulus and a likely causative AD agent. Temporal-logic analysis of the model provides explanations for the puzzling findings that Aβ induces an anti-inflammatory as well as a pro-inflammatory response, and that Aβ is phagocytized by microglia in young but not in old animals. To potentially explain the first puzzle, the model suggests that interferon-γ acts as an "autocrine bridge" over which an Aβ-induced increase in pro-inflammatory cytokines leads to an increase in anti-inflammatory mediators also. To potentially explain the second puzzle, the model identifies a potential instability in signaling via insulin-like growth factor 1 that could explain the failure of old microglia to phagocytize Aβ. The model predicts that augmentation of insulin-like growth factor 1 signaling, and activation of protein kinase C in particular, could move old microglia from a neurotoxic back toward a more neuroprotective and phagocytic phenotype.
The influence of bile salts on the distribution of simvastatin in the octanol/buffer system.
Đanić, Maja; Pavlović, Nebojša; Stanimirov, Bojan; Vukmirović, Saša; Nikolić, Katarina; Agbaba, Danica; Mikov, Momir
2016-01-01
The distribution coefficient (D) is a useful parameter for evaluating drug permeability across biological membranes, which is of importance for drug bioavailability. Given that bile acids are intensively studied as drug permeation-modifying and solubilizing agents, the aim of this study was to estimate the influence of the sodium salts of cholic (CA), deoxycholic (DCA) and 12-monoketocholic acid (MKC) on the distribution coefficient of simvastatin (SV) (lactone [SVL] and acid form [SVA]), a highly lipophilic compound with extremely low water solubility and bioavailability. LogD values of SVA and SVL with or without bile salts were measured by liquid-liquid extraction in n-octanol/buffer systems at pH 5 and 7.4. SV concentrations in the aqueous phase were determined by HPLC-DAD. The Chem3D Ultra program was applied for computation of physico-chemical properties of the analyzed compounds and their complexes. A statistically significant decrease in both SVA and SVL logD was observed for all three studied bile salts at both selected pH values. MKC exerted the most pronounced effect in the case of SVA, while there were no statistically significant differences between the observed bile salts for SVL. The calculated physico-chemical properties of the analyzed compounds and their complexes supported the experimental results. Our data indicate that the addition of bile salts into the n-octanol/buffer system decreases the values of the SV distribution coefficient at both studied pH values. This may be the result of the formation of hydrophilic complexes that increase the solubility of SV and could consequently impact the pharmacokinetic parameters of SV and the final drug response in patients.
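The distribution coefficient itself follows from the aqueous-phase concentrations measured before and after partitioning, via mass balance. The sketch below uses made-up concentrations and volumes purely to illustrate the calculation; it assumes all drug lost from the buffer phase has moved into the n-octanol phase.

```python
import math

def log_d(c_aq_initial, c_aq_final, v_octanol, v_buffer):
    """Distribution coefficient from aqueous concentrations before/after shaking with n-octanol.
    Assumes mass balance: drug lost from the buffer phase moved into the octanol phase."""
    amount_in_octanol = (c_aq_initial - c_aq_final) * v_buffer
    c_octanol = amount_in_octanol / v_octanol
    return math.log10(c_octanol / c_aq_final)

# Hypothetical example: a lipophilic drug with and without a bile salt in the buffer phase.
print(round(log_d(10.0, 0.4, v_octanol=1.0, v_buffer=1.0), 2))   # without bile salt: higher logD
print(round(log_d(10.0, 1.5, v_octanol=1.0, v_buffer=1.0), 2))   # with bile salt: lower logD
```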
Breaking into the Hebrew Verb System: A Learning Problem
ERIC Educational Resources Information Center
Ashkenazi, Orit; Ravid, Dorit; Gillis, Steven
2016-01-01
Verb learning is an important part of linguistic acquisition. The present study examines the early phases of verb acquisition in Hebrew, a language with complex derivational and inflectional verb morphology, analyzing verbs in dense recordings of CDS and CS of two Hebrew-speaking parent-child dyads aged 1;8-2;2. The goal was to pinpoint those cues…
Surfacing the Structures of Patriarchy: Teaching and Learning Threshold Concepts in Women's Studies
ERIC Educational Resources Information Center
Hassel, Holly; Reddinger, Amy; van Slooten, Jessica
2011-01-01
Patriarchy is a threshold concept in women's studies--a significant, defining concept that transforms students' understanding of the discipline. This article reviews our design, implementation, and findings of a lesson study crafted to teach women's studies students the complex idea of patriarchy as a social system. We analyze the lesson using…
Metaphor and Congruence in the Media: Barriers for International Students of Economics and Commerce.
ERIC Educational Resources Information Center
McGowan, Ursula
1997-01-01
Uses the systemic-functional concept of "congruence" to analyze media texts related to commentaries and reports surrounding the 1995 Australian Federal Budget. Results present insights into the complexity and inaccessibility of some media texts for students new to English in the context of economics or the Australian media. (17…
Selection of Educational Materials in the United States Public Schools.
ERIC Educational Resources Information Center
Institute for Educational Development, New York, NY.
The objective of this study was to collect "baseline" data with which to examine a complex process in the educational system--the selection of educational materials. The first part of the study analyzes the statutes of the fifty states which bear upon selection and purchase of educational materials. The purpose of this analysis is to…
Analysis of a Plant Transcriptional Regulatory Network Using Transient Expression Systems.
Díaz-Triviño, Sara; Long, Yuchen; Scheres, Ben; Blilou, Ikram
2017-01-01
In plant biology, transient expression systems have become valuable approaches used routinely to rapidly study protein expression, subcellular localization, protein-protein interactions, and transcriptional activity prior to in vivo studies. When studying transcriptional regulation, luciferase reporter assays offer a sensitive readout for assaying promoter behavior in response to different regulators or environmental contexts and for confirming and assessing the functional relevance of predicted binding sites in target promoters. This chapter aims to provide detailed methods for using the luciferase reporter system as a rapid, efficient, and versatile assay to analyze transcriptional regulation of target genes by transcriptional regulators. We describe a series of optimized transient expression systems consisting of Arabidopsis thaliana protoplasts, infiltrated Nicotiana benthamiana leaves, and human HeLa cells to study the regulation exerted by two well-characterized transcriptional regulators, SCARECROW (SCR) and SHORT-ROOT (SHR), on one of their targets, CYCLIN D6 (CYCD6). Here, we illustrate similarities and differences in outcomes when using different systems. The plant-based systems revealed that the SCR-SHR complex enhances CYCD6 transcription, while analysis in HeLa cells showed that the complex is not sufficient to strongly induce CYCD6 transcription, suggesting that additional, plant-specific regulators are required for full activation. These results highlight the importance of the chosen system and suggest that including heterologous systems, such as HeLa cells, can provide a more comprehensive analysis of a complex gene regulatory network.
On a biologically inspired topology optimization method
NASA Astrophysics Data System (ADS)
Kobayashi, Marcelo H.
2010-03-01
This work concerns the development of a biologically inspired methodology for the study of topology optimization in engineering and natural systems. The methodology is based on L-systems and their turtle interpretation for genotype-phenotype modeling of the topology development. The topology is analyzed using the finite element method and optimized using an evolutionary algorithm with a genetic encoding of the L-system and its turtle interpretation, as well as body shape and physical characteristics. The test cases considered in this work clearly show the suitability of the proposed method for the study of complex engineering and natural systems.
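To make the genotype-phenotype idea concrete, here is a minimal L-system rewriting and turtle interpretation in Python. The alphabet, production rule, and branching angle are textbook examples; they are not the encoding actually evolved in the paper.

    import math

    def expand(axiom, rules, iterations):
        # Parallel rewriting of the L-system string (the "genotype")
        s = axiom
        for _ in range(iterations):
            s = "".join(rules.get(ch, ch) for ch in s)
        return s

    def turtle_points(commands, step=1.0, angle_deg=25.0):
        # Turtle interpretation of the string: the resulting point set is the "phenotype"
        x, y, heading = 0.0, 0.0, 90.0
        stack, points = [], [(0.0, 0.0)]
        for ch in commands:
            if ch == "F":                       # move forward, depositing material
                x += step * math.cos(math.radians(heading))
                y += step * math.sin(math.radians(heading))
                points.append((x, y))
            elif ch == "+":
                heading += angle_deg
            elif ch == "-":
                heading -= angle_deg
            elif ch == "[":
                stack.append((x, y, heading))   # start a branch
            elif ch == "]":
                x, y, heading = stack.pop()     # return from the branch
        return points

    points = turtle_points(expand("F", {"F": "F[+F]F[-F]F"}, 3))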
On the interplay between mathematics and biology: hallmarks toward a new systems biology.
Bellomo, Nicola; Elaiw, Ahmed; Althiabi, Abdullah M; Alghamdi, Mohammed Ali
2015-03-01
This paper proposes a critical analysis of the existing literature on mathematical tools developed toward systems biology approaches and, out of this overview, develops a new approach whose main features can be briefly summarized as follows: derivation of mathematical structures suitable to capture the complexity of biological, hence living, systems; and modeling, by appropriate mathematical tools, of Darwinian-type dynamics, namely mutations followed by selection and evolution. Moreover, multiscale methods to move from genes to cells, and from cells to tissue, are analyzed in view of a new systems biology approach. Copyright © 2014 Elsevier B.V. All rights reserved.
Respiratory chain complex III deficiency in patients with tRNA-leu mutation.
Jiang, J; Wang, X L; Ma, Y Y
2015-12-29
The aim of this study was to investigate the clinical and genetic profiles of mitochondrial disease resulting from deficiencies in respiratory chain complex III. Three patients, aged between 8 months and 12 years, were recruited for this study. The activities of mitochondrial respiratory chain complexes in peripheral leucocytes were measured spectrophotometrically. The entire mitochondrial DNA (mtDNA) sequence was analyzed. Samples obtained from the three patients and their families were subjected to restriction fragment length polymorphism and gene sequencing analyses. mtDNA copy numbers of all patients and their mothers were analyzed. The patients displayed nervous system impairment, including motor and mental developmental delay, hypotonia, and motor regression. Two patients also suffered from Leigh syndrome. Assay of the mitochondrial respiratory chain enzymes revealed an isolated complex III deficiency in the three patients. The m.3243 A>G mutation was detected in all patients and their mothers. The mutation loads were 48.3, 57.2, and 45.5% in the patients, and 20.5, 16.4, and 23.6% in their respective mothers. The leukocyte mtDNA copy numbers of the patients and their mothers were within the control range. The clinical manifestations and genetics were observed to be very heterogeneous. Patients carrying an m.3243 A>G mutation may biochemically display a deficiency in mitochondrial respiratory chain complex III.
Villarreal A, Juan Carlos; Crandall-Stotler, Barbara J; Hart, Michelle L; Long, David G; Forrest, Laura L
2016-03-01
We present a complete generic-level phylogeny of the complex thalloid liverworts, a lineage that includes the model system Marchantia polymorpha. The complex thalloids are remarkable for their slow rate of molecular evolution and for being the only extant plant lineage to differentiate gas exchange tissues in the gametophyte generation. We estimated the divergence times and analyzed the evolutionary trends of morphological traits, including air chambers, rhizoids and specialized reproductive structures. A multilocus dataset was analyzed using maximum likelihood and Bayesian approaches. Relative rates were estimated using local clocks. Our phylogeny cements the early branching in complex thalloids. Marchantia is supported in one of the earliest divergent lineages. The rate of evolution in organellar loci is slower than for other liverwort lineages, except for two annual lineages. Most genera diverged in the Cretaceous. Marchantia polymorpha diversified in the Late Miocene, giving a minimum age estimate for the evolution of its sex chromosomes. The complex thalloid ancestor, excluding Blasiales, is reconstructed as a plant with a carpocephalum, with filament-less air chambers opening via compound pores, and without pegged rhizoids. Our comprehensive study of the group provides a temporal framework for the analysis of the evolution of critical traits essential for plants during land colonization. © 2015 Royal Botanic Garden Edinburgh. New Phytologist © 2015 New Phytologist Trust.
NASA Astrophysics Data System (ADS)
Wang, Guochao; Wang, Jun
2017-01-01
We investigate the fluctuation behaviors of financial volatility duration dynamics. A new concept of volatility two-component range intensity (VTRI) is developed, which combines the maximal variation range of volatility intensity and the shortest passage time of duration, and can quantify the investment risk in financial markets. In an attempt to study and describe the nonlinear complex properties of VTRI, a random agent-based financial price model is developed based on the finite-range interacting biased voter system. The autocorrelation behaviors and the power-law scaling behaviors of the return time series and the VTRI series are investigated. Then, the complexity of the VTRI series of the real markets and of the proposed model is analyzed by Fuzzy entropy (FuzzyEn) and Lempel-Ziv complexity. In this process, we apply the cross-Fuzzy entropy (C-FuzzyEn) to study the asynchrony of pairs of VTRI series. The empirical results reveal that the proposed model exhibits complex behaviors similar to those of the actual markets and indicate that the proposed stock VTRI series analysis and the financial model are meaningful and feasible to some extent.
Distributed Trajectory Flexibility Preservation for Traffic Complexity Mitigation
NASA Technical Reports Server (NTRS)
Idris, Husni; Wing, David; Delahaye, Daniel
2009-01-01
The growing demand for air travel is increasing the need for mitigation of air traffic congestion and complexity problems, which are already at high levels. At the same time new information and automation technologies are enabling the distribution of tasks and decisions from the service providers to the users of the air traffic system, with potential capacity and cost benefits. This distribution of tasks and decisions raises the concern that independent user actions will decrease the predictability and increase the complexity of the traffic system, hence inhibiting and possibly reversing any potential benefits. In answer to this concern, the authors propose the introduction of decision-making metrics for preserving user trajectory flexibility. The hypothesis is that such metrics will make user actions naturally mitigate traffic complexity. In this paper, the impact of using these metrics on traffic complexity is investigated. The scenarios analyzed include aircraft in en route airspace with each aircraft meeting a required time of arrival in a one-hour time horizon while mitigating the risk of loss of separation with the other aircraft, thus preserving its trajectory flexibility. The experiments showed promising results in that the individual trajectory flexibility preservation induced self-separation and self-organization effects in the overall traffic situation. The effects were quantified using traffic complexity metrics based on Lyapunov exponents and traffic proximity.
NASA Technical Reports Server (NTRS)
Xu, Xidong; Ulrey, Mike L.; Brown, John A.; Mast, James; Lapis, Mary B.
2013-01-01
NextGen is a complex socio-technical system and, in many ways, it is expected to be more complex than the current system. It is vital to assess the safety impact of the NextGen elements (technologies, systems, and procedures) in a rigorous and systematic way and to ensure that they do not compromise safety. In this study, the NextGen elements in the form of Operational Improvements (OIs), Enablers, Research Activities, Development Activities, and Policy Issues were identified. The overall hazard situation in NextGen was outlined; a high-level hazard analysis was conducted with respect to multiple elements in a representative NextGen OI known as OI-0349 (Automation Support for Separation Management); and the hazards resulting from the highly dynamic complexity involved in an OI-0349 scenario were illustrated. A selected but representative set of the existing safety methods, tools, processes, and regulations was then reviewed and analyzed regarding whether they are sufficient to assess safety in the elements of that OI and ensure that safety will not be compromised and whether they might incur intolerably high costs.
Visualization of water transport into soybean nodules by Tof-SIMS cryo system.
Iijima, Morio; Watanabe, Toshimasa; Yoshida, Tomoharu; Kawasaki, Michio; Kato, Toshiyuki; Yamane, Koji
2015-04-15
This paper examined the route of water supply into soybean nodules using a new visualization technique, the time of flight secondary ion mass spectrometry (Tof-SIMS) cryo system, and obtained circumstantial evidence for the water inflow route. The maximum resolution of the Tof-SIMS imaging used in this study was 1.8 μm (defined as the three-pixel step length), which allowed us to detect water movement at the cellular level. Deuterium-labeled water was supplied to soybean plants for 4 h, and the presence of deuterium in soybean nodules was analyzed by the Tof-SIMS cryo system. Deuterium ions were found only in the endodermis tissue surrounding the central cylinder in soybean nodules. Neither the xylem vessels nor the phloem complex itself showed any deuterium accumulation. Deuterium-ion counts in the endodermis tissue were not changed by a girdling treatment, which restricted water movement through the phloem complex. The results strongly indicated that nodule tissues did not receive water directly from the phloem complex, but instead received water from the root axis through the root cortex apoplastic pathway. Copyright © 2015 Elsevier GmbH. All rights reserved.
Synchronization Experiments With A Global Coupled Model of Intermediate Complexity
NASA Astrophysics Data System (ADS)
Selten, Frank; Hiemstra, Paul; Shen, Mao-Lin
2013-04-01
In the super modeling approach, an ensemble of imperfect models is connected through nudging terms that nudge the solution of each model toward the solutions of all other models in the ensemble. The goal is to obtain a synchronized state, through a proper choice of connection strengths, that closely tracks the trajectory of the true system. For the super modeling approach to be successful, the connections should be dense and strong enough for synchronization to occur. In this study we analyze the behavior of an ensemble of connected global atmosphere-ocean models of intermediate complexity. All atmosphere models are connected to the same ocean model through the surface fluxes of heat, water and momentum; the ocean is integrated using weighted-average surface fluxes. In particular we analyze the degree of synchronization between the atmosphere models and the characteristics of the ensemble-mean solution. The results are interpreted using a low-order atmosphere-ocean toy model.
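The nudging idea can be sketched with a pair of toy chaotic models rather than the intermediate-complexity climate models used in the study. In the Python sketch below, two Lorenz-63 systems with different parameters stand in for the imperfect ensemble members, and the coupling strength k plays the role of the connection strength; all numbers are illustrative.

    import numpy as np

    def lorenz(state, sigma, rho, beta=8.0 / 3.0):
        x, y, z = state
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

    def nudged_pair(k, dt=0.005, steps=20000):
        # Two "imperfect" models (different rho) nudged toward each other's state;
        # with k = 0 they drift apart, with sufficiently large k they synchronize.
        a = np.array([1.0, 1.0, 1.0])
        b = np.array([1.1, 0.9, 1.0])
        for _ in range(steps):
            da = lorenz(a, 10.0, 28.0) + k * (b - a)
            db = lorenz(b, 10.0, 32.0) + k * (a - b)
            a, b = a + dt * da, b + dt * db
        return np.linalg.norm(a - b)

    print(nudged_pair(0.0), nudged_pair(5.0))   # unconnected vs. (nearly) synchronized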
Improving and analyzing signage within a healthcare setting.
Rousek, J B; Hallbeck, M S
2011-11-01
Healthcare facilities are increasingly utilizing pictograms rather than text signs to help direct people. The purpose of this study was to analyze a wide variety of standardized healthcare pictograms and the effects of color contrasts and complexity for participants with both normal and impaired vision. Fifty (25 males, 25 females) participants completed a signage recognition questionnaire and identified pictograms while wearing vision simulators to represent specific visual impairment. The study showed that certain color contrasts, complexities and orientations can help or hinder comprehension of signage for people with and without visual impairment. High contrast signage with consistent pictograms involving human figures (not too detailed or too abstract) is most identifiable. Standardization of healthcare signage is recommended to speed up and aid the cognitive thought process in detecting signage and determining meaning. These fundamental signage principles are critical in producing an efficient, universal wayfinding system for healthcare facilities. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Maintenance of Time and Frequency in the DSN Using the Global Positioning System
NASA Technical Reports Server (NTRS)
Clements, P. A.; Kirk, A.; Borutzki, S. E.
1985-01-01
The Deep Space Network must maintain time and frequency within specified limits in order to accurately track the spacecraft engaged in deep space exploration. The DSN has three tracking complexes, located approximately equidistantly around the Earth. Various methods are used to coordinate the clocks among the three complexes. These methods include Loran-C, TV Line 10, very long baseline interferometry (VLBI), and the Global Positioning System (GPS). The GPS is becoming increasingly important because of the accuracy, precision, and rapid availability of the data; GPS receivers have been installed at each of the DSN complexes and are used to obtain daily time offsets between the master clock at each site and UTC(USNO/NBS). Calculations are made to obtain frequency offsets and Allan variances. These data are analyzed and used to monitor the performance of the hydrogen masers that provide the reference frequencies for the DSN frequency and timing system (DFT). A brief history of the GPS timing receivers in the DSN, a description of the data and information flow, data on the performance of the DSN master clocks and GPS measurement system, and a description of hydrogen maser frequency steering using these data are presented.
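The Allan variance mentioned above is a standard two-sample statistic of fractional-frequency data. The sketch below uses the simple non-overlapping estimator on synthetic daily clock offsets; the noise level and the choice of estimator are illustrative, not the DSN's actual processing.

    import numpy as np

    def allan_variance(y, m):
        # Non-overlapping Allan variance of fractional-frequency samples y,
        # re-averaged over groups of m samples (averaging time tau = m * tau0).
        n = len(y) // m
        y_avg = np.asarray(y[:n * m]).reshape(n, m).mean(axis=1)
        return 0.5 * np.mean(np.diff(y_avg) ** 2)

    # synthetic daily time offsets (seconds) -> fractional frequency over tau0 = 1 day
    rng = np.random.default_rng(1)
    offsets = np.cumsum(rng.normal(0.0, 1e-13, size=365))
    freq = np.diff(offsets) / 86400.0
    print(allan_variance(freq, 1), allan_variance(freq, 10))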
Parametric Analysis of a Hover Test Vehicle using Advanced Test Generation and Data Analysis
NASA Technical Reports Server (NTRS)
Gundy-Burlet, Karen; Schumann, Johann; Menzies, Tim; Barrett, Tony
2009-01-01
Large complex aerospace systems are generally validated in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. This is due to the large parameter space and the complex, highly coupled nonlinear nature of the different systems that contribute to the performance of the aerospace system. We have addressed the factors deterring such an analysis by applying a combination of technologies to the area of flight envelope assessment. We utilize n-factor (2,3) combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. The data generated are automatically analyzed through a combination of unsupervised learning using a Bayesian multivariate clustering technique (AutoBayes) and supervised learning of critical parameter ranges using the machine-learning tool TAR3, a treatment learner. Covariance analysis with scatter plots and likelihood contours is used to visualize correlations between simulation parameters and simulation results, a task that requires tool support, especially for large and complex models. We present results of simulation experiments for a cold-gas-powered hover test vehicle.
The emergence of mind and brain: an evolutionary, computational, and philosophical approach.
Mainzer, Klaus
2008-01-01
Modern philosophy of mind cannot be understood without recent developments in computer science, artificial intelligence (AI), robotics, neuroscience, biology, linguistics, and psychology. Classical philosophy of formal languages as well as symbolic AI assume that all kinds of knowledge must explicitly be represented by formal or programming languages. This assumption is limited by recent insights into the biology of evolution and developmental psychology of the human organism. Most of our knowledge is implicit and unconscious. It is not formally represented, but embodied knowledge, which is learnt by doing and understood by bodily interacting with changing environments. That is true not only for low-level skills, but even for high-level domains of categorization, language, and abstract thinking. The embodied mind is considered an emergent capacity of the brain as a self-organizing complex system. Actually, self-organization has been a successful strategy of evolution to handle the increasing complexity of the world. Genetic programs are not sufficient and cannot prepare the organism for all kinds of complex situations in the future. Self-organization and emergence are fundamental concepts in the theory of complex dynamical systems. They are also applied in organic computing as a recent research field of computer science. Therefore, cognitive science, AI, and robotics try to model the embodied mind in an artificial evolution. The paper analyzes these approaches in the interdisciplinary framework of complex dynamical systems and discusses their philosophical impact.
A stochastic method for stand-alone photovoltaic system sizing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cabral, Claudia Valeria Tavora; Filho, Delly Oliveira; Martins, Jose Helvecio
Photovoltaic systems utilize solar energy to generate electrical energy to meet load demands. Optimal sizing of these systems includes the characterization of solar radiation. Solar radiation at the Earth's surface has random characteristics and has been the focus of various academic studies. The objective of this study was to stochastically analyze parameters involved in the sizing of photovoltaic generators and to develop a methodology for sizing stand-alone photovoltaic systems. Energy storage for isolated systems and solar radiation were analyzed stochastically due to their random behavior. For the development of the proposed methodology, stochastic analyses were studied, including the Markov chain and the beta probability density function. The obtained results were compared with those for stand-alone sizing using the Sandia method (deterministic), relative to which the stochastic model presented more reliable values. Both models present advantages and disadvantages; however, the stochastic one is more complex and provides more reliable and realistic results. (author)
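One way such a stochastic sizing check can be framed is as a Monte Carlo simulation in which daily solar input is drawn from a beta distribution and the battery state of charge is tracked. The sketch below keeps only the beta-sampling part of the approach (the Markov-chain component is omitted for brevity), and every parameter value is hypothetical.

    import numpy as np

    def loss_of_load_probability(days=100000, array_kwh_clear=8.0, load_kwh=4.0,
                                 battery_kwh=8.0, a=2.0, b=1.5, seed=0):
        # Daily clearness index drawn from a beta(a, b) distribution; the battery
        # state of charge is tracked and days with unmet load are counted.
        rng = np.random.default_rng(seed)
        soc, failures = battery_kwh, 0
        for kt in rng.beta(a, b, size=days):
            soc += array_kwh_clear * kt - load_kwh
            if soc < 0.0:
                failures += 1
                soc = 0.0
            soc = min(soc, battery_kwh)
        return failures / days

    print(loss_of_load_probability())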
Analyzing checkpointing trends for applications on the IBM Blue Gene/P system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Naik, H.; Gupta, R.; Beckman, P.
Current petascale systems have tens of thousands of hardware components and complex system software stacks, which increase the probability of faults occurring during the lifetime of a process. Checkpointing has been a popular method of providing fault tolerance in high-end systems. While considerable research has been done to optimize checkpointing, in practice the method still involves a high-cost overhead for users. In this paper, we study the checkpointing overhead seen by applications running on leadership-class machines such as the IBM Blue Gene/P at Argonne National Laboratory. We study various applications and design a methodology to assist users in understanding and choosing checkpointing frequency and in reducing the overhead incurred. In particular, we study three popular applications -- the Grid-Based Projector-Augmented Wave application, the Carr-Parrinello Molecular Dynamics application, and a Nek5000 computational fluid dynamics application -- and analyze their memory usage and possible checkpointing trends on 32,768 processors of the Blue Gene/P system.
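A common rule of thumb for choosing a checkpointing frequency, shown here only as background and not as the paper's own methodology, is Young's first-order approximation, which balances checkpoint cost against expected rework after a failure. The numbers in the sketch are hypothetical.

    import math

    def young_interval(checkpoint_seconds, mtbf_seconds):
        # Young's first-order approximation for the checkpoint interval
        return math.sqrt(2.0 * checkpoint_seconds * mtbf_seconds)

    # hypothetical: 10-minute checkpoints, one failure per day on average
    print(young_interval(600.0, 86400.0) / 3600.0)   # roughly 2.8 hours between checkpoints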
Design for testability and diagnosis at the system-level
NASA Technical Reports Server (NTRS)
Simpson, William R.; Sheppard, John W.
1993-01-01
The growing complexity of full-scale systems has surpassed the capabilities of most simulation software to provide detailed models or gate-level failure analyses. The process of system-level diagnosis approaches the fault-isolation problem in a manner that differs significantly from the traditional and exhaustive failure mode search. System-level diagnosis is based on a functional representation of the system. For example, one can exercise one portion of a radar algorithm (the Fast Fourier Transform (FFT) function) by injecting several standard input patterns and comparing the results to standardized output results. An anomalous output would point to one of several items (including the FFT circuit) without specifying the gate or failure mode. For system-level repair, identifying an anomalous chip is sufficient. We describe here an information theoretic and dependency modeling approach that discards much of the detailed physical knowledge about the system and analyzes its information flow and functional interrelationships. The approach relies on group and flow associations and, as such, is hierarchical. Its hierarchical nature allows the approach to be applicable to any level of complexity and to any repair level. This approach has been incorporated in a product called STAMP (System Testability and Maintenance Program) which was developed and refined through more than 10 years of field-level applications to complex system diagnosis. The results have been outstanding, even spectacular in some cases. In this paper we describe system-level testability, system-level diagnoses, and the STAMP analysis approach, as well as a few STAMP applications.
Complex network approach to classifying classical piano compositions
NASA Astrophysics Data System (ADS)
Xin, Chen; Zhang, Huishu; Huang, Jiping
2016-10-01
Complex networks have been regarded as a useful tool for handling systems with vague interactions, and numerous applications have arisen. In this paper we construct complex networks for 770 classical piano compositions by Mozart, Beethoven and Chopin based on musical note pitches and lengths. We find prominent distinctions among the network edges of different composers. Some stylized facts can be explained by such parameters of network structure and topology. Further, we propose two classification methods for musical styles and genres according to the discovered distinctions. These methods are easy to implement and the results are sound. This work suggests that complex networks could be a decent way to analyze the characteristics of musical notes, since they provide a deep view into the relationships among notes in musical compositions and evidence for classifying different composers, styles and genres of music.
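A minimal sketch of how such a network can be built from a score follows: nodes are (pitch, length) pairs and directed edges connect consecutive notes. The short note list is made up, and the authors' exact node and edge definitions may differ.

    from collections import defaultdict

    def note_network(notes):
        # Directed, weighted transition network: nodes are (pitch, length) pairs,
        # edges connect notes that occur consecutively in the score.
        edges = defaultdict(int)
        for a, b in zip(notes, notes[1:]):
            edges[(a, b)] += 1
        return edges

    piece = [("C4", 1.0), ("E4", 0.5), ("G4", 0.5), ("C4", 1.0), ("E4", 0.5), ("G4", 0.5)]
    edges = note_network(piece)
    successors = defaultdict(set)
    for (a, b) in edges:
        successors[a].add(b)
    print({node: len(s) for node, s in successors.items()})   # out-degree per node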
Nuclear localization of Schizosaccharomyces pombe Mcm2/Cdc19p requires MCM complex assembly.
Pasion, S G; Forsburg, S L
1999-12-01
The minichromosome maintenance (MCM) proteins MCM2-MCM7 are conserved eukaryotic replication factors that assemble in a heterohexameric complex. In fission yeast, these proteins are nuclear throughout the cell cycle. In studying the mechanism that regulates assembly of the MCM complex, we analyzed the cis and trans elements required for nuclear localization of a single subunit, Mcm2p. Mutation of any single mcm gene leads to redistribution of wild-type MCM subunits to the cytoplasm, and this redistribution depends on an active nuclear export system. We identified the nuclear localization signal sequences of Mcm2p and showed that these are required for nuclear targeting of other MCM subunits. In turn, Mcm2p must associate with other MCM proteins for its proper localization; nuclear localization of MCM proteins thus requires assembly of MCM proteins in a complex. We suggest that coupling complex assembly to nuclear targeting and retention ensures that only intact heterohexameric MCM complexes remain nuclear.
Nuclear Localization of Schizosaccharomyces pombe Mcm2/Cdc19p Requires MCM Complex Assembly
Pasion, Sally G.; Forsburg, Susan L.
1999-01-01
The minichromosome maintenance (MCM) proteins MCM2–MCM7 are conserved eukaryotic replication factors that assemble in a heterohexameric complex. In fission yeast, these proteins are nuclear throughout the cell cycle. In studying the mechanism that regulates assembly of the MCM complex, we analyzed the cis and trans elements required for nuclear localization of a single subunit, Mcm2p. Mutation of any single mcm gene leads to redistribution of wild-type MCM subunits to the cytoplasm, and this redistribution depends on an active nuclear export system. We identified the nuclear localization signal sequences of Mcm2p and showed that these are required for nuclear targeting of other MCM subunits. In turn, Mcm2p must associate with other MCM proteins for its proper localization; nuclear localization of MCM proteins thus requires assembly of MCM proteins in a complex. We suggest that coupling complex assembly to nuclear targeting and retention ensures that only intact heterohexameric MCM complexes remain nuclear. PMID:10588642
Zhang, Xiulan; Bloom, Gerald; Xu, Xiaoxin; Chen, Lin; Liang, Xiaoyun; Wolcott, Sara J
2014-08-26
This paper explores the evolution of schemes for rural finance in China as a case study of the long and complex process of health system development. It argues that the evolution of these schemes has been the outcome of the response of a large number of agents to a rapidly changing context and of efforts by the government to influence this adaptation process and achieve public health goals. The study draws on several sources of data including a review of official policy documents and academic papers and in-depth interviews with key policy actors at national level and at a sample of localities. The study identifies three major transition points associated with changes in broad development strategy and demonstrates how the adaptation of large numbers of actors to these contextual changes had a major impact on the performance of the health system. Further, it documents how the Ministry of Health viewed its role as both an advocate for the interests of health facilities and health workers and as the agency responsible for ensuring that government health system objectives were met. It is argued that a major reason for the resilience of the health system and its ability to adapt to rapid economic and institutional change was the ability of the Ministry to provide overall strategy leadership. Additionally, it postulates that a number of interest groups have emerged, which now also seek to influence the pathway of health system development. This history illustrates the complex and political nature of the management of health system development and reform. The paper concludes that governments will need to increase their capacity to analyze the health sector as a complex system and to manage change processes.
Data management system performance modeling
NASA Technical Reports Server (NTRS)
Kiser, Larry M.
1993-01-01
This paper discusses analytical techniques that have been used to gain a better understanding of the Space Station Freedom's (SSF's) Data Management System (DMS). The DMS is a complex, distributed, real-time computer system that has been redesigned numerous times. The implications of these redesigns have not been fully analyzed. This paper discusses the advantages and disadvantages of static analytical techniques such as Rate Monotonic Analysis (RMA) and also provides a rationale for dynamic modeling. Factors such as system architecture, processor utilization, bus architecture, queuing, etc. are well suited for analysis with a dynamic model. The significance of performance measures for a real-time system is discussed.
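For context, the simplest static RMA check is the Liu and Layland sufficient utilization bound for rate-monotonic scheduling. The task set in the sketch is hypothetical, not the DMS workload.

    def rma_schedulable(tasks):
        # tasks: list of (worst-case execution time, period) with implicit deadlines.
        # Liu & Layland sufficient utilization bound for rate-monotonic scheduling.
        n = len(tasks)
        utilization = sum(c / p for c, p in tasks)
        bound = n * (2.0 ** (1.0 / n) - 1.0)
        return utilization, bound, utilization <= bound

    print(rma_schedulable([(1.0, 4.0), (2.0, 8.0), (1.0, 10.0)]))   # U = 0.60 <= ~0.78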
The design of multiplayer online video game systems
NASA Astrophysics Data System (ADS)
Hsu, Chia-chun A.; Ling, Jim; Li, Qing; Kuo, C.-C. J.
2003-11-01
The distributed Multiplayer Online Game (MOG) system is complex since it involves technologies in computer graphics, multimedia, artificial intelligence, computer networking, embedded systems, etc. Due to the large scope of this problem, the design of MOG systems has not yet been widely addressed in the literature. In this paper, we review and analyze the current MOG system architecture, followed by an evaluation. Furthermore, we propose a clustered-server architecture to provide a scalable solution together with a region-oriented allocation strategy. Two key issues, i.e. interest management and synchronization, are discussed in depth. Some preliminary ideas to deal with the identified problems are described.
Optical spring effect in nanoelectromechanical systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tian, Feng; Zhou, Guangya, E-mail: mpezgy@nus.edu.sg; Du, Yu
2014-08-11
In this Letter, we report a hybrid system consisting of nano-optical and nano-mechanical springs, in which the optical spring effect works to adjust the mechanical frequency of a nanoelectromechanical systems resonator. Nano-scale folded beams are fabricated as the mechanical springs, and double-coupled one-dimensional photonic crystal cavities are used to pump the “optical spring.” The dynamic characteristics of this hybrid system are measured and analyzed at both low and high input optical powers. This study extends the physical phenomenon of optomechanics to complex nano-opto-electro-mechanical systems (NOEMS) and could benefit future applications of NOEMS in chip-level communication and sensing.
Identifying States of a Financial Market
NASA Astrophysics Data System (ADS)
Münnix, Michael C.; Shimada, Takashi; Schäfer, Rudi; Leyvraz, Francois; Seligman, Thomas H.; Guhr, Thomas; Stanley, H. Eugene
2012-09-01
The understanding of complex systems has become a central issue because such systems exist in a wide range of scientific disciplines. We here focus on financial markets as an example of a complex system. In particular we analyze financial data from the S&P 500 stocks in the 19-year period 1992-2010. We propose a definition of state for a financial market and use it to identify points of drastic change in the correlation structure. These points are mapped to occurrences of financial crises. We find that a wide variety of characteristic correlation structure patterns exist in the observation time window, and that these characteristic correlation structure patterns can be classified into several typical ``market states''. Using this classification we recognize transitions between different market states. A similarity measure we develop thus affords means of understanding changes in states and of recognizing developments not previously seen.
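The core construction, windowed correlation matrices compared through a similarity measure, can be sketched briefly. The synthetic returns and the mean-absolute-difference distance below are illustrative; the paper's concrete similarity measure and windowing choices may differ.

    import numpy as np

    def window_correlations(returns, window=60):
        # returns: array of shape (time, stocks); one correlation matrix per
        # non-overlapping window of `window` trading days.
        mats = []
        for start in range(0, returns.shape[0] - window + 1, window):
            mats.append(np.corrcoef(returns[start:start + window].T))
        return mats

    def matrix_distance(c1, c2):
        # one simple choice of similarity measure: mean absolute element-wise difference
        return np.abs(c1 - c2).mean()

    rng = np.random.default_rng(3)
    fake_returns = rng.normal(0.0, 0.01, size=(1000, 30))
    mats = window_correlations(fake_returns)
    print(matrix_distance(mats[0], mats[1]))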
Information driving force and its application in agent-based modeling
NASA Astrophysics Data System (ADS)
Chen, Ting-Ting; Zheng, Bo; Li, Yan; Jiang, Xiong-Fei
2018-04-01
Exploring the scientific impact of online big data has attracted much attention from researchers in different fields in recent years. Complex financial systems are typical open systems profoundly influenced by external information. Based on large-scale data from the public media and stock markets, we first define an information driving force and analyze how it affects the complex financial system. The information driving force is observed to be asymmetric in the bull and bear market states. As an application, we then propose an agent-based model driven by the information driving force. In particular, all the key parameters are determined from the empirical analysis rather than from statistical fitting of the simulation results. With our model, both the stationary properties and the non-stationary dynamic behaviors are simulated. Considering the mean-field effect of the external information, we also propose a few-body model to simulate the financial market in the laboratory.
Identifying states of a financial market.
Münnix, Michael C; Shimada, Takashi; Schäfer, Rudi; Leyvraz, Francois; Seligman, Thomas H; Guhr, Thomas; Stanley, H Eugene
2012-01-01
The understanding of complex systems has become a central issue because such systems exist in a wide range of scientific disciplines. We here focus on financial markets as an example of a complex system. In particular we analyze financial data from the S&P 500 stocks in the 19-year period 1992-2010. We propose a definition of state for a financial market and use it to identify points of drastic change in the correlation structure. These points are mapped to occurrences of financial crises. We find that a wide variety of characteristic correlation structure patterns exist in the observation time window, and that these characteristic correlation structure patterns can be classified into several typical "market states". Using this classification we recognize transitions between different market states. A similarity measure we develop thus affords means of understanding changes in states and of recognizing developments not previously seen.
Quantifying the Behavior of Stock Correlations Under Market Stress
Preis, Tobias; Kenett, Dror Y.; Stanley, H. Eugene; Helbing, Dirk; Ben-Jacob, Eshel
2012-01-01
Understanding correlations in complex systems is crucial in the face of turbulence, such as the ongoing financial crisis. However, in complex systems, such as financial systems, correlations are not constant but instead vary in time. Here we address the question of quantifying state-dependent correlations in stock markets. Reliable estimates of correlations are absolutely necessary to protect a portfolio. We analyze 72 years of daily closing prices of the 30 stocks forming the Dow Jones Industrial Average (DJIA). We find the striking result that the average correlation among these stocks scales linearly with market stress reflected by normalized DJIA index returns on various time scales. Consequently, the diversification effect which should protect a portfolio melts away in times of market losses, just when it would most urgently be needed. Our empirical analysis is consistent with the interesting possibility that one could anticipate diversification breakdowns, guiding the design of protected portfolios. PMID:23082242
A finite element formulation for scattering from electrically large 2-dimensional structures
NASA Technical Reports Server (NTRS)
Ross, Daniel C.; Volakis, John L.
1992-01-01
A finite element formulation is given using the scattered field approach with a fictitious material absorber to truncate the mesh. The formulation includes the use of arbitrary approximation functions so that more accurate results can be achieved without any modification to the software. Additionally, non-polynomial approximation functions can be used, including complex approximation functions. The banded system that results is solved with an efficient sparse/banded iterative scheme and as a consequence, large structures can be analyzed. Results are given for simple cases to verify the formulation and also for large, complex geometries.
Emerging properties of financial time series in the ``Game of Life''
NASA Astrophysics Data System (ADS)
Hernández-Montoya, A. R.; Coronel-Brizio, H. F.; Stevens-Ramírez, G. A.; Rodríguez-Achach, M.; Politi, M.; Scalas, E.
2011-12-01
We explore the spatial complexity of Conway’s “Game of Life,” a prototypical cellular automaton by means of a geometrical procedure generating a two-dimensional random walk from a bidimensional lattice with periodical boundaries. The one-dimensional projection of this process is analyzed and it turns out that some of its statistical properties resemble the so-called stylized facts observed in financial time series. The scope and meaning of this result are discussed from the viewpoint of complex systems. In particular, we stress how the supposed peculiarities of financial time series are, often, overrated in their importance.
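The starting point of such an analysis is the lattice update itself. The sketch below implements one synchronous Game of Life step on a torus (matching the periodic boundaries mentioned above) and derives a simple scalar series from the evolving configuration; the authors' actual geometrical walk-construction procedure is not reproduced, and the population-count series is only a stand-in.

    import numpy as np

    def life_step(grid):
        # One synchronous update of Conway's Game of Life with periodic boundaries
        neighbors = sum(np.roll(np.roll(grid, i, axis=0), j, axis=1)
                        for i in (-1, 0, 1) for j in (-1, 0, 1) if (i, j) != (0, 0))
        return ((neighbors == 3) | ((grid == 1) & (neighbors == 2))).astype(int)

    rng = np.random.default_rng(2)
    grid = (rng.random((64, 64)) < 0.3).astype(int)
    history = [grid.sum()]
    for _ in range(200):
        grid = life_step(grid)
        history.append(grid.sum())   # a one-dimensional series derived from the lattice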
Akemann, G; Bloch, J; Shifrin, L; Wettig, T
2008-01-25
We analyze how individual eigenvalues of the QCD Dirac operator at nonzero quark chemical potential are distributed in the complex plane. Exact and approximate analytical results for both quenched and unquenched distributions are derived from non-Hermitian random matrix theory. When comparing these to quenched lattice QCD spectra close to the origin, excellent agreement is found for zero and nonzero topology at several values of the quark chemical potential. Our analytical results are also applicable to other physical systems in the same symmetry class.
SABRINA - an interactive geometry modeler for MCNP
DOE Office of Scientific and Technical Information (OSTI.GOV)
West, J.T.; Murphy, J.
One of the most difficult tasks when analyzing a complex three-dimensional system with Monte Carlo is geometry model development. SABRINA attempts to make the modeling process more user-friendly and less of an obstacle. It accepts both combinatorial solid bodies and MCNP surfaces and produces MCNP cells. The model development process in SABRINA is highly interactive and gives the user immediate feedback on errors. Users can view their geometry from arbitrary perspectives while the model is under development and interactively find and correct modeling errors. An example of a SABRINA display is shown. It represents a complex three-dimensional shape.
Emerging properties of financial time series in the "Game of Life".
Hernández-Montoya, A R; Coronel-Brizio, H F; Stevens-Ramírez, G A; Rodríguez-Achach, M; Politi, M; Scalas, E
2011-12-01
We explore the spatial complexity of Conway's "Game of Life," a prototypical cellular automaton by means of a geometrical procedure generating a two-dimensional random walk from a bidimensional lattice with periodical boundaries. The one-dimensional projection of this process is analyzed and it turns out that some of its statistical properties resemble the so-called stylized facts observed in financial time series. The scope and meaning of this result are discussed from the viewpoint of complex systems. In particular, we stress how the supposed peculiarities of financial time series are, often, overrated in their importance.
Dekker, Sidney
2012-05-01
Complexity is a defining characteristic of healthcare, and ergonomic interventions in clinical practice need to take into account aspects vital for the success or failure of new technology. The introduction of new monitoring technology, for example, creates many ripple effects through clinical relationships and agents' cross-adaptations. This paper uses the signal detection paradigm to account for a case in which multiple clinical decision makers, across power hierarchies and gender gaps, manipulate each other's sensitivities to evidence and decision criteria. These are possible to analyze and predict with an applied ergonomics that is sensitive to the social complexities of the workplace, including power, gender, hierarchy and fuzzy system boundaries. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.
NASA Astrophysics Data System (ADS)
Delignières, Didier; Marmelat, Vivien
2014-01-01
In this paper, we analyze empirical data, accounting for coordination processes between complex systems (bimanual coordination, interpersonal coordination, and synchronization with a fractal metronome), by using a recently proposed method: detrended cross-correlation analysis (DCCA). This work is motivated by the strong anticipation hypothesis, which supposes that coordination between complex systems is not achieved on the basis of local adaptations (i.e., correction, predictions), but results from a more global matching of complexity properties. Indeed, recent experiments have evidenced a very close correlation between the scaling properties of the series produced by two coordinated systems, despite a quite weak local synchronization. We hypothesized that strong anticipation should result in the presence of long-range cross-correlations between the series produced by the two systems. Results allow a detailed analysis of the effects of coordination on the fluctuations of the series produced by the two systems. In the long term, series tend to present similar scaling properties, with clear evidence of long-range cross-correlation. Short-term results strongly depend on the nature of the task. Simulation studies allow disentangling the respective effects of noise and short-term coupling processes on DCCA results, and suggest that the matching of long-term fluctuations could be the result of short-term coupling processes.
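A simplified version of the DCCA fluctuation function can be written compactly: both series are integrated, detrended linearly in boxes of size n, and the residual covariances are averaged. The non-overlapping-box variant and the synthetic data below are illustrative; the authors' exact estimator may differ in detail.

    import numpy as np

    def dcca_fluctuation(x, y, scales):
        # Simplified detrended cross-correlation analysis. The slope of
        # log F(n) vs. log n estimates the cross-correlation scaling exponent.
        x = np.cumsum(np.asarray(x, float) - np.mean(x))
        y = np.cumsum(np.asarray(y, float) - np.mean(y))
        f = []
        for n in scales:
            covs = []
            t = np.arange(n)
            for start in range(0, len(x) - n + 1, n):
                xs, ys = x[start:start + n], y[start:start + n]
                rx = xs - np.polyval(np.polyfit(t, xs, 1), t)
                ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
                covs.append(np.mean(rx * ry))
            f.append(np.sqrt(abs(np.mean(covs))))
        return np.array(f)

    rng = np.random.default_rng(4)
    common = rng.normal(size=4096)
    a = common + rng.normal(size=4096)
    b = common + rng.normal(size=4096)
    print(dcca_fluctuation(a, b, [16, 64, 256]))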
Carvalho, Marilia Sá; Coeli, Claudia Medina; Chor, Dóra; Pinheiro, Rejane Sobrino; da Fonseca, Maria de Jesus Mendes; de Sá Carvalho, Luiz Carlos
2015-01-01
The most common modeling approaches to understanding the incidence, prevalence and control of chronic diseases in populations, such as statistical regression models, are limited when it comes to dealing with the complexity of those problems. Those complex adaptive systems have characteristics such as emergent properties, self-organization and feedbacks, which structure the system's stability and resistance to change. Recently, systems science approaches have been proposed to deal with the range, complexity, and multifactor nature of those public health problems. In this paper we applied a multilevel systemic approach to create an integrated, coherent, and increasingly precise conceptual framework, capable of aggregating different partial or specialized studies, based on the challenges of the Longitudinal Study of Adult Health (ELSA-Brasil). The failure to control blood pressure found in several of the study's subjects was discussed on the basis of the proposed model, analyzing the different loops, time lags, and feedbacks that influence this outcome in a population with a high educational level and reasonably good access to health services. We were able to identify the internal circularities and cycles that generate the system's resistance to change. We believe that this study can contribute new possibilities to the research agenda and to the discussion of integrated actions in the field of public health. PMID:26171854
System model of the processing of heterogeneous sensory information in a robotized complex
NASA Astrophysics Data System (ADS)
Nikolaev, V.; Titov, V.; Syryamkin, V.
2018-05-01
The scope and types of robotic systems that contain a heterogeneous-sensor data processing subsystem are analyzed. On the basis of queuing theory, a model is developed that takes into account the uneven intensity of the information flow from the sensors to the information processing subsystem. An analytical solution is obtained to assess the relationship between subsystem performance and the unevenness of the flows. The obtained solution is investigated over the range of parameter values of practical interest.
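The abstract does not specify the queueing discipline used, so the sketch below shows only the simplest stand-in, a steady-state M/M/1 queue, to illustrate how sensor arrival rate and processing rate translate into subsystem performance; the bursty (uneven) arrivals the paper models would need a more elaborate arrival process.

    def mm1_metrics(arrival_rate, service_rate):
        # Steady-state M/M/1 results; valid only when arrival_rate < service_rate.
        rho = arrival_rate / service_rate                 # utilization
        mean_in_system = rho / (1.0 - rho)                # average number of messages
        mean_time = 1.0 / (service_rate - arrival_rate)   # average time in the subsystem
        return rho, mean_in_system, mean_time

    # hypothetical sensor stream: 8 messages/s arriving, 10 messages/s processed
    print(mm1_metrics(8.0, 10.0))   # (0.8, 4.0 messages, 0.5 s)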
NASA Astrophysics Data System (ADS)
Gorshkov, A. V.; Prosviryakov, E. Yu.
2017-12-01
The paper considers the construction of analytical solutions to the Oberbeck-Boussinesq system. This system describes layered Bénard-Marangoni convective flows of an incompressible viscous fluid. The third-kind boundary condition, i. e. Newton's heat transfer law, is used on the boundaries of a fluid layer. The obtained solution is analyzed. It is demonstrated that there is a fluid layer thickness with tangential stresses vanishing simultaneously, this being equivalent to the existence of tensile and compressive stresses.
2005-03-01
...as Cooperative Engagement Capability (CEC). The addition of CEC makes it a hub for Battle Group Integration Testing (BGIT) that can replicate Radar and...Link performance characteristics for naval battle groups. ...The closure of the Atlantic Fleet Weapons Training Facility (AFWTF), Vieques Island...different authors and groups over the past ten years. The intention is to analyze this information, combine it where appropriate, present it in one
Dasgupta, Aritra; Lee, Joon-Yong; Wilson, Ryan; Lafrance, Robert A; Cramer, Nick; Cook, Kristin; Payne, Samuel
2017-01-01
Combining interactive visualization with automated analytical methods like statistics and data mining facilitates data-driven discovery. These visual analytic methods are beginning to be instantiated within mixed-initiative systems, where humans and machines collaboratively influence evidence-gathering and decision-making. An open research question, however, is whether domain experts analyzing their data can completely trust the outputs and operations on the machine side. Visualization potentially leads to a transparent analysis process, but do domain experts always trust what they see? To address these questions, we present results from the design and evaluation of a mixed-initiative, visual analytics system for biologists, focusing on the relationships between familiarity with an analysis medium and domain experts' trust. We propose a trust-augmented design of the visual analytics system that explicitly takes into account domain-specific tasks, conventions, and preferences. For evaluating the system, we present the results of a controlled user study with 34 biologists in which we compare the variation in the level of trust across conventional and visual analytic mediums and explore the influence of familiarity and task complexity on trust. We find that despite being unfamiliar with a visual analytic medium, scientists seem to have an average level of trust comparable with that in a conventional analysis medium. In fact, for complex sense-making tasks, we find that the visual analytic system is able to inspire greater trust than other mediums. We summarize the implications of our findings with directions for future research on the trustworthiness of visual analytic systems.
Geographically distributed real-time digital simulations using linear prediction
Liu, Ren; Mohanpurkar, Manish; Panwar, Mayank; ...
2016-07-04
Real-time simulation is a powerful tool for analyzing, planning, and operating modern power systems. For analyzing ever-evolving power systems and understanding complex dynamic and transient interactions, larger real-time computation capabilities are essential. Such facilities are interspersed all over the globe, and to leverage their unique capabilities, geographically distributed real-time co-simulation for analyzing power systems is pursued and presented here. However, the communication latency between different simulator locations may lead to inaccuracy in geographically distributed real-time co-simulations. In this paper, the effect of communication latency on geographically distributed real-time co-simulation is introduced and discussed. In order to reduce the effect of the communication latency, a real-time data predictor, based on linear curve fitting, is developed and integrated into the distributed real-time co-simulation. Two digital real-time simulators are used to perform dynamic and transient co-simulations with communication latency and the predictor. Results demonstrate the effect of the communication latency and the performance of the real-time data predictor in compensating for it.
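The essence of a linear-curve-fit predictor is a least-squares fit over the most recent samples, extrapolated forward by the known latency. The sample spacing, window length, and latency in this sketch are made up for illustration; they are not the settings used in the paper.

    import numpy as np

    def predict_ahead(t, values, latency):
        # Least-squares linear fit over the most recent samples, extrapolated
        # forward by the communication latency to be compensated.
        slope, intercept = np.polyfit(t, values, 1)
        return slope * (t[-1] + latency) + intercept

    # hypothetical exchange: five samples at 1 ms spacing, 3 ms one-way delay
    t = np.array([0.000, 0.001, 0.002, 0.003, 0.004])
    v = np.array([1.00, 1.02, 1.05, 1.07, 1.10])
    print(predict_ahead(t, v, 0.003))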
Geographically distributed real-time digital simulations using linear prediction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Ren; Mohanpurkar, Manish; Panwar, Mayank
Real-time simulation is a powerful tool for analyzing, planning, and operating modern power systems. For analyzing ever-evolving power systems and understanding complex dynamic and transient interactions, larger real-time computation capabilities are essential. Such facilities are interspersed all over the globe, and to leverage their unique capabilities, geographically distributed real-time co-simulation for analyzing power systems is pursued and presented here. However, the communication latency between different simulator locations may lead to inaccuracy in geographically distributed real-time co-simulations. In this paper, the effect of communication latency on geographically distributed real-time co-simulation is introduced and discussed. In order to reduce the effect of the communication latency, a real-time data predictor, based on linear curve fitting, is developed and integrated into the distributed real-time co-simulation. Two digital real-time simulators are used to perform dynamic and transient co-simulations with communication latency and the predictor. Results demonstrate the effect of the communication latency and the performance of the real-time data predictor in compensating for it.
Proteomics-Based Analysis of Protein Complexes in Pluripotent Stem Cells and Cancer Biology.
Sudhir, Putty-Reddy; Chen, Chung-Hsuan
2016-03-22
A protein complex consists of two or more proteins that are linked together through protein-protein interactions. The proteins show stable/transient and direct/indirect interactions within the protein complex or between the protein complexes. Protein complexes are involved in regulation of most of the cellular processes and molecular functions. The delineation of protein complexes is important to expand our knowledge on proteins functional roles in physiological and pathological conditions. The genetic yeast-2-hybrid method has been extensively used to characterize protein-protein interactions. Alternatively, a biochemical-based affinity purification coupled with mass spectrometry (AP-MS) approach has been widely used to characterize the protein complexes. In the AP-MS method, a protein complex of a target protein of interest is purified using a specific antibody or an affinity tag (e.g., DYKDDDDK peptide (FLAG) and polyhistidine (His)) and is subsequently analyzed by means of MS. Tandem affinity purification, a two-step purification system, coupled with MS has been widely used mainly to reduce the contaminants. We review here a general principle for AP-MS-based characterization of protein complexes and we explore several protein complexes identified in pluripotent stem cell biology and cancer biology as examples.
Proteomics-Based Analysis of Protein Complexes in Pluripotent Stem Cells and Cancer Biology
Sudhir, Putty-Reddy; Chen, Chung-Hsuan
2016-01-01
A protein complex consists of two or more proteins that are linked together through protein–protein interactions. The proteins show stable/transient and direct/indirect interactions within the protein complex or between the protein complexes. Protein complexes are involved in regulation of most of the cellular processes and molecular functions. The delineation of protein complexes is important to expand our knowledge on proteins functional roles in physiological and pathological conditions. The genetic yeast-2-hybrid method has been extensively used to characterize protein-protein interactions. Alternatively, a biochemical-based affinity purification coupled with mass spectrometry (AP-MS) approach has been widely used to characterize the protein complexes. In the AP-MS method, a protein complex of a target protein of interest is purified using a specific antibody or an affinity tag (e.g., DYKDDDDK peptide (FLAG) and polyhistidine (His)) and is subsequently analyzed by means of MS. Tandem affinity purification, a two-step purification system, coupled with MS has been widely used mainly to reduce the contaminants. We review here a general principle for AP-MS-based characterization of protein complexes and we explore several protein complexes identified in pluripotent stem cell biology and cancer biology as examples. PMID:27011181
Modal vector estimation for closely spaced frequency modes
NASA Technical Reports Server (NTRS)
Craig, R. R., Jr.; Chung, Y. T.; Blair, M.
1982-01-01
Techniques for obtaining improved modal vector estimates for systems with closely spaced frequency modes are discussed. In describing the dynamical behavior of a complex structure, modal parameters are often analyzed: undamped natural frequency, mode shape, modal mass, modal stiffness and modal damping. From both an analytical standpoint and an experimental standpoint, identification of modal parameters is more difficult if the system has repeated or even closely spaced frequencies. The more complex the structure, the more likely it is to have closely spaced frequencies. This makes it difficult to determine valid mode shapes using single-shaker test methods. By employing band-selectable analysis (zoom) techniques and by employing Kennedy-Pancu circle fitting or some multiple-degree-of-freedom (MDOF) curve-fit procedure, the usefulness of the single-shaker approach can be extended.
NASA Technical Reports Server (NTRS)
Davies, Misty D.; Gundy-Burlet, Karen
2010-01-01
A useful technique for the validation and verification of complex flight systems is Monte Carlo Filtering -- a global sensitivity analysis that tries to find the inputs and ranges that are most likely to lead to a subset of the outputs. A thorough exploration of the parameter space for complex integrated systems may require thousands of experiments and hundreds of controlled and measured variables. Tools for analyzing this space often have limitations caused by the numerical problems associated with high dimensionality and caused by the assumption of independence of all of the dimensions. To combat both of these limitations, we propose a technique that uses a combination of the original variables with the derived variables obtained during a principal component analysis.
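A minimal sketch of the idea described above, under assumed names and data: sample the input space, derive principal-component scores as additional variables, then apply Monte Carlo Filtering by splitting runs into "behavioral" and "non-behavioral" groups on an output criterion and comparing the two input distributions, here with a Kolmogorov-Smirnov statistic. The placeholder `model` function stands in for the actual flight-system simulation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def model(x):
    # Placeholder for the flight-system simulation: any mapping from inputs to an output.
    return x[0] * x[1] + 0.1 * x[2] ** 2 + rng.normal(scale=0.05)

# 1. Monte Carlo sampling of a (possibly correlated) input space
n, d = 5000, 3
X = rng.multivariate_normal(np.zeros(d), [[1, .6, 0], [.6, 1, 0], [0, 0, 1]], size=n)
y = np.array([model(x) for x in X])

# 2. Derived variables: principal-component scores to complement the original inputs
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt.T                       # scores on the principal axes
features = np.hstack([X, Z])        # original + derived variables

# 3. Monte Carlo Filtering: split runs by an output criterion and compare the
#    "behavioral" vs "non-behavioral" distribution of each feature
behavioral = y > np.quantile(y, 0.9)
for j in range(features.shape[1]):
    ks, _ = stats.ks_2samp(features[behavioral, j], features[~behavioral, j])
    print(f"feature {j}: KS separation = {ks:.3f}")
```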
Pratt, M W; Diessner, R; Hunsberger, B; Pancer, S M; Savoy, K
1991-12-01
Four systems for analyzing thinking about 2 personal-life dilemmas, as discussed by 29 men and 35 women (ages 35-85), were compared. Kohlberg's (1976) moral judgment stages, Kegan's (1982) ego-development stages, Gilligan's (1982) moral orientation system, and Suedfeld and Tetlock's (1977) integrative complexity scoring were used. Subjects completed Kohlberg's (Colby & Kohlberg, 1987) standard moral judgment measure, a self-concept description, and several questionnaires. The Kohlberg, Kegan, and integrative complexity codings of the dilemmas were positively related to each other and to the standard Kohlberg moral stage scores. There were no age-group differences and few gender differences on the measures. However, education, role-taking skills, and greater sensitivity to age changes in the self positively predicted higher stage scores across maturity.
NASA Astrophysics Data System (ADS)
Alexandridis, Konstantinos T.
This dissertation adopts a holistic and detailed approach to modeling spatially explicit agent-based artificial intelligent systems, using the Multi Agent-based Behavioral Economic Landscape (MABEL) model. The research questions it addresses stem from the need to understand and analyze the real-world patterns and dynamics of land use change from a coupled human-environmental systems perspective. It describes the systemic, mathematical, statistical, socio-economic and spatial dynamics of the MABEL modeling framework, and provides a wide array of cross-disciplinary modeling applications within the research, decision-making and policy domains. It establishes the symbolic properties of the MABEL model as a Markov decision process, analyzes the decision-theoretic utility and optimization attributes of agents toward deriving statistically and spatially optimal policies and actions, and explores the probabilistic character of the agents' decision-making and inference mechanisms via Bayesian belief and decision networks. It develops and describes a Monte Carlo methodology for experimental replications of agents' decisions regarding complex spatial parcel acquisition and learning. Recognizing the gap in spatially explicit accuracy assessment techniques for complex spatial models, it proposes an ensemble of statistical tools designed to address this problem, including advanced information assessment techniques such as the receiver operating characteristic curve, the impurity entropy and Gini functions, and Bayesian classification functions. The theoretical foundation for modular Bayesian inference in spatially explicit multi-agent artificial intelligent systems, and the ensembles of cognitive and scenario assessment modular tools built for the MABEL model, are provided. The dissertation emphasizes modularity and robustness as valuable qualitative modeling attributes and examines the role of robust intelligent modeling as a tool for improving policy decisions related to land use change. Finally, the major contributions to the science are presented along with directions for future research.
Schoeman, Elizna M; Lopez, Genghis H; McGowan, Eunike C; Millard, Glenda M; O'Brien, Helen; Roulis, Eileen V; Liew, Yew-Wah; Martin, Jacqueline R; McGrath, Kelli A; Powley, Tanya; Flower, Robert L; Hyland, Catherine A
2017-04-01
Blood group single nucleotide polymorphism genotyping probes for only a limited range of polymorphisms. This study investigated whether massively parallel sequencing (also known as next-generation sequencing), with a targeted exome strategy, provides an extended blood group genotype, and the extent to which massively parallel sequencing correctly genotypes homologous gene systems such as RH and MNS. Donor samples (n = 28) that had been extensively phenotyped and genotyped using single nucleotide polymorphism typing were analyzed using the TruSight One Sequencing Panel and MiSeq platform. Genes for 28 protein-based blood group systems, GATA1, and KLF1 were analyzed. Copy number variation analysis was used to characterize complex structural variants in the GYPC and RH systems. The average sequencing depth per target region was 66.2 ± 39.8. Each sample harbored on average 43 ± 9 variants, of which 10 ± 3 were used for genotyping. For the 28 samples, massively parallel sequencing variant sequences correctly matched the sequences expected from single nucleotide polymorphism genotyping data. Copy number variation analysis defined the Rh C/c alleles and complex RHD hybrids. Hybrid RHD*D-CE-D variants were correctly identified, but copy number variation analysis did not confidently distinguish between D and CE exon deletion versus rearrangement. The targeted exome sequencing strategy extended the range of blood group genotypes detected compared with single nucleotide polymorphism typing. This single-test format included detection of complex MNS hybrid cases and, with copy number variation analysis, defined RH hybrid genes along with the RHCE*C allele, hitherto difficult to resolve by variant detection alone. The approach is economical compared with whole-genome sequencing and is suitable for a red blood cell reference laboratory setting. © 2017 AABB.
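The copy number variation analysis referred to above can be pictured, in highly simplified form, as comparing normalized per-exon read depth between a sample and a reference. The sketch below uses invented coverage numbers and thresholds; it is not the pipeline used in the study.

```python
import numpy as np

def call_copy_number(sample_depth, reference_depth, del_cut=0.65, dup_cut=1.35):
    """Toy per-exon copy-number caller from read-depth ratios.

    sample_depth, reference_depth: arrays of mean coverage per target region,
    e.g. per RHD/RHCE or GYPC exon. Ratios near 0.5 suggest loss of one copy
    (deletion or hybrid replacement); ratios near 1.5 suggest an extra copy.
    """
    # Normalize for overall library size before comparing regions
    ratio = (sample_depth / sample_depth.mean()) / (reference_depth / reference_depth.mean())
    calls = np.full(ratio.shape, "normal", dtype=object)
    calls[ratio < del_cut] = "loss"
    calls[ratio > dup_cut] = "gain"
    return ratio, calls

# Invented example: exons 4-6 show roughly half coverage, consistent with a hybrid allele
sample = np.array([60, 62, 58, 31, 29, 33, 61, 59, 63, 60], float)
reference = np.array([61, 60, 59, 60, 62, 61, 60, 58, 62, 61], float)
ratio, calls = call_copy_number(sample, reference)
for i, (r, c) in enumerate(zip(ratio, calls), start=1):
    print(f"exon {i:2d}: ratio={r:.2f} -> {c}")
```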
NASA Astrophysics Data System (ADS)
Domínguez Cerdeña, Itahiza; Villasante-Marcos, Victor; Meletlidis, Stavros; Sainz-Maza, Sergio; Abella, Rafael; Torres, Pedro A.; Sánchez, Nieves; Luengo-Oroz, Natividad; José Blanco, María; García-Cañada, Laura; Pereda de Pablo, Jorge; Lamolda, Héctor; Moure, David; Del Fresno, Carmen; Finizola, Anthony; Felepto, Alicia
2017-04-01
The Teide-Pico Viejo complex represents one of the major natural volcanic hazards in the Canary Islands, owing to the types of eruptions expected in the area and the large population of Tenerife Island. It is therefore necessary to have a volcanic alert system able to provide a precise assessment of the current state of the complex, which in turn requires knowledge of the signals expected at each level of volcanic activity. Moreover, external effects that can influence the measurements must be distinguished; external influences such as the atmosphere are qualitatively known but have not yet been quantified. The objective of the project is to collect, analyze and jointly and continuously evaluate over time geophysical, geodetic, geochemical and meteorological data from the Teide-Pico Viejo complex and its surroundings. A continuous multiparametric network has been deployed in the area, which, together with the data provided by the Volcano Monitoring Network of the Instituto Geográfico Nacional (IGN) and data from other institutions, will provide a comprehensive data set with high resolution in both space and time. This multiparametric network includes a seismic array, two self-potential lines for continuous measurements, five magnetometers and two weather stations. The network will be complemented with the 8 CGPS stations, one tiltmeter, 10 seismic stations, and four thermometric stations on the fumaroles of Teide volcano that IGN already manages in Tenerife. The data will be completed with the results of repeated surveys of self-potential, soil temperature and diffuse CO2 flux in several pre-established areas on top of Teide throughout the entire duration of the project. During the project, new computational tools will be developed to study the correlation between the different parameters analyzed. The results will characterize the possible seasonal fluctuations of each parameter and the variations related to meteorological phenomena. In addition, they will allow identifying the response of all the analyzed parameters to specific events that are traditionally studied with a single technique, such as short episodes of tremor (sporadically registered in the Teide-Pico Viejo surroundings) or changes in activity of the hydrothermal system of the volcanic complex. We present here the first multiparametric results from the project, including locations obtained with the seismic array, CO2 and temperature maps of the Teide fumarole zones, and magnetometric measurements.
Automated Vectorization of Decision-Based Algorithms
NASA Technical Reports Server (NTRS)
James, Mark
2006-01-01
Virtually all existing vectorization algorithms are designed to analyze only the numeric properties of an algorithm and distribute those elements across multiple processors. The software described here advances the state of the practice because, at the time of this reporting, it is the only known system that takes high-level statements, analyzes them for their decision properties, and converts them to a form that allows them to be executed automatically in parallel. The software takes a high-level source program that describes a complex decision-based condition and rewrites it as a disjunctive set of component Boolean relations that can then be executed in parallel. This is important because parallel architectures are becoming more commonplace in conventional systems and have always been present in NASA flight systems. This technology allows one to take existing condition-based code and automatically vectorize it so that it naturally decomposes across parallel architectures.
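As a rough illustration of the rewrite described above (not NASA's actual tool), the sketch below uses sympy to convert a nested decision condition into disjunctive normal form and then evaluates the independent disjuncts concurrently; the symbols and thread-pool evaluation are illustrative assumptions.

```python
from concurrent.futures import ThreadPoolExecutor
from sympy import symbols
from sympy.logic.boolalg import Or, to_dnf

a, b, c, d = symbols("a b c d")

# A nested, decision-based condition as it might appear in source code:
#   if a and (b or c): act()
#   elif (not a) and d and c: act()
nested = (a & (b | c)) | (~a & d & c)

# Rewrite as a disjunction of component Boolean relations; each disjunct is
# independent of the others, so they can be evaluated on separate workers.
dnf = to_dnf(nested, simplify=True)
terms = dnf.args if isinstance(dnf, Or) else (dnf,)

assignment = {a: True, b: False, c: True, d: False}

def eval_term(term):
    return bool(term.subs(assignment))

with ThreadPoolExecutor() as pool:
    results = list(pool.map(eval_term, terms))

print("disjuncts:", terms)
print("condition holds:", any(results))
```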
Lewis, Bryan; Swarup, Samarth; Bisset, Keith; Eubank, Stephen; Marathe, Madhav; Barrett, Chris
2013-01-01
Disasters affect a society at many levels. Simulation-based studies often evaluate the effectiveness of one or two response policies in isolation and are unable to represent how the impact of one policy coevolves with the others. Similarly, most in-depth analyses are based on a static assessment of the “aftermath” rather than capturing the dynamics. We have developed a data-centric simulation environment for applying a systems approach to the dynamic analysis of complex combinations of disaster responses. Using this environment, we analyze an improvised nuclear detonation in Washington DC. The simulated blast affects the transportation system, communications infrastructure, electrical power system, behaviors and motivations of the population, and health status of survivors. The effectiveness of partially restoring wireless communications capacity is analyzed in concert with a range of other disaster response policies. Although the restored capacity provided only a limited increase in cell phone communication, overall health outcomes improved. PMID:23903394
NASA Technical Reports Server (NTRS)
Parnell, Gregory S.; Rowell, William F.; Valusek, John R.
1987-01-01
In recent years there has been increasing interest in applying the computer-based problem-solving techniques of Artificial Intelligence (AI), Operations Research (OR), and Decision Support Systems (DSS) to analyze extremely complex problems. A conceptual framework is developed for successfully integrating these three techniques. First, the fields of AI, OR, and DSS are defined and the relationships among the three fields are explored. Next, a comprehensive adaptive design methodology for AI and OR modeling within the context of a DSS is described. The following observations are made: (1) the solution of extremely complex knowledge problems with ill-defined, changing requirements can benefit greatly from the use of the adaptive design process; (2) the field of DSS provides the focus on the decision-making process that is essential for tailoring solutions to these complex problems; (3) the characteristics of AI, OR, and DSS tools appear to be converging rapidly; and (4) there is a growing need for an interdisciplinary AI/OR/DSS education.
Spectral statistics and scattering resonances of complex primes arrays
NASA Astrophysics Data System (ADS)
Wang, Ren; Pinheiro, Felipe A.; Dal Negro, Luca
2018-01-01
We introduce a class of aperiodic arrays of electric dipoles generated from the distribution of prime numbers in complex quadratic fields (Eisenstein and Gaussian primes) as well as quaternion primes (Hurwitz and Lipschitz primes), and study the nature of their scattering resonances using the vectorial Green's matrix method. In these systems we demonstrate several distinctive spectral properties, such as the absence of level repulsion in the strongly scattering regime, critical statistics of level spacings, and the existence of critical modes, which are extended fractal modes with long lifetimes not supported by either random or periodic systems. Moreover, we show that one can predict important physical properties, such as the existence of spectral gaps, by analyzing the eigenvalue distribution of the Green's matrix of the arrays in the complex plane. Our results unveil the importance of aperiodic correlations in prime number arrays for the engineering of gapped photonic media that support far richer mode localization and spectral properties than the usual periodic and random media.
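For readers unfamiliar with the Green's matrix method, the following sketch builds the scalar (not vectorial) Green's matrix for an arbitrary set of point scatterers and inspects its complex eigenvalues. The scalar form, the random positions, and the wavenumber are simplifying assumptions made only to illustrate the kind of eigenvalue analysis described above.

```python
import numpy as np

def scalar_green_matrix(points, k=2 * np.pi):
    """Scalar approximation of the Green's matrix for N point scatterers:
    G[i, j] = exp(1j * k * r_ij) / (k * r_ij) for i != j, zero on the diagonal.
    The study above uses the full vectorial (dyadic) form; this scalar version
    only illustrates the style of eigenvalue analysis."""
    diff = points[:, None, :] - points[None, :, :]
    r = np.linalg.norm(diff, axis=-1)
    with np.errstate(divide="ignore", invalid="ignore"):
        G = np.exp(1j * k * r) / (k * r)
    np.fill_diagonal(G, 0.0)                 # remove the self-interaction term
    return G

rng = np.random.default_rng(1)
pts = rng.uniform(0.0, 5.0, size=(300, 2))   # stand-in positions, not a prime array
G = scalar_green_matrix(pts)
eig = np.linalg.eigvals(G)

# Gaps and clustering of the eigenvalues in the complex plane hint at spectral
# gaps and at the lifetimes of the corresponding scattering resonances.
print("Re(eig) range:", eig.real.min(), eig.real.max())
print("Im(eig) range:", eig.imag.min(), eig.imag.max())
```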
Opinion diversity and community formation in adaptive networks
NASA Astrophysics Data System (ADS)
Yu, Y.; Xiao, G.; Li, G.; Tay, W. P.; Teoh, H. F.
2017-10-01
It is of significant interest to investigate how network structures co-evolve with opinions. In this article, we show that a simple model integrating consensus formation, link rewiring, and opinion change allows complex system dynamics to emerge, driving the system into a dynamic equilibrium with the co-existence of diversified opinions. Specifically, similar opinion holders may form communities yet with no strict community consensus; and rather than being separated into disconnected communities, different communities are connected by a non-trivial proportion of inter-community links. More importantly, we show that the complex dynamics may lead to different numbers of communities at steady state for a given tolerance between different opinion holders. We construct a framework for theoretically analyzing the co-evolution process. Theoretical analysis and extensive simulation results reveal useful insights into the complex co-evolution process, including the formation of the dynamic equilibrium, the transition between steady states with different numbers of communities, and the interplay between opinion distribution and network modularity.
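A toy version of such a co-evolution model, with assumed update rules (bounded-confidence opinion averaging plus rewiring of discordant links), might look like the following; it illustrates the general mechanism rather than the authors' specific model.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
N, steps, tol, mu, p_rewire = 200, 20000, 0.2, 0.3, 0.5

G = nx.erdos_renyi_graph(N, 0.05, seed=0)
opinion = rng.uniform(0, 1, N)

for _ in range(steps):
    i = rng.integers(N)
    neighbors = list(G.neighbors(i))
    if not neighbors:
        continue
    j = neighbors[rng.integers(len(neighbors))]
    if abs(opinion[i] - opinion[j]) < tol:
        # consensus formation: opinions within tolerance move toward each other
        shift = mu * (opinion[j] - opinion[i])
        opinion[i] += shift
        opinion[j] -= shift
    elif rng.random() < p_rewire:
        # link rewiring: drop the discordant tie, attach to a like-minded agent
        candidates = np.where(np.abs(opinion - opinion[i]) < tol)[0]
        candidates = [k for k in candidates if k != i and not G.has_edge(i, k)]
        if candidates:
            G.remove_edge(i, j)
            G.add_edge(i, rng.choice(candidates))

# Crude summary: number of coarse opinion clusters and of network components
print("distinct opinion clusters:", len(np.unique(np.round(opinion, 1))))
print("connected components:", nx.number_connected_components(G))
```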
Chaotic structure of oil prices
NASA Astrophysics Data System (ADS)
Bildirici, Melike; Sonustun, Fulya Ozaksoy
2018-01-01
The fluctuations in oil prices are very complicated, and it is therefore difficult to predict their effects on economies. For modelling the complex system of oil prices, linear economic models are neither sufficient nor efficient tools. Thus, in recent years, economists have paid close attention to the non-linear structure of oil prices. Some papers have analyzed this relationship using GARCH-type models. Distinctively from those papers, in this study we aim to analyze the chaotic pattern of oil prices. To this end, Lyapunov exponents and the Hénon map are used to determine the chaotic behavior of oil prices for the selected time period.
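As an illustration of the tools named above, the sketch below computes the largest Lyapunov exponent of the Hénon map by propagating a tangent vector through the map's Jacobian with renormalization; a positive value indicates chaos. The oil-price series itself is not reproduced here, so this is only a demonstration of the method on the canonical map.

```python
import numpy as np

def henon(x, y, a=1.4, b=0.3):
    return 1.0 - a * x * x + y, b * x

def largest_lyapunov(n_iter=100_000, a=1.4, b=0.3):
    x, y = 0.1, 0.1
    v = np.array([1.0, 0.0])       # tangent vector
    total = 0.0
    for _ in range(n_iter):
        # Jacobian of the Henon map at the current point (x, y)
        J = np.array([[-2.0 * a * x, 1.0],
                      [b,            0.0]])
        x, y = henon(x, y, a, b)
        v = J @ v
        norm = np.linalg.norm(v)
        total += np.log(norm)
        v /= norm                  # renormalize to avoid overflow
    return total / n_iter

print(f"largest Lyapunov exponent ~ {largest_lyapunov():.3f}  (positive => chaos)")
```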
System and method for modeling and analyzing complex scenarios
Shevitz, Daniel Wolf
2013-04-09
An embodiment of the present invention includes a method for analyzing and solving a possibility tree. A possibility tree having a plurality of programmable nodes is constructed and solved with a solver module executed by a processor element. The solver module executes the programming of the nodes and tracks the state of at least one variable through a branch. When a variable of the branch is out of tolerance with a parameter, the solver disables the remaining nodes of the branch and marks the branch as an invalid solution. The valid solutions are then aggregated and displayed as valid tree solutions.
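A minimal sketch of the branch-pruning idea in this abstract, with invented node and variable names (not the patented implementation): each programmable node updates a tracked state, branches whose state falls out of tolerance are abandoned, and the surviving leaf states are aggregated as valid solutions.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional

@dataclass
class Node:
    name: str
    program: Callable[[dict], dict]          # node programming: updates the tracked state
    children: List["Node"] = field(default_factory=list)

def solve(node: Node, state: dict, tolerance: Callable[[dict], bool],
          path=(), valid: Optional[list] = None) -> list:
    """Depth-first evaluation of a toy possibility tree: if the tracked state
    violates the tolerance check, the remaining nodes of that branch are never
    executed and the branch is treated as an invalid solution."""
    if valid is None:
        valid = []
    state = node.program(dict(state))
    path = path + (node.name,)
    if not tolerance(state):
        return valid                          # disable the rest of this branch
    if not node.children:
        valid.append((path, state))           # aggregate valid leaf solutions
        return valid
    for child in node.children:
        solve(child, state, tolerance, path, valid)
    return valid

# Track a single variable "x" through two alternative branches.
root = Node("root", lambda s: {**s, "x": 1.0}, [
    Node("double", lambda s: {**s, "x": s["x"] * 2}, [
        Node("add5", lambda s: {**s, "x": s["x"] + 5}),
    ]),
    Node("times100", lambda s: {**s, "x": s["x"] * 100}),  # goes out of tolerance
])

solutions = solve(root, {}, tolerance=lambda s: s["x"] <= 10)
print(solutions)   # only the branch that stays within tolerance survives
```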
Visual Data Mining: An Exploratory Approach to Analyzing Temporal Patterns of Eye Movements
ERIC Educational Resources Information Center
Yu, Chen; Yurovsky, Daniel; Xu, Tian
2012-01-01
Infant eye movements are an important behavioral resource to understand early human development and learning. But the complexity and amount of gaze data recorded from state-of-the-art eye-tracking systems also pose a challenge: how does one make sense of such dense data? Toward this goal, this article describes an interactive approach based on…
Institute for Brain and Neural Systems
2009-10-06
to deal with computational complexity when analyzing large amounts of information in visual scenes. It seems natural that in addition to exploring...algorithms using methods from statistical pattern recognition and machine learning. Over the last fifteen years, significant advances have been made in...recognition, robustness to noise, and the ability to cope with significant variations in lighting conditions. Identifying an occluded target adds another layer of
ERIC Educational Resources Information Center
Rouse, William B.; Johnson, William B.
A methodological framework is presented for representing tradeoffs among alternative combinations of training and aiding for personnel in complex situations. In general, more highly trained people need less aid, and those with less training need more aid. Balancing training and aiding to accomplish the objectives of the system in a cost effective…
ERIC Educational Resources Information Center
Koberg, Don; Bagnall, Jim
This publication provides an organizational scheme for a creative problem solving process. The authors indicate that all problems can benefit from the same logical and orderly process now employed to solve many complex problems. The principles remain constant; only specific methods change. Chapter 1 analyzes the development of creativity and fear…
Coordinated crew performance in commercial aircraft operations
NASA Technical Reports Server (NTRS)
Murphy, M. R.
1977-01-01
A specific methodology is proposed for an improved system of coding and analyzing crew member interaction. The complexity and lack of precision of many crew and task variables suggest the usefulness of fuzzy linguistic techniques for modeling and computer simulation of the crew performance process. Other research methodologies and concepts that have promise for increasing the effectiveness of research on crew performance are identified.
Epidemics in Complex Networks: The Diversity of Hubs
NASA Astrophysics Data System (ADS)
Kitsak, Maksim; Gallos, Lazaros K.; Havlin, Shlomo; Stanley, H. Eugene; Makse, Hernan A.
2009-03-01
Many complex systems are believed to be vulnerable to the spread of viruses and information owing to their high level of interconnectivity. Even viruses of low contagiousness proliferate easily across the Internet, and rumors, fads, and innovative ideas spread efficiently in various social systems. Another commonly accepted standpoint is the importance of the most connected elements (hubs) in these spreading processes. We address the following questions. Do all hubs conduct epidemics in the same manner? How does epidemic spreading depend on the structure of the network? What is the most efficient way to spread information over the system? We analyze several large-scale systems in the framework of the susceptible/infective/removed (SIR) disease spread model, which can also be mapped to the problem of rumor or fad spreading. We show that hubs are often ineffective in the transmission of viruses or information owing to the highly heterogeneous topology of most networks. We also propose a new tool to evaluate the efficiency of nodes in spreading viruses or information.
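A small simulation along the lines sketched above, with assumed parameters: a discrete-time SIR process on a synthetic scale-free network, comparing the final outbreak size when seeding the epidemic at the highest-degree hub versus at a randomly chosen node.

```python
import random
import networkx as nx

def sir_outbreak(G, seed, beta=0.05, rng=None):
    """Discrete-time SIR: each infected node infects each susceptible
    neighbor with probability beta per step, then recovers."""
    rng = rng or random.Random(0)
    infected, recovered = {seed}, set()
    while infected:
        new_infected = set()
        for u in infected:
            for v in G.neighbors(u):
                if v not in infected and v not in recovered and rng.random() < beta:
                    new_infected.add(v)
        recovered |= infected
        infected = new_infected
    return len(recovered)

G = nx.barabasi_albert_graph(5000, 2, seed=1)
hub = max(G.degree, key=lambda kv: kv[1])[0]
random_node = random.Random(2).choice(list(G.nodes))

print("outbreak size from hub:        ", sir_outbreak(G, hub))
print("outbreak size from random node:", sir_outbreak(G, random_node))
```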
X-ray absorption near-edge spectroscopy in bioinorganic chemistry: Application to M–O2 systems
Sarangi, Ritimukta
2012-01-01
Metal K-edge X-ray absorption spectroscopy (XAS) has been extensively applied to bioinorganic chemistry to obtain geometric structure information on metalloprotein and biomimetic model complex active sites by analyzing the higher energy extended X-ray absorption fine structure (EXAFS) region of the spectrum. In recent years, focus has been on developing methodologies to interpret the lower energy K-pre-edge and rising-edge regions (XANES) and using it for electronic structure determination in complex bioinorganic systems. In this review, the evolution and progress of 3d-transition metal K-pre-edge and rising-edge methodology development is presented with particular focus on applications to bioinorganic systems. Applications to biomimetic transition metal–O2 intermediates (M = Fe, Co, Ni and Cu) are reviewed, which demonstrate the power of the method as an electronic structure determination technique and its impact in understanding the role of supporting ligands in tuning the electronic configuration of transition metal–O2 systems. PMID:23525635
Evaluation of tocopherol recovery through simulation of molecular distillation process.
Moraes, E B; Batistella, C B; Alvarez, M E Torres; Filho, Rubens Maciel; Maciel, M R Wolf
2004-01-01
The DISMOL simulator was used to determine the best possible operating conditions to guide future experimental work. The simulator requires several physical-chemical properties that are often very difficult to determine because of the complexity of the components involved; they must be obtained through correlations and/or predictions in order to characterize the system and perform the calculations. The first step is to obtain simulation results for a system that can later be validated with experimental data. Implementing the necessary parameters of complex systems in the simulator is a difficult task. In this work, we aimed to determine these properties in order to evaluate tocopherol (vitamin E) recovery using the DISMOL simulator. The raw material used was the crude deodorizer distillate of soya oil. With this procedure, it is possible to determine the best operating conditions for experimental work and to evaluate the process in the separation of new systems by analyzing the profiles obtained from these simulations for the falling-film molecular distillator.
NASA Technical Reports Server (NTRS)
Chung, Ming-Ying; Ciardo, Gianfranco; Siminiceanu, Radu I.
2007-01-01
The Saturation algorithm for symbolic state-space generation has been a recent breakthrough in the exhaustive verification of complex systems, in particular globally-asynchronous/locally-synchronous systems. The algorithm uses a very compact Multiway Decision Diagram (MDD) encoding for states and the fastest symbolic exploration algorithm to date. The distributed version of Saturation uses the overall memory available on a network of workstations (NOW) to efficiently spread the memory load during the highly irregular exploration. A crucial factor in limiting memory consumption during symbolic state-space generation is the ability to perform garbage collection to free the memory occupied by dead nodes. However, garbage collection over a NOW requires a nontrivial communication overhead. In addition, operation cache policies become critical when analyzing large-scale systems with the symbolic approach. In this technical report, we develop a garbage collection scheme and several operation cache policies to help solve extremely complex systems. Experiments show that our schemes improve the performance of the original distributed implementation, SmArTNow, in terms of time and memory efficiency.
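One toy example of an operation cache policy of the kind discussed above (not the SmArTNow implementation): a size-bounded, least-recently-used memoization table for decision-diagram operations, with a purge step that drops entries whose operands have been reclaimed by garbage collection. Node identifiers are plain integers here purely for illustration.

```python
from collections import OrderedDict

class OperationCache:
    """Size-bounded memoization cache for decision-diagram operations.

    Entries are keyed by (operation name, operand node ids). When the cache is
    full, the least recently used entry is evicted; purge() drops entries that
    reference nodes freed by garbage collection, mimicking the interaction
    between GC and cache policy discussed above."""

    def __init__(self, max_entries=1024):
        self.max_entries = max_entries
        self.table = OrderedDict()

    def lookup(self, op, a, b):
        key = (op, a, b)
        if key in self.table:
            self.table.move_to_end(key)       # LRU bookkeeping
            return self.table[key]
        return None

    def insert(self, op, a, b, result):
        self.table[(op, a, b)] = result
        self.table.move_to_end((op, a, b))
        if len(self.table) > self.max_entries:
            self.table.popitem(last=False)    # evict the least recently used entry

    def purge(self, live_nodes):
        """Drop entries whose operands or result were garbage-collected."""
        self.table = OrderedDict(
            (k, v) for k, v in self.table.items()
            if k[1] in live_nodes and k[2] in live_nodes and v in live_nodes)

# Tiny usage example with integer node ids standing in for MDD nodes
cache = OperationCache(max_entries=2)
cache.insert("union", 1, 2, 3)
cache.insert("union", 4, 5, 6)
cache.insert("union", 7, 8, 9)               # evicts the (1, 2) entry
print(cache.lookup("union", 1, 2))           # None: evicted by the LRU policy
cache.purge(live_nodes={4, 5, 6})
print(cache.lookup("union", 7, 8))           # None: its operands were collected
```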
Weighted complex network analysis of the Beijing subway system: Train and passenger flows
NASA Astrophysics Data System (ADS)
Feng, Jia; Li, Xiamiao; Mao, Baohua; Xu, Qi; Bai, Yun
2017-05-01
In recent years, complex network theory has become an important approach to the study of the structure and dynamics of traffic networks. However, because traffic data is difficult to collect, previous studies have usually focused on the physical topology of subway systems, whereas few studies have considered the characteristics of traffic flows through the network. Therefore, in this paper, we present a multi-layer model to analyze traffic flow patterns in subway networks, based on trip data and an operation timetable obtained from the Beijing Subway System. We characterize the patterns in terms of the spatiotemporal flow size distributions of both the train flow network and the passenger flow network. In addition, we describe the essential interactions between these two networks based on statistical analyses. The results of this study suggest that layered models of transportation systems can elucidate fundamental differences between the coexisting traffic flows and can also clarify the mechanism that causes these differences.
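A minimal sketch of the passenger-flow layer described above, with invented station names and volumes: trip records are aggregated into a weighted directed graph, and weighted node strength is computed to show where flows concentrate, information a purely topological analysis would miss.

```python
import networkx as nx

# Illustrative trip records: (origin station, destination station, passengers)
trips = [
    ("Xizhimen", "Guomao", 1200),
    ("Guomao", "Xizhimen", 950),
    ("Xizhimen", "Haidianhuangzhuang", 800),
    ("Haidianhuangzhuang", "Guomao", 400),
]

# Passenger-flow layer: edge weights are aggregated origin-destination volumes
passenger_net = nx.DiGraph()
for origin, destination, volume in trips:
    if passenger_net.has_edge(origin, destination):
        passenger_net[origin][destination]["weight"] += volume
    else:
        passenger_net.add_edge(origin, destination, weight=volume)

# Node strength (weighted out-degree) highlights where passenger flows concentrate
strength = dict(passenger_net.out_degree(weight="weight"))
for station, s in sorted(strength.items(), key=lambda kv: -kv[1]):
    print(f"{station:22s} outgoing passenger flow = {s}")
```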