NASA Astrophysics Data System (ADS)
Zhang, Daili
Increasing societal demand for automation has led to considerable efforts to control large-scale complex systems, especially in the area of autonomous intelligent control methods. The control system of a large-scale complex system needs to satisfy four system-level requirements: robustness, flexibility, reusability, and scalability. Corresponding to these four system-level requirements, four major challenges arise. First, it is difficult to get accurate and complete information. Second, the system may be physically highly distributed. Third, the system evolves very quickly. Fourth, emergent global behaviors of the system can be caused by small disturbances at the component level. The Multi-Agent Based Control (MABC) method, as an implementation of distributed intelligent control, has been the focus of research since the 1970s, in an effort to solve the above-mentioned problems in controlling large-scale complex systems. However, to the author's best knowledge, all MABC systems for large-scale complex systems with significant uncertainties are problem-specific and thus difficult to extend to other domains or larger systems. This situation is partly due to the control architecture of multiple agents being determined by agent-to-agent coupling and interaction mechanisms. Therefore, the research objective of this dissertation is to develop a comprehensive, generalized framework for the control system design of general large-scale complex systems with significant uncertainties, with a focus on distributed control architecture design and distributed inference engine design. A Hybrid Multi-Agent Based Control (HyMABC) architecture is proposed by combining hierarchical control architecture and module control architecture with logical replication rings.
First, it decomposes a complex system hierarchically; second, it combines the components in the same level as a module, and then designs common interfaces for all of the components in the same module; third, replications are made for critical agents and are organized into logical rings. This architecture maintains clear guidelines for complexity decomposition and also increases the robustness of the whole system. Multiple Sectioned Dynamic Bayesian Networks (MSDBNs), as a distributed dynamic probabilistic inference engine, can be embedded into the control architecture to handle uncertainties of general large-scale complex systems. MSDBNs decompose a large knowledge-based system into many agents. Each agent holds its partial perspective of a large problem domain by representing its knowledge as a Dynamic Bayesian Network (DBN). Each agent accesses local evidence from its corresponding local sensors and communicates with other agents through finite message passing. If the distributed agents can be organized into a tree structure satisfying the running intersection property and d-sep set requirements, globally consistent inferences are achievable in a distributed way. By using different frequencies for local DBN agent belief updating and global system belief updating, MSDBNs balance communication cost against the global consistency of inferences. In this dissertation, a fully factorized Boyen-Koller (BK) approximation algorithm is used for local DBN agent belief updating, and the static Junction Forest Linkage Tree (JFLT) algorithm is used for global system belief updating. MSDBNs assume a static structure and a stable communication network for the whole system. However, for a real system, sub-Bayesian networks as nodes could be lost, and the communication network could be shut down due to partial damage in the system.
Therefore, on-line and automatic MSDBNs structure formation is necessary for making robust state estimations and increasing survivability of the whole system. A Distributed Spanning Tree Optimization (DSTO) algorithm, a Distributed D-Sep Set Satisfaction (DDSSS) algorithm, and a Distributed Running Intersection Satisfaction (DRIS) algorithm are proposed in this dissertation. Combining these three distributed algorithms with a Distributed Belief Propagation (DBP) algorithm in MSDBNs makes state estimations robust to partial damage in the whole system. Combining the distributed control architecture design and the distributed inference engine design leads to a process of control system design for a general large-scale complex system. As applications of the proposed methodology, the control system designs of a simplified ship chilled water system and of a notional ship chilled water system have been demonstrated step by step. Simulation results not only show that the proposed methodology gives a clear guideline for control system design for general large-scale complex systems in dynamic and uncertain environments, but also indicate that the combination of MSDBNs and HyMABC can provide excellent performance for controlling general large-scale complex systems.
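The fully factorized Boyen-Koller update named in the abstract can be sketched for a toy two-variable binary DBN. The CPT values below are illustrative assumptions, not the dissertation's networks: each agent keeps only per-variable marginals, does one exact prediction step, then projects the joint back onto the factored form.

```python
import itertools

# Hypothetical transition CPTs for a 2-variable binary DBN:
# T_i[(x1, x2)] = P(X_i' = 1 | X1 = x1, X2 = x2)  (made-up numbers)
T1 = {(0, 0): 0.1, (0, 1): 0.4, (1, 0): 0.6, (1, 1): 0.9}
T2 = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.6, (1, 1): 0.8}

def bk_predict(b1, b2):
    """One fully factorized BK step; b_i = P(X_i = 1) under the factored belief."""
    nb1 = nb2 = 0.0
    for x1, x2 in itertools.product((0, 1), repeat=2):
        # joint probability under the product-of-marginals approximation
        p = (b1 if x1 else 1 - b1) * (b2 if x2 else 1 - b2)
        nb1 += p * T1[(x1, x2)]
        nb2 += p * T2[(x1, x2)]
    # projection: discard correlations, keep only the new marginals
    return nb1, nb2

b1, b2 = 0.5, 0.5
for _ in range(100):          # iterate toward the approximate stationary belief
    b1, b2 = bk_predict(b1, b2)
print(f"stationary factored belief: P(X1=1)={b1:.3f}, P(X2=1)={b2:.3f}")
```

In a full MSDBN setting each agent would also fold in local evidence before the projection; the sketch shows only the prediction-projection cycle that makes BK's cost linear in the number of variables.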
Complexity Leadership: A Theoretical Perspective
ERIC Educational Resources Information Center
Baltaci, Ali; Balci, Ali
2017-01-01
Complex systems are social networks composed of interactive employees interconnected through collaborative, dynamic ties such as shared goals, perspectives and needs. Complex systems are largely based on "the complex system theory". The complex system theory focuses mainly on finding out and developing strategies and behaviours that…
Improving Systems Engineering Effectiveness in Rapid Response Development Environments
2012-06-02
environments where large, complex, brownfield systems of systems are evolved through parallel development of new capabilities in response to external, time... Systems engineering is often ineffective in development environments where large, complex, brownfield systems of systems are... IEEE Press, Hoboken, NJ, 2008 [18] Boehm, B.: Applying the Incremental Commitment Model to Brownfield Systems Development, Proceedings, CSER 2009
An Event-driven, Value-based, Pull Systems Engineering Scheduling Approach
2012-03-01
engineering in rapid response environments has been difficult, particularly those where large, complex brownfield systems or systems of systems exist and...where large, complex brownfield systems or systems of systems exist and are constantly being updated with both short and long term software enhancements...2004. [13] B. Boehm, “Applying the Incremental Commitment Model to Brownfield System Development,” Proceedings, CSER, 2009. [14] A. Borshchev and A
Exploring Complex Systems Aspects of Blackout Risk and Mitigation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Newman, David E; Carreras, Benjamin A; Lynch, Vickie E
2011-01-01
Electric power transmission systems are a key infrastructure, and blackouts of these systems have major consequences for the economy and national security. Analyses of blackout data suggest that blackout size distributions have a power law form over much of their range. This result is an indication that blackouts behave as a complex dynamical system. We use a simulation of an upgrading power transmission system to investigate how these complex system dynamics impact the assessment and mitigation of blackout risk. The mitigation of failures in complex systems needs to be approached with care. The mitigation efforts can move the system to a new dynamic equilibrium while remaining near criticality and preserving the power law region. Thus, while the absolute frequency of blackouts of all sizes may be reduced, the underlying forces can still cause the relative frequency of large blackouts to small blackouts to remain the same. Moreover, in some cases, efforts to mitigate small blackouts can even increase the frequency of large blackouts. This result occurs because the large and small blackouts are not mutually independent, but are strongly coupled by the complex dynamics.
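The power-law form claimed for blackout size distributions can be checked on data with a standard maximum-likelihood (Hill-type) exponent estimate. The sketch below uses synthetic Pareto-distributed "blackout sizes" rather than real blackout data:

```python
import math
import random

# MLE of a power-law exponent alpha for sizes s >= s_min, p(s) ~ s^(-alpha):
#   alpha_hat = 1 + n / sum(ln(s_i / s_min))
random.seed(42)
ALPHA, S_MIN, N = 2.5, 1.0, 50_000

# inverse-transform sampling from the Pareto tail F(s) = 1 - (s/s_min)^-(alpha-1)
sizes = [S_MIN * (1.0 - random.random()) ** (-1.0 / (ALPHA - 1.0)) for _ in range(N)]

alpha_hat = 1.0 + N / sum(math.log(s / S_MIN) for s in sizes)
print(f"estimated power-law exponent: {alpha_hat:.3f}")
```

On real blackout data one would also have to estimate `s_min` and test the fit, which this sketch omits.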
2016-04-30
also that we have started building in a domain where structural patterns matter, especially for large projects. Complex Systems Complexity has been...through minimalistic thinking and parsimony” and perceived elegance, which “hides systemic or organizational complexity from the user.” If the system
Verification of Space Station Secondary Power System Stability Using Design of Experiment
NASA Technical Reports Server (NTRS)
Karimi, Kamiar J.; Booker, Andrew J.; Mong, Alvin C.; Manners, Bruce
1998-01-01
This paper describes analytical methods used in verification of large DC power systems, with applications to the International Space Station (ISS). Large DC power systems contain many switching power converters with negative resistor characteristics. The ISS power system presents numerous challenges with respect to system stability, such as complex sources and undefined loads. The Space Station program has developed impedance specifications for sources and loads. The overall approach to system stability consists of specific hardware requirements coupled with extensive system analysis and testing. Testing of large complex distributed power systems is not practical due to the size and complexity of the system. Computer modeling has been extensively used to develop hardware specifications as well as to identify system configurations for lab testing. The statistical method of Design of Experiments (DoE) is used as an analysis tool for verification of these large systems. DoE reduces the number of computer runs which are necessary to analyze the performance of a complex power system consisting of hundreds of DC/DC converters. DoE also provides valuable information about the effect of changes in system parameters on the performance of the system. DoE provides information about various operating scenarios and identifies the ones with potential for instability. In this paper, we describe how we have used computer modeling to analyze a large DC power system. A brief description of DoE is given. Examples using applications of DoE to analysis and verification of the ISS power system are provided.
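The DoE idea of screening which parameters matter can be sketched with a minimal two-level full factorial design. The four-factor "converter response" model below is a made-up stand-in for the real system models:

```python
import itertools

def response(a, b, c, d):
    # hypothetical deterministic response: factor C dominates, A and B interact
    return 2.0 * a + 0.5 * b + 8.0 * c + 0.1 * d + 1.0 * a * b

levels = (-1.0, 1.0)
runs = list(itertools.product(levels, repeat=4))   # 2^4 = 16 runs
ys = [response(*r) for r in runs]

# main effect of factor i: mean response at +1 minus mean response at -1
effects = []
for i in range(4):
    hi = [y for r, y in zip(runs, ys) if r[i] > 0]
    lo = [y for r, y in zip(runs, ys) if r[i] < 0]
    effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))

print("main effects:", [round(e, 2) for e in effects])
```

For hundreds of converters a full factorial is infeasible, which is where fractional designs of the kind the paper relies on come in; the balanced design above still recovers each main effect even in the presence of the A-B interaction.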
Hierarchical modeling and robust synthesis for the preliminary design of large scale complex systems
NASA Astrophysics Data System (ADS)
Koch, Patrick Nathan
Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation these issues are addressed through the development of a method for hierarchical robust preliminary design exploration, which facilitates concurrent system and subsystem design exploration and the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: (1) hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts and allowing integration of subproblems for system synthesis; (2) statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration; and (3) noise modeling techniques for implementing robust preliminary design when approximate models are employed. The method developed and its associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system; the turbofan system-level problem is partitioned into engine cycle and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation.
Hybrid estimation of complex systems.
Hofbaur, Michael W; Williams, Brian C
2004-10-01
Modern automated systems evolve both continuously and discretely, and hence require estimation techniques that go well beyond the capability of a typical Kalman filter. Multiple model (MM) estimation schemes track these system evolutions by applying a bank of filters, one for each discrete system mode. Modern systems, however, are often composed of many interconnected components that exhibit rich behaviors due to complex, system-wide interactions. Modeling these systems leads to complex stochastic hybrid models that capture the large number of operational and failure modes. This large number of modes makes a typical MM estimation approach infeasible for online estimation. This paper analyzes the shortcomings of MM estimation and then introduces an alternative hybrid estimation scheme that can efficiently estimate complex systems with a large number of modes. It utilizes search techniques from the toolkit of model-based reasoning in order to focus the estimation on the set of most likely modes, without missing symptoms that might be hidden amongst the system noise. In addition, we present a novel approach to hybrid estimation in the presence of unknown behavioral modes. This leads to an overall hybrid estimation scheme for complex systems that robustly copes with unforeseen situations in a degraded, but fail-safe, manner.
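The multiple-model idea can be sketched with a bank of two scalar Kalman filters whose innovation log-likelihoods score the hypothesized modes. All dynamics, noise values, and mode names below are assumptions for illustration, not the paper's models:

```python
import math
import random

random.seed(0)
Q, R = 0.01, 0.04                            # process / measurement noise variances
MODES = {"nominal": 1.0, "degraded": 0.5}    # hypothesized state-transition gains
TRUE_A = 0.5                                 # data generated in the degraded mode

# simulate a measurement sequence from the true (degraded) dynamics
x, ys = 1.0, []
for _ in range(1000):
    x = TRUE_A * x + random.gauss(0.0, math.sqrt(Q))
    ys.append(x + random.gauss(0.0, math.sqrt(R)))

def run_filter(a, ys):
    """Scalar Kalman filter for x' = a*x + w; returns the innovation log-likelihood."""
    xe, P, loglik = 1.0, 1.0, 0.0
    for y in ys:
        xp, Pp = a * xe, a * a * P + Q       # predict
        e, S = y - xp, Pp + R                # innovation and its variance
        loglik += -0.5 * (math.log(2 * math.pi * S) + e * e / S)
        K = Pp / S                           # update
        xe, P = xp + K * e, (1.0 - K) * Pp
    return loglik

scores = {mode: run_filter(a, ys) for mode, a in MODES.items()}
best = max(scores, key=scores.get)
print(f"most likely mode: {best}")
```

The paper's contribution is precisely that enumerating every filter in the bank, as done here, does not scale; search over the mode space replaces the exhaustive bank.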
Etoile Project : Social Intelligent ICT-System for very large scale education in complex systems
NASA Astrophysics Data System (ADS)
Bourgine, P.; Johnson, J.
2009-04-01
The project will devise new theory and implement new ICT-based methods of delivering high-quality low-cost postgraduate education to many thousands of people in a scalable way, with the cost of each extra student being negligible (< a few Euros). The research will create an in vivo laboratory of one to ten thousand postgraduate students studying courses in complex systems. This community is chosen because it is large and interdisciplinary and there is a known requirement for courses for thousands of students across Europe. The project involves every aspect of course production and delivery. Within this, the research focuses on the creation of a Socially Intelligent Resource Mining system to gather large volumes of high-quality educational resources from the internet; new methods to deconstruct these to produce a semantically tagged Learning Object Database; a Living Course Ecology to support the creation and maintenance of evolving course materials; systems to deliver courses; and a 'socially intelligent assessment system'. The system will be tested on one to ten thousand postgraduate students in Europe working towards the Complex Systems Society's title of European PhD in Complex Systems. Étoile will have a very high impact both scientifically and socially by (i) the provision of new scalable ICT-based methods for providing very low-cost scientific education, (ii) the creation of new mathematical and statistical theory for the multiscale dynamics of complex systems, (iii) the provision of a working example of adaptation and emergence in complex socio-technical systems, and (iv) making a major educational contribution to European complex systems science and its applications.
Engineering research, development and technology FY99
DOE Office of Scientific and Technical Information (OSTI.GOV)
Langland, R T
The growth of computer power and connectivity, together with advances in wireless sensing and communication technologies, is transforming the field of complex distributed systems. The ability to deploy large numbers of sensors with a rapid, broadband communication system will enable high-fidelity, near real-time monitoring of complex systems. These technological developments will provide unprecedented insight into the actual performance of engineered and natural environment systems, enable the evolution of many new types of engineered systems for monitoring and detection, and enhance our ability to perform improved and validated large-scale simulations of complex systems. One of the challenges facing engineering is to develop methodologies to exploit the emerging information technologies. Particularly important will be the ability to assimilate measured data into the simulation process in a way which is much more sophisticated than current, primarily ad hoc procedures. The reports contained in this section on the Center for Complex Distributed Systems describe activities related to the integrated engineering of large complex systems. The first three papers describe recent developments for each link of the integrated engineering process for large structural systems. These include (1) the development of model-based signal processing algorithms which will formalize the process of coupling measurements and simulation and provide a rigorous methodology for validation and update of computational models; (2) collaborative efforts with faculty at the University of California at Berkeley on the development of massive simulation models for the earth and large bridge structures; and (3) the development of wireless data acquisition systems which provide a practical means of monitoring large systems like the National Ignition Facility (NIF) optical support structures.
These successful developments are coming to a confluence in the next year with applications to NIF structural characterizations and analysis of large bridge structures for the State of California. Initial feasibility investigations into the development of monitoring and detection systems are described in the papers on imaging of underground structures with ground-penetrating radar, and the use of live insects as sensor platforms. These efforts are establishing the basic performance characteristics essential to the decision process for future development of sensor arrays for information gathering related to national security.
Dong, Xianlei; Bollen, Johan
2015-01-01
Economies are instances of complex socio-technical systems that are shaped by the interactions of large numbers of individuals. The individual behavior and decision-making of consumer agents is determined by complex psychological dynamics that include their own assessment of present and future economic conditions as well as those of others, potentially leading to feedback loops that affect the macroscopic state of the economic system. We propose that the large-scale interactions of a nation's citizens with its online resources can reveal the complex dynamics of their collective psychology, including their assessment of future system states. Here we introduce a behavioral index of Chinese Consumer Confidence (C3I) that computationally relates large-scale online search behavior recorded by Google Trends data to the macroscopic variable of consumer confidence. Our results indicate that such computational indices may reveal the components and complex dynamics of consumer psychology as a collective socio-economic phenomenon, potentially leading to improved and more refined economic forecasting.
Dobson, Ian; Carreras, Benjamin A; Lynch, Vickie E; Newman, David E
2007-06-01
We give an overview of a complex systems approach to large blackouts of electric power transmission systems caused by cascading failure. Instead of looking at the details of particular blackouts, we study the statistics and dynamics of series of blackouts with approximate global models. Blackout data from several countries suggest that the frequency of large blackouts is governed by a power law. The power law makes the risk of large blackouts consequential and is consistent with the power system being a complex system designed and operated near a critical point. Power system overall loading or stress relative to operating limits is a key factor affecting the risk of cascading failure. Power system blackout models and abstract models of cascading failure show critical points with power law behavior as load is increased. To explain why the power system is operated near these critical points and inspired by concepts from self-organized criticality, we suggest that power system operating margins evolve slowly to near a critical point and confirm this idea using a power system model. The slow evolution of the power system is driven by a steady increase in electric loading, economic pressures to maximize the use of the grid, and the engineering responses to blackouts that upgrade the system. Mitigation of blackout risk should account for dynamical effects in complex self-organized critical systems. For example, some methods of suppressing small blackouts could ultimately increase the risk of large blackouts.
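A minimal loading-dependent cascade, in the spirit of the abstract cascading-failure models the overview discusses, can show how failure frequency grows with system stress. All parameters below are illustrative, not from the paper:

```python
import random

def cascade_size(n, disturbance, delta, rng):
    """Components fail when load exceeds 1; each failure shifts extra load onto the rest."""
    loads = [rng.random() for _ in range(n)]   # initial loads, uniform on [0, 1)
    extra = disturbance                        # initial system-wide disturbance
    failed = [False] * n
    progress = True
    while progress:                            # propagate until no new failures
        progress = False
        for i in range(n):
            if not failed[i] and loads[i] + extra > 1.0:
                failed[i] = True
                extra += delta                 # each failure loads the survivors
                progress = True
    return sum(failed)

rng = random.Random(7)
n, delta, trials = 100, 0.005, 200
mild = sum(cascade_size(n, 0.05, delta, rng) for _ in range(trials)) / trials
severe = sum(cascade_size(n, 0.30, delta, rng) for _ in range(trials)) / trials
print(f"mean cascade size: mild stress {mild:.1f}, high stress {severe:.1f}")
```

As the overall loading (here the disturbance plus the load-transfer amount) approaches the critical regime, cascades grow sharply, which is the qualitative behavior behind the power-law tails the paper studies.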
Advanced Kalman Filter for Real-Time Responsiveness in Complex Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Welch, Gregory Francis; Zhang, Jinghe
2014-06-10
Complex engineering systems pose fundamental challenges in real-time operations and control because they are highly dynamic systems consisting of a large number of elements with severe nonlinearities and discontinuities. Today’s tools for real-time complex system operations are mostly based on steady-state models, unable to capture the dynamic nature and too slow to prevent system failures. We developed advanced Kalman filtering techniques and the formulation of dynamic state estimation using Kalman filtering techniques to capture complex system dynamics in aiding real-time operations and control. In this work, we looked at complex system issues including severe nonlinearity of system equations, discontinuities caused by system controls and network switches, sparse measurements in space and time, and real-time requirements of power grid operations. We sought to bridge the disciplinary boundaries between Computer Science and Power Systems Engineering, by introducing methods that leverage both existing and new techniques. While our methods were developed in the context of electrical power systems, they should generalize to other large-scale scientific and engineering applications.
Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling
NASA Technical Reports Server (NTRS)
Hojnicki, Jeffrey S.; Rusick, Jeffrey J.
2005-01-01
Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
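The shift from deterministic to probabilistic power analysis can be sketched as Monte Carlo propagation of input uncertainty through a toy capability model. The distributions and parameter values below are assumptions for illustration; the real SPACE model has thousands of nodes:

```python
import random
import statistics

random.seed(1)
N = 20_000
samples = []
for _ in range(N):
    v = random.gauss(120.0, 2.0)      # bus voltage, volts (assumed distribution)
    i = random.gauss(10.0, 0.5)       # load current, amps (assumed distribution)
    eff = random.gauss(0.92, 0.01)    # converter efficiency (assumed distribution)
    samples.append(v * i * eff)       # delivered power for this sample

samples.sort()
mean = statistics.fmean(samples)
p05 = samples[int(0.05 * N)]          # conservative 5th-percentile capability
print(f"mean capability {mean:.0f} W, 5th percentile {p05:.0f} W")
```

A deterministic analysis would report only the single nominal value; the percentile output is exactly the kind of uncertainty-aware result the probabilistic approach adds.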
A Comparison of Ada 83 and C++
1991-06-01
developing large, complex, software systems with long lifetimes. Those interviewed for this study who are familiar with both Ada and C++ believe that Ada is...with those who are familiar with both languages, there was a clear preference for using Ada for large complex systems with long lifetimes. These...University, December 1990 Additions by Nelson H. Weiderman, June 1991. Chile Empresa Nacional de Aeronautica (ENAER), real-time avionics system, Data
The production of multiprotein complexes in insect cells using the baculovirus expression system.
Abdulrahman, Wassim; Radu, Laura; Garzoni, Frederic; Kolesnikova, Olga; Gupta, Kapil; Osz-Papai, Judit; Berger, Imre; Poterszman, Arnaud
2015-01-01
The production of a homogeneous protein sample in sufficient quantities is an essential prerequisite not only for structural investigations but also represents a rate-limiting step for many functional studies. In the cell, a large fraction of eukaryotic proteins exists as large multicomponent assemblies with many subunits, which act in concert to catalyze specific activities. Many of these complexes cannot be obtained from endogenous source material, so recombinant expression and reconstitution are required to overcome this bottleneck. This chapter describes current strategies and protocols for the efficient production of multiprotein complexes in large quantities and of high quality, using the baculovirus/insect cell expression system.
The Emergence of Dominant Design(s) in Large Scale Cyber-Infrastructure Systems
ERIC Educational Resources Information Center
Diamanti, Eirini Ilana
2012-01-01
Cyber-infrastructure systems are integrated large-scale IT systems designed with the goal of transforming scientific practice by enabling multi-disciplinary, cross-institutional collaboration. Their large scale and socio-technical complexity make design decisions for their underlying architecture practically irreversible. Drawing on three…
The Use of Cellular Automata in the Learning of Emergence
ERIC Educational Resources Information Center
Faraco, G.; Pantano, P.; Servidio, R.
2006-01-01
In recent years, research efforts on complex systems have contributed to improve our ability in investigating, at different levels of complexity, the emergent behaviour shown by a system in the course of its evolution. The study of emergence, an intrinsic property of a large number of complex systems, can be tackled by making use of Cellular…
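A classic minimal example of the emergence this abstract refers to is an elementary cellular automaton: Wolfram's Rule 90, whose purely local XOR update grows a global Sierpinski pattern from a single seed cell.

```python
RULE = 90   # elementary CA rule number; Rule 90 is new_cell = left XOR right

def step(cells):
    """One synchronous update of a 1-D CA with periodic boundaries."""
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        idx = (left << 2) | (center << 1) | right   # 3-cell neighborhood as 0..7
        out.append((RULE >> idx) & 1)               # look up the rule bit
    return out

row = [0] * 31
row[15] = 1                      # single live cell in the middle
for _ in range(4):
    row = step(row)
print("".join("#" if c else "." for c in row))
```

After four steps the live cells sit at the pattern of binomial coefficients mod 2 (here, positions 11 and 19), the hallmark of the emergent fractal structure no single-cell rule mentions.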
Persistent model order reduction for complex dynamical systems using smooth orthogonal decomposition
NASA Astrophysics Data System (ADS)
Ilbeigi, Shahab; Chelidze, David
2017-11-01
Full-scale complex dynamic models are not effective for parametric studies due to the inherent constraints on available computational power and storage resources. A persistent reduced order model (ROM) that is robust, stable, and provides high-fidelity simulations for a relatively wide range of parameters and operating conditions can provide a solution to this problem. The fidelity of a new framework for persistent model order reduction of large and complex dynamical systems is investigated. The framework is validated using several numerical examples including a large linear system and two complex nonlinear systems with material and geometrical nonlinearities. While the framework is used for identifying the robust subspaces obtained from both proper and smooth orthogonal decompositions (POD and SOD, respectively), the results show that SOD outperforms POD in terms of stability, accuracy, and robustness.
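POD itself reduces to a singular value decomposition of a snapshot matrix. A sketch on synthetic rank-2 data (illustrative only, not the paper's test cases) shows the projection-reconstruction step at the heart of any POD-based ROM:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, r = 200, 60, 2                         # state dim, snapshot count, true rank

# synthetic snapshots that genuinely live in a 2-D subspace
modes = rng.standard_normal((n, r))
coeffs = rng.standard_normal((r, m))
X = modes @ coeffs                           # snapshot matrix, one column per snapshot

U, s, _ = np.linalg.svd(X, full_matrices=False)
Phi = U[:, :r]                               # POD basis: leading left singular vectors
X_rom = Phi @ (Phi.T @ X)                    # project onto the basis and reconstruct

err = np.linalg.norm(X - X_rom) / np.linalg.norm(X)
print(f"relative reconstruction error: {err:.2e}")
```

SOD, which the paper finds more robust, replaces this single SVD with a generalized decomposition involving time derivatives of the snapshots; the projection step it feeds into is the same.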
Self-assembly of polyelectrolyte surfactant complexes using large scale MD simulation
NASA Astrophysics Data System (ADS)
Goswami, Monojoy; Sumpter, Bobby
2014-03-01
Polyelectrolytes (PE) and surfactants are known to form interesting structures with varied properties in aqueous solutions. The morphological details of the PE-surfactant complexes depend on a combination of polymer backbone, electrostatic interactions, and hydrophobic interactions. We study the self-assembly of cationic PE and anionic surfactant complexes in dilute conditions. The importance of such complexes of PE with oppositely charged surfactants can be found in biological systems, such as the immobilization of enzymes in polyelectrolyte complexes or the nonspecific association of DNA with protein. Many useful properties of PE-surfactant complexes come from the highly ordered structures of surfactant self-assembly inside the PE aggregate, which has applications in industry. We perform large-scale molecular dynamics simulations using LAMMPS to understand the structure and dynamics of PE-surfactant systems. Our investigation shows highly ordered pearl-necklace structures that have been observed experimentally in biological systems. We investigate many different properties of PE-surfactant complexation for different parameter ranges that are useful for pharmaceutical, engineering, and biological applications.
Effective control of complex turbulent dynamical systems through statistical functionals.
Majda, Andrew J; Qi, Di
2017-05-30
Turbulent dynamical systems characterized by both a high-dimensional phase space and a large number of instabilities are ubiquitous among complex systems in science and engineering, including climate, material, and neural science. Control of these complex systems is a grand challenge, for example, in mitigating the effects of climate change or safe design of technology with fully developed shear turbulence. Control of flows in the transition to turbulence, where there is a small dimension of instabilities about a basic mean state, is an important and successful discipline. In complex turbulent dynamical systems, it is impossible to track and control the large dimension of instabilities, which strongly interact and exchange energy, and new control strategies are needed. The goal of this paper is to propose an effective statistical control strategy for complex turbulent dynamical systems based on a recent statistical energy principle and statistical linear response theory. We illustrate the potential practical efficiency and verify this effective statistical control strategy on the 40D Lorenz 1996 model in forcing regimes with various types of fully turbulent dynamics with nearly one-half of the phase space unstable.
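The 40-dimensional Lorenz 96 test bed named in the abstract is straightforward to reproduce. A fixed-step RK4 sketch (the integration step and perturbation are our choices; F = 8 gives fully turbulent dynamics):

```python
# Lorenz 96 model: dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F
N, F = 40, 8.0

def l96(x):
    # Python's negative indexing handles the i-1 and i-2 periodic wraps
    return [(x[(i + 1) % N] - x[i - 2]) * x[i - 1] - x[i] + F for i in range(N)]

def rk4_step(x, dt):
    k1 = l96(x)
    k2 = l96([xi + 0.5 * dt * ki for xi, ki in zip(x, k1)])
    k3 = l96([xi + 0.5 * dt * ki for xi, ki in zip(x, k2)])
    k4 = l96([xi + dt * ki for xi, ki in zip(x, k3)])
    return [xi + dt / 6.0 * (a + 2 * b + 2 * c + d)
            for xi, a, b, c, d in zip(x, k1, k2, k3, k4)]

x = [F] * N
x[0] += 0.01                      # nudge off the unstable uniform fixed point
for _ in range(2000):             # 20 model time units at dt = 0.01
    x = rk4_step(x, 0.01)

mean_energy = sum(xi * xi for xi in x) / N
print(f"mean energy per site after spin-up: {mean_energy:.2f}")
```

The statistical energy quantities that the paper's control strategy targets are exactly averages like `mean_energy` over such trajectories.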
Complexity in Indexing Systems--Abandonment and Failure: Implications for Organizing the Internet.
ERIC Educational Resources Information Center
Weinberg, Bella Hass
1996-01-01
Discusses detailed classification systems, sophisticated alphabetical indexing systems and reasons for the abandonment of complex indexing systems. The suggested structure for indexing the Internet or other large electronic collections of documents is based on that of book indexes: specific headings with coined modifications. (Author/AEF)
Mathematical Models to Determine Stable Behavior of Complex Systems
NASA Astrophysics Data System (ADS)
Sumin, V. I.; Dushkin, A. V.; Smolentseva, T. E.
2018-05-01
The paper analyzes the possibility of predicting the functioning of a complex dynamic system with a significant amount of circulating information and a large number of random factors impacting its functioning. Functioning of the complex dynamic system is described in terms of chaotic states, self-organized criticality, and bifurcations. This problem may be resolved by modeling such systems as deterministic dynamic systems, without applying stochastic models, while taking strange attractors into account.
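One standard way to classify a trajectory as stable or chaotic, in the spirit of the deterministic modeling advocated above, is the largest Lyapunov exponent. A sketch for the logistic map, used here only as a stand-in system (it is not the paper's model):

```python
import math

def lyapunov(r, x0=0.2, n=100_000, burn=1_000):
    """Largest Lyapunov exponent of x -> r*x*(1-x), averaged along the orbit."""
    x = x0
    for _ in range(burn):                 # discard the transient
        x = r * x * (1.0 - x)
    acc = 0.0
    for _ in range(n):
        x = r * x * (1.0 - x)
        acc += math.log(abs(r * (1.0 - 2.0 * x)))   # log |f'(x)|
    return acc / n

# negative exponent: stable behavior; positive (ln 2 at r = 4): chaos
print(f"r=2.9: {lyapunov(2.9):.3f}   r=4.0: {lyapunov(4.0):.3f}")
```

A positive exponent signals a strange attractor and bounds the prediction horizon, which is exactly the distinction between predictable and unpredictable functioning the paper is after.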
ERIC Educational Resources Information Center
Grotzer, Tina A.; Solis, S. Lynneth; Tutwiler, M. Shane; Cuzzolino, Megan Powell
2017-01-01
Understanding complex systems requires reasoning about causal relationships that behave or appear to behave probabilistically. Features such as distributed agency, large spatial scales, and time delays obscure co-variation relationships, and complex interactions can result in non-deterministic relationships between causes and effects that are best…
Challenges in Developing Models Describing Complex Soil Systems
NASA Astrophysics Data System (ADS)
Simunek, J.; Jacques, D.
2014-12-01
Quantitative mechanistic models that consider basic physical, mechanical, chemical, and biological processes have the potential to be powerful tools for integrating our understanding of complex soil systems, and the soil science community has often called for models that would include a large number of these diverse processes. However, once attempts have been made to develop such models, the response from the community has not always been enthusiastic, especially after it was discovered that such models are necessarily highly complex. They require a large number of parameters, not all of which can be easily (or at all) measured or identified, and many of which are associated with large uncertainties; they also require from their users deep knowledge of most of the implemented physical, mechanical, chemical, and biological processes. The real or perceived complexity of these models then discourages users from applying them even to relatively simple problems for which they would be perfectly adequate. Due to the nonlinear nature and chemical/biological complexity of soil systems, it is also virtually impossible to verify these types of models analytically, raising doubts about their applicability. Code inter-comparison, likely the most suitable method for assessing code capabilities and model performance, requires the existence of multiple models with similar or overlapping capabilities, which may not always exist. It is thus a challenge not only to develop models describing complex soil systems, but also to persuade the soil science community to use them. As a result, complex quantitative mechanistic models remain an underutilized tool in soil science research. We will demonstrate some of the challenges discussed above using our own efforts in developing quantitative mechanistic models (such as HP1/2) for complex soil systems.
Dynamical systems in economics
NASA Astrophysics Data System (ADS)
Stanojević, Jelena; Kukić, Katarina
2018-01-01
In the last few decades, much attention has been given to explaining the complex behaviour of very large systems, such as weather, the economy, biology, and demography. In this paper we give a short overview of basic notions in the field of dynamical systems that are relevant for understanding the complex nature of some economic models.
Gradia, Scott D; Ishida, Justin P; Tsai, Miaw-Sheue; Jeans, Chris; Tainer, John A; Fuss, Jill O
2017-01-01
Recombinant expression of large, multiprotein complexes is essential and often rate limiting for determining structural, biophysical, and biochemical properties of DNA repair, replication, transcription, and other key cellular processes. Baculovirus-infected insect cell expression systems are especially well suited for producing large, human proteins recombinantly, and multigene baculovirus systems have facilitated studies of multiprotein complexes. In this chapter, we describe a multigene baculovirus system called MacroBac that uses a Biobricks-type assembly method based on restriction and ligation (Series 11) or ligation-independent cloning (Series 438). MacroBac cloning and assembly is efficient and equally well suited for either single subcloning reactions or high-throughput cloning using 96-well plates and liquid handling robotics. MacroBac vectors are polypromoter with each gene flanked by a strong polyhedrin promoter and an SV40 poly(A) termination signal that minimize gene order expression level effects seen in many polycistronic assemblies. Large assemblies are robustly achievable, and we have successfully assembled as many as 10 genes into a single MacroBac vector. Importantly, we have observed significant increases in expression levels and quality of large, multiprotein complexes using a single, multigene, polypromoter virus rather than coinfection with multiple, single-gene viruses. Given the importance of characterizing functional complexes, we believe that MacroBac provides a critical enabling technology that may change the way that structural, biophysical, and biochemical research is done. © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Waddell, Steve; Cornell, Sarah; Hsueh, Joe; Ozer, Ceren; McLachlan, Milla; Birney, Anna
2015-04-01
Most action to address contemporary complex challenges, including the urgent issues of global sustainability, occurs piecemeal and without meaningful guidance from leading complex change knowledge and methods. The potential benefit of using such knowledge is greater efficacy of effort and investment. However, this knowledge and its associated tools and methods are under-utilized because understanding about them is low, fragmented between diverse knowledge traditions, and often requires shifts in mindsets and skills from expert-led to participant-based action. We have been engaged in diverse action-oriented research efforts in Large System Change for sustainability. For us, "large" systems can be characterized as large-scale systems - up to global - with many components, of many kinds (physical, biological, institutional, cultural/conceptual), operating at multiple levels, driven by multiple forces, and presenting major challenges for people involved. We see change of such systems as complex challenges, in contrast with simple or complicated problems, or chaotic situations. In other words, issues and sub-systems have unclear boundaries, interact with each other, and are often contradictory; dynamics are non-linear; issues are not "controllable", and "solutions" are "emergent" and often paradoxical. Since choices are opportunity-, power- and value-driven, these social, institutional and cultural factors need to be made explicit in any actionable theory of change. Our emerging network is sharing and building a knowledge base of experience, heuristics, and theories of change from multiple disciplines and practice domains. 
We will present our views on focal issues for the development of the field of large system change, which include processes of goal-setting and alignment; leverage of systemic transitions and transformation; and the role of choice in influencing critical change processes, when only some sub-systems or levels of the system behave in purposeful ways, while others are undeniably and unavoidably deterministic.
Conjugate gradient type methods for linear systems with complex symmetric coefficient matrices
NASA Technical Reports Server (NTRS)
Freund, Roland
1989-01-01
We consider conjugate gradient type methods for the solution of large sparse linear systems Ax = b with complex symmetric coefficient matrices A = A^T. Such linear systems arise in important applications, such as the numerical solution of the complex Helmholtz equation. Furthermore, most complex non-Hermitian linear systems which occur in practice are actually complex symmetric. We investigate conjugate gradient type iterations which are based on a variant of the nonsymmetric Lanczos algorithm for complex symmetric matrices. We propose a new approach with iterates defined by a quasi-minimal residual property. The resulting algorithm presents several advantages over the standard biconjugate gradient method. We also include some remarks on the obvious approach to general complex linear systems by solving equivalent real linear systems for the real and imaginary parts of x. Finally, numerical experiments for linear systems arising from the complex Helmholtz equation are reported.
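The key structural point above is that a complex symmetric matrix satisfies A = A^T but generally not A = A^H. A toy 1D complex Helmholtz discretization illustrates this; the grid size and shift values below are arbitrary illustration choices, and a direct solve stands in for the CG-type iteration:

```python
import numpy as np

# 1D complex Helmholtz discretization: -u'' - (k^2 + i*eps) u = f.
# Central differences give a tridiagonal matrix that is complex symmetric
# (A = A^T) but not Hermitian (A != A^H).  Values here are illustrative.
n, h, k2, eps = 50, 1.0 / 51, 10.0, 1.0
main = (2.0 / h**2 - (k2 + 1j * eps)) * np.ones(n)
off = (-1.0 / h**2) * np.ones(n - 1)
A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

print(np.allclose(A, A.T))         # complex symmetric: True
print(np.allclose(A, A.conj().T))  # Hermitian: False (complex diagonal)

b = np.ones(n, dtype=complex)
x = np.linalg.solve(A, b)          # direct solve, stand-in for the iteration
print(np.allclose(A @ x, b))
```

The complex symmetry is what lets the nonsymmetric Lanczos process collapse to a single recurrence, which is the structure the paper's methods exploit.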
Visual analysis and exploration of complex corporate shareholder networks
NASA Astrophysics Data System (ADS)
Tekušová, Tatiana; Kohlhammer, Jörn
2008-01-01
The analysis of large corporate shareholder network structures is an important task in corporate governance, financing, and financial investment domains. In a modern economy, large structures of cross-corporation, cross-border shareholder relationships exist, forming complex networks. These networks are often difficult to analyze with traditional approaches. An efficient visualization of the networks helps to reveal the interdependent shareholding formations and the controlling patterns. In this paper, we propose an effective visualization tool that supports the financial analyst in understanding complex shareholding networks. We develop an interactive visual analysis system by combining state-of-the-art visualization technologies with economic analysis methods. Our system is capable of revealing patterns in large corporate shareholder networks, allows the visual identification of ultimate shareholders, and supports the visual analysis of integrated cash flow and control rights. We apply our system to an extensive real-world database of shareholder relationships, demonstrating its usefulness for effective visual analysis.
Preferential pathways in complex fracture systems and their influence on large scale transport
NASA Astrophysics Data System (ADS)
Willmann, M.; Mañé, R.; Tyukhova, A.
2017-12-01
Many subsurface applications in complex fracture systems require large-scale predictions. Precise predictions are difficult because preferential pathways exist at different scales. The intrinsic complexity of fracture systems increases within fractured sedimentary formations, because the coupling of fractures and matrix must also be taken into account. This interplay of the fracture system and the sedimentary matrix is strongly controlled by the actual aperture of an individual fracture, and an effective aperture cannot easily be determined because of the preferential pathways along the fracture plane. We investigate the influence of these preferential pathways on large-scale solute transport and upscale the aperture. By explicitly modeling flow and particle tracking in individual fractures, we develop a new effective transport aperture, weighted by the aperture along the preferential paths: a Lagrangian aperture. We show that this new aperture is consistently larger than existing definitions of effective flow and transport apertures. Finally, we apply our results to a fractured sedimentary formation in Northern Switzerland.
Stoichiometry for binding and transport by the twin arginine translocation system.
Celedon, Jose M; Cline, Kenneth
2012-05-14
Twin arginine translocation (Tat) systems transport large folded proteins across sealed membranes. Tat systems accomplish this feat with three membrane components organized in two complexes. In thylakoid membranes, cpTatC and Hcf106 comprise a large receptor complex containing an estimated eight cpTatC-Hcf106 pairs. Protein transport occurs when Tha4 joins the receptor complex as an oligomer of uncertain size that is thought to form the protein-conducting structure. Here, binding analyses with intact membranes or purified complexes indicate that each receptor complex could bind eight precursor proteins. Kinetic analysis of translocation showed that each precursor-bound site was independently functional for transport, and, with sufficient Tha4, all sites were concurrently active for transport. Tha4 titration determined that ∼26 Tha4 protomers were required for transport of each OE17 (oxygen-evolving complex subunit of 17 kD) precursor protein. Our results suggest that, when fully saturated with precursor proteins and Tha4, the Tat translocase is an ∼2.2-megadalton complex that can individually transport eight precursor proteins or cooperatively transport multimeric precursors.
The methodology of multi-viewpoint clustering analysis
NASA Technical Reports Server (NTRS)
Mehrotra, Mala; Wild, Chris
1993-01-01
One of the greatest challenges facing the software engineering community is the ability to produce large and complex computer systems, such as ground support systems for unmanned scientific missions, that are reliable and cost effective. In order to build and maintain these systems, it is important that the knowledge in the system be suitably abstracted, structured, and otherwise clustered in a manner which facilitates its understanding, manipulation, testing, and utilization. Development of complex mission-critical systems will require the ability to abstract overall concepts in the system at various levels of detail and to consider the system from different points of view. The Multi-ViewPoint Clustering Analysis (MVP-CA) methodology has been developed to provide multiple views of large, complicated systems. MVP-CA provides an ability to discover significant structures by providing an automated mechanism to structure both hierarchically (from detail to abstract) and orthogonally (from different perspectives). We propose to integrate MVP-CA into an overall software engineering life cycle to support the development and evolution of complex mission-critical systems.
Systems engineering for very large systems
NASA Technical Reports Server (NTRS)
Lewkowicz, Paul E.
1993-01-01
Very large integrated systems have always posed special problems for engineers. Whether they are power generation systems, computer networks or space vehicles, whenever there are multiple interfaces, complex technologies or just demanding customers, the challenges are unique. 'Systems engineering' has evolved as a discipline in order to meet these challenges by providing a structured, top-down design and development methodology for the engineer. This paper attempts to define the general class of problems requiring the complete systems engineering treatment and to show how systems engineering can be utilized to improve customer satisfaction and profitability. Specifically, this work will focus on a design methodology for the largest of systems, not necessarily in terms of physical size, but in terms of complexity and interconnectivity.
Can Models Capture the Complexity of the Systems Engineering Process?
NASA Astrophysics Data System (ADS)
Boppana, Krishna; Chow, Sam; de Weck, Olivier L.; Lafon, Christian; Lekkakos, Spyridon D.; Lyneis, James; Rinaldi, Matthew; Wang, Zhiyong; Wheeler, Paul; Zborovskiy, Marat; Wojcik, Leonard A.
Many large-scale, complex systems engineering (SE) programs have been problematic; a few examples are listed below (Bar-Yam, 2003; Cullen, 2004), and many others have been late, well over budget, or have failed: Hilton/Marriott/American Airlines system for hotel reservations and flights; 1988-1992; 125 million; "scrapped"
ERIC Educational Resources Information Center
Kharabe, Amol T.
2012-01-01
Over the last two decades, firms have operated in "increasingly" accelerated "high-velocity" dynamic markets, which require them to become "agile." During the same time frame, firms have increasingly deployed complex enterprise systems--large-scale packaged software "innovations" that integrate and automate…
2015-09-30
Clark (2014), "Using High Performance Computing to Explore Large Complex Bioacoustic Soundscapes: Case Study for Right Whale Acoustics," Procedia Computer Science 20.
Integrating complexity into data-driven multi-hazard supply chain network strategies
Long, Suzanna K.; Shoberg, Thomas G.; Ramachandran, Varun; Corns, Steven M.; Carlo, Hector J.
2013-01-01
Major strategies in the wake of a large-scale disaster have focused on short-term emergency response solutions. Few consider medium- to long-term restoration strategies that reconnect urban areas to the national supply chain networks (SCN) and their supporting infrastructure. To re-establish this connectivity, the relationships within the SCN must be defined and formulated as a model of a complex adaptive system (CAS). A CAS model is a representation of a system that consists of large numbers of inter-connections, demonstrates non-linear behaviors and emergent properties, and responds to stimulus from its environment. CAS modeling is an effective method of managing the complexities associated with SCN restoration after large-scale disasters. In order to populate the data space, large data sets are required. Currently, access to these data is hampered by proprietary restrictions. The aim of this paper is to identify the data required to build an SCN restoration model, examine the inherent problems associated with these data, and understand the complexity that arises from integrating these data.
Rogge, Ryan A; Hansen, Jeffrey C
2015-01-01
Sedimentation velocity experiments measure the transport of molecules in solution under centrifugal force. Here, we describe a method for monitoring the sedimentation of very large biological molecular assemblies using the interference optical systems of the analytical ultracentrifuge. The mass, partial-specific volume, and shape of macromolecules in solution affect their sedimentation rates as reflected in the sedimentation coefficient. The sedimentation coefficient is obtained by measuring the solute concentration as a function of radial distance during centrifugation. Monitoring the concentration can be accomplished using interference optics, absorbance optics, or the fluorescence detection system, each with inherent advantages. The interference optical system captures data much faster than these other optical systems, allowing for sedimentation velocity analysis of extremely large macromolecular complexes that sediment rapidly at very low rotor speeds. Supramolecular oligomeric complexes produced by self-association of 12-mer chromatin fibers are used to illustrate the advantages of the interference optics. Using interference optics, we show that chromatin fibers self-associate at physiological divalent salt concentrations to form structures that sediment between 10,000 and 350,000 S. The method for characterizing chromatin oligomers described in this chapter will be generally useful for characterization of any biological structures that are too large to be studied by the absorbance optical system. © 2015 Elsevier Inc. All rights reserved.
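The boundary-movement relation behind the sedimentation coefficient, s = ln(r2/r1) / (ω² Δt), can be sketched numerically; the radii, rotor speed, and time interval below are hypothetical illustration values, not data from the chapter:

```python
import math

def sedimentation_coefficient(r1_cm, r2_cm, rpm, dt_seconds):
    """s = ln(r2/r1) / (omega^2 * dt), in Svedbergs (1 S = 1e-13 s)."""
    omega = rpm * 2.0 * math.pi / 60.0               # angular velocity, rad/s
    s_seconds = math.log(r2_cm / r1_cm) / (omega**2 * dt_seconds)
    return s_seconds / 1e-13                         # convert to Svedbergs

# Hypothetical example: a boundary moving from 6.0 cm to 6.3 cm in
# 10 minutes at a low rotor speed of 3,000 rpm, as used for very large
# complexes.  (Only the ratio r2/r1 matters, so the cm units cancel.)
s = sedimentation_coefficient(6.0, 6.3, 3000, 600.0)
print(round(s))
```

The low-rpm regime is exactly where rapid data capture matters: at these speeds a very large complex still crosses the cell in minutes, so the faster interference optics can resolve the moving boundary.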
Low-Complexity Polynomial Channel Estimation in Large-Scale MIMO With Arbitrary Statistics
NASA Astrophysics Data System (ADS)
Shariati, Nafiseh; Bjornson, Emil; Bengtsson, Mats; Debbah, Merouane
2014-10-01
This paper considers pilot-based channel estimation in large-scale multiple-input multiple-output (MIMO) communication systems, also known as massive MIMO, where there are hundreds of antennas at one side of the link. Motivated by the fact that computational complexity is one of the main challenges in such systems, a set of low-complexity Bayesian channel estimators, coined Polynomial ExpAnsion CHannel (PEACH) estimators, are introduced for arbitrary channel and interference statistics. While the conventional minimum mean square error (MMSE) estimator has cubic complexity in the dimension of the covariance matrices, due to an inversion operation, our proposed estimators significantly reduce this to square complexity by approximating the inverse by an L-degree matrix polynomial. The coefficients of the polynomial are optimized to minimize the mean square error (MSE) of the estimate. We show numerically that near-optimal MSEs are achieved with low polynomial degrees. We also derive the exact computational complexity of the proposed estimators, in terms of floating-point operations (FLOPs), by which we prove that the proposed estimators outperform the conventional estimators in large-scale MIMO systems of practical dimensions while providing reasonable MSE. Moreover, we show that L need not scale with the system dimensions to maintain a certain normalized MSE. By analyzing different interference scenarios, we observe that the relative MSE loss of using the low-complexity PEACH estimators is smaller in realistic scenarios with pilot contamination. On the other hand, PEACH estimators are not well suited for noise-limited scenarios with high pilot power; therefore, we also introduce the low-complexity diagonalized estimator that performs well in this regime. Finally, we ...
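The core idea of replacing the inverse with a matrix polynomial can be sketched with a plain Neumann-type expansion. Note the paper optimizes the polynomial coefficients for minimum MSE, whereas the coefficients below come from a simple scaled Neumann series, and the covariance model is an arbitrary toy choice:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8                                        # antennas (tiny, for illustration)
C = np.eye(n) + 0.2 * np.ones((n, n)) / n    # assumed channel covariance
sigma2 = 0.5                                 # noise power
S = C + sigma2 * np.eye(n)                   # covariance of observation y = h + n

h = rng.standard_normal(n)
y = h + np.sqrt(sigma2) * rng.standard_normal(n)

# Exact MMSE estimate h_hat = C S^{-1} y (cubic cost: matrix inversion).
h_mmse = C @ np.linalg.solve(S, y)

# Polynomial expansion: S^{-1} ~ alpha * sum_{l=0}^{L} (I - alpha*S)^l,
# convergent when alpha < 2 / lambda_max(S).  Each term needs only a
# matrix-vector product, which is the source of the complexity reduction.
alpha = 1.0 / np.linalg.eigvalsh(S).max()
z, term = np.zeros(n), y.copy()
for _ in range(30):                          # L = 30 polynomial terms
    z += alpha * term
    term = term - alpha * (S @ term)
h_peach = C @ z

print(np.linalg.norm(h_peach - h_mmse) < 1e-3)
```

With MSE-optimized coefficients, as in the paper, far lower degrees than the plain series used here suffice for near-optimal accuracy.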
Facilitating and Learning at the Edge of Chaos: Expanding the Context of Experiential Education.
ERIC Educational Resources Information Center
Oekerman, Carl
Significant recent discoveries within a number of scientific disciplines, collectively referred to as the science of complexity, are creating a major shift in how human beings understand the complex, adaptive systems that make up the world. A complex adaptive system consists of networks of large numbers of agents that interact with each other and…
NASA Astrophysics Data System (ADS)
Koestner, Stefan
2009-09-01
With the increasing size and complexity of today's experiments in high energy physics, the effort required to integrate a complete subdetector into an experiment control system is often underestimated. We report here on the layered software structure and protocols used by the LHCb experiment to control its detectors and readout boards. The experiment control system of LHCb is based on the commercial SCADA system PVSS II. Readout boards outside the radiation area are accessed via embedded credit-card-sized PCs connected to a large local area network. The SPECS protocol is used for control of the front-end electronics. Finite state machines are introduced to facilitate the control of a large number of electronic devices and to model the whole experiment at the level of an expert system.
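The finite-state-machine pattern for driving many devices at once can be sketched as follows; the state names, commands, and the ReadoutBoard class are hypothetical illustrations, not the LHCb implementation:

```python
# Minimal sketch of finite-state-machine device control.  A shared
# transition table defines which commands are legal in which states, and
# a parent node fans one command out to many devices.

TRANSITIONS = {
    ("NOT_READY", "configure"): "READY",
    ("READY", "start"): "RUNNING",
    ("RUNNING", "stop"): "READY",
    ("READY", "reset"): "NOT_READY",
}

class ReadoutBoard:
    def __init__(self, name):
        self.name, self.state = name, "NOT_READY"

    def command(self, cmd):
        key = (self.state, cmd)
        if key not in TRANSITIONS:
            raise ValueError(f"{self.name}: '{cmd}' invalid in {self.state}")
        self.state = TRANSITIONS[key]

# Driving many boards through one interface: the parent broadcasts the
# same command sequence, so thousands of devices stay in lockstep.
boards = [ReadoutBoard(f"board_{i:03d}") for i in range(4)]
for cmd in ("configure", "start"):
    for b in boards:
        b.command(cmd)
print(all(b.state == "RUNNING" for b in boards))
```

Rejecting commands that are invalid for the current state is what makes this approach scale: operator errors surface locally instead of leaving some devices in an inconsistent state.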
Ecological systems are generally considered among the most complex because they are characterized by a large number of diverse components, nonlinear interactions, scale multiplicity, and spatial heterogeneity. Hierarchy theory, as well as empirical evidence, suggests that comp...
Complex and unexpected dynamics in simple genetic regulatory networks
NASA Astrophysics Data System (ADS)
Borg, Yanika; Ullner, Ekkehard; Alagha, Afnan; Alsaedi, Ahmed; Nesbeth, Darren; Zaikin, Alexey
2014-03-01
One aim of synthetic biology is to construct increasingly complex genetic networks from interconnected simpler ones to address challenges in medicine and biotechnology. However, as systems increase in size and complexity, emergent properties lead to unexpected and complex dynamics due to nonlinear and nonequilibrium properties from component interactions. We focus on four different studies of biological systems which exhibit complex and unexpected dynamics. Using simple synthetic genetic networks, small and large populations of phase-coupled quorum sensing repressilators, Goodwin oscillators, and bistable switches, we review how coupled and stochastic components can result in clustering, chaos, noise-induced coherence and speed-dependent decision making. A system of repressilators exhibits oscillations, limit cycles, steady states or chaos depending on the nature and strength of the coupling mechanism. In large repressilator networks, rich dynamics can also be exhibited, such as clustering and chaos. In populations of Goodwin oscillators, noise can induce coherent oscillations. In bistable systems, the speed with which incoming external signals reach steady state can bias the network towards particular attractors. These studies showcase the range of dynamical behavior that simple synthetic genetic networks can exhibit. In addition, they demonstrate the ability of mathematical modeling to analyze nonlinearity and inhomogeneity within these systems.
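A deterministic repressilator, the simplest of the networks surveyed, can be sketched as a six-variable ODE system; the parameter values below are the commonly used dimensionless illustrative ones, not those of the studies reviewed:

```python
import numpy as np

# Three-gene repressilator ring: protein j represses transcription of
# gene i via a Hill function.  Six state variables: mRNA m and protein p
# for each of the three genes.  Parameters are illustrative values in
# the standard oscillatory regime.
alpha, alpha0, beta, n_hill = 216.0, 0.216, 5.0, 2.0

def rhs(state):
    m, p = state[:3], state[3:]
    rep = np.roll(p, 1)                       # repressor of gene i is protein i-1
    dm = -m + alpha / (1.0 + rep**n_hill) + alpha0
    dp = -beta * (p - m)
    return np.concatenate([dm, dp])

state = np.array([1.0, 2.0, 3.0, 1.0, 1.0, 1.0])
dt, traj = 0.01, []
for step in range(60000):                     # forward Euler integration
    state = state + dt * rhs(state)
    if step > 30000:                          # record after transients decay
        traj.append(state[3])                 # protein 1 concentration
traj = np.array(traj)

# Sustained oscillation: the concentration keeps swinging rather than
# settling to a fixed point.
print(traj.std() > 1.0)
```

The same ring wired with different coupling, or replicated across a population with quorum-sensing coupling, is what produces the clustering and chaos described above.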
NASA Technical Reports Server (NTRS)
Feher, Kamilo
1993-01-01
The performance and implementation complexity of coherent and of noncoherent QPSK and GMSK modulation/demodulation techniques in a complex mobile satellite systems environment, including large Doppler shift, delay spread, and low C/I, are compared. We demonstrate that for large f(sub d)T(sub b) products, where f(sub d) is the Doppler shift and T(sub b) is the bit duration, noncoherent (discriminator detector or differential demodulation) systems have a lower BER floor than their coherent counterparts. For significant delay spreads, e.g., tau(sub rms) greater than 0.4 T(sub b), and low C/I, coherent systems outperform noncoherent systems. However, the synchronization time of coherent systems is longer than that of noncoherent systems. Spectral efficiency, overall capacity, and related hardware complexity issues of these systems are also analyzed. We demonstrate that coherent systems have a simpler overall architecture (IF filter implementation-cost versus carrier recovery) and are more robust in an RF frequency drift environment. Additionally, the prediction tools, computer simulations, and analysis of coherent systems are simpler. The threshold or capture effect in low C/I interference environment is critical for noncoherent discriminator based systems. We conclude with a comparison of hardware architectures of coherent and of noncoherent systems, including recent trends in commercial VLSI technology and direct baseband to RF transmit, RF to baseband (0-IF) receiver implementation strategies.
NASA Technical Reports Server (NTRS)
1988-01-01
Integrated Environments for Large, Complex Systems is the theme for the RICIS symposium of 1988. Distinguished professionals from industry, government, and academia have been invited to participate and present their views and experiences regarding research, education, and future directions related to this topic. Within RICIS, more than half of the research being conducted is in the area of Computer Systems and Software Engineering. The focus of this research is on the software development life-cycle for large, complex, distributed systems. Within the education and training component of RICIS, the primary emphasis has been to provide education and training for software professionals.
Costing Complex Products, Operations & Support
2011-10-19
last version of the Harrier II family introduced in the UK. As such it is similar to the U.S. Marine Corps' AV-8B in basic structure and systems. ABSTRACT: Complex products and systems (CoPS) are major capital goods in which customers play a central role from design through to disposal, such as large defense systems.
20. OVERVIEW OF SAR3 COMPLEX, SHOWING FORMER RESIDENTIAL AREA, SAR3 ...
20. OVERVIEW OF SAR-3 COMPLEX, SHOWING FORMER RESIDENTIAL AREA, SAR-3 SWITCH RACK, MAINTENANCE YARD, AND GREENSPOT BRIDGE. NOTE ALSO LARGE PIPE CONDUCTING TAILRACE WATER INTO IRRIGATION SYSTEM. VIEW TO SOUTHWEST. - Santa Ana River Hydroelectric System, Redlands, San Bernardino County, CA
Large-scale systems: Complexity, stability, reliability
NASA Technical Reports Server (NTRS)
Siljak, D. D.
1975-01-01
After showing that a complex dynamic system with a competitive structure has highly reliable stability, a class of noncompetitive dynamic systems for which competitive models can be constructed is defined. It is shown that such a construction is possible in the context of the hierarchic stability analysis. The scheme is based on the comparison principle and vector Liapunov functions.
Dependency visualization for complex system understanding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smart, J. Allison Cory
1994-09-01
With the volume of software in production use dramatically increasing, the importance of software maintenance has become strikingly apparent. Techniques are now sought and developed for reverse engineering and for design extraction and recovery. At present, numerous commercial products and research tools exist which are capable of visualizing a variety of programming languages and software constructs. The list of new tools and services continues to grow rapidly. Although the scope of the existing commercial and academic product set is quite broad, these tools still share a common underlying problem. The ability of each tool to visually organize object representations is increasingly impaired as the number of components and component dependencies within systems increases. Regardless of how objects are defined, complex "spaghetti" networks result in nearly all large system cases. While this problem is immediately apparent in modern systems analysis involving large software implementations, it is not new. As will be discussed in Chapter 2, related problems involving the theory of graphs were identified long ago. This important theoretical foundation provides a useful vehicle for representing and analyzing complex system structures. While the utility of directed-graph-based concepts in software tool design has been demonstrated in the literature, these tools still lack the capabilities necessary for large system comprehension. This foundation must therefore be expanded with new organizational and visualization constructs to meet this challenge. This dissertation addresses this need by constructing a conceptual model and a set of methods for interactively exploring, organizing, and understanding the structure of complex software systems.
Krylov Subspace Methods for Complex Non-Hermitian Linear Systems. Thesis
NASA Technical Reports Server (NTRS)
Freund, Roland W.
1991-01-01
We consider Krylov subspace methods for the solution of large sparse linear systems Ax = b with complex non-Hermitian coefficient matrices. Such linear systems arise in important applications, such as inverse scattering, numerical solution of time-dependent Schrodinger equations, underwater acoustics, eddy current computations, numerical computations in quantum chromodynamics, and numerical conformal mapping. Typically, the resulting coefficient matrices A exhibit special structures, such as complex symmetry, or they are shifted Hermitian matrices. In this paper, we first describe a Krylov subspace approach with iterates defined by a quasi-minimal residual property, the QMR method, for solving general complex non-Hermitian linear systems. Then, we study special Krylov subspace methods designed for the two families of complex symmetric and shifted Hermitian linear systems, respectively. We also include some results concerning the obvious approach to general complex linear systems by solving equivalent real linear systems for the real and imaginary parts of x. Finally, numerical experiments for linear systems arising from the complex Helmholtz equation are reported.
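The "obvious approach" mentioned in the abstract, solving an equivalent real linear system for the real and imaginary parts of x, can be sketched directly; the 2x2 example matrix below is invented for illustration:

```python
# Sketch of the equivalent-real-system approach: a complex system A x = b
# becomes the real block system
#     [ Re A  -Im A ] [ Re x ]   [ Re b ]
#     [ Im A   Re A ] [ Im x ] = [ Im b ].
# The 2x2 example is made up; any dense solver works on the real block form.

def solve_real(M, rhs):
    """Plain Gaussian elimination with partial pivoting (dense, small)."""
    n = len(M)
    aug = [row[:] + [r] for row, r in zip(M, rhs)]   # augmented matrix
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(aug[i][k]))
        aug[k], aug[p] = aug[p], aug[k]
        for i in range(k + 1, n):
            f = aug[i][k] / aug[k][k]
            for j in range(k, n + 1):
                aug[i][j] -= f * aug[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (aug[i][n] - sum(aug[i][j] * x[j] for j in range(i + 1, n))) / aug[i][i]
    return x

A = [[2 + 1j, 1 + 0j], [0 + 0j, 1 - 1j]]
b = [3 + 2j, 2 + 0j]
n = len(A)

# assemble the 2n x 2n real block matrix and the stacked right-hand side
M = [[A[i][j].real for j in range(n)] + [-A[i][j].imag for j in range(n)]
     for i in range(n)] + \
    [[A[i][j].imag for j in range(n)] + [A[i][j].real for j in range(n)]
     for i in range(n)]
rhs = [z.real for z in b] + [z.imag for z in b]

y = solve_real(M, rhs)
x = [complex(y[i], y[n + i]) for i in range(n)]     # reassemble complex x
residual = max(abs(sum(A[i][j] * x[j] for j in range(n)) - b[i]) for i in range(n))
```

The paper's point is that this doubling of dimension ignores the special structure (complex symmetry, shifted Hermitian) that the tailored Krylov methods exploit.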
Social complexity as a proximate and ultimate factor in communicative complexity
Freeberg, Todd M.; Dunbar, Robin I. M.; Ord, Terry J.
2012-01-01
The ‘social complexity hypothesis’ for communication posits that groups with complex social systems require more complex communicative systems to regulate interactions and relations among group members. Complex social systems, compared with simple social systems, are those in which individuals frequently interact in many different contexts with many different individuals, and often repeatedly interact with many of the same individuals in networks over time. Complex communicative systems, compared with simple communicative systems, are those that contain a large number of structurally and functionally distinct elements or possess a high amount of bits of information. Here, we describe some of the historical arguments that led to the social complexity hypothesis, and review evidence in support of the hypothesis. We discuss social complexity as a driver of communication and possible causal factor in human language origins. Finally, we discuss some of the key current limitations to the social complexity hypothesis—the lack of tests against alternative hypotheses for communicative complexity and evidence corroborating the hypothesis from modalities other than the vocal signalling channel. PMID:22641818
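One common way to make the "bits of information" criterion concrete is the Shannon entropy of signal-usage frequencies. A minimal sketch with invented repertoires (not data from the paper):

```python
# Minimal sketch: quantify a signal repertoire's complexity in bits as the
# Shannon entropy of its usage frequencies. Both repertoires are invented.
import math

def repertoire_entropy(counts):
    """Shannon entropy (bits) of a dict signal -> usage count."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values() if c > 0)

simple_rep = {"alarm": 50, "contact": 50}                       # 2 signals
complex_rep = {"alarm": 25, "contact": 25, "food": 25, "mate": 25}  # 4 signals

h_simple = repertoire_entropy(simple_rep)     # 1 bit per signal
h_complex = repertoire_entropy(complex_rep)   # 2 bits per signal
```

A repertoire with more equiprobable, functionally distinct signals carries more bits per signal, which is the sense of "complex communicative system" used above.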
Spatial operator algebra for flexible multibody dynamics
NASA Technical Reports Server (NTRS)
Jain, A.; Rodriguez, G.
1993-01-01
This paper presents an approach to modeling the dynamics of flexible multibody systems such as flexible spacecraft and limber space robotic systems. A large number of degrees of freedom and complex dynamic interactions are typical in these systems. This paper uses spatial operators to develop efficient recursive algorithms for the dynamics of these systems. This approach very efficiently manages complexity by means of a hierarchy of mathematical operations.
Assessing Understanding of Complex Causal Networks Using an Interactive Game
ERIC Educational Resources Information Center
Ross, Joel
2013-01-01
Assessing people's understanding of the causal relationships found in large-scale complex systems may be necessary for addressing many critical social concerns, such as environmental sustainability. Existing methods for assessing systems thinking and causal understanding frequently use the technique of cognitive causal mapping. However, the…
Design of experiments (DOE) - history, concepts, and relevance to in vitro culture
USDA-ARS?s Scientific Manuscript database
Design of experiments (DOE) is a large and well-developed field for understanding and improving the performance of complex systems. Because in vitro culture systems are complex, but easily manipulated in controlled conditions, they are particularly well-suited for the application of DOE principle...
An intermediate level of abstraction for computational systems chemistry.
Andersen, Jakob L; Flamm, Christoph; Merkle, Daniel; Stadler, Peter F
2017-12-28
Computational techniques are required for narrowing down the vast space of possibilities to plausible prebiotic scenarios, because precise information on the molecular composition, the dominant reaction chemistry and the conditions for that era are scarce. The exploration of large chemical reaction networks is a central aspect in this endeavour. While quantum chemical methods can accurately predict the structures and reactivities of small molecules, they are not efficient enough to cope with large-scale reaction systems. The formalization of chemical reactions as graph grammars provides a generative system, well grounded in category theory, at the right level of abstraction for the analysis of large and complex reaction networks. An extension of the basic formalism into the realm of integer hyperflows allows for the identification of complex reaction patterns, such as autocatalysis, in large reaction networks using optimization techniques. This article is part of the themed issue 'Reconceptualizing the origins of life'. © 2017 The Author(s).
Alam, Maksudul; Deng, Xinwei; Philipson, Casandra; Bassaganya-Riera, Josep; Bisset, Keith; Carbo, Adria; Eubank, Stephen; Hontecillas, Raquel; Hoops, Stefan; Mei, Yongguo; Abedi, Vida; Marathe, Madhav
2015-01-01
Agent-based models (ABM) are widely used to study immune systems, providing a procedural and interactive view of the underlying system. The interaction of components and the behavior of individual objects is described procedurally as a function of the internal states and the local interactions, which are often stochastic in nature. Such models typically have complex structures and consist of a large number of modeling parameters. Determining the key modeling parameters which govern the outcomes of the system is very challenging. Sensitivity analysis plays a vital role in quantifying the impact of modeling parameters in massively interacting systems, including large complex ABM. The high computational cost of executing simulations impedes running experiments with exhaustive parameter settings. Existing techniques of analyzing such a complex system typically focus on local sensitivity analysis, i.e. one parameter at a time, or a close “neighborhood” of particular parameter settings. However, such methods are not adequate to measure the uncertainty and sensitivity of parameters accurately because they overlook the global impacts of parameters on the system. In this article, we develop novel experimental design and analysis techniques to perform both global and local sensitivity analysis of large-scale ABMs. The proposed method can efficiently identify the most significant parameters and quantify their contributions to outcomes of the system. We demonstrate the proposed methodology for ENteric Immune SImulator (ENISI), a large-scale ABM environment, using a computational model of immune responses to Helicobacter pylori colonization of the gastric mucosa. PMID:26327290
Adaptive simplification of complex multiscale systems.
Chiavazzo, Eliodoro; Karlin, Ilya
2011-03-01
A fully adaptive methodology is developed for reducing the complexity of large dissipative systems. This represents a significant step toward extracting essential physical knowledge from complex systems by addressing the challenging problem of the minimal number of variables needed to exactly capture the system dynamics. An accurate reduced description is achieved by constructing a hierarchy of slow invariant manifolds, with an embarrassingly simple implementation in any dimension. The method is validated on the autoignition of a hydrogen-air mixture, where reduction to a cascade of slow invariant manifolds is observed.
Complex Homology and the Evolution of Nervous Systems
Liebeskind, Benjamin J.; Hillis, David M.; Zakon, Harold H.; Hofmann, Hans A.
2016-01-01
We examine the complex evolution of animal nervous systems and discuss the ramifications of this complexity for inferring the nature of early animals. Although reconstructing the origins of nervous systems remains a central challenge in biology, and the phenotypic complexity of early animals remains controversial, a compelling picture is emerging. We now know that the nervous system and other key animal innovations contain a large degree of homoplasy, at least on the molecular level. Conflicting hypotheses about early nervous system evolution are due primarily to differences in the interpretation of this homoplasy. We highlight the need for explicit discussion of assumptions and discuss the limitations of current approaches for inferring ancient phenotypic states. PMID:26746806
Indurkhya, Sagar; Beal, Jacob
2010-01-06
ODE simulations of chemical systems perform poorly when some of the species have extremely low concentrations. Stochastic simulation methods, which can handle this case, have been impractical for large systems due to computational complexity. We observe, however, that when modeling complex biological systems: (1) a small number of reactions tend to occur a disproportionately large percentage of the time, and (2) a small number of species tend to participate in a disproportionately large percentage of reactions. We exploit these properties in LOLCAT Method, a new implementation of the Gillespie Algorithm. First, factoring reaction propensities allows many propensities dependent on a single species to be updated in a single operation. Second, representing dependencies between reactions with a bipartite graph of reactions and species requires storage that grows only with the number of reactions and species, rather than the larger storage required for a graph that includes only reactions. Together, these improvements allow our implementation of LOLCAT Method to execute orders of magnitude faster than currently existing Gillespie Algorithm variants when simulating several yeast MAPK cascade models.
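For context, the baseline that LOLCAT Method accelerates is the direct-method Gillespie Algorithm. A minimal sketch with an invented A → B toy system (this is the textbook algorithm, not the paper's optimized implementation):

```python
# Baseline direct-method Gillespie stochastic simulation, shown for context.
# The toy A -> B decay system and rate constant are invented.
import math
import random

def gillespie(state, reactions, t_end, rng):
    """state: dict species -> count; reactions: list of
    (propensity_fn(state), apply_fn(state)) pairs."""
    t, trajectory = 0.0, [(0.0, dict(state))]
    while t < t_end:
        props = [prop(state) for prop, _ in reactions]
        total = sum(props)
        if total == 0.0:
            break                                # nothing left to fire
        t += -math.log(rng.random()) / total     # exponential waiting time
        r, acc = rng.random() * total, 0.0
        for p, (_, apply_fn) in zip(props, reactions):
            acc += p
            if r < acc:                          # fire with probability p/total
                apply_fn(state)
                break
        trajectory.append((t, dict(state)))
    return trajectory

def fire_decay(s):                               # reaction A -> B
    s["A"] -= 1
    s["B"] += 1

rng = random.Random(0)
traj = gillespie({"A": 100, "B": 0},
                 [(lambda s: 0.5 * s["A"], fire_decay)],
                 t_end=50.0, rng=rng)
```

Note how every step rescans all propensities; the factoring and bipartite-graph bookkeeping described above attack exactly that per-step cost.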
Understanding Complex Adaptive Systems by Playing Games
ERIC Educational Resources Information Center
van Bilsen, Arthur; Bekebrede, Geertje; Mayer, Igor
2010-01-01
While educators teach their students about decision making in complex environments, managers have to deal with the complexity of large projects on a daily basis. To make better decisions it is assumed, that the latter would benefit from better understanding of complex phenomena, as do students as the professionals of the future. The goal of this…
Biologically-Inspired Concepts for Self-Management of Complexity
NASA Technical Reports Server (NTRS)
Sterritt, Roy; Hinchey, G.
2006-01-01
Inherent complexity in large-scale applications may be impossible to eliminate or even ameliorate despite a number of promising advances. In such cases, the complexity must be tolerated and managed. Such management may be beyond the abilities of humans, or require such overhead as to make management by humans unrealistic. A number of initiatives inspired by concepts in biology have arisen for self-management of complex systems. We present some ideas and techniques we have been experimenting with, inspired by lesser-known concepts in biology that show promise in protecting complex systems and represent a step towards self-management of complexity.
NASA Technical Reports Server (NTRS)
McGowan, Anna-Maria R.; Seifert, Colleen M.; Papalambros, Panos Y.
2012-01-01
The design of large-scale complex engineered systems (LaCES) such as an aircraft is inherently interdisciplinary. Multiple engineering disciplines, drawing from a team of hundreds to thousands of engineers and scientists, are woven together throughout the research, development, and systems engineering processes to realize one system. Though research and development (R&D) is typically focused in single disciplines, the interdependencies involved in LaCES require interdisciplinary R&D efforts. This study investigates the interdisciplinary interactions that take place during the R&D and early conceptual design phases in the design of LaCES. Our theoretical framework is informed by both engineering practices and social science research on complex organizations. This paper provides a preliminary perspective on some of the organizational influences on interdisciplinary interactions based on organization theory (specifically sensemaking), data from a survey of LaCES experts, and the authors' experience in research and design. The analysis reveals couplings between the engineered system and the organization that creates it. Survey respondents noted the importance of interdisciplinary interactions and their significant benefit to the engineered system, such as innovation and problem mitigation. Substantial obstacles to interdisciplinarity beyond engineering are uncovered, including communication and organizational challenges. Addressing these challenges may ultimately foster greater efficiencies in the design and development of LaCES and improved system performance by assisting with the collective integration of interdependent knowledge bases early in the R&D effort. This research suggests that organizational and human dynamics heavily influence and even constrain the engineering effort for large-scale complex systems.
ERIC Educational Resources Information Center
Forsman, Jonas; van den Bogaard, Maartje; Linder, Cedric; Fraser, Duncan
2015-01-01
This study uses multilayer minimum spanning tree analysis to develop a model for student retention from a complex system perspective, using data obtained from first-year engineering students at a large well-regarded institution in the European Union. The results show that the elements of the system of student retention are related to one another…
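A minimal sketch of the underlying tool, a minimum spanning tree over a weighted network of retention-related variables (the variable names and weights below are invented; the study's multilayer analysis is considerably more elaborate):

```python
# Minimal sketch: Prim's algorithm extracts the minimum spanning tree,
# i.e. the backbone of strongest links, from a weighted network.
# Node names and edge weights are invented for illustration.
import heapq

def minimum_spanning_tree(graph, start):
    """graph: dict node -> list of (weight, neighbour) pairs."""
    in_tree, tree_edges = {start}, []
    frontier = list(graph[start])
    heapq.heapify(frontier)
    while frontier:
        w, v = heapq.heappop(frontier)     # cheapest edge leaving the tree
        if v in in_tree:
            continue
        in_tree.add(v)
        tree_edges.append((w, v))
        for edge in graph[v]:
            if edge[1] not in in_tree:
                heapq.heappush(frontier, edge)
    return tree_edges

g = {
    "grades":     [(1.0, "attendance"), (4.0, "motivation")],
    "attendance": [(1.0, "grades"), (2.0, "motivation")],
    "motivation": [(4.0, "grades"), (2.0, "attendance"), (1.5, "belonging")],
    "belonging":  [(1.5, "motivation")],
}
tree = minimum_spanning_tree(g, "grades")
total_weight = sum(w for w, _ in tree)
```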
FLAME: A platform for high performance computing of complex systems, applied for three case studies
Kiran, Mariam; Bicak, Mesude; Maleki-Dizaji, Saeedeh; ...
2011-01-01
FLAME allows complex models to be automatically parallelised on High Performance Computing (HPC) grids, enabling large numbers of agents to be simulated over short periods of time. Modellers are ordinarily hindered by the complexity of porting models to parallel platforms and by the time taken to run large simulations on a single machine; FLAME overcomes both obstacles. Three case studies from different disciplines were modelled using FLAME and are presented along with their performance results on a grid.
Karwowski, Waldemar
2012-12-01
In this paper, the author explores a need for a greater understanding of the true nature of human-system interactions from the perspective of the theory of complex adaptive systems, including the essence of complexity, emergent properties of system behavior, nonlinear systems dynamics, and deterministic chaos. Human performance, more often than not, constitutes complex adaptive phenomena with emergent properties that exhibit nonlinear dynamical (chaotic) behaviors. The complexity challenges in the design and management of contemporary work systems, including service systems, are explored. Examples of selected applications of the concepts of nonlinear dynamics to the study of human physical performance are provided. Understanding and applications of the concepts of theory of complex adaptive and dynamical systems should significantly improve the effectiveness of human-centered design efforts of a large system of systems. Performance of many contemporary work systems and environments may be sensitive to the initial conditions and may exhibit dynamic nonlinear properties and chaotic system behaviors. Human-centered design of emergent human-system interactions requires application of the theories of nonlinear dynamics and complex adaptive system. The success of future human-systems integration efforts requires the fusion of paradigms, knowledge, design principles, and methodologies of human factors and ergonomics with those of the science of complex adaptive systems as well as modern systems engineering.
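The sensitivity to initial conditions mentioned above is classically demonstrated with the logistic map; a minimal textbook sketch (the map and its parameters are a standard illustration, not taken from the paper):

```python
# Minimal demo of sensitivity to initial conditions (deterministic chaos):
# the logistic map x -> r*x*(1 - x) at r = 4.0. Values are illustrative.

def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-9)          # near-identical start
divergence = [abs(x - y) for x, y in zip(a, b)]
# a 1e-9 difference in the initial condition grows to order-one divergence
# within a few dozen iterations
```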
An improved large-field focusing schlieren system
NASA Technical Reports Server (NTRS)
Weinstein, Leonard M.
1991-01-01
The analysis and performance of a high-brightness large-field focusing schlieren system is described. The system can be used to examine complex two- and three-dimensional flows. Techniques are described to obtain focusing schlieren through distorting optical elements, to use multiple colors in a time multiplexing technique, and to use diffuse screen holography for three-dimensional photographs.
Sensemaking in a Value Based Context for Large Scale Complex Engineered Systems
NASA Astrophysics Data System (ADS)
Sikkandar Basha, Nazareen
The design and development of Large-Scale Complex Engineered Systems (LSCES) requires the involvement of multiple teams, numerous levels of the organization, and interactions among large numbers of people and interdisciplinary departments. Traditionally, requirements-driven Systems Engineering (SE) is used in the design and development of these LSCES. The requirements are used to capture the stakeholders' preferences for the LSCES. Due to the complexity of the system, multiple levels of interaction are required to elicit the requirements of the system within the organization. Since LSCES involve people and interactions between teams and interdisciplinary departments, they are socio-technical in nature. The elicitation of requirements for most large-scale system projects is subject to creep in time and cost, due to the uncertainty and ambiguity of requirements during design and development. In an organizational structure, cost and time overruns can occur at any level and iterate back and forth, thus increasing cost and time. To avoid such creep, past research has shown that rigorous approaches such as value-based design can be used to control it. But before rigorous approaches can be applied, the decision maker should have a proper understanding of requirements creep and of the state of the system when the creep occurs. Sensemaking is used to understand the state of the system when the creep occurs and to provide guidance to the decision maker. This research proposes the use of the Cynefin framework, a sensemaking framework, in the design and development of LSCES. It can aid in understanding the system and in decision making, minimizing the value gap due to requirements creep by eliminating ambiguity that arises during design and development. A sample hierarchical organization is used to demonstrate the state of the system at the occurrence of requirements creep, in terms of cost and time, using the Cynefin framework.
These trials are repeated for different requirements and at different sub-system levels. The results obtained show that the Cynefin framework can be used to improve the value of the system and for predictive analysis. Decision makers can use these findings, together with rigorous approaches, to improve the design of Large-Scale Complex Engineered Systems.
NASA Astrophysics Data System (ADS)
McCaskill, John
There can be large spatial and temporal separation of cause and effect in policy making. Determining the correct linkage between policy inputs and outcomes can be highly impractical in the complex environments faced by policy makers. In attempting to foresee and plan for probable outcomes, standard linear models often overlook, ignore, or are unable to predict catastrophic events that only seem improbable because of multiple feedback loops. Several features of the makeup and behavior of complex systems explain the difficulty many mathematical models (factor analysis, structural equation modeling) have in dealing with non-linear effects in complex systems. This chapter highlights those problem issues and offers insights into the usefulness of agent-based modeling (ABM) in dealing with non-linear effects in complex policy making environments.
Snowden, Thomas J; van der Graaf, Piet H; Tindall, Marcus J
2017-07-01
Complex models of biochemical reaction systems have become increasingly common in the systems biology literature. The complexity of such models can present a number of obstacles for their practical use, often making problems difficult to intuit or computationally intractable. Methods of model reduction can be employed to alleviate the issue of complexity by seeking to eliminate those portions of a reaction network that have little or no effect upon the outcomes of interest, hence yielding simplified systems that retain an accurate predictive capacity. This review paper seeks to provide a brief overview of a range of such methods and their application in the context of biochemical reaction network models. To achieve this, we provide a brief mathematical account of the main methods including timescale exploitation approaches, reduction via sensitivity analysis, optimisation methods, lumping, and singular value decomposition-based approaches. Methods are reviewed in the context of large-scale systems biology type models, and future areas of research are briefly discussed.
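Of the methods reviewed, lumping is the simplest to illustrate: in the linear chain A → B → C, the pooled variable y = B + C obeys dy/dt = k1·A exactly, so the three-state model reduces to two states with no error in (A, B + C). A minimal sketch with invented rate constants:

```python
# Minimal exact-lumping sketch for the chain A -> B -> C: pool B and C.
# Rate constants, step size, and step count are invented for illustration.

def euler_full(a, b, c, k1, k2, h, steps):
    """Forward Euler for dA=-k1*A, dB=k1*A-k2*B, dC=k2*B."""
    for _ in range(steps):
        a, b, c = (a - h * k1 * a,
                   b + h * (k1 * a - k2 * b),
                   c + h * k2 * b)
    return a, b, c

def euler_lumped(a, y, k1, h, steps):
    """Reduced model: the pool y = B + C obeys dy = k1*A exactly."""
    for _ in range(steps):
        a, y = a - h * k1 * a, y + h * k1 * a
    return a, y

k1, k2, h, steps = 0.8, 5.0, 0.01, 500
a3, b3, c3 = euler_full(1.0, 0.0, 0.0, k1, k2, h, steps)
a2, y2 = euler_lumped(1.0, 0.0, k1, h, steps)
# the 2-state lumped model tracks A and the pool B + C step-for-step
```

Here the lumping is exact because the pooled species' dynamics close on themselves; in general, the reviewed methods trade a controlled amount of error for much larger reductions.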
The Effectiveness of an Electronic Security Management System in a Privately Owned Apartment Complex
ERIC Educational Resources Information Center
Greenberg, David F.; Roush, Jeffrey B.
2009-01-01
Poisson and negative binomial regression methods are used to analyze the monthly time series data to determine the effects of introducing an integrated security management system including closed-circuit television (CCTV), door alarm monitoring, proximity card access, and emergency call boxes to a large privately-owned complex of apartment…
NASA Technical Reports Server (NTRS)
Handley, Thomas H., Jr.; Preheim, Larry E.
1990-01-01
Data systems requirements in the Earth Observing System (EOS) Space Station Freedom (SSF) eras indicate increasing data volume, increased discipline interplay, higher complexity and broader data integration and interpretation. A response to the needs of the interdisciplinary investigator is proposed, considering the increasing complexity and rising costs of scientific investigation. The EOS Data Information System, conceived to be a widely distributed system with reliable communication links between central processing and the science user community, is described. Details are provided on information architecture, system models, intelligent data management of large complex databases, and standards for archiving ancillary data, using a research library, a laboratory and collaboration services.
Statistical complexity without explicit reference to underlying probabilities
NASA Astrophysics Data System (ADS)
Pennini, F.; Plastino, A.
2018-06-01
We show that extremely simple systems of a not too large number of particles can be simultaneously thermally stable and complex. To this end, we extend the notion of statistical complexity to simple configurations of non-interacting particles, without appeal to probabilities, and discuss configurational properties.
Carrazana-García, Jorge A; Cabaleiro-Lago, Enrique M; Rodríguez-Otero, Jesús
2017-04-19
The present work studies the interaction of two extended curved π-systems (corannulene and sumanene) with various cations (sodium, potassium, ammonium, tetramethylammonium, guanidinium and imidazolium). The polyatomic cations are models of groups found in important biomolecules in which cation-π interaction plays a fundamental role. The results indicate an important size effect: with extended π-systems and cations the size of potassium or larger, dispersion is much more important than has generally been recognized for cation-π interactions. In most of the systems studied here, the stability of the cation-π complexes is the result of a balanced combination of electrostatic, induction and dispersion contributions. None of the systems studied here owes more than 42% of its stability to the electrostatic interaction. Induction dominates stabilization in complexes with sodium, and in some of the potassium and ammonium complexes. In complexes with large cations and with flat cations, dispersion is the major stabilizing contribution and can provide more than 50% of the stabilization energy. This implies that theoretical studies of the cation-π interaction involving large or even medium-size fragments require a level of calculation capable of properly modelling dispersion. The separation between the cation and the π-system is another important factor to take into account, especially when the fragments of the cation-π complex are bound (for example, to a protein backbone) and cannot interact at the most favourable distance.
Automated Design of Complex Dynamic Systems
Hermans, Michiel; Schrauwen, Benjamin; Bienstman, Peter; Dambre, Joni
2014-01-01
Several fields of study are concerned with uniting the concept of computation with that of the design of physical systems. For example, a recent trend in robotics is to design robots in such a way that they require a minimal control effort. Another example is found in the domain of photonics, where recent efforts try to benefit directly from complex nonlinear dynamics to achieve more efficient signal processing. The underlying goal of these and similar research efforts is to internalize a large part of the necessary computations within the physical system itself by exploiting its inherent non-linear dynamics. This, however, often requires the optimization of large numbers of system parameters, related to both the system's structure and its material properties. In addition, many of these parameters are subject to fabrication variability or to variations through time. In this paper we apply a machine learning algorithm to optimize physical dynamic systems. We show that such algorithms, which are normally applied to abstract computational entities, can be extended to the field of differential equations and used to optimize an associated set of parameters which determine their behavior. We show that machine learning training methodologies are highly useful in designing robust systems, and we provide a set of both simple and complex examples using models of physical dynamical systems. Interestingly, the derived optimization method is intimately related to direct collocation, a method known in the field of optimal control. Our work suggests that the application domains of both machine learning and optimal control have a largely unexplored overlapping area which envelops a novel design methodology of smart and highly complex physical systems. PMID:24497969
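A heavily simplified sketch of the general idea, tuning a parameter of a differential-equation model by gradient descent on a trajectory-matching loss (the model, the data, and the learning rate are all invented; the paper's method is far more general):

```python
# Minimal sketch: fit the decay rate a of dx/dt = -a*x so that the model
# trajectory matches "measured" data, via finite-difference gradient descent.
# All numbers here are invented for illustration.
import math

ts = [0.1 * i for i in range(21)]                  # sample times 0..2
target = [math.exp(-0.7 * t) for t in ts]          # data generated with a = 0.7

def loss(a):
    """Sum of squared errors between model and target trajectories."""
    return sum((math.exp(-a * t) - y) ** 2 for t, y in zip(ts, target))

a, lr, eps = 0.2, 0.02, 1e-6
for _ in range(500):
    grad = (loss(a + eps) - loss(a - eps)) / (2 * eps)   # central difference
    a -= lr * grad
# a converges toward the generating value 0.7
```

The same loop structure carries over to richer systems: simulate the differential equations, score the trajectory, and follow the gradient with respect to the physical parameters.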
Best geoscience approach to complex systems in environment
NASA Astrophysics Data System (ADS)
Mezemate, Yacine; Tchiguirinskaia, Ioulia; Schertzer, Daniel
2017-04-01
The environment is a social issue of steadily growing importance. Its complexity, both cross-disciplinary and multi-scale, has given rise to a large number of scientific and technological bottlenecks that complex systems approaches can resolve. Significant challenges must be met to achieve an understanding of complex environmental systems. Their study should proceed in several steps, in which the use of data and models is crucial: - Exploration, observation and basic data acquisition - Identification of correlations, patterns, and mechanisms - Modelling - Model validation, implementation and prediction - Construction of a theory. Since e-learning has become a powerful tool for sharing knowledge and best practice, we use it to teach environmental complexity and systems. In this presentation we present an e-learning course intended for a broad audience (undergraduates, graduates, PhD students and young scientists) which gathers, and puts in coherence, different pedagogical materials on complex systems and environmental studies. The course describes complex processes using numerous illustrations, examples and tests that make the learning process enjoyable. For the sake of simplicity, the course is divided into different modules, and at the end of each module a set of exercises and program codes is proposed for best practice. The graphical user interface (GUI), built with the open-source tool Opale Scenari, offers simple navigation through the different modules. The course treats the complex systems found in the environment and their observables; we particularly highlight the extreme variability of these observables over a wide range of scales. Using the multifractal formalism in different applications (turbulence, precipitation, hydrology), we demonstrate how such extreme variability of geophysical/biological fields can be exploited in solving everyday (geo-)environmental challenges.
NASA Astrophysics Data System (ADS)
Kashid, Satishkumar S.; Maity, Rajib
2012-08-01
Prediction of Indian Summer Monsoon Rainfall (ISMR) is of vital importance for the Indian economy, and it has remained a great challenge for hydro-meteorologists due to inherent complexities in the climatic system. The large-scale atmospheric circulation patterns from the tropical Pacific Ocean (ENSO) and the tropical Indian Ocean (EQUINOO) are established influences on the Indian Summer Monsoon Rainfall. The information from these two large-scale atmospheric circulation patterns, in terms of their indices, is used to model the complex relationship between Indian Summer Monsoon Rainfall and the ENSO and EQUINOO indices. However, extracting the signal from such large-scale indices for modeling such complex systems is significantly difficult. Rainfall predictions have been made for 'All India' as one unit, as well as for five 'homogeneous monsoon regions of India' defined by the Indian Institute of Tropical Meteorology. The recent 'Artificial Intelligence' tool 'Genetic Programming' (GP) has been employed for modeling this problem. The Genetic Programming approach is found to capture the complex relationship between monthly Indian Summer Monsoon Rainfall and the large-scale atmospheric circulation pattern indices, ENSO and EQUINOO. Research findings of this study indicate that GP-derived monthly rainfall forecasting models that use large-scale atmospheric circulation information are successful in predicting All India Summer Monsoon Rainfall, with a correlation coefficient as high as 0.866, which is attractive for such a complex system. A separate analysis is carried out for All India Summer Monsoon Rainfall, for India as one unit and for the five homogeneous monsoon regions, based on ENSO and EQUINOO indices of the months of March, April and May only, performed at the end of May. In this case, All India Summer Monsoon Rainfall could be predicted with a correlation coefficient of 0.70, with somewhat lower correlation coefficient (C.C.) values for the different 'homogeneous monsoon regions'.
Complex networks as an emerging property of hierarchical preferential attachment.
Hébert-Dufresne, Laurent; Laurence, Edward; Allard, Antoine; Young, Jean-Gabriel; Dubé, Louis J
2015-12-01
Real complex systems are not rigidly structured; no clear rules or blueprints exist for their construction. Yet, amidst their apparent randomness, complex structural properties universally emerge. We propose that an important class of complex systems can be modeled as an organization of many embedded levels (potentially infinite in number), all of them following the same universal growth principle known as preferential attachment. We give examples of such hierarchy in real systems, for instance, in the pyramid of production entities of the film industry. More importantly, we show how real complex networks can be interpreted as a projection of our model, from which their scale independence, their clustering, their hierarchy, their fractality, and their navigability naturally emerge. Our results suggest that complex networks, viewed as growing systems, can be quite simple, and that the apparent complexity of their structure is largely a reflection of their unobserved hierarchical nature.
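The growth principle this abstract invokes can be illustrated with a minimal sketch of plain preferential attachment (the classic single-level rule, not the authors' hierarchical, multi-level variant; the network size and attachment count below are invented for illustration):

```python
import random

def preferential_attachment(n, m=2, seed=1):
    """Grow a network where each new node attaches to m existing
    nodes chosen with probability proportional to current degree.
    Degree-weighted sampling is done by drawing uniformly from a
    list of edge endpoints, in which each node appears once per
    incident edge."""
    random.seed(seed)
    # start from a small clique of m + 1 nodes
    edges = [(i, j) for i in range(m + 1) for j in range(i)]
    endpoints = [v for e in edges for v in e]
    for new in range(m + 1, n):
        targets = set()
        while len(targets) < m:          # m distinct targets
            targets.add(random.choice(endpoints))
        for t in targets:
            edges.append((new, t))
            endpoints.extend([new, t])
    return edges

edges = preferential_attachment(200)
degree = {}
for u, v in edges:
    degree[u] = degree.get(u, 0) + 1
    degree[v] = degree.get(v, 0) + 1
```

The heavy-tailed degree distribution (a few highly connected hubs among many low-degree nodes) emerges from this rule alone; the paper's point is that nesting such growth across embedded levels reproduces clustering, hierarchy, and fractality as well.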
Intelligent systems engineering methodology
NASA Technical Reports Server (NTRS)
Fouse, Scott
1990-01-01
An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.
NASA Technical Reports Server (NTRS)
McGowan, Anna-Maria Rivas; Papalambros, Panos Y.; Baker, Wayne E.
2015-01-01
This paper examines four primary methods of working across disciplines during R&D and early design of large-scale complex engineered systems such as aerospace systems. A conceptualized framework, called the Combining System Elements framework, is presented to delineate several aspects of cross-discipline and system integration practice. The framework is derived from a theoretical and empirical analysis of current work practices in actual operational settings and is informed by theories from organization science and engineering. The explanatory framework may be used by teams to clarify assumptions and associated work practices, which may reduce ambiguity in understanding diverse approaches to early systems research, development and design. The framework also highlights that very different engineering results may be obtained depending on work practices, even when the goals for the engineered system are the same.
Complex Homology and the Evolution of Nervous Systems.
Liebeskind, Benjamin J; Hillis, David M; Zakon, Harold H; Hofmann, Hans A
2016-02-01
We examine the complex evolution of animal nervous systems and discuss the ramifications of this complexity for inferring the nature of early animals. Although reconstructing the origins of nervous systems remains a central challenge in biology, and the phenotypic complexity of early animals remains controversial, a compelling picture is emerging. We now know that the nervous system and other key animal innovations contain a large degree of homoplasy, at least on the molecular level. Conflicting hypotheses about early nervous system evolution are due primarily to differences in the interpretation of this homoplasy. We highlight the need for explicit discussion of assumptions and discuss the limitations of current approaches for inferring ancient phenotypic states. Copyright © 2015. Published by Elsevier Ltd.
1977-01-26
Sisteme Matematicheskogo Obespecheniya YeS EVM [Applied Programs in the Software System for the Unified System of Computers], by A. Ye. Fateyev, A. I...computerized systems are most effective in large production complexes, in which the level of utilization of computers can be as high as 500,000...performance of these tasks could be furthered by the complex introduction of electronic computers in automated control systems. The creation of ASU
Parametric Analysis of a Hover Test Vehicle using Advanced Test Generation and Data Analysis
NASA Technical Reports Server (NTRS)
Gundy-Burlet, Karen; Schumann, Johann; Menzies, Tim; Barrett, Tony
2009-01-01
Large complex aerospace systems are generally validated in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. This is due to the large parameter space, and complex, highly coupled nonlinear nature of the different systems that contribute to the performance of the aerospace system. We have addressed the factors deterring such an analysis by applying a combination of technologies to the area of flight envelope assessment. We utilize n-factor (2,3) combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. The data generated is automatically analyzed through a combination of unsupervised learning using a Bayesian multivariate clustering technique (AutoBayes) and supervised learning of critical parameter ranges using the machine-learning tool TAR3, a treatment learner. Covariance analysis with scatter plots and likelihood contours are used to visualize correlations between simulation parameters and simulation results, a task that requires tool support, especially for large and complex models. We present results of simulation experiments for a cold-gas-powered hover test vehicle.
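The n-factor idea (cover all interactions among every pair of parameters with far fewer cases than the full cross product) can be sketched with a greedy 2-factor generator; the parameter names and values below are invented, and this simple greedy heuristic stands in for whatever generator the authors actually used:

```python
from itertools import combinations, product

def all_pairs(parameters):
    """Greedy pairwise (2-factor) test generation: repeatedly pick
    the candidate case covering the most not-yet-covered
    parameter-value pairs until every pair is covered."""
    names = sorted(parameters)
    uncovered = set()
    for a, b in combinations(names, 2):
        for va, vb in product(parameters[a], parameters[b]):
            uncovered.add((a, va, b, vb))
    candidates = [dict(zip(names, vals))
                  for vals in product(*(parameters[n] for n in names))]

    def pairs_of(case):
        return {(a, case[a], b, case[b]) for a, b in combinations(names, 2)}

    tests = []
    while uncovered:
        best = max(candidates, key=lambda c: len(pairs_of(c) & uncovered))
        tests.append(best)
        uncovered -= pairs_of(best)
    return tests

# hypothetical simulation parameters for illustration
params = {"thrust": [0, 1, 2], "mass": ["lo", "hi"], "wind": ["calm", "gusty"]}
suite = all_pairs(params)
```

Here the full factorial would need 12 runs, while the pairwise suite covers every two-parameter interaction in 6, and the gap widens rapidly as the number of parameters grows.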
Trajectory-probed instability and statistics of desynchronization events in coupled chaotic systems
NASA Astrophysics Data System (ADS)
de Oliveira, Gilson F.; Chevrollier, Martine; Passerat de Silans, Thierry; Oriá, Marcos; de Souza Cavalcante, Hugo L. D.
2015-11-01
Complex systems, such as financial markets, earthquakes, and neurological networks, exhibit extreme events whose mechanisms of formation are still not completely understood. These mechanisms may be identified and better studied in simpler systems with dynamical features similar to those encountered in the complex system of interest. For instance, sudden and brief departures from the synchronized state observed in coupled chaotic systems were shown to display non-normal statistical distributions similar to events observed in the complex systems cited above. The currently accepted hypothesis is that these desynchronization events are influenced by the presence of unstable object(s) in the phase space of the system. Here, we present further evidence that the occurrence of large events is triggered by the visitation of the system's phase-space trajectory to the vicinity of these unstable objects. In the system studied here, this visitation is controlled by a single parameter, and we exploit this feature to observe the effect of the visitation rate on the overall instability of the synchronized state. We find that the probability of escapes from the synchronized state and the size of those desynchronization events are enhanced in attractors whose shapes permit the chaotic trajectories to approach the region of strong instability. This result shows that the occurrence of large events requires not only a large local instability to amplify noise, or to amplify the effect of parameter mismatch between the coupled subsystems, but also that the trajectories of the system wander close to this local instability.
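The kind of "simpler system" the abstract appeals to can be sketched with two symmetrically coupled chaotic logistic maps; near the edge of synchronization stability, tiny noise produces brief desynchronization bursts. The coupling strength, noise level, and burst threshold below are illustrative choices, not the parameters of the paper's system:

```python
import random

def coupled_logistic(steps, eps=0.27, r=4.0, noise=1e-6, seed=7):
    """Two coupled logistic maps x' = (1-eps) f(x) + eps f(y) (and
    symmetrically for y), perturbed by tiny noise. For r = 4 and
    eps just above 0.25 the synchronized state is only weakly
    stable, so noise triggers intermittent desynchronization
    bursts. Returns the separation |x - y| at each step."""
    random.seed(seed)
    f = lambda u: r * u * (1.0 - u)
    x, y = 0.3, 0.3
    sep = []
    for _ in range(steps):
        fx, fy = f(x), f(y)
        x = (1 - eps) * fx + eps * fy + noise * (random.random() - 0.5)
        y = (1 - eps) * fy + eps * fx + noise * (random.random() - 0.5)
        x = min(max(x, 0.0), 1.0)        # keep states in [0, 1]
        y = min(max(y, 0.0), 1.0)
        sep.append(abs(x - y))
    return sep

sep = coupled_logistic(20000)
events = sum(1 for s in sep if s > 0.1)   # count large excursions
```

Collecting the sizes of such excursions over long runs is what yields the non-normal event-size statistics the abstract refers to.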
NASA Astrophysics Data System (ADS)
Manfredi, Sabato
2016-06-01
Large-scale dynamic systems are becoming highly pervasive, with applications ranging from systems biology and environment monitoring to sensor networks and power systems. They are characterised by high dimensionality, complexity, and uncertainty in the node dynamics/interactions, and they require increasingly computationally demanding methods for analysis and control design as the network size and the node system/interaction complexity increase. It is therefore a challenging problem to find a scalable computational method for the distributed control design of large-scale networks. In this paper, we investigate the robust distributed stabilisation problem of large-scale nonlinear multi-agent systems (MASs) composed of non-identical (heterogeneous) linear dynamical systems coupled by uncertain nonlinear time-varying interconnections. By employing Lyapunov stability theory and the linear matrix inequality (LMI) technique, new conditions are given for the distributed control design of large-scale MASs that can be easily solved with standard MATLAB toolboxes. The stabilisability of each node dynamic is a sufficient assumption for designing a globally stabilising distributed control. The proposed approach improves some of the existing LMI-based results on MASs by both overcoming their computational limits and extending the application scenario to large-scale nonlinear heterogeneous MASs. Additionally, the proposed LMI conditions are further reduced in terms of computational requirement in the case of weakly heterogeneous MASs, which is a common scenario in real applications, where the network nodes and links are affected by parameter uncertainties. 
One of the main advantages of the proposed approach is that it allows a move from a centralised towards a distributed computing architecture, so that the expensive computational workload of solving LMIs may be shared among processors located at the networked nodes, thus increasing the scalability of the approach with respect to the network size. Finally, a numerical example shows the applicability of the proposed method and its advantage in terms of computational complexity when compared with existing approaches.
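The core assumption (each node is locally stabilisable, and local feedback strong enough to dominate the bounded nonlinear coupling stabilises the whole network) can be illustrated with a toy simulation. This is a hand-built scalar example, not the paper's LMI design; the dynamics, coupling function, and gains are all invented:

```python
import math, random

def simulate_mas(steps=2000, dt=0.005, seed=3):
    """Toy heterogeneous MAS: scalar agents
    dx_i/dt = a_i x_i + u_i + sum_j w_ij sin(x_j - x_i),
    with local feedback u_i = -k_i x_i chosen so that a_i - k_i = -3,
    strong enough to dominate the bounded sin() interconnections.
    Euler-integrates and returns the largest final |x_i|."""
    random.seed(seed)
    n = 10
    a = [random.uniform(-1.0, 1.0) for _ in range(n)]   # heterogeneous nodes
    k = [ai + 3.0 for ai in a]                          # local stabilising gains
    w = [[0.1 if i != j else 0.0 for j in range(n)] for i in range(n)]
    x = [random.uniform(-2.0, 2.0) for _ in range(n)]
    for _ in range(steps):
        dx = [(a[i] - k[i]) * x[i]
              + sum(w[i][j] * math.sin(x[j] - x[i]) for j in range(n))
              for i in range(n)]
        x = [x[i] + dt * dx[i] for i in range(n)]
    return max(abs(v) for v in x)

final = simulate_mas()   # all agents are driven close to the origin
```

The LMI machinery in the paper plays the role that the hand-picked margin of 3 plays here: it certifies, for matrix-valued dynamics and uncertain interconnections, that such local gains exist and suffice globally.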
NASA Astrophysics Data System (ADS)
Keilis-Borok, V. I.; Soloviev, A. A.
2010-09-01
Socioeconomic and natural complex systems persistently generate extreme events, also known as disasters, crises, or critical transitions. Here we analyze patterns of background activity preceding extreme events in four complex systems: economic recessions, surges in homicides in a megacity, magnetic storms, and strong earthquakes. We use as a starting point the indicators describing the system's behavior and identify changes in an indicator's trend. Those changes constitute our background events (BEs). We demonstrate a premonitory pattern common to all four systems considered: relatively large-magnitude BEs become more frequent before an extreme event. A premonitory change of scaling has been found in various models and observations. Here we demonstrate this change in the scaling of uniformly defined BEs in four real complex systems, their enormous differences notwithstanding.
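The notion of a background event as "a change in an indicator's trend" can be sketched as follows; the moving-slope definition and magnitude measure here are illustrative stand-ins for the authors' more elaborate, uniformly calibrated definition:

```python
def background_events(series, w=2):
    """Sketch of background-event extraction: compute a local trend
    (slope over a window of width w) along the indicator series and
    record an event wherever the trend reverses sign; the event
    magnitude is the size of the slope change."""
    slopes = [(series[i + w - 1] - series[i]) / (w - 1)
              for i in range(len(series) - w + 1)]
    events = []
    for i in range(1, len(slopes)):
        if slopes[i] * slopes[i - 1] < 0:      # trend reversal
            events.append((i, abs(slopes[i] - slopes[i - 1])))
    return events

# an indicator that rises, dips, and rises again: two trend reversals
evts = background_events([0, 1, 2, 1, 0, 1, 2])
```

The premonitory pattern of the paper would then be read off the magnitudes: in a sliding window before an extreme event, the frequency of large-magnitude entries in this event list rises.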
Online Community Detection for Large Complex Networks
Pan, Gang; Zhang, Wangsheng; Wu, Zhaohui; Li, Shijian
2014-01-01
Complex networks describe a wide range of systems in nature and society. To understand complex networks, it is crucial to investigate their community structure. In this paper, we develop an online community detection algorithm with linear time complexity for large complex networks. Our algorithm processes a network edge by edge, in the order that the network is fed to the algorithm. If a new edge is added, it just updates the existing community structure in constant time and does not need to re-compute the whole network. Therefore, it can efficiently process large networks in real time. Our algorithm optimizes expected modularity, instead of modularity, at each step to avoid poor performance. The experiments are carried out on 11 public data sets and are measured by two criteria, modularity and NMI (Normalized Mutual Information). The results show that our algorithm's running time is less than that of the commonly used Louvain algorithm, while it gives competitive performance. PMID:25061683
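The edge-by-edge update idea can be sketched with a much simpler rule than the paper's expected-modularity objective: when a new edge bridges two communities, merge them only if the standard modularity gain is positive, using nothing but running degree and cross-edge counts. This is an illustrative simplification, not the authors' algorithm:

```python
def online_communities(edge_stream):
    """Process edges one at a time. On an edge between communities
    A and B, merge them if the modularity gain
    e_AB/m - d_A*d_B/(2*m*m) is positive, where m is the running
    edge count, e_AB the cross edges seen so far, and d_A, d_B the
    community degree sums."""
    comm = {}        # node -> community id
    members = {}     # community id -> set of member nodes
    degree = {}
    cross = {}       # frozenset({A, B}) -> cross-edge count
    m = 0
    for u, v in edge_stream:
        m += 1
        for node in (u, v):
            if node not in comm:
                comm[node] = node
                members[node] = {node}
            degree[node] = degree.get(node, 0) + 1
        ca, cb = comm[u], comm[v]
        if ca != cb:
            key = frozenset((ca, cb))
            cross[key] = cross.get(key, 0) + 1
            da = sum(degree[n] for n in members[ca])
            db = sum(degree[n] for n in members[cb])
            if cross[key] / m - da * db / (2.0 * m * m) > 0:
                for n in members[cb]:      # merge B into A
                    comm[n] = ca
                members[ca] |= members[cb]
                del members[cb]
    return comm

# two triangles joined by one bridge edge
stream = [(1, 2), (2, 3), (1, 3), (4, 5), (5, 6), (4, 6), (3, 4)]
labels = online_communities(stream)
```

On this stream, each triangle coalesces into one community as its edges arrive, while the final bridge edge yields a negative gain and is correctly left uncut, so the two communities stay separate.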
Qualitative Fault Isolation of Hybrid Systems: A Structural Model Decomposition-Based Approach
NASA Technical Reports Server (NTRS)
Bregon, Anibal; Daigle, Matthew; Roychoudhury, Indranil
2016-01-01
Quick and robust fault diagnosis is critical to ensuring safe operation of complex engineering systems. A large number of techniques are available to provide fault diagnosis in systems with continuous dynamics. However, many systems in aerospace and industrial environments are best represented as hybrid systems that consist of discrete behavioral modes, each with its own continuous dynamics. These hybrid dynamics make the on-line fault diagnosis task computationally more complex due to the large number of possible system modes and the existence of autonomous mode transitions. This paper presents a qualitative fault isolation framework for hybrid systems based on structural model decomposition. The fault isolation is performed by analyzing the qualitative information of the residual deviations. However, in hybrid systems this process becomes complex due to the possible existence of observation delays, which can cause observed deviations to be inconsistent with the expected deviations for the current mode of the system. The great advantage of structural model decomposition is that (i) it allows residuals to be designed that respond to only a subset of the faults, and (ii) every time a mode change occurs, only a subset of the residuals needs to be reconfigured, thus reducing the complexity of the reasoning process for isolation purposes. To demonstrate and test the validity of our approach, we use an electric circuit simulation as the case study.
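The qualitative isolation step can be sketched as signature matching: each fault predicts a pattern of residual deviations ('+', '-', or '0'), and observed deviations prune the candidate set. The fault names, residual names, and signatures below are hypothetical, and this ignores the mode changes and observation delays the paper actually handles:

```python
def isolate_faults(signatures, observed):
    """Keep a fault as a candidate while every observed residual
    deviation matches its signature; residuals with no entry in a
    signature are expected nominal ('0'), and residuals not yet
    observed do not discard anything."""
    candidates = set(signatures)
    for residual, deviation in observed.items():
        candidates = {f for f in candidates
                      if signatures[f].get(residual, '0') == deviation}
    return candidates

# hypothetical signatures for a small electric circuit
signatures = {
    "R1_shorted":  {"r_voltage": "-", "r_current": "+"},
    "R1_open":     {"r_voltage": "+", "r_current": "-"},
    "C1_degraded": {"r_voltage": "-", "r_tau": "-"},
}
faults = isolate_faults(signatures, {"r_voltage": "-", "r_current": "+"})
```

The structural-decomposition benefit shows up in the signature table itself: because each residual responds to only a subset of faults, most columns are '0' by construction, and a mode change forces rebuilding only the affected rows.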
The problem of ecological scaling in spatially complex, nonequilibrium ecological systems [chapter 3
Samuel A. Cushman; Jeremy Littell; Kevin McGarigal
2010-01-01
In the previous chapter we reviewed the challenges posed by spatial complexity and temporal disequilibrium to efforts to understand and predict the structure and dynamics of ecological systems. The central theme was that spatial variability in the environment and population processes fundamentally alters the interactions between species and their environments, largely...
Caie, Peter D; Harrison, David J
2016-01-01
The field of pathology is rapidly transforming from a semiquantitative and empirical science toward a big data discipline. Large data sets from across multiple omics fields may now be extracted from a patient's tissue sample. Tissue is, however, complex, heterogeneous, and prone to artifact. A reductionist view of tissue and disease progression, which does not take this complexity into account, may lead to single biomarkers failing in clinical trials. The integration of standardized multi-omics big data and the retention of valuable information on spatial heterogeneity are imperative to model complex disease mechanisms. Mathematical modeling through systems pathology approaches is the ideal medium to distill the significant information from these large, multi-parametric, and hierarchical data sets. Systems pathology may also predict the dynamical response of disease progression or response to therapy regimens from a static tissue sample. Next-generation pathology will incorporate big data with systems medicine in order to personalize clinical practice for both prognostic and predictive patient care.
Electronic construction collaboration system -- final phase : [tech transfer summary].
DOT National Transportation Integrated Search
2014-07-01
Construction projects have been growing more complex in terms of : project team composition, design aspects, and construction processes. : To help manage the shop/working drawings and requests for information : (RFIs) for its large, complex projects,...
A new decision sciences for complex systems.
Lempert, Robert J
2002-05-14
Models of complex systems can capture much useful information but can be difficult to apply to real-world decision-making because the type of information they contain is often inconsistent with that required for traditional decision analysis. New approaches, which use inductive reasoning over large ensembles of computational experiments, now make possible systematic comparison of alternative policy options using models of complex systems. This article describes Computer-Assisted Reasoning, an approach to decision-making under conditions of deep uncertainty that is ideally suited to applying complex systems to policy analysis. The article demonstrates the approach on the policy problem of global climate change, with a particular focus on the role of technology policies in a robust, adaptive strategy for greenhouse gas abatement.
The sleeping brain as a complex system.
Olbrich, Eckehard; Achermann, Peter; Wennekers, Thomas
2011-10-13
'Complexity science' is a rapidly developing research direction with applications in a multitude of fields that study complex systems consisting of a number of nonlinear elements with interesting dynamics and mutual interactions. This Theme Issue 'The complexity of sleep' aims at fostering the application of complexity science to sleep research, because the brain in its different sleep stages adopts different global states that express distinct activity patterns in large and complex networks of neural circuits. This introduction discusses the contributions collected in the present Theme Issue. We highlight the potential and challenges of a complex systems approach to develop an understanding of the brain in general and the sleeping brain in particular. Basically, we focus on two topics: the complex networks approach to understand the changes in the functional connectivity of the brain during sleep, and the complex dynamics of sleep, including sleep regulation. We hope that this Theme Issue will stimulate and intensify the interdisciplinary communication to advance our understanding of the complex dynamics of the brain that underlies sleep and consciousness.
Long Valley Caldera-Mammoth Mountain unrest: The knowns and unknowns
Hill, David P.
2017-01-01
This perspective is based largely on my study of the Long Valley Caldera (California, USA) over the past 40 years. Here, I’ll examine the “knowns” and the “known unknowns” of the complex tectonic–magmatic system of the Long Valley Caldera volcanic complex. I will also offer a few brief thoughts on the “unknown unknowns” of this system.
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Truszkowski, Walter F.; Rouff, Christopher A.; Sterritt, Roy
2005-01-01
The explosion of capabilities and new products within the sphere of Information Technology (IT) has fostered widespread, overly optimistic opinions regarding the industry, based on common but unjustified assumptions of quality and correctness of software. These assumptions are encouraged by software producers and vendors, who at this late date have not succeeded in overcoming the lack of an automated, mathematically sound way to develop correct systems from requirements. NASA faces this dilemma as it envisages advanced mission concepts that involve large swarms of small spacecraft that will engage cooperatively to achieve science goals. Such missions entail levels of complexity that beg for new methods of system development far beyond today's methods, which are inadequate for ensuring correct behavior of large numbers of interacting intelligent mission elements. New system development techniques recently devised through NASA-led research will offer some innovative approaches to achieving correctness in complex system development, including autonomous swarm missions that exhibit emergent behavior, as well as general software products created by the computing industry.
Enabling Controlling Complex Networks with Local Topological Information.
Li, Guoqi; Deng, Lei; Xiao, Gaoxi; Tang, Pei; Wen, Changyun; Hu, Wuhua; Pei, Jing; Shi, Luping; Stanley, H Eugene
2018-03-15
Complex networks characterize the nature of internal/external interactions in real-world systems, including social, economic, biological, ecological, and technological networks. Two issues remain obstacles to achieving control of large-scale networks: structural controllability, which describes the ability to guide a dynamical system from any initial state to any desired final state in finite time with a suitable choice of inputs; and optimal control, which is a typical control approach to minimize the cost of driving the network to a predefined state with a given number of control inputs. For large complex networks without global information about network topology, both problems remain essentially open. Here we combine graph theory and control theory to tackle the two problems in one go, using only local network topology information. For the structural controllability problem, a distributed local-game matching method is proposed, in which every node plays a simple Bayesian game with local information and local interactions with adjacent nodes, ensuring a suboptimal solution at linear complexity. Starting from any structural controllability solution, a minimizing-longest-control-path method can efficiently reach a good solution for optimal control in large networks. Our results provide solutions for distributed complex network control and demonstrate a way to link structural controllability and optimal control together.
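The matching formulation behind structural controllability can be sketched centrally: by the minimum-inputs result, unmatched nodes in a maximum matching of the network's bipartite (out-copy/in-copy) representation must be driven directly. This is a centralized augmenting-path sketch, not the paper's distributed local-game method, and the example graph is invented:

```python
def max_matching(n, edges):
    """Maximum bipartite matching between out-copies and in-copies
    of the nodes of a directed graph, via Kuhn's augmenting-path
    algorithm."""
    adj = {u: [] for u in range(n)}
    for u, v in edges:
        adj[u].append(v)
    match = {}                          # in-copy node -> matched out-copy

    def augment(u, seen):
        for v in adj[u]:
            if v in seen:
                continue
            seen.add(v)
            if v not in match or augment(match[v], seen):
                match[v] = u
                return True
        return False

    return sum(1 for u in range(n) if augment(u, set()))

# a directed path 0 -> 1 -> 2 -> 3 plus an isolated node 4
n, edges = 5, [(0, 1), (1, 2), (2, 3)]
drivers = max(1, n - max_matching(n, edges))   # minimum control inputs
```

Here the path is matched end to end, so only its head and the isolated node need inputs (2 drivers); the paper's contribution is to reach comparable matchings with purely local information.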
Complex Dynamics of the Power Transmission Grid (and other Critical Infrastructures)
NASA Astrophysics Data System (ADS)
Newman, David
2015-03-01
Our modern societies depend crucially on a web of complex critical infrastructures such as power transmission networks, communication systems, transportation networks and many others. These infrastructure systems display a great number of the characteristic properties of complex systems. Important among these characteristics, they exhibit infrequent large cascading failures that often obey a power law distribution in their probability versus size. This power law behavior suggests that conventional risk analysis does not apply to these systems. It is thought that much of this behavior comes from the dynamical evolution of the system as it ages, is repaired, upgraded, and as the operational rules evolve, with human decision making playing an important role in the dynamics. In this talk, infrastructure systems as complex dynamical systems will be introduced and some of their properties explored. The majority of the talk will then be focused on the electric power transmission grid, though many of the results can be easily applied to other infrastructures. General properties of the grid will be discussed and results from a dynamical complex systems power transmission model will be compared with real world data. Then we will look at a variety of uses of this type of model. As examples, we will discuss the impact of size and network homogeneity on grid robustness, the change in risk of failure as the generation mix changes (more distributed vs. centralized, for example), as well as the effect of operational changes such as changing the operational risk aversion or grid upgrade strategies. One of the important outcomes from this work is the realization that "improvements" in the system components and operational efficiency do not always improve the system robustness, and can in fact greatly increase the risk, when measured as a risk of large failure.
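The cascading-failure behavior described here can be illustrated with a toy load-transfer model in the spirit of simple blackout models (a sketch, not the specific power transmission model of the talk; all parameters are invented): components carry random loads, an initial disturbance overloads some of them, and each failure sheds extra load onto the survivors, sometimes triggering further failures:

```python
import random

def cascade_sizes(n=100, trials=500, d=0.05, p=0.005, seed=11):
    """For each trial: draw n loads uniform on [0, 1], add an
    initial disturbance d to all of them, and let every failure
    (load > 1) shed an extra load p onto each component, iterating
    until no new failures occur. Returns the cascade size (total
    failures) per trial."""
    random.seed(seed)
    sizes = []
    for _ in range(trials):
        loads = [random.uniform(0.0, 1.0) + d for _ in range(n)]
        failed = [False] * n
        new = [i for i in range(n) if loads[i] > 1.0]
        total = 0
        while new:
            for i in new:
                failed[i] = True
            total += len(new)
            shed = p * len(new)            # load shed onto everyone
            loads = [x + shed for x in loads]
            new = [i for i in range(n) if not failed[i] and loads[i] > 1.0]
        sizes.append(total)
    return sizes

sizes = cascade_sizes()
```

Histogramming these sizes over many trials shows a heavy tail as the load-transfer parameter approaches criticality, which is the qualitative point behind the power-law blackout statistics the talk discusses.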
Dynamic Identification for Control of Large Space Structures
NASA Technical Reports Server (NTRS)
Ibrahim, S. R.
1985-01-01
This is a compilation of reports by a single author on one subject. It consists of the following five journal articles: (1) A Parametric Study of the Ibrahim Time Domain Modal Identification Algorithm; (2) Large Modal Survey Testing Using the Ibrahim Time Domain Identification Technique; (3) Computation of Normal Modes from Identified Complex Modes; (4) Dynamic Modeling of Structures from Measured Complex Modes; and (5) Time Domain Quasi-Linear Identification of Nonlinear Dynamic Systems.
NASA Astrophysics Data System (ADS)
Cashman, Katharine V.; Giordano, Guido
2014-11-01
Large caldera-forming eruptions have long been a focus of both petrological and volcanological studies; petrologists have used the eruptive products to probe conditions of magma storage (and thus processes that drive magma evolution), while volcanologists have used them to study the conditions under which large volumes of magma are transported to, and emplaced on, the Earth's surface. Traditionally, both groups have worked on the assumption that eruptible magma is stored within a single long-lived melt body. Over the past decade, however, advances in analytical techniques have provided new views of magma storage regions, many of which provide evidence of multiple melt lenses feeding a single eruption, and/or rapid pre-eruptive assembly of large volumes of melt. These new petrological views of magmatic systems have not yet been fully integrated into volcanological perspectives of caldera-forming eruptions. Here we explore the implications of complex magma reservoir configurations for eruption dynamics and caldera formation. We first examine mafic systems, where stacked-sill models have long been invoked but which rarely produce explosive eruptions. An exception is the 2010 eruption of Eyjafjallajökull volcano, Iceland, where seismic and petrologic data show that multiple sills at different depths fed a multi-phase (explosive and effusive) eruption. Extension of this concept to larger mafic caldera-forming systems suggests a mechanism to explain many of their unusual features, including their protracted explosivity, spatially variable compositions and pronounced intra-eruptive pauses. We then review studies of more common intermediate and silicic caldera-forming systems to examine inferred conditions of magma storage, time scales of melt accumulation, eruption triggers, eruption dynamics and caldera collapse. 
By compiling data from large and small, and crystal-rich and crystal-poor, events, we compare eruptions that are well explained by simple evacuation of a zoned magma chamber (termed the Standard Model by Gualda and Ghiorso, 2013) to eruptions that are better explained by tapping multiple, rather than single, melt lenses stored within a largely crystalline mush (which we term complex magma reservoirs). We then discuss the implications of magma storage within complex, rather than simple, reservoirs for identifying magmatic systems with the potential to produce large eruptions, and for monitoring eruption progress under conditions where successive melt lenses may be tapped. We conclude that emerging views of complex magma reservoir configurations provide exciting opportunities for re-examining volcanological concepts of caldera-forming systems.
Bush, Ian E.
1980-01-01
The lessons of the 70's with MIS were largely painful, often the same as those of the 60's, and were found in different phases on two continents. On examination this turns out to be true for many non-medical fields, true for systems programming, and thus a very general phenomenon. It is related to the functional complexity rather than to the sheer size of the software required, and above all to the relative neglect of human factors at all levels of software and hardware design. Simple hierarchical theory is a useful tool for analyzing complex systems and restoring the necessary dominance of common sense human factors. An example shows the very large effects of neglecting these factors on costs and benefits of MIS and their sub-systems.
Explicit solution techniques for impact with contact constraints
NASA Technical Reports Server (NTRS)
Mccarty, Robert E.
1993-01-01
Modern military aircraft transparency systems, windshields and canopies, are complex systems which must meet a large and rapidly growing number of requirements. Many of these transparency system requirements are conflicting, presenting difficult balances which must be achieved. One example of a challenging requirements balance or trade is shaping for stealth versus aircrew vision. The large number of requirements involved may be grouped in a variety of areas including man-machine interface; structural integration with the airframe; combat hazards; environmental exposures; and supportability. Some individual requirements by themselves pose very difficult, severely nonlinear analysis problems. One such complex problem is that associated with the dynamic structural response resulting from high energy bird impact. An improved analytical capability for soft-body impact simulation was developed.
From a Proven Correct Microkernel to Trustworthy Large Systems
NASA Astrophysics Data System (ADS)
Andronick, June
The seL4 microkernel was the world's first general-purpose operating system kernel with a formal, machine-checked proof of correctness. The next big step in the challenge of building truly trustworthy systems is to provide a framework for developing secure systems on top of seL4. This paper first gives an overview of seL4's correctness proof, together with its main implications and assumptions, and then describes our approach to provide formal security guarantees for large, complex systems.
Human Mars Mission Design - The Ultimate Systems Challenge
NASA Technical Reports Server (NTRS)
Connolly, John F.; Joosten, B. Kent; Drake, Bret; Hoffman, Steve; Polsgrove, Tara; Rucker, Michelle; Andrews, Alida; Williams, Nehemiah
2017-01-01
A human mission to Mars will occur at some time in the coming decades. When it does, it will be the end result of a complex network of interconnected design choices, systems analyses, technical optimizations, and non-technical compromises. This mission will extend the technologies, engineering design, and systems analyses to new limits, and may very well be the most complex undertaking in human history. It can be illustrated as a large menu, or as a large decision tree. Whatever the visualization tool, there are numerous design decisions required to assemble a human Mars mission, and many of these interconnect with one another. This paper examines these many decisions and further details a number of choices that are highly interwoven throughout the mission design. The large quantity of variables and their interconnectedness results in a highly complex systems challenge, and the paper illustrates how a change in one variable results in ripples (sometimes unintended) throughout many other facets of the design. The paper concludes with a discussion of some mission design variables that can be addressed first, and those that have already been addressed as a result of ongoing National Aeronautics and Space Administration (NASA) developments, or as a result of decisions outside the technical arena. It advocates the need for a 'reference design' that can be used as a point of comparison, and to illustrate the system-wide impacts as design variables change.
The Internet As a Large-Scale Complex System
NASA Astrophysics Data System (ADS)
Park, Kihong; Willinger, Walter
2005-06-01
The Internet may be viewed as a "complex system" with diverse features and many components that can give rise to unexpected emergent phenomena, revealing much about its own engineering. This book brings together chapter contributions from a workshop held at the Santa Fe Institute in March 2001. This volume captures a snapshot of some features of the Internet that may be fruitfully approached using a complex systems perspective, that is, using interdisciplinary tools and methods to tackle the subject area. The Internet penetrates the socioeconomic fabric of everyday life; a broader and deeper grasp of the Internet may be needed to meet the challenges facing the future. The resulting empirical data have already proven to be invaluable for gaining novel insights into the network's spatio-temporal dynamics, and can be expected to become even more important when trying to explain the Internet's complex and emergent behavior in terms of elementary networking-based mechanisms. The discoveries of fractal or self-similar network traffic traces, power-law behavior in network topology and World Wide Web connectivity are instances of unsuspected, emergent system traits. Another important factor at the heart of fair, efficient, and stable sharing of network resources is user behavior. Network systems, when inhabited by selfish or greedy users, take on the traits of a noncooperative multi-party game, and their stability and efficiency are integral to understanding the overall system and its dynamics. Lastly, fault-tolerance and robustness of large-scale network systems can exhibit spatial and temporal correlations whose effective analysis and management may benefit from rescaling techniques applied in certain physical and biological systems. The present book brings together several of the leading workers involved in the analysis of complex systems with the future development of the Internet.
The New Maia Detector System: Methods For High Definition Trace Element Imaging Of Natural Material
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ryan, C. G.; School of Physics, University of Melbourne, Parkville VIC; CODES Centre of Excellence, University of Tasmania, Hobart TAS
2010-04-06
Motivated by the need for megapixel high definition trace element imaging to capture intricate detail in natural material, together with faster acquisition and improved counting statistics in elemental imaging, a large energy-dispersive detector array called Maia has been developed by CSIRO and BNL for SXRF imaging on the XFM beamline at the Australian Synchrotron. A 96 detector prototype demonstrated the capacity of the system for real-time deconvolution of complex spectral data using an embedded implementation of the Dynamic Analysis method and acquiring highly detailed images up to 77 M pixels spanning large areas of complex mineral sample sections.
The New Maia Detector System: Methods For High Definition Trace Element Imaging Of Natural Material
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ryan, C.G.; Siddons, D.P.; Kirkham, R.
2010-05-25
Motivated by the need for megapixel high definition trace element imaging to capture intricate detail in natural material, together with faster acquisition and improved counting statistics in elemental imaging, a large energy-dispersive detector array called Maia has been developed by CSIRO and BNL for SXRF imaging on the XFM beamline at the Australian Synchrotron. A 96 detector prototype demonstrated the capacity of the system for real-time deconvolution of complex spectral data using an embedded implementation of the Dynamic Analysis method and acquiring highly detailed images up to 77 M pixels spanning large areas of complex mineral sample sections.
Modeling Coastal Zone Responses to Sea-Level Rise Using MoCCS: A Model of Complex Coastal System
NASA Astrophysics Data System (ADS)
Dai, H.; Niedoroda, A. W.; Ye, M.; Saha, B.; Donoghue, J. F.; Kish, S.
2011-12-01
Large-scale coastal systems consisting of several morphological components (e.g. beach, surf zone, dune, inlet, shoreface, and estuary) can be expected to exhibit complex and interacting responses to changes in the rate of sea level rise and storm climate. We have developed a numerical model of complex coastal systems (MoCCS), derived from earlier morphodynamic models, to represent the large-scale time-averaged physical processes that shape each component and govern the component interactions. These control the ongoing evolution of the barrier islands, beach and dune erosion, shoal formation and sand withdrawal at tidal inlets, depth changes in the bay, and changes in storm flooding. The model has been used to study the response of an idealized coastal system with physical characteristics and storm climatology similar to Santa Rosa Island on the Florida Panhandle coast. Five sea-level rise (SLR) scenarios have been used, covering the range of recently published projections for the next century. Each scenario has been input with a constant and then a time-varying storm climate. The results indicate that substantial increases in the rate of beach erosion are largely due to increased sand transfer to inlet shoals with increased rates of sea level rise. The barrier island undergoes cycles of dune destruction and regrowth, leading to sand deposition. This largely maintains island freeboard but is progressively less effective in offsetting bayside inundation and marsh habitat loss at accelerated sea level rise rates.
High performance computing in biology: multimillion atom simulations of nanoscale systems
Sanbonmatsu, K. Y.; Tung, C.-S.
2007-01-01
Computational methods have been used in biology for sequence analysis (bioinformatics), all-atom simulation (molecular dynamics and quantum calculations), and more recently for modeling biological networks (systems biology). Of these three techniques, all-atom simulation is currently the most computationally demanding, in terms of compute load, communication speed, and memory load. Breakthroughs in electrostatic force calculation and dynamic load balancing have enabled molecular dynamics simulations of large biomolecular complexes. Here, we report simulation results for the ribosome, using approximately 2.64 million atoms, the largest all-atom biomolecular simulation published to date. Several other nanoscale systems with different numbers of atoms were studied to measure the performance of the NAMD molecular dynamics simulation program on the Los Alamos National Laboratory Q Machine. We demonstrate that multimillion atom systems represent a 'sweet spot' for the NAMD code on large supercomputers. NAMD displays an unprecedented 85% parallel scaling efficiency for the ribosome system on 1024 CPUs. We also review recent targeted molecular dynamics simulations of the ribosome that prove useful for studying conformational changes of this large biomolecular complex in atomic detail. PMID:17187988
Interactive computer graphics and its role in control system design of large space structures
NASA Technical Reports Server (NTRS)
Reddy, A. S. S. R.
1985-01-01
This paper attempts to show the relevance of interactive computer graphics in the design of control systems to maintain attitude and shape of large space structures to accomplish the required mission objectives. The typical phases of control system design, beginning with the physical model, such as modeling the dynamics, modal analysis, and control system design methodology, are reviewed, and the need for interactive computer graphics is demonstrated. Typical constituent parts of large space structures such as free-free beams and free-free plates are used to demonstrate the complexity of the control system design and the effectiveness of the interactive computer graphics.
The Design, Development and Testing of a Multi-process Real-time Software System
2007-03-01
programming large systems stems from the complexity of dealing with many different details at one time. A sound engineering approach is to break...controls and 3) is portable to other OS platforms such as Microsoft Windows. Next, to reduce the complexity of the programming tasks, the system...processes depending on how often the process has to check to see if common data was modified. A good method for one process to quickly notify another
Safety and Suitability for Service Assessment Testing for Surface and Underwater Launched Munitions
2014-12-05
test efficiency that tend to associate the Analytical S3 Test Approach with large, complex munition systems and the Empirical S3 Test Approach with...the smaller, less complex munition systems. 8.1 ANALYTICAL S3 TEST APPROACH. The Analytical S3 test approach, as shown in Figure 3, evaluates...assets than the Analytical S3 Test approach to establish the safety margin of the system. This approach is generally applicable to small munitions
Chen, Xuehui; Sun, Yunxiang; An, Xiongbo; Ming, Dengming
2011-10-14
Normal mode analysis of large biomolecular complexes at atomic resolution remains challenging in computational structural biology due to the large amounts of memory and central processing unit time required. In this paper, we present a method called the virtual interface substructure synthesis method, or VISSM, to calculate approximate normal modes of large biomolecular complexes at atomic resolution. VISSM introduces the subunit interfaces as independent substructures that join contacting molecules so as to keep the integrity of the system. Compared with other approximate methods, VISSM delivers atomic modes with no need for a coarse-graining-then-projection procedure. The method was examined for 54 protein complexes against conventional all-atom normal mode analysis using the CHARMM simulation program, and the overlap of the first 100 low-frequency modes is greater than 0.7 for 49 complexes, indicating its accuracy and reliability. We then applied VISSM to the satellite panicum mosaic virus (SPMV, 78,300 atoms) and to F-actin filament structures of up to 39-mer, 228,813 atoms, and found that VISSM calculations capture functionally important conformational changes accessible to these structures at atomic resolution. Our results support the idea that the dynamics of a large biomolecular complex might be understood based on the motions of its component subunits and the way in which subunits bind one another. © 2011 American Institute of Physics
Sturmberg, Joachim P; Martin, Carmel M
2010-10-01
Health services demonstrate key features of complex adaptive systems (CAS): they are dynamic, they unfold in unpredictable ways, and unfolding events are often unique. To better understand the complex adaptive nature of health systems around a core attractor, we propose the metaphor of the health care vortex. We also suggest that in an ideal health care system the core attractor would be personal health attainment. Health care reforms around the world offer an opportunity to analyse health system change from a complex adaptive perspective. By and large, health care reforms have been pursued while disregarding the complex adaptive nature of the health system. The paper details some recent reforms and outlines how to understand their strategies and outcomes, and what could be learnt for future efforts, utilising CAS principles. Current health systems show the inherent properties of a CAS driven by a core attractor of disease and cost containment. We contend that more meaningful health systems reform requires the delicate task of shifting the core attractor from disease and cost containment towards health attainment.
A finite element formulation for scattering from electrically large 2-dimensional structures
NASA Technical Reports Server (NTRS)
Ross, Daniel C.; Volakis, John L.
1992-01-01
A finite element formulation is given using the scattered field approach with a fictitious material absorber to truncate the mesh. The formulation includes the use of arbitrary approximation functions so that more accurate results can be achieved without any modification to the software. Additionally, non-polynomial approximation functions can be used, including complex approximation functions. The banded system that results is solved with an efficient sparse/banded iterative scheme and as a consequence, large structures can be analyzed. Results are given for simple cases to verify the formulation and also for large, complex geometries.
Ellinas, Christos; Allan, Neil; Durugbo, Christopher; Johansson, Anders
2015-01-01
Current societal requirements necessitate the effective delivery of complex projects that can do more while using less. Yet, recent large-scale project failures suggest that our ability to successfully deliver them is still in its infancy. Such failures can be seen to arise through various failure mechanisms; this work focuses on one such mechanism. Specifically, it examines the likelihood of a project sustaining a large-scale catastrophe, as triggered by single task failure and delivered via a cascading process. To do so, an analytical model was developed and tested on an empirical dataset by means of numerical simulation. This paper makes three main contributions. First, it provides a methodology to identify the tasks most capable of impacting a project. In doing so, it is noted that a significant number of tasks induce no cascades, while a handful are capable of triggering surprisingly large ones. Secondly, it illustrates that crude task characteristics cannot aid in identifying them, highlighting the complexity of the underlying process and the utility of this approach. Thirdly, it draws parallels with systems encountered within the natural sciences by noting the emergence of self-organised criticality, commonly found within natural systems. These findings strengthen the need to account for structural intricacies of a project's underlying task precedence structure as they can provide the conditions upon which large-scale catastrophes materialise.
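The cascading failure mechanism described in this abstract can be sketched with a toy simulation. This is a hypothetical illustration, not the authors' analytical model: a single task fails and the failure propagates along the edges of an assumed task-precedence graph, so that cascade size depends strongly on which task is the seed.

```python
# Hypothetical sketch of failure cascading through a task-precedence network.
# The precedence structure `edges` is invented for illustration only.
from collections import deque

def cascade_size(successors, seed):
    """Count tasks reached by a failure spreading from `seed` along precedence edges."""
    failed, queue = {seed}, deque([seed])
    while queue:
        task = queue.popleft()
        for nxt in successors.get(task, ()):
            if nxt not in failed:
                failed.add(nxt)
                queue.append(nxt)
    return len(failed)

# Toy project: some tasks trigger no cascade, one seed reaches everything.
edges = {"A": ["B", "C"], "B": ["D"], "C": ["D", "E"], "D": ["F"], "E": ["F"]}
sizes = {t: cascade_size(edges, t) for t in "ABCDEF"}
print(sizes)  # failing "A" reaches all 6 tasks; failing "F" reaches only itself
```

Even in this six-task example the heavy-tailed flavour of the paper's finding appears: most seeds produce small cascades, while one produces a project-wide one.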
Multiagent model and mean field theory of complex auction dynamics
NASA Astrophysics Data System (ADS)
Chen, Qinghua; Huang, Zi-Gang; Wang, Yougui; Lai, Ying-Cheng
2015-09-01
Recent years have witnessed a growing interest in analyzing a variety of socio-economic phenomena using methods from statistical and nonlinear physics. We study a class of complex systems arising from economics, the lowest unique bid auction (LUBA) systems, which is a recently emerged class of online auction game systems. Through analyzing large, empirical data sets of LUBA, we identify a general feature of the bid price distribution: an inverted J-shaped function with exponential decay in the large bid price region. To account for the distribution, we propose a multi-agent model in which each agent bids stochastically in the field of winner’s attractiveness, and develop a theoretical framework to obtain analytic solutions of the model based on mean field analysis. The theory produces bid-price distributions that are in excellent agreement with those from the real data. Our model and theory capture the essential features of human behaviors in the competitive environment as exemplified by LUBA, and may provide significant quantitative insights into complex socio-economic phenomena.
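The bid-price distribution described above can be illustrated with a simplified agent simulation. This is a sketch under assumed dynamics, not the authors' mean-field model: each agent samples a bid from an exponentially decaying attractiveness profile, which reproduces the exponential decay in the large-bid-price region (the functional form and parameters are assumptions).

```python
# Illustrative LUBA-style bidding sketch (assumed attractiveness profile,
# not the paper's exact multi-agent model).
import math
import random

def simulate_luba(n_agents=20000, max_price=50, beta=0.15, seed=0):
    """Each agent bids one price, drawn with probability proportional to exp(-beta*p)."""
    rng = random.Random(seed)
    weights = [math.exp(-beta * p) for p in range(1, max_price + 1)]
    total = sum(weights)
    counts = [0] * (max_price + 1)  # counts[p] = number of bids at price p
    for _ in range(n_agents):
        r, acc = rng.random() * total, 0.0
        for p, w in enumerate(weights, start=1):
            acc += w
            if r <= acc:
                counts[p] += 1
                break
        else:
            counts[max_price] += 1  # guard against floating-point round-off
    return counts

counts = simulate_luba()
# Higher prices attract exponentially fewer bids, as in the empirical tail.
```

A plot of `counts` against price would show the monotone exponentially decaying tail; the inverted J-shape of the real data additionally requires the crowding feedback that the paper's mean-field theory supplies.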
Uncertainty Reduction for Stochastic Processes on Complex Networks
NASA Astrophysics Data System (ADS)
Radicchi, Filippo; Castellano, Claudio
2018-05-01
Many real-world systems are characterized by stochastic dynamical rules where a complex network of interactions among individual elements probabilistically determines their state. Even with full knowledge of the network structure and of the stochastic rules, the ability to predict system configurations is generally characterized by a large uncertainty. Selecting a fraction of the nodes and observing their state may help to reduce the uncertainty about the unobserved nodes. However, choosing these points of observation in an optimal way is a highly nontrivial task, depending on the nature of the stochastic process and on the structure of the underlying interaction pattern. In this paper, we introduce a computationally efficient algorithm to determine quasioptimal solutions to the problem. The method leverages network sparsity to reduce computational complexity from exponential to almost quadratic, thus allowing the straightforward application of the method to mid-to-large-size systems. Although the method is exact only for equilibrium stochastic processes defined on trees, it turns out to be effective also for out-of-equilibrium processes on sparse loopy networks.
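The core idea of the abstract, that observing a subset of nodes reduces uncertainty about the rest, can be made concrete with a minimal two-node example. This is a hypothetical illustration of the information-theoretic principle, not the paper's algorithm: for a correlated pair, the conditional entropy H(Y|X) is strictly below the marginal entropy H(Y).

```python
# Toy uncertainty-reduction example: observing node X reduces the Shannon
# entropy of hidden node Y. The joint distribution is invented for illustration.
import math

def entropy(dist):
    """Shannon entropy (bits) of a discrete distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# Joint distribution over (observed X, hidden Y), strongly correlated states.
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}

pY = [sum(p for (x, y), p in joint.items() if y == v) for v in (0, 1)]
H_Y = entropy(pY)  # uncertainty about Y before any observation

# Conditional entropy H(Y|X) = sum_x p(x) * H(Y | X=x)
H_Y_given_X = 0.0
for xv in (0, 1):
    px = sum(p for (x, y), p in joint.items() if x == xv)
    cond = [joint[(xv, v)] / px for v in (0, 1)]
    H_Y_given_X += px * entropy(cond)
```

The hard part the paper addresses is not this calculation but choosing *which* nodes to observe on a large network, where exhaustive search over observation sets is exponential.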
PeTTSy: a computational tool for perturbation analysis of complex systems biology models.
Domijan, Mirela; Brown, Paul E; Shulgin, Boris V; Rand, David A
2016-03-10
Over the last decade sensitivity analysis techniques have been shown to be very useful to analyse complex and high dimensional Systems Biology models. However, many of the currently available toolboxes have either used parameter sampling, been focused on a restricted set of model observables of interest, studied optimisation of an objective function, or have not dealt with multiple simultaneous model parameter changes where the changes can be permanent or temporary. Here we introduce our new, freely downloadable toolbox, PeTTSy (Perturbation Theory Toolbox for Systems). PeTTSy is a package for MATLAB which implements a wide array of techniques for the perturbation theory and sensitivity analysis of large and complex ordinary differential equation (ODE) based models. PeTTSy is a comprehensive modelling framework that introduces a number of new approaches and that fully addresses analysis of oscillatory systems. It examines sensitivity analysis of the models to perturbations of parameters, where the perturbation timing, strength, length and overall shape can be controlled by the user. This can be done in a system-global setting, namely, the user can determine how many parameters to perturb, by how much and for how long. PeTTSy also offers the user the ability to explore the effect of the parameter perturbations on many different types of outputs: period, phase (timing of peak) and model solutions. PeTTSy can be employed on a wide range of mathematical models including free-running and forced oscillators and signalling systems. To enable experimental optimisation using the Fisher Information Matrix it efficiently allows one to combine multiple variants of a model (i.e. a model with multiple experimental conditions) in order to determine the value of new experiments. It is especially useful in the analysis of large and complex models involving many variables and parameters.
PeTTSy is a comprehensive tool for analysing large and complex models of regulatory and signalling systems. It allows for simulation and analysis of models under a variety of environmental conditions and for experimental optimisation of complex combined experiments. With its unique set of tools it makes a valuable addition to the current library of sensitivity analysis toolboxes. We believe that this software will be of great use to the wider biological, systems biology and modelling communities.
Toolsets Maintain Health of Complex Systems
NASA Technical Reports Server (NTRS)
2010-01-01
First featured in Spinoff 2001, Qualtech Systems Inc. (QSI), of Wethersfield, Connecticut, adapted its Testability, Engineering, and Maintenance System (TEAMS) toolset under Small Business Innovation Research (SBIR) contracts from Ames Research Center to strengthen NASA's systems health management approach for its large, complex, and interconnected systems. Today, six NASA field centers utilize the TEAMS toolset, including TEAMS-Designer, TEAMS-RT, TEAMATE, and TEAMS-RDS. TEAMS is also being used on industrial systems that generate power, carry data, refine chemicals, perform medical functions, and produce semiconductor wafers. QSI finds TEAMS can lower costs by decreasing problems requiring service by 30 to 50 percent.
A study of the spreading scheme for viral marketing based on a complex network model
NASA Astrophysics Data System (ADS)
Yang, Jianmei; Yao, Canzhong; Ma, Weicheng; Chen, Guanrong
2010-02-01
Buzzword-based viral marketing, known also as digital word-of-mouth marketing, is a marketing mode attached to some carriers on the Internet, which can rapidly copy marketing information at a low cost. Viral marketing actually uses a pre-existing social network where, however, the scale of the pre-existing network is believed to be so large and so random that its theoretical analysis is intractable and unmanageable. There are very few reports in the literature on how to design a spreading scheme for viral marketing on real social networks according to the traditional marketing theory or the relatively new network marketing theory. Complex network theory provides a new model for the study of large-scale complex systems, using the latest developments of graph theory and computing techniques. From this perspective, the present paper extends complex network theory and modeling into the research of general viral marketing and develops a specific spreading scheme for viral marketing and an approach to design the scheme based on a real complex network on the QQ instant messaging system. This approach is shown to be rather universal and can be further extended to the design of various spreading schemes for viral marketing based on different instant messaging systems.
NASA Technical Reports Server (NTRS)
Consoli, Robert David; Sobieszczanski-Sobieski, Jaroslaw
1990-01-01
Advanced multidisciplinary analysis and optimization methods, namely system sensitivity analysis and non-hierarchical system decomposition, are applied to reduce the cost and improve the visibility of an automated vehicle design synthesis process. This process is inherently complex due to the large number of functional disciplines and associated interdisciplinary couplings. Recent developments in system sensitivity analysis as applied to complex non-hierarchic multidisciplinary design optimization problems enable the decomposition of these complex interactions into sub-processes that can be evaluated in parallel. The application of these techniques results in significant cost, accuracy, and visibility benefits for the entire design synthesis process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Appolaire, Alexandre; Girard, Eric; Colombo, Matteo
2014-11-01
The present work illustrates that small-angle neutron scattering, deuteration and contrast variation, combined with in vitro particle reconstruction, constitutes a very efficient approach to determine subunit architectures in large, symmetric protein complexes. In the case of the 468 kDa heterododecameric TET peptidase machine, it was demonstrated that the assembly of the 12 subunits is a highly controlled process and represents a way to optimize the catalytic efficiency of the enzyme. The specific self-association of proteins into oligomeric complexes is a common phenomenon in biological systems to optimize and regulate their function. However, de novo structure determination of these important complexes is often very challenging for atomic-resolution techniques. Furthermore, in the case of homo-oligomeric complexes, or complexes with very similar building blocks, the respective positions of subunits and their assembly pathways are difficult to determine using many structural biology techniques. Here, an elegant and powerful approach based on small-angle neutron scattering is applied, in combination with deuterium labelling and contrast variation, to elucidate the oligomeric organization of the quaternary structure and the assembly pathways of 468 kDa, hetero-oligomeric and symmetric Pyrococcus horikoshii TET2–TET3 aminopeptidase complexes. The results reveal that the topology of the PhTET2 and PhTET3 dimeric building blocks within the complexes is not casual but rather suggests that their quaternary arrangement optimizes the catalytic efficiency towards peptide substrates. This approach bears important potential for the determination of quaternary structures and assembly pathways of large oligomeric and symmetric complexes in biological systems.
Symmetry of interactions rules in incompletely connected random replicator ecosystems.
Kärenlampi, Petri P
2014-06-01
The evolution of an incompletely connected system of species with speciation and extinction is investigated in terms of random replicators. It is found that evolving random replicator systems with speciation do become large and complex, depending on speciation parameters. Antisymmetric interactions result in large systems, whereas systems with symmetric interactions remain small. A co-dominating feature is within-species interaction pressure: large within-species interaction increases species diversity. Average fitness evolves in all systems; however, symmetry and connectivity evolve only in small systems. Newcomers become extinct almost immediately in symmetric systems. The distribution of species lifetimes is determined for antisymmetric systems. The replicator systems investigated do not show any sign of self-organized criticality. The generalized Lotka-Volterra system is shown to be a tedious way of implementing the replicator system.
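The replicator dynamics underlying this abstract can be sketched in a few lines. This is a minimal illustration of the standard replicator equation with a random antisymmetric interaction matrix, not the paper's full speciation-extinction model; the matrix size, step size, and iteration count are arbitrary choices.

```python
# Minimal replicator-dynamics sketch: species frequencies x evolve under
# x_i' = x_i * ((Ax)_i - x.A.x), with a random antisymmetric interaction matrix A.
import random

def replicator_step(x, A, dt=0.01):
    """One explicit Euler step of the replicator equation, renormalised to the simplex."""
    n = len(x)
    fitness = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
    mean_fit = sum(xi * fi for xi, fi in zip(x, fitness))
    x = [xi + dt * xi * (fi - mean_fit) for xi, fi in zip(x, fitness)]
    s = sum(x)
    return [xi / s for xi in x]  # guard against numerical drift off the simplex

rng = random.Random(1)
n = 5
A = [[0.0] * n for _ in range(n)]
for i in range(n):
    for j in range(i + 1, n):
        a = rng.uniform(-1, 1)
        A[i][j], A[j][i] = a, -a  # antisymmetric interactions, as in the large-system regime

x = [1.0 / n] * n  # start from equal frequencies
for _ in range(1000):
    x = replicator_step(x, A)
```

For an antisymmetric `A` the mean fitness term `x.A.x` is identically zero, which is one reason such systems support persistent coexistence of many species rather than collapse onto a few dominants.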
Analysis and design of algorithm-based fault-tolerant systems
NASA Technical Reports Server (NTRS)
Nair, V. S. Sukumaran
1990-01-01
An important consideration in the design of high performance multiprocessor systems is to ensure the correctness of the results computed in the presence of transient and intermittent failures. Concurrent error detection and correction have been applied to such systems in order to achieve reliability. Algorithm Based Fault Tolerance (ABFT) was suggested as a cost-effective concurrent error detection scheme. The research was motivated by the complexity involved in the analysis and design of ABFT systems. To that end, a matrix-based model was developed and, based on that, algorithms for both the design and analysis of ABFT systems are formulated. These algorithms are less complex than the existing ones. In order to reduce the complexity further, a hierarchical approach is developed for the analysis of large systems.
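The ABFT idea summarised above is most easily seen in the classic checksum-matrix example (in the style of the Huang-Abraham scheme; this is a generic sketch, not the dissertation's matrix-based model): encode the operands with row and column checksums, compute normally, then verify the checksums of the result to detect a transient fault concurrently with the computation.

```python
# Classic ABFT sketch: checksum-encoded matrix multiplication detects a
# transient error injected into the product. Matrices are illustrative.

def matmul(A, B):
    n, m, p = len(A), len(B[0]), len(B)
    return [[sum(A[i][k] * B[k][j] for k in range(p)) for j in range(m)] for i in range(n)]

def column_checksum(A):
    """Append a row holding the sum of each column of A."""
    return A + [[sum(col) for col in zip(*A)]]

def row_checksum(B):
    """Append a column holding the sum of each row of B."""
    return [row + [sum(row)] for row in B]

def detect_errors(C):
    """Return True iff the product's row and column checksums are consistent."""
    rows_ok = all(abs(sum(row[:-1]) - row[-1]) < 1e-9 for row in C[:-1])
    cols_ok = all(abs(sum(col[:-1]) - col[-1]) < 1e-9 for col in zip(*C))
    return rows_ok and cols_ok

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C = matmul(column_checksum(A), row_checksum(B))  # full-checksum product
assert detect_errors(C)   # fault-free result passes the check
C[0][0] += 1              # inject a transient error
assert not detect_errors(C)  # the checksums expose it
```

The intersection of the violated row and column checks also localises the faulty element, which is what makes the scheme usable for correction as well as detection.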
Review of integrated digital systems: evolution and adoption
NASA Astrophysics Data System (ADS)
Fritz, Lawrence W.
The factors that are influencing the evolution of photogrammetric and remote sensing technology to transition into fully integrated digital systems are reviewed. These factors include societal pressures for new, more timely digital products from the Spatial Information Sciences and the adoption of rapid technological advancements in digital processing hardware and software. Current major developments in leading government mapping agencies of the USA, such as the Digital Production System (DPS) modernization programme at the Defense Mapping Agency, and the Automated Nautical Charting System II (ANCS-II) programme and Integrated Digital Photogrammetric Facility (IDPF) at NOAA/National Ocean Service, illustrate the significant benefits to be realized. These programmes are examples of different levels of integrated systems that have been designed to produce digital products. They provide insights into the management complexities to be considered for very large integrated digital systems. In recognition of computer industry trends, a knowledge-based architecture for managing the complexity of the very large spatial information systems of the future is proposed.
NASA Technical Reports Server (NTRS)
Hicks, Brian A.; Lyon, Richard G.; Petrone, Peter, III; Bolcar, Matthew R.; Bolognese, Jeff; Clampin, Mark; Dogoda, Peter; Dworzanski, Daniel; Helmbrecht, Michael A.; Koca, Corina;
2016-01-01
This work presents an overview of the Segmented Aperture Interferometric Nulling Testbed (SAINT), a project that will pair an actively-controlled macro-scale segmented mirror with the Visible Nulling Coronagraph (VNC). SAINT will incorporate the VNC's demonstrated wavefront sensing and control system to refine and quantify the end-to-end system performance for high-contrast starlight suppression. This pathfinder system will be used as a tool to study and refine approaches to mitigating instabilities and complex diffraction expected from future large segmented aperture telescopes.
Costing Complex Products, Operations, and Support
2011-04-30
Symposium, 10-12 May 2011, Seaside, CA. U.S. Government or Federal Rights License. ABSTRACT: Complex products and systems (CoPS), such as large defense... Program Executive Officer SHIPS • Commander, Naval Sea Systems Command • Army Contracting Command, U.S. Army Materiel Command • Program Manager... Airborne, Maritime and Fixed Station Joint Tactical Radio System
Master-slave system with force feedback based on dynamics of virtual model
NASA Technical Reports Server (NTRS)
Nojima, Shuji; Hashimoto, Hideki
1994-01-01
A master-slave system can extend the manipulating and sensing capabilities of a human operator to a remote environment. But the master-slave system has two serious problems: one is the mechanically large impedance of the system; the other is the mechanical complexity of the slave required for complex remote tasks. These two problems reduce the efficiency of the system. If the slave has local intelligence, it can help the human operator by exploiting its strengths, such as fast computation and large memory. The authors suggest that the slave be a dextrous hand with many degrees of freedom able to manipulate an object of known shape. It is further suggested that the remote work space be shared by the human operator and the slave. The effect of the large impedance of the system can be reduced in a virtual model, a physical model constructed in a computer with physical parameters as if it were in the real world. A method to determine the damping parameter dynamically for the virtual model is proposed. Experimental results show that this virtual model is better than a virtual model with fixed damping.
Analysis of space vehicle structures using the transfer-function concept
NASA Technical Reports Server (NTRS)
Heer, E.; Trubert, M. R.
1969-01-01
Analysis of large complex systems is accomplished by dividing them into suitable subsystems and determining the individual dynamical and vibrational responses. Frequency transfer functions then determine the vibrational response of the whole system.
Safety management of a complex R&D ground operating system
NASA Technical Reports Server (NTRS)
Connors, J.; Mauer, R. A.
1975-01-01
This report discusses safety program implementation for a large R&D operating system. Analytical techniques are defined and suggested as tools for identifying potential hazards and determining means to effectively control or eliminate them.
Hierarchical Modeling and Robust Synthesis for the Preliminary Design of Large Scale Complex Systems
NASA Technical Reports Server (NTRS)
Koch, Patrick N.
1997-01-01
Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation, these issues are addressed through the development of a method for hierarchical robust preliminary design exploration, which facilitates concurrent system and subsystem design exploration and the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts and allowing integration of subproblems for system synthesis; statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration; and noise modeling techniques for implementing robust preliminary design when approximate models are employed. Hierarchical partitioning and modeling techniques, including intermediate responses, linking variables, and compatibility constraints, are incorporated within a hierarchical compromise decision support problem formulation for synthesizing subproblem solutions for a partitioned system. Experimentation and approximation techniques are employed for concurrent investigation and modeling of partitioned subproblems.
A modified composite experiment is introduced for fitting better predictive models across the ranges of the factors, and an approach for constructing partitioned response surfaces is developed to reduce the computational expense of experimentation for fitting models in a large number of factors. Noise modeling techniques are compared, and recommendations are offered for the implementation of robust design when approximate models are sought. These techniques, approaches, and recommendations are incorporated within the method developed for hierarchical robust preliminary design exploration. This method, as well as the associated approaches, is illustrated through application to the preliminary design of a commercial turbofan propulsion system. The case study is developed in collaboration with Allison Engine Company, Rolls-Royce Aerospace, and is based on the existing Allison AE3007 engine, designed for midsize commercial and regional business jets. For this case study, the turbofan system-level problem is partitioned into engine cycle design and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation. The fan and low-pressure turbine subsystems are also modeled, but in less detail. Given the defined partitioning, these subproblems are investigated independently and concurrently, and response surface models are constructed to approximate the responses of each. These response models are then incorporated within a commercial turbofan hierarchical compromise decision support problem formulation. Five design scenarios are investigated, and robust solutions are identified. The method and the solutions identified are verified by comparison with the AE3007 engine. The solutions obtained are similar to the AE3007 cycle and configuration, but are better with respect to many of the requirements.
Thermal Environment for Classrooms. Central System Approach to Air Conditioning.
ERIC Educational Resources Information Center
Triechler, Walter W.
This speech compares the air conditioning requirements of high-rise office buildings with those of large centralized school complexes. A description of one particular air conditioning system provides information about the system's arrangement, functions, performance efficiency, and cost effectiveness. (MLF)
Intelligent mobility research for robotic locomotion in complex terrain
NASA Astrophysics Data System (ADS)
Trentini, Michael; Beckman, Blake; Digney, Bruce; Vincent, Isabelle; Ricard, Benoit
2006-05-01
The objective of the Autonomous Intelligent Systems Section of Defence R&D Canada - Suffield is best described by its mission statement, which is "to augment soldiers and combat systems by developing and demonstrating practical, cost effective, autonomous intelligent systems capable of completing military missions in complex operating environments." The mobility requirement for ground-based mobile systems operating in urban settings must increase significantly if robotic technology is to augment human efforts in these roles and environments. The intelligence required for autonomous systems to operate in complex environments demands advances in many fields of robotics. This has resulted in large bodies of research in areas of perception, world representation, and navigation, but the problem of locomotion in complex terrain has largely been ignored. In order to achieve its objective, the Autonomous Intelligent Systems Section is pursuing research that explores the use of intelligent mobility algorithms designed to improve robot mobility. Intelligent mobility uses sensing, control, and learning algorithms to extract measured variables from the world, control vehicle dynamics, and learn by experience. These algorithms seek to exploit available world representations of the environment and the inherent dexterity of the robot to allow the vehicle to interact with its surroundings and produce locomotion in complex terrain. The primary focus of the paper is to present the intelligent mobility research within the framework of the research methodology, plan and direction defined at Defence R&D Canada - Suffield. It discusses the progress and future direction of intelligent mobility research and presents the research tools, topics, and plans to address this critical research gap. This research will create effective intelligence to improve the mobility of ground-based mobile systems operating in urban settings to assist the Canadian Forces in their future urban operations.
Patel, Mohak; Leggett, Susan E; Landauer, Alexander K; Wong, Ian Y; Franck, Christian
2018-04-03
Spatiotemporal tracking of tracer particles or objects of interest can reveal localized behaviors in biological and physical systems. However, existing tracking algorithms are most effective for relatively low numbers of particles that undergo displacements smaller than their typical interparticle separation distance. Here, we demonstrate a single particle tracking algorithm to reconstruct large complex motion fields with large particle numbers, orders of magnitude larger than previously tractably resolvable, thus opening the door for attaining very high Nyquist spatial frequency motion recovery in the images. Our key innovations are feature vectors that encode nearest neighbor positions, a rigorous outlier removal scheme, and an iterative deformation warping scheme. We test this technique for its accuracy and computational efficacy using synthetically and experimentally generated 3D particle images, including non-affine deformation fields in soft materials, complex fluid flows, and cell-generated deformations. We augment this algorithm with additional particle information (e.g., color, size, or shape) to further enhance tracking accuracy for high gradient and large displacement fields. These applications demonstrate that this versatile technique can rapidly track unprecedented numbers of particles to resolve large and complex motion fields in 2D and 3D images, particularly when spatial correlations exist.
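The abstract's central idea of feature vectors that encode nearest-neighbor positions can be illustrated with a toy sketch. This is not the authors' published algorithm (no outlier removal or iterative warping), and all function names and parameters are invented; it assumes a simple rigid shift, under which neighbor-relative positions are preserved and matching by feature distance succeeds even when the displacement exceeds the interparticle spacing:

```python
import numpy as np

def neighbor_features(pts, k=3):
    # Feature vector per particle: relative positions of its k nearest
    # neighbors, ordered by distance (translation-invariant).
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    feats = []
    for i in range(len(pts)):
        idx = np.argsort(d[i])[1:k + 1]  # skip the particle itself
        feats.append((pts[idx] - pts[i]).ravel())
    return np.array(feats)

def match_particles(pts_a, pts_b, k=3):
    # Match each particle in frame A to the frame-B particle with the
    # closest neighbor-feature vector (greedy; no outlier removal here).
    fa, fb = neighbor_features(pts_a, k), neighbor_features(pts_b, k)
    cost = np.linalg.norm(fa[:, None, :] - fb[None, :, :], axis=-1)
    return cost.argmin(axis=1)

rng = np.random.default_rng(0)
frame_a = rng.uniform(0, 10, size=(30, 2))
frame_b = frame_a + np.array([2.5, 0.0])   # shift larger than particle spacing
perm = rng.permutation(30)                 # detections arrive in arbitrary order
matches = match_particles(frame_a, frame_b[perm])
print((perm[matches] == np.arange(30)).mean())  # fraction correctly tracked
```

A plain nearest-position matcher would fail here, since every particle's true partner is 2.5 units away while typical spacing is smaller; the neighbor-relative features disambiguate.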
Combination of large and small basis sets in electronic structure calculations on large systems
NASA Astrophysics Data System (ADS)
Røeggen, Inge; Gao, Bin
2018-04-01
Two basis sets—a large and a small one—are associated with each nucleus of the system. Each atom has its own separate one-electron basis comprising the large basis set of the atom in question and the small basis sets of its partner atoms in the complex. The perturbed atoms in molecules and solids model is at the core of the approach, since it allows for the definition of perturbed atoms in a system. It is argued that this basis set approach should be particularly useful for periodic systems. Test calculations are performed on one-dimensional arrays of H and Li atoms. The ground-state energy per atom in the linear H array is determined versus bond length.
Molecular dynamics simulations of large macromolecular complexes.
Perilla, Juan R; Goh, Boon Chong; Cassidy, C Keith; Liu, Bo; Bernardi, Rafael C; Rudack, Till; Yu, Hang; Wu, Zhe; Schulten, Klaus
2015-04-01
Connecting dynamics to structural data from diverse experimental sources, molecular dynamics simulations permit the exploration of biological phenomena in unparalleled detail. Advances in simulations are moving the atomic resolution descriptions of biological systems into the million-to-billion atom regime, in which numerous cell functions reside. In this opinion, we review the progress, driven by large-scale molecular dynamics simulations, in the study of viruses, ribosomes, bioenergetic systems, and other diverse applications. These examples highlight the utility of molecular dynamics simulations in the critical task of relating atomic detail to the function of supramolecular complexes, a task that cannot be achieved by smaller-scale simulations or existing experimental approaches alone. Copyright © 2015 Elsevier Ltd. All rights reserved.
Tool Support for Parametric Analysis of Large Software Simulation Systems
NASA Technical Reports Server (NTRS)
Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony
2008-01-01
The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
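The n-factor combinatorial variation mentioned above can be sketched with a generic greedy pairwise (n = 2) covering construction. This is a textbook approach, not the actual generator used with the Trick environment, and the parameter names are hypothetical:

```python
from itertools import combinations, product

def pairwise_cases(params):
    # Greedily assemble a small suite covering every 2-factor (pairwise)
    # combination of parameter values at least once.
    names = list(params)
    uncovered = set()
    for (i, a), (j, b) in combinations(enumerate(names), 2):
        for va, vb in product(params[a], params[b]):
            uncovered.add((i, va, j, vb))
    suite = []
    while uncovered:
        best, best_gain = None, -1
        for case in product(*(params[n] for n in names)):
            gain = sum((i, case[i], j, case[j]) in uncovered
                       for i, j in combinations(range(len(names)), 2))
            if gain > best_gain:
                best, best_gain = case, gain
        suite.append(dict(zip(names, best)))
        for i, j in combinations(range(len(names)), 2):
            uncovered.discard((i, best[i], j, best[j]))
    return suite

# Hypothetical simulation parameters, far smaller than a real GN&C model.
params = {"mass": [1, 5], "thrust": ["low", "high"], "mode": ["a", "b", "c"]}
suite = pairwise_cases(params)
print(len(suite), "cases instead of the full", 2 * 2 * 3)
```

Even in this tiny example the suite is smaller than the full Cartesian product; for realistic parameter counts the reduction is what makes envelope exploration tractable.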
Fundamental concepts of structural loading and load relief techniques for the space shuttle
NASA Technical Reports Server (NTRS)
Ryan, R. S.; Mowery, D. K.; Winder, S. W.
1972-01-01
The prediction of flight loads and their potential reduction, using various control system logics for the space shuttle vehicles, is discussed. Some factors not found on previous launch vehicles that increase the complexity are large lifting surfaces, unsymmetrical structure, unsymmetrical aerodynamics, trajectory control system coupling, and large aeroelastic effects. These load-producing factors and load-reducing techniques are analyzed.
Research directions in large scale systems and decentralized control
NASA Technical Reports Server (NTRS)
Tenney, R. R.
1980-01-01
Control theory provides a well established framework for dealing with automatic decision problems and a set of techniques for automatic decision making which exploit special structure, but it does not deal well with complexity. The potential exists for combining control theoretic and knowledge based concepts into a unified approach. The elements of control theory are diagrammed, including modern control and large scale systems.
NASA Astrophysics Data System (ADS)
Vespignani, Alessandro
From schools of fish and flocks of birds to digital networks and self-organizing biopolymers, our understanding of spontaneously emergent phenomena, self-organization, and critical behavior is in large part due to complex systems science. The complex systems approach is indeed a very powerful conceptual framework for shedding light on the link between the microscopic dynamical evolution of the basic elements of a system and the emergence of macroscopic phenomena, often providing evidence for mathematical principles that go beyond the particulars of the individual system and thus hinting at general modeling principles. By killing the myth of the ant queen and shifting the focus to the dynamical interactions among the elements of a system, complex systems science has ushered in the conceptual understanding of many phenomena at the core of major scientific and social challenges, such as the emergence of consensus, social opinion dynamics, conflict and cooperation, and contagion phenomena. For many years, though, these complex systems approaches to real-world problems often suffered from being oversimplified and not grounded in actual data...
2012-01-11
dynamic behavior, wherein a dissipative dynamical system can deliver only a fraction of its energy to its surroundings and can store only a fraction of the...collection of interacting subsystems. The behavior and properties of the aggregate large-scale system can then be deduced from the behaviors of the...uniqueness is established. This state space formalism of thermodynamics shows that the behavior of heat, as described by the conservation equations of
Systems Integration Challenges for a National Space Launch System
NASA Technical Reports Server (NTRS)
May, Todd A.
2011-01-01
System Integration was refined through the complexity and early failures experienced in rocket flight. System Integration encompasses many different viewpoints of the system development and must ensure consistency in development and operations activities. Human Space Flight tends toward large, complex systems. Understanding the system's operational and use context is the guiding principle for System Integration: (1) sizeable costs can be driven into systems by not fully understanding context; (2) adhering to the system context throughout the system's life cycle is essential to maintaining efficient System Integration. System Integration exists within the System Architecture. Beautiful systems are simple in use and operation; block upgrades facilitate manageable steps in functionality evolution. Effective System Integration requires a stable system concept. Communication is essential to system simplicity.
Clinical quality needs complex adaptive systems and machine learning.
Marsland, Stephen; Buchan, Iain
2004-01-01
The vast increase in clinical data has the potential to bring about large improvements in clinical quality and other aspects of healthcare delivery. However, such benefits do not come without cost. The analysis of such large datasets, particularly where the data may have to be merged from several sources and may be noisy and incomplete, is a challenging task. Furthermore, the introduction of clinical changes is a cyclical task, meaning that the processes under examination operate in an environment that is not static. We suggest that traditional methods of analysis are unsuitable for the task, and identify complexity theory and machine learning as areas that have the potential to facilitate the examination of clinical quality. By its nature the field of complex adaptive systems deals with environments that change because of the interactions that have occurred in the past. We draw parallels between health informatics and bioinformatics, which has already started to successfully use machine learning methods.
Networking at the Protein Society symposium.
McKnight, C James; Cordes, Matthew H J
2005-10-01
From the complex behavior of multicomponent signaling networks to the structures of large protein complexes and aggregates, questions once viewed as daunting are now being tackled fearlessly by protein scientists. The 19th Annual Symposium of the Protein Society in Boston highlighted the maturation of systems biology as applied to proteins.
Toward a theoretical framework for trustworthy cyber sensing
NASA Astrophysics Data System (ADS)
Xu, Shouhuai
2010-04-01
Cyberspace is an indispensable part of the economy and society, but has been "polluted" with many compromised computers that can be abused to launch further attacks against the others. Since it is likely that there always are compromised computers, it is important to be aware of the (dynamic) cyber security-related situation, which is however challenging because cyberspace is an extremely large-scale complex system. Our project aims to investigate a theoretical framework for trustworthy cyber sensing. With the perspective of treating cyberspace as a large-scale complex system, the core question we aim to address is: What would be a competent theoretical (mathematical and algorithmic) framework for designing, analyzing, deploying, managing, and adapting cyber sensor systems so as to provide trustworthy information or input to the higher layer of cyber situation-awareness management, even in the presence of sophisticated malicious attacks against the cyber sensor systems?
Simulation of a Moving Elastic Beam Using Hamilton’s Weak Principle
2006-03-01
versions were limited to two-dimensional systems with open tree configurations (where a cut in any component separates the system in half) [48]. This...whose components experienced large angular rotations (turbomachinery, camshafts, flywheels, etc.). More complex systems required the simultaneous
NASA Technical Reports Server (NTRS)
McGowan, Anna-Maria R.; Daly, Shanna; Baker, Wayne; Papalambros, Panos; Seifert, Colleen
2013-01-01
This study investigates interdisciplinary interactions that take place during the research, development, and early conceptual design phases in the design of large-scale complex engineered systems (LaCES) such as aerospace vehicles. These interactions, which take place throughout a large engineering development organization, become the initial conditions of the systems engineering process that ultimately leads to the development of a viable system. This paper summarizes some of the challenges and opportunities regarding social and organizational issues that emerged from a qualitative study using ethnographic and survey data. The analysis reveals several socio-technical couplings between the engineered system and the organization that creates it. Survey respondents noted the importance of interdisciplinary interactions and their benefits to the engineered system, as well as substantial challenges in interdisciplinary interactions. Noted benefits included enhanced knowledge and problem mitigation; noted obstacles centered on organizational and human dynamics. Findings suggest that addressing the social challenges may be a critical need in enabling interdisciplinary interactions.
Adaptive Systems Engineering: A Medical Paradigm for Practicing Systems Engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. Douglas Hamelin; Ron D. Klingler; Christopher Dieckmann
2011-06-01
From its inception in the defense and aerospace industries, SE has applied holistic, interdisciplinary tools and work-processes to improve the design and management of 'large, complex engineering projects.' The traditional scope of engineering in general embraces the design, development, production, and operation of physical systems, and SE, as originally conceived, falls within that scope. While this 'traditional' view has expanded over the years to embrace wider, more holistic applications, much of the literature and training currently available is still directed almost entirely at addressing the large, complex, NASA- and defense-sized systems wherein the 'ideal' practice of SE provides the cradle-to-grave foundation for system development and deployment. Under such scenarios, systems engineers are viewed as an integral part of the system and project life-cycle from conception to decommissioning. In far less 'ideal' applications, SE principles are equally applicable to a growing number of complex systems and projects that need to be 'rescued' from overwhelming challenges that threaten imminent failure. The medical profession provides a unique analogy for this latter concept and offers a useful paradigm for tailoring our 'practice' of SE to address the unexpected dynamics of applying SE in the real world. In short, we can be much more effective as systems engineers as we change some of the paradigms under which we teach and 'practice' SE.
Detection of timescales in evolving complex systems
Darst, Richard K.; Granell, Clara; Arenas, Alex; Gómez, Sergio; Saramäki, Jari; Fortunato, Santo
2016-01-01
Most complex systems are intrinsically dynamic in nature. The evolution of a dynamic complex system is typically represented as a sequence of snapshots, where each snapshot describes the configuration of the system at a particular instant of time. This is often done using constant intervals, but a better approach would be to define dynamic intervals that match the evolution of the system's configuration. To this end, we propose a method that aims at detecting evolutionary changes in the configuration of a complex system and generates intervals accordingly. We show that evolutionary timescales can be identified by looking for peaks in the similarity between the sets of events on consecutive time intervals of data. Tests on simple toy models reveal that the technique is able to detect evolutionary timescales of time-varying data both when the evolution is smooth and when it changes sharply. This is further corroborated by analyses of several real datasets. Our method is scalable to extremely large datasets and is computationally efficient. This allows a quick, parameter-free detection of multiple timescales in the evolution of a complex system. PMID:28004820
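The similarity-between-consecutive-intervals idea can be sketched in a few lines. Note that this toy version simply flags sharp drops in Jaccard similarity between consecutive event sets, rather than reproducing the authors' actual peak-detection procedure, and the threshold and names are invented:

```python
def jaccard(a, b):
    # Similarity between two event sets.
    return len(a & b) / len(a | b) if a | b else 1.0

def detect_changes(windows):
    # Flag boundaries where similarity between consecutive windows drops
    # well below its mean -- a crude stand-in for timescale detection.
    sims = [jaccard(windows[t], windows[t + 1]) for t in range(len(windows) - 1)]
    mean = sum(sims) / len(sims)
    return [t + 1 for t, s in enumerate(sims) if s < 0.5 * mean], sims

# Toy data: a stable event set that switches abruptly at window 5.
windows = [set(range(0, 10))] * 5 + [set(range(8, 18))] * 5
changes, sims = detect_changes(windows)
print(changes)  # → [5]
```

Within each detected segment the event sets are self-similar, so the segment boundaries play the role of the dynamic intervals described in the abstract.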
3D DOSY-TROSY to determine the translational diffusion coefficient of large protein complexes.
Didenko, Tatiana; Boelens, Rolf; Rüdiger, Stefan G D
2011-01-01
The translational diffusion coefficient is a sensitive parameter to probe conformational changes in proteins and protein-protein interactions. Pulsed-field gradient NMR spectroscopy allows one to measure the translational diffusion with high accuracy. Two-dimensional (2D) heteronuclear NMR spectroscopy combined with diffusion-ordered spectroscopy (DOSY) provides improved resolution and therefore selectivity when compared with a conventional 1D readout. Here, we show that a combination of selective isotope labelling, 2D ¹H-¹³C methyl-TROSY (transverse relaxation-optimised spectroscopy) and DOSY allows one to study diffusion properties of large protein complexes. We propose that a 3D DOSY-heteronuclear multiple quantum coherence (HMQC) pulse sequence, that uses the TROSY effect of the HMQC sequence for ¹³C methyl-labelled proteins, is highly suitable for measuring the diffusion coefficient of large proteins. We used the 20 kDa co-chaperone p23 as model system to test this 3D DOSY-TROSY technique under various conditions. We determined the diffusion coefficient of p23 in viscous solutions, mimicking large complexes of up to 200 kDa. We found the experimental data to be in excellent agreement with theoretical predictions. To demonstrate the use for complex formation, we applied this technique to record the formation of a complex of p23 with the molecular chaperone Hsp90, which is around 200 kDa. We anticipate that 3D DOSY-TROSY will be a useful tool to study conformational changes in large protein complexes.
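In the simplest monoexponential case, extracting a translational diffusion coefficient from pulsed-field-gradient attenuation data reduces to a linear fit of the Stejskal-Tanner relation I(b) = I0 exp(-bD). The sketch below uses synthetic data with hypothetical b-values and a hypothetical D; it illustrates only the fitting step, not the 3D DOSY-TROSY pulse sequence itself:

```python
import numpy as np

# Synthetic monoexponential attenuation: I(b) = I0 * exp(-b * D).
# D_true and the b-value schedule are invented, not taken from the paper.
D_true = 1.1e-10                     # m^2/s, plausible for a small protein
b = np.linspace(0.0, 5e9, 12)        # s/m^2, gradient-dependent b-values
rng = np.random.default_rng(1)
signal = np.exp(-b * D_true) * (1 + 0.01 * rng.standard_normal(b.size))

# Linearize: ln I = ln I0 - b * D, so a degree-1 fit recovers D.
slope, intercept = np.polyfit(b, np.log(signal), 1)
D_fit = -slope
print(f"fitted D = {D_fit:.3e} m^2/s")
```

In viscous solutions mimicking larger complexes, the same fit returns a smaller D, which is exactly the sensitivity the abstract exploits to monitor complex formation.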
NASA Astrophysics Data System (ADS)
Debnath, Lokenath
2010-09-01
This article is essentially devoted to a brief historical introduction to Euler's formula for polyhedra, topology, and the theory of graphs and networks, with many examples from the real world. The celebrated Königsberg seven-bridge problem and some of the basic properties of graphs and networks are included to give some understanding of the macroscopic behaviour of real physical systems. We also mention some important modern applications of graph theory and network problems, from transportation to telecommunications. Graphs and networks are effectively used as powerful tools in industrial, electrical, and civil engineering, and in communication networks in the planning of business and industry. Graph theory and combinatorics can be used to understand the changes that occur in many large and complex scientific, technical, and medical systems. With the advent of fast large computers and the ubiquitous Internet, itself a very large network of computers, large-scale complex optimization problems can be modelled in terms of graphs or networks and then solved by algorithms available in graph theory. Many large and more complex combinatorial problems, which deal with the possible arrangements of situations of various kinds and with computing the number and properties of such arrangements, can be formulated in terms of networks. The Knight's tour problem, Hamilton's tour problem, the problem of magic squares, the Euler Graeco-Latin squares problem, and their modern developments in the twentieth century are also included.
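Euler's resolution of the seven-bridge problem comes down to a parity count: a connected multigraph admits a walk traversing every edge exactly once if and only if it has zero or two odd-degree vertices. A minimal check for the Königsberg graph:

```python
from collections import Counter

# The Königsberg multigraph: four land masses (A-D) joined by seven bridges.
bridges = [("A", "B"), ("A", "B"), ("A", "C"), ("A", "C"),
           ("A", "D"), ("B", "D"), ("C", "D")]

degree = Counter()
for u, v in bridges:
    degree[u] += 1
    degree[v] += 1

# Euler: a connected multigraph has a walk crossing every edge exactly
# once iff it has zero or two odd-degree vertices.
odd = sorted(v for v, d in degree.items() if d % 2)
print(odd)                 # → ['A', 'B', 'C', 'D']
print(len(odd) in (0, 2))  # → False: no such walk exists
```

All four land masses have odd degree, so no walk crosses each bridge exactly once, which is Euler's 1736 conclusion.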
Health conditions and health-policy innovations in Brazil: the way forward.
Victora, Cesar G; Barreto, Mauricio L; do Carmo Leal, Maria; Monteiro, Carlos A; Schmidt, Maria Ines; Paim, Jairnilson; Bastos, Francisco I; Almeida, Celia; Bahia, Ligia; Travassos, Claudia; Reichenheim, Michael; Barros, Fernando C
2011-06-11
Brazil is a large complex country that is undergoing rapid economic, social, and environmental change. In this Series of six articles, we have reported important improvements in health status and life expectancy, which can be ascribed largely to progress in social determinants of health and to implementation of a comprehensive national health system with strong social participation. Many challenges remain, however. Socioeconomic and regional disparities are still unacceptably large, reflecting the fact that much progress is still needed to improve basic living conditions for a large proportion of the population. New health problems arise as a result of urbanisation and social and environmental change, and some old health issues remain unabated. Administration of a complex, decentralised public-health system, in which a large share of services is contracted out to the private sector, together with many private insurance providers, inevitably causes conflict and contradiction. The challenge is ultimately political, and we conclude with a call for action that requires continuous engagement by Brazilian society as a whole in securing the right to health for all Brazilian people. Copyright © 2011 Elsevier Ltd. All rights reserved.
Cx-02 Program, workshop on modeling complex systems
Mossotti, Victor G.; Barragan, Jo Ann; Westergard, Todd D.
2003-01-01
This publication contains the abstracts and program for the workshop on complex systems that was held on November 19-21, 2002, in Reno, Nevada. Complex systems are ubiquitous within the realm of the earth sciences. Geological systems consist of a multiplicity of linked components with nested feedback loops; the dynamics of these systems are non-linear, iterative, multi-scale, and operate far from equilibrium. That notwithstanding, it appears that, with the exception of papers on seismic studies, geology and geophysics work has been disproportionately underrepresented at regional and national meetings on complex systems relative to papers in the life sciences. This is somewhat puzzling because geologists and geophysicists are, in many ways, preadapted to thinking in terms of complex system mechanisms. Geologists and geophysicists think about processes involving large volumes of rock below the sunlit surface of Earth, the accumulated consequence of processes extending hundreds of millions of years into the past. Not only do geologists think in the abstract by virtue of the vast time spans; most of the evidence is also out of sight. A primary goal of this workshop is to begin to bridge the gap between the Earth sciences and life sciences through demonstration of the universality of complex systems science, both philosophically and in model structures.
Reagan, Matthew T.; Moridis, George J.; Seim, Katie S.
2017-03-27
A recent Department of Energy field test on the Alaska North Slope has increased interest in the ability to simulate systems of mixed CO2-CH4 hydrates. However, the physically realistic simulation of mixed-hydrate systems is not yet a fully solved problem. Limited quantitative laboratory data leads to the use of various ab initio, statistical mechanical, or other mathematical representations of mixed-hydrate phase behavior. Few of these methods are suitable for inclusion in reservoir simulations, particularly for systems with large numbers of grid elements, 3D systems, or systems with complex geometric configurations. In this paper, we present a set of fast parametric relationships describing the thermodynamic properties and phase behavior of a mixed methane-carbon dioxide hydrate system. We use well-known, off-the-shelf hydrate physical properties packages to generate a sufficiently large dataset, select the most convenient and efficient mathematical forms, and fit the data to those forms to create a physical properties package suitable for inclusion in the TOUGH+ family of codes. Finally, the mapping of the phase and thermodynamic space reveals the complexity of the mixed-hydrate system and allows understanding of the thermodynamics at a level beyond what much of the existing laboratory data and literature currently offer.
NASA Astrophysics Data System (ADS)
Reagan, Matthew T.; Moridis, George J.; Seim, Katie S.
2017-06-01
A recent Department of Energy field test on the Alaska North Slope has increased interest in the ability to simulate systems of mixed CO2-CH4 hydrates. However, the physically realistic simulation of mixed-hydrate simulation is not yet a fully solved problem. Limited quantitative laboratory data leads to the use of various ab initio, statistical mechanical, or other mathematic representations of mixed-hydrate phase behavior. Few of these methods are suitable for inclusion in reservoir simulations, particularly for systems with large number of grid elements, 3D systems, or systems with complex geometric configurations. In this work, we present a set of fast parametric relationships describing the thermodynamic properties and phase behavior of a mixed methane-carbon dioxide hydrate system. We use well-known, off-the-shelf hydrate physical properties packages to generate a sufficiently large dataset, select the most convenient and efficient mathematical forms, and fit the data to those forms to create a physical properties package suitable for inclusion in the TOUGH+ family of codes. The mapping of the phase and thermodynamic space reveals the complexity of the mixed-hydrate system and allows understanding of the thermodynamics at a level beyond what much of the existing laboratory data and literature currently offer.
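The generate-then-fit workflow described above can be sketched minimally. The Clausius-Clapeyron-style form ln P = a + b/T, the coefficients, and the synthetic "reference package" data are all hypothetical stand-ins for the real off-the-shelf property packages and the forms chosen for TOUGH+:

```python
import numpy as np

# Hypothetical Clausius-Clapeyron-style phase boundary, ln P = a + b / T,
# standing in for output from an off-the-shelf hydrate property package.
a_true, b_true = 38.98, -8533.8          # invented coefficients
T = np.linspace(274.0, 290.0, 20)        # K
lnP_ref = a_true + b_true / T            # "reference package" dataset

# Fit the fast parametric form to the generated data.
b_fit, a_fit = np.polyfit(1.0 / T, lnP_ref, 1)
resid = np.abs(a_fit + b_fit / T - lnP_ref).max()
print(f"a = {a_fit:.2f}, b = {b_fit:.1f}, max residual = {resid:.1e}")
```

Once fitted, evaluating the closed form costs one division and one addition per call, which is what makes such parameterizations viable inside a reservoir simulator with many grid elements.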
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reagan, Matthew T.; Moridis, George J.; Seim, Katie S.
The feasibility and stability of large complex biological networks: a random matrix approach.
Stone, Lewi
2018-05-29
In the 1970s, Robert May demonstrated that complexity creates instability in generic models of ecological networks having random interaction matrices A. Similar random matrix models have since been applied in many disciplines. Central to assessing stability is the "circular law," since it describes the eigenvalue distribution for an important class of random matrices A. However, despite widespread adoption, the "circular law" does not apply for ecological systems in which density-dependence operates (i.e., where a species' growth is determined by its density). Instead one needs to study the far more complicated eigenvalue distribution of the community matrix S = DA, where D is a diagonal matrix of population equilibrium values. Here we obtain this eigenvalue distribution. We show that if the random matrix A is locally stable, the community matrix S = DA will also be locally stable, provided the system is feasible (i.e., all species have positive equilibria, D > 0). This helps explain why, unusually, nearly all feasible systems studied here are locally stable. Large complex systems may thus be even more fragile than May predicted, given the difficulty of assembling a feasible system. It was also found that the degree of stability, or resilience, of a system depended on the minimum equilibrium population.
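The stability test at the heart of this abstract can be sketched numerically. The snippet below is an illustrative reconstruction, not the author's code: it draws a random interaction matrix A with self-regulation on the diagonal, forms the community matrix S = DA with a positive diagonal D of equilibria, and checks local stability via the sign of the largest eigenvalue real part. All parameter values are arbitrary illustrations.

```python
import numpy as np

rng = np.random.default_rng(0)
n, C, sigma = 200, 0.2, 0.02   # species count, connectance, interaction strength (illustrative)

# Random interaction matrix A: off-diagonal entries are nonzero with
# probability C and drawn from N(0, sigma^2); the diagonal is set to -1
# to model self-regulation (density dependence of each species on itself).
A = rng.normal(0.0, sigma, (n, n)) * (rng.random((n, n)) < C)
np.fill_diagonal(A, -1.0)

# Local stability of A: every eigenvalue must have a negative real part.
stable_A = np.max(np.linalg.eigvals(A).real) < 0

# Community matrix S = D A, where the diagonal of D holds positive
# equilibrium abundances (a feasible system: D > 0).
D = np.diag(rng.uniform(0.5, 1.5, n))
S = D @ A
stable_S = np.max(np.linalg.eigvals(S).real) < 0

print(stable_A, stable_S)
```

With interactions this weak both checks pass, consistent with the result quoted above that a stable A plus feasibility yields a stable S.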
Address the Major Societal Challenges
NASA Astrophysics Data System (ADS)
Laubichler, Manfred
In his famous historical account about the origins of molecular biology Gunther Stent introduced a three phase sequence that turns out to be characteristic for many newly emerging paradigms within science. New ideas, according to Stent, follow a sequence of romantic, dogmatic, and academic phases. One can easily see that complex systems science followed this path. The question now is whether we are in an extended academic phase of gradually expanding both theoretical and practical knowledge, or whether we are entering a new transformation of complex systems science that might well bring about a new romantic phase. I would argue that complexity science, indeed, is at the dawn of a new period - let's call it complexity 3.0. The last academic phase has seen the application of complex systems ideas and methods in a variety of different domains. It has been to a large extent business as usual...
A Principled Approach to the Specification of System Architectures for Space Missions
NASA Technical Reports Server (NTRS)
McKelvin, Mark L. Jr.; Castillo, Robert; Bonanne, Kevin; Bonnici, Michael; Cox, Brian; Gibson, Corrina; Leon, Juan P.; Gomez-Mustafa, Jose; Jimenez, Alejandro; Madni, Azad
2015-01-01
Modern space systems are increasing in complexity and scale at an unprecedented pace. Consequently, innovative methods, processes, and tools are needed to cope with the increasing complexity of architecting these systems. A key systems challenge in practice is the ability to scale processes, methods, and tools used to architect complex space systems. Traditionally, the process for specifying space system architectures has largely relied on capturing the system architecture in informal descriptions that are often embedded within loosely coupled design documents and domain expertise. Such informal descriptions often lead to misunderstandings between design teams, ambiguous specifications, difficulty in maintaining consistency as the architecture evolves throughout the system development life cycle, and costly design iterations. Therefore, traditional methods are becoming increasingly inefficient to cope with ever-increasing system complexity. We apply the principles of component-based design and platform-based design to the development of the system architecture for a practical space system to demonstrate feasibility of our approach using SysML. Our results show that we are able to apply a systematic design method to manage system complexity, thus enabling effective data management, semantic coherence and traceability across different levels of abstraction in the design chain. Just as important, our approach enables interoperability among heterogeneous tools in a concurrent engineering model based design environment.
Fragmentation of the CRISPR-Cas Type I-B signature protein Cas8b.
Richter, Hagen; Rompf, Judith; Wiegel, Julia; Rau, Kristina; Randau, Lennart
2017-11-01
CRISPR arrays are transcribed into long precursor RNA species, which are further processed into mature CRISPR RNAs (crRNAs). Cas proteins utilize these crRNAs, which contain spacer sequences that can be derived from mobile genetic elements, to mediate immunity during a reoccurring virus infection. Type I CRISPR-Cas systems are defined by the presence of different Cascade interference complexes containing large and small subunits that play major roles during target DNA selection. Here, we produce the protein and crRNA components of the Type I-B CRISPR-Cas complex of Clostridium thermocellum and Methanococcus maripaludis. The C. thermocellum Cascade complexes were reconstituted and analyzed via size-exclusion chromatography. Activity of the heterologous M. maripaludis CRISPR-Cas system was followed using phage lambda plaque assays. The reconstituted Type I-B Cascade complex contains Cas7, Cas5, Cas6b and the large subunit Cas8b. Cas6b can be omitted from the reconstitution protocol. The large subunit Cas8b was found to be represented by two tightly associated protein fragments, and a small C-terminal Cas8b segment was identified in recombinant complexes and C. thermocellum cell lysate. Production of Cas8b generates a small C-terminal fragment, which is suggested to fulfill the role of the missing small subunit. A heterologous, synthetic M. maripaludis Type I-B system is active in E. coli against phage lambda, highlighting a potential for genome editing using endogenous Type I-B CRISPR-Cas machineries. This article is part of a Special Issue entitled "Biochemistry of Synthetic Biology - Recent Developments"; Guest Editors: Dr. Ilka Heinemann and Dr. Patrick O'Donoghue. Copyright © 2017 Elsevier B.V. All rights reserved.
Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas
2016-01-01
Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.
ERIC Educational Resources Information Center
Cor, Ken; Alves, Cecilia; Gierl, Mark J.
2008-01-01
This review describes and evaluates a software add-in created by Frontline Systems, Inc., that can be used with Microsoft Excel 2007 to solve large, complex test assembly problems. The combination of Microsoft Excel 2007 with the Frontline Systems Premium Solver Platform is significant because Microsoft Excel is the most commonly used spreadsheet…
NASA Technical Reports Server (NTRS)
Mckay, Charles W.; Feagin, Terry; Bishop, Peter C.; Hallum, Cecil R.; Freedman, Glenn B.
1987-01-01
The principal focus of one of the RICIS (Research Institute for Computing and Information Systems) components is computer systems and software engineering in-the-large of the lifecycle of large, complex, distributed systems which: (1) evolve incrementally over a long time; (2) contain non-stop components; and (3) must simultaneously satisfy a prioritized balance of mission- and safety-critical requirements at run time. This focus is extremely important because of the contribution of the scaling-direction problem to the current software crisis. The Computer Systems and Software Engineering (CSSE) component addresses the lifecycle issues of three environments: host, integration, and target.
Impacts of large dams on the complexity of suspended sediment dynamics in the Yangtze River
NASA Astrophysics Data System (ADS)
Wang, Yuankun; Rhoads, Bruce L.; Wang, Dong; Wu, Jichun; Zhang, Xiao
2018-03-01
The Yangtze River is one of the largest and most important rivers in the world. Over the past several decades, the natural sediment regime of the Yangtze River has been altered by the construction of dams. This paper uses multi-scale entropy analysis to ascertain the impacts of large dams on the complexity of high-frequency suspended sediment dynamics in the Yangtze River system, especially after impoundment of the Three Gorges Dam (TGD). In this study, the complexity of sediment dynamics is quantified by framing it within the context of entropy analysis of time series. Data on daily sediment loads for four stations located in the mainstem are analyzed for the past 60 years. The results indicate that dam construction has reduced the complexity of short-term (1-30 days) variation in sediment dynamics near the structures, but that complexity has actually increased farther downstream. This spatial pattern seems to reflect a filtering effect of the dams on the temporal pattern of sediment loads as well as decreased longitudinal connectivity of sediment transfer through the river system, resulting in downstream enhancement of the influence of local sediment inputs by tributaries on sediment dynamics. The TGD has had a substantial impact on the complexity of sediment series in the mainstem of the Yangtze River, especially after it became fully operational. This enhanced impact is attributed to the high trapping efficiency of this dam and its associated large reservoir. The sediment dynamics "signal" becomes more spatially variable after dam construction. This study demonstrates the spatial influence of dams on the high-frequency temporal complexity of sediment regimes and provides valuable information that can be used to guide environmental conservation of the Yangtze River.
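Multi-scale entropy as used above combines coarse-graining of the series with sample entropy at each scale. The sketch below follows the standard textbook definitions; it is not the authors' implementation, and the white-noise input merely stands in for a daily sediment-load series.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy of a 1-D series: -log(A/B), where B counts pairs of
    m-point templates within Chebyshev tolerance r and A counts pairs of
    (m+1)-point templates."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    def pair_count(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
        return (np.sum(d <= r) - len(templates)) / 2  # exclude self-matches
    b, a = pair_count(m), pair_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, scales=range(1, 6)):
    """Coarse-grain the series at each scale (non-overlapping means), then
    compute the sample entropy of each coarse-grained series."""
    x = np.asarray(x, dtype=float)
    r = 0.2 * x.std()  # tolerance fixed from the original series
    out = []
    for s in scales:
        cg = x[:len(x) // s * s].reshape(-1, s).mean(axis=1)
        out.append(sample_entropy(cg, m=2, r=r))
    return out

rng = np.random.default_rng(1)
mse = multiscale_entropy(rng.normal(size=1000))
print(mse)
```

For uncorrelated noise the entropy typically falls with scale; a regulated (dam-filtered) series would show a different profile, which is the kind of contrast the study exploits.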
Software Reliability Issues Concerning Large and Safety Critical Software Systems
NASA Technical Reports Server (NTRS)
Kamel, Khaled; Brown, Barbara
1996-01-01
This research was undertaken to provide NASA with a survey of state-of-the-art techniques used in industry and academia to provide safe, reliable, and maintainable software to drive large systems. Such systems must match the complexity and strict safety requirements of NASA's shuttle system. In particular, the Launch Processing System (LPS) is being considered for replacement. The LPS is responsible for monitoring and commanding the shuttle during test, repair, and launch phases. NASA built this system in the 1970s using mostly hardware techniques to provide for increased reliability, but it did so often using custom-built equipment, which has not been able to keep up with current technologies. This report surveys the major techniques used in industry and academia to ensure reliability in large and critical computer systems.
Auditory Processing of Complex Sounds Across Frequency Channels.
1992-06-26
towards gaining an understanding of how the auditory system processes complex sounds. "The results of binaural psychophysical experiments in human subjects... suggest (1) that spectrally synthetic binaural processing is the rule when the number of components in the tone complex is relatively small (less than... 10) and there are no dynamic binaural cues to aid segregation of the target from the background, and (2) that waveforms having large effective
ERIC Educational Resources Information Center
Veaner, Allen B.
Project BALLOTS is a large-scale library automation development project of the Stanford University Libraries which has demonstrated the feasibility of conducting on-line interactive searches of complex bibliographic files, with a large number of users working simultaneously in the same or different files. This report documents the continuing…
Systematic methods for defining coarse-grained maps in large biomolecules.
Zhang, Zhiyong
2015-01-01
Large biomolecules are involved in many important biological processes. It would be difficult to use large-scale atomistic molecular dynamics (MD) simulations to study the functional motions of these systems because of the computational expense. Therefore various coarse-grained (CG) approaches have attracted rapidly growing interest, which enable simulations of large biomolecules over longer effective timescales than all-atom MD simulations. The first issue in CG modeling is to construct CG maps from atomic structures. In this chapter, we review the recent development of a novel and systematic method for constructing CG representations of arbitrarily complex biomolecules, in order to preserve large-scale and functionally relevant essential dynamics (ED) at the CG level. In this ED-CG scheme, the essential dynamics can be characterized by principal component analysis (PCA) on a structural ensemble, or elastic network model (ENM) of a single atomic structure. Validation and applications of the method cover various biological systems, such as multi-domain proteins, protein complexes, and even biomolecular machines. The results demonstrate that the ED-CG method may serve as a very useful tool for identifying functional dynamics of large biomolecules at the CG level.
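The ED-CG idea of characterizing essential dynamics by principal component analysis (PCA) on a structural ensemble can be made concrete with a toy ensemble. The snippet below is a generic PCA sketch under assumed toy data, not the published ED-CG code: one dominant collective mode plus noise, with PCA recovering that mode as the leading component.

```python
import numpy as np

# Toy ensemble: F frames of N "atoms" in 3-D whose fluctuations are dominated
# by a single collective mode plus small isotropic noise.
rng = np.random.default_rng(2)
F, N = 500, 30
mode = rng.normal(size=(N, 3))                 # fixed collective direction
amp = rng.normal(size=F)                       # its amplitude in each frame
ensemble = amp[:, None, None] * mode + 0.1 * rng.normal(size=(F, N, 3))

# PCA on the flattened, mean-centered coordinates: the leading principal
# components approximate the essential dynamics a CG map should preserve.
X = ensemble.reshape(F, -1)
X -= X.mean(axis=0)
cov = X.T @ X / (F - 1)
evals = np.linalg.eigvalsh(cov)[::-1]          # eigenvalues, descending

# Fraction of total variance captured by the first principal component.
frac = evals[0] / evals.sum()
print(frac)
```

In a real ED-CG workflow the CG sites would then be chosen so that the CG model reproduces the motion along these leading components.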
Fuel cell on-site integrated energy system parametric analysis of a residential complex
NASA Technical Reports Server (NTRS)
Simons, S. N.
1977-01-01
A parametric energy-use analysis was performed for a large apartment complex served by a fuel cell on-site integrated energy system (OS/IES). The variables parameterized include operating characteristics for four phosphoric acid fuel cells, eight OS/IES energy recovery systems, and four climatic locations. The annual fuel consumption for selected parametric combinations is presented, and a breakeven economic analysis is given for one parametric combination. The results show that fuel cell electrical efficiency and system component choice have the greatest effect on annual fuel consumption; fuel cell thermal efficiency and geographic location have less of an effect.
NASA Astrophysics Data System (ADS)
Steinberg, Marc
2011-06-01
This paper presents a selective survey of theoretical and experimental progress in the development of biologically inspired approaches for complex surveillance and reconnaissance problems with multiple, heterogeneous autonomous systems. The focus is on approaches that may address ISR problems that can quickly become mathematically intractable or otherwise impractical to implement using traditional optimization techniques as the size and complexity of the problem increase. These problems require dealing with complex spatiotemporal objectives and constraints at a variety of levels from motion planning to task allocation. There is also a need to ensure solutions are reliable and robust to uncertainty and communications limitations. First, the paper will provide a short introduction to the current state of relevant biological research as it relates to collective animal behavior. Second, the paper will describe research on largely decentralized, reactive, or swarm approaches that have been inspired by biological phenomena such as schools of fish, flocks of birds, ant colonies, and insect swarms. Next, the paper will discuss approaches towards more complex organizational and cooperative mechanisms in team and coalition behaviors in order to provide mission coverage of large, complex areas. Relevant team behavior may be derived from recent advances in the understanding of the social and cooperative behaviors used for collaboration by tens of animals with higher-level cognitive abilities such as mammals and birds. Finally, the paper will briefly discuss challenges involved in user interaction with these types of systems.
Forward design of a complex enzyme cascade reaction
Hold, Christoph; Billerbeck, Sonja; Panke, Sven
2016-01-01
Enzymatic reaction networks are unique in that one can operate a large number of reactions under the same set of conditions concomitantly in one pot, but the nonlinear kinetics of the enzymes and the resulting system complexity have so far defeated rational design processes for the construction of such complex cascade reactions. Here we demonstrate the forward design of an in vitro 10-membered system using enzymes from highly regulated biological processes such as glycolysis. For this, we adapt the characterization of the biochemical system to the needs of classical engineering systems theory: we combine online mass spectrometry and continuous system operation to apply standard system theory input functions and to use the detailed dynamic system responses to parameterize a model of sufficient quality for forward design. This allows the facile optimization of a 10-enzyme cascade reaction for fine chemical production purposes. PMID:27677244
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chassin, David P.; Posse, Christian; Malard, Joel M.
2004-08-01
Physical analogs have shown considerable promise for understanding the behavior of complex adaptive systems, including macroeconomics, biological systems, social networks, and electric power markets. Many of today's most challenging technical and policy questions can be reduced to a distributed economic control problem. Indeed, economically based control of large-scale systems is founded on the conjecture that price-based regulation (e.g., auctions, markets) results in an optimal allocation of resources and emergent optimal system control. This paper explores the state of the art in the use of physical analogs for understanding the behavior of some econophysical systems and for deriving stable and robust control strategies for them. In particular, we review and discuss applications of some analytic methods based on the thermodynamic metaphor, according to which the interplay between system entropy and conservation laws gives rise to intuitive and governing global properties of complex systems that cannot otherwise be understood.
Visualizing Parallel Computer System Performance
NASA Technical Reports Server (NTRS)
Malony, Allen D.; Reed, Daniel A.
1988-01-01
Parallel computer systems are among the most complex of man's creations, making satisfactory performance characterization difficult. Despite this complexity, there are strong, indeed almost irresistible, incentives to quantify parallel system performance using a single metric. The fallacy lies in succumbing to such temptations. A complete performance characterization requires not only an analysis of the system's constituent levels; it also requires both static and dynamic characterizations. Static or average behavior analysis may mask transients that dramatically alter system performance. Although the human visual system is remarkably adept at interpreting and identifying anomalies in false-color data, the importance of dynamic, visual scientific data presentation has only recently been recognized. Large, complex parallel systems pose equally vexing performance interpretation problems. Data from hardware and software performance monitors must be presented in ways that emphasize important events while eliding irrelevant details. Design approaches and tools for performance visualization are the subject of this paper.
Large-Scale medical image analytics: Recent methodologies, applications and Future directions.
Zhang, Shaoting; Metaxas, Dimitris
2016-10-01
Despite the ever-increasing amount and complexity of annotated medical image data, the development of large-scale medical image analysis algorithms has not kept pace with the need for methods that bridge the semantic gap between images and diagnoses. The goal of this position paper is to discuss and explore innovative and large-scale data science techniques in medical image analytics, which will benefit clinical decision-making and facilitate efficient medical data management. In particular, we advocate that the scale of image retrieval systems be increased significantly, to the point at which interactive systems can be effective for knowledge discovery in potentially large databases of medical images. For clinical relevance, such systems should return results in real time, incorporate expert feedback, and be able to cope with the size, quality, and variety of the medical images and their associated metadata for a particular domain. The design, development, and testing of such a framework can significantly impact interactive mining in medical image databases that are growing rapidly in size and complexity, and can enable novel methods of analysis at much larger scales in an efficient, integrated fashion. Copyright © 2016. Published by Elsevier B.V.
Acceleration techniques for dependability simulation. M.S. Thesis
NASA Technical Reports Server (NTRS)
Barnette, James David
1995-01-01
As computer systems increase in complexity, the need to project system performance from the earliest design and development stages increases. We have to employ simulation for detailed dependability studies of large systems. However, as the complexity of the simulation model increases, the time required to obtain statistically significant results also increases. This paper discusses an approach that is application independent and can be readily applied to any process-based simulation model. Topics include background on classical discrete event simulation and techniques for random variate generation and statistics gathering to support simulation.
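Random variate generation, one of the supporting techniques the abstract mentions, is most commonly done by inverse-transform sampling. The sketch below is a standard textbook method for exponential inter-event times (ubiquitous in discrete event simulation), not code from the thesis:

```python
import math
import random

def exponential_variate(rate, u=None):
    """Inverse-transform sampling: if U ~ Uniform(0, 1), then
    -ln(1 - U) / rate is distributed Exponential(rate)."""
    if u is None:
        u = random.random()
    return -math.log(1.0 - u) / rate

# Sanity check: the sample mean converges to 1/rate.
random.seed(42)
rate = 2.0
samples = [exponential_variate(rate) for _ in range(100_000)]
mean = sum(samples) / len(samples)
print(mean)  # close to 1/rate = 0.5
```

In a process-based simulation these variates would drive the inter-arrival and service-time delays of each simulated process.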
Dshell++: A Component Based, Reusable Space System Simulation Framework
NASA Technical Reports Server (NTRS)
Lim, Christopher S.; Jain, Abhinandan
2009-01-01
This paper describes the multi-mission Dshell++ simulation framework for high fidelity, physics-based simulation of spacecraft, robotic manipulation and mobility systems. Dshell++ is a C++/Python library which uses modern script-driven object-oriented techniques to allow component reuse and a dynamic run-time interface for complex, high-fidelity simulation of spacecraft and robotic systems. The goal of the Dshell++ architecture is to manage the inherent complexity of physics-based simulations while supporting component model reuse across missions. The framework provides several features that support a large degree of simulation configurability and usability.
Method of fuzzy inference for one class of MISO-structure systems with non-singleton inputs
NASA Astrophysics Data System (ADS)
Sinuk, V. G.; Panchenko, M. V.
2018-03-01
In fuzzy modeling, the inputs of the simulated system can receive both crisp (singleton) and non-singleton fuzzy values. The computational complexity of fuzzy inference with non-singleton fuzzy inputs is exponential. This paper describes a new method of inference, based on the theorem of decomposition of a multidimensional fuzzy implication and a fuzzy truth value. This method handles fuzzy inputs with polynomial complexity, which makes it possible to use it for modeling large-dimensional MISO-structure systems.
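The singleton vs. non-singleton distinction can be made concrete: with a non-singleton input, the firing degree of a rule is the sup-min composition of the input fuzzy set with the antecedent, rather than a pointwise membership evaluation. The snippet below is a generic one-dimensional illustration; the Gaussian membership functions and numeric values are assumptions for the sketch, not taken from the paper.

```python
import numpy as np

# Discretized universe of discourse and Gaussian membership functions.
u = np.linspace(0.0, 10.0, 1001)

def gauss(c, s):
    return np.exp(-0.5 * ((u - c) / s) ** 2)

A = gauss(6.0, 1.0)    # rule antecedent: "about 6"
Xp = gauss(5.0, 0.5)   # non-singleton input: an uncertain reading "about 5"

# Singleton case: the firing degree is A evaluated at the crisp input x0 = 5.
singleton_degree = A[np.argmin(np.abs(u - 5.0))]

# Non-singleton case: sup over u of min(X'(u), A(u)).
nonsingleton_degree = np.max(np.minimum(Xp, A))

print(singleton_degree, nonsingleton_degree)
```

Input uncertainty raises the firing degree here (about 0.61 versus about 0.80), and taking this sup over every combination of fuzzy inputs is what makes naive multi-input non-singleton inference exponential in the number of inputs.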
Identification of hybrid node and link communities in complex networks
He, Dongxiao; Jin, Di; Chen, Zheng; Zhang, Weixiong
2015-01-01
Identifying communities in complex networks is an effective means for analyzing complex systems, with applications in diverse areas such as social science, engineering, biology and medicine. Finding communities of nodes and finding communities of links are two popular schemes for network analysis. These schemes, however, have inherent drawbacks and are inadequate to capture complex organizational structures in real networks. We introduce a new scheme and an effective approach for identifying complex mixture structures of node and link communities, called hybrid node-link communities. A central piece of our approach is a probabilistic model that accommodates node, link and hybrid node-link communities. Our extensive experiments on various real-world networks, including a large protein-protein interaction network and a large network of semantically associated words, illustrated that the scheme for hybrid communities is superior in revealing network characteristics. Moreover, the new approach outperformed the existing methods for finding node or link communities separately. PMID:25728010
Identification of hybrid node and link communities in complex networks.
He, Dongxiao; Jin, Di; Chen, Zheng; Zhang, Weixiong
2015-03-02
Identification of hybrid node and link communities in complex networks
NASA Astrophysics Data System (ADS)
He, Dongxiao; Jin, Di; Chen, Zheng; Zhang, Weixiong
2015-03-01
Zhang, Xiulan; Bloom, Gerald; Xu, Xiaoxin; Chen, Lin; Liang, Xiaoyun; Wolcott, Sara J
2014-08-26
This paper explores the evolution of schemes for rural finance in China as a case study of the long and complex process of health system development. It argues that the evolution of these schemes has been the outcome of the response of a large number of agents to a rapidly changing context and of efforts by the government to influence this adaptation process and achieve public health goals. The study draws on several sources of data, including a review of official policy documents and academic papers and in-depth interviews with key policy actors at the national level and at a sample of localities. The study identifies three major transition points associated with changes in broad development strategy and demonstrates how the adaptation of large numbers of actors to these contextual changes had a major impact on the performance of the health system. Further, it documents how the Ministry of Health viewed its role as both an advocate for the interests of health facilities and health workers and as the agency responsible for ensuring that government health system objectives were met. It is argued that a major reason for the resilience of the health system and its ability to adapt to rapid economic and institutional change was the ability of the Ministry to provide overall strategic leadership. Additionally, it postulates that a number of interest groups have emerged, which now also seek to influence the pathway of health system development. This history illustrates the complex and political nature of the management of health system development and reform. The paper concludes that governments will need to increase their capacity to analyze the health sector as a complex system and to manage change processes.
Implementing Parquet equations using HPX
NASA Astrophysics Data System (ADS)
Kellar, Samuel; Wagle, Bibek; Yang, Shuxiang; Tam, Ka-Ming; Kaiser, Hartmut; Moreno, Juana; Jarrell, Mark
A new C++ runtime system (HPX) enables simulations of complex systems to run more efficiently on parallel and heterogeneous systems. This increased efficiency allows for solutions to larger simulations of the parquet approximation for a system with impurities. The relevance of the parquet equations depends upon the ability to solve systems which require long runs and large amounts of memory. These limitations, in addition to numerical complications arising from the stability of the solutions, necessitate running on large distributed systems. As computational resources trend towards the exascale and the limitations arising from them vanish, the efficiency of large-scale simulations becomes a focus. HPX facilitates efficient simulations through intelligent overlapping of computation and communication. Simulations such as the parquet equations, which require the transfer of large amounts of data, should benefit from HPX implementations. Supported by the NSF EPSCoR Cooperative Agreement No. EPS-1003897 with additional support from the Louisiana Board of Regents.
Interoperable Acquisition for Systems of Systems: The Challenges
2006-09-01
Interoperable Acquisition for Systems of Systems: The Challenges. James D. Smith II and D. Mike Phillips, September 2006, Technical Note CMU/SEI-2006-TN-034, Software Engineering Institute. Large, complex systems development has always been challenging, even when the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yong-Qiang, E-mail: chenjzxy@126.com; Tian, Yuan
2017-03-15
Three Pb(II) complexes, ([Pb₃(BOABA)₂(H₂O)]·H₂O)ₙ (1), ([Pb₄(BOABA)₂(μ₄-O)(H₂O)₂]·H₂O)ₙ (2), and [Pb₃(BOABA)₂(H₂O)]ₙ (3) (H₃BOABA = 3,5-bis-oxyacetate-benzoic acid), were obtained under the same reaction systems at different temperatures. Complexes 1 and 2 are two-dimensional (2D) networks based on Pb-BOABA chains and Pb₄(μ₄-O)(COO)₆ SBUs, respectively. Complex 3, obtained by increasing the reaction temperature, presents an interesting three-dimensional (3D) framework. The structural transition of the crystallization products is largely dependent on the reaction temperature. Moreover, the fluorescence properties of complexes 1–3 have been investigated. Graphical abstract: Three Pb(II) coordination polymers were obtained under the same reaction systems at different temperatures. Complexes 1 and 2 are both 2D networks; 3 presents a 3D framework based on Pb-O-C rod SBUs. The 2D-to-3D structural transition between the three complexes was achieved by temperature control. Highlights: • Three Pb(II) complexes were obtained under the same reaction systems at different temperatures. • The structural transition of the crystallization products is largely dependent on the reaction temperature. • Luminescence studies reveal that the three complexes exhibit yellow fluorescence emission, which might make them good candidates for photoluminescent materials.
Lawrenson, John; Eyskens, Benedicte; Vlasselaers, Dirk; Gewillig, Marc
2003-08-01
In all patients undergoing cardiac surgery, the effective delivery of oxygen to the tissues is of paramount importance. In the patient with relatively normal cardiac structures, the pulmonary and systemic circulations are relatively independent of each other. In the patient with a functional single ventricle, the pulmonary and systemic circulations are dependent on the same pump. As a consequence of this interdependency, the haemodynamic changes following complex palliative procedures, such as the Norwood operation, can be difficult to understand. Comparison of the newly created surgical connections to a simple set of direct current electrical circuits may help the practitioner to successfully care for the patient. In patients undergoing complex palliations, the pulmonary and systemic circulations can be compared to two circuits in parallel. Manipulations of variables, such as resistance or flow, in one circuit, can profoundly affect the performance of the other circuit. A large pulmonary flow might result in a large increase in the saturation of haemoglobin with oxygen returning to the heart via the pulmonary veins at the expense of a decreased systemic flow. Accurate balancing of these parallel circulations requires an appreciation of all interventions that can affect individual components of both circulations.
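The parallel-circuit analogy above can be sketched in a few lines. The split of a fixed pump output between two parallel resistances follows the Ohm's-law rule that flow divides inversely to resistance; the numbers below are hypothetical teaching values, not clinical data.

```python
def parallel_flow_split(total_flow, r_pulmonary, r_systemic):
    """Split a fixed pump output between two parallel resistances.

    For parallel branches sharing one driving pressure, flow divides
    inversely to resistance: Qp / Qs = Rs / Rp (direct-current analogy).
    """
    total_r = r_pulmonary + r_systemic
    q_pulmonary = total_flow * r_systemic / total_r
    q_systemic = total_flow * r_pulmonary / total_r
    return q_pulmonary, q_systemic

# Balanced circulation: equal resistances give a 1:1 split.
qp, qs = parallel_flow_split(4.0, 2.0, 2.0)
# Halving pulmonary resistance raises pulmonary flow at the expense
# of systemic flow, the "stealing" effect described in the abstract.
qp2, qs2 = parallel_flow_split(4.0, 1.0, 2.0)
```

With equal resistances the 4.0 units of output split evenly; after halving the pulmonary resistance, pulmonary flow rises and systemic flow falls while the total stays fixed.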
Evaluating Action Learning: A Critical Realist Complex Network Theory Approach
ERIC Educational Resources Information Center
Burgoyne, John G.
2010-01-01
This largely theoretical paper will argue the case for the usefulness of applying network and complex adaptive systems theory to an understanding of action learning and the challenge of evaluating it. This approach, it will be argued, is particularly helpful in the context of improving capability in dealing with wicked problems spread around…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, Xiexiaomen; Tutuncu, Azra; Eustes, Alfred
Enhanced Geothermal Systems (EGS) could potentially use technological advancements in coupled implementation of horizontal drilling and multistage hydraulic fracturing techniques in tight oil and shale gas reservoirs, along with improvements in reservoir simulation techniques, to design and create EGS reservoirs. In this study, a commercial hydraulic fracture simulation package, Mangrove by Schlumberger, was used in an EGS model with largely distributed pre-existing natural fractures to model fracture propagation during the creation of a complex fracture network. The main goal of this study is to investigate optimum treatment parameters in creating multiple large, planar fractures to hydraulically connect a horizontal injection well and a horizontal production well that are 10,000 ft. deep and spaced 500 ft. apart from each other. A matrix of simulations for this study was carried out to determine the influence of reservoir and treatment parameters on preventing (or aiding) the creation of large planar fractures. The reservoir parameters investigated during the matrix simulations include the in-situ stress state and properties of the natural fracture set, such as the primary and secondary fracture orientation, average fracture length, and average fracture spacing. The treatment parameters investigated during the simulations were fluid viscosity, proppant concentration, pump rate, and pump volume. A final simulation with optimized design parameters was performed. The optimized design simulation indicated that high fluid viscosity, high proppant concentration, and large pump volume and pump rate tend to minimize the complexity of the created fracture network. Additionally, a reservoir with 'friendly' formation characteristics, such as large stress anisotropy, natural fracture sets parallel to the maximum horizontal principal stress (SHmax), and large natural fracture spacing, also promotes the creation of large planar fractures while minimizing fracture complexity.
NASA Technical Reports Server (NTRS)
Caines, P. E.
1999-01-01
The work in this research project has been focused on the construction of a hierarchical hybrid control theory which is applicable to flight management systems. The motivation and underlying philosophical position for this work has been that the scale, inherent complexity and the large number of agents (aircraft) involved in an air traffic system imply that a hierarchical modelling and control methodology is required for its management and real time control. In the current work the complex discrete or continuous state space of a system with a small number of agents is aggregated in such a way that discrete (finite state machine or supervisory automaton) controlled dynamics are abstracted from the system's behaviour. High level control may then be either directly applied at this abstracted level, or, if this is in itself of significant complexity, further layers of abstractions may be created to produce a system with an acceptable degree of complexity at each level. By the nature of this construction, high level commands are necessarily realizable at lower levels in the system.
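The aggregation step described above — abstracting finite-state controlled dynamics from a larger state space — can be sketched as a quotient construction. The four-state machine, the partition, and all names here are hypothetical illustrations, not artifacts of the project.

```python
def quotient_machine(transitions, partition):
    """Aggregate a finite-state machine over a partition of its states.

    Each concrete state maps to the index of its block, and every concrete
    transition induces a block-to-block transition. Note that aggregation
    can introduce nondeterminism at the abstract level, which is one reason
    further layers of abstraction or supervisory control may be needed.
    """
    block_of = {s: i for i, block in enumerate(partition) for s in block}
    return {(block_of[s], action, block_of[t])
            for (s, action), t in transitions.items()}

# Hypothetical 4-state cycle aggregated into two macro-states.
trans = {("s0", "go"): "s1", ("s1", "go"): "s2",
         ("s2", "go"): "s3", ("s3", "go"): "s0"}
blocks = [{"s0", "s1"}, {"s2", "s3"}]
abstract = quotient_machine(trans, blocks)
```

High-level commands issued against the two macro-states are realizable at the concrete level because every abstract transition is induced by at least one concrete one.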
An Efficient Model-based Diagnosis Engine for Hybrid Systems Using Structural Model Decomposition
NASA Technical Reports Server (NTRS)
Bregon, Anibal; Narasimhan, Sriram; Roychoudhury, Indranil; Daigle, Matthew; Pulido, Belarmino
2013-01-01
Complex hybrid systems are present in a large range of engineering applications, like mechanical systems, electrical circuits, or embedded computation systems. The behavior of these systems is made up of continuous and discrete event dynamics that increase the difficulties for accurate and timely online fault diagnosis. The Hybrid Diagnosis Engine (HyDE) offers flexibility to the diagnosis application designer to choose the modeling paradigm and the reasoning algorithms. The HyDE architecture supports the use of multiple modeling paradigms at the component and system level. However, HyDE faces some problems regarding performance in terms of complexity and time. Our focus in this paper is on developing efficient model-based methodologies for online fault diagnosis in complex hybrid systems. To do this, we propose a diagnosis framework where structural model decomposition is integrated within the HyDE diagnosis framework to reduce the computational complexity associated with the fault diagnosis of hybrid systems. As a case study, we apply our approach to a diagnostic testbed, the Advanced Diagnostics and Prognostics Testbed (ADAPT), using real data.
A Design Rationale Capture Using REMAP/MM
1994-06-01
company-wide down-sizing, the power company has determined that an automated service order processing system is the most economical solution. This new... service order processing system for a large power company can easily be modelled. A system of this complexity would typically require three to five years
The FHWA 2015 R&T story : research and innovative solutions for the nation's highway challenges.
DOT National Transportation Integrated Search
2016-01-01
The U.S. has built one of the world's greatest transportation systems. With more than 600,000 bridges and 8.5 million lane miles, the system is large, complex, and aging. U.S. transportation agencies are challenged to make the system more efficient...
A continuum theory for multicomponent chromatography modeling.
Pfister, David; Morbidelli, Massimo; Nicoud, Roger-Marc
2016-05-13
A continuum theory is proposed for modeling multicomponent chromatographic systems under linear conditions. The model is based on the description of complex mixtures, possibly involving tens or hundreds of solutes, by a continuum. The present approach is shown to be very efficient when dealing with a large number of similar components with close elution behaviors whose individual analytical characterization is impossible. Moreover, approximating complex mixtures by continuous distributions of solutes reduces the required number of model parameters to the few specific to the characterization of the selected continuous distributions. Therefore, within the framework of the continuum theory, the simulation of large multicomponent systems is simplified and the computational effectiveness of the chromatographic model is dramatically improved. Copyright © 2016 Elsevier B.V. All rights reserved.
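A minimal numerical sketch of the continuum idea, under simplifying assumptions that are mine rather than the paper's (equal-width Gaussian elution peaks and a Gaussian band of retention times): hundreds of explicitly enumerated solutes collapse into a two-parameter continuous distribution, and the two descriptions of the chromatogram agree closely.

```python
import math
from statistics import NormalDist

def chromatogram_discrete(t, retention_times, sigma=0.05):
    # Discrete view: sum one Gaussian elution peak per individual solute.
    return sum(NormalDist(tr, sigma).pdf(t) for tr in retention_times)

def chromatogram_continuum(t, n_solutes, mu, s, sigma=0.05):
    # Continuum view: a Gaussian band of retention times convolved with the
    # Gaussian peak shape is again Gaussian, with variance s**2 + sigma**2,
    # so only two distribution parameters (mu, s) are needed.
    return n_solutes * NormalDist(mu, math.hypot(s, sigma)).pdf(t)

n, mu, s = 400, 5.0, 0.3
# Deterministic stand-in for a complex mixture: 400 solutes placed at
# evenly spaced quantiles of the retention-time distribution N(mu, s).
rts = [NormalDist(mu, s).inv_cdf((i + 0.5) / n) for i in range(n)]
d = chromatogram_discrete(5.1, rts)
c = chromatogram_continuum(5.1, n, mu, s)
```

The continuum evaluation needs only (n, mu, s) instead of 400 individual retention times, which is the parameter reduction the theory trades on.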
NASA Astrophysics Data System (ADS)
Broccard, Frédéric D.; Joshi, Siddharth; Wang, Jun; Cauwenberghs, Gert
2017-08-01
Objective. Computation in nervous systems operates with different computational primitives, and on different hardware, than traditional digital computation and is thus subjected to different constraints from its digital counterpart regarding the use of physical resources such as time, space and energy. In an effort to better understand neural computation on a physical medium with similar spatiotemporal and energetic constraints, the field of neuromorphic engineering aims to design and implement electronic systems that emulate in very large-scale integration (VLSI) hardware the organization and functions of neural systems at multiple levels of biological organization, from individual neurons up to large circuits and networks. Mixed analog/digital neuromorphic VLSI systems are compact, consume little power and operate in real time independently of the size and complexity of the model. Approach. This article highlights the current efforts to interface neuromorphic systems with neural systems at multiple levels of biological organization, from the synaptic to the system level, and discusses the prospects for future biohybrid systems with neuromorphic circuits of greater complexity. Main results. Single silicon neurons have been interfaced successfully with invertebrate and vertebrate neural networks. This approach allowed the investigation of neural properties that are inaccessible with traditional techniques while providing a realistic biological context not achievable with traditional numerical modeling methods. At the network level, populations of neurons are envisioned to communicate bidirectionally with neuromorphic processors of hundreds or thousands of silicon neurons. Recent work on brain-machine interfaces suggests that this is feasible with current neuromorphic technology. Significance. Biohybrid interfaces between biological neurons and VLSI neuromorphic systems of varying complexity have started to emerge in the literature. 
Primarily intended as a computational tool for investigating fundamental questions related to neural dynamics, the sophistication of current neuromorphic systems now allows direct interfaces with large neuronal networks and circuits, resulting in potentially interesting clinical applications for neuroengineering systems, neuroprosthetics and neurorehabilitation.
A bifurcation giving birth to order in an impulsively driven complex system
NASA Astrophysics Data System (ADS)
Seshadri, Akshay; Sujith, R. I.
2016-08-01
Nonlinear oscillations lie at the heart of numerous complex systems. Impulsive forcing arises naturally in many scenarios, and we endeavour to study nonlinear oscillators subject to such forcing. We model these kicked oscillatory systems as a piecewise smooth dynamical system, whereby their dynamics can be investigated. We investigate the problem of pattern formation in a turbulent combustion system and apply this formalism with the aim of explaining the observed dynamics. We identify that the transition of this system from low amplitude chaotic oscillations to large amplitude periodic oscillations is the result of a discontinuity induced bifurcation. Further, we provide an explanation for the occurrence of intermittent oscillations in the system.
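The competition between impulsive kicks and dissipation behind the large-amplitude periodic oscillations can be caricatured with a one-line amplitude map. This linear sketch is entirely my construction, not the paper's piecewise smooth model: it only shows how kick strength and decay fix a limit-cycle amplitude, and omits the chaotic dynamics and the discontinuity-induced bifurcation itself.

```python
def kicked_amplitude(r, kick, steps=200, a0=0.0):
    """Caricature of an impulsively kicked dissipative oscillator.

    Between kicks the amplitude decays by a factor r (0 < r < 1); each
    kick adds a fixed increment. The balance of the two settles the
    system onto a periodic orbit of amplitude kick / (1 - r).
    """
    a = a0
    history = []
    for _ in range(steps):
        a = r * a + kick
        history.append(a)
    return history

hist = kicked_amplitude(r=0.9, kick=0.5)
# late amplitudes approach the limit-cycle value 0.5 / (1 - 0.9) = 5.0
```

Starting from rest, the amplitude grows until dissipation per period exactly cancels the kick, which is the qualitative mechanism behind a sustained large-amplitude oscillation.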
Braithwaite, Jeffrey; Churruca, Kate; Long, Janet C; Ellis, Louise A; Herkes, Jessica
2018-04-30
Implementation science has a core aim - to get evidence into practice. Early in the evidence-based medicine movement, this task was construed in linear terms, wherein the knowledge pipeline moved from evidence created in the laboratory through to clinical trials and, finally, via new tests, drugs, equipment, or procedures, into clinical practice. We now know that this straight-line thinking was naïve at best, and little more than an idealization, with multiple fractures appearing in the pipeline. The knowledge pipeline derives from a mechanistic and linear approach to science, which, while delivering huge advances in medicine over the last two centuries, is limited in its application to complex social systems such as healthcare. Instead, complexity science, a theoretical approach to understanding interconnections among agents and how they give rise to emergent, dynamic, systems-level behaviors, represents an increasingly useful conceptual framework for change. Herein, we discuss what implementation science can learn from complexity science, and tease out some of the properties of healthcare systems that enable or constrain the goals we have for better, more effective, more evidence-based care. Two Australian examples, one largely top-down, predicated on applying new standards across the country, and the other largely bottom-up, adopting medical emergency teams in over 200 hospitals, provide empirical support for a complexity-informed approach to implementation. The key lessons are that change can be stimulated in many ways, but a triggering mechanism is needed, such as legislation or widespread stakeholder agreement; that feedback loops are crucial to continue change momentum; that extended sweeps of time are involved, typically much longer than believed at the outset; and that taking a systems-informed, complexity approach, having regard for existing networks and socio-technical characteristics, is beneficial. 
Construing healthcare as a complex adaptive system implies that getting evidence into routine practice through a step-by-step model is not feasible. Complexity science forces us to consider the dynamic properties of systems and the varying characteristics that are deeply enmeshed in social practices, whilst indicating that multiple forces, variables, and influences must be factored into any change process, and that unpredictability and uncertainty are normal properties of multi-part, intricate systems.
Resource Management for Distributed Parallel Systems
NASA Technical Reports Server (NTRS)
Neuman, B. Clifford; Rao, Santosh
1993-01-01
Multiprocessor systems should exist in the larger context of distributed systems, allowing multiprocessor resources to be shared by those that need them. Unfortunately, typical multiprocessor resource management techniques do not scale to large networks. The Prospero Resource Manager (PRM) is a scalable resource allocation system that supports the allocation of processing resources in large networks and multiprocessor systems. To manage resources in such distributed parallel systems, PRM employs three types of managers: system managers, job managers, and node managers. There exist multiple independent instances of each type of manager, reducing bottlenecks. The complexity of each manager is further reduced because each is designed to utilize information at an appropriate level of abstraction.
Distributed intrusion detection system based on grid security model
NASA Astrophysics Data System (ADS)
Su, Jie; Liu, Yahui
2008-03-01
Grid computing has developed rapidly with the development of network technology, and it can solve large-scale complex computing problems by sharing large-scale computing resources. In a grid environment, we can realize a distributed and load-balanced intrusion detection system. This paper first discusses the security mechanism in grid computing and the function of PKI/CA in the grid security system, then gives the application of grid computing characteristics in a distributed intrusion detection system (IDS) based on an Artificial Immune System. Finally, it gives a distributed intrusion detection system based on the grid security model that can reduce processing delay and assure detection rates.
USDA-ARS?s Scientific Manuscript database
In recent years, large-scale watershed modeling has been implemented broadly in the field of water resources planning and management. Complex hydrological, sediment, and nutrient processes can be simulated by sophisticated watershed simulation models for important issues such as water resources all...
Applications of fidelity measures to complex quantum systems
2016-01-01
We revisit fidelity as a measure for the stability and the complexity of the quantum motion of single- and many-body systems. Within the context of cold atoms, we present an overview of applications of two fidelities, which we call static and dynamical fidelity, respectively. The static fidelity applies to quantum problems which can be diagonalized since it is defined via the eigenfunctions. In particular, we show that the static fidelity is a highly effective practical detector of avoided crossings characterizing the complexity of the systems and their evolutions. The dynamical fidelity is defined via the time-dependent wave functions. Focusing on the quantum kicked rotor system, we highlight a few practical applications of fidelity measurements in order to better understand the large variety of dynamical regimes of this paradigm of a low-dimensional system with mixed regular–chaotic phase space. PMID:27140967
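How the static fidelity flags an avoided crossing can be sketched with a generic two-level Hamiltonian H(λ) = [[λ, g], [g, −λ]] (my toy example, not a system from the paper): the overlap of neighbouring ground states stays near 1 except where the levels repel, where it dips sharply.

```python
import math

def ground_state(lam, g):
    """Normalised ground state of the 2x2 Hamiltonian [[lam, g], [g, -lam]].

    The lower eigenvalue is -sqrt(lam**2 + g**2); its eigenvector is
    proportional to (g, -(lam + sqrt(lam**2 + g**2))).
    """
    e = math.hypot(lam, g)
    v = (g, -(lam + e))
    norm = math.hypot(*v)
    return (v[0] / norm, v[1] / norm)

def static_fidelity(lam, dlam, g):
    """F = |<psi(lam)|psi(lam + dlam)>|**2 for neighbouring ground states."""
    a, b = ground_state(lam, g), ground_state(lam + dlam, g)
    return (a[0] * b[0] + a[1] * b[1]) ** 2

fid_far  = static_fidelity(5.0, 0.1, g=0.05)  # far from the crossing: F ~ 1
fid_near = static_fidelity(0.0, 0.1, g=0.05)  # at the avoided crossing: F dips
```

Scanning λ and watching for dips in F is exactly the "practical detector of avoided crossings" role the abstract describes.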
Complexity analysis of the Next Gen Air Traffic Management System: trajectory based operations.
Lyons, Rhonda
2012-01-01
According to Federal Aviation Administration traffic predictions, our Air Traffic Management (ATM) system is currently operating at 150 percent of capacity, and forecasts indicate that within the next two decades traffic will increase to a staggering 250 percent [17]. This will require a major redesign of our system. Today's ATM system is complex. It is designed to safely, economically, and efficiently provide air traffic services through the cost-effective provision of facilities and seamless services in collaboration with multiple agents; however, contrary to the vision, the system is loosely integrated and is suffering tremendously from antiquated equipment and saturated airways. The new Next Generation (NextGen) ATM system is designed to transform the current system into an agile, robust, and responsive set of operations that can safely manage the growing needs of the projected, increasingly complex and diverse set of air transportation system users and massive projected worldwide traffic rates. This new revolutionary technology-centric system is dynamically complex and much more sophisticated than its soon-to-be predecessor. ATM system failures could yield large-scale catastrophic consequences, as it is a safety-critical system. This work will attempt to describe complexity and the complex nature of the NextGen ATM system and Trajectory Based Operations. Complex human-factors interactions within NextGen will be analyzed using a proposed dual experimental approach designed to identify hazards and gaps and to elicit emergent hazards that would not be visible if the analyses were conducted in isolation. Suggestions will be made, along with a proposal for future human-factors research in the TBO safety-critical NextGen environment.
Large Crawler Crane for new lightning protection system
2007-10-25
A large crawler crane arrives at the turn basin at the Launch Complex 39 Area on NASA's Kennedy Space Center. The crane with its 70-foot boom will be moved to Launch Pad 39B and used to construct a new lightning protection system for the Constellation Program and Ares/Orion launches. Pad B will be the site of the first Ares vehicle launch, including Ares I-X which is scheduled for April 2009.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hofmann, R.B.
1995-09-01
Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.
Cross-slope Movement Patterns in Landslides
NASA Astrophysics Data System (ADS)
Petley, D.; Murphy, W.; Bulmer, M. H.; Keefer, D.
2002-12-01
There is growing evidence that there is a significant element of cross-slope movement in many large landslide systems. These movements may result in changing states of stress between landslide blocks that can establish complex displacement patterns. Such motions, which are not considered in traditional two-dimensional limit-equilibrium analyses, are important in the investigation of a variety of landslide types, such as those triggered by earthquakes. In addition, these movements may introduce considerable errors into the interpretation of strain patterns as derived from InSAR studies. Finally, even traditional interpretation techniques may lead to the amount of total displacement being underestimated. These observations suggest that a three-dimensional form of analysis may be more appropriate for large landslide complexes. The significance of such cross-slope movements is being investigated through a detailed study of the Lishan landslide complex in Central Taiwan. This landslide system was reactivated in 1990 in connection with the construction of a hotel. The total recorded movements have been approximately 1.5 m over an area of sliding that is estimated to be 450 m wide and 200 m long. Extensive damage has been caused to roads and buildings within the town. Remediation work has largely resulted in the stabilization of the landslide complex. Detailed geomorphological mapping has revealed that the landslide complex is composed of two main components. The first, immediately upslope of the hotel construction site, is a relatively shallow earthflow. The second, which has formed a large headscarp upslope from the main road in the centre of the town, is a deeper translational slide. Both appear to have been reactivations of previous failures.
While the displacement patterns of the earthflow indicate a relatively simple downslope movement, the vectors derived from kinematic analysis of surface features have indicated that the movement of the deeper-seated landslide was more complex. Though the dominant movement vector is downslope, there is evidence to suggest that there has been a cross-slope component of motion that corresponds to the bedding orientation.
Guo, Tai Wei; Bartesaghi, Alberto; Yang, Hui; Falconieri, Veronica; Rao, Prashant; Merk, Alan; Eng, Edward T; Raczkowski, Ashleigh M; Fox, Tara; Earl, Lesley A; Patel, Dinshaw J; Subramaniam, Sriram
2017-10-05
Prokaryotic cells possess CRISPR-mediated adaptive immune systems that protect them from foreign genetic elements, such as invading viruses. A central element of this immune system is an RNA-guided surveillance complex capable of targeting non-self DNA or RNA for degradation in a sequence- and site-specific manner analogous to RNA interference. Although the complexes display considerable diversity in their composition and architecture, many basic mechanisms underlying target recognition and cleavage are highly conserved. Using cryoelectron microscopy (cryo-EM), we show that the binding of target double-stranded DNA (dsDNA) to a type I-F CRISPR system yersinia (Csy) surveillance complex leads to large quaternary and tertiary structural changes in the complex that are likely necessary in the pathway leading to target dsDNA degradation by a trans-acting helicase-nuclease. Comparison of the structure of the surveillance complex before and after dsDNA binding, or in complex with three virally encoded anti-CRISPR suppressors that inhibit dsDNA binding, reveals mechanistic details underlying target recognition and inhibition. Published by Elsevier Inc.
Casado-Sánchez, Antonio; Gómez-Ballesteros, Rocío; Tato, Francisco; Soriano, Francisco J; Pascual-Coca, Gustavo; Cabrera, Silvia; Alemán, José
2016-07-12
A new catalytic system for the photooxidation of sulfides based on Pt(ii) complexes is presented. The catalyst is capable of oxidizing a large number of sulfides containing aryl, alkyl, allyl, benzyl, as well as more complex structures such as heterocycles and methionine amino acid, with complete chemoselectivity. In addition, the first sulfur oxidation in a continuous flow process has been developed.
The Problem of Size in Robust Design
NASA Technical Reports Server (NTRS)
Koch, Patrick N.; Allen, Janet K.; Mistree, Farrokh; Mavris, Dimitri
1997-01-01
To facilitate the effective solution of multidisciplinary, multiobjective complex design problems, a departure from the traditional parametric design analysis and single objective optimization approaches is necessary in the preliminary stages of design. A necessary tradeoff becomes one of efficiency vs. accuracy as approximate models are sought to allow fast analysis and effective exploration of a preliminary design space. In this paper we apply a general robust design approach for efficient and comprehensive preliminary design to a large complex system: a high speed civil transport (HSCT) aircraft. Specifically, we investigate the HSCT wing configuration design, incorporating life cycle economic uncertainties to identify economically robust solutions. The approach is built on the foundation of statistical experimentation and modeling techniques and robust design principles, and is specialized through incorporation of the compromise Decision Support Problem for multiobjective design. For large problems however, as in the HSCT example, this robust design approach developed for efficient and comprehensive design breaks down with the problem of size (combinatorial explosion in experimentation and model building with the number of variables), and both efficiency and accuracy are sacrificed. Our focus in this paper is on identifying and discussing the implications and open issues associated with the problem of size for the preliminary design of large complex systems.
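The combinatorial explosion invoked above is easy to quantify with textbook run counts (a two-level full factorial needs 2^n runs; a non-fractionated central composite design needs 2^n + 2n + 1); the variable counts below are illustrative, not the HSCT study's actual numbers.

```python
def full_factorial_runs(n_vars, levels=2):
    """Runs in a full factorial experiment: levels ** n_vars."""
    return levels ** n_vars

def ccd_runs(n_vars):
    """Runs in a non-fractionated central composite design:
    2**n factorial points + 2n axial points + 1 centre point."""
    return 2 ** n_vars + 2 * n_vars + 1

# The "problem of size": run counts explode with the number of variables.
runs_5, runs_10, runs_20 = (full_factorial_runs(n) for n in (5, 10, 20))
```

Five variables need 32 runs, ten need 1,024, and twenty need over a million, which is why experimentation and model building break down as the variable count grows.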
A Multilevel Gamma-Clustering Layout Algorithm for Visualization of Biological Networks
Hruz, Tomas; Lucas, Christoph; Laule, Oliver; Zimmermann, Philip
2013-01-01
Visualization of large complex networks has become an indispensable part of systems biology, where organisms need to be considered as one complex system. The visualization of the corresponding network is challenging due to the size and density of edges. In many cases, the use of standard visualization algorithms can lead to high running times and poorly readable visualizations due to many edge crossings. We suggest an approach that analyzes the structure of the graph first and then generates a new graph which contains specific semantic symbols for regular substructures like dense clusters. We propose a multilevel gamma-clustering layout visualization algorithm (MLGA) which proceeds in three subsequent steps: (i) a multilevel γ-clustering is used to identify the structure of the underlying network, (ii) the network is transformed to a tree, and (iii) finally, the resulting tree which shows the network structure is drawn using a variation of a force-directed algorithm. The algorithm has the potential to visualize very large networks because it uses modern clustering heuristics which are optimized for large graphs. Moreover, most of the edges are removed from the visual representation, which makes it possible to keep an overview of complex graphs with dense subgraphs. PMID:23864855
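The three-step structure of MLGA can be sketched roughly as follows. As stand-ins for the paper's components, step (i) uses a simple neighbourhood-overlap (Jaccard) merge rather than true γ-clustering, and step (iii), the force-directed drawing, is omitted; the threshold and all names are illustrative.

```python
from collections import defaultdict

def gamma_like_clusters(edges, gamma=0.2):
    """Step (i) stand-in: merge endpoints of edges whose neighbourhoods
    overlap by at least `gamma` (Jaccard similarity), via union-find."""
    nbrs = defaultdict(set)
    for u, v in edges:
        nbrs[u].add(v)
        nbrs[v].add(u)
    parent = {n: n for n in nbrs}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for u, v in edges:
        shared = len(nbrs[u] & nbrs[v])
        union_size = len(nbrs[u] | nbrs[v])
        if union_size and shared / union_size >= gamma:
            parent[find(u)] = find(v)
    clusters = defaultdict(set)
    for n in nbrs:
        clusters[find(n)].add(n)
    return list(clusters.values())

def cluster_tree(edges, gamma=0.2):
    """Step (ii): collapse each cluster to one node under a common root.
    Step (iii), drawing the tree with a force-directed layout, is omitted."""
    tree = {"root": set()}
    for i, members in enumerate(gamma_like_clusters(edges, gamma)):
        cname = f"cluster-{i}"
        tree["root"].add(cname)
        tree[cname] = set(members)
    return tree

# Two triangles joined by a single bridge: the bridge endpoints share no
# neighbours, so the triangles fall into separate clusters.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
tree = cluster_tree(edges)
```

Most edges (everything inside a dense cluster) vanish from the tree view, which is the mechanism MLGA uses to keep an overview of graphs with dense subgraphs.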
Electrical and thermal modeling of a large-format lithium titanate oxide battery system.
DOT National Transportation Integrated Search
2015-04-01
The future of mass transportation is clearly moving towards the increased efficiency of hybrid and electric vehicles. Electrical energy storage is a key component in most of these advanced vehicles, with the system complexity and vehicle cost shift...
COMPUTATIONAL METHODOLOGIES for REAL-SPACE STRUCTURAL REFINEMENT of LARGE MACROMOLECULAR COMPLEXES
Goh, Boon Chong; Hadden, Jodi A.; Bernardi, Rafael C.; Singharoy, Abhishek; McGreevy, Ryan; Rudack, Till; Cassidy, C. Keith; Schulten, Klaus
2017-01-01
The rise of the computer as a powerful tool for model building and refinement has revolutionized the field of structure determination for large biomolecular systems. Despite the wide availability of robust experimental methods capable of resolving structural details across a range of spatiotemporal resolutions, computational hybrid methods have the unique ability to integrate the diverse data from multimodal techniques such as X-ray crystallography and electron microscopy into consistent, fully atomistic structures. Here, commonly employed strategies for computational real-space structural refinement are reviewed, and their specific applications are illustrated for several large macromolecular complexes: ribosome, virus capsids, chemosensory array, and photosynthetic chromatophore. The increasingly important role of computational methods in large-scale structural refinement, along with current and future challenges, is discussed. PMID:27145875
Epidemic outbreaks in complex heterogeneous networks
NASA Astrophysics Data System (ADS)
Moreno, Y.; Pastor-Satorras, R.; Vespignani, A.
2002-04-01
We present a detailed analytical and numerical study for the spreading of infections with acquired immunity in complex population networks. We show that the large connectivity fluctuations usually found in these networks strengthen considerably the incidence of epidemic outbreaks. Scale-free networks, which are characterized by diverging connectivity fluctuations in the limit of a very large number of nodes, exhibit the lack of an epidemic threshold and always show a finite fraction of infected individuals. This particular weakness, observed also in models without immunity, defines a new epidemiological framework characterized by a highly heterogeneous response of the system to the introduction of infected individuals with different connectivity. The understanding of epidemics in complex networks might deliver new insights in the spread of information and diseases in biological and technological networks that often appear to be characterized by complex heterogeneous architectures.
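The vanishing epidemic threshold described above follows from the standard heterogeneous mean-field result that the threshold scales as ⟨k⟩/⟨k²⟩, so diverging connectivity fluctuations drive it to zero. A quick numerical illustration of that textbook formula (not the paper's own code; the truncated power-law distribution below is an illustrative choice):

```python
# Heterogeneous mean-field epidemic threshold: lambda_c = <k> / <k^2>.
# Large degree fluctuations (<k^2>) in scale-free networks suppress it.

def sf_dist(kmax, exponent=3.0):
    """Truncated power-law degree distribution, P(k) ~ k**-exponent."""
    return {k: k ** -exponent for k in range(1, kmax + 1)}

def moments(pk):
    """First and second moments <k>, <k^2> of a degree distribution."""
    z = sum(pk.values())
    k1 = sum(k * p for k, p in pk.items()) / z
    k2 = sum(k * k * p for k, p in pk.items()) / z
    return k1, k2

def epidemic_threshold(pk):
    k1, k2 = moments(pk)
    return k1 / k2

k1, _ = moments(sf_dist(100))
print(1.0 / k1)                            # homogeneous graph, same mean degree
print(epidemic_threshold(sf_dist(100)))    # heterogeneous: noticeably smaller
print(epidemic_threshold(sf_dist(10000)))  # larger cutoff -> threshold shrinks
```

Raising the degree cutoff makes ⟨k²⟩ diverge for exponent 3, so the threshold tends to zero in the infinite-size limit, which is the "lack of an epidemic threshold" the abstract reports.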
Automated adaptive inference of phenomenological dynamical models.
Daniels, Bryan C; Nemenman, Ilya
2015-08-21
Dynamics of complex systems is often driven by large and intricate networks of microscopic interactions, whose sheer size obfuscates understanding. With limited experimental data, many parameters of such dynamics are unknown, and thus detailed, mechanistic models risk overfitting and making faulty predictions. At the other extreme, simple ad hoc models often miss defining features of the underlying systems. Here we develop an approach that instead constructs phenomenological, coarse-grained models of network dynamics that automatically adapt their complexity to the available data. Such adaptive models produce accurate predictions even when microscopic details are unknown. The approach is computationally tractable, even for a relatively large number of dynamical variables. Using simulated data, it correctly infers the phase space structure for planetary motion, avoids overfitting in a biological signalling system and produces accurate predictions for yeast glycolysis with tens of data points and over half of the interacting species unobserved.
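The core idea of adapting model complexity to the available data can be illustrated with a far simpler device than the authors' phenomenological models: choose the lowest polynomial degree whose held-out error is already negligible, instead of minimizing training error. A toy pure-Python sketch (not the paper's method; `fit_poly` is an illustrative least-squares helper):

```python
def fit_poly(xs, ys, deg):
    """Least-squares polynomial fit via normal equations + Gaussian elimination."""
    m = deg + 1
    A = [[sum(x ** (i + j) for x in xs) for j in range(m)] for i in range(m)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(m)]
    for col in range(m):                     # elimination with partial pivoting
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for c in range(col, m):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * m                          # back substitution
    for r in range(m - 1, -1, -1):
        coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, m))) / A[r][r]
    return coef

def mse(coef, xs, ys):
    preds = (sum(c * x ** i for i, c in enumerate(coef)) for x in xs)
    return sum((p - y) ** 2 for p, y in zip(preds, ys)) / len(xs)

xs = [i / 5 for i in range(20)]
ys = [3 * x * x - 2 * x + 1 for x in xs]      # clean quadratic data
train_x, train_y = xs[::2], ys[::2]
hold_x, hold_y = xs[1::2], ys[1::2]

# Smallest degree whose held-out error is negligible: complexity adapts to data.
chosen = next(d for d in range(6)
              if mse(fit_poly(train_x, train_y, d), hold_x, hold_y) < 1e-8)
print(chosen)  # 2: the model grows only as complex as the data demands
```

On clean quadratic data the procedure stops at degree 2 rather than spending all six parameters, which is the same overfitting-avoidance behavior the abstract ascribes to adaptive phenomenological models.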
A geometric modeler based on a dual-geometry representation: polyhedra and rational B-splines
NASA Technical Reports Server (NTRS)
Klosterman, A. L.
1984-01-01
For speed and data base reasons, solid geometric modeling of large complex practical systems is usually approximated by a polyhedra representation. Precise parametric surface and implicit algebraic modelers are available but it is not yet practical to model the same level of system complexity with these precise modelers. In response to this contrast the GEOMOD geometric modeling system was built so that a polyhedra abstraction of the geometry would be available for interactive modeling without losing the precise definition of the geometry. Part of the reason that polyhedra modelers are effective is that all bounded surfaces can be represented in a single canonical format (i.e., sets of planar polygons). This permits a very simple and compact data structure. Nonuniform rational B-splines are currently the best representation to describe a very large class of geometry precisely with one canonical format. The specific capabilities of the modeler are described.
GaAs VLSI technology and circuit elements for DSP
NASA Astrophysics Data System (ADS)
Mikkelson, James M.
1990-10-01
Recent progress in digital GaAs circuit performance and complexity is presented to demonstrate the current capabilities of GaAs components. High-density GaAs process technology and circuit design techniques are described, and critical issues for achieving favorable complexity, speed, power, and cost tradeoffs are reviewed. Some DSP building blocks are described to provide examples of what types of DSP systems could be implemented with present GaAs technology. DIGITAL GaAs CIRCUIT CAPABILITIES: In the past few years, the capabilities of digital GaAs circuits have dramatically increased to the VLSI level. Major gains in circuit complexity and power-delay products have been achieved by the use of silicon-like process technologies and simple circuit topologies. The very high speed and low power consumption of digital GaAs VLSI circuits have made GaAs a desirable alternative to high-performance silicon in hardware-intensive, high-speed system applications. An example of the performance and integration complexity available with GaAs VLSI circuits is the 64x64 crosspoint switch shown in figure 1. This switch, which is the most complex GaAs circuit currently available, is designed on a 30 gate GaAs gate array. It operates at 200 MHz and dissipates only 8 watts of power. The reasons for increasing the level of integration of GaAs circuits are similar to the reasons for the continued increase of silicon circuit complexity. The market factors driving GaAs VLSI are system design methodology, system cost, power, and reliability. System designers are hesitant or unwilling to go backwards to previous design techniques and lower levels of integration. A more highly integrated system in a lower-performance technology can often approach the performance of a system in a higher-performance technology at a lower level of integration. Higher levels of integration also lower the system component count, which reduces the system cost, size, and power consumption while improving the system reliability.
For large gate count circuits, the power per gate must be minimized to prevent reliability and cooling problems. The technical factors which favor increasing GaAs circuit complexity are primarily related to reducing the speed and power penalties incurred when crossing chip boundaries. Because the internal GaAs chip logic levels are not compatible with standard silicon I/O levels, input receivers and output drivers are needed to convert levels. These I/O circuits add significant delay to logic paths, consume large amounts of power, and use an appreciable portion of the die area. The effects of these I/O penalties can be reduced by increasing the ratio of core logic to I/O on a chip. DSP operations, which have a large number of logic stages between the input and the output, are ideal candidates to take advantage of the performance of GaAs digital circuits. Figure 2 is a schematic representation of the I/O penalties encountered when converting from ECL levels to GaAs
Zhou, Jing; Wang, Yao-Sheng
2017-09-26
The Fbw7-Skp1 complex is an essential component in the formation and development of the mammalian cardiovascular system; the complex interaction is mediated through binding of the Skp1 C-terminal peptide (qGlu-peptide) to the F-box domain of Fbw7. By visually examining the crystal structure, we identified a typical cation···π···π stacking system at the complex interface, which is formed by the Trp1159 residue of the qGlu-peptide with the Lys2299 and His2359 residues of the Fbw7 F-box domain. Both hybrid quantum mechanics/molecular mechanics (QM/MM) analysis of the real domain-peptide complex and electron-correlation ab initio calculation of the stacking system model suggested that the cation···π···π plays an important role in stabilizing the complex; substitution of the peptide Trp1159 residue with aromatic Phe and Tyr would not cause a considerable effect on the configuration and energetics of the cation···π···π stacking system, whereas His substitution seems to largely destabilize the system. Subsequently, the qGlu-peptide was stripped from the full-length Skp1 protein to define a so-called self-inhibitory peptide, which may rebind to the domain-peptide complex interface and thus disrupt the complex interaction. Fluorescence polarization (FP) assays revealed that the Trp1159Phe and Trp1159Tyr variants have a comparable or higher affinity (Kd = 41 and 62 μM) than the wild-type qGlu-peptide (Kd = 56 μM), while the Trp1159His mutation would largely impair the binding potency of the qGlu-peptide to the Fbw7 F-box domain (Kd = 280 μM), confirming that the cation···π···π confers both affinity and specificity to the domain-peptide recognition, which can be reshaped by rational molecular design of the nonbonded interaction system. Graphical abstract: Stereoview of the complex structure of Fbw7 with Skp1 (PDB: 2ovp), where the Trp1159 residue of the Skp1 qGlu-peptide can form a cation···π···π stacking system with the Lys2299 and His2359 residues of the Fbw7 F-box domain.
Modeling Power Systems as Complex Adaptive Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chassin, David P.; Malard, Joel M.; Posse, Christian
2004-12-30
Physical analogs have shown considerable promise for understanding the behavior of complex adaptive systems, including macroeconomics, biological systems, social networks, and electric power markets. Many of today's most challenging technical and policy questions can be reduced to a distributed economic control problem. Indeed, economically based control of large-scale systems is founded on the conjecture that price-based regulation (e.g., auctions, markets) results in an optimal allocation of resources and emergent optimal system control. This report explores state-of-the-art physical analogs for understanding the behavior of some econophysical systems and deriving stable and robust control strategies for using them. We review and discuss applications of some analytic methods based on a thermodynamic metaphor, according to which the interplay between system entropy and conservation laws gives rise to intuitive and governing global properties of complex systems that cannot be otherwise understood. We apply these methods to the question of how power markets can be expected to behave under a variety of conditions.
Multiple damage identification on a wind turbine blade using a structural neural system
NASA Astrophysics Data System (ADS)
Kirikera, Goutham R.; Schulz, Mark J.; Sundaresan, Mannur J.
2007-04-01
A large number of sensors are required to perform real-time structural health monitoring (SHM) to detect acoustic emissions (AE) produced by damage growth on large complicated structures. This requires a large number of high sampling rate data acquisition channels to analyze high frequency signals. To overcome the cost and complexity of having such a large data acquisition system, a structural neural system (SNS) was developed. The SNS reduces the required number of data acquisition channels and predicts the location of damage within a sensor grid. The sensor grid uses interconnected sensor nodes to form continuous sensors. The combination of continuous sensors and the biomimetic parallel processing of the SNS tremendously reduce the complexity of SHM. A wave simulation algorithm (WSA) was developed to understand the flexural wave propagation in composite structures and to utilize the code for developing the SNS. Simulation of AE responses in a plate and comparison with experimental results are shown in the paper. The SNS was recently tested by a team of researchers from University of Cincinnati and North Carolina A&T State University during a quasi-static proof test of a 9 meter long wind turbine blade at the National Renewable Energy Laboratory (NREL) test facility in Golden, Colorado. Twelve piezoelectric sensor nodes were used to form four continuous sensors to monitor the condition of the blade during the test. The four continuous sensors are used as inputs to the SNS. There are only two analog output channels of the SNS, and these signals are digitized and analyzed in a computer to detect damage. In the test of the wind turbine blade, multiple damages were identified and later verified by sectioning of the blade. The results of damage identification using the SNS during this proof test will be shown in this paper. 
Overall, the SNS is very sensitive and can detect damage on complex structures with ribs, joints, and different materials, and the system is relatively inexpensive and simple to implement on large structures.
CAD-Based Aerodynamic Design of Complex Configurations using a Cartesian Method
NASA Technical Reports Server (NTRS)
Nemec, Marian; Aftosmis, Michael J.; Pulliam, Thomas H.
2003-01-01
A modular framework for aerodynamic optimization of complex geometries is developed. By working directly with a parametric CAD system, complex-geometry models are modified and tessellated in an automatic fashion. The use of a component-based Cartesian method significantly reduces the demands on the CAD system, and also provides for robust and efficient flowfield analysis. The optimization is controlled using either a genetic or quasi-Newton algorithm. Parallel efficiency of the framework is maintained even when subject to limited CAD resources by dynamically re-allocating the processors of the flow solver. Overall, the resulting framework can explore designs incorporating large shape modifications and changes in topology.
Constraint elimination in dynamical systems
NASA Technical Reports Server (NTRS)
Singh, R. P.; Likins, P. W.
1989-01-01
Large space structures (LSSs) and other dynamical systems of current interest are often extremely complex assemblies of rigid and flexible bodies subjected to kinematical constraints. A formulation is presented for the governing equations of constrained multibody systems via the application of singular value decomposition (SVD). The resulting equations of motion are shown to be of minimum dimension.
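The constraint-elimination step described above can be sketched concretely: given a velocity-level constraint J q̇ = 0, the SVD of J yields an orthonormal basis N for its null space, and projecting the dynamics onto N gives equations of motion of minimum dimension. A minimal NumPy sketch under assumed toy values (one constraint on three coordinates, identity inertia), not the formulation of the paper:

```python
import numpy as np

# Constraint Jacobian J for a toy 3-DOF system with one kinematic constraint:
# q1_dot + q2_dot - q3_dot = 0.
J = np.array([[1.0, 1.0, -1.0]])

# SVD: rows of Vt matching (near-)zero singular values span the null space of J.
U, s, Vt = np.linalg.svd(J)
rank = int(np.sum(s > 1e-10))
N = Vt[rank:].T                 # 3x2 null-space basis: q_dot = N @ u_dot

# Reduced (minimum-dimension) mass matrix, here for identity inertia M:
M = np.eye(3)
M_red = N.T @ M @ N             # 2x2: the constraint has been eliminated

print(J @ N)                    # ~0: the basis satisfies the constraint
print(M_red.shape)              # (2, 2)
```

The reduced coordinates u evolve unconstrained; recovering q̇ = N u̇ automatically satisfies the original constraint, which is the essence of SVD-based constraint elimination.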
Held, Martina; Berz, Annuska; Hensgen, Ronja; Muenz, Thomas S; Scholl, Christina; Rössler, Wolfgang; Homberg, Uwe; Pfeiffer, Keram
2016-01-01
While the ability of honeybees to navigate relying on sky-compass information has been investigated in a large number of behavioral studies, the underlying neuronal system has so far received less attention. The sky-compass pathway has recently been described from its input region, the dorsal rim area (DRA) of the compound eye, to the anterior optic tubercle (AOTU). The aim of this study is to reveal the connection from the AOTU to the central complex (CX). For this purpose, we investigated the anatomy of large microglomerular synaptic complexes in the medial and lateral bulbs (MBUs/LBUs) of the lateral complex (LX). The synaptic complexes are formed by tubercle-lateral accessory lobe neuron 1 (TuLAL1) neurons of the AOTU and GABAergic tangential neurons of the central body's (CB) lower division (TL neurons). Both TuLAL1 and TL neurons strongly resemble neurons forming these complexes in other insect species. We further investigated the ultrastructure of these synaptic complexes using transmission electron microscopy. We found that single large presynaptic terminals of TuLAL1 neurons enclose many small profiles (SPs) of TL neurons. The synaptic connections between these neurons are established by two types of synapses: divergent dyads and divergent tetrads. Our data support the assumption that these complexes are a highly conserved feature in the insect brain and play an important role in reliable signal transmission within the sky-compass pathway.
Translational Systems Biology and Voice Pathophysiology
Li, Nicole Y. K.; Abbott, Katherine Verdolini; Rosen, Clark; An, Gary; Hebda, Patricia A.; Vodovotz, Yoram
2011-01-01
Objectives/Hypothesis: Personalized medicine has been called upon to tailor healthcare to an individual's needs. Evidence-based medicine (EBM) has advocated using randomized clinical trials with large populations to evaluate treatment effects. However, due to large variations across patients, the results are likely not to apply to an individual patient. We suggest that a complementary, systems biology approach using computational modeling may help tackle biological complexity in order to improve ultimate patient care. The purpose of the article is: 1) to review the pros and cons of EBM, and 2) to discuss the alternative systems biology method and present its utility in clinical voice research. Study Design: Tutorial. Methods: Literature review and discussion. Results: We propose that translational systems biology can address many of the limitations of EBM pertinent to voice and other health care domains, and thus complement current health research models. In particular, recent work using mathematical modeling suggests that systems biology has the ability to quantify the highly complex biologic processes underlying voice pathophysiology. Recent data support the premise that this approach can be applied specifically in the case of phonotrauma and surgically induced vocal fold trauma, and may have particular power to address personalized medicine. Conclusions: We propose that evidence around vocal health and disease be expanded beyond a population-based method to consider more fully issues of complexity and systems interactions, especially in implementing personalized medicine in voice care and beyond. PMID:20025041
Origins of chemoreceptor curvature sorting in Escherichia coli
Draper, Will; Liphardt, Jan
2017-01-01
Bacterial chemoreceptors organize into large clusters at the cell poles. Despite a wealth of structural and biochemical information on the system's components, it is not clear how chemoreceptor clusters are reliably targeted to the cell pole. Here, we quantify the curvature-dependent localization of chemoreceptors in live cells by artificially deforming growing cells of Escherichia coli in curved agar microchambers, and find that chemoreceptor cluster localization is highly sensitive to membrane curvature. Through analysis of multiple mutants, we conclude that curvature sensitivity is intrinsic to chemoreceptor trimers-of-dimers, and results from conformational entropy within the trimer-of-dimers geometry. We use the principles of the conformational entropy model to engineer curvature sensitivity into a series of multi-component synthetic protein complexes. When expressed in E. coli, the synthetic complexes form large polar clusters, and a complex with inverted geometry avoids the cell poles. This demonstrates the successful rational design of both polar and anti-polar clustering, and provides a synthetic platform on which to build new systems. PMID:28322223
MonALISA, an agent-based monitoring and control system for the LHC experiments
NASA Astrophysics Data System (ADS)
Balcas, J.; Kcira, D.; Mughal, A.; Newman, H.; Spiropulu, M.; Vlimant, J. R.
2017-10-01
MonALISA, which stands for Monitoring Agents using a Large Integrated Services Architecture, has been developed over the last fifteen years by the California Institute of Technology (Caltech) and its partners with the support of the software and computing program of the CMS and ALICE experiments at the Large Hadron Collider (LHC). The framework is based on the Dynamic Distributed Service Architecture and is able to provide complete system monitoring, performance metrics of applications, jobs, or services, system control, and global optimization services for complex systems. A short overview and status of MonALISA is given in this paper.
Nonterrestrial material processing and manufacturing of large space systems
NASA Technical Reports Server (NTRS)
Von Tiesenhausen, G.
1979-01-01
Nonterrestrial processing of materials and manufacturing of large space system components from preprocessed lunar materials at a manufacturing site in space is described. Lunar materials mined and preprocessed at the lunar resource complex will be flown to the space manufacturing facility (SMF), where, together with supplementary terrestrial materials, they will receive final processing and be fabricated into space communication systems, solar cell blankets, radio frequency generators, and electrical equipment. Satellite Power System (SPS) material requirements and lunar material availability and utilization are detailed, and the SMF processing, refining, and fabricating facilities, material flow, and manpower requirements are described.
Komatsu, G.; Dohm, J.M.; Hare, T.M.
2004-01-01
Large-scale tectonomagmatic complexes are common on Earth and Mars. Many of these complexes are created or at least influenced by mantle processes, including a wide array of plume types ranging from superplumes to mantle plumes. Among the most prominent complexes, the Mongolian plateau on Earth and the Tharsis bulge on Mars share remarkable similarities in terms of large domal uplifted areas, great rift canyon systems, and widespread volcanism on their surfaces. Water has also played an important role in the development of the two complexes. In general, atmospheric and surface water play a bigger role in the development of the present-day Mongolian plateau than for the Tharsis bulge, as evidenced by highly developed drainages and thick accumulation of sediments in the basins of the Baikal rift system. On the Tharsis bulge, however, water appears to have remained as ground ice except during periods of elevated magmatic activity. Glacial and periglacial processes are well documented for the Mongolian plateau and are also reported for parts of the Tharsis bulge. Ice-magma interactions, which are represented by the formation of subice volcanoes in parts of the Mongolian plateau region, have been reported for the Valles Marineris region of Mars. The complexes are also characterized by cataclysmic floods, but their triggering mechanism may differ: mainly ice-dam failures for the Mongolian plateau and outburst of groundwater for the Tharsis bulge, probably by magma-ice interactions, although ice-dam failures within the Valles Marineris region cannot be ruled out as a possible contributor. Comparative studies of the Mongolian plateau and Tharsis bulge provide excellent opportunities for understanding surface manifestations of plume-driven processes on terrestrial planets and how they interact with hydro-cryospheres. © 2004 Geological Society of America.
Developing Knowledge and Leadership in Pre-Service Teacher Education Systems
ERIC Educational Resources Information Center
Ferreira, Jo-Anne; Ryan, Lisa; Davis, Julie
2015-01-01
Pre-service teacher education institutions are large and complex organisations that are notoriously difficult to change. One factor is that many change efforts focus largely on individual pre-service teacher educators altering their practice. We report here on our experience using a model for effecting change, which views pre-service teacher…
NASA Technical Reports Server (NTRS)
Low, M. D.; Baker, M.; Ferguson, R.; Frost, J. D., Jr.
1972-01-01
This paper describes a complete electroencephalographic acquisition and transmission system, designed to meet the needs of a large hospital with multiple critical care patient monitoring units. The system provides rapid and prolonged access to a centralized recording and computing area from remote locations within the hospital complex, and from locations in other hospitals and other cities. The system includes quick-on electrode caps, amplifier units and cable transmission for access from within the hospital, and EEG digitization and telephone transmission for access from other hospitals or cities.
Bouvier, M; Wiley, D C
1996-01-01
Recognition of peptides bound to class I major histocompatibility complex (MHC) molecules by specific receptors on T cells regulates the development and activity of the cellular immune system. We have designed and synthesized de novo cyclic peptides that incorporate PEG in the ring structure for binding to class I MHC molecules. The large PEG loops are positioned to extend out of the peptide binding site, thus creating steric effects aimed at preventing the recognition of class I MHC complexes by T-cell receptors. Peptides were synthesized and cyclized on polymer support using high molecular weight symmetrical PEG dicarboxylic acids to link the side chains of lysine residues substituted at positions 4 and 8 in the sequence of the HLA-A2-restricted human T-lymphotrophic virus type I Tax peptide. Cyclic peptides promoted the in vitro folding and assembly of HLA-A2 complexes. Thermal denaturation studies using circular dichroism spectroscopy showed that these complexes are as stable as complexes formed with antigenic peptides. PMID:8643447
Closed-Loop Control of Complex Networks: A Trade-Off between Time and Energy
NASA Astrophysics Data System (ADS)
Sun, Yong-Zheng; Leng, Si-Yang; Lai, Ying-Cheng; Grebogi, Celso; Lin, Wei
2017-11-01
Controlling complex nonlinear networks is largely an unsolved problem at present. Existing works focus either on open-loop control strategies and their energy consumption or on closed-loop control schemes with an infinite-time duration. We articulate a finite-time, closed-loop controller with an eye toward the physical and mathematical underpinnings of the trade-off between the control time and energy, as well as their dependence on the network parameters and structure. The closed-loop controller is tested on a large number of real systems, including stem cell differentiation, food webs, random ecosystems, and spiking neuronal networks. Our results represent a step forward in developing a rigorous and general framework to control nonlinear dynamical networks with a complex topology.
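The time-energy trade-off the abstract highlights already appears in the simplest closed-loop setting: a proportional feedback law on a single nonlinear node. A toy sketch (an illustrative one-dimensional system, not the paper's controller): higher feedback gain reaches the target sooner but spends more control energy, measured as the integral of u².

```python
def control_cost(k, x0=1.0, dt=1e-3, tol=1e-2, tmax=50.0):
    """Proportional closed-loop control of x' = x - x**3 + u with u = -k*x.
    Returns (time to reach |x| < tol, control energy = integral of u**2 dt)."""
    x, t, energy = x0, 0.0, 0.0
    while abs(x) >= tol and t < tmax:
        u = -k * x                     # feedback law closes the loop
        energy += u * u * dt           # accumulate control energy
        x += (x - x ** 3 + u) * dt     # forward-Euler step of the dynamics
        t += dt
    return t, energy

t_lo, e_lo = control_cost(k=2.0)       # gentle gain: slow but cheap
t_hi, e_hi = control_cost(k=10.0)      # aggressive gain: fast but costly
assert t_hi < t_lo and e_hi > e_lo     # the time-energy trade-off
```

In the linearized regime the convergence time scales like ln(x0/tol)/(k-1) while the energy grows roughly like k/2, so shrinking the control time necessarily inflates the energy, which mirrors the trade-off studied at network scale.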
Germain, Ronald N
2017-10-16
A dichotomy exists in the field of vaccinology about the promise versus the hype associated with application of "systems biology" approaches to rational vaccine design. Some feel it is the only way to efficiently uncover currently unknown parameters controlling desired immune responses or discover what elements actually mediate these responses. Others feel that traditional experimental, often reductionist, methods for incrementally unraveling complex biology provide a more solid way forward, and that "systems" approaches are costly ways to collect data without gaining true insight. Here I argue that both views are inaccurate. This is largely because of confusion about what can be gained from classical experimentation versus statistical analysis of large data sets (bioinformatics) versus methods that quantitatively explain emergent properties of complex assemblies of biological components, with the latter reflecting what was previously called "physiology." Reductionist studies will remain essential for generating detailed insight into the functional attributes of specific elements of biological systems, but such analyses lack the power to provide a quantitative and predictive understanding of global system behavior. But by employing (1) large-scale screening methods for discovery of unknown components and connections in the immune system (omics), (2) statistical analysis of large data sets (bioinformatics), and (3) the capacity of quantitative computational methods to translate these individual components and connections into models of emergent behavior (systems biology), we will be able to better understand how the overall immune system functions and to determine with greater precision how to manipulate it to produce desired protective responses. Copyright © 2017 Cold Spring Harbor Laboratory Press; all rights reserved.
NASA Astrophysics Data System (ADS)
Puzyrkov, Dmitry; Polyakov, Sergey; Podryga, Viktoriia; Markizov, Sergey
2018-02-01
At the present stage of computer technology development it is possible to study the properties and processes in complex systems at molecular and even atomic levels, for example, by means of molecular dynamics methods. The most interesting problems are those related to the study of complex processes under real physical conditions. Solving such problems requires the use of high-performance computing systems of various types, for example, GRID systems and HPC clusters. Given such time-consuming computational tasks, there arises a need for software that monitors these computations automatically and uniformly. A complex computational task can be performed over different HPC systems; this requires output data synchronization between the storage chosen by a scientist and the HPC system used for computations. The design of the computational domain is also a considerable problem, requiring complex software tools and algorithms for proper atomistic data generation on HPC systems. The paper describes the prototype of a cloud service intended for designing large-volume atomistic systems for further detailed molecular dynamics calculations and for managing those calculations, and presents the part of its concept aimed at initial data generation on the HPC systems.
NASA Technical Reports Server (NTRS)
Allen, B. Danette; Alexandrov, Natalia
2016-01-01
Incremental approaches to air transportation system development inherit current architectural constraints, which, in turn, place hard bounds on system capacity, efficiency of performance, and complexity. To enable airspace operations of the future, a clean-slate (ab initio) airspace design(s) must be considered. This ab initio National Airspace System (NAS) must be capable of accommodating increased traffic density, a broader diversity of aircraft, and on-demand mobility. System and subsystem designs should scale to accommodate the inevitable demand for airspace services that include large numbers of autonomous Unmanned Aerial Vehicles and a paradigm shift in general aviation (e.g., personal air vehicles) in addition to more traditional aerial vehicles such as commercial jetliners and weather balloons. The complex and adaptive nature of ab initio designs for the future NAS requires new approaches to validation, adding a significant physical experimentation component to analytical and simulation tools. In addition to software modeling and simulation, the ability to exercise system solutions in a flight environment will be an essential aspect of validation. The NASA Langley Research Center (LaRC) Autonomy Incubator seeks to develop a flight simulation infrastructure for ab initio modeling and simulation that assumes no specific NAS architecture and models vehicle-to-vehicle behavior to examine interactions and emergent behaviors among hundreds of intelligent aerial agents exhibiting collaborative, cooperative, coordinative, selfish, and malicious behaviors. The air transportation system of the future will be a complex adaptive system (CAS) characterized by complex and sometimes unpredictable (or unpredicted) behaviors that result from temporal and spatial interactions among large numbers of participants. A CAS not only evolves with a changing environment and adapts to it; it is also closely coupled to all systems that constitute the environment.
Thus, the ecosystem that contains the system and other systems evolves with the CAS as well. The effects of the emerging adaptation and co-evolution are difficult to capture with only combined mathematical and computational experimentation. Therefore, an ab initio flight simulation environment must accommodate individual vehicles, groups of self-organizing vehicles, and large-scale infrastructure behavior. Inspired by Massively Multiplayer Online Role Playing Games (MMORPG) and Serious Gaming, the proposed ab initio simulation environment is similar to online gaming environments in which player participants interact with each other, affect their environment, and expect the simulation to persist and change regardless of any individual player's active participation.
Wind turbine wake measurement in complex terrain
NASA Astrophysics Data System (ADS)
Hansen, KS; Larsen, GC; Menke, R.; Vasiljevic, N.; Angelou, N.; Feng, J.; Zhu, WJ; Vignaroli, A.; W, W. Liu; Xu, C.; Shen, WZ
2016-09-01
SCADA data from a wind farm and high-frequency time series measurements obtained with remote scanning systems have been analysed with a focus on identification of wind turbine wake properties in complex terrain. The analysis indicates that within the flow regime characterized by medium to large downstream distances (more than 5 diameters) from the wake-generating turbine, the wake changes according to local atmospheric conditions, e.g. vertical wind speed. In very complex terrain the wake effects are often “overruled” by distortion effects due to the terrain complexity or topography.
The T.M.R. Data Dictionary: A Management Tool for Data Base Design
Ostrowski, Maureen; Bernes, Marshall R.
1984-01-01
In January 1981, a dictionary-driven ambulatory care information system known as TMR (The Medical Record) was installed at a large private medical group practice in Los Angeles. TMR's data dictionary has enabled the medical group to adapt the software to meet changing user needs largely without programming support. For top management, the dictionary is also a tool for navigating through the system's complexity and assuring the integrity of management goals.
Large Crawler Crane for new lightning protection system
2007-10-25
A large crawler crane begins moving away from the turn basin at the Launch Complex 39 Area on NASA's Kennedy Space Center. The crane, with its 70-foot boom, will be moved to Launch Pad 39B and used to construct a new lightning protection system for the Constellation Program and Ares/Orion launches. Pad B will be the site of the first Ares vehicle launch, including Ares I-X, which is scheduled for April 2009.
Big Data Analytics with Datalog Queries on Spark.
Shkapsky, Alexander; Yang, Mohan; Interlandi, Matteo; Chiu, Hsuan; Condie, Tyson; Zaniolo, Carlo
2016-01-01
There is great interest in exploiting the opportunity provided by cloud computing platforms for large-scale analytics. Among these platforms, Apache Spark is growing in popularity for machine learning and graph analytics. Developing efficient complex analytics in Spark requires deep understanding of both the algorithm at hand and the Spark API or subsystem APIs (e.g., Spark SQL, GraphX). Our BigDatalog system addresses the problem by providing concise declarative specification of complex queries amenable to efficient evaluation. Towards this goal, we propose compilation and optimization techniques that tackle the important problem of efficiently supporting recursion in Spark. We perform an experimental comparison with other state-of-the-art large-scale Datalog systems and verify the efficacy of our techniques and effectiveness of Spark in supporting Datalog-based analytics.
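The recursion support at issue can be illustrated with textbook semi-naive evaluation of a transitive-closure query. The single-machine sketch below is not BigDatalog's Spark compilation, only the evaluation strategy such systems optimize: each round joins only the newly derived ("delta") facts against the base relation.

```python
def transitive_closure(edges):
    """Semi-naive evaluation of the recursive Datalog program:
        tc(X, Y) :- edge(X, Y).
        tc(X, Y) :- tc(X, Z), edge(Z, Y).
    `edges` is a set of (source, target) pairs."""
    tc = set(edges)
    delta = set(edges)
    while delta:
        # Join only the newly derived facts against the base relation.
        new = {(x, w) for (x, z) in delta for (v, w) in edges if z == v}
        delta = new - tc     # keep only genuinely new facts
        tc |= delta
    return tc
```

In a distributed setting the same delta discipline keeps each iteration's shuffled data small, which is the essence of making recursion efficient on a platform like Spark.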
Simulations of Instabilities in Complex Valve and Feed Systems
NASA Technical Reports Server (NTRS)
Ahuja, Vineet; Hosangadi, Ashvin; Shipman, Jeremy; Cavallo, Peter A.
2006-01-01
CFD analyses are playing an increasingly important role in identifying and characterizing flow-induced instabilities in rocket engine test facilities and flight systems. In this paper, we analyze instability mechanisms that range from turbulent pressure fluctuations due to vortex shedding in structurally complex valve systems, to flow resonance in plug cavities, to large-scale pressure fluctuations due to the collapse of cavitation-induced vapor clouds. Furthermore, we discuss simulations of transient behavior related to valve motion that can serve as guidelines for valve scheduling. Such predictions of valve response to varying flow conditions are of crucial importance to engine operation and testing.
Internet-enabled collaborative agent-based supply chains
NASA Astrophysics Data System (ADS)
Shen, Weiming; Kremer, Rob; Norrie, Douglas H.
2000-12-01
This paper presents some results of our recent research work related to the development of a new Collaborative Agent System Architecture (CASA) and an Infrastructure for Collaborative Agent Systems (ICAS). Initially proposed as a general architecture for Internet-based collaborative agent systems (particularly complex industrial collaborative agent systems), the proposed architecture is well suited to managing the Internet-enabled complex supply chain of a large manufacturing enterprise. The general collaborative agent system architecture, with its basic communication and cooperation services, domain-independent components, prototypes, and mechanisms, is described. Benefits of implementing Internet-enabled supply chains with the proposed infrastructure are discussed. A case study on Internet-enabled supply chain management is presented.
Midbond basis functions for weakly bound complexes
NASA Astrophysics Data System (ADS)
Shaw, Robert A.; Hill, J. Grant
2018-06-01
Weakly bound systems present a difficult problem for conventional atom-centred basis sets due to large separations, necessitating the use of large, computationally expensive bases. This can be remedied by placing a small number of functions in the region between molecules in the complex. We present compact sets of optimised midbond functions for a range of complexes involving noble gases, alkali metals and small molecules for use in high accuracy coupled-cluster calculations, along with a more robust procedure for their optimisation. It is shown that excellent results are possible with double-zeta quality orbital basis sets when a few midbond functions are added, improving both the interaction energy and the equilibrium bond lengths of a series of noble gas dimers by 47% and 8%, respectively. When used in conjunction with explicitly correlated methods, near complete basis set limit accuracy is readily achievable at a fraction of the cost that using a large basis would entail. General purpose auxiliary sets are developed to allow explicitly correlated midbond function studies to be carried out, making it feasible to perform very high accuracy calculations on weakly bound complexes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bechtel Jacobs Company LLC
2002-11-01
The Y-12 National Security Complex (Y-12 Complex) is an active manufacturing and developmental engineering facility that is located on the U.S. Department of Energy (DOE) Oak Ridge Reservation. Building 9201-2 was one of the first process buildings constructed at the Y-12 Complex. Construction involved relocating and straightening of the Upper East Fork Poplar Creek (UEFPC) channel, adding large quantities of fill material to level areas along the creek, and pumping of concrete into sinkholes and solution cavities present within the limestone bedrock. Flow from a large natural spring designated as "Big Spring" on the original 1943 Stone & Webster Building 9201-2 Field Sketch FS6003 was captured and directed to UEFPC through a drainpipe designated Outfall 51. The building was used from 1953 to 1955 for pilot plant operations for an industrial process that involved the use of large quantities of elemental mercury. Past operations at the Y-12 Complex led to the release of mercury to the environment. Significant environmental media at the site were contaminated by accidental releases of mercury from the building process facilities piping and sumps associated with Y-12 Complex mercury handling facilities. Releases to the soil surrounding the buildings have resulted in significant levels of mercury in these areas of contamination, which is ultimately transported to UEFPC, its streambed, and off-site. Bechtel Jacobs Company LLC (BJC) is the DOE-Oak Ridge Operations prime contractor responsible for conducting environmental restoration activities at the Y-12 Complex. In order to mitigate the mercury being released to UEFPC, the Big Spring Water Treatment System will be designed and constructed as a Comprehensive Environmental Response, Compensation, and Liability Act action. This facility will treat the combined flow from Big Spring feeding Outfall 51 and the inflow now being processed at the East End Mercury Treatment System (EEMTS). 
Both discharge to UEFPC adjacent to Bldg. 9201-2. The EEMTS treats mercury-contaminated groundwater that collects in sumps in the basement of Bldg. 9201-2. A pre-design study was performed to investigate the applicability of various treatment technologies for reducing mercury discharges at Outfall 51 in support of the design of the Big Spring Water Treatment System. This document evaluates the results of the pre-design study for selection of the mercury removal technology for the treatment system.
Major technological innovations introduced in the large antennas of the Deep Space Network
NASA Technical Reports Server (NTRS)
Imbriale, W. A.
2002-01-01
The NASA Deep Space Network (DSN) is the largest and most sensitive scientific, telecommunications and radio navigation network in the world. Its principal responsibilities are to provide communications, tracking, and science services to most of the world's spacecraft that travel beyond low Earth orbit. The network consists of three Deep Space Communications Complexes. Each of the three complexes consists of multiple large antennas equipped with ultra-sensitive receiving systems. A centralized Signal Processing Center (SPC) remotely controls the antennas, generates and transmits spacecraft commands, and receives and processes the spacecraft telemetry.
Autonomous Energy Grids | Grid Modernization | NREL
Autonomous energy grids control themselves using advanced machine learning and simulation to create resilient, reliable, and affordable optimized energy systems. Current frameworks to monitor, control, and optimize large-scale energy systems fall short, motivating research that draws on optimization theory, control theory, big data analytics, and complex system theory and modeling.
Modeling complex aquifer systems: a case study in Baton Rouge, Louisiana (USA)
NASA Astrophysics Data System (ADS)
Pham, Hai V.; Tsai, Frank T.-C.
2017-05-01
This study targets two challenges in groundwater model development: grid generation and model calibration for aquifer systems that are fluvial in origin. Realistic hydrostratigraphy can be developed using a large quantity of well log data to capture the complexity of an aquifer system. However, generating valid groundwater model grids to be consistent with the complex hydrostratigraphy is non-trivial. Model calibration can also become intractable for groundwater models that intend to match the complex hydrostratigraphy. This study uses the Baton Rouge aquifer system, Louisiana (USA), to illustrate a technical need to cope with grid generation and model calibration issues. A grid generation technique is introduced based on indicator kriging to interpolate 583 wireline well logs in the Baton Rouge area to derive a hydrostratigraphic architecture with fine vertical discretization. Then, an upscaling procedure is developed to determine a groundwater model structure with 162 layers that captures facies geometry in the hydrostratigraphic architecture. To handle model calibration for such a large model, this study utilizes a derivative-free optimization method in parallel computing to complete parameter estimation in a few months. The constructed hydrostratigraphy indicates the Baton Rouge aquifer system is fluvial in origin. The calibration result indicates hydraulic conductivity for Miocene sands is higher than that for Pliocene to Holocene sands and indicates the Baton Rouge fault and the Denham Springs-Scotlandville fault to be low-permeability leaky aquifers. The modeling result shows significantly low groundwater level in the "2,000-foot" sand due to heavy pumping, indicating potential groundwater upward flow from the "2,400-foot" sand.
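The upscaling step, collapsing a finely discretized hydrostratigraphic column into model layers, can be caricatured by a majority vote over groups of fine cells. This is a simplified stand-in for the paper's procedure; the function and facies names below are invented for illustration:

```python
from collections import Counter

def upscale(fine_facies, factor):
    """Collapse groups of `factor` fine vertical cells into one model
    layer by majority vote on the facies label. A toy analogue of
    upscaling a kriged hydrostratigraphic architecture into a
    groundwater model grid."""
    return [
        Counter(fine_facies[i:i + factor]).most_common(1)[0][0]
        for i in range(0, len(fine_facies), factor)
    ]
```

The real procedure must also preserve facies geometry and hydraulic connectivity across layers, which a per-column vote like this ignores.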
Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas
2017-01-01
Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments. PMID:28190948
A Novel Interdisciplinary Approach to Socio-Technical Complexity
NASA Astrophysics Data System (ADS)
Bassetti, Chiara
The chapter presents a novel interdisciplinary approach that integrates micro-sociological analysis into computer-vision and pattern-recognition modeling and algorithms, the purpose being to tackle socio-technical complexity at a systemic yet micro-grounded level. The approach is empirically grounded and both theoretically and analytically driven, yet systemic and multidimensional, semi-supervised and computable, and oriented towards large-scale applications. The chapter describes the proposed approach especially with respect to its sociological foundations, and as applied to the analysis of a particular setting, i.e., sport-spectator crowds. Crowds, better defined as large gatherings, are almost ever-present in our societies, and capturing their dynamics is crucial. From social sciences to public safety management and emergency response, modeling and predicting large gatherings' presence and dynamics, and thus possibly preventing critical situations and being able to react to them properly, is fundamental. This is where semi-automated technologies can make the difference. The work presented in this chapter is intended as a scientific step towards such an objective.
Thermal modeling and analysis of structurally complex spacecraft using the IDEAS system
NASA Technical Reports Server (NTRS)
Garrett, L. B.
1983-01-01
Large antenna satellites of unprecedented size are needed for a number of applications, with antenna diameters on the order of 50 meters and upward required. Such antennas involve large expanses of lattice structures with hundreds or thousands of individual connecting members, and thermal effects are a crucial consideration in the design of such structures. Software capabilities have emerged that are coded to include the major first-order thermal effects and to purposely ignore, in the interest of computational efficiency, the secondary effects. The Interactive Design and Evaluation of Advanced Spacecraft (IDEAS) system is one such capability, developed for thermal-structural interaction analyses related to the design of large, structurally complex classes of future spacecraft. An IDEAS overview is presented. Attention is given to a typical antenna analysis using IDEAS, the thermal and loading analyses of a tetrahedral truss spacecraft, and ecliptic and polar orbit analyses.
Thermoelectric Properties of Complex Oxide Heterostructures
NASA Astrophysics Data System (ADS)
Cain, Tyler Andrew
Thermoelectrics are a promising energy conversion technology for power generation and cooling systems. The thermal and electrical properties of the materials at the heart of thermoelectric devices dictate conversion efficiency and technological viability. Studying the fundamental properties of potentially new thermoelectric materials is of great importance for improving device performance and understanding the electronic structure of materials systems. In this dissertation, investigations on the thermoelectric properties of a prototypical complex oxide, SrTiO3, are discussed. Hybrid molecular beam epitaxy (MBE) is used to synthesize La-doped SrTiO3 thin films, which exhibit high electron mobilities and large Seebeck coefficients resulting in large thermoelectric power factors at low temperatures. Large interfacial electron densities have been observed in SrTiO3/RTiO3 (R=Gd,Sm) heterostructures. The thermoelectric properties of such heterostructures are investigated, including the use of a modulation doping approach to control interfacial electron densities. Low-temperature Seebeck coefficients of extreme electron-density SrTiO3 quantum wells are shown to provide insight into their electronic structure.
System Dynamics Modeling of Transboundary Systems: The Bear River Basin Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerald Sehlke; Jake Jacobson
2005-09-01
System dynamics is a computer-aided approach to evaluating the interrelationships of different components and activities within complex systems. Recently, system dynamics models have been developed in areas such as policy design, biological and medical modeling, energy and environmental analysis, and in various other areas in the natural and social sciences. The Idaho National Engineering and Environmental Laboratory, a multi-purpose national laboratory managed by the Department of Energy, has developed a system dynamics model in order to evaluate its utility for modeling large complex hydrological systems. We modeled the Bear River Basin, a transboundary basin that includes portions of Idaho, Utah, and Wyoming. We found that system dynamics modeling is very useful for integrating surface water and groundwater data and for simulating the interactions between these sources within a given basin. In addition, we also found system dynamics modeling is useful for integrating complex hydrologic data with other information (e.g., policy, regulatory and management criteria) to produce a decision support system. Such decision support systems can allow managers and stakeholders to better visualize the key hydrologic elements and management constraints in the basin, which enables them to better understand the system via the simulation of multiple “what-if” scenarios. Although system dynamics models can be developed to conduct traditional hydraulic/hydrologic surface water or groundwater modeling, we believe that their strength lies in their ability to quickly evaluate trends and cause–effect relationships in large-scale hydrological systems; for integrating disparate data; for incorporating output from traditional hydraulic/hydrologic models; and for integration of interdisciplinary data, information and criteria to support better management decisions.
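The bookkeeping underlying a system dynamics model is numerical integration of stocks under inflow and outflow rates. A minimal single-stock sketch with hypothetical parameters (not the Bear River model itself, which couples many stocks, policies, and constraints):

```python
def simulate_stock(inflow, outflow_rate, stock0, dt, steps):
    """Minimal system-dynamics stock-and-flow model: one stock
    (e.g. basin storage) integrated with Euler's method under
        d(stock)/dt = inflow - outflow_rate * stock
    Returns the full trajectory so trends can be inspected."""
    stock = stock0
    history = [stock]
    for _ in range(steps):
        stock += dt * (inflow - outflow_rate * stock)
        history.append(stock)
    return history
```

The stock settles at the equilibrium inflow / outflow_rate; "what-if" scenarios amount to rerunning with different flow parameters and comparing trajectories.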
The topological requirements for robust perfect adaptation in networks of any size.
Araujo, Robyn P; Liotta, Lance A
2018-05-01
Robustness, the ability to function and thrive amid changing and unfavorable environments, is a fundamental requirement for living systems. Until now it has been an open question how large and complex biological networks can exhibit robust behaviors, such as perfect adaptation to a variable stimulus, since complexity is generally associated with fragility. Here we report that all networks that exhibit robust perfect adaptation (RPA) to a persistent change in stimulus are decomposable into well-defined modules, of which there exist two distinct classes. These two modular classes represent a topological basis for all RPA-capable networks, and generate the full set of topological realizations of the internal model principle for RPA in complex, self-organizing, evolvable bionetworks. This unexpected result supports the notion that evolutionary processes are empowered by simple and scalable modular design principles that promote robust performance no matter how large or complex the underlying networks become.
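One concrete realization of the internal model principle for RPA is integral feedback control: an integrator node accumulates the deviation of the output from its set point and feeds it back, so the output returns exactly to the set point after any persistent stimulus change. The sketch below uses illustrative parameters and Euler integration; it is not a network from the paper:

```python
def simulate(u, y=0.0, z=0.0, y_set=1.0, k=1.0, dt=0.01, steps=5000):
    """Integral feedback: z integrates the error (y - y_set) and
    inhibits y. At steady state dz/dt = 0 forces y == y_set for ANY
    constant stimulus u -- robust perfect adaptation."""
    for _ in range(steps):
        dy = u - k * z - y   # output driven by stimulus, inhibited by z
        dz = y - y_set       # integrator accumulates the error
        y += dt * dy
        z += dt * dz
    return y, z
```

Stepping the stimulus only shifts where the integrator settles; the output itself adapts back perfectly, which is the defining RPA behavior.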
Jiang, Shang-Da; Maganas, Dimitrios; Levesanos, Nikolaos; Ferentinos, Eleftherios; Haas, Sabrina; Thirunavukkuarasu, Komalavalli; Krzystek, J; Dressel, Martin; Bogani, Lapo; Neese, Frank; Kyritsis, Panayotis
2015-10-14
The high-spin (S = 1) tetrahedral Ni(II) complex [Ni{(i)Pr2P(Se)NP(Se)(i)Pr2}2] was investigated by magnetometry, spectroscopic, and quantum chemical methods. Angle-resolved magnetometry studies revealed the orientation of the magnetization principal axes. The very large zero-field splitting (zfs), D = 45.40(2) cm(-1), E = 1.91(2) cm(-1), of the complex was accurately determined by far-infrared magnetic spectroscopy, directly observing transitions between the spin sublevels of the triplet ground state. These are the largest zfs values ever directly determined for a high-spin Ni(II) complex. Ab initio calculations further probed the electronic structure of the system, elucidating the factors controlling the sign and magnitude of D. The latter is dominated by spin-orbit coupling contributions of the Ni ions, whereas the corresponding effects of the Se atoms are remarkably smaller.
The life of plant mitochondrial complex I.
Braun, Hans-Peter; Binder, Stefan; Brennicke, Axel; Eubel, Holger; Fernie, Alisdair R; Finkemeier, Iris; Klodmann, Jennifer; König, Ann-Christine; Kühn, Kristina; Meyer, Etienne; Obata, Toshihiro; Schwarzländer, Markus; Takenaka, Mizuki; Zehrmann, Anja
2014-11-01
The mitochondrial NADH dehydrogenase complex (complex I) of the respiratory chain has several remarkable features in plants: (i) particularly many of its subunits are encoded by the mitochondrial genome, (ii) its mitochondrial transcripts undergo extensive maturation processes (e.g. RNA editing, trans-splicing), (iii) its assembly follows unique routes, (iv) it includes an additional functional domain which contains carbonic anhydrases and (v) it is, indirectly, involved in photosynthesis. Comprising about 50 distinct protein subunits, complex I of plants is very large. However, an even larger number of proteins are required to synthesize these subunits and assemble the enzyme complex. This review aims to follow the complete "life cycle" of plant complex I from various molecular perspectives. We provide arguments that complex I represents an ideal model system for studying the interplay of respiration and photosynthesis, the cooperation of mitochondria and the nucleus during organelle biogenesis and the evolution of the mitochondrial oxidative phosphorylation system. Copyright © 2014 Elsevier B.V. and Mitochondria Research Society. All rights reserved.
Untangling Brain-Wide Dynamics in Consciousness by Cross-Embedding
Tajima, Satohiro; Yanagawa, Toru; Fujii, Naotaka; Toyoizumi, Taro
2015-01-01
Brain-wide interactions generating complex neural dynamics are considered crucial for emergent cognitive functions. However, the irreducible nature of nonlinear and high-dimensional dynamical interactions challenges conventional reductionist approaches. We introduce a model-free method, based on embedding theorems in nonlinear state-space reconstruction, that permits a simultaneous characterization of complexity in local dynamics, directed interactions between brain areas, and how the complexity is produced by the interactions. We demonstrate this method in large-scale electrophysiological recordings from awake and anesthetized monkeys. The cross-embedding method captures structured interaction underlying cortex-wide dynamics that may be missed by conventional correlation-based analysis, demonstrating a critical role of time-series analysis in characterizing brain state. The method reveals a consciousness-related hierarchy of cortical areas, where dynamical complexity increases along with cross-area information flow. These findings demonstrate the advantages of the cross-embedding method in deciphering large-scale and heterogeneous neuronal systems, suggesting a crucial contribution by sensory-frontoparietal interactions to the emergence of complex brain dynamics during consciousness. PMID:26584045
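The embedding-theorem machinery the method builds on starts from delay-coordinate reconstruction of a state space from a single scalar recording, followed by nearest-neighbor cross mapping between areas. A minimal pure-Python sketch (function names invented; this is not the authors' implementation):

```python
def delay_embed(x, dim, tau):
    """Takens-style delay embedding: reconstruct state vectors
    [x(t), x(t+tau), ..., x(t+(dim-1)*tau)] from scalar series x."""
    n = len(x) - (dim - 1) * tau
    return [[x[t + i * tau] for i in range(dim)] for t in range(n)]

def cross_map(source_embed, target, t_query):
    """Cross mapping: find the source delay-vector nearest to the one
    at t_query and read off the target series at that neighbor's time.
    Good predictions indicate the source dynamics 'contain' the target."""
    q = source_embed[t_query]
    best, best_d = None, float("inf")
    for t, v in enumerate(source_embed):
        if t == t_query:
            continue
        d = sum((a - b) ** 2 for a, b in zip(q, v))
        if d < best_d:
            best, best_d = t, d
    return target[best]
```

Comparing cross-map prediction quality in both directions between two recorded areas gives a model-free, directed measure of interaction, which is the core idea scaled up to cortex-wide recordings.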
Tool Use Within NASA Software Quality Assurance
NASA Technical Reports Server (NTRS)
Shigeta, Denise; Port, Dan; Nikora, Allen P.; Wilf, Joel
2013-01-01
As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to have the ability to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea: it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions they ultimately make. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. In order to do so, an assurance organization will need accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support for assurance organizations within NASA, and describe on-going work at JPL for providing assurance organizations with the information about tools they need to use them effectively.
Autonomous Modeling, Statistical Complexity and Semi-annealed Treatment of Boolean Networks
NASA Astrophysics Data System (ADS)
Gong, Xinwei
This dissertation presents three studies on Boolean networks. Boolean networks are a class of mathematical systems consisting of interacting elements with binary state variables. Each element is a node with a Boolean logic gate, and the presence of interactions between any two nodes is represented by directed links. Boolean networks that implement the logic structures of real systems are studied as coarse-grained models of the real systems. Large random Boolean networks are studied with mean field approximations and used to provide a baseline of possible behaviors of large real systems. This dissertation presents one study of the former type, concerning the stable oscillation of a yeast cell-cycle oscillator, and two studies of the latter type, respectively concerning the statistical complexity of large random Boolean networks and an extension of traditional mean field techniques that accounts for the presence of short loops. In the cell-cycle oscillator study, a novel autonomous update scheme is introduced to study the stability of oscillations in small networks. A motif that corrects pulse-growing perturbations and a motif that grows pulses are identified. A combination of the two motifs is capable of sustaining stable oscillations. Examining a Boolean model of the yeast cell-cycle oscillator using an autonomous update scheme yields evidence that it is endowed with such a combination. Random Boolean networks are classified as ordered, critical or disordered based on their response to small perturbations. In the second study, random Boolean networks are taken as prototypical cases for the evaluation of two measures of complexity based on a criterion for optimal statistical prediction. One measure, defined for homogeneous systems, does not distinguish between the static spatial inhomogeneity in the ordered phase and the dynamical inhomogeneity in the disordered phase. 
A modification in which complexities of individual nodes are calculated yields vanishing complexity values for networks in the ordered and critical phases and for highly disordered networks, peaking somewhere in the disordered phase. Individual nodes with high complexity have, on average, a larger influence on the system dynamics. Lastly, a semi-annealed approximation that preserves the correlation between states at neighboring nodes is introduced to study a social game-inspired network model in which all links are bidirectional and all nodes have a self-input. The technique developed here is shown to yield accurate predictions of distribution of players' states, and accounts for some nontrivial collective behavior of game theoretic interest.
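The common primitive behind all three studies is the synchronous Boolean network update: every node simultaneously applies its logic gate to the current states of its inputs. A toy two-node example (a NOT gate feeding a copy gate, not one of the dissertation's models) can be written as:

```python
def step(state, inputs, tables):
    """One synchronous update of a Boolean network.
    state:  tuple of 0/1 node states
    inputs: inputs[i] is the tuple of node indices feeding node i
    tables: tables[i] maps an input-state tuple to node i's next state"""
    return tuple(
        tables[i][tuple(state[j] for j in inputs[i])]
        for i in range(len(state))
    )
```

Iterating `step` from every initial state reveals the attractor structure (fixed points and cycles) that ordered/critical/disordered classifications and the autonomous and semi-annealed analyses all reason about.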
Numerical Technology for Large-Scale Computational Electromagnetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharpe, R; Champagne, N; White, D
The key bottleneck of implicit computational electromagnetics tools for large complex geometries is the solution of the resulting linear system of equations. The goal of this effort was to research and develop critical numerical technology that alleviates this bottleneck for large-scale computational electromagnetics (CEM). The mathematical operators and numerical formulations used in this arena of CEM yield linear equations that are complex valued, unstructured, and indefinite. Also, simultaneously applying multiple mathematical modeling formulations to different portions of a complex problem (hybrid formulations) results in a mixed structure linear system, further increasing the computational difficulty. Typically, these hybrid linear systems are solved using a direct solution method, which was acceptable for Cray-class machines but does not scale adequately for ASCI-class machines. Additionally, LLNL's previously existing linear solvers were not well suited for the linear systems that are created by hybrid implicit CEM codes. Hence, a new approach was required to make effective use of ASCI-class computing platforms and to enable the next generation design capabilities. Multiple approaches were investigated, including the latest sparse-direct methods developed by our ASCI collaborators. In addition, approaches that combine domain decomposition (or matrix partitioning) with general-purpose iterative methods and special-purpose preconditioners were investigated. Special-purpose preconditioners that take advantage of the structure of the matrix were adapted and developed based on intimate knowledge of the matrix properties. Finally, new operator formulations were developed that radically improve the conditioning of the resulting linear systems thus greatly reducing solution time. The goal was to enable the solution of CEM problems that are 10 to 100 times larger than our previous capability.
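As a toy illustration of the preconditioned-iteration idea, the sketch below applies the simplest diagonal (Jacobi) preconditioner in a fixed-point iteration on a small complex-valued system. Production CEM solvers use Krylov methods and far more structured preconditioners; everything here is invented for illustration:

```python
def jacobi_solve(A, b, iters=200):
    """Jacobi iteration (diagonally preconditioned Richardson):
        x <- x + D^{-1} (b - A x)
    for a small, dense, complex-valued system given as nested lists.
    Converges when the diagonal dominates the off-diagonal entries."""
    n = len(b)
    x = [0j] * n
    for _ in range(iters):
        x = [
            x[i] + (b[i] - sum(A[i][j] * x[j] for j in range(n))) / A[i][i]
            for i in range(n)
        ]
    return x
```

The preconditioner here is just D^{-1}; the special-purpose preconditioners the abstract describes replace D with an operator built from knowledge of the matrix structure, shrinking the iteration count for indefinite, unstructured systems.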
ERIC Educational Resources Information Center
Monk, Ellen F.; Lycett, Mark
2016-01-01
Enterprise Resource Planning Systems (ERP) are very large and complex software packages that run every aspect of an organization. Increasingly, ERP systems are being used in higher education as one way to teach business processes, essential knowledge for students competing in today's business environment. Past research attempting to measure…
Measuring School Performance To Improve Student Achievement and To Reward Effective Programs.
ERIC Educational Resources Information Center
Heistad, Dave; Spicuzza, Rick
This paper describes the method that the Minneapolis Public School system (MPS), Minnesota, uses to measure school and student performance. MPS uses a multifaceted system that both captures and accounts for the complexity of a large urban school district. The system incorporates: (1) a hybrid model of critical indicators that report on level of…
Aufderheide, Helge; Rudolf, Lars; Gross, Thilo; Lafferty, Kevin D.
2013-01-01
Recent attempts to predict the response of large food webs to perturbations have revealed that in larger systems increasingly precise information on the elements of the system is required. Thus, the effort needed for good predictions grows quickly with the system's complexity. Here, we show that not all elements need to be measured equally well, suggesting that a more efficient allocation of effort is possible. We develop an iterative technique for determining an efficient measurement strategy. In model food webs, we find that it is most important to precisely measure the mortality and predation rates of long-lived, generalist, top predators. Prioritizing the study of such species will make it easier to understand the response of complex food webs to perturbations.
Statistical Physics of Cascading Failures in Complex Networks
NASA Astrophysics Data System (ADS)
Panduranga, Nagendra Kumar
Systems such as the power grid, world wide web (WWW), and internet are categorized as complex systems because of the presence of a large number of interacting elements. For example, the WWW is estimated to have a billion webpages and understanding the dynamics of such a large number of individual agents (whose individual interactions might not be fully known) is a challenging task. Complex network representations of these systems have proved to be of great utility. Statistical physics is the study of emergence of macroscopic properties of systems from the characteristics of the interactions between individual molecules. Hence, statistical physics of complex networks has been an effective approach to study these systems. In this dissertation, I have used statistical physics to study two distinct phenomena in complex systems: i) Cascading failures and ii) Shortest paths in complex networks. Understanding cascading failures is considered to be one of the "holy grails" in the study of complex systems such as the power grid, transportation networks, and economic systems. Studying failures of these systems as percolation on complex networks has proved to be insightful. Previously, cascading failures have been studied extensively using two different models: k-core percolation and interdependent networks. The first part of this work combines the two models into a general model, solves it analytically, and validates the theoretical predictions through extensive computer simulations. The phase diagram of the percolation transition has been systematically studied as one varies the average local k-core threshold and the coupling between networks. The phase diagram of the combined processes is very rich and includes novel features that do not appear in the models which study each of the processes separately. 
For example, the phase diagram consists of first- and second-order transition regions separated by two tricritical lines that merge together and enclose a two-stage transition region. In the two-stage transition, the size of the giant component undergoes a first-order jump at a certain occupation probability followed by a continuous second-order transition at a smaller occupation probability. Furthermore, at certain fixed interdependencies, the percolation transition cycles from first-order to second-order to two-stage to first-order as the k-core threshold is increased. We set up the analytical equations describing the phase boundaries of the two-stage transition region and we derive the critical exponents for each type of transition. Understanding the shortest paths between individual elements in systems like communication networks and social media networks is important in the study of information cascades in these systems. Often, large heterogeneity can be present in the connections between nodes in these networks. Certain sets of nodes can be more highly connected among themselves than with the nodes from other sets. These sets of nodes are often referred to as 'communities'. The second part of this work studies the effect of the presence of communities on the distribution of shortest paths in a network using a modular Erdős-Rényi network model. In this model, the number of communities and the degree of modularity of the network can be tuned using the parameters of the model. We find that the model reaches a percolation threshold while tuning the degree of modularity of the network and the distribution of the shortest paths in the network can be used as an indicator of how the communities are connected.
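The k-core pruning process named above can be sketched in a few lines of Python: repeatedly delete every node with fewer than k neighbors until all survivors have degree at least k. This is an illustrative sketch of plain k-core percolation only, not the combined k-core/interdependent-network model of the dissertation, and the toy graph is made up.

```python
def k_core(adjacency, k):
    """Iteratively remove nodes of degree < k until every survivor has degree >= k."""
    # Work on a mutable copy of the neighbor sets.
    adj = {node: set(neighbors) for node, neighbors in adjacency.items()}
    changed = True
    while changed:
        changed = False
        for node in list(adj):
            if len(adj[node]) < k:
                # Remove the node and every edge touching it.
                for neighbor in adj.pop(node):
                    adj[neighbor].discard(node)
                changed = True
    return set(adj)

# A small graph: a triangle (0, 1, 2) with a pendant node 3 attached to node 0.
graph = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1}, 3: {0}}
print(k_core(graph, 2))  # the triangle survives; the pendant node is pruned
```

The size of the surviving k-core as edges or nodes are randomly removed is the order parameter whose first-order, second-order, and two-stage transitions the abstract describes.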
Adaptive sampling strategies with high-throughput molecular dynamics
NASA Astrophysics Data System (ADS)
Clementi, Cecilia
Despite recent significant hardware and software developments, the complete thermodynamic and kinetic characterization of large macromolecular complexes by molecular simulations still presents significant challenges. The high dimensionality of these systems and the complexity of the associated potential energy surfaces (creating multiple metastable regions connected by high free energy barriers) do not usually allow adequate sampling of the relevant regions of their configurational space by means of a single, long Molecular Dynamics (MD) trajectory. Several different approaches have been proposed to tackle this sampling problem. We focus on the development of ensemble simulation strategies, where data from a large number of weakly coupled simulations are integrated to explore the configurational landscape of a complex system more efficiently. Ensemble methods are of increasing interest as the hardware roadmap is now mostly based on increasing core counts, rather than clock speeds. The main challenge in the development of an ensemble approach for efficient sampling is in the design of strategies to adaptively distribute the trajectories over the relevant regions of the systems' configurational space, without using any a priori information on the system global properties. We will discuss the definition of smart adaptive sampling approaches that can redirect computational resources towards unexplored yet relevant regions. Our approaches are based on new developments in dimensionality reduction for high dimensional dynamical systems, and optimal redistribution of resources. NSF CHE-1152344, NSF CHE-1265929, Welch Foundation C-1570.
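A minimal count-based variant of such adaptive trajectory seeding can be sketched as follows. A toy 1D random walk stands in for a short MD trajectory, and the restart rule (launch the next short trajectory from the least-visited state seen so far) is one simple instance of the adaptive strategies discussed, not the authors' dimensionality-reduction-based method; all names and parameters are illustrative.

```python
import random

def short_trajectory(start, steps, rng):
    """A toy 1D random walk standing in for one short MD trajectory."""
    x, path = start, [start]
    for _ in range(steps):
        x += rng.choice([-1, 1])
        path.append(x)
    return path

def adaptive_sample(rounds, steps, rng):
    counts = {}  # visit count per discrete state (bin)
    start = 0
    for _ in range(rounds):
        for x in short_trajectory(start, steps, rng):
            counts[x] = counts.get(x, 0) + 1
        # Adaptive rule: seed the next trajectory from the least-visited state,
        # pushing sampling toward poorly explored regions of the landscape.
        start = min(counts, key=counts.get)
    return counts

rng = random.Random(0)
counts = adaptive_sample(rounds=20, steps=50, rng=rng)
print(len(counts), "distinct states visited")
```

In a real ensemble MD setting the "states" would come from a learned low-dimensional coordinate rather than integer bins, but the resource-redistribution loop has the same shape.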
Patterns and controls on historical channel change in the Willamette River, Oregon, USA
Jennifer Rose Wallick; Gordon E. Grant; Stephen T. Lancaster; John P. Bolte; Roger P. Denlinger
2007-01-01
Distinguishing human impacts on channel morphology from the natural behaviour of fluvial systems is problematic for large river basins. Large river basins, by virtue of their size, typically encompass wide ranges of geology and landforms resulting in diverse controls on channel form. They also inevitably incorporate long and complex histories of overlapping human and...
English Teaching and the Economic Development of Colombia.
ERIC Educational Resources Information Center
Stansfield, Charles W.
To supply the large number of workers qualified for complex jobs, a demand created by the growing needs of a rapidly growing population, Colombia must make provisions for an expanded system of higher education. This can be accomplished by sending students abroad to study at the university level. The large number of students coming to the United…
Preparing Non-Educators for the Superintendency
ERIC Educational Resources Information Center
Quinn, Tim
2007-01-01
It takes strong leadership skills to successfully run an entity as large and complex as an urban school district, much less turn around one that is low-performing. Most people don't realize that many urban school systems are as large as the biggest companies in America. Yet, most school district leaders and school board members have no specific…
Using constraints and their value for optimization of large ODE systems
Domijan, Mirela; Rand, David A.
2015-01-01
We provide analytical tools to facilitate a rigorous assessment of the quality and value of the fit of a complex model to data. We use this to provide approaches to model fitting, parameter estimation, the design of optimization functions and experimental optimization. This is in the context where multiple constraints are used to select or optimize a large model defined by differential equations. We illustrate the approach using models of circadian clocks and the NF-κB signalling system. PMID:25673300
NASA Astrophysics Data System (ADS)
Gorille, I.
1980-11-01
The application of MOS switching circuits of high complexity in essential automobile systems, such as ignition and injection, was investigated. A bipolar circuit technology, current hogging logic (CHL), was compared to MOS technologies for its competitiveness. The functional requirements of digital automotive systems can only be met by technologies allowing large packing densities and medium speeds. The properties of n-MOS and CMOS are promising whereas the electrical power needed by p-MOS circuits is in general prohibitively large.
Complexities, Catastrophes and Cities: Emergency Dynamics in Varying Scenarios and Urban Topologies
NASA Astrophysics Data System (ADS)
Narzisi, Giuseppe; Mysore, Venkatesh; Byeon, Jeewoong; Mishra, Bud
Complex Systems are often characterized by agents capable of interacting with each other dynamically, often in non-linear and non-intuitive ways. Trying to characterize their dynamics often results in partial differential equations that are difficult, if not impossible, to solve. A large city or a city-state is an example of such an evolving and self-organizing complex environment that efficiently adapts to different and numerous incremental changes to its social, cultural and technological infrastructure [1]. One powerful technique for analyzing such complex systems is Agent-Based Modeling (ABM) [9], which has seen an increasing number of applications in social science, economics and also biology. The agent-based paradigm facilitates easier transfer of domain specific knowledge into a model. ABM provides a natural way to describe systems in which the overall dynamics can be described as the result of the behavior of populations of autonomous components: agents, with a fixed set of rules based on local information and possible central control. As part of the NYU Center for Catastrophe Preparedness and Response (CCPR), we have been exploring how ABM can serve as a powerful simulation technique for analyzing large-scale urban disasters. The central problem in Disaster Management is that it is not immediately apparent whether the current emergency plans are robust against such sudden, rare and punctuated catastrophic events.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crawford, Daniel
8-Session Symposium on STRUCTURE AND DYNAMICS IN COMPLEX CHEMICAL SYSTEMS: GAINING NEW INSIGHTS THROUGH RECENT ADVANCES IN TIME-RESOLVED SPECTROSCOPIES. The intricacy of most chemical, biochemical, and material processes and their applications are underscored by the complex nature of the environments in which they occur. Substantial challenges for building a global understanding of a heterogeneous system include (1) identifying unique signatures associated with specific structural motifs within the heterogeneous distribution, and (2) resolving the significance of each of multiple time scales involved in both small- and large-scale nuclear reorganization. This symposium focuses on the progress in our understanding of dynamics in complex systems driven by recent innovations in time-resolved spectroscopies and theoretical developments. Such advancement is critical for driving discovery at the molecular level facilitating new applications. Broad areas of interest include: Structural relaxation and the impact of structure on dynamics in liquids, interfaces, biochemical systems, materials, and other heterogeneous environments.
Teaching High-Accuracy Global Positioning System to Undergraduates Using Online Processing Services
ERIC Educational Resources Information Center
Wang, Guoquan
2013-01-01
High-accuracy Global Positioning System (GPS) has become an important geoscientific tool used to measure ground motions associated with plate movements, glacial movements, volcanoes, active faults, landslides, subsidence, slow earthquake events, as well as large earthquakes. Complex calculations are required in order to achieve high-precision…
Funding California Schools: The Revenue Limit System
ERIC Educational Resources Information Center
Weston, Margaret
2010-01-01
Tax revenue flows to California's nearly 1,000 school districts through many different channels. According to the Governor's Committee on Education Excellence (2007), this system is so complex that the state cannot determine how revenues are distributed among school districts, and after reviewing a large number of academic studies in the Getting…
Large-Scale Optimization for Bayesian Inference in Complex Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willcox, Karen; Marzouk, Youssef
2013-11-12
The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT--Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas--Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as ``reduce then sample'' and ``sample then reduce.'' In fact, these two approaches are complementary, and can be used in conjunction with each other. 
Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
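The "reduce then sample" idea, replacing the expensive forward model with a cheap surrogate inside the sampler, can be sketched with a random-walk Metropolis sampler. Here a simple closed-form map stands in for a reduced-order model; the data value, prior, noise level, and step size are made-up illustrative choices, not the SAGUARO subsurface testbed.

```python
import math
import random

def log_posterior(theta, forward, data, noise_sd, prior_sd):
    """Gaussian likelihood around forward(theta) plus a zero-mean Gaussian prior."""
    misfit = (data - forward(theta)) ** 2 / (2 * noise_sd ** 2)
    prior = theta ** 2 / (2 * prior_sd ** 2)
    return -(misfit + prior)

def metropolis(forward, data, steps, step_sd, rng):
    """Random-walk Metropolis using only the cheap surrogate forward model."""
    theta, samples = 0.0, []
    lp = log_posterior(theta, forward, data, 0.1, 1.0)
    for _ in range(steps):
        prop = theta + rng.gauss(0.0, step_sd)
        lp_prop = log_posterior(prop, forward, data, 0.1, 1.0)
        # Accept with probability min(1, posterior ratio).
        if math.log(rng.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples.append(theta)
    return samples

# Hypothetical "reduced-order" surrogate of an expensive parameter-to-observation map.
surrogate = lambda theta: theta + 0.1 * theta ** 2
rng = random.Random(1)
samples = metropolis(surrogate, data=0.5, steps=5000, step_sd=0.5, rng=rng)
mean = sum(samples) / len(samples)
print(round(mean, 2))
```

Every posterior evaluation in the loop costs only the surrogate, which is the source of the "very large computational savings" the abstract reports; "sample then reduce" would instead keep the full model but use its adjoint-based gradients to shrink the space being sampled.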
Final Report: Large-Scale Optimization for Bayesian Inference in Complex Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghattas, Omar
2013-10-15
The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focuses on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. Our research is directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. Our efforts are integrated in the context of a challenging testbed problem that considers subsurface reacting flow and transport. The MIT component of the SAGUARO Project addresses the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
NASA Astrophysics Data System (ADS)
Sood, Suresh; Pattinson, Hugh
Traditionally, face-to-face negotiations in the real world have not been looked at as a complex systems interaction of actors resulting in a dynamic and potentially emergent system. If indeed negotiations are an outcome of a dynamic interaction of simpler behavior just as with a complex system, we should be able to see the patterns contributing to the complexities of a negotiation under study. This paper and the supporting research set out to show B2B (business-to-business) negotiations as complex systems of interacting actors exhibiting dynamic and emergent behavior. This paper discusses the exploratory research based on negotiation simulations in which a large number of business students participate as buyers and sellers. The student interactions are captured on video and a purpose built research method attempts to look for patterns of interactions between actors using visualization techniques traditionally reserved for observing the algorithmic complexity of complex systems. Students are videoed negotiating with partners. Each video is tagged according to a recognized classification and coding scheme for negotiations. The classification relates to the phases through which any particular negotiation might pass, such as laughter, aggression, compromise, and so forth — through some 30 possible categories. Were negotiations more or less successful if they progressed through the categories in different ways? Furthermore, does the data depict emergent pathway segments considered to be more or less successful? This focus on emergence within the data provides further strong support for face-to-face (F2F) negotiations to be construed as complex systems.
NASA Technical Reports Server (NTRS)
Dill, Evan T.; Young, Steven D.
2015-01-01
In the constant drive to further the safety and efficiency of air travel, the complexity of avionics-related systems, and the procedures for interacting with these systems, appear to be on an ever-increasing trend. While this growing complexity often yields productive results with respect to system capabilities and flight efficiency, it can place a larger burden on pilots to manage increasing amounts of information and to understand intricate system designs. Evidence supporting this observation is becoming widespread, yet has been largely anecdotal or the result of subjective analysis. One way to gain more insight into this issue is through experimentation using more objective measures or indicators. This study utilizes and analyzes eye-tracking data obtained during a high-fidelity flight simulation study wherein many of the complexities of current flight decks, as well as those planned for the next generation air transportation system (NextGen), were emulated. The following paper presents the findings of this study with a focus on electronic flight bag (EFB) usage, system state awareness (SSA) and events involving suspected inattentional blindness (IB).
Speeding up GW Calculations to Meet the Challenge of Large Scale Quasiparticle Predictions.
Gao, Weiwei; Xia, Weiyi; Gao, Xiang; Zhang, Peihong
2016-11-11
Although the GW approximation is recognized as one of the most accurate theories for predicting the excited-state properties of materials, scaling up conventional GW calculations for large systems remains a major challenge. We present a powerful and simple-to-implement method that can drastically accelerate fully converged GW calculations for large systems, enabling fast and accurate quasiparticle calculations for complex materials systems. We demonstrate the performance of this new method by presenting the results for ZnO and MgO supercells. A speed-up factor of nearly two orders of magnitude is achieved for a system containing 256 atoms (1024 valence electrons) with a negligibly small numerical error of ±0.03 eV. Finally, we discuss the application of our method to the GW calculations for 2D materials.
Rule-based modeling and simulations of the inner kinetochore structure.
Tschernyschkow, Sergej; Herda, Sabine; Gruenert, Gerd; Döring, Volker; Görlich, Dennis; Hofmeister, Antje; Hoischen, Christian; Dittrich, Peter; Diekmann, Stephan; Ibrahim, Bashar
2013-09-01
Combinatorial complexity is a central problem when modeling biochemical reaction networks, since the association of a few components can give rise to a large variation of protein complexes. Available classical modeling approaches are often insufficient for the analysis of very large and complex networks in detail. Recently, we developed a new rule-based modeling approach that facilitates the analysis of spatial and combinatorially complex problems. Here, we explore for the first time how this approach can be applied to a specific biological system, the human kinetochore, which is a multi-protein complex involving over 100 proteins. Applying our freely available SRSim software to a large data set on kinetochore proteins in human cells, we construct a spatial rule-based simulation model of the human inner kinetochore. The model generates an estimation of the probability distribution of the inner kinetochore 3D architecture, and we show how to analyze this distribution using information theory. In our model, the formation of a bridge between CenpA and an H3-containing nucleosome only occurs efficiently at the higher protein concentrations realized during S-phase, but possibly not in G1. Above a certain nucleosome distance, the protein bridge barely formed, pointing towards the importance of chromatin structure for kinetochore complex formation. We define a metric for the distance between structures that allows us to identify structural clusters. Using this modeling technique, we explore different hypothetical chromatin layouts. Applying a rule-based network analysis to the spatial kinetochore complex geometry allowed us to integrate experimental data on kinetochore proteins, suggesting a 3D model of the human inner kinetochore architecture that is governed by a combinatorial algebraic reaction network. This reaction network can serve as a bridge between multiple scales of modeling. Our approach can be applied to other systems beyond kinetochores.
Natural selection and self-organization in complex adaptive systems.
Di Bernardo, Mirko
2010-01-01
The central theme of this work is self-organization, "interpreted" both from the point of view of theoretical biology and from a philosophical point of view. By analysing, on the one hand, what are now considered (not only in physics) some of the most important discoveries, namely complex systems and deterministic chaos, and, on the other hand, the new frontiers of systems biology, this work highlights how large, open thermodynamic systems can spontaneously stay in an orderly regime. Such systems can represent the natural source of the order required for a stable self-organization, for homoeostasis and for hereditary variations. The order emerging in enormous, randomly interconnected nets of binary variables is almost certainly only the precursor of similar orders emerging in all the varieties of complex systems. Hence, this work, by finding new foundations for the order pervading the living world, advances the daring hypothesis according to which Darwinian natural selection is not the only source of order in the biosphere. Thus, the article, by examining the passage from Prigogine's dissipative structures theory to the contemporary theory of biological complexity, highlights the development of a coherent and continuous line of research that aims to identify the general principles marking the profound reality of that mysterious self-organization characterizing the complexity of life.
Can We Advance Macroscopic Quantum Systems Outside the Framework of Complex Decoherence Theory?
Brezinski, Mark E; Rupnick, Maria
2016-01-01
Macroscopic quantum systems (MQS) are macroscopic systems driven by quantum rather than classical mechanics, a long-studied area with minimal success until recently. Harnessing the benefits of quantum mechanics on a macroscopic level would revolutionize fields ranging from telecommunication to biology, the latter focused on here for reasons discussed. Contrary to misconceptions, there are no known physical laws that prevent the development of MQS. Instead, they are generally believed to be universally lost in complex systems through environmental entanglements (decoherence). But we argue success is achievable for MQS when decoherence compensation is developed, naturally or artificially, from top-down rather than current reductionist approaches. This paper advances the MQS field by a complex systems approach to decoherence. First, why complex system decoherence approaches (top-down) are needed is discussed. Specifically, complex adaptive systems (CAS) are not amenable to reductionist models (and their master equations) because of emergent behaviour, approximation failures, not accounting for quantum compensatory mechanisms, ignoring path integrals, and the subentity problem. In addition, MQS must exist within the context of the classical world, where rapid decoherence and prolonged coherence are both needed. Nature has already demonstrated this for quantum subsystems such as photosynthesis and magnetoreception. Second, we perform a preliminary study that illustrates a top-down approach to potential MQS. In summary, reductionist arguments against MQS are not justifiable. It is more likely that they are not easily detectable in large intact classical systems or have been destroyed by reductionist experimental set-ups. This complex systems decoherence approach, using top-down investigations, is critical to paradigm shifts in MQS research both in biological and non-biological systems. PMID:29200743
Supporting Knowledge Transfer in IS Deployment Projects
NASA Astrophysics Data System (ADS)
Schönström, Mikael
To deploy new information systems is an expensive and complex task, and it seldom results in successful usage where the system adds strategic value to the firm (e.g. Sharma et al. 2003). It has been argued that innovation diffusion is a knowledge integration problem (Newell et al. 2000). Knowledge about business processes, deployment processes, information systems and technology is needed in a large-scale deployment of a corporate IS. These deployments can therefore to a large extent be argued to be a knowledge management (KM) problem. An effective deployment requires that knowledge about the system is effectively transferred to the target organization (Ko et al. 2005).
NASA Astrophysics Data System (ADS)
Aggarwal, Anil Kr.; Kumar, Sanjeev; Singh, Vikram
2017-03-01
The binary-state (i.e., success or failed) assumptions used in conventional reliability analysis are inappropriate for complex industrial systems due to lack of sufficient probabilistic information. For large complex systems, the uncertainty of each individual parameter enhances the uncertainty of the system reliability. In this paper, the concept of fuzzy reliability has been used for reliability analysis of the system, and the effect of coverage factor, failure and repair rates of subsystems on fuzzy availability for a fault-tolerant crystallization system of a sugar plant is analyzed. Mathematical modeling of the system is carried out using the mnemonic rule to derive the Chapman-Kolmogorov differential equations. These governing differential equations are solved with the fourth-order Runge-Kutta method.
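For the simplest case of a two-state (working/failed) unit with constant failure rate lambda and repair rate mu, the Chapman-Kolmogorov equations reduce to dP_up/dt = -lambda*P_up + mu*P_down and dP_down/dt = lambda*P_up - mu*P_down, and a classical fourth-order Runge-Kutta scheme like the one named above integrates them directly. The rates below are arbitrary illustrative values, not the sugar-plant data, and the sketch omits the coverage factor and fuzzification.

```python
def derivs(p, lam, mu):
    """Chapman-Kolmogorov equations for a two-state repairable unit."""
    p_up, p_down = p
    return (-lam * p_up + mu * p_down, lam * p_up - mu * p_down)

def rk4_step(p, h, lam, mu):
    """One classical fourth-order Runge-Kutta step of size h."""
    k1 = derivs(p, lam, mu)
    k2 = derivs(tuple(x + h / 2 * k for x, k in zip(p, k1)), lam, mu)
    k3 = derivs(tuple(x + h / 2 * k for x, k in zip(p, k2)), lam, mu)
    k4 = derivs(tuple(x + h * k for x, k in zip(p, k3)), lam, mu)
    return tuple(x + h / 6 * (a + 2 * b + 2 * c + d)
                 for x, a, b, c, d in zip(p, k1, k2, k3, k4))

lam, mu, h = 0.01, 0.5, 0.1   # failure rate, repair rate, step size (illustrative)
p = (1.0, 0.0)                # start in the working state
for _ in range(2000):         # integrate to t = 200, well past the transient
    p = rk4_step(p, h, lam, mu)
print(round(p[0], 4))  # long-run availability, approaching mu / (lam + mu)
```

The multi-subsystem sugar-plant model has the same structure with more states; the fuzzy-availability analysis then propagates interval-valued rates through this integration rather than the crisp values used here.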
2015-05-22
sensor networks for managing power levels of wireless networks; air and ground transportation systems for air traffic control and payload transport and... Keywords: network systems, large-scale systems, adaptive control, discontinuous systems. ...cover a broad spectrum of applications including cooperative control of unmanned air vehicles, autonomous underwater vehicles, distributed sensor...
Optically controlled phased-array antenna technology for space communication systems
NASA Technical Reports Server (NTRS)
Kunath, Richard R.; Bhasin, Kul B.
1988-01-01
Using MMICs in phased-array applications above 20 GHz requires complex RF and control signal distribution systems. Conventional waveguide, coaxial cable, and microstrip methods are undesirable due to their high weight, high loss, limited mechanical flexibility and large volume. An attractive alternative to these transmission media, for RF and control signal distribution in MMIC phased-array antennas, is optical fiber. Presented are potential system architectures and their associated characteristics. The status of high frequency opto-electronic components needed to realize the potential system architectures is also discussed. It is concluded that an optical fiber network will reduce weight and complexity, and increase reliability and performance, but may require higher power.
Enhancing metaproteomics-The value of models and defined environmental microbial systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herbst, Florian-Alexander; Lünsmann, Vanessa; Kjeldal, Henrik
Metaproteomics, the large-scale characterization of the entire protein complement of environmental microbiota at a given point in time, has provided new means to study complex microbial communities in order to unravel these black boxes. Some new technical challenges arose that were not an issue for classical proteome analytics and that can be tackled by the application of different model systems. Here, we review different current and future model systems for metaproteome analysis. Following a short introduction to microbial communities and metaproteomics, we introduce model systems for clinical and biotechnological research questions, including acid mine drainage, anaerobic digesters, and activated sludge. Model systems are useful to evaluate the challenges encountered within (but not limited to) metaproteomics, including species complexity and coverage, biomass availability, and reliable protein extraction. Moreover, the implementation of model systems can be considered a step forward in better understanding microbial community responses and the ecological functions of single member organisms. In the future, improvements are necessary to fully explore complex environmental systems by metaproteomics.
2016-01-21
Jatobá, Alessandro; de Carvalho, Paulo Victor R; da Cunha, Amauri Marques
2012-01-01
Work in organizations requires a minimum level of consensus on the understanding of the practices performed. When technological devices are adopted to support activities in complex environments, characterized by interdependence among a large number of variables, understanding how work is done not only takes on even greater importance but also becomes more difficult. This study therefore presents a method for modeling work in complex systems that improves knowledge about how activities are actually performed in settings where work does not simply consist of executing procedures. Uniting techniques of Cognitive Task Analysis with the concept of Work Process, the method seeks to provide a detailed and accurate view of how people perform their tasks, so that information systems can be applied to support work in organizations.
24 CFR 103.205 - Systemic processing.
Code of Federal Regulations, 2010 CFR
2010-04-01
... are pervasive or institutional in nature, or that the processing of the complaint will involve complex issues, novel questions of fact or law, or will affect a large number of persons, the Assistant Secretary...
Style, Sarah; Beard, B James; Harris-Fry, Helen; Sengupta, Aman; Jha, Sonali; Shrestha, Bhim P; Rai, Anjana; Paudel, Vikas; Thondoo, Meelan; Pulkki-Brannstrom, Anni-Maria; Skordis-Worrall, Jolene; Manandhar, Dharma S; Costello, Anthony; Saville, Naomi M
2017-01-01
The increasing availability and capabilities of mobile phones make them a feasible means of data collection. Electronic Data Capture (EDC) systems have been used widely for public health monitoring and surveillance activities, but documentation of their use in complicated research studies requiring multiple systems is limited. This paper shares our experiences of designing and implementing a complex multi-component EDC system for a community-based four-armed cluster-Randomised Controlled Trial in the rural plains of Nepal, to help other researchers planning to use EDC for complex studies in low-income settings. We designed and implemented three interrelated mobile phone data collection systems to enrol and follow-up pregnant women (trial participants), and to support the implementation of trial interventions (women's groups, food and cash transfers). 720 field staff used basic phones to send simple coded text messages, 539 women's group facilitators used Android smartphones with Open Data Kit Collect, and 112 Interviewers, Coordinators and Supervisors used smartphones with CommCare. Barcoded photo ID cards encoded with participant information were generated for each enrolled woman. Automated systems were developed to download, recode and merge data for nearly real-time access by researchers. The systems were successfully rolled out and used by 1371 staff. A total of 25,089 pregnant women were enrolled, and 17,839 follow-up forms completed. Women's group facilitators recorded 5717 women's groups and the distribution of 14,647 food and 13,482 cash transfers. Using EDC sped up data collection and processing, although time needed for programming and set-up delayed the study inception. EDC using three interlinked mobile data management systems (FrontlineSMS, ODK and CommCare) was a feasible and effective method of data capture in a complex large-scale trial in the plains of Nepal. 
Despite challenges including prolonged set-up times, the systems met multiple data collection needs for users with varying levels of literacy and experience.
Using systems thinking to support clinical system transformation.
Best, Allan; Berland, Alex; Herbert, Carol; Bitz, Jennifer; van Dijk, Marlies W; Krause, Christina; Cochrane, Douglas; Noel, Kevin; Marsden, Julian; McKeown, Shari; Millar, John
2016-05-16
Purpose - The British Columbia Ministry of Health's Clinical Care Management initiative was used as a case study to better understand large-scale change (LSC) within BC's health system. Using a complex system framework, the purpose of this paper is to examine mechanisms that enable and constrain the implementation of clinical guidelines across various clinical settings. Design/methodology/approach - Researchers applied a general model of complex adaptive systems plus two specific conceptual frameworks (realist evaluation and system dynamics mapping) to define and study enablers and constraints. Focus group sessions and interviews with clinicians, executives, managers and board members were validated through an online survey. Findings - The functional themes for managing large-scale clinical change included: creating a context to prepare clinicians for health system transformation initiatives; promoting shared clinical leadership; strengthening knowledge management, strategic communications and opportunities for networking; and clearing pathways through the complexity of a multilevel, dynamic system. Research limitations/implications - The action research methodology was designed to guide continuing improvement of implementation. A sample of initiatives was selected; it was not intended to compare and contrast facilitators and barriers across all initiatives and regions. Similarly, evaluating the results or process of guideline implementation was outside the scope; the methods were designed to enable conversations at multiple levels - policy, management and practice - about how to improve implementation. The study is best seen as a case study of LSC, offering a possible model for replication by others and a tool to shape further dialogue. 
Practical implications - Recommended action-oriented strategies included engaging local champions; supporting local adaptation for implementation of clinical guidelines; strengthening local teams to guide implementation; reducing change fatigue; ensuring adequate resources; providing consistent communication especially for front-line care providers; and supporting local teams to demonstrate the clinical value of the guidelines to their colleagues. Originality/value - Bringing a complex systems perspective to clinical guideline implementation resulted in a clear understanding of the challenges involved in LSC.
Cascade-based attacks on complex networks
NASA Astrophysics Data System (ADS)
Motter, Adilson E.; Lai, Ying-Cheng
2002-12-01
We live in a modern world supported by large, complex networks. Examples range from financial markets to communication and transportation systems. In many realistic situations the flow of physical quantities in the network, as characterized by the loads on nodes, is important. We show that for such networks, where loads can redistribute among the nodes, intentional attacks can lead to a cascade of overload failures, which can in turn cause the entire network, or a substantial part of it, to collapse. This is relevant for real-world networks that possess a highly heterogeneous distribution of loads, such as the Internet and power grids. We demonstrate that the heterogeneity of these networks makes them particularly vulnerable to attacks, in that a large-scale cascade may be triggered by disabling a single key node. This raises obvious concerns about the security of such systems.
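The overload mechanism can be illustrated with a toy load-redistribution cascade. This is a deliberately simplified stand-in for the betweenness-based load model in the paper; the graph, loads, and tolerance parameter are made up:

```python
# Each node carries a load; capacity = (1 + alpha) * initial load.
# Removing a node shifts its load evenly onto surviving neighbors;
# any neighbor pushed over capacity fails in turn.
def cascade(adj, load, alpha, attacked):
    cap = {n: (1 + alpha) * l for n, l in load.items()}
    load = dict(load)            # work on a copy
    failed = {attacked}
    frontier = [attacked]
    while frontier:
        nxt = []
        for n in frontier:
            alive = [m for m in adj[n] if m not in failed]
            if alive:
                share = load[n] / len(alive)
                for m in alive:
                    load[m] += share
            for m in alive:
                if load[m] > cap[m] and m not in failed:
                    failed.add(m)
                    nxt.append(m)
        frontier = nxt
    return failed

# A star network: hub 0 carries a large load, the leaves small ones.
adj = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
load = {0: 3.0, 1: 1.0, 2: 1.0, 3: 1.0}
print(len(cascade(adj, load, alpha=0.5, attacked=0)))  # → 4 (full collapse)
print(len(cascade(adj, load, alpha=0.5, attacked=1)))  # → 1 (only the leaf)
```

Attacking the heavily loaded hub collapses the entire star, while attacking a leaf fails only that leaf, mirroring the key-node vulnerability of heterogeneous networks described above.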
Wickstrom, Lauren; He, Peng; Gallicchio, Emilio; Levy, Ronald M.
2013-01-01
Host-guest inclusion complexes are useful models for understanding the structural and energetic aspects of molecular recognition. Due to their small size relative to much larger protein-ligand complexes, converged results can be obtained rapidly for these systems, offering the opportunity to study fundamental aspects of the thermodynamics of binding more reliably. In this work, we have performed a large-scale binding affinity survey of 57 β-cyclodextrin (CD) host-guest systems using the binding energy distribution analysis method (BEDAM) with implicit solvation (OPLS-AA/AGBNP2). Converged estimates of the standard binding free energies are obtained for these systems by employing techniques such as parallel Hamiltonian replica exchange molecular dynamics, conformational reservoirs, and multistate free energy estimators. Good agreement with experimental measurements is obtained in terms of both numerical accuracy and affinity rankings. Overall, average effective binding energies reproduce affinity rank ordering better than the calculated binding affinities, even though the calculated binding free energies, which account for effects such as conformational strain and entropy loss upon binding, give lower root mean square errors when compared to measurements. Interestingly, we find that binding free energies are superior rank-order predictors for a large subset containing the most flexible guests. The results indicate that, while challenging, accurate modeling of reorganization effects can lead to ligand design models of superior predictive power for rank ordering relative to models based only on ligand-receptor interaction energies. PMID:25147485
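Rank-order agreement of the kind discussed above is commonly quantified with Kendall's tau. A minimal brute-force sketch, using made-up affinities rather than the paper's data:

```python
# Kendall's tau: fraction of concordant minus discordant pairs.
# +1 means identical ranking, -1 means fully reversed.
def kendall_tau(x, y):
    n = len(x)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

# Illustrative binding affinities (kcal/mol), NOT measured values.
experiment = [-5.3, -4.1, -3.8, -2.9, -2.2]
predicted  = [-5.0, -3.5, -4.0, -2.5, -2.0]
print(kendall_tau(experiment, predicted))  # → 0.8 (one swapped pair)
```

Comparing the tau of interaction-energy predictions against that of full binding free energies is one way to make the rank-ordering claim above concrete.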
Challenges and the state of the technology for printed sensor arrays for structural monitoring
NASA Astrophysics Data System (ADS)
Joshi, Shiv; Bland, Scott; DeMott, Robert; Anderson, Nickolas; Jursich, Gregory
2017-04-01
Printed sensor arrays are attractive for reliable, low-cost, and large-area mapping of structural systems. These sensor arrays can be printed on flexible substrates or directly on monitored structural parts. This technology is sought for continuous or on-demand real-time diagnosis and prognosis of complex structural components. In the past decade, many innovative technologies and functional materials have been explored to develop printed electronics and sensors. An all-printed strain sensor array, for example, is a low-cost, flexible, and lightweight system that provides a reliable method for monitoring the state of aircraft structural parts. Among all-printing techniques, screen and inkjet printing methods are well suited for smaller-scale prototyping and have drawn much interest due to the maturity of printing procedures and the availability of compatible inks and substrates. Screen printing relies on a mask (screen) to transfer a pattern onto a substrate. It is widely used because of its high printing speed, large selection of ink/substrate materials, and capability of making complex multilayer devices. The complexity of collecting signals from a large number of sensors over a large area necessitates signal multiplexing electronics that must be printed on a flexible substrate or directly on the structure. As a result, these components are subjected to the same deformation, temperature, and other parameters for which the sensor arrays are designed. The characteristics of these electronic components, such as transistors, are affected by deformation and other environmental parameters, which can lead to erroneous sensed parameters. The manufacturing and functional challenges of the technology of printed sensor array systems for structural state monitoring are the focus of this presentation. Specific examples of strain sensor arrays will be presented to highlight the technical challenges.
Detwiler, R.L.; Mehl, S.; Rajaram, H.; Cheung, W.W.
2002-01-01
Numerical solution of large-scale ground water flow and transport problems is often constrained by the convergence behavior of the iterative solvers used to solve the resulting systems of equations. We demonstrate the ability of an algebraic multigrid algorithm (AMG) to efficiently solve the large, sparse systems of equations that result from computational models of ground water flow and transport in large and complex domains. Unlike geometric multigrid methods, this algorithm is applicable to problems in complex flow geometries, such as those encountered in pore-scale modeling of two-phase flow and transport. We integrated AMG into MODFLOW 2000 to compare two- and three-dimensional flow simulations using AMG to simulations using PCG2, a preconditioned conjugate gradient solver that uses the modified incomplete Cholesky preconditioner and is included with MODFLOW 2000. CPU times required for convergence with AMG were up to 140 times faster than those for PCG2. The cost of this increased speed was up to a nine-fold increase in required random access memory (RAM) for the three-dimensional problems and up to a four-fold increase in required RAM for the two-dimensional problems. We also compared two-dimensional numerical simulations of steady-state transport using AMG and the generalized minimum residual method with an incomplete LU-decomposition preconditioner. For these transport simulations, AMG yielded increased speeds of up to 17 times with only a 20% increase in required RAM. The ability of AMG to solve flow and transport problems in large, complex flow systems and its ready availability make it an ideal solver for use in both field-scale and pore-scale modeling.
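The flow models described above produce large, sparse, symmetric positive-definite linear systems. As a small illustration (not MODFLOW's PCG2 or an actual AMG cycle, which would come from a library such as pyamg), here is a matrix-free conjugate-gradient solve of a 5-point discretization of steady flow:

```python
import numpy as np

# Toy 2-D Poisson problem on an n x n interior grid with zero-head
# boundaries, standing in for a steady ground water flow system.
n = 20
N = n * n

def apply_A(u):
    """Matrix-free 5-point Laplacian stencil (SPD operator)."""
    v = u.reshape(n, n)
    out = 4.0 * v
    out[1:, :] -= v[:-1, :]
    out[:-1, :] -= v[1:, :]
    out[:, 1:] -= v[:, :-1]
    out[:, :-1] -= v[:, 1:]
    return out.ravel()

def cg(b, tol=1e-10, maxiter=500):
    """Unpreconditioned conjugate gradients on apply_A."""
    x = np.zeros_like(b)
    r = b - apply_A(x)
    p = r.copy()
    rr = r @ r
    for _ in range(maxiter):
        Ap = apply_A(p)
        alpha = rr / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rr_new = r @ r
        if np.sqrt(rr_new) < tol:
            break
        p = r + (rr_new / rr) * p
        rr = rr_new
    return x

b = np.ones(N)                 # uniform recharge term (illustrative)
x = cg(b)
print(np.linalg.norm(apply_A(x) - b) < 1e-8)  # → True
```

Krylov iteration counts grow with grid resolution for this unpreconditioned solve; the appeal of AMG reported above is that its convergence is nearly independent of problem size, at the cost of extra memory.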
Astakhov, Vadim
2009-01-01
Interest in simulation of large-scale metabolic networks, species development, and the genesis of various diseases requires new simulation techniques to accommodate the high complexity of realistic biological networks. Information geometry and topological formalisms are proposed to analyze information processes. We analyze the complexity of large-scale biological networks as well as transitions of system functionality due to modifications in the system architecture, system environment, and system components. The dynamic core model is developed; the term dynamic core is used to define a set of causally related network functions. Delocalization of the dynamic core model provides a mathematical formalism to analyze the migration of specific functions in biosystems which undergo structure transitions induced by the environment; the term delocalization describes these processes of migration. We constructed a holographic model with self-poetic dynamic cores which preserves functional properties under those transitions. Topological constraints such as Ricci flow and Pfaff dimension were found for the statistical manifolds which represent biological networks. These constraints can provide insight into the processes of degeneration and recovery which take place in large-scale networks. We suggest that therapies able to effectively implement the estimated constraints will successfully adjust biological systems and recover altered functionality. We also mathematically formulate the hypothesis that there is a direct consistency between biological and chemical evolution. Any set of causal relations within a biological network has its dual reimplementation in the chemistry of the system environment.
Massively parallel support for a case-based planning system
NASA Technical Reports Server (NTRS)
Kettler, Brian P.; Hendler, James A.; Anderson, William A.
1993-01-01
Case-based planning (CBP), a kind of case-based reasoning, is a technique in which previously generated plans (cases) are stored in memory and can be reused to solve similar planning problems in the future. CBP can save considerable time over generative planning, in which a new plan is produced from scratch. CBP thus offers a potential (heuristic) mechanism for handling intractable problems. One drawback of CBP systems has been the need for a highly structured memory to reduce retrieval times. This approach requires significant domain engineering and complex memory indexing schemes to make these planners efficient. In contrast, our CBP system, CaPER, uses a massively parallel frame-based AI language (PARKA) and can do extremely fast retrieval of complex cases from a large, unindexed memory. The ability to do fast, frequent retrievals has many advantages: indexing is unnecessary; very large case bases can be used; memory can be probed in numerous alternate ways; and queries can be made at several levels, allowing more specific retrieval of stored plans that better fit the target problem with less adaptation. In this paper we describe CaPER's case retrieval techniques and some experimental results showing its good performance, even on large case bases.
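The retrieval step can be sketched as a brute-force similarity scan over a flat, unindexed case base. The case structures and feature names below are our own illustrations, not CaPER's PARKA frames:

```python
# A flat case base: each case pairs a feature description with a
# previously generated plan. No indexing - retrieval scores every case.
cases = [
    {"features": {"goal": "deliver", "vehicle": "truck", "terrain": "road"},
     "plan": ["load", "drive", "unload"]},
    {"features": {"goal": "deliver", "vehicle": "drone", "terrain": "air"},
     "plan": ["load", "fly", "drop"]},
    {"features": {"goal": "survey", "vehicle": "drone", "terrain": "air"},
     "plan": ["launch", "scan", "return"]},
]

def retrieve(target, case_base):
    """Return the stored case whose features best overlap the target."""
    def score(case):
        f = case["features"]
        return sum(1 for k, v in target.items() if f.get(k) == v)
    return max(case_base, key=score)

best = retrieve({"goal": "survey", "vehicle": "drone"}, cases)
print(best["plan"])  # → ['launch', 'scan', 'return']
```

In CaPER the scoring of every case happens in parallel, which is what makes frequent retrieval over a large unindexed memory affordable; the retrieved plan would then be adapted to the target problem.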
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jang, Seogjoo; Hoyer, Stephan; Fleming, Graham
2014-10-31
A generalized master equation (GME) governing quantum evolution of modular exciton density (MED) is derived for large scale light harvesting systems composed of weakly interacting modules of multiple chromophores. The GME-MED offers a practical framework to incorporate real time coherent quantum dynamics calculations of small length scales into dynamics over large length scales, and also provides a non-Markovian generalization and rigorous derivation of the Pauli master equation employing multichromophoric Förster resonance energy transfer rates. A test of the GME-MED for four sites of the Fenna-Matthews-Olson complex demonstrates how coherent dynamics of excitonic populations over coupled chromophores can be accurately described by transitions between subgroups (modules) of delocalized excitons. Application of the GME-MED to the exciton dynamics between a pair of light harvesting complexes in purple bacteria demonstrates its promise as a computationally efficient tool to investigate large scale exciton dynamics in complex environments.
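The modular-population picture can be illustrated with a classical Pauli-type master equation dP/dt = K P for three modules. The transfer rates here are assumed values, stand-ins for the multichromophoric FRET rates computed in the paper:

```python
import numpy as np

# Rate matrix K for populations hopping between three modules.
# Columns sum to zero, so total population is conserved.
k12, k21 = 1.0, 0.5    # module 1 <-> 2 rates (1/ps, assumed)
k23, k32 = 0.8, 0.2    # module 2 <-> 3 rates (1/ps, assumed)
K = np.array([
    [-k12,          k21,  0.0],
    [ k12, -(k21 + k23),  k32],
    [ 0.0,          k23, -k32],
])

P = np.array([1.0, 0.0, 0.0])   # excitation starts on module 1
dt = 0.001
for _ in range(20000):           # 20 ps of dynamics
    P = P + dt * (K @ P)         # forward-Euler step

print(round(P.sum(), 6))         # population conserved → 1.0
```

At long times the populations settle into the null space of K; the GME-MED generalizes this picture with non-Markovian memory and coherent intra-module dynamics.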
Human factors in air traffic control: problems at the interfaces.
Shouksmith, George
2003-10-01
The triangular ISIS model for describing the operation of human factors in complex sociotechnical organisations or systems is applied in this research to a large international air traffic control system. A large sample of senior Air Traffic Controllers was randomly assigned to small focus discussion groups, whose task was to identify problems occurring at the interfaces of the three major human factor components: individual, system impacts, and social. From these discussions emerged a number of significant interface problems that could adversely affect the functioning of the Air Traffic Control System. The majority occurred at the Individual-System Impact and Individual-Social interfaces and involved a perceived need for further interface-centered training.
Artificial intelligence applied to process signal analysis
NASA Technical Reports Server (NTRS)
Corsberg, Dan
1988-01-01
Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.
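The filtering idea can be sketched as a small rule base that suppresses alarms already explained by an active root cause. The alarm names and causal rules below are illustrative assumptions, not any actual plant knowledge base:

```python
# Causal knowledge base: each root-cause alarm maps to the alarms
# it is known to trigger downstream (illustrative, made-up rules).
CAUSES = {
    "coolant_pump_trip": ["coolant_low_flow", "core_temp_high"],
    "coolant_low_flow": ["core_temp_high"],
}

def filter_alarms(active):
    """Drop alarms explained by another active (root-cause) alarm."""
    explained = set()
    for alarm in active:
        explained.update(CAUSES.get(alarm, []))
    return [a for a in active if a not in explained]

raw = ["coolant_pump_trip", "coolant_low_flow", "core_temp_high"]
print(filter_alarms(raw))  # → ['coolant_pump_trip']
```

Instead of an avalanche of three alarms, the operator sees only the root cause, which is the disturbance that actually needs a response.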
NASA Astrophysics Data System (ADS)
Andrianov, A. V.
2018-04-01
We have developed an optical gating system for continuously monitoring a complex-shaped periodic optical signal with picosecond resolution in a nanosecond time window. The system uses an all-fibre optical gate in the form of a nonlinear loop mirror together with a passively mode-locked femtosecond laser. Its distinctive features are the ability to characterize signals with a very large spectral bandwidth, the possibility of using a gating pulse source whose wavelength falls within the band of the signal under study, and an all-fibre design built from standard fibres and telecom components.
NASA Astrophysics Data System (ADS)
Loppini, Alessandro
2018-03-01
Complex network theory represents a comprehensive mathematical framework to investigate biological systems, ranging from sub-cellular and cellular scales up to large-scale networks describing species interactions and ecological systems. In their exhaustive and comprehensive work [1], Gosak et al. discuss several scenarios in which the network approach was able to uncover general properties and underlying mechanisms of cells organization and regulation, tissue functions and cell/tissue failure in pathology, by the study of chemical reaction networks, structural networks and functional connectivities.
Development of a large scale Chimera grid system for the Space Shuttle Launch Vehicle
NASA Technical Reports Server (NTRS)
Pearce, Daniel G.; Stanley, Scott A.; Martin, Fred W., Jr.; Gomez, Ray J.; Le Beau, Gerald J.; Buning, Pieter G.; Chan, William M.; Chiu, Ing-Tsau; Wulf, Armin; Akdag, Vedat
1993-01-01
The application of CFD techniques to large problems has dictated the need for large team efforts. This paper offers an opportunity to examine the motivations, goals, needs, problems, as well as the methods, tools, and constraints that defined NASA's development of a 111 grid/16 million point grid system model for the Space Shuttle Launch Vehicle. The Chimera approach used for domain decomposition encouraged separation of the complex geometry into several major components, each of which was modeled by an autonomous team. ICEM-CFD, a CAD based grid generation package, simplified the geometry and grid topology definition by providing mature CAD tools and patch independent meshing. The resulting grid system has, on average, a four inch resolution along the surface.
Two-Photon Rabi Splitting in a Coupled System of a Nanocavity and Exciton Complexes.
Qian, Chenjiang; Wu, Shiyao; Song, Feilong; Peng, Kai; Xie, Xin; Yang, Jingnan; Xiao, Shan; Steer, Matthew J; Thayne, Iain G; Tang, Chengchun; Zuo, Zhanchun; Jin, Kuijuan; Gu, Changzhi; Xu, Xiulai
2018-05-25
Two-photon Rabi splitting in a cavity-dot system provides a basis for multiqubit coherent control in a quantum photonic network. Here we report on two-photon Rabi splitting in a strongly coupled cavity-dot system. The quantum dot was grown intentionally large in size for a large oscillation strength and small biexciton binding energy. Both exciton and biexciton transitions couple to a high-quality-factor photonic crystal cavity with large coupling strengths over 130 μeV. Furthermore, the small binding energy enables the cavity to simultaneously couple with two exciton states. Thereby, two-photon Rabi splitting between the biexciton and cavity is achieved, which can be well reproduced by theoretical calculations with quantum master equations.
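On resonance, the single-photon polariton splitting produced by such a coupling follows from diagonalizing a 2x2 coupled-oscillator Hamiltonian. This sketch uses the ~130 μeV coupling quoted above with an assumed transition energy, and omits detuning and the biexciton ladder:

```python
import numpy as np

# Resonant exciton-cavity coupling: g matches the >130 ueV value
# quoted above; the 1.30 eV transition energy is an assumed value.
g = 130e-6                      # exciton-cavity coupling (eV)
E_x = E_c = 1.30                # exciton and cavity energies (eV, assumed)

H = np.array([[E_x,   g],
              [  g, E_c]])
vals = np.linalg.eigvalsh(H)    # polariton branches, ascending
splitting = vals[1] - vals[0]
print(round(splitting / g, 6))  # on resonance the splitting is exactly 2g
```

The two-photon splitting between the biexciton and the cavity reported in the paper requires the full quantum master equation treatment, but the 2g scaling of the resonant anticrossing is the same basic mechanism.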
Applying Principles from Complex Systems to Studying the Efficacy of CAM Therapies
Nahin, Richard L.; Calabrese, Carlo; Folkman, Susan; Kimbrough, Elizabeth; Shoham, Jacob; Haramati, Aviad
2010-01-01
Abstract In October 2007, a National Center for Complementary and Alternative Medicine (NCCAM)–sponsored workshop, entitled “Applying Principles from Complex Systems to Studying the Efficacy of CAM Therapies,” was held at Georgetown University in Washington, DC. Over a 2-day period, the workshop engaged a small group of experts from the fields of complementary and alternative medicine (CAM) research and complexity science to discuss and examine ways in which complexity science can be applied to CAM research. After didactic presentations and small-group discussions, a number of salient themes and ideas emerged. This article describes the workshop program and summarizes these emergent ideas, which are divided into five broad categories: (1) introduction to complexity; (2) challenges to CAM research; (3) applications of complexity science to CAM; (4) CAM as a model of complexity applied to medicine; and (5) future directions. It also discusses possible benefits and challenges associated with applying complexity science to CAM research. By providing an introductory framework for this collaboration and exchange, it is hoped that this article may stimulate further inquiry into this largely unexplored area of research. PMID:20715978
The multi-replication protein A (RPA) system--a new perspective.
Sakaguchi, Kengo; Ishibashi, Toyotaka; Uchiyama, Yukinobu; Iwabata, Kazuki
2009-02-01
The replication protein A (RPA) complex has been shown, using both in vivo and in vitro approaches, to be required for most aspects of eukaryotic DNA metabolism: replication, repair, telomere maintenance, and homologous recombination. Here, we review recent data concerning the function and biological importance of the multi-RPA complex. Distinct complexes of RPA are found across the biological kingdoms, although for a long time only one type of RPA complex was believed to be present in eukaryotes. Each complex probably serves a different role. In higher plants, three distinct large and medium subunits are present, but only one species of the smallest subunit. Each of these protein subunits forms stable complexes with its respective partners, and the resulting complexes are paralogs of one another. Humans possess two paralogs and one analog of RPA. The multi-RPA system can be regarded as universal in eukaryotes. Among eukaryotic kingdoms, paralogs, orthologs, analogs, and heterologs of many DNA synthesis-related factors, including RPA, are ubiquitous, and convergent evolution seems to be common in these processes. Using recent findings, we review the composition and biological functions of RPA complexes.
Science information systems: Visualization
NASA Technical Reports Server (NTRS)
Wall, Ray J.
1991-01-01
Future programs in earth science, planetary science, and astrophysics will involve complex instruments that produce data at unprecedented rates and volumes. Current methods for data display, exploration, and discovery are inadequate. Visualization technology offers a means for the user to comprehend, explore, and examine complex data sets. The goal of this program is to increase the effectiveness and efficiency of scientists in extracting scientific information from large volumes of instrument data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-09-25
The Megatux platform enables the emulation of large scale (multi-million node) distributed systems. In particular, it allows for the emulation of large-scale networks interconnecting a very large number of emulated computer systems. It does this by leveraging virtualization and associated technologies to allow hundreds of virtual computers to be hosted on a single moderately sized server or workstation. Virtualization technology provided by modern processors allows for multiple guest OSs to run at the same time, sharing the hardware resources. The Megatux platform can be deployed on a single PC, a small cluster of a few boxes or a large cluster of computers. With a modest cluster, the Megatux platform can emulate complex organizational networks. By using virtualization, we emulate the hardware, but run actual software enabling large scale without sacrificing fidelity.
Examining Food Risk in the Large using a Complex, Networked System-of-systems Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ambrosiano, John; Newkirk, Ryan; Mc Donald, Mark P
2010-12-03
The food production infrastructure is a highly complex system of systems. Characterizing the risks of intentional contamination in multi-ingredient manufactured foods is extremely challenging because the risks depend on the vulnerabilities of food processing facilities and on the intricacies of the supply-distribution networks that link them. A pure engineering approach to modeling the system is impractical because of the overall system complexity and paucity of data. A methodology is needed to assess food contamination risk 'in the large', based on current, high-level information about manufacturing facilities, commodities and markets, that will indicate which food categories are most at risk of intentional contamination and warrant deeper analysis. The approach begins by decomposing the system for producing a multi-ingredient food into instances of two subsystem archetypes: (1) the relevant manufacturing and processing facilities, and (2) the networked commodity flows that link them to each other and consumers. Ingredient manufacturing subsystems are modeled as generic systems dynamics models with distributions of key parameters that span the configurations of real facilities. Networks representing the distribution systems are synthesized from general information about food commodities. This is done in a series of steps. First, probability networks representing the aggregated flows of food from manufacturers to wholesalers, retailers, other manufacturers, and direct consumers are inferred from high-level approximate information. This is followed by disaggregation of the general flows into flows connecting 'large' and 'small' categories of manufacturers, wholesalers, retailers, and consumers. Optimization methods are then used to determine the most likely network flows consistent with given data. Vulnerability can be assessed for a potential contamination point using a modified CARVER + Shock model.
Once the facility and commodity flow models are instantiated, a risk consequence analysis can be performed by injecting contaminant at chosen points in the system and propagating the event through the overarching system to arrive at morbidity and mortality figures. A generic chocolate snack cake model, consisting of fluid milk, liquid eggs, and cocoa, is described as an intended proof of concept for multi-ingredient food systems. We aim for an eventual tool that can be used directly by policy makers and planners.
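The flow-inference step this entry describes (finding flows consistent with aggregate data) can be illustrated with one standard technique, iterative proportional fitting (IPF/RAS): scale a seed matrix until its row and column sums match known totals. This is a minimal sketch, not the study's method; the seed matrix, producer totals, and market totals are invented for illustration.

```python
import numpy as np

# Iterative proportional fitting (IPF/RAS): alternately rescale rows
# and columns of a seed matrix so its marginals match given totals.
# All numbers here are hypothetical, not data from the study.
def ipf(seed, row_totals, col_totals, iters=100):
    M = seed.astype(float).copy()
    for _ in range(iters):
        M *= (row_totals / M.sum(axis=1))[:, None]   # match row sums
        M *= (col_totals / M.sum(axis=0))[None, :]   # match column sums
    return M

seed = np.ones((2, 3))                      # uninformative prior on flows
row_totals = np.array([60.0, 40.0])         # e.g. output of two producers
col_totals = np.array([50.0, 30.0, 20.0])   # demand of three markets
flows = ipf(seed, row_totals, col_totals)
print(np.round(flows, 1))   # [[30. 18. 12.], [20. 12.  8.]]
```

With a uniform seed and consistent totals, IPF converges to the product form `row_i * col_j / total`, the maximum-entropy flow matrix with those marginals.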
Complexity and the Limits of Revolution: What Will Happen to the Arab Spring?
NASA Astrophysics Data System (ADS)
Gard-Murray, Alexander S.; Bar-Yam, Yaneer
The recent social unrest across the Middle East and North Africa has deposed dictators who had ruled for decades. While the events have been hailed as an "Arab Spring" by those who hope that repressive autocracies will be replaced by democracies, what sort of regimes will eventually emerge from the crisis remains far from certain. Here we provide a complex systems framework, validated by historical precedent, to help answer this question. We describe the dynamics of governmental change as an evolutionary process similar to biological evolution, in which complex organizations gradually arise by replication, variation, and competitive selection. Different kinds of governments, however, have differing levels of complexity. Democracies must be more systemically complex than autocracies because of their need to incorporate large numbers of people in decision-making. This difference has important implications for the relative robustness of democratic and autocratic governments after revolutions. Revolutions may disrupt existing evolved complexity, limiting the potential for building more complex structures quickly. Insofar as systemic complexity is reduced by revolution, democracy is harder to create in the wake of unrest than autocracy. Applying this analysis to the Middle East and North Africa, we infer that in the absence of stable institutions or external assistance, new governments are in danger of facing increasingly insurmountable challenges and reverting to autocracy.
Design and Implementation of a Metadata-rich File System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ames, S; Gokhale, M B; Maltzahn, C
2010-01-19
Despite continual improvements in the performance and reliability of large scale file systems, the management of user-defined file system metadata has changed little in the past decade. The mismatch between the size and complexity of large scale data stores and their ability to organize and query their metadata has led to a de facto standard in which raw data is stored in traditional file systems, while related, application-specific metadata is stored in relational databases. This separation of data and semantic metadata requires considerable effort to maintain consistency and can result in complex, slow, and inflexible system operation. To address these problems, we have developed the Quasar File System (QFS), a metadata-rich file system in which files, user-defined attributes, and file relationships are all first class objects. In contrast to hierarchical file systems and relational databases, QFS defines a graph data model composed of files and their relationships. QFS incorporates Quasar, an XPATH-extended query language for searching the file system. Results from our QFS prototype show the effectiveness of this approach. Compared to the de facto standard, the QFS prototype shows superior ingest performance and comparable query performance on user metadata-intensive operations and superior performance on normal file metadata operations.
Scaling laws of strategic behavior and size heterogeneity in agent dynamics
NASA Astrophysics Data System (ADS)
Vaglica, Gabriella; Lillo, Fabrizio; Moro, Esteban; Mantegna, Rosario N.
2008-03-01
We consider the financial market as a model system and study empirically how agents strategically adjust the properties of large orders in order to meet their preference and minimize their impact. We quantify this strategic behavior by detecting scaling relations between the variables characterizing the trading activity of different institutions. We also observe power-law distributions in the investment time horizon, in the number of transactions needed to execute a large order, and in the traded value exchanged by large institutions, and we show that heterogeneity of agents is a key ingredient for the emergence of some aggregate properties characterizing this complex system.
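A standard way to quantify power-law distributions of the kind this entry reports is a maximum-likelihood (Hill-type) tail-exponent estimate. The sketch below is illustrative only: the synthetic Pareto data, the exponent 2.5, and the cutoff `xmin` are assumptions, not values from the study.

```python
import numpy as np

# Hill / maximum-likelihood estimator for a power-law tail exponent,
# demonstrated on synthetic Pareto draws (invented data, not the
# trading data analyzed in the entry above).
def hill_estimator(x, xmin):
    """MLE of alpha for p(x) ~ x**(-alpha), restricted to x >= xmin."""
    tail = np.asarray(x, dtype=float)
    tail = tail[tail >= xmin]
    return 1.0 + len(tail) / np.sum(np.log(tail / xmin))

rng = np.random.default_rng(42)
alpha_true, xmin = 2.5, 1.0
u = rng.uniform(size=100_000)
samples = xmin * u ** (-1.0 / (alpha_true - 1.0))  # inverse-CDF Pareto draws
print(round(hill_estimator(samples, xmin), 2))     # recovers ~2.5
```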
ERIC Educational Resources Information Center
Pulkki, Jutta Maarit; Rissanen, Pekka; Raitanen, Jani A.; Viitanen, Elina A.
2011-01-01
This study focuses on a large set of rehabilitation services used between 2004 and 2005 in one hospital district area in Finland. The rehabilitation system consists of several subsystems. This complex system is suggested to produce arbitrary rehabilitation services. Despite criticisms of the system over the decades, no attempts have been…
ERIC Educational Resources Information Center
Johnson, LeAnne D.
2017-01-01
Bringing effective practices to scale across large systems requires attending to how information and belief systems come together in decisions to adopt, implement, and sustain those practices. Statewide scaling of the Pyramid Model, a framework for positive behavior intervention and support, across different types of early childhood programs…
Bernard R. Parresol; Joe H. Scott; Anne Andreu; Susan Prichard; Laurie Kurth
2012-01-01
Currently geospatial fire behavior analyses are performed with an array of fire behavior modeling systems such as FARSITE, FlamMap, and the Large Fire Simulation System. These systems currently require standard or customized surface fire behavior fuel models as inputs that are often assigned through remote sensing information. The ability to handle hundreds or...
Big Data Analysis of Manufacturing Processes
NASA Astrophysics Data System (ADS)
Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert
2015-11-01
The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results.
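The anomaly-detection idea in this entry, learning normal behavior from observation and flagging deviations, can be sketched with a deliberately simple detector. The rolling z-score rule, the synthetic signal, and the threshold below are assumptions for illustration; the assistance system in the paper uses richer learned process models.

```python
import numpy as np

# Toy anomaly detector: learn "normal" as a rolling mean and standard
# deviation, then flag points that deviate by more than a threshold.
# Signal, window, and threshold are invented for this sketch.
def rolling_zscore_anomalies(x, window=50, threshold=4.0):
    x = np.asarray(x, dtype=float)
    flags = np.zeros(len(x), dtype=bool)
    for i in range(window, len(x)):
        ref = x[i - window:i]
        mu, sigma = ref.mean(), ref.std()
        if sigma > 0 and abs(x[i] - mu) > threshold * sigma:
            flags[i] = True
    return flags

rng = np.random.default_rng(3)
signal = rng.standard_normal(500)
signal[300] += 10.0                    # inject a fault-like spike
flags = rolling_zscore_anomalies(signal)
print(int(flags[300]))                 # the spike is flagged -> 1
```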
NASA Technical Reports Server (NTRS)
Meyer, G.; Cicolani, L.
1981-01-01
A practical method for the design of automatic flight control systems for aircraft with complex characteristics and operational requirements, such as the powered lift STOL and V/STOL configurations, is presented. The method is effective for a large class of dynamic systems requiring multi-axis control which have highly coupled nonlinearities, redundant controls, and complex multidimensional operational envelopes. It exploits the concept of inverse dynamic systems, and an algorithm for the construction of inverses is given. A hierarchic structure for the total control logic with inverses is presented. The method is illustrated with an application to the Augmentor Wing Jet STOL Research Aircraft equipped with a digital flight control system. Results of flight evaluation of the control concept on this aircraft are presented.
The Stochastic Multi-strain Dengue Model: Analysis of the Dynamics
NASA Astrophysics Data System (ADS)
Aguiar, Maíra; Stollenwerk, Nico; Kooi, Bob W.
2011-09-01
Dengue dynamics is well known to be particularly complex, with large fluctuations of disease incidence. An epidemic multi-strain model motivated by dengue fever epidemiology shows deterministic chaos in wide parameter regions. The addition of seasonal forcing, mimicking the vectorial dynamics, and of a low import of infected individuals, which is realistic for infectious disease epidemics, yields complex dynamics and qualitatively good agreement between empirical DHF monitoring data and the model simulation. The addition of noise can explain the fluctuations observed in the empirical data, and for large enough population size the stochastic system can be well described by the deterministic skeleton.
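Two of the ingredients this entry names, seasonal forcing of transmission and a small import of infected individuals, can be sketched in a deliberately reduced form. This is a single-strain SIR caricature, NOT the paper's multi-strain model, and every parameter value is an assumption chosen only to make those two effects visible.

```python
import math

# Single-strain SIR with a seasonally forced transmission rate and a
# constant import term, integrated with a plain Euler step. All
# parameters are illustrative assumptions, not the paper's values.
def simulate(days=3650, dt=0.1, beta0=0.5, eta=0.2, gamma=0.1, rho=1e-6):
    S, I = 0.99, 0.01                 # susceptible / infected fractions
    traj, t = [], 0.0
    for _ in range(int(days / dt)):
        beta = beta0 * (1.0 + eta * math.cos(2.0 * math.pi * t / 365.0))
        new_inf = beta * S * (I + rho)   # rho: imported infections
        S += -new_inf * dt
        I += (new_inf - gamma * I) * dt  # recovered fraction is implicit
        t += dt
        traj.append(I)
    return traj

traj = simulate()
print(max(traj))   # epidemic peak; the import keeps I positive afterwards
```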
Cognitive engineering models in space systems
NASA Technical Reports Server (NTRS)
Mitchell, Christine M.
1992-01-01
NASA space systems, including mission operations on the ground and in space, are complex, dynamic, predominantly automated systems in which the human operator is a supervisory controller. The human operator monitors and fine-tunes computer-based control systems and is responsible for ensuring safe and efficient system operation. In such systems, the potential consequences of human mistakes and errors may be very large, even when the probability of such events is low. Thus, models of cognitive functions in complex systems are needed to describe human performance and form the theoretical basis of operator workstation design, including displays, controls, and decision support aids. The operator function model represents normative operator behavior: expected operator activities given the current system state. The extension of the theoretical structure of the operator function model and its application to NASA Johnson mission operations and space station applications is discussed.
2013-10-23
The channel shown here is part of a large system of depressions located on the eastern side of the Elysium Mons volcanic complex. The depression in this image from NASA's 2001 Mars Odyssey spacecraft is located just south of Albor Tholus.
Large-N-approximated field theory for multipartite entanglement
NASA Astrophysics Data System (ADS)
Facchi, P.; Florio, G.; Parisi, G.; Pascazio, S.; Scardicchio, A.
2015-12-01
We try to characterize the statistics of multipartite entanglement of the random states of an n-qubit system. Unable to solve the problem exactly we generalize it, replacing complex numbers with real vectors with Nc components (the original problem is recovered for Nc=2). Studying the leading diagrams in the large-Nc approximation, we unearth the presence of a phase transition and, in an explicit example, show that the so-called entanglement frustration disappears in the large-Nc limit.
Geometric quantification of features in large flow fields.
Kendall, Wesley; Huang, Jian; Peterka, Tom
2012-01-01
Interactive exploration of flow features in large-scale 3D unsteady-flow data is one of the most challenging visualization problems today. To comprehensively explore the complex feature spaces in these datasets, a proposed system employs a scalable framework for investigating a multitude of characteristics from traced field lines. This capability supports the examination of various neighborhood-based geometric attributes in concert with other scalar quantities. Such an analysis wasn't previously possible because of the large computational overhead and I/O requirements. The system integrates visual analytics methods by letting users procedurally and interactively describe and extract high-level flow features. An exploration of various phenomena in a large global ocean-modeling simulation demonstrates the approach's generality and expressiveness as well as its efficacy.
Considering Complex Objectives and Scarce Resources in Information Systems' Analysis.
ERIC Educational Resources Information Center
Crowther, Warren
The low efficacy of many of the library and large-scale information systems that have been implemented in the developing countries has been disappointing, and their appropriateness is often questioned in the governmental and educational institutions of more industrialized countries beset by budget-crunching and a very dynamic transformation of…
Q&A: The Basics of California's School Finance System
ERIC Educational Resources Information Center
EdSource, 2006
2006-01-01
In a state as large and complex as California, education financing can become as complicated as rocket science. This two-page Q&A provides a brief, easy-to-understand explanation of California's school finance system and introduces the issues of its adequacy and equity. A list of resources providing additional information is provided.
The adaption and use of research codes for performance assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liebetrau, A.M.
1987-05-01
Models of real-world phenomena are developed for many reasons. The models are usually, if not always, implemented in the form of a computer code. The characteristics of a code are determined largely by its intended use. Realizations or implementations of detailed mathematical models of complex physical and/or chemical processes are often referred to as research or scientific (RS) codes. Research codes typically require large amounts of computing time. One example of an RS code is a finite-element code for solving complex systems of differential equations that describe mass transfer through some geologic medium. Considerable computing time is required because computations are done at many points in time and/or space. Codes used to evaluate the overall performance of real-world physical systems are called performance assessment (PA) codes. Performance assessment codes are used to conduct simulated experiments involving systems that cannot be directly observed. Thus, PA codes usually involve repeated simulations of system performance in situations that preclude the use of conventional experimental and statistical methods. 3 figs.
Plant Phenotyping through the Eyes of Complex Systems: Theoretical Considerations
NASA Astrophysics Data System (ADS)
Kim, J.
2017-12-01
Plant phenotyping is an emerging transdisciplinary research area which necessitates not only the communication and collaboration of scientists from different disciplines but also a paradigm shift to a holistic approach. A complex system is defined as a system having a large number of interacting parts (or particles, agents), whose interactions give rise to non-trivial properties like self-organization and emergence. Plant ecosystems are complex systems: continually morphing dynamical systems, i.e. self-organizing hierarchical open systems. Such systems are composed of many subunits/subsystems with nonlinear interactions and feedback. The throughput, such as the flow of energy, matter and information, is the key control parameter in complex systems. Information-theoretic approaches can be used to understand and identify such interactions, structures and dynamics through reductions in uncertainty (i.e. entropy). Theoretical considerations based on network and thermodynamic thinking, along with exemplary analyses (e.g. dynamic process networks, spectral entropy) of throughput time series, will be presented. These can be used as a framework to develop more discipline-specific fundamental approaches to provide tools for the transferability of traits between measurement scales in plant phenotyping. Acknowledgment: This work was funded by the Weather Information Service Engine Program of the Korea Meteorological Administration under Grant KMIPA-2012-0001.
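One of the diagnostics this entry names, spectral entropy, has a common definition: normalize the power spectrum of a time series to a probability distribution and take its Shannon entropy. The sketch below uses that common definition on invented test signals; it is not the study's implementation.

```python
import numpy as np

# Spectral entropy: Shannon entropy of the normalized power spectrum.
# A pure tone concentrates power in one bin (low entropy); white noise
# spreads power across bins (high entropy). Test signals are invented.
def spectral_entropy(x):
    psd = np.abs(np.fft.rfft(x)) ** 2
    psd = psd[1:]                    # drop the DC component
    p = psd / psd.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

t = np.arange(1024)
pure_tone = np.sin(2 * np.pi * 50 * t / 1024)   # exactly 50 cycles
rng = np.random.default_rng(7)
noise = rng.standard_normal(1024)
print(spectral_entropy(pure_tone) < spectral_entropy(noise))   # True
```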
NASA Technical Reports Server (NTRS)
Jenkins, Luther N.; Yao, Chung-Sheng; Bartram, Scott M.; Harris, Jerome; Allan, Brian; Wong, Oliver; Mace, W. Derry
2009-01-01
A Large Field-of-View Particle Image Velocimetry (LFPIV) system has been developed for rotor wake diagnostics in the 14-by 22-Foot Subsonic Tunnel. The system has been used to measure three components of velocity in a plane as large as 1.524 meters by 0.914 meters in both forward flight and hover tests. Overall, the system performance has exceeded design expectations in terms of accuracy and efficiency. Measurements synchronized with the rotor position during forward flight and hover tests have shown that the system is able to capture the complex interaction of the body and rotor wakes as well as basic details of the blade tip vortex at several wake ages. Measurements obtained with traditional techniques such as multi-hole pressure probes, Laser Doppler Velocimetry (LDV), and 2D Particle Image Velocimetry (PIV) show good agreement with LFPIV measurements.
Direct heuristic dynamic programming for damping oscillations in a large power system.
Lu, Chao; Si, Jennie; Xie, Xiaorong
2008-08-01
This paper applies a neural-network-based approximate dynamic programming method, namely, the direct heuristic dynamic programming (direct HDP), to a large power system stability control problem. The direct HDP is a learning- and approximation-based approach to addressing nonlinear coordinated control under uncertainty. One of the major design parameters, the controller learning objective function, is formulated to directly account for network-wide low-frequency oscillation with the presence of nonlinearity, uncertainty, and coupling effect among system components. Results include a novel learning control structure based on the direct HDP with applications to two power system problems. The first case involves static var compensator supplementary damping control, which is used to provide a comprehensive evaluation of the learning control performance. The second case aims at addressing a difficult complex system challenge by providing a new solution to a large interconnected power network oscillation damping control problem that frequently occurs in the China Southern Power Grid.
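The critic half of the heuristic-dynamic-programming family this entry builds on can be shown in a toy form: learn an approximate cost-to-go by temporal-difference updates. The scalar plant, the quadratic feature, and all constants below are assumptions for the sketch; the paper's direct HDP uses neural networks on a power grid, not this linear toy.

```python
# TD-style critic update on a toy problem: approximate the cost-to-go
# J(x) ~ w * x**2 for the fixed stable closed loop x_next = a * x with
# stage cost r = x**2. Plant, feature, and rates are illustrative.
gamma = 0.95     # discount factor
a = 0.9          # closed-loop dynamics: x_next = a * x
alpha = 0.1      # learning rate
w = 0.0          # critic weight on the feature x**2

x = 1.0          # held fixed so the fixed point is easy to verify
for _ in range(1000):
    x_next = a * x
    r = x ** 2                                   # stage cost
    td_error = r + gamma * w * x_next ** 2 - w * x ** 2
    w += alpha * td_error * x ** 2               # gradient step on the feature

# Analytic fixed point for this toy: w* = 1 / (1 - gamma * a**2)
print(round(w, 2))   # -> 4.34
```

The printed weight matches the analytic value `1 / (1 - gamma * a**2) ≈ 4.34`, which is what the TD iteration should converge to for this linear plant.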
Damping characterization in large structures
NASA Technical Reports Server (NTRS)
Eke, Fidelis O.; Eke, Estelle M.
1991-01-01
This research project has as its main goal the development of methods for selecting the damping characteristics of components of a large structure or multibody system, in such a way as to produce some desired system damping characteristics. The main need for such an analytical device is in the simulation of the dynamics of multibody systems consisting, at least partially, of flexible components. The reason for this need is that all existing simulation codes for multibody systems require component-by-component characterization of complex systems, whereas requirements (including damping) often appear at the overall system level. The main goal was met in large part by the development of a method that will in fact synthesize component damping matrices from a given system damping matrix. The restrictions to the method are that the desired system damping matrix must be diagonal (which is almost always the case) and that interbody connections must be by simple hinges. In addition to the technical outcome, this project contributed positively to the educational and research infrastructure of Tuskegee University - a Historically Black Institution.
NASA Astrophysics Data System (ADS)
Korotkova, T. I.; Popova, V. I.
2017-11-01
The generalized mathematical model of decision-making in the problem of planning and mode selection to provide the required heat loads in a large heat supply system is considered. The system is multilevel, decomposed into levels of main and distribution heating networks with intermediate control stages. Evaluation of the effectiveness, reliability and safety of such a complex system is carried out simultaneously according to several indicators, in particular pressure, flow, and temperature. This global multicriteria optimization problem with constraints is decomposed into a number of local optimization problems and a coordination problem. A coordinated solution of the local problems provides a solution to the global multicriteria decision-making problem for the complex system. The choice of the optimal operational mode of a complex heat supply system is made on the basis of an iterative coordination process, which converges to the coordinated solution of the local optimization tasks. The interactive principle of multicriteria decision-making includes, in particular, periodic adjustments, if necessary, guaranteeing optimal safety, reliability and efficiency of the system as a whole during operation. The degree of accuracy of the solution, for example, the degree of deviation of the internal air temperature from the required value, can also be changed interactively. This makes it possible to carry out adjustment activities in the best way and to improve the quality of heat supply to consumers. At the same time, an energy-saving task is solved to determine the minimum required heads at sources and pumping stations.
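The decomposition-and-coordination scheme this entry describes can be sketched with dual decomposition on a toy problem: two "local" units each minimize a private cost while a coordinator adjusts a price until their combined output meets a shared demand. The quadratic costs, demand, and step size are invented for illustration; the real system coordinates heat-network levels, not these toy units.

```python
# Dual decomposition sketch: local problems solved in closed form,
# coordinated by a price (Lagrange multiplier) update. All numbers
# are hypothetical, not values from the heat supply system.
def local_opt(a, c, lam):
    # argmin_x  a * (x - c)**2 + lam * x   (closed form for this cost)
    return c - lam / (2.0 * a)

a1, c1 = 1.0, 3.0      # unit 1: cost weight and preferred output
a2, c2 = 2.0, 2.0      # unit 2
demand = 4.0
lam, alpha = 0.0, 0.5  # coordination price and its step size

for _ in range(200):
    x1 = local_opt(a1, c1, lam)
    x2 = local_opt(a2, c2, lam)
    lam += alpha * (x1 + x2 - demand)   # coordinator: penalize excess

print(round(x1, 3), round(x2, 3))       # -> 2.333 1.667  (x1 + x2 = 4)
```

The iteration converges to the same point a centralized solver would find, which is the appeal of coordinating local optimizations instead of solving one global problem.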
An implementation of the distributed programming structural synthesis system (PROSSS)
NASA Technical Reports Server (NTRS)
Rogers, J. L., Jr.
1981-01-01
A method is described for implementing a flexible software system that combines large, complex programs with small, user-supplied, problem-dependent programs and that distributes their execution between a mainframe and a minicomputer. The Programming Structural Synthesis System (PROSSS) was the specific software system considered. The results of such distributed implementation are flexibility of the optimization procedure organization and versatility of the formulation of constraints and design variables.
Finding equilibrium in the spatiotemporal chaos of the complex Ginzburg-Landau equation
NASA Astrophysics Data System (ADS)
Ballard, Christopher C.; Esty, C. Clark; Egolf, David A.
2016-11-01
Equilibrium statistical mechanics allows the prediction of collective behaviors of large numbers of interacting objects from just a few system-wide properties; however, a similar theory does not exist for far-from-equilibrium systems exhibiting complex spatial and temporal behavior. We propose a method for predicting behaviors in a broad class of such systems and apply these ideas to an archetypal example, the spatiotemporal chaotic 1D complex Ginzburg-Landau equation in the defect chaos regime. Building on the ideas of Ruelle and of Cross and Hohenberg that a spatiotemporal chaotic system can be considered a collection of weakly interacting dynamical units of a characteristic size, the chaotic length scale, we identify underlying, mesoscale, chaotic units and effective interaction potentials between them. We find that the resulting equilibrium Takahashi model accurately predicts distributions of particle numbers. These results suggest the intriguing possibility that a class of far-from-equilibrium systems may be well described at coarse-grained scales by the well-established theory of equilibrium statistical mechanics.
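The equation this entry studies can be integrated with very little code. Below is a minimal explicit-Euler, finite-difference sketch of the 1D complex Ginzburg-Landau equation in one common parameterization, dA/dt = A + (1 + i·c1)·A_xx − (1 + i·c3)·|A|²·A, with periodic boundaries. The grid, time step, and (c1, c3) values are assumptions; the defect-chaos regime studied in the paper requires parameter choices satisfying the Benjamin-Feir instability condition (roughly 1 + c1·c3 < 0 in this convention), which this deliberately stable example does not attempt to reproduce.

```python
import numpy as np

# Explicit Euler step for the 1D complex Ginzburg-Landau equation with
# periodic boundaries. Parameters here give stable saturation, not the
# defect-chaos regime of the paper.
def step(A, dt, dx, c1, c3):
    lap = (np.roll(A, 1) + np.roll(A, -1) - 2.0 * A) / dx ** 2
    return A + dt * (A + (1 + 1j * c1) * lap
                     - (1 + 1j * c3) * np.abs(A) ** 2 * A)

rng = np.random.default_rng(1)
n, dx, dt = 256, 0.5, 0.01
A = 0.01 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
for _ in range(5000):
    A = step(A, dt, dx, c1=2.0, c3=1.0)
print(float(np.mean(np.abs(A))))   # amplitude saturates at O(1)
```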
System for decision analysis support on complex waste management issues
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shropshire, D.E.
1997-10-01
A software system called the Waste Flow Analysis has been developed and applied to complex environmental management processes for the United States Department of Energy (US DOE). The system can evaluate proposed methods of waste retrieval, treatment, storage, transportation, and disposal. Analysts can evaluate various scenarios to see the impacts on waste flows and schedules, costs, and health and safety risks. Decision analysis capabilities have been integrated into the system to help identify preferred alternatives based on specific objectives. Objectives may be to maximize the waste moved to final disposition during a given time period, minimize health risks, minimize costs, or combinations of objectives. The decision analysis capabilities can support evaluation of large and complex problems rapidly, and under conditions of variable uncertainty. The system is being used to evaluate environmental management strategies to safely disposition wastes in the next ten years and reduce the environmental legacy resulting from nuclear material production over the past forty years.
Lost in transportation: Information measures and cognitive limits in multilayer navigation.
Gallotti, Riccardo; Porter, Mason A; Barthelemy, Marc
2016-02-01
Cities and their transportation systems become increasingly complex and multimodal as they grow, and it is natural to wonder whether it is possible to quantitatively characterize our difficulty navigating in them and whether such navigation exceeds our cognitive limits. A transition between different search strategies for navigating in metropolitan maps has been observed for large, complex metropolitan networks. This evidence suggests the existence of a limit associated with cognitive overload and caused by a large amount of information that needs to be processed. In this light, we analyzed the world's 15 largest metropolitan networks and estimated the information limit for determining a trip in a transportation system to be on the order of 8 bits. Similar to the "Dunbar number," which represents a limit to the size of an individual's friendship circle, our cognitive limit suggests that maps should not consist of more than 250 connection points to be easily readable. We also show that including connections with other transportation modes dramatically increases the information needed to navigate in multilayer transportation networks. In large cities such as New York, Paris, and Tokyo, more than 80% of the trips are above the 8-bit limit. Multimodal transportation systems in large cities have thus already exceeded human cognitive limits and, consequently, the traditional view of navigation in cities has to be revised substantially.
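The two figures in this entry, an 8-bit information limit and a ~250-connection readability threshold, are consistent under a simple uniform-choice reading: selecting one of N equally likely connection points carries log2(N) bits, so 8 bits corresponds to about 2^8 = 256 points. The uniform-choice assumption is a simplification made for this back-of-envelope sketch, not the paper's full information measure.

```python
import math

# Bits needed to single out one of n equally likely connection points.
# log2(250) ~ 7.97 bits sits just under the 8-bit limit; 256 points
# saturate it exactly.
def choice_bits(n_points):
    return math.log2(n_points)

print(round(choice_bits(250), 2))   # -> 7.97
print(int(choice_bits(256)))        # -> 8
```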
Muhlfeld, Clint C.; Marotz, Brian
2005-01-01
Despite the importance of large-scale habitat connectivity to the threatened bull trout Salvelinus confluentus, little is known about the life history characteristics and processes influencing natural dispersal of migratory populations. We used radiotelemetry to investigate the seasonal movements and habitat use by subadult bull trout (i.e., fish that emigrated from natal streams to the river system) tracked for varying durations from 1999 to 2002 in the upper Flathead River system in northwestern Montana. Telemetry data revealed migratory (N = 32 fish) and nonmigratory (N = 35 fish) behavior, indicating variable movement patterns in the subadult phase of bull trout life history. Most migrating subadults (84%) made rapid or incremental downriver movements (mean distance, 33 km; range, 6–129 km) to lower portions of the river system and to Flathead Lake during high spring flows and as temperatures declined in the fall and winter. Bull trout subadults used complex daytime habitat throughout the upper river system, including deep runs that contained unembedded boulder and cobble substrates, pools with large woody debris, and deep lake-influenced areas of the lower river system. Our results elucidate the importance of maintaining natural connections and a diversity of complex habitats over a large spatial scale to conserve the full expression of life history traits and processes influencing the natural dispersal of bull trout populations. Managers should seek to restore and enhance critical river corridor habitat and remove migration barriers, where possible, for recovery and management programs.
Speeding up GW Calculations to Meet the Challenge of Large Scale Quasiparticle Predictions
Gao, Weiwei; Xia, Weiyi; Gao, Xiang; Zhang, Peihong
2016-01-01
Although the GW approximation is recognized as one of the most accurate theories for predicting the excited-state properties of materials, scaling up conventional GW calculations for large systems remains a major challenge. We present a powerful and simple-to-implement method that can drastically accelerate fully converged GW calculations for large systems, enabling fast and accurate quasiparticle calculations for complex materials systems. We demonstrate the performance of this new method by presenting the results for ZnO and MgO supercells. A speed-up factor of nearly two orders of magnitude is achieved for a system containing 256 atoms (1024 valence electrons) with a negligibly small numerical error of ±0.03 eV. Finally, we discuss the application of our method to the GW calculations for 2D materials. PMID:27833140
Molecular orbital studies of the bonding in heavy element organometallics: Progress report
NASA Astrophysics Data System (ADS)
Bursten, B. E.
1988-03-01
Over the past two years we have made considerable progress in the understanding of the bonding in heavy element mononuclear and binuclear complexes. For mononuclear complexes, our strategy has been to study the orbital interactions between the actinide metal center and the surrounding ligands. One particular system which has been studied extensively is X3AnL (where X = Cp, Cl, NH2; An = actinide; and L = neutral or anionic ligand). We are interested not only in the mechanics of the An-X orbital interactions, but also in how the relative donor characteristics of X may influence coordination of the fourth ligand L to the actinide. For binuclear systems, we are interested not only in homobimetallic complexes, but also in heterobimetallic complexes containing actinides and transition metals. In order to make the calculations of such large systems tractable, we have transferred the X-alpha-SW codes to the newly acquired Cray XMP24 at the Ohio Supercomputer Center. This has resulted in significant savings of money and time.
Modeling stochastic noise in gene regulatory systems
Meister, Arwen; Du, Chao; Li, Ye Henry; Wong, Wing Hung
2014-01-01
The Master equation is considered the gold standard for modeling the stochastic mechanisms of gene regulation in molecular detail, but it is too complex to solve exactly in most cases, so approximation and simulation methods are essential. However, there is still a lack of consensus about the best way to carry these out. To help clarify the situation, we review Master equation models of gene regulation, theoretical approximations based on an expansion method due to N.G. van Kampen and R. Kubo, and simulation algorithms due to D.T. Gillespie and P. Langevin. Expansion of the Master equation shows that for systems with a single stable steady-state, the stochastic model reduces to a deterministic model in a first-order approximation. Additional theory, also due to van Kampen, describes the asymptotic behavior of multistable systems. To support and illustrate the theory and provide further insight into the complex behavior of multistable systems, we perform a detailed simulation study comparing the various approximation and simulation methods applied to synthetic gene regulatory systems with various qualitative characteristics. The simulation studies show that for large stochastic systems with a single steady-state, deterministic models are quite accurate, since the probability distribution of the solution has a single peak tracking the deterministic trajectory whose variance is inversely proportional to the system size. In multistable stochastic systems, large fluctuations can cause individual trajectories to escape from the domain of attraction of one steady-state and be attracted to another, so the system eventually reaches a multimodal probability distribution in which all stable steady-states are represented proportional to their relative stability. However, since the escape time scales exponentially with system size, this process can take a very long time in large systems. PMID:25632368
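The Gillespie simulation method discussed above can be sketched for the simplest gene-expression model, a birth-death process (constant-rate production, first-order degradation); the rate constants below are illustrative, not taken from the review.

```python
import random

def gillespie_birth_death(k_prod, k_deg, x0, t_max, rng):
    """Exact stochastic simulation (Gillespie) of a birth-death process:
    0 -> X at rate k_prod, X -> 0 at rate k_deg * x.
    Returns the copy number at time t_max."""
    t, x = 0.0, x0
    while True:
        a1, a2 = k_prod, k_deg * x       # propensities of the two reactions
        a0 = a1 + a2
        t += rng.expovariate(a0)         # exponentially distributed waiting time
        if t > t_max:
            return x
        if rng.random() * a0 < a1:       # pick reaction proportional to propensity
            x += 1
        else:
            x -= 1

rng = random.Random(0)
samples = [gillespie_birth_death(10.0, 0.1, 100, 50.0, rng) for _ in range(200)]
mean = sum(samples) / len(samples)
print(mean)  # close to the deterministic steady state k_prod / k_deg = 100
```

Consistent with the review's observation for single-steady-state systems, the sample mean tracks the deterministic fixed point, with fluctuations shrinking relative to the mean as the system size grows.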
Problems of Automation and Management Principles Information Flow in Manufacturing
NASA Astrophysics Data System (ADS)
Grigoryuk, E. N.; Bulkin, V. V.
2017-07-01
Automated process control systems are complex systems characterized by elements with a common purpose, the systemic nature of the algorithms implemented for the exchange and processing of information, and a large number of functional subsystems. The article gives examples of automatic control systems and automated process control systems, drawing parallels between them by identifying their strengths and weaknesses. A non-standard process control system is also proposed.
A modeling framework for exposing risks in complex systems.
Sharit, J
2000-08-01
This article introduces and develops a modeling framework for exposing risks in the form of human errors and adverse consequences in high-risk systems. The modeling framework is based on two components: a two-dimensional theory of accidents in systems developed by Perrow in 1984, and the concept of multiple system perspectives. The theory of accidents differentiates systems on the basis of two sets of attributes. One set characterizes the degree to which systems are interactively complex; the other emphasizes the extent to which systems are tightly coupled. The concept of multiple perspectives provides alternative descriptions of the entire system that serve to enhance insight into system processes. The usefulness of these two model components derives from a modeling framework that cross-links them, enabling a variety of work contexts to be exposed and understood that would otherwise be very difficult or impossible to identify. The model components and the modeling framework are illustrated in the case of a large and comprehensive trauma care system. In addition to its general utility in the area of risk analysis, this methodology may be valuable in applications of current methods of human and system reliability analysis in complex and continually evolving high-risk systems.
Large/Complex Antenna Performance Validation for Spaceborne Radar/Radiometric Instruments
NASA Technical Reports Server (NTRS)
Focardi, Paolo; Harrell, Jefferson; Vacchione, Joseph
2013-01-01
Over the past decade, Earth observing missions which employ spaceborne combined radar & radiometric instruments have been developed and implemented. These instruments include the use of large and complex deployable antennas whose radiation characteristics need to be accurately determined over 4π steradians. Given the size and complexity of these antennas, the performance of the flight units cannot be readily measured. In addition, the radiation performance is impacted by the presence of the instrument's service platform which cannot easily be included in any measurement campaign. In order to meet the system performance knowledge requirements, a two-pronged approach has been employed. The first is to use modeling tools to characterize the system and the second is to build a scale model of the system and use RF measurements to validate the results of the modeling tools. This paper demonstrates the resulting level of agreement between scale model and numerical modeling for two recent missions: (1) the earlier Aquarius instrument currently in Earth orbit and (2) the upcoming Soil Moisture Active Passive (SMAP) mission. The results from two modeling approaches, Ansoft's High Frequency Structure Simulator (HFSS) and TICRA's General RF Applications Software Package (GRASP), were compared with measurements of approximately 1/10th scale models of the Aquarius and SMAP systems. Generally good agreement was found between the three methods but each approach had its shortcomings, as will be detailed in this paper.
NASA Astrophysics Data System (ADS)
Sherje, Atul P.; Patel, Forum; Murahari, Manikanta; Suvarna, Vasanti; Patel, Kavitkumar
2018-02-01
The present study demonstrated the binary and ternary complexes of Zaltoprofen (ZPF) with β-CD and HP-β-CD. The products were characterized using solubility, in vitro dissolution, and DSC studies. The mode of interaction of guest and host was revealed through 1H NMR and FT-IR studies. A significant increase was noticed in the stability constant (Kc) and complexation efficiency (CE) of β-CD and HP-β-CD due to addition of L-Arg in ternary complexes. The ternary complexes showed greater increase in solubility and dissolution of ZPF than binary complexes. Thus, ternary system of ZPF could be an innovative approach for its solubility and dissolution enhancement.
McMahon, Christopher J; Toomey, Joshua P; Kane, Deb M
2017-01-01
We have analysed large data sets consisting of tens of thousands of time series from three Type B laser systems: a semiconductor laser in a photonic integrated chip, a semiconductor laser subject to optical feedback from a long free-space external cavity, and a solid-state laser subject to optical injection from a master laser. The lasers can deliver either constant, periodic, pulsed, or chaotic outputs when parameters such as the injection current and the level of external perturbation are varied. The systems represent examples of experimental nonlinear systems more generally and cover a broad range of complexity including systematically varying complexity in some regions. In this work we have introduced a new procedure for semi-automatically interrogating experimental laser system output power time series to calculate the correlation dimension (CD) using the commonly adopted Grassberger-Procaccia algorithm. The new CD procedure is called the 'minimum gradient detection algorithm'. A value of minimum gradient is returned for all time series in a data set. In some cases this can be identified as a CD, with uncertainty. Applying the new 'minimum gradient detection algorithm' CD procedure, we obtained robust measurements of the correlation dimension for many of the time series measured from each laser system. By mapping the results across an extended parameter space for operation of each laser system, we were able to confidently identify regions of low CD (CD < 3) and assign these robust values for the correlation dimension. However, in all three laser systems, we were not able to measure the correlation dimension at all parts of the parameter space. Nevertheless, by mapping the staged progress of the algorithm, we were able to broadly classify the dynamical output of the lasers at all parts of their respective parameter spaces. For two of the laser systems this included displaying regions of high-complexity chaos and dynamic noise. 
These high-complexity regions are differentiated from regions where the time series are dominated by technical noise. This is the first time such differentiation has been achieved using a CD analysis approach. More can be known of the CD for a system when it is interrogated in a mapping context, than from calculations using isolated time series. This has been shown for three laser systems and the approach is expected to be useful in other areas of nonlinear science where large data sets are available and need to be semi-automatically analysed to provide real dimensional information about the complex dynamics. The CD/minimum gradient algorithm measure provides additional information that complements other measures of complexity and relative complexity, such as the permutation entropy; and conventional physical measurements.
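The Grassberger-Procaccia correlation sum underlying these CD estimates can be sketched as follows. This is a minimal two-radius slope estimate, not the authors' 'minimum gradient detection algorithm', and the test data (a 2-D uniform cloud, whose correlation dimension is 2) are illustrative.

```python
import math, random

def correlation_sum(points, r):
    """Fraction of distinct point pairs closer than r (Grassberger-Procaccia C(r))."""
    n, count = len(points), 0
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(points[i], points[j]) < r:
                count += 1
    return 2.0 * count / (n * (n - 1))

def estimate_cd(points, r1, r2):
    """Two-point estimate of the correlation dimension: the slope of
    log C(r) versus log r between two radii in the scaling region."""
    c1, c2 = correlation_sum(points, r1), correlation_sum(points, r2)
    return (math.log(c2) - math.log(c1)) / (math.log(r2) - math.log(r1))

rng = random.Random(1)
cloud = [(rng.random(), rng.random()) for _ in range(400)]  # fills a 2-D square
print(estimate_cd(cloud, 0.05, 0.2))  # expect a value near 2
```

With finite samples and edge effects the estimate sits slightly below 2, which is why production analyses (like the authors') fit the slope over many radii and automate the choice of scaling region.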
The new challenges of multiplex networks: Measures and models
NASA Astrophysics Data System (ADS)
Battiston, Federico; Nicosia, Vincenzo; Latora, Vito
2017-02-01
What do societies, the Internet, and the human brain have in common? They are all examples of complex relational systems, whose emerging behaviours are largely determined by the non-trivial networks of interactions among their constituents, namely individuals, computers, or neurons, rather than only by the properties of the units themselves. In the last two decades, network scientists have proposed models of increasing complexity to better understand real-world systems. Only recently have we realised that multiplexity, i.e. the coexistence of several types of interactions among the constituents of a complex system, is responsible for substantial qualitative and quantitative differences in the type and variety of behaviours that a complex system can exhibit. As a consequence, multilayer and multiplex networks have become a hot topic in complexity science. Here we provide an overview of some of the measures proposed so far to characterise the structure of multiplex networks, and a selection of models aiming at reproducing those structural properties and quantifying their statistical significance. Focusing on a subset of relevant topics, this brief review is a reasonably comprehensive introduction to the most basic tools for the analysis of multiplex networks observed in the real world. The wide applicability of multiplex networks as a framework to model complex systems in different fields, from biology to social sciences, and the colloquial tone of the paper will make it an interesting read for researchers working on both theoretical and experimental analysis of networked systems.
Computational complexity of Boolean functions
NASA Astrophysics Data System (ADS)
Korshunov, Aleksei D.
2012-02-01
Boolean functions are among the fundamental objects of discrete mathematics, especially in those of its subdisciplines which fall under mathematical logic and mathematical cybernetics. The language of Boolean functions is convenient for describing the operation of many discrete systems such as contact networks, Boolean circuits, branching programs, and some others. An important parameter of discrete systems of this kind is their complexity. This characteristic has been actively investigated starting from Shannon's works. There is a large body of scientific literature presenting many fundamental results. The purpose of this survey is to give an account of the main results over the last sixty years related to the complexity of computation (realization) of Boolean functions by contact networks, Boolean circuits, and Boolean circuits without branching. Bibliography: 165 titles.
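Shannon's counting argument, a starting point of the complexity results surveyed above, can be reproduced numerically: there are 2^(2^n) Boolean functions of n variables, while only limited numbers of distinct circuits exist at each size, so most functions need roughly 2^n/n gates. The circuit-count bound below is a deliberately crude sketch, not a tight result from the survey.

```python
def num_boolean_functions(n):
    """There are 2^(2^n) Boolean functions of n variables."""
    return 2 ** (2 ** n)

def circuit_count_bound(n, s):
    """Crude upper bound on the number of distinct circuits with s binary
    gates over n inputs: each gate picks one of 16 binary operations and
    two predecessors among the n inputs and s gates."""
    return (16 * (n + s) ** 2) ** s

def min_gates_lower_bound(n):
    """Smallest s for which the circuit count bound reaches 2^(2^n); most
    n-variable functions need at least this many gates (Shannon's counting
    argument)."""
    s = 1
    while circuit_count_bound(n, s) < num_boolean_functions(n):
        s += 1
    return s

print(num_boolean_functions(3))   # 256
print(min_gates_lower_bound(8))   # grows roughly like 2^n / n
```

The bound is non-constructive: it proves hard functions exist in abundance without exhibiting a single one, which is precisely the tension the survey's sixty years of results address.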
Interaction of a supersonic particle with a three-dimensional complex plasma
NASA Astrophysics Data System (ADS)
Zaehringer, E.; Schwabe, M.; Zhdanov, S.; Mohr, D. P.; Knapek, C. A.; Huber, P.; Semenov, I. L.; Thomas, H. M.
2018-03-01
The influence of a supersonic projectile on a three-dimensional complex plasma is studied. Micron sized particles in a low-temperature plasma formed a large undisturbed system in the new "Zyflex" chamber under microgravity conditions. A supersonic probe particle excited a Mach cone with Mach number M ≈ 1.5-2 and a double Mach cone structure in the large weakly damped particle cloud. The speed of sound is measured with different methods, and particle charge estimates are compared with calculations from standard theories. The high image resolution enables the study of Mach cones in microgravity at the single particle level of a three-dimensional complex plasma and gives insight into the dynamics. Heating of the microparticles is observed behind the supersonic projectile but not in the flanks of the Mach cone.
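The Mach-number range quoted above follows from the Mach cone relation sin(theta) = 1/M, which also lets the sound speed be inferred from the measured cone half-angle. A small sketch with illustrative numbers, not the paper's data:

```python
import math

def mach_number_from_angle(half_angle_deg):
    """Invert the Mach cone relation sin(theta) = 1/M for a measured
    cone half-angle theta (degrees)."""
    return 1.0 / math.sin(math.radians(half_angle_deg))

def sound_speed(projectile_speed, half_angle_deg):
    """Speed of sound inferred from a Mach cone: c = v * sin(theta)."""
    return projectile_speed * math.sin(math.radians(half_angle_deg))

# A 30-degree half-angle corresponds to M = 2.
print(mach_number_from_angle(30.0))
```

Cross-checking the cone-angle estimate against direct wave-propagation measurements is one way the paper's "different methods" for the sound speed can be compared.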
Advanced functional network analysis in the geosciences: The pyunicorn package
NASA Astrophysics Data System (ADS)
Donges, Jonathan F.; Heitzig, Jobst; Runge, Jakob; Schultz, Hanna C. H.; Wiedermann, Marc; Zech, Alraune; Feldhoff, Jan; Rheinwalt, Aljoscha; Kutza, Hannes; Radebach, Alexander; Marwan, Norbert; Kurths, Jürgen
2013-04-01
Functional networks are a powerful tool for analyzing large geoscientific datasets such as global fields of climate time series originating from observations or model simulations. pyunicorn (pythonic unified complex network and recurrence analysis toolbox) is an open-source, fully object-oriented and easily parallelizable package written in the language Python. It allows for constructing functional networks (aka climate networks) representing the structure of statistical interrelationships in large datasets and, subsequently, investigating this structure using advanced methods of complex network theory such as measures for networks of interacting networks, node-weighted statistics or network surrogates. Additionally, pyunicorn allows one to study the complex dynamics of geoscientific systems as recorded by time series by means of recurrence networks and visibility graphs. The range of possible applications of the package is outlined, drawing on several examples from climatology.
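The core construction behind a functional (climate) network, linking time series whose statistical interrelationship exceeds a threshold, can be sketched without pyunicorn itself. The snippet below uses plain Pearson correlation on synthetic series and is not pyunicorn's API; real analyses would also use significance testing or surrogates to choose the threshold.

```python
import math, random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def functional_network(series, threshold):
    """Link two series when |Pearson correlation| exceeds the threshold:
    the basic thresholding step behind climate/functional networks."""
    n = len(series)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if abs(pearson(series[i], series[j])) > threshold]

rng = random.Random(2)
t = [0.1 * k for k in range(200)]
s1 = [math.sin(v) + 0.1 * rng.gauss(0, 1) for v in t]  # correlated pair
s2 = [math.sin(v) + 0.1 * rng.gauss(0, 1) for v in t]
s3 = [rng.gauss(0, 1) for _ in t]                      # independent noise
print(functional_network([s1, s2, s3], 0.5))  # [(0, 1)]
```

On gridded climate fields the same construction is applied to thousands of series, which is where pyunicorn's parallelizable implementation matters.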
Interpolymer complexation: comparisons of bulk and interfacial structures.
Cattoz, Beatrice; de Vos, Wiebe M; Cosgrove, Terence; Crossman, Martin; Espidel, Youssef; Prescott, Stuart W
2015-04-14
The interactions between the strong polyelectrolyte sodium poly(styrenesulfonate), NaPSS, and the neutral polymer poly(vinylpyrrolidone), PVP, were investigated in bulk and at the silica/solution interface using a combination of diffusion nuclear magnetic resonance spectroscopy (NMR), small-angle neutron scattering (SANS), solvent relaxation NMR, and ellipsometry. We show for the first time that complex formation occurs between NaPSS and PVP in solution; the complexes formed were shown not to be influenced by pH variation, whereas increasing the ionic strength increases the complexation of NaPSS but does not influence the PVP directly. The complexes formed contained a large proportion of NaPSS. Study of these interactions at the silica interface demonstrated that complexes also form at the nanoparticle interface when PVP is added to the system prior to NaPSS. For a constant PVP concentration and varying NaPSS concentration, the system remains stable until NaPSS is added in excess, which leads to depletion flocculation. Surface complex formation using the layer-by-layer technique was also reported at a planar silica interface.
Feng, Cun-Fang; Xu, Xin-Jian; Wang, Sheng-Jun; Wang, Ying-Hai
2008-06-01
We study projective-anticipating, projective, and projective-lag synchronization of time-delayed chaotic systems on random networks. We relax some limitations of previous work, where projective-anticipating and projective-lag synchronization can be achieved only on two coupled chaotic systems. In this paper, we realize projective-anticipating and projective-lag synchronization on complex dynamical networks composed of a large number of interconnected components. At the same time, although previous work studied projective synchronization on complex dynamical networks, the dynamics of the nodes are coupled partially linear chaotic systems. In this paper, the dynamics of the nodes of the complex networks are time-delayed chaotic systems without the limitation of partial linearity. Based on the Lyapunov stability theory, we suggest a generic method to achieve the projective-anticipating, projective, and projective-lag synchronization of time-delayed chaotic systems on random dynamical networks, and we establish sufficient conditions for its existence and stability. The validity of the proposed method is demonstrated and verified by examining specific examples using Ikeda and Mackey-Glass systems on Erdős-Rényi networks.
The role of artificial intelligence techniques in scheduling systems
NASA Technical Reports Server (NTRS)
Geoffroy, Amy L.; Britt, Daniel L.; Gohring, John R.
1990-01-01
Artificial Intelligence (AI) techniques provide good solutions for many of the problems which are characteristic of scheduling applications. However, scheduling is a large, complex heterogeneous problem. Different applications will require different solutions. Any individual application will require the use of a variety of techniques, including both AI and conventional software methods. The operational context of the scheduling system will also play a large role in design considerations. The key is to identify those places where a specific AI technique is in fact the preferable solution, and to integrate that technique into the overall architecture.
Ishikawa, Atsushi; Nakao, Yoshihide; Sato, Hirofumi; Sakaki, Shigeyoshi
2009-09-07
Oxygen atom transfer reaction between ML(3)=O and ML(3) (L = 2,4,6-trimethylphenyl (Mes) for M = Ir and L = 2,6-diisopropylphenylimide (NAr) for M = Os) was theoretically investigated by the DFT method. The optimized geometry of (Mes)(3)Ir-O-Ir(Mes)(3) agrees well with the experimental one, although those of (CH(3))(3)Ir-O-Ir(CH(3))(3) and Ph(3)Ir-O-IrPh(3) are much different from the experimental one of the Mes complex. These results indicate that the bulky ligand plays an important role in determining the geometry of the mu-oxo dinuclear Ir complex. Theoretical study of the real systems presents clear pictures of these oxygen atom transfer reactions, as follows: In the Ir reaction system, (i) the mu-oxo bridged dinuclear complex is more stable than the infinite separation system on the potential energy surface, indicating that this is an incomplete oxygen atom transfer reaction which does not occur at very low temperature, (ii) an unsymmetrical transition state is newly found, in which one Ir-O distance is longer than the other, (iii) an unsymmetrical local minimum is also newly found between the transition state and the infinite separation system, and (iv) the activation barrier (E(a)) is very small. In the Os reaction system, (v) the transition state is symmetrical, while no intermediate is observed unlike the Ir reaction system, and (vi) E(a) is very large. These results are consistent with the experimental results that the reaction rapidly occurs in the Ir system but very slowly in the Os system, and that the mu-oxo bridged dinuclear intermediate is detected in the Ir system but not in the Os system. To elucidate the reasons for these differences between the Ir and Os systems, the E(a) value is decomposed into nuclear and electronic factors. The former is the energy necessary to distort the ML(3) and ML(3)=O moieties from their equilibrium geometries to those in the transition state. The latter depends on the donor-acceptor interaction between ML(3)=O and ML(3). The nuclear factor is much larger in the Os system than in the Ir system and it contributes about 70% of the difference in E(a). The energy gap between the donor orbital of ML(3) and the acceptor orbital of ML(3)=O is much larger in the Os system than in the Ir system, which also contributes to the lower E(a) value of the Ir system compared with that of the Os system.
Length based vehicle classification on freeways from single loop detectors.
DOT National Transportation Integrated Search
2009-10-15
Roadway usage, particularly by large vehicles, is one of the fundamental factors determining the lifespan of highway infrastructure, e.g., as evidenced by the federally mandated Highway Performance Monitoring System (HPMS). But the complexity of ...
Computational modelling of oxygenation processes in enzymes and biomimetic model complexes.
de Visser, Sam P; Quesne, Matthew G; Martin, Bodo; Comba, Peter; Ryde, Ulf
2014-01-11
With computational resources becoming more efficient and more powerful and at the same time cheaper, computational methods have become more and more popular for studies on biochemical and biomimetic systems. Although large efforts from the scientific community have gone into exploring the possibilities of computational methods for studies on large biochemical systems, such studies are not without pitfalls and often cannot be routinely done but require expert execution. In this review we summarize and highlight advances in computational methodology and its application to enzymatic and biomimetic model complexes. In particular, we emphasize topical and state-of-the-art methodologies that are able either to reproduce experimental findings, e.g., spectroscopic parameters and rate constants, accurately or to make predictions of short-lived intermediates and fast reaction processes in nature. Moreover, we give examples of processes where certain computational methods dramatically fail.
Secondary Cutaneous Involvement in Follicular Diffuse Lymphoma Treated with Helical Tomotherapy
Dar, A. Rashid; Jordan, Kevin
2017-01-01
Non-Hodgkin’s lymphoma is a complex heterogeneous group of disease entities that involves nodal and extranodal tissues. Cutaneous involvement can occur either as a primary manifestation or secondarily in the course of the disease. Radiation therapy with either total body or localized treatments is often used for local control and symptom relief, depending on the target volume. We describe a 60-year-old male with a remote history of stage IA left neck follicular lymphoma treated with radiation 20 years ago and previous relapses aggressively treated by chemotherapy. Treatment of a large volume of the back and posterior shoulders on a helical tomotherapy radiotherapy system is reported. The skin lesions responded completely with no toxicity. Palliative radiotherapy to a fairly large and complex volume of skin with a modest dose, avoiding underlying critical tissues, on tomotherapy is feasible and well tolerated, with an excellent durable response, without compromising future chemotherapy and stem cell transplant for systemic relapse. PMID:28944110
Self-assembly kinetics of DNA functionalised liposomes
NASA Astrophysics Data System (ADS)
Mognetti, B. M.; Bachmann, S. J.; Kotar, J.; Parolini, L.; Petitzon, M.; Cicuta, P.; di Michele, L.
DNA has been widely used to program state-dependent interactions between functionalised Brownian units, resulting in responsive systems featuring complex phase behaviours. In this talk I will show how DNA can also be used to control aggregation kinetics in systems of liposomes functionalised by three types of linkers that can simultaneously bind. In doing so, I will present a general coarse-graining strategy that allows calculating the adhesion free energy between pairs of compliant units functionalised by mobile binders. I will highlight the important role played by bilayer deformability and will calculate the free energy contribution due to the presence of complexes made of more than two binders. Finally, we will demonstrate the importance of explicitly accounting for the kinetics underlying ligand-receptor reactions when studying large-scale self-assembly. We acknowledge support from ULB, the Oppenheimer Fund, and the EPSRC Programme Grant CAPITALS No. EP/J017566/1.
Replica Exchange with Solute Tempering: Efficiency in Large Scale Systems
Huang, Xuhui; Hagen, Morten; Kim, Byungchan; Friesner, Richard A.; Zhou, Ruhong; Berne, B. J.
2009-01-01
We apply the recently developed replica exchange with solute tempering (REST) to three large solvated peptide systems: an α-helix, a β-hairpin, and a TrpCage, with these peptides defined as the “central group”. We find that our original implementation of REST is not always more efficient than the replica exchange method (REM). Specifically, we find that exchanges between folded (F) and unfolded (U) conformations with vastly different structural energies are greatly reduced by the nonappearance of the water self-interaction energy in the replica exchange acceptance probabilities. REST, however, is expected to remain useful for a large class of systems for which the energy gap between the two states is not large, such as weakly bound protein–ligand complexes. Alternatively, a shell of water molecules can be incorporated into the central group, as discussed in the original paper. PMID:17439169
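The exchange bottleneck described above is visible directly in the standard replica-exchange acceptance probability min(1, exp((β_i − β_j)(E_i − E_j))): a large structural-energy gap between replicas drives acceptance toward zero, which is what the water self-interaction energy causes in REM and what REST avoids by excluding it from E. A sketch with illustrative numbers:

```python
import math

def rem_acceptance(beta_i, beta_j, e_i, e_j):
    """Metropolis acceptance probability for swapping configurations between
    replicas i and j in standard replica exchange:
    min(1, exp((beta_i - beta_j) * (E_i - E_j)))."""
    return min(1.0, math.exp((beta_i - beta_j) * (e_i - e_j)))

# A large energy gap between the two replicas suppresses exchanges...
print(rem_acceptance(1.0, 0.9, -100.0, -50.0))  # exp(-5), rarely accepted
# ...while a small gap keeps exchange rates healthy.
print(rem_acceptance(1.0, 0.9, -100.0, -95.0))  # exp(-0.5), often accepted
```

This is why the paper expects REST to remain efficient for systems where the folded/unfolded energy gap is modest, such as weakly bound protein-ligand complexes.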
Research on the adaptive optical control technology based on DSP
NASA Astrophysics Data System (ADS)
Zhang, Xiaolu; Xue, Qiao; Zeng, Fa; Zhao, Junpu; Zheng, Kuixing; Su, Jingqin; Dai, Wanjun
2018-02-01
Adaptive optics is a real-time compensation technique that uses a high-speed supporting system to correct wavefront errors caused by atmospheric turbulence. However, the randomness and rapidity of atmospheric changes introduce great difficulties into the design of adaptive optical systems. The large number of complex real-time operations involved leads to long delays, a major obstacle. To solve this problem, hardware-based computation and a parallel processing strategy are proposed, and a high-speed adaptive optical control system based on DSP is developed. A hardware counter is used to verify the system's timing. The results show that the system can complete one closed-loop control cycle in 7.1 ms, improving the control bandwidth of the adaptive optical system. Using this system, wavefront measurement and closed-loop experiments are carried out, obtaining good results.
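The closed-loop control being benchmarked can be caricatured in a few lines. This is a minimal sketch of a leaky-integrator wavefront-correction loop in plain Python, not the DSP implementation from the paper; the gain and leak values are illustrative:

```python
import numpy as np

def ao_closed_loop(aberration, gain=0.5, leak=0.99, steps=50):
    """Toy adaptive-optics control loop: each cycle the sensor measures
    the residual wavefront (aberration minus current correction) and the
    controller updates the corrector command with a leaky integrator."""
    command = np.zeros_like(aberration)
    for _ in range(steps):
        residual = aberration - command
        command = leak * command + gain * residual
    return aberration - command  # residual error after the loop settles

# A static aberration over three corrector channels is driven near zero.
final_residual = ao_closed_loop(np.array([1.0, -0.5, 0.25]))
```

The point of the real-time constraint in the abstract is that every iteration of this loop (sensing, reconstruction, command update) must finish before the atmosphere changes, which is why the 7.1 ms cycle time matters.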
Criticality as a Set-Point for Adaptive Behavior in Neuromorphic Hardware
Srinivasa, Narayan; Stepp, Nigel D.; Cruz-Albrecht, Jose
2015-01-01
Neuromorphic hardware is designed by drawing inspiration from biology to overcome limitations of current computer architectures while forging the development of a new class of autonomous systems that can exhibit adaptive behaviors. Several designs in the recent past are capable of emulating large scale networks but avoid complexity in network dynamics by minimizing the number of dynamic variables that are supported and tunable in hardware. We believe that this is due to the lack of a clear understanding of how to design self-tuning complex systems. It has been widely demonstrated that criticality appears to be the default state of the brain and manifests in the form of spontaneous scale-invariant cascades of neural activity. Experiment, theory and recent models have shown that neuronal networks at criticality demonstrate optimal information transfer, learning and information processing capabilities that affect behavior. In this perspective article, we argue that understanding how large scale neuromorphic electronics can be designed to enable emergent adaptive behavior will require an understanding of how networks emulated by such hardware can self-tune local parameters to maintain criticality as a set-point. We believe that such capability will enable the design of truly scalable intelligent systems using neuromorphic hardware that embrace complexity in network dynamics rather than avoiding it. PMID:26648839
Empirical results on scheduling and dynamic backtracking
NASA Technical Reports Server (NTRS)
Boddy, Mark S.; Goldman, Robert P.
1994-01-01
At the Honeywell Technology Center (HTC), we have been working on a scheduling problem related to commercial avionics. This application is large, complex, and hard to solve. To be a little more concrete: 'large' means almost 20,000 activities, 'complex' means several activity types, periodic behavior, and assorted types of temporal constraints, and 'hard to solve' means that we have been unable to eliminate backtracking through the use of search heuristics. At this point, we can generate solutions, where solutions exist, or report failure and sometimes why the system failed. To the best of our knowledge, this is among the largest and most complex scheduling problems to have been solved as a constraint satisfaction problem, at least that has appeared in the published literature. This abstract is a preliminary report on what we have done and how. In the next section, we present our approach to treating scheduling as a constraint satisfaction problem. The following sections present the application in more detail and describe how we solve scheduling problems in the application domain. The implemented system makes use of Ginsberg's Dynamic Backtracking algorithm, with some minor extensions to improve its utility for scheduling. We describe those extensions and the performance of the resulting system. The paper concludes with some general remarks, open questions and plans for future work.
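For readers unfamiliar with the constraint-satisfaction framing the abstract uses, a toy version helps. The sketch below is plain chronological backtracking over an invented slot-assignment problem; Ginsberg's Dynamic Backtracking, which the described system extends, improves on this by recording an elimination explanation for each rejected value so the search can retain intermediate assignments instead of discarding them on backtrack. All names and the example problem are illustrative:

```python
def solve_csp(variables, domains, constraints, assignment=None):
    """Chronological backtracking search for a constraint satisfaction
    problem. Each constraint is a predicate over a (possibly partial)
    assignment and must return True while its variables are unassigned."""
    if assignment is None:
        assignment = {}
    if len(assignment) == len(variables):
        return dict(assignment)
    var = next(v for v in variables if v not in assignment)
    for value in domains[var]:
        assignment[var] = value
        if all(check(assignment) for check in constraints):
            result = solve_csp(variables, domains, constraints, assignment)
            if result is not None:
                return result
        del assignment[var]
    return None

def different(x, y):
    # Constraint: activities x and y must not share a time slot.
    return lambda a: x not in a or y not in a or a[x] != a[y]

schedule = solve_csp(
    ["t1", "t2", "t3"],
    {"t1": [0, 1], "t2": [0, 1, 2], "t3": [0, 1, 2]},
    [different("t1", "t2"), different("t1", "t3"), different("t2", "t3")],
)
# schedule == {"t1": 0, "t2": 1, "t3": 2}
```

At the scale the abstract describes (almost 20,000 activities with periodic behavior and temporal constraints), naive search like this blows up, which is exactly why the heuristics and Dynamic Backtracking extensions matter.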
A large-scale perspective on ecosystems
NASA Technical Reports Server (NTRS)
Mizutani, Hiroshi
1987-01-01
Interactions between ecological elements must be better understood in order to construct an ecological life support system in space. An index was devised to describe the complexity of material cyclings within a given ecosystem. It was then applied to the cyclings of bioelements in various systems of material cyclings including the whole Earth and national economies. The results show interesting characteristics of natural and man-made systems.
CORALINA: a universal method for the generation of gRNA libraries for CRISPR-based screening.
Köferle, Anna; Worf, Karolina; Breunig, Christopher; Baumann, Valentin; Herrero, Javier; Wiesbeck, Maximilian; Hutter, Lukas H; Götz, Magdalena; Fuchs, Christiane; Beck, Stephan; Stricker, Stefan H
2016-11-14
The bacterial CRISPR system is fast becoming the most popular genetic and epigenetic engineering tool due to its universal applicability and adaptability. The desire to deploy CRISPR-based methods in a large variety of species and contexts has created an urgent need for the development of easy, time- and cost-effective methods enabling large-scale screening approaches. Here we describe CORALINA (comprehensive gRNA library generation through controlled nuclease activity), a method for the generation of comprehensive gRNA libraries for CRISPR-based screens. CORALINA gRNA libraries can be derived from any source of DNA without the need of complex oligonucleotide synthesis. We show the utility of CORALINA for human and mouse genomic DNA, its reproducibility in covering the most relevant genomic features including regulatory, coding and non-coding sequences and confirm the functionality of CORALINA generated gRNAs. The simplicity and cost-effectiveness make CORALINA suitable for any experimental system. The unprecedented sequence complexities obtainable with CORALINA libraries are a necessary pre-requisite for less biased large scale genomic and epigenomic screens.
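As a software caricature of the library-generation step (inspired by, not reproducing, the CORALINA protocol: uniform random fragment sampling stands in for controlled nuclease digestion, and all names are illustrative):

```python
import random

def fragment_library(source_dna, guide_len=20, n_guides=5, seed=0):
    """Sample gRNA-sized fragments directly from input DNA, mimicking a
    digestion-based library in which no oligonucleotide synthesis is
    needed: any source of DNA can supply the fragments."""
    rng = random.Random(seed)
    guides = []
    for _ in range(n_guides):
        start = rng.randrange(len(source_dna) - guide_len + 1)
        guides.append(source_dna[start:start + guide_len])
    return guides

source = "ACGTTGCA" * 15  # 120-nt stand-in for genomic DNA
library = fragment_library(source)
```

Because fragments are drawn from the source sequence itself, the library's sequence complexity tracks the input DNA, which is the property the abstract highlights for large unbiased screens.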
Visualization-based decision support for value-driven system design
NASA Astrophysics Data System (ADS)
Tibor, Elliott
In the past 50 years, the military, communication, and transportation systems that permeate our world have grown exponentially in size and complexity. The development and production of these systems have seen ballooning costs and increased risk. This is particularly critical for the aerospace industry. The inability to deal with growing system complexity is a crippling force in the advancement of engineered systems. Value-Driven Design represents a paradigm shift in the field of design engineering that has potential to help counteract this trend. The philosophy of Value-Driven Design places the desires of the stakeholder at the forefront of the design process to capture true preferences and reveal system alternatives that were never previously thought possible. Modern aerospace engineering design problems are large, complex, and involve multiple levels of decision-making. To find the best design, the decision-maker is often required to analyze hundreds or thousands of combinations of design variables and attributes. Visualization can be used to support these decisions, by communicating large amounts of data in a meaningful way. Understanding the design space, the subsystem relationships, and the design uncertainties is vital to the advancement of Value-Driven Design as an accepted process for the development of more effective, efficient, robust, and elegant aerospace systems. This research investigates the use of multi-dimensional data visualization tools to support decision-making under uncertainty during the Value-Driven Design process. A satellite design system comprising a satellite, ground station, and launch vehicle is used to demonstrate the effectiveness of new visualization methods to aid in decision support during complex aerospace system design.
These methods are used to facilitate the exploration of the feasible design space by representing the value impact of system attribute changes and comparing the results of multi-objective optimization formulations with a Value-Driven Design formulation. The visualization methods are also used to assist in the decomposition of a value function, by representing attribute sensitivities to aid with trade-off studies. Lastly, visualization is used to enable greater understanding of the subsystem relationships, by displaying derivative-based couplings, and the design uncertainties, through implementation of utility theory. The use of these visualization methods is shown to enhance the decision-making capabilities of the designer by granting them a more holistic view of the complex design space.
ERP (enterprise resource planning) systems can streamline healthcare business functions.
Jenkins, E K; Christenson, E
2001-05-01
Enterprise resource planning (ERP) software applications are designed to facilitate the systemwide integration of complex processes and functions across a large enterprise consisting of many internal and external constituents. Although most currently available ERP applications generally are tailored to the needs of the manufacturing industry, many large healthcare systems are investigating these applications. Due to the significant differences between manufacturing and patient care, ERP-based systems do not easily translate to the healthcare setting. In particular, the lack of clinical standardization impedes the use of ERP systems for clinical integration. Nonetheless, an ERP-based system can help a healthcare organization integrate many functions, including patient scheduling, human resources management, workload forecasting, and management of workflow, that are not directly dependent on clinical decision making.
NMESys: An expert system for network fault detection
NASA Technical Reports Server (NTRS)
Nelson, Peter C.; Warpinski, Janet
1991-01-01
The problem of network management is becoming an increasingly difficult and challenging task. It is very common today to find heterogeneous networks consisting of many different types of computers, operating systems, and protocols. Implementing a network with this many components is difficult enough, while maintaining such a network is an even larger problem. A prototype network management expert system, NMESys, was implemented in the C Language Integrated Production System (CLIPS). NMESys concentrates on solving some of the critical problems encountered in managing a large network. The major goal of NMESys is to provide a network operator with an expert system tool to quickly and accurately detect hard failures, potential failures, and to minimize or eliminate user down time in a large network.
Nanoparticles from renewable polymers
Wurm, Frederik R.; Weiss, Clemens K.
2014-01-01
The use of polymers from natural resources can bring many benefits for novel polymeric nanoparticle systems. Such polymers have a variety of beneficial properties, such as biodegradability and biocompatibility, and are readily available on a large scale and at low cost. As the supply of fossil fuels decreases, their application becomes more interesting, even if characterization is in many cases more challenging due to structural complexity, whether from the broad distribution of their molecular weights (polysaccharides, polyesters, lignin) or from complex structure (proteins, lignin). This review summarizes different sources and methods for the preparation of biopolymer-based nanoparticle systems for various applications. PMID:25101259
High throughput computing: a solution for scientific analysis
O'Donnell, M.
2011-01-01
handle job failures due to hardware, software, or network interruptions (obviating the need to manually resubmit the job after each stoppage); be affordable; and most importantly, allow us to complete very large, complex analyses that otherwise would not even be possible. In short, we envisioned a job-management system that would take advantage of unused FORT CPUs within a local area network (LAN) to effectively distribute and run highly complex analytical processes. What we found was a solution that uses High Throughput Computing (HTC) and High Performance Computing (HPC) systems to do exactly that (Figure 1).
The price of complexity in financial networks
NASA Astrophysics Data System (ADS)
Battiston, Stefano; Caldarelli, Guido; May, Robert M.; Roukny, Tarik; Stiglitz, Joseph E.
2016-09-01
Financial institutions form multilayer networks by engaging in contracts with each other and by holding exposures to common assets. As a result, the default probability of one institution depends on the default probability of all of the other institutions in the network. Here, we show how small errors on the knowledge of the network of contracts can lead to large errors in the probability of systemic defaults. From the point of view of financial regulators, our findings show that the complexity of financial networks may decrease the ability to mitigate systemic risk, and thus it may increase the social cost of financial crises.
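The sensitivity result can be illustrated with a deliberately tiny contagion model (not the model used in the paper; the exposures and capital buffers below are invented): a small error in one reported contract flips the outcome from a full cascade to no contagion at all.

```python
def defaults_after_contagion(exposures, capital, shocked):
    """Threshold contagion on an interbank network: bank j defaults once
    its losses on exposures to already-defaulted counterparties exceed
    its capital buffer. Iterate until no further defaults occur."""
    defaulted = set(shocked)
    changed = True
    while changed:
        changed = False
        for j in range(len(capital)):
            if j in defaulted:
                continue
            loss = sum(exposures[j][k] for k in defaulted)
            if loss > capital[j]:
                defaulted.add(j)
                changed = True
    return defaulted

exposures = [[0.0, 0.0, 0.0],   # row j: bank j's exposure to bank k
             [1.1, 0.0, 0.0],
             [0.0, 1.1, 0.0]]
capital = [1.0, 1.0, 1.0]
cascade = defaults_after_contagion(exposures, capital, shocked={0})    # {0, 1, 2}

exposures[1][0] = 0.9  # one slightly misreported contract
contained = defaults_after_contagion(exposures, capital, shocked={0})  # {0}
```

An 0.2-unit error in a single entry of the exposure matrix changes the predicted systemic outcome entirely, which is the knowledge-sensitivity effect the paper quantifies.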
Statistical mechanics of complex economies
NASA Astrophysics Data System (ADS)
Bardoscia, Marco; Livan, Giacomo; Marsili, Matteo
2017-04-01
In the pursuit of ever increasing efficiency and growth, our economies have evolved to remarkable degrees of complexity, with nested production processes feeding each other in order to create products of greater sophistication from less sophisticated ones, down to raw materials. The engine of such an expansion has been competitive markets that, according to general equilibrium theory (GET), achieve efficient allocations under specific conditions. We study large random economies within the GET framework, as templates of complex economies, and we find that a non-trivial phase transition occurs: the economy freezes in a state where all production processes collapse when either the number of primary goods or the number of available technologies falls below a critical threshold. As in other examples of phase transitions in large random systems, this is an unintended consequence of the growth in complexity. Our findings suggest that the Industrial Revolution can be regarded as a sharp transition between different phases, but also imply that well developed economies can collapse if too many intermediate goods are introduced.
Mahoney, J. Matthew; Titiz, Ali S.; Hernan, Amanda E.; Scott, Rod C.
2016-01-01
Hippocampal neural systems consolidate multiple complex behaviors into memory. However, the temporal structure of neural firing supporting complex memory consolidation is unknown. Replay of hippocampal place cells during sleep supports the view that a simple repetitive behavior modifies sleep firing dynamics, but does not explain how multiple episodes could be integrated into associative networks for recollection during future cognition. Here we decode sequential firing structure within spike avalanches of all pyramidal cells recorded in sleeping rats after running in a circular track. We find that short sequences that combine into multiple long sequences capture the majority of the sequential structure during sleep, including replay of hippocampal place cells. The ensemble, however, is not optimized for maximally producing the behavior-enriched episode. Thus behavioral programming of sequential correlations occurs at the level of short-range interactions, not whole behavioral sequences, and these short sequences are assembled into a large and complex milieu that could support complex memory consolidation. PMID:26866597
Sampling Large Graphs for Anticipatory Analytics
2015-05-15
Edwards, Lauren; Johnson, Luke; Milosavljevic, Maja; Gadepally, Vijay; Miller, Benjamin A.
Random area sampling [8] is a "snowball" sampling method in which a set of random seed vertices are selected and areas ... systems, greater human-in-the-loop involvement, or through complex algorithms. We are investigating the use of sampling to mitigate these challenges.
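Under the assumption that "random area" sampling proceeds by growing neighbourhoods outward from random seed vertices (the record above is truncated, so the details here are a guess), a minimal sketch:

```python
import random

def random_area_sample(adj, num_seeds=2, target_size=6, seed=0):
    """Snowball-style graph sampling: choose random seed vertices, then
    repeatedly absorb neighbours of already-sampled vertices until the
    sample reaches the target size. `adj` maps a vertex to its
    neighbour list; parameter names are illustrative."""
    rng = random.Random(seed)
    sample = set(rng.sample(list(adj), num_seeds))
    frontier = list(sample)
    while frontier and len(sample) < target_size:
        v = frontier.pop(0)
        for u in adj[v]:
            if u not in sample and len(sample) < target_size:
                sample.add(u)
                frontier.append(u)
    return sample

ring = {i: [(i - 1) % 8, (i + 1) % 8] for i in range(8)}  # 8-vertex cycle
area = random_area_sample(ring)
```

The resulting subgraph preserves local neighbourhood structure around the seeds, which is the property that makes area sampling attractive for anticipatory analytics on graphs too large to process whole.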
NASA Astrophysics Data System (ADS)
Griffiths, John D.
2015-12-01
The modern understanding of the brain as a large, complex network of interacting elements is a natural consequence of the Neuron Doctrine [1,2] that has been bolstered in recent years by the tools and concepts of connectomics. In this abstracted, network-centric view, the essence of neural and cognitive function derives from the flows between network elements of activity and information - or, more generally, causal influence. The appropriate characterization of causality in neural systems, therefore, is a question at the very heart of systems neuroscience.
A Portable Computer System for Auditing Quality of Ambulatory Care
McCoy, J. Michael; Dunn, Earl V.; Borgiel, Alexander E.
1987-01-01
Prior efforts to effectively and efficiently audit quality of ambulatory care based on comprehensive process criteria have been limited largely by the complexity and cost of data abstraction and management. Over the years, several demonstration projects have generated large sets of process criteria and mapping systems for evaluating quality of care, but these paper-based approaches have been impractical to implement on a routine basis. Recognizing that portable microcomputers could solve many of the technical problems in abstracting data from medical records, we built upon previously described criteria and developed a microcomputer-based abstracting system that facilitates reliable and cost-effective data abstraction.
VPipe: Virtual Pipelining for Scheduling of DAG Stream Query Plans
NASA Astrophysics Data System (ADS)
Wang, Song; Gupta, Chetan; Mehta, Abhay
There are data streams all around us that can be harnessed for tremendous business and personal advantage. For an enterprise-level stream processing system such as CHAOS [1] (Continuous, Heterogeneous Analytic Over Streams), handling of complex query plans with resource constraints is challenging. While several scheduling strategies exist for stream processing, efficient scheduling of complex DAG query plans is still largely unsolved. In this paper, we propose a novel execution scheme for scheduling complex directed acyclic graph (DAG) query plans with meta-data enriched stream tuples. Our solution, called Virtual Pipelined Chain (or VPipe Chain for short), effectively extends the "Chain" pipelining scheduling approach to complex DAG query plans.
The R-Shell approach - Using scheduling agents in complex distributed real-time systems
NASA Technical Reports Server (NTRS)
Natarajan, Swaminathan; Zhao, Wei; Goforth, Andre
1993-01-01
Large, complex real-time systems such as space and avionics systems are extremely demanding in their scheduling requirements. The current OS design approaches are quite limited in the capabilities they provide for task scheduling. Typically, they simply implement a particular uniprocessor scheduling strategy and do not provide any special support for network scheduling, overload handling, fault tolerance, distributed processing, etc. Our design of the R-Shell real-time environment facilitates the implementation of a variety of sophisticated but efficient scheduling strategies, including incorporation of all these capabilities. This is accomplished by the use of scheduling agents which reside in the application run-time environment and are responsible for coordinating the scheduling of the application.
Hot cheese: a processed Swiss cheese model.
Li, Y; Thimbleby, H
2014-01-01
James Reason's classic Swiss cheese model is a vivid and memorable way to visualise how patient harm happens only when all system defences fail. Although Reason's model has been criticised for its simplicity and static portrait of complex systems, its use has been growing, largely because of the direct clarity of its simple and memorable metaphor. A more general, more flexible and equally memorable model of accident causation in complex systems is needed. We present the hot cheese model, which is more realistic, particularly in portraying defence layers as dynamic and active - more defences may cause more hazards. The hot cheese model, being more flexible, encourages deeper discussion of incidents than the simpler Swiss cheese model permits.
Massive Multi-Agent Systems Control
NASA Technical Reports Server (NTRS)
Campagne, Jean-Charles; Gardon, Alain; Collomb, Etienne; Nishida, Toyoaki
2004-01-01
In order to build massive multi-agent systems, considered as complex and dynamic systems, one needs a method to analyze and control the system. We suggest an approach using morphology to represent and control the state of large organizations composed of a great number of light software agents. Morphology is understood as representing the state of the multi-agent system as shapes in an abstract geometrical space; this notion is close to the notion of phase space in physics.
System engineering of the Atacama Large Millimeter/submillimeter Array
NASA Astrophysics Data System (ADS)
Bhatia, Ravinder; Marti, Javier; Sugimoto, Masahiro; Sramek, Richard; Miccolis, Maurizio; Morita, Koh-Ichiro; Arancibia, Demián.; Araya, Andrea; Asayama, Shin'ichiro; Barkats, Denis; Brito, Rodrigo; Brundage, William; Grammer, Wes; Haupt, Christoph; Kurlandczyk, Herve; Mizuno, Norikazu; Napier, Peter; Pizarro, Eduardo; Saini, Kamaljeet; Stahlman, Gretchen; Verzichelli, Gianluca; Whyborn, Nick; Yagoubov, Pavel
2012-09-01
The Atacama Large Millimeter/submillimeter Array (ALMA) will be composed of 66 high precision antennae located at 5000 meters altitude in northern Chile. This paper will present the methodology, tools and processes adopted to system engineer a project of high technical complexity, by system engineering teams that are remotely located and from different cultures, and in accordance with a demanding schedule and within tight financial constraints. The technical and organizational complexity of ALMA requires a disciplined approach to the definition, implementation and verification of the ALMA requirements. During the development phase, System Engineering chairs all technical reviews and facilitates the resolution of technical conflicts. We have developed analysis tools to analyze the system performance, incorporating key parameters that contribute to the ultimate performance, and are modeled using best estimates and/or measured values obtained during test campaigns. Strict tracking and control of the technical budgets ensures that the different parts of the system can operate together as a whole within ALMA boundary conditions. System Engineering is responsible for acceptances of the thousands of hardware items delivered to Chile, and also supports the software acceptance process. In addition, System Engineering leads the troubleshooting efforts during testing phases of the construction project. Finally, the team is conducting System level verification and diagnostics activities to assess the overall performance of the observatory. This paper will also share lessons learned from these system engineering and verification approaches.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sola, M.; Haakon Nordby, L.; Dailey, D.V.
High resolution 3-D visualization of horizon interpretation and seismic attributes from large 3-D seismic surveys in deepwater Nigeria has greatly enhanced the exploration team's ability to quickly recognize prospective segments of subregional and prospect specific scale areas. Integrated workstation generated structure, isopach and extracted horizon consistent, interval and windowed attributes are particularly useful in illustrating the complex structural and stratigraphical prospectivity of deepwater Nigeria. Large 3-D seismic volumes acquired over 750 square kilometers can be manipulated within the visualization system with attribute tracking capability that allows for real time data interrogation and interpretation. As in classical seismic stratigraphic studies, pattern recognition is fundamental to effective depositional facies interpretation and reservoir model construction. The 3-D perspective enhances the data interpretation through clear representation of relative scale, spatial distribution and magnitude of attributes. In deepwater Nigeria, many prospective traps rely on an interplay between syndepositional structure and slope turbidite depositional systems. Reservoir systems in many prospects appear to be dominated by unconfined to moderately focused slope feeder channel facies. These units have spatially complex facies architecture with feeder channel axes separated by extensive interchannel areas. Structural culminations generally have a history of initial compressional folding with late extensional collapse and accommodation faulting. The resulting complex trap configurations often have stacked reservoirs over intervals as thick as 1500 meters. Exploration, appraisal and development scenarios in these settings can be optimized by taking full advantage of integrating high resolution 3-D visualization and seismic workstation interpretation.
Scientific Models Help Students Understand the Water Cycle
ERIC Educational Resources Information Center
Forbes, Cory; Vo, Tina; Zangori, Laura; Schwarz, Christina
2015-01-01
The water cycle is a large, complex system that encompasses ideas across the K-12 science curriculum. By the time students leave fifth grade, they should understand "that a system is a group of related parts that make up a whole and can carry out functions its individual parts cannot" and be able to describe both components and processes…
Eppur Si Muove! The 2013 Nobel Prize in Chemistry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Jeremy C.; Roux, Benoit
2013-12-03
The 2013 Nobel Prize in Chemistry has been awarded to Martin Karplus, Michael Levitt, and Arieh Warshel for their work on developing computational methods to study complex chemical systems. Their work has led to critical mechanistic insights into chemical systems both large and small and has enabled progress in a number of different fields, including structural biology.
Information Power Grid Posters
NASA Technical Reports Server (NTRS)
Vaziri, Arsi
2003-01-01
This document is a summary of the accomplishments of the Information Power Grid (IPG). Grids are an emerging technology that provide seamless and uniform access to the geographically dispersed, computational, data storage, networking, instruments, and software resources needed for solving large-scale scientific and engineering problems. The goal of the NASA IPG is to use NASA's remotely located computing and data system resources to build distributed systems that can address problems that are too large or complex for a single site. The accomplishments outlined in this poster presentation are: access to distributed data, IPG heterogeneous computing, integration of large-scale computing node into distributed environment, remote access to high data rate instruments, and exploratory grid environment.
Large space structure damping design
NASA Technical Reports Server (NTRS)
Pilkey, W. D.; Haviland, J. K.
1983-01-01
Several FORTRAN subroutines and programs were developed which compute complex eigenvalues of a damped system using different approaches, and which rescale mode shapes to unit generalized mass and make rigid bodies orthogonal to each other. An analytical proof of a Minimum Constrained Frequency Criterion (MCFC) for a single damper is presented. A method to minimize the effect of control spill-over for large space structures is proposed. The characteristic equation of an undamped system with a generalized control law is derived using reanalysis theory. This equation can be implemented in computer programs for efficient eigenvalue analysis or control quasi synthesis. Methods to control vibrations in large space structures are reviewed and analyzed. The resulting prototype, using an electromagnetic actuator, is described.
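One standard route to the complex eigenvalues mentioned above, possibly among the "different approaches" the subroutines implement (this NumPy sketch is not the original FORTRAN), is a first-order state-space reduction of the damped equation of motion M x'' + C x' + K x = 0:

```python
import numpy as np

def damped_modes(M, C, K):
    """Complex eigenvalues of the damped system M x'' + C x' + K x = 0,
    obtained from the companion matrix of the first-order form
    z' = A z with state z = [x, x']."""
    n = M.shape[0]
    Minv = np.linalg.inv(M)
    A = np.block([[np.zeros((n, n)), np.eye(n)],
                  [-Minv @ K, -Minv @ C]])
    return np.linalg.eigvals(A)

# Single oscillator x'' + 0.2 x' + x = 0: eigenvalues -0.1 +/- i*sqrt(0.99)
lam = damped_modes(np.eye(1), np.array([[0.2]]), np.eye(1))
```

The real parts give modal decay rates and the imaginary parts damped natural frequencies, which is the information a damping design study needs from each candidate configuration.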
NASA Astrophysics Data System (ADS)
Elag, M.; Kumar, P.
2014-12-01
Often, scientists and small research groups collect data that target specific issues and have limited geographic or temporal range. A large number of such collections together constitute a large database that is of immense value to Earth Science studies. Complexities of integrating these data include heterogeneity in dimensions, coordinate systems, scales, variables, providers, users and contexts. They have been defined as long-tail data. Similarly, we use "long-tail models" to characterize a heterogeneous collection of models and/or modules developed for targeted problems by individuals and small groups, which together provide a large valuable collection. Complexities of integrating across these models include differing variable names and units for the same concept, model runs at different time steps and spatial resolution, use of differing naming and reference conventions, etc. Ability to "integrate long-tail models and data" will provide an opportunity for the interoperability and reusability of communities' resources, where not only models can be combined in a workflow, but each model will be able to discover and (re)use data in application specific context of space, time and questions. This capability is essential to represent, understand, predict, and manage heterogeneous and interconnected processes and activities by harnessing the complex, heterogeneous, and extensive set of distributed resources. Because of the staggering production rate of long-tail models and data resulting from the advances in computational, sensing, and information technologies, an important challenge arises: how can geoinformatics bring together these resources seamlessly, given the inherent complexity among model and data resources that span various domains. We will present a semantic-based framework to support integration of "long-tail" models and data.
This framework builds on existing technologies, including: (i) SEAD (Sustainable Environmental Actionable Data), which supports curation and preservation of long-tail data during its life cycle; (ii) BrownDog, which enhances the machine interpretability of large unstructured and uncurated data; and (iii) CSDMS (Community Surface Dynamics Modeling System), which "componentizes" models by providing a plug-and-play environment for model integration.
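A toy illustration of the kind of semantic mediation such a framework must perform: map each model's native variable names and units onto a shared concept registry so outputs become comparable. The variable names, registry entries, and unit factors below are invented for the sketch.

```python
# Hypothetical concept registry: native name -> (shared concept, factor to common unit)
REGISTRY = {
    "precip_mm_hr": ("precipitation_rate", 1.0 / 3600.0),  # mm/hr -> mm/s
    "rainfall_mms": ("precipitation_rate", 1.0),           # already mm/s
}

def to_common(record):
    """Map a model's native variable names onto shared concepts in common units."""
    out = {}
    for name, value in record.items():
        concept, factor = REGISTRY[name]
        out[concept] = value * factor
    return out

# Two "long-tail" model outputs, now directly comparable
a = to_common({"precip_mm_hr": 7.2})
b = to_common({"rainfall_mms": 0.002})
print(a, b)
```

Real frameworks attach formal ontologies and provenance to such mappings; the dictionary here only shows the shape of the problem.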
Improved mine blast algorithm for optimal cost design of water distribution systems
NASA Astrophysics Data System (ADS)
Sadollah, Ali; Guen Yoo, Do; Kim, Joong Hoon
2015-12-01
The design of water distribution systems is a large class of combinatorial, nonlinear optimization problems with complex constraints such as conservation of mass and energy equations. Since feasible solutions are often extremely complex, traditional optimization techniques are insufficient. Recently, metaheuristic algorithms have been applied to this class of problems because they are highly efficient. In this article, a recently developed optimizer called the mine blast algorithm (MBA) is considered. The MBA is improved and coupled with the hydraulic simulator EPANET to find the optimal cost design for water distribution systems. The performance of the improved mine blast algorithm (IMBA) is demonstrated using the well-known Hanoi, New York tunnels and Balerma benchmark networks. Optimization results obtained using IMBA are compared to those using MBA and other optimizers in terms of their minimum construction costs and convergence rates. For the complex Balerma network, IMBA offers the cheapest network design compared to other optimization algorithms.
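The flavor of a mine-blast-style search can be sketched on a toy objective. This is a heavily simplified stand-in for the published MBA/IMBA (no EPANET coupling; parameter names and values are invented), showing only the core idea: each "explosion" scatters candidate shrapnel around the current best shot point, and the blast radius shrinks over iterations.

```python
import random

def mba_minimize(f, x0, radius=2.0, shrapnel=20, iters=60, reduction=0.9, seed=1):
    """Toy mine-blast-style search (a sketch, not the published algorithm):
    scatter shrapnel around the best shot point, keep the best piece as the
    next shot point, and shrink the blast radius each iteration."""
    rng = random.Random(seed)
    best, fbest = list(x0), f(x0)
    for _ in range(iters):
        for _ in range(shrapnel):
            cand = [xi + rng.uniform(-radius, radius) for xi in best]
            fc = f(cand)
            if fc < fbest:
                best, fbest = cand, fc
        radius *= reduction
    return best, fbest

sphere = lambda x: sum(v * v for v in x)   # toy stand-in for the network cost model
x, fx = mba_minimize(sphere, [3.0, -2.0])
print(x, fx)
```

In the actual design problem the objective would be the construction cost returned by a hydraulic simulation, with penalties for pressure-constraint violations, rather than a closed-form function.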
Synthetic mixed-signal computation in living cells
Rubens, Jacob R.; Selvaggio, Gianluca; Lu, Timothy K.
2016-01-01
Living cells implement complex computations on the continuous environmental signals that they encounter. These computations involve both analogue- and digital-like processing of signals to give rise to complex developmental programs, context-dependent behaviours and homeostatic activities. In contrast to natural biological systems, synthetic biological systems have largely focused on either digital or analogue computation separately. Here we integrate analogue and digital computation to implement complex hybrid synthetic genetic programs in living cells. We present a framework for building comparator gene circuits to digitize analogue inputs based on different thresholds. We then demonstrate that comparators can be predictably composed together to build band-pass filters, ternary logic systems and multi-level analogue-to-digital converters. In addition, we interface these analogue-to-digital circuits with other digital gene circuits to enable concentration-dependent logic. We expect that this hybrid computational paradigm will enable new industrial, diagnostic and therapeutic applications with engineered cells. PMID:27255669
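The comparator-bank digitization described above is conceptually the same as a flash analogue-to-digital converter: the output code is simply the number of thresholds the analogue input exceeds. A minimal sketch, with invented threshold values standing in for inducer-concentration thresholds:

```python
def flash_adc(level, thresholds):
    """Multi-level analogue-to-digital conversion with a bank of comparators:
    the digital code is the count of thresholds the input exceeds."""
    return sum(level > t for t in sorted(thresholds))

# Illustrative thresholds (arbitrary units, not measured circuit values)
codes = [flash_adc(x, [0.2, 0.5, 0.8]) for x in (0.1, 0.3, 0.6, 0.9)]
print(codes)  # -> [0, 1, 2, 3]
```

A band-pass filter falls out of the same parts: it is the region where a low-threshold comparator is on and a high-threshold comparator is still off.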
Broeders, Ivo A M J
2014-02-01
Robotic systems were introduced 15 years ago to support complex endoscopic procedures, and the technology is increasingly used in gastro-intestinal surgery. In this article, the literature on experimental and clinical research is reviewed and ergonomic issues are discussed. The literature review was based on a Medline search using a large variety of search terms, including robot(ic), randomized, rectal, oesophageal, and ergonomics; review articles on relevant topics are discussed with preference. There is abundant evidence of the superiority of robotic assistance in performing complex endoscopic surgery tasks in an experimental setting, but so far little high-level evidence on the translation of these merits to clinical practice. Robotic systems may prove helpful in complex gastro-intestinal surgery. Moreover, dedicated computer-based technology integrated in telepresence systems opens the way to the integration of planning, diagnostics, and therapy. The first high-tech add-ons, such as near-infrared technology, are under clinical evaluation. Copyright © 2014 Elsevier Ltd. All rights reserved.
Is Self-organization a Rational Expectation?
NASA Astrophysics Data System (ADS)
Luediger, Heinz
Over decades and under varying names the study of biology-inspired algorithms applied to non-living systems has been the subject of a small and somewhat exotic research community. Only the recent coincidence of a growing inability to master the design, development and operation of increasingly intertwined systems and processes, and an accelerated trend towards a naïve if not romanticizing view of nature in the sciences, has led to the adoption of biology-inspired algorithmic research by a wider range of sciences. Adaptive systems, as we apparently observe in nature, are meanwhile viewed as a promising way out of the complexity trap and, propelled by a long list of ‘self’ catchwords, complexity research has become an influential stream in the science community. This paper presents four provocative theses that cast doubt on the strategic potential of complexity research and the viability of large scale deployment of biology-inspired algorithms in an expectation driven world.
Automatic Fault Characterization via Abnormality-Enhanced Classification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bronevetsky, G; Laguna, I; de Supinski, B R
Enterprise and high-performance computing systems are growing extremely large and complex, employing hundreds to hundreds of thousands of processors and software/hardware stacks built by many people across many organizations. As the growing scale of these machines increases the frequency of faults, system complexity makes these faults difficult to detect and to diagnose. Current system management techniques, which focus primarily on efficient data access and query mechanisms, require system administrators to examine the behavior of various system services manually. Growing system complexity is making this manual process unmanageable: administrators require more effective management tools that can detect faults and help to identify their root causes. System administrators need timely notification when a fault is manifested that includes the type of fault, the time period in which it occurred and the processor on which it originated. Statistical modeling approaches can accurately characterize system behavior. However, the complex effects of system faults make these tools difficult to apply effectively. This paper investigates the application of classification and clustering algorithms to fault detection and characterization. We show experimentally that naively applying these methods achieves poor accuracy. Further, we design novel techniques that combine classification algorithms with information on the abnormality of application behavior to improve detection and characterization accuracy. Our experiments demonstrate that these techniques can detect and characterize faults with 65% accuracy, compared to just 5% accuracy for naive approaches.
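The abnormality-enhanced idea can be caricatured in a few lines: gate a classifier with a deviation-from-normal test, so that only samples that look abnormal are assigned a fault class at all. This one-dimensional sketch (invented centroids, thresholds, and feature values) is far simpler than the paper's actual techniques:

```python
def zscore(x, mean, std):
    """Deviation of a sample from normal behaviour, in standard deviations."""
    return abs(x - mean) / std if std > 0 else 0.0

def detect_fault(sample, normal_mean, normal_std, fault_centroids, z_thresh=3.0):
    """Sketch of abnormality-enhanced classification: classify against known
    fault centroids only when the sample deviates from normal behaviour."""
    if zscore(sample, normal_mean, normal_std) < z_thresh:
        return "normal"
    return min(fault_centroids, key=lambda lbl: abs(sample - fault_centroids[lbl]))

centroids = {"overload": 95.0, "stall": 5.0}      # invented 1-D feature centroids
r1 = detect_fault(52.0, 50.0, 2.0, centroids)     # within 3 sigma of normal
r2 = detect_fault(90.0, 50.0, 2.0, centroids)     # abnormal, nearest centroid
print(r1, r2)
```

The gate is what prevents the classifier from hallucinating a fault class for ordinary behaviour, which is the failure mode of naive classification the paper measures.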
Asthma and Respiratory Allergic Disease
The pathogenesis of non-communicable diseases such as allergy is complex and poorly understood. The causes of chronic allergic diseases including asthma involve to a large extent, immunomodulation of the adaptive and particularly the innate immune systems and are markedly influen...
Scalable Metadata Management for a Large Multi-Source Seismic Data Repository
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaylord, J. M.; Dodge, D. A.; Magana-Zook, S. A.
In this work, we implemented the key metadata management components of a scalable seismic data ingestion framework to address limitations in our existing system, and to position it for anticipated growth in volume and complexity.
TERATOLOGY v2.0 – building a path forward
Unraveling the complex relationships between environmental factors and early life susceptibility in assessing the risk for adverse pregnancy outcomes requires advanced knowledge of biological systems. Large datasets and deep data-mining tools are emerging resources for predictive...
Energy Landscape of All-Atom Protein-Protein Interactions Revealed by Multiscale Enhanced Sampling
Moritsugu, Kei; Terada, Tohru; Kidera, Akinori
2014-01-01
Protein-protein interactions are regulated by a subtle balance of complicated atomic interactions and solvation at the interface. To understand such an elusive phenomenon, it is necessary to thoroughly survey the large configurational space from the stable complex structure to the dissociated states using the all-atom model in explicit solvent and to delineate the energy landscape of protein-protein interactions. In this study, we carried out a multiscale enhanced sampling (MSES) simulation of the formation of a barnase-barstar complex, a protein complex characterized by extraordinarily tight and fast binding, to determine the energy landscape of atomistic protein-protein interactions. The MSES adopts a multicopy and multiscale scheme to enable enhanced sampling of the all-atom model of large proteins including explicit solvent. During the 100-ns MSES simulation of the barnase-barstar system, we observed the association-dissociation processes of the atomistic protein complex in solution several times, which contained not only the native complex structure but also fully non-native configurations. The sampled distributions suggest that a large variety of non-native states went downhill to the stable complex structure, like a fast folding on a funnel-like potential. This funnel landscape is attributed to dominant configurations in the early stage of the association process characterized by near-native orientations, which will accelerate the native inter-molecular interactions. These configurations are guided mostly by the shape complementarity between barnase and barstar, and lead to the fast formation of the final complex structure along the downhill energy landscape. PMID:25340714
He, Shan; Botkin, Jeffrey R; Hurdle, John F
2015-02-01
The clinical research landscape has changed dramatically in recent years in terms of both volume and complexity. This poses new challenges for Institutional Review Boards' (IRBs) review efficiency and quality, especially at large academic medical centers. This article discusses the technical facets of IRB modernization. We analyzed the information technology used by IRBs in large academic institutions across the United States. We found that large academic medical centers have a high electronic IRB adoption rate; however, the capabilities of electronic IRB systems vary greatly. We discuss potential use-cases of a fully exploited electronic IRB system that promise to streamline the clinical research work flow. The key to that approach utilizes a structured and standardized information model for the IRB application. © The Author(s) 2014.
Reliability Standards of Complex Engineering Systems
NASA Astrophysics Data System (ADS)
Galperin, E. M.; Zayko, V. A.; Gorshkalev, P. A.
2017-11-01
Production and manufacturing play an important role in modern society. Industrial production is nowadays characterized by increased and complex communications among its parts, and the problem of preventing accidents at a large industrial enterprise becomes especially relevant. In these circumstances, the reliability of enterprise functioning is of particular importance: potential damage caused by an accident at such an enterprise may lead to substantial material losses and, in some cases, even loss of human lives. In terms of reliability, industrial facilities (objects) are divided into simple and complex. Simple objects are characterized by only two conditions, operable and non-operable; a complex object exists in more than two conditions, and the main characteristic here is the stability of its operation. This paper develops a reliability indicator combining set-theory methodology and a state-space method, both widely used to analyze dynamically developing probability processes. The research also introduces a set of reliability indicators for complex technical systems.
Modular microfluidic systems using reversibly attached PDMS fluid control modules
NASA Astrophysics Data System (ADS)
Skafte-Pedersen, Peder; Sip, Christopher G.; Folch, Albert; Dufva, Martin
2013-05-01
The use of soft lithography-based poly(dimethylsiloxane) (PDMS) valve systems is the dominant approach for high-density microscale fluidic control. Integrated systems enable complex flow control and large-scale integration, but lack modularity. In contrast, modular systems are attractive alternatives to integration because they can be tailored for different applications piecewise and without redesigning every element of the system. We present a method for reversibly coupling hard materials to soft lithography-defined systems through self-aligning O-ring features, thereby enabling easy interfacing of complex valve-based systems with simpler detachable units. Using this scheme, we demonstrate the seamless interfacing of a PDMS-based fluid control module with hard polymer chips. In our system, 32 self-aligning O-ring features protruding from the PDMS fluid control module form chip-to-control module interconnections which are sealed by tightening four screws. The interconnection method is robust and supports complex fluidic operations in the reversibly attached passive chip. In addition, we developed a double-sided molding method for fabricating PDMS devices with integrated through-holes. The versatile system facilitates a wide range of applications due to the modular approach, where application-specific passive chips can be readily attached to the flow control module.
A Model-Based Approach to Engineering Behavior of Complex Aerospace Systems
NASA Technical Reports Server (NTRS)
Ingham, Michel; Day, John; Donahue, Kenneth; Kadesch, Alex; Kennedy, Andrew; Khan, Mohammed Omair; Post, Ethan; Standley, Shaun
2012-01-01
One of the most challenging yet poorly defined aspects of engineering a complex aerospace system is behavior engineering, including definition, specification, design, implementation, and verification and validation of the system's behaviors. This is especially true for behaviors of highly autonomous and intelligent systems. Behavior engineering is more of an art than a science. As a process it is generally ad-hoc, poorly specified, and inconsistently applied from one project to the next. It uses largely informal representations, and results in system behavior being documented in a wide variety of disparate documents. To address this problem, JPL has undertaken a pilot project to apply its institutional capabilities in Model-Based Systems Engineering to the challenge of specifying complex spacecraft system behavior. This paper describes the results of the work in progress on this project. In particular, we discuss our approach to modeling spacecraft behavior including 1) requirements and design flowdown from system-level to subsystem-level, 2) patterns for behavior decomposition, 3) allocation of behaviors to physical elements in the system, and 4) patterns for capturing V&V activities associated with behavioral requirements. We provide examples of interesting behavior specification patterns, and discuss findings from the pilot project.
NASA Astrophysics Data System (ADS)
Hassan, Rania A.
In the design of complex large-scale spacecraft systems that involve a large number of components and subsystems, many specialized state-of-the-art design tools are employed to optimize the performance of various subsystems. However, there is no structured system-level concept-architecting process. Currently, spacecraft design is heavily based on the heritage of the industry. Old spacecraft designs are modified to adapt to new mission requirements, and feasible solutions---rather than optimal ones---are often all that is achieved. During the conceptual phase of the design, the choices available to designers are predominantly discrete variables describing major subsystems' technology options and redundancy levels. The complexity of spacecraft configurations makes the number of the system design variables that need to be traded off in an optimization process prohibitive when manual techniques are used. Such a discrete problem is well suited for solution with a Genetic Algorithm, which is a global search technique that performs optimization-like tasks. This research presents a systems engineering framework that places design requirements at the core of the design activities and transforms the design paradigm for spacecraft systems to a top-down approach rather than the current bottom-up approach. To facilitate decision-making in the early phases of the design process, the population-based search nature of the Genetic Algorithm is exploited to provide computationally inexpensive---compared to the state-of-the-practice---tools for both multi-objective design optimization and design optimization under uncertainty. In terms of computational cost, those tools are nearly on the same order of magnitude as that of standard single-objective deterministic Genetic Algorithm. 
The use of a multi-objective design approach provides system designers with a clear tradeoff optimization surface that allows them to understand the effect of their decisions on all the design objectives under consideration simultaneously. Incorporating uncertainties avoids large safety margins and unnecessary high redundancy levels. The focus on low computational cost for the optimization tools stems from the objective that improving the design of complex systems should not be achieved at the expense of a costly design methodology.
Zhang, Kaka; Yeung, Margaret Ching-Lam; Leung, Sammual Yu-Lut; Yam, Vivian Wing-Wah
2017-01-01
An important feature of biological systems to achieve complexity and precision is the involvement of multiple components where each component plays its own role and collaborates with other components. Mimicking this, we report living supramolecular polymerization achieved by collaborative assembly of two structurally dissimilar components, that is, platinum(II) complexes and poly(ethylene glycol)-b-poly(acrylic acid) (PEG-b-PAA). The PAA blocks neutralize the charges of the platinum(II) complexes, with the noncovalent metal–metal and π–π interactions directing the longitudinal growth of the platinum(II) complexes into 1D crystalline nanostructures, and the PEG blocks inhibiting the transverse growth of the platinum(II) complexes and providing the whole system with excellent solubility. The ends of the 1D crystalline nanostructures have been found to be active during the assembly and remain active after the assembly. One-dimensional segmented nanostructures with heterojunctions have been produced by sequential growth of two types of platinum(II) complexes. The PAA blocks act as adapters at the heterojunctions for lattice matching between chemically and crystallographically different platinum(II) complexes, achieving heterojunctions with a lattice mismatch as large as 21%. PMID:29078381
Experiments with arbitrary networks in time-multiplexed delay systems
NASA Astrophysics Data System (ADS)
Hart, Joseph D.; Schmadel, Don C.; Murphy, Thomas E.; Roy, Rajarshi
2017-12-01
We report a new experimental approach using an optoelectronic feedback loop to investigate the dynamics of oscillators coupled on large complex networks with arbitrary topology. Our implementation is based on a single optoelectronic feedback loop with time delays. We use the space-time interpretation of systems with time delay to create large networks of coupled maps. Others have performed similar experiments using high-pass filters to implement the coupling; this restricts the network topology to the coupling of only a few nearest neighbors. In our experiment, the time delays and coupling are implemented on a field-programmable gate array, allowing the creation of networks with arbitrary coupling topology. This system has many advantages: the network nodes are truly identical, the network is easily reconfigurable, and the network dynamics occur at high speeds. We use this system to study cluster synchronization and chimera states in both small and large networks of different topologies.
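The space-time interpretation the authors exploit can be sketched in software: one delay line of length N is read as N "virtual" network nodes, each updated by a local map plus coupling over an arbitrary adjacency matrix. The toy below (logistic maps on a 4-node ring, with invented parameters) mimics that structure, not the optoelectronic hardware:

```python
def step(state, adj, eps=0.3, r=3.9):
    """One network update: each virtual node applies a local logistic map and
    mixes in the mapped states of its adjacency-matrix neighbours."""
    logistic = lambda v: r * v * (1.0 - v)
    n = len(state)
    new = []
    for i in range(n):
        neigh = [logistic(state[j]) for j in range(n) if adj[i][j]]
        local = logistic(state[i])
        k = len(neigh)
        new.append((1 - eps) * local + (eps / k) * sum(neigh) if k else local)
    return new

# 4-node ring: each node coupled to its two nearest neighbours
ring = [[1 if abs(i - j) % 4 in (1, 3) else 0 for j in range(4)] for i in range(4)]
s = [0.1, 0.2, 0.3, 0.4]
for _ in range(5):
    s = step(s, ring)
print(s)
```

In the experiment the adjacency matrix lives on the FPGA, so swapping `ring` for any other 0/1 matrix corresponds to reconfiguring the network topology without touching the nodes.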
Durham extremely large telescope adaptive optics simulation platform.
Basden, Alastair; Butterley, Timothy; Myers, Richard; Wilson, Richard
2007-03-01
Adaptive optics systems are essential on all large telescopes for which image quality is important. These are complex systems with many design parameters requiring optimization before good performance can be achieved. The simulation of adaptive optics systems is therefore necessary to characterize the expected performance. We describe an adaptive optics simulation platform, developed at Durham University, which can be used to simulate adaptive optics systems on the largest proposed future extremely large telescopes as well as on current systems. This platform is modular, object oriented, and has the benefit of hardware application acceleration that can be used to improve the simulation performance, essential for ensuring that the run time of a given simulation is acceptable. The simulation platform described here can be highly parallelized using parallelization techniques suited for adaptive optics simulation, yet still offers the user complete control while the simulation is running. The results from the simulation of a ground layer adaptive optics system are provided as an example to demonstrate the flexibility of this simulation platform.
Biocellion: accelerating computer simulation of multicellular biological system models
Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya
2014-01-01
Motivation: Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. Results: We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Availability and implementation: Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. Contact: seunghwa.kang@pnnl.gov PMID:25064572
Effects of Voice Harmonic Complexity on ERP Responses to Pitch-Shifted Auditory Feedback
Behroozmand, Roozbeh; Korzyukov, Oleg; Larson, Charles R.
2011-01-01
Objective The present study investigated the neural mechanisms of voice pitch control for different levels of harmonic complexity in the auditory feedback. Methods Event-related potentials (ERPs) were recorded in response to +200 cents pitch perturbations in the auditory feedback of self-produced natural human vocalizations, complex and pure tone stimuli during active vocalization and passive listening conditions. Results During active vocal production, ERP amplitudes were largest in response to pitch shifts in the natural voice, moderately large for non-voice complex stimuli and smallest for the pure tones. However, during passive listening, neural responses were equally large for pitch shifts in voice and non-voice complex stimuli but still larger than that for pure tones. Conclusions These findings suggest that pitch change detection is facilitated for spectrally rich sounds such as natural human voice and non-voice complex stimuli compared with pure tones. Vocalization-induced increase in neural responses for voice feedback suggests that sensory processing of naturally-produced complex sounds such as human voice is enhanced by means of motor-driven mechanisms (e.g. efference copies) during vocal production. Significance This enhancement may enable the audio-vocal system to more effectively detect and correct for vocal errors in the feedback of natural human vocalizations to maintain an intended vocal output for speaking. PMID:21719346
Information Flows? A Critique of Transfer Entropies
NASA Astrophysics Data System (ADS)
James, Ryan G.; Barnett, Nix; Crutchfield, James P.
2016-06-01
A central task in analyzing complex dynamics is to determine the loci of information storage and the communication topology of information flows within a system. Over the last decade and a half, diagnostics for the latter have come to be dominated by the transfer entropy. Via straightforward examples, we show that it and a derivative quantity, the causation entropy, do not, in fact, quantify the flow of information. At one and the same time they can overestimate flow or underestimate influence. We isolate why this is the case and propose several avenues to alternate measures for information flow. We also address an auxiliary consequence: The proliferation of networks as a now-common theoretical model for large-scale systems, in concert with the use of transferlike entropies, has shoehorned dyadic relationships into our structural interpretation of the organization and behavior of complex systems. This interpretation thus fails to include the effects of polyadic dependencies. The net result is that much of the sophisticated organization of complex systems may go undetected.
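For readers unfamiliar with the quantity under critique: a plug-in transfer-entropy estimator for binary series is a few lines of counting, which is part of why the measure became so popular. The bias and interpretation pitfalls the authors analyze are exactly what such a naive estimator hides. A minimal sketch with synthetic data:

```python
import random
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Plug-in estimate of T(X -> Y) = I(Y_{t+1}; X_t | Y_t) for symbol series.
    A minimal sketch; serious estimators need far more care about bias."""
    trip = Counter(zip(y[1:], x[:-1], y[:-1]))
    pair_yy = Counter(zip(y[1:], y[:-1]))
    pair_xy = Counter(zip(x[:-1], y[:-1]))
    single_y = Counter(y[:-1])
    n = len(y) - 1
    te = 0.0
    for (y1, x0, y0), c in trip.items():
        te += (c / n) * log2(c * single_y[y0] / (pair_xy[(x0, y0)] * pair_yy[(y1, y0)]))
    return te

# X drives Y with one step of lag, so T(X->Y) is large while T(Y->X) is near zero
rng = random.Random(0)
x = [rng.randint(0, 1) for _ in range(2000)]
y = [0] + x[:-1]
te_xy, te_yx = transfer_entropy(x, y), transfer_entropy(y, x)
print(te_xy, te_yx)
```

Even this clean case only certifies a dyadic, pairwise relationship; the paper's point is that such pairwise transfer values need not correspond to information flow once polyadic dependencies are present.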
Increasingly automated procedure acquisition in dynamic systems
NASA Technical Reports Server (NTRS)
Mathe, Nathalie; Kedar, Smadar
1992-01-01
Procedures are widely used by operators for controlling complex dynamic systems. Currently, most development of such procedures is done manually, consuming a large amount of paper, time, and manpower in the process. While automated knowledge acquisition is an active field of research, not much attention has been paid to the problem of computer-assisted acquisition and refinement of complex procedures for dynamic systems. This paper describes the Procedure Acquisition for Reactive Control Assistant (PARC), which is designed to assist users in more systematically and automatically encoding and refining complex procedures. PARC is able to elicit knowledge interactively from the user during operation of the dynamic system. We categorize procedure refinement into two stages: diagnosis (diagnose the failure and choose a repair) and repair (plan and perform the repair). The basic approach taken in PARC is to assist the user in all steps of this process by providing increased levels of assistance with layered tools. We illustrate the operation of PARC in refining procedures for the control of a robot arm.
NASA Astrophysics Data System (ADS)
Deufel, Christopher L.; Furutani, Keith M.
2014-02-01
As dose optimization for high dose rate brachytherapy becomes more complex, it becomes increasingly important to have a means of verifying that optimization results are reasonable. A method is presented for using a simple optimization as quality assurance for the more complex optimization algorithms typically found in commercial brachytherapy treatment planning systems. Quality assurance tests may be performed during commissioning, at regular intervals, and/or on a patient specific basis. A simple optimization method is provided that optimizes conformal target coverage using an exact, variance-based, algebraic approach. Metrics such as dose volume histogram, conformality index, and total reference air kerma agree closely between simple and complex optimizations for breast, cervix, prostate, and planar applicators. The simple optimization is shown to be a sensitive measure for identifying failures in a commercial treatment planning system that are possibly due to operator error or weaknesses in planning system optimization algorithms. Results from the simple optimization are surprisingly similar to the results from a more complex, commercial optimization for several clinical applications. This suggests that there are only modest gains to be made from making brachytherapy optimization more complex. The improvements expected from sophisticated linear optimizations, such as PARETO methods, will largely be in making systems more user friendly and efficient, rather than in finding dramatically better source strength distributions.
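A least-squares toy version of such a simple, exact optimization: with an inverse-square dose kernel A from dwell positions to target points and a uniform prescription d, the dwell weights follow algebraically from the normal equations (A^T A) t = A^T d, with no iterative optimizer involved. The geometry and prescription below are invented for illustration:

```python
def normal_eq_2x2(A, d):
    """Least-squares dwell weights for two dwell positions: solve (A^T A) t = A^T d
    exactly via the 2x2 determinant (an algebra-only sketch, not a clinical tool)."""
    ata = [[sum(r[i] * r[j] for r in A) for j in range(2)] for i in range(2)]
    atd = [sum(r[i] * di for r, di in zip(A, d)) for i in range(2)]
    det = ata[0][0] * ata[1][1] - ata[0][1] * ata[1][0]
    t0 = (atd[0] * ata[1][1] - ata[0][1] * atd[1]) / det
    t1 = (ata[0][0] * atd[1] - atd[0] * ata[1][0]) / det
    return t0, t1

# Invented inverse-square kernels: 2 dwell positions, 3 target points
A = [[1.0, 0.2], [0.5, 0.5], [0.2, 1.0]]
prescription = [1.0, 1.0, 1.0]
t = normal_eq_2x2(A, prescription)
doses = [row[0] * t[0] + row[1] * t[1] for row in A]
print(t, doses)
```

Because the geometry is symmetric, the two dwell weights come out equal; comparing dose metrics from such a transparent solution against a commercial optimizer's output is the quality-assurance idea the abstract describes.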
The natural science underlying big history.
Chaisson, Eric J
2014-01-01
Nature's many varied complex systems, including galaxies, stars, planets, life, and society, are islands of order within the increasingly disordered Universe. All organized systems are subject to physical, biological, or cultural evolution, which together comprise the grander interdisciplinary subject of cosmic evolution. A wealth of observational data supports the hypothesis that increasingly complex systems evolve unceasingly, uncaringly, and unpredictably from big bang to humankind. These are global history greatly extended, big history with a scientific basis, and natural history broadly portrayed across ∼14 billion years of time. Human beings and our cultural inventions are not special, unique, or apart from Nature; rather, we are an integral part of a universal evolutionary process connecting all such complex systems throughout space and time. Such evolution writ large has significant potential to unify the natural sciences into a holistic understanding of who we are and whence we came. No new science (beyond frontier, nonequilibrium thermodynamics) is needed to describe cosmic evolution's major milestones at a deep and empirical level. Quantitative models and experimental tests imply that a remarkable simplicity underlies the emergence and growth of complexity for a wide spectrum of known and diverse systems. Energy is a principal facilitator of the rising complexity of ordered systems within the expanding Universe; energy flows are as central to life and society as they are to stars and galaxies. In particular, energy rate density, in contrast with information content or entropy production, is an objective metric suitable to gauge relative degrees of complexity among a hierarchy of widely assorted systems observed throughout the material Universe. Operationally, those systems capable of utilizing optimum amounts of energy tend to survive, and those that cannot are nonrandomly eliminated.
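The energy rate density metric is simple enough to compute on the back of an envelope. With rough textbook figures for solar luminosity and mass and for human metabolic power and body mass (approximate values, used here only to illustrate the metric), it reproduces the familiar ordering: a star near 2 erg s^-1 g^-1, a human some four orders of magnitude higher.

```python
def energy_rate_density(power_watts, mass_kg):
    """Energy flow per unit time per unit mass, in erg s^-1 g^-1."""
    return (power_watts * 1e7) / (mass_kg * 1e3)   # W -> erg/s, kg -> g

sun = energy_rate_density(3.8e26, 2.0e30)   # ~2 erg/s/g
human = energy_rate_density(130.0, 70.0)    # ~2e4 erg/s/g
print(sun, human)
```

The counterintuitive result that a living body out-ranks a star on this metric is exactly the point of normalizing energy flow by mass.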
Systems Proteomics for Translational Network Medicine
Arrell, D. Kent; Terzic, Andre
2012-01-01
Universal principles underlying network science, and their ever-increasing applications in biomedicine, underscore the unprecedented capacity of systems biology based strategies to synthesize and resolve massive high throughput generated datasets. Enabling previously unattainable comprehension of biological complexity, systems approaches have accelerated progress in elucidating disease prediction, progression, and outcome. Applied to the spectrum of states spanning health and disease, network proteomics establishes a collation, integration, and prioritization algorithm to guide mapping and decoding of proteome landscapes from large-scale raw data. Providing unparalleled deconvolution of protein lists into global interactomes, integrative systems proteomics enables objective, multi-modal interpretation at molecular, pathway, and network scales, merging individual molecular components, their plurality of interactions, and functional contributions for systems comprehension. As such, network systems approaches are increasingly exploited for objective interpretation of cardiovascular proteomics studies. Here, we highlight network systems proteomic analysis pipelines for integration and biological interpretation through protein cartography, ontological categorization, pathway and functional enrichment and complex network analysis. PMID:22896016
Configuration Management at NASA
NASA Technical Reports Server (NTRS)
Doreswamy, Rajiv
2013-01-01
NASA programs are characterized by complexity, harsh environments and the fact that we usually have one chance to get it right. Programs last decades and need to accept new hardware and technology as it is developed. We have multiple suppliers and international partners. Our challenges are many, our costs are high and our failures are highly visible. CM systems need to be scalable, adaptable to new technology and span the life cycle of the program (30+ years). Multiple systems, contractors and countries added major levels of complexity to the ISS program and its CM/DM and requirements management systems.
- CM systems need to be designed for long design life
- Space Station design started in 1984
- Assembly complete in 2012
- Systems were developed on a task basis without an overall system perspective
- Technology moves faster than a large project office; try to make sure you have a system that can adapt
Group Decision Support System to Aid the Process of Design and Maintenance of Large Scale Systems
1992-03-23
from a fuzzy set of user requirements. The overall objective of the project is to develop a system combining the characteristics of a compact computer... AHP) for hierarchical prioritization. 4) Individual Evaluation and Selection of Alternatives - Allows the decision maker to individually evaluate...its concept of outranking relations. The AHP method supports complex decision problems by successively decomposing and synthesizing various elements
ERIC Educational Resources Information Center
Zeyer, Albert
2018-01-01
The present study is based on a large cross-cultural study, which showed that a systemizing cognition type has a high impact on motivation to learn science, while the impact of gender is only indirect through systemizing. The present study uses the same structural equation model as in the cross-cultural study and separately tests it for physics,…
MFIRE-2: A Multi Agent System for Flow-Based Intrusion Detection Using Stochastic Search
2012-03-01
attacks that are distributed in nature, but may not protect individual systems effectively without incurring large bandwidth penalties while collecting...system-level information to help prepare for more significant attacks. The type of information potentially revealed by footprinting includes account...key areas where MAS may be appropriate: • The environment is open, highly dynamic, uncertain, or complex • Agents are a natural metaphor—Many
2014-10-01
considering new approaches. According to Air Force Space Command, U.S. space systems face intentional and unintentional threats, which have increased...life cycle costs • Demand for more satellites may stimulate new entrants and competition to lower acquisition costs. • Smaller, less complex...Fiscal constraints and growing threats to space systems have led DOD to consider alternatives for acquiring space-based capabilities, including
Enhanced cellular transport and drug targeting using dendritic nanostructures
NASA Astrophysics Data System (ADS)
Kannan, R. M.; Kolhe, Parag; Kannan, Sujatha; Lieh-Lai, Mary
2003-03-01
Dendrimers and hyperbranched polymers possess highly branched architectures, with a large number of controllable, tailorable peripheral functionalities. Since the surface chemistry of these materials can be modified with relative ease, these materials have tremendous potential in targeted drug delivery. The large density of end groups can also be tailored to create enhanced affinity to targeted cells, and can also encapsulate drugs and deliver them in a controlled manner. We are developing tailor-modified dendritic systems for drug delivery. Synthesis, drug/ligand conjugation, in vitro cellular and in vivo drug delivery, and the targeting efficiency to the cell are being studied systematically using a wide variety of experimental tools. Results on PAMAM dendrimers and polyol hyperbranched polymers suggest that: (1) These materials complex/encapsulate a large number of drug molecules and release them at tailorable rates; (2) The drug-dendrimer complex is transported very rapidly through an A549 lung epithelial cancer cell line, compared to free drug, perhaps by endocytosis. The ability of the drug-dendrimer-ligand complexes to target specific asthma and cancer cells is currently being explored using in vitro and in vivo animal models.
Exploring model based engineering for large telescopes: getting started with descriptive models
NASA Astrophysics Data System (ADS)
Karban, R.; Zamparelli, M.; Bauvir, B.; Koehler, B.; Noethe, L.; Balestra, A.
2008-07-01
Large telescopes pose a continuous challenge to systems engineering due to their complexity in terms of requirements, operational modes, long duty lifetime, interfaces and number of components. A multitude of decisions must be taken throughout the life cycle of a new system, and a prime means of coping with complexity and uncertainty is using models as one decision aid. The potential of descriptive models based on the OMG Systems Modeling Language (OMG SysML™) is examined in different areas: building a comprehensive model serves as the basis for subsequent activities of requirements solicitation and review, analysis, and design alike. Furthermore, a model is an effective communication instrument against the misinterpretation pitfalls which are typical of cross-disciplinary activities when using natural language only or free-format diagrams. Modeling the essential characteristics of the system, such as interfaces, system structure and behavior, addresses important system-level issues. Also shown is how to use a model as an analysis tool to describe the relationships among disturbances, opto-mechanical effects and control decisions, and to refine the control use cases. Considerations on the scalability of the model structure and organization, its impact on the development process, the relation to document-centric structures, style and usage guidelines, and the required tool chain are presented.
Large-System Transformation in Health Care: A Realist Review
Best, Allan; Greenhalgh, Trisha; Lewis, Steven; Saul, Jessie E; Carroll, Simon; Bitz, Jennifer
2012-01-01
Context An evidence base that addresses issues of complexity and context is urgently needed for large-system transformation (LST) and health care reform. Fundamental conceptual and methodological challenges also must be addressed. The Saskatchewan Ministry of Health in Canada requested a six-month synthesis project to guide four major policy development and strategy initiatives focused on patient- and family-centered care, primary health care renewal, quality improvement, and surgical wait lists. The aims of the review were to analyze examples of successful and less successful transformation initiatives, to synthesize knowledge of the underlying mechanisms, to clarify the role of government, and to outline options for evaluation. Methods We used realist review, whose working assumption is that a particular intervention triggers particular mechanisms of change. Mechanisms may be more or less effective in producing their intended outcomes, depending on their interaction with various contextual factors. We explain the variations in outcome as the interplay between context and mechanisms. We nested this analytic approach in a macro framing of complex adaptive systems (CAS). Findings Our rapid realist review identified five “simple rules” of LST that were likely to enhance the success of the target initiatives: (1) blend designated leadership with distributed leadership; (2) establish feedback loops; (3) attend to history; (4) engage physicians; and (5) include patients and families. These principles play out differently in different contexts affecting human behavior (and thereby contributing to change) through a wide range of different mechanisms. Conclusions Realist review methodology can be applied in combination with a complex system lens on published literature to produce a knowledge synthesis that informs a prospective change effort in large-system transformation. 
A collaborative process engaging both research producers and research users contributes to local applications of universal principles and mid-range theories, as well as to a more robust knowledge base for applied research. We conclude with suggestions for the future development of synthesis and evaluation methods. PMID:22985277
Vernon, Ian; Liu, Junli; Goldstein, Michael; Rowe, James; Topping, Jen; Lindsey, Keith
2018-01-02
Many mathematical models have now been employed across every area of systems biology. These models increasingly involve large numbers of unknown parameters, have complex structure which can result in substantial evaluation time relative to the needs of the analysis, and need to be compared to observed data of various forms. The correct analysis of such models usually requires a global parameter search, over a high dimensional parameter space, that incorporates and respects the most important sources of uncertainty. This can be an extremely difficult task, but it is essential for any meaningful inference or prediction to be made about any biological system. It hence represents a fundamental challenge for the whole of systems biology. Bayesian statistical methodology for the uncertainty analysis of complex models is introduced, which is designed to address the high dimensional global parameter search problem. Bayesian emulators that mimic the systems biology model but which are extremely fast to evaluate are embedded within an iterative history match: an efficient method to search high dimensional spaces within a more formal statistical setting, while incorporating major sources of uncertainty. The approach is demonstrated via application to a model of hormonal crosstalk in Arabidopsis root development, which has 32 rate parameters, for which we identify the sets of rate parameter values that lead to acceptable matches between model output and observed trend data. The multiple insights into the model's structure that this analysis provides are discussed. The methodology is applied to a second related model, and the biological consequences of the resulting comparison, including the evaluation of gene functions, are described. Bayesian uncertainty analysis for complex models using both emulators and history matching is shown to be a powerful technique that can greatly aid the study of a large class of systems biology models.
It both provides insight into model behaviour and identifies the sets of rate parameters of interest.
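The core decision rule of the history-matching approach described above is the implausibility measure, which discards parameter values whose emulated output sits too far from the observed data once all recognized uncertainties are accounted for. The sketch below illustrates this on a toy one-parameter "model"; the emulator form, uncertainty values, and the conventional 3-sigma cutoff are illustrative assumptions, not values from the study.

```python
import numpy as np

def implausibility(em_mean, em_var, z_obs, obs_var, disc_var):
    # I(x): standardized distance between the emulator prediction and the
    # observation, scaled by emulator variance, observation-error variance,
    # and model-discrepancy variance.
    return np.abs(em_mean - z_obs) / np.sqrt(em_var + obs_var + disc_var)

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, size=10_000)      # candidate rate-parameter values
em_mean = 2.0 * x                            # hypothetical emulator mean
em_var = np.full_like(x, 0.1)                # hypothetical emulator variance
z_obs, obs_var, disc_var = 8.0, 0.2, 0.2     # observed trend and its uncertainties

I = implausibility(em_mean, em_var, z_obs, obs_var, disc_var)
keep = x[I < 3.0]                            # non-implausible region (3-sigma cutoff)
print(keep.min(), keep.max())                # brackets the best-fit value x = 4
```

In the iterative scheme, the surviving region would be re-sampled, the emulator refit there, and the cut applied again (a "wave" of history matching), shrinking the space around acceptable matches.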
The q-dependent detrended cross-correlation analysis of stock market
NASA Astrophysics Data System (ADS)
Zhao, Longfeng; Li, Wei; Fenu, Andrea; Podobnik, Boris; Wang, Yougui; Stanley, H. Eugene
2018-02-01
Properties of the q-dependent cross-correlation matrices of the stock market have been analyzed by using random matrix theory and complex networks. The correlation structures of the fluctuations at different magnitudes have unique properties. The cross-correlations among small fluctuations are much stronger than those among large fluctuations. The large and small fluctuations are dominated by different groups of stocks. We use complex network representation to study these q-dependent matrices and discover some new identities. By utilizing those q-dependent correlation-based networks, we are able to construct portfolios of the more independent stocks which consistently perform better. The optimal multifractal order for portfolio optimization is around q = 2 under the mean-variance portfolio framework, and q ∈ [2, 6] under the expected shortfall criterion. These results have deepened our understanding of the collective behavior of the complex financial system.
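The q-dependent cross-correlation coefficient underlying this analysis can be sketched as follows: integrate each series, detrend it window by window, and weight the windowed covariances by a power of q so that small or large fluctuations dominate the estimate. The estimator details below (linear detrending, sign-preserving q/2 powers) follow the general qDCCA construction and are an illustrative assumption, not necessarily the authors' exact formulation.

```python
import numpy as np

def rho_q(x, y, s, q=2.0, order=1):
    # q-dependent detrended cross-correlation coefficient rho(q, s):
    # profiles -> windowed polynomial detrending -> q-weighted fluctuation
    # functions -> normalized cross-correlation.
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())
    n, t = len(X) // s, np.arange(s)
    cov, var_x, var_y = [], [], []
    for v in range(n):
        seg_x, seg_y = X[v*s:(v+1)*s], Y[v*s:(v+1)*s]
        rx = seg_x - np.polyval(np.polyfit(t, seg_x, order), t)
        ry = seg_y - np.polyval(np.polyfit(t, seg_y, order), t)
        cov.append(np.mean(rx * ry))
        var_x.append(np.mean(rx**2))
        var_y.append(np.mean(ry**2))
    cov, var_x, var_y = map(np.asarray, (cov, var_x, var_y))
    # The q/2 power re-weights windows: small q emphasizes small
    # fluctuations, large q emphasizes large ones.
    Fxy = np.mean(np.sign(cov) * np.abs(cov)**(q / 2))
    Fxx, Fyy = np.mean(var_x**(q / 2)), np.mean(var_y**(q / 2))
    return Fxy / np.sqrt(Fxx * Fyy)

rng = np.random.default_rng(1)
common = rng.normal(size=4000)
a = common + 0.5 * rng.normal(size=4000)   # two series sharing a component
b = common + 0.5 * rng.normal(size=4000)
r = rho_q(a, b, s=100, q=2.0)
print(r)                                   # strongly positive
```

Sweeping q (e.g. from 1 to 6) on real return series would then reveal the magnitude-dependent correlation structure the abstract describes.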
Kleiter, Ingo; Luerding, Ralf; Diendorfer, Gerhard; Rek, Helga; Bogdahn, Ulrich; Schalke, Berthold
2007-01-01
The case of a 23‐year‐old mountaineer who was hit by a lightning strike to the occiput, causing a large central visual field defect and bilateral tympanic membrane ruptures, is described. Owing to extreme agitation, the patient was sent into a drug‐induced coma for 3 days. After extubation, she experienced simple and complex visual hallucinations for several days, but otherwise largely recovered. Neuropsychological tests revealed deficits in fast visual detection tasks and non‐verbal learning, and indicated a right temporal lobe dysfunction, consistent with a right temporal focus on electroencephalography. Four months after the accident, she developed a psychological reaction consisting of nightmares with reappearance of the complex visual hallucinations and a depressive syndrome. Using the European Cooperation for Lightning Detection network, a meteorological system for lightning surveillance, the exact geographical location and nature of the lightning flash were retrospectively retraced. PMID:17369595
Kleiter, Ingo; Luerding, Ralf; Diendorfer, Gerhard; Rek, Helga; Bogdahn, Ulrich; Schalke, Berthold
2009-01-01
The case of a 23-year-old mountaineer who was hit by a lightning strike to the occiput causing a large central visual field defect and bilateral tympanic membrane ruptures is described. Owing to extreme agitation, the patient was sent into a drug-induced coma for 3 days. After extubation, she experienced simple and complex visual hallucinations for several days, but otherwise largely recovered. Neuropsychological tests revealed deficits in fast visual detection tasks and non-verbal learning and indicated a right temporal lobe dysfunction, consistent with a right temporal focus on electroencephalography. At 4 months after the accident, she developed a psychological reaction consisting of nightmares, with reappearance of the complex visual hallucinations and a depressive syndrome. Using the European Cooperation for Lightning Detection network, a meteorological system for lightning surveillance, the exact geographical location and nature of the lightning strike were retrospectively retraced. PMID:21734915
Rotor dynamic considerations for large wind power generator systems
NASA Technical Reports Server (NTRS)
Ormiston, R. A.
1973-01-01
Successful large, reliable, low maintenance wind turbines must be designed with full consideration for minimizing dynamic response to aerodynamic, inertial, and gravitational forces. Much of existing helicopter rotor technology is applicable to this problem. Compared with helicopter rotors, large wind turbines are likely to be relatively less flexible with higher dimensionless natural frequencies. For very large wind turbines, low power output per unit weight and stresses due to gravitational forces are limiting factors. The need to reduce rotor complexity to a minimum favors the use of cantilevered (hingeless) rotor configurations where stresses are relieved by elastic deformations.
On generalized Volterra systems
NASA Astrophysics Data System (ADS)
Charalambides, S. A.; Damianou, P. A.; Evripidou, C. A.
2015-01-01
We construct a large family of evidently integrable Hamiltonian systems which are generalizations of the KM system. The algorithm uses the root system of a complex simple Lie algebra. The Hamiltonian vector field is homogeneous cubic but in a number of cases a simple change of variables transforms such a system to a quadratic Lotka-Volterra system. We present in detail all such systems in the cases of A3, A4 and we also give some examples from higher dimensions. We classify all possible Lotka-Volterra systems that arise via this algorithm in the An case.
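A minimal numerical illustration of the system family described above is the finite Kac-van Moerbeke (Volterra) lattice with open ends, a standard example whose vector field is quadratic Lotka-Volterra and which conserves the sum of the variables, a hallmark of its integrability. The initial data below are arbitrary; this sketches the system type, not the paper's specific A3/A4 constructions.

```python
import numpy as np
from scipy.integrate import solve_ivp

def km_rhs(t, x):
    # Kac-van Moerbeke (Volterra) lattice with open ends:
    # x_i' = x_i * (x_{i+1} - x_{i-1}),  with x_0 = x_{n+1} = 0.
    xp = np.concatenate(([0.0], x, [0.0]))
    return x * (xp[2:] - xp[:-2])

x0 = np.array([1.0, 0.5, 0.8, 0.3])          # arbitrary positive initial data
sol = solve_ivp(km_rhs, (0.0, 5.0), x0, rtol=1e-10, atol=1e-12)

# H = sum_i x_i is a first integral: the flow telescopes, so the total is
# conserved along trajectories (up to integration tolerance).
print(sol.y[:, -1].sum())                     # ~ x0.sum() = 2.6
```

Checking such conserved quantities numerically is a quick sanity test when exploring which changes of variables reduce the cubic homogeneous systems to quadratic Lotka-Volterra form.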
ERIC Educational Resources Information Center
Frees, Edward W.; Kim, Jee-Seon
2006-01-01
Multilevel models are proven tools in social research for modeling complex, hierarchical systems. In multilevel modeling, statistical inference is based largely on quantification of random variables. This paper distinguishes among three types of random variables in multilevel modeling--model disturbances, random coefficients, and future response…
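The distinction drawn above between model disturbances and random coefficients can be made concrete with a simulated two-level random-intercept model and a moment-based variance decomposition; the group structure, variances, and estimator below are hypothetical illustrations, not the paper's examples.

```python
import numpy as np

# Two-level model (e.g. students nested in schools):
#   y_ij = beta0 + u_j + e_ij,
# where u_j is a group-level random coefficient and e_ij is the
# individual-level model disturbance.
rng = np.random.default_rng(42)
n_groups, n_per = 200, 30
beta0, sd_u, sd_e = 50.0, 3.0, 6.0
u = rng.normal(0.0, sd_u, n_groups)
y = beta0 + u[:, None] + rng.normal(0.0, sd_e, (n_groups, n_per))

# One-way ANOVA moment estimates of the two variance components.
group_means = y.mean(axis=1)
msb = n_per * group_means.var(ddof=1)                      # between-group mean square
msw = ((y - group_means[:, None])**2).sum() / (n_groups * (n_per - 1))
var_u_hat = (msb - msw) / n_per
icc = var_u_hat / (var_u_hat + msw)                        # intraclass correlation
print(round(icc, 3))   # true ICC = 9 / (9 + 36) = 0.2
```

The intraclass correlation quantifies how much of the total variation is attributable to the group-level random variable, which is exactly the kind of quantification of random variables on which multilevel inference rests.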
Avoiding Decline: Fostering Resilience and Sustainability in Midsize Cities
Eighty-five percent of United States citizens live in urban areas. However, research surrounding the resilience and sustainability of complex urban systems focuses largely on coastal megacities (>1 million people). Midsize cities differ from their larger counterparts due to tight...
Conservation of design knowledge [of large complex spaceborne systems]
NASA Technical Reports Server (NTRS)
Sivard, Cecilia; Zweben, Monte; Cannon, David; Lakin, Fred; Leifer, Larry
1989-01-01
This paper presents an approach for acquiring knowledge about a design during the design process. The objective is to increase the efficiency of the lifecycle management of a space-borne system by providing operational models of the system's structure and behavior, as well as the design rationale, to human and automated operators. A design knowledge acquisition system is under development that compares how two alternative design versions meet the system requirements as a means for automatically capturing rationale for design changes.
Security alarm communication and display systems development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waddoups, I.G.
1990-01-01
Sandia National Laboratories has developed a variety of alarm communication and display systems for a broad spectrum of users. This paper will briefly describe the latest systems developed for the Department of Energy (DOE), the Department of Defense (DoD), and the Department of State (DOS) applications. Applications covered will vary from relatively small facilities to large complex sites. Ongoing system developments will also be discussed. The concluding section will summarize the practical, implementable state-of-the-art features available in new systems. 6 figs.
Fjield, T; Hynynen, K
2000-01-01
Phased-array technology offers an incredible advantage to therapeutic ultrasound due to the ability to electronically steer foci, create multiple foci, or to create an enlarged focal region by using phase cancellation. However, to take advantage of this flexibility, the phased-arrays generally consist of many elements. Each of these elements requires its own radio-frequency generator with independent amplitude and phase control, resulting in a large, complex, and expensive driving system. A method is presented here where in certain cases the number of amplifier channels can be reduced to a fraction of the number of transducer elements, thereby simplifying the driving system and reducing the overall system complexity and cost, by using isolation transformers to produce 180-degree phase shifts.
Information Technology in Complex Health Services
Southon, Frank Charles Gray; Sauer, Chris; Dampney, Christopher Noel Grant (Kit)
1997-01-01
Objective: To identify impediments to the successful transfer and implementation of packaged information systems through large, divisionalized health services. Design: A case analysis of the failure of an implementation of a critical application in the Public Health System of the State of New South Wales, Australia, was carried out. This application had been proven in the United States environment. Measurements: Interviews involving over 60 staff at all levels of the service were undertaken by a team of three. The interviews were recorded and analyzed for key themes, and the results were shared and compared to enable a continuing critical assessment. Results: Two components of the transfer of the system were considered: the transfer from a different environment, and the diffusion throughout a large, divisionalized organization. The analyses were based on the Scott-Morton organizational fit framework. In relation to the first, it was found that there was a lack of fit in the business environments and strategies, organizational structures and strategy-structure pairing as well as the management process-roles pairing. The diffusion process experienced problems because of the lack of fit in the strategy-structure, strategy-structure-management processes, and strategy-structure-role relationships. Conclusion: The large-scale developments of integrated health services present great challenges to the efficient and reliable implementation of information technology, especially in large, divisionalized organizations. There is a need to take a more sophisticated approach to understanding the complexities of organizational factors than has traditionally been the case. PMID:9067877
Southon, F C; Sauer, C; Grant, C N
1997-01-01
To identify impediments to the successful transfer and implementation of packaged information systems through large, divisionalized health services. A case analysis of the failure of an implementation of a critical application in the Public Health System of the State of New South Wales, Australia, was carried out. This application had been proven in the United States environment. Interviews involving over 60 staff at all levels of the service were undertaken by a team of three. The interviews were recorded and analyzed for key themes, and the results were shared and compared to enable a continuing critical assessment. Two components of the transfer of the system were considered: the transfer from a different environment, and the diffusion throughout a large, divisionalized organization. The analyses were based on the Scott-Morton organizational fit framework. In relation to the first, it was found that there was a lack of fit in the business environments and strategies, organizational structures and strategy-structure pairing as well as the management process-roles pairing. The diffusion process experienced problems because of the lack of fit in the strategy-structure, strategy-structure-management processes, and strategy-structure-role relationships. The large-scale developments of integrated health services present great challenges to the efficient and reliable implementation of information technology, especially in large, divisionalized organizations. There is a need to take a more sophisticated approach to understanding the complexities of organizational factors than has traditionally been the case.
NASA Technical Reports Server (NTRS)
Johnson, Sally C.; Boerschlein, David P.
1995-01-01
Semi-Markov models can be used to analyze the reliability of virtually any fault-tolerant system. However, the process of delineating all the states and transitions in a complex system model can be devastatingly tedious and error prone. The Abstract Semi-Markov Specification Interface to the SURE Tool (ASSIST) computer program allows the user to describe the semi-Markov model in a high-level language. Instead of listing the individual model states, the user specifies the rules governing the behavior of the system, and these are used to generate the model automatically. A few statements in the abstract language can describe a very large, complex model. Because no assumptions are made about the system being modeled, ASSIST can be used to generate models describing the behavior of any system. The ASSIST program and its input language are described and illustrated by examples.
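The rule-based model generation that ASSIST performs can be sketched in miniature: instead of listing states and transitions by hand, supply rules that map a state to its successor states and rates, and enumerate the reachable state space by search. ASSIST's actual input language is not reproduced here; the triplex example, rule names, and rates below are hypothetical.

```python
from collections import deque

def generate_model(start, rules):
    # Expand a state-transition model from behavioral rules: each rule maps
    # a state to (next_state, rate) pairs; states and transitions are
    # enumerated by breadth-first search rather than listed individually.
    states, transitions = {start}, []
    frontier = deque([start])
    while frontier:
        s = frontier.popleft()
        for rule in rules:
            for nxt, rate in rule(s):
                transitions.append((s, nxt, rate))
                if nxt not in states:
                    states.add(nxt)
                    frontier.append(nxt)
    return states, transitions

# Toy triplex (3-processor) system; a state is (working, failed) counts.
LAMBDA, DELTA = 1e-4, 3.6e3   # hypothetical failure and recovery rates
def fail(s):
    w, f = s
    return [((w - 1, f + 1), w * LAMBDA)] if w > 1 and f == 0 else []
def recover(s):
    w, f = s
    return [((w, 0), DELTA)] if f > 0 else []

states, transitions = generate_model((3, 0), [fail, recover])
print(sorted(states))   # [(1, 0), (1, 1), (2, 0), (2, 1), (3, 0)]
```

Two short rules generate the full reachable model; in the same spirit, a few ASSIST statements can describe a very large semi-Markov model for a reliability tool such as SURE.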
NASA Technical Reports Server (NTRS)
Scholl, R. E. (Editor)
1979-01-01
Earthquake engineering research capabilities of the National Aeronautics and Space Administration (NASA) facilities at George C. Marshall Space Flight Center (MSFC), Alabama, were evaluated. The results indicate that the NASA/MSFC facilities and supporting capabilities offer unique opportunities for conducting earthquake engineering research. Specific features that are particularly attractive for large scale static and dynamic testing of natural and man-made structures include the following: large physical dimensions of buildings and test bays; high loading capacity; wide range and large number of test equipment and instrumentation devices; multichannel data acquisition and processing systems; technical expertise for conducting large-scale static and dynamic testing; sophisticated techniques for systems dynamics analysis, simulation, and control; and capability for managing large-size and technologically complex programs. Potential uses of the facilities for near and long term test programs to supplement current earthquake research activities are suggested.
NASA Technical Reports Server (NTRS)
Bera, Partha P.; Head-Gordon, Martin; Lee, Timothy J.
2011-01-01
A feasible initiation step for particle growth in the interstellar medium (ISM) is simulated by means of ab initio quantum chemistry methods. The systems studied are dimer ions formed by pairing nitrogen-containing small molecules known to exist in the ISM with ions of unsaturated hydrocarbons, or vice versa. Complexation energies, structures of the ensuing complexes, and electronic excitation spectra of the encounter complexes are estimated using various quantum chemistry methods. Møller-Plesset perturbation theory (MP2), Z-averaged perturbation theory (ZAPT2), coupled cluster singles and doubles with perturbative triples corrections (CCSD(T)), and density functional theory (DFT) methods (B3LYP) were employed along with the correlation consistent cc-pVTZ and aug-cc-pVTZ basis sets. Two types of complexes are predicted. One type has electrostatic binding with moderate (7-20 kcal per mol) binding energies that are nonetheless significantly stronger than typical van der Waals interactions between molecules of this size. The other type develops strong covalent bonds between the fragments. Cyclic isomers of the nitrogen-containing complexes are produced very easily by ion-molecule reactions. Some of these complexes show intense ultraviolet-visible spectra for electronic transitions with large oscillator strengths at the B3LYP, ωB97, and equation-of-motion coupled cluster (EOM-CCSD) levels. The open-shell nitrogen-containing carbonaceous complexes especially exhibit a large-oscillator-strength electronic transition in the visible region of the electromagnetic spectrum.
The use of inflatable structures for re-entry of orbiting vehicles
NASA Astrophysics Data System (ADS)
Kendall, Robert T.; Maddox, Arthur R.
1990-10-01
Inflatable recovery systems offer the unique advantage that a large high-drag shape can be stored initially in a relatively small package. The resulting shapes decelerate rapidly, with lower heating inputs than other types of re-entry vehicles. Recent developments have led to light-weight materials that, with little thermal protection, can withstand the heating inputs to such vehicles. As a result, inflatable recovery vehicles offer a simple, reliable and economical way to return various vehicles from orbit. This paper examines the application of this concept to a large and a small vehicle, with the accompanying dynamics that might be expected. More complex systems could extend the concept to emergency personnel escape systems, payload abort and satellite recovery systems.
The physics of complex systems in information and biology
NASA Astrophysics Data System (ADS)
Walker, Dylan
Citation networks have re-emerged as a topic of intense interest in the complex networks community with the recent availability of large-scale data sets. The ranking of citation networks is a necessary practice as a means to improve information navigability and search. Unlike many information networks, the aging characteristics of citation networks require the development of new ranking methods. To account for the strong aging characteristics of citation networks, we modify the PageRank algorithm by initially distributing random surfers exponentially with age, in favor of more recent publications. The output of this algorithm, which we call CiteRank, is interpreted as approximate traffic to individual publications in a simple model of how researchers find new information. We optimize the parameters of our algorithm to achieve the best performance. The results are compared for two rather different citation networks: all American Physical Society publications between 1893 and 2003, and the set of high-energy physics theory (hep-th) preprints. Despite major differences between these two networks, we find that their optimal parameters for the CiteRank algorithm are remarkably similar. The advantages and performance of CiteRank over more conventional methods of ranking publications are discussed. Collaborative voting systems have emerged as an abundant form of real-world, complex information systems that exist in a variety of online applications. These systems are comprised of large populations of users that collectively submit and vote on objects. While the specific properties of these systems vary widely, many of them share a core set of features and dynamical behaviors that govern their evolution. We study a subset of these systems that involve material of a time-critical nature, as in the popular example of news items. We consider a general model system in which articles are introduced, voted on by a population of users, and subsequently expire after a prescribed period of time.
To study the interaction between popularity and quality, we introduce simple stochastic models of user behavior that approximate differing user quality and susceptibility to the common notion of popularity. We define a metric to quantify user reputation in a manner that is self-consistent, adaptable and content-blind, and shows good correlation with the probability that a user behaves in an optimal fashion. We further construct a mechanism for ranking documents that takes into account user reputation and provides substantial improvement in the time-critical performance of the system. The structure of complex systems has been well studied in the context of both information and biological systems. More recently, dynamics in complex systems that occur over the background of the underlying network have received a great deal of attention. In particular, the study of fluctuations in complex systems has emerged as an issue central to understanding dynamical behavior. We approach the problem of collective effects of the underlying network on dynamical fluctuations by considering the protein-protein interaction networks for the system of the living cell. We consider two types of fluctuations in the mass-action equilibrium in protein binding networks. The first type is driven by relatively slow changes in total concentrations (copy numbers) of interacting proteins. The second type, which we refer to as spontaneous, is caused by quickly decaying thermodynamic deviations away from the mass-action equilibrium of the system. As such they are amenable to methods of equilibrium statistical mechanics used in our study. We investigate the effects of network connectivity on these fluctuations by comparing them to different scenarios in which the interacting pair is isolated from the rest of the network. Such a comparison allows us to analytically derive upper and lower bounds on network fluctuations.
The collective effects are shown to sometimes lead to relatively large amplification of spontaneous fluctuations as compared to the expectation for isolated dimers. As a consequence of this, the strength of both types of fluctuations is positively correlated with the overall network connectivity of proteins forming the complex. On the other hand, the relative amplitude of fluctuations is negatively correlated with the equilibrium concentration of the complex. Our general findings are illustrated using a curated network of protein-protein interactions and multi-protein complexes in baker's yeast with experimentally determined protein concentrations.
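The CiteRank scheme described in this record (random surfers seeded exponentially by publication age, then following citations with some probability) can be sketched as accumulated traffic over repeated citation-following steps. The traffic-accumulation form and the parameter values below are assumptions for illustration, not the tuned values from the study.

```python
import numpy as np

def citerank(citations, ages, alpha=0.5, tau=2.0, iters=100):
    # Traffic = sum_k (alpha * W)^k rho, where rho seeds surfers at papers
    # with probability decaying exponentially with age (favoring recent
    # work), and W spreads a surfer from a citing paper to the papers it
    # cites (links point backwards in time).
    n = len(ages)
    rho = np.exp(-np.asarray(ages, dtype=float) / tau)
    rho /= rho.sum()
    traffic, spread = rho.copy(), rho.copy()
    for _ in range(iters):
        nxt = np.zeros(n)
        for i, refs in enumerate(citations):
            for j in refs:
                nxt[j] += alpha * spread[i] / len(refs)
        spread = nxt
        traffic += spread
    return traffic

# Toy chain: paper 2 (newest) cites papers 1 and 0; paper 1 cites paper 0.
traffic = citerank([[], [0], [1, 0]], ages=[2.0, 1.0, 0.0])
print(traffic)   # oldest paper 0 accumulates the most traffic
```

Even though recent papers receive the surfers first, traffic flows backwards along citations, so a well-cited older paper can still rank highest; tuning alpha and tau trades off recency against citation-driven authority.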
Large Animal Models of an In Vivo Bioreactor for Engineering Vascularized Bone.
Akar, Banu; Tatara, Alexander M; Sutradhar, Alok; Hsiao, Hui-Yi; Miller, Michael; Cheng, Ming-Huei; Mikos, Antonios G; Brey, Eric M
2018-04-12
Reconstruction of large skeletal defects is challenging due to the requirement for large volumes of donor tissue and the often complex surgical procedures. Tissue engineering has the potential to serve as a new source of tissue for bone reconstruction, but current techniques are often limited with regard to the size and complexity of tissue that can be formed. Building tissue using an in vivo bioreactor approach may enable the production of appropriate amounts of specialized tissue, while reducing issues of donor site morbidity and infection. Large animals are required to screen and optimize new strategies for growing clinically appropriate volumes of tissues in vivo. In this article, we review both ovine and porcine models of the technique proposed for clinical engineering of bone tissue in vivo. Recent findings are discussed with these systems, as well as a description of the next steps required for using these models to develop clinically applicable tissue engineering applications.
Nie, Yan; Viola, Cristina; Bieniossek, Christoph; Trowitzsch, Simon; Vijayachandran, Lakshmi Sumitra; Chaillet, Maxime; Garzoni, Frederic; Berger, Imre
2009-01-01
We are witnessing tremendous advances in our understanding of the organization of life. Complete genomes are being deciphered with ever increasing speed and accuracy, thereby setting the stage for addressing the entire gene product repertoire of cells, towards understanding whole biological systems. Advances in bioinformatics and mass spectrometric techniques have revealed the multitude of interactions present in the proteome. Multiprotein complexes are emerging as a paramount cornerstone of biological activity, as many proteins appear to participate, stably or transiently, in large multisubunit assemblies. Analysis of the architecture of these assemblies and their manifold interactions is imperative for understanding their function at the molecular level. Structural genomics efforts have fostered the development of many technologies towards achieving the throughput required for studying system-wide single proteins and small interaction motifs at high resolution. The present shift in focus towards large multiprotein complexes, in particular in eukaryotes, now calls for a likewise concerted effort to develop and provide new technologies that are urgently required to produce in quality and quantity the plethora of multiprotein assemblies that form the complexome, and to routinely study their structure and function at the molecular level. Current efforts towards this objective are summarized and reviewed in this contribution. PMID:20514218
Large Scale Multi-area Static/Dynamic Economic Dispatch using Nature Inspired Optimization
NASA Astrophysics Data System (ADS)
Pandit, Manjaree; Jain, Kalpana; Dubey, Hari Mohan; Singh, Rameshwar
2017-04-01
Economic dispatch (ED) ensures that the generation allocation to the power units is carried out such that the total fuel cost is minimized and all the operating equality/inequality constraints are satisfied. Classical ED does not take transmission constraints into consideration, but in the present restructured power systems the tie-line limits play a very important role in deciding operational policies. ED is a dynamic problem which is performed on-line in the central load dispatch centre with changing load scenarios. The dynamic multi-area ED (MAED) problem is more complex due to the additional tie-line, ramp-rate and area-wise power balance constraints. Nature-inspired (NI) heuristic optimization methods are gaining popularity over the traditional methods for complex problems. This work presents modified particle swarm optimization (PSO) based techniques in which parameter automation is used to improve search efficiency by avoiding stagnation at a sub-optimal result. The performance of the PSO variants is validated against the traditional solver GAMS for single-area as well as multi-area economic dispatch on three test cases of a large 140-unit standard test system with complex constraints.
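As a rough illustration of the PSO approach described above (not the authors' modified variants), the sketch below minimizes a quadratic fuel-cost function for three hypothetical units, with the power-balance equality constraint handled by a penalty term; a linearly decreasing inertia weight stands in for the "parameter automation" idea. All cost coefficients, unit limits and the 300 MW demand are invented, not the paper's 140-unit data.

```python
import random

random.seed(1)

# Hypothetical quadratic cost curves a + b*P + c*P^2 for three units.
COEFFS = [(100.0, 2.0, 0.008), (120.0, 1.8, 0.010), (80.0, 2.2, 0.012)]
LIMITS = [(50.0, 200.0), (50.0, 150.0), (20.0, 120.0)]   # MW
DEMAND = 300.0                                           # MW

def fuel_cost(p):
    return sum(a + b * x + c * x * x for (a, b, c), x in zip(COEFFS, p))

def penalized(p):
    # Quadratic penalty enforcing the power-balance equality constraint.
    return fuel_cost(p) + 1e4 * (sum(p) - DEMAND) ** 2

def pso(n=30, iters=200):
    dim = len(LIMITS)
    swarm = [[random.uniform(lo, hi) for lo, hi in LIMITS] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [list(x) for x in swarm]
    gbest = min(pbest, key=penalized)
    for t in range(iters):
        w = 0.9 - 0.5 * t / iters   # decreasing inertia: a crude "parameter automation"
        for i in range(n):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + 2.0 * r1 * (pbest[i][d] - swarm[i][d])
                             + 2.0 * r2 * (gbest[d] - swarm[i][d]))
                lo, hi = LIMITS[d]
                swarm[i][d] = min(max(swarm[i][d] + vel[i][d], lo), hi)
            if penalized(swarm[i]) < penalized(pbest[i]):
                pbest[i] = list(swarm[i])
        gbest = min(pbest, key=penalized)
    return gbest

best = pso()
```

A real MAED solver would add tie-line, ramp-rate and per-area balance constraints in the same penalty (or repair) framework.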
NASA Astrophysics Data System (ADS)
Ren, Lei; Zhang, Lin; Tao, Fei; (Luke) Zhang, Xiaolong; Luo, Yongliang; Zhang, Yabin
2012-08-01
Multidisciplinary design of complex products leads to an increasing demand for high performance simulation (HPS) platforms. One great challenge is how to achieve highly efficient utilisation of large-scale simulation resources in distributed and heterogeneous environments. This article reports a virtualisation-based methodology to realise a HPS platform. This research is driven by the issues concerning large-scale simulation resources deployment and complex simulation environment construction, efficient and transparent utilisation of fine-grained simulation resources and highly reliable simulation with fault tolerance. A framework of virtualisation-based simulation platform (VSIM) is first proposed. Then the article investigates and discusses key approaches in VSIM, including simulation resources modelling, a method for automatically deploying simulation resources for dynamic construction of the system environment, and a live migration mechanism in case of faults in run-time simulation. Furthermore, the proposed methodology is applied to a multidisciplinary design system for aircraft virtual prototyping and some experiments are conducted. The experimental results show that the proposed methodology can (1) significantly improve the utilisation of fine-grained simulation resources, (2) result in a great reduction in deployment time and an increased flexibility for simulation environment construction and (3) achieve fault-tolerant simulation.
EvoluCode: Evolutionary Barcodes as a Unifying Framework for Multilevel Evolutionary Data.
Linard, Benjamin; Nguyen, Ngoc Hoan; Prosdocimi, Francisco; Poch, Olivier; Thompson, Julie D
2012-01-01
Evolutionary systems biology aims to uncover the general trends and principles governing the evolution of biological networks. An essential part of this process is the reconstruction and analysis of the evolutionary histories of these complex, dynamic networks. Unfortunately, the methodologies for representing and exploiting such complex evolutionary histories in large scale studies are currently limited. Here, we propose a new formalism, called EvoluCode (Evolutionary barCode), which allows the integration of different evolutionary parameters (e.g., sequence conservation, orthology, synteny, …) in a unifying format and facilitates the multilevel analysis and visualization of complex evolutionary histories at the genome scale. The advantages of the approach are demonstrated by constructing barcodes representing the evolution of the complete human proteome. Two large-scale studies are then described: (i) the mapping and visualization of the barcodes on the human chromosomes and (ii) automatic clustering of the barcodes to highlight protein subsets sharing similar evolutionary histories and their functional analysis. The methodologies developed here open the way to the efficient application of other data mining and knowledge extraction techniques in evolutionary systems biology studies. A database containing all EvoluCode data is available at: http://lbgi.igbmc.fr/barcodes.
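The barcode idea — summarizing each protein as a vector of evolutionary parameters and then clustering proteins with similar histories — can be sketched roughly as below. The parameter values and the tiny k-means clusterer are illustrative stand-ins, not EvoluCode's actual format or algorithm.

```python
import random

random.seed(0)

# Toy "barcodes": each protein summarized as a vector of evolutionary parameters
# (hypothetical values standing in for conservation, orthology ratio, synteny score).
barcodes = {
    "protA": [0.95, 0.90, 0.85],
    "protB": [0.92, 0.88, 0.80],
    "protC": [0.20, 0.15, 0.10],
    "protD": [0.25, 0.10, 0.12],
}

def dist2(u, v):
    return sum((a - b) ** 2 for a, b in zip(u, v))

def kmeans(vectors, k=2, iters=20):
    # Minimal k-means: random initial centroids, then alternate assign/update.
    cents = random.sample(vectors, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in vectors:
            groups[min(range(k), key=lambda i: dist2(v, cents[i]))].append(v)
        cents = [[sum(col) / len(g) for col in zip(*g)] if g else cents[i]
                 for i, g in enumerate(groups)]
    return cents

cents = kmeans(list(barcodes.values()))

def assign(name):
    return min(range(len(cents)), key=lambda i: dist2(barcodes[name], cents[i]))
```

With clearly separated histories, proteins sharing similar barcodes land in the same cluster, mirroring study (ii) in the abstract.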
ERIC Educational Resources Information Center
San Diego, Jonathan P.; Cox, Margaret J.; Quinn, Barry F. A.; Newton, Jonathan Tim; Banerjee, Avijit; Woolford, Mark
2012-01-01
hapTEL, an interdisciplinary project funded by two UK research councils from 2007 to 2011, involves a large interdisciplinary team (with undergraduate and post-graduate student participants) which has been developing and evaluating a virtual learning system within an HE healthcare education setting, working on three overlapping strands. Strand 1…
Evaluation of RCAS Inflow Models for Wind Turbine Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tangler, J.; Bir, G.
The finite element structural modeling in the Rotorcraft Comprehensive Analysis System (RCAS) provides a state-of-the-art approach to aeroelastic analysis. This, coupled with its ability to model all turbine components, results in a methodology that can simulate complex system interactions characteristic of large wind turbines. In addition, RCAS is uniquely capable of modeling advanced control algorithms and the resulting dynamic responses.
Large space telescope engineering scale model optical design
NASA Technical Reports Server (NTRS)
Facey, T. A.
1973-01-01
The objective is to develop the detailed design and tolerance data for the LST engineering scale model optical system. This will enable MSFC to move forward to the optical element procurement phase and also to evaluate tolerances, manufacturing requirements, assembly/checkout procedures, reliability, operational complexity, stability requirements of the structure and thermal system, and the flexibility to change and grow.
Interdisciplinary Team Science in Cell Biology.
Horwitz, Rick
2016-11-01
The cell is complex. With its multitude of components, spatial-temporal character, and gene expression diversity, it is challenging to comprehend the cell as an integrated system and to develop models that predict its behaviors. I suggest an approach to address this issue, involving system level data analysis, large scale team science, and philanthropy.
A measurement system for large, complex software programs
NASA Technical Reports Server (NTRS)
Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.
1994-01-01
This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.
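The paper's own cost and quality models are not given in this abstract. As a generic illustration of how criticality can select among cost models, here is the classic Basic COCOMO family (Boehm's published coefficients), which maps software size and project class to estimated effort; it is a stand-in, not the authors' model.

```python
# Basic COCOMO effort models (Boehm, 1981): effort = a * KLOC**b person-months.
# Used here only to illustrate selecting a cost model by software criticality;
# the measurement system in the paper uses its own (unpublished here) models.
COCOMO = {
    "organic": (2.4, 1.05),        # small teams, low criticality
    "semi-detached": (3.0, 1.12),  # intermediate
    "embedded": (3.6, 1.20),       # tight constraints, high criticality
}

def effort_person_months(kloc, criticality):
    a, b = COCOMO[criticality]
    return a * kloc ** b

# Same 100 KLOC product, different criticality class -> very different estimates.
low = effort_person_months(100, "organic")
high = effort_person_months(100, "embedded")
```

The spread between the two estimates for identical size shows why, as the abstract argues, criticality must drive the choice of cost model.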
NASA Astrophysics Data System (ADS)
Michael, H. A.; Voss, C. I.
2009-12-01
Widespread arsenic poisoning is occurring in large areas of Bangladesh and West Bengal, India due to high arsenic levels in shallow groundwater, which is the primary source of irrigation and drinking water in the region. The high-arsenic groundwater exists in aquifers of the Bengal Basin, a huge sedimentary system approximately 500 km × 500 km in extent and greater than 15 km deep in places. Deeper groundwater (>150 m) is nearly universally low in arsenic and a potential source of safe drinking water, but evaluation of its sustainability requires understanding of the entire, interconnected regional aquifer system. Numerical modeling of flow and arsenic transport in the basin introduces problems of scale: challenges in representing the system in enough detail to produce meaningful simulations and answer relevant questions while maintaining enough simplicity to understand controls on processes and operating within computational constraints. A regional groundwater flow and transport model of the Bengal Basin was constructed to assess the large-scale functioning of the deep groundwater flow system, the vulnerability of deep groundwater to pumping-induced migration from above, and the effect of chemical properties of sediments (sorption) on sustainability. The primary challenges include the very large spatial scale of the system, dynamic monsoonal hydrology (small temporal scale fluctuations), complex sedimentary architecture (small spatial scale heterogeneity), and a lack of reliable hydrologic and geologic data. The approach was simple. Detailed inputs were reduced to only those that affect the functioning of the deep flow system. Available data were used to estimate upscaled parameter values. Nested small-scale simulations were performed to determine the effects of the simplifications, which include treatment of the top boundary condition and transience, effects of small-scale heterogeneity, and effects of individual pumping wells. 
Simulation of arsenic transport at the large scale adds another element of complexity. Minimization of numerical oscillation and mass balance errors required experimentation with solvers and discretization. In the face of relatively few data in a very large-scale model, sensitivity analyses were essential. The scale of the system limits evaluation of localized behavior, but results clearly identified the primary controls on the system and effects of various pumping scenarios and sorptive properties. It was shown that limiting deep pumping to domestic supply may result in sustainable arsenic-safe water for 90% of the arsenic-affected region over a 1000 year timescale, and that sorption of arsenic onto deep, oxidized Pleistocene sediments may increase the breakthrough time in unsustainable zones by more than an order of magnitude. Thus, both hydraulic and chemical defenses indicate the potential for sustainable, managed use of deep, safe groundwater resources in the Bengal Basin.
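The more-than-order-of-magnitude delay from sorption can be illustrated with the standard linear retardation factor from contaminant hydrogeology; the parameter values below are illustrative, not the study's calibrated ones.

```python
# Linear-sorption retardation factor: R = 1 + (rho_b * Kd) / theta.
# Advective breakthrough time scales roughly linearly with R.
# All numbers below are illustrative, not the Bengal Basin model's values.
rho_b = 1.6    # bulk density, g/cm^3
theta = 0.35   # porosity (-)
Kd = 3.0       # sorption distribution coefficient, cm^3/g

R = 1.0 + rho_b * Kd / theta

t_conservative = 70.0            # years, hypothetical non-sorbing travel time
t_sorbing = t_conservative * R   # sorption-delayed breakthrough time
```

With these (invented) values R exceeds 10, consistent with the abstract's statement that sorption onto oxidized Pleistocene sediments can stretch breakthrough times by more than an order of magnitude.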
NASA Astrophysics Data System (ADS)
Havlin, S.; Kenett, D. Y.; Ben-Jacob, E.; Bunde, A.; Cohen, R.; Hermann, H.; Kantelhardt, J. W.; Kertész, J.; Kirkpatrick, S.; Kurths, J.; Portugali, J.; Solomon, S.
2012-11-01
Network theory has become one of the most visible theoretical frameworks that can be applied to the description, analysis, understanding, design and repair of multi-level complex systems. Complex networks occur everywhere, in man-made and human social systems, in organic and inorganic matter, from nano to macro scales, and in natural and anthropogenic structures. New applications are developed at an ever-increasing rate and the promise for future growth is high, since increasingly we interact with one another within these vital and complex environments. Despite all the great successes of this field, crucial aspects of multi-level complex systems have been largely ignored. Important challenges of network science are to take into account many of these missing realistic features such as strong coupling between networks (networks are not isolated), the dynamics of networks (networks are not static), interrelationships between structure, dynamics and function of networks, interdependencies in given networks (and other classes of links, including different signs of interactions), and spatial properties (including geographical aspects) of networks. The aim of this paper is to introduce and discuss the challenges that future network science needs to address, and how different disciplines will be accordingly affected.
Automated Derivation of Complex System Constraints from User Requirements
NASA Technical Reports Server (NTRS)
Foshee, Mark; Murey, Kim; Marsh, Angela
2010-01-01
The Payload Operations Integration Center (POIC) located at the Marshall Space Flight Center has the responsibility of integrating US payload science requirements for the International Space Station (ISS). All payload operations must request ISS system resources so that the resource usage will be included in the ISS on-board execution timelines. The scheduling of resources and building of the timeline is performed using the Consolidated Planning System (CPS). The ISS resources are quite complex due to the large number of components that must be accounted for. The planners at the POIC simplify the process for Payload Developers (PDs) by providing the PDs with an application that has the basic functionality PDs need, as well as a list of simplified resources, in the User Requirements Collection (URC) application. The planners maintained a mapping of the URC resources to the CPS resources. The process of manually converting PDs' science requirements from a simplified representation to a more complex CPS representation is a time-consuming and tedious process. The goal is to provide a software solution to allow the planners to build a mapping of the complex CPS constraints to the basic URC constraints and automatically convert the PDs' requirements into system requirements during export to CPS.
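The mapping idea — expanding each simplified URC resource into one or more detailed CPS constraints — might be sketched as below. All resource names, factors and structures are invented for illustration; the real POIC mapping is not published in this abstract.

```python
# Hypothetical mapping table: each simplified URC resource expands into one or
# more detailed CPS constraints with a scaling factor. Names are invented.
URC_TO_CPS = {
    "power_kw": [("EPS_CHANNEL_A_LOAD", 1.0)],
    "crew_hours": [("CREW_SCHED_USOS", 1.0)],
    "downlink_mbps": [("KU_BAND_FWD", 0.5), ("KU_BAND_RTN", 0.5)],
}

def convert(urc_requirements):
    """Automatically expand simplified URC requirements into CPS constraints."""
    cps = {}
    for resource, amount in urc_requirements.items():
        for cps_name, factor in URC_TO_CPS[resource]:
            cps[cps_name] = cps.get(cps_name, 0.0) + amount * factor
    return cps

cps_req = convert({"power_kw": 2.0, "downlink_mbps": 4.0})
```

The point of such a table-driven converter is that planners maintain the mapping once, and every export to CPS reuses it instead of manual translation.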
Localization Algorithm Based on a Spring Model (LASM) for Large Scale Wireless Sensor Networks.
Chen, Wanming; Mei, Tao; Meng, Max Q-H; Liang, Huawei; Liu, Yumei; Li, Yangming; Li, Shuai
2008-03-15
A navigation method for a lunar rover based on large scale wireless sensor networks is proposed. To obtain high navigation accuracy and a large exploration area, high node localization accuracy and a large network scale are required. However, the computational and communication complexity and time consumption increase greatly with the network scale. A localization algorithm based on a spring model (LASM) is proposed to reduce the computational complexity while maintaining the localization accuracy in large scale sensor networks. The algorithm simulates the dynamics of a physical spring system to estimate the positions of nodes. The sensor nodes are set as particles with masses and connected with neighbor nodes by virtual springs. The virtual springs force the particles to move from their randomly set positions to the original positions, i.e. the node positions. Therefore, a blind node position can be determined by the LASM algorithm by calculating the related forces with the neighbor nodes. The computational and communication complexity are O(1) for each node, since the number of neighbor nodes does not increase proportionally with the network scale. Three patches are proposed to avoid local optimization, kick out bad nodes and deal with node variation. Simulation results show that the computational and communication complexity are almost constant despite the increase of the network scale. The time consumption has also been proven to remain almost constant, since the calculation steps are almost unrelated to the network scale.
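A minimal sketch of the spring-relaxation idea: one blind node ranged against three fixed anchors with exact distances, and gradient-style force updates on the position estimate. LASM starts from random positions; here the estimate starts at the anchor centroid to keep the demo deterministic, and all coordinates are invented.

```python
import math

# Known anchor positions and the blind node's true location (illustrative 2-D setup).
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = (4.0, 3.0)

# Measured anchor-to-blind-node distances (exact ranges here, i.e. no noise).
ranges = [math.dist(a, true_pos) for a in anchors]

# Spring relaxation: each spring's rest length is a measured range, and the net
# spring force moves the estimate toward a position consistent with all ranges.
x = sum(a[0] for a in anchors) / len(anchors)
y = sum(a[1] for a in anchors) / len(anchors)
step = 0.1
for _ in range(2000):
    fx = fy = 0.0
    for (ax, ay), r in zip(anchors, ranges):
        d = math.dist((ax, ay), (x, y)) or 1e-9
        stretch = d - r               # > 0: spring stretched, pull toward anchor
        fx += stretch * (ax - x) / d
        fy += stretch * (ay - y) / d
    x += step * fx
    y += step * fy
```

Each node only ever touches its own neighbor springs, which is why the per-node cost stays O(1) as the network grows.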
Dragas, Jelena; Jäckel, David; Hierlemann, Andreas; Franke, Felix
2017-01-01
Reliable real-time low-latency spike sorting with large data throughput is essential for studies of neural network dynamics and for brain-machine interfaces (BMIs), in which the stimulation of neural networks is based on the networks' most recent activity. However, the majority of existing multi-electrode spike-sorting algorithms are unsuited for processing high quantities of simultaneously recorded data. Recording from large neuronal networks using large high-density electrode sets (thousands of electrodes) imposes high demands on the data-processing hardware regarding computational complexity and data transmission bandwidth; this, in turn, entails demanding requirements in terms of chip area, memory resources and processing latency. This paper presents computational complexity optimization techniques, which facilitate the use of spike-sorting algorithms in large multi-electrode-based recording systems. The techniques are then applied to a previously published algorithm, on its own, unsuited for large electrode set recordings. Further, a real-time low-latency high-performance VLSI hardware architecture of the modified algorithm is presented, featuring a folded structure capable of processing the activity of hundreds of neurons simultaneously. The hardware is reconfigurable “on-the-fly” and adaptable to the nonstationarities of neuronal recordings. By transmitting exclusively spike time stamps and/or spike waveforms, its real-time processing offers the possibility of data bandwidth and data storage reduction. PMID:25415989
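A back-of-envelope calculation shows why transmitting only spike time stamps reduces bandwidth so sharply. The array size, firing rates and byte counts below are illustrative round numbers, not figures from the paper.

```python
# Bandwidth comparison for a high-density array (illustrative numbers):
# streaming raw samples vs. transmitting only spike time stamps.
electrodes = 1000
sample_rate = 20_000   # samples per second per electrode
sample_bytes = 2       # 16-bit ADC

raw_bps = electrodes * sample_rate * sample_bytes   # bytes/s for the raw stream

neurons = 500
mean_rate = 10         # spikes per second per neuron
stamp_bytes = 8        # time stamp + unit ID, a generous allowance

spikes_bps = neurons * mean_rate * stamp_bytes      # bytes/s for stamps only
reduction = raw_bps / spikes_bps
```

Even with generous per-spike overhead, the stamp-only stream is orders of magnitude smaller, which is the data-reduction argument made in the abstract.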
NASA Astrophysics Data System (ADS)
Gusev, Anatoly; Diansky, Nikolay; Zalesny, Vladimir
2010-05-01
An original program complex is presented for the ocean circulation sigma-model developed at the Institute of Numerical Mathematics (INM), Russian Academy of Sciences (RAS). The complex can be used in various curvilinear orthogonal coordinate systems. In addition to the ocean circulation model, the complex contains a sea ice dynamics and thermodynamics model, as well as an original system for implementing atmospheric forcing on the basis of both prescribed meteorological data and atmospheric model results. This complex can be used as the oceanic block of an Earth climate model, as well as for solving scientific and practical problems concerning the World Ocean and its separate oceans and seas. The program complex can be used effectively on parallel shared-memory computational systems and on contemporary personal computers. On the basis of the proposed complex, an ocean general circulation model (OGCM) was developed. The model is realized in a curvilinear orthogonal coordinate system obtained by a conformal transformation of the standard geographical grid, which allowed us to locate the system singularities outside the integration domain. The horizontal resolution of the OGCM is 1 degree in longitude and 0.5 degree in latitude, and it has 40 non-uniform sigma-levels in depth. The model was integrated for 100 years starting from the Levitus January climatology using a realistic atmospheric annual cycle calculated on the basis of the CORE datasets. The experimental results showed that the model adequately reproduces the basic characteristics of large-scale World Ocean dynamics, in good agreement with both observational data and the results of the best climatic OGCMs. This OGCM is used as the oceanic component of the new version of the climate system model (CSM) developed at INM RAS. 
The latter is now ready for carrying out new numerical experiments on modelling climate and its change according to IPCC (Intergovernmental Panel on Climate Change) scenarios within the scope of CMIP-5 (Coupled Model Intercomparison Project). On the basis of the proposed complex, an eddy-resolving Pacific Ocean circulation model was also realized. The integration domain covers the Pacific from the Equator to the Bering Strait. The model horizontal resolution is 0.125 degree, and it has 20 non-uniform sigma-levels in depth. The model adequately reproduces the large-scale structure of the circulation and its variability: Kuroshio meandering, ocean synoptic eddies, frontal zones, etc. The high variability of the Kuroshio is shown. The distribution of a contaminant reportedly discharged near Petropavlovsk-Kamchatsky was simulated. The results demonstrate the structure of the contaminant distribution and provide an understanding of the processes of hydrological field formation in the North-West Pacific.
Numerical propulsion system simulation
NASA Technical Reports Server (NTRS)
Lytle, John K.; Remaklus, David A.; Nichols, Lester D.
1990-01-01
The cost of implementing new technology in aerospace propulsion systems is becoming prohibitive. One of the major contributors to the high cost is the need to perform many large scale system tests. Extensive testing is used to capture the complex interactions among the multiple disciplines and the multiple components inherent in complex systems. The objective of the Numerical Propulsion System Simulation (NPSS) is to provide insight into these complex interactions through computational simulations. This will allow for comprehensive evaluation of new concepts early in the design phase before a commitment to hardware is made. It will also allow for rapid assessment of field-related problems, particularly in cases where operational problems were encountered during conditions that would be difficult to simulate experimentally. The tremendous progress taking place in computational engineering and the rapid increase in computing power expected through parallel processing make this concept feasible within the near future. However, it is critical that the framework for such simulations be put in place now to serve as a focal point for the continued developments in computational engineering and computing hardware and software. The NPSS concept described here will provide that framework.
Earthquake Complex Network Analysis Before and After the Mw 8.2 Earthquake in Iquique, Chile
NASA Astrophysics Data System (ADS)
Pasten, D.
2017-12-01
Earthquake complex networks have been shown to be able to find specific features in seismic data sets. In space, these networks have shown a scale-free behavior of the probability distribution of connectivity for directed networks, and a small-world behavior for undirected networks. In this work, we present an earthquake complex network analysis for the large earthquake Mw 8.2 in the north of Chile (near Iquique) in April, 2014. An earthquake complex network is made by dividing the three-dimensional space into cubic cells; if one of these cells contains a hypocenter, we name this cell a node. The connections between nodes are generated in time: we follow the time sequence of seismic events and make the connections between nodes. We then have two different networks: a directed and an undirected network. The directed network takes into consideration the time-direction of the connections, which is very important for the connectivity of the network: we define the connectivity ki of the i-th node as the number of connections going out of node i plus the self-connections (if two seismic events occurred successively in time in the same cubic cell, we have a self-connection). The undirected network is made by removing the direction of the connections and the self-connections from the directed network. For undirected networks, we consider only whether two nodes are connected or not. We have built a directed complex network and an undirected complex network, before and after the large earthquake in Iquique. We have used magnitudes greater than Mw = 1.0 and Mw = 3.0. We found that this method can recognize the influence of these small seismic events on the behavior of the network, and that the size of the cell used to build the network is another important factor in recognizing the influence of the large earthquake in this complex system. 
This method also shows a difference in the values of the critical exponent γ (for the probability distribution of connectivity in the directed network) before and after the large earthquake, but it does not show a change in the clustering behavior of the undirected network before and after the large earthquake, the network showing a small-world behavior both before and after this large seismic event.
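The network construction described above can be sketched directly: bin hypocenters into cubic cells, link the cells of successive events (keeping self-connections for the directed connectivity ki), and drop directions and self-loops for the undirected version. The event coordinates and cell size below are invented.

```python
from collections import defaultdict

# Hypothetical hypocenter sequence (x, y, z in km), ordered in time.
events = [(1.0, 2.0, 5.0), (1.2, 2.1, 5.3), (8.0, 9.0, 20.0),
          (1.1, 2.2, 5.1), (8.2, 9.1, 20.4), (8.1, 9.3, 20.2)]
CELL = 2.0  # cubic cell size in km; the abstract notes this choice matters

def cell(ev):
    # Map a hypocenter to the integer index of its cubic cell.
    return tuple(int(c // CELL) for c in ev)

out_degree = defaultdict(int)   # directed connectivity: out-links incl. self-connections
undirected = set()              # undirected network: no direction, no self-loops
nodes = [cell(e) for e in events]
for a, b in zip(nodes, nodes[1:]):
    out_degree[a] += 1                      # ki counts self-connections too
    if a != b:
        undirected.add(frozenset((a, b)))   # drop direction and self-loops
```

From `out_degree` one can estimate the connectivity distribution and its exponent γ; from `undirected`, clustering and small-world measures.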
Operational development of small plant growth systems
NASA Technical Reports Server (NTRS)
Scheld, H. W.; Magnuson, J. W.; Sauer, R. L.
1986-01-01
The results of a study undertaken in the first phase of an empirical effort in the development of small plant growth chambers for production of salad-type vegetables on the space shuttle or space station are discussed. The overall effort is visualized as providing the underpinning of practical experience in handling of plant systems in space which will provide major support for future efforts in planning, design, and construction of plant-based (phytomechanical) systems for support of human habitation in space. The assumptions underlying the effort hold that large scale phytomechanical habitability support systems for future space stations must evolve from the simple to the complex. The highly complex final systems will be developed from the accumulated experience and data gathered from repetitive tests and trials of fragments or subsystems of the whole in an operational mode. These developing system components will, meanwhile, serve a useful operational function in providing psychological support and diversion for the crews.
The Natural Science Underlying Big History
Chaisson, Eric J.
2014-01-01
Nature's many varied complex systems—including galaxies, stars, planets, life, and society—are islands of order within the increasingly disordered Universe. All organized systems are subject to physical, biological, or cultural evolution, which together comprise the grander interdisciplinary subject of cosmic evolution. A wealth of observational data supports the hypothesis that increasingly complex systems evolve unceasingly, uncaringly, and unpredictably from big bang to humankind. These are global history greatly extended, big history with a scientific basis, and natural history broadly portrayed across ∼14 billion years of time. Human beings and our cultural inventions are not special, unique, or apart from Nature; rather, we are an integral part of a universal evolutionary process connecting all such complex systems throughout space and time. Such evolution writ large has significant potential to unify the natural sciences into a holistic understanding of who we are and whence we came. No new science (beyond frontier, nonequilibrium thermodynamics) is needed to describe cosmic evolution's major milestones at a deep and empirical level. Quantitative models and experimental tests imply that a remarkable simplicity underlies the emergence and growth of complexity for a wide spectrum of known and diverse systems. Energy is a principal facilitator of the rising complexity of ordered systems within the expanding Universe; energy flows are as central to life and society as they are to stars and galaxies. In particular, energy rate density—contrasting with information content or entropy production—is an objective metric suitable to gauge relative degrees of complexity among a hierarchy of widely assorted systems observed throughout the material Universe. Operationally, those systems capable of utilizing optimum amounts of energy tend to survive, and those that cannot are nonrandomly eliminated. PMID:25032228
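Chaisson's energy rate density is simple to compute; the two order-of-magnitude examples below (a star versus a living body) use rough standard figures, not values quoted in this abstract.

```python
# Energy rate density (Chaisson's complexity metric): energy flow per unit time
# and per unit mass, in erg s^-1 g^-1. Inputs are rough textbook figures.
def energy_rate_density(power_erg_per_s, mass_g):
    return power_erg_per_s / mass_g

sun = energy_rate_density(3.8e33, 2.0e33)   # solar luminosity over solar mass: ~2
human = energy_rate_density(1.0e9, 7.0e4)   # ~100 W metabolism over ~70 kg body
```

The ordering illustrates the abstract's claim: per gram, a living body processes vastly more energy than a star, consistent with energy rate density rising along the complexity hierarchy.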
Smith, Robert W; van Rosmalen, Rik P; Martins Dos Santos, Vitor A P; Fleck, Christian
2018-06-19
Models of metabolism are often used in biotechnology and pharmaceutical research to identify drug targets or increase the direct production of valuable compounds. Due to the complexity of large metabolic systems, a number of conclusions have been drawn using mathematical methods with simplifying assumptions. For example, constraint-based models assume that changes of internal concentrations occur much more quickly than alterations in cell physiology. Thus, metabolite concentrations and reaction fluxes are fixed to constant values. This greatly reduces the mathematical complexity, while providing a reasonably good description of the system in steady state. However, without a large number of constraints, many different flux sets can describe the optimal model and we obtain no information on how metabolite levels dynamically change. Thus, to accurately determine what is taking place within the cell, finer-quality data and more detailed models need to be constructed. In this paper we present a computational framework, DMPy, that uses a network scheme as input to automatically search for kinetic rates and produce a mathematical model that describes temporal changes of metabolite fluxes. The parameter search utilises several online databases to find measured reaction parameters. From this, we take advantage of previous modelling efforts, such as Parameter Balancing, to produce an initial mathematical model of a metabolic pathway. We analyse the effect of parameter uncertainty on model dynamics and test how recent flux-based model reduction techniques alter system properties. To our knowledge this is the first time such analysis has been performed on large models of metabolism. Our results highlight that good estimates of at least 80% of the reaction rates are required to accurately model metabolic systems. Furthermore, reducing the size of the model by grouping reactions together based on fluxes alters the resulting system dynamics. 
The presented pipeline automates the modelling process for large metabolic networks. From this, users can simulate their pathway of interest and obtain a better understanding of how altering conditions influences cellular dynamics. By testing the effects of different parameterisations we are also able to provide suggestions to help construct more accurate models of complete metabolic systems in the future.
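The modelling approach this record describes, kinetic rate laws assembled into an ODE system and integrated over time, can be illustrated with a minimal sketch. This is not DMPy itself: the two-step pathway, the mass-action rate laws and the rate constants k1 and k2 are hypothetical stand-ins for the database-derived parameters the pipeline would search for.

```python
# Minimal sketch (not DMPy): a hypothetical two-step pathway S -> M -> P
# with mass-action kinetics, integrated by forward Euler.

def simulate(k1=0.5, k2=0.3, s0=10.0, dt=0.01, steps=2000):
    s, m, p = s0, 0.0, 0.0
    for _ in range(steps):
        v1 = k1 * s          # flux S -> M
        v2 = k2 * m          # flux M -> P
        s += dt * (-v1)
        m += dt * (v1 - v2)
        p += dt * v2
    return s, m, p

s, m, p = simulate()
# Sanity check: total mass is conserved up to floating-point rounding,
# because the fluxes cancel exactly in the update equations.
assert abs((s + m + p) - 10.0) < 1e-6
```

A useful property of this formulation is that conservation of mass falls out of the stoichiometry, which makes it a cheap consistency check when parameters are uncertain.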
Complex behavior in chains of nonlinear oscillators.
Alonso, Leandro M
2017-06-01
This article outlines sufficient conditions under which a one-dimensional chain of identical nonlinear oscillators can display complex spatio-temporal behavior. The units are described by phase equations and consist of excitable oscillators. The interactions are local and the network is poised to a critical state by balancing excitation and inhibition locally. The results presented here suggest that in networks composed of many oscillatory units with local interactions, excitability together with balanced interactions is sufficient to give rise to complex emergent features. For values of the parameters where complex behavior occurs, the system also displays a high-dimensional bifurcation where an exponentially large number of equilibria are borne in pairs out of multiple saddle-node bifurcations.
GODIVA2: interactive visualization of environmental data on the Web.
Blower, J D; Haines, K; Santokhee, A; Liu, C L
2009-03-13
GODIVA2 is a dynamic website that provides visual access to several terabytes of physically distributed, four-dimensional environmental data. It allows users to explore large datasets interactively without the need to install new software or download and understand complex data. Through the use of open international standards, GODIVA2 maintains a high level of interoperability with third-party systems, allowing diverse datasets to be mutually compared. Scientists can use the system to search for features in large datasets and to diagnose the output from numerical simulations and data processing algorithms. Data providers around Europe have adopted GODIVA2 as an INSPIRE-compliant dynamic quick-view system for providing visual access to their data.
He, Xinhua; Hu, Wenfa
2014-01-01
This paper presents a multiple-rescue model for an emergency supply chain system under uncertainty in a large-scale disaster-affected area. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered over several locations; that the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but coordinated within one emergency supply chain system; and that, depending on the types of rescue demands, one or more distinct servers dispatch emergency resources along different vehicle routes, and emergency rescue services queue at multiple rescue-demand locations. This emergency system is modeled as a minimal-queuing-response-time model of location and allocation. A solution to this complex mathematical problem is developed based on a genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model.
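The genetic-algorithm solution approach can be sketched on a stripped-down version of the allocation problem. Everything below is hypothetical: the depot and demand coordinates, the use of plain travel distance as a proxy for queuing response time, and the omission of echelons, vehicle routes and capacities.

```python
import random

# Toy sketch of the genetic-algorithm idea: assign each rescue-demand
# location to one depot so that total travel distance is minimized.

random.seed(1)
depots = [(0.0, 0.0), (10.0, 0.0), (5.0, 8.0)]
demands = [(random.uniform(0, 10), random.uniform(0, 8)) for _ in range(20)]

def cost(assign):
    return sum(((demands[i][0] - depots[a][0]) ** 2 +
                (demands[i][1] - depots[a][1]) ** 2) ** 0.5
               for i, a in enumerate(assign))

def evolve(pop_size=30, gens=100, mut=0.1):
    pop = [[random.randrange(len(depots)) for _ in demands]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)
        survivors = pop[:pop_size // 2]          # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(demands))
            child = a[:cut] + b[cut:]            # one-point crossover
            for j in range(len(child)):          # per-gene mutation
                if random.random() < mut:
                    child[j] = random.randrange(len(depots))
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)

best = evolve()
# The evolved assignment should at least beat sending everything to depot 0.
assert cost(best) <= cost([0] * len(demands))
```

The real model layers a queuing response-time objective and multi-echelon routing on top of this assignment structure; the chromosome encoding and selection/crossover/mutation loop stay the same in spirit.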
The application of sensitivity analysis to models of large scale physiological systems
NASA Technical Reports Server (NTRS)
Leonard, J. I.
1974-01-01
A survey of the literature on sensitivity analysis as it applies to biological systems is reported, along with a brief development of sensitivity theory. A simple population model and a more complex thermoregulatory model illustrate the investigatory techniques and the interpretation of parameter sensitivity analysis. The role of sensitivity analysis in validating and verifying models, in identifying relative parameter influence, and in estimating errors in model behavior due to uncertainty in input data is presented. This analysis is valuable to the simulationist and the experimentalist in allocating resources for data collection. A method is also presented for reducing highly complex, nonlinear models to simple linear algebraic models that could be useful for making rapid, first-order calculations of system behavior.
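The parameter-sensitivity idea applied to a simple population model can be sketched as follows. The logistic growth model, the parameter values and the central finite-difference scheme are illustrative choices, not the survey's specific formulation; the quantity computed is the standard normalized sensitivity S = (p / y) * dy/dp used to rank parameter influence.

```python
# Sketch of parameter sensitivity analysis on a simple population model
# (logistic growth), integrated by forward Euler.

def population(r, K, y0=1.0, dt=0.01, steps=500):
    y = y0
    for _ in range(steps):
        y += dt * r * y * (1 - y / K)
    return y

def sensitivity(f, p, i, h=1e-6):
    """Normalized central finite-difference sensitivity of f w.r.t. p[i]."""
    up = list(p); up[i] += h
    down = list(p); down[i] -= h
    dy = (f(*up) - f(*down)) / (2 * h)
    return p[i] * dy / f(*p)

p = [0.8, 10.0]  # growth rate r, carrying capacity K
s_r = sensitivity(population, p, 0)
s_K = sensitivity(population, p, 1)
# As the population approaches steady state, the carrying capacity
# dominates the growth rate in determining the output.
assert abs(s_K) > abs(s_r)
```

Ranking parameters this way is exactly the resource-allocation use the abstract mentions: experimental effort goes to measuring the parameters with the largest normalized sensitivities.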
Biswas, Amitava; Liu, Chen; Monga, Inder; ...
2016-01-01
For the last few years, there has been tremendous growth in data traffic due to the high adoption rate of mobile devices and cloud computing. The Internet of things (IoT) will stimulate even further growth. This is increasing the scale and complexity of telecom/internet service provider (SP) and enterprise data centre (DC) compute and network infrastructures. As a result, managing these large network-compute converged infrastructures is becoming complex and cumbersome. To cope, network and DC operators are trying to automate network and system operations, administration and management (OAM) functions. OAM includes all non-functional mechanisms which keep the network running.
The price of complexity in financial networks
May, Robert M.; Roukny, Tarik; Stiglitz, Joseph E.
2016-01-01
Financial institutions form multilayer networks by engaging in contracts with each other and by holding exposures to common assets. As a result, the default probability of one institution depends on the default probability of all of the other institutions in the network. Here, we show how small errors on the knowledge of the network of contracts can lead to large errors in the probability of systemic defaults. From the point of view of financial regulators, our findings show that the complexity of financial networks may decrease the ability to mitigate systemic risk, and thus it may increase the social cost of financial crises. PMID:27555583
Percolation in multiplex networks with overlap.
Cellai, Davide; López, Eduardo; Zhou, Jie; Gleeson, James P; Bianconi, Ginestra
2013-11-01
From transportation networks to complex infrastructures, and to social and communication networks, a large variety of systems can be described in terms of multiplexes formed by a set of nodes interacting through different networks (layers). Multiplexes may display an increased fragility with respect to the single layers that constitute them. However, so far the overlap of the links in different layers has been mostly neglected, despite the fact that it is a ubiquitous phenomenon in most multiplexes. Here, we show that the overlap among layers can improve the robustness of interdependent multiplex systems and change the critical behavior of the percolation phase transition in a complex way.
Hentschel, Mario; Schäferling, Martin; Duan, Xiaoyang; Giessen, Harald; Liu, Na
2017-01-01
We present a comprehensive overview of chirality and its optical manifestation in plasmonic nanosystems and nanostructures. We discuss top-down fabricated structures that range from solid metallic nanostructures to groupings of metallic nanoparticles arranged in three dimensions. We also present the large variety of bottom-up synthesized structures. Using DNA, peptides, or other scaffolds, complex nanoparticle arrangements of up to hundreds of individual nanoparticles have been realized. Beyond this static picture, we also give an overview of recent demonstrations of active chiral plasmonic systems, where the chiral optical response can be controlled by an external stimulus. We discuss the prospect of using the unique properties of complex chiral plasmonic systems for enantiomeric sensing schemes. PMID:28560336
The large-scale organization of metabolic networks
NASA Astrophysics Data System (ADS)
Jeong, H.; Tombor, B.; Albert, R.; Oltvai, Z. N.; Barabási, A.-L.
2000-10-01
In a cell or microorganism, the processes that generate mass, energy, information transfer and cell-fate specification are seamlessly integrated through a complex network of cellular constituents and reactions. However, despite the key role of these networks in sustaining cellular functions, their large-scale structure is essentially unknown. Here we present a systematic comparative mathematical analysis of the metabolic networks of 43 organisms representing all three domains of life. We show that, despite significant variation in their individual constituents and pathways, these metabolic networks have the same topological scaling properties and show striking similarities to the inherent organization of complex non-biological systems. This may indicate that metabolic organization is not only identical for all living organisms, but also complies with the design principles of robust and error-tolerant scale-free networks, and may represent a common blueprint for the large-scale organization of interactions among all cellular constituents.
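The scale-free organization this paper reports can be illustrated with a short preferential-attachment sketch. This is a generic Barabasi-Albert-style growth model, not the paper's metabolic data; the network size, attachment parameter and seed are arbitrary choices made for the demonstration.

```python
import random

# Sketch of scale-free network growth by preferential attachment:
# new nodes connect to m existing nodes with probability proportional
# to degree, producing a heavy-tailed degree distribution with hubs.

def preferential_attachment(n=2000, m=2, seed=0):
    rng = random.Random(seed)
    targets = [0, 1]            # degree-weighted list of endpoints
    degree = {0: 1, 1: 1}
    for v in range(2, n):
        chosen = set()
        while len(chosen) < m:  # sample m distinct degree-weighted targets
            chosen.add(rng.choice(targets))
        for u in chosen:
            targets += [u, v]
            degree[u] = degree.get(u, 0) + 1
            degree[v] = degree.get(v, 0) + 1
    return degree

deg = preferential_attachment()
hubs = sorted(deg.values(), reverse=True)
# A heavy-tailed degree distribution produces hubs far above the mean degree.
assert hubs[0] > 10 * (sum(deg.values()) / len(deg))
```

The error tolerance the abstract mentions follows from this shape: random node removal almost always hits low-degree nodes, while the rare hubs hold the network together.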
NASA Technical Reports Server (NTRS)
Singh, Mrityunjay; Petko, Jeannie F.
2004-01-01
Affordable fiber-reinforced ceramic matrix composites with multifunctional properties are critically needed for high-temperature aerospace and space transportation applications. These materials have various applications in advanced high-efficiency and high-performance engines, airframe and propulsion components for next-generation launch vehicles, and components for land-based systems. A number of these applications require materials with specific functional characteristics: for example, thick component, hybrid layups for environmental durability and stress management, and self-healing and smart composite matrices. At present, with limited success and very high cost, traditional composite fabrication technologies have been utilized to manufacture some large, complex-shape components of these materials. However, many challenges still remain in developing affordable, robust, and flexible manufacturing technologies for large, complex-shape components with multifunctional properties. The prepreg and melt infiltration (PREMI) technology provides an affordable and robust manufacturing route for low-cost, large-scale production of multifunctional ceramic composite components.
Electronic sensor and actuator webs for large-area complex geometry cardiac mapping and therapy
Kim, Dae-Hyeong; Ghaffari, Roozbeh; Lu, Nanshu; Wang, Shuodao; Lee, Stephen P.; Keum, Hohyun; D’Angelo, Robert; Klinker, Lauren; Su, Yewang; Lu, Chaofeng; Kim, Yun-Soung; Ameen, Abid; Li, Yuhang; Zhang, Yihui; de Graff, Bassel; Hsu, Yung-Yu; Liu, ZhuangJian; Ruskin, Jeremy; Xu, Lizhi; Lu, Chi; Omenetto, Fiorenzo G.; Huang, Yonggang; Mansour, Moussa; Slepian, Marvin J.; Rogers, John A.
2012-01-01
Curved surfaces, complex geometries, and time-dynamic deformations of the heart create challenges in establishing intimate, nonconstraining interfaces between cardiac structures and medical devices or surgical tools, particularly over large areas. We constructed large area designs for diagnostic and therapeutic stretchable sensor and actuator webs that conformally wrap the epicardium, establishing robust contact without sutures, mechanical fixtures, tapes, or surgical adhesives. These multifunctional web devices exploit open, mesh layouts and mount on thin, bio-resorbable sheets of silk to facilitate handling in a way that yields, after dissolution, exceptionally low mechanical moduli and thicknesses. In vivo studies in rabbit and pig animal models demonstrate the effectiveness of these device webs for measuring and spatially mapping temperature, electrophysiological signals, strain, and physical contact in sheet and balloon-based systems that also have the potential to deliver energy to perform localized tissue ablation. PMID:23150574
Poincaré resonances and the limits of trajectory dynamics.
Petrosky, T; Prigogine, I
1993-01-01
In previous papers we have shown that the elimination of the resonance divergences in large Poincaré systems leads to complex irreducible spectral representations for the Liouville-von Neumann operator. Complex means that time symmetry is broken, and irreducibility means that this representation is implementable only by statistical ensembles and not by trajectories. We consider in this paper classical potential scattering. Our theory applies to persistent scattering. Numerical simulations show quantitative agreement with our predictions. PMID:11607428
UHF (Ultra-High-Frequency) Propagation in Vegetative Media.
1980-04-01
...where k = 2π/λ is the wave number and the asterisk indicates complex conjugate. In order to obtain useful results for average values that are...easy to make an accurate estimation of the expected effects under one set of conditions on the basis of experimental observations carried out under... systems propagating horizontally through vegetation. The large quantity of measured data demonstrates the complex effects upon path loss of irregu...
Computationally efficient algorithm for high sampling-frequency operation of active noise control
NASA Astrophysics Data System (ADS)
Rout, Nirmal Kumar; Das, Debi Prasad; Panda, Ganapati
2015-05-01
In high sampling-frequency operation of an active noise control (ANC) system, the secondary path estimate and the ANC filter are very long. This increases the computational complexity of the conventional filtered-x least mean square (FXLMS) algorithm. To reduce the computational complexity of long-order ANC systems using the FXLMS algorithm, frequency-domain block ANC algorithms have been proposed in the past. These full-block frequency-domain ANC algorithms are associated with several disadvantages, such as large block delay, quantization error due to the computation of large-size transforms, and implementation difficulties on existing low-end DSP hardware. To overcome these shortcomings, a partitioned block ANC algorithm is newly proposed in which the long filters in ANC are divided into a number of equal partitions and suitably assembled to perform the FXLMS algorithm in the frequency domain. The complexity of this proposed frequency-domain partitioned block FXLMS (FPBFXLMS) algorithm is much reduced compared to the conventional FXLMS algorithm. It is further reduced by merging one fast Fourier transform (FFT)-inverse fast Fourier transform (IFFT) combination to derive the reduced-structure FPBFXLMS (RFPBFXLMS) algorithm. Computational complexity analyses for different filter orders and partition sizes are presented. Systematic computer simulations are carried out for both of the proposed partitioned block ANC algorithms to show their accuracy compared to the time-domain FXLMS algorithm.
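The partitioned frequency-domain variants are this paper's contribution; the sketch below shows only the baseline time-domain FXLMS loop that those variants accelerate. The identity primary path, the one-sample-delay secondary path, and an exact secondary-path estimate are all toy assumptions.

```python
import math

# Baseline time-domain FXLMS sketch: the reference is filtered through
# the secondary-path estimate before driving the LMS weight update.

def fxlms(noise, secondary, L=8, mu=0.01):
    w = [0.0] * L                      # adaptive ANC filter weights
    x_hist = [0.0] * L                 # reference-signal history
    fx_hist = [0.0] * L                # filtered-x history
    y_hist = [0.0] * len(secondary)    # anti-noise history
    errors = []
    for x in noise:
        x_hist = [x] + x_hist[:-1]
        y = sum(wi * xi for wi, xi in zip(w, x_hist))
        y_hist = [y] + y_hist[:-1]
        # residual = primary-path disturbance + anti-noise after secondary path
        e = x + sum(s * yi for s, yi in zip(secondary, y_hist))
        # reference filtered through the secondary-path estimate
        fx = sum(s * xi for s, xi in zip(secondary, x_hist))
        fx_hist = [fx] + fx_hist[:-1]
        w = [wi - mu * e * fxi for wi, fxi in zip(w, fx_hist)]
        errors.append(e)
    return errors

tone = [math.sin(0.3 * n) for n in range(4000)]
err = fxlms(tone, secondary=[0.0, 1.0])  # one-sample-delay secondary path
early = sum(abs(e) for e in err[:200])
late = sum(abs(e) for e in err[-200:])
assert late < 0.5 * early  # residual noise shrinks as the filter adapts
```

At high sampling rates both convolutions and the weight update grow with the filter length, which is exactly the per-sample cost the paper's FFT-based partitioned blocks amortize.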
Complex Event Extraction using DRUM
2015-10-01
...towards tackling these challenges. Figure 9. Evaluation results for eleven teams. The diamond ◆ represents the results of our system. The two topmost... Proceedings of the Joint SIGDAT Conference on Empirical Methods in Natural Language Processing and Very Large Corpora (EMNLP/VLC-2000). The UniProt
2005-07-01
vehicles to deliver mail or packages across town, in a large indoor complex or building, or even to deliver...drugs, marihuana, depressants or stimulants. Further, current licenses and certified personnel may have their
Towards the understanding of network information processing in biology
NASA Astrophysics Data System (ADS)
Singh, Vijay
Living organisms perform incredibly well in detecting a signal present in the environment. This information processing is achieved near optimally and quite reliably, even though the sources of signals are highly variable and complex. The work in the last few decades has given us a fair understanding of how individual signal processing units like neurons and cell receptors process signals, but the principles of collective information processing on biological networks are far from clear. Information processing in biological networks, like the brain, metabolic circuits, cellular-signaling circuits, etc., involves complex interactions among a large number of units (neurons, receptors). The combinatorially large number of states such a system can exist in makes it impossible to study these systems from first principles, starting from the interactions between the basic units. The principles of collective information processing on such complex networks can instead be identified using coarse-graining approaches. This could provide insights into the organization and function of complex biological networks. Here I study models of biological networks using continuum dynamics, renormalization, maximum likelihood estimation and information theory. Such coarse-graining approaches identify features that are essential for certain processes performed by underlying biological networks. We find that long-range connections in the brain allow for global-scale feature detection in a signal. These also suppress the noise and remove any gaps present in the signal. Hierarchical organization with long-range connections leads to large-scale connectivity at low synapse numbers. Time delays can be utilized to separate a mixture of signals with different temporal scales. Our observations indicate that the rules of multivariate signal processing are quite different from those of traditional single-unit signal processing.
Continuous probing of cold complex molecules with infrared frequency comb spectroscopy
NASA Astrophysics Data System (ADS)
Spaun, Ben; Changala, P. Bryan; Patterson, David; Bjork, Bryce J.; Heckl, Oliver H.; Doyle, John M.; Ye, Jun
2016-05-01
For more than half a century, high-resolution infrared spectroscopy has played a crucial role in probing molecular structure and dynamics. Such studies have so far been largely restricted to relatively small and simple systems, because at room temperature even molecules of modest size already occupy many millions of rotational/vibrational states, yielding highly congested spectra that are difficult to assign. Targeting more complex molecules requires methods that can record broadband infrared spectra (that is, spanning multiple vibrational bands) with both high resolution and high sensitivity. However, infrared spectroscopic techniques have hitherto been limited either by narrow bandwidth and long acquisition time, or by low sensitivity and resolution. Cavity-enhanced direct frequency comb spectroscopy (CE-DFCS) combines the inherent broad bandwidth and high resolution of an optical frequency comb with the high detection sensitivity provided by a high-finesse enhancement cavity, but it still suffers from spectral congestion. Here we show that this problem can be overcome by using buffer gas cooling to produce continuous, cold samples of molecules that are then subjected to CE-DFCS. This integration allows us to acquire a rotationally resolved direct absorption spectrum in the C-H stretching region of nitromethane, a model system that challenges our understanding of large-amplitude vibrational motion. We have also used this technique on several large organic molecules that are of fundamental spectroscopic and astrochemical relevance, including naphthalene, adamantane and hexamethylenetetramine. These findings establish the value of our approach for studying much larger and more complex molecules than have been probed so far, enabling complex molecules and their kinetics to be studied with orders-of-magnitude improvements in efficiency, spectral resolution and specificity.
Impact of delayed information in sub-second complex systems
NASA Astrophysics Data System (ADS)
Manrique, Pedro D.; Zheng, Minzhang; Johnson Restrepo, D. Dylan; Hui, Pak Ming; Johnson, Neil F.
What happens when you slow down the delivery of information in large-scale complex systems that operate faster than the blink of an eye? This question just adopted immediate commercial, legal and political importance following U.S. regulators' decision to allow an intentional 350-microsecond delay to be added in the ultrafast network of financial exchanges. However, there is still no scientific understanding available to policymakers of the potential system-wide impact of such delays. Here we take a first step in addressing this question using a minimal model of a population of competing, heterogeneous, adaptive agents which has previously been shown to produce similar statistical features to real markets. We find that while certain extreme system-level behaviors can be prevented by such delays, the duration of others is increased. This leads to a highly non-trivial relationship between delays and system-wide instabilities which warrants deeper empirical investigation. The generic nature of our model suggests there should be a fairly wide class of complex systems where such delay-driven extreme behaviors can arise, e.g. sub-second delays in brain function possibly impacting individuals' behavior, and sub-second delays in navigational systems potentially impacting the safety of driverless vehicles.
Integration of systems biology with organs-on-chips to humanize therapeutic development
NASA Astrophysics Data System (ADS)
Edington, Collin D.; Cirit, Murat; Chen, Wen Li Kelly; Clark, Amanda M.; Wells, Alan; Trumper, David L.; Griffith, Linda G.
2017-02-01
"Mice are not little people" - a refrain becoming louder as the gaps between animal models and human disease become more apparent. At the same time, three emerging approaches are headed toward integration: powerful systems biology analysis of cell-cell and intracellular signaling networks in patient-derived samples; 3D tissue engineered models of human organ systems, often made from stem cells; and micro-fluidic and meso-fluidic devices that enable living systems to be sustained, perturbed and analyzed for weeks in culture. Integration of these rapidly moving fields has the potential to revolutionize development of therapeutics for complex, chronic diseases, including those that have weak genetic bases and substantial contributions from gene-environment interactions. Technical challenges in modeling complex diseases with "organs on chips" approaches include the need for relatively large tissue masses and organ-organ cross talk to capture systemic effects, such that current microfluidic formats often fail to capture the required scale and complexity for interconnected systems. These constraints drive development of new strategies for designing in vitro models, including perfusing organ models, as well as "mesofluidic" pumping and circulation in platforms connecting several organ systems, to achieve the appropriate physiological relevance.
NASA Astrophysics Data System (ADS)
Plebe, Alice; Grasso, Giorgio
2016-12-01
This paper describes a system developed for the simulation of flames inside the open-source 3D computer graphics software Blender, with the aim of analyzing hazard scenarios in large-scale industrial plants in virtual reality. The advantages of Blender are its ability to render the very complex structure of large industrial plants at high resolution and its embedded physics engine based on smoothed particle hydrodynamics. This particle system is used to evolve a simulated fire. The interaction of this fire with the components of the plant is computed using polyhedron separation distance, adopting a Voronoi-based strategy that optimizes the number of feature distance computations. Results on a real oil and gas refinery are presented.
2016-01-01
Although heavy-tailed fluctuations are ubiquitous in complex systems, a good understanding of the mechanisms that generate them is still lacking. Optical complex systems are ideal candidates for investigating heavy-tailed fluctuations, as they allow recording large datasets under controllable experimental conditions. A dynamical regime that has attracted a lot of attention over the years is the so-called low-frequency fluctuations (LFFs) of semiconductor lasers with optical feedback. In this regime, the laser output intensity is characterized by abrupt and apparently random dropouts. The statistical analysis of the inter-dropout-intervals (IDIs) has provided many useful insights into the underlying dynamics. However, the presence of large temporal fluctuations in the IDI sequence has not yet been investigated. Here, by applying fluctuation analysis we show that the experimental distribution of IDI fluctuations is heavy-tailed, and specifically, is well-modeled by a non-Gaussian stable distribution. We find a good qualitative agreement with simulations of the Lang-Kobayashi model. Moreover, we uncover a transition from a less-heavy-tailed state at low pump current to a more-heavy-tailed state at higher pump current. Our results indicate that fluctuation analysis can be a useful tool for investigating the output signals of complex optical systems; it can be used for detecting underlying regime shifts, for model validation and parameter estimation. PMID:26901346
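The tail analysis described here can be sketched with a Hill estimator on synthetic data. The samples below are assumption-laden stand-ins for the experimental inter-dropout-interval fluctuations (a Pareto sample as the heavy-tailed case and a half-Gaussian as the light-tailed reference); the Lang-Kobayashi simulations themselves are out of scope.

```python
import random, math

# Sketch of heavy-tail detection: the Hill estimator returns a small
# tail index for power-law (stable-like) data and a large effective
# index for light-tailed Gaussian data.

def hill_index(data, k=100):
    """Hill estimator of the tail index from the k largest samples."""
    tail = sorted(data, reverse=True)[:k + 1]
    logs = [math.log(x) for x in tail]
    return 1.0 / (sum(logs[:k]) / k - logs[k])

rng = random.Random(42)
gauss = [abs(rng.gauss(0, 1)) + 1e-9 for _ in range(5000)]
# inverse-CDF sampling of a Pareto distribution with tail index 1.5
pareto = [(1 - rng.random()) ** (-1 / 1.5) for _ in range(5000)]

assert hill_index(pareto) < hill_index(gauss)
```

Run on a measured fluctuation sequence, a shift of the estimated index with pump current would be one way to detect the regime transition the abstract reports.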
Heuristic decomposition for non-hierarchic systems
NASA Technical Reports Server (NTRS)
Bloebaum, Christina L.; Hajela, P.
1991-01-01
Design and optimization is substantially more complex in multidisciplinary and large-scale engineering applications due to the existing inherently coupled interactions. The paper introduces a quasi-procedural methodology for multidisciplinary optimization that is applicable for nonhierarchic systems. The necessary decision-making support for the design process is provided by means of an embedded expert systems capability. The method employs a decomposition approach whose modularity allows for implementation of specialized methods for analysis and optimization within disciplines.
A Bridge to Coordination Isomer Selection in Lanthanide(III) DOTA-tetraamide Complexes
Vipond, Jeff; Woods, Mark; Zhao, Piyu; Tircso, Gyula; Ren, Jimin; Bott, Simon G.; Ogrin, Doug; Kiefer, Garry E.; Kovacs, Zoltan; Sherry, A. Dean
2008-01-01
Interest in macrocyclic lanthanide complexes such as DOTA is driven largely by their use as contrast agents for MRI. The lanthanide tetraamide derivatives of DOTA have shown considerable promise as PARACEST agents, taking advantage of the slow water exchange kinetics of this class of complex. We postulated that water exchange in these tetraamide complexes could be slowed even further by introducing a group to sterically encumber the space above the water coordination site, thereby hindering the departure and approach of water molecules to the complex. The ligand 8O2-bridged-DOTAM was synthesized in a 34% yield from cyclen. It was found that the lanthanide complexes of this ligand did not possess a water molecule in the inner coordination sphere of the bound lanthanide. The crystal structure of the ytterbium complex revealed that distortions to the coordination sphere were induced by the steric constraints imposed on the complex by the bridging unit. The extent of the distortion was found to increase with increasing ionic radius of the lanthanide ion, eventually resulting in a complete loss of symmetry in the complex. Because this ligand system is bicyclic, the conformation of each ring in the system is constrained by that of the other; in consequence, inclusion of the bridging unit means that only a twisted square antiprismatic coordination geometry is observed for complexes of 8O2-bridged-DOTAM. PMID:17295475
Bacterial community changes in an industrial algae production system.
Fulbright, Scott P; Robbins-Pianka, Adam; Berg-Lyons, Donna; Knight, Rob; Reardon, Kenneth F; Chisholm, Stephen T
2018-04-01
While microalgae are a promising feedstock for production of fuels and other chemicals, a challenge for the algal bioproducts industry is obtaining consistent, robust algae growth. Algal cultures include complex bacterial communities and can be difficult to manage because specific bacteria can promote or reduce algae growth. To overcome bacterial contamination, algae growers may use closed photobioreactors designed to reduce the number of contaminant organisms. Even with closed systems, bacteria are known to enter and cohabitate, but little is known about these communities. Therefore, the richness, structure, and composition of bacterial communities were characterized in closed photobioreactor cultivations of Nannochloropsis salina in F/2 medium at different scales, across nine months spanning late summer-early spring, and during a sequence of serially inoculated cultivations. Using 16S rRNA sequence data from 275 samples, bacterial communities in small, medium, and large cultures were shown to be significantly different. Larger systems contained richer bacterial communities compared to smaller systems. Relationships between bacterial communities and algae growth were complex. On one hand, blooms of a specific bacterial type were observed in three abnormal, poorly performing replicate cultivations, while on the other, notable changes in the bacterial community structures were observed in a series of serial large-scale batch cultivations that had similar growth rates. Bacteria common to the majority of samples were identified, including a single OTU within the class Saprospirae that was found in all samples. This study contributes important information for crop protection in algae systems, and demonstrates the complex ecosystems that need to be understood for consistent, successful industrial algae cultivation. This is the first study to profile bacterial communities during the scale-up process of industrial algae systems.
NASA Astrophysics Data System (ADS)
Kang, L.; Lin, J.; Liu, C.; Zhou, H.; Ren, T.; Yao, Y.
2017-12-01
A new frequency-domain AEM system with a grounded electric source, called the ground-airborne frequency-domain electromagnetic (GAFEM) system, was proposed to extend penetration depth without compromising resolution and detection efficiency. In the GAFEM system, an electric source is placed on the ground to increase the strength of the response signals. A UAV was chosen as the aircraft to reduce interaction noise and improve the system's ability to adapt to complex terrain. A multi-source, multi-frequency emission method was developed and applied to improve the efficiency of the GAFEM system. A 2^n pseudorandom sequence was introduced as the transmitting waveform to ensure resolution and detection efficiency. An inversion procedure based on a full-space apparent resistivity formula was built to realize the GAFEM method and extend the survey area to the non-far field. Based on the GAFEM system, two field applications were conducted in Changchun, China, to map deep conductive structures. As shown by the exploration results, the GAFEM system is effective for conductive structures, reaching a depth of about 1 km with a source-receiver distance of over 6 km. It achieves the same level of resolution as the CSAMT method with more than 10 times the efficiency. This extends the method to important applications where the terrain is too complex to be accessed or a large penetration depth is required over a large survey area.
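The "2n pseudorandom sequence" in this record presumably denotes a maximal-length binary sequence of period 2^n - 1, which can be generated with a linear-feedback shift register. The 5-bit register and taps [5, 3] below (the primitive polynomial x^5 + x^3 + 1) are illustrative choices, not the system's actual transmit parameters.

```python
# Sketch of a maximal-length (2^n - 1) pseudorandom transmit sequence,
# generated by a Fibonacci linear-feedback shift register.

def m_sequence(taps=(5, 3), nbits=5):
    state = [1] * nbits            # any nonzero seed works
    out = []
    for _ in range(2 ** nbits - 1):
        out.append(state[-1])      # output the last stage
        fb = 0
        for t in taps:
            fb ^= state[t - 1]     # XOR the tapped stages
        state = [fb] + state[:-1]  # shift, feeding back into stage 1
    return out

seq = m_sequence()
assert len(seq) == 31
# A maximal-length LFSR visits every nonzero state exactly once per period,
# so over one period the ones outnumber the zeros by exactly one.
assert sum(seq) == 16
```

The flat, comb-like spectrum of such a sequence is what lets a single transmission carry energy at many frequencies at once, which is the efficiency argument the abstract makes.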
Topological Principles of Control in Dynamical Networks
NASA Astrophysics Data System (ADS)
Kim, Jason; Pasqualetti, Fabio; Bassett, Danielle
Networked biological systems, such as the brain, feature complex patterns of interactions. To predict and correct the dynamic behavior of such systems, it is imperative to understand how the underlying topological structure affects and limits the function of the system. Here, we use network control theory to extract topological features that favor or prevent network controllability, and to understand the network-wide effect of external stimuli on large-scale brain systems. Specifically, we treat each brain region as a dynamic entity with real-valued state, and model the time evolution of all interconnected regions using linear, time-invariant dynamics. We propose a simplified feed-forward scheme where the effect of upstream regions (drivers) on the connected downstream regions (non-drivers) is characterized in closed-form. Leveraging this characterization of the simplified model, we derive topological features that predict the controllability properties of non-simplified networks. We show analytically and numerically that these predictors are accurate across a large range of parameters. Among other contributions, our analysis shows that heterogeneity in the network weights facilitates controllability, and allows us to implement targeted interventions that profoundly improve controllability. By assuming an underlying dynamical mechanism, we are able to understand the complex topology of networked biological systems in a functionally meaningful way.
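The controllability notion used in this line of work can be made concrete with the classical Kalman rank condition for linear time-invariant dynamics x' = Ax + Bu: the pair (A, B) is controllable iff [B, AB, ..., A^(n-1)B] has full rank. The 3-node chain and unit weights below are hypothetical, chosen only to show how driver placement decides controllability.

```python
# Kalman rank test for controllability of x' = Ax + Bu, in pure Python.

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def rank(M, eps=1e-9):
    """Rank via Gauss-Jordan elimination with partial pivoting."""
    M = [row[:] for row in M]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if abs(M[i][c]) > eps), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and abs(M[i][c]) > eps:
                f = M[i][c] / M[r][c]
                M[i] = [x - f * y for x, y in zip(M[i], M[r])]
        r += 1
    return r

def controllable(A, B):
    n = len(A)
    K = [row[:] for row in B]      # controllability matrix, built column-block-wise
    P = [row[:] for row in B]
    for _ in range(n - 1):
        P = matmul(A, P)
        K = [K[i] + P[i] for i in range(n)]
    return rank(K) == n

# A 3-node chain 1 -> 2 -> 3 driven at the upstream node is controllable...
A = [[0, 0, 0], [1, 0, 0], [0, 1, 0]]
B = [[1], [0], [0]]
assert controllable(A, B)
# ...but driving the terminal node instead is not.
assert not controllable(A, [[0], [0], [1]])
```

This tiny example mirrors the paper's driver/non-driver framing: a stimulus injected upstream propagates along the feed-forward structure, while one injected at a sink cannot steer the rest of the network.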
Automated dynamic analytical model improvement for damped structures
NASA Technical Reports Server (NTRS)
Fuh, J. S.; Berman, A.
1985-01-01
A method is described to improve a linear, nonproportionally damped analytical model of a structure. The procedure finds the smallest changes in the analytical model such that the improved model matches the measured modal parameters. Features of the method are: (1) the ability to properly treat the complex-valued modal parameters of a damped system; (2) applicability to realistically large structural models; and (3) computational efficiency, achieved without eigensolutions or inversion of a large matrix.
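On the smallest possible example, matching a measured complex-valued modal parameter without eigensolutions or large-matrix inversion reduces to algebra. The sketch below updates the damping and stiffness of a hypothetical one-degree-of-freedom model so that its characteristic polynomial has a measured complex root; the paper's method handles realistically large nonproportionally damped models, so this only illustrates the matching idea.

```python
# One-DOF sketch: choose c and k so that m*lam**2 + c*lam + k = 0 holds
# for a measured complex eigenvalue lam = sigma + i*omega. Matching the
# real and imaginary parts gives c and k in closed form.

def updated_parameters(m, lam):
    sigma, omega = lam.real, lam.imag
    c = -2.0 * m * sigma               # from the imaginary part
    k = m * (sigma ** 2 + omega ** 2)  # from the real part
    return c, k

m = 2.0                                # assumed known mass
measured = complex(-0.5, 4.0)          # hypothetical measured mode
c, k = updated_parameters(m, measured)
residual = m * measured ** 2 + c * measured + k
assert abs(residual) < 1e-12           # the updated model reproduces the mode
```

In the full method the analogous step is posed as a minimum-norm change to the system matrices subject to the measured modes, which is what keeps the update both small and computable without an eigensolver.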