Sample records for complex distributed systems

  1. Multi-agent based control of large-scale complex systems employing distributed dynamic inference engine

    NASA Astrophysics Data System (ADS)

    Zhang, Daili

    Increasing societal demand for automation has led to considerable efforts to control large-scale complex systems, especially in the area of autonomous intelligent control methods. The control system of a large-scale complex system needs to satisfy four system-level requirements: robustness, flexibility, reusability, and scalability. Corresponding to these four system-level requirements, four major challenges arise. First, it is difficult to get accurate and complete information. Second, the system may be physically highly distributed. Third, the system evolves very quickly. Fourth, emergent global behaviors of the system can be caused by small disturbances at the component level. The Multi-Agent Based Control (MABC) method, as an implementation of distributed intelligent control, has been a focus of research since the 1970s, in an effort to solve the above-mentioned problems in controlling large-scale complex systems. However, to the author's best knowledge, all MABC systems for large-scale complex systems with significant uncertainties are problem-specific and thus difficult to extend to other domains or larger systems. This situation is partly due to the control architecture of multiple agents being determined by agent-to-agent coupling and interaction mechanisms. Therefore, the research objective of this dissertation is to develop a comprehensive, generalized framework for the control system design of general large-scale complex systems with significant uncertainties, with a focus on distributed control architecture design and distributed inference engine design. A Hybrid Multi-Agent Based Control (HyMABC) architecture is proposed by combining hierarchical control architecture and module control architecture with logical replication rings.
    First, it decomposes a complex system hierarchically; second, it combines the components at the same level into a module and then designs common interfaces for all of the components in the same module; third, replications are made for critical agents and are organized into logical rings. This architecture maintains clear guidelines for complexity decomposition and also increases the robustness of the whole system. Multiple Sectioned Dynamic Bayesian Networks (MSDBNs), as a distributed dynamic probabilistic inference engine, can be embedded into the control architecture to handle the uncertainties of general large-scale complex systems. MSDBNs decompose a large knowledge-based system into many agents. Each agent holds its partial perspective of a large problem domain by representing its knowledge as a Dynamic Bayesian Network (DBN). Each agent accesses local evidence from its corresponding local sensors and communicates with other agents through finite message passing. If the distributed agents can be organized into a tree structure satisfying the running intersection property and d-sep set requirements, globally consistent inferences are achievable in a distributed way. By using different frequencies for local DBN agent belief updating and global system belief updating, the approach balances communication cost against the global consistency of inferences. In this dissertation, a fully factorized Boyen-Koller (BK) approximation algorithm is used for local DBN agent belief updating, and the static Junction Forest Linkage Tree (JFLT) algorithm is used for global system belief updating. MSDBNs assume a static structure and a stable communication network for the whole system. However, in a real system, sub-Bayesian networks acting as nodes could be lost, and the communication network could be shut down due to partial damage in the system.
    Therefore, on-line and automatic MSDBNs structure formation is necessary for making robust state estimations and increasing the survivability of the whole system. A Distributed Spanning Tree Optimization (DSTO) algorithm, a Distributed D-Sep Set Satisfaction (DDSSS) algorithm, and a Distributed Running Intersection Satisfaction (DRIS) algorithm are proposed in this dissertation. Combining these three distributed algorithms with a Distributed Belief Propagation (DBP) algorithm in MSDBNs makes state estimations robust to partial damage in the whole system. Combining the distributed control architecture design and the distributed inference engine design leads to a process of control system design for a general large-scale complex system. As applications of the proposed methodology, the control system designs of a simplified ship chilled water system and a notional ship chilled water system have been demonstrated step by step. Simulation results not only show that the proposed methodology gives a clear guideline for control system design for general large-scale complex systems in dynamic and uncertain environments, but also indicate that the combination of MSDBNs and HyMABC can provide excellent performance for controlling general large-scale complex systems.
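
    As a rough illustration of the fully factorized Boyen-Koller step used for local belief updating, the sketch below maintains per-variable marginals, forms the factored joint (the BK approximation), propagates one time step through a user-supplied transition model, conditions on evidence, and projects back to marginals. The `bk_step` interface, variable names, and dictionary-based representation are illustrative assumptions, not the dissertation's MSDBN code.

```python
import itertools

def bk_step(marginals, trans, evidence_lik):
    """One fully factorized Boyen-Koller update.
    marginals: dict var -> {value: prob}; trans: maps a full previous
    state tuple to a dict over next-state tuples (assumed to stay in the
    same state space); evidence_lik: dict state tuple -> likelihood."""
    vars_ = sorted(marginals)
    states = list(itertools.product(*[sorted(marginals[v]) for v in vars_]))
    # joint belief reconstructed as a product of marginals (the approximation)
    joint = {}
    for s in states:
        p = 1.0
        for v, val in zip(vars_, s):
            p *= marginals[v][val]
        joint[s] = p
    # predict one step and weight by evidence likelihood
    post = {s: 0.0 for s in states}
    for s, p in joint.items():
        for s2, tp in trans(s).items():
            post[s2] += p * tp * evidence_lik.get(s2, 1.0)
    z = sum(post.values())
    post = {s: p / z for s, p in post.items()}
    # project the posterior back onto per-variable marginals
    new = {v: {} for v in vars_}
    for s, p in post.items():
        for v, val in zip(vars_, s):
            new[v][val] = new[v].get(val, 0.0) + p
    return new
```

With an identity transition and no evidence, the factored belief is a fixed point of the update, which is a quick sanity check on the projection step.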

  2. Modeling complexity in engineered infrastructure system: Water distribution network as an example

    NASA Astrophysics Data System (ADS)

    Zeng, Fang; Li, Xiang; Li, Ke

    2017-02-01

    The complex topology and adaptive behavior of infrastructure systems are driven by both self-organization of the demand and rigid engineering solutions. Engineering complex systems therefore requires a method balancing holism and reductionism. To model the growth of water distribution networks, a complex network model was developed combining local optimization rules with engineering considerations. Demand node generation is dynamic and follows the scaling law of urban growth. The proposed model can generate a water distribution network (WDN) similar to reported real-world WDNs in several structural properties. Comparison with different modeling approaches indicates that a realistic demand node distribution and the co-evolution of demand nodes and network are important for simulating real complex networks. The simulation results indicate that the efficiency of water distribution networks is exponentially affected by the urban growth pattern. In contrast, the improvement in efficiency achievable through engineering optimization is limited and relatively insignificant. Redundancy and robustness, on the other hand, can be significantly improved through engineering methods.
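
    A toy version of such demand-driven growth with a local pipe-cost rule can be sketched as follows. Node placement, the loop probability, and all parameters are invented for illustration; the paper's actual model additionally scales demand generation with urban growth.

```python
import math
import random

def grow_wdn(n_nodes, loop_prob=0.1, seed=42):
    """Toy water-distribution-network growth: demand nodes appear one at
    a time and connect to the nearest existing node (a local optimization
    rule minimizing pipe length); occasional extra edges add engineered
    redundancy (loops). Returns (node positions, edge list)."""
    rng = random.Random(seed)
    nodes = [(rng.random(), rng.random()) for _ in range(n_nodes)]
    edges = []
    for i in range(1, n_nodes):
        # local rule: cheapest (shortest) pipe to the existing network
        j = min(range(i), key=lambda k: math.dist(nodes[i], nodes[k]))
        edges.append((i, j))
        # engineering consideration: occasionally close a loop for redundancy
        if i > 1 and rng.random() < loop_prob:
            k = rng.randrange(i)
            if k != j:
                edges.append((i, k))
    return nodes, edges
```

Because every new node attaches to the existing network, the result is connected by construction; the loop edges are what distinguish an engineered WDN from a pure tree.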

  3. Enabling Requirements-Based Programming for Highly-Dependable Complex Parallel and Distributed Systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    The manual application of formal methods in system specification has produced successes, but in the end, despite any claims and assertions by practitioners, there is no provable relationship between a manually derived system specification or formal model and the customer's original requirements. Complex parallel and distributed systems present the worst-case implications of today's dearth of viable approaches for achieving system dependability. No avenue other than formal methods constitutes a serious contender for resolving the problem, and so recognition of requirements-based programming has come at a critical juncture. We describe a new, NASA-developed automated requirements-based programming method that can be applied to certain classes of systems, including complex parallel and distributed systems, to achieve a high degree of dependability.

  4. Generalised Central Limit Theorems for Growth Rate Distribution of Complex Systems

    NASA Astrophysics Data System (ADS)

    Takayasu, Misako; Watanabe, Hayafumi; Takayasu, Hideki

    2014-04-01

    We introduce a solvable model of randomly growing systems consisting of many independent subunits. Scaling relations and growth rate distributions in the limit of infinite subunits are analysed theoretically. Various types of scaling properties and distributions reported for growth rates of complex systems in a variety of fields can be derived from this basic physical model. Statistical data of growth rates for about 1 million business firms are analysed as a real-world example of randomly growing systems. Not only are the scaling relations consistent with the theoretical solution, but the entire functional form of the growth rate distribution is fitted with a theoretical distribution that has a power-law tail.

  5. Simulating the Daylight Performance of Complex Fenestration Systems Using Bidirectional Scattering Distribution Functions within Radiance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ward, Gregory; Mistrick, Ph.D., Richard; Lee, Eleanor

    2011-01-21

    We describe two methods which rely on bidirectional scattering distribution functions (BSDFs) to model the daylighting performance of complex fenestration systems (CFS), enabling greater flexibility and accuracy in evaluating arbitrary assemblies of glazing, shading, and other optically-complex coplanar window systems. Two tools within Radiance enable a) efficient annual performance evaluations of CFS, and b) accurate renderings of CFS despite the loss of spatial resolution associated with low-resolution BSDF datasets for inhomogeneous systems. Validation, accuracy, and limitations of the methods are discussed.

  6. Networked buffering: a basic mechanism for distributed robustness in complex adaptive systems.

    PubMed

    Whitacre, James M; Bender, Axel

    2010-06-15

    A generic mechanism, networked buffering, is proposed for the generation of robust traits in complex systems. It requires two basic conditions to be satisfied: 1) agents are versatile enough to perform more than one functional role within a system, and 2) agents are degenerate, i.e. there exists partial overlap in the functional capabilities of agents. Given these prerequisites, degenerate systems can readily produce a distributed systemic response to local perturbations. Reciprocally, excess resources related to a single function can indirectly support multiple unrelated functions within a degenerate system. In models of genome:proteome mappings for which localized decision-making and modularity of genetic functions are assumed, we verify that such distributed compensatory effects cause enhanced robustness of system traits. The conditions needed for networked buffering to occur are neither demanding nor rare, supporting the conjecture that degeneracy may fundamentally underpin distributed robustness within several biotic and abiotic systems. For instance, networked buffering offers new insights into systems engineering and planning activities that occur under high uncertainty. It may also help explain recent developments in understanding the origins of resilience within complex ecosystems.
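
    The buffering argument can be made concrete with a toy assignment model: specialized agents cover a nominal demand but fail under a local perturbation (a demand shift), while degenerate agents with overlapping capabilities absorb it. The task names, agent counts, and greedy assignment rule below are invented for illustration, not the paper's genome:proteome model.

```python
def coverage(agents, demand):
    """Greedily assign each agent (a set of task types it can perform) to
    one unit of unmet demand; returns the number of demand units covered.
    Less versatile agents are placed first so versatile ones buffer the rest."""
    remaining = dict(demand)
    covered = 0
    for caps in sorted(agents, key=len):
        # a versatile agent serves whichever of its tasks is most backlogged
        for t in sorted(caps, key=lambda t: -remaining.get(t, 0)):
            if remaining.get(t, 0) > 0:
                remaining[t] -= 1
                covered += 1
                break
    return covered

specialized = [{'A'}] * 5 + [{'B'}] * 5   # each agent does one function
degenerate  = [{'A', 'B'}] * 10           # partial overlap: either function
normal  = {'A': 5, 'B': 5}                # nominal demand
shifted = {'A': 8, 'B': 2}                # local perturbation
```

Both populations cover the nominal demand, but only the degenerate one fully covers the shifted demand: its slack B-capacity is redeployed to A, which is the networked-buffering effect in miniature.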

  7. Young Children's Playfully Complex Communication: Distributed Imagination

    ERIC Educational Resources Information Center

    Alcock, Sophie

    2010-01-01

    This paper draws on research exploring young children's playful and humorous communication. It explores how playful activity mediates and connects children in complex activity systems where imagination, cognition, and consciousness become distributed across individuals. Children's playfulness is mediated and distributed via artefacts (tools, signs…

  8. A Metrics-Based Approach to Intrusion Detection System Evaluation for Distributed Real-Time Systems

    DTIC Science & Technology

    2002-04-01

    ...Based Approach to Intrusion Detection System Evaluation for Distributed Real-Time Systems. Authors: G. A. Fink, B. L. Chappell, T. G. Turner, and ... Distributed, Security. 1 Introduction. Processing and cost requirements are driving future naval combat platforms to use distributed, real-time systems of ... distributed, real-time systems. As these systems grow more complex, the timing requirements do not diminish; indeed, they may become more constrained

  9. Empirical Analysis of Optical Attenuator Performance in Quantum Key Distribution Systems Using a Particle Model

    DTIC Science & Technology

    2012-03-01

    EMPIRICAL ANALYSIS OF OPTICAL ATTENUATOR PERFORMANCE IN QUANTUM KEY DISTRIBUTION SYSTEMS USING A ... DISTRIBUTION IS UNLIMITED. AFIT/GCS/ENG/12-01 ... challenging as the complexity of actual implementation specifics is considered. Two components common to most quantum key distribution

  10. 47 CFR 25.103 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...) Communication-satellite earth station complex. The term communication-satellite earth station complex includes transmitters, receivers, and communications antennas at the earth station site together with the... communication to terrestrial distribution system(s). (e) Communication-satellite earth station complex functions...

  11. 47 CFR 25.103 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...) Communication-satellite earth station complex. The term communication-satellite earth station complex includes transmitters, receivers, and communications antennas at the earth station site together with the... communication to terrestrial distribution system(s). (e) Communication-satellite earth station complex functions...

  12. A new approach to interpretation of heterogeneity of fluorescence decay in complex biological systems

    NASA Astrophysics Data System (ADS)

    Wlodarczyk, Jakub; Kierdaszuk, Borys

    2005-08-01

    Decays of tyrosine fluorescence in protein-ligand complexes are described by a model of a continuous distribution of fluorescence lifetimes. The resulting analytical power-like decay function provides good fits to highly complex fluorescence kinetics. Moreover, it is a manifestation of the so-called Tsallis q-exponential function, which is suitable for describing systems with long-range interactions, memory effects, and fluctuations of the characteristic fluorescence lifetime. The proposed decay functions were applied to the analysis of fluorescence decays of tyrosine in a protein, i.e. the enzyme purine nucleoside phosphorylase from E. coli (the product of the deoD gene), free in aqueous solution and in a complex with formycin A (an inhibitor) and orthophosphate (a co-substrate). The power-like function provides new information about enzyme-ligand complex formation based on a physically justified heterogeneity parameter directly related to the lifetime distribution. A measure of the heterogeneity parameter in the enzyme systems is provided by the variance of the fluorescence lifetime distribution. The possible number of deactivation channels and the excited-state mean lifetime can be easily derived without a priori knowledge of the complexity of the studied system. Moreover, the proposed model is simpler than the traditional multi-exponential one and better describes the heterogeneous nature of the studied systems.
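
    One common parameterization of the Tsallis q-exponential decay, f(t) = (1 + (q-1) t/tau)^(-1/(q-1)), reduces to a single exponential as q approaches 1 and develops a power-law tail for q > 1. This is an illustrative form, not necessarily the exact function fitted by the authors.

```python
import math

def q_exp_decay(t, tau, q):
    """Tsallis q-exponential decay.
    For q == 1 this is exp(-t/tau); for q > 1 the tail falls off as a
    power law, t**(-1/(q-1)), which is what fits heterogeneous kinetics."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(-t / tau)
    return (1.0 + (q - 1.0) * t / tau) ** (-1.0 / (q - 1.0))
```

The heavier-than-exponential tail is the signature of lifetime heterogeneity: at long times the q > 1 curve lies far above the pure exponential.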

  13. First Experiences Using XACML for Access Control in Distributed Systems

    NASA Technical Reports Server (NTRS)

    Lorch, Marcus; Proctor, Seth; Lepro, Rebekah; Kafura, Dennis; Shah, Sumit

    2003-01-01

    Authorization systems today are increasingly complex. They span domains of administration, rely on many different authentication sources, and manage permissions that can be as complex as the system itself. Worse still, while there are many standards that define authentication mechanisms, the standards that address authorization are less well defined and tend to work only within homogeneous systems. This paper presents XACML, a standard access control language, as one component of a distributed and inter-operable authorization framework. Several emerging systems which incorporate XACML are discussed. These discussions illustrate how authorization can be deployed in distributed, decentralized systems. Finally, some new and future topics are presented to show where this work is heading and how it will help connect the general components of an authorization system.
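
    XACML's rule-combining logic can be illustrated with a minimal sketch of the standard deny-overrides algorithm. This is a simplification for illustration: real XACML also handles Indeterminate results, obligations, and richer target matching, and the example policy below is hypothetical.

```python
def deny_overrides(decisions):
    """XACML-style deny-overrides rule combining: any Deny wins,
    otherwise Permit if at least one rule permits, else NotApplicable."""
    if "Deny" in decisions:
        return "Deny"
    if "Permit" in decisions:
        return "Permit"
    return "NotApplicable"

def evaluate(rules, request):
    """rules: list of (predicate, effect) pairs standing in for XACML
    <Rule> elements; a rule applies when its predicate (its Target or
    Condition) matches the request attributes."""
    applicable = [effect for predicate, effect in rules if predicate(request)]
    return deny_overrides(applicable)

# hypothetical policy: admins may access anything; secrets are denied to others
rules = [
    (lambda r: r.get("role") == "admin", "Permit"),
    (lambda r: r.get("resource") == "secret" and r.get("role") != "admin", "Deny"),
]
```

Because combining happens over attribute-based predicates rather than identities, the same policy can be evaluated consistently by decentralized authorization points.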

  14. Methods and tools for profiling and control of distributed systems

    NASA Astrophysics Data System (ADS)

    Sukharev, R.; Lukyanchikov, O.; Nikulchev, E.; Biryukov, D.; Ryadchikov, I.

    2018-02-01

    This article is devoted to the profiling and control of distributed systems. Distributed systems have a complex architecture: applications are distributed among various computing nodes, and many network operations are performed. It is therefore important to develop methods and tools for profiling such systems. The article analyzes and standardizes methods for profiling distributed systems that rely on simulation to conduct experiments and build a graph model of the system. Queueing network theory is used for simulation modeling of distributed systems that receive and process user requests. To automate this profiling method, a software application with a modular structure, similar to a SCADA system, was developed.
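
    Queueing-network profiling of this kind typically starts from closed-form single-node results. The sketch below gives the textbook M/M/1 formulas for one node receiving user requests; it is a generic building block, not the authors' tool.

```python
def mm1_metrics(arrival_rate, service_rate):
    """Closed-form M/M/1 queue metrics: utilization, mean number of jobs
    in the system, and mean response time (related by Little's law).
    Stability requires arrival_rate < service_rate."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: arrival_rate must be < service_rate")
    rho = arrival_rate / service_rate
    return {
        "utilization": rho,
        "jobs": rho / (1.0 - rho),                        # L = rho/(1-rho)
        "response_time": 1.0 / (service_rate - arrival_rate),  # W = 1/(mu-lambda)
    }
```

In a network model these per-node results are composed over the request-routing graph, which is where the simulation experiments come in.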

  15. An Environment for Incremental Development of Distributed Extensible Asynchronous Real-time Systems

    NASA Technical Reports Server (NTRS)

    Ames, Charles K.; Burleigh, Scott; Briggs, Hugh C.; Auernheimer, Brent

    1996-01-01

    Incremental parallel development of distributed real-time systems is difficult. Architectural techniques and software tools developed at the Jet Propulsion Laboratory's (JPL's) Flight System Testbed make feasible the integration of complex systems in various stages of development.

  16. Complexation behavior of oppositely charged polyelectrolytes: Effect of charge distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Mingtian; Li, Baohui, E-mail: dliang@pku.edu.cn, E-mail: baohui@nankai.edu.cn; Zhou, Jihan

    Complexation behavior of oppositely charged polyelectrolytes in a solution is investigated using a combination of computer simulations and experiments, focusing on the influence of polyelectrolyte charge distributions along the chains on the structure of the polyelectrolyte complexes. The simulations are performed using Monte Carlo with the replica-exchange algorithm for three model systems where each system is composed of a mixture of two types of oppositely charged model polyelectrolyte chains (EGEG)5/(KGKG)5, (EEGG)5/(KKGG)5, and (EEGG)5/(KGKG)5, in a solution including explicit solvent molecules. Among the three model systems, only the charge distributions along the chains are not identical. Thermodynamic quantities are calculated as a function of temperature (or ionic strength), and the microscopic structures of complexes are examined. It is found that the three systems have different transition temperatures, and form complexes with different sizes, structures, and densities at a given temperature. Complex microscopic structures with an alternating arrangement of one monolayer of E/K monomers and one monolayer of G monomers, with one bilayer of E and K monomers and one bilayer of G monomers, and with a mixture of monolayer and bilayer of E/K monomers in a box shape and a trilayer of G monomers inside the box are obtained for the three mixture systems, respectively. The experiments are carried out for three systems where each is composed of a mixture of two types of oppositely charged peptide chains. Each peptide chain is composed of Lysine (K) and glycine (G) or glutamate (E) and G, in solution, and the chain length and amino acid sequences, and hence the charge distribution, are precisely controlled, and all of them are identical with those for the corresponding model chain. The complexation behavior and complex structures are characterized through laser light scattering and atomic force microscopy measurements. The order of the apparent weight-averaged molar mass and the order of density of complexes observed from the three experimental systems are qualitatively in agreement with those predicted from the simulations.
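
    The replica-exchange step mentioned above hinges on a simple Metropolis acceptance rule for swapping configurations between two inverse temperatures. A generic sketch (not the authors' simulation code; the function names are illustrative):

```python
import math
import random

def swap_probability(beta_i, beta_j, energy_i, energy_j):
    """Metropolis acceptance probability for exchanging the configurations
    of two replicas held at inverse temperatures beta_i and beta_j:
    min(1, exp((beta_i - beta_j) * (energy_i - energy_j)))."""
    return min(1.0, math.exp((beta_i - beta_j) * (energy_i - energy_j)))

def swap_accept(beta_i, beta_j, energy_i, energy_j, rng=random):
    """Stochastically accept or reject the replica swap."""
    return rng.random() < swap_probability(beta_i, beta_j, energy_i, energy_j)
```

Swaps that move high-energy configurations to hotter replicas are always accepted, which is what lets the sampler cross the free-energy barriers between complex structures.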

  17. Control of complex dynamics and chaos in distributed parameter systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chakravarti, S.; Marek, M.; Ray, W.H.

    This paper discusses a methodology for controlling complex dynamics and chaos in distributed parameter systems. The reaction-diffusion system with Brusselator kinetics, where the torus-doubling or quasi-periodic (two characteristic incommensurate frequencies) route to chaos exists in a defined range of parameter values, is used as an example. Poincare maps are used for characterization of quasi-periodic and chaotic attractors. The dominant modes or topos, which are inherent properties of the system, are identified by means of the Singular Value Decomposition. Tests of modal feedback control schemes based on the identified dominant spatial modes confirm the possibility of stabilizing simple quasi-periodic trajectories within the complex quasi-periodic or chaotic spatiotemporal patterns.
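
    Dominant spatial modes of a snapshot matrix are the leading left singular vectors of its SVD. As a minimal stand-in for a full SVD routine, the sketch below extracts the leading mode by power iteration on A A^T using plain lists; the data layout and parameters are illustrative.

```python
import random

def dominant_mode(snapshots, iters=200, seed=0):
    """Leading spatial mode (direction of the first left singular vector)
    of a snapshot matrix whose columns are the given spatial snapshots,
    found by power iteration on A A^T."""
    m = len(snapshots[0])                      # number of spatial points
    rng = random.Random(seed)
    v = [rng.random() for _ in range(m)]       # random initial direction
    for _ in range(iters):
        # w = A A^T v, accumulated snapshot by snapshot
        w = [0.0] * m
        for s in snapshots:
            c = sum(si * vi for si, vi in zip(s, v))
            for k in range(m):
                w[k] += c * s[k]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]              # renormalize each iteration
    return v
```

Projecting the state onto a few such modes is what reduces the distributed-parameter control problem to a low-dimensional modal feedback design.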

  18. Fluid Intelligence Predicts Novel Rule Implementation in a Distributed Frontoparietal Control Network.

    PubMed

    Tschentscher, Nadja; Mitchell, Daniel; Duncan, John

    2017-05-03

    Fluid intelligence has been associated with a distributed cognitive control or multiple-demand (MD) network, comprising regions of lateral frontal, insular, dorsomedial frontal, and parietal cortex. Human fluid intelligence is also intimately linked to task complexity, and the process of solving complex problems in a sequence of simpler, more focused parts. Here, a complex target detection task included multiple independent rules, applied one at a time in successive task epochs. Although only one rule was applied at a time, increasing task complexity (i.e., the number of rules) impaired performance in participants of lower fluid intelligence. Accompanying this loss of performance was reduced response to rule-critical events across the distributed MD network. The results link fluid intelligence and MD function to a process of attentional focus on the successive parts of complex behavior. SIGNIFICANCE STATEMENT Fluid intelligence is intimately linked to the ability to structure complex problems in a sequence of simpler, more focused parts. We examine the basis for this link in the functions of a distributed frontoparietal or multiple-demand (MD) network. With increased task complexity, participants of lower fluid intelligence showed reduced responses to task-critical events. Reduced responses in the MD system were accompanied by impaired behavioral performance. Low fluid intelligence is linked to poor foregrounding of task-critical information across a distributed MD system. Copyright © 2017 Tschentscher et al.

  19. High-resolution imaging using a wideband MIMO radar system with two distributed arrays.

    PubMed

    Wang, Dang-wei; Ma, Xiao-yan; Chen, A-Lei; Su, Yi

    2010-05-01

    Imaging a fast maneuvering target has been an active research area in past decades. Usually, an array antenna with multiple elements is implemented to avoid the motion compensation involved in inverse synthetic aperture radar (ISAR) imaging. Nevertheless, this comes at the price of high hardware complexity, in contrast to the complex algorithms used in single-antenna ISAR imaging systems. In this paper, a wideband multiple-input multiple-output (MIMO) radar system with two distributed arrays is proposed to reduce the hardware complexity of the system. Furthermore, the system model, the equivalent-array production method, and the imaging procedure are presented. Compared with a classical real aperture radar (RAR) imaging system, an important contribution of our method is that much lower hardware complexity suffices, since many additional virtual array elements can be obtained. Numerical simulations are provided to test our system and imaging method.
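
    The virtual-array idea can be illustrated for collinear arrays: each transmit/receive pair contributes one virtual element located at the sum of the two phase-center positions, so a few physical elements synthesize a much larger aperture. The element positions below are an illustrative textbook example, not the paper's configuration.

```python
def virtual_array(tx_positions, rx_positions):
    """Virtual element positions of a collinear MIMO array: one virtual
    element per transmit/receive pair, at the sum of the transmit and
    receive phase-center positions (e.g. in half-wavelength units)."""
    return sorted(t + r for t in tx_positions for r in rx_positions)
```

Two transmitters at 0 and 4 with four receivers at 0..3 yield a filled eight-element virtual aperture from only six physical elements, which is the hardware saving the abstract refers to.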

  20. Distributed Cooperation Solution Method of Complex System Based on MAS

    NASA Astrophysics Data System (ADS)

    Weijin, Jiang; Yuhui, Xu

    To adapt fault-diagnosis models to dynamic environments and to fully meet the needs of solving tasks in complex systems, this paper applies multi-agent technology to complicated fault diagnosis and studies an integrated intelligent control system. Based on a hierarchical structure for diagnostic decision-making and a multi-layer decomposition strategy for the diagnosis task, a multi-agent synchronous diagnosis federation integrating different knowledge representation modes and inference mechanisms is presented. The functions of the management agent, diagnosis agents, and decision agent are analyzed; the organization and evolution of agents in the system are proposed; and a corresponding conflict-resolution algorithm is given. A layered structure of abstract agents with public attributes is built, and the system architecture is realized on a MAS distributed layered blackboard. A real-world application shows that the proposed control structure successfully solves the fault-diagnosis problem of a complex plant and has particular advantages in distributed domains.

  1. A development framework for artificial intelligence based distributed operations support systems

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.; Cottman, Bruce H.

    1990-01-01

    Advanced automation is required to reduce costly human operations support requirements for complex space-based and ground control systems. Existing knowledge-based technologies have been used successfully to automate individual operations tasks. Considerably less progress has been made in integrating and coordinating multiple operations applications into unified intelligent support systems. To fill this gap, SOCIAL, a tool set for developing Distributed Artificial Intelligence (DAI) systems, is being constructed. SOCIAL consists of three primary language-based components defining: models of interprocess communication across heterogeneous platforms; models for interprocess coordination, concurrency control, and fault management; and models for accessing heterogeneous information resources. DAI application subsystems, either new or existing, will access these distributed services non-intrusively via high-level message-based protocols. SOCIAL will reduce the complexity of distributed communications, control, and integration, enabling developers to concentrate on the design and functionality of the target DAI system itself.

  2. Foundational Report Series. Advanced Distribution management Systems for Grid Modernization (Importance of DMS for Distribution Grid Modernization)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jianhui

    2015-09-01

    Grid modernization is transforming the operation and management of electric distribution systems from manual, paper-driven business processes to electronic, computer-assisted decision-making. At the center of this business transformation is the distribution management system (DMS), which provides a foundation from which optimal levels of performance can be achieved in an increasingly complex business and operating environment. Electric distribution utilities are facing many new challenges that are dramatically increasing the complexity of operating and managing the electric distribution system: growing customer expectations for service reliability and power quality, pressure to achieve better efficiency and utilization of existing distribution system assets, and reduction of greenhouse gas emissions by accommodating high penetration levels of distributed generating resources powered by renewable energy sources (wind, solar, etc.). Recent “storm of the century” events in the northeastern United States and the lengthy power outages and customer hardships that followed have greatly elevated the need to make power delivery systems more resilient to major storm events and to provide a more effective electric utility response during such regional power grid emergencies. Despite these newly emerging challenges for electric distribution system operators, only a small percentage of electric utilities have actually implemented a DMS. This paper discusses reasons why a DMS is needed and why the DMS may emerge as a mission-critical system that will soon be considered essential as electric utilities roll out their grid modernization strategies.

  3. Improved mine blast algorithm for optimal cost design of water distribution systems

    NASA Astrophysics Data System (ADS)

    Sadollah, Ali; Guen Yoo, Do; Kim, Joong Hoon

    2015-12-01

    The design of water distribution systems is a large class of combinatorial, nonlinear optimization problems with complex constraints such as conservation of mass and energy equations. Since feasible solutions are often extremely complex, traditional optimization techniques are insufficient. Recently, metaheuristic algorithms have been applied to this class of problems because they are highly efficient. In this article, a recently developed optimizer called the mine blast algorithm (MBA) is considered. The MBA is improved and coupled with the hydraulic simulator EPANET to find the optimal cost design for water distribution systems. The performance of the improved mine blast algorithm (IMBA) is demonstrated using the well-known Hanoi, New York tunnels and Balerma benchmark networks. Optimization results obtained using IMBA are compared to those using MBA and other optimizers in terms of their minimum construction costs and convergence rates. For the complex Balerma network, IMBA offers the cheapest network design compared to other optimization algorithms.
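
    The overall scheme, a metaheuristic search over discrete pipe diameters with a feasibility penalty supplied by a hydraulic check, can be sketched generically. The diameters, costs, and `head_ok` surrogate below are invented stand-ins for EPANET and for the MBA/IMBA update rules; only the penalty-based search structure is the point.

```python
import random

DIAMETERS = [100, 150, 200, 250, 300]                     # candidate sizes, mm
COST = {100: 50, 150: 80, 200: 120, 250: 170, 300: 230}   # cost per metre

def head_ok(design):
    """Hypothetical surrogate for a hydraulic solver such as EPANET:
    demand is 'met' when total capacity exceeds a fixed threshold."""
    return sum(d ** 2 for d in design) >= 120000

def objective(design, lengths):
    """Construction cost plus a large penalty for hydraulic infeasibility."""
    cost = sum(COST[d] * L for d, L in zip(design, lengths))
    penalty = 0 if head_ok(design) else 1e9
    return cost + penalty

def random_search(lengths, iters=2000, seed=1):
    """Minimal single-move random search standing in for the metaheuristic:
    start from the largest (feasible) design and keep improving moves."""
    rng = random.Random(seed)
    best = [max(DIAMETERS)] * len(lengths)
    best_f = objective(best, lengths)
    for _ in range(iters):
        cand = list(best)
        cand[rng.randrange(len(cand))] = rng.choice(DIAMETERS)
        f = objective(cand, lengths)
        if f < best_f:
            best, best_f = cand, f
    return best, best_f
```

Because infeasible designs are priced out by the penalty, any accepted improvement stays hydraulically feasible while the cost is driven down, the same trade-off MBA/IMBA navigate on the benchmark networks.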

  4. Sparse distributed memory overview

    NASA Technical Reports Server (NTRS)

    Raugh, Mike

    1990-01-01

    The Sparse Distributed Memory (SDM) project is investigating the theory and applications of a massively parallel computing architecture, called sparse distributed memory, that will support the storage and retrieval of sensory and motor patterns characteristic of autonomous systems. The immediate objectives of the project are centered in studies of the memory itself and in the use of the memory to solve problems in speech, vision, and robotics. Investigation of methods for encoding sensory data is an important part of the research. Examples of NASA missions that may benefit from this work are Space Station, planetary rovers, and solar exploration. Sparse distributed memory offers promising technology for systems that must learn through experience and be capable of adapting to new circumstances, and for operating any large complex system requiring automatic monitoring and control. Sparse distributed memory is a massively parallel architecture motivated by efforts to understand how the human brain works. Sparse distributed memory is an associative memory, able to retrieve information from cues that only partially match patterns stored in the memory. It is able to store long temporal sequences derived from the behavior of a complex system, such as progressive records of the system's sensory data and correlated records of the system's motor controls.
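
    A minimal Kanerva-style sketch of sparse distributed memory (the parameters and the +/-1 encoding are illustrative choices): hard locations have random addresses; a write updates counters at every location within a Hamming radius of the write address, and a read sums those counters and takes the sign, so retrieval also works from partially matching cues.

```python
import random

class SparseDistributedMemory:
    """Toy sparse distributed memory over +/-1 vectors."""

    def __init__(self, n_locations=200, dim=64, radius=29, seed=0):
        rng = random.Random(seed)
        self.dim, self.radius = dim, radius
        # fixed random hard-location addresses
        self.addresses = [[rng.choice((-1, 1)) for _ in range(dim)]
                          for _ in range(n_locations)]
        self.counters = [[0] * dim for _ in range(n_locations)]

    def _active(self, addr):
        """Yield counter rows of locations within the Hamming radius."""
        for a, c in zip(self.addresses, self.counters):
            if sum(1 for x, y in zip(a, addr) if x != y) <= self.radius:
                yield c

    def write(self, addr, data):
        for c in self._active(addr):
            for k in range(self.dim):
                c[k] += data[k]

    def read(self, addr):
        sums = [0] * self.dim
        for c in self._active(addr):
            for k in range(self.dim):
                sums[k] += c[k]
        return [1 if s >= 0 else -1 for s in sums]
```

Because each pattern is spread across many locations and each location holds traces of many patterns, the memory degrades gracefully, the property the abstract highlights for monitoring large complex systems.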

  5. An XML-Based Protocol for Distributed Event Services

    NASA Technical Reports Server (NTRS)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)

    2001-01-01

    A recent trend in distributed computing is the construction of high-performance distributed systems called computational grids. One difficulty we have encountered is that there is no standard format for the representation of performance information and no standard protocol for transmitting this information. This limits the types of performance analysis that can be undertaken in complex distributed systems. To address this problem, we present an XML-based protocol for transmitting performance events in distributed systems and evaluate the performance of this protocol.
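
    A minimal XML encoding of a performance event and its round-trip parse can be sketched with the standard library. The element and attribute names below are invented for illustration, not the paper's actual schema.

```python
import xml.etree.ElementTree as ET

def event_to_xml(name, timestamp, attrs):
    """Serialize a performance event as a small XML document."""
    ev = ET.Element("event", name=name, timestamp=str(timestamp))
    for key, value in attrs.items():
        ET.SubElement(ev, "attr", key=key, value=str(value))
    return ET.tostring(ev, encoding="unicode")

def xml_to_event(text):
    """Parse the XML back into (name, timestamp, attribute dict)."""
    ev = ET.fromstring(text)
    attrs = {a.get("key"): a.get("value") for a in ev.findall("attr")}
    return ev.get("name"), float(ev.get("timestamp")), attrs
```

A self-describing text encoding like this trades some wire efficiency for interoperability across the heterogeneous producers and consumers found in a computational grid.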

  6. A programming environment for distributed complex computing. An overview of the Framework for Interdisciplinary Design Optimization (FIDO) project. NASA Langley TOPS exhibit H120b

    NASA Technical Reports Server (NTRS)

    Townsend, James C.; Weston, Robert P.; Eidson, Thomas M.

    1993-01-01

    The Framework for Interdisciplinary Design Optimization (FIDO) is a general programming environment for automating the distribution of complex computing tasks over a networked system of heterogeneous computers. For example, instead of manually passing a complex design problem between its diverse specialty disciplines, the FIDO system provides for automatic interactions between the discipline tasks and facilitates their communications. The FIDO system networks all the computers involved into a distributed heterogeneous computing system, so they have access to centralized data and can work on their parts of the total computation simultaneously in parallel whenever possible. Thus, each computational task can be done by the most appropriate computer. Results can be viewed as they are produced and variables changed manually for steering the process. The software is modular in order to ease migration to new problems: different codes can be substituted for each of the current code modules with little or no effect on the others. The potential for commercial use of FIDO rests in the capability it provides for automatically coordinating diverse computations on a networked system of workstations and computers. For example, FIDO could provide the coordination required for the design of vehicles or electronics or for modeling complex systems.

  7. Developing an Integration Infrastructure for Distributed Engine Control Technologies

    NASA Technical Reports Server (NTRS)

    Culley, Dennis; Zinnecker, Alicia; Aretskin-Hariton, Eliot; Kratz, Jonathan

    2014-01-01

    Turbine engine control technology is poised to make the first revolutionary leap forward since the advent of full authority digital engine control in the mid-1980s. This change aims squarely at overcoming the physical constraints that have historically limited control system hardware on aero-engines to a federated architecture. A distributed control architecture allows the complex analog interfaces existing between system elements and the control unit to be replaced by standardized digital interfaces. Embedded processing, enabled by high temperature electronics, provides for digitization of signals at the source and network communications, resulting in a modular system at the hardware level. While this scheme simplifies the physical integration of the system, its complexity appears in other ways. In fact, integration now becomes a shared responsibility among suppliers and system integrators. While these are the most obvious changes, there are additional concerns about performance, reliability, and failure modes due to the distributed architecture that warrant detailed study. This paper describes the development of a new facility intended to address the many challenges of the underlying technologies of distributed control. The facility is capable of performing both simulation and hardware studies ranging from component to system level complexity. Its modular and hierarchical structure allows the user to focus their interaction on specific areas of interest.

  8. An implementation of the distributed programming structural synthesis system (PROSSS)

    NASA Technical Reports Server (NTRS)

    Rogers, J. L., Jr.

    1981-01-01

    A method is described for implementing a flexible software system that combines large, complex programs with small, user-supplied, problem-dependent programs and that distributes their execution between a mainframe and a minicomputer. The Programming Structural Synthesis System (PROSSS) was the specific software system considered. The results of such distributed implementation are flexibility of the optimization procedure organization and versatility of the formulation of constraints and design variables.

  9. Distributed Electrical Energy Systems: Needs, Concepts, Approaches and Vision (in Chinese)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yingchen; Zhang, Jun; Gao, Wenzhong

    Intelligent distributed electrical energy systems (IDEES) are characterized by vast numbers of system components, diversified component types, and difficulties in operation and management, with the result that the traditional centralized power system management approach no longer fits their operation. Blockchain technology is therefore believed to be one of the important feasible technical paths for building future large-scale distributed electrical energy systems. An IDEES inherently has both social and technical characteristics; as a result, a distributed electrical energy system needs to be divided into multiple layers, with a blockchain at each layer used to model and manage its logical and physical functionalities. The blockchains at different layers coordinate with each other to achieve successful operation of the IDEES. Specifically, the multi-layer blockchains, named the 'blockchain group', consist of a distributed data access and service blockchain, an intelligent property management blockchain, a power system analysis blockchain, an intelligent contract operation blockchain, and an intelligent electricity trading blockchain. It is expected that the blockchain group can self-organize into a complex, autonomous, and distributed IDEES. In this complex system, frequent and in-depth interactions and computing will give rise to intelligence, which is expected to bring stable, reliable, and efficient electrical energy production, transmission, and consumption.

  10. Jungle Computing: Distributed Supercomputing Beyond Clusters, Grids, and Clouds

    NASA Astrophysics Data System (ADS)

    Seinstra, Frank J.; Maassen, Jason; van Nieuwpoort, Rob V.; Drost, Niels; van Kessel, Timo; van Werkhoven, Ben; Urbani, Jacopo; Jacobs, Ceriel; Kielmann, Thilo; Bal, Henri E.

    In recent years, the application of high-performance and distributed computing in scientific practice has become increasingly widespread. Among the platforms most widely available to scientists are clusters, grids, and cloud systems. Such infrastructures are currently undergoing revolutionary change due to the integration of many-core technologies, providing orders-of-magnitude speed improvements for selected compute kernels. With high-performance and distributed computing systems thus becoming more heterogeneous and hierarchical, programming complexity is vastly increased. Further complexities arise because the urgent desire for scalability, together with issues including data distribution, software heterogeneity, and ad hoc hardware availability, commonly forces scientists into simultaneous use of multiple platforms (e.g., clusters, grids, and clouds used concurrently). A true computing jungle.

  11. A Study of Students' Reasoning about Probabilistic Causality: Implications for Understanding Complex Systems and for Instructional Design

    ERIC Educational Resources Information Center

    Grotzer, Tina A.; Solis, S. Lynneth; Tutwiler, M. Shane; Cuzzolino, Megan Powell

    2017-01-01

    Understanding complex systems requires reasoning about causal relationships that behave or appear to behave probabilistically. Features such as distributed agency, large spatial scales, and time delays obscure co-variation relationships and complex interactions can result in non-deterministic relationships between causes and effects that are best…

  12. Distributed software framework and continuous integration in hydroinformatics systems

    NASA Astrophysics Data System (ADS)

    Zhou, Jianzhong; Zhang, Wei; Xie, Mengfei; Lu, Chengwei; Chen, Xiao

    2017-08-01

    When multiple complicated models, multisource structured and unstructured data, and complex requirements analyses are involved, the platform design and integration of hydroinformatics systems become a challenge. To properly solve these problems, we describe a distributed software framework and its continuous integration process in hydroinformatics systems. This distributed framework mainly consists of a server cluster for models, a distributed database, GIS (Geographic Information System) servers, a master node, and clients. Based on it, a GIS-based decision support system for the joint regulation of water quantity and water quality of a group of lakes in Wuhan, China, is established.

  13. Entropy, complexity, and Markov diagrams for random walk cancer models.

    PubMed

    Newton, Paul K; Mason, Jeremy; Hurt, Brian; Bethel, Kelly; Bazhenova, Lyudmila; Nieva, Jorge; Kuhn, Peter

    2014-12-19

    The notion of entropy is used to compare the complexity associated with 12 common cancers based on metastatic tumor distribution autopsy data. We characterize power-law distributions, entropy, and Kullback-Leibler divergence associated with each primary cancer as compared with data for all cancer types aggregated. We then correlate entropy values with other measures of complexity associated with Markov chain dynamical systems models of progression. The Markov transition matrix associated with each cancer is associated with a directed graph model where nodes are anatomical locations where a metastatic tumor could develop, and edge weightings are transition probabilities of progression from site to site. The steady-state distribution corresponds to the autopsy data distribution. Entropy correlates well with the overall complexity of the reduced directed graph structure for each cancer and with a measure of systemic interconnectedness of the graph, called graph conductance. The models suggest that grouping cancers according to their entropy values, with skin, breast, kidney, and lung cancers being prototypical high entropy cancers, stomach, uterine, pancreatic and ovarian being mid-level entropy cancers, and colorectal, cervical, bladder, and prostate cancers being prototypical low entropy cancers, provides a potentially useful framework for viewing metastatic cancer in terms of predictability, complexity, and metastatic potential.
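
    The two quantities the abstract relates, the steady-state distribution of a Markov transition matrix and an entropy value, can be computed directly. The 3-site matrix below is a toy illustration, not the authors' autopsy-derived data:

```python
import math

def steady_state(P, iters=5000):
    """Stationary distribution pi with pi = pi * P, by power iteration."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def entropy(pi):
    """Shannon entropy (in nats) of a probability distribution."""
    return -sum(p * math.log(p) for p in pi if p > 0)

# Toy transition probabilities between three anatomical sites.
P = [[0.1, 0.6, 0.3],
     [0.4, 0.4, 0.2],
     [0.3, 0.3, 0.4]]
pi = steady_state(P)
print([round(p, 4) for p in pi], round(entropy(pi), 4))
```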

  14. Entropy, complexity, and Markov diagrams for random walk cancer models

    NASA Astrophysics Data System (ADS)

    Newton, Paul K.; Mason, Jeremy; Hurt, Brian; Bethel, Kelly; Bazhenova, Lyudmila; Nieva, Jorge; Kuhn, Peter

    2014-12-01

    The notion of entropy is used to compare the complexity associated with 12 common cancers based on metastatic tumor distribution autopsy data. We characterize power-law distributions, entropy, and Kullback-Leibler divergence associated with each primary cancer as compared with data for all cancer types aggregated. We then correlate entropy values with other measures of complexity associated with Markov chain dynamical systems models of progression. The Markov transition matrix associated with each cancer is associated with a directed graph model where nodes are anatomical locations where a metastatic tumor could develop, and edge weightings are transition probabilities of progression from site to site. The steady-state distribution corresponds to the autopsy data distribution. Entropy correlates well with the overall complexity of the reduced directed graph structure for each cancer and with a measure of systemic interconnectedness of the graph, called graph conductance. The models suggest that grouping cancers according to their entropy values, with skin, breast, kidney, and lung cancers being prototypical high entropy cancers, stomach, uterine, pancreatic and ovarian being mid-level entropy cancers, and colorectal, cervical, bladder, and prostate cancers being prototypical low entropy cancers, provides a potentially useful framework for viewing metastatic cancer in terms of predictability, complexity, and metastatic potential.

  15. New definition of complexity for self-gravitating fluid distributions: The spherically symmetric, static case

    NASA Astrophysics Data System (ADS)

    Herrera, L.

    2018-02-01

    We put forward a new definition of complexity for static and spherically symmetric self-gravitating systems, based on a quantity, hereafter referred to as the complexity factor, that appears in the orthogonal splitting of the Riemann tensor in the context of general relativity. We start by assuming that a homogeneous (in the energy density) fluid with isotropic pressure is endowed with minimal complexity. For this kind of fluid distribution, the value of the complexity factor is zero. The rationale behind our proposal for the definition of the complexity factor thus stems from the fact that it measures the departure of the active gravitational mass (Tolman mass) from its value for a zero-complexity system. Such a departure is produced by a specific combination of energy density inhomogeneity and pressure anisotropy. Thus, a zero complexity factor may also be found in self-gravitating systems with inhomogeneous energy density and anisotropic pressure, provided the effects of these two factors on the complexity factor cancel each other. Some exact interior solutions to the Einstein equations satisfying the zero-complexity criterion are found, and prospective applications of this newly defined concept to the study of the structure and evolution of compact objects are discussed.

  16. Equilibrium sampling by reweighting nonequilibrium simulation trajectories

    NASA Astrophysics Data System (ADS)

    Yang, Cheng; Wan, Biao; Xu, Shun; Wang, Yanting; Zhou, Xin

    2016-03-01

    Based on equilibrium molecular simulations, it is usually difficult to efficiently visit the whole conformational space of a complex system, which is separated into metastable regions by high free energy barriers. Nonequilibrium simulations can enhance transitions among these metastable regions and then be applied to sample equilibrium distributions in complex systems, since the associated nonequilibrium effects can be removed by employing the Jarzynski equality (JE). Here we present such a systematic method, named reweighted nonequilibrium ensemble dynamics (RNED), to efficiently sample equilibrium conformations. The RNED is a combination of the JE and our previous reweighted ensemble dynamics (RED) method. The original JE reproduces equilibrium from many nonequilibrium trajectories but requires that the initial distribution of these trajectories be the equilibrium one. The RED reweights many equilibrium trajectories from an arbitrary initial distribution to obtain the equilibrium distribution, whereas the RNED combines the advantages of both methods, reproducing equilibrium from many nonequilibrium simulation trajectories with an arbitrary initial conformational distribution. We illustrate the application of the RNED in a toy model and in a Lennard-Jones fluid to detect its liquid-solid phase coexistence. The results indicate that the RNED substantially extends the applicability of both the original JE and the RED to equilibrium sampling of complex systems.
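
    The reweighting step that the Jarzynski equality enables can be sketched as follows; the toy data, unit temperature, and the simple self-normalized estimator are illustrative assumptions, not the RNED implementation:

```python
import math
import random

def reweighted_average(observables, works, beta=1.0):
    """Equilibrium estimate of <A> from (A_k, W_k) pairs: trajectories
    that dissipated more work W_k are down-weighted by exp(-beta*W_k)."""
    weights = [math.exp(-beta * w) for w in works]
    z = sum(weights)
    return sum(a * wt for a, wt in zip(observables, weights)) / z

random.seed(0)
# Toy nonequilibrium data: the observable is correlated with the work,
# so reweighting shifts the estimate below the plain trajectory mean.
works = [random.uniform(0.0, 2.0) for _ in range(1000)]
obs = [w + random.gauss(0.0, 0.1) for w in works]
plain = sum(obs) / len(obs)
print(round(plain, 3), round(reweighted_average(obs, works), 3))
```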

  17. Equilibrium sampling by reweighting nonequilibrium simulation trajectories.

    PubMed

    Yang, Cheng; Wan, Biao; Xu, Shun; Wang, Yanting; Zhou, Xin

    2016-03-01

    Based on equilibrium molecular simulations, it is usually difficult to efficiently visit the whole conformational space of a complex system, which is separated into metastable regions by high free energy barriers. Nonequilibrium simulations can enhance transitions among these metastable regions and then be applied to sample equilibrium distributions in complex systems, since the associated nonequilibrium effects can be removed by employing the Jarzynski equality (JE). Here we present such a systematic method, named reweighted nonequilibrium ensemble dynamics (RNED), to efficiently sample equilibrium conformations. The RNED is a combination of the JE and our previous reweighted ensemble dynamics (RED) method. The original JE reproduces equilibrium from many nonequilibrium trajectories but requires that the initial distribution of these trajectories be the equilibrium one. The RED reweights many equilibrium trajectories from an arbitrary initial distribution to obtain the equilibrium distribution, whereas the RNED combines the advantages of both methods, reproducing equilibrium from many nonequilibrium simulation trajectories with an arbitrary initial conformational distribution. We illustrate the application of the RNED in a toy model and in a Lennard-Jones fluid to detect its liquid-solid phase coexistence. The results indicate that the RNED substantially extends the applicability of both the original JE and the RED to equilibrium sampling of complex systems.

  18. Spatial Distributions of Guest Molecule and Hydration Level in Dendrimer-Based Guest–Host Complex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Chih-Ying; Chen, Hsin-Lung; Do, Changwoo

    2016-08-09

    Using the electrostatic complex of G4 poly(amidoamine) (PAMAM) dendrimer with an amphiphilic surfactant as a model system, contrast variation small angle neutron scattering (SANS) is implemented to resolve the key structural characteristics of dendrimer-based guest–host system. Quantifications of the radial distributions of the scattering length density and the hydration level within the complex molecule reveal that the surfactant is embedded in the peripheral region of dendrimer and the steric crowding in this region increases the backfolding of the dendritic segments, thereby reducing the hydration level throughout the complex molecule. Here, the insights into the spatial location of the guest molecules as well as the perturbations of dendrimer conformation and hydration level deduced here are crucial for the delicate design of dendrimer-based guest–host system for biomedical applications.

  19. Software Tools for Formal Specification and Verification of Distributed Real-Time Systems.

    DTIC Science & Technology

    1997-09-30

    …set of software tools for specification and verification of distributed real-time systems using formal methods. The task of this SBIR Phase II effort…to be used by designers of real-time systems for early detection of errors. The mathematical complexity of formal specification and verification has…

  20. Hardware-Software Complex for Measurement of Energy and Angular Distributions of Charged Particles Formed in Nuclear Reactions

    NASA Astrophysics Data System (ADS)

    Vikhlyantsev, O. P.; Generalov, L. N.; Kuryakin, A. V.; Karpov, I. A.; Gurin, N. E.; Tumkin, A. D.; Fil'chagin, S. V.

    2017-12-01

    A hardware-software complex for measurement of the energy and angular distributions of charged particles formed in nuclear reactions is presented. The hardware and software structures of the complex, the basic set of modular nuclear-physics apparatus of a multichannel detecting system based on ΔE-E telescopes of silicon detectors, and the hardware for experimental data collection, storage, and processing are described.

  1. Status of 20 kHz space station power distribution technology

    NASA Technical Reports Server (NTRS)

    Hansen, Irving G.

    1988-01-01

    Power distribution on the NASA Space Station will be accomplished by a 20 kHz sinusoidal, 440 VRMS, single phase system. In order to minimize both system complexity and the total power conversion steps required, high frequency power will be distributed end-to-end in the system. To support the final design of flight power system hardware, advanced development and demonstrations have been made on key system technologies and components. The current status of this program is discussed.

  2. EVALUATING DISCONTINUITIES IN COMPLEX SYSTEMS: TOWARD QUANTITATIVE MEASURE OF RESILIENCE

    EPA Science Inventory

    The textural discontinuity hypothesis (TDH) is based on the observation that animal body mass distributions exhibit discontinuities that may reflect the texture of the landscape available for exploitation. This idea has been extended to other complex systems, hinting that the ide...

  3. Planning of distributed generation in distribution network based on improved particle swarm optimization algorithm

    NASA Astrophysics Data System (ADS)

    Li, Jinze; Qu, Zhi; He, Xiaoyang; Jin, Xiaoming; Li, Tie; Wang, Mingkai; Han, Qiu; Gao, Ziji; Jiang, Feng

    2018-02-01

    Large-scale access of distributed power can relieve current environmental pressure but at the same time increases the complexity and uncertainty of the overall distribution system. Rational planning of distributed power can effectively improve the system voltage level. To this end, the specific impact on distribution network power quality caused by the access of typical distributed power was analyzed and, by improving the learning factors and the inertia weight, an improved particle swarm optimization algorithm (IPSO) with better local and global search performance was proposed to solve distributed generation planning for the distribution network. Results show that the proposed method can substantially reduce system network loss and improve the economic performance of system operation with distributed generation.
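
    A minimal particle swarm sketch showing the two knobs the abstract tunes, the inertia weight and the learning factors. The linear inertia schedule and the sphere objective below are placeholder assumptions, not the paper's IPSO or its network-loss objective:

```python
import random

def pso(f, dim=2, n=20, iters=200, w0=0.9, w1=0.4, c1=2.0, c2=2.0):
    """Particle swarm minimization with linearly decreasing inertia w
    and learning factors c1 (personal) and c2 (social)."""
    rng = random.Random(1)
    xs = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vs = [[0.0] * dim for _ in range(n)]
    pbest = [x[:] for x in xs]
    gbest = min(pbest, key=f)[:]
    for t in range(iters):
        w = w0 - (w0 - w1) * t / iters   # inertia decays from w0 to w1
        for k in range(n):
            for d in range(dim):
                vs[k][d] = (w * vs[k][d]
                            + c1 * rng.random() * (pbest[k][d] - xs[k][d])
                            + c2 * rng.random() * (gbest[d] - xs[k][d]))
                xs[k][d] += vs[k][d]
            if f(xs[k]) < f(pbest[k]):
                pbest[k] = xs[k][:]
                if f(pbest[k]) < f(gbest):
                    gbest = pbest[k][:]
    return gbest

sphere = lambda x: sum(v * v for v in x)   # stand-in objective
print(pso(sphere))   # best position found
```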

  4. Coordinating complex problem-solving among distributed intelligent agents

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.

    1992-01-01

    A process-oriented control model is described for distributed problem solving. The model coordinates the transfer and manipulation of information across independent networked applications, both intelligent and conventional. The model was implemented using SOCIAL, a set of object-oriented tools for distributing computing. Complex sequences of distributed tasks are specified in terms of high level scripts. Scripts are executed by SOCIAL objects called Manager Agents, which realize an intelligent coordination model that routes individual tasks to suitable server applications across the network. These tools are illustrated in a prototype distributed system for decision support of ground operations for NASA's Space Shuttle fleet.

  5. Distributed intrusion detection system based on grid security model

    NASA Astrophysics Data System (ADS)

    Su, Jie; Liu, Yahui

    2008-03-01

    Grid computing has developed rapidly with the development of network technology, and it can solve large-scale complex computing problems by sharing large-scale computing resources. In a grid environment, we can realize a distributed and load-balanced intrusion detection system. This paper first discusses the security mechanism in grid computing and the function of PKI/CA in the grid security system, then describes how grid computing characteristics apply to a distributed intrusion detection system (IDS) based on an Artificial Immune System. Finally, it presents a distributed intrusion detection system based on the grid security model that can reduce processing delay and ensure detection rates.

  6. Efficient calculation of open quantum system dynamics and time-resolved spectroscopy with distributed memory HEOM (DM-HEOM).

    PubMed

    Kramer, Tobias; Noack, Matthias; Reinefeld, Alexander; Rodríguez, Mirta; Zelinskyy, Yaroslav

    2018-06-11

    Time- and frequency-resolved optical signals provide insights into the properties of light-harvesting molecular complexes, including excitation energies, dipole strengths and orientations, as well as in the exciton energy flow through the complex. The hierarchical equations of motion (HEOM) provide a unifying theory, which allows one to study the combined effects of system-environment dissipation and non-Markovian memory without making restrictive assumptions about weak or strong couplings or separability of vibrational and electronic degrees of freedom. With increasing system size the exact solution of the open quantum system dynamics requires memory and compute resources beyond a single compute node. To overcome this barrier, we developed a scalable variant of HEOM. Our distributed memory HEOM, DM-HEOM, is a universal tool for open quantum system dynamics. It is used to accurately compute all experimentally accessible time- and frequency-resolved processes in light-harvesting molecular complexes with arbitrary system-environment couplings for a wide range of temperatures and complex sizes. © 2018 Wiley Periodicals, Inc.

  7. Multiagent model and mean field theory of complex auction dynamics

    NASA Astrophysics Data System (ADS)

    Chen, Qinghua; Huang, Zi-Gang; Wang, Yougui; Lai, Ying-Cheng

    2015-09-01

    Recent years have witnessed a growing interest in analyzing a variety of socio-economic phenomena using methods from statistical and nonlinear physics. We study a class of complex systems arising from economics, the lowest unique bid auction (LUBA) system, a recently emerged class of online auction games. Through analyzing large, empirical data sets of LUBA, we identify a general feature of the bid price distribution: an inverted J-shaped function with exponential decay in the large bid price region. To account for the distribution, we propose a multi-agent model in which each agent bids stochastically in the field of winner's attractiveness, and develop a theoretical framework to obtain analytic solutions of the model based on mean field analysis. The theory produces bid-price distributions that are in excellent agreement with those from the real data. Our model and theory capture the essential features of human behaviors in the competitive environment as exemplified by LUBA, and may provide significant quantitative insights into complex socio-economic phenomena.
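
    The auction rule itself is simple to state in code; this sketch implements the standard LUBA winner rule (the lowest bid submitted by exactly one bidder), not the authors' multi-agent model:

```python
from collections import Counter

def lowest_unique_bid(bids):
    """Return the winning bid: the lowest bid placed by exactly one
    agent, or None if every bid level is contested."""
    counts = Counter(bids)
    unique = [b for b, c in counts.items() if c == 1]
    return min(unique) if unique else None

print(lowest_unique_bid([1, 1, 2, 3, 3, 4]))  # 2 (1 is not unique)
print(lowest_unique_bid([5, 5, 5]))           # None
```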

  8. Entropy, complexity, and Markov diagrams for random walk cancer models

    PubMed Central

    Newton, Paul K.; Mason, Jeremy; Hurt, Brian; Bethel, Kelly; Bazhenova, Lyudmila; Nieva, Jorge; Kuhn, Peter

    2014-01-01

    The notion of entropy is used to compare the complexity associated with 12 common cancers based on metastatic tumor distribution autopsy data. We characterize power-law distributions, entropy, and Kullback-Leibler divergence associated with each primary cancer as compared with data for all cancer types aggregated. We then correlate entropy values with other measures of complexity associated with Markov chain dynamical systems models of progression. The Markov transition matrix associated with each cancer is associated with a directed graph model where nodes are anatomical locations where a metastatic tumor could develop, and edge weightings are transition probabilities of progression from site to site. The steady-state distribution corresponds to the autopsy data distribution. Entropy correlates well with the overall complexity of the reduced directed graph structure for each cancer and with a measure of systemic interconnectedness of the graph, called graph conductance. The models suggest that grouping cancers according to their entropy values, with skin, breast, kidney, and lung cancers being prototypical high entropy cancers, stomach, uterine, pancreatic and ovarian being mid-level entropy cancers, and colorectal, cervical, bladder, and prostate cancers being prototypical low entropy cancers, provides a potentially useful framework for viewing metastatic cancer in terms of predictability, complexity, and metastatic potential. PMID:25523357

  9. Mathematic modeling of complex aquifer: Evian Natural Mineral Water case study considering lumped and distributed models.

    NASA Astrophysics Data System (ADS)

    Henriot, Abel; Blavoux, Bernard; Travi, Yves; Lachassagne, Patrick; Beon, Olivier; Dewandel, Benoit; Ladouche, Bernard

    2013-04-01

    The Evian Natural Mineral Water (NMW) aquifer is a highly heterogeneous complex of Quaternary glacial deposits composed of three main units, from bottom to top: - The "Inferior Complex", mainly composed of basal and impermeable till lying on the Alpine rocks. It outcrops only at the higher altitudes but is known at depth through drilled holes. - The "Gavot Plateau Complex", an interstratified complex of mainly basal and lateral till up to 400 m thick. It outcrops at heights between approximately 850 m a.m.s.l. and 1200 m a.m.s.l. over a 30 km² area. It is the main known recharge area for the hydromineral system. - The "Terminal Complex", from which the Evian NMW emerges at 410 m a.m.s.l. It is composed of sand and gravel Kame terraces that allow water to flow from the deep permeable layers of the "Gavot Plateau Complex" to the "Terminal Complex". A thick and impermeable terminal till caps and seals the system, so the aquifer is confined in its downstream area. Because of the heterogeneity and complexity of this hydrosystem, distributed modeling tools are difficult to implement at the whole-system scale: important hypotheses would have to be made about geometry, hydraulic properties, and boundary conditions, for example, and extrapolation would no doubt lead to unacceptable errors. Consequently, a modeling strategy is being developed that also improves the conceptual model of the hydrosystem. Lumped models, mainly based on tritium time series, allow the whole hydrosystem to be modeled by combining in series an exponential model (superficial aquifers of the "Gavot Plateau Complex"), a dispersive model (Gavot Plateau interstratified complex), and a piston flow model (sand and gravel of the Kame terraces), with mean transit times of 8, 60, and 2.5 years respectively. These models provide insight into the governing parameters of the whole mineral aquifer. They help improve the current conceptual model and are to be refined with other environmental tracers such as CFCs and SF6.
A deterministic approach (a distributed flow-and-transport model) is performed at the scale of the Terminal Complex. The geometry of the system is quite well known from drill holes, and the aquifer properties from data processing of hydraulic heads and pumping test interpretation. A multidisciplinary approach (hydrodynamics, hydrochemistry, geology, isotopes) for the recharge area (Gavot Plateau Complex) aims to provide better constraints on the upstream boundary of the distributed model. Moreover, perfect-tracer modeling strongly constrains the fitting of this distributed model. The result is a high-resolution conceptual model leading to a future operational management tool for the aquifer.
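
    The lumped-model idea can be sketched numerically: the tracer output is the convolution of the input time series with a transit-time distribution. The exponential model below, with an assumed 8-year mean transit time and a made-up 5-year input pulse, is purely illustrative and does not use the study's tritium data:

```python
import math

def exponential_ttd(T, n):
    """Discretized exponential transit-time distribution, mean T years,
    truncated at n years and renormalized."""
    g = [math.exp(-t / T) / T for t in range(n)]
    s = sum(g)
    return [v / s for v in g]

def convolve_output(inp, g):
    """Output concentration: convolution of input history with g."""
    return [sum(inp[t - k] * g[k] for k in range(min(len(g), t + 1)))
            for t in range(len(inp))]

inp = [1.0] * 5 + [0.0] * 15          # a 5-year tracer pulse
out = convolve_output(inp, exponential_ttd(8.0, 20))
print([round(v, 3) for v in out])     # smeared, delayed response
```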

  10. The Development of Design Guides for the Implementation of Multiprocessing Element Systems.

    DTIC Science & Technology

    1985-09-01

    Table-of-contents fragments: Conclusions; Implementation of CHILL SIGNALS communication primitives on a distributed system; Architecture of a Distributed System; Algorithm for the SEND Signal Operation. …elements operating concurrently. Such multi-processing-element systems are clearly going to be complex, and it is important that the designers of such…

  11. Detection and characterization of nonspecific, sparsely-populated binding modes in the early stages of complexation

    PubMed Central

    Cardone, A.; Bornstein, A.; Pant, H. C.; Brady, M.; Sriram, R.; Hassan, S. A.

    2015-01-01

    A method is proposed to study protein-ligand binding in a system governed by specific and non-specific interactions. Strong associations lead to narrow distributions in the protein's configuration space; weak and ultra-weak associations lead instead to broader distributions, a manifestation of non-specific, sparsely-populated binding modes with multiple interfaces. The method is based on the notion that a discrete set of preferential first-encounter modes are metastable states from which stable (pre-relaxation) complexes at equilibrium evolve. The method can be used to explore alternative pathways of complexation with statistical significance and can be integrated into a general algorithm to study protein interaction networks. The method is applied to a peptide-protein complex. The peptide adopts several low-population conformers and binds in a variety of modes with a broad range of affinities. The system is thus well suited to analyze general features of binding, including conformational selection, multiplicity of binding modes, and nonspecific interactions, and to illustrate how the method can be applied to study these problems systematically. The equilibrium distributions can be used to generate biasing functions for simulations of multiprotein systems from which bulk thermodynamic quantities can be calculated. PMID:25782918

  12. Generation of flower high-order Poincaré sphere laser beams from a spatial light modulator

    NASA Astrophysics Data System (ADS)

    Lu, T. H.; Huang, T. D.; Wang, J. G.; Wang, L. W.; Alfano, R. R.

    2016-12-01

    We propose and experimentally demonstrate a new complex laser beam with inhomogeneous polarization distributions mapping onto high-order Poincaré spheres (HOPSs). The complex laser mode is achieved by superposition of Laguerre-Gaussian modes and manifests an exotic flower-like localization in its intensity and phase profiles. A simple optical system is used to generate a polarization-variant distribution on the complex laser mode by superposition of orthogonal circular polarizations with opposite topological charges. Numerical analyses of the polarization distribution are consistent with the experimental results. These flower HOPS beams can serve as a new light source for photonic applications.

  13. Linking Health Concepts in the Assessment and Evaluation of Water Distribution Systems

    ERIC Educational Resources Information Center

    Karney, Bryan W.; Filion, Yves R.

    2005-01-01

    The concept of health is not only a specific criterion for evaluation of water quality delivered by a distribution system but also a suitable paradigm for overall functioning of the hydraulic and structural components of the system. This article views health, despite its complexities, as the only criterion with suitable depth and breadth to allow…

  14. Optically controlled phased-array antenna technology for space communication systems

    NASA Technical Reports Server (NTRS)

    Kunath, Richard R.; Bhasin, Kul B.

    1988-01-01

    Using MMICs in phased-array applications above 20 GHz requires complex RF and control signal distribution systems. Conventional waveguide, coaxial cable, and microstrip methods are undesirable due to their high weight, high loss, limited mechanical flexibility and large volume. An attractive alternative to these transmission media, for RF and control signal distribution in MMIC phased-array antennas, is optical fiber. Presented are potential system architectures and their associated characteristics. The status of high frequency opto-electronic components needed to realize the potential system architectures is also discussed. It is concluded that an optical fiber network will reduce weight and complexity, and increase reliability and performance, but may require higher power.

  15. Regimes of Flow over Complex Structures of Endothelial Glycocalyx: A Molecular Dynamics Simulation Study.

    PubMed

    Jiang, Xi Zhuo; Feng, Muye; Ventikos, Yiannis; Luo, Kai H

    2018-04-10

    Flow patterns on surfaces grafted with complex structures play a pivotal role in many engineering and biomedical applications. In this research, large-scale molecular dynamics (MD) simulations are conducted to study the flow over complex surface structures of an endothelial glycocalyx layer. A detailed structure of glycocalyx has been adopted and the flow/glycocalyx system comprises about 5,800,000 atoms. Four cases involving varying external forces and modified glycocalyx configurations are constructed to reveal intricate fluid behaviour. Flow profiles including temporal evolutions and spatial distributions of velocity are illustrated. Moreover, streamline length and vorticity distributions under the four scenarios are compared and discussed to elucidate the effects of external forces and glycocalyx configurations on flow patterns. Results show that sugar chain configurations affect streamline length distributions but their impact on vorticity distributions is statistically insignificant, whilst the influence of the external forces on both streamline length and vorticity distributions is negligible. Finally, a regime diagram for flow over complex surface structures is proposed to categorise flow patterns.

  16. Computer-Assisted Monitoring Of A Complex System

    NASA Technical Reports Server (NTRS)

    Beil, Bob J.; Mickelson, Eric M.; Sterritt, John M.; Costantino, Rob W.; Houvener, Bob C.; Super, Mike A.

    1995-01-01

    Propulsion System Advisor (PSA) computer-based system assists engineers and technicians in analyzing masses of sensory data indicative of operating conditions of space shuttle propulsion system during pre-launch and launch activities. Designed solely for monitoring; does not perform any control functions. Although PSA developed for highly specialized application, serves as prototype of noncontrolling, computer-based subsystems for monitoring other complex systems like electric-power-distribution networks and factories.

  17. Alternative Architectures for Distributed Cooperative Problem-Solving in the National Airspace System

    NASA Technical Reports Server (NTRS)

    Smith, Phillip J.; Billings, Charles; McCoy, C. Elaine; Orasanu, Judith

    1999-01-01

    The air traffic management system in the United States is an example of a distributed problem solving system. It has elements of both cooperative and competitive problem-solving. This system includes complex organizations such as Airline Operations Centers (AOCs), the FAA Air Traffic Control Systems Command Center (ATCSCC), and traffic management units (TMUs) at enroute centers and TRACONs, all of which have a major focus on strategic decision-making. It also includes individuals concerned more with tactical decisions (such as air traffic controllers and pilots). The architecture for this system has evolved over time to rely heavily on the distribution of tasks and control authority in order to keep cognitive complexity manageable for any one individual operator, and to provide redundancy (both human and technological) to serve as a safety net to catch the slips or mistakes that any one person or entity might make. Currently, major changes are being considered for this architecture, especially with respect to the locus of control, in an effort to improve efficiency and safety. This paper uses a series of case studies to help evaluate some of these changes from the perspective of system complexity, and to point out possible alternative approaches that might be taken to improve system performance. The paper illustrates the need to maintain a clear understanding of what is required to assure a high level of performance when alternative system architectures and decompositions are developed.

  18. Dynamical complexity changes during two forms of meditation

    NASA Astrophysics Data System (ADS)

    Li, Jin; Hu, Jing; Zhang, Yinhong; Zhang, Xiaofeng

    2011-06-01

    Detection of dynamical complexity changes in natural and man-made systems has deep scientific and practical meaning. We use the base-scale entropy method to analyze dynamical complexity changes in heart rate variability (HRV) series during two specific traditional forms of meditation, Chinese Chi and Kundalini Yoga, in healthy young adults. The results show that dynamical complexity decreases in the meditative state for both forms of meditation. Meanwhile, we detected changes in the probability distribution of m-words during meditation and explained these changes using the probability distribution of the sine function. The base-scale entropy method may be used on a wider range of physiologic signals.
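
    The m-word statistics mentioned above can be illustrated with a simplified sketch. The code below is not the authors' base-scale entropy algorithm; it is a generic symbolic-entropy stand-in that binarizes a series about its mean and computes the Shannon entropy of the resulting m-word distribution, which is enough to show how a regular signal scores lower than an irregular one.

```python
import math
import random
from collections import Counter

def mword_entropy(series, m=3):
    """Shannon entropy (bits) of the m-word distribution obtained by
    binary symbolization about the series mean. A simplified stand-in
    for the base-scale entropy idea, not the authors' exact algorithm."""
    mean = sum(series) / len(series)
    symbols = ['1' if x > mean else '0' for x in series]
    words = [''.join(symbols[i:i + m]) for i in range(len(symbols) - m + 1)]
    counts = Counter(words)
    n = len(words)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# A regular (low-complexity) signal vs. an irregular one:
random.seed(0)
regular = [math.sin(0.3 * t) for t in range(500)]
irregular = [random.random() for _ in range(500)]
print(mword_entropy(regular), mword_entropy(irregular))
```

A periodic signal produces only a few distinct m-words (long runs plus transitions), so its entropy sits well below the 3-bit maximum that a random binary sequence approaches for m = 3.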

  19. Adaptive Fault-Resistant Systems

    DTIC Science & Technology

    1994-10-01

    An Architectural Overview of the Alpha Real-Time Distributed Kernel. In Proceedings of the USENIX Workshop on Microkernels and Other Kernel ...system and the controller are monolithic. We have noted earlier some of the problems of distributed systems, for example, the need to bound the ...are monolithic. In practice, designers employ a layered structuring for their systems in order to manage complexity, and we expect that practical

  20. Performance Enhancement of Radial Distributed System with Distributed Generators by Reconfiguration Using Binary Firefly Algorithm

    NASA Astrophysics Data System (ADS)

    Rajalakshmi, N.; Padma Subramanian, D.; Thamizhavel, K.

    2015-03-01

    The extent of real power loss and voltage deviation associated with overloaded feeders in a radial distribution system can be reduced by reconfiguration. Reconfiguration is normally achieved by changing the open/closed states of tie/sectionalizing switches. Finding the optimal switch combination is a complicated problem, as many switching combinations are possible in a distribution system. Hence optimization techniques are of growing importance in reducing the complexity of the reconfiguration problem. This paper presents the application of the firefly algorithm (FA) for optimal reconfiguration of a radial distribution system with distributed generators (DGs). The algorithm is tested on the IEEE 33-bus system installed with DGs, and the results are compared with a binary genetic algorithm. It is found that the binary FA is more effective than the binary genetic algorithm in reducing real power loss and improving the voltage profile, and hence in enhancing the performance of the radial distribution system. Results are optimal when DGs are added to the test system, demonstrating the impact of DGs on the distribution system.
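
    The binary search over switch states can be sketched generically. The code below is an illustrative binary firefly algorithm on a toy objective, not the paper's method: a real reconfiguration study would evaluate a radial power-flow model under radiality constraints, whereas here the objective, parameters, and the 12-switch example are all hypothetical.

```python
import math
import random

def binary_firefly(objective, n_bits, n_fireflies=20, n_iter=60,
                   beta0=1.0, gamma=1.0, alpha=0.3, seed=1):
    """Generic binary firefly algorithm sketch (minimization).
    Each firefly moves toward brighter (lower-objective) ones; the
    continuous step is squashed by a sigmoid into a bit-flip probability."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(n_fireflies)]
    fit = [objective(x) for x in pop]
    best = min(zip(fit, pop))
    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if fit[j] < fit[i]:  # j is brighter: move i toward j
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)  # attractiveness decays with distance
                    for k in range(n_bits):
                        step = beta * (pop[j][k] - pop[i][k]) + alpha * (rng.random() - 0.5)
                        prob = 1.0 / (1.0 + math.exp(-step))  # sigmoid -> probability of bit = 1
                        pop[i][k] = 1 if rng.random() < prob else 0
                    fit[i] = objective(pop[i])
                    if fit[i] < best[0]:
                        best = (fit[i], pop[i][:])
    return best

# Toy stand-in objective: prefer exactly 4 closed switches out of 12 (hypothetical).
loss, config = binary_firefly(lambda x: abs(sum(x) - 4), n_bits=12)
print(loss, config)
```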

  1. Analyzing Distributed Functions in an Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2010-01-01

    Large scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult as compared to centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large scale distributed systems, the paper will also describe how to analyze secondary functions to secondary functions through the use of channelization.

  2. Double Wigner distribution function of a first-order optical system with a hard-edge aperture.

    PubMed

    Pan, Weiqing

    2008-01-01

    The effect of an apertured optical system on the Wigner distribution can be expressed as a superposition integral of the input Wigner distribution function and the double Wigner distribution function of the apertured optical system. By expanding a hard aperture function into a finite sum of complex Gaussian functions, the double Wigner distribution functions of a first-order optical system with a hard aperture outside and inside it are derived. As an example of application, analytical expressions of the Wigner distribution for a Gaussian beam passing through a spatial filtering optical system with an internal hard aperture are obtained. The analytical results are also compared with numerical integral results, showing that the analytical approach is accurate and computationally advantageous.

  3. The feasibility and stability of large complex biological networks: a random matrix approach.

    PubMed

    Stone, Lewi

    2018-05-29

    In the 1970s, Robert May demonstrated that complexity creates instability in generic models of ecological networks having random interaction matrices A. Similar random matrix models have since been applied in many disciplines. Central to assessing stability is the "circular law", since it describes the eigenvalue distribution for an important class of random matrices A. However, despite widespread adoption, the "circular law" does not apply for ecological systems in which density dependence operates (i.e., where a species' growth is determined by its density). Instead one needs to study the far more complicated eigenvalue distribution of the community matrix S = DA, where D is a diagonal matrix of population equilibrium values. Here we obtain this eigenvalue distribution. We show that if the random matrix A is locally stable, the community matrix S = DA will also be locally stable, provided the system is feasible (i.e., all species have positive equilibria, D > 0). This helps explain why, unusually, nearly all feasible systems studied here are locally stable. Large complex systems may thus be even more fragile than May predicted, given the difficulty of assembling a feasible system. It was also found that the degree of stability, or resilience, of a system depended on the minimum equilibrium population.
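
    The stability claim lends itself to a quick numerical check. The sketch below is an illustration constructed for this summary, not code from the paper: it builds a small locally stable random interaction matrix A (strong self-regulation plus weak couplings, with arbitrary sizes and scales), a feasible positive equilibrium matrix D, and confirms that the community matrix S = DA is also locally stable.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# A locally stable random interaction matrix: strong self-regulation
# on the diagonal plus weak random couplings (scales chosen for
# illustration only).
A = -2.0 * np.eye(n) + 0.1 * rng.uniform(-1, 1, size=(n, n))

# Feasible equilibrium: all species densities strictly positive.
D = np.diag(rng.uniform(0.5, 1.5, size=n))

# Community matrix S = DA; local stability <=> all eigenvalues
# have negative real part.
S = D @ A
print(np.linalg.eigvals(A).real.max(), np.linalg.eigvals(S).real.max())
```

Both maxima come out negative for this feasible example, consistent with the result that feasibility (D > 0) carries local stability of A over to S = DA.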

  4. Complexity and dynamics of topological and community structure in complex networks

    NASA Astrophysics Data System (ADS)

    Berec, Vesna

    2017-07-01

    Complexity is highly susceptible to variations in the network dynamics, reflected on its underlying architecture where topological organization of cohesive subsets into clusters, system's modular structure and resulting hierarchical patterns, are cross-linked with functional dynamics of the system. Here we study connection between hierarchical topological scales of the simplicial complexes and the organization of functional clusters - communities in complex networks. The analysis reveals the full dynamics of different combinatorial structures of q-th-dimensional simplicial complexes and their Laplacian spectra, presenting spectral properties of resulting symmetric and positive semidefinite matrices. The emergence of system's collective behavior from inhomogeneous statistical distribution is induced by hierarchically ordered topological structure, which is mapped to simplicial complex where local interactions between the nodes clustered into subcomplexes generate flow of information that characterizes complexity and dynamics of the full system.

  5. Simulation of groundwater flow in the glacial aquifer system of northeastern Wisconsin with variable model complexity

    USGS Publications Warehouse

    Juckem, Paul F.; Clark, Brian R.; Feinstein, Daniel T.

    2017-05-04

    The U.S. Geological Survey, National Water-Quality Assessment seeks to map estimated intrinsic susceptibility of the glacial aquifer system of the conterminous United States. Improved understanding of the hydrogeologic characteristics that explain spatial patterns of intrinsic susceptibility, commonly inferred from estimates of groundwater age distributions, is sought so that methods used for the estimation process are properly equipped. An important step beyond identifying relevant hydrogeologic datasets, such as glacial geology maps, is to evaluate how incorporation of these resources into process-based models using differing levels of detail could affect resulting simulations of groundwater age distributions and, thus, estimates of intrinsic susceptibility. This report describes the construction and calibration of three groundwater-flow models of northeastern Wisconsin that were developed with differing levels of complexity to provide a framework for subsequent evaluations of the effects of process-based model complexity on estimations of groundwater age distributions for withdrawal wells and streams. Preliminary assessments, which focused on the effects of model complexity on simulated water levels and base flows in the glacial aquifer system, illustrate that simulation of vertical gradients using multiple model layers improves simulated heads more in low-permeability units than in high-permeability units. Moreover, simulation of heterogeneous hydraulic conductivity fields in coarse-grained and some fine-grained glacial materials produced a larger improvement in simulated water levels in the glacial aquifer system compared with simulation of uniform hydraulic conductivity within zones. The relation between base flows and model complexity was less clear; however, it generally seemed to follow a similar pattern as water levels.
Although increased model complexity resulted in improved calibrations, future application of the models using simulated particle tracking is anticipated to evaluate whether these model design considerations are similarly important for the primary modeling objective: to simulate reasonable groundwater age distributions.

  6. Robust scalable stabilisability conditions for large-scale heterogeneous multi-agent systems with uncertain nonlinear interactions: towards a distributed computing architecture

    NASA Astrophysics Data System (ADS)

    Manfredi, Sabato

    2016-06-01

    Large-scale dynamic systems are becoming highly pervasive, with applications ranging from systems biology, environment monitoring, and sensor networks to power systems. They are characterised by high dimensionality, complexity, and uncertainty in the node dynamics/interactions, and they require increasingly computationally demanding methods for analysis and control design as the network size and node/interaction complexity grow. It is therefore a challenging problem to find scalable computational methods for the distributed control design of large-scale networks. In this paper, we investigate the robust distributed stabilisation problem of large-scale nonlinear multi-agent systems (briefly, MASs) composed of non-identical (heterogeneous) linear dynamical systems coupled by uncertain nonlinear time-varying interconnections. By employing Lyapunov stability theory and the linear matrix inequality (LMI) technique, new conditions are given for the distributed control design of large-scale MASs that can be easily solved with the MATLAB toolbox. The stabilisability of each node dynamic is a sufficient assumption to design a global stabilising distributed control. The proposed approach improves some of the existing LMI-based results on MASs by both overcoming their computational limits and extending the applicative scenario to large-scale nonlinear heterogeneous MASs. Additionally, the proposed LMI conditions are further reduced in terms of computational requirements in the case of weakly heterogeneous MASs, which is a common scenario in real applications where the network nodes and links are affected by parameter uncertainties.
One of the main advantages of the proposed approach is that it allows a move from a centralised towards a distributed computing architecture, so that the expensive computational workload spent solving LMIs may be shared among processors located at the networked nodes, thus improving the scalability of the approach with the network size. Finally, a numerical example shows the applicability of the proposed method and its advantage in terms of computational complexity when compared with existing approaches.

  7. Statistical Features of Complex Systems ---Toward Establishing Sociological Physics---

    NASA Astrophysics Data System (ADS)

    Kobayashi, Naoki; Kuninaka, Hiroto; Wakita, Jun-ichi; Matsushita, Mitsugu

    2011-07-01

    Complex systems have recently attracted much attention, both in natural sciences and in sociological sciences. Members constituting a complex system evolve through nonlinear interactions among each other. This means that in a complex system the multiplicative experience or, so to speak, the history of each member produces its present characteristics. If attention is paid to any statistical property in any complex system, the lognormal distribution is the most natural and appropriate among the standard or ``normal'' statistics to overview the whole system. In fact, the lognormality emerges rather conspicuously when we examine, as familiar and typical examples of statistical aspects in complex systems, the nursing-care period for the aged, populations of prefectures and municipalities, and our body height and weight. Many other examples are found in nature and society. On the basis of these observations, we discuss the possibility of sociological physics.
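
    The link between multiplicative history and lognormality can be made concrete with a small simulation (an illustration for this summary, not from the paper): each "member" accumulates independent multiplicative growth factors, so the log of its final value is a sum of i.i.d. terms and, by the central limit theorem, tends to a normal distribution; the value itself is therefore approximately lognormal. The factor range and sample sizes below are arbitrary choices.

```python
import math
import random

random.seed(42)

# Each member's value is a product of many random growth factors.
samples = []
for _ in range(10000):
    v = 1.0
    for _ in range(100):
        v *= random.uniform(0.9, 1.1)
    samples.append(v)

# If the values are lognormal, their logs should look normal:
# in particular, the sample skewness of the logs should be near zero.
logs = [math.log(v) for v in samples]
n = len(logs)
mean = sum(logs) / n
var = sum((x - mean) ** 2 for x in logs) / n
skew = sum((x - mean) ** 3 for x in logs) / n / var ** 1.5
print(round(skew, 3))
```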

  8. VOLTTRON - An Intelligent Agent Platform for the Smart Grid

    ScienceCinema

    None

    2018-05-16

    The distributed nature of the Smart Grid, with responsive loads, solar and wind generation, and automation in the distribution system, presents a complex environment not easily controlled in a centralized manner.

  9. Distributed Trajectory Flexibility Preservation for Traffic Complexity Mitigation

    NASA Technical Reports Server (NTRS)

    Idris, Husni; Wing, David; Delahaye, Daniel

    2009-01-01

    The growing demand for air travel is increasing the need for mitigation of air traffic congestion and complexity problems, which are already at high levels. At the same time new information and automation technologies are enabling the distribution of tasks and decisions from the service providers to the users of the air traffic system, with potential capacity and cost benefits. This distribution of tasks and decisions raises the concern that independent user actions will decrease the predictability and increase the complexity of the traffic system, hence inhibiting and possibly reversing any potential benefits. In answer to this concern, the authors propose the introduction of decision-making metrics for preserving user trajectory flexibility. The hypothesis is that such metrics will make user actions naturally mitigate traffic complexity. In this paper, the impact of using these metrics on traffic complexity is investigated. The scenarios analyzed include aircraft in en route airspace with each aircraft meeting a required time of arrival in a one-hour time horizon while mitigating the risk of loss of separation with the other aircraft, thus preserving its trajectory flexibility. The experiments showed promising results in that the individual trajectory flexibility preservation induced self-separation and self-organization effects in the overall traffic situation. The effects were quantified using traffic complexity metrics based on Lyapunov exponents and traffic proximity.

  10. Multilevel recording of complex amplitude data pages in a holographic data storage system using digital holography.

    PubMed

    Nobukawa, Teruyoshi; Nomura, Takanori

    2016-09-05

    A holographic data storage system using digital holography is proposed to record and retrieve multilevel complex amplitude data pages. Digital holographic techniques are capable of modulating and detecting complex amplitude distribution using current electronic devices. These techniques allow the development of a simple, compact, and stable holographic storage system that mainly consists of a single phase-only spatial light modulator and an image sensor. As a proof-of-principle experiment, complex amplitude data pages with binary amplitude and four-level phase are recorded and retrieved. Experimental results show the feasibility of the proposed holographic data storage system.

  11. Self-dissimilarity as a High Dimensional Complexity Measure

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.; Macready, William

    2005-01-01

    For many systems characterized as "complex" the patterns exhibited on different scales differ markedly from one another. For example the biomass distribution in a human body "looks very different" depending on the scale at which one examines it. Conversely, the patterns at different scales in "simple" systems (e.g., gases, mountains, crystals) vary little from one scale to another. Accordingly, the degrees of self-dissimilarity between the patterns of a system at various scales constitute a complexity "signature" of that system. Here we present a novel quantification of self-dissimilarity. This signature can, if desired, incorporate a novel information-theoretic measure of the distance between probability distributions that we derive here. Whatever distance measure is chosen, our quantification of self-dissimilarity can be measured for many kinds of real-world data. This allows comparisons of the complexity signatures of wholly different kinds of systems (e.g., systems involving information density in a digital computer vs. species densities in a rain forest vs. capital density in an economy, etc.). Moreover, in contrast to many other suggested complexity measures, evaluating the self-dissimilarity of a system does not require one to already have a model of the system. These facts may allow self-dissimilarity signatures to be used as the underlying observational variables of an eventual overarching theory relating all complex systems. To illustrate self-dissimilarity we present several numerical experiments. In particular, we show that the underlying structure of the logistic map is picked out by the self-dissimilarity signature of time series produced by that map.

  12. The construction of general basis functions in reweighting ensemble dynamics simulations: Reproduce equilibrium distribution in complex systems from multiple short simulation trajectories

    NASA Astrophysics Data System (ADS)

    Zhang, Chuan-Biao; Ming, Li; Xin, Zhou

    2015-12-01

    Ensemble simulations, which use multiple short independent trajectories from dispersive initial conformations rather than the single long trajectory of traditional simulations, are expected to sample complex systems such as biomolecules much more efficiently. Re-weighted ensemble dynamics (RED) is designed to combine these short trajectories to reconstruct the global equilibrium distribution. In the RED, a number of conformational functions, called basis functions, are applied to relate these trajectories to each other; a detailed-balance-based linear equation is then built, whose solution provides the weights of these trajectories in the equilibrium distribution. Thus, the sufficient and efficient selection of basis functions is critical to the practical application of RED. Here, we review and present a few possible ways to construct basis functions for applying the RED in complex molecular systems. In particular, for systems with little a priori knowledge, we can use the root mean squared deviation (RMSD) among conformations to split the whole conformational space into a set of cells, then use the RMSD-based cell functions as basis functions. We demonstrate the application of the RED in typical systems, including a two-dimensional toy model, the lattice Potts model, and a short peptide system. The results indicate that the RED, with these constructions of basis functions, not only samples complex systems more efficiently but also provides a general way to understand the metastable structure of conformational space. Project supported by the National Natural Science Foundation of China (Grant No. 11175250).

  13. Modeling a hierarchical structure of factors influencing exploitation policy for water distribution systems using ISM approach

    NASA Astrophysics Data System (ADS)

    Jasiulewicz-Kaczmarek, Małgorzata; Wyczółkowski, Ryszard; Gładysiak, Violetta

    2017-12-01

    Water distribution systems are one of the basic elements of the contemporary technical infrastructure of urban and rural areas. They are complex engineering systems composed of transmission networks and auxiliary equipment (e.g. controllers, checkouts etc.), scattered territorially over a large area. From the operational point of view, their basic features are functional variability, resulting from the need to adjust the system to temporary fluctuations in demand for water, and territorial dispersion. The main research questions are: What external factors should be taken into account when developing an effective water distribution policy? Do the size and nature of the water distribution system significantly affect the exploitation policy implemented? These questions have shaped the objectives of the research and the method of its implementation.

  14. Distributed mixed-integer fuzzy hierarchical programming for municipal solid waste management. Part I: System identification and methodology development.

    PubMed

    Cheng, Guanhui; Huang, Guohe; Dong, Cong; Xu, Ye; Chen, Xiujuan; Chen, Jiapei

    2017-03-01

    Due to the complexities of heterogeneity, hierarchy, discreteness, and interactions in municipal solid waste management (MSWM) systems such as Beijing, China, a series of socio-economic and eco-environmental problems may emerge or worsen and result in irredeemable damage in the following decades. Meanwhile, existing studies, especially ones focusing on MSWM in Beijing, could hardly reflect these complexities in system simulations and provide reliable decision support for management practices. Thus, a framework of distributed mixed-integer fuzzy hierarchical programming (DMIFHP) is developed in this study for MSWM under these complexities. Beijing is selected as a representative case. The Beijing MSWM system is comprehensively analyzed in many aspects such as socio-economic conditions, natural conditions, spatial heterogeneities, treatment facilities, and system complexities, building a solid foundation for system simulation and optimization. Correspondingly, the MSWM system in Beijing is discretized into 235 grids to reflect spatial heterogeneity. A DMIFHP model, which is a nonlinear programming problem, is constructed to parameterize the Beijing MSWM system. To solve it systematically, a solution algorithm is proposed based on the coupling of fuzzy programming and mixed-integer linear programming. Innovations and advantages of the DMIFHP framework are discussed. The optimal MSWM schemes and mechanism revelations will be discussed in a companion paper due to length limitations.

  15. Temporal Decomposition of a Distribution System Quasi-Static Time-Series Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mather, Barry A; Hunsberger, Randolph J

    This paper documents the first phase of an investigation into reducing runtimes of complex OpenDSS models through parallelization. As the method seems promising, future work will quantify, and further mitigate, errors arising from this process. In this initial report, we demonstrate how, through the use of temporal decomposition, the run times of a complex distribution-system-level quasi-static time-series simulation can be reduced roughly in proportion to the level of parallelization. Using this method, the monolithic model runtime of 51 hours was reduced to a minimum of about 90 minutes. As expected, this comes at the expense of control and voltage errors at the time-slice boundaries. All evaluations were performed using a real distribution circuit model with the addition of 50 PV systems, representing a mock complex PV impact study. We are able to reduce induced transition errors through the addition of controls initialization, though small errors persist. The time savings with parallelization are so significant that we feel additional investigation to reduce control errors is warranted.
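
    The idea of temporal decomposition with controls initialization can be sketched on a toy model (constructed for this summary, not the paper's OpenDSS workflow): a single exponentially smoothed "controller state" stands in for the simulation, the time axis is split into chunks that could run in parallel, and each chunk re-runs a short warm-up window before its start so the state is approximately initialized at the boundary. All names and parameters below are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor

ALPHA = 0.9  # toy controller "memory" per time step

def simulate(loads, state=0.0):
    """Toy stand-in for a quasi-static time-series run: one
    exponentially smoothed controller state driven by the load."""
    out = []
    for load in loads:
        state = ALPHA * state + (1 - ALPHA) * load
        out.append(state)
    return out

# Square-wave load profile: blocks of 100 high/low steps.
loads = [1.0 if (t // 100) % 2 == 0 else 0.0 for t in range(1000)]
truth = simulate(loads)  # monolithic reference run

def run_chunk(start, stop, warmup):
    # Re-simulate a short warm-up window before the chunk so the
    # controller state is (approximately) initialized at the boundary,
    # then discard the warm-up samples.
    w = max(0, start - warmup)
    return simulate(loads[w:stop])[start - w:]

def decomposed(warmup, chunk=250):
    starts = range(0, len(loads), chunk)
    with ThreadPoolExecutor() as pool:  # chunks are independent
        parts = pool.map(lambda s: run_chunk(s, s + chunk, warmup), starts)
    return [v for part in parts for v in part]

err_cold = max(abs(a - b) for a, b in zip(truth, decomposed(warmup=0)))
err_warm = max(abs(a - b) for a, b in zip(truth, decomposed(warmup=50)))
print(err_cold, err_warm)
```

Without initialization the state resets to zero at every boundary and the error is large; with a 50-step warm-up the boundary mismatch decays geometrically and the parallel result tracks the monolithic run closely, mirroring the residual "small errors" noted above.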

  16. Autonomous perception and decision making in cyber-physical systems

    NASA Astrophysics Data System (ADS)

    Sarkar, Soumik

    2011-07-01

    The cyber-physical system (CPS) is a relatively new interdisciplinary technology area that includes the general class of embedded and hybrid systems. CPSs require the integration of computation and physical processes, involving physical quantities such as time, energy, and space during information processing and control. The physical space is the source of information and the cyber space makes use of the generated information to make decisions. This dissertation proposes an overall architecture of autonomous perception-based decision and control of complex cyber-physical systems. Perception involves the recently developed framework of Symbolic Dynamic Filtering for abstraction of the physical world in the cyber space. For example, under this framework, sensor observations from a physical entity are discretized temporally and spatially to generate blocks of symbols, called words, that form a language. A grammar of a language is the set of rules that determine the relationships among words to build sentences. Subsequently, a physical system is conjectured to be a linguistic source that is capable of generating a specific language. The proposed technology is validated on various (experimental and simulated) case studies that include health monitoring of aircraft gas turbine engines, detection and estimation of fatigue damage in polycrystalline alloys, and parameter identification. Control of complex cyber-physical systems involves distributed sensing, computation, and control, as well as complexity analysis. A novel statistical mechanics-inspired complexity analysis approach is proposed in this dissertation. In such a scenario of networked physical systems, the distribution of physical entities determines the underlying network topology and the interaction among the entities forms the abstract cyber space.
It is envisioned that the general contributions, made in this dissertation, will be useful for potential application areas such as smart power grids and buildings, distributed energy systems, advanced health care procedures and future ground and air transportation systems.
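    The symbolization step of Symbolic Dynamic Filtering can be sketched in a few lines. This is a generic illustration of the idea, not the dissertation's implementation; the threshold partition and word depth are assumed parameters.

```python
from collections import Counter

def symbolize(signal, thresholds):
    """Discretize a real-valued signal into symbols via a fixed partition:
    each sample maps to the index of the threshold interval it falls in."""
    def symbol(x):
        return sum(x > t for t in thresholds)  # value in 0..len(thresholds)
    return [symbol(x) for x in signal]

def word_distribution(symbols, depth):
    """Estimate probabilities of symbol blocks ('words') of a given length,
    characterizing the symbolic dynamics of the generating source."""
    words = [tuple(symbols[i:i + depth])
             for i in range(len(symbols) - depth + 1)]
    counts = Counter(words)
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

sig = [0.1, 0.9, 0.2, 0.8, 0.15, 0.85]
syms = symbolize(sig, thresholds=[0.5])
assert syms == [0, 1, 0, 1, 0, 1]
dist = word_distribution(syms, depth=2)
assert dist == {(0, 1): 0.6, (1, 0): 0.4}  # the alternating pattern dominates
```

    Anomaly detection in this framework then amounts to comparing the word distribution of current sensor data against that of a nominal reference condition.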

  17. Information Theory Applied to Dolphin Whistle Vocalizations with Possible Application to SETI Signals

    NASA Astrophysics Data System (ADS)

    Doyle, Laurance R.; McCowan, Brenda; Hanser, Sean F.

    2002-01-01

    Information theory allows a quantification of the complexity of a given signaling system. We are applying information theory to dolphin whistle vocalizations, humpback whale songs, squirrel monkey chuck calls, and several other animal communication systems in order to develop a quantitative and objective way to compare the complexity of communication systems across species. Once signaling units have been correctly classified, a communication system must obey certain statistical distributions in order to contain complexity, whether it is a human language, dolphin whistle vocalizations, or even a system of communication signals received from an extraterrestrial source.
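    One of the basic information-theoretic quantities involved is the first-order Shannon entropy of the classified signaling units. The toy sequence below is illustrative only, not data from the study.

```python
import math
from collections import Counter

def shannon_entropy(sequence):
    """First-order Shannon entropy (bits per unit) of a sequence of
    signaling units -- a baseline quantity for comparing the complexity
    of communication systems."""
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

units = list("aabbaacdaab")  # a toy sequence over four unit types
h = shannon_entropy(units)
assert 0.0 < h <= math.log2(len(set(units)))  # bounded by log2(alphabet size)
assert shannon_entropy("aaaa") == 0.0  # one repeated unit carries no information
```

    Higher-order entropies over blocks of units, and the decay of entropy with block length, are what distinguish structured communication from both random and trivially repetitive signals.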

  18. Reduze - Feynman integral reduction in C++

    NASA Astrophysics Data System (ADS)

    Studerus, C.

    2010-07-01

    Reduze is a computer program for reducing Feynman integrals to master integrals employing a Laporta algorithm. The program is written in C++ and uses classes provided by the GiNaC library to perform the simplifications of the algebraic prefactors in the system of equations. Reduze offers the possibility to run reductions in parallel. Program summary: Program title: Reduze. Catalogue identifier: AEGE_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGE_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: yes. No. of lines in distributed program, including test data, etc.: 55 433. No. of bytes in distributed program, including test data, etc.: 554 866. Distribution format: tar.gz. Programming language: C++. Computer: All. Operating system: Unix/Linux. Number of processors used: problem dependent; more than one is possible, but not arbitrarily many. RAM: depends on the complexity of the system. Classification: 4.4, 5. External routines: CLN (http://www.ginac.de/CLN/), GiNaC (http://www.ginac.de/). Nature of problem: solving large systems of linear equations with Feynman integrals as unknowns and rational polynomials as prefactors. Solution method: a Gauss/Laporta algorithm to solve the system of equations. Restrictions: limitations depend on the complexity of the system (number of equations, number of kinematic invariants). Running time: depends on the complexity of the system.
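    The core arithmetic of such a reduction is exact Gaussian elimination. The sketch below uses Python's `Fraction` as a stand-in for the rational polynomial prefactors that Reduze handles via GiNaC/CLN; it illustrates the elimination step only, not the Laporta ordering of integrals.

```python
from fractions import Fraction

def gauss_solve(matrix, rhs):
    """Gauss-Jordan elimination over exact rationals -- the kind of exact
    arithmetic a Laporta-style reduction performs on its linear system."""
    n = len(matrix)
    a = [[Fraction(x) for x in row] + [Fraction(b)]
         for row, b in zip(matrix, rhs)]
    for col in range(n):
        # Find a nonzero pivot and move it into place.
        pivot = next(r for r in range(col, n) if a[r][col] != 0)
        a[col], a[pivot] = a[pivot], a[col]
        # Eliminate this column from every other row.
        for r in range(n):
            if r != col and a[r][col] != 0:
                factor = a[r][col] / a[col][col]
                a[r] = [x - factor * y for x, y in zip(a[r], a[col])]
    return [a[i][n] / a[i][i] for i in range(n)]

# 2x + y = 5, x - y = 1  ->  x = 2, y = 1 (exactly, with no rounding)
assert gauss_solve([[2, 1], [1, -1]], [5, 1]) == [Fraction(2), Fraction(1)]
```

    In an actual reduction, the unknowns are Feynman integrals ordered by complexity, and elimination expresses the harder integrals in terms of a small basis of master integrals.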

  19. Resource Management for Distributed Parallel Systems

    NASA Technical Reports Server (NTRS)

    Neuman, B. Clifford; Rao, Santosh

    1993-01-01

    Multiprocessor systems should exist in the larger context of distributed systems, allowing multiprocessor resources to be shared by those that need them. Unfortunately, typical multiprocessor resource management techniques do not scale to large networks. The Prospero Resource Manager (PRM) is a scalable resource allocation system that supports the allocation of processing resources in large networks and multiprocessor systems. To manage resources in such distributed parallel systems, PRM employs three types of managers: system managers, job managers, and node managers. There exist multiple independent instances of each type of manager, reducing bottlenecks. The complexity of each manager is further reduced because each is designed to utilize information at an appropriate level of abstraction.

  20. Automated distribution system management for multichannel space power systems

    NASA Technical Reports Server (NTRS)

    Fleck, G. W.; Decker, D. K.; Graves, J.

    1983-01-01

    A NASA sponsored study of space power distribution system technology is in progress to develop an autonomously managed power system (AMPS) for large space power platforms. The multichannel, multikilowatt, utility-type power subsystem proposed presents new survivability requirements and increased subsystem complexity. The computer controls under development for the power management system must optimize the power subsystem performance and minimize the life cycle cost of the platform. A distribution system management philosophy has been formulated which incorporates these constraints. Its implementation using a TI9900 microprocessor and FORTH as the programming language is presented. The approach offers a novel solution to the perplexing problem of determining the optimal combination of loads which should be connected to each power channel for a versatile electrical distribution concept.

  1. Building a generalized distributed system model

    NASA Technical Reports Server (NTRS)

    Mukkamala, R.

    1992-01-01

    The key elements in the second year (1991-92) of our project are: (1) implementation of the distributed system prototype; (2) successful passing of the candidacy examination and a PhD proposal acceptance by the funded student; (3) design of storage efficient schemes for replicated distributed systems; and (4) modeling of gracefully degrading reliable computing systems. In the third year of the project (1992-93), we propose to: (1) complete the testing of the prototype; (2) enhance the functionality of the modules by enabling the experimentation with more complex protocols; (3) use the prototype to verify the theoretically predicted performance of locking protocols, etc.; and (4) work on issues related to real-time distributed systems. This should result in efficient protocols for these systems.

  2. Programming model for distributed intelligent systems

    NASA Technical Reports Server (NTRS)

    Sztipanovits, J.; Biegl, C.; Karsai, G.; Bogunovic, N.; Purves, B.; Williams, R.; Christiansen, T.

    1988-01-01

    A programming model and architecture which was developed for the design and implementation of complex, heterogeneous measurement and control systems is described. The Multigraph Architecture integrates artificial intelligence techniques with conventional software technologies, offers a unified framework for distributed and shared memory based parallel computational models and supports multiple programming paradigms. The system can be implemented on different hardware architectures and can be adapted to strongly different applications.

  3. The R-Shell approach - Using scheduling agents in complex distributed real-time systems

    NASA Technical Reports Server (NTRS)

    Natarajan, Swaminathan; Zhao, Wei; Goforth, Andre

    1993-01-01

    Large, complex real-time systems such as space and avionics systems are extremely demanding in their scheduling requirements. The current OS design approaches are quite limited in the capabilities they provide for task scheduling. Typically, they simply implement a particular uniprocessor scheduling strategy and do not provide any special support for network scheduling, overload handling, fault tolerance, distributed processing, etc. Our design of the R-Shell real-time environment facilitates the implementation of a variety of sophisticated but efficient scheduling strategies, including incorporation of all these capabilities. This is accomplished by the use of scheduling agents which reside in the application run-time environment and are responsible for coordinating the scheduling of the application.

  4. High rate information systems - Architectural trends in support of the interdisciplinary investigator

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Preheim, Larry E.

    1990-01-01

    Data systems requirements in the Earth Observing System (EOS) Space Station Freedom (SSF) eras indicate increasing data volume, increased discipline interplay, higher complexity and broader data integration and interpretation. A response to the needs of the interdisciplinary investigator is proposed, considering the increasing complexity and rising costs of scientific investigation. The EOS Data Information System, conceived to be a widely distributed system with reliable communication links between central processing and the science user community, is described. Details are provided on information architecture, system models, intelligent data management of large complex databases, and standards for archiving ancillary data, using a research library, a laboratory and collaboration services.

  5. A Distributed Prognostic Health Management Architecture

    NASA Technical Reports Server (NTRS)

    Bhaskar, Saha; Saha, Sankalita; Goebel, Kai

    2009-01-01

    This paper introduces a generic distributed prognostic health management (PHM) architecture with specific application to the electrical power systems domain. Current state-of-the-art PHM systems are mostly centralized in nature, where all the processing is reliant on a single processor. This can lead to loss of functionality in case of a crash of the central processor or monitor. Furthermore, with increases in the volume of sensor data as well as the complexity of algorithms, traditional centralized systems become unsuitable for successful deployment, and efficient distributed architectures are required. A distributed architecture though, is not effective unless there is an algorithmic framework to take advantage of its unique abilities. The health management paradigm envisaged here incorporates a heterogeneous set of system components monitored by a varied suite of sensors and a particle filtering (PF) framework that has the power and the flexibility to adapt to the different diagnostic and prognostic needs. Both the diagnostic and prognostic tasks are formulated as a particle filtering problem in order to explicitly represent and manage uncertainties; however, typically the complexity of the prognostic routine is higher than the computational power of one computational element ( CE). Individual CEs run diagnostic routines until the system variable being monitored crosses beyond a nominal threshold, upon which it coordinates with other networked CEs to run the prognostic routine in a distributed fashion. Implementation results from a network of distributed embedded devices monitoring a prototypical aircraft electrical power system are presented, where the CEs are Sun Microsystems Small Programmable Object Technology (SPOT) devices.
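    The particle filtering core shared by the diagnostic and prognostic tasks can be sketched generically. This is a textbook bootstrap particle filter with an invented toy degradation model, not the paper's formulation or its distributed coordination logic.

```python
import math
import random

def pf_step(particles, transition, likelihood, obs, rng):
    """One predict/update/resample cycle of a bootstrap particle filter."""
    predicted = [transition(p, rng) for p in particles]   # predict
    weights = [likelihood(obs, p) for p in predicted]     # update
    return rng.choices(predicted, weights=weights,        # resample
                       k=len(predicted))

rng = random.Random(1)
# Toy degradation model: the health state drifts upward with process noise;
# the observation is the state seen through Gaussian measurement noise.
transition = lambda x, r: x + 0.1 + r.gauss(0, 0.05)
likelihood = lambda z, x: math.exp(-(z - x) ** 2 / (2 * 0.1 ** 2))

particles = [0.0] * 500
for obs in [0.1, 0.2, 0.3, 0.4]:
    particles = pf_step(particles, transition, likelihood, obs, rng)
mean = sum(particles) / len(particles)
assert 0.2 < mean < 0.6  # the estimate tracks the drifting degradation level
```

    In the distributed setting described above, a single computational element runs such diagnostic cycles locally and recruits networked peers only when a threshold crossing triggers the heavier prognostic computation.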

  6. Distributed containment control of heterogeneous fractional-order multi-agent systems with communication delays

    NASA Astrophysics Data System (ADS)

    Yang, Hongyong; Han, Fujun; Zhao, Mei; Zhang, Shuning; Yue, Jun

    2017-08-01

    Because many networked systems can only be characterized with fractional-order dynamics in complex environments, fractional-order calculus has recently been studied in depth. When diverse individual features are shown in different agents of networked systems, heterogeneous fractional-order dynamics will be used to describe the complex systems. Based on the distinguishing properties of agents, heterogeneous fractional-order multi-agent systems (FOMAS) are presented. With the supposition of multiple leader agents in FOMAS, distributed containment control of FOMAS is studied in directed weighted topologies. By applying Laplace transformation and frequency domain theory of the fractional-order operator, an upper bound of delays is obtained to ensure containment consensus of delayed heterogeneous FOMAS. Consensus results of delayed FOMAS in this paper can be extended to systems with integer-order models. Finally, numerical examples are used to verify our results.

  7. System-of-Systems Approach for Integrated Energy Systems Modeling and Simulation: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mittal, Saurabh; Ruth, Mark; Pratt, Annabelle

    Today’s electricity grid is the most complex system ever built—and the future grid is likely to be even more complex because it will incorporate distributed energy resources (DERs) such as wind, solar, and various other sources of generation and energy storage. The complexity is further augmented by the possible evolution to new retail market structures that provide incentives to owners of DERs to support the grid. To understand and test new retail market structures and technologies such as DERs, demand-response equipment, and energy management systems while providing reliable electricity to all customers, an Integrated Energy System Model (IESM) is being developed at NREL. The IESM is composed of a power flow simulator (GridLAB-D), home energy management systems implemented using GAMS/Pyomo, a market layer, and hardware-in-the-loop simulation (testing appliances such as HVAC, dishwasher, etc.). The IESM is a system-of-systems (SoS) simulator wherein the constituent systems are brought together in a virtual testbed. We will describe an SoS approach for developing a distributed simulation environment. We will elaborate on the methodology and the control mechanisms used in the co-simulation illustrated by a case study.

  8. Maximum entropy approach to H -theory: Statistical mechanics of hierarchical systems

    NASA Astrophysics Data System (ADS)

    Vasconcelos, Giovani L.; Salazar, Domingos S. P.; Macêdo, A. M. S.

    2018-02-01

    A formalism, called H-theory, is applied to the problem of statistical equilibrium of a hierarchical complex system with multiple time and length scales. In this approach, the system is formally treated as being composed of a small subsystem—representing the region where the measurements are made—in contact with a set of "nested heat reservoirs" corresponding to the hierarchical structure of the system, where the temperatures of the reservoirs are allowed to fluctuate owing to the complex interactions between degrees of freedom at different scales. The probability distribution function (pdf) of the temperature of the reservoir at a given scale, conditioned on the temperature of the reservoir at the next largest scale in the hierarchy, is determined from a maximum entropy principle subject to appropriate constraints that describe the thermal equilibrium properties of the system. The marginal temperature distribution of the innermost reservoir is obtained by integrating over the conditional distributions of all larger scales, and the resulting pdf is written in analytical form in terms of certain special transcendental functions, known as the Fox H functions. The distribution of states of the small subsystem is then computed by averaging the quasiequilibrium Boltzmann distribution over the temperature of the innermost reservoir. This distribution can also be written in terms of H functions. The general family of distributions reported here recovers, as particular cases, the stationary distributions recently obtained by Macêdo et al. [Phys. Rev. E 95, 032315 (2017), 10.1103/PhysRevE.95.032315] from a stochastic dynamical approach to the problem.
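    The compounding structure described in this abstract can be summarized schematically (the notation below is ours, not necessarily the authors'): the marginal distribution of the innermost inverse temperature is obtained by integrating the chain of maximum-entropy conditionals over the hierarchy, and the observable distribution follows by averaging the Boltzmann factor over it.

```latex
f(\beta_1) \;=\; \int_0^{\infty}\! d\beta_2 \cdots \int_0^{\infty}\! d\beta_N \,
  \prod_{j=1}^{N} f_j\!\left(\beta_j \mid \beta_{j+1}\right),
\qquad
P(x) \;=\; \int_0^{\infty}\! d\beta_1 \, f(\beta_1)\, p\!\left(x \mid \beta_1\right),
```

    where $f_j(\beta_j \mid \beta_{j+1})$ is the maximum-entropy conditional distribution at scale $j$ given the next-largest scale (with $\beta_{N+1}$ fixed by the outermost reservoir), and $p(x \mid \beta_1)$ is the quasiequilibrium Boltzmann distribution of the small subsystem. The nested integrals of this form are what yield the Fox $H$-function representations mentioned in the abstract.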

  9. Maximum entropy approach to H-theory: Statistical mechanics of hierarchical systems.

    PubMed

    Vasconcelos, Giovani L; Salazar, Domingos S P; Macêdo, A M S

    2018-02-01

    A formalism, called H-theory, is applied to the problem of statistical equilibrium of a hierarchical complex system with multiple time and length scales. In this approach, the system is formally treated as being composed of a small subsystem-representing the region where the measurements are made-in contact with a set of "nested heat reservoirs" corresponding to the hierarchical structure of the system, where the temperatures of the reservoirs are allowed to fluctuate owing to the complex interactions between degrees of freedom at different scales. The probability distribution function (pdf) of the temperature of the reservoir at a given scale, conditioned on the temperature of the reservoir at the next largest scale in the hierarchy, is determined from a maximum entropy principle subject to appropriate constraints that describe the thermal equilibrium properties of the system. The marginal temperature distribution of the innermost reservoir is obtained by integrating over the conditional distributions of all larger scales, and the resulting pdf is written in analytical form in terms of certain special transcendental functions, known as the Fox H functions. The distribution of states of the small subsystem is then computed by averaging the quasiequilibrium Boltzmann distribution over the temperature of the innermost reservoir. This distribution can also be written in terms of H functions. The general family of distributions reported here recovers, as particular cases, the stationary distributions recently obtained by Macêdo et al. [Phys. Rev. E 95, 032315 (2017)10.1103/PhysRevE.95.032315] from a stochastic dynamical approach to the problem.

  10. Preferential attachment and growth dynamics in complex systems

    NASA Astrophysics Data System (ADS)

    Yamasaki, Kazuko; Matia, Kaushik; Buldyrev, Sergey V.; Fu, Dongfeng; Pammolli, Fabio; Riccaboni, Massimo; Stanley, H. Eugene

    2006-09-01

    Complex systems can be characterized by classes of equivalency of their elements defined according to system specific rules. We propose a generalized preferential attachment model to describe the class size distribution. The model postulates preferential growth of the existing classes and the steady influx of new classes. According to the model, the distribution changes from a pure exponential form for zero influx of new classes to a power law with an exponential cut-off form when the influx of new classes is substantial. Predictions of the model are tested through the analysis of a unique industrial database, which covers both elementary units (products) and classes (markets, firms) in a given industry (pharmaceuticals), covering the entire size distribution. The model’s predictions are in good agreement with the data. The paper sheds light on the emergence of the exponent τ≈2 observed as a universal feature of many biological, social and economic problems.
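    The postulated mechanism, preferential growth of existing classes plus a steady influx of new classes, is easy to simulate. The sketch below is a minimal illustration with parameter names of our choosing, not the paper's calibrated model.

```python
import random

def simulate_classes(steps, birth_prob, seed=0):
    """Generalized preferential attachment for class sizes: at each step,
    with probability birth_prob a new class of size 1 enters; otherwise an
    existing class gains one unit with probability proportional to its size."""
    rng = random.Random(seed)
    sizes = [1]
    for _ in range(steps):
        if rng.random() < birth_prob:
            sizes.append(1)
        else:
            total = sum(sizes)
            r = rng.uniform(0, total)
            acc = 0
            for i, s in enumerate(sizes):
                acc += s
                if r <= acc:
                    sizes[i] += 1
                    break
    return sizes

sizes = simulate_classes(5000, birth_prob=0.1)
assert sum(sizes) == 5001                        # one unit added per step
assert max(sizes) >= sum(sizes) / len(sizes)     # skewed: max exceeds the mean
```

    With `birth_prob = 0` the size distribution relaxes toward the pure exponential form, while a substantial influx of new classes produces the power law with exponential cut-off described in the abstract.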

  11. Precision time distribution within a deep space communications complex

    NASA Technical Reports Server (NTRS)

    Curtright, J. B.

    1972-01-01

    The Precision Time Distribution System (PTDS) at the Goldstone Deep Space Communications Complex is a practical application of existing technology to the solution of a local problem. The problem was to synchronize four station timing systems to a master source with a relative accuracy consistently and significantly better than 10 microseconds. The solution involved combining a precision timing source, an automatic error detection assembly and a microwave distribution network into an operational system. Upon activation of the completed PTDS two years ago, synchronization accuracy at Goldstone (two station relative) was improved by an order of magnitude. It is felt that the validation of the PTDS mechanization is now completed. Other facilities which have site dispersion and synchronization accuracy requirements similar to Goldstone may find the PTDS mechanization useful in solving their problem. At present, the two station relative synchronization accuracy at Goldstone is better than one microsecond.

  12. Comparison between wavelet and wavelet packet transform features for classification of faults in distribution system

    NASA Astrophysics Data System (ADS)

    Arvind, Pratul

    2012-11-01

    The ability to identify and classify all ten types of faults in a distribution system is an important task for protection engineers. Unlike transmission systems, distribution systems have a complex configuration and are subjected to frequent faults. In the present work, an algorithm has been developed for identifying all ten types of faults in a distribution system by collecting current samples at the substation end. The samples are subjected to the wavelet packet transform and an artificial neural network in order to yield better classification results. A comparison of results between the wavelet transform and the wavelet packet transform is also presented, demonstrating that the features extracted from the wavelet packet transform yield promising results. It should also be noted that the current samples are collected after simulating a 25 kV distribution system in PSCAD software.
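    The kind of feature extraction involved can be illustrated with a single Haar decomposition level. This is a stand-in for the deeper wavelet packet filter bank the study uses (a packet transform recursively decomposes both the approximation and detail branches), and the waveform is invented for illustration.

```python
def haar_step(samples):
    """One level of the Haar wavelet transform: pairwise averages
    (approximation) and pairwise half-differences (detail)."""
    approx = [(samples[i] + samples[i + 1]) / 2
              for i in range(0, len(samples), 2)]
    detail = [(samples[i] - samples[i + 1]) / 2
              for i in range(0, len(samples), 2)]
    return approx, detail

# A current waveform with a sudden fault-like step:
current = [1.0, 1.0, 1.0, 5.0, 5.0, 5.0, 5.0, 5.0]
approx, detail = haar_step(current)
assert approx == [1.0, 3.0, 5.0, 5.0]
assert detail == [0.0, -2.0, 0.0, 0.0]  # the detail band localizes the step
```

    Statistics of such detail coefficients (energy, entropy, extrema) are typical inputs to a neural-network fault classifier.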

  13. Universality classes of fluctuation dynamics in hierarchical complex systems

    NASA Astrophysics Data System (ADS)

    Macêdo, A. M. S.; González, Iván R. Roa; Salazar, D. S. P.; Vasconcelos, G. L.

    2017-03-01

    A unified approach is proposed to describe the statistics of the short-time dynamics of multiscale complex systems. The probability density function of the relevant time series (signal) is represented as a statistical superposition of a large time-scale distribution weighted by the distribution of certain internal variables that characterize the slowly changing background. The dynamics of the background is formulated as a hierarchical stochastic model whose form is derived from simple physical constraints, which in turn restrict the dynamics to only two possible classes. The probability distributions of both the signal and the background have simple representations in terms of Meijer G functions. The two universality classes for the background dynamics manifest themselves in the signal distribution as two types of tails: power law and stretched exponential, respectively. A detailed analysis of empirical data from classical turbulence and financial markets shows excellent agreement with the theory.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hao, He; Lian, Jianming; Kalsi, Karanjit

    The HVAC (Heating, Ventilation, and Air- Conditioning) system of commercial buildings is a complex system with a large number of dynamically interacting components. In particular, the thermal dynamics of each zone are coupled with those of the neighboring zones. In this paper, we study a multi-agent based approach to model and control commercial building HVAC system for providing grid services. In the multi-agent system (MAS), individual zones are modeled as agents that can communicate, interact, and negotiate with one another to achieve a common objective. We first propose a distributed characterization method on the aggregated airflow (and thus fan power) flexibility that the HVAC system can provide to the ancillary service market. Then, we propose a Nash-bargaining based airflow allocation strategy to track a dispatch signal (that is within the offered flexibility limit) while respecting the preference and flexibility of individual zones. Moreover, we devise a distributed algorithm to obtain the Nash bargaining solution via dual decomposition and average consensus. Numerical simulations illustrate that the proposed distributed protocols are much more scalable than the centralized approaches especially when the system becomes larger and more complex.
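    For intuition about the allocation principle, the Nash bargaining solution for dividing a fixed budget under logarithmic gains has a simple closed form: each zone receives its disagreement point plus an equal share of the surplus. This is a centralized sketch of the principle only; the paper computes the solution in a distributed way via dual decomposition and average consensus, and its utility model may differ.

```python
def nash_bargaining_allocation(disagreement, capacity):
    """Closed-form Nash bargaining split of a fixed airflow budget:
    maximize sum(log(x_i - d_i)) subject to sum(x_i) = capacity.
    The optimality condition equalizes every zone's surplus x_i - d_i."""
    n = len(disagreement)
    surplus = capacity - sum(disagreement)
    assert surplus > 0, "infeasible: capacity below total minimum airflow"
    return [d + surplus / n for d in disagreement]

alloc = nash_bargaining_allocation([2.0, 4.0, 6.0], capacity=18.0)
assert alloc == [4.0, 6.0, 8.0]  # equal surplus over each disagreement point
assert sum(alloc) == 18.0
```

    The equal-surplus form follows from the Lagrangian condition 1/(x_i - d_i) = λ for all zones, which forces every surplus to be identical.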

  15. Quantification of Graph Complexity Based on the Edge Weight Distribution Balance: Application to Brain Networks.

    PubMed

    Gomez-Pilar, Javier; Poza, Jesús; Bachiller, Alejandro; Gómez, Carlos; Núñez, Pablo; Lubeiro, Alba; Molina, Vicente; Hornero, Roberto

    2018-02-01

    The aim of this study was to introduce a novel global measure of graph complexity: Shannon graph complexity (SGC). This measure was specifically developed for weighted graphs, but it can also be applied to binary graphs. The proposed complexity measure was designed to capture the interplay between two properties of a system: the 'information' (calculated by means of Shannon entropy) and the 'order' of the system (estimated by means of a disequilibrium measure). SGC is based on the concept that complex graphs should maintain an equilibrium between the aforementioned two properties, which can be measured by means of the edge weight distribution. In this study, SGC was assessed using four synthetic graph datasets and a real dataset, formed by electroencephalographic (EEG) recordings from controls and schizophrenia patients. SGC was compared with graph density (GD), a classical measure used to evaluate graph complexity. Our results showed that SGC is invariant with respect to GD and independent of node degree distribution. Furthermore, its variation with graph size [Formula: see text] is close to zero for [Formula: see text]. Results from the real dataset showed an increment in the weight distribution balance during the cognitive processing for both controls and schizophrenia patients, although these changes are more relevant for controls. Our findings revealed that SGC does not need a comparison with null-hypothesis networks constructed by a surrogate process. In addition, SGC results on the real dataset suggest that schizophrenia is associated with a deficit in the brain dynamic reorganization related to secondary pathways of the brain network.
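    The interplay the abstract describes, entropy ("information") multiplied by a disequilibrium term ("order"), can be sketched on an edge-weight distribution. This is written in the spirit of SGC, but the exact normalization and disequilibrium definition in the paper may differ; treat it as an illustrative entropy-times-disequilibrium measure, not the published formula.

```python
import math

def shannon_graph_complexity(weights):
    """Sketch of an entropy-times-disequilibrium complexity measure on a
    graph's edge-weight distribution (illustrative; not the exact SGC
    normalization from the paper)."""
    total = sum(weights)
    p = [w / total for w in weights if w > 0]
    n = len(p)
    if n < 2:
        return 0.0
    entropy = -sum(pi * math.log(pi) for pi in p) / math.log(n)  # in [0, 1]
    disequilibrium = sum((pi - 1 / n) ** 2 for pi in p)  # distance from uniform
    return entropy * disequilibrium

uniform = shannon_graph_complexity([1.0] * 10)
skewed = shannon_graph_complexity([9.1] + [0.1] * 9)
assert uniform == 0.0  # perfectly balanced weights: zero disequilibrium
assert skewed > 0.0    # imbalance with nonzero entropy yields complexity
```

    The product vanishes at both extremes, fully uniform weights (no disequilibrium) and a single dominant edge (vanishing entropy), capturing the idea that complex graphs balance the two properties.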

  16. Modeling and Verification of Dependable Electronic Power System Architecture

    NASA Astrophysics Data System (ADS)

    Yuan, Ling; Fan, Ping; Zhang, Xiao-fang

    The electronic power system can be viewed as a system composed of a set of concurrently interacting subsystems to generate, transmit, and distribute electric power. The complex interaction among subsystems makes the design of the electronic power system complicated. Furthermore, in order to guarantee the safe generation and distribution of electric power, fault-tolerant mechanisms are incorporated in the system design to satisfy high reliability requirements. As a result, this incorporation makes the design of such a system more complicated. We propose a dependable electronic power system architecture, which can provide a generic framework to guide the development of electronic power systems and ease development complexity. In order to provide common idioms and patterns to the system designers, we formally model the electronic power system architecture by using the PVS formal language. Based on the PVS model of this system architecture, we formally verify the fault-tolerant properties of the system architecture by using the PVS theorem prover, which can guarantee that the system architecture satisfies high reliability requirements.

  17. High-frequency ultrasonic wire bonding systems

    PubMed

    Tsujino; Yoshihara; Sano; Ihara

    2000-03-01

    The vibration characteristics of longitudinal-complex transverse vibration systems with multiple resonance frequencies of 350-980 kHz for ultrasonic wire bonding of IC, LSI or electronic devices were studied. The complex vibration systems can be applied for direct welding of semiconductor tips (face-down bonding, flip-chip bonding) and packaging of electronic devices. A longitudinal-complex transverse vibration bonding system consists of a complex transverse vibration rod, two driving longitudinal transducers 7.0 mm in diameter and a transverse vibration welding tip. The vibration distributions along ceramic and stainless-steel welding tips were measured at up to 980 kHz. A high-frequency vibration system with a height of 20.7 mm and a weight of less than 15 g was obtained.

  18. Mathematics of Failures in Complex Systems: Characterization and Mitigation of Service Failures in Complex Dynamic Systems

    DTIC Science & Technology

    2007-06-30

    fractal dimensions and Lyapunov exponents. Fractal dimensions characterize geometrical complexity of dynamics (e.g., spatial distribution of points along...ant classifiers (e.g., Lyapunov exponents, and fractal dimensions). The first three steps show how chaotic systems may be separated from stochastic...correlated random walk in which a = 2H, where H is the Hurst exponent in the interval 0 ≤ H ≤ 1, with the case H = 0.5 corresponding to a simple random walk. This model has been

  19. Networked control of microgrid system of systems

    NASA Astrophysics Data System (ADS)

    Mahmoud, Magdi S.; Rahman, Mohamed Saif Ur; AL-Sunni, Fouad M.

    2016-08-01

    The microgrid has made its mark in distributed generation and has attracted widespread research. However, a microgrid is a complex system that needs to be viewed from an intelligent system-of-systems perspective. In this paper, a networked control system of systems is designed for an islanded microgrid system consisting of three distributed generation units as three subsystems supplying a load. The controller stabilises the microgrid system in the presence of communication infractions such as packet dropouts and delays. Simulation results are included to elucidate the effectiveness of the proposed control strategy.

  20. Ultraviolet-B radiation mobilizes uranium from uranium-dissolved organic carbon complexes in aquatic systems, demonstrated by asymmetrical flow field-flow fractionation.

    PubMed

    Nehete, Sachin Vilas; Christensen, Terje; Salbu, Brit; Teien, Hans-Christian

    2017-05-05

    Humic substances have a tendency to form complexes with metal ions in aquatic media, affecting metal mobility and decreasing bioavailability and toxicity. Ultraviolet-B (UV-B) radiation exposure degrades humic substances, changing their molecular weight distribution and their metal binding capacity in aquatic media. In this study, we examined the effect of UV-B radiation on uranium complexed with fulvic acids and humic acids in a soft water system at different pH, uranium concentrations and radiant exposures. The concentration and distribution of uranium in complexed form were investigated by asymmetrical flow field-flow fractionation coupled to a multi-detection technique (AsFlFFF-UV-ICP-MS). The complexed uranium was primarily associated with average- and higher-molecular-weight fulvic and humic acid components. The concentration of uranium in complexed form increased with increasing fulvic and humic acid concentrations as well as the pH of the solution. The higher-molecular-weight fraction of uranium was degraded by the UV-B exposure, transforming about 50% of the uranium-dissolved organic carbon complexes into low-molecular-weight uranium species complexed with organic ligands and/or in free form. The results also suggest that AsFlFFF-UV-ICP-MS is an important separation and detection technique for understanding the interaction of radionuclides with dissolved organic matter, tracking size distribution changes during degradation of organic complexes, and understanding the mobility, bioavailability and ecosystem transfer of radionuclides as well as metals. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Distributed Saturation

    NASA Technical Reports Server (NTRS)

    Chung, Ming-Ying; Ciardo, Gianfranco; Siminiceanu, Radu I.

    2007-01-01

    The Saturation algorithm for symbolic state-space generation has been a recent breakthrough in the exhaustive verification of complex systems, in particular globally-asynchronous/locally-synchronous systems. The algorithm uses a very compact Multiway Decision Diagram (MDD) encoding for states and the fastest symbolic exploration algorithm to date. The distributed version of Saturation uses the overall memory available on a network of workstations (NOW) to efficiently spread the memory load during the highly irregular exploration. A crucial factor in limiting the memory consumption during the symbolic state-space generation is the ability to perform garbage collection to free up the memory occupied by dead nodes. However, garbage collection over a NOW requires a nontrivial communication overhead. In addition, operation cache policies become critical while analyzing large-scale systems using the symbolic approach. In this technical report, we develop a garbage collection scheme and several operation cache policies to help solve extremely complex systems. Experiments show that our schemes improve the performance of the original distributed implementation, SmArTNow, in terms of time and memory efficiency.

  2. The role of order in distributed programs

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.; Marzullo, Keith

    1989-01-01

    The role of order in building distributed systems is discussed. The authors' belief is that a principle of event ordering underlies the wide range of operating system mechanisms that have been put forward for building robust distributed software. Stated concisely, this principle achieves correct distributed behavior by ordering classes of distributed events that conflict with one another. By focusing on order, one can obtain simplified descriptions of, and convincingly correct solutions to, problems that might otherwise look extremely complex. Moreover, it is observed that there are a limited number of ways to obtain order, and that the choice made greatly impacts performance.

  3. Quantitative Risk - Phase 1

    DTIC Science & Technology

    2013-09-03

    Table-of-contents fragments: "Connecting technical risk and types of complexity"; Figure 11, "Complexity evolution throughout the systems acquisition lifecycle."

  4. Quantitative Risk - Phases 1 & 2

    DTIC Science & Technology

    2013-11-12

    Table-of-contents fragments: "Connecting technical risk and types of complexity"; Figure 11, "Complexity evolution throughout the systems acquisition lifecycle."

  5. R&D100: Lightweight Distributed Metric Service

    ScienceCinema

    Gentile, Ann; Brandt, Jim; Tucker, Tom; Showerman, Mike

    2018-06-12

    On today's High Performance Computing platforms, the complexity of applications and configurations makes efficient use of resources difficult. The Lightweight Distributed Metric Service (LDMS) is monitoring software developed by Sandia National Laboratories to provide detailed metrics of system performance. LDMS provides collection, transport, and storage of data from extreme-scale systems at fidelities and timescales to provide understanding of application and system performance with no statistically significant impact on application performance.

  6. R&D100: Lightweight Distributed Metric Service

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gentile, Ann; Brandt, Jim; Tucker, Tom

    2015-11-19

    On today's High Performance Computing platforms, the complexity of applications and configurations makes efficient use of resources difficult. The Lightweight Distributed Metric Service (LDMS) is monitoring software developed by Sandia National Laboratories to provide detailed metrics of system performance. LDMS provides collection, transport, and storage of data from extreme-scale systems at fidelities and timescales to provide understanding of application and system performance with no statistically significant impact on application performance.

  7. Dense power-law networks and simplicial complexes

    NASA Astrophysics Data System (ADS)

    Courtney, Owen T.; Bianconi, Ginestra

    2018-05-01

    There is increasing evidence that dense networks occur in online social networks, recommendation networks, and the brain. In addition to being dense, these networks are often also scale-free, i.e., their degree distributions follow P(k) ∝ k^(-γ) with γ ∈ (1, 2]. Models of growing networks have been successfully employed to produce scale-free networks using preferential attachment; however, these models can only produce sparse networks, as the number of links and nodes added at each time step is constant. Here we present a modeling framework that produces networks that are both dense and scale-free. The mechanism by which the networks grow in this model is based on the Pitman-Yor process. Variations on the model are able to produce undirected scale-free networks with exponent γ = 2 or directed networks with power-law out-degree distribution with tunable exponent γ ∈ (1, 2). We also extend the model to directed two-dimensional simplicial complexes. Simplicial complexes are generalizations of networks that can encode the many-body interactions between the parts of a complex system, and as such are becoming increasingly popular for characterizing data sets ranging from social interacting systems to the brain. Our model produces dense directed simplicial complexes with a power-law distribution of the generalized out-degrees of the nodes.
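The growth mechanism can be sketched as a two-parameter "Chinese restaurant" process: each new edge endpoint either attaches to an existing node i with probability proportional to (k_i − a), or founds a new node with probability (b + a·n)/(t + b), where k_i is node i's degree, t the total degree, and n the node count. This is a simplified take on the mechanism, under assumed parameter names a (discount) and b (strength), not the paper's exact model.

```python
import random

def pitman_yor_graph(steps, a=0.5, b=1.0, seed=0):
    """Grow a multigraph whose edge endpoints are drawn from a
    Pitman-Yor process; heavy-tailed degrees emerge for a in (0, 1)."""
    rng = random.Random(seed)
    degree = [2]                      # node 0 starts with a self-loop
    edges = [(0, 0)]

    def draw_endpoint():
        t = sum(degree)               # total degree so far
        # found a brand-new node with probability (b + a*n)/(t + b)
        if rng.random() < (b + a * len(degree)) / (t + b):
            degree.append(1)
            return len(degree) - 1
        # otherwise attach to node i with probability proportional to k_i - a
        r = rng.uniform(0, t - a * len(degree))
        for i, k in enumerate(degree):
            r -= k - a
            if r <= 0:
                degree[i] += 1
                return i
        degree[-1] += 1               # numerical fallback
        return len(degree) - 1

    for _ in range(steps):
        edges.append((draw_endpoint(), draw_endpoint()))
    return degree, edges
```

Because the number of nodes grows like t^a while edges grow linearly in t, the resulting graph is dense in the paper's sense, with a handful of very high-degree hubs.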

  8. State estimation for distributed systems with sensing delay

    NASA Astrophysics Data System (ADS)

    Alexander, Harold L.

    1991-08-01

    Control of complex systems such as remote robotic vehicles requires combining data from many sensors where the data may often be delayed by sensory processing requirements. The number and variety of sensors make it desirable to distribute the computational burden of sensing and estimation among multiple processors. Classic Kalman filters do not lend themselves to distributed implementations or delayed measurement data. The alternative Kalman filter designs presented in this paper are adapted for delays in sensor data generation and for distribution of computation for sensing and estimation over a set of networked processors.
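One standard way to handle delayed sensor data, offered here only as a hedged illustration and not as the paper's specific filter design, is to buffer past (estimate, covariance) pairs, apply the measurement at the step it was actually taken, and replay the predictions forward. A scalar random-walk model keeps the sketch short; all names are invented for this example.

```python
class DelayedKalman1D:
    """Scalar random-walk Kalman filter that accepts delayed measurements
    by buffering past (estimate, variance) pairs and replaying predictions."""

    def __init__(self, q, r, x0=0.0, p0=1.0):
        self.q, self.r = q, r          # process / measurement noise variances
        self.x, self.P = x0, p0
        self.buffer = [(x0, p0)]       # state after each predict step

    def predict(self):
        self.P += self.q               # random walk: mean unchanged, variance grows
        self.buffer.append((self.x, self.P))

    def update(self, z, delay=0):
        # Apply the measurement at the step it was actually taken,
        # assuming no other measurements arrived in the meantime.
        x, P = self.buffer[len(self.buffer) - 1 - delay]
        K = P / (P + self.r)           # Kalman gain
        x, P = x + K * (z - x), (1.0 - K) * P
        for _ in range(delay):         # replay the intervening predictions
            P += self.q
        self.x, self.P = x, P
        self.buffer[-1] = (x, P)
```

Usage: call `predict()` once per time step; when a reading taken two steps ago finally arrives from a slow sensor processor, call `update(z, delay=2)`. Distributing the work across processors then amounts to each one running such a filter on its own sensors and fusing the results.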

  9. Apollo experience report: Command and service module electrical power distribution subsystem

    NASA Technical Reports Server (NTRS)

    Munford, R. E.; Hendrix, B.

    1974-01-01

    A review of the design philosophy and development of the Apollo command and service module electrical power distribution subsystem, a brief history of the evolution of the total system, and some of the more significant components within the system are discussed. The electrical power distribution subsystem primarily consisted of individual control units, interconnecting units, and associated protective devices. Because each unit within the system operated more or less independently of the other units, the discussion of the subsystem proceeds generally in descending order of complexity; it begins with the total system, progresses to the individual units of the system, and concludes with the components within the units.

  10. Distributed Processing and Cortical Specialization for Speech and Environmental Sounds in Human Temporal Cortex

    ERIC Educational Resources Information Center

    Leech, Robert; Saygin, Ayse Pinar

    2011-01-01

    Using functional MRI, we investigated whether auditory processing of both speech and meaningful non-linguistic environmental sounds in superior and middle temporal cortex relies on a complex and spatially distributed neural system. We found that evidence for spatially distributed processing of speech and environmental sounds in a substantial…

  11. A General theory of Signal Integration for Fault-Tolerant Dynamic Distributed Sensor Networks

    DTIC Science & Technology

    1993-10-01

    related to a) the architecture and fault-tolerance of the distributed sensor network, b) the proper synchronisation of sensor signals, c) the... Computational complexities of the problem of distributed detection. 5) Issues related to recording of events and synchronization in distributed sensor... Intervals for Synchronization in Real Time Distributed Systems", Submitted to Electronic Encyclopedia. 3. V. G. Hegde and S. S. Iyengar "Efficient

  12. Improving Distributed Diagnosis Through Structural Model Decomposition

    NASA Technical Reports Server (NTRS)

    Bregon, Anibal; Daigle, Matthew John; Roychoudhury, Indranil; Biswas, Gautam; Koutsoukos, Xenofon; Pulido, Belarmino

    2011-01-01

    Complex engineering systems require efficient fault diagnosis methodologies, but centralized approaches do not scale well, and this motivates the development of distributed solutions. This work presents an event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, by using the structural model decomposition capabilities provided by Possible Conflicts. We develop a distributed diagnosis algorithm that uses residuals computed by extending Possible Conflicts to build local event-based diagnosers based on global diagnosability analysis. The proposed approach is applied to a multitank system, and results demonstrate an improvement in the design of local diagnosers. Since local diagnosers use only a subset of the residuals, and use subsystem models to compute residuals (instead of the global system model), the local diagnosers are more efficient than previously developed distributed approaches.

  13. Individual complex Dirac eigenvalue distributions from random matrix theory and comparison to quenched lattice QCD with a quark chemical potential.

    PubMed

    Akemann, G; Bloch, J; Shifrin, L; Wettig, T

    2008-01-25

    We analyze how individual eigenvalues of the QCD Dirac operator at nonzero quark chemical potential are distributed in the complex plane. Exact and approximate analytical results for both quenched and unquenched distributions are derived from non-Hermitian random matrix theory. When comparing these to quenched lattice QCD spectra close to the origin, excellent agreement is found for zero and nonzero topology at several values of the quark chemical potential. Our analytical results are also applicable to other physical systems in the same symmetry class.

  14. On a Possible Unified Scaling Law for Volcanic Eruption Durations

    PubMed Central

    Cannavò, Flavio; Nunnari, Giuseppe

    2016-01-01

    Volcanoes constitute dissipative systems with many degrees of freedom. Their eruptions result from complex processes involving interacting chemical-physical systems. At present, due to the complexity of the phenomena involved and the lack of precise measurements, both analytical and numerical models are unable to simultaneously include the main processes involved in eruptions, making forecasts of volcanic dynamics rather unreliable. On the other hand, accurate forecasts of some eruption parameters, such as duration, could be a key factor in natural hazard estimation and mitigation. Analyzing a large database containing most of the known volcanic eruptions, we have determined that eruption durations appear to follow a universal distribution that characterizes eruption duration dynamics. In particular, this paper presents a plausible global power-law distribution of volcanic eruption durations that holds worldwide across different volcanic environments. We also introduce a new, simple, and realistic pipe model that reproduces the same empirical distribution. Since the proposed model belongs to the family of self-organized systems, it may support the hypothesis that simple mechanisms can lead naturally to emergent complexity in volcanic behaviour. PMID:26926425
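Testing a duration catalog for a power-law tail of this kind typically starts from the standard maximum-likelihood (Hill) estimate of the exponent, α̂ = 1 + n / Σ ln(x/xmin). The sketch below shows the estimator on synthetic data; it is illustrative of the kind of analysis involved, not the authors' code.

```python
import math, random

def powerlaw_mle(xs, xmin):
    """Maximum-likelihood (Hill) estimate of a continuous power-law exponent:
    alpha_hat = 1 + n / sum(ln(x / xmin)) over the tail x >= xmin."""
    tail = [x for x in xs if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Synthetic check: draw samples with survival function (x/xmin)^-(alpha-1)
# by inverse-transform sampling, then recover the exponent.
rng = random.Random(42)
alpha, xmin = 2.5, 1.0
xs = [xmin * (1.0 - rng.random()) ** (-1.0 / (alpha - 1.0)) for _ in range(50000)]
print(round(powerlaw_mle(xs, xmin), 2))  # prints a value close to 2.5
```

In practice the choice of xmin matters, and a goodness-of-fit test against alternative heavy-tailed distributions is needed before claiming a power law.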

  15. On a Possible Unified Scaling Law for Volcanic Eruption Durations.

    PubMed

    Cannavò, Flavio; Nunnari, Giuseppe

    2016-03-01

    Volcanoes constitute dissipative systems with many degrees of freedom. Their eruptions result from complex processes involving interacting chemical-physical systems. At present, due to the complexity of the phenomena involved and the lack of precise measurements, both analytical and numerical models are unable to simultaneously include the main processes involved in eruptions, making forecasts of volcanic dynamics rather unreliable. On the other hand, accurate forecasts of some eruption parameters, such as duration, could be a key factor in natural hazard estimation and mitigation. Analyzing a large database containing most of the known volcanic eruptions, we have determined that eruption durations appear to follow a universal distribution that characterizes eruption duration dynamics. In particular, this paper presents a plausible global power-law distribution of volcanic eruption durations that holds worldwide across different volcanic environments. We also introduce a new, simple, and realistic pipe model that reproduces the same empirical distribution. Since the proposed model belongs to the family of self-organized systems, it may support the hypothesis that simple mechanisms can lead naturally to emergent complexity in volcanic behaviour.

  16. Development of a calculation method for estimating specific energy distribution in complex radiation fields.

    PubMed

    Sato, Tatsuhiko; Watanabe, Ritsuko; Niita, Koji

    2006-01-01

    Estimation of the specific energy distribution in a human body exposed to complex radiation fields is of great importance in the planning of long-term space missions and heavy ion cancer therapies. With the aim of developing a tool for this estimation, the specific energy distributions in liquid water around the tracks of several HZE particles with energies up to 100 GeV n(-1) were calculated by performing track structure simulation with the Monte Carlo technique. In the simulation, the targets were assumed to be spherical sites with diameters from 1 nm to 1 microm. An analytical function to reproduce the simulation results was developed in order to predict the distributions of all kinds of heavy ions over a wide energy range. The incorporation of this function into the Particle and Heavy Ion Transport code System (PHITS) enables us to calculate the specific energy distributions in complex radiation fields in a short computational time.

  17. Hypothetical Modeling of Redox Conditions Within a Complex Ground-Water Flow Field in a Glacial Setting

    USGS Publications Warehouse

    Feinstein, Daniel T.; Thomas, Mary Ann

    2009-01-01

    This report describes a modeling approach for studying how redox conditions evolve under the influence of a complex ground-water flow field. The distribution of redox conditions within a flow system is of interest because of the intrinsic susceptibility of an aquifer to redox-sensitive, naturally occurring contaminants - such as arsenic - as well as anthropogenic contaminants - such as chlorinated solvents. The MODFLOW-MT3D-RT3D suite of code was applied to a glacial valley-fill aquifer to demonstrate a method for testing the interaction of flow patterns, sources of reactive organic carbon, and availability of electron acceptors in controlling redox conditions. Modeling results show how three hypothetical distributions of organic carbon influence the development of redox conditions in a water-supply aquifer. The distribution of strongly reduced water depends on the balance between the rate of redox reactions and the capability of different parts of the flow system to transmit oxygenated water. The method can take account of changes in the flow system induced by pumping that result in a new distribution of reduced water.

  18. Atomic switch networks as complex adaptive systems

    NASA Astrophysics Data System (ADS)

    Scharnhorst, Kelsey S.; Carbajal, Juan P.; Aguilera, Renato C.; Sandouk, Eric J.; Aono, Masakazu; Stieg, Adam Z.; Gimzewski, James K.

    2018-03-01

    Complexity is an increasingly crucial aspect of societal, environmental and biological phenomena. Using a dense unorganized network of synthetic synapses it is shown that a complex adaptive system can be physically created on a microchip built especially for complex problems. These neuro-inspired atomic switch networks (ASNs) are a dynamic system with inherent and distributed memory, recurrent pathways, and up to a billion interacting elements. We demonstrate key parameters describing self-organized behavior such as non-linearity, power law dynamics, and multistate switching regimes. Device dynamics are then investigated using a feedback loop which provides control over current and voltage power-law behavior. Wide ranging prospective applications include understanding and eventually predicting future events that display complex emergent behavior in the critical regime.

  19. The design and implementation of the Technical Facilities Controller (TFC) for the Goldstone deep space communications complex

    NASA Technical Reports Server (NTRS)

    Killian, D. A.; Menninger, F. J.; Gorman, T.; Glenn, P.

    1988-01-01

    The Technical Facilities Controller is a microprocessor-based energy management system that is to be implemented in the Deep Space Network facilities. This system is used in conjunction with facilities equipment at each of the complexes in the operation and maintenance of air-conditioning equipment, power generation equipment, power distribution equipment, and other primary facilities equipment. The implementation of the Technical Facilities Controller was completed at the Goldstone Deep Space Communications Complex and is now operational. The installation completed at the Goldstone Complex is described and the utilization of the Technical Facilities Controller is evaluated. The findings will be used in the decision to implement a similar system at the overseas complexes at Canberra, Australia, and Madrid, Spain.

  20. Network Theory: A Primer and Questions for Air Transportation Systems Applications

    NASA Technical Reports Server (NTRS)

    Holmes, Bruce J.

    2004-01-01

    A new understanding (with potential applications to air transportation systems) has emerged in the past five years in the scientific field of networks. This development emerges in large part because we now have a new laboratory for developing theories about complex networks: the Internet. The premise of this new understanding is that most complex networks of interest, both natural and human-made, exhibit fundamentally different behavior than was assumed for over two hundred years under classical graph theory. Classical theory held that networks exhibit random behavior, characterized by normal (e.g., Gaussian or Poisson) degree distributions of the connectivity between nodes. The new understanding turns this idea on its head: networks of interest exhibit scale-free (or small-world) degree distributions of connectivity, characterized by power laws. The implications of scale-free behavior for air transportation systems include the potential that some behaviors of complex system architectures might be analyzed through relatively simple approximations of local elements of the system. For air transportation applications, this presentation proposes a framework for constructing topologies (architectures) that represent the relationships between mobility, flight operations, aircraft requirements, and airspace capacity, and the related externalities in airspace procedures and architectures. The proposed architectures or topologies may serve as a framework for posing comparative and combinative analyses of performance, cost, security, environmental, and related metrics.

  1. A PDA-based system for online recording and analysis of concurrent events in complex behavioral processes.

    PubMed

    Held, Jürgen; Manser, Tanja

    2005-02-01

    This article outlines how a Palm- or Newton-based PDA (personal digital assistant) system for online event recording was used to record and analyze concurrent events. We describe the features of this PDA-based system, called the FIT-System (flexible interface technique), and its application to the analysis of concurrent events in complex behavioral processes--in this case, anesthesia work processes. The patented FIT-System has a unique user interface design allowing the user to design an interface template with a pencil and paper or using a transparency film. The template usually consists of a drawing or sketch that includes icons or symbols that depict the observer's representation of the situation to be observed. In this study, the FIT-System allowed us to create a design for fast, intuitive online recording of concurrent events using a set of 41 observation codes. An analysis of concurrent events leads to a description of action density, and our results revealed a characteristic distribution of action density during the administration of anesthesia in the operating room. This distribution indicated the central role of the overlapping operations in the action sequences of medical professionals as they deal with the varying requirements of this complex task. We believe that the FIT-System for online recording of concurrent events in complex behavioral processes has the potential to be useful across a broad spectrum of research areas.

  2. A virtual data language and system for scientific workflow management in data grid environments

    NASA Astrophysics Data System (ADS)

    Zhao, Yong

    With advances in scientific instrumentation and simulation, scientific data is growing fast in both size and analysis complexity. So-called Data Grids aim to provide high performance, distributed data analysis infrastructure for data-intensive sciences, where scientists distributed worldwide need to extract information from large collections of data, and to share both data products and the resources needed to produce and store them. However, the description, composition, and execution of even logically simple scientific workflows are often complicated by the need to deal with "messy" issues like heterogeneous storage formats and ad-hoc file system structures. We show how these difficulties can be overcome via a typed workflow notation called virtual data language, within which issues of physical representation are cleanly separated from logical typing, and by the implementation of this notation within the context of a powerful virtual data system that supports distributed execution. The resulting language and system are capable of expressing complex workflows in a simple compact form, enacting those workflows in distributed environments, monitoring and recording the execution processes, and tracing the derivation history of data products. We describe the motivation, design, implementation, and evaluation of the virtual data language and system, and the application of the virtual data paradigm in various science disciplines, including astronomy and cognitive neuroscience.

  3. An Integrated Framework for Model-Based Distributed Diagnosis and Prognosis

    NASA Technical Reports Server (NTRS)

    Bregon, Anibal; Daigle, Matthew J.; Roychoudhury, Indranil

    2012-01-01

    Diagnosis and prognosis are necessary tasks for system reconfiguration and fault-adaptive control in complex systems. Diagnosis consists of detection, isolation and identification of faults, while prognosis consists of prediction of the remaining useful life of systems. This paper presents a novel integrated framework for model-based distributed diagnosis and prognosis, where system decomposition is used to enable the diagnosis and prognosis tasks to be performed in a distributed way. We show how different submodels can be automatically constructed to solve the local diagnosis and prognosis problems. We illustrate our approach using a simulated four-wheeled rover for different fault scenarios. Our experiments show that our approach correctly performs distributed fault diagnosis and prognosis in an efficient and robust manner.

  4. A formation control strategy with coupling weights for the multi-robot system

    NASA Astrophysics Data System (ADS)

    Liang, Xudong; Wang, Siming; Li, Weijie

    2017-12-01

    The distributed formation problem for multi-robot systems with general linear dynamics and directed communication topology is discussed. To prevent the multi-robot system from failing to maintain the desired formation in a complex communication environment, a distributed cooperative algorithm with coupling weights based on the Zipf distribution is designed. The asymptotic stability condition for the formation of the multi-robot system is given, and graph theory and Lyapunov theory are used to prove that, under this condition, the formation converges to the desired geometric formation and the desired motion rules of the virtual leader. Nontrivial simulations are performed to validate the effectiveness of the distributed cooperative algorithm with coupling weights.

  5. Functionally Adequate but Causally Idle: W(h)ither Distributed Leadership?

    ERIC Educational Resources Information Center

    Lakomski, Gabriele

    2008-01-01

    Purpose: The purpose of this conceptual paper is to argue that leadership, including distributed leadership, is a concept of folk psychology and is more productively viewed as an emergent self-organising property of complex systems. It aims to argue the case on the basis that claims to (distributed) leadership outrun the theoretical and empirical…

  6. Complex Dynamics of the Power Transmission Grid (and other Critical Infrastructures)

    NASA Astrophysics Data System (ADS)

    Newman, David

    2015-03-01

    Our modern societies depend crucially on a web of complex critical infrastructures such as power transmission networks, communication systems, transportation networks, and many others. These infrastructure systems display a great number of the characteristic properties of complex systems. Important among these characteristics, they exhibit infrequent large cascading failures that often obey a power-law distribution in their probability versus size. This power-law behavior suggests that conventional risk analysis does not apply to these systems. Much of this behavior is thought to come from the dynamical evolution of the system as it ages, is repaired, and is upgraded, and as the operational rules evolve, with human decision making playing an important role in the dynamics. In this talk, infrastructure systems are introduced as complex dynamical systems and some of their properties are explored. The majority of the talk focuses on the electric power transmission grid, though many of the results can be easily applied to other infrastructures. General properties of the grid are discussed, and results from a dynamical complex-systems power transmission model are compared with real-world data. We then look at a variety of uses of this type of model. As examples, we discuss the impact of size and network homogeneity on grid robustness, the change in the risk of failure as the generation mix changes (more distributed vs. centralized, for example), and the effect of operational changes such as changing the operational risk aversion or grid upgrade strategies. One important outcome of this work is the realization that "improvements" in system components and operational efficiency do not always improve system robustness, and can in fact greatly increase the risk, when measured as the risk of large failure.
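The cascading-failure mechanism described above can be caricatured with a toy overload model: each node carries load with a fixed capacity headroom, and when a node fails its load is shed onto its neighbors, possibly tripping them in turn. The sketch below is only such a caricature, with invented names and numbers; it is not the authors' transmission-grid model.

```python
import random

def cascade_size(n=200, k=4, headroom=1.4, seed=1):
    """Fail one random node of a random k-regular-ish graph and return
    how many nodes the overload cascade ultimately takes down."""
    rng = random.Random(seed)
    nbrs = {i: set() for i in range(n)}
    while sum(len(s) for s in nbrs.values()) < n * k:   # random graph, mean degree ~k
        u, v = rng.randrange(n), rng.randrange(n)
        if u != v:
            nbrs[u].add(v)
            nbrs[v].add(u)
    load = {i: 1.0 for i in range(n)}
    cap = {i: headroom for i in range(n)}
    failed, frontier = set(), [rng.randrange(n)]        # one initial failure
    while frontier:
        u = frontier.pop()
        if u in failed:
            continue
        failed.add(u)
        alive = [v for v in nbrs[u] if v not in failed]
        for v in alive:                                 # shed load onto live neighbors
            load[v] += load[u] / max(len(alive), 1)
            if load[v] > cap[v]:
                frontier.append(v)                      # neighbor trips in turn
        load[u] = 0.0
    return len(failed)
```

Sweeping `headroom` in a model of this flavor is one way to see the transition from isolated failures to the rare system-wide cascades that produce the heavy-tailed size distributions discussed above.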

  7. A distributed scheduling algorithm for heterogeneous real-time systems

    NASA Technical Reports Server (NTRS)

    Zeineldine, Osman; El-Toweissy, Mohamed; Mukkamala, Ravi

    1991-01-01

    Much of the previous work on load balancing and scheduling in distributed environments was concerned with homogeneous systems and homogeneous loads. Several of the results indicated that random policies are as effective as other, more complex load allocation policies. Here, the effects of heterogeneity on scheduling algorithms for hard real-time systems are examined. A distributed scheduler designed specifically to handle heterogeneities in both nodes and node traffic is proposed. The performance of the algorithm is measured in terms of the percentage of jobs discarded. While random task allocation is very sensitive to heterogeneities, the proposed algorithm is shown to be robust to such non-uniformities in system components and load.

  8. Measuring the effects of heterogeneity on distributed systems

    NASA Technical Reports Server (NTRS)

    El-Toweissy, Mohamed; Zeineldine, Osman; Mukkamala, Ravi

    1991-01-01

    Distributed computer systems in daily use are becoming more and more heterogeneous. Currently, much of the design and analysis studies of such systems assume homogeneity. This assumption of homogeneity has been mainly driven by the resulting simplicity in modeling and analysis. A simulation study is presented which investigated the effects of heterogeneity on scheduling algorithms for hard real time distributed systems. In contrast to previous results which indicate that random scheduling may be as good as a more complex scheduler, this algorithm is shown to be consistently better than a random scheduler. This conclusion is more prevalent at high workloads as well as at high levels of heterogeneity.

  9. Structural Behavioral Study on the General Aviation Network Based on Complex Network

    NASA Astrophysics Data System (ADS)

    Zhang, Liang; Lu, Na

    2017-12-01

    The general aviation system is an open and dissipative system with complex structures and behavioral features. This paper establishes the system model and network model for general aviation. We analyze integral and individual attributes by applying complex network theory and conclude that the general aviation network has influential enterprise factors and node relations. Using the degree distribution function, we check whether the network has the small-world effect, scale-free property, and network centrality that a complex network should have, and prove that the general aviation network is a complex network. Therefore, we propose to achieve the evolution of the general aviation industrial chain toward a collaborative innovation cluster of advanced-form industries by strengthening the network multiplication effect, stimulating innovation performance, and spanning the structural-hole path.

  10. About Distributed Simulation-based Optimization of Forming Processes using a Grid Architecture

    NASA Astrophysics Data System (ADS)

    Grauer, Manfred; Barth, Thomas

    2004-06-01

    The continually increasing complexity of products and their manufacturing processes, combined with a shorter "time-to-market," leads to ever greater use of simulation and optimization software systems in product design. Finding a "good" design for a product implies solving computationally expensive optimization problems based on simulation results. Due to the computational load these problems impose, the requirements on the Information & Telecommunication (IT) infrastructure of an enterprise or research facility are shifting from stand-alone resources towards the integration of software and hardware resources in a distributed environment for high-performance computing. Resources may comprise software systems, hardware systems, or communication networks. An appropriate IT infrastructure must provide the means to integrate all these resources and enable their use across a network, to cope with requirements from geographically distributed scenarios, e.g., in computational engineering and/or collaborative engineering. Integrating experts' knowledge into the optimization process is essential in order to reduce the complexity caused by the number of design variables and the high dimensionality of the design space. Hence, the use of knowledge-based systems must be supported by data management facilities as a basis for knowledge extraction from product data. In this paper, the focus is on a distributed problem solving environment (PSE) capable of providing access to a variety of necessary resources and services. A distributed approach integrating simulation and optimization on a network of workstations and cluster systems is presented. For geometry generation, the CAD system CATIA is used, coupled with the FEM simulation system INDEED for the simulation of sheet-metal forming processes and with the problem solving environment OpTiX for distributed optimization.

  11. Experimental observations of the hydrodynamic behavior of solvent systems in high-speed counter-current chromatography. I. Hydrodynamic distribution of two solvent phases in a helical column subjected to two types of synchronous planetary motion.

    PubMed

    Ito, Y

    1984-10-05

    Hydrodynamic distribution of two-phase solvent systems in a rotating helical column subjected to centrifugal fields produced by two different types of synchronous planetary motion has been studied by the use of the combined horizontal flow-through coil planet centrifuge. With continuous elution of the mobile phase, the simpler type of motion resulted in low retention of the stationary phase in the column whereas a more complex motion, which produces a quasi-radial centrifugal field varying in both intensity and direction, yielded high stationary phase retention for commonly used solvent systems having a wide range of hydrophobicity. These solvent systems display highly complex modes of hydrodynamic interaction in the coil according to their particular physical properties.

  12. Experimental analysis of bidirectional reflectance distribution function cross section conversion term in direction cosine space.

    PubMed

    Butler, Samuel D; Nauyoks, Stephen E; Marciniak, Michael A

    2015-06-01

    Of the many classes of bidirectional reflectance distribution function (BRDF) models, two popular ones are the microfacet model and the linear systems diffraction model. The microfacet model has the benefit of speed and simplicity, as it uses geometric optics approximations, while linear systems theory uses a diffraction approach to compute the BRDF at the expense of greater computational complexity. In this Letter, nongrazing BRDF measurements of rough and polished surface-reflecting materials at multiple incident angles are scaled by the microfacet cross section conversion term, but in the linear systems direction cosine space, resulting in close alignment of the BRDF data across incident angles in this space. This yields a predictive BRDF model for surface-reflecting materials at nongrazing angles while avoiding some of the computational complexity of the linear systems diffraction model.
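    The scaling described above can be sketched numerically. This is a hedged illustration, not the Letter's code: the cross-section conversion term is taken here as the standard microfacet geometric factor 4·cos(θi)·cos(θr), the direction cosine offset as sin(θr) − sin(θi), and the Gaussian "measurement" below is a toy stand-in for real BRDF data.

```python
import numpy as np

# Scale a (toy) measured BRDF by the microfacet cross-section conversion
# term and index it by the direction cosine offset. All angles and the
# Gaussian specular lobe are illustrative assumptions.
def scaled_brdf(brdf, theta_i, theta_r):
    # 4*cos(theta_i)*cos(theta_r) is the standard microfacet geometric factor.
    return brdf * 4.0 * np.cos(theta_i) * np.cos(theta_r)

theta_i = np.radians(20.0)
theta_r = np.radians(np.linspace(-60, 60, 7))
beta = np.sin(theta_r) - np.sin(theta_i)   # direction cosine offset
brdf = np.exp(-beta**2 / 0.1)              # toy specular-lobe "measurement"
scaled = scaled_brdf(brdf, theta_i, theta_r)
print(np.round(scaled, 3))
```

Plotting `scaled` against `beta` for several incident angles is the kind of comparison in which the Letter reports the data aligning.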

  13. Diversity, community composition, and dynamics of nonpigmented and late-pigmenting rapidly growing mycobacteria in an urban tap water production and distribution system.

    PubMed

    Dubrou, S; Konjek, J; Macheras, E; Welté, B; Guidicelli, L; Chignon, E; Joyeux, M; Gaillard, J L; Heym, B; Tully, T; Sapriel, G

    2013-09-01

    Nonpigmented and late-pigmenting rapidly growing mycobacteria (RGM) have been reported to commonly colonize water production and distribution systems. However, there is little information about the nature and distribution of RGM species within the different parts of such complex networks or about their clustering into specific RGM species communities. We conducted a large-scale survey between 2007 and 2009 in the Parisian urban tap water production and distribution system. We analyzed 1,418 water samples from 36 sites, covering all production units, water storage tanks, and distribution units; RGM isolates were identified by using rpoB gene sequencing. We detected 18 RGM species and putative new species, with most isolates being Mycobacterium chelonae and Mycobacterium llatzerense. Using hierarchical clustering and principal-component analysis, we found that RGM were organized into various communities correlating with water origin (groundwater or surface water) and location within the distribution network. Water treatment plants were more specifically associated with species of the Mycobacterium septicum group. On average, M. chelonae dominated network sites fed by surface water, and M. llatzerense dominated those fed by groundwater. Overall, the M. chelonae prevalence index increased along the distribution network and was associated with a correlative decrease in the prevalence index of M. llatzerense, suggesting competitive or niche exclusion between these two dominant species. Our data describe the great diversity and complexity of RGM species living in the interconnected environments that constitute the water production and distribution system of a large city and highlight the prevalence index of the potentially pathogenic species M. chelonae in the distribution network.

  14. Diversity, Community Composition, and Dynamics of Nonpigmented and Late-Pigmenting Rapidly Growing Mycobacteria in an Urban Tap Water Production and Distribution System

    PubMed Central

    Dubrou, S.; Konjek, J.; Macheras, E.; Welté, B.; Guidicelli, L.; Chignon, E.; Joyeux, M.; Gaillard, J. L.; Heym, B.; Tully, T.

    2013-01-01

    Nonpigmented and late-pigmenting rapidly growing mycobacteria (RGM) have been reported to commonly colonize water production and distribution systems. However, there is little information about the nature and distribution of RGM species within the different parts of such complex networks or about their clustering into specific RGM species communities. We conducted a large-scale survey between 2007 and 2009 in the Parisian urban tap water production and distribution system. We analyzed 1,418 water samples from 36 sites, covering all production units, water storage tanks, and distribution units; RGM isolates were identified by using rpoB gene sequencing. We detected 18 RGM species and putative new species, with most isolates being Mycobacterium chelonae and Mycobacterium llatzerense. Using hierarchical clustering and principal-component analysis, we found that RGM were organized into various communities correlating with water origin (groundwater or surface water) and location within the distribution network. Water treatment plants were more specifically associated with species of the Mycobacterium septicum group. On average, M. chelonae dominated network sites fed by surface water, and M. llatzerense dominated those fed by groundwater. Overall, the M. chelonae prevalence index increased along the distribution network and was associated with a correlative decrease in the prevalence index of M. llatzerense, suggesting competitive or niche exclusion between these two dominant species. Our data describe the great diversity and complexity of RGM species living in the interconnected environments that constitute the water production and distribution system of a large city and highlight the prevalence index of the potentially pathogenic species M. chelonae in the distribution network. PMID:23835173

  15. Object-oriented Tools for Distributed Computing

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.

    1993-01-01

    Distributed computing systems are proliferating, owing to the availability of powerful, affordable microcomputers and inexpensive communication networks. A critical problem in developing such systems is getting application programs to interact with one another across a computer network. Remote interprogram connectivity is particularly challenging across heterogeneous environments, where applications run on different kinds of computers and operating systems. NetWorks! (trademark) is an innovative software product that provides an object-oriented messaging solution to these problems. This paper describes the design and functionality of NetWorks! and illustrates how it is being used to build complex distributed applications for NASA and in the commercial sector.

  16. Self-Tuning Fully-Connected PID Neural Network System for Distributed Temperature Sensing and Control of Instrument with Multi-Modules.

    PubMed

    Zhang, Zhen; Ma, Cheng; Zhu, Rong

    2016-10-14

    High integration of multi-functional instruments raises a critical issue in temperature control that is challenging due to its spatial-temporal complexity. This paper presents a multi-input multi-output (MIMO) self-tuning temperature sensing and control system for efficiently modulating the temperature environment within a multi-module instrument. The smart system ensures that the internal temperature of the instrument converges to a target without the need for a system model, thus making the control robust. The system consists of a fully-connected proportional-integral-derivative (PID) neural network (FCPIDNN) and an on-line self-tuning module. The experimental results show that the presented system can effectively control the internal temperature under various mission scenarios; in particular, it is able to self-reconfigure upon actuator failure. The system provides a new scheme for complex and time-variant MIMO control which can be widely applied to distributed measurement and control of the environment in instruments, integrated electronics, and buildings.
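    The control law at the heart of an FCPIDNN is the discrete PID update. A minimal single-loop sketch (a plain PID controller, not the paper's fully-connected neural network or its self-tuning module; the gains, setpoint, and toy thermal plant are illustrative assumptions):

```python
# Minimal discrete PID temperature controller driving a toy first-order
# thermal plant. Gains, setpoint and plant constants are illustrative.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        # Sum of proportional, integral and derivative actions.
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.1)
temp = 20.0  # ambient start, target 40 degrees C
for _ in range(500):
    power = pid.update(40.0, temp)
    # Toy plant: heating from 'power', Newtonian cooling toward ambient.
    temp += 0.1 * (power - 0.05 * (temp - 20.0))
print(round(temp, 1))
```

The integral term is what lets the loop reach the setpoint without a plant model, which is the model-free property the abstract emphasizes (the FCPIDNN additionally tunes such gains on-line).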

  17. Decision Criteria for Distributed Versus Non-Distributed Information Systems in the Health Care Environment

    PubMed Central

    McGinnis, John W.

    1980-01-01

    The very same technological advances that support distributed systems have also dramatically increased the efficiency and capabilities of centralized systems, making it more difficult for health care managers to select the “right” system architecture to meet their particular needs. How this selection can be made with a reasonable degree of managerial comfort is the focus of this paper. The approach advocated is based on experience in developing the Tri-Service Medical Information System (TRIMIS) program. Along with this, technical standards and configuration management procedures were developed that provided the necessary guidance to implement the selected architecture and to allow it to change in a controlled way over its life cycle.

  18. Distributed Mission Operations: Training Today’s Warfighters for Tomorrow’s Conflicts

    DTIC Science & Technology

    2016-02-01

    systems or include dissimilar weapons systems to rehearse more complex mission sets. In addition to networking geographically separated simulators...over the past decade. Today, distributed mission operations can facilitate the rehearsal of theater wide operations, integrating all the anticipated...effective that many aviators earn their basic aircraft qualification before their first flight in the airplane.11 Computer memory was once a

  19. Virtual-system-coupled adaptive umbrella sampling to compute free-energy landscape for flexible molecular docking.

    PubMed

    Higo, Junichi; Dasgupta, Bhaskar; Mashimo, Tadaaki; Kasahara, Kota; Fukunishi, Yoshifumi; Nakamura, Haruki

    2015-07-30

    A novel enhanced conformational sampling method, virtual-system-coupled adaptive umbrella sampling (V-AUS), was proposed to compute the 300-K free-energy landscape for flexible molecular docking, where a virtual degree of freedom was introduced to control the sampling. This degree of freedom interacts with the biomolecular system. V-AUS was applied to the complex formation of two disordered amyloid-β (Aβ30-35) peptides in a periodic box filled with an explicit solvent. An interpeptide distance was defined as the reaction coordinate, along which sampling was enhanced. A uniform conformational distribution was obtained covering a wide range of interpeptide distances, from the bound to the unbound states. The 300-K free-energy landscape was characterized by thermodynamically stable basins of antiparallel and parallel β-sheet complexes and some other complex forms. Helices were frequently observed when the two peptides contacted loosely or fluctuated freely without interpeptide contacts. We observed that V-AUS converged to a uniform distribution more effectively than conventional AUS sampling did. © 2015 Wiley Periodicals, Inc.

  20. Evidence of non-extensivity and complexity in the seismicity observed during 2011-2012 at the Santorini volcanic complex, Greece

    NASA Astrophysics Data System (ADS)

    Vallianatos, F.; Tzanis, A.; Michas, G.; Papadakis, G.

    2012-04-01

    Since the middle of summer 2011, an increase in the seismicity rates of the volcanic complex of Santorini Island, Greece, has been observed. In the present work, the temporal distribution of seismicity, as well as the magnitude distribution of earthquakes, has been studied using the concept of Non-Extensive Statistical Physics (NESP; Tsallis, 2009) along with the evolution of the Shannon entropy H (also called information entropy). The analysis is based on the earthquake catalogue of the Geodynamic Institute of the National Observatory of Athens for the period July 2011-January 2012 (http://www.gein.noa.gr/). Non-Extensive Statistical Physics, a generalization of Boltzmann-Gibbs statistical physics, seems to be a suitable framework for studying complex systems. The observed distributions of seismicity rates at Santorini can be fitted exceptionally well with NESP models. This implies the inherent complexity of the Santorini volcanic seismicity, the applicability of NESP concepts to volcanic earthquake activity and the usefulness of NESP in investigating phenomena exhibiting multifractality and long-range coupling effects. Acknowledgments. This work was supported in part by the THALES Program of the Ministry of Education of Greece and the European Union in the framework of the project entitled "Integrated understanding of Seismicity, using innovative Methodologies of Fracture mechanics along with Earthquake and non extensive statistical physics - Application to the geodynamic system of the Hellenic Arc. SEISMO FEAR HELLARC". GM and GP wish to acknowledge the partial support of the Greek State Scholarships Foundation (ΙΚΥ).
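    NESP fits of this kind are typically built on the Tsallis q-exponential, which generalizes the ordinary exponential and recovers it as q → 1. A hedged numerical sketch (the q and scale values below are illustrative, not the values estimated for Santorini):

```python
import numpy as np

# Tsallis q-exponential, the building block of NESP distribution fits:
# e_q(x) = [1 + (1 - q) x]^(1/(1-q)) where the bracket is positive, else 0.
def q_exp(x, q):
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)  # ordinary exponential limit q -> 1
    base = 1.0 + (1.0 - q) * x
    return np.where(base > 0, base ** (1.0 / (1.0 - q)), 0.0)

# Example: q-exponential decay of inter-event times tau with scale x0.
tau = np.linspace(0, 10, 6)
q, x0 = 1.5, 2.0
p = q_exp(-tau / x0, q)
print(np.round(p, 3))
```

For q > 1 the tail decays as a power law rather than exponentially, which is why q-exponentials can fit the heavy-tailed inter-event-time and magnitude distributions seen in volcanic seismicity.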

  1. Recording information on protein complexes in an information management system

    PubMed Central

    Savitsky, Marc; Diprose, Jonathan M.; Morris, Chris; Griffiths, Susanne L.; Daniel, Edward; Lin, Bill; Daenke, Susan; Bishop, Benjamin; Siebold, Christian; Wilson, Keith S.; Blake, Richard; Stuart, David I.; Esnouf, Robert M.

    2011-01-01

    The Protein Information Management System (PiMS) is a laboratory information management system (LIMS) designed for use with the production of proteins in a research environment. The software is distributed under the CCP4 licence, and so is available free of charge to academic laboratories. Like most LIMS, the underlying PiMS data model originally had no support for protein–protein complexes. To support the SPINE2-Complexes project the developers have extended PiMS to meet these requirements. The modifications to PiMS, described here, include data model changes, additional protocols, some user interface changes and functionality to detect when an experiment may have formed a complex. Example data are shown for the production of a crystal of a protein complex. Integration with SPINE2-Complexes Target Tracker application is also described. PMID:21605682

  2. Recording information on protein complexes in an information management system.

    PubMed

    Savitsky, Marc; Diprose, Jonathan M; Morris, Chris; Griffiths, Susanne L; Daniel, Edward; Lin, Bill; Daenke, Susan; Bishop, Benjamin; Siebold, Christian; Wilson, Keith S; Blake, Richard; Stuart, David I; Esnouf, Robert M

    2011-08-01

    The Protein Information Management System (PiMS) is a laboratory information management system (LIMS) designed for use with the production of proteins in a research environment. The software is distributed under the CCP4 licence, and so is available free of charge to academic laboratories. Like most LIMS, the underlying PiMS data model originally had no support for protein-protein complexes. To support the SPINE2-Complexes project the developers have extended PiMS to meet these requirements. The modifications to PiMS, described here, include data model changes, additional protocols, some user interface changes and functionality to detect when an experiment may have formed a complex. Example data are shown for the production of a crystal of a protein complex. Integration with SPINE2-Complexes Target Tracker application is also described. Copyright © 2011 Elsevier Inc. All rights reserved.

  3. A framework for real-time distributed expert systems: On-orbit spacecraft fault diagnosis, monitoring and control

    NASA Technical Reports Server (NTRS)

    Mullikin, Richard L.

    1987-01-01

    Control of on-orbit operation of a spacecraft requires retention and application of special purpose, often unique, knowledge of equipment and procedures. Real-time distributed expert systems (RTDES) permit a modular approach to a complex application such as on-orbit spacecraft support. One aspect of a human-machine system that lends itself to the application of RTDES is the function of satellite/mission controllers - the next logical step toward the creation of truly autonomous spacecraft systems. This system application is described.

  4. Distributed intelligent control and status networking

    NASA Technical Reports Server (NTRS)

    Fortin, Andre; Patel, Manoj

    1993-01-01

    Over the past two years, the Network Control Systems Branch (Code 532) has been investigating control and status networking technologies. These emerging technologies use distributed processing over a network to accomplish a particular custom task. These networks consist of small intelligent 'nodes' that perform simple tasks. Containing simple, inexpensive hardware and software, these nodes can be easily developed and maintained. Once networked, the nodes can perform a complex operation without a central host. This type of system provides an alternative to more complex control and status systems which require a central computer. This paper will provide some background and discuss some applications of this technology. It will also demonstrate the suitability of one particular technology for the Space Network (SN) and discuss the prototyping activities of Code 532 utilizing this technology.

  5. An application of sample entropy to precipitation in Paraíba State, Brazil

    NASA Astrophysics Data System (ADS)

    Xavier, Sílvio Fernando Alves; da Silva Jale, Jader; Stosic, Tatijana; dos Santos, Carlos Antonio Costa; Singh, Vijay P.

    2018-05-01

    A climate system is a complex non-linear system. In order to describe the complex characteristics of precipitation series in Paraíba State, Brazil, we use sample entropy, an entropy-based algorithm, to evaluate the complexity of the precipitation series. Sixty-nine meteorological stations are distributed over four macroregions: Zona da Mata, Agreste, Borborema, and Sertão. The analysis shows that the complexity of monthly average precipitation differs among the macroregions. Sample entropy is able to reflect the dynamic change of precipitation series, providing a new way to investigate the complexity of hydrological series. The complexity reflects the areal variation of local water resource systems, which can inform the utilization and development of water resources in dry areas.
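    Sample entropy SampEn(m, r) is the negative logarithm of the conditional probability that subsequences matching for m points (within tolerance r, Chebyshev distance) also match at m + 1 points; lower values mean a more regular series. A self-contained sketch (m = 2 and r = 0.2·std are common defaults, not necessarily the study's choices):

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    # SampEn(m, r): -ln(A/B), where B counts template pairs of length m
    # within tolerance r (Chebyshev distance) and A the same for m + 1.
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)
    n = len(x) - m  # same number of templates for both lengths

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(n)])
        total = 0
        for i in range(n - 1):
            # Distances from template i to all later templates (no self-match).
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            total += int(np.sum(d <= r))
        return total

    b = count_matches(m)
    a = count_matches(m + 1)
    return np.inf if a == 0 or b == 0 else -np.log(a / b)

rng = np.random.default_rng(0)
se_noise = sample_entropy(rng.standard_normal(300))   # irregular series
se_sine = sample_entropy(np.sin(np.arange(300) * 0.1))  # regular series
print(round(se_noise, 2), round(se_sine, 2))
```

The irregular (noise) series scores higher than the regular (sine) series, which is the property the study exploits to rank the complexity of precipitation series across macroregions.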

  6. Towards a cyber-physical era: soft computing framework based multi-sensor array for water quality monitoring

    NASA Astrophysics Data System (ADS)

    Bhardwaj, Jyotirmoy; Gupta, Karunesh K.; Gupta, Rajiv

    2018-02-01

    New concepts and techniques are replacing traditional methods of measuring water quality parameters. This paper introduces a cyber-physical system (CPS) approach for water quality assessment in a distribution network. Cyber-physical systems with embedded sensors, processors and actuators can be designed to sense and interact with the water environment. The proposed CPS comprises a sensing framework integrated with five different water-quality-parameter sensor nodes and a soft computing framework for computational modelling. The soft computing framework uses Python for the user interface and fuzzy logic for decision making. Introducing multiple sensors in a water distribution network generates a huge number of data matrices, which are often highly complex, difficult to understand and convoluted for effective decision making. Therefore, the proposed system framework also aims to simplify the obtained sensor data matrices and to support decision making by water engineers through the soft computing framework. The goal of this research is to provide a simple and efficient method to identify and detect the presence of contamination in a water distribution network using a CPS.
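    Fuzzy decision making of the kind described can be sketched with triangular membership functions and a min-based rule. The membership breakpoints, parameter names and the single rule below are illustrative assumptions, not the paper's actual rule base:

```python
# Minimal fuzzy-style decision sketch for a two-parameter water check.
# Breakpoints and the rule are hypothetical, for illustration only.
def tri(x, a, b, c):
    # Triangular membership function rising from a, peaking at b, falling to c.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def contamination_risk(turbidity_ntu, chlorine_mg_l):
    high_turbidity = tri(turbidity_ntu, 1.0, 5.0, 9.0)
    low_chlorine = tri(chlorine_mg_l, -0.4, 0.0, 0.4)
    # Fuzzy AND (min) of the two antecedents: high turbidity AND low chlorine.
    return min(high_turbidity, low_chlorine)

print(contamination_risk(5.0, 0.1) > contamination_risk(1.5, 0.5))
```

A real rule base would combine several such rules (one per sensor combination) and defuzzify the aggregate, but the min-composition above is the core operation.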

  7. Information Theory Applied to Animal Communication Systems and Its Possible Application to SETI

    NASA Astrophysics Data System (ADS)

    Hanser, Sean F.; Doyle, Laurance R.; McCowan, Brenda; Jenkins, Jon M.

    2004-06-01

    Information theory, as first introduced by Claude Shannon (Shannon & Weaver 1949), quantitatively evaluates the organizational complexity of communication systems. At the same time, George Zipf was examining linguistic structure in a way that was mathematically similar to the components of the Shannon first-order entropy (Zipf 1949). Both Shannon's and Zipf's mathematical procedures have been applied to animal communication and have recently been providing insightful results. The Zipf plot is a useful tool for a first estimate of the characterization of a communication system's complexity (which can later be examined for complex structure at deeper levels using Shannon entropic analysis). In this paper we discuss some of the applications and pitfalls of using the Zipf distribution as a preliminary evaluator of the communication complexity of a signaling system.
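    Both diagnostics are straightforward to compute from a sequence of discrete signal types. A minimal sketch (the toy "call" sequence is illustrative, not animal data):

```python
import math
from collections import Counter

# First-order Shannon entropy (bits per symbol) of a discrete signal sequence.
def first_order_entropy(symbols):
    counts = Counter(symbols)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# (log10 rank, log10 frequency) pairs for a Zipf plot; a slope near -1 on
# this plot is Zipf's classic signature for linguistic material.
def zipf_points(symbols):
    freqs = sorted(Counter(symbols).values(), reverse=True)
    return [(math.log10(r), math.log10(f)) for r, f in enumerate(freqs, start=1)]

calls = list("aaaaabbbccd")  # toy inventory of four signal types
print(round(first_order_entropy(calls), 3))
print(zipf_points(calls)[:2])
```

The Zipf plot gives the quick first look the abstract describes; the first-order entropy is the entry point to the deeper Shannon analysis (higher-order entropies condition on preceding symbols).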

  8. Distributed acoustic sensing: how to make the best out of the Rayleigh-backscattered energy?

    NASA Astrophysics Data System (ADS)

    Eyal, A.; Gabai, H.; Shpatz, I.

    2017-04-01

    Coherent fading noise (also known as speckle noise) affects the SNR and sensitivity of distributed acoustic sensing (DAS) systems and makes these quantities random processes of position and time. As with speckle noise, the statistical distribution of the DAS SNR is particularly wide, and its standard deviation (STD) roughly equals its mean (σSNR/⟨SNR⟩ ≈ 0.89). Trading resolution for SNR may improve the mean SNR but does not necessarily narrow its distribution. Here a new approach is introduced that achieves both an SNR improvement (by sacrificing resolution) and a narrowing of the distribution. The method is based on acquiring high-resolution complex backscatter profiles of the sensing fiber, using them to compute complex power profiles of the fiber which retain phase-variation information, and filtering the power profiles. The approach is tested via computer simulation and demonstrates distribution narrowing down to σSNR/⟨SNR⟩ < 0.2.

  9. Climatic differentiation in polyploid apomictic Ranunculus auricomus complex in Europe.

    PubMed

    Paule, Juraj; Dunkel, Franz G; Schmidt, Marco; Gregor, Thomas

    2018-05-21

    Polyploidy and apomixis are important factors influencing plant distributions, often resulting in range shifts, expansions and geographical parthenogenesis. We used the Ranunculus auricomus complex as a model to assess whether past and present distributions and climatic preferences were determined by these phenomena. Ecological differentiation among diploids and polyploids was tested by comparing sets of climatic variables and by distribution modelling using 191 novel ploidy estimations and 561 literature records. Significant differences in relative genome size at the diploid level were recorded between the "auricomus" and "cassubicus" groups, and several new diploid occurrences were found in Slovenia and Hungary. The current distribution of diploids overlapped with the modelled paleodistribution (22 kyr BP), except in Austria and the Carpathians, which are proposed to have been colonized later from refugia in the Balkans. Current and historical presence of diploids from the R. auricomus complex is suggested also for the foothills of the Caucasus. Based on comparisons of climatic preferences, polyploids from the R. auricomus complex occupy slightly drier and colder habitats than the diploids. The change of reproductive mode and selection due to competition with the diploid ancestors may have facilitated the establishment of polyploids within the R. auricomus complex in environments slightly cooler and drier than those tolerated by the diploid ancestors. The much broader distribution of polyploid apomicts may have been achieved through faster colonization mediated by a uniparental reproductive system.

  10. General and craniofacial development are complex adaptive processes influenced by diversity.

    PubMed

    Brook, A H; O'Donnell, M Brook; Hone, A; Hart, E; Hughes, T E; Smith, R N; Townsend, G C

    2014-06-01

    Complex systems are present in such diverse areas as social systems, economies, ecosystems and biology and, therefore, are highly relevant to dental research, education and practice. A Complex Adaptive System in biological development is a dynamic process in which, from interacting components at a lower level, higher level phenomena and structures emerge. Diversity makes substantial contributions to the performance of complex adaptive systems. It enhances the robustness of the process, allowing multiple responses to external stimuli as well as internal changes. From diversity comes variation in outcome and the possibility of major change; outliers in the distribution enhance the tipping points. The development of the dentition is a valuable, accessible model with extensive and reliable databases for investigating the role of complex adaptive systems in craniofacial and general development. The general characteristics of such systems are seen during tooth development: self-organization; bottom-up emergence; multitasking; self-adaptation; variation; tipping points; critical phases; and robustness. Dental findings are compatible with the Random Network Model, the Threshold Model and also with the Scale Free Network Model which has a Power Law distribution. In addition, dental development shows the characteristics of Modularity and Clustering to form Hierarchical Networks. The interactions between the genes (nodes) demonstrate Small World phenomena, Subgraph Motifs and Gene Regulatory Networks. Genetic mechanisms are involved in the creation and evolution of variation during development. The genetic factors interact with epigenetic and environmental factors at the molecular level and form complex networks within the cells. From these interactions emerge the higher level tissues, tooth germs and mineralized teeth. 
Approaching development in this way allows investigation of why there can be variations in phenotypes from identical genotypes; the phenotype is the outcome of perturbations in the cellular systems and networks, as well as of the genotype. Understanding and applying complexity theory will bring about substantial advances not only in dental research and education but also in the organization and delivery of oral health care. © 2014 Australian Dental Association.

  11. Symmetric and Asymmetric Tendencies in Stable Complex Systems

    PubMed Central

    Tan, James P. L.

    2016-01-01

    A commonly used approach to study stability in a complex system is by analyzing the Jacobian matrix at an equilibrium point of a dynamical system. The equilibrium point is stable if all eigenvalues have negative real parts. Here, by obtaining eigenvalue bounds of the Jacobian, we show that stable complex systems will favor mutualistic and competitive relationships that are asymmetrical (non-reciprocative) and trophic relationships that are symmetrical (reciprocative). Additionally, we define a measure called the interdependence diversity that quantifies how distributed the dependencies are between the dynamical variables in the system. We find that increasing interdependence diversity has a destabilizing effect on the equilibrium point, and the effect is greater for trophic relationships than for mutualistic and competitive relationships. These predictions are consistent with empirical observations in ecology. More importantly, our findings suggest stabilization algorithms that can apply very generally to a variety of complex systems. PMID:27545722

  12. Symmetric and Asymmetric Tendencies in Stable Complex Systems.

    PubMed

    Tan, James P L

    2016-08-22

    A commonly used approach to study stability in a complex system is by analyzing the Jacobian matrix at an equilibrium point of a dynamical system. The equilibrium point is stable if all eigenvalues have negative real parts. Here, by obtaining eigenvalue bounds of the Jacobian, we show that stable complex systems will favor mutualistic and competitive relationships that are asymmetrical (non-reciprocative) and trophic relationships that are symmetrical (reciprocative). Additionally, we define a measure called the interdependence diversity that quantifies how distributed the dependencies are between the dynamical variables in the system. We find that increasing interdependence diversity has a destabilizing effect on the equilibrium point, and the effect is greater for trophic relationships than for mutualistic and competitive relationships. These predictions are consistent with empirical observations in ecology. More importantly, our findings suggest stabilization algorithms that can apply very generally to a variety of complex systems.
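    The stability criterion described above reduces to an eigenvalue check on the Jacobian. A minimal sketch (the 3×3 community matrix is an illustrative example, not one from the paper; its off-diagonal +/− pairs of equal magnitude mimic reciprocative trophic links, with self-regulation on the diagonal):

```python
import numpy as np

# An equilibrium is (linearly) stable iff every eigenvalue of the Jacobian
# evaluated there has negative real part.
def is_stable(jacobian):
    return bool(np.all(np.linalg.eigvals(jacobian).real < 0))

# Self-regulation (-1) on the diagonal; two predator-prey (+/-) pairs of
# equal magnitude off it -- a "symmetrical" trophic structure.
J = np.array([[-1.0,  0.5,  0.0],
              [-0.5, -1.0,  0.3],
              [ 0.0, -0.3, -1.0]])
print(is_stable(J))
```

Because the off-diagonal part of this J is antisymmetric, its eigenvalues are purely imaginary, so the diagonal self-regulation shifts every eigenvalue of J to real part −1: a small concrete instance of the paper's point that reciprocative trophic structure is easy to stabilize.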

  13. Modelling the Burstiness of Complex Space Plasmas Using Linear Fractional Stable Motion

    NASA Astrophysics Data System (ADS)

    Watkins, N. W.; Rosenberg, S. J.; Chapman, S. C.; Sanchez, R.; Credgington, D.

    2009-12-01

    The Earth's magnetosphere is quite clearly “complex” in the everyday sense of the word. However, in the last 15 to 20 years there has been a growing thread in space physics (e.g. Freeman & Watkins [Science, 2002], Chapman & Watkins [Space Science Reviews, 2001]) using and developing some of the emerging science of complex systems (e.g. Sornette, 2nd Edition, 2004). A particularly well-studied set of system properties has been derived from those used in the study of critical phenomena, notably correlation functions, power spectra, distributions of bursts above a threshold, and so on (e.g. Watkins [Nonlinear Processes in Geophysics, 2002]). These have revealed behaviours familiar from many other complex systems, such as burstiness, long-range dependence, heavy-tailed probability distributions and so forth. The results of these studies are typically interpreted within existing paradigms, most notably self-organised criticality. However, just as in other developing areas of complexity science (Sornette, op. cit.; Watkins & Freeman [Science, 2008]), it is increasingly being realised that the diagnostics in use have not been extensively studied outside the context in which they were originally proposed. This means that, for example, it is not well established what the expected distribution of bursts above a fixed threshold will be for time series other than Brownian (or fractional Brownian) motion. We will describe some preliminary investigations (Watkins et al [Physical Review E, 2009]) into the burst distribution problem, using Linear Fractional Stable Motion as a controllable toy model of a process exhibiting both long-range dependence and heavy tails. A by-product of the work was a differential equation for LFSM (Watkins et al, op. cit.), which we also briefly discuss.
Current and future work will also focus on the thorny problem of distinguishing turbulence from SOC in natural datasets (Watkins et al; Uritsky et al [Physical Review Letters, 2009]) with limited dynamic range, an area which will also be briefly discussed.
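    The burst-above-threshold diagnostic discussed in this record can be sketched generically: a burst is a contiguous excursion of the series above a fixed threshold, and its size is the integrated excess. A toy example on an ordinary random walk (not LFSM, whose simulation is more involved):

```python
import numpy as np

# Extract burst sizes (integrated excess above a threshold) from a series.
# A generic diagnostic sketch, not the paper's LFSM machinery.
def burst_sizes(x, threshold):
    sizes, current = [], 0.0
    for value in x:
        if value > threshold:
            current += value - threshold  # accumulate excess within a burst
        elif current > 0:
            sizes.append(current)         # burst ended; record its size
            current = 0.0
    if current > 0:
        sizes.append(current)             # series ended mid-burst
    return np.array(sizes)

rng = np.random.default_rng(1)
walk = np.cumsum(rng.standard_normal(10_000))  # Brownian-like toy series
sizes = burst_sizes(walk, threshold=np.mean(walk))
print(len(sizes), float(np.min(sizes)) > 0)
```

A histogram of `sizes` (typically on log-log axes) gives the empirical burst-size distribution whose expected form, for processes beyond fractional Brownian motion, is the open question the abstract describes.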

  14. Computer simulations of dendrimer-polyelectrolyte complexes.

    PubMed

    Pandav, Gunja; Ganesan, Venkat

    2014-08-28

    We carry out a systematic analysis of static properties of the clusters formed by complexation between charged dendrimers and linear polyelectrolyte (LPE) chains in a dilute solution under good solvent conditions. We use single chain in mean-field simulations and analyze the structure of the clusters through radial distribution functions of the dendrimer, cluster size, and charge distributions. The effects of LPE length, charge ratio between LPE and dendrimer, the influence of salt concentration, and the dendrimer generation number are examined. Systems with short LPEs showed a reduced propensity for aggregation with dendrimers, leading to formation of smaller clusters. In contrast, larger dendrimers and longer LPEs lead to larger clusters with significant bridging. Increasing salt concentration was seen to reduce aggregation between dendrimers as a result of screening of electrostatic interactions. Generally, maximum complexation was observed in systems with an equal amount of net dendrimer and LPE charges, whereas either excess LPE or dendrimer concentrations resulted in reduced clustering between dendrimers.

  15. Reliability Estimation of Aero-engine Based on Mixed Weibull Distribution Model

    NASA Astrophysics Data System (ADS)

    Yuan, Zhongda; Deng, Junxiang; Wang, Dawei

    2018-02-01

    An aero-engine is a complex mechanical and electronic system, and Weibull distribution models play an irreplaceable role in the reliability analysis of such systems. To date, only the two-parameter and three-parameter Weibull distribution models have been widely used. Because an engine has diverse failure modes, a single Weibull distribution model carries a large error. By contrast, a mixed Weibull distribution model can take a variety of engine failure modes into account, making it a good statistical analysis model. In addition to introducing the concept of a dynamic weight coefficient, a three-parameter correlation-coefficient optimization method is applied to enhance the Weibull distribution model so that the reliability estimates are more accurate, greatly improving the precision of the mixed-distribution reliability model. All of this is advantageous for popularizing the Weibull distribution model in engineering applications.
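A minimal sketch of the mixed Weibull reliability model described above. The two-component weights, shapes, and scales below are hypothetical placeholders, not values from the paper, and the dynamic weight coefficient and correlation-coefficient optimization are not reproduced.

```python
import math

def weibull_reliability(t, beta, eta):
    """Two-parameter Weibull survival function R(t) = exp(-(t/eta)**beta)."""
    return math.exp(-((t / eta) ** beta))

def mixed_weibull_reliability(t, components):
    """Mixture: R(t) = sum_i w_i * exp(-(t/eta_i)**beta_i), weights summing to 1."""
    return sum(w * weibull_reliability(t, b, e) for w, b, e in components)

# Hypothetical two-failure-mode model: an early (infant-mortality) mode with
# shape beta < 1 and a wear-out mode with beta > 1.
components = [
    (0.3, 0.8, 500.0),   # (weight, shape beta, scale eta): early-failure mode
    (0.7, 3.2, 2000.0),  # wear-out mode
]

r_early = mixed_weibull_reliability(100.0, components)
r_late = mixed_weibull_reliability(3000.0, components)
print(f"R(100) = {r_early:.4f}, R(3000) = {r_late:.4f}")
```

Each failure mode contributes its own shape and scale, which is why a mixture can track populations whose hazard behaviour a single Weibull cannot.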

  16. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME.

    PubMed

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2016-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.

  17. Analysis of the Chinese air route network as a complex network

    NASA Astrophysics Data System (ADS)

    Cai, Kai-Quan; Zhang, Jun; Du, Wen-Bo; Cao, Xian-Bin

    2012-02-01

    The air route network, which supports all flight activities of civil aviation, is the most fundamental infrastructure of the air traffic management system. In this paper, we study the Chinese air route network (CARN) within the framework of complex networks. We find that CARN is a geographical network possessing an exponential degree distribution, a low clustering coefficient, a large shortest path length, and an exponential spatial distance distribution, which is clearly different from that of the Chinese airport network (CAN). Moreover, by investigating flight data from 2002 to 2010, we demonstrate that the topological structure of CARN is homogeneous, whereas the distribution of flight flow on CARN is rather heterogeneous. In addition, the traffic on CARN keeps growing exponentially, and the growth rate in west China is remarkably larger than that in east China. Our work will be helpful for better understanding Chinese air traffic systems.

  18. Stochastic tools hidden behind the empirical dielectric relaxation laws

    NASA Astrophysics Data System (ADS)

    Stanislavsky, Aleksander; Weron, Karina

    2017-03-01

    The paper is devoted to recent advances in stochastic modeling of anomalous kinetic processes observed in dielectric materials, which are prominent examples of disordered (complex) systems. Theoretical studies of dynamical properties of ‘structures with variations’ (Goldenfeld and Kadanoff 1999 Science 284 87-9) require mathematical tools by means of which their random nature can be analyzed and, independently of the details distinguishing various systems (dipolar materials, glasses, semiconductors, liquid crystals, polymers, etc), the empirical universal kinetic patterns can be derived. We begin with a brief survey of the historical background of the dielectric relaxation study. After a short outline of the theoretical ideas providing the random tools applicable to modeling of relaxation phenomena, we present probabilistic implications for the study of the relaxation-rate distribution models. In the framework of the probability distribution of relaxation rates we consider description of complex systems, in which relaxing entities form random clusters interacting with each other and single entities. Then we focus on stochastic mechanisms of the relaxation phenomenon. We discuss the diffusion approach and its usefulness for understanding of anomalous dynamics of relaxing systems. We also discuss extensions of the diffusive approach to systems under tempered random processes. Useful relationships among different stochastic approaches to the anomalous dynamics of complex systems allow us to get a fresh look at this subject. The paper closes with a final discussion on achievements of stochastic tools describing the anomalous time evolution of complex systems.

  19. Cooperative action of coherent groups in broadly heterogeneous populations of interacting chemical oscillators

    PubMed Central

    Mikhailov, A. S.; Zanette, D. H.; Zhai, Y. M.; Kiss, I. Z.; Hudson, J. L.

    2004-01-01

    We present laboratory experiments on the effects of global coupling in a population of electrochemical oscillators with a multimodal frequency distribution. The experiments show that complex collective signals are generated by this system through spontaneous emergence and joint operation of coherently acting groups representing hierarchically organized resonant clusters. Numerical simulations support these experimental findings. Our results suggest that some forms of internal self-organization, characteristic of complex multiagent systems, are already possible in simple chemical systems. PMID:15263084

  20. Service Discovery Oriented Management System Construction Method

    NASA Astrophysics Data System (ADS)

    Li, Huawei; Ren, Ying

    2017-10-01

    To address the lack of a uniform method for designing service quality management systems in large-scale, complex service environments, this paper proposes a construction method for a distributed, service-discovery-oriented management system. Three measurement functions are proposed to compute nearest-neighbor user similarity at different levels. In view of the low efficiency of current service quality management systems, three solutions are proposed to improve system efficiency. Finally, the key technologies of a distributed service quality management system based on service discovery are summarized through quantitative experiments with factor addition and subtraction.

  1. Autonomous Decentralized Voltage Profile Control of Super Distributed Energy System using Multi-agent Technology

    NASA Astrophysics Data System (ADS)

    Tsuji, Takao; Hara, Ryoichi; Oyama, Tsutomu; Yasuda, Keiichiro

    A super distributed energy system is a future energy system in which a large part of the demand is fed by a huge number of distributed generators. At one time some nodes in the super distributed energy system behave as loads, while at other times they behave as generators; the characteristic of each node depends on the customers' decisions. In such a situation, it is very difficult to regulate the voltage profile over the system because of the complexity of the power flows. This paper proposes a novel control method for distributed generators that achieves autonomous, decentralized voltage profile regulation using multi-agent technology. The proposed multi-agent system employs two types of agents: control agents, which generate or consume reactive power to regulate the voltage profile of neighboring nodes, and mobile agents, which transmit the information necessary for VQ control among the control agents. The proposed control method is tested through numerical simulations.

  2. On the Relevancy of Efficient, Integrated Computer and Network Monitoring in HEP Distributed Online Environment

    NASA Astrophysics Data System (ADS)

    Carvalho, D.; Gavillet, Ph.; Delgado, V.; Albert, J. N.; Bellas, N.; Javello, J.; Miere, Y.; Ruffinoni, D.; Smith, G.

    Large scientific instruments are controlled by computer systems whose complexity is growing, driven on the one hand by the volume and variety of the information, its distributed nature, and the sophistication of its treatment, and on the other hand by the fast evolution of the computer and network market. Such systems are generically called Large-Scale Distributed Data Intensive Information Systems, or Distributed Computer Control Systems (DCCS) for those dealing more with real-time control. Taking advantage of (or forced by) the distributed architecture, the tasks are more and more often implemented as client-server applications. In this framework, the monitoring of the computer nodes, the communications network, and the applications becomes of primary importance for ensuring the safe running and guaranteed performance of the system. With the future generation of HEP experiments, such as those at the LHC, in view, it is proposed to integrate the various functions of DCCS monitoring into one general-purpose multi-layer system.

  3. Distributed Cognition on the road: Using EAST to explore future road transportation systems.

    PubMed

    Banks, Victoria A; Stanton, Neville A; Burnett, Gary; Hermawati, Setia

    2018-04-01

    Connected and Autonomous Vehicles (CAV) are set to revolutionise the way in which we use our transportation system. However, we do not fully understand how the integration of wireless and autonomous technology into the road transportation network affects overall network dynamism. This paper uses the theoretical principles underlying Distributed Cognition to explore the dependencies and interdependencies that exist between system agents located within the road environment, traffic management centres and other external agencies in both non-connected and connected transportation systems. This represents a significant step forward in modelling complex sociotechnical systems as it shows that the principles underlying Distributed Cognition can be applied to macro-level systems using the visual representations afforded by the Event Analysis of Systemic Teamwork (EAST) method. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Evolving Scale-Free Networks by Poisson Process: Modeling and Degree Distribution.

    PubMed

    Feng, Minyu; Qu, Hong; Yi, Zhang; Xie, Xiurui; Kurths, Jurgen

    2016-05-01

    Since the great mathematician Leonhard Euler initiated the study of graph theory, networks have been among the most significant research subjects across many disciplines. In recent years, the proposition of the small-world and scale-free properties of complex networks in statistical physics has made network science intriguing again for many researchers. One of the challenges of network science is to propose rational models for complex networks. In this paper, in order to reveal the influence of the vertex-generating mechanism of complex networks, we propose three novel models based on the homogeneous Poisson, nonhomogeneous Poisson, and birth-death processes, respectively, which can be regarded as typical scale-free networks and utilized to simulate practical networks. The degree distribution and exponent are analyzed and explained mathematically by different approaches. In simulations, we display the modeling process and the degree distribution of empirical data obtained by statistical methods, and we assess the reliability of the proposed networks; the results show that our models exhibit the features of typical complex networks. Finally, some future challenges for complex systems are discussed.
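To illustrate how the degree distribution of a growing network is measured empirically, here is a sketch using a standard preferential-attachment growth rule; this is a generic scale-free construction, not the Poisson or birth-death models proposed in the paper.

```python
import random
from collections import Counter

def preferential_attachment(n, m=2, seed=1):
    """Grow a network: each new vertex links to m existing vertices chosen
    with probability proportional to their current degree (BA-style)."""
    rng = random.Random(seed)
    stubs = [0, 1]                 # sampling uniformly from stubs is degree-biased
    degree = Counter({0: 1, 1: 1}) # start from a single edge 0--1
    for v in range(2, n):
        targets = set()
        while len(targets) < min(m, v):
            targets.add(rng.choice(stubs))
        for u in targets:
            degree[u] += 1
            degree[v] += 1
            stubs.extend((u, v))
    return degree

def degree_distribution(degree):
    """Empirical P(k): fraction of vertices with degree k."""
    counts = Counter(degree.values())
    n = len(degree)
    return {k: c / n for k, c in sorted(counts.items())}

deg = preferential_attachment(5000)
pk = degree_distribution(deg)
print(max(deg.values()), len(pk))  # hub degree and number of distinct degrees
```

Plotting `pk` on log-log axes is the usual way to check for the heavy-tailed degree distribution that distinguishes scale-free from exponential networks.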

  5. Automation of multi-agent control for complex dynamic systems in heterogeneous computational network

    NASA Astrophysics Data System (ADS)

    Oparin, Gennady; Feoktistov, Alexander; Bogdanova, Vera; Sidorov, Ivan

    2017-01-01

    The rapid progress of high-performance computing entails new challenges related to solving large scientific problems in various subject domains in a heterogeneous distributed computing environment (e.g., a network, Grid system, or Cloud infrastructure). Specialists in the field of parallel and distributed computing pay special attention to the scalability of applications for problem solving. Effective management of a scalable application in a heterogeneous distributed computing environment is still a non-trivial issue, and control systems that operate in networks are especially affected. We propose a new approach to multi-agent management of scalable applications in a heterogeneous computational network. The fundamentals of our approach are the integrated use of conceptual programming, simulation modeling, network monitoring, multi-agent management, and service-oriented programming. We developed a special framework for automating the problem solving. The advantages of the proposed approach are demonstrated on the example of parametric synthesis of a static linear regulator for complex dynamic systems. Benefits of the scalable application for solving this problem include automation of the multi-agent control of the systems in parallel mode with various degrees of detail.

  6. A framework for analyzing the cognitive complexity of computer-assisted clinical ordering.

    PubMed

    Horsky, Jan; Kaufman, David R; Oppenheim, Michael I; Patel, Vimla L

    2003-01-01

    Computer-assisted provider order entry is a technology that is designed to expedite medical ordering and to reduce the frequency of preventable errors. This paper presents a multifaceted cognitive methodology for the characterization of cognitive demands of a medical information system. Our investigation was informed by the distributed resources (DR) model, a novel approach designed to describe the dimensions of user interfaces that introduce unnecessary cognitive complexity. This method evaluates the relative distribution of external (system) and internal (user) representations embodied in system interaction. We conducted an expert walkthrough evaluation of a commercial order entry system, followed by a simulated clinical ordering task performed by seven clinicians. The DR model was employed to explain variation in user performance and to characterize the relationship of resource distribution and ordering errors. The analysis revealed that the configuration of resources in this ordering application placed unnecessarily heavy cognitive demands on the user, especially on those who lacked a robust conceptual model of the system. The resources model also provided some insight into clinicians' interactive strategies and patterns of associated errors. Implications for user training and interface design based on the principles of human-computer interaction in the medical domain are discussed.

  7. PILOT: An intelligent distributed operations support system

    NASA Technical Reports Server (NTRS)

    Rasmussen, Arthur N.

    1993-01-01

    The Real-Time Data System (RTDS) project is exploring the application of advanced technologies to the real-time flight operations environment of the Mission Control Centers at NASA's Johnson Space Center. The system, based on a network of engineering workstations, provides services such as delivery of real time telemetry data to flight control applications. To automate the operation of this complex distributed environment, a facility called PILOT (Process Integrity Level and Operation Tracker) is being developed. PILOT comprises a set of distributed agents cooperating with a rule-based expert system; together they monitor process operation and data flows throughout the RTDS network. The goal of PILOT is to provide unattended management and automated operation under user control.

  8. Drinking water for dairy cattle: always a benefit or a microbiological risk?

    PubMed

    Van Eenige, M J E M; Counotte, G H M; Noordhuizen, J P T M

    2013-02-01

    Drinking water can be considered an essential nutrient for dairy cattle. However, because it comes from different sources, its chemical and microbiological quality does not always reach accepted standards. Moreover, water quality is not routinely assessed on dairy farms. The microecology of drinking water sources and distribution systems is rather complex and still not fully understood. Water quality is adversely affected by the formation of biofilms in distribution systems, which form a persistent reservoir for potentially pathogenic bacteria. Saprophytic microorganisms associated with such biofilms interact with organic and inorganic matter in water, with pathogens, and even with each other. In addition, the presence of biofilms in water distribution systems makes cleaning and disinfection difficult and sometimes impossible. This article describes the complex dynamics of microorganisms in water distribution systems. Water quality is diminished primarily as a result of faecal contamination and rarely as a result of putrefaction in water distribution systems. The design of such systems (with/without anti-backflow valves and pressure) and the materials used (polyethylene enhances biofilm; stainless steel does not) affect the quality of water they provide. The best option is an open, funnel-shaped galvanized drinking trough, possibly with a pressure system, air inlet, and anti-backflow valves. A poor microbiological quality of drinking water may adversely affect feed intake, and herd health and productivity. In turn, public health may be affected because cattle can become a reservoir of microorganisms hazardous to humans, such as some strains of E. coli, Yersinia enterocolitica, and Campylobacter jejuni. A better understanding of the biological processes in water sources and distribution systems and of the viability of microorganisms in these systems may contribute to better advice on herd health and productivity at a farm level.
Certain on-farm risk factors for water quality have been identified. A practical approach will facilitate the control and management of these risks, and thereby improve herd health and productivity.

  9. Threshold transitions in a regional urban system

    EPA Science Inventory

    In this paper we analyze the evolution of city size distributions over time in a regional urban system. This urban complex system is in constant flux with changing groups and city migration across existing and newly created groups. Using group formation as an emergent property, t...

  10. Developing Use Cases for Evaluation of ADMS Applications to Accelerate Technology Adoption: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veda, Santosh; Wu, Hongyu; Martin, Maurice

    Grid modernization for distribution systems comprises the ability to effectively monitor and manage unplanned events while ensuring reliable operations. Integration of Distributed Energy Resources (DERs) and the proliferation of autonomous smart controllers such as microgrids and smart inverters in distribution networks challenge the status quo of distribution system operations. Advanced Distribution Management System (ADMS) technologies are being increasingly deployed to manage the complexities of operating distribution systems. The ability to evaluate ADMS applications in specific utility environments and for future scenarios will accelerate wider adoption of the ADMS and will lower the risks and costs of implementation. This paper addresses the first step: identifying and defining the use cases for evaluating these applications. The applications selected for this discussion include Volt-VAr Optimization (VVO), Fault Location Isolation and Service Restoration (FLISR), Online Power Flow (OLPF)/Distribution System State Estimation (DSSE), and Market Participation. A technical description and general operational requirements for each of these applications are presented. The test scenarios that are most relevant to utility challenges are also addressed.

  11. Political complexity predicts the spread of ethnolinguistic groups

    PubMed Central

    Currie, Thomas E.; Mace, Ruth

    2009-01-01

    Human languages show a remarkable degree of variation in the area they cover. However, the factors governing the distribution of human cultural groups such as languages are not well understood. While previous studies have examined the role of a number of environmental variables the importance of cultural factors has not been systematically addressed. Here we use a geographical information system (GIS) to integrate information about languages with environmental, ecological, and ethnographic data to test a number of hypotheses that have been proposed to explain the global distribution of languages. We show that the degree of political complexity and type of subsistence strategy exhibited by societies are important predictors of the area covered by a language. Political complexity is also strongly associated with the latitudinal gradient in language area, whereas subsistence strategy is not. We argue that a process of cultural group selection favoring more complex societies may have been important in shaping the present-day global distribution of language diversity. PMID:19380740

  12. Distributed Health Monitoring System for Reusable Liquid Rocket Engines

    NASA Technical Reports Server (NTRS)

    Lin, C. F.; Figueroa, F.; Politopoulos, T.; Oonk, S.

    2009-01-01

    The ability to correctly detect and identify any possible failure in the systems, subsystems, or sensors within a reusable liquid rocket engine is a major goal at NASA John C. Stennis Space Center (SSC). A health management (HM) system is required to provide an on-ground operation crew with an integrated awareness of the condition of every element of interest by determining anomalies, examining their causes, and making predictive statements. However, the complexity associated with relevant systems, and the large amount of data typically necessary for proper interpretation and analysis, presents difficulties in implementing complete failure detection, identification, and prognostics (FDI&P). As such, this paper presents a Distributed Health Monitoring System for Reusable Liquid Rocket Engines as a solution to these problems through the use of highly intelligent algorithms for real-time FDI&P, and efficient and embedded processing at multiple levels. The end result is the ability to successfully incorporate a comprehensive HM platform despite the complexity of the systems under consideration.

  13. A Pub/Sub Message Distribution Architecture for Disruption Tolerant Networks

    NASA Astrophysics Data System (ADS)

    Carrilho, Sergio; Esaki, Hiroshi

    Access to information is taken for granted in urban areas covered by a robust communication infrastructure. Nevertheless, most areas in the world are not covered by such infrastructures. We propose a DTN publish/subscribe system called Hikari, which uses node mobility to distribute messages without relying on a robust infrastructure. The area of Disruption/Delay Tolerant Networks (DTN) focuses on providing connectivity to locations separated by networks with disruptions and delays. The Hikari system does not use node identifiers for message forwarding, thus eliminating the routing complexity associated with many forwarding schemes in DTN. Hikari uses information about nodes' paths, advertised by special nodes in the system or predicted by the system itself, to optimize the message dissemination process. We used the Paris subway system, due to its complexity, to validate Hikari and to analyze its performance. We show that Hikari achieves a superior delivery rate while keeping redundant messages in the system low, which is ideal when using devices with limited resources for message dissemination.

  14. A Theoretical Solid Oxide Fuel Cell Model for System Controls and Stability Design

    NASA Technical Reports Server (NTRS)

    Kopasakis, George; Brinson, Thomas; Credle, Sydni; Xu, Ming

    2006-01-01

    As the aviation industry moves towards higher efficiency electrical power generation, all electric aircraft, or zero emissions and more quiet aircraft, fuel cells are sought as the technology that can deliver on these high expectations. The Hybrid Solid Oxide Fuel Cell system combines the fuel cell with a microturbine to obtain up to 70 percent cycle efficiency, and then distributes the electrical power to the loads via a power distribution system. The challenge is to understand the dynamics of this complex multi-discipline system, and design distributed controls that take the system through its operating conditions in a stable and safe manner while maintaining the system performance. This particular system is a power generation and distribution system and the fuel cell and microturbine model fidelity should be compatible with the dynamics of the power distribution system in order to allow proper stability and distributed controls design. A novel modeling approach is proposed for the fuel cell that will allow the fuel cell and the power system to be integrated and designed for stability, distributed controls, and other interface specifications. This investigation shows that for the fuel cell, the voltage characteristic should be modeled, but in addition, conservation equation dynamics, ion diffusion, charge transfer kinetics, and the electron flow inherent impedance should also be included.

  15. Design and implementation of a status at a glance user interface for a power distribution expert system

    NASA Technical Reports Server (NTRS)

    Liberman, Eugene M.; Manner, David B.; Dolce, James L.; Mellor, Pamela A.

    1993-01-01

    A user interface to the power distribution expert system for Space Station Freedom is discussed. The importance of features which simplify assessing system status and which minimize navigating through layers of information are examined. Design rationale and implementation choices are also presented. The amalgamation of such design features as message linking arrows, reduced information content screens, high salience anomaly icons, and color choices with failure detection and diagnostic explanation from an expert system is shown to provide an effective status-at-a-glance monitoring system for power distribution. This user interface design offers diagnostic reasoning without compromising the monitoring of current events. The display can convey complex concepts in terms that are clear to its users.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lopez-Ruiz, R.; Nagy, A.; Romera, E.

    A two-parameter family of complexity measures C̃^(α,β) based on the Rényi entropies is introduced and characterized by a detailed study of its mathematical properties. This family is the generalization of a continuous version of the López-Ruiz-Mancini-Calbet complexity, which is recovered for α = 1 and β = 2. These complexity measures are obtained by multiplying two quantities bringing global information on the probability distribution defining the system. When one of the parameters, α or β, goes to infinity, one of the global factors becomes a local factor. For this special case, the complexity is calculated for different quantum systems: the H-atom, the harmonic oscillator, and the square well.
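For the base case α = 1, β = 2, the family reduces to the LMC complexity, which for a discrete distribution is the product of the Shannon entropy H and the disequilibrium D (the squared distance from the uniform distribution). A minimal sketch of that base case (the Rényi-generalized family itself is not reproduced here):

```python
import math

def shannon_entropy(p):
    """H = -sum p_i ln p_i, with 0*ln(0) taken as 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0.0)

def disequilibrium(p):
    """D = sum (p_i - 1/N)^2, squared distance from the uniform distribution."""
    n = len(p)
    return sum((pi - 1.0 / n) ** 2 for pi in p)

def lmc_complexity(p):
    """LMC statistical complexity C = H * D: vanishes both for the fully
    ordered (delta) and the fully random (uniform) distribution."""
    return shannon_entropy(p) * disequilibrium(p)

uniform = [0.25] * 4
delta = [1.0, 0.0, 0.0, 0.0]
mixed = [0.7, 0.1, 0.1, 0.1]
print(lmc_complexity(uniform), lmc_complexity(delta), lmc_complexity(mixed))
```

The product structure is what the abstract refers to as "multiplying two quantities bringing global information": complexity is maximal between perfect order and perfect randomness, not at either extreme.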

  17. Scaling behavior of sleep-wake transitions across species

    NASA Astrophysics Data System (ADS)

    Lo, Chung-Chuan; Chou, Thomas; Ivanov, Plamen Ch.; Penzel, Thomas; Mochizuki, Takatoshi; Scammell, Thomas; Saper, Clifford B.; Stanley, H. Eugene

    2003-03-01

    Uncovering the mechanisms controlling sleep is a fascinating scientific challenge. It can be viewed as transitions of states of a very complex system, the brain. We study the time dynamics of short awakenings during sleep for three species: humans, rats and mice. We find, for all three species, that wake durations follow a power-law distribution, and sleep durations follow exponential distributions. Surprisingly, all three species have the same power-law exponent for the distribution of wake durations, but the exponential time scale of the distributions of sleep durations varies across species. We suggest that the dynamics of short awakenings are related to species-independent fluctuations of the system, while the dynamics of sleep is related to system-dependent mechanisms which change with species.
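The two fits described above can be sketched with standard maximum-likelihood estimators; the sample sizes, exponent, and time scale below are illustrative, not the paper's values. The continuous power-law exponent is estimated Hill-style, and the exponential time scale by the sample mean.

```python
import math
import random

def sample_power_law(n, alpha, xmin=1.0, seed=2):
    """Inverse-transform samples from p(x) ~ x**(-alpha), x >= xmin."""
    rng = random.Random(seed)
    return [xmin * (1.0 - rng.random()) ** (-1.0 / (alpha - 1.0)) for _ in range(n)]

def mle_power_law_exponent(xs, xmin=1.0):
    """Continuous MLE (Hill-type): alpha_hat = 1 + n / sum(ln(x_i / xmin))."""
    return 1.0 + len(xs) / sum(math.log(x / xmin) for x in xs)

def mle_exponential_scale(xs):
    """MLE of the exponential time scale is simply the sample mean."""
    return sum(xs) / len(xs)

wake = sample_power_law(50_000, alpha=2.2)           # power-law wake durations
alpha_hat = mle_power_law_exponent(wake)

rng = random.Random(3)
sleep = [rng.expovariate(1.0 / 20.0) for _ in range(50_000)]  # mean 20 time units
tau_hat = mle_exponential_scale(sleep)

print(round(alpha_hat, 2), round(tau_hat, 1))
```

Comparing the fitted exponent across datasets, and the fitted time scale across species, mirrors the abstract's finding of a species-independent power law for wake and a species-dependent exponential scale for sleep.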

  18. Soil temperature variability in complex terrain measured using fiber-optic distributed temperature sensing

    USDA-ARS?s Scientific Manuscript database

    Soil temperature (Ts) exerts critical controls on hydrologic and biogeochemical processes but magnitude and nature of Ts variability in a landscape setting are rarely documented. Fiber optic distributed temperature sensing systems (FO-DTS) potentially measure Ts at high density over a large extent. ...

  19. Exploring Distributed Leadership for the Quality Management of Online Learning Environments

    ERIC Educational Resources Information Center

    Palmer, Stuart; Holt, Dale; Gosper, Maree; Sankey, Michael; Allan, Garry

    2013-01-01

    Online learning environments (OLEs) are complex information technology (IT) systems that intersect with many areas of university organisation. Distributed models of leadership have been proposed as appropriate for the good governance of OLEs. Based on theoretical and empirical research, a group of Australian universities proposed a framework for…

  20. Exploring Complex Systems Aspects of Blackout Risk and Mitigation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newman, David E; Carreras, Benjamin A; Lynch, Vickie E

    2011-01-01

    Electric power transmission systems are a key infrastructure, and blackouts of these systems have major consequences for the economy and national security. Analyses of blackout data suggest that blackout size distributions have a power law form over much of their range. This result is an indication that blackouts behave as a complex dynamical system. We use a simulation of an upgrading power transmission system to investigate how these complex system dynamics impact the assessment and mitigation of blackout risk. The mitigation of failures in complex systems needs to be approached with care. The mitigation efforts can move the system to a new dynamic equilibrium while remaining near criticality and preserving the power law region. Thus, while the absolute frequency of blackouts of all sizes may be reduced, the underlying forces can still cause the relative frequency of large blackouts to small blackouts to remain the same. Moreover, in some cases, efforts to mitigate small blackouts can even increase the frequency of large blackouts. This result occurs because the large and small blackouts are not mutually independent, but are strongly coupled by the complex dynamics.

  1. A controlled experiment on the impact of software structure on maintainability

    NASA Technical Reports Server (NTRS)

    Rombach, Dieter H.

    1987-01-01

    The impact of software structure on maintainability aspects including comprehensibility, locality, modifiability, and reusability in a distributed system environment is studied in a controlled maintenance experiment involving six medium-size distributed software systems implemented in LADY (language for distributed systems) and six in an extended version of sequential PASCAL. For all maintenance aspects except reusability, the results were quantitatively given in terms of complexity metrics which could be automated. The results showed LADY to be better suited to the development of maintainable software than the extension of sequential PASCAL. The strong typing combined with high parametrization of units is suggested to improve the reusability of units in LADY.

  2. Modeling and Simulation of Upset-Inducing Disturbances for Digital Systems in an Electromagnetic Reverberation Chamber

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2014-01-01

    This report describes a modeling and simulation approach for disturbance patterns representative of the environment experienced by a digital system in an electromagnetic reverberation chamber. The disturbance is modeled by a multi-variate statistical distribution based on empirical observations. Extended versions of the Rejection Sampling and Inverse Transform Sampling techniques are developed to generate multi-variate random samples of the disturbance. The results show that Inverse Transform Sampling returns samples with higher fidelity relative to the empirical distribution. This work is part of an ongoing effort to develop a resilience assessment methodology for complex safety-critical distributed systems.
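
    In its basic univariate form, Inverse Transform Sampling against an empirical distribution means drawing a uniform variate and returning the corresponding empirical quantile. A minimal sketch, with a gamma-distributed stand-in for the observed disturbance amplitudes:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical "empirical observations" of a disturbance amplitude.
observed = rng.gamma(shape=2.0, scale=1.5, size=10_000)

# Inverse Transform Sampling against the empirical CDF: draw u ~ U(0,1)
# and invert the step-function CDF, i.e. return the empirical quantile.
sorted_obs = np.sort(observed)

def sample_empirical(n):
    u = rng.uniform(size=n)
    idx = np.minimum((u * len(sorted_obs)).astype(int), len(sorted_obs) - 1)
    return sorted_obs[idx]

samples = sample_empirical(50_000)
print(f"observed mean {observed.mean():.2f}, sampled mean {samples.mean():.2f}")
```

    The report's extension to multi-variate samples would additionally have to capture dependence between the disturbance variables, which this univariate sketch ignores.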

  3. Frontogenesis driven by horizontally quadratic distributions of density

    NASA Technical Reports Server (NTRS)

    Jacqmin, David

    1991-01-01

    Attention is given to the quadratic density distribution in a channel, which has been established by Simpson and Linden to be the simplest case of the horizontally nonlinear distribution of fluid density required for the production of frontogenesis. The porous-media and Boussinesq flow models are examined, and their evolution equations are reduced to one-dimensional systems. While both the porous-media and the inviscid/nondiffusive Boussinesq systems exhibit classic frontogenesis behavior, the viscous Boussinesq system exhibits a more complex behavior: boundary-layer effects force frontogenesis away from the lower boundary, and at late times the steepest density gradients are close to mid-channel.

  4. Reconstructing the equilibrium Boltzmann distribution from well-tempered metadynamics.

    PubMed

    Bonomi, M; Barducci, A; Parrinello, M

    2009-08-01

    Metadynamics is a widely used and successful method for reconstructing the free-energy surface of complex systems as a function of a small number of suitably chosen collective variables. This is achieved by biasing the dynamics of the system. The bias acting on the collective variables distorts the probability distribution of the other variables. Here we present a simple reweighting algorithm for recovering the unbiased probability distribution of any variable from a well-tempered metadynamics simulation. We show the efficiency of the reweighting procedure by reconstructing the distribution of the four backbone dihedral angles of alanine dipeptide from two- and even one-dimensional metadynamics simulations.
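
    The core idea of any such reweighting is that samples drawn under a bias V(s) can be restored to the unbiased Boltzmann distribution by weighting each one with exp(+beta V(s)). The toy grid calculation below demonstrates this on an assumed one-dimensional potential and bias; the actual well-tempered algorithm additionally handles the time dependence of V, which this sketch omits.

```python
import numpy as np

beta = 1.0
s = np.linspace(-3, 3, 601)         # toy collective-variable grid (assumption)
U = s**2                            # toy unbiased potential (assumption)
V = -0.5 * s**2                     # toy static bias that flattens U (assumption)

p_unbiased = np.exp(-beta * U)      # Boltzmann distribution we want to recover
p_unbiased /= p_unbiased.sum()

p_biased = np.exp(-beta * (U + V))  # distribution actually sampled under the bias
p_biased /= p_biased.sum()

# Reweighting: multiply each biased bin by exp(+beta * V) to undo the bias.
p_recovered = p_biased * np.exp(beta * V)
p_recovered /= p_recovered.sum()

print("max abs error:", np.max(np.abs(p_recovered - p_unbiased)))
```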

  5. Cooperating Expert Systems For Space Station Power Distribution Management

    NASA Astrophysics Data System (ADS)

    Nguyen, T. A.; Chiou, W. C.

    1987-02-01

    In a complex system such as the manned Space Station, it is deemed necessary that many expert systems perform tasks in a concurrent and cooperative manner. An important question arises: what cooperative-task-performing models are appropriate for multiple expert systems jointly performing tasks? The answer to this question will provide a crucial automation design criterion for the Space Station complex systems architecture. Based on a client/server model for performing tasks, we have developed a system that acts as a front-end to support loosely-coupled communications between expert systems running on multiple Symbolics machines. As an example, we use two ART*-based expert systems to demonstrate the concept of parallel symbolic manipulation for power distribution management and dynamic load planning/scheduling in the simulated Space Station environment. This on-going work will also explore other cooperative-task-performing models as alternatives and evaluate inter- and intra-expert-system communication mechanisms. It will serve as a testbed and a benchmarking tool for other Space Station expert subsystem communication and information exchange.

  6. Integration science and distributed networks

    NASA Astrophysics Data System (ADS)

    Landauer, Christopher; Bellman, Kirstie L.

    2002-07-01

    Our work on integration of data and knowledge sources is based in a common theoretical treatment of 'Integration Science', which leads to systematic processes for combining formal logical and mathematical systems, computational and physical systems, and human systems and organizations. The theory is based on the processing of explicit meta-knowledge about the roles played by the different knowledge sources and the methods of analysis and semantic implications of the different data values, together with information about the context in which and the purpose for which they are being combined. The research treatment is primarily mathematical, and though this kind of integration mathematics is still under development, there are some applicable common threads that have emerged already. Instead of describing the current state of the mathematical investigations, since they are not yet crystallized enough for formalisms, we describe our applications of the approach in several different areas, including our focus area of 'Constructed Complex Systems', which are complex heterogeneous systems managed or mediated by computing systems. In this context, it is important to remember that all systems are embedded, all systems are autonomous, and that all systems are distributed networks.

  7. Cooperative Management of a Lithium-Ion Battery Energy Storage Network: A Distributed MPC Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fang, Huazhen; Wu, Di; Yang, Tao

    2016-12-12

    This paper presents a study of cooperative power supply and storage for a network of Lithium-ion battery energy storage systems (LiBESSs). We propose to develop a distributed model predictive control (MPC) approach for two reasons. First, because it can account for the practical constraints of a LiBESS, MPC enables constraint-aware operation. Second, distributed management can cope with a complex network that integrates a large number of LiBESSs over a complex communication topology. With this motivation, we build a fully distributed MPC algorithm from an optimization perspective, based on an extension of the alternating direction method of multipliers (ADMM). A simulation example is provided to demonstrate the effectiveness of the proposed algorithm.
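
    The paper's ADMM extension is not reproduced in the abstract. As a hedged stand-in, the toy consensus ADMM below has each hypothetical agent minimize a local quadratic objective while all agents agree on a shared setpoint; this primal/consensus/dual iteration is the basic pattern a distributed MPC built on ADMM repeats at every control step. The targets and penalty parameter are illustrative assumptions.

```python
import numpy as np

# Toy consensus ADMM: N "LiBESS" agents agree on a shared setpoint z while
# each locally minimizes (x_i - a_i)^2; the fixed point is the average of a_i.
a = np.array([2.0, 4.0, 9.0])   # hypothetical local targets
rho = 1.0                       # ADMM penalty parameter (assumption)
x = np.zeros_like(a)
u = np.zeros_like(a)
z = 0.0

for _ in range(100):
    x = (a + rho * (z - u)) / (1.0 + rho)   # local primal updates (run in parallel)
    z = np.mean(x + u)                      # consensus (averaging) step
    u = u + x - z                           # dual updates

print(f"consensus value: {z:.3f} (average of targets: {a.mean():.3f})")
```

    In a real distributed deployment the averaging step is the only operation requiring communication, which is what makes ADMM attractive over a complex communication topology.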

  8. Finite-time hybrid projective synchronization of the drive-response complex networks with distributed-delay via adaptive intermittent control

    NASA Astrophysics Data System (ADS)

    Cheng, Lin; Yang, Yongqing; Li, Li; Sui, Xin

    2018-06-01

    This paper studies the finite-time hybrid projective synchronization of the drive-response complex networks. In the model, general transmission delays and distributed delays are also considered. By designing the adaptive intermittent controllers, the response network can achieve hybrid projective synchronization with the drive system in finite time. Based on finite-time stability theory and several differential inequalities, some simple finite-time hybrid projective synchronization criteria are derived. Two numerical examples are given to illustrate the effectiveness of the proposed method.

  9. Analysis and Modeling of Complex Geomorphic Systems: Technique Development, Data Collection, and Application to Rangeland Terrain

    DTIC Science & Technology

    2008-10-01

    attempts to measure the long-term distribution of storage time have relied on unrealistic assumptions, but two recent studies suggest a new approach. As...sediment age. Everitt (1968) mapped the age distribution of cottonwoods along a 34 km stretch of the Little Missouri River in North Dakota...Dietrich et al. (1982) applied Eriksson's (1971) method to estimate the residence time distribution from Everitt's age distribution. Somewhat mysteriously

  10. Complex Network Theory Applied to the Growth of Kuala Lumpur's Public Urban Rail Transit Network.

    PubMed

    Ding, Rui; Ujang, Norsidah; Hamid, Hussain Bin; Wu, Jianjun

    2015-01-01

    Recently, the number of studies involving complex network applications in transportation has increased steadily as scholars from various fields analyze traffic networks. Nonetheless, research on rail network growth is relatively rare. This research examines the evolution of the Public Urban Rail Transit Networks of Kuala Lumpur (PURTNoKL) based on complex network theory and covers both the topological structure of the rail system and future trends in network growth. In addition, network performance when facing different attack strategies is also assessed. Three topological network characteristics are considered: connections, clustering and centrality. In PURTNoKL, we found that the total number of nodes and edges exhibit a linear relationship and that the average degree stays within the interval [2.0488, 2.6774] with heavy-tailed distributions. The evolutionary process shows that the cumulative probability distribution (CPD) of degree and the average shortest path length show good fit with exponential distribution and normal distribution, respectively. Moreover, PURTNoKL exhibits clear cluster characteristics; most of the nodes have a 2-core value, and the CPDs of closeness and betweenness centrality follow a normal distribution function and an exponential distribution, respectively. Finally, we discuss four different types of network growth styles and the line extension process, which reveal that the rail network's growth is likely driven by the nodes with the largest shortest-path lengths and that network protection should emphasize those nodes with the largest degrees and the highest betweenness values. This research may enhance the networkability of the rail system and better shape the future growth of public rail networks.
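
    The two metrics the study leans on most, average degree and average shortest path length, can be computed from an adjacency list with nothing more than breadth-first search. A minimal sketch on a hypothetical six-station toy line (not the actual PURTNoKL data):

```python
from collections import deque

# Hypothetical toy rail network (station adjacency list), standing in for PURTNoKL.
rail = {
    "A": ["B"], "B": ["A", "C"], "C": ["B", "D", "E"],
    "D": ["C"], "E": ["C", "F"], "F": ["E"],
}

degrees = {n: len(nbrs) for n, nbrs in rail.items()}
avg_degree = sum(degrees.values()) / len(rail)

def bfs_lengths(graph, src):
    """Hop distances from src to every reachable node via breadth-first search."""
    dist = {src: 0}
    q = deque([src])
    while q:
        v = q.popleft()
        for w in graph[v]:
            if w not in dist:
                dist[w] = dist[v] + 1
                q.append(w)
    return dist

# Average shortest path length over all ordered pairs of distinct stations.
pair_lengths = [d for n in rail for m, d in bfs_lengths(rail, n).items() if m != n]
avg_path = sum(pair_lengths) / len(pair_lengths)

print(f"average degree: {avg_degree:.3f}, average path length: {avg_path:.3f}")
```

    Fitting the degree CPD to exponential or power-law forms, as the study does, would then operate on the `degrees` values collected over successive network snapshots.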

  11. A fractal approach to dynamic inference and distribution analysis

    PubMed Central

    van Rooij, Marieke M. J. W.; Nash, Bertha A.; Rajaraman, Srinivasan; Holden, John G.

    2013-01-01

    Event-distributions inform scientists about the variability and dispersion of repeated measurements. This dispersion can be understood from a complex systems perspective, and quantified in terms of fractal geometry. The key premise is that a distribution's shape reveals information about the governing dynamics of the system that gave rise to the distribution. Two categories of characteristic dynamics are distinguished: additive systems governed by component-dominant dynamics and multiplicative or interdependent systems governed by interaction-dominant dynamics. A logic by which systems governed by interaction-dominant dynamics are expected to yield mixtures of lognormal and inverse power-law samples is discussed. These mixtures are described by a so-called cocktail model of response times derived from human cognitive performances. The overarching goals of this article are twofold: First, to offer readers an introduction to this theoretical perspective and second, to offer an overview of the related statistical methods. PMID:23372552
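
    The cocktail idea can be sketched by mixing a lognormal body with an inverse power-law tail and checking which component dominates the extremes. All parameters below are illustrative assumptions, not the article's fitted values:

```python
import numpy as np

rng = np.random.default_rng(2)

# "Cocktail" mixture sketch: fraction p of lognormal responses (additive,
# component-dominant part) and 1-p inverse power-law responses
# (interaction-dominant tail). Parameters are illustrative assumptions.
n, p = 100_000, 0.7
alpha = 2.5                          # assumed power-law tail exponent
pick = rng.uniform(size=n) < p       # True -> lognormal, False -> power law
lognorm = rng.lognormal(mean=-0.5, sigma=0.4, size=n)
powlaw = (1.0 / rng.uniform(size=n)) ** (1.0 / (alpha - 1.0))  # Pareto, x >= 1
rt = np.where(pick, lognorm, powlaw)

# The power-law component should dominate the extreme tail of the mixture.
tail_fraction_powlaw = np.mean(~pick[np.argsort(rt)][-1000:])
print(f"fraction of the top 1000 samples from the power law: {tail_fraction_powlaw:.2f}")
```

    This is why, in the article's framing, the shape of the observed distribution's tail carries information about whether interaction-dominant dynamics are at work.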

  12. Visualizing request-flow comparison to aid performance diagnosis in distributed systems.

    PubMed

    Sambasivan, Raja R; Shafer, Ilari; Mazurek, Michelle L; Ganger, Gregory R

    2013-12-01

    Distributed systems are complex to develop and administer, and performance problem diagnosis is particularly challenging. When performance degrades, the problem might be in any of the system's many components or could be a result of poor interactions among them. Recent research efforts have created tools that automatically localize the problem to a small number of potential culprits, but research is needed to understand what visualization techniques work best for helping distributed systems developers understand and explore their results. This paper compares the relative merits of three well-known visualization approaches (side-by-side, diff, and animation) in the context of presenting the results of one proven automated localization technique called request-flow comparison. Via a 26-person user study, which included real distributed systems developers, we identify the unique benefits that each approach provides for different problem types and usage modes.

  13. Determination of optimum allocation and pricing of distributed generation using genetic algorithm methodology

    NASA Astrophysics Data System (ADS)

    Mwakabuta, Ndaga Stanslaus

    Electric power distribution systems play a significant role in providing continuous and "quality" electrical energy to different classes of customers. In the context of the present restrictions on transmission system expansions and the new paradigm of "open and shared" infrastructure, new approaches to distribution system analyses, economic and operational decision-making need investigation. This dissertation includes three layers of distribution system investigations. In the basic level, improved linear models are shown to offer significant advantages over previous models for advanced analysis. In the intermediate level, the improved model is applied to solve the traditional problem of operating cost minimization using capacitors and voltage regulators. In the advanced level, an artificial intelligence technique is applied to minimize cost under Distributed Generation injection from private vendors. Soft computing techniques are finding increasing applications in solving optimization problems in large and complex practical systems. The dissertation focuses on Genetic Algorithm for investigating the economic aspects of distributed generation penetration without compromising the operational security of the distribution system. The work presents a methodology for determining the optimal pricing of distributed generation that would help utilities make a decision on how to operate their system economically. This would enable modular and flexible investments that have real benefits to the electric distribution system. Improved reliability for both customers and the distribution system in general, reduced environmental impacts, increased efficiency of energy use, and reduced costs of energy services are some advantages.
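
    The genetic-algorithm machinery referred to above follows the standard select/crossover/mutate loop. The sketch below applies it to a deliberately simple stand-in problem, finding the price that minimizes a toy quadratic "operating cost"; the encoding, operators, and cost function are illustrative assumptions, not the dissertation's actual distribution-system model.

```python
import random

random.seed(0)

# Minimal genetic-algorithm sketch: find the price x minimizing a hypothetical
# quadratic cost (x - 3)^2 over the search range [0, 10].
def cost(x):
    return (x - 3.0) ** 2

pop = [random.uniform(0.0, 10.0) for _ in range(40)]   # initial random population
for _ in range(200):
    pop.sort(key=cost)
    parents = pop[:10]                                 # truncation selection (elitist)
    children = []
    while len(children) < 30:
        a, b = random.sample(parents, 2)
        child = 0.5 * (a + b)                          # arithmetic crossover
        child += random.gauss(0.0, 0.1)                # Gaussian mutation
        children.append(child)
    pop = parents + children

best = min(pop, key=cost)
print(f"best price found: {best:.3f} (optimum: 3.000)")
```

    A realistic application would replace `cost` with a power-flow evaluation of the candidate DG price and siting, subject to the operational security constraints the dissertation describes.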

  14. A continuum theory for multicomponent chromatography modeling.

    PubMed

    Pfister, David; Morbidelli, Massimo; Nicoud, Roger-Marc

    2016-05-13

    A continuum theory is proposed for modeling multicomponent chromatographic systems under linear conditions. The model is based on the description of complex mixtures, possibly involving tens or hundreds of solutes, by a continuum. The present approach is shown to be very efficient when dealing with a large number of similar components presenting close elution behaviors and whose individual analytical characterization is impossible. Moreover, approximating complex mixtures by continuous distributions of solutes reduces the required number of model parameters to the few ones specific to the characterization of the selected continuous distributions. Therefore, in the frame of the continuum theory, the simulation of large multicomponent systems gets simplified and the computational effectiveness of the chromatographic model is thus dramatically improved.

  15. A Novel Distributed Privacy Paradigm for Visual Sensor Networks Based on Sharing Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Luh, William; Kundur, Deepa; Zourntos, Takis

    2006-12-01

    Visual sensor networks (VSNs) provide surveillance images/video which must be protected from eavesdropping and tampering en route to the base station. In the spirit of sensor networks, we propose a novel paradigm for securing privacy and confidentiality in a distributed manner. Our paradigm is based on the control of dynamical systems, which we show is well suited for VSNs due to its low complexity in terms of processing and communication, while achieving robustness to both unintentional noise and intentional attacks as long as a small subset of nodes are affected. We also present a low complexity algorithm called TANGRAM to demonstrate the feasibility of applying our novel paradigm to VSNs. We present and discuss simulation results of TANGRAM.

  16. Empirical research on complex networks modeling of combat SoS based on data from real war-game, Part I: Statistical characteristics

    NASA Astrophysics Data System (ADS)

    Chen, Lei; Kou, Yingxin; Li, Zhanwu; Xu, An; Wu, Cheng

    2018-01-01

    We build a complex networks model of a combat System-of-Systems (SoS) based on empirical data from a real war-game. The model is a combination of a command & control (C2) subnetwork, a sensors subnetwork, an influencers subnetwork, and a logistical support subnetwork; each subnetwork has its own components and statistical characteristics. The C2 subnetwork is the core of the whole combat SoS; it has a hierarchical structure with no modularity, and its robustness is strong enough to maintain normal operation after any two nodes are destroyed. The sensors and influencers subnetworks are like the sense organs and limbs of the combat SoS; they are both flat modular networks whose degree distributions obey a GEV distribution and a power-law distribution, respectively. The communication network is the combination of all subnetworks; it is an assortative Small-World network with a core-periphery structure. The Intelligence & Communication Stations/Command Centers integrated with C2 nodes in the first three levels act as the hub nodes of the communication network, while all the fourth-level C2 nodes, sensors, influencers, and logistical support nodes have communication capability and act as the periphery nodes. Its degree distribution obeys an exponential distribution in the beginning, a Gaussian distribution in the middle, and a power-law distribution in the end, and its path length obeys a GEV distribution. The betweenness centrality distribution, closeness centrality distribution, and eigenvector centrality are also analyzed to measure the vulnerability of nodes.

  17. A concurrent distributed system for aircraft tactical decision generation

    NASA Technical Reports Server (NTRS)

    Mcmanus, John W.

    1990-01-01

    A research program investigating the use of AI techniques to aid in the development of a tactical decision generator (TDG) for within visual range (WVR) air combat engagements is discussed. The application of AI programming and problem-solving methods in the development and implementation of a concurrent version of the computerized logic for air-to-air warfare simulations (CLAWS) program, a second-generation TDG, is presented. Concurrent computing environments and programming approaches are discussed, and the design and performance of prototype concurrent TDG system (Cube CLAWS) are presented. It is concluded that the Cube CLAWS has provided a useful testbed to evaluate the development of a distributed blackboard system. The project has shown that the complexity of developing specialized software on a distributed, message-passing architecture such as the Hypercube is not overwhelming, and that reasonable speedups and processor efficiency can be achieved by a distributed blackboard system. The project has also highlighted some of the costs of using a distributed approach to designing a blackboard system.

  18. An Event-Based Approach to Distributed Diagnosis of Continuous Systems

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Roychoudhurry, Indranil; Biswas, Gautam; Koutsoukos, Xenofon

    2010-01-01

    Distributed fault diagnosis solutions are becoming necessary due to the complexity of modern engineering systems, and the advent of smart sensors and computing elements. This paper presents a novel event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, based on a qualitative abstraction of measurement deviations from the nominal behavior. We systematically derive dynamic fault signatures expressed as event-based fault models. We develop a distributed diagnoser design algorithm that uses these models for designing local event-based diagnosers based on global diagnosability analysis. The local diagnosers each generate globally correct diagnosis results locally, without a centralized coordinator, and by communicating a minimal number of measurements between themselves. The proposed approach is applied to a multi-tank system, and results demonstrate a marked improvement in scalability compared to a centralized approach.

  19. Extreme fluctuations in stochastic network coordination with time delays

    NASA Astrophysics Data System (ADS)

    Hunt, D.; Molnár, F.; Szymanski, B. K.; Korniss, G.

    2015-12-01

    We study the effects of uniform time delays on the extreme fluctuations in stochastic synchronization and coordination problems with linear couplings in complex networks. We obtain the average size of the fluctuations at the nodes from the behavior of the underlying modes of the network. We then obtain the scaling behavior of the extreme fluctuations with system size, as well as the distribution of the extremes on complex networks, and compare them to those on regular one-dimensional lattices. For large complex networks, when the delay is not too close to the critical one, fluctuations at the nodes effectively decouple, and the limit distributions converge to the Fisher-Tippett-Gumbel density. In contrast, fluctuations in low-dimensional spatial graphs are strongly correlated, and the limit distribution of the extremes is the Airy density. Finally, we also explore the effects of nonlinear couplings on the stability and on the extremes of the synchronization landscapes.
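
    The Fisher-Tippett-Gumbel limit mentioned above is easy to observe numerically: maxima of i.i.d. exponential fluctuations, shifted by log n, converge to a Gumbel distribution whose mean is the Euler-Mascheroni constant (about 0.5772). The sketch below is a toy stand-in for the effectively decoupled node fluctuations, with sizes chosen for speed rather than taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Shifted maxima of n i.i.d. exponential variables: M_n - log(n) -> Gumbel.
n, trials = 1_000, 5_000
maxima = rng.exponential(size=(trials, n)).max(axis=1) - np.log(n)

gamma = 0.5772156649  # Euler-Mascheroni constant = mean of the standard Gumbel
print(f"sample mean of shifted maxima: {maxima.mean():.3f} (Gumbel mean: {gamma:.3f})")
```

    The paper's contrast is that on low-dimensional lattices the node fluctuations stay strongly correlated, so this i.i.d. limit no longer applies and the Airy density appears instead.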

  20. Comment on "Universal relation between skewness and kurtosis in complex dynamics"

    NASA Astrophysics Data System (ADS)

    Celikoglu, Ahmet; Tirnakli, Ugur

    2015-12-01

    In a recent paper [M. Cristelli, A. Zaccaria, and L. Pietronero, Phys. Rev. E 85, 066108 (2012), 10.1103/PhysRevE.85.066108], the authors analyzed the relation between skewness and kurtosis for complex dynamical systems, and they identified two power-law regimes of non-Gaussianity, one of which scales with an exponent of 2 and the other with 4/3. They concluded that the observed relation is a universal fact in complex dynamical systems. In this Comment, we test the proposed universal relation between skewness and kurtosis with a large number of synthetic data, and we show that in fact it is not a universal relation and arises only due to the small number of data points in the datasets considered. The proposed relation is tested using a family of non-Gaussian distributions known as q-Gaussians. We show that this relation disappears for sufficiently large datasets provided that the fourth moment of the distribution is finite. We find that the kurtosis saturates to a single value, which is of course different from the Gaussian case (K = 3), as the number of data points is increased, indicating that the kurtosis will converge to a finite single value if all moments of the distribution up to the fourth are finite. The converged kurtosis value for finite-fourth-moment distributions, and the number of data points needed to reach it, depend on the deviation of the original distribution from the Gaussian case.
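
    The saturation effect is straightforward to reproduce for any finite-fourth-moment distribution. The sketch below uses a Laplace distribution (true kurtosis K = 6) as an illustrative stand-in for a q-Gaussian and shows the sample kurtosis settling toward its true value as the sample grows:

```python
import numpy as np

rng = np.random.default_rng(4)

def kurtosis(x):
    """Non-excess sample kurtosis K = m4 / m2**2 (K = 3 for a Gaussian)."""
    x = x - x.mean()
    return np.mean(x**4) / np.mean(x**2) ** 2

# For a distribution with a finite fourth moment (here: Laplace, K = 6),
# the sample kurtosis saturates toward its true value as the sample grows.
for n in (100, 10_000, 1_000_000):
    sample = rng.laplace(size=n)
    print(f"n = {n:>9,}: K = {kurtosis(sample):.3f}")
```

    For heavy-tailed q-Gaussians with a divergent fourth moment, by contrast, the sample kurtosis never converges, which is the boundary case the Comment discusses.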

  1. Distributed control systems with incomplete and uncertain information

    NASA Astrophysics Data System (ADS)

    Tang, Jingpeng

    Scientific and engineering advances in wireless communication, sensors, propulsion, and other areas are rapidly making it possible to develop unmanned air vehicles (UAVs) with sophisticated capabilities. UAVs have come to the forefront as tools for airborne reconnaissance to search for, detect, and destroy enemy targets in relatively complex environments. They potentially reduce risk to human life, are cost effective, and are superior to manned aircraft for certain types of missions. It is desirable for UAVs to have a high level of intelligent autonomy to carry out mission tasks with little external supervision and control. This raises important issues involving tradeoffs between centralized control and the associated potential to optimize mission plans, and decentralized control with great robustness and the potential to adapt to changing conditions. UAV capabilities have been extended several ways through armament (e.g., Hellfire missiles on Predator UAVs), increased endurance and altitude (e.g., Global Hawk), and greater autonomy. Some known barriers to full-scale implementation of UAVs are increased communication and control requirements as well as increased platform and system complexity. One of the key problems is how UAV systems can handle incomplete and uncertain information in dynamic environments. Especially when the system is composed of heterogeneous and distributed UAVs, the overall system complexity is increased under such conditions. Presented through the use of published papers, this dissertation lays the groundwork for the study of methodologies for handling incomplete and uncertain information for distributed control systems. An agent-based simulation framework is built to investigate mathematical approaches (optimization) and emergent intelligence approaches. The first paper provides a mathematical approach for systems of UAVs to handle incomplete and uncertain information. 
The second paper describes an emergent intelligence approach for UAVs, again in handling incomplete and uncertain information. The third paper combines mathematical and emergent intelligence approaches.

  2. Comparing Different Fault Identification Algorithms in Distributed Power System

    NASA Astrophysics Data System (ADS)

    Alkaabi, Salim

    A power system is a large, complex system that delivers electrical power from the generation units to the consumers. As the demand for electrical power increases, distributed generation was introduced to the power system. Faults may occur in the power system at any time and in different locations. These faults cause great damage to the system, as they may lead to full failure of the power system. Using distributed generation in the power system makes it even harder to identify the location of faults in the system. The main objective of this work is to test different fault location identification algorithms on a power system with varying amounts of power injected by distributed generators. As faults may lead the system to full failure, this is an important area for research. In this thesis, different fault location identification algorithms have been tested and compared while varying the amount of power injected from distributed generators. The algorithms were tested on the IEEE 34 node test feeder using MATLAB, and the results were compared to determine when these algorithms might fail and how reliable these methods are.

  3. The Impact of Heterogeneous Thresholds on Social Contagion with Multiple Initiators

    PubMed Central

    Karampourniotis, Panagiotis D.; Sreenivasan, Sameet; Szymanski, Boleslaw K.; Korniss, Gyorgy

    2015-01-01

    The threshold model is a simple but classic model of contagion spreading in complex social systems. To capture the complex nature of social influencing we investigate numerically and analytically the transition in the behavior of threshold-limited cascades in the presence of multiple initiators as the distribution of thresholds is varied between the two extreme cases of identical thresholds and a uniform distribution. We accomplish this by employing a truncated normal distribution of the nodes’ thresholds and observe a non-monotonic change in the cascade size as we vary the standard deviation. Further, for a sufficiently large spread in the threshold distribution, the tipping-point behavior of the social influencing process disappears and is replaced by a smooth crossover governed by the size of initiator set. We demonstrate that for a given size of the initiator set, there is a specific variance of the threshold distribution for which an opinion spreads optimally. Furthermore, in the case of synthetic graphs we show that the spread asymptotically becomes independent of the system size, and that global cascades can arise just by the addition of a single node to the initiator set. PMID:26571486
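
    The threshold model itself takes only a few lines to simulate: a node adopts once the adopting fraction of its neighbors reaches its threshold, and the cascade iterates to a fixed point. The sketch below runs it on an Erdos-Renyi graph with normal thresholds clipped to [0, 1] as a stand-in for the truncated normal; the graph size, mean, standard deviation, and initiator count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Watts-style threshold cascade on an Erdos-Renyi graph.
n, k_avg = 2_000, 8
adj = rng.uniform(size=(n, n)) < k_avg / n
adj = np.triu(adj, 1)
adj = adj | adj.T                       # undirected, no self-loops
A = adj.astype(float)
deg = adj.sum(axis=1)

# Normal thresholds clipped to [0, 1] (stand-in for the truncated normal).
thresholds = np.clip(rng.normal(0.35, 0.15, size=n), 0.0, 1.0)

active = np.zeros(n, dtype=bool)
active[rng.choice(n, size=100, replace=False)] = True   # initiator set

changed = True
while changed:
    frac = (A @ active) / np.maximum(deg, 1)   # active fraction among neighbors
    new_active = active | (frac >= thresholds)
    changed = bool((new_active != active).any())
    active = new_active

print(f"final cascade size: {active.mean():.2f} of the network")
```

    Sweeping the standard deviation of `thresholds` while holding the initiator set fixed reproduces, qualitatively, the non-monotonic cascade-size behavior the paper reports.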

  4. How Emotions Affect Learning.

    ERIC Educational Resources Information Center

    Sylwester, Robert

    1994-01-01

    Studies show our emotional system is a complex, widely distributed, and error-prone system that defines our basic personality early in life and is quite resistant to change. This article describes our emotional system's major parts (the peptides that carry emotional information and the body and brain structures that activate and regulate emotions)…

  5. Further Understanding of Complex Information Processing in Verbal Adolescents and Adults with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Williams, Diane L.; Minshew, Nancy J.; Goldstein, Gerald

    2015-01-01

    More than 20 years ago, Minshew and colleagues proposed the Complex Information Processing model of autism in which the impairment is characterized as a generalized deficit involving multiple modalities and cognitive domains that depend on distributed cortical systems responsible for higher order abilities. Subsequent behavioral work revealed a…

  6. Utilization of Matrix-Assisted Laser Desorption/Ionization Mass Spectrometry (MALDI-MS) to Identify environmental Strains of Mycobacterium Complex

    EPA Science Inventory

    Species within the Mycobacterium avium Complex (MAC) group are found to be both prevalent and persistent in drinking water distribution systems. The MAC is composed of two predominant species: M. avium and M. intracellulare. These species have the ability to survive drinking ...

  7. Architectures for Distributed and Complex M-Learning Systems: Applying Intelligent Technologies

    ERIC Educational Resources Information Center

    Caballe, Santi, Ed.; Xhafa, Fatos, Ed.; Daradoumis, Thanasis, Ed.; Juan, Angel A., Ed.

    2009-01-01

    Over the last decade, the needs of educational organizations have been changing in accordance with increasingly complex pedagogical models and with the technological evolution of e-learning environments with very dynamic teaching and learning requirements. This book explores state-of-the-art software architectures and platforms used to support…

  8. The mysteries of the diffusion region in asymmetric systems

    NASA Astrophysics Data System (ADS)

    Hesse, M.; Aunai, N.; Zenitani, S.; Kuznetsova, M. M.; Birn, J.

    2013-12-01

    Unlike in symmetric systems, where symmetry dictates a comparatively simple structure of the reconnection region, asymmetric systems offer a surprising, much more complex, structure of the diffusion region. Beyond the well-known lack of colocation of flow stagnation and magnetic null, the physical mechanism underpinning the reconnection electric field also appears to be considerably more complex. In this presentation, we will perform a detailed analysis of the reconnection diffusion region in an asymmetric system. We will show that, unlike in symmetric systems, the immediate reconnection electric field is not given by electron pressure tensor nongyrotropies, but by electron inertial contributions. We will further discuss the role of pressure nongyrotropies, and we will study the origin of the complex structures of electron distributions in the central part of the diffusion region.

  9. Trust and Influence

    DTIC Science & Technology

    2012-03-05

    DISTRIBUTION A: Approved for public release; distribution is unlimited. Program Trends: Trust in Autonomous Systems; Cross-cultural Trust... Trust & trustworthiness are independent (Mayer et al, 1995); Trust is relational; Humans in cross-cultural interactions; Complex human-machine... Interpersonal Trustworthiness: Ability, Benevolence, Integrity. Trust Metrics; Cross-Cultural Trust Issues; Human-Machine Interactions; Autonomous

  10. A mechanism producing power law etc. distributions

    NASA Astrophysics Data System (ADS)

    Li, Heling; Shen, Hongjun; Yang, Bin

    2017-07-01

Power-law distributions play an increasingly important role in the study of complex systems. Motivated by the intractability of complex systems, the idea of incomplete statistics is adopted and extended: three different exponential factors are introduced into the equations for the normalization condition, the statistical average, and the Shannon entropy, and probability distribution functions of exponential form, power-law form, and the product form of a power function and an exponential function are derived from the Shannon entropy together with the maximum entropy principle. This shows that the maximum entropy principle can fully replace the equal-probability hypothesis. Because the power-law distribution and the product-form distribution, which cannot be derived from the equal-probability hypothesis, can be derived with the aid of the maximum entropy principle, it is concluded that the maximum entropy principle is the more basic principle, embodying broader concepts and revealing the fundamental laws governing the motion of objects more deeply. At the same time, this principle also reveals the intrinsic link between nature and the various objects of human society and the principles they all obey.
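The route from maximum entropy to these three distribution families can be sketched with a standard Lagrange-multiplier derivation (an illustrative reconstruction, not taken from the paper; the symbols $\varepsilon_i$, $\lambda_k$ are generic constraint variables and multipliers):

```latex
\begin{align}
\mathcal{L} &= -\sum_i p_i \ln p_i
  - \lambda_0\Big(\sum_i p_i - 1\Big)
  - \lambda_1 \sum_i p_i\,\varepsilon_i
  - \lambda_2 \sum_i p_i \ln x_i,\\
\frac{\partial \mathcal{L}}{\partial p_i}
  &= -\ln p_i - 1 - \lambda_0 - \lambda_1 \varepsilon_i - \lambda_2 \ln x_i = 0,\\
p_i &\propto x_i^{-\lambda_2}\, e^{-\lambda_1 \varepsilon_i}.
\end{align}
```

Constraining only the average $\langle\varepsilon\rangle$ ($\lambda_2 = 0$) yields the exponential form, constraining only $\langle\ln x\rangle$ ($\lambda_1 = 0$) yields a pure power law, and imposing both constraints yields the product form of a power function and an exponential, which the equal-probability hypothesis alone cannot produce.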

  11. Cardea: Dynamic Access Control in Distributed Systems

    NASA Technical Reports Server (NTRS)

    Lepro, Rebekah

    2004-01-01

Modern authorization systems span domains of administration, rely on many different authentication sources, and manage complex attributes as part of the authorization process. This paper presents Cardea, a distributed system that facilitates dynamic access control, as a valuable piece of an inter-operable authorization framework. First, the authorization model employed in Cardea and its functionality goals are examined. Next, critical features of the system architecture and its handling of the authorization process are examined. Then the SAML and XACML standards, as incorporated into the system, are analyzed. Finally, the future directions of this project are outlined and connection points with general components of an authorization system are highlighted.

  12. Integrated Nationwide Electronic Health Records system: Semi-distributed architecture approach.

    PubMed

    Fragidis, Leonidas L; Chatzoglou, Prodromos D; Aggelidis, Vassilios P

    2016-11-14

The integration of heterogeneous electronic health record systems into an interoperable nationwide electronic health record system provides undisputable benefits in health care, such as superior health information quality, medical error prevention and cost saving. This paper proposes a semi-distributed system architecture approach for an integrated national electronic health record system, incorporating the advantages of the two dominant approaches, the centralized architecture and the distributed architecture. The high-level design of the main elements of the proposed architecture is provided, along with diagrams of execution and operation and the data synchronization architecture for the proposed solution. The proposed approach effectively handles issues related to redundancy, consistency, security, privacy, availability, load balancing, maintainability, complexity and interoperability of citizens' health data. The proposed semi-distributed architecture offers a robust interoperability framework without requiring healthcare providers to change their local EHR systems. It is a pragmatic approach that takes into account the characteristics of the Greek national healthcare system along with the national public administration data communication network infrastructure, achieving EHR integration with acceptable implementation cost.

  13. [The Herceptin® case : A case of falsification of medicinal products to a greater extent].

    PubMed

    Streit, Renz

    2017-11-01

Falsified medicines are a growing problem for the German drug market. The complex distribution channels across the European market facilitate the introduction of falsified and stolen medicines into the legal supply chain and may pose a risk to patients. The "Herceptin® case" from 2014, involving falsified medicines of Italian origin, demonstrates how complex distribution systems have been misused by criminal organizations to introduce stolen and thus falsified medicines into the market via parallel trade, and which measures the authorities and the parallel traders in the national and European network have taken to ensure patient safety. Falsified medicines will continue to be a problem in the future, so new monitoring systems have to be established and used effectively for prevention. The introduction of the EU-wide serialisation system in February 2019 is therefore intended to identify falsified drugs and to prevent their further trade as well as their dispensing to patients. Furthermore, maintaining and intensifying the cooperation between all EU authorities involved remains indispensable to close gateways in the distribution system for falsified medicines and to minimise the risk to the population.

  14. [Soft- and hardware support for the setup for computer tracking of radiation teletherapy].

    PubMed

    Tarutin, I G; Piliavets, V I; Strakh, A G; Minenko, V F; Golubovskiĭ, A I

    1983-06-01

A hardware and software computer-assisted complex has been developed for gamma-beam therapy. The complex includes all radiotherapeutic units, including a program-controlled Siemens betatron with an energy of 42 MeV, an ES-1022 computer, a Medigraf system for the processing of graphic information, a Mars-256 system for controlling the homogeneity of the dose-rate distribution over the irradiation field, and a package of mathematical programs for selecting an irradiation plan for various tumor sites. The prospects of using such complexes in the dosimetric support of radiation therapy are discussed.

  15. Optimized distributed computing environment for mask data preparation

    NASA Astrophysics Data System (ADS)

    Ahn, Byoung-Sup; Bang, Ju-Mi; Ji, Min-Kyu; Kang, Sun; Jang, Sung-Hoon; Choi, Yo-Han; Ki, Won-Tai; Choi, Seong-Woon; Han, Woo-Sung

    2005-11-01

As the critical dimension (CD) becomes smaller, various resolution enhancement techniques (RET) are widely adopted. In developing sub-100nm devices, the complexity of optical proximity correction (OPC) is severely increased and applied OPC layers are expanded to non-critical layers. The transformation of designed pattern data by the OPC operation introduces complexity, which causes runtime overheads in subsequent steps such as mask data preparation (MDP), and collapses the existing design hierarchy. Therefore, many mask shops exploit the distributed computing method to reduce the runtime of mask data preparation rather than exploit the design hierarchy. Distributed computing uses a cluster of computers connected to a local network system. However, two things limit the benefit of the distributed computing method in MDP. First, running every sequential MDP job with the maximum number of available CPUs is not efficient compared to parallel MDP job execution, due to the input data characteristics. Second, the runtime enhancement over input cost is not sufficient, since the scalability of fracturing tools is limited. In this paper, we discuss an optimal load-balancing environment that is useful in increasing the uptime of a distributed computing system by assigning an appropriate number of CPUs to each input design data. We also describe the distributed processing (DP) parameter optimization used to obtain maximum throughput in MDP job processing.

  16. Integrated Tools for Future Distributed Engine Control Technologies

    NASA Technical Reports Server (NTRS)

    Culley, Dennis; Thomas, Randy; Saus, Joseph

    2013-01-01

Turbine engines are highly complex mechanical systems that are becoming increasingly dependent on control technologies to achieve system performance and safety metrics. However, the contribution of controls to these measurable system objectives is difficult to quantify due to a lack of tools capable of informing the decision makers. This shortcoming hinders technology insertion in the engine design process. NASA Glenn Research Center is developing a Hardware-in-the-Loop (HIL) platform and analysis tool set that will serve as a focal point for new control technologies, especially those related to the hardware development and integration of distributed engine control. The HIL platform is intended to enable rapid and detailed evaluation of new engine control applications, from conceptual design through hardware development, in order to quantify their impact on engine systems. This paper discusses the complex interactions of the control system, within the context of the larger engine system, and how new control technologies are changing that paradigm. The conceptual design of the new HIL platform is then described as a primary tool to address those interactions and how it will help feed the insertion of new technologies into future engine systems.

  17. Hysteresis of liquid adsorption in porous media by coarse-grained Monte Carlo with direct experimental validation

    NASA Astrophysics Data System (ADS)

    Zeidman, Benjamin D.; Lu, Ning; Wu, David T.

    2016-05-01

The effects of path-dependent wetting and drying manifest themselves in many types of physical systems, including nanomaterials, biological systems, and porous media such as soil. It is desirable to better understand how these hysteretic macroscopic properties result from a complex interplay between gases, liquids, and solids at the pore scale. Coarse-Grained Monte Carlo (CGMC) is an appealing approach to model these phenomena in complex pore spaces, including ones determined experimentally. We present two-dimensional CGMC simulations of wetting and drying in two systems with pore spaces determined by sections from micro X-ray computed tomography: a system of randomly distributed spheres and a system of Ottawa sand. Results for the phase distribution, water uptake, and matric suction when corrected for extending to three dimensions show excellent agreement with experimental measurements on the same systems. This supports the hypothesis that CGMC can generate metastable configurations representative of experimental hysteresis and can also be used to predict hysteretic constitutive properties of particular experimental systems, given pore space images.
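The metastable configurations behind such wetting/drying hysteresis can be illustrated with a minimal 2D lattice-gas Metropolis sampler (a generic sketch, not the authors' CGMC code; the lattice size, coupling `J`, temperature `T`, and chemical potential `mu` are all illustrative parameters):

```python
import math
import random

def simulate_lattice_gas(L=16, J=1.0, mu=0.0, T=0.8, sweeps=300, seed=0, grid=None):
    """Metropolis sampling of a 2D lattice gas with occupancies n_i in {0, 1},
    nearest-neighbour attraction J, and chemical potential mu:
        E = -J * sum_<ij> n_i n_j - mu * sum_i n_i  (periodic boundaries).
    Returns the final grid and its filling fraction."""
    rng = random.Random(seed)
    if grid is None:
        grid = [[0] * L for _ in range(L)]   # start from a dry (empty) pore
    for _ in range(sweeps * L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        nn = (grid[(i + 1) % L][j] + grid[(i - 1) % L][j]
              + grid[i][(j + 1) % L] + grid[i][(j - 1) % L])
        dn = 1 - 2 * grid[i][j]              # +1 if filling, -1 if emptying
        dE = -dn * (J * nn + mu)             # energy change of the flip
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            grid[i][j] += dn
    frac = sum(map(sum, grid)) / (L * L)
    return grid, frac
```

Sweeping `mu` upward from an empty grid and back downward from the resulting filled grid (by passing the previous `grid` back in) traces different filling curves, a hysteresis loop in miniature; the paper's CGMC additionally embeds the sampler in experimentally imaged pore geometries.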

  18. Hysteresis of liquid adsorption in porous media by coarse-grained Monte Carlo with direct experimental validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeidman, Benjamin D.; Lu, Ning; Wu, David T., E-mail: dwu@mines.edu

    2016-05-07

The effects of path-dependent wetting and drying manifest themselves in many types of physical systems, including nanomaterials, biological systems, and porous media such as soil. It is desirable to better understand how these hysteretic macroscopic properties result from a complex interplay between gases, liquids, and solids at the pore scale. Coarse-Grained Monte Carlo (CGMC) is an appealing approach to model these phenomena in complex pore spaces, including ones determined experimentally. We present two-dimensional CGMC simulations of wetting and drying in two systems with pore spaces determined by sections from micro X-ray computed tomography: a system of randomly distributed spheres and a system of Ottawa sand. Results for the phase distribution, water uptake, and matric suction when corrected for extending to three dimensions show excellent agreement with experimental measurements on the same systems. This supports the hypothesis that CGMC can generate metastable configurations representative of experimental hysteresis and can also be used to predict hysteretic constitutive properties of particular experimental systems, given pore space images.

  19. A statistical physics perspective on criticality in financial markets

    NASA Astrophysics Data System (ADS)

    Bury, Thomas

    2013-11-01

    Stock markets are complex systems exhibiting collective phenomena and particular features such as synchronization, fluctuations distributed as power-laws, non-random structures and similarity to neural networks. Such specific properties suggest that markets operate at a very special point. Financial markets are believed to be critical by analogy to physical systems, but little statistically founded evidence has been given. Through a data-based methodology and comparison to simulations inspired by the statistical physics of complex systems, we show that the Dow Jones and index sets are not rigorously critical. However, financial systems are closer to criticality in the crash neighborhood.

  20. Unifying distribution functions: some lesser known distributions.

    PubMed

    Moya-Cessa, J R; Moya-Cessa, H; Berriel-Valdos, L R; Aguilar-Loreto, O; Barberis-Blostein, P

    2008-08-01

We show that there is a way to unify distribution functions that simultaneously describe a classical signal in space and (spatial) frequency, and position and momentum for a quantum system. Probably the best known of these is the Wigner distribution function. We show how to unify functions of the Cohen class, Rihaczek's complex energy function, and the Husimi and Glauber-Sudarshan distribution functions. We do this by showing how they may be obtained from ordered forms of creation and annihilation operators and by obtaining them in terms of expectation values in different eigenbases.

  1. Delivery of complex organic compounds from evolved stars to the solar system.

    PubMed

    Kwok, Sun

    2011-12-01

    Stars in the late stages of evolution are able to synthesize complex organic compounds with aromatic and aliphatic structures over very short time scales. These compounds are ejected into the interstellar medium and distributed throughout the Galaxy. The structures of these compounds are similar to the insoluble organic matter found in meteorites. In this paper, we discuss to what extent stellar organics has enriched the primordial Solar System and possibly the early Earth.

  2. Integrating research tools to support the management of social-ecological systems under climate change

    USGS Publications Warehouse

    Miller, Brian W.; Morisette, Jeffrey T.

    2014-01-01

    Developing resource management strategies in the face of climate change is complicated by the considerable uncertainty associated with projections of climate and its impacts and by the complex interactions between social and ecological variables. The broad, interconnected nature of this challenge has resulted in calls for analytical frameworks that integrate research tools and can support natural resource management decision making in the face of uncertainty and complex interactions. We respond to this call by first reviewing three methods that have proven useful for climate change research, but whose application and development have been largely isolated: species distribution modeling, scenario planning, and simulation modeling. Species distribution models provide data-driven estimates of the future distributions of species of interest, but they face several limitations and their output alone is not sufficient to guide complex decisions for how best to manage resources given social and economic considerations along with dynamic and uncertain future conditions. Researchers and managers are increasingly exploring potential futures of social-ecological systems through scenario planning, but this process often lacks quantitative response modeling and validation procedures. Simulation models are well placed to provide added rigor to scenario planning because of their ability to reproduce complex system dynamics, but the scenarios and management options explored in simulations are often not developed by stakeholders, and there is not a clear consensus on how to include climate model outputs. We see these strengths and weaknesses as complementarities and offer an analytical framework for integrating these three tools. We then describe the ways in which this framework can help shift climate change research from useful to usable.

  3. Distribution of glutamatergic, GABAergic, and glycinergic neurons in the auditory pathways of macaque monkeys.

    PubMed

    Ito, T; Inoue, K; Takada, M

    2015-12-03

Macaque monkeys use complex communication calls and are regarded as a model for studying the coding and decoding of complex sound in the auditory system. However, little is known about the distribution of excitatory and inhibitory neurons in the auditory system of macaque monkeys. In this study, we examined the overall distribution of cell bodies that expressed mRNAs for VGLUT1 and VGLUT2 (markers for glutamatergic neurons), GAD67 (a marker for GABAergic neurons), and GLYT2 (a marker for glycinergic neurons) in the auditory system of the Japanese macaque. In addition, we performed immunohistochemistry for VGLUT1, VGLUT2, and GAD67 in order to compare the distribution of proteins and mRNAs. We found that most of the excitatory neurons in the auditory brainstem expressed VGLUT2. In contrast, the expression of VGLUT1 mRNA was restricted to the auditory cortex (AC), periolivary nuclei, and cochlear nuclei (CN). The co-expression of GAD67 and GLYT2 mRNAs was common in the ventral nucleus of the lateral lemniscus (VNLL), CN, and superior olivary complex except for the medial nucleus of the trapezoid body, which expressed GLYT2 alone. In contrast, the dorsal nucleus of the lateral lemniscus, inferior colliculus, thalamus, and AC expressed GAD67 alone. The absence of co-expression of VGLUT1 and VGLUT2 in the medial geniculate, medial superior olive, and VNLL suggests that synaptic responses in the target neurons of these nuclei may be different between rodents and macaque monkeys. Copyright © 2015 IBRO. Published by Elsevier Ltd. All rights reserved.

  4. The influence of bile salts on the distribution of simvastatin in the octanol/buffer system.

    PubMed

    Đanić, Maja; Pavlović, Nebojša; Stanimirov, Bojan; Vukmirović, Saša; Nikolić, Katarina; Agbaba, Danica; Mikov, Momir

    2016-01-01

The distribution coefficient (D) is a useful parameter for evaluating drug permeability across biological membranes, which is important for drug bioavailability. Given that bile acids are intensively studied as drug permeation-modifying and solubilizing agents, the aim of this study was to estimate the influence of the sodium salts of cholic (CA), deoxycholic (DCA) and 12-monoketocholic (MKC) acids on the distribution coefficient of simvastatin (SV) (lactone [SVL] and acid [SVA] forms), a highly lipophilic compound with extremely low water solubility and bioavailability. LogD values of SVA and SVL with or without bile salts were measured by liquid-liquid extraction in n-octanol/buffer systems at pH 5 and 7.4. SV concentrations in the aqueous phase were determined by HPLC-DAD. The Chem3D Ultra program was applied to compute the physico-chemical properties of the analyzed compounds and their complexes. A statistically significant decrease in both SVA and SVL logD was observed for all three studied bile salts at both selected pH values. MKC exerted the most pronounced effect in the case of SVA, while there were no statistically significant differences between the bile salts for SVL. The calculated physico-chemical properties of the analyzed compounds and their complexes supported the experimental results. Our data indicate that the addition of bile salts to the n-octanol/buffer system decreases the SV distribution coefficient at both studied pH values. This may be the result of the formation of hydrophilic complexes increasing the solubility of SV, which could consequently affect the pharmacokinetic parameters of SV and the final drug response in patients.
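The logD measurement described above reduces to a simple ratio of phase concentrations. A minimal sketch (the concentration values are hypothetical, not the paper's data):

```python
import math

def log_d(c_octanol, c_aqueous):
    """Distribution coefficient from a shake-flask (liquid-liquid extraction)
    experiment: logD = log10([solute in n-octanol] / [solute in aqueous buffer]).
    Any common concentration unit works (e.g. ug/mL from HPLC-DAD), as long as
    both phases use the same unit."""
    return math.log10(c_octanol / c_aqueous)

# Hypothetical readings: a drop in logD after adding a bile salt indicates
# a shift of the drug toward the aqueous phase (higher apparent solubility).
ld_without_salt = log_d(45.0, 0.5)   # strongly partitioned into octanol
ld_with_salt = log_d(30.0, 2.0)      # more drug retained in the buffer
```

Comparing `ld_with_salt` against `ld_without_salt` mirrors the paper's comparison of SV logD with and without bile salts at a given pH.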

  5. Bilinear effect in complex systems

    NASA Astrophysics Data System (ADS)

    Lam, Lui; Bellavia, David C.; Han, Xiao-Pu; Alston Liu, Chih-Hui; Shu, Chang-Qing; Wei, Zhengjin; Zhou, Tao; Zhu, Jichen

    2010-09-01

    The distribution of the lifetime of Chinese dynasties (as well as that of the British Isles and Japan) in a linear Zipf plot is found to consist of two straight lines intersecting at a transition point. This two-section piecewise-linear distribution is different from the power law or the stretched exponent distribution, and is called the Bilinear Effect for short. With assumptions mimicking the organization of ancient Chinese regimes, a 3-layer network model is constructed. Numerical results of this model show the bilinear effect, providing a plausible explanation of the historical data. The bilinear effect in two other social systems is presented, indicating that such a piecewise-linear effect is widespread in social systems.
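A two-section piecewise-linear fit of the kind described can be sketched by scanning candidate transition points and fitting each side by least squares (an illustrative reconstruction, not the authors' procedure; the synthetic slopes and break point below are invented):

```python
import numpy as np

def fit_bilinear(rank, value):
    """Fit two straight lines to a linear Zipf plot (value vs. rank),
    choosing the transition index that minimizes the total squared error."""
    best_sse, best = float("inf"), None
    for k in range(2, len(rank) - 2):          # each segment needs >= 2 points
        left = np.polyfit(rank[:k], value[:k], 1)
        right = np.polyfit(rank[k:], value[k:], 1)
        sse = (np.sum((np.polyval(left, rank[:k]) - value[:k]) ** 2)
               + np.sum((np.polyval(right, rank[k:]) - value[k:]) ** 2))
        if sse < best_sse:
            best_sse, best = sse, (k, left, right)
    return best   # (transition index, left [slope, intercept], right [slope, intercept])
```

On lifetime-vs-rank data, distinctly different slopes on the two sides of the recovered transition point are the "bilinear effect"; a single power law or stretched exponential would not produce two straight sections on linear axes.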

  6. Combined distribution functions: A powerful tool to identify cation coordination geometries in liquid systems

    NASA Astrophysics Data System (ADS)

    Sessa, Francesco; D'Angelo, Paola; Migliorati, Valentina

    2018-01-01

    In this work we have developed an analytical procedure to identify metal ion coordination geometries in liquid media based on the calculation of Combined Distribution Functions (CDFs) starting from Molecular Dynamics (MD) simulations. CDFs provide a fingerprint which can be easily and unambiguously assigned to a reference polyhedron. The CDF analysis has been tested on five systems and has proven to reliably identify the correct geometries of several ion coordination complexes. This tool is simple and general and can be efficiently applied to different MD simulations of liquid systems.

  7. Increased Fire and Toxic Contaminant Detection Responsibility by Use of Distributed, Aspirating Sensors

    NASA Technical Reports Server (NTRS)

    Youngblood, Wallace W.

    1990-01-01

    Viewgraphs of increased fire and toxic contaminant detection responsivity by use of distributed, aspirating sensors for space station are presented. Objectives of the concept described are (1) to enhance fire and toxic contaminant detection responsivity in habitable regions of space station; (2) to reduce system weight and complexity through centralized detector/monitor systems; (3) to increase fire signature information from selected locations in a space station module; and (4) to reduce false alarms.

  8. Understanding quantum work in a quantum many-body system.

    PubMed

    Wang, Qian; Quan, H T

    2017-03-01

Based on previous studies in a single-particle system in both the integrable [Jarzynski, Quan, and Rahav, Phys. Rev. X 5, 031038 (2015)] and the chaotic [Zhu, Gong, Wu, and Quan, Phys. Rev. E 93, 062108 (2016)] regimes, we study the correspondence principle between quantum and classical work distributions in a quantum many-body system. Even though the interaction and the indistinguishability of identical particles increase the complexity of the system, we find that for a quantum many-body system the quantum work distribution still converges to its classical counterpart in the semiclassical limit. Our results imply that there exists a correspondence principle between quantum and classical work distributions in an interacting quantum many-body system, especially in the large particle number limit, and further justify the definition of quantum work via two-point energy measurements in quantum many-body systems.

  9. Elucidating complicated assembling systems in biology using size-and-shape analysis of sedimentation velocity data

    PubMed Central

    Chaton, Catherine T.

    2017-01-01

    Sedimentation velocity analytical ultracentrifugation (SV-AUC) has seen a resurgence in popularity as a technique for characterizing macromolecules and complexes in solution. SV-AUC is a particularly powerful tool for studying protein conformation, complex stoichiometry, and interacting systems in general. Deconvoluting velocity data to determine a sedimentation coefficient distribution c(s) allows for the study of either individual proteins or multi-component mixtures. The standard c(s) approach estimates molar masses of the sedimenting species based on determination of the frictional ratio (f/f0) from boundary shapes. The frictional ratio in this case is a weight-averaged parameter, which can lead to distortion of mass estimates and loss of information when attempting to analyze mixtures of macromolecules with different shapes. A two-dimensional extension of the c(s) analysis approach provides size-and-shape distributions that describe the data in terms of a sedimentation coefficient and frictional ratio grid. This allows for better resolution of species with very distinct shapes that may co-sediment and provides better molar mass determinations for multi-component mixtures. An example case is illustrated using globular and non-globular proteins of different masses with nearly identical sedimentation coefficients that could only be resolved using the size-and-shape distribution. Other applications of this analytical approach to complex biological systems are presented, focusing on proteins involved in the innate immune response to cytosolic microbial DNA. PMID:26412652

  10. Analysis of Power Laws, Shape Collapses, and Neural Complexity: New Techniques and MATLAB Support via the NCC Toolbox.

    PubMed

    Marshall, Najja; Timme, Nicholas M; Bennett, Nicholas; Ripp, Monica; Lautzenhiser, Edward; Beggs, John M

    2016-01-01

Neural systems include interactions that occur across many scales. Two divergent methods for characterizing such interactions have drawn on the physical analysis of critical phenomena and the mathematical study of information. Inferring criticality in neural systems has traditionally rested on fitting power laws to the property distributions of "neural avalanches" (contiguous bursts of activity), but the fractal nature of avalanche shapes has recently emerged as another signature of criticality. On the other hand, neural complexity, an information-theoretic measure, has been used to capture the interplay between the functional localization of brain regions and their integration for higher cognitive functions. Unfortunately, treatments of all three methods (power-law fitting, avalanche shape collapse, and neural complexity) have suffered from shortcomings. Empirical data often contain biases that introduce deviations from a true power law in the tail and head of the distribution, but deviations in the tail have often gone unaccounted for; avalanche shape collapse has required manual parameter tuning; and the estimation of neural complexity has relied on small data sets or statistical assumptions for the sake of computational efficiency. In this paper we present technical advancements in the analysis of criticality and complexity in neural systems. We use maximum-likelihood estimation to automatically fit power laws with left and right cutoffs, present the first automated shape-collapse algorithm, and describe new techniques to account for large numbers of neural variables and small data sets in the calculation of neural complexity. In order to facilitate future research in criticality and complexity, we have made the software utilized in this analysis freely available online in the MATLAB NCC (Neural Complexity and Criticality) Toolbox.
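The core of maximum-likelihood power-law fitting can be sketched for the simplest case of a continuous power law with a lower cutoff only (the closed-form Hill estimator; the toolbox's handling of right cutoffs and discrete data requires numerical optimization and is not shown):

```python
import math
import random

def fit_power_law_alpha(xs, xmin):
    """Maximum-likelihood exponent for a continuous power law
    p(x) ~ x^(-alpha) with x >= xmin:  alpha = 1 + n / sum(ln(x_i / xmin)).
    Observations below xmin (the 'head' of the distribution) are discarded."""
    tail = [x for x in xs if x >= xmin]
    n = len(tail)
    return 1.0 + n / sum(math.log(x / xmin) for x in tail)

# Synthetic check: draw samples from p(x) ~ x^(-2.5) by inverse transform.
rng = random.Random(42)
alpha_true, xmin = 2.5, 1.0
xs = [xmin * (1.0 - rng.random()) ** (-1.0 / (alpha_true - 1.0))
      for _ in range(20000)]
alpha_hat = fit_power_law_alpha(xs, xmin)   # should recover ~2.5
```

This is the step that graphical (log-log regression) fits get wrong; the left-cutoff search and the right cutoff that the paper emphasizes extend exactly this likelihood.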

  11. The Mating System of the Wild-to-Domesticated Complex of Gossypium hirsutum L. Is Mixed

    PubMed Central

    Velázquez-López, Rebeca; Wegier, Ana; Alavez, Valeria; Pérez-López, Javier; Vázquez-Barrios, Valeria; Arroyo-Lambaer, Denise; Ponce-Mendoza, Alejandro; Kunin, William E.

    2018-01-01

    The domestication syndrome of many plants includes changes in their mating systems. The evolution of the latter is shaped by ecological and genetic factors that are particular to an area. Thus, the reproductive biology of wild relatives must be studied in their natural distribution to understand the mating system of a crop species as a whole. Gossypium hirsutum (upland cotton) includes both domesticated varieties and wild populations of the same species. Most studies on mating systems describe cultivated cotton as self-pollinated, while studies on pollen dispersal report outcrossing; however, the mating system of upland cotton has not been described as mixed and little is known about its wild relatives. In this study we selected two wild metapopulations for comparison with domesticated plants and one metapopulation with evidence of recent gene flow between wild relatives and the crop to evaluate the mating system of cotton’s wild-to-domesticated complex. Using classic reproductive biology methods, our data demonstrate that upland cotton presents a mixed mating system throughout the complex. Given cotton’s capacity for outcrossing, differences caused by the domestication process in cultivated individuals can have consequences for its wild relatives. This characterization of the diversity of the wild relatives in their natural distribution, as well as their interactions with the crop, will be useful to design and implement adequate strategies for conservation and biosecurity. PMID:29868048

  12. Atomic switch networks—nanoarchitectonic design of a complex system for natural computing

    NASA Astrophysics Data System (ADS)

    Demis, E. C.; Aguilera, R.; Sillin, H. O.; Scharnhorst, K.; Sandouk, E. J.; Aono, M.; Stieg, A. Z.; Gimzewski, J. K.

    2015-05-01

    Self-organized complex systems are ubiquitous in nature, and the structural complexity of these natural systems can be used as a model to design new classes of functional nanotechnology based on highly interconnected networks of interacting units. Conventional fabrication methods for electronic computing devices are subject to known scaling limits, confining the diversity of possible architectures. This work explores methods of fabricating a self-organized complex device known as an atomic switch network and discusses its potential utility in computing. Through a merger of top-down and bottom-up techniques guided by mathematical and nanoarchitectonic design principles, we have produced functional devices comprising nanoscale elements whose intrinsic nonlinear dynamics and memorization capabilities produce robust patterns of distributed activity and a capacity for nonlinear transformation of input signals when configured in the appropriate network architecture. Their operational characteristics represent a unique potential for hardware implementation of natural computation, specifically in the area of reservoir computing—a burgeoning field that investigates the computational aptitude of complex biologically inspired systems.

  13. Atomic switch networks-nanoarchitectonic design of a complex system for natural computing.

    PubMed

    Demis, E C; Aguilera, R; Sillin, H O; Scharnhorst, K; Sandouk, E J; Aono, M; Stieg, A Z; Gimzewski, J K

    2015-05-22

    Self-organized complex systems are ubiquitous in nature, and the structural complexity of these natural systems can be used as a model to design new classes of functional nanotechnology based on highly interconnected networks of interacting units. Conventional fabrication methods for electronic computing devices are subject to known scaling limits, confining the diversity of possible architectures. This work explores methods of fabricating a self-organized complex device known as an atomic switch network and discusses its potential utility in computing. Through a merger of top-down and bottom-up techniques guided by mathematical and nanoarchitectonic design principles, we have produced functional devices comprising nanoscale elements whose intrinsic nonlinear dynamics and memorization capabilities produce robust patterns of distributed activity and a capacity for nonlinear transformation of input signals when configured in the appropriate network architecture. Their operational characteristics represent a unique potential for hardware implementation of natural computation, specifically in the area of reservoir computing-a burgeoning field that investigates the computational aptitude of complex biologically inspired systems.

  14. Dynamics of a distributed drill string system: Characteristic parameters and stability maps

    NASA Astrophysics Data System (ADS)

    Aarsnes, Ulf Jakob F.; van de Wouw, Nathan

    2018-03-01

    This paper involves the dynamic (stability) analysis of distributed drill-string systems. A minimal set of parameters characterizing the linearized, axial-torsional dynamics of a distributed drill string coupled through the bit-rock interaction is derived. This is found to correspond to five parameters for a simple drill string and eight parameters for a two-sectioned drill-string (e.g., corresponding to the pipe and collar sections of a drilling system). These dynamic characterizations are used to plot the inverse gain margin of the system, parametrized in the non-dimensional parameters, effectively creating a stability map covering the full range of realistic physical parameters. This analysis reveals a complex spectrum of dynamics not evident in stability analysis with lumped models, thus indicating the importance of analysis using distributed models. Moreover, it reveals trends concerning stability properties depending on key system parameters useful in the context of system and control design aiming at the mitigation of vibrations.

  15. Representing distributed cognition in complex systems: how a submarine returns to periscope depth.

    PubMed

    Stanton, Neville A

    2014-01-01

    This paper presents the Event Analysis of Systemic Teamwork (EAST) method as a means of modelling distributed cognition in systems. The method comprises three network models (i.e. task, social and information) and their combination. This method was applied to the interactions between the sound room and control room in a submarine, following the activities of returning the submarine to periscope depth. This paper demonstrates three main developments in EAST. First, building the network models directly, without reference to the intervening methods. Second, the application of analysis metrics to all three networks. Third, the combination of the aforementioned networks in different ways to gain a broader understanding of the distributed cognition. Analyses have shown that EAST can be used to gain both qualitative and quantitative insights into distributed cognition. Future research should focus on the analyses of network resilience and modelling alternative versions of a system.

  16. Managing Complexity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chassin, David P.; Posse, Christian; Malard, Joel M.

    2004-08-01

    Physical analogs have shown considerable promise for understanding the behavior of complex adaptive systems, including macroeconomics, biological systems, social networks, and electric power markets. Many of today's most challenging technical and policy questions can be reduced to a distributed economic control problem. Indeed, economically based control of large-scale systems is founded on the conjecture that price-based regulation (e.g., auctions, markets) results in an optimal allocation of resources and emergent optimal system control. This paper explores the state of the art in the use of physical analogs for understanding the behavior of some econophysical systems and for deriving stable and robust control strategies for them. In particular, we review and discuss applications of some analytic methods based on the thermodynamic metaphor, according to which the interplay between system entropy and conservation laws gives rise to intuitive and governing global properties of complex systems that cannot otherwise be understood.

  17. Non-Maxwellian and magnetic field effects in complex plasma wakes★

    NASA Astrophysics Data System (ADS)

    Ludwig, Patrick; Jung, Hendrik; Kählert, Hanno; Joost, Jan-Philip; Greiner, Franko; Moldabekov, Zhandos; Carstensen, Jan; Sundar, Sita; Bonitz, Michael; Piel, Alexander

    2018-05-01

    In a streaming plasma, negatively charged dust particles create complex charge distributions on the downstream side of the particle, which are responsible for attractive forces between the like-charged particles. This wake phenomenon is studied by means of refined linear response theory and molecular dynamics simulations as well as in experiments. Particular attention is paid to non-Maxwellian velocity distributions that are found in the plasma sheath and to situations with strong magnetic fields, which are becoming increasingly important. Non-Maxwellian distributions and strong magnetic fields result in a substantial damping of the oscillatory wake potential. The interaction force in particle pairs is explored with the phase-resolved resonance method, which demonstrates the non-reciprocity of the interparticle forces in unmagnetized and magnetized systems.

  18. Methods of Information Geometry to model complex shapes

    NASA Astrophysics Data System (ADS)

    De Sanctis, A.; Gattone, S. A.

    2016-09-01

    In this paper, a new statistical method to model patterns emerging in complex systems is proposed. A framework for shape analysis of 2-dimensional landmark data is introduced, in which each landmark is represented by a bivariate Gaussian distribution. From Information Geometry we know that the Fisher-Rao metric endows the statistical manifold of parameters of a family of probability distributions with a Riemannian metric. This approach thus allows one to reconstruct the intermediate steps in the evolution between observed shapes by computing the geodesic, with respect to the Fisher-Rao metric, between the corresponding distributions. Furthermore, the geodesic path can be used for shape predictions. As an application, we study the evolution of the rat skull shape. A future application in Ophthalmology is introduced.
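
    The geodesic construction above has a simple closed form in the univariate case. The sketch below is illustrative only: the paper treats bivariate landmark distributions, whereas this code assumes univariate Gaussians and the standard identification of their parameter manifold with the hyperbolic half-plane (the function names are invented).

```python
import math

def fisher_rao_distance(mu1, sigma1, mu2, sigma2):
    """Fisher-Rao distance between two univariate normal distributions.

    Uses the standard isometry (mu, sigma) -> (mu/sqrt(2), sigma) onto the
    Poincare half-plane, where geodesic distance has a closed form.
    """
    x1, y1 = mu1 / math.sqrt(2), sigma1
    x2, y2 = mu2 / math.sqrt(2), sigma2
    # Hyperbolic distance in the upper half-plane model.
    arg = 1 + ((x2 - x1) ** 2 + (y2 - y1) ** 2) / (2 * y1 * y2)
    return math.sqrt(2) * math.acosh(arg)

def geodesic_midpoint_sigma(s1, s2):
    # For equal means the geodesic stays on the sigma-axis and the
    # midpoint standard deviation is the geometric mean of the endpoints.
    return math.sqrt(s1 * s2)

# Interpolating between observed shapes would apply such a geodesic
# landmark-by-landmark (the bivariate case is omitted for brevity).
print(fisher_rao_distance(0.0, 1.0, 1.0, 1.0))   # mean shift only
print(fisher_rao_distance(0.0, 1.0, 0.0, 2.0))   # scale change only
```

    A useful sanity check of the formula: for equal means the distance reduces to sqrt(2)·|ln(sigma2/sigma1)|, so pure rescalings of a landmark's uncertainty accumulate logarithmically along the geodesic.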

  19. Improving the analysis, storage and sharing of neuroimaging data using relational databases and distributed computing.

    PubMed

    Hasson, Uri; Skipper, Jeremy I; Wilde, Michael J; Nusbaum, Howard C; Small, Steven L

    2008-01-15

    The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data.
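
    The "queries as an integral part of data analysis" idea can be sketched with a toy relational schema. The table and column names below are invented for illustration (they are not the schema described in the paper), and SQLite stands in for the open-source database systems the authors mention.

```python
import sqlite3

# Hypothetical minimal schema: time-series samples live in relational
# tables, so an analysis step becomes a declarative query rather than
# ad hoc parsing of binary or text files.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE subject (id INTEGER PRIMARY KEY, label TEXT);
CREATE TABLE timeseries (
    subject_id INTEGER REFERENCES subject(id),
    roi TEXT, t INTEGER, bold REAL
);
""")
conn.executemany("INSERT INTO subject VALUES (?, ?)",
                 [(1, "sub-01"), (2, "sub-02")])
rows = [(s, roi, t, 0.1 * t + s) for s in (1, 2)
        for roi in ("STG", "IFG") for t in range(5)]
conn.executemany("INSERT INTO timeseries VALUES (?, ?, ?, ?)", rows)

# Mean BOLD signal per subject and region -- the kind of aggregation
# that would otherwise require a custom file-crunching script.
query = """
SELECT s.label, ts.roi, AVG(ts.bold) AS mean_bold
FROM timeseries ts JOIN subject s ON s.id = ts.subject_id
GROUP BY s.label, ts.roi ORDER BY s.label, ts.roi
"""
for label, roi, mean_bold in conn.execute(query):
    print(label, roi, round(mean_bold, 3))
```

    In the distributed setting the paper describes, independent workers can issue such queries concurrently against the shared database instead of copying files, which is what makes cluster and Grid execution of the analysis practical.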

  20. Data-driven planning of distributed energy resources amidst socio-technical complexities

    NASA Astrophysics Data System (ADS)

    Jain, Rishee K.; Qin, Junjie; Rajagopal, Ram

    2017-08-01

    New distributed energy resources (DER) are rapidly replacing centralized power generation due to their environmental, economic and resiliency benefits. Previous analyses of DER systems have been limited in their ability to account for socio-technical complexities, such as intermittent supply, heterogeneous demand and balance-of-system cost dynamics. Here we develop ReMatch, an interdisciplinary modelling framework, spanning engineering, consumer behaviour and data science, and apply it to 10,000 consumers in California, USA. Our results show that deploying DER would yield nearly a 50% reduction in the levelized cost of electricity (LCOE) over the status quo even after accounting for socio-technical complexities. We abstract a detailed matching of consumers to DER infrastructure from our results and discuss how this matching can facilitate the development of smart and targeted renewable energy policies, programmes and incentives. Our findings point to the large-scale economic and technical feasibility of DER and underscore the pertinent role DER can play in achieving sustainable energy goals.

  1. Spectral statistics and scattering resonances of complex primes arrays

    NASA Astrophysics Data System (ADS)

    Wang, Ren; Pinheiro, Felipe A.; Dal Negro, Luca

    2018-01-01

    We introduce a class of aperiodic arrays of electric dipoles generated from the distribution of prime numbers in complex quadratic fields (Eisenstein and Gaussian primes) as well as quaternion primes (Hurwitz and Lifschitz primes), and study the nature of their scattering resonances using the vectorial Green's matrix method. In these systems we demonstrate several distinctive spectral properties, such as the absence of level repulsion in the strongly scattering regime, critical statistics of level spacings, and the existence of critical modes, which are extended fractal modes with long lifetimes not supported by either random or periodic systems. Moreover, we show that one can predict important physical properties, such as the existence of spectral gaps, by analyzing the eigenvalue distribution of the Green's matrix of the arrays in the complex plane. Our results unveil the importance of aperiodic correlations in prime number arrays for the engineering of gapped photonic media that support far richer mode localization and spectral properties compared to usual periodic and random media.

  2. Improving the Analysis, Storage and Sharing of Neuroimaging Data using Relational Databases and Distributed Computing

    PubMed Central

    Hasson, Uri; Skipper, Jeremy I.; Wilde, Michael J.; Nusbaum, Howard C.; Small, Steven L.

    2007-01-01

    The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data. PMID:17964812

  3. A trade-off between local and distributed information processing associated with remote episodic versus semantic memory.

    PubMed

    Heisz, Jennifer J; Vakorin, Vasily; Ross, Bernhard; Levine, Brian; McIntosh, Anthony R

    2014-01-01

    Episodic memory and semantic memory produce very different subjective experiences yet rely on overlapping networks of brain regions for processing. Traditional approaches for characterizing functional brain networks emphasize static states of function and thus are blind to the dynamic information processing within and across brain regions. This study used information theoretic measures of entropy to quantify changes in the complexity of the brain's response as measured by magnetoencephalography while participants listened to audio recordings describing past personal episodic and general semantic events. Personal episodic recordings evoked richer subjective mnemonic experiences and more complex brain responses than general semantic recordings. Critically, we observed a trade-off between the relative contribution of local versus distributed entropy, such that personal episodic recordings produced relatively more local entropy whereas general semantic recordings produced relatively more distributed entropy. Changes in the relative contributions of local and distributed entropy to the total complexity of the system provides a potential mechanism that allows the same network of brain regions to represent cognitive information as either specific episodes or more general semantic knowledge.

  4. Extended q-Gaussian and q-exponential distributions from gamma random variables

    NASA Astrophysics Data System (ADS)

    Budini, Adrián A.

    2015-05-01

    The family of q-Gaussian and q-exponential probability densities fit the statistical behavior of diverse complex self-similar nonequilibrium systems. These distributions, independently of the underlying dynamics, can rigorously be obtained by maximizing Tsallis "nonextensive" entropy under appropriate constraints, as well as from superstatistical models. In this paper we provide an alternative and complementary scheme for deriving these objects. We show that q-Gaussian and q-exponential random variables can always be expressed as a function of two statistically independent gamma random variables with the same scale parameter. Their shape index determines the complexity q parameter. This result also allows us to define an extended family of asymmetric q-Gaussian and modified q-exponential densities, which reduce to the standard ones when the shape parameters are the same. Furthermore, we demonstrate that a simple change of variables always allows relating any of these distributions with a beta stochastic variable. The extended distributions are applied in the statistical description of different complex dynamics such as log-return signals in financial markets and motion of point defects in a fluid flow.
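
    The gamma construction lends itself to a quick Monte Carlo check. The sketch below uses one well-known member of this family, assuming the ratio form X = G1/G2 with unit shape in the numerator, which gives a q-exponential-type (beta-prime) law with survival function (1 + x)^(-b); this is an illustration of the idea, not the paper's full construction.

```python
import random

random.seed(42)

def q_exponential_sample(b, theta=1.0):
    """Draw X = G1/G2 with G1 ~ Gamma(1, theta), G2 ~ Gamma(b, theta).

    Illustrative assumption: the common scale theta cancels in the ratio,
    leaving a heavy-tailed beta-prime law with survival (1 + x)**(-b),
    i.e. a q-exponential with q = 1 + 1/(1 + b) up to rescaling.
    """
    g1 = random.gammavariate(1.0, theta)   # an exponential variate
    g2 = random.gammavariate(b, theta)
    return g1 / g2

b = 3.0
n = 200_000
xs = [q_exponential_sample(b) for _ in range(n)]

# Compare the empirical survival function with (1 + x)^(-b).
for x0 in (0.5, 1.0, 2.0):
    emp = sum(x > x0 for x in xs) / n
    theo = (1.0 + x0) ** (-b)
    print(f"x={x0}: empirical {emp:.4f}  theoretical {theo:.4f}")
```

    Note the shape index b of the denominator gamma alone controls the tail exponent, matching the abstract's statement that the shape index determines the complexity parameter q.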

  5. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME

    PubMed Central

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2017-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments. PMID:28190948

  6. Experiment Design for Complex VTOL Aircraft with Distributed Propulsion and Tilt Wing

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Landman, Drew

    2015-01-01

    Selected experimental results from a wind tunnel study of a subscale VTOL concept with distributed propulsion and tilt lifting surfaces are presented. The vehicle complexity and automated test facility were ideal for use with a randomized designed experiment. Design of Experiments and Response Surface Methods were invoked to produce run efficient, statistically rigorous regression models with minimized prediction error. Static tests were conducted at the NASA Langley 12-Foot Low-Speed Tunnel to model all six aerodynamic coefficients over a large flight envelope. This work supports investigations at NASA Langley in developing advanced configurations, simulations, and advanced control systems.

  7. An Integration of a GIS with Peatland Management

    NASA Technical Reports Server (NTRS)

    Hoshal, J. C.; Johnson, R. L.

    1982-01-01

    The complexities of peatland management in Minnesota and the use of a geographic information system, the Minnesota Land Management Information System (MLMIS), in the management process are examined. General information on the nature of peat and its quantity and distribution in Minnesota is also presented.

  8. Digital Libraries: The Next Generation in File System Technology.

    ERIC Educational Resources Information Center

    Bowman, Mic; Camargo, Bill

    1998-01-01

    Examines file sharing within corporations that use wide-area, distributed file systems. Applications and user interactions strongly suggest that the addition of services typically associated with digital libraries (content-based file location, strongly typed objects, representation of complex relationships between documents, and extrinsic…

  9. A meteorological distribution system for high-resolution terrestrial modeling (MicroMet)

    Treesearch

    Glen E. Liston; Kelly Elder

    2006-01-01

    An intermediate-complexity, quasi-physically based, meteorological model (MicroMet) has been developed to produce high-resolution (e.g., 30-m to 1-km horizontal grid increment) atmospheric forcings required to run spatially distributed terrestrial models over a wide variety of landscapes. The following eight variables, required to run most terrestrial models, are...

  10. Playable Serious Games for Studying and Programming Computational STEM and Informatics Applications of Distributed and Parallel Computer Architectures

    ERIC Educational Resources Information Center

    Amenyo, John-Thones

    2012-01-01

    Carefully engineered playable games can serve as vehicles for students and practitioners to learn and explore the programming of advanced computer architectures to execute applications, such as high performance computing (HPC) and complex, inter-networked, distributed systems. The article presents families of playable games that are grounded in…

  11. Controls on meadow distribution and characteristics [chapter 2]

    Treesearch

    Dru Germanoski; Jerry R. Miller; Mark L. Lord

    2011-01-01

    Meadow complexes are located in distinct geomorphic and hydrologic settings that allow groundwater to be at or near the ground surface during at least part of the year. Meadows are manifestations of the subsurface flow system, and their distribution is controlled by factors that cause localized zones of groundwater discharge. Knowledge of the factors that serve as...

  12. 117. VIEW OF CABINETS ON EAST SIDE OF LANDLINE INSTRUMENTATION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    117. VIEW OF CABINETS ON EAST SIDE OF LANDLINE INSTRUMENTATION ROOM (206), LSB (BLDG. 751). FEATURES LEFT TO RIGHT: ALTERNATING CURRENT POWER DISTRIBUTION RELAY BOX, AIRBORNE BEACON ELECTRONIC TEST SYSTEM (ABETS), AUTOPILOT CHECKOUT CONTROLS, POWER DISTRIBUTION UNITS. - Vandenberg Air Force Base, Space Launch Complex 3, Launch Pad 3 East, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

  13. A superstatistical model of metastasis and cancer survival

    NASA Astrophysics Data System (ADS)

    Leon Chen, L.; Beck, Christian

    2008-05-01

    We introduce a superstatistical model for the progression statistics of malignant cancer cells. The metastatic cascade is modeled as a complex nonequilibrium system with several macroscopic pathways and inverse-chi-square distributed parameters of the underlying Poisson processes. The predictions of the model are in excellent agreement with observed survival-time probability distributions of breast cancer patients.

  14. Northeast Artificial Intelligence Consortium Annual Report. Volume 6. 1988 Building an Intelligent Assistant: The Acquisition, Integration, and Maintenance of Complex Distributed Tasks

    DTIC Science & Technology

    1989-10-01

    of.expertise Seymour. Wright (or artificial.intelligence distributed.ai planning robotics computer.vision))." Implementation: (replace-values-in-constraint...by mechanical partners or advisors that customize the system's response to the idiosyncrasies of the student. This paper describes the initial

  15. A Distributed Architecture for Tsunami Early Warning and Collaborative Decision-support in Crises

    NASA Astrophysics Data System (ADS)

    Moßgraber, J.; Middleton, S.; Hammitzsch, M.; Poslad, S.

    2012-04-01

    The presentation will describe work on the system architecture that is being developed in the EU FP7 project TRIDEC on "Collaborative, Complex and Critical Decision-Support in Evolving Crises". The challenges for a Tsunami Early Warning System (TEWS) are manifold and the success of a system depends crucially on the system's architecture. A modern warning system following a system-of-systems approach has to integrate various components and sub-systems such as different information sources, services and simulation systems. Furthermore, it has to take into account the distributed and collaborative nature of warning systems. In order to create an architecture that supports the whole spectrum of a modern, distributed and collaborative warning system one must deal with multiple challenges. Obviously, one cannot expect to tackle these challenges adequately with a monolithic system or with a single technology. Therefore, a system architecture providing the blueprints to implement the system-of-systems approach has to combine multiple technologies and architectural styles. At the bottom layer it has to reliably integrate a large set of conventional sensors, such as seismic sensors and sensor networks, buoys and tide gauges, and also innovative and unconventional sensors, such as streams of messages from social media services. At the top layer it has to support collaboration on high-level decision processes and facilitate information sharing between organizations. In between, the system has to process all data and integrate information on a semantic level in a timely manner. This complex communication follows an event-driven mechanism allowing events to be published, detected and consumed by various applications within the architecture. Therefore, at the upper layer the event-driven architecture (EDA) aspects are combined with principles of service-oriented architectures (SOA) using standards for communication and data exchange. 
The most prominent challenges on this layer include providing a framework for information integration on a syntactic and semantic level, leveraging distributed processing resources for a scalable data processing platform, and automating data processing and decision support workflows.

  16. Investigation of interaction between the Pt(II) ions and aminosilane-modified silica surface in heterogeneous system

    NASA Astrophysics Data System (ADS)

    Nowicki, Waldemar; Gąsowska, Anna; Kirszensztejn, Piotr

    2016-05-01

    UV-vis spectroscopy measurements confirmed the reaction in a heterogeneous system between Pt(II) ions and an ethylenediamine-type ligand, N-(2-aminoethyl)-3-aminopropyltrimethoxysilane, immobilized at the silica surface. The formation of complexes is a consequence of the interaction between the amine groups of the ligand grafted onto SiO2 and platinum ions. A potentiometric titration technique was used to determine the stability constants of complexes of Pt(II) with the immobilized insoluble ligand (SG-L) on the silica gel. The results show the formation of three surface complexes of the same type (PtHSG-L, Pt(HSG-L)2, PtSG-L) with the SG-L ligand over a wide range of pH for different Debye lengths. The concentration distribution of the complexes in the heterogeneous system is evaluated.

  17. Comparative phyloinformatics of virus genes at micro and macro levels in a distributed computing environment.

    PubMed

    Singh, Dadabhai T; Trehan, Rahul; Schmidt, Bertil; Bretschneider, Timo

    2008-01-01

    Preparedness for a possible global pandemic caused by viruses such as the highly pathogenic influenza A subtype H5N1 has become a global priority. In particular, it is critical to monitor the appearance of any new emerging subtypes. Comparative phyloinformatics can be used to monitor, analyze, and possibly predict the evolution of viruses. However, in order to utilize the full functionality of available analysis packages for large-scale phyloinformatics studies, a team of computer scientists, biostatisticians and virologists is needed, a requirement which cannot be fulfilled in many cases. Furthermore, the time complexities of many of the algorithms involved lead to prohibitive runtimes on sequential computer platforms. This has so far hindered the use of comparative phyloinformatics as a commonly applied tool in this area. In this paper the graphically oriented workflow design system called Quascade and its efficient usage for comparative phyloinformatics are presented. In particular, we focus on how this task can be effectively performed in a distributed computing environment. As a proof of concept, the designed workflows are used for the phylogenetic analysis of the neuraminidase of H5N1 isolates (micro level) and of influenza viruses (macro level). The results of this paper are hence twofold. Firstly, this paper demonstrates the usefulness of a graphical user interface system to design and execute complex distributed workflows for large-scale phyloinformatics studies of virus genes. Secondly, the analysis of neuraminidase on different levels of complexity provides valuable insights into this virus's tendency toward geography-based clustering in the phylogenetic tree and also shows the importance of glycan sites in its molecular evolution. The current study demonstrates the efficiency and utility of workflow systems in providing a biologist-friendly approach to complex biological dataset analysis using high performance computing. 
In particular, the utility of the platform Quascade for deploying distributed and parallelized versions of a variety of computationally intensive phylogenetic algorithms has been shown. Secondly, the analysis of the utilized H5N1 neuraminidase datasets at macro and micro levels has clearly indicated a pattern of spatial clustering of the H5N1 viral isolates based on geographical distribution rather than temporal or host range based clustering.

  18. Distributed state-space generation of discrete-state stochastic models

    NASA Technical Reports Server (NTRS)

    Ciardo, Gianfranco; Gluckman, Joshua; Nicol, David

    1995-01-01

    High-level formalisms such as stochastic Petri nets can be used to model complex systems. Analysis of logical and numerical properties of these models of ten requires the generation and storage of the entire underlying state space. This imposes practical limitations on the types of systems which can be modeled. Because of the vast amount of memory consumed, we investigate distributed algorithms for the generation of state space graphs. The distributed construction allows us to take advantage of the combined memory readily available on a network of workstations. The key technical problem is to find effective methods for on-the-fly partitioning, so that the state space is evenly distributed among processors. In this paper we report on the implementation of a distributed state-space generator that may be linked to a number of existing system modeling tools. We discuss partitioning strategies in the context of Petri net models, and report on performance observed on a network of workstations, as well as on a distributed memory multi-computer.
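
    The key partitioning idea (assign each newly discovered state to a fixed owner by hashing its canonical encoding, so ownership needs no global coordination) can be sketched sequentially. The toy transition system and all names below are invented for illustration; the paper's tools partition Petri-net state spaces across networked workstations.

```python
import hashlib
from collections import deque

N_WORKERS = 4

def owner(state):
    """Assign each state to a worker by hashing its canonical encoding."""
    digest = hashlib.sha256(repr(state).encode()).digest()
    return digest[0] % N_WORKERS

def successors(state):
    """Toy model: two counters, each incrementable modulo 16."""
    a, b = state
    return [((a + 1) % 16, b), (a, (b + 1) % 16)]

# Sequential simulation of the distributed exploration: each worker owns
# a frontier queue; discovering a remotely owned state "sends" it to its
# owner instead of expanding it locally.
frontiers = [deque() for _ in range(N_WORKERS)]
initial = (0, 0)
frontiers[owner(initial)].append(initial)
seen = {initial}
while any(frontiers):
    for w in range(N_WORKERS):
        while frontiers[w]:
            state = frontiers[w].popleft()
            for nxt in successors(state):
                if nxt not in seen:
                    seen.add(nxt)
                    frontiers[owner(nxt)].append(nxt)  # cross-worker send

counts = [sum(1 for s in seen if owner(s) == w) for w in range(N_WORKERS)]
print("states:", len(seen), "per-worker load:", counts)
```

    A uniform hash gives good static balance on this toy model; the harder problem the paper addresses is doing this on the fly so that memory stays balanced while minimizing cross-processor traffic.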

  19. Architecture, Voltage, and Components for a Turboelectric Distributed Propulsion Electric Grid

    NASA Technical Reports Server (NTRS)

    Armstrong, Michael J.; Blackwelder, Mark; Bollman, Andrew; Ross, Christine; Campbell, Angela; Jones, Catherine; Norman, Patrick

    2015-01-01

    The development of a wholly superconducting turboelectric distributed propulsion system presents unique opportunities for the aerospace industry. However, this transition from normally conducting systems to superconducting systems significantly increases the equipment complexity necessary to manage the electrical power systems. Due to the low technology readiness level (TRL) nature of all components and systems, current Turboelectric Distributed Propulsion (TeDP) technology developments are driven by an ambiguous set of system-level electrical integration standards for an airborne microgrid system (Figure 1). While multiple decades' worth of advancements are still required for concept realization, current system-level studies are necessary to focus the technology development, target specific technological shortcomings, and enable accurate prediction of concept feasibility and viability. An understanding of the performance sensitivity to operating voltages and an early definition of advantageous voltage regulation standards for unconventional airborne microgrids will allow for more accurate targeting of technology development. Propulsive power-rated microgrid systems necessitate the introduction of new aircraft distribution system voltage standards. All protection, distribution, control, power conversion, generation, and cryocooling equipment are affected by voltage regulation standards. Information on the desired operating voltage and voltage regulation is required to determine nominal and maximum currents for sizing distribution and fault isolation equipment, developing machine topologies and machine controls, and the physical attributes of all component shielding and insulation. Voltage impacts many components and system performance.

  20. Empirical tests of Zipf's law mechanism in open source Linux distribution.

    PubMed

    Maillart, T; Sornette, D; Spaeth, S; von Krogh, G

    2008-11-21

    Zipf's power law is a ubiquitous empirical regularity found in many systems, thought to result from proportional growth. Here, we establish empirically the usually assumed ingredients of stochastic growth models that have been previously conjectured to be at the origin of Zipf's law. We use exceptionally detailed data on the evolution of open source software projects in Linux distributions, which offer a remarkable example of a growing complex self-organizing adaptive system, exhibiting Zipf's law over four full decades.
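
    The proportional-growth mechanism the authors test can be sketched with a Simon-style simulation; the parameters and the rank window used for the fit below are illustrative choices, not values from the paper.

```python
import math
import random

random.seed(7)

# Minimal proportional-growth (Simon) model: each arriving unit starts a
# new component with probability alpha, otherwise joins an existing
# component chosen with probability proportional to its current size.
alpha, steps = 0.05, 50_000
sizes = [1]
total = 1
for _ in range(steps):
    if random.random() < alpha:
        sizes.append(1)
    else:
        # Size-proportional choice: pick a unit uniformly at random; the
        # component containing it is chosen with probability size/total.
        r = random.randrange(total)
        i = 0
        while r >= sizes[i]:
            r -= sizes[i]
            i += 1
        sizes[i] += 1
    total += 1

sizes.sort(reverse=True)
# Least-squares fit of log(size) ~ -mu * log(rank) over the top ranks,
# i.e. the slope of the Zipf rank-size plot.
xs = [math.log(r) for r in range(1, 51)]
ys = [math.log(sizes[r - 1]) for r in range(1, 51)]
k = len(xs)
mx, my = sum(xs) / k, sum(ys) / k
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
print(f"components: {len(sizes)}, fitted Zipf exponent: {-slope:.2f}")
```

    Proportional growth of this kind produces an approximately straight rank-size line on log-log axes, which is the signature the authors test empirically against Linux package data.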

  1. Wigner distribution function of Hermite-cosine-Gaussian beams through an apertured optical system.

    PubMed

    Sun, Dong; Zhao, Daomu

    2005-08-01

    By introducing the hard-aperture function into a finite sum of complex Gaussian functions, the approximate analytical expressions of the Wigner distribution function for Hermite-cosine-Gaussian beams passing through an apertured paraxial ABCD optical system are obtained. The analytical results are compared with the numerically integrated ones, and the absolute errors are also given. It is shown that the analytical results are proper and that the calculation speed for them is much faster than for the numerical results.

  2. Distributed Computer Networks in Support of Complex Group Practices

    PubMed Central

    Wess, Bernard P.

    1978-01-01

    The economics of medical computer networks are presented in context with the patient care and administrative goals of medical networks. Design alternatives and network topologies are discussed with an emphasis on medical network design requirements in distributed data base design, telecommunications, satellite systems, and software engineering. The success of the medical computer networking technology is predicated on the ability of medical and data processing professionals to design comprehensive, efficient, and virtually impenetrable security systems to protect data bases, network access and services, and patient confidentiality.

  3. A Theoretical Solid Oxide Fuel Cell Model for Systems Controls and Stability Design

    NASA Technical Reports Server (NTRS)

    Kopasakis, George; Brinson, Thomas; Credle, Sydni

    2008-01-01

    As the aviation industry moves toward higher efficiency electrical power generation, all-electric aircraft, or zero-emission and quieter aircraft, fuel cells are sought as the technology that can deliver on these high expectations. The hybrid solid oxide fuel cell system combines the fuel cell with a micro-turbine to obtain up to 70% cycle efficiency and then distributes the electrical power to the loads via a power distribution system. The challenge is to understand the dynamics of this complex multidiscipline system and then to design distributed controls that take the system through its operating conditions in a stable and safe manner while maintaining system performance. This particular system is a power generation and distribution system, and the fuel cell and micro-turbine model fidelity should be compatible with the dynamics of the power distribution system in order to allow proper stability and distributed controls design. The novelty of this paper is twofold. First, the case is made for why a high-fidelity fuel cell model is needed for systems control and stability designs. Second, a novel modeling approach is proposed for the fuel cell that allows the fuel cell and the power system to be integrated and designed for stability, distributed controls, and other interface specifications. This investigation shows that for the fuel cell, not only should the voltage characteristic be modeled, but conservation equation dynamics, ion diffusion, charge transfer kinetics, and the inherent impedance of the electron flow should also be included.
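
    The loss mechanisms named at the end of the abstract enter the fuel cell's steady-state voltage characteristic. A hedged sketch of a generic SOFC polarization curve follows; all parameter values are assumed round numbers for illustration, not taken from the paper's model, and a dynamic model would add the conservation-equation and diffusion dynamics the abstract calls for.

```python
import numpy as np

def cell_voltage(i, E0=1.0, r_ohm=0.15, i0=0.2, i_lim=2.0, T=1073.0):
    """Generic SOFC polarization curve: open-circuit voltage minus ohmic,
    activation (simplified Butler-Volmer) and concentration losses.
    i is current density in A/cm^2; all parameters are illustrative."""
    F, R = 96485.0, 8.314                                # Faraday and gas constants
    eta_act = (R * T / F) * np.arcsinh(i / (2.0 * i0))   # activation loss
    eta_conc = -(R * T / (2.0 * F)) * np.log(1.0 - i / i_lim)  # concentration loss
    return E0 - i * r_ohm - eta_act - eta_conc

# voltage falls monotonically with load current
print(np.round(cell_voltage(np.array([0.0, 0.5, 1.0])), 3))
```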

  4. Exact density-potential pairs from complex-shifted axisymmetric systems

    NASA Astrophysics Data System (ADS)

    Ciotti, Luca; Marinacci, Federico

    2008-07-01

    In a previous paper, the complex-shift method has been applied to self-gravitating spherical systems, producing new analytical axisymmetric density-potential pairs. We now extend the treatment to the Miyamoto-Nagai disc and the Binney logarithmic halo, and we study the resulting axisymmetric and triaxial analytical density-potential pairs; we also show how to obtain the surface density of shifted systems from the complex shift of the surface density of the parent model. In particular, the systems obtained from Miyamoto-Nagai discs can be used to describe disc galaxies with a peanut-shaped bulge or with a central triaxial bar, depending on the direction of the shift vector. By using a constructive method that can be applied to generic axisymmetric systems, we finally show that the Miyamoto-Nagai and the Satoh discs, and the Binney logarithmic halo cannot be obtained from the complex shift of any spherical parent distribution. As a by-product of this study, we also found two new generating functions in closed form for even and odd Legendre polynomials, respectively.

  5. Modeling Stochastic Complexity in Complex Adaptive Systems: Non-Kolmogorov Probability and the Process Algebra Approach.

    PubMed

    Sulis, William H

    2017-10-01

    Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear, correlation-based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of mean and/or variance and conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.

  6. An Actor-Network Theory Analysis of Policy Innovation for Smoke-Free Places: Understanding Change in Complex Systems

    PubMed Central

    Young, David; Borland, Ron; Coghill, Ken

    2010-01-01

    Complex, transnational issues like the tobacco epidemic are major challenges that defy analysis and management by conventional methods, as are other public health issues, such as those associated with global food distribution and climate change. We examined the evolution of indoor smoke-free regulations, a tobacco control policy innovation, and identified the key attributes of those jurisdictions that successfully pursued this innovation and those that to date have not. In doing so, we employed the actor-network theory, a comprehensive framework for the analysis of fundamental system change. Through our analysis, we identified approaches to help overcome some systemic barriers to the solution of the tobacco problem and comment on other complex transnational problems. PMID:20466949

  7. An actor-network theory analysis of policy innovation for smoke-free places: understanding change in complex systems.

    PubMed

    Young, David; Borland, Ron; Coghill, Ken

    2010-07-01

    Complex, transnational issues like the tobacco epidemic are major challenges that defy analysis and management by conventional methods, as are other public health issues, such as those associated with global food distribution and climate change. We examined the evolution of indoor smoke-free regulations, a tobacco control policy innovation, and identified the key attributes of those jurisdictions that successfully pursued this innovation and those that to date have not. In doing so, we employed the actor-network theory, a comprehensive framework for the analysis of fundamental system change. Through our analysis, we identified approaches to help overcome some systemic barriers to the solution of the tobacco problem and comment on other complex transnational problems.

  8. Measurement of Strain Distributions in Mouse Femora with 3D-Digital Speckle Pattern Interferometry

    PubMed Central

    Yang, Lianxiang; Zhang, Ping; Liu, Sheng; Samala, Praveen R; Su, Min; Yokota, Hiroki

    2007-01-01

    Bone is a mechanosensitive tissue that adapts its mass, architecture and mechanical properties to external loading. Appropriate mechanical loads offer an effective means to stimulate bone remodeling and prevent bone loss. A role of in situ strain in bone is considered essential in enhancement of bone formation, and establishing a quantitative relationship between 3D strain distributions and a rate of local bone formation is important. Digital speckle pattern interferometry (DSPI) can achieve whole-field, non-contacting measurements of microscopic deformation for high-resolution determination of 3D strain distributions. However, the current system does not allow us to derive accurate strain distributions because of complex surface contours inherent to biological samples. Through development of a custom-made piezoelectric loading device as well as a new DSPI-based force calibration system, we built an advanced DSPI system and integrated local contour information to deformation data. Using a mouse femur in response to a knee loading modality as a model system, we determined 3D strain distributions and discussed effectiveness and limitations of the described system. PMID:18670581

  9. Analyzing SystemC Designs: SystemC Analysis Approaches for Varying Applications

    PubMed Central

    Stoppe, Jannis; Drechsler, Rolf

    2015-01-01

    The complexity of hardware designs is still increasing according to Moore's law. With embedded systems being more and more intertwined and working together not only with each other, but also with their environments as cyber physical systems (CPSs), more streamlined development workflows are employed to handle the increasing complexity during a system's design phase. SystemC is a C++ library for the design of hardware/software systems, enabling the designer to quickly prototype, e.g., a distributed CPS without having to decide about particular implementation details (such as whether to implement a feature in hardware or in software) early in the design process. Thereby, this approach reduces the initial implementation's complexity by offering an abstract layer with which to build a working prototype. However, as SystemC is based on C++, analyzing designs becomes a difficult task due to the complex language features that are available to the designer. Several fundamentally different approaches for analyzing SystemC designs have been suggested. This work illustrates several different SystemC analysis approaches, including their specific advantages and shortcomings, allowing designers to pick the right tools to assist them with a specific problem during the design of a system using SystemC. PMID:25946632

  10. Analyzing SystemC Designs: SystemC Analysis Approaches for Varying Applications.

    PubMed

    Stoppe, Jannis; Drechsler, Rolf

    2015-05-04

    The complexity of hardware designs is still increasing according to Moore's law. With embedded systems being more and more intertwined and working together not only with each other, but also with their environments as cyber physical systems (CPSs), more streamlined development workflows are employed to handle the increasing complexity during a system's design phase. SystemC is a C++ library for the design of hardware/software systems, enabling the designer to quickly prototype, e.g., a distributed CPS without having to decide about particular implementation details (such as whether to implement a feature in hardware or in software) early in the design process. Thereby, this approach reduces the initial implementation's complexity by offering an abstract layer with which to build a working prototype. However, as SystemC is based on C++, analyzing designs becomes a difficult task due to the complex language features that are available to the designer. Several fundamentally different approaches for analyzing SystemC designs have been suggested. This work illustrates several different SystemC analysis approaches, including their specific advantages and shortcomings, allowing designers to pick the right tools to assist them with a specific problem during the design of a system using SystemC.

  11. Solute transport with equilibrium aqueous complexation and either sorption or ion exchange: Simulation methodology and applications

    USGS Publications Warehouse

    Lewis, F.M.; Voss, C.I.; Rubin, J.

    1987-01-01

    Methodologies that account for specific types of chemical reactions in the simulation of solute transport can be developed so they are compatible with solution algorithms employed in existing transport codes. This enables the simulation of reactive transport in complex multidimensional flow regimes, and provides a means for existing codes to account for some of the fundamental chemical processes that occur among transported solutes. Two equilibrium-controlled reaction systems demonstrate a methodology for accommodating chemical interaction into models of solute transport. One system involves the sorption of a given chemical species, as well as two aqueous complexations in which the sorbing species is a participant. The other reaction set involves binary ion exchange coupled with aqueous complexation involving one of the exchanging species. The methodology accommodates these reaction systems through the addition of nonlinear terms to the transport equations for the sorbing species. Example simulation results show (1) the effect equilibrium chemical parameters have on the spatial distributions of concentration for complexing solutes; (2) that an interrelationship exists between mechanical dispersion and the various reaction processes; (3) that dispersive parameters of the porous media cannot be determined from reactive concentration distributions unless the reaction is accounted for or the influence of the reaction is negligible; (4) how the concentration of a chemical species may be significantly affected by its participation in an aqueous complex with a second species which also sorbs; and (5) that these coupled chemical processes influencing reactive transport can be demonstrated in two-dimensional flow regimes. © 1987.

  12. Some fast elliptic solvers on parallel architectures and their complexities

    NASA Technical Reports Server (NTRS)

    Gallopoulos, E.; Saad, Youcef

    1989-01-01

    The discretization of separable elliptic partial differential equations leads to linear systems with special block triangular matrices. Several methods are known to solve these systems, the most general of which is the Block Cyclic Reduction (BCR) algorithm, which handles equations with nonconstant coefficients. A method was recently proposed to parallelize and vectorize BCR. Here, the mapping of BCR on distributed memory architectures is discussed, and its complexity is compared with that of other approaches, including the Alternating-Direction method. A fast parallel solver is also described, based on an explicit formula for the solution, which has parallel computational complexity lower than that of parallel BCR.
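
    Scalar odd-even cyclic reduction, the building block that BCR applies at the block level, can be sketched in a few lines. Every row update within a reduction level is independent of the others, which is the source of the O(log n) parallel depth discussed above. This is a minimal serial sketch of the scalar variant, not the block algorithm itself.

```python
import numpy as np

def cyclic_reduction(a, b, c, d):
    """Solve a tridiagonal system (sub-diagonal a, diagonal b, super-diagonal
    c, right-hand side d) by odd-even cyclic reduction; requires n = 2**k - 1.
    At each level, rows at stride 2**l eliminate their neighbours at distance
    2**(l-1); all updates within a level are mutually independent."""
    a, b, c, d = (np.array(v, dtype=float) for v in (a, b, c, d))
    n = len(b)
    k = int(np.log2(n + 1))
    assert n == 2 ** k - 1, "cyclic reduction needs n = 2**k - 1"
    for l in range(1, k):                       # forward elimination
        h, hh = 2 ** l, 2 ** (l - 1)
        for i in range(h - 1, n, h):
            al = a[i] / b[i - hh]
            ga = c[i] / b[i + hh]
            b[i] -= al * c[i - hh] + ga * a[i + hh]
            d[i] -= al * d[i - hh] + ga * d[i + hh]
            a[i] = -al * a[i - hh]
            c[i] = -ga * c[i + hh]
    x = np.zeros(n)
    x[n // 2] = d[n // 2] / b[n // 2]           # single remaining equation
    for l in range(k - 1, 0, -1):               # back substitution
        h, hh = 2 ** l, 2 ** (l - 1)
        for i in range(hh - 1, n, h):
            xl = x[i - hh] if i >= hh else 0.0
            xr = x[i + hh] if i + hh < n else 0.0
            x[i] = (d[i] - a[i] * xl - c[i] * xr) / b[i]
    return x

# check against a dense solve on a random diagonally dominant system
rng = np.random.default_rng(7)
n = 7
b = 4 + rng.random(n); a = rng.random(n); c = rng.random(n); d = rng.random(n)
a[0] = c[-1] = 0.0
x = cyclic_reduction(a, b, c, d)
M = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
print(np.allclose(M @ x, d))  # → True
```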

  13. Enabling Controlling Complex Networks with Local Topological Information.

    PubMed

    Li, Guoqi; Deng, Lei; Xiao, Gaoxi; Tang, Pei; Wen, Changyun; Hu, Wuhua; Pei, Jing; Shi, Luping; Stanley, H Eugene

    2018-03-15

    Complex networks characterize the nature of internal/external interactions in real-world systems including social, economic, biological, ecological, and technological networks. Two issues remain obstacles to achieving control of large-scale networks: structural controllability, which describes the ability to guide a dynamical system from any initial state to any desired final state in finite time with a suitable choice of inputs; and optimal control, which is a typical control approach to minimize the cost for driving the network to a predefined state with a given number of control inputs. For large complex networks without global information of the network topology, both problems remain essentially open. Here we combine graph theory and control theory to tackle the two problems in one go, using only local network topology information. For the structural controllability problem, a distributed local-game matching method is proposed, in which every node plays a simple Bayesian game with local information and local interactions with adjacent nodes, ensuring a suboptimal solution at linear complexity. Starting from any structural controllability solution, a minimizing-longest-control-path method can efficiently reach a good solution for optimal control in large networks. Our results provide solutions for distributed complex network control and demonstrate a way to link structural controllability and optimal control together.
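
    For context, the classical structural controllability result (Liu, Slotine and Barabási, 2011) reduces driver-node selection to a maximum matching in the bipartite out-to-in representation of the directed network; the paper's contribution is to approach that matching with only local information. A centralized sketch of the baseline computation, using a simple augmenting-path matching:

```python
def min_driver_nodes(n, edges):
    """Minimum number of driver nodes for structural controllability of a
    directed network: N_D = max(N - |M*|, 1), where |M*| is a maximum
    matching in the bipartite out->in representation of the edge set."""
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
    match_in = [-1] * n  # match_in[v] = u means edge u->v is in the matching

    def augment(u, seen):
        # try to match out-copy of u, re-routing earlier matches if needed
        for v in adj[u]:
            if not seen[v]:
                seen[v] = True
                if match_in[v] == -1 or augment(match_in[v], seen):
                    match_in[v] = u
                    return True
        return False

    matched = sum(augment(u, [False] * n) for u in range(n))
    return max(n - matched, 1)

# a directed path 0 -> 1 -> 2 is controllable from a single driver node
print(min_driver_nodes(3, [(0, 1), (1, 2)]))  # → 1
# a star 0 -> {1, 2, 3} leaves two nodes unmatched, so three drivers
print(min_driver_nodes(4, [(0, 1), (0, 2), (0, 3)]))  # → 3
```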

  14. A framework for building real-time expert systems

    NASA Technical Reports Server (NTRS)

    Lee, S. Daniel

    1991-01-01

    The Space Station Freedom is an example of complex systems that require both traditional and artificial intelligence (AI) real-time methodologies. It was mandated that Ada should be used for all new software development projects. The station also requires distributed processing. Catastrophic failures on the station can cause the transmission system to malfunction for a long period of time, during which ground-based expert systems cannot provide any assistance to the crisis situation on the station. This is even more critical for other NASA projects that would have longer transmission delays (e.g., the lunar base, Mars missions, etc.). To address these issues, a distributed agent architecture (DAA) is proposed that can support a variety of paradigms based on both traditional real-time computing and AI. The proposed testbed for DAA is an autonomous power expert (APEX) which is a real-time monitoring and diagnosis expert system for the electrical power distribution system of the space station.

  15. Analysis of Power Laws, Shape Collapses, and Neural Complexity: New Techniques and MATLAB Support via the NCC Toolbox

    PubMed Central

    Marshall, Najja; Timme, Nicholas M.; Bennett, Nicholas; Ripp, Monica; Lautzenhiser, Edward; Beggs, John M.

    2016-01-01

    Neural systems include interactions that occur across many scales. Two divergent methods for characterizing such interactions have drawn on the physical analysis of critical phenomena and the mathematical study of information. Inferring criticality in neural systems has traditionally rested on fitting power laws to the property distributions of “neural avalanches” (contiguous bursts of activity), but the fractal nature of avalanche shapes has recently emerged as another signature of criticality. On the other hand, neural complexity, an information theoretic measure, has been used to capture the interplay between the functional localization of brain regions and their integration for higher cognitive functions. Unfortunately, treatments of all three methods—power-law fitting, avalanche shape collapse, and neural complexity—have suffered from shortcomings. Empirical data often contain biases that introduce deviations from true power law in the tail and head of the distribution, but deviations in the tail have often gone unconsidered; avalanche shape collapse has required manual parameter tuning; and the estimation of neural complexity has relied on small data sets or statistical assumptions for the sake of computational efficiency. In this paper we present technical advancements in the analysis of criticality and complexity in neural systems. We use maximum-likelihood estimation to automatically fit power laws with left and right cutoffs, present the first automated shape collapse algorithm, and describe new techniques to account for large numbers of neural variables and small data sets in the calculation of neural complexity. In order to facilitate future research in criticality and complexity, we have made the software utilized in this analysis freely available online in the MATLAB NCC (Neural Complexity and Criticality) Toolbox. PMID:27445842
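
    The basic maximum-likelihood power-law fit that the toolbox generalizes (to left and right cutoffs) is compact. A sketch of the standard continuous MLE above a fixed lower cutoff, tested on synthetic avalanche-size data rather than neural recordings:

```python
import numpy as np

def fit_power_law(x, xmin):
    """Continuous maximum-likelihood power-law exponent above a lower
    cutoff: alpha = 1 + n / sum(ln(x_i / xmin)) (Clauset et al. 2009).
    The NCC Toolbox generalizes this to left and right cutoffs."""
    tail = np.asarray(x, dtype=float)
    tail = tail[tail >= xmin]
    return 1.0 + len(tail) / np.sum(np.log(tail / xmin))

# synthetic avalanche sizes from P(x) ~ x**-1.5, x >= 1, by CDF inversion
rng = np.random.default_rng(0)
x = (1.0 - rng.random(200000)) ** (-1.0 / 0.5)
print(round(fit_power_law(x, 1.0), 2))
```

    The standard error of this estimator is (alpha - 1)/sqrt(n), so with 2 x 10^5 samples the fit recovers the true exponent 1.5 to about three decimal places.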

  16. A Framework for the Evaluation of the Cost and Benefits of Microgrids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, Greg Young; Abbey, Chad; Joos, Geza

    2011-07-15

    A Microgrid is recognized as an innovative technology to help integrate renewables into distribution systems and to provide additional benefits to a variety of stakeholders, such as offsetting infrastructure investments and improving the reliability of the local system. However, these systems require additional investments for control infrastructure, and as such, additional costs and the anticipated benefits need to be quantified in order to determine whether the investment is economically feasible. This paper proposes a methodology for systematizing and representing benefits and their interrelationships based on the UML Use Case paradigm, which allows complex systems to be represented in a concise, elegant format. This methodology is demonstrated by determining the economic feasibility of a Microgrid and Distributed Generation installed on a typical Canadian rural distribution system model as a case study. The study attempts to minimize the cost of energy served to the community, considering the fixed costs associated with Microgrids and Distributed Generation, and suggests benefits to a variety of stakeholders.

  17. Spatio-temporal assessment of food safety risks in Canadian food distribution systems using GIS.

    PubMed

    Hashemi Beni, Leila; Villeneuve, Sébastien; LeBlanc, Denyse I; Côté, Kevin; Fazil, Aamir; Otten, Ainsley; McKellar, Robin; Delaquis, Pascal

    2012-09-01

    While the value of geographic information systems (GIS) is widely applied in public health there have been comparatively few examples of applications that extend to the assessment of risks in food distribution systems. GIS can provide decision makers with strong computing platforms for spatial data management, integration, analysis, querying and visualization. The present report addresses some spatio-analyses in a complex food distribution system and defines influence areas as travel time zones generated through road network analysis on a national scale rather than on a community scale. In addition, a dynamic risk index is defined to translate a contamination event into a public health risk as time progresses. More specifically, in this research, GIS is used to map the Canadian produce distribution system, analyze accessibility to contaminated product by consumers, and estimate the level of risk associated with a contamination event over time, as illustrated in a scenario. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.

  18. Analysis of Trajectory Flexibility Preservation Impact on Traffic Complexity

    NASA Technical Reports Server (NTRS)

    Idris, Husni; El-Wakil, Tarek; Wing, David J.

    2009-01-01

    The growing demand for air travel is increasing the need for mitigation of air traffic congestion and complexity problems, which are already at high levels. At the same time new information and automation technologies are enabling the distribution of tasks and decisions from the service providers to the users of the air traffic system, with potential capacity and cost benefits. This distribution of tasks and decisions raises the concern that independent user actions will decrease the predictability and increase the complexity of the traffic system, hence inhibiting and possibly reversing any potential benefits. In answer to this concern, the authors proposed the introduction of decision-making metrics for preserving user trajectory flexibility. The hypothesis is that such metrics will make user actions naturally mitigate traffic complexity. In this paper, the impact of using these metrics on traffic complexity is investigated. The scenarios analyzed include aircraft in en route airspace with each aircraft meeting a required time of arrival in a one-hour time horizon while mitigating the risk of loss of separation with the other aircraft, thus preserving its trajectory flexibility. The experiments showed promising results in that the individual trajectory flexibility preservation induced self-separation and self-organization effects in the overall traffic situation. The effects were quantified using traffic complexity metrics, namely dynamic density indicators, which indicated that using the flexibility metrics reduced aircraft density and the potential of loss of separation.

  19. Characterizing Complexity of Containerized Cargo X-ray Images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Guangxing; Martz, Harry; Glenn, Steven

    X-ray imaging can be used to inspect cargos imported into the United States. In order to better understand the performance of X-ray inspection systems, the X-ray characteristics (density, complexity) of cargo need to be quantified. In this project, an image complexity measure called integrated power spectral density (IPSD) was studied using both DNDO engineered cargos and stream-of-commerce (SOC) cargos. A joint distribution of cargo density and complexity was obtained. A support vector machine was used to classify the SOC cargos into four categories to estimate the relative fractions.

  20. Efficient packet transportation on complex networks with nonuniform node capacity distribution

    NASA Astrophysics Data System (ADS)

    He, Xuan; Niu, Kai; He, Zhiqiang; Lin, Jiaru; Jiang, Zhong-Yuan

    2015-03-01

    Given that node delivery capacity may not be uniformly distributed in many realistic networks, we present a node delivery capacity distribution in which each node's capacity is composed of a uniform fraction and a degree-related proportion. Based on this node delivery capacity distribution, we construct a novel routing mechanism called the efficient weighted routing (EWR) strategy to enhance network traffic capacity and transportation efficiency. Compared with the shortest-path routing and efficient routing strategies, the EWR achieves the highest traffic capacity. An investigation of average path length, network diameter, maximum efficient betweenness, average efficient betweenness, average travel time and average traffic load under extensive simulations indicates that the EWR is a very effective routing method. The idea behind this routing mechanism offers good insight for network science research, and the practical use of this work is promising in real complex systems such as the Internet.
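
    The flavor of such capacity-aware routing can be illustrated with a degree-weighted shortest path, in the spirit of the efficient routing strategy that EWR is compared against: penalizing high-degree nodes steers traffic away from easily congested hubs. The graph and cost function below are illustrative, not the EWR cost function itself.

```python
import heapq

def efficient_path(adj, src, dst, beta=1.0):
    """Dijkstra search minimizing the sum of deg(v)**beta over nodes
    entered along the path, so high-degree hubs are avoided when a
    peripheral detour is cheaper."""
    deg = {u: len(vs) for u, vs in adj.items()}
    dist = {u: float('inf') for u in adj}
    prev = {}
    dist[src] = 0.0
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist[u]:
            continue
        for v in adj[u]:
            nd = d + deg[v] ** beta      # cost of entering node v
            if nd < dist[v]:
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path = [dst]
    while path[-1] != src:
        path.append(prev[path[-1]])
    return path[::-1]

# a hub (node 0) connected to everything, plus a peripheral detour 1-4-2
adj = {0: [1, 2, 3, 4], 1: [0, 4], 2: [0, 4], 3: [0], 4: [0, 1, 2]}
print(efficient_path(adj, 1, 2, beta=1.0))  # → [1, 4, 2]
```

    With beta = 0 every hop costs the same and the route through the hub is equally short; raising beta shifts load onto low-degree nodes, which is the mechanism behind the capacity gains reported above.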

  1. All fiber-coupled, long-term stable timing distribution for free-electron lasers with few-femtosecond jitter

    PubMed Central

    Şafak, K.; Xin, M.; Callahan, P. T.; Peng, M. Y.; Kärtner, F. X.

    2015-01-01

    We report recent progress made in a complete fiber-optic, high-precision, long-term stable timing distribution system for synchronization of next generation X-ray free-electron lasers. Timing jitter characterization of the master laser shows less than 170-as RMS integrated jitter for frequencies above 10 kHz, limited by the detection noise floor. Timing stabilization of a 3.5-km polarization-maintaining fiber link is successfully achieved with an RMS drift of 3.3 fs over 200 h of operation using all fiber-coupled elements. This all fiber-optic implementation will greatly reduce the complexity of optical alignment in timing distribution systems and improve the overall mechanical and timing stability of the system. PMID:26798814

  2. Cooperative Adaptive Output Regulation for Second-Order Nonlinear Multiagent Systems With Jointly Connected Switching Networks.

    PubMed

    Liu, Wei; Huang, Jie

    2018-03-01

    This paper studies the cooperative global robust output regulation problem for a class of heterogeneous second-order nonlinear uncertain multiagent systems with jointly connected switching networks. The main contributions consist of the following three aspects. First, we generalize the result of the adaptive distributed observer from undirected jointly connected switching networks to directed jointly connected switching networks. Second, by performing a new coordinate and input transformation, we convert our problem into the cooperative global robust stabilization problem of a more complex augmented system via the distributed internal model principle. Third, we solve the stabilization problem by a distributed state feedback control law. Our result is illustrated by the leader-following consensus problem for a group of Van der Pol oscillators.

  3. A problem of optimal control and observation for distributed homogeneous multi-agent system

    NASA Astrophysics Data System (ADS)

    Kruglikov, Sergey V.

    2017-12-01

    The paper considers the implementation of an algorithm for controlling a distributed complex of several mobile multi-robots. The concept of a unified information space of the controlling system is applied. The presented information and mathematical models of participants and obstacles, as real agents, and of goals and scenarios, as virtual agents, form the algorithmic and software basis for a computer decision support system. The controlling scheme assumes indirect management of the robotic team on the basis of an optimal control and observation problem predicting intelligent behavior in a dynamic, hostile environment. A representative problem is the transportation of a compound cargo by a group of participants under a distributed control scheme in terrain with multiple obstacles.

  4. Surface plasmon holographic microscopy for near-field refractive index detection and thin film mapping

    NASA Astrophysics Data System (ADS)

    Zhao, Jianlin; Zhang, Jiwei; Dai, Siqing; Di, Jianglei; Xi, Teli

    2018-02-01

    Surface plasmon microscopy (SPM) is widely applied for label-free detection of changes in refractive index and concentration, as well as for mapping thin films in the near field. Traditionally, SPM systems are based on the detection of light intensity or phase changes. Here, we present two kinds of surface plasmon holographic microscopy (SPHM) systems for simultaneous amplitude- and phase-contrast imaging. Through recording off-axis holograms and numerical reconstruction, the complex amplitude distributions of surface plasmon resonance (SPR) images can be obtained. According to the Fresnel formula, in a prism/gold/dielectric structure the reflection phase shift is uniquely decided by the refractive index of the dielectric. By measuring the phase shift difference of the reflected light with a prism-coupling SPHM system based on a common-path interference configuration, monitoring of tiny refractive index variations and imaging of biological tissue are performed. Furthermore, to characterize thin film thickness in the near field, we employ a four-layer SPR model in which the third (film) layer is within the evanescent field. The complex reflection coefficient, including the reflectivity and reflection phase shift, is uniquely decided by the film thickness. By measuring the complex amplitude distributions of the SPR images with an objective-coupling SPHM system based on a common-path interference configuration, the thickness distributions of thin films are mapped with theoretically sub-nanometer resolution. Owing to their high temporal stability, the proposed SPHMs show great potential for monitoring tiny refractive index variations, imaging biological tissues and mapping thin films in the near field, with dynamic, nondestructive and full-field measurement capabilities in chemistry, biomedicine, etc.
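
    The Fresnel relation invoked above (reflection amplitude and phase fixed by the dielectric) can be sketched with the standard three-layer Kretschmann formula. The gold permittivity and prism index below are assumed textbook values at 633 nm, not parameters from the paper.

```python
import numpy as np

def spr_reflection(theta, wl, n_prism, eps_metal, n_diel, d_metal):
    """Complex p-polarized reflection coefficient of a prism/metal/dielectric
    (Kretschmann) stack from the three-layer Fresnel formula."""
    k0 = 2.0 * np.pi / wl
    kx = k0 * n_prism * np.sin(theta)                  # in-plane wavevector
    eps = [n_prism ** 2, eps_metal, n_diel ** 2]
    kz = [np.sqrt(e * k0 ** 2 - kx ** 2 + 0j) for e in eps]

    def r(i, j):  # p-polarized single-interface coefficient
        return (eps[j] * kz[i] - eps[i] * kz[j]) / (eps[j] * kz[i] + eps[i] * kz[j])

    phase = np.exp(2j * kz[1] * d_metal)               # round trip through metal
    return (r(0, 1) + r(1, 2) * phase) / (1.0 + r(0, 1) * r(1, 2) * phase)

# assumed values: BK7-like prism, 50-nm gold film (eps ~ -12 + 1.2j), air cover
theta = np.radians(np.linspace(40.0, 50.0, 2001))
rp = spr_reflection(theta, 633e-9, 1.515, -12 + 1.2j, 1.0, 50e-9)
dip = np.degrees(theta[np.argmin(np.abs(rp) ** 2)])
print(round(dip, 1))  # SPR dip sits a few degrees above the glass/air critical angle
```

    Adding a thin film as a fourth layer (one more interface coefficient and propagation phase) makes the reflection coefficient a function of film thickness, which is the basis of the thickness mapping described above.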

  5. Intelligent building system for airport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ancevic, M.

    1997-11-01

    The Munich airport uses a state-of-the-art intelligent building management system to control systems such as HVAC, runway lights, baggage handling, etc. Planning the new Munich II international airport provided a unique opportunity to use the latest state-of-the-art technical systems, while integrating their control through a single intelligent building management system. Opened in 1992, the airport is Germany's second-largest airport after Frankfurt. The airport is staffed by 16,000 employees and can handle 17 million passengers a year. The sprawling site encompasses more than 120 buildings. The airport's distributed control system is specifically designed to optimize the complex's unique range of functions, while providing a high degree of comfort, convenience and safety for airport visitors. With the capacity to control 200,000 points, this system controls more than 112,000 points and integrates 13 major subsystems from nine different vendors. It provides convenient, accessible control of everything including the complex's power plant, HVAC control, the terminal's people-moving functions, interior lighting controls, runway lights, baggage forwarding systems, elevators, and boarding bridges. The airport was named 1993 intelligent building of the year by the Intelligent Buildings Institute Foundation. Its building management system is a striking example of the degree to which a building complex's functions can be integrated for greater operational control and efficiency.

  6. Stationary properties of maximum-entropy random walks.

    PubMed

    Dixit, Purushottam D

    2015-10-01

    Maximum-entropy (ME) inference of state probabilities using state-dependent constraints is popular in the study of complex systems. In stochastic systems, how state space topology and path-dependent constraints affect ME-inferred state probabilities remains unknown. To that end, we derive the transition probabilities and the stationary distribution of a maximum path entropy Markov process subject to state- and path-dependent constraints. A main finding is that the stationary distribution over states differs significantly from the Boltzmann distribution and reflects a competition between path multiplicity and imposed constraints. We illustrate our results with particle diffusion on a two-dimensional landscape. Connections with the path integral approach to diffusion are discussed.
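The unconstrained baseline of such a process, the plain maximum-entropy random walk (MERW) on an unweighted graph, can be sketched numerically: the transition matrix is built from the Perron eigenpair of the adjacency matrix, and the stationary distribution is the squared Perron vector. This is only a minimal illustration of how path multiplicity shapes the stationary state; the paper's state- and path-dependent constraints are not included.

```python
import numpy as np

# Plain MERW: P[i][j] = A[i][j] * psi[j] / (lam * psi[i]), where
# (lam, psi) is the Perron (largest) eigenpair of the symmetric
# adjacency matrix A. The stationary distribution is pi[i] = psi[i]**2
# (with psi normalized to unit length), which generally differs from
# the degree-proportional distribution of an ordinary random walk.

def merw(A):
    vals, vecs = np.linalg.eigh(A.astype(float))
    lam = vals[-1]                      # largest eigenvalue
    psi = np.abs(vecs[:, -1])           # Perron vector, nonnegative
    psi /= np.linalg.norm(psi)
    P = A * psi[None, :] / (lam * psi[:, None])
    pi = psi ** 2                       # stationary distribution
    return P, pi

# Example: 4-node path graph; MERW concentrates weight on central nodes.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
P, pi = merw(A)
assert np.allclose(P.sum(axis=1), 1)    # rows are valid distributions
assert np.allclose(pi @ P, pi)          # pi is stationary under P
```

The stationarity check follows from the eigenvector identity: summing `pi[i] * P[i][j]` over `i` reproduces `psi[j]**2`.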

  7. Engineering research, development and technology FY99

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langland, R T

    The growth of computer power and connectivity, together with advances in wireless sensing and communication technologies, is transforming the field of complex distributed systems. The ability to deploy large numbers of sensors with a rapid, broadband communication system will enable high-fidelity, near real-time monitoring of complex systems. These technological developments will provide unprecedented insight into the actual performance of engineered and natural environment systems, enable the evolution of many new types of engineered systems for monitoring and detection, and enhance our ability to perform improved and validated large-scale simulations of complex systems. One of the challenges facing engineering is to develop methodologies to exploit the emerging information technologies. Particularly important will be the ability to assimilate measured data into the simulation process in a way which is much more sophisticated than current, primarily ad hoc procedures. The reports contained in this section on the Center for Complex Distributed Systems describe activities related to the integrated engineering of large complex systems. The first three papers describe recent developments for each link of the integrated engineering process for large structural systems. These include (1) the development of model-based signal processing algorithms which will formalize the process of coupling measurements and simulation and provide a rigorous methodology for validation and update of computational models; (2) collaborative efforts with faculty at the University of California at Berkeley on the development of massive simulation models for the earth and large bridge structures; and (3) the development of wireless data acquisition systems which provide a practical means of monitoring large systems like the National Ignition Facility (NIF) optical support structures. These successful developments are coming to a confluence in the next year with applications to NIF structural characterizations and analysis of large bridge structures for the State of California. Initial feasibility investigations into the development of monitoring and detection systems are described in the papers on imaging of underground structures with ground-penetrating radar, and the use of live insects as sensor platforms. These efforts are establishing the basic performance characteristics essential to the decision process for future development of sensor arrays for information gathering related to national security.

  8. Efficient statistically accurate algorithms for the Fokker-Planck equation in large dimensions

    NASA Astrophysics Data System (ADS)

    Chen, Nan; Majda, Andrew J.

    2018-02-01

    Solving the Fokker-Planck equation for high-dimensional complex turbulent dynamical systems is an important and practical issue. However, most traditional methods suffer from the curse of dimensionality and have difficulties in capturing the fat tailed highly intermittent probability density functions (PDFs) of complex systems in turbulence, neuroscience and excitable media. In this article, efficient statistically accurate algorithms are developed for solving both the transient and the equilibrium solutions of Fokker-Planck equations associated with high-dimensional nonlinear turbulent dynamical systems with conditional Gaussian structures. The algorithms involve a hybrid strategy that requires only a small number of ensembles. Here, a conditional Gaussian mixture in a high-dimensional subspace via an extremely efficient parametric method is combined with a judicious non-parametric Gaussian kernel density estimation in the remaining low-dimensional subspace. Particularly, the parametric method provides closed analytical formulae for determining the conditional Gaussian distributions in the high-dimensional subspace and is therefore computationally efficient and accurate. The full non-Gaussian PDF of the system is then given by a Gaussian mixture. Different from traditional particle methods, each conditional Gaussian distribution here covers a significant portion of the high-dimensional PDF. Therefore a small number of ensembles is sufficient to recover the full PDF, which overcomes the curse of dimensionality. Notably, the mixture distribution has significant skill in capturing the transient behavior with fat tails of the high-dimensional non-Gaussian PDFs, and this facilitates the algorithms in accurately describing the intermittency and extreme events in complex turbulent systems. 
    It is shown in a stringent set of test problems that the method requires only O(100) ensembles to recover, with only small errors, the highly non-Gaussian transient PDFs in up to six dimensions.
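The mixture step of such a hybrid strategy can be illustrated in miniature: the full PDF is approximated as an equal-weight mixture of Gaussians, one per ensemble member. The means and covariances below are invented placeholders for illustration, not the output of the paper's conditional Gaussian formulae.

```python
import numpy as np

# Toy sketch: a non-Gaussian PDF represented as an equal-weight
# Gaussian mixture, where each "ensemble member" contributes one
# Gaussian component with its own mean and covariance.

def gaussian_pdf(x, mean, cov):
    d = len(mean)
    diff = x - mean
    norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(cov))
    return np.exp(-0.5 * diff @ np.linalg.inv(cov) @ diff) / norm

def mixture_pdf(x, means, covs):
    # Equal weights: each component covers a significant portion of
    # the PDF, so a small ensemble can suffice.
    return np.mean([gaussian_pdf(x, m, c) for m, c in zip(means, covs)])

rng = np.random.default_rng(0)
means = rng.normal(size=(50, 2))              # 50 ensemble members, 2-D
covs = [0.5 * np.eye(2) for _ in range(50)]   # placeholder covariances
p = mixture_pdf(np.zeros(2), means, covs)
assert p > 0
```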

  9. BIO-Plex Information System Concept

    NASA Technical Reports Server (NTRS)

    Jones, Harry; Boulanger, Richard; Arnold, James O. (Technical Monitor)

    1999-01-01

    This paper describes a suggested design for an integrated information system for the proposed BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) at Johnson Space Center (JSC), including distributed control systems, central control, networks, database servers, personal computers and workstations, applications software, and external communications. The system will have an open, commercial computing and networking architecture. The network will provide automatic real-time transfer of information to database server computers that perform data collection and validation. This information system will support integrated, data-sharing applications for everything from system alarms to management summaries. Most existing complex process control systems have information gaps between the different real-time subsystems, between these subsystems and the central controller, between the central controller and system-level planning and analysis software, and between the system-level applications and management overview reporting. An integrated information system is vitally necessary as the basis for integrating planning, scheduling, modeling, monitoring, and control, which will allow improved monitoring and control based on timely, accurate, and complete data. Data describing the system configuration and the real-time processes can be collected, checked and reconciled, analyzed, and stored in database servers that can be accessed by all applications. The required technology is available. The only opportunity to design a distributed, nonredundant, integrated system is before it is built; retrofit is extremely difficult and costly.

  10. RICIS Symposium 1988

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Integrated Environments for Large, Complex Systems is the theme for the RICIS symposium of 1988. Distinguished professionals from industry, government, and academia have been invited to participate and present their views and experiences regarding research, education, and future directions related to this topic. Within RICIS, more than half of the research being conducted is in the area of Computer Systems and Software Engineering. The focus of this research is on the software development life-cycle for large, complex, distributed systems. Within the education and training component of RICIS, the primary emphasis has been to provide education and training for software professionals.

  11. Analysis of Immune Complex Structure by Statistical Mechanics and Light Scattering Techniques.

    NASA Astrophysics Data System (ADS)

    Busch, Nathan Adams

    1995-01-01

    The size and structure of immune complexes determine their behavior in the immune system. The chemical physics of complex formation is not well understood; this is due in part to inadequate characterization of the proteins involved and in part to the lack of sufficiently well-developed theoretical techniques. Understanding complex formation will permit rational design of strategies for inhibiting tissue deposition of the complexes. A statistical mechanical model of the proteins, based on the theory of associating fluids, was developed. The multipole electrostatic potential of each protein used in this study was characterized by net protein charge, dipole moment magnitude, and dipole moment direction. The binding sites between the model antigen and antibodies were characterized by their net surface area, energy, and position relative to the dipole moment of the protein. The equilibrium binding curves generated with the protein statistical mechanical model compare favorably with experimental data obtained from radioimmunoassay results. The isothermal compressibility predicted by the model agrees with results obtained from dynamic light scattering. The statistical mechanics model was used to investigate association between the model antigen and selected pairs of antibodies. It was found that, in accordance with expectations from thermodynamic arguments, the highest total binding energy yielded complex distributions skewed toward larger complex sizes. From examination of the simulated formation of ring structures from linear chain complexes, and from the joint shape probability surfaces, it was found that ring configurations were formed by the "folding" of linear chains until the ends are within binding distance. By comparing single-antigen/two-antibody systems that differ only in their respective binding site locations, it was found that binding site location influences complex size and shape distributions only when ring formation occurs.
    The internal potential energy of a ring complex is considerably less than that of the non-associating system; the ring complexes are therefore quite stable and show no evidence of breaking up and collapsing into smaller complexes. Ring formation will occur only in systems where the total free energy of each complex can be minimized. Thus, ring formation occurs even when it results in entropically unfavorable conformations, provided the total free energy is thereby minimized.

  12. Distributed synchronization control of complex networks with communication constraints.

    PubMed

    Xu, Zhenhua; Zhang, Dan; Song, Hongbo

    2016-11-01

    This paper is concerned with the distributed synchronization control of complex networks with communication constraints. In this work, the controllers communicate with each other through a wireless network, acting as a controller network. Because of constrained transmission power, packet size reduction and transmission rate reduction schemes are proposed to help reduce the communication load of the controller network. The packet dropout problem is also considered in the controller design, since it is often encountered in networked control systems. We show that the closed-loop system can be modeled as a switched system with uncertainties and random variables. By resorting to the switched-system approach and stochastic system analysis methods, a new sufficient condition is first proposed such that exponential synchronization is guaranteed in the mean-square sense. The controller gains are determined using the well-known cone complementarity linearization (CCL) algorithm. Finally, a simulation study is performed, which demonstrates the effectiveness of the proposed design algorithm. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  13. Parallel processing for scientific computations

    NASA Technical Reports Server (NTRS)

    Alkhatib, Hasan S.

    1995-01-01

    This project investigated the requirements for supporting distributed computing of scientific computations over a cluster of cooperative workstations. Various experiments on computations for the solution of simultaneous linear equations were performed in the early phase of the project to gain experience in the general nature and requirements of scientific applications. A specification of a distributed integrated computing environment, DICE, based on a distributed shared memory communication paradigm, has been developed and evaluated. The distributed shared memory model facilitates porting existing parallel algorithms, designed for shared memory multiprocessor systems, to the new environment. The potential of this new environment is to provide supercomputing capability through the aggregate power of workstations cooperating in a cluster interconnected via a local area network. Individual workstations generally do not have the computing power to tackle complex scientific applications, making them primarily useful for visualization, data reduction, and filtering where such applications are concerned. A tremendous amount of computing power is left unused in a network of workstations; very often a workstation simply sits idle on a desk. A set of tools can be developed to harness this potential computing power and create a platform suitable for large scientific computations. The integration of several workstations into a logical cluster of distributed, cooperative computing stations presents an alternative to shared memory multiprocessor systems. In this project we designed and evaluated such a system.

  14. A Nonequilibrium Rate Formula for Collective Motions of Complex Molecular Systems

    NASA Astrophysics Data System (ADS)

    Yanao, Tomohiro; Koon, Wang Sang; Marsden, Jerrold E.

    2010-09-01

    We propose a compact reaction rate formula that accounts for a non-equilibrium distribution of residence times of complex molecules, based on a detailed study of the coarse-grained phase space of a reaction coordinate. We take the structural transition dynamics of a six-atom Morse cluster between two isomers as a prototype of multi-dimensional molecular reactions. Residence time distribution of one of the isomers shows an exponential decay, while that of the other isomer deviates largely from the exponential form and has multiple peaks. Our rate formula explains such equilibrium and non-equilibrium distributions of residence times in terms of the rates of diffusions of energy and the phase of the oscillations of the reaction coordinate. Rapid diffusions of energy and the phase generally give rise to the exponential decay of residence time distribution, while slow diffusions give rise to a non-exponential decay with multiple peaks. We finally make a conjecture about a general relationship between the rates of the diffusions and the symmetry of molecular mass distributions.

  15. Simple Algorithms for Distributed Leader Election in Anonymous Synchronous Rings and Complete Networks Inspired by Neural Development in Fruit Flies.

    PubMed

    Xu, Lei; Jeavons, Peter

    2015-11-01

    Leader election in anonymous rings and complete networks is a very practical problem in distributed computing. Previous algorithms for this problem are generally designed for a classical message passing model where complex messages are exchanged. However, the need to send and receive complex messages makes such algorithms less practical for some real applications. We present some simple synchronous algorithms for distributed leader election in anonymous rings and complete networks that are inspired by the development of the neural system of the fruit fly. Our leader election algorithms all assume that only one-bit messages are broadcast by nodes in the network and processors are only able to distinguish between silence and the arrival of one or more messages. These restrictions allow implementations to use a simpler message-passing architecture. Even with these harsh restrictions our algorithms are shown to achieve good time and message complexity both analytically and experimentally.
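The flavor of such one-bit algorithms on a complete network can be sketched as a simulation. Nodes broadcast only a single bit and can sense only silence versus "at least one message"; a silent node that hears a message withdraws. The round limit and broadcast probability schedule below are simplifying assumptions for illustration, not the paper's exact protocol.

```python
import random

# Sketch of fly-inspired one-bit leader election in a complete
# network. Each round, every remaining candidate broadcasts one bit
# with some probability; candidates that stayed silent but heard a
# broadcast withdraw. Only silence vs. >=1 message is detectable.

def elect(n, rounds=200, seed=0):
    rng = random.Random(seed)
    active = set(range(n))
    for _ in range(rounds):
        if len(active) == 1:
            break
        p = 1.0 / len(active)          # assumed probability schedule
        broadcasters = {i for i in active if rng.random() < p}
        if broadcasters:
            # Silent nodes that heard a message drop out of contention.
            active = broadcasters
    return active

winners = elect(10)
assert len(winners) == 1               # one leader remains (w.h.p.)
```

With probability schedule 1/|active|, each round has a constant chance of leaving exactly one broadcaster, so the expected number of rounds is small; the fixed round budget makes termination explicit in this sketch.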

  16. Modeling Power Systems as Complex Adaptive Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chassin, David P.; Malard, Joel M.; Posse, Christian

    2004-12-30

    Physical analogs have shown considerable promise for understanding the behavior of complex adaptive systems, including macroeconomics, biological systems, social networks, and electric power markets. Many of today's most challenging technical and policy questions can be reduced to a distributed economic control problem. Indeed, economically based control of large-scale systems is founded on the conjecture that price-based regulation (e.g., auctions, markets) results in an optimal allocation of resources and emergent optimal system control. This report explores the state-of-the-art physical analogs for understanding the behavior of some econophysical systems and deriving stable and robust control strategies for using them. We review and discuss applications of some analytic methods based on a thermodynamic metaphor, according to which the interplay between system entropy and conservation laws gives rise to intuitive and governing global properties of complex systems that cannot be otherwise understood. We apply these methods to the question of how power markets can be expected to behave under a variety of conditions.

  17. The interaction of insulin with phospholipids

    PubMed Central

    Perry, M. C.; Tampion, W.; Lucy, J. A.

    1971-01-01

    1. A simple two-phase chloroform–aqueous buffer system was used to investigate the interaction of insulin with phospholipids and other amphipathic substances. 2. The distribution of 125I-labelled insulin in this system was determined after incubation at 37°C. Phosphatidic acid, dicetylphosphoric acid and, to a lesser extent, phosphatidylcholine and cetyltrimethylammonium bromide solubilized 125I-labelled insulin in the chloroform phase, indicating the formation of chloroform-soluble insulin–phospholipid or insulin–amphipath complexes. Phosphatidylethanolamine, sphingomyelin, cholesterol, stearylamine and Triton X-100 were without effect. 3. Formation of insulin–phospholipid complex was confirmed by paper chromatography. 4. The two-phase system was adapted to act as a simple functional system with which to investigate possible effects of insulin on the structural and functional properties of phospholipid micelles in chloroform, by using the distribution of [14C]glucose between the two phases as a monitor of phospholipid–insulin interactions. The ability of phospholipids to solubilize [14C]glucose in chloroform increased in the order phosphatidylcholine

  18. Acrally distributed dermatoses: Vascular dermatoses (purpura and vasculitis).

    PubMed

    Kazandjieva, Jana; Antonov, Dimitar; Kamarashev, Jivko; Tsankov, Nikolai

    Purpuric lesions appear in acral distribution in a variety of conditions and often provide clues to the clinical diagnosis. Purpuric means "hemorrhagic"; that is, the lesions do not blanch under pressure. This review focuses on dermatoses that produce hemorrhagic lesions in acral distribution from the large groups of the vasculitic diseases and their mimics. Cutaneous small vessel vasculitis is confined to the skin, involves mainly postcapillary venules, and has the hallmark manifestation of palpable purpura. Henoch-Schönlein purpura is an immune complex-mediated systemic vasculitis of the small vessels with manifestations in the skin, joints, kidneys, and gastrointestinal system. Only cases in which the immune complexes contain immunoglobulin A are classified as Henoch-Schönlein purpura. Cryoglobulinemic vasculitis is induced by the deposition of cold-precipitated immune complexes in the small vessels. Urticarial vasculitis comprises a spectrum of conditions with the characteristic course of chronic urticaria, with wheals that persist longer than 24 hours, leave hyperpigmentation, and show leukocytoclastic vasculitis on histologic examination. Polyarteritis nodosa is a rare multisystem, segmental necrotizing vasculitis of mainly the medium-sized vessels. Pigmented purpuric dermatoses are chronic benign dermatoses characterized by petechiae, purpura, and increased skin pigmentation. The hallmark of pigmented purpuric dermatoses is their orange-brown, speckled, cayenne pepper-like discoloration. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Research on distributed virtual reality system in electronic commerce

    NASA Astrophysics Data System (ADS)

    Xue, Qiang; Wang, Jiening; Sun, Jizhou

    2004-03-01

    In this paper, Distributed Virtual Reality (DVR) technology applied in Electronic Commerce (EC) is discussed. DVR provides a new means for humans to recognize, analyze, and resolve large-scale, complex problems, which has driven its rapid development in EC. The technologies of CSCW (Computer Supported Cooperative Work) and middleware are introduced into the development of the EC-DVR system to meet the need for a platform that provides the necessary cooperation and communication services, avoiding repeated development of basic modules. Finally, the paper gives a platform structure for the EC-DVR system.

  20. Distribution pattern of rare earth ions between water and montmorillonite and its relation to the sorbed species of the ions.

    PubMed

    Takahashi, Yoshio; Tada, Akisa; Shimizu, Hiroshi

    2004-09-01

    REE (rare earth element) distribution coefficients (Kd) between the aqueous phase and montmorillonite surface were obtained to investigate the relation between the REE distribution patterns and the species of REE sorbed on the solid-water interface. It was shown that the features in the REE patterns, such as the slope of the REE patterns, the tetrad effect, and the Y/Ho ratio, were closely related to the REE species at the montmorillonite-water interface. In a binary system (REE-montmorillonite) below pH 5, three features (a larger Kd value for a lighter REE, the absence of the tetrad effect, and the Y/Ho ratio being unchanged from its initial value) suggest that hydrated REE are directly sorbed as an outer-sphere complex at the montmorillonite-water interface. Above pH 5.5, the features in the REE patterns, the larger Kd value for heavier REE, the M-type tetrad effect, and the reduced Y/Ho ratio, showed the formation of an inner-sphere complex of REE with -OH group at the montmorillonite surface. In addition, the REE patterns in the presence of humic acid at pH 5.9 were also studied, where the REE patterns became flat, suggesting that the humate complex is dominant as both dissolved and sorbed species of REE in the ternary system. All of these results were consistent with the spectroscopic data (laser-induced fluorescence spectroscopy) showing the local structure of Eu(III) conducted in the same experimental system. The present results suggest that the features in the REE distribution patterns include information on the REE species at the solid-water interface.
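A distribution coefficient of the kind reported here is, in the usual batch-sorption convention, the ratio of the sorbed concentration (per unit mass of solid) to the equilibrium solution concentration. A minimal sketch with invented numbers, not data from this study:

```python
# Batch-sorption distribution coefficient:
#   Kd = (amount sorbed per unit mass of solid) / (equilibrium
#         solution concentration), typically in L/g.
# The amount sorbed is inferred from the drop in solution
# concentration times the solution volume.

def kd(c_initial, c_equilibrium, volume_l, mass_g):
    sorbed_per_g = (c_initial - c_equilibrium) * volume_l / mass_g
    return sorbed_per_g / c_equilibrium

# Hypothetical example: 100 ug/L initial, 20 ug/L at equilibrium,
# 0.05 L of solution, 0.1 g of montmorillonite.
# Sorbed: (100 - 20) * 0.05 = 4 ug -> 40 ug/g; Kd = 40 / 20 = 2 L/g.
assert abs(kd(100.0, 20.0, 0.05, 0.1) - 2.0) < 1e-9
```

Comparing such Kd values element by element across the lanthanide series is what produces the "REE pattern" whose slope and tetrad features the abstract discusses.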

  1. Research in Modeling and Simulation for Airspace Systems Innovation

    NASA Technical Reports Server (NTRS)

    Ballin, Mark G.; Kimmel, William M.; Welch, Sharon S.

    2007-01-01

    This viewgraph presentation provides an overview of some of the applied research and simulation methodologies at the NASA Langley Research Center that support aerospace systems innovation. Risk assessment methodologies, complex systems design and analysis methodologies, and aerospace operations simulations are described. Potential areas for future research and collaboration using interactive and distributed simulations are also proposed.

  2. Thrust distribution for attitude control in a variable thrust propulsion system with four ACS nozzles

    NASA Astrophysics Data System (ADS)

    Lim, Yeerang; Lee, Wonsuk; Bang, Hyochoong; Lee, Hosung

    2017-04-01

    A thrust distribution approach is proposed in this paper for a variable thrust solid propulsion system with an attitude control system (ACS) that uses a reduced number of nozzles for a three-axis attitude maneuver. Although a conventional variable thrust solid propulsion system needs six ACS nozzles, this paper proposes a thrust system with four ACS nozzles to reduce the complexity and mass of the system. The performance of the new system was analyzed with numerical simulations, and the results show that the performance of the system with four ACS nozzles was similar to the original system while the mass of the whole system was simultaneously reduced. Moreover, a feasibility analysis was performed to determine whether a thrust system with three ACS nozzles is possible.

  3. The noisy voter model on complex networks.

    PubMed

    Carro, Adrián; Toral, Raúl; San Miguel, Maxi

    2016-04-20

    We propose a new analytical method to study stochastic, binary-state models on complex networks. Moving beyond the usual mean-field theories, this alternative approach is based on an annealed approximation for uncorrelated networks, allowing the network structure to be treated as parametric heterogeneity. As an illustration, we study the noisy voter model, a modification of the original voter model that includes random changes of state. The proposed method is able to unfold the dependence of the model not only on the mean degree (the mean-field prediction) but also on more complex averages over the degree distribution. In particular, we find that the degree heterogeneity (the variance of the underlying degree distribution) has a strong influence on the location of the critical point of a noise-induced, finite-size transition occurring in the model, on the local ordering of the system, and on the functional form of its temporal correlations. Finally, we show how this latter point opens the possibility of inferring the degree heterogeneity of the underlying network by observing only the aggregate behavior of the system as a whole, an issue of interest for systems where only macroscopic, population-level variables can be measured.
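The model's dynamics are simple to state and to simulate directly. A minimal sketch of the standard noisy voter model on a graph given as an adjacency list follows; the paper's annealed-network analysis is not reproduced here, and the noise parameter and network below are illustrative choices.

```python
import random

# Noisy voter model: at each step a random node either flips its
# binary state at random (noise, with probability a) or copies the
# state of a randomly chosen neighbor (herding, with probability 1-a).

def noisy_voter(adj, a=0.05, steps=10_000, seed=1):
    rng = random.Random(seed)
    n = len(adj)
    state = [rng.randint(0, 1) for _ in range(n)]
    for _ in range(steps):
        i = rng.randrange(n)
        if rng.random() < a:
            state[i] = rng.randint(0, 1)           # noisy update
        else:
            state[i] = state[rng.choice(adj[i])]   # copy a neighbor
    return state

# Toy network: complete graph on 20 nodes (every pair connected).
adj = [[j for j in range(20) if j != i] for i in range(20)]
final = noisy_voter(adj)
assert all(s in (0, 1) for s in final)
```

Running this for many realizations and tracking the fraction of nodes in state 1 exhibits the noise-induced, finite-size transition the abstract refers to: small `a` yields mostly ordered (consensus-like) configurations, large `a` yields disordered ones.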

  4. Investigating accident causation through information network modelling.

    PubMed

    Griffin, T G C; Young, M S; Stanton, N A

    2010-02-01

    Management of risk in complex domains such as aviation relies heavily on post-event investigations, requiring complex approaches to fully understand the integration of multi-causal, multi-agent and multi-linear accident sequences. The Event Analysis of Systemic Teamwork methodology (EAST; Stanton et al. 2008) offers such an approach based on network models. In this paper, we apply EAST to a well-known aviation accident case study, highlighting communication between agents as a central theme and investigating the potential for identifying agents who were key to the accident. Ultimately, this work aims to develop a new model based on distributed situation awareness (DSA) to demonstrate that the risk inherent in a complex system depends on the information flowing within it. By identifying key agents and information elements, we can propose proactive design strategies to optimize the flow of information and help work towards avoiding aviation accidents. Statement of Relevance: This paper introduces a novel application of a holistic methodology for understanding aviation accidents. Furthermore, it introduces an ongoing project developing a nonlinear and prospective method that centralises distributed situation awareness and communication as themes. The relevance of the findings is discussed in the context of current ergonomic and aviation issues of design, training and human-system interaction.

  5. @neurIST: infrastructure for advanced disease management through integration of heterogeneous data, computing, and complex processing services.

    PubMed

    Benkner, Siegfried; Arbona, Antonio; Berti, Guntram; Chiarini, Alessandro; Dunlop, Robert; Engelbrecht, Gerhard; Frangi, Alejandro F; Friedrich, Christoph M; Hanser, Susanne; Hasselmeyer, Peer; Hose, Rod D; Iavindrasana, Jimison; Köhler, Martin; Iacono, Luigi Lo; Lonsdale, Guy; Meyer, Rodolphe; Moore, Bob; Rajasekaran, Hariharan; Summers, Paul E; Wöhrer, Alexander; Wood, Steven

    2010-11-01

    The increasing volume of data describing human disease processes and the growing complexity of understanding, managing, and sharing such data presents a huge challenge for clinicians and medical researchers. This paper presents the @neurIST system, which provides an infrastructure for biomedical research while aiding clinical care, by bringing together heterogeneous data and complex processing and computing services. Although @neurIST targets the investigation and treatment of cerebral aneurysms, the system's architecture is generic enough that it could be adapted to the treatment of other diseases. Innovations in @neurIST include confining the patient data pertaining to aneurysms inside a single environment that offers clinicians the tools to analyze and interpret patient data and make use of knowledge-based guidance in planning their treatment. Medical researchers gain access to a critical mass of aneurysm related data due to the system's ability to federate distributed information sources. A semantically mediated grid infrastructure ensures that both clinicians and researchers are able to seamlessly access and work on data that is distributed across multiple sites in a secure way in addition to providing computing resources on demand for performing computationally intensive simulations for treatment planning and research.

  6. Electron detachment of the hydrogen-bonded amino acid side-chain guanine complexes

    NASA Astrophysics Data System (ADS)

    Wang, Jing; Gu, Jiande; Leszczynski, Jerzy

    2007-07-01

    The photoelectron spectra of the hydrogen-bonded amino acid side-chain-guanine complexes have been studied at the partial third-order (P3) self-energy approximation of electron propagator theory. The correlation between the vertical electron detachment energy (VDE) and the charge distribution on the guanine moiety reveals that the VDE increases as the positive charge on the guanine increases. The low VDE values determined for the negatively charged complexes of guanine with the side-chain group of Asp/Glu suggest that the influence of H-bonded anionic groups on the VDE of guanine could be more important than that of the anionic backbone structure. An even lower vertical electron detachment energy for guanine can thus be expected in H-bonded protein-DNA systems.

  7. Intelligent services for discovery of complex geospatial features from remote sensing imagery

    NASA Astrophysics Data System (ADS)

    Yue, Peng; Di, Liping; Wei, Yaxing; Han, Weiguo

    2013-09-01

    Remote sensing imagery has been commonly used by intelligence analysts to discover geospatial features, including complex ones. The overwhelming volume of routine image acquisition requires automated methods or systems for feature discovery instead of manual image interpretation. The methods of extraction of elementary ground features such as buildings and roads from remote sensing imagery have been studied extensively. The discovery of complex geospatial features, however, is still rather understudied. A complex feature, such as a Weapon of Mass Destruction (WMD) proliferation facility, is spatially composed of elementary features (e.g., buildings for hosting fuel concentration machines, cooling towers, transportation roads, and fences). Such spatial semantics, together with thematic semantics of feature types, can be used to discover complex geospatial features. This paper proposes a workflow-based approach for discovery of complex geospatial features that uses geospatial semantics and services. The elementary features extracted from imagery are archived in distributed Web Feature Services (WFSs) and discoverable from a catalogue service. Using spatial semantics among elementary features and thematic semantics among feature types, workflow-based service chains can be constructed to locate semantically-related complex features in imagery. The workflows are reusable and can provide on-demand discovery of complex features in a distributed environment.

  8. Soilscapes in the dynamic tropical environments: The case of Sierra Madre del Sur

    NASA Astrophysics Data System (ADS)

    Krasilnikov, P. V.; García-Calderón, N. E.; Ibáñez-Huerta, A.; Bazán-Mateos, M.; Hernández-Santana, J. R.

    2011-12-01

    The paper gives an analysis of the pattern of soil cover of the Sierra Madre del Sur, one of the most complex physiographic regions of Mexico. It presents the results of the study of four latitudinal traverses across the region. We show that the distribution of soils in the Sierra Madre del Sur is associated with major climatic gradients, namely by vertical bioclimatic zonality in the mountains and by the effect of mountain shadow. Altitudinal distribution of soil-bioclimatic belts is complex due to non-uniform gradients of temperature and rainfall, and varies with the configuration of the mountain range. The distribution of soils is associated with the erosion and accumulation rates both on mountain slopes and in river valleys. The abundance of poorly developed soils in (semi)arid areas was ascribed to high erosion rate rather than to low pedogenetic potential. The formation of soil mosaic at a larger scale might be ascribed to the complex net of gully erosion and to the system of seismically triggered landslides of various ages. In the valleys, the distribution of soils depends upon the dynamics of sedimentation and erosion, which eventually exposes paleosols. Red-colored clayey sediments are remains of ancient weathering and pedogenesis. Their distribution is associated mainly with the intensity of recent slope processes. The soil cover pattern of the Sierra Madre del Sur cannot be explained by simplified schemes of bioclimatic zonality. The soil ranges can be explained by the distribution of climates, lithology, complex geological history of the region, and recent geomorphological processes.

  9. Macro-Econophysics

    NASA Astrophysics Data System (ADS)

    Aoyama, Hideaki; Fujiwara, Yoshi; Ikeda, Yuichi; Iyetomi, Hiroshi; Souma, Wataru; Yoshikawa, Hiroshi

    2017-07-01

    Preface; Foreword; Acknowledgements; List of tables; List of figures; Prologue; 1. Introduction: reconstructing macroeconomics; 2. Basic concepts in statistical physics and stochastic models; 3. Income and firm-size distributions; 4. Productivity distribution and related topics; 5. Multivariate time-series analysis; 6. Business cycles; 7. Price dynamics and inflation/deflation; 8. Complex network, community analysis, visualization; 9. Systemic risks; Appendix A: computer program for beginners; Epilogue; Bibliography; Index.

  10. Surface speciation of phosphate on goethite as seen by InfraRed Surface Titrations (IRST)

    NASA Astrophysics Data System (ADS)

    Arroyave, Jeison Manuel; Puccia, Virginia; Zanini, Graciela P.; Avena, Marcelo J.

    2018-06-01

    Phosphate adsorption at the metal oxide-water interface has been intensely studied, and the phosphate-goethite system in aqueous media is normally used as a model system, with abundant information regarding adsorption-desorption under very different conditions. In spite of this, there is still discussion on whether the main inner-sphere surface complexes that phosphate forms on goethite are monodentate or bidentate. A new spectroscopic technique, InfraRed Surface Titration (IRST), is presented here and used to systematically explore the surface speciation of phosphate on goethite in the pH range 4.5-9.5 at different surface coverages. IRST enabled the construction of distribution curves of surface species and of dissolved phosphate species. In combination with the CD-MUSIC surface complexation model, it was possible to conclude that the surface complexes are monodentate. Very accurate distribution curves were obtained, showing a crossing point at pH 5.5 at a surface coverage of 2.0 μmol m-2, with a mononuclear monoprotonated species predominating at pH > 5.5 and a mononuclear diprotonated species prevailing at pH < 5.5. In contrast, at the low surface coverage of 0.7 μmol m-2 there is no crossing point, with the mononuclear monoprotonated species prevailing at all pH values. IRST can become a powerful technique to investigate the structure, properties and reactions of any IR-active surface complex at the solid-water interface.
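    As an illustration of how such distribution curves behave, the crossing point at pH 5.5 implies an effective surface pKa of about 5.5 for the diprotonated/monoprotonated pair. A minimal sketch (the simple Henderson-Hasselbalch form and the pKa value are illustrative assumptions here, not the CD-MUSIC model used in the paper):

```python
def species_fractions(pH, pKa):
    """Fractions of a diprotonated species and its monoprotonated
    conjugate base at a given pH (Henderson-Hasselbalch form)."""
    ratio = 10.0 ** (pH - pKa)        # [monoprotonated] / [diprotonated]
    di = 1.0 / (1.0 + ratio)
    mono = ratio / (1.0 + ratio)
    return di, mono

# pKa = 5.5 is taken from the reported crossing point (an assumption,
# not a fitted CD-MUSIC parameter): at pH == pKa both species are 50%.
di, mono = species_fractions(5.5, 5.5)
```

    Above pH 5.5 the monoprotonated fraction dominates and below it the diprotonated one, reproducing the qualitative shape of the reported curves.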

  11. The US business cycle: power law scaling for interacting units with complex internal structure

    NASA Astrophysics Data System (ADS)

    Ormerod, Paul

    2002-11-01

    In the social sciences, there is increasing evidence of the existence of power law distributions. The distribution of recessions in capitalist economies has recently been shown to follow such a distribution. The preferred explanation for this is self-organised criticality. Gene Stanley and colleagues propose an alternative, namely that power law scaling can arise from the interplay between random multiplicative growth and the complex structure of the units composing the system. This paper offers a parsimonious model of the US business cycle based on similar principles. The business cycle, along with long-term growth, is one of the two features which distinguish capitalism from all previously existing societies. Yet, economics lacks a satisfactory theory of the cycle. The source of cycles is posited in economic theory to be a series of random shocks which are external to the system. In this model, the cycle is an internal feature of the system, arising from the level of industrial concentration of the agents and the interactions between them. The model, in contrast to existing economic theories of the cycle, accounts for the key features of output growth in the US business cycle in the 20th century.

  12. Using eDNA to estimate distribution of fish species in a complex river system (presentation)

    EPA Science Inventory

    Environmental DNA (eDNA) analysis of biological material shed by aquatic organisms is a noninvasive genetic tool that can improve efficiency and reduce costs associated with species detection in aquatic systems. eDNA methods are widely used to assess presence/absence of a target ...

  13. THE EFFECT OF TEMPERATURE ON THE GROWTH OF MYCOBACTERIUM AVIUM COMPLEX (MAC) ORGANISMS

    EPA Science Inventory

    MAC organisms are able to grow, persist, and colonize in water distribution systems and may amplify in hospital hot water systems. This study examined the response of MAC organisms (M. avium, M. intracellulare, and MX) to a range of temperatures commonly associated with drinking...

  14. School Technology Leadership: Artifacts in Systems of Practice

    ERIC Educational Resources Information Center

    Dexter, Sara

    2011-01-01

    A cross-case analysis of five case studies of team-based technology leadership in middle schools with laptop programs identifies systems of practice that organize teams' distributed leadership. These cases suggest that successfully implementing a complex improvement effort warrants a team-based leadership approach, especially for an improvement…

  15. Funding California Schools: The Revenue Limit System

    ERIC Educational Resources Information Center

    Weston, Margaret

    2010-01-01

    Tax revenue flows to California's nearly 1,000 school districts through many different channels. According to the Governor's Committee on Education Excellence (2007), this system is so complex that the state cannot determine how revenues are distributed among school districts, and after reviewing a large number of academic studies in the Getting…

  16. 41 CFR 101-25.101-4 - Supply through indefinite quantity requirement contracts.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    .... (3) The item is proprietary or so complex in design, function, or operation as to be noncompetitive... Federal Property Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND... introduced into a supply system), or no advantage accrues doing so; and (b) Industry distribution facilities...

  17. 41 CFR 101-25.101-4 - Supply through indefinite quantity requirement contracts.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .... (3) The item is proprietary or so complex in design, function, or operation as to be noncompetitive... Federal Property Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND... introduced into a supply system), or no advantage accrues doing so; and (b) Industry distribution facilities...

  18. Aerospace Applications of Weibull and Monte Carlo Simulation with Importance Sampling

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.

    1998-01-01

    Recent developments in reliability modeling and computer technology have made it practical to use the Weibull time to failure distribution to model the system reliability of complex fault-tolerant computer-based systems. These system models are becoming increasingly popular in space systems applications as a result of mounting data that support the decreasing Weibull failure distribution and the expectation of increased system reliability. This presentation introduces the new reliability modeling developments and demonstrates their application to a novel space system application. The application is a proposed guidance, navigation, and control (GN&C) system for use in a long duration manned spacecraft for a possible Mars mission. Comparisons to the constant failure rate model are presented and the ramifications of doing so are discussed.
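    The comparison made in the abstract, a decreasing Weibull failure distribution versus a constant failure rate, can be sketched directly. The parameter values below are hypothetical, chosen only to show the qualitative difference when the two models are matched at the characteristic life:

```python
import math

def weibull_reliability(t, beta, eta):
    """Weibull time-to-failure model: R(t) = exp(-(t/eta)**beta).
    Shape beta < 1 corresponds to a decreasing failure rate."""
    return math.exp(-((t / eta) ** beta))

def exponential_reliability(t, lam):
    """Constant-failure-rate model: R(t) = exp(-lam * t)."""
    return math.exp(-lam * t)

# Hypothetical parameters: match the two models at the characteristic
# life t = eta, where both give R = exp(-1).
eta, beta = 1000.0, 0.5
lam = 1.0 / eta
# With beta < 1 the Weibull model predicts lower early-life reliability
# but higher long-term reliability than the constant-rate model.
early = (weibull_reliability(100.0, beta, eta),
         exponential_reliability(100.0, lam))
late = (weibull_reliability(4000.0, beta, eta),
        exponential_reliability(4000.0, lam))
```

    This crossover is why substituting a constant failure rate for a decreasing Weibull rate can misstate system reliability at both ends of the mission timeline.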

  19. Stochastic model simulation using Kronecker product analysis and Zassenhaus formula approximation.

    PubMed

    Caglar, Mehmet Umut; Pal, Ranadip

    2013-01-01

    Probabilistic models are regularly applied in genetic regulatory network modeling to capture the stochastic behavior observed in the generation of biological entities such as mRNA or proteins. Several approaches, including Stochastic Master Equations and Probabilistic Boolean Networks, have been proposed to model the stochastic behavior in genetic regulatory networks. It is generally accepted that the Stochastic Master Equation is a fundamental model that can describe the system being investigated in fine detail, but the application of this model is computationally very expensive. On the other hand, the Probabilistic Boolean Network captures only the coarse-scale stochastic properties of the system without modeling the detailed interactions. We propose a new approximation of the stochastic master equation model that is able to capture the finer details of the modeled system, including bistabilities and oscillatory behavior, and yet has a significantly lower computational complexity. In this new method, we represent the system using tensors and derive an identity to exploit the sparse connectivity of regulatory targets for complexity reduction. The algorithm involves an approximation based on the Zassenhaus formula to represent the exponential of a sum of matrices as a product of matrices. We derive upper bounds on the expected error of the proposed model distribution as compared to the stochastic master equation model distribution. Simulation results of the application of the model to four different biological benchmark systems illustrate performance comparable to detailed stochastic master equation models but with considerably lower computational complexity. The results also demonstrate the reduced complexity of the new approach as compared to the commonly used Stochastic Simulation Algorithm for equivalent accuracy.
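    The core idea, writing the exponential of a sum of matrices as a product of matrix exponentials, can be illustrated with the first-order (Lie/Zassenhaus) splitting exp(t(A+B)) ≈ exp(tA)exp(tB), whose error scales roughly as t² when A and B do not commute. A toy sketch with hypothetical 2×2 matrices, not the tensor construction of the paper:

```python
import numpy as np

def expm(M, terms=30):
    """Matrix exponential via a truncated Taylor series (adequate for
    the small, small-norm matrices used in this sketch)."""
    result = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k          # accumulates M**k / k!
        result = result + term
    return result

# Hypothetical non-commuting 2x2 matrices
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0, 0.0], [1.0, 0.0]])

def split_error(t):
    """Max entrywise error of the first-order splitting
    exp(t(A+B)) ~ exp(tA) @ exp(tB)."""
    exact = expm(t * (A + B))
    approx = expm(t * A) @ expm(t * B)
    return np.abs(exact - approx).max()

# Halving t reduces the error roughly fourfold (O(t^2) behavior).
```

    Higher-order Zassenhaus corrections involve commutator terms and shrink the error further, which is what makes the product form a controllable approximation.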

  20. Priority of a Hesitant Fuzzy Linguistic Preference Relation with a Normal Distribution in Meteorological Disaster Risk Assessment.

    PubMed

    Wang, Lihong; Gong, Zaiwu

    2017-10-10

    As meteorological disaster systems are large complex systems, disaster reduction programs must be based on risk analysis. Consequently, judgment by an expert based on his or her experience (also known as qualitative evaluation) is an important link in meteorological disaster risk assessment. In some complex and non-procedural meteorological disaster risk assessments, a hesitant fuzzy linguistic preference relation (HFLPR) is often used to deal with a situation in which experts may be hesitant while providing preference information of a pairwise comparison of alternatives, that is, the degree of preference of one alternative over another. This study explores hesitation from the perspective of statistical distributions, and obtains an optimal ranking of an HFLPR based on chance-restricted programming, which provides a new approach for hesitant fuzzy optimisation of decision-making in meteorological disaster risk assessments.

  1. Modeling complex systems in the geosciences

    NASA Astrophysics Data System (ADS)

    Balcerak, Ernie

    2013-03-01

    Many geophysical phenomena can be described as complex systems, involving phenomena such as extreme or "wild" events that often do not follow the Gaussian distribution that would be expected if the events were simply random and uncorrelated. For instance, some geophysical phenomena like earthquakes show a much higher occurrence of relatively large values than would a Gaussian distribution and so are examples of the "Noah effect" (named by Benoit Mandelbrot for the exceptionally heavy rain in the biblical flood). Other geophysical phenomena are examples of the "Joseph effect," in which a state is especially persistent, such as a spell of multiple consecutive hot days (heat waves) or several dry summers in a row. The Joseph effect was named after the biblical story in which Joseph's dream of seven fat cows and seven thin ones predicted 7 years of plenty followed by 7 years of drought.
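    The "Noah effect" can be made concrete by comparing tail probabilities: for a large threshold, a power-law (Pareto) distribution assigns vastly more probability to extreme events than a Gaussian. A small sketch with illustrative parameter values:

```python
import math

def gaussian_tail(k):
    """P(X > k) for a standard normal variable."""
    return 0.5 * math.erfc(k / math.sqrt(2.0))

def pareto_tail(k, alpha=1.5, xmin=1.0):
    """P(X > k) for a Pareto (power-law) variable with tail exponent alpha."""
    return (xmin / k) ** alpha

# A far-out event (threshold 6 in standardized units) is negligible
# under the Gaussian but quite plausible under the power law.
wild = pareto_tail(6.0)
tame = gaussian_tail(6.0)
```

    Under the Gaussian a 6-sigma exceedance has probability on the order of 10⁻⁹, while the power law with exponent 1.5 gives several percent, a difference of many orders of magnitude.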

  2. Experimental quantum fingerprinting with weak coherent pulses

    PubMed Central

    Xu, Feihu; Arrazola, Juan Miguel; Wei, Kejin; Wang, Wenyuan; Palacios-Avila, Pablo; Feng, Chen; Sajeed, Shihan; Lütkenhaus, Norbert; Lo, Hoi-Kwong

    2015-01-01

    Quantum communication holds the promise of creating disruptive technologies that will play an essential role in future communication networks. For example, the study of quantum communication complexity has shown that quantum communication allows exponential reductions in the information that must be transmitted to solve distributed computational tasks. Recently, protocols that realize this advantage using optical implementations have been proposed. Here we report a proof-of-concept experimental demonstration of a quantum fingerprinting system that is capable of transmitting less information than the best-known classical protocol. Our implementation is based on a modified version of a commercial quantum key distribution system using off-the-shelf optical components over telecom wavelengths, and is practical for messages as large as 100 Mbits, even in the presence of experimental imperfections. Our results provide a first step in the development of experimental quantum communication complexity. PMID:26515586

  3. Experimental quantum fingerprinting with weak coherent pulses.

    PubMed

    Xu, Feihu; Arrazola, Juan Miguel; Wei, Kejin; Wang, Wenyuan; Palacios-Avila, Pablo; Feng, Chen; Sajeed, Shihan; Lütkenhaus, Norbert; Lo, Hoi-Kwong

    2015-10-30

    Quantum communication holds the promise of creating disruptive technologies that will play an essential role in future communication networks. For example, the study of quantum communication complexity has shown that quantum communication allows exponential reductions in the information that must be transmitted to solve distributed computational tasks. Recently, protocols that realize this advantage using optical implementations have been proposed. Here we report a proof-of-concept experimental demonstration of a quantum fingerprinting system that is capable of transmitting less information than the best-known classical protocol. Our implementation is based on a modified version of a commercial quantum key distribution system using off-the-shelf optical components over telecom wavelengths, and is practical for messages as large as 100 Mbits, even in the presence of experimental imperfections. Our results provide a first step in the development of experimental quantum communication complexity.

  4. Experimental quantum fingerprinting with weak coherent pulses

    NASA Astrophysics Data System (ADS)

    Xu, Feihu; Arrazola, Juan Miguel; Wei, Kejin; Wang, Wenyuan; Palacios-Avila, Pablo; Feng, Chen; Sajeed, Shihan; Lütkenhaus, Norbert; Lo, Hoi-Kwong

    2015-10-01

    Quantum communication holds the promise of creating disruptive technologies that will play an essential role in future communication networks. For example, the study of quantum communication complexity has shown that quantum communication allows exponential reductions in the information that must be transmitted to solve distributed computational tasks. Recently, protocols that realize this advantage using optical implementations have been proposed. Here we report a proof-of-concept experimental demonstration of a quantum fingerprinting system that is capable of transmitting less information than the best-known classical protocol. Our implementation is based on a modified version of a commercial quantum key distribution system using off-the-shelf optical components over telecom wavelengths, and is practical for messages as large as 100 Mbits, even in the presence of experimental imperfections. Our results provide a first step in the development of experimental quantum communication complexity.

  5. A new method for predicting response in complex linear systems. II. [under random or deterministic steady state excitation

    NASA Technical Reports Server (NTRS)

    Bogdanoff, J. L.; Kayser, K.; Krieger, W.

    1977-01-01

    The paper describes convergence and response studies in the low frequency range of complex systems, particularly with low values of damping of different distributions, and reports on the modification of the relaxation procedure required under these conditions. A new method is presented for response estimation in complex lumped parameter linear systems under random or deterministic steady state excitation. The essence of the method is the use of relaxation procedures with a suitable error function to find the estimated response; natural frequencies and normal modes are not computed. For a 45 degree of freedom system, and two relaxation procedures, convergence studies and frequency response estimates were performed. The low frequency studies are considered in the framework of earlier studies (Kayser and Bogdanoff, 1975) involving the mid to high frequency range.

  6. Interpreting Popov criteria in Lur'e systems with complex scaling stability analysis

    NASA Astrophysics Data System (ADS)

    Zhou, J.

    2018-06-01

    The paper presents a novel frequency-domain interpretation of Popov criteria for absolute stability in Lur'e systems by means of what we call complex scaling stability analysis. The complex scaling technique is developed for exponential/asymptotic stability in LTI feedback systems, and it dispenses with open-loop pole distribution, contour/locus orientation and prior frequency sweeping. Exploiting the technique to alternatively reveal positive realness of transfer functions, the re-interpretation of Popov criteria is explicated. More specifically, the suggested frequency-domain stability conditions are conformable in both the scalar and multivariable cases, and can be implemented either graphically with locus plotting or numerically without; in particular, the latter is suitable as a design tool with auxiliary parameter freedom. The interpretation also reveals further frequency-domain facts about Lur'e systems. Numerical examples are included to illustrate the main results.

  7. Estimates of water source contributions in a dynamic urban water supply system inferred via a Bayesian stable isotope mixing model

    NASA Astrophysics Data System (ADS)

    Jameel, M. Y.; Brewer, S.; Fiorella, R.; Tipple, B. J.; Bowen, G. J.; Terry, S.

    2017-12-01

    Public water supply systems (PWSS) are complex distribution systems and critical infrastructure, making them vulnerable to physical disruption and contamination. Exploring the susceptibility of PWSS to such perturbations requires detailed knowledge of the supply system structure and operation. Although the physical structure of supply systems (i.e., pipeline connections) is usually well documented for developed cities, the actual flow patterns of water in these systems are typically unknown or estimated based on hydrodynamic models with limited observational validation. Here, we present a novel method for mapping the flow structure of water in a large, complex PWSS, building upon recent work highlighting the potential of stable isotopes of water (SIW) to document water management practices within complex PWSS. We sampled a major water distribution system of the Salt Lake Valley, Utah, measuring SIW of water sources, treatment facilities, and numerous sites within the supply system. We then developed a hierarchical Bayesian (HB) isotope mixing model to quantify the proportion of water supplied by different sources at sites within the supply system. Known production volumes and spatial distance effects were used to define the prior probabilities for each source; however, we did not include other physical information about the supply system. Our results were in general agreement with those obtained by hydrodynamic models and provide quantitative estimates of the contributions of different water sources to a given site, along with robust estimates of uncertainty. Secondary properties of the supply system, such as regions of "static" and "dynamic" sources (e.g., regions supplied dominantly by one source vs. those experiencing active mixing between multiple sources), can be inferred from the results. The HB isotope mixing model offers a new investigative technique for analyzing PWSS and documenting aspects of supply system structure and operation that are otherwise challenging to observe. The method could allow water managers to document spatiotemporal variation in PWSS flow patterns, which is critical for interrogating the distribution system to inform operational decision making or disaster response, optimize water supply, and monitor and enforce water rights.
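    The full hierarchical Bayesian model infers source contributions with uncertainty; its deterministic backbone in the simplest case is a two-end-member isotope mass balance. A sketch with hypothetical delta-18O values (the numbers are illustrative, not data from the study):

```python
def source_fraction(d_mix, d_a, d_b):
    """Two-end-member mass balance for one isotope tracer:
    d_mix = f * d_a + (1 - f) * d_b  =>  f = (d_mix - d_b) / (d_a - d_b)."""
    return (d_mix - d_b) / (d_a - d_b)

# Hypothetical delta-18O values (per mil) for two supply sources and a tap
delta_mountain = -16.0   # e.g., a mountain stream source
delta_well = -12.0       # e.g., a groundwater well source
delta_tap = -15.0        # a sampled site within the distribution system

f_mountain = source_fraction(delta_tap, delta_mountain, delta_well)  # 0.75
```

    The HB model generalizes this balance to many sources and sites, replacing the point estimate with a posterior distribution over the fractions.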

  8. Enhanced cellulose orientation analysis in complex model plant tissues.

    PubMed

    Rüggeberg, Markus; Saxe, Friederike; Metzger, Till H; Sundberg, Björn; Fratzl, Peter; Burgert, Ingo

    2013-09-01

    The orientation distribution of cellulose microfibrils in the plant cell wall is a key parameter for understanding anisotropic plant growth and mechanical behavior. However, precisely visualizing cellulose orientation in the plant cell wall has long been a challenge due to the small size of the cellulose microfibrils and the complex network of polymers in the plant cell wall. X-ray diffraction is one of the most frequently used methods for analyzing cellulose orientation in single cells and plant tissues, but the interpretation of the diffraction images is complex. Traditionally, circular or square cells and Gaussian orientation of the cellulose microfibrils have been assumed to elucidate cellulose orientation from the diffraction images. However, the complex tissue structures of common model plant systems such as Arabidopsis or aspen (Populus) require a more sophisticated approach. We present an evaluation procedure which takes into account the precise cell geometry and is able to deal with complex microfibril orientation distributions. The evaluation procedure reveals the entire orientation distribution of the cellulose microfibrils, reflecting different orientations within the multi-layered cell wall. By analyzing aspen wood and Arabidopsis stems we demonstrate the versatility of this method and show that simplifying assumptions on geometry and orientation distributions can lead to errors in the calculated microfibril orientation pattern. The simulation routine is intended to be used as a valuable tool for nanostructural analysis of plant cell walls and is freely available from the authors on request. Copyright © 2013 Elsevier Inc. All rights reserved.

  9. Design and Simulation of Material-Integrated Distributed Sensor Processing with a Code-Based Agent Platform and Mobile Multi-Agent Systems

    PubMed Central

    Bosse, Stefan

    2015-01-01

    Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks into simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code that stores both the control and the data state of an agent, which is novel. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to small agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri nets. The agent platform can also be implemented in software, offering compatibility at the operational and code levels and supporting agent processing in strongly heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques. PMID:25690550

  10. Design and simulation of material-integrated distributed sensor processing with a code-based agent platform and mobile multi-agent systems.

    PubMed

    Bosse, Stefan

    2015-02-16

    Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks into simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code that stores both the control and the data state of an agent, which is novel. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to small agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri nets. The agent platform can also be implemented in software, offering compatibility at the operational and code levels and supporting agent processing in strongly heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques.

  11. An in vitro study of interactions between insulin-mimetic zinc(II) complexes and selected plasma components.

    PubMed

    Enyedy, Eva Anna; Horváth, László; Gajda-Schrantz, Krisztina; Galbács, Gábor; Kiss, Tamás

    2006-12-01

    The speciation of some potent insulin-mimetic zinc(II) complexes of bidentate ligands (maltol and 1,2-dimethyl-3-hydroxypyridinone, with (O,O) coordination, and picolinic acid, with an (N,O) coordination mode) was studied via solution equilibrium investigations of ternary complex formation in the presence of small relevant bioligands of the blood serum such as cysteine, histidine and citric acid. The results show that formation of the ternary complexes, especially with cysteine, is favoured in the physiological pH range in almost all systems studied. Besides these low molecular mass binders, serum proteins, among others albumin and transferrin, can bind zinc(II) or its complexes. Accordingly, the distribution of zinc(II) between the small and high molecular mass fractions of the serum was also studied by ultrafiltration. Modelling calculations relating to the distribution of zinc(II), using the stability constants of the ternary complexes studied and those of the serum proteins reported in the literature, confirmed the ultrafiltration results, namely, the primary role of albumin in zinc(II) binding among the low and high molecular mass components of the serum.

  12. Unlimited multistability in multisite phosphorylation systems.

    PubMed

    Thomson, Matthew; Gunawardena, Jeremy

    2009-07-09

    Reversible phosphorylation on serine, threonine and tyrosine is the most widely studied posttranslational modification of proteins. The number of phosphorylated sites on a protein (n) shows a significant increase from prokaryotes to eukaryotes, with examples having n ≥ 150 sites. Multisite phosphorylation has many roles and site conservation indicates that increasing numbers of sites cannot be due merely to promiscuous phosphorylation. A substrate with n sites has an exponential number (2^n) of phospho-forms and individual phospho-forms may have distinct biological effects. The distribution of these phospho-forms and how this distribution is regulated have remained unknown. Here we show that, when a kinase and a phosphatase act in opposition on a multisite substrate, the system can exhibit distinct stable phospho-form distributions at steady state and that the maximum number of such distributions increases with n. Whereas some stable distributions are focused on a single phospho-form, others are more diffuse, giving the phospho-proteome the potential to behave as a fluid regulatory network able to encode information and flexibly respond to varying demands. Such plasticity may underlie complex information processing in eukaryotic cells and suggests a functional advantage in having many sites. Our results follow from the unusual geometry of the steady-state phospho-form concentrations, which we show to constitute a rational algebraic curve, irrespective of n. We thereby reduce the complexity of calculating steady states from simulating 3 × 2^n differential equations to solving two algebraic equations, while treating parameters symbolically. We anticipate that these methods can be extended to systems with multiple substrates and multiple enzymes catalysing different modifications, as found in posttranslational modification 'codes' such as the histone code. Whereas simulations struggle with exponentially increasing molecular complexity, mathematical methods of the kind developed here can provide a new language in which to articulate the principles of cellular information processing.
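    The combinatorics behind the argument are easy to make concrete: a substrate with n sites has 2^n phospho-forms, which is why direct simulation of the full system of rate equations becomes infeasible as n grows. A minimal enumeration sketch:

```python
from itertools import product

def phospho_forms(n):
    """All 2**n phospho-forms of a substrate with n sites; each site is
    either unphosphorylated (0) or phosphorylated (1)."""
    return list(product((0, 1), repeat=n))

forms = phospho_forms(4)      # 16 phospho-forms for a 4-site substrate
```

    Even a modest n = 20 already yields over a million phospho-forms, which is the scale at which the algebraic reduction described in the abstract pays off.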

  13. Resilient Core Networks for Energy Distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuntze, Nicolai; Rudolph, Carsten; Leivesley, Sally

    2014-07-28

    Substations and their control are crucial for the availability of electricity in today's energy distribution. Advanced energy grids with Distributed Energy Resources require higher complexity in substations, distributed functionality and communication between devices inside substations and between substations. Also, substations include more and more intelligent devices and ICT-based systems. All these devices are connected to other systems by different types of communication links or are situated in uncontrolled environments. Therefore, the risk of ICT-based attacks on energy grids is growing. Consequently, security measures to counter these risks need to be an intrinsic part of energy grids. This paper introduces the concept of a Resilient Core Network to interconnect substations. This core network provides essential security features, enables fast detection of attacks and allows for a distributed and autonomous mitigation of ICT-based risks.

  14. Complex systems and the technology of variability analysis

    PubMed Central

    Seely, Andrew JE; Macklem, Peter T

    2004-01-01

    Characteristic patterns of variation over time, namely rhythms, represent a defining feature of complex systems, one that is synonymous with life. Despite the intrinsic dynamic, interdependent and nonlinear relationships of their parts, complex biological systems exhibit robust systemic stability. Applied to critical care, it is the systemic properties of the host response to a physiological insult that manifest as health or illness and determine outcome in our patients. Variability analysis provides a novel technology with which to evaluate the overall properties of a complex system. This review highlights the means by which we scientifically measure variation, including analyses of overall variation (time domain analysis, frequency distribution, spectral power), frequency contribution (spectral analysis), scale invariant (fractal) behaviour (detrended fluctuation and power law analysis) and regularity (approximate and multiscale entropy). Each technique is presented with a definition, interpretation, clinical application, advantages, limitations and summary of its calculation. The ubiquitous association between altered variability and illness is highlighted, followed by an analysis of how variability analysis may significantly improve prognostication of severity of illness and guide therapeutic intervention in critically ill patients. PMID:15566580
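As an illustration of one regularity measure named in the review above, here is a minimal approximate-entropy (ApEn) sketch. It is a simplified textbook-style implementation, not the review's own code; the conventional choices m = 2 and r = 0.2 times the standard deviation are assumed.

```python
import numpy as np

def approx_entropy(x, m=2, r=None):
    """Approximate entropy of a 1-D series: a minimal sketch of one of the
    regularity measures surveyed above. r defaults to 0.2 * std(x)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    def phi(m):
        n = len(x) - m + 1
        # embed the series into overlapping templates of length m
        emb = np.array([x[i:i + m] for i in range(n)])
        # Chebyshev distance between every pair of templates
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = (d <= r).sum(axis=1) / n
        return np.log(c).mean()
    return phi(m) - phi(m + 1)

rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 20 * np.pi, 400))
noisy = rng.standard_normal(400)
print(approx_entropy(regular), approx_entropy(noisy))
```

A periodic signal scores low (highly predictable) and white noise scores high, which is the basic contrast these measures exploit clinically.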

  15. Vlasov dynamics of periodically driven systems

    NASA Astrophysics Data System (ADS)

    Banerjee, Soumyadip; Shah, Kushal

    2018-04-01

    Analytical solutions of the Vlasov equation for periodically driven systems are of importance in several areas of plasma physics and dynamical systems and are usually approximated using ponderomotive theory. In this paper, we derive the plasma distribution function predicted by ponderomotive theory using Hamiltonian averaging theory and compare it with solutions obtained by the method of characteristics. Our results show that though ponderomotive theory is relatively much easier to use, its predictions are very restrictive and are likely to be very different from the actual distribution function of the system. We also analyse all possible initial conditions which lead to periodic solutions of the Vlasov equation for periodically driven systems and conjecture that the irreducible polynomial corresponding to the initial condition must only have squares of the spatial and momentum coordinate. The resulting distribution function for other initial conditions is aperiodic and can lead to complex relaxation processes within the plasma.
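The method of characteristics mentioned above can be sketched for one especially simple, hypothetical special case (not the paper's general setting): a spatially uniform drive E(t) = E0 cos(wt) with q/m = 1. The characteristics give v(t) = v0 + (E0/w) sin(wt), so an initial Maxwellian is rigidly shifted in velocity and returns to itself after every drive period, i.e. a periodic Vlasov solution.

```python
import numpy as np

# Sketch: for a uniform drive E(t) = E0*cos(w*t), the Vlasov equation reduces
# to df/dt + E(t)*df/dv = 0, whose characteristics satisfy dv/dt = E(t),
# giving v(t) = v0 + (E0/w)*sin(w*t). The solution is the initial Maxwellian
# shifted along the characteristic, hence periodic with the drive period.

E0, w = 0.3, 2.0

def f(v, t):
    shift = (E0 / w) * np.sin(w * t)
    return np.exp(-(v - shift) ** 2 / 2.0) / np.sqrt(2.0 * np.pi)

v = np.linspace(-5.0, 5.0, 1001)
T = 2.0 * np.pi / w
print(np.max(np.abs(f(v, T) - f(v, 0.0))))  # ~0: periodic in time
```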

  16. Deceit: A flexible distributed file system

    NASA Technical Reports Server (NTRS)

    Siegel, Alex; Birman, Kenneth; Marzullo, Keith

    1989-01-01

    Deceit, a distributed file system (DFS) being developed at Cornell, focuses on flexible file semantics in relation to efficiency, scalability, and reliability. Deceit servers are interchangeable and collectively provide the illusion of a single, large server machine to any clients of the Deceit service. Non-volatile replicas of each file are stored on a subset of the file servers. The user is able to set parameters on a file to achieve different levels of availability, performance, and one-copy serializability. Deceit also supports a file version control mechanism. In contrast with many recent DFS efforts, Deceit can behave like a plain Sun Network File System (NFS) server and can be used by any NFS client without modifying any client software. The current Deceit prototype uses the ISIS Distributed Programming Environment for all communication and process group management, an approach that reduces system complexity and increases system robustness.

  17. Temporal complexity in emission from Anderson localized lasers

    NASA Astrophysics Data System (ADS)

    Kumar, Randhir; Balasubrahmaniyam, M.; Alee, K. Shadak; Mujumdar, Sushil

    2017-12-01

    Anderson localization lasers exploit resonant cavities formed due to structural disorder. The inherent randomness in the structure of these cavities realizes a probability distribution in all cavity parameters such as quality factors, mode volumes, mode structures, and so on, implying resultant statistical fluctuations in the temporal behavior. Here we provide direct experimental measurements of temporal width distributions of Anderson localization lasing pulses in intrinsically and extrinsically disordered coupled-microresonator arrays. We first illustrate signature exponential decays in the spatial intensity distributions of the lasing modes that quantify their localized character, and then measure the temporal width distributions of the pulsed emission over several configurations. We observe a dependence of temporal widths on the disorder strength, wherein the widths show a single-peaked, left-skewed distribution in extrinsic disorder and a dual-peaked distribution in intrinsic disorder. We propose a model based on coupled rate equations for an emitter and an Anderson cavity with a random mode structure, which gives excellent quantitative and qualitative agreement with the experimental observations. The experimental and theoretical analyses bring to the fore the temporal complexity in Anderson-localization-based lasing systems.

  18. Interactions and reversal-field memory in complex magnetic nanowire arrays

    NASA Astrophysics Data System (ADS)

    Rotaru, Aurelian; Lim, Jin-Hee; Lenormand, Denny; Diaconu, Andrei; Wiley, John B.; Postolache, Petronel; Stancu, Alexandru; Spinu, Leonard

    2011-10-01

    Interactions and magnetization reversal of Ni nanowire arrays have been investigated by the first-order reversal curve (FORC) method. Several series of samples with controlled spatial distribution were considered including simple wires of different lengths and diameters (70 and 110 nm) and complex wires with a single modulated diameter along their length. Subtle features of magnetic interactions are revealed through a quantitative analysis of the local interaction field profile distributions obtained from the FORC method. In addition, the FORC analysis indicates that the nanowire systems with a mean diameter of 70 nm appear to be organized in symmetric clusters indicative of a reversal-field memory effect.
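The FORC method used above centers on the distribution rho(Ha, Hb) = -(1/2) d²M/(dHa dHb), the mixed second derivative of magnetization M over reversal field Ha and applied field Hb. The sketch below evaluates it numerically on a synthetic, single-hysteron-like surface (illustrative data, not the paper's measurements).

```python
import numpy as np

# Sketch of the FORC distribution: rho = -(1/2) * d2M/(dHa dHb), computed by
# two successive numerical gradients over a synthetic magnetization surface.

Ha, Hb = np.meshgrid(np.linspace(-1, 0, 101),
                     np.linspace(-1, 1, 201), indexing="ij")
M = np.tanh(5.0 * (Hb - Ha - 0.5))     # toy magnetization M(Ha, Hb)

dM_dHb = np.gradient(M, Hb[0], axis=1)         # derivative along Hb
d2M = np.gradient(dM_dHb, Ha[:, 0], axis=0)    # then along Ha
rho = -0.5 * d2M
print(rho.shape)
```

In practice the derivative is taken after local polynomial smoothing of measured curves; the plain gradients here are only meant to show the structure of the calculation.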

  19. Improving the Aircraft Design Process Using Web-Based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)

    2000-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  20. Improving the Aircraft Design Process Using Web-based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.

    2003-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  1. Characterization of time-resolved fluorescence response measurements for distributed optical-fiber sensing.

    PubMed

    Sinchenko, Elena; Gibbs, W E Keith; Davis, Claire E; Stoddart, Paul R

    2010-11-20

    A distributed optical-fiber sensing system based on pulsed excitation and time-gated photon counting has been used to locate a fluorescent region along the fiber. The complex Alq3 and the infrared dye IR-125 were examined with 405 and 780 nm excitation, respectively. A model to characterize the response of the distributed fluorescence sensor to a Gaussian input pulse was developed and tested. Analysis of the Alq3 fluorescent response confirmed the validity of the model and enabled the fluorescence lifetime to be determined. The intrinsic lifetime obtained (18.2±0.9 ns) is in good agreement with published data. The decay rate was found to be proportional to concentration, which is indicative of collisional deactivation. The model allows the spatial resolution of a distributed sensing system to be improved for fluorophores with lifetimes that are longer than the resolution of the sensing system.
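The sensor response model described above can be sketched as the convolution of a Gaussian excitation pulse with an exponential decay; the lifetime then appears as the log-slope of the tail. The pulse width and timing below are hypothetical, only the 18.2 ns Alq3 lifetime is taken from the abstract.

```python
import numpy as np

# Sketch (hypothetical pulse parameters): fluorescence response modelled as a
# Gaussian excitation pulse convolved with an exponential decay, then the
# lifetime recovered from the log-slope of the tail, well after the pulse.

t = np.linspace(0.0, 200.0, 4001)     # ns
tau = 18.2                            # ns, lifetime reported for Alq3
sigma = 2.0                           # ns, assumed Gaussian pulse width
pulse = np.exp(-0.5 * ((t - 20.0) / sigma) ** 2)
decay = np.exp(-t / tau)
response = np.convolve(pulse, decay)[: t.size] * (t[1] - t[0])

tail = (t > 60) & (t < 150)           # region where the pulse has fully decayed
slope = np.polyfit(t[tail], np.log(response[tail]), 1)[0]
print(-1.0 / slope)                   # recovered lifetime, ~18.2 ns
```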

  2. Adaptation of Fusarium oxysporum and Fusarium dimerum to the specific aquatic environment provided by the water systems of hospitals.

    PubMed

    Steinberg, Christian; Laurent, Julie; Edel-Hermann, Véronique; Barbezant, Marie; Sixt, Nathalie; Dalle, Frédéric; Aho, Serge; Bonnin, Alain; Hartemann, Philippe; Sautour, Marc

    2015-06-01

    Members of the Fusarium group were recently detected in water distribution systems of several hospitals in the world. An epidemiological investigation was conducted over 2 years in hospital buildings in Dijon and Nancy (France) and in non-hospital buildings in Dijon. The fungi were detected only within the water distribution systems of the hospital buildings and also, but at very low concentrations, in the urban water network of Nancy. All fungi were identified as Fusarium oxysporum species complex (FOSC) and Fusarium dimerum species complex (FDSC) by sequencing part of the translation elongation factor 1-alpha (TEF-1α) gene. Very low diversity was found in each complex, suggesting the existence of a clonal population for each. Density and heterogeneous distributions according to buildings and variability over time were explained by episodic detachments of parts of the colony from biofilms in the pipes. Isolates of these waterborne populations as well as soilborne isolates were tested for their ability to grow in liquid medium in the presence of increasing concentrations of sodium hypochlorite, copper sulfate and anti-corrosion pipe coating, at various temperatures (4-42 °C) and on agar medium with amphotericin B and voriconazole. The waterborne isolates tolerated higher sodium hypochlorite and copper sulfate concentrations and temperatures than did soilborne isolates but did not show any specific resistance to fungicides. In addition, unlike waterborne isolates, soilborne isolates did not survive in water even when supplemented with glucose, while the former developed in the soil as well as the soilborne isolates did. We concluded that there exist homogeneous populations of FOSC and FDSC common to all contaminated hospital sites.
These populations are present at very low densities in natural waters, making them difficult to detect, but they are adapted to the specific conditions offered by the complex water systems of public hospitals in Dijon and Nancy and probably other localities in the world. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. The Complex Refractive Index of Volcanic Ash Aerosol Retrieved From Spectral Mass Extinction

    NASA Astrophysics Data System (ADS)

    Reed, Benjamin E.; Peters, Daniel M.; McPheat, Robert; Grainger, R. G.

    2018-01-01

    The complex refractive indices of eight volcanic ash samples, chosen to have a representative range of SiO2 contents, were retrieved from simultaneous measurements of their spectral mass extinction coefficient and size distribution. The mass extinction coefficients, at 0.33-19 μm, were measured using two optical systems: a Fourier transform spectrometer in the infrared and two diffraction grating spectrometers covering visible and ultraviolet wavelengths. The particle size distribution was measured using a scanning mobility particle sizer and an optical particle counter; values for the effective radius of ash particles measured in this study varied from 0.574 to 1.16 μm. Verification retrievals on high-purity silica aerosol demonstrated that the Rayleigh continuous distribution of ellipsoids (CDEs) scattering model significantly outperformed Mie theory in retrieving the complex refractive index, when compared to literature values. Assuming the silica particles provided a good analogue of volcanic ash, the CDE scattering model was applied to retrieve the complex refractive index of the eight ash samples. The Lorentz formulation of the complex refractive index was used within the retrievals as a convenient way to ensure consistency with the Kramers-Kronig relation. The short-wavelength limit of the electric susceptibility was constrained by using independently measured reference values of the complex refractive index of the ash samples at a visible wavelength. The retrieved values of the complex refractive indices of the ash samples showed considerable variation, highlighting the importance of using accurate refractive index data in ash cloud radiative transfer models.
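The Lorentz formulation mentioned above expresses the dielectric function as a sum of damped oscillators, which keeps the derived complex refractive index consistent with the Kramers-Kronig relation by construction. The sketch below uses illustrative oscillator parameters, not the retrieved ash values.

```python
import numpy as np

# Sketch of the Lorentz oscillator model: eps(w) = eps_inf +
# sum_j S_j*w0_j**2 / (w0_j**2 - w**2 - i*g_j*w), m = sqrt(eps).
# Oscillator parameters here are illustrative only.

def lorentz_m(omega, eps_inf, bands):
    """Complex refractive index m = n + ik from Lorentz oscillators.
    bands: list of (strength S, resonance w0, damping g) tuples."""
    eps = eps_inf + sum(
        S * w0 ** 2 / (w0 ** 2 - omega ** 2 - 1j * g * omega)
        for S, w0, g in bands
    )
    return np.sqrt(eps)

omega = np.linspace(0.1, 3.0, 300)                  # arbitrary frequency units
m = lorentz_m(omega, eps_inf=2.25, bands=[(0.5, 1.0, 0.1)])
print(m.real.max(), m.imag.max())                   # absorption peaks near w0
```

With this sign convention the imaginary part of eps is non-negative, so the model cannot produce unphysical negative absorption, which is the consistency property the retrieval exploits.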

  4. Complexity, Robustness, and Multistability in Network Systems with Switching Topologies: A Hierarchical Hybrid Control Approach

    DTIC Science & Technology

    2015-05-22

    sensor networks for managing power levels of wireless networks; air and ground transportation systems for air traffic control and payload transport and... network systems, large-scale systems, adaptive control, discontinuous systems... cover a broad spectrum of applications including cooperative control of unmanned air vehicles, autonomous underwater vehicles, distributed sensor

  5. Attention and its role in the operator's work. [control stability in man machine systems

    NASA Technical Reports Server (NTRS)

    Shvetsov, O. P.

    1975-01-01

    A complex attentiometer investigation of the distribution, redirection, and concentration of attention during an operator's work notes the following stages: (1) General attentiveness is still not adequately expressed in the beginning; and (2) operator self-control of actions develops and gradually decreases errors in redirecting and distributing attention. A definite relationship is found between the improvement of concentration, distribution and redirection of attention and automation of sensorimotor performance. Exercises prove less effective in redirection of attention.

  6. Manganese As a Metal Accumulator

    EPA Science Inventory

    Manganese deposits in water distribution systems accumulate metals, radionuclides and oxyanions by a combination of surface complexation, adsorption and solid substitution, as well as a combination of oxidation followed by manganese reduction and sorption of the oxidized constitu...

  7. 3D beam shape estimation based on distributed coaxial cable interferometric sensor

    NASA Astrophysics Data System (ADS)

    Cheng, Baokai; Zhu, Wenge; Liu, Jie; Yuan, Lei; Xiao, Hai

    2017-03-01

    We present a coaxial cable interferometer based distributed sensing system for 3D beam shape estimation. By making a series of reflectors on a coaxial cable, multiple Fabry-Perot cavities are created on it. Two cables are mounted on the beam at proper locations, and a vector network analyzer (VNA) is connected to them to obtain the complex reflection signal, which is used to calculate the strain distribution of the beam in horizontal and vertical planes. With 6 GHz swept bandwidth on the VNA, the spatial resolution for distributed strain measurement is 0.1 m, and the sensitivity is 3.768 MHz/mɛ at the interferogram dip near 3.3 GHz. Using a displacement-strain transformation, the shape of the beam is reconstructed. With only two modified cables and a VNA, this system is easy to implement and manage. Compared to optical fiber based sensor systems, the coaxial cable sensors have the advantages of large strain range and robustness, making this system suitable for structural health monitoring applications.
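The displacement-strain transformation mentioned above can be sketched under simple Euler-Bernoulli beam assumptions: surface strain eps(x) = c * w''(x) at distance c from the neutral axis, so the deflection w(x) follows from double integration of the measured strain. All parameters below are hypothetical, not the paper's.

```python
import numpy as np

# Sketch of beam shape reconstruction from distributed strain: integrate
# eps(x)/c twice (trapezoidal rule) with clamped-end conditions w(0)=w'(0)=0.
# Numbers are illustrative only.

x = np.linspace(0.0, 1.0, 11)          # m, strain sample points (0.1 m spacing)
c = 0.005                              # m, neutral axis to surface distance
w_true = 0.01 * x ** 2                 # assumed quadratic deflection shape
eps = c * np.full_like(x, 0.02)        # w'' = 0.02 -> uniform surface strain

# first integration: slope w'(x); second integration: deflection w(x)
slope = np.concatenate(
    ([0.0], np.cumsum(0.5 * (eps[1:] + eps[:-1]) * np.diff(x)))) / c
w = np.concatenate(
    ([0.0], np.cumsum(0.5 * (slope[1:] + slope[:-1]) * np.diff(x))))
print(np.max(np.abs(w - w_true)))      # reconstruction error, ~0
```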

  8. 121. VIEW OF CABINETS ON WEST SIDE OF LANDLINE INSTRUMENTATION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    121. VIEW OF CABINETS ON WEST SIDE OF LANDLINE INSTRUMENTATION ROOM (206), LSB (BLDG. 751). FEATURES LEFT TO RIGHT: FACILITY DISTRIBUTION CONSOLE FOR WATER CONTROL SYSTEMS, PROPULSION ELECTRICAL CHECKOUT SYSTEM (PECOS), LOGIC CONTROL AND MONITOR UNITS FOR BOOSTER AND FUEL SYSTEMS. - Vandenberg Air Force Base, Space Launch Complex 3, Launch Pad 3 East, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

  9. Predictors of Interpersonal Trust in Virtual Distributed Teams

    DTIC Science & Technology

    2008-09-01

    understand systems that are very complex in nature. Such understanding is essential to facilitate building or maintaining operators’ mental models of the...a significant impact on overall system performance. Specifically, the level of automation that combined human generation of options with computer...and/or computer servers had a significant impact on automated system performance. Additionally, Parasuraman, Sheridan, & Wickens (2000) proposed

  10. Use and Distribution of Rehabilitation Services: A Register Linkage Study in One Hospital District Area in Finland

    ERIC Educational Resources Information Center

    Pulkki, Jutta Maarit; Rissanen, Pekka; Raitanen, Jani A.; Viitanen, Elina A.

    2011-01-01

    This study focuses on a large set of rehabilitation services used between 2004 and 2005 in one hospital district area in Finland. The rehabilitation system consists of several subsystems. This complex system is suggested to produce arbitrary rehabilitation services. Despite the criticisms against the system during decades, no attempts have been…

  11. Rotation And Scale Invariant Object Recognition Using A Distributed Associative Memory

    NASA Astrophysics Data System (ADS)

    Wechsler, Harry; Zimmerman, George Lee

    1988-04-01

    This paper describes an approach to 2-dimensional object recognition. Complex-log conformal mapping is combined with a distributed associative memory to create a system which recognizes objects regardless of changes in rotation or scale. Recalled information from the memorized database is used to classify an object, reconstruct the memorized version of the object, and estimate the magnitude of changes in scale or rotation. The system response is resistant to moderate amounts of noise and occlusion. Several experiments, using real, gray scale images, are presented to show the feasibility of our approach.
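The complex-log conformal mapping used above has a simple invariance property that a short sketch can make concrete: under log(z) = log|z| + i*arg(z), scaling an image shifts the radial coordinate and rotating it shifts the angular coordinate, so both become translations that an associative memory can tolerate.

```python
import numpy as np

# Sketch of the complex-log mapping's invariance: scaling by s adds log(s)
# to the real part, rotating by theta adds theta to the imaginary part.

def complex_log(z: complex) -> complex:
    return np.log(z)

z = 3.0 + 4.0j
scaled = 2.0 * z                       # scale change by factor 2
rotated = z * np.exp(1j * 0.5)         # rotation by 0.5 rad

print(complex_log(scaled) - complex_log(z))    # ~ log(2) + 0j
print(complex_log(rotated) - complex_log(z))   # ~ 0 + 0.5j
```

Applied pixel-wise to image coordinates, this is the log-polar transform; recognition then only has to handle translations in the mapped plane.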

  12. Examining Food Risk in the Large using a Complex, Networked System-of-systems Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ambrosiano, John; Newkirk, Ryan; Mc Donald, Mark P

    2010-12-03

    The food production infrastructure is a highly complex system of systems. Characterizing the risks of intentional contamination in multi-ingredient manufactured foods is extremely challenging because the risks depend on the vulnerabilities of food processing facilities and on the intricacies of the supply-distribution networks that link them. A pure engineering approach to modeling the system is impractical because of the overall system complexity and paucity of data. A methodology is needed to assess food contamination risk 'in the large', based on current, high-level information about manufacturing facilities, commodities and markets, that will indicate which food categories are most at risk of intentional contamination and warrant deeper analysis. The approach begins by decomposing the system for producing a multi-ingredient food into instances of two subsystem archetypes: (1) the relevant manufacturing and processing facilities, and (2) the networked commodity flows that link them to each other and consumers. Ingredient manufacturing subsystems are modeled as generic systems dynamics models with distributions of key parameters that span the configurations of real facilities. Networks representing the distribution systems are synthesized from general information about food commodities. This is done in a series of steps. First, probability networks representing the aggregated flows of food from manufacturers to wholesalers, retailers, other manufacturers, and direct consumers are inferred from high-level approximate information. This is followed by disaggregation of the general flows into flows connecting 'large' and 'small' categories of manufacturers, wholesalers, retailers, and consumers. Optimization methods are then used to determine the most likely network flows consistent with given data. Vulnerability can be assessed for a potential contamination point using a modified CARVER + Shock model. 
Once the facility and commodity flow models are instantiated, a risk consequence analysis can be performed by injecting contaminant at chosen points in the system and propagating the event through the overarching system to arrive at morbidity and mortality figures. A generic chocolate snack cake model, consisting of fluid milk, liquid eggs, and cocoa, is described as an intended proof of concept for multi-ingredient food systems. We aim for an eventual tool that can be used directly by policy makers and planners.

  13. Gravitational lensing by eigenvalue distributions of random matrix models

    NASA Astrophysics Data System (ADS)

    Martínez Alonso, Luis; Medina, Elena

    2018-05-01

    We propose to use eigenvalue densities of unitary random matrix ensembles as mass distributions in gravitational lensing. The corresponding lens equations reduce to algebraic equations in the complex plane which can be treated analytically. We prove that these models can be applied to describe lensing by systems of edge-on galaxies. We illustrate our analysis with the Gaussian and the quartic unitary matrix ensembles.
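For the Gaussian unitary ensemble mentioned above, the eigenvalue density is the Wigner semicircle, and outside its support the deflection term in the complex lens equation (the Cauchy transform of the density) has a closed form. The sketch below checks this numerically; it is an illustration of the setup, not the paper's calculation.

```python
import numpy as np

# Sketch: Wigner semicircle density rho(x) = sqrt(4 - x**2)/(2*pi) on [-2, 2]
# as a 1-D lens mass profile. For z outside the support, its Cauchy transform
# equals (z - sqrt(z**2 - 4))/2 in closed form.

def trapezoid(y, x):
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def rho(x):
    return np.sqrt(np.clip(4.0 - x ** 2, 0.0, None)) / (2.0 * np.pi)

x = np.linspace(-2.0, 2.0, 200001)
mass = trapezoid(rho(x), x)                    # total mass, ~1
z = 3.0                                        # point outside the support
g_num = trapezoid(rho(x) / (z - x), x)         # numerical Cauchy transform
g_exact = (z - np.sqrt(z ** 2 - 4.0)) / 2.0    # closed form
print(mass, g_num, g_exact)
```

The agreement of the numerical and closed-form transforms is what makes these densities analytically tractable lens models.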

  14. Scalable collaborative risk management technology for complex critical systems

    NASA Technical Reports Server (NTRS)

    Campbell, Scott; Torgerson, Leigh; Burleigh, Scott; Feather, Martin S.; Kiper, James D.

    2004-01-01

    We describe here our project and plans to develop methods, software tools, and infrastructure tools to address challenges relating to geographically distributed software development. Specifically, this work is creating an infrastructure that supports applications working over distributed geographical and organizational domains and is using this infrastructure to develop a tool that supports project development using risk management and analysis techniques where the participants are not collocated.

  15. Coherent Frequency Reference System for the NASA Deep Space Network

    NASA Technical Reports Server (NTRS)

    Tucker, Blake C.; Lauf, John E.; Hamell, Robert L.; Gonzaler, Jorge, Jr.; Diener, William A.; Tjoelker, Robert L.

    2010-01-01

    The NASA Deep Space Network (DSN) requires state-of-the-art frequency references that are derived and distributed from very stable atomic frequency standards. A new Frequency Reference System (FRS) and Frequency Reference Distribution System (FRD) have been developed, which together replace the previous Coherent Reference Generator System (CRG). The FRS and FRD each provide new capabilities that significantly improve operability and reliability. The FRS allows for selection and switching between frequency standards, a flywheel capability (to avoid interruptions when switching frequency standards), and a frequency synthesis system (to generate standardized 5-, 10-, and 100-MHz reference signals). The FRS is powered by redundant, specially filtered, and sustainable power systems and includes a monitor and control capability for station operations to interact and control the frequency-standard selection process. The FRD receives the standardized 5-, 10-, and 100-MHz reference signals and distributes signals to distribution amplifiers in a fan-out fashion to dozens of DSN users that require the highly stable reference signals. The FRD is also powered by redundant, specially filtered, and sustainable power systems. The new DSN Frequency Distribution System, which consists of the FRS and FRD systems described here, is central to all operational activities of the NASA DSN. The frequency generation and distribution system provides ultra-stable, coherent, and very low phase-noise references at 5, 10, and 100 MHz to between 60 and 100 separate users at each Deep Space Communications Complex.

  16. Service-oriented architecture for the ARGOS instrument control software

    NASA Astrophysics Data System (ADS)

    Borelli, J.; Barl, L.; Gässler, W.; Kulas, M.; Rabien, Sebastian

    2012-09-01

    The Advanced Rayleigh Guided ground layer Adaptive optic System, ARGOS, equips the Large Binocular Telescope (LBT) with a constellation of six Rayleigh laser guide stars. By correcting atmospheric turbulence near the ground, the system is designed to increase the image quality of the multi-object spectrograph LUCIFER approximately by a factor of 3 over a field of 4 arc minute diameter. The control software has the critical task of orchestrating several devices, instruments, and high level services, including the already existing adaptive optic system and the telescope control software. All these components are widely distributed over the telescope, adding more complexity to the system design. The approach used by the ARGOS engineers is to write loosely coupled and distributed services under the control of different ownership systems, providing a uniform mechanism to offer, discover, interact with and use these distributed capabilities. The control system includes several finite state machines, vibration and flexure compensation loops, and safety mechanisms such as interlocks and aircraft and satellite avoidance systems.

  17. Application of multi-function display and control technology

    NASA Technical Reports Server (NTRS)

    Spiger, R. J.; Farrell, R. J.; Holcomb, G. A.

    1982-01-01

    The NASA orbiter spacecraft incorporates a complex array of systems, displays, and controls. The incorporation of discrete dedicated controls into a multifunction display and control system (MFDCS) offers the potential for savings in weight, power, panel space, and crew training time. Technology identified as applicable to a MFDCS is applied to the orbiter orbital maneuvering system (OMS) and the electrical power distribution and control system (EPDCS) to derive concepts for a MFDCS design. Several concepts of varying degrees of performance and complexity are discussed and a suggested concept for further development is presented in greater detail. Both the hardware and software aspects and the human factors considerations of the designs are included.

  18. Creating virtual humans for simulation-based training and planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stansfield, S.; Sobel, A.

    1998-05-12

    Sandia National Laboratories has developed a distributed, high fidelity simulation system for training and planning small team operations. The system provides an immersive environment populated by virtual objects and humans capable of displaying complex behaviors. The work has focused on developing the behaviors required to carry out complex tasks and decision making under stress. Central to this work are techniques for creating behaviors for virtual humans and for dynamically assigning behaviors to CGF to allow scenarios without fixed outcomes. Two prototype systems have been developed that illustrate these capabilities: MediSim, a trainer for battlefield medics, and VRaptor, a system for planning, rehearsing and training assault operations.

  19. Information Power Grid Posters

    NASA Technical Reports Server (NTRS)

    Vaziri, Arsi

    2003-01-01

    This document is a summary of the accomplishments of the Information Power Grid (IPG). Grids are an emerging technology that provide seamless and uniform access to the geographically dispersed, computational, data storage, networking, instruments, and software resources needed for solving large-scale scientific and engineering problems. The goal of the NASA IPG is to use NASA's remotely located computing and data system resources to build distributed systems that can address problems that are too large or complex for a single site. The accomplishments outlined in this poster presentation are: access to distributed data, IPG heterogeneous computing, integration of large-scale computing node into distributed environment, remote access to high data rate instruments, and exploratory grid environment.

  20. GraphStore: A Distributed Graph Storage System for Big Data Networks

    ERIC Educational Resources Information Center

    Martha, VenkataSwamy

    2013-01-01

    Networks, such as social networks, are a universal solution for modeling complex problems in real time, especially in the Big Data community. While previous studies have attempted to enhance network processing algorithms, none have paved a path for the development of a persistent storage system. The proposed solution, GraphStore, provides an…

  1. Advanced EMT and Phasor-Domain Hybrid Simulation with Simulation Mode Switching Capability for Transmission and Distribution Systems

    DOE PAGES

    Huang, Qiuhua; Vittal, Vijay

    2018-05-09

    Conventional electromagnetic transient (EMT) and phasor-domain hybrid simulation approaches presently exist for transmission system level studies. Their simulation efficiency is generally constrained by the EMT simulation. With an increasing number of distributed energy resources and non-conventional loads being installed in distribution systems, it is imperative to extend the hybrid simulation application to include distribution systems and integrated transmission and distribution systems. Meanwhile, it is equally important to improve the simulation efficiency as the modeling scope and complexity of the detailed system in the EMT simulation increases. To meet both requirements, this paper introduces an advanced EMT and phasor-domain hybrid simulation approach. This approach has two main features: 1) a comprehensive phasor-domain modeling framework which supports positive-sequence, three-sequence, three-phase and mixed three-sequence/three-phase representations and 2) a robust and flexible simulation mode switching scheme. The developed scheme enables simulation switching from hybrid simulation mode back to pure phasor-domain dynamic simulation mode to achieve significantly improved simulation efficiency. The proposed method has been tested on integrated transmission and distribution systems. In conclusion, the results show that with the developed simulation switching feature, the total computational time is significantly reduced compared to running the hybrid simulation for the whole simulation period, while maintaining good simulation accuracy.
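The mode-switching idea described above can be caricatured in a few lines: run the expensive hybrid (EMT + phasor) mode while a disturbance metric is large, then fall back to the cheaper pure phasor mode once the detailed subsystem has settled. This toy sketch is not the paper's algorithm; the threshold and metric are invented for illustration.

```python
# Toy sketch of simulation-mode switching: stay in hybrid mode while the
# EMT-side disturbance metric exceeds a threshold, otherwise use the cheaper
# pure phasor-domain mode. Threshold and trace values are hypothetical.

def choose_mode(disturbance: float, threshold: float = 1e-3) -> str:
    return "hybrid" if disturbance >= threshold else "phasor"

# Hypothetical decaying disturbance trace after a fault at t = 0.
trace = [0.5, 0.1, 0.02, 5e-3, 2e-3, 8e-4, 3e-4]
modes = [choose_mode(d) for d in trace]
print(modes)
```

The efficiency gain reported above comes precisely from spending most of the simulated interval in the cheaper mode.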

  3. Description of a 20 kilohertz power distribution system

    NASA Technical Reports Server (NTRS)

    Hansen, I. G.

    1986-01-01

    A single phase, 440 VRMS, 20 kHz power distribution system with a regulated sinusoidal waveform is discussed. A single phase power system minimizes the wiring, sensing, and control complexities required in a multi-sourced, redundantly distributed power system. The single phase addresses only the distribution link; techniques for accommodating multiphase, lower-frequency inputs and outputs are described. While the 440 V operating potential was initially selected for aircraft operating below 50,000 ft, this potential also appears suitable for space power systems. This voltage choice recognizes a reasonable upper limit for semiconductor ratings, yet allows direct synthesis of 220 V, 3-phase power. A 20 kHz operating frequency was selected to be above the range of audibility and minimize the weight of reactive components, yet allow the construction of single power stages of 25 to 30 kW. The regulated sinusoidal distribution system has several advantages. With a regulated voltage, most ac/dc conversions involve rather simple transformer-rectifier applications. A sinusoidal distribution system, when used in conjunction with zero-crossing switching, represents a minimal source of EMI. The present state of 20 kHz power technology includes computer control of voltage and/or frequency, low-inductance cable, current-limiting circuit protection, bi-directional power flow, and motor/generator operation using standard induction machines. A status update and description of each of these items and their significance is presented.

  5. Examining System-Wide Impacts of Solar PV Control Systems with a Power Hardware-in-the-Loop Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Tess L.; Fuller, Jason C.; Schneider, Kevin P.

    2014-06-08

    High penetration levels of distributed solar PV power generation can lead to adverse power quality impacts, such as excessive voltage rise, voltage flicker, and reactive power values that result in unacceptable voltage levels. Advanced inverter control schemes have been developed that have the potential to mitigate many power quality concerns. However, local closed-loop control may lead to unintended behavior in deployed systems as complex interactions can occur between numerous operating devices. To enable the study of the performance of advanced control schemes in a detailed distribution system environment, a test platform has been developed that integrates Power Hardware-in-the-Loop (PHIL) with concurrent time-series electric distribution system simulation. In the test platform, GridLAB-D, a distribution system simulation tool, runs a detailed simulation of a distribution feeder in real-time mode at the Pacific Northwest National Laboratory (PNNL) and supplies power system parameters at a point of common coupling. At the National Renewable Energy Laboratory (NREL), a hardware inverter interacts with grid and PV simulators emulating an operational distribution system. Power output from the inverters is measured and sent to PNNL to update the real-time distribution system simulation. The platform is described and initial test cases are presented. The platform is used to study the system-wide impacts and the interactions of inverter control modes, constant power factor and active Volt/VAr control, when integrated into a simulated IEEE 8500-node test feeder. We demonstrate that this platform is well-suited to the study of advanced inverter controls and their impacts on the power quality of a distribution feeder. Additionally, results are used to validate GridLAB-D simulations of advanced inverter controls.
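The closed-loop exchange the abstract describes (the real-time feeder simulation supplies a voltage at the point of common coupling; the hardware inverter returns a power measurement that updates the next simulation step) can be sketched as a toy fixed-point loop. Everything below is hypothetical: a one-node Thevenin feeder stands in for GridLAB-D, and a constant-power function stands in for the hardware inverter.

```python
import math

def feeder_voltage(v_source, r_thevenin, current):
    """Toy Thevenin model of the simulated feeder at the PCC (per-unit)."""
    return v_source - r_thevenin * current

def inverter_power(v_pcc, p_set=0.5):
    """Stand-in for the hardware side: a constant-power device."""
    return p_set

def phil_loop(v_source=1.0, r_thevenin=0.05, steps=50):
    """Sketch of the PHIL exchange: the simulation sends the PCC voltage,
    the 'hardware' answers with a power measurement, and the simulation
    updates its state for the next time step."""
    v = v_source
    for _ in range(steps):
        p = inverter_power(v)                         # measurement returned by hardware
        i = p / v                                     # current implied by that power
        v = feeder_voltage(v_source, r_thevenin, i)   # next PCC voltage
    return v

v_final = phil_loop()
```

With these toy numbers the loop converges to the analytic root of v = v_source - r*p/v, illustrating why the exchange can run stably step by step in real time.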

  6. Design and Development of a 200-kW Turbo-Electric Distributed Propulsion Testbed

    NASA Technical Reports Server (NTRS)

    Papathakis, Kurt V.; Kloesel, Kurt J.; Lin, Yohan; Clarke, Sean; Ediger, Jacob J.; Ginn, Starr

    2016-01-01

    The National Aeronautics and Space Administration (NASA) Armstrong Flight Research Center (AFRC) (Edwards, California) is developing the Hybrid-Electric Integrated Systems Testbed (HEIST) as part of the HEIST Project, to study power management and transition complexities, modular architectures, and flight control laws for turbo-electric distributed propulsion technologies using representative hardware and piloted simulations. Capabilities are being developed to assess the flight readiness of hybrid electric and distributed electric vehicle architectures. Additionally, NASA will leverage experience gained and assets developed from HEIST to assist in flight-test proposal development, flight-test vehicle design, and evaluation of hybrid electric and distributed electric concept vehicles for flight safety. The HEIST test equipment will include three trailers supporting a distributed electric propulsion wing, a battery system and turbogenerator, dynamometers, and supporting power and communication infrastructure, all connected to the AFRC Core simulation. Plans call for 18 high performance electric motors that will be powered by batteries and the turbogenerator, and commanded by a piloted simulation. Flight control algorithms will be developed on the turbo-electric distributed propulsion system.

  7. What Is a Complex Innovation System?

    PubMed Central

    Katz, J. Sylvan

    2016-01-01

    Innovation systems are sometimes referred to as complex systems, something that is intuitively understood but poorly defined. A complex system dynamically evolves in non-linear ways, giving it unique properties that distinguish it from other systems. In particular, a common signature of complex systems is scale-invariant emergent properties. A scale-invariant property can be identified because it is solely described by a power law function, f(x) = kx^α, where the exponent, α, is a measure of scale-invariance. The focus of this paper is to describe and illustrate that innovation systems have the properties of a complex adaptive system, in particular scale-invariant emergent properties, indicative of their complex nature, that can be quantified and used to inform public policy. The global research system is an example of an innovation system. Peer-reviewed publications containing knowledge are a characteristic output. Citations or references to these articles are an indirect measure of the impact the knowledge has on the research community. Peer-reviewed papers indexed in Scopus and in the Web of Science were used as data sources to produce measures of size and impact. These measures are used to illustrate how scale-invariant properties can be identified and quantified. It is demonstrated that the distribution of impact has a reasonable likelihood of being scale-invariant, with scaling exponents that tended toward a value of less than 3.0 with the passage of time and decreasing group sizes. Scale-invariant correlations are shown between the evolution of impact and size with time, and between field impact and size at points in time. The recursive or self-similar nature of scale-invariance suggests that any smaller innovation system within the global research system is likely to be complex with scale-invariant properties too. PMID:27258040
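The power-law signature f(x) = kx^α can be tested on data by estimating the scaling exponent. The sketch below uses the standard continuous maximum-likelihood estimator (a Hill-type estimator), not the paper's exact fitting procedure; the sample generator and all numbers are illustrative.

```python
import math
import random

def estimate_alpha(samples, x_min):
    """Continuous maximum-likelihood estimate of the scaling exponent
    for p(x) ~ x^(-alpha), x >= x_min (Hill-type estimator)."""
    logs = [math.log(x / x_min) for x in samples if x >= x_min]
    return 1.0 + len(logs) / sum(logs)

def sample_power_law(alpha, x_min, n, rng):
    """Inverse-transform sampling from the same continuous power law."""
    return [x_min * (1.0 - rng.random()) ** (-1.0 / (alpha - 1.0))
            for _ in range(n)]

rng = random.Random(42)
data = sample_power_law(alpha=2.6, x_min=1.0, n=20000, rng=rng)
alpha_hat = estimate_alpha(data, x_min=1.0)  # recovers alpha close to 2.6
```

With 20,000 samples the standard error of the estimate is about (α − 1)/√n ≈ 0.01, so the recovered exponent lands close to the true value.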

  8. High-speed wavelength-division multiplexing quantum key distribution system.

    PubMed

    Yoshino, Ken-ichiro; Fujiwara, Mikio; Tanaka, Akihiro; Takahashi, Seigo; Nambu, Yoshihiro; Tomita, Akihisa; Miki, Shigehito; Yamashita, Taro; Wang, Zhen; Sasaki, Masahide; Tajima, Akio

    2012-01-15

    A high-speed quantum key distribution system was developed with the wavelength-division multiplexing (WDM) technique and dedicated key distillation hardware engines. Two interferometers for encoding and decoding are shared over eight wavelengths to reduce the system's size, cost, and control complexity. The key distillation engines can process a huge amount of data from the WDM channels by using a 1 Mbit block in real time. We demonstrated a three-channel WDM system that simultaneously uses avalanche photodiodes and superconducting single-photon detectors. We achieved 12 h continuous key generation with a secure key rate of 208 kilobits per second through a 45 km field fiber with 14.5 dB loss.

  9. MTL distributed magnet measurement system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nogiec, J.M.; Craker, P.A.; Garbarini, J.P.

    1993-04-01

    The Magnet Test Laboratory (MTL) at the Superconducting Super Collider Laboratory will be required to precisely and reliably measure properties of magnets in a production environment. The extensive testing of the superconducting magnets comprises several types of measurements whose main purpose is to evaluate basic parameters characterizing the magnetic, mechanical, and cryogenic properties of magnets. The measurement process will produce a significant amount of data which will be subjected to complex analysis. Such massive measurements require a careful design of both the hardware and software of the computer systems, with a reliable, maximally automated system in mind. In order to fulfill this requirement, a dedicated Distributed Magnet Measurement System (DMMS) is being developed.

  10. Constraining Binary Asteroid Mass Distributions Based On Mutual Motion

    NASA Astrophysics Data System (ADS)

    Davis, Alex B.; Scheeres, Daniel J.

    2017-06-01

    The mutual gravitational potential and torques of binary asteroid systems result in a complex coupling of attitude and orbital motion based on the mass distribution of each body. For a doubly-synchronous binary system, observations of the mutual motion can be leveraged to identify and measure the unique mass distributions of each body. By implementing arbitrary shape and order computation of the full two-body problem (F2BP) equilibria, we study the influence of asteroid asymmetries on the separation and orientation of a doubly-synchronous system. Additionally, simulations of binary systems perturbed from doubly-synchronous behavior are studied to understand the effects of mass distribution perturbations on precession and nutation rates, such that unique behaviors can be isolated and used to measure asteroid mass distributions. We apply our investigation to the Trojan binary asteroid system 617 Patroclus and Menoetius (1906 VY), which will be the final flyby target of the recently announced Lucy Discovery mission in March 2033. This binary asteroid system is of particular interest due to the results of a recent stellar occultation study (DPS 46, id.506.09) that suggests the system is doubly-synchronous and consists of two similarly sized oblate ellipsoids, in addition to suggesting the presence of mass asymmetries resulting from an impact crater on the southern limb of Menoetius.

  11. Assessment of Stable Isotope Distribution in Complex Systems

    NASA Astrophysics Data System (ADS)

    He, Y.; Cao, X.; Wang, J.; Bao, H.

    2017-12-01

    Biomolecules in living organisms have the potential to approach chemical steady state and even apparent isotope equilibrium because enzymatic reactions are intrinsically reversible. If an apparent local equilibrium can be identified, enzymatic reversibility and its controlling factors may be quantified, which helps in understanding complex biochemical processes. Earlier research on isotope fractionation has tended to focus on a specific process and to compare only two chemical species. Using linear regression, a "thermodynamic order", which refers to correlated δ13C and 13β values, has been proposed by Galimov et al. to be present among many biomolecules. However, both the concept of "thermodynamic order" and the approach they used have been questioned. Here, we propose that the deviation of a complex system from its equilibrium state can be rigorously described as a graph problem of the kind treated in discrete mathematics. The deviation of the isotope distribution from the equilibrium state, and apparent local isotope equilibrium among a subset of biomolecules, can be assessed using an apparent fractionation difference matrix (|Δα|). Applying the |Δα| matrix analysis to earlier published data on amino acids, we show the existence of apparent local equilibrium among different amino acids in potato and in a green alga. The existence of apparent local equilibrium is in turn consistent with the notion that enzymatic reactions can be reversible even in living systems. The result also implies that previous emphasis on external carbon source intake may be misplaced when studying isotope distribution in physiology. In addition to the identification of local equilibrium among biomolecules, the difference matrix approach has the potential to explore chemical or isotope equilibrium states in extraterrestrial bodies, to distinguish living from non-living systems, and to classify living species. This approach will benefit from large amounts of systematic data and advanced pattern recognition techniques.
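A minimal sketch of the difference-matrix idea, under the assumption (not spelled out in the abstract) that apparent fractionation factors are computed from delta values in the usual way, α_ij = (1000 + δ_i)/(1000 + δ_j). All delta values below are hypothetical.

```python
def alpha_apparent(delta_i, delta_j):
    """Apparent fractionation factor between two pools from their
    delta values (per mil): alpha_ij = (1000 + d_i) / (1000 + d_j)."""
    return (1000.0 + delta_i) / (1000.0 + delta_j)

def delta_alpha_matrix(deltas_measured, deltas_equilibrium):
    """|Delta-alpha| matrix: deviation of each pairwise apparent
    fractionation from its expected equilibrium value. Near-zero
    entries flag candidate local-equilibrium subsets."""
    n = len(deltas_measured)
    return [[abs(alpha_apparent(deltas_measured[i], deltas_measured[j])
                 - alpha_apparent(deltas_equilibrium[i], deltas_equilibrium[j]))
             for j in range(n)] for i in range(n)]

# hypothetical delta-13C values for three biomolecules (per mil)
measured = [-25.0, -24.8, -12.0]
equilibrium = [-25.0, -24.9, -24.5]
m = delta_alpha_matrix(measured, equilibrium)
# m[0][1] is tiny (pools 0 and 1 are near equilibrium); m[0][2] is not
```

This is only an illustration of the matrix construction; the paper's actual formulation of |Δα| may differ in detail.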

  12. Fractal analysis of urban catchments and their representation in semi-distributed models: imperviousness and sewer system

    NASA Astrophysics Data System (ADS)

    Gires, Auguste; Tchiguirinskaia, Ioulia; Schertzer, Daniel; Ochoa-Rodriguez, Susana; Willems, Patrick; Ichiba, Abdellah; Wang, Lipen; Pina, Rui; Van Assel, Johan; Bruni, Guendalina; Murla Tuyls, Damian; ten Veldhuis, Marie-Claire

    2017-04-01

    Land use distribution and sewer system geometry exhibit complex scale-dependent patterns in urban environments. This scale dependency is even more visible in a rasterized representation where a single class is assigned to each pixel. Such features are well captured by fractal tools, which are based on scale invariance and intrinsically designed to characterise and quantify the space filled by a geometrical set exhibiting complex and tortuous patterns. Fractal tools have been widely used in hydrology but seldom in the specific context of urban hydrology. In this paper, they are used to analyse surface and sewer data from 10 urban or peri-urban catchments located in 5 European countries in the framework of the NWE Interreg RainGain project (www.raingain.eu). The aim was to characterise urban catchment properties accounting for the complexity and inhomogeneity typical of urban water systems. Sewer system density and imperviousness (roads or buildings), represented in rasterized maps of 2 m x 2 m pixels, were analysed to quantify their fractal dimension, a characteristic of scaling invariance. It appears that both sewer density and imperviousness exhibit scale-invariant features that can be characterized with the help of fractal dimensions ranging from 1.6 to 2, depending on the catchment. In a given area, consistent results were found for the two geometrical features, yielding a robust and innovative way of quantifying the level of urbanization. The representation of imperviousness in operational semi-distributed hydrological models for these catchments was also investigated by computing fractal dimensions of the geometrical sets made up of the sub-catchments with coefficients of imperviousness greater than a range of thresholds. This makes it possible to quantify how well spatial structures of imperviousness are represented in urban hydrological models.
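Fractal dimensions like those reported (1.6 to 2) are commonly estimated by box counting on the rasterized map: count occupied boxes N(s) at each box size s, then fit the slope of log N against log(1/s). The sketch below is a generic box-counting estimator, not the authors' code; the two toy rasters are illustrative.

```python
import math

def box_count_dimension(grid, sizes=(1, 2, 4, 8, 16, 32)):
    """Estimate the box-counting (fractal) dimension of a square binary
    raster: count boxes containing at least one occupied pixel at each
    box size, then fit log N(s) against log(1/s) by least squares."""
    n = len(grid)
    counts = []
    for s in sizes:
        occupied = 0
        for bi in range(0, n, s):
            for bj in range(0, n, s):
                if any(grid[i][j]
                       for i in range(bi, bi + s)
                       for j in range(bj, bj + s)):
                    occupied += 1
        counts.append(occupied)
    xs = [math.log(1.0 / s) for s in sizes]
    ys = [math.log(c) for c in counts]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

n = 64
line = [[1 if i == 10 else 0 for j in range(n)] for i in range(n)]  # e.g. a single sewer main
solid = [[1] * n for _ in range(n)]                                 # fully impervious block
```

A straight line yields dimension 1 and a filled area yields dimension 2; real sewer and imperviousness rasters fall in between, matching the 1.6 to 2 range quoted above.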

  13. Metallicity gradients in tidal tails and merging systems

    NASA Astrophysics Data System (ADS)

    Torres-Flores, S.; Scarano, S., Jr.; Olave, D.; Alfaro, M.; Mendes de Oliveira, C.; de Mello, D. F.; Carrasco, E. R.; Amram, P.; Plana, H.

    2014-10-01

    We present an analysis of the metal distribution in the tidal tails of two interacting systems and in the main body of a galaxy merger: NGC 92, NGC 6845, and HCG 31, respectively. Using Gemini/GMOS spectroscopic data, we found no metallicity gradient in the tail of NGC 92. The abundances in the tail are similar to the values displayed by the central regions of NGC 92. This fact suggests that gas mixing triggered by the interaction produces a flattening in the metallicity distribution of this system. For the system NGC 6845, we found that regions located in the tail have abundances similar to a source located in the inner region of this galaxy, also suggesting a flat metal distribution. For HCG 31 we found an inhomogeneous metal distribution in the central region. Apparently, each star-forming complex keeps its metal abundance despite the strong gravitational interaction that this system has undergone. In the case of the tidal tails, our results support the scenario in which gas mixing produces a flattening in the metal distribution. However, we suggest that star formation is an important mechanism in enhancing the oxygen abundance of these structures.

  14. "Structure and dynamics in complex chemical systems: Gaining new insights through recent advances in time-resolved spectroscopies.” ACS Division of Physical Chemistry Symposium presented at the Fall National ACS Meeting in Boston, MA, August 2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crawford, Daniel

    An eight-session symposium on structure and dynamics in complex chemical systems: gaining new insights through recent advances in time-resolved spectroscopies. The intricacy of most chemical, biochemical, and material processes and their applications is underscored by the complex nature of the environments in which they occur. Substantial challenges for building a global understanding of a heterogeneous system include (1) identifying unique signatures associated with specific structural motifs within the heterogeneous distribution, and (2) resolving the significance of each of the multiple time scales involved in both small- and large-scale nuclear reorganization. This symposium focuses on progress in our understanding of dynamics in complex systems driven by recent innovations in time-resolved spectroscopies and theoretical developments. Such advancement is critical for driving discovery at the molecular level and facilitating new applications. Broad areas of interest include: structural relaxation and the impact of structure on dynamics in liquids, interfaces, biochemical systems, materials, and other heterogeneous environments.

  15. Guest Editors' introduction

    NASA Astrophysics Data System (ADS)

    Magee, Jeff; Moffett, Jonathan

    1996-06-01

    Special Issue on Management This special issue contains seven papers originally presented at an International Workshop on Services for Managing Distributed Systems (SMDS'95), held in September 1995 in Karlsruhe, Germany. The workshop was organized to present the results of two ESPRIT III funded projects, SysMan and IDSM, and more generally to bring together work in the area of distributed systems management. The workshop focused on the tools and techniques necessary for managing future large-scale, multi-organizational distributed systems. The open call for papers attracted a large number of submissions, and the subsequent attendance at the workshop, which was larger than expected, clearly indicated that the topics addressed by the workshop were of considerable interest both to industry and academia. The papers selected for this special issue represent an excellent coverage of the issues addressed by the workshop. A particular focus of the workshop was the need to help managers deal with the size and complexity of modern distributed systems by the provision of automated support. This automation must have two prime characteristics: it must provide a flexible management system which responds rapidly to changing organizational needs, and it must provide both human managers and automated management components with the information that they need, in a form which can be used for decision-making. These two characteristics define the two main themes of this special issue. To satisfy the requirement for a flexible management system, workers in both industry and universities have turned to architectures which support policy directed management. In these architectures, policy is explicitly represented and can be readily modified to meet changing requirements. The paper `Towards implementing policy-based systems management' by Meyer, Anstötz and Popien describes an approach whereby policy is enforced by event-triggered rules. 
Krause and Zimmermann in their paper `Implementing configuration management policies for distributed applications' present a system in which the configuration of the system in terms of its constituent components and their interconnections can be controlled by reconfiguration rules. Neumair and Wies in the paper `Case study: applying management policies to manage distributed queuing systems' examine how high-level policies can be transformed into practical and efficient implementations for the case of distributed job queuing systems. Koch and Krämer in `Rules and agents for automated management of distributed systems' describe the results of an experiment in using the software development environment Marvel to provide a rule based implementation of management policy. The paper by Jardin, `Supporting scalability and flexibility in a distributed management platform' reports on the experience of using a policy directed approach in the industrial strength TeMIP management platform. Both human managers and automated management components rely on a comprehensive monitoring system to provide accurate and timely information on which decisions are made to modify the operation of a system. The monitoring service must deal with condensing and summarizing the vast amount of data available to produce the events of interest to the controlling components of the overall management system. The paper `Distributed intelligent monitoring and reporting facilities' by Pavlou, Mykoniatis and Sanchez describes a flexible monitoring system in which the monitoring agents themselves are policy directed. Their monitoring system has been implemented in the context of the OSIMIS management platform. Debski and Janas in `The SysMan monitoring service and its management environment' describe the overall SysMan management system architecture and then concentrate on how event processing and distribution is supported in that architecture. 
The collection of papers gives a good overview of the current state of the art in distributed system management. It has reached a point at which a first generation of systems, based on policy representation within systems and automated monitoring systems, are coming into practical use. The papers also serve to identify many of the issues which are open research questions. In particular, as management systems increase in complexity, how far can we automate the refinement of high-level policies into implementations? How can we detect and resolve conflicts between policies? And how can monitoring services deal efficiently with ever-growing complexity and volume? We wish to acknowledge the many contributors, besides the authors, who have made this issue possible: the anonymous reviewers who have done much to assure the quality of these papers, Morris Sloman and his Programme Committee who convened the Workshop, and Thomas Usländer and his team at the Fraunhofer Institute in Karlsruhe who acted as hosts.

  16. Thermodynamic phase transitions for Pomeau-Manneville maps

    NASA Astrophysics Data System (ADS)

    Venegeroles, Roberto

    2012-08-01

    We study phase transitions in the thermodynamic description of Pomeau-Manneville intermittent maps from the point of view of infinite ergodic theory, which deals with dynamical systems having a diverging invariant measure. For such systems, we use a distributional limit theorem to provide both a powerful tool for calculating thermodynamic potentials as well as an understanding of the dynamic characteristics at each instability phase. In particular, topological pressure and Rényi entropy are calculated exactly for such systems. Finally, we show the connection of the distributional limit theorem with non-Gaussian fluctuations of the algorithmic complexity proposed by Gaspard and Wang [Proc. Natl. Acad. Sci. USA 85, 4591 (1988)].
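For reference, the family of maps studied here is usually written in the standard Pomeau-Manneville form (the paper's exact parametrization may differ):

```latex
x_{n+1} = x_n + x_n^{z} \pmod{1}, \qquad z > 1 .
```

Near the marginal fixed point x = 0 the map behaves as f(x) ≈ x, producing long laminar episodes (intermittency); for z ≥ 2 the invariant density ρ(x) ~ x^{1−z} is non-integrable near the origin, so the invariant measure diverges and the tools of infinite ergodic theory apply.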

  17. Distributed parameter system coupled ARMA expansion identification and adaptive parallel IIR filtering - A unified problem statement. [Auto Regressive Moving-Average

    NASA Technical Reports Server (NTRS)

    Johnson, C. R., Jr.; Balas, M. J.

    1980-01-01

    A novel interconnection of distributed parameter system (DPS) identification and adaptive filtering is presented, which culminates in a common statement of coupled autoregressive, moving-average expansion or parallel infinite impulse response configuration adaptive parameterization. The common restricted complexity filter objectives are seen as similar to the reduced-order requirements of the DPS expansion description. The interconnection presents the possibility of an exchange of problem formulations and solution approaches not yet easily addressed in the common finite dimensional lumped-parameter system context. It is concluded that the shared problems raised are nevertheless many and difficult.

  18. High performance frame synchronization for continuous variable quantum key distribution systems.

    PubMed

    Lin, Dakai; Huang, Peng; Huang, Duan; Wang, Chao; Peng, Jinye; Zeng, Guihua

    2015-08-24

    In a practical continuous-variable quantum key distribution (CVQKD) system, synchronization is of significant importance, as it is hardly possible to extract secret keys from unsynchronized strings. In this paper, we propose a high performance frame synchronization method for CVQKD systems which is capable of operating under low signal-to-noise ratios (SNRs) and is compatible with the random phase shift induced by the quantum channel. A practical implementation of this method with low complexity is presented and its performance is analysed. By adjusting the length of the synchronization frame, this method can work well over a large range of SNR values, which paves the way for longer-distance CVQKD.
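Frame synchronization in general means locating a known, publicly agreed frame inside a noisy sample stream. A generic correlation-based sketch (not the authors' phase-shift-compatible method) follows; the sequence length, offset, and noise level are illustrative.

```python
import random

def correlate_offset(received, sync):
    """Locate a known synchronization frame in a received sample stream
    by maximizing the sliding cross-correlation."""
    best_off, best_score = 0, float("-inf")
    for off in range(len(received) - len(sync) + 1):
        score = sum(r * s for r, s in zip(received[off:off + len(sync)], sync))
        if score > best_score:
            best_off, best_score = off, score
    return best_off

rng = random.Random(7)
sync = [rng.choice((-1.0, 1.0)) for _ in range(256)]   # public sync pattern
true_offset = 1000
received = [rng.gauss(0.0, 1.0) for _ in range(4096)]  # noisy channel samples (~0 dB SNR)
for k, s in enumerate(sync):                           # embed the frame
    received[true_offset + k] += s
found = correlate_offset(received, sync)
```

The correlation peak grows linearly with the frame length while the noise grows only as its square root, which is why lengthening the synchronization frame (as the abstract notes) buys tolerance to low SNR.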

  19. Critical issues in NASA information systems

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The National Aeronautics and Space Administration has developed a globally-distributed complex of earth resources data bases since LANDSAT 1 was launched in 1972. NASA envisages considerable growth in the number, extent, and complexity of such data bases, due to the improvements expected in its remote sensing data rates and the increasingly multidisciplinary nature of its scientific investigations. Work already has begun on information systems to support multidisciplinary research activities based on data acquired by the space station complex and other space-based and terrestrial sources. In response to a request from NASA's former Associate Administrator for Space Science and Applications, the National Research Council convened a committee in June 1985 to identify the critical issues involving information systems support to space science and applications. The committee has suggested that OSSA address four major information systems issues: centralization of management functions; interoperability; user involvement in the planning and implementation of its programs; and technology.

  20. Transdisciplinary Application of Cross-Scale Resilience ...

    EPA Pesticide Factsheets

    The cross-scale resilience model was developed in ecology to explain the emergence of resilience from the distribution of ecological functions within and across scales, and as a tool to assess resilience. We propose that the model and the underlying discontinuity hypothesis are relevant to other complex adaptive systems, and can be used to identify and track changes in system parameters related to resilience. We explain the theory behind the cross-scale resilience model, review the cases where it has been applied to non-ecological systems, and discuss some examples of social-ecological, archaeological/anthropological, and economic systems where a cross-scale resilience analysis could add a quantitative dimension to our current understanding of system dynamics and resilience. We argue that the scaling and diversity parameters suitable for a resilience analysis of ecological systems are appropriate for a broad suite of systems where non-normative quantitative assessments of resilience are desired. Our planet is currently characterized by fast environmental and social change, and the cross-scale resilience model has the potential to quantify resilience across many types of complex adaptive systems. Comparative analyses of complex systems have, in fact, demonstrated commonalities among distinctly different types of systems (Schneider & Kay 1994; Holling 2001; Lansing 2003; Foster 2005; Bullmore et al. 2009). Both biological and non-biological complex systems appear t

  1. Distributed Cognition and Process Management Enabling Individualized Translational Research: The NIH Undiagnosed Diseases Program Experience

    PubMed Central

    Links, Amanda E.; Draper, David; Lee, Elizabeth; Guzman, Jessica; Valivullah, Zaheer; Maduro, Valerie; Lebedev, Vlad; Didenko, Maxim; Tomlin, Garrick; Brudno, Michael; Girdea, Marta; Dumitriu, Sergiu; Haendel, Melissa A.; Mungall, Christopher J.; Smedley, Damian; Hochheiser, Harry; Arnold, Andrew M.; Coessens, Bert; Verhoeven, Steven; Bone, William; Adams, David; Boerkoel, Cornelius F.; Gahl, William A.; Sincan, Murat

    2016-01-01

    The National Institutes of Health Undiagnosed Diseases Program (NIH UDP) applies translational research systematically to diagnose patients with undiagnosed diseases. The challenge is to implement an information system enabling scalable translational research. The authors hypothesized that similar complex problems are resolvable through process management and the distributed cognition of communities. The team, therefore, built the NIH UDP integrated collaboration system (UDPICS) to form virtual collaborative multidisciplinary research networks or communities. UDPICS supports these communities through integrated process management, ontology-based phenotyping, biospecimen management, cloud-based genomic analysis, and an electronic laboratory notebook. UDPICS provided a mechanism for efficient, transparent, and scalable translational research and thereby addressed many of the complex and diverse research and logistical problems of the NIH UDP. Full definition of the strengths and deficiencies of UDPICS will require formal qualitative and quantitative usability and process improvement measurement. PMID:27785453

  2. NASA Workshop on Distributed Parameter Modeling and Control of Flexible Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Marks, Virginia B. (Compiler); Keckler, Claude R. (Compiler)

    1994-01-01

    Although significant advances have been made in modeling and controlling flexible systems, there remains a need for improvements in model accuracy and in control performance. The finite element models of flexible systems are unduly complex and are almost intractable to optimum parameter estimation for refinement using experimental data. Distributed parameter or continuum modeling offers some advantages and some challenges in both modeling and control. Continuum models often result in a significantly reduced number of model parameters, thereby enabling optimum parameter estimation. The dynamic equations of motion of continuum models provide the advantage of allowing the embedding of the control system dynamics, thus forming a complete set of system dynamics. There is also increased insight provided by the continuum model approach.

  3. Neural complexity: A graph theoretic interpretation

    NASA Astrophysics Data System (ADS)

    Barnett, L.; Buckley, C. L.; Bullock, S.

    2011-04-01

    One of the central challenges facing modern neuroscience is to explain the ability of the nervous system to coherently integrate information across distinct functional modules in the absence of a central executive. To this end, Tononi [Proc. Natl. Acad. Sci. USA 91, 5033 (1994)] proposed a measure of neural complexity that purports to capture this property based on mutual information between complementary subsets of a system. Neural complexity, so defined, is one of a family of information theoretic metrics developed to measure the balance between the segregation and integration of a system’s dynamics. One key question arising for such measures involves understanding how they are influenced by network topology. Sporns [Cereb. Cortex 10, 127 (2000)] employed numerical models in order to determine the dependence of neural complexity on the topological features of a network. However, a complete picture has yet to be established. While De Lucia [Phys. Rev. E 71, 016114 (2005)] made the first attempts at an analytical account of this relationship, their work utilized a formulation of neural complexity that, we argue, did not reflect the intuitions of the original work. In this paper we start by describing weighted connection matrices formed by applying a random continuous weight distribution to binary adjacency matrices. This allows us to derive an approximation for neural complexity in terms of the moments of the weight distribution and elementary graph motifs. In particular, we explicitly establish a dependency of neural complexity on cyclic graph motifs.

  4. Production and Distribution of NASA MODIS Remote Sensing Products

    NASA Technical Reports Server (NTRS)

    Wolfe, Robert

    2007-01-01

    The two Moderate Resolution Imaging Spectroradiometer (MODIS) instruments on-board NASA's Earth Observing System (EOS) Terra and Aqua satellites make key measurements for understanding the Earth's terrestrial ecosystems. Global time-series of terrestrial geophysical parameters have been produced from MODIS/Terra for over 7 years and from MODIS/Aqua for more than 4 1/2 years. These well-calibrated instruments, a team of scientists, and large data production, archive, and distribution systems have allowed for the development of a new suite of high quality product variables at spatial resolutions as fine as 250 m in support of global change research and natural resource applications. This talk describes the MODIS Science team's products, with a focus on the terrestrial (land) products, the data processing approach, and the process for monitoring and improving product quality. The original MODIS science team was formed in 1989. The team's primary role is the development and implementation of the geophysical algorithms. In addition, the team provided feedback on the design and pre-launch testing of the instrument and helped guide the development of the data processing system. The key challenges the science team dealt with before launch were developing algorithms for a new instrument and providing guidance for the large and complex multi-discipline processing system. Land, Ocean, and Atmosphere discipline teams drove the processing system requirements, particularly in the area of the processing loads and volumes needed to daily produce geophysical maps of the Earth at resolutions as fine as 250 m. The processing system had to handle a large number of data products, large data volumes and processing loads, and complex processing requirements. Prior to MODIS, daily global maps from heritage instruments, such as the Advanced Very High Resolution Radiometer (AVHRR), were not produced at resolutions finer than 5 km. 
The processing solution evolved into a combination of processing the lower level (Level 1) products and the higher level discipline-specific Land and Atmosphere products in the MODIS Science Investigator-led Processing System (SIPS), the MODIS Adaptive Processing System (MODAPS), with archive and distribution of the Land products to the user community handled by two of NASA's EOS Distributed Active Archive Centers (DAACs). Recently, a part of MODAPS, the Level 1 and Atmosphere Archive and Distribution System (LAADS), took over the role of archiving and distributing the Level 1 and Atmosphere products to the user community.

  5. A novel automotive headlight system based on digital micro-mirror devices and diffractive optical elements

    NASA Astrophysics Data System (ADS)

    Su, Ping; Song, Yuming; Ma, Jianshe

    2018-01-01

    The DMD (Digital Micro-mirror Device) offers a high refresh rate and high diffraction efficiency, which make it an ideal carrier for multi-mode illumination. DOEs (Diffractive Optical Elements) offer a high degree of design freedom, light weight, easy replication, and low cost, and can be used to reduce the weight, complexity, and cost of an optical system. A novel automotive headlamp system using a DMD as the light distribution element and a DOE as the light field modulation device is proposed in this paper. The pure-phase DOE is obtained by the GS algorithm using the Rayleigh-Sommerfeld diffraction integral model. Based on the standard automotive headlamp light intensity distribution in the target plane, the amplitude distribution on the DMD is obtained by numerical simulation, and the grayscale diagram loaded on the DMD can be derived accordingly. Finally, the simulated light intensity distribution in the target plane conforms to the national standard, which verifies the validity of the novel system. The illumination system proposed in this paper provides a reliable hardware platform for intelligent headlamps.
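    The GS (Gerchberg-Saxton) algorithm mentioned in this abstract alternates between the DOE plane and the target plane, keeping only the phase on each round trip. A minimal sketch, assuming scalar Fraunhofer (FFT) propagation in place of the paper's Rayleigh-Sommerfeld integral, and uniform illumination of the DOE:

    ```python
    import numpy as np

    def gerchberg_saxton(target_amp, n_iter=100, seed=0):
        """Iteratively retrieve a pure-phase DOE profile whose far field
        approximates target_amp (Fraunhofer/FFT propagation assumed)."""
        rng = np.random.default_rng(seed)
        src_amp = np.ones_like(target_amp)            # uniform illumination
        field = src_amp * np.exp(1j * rng.uniform(0, 2 * np.pi, target_amp.shape))
        for _ in range(n_iter):
            far = np.fft.fft2(field)
            far = target_amp * np.exp(1j * np.angle(far))  # impose target amplitude
            near = np.fft.ifft2(far)
            field = src_amp * np.exp(1j * np.angle(near))  # keep only the phase
        return np.angle(field)                        # the DOE phase profile

    def far_field_error(phase, target_amp):
        """Relative error between the energy-matched far field and the target."""
        far = np.abs(np.fft.fft2(np.exp(1j * phase)))
        far *= np.linalg.norm(target_amp) / np.linalg.norm(far)
        return np.linalg.norm(far - target_amp) / np.linalg.norm(target_amp)
    ```

    With a simple two-spot target, the retrieved phase concentrates far-field energy at the requested positions; for the near-field headlamp geometry, a Rayleigh-Sommerfeld propagator would replace the FFT pair.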

  6. Distribution of Quantum Coherence in Multipartite Systems

    NASA Astrophysics Data System (ADS)

    Radhakrishnan, Chandrashekar; Parthasarathy, Manikandan; Jambulingam, Segar; Byrnes, Tim

    2016-04-01

    The distribution of coherence in multipartite systems is examined. We use a new coherence measure with entropic nature and metric properties, based on the quantum Jensen-Shannon divergence. The metric property allows for the coherence to be decomposed into various contributions, which arise from local and intrinsic coherences. We find that there are trade-off relations between the various contributions of coherence, as a function of parameters of the quantum state. In bipartite systems the coherence resides on individual sites or is distributed among the sites, which contribute in a complementary way. In more complex systems, the characteristics of the coherence can display more subtle changes with respect to the parameters of the quantum state. In the case of the XXZ Heisenberg model, the coherence changes from a monogamous to a polygamous nature. This allows us to define the shareability of coherence, leading to monogamy relations for coherence.
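    The measure described above is built on the quantum Jensen-Shannon divergence (QJSD). A minimal single-qubit sketch, assuming the commonly used form in which coherence is the square root of the QJSD between a state and its fully dephased (diagonal) counterpart; the specific decomposition into local and intrinsic parts is not reproduced here:

    ```python
    import numpy as np

    def von_neumann_entropy(rho):
        """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues."""
        evals = np.linalg.eigvalsh(rho)
        evals = evals[evals > 1e-12]
        return float(-np.sum(evals * np.log2(evals)))

    def jsd_coherence(rho):
        """Coherence as the square root of the quantum Jensen-Shannon
        divergence between rho and its dephased (diagonal) counterpart."""
        rho_d = np.diag(np.diag(rho))
        mix = 0.5 * (rho + rho_d)
        qjsd = von_neumann_entropy(mix) - 0.5 * (
            von_neumann_entropy(rho) + von_neumann_entropy(rho_d))
        return float(np.sqrt(max(qjsd, 0.0)))
    ```

    For the pure state |+> this gives about 0.558, while any diagonal (incoherent) state gives zero; applying the same function to subsystems and their products is one way to separate local from distributed contributions.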

  7. Distributions of extreme bursts above thresholds in a fractional Lévy toy model of natural complexity.

    NASA Astrophysics Data System (ADS)

    Watkins, Nicholas; Chapman, Sandra; Rosenberg, Sam; Credgington, Dan; Sanchez, Raul

    2010-05-01

    In two far-sighted contributions in the 1960s, Mandelbrot showed the ubiquity of both non-Gaussian fluctuations and long-ranged temporal memory (the "Noah" and "Joseph" effects, respectively) in the natural and man-made worlds. Much subsequent work in complexity science has contributed to the physical underpinning of these effects, particularly in cases where complex interactions in a system cause a driven or random perturbation to be nonlinearly amplified in amplitude and/or spread out over a wide range of frequencies. In addition, the modelling of catastrophes has begun to incorporate the insights which these approaches have offered into the likelihood of extreme and long-lived fluctuations. I will briefly survey how the application of the above ideas in the earth system has been a key focus and motivation of research into natural complexity at BAS [e.g. Watkins & Freeman, Science, 2008; Edwards et al, Nature, 2007]. I will then discuss in detail a standard toy model (linear fractional stable motion, LFSM) which combines the Noah and Joseph effects in a controllable way and explain how it differs from the widely used continuous time random walk. I will describe how LFSM is being used to explore the interplay of the above two effects in the distribution of bursts above thresholds. I will describe ongoing work to improve the accuracy of maximum likelihood-based estimation of burst size and waiting time distributions for LFSM first reported in [Watkins et al, PRE, 2009]; and will also touch on similar work for multifractal models [Watkins et al, PRL comment, 2009].

  8. Distributed rewiring model for complex networking: The effect of local rewiring rules on final structural properties.

    PubMed

    López Chavira, Magali Alexander; Marcelín-Jiménez, Ricardo

    2017-01-01

    The study of complex networks has become an important subject over the last decades. It has been shown that these structures have special features, such as their diameter, or their average path length, which in turn are the explanation of some functional properties in a system such as its fault tolerance, its fragility before attacks, or the ability to support routing procedures. In the present work, we study some of the forces that help a network to evolve to the point where structural properties are settled. Although our work is mainly focused on the possibility of applying our ideas to Information and Communication Technologies systems, we consider that our results may contribute to understanding different scenarios where complex networks have become an important modeling tool. Using a discrete event simulator, we get each node to discover the shortcuts that may connect it with regions away from its local environment. Based on this partial knowledge, each node can rewire some of its links, which allows modifying the topology of the entire underlying graph to achieve new structural properties. We proposed a distributed rewiring model that creates networks with features similar to those found in complex networks. Although each node acts in a distributed way and seeking to reduce only the trajectories of its packets, we observed a decrease of diameter and an increase in clustering coefficient in the global structure compared to the initial graph. Furthermore, we can find different final structures depending on slight changes in the local rewiring rules.
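    The local rewiring dynamics described above can be mimicked with a stdlib-only toy model (the specific rules below are illustrative assumptions, not the authors' exact procedure): starting from a ring lattice, each selected node trades one of its links for a shortcut to a node outside its neighbourhood, and swaps that would disconnect the graph are undone:

    ```python
    import random
    from collections import deque

    def bfs_dists(adj, s):
        """Hop distances from s via breadth-first search."""
        dist = {s: 0}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        return dist

    def diameter(adj):
        return max(max(bfs_dists(adj, s).values()) for s in adj)

    def ring_lattice(n, k=2):
        """Ring where every node links to its k nearest neighbours per side."""
        adj = {i: set() for i in range(n)}
        for i in range(n):
            for j in range(1, k + 1):
                adj[i].add((i + j) % n)
                adj[(i + j) % n].add(i)
        return adj

    def local_rewire(adj, rounds=100, seed=7):
        """Each round, one node swaps a local link for a random shortcut,
        keeping the graph connected (disconnecting swaps are undone)."""
        rng = random.Random(seed)
        nodes = list(adj)
        for _ in range(rounds):
            u = rng.choice(nodes)
            far = [v for v in nodes if v != u and v not in adj[u]]
            near = list(adj[u])
            if not far or not near:
                continue
            v_old, v_new = rng.choice(near), rng.choice(far)
            adj[u].discard(v_old)
            adj[v_old].discard(u)
            if len(bfs_dists(adj, u)) < len(nodes):  # disconnected: undo
                adj[u].add(v_old)
                adj[v_old].add(u)
                continue
            adj[u].add(v_new)
            adj[v_new].add(u)
        return adj
    ```

    On a 60-node ring lattice, the diameter drops sharply after a few dozen shortcut rewires, echoing the paper's observation that purely local decisions reshape global structural properties.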

  9. Multiscale Modeling of Antibody-Drug Conjugates: Connecting Tissue and Cellular Distribution to Whole Animal Pharmacokinetics and Potential Implications for Efficacy.

    PubMed

    Cilliers, Cornelius; Guo, Hans; Liao, Jianshan; Christodolu, Nikolas; Thurber, Greg M

    2016-09-01

    Antibody-drug conjugates exhibit complex pharmacokinetics due to their combination of macromolecular and small molecule properties. These issues range from systemic concerns, such as deconjugation of the small molecule drug during the long antibody circulation time or rapid clearance from nonspecific interactions, to local tumor tissue heterogeneity, cell bystander effects, and endosomal escape. Mathematical models can be used to study the impact of these processes on overall distribution in an efficient manner, and several types of models have been used to analyze varying aspects of antibody distribution including physiologically based pharmacokinetic (PBPK) models and tissue-level simulations. However, these processes are quantitative in nature and cannot be handled qualitatively in isolation. For example, free antibody from deconjugation of the small molecule will impact the distribution of conjugated antibodies within the tumor. To incorporate these effects into a unified framework, we have coupled the systemic and organ-level distribution of a PBPK model with the tissue-level detail of a distributed parameter tumor model. We used this mathematical model to analyze new experimental results on the distribution of the clinical antibody-drug conjugate Kadcyla in HER2-positive mouse xenografts. This model is able to capture the impact of the drug-antibody ratio (DAR) on tumor penetration, the net result of drug deconjugation, and the effect of using unconjugated antibody to drive ADC penetration deeper into the tumor tissue. This modeling approach will provide quantitative and mechanistic support to experimental studies trying to parse the impact of multiple mechanisms of action for these complex drugs.

  10. Multiscale Modeling of Antibody Drug Conjugates: Connecting tissue and cellular distribution to whole animal pharmacokinetics and potential implications for efficacy

    PubMed Central

    Cilliers, Cornelius; Guo, Hans; Liao, Jianshan; Christodolu, Nikolas; Thurber, Greg M.

    2016-01-01

    Antibody drug conjugates exhibit complex pharmacokinetics due to their combination of macromolecular and small molecule properties. These issues range from systemic concerns, such as deconjugation of the small molecule drug during the long antibody circulation time or rapid clearance from non-specific interactions, to local tumor tissue heterogeneity, cell bystander effects, and endosomal escape. Mathematical models can be used to study the impact of these processes on overall distribution in an efficient manner, and several types of models have been used to analyze varying aspects of antibody distribution including physiologically based pharmacokinetic (PBPK) models and tissue-level simulations. However, these processes are quantitative in nature and cannot be handled qualitatively in isolation. For example, free antibody from deconjugation of the small molecule will impact the distribution of conjugated antibodies within the tumor. To incorporate these effects into a unified framework, we have coupled the systemic and organ-level distribution of a PBPK model with the tissue-level detail of a distributed parameter tumor model. We used this mathematical model to analyze new experimental results on the distribution of the clinical antibody drug conjugate Kadcyla in HER2 positive mouse xenografts. This model is able to capture the impact of the drug antibody ratio (DAR) on tumor penetration, the net result of drug deconjugation, and the effect of using unconjugated antibody to drive ADC penetration deeper into the tumor tissue. This modeling approach will provide quantitative and mechanistic support to experimental studies trying to parse the impact of multiple mechanisms of action for these complex drugs. PMID:27287046

  11. Modelling Root Systems Using Oriented Density Distributions

    NASA Astrophysics Data System (ADS)

    Dupuy, Lionel X.

    2011-09-01

    Root architectural models are essential tools to understand how plants access and utilize soil resources during their development. However, root architectural models use complex geometrical descriptions of the root system and this has limitations to model interactions with the soil. This paper presents the development of continuous models based on the concept of oriented density distribution function. The growth of the root system is built as a hierarchical system of partial differential equations (PDEs) that incorporate single root growth parameters such as elongation rate, gravitropism and branching rate which appear explicitly as coefficients of the PDE. Acquisition and transport of nutrients are then modelled by extending Darcy's law to oriented density distribution functions. This framework was applied to build a model of the growth and water uptake of barley root system. This study shows that simplified and computer effective continuous models of the root system development can be constructed. Such models will allow application of root growth models at field scale.

  12. Distribution of free and antibody-bound peptide hormones in two-phase aqueous polymer systems

    PubMed Central

    Desbuquois, Bernard; Aurbach, G. D.

    1972-01-01

    Peptide hormones labelled with radioactive iodine were partitioned into the aqueous two-phase polymer systems developed by Albertsson (1960) and the conditions required for separation of free from antibody-bound hormone have been worked out. Hormones studied included insulin, growth hormone, parathyroid hormone and [arginine]-vasopressin. Free and antibody-bound hormones show different distribution coefficients in a number of systems tested; two systems, the dextran–polyethylene glycol and dextran sulphate–polyethylene glycol system, give optimum separation. Free hormones distribute readily into the upper phase of these systems, whereas hormone–antibody complexes, as well as uncombined antibody, are found almost completely in the lower phase. Various factors including the polymer concentration, the ionic composition of the system, the nature of the hormone and the nature of added serum protein differentially affect the distribution coefficients for free and antibody-bound hormone. These factors can be adequately controlled so as to improve separation. The two-phase partition method has been successfully applied to measure binding of labelled hormone to antibody under standard radioimmunoassay conditions. It exhibits several advantages over the method of equilibration dialysis and can be applied to the study of non-immunological interactions. PMID:4672674

  13. Distributed cooperative control of AC microgrids

    NASA Astrophysics Data System (ADS)

    Bidram, Ali

    In this dissertation, the comprehensive secondary control of electric power microgrids is of concern. Microgrid technical challenges are mainly addressed through the hierarchical control structure, comprising primary, secondary, and tertiary control levels. The primary control level is locally implemented at each distributed generator (DG), while the secondary and tertiary control levels are conventionally implemented through a centralized control structure. The centralized structure requires a central controller, which raises reliability concerns by posing a single point of failure. In this dissertation, a distributed control structure using the distributed cooperative control of multi-agent systems is exploited to increase secondary control reliability. The secondary control objectives are microgrid voltage and frequency, and the DGs' active and reactive powers. Fully distributed control protocols are implemented over distributed communication networks. In the distributed control structure, each DG only requires its own information and the information of its neighbors on the communication network. The distributed structure obviates the requirement for a central controller and a complex communication network, which in turn improves system reliability. Since the DG dynamics are nonlinear and non-identical, input-output feedback linearization is used to transform the nonlinear dynamics of the DGs into linear dynamics. The proposed control frameworks cover microgrids containing inverter-based DGs. Typical microgrid test systems are used to verify the effectiveness of the proposed control protocols.
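    The neighbor-only information pattern described above is the hallmark of pinned (leader-follower) consensus. A minimal sketch of distributed secondary frequency restoration, assuming a simple discrete-time averaging protocol rather than the dissertation's feedback-linearized design; the gain, topology, and pinning set below are illustrative:

    ```python
    def secondary_frequency_control(freqs, neighbors, pinned, f_ref=60.0,
                                    gain=0.2, steps=400):
        """Each DG updates using only its own state and its neighbours';
        DGs in 'pinned' also see the reference f_ref (the virtual leader)."""
        f = list(freqs)
        for _ in range(steps):
            nxt = f[:]
            for i, nbrs in enumerate(neighbors):
                u = sum(f[j] - f[i] for j in nbrs)  # consensus term
                if i in pinned:
                    u += f_ref - f[i]               # pinning term
                nxt[i] = f[i] + gain * u
            f = nxt
        return f
    ```

    With four DGs on a line graph and only DG 0 pinned, all frequencies settle at f_ref, showing how the reference propagates through a sparse communication network with no central controller.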

  14. A Pathophysiological Model-Driven Communication for Dynamic Distributed Medical Best Practice Guidance Systems.

    PubMed

    Hosseini, Mohammad; Jiang, Yu; Wu, Poliang; Berlin, Richard B; Ren, Shangping; Sha, Lui

    2016-11-01

    There is a great divide between rural and urban areas, particularly in medical emergency care. Although medical best practice guidelines exist and are in hospital handbooks, they are often lengthy and difficult to apply clinically. The challenges are exaggerated for doctors in rural areas and emergency medical technicians (EMT) during patient transport. In this paper, we propose the concept of distributed executable medical best practice guidance systems to assist adherence to best practice from the time that a patient first presents at a rural hospital, through diagnosis and ambulance transfer to arrival and treatment at a regional tertiary hospital center. We codify complex medical knowledge in the form of simplified distributed executable disease automata, from the thin automata at rural hospitals to the rich automata in the regional center hospitals. However, a main challenge is how to efficiently and safely synchronize distributed best practice models as the communication among medical facilities, devices, and professionals generates a large number of messages. This complex problem of patient diagnosis and transport from rural to center facility is also fraught with many uncertainties and changes resulting in a high degree of dynamism. A critically ill patient's medical conditions can change abruptly in addition to changes in the wireless bandwidth during the ambulance transfer. Such dynamics have yet to be addressed in existing literature on telemedicine. To address this situation, we propose a pathophysiological model-driven message exchange communication architecture that ensures the real-time and dynamic requirements of synchronization among distributed emergency best practice models are met in a reliable and safe manner. 
Taking the signs, symptoms, and progress of stroke patients transported across a geographically distributed healthcare network as the motivating use case, we implement our communication system and apply it to our developed best practice automata using laboratory simulations. Our proof-of-concept experiments show there is potential for the use of our system in a wide variety of domains.

  15. Some Recent Developments on Complex Multivariate Distributions

    ERIC Educational Resources Information Center

    Krishnaiah, P. R.

    1976-01-01

    In this paper, the author gives a review of the literature on complex multivariate distributions. Some new results on these distributions are also given. Finally, the author discusses the applications of the complex multivariate distributions in the area of the inference on multiple time series. (Author)

  16. PROCOS: computational analysis of protein-protein complexes.

    PubMed

    Fink, Florian; Hochrein, Jochen; Wolowski, Vincent; Merkl, Rainer; Gronwald, Wolfram

    2011-09-01

    One of the main challenges in protein-protein docking is a meaningful evaluation of the many putative solutions. Here we present a program (PROCOS) that calculates a probability-like measure for a given complex to be native. In contrast to scores often used for analyzing complex structures, the calculated probabilities offer the advantage of providing a fixed range of expected values. This will allow, in principle, the comparison of models corresponding to different targets that were solved with the same algorithm. Judgments are based on distributions of properties derived from a large database of native and false complexes. For complex analysis PROCOS uses these property distributions of native and false complexes together with a support vector machine (SVM). PROCOS was compared to the established scoring schemes of ZRANK and DFIRE. Employing a set of experimentally solved native complexes, high probability values above 50% were obtained for 90% of these structures. Next, the performance of PROCOS was tested on the 40 binary targets of the Dockground decoy set, on 14 targets of the RosettaDock decoy set, and on 9 targets that participated in the CAPRI scoring evaluation. Again the advantage of using a probability-based scoring system becomes apparent, and a reasonable number of near-native complexes were found within the top-ranked complexes. In conclusion, a novel fully automated method is presented that allows the reliable evaluation of protein-protein complexes. Copyright © 2011 Wiley Periodicals, Inc.

  17. Robust Architectures for Complex Multi-Agent Heterogeneous Systems

    DTIC Science & Technology

    2014-07-23

    ...establish the tradeoff between the control performance and the QoS of the communications network. We also derived the performance bound on the difference... accomplished within this time period leveraged the prior accomplishments in the area of networked multi-agent systems. The past work (prior to 2011... distributed control of uncertain networked systems [3]. Additionally, a preliminary collision avoidance algorithm has been developed for a team of

  18. Evaluation of Lightning Incidence to Elements of a Complex Structure: A Monte Carlo Approach

    NASA Technical Reports Server (NTRS)

    Mata, Carlos T.; Rakov, V. A.

    2008-01-01

    There are complex structures for which the installation and positioning of the lightning protection system (LPS) cannot be done using the lightning protection standard guidelines. As a result, there are some "unprotected" or "exposed" areas. In an effort to quantify the lightning threat to these areas, a Monte Carlo statistical tool has been developed. This statistical tool uses two random number generators: a uniform distribution to generate origins of downward propagating leaders and a lognormal distribution to generate return-stroke peak currents. Downward leaders propagate vertically, and their striking distances are defined by the polarity and peak current. Following the electrogeometrical concept, we assume that the leader attaches to the closest object within its striking distance. The statistical analysis is run for 10,000 years with an assumed ground flash density and peak current distribution, and the output of the program is the probability of direct attachment to objects of interest with its corresponding peak current distribution.
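    The electrogeometric attachment rule lends itself to a compact Monte Carlo sketch. The geometry below (a single mast over flat ground in a strip), the r = 10 * I**0.65 striking-distance law, and the lognormal parameters are illustrative assumptions, not the actual tool's configuration:

    ```python
    import math
    import random

    def simulate_strikes(n_flashes=100_000, mast_height=30.0, half_width=500.0,
                         median_kA=31.0, sigma_ln=0.66, seed=42):
        """Electrogeometric Monte Carlo: leaders start at uniform-random
        horizontal positions, peak currents are lognormal, and the striking
        distance follows the common r = 10 * I**0.65 form (r in m, I in kA)."""
        rng = random.Random(seed)
        mast_hits, mast_currents = 0, []
        for _ in range(n_flashes):
            x = rng.uniform(-half_width, half_width)  # leader origin (uniform)
            i_peak = rng.lognormvariate(math.log(median_kA), sigma_ln)
            r = 10.0 * i_peak ** 0.65                 # striking distance
            # lateral attractive radius of a mast of height h
            if r >= mast_height:
                attract = math.sqrt(mast_height * (2.0 * r - mast_height))
            else:
                attract = r
            if abs(x) <= attract:                     # closest object: the mast
                mast_hits += 1
                mast_currents.append(i_peak)
        return mast_hits / n_flashes, mast_currents
    ```

    Multiplying the returned probability by an assumed ground flash density and the strip area gives expected direct strikes per year, and the collected currents form the peak-current distribution of attachments, which is biased toward larger currents because larger currents have larger attractive radii.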

  19. System of multifunctional laser polarimetry of phase and amplitude anisotropy in the diagnosis of endometriosis

    NASA Astrophysics Data System (ADS)

    Ushenko, Yu. O.; Dubolazov, O. V.; Olar, O. V.

    2015-11-01

    The theoretical background of an azimuthally stable method of Jones-matrix mapping of histological sections of biopsy of the uterine neck, on the basis of spatial-frequency selection of the mechanisms of linear and circular birefringence, is presented. The comparative results of measuring the coordinate distributions of the complex degree of mutual anisotropy formed by polycrystalline networks of blood plasma layers of donors (group 1) and patients with endometriosis (group 2) are presented. The values and ranges of change of the statistical (1st- to 4th-order moments) parameters of the complex degree of mutual anisotropy coordinate distributions are studied. Objective criteria for diagnosing the pathology and differentiating its severity degree are determined.

  20. Electrophysiological evidence of functional integration between the language and motor systems in the brain: a study of the speech Bereitschaftspotential.

    PubMed

    McArdle, J J; Mari, Z; Pursley, R H; Schulz, G M; Braun, A R

    2009-02-01

    We investigated whether the Bereitschaftspotential (BP), an event related potential believed to reflect motor planning, would be modulated by language-related parameters prior to speech. We anticipated that articulatory complexity would produce effects on the BP distribution similar to those demonstrated for complex limb movements. We also hypothesized that lexical semantic operations would independently impact the BP. Eighteen participants performed 3 speech tasks designed to differentiate lexical semantic and articulatory contributions to the BP. EEG epochs were time-locked to the earliest source of speech movement per trial. Lip movements were assessed using EMG recordings. Doppler imaging was used to determine the onset of tongue movement during speech, providing a means of identification and elimination of potential artifact. Compared to simple repetition, complex articulations produced an anterior shift in the maximum midline BP. Tasks requiring lexical search and selection augmented these effects and independently elicited a left lateralized asymmetry in the frontal distribution. The findings indicate that the BP is significantly modulated by linguistic processing, suggesting that the premotor system might play a role in lexical access. These novel findings support the notion that the motor systems may play a significant role in the formulation of language.

  1. Underestimating extreme events in power-law behavior due to machine-dependent cutoffs

    NASA Astrophysics Data System (ADS)

    Radicchi, Filippo

    2014-11-01

    Power-law distributions are typical macroscopic features occurring in almost all complex systems observable in nature. As a result, researchers in quantitative analyses must often generate random synthetic variates obeying power-law distributions. The task is usually performed through standard methods that map uniform random variates into the desired probability space. Whereas all these algorithms are theoretically solid, in this paper we show that they are subject to severe machine-dependent limitations. As a result, two dramatic consequences arise: (i) the sampling in the tail of the distribution is not random but deterministic; (ii) the moments of the sample distribution, which are theoretically expected to diverge as functions of the sample sizes, converge instead to finite values. We provide quantitative indications for the range of distribution parameters that can be safely handled by standard libraries used in computational analyses. Whereas our findings indicate possible reinterpretations of numerical results obtained through flawed sampling methodologies, they also pave the way for the search for a concrete solution to this central issue shared by all quantitative sciences dealing with complexity.
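    The machine-dependent ceiling described above is easy to exhibit with standard inverse-transform sampling. A minimal sketch, assuming Python's 53-bit `random.random()` generator, whose outputs are multiples of 2**-53, so 1 - u can never be smaller than 2**-53:

    ```python
    import random

    def powerlaw_variate(rng, alpha, x_min=1.0):
        """Inverse-transform sample of p(x) ~ x**(-alpha) for x >= x_min."""
        u = rng.random()  # uniform in [0, 1), 53-bit resolution
        return x_min * (1.0 - u) ** (-1.0 / (alpha - 1.0))

    def machine_cutoff(alpha, x_min=1.0):
        """Largest variate the generator can ever emit: since 1 - u >= 2**-53,
        the tail is deterministically truncated at this value."""
        return x_min * 2.0 ** (53.0 / (alpha - 1.0))
    ```

    For alpha = 2.5 the cap is 2**(53/1.5), roughly 4.3e10: no sample can ever exceed it, so sample moments that should diverge with sample size converge to finite values instead, exactly the effect the paper quantifies.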

  2. Capillary Discharge Thruster Experiments and Modeling (Briefing Charts)

    DTIC Science & Technology

    2016-06-01

R.S. Martin (ERC Inc.), In-Space Propulsion Branch, Air Force Research Laboratory, Edwards Air Force Base, CA, USA. Electric propulsion systems, June 2016 ... propulsion models & experiments. Spacecraft-propulsion-relevant plasma: from Hall thrusters to plumes and fluxes on components; complex reaction physics ... propulsion plumes, FRC chamber environment. Distribution A: approved for public release; distribution unlimited; PA# 16279.

  3. Universality and chaotic dynamics in reactive scattering of ultracold KRb molecules with K atoms

    NASA Astrophysics Data System (ADS)

    Li, Ming; Makrides, Constantinos; Petrov, Alexander; Kotochigova, Svetlana; Croft, James F. E.; Balakrishnan, Naduvalath; Kendrick, Brian K.

    2017-04-01

We study the benchmark reaction between KRb, the most celebrated ultracold polar molecule, and an ultracold K atom. For the first time, we map out an accurate ab initio ground potential energy surface of the K2Rb complex in full dimensionality and perform a numerically exact quantum-mechanical calculation of the reaction dynamics based on a coupled-channel approach in hyperspherical coordinates. An analysis of the adiabatic hyperspherical potentials reveals a chaotic distribution for the short-range complex that plays a key role in governing the reaction outcome. The equivalent distribution for a lighter collisional system with a smaller density of states (here the Li2Yb trimer) shows only random behavior. We find an extreme sensitivity of our chaotic system to a small perturbation associated with the weak non-additive three-body potential contribution, which does not affect the total reaction rate coefficient but leads to a significant change in the rotational distribution of the product molecule. In both cases the distribution of these rates is random, or Poissonian. This work was supported in part by NSF Grants PHY-1505557 (N.B.) and PHY-1619788 (S.K.), ARO MURI Grant No. W911NF-12-1-0476 (N.B. & S.K.), and DOE LDRD Grant No. 20170221ER (B.K.).

  4. Distributions of vesicular glutamate transporters 1 and 2 in the visual system of tree shrews (Tupaia belangeri)1

    PubMed Central

    Balaram, P; Isaamullah, M; Petry, HM; Bickford, ME; Kaas, JH

    2014-01-01

    Vesicular glutamate transporter (VGLUT) proteins regulate the storage and release of glutamate from synapses of excitatory neurons. Two isoforms, VGLUT1 and VGLUT2, are found in most glutamatergic projections across the mammalian visual system, and appear to differentially identify subsets of excitatory projections between visual structures. To expand current knowledge on the distribution of VGLUT isoforms in highly visual mammals, we examined the mRNA and protein expression patterns of VGLUT1 and VGLUT2 in the lateral geniculate nucleus (LGN), superior colliculus, pulvinar complex, and primary visual cortex (V1) in tree shrews (Tupaia belangeri), which are closely related to primates but classified as a separate order (Scandentia). We found that VGLUT1 was distributed in intrinsic and corticothalamic connections, whereas VGLUT2 was predominantly distributed in subcortical and thalamocortical connections. VGLUT1 and VGLUT2 were coexpressed in the LGN and in the pulvinar complex, as well as in restricted layers of V1, suggesting a greater heterogeneity in the range of efferent glutamatergic projections from these structures. These findings provide further evidence that VGLUT1 and VGLUT2 identify distinct populations of excitatory neurons in visual brain structures across mammals. Observed variations in individual projections may highlight the evolution of these connections through the mammalian lineage. PMID:25521420

  5. Distributed Optimization of Multi-Agent Systems: Framework, Local Optimizer, and Applications

    NASA Astrophysics Data System (ADS)

    Zu, Yue

Convex optimization problems can be solved in a centralized or distributed manner. Compared with centralized methods based on a single-agent system, distributed algorithms rely on multi-agent systems in which information is exchanged among connected neighbors, which greatly improves system fault tolerance: a task within a multi-agent system can be completed even in the presence of partial agent failures. Through problem decomposition, a large-scale problem can be divided into a set of small-scale sub-problems that can be solved in sequence or in parallel, so distributed algorithms greatly reduce the computational complexity in multi-agent systems. Moreover, distributed algorithms allow data to be collected and stored in a distributed fashion, which overcomes the bandwidth limitations of multicast. Distributed algorithms have been applied to a variety of real-world problems; our research focuses on the framework and local optimizer design in practical engineering applications. First, we propose a multi-sensor, multi-agent scheme for spatial motion estimation of a rigid body, improving estimation performance in terms of accuracy and convergence speed. Second, we develop a cyber-physical system and implement distributed computation devices to optimize the in-building evacuation path when a hazard occurs; the proposed Bellman-Ford Dual-Subgradient path planning method relieves congestion in corridor and exit areas. Third, highway traffic flow is managed by adjusting speed limits to minimize fuel consumption and travel time. The optimal control strategy is designed through both centralized and distributed algorithms based on a convex problem formulation. Moreover, a hybrid control scheme is presented for minimizing travel time on a highway network. Compared with the uncontrolled case and a conventional highway traffic control strategy, the proposed hybrid control strategy greatly reduces total travel time on the test highway network.
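The Bellman-Ford relaxation that the path planner above builds on can be sketched in a few lines (a minimal illustrative version on a hypothetical four-node corridor network, not the dissertation's distributed dual-subgradient implementation):

```python
def bellman_ford(num_nodes, edges, source):
    """Shortest-path distances from source by repeated edge relaxation."""
    INF = float("inf")
    dist = [INF] * num_nodes
    dist[source] = 0.0
    # Relax every edge num_nodes - 1 times; each pass propagates
    # correct distances at least one more hop through the network.
    for _ in range(num_nodes - 1):
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    return dist

# Hypothetical 4-node network: (from, to, travel cost) per corridor segment.
edges = [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 5.0), (2, 3, 1.0)]
dist = bellman_ford(4, edges, 0)
```

Because each relaxation step uses only information local to an edge, the same update rule lends itself to the kind of distributed, per-agent computation the dissertation describes.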

  6. Structural and functional networks in complex systems with delay.

    PubMed

    Eguíluz, Víctor M; Pérez, Toni; Borge-Holthoefer, Javier; Arenas, Alex

    2011-05-01

Functional networks of complex systems are obtained from the analysis of the temporal activity of their components, and are often used to infer their unknown underlying connectivity. We obtain the equations relating topology and function in a system of diffusively delay-coupled elements in complex networks. We solve exactly the resulting equations in motifs (directed structures of three nodes) and in directed networks. The mean-field solution for directed uncorrelated networks shows that the clusterization of the activity is dominated by the in-degree of the nodes, and that the locking frequency decreases with increasing average degree. We find that the exponent of a power law degree distribution of the structural topology γ is related to the exponent of the associated functional network as α = (2-γ)^-1 for γ < 2. © 2011 American Physical Society

  7. Understanding scaling through history-dependent processes with collapsing sample space.

    PubMed

    Corominas-Murtra, Bernat; Hanel, Rudolf; Thurner, Stefan

    2015-04-28

    History-dependent processes are ubiquitous in natural and social systems. Many such stochastic processes, especially those that are associated with complex systems, become more constrained as they unfold, meaning that their sample space, or their set of possible outcomes, reduces as they age. We demonstrate that these sample-space-reducing (SSR) processes necessarily lead to Zipf's law in the rank distributions of their outcomes. We show that by adding noise to SSR processes the corresponding rank distributions remain exact power laws, p(x) ~ x(-λ), where the exponent directly corresponds to the mixing ratio of the SSR process and noise. This allows us to give a precise meaning to the scaling exponent in terms of the degree to which a given process reduces its sample space as it unfolds. Noisy SSR processes further allow us to explain a wide range of scaling exponents in frequency distributions ranging from α = 2 to ∞. We discuss several applications showing how SSR processes can be used to understand Zipf's law in word frequencies, and how they are related to diffusion processes in directed networks, or aging processes such as in fragmentation processes. SSR processes provide a new alternative to understand the origin of scaling in complex systems without the recourse to multiplicative, preferential, or self-organized critical processes.
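A noise-free SSR process is simple to simulate. The sketch below (illustrative Python; the state-space size and run count are arbitrary) reproduces the Zipf visitation law described above:

```python
import random
from collections import Counter

def ssr_run(n_states, rng):
    """One sample-space-reducing run: starting from the largest state,
    jump to a uniformly chosen strictly smaller state until state 1."""
    visits = []
    state = n_states
    while state > 1:
        state = rng.randint(1, state - 1)
        visits.append(state)
    return visits

rng = random.Random(0)
counts = Counter()
for _ in range(20000):
    counts.update(ssr_run(100, rng))
# Each run visits state i with probability 1/i, so the ranked visit
# frequencies fall off as Zipf's law: state 1 appears in every run,
# state 2 in about half of them, and so on.
```

Adding a mixing probability of jumping uniformly over the *full* state space, as the paper describes, flattens the exponent away from the pure Zipf value.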

  8. Distributed Traffic Complexity Management by Preserving Trajectory Flexibility

    NASA Technical Reports Server (NTRS)

    Idris, Husni; Vivona, Robert A.; Garcia-Chico, Jose-Luis; Wing, David J.

    2007-01-01

In order to handle the expected increase in air traffic volume, the next generation air transportation system is moving towards a distributed control architecture, in which ground-based service providers such as controllers and traffic managers and air-based users such as pilots share responsibility for aircraft trajectory generation and management. This paper presents preliminary research investigating a distributed trajectory-oriented approach to manage traffic complexity, based on preserving trajectory flexibility. The underlying hypotheses are that preserving trajectory flexibility autonomously by aircraft naturally achieves the aggregate objective of avoiding excessive traffic complexity, and that trajectory flexibility is increased by collaboratively minimizing trajectory constraints without jeopardizing the intended air traffic management objectives. This paper presents an analytical framework in which flexibility is defined in terms of robustness and adaptability to disturbances, and preliminary metrics are proposed that can be used to preserve trajectory flexibility. The hypothesized impacts are illustrated by analyzing a trajectory solution space in a simple scenario with only speed as a degree of freedom, and in constraint situations involving meeting multiple times of arrival and resolving conflicts.

  9. Functional consequences of structural differences in stingray sensory systems. Part I: mechanosensory lateral line canals.

    PubMed

    Jordan, Laura K; Kajiura, Stephen M; Gordon, Malcolm S

    2009-10-01

Short-range hydrodynamic and electrosensory signals are important during the final stages of prey capture in elasmobranchs (sharks, skates and rays), and may be particularly useful for dorso-ventrally flattened batoids with mouths hidden from their eyes. In stingrays, both the lateral line canal and electrosensory systems are highly modified and complex, with significant differences on the ventral surfaces that relate to feeding ecology. This study tests functional hypotheses based on quantified differences in the sensory system morphology of three stingray species, Urobatis halleri, Myliobatis californica and Pteroplatytrygon violacea. Part I investigates the mechanosensory lateral line canal system, whereas Part II focuses on the electrosensory system. Stingray lateral line canals include both pored and non-pored sections and differ in branching complexity and distribution. A greater proportion of pored canals and high pore numbers were predicted to correspond to increased response to water flow. Behavioral experiments were performed to compare responses of stingrays to weak water jets mimicking signals produced by potential prey at velocities of 10-20 cm s^-1. Bat rays, M. californica, have the most complex and broadly distributed pored canal network and demonstrated both the highest response rate and the greatest response intensity to water jet signals. Results suggest that U. halleri and P. violacea may rely on additional sensory input, including tactile and visual cues, respectively, to initiate stronger feeding responses. These results suggest that stingray lateral line canal morphology can indicate detection capabilities through responsiveness to weak water jets.

  10. Influence of Force Fields and Quantum Chemistry Approach on Spectral Densities of BChl a in Solution and in FMO Proteins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chandrasekaran, Suryanarayanan; Aghtar, Mortaza; Valleau, Stéphanie

    2015-08-06

Studies on light-harvesting (LH) systems have attracted much attention after the finding of long-lived quantum coherences in the exciton dynamics of the Fenna–Matthews–Olson (FMO) complex. In this complex, excitation energy transfer occurs between the bacteriochlorophyll a (BChl a) pigments. Two quantum mechanics/molecular mechanics (QM/MM) studies, each with a different force-field and quantum chemistry approach, reported different excitation energy distributions for the FMO complex. To understand the reasons for these differences in the predicted excitation energies, we have carried out a comparative study between the simulations using the CHARMM and AMBER force fields and the Zerner intermediate neglect of differential orbital (ZINDO)/S and time-dependent density functional theory (TDDFT) quantum chemistry methods. The calculations using the CHARMM force field together with ZINDO/S or TDDFT always show a wider spread in the energy distribution compared to those using the AMBER force field. High- or low-energy tails in these energy distributions result in larger values for the spectral density at low frequencies. A detailed study on individual BChl a molecules in solution shows that without the environment, the density of states is the same for both force field sets. Including the environmental point charges, however, the excitation energy distribution gets broader and, depending on the applied methods, also asymmetric. The excitation energy distribution predicted using TDDFT together with the AMBER force field shows a symmetric, Gaussian-like distribution.

  11. Non-extensivity and complexity in the earthquake activity at the West Corinth rift (Greece)

    NASA Astrophysics Data System (ADS)

    Michas, Georgios; Vallianatos, Filippos; Sammonds, Peter

    2013-04-01

Earthquakes exhibit complex phenomenology that is revealed in their fractal structure in space, time and magnitude. For that reason, tools other than simple Poissonian statistics seem more appropriate for describing the statistical properties of the phenomenon. Here we use Non-Extensive Statistical Physics (NESP) to investigate the inter-event time distribution of the earthquake activity at the west Corinth rift (central Greece). This area is one of the most seismotectonically active areas in Europe, with important continental N-S extension and high seismicity rates. The NESP concept is built on the non-additive Tsallis entropy Sq, which includes the Boltzmann-Gibbs entropy as a particular case. This concept has been successfully used for the analysis of a variety of complex dynamic systems, including earthquakes, where fractality and long-range interactions are important. The analysis indicates that the cumulative inter-event time distribution can be successfully described with NESP, implying the complexity that characterizes the temporal occurrence of earthquakes. Further, we use the Tsallis entropy (Sq) and the Fisher Information Measure (FIM) to investigate the complexity that characterizes the inter-event time distribution in different time windows along the evolution of the seismic activity at the west Corinth rift. The results of this analysis reveal different levels of organization and clustering of the seismic activity in time. Acknowledgments: GM wishes to acknowledge the partial support of the Greek State Scholarships Foundation (IKY).

  12. Network Communication as a Service-Oriented Capability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnston, William; Johnston, William; Metzger, Joe

    2008-01-08

In widely distributed systems generally, and in science-oriented Grids in particular, software, CPU time, storage, etc., are treated as "services": they can be allocated and used with service guarantees that allow them to be integrated into systems that perform complex tasks. Network communication is currently not a service; it is provided, in general, as a "best effort" capability with no guarantees and only statistical predictability. In order for Grids (and most types of systems with widely distributed components) to be successful in performing the sustained, complex tasks of large-scale science -- e.g., the multi-disciplinary simulation of next-generation climate modeling, and the management and analysis of the petabytes of data that will come from the next generation of scientific instruments (which is very soon for the LHC at CERN) -- networks must provide communication capability that is service-oriented: that is, it must be configurable, schedulable, predictable, and reliable. In order to accomplish this, the research and education network community is undertaking a strategy that involves changes in network architecture to support multiple classes of service; development and deployment of service-oriented communication services; and monitoring and reporting in a form that is directly useful to the application-oriented system so that it may adapt to communications failures. In this paper we describe ESnet's approach to each of these -- an approach that is part of an international community effort to base intra-distributed-system communication on a service-oriented capability.

  13. Effects of mesoscale structures on the distribution of cephalopod paralarvae in the Gulf of California and adjacent Pacific

    NASA Astrophysics Data System (ADS)

    Ruvalcaba-Aroche, Erick D.; Sánchez-Velasco, Laura; Beier, Emilio; Godínez, Victor M.; Barton, Eric D.; Pacheco, Ma. Rocío

    2018-01-01

The vertical distribution of cephalopod paralarvae was investigated in relation to a system of two cyclonic and three anticyclonic eddies in the southern Gulf of California and a front in the adjacent Pacific Ocean. Results showed that the preferential habitat for the Sthenoteuthis oualaniensis - Dosidicus gigas "SD complex" in both regions was the oxygenated surface mixed layer and the thermocline. The highest abundances occurred in one of the anticyclonic eddies and in a frontal zone, both of which are convergent structures. Enoploteuthid and pyroteuthid paralarvae both displayed their highest abundances in the thermocline. Pyroteuthids dominated in the cyclonic eddy, whereas enoploteuthids were less evident in the eddy system. Pyroteuthids were observed on the western (California Current) side of the frontal zone, and enoploteuthids on its eastern (Gulf of California) side. The octopods and the complex of Ommastrephes-Eucleoteuthis-Hyaloteuthis paralarvae were present below the thermocline. Both groups were scarce in the eddy system and abundant near the frontal zone. The octopods abounded on the eastern side in association with the low dissolved oxygen concentrations (< 44 μmol kg^-1) of Subtropical-Subsurface Water; the complex on the western side of the front was immersed in California Current Water. It may be concluded that the spawning and early developmental stages of these cephalopod groups are associated with particular mesoscale structures of the water masses. For example, the "SD complex" inhabits the surface water masses, preferentially in convergence zones generated by mesoscale activity.

  14. Statistics of Shared Components in Complex Component Systems

    NASA Astrophysics Data System (ADS)

    Mazzolini, Andrea; Gherardi, Marco; Caselle, Michele; Cosentino Lagomarsino, Marco; Osella, Matteo

    2018-04-01

    Many complex systems are modular. Such systems can be represented as "component systems," i.e., sets of elementary components, such as LEGO bricks in LEGO sets. The bricks found in a LEGO set reflect a target architecture, which can be built following a set-specific list of instructions. In other component systems, instead, the underlying functional design and constraints are not obvious a priori, and their detection is often a challenge of both scientific and practical importance, requiring a clear understanding of component statistics. Importantly, some quantitative invariants appear to be common to many component systems, most notably a common broad distribution of component abundances, which often resembles the well-known Zipf's law. Such "laws" affect in a general and nontrivial way the component statistics, potentially hindering the identification of system-specific functional constraints or generative processes. Here, we specifically focus on the statistics of shared components, i.e., the distribution of the number of components shared by different system realizations, such as the common bricks found in different LEGO sets. To account for the effects of component heterogeneity, we consider a simple null model, which builds system realizations by random draws from a universe of possible components. Under general assumptions on abundance heterogeneity, we provide analytical estimates of component occurrence, which quantify exhaustively the statistics of shared components. Surprisingly, this simple null model can positively explain important features of empirical component-occurrence distributions obtained from large-scale data on bacterial genomes, LEGO sets, and book chapters. Specific architectural features and functional constraints can be detected from occurrence patterns as deviations from these null predictions, as we show for the illustrative case of the "core" genome in bacteria.
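The random-draw null model can be sketched in a few lines of Python (an illustration only; the Zipf-like abundance weights and all parameter values are assumptions for the sketch, not the paper's data):

```python
import random
from collections import Counter

def occurrence_counts(n_components, set_size, n_sets, seed=1):
    """Null model: each realization ("set") draws set_size components
    independently, with probability proportional to a Zipf-like global
    abundance w_i = 1/i; returns how many sets each component occurs in."""
    rng = random.Random(seed)
    comps = range(n_components)
    weights = [1.0 / (i + 1) for i in comps]
    occ = Counter()
    for _ in range(n_sets):
        # A component counts once per realization, however often drawn.
        occ.update(set(rng.choices(comps, weights=weights, k=set_size)))
    return occ

occ = occurrence_counts(1000, 50, 500)
# Abundant ("core") components occur in nearly every realization,
# while rare components appear in only a handful of sets.
```

Deviations of empirical occurrence counts from such null predictions are what the paper reads as signatures of system-specific functional constraints.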

  15. High-Assurance Spiral

    DTIC Science & Technology

    2017-11-01

Public Release; Distribution Unlimited. PA# 88ABW-2017-5388; Date Cleared: 30 OCT 2017. Cyber-physical systems ... physical processes that interact in intricate manners. This makes verification of the software complex and unwieldy. In this report, an approach towards ... resulting implementations. Subject terms: cyber-physical systems, formal guarantees, code generation.

  16. Assessment of the integration capability of system architectures from a complex and distributed software systems perspective

    NASA Astrophysics Data System (ADS)

    Leuchter, S.; Reinert, F.; Müller, W.

    2014-06-01

Procurement and design of system architectures capable of network-centric operations demand an assessment scheme for comparing alternative realizations. In this contribution, an assessment method for system architectures targeted at the C4ISR domain is presented. The method addresses the integration capability of software systems from a complex and distributed software system perspective, focusing on communication, interfaces, and software. The aim is to evaluate the capability to integrate a system or its functions within a system-of-systems network. The method uses approaches from software architecture quality assessment and applies them at the system architecture level. It features a specific goal tree with several dimensions that are relevant for enterprise integration. These dimensions have to be weighed against each other and aggregated using methods from normative decision theory in order to reflect the intention of the particular enterprise integration effort. The indicators and measurements for many of the considered quality features rely on a model-based view of systems, networks, and the enterprise. This means the method is applicable to system-of-systems specifications based on enterprise architecture frameworks that rely on defined meta-models or domain ontologies for defining views and viewpoints. In the defense context, we use the NATO Architecture Framework (NAF) to ground the respective system models. The proposed assessment method allows evaluating and comparing competing system designs with regard to their future integration potential. It is a contribution to the system-of-systems engineering methodology.

  17. Coherent optical monolithic phased-array antenna steering system

    DOEpatents

    Hietala, Vincent M.; Kravitz, Stanley H.; Vawter, Gregory A.

    1994-01-01

    An optical-based RF beam steering system for phased-array antennas comprising a photonic integrated circuit (PIC). The system is based on optical heterodyning employed to produce microwave phase shifting by a monolithic PIC constructed entirely of passive components. Microwave power and control signal distribution to the antenna is accomplished by optical fiber, permitting physical separation of the PIC and its control functions from the antenna. The system reduces size, weight, complexity, and cost of phased-array antenna systems.

  18. Finding equilibrium in the spatiotemporal chaos of the complex Ginzburg-Landau equation

    NASA Astrophysics Data System (ADS)

    Ballard, Christopher C.; Esty, C. Clark; Egolf, David A.

    2016-11-01

    Equilibrium statistical mechanics allows the prediction of collective behaviors of large numbers of interacting objects from just a few system-wide properties; however, a similar theory does not exist for far-from-equilibrium systems exhibiting complex spatial and temporal behavior. We propose a method for predicting behaviors in a broad class of such systems and apply these ideas to an archetypal example, the spatiotemporal chaotic 1D complex Ginzburg-Landau equation in the defect chaos regime. Building on the ideas of Ruelle and of Cross and Hohenberg that a spatiotemporal chaotic system can be considered a collection of weakly interacting dynamical units of a characteristic size, the chaotic length scale, we identify underlying, mesoscale, chaotic units and effective interaction potentials between them. We find that the resulting equilibrium Takahashi model accurately predicts distributions of particle numbers. These results suggest the intriguing possibility that a class of far-from-equilibrium systems may be well described at coarse-grained scales by the well-established theory of equilibrium statistical mechanics.

  19. Finding equilibrium in the spatiotemporal chaos of the complex Ginzburg-Landau equation.

    PubMed

    Ballard, Christopher C; Esty, C Clark; Egolf, David A

    2016-11-01

    Equilibrium statistical mechanics allows the prediction of collective behaviors of large numbers of interacting objects from just a few system-wide properties; however, a similar theory does not exist for far-from-equilibrium systems exhibiting complex spatial and temporal behavior. We propose a method for predicting behaviors in a broad class of such systems and apply these ideas to an archetypal example, the spatiotemporal chaotic 1D complex Ginzburg-Landau equation in the defect chaos regime. Building on the ideas of Ruelle and of Cross and Hohenberg that a spatiotemporal chaotic system can be considered a collection of weakly interacting dynamical units of a characteristic size, the chaotic length scale, we identify underlying, mesoscale, chaotic units and effective interaction potentials between them. We find that the resulting equilibrium Takahashi model accurately predicts distributions of particle numbers. These results suggest the intriguing possibility that a class of far-from-equilibrium systems may be well described at coarse-grained scales by the well-established theory of equilibrium statistical mechanics.

  20. A distributed approach for optimizing cascaded classifier topologies in real-time stream mining systems.

    PubMed

    Foo, Brian; van der Schaar, Mihaela

    2010-11-01

    In this paper, we discuss distributed optimization techniques for configuring classifiers in a real-time, informationally-distributed stream mining system. Due to the large volume of streaming data, stream mining systems must often cope with overload, which can lead to poor performance and intolerable processing delay for real-time applications. Furthermore, optimizing over an entire system of classifiers is a difficult task since changing the filtering process at one classifier can impact both the feature values of data arriving at classifiers further downstream and thus, the classification performance achieved by an ensemble of classifiers, as well as the end-to-end processing delay. To address this problem, this paper makes three main contributions: 1) Based on classification and queuing theoretic models, we propose a utility metric that captures both the performance and the delay of a binary filtering classifier system. 2) We introduce a low-complexity framework for estimating the system utility by observing, estimating, and/or exchanging parameters between the inter-related classifiers deployed across the system. 3) We provide distributed algorithms to reconfigure the system, and analyze the algorithms based on their convergence properties, optimality, information exchange overhead, and rate of adaptation to non-stationary data sources. We provide results using different video classifier systems.
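The coupling between one classifier's filtering decision and the load on downstream classifiers can be illustrated with a toy queueing model (a sketch under M/M/1 assumptions; the service rates, forward fractions, and the delay formula used here are illustrative choices, not the paper's exact utility metric):

```python
def chain_performance(arrival_rate, stages):
    """End-to-end throughput and delay of a cascade of binary filters.

    stages: list of (service_rate, forward_fraction) per classifier.
    Each stage is treated as an M/M/1 queue with mean sojourn time
    1/(mu - lambda); items not forwarded are filtered out of the stream.
    """
    total_delay = 0.0
    rate = arrival_rate
    for mu, forward in stages:
        if rate >= mu:
            raise ValueError("stage overloaded: arrival rate >= service rate")
        total_delay += 1.0 / (mu - rate)
        rate *= forward  # downstream classifiers see only forwarded items
    return rate, total_delay

# Two-stage cascade: configuring the first filter to forward less traffic
# lowers the arrival rate, and hence the queueing delay, at every
# downstream stage -- the trade-off the distributed algorithms negotiate.
out_rate, delay = chain_performance(1.0, [(3.0, 0.5), (2.0, 0.5)])
```

This captures, in miniature, why the configuration of inter-related classifiers must be optimized jointly rather than per node.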

  1. Nanoparticles from renewable polymers

    PubMed Central

    Wurm, Frederik R.; Weiss, Clemens K.

    2014-01-01

The use of polymers from natural resources can bring many benefits to novel polymeric nanoparticle systems. Such polymers have a variety of beneficial properties, such as biodegradability and biocompatibility, and they are readily available at large scale and low cost. As fossil fuel reserves decrease, their application becomes more attractive, even though characterization is in many cases more challenging due to structural complexity, whether from a broad distribution of molecular weights (polysaccharides, polyesters, lignin) or from complex structure (proteins, lignin). This review summarizes different sources and methods for the preparation of biopolymer-based nanoparticle systems for various applications. PMID:25101259

  2. High throughput computing: a solution for scientific analysis

    USGS Publications Warehouse

    O'Donnell, M.

    2011-01-01

    handle job failures due to hardware, software, or network interruptions (obviating the need to manually resubmit the job after each stoppage); be affordable; and most importantly, allow us to complete very large, complex analyses that otherwise would not even be possible. In short, we envisioned a job-management system that would take advantage of unused FORT CPUs within a local area network (LAN) to effectively distribute and run highly complex analytical processes. What we found was a solution that uses High Throughput Computing (HTC) and High Performance Computing (HPC) systems to do exactly that (Figure 1).

  3. Innovations in clinical trials informatics.

    PubMed

    Summers, Ron; Vyas, Hiten; Dudhal, Nilesh; Doherty, Neil F; Coombs, Crispin R; Hepworth, Mark

    2008-01-01

    This paper will investigate innovations in information management for use in clinical trials. The application typifies a complex, adaptive, distributed and information-rich environment for which continuous innovation is necessary. Organisational innovation is highlighted as well as the technical innovations in workflow processes and their representation as an integrated set of web services. Benefits realization uncovers further innovations in the business strand of the work undertaken. Following the description of the development of this information management system, the semantic web is postulated as a possible solution to tame the complexity related to information management issues found within clinical trials support systems.

  4. Centralized vs decentralized lunar power system study

    NASA Astrophysics Data System (ADS)

    Metcalf, Kenneth; Harty, Richard B.; Perronne, Gerald E.

    1991-09-01

Three power-system options are considered for a lunar base: a fully centralized option, a fully decentralized option, and a hybrid combining features of the first two. Power source, power conditioning, and power transmission are considered separately, and each architecture option is examined with ac and dc distribution, high- and low-voltage transmission, and buried and suspended cables. Assessments are made on the basis of mass, technological complexity, cost, reliability, and installation complexity; however, a preferred power-system architecture is not proposed. Preferred options include ac transmission at voltages of 2000-7000 V, with buried high-voltage lines and suspended low-voltage lines. Assessments of the total cost of the installations are required to determine the most suitable power system.

  5. Centralities in simplicial complexes. Applications to protein interaction networks.

    PubMed

    Estrada, Ernesto; Ross, Grant J

    2018-02-07

    Complex networks can be used to represent complex systems which originate in the real world. Here we study a transformation of these complex networks into simplicial complexes, where cliques represent the simplices of the complex. We extend the concept of node centrality to that of simplicial centrality and study several mathematical properties of degree, closeness, betweenness, eigenvector, Katz, and subgraph centrality for simplicial complexes. We study the degree distributions of these centralities at the different levels. We also compare and describe the differences between the centralities at the different levels. Using these centralities we study a method for detecting essential proteins in PPI networks of cells and explain the varying abilities of the centrality measures at the different levels in identifying these essential proteins. Copyright © 2017 Elsevier Ltd. All rights reserved.
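The level-k degree centrality described above can be sketched directly: cliques of k+1 nodes act as k-simplices, and the degree of a k-simplex counts the (k+1)-simplices containing it. This is an illustrative reconstruction with our own function names, not the authors' code:

```python
from itertools import combinations

def cliques_of_size(nodes, edges, size):
    """All node subsets of the given size that are fully connected (cliques)."""
    E = {frozenset(e) for e in edges}
    return [frozenset(c) for c in combinations(nodes, size)
            if all(frozenset(p) in E for p in combinations(c, 2))]

def simplicial_degree(nodes, edges, k):
    """Level-k degree: for each k-simplex (a clique of k+1 nodes), count
    the (k+1)-simplices ((k+2)-cliques) that contain it as a face."""
    k_simplices = cliques_of_size(nodes, edges, k + 1)
    up_simplices = cliques_of_size(nodes, edges, k + 2)
    deg = {s: 0 for s in k_simplices}
    for up in up_simplices:
        for face in combinations(up, k + 1):
            deg[frozenset(face)] += 1
    return deg

# Toy triangle: three nodes, all pairwise connected.
tri_nodes = [0, 1, 2]
tri_edges = [(0, 1), (1, 2), (0, 2)]
level0 = simplicial_degree(tri_nodes, tri_edges, 0)  # node-level degrees
```

At level 0 this reduces to ordinary degree (each node sits in two edges of the triangle); at level 1 each edge sits in exactly one triangle.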

  6. Distance measurements in Au nanoparticles functionalized with nitroxide radicals and Gd(3+)-DTPA chelate complexes.

    PubMed

    Yulikov, Maxim; Lueders, Petra; Warsi, Muhammad Farooq; Chechik, Victor; Jeschke, Gunnar

    2012-08-14

    Nanosized gold particles were functionalised with two types of paramagnetic surface tags, one having a nitroxide radical and the other one carrying a DTPA complex loaded with Gd(3+). Selective measurements of nitroxide-nitroxide, Gd(3+)-nitroxide and Gd(3+)-Gd(3+) distances were performed on this system and information on the distance distribution in the three types of spin pairs was obtained. A numerical analysis of the dipolar frequency distributions is presented for Gd(3+) centres with moderate magnitudes of zero-field splitting, in the range of detection frequencies and resonance fields where the high-field approximation is only roughly valid. The dipolar frequency analysis confirms the applicability of DEER for distance measurements in such complexes and gives an estimate for the magnitudes of possible systematic errors due to the non-ideality of the measurement of the dipole-dipole interaction.

  7. Complex Dynamics in Information Sharing Networks

    NASA Astrophysics Data System (ADS)

    Cronin, Bruce

This study examines the roll-out of an electronic knowledge base in a medium-sized professional services firm over a six-year period. The efficiency of such implementation is a key business problem in IT systems of this type. Data from usage logs provides the basis for analysis of the dynamic evolution of social networks around the repository during this time. The adoption pattern follows an "s-curve" and usage exhibits something of a power-law distribution, both attributable to network effects, and network position is associated with organisational performance on a number of indicators. But periodicity in usage is evident and the usage distribution displays an exponential cut-off. Further analysis provides some evidence of mathematical complexity in the periodicity. Some implications of complex patterns in social network data for research and management are discussed. The case study demonstrates the utility of the broad methodological approach.

  8. Parallel Molecular Distributed Detection With Brownian Motion.

    PubMed

    Rogers, Uri; Koh, Min-Sung

    2016-12-01

This paper explores the in vivo distributed detection of an undesired biological agent's (BA's) biomarkers by a group of biological-sized nanomachines in an aqueous medium under drift. The term distributed indicates that the system information relative to the BA's presence is dispersed across the collection of nanomachines, where each nanomachine possesses limited communication, computation, and movement capabilities. Using Brownian motion with drift, a probabilistic detection and optimal data fusion framework, coined molecular distributed detection, is introduced that combines theory from both molecular communication and distributed detection. Using the optimal data fusion framework as a guide, simulation indicates that a sub-optimal fusion method exists, allowing for a significant reduction in implementation complexity while retaining BA detection accuracy.
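The paper's fusion framework is specific to molecular channels, but the classical Chair-Varshney log-likelihood-ratio rule it builds on, and the kind of cheap counting rule that trades optimality for implementation complexity, can be sketched generically (an illustrative analogue, not the authors' exact statistic):

```python
import math

def chair_varshney_fuse(decisions, pd, pf, prior=0.5):
    """Optimal fusion of local binary decisions (1 = 'biomarker present').
    pd[i]/pf[i] are sensor i's detection and false-alarm probabilities."""
    llr = math.log(prior / (1 - prior))
    for u, d, f in zip(decisions, pd, pf):
        # Each local decision contributes its log-likelihood ratio.
        llr += math.log(d / f) if u == 1 else math.log((1 - d) / (1 - f))
    return 1 if llr > 0 else 0

def majority_fuse(decisions):
    """Sub-optimal counting rule: far cheaper, often nearly as accurate."""
    return 1 if 2 * sum(decisions) > len(decisions) else 0
```

With identical sensors the optimal rule degenerates to a counting rule, which is one intuition for why a sub-optimal fusion method can retain detection accuracy.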

  9. Biogeography of diseases: a framework for analysis

    NASA Astrophysics Data System (ADS)

    Peterson, A. Townsend

    2008-06-01

    A growing body of literature offers a framework for understanding geographic and ecological distributions of species; a few applications of this framework have treated disease transmission systems and their geography. The general framework focuses on interactions among abiotic requirements, biotic constraints, and dispersal abilities of species as determinants of distributional areas. Disease transmission systems have key differences from other sorts of biological phenomena: Interactions among species are particularly important, interactions may be stable or unstable, abiotic conditions may be relatively less important in shaping disease distributions, and dispersal abilities may be quite variable. The ways in which these differences may influence disease transmission geography are complex; I illustrate their effects by means of worked examples regarding West Nile Virus, plague, filoviruses, and yellow fever.

  10. Topology of Collisionless Relaxation

    NASA Astrophysics Data System (ADS)

    Pakter, Renato; Levin, Yan

    2013-04-01

    Using extensive molecular dynamics simulations we explore the fine-grained phase space structure of systems with long-range interactions. We find that if the initial phase space particle distribution has no holes, the final stationary distribution will also contain a compact simply connected region. The microscopic holes created by the filamentation of the initial distribution function are always restricted to the outer regions of the phase space. In general, for complex multilevel distributions it is very difficult to a priori predict the final stationary state without solving the full dynamical evolution. However, we show that, for multilevel initial distributions satisfying a generalized virial condition, it is possible to predict the particle distribution in the final stationary state using Casimir invariants of the Vlasov dynamics.

  11. Comments on the "Byzantine Self-Stabilizing Pulse Synchronization" Protocol: Counter-examples

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar R.; Siminiceanu, Radu

    2006-01-01

    Embedded distributed systems have become an integral part of many safety-critical applications. There have been many attempts to solve the self-stabilization problem of clocks across a distributed system. An analysis of one such protocol called the Byzantine Self-Stabilizing Pulse Synchronization (BSS-Pulse-Synch) protocol from a paper entitled "Linear Time Byzantine Self-Stabilizing Clock Synchronization" by Daliot, et al., is presented in this report. This report also includes a discussion of the complexity and pitfalls of designing self-stabilizing protocols and provides counter-examples for the claims of the above protocol.

  12. Computational Model for Ethnographically Informed Systems Design

    NASA Astrophysics Data System (ADS)

Iqbal, Rahat; James, Anne; Shah, Nazaraf; Terken, Jacques

This paper presents a computational model for ethnographically informed systems design that can support complex and distributed cooperative activities. This model is based on an ethnographic framework consisting of three important dimensions (distributed coordination, awareness of work, and plans and procedures), and the BDI (Belief, Desire and Intention) model of intelligent agents. The ethnographic framework is used to conduct ethnographic analysis and to organise ethnographically driven information into three dimensions, whereas the BDI model allows such information to be mapped onto the underlying concepts of multi-agent systems. The advantage of this model is that it is built upon an adaptation of existing mature and well-understood techniques. Through this model, we also address the cognitive aspects of systems design.

  13. Polarization variations in installed fibers and their influence on quantum key distribution systems.

    PubMed

    Ding, Yu-Yang; Chen, Hua; Wang, Shuang; He, De-Yong; Yin, Zhen-Qiang; Chen, Wei; Zhou, Zheng; Guo, Guang-Can; Han, Zheng-Fu

    2017-10-30

Polarization variations in installed fibers are complex and volatile, and would severely affect the performance of polarization-sensitive quantum key distribution (QKD) systems. Based on recorded data about polarization variations of different installed fibers, we establish an analytical methodology to quantitatively evaluate the influence of polarization variations on polarization-sensitive QKD systems. Using the increased quantum bit error rate induced by polarization variations as a key criterion, we propose two parameters - polarization drift time and required tracking speed - to characterize polarization variations. For field buried and aerial fibers of different lengths, we quantitatively evaluate the influence of polarization variations, and also provide requirements and suggestions for polarization basis alignment modules of QKD systems deployed in different kinds of fibers.
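As a rough illustration of how misalignment feeds the error rate (a standard BB84-style relation, not the paper's full methodology): a polarization misalignment angle θ contributes sin²θ to the QBER, which yields a simple notion of polarization drift time for a given QBER budget and angular drift rate:

```python
import math

def qber_from_misalignment(theta_rad):
    """QBER contribution of a polarization misalignment angle: a photon
    projects onto the wrong basis state with probability sin^2(theta)."""
    return math.sin(theta_rad) ** 2

def polarization_drift_time(drift_rate_rad_per_s, qber_budget):
    """Time until a constant angular drift alone consumes the QBER budget."""
    return math.asin(math.sqrt(qber_budget)) / drift_rate_rad_per_s
```

For example, with an assumed drift rate of 0.01 rad/s and a 2% QBER allowance, the basis-alignment module would need to correct roughly every `polarization_drift_time(0.01, 0.02)` seconds.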

  14. Transition in the waiting-time distribution of price-change events in a global socioeconomic system

    NASA Astrophysics Data System (ADS)

    Zhao, Guannan; McDonald, Mark; Fenn, Dan; Williams, Stacy; Johnson, Nicholas; Johnson, Neil F.

    2013-12-01

The goal of developing a firmer theoretical understanding of inhomogeneous temporal processes, in particular the waiting times in some collective dynamical system, is attracting significant interest among physicists. Quantifying the deviations between the waiting-time distribution and the distribution generated by a random process may help unravel the feedback mechanisms that drive the underlying dynamics. We analyze the waiting-time distributions of high-frequency foreign exchange data for the best executable bid-ask prices across all major currencies. We find that the lognormal distribution yields a good overall fit for the waiting-time distribution between currency rate changes if both short and long waiting times are included. If we restrict our study to long waiting times, each currency pair’s distribution is consistent with a power-law tail with exponent near 3.5. However, for short waiting times, the overall distribution resembles one generated by an archetypal complex systems model in which boundedly rational agents compete for limited resources. Our findings suggest that a gradual transition arises in trading behavior between a fast regime in which traders act in a boundedly rational way and a slower one in which traders’ decisions are driven by generic feedback mechanisms across multiple timescales and hence produce similar power-law tails irrespective of currency type.
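The two fits mentioned, an overall lognormal and a power-law tail with exponent near 3.5, can be sketched with standard maximum-likelihood estimators; the synthetic samples below stand in for actual waiting times:

```python
import math
import random

def lognormal_mle(waits):
    """MLE lognormal fit: mean and std of the log waiting times."""
    logs = [math.log(w) for w in waits]
    mu = sum(logs) / len(logs)
    sigma = math.sqrt(sum((l - mu) ** 2 for l in logs) / len(logs))
    return mu, sigma

def hill_tail_exponent(waits, xmin):
    """Hill estimator of the power-law pdf exponent for the tail w >= xmin."""
    tail = [w for w in waits if w >= xmin]
    return 1 + len(tail) / sum(math.log(w / xmin) for w in tail)

random.seed(0)
# Synthetic "waiting times": lognormal bulk, Pareto tail with pdf exponent 3.5.
bulk = [random.lognormvariate(0.0, 1.0) for _ in range(50000)]
tail = [random.paretovariate(2.5) for _ in range(50000)]  # survival exp 2.5
mu_hat, sigma_hat = lognormal_mle(bulk)
alpha_hat = hill_tail_exponent(tail, xmin=1.0)
```

On the synthetic data the estimators recover mu ≈ 0, sigma ≈ 1, and a tail exponent close to 3.5, the value reported for the currency pairs.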

  15. An automated system for the study of ionospheric spatial structures

    NASA Astrophysics Data System (ADS)

    Belinskaya, I. V.; Boitman, O. N.; Vugmeister, B. O.; Vyborova, V. M.; Zakharov, V. N.; Laptev, V. A.; Mamchenko, M. S.; Potemkin, A. A.; Radionov, V. V.

The system is designed for the study of the vertical distribution of electron density and the parameters of medium-scale ionospheric irregularities over the sounding site as well as the reconstruction of the spatial distribution of electron density within the range of up to 300 km from the sounding location. The system comprises an active central station as well as passive companion stations. The central station is equipped with the digital ionosonde ``Basis'', the measuring-and-computing complex IVK-2, and the receiver-recorder PRK-3M. The companion stations are equipped with receivers-recorders PRK-3. The automated complex software system includes 14 subsystems. Data transfer between them is effected using magnetic disk data sets. The system is operated in both ionogram mode and Doppler shift and angle-of-arrival mode. Using data obtained in these two modes, the reconstruction of the spatial distribution of electron density in the region is carried out. Reconstruction is checked for accuracy using data from companion stations.

  16. Supervisory control and diagnostics system for the mirror fusion test facility: overview and status 1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGoldrick, P.R.

    1981-01-01

The Mirror Fusion Test Facility (MFTF) is a complex facility requiring a highly computerized Supervisory Control and Diagnostics System (SCDS) to monitor and provide control over ten subsystems, three of which require true process control. SCDS will provide physicists with a method of studying machine and plasma behavior by acquiring and processing up to four megabytes of plasma diagnostic information every five minutes. A high degree of availability and throughput is provided by a distributed computer system (nine 32-bit minicomputers on shared memory). Data, distributed across SCDS, is managed by a high-bandwidth Distributed Database Management System. The MFTF operators' control room consoles use color television monitors with touch-sensitive screens; this is a totally new approach. The method of handling deviations to normal machine operation, and how the operator should be notified and assisted in the resolution of problems, has been studied and a system designed.

  17. Cells distribution in the modeling of fibrosis. Comment on "Towards a unified approach in the modeling of fibrosis: A review with research perspectives" by Martine Ben Amar and Carlo Bianca

    NASA Astrophysics Data System (ADS)

    Abdel-Aty, Mahmoud

    2016-07-01

    The modeling of a complex system requires the analysis of all microscopic constituents and in particular of their interactions [1]. The interest in this research field has increased considering also recent developments in the information sciences. However interaction among scholars working in various fields of the applied sciences can be considered the true motor for the definition of a general framework for the analysis of complex systems. In particular biological systems constitute the platform where many scientists have decided to collaborate in order to gain a global description of the system. Among others, cancer-immune system competition (see [2] and the review papers [3,4]) has attracted much attention.

  18. A Systems Approach to Vaccine Decision Making

    PubMed Central

    Lee, Bruce Y.; Mueller, Leslie E.; Tilchin, Carla G.

    2016-01-01

Vaccines reside in a complex multiscale system that includes biological, clinical, behavioral, social, operational, environmental, and economical relationships. Not accounting for these systems when making decisions about vaccines can result in changes that have little effect rather than solutions, lead to unsustainable solutions, miss indirect (e.g., secondary, tertiary, and beyond) effects, cause unintended consequences, and lead to wasted time, effort, and resources. Mathematical and computational modeling can help better understand and address complex systems by representing all or most of the components, relationships, and processes. Such models can serve as “virtual laboratories” to examine how a system operates and test the effects of different changes within the system. Here are ten lessons learned from using computational models to bring more of a systems approach to vaccine decision making: (i) traditional single measure approaches may overlook opportunities; (ii) there is complex interplay among many vaccine, population, and disease characteristics; (iii) accounting for perspective can identify synergies; (iv) the distribution system should not be overlooked; (v) target population choice can have secondary and tertiary effects; (vi) potentially overlooked characteristics can be important; (vii) characteristics of one vaccine can affect other vaccines; (viii) the broader impact of vaccines is complex; (ix) vaccine administration extends beyond the provider level; and (x) the value of vaccines is dynamic. PMID:28017430

  19. A systems approach to vaccine decision making.

    PubMed

    Lee, Bruce Y; Mueller, Leslie E; Tilchin, Carla G

    2017-01-20

    Vaccines reside in a complex multiscale system that includes biological, clinical, behavioral, social, operational, environmental, and economical relationships. Not accounting for these systems when making decisions about vaccines can result in changes that have little effect rather than solutions, lead to unsustainable solutions, miss indirect (e.g., secondary, tertiary, and beyond) effects, cause unintended consequences, and lead to wasted time, effort, and resources. Mathematical and computational modeling can help better understand and address complex systems by representing all or most of the components, relationships, and processes. Such models can serve as "virtual laboratories" to examine how a system operates and test the effects of different changes within the system. Here are ten lessons learned from using computational models to bring more of a systems approach to vaccine decision making: (i) traditional single measure approaches may overlook opportunities; (ii) there is complex interplay among many vaccine, population, and disease characteristics; (iii) accounting for perspective can identify synergies; (iv) the distribution system should not be overlooked; (v) target population choice can have secondary and tertiary effects; (vi) potentially overlooked characteristics can be important; (vii) characteristics of one vaccine can affect other vaccines; (viii) the broader impact of vaccines is complex; (ix) vaccine administration extends beyond the provider level; and (x) the value of vaccines is dynamic. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Geographically distributed real-time digital simulations using linear prediction

    DOE PAGES

    Liu, Ren; Mohanpurkar, Manish; Panwar, Mayank; ...

    2016-07-04

Real-time simulation is a powerful tool for analyzing, planning, and operating modern power systems. Analyzing ever-evolving power systems and understanding complex dynamic and transient interactions demand larger real-time computation capabilities. These facilities are interspersed all over the globe, and to leverage such unique facilities, geographically distributed real-time co-simulation for analyzing power systems is pursued and presented. However, the communication latency between different simulator locations may lead to inaccuracy in geographically distributed real-time co-simulations. In this paper, the effect of communication latency on geographically distributed real-time co-simulation is introduced and discussed. To reduce the effect of the communication latency, a real-time data predictor based on linear curve fitting is developed and integrated into the distributed real-time co-simulation. Two digital real-time simulators are used to perform dynamic and transient co-simulations with communication latency and the predictor. Results demonstrate the effect of the communication latency and the performance of the real-time data predictor in compensating for it.
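A linear-curve-fitting predictor of the kind described can be sketched as a least-squares line through the most recent samples, extrapolated one latency interval ahead. This is a minimal sketch; the paper's actual window length and update logic are not specified here:

```python
def linear_predict(times, samples, t_future):
    """Fit a least-squares line through recent (t, y) samples and evaluate
    it at t_future, compensating a known communication latency.
    Assumes at least two distinct time stamps."""
    n = len(times)
    t_mean = sum(times) / n
    y_mean = sum(samples) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in zip(times, samples))
    den = sum((t - t_mean) ** 2 for t in times)
    slope = num / den
    return y_mean + slope * (t_future - t_mean)
```

For a signal that is locally linear over the latency window, the receiver can use the predicted value at the current wall-clock time instead of the stale sample that just arrived.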

  1. A mathematical model for generating bipartite graphs and its application to protein networks

    NASA Astrophysics Data System (ADS)

    Nacher, J. C.; Ochiai, T.; Hayashida, M.; Akutsu, T.

    2009-12-01

Complex systems arise in many different contexts from large communication systems and transportation infrastructures to molecular biology. Most of these systems can be organized into networks composed of nodes and interacting edges. Here, we present a theoretical model that constructs bipartite networks with the particular feature that the degree distribution can be tuned depending on the probability rate of fundamental processes. We then use this model to investigate protein-domain networks. A protein can be composed of up to hundreds of domains. Each domain represents a conserved sequence segment with specific functional tasks. We analyze the distribution of domains in Homo sapiens and Arabidopsis thaliana organisms and the statistical analysis shows that while (a) the number of domain types shared by k proteins exhibits a power-law distribution, (b) the number of proteins composed of k types of domains decays as an exponential distribution. The proposed mathematical model generates bipartite graphs and predicts the emergence of this mixing of (a) power-law and (b) exponential distributions. Our theoretical and computational results show that this model requires (1) a growth process and (2) a copy mechanism.
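A toy generator with the two required ingredients, (1) growth and (2) copying, might look like the following. The rates and update rules are illustrative assumptions of ours, not the authors' exact model:

```python
import random

def bipartite_growth_copy(steps, p_new, seed=0):
    """Toy protein-domain bipartite growth model (assumed dynamics):
    each step adds a protein; with probability p_new it links to a
    brand-new domain type, otherwise it copies the domain links of a
    uniformly chosen earlier protein (preferential reuse of domains)."""
    rng = random.Random(seed)
    proteins = [[0]]       # protein -> list of domain-type indices
    n_domains = 1
    for _ in range(steps):
        if rng.random() < p_new:
            proteins.append([n_domains])   # growth: new domain type
            n_domains += 1
        else:
            proteins.append(list(rng.choice(proteins)))  # copy mechanism
    return proteins, n_domains
```

Copying makes already-common domain types more common, the usual route to a power-law side of the degree distribution, while the growth rate `p_new` controls how many domain types exist at all.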

  2. Geographically distributed real-time digital simulations using linear prediction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Ren; Mohanpurkar, Manish; Panwar, Mayank

    Real time simulation is a powerful tool for analyzing, planning, and operating modern power systems. For analyzing the ever evolving power systems and understanding complex dynamic and transient interactions larger real time computation capabilities are essential. These facilities are interspersed all over the globe and to leverage unique facilities geographically-distributed real-time co-simulation in analyzing the power systems is pursued and presented. However, the communication latency between different simulator locations may lead to inaccuracy in geographically distributed real-time co-simulations. In this paper, the effect of communication latency on geographically distributed real-time co-simulation is introduced and discussed. In order to reduce themore » effect of the communication latency, a real-time data predictor, based on linear curve fitting is developed and integrated into the distributed real-time co-simulation. Two digital real time simulators are used to perform dynamic and transient co-simulations with communication latency and predictor. Results demonstrate the effect of the communication latency and the performance of the real-time data predictor to compensate it.« less

  3. Distributed simulation using a real-time shared memory network

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Mattern, Duane L.; Wong, Edmond; Musgrave, Jeffrey L.

    1993-01-01

The Advanced Control Technology Branch of the NASA Lewis Research Center performs research in the area of advanced digital controls for aeronautic and space propulsion systems. This work requires the real-time implementation of both control software and complex dynamical models of the propulsion system. We are implementing these systems in a distributed, multi-vendor computer environment. Therefore, a need exists for real-time communication and synchronization between the distributed multi-vendor computers. A shared memory network is a potential solution which offers several advantages over other real-time communication approaches. A candidate shared memory network was tested for basic performance. The shared memory network was then used to implement a distributed simulation of a ramjet engine. The accuracy and execution time of the distributed simulation were measured and compared to the performance of the non-partitioned simulation. The ease of partitioning the simulation, the minimal time required to develop communication between the processors, and the resulting execution time all indicate that the shared memory network is a real-time communication technique worthy of serious consideration.

  4. An inexact log-normal distribution-based stochastic chance-constrained model for agricultural water quality management

    NASA Astrophysics Data System (ADS)

    Wang, Yu; Fan, Jie; Xu, Ye; Sun, Wei; Chen, Dong

    2018-05-01

    In this study, an inexact log-normal-based stochastic chance-constrained programming model was developed for solving the non-point source pollution issues caused by agricultural activities. Compared to the general stochastic chance-constrained programming model, the main advantage of the proposed model is that it allows random variables to be expressed as a log-normal distribution, rather than a general normal distribution. Possible deviations in solutions caused by irrational parameter assumptions were avoided. The agricultural system management in the Erhai Lake watershed was used as a case study, where critical system factors, including rainfall and runoff amounts, show characteristics of a log-normal distribution. Several interval solutions were obtained under different constraint-satisfaction levels, which were useful in evaluating the trade-off between system economy and reliability. The applied results show that the proposed model could help decision makers to design optimal production patterns under complex uncertainties. The successful application of this model is expected to provide a good example for agricultural management in many other watersheds.
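The key step of a chance-constrained model with log-normal randomness is replacing a probabilistic constraint with its deterministic equivalent via the log-normal quantile. A minimal sketch, assuming a constraint of the form P(x <= b) >= alpha with b log-normally distributed (function and parameter names are ours):

```python
import math
from statistics import NormalDist

def lognormal_chance_rhs(mu, sigma, alpha):
    """Deterministic equivalent of P(x <= b) >= alpha for
    b ~ Lognormal(mu, sigma): the constraint tightens to
    x <= exp(mu + sigma * z_{1-alpha})."""
    z = NormalDist().inv_cdf(1 - alpha)
    return math.exp(mu + sigma * z)
```

Raising the constraint-satisfaction level alpha shrinks the admissible right-hand side, which is exactly the economy-versus-reliability trade-off the interval solutions explore.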

  5. Residence times of groundwater and nitrate transport in coastal aquifer systems: Daweijia area, northeastern China.

    PubMed

    Han, Dongmei; Cao, Guoliang; McCallum, James; Song, Xianfang

    2015-12-15

Groundwater within the coastal aquifer systems of the Daweijia area in northeastern China is characterized by large variations (33-521 mg/L) in NO3(-) concentrations. Elevated nitrate concentrations, in addition to seawater intrusion in the Daweijia well field, both attributable to anthropogenic activities, may impact future water-management practices. Chemical and stable isotopic (δ(18)O, δ(2)H) analyses, together with (3)H and CFC methods, were applied to provide a better understanding of the relationship between the distribution of groundwater mean residence time (MRT) and nitrate transport, and to identify sources of nitrate in the complex coastal aquifer systems. There is a relatively narrow range of isotopic composition (from -8.5 to -7.0‰) in most groundwater. Generally higher tritium contents observed in the wet season relative to the dry season may result from rapid groundwater circulation in response to rainfall through preferential flow paths. In the well field, the relatively increased nitrate concentrations of groundwater, accompanied by higher tritium contents in the wet season, indicate that the nitrate pollution can be attributed to domestic wastes. The binary exponential and piston-flow mixing model (BEP) yielded feasible age distributions based on the conceptual model. The good inverse relationship between groundwater MRTs (92-467 years) and NO3(-) concentrations in the shallow Quaternary aquifers indicates that elevated nitrate concentrations are attributable to more recent recharge of shallow groundwater. However, there is no significant relationship between the MRTs (8-411 years) and the NO3(-) concentrations in the carbonate aquifer system, due to the complex hydrogeological conditions, groundwater age distributions, and the range of contaminant source areas. Nitrate in the groundwater system without denitrification effects could accumulate and be transported for tens of years through the complex carbonate aquifer matrix and the successive inputs of nitrogen from various sources. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Efficient Statistically Accurate Algorithms for the Fokker-Planck Equation in Large Dimensions

    NASA Astrophysics Data System (ADS)

    Chen, N.; Majda, A.

    2017-12-01

Solving the Fokker-Planck equation for high-dimensional complex turbulent dynamical systems is an important and practical issue. However, most traditional methods suffer from the curse of dimensionality and have difficulties in capturing the fat-tailed, highly intermittent probability density functions (PDFs) of complex systems in turbulence, neuroscience and excitable media. In this article, efficient statistically accurate algorithms are developed for solving both the transient and the equilibrium solutions of Fokker-Planck equations associated with high-dimensional nonlinear turbulent dynamical systems with conditional Gaussian structures. The algorithms involve a hybrid strategy that requires only a small number of ensembles. Here, a conditional Gaussian mixture in a high-dimensional subspace via an extremely efficient parametric method is combined with a judicious non-parametric Gaussian kernel density estimation in the remaining low-dimensional subspace. In particular, the parametric method, which is based on an effective data assimilation framework, provides closed analytical formulae for determining the conditional Gaussian distributions in the high-dimensional subspace. Therefore, it is computationally efficient and accurate. The full non-Gaussian PDF of the system is then given by a Gaussian mixture. Different from the traditional particle methods, each conditional Gaussian distribution here covers a significant portion of the high-dimensional PDF. Therefore a small number of ensembles is sufficient to recover the full PDF, which overcomes the curse of dimensionality. Notably, the mixture distribution has a significant skill in capturing the transient behavior with fat tails of the high-dimensional non-Gaussian PDFs, and this enables the algorithms to accurately describe the intermittency and extreme events in complex turbulent systems. It is shown in a stringent set of test problems that the method requires only O(100) ensembles to successfully recover the highly non-Gaussian transient PDFs in up to 6 dimensions with only small errors.
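The non-parametric half of the hybrid, a Gaussian kernel density estimate over the low-dimensional subspace, is standard and can be sketched in one dimension (the bandwidth h is a free parameter here; the paper's choice of bandwidth is not reproduced):

```python
import math

def gaussian_kde(samples, h):
    """1-D Gaussian kernel density estimate with bandwidth h: the PDF is
    an equal-weight mixture of Gaussians centered at the samples."""
    n = len(samples)
    norm = n * h * math.sqrt(2 * math.pi)
    def pdf(x):
        return sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in samples) / norm
    return pdf
```

In the hybrid scheme each ensemble member would contribute one such kernel in the low-dimensional subspace, multiplied by its analytically known conditional Gaussian in the high-dimensional subspace.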

  7. Improved neutron activation prediction code system development

    NASA Technical Reports Server (NTRS)

    Saqui, R. M.

    1971-01-01

    Two integrated neutron activation prediction code systems have been developed by modifying and integrating existing computer programs to perform the necessary computations to determine neutron induced activation gamma ray doses and dose rates in complex geometries. Each of the two systems is comprised of three computational modules. The first program module computes the spatial and energy distribution of the neutron flux from an input source and prepares input data for the second program which performs the reaction rate, decay chain and activation gamma source calculations. A third module then accepts input prepared by the second program to compute the cumulative gamma doses and/or dose rates at specified detector locations in complex, three-dimensional geometries.

  8. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deboever, Jeremiah; Zhang, Xiaochen; Reno, Matthew J.

The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
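The stated 10-120 hour range is easy to sanity-check: a year at 1-second resolution means about 31.5 million sequential power flows. The per-solve time below is an assumed figure for illustration only:

```python
# Back-of-the-envelope cost of a yearlong, 1-second-resolution QSTS run.
solves_per_year = 365 * 24 * 3600      # one sequential power flow per second
per_solve_s = 0.005                    # assumed 5 ms per unbalanced feeder solve
total_hours = solves_per_year * per_solve_s / 3600
# 31,536,000 solves at 5 ms each comes to about 43.8 hours,
# inside the reported 10-120 hour window.
```

Because each step's controller states depend on the previous step, these solves cannot simply be parallelized across time, which is why the time-dependence challenge listed above is central.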

  9. Statistical self-similarity of hotspot seamount volumes modeled as self-similar criticality

    USGS Publications Warehouse

    Tebbens, S.F.; Burroughs, S.M.; Barton, C.C.; Naar, D.F.

    2001-01-01

    The processes responsible for hotspot seamount formation are complex, yet the cumulative frequency-volume distribution of hotspot seamounts in the Easter Island/Salas y Gomez Chain (ESC) is found to be well-described by an upper-truncated power law. We develop a model for hotspot seamount formation where uniform energy input produces events initiated on a self-similar distribution of critical cells. We call this model Self-Similar Criticality (SSC). By allowing the spatial distribution of magma migration to be self-similar, the SSC model recreates the observed ESC seamount volume distribution. The SSC model may have broad applicability to other natural systems.
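
    The upper-truncated power law named above has a simple closed form for cumulative frequency-volume data. The sketch below uses the general two-parameter-plus-truncation form; the parameter values in the test usage are arbitrary illustrations, not the fitted ESC values.

```python
# Hedged sketch of an upper-truncated power law for a cumulative
# frequency-volume distribution, in the general form used for truncated
# scaling data; parameter values are arbitrary, not the fitted ESC values.

def truncated_power_law(v, c, beta, v_t):
    """Cumulative number of events with volume >= v, truncated at v_t."""
    if v > v_t:
        return 0.0
    return c * (v ** -beta - v_t ** -beta)
```

    Unlike a pure power law c * v**-beta, the curve rolls off near the largest observed volume and reaches zero at the truncation volume v_t.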

  10. Automated aberration compensation in high numerical aperture systems for arbitrary laser modes (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Hering, Julian; Waller, Erik H.; von Freymann, Georg

    2017-02-01

    Since a large number of optical systems and devices are based on differently shaped focal intensity distributions (point-spread-functions, PSF), the PSF's quality is crucial for the application's performance. E.g., optical tweezers, optical potentials for trapping of ultracold atoms as well as stimulated-emission-depletion (STED) based microscopy and lithography rely on precisely controlled intensity distributions. However, especially in high numerical aperture (NA) systems, such complex laser modes are easily distorted by aberrations leading to performance losses. Although different approaches addressing phase retrieval algorithms have been recently presented[1-3], fast and automated aberration compensation for a broad variety of complex shaped PSFs in high NA systems is still missing. Here, we report on a Gerchberg-Saxton[4] based algorithm (GSA) for automated aberration correction of arbitrary PSFs, especially for high NA systems. Deviations between the desired target intensity distribution and the three-dimensionally (3D) scanned experimental focal intensity distribution are used to calculate a correction phase pattern. The target phase distribution plus the correction pattern are displayed on a phase-only spatial-light-modulator (SLM). Focused by a high NA objective, experimental 3D scans of several intensity distributions allow for characterization of the algorithm's performance: aberrations are reliably identified and compensated within less than 10 iterations. References 1. B. M. Hanser, M. G. L. Gustafsson, D. A. Agard, and J. W. Sedat, "Phase-retrieved pupil functions in wide-field fluorescence microscopy," J. of Microscopy 216(1), 32-48 (2004). 2. A. Jesacher, A. Schwaighofer, S. Fürhapter, C. Maurer, S. Bernet, and M. Ritsch-Marte, "Wavefront correction of spatial light modulators using an optical vortex image," Opt. Express 15(9), 5801-5808 (2007). 3. A. Jesacher and M. J. 
Booth, "Parallel direct laser writing in three dimensions with spatially dependent aberration correction," Opt. Express 18(20), 21090-21099 (2010). 4. R. W. Gerchberg and W. O. Saxton, "A practical algorithm for the determination of the phase from image and diffraction plane pictures," Optik 35(2), 237-246 (1972).
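
    The Gerchberg-Saxton algorithm cited in reference 4 can be illustrated with a minimal, dependency-free 1-D sketch. Real aberration-correction systems work on 2-D/3-D fields with FFTs and an SLM in the loop; this shows only the core alternating amplitude-constraint iteration, with arbitrary test amplitudes, and a naive O(N^2) DFT so the example stays self-contained.

```python
# Minimal 1-D Gerchberg-Saxton sketch (the 1972 algorithm cited above).
# Uses a naive O(N^2) DFT to stay dependency-free; practical codes use FFTs.
import cmath

def dft(x, inverse=False):
    """Unnormalized forward / normalized inverse discrete Fourier transform."""
    n = len(x)
    sign = 1 if inverse else -1
    out = [sum(x[k] * cmath.exp(sign * 2j * cmath.pi * j * k / n)
               for k in range(n)) for j in range(n)]
    return [v / n for v in out] if inverse else out

def gerchberg_saxton(source_amp, target_amp, iterations=50):
    """Retrieve a source-plane phase whose far field matches target_amp."""
    field = [a * 1.0 for a in source_amp]          # start with zero phase
    for _ in range(iterations):
        far = dft(field)
        # keep the far-field phase, impose the target amplitude
        far = [t * cmath.exp(1j * cmath.phase(f))
               for t, f in zip(target_amp, far)]
        near = dft(far, inverse=True)
        # keep the near-field phase, impose the known source amplitude
        field = [s * cmath.exp(1j * cmath.phase(z))
                 for s, z in zip(source_amp, near)]
    return [cmath.phase(z) for z in field]
```

    The error-reduction property of this iteration (the mismatch to the target amplitude never increases) is what makes the fast convergence reported above plausible.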

  11. MFIRE-2: A Multi Agent System for Flow-Based Intrusion Detection Using Stochastic Search

    DTIC Science & Technology

    2012-03-01

    attacks that are distributed in nature, but may not protect individual systems effectively without incurring large bandwidth penalties while collecting...system-level information to help prepare for more significant attacks. The type of information potentially revealed by footprinting includes account...key areas where MAS may be appropriate: • The environment is open, highly dynamic, uncertain, or complex • Agents are a natural metaphor—Many

  12. Real-time contingency handling in MAESTRO

    NASA Technical Reports Server (NTRS)

    Britt, Daniel L.; Geoffroy, Amy L.

    1992-01-01

    A scheduling and resource management system named MAESTRO was interfaced with a Space Station Module Power Management and Distribution (SSM/PMAD) breadboard at MSFC. The combined system serves to illustrate the integration of planning, scheduling, and control in a realistic, complex domain. This paper briefly describes the functional elements of the combined system, including normal and contingency operational scenarios, then focuses on the method used by the scheduler to handle real-time contingencies.

  13. Modeling and dynamical topology properties of VANET based on complex networks theory

    NASA Astrophysics Data System (ADS)

    Zhang, Hong; Li, Jie

    2015-01-01

    Vehicular Ad hoc Network (VANET) is a special subset of multi-hop Mobile Ad hoc Networks in which vehicles can communicate not only with each other but also with fixed equipment along the roads through wireless interfaces. It has recently been discovered that essential real-world systems share similar properties when regarded as networks; among these, the dynamic topology structure of the VANET system is an important issue. Many real-world networks actually grow with preferential attachment, like the Internet, transportation systems and telephone networks. These phenomena suggest the possibility of finding a strategy to calibrate and control the topology parameters, which can help reveal the rules governing VANET topology change so as to relieve traffic jams, prevent traffic accidents and improve traffic safety. VANET is a typical complex network with its own basic characteristics. In this paper, we focus on the macroscopic Vehicle-to-Infrastructure (V2I) and Vehicle-to-Vehicle (V2V) inter-vehicle communication network using complex network theory. In particular, this paper is the first to propose a method for analyzing the topological structure and performance of VANET, presenting communications in VANET from a new perspective. Accordingly, we use the degree distribution, clustering coefficient and shortest path length of complex networks to implement our strategy through numerical examples and simulation. All the results demonstrate that VANET shows small-world network features and is characterized by a truncated scale-free (power-law) degree distribution. The average path length of the network is simulated numerically, which indicates that the network shows the small-world property and is rarely affected by randomness. What's more, we carry out extensive simulations of information propagation and mathematically prove the power-law property when γ > 2. 
The results of this study provide useful information for VANET optimization from a macroscopic perspective.
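
    The three network metrics the abstract relies on (degree distribution, clustering coefficient, average shortest-path length) are straightforward to compute. A pure-Python sketch on a small illustrative graph follows; the toy adjacency structure is invented for demonstration, not a real VANET trace.

```python
# Pure-Python sketch of the three complex-network metrics used above,
# computed on a toy graph (a triangle with a pendant vertex), not VANET data.
from collections import deque

def degrees(adj):
    return {v: len(nbrs) for v, nbrs in adj.items()}

def clustering(adj, v):
    """Fraction of a node's neighbour pairs that are themselves linked."""
    nbrs = list(adj[v])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nbrs[j] in adj[nbrs[i]])
    return 2.0 * links / (k * (k - 1))

def average_path_length(adj):
    """Mean BFS distance over all ordered pairs (assumes a connected graph)."""
    total, pairs = 0, 0
    for src in adj:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    queue.append(w)
        total += sum(d for n, d in dist.items() if n != src)
        pairs += len(dist) - 1
    return total / pairs

adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
```

    A short average path length together with high clustering is the small-world signature the paper reports for VANET.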

  14. Modeling and dynamical topology properties of VANET based on complex networks theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Hong; Li, Jie, E-mail: prof.li@foxmail.com

    2015-01-15

    Vehicular Ad hoc Network (VANET) is a special subset of multi-hop Mobile Ad hoc Networks in which vehicles can communicate not only with each other but also with fixed equipment along the roads through wireless interfaces. It has recently been discovered that essential real-world systems share similar properties when regarded as networks; among these, the dynamic topology structure of the VANET system is an important issue. Many real-world networks actually grow with preferential attachment, like the Internet, transportation systems and telephone networks. These phenomena suggest the possibility of finding a strategy to calibrate and control the topology parameters, which can help reveal the rules governing VANET topology change so as to relieve traffic jams, prevent traffic accidents and improve traffic safety. VANET is a typical complex network with its own basic characteristics. In this paper, we focus on the macroscopic Vehicle-to-Infrastructure (V2I) and Vehicle-to-Vehicle (V2V) inter-vehicle communication network using complex network theory. In particular, this paper is the first to propose a method for analyzing the topological structure and performance of VANET, presenting communications in VANET from a new perspective. Accordingly, we use the degree distribution, clustering coefficient and shortest path length of complex networks to implement our strategy through numerical examples and simulation. All the results demonstrate that VANET shows small-world network features and is characterized by a truncated scale-free (power-law) degree distribution. The average path length of the network is simulated numerically, which indicates that the network shows the small-world property and is rarely affected by randomness. What's more, we carry out extensive simulations of information propagation and mathematically prove the power-law property when γ > 2. The results of this study provide useful information for VANET optimization from a macroscopic perspective.

  15. Universality and depinning models for plastic yield in amorphous materials

    NASA Astrophysics Data System (ADS)

    Budrikis, Zoe; Fernandez Castellano, David; Sandfeld, Stefan; Zaiser, Michael; Zapperi, Stefano

    Plastic yield in amorphous materials occurs as a result of complex collective dynamics of local reorganizations, which gives rise to rich phenomena such as strain localization, intermittent dynamics and power-law distributed avalanches. While such systems have received considerable attention, both theoretical and experimental, controversy remains over the nature of the yielding transition. We present a new fully-tensorial coarse-grained model in 2D and 3D, and demonstrate that the exponents describing avalanche distributions are universal under a variety of loading conditions, system dimensionality and size, and boundary conditions. Our results show that while depinning-type models in general are apt to describe the system, mean field depinning models are not.

  16. Distributional potential of the Triatoma brasiliensis species complex at present and under scenarios of future climate conditions

    PubMed Central

    2014-01-01

    Background: The Triatoma brasiliensis complex is a monophyletic group, comprising three species, one of which includes two subspecific taxa, distributed across 12 Brazilian states, in the caatinga and cerrado biomes. Members of the complex are diverse in terms of epidemiological importance, morphology, biology, ecology, and genetics. Triatoma b. brasiliensis is the most disease-relevant member of the complex in terms of epidemiology, extensive distribution, broad feeding preferences, broad ecological distribution, and high rates of infection with Trypanosoma cruzi; consequently, it is considered the principal vector of Chagas disease in northeastern Brazil. Methods: We used ecological niche models to estimate potential distributions of all members of the complex, and evaluated the potential for suitable adjacent areas to be colonized; we also present first evaluations of potential for climate change-mediated distributional shifts. Models were developed using the GARP and Maxent algorithms. Results: Models for three members of the complex (T. b. brasiliensis, N = 332; T. b. macromelasoma, N = 35; and T. juazeirensis, N = 78) had significant distributional predictivity; however, models for T. sherlocki and T. melanica, both with very small sample sizes (N = 7), did not yield predictions that performed better than random. Model projections onto future-climate scenarios indicated little broad-scale potential for change in the potential distribution of the complex through 2050. Conclusions: This study suggests that T. b. brasiliensis is the member of the complex with the greatest distributional potential to colonize new areas; overall, however, the distribution of the complex appears relatively stable. These analyses offer key information to guide proactive monitoring and remediation activities to reduce risk of Chagas disease transmission. PMID:24886587

  17. Neuroergonomics - Analyzing Brain Function to Enhance Human Performance in Complex Systems

    DTIC Science & Technology

    2008-12-02

    Performing organization: George Mason University. Approved for public release, distribution unlimited.

  18. Intelligent systems engineering methodology

    NASA Technical Reports Server (NTRS)

    Fouse, Scott

    1990-01-01

    An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.

  19. Access to small size distributions of nanoparticles by microwave-assisted synthesis. Formation of Ag nanoparticles in aqueous carboxymethylcellulose solutions in batch and continuous-flow reactors

    NASA Astrophysics Data System (ADS)

    Horikoshi, Satoshi; Abe, Hideki; Torigoe, Kanjiro; Abe, Masahiko; Serpone, Nick

    2010-08-01

    This article examines the effect(s) of the 2.45-GHz microwave (MW) radiation in the synthesis of silver nanoparticles in aqueous media by reduction of the diamminesilver(I) complex, [Ag(NH3)2]+, with carboxymethylcellulose (CMC) in both batch-type and continuous-flow reactor systems with a particular emphasis on the characteristics of the microwaves in this process and the size distributions. This microwave thermally-assisted synthesis is compared to a conventional heating (CH) method, both requiring a reaction temperature of 100 °C to produce the nanoparticles, in both cases leading to the formation of silver colloids with different size distributions. Reduction of the diamminesilver(I) precursor complex, [Ag(NH3)2]+, by CMC depended on the solution temperature. Cooling the reactor during the heating process driven with 390-Watt microwaves (MW-390W/Cool protocol) yielded silver nanoparticles with sizes spanning the range 1-2 nm. By contrast, the size distribution of Ag nanoparticles with 170-Watt microwaves (no cooling; MW-170W protocol) was in the range 1.4-3.6 nm (average size ~3 nm). The overall results suggest the potential for a scale-up process in the microwave-assisted synthesis of nanoparticles. Based on the present data, a flow-through microwave reactor system is herein proposed for the continuous production of silver nanoparticles. The novel flow reactor system (flow rate, 600 mL min-1) coupled to 1200-Watt microwave radiation generated silver nanoparticles with a size distribution 0.7-2.8 nm (average size ca. 1.5 nm).

  20. Arctic systems in the Quaternary: ecological collision, faunal mosaics and the consequences of a wobbling climate.

    PubMed

    Hoberg, E P; Cook, J A; Agosta, S J; Boeger, W; Galbreath, K E; Laaksonen, S; Kutz, S J; Brooks, D R

    2017-07-01

    Climate oscillations and episodic processes interact with evolution, ecology and biogeography to determine the structure and complex mosaic that is the biosphere. Parasites and parasite-host assemblages are key components in a general explanatory paradigm for global biodiversity. We explore faunal assembly in the context of Quaternary time frames of the past 2.6 million years, a period dominated by episodic shifts in climate. Climate drivers cross a continuum from geological to contemporary timescales and serve to determine the structure and distribution of complex biotas. Cycles within cycles are apparent, with drivers that are layered, multifactorial and complex. These cycles influence the dynamics and duration of shifts in environmental structure on varying temporal and spatial scales. An understanding of the dynamics of high-latitude systems, the history of the Beringian nexus (the intermittent land connection linking Eurasia and North America) and downstream patterns of diversity depend on teasing apart the complexity of biotic assembly and persistence. Although climate oscillations have dominated the Quaternary, contemporary dynamics are driven by tipping points and shifting balances emerging from anthropogenic forces that are disrupting ecological structure. Climate change driven by anthropogenic forcing has supplanted a history of episodic variation and is eliminating ecological barriers and constraints on development and distribution for pathogen transmission. A framework to explore interactions of episodic processes on faunal structure and assembly is the Stockholm Paradigm, which appropriately shifts the focus from cospeciation to complexity and contingency in explanations of diversity.

  1. TSPA 1991: An initial total-system performance assessment for Yucca Mountain; Yucca Mountain Site Characterization Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnard, R.W.; Wilson, M.L.; Dockery, H.A.

    1992-07-01

    This report describes an assessment of the long-term performance of a repository system that contains deeply buried highly radioactive waste; the system is assumed to be located at the potential site at Yucca Mountain, Nevada. The study includes an identification of features, events, and processes that might affect the potential repository, a construction of scenarios based on this identification, a selection of models describing these scenarios (including abstraction of appropriate models from detailed models), a selection of probability distributions for the parameters in the models, a stochastic calculation of radionuclide releases for the scenarios, and a derivation of complementary cumulative distribution functions (CCDFs) for the releases. Releases and CCDFs are calculated for four categories of scenarios: aqueous flow (modeling primarily the existing conditions at the site, with allowances for climate change), gaseous flow, basaltic igneous activity, and human intrusion. The study shows that models of complex processes can be abstracted into more simplified representations that preserve the understanding of the processes and produce results consistent with those of more complex models.

  2. The Paperless Solution

    NASA Technical Reports Server (NTRS)

    2001-01-01

    REI Systems, Inc. developed a software solution that uses the Internet to eliminate the paperwork typically required to document and manage complex business processes. The data management solution, called Electronic Handbooks (EHBs), is presently used for the entire SBIR program process at NASA. The EHB-based system is ideal for programs and projects whose users are geographically distributed and are involved in complex management processes and procedures. EHBs provide flexible access control and increased communications while maintaining security for systems of all sizes. Through Internet Protocol-based access, user authentication and user-based access restrictions, role-based access control, and encryption/decryption, EHBs provide the level of security required for confidential data transfer. EHBs contain electronic forms and menus, which can be used in real time to execute the described processes. EHBs use standard word processors that generate ASCII HTML code to set up electronic forms that are viewed within a web browser. EHBs require no end-user software distribution, significantly reducing operating costs. Each interactive handbook simulates a hard-copy version containing chapters with descriptions of participants' roles in the online process.

  3. Testing Limits on Matte Surface Color Perception in Three-Dimensional Scenes with Complex Light Fields

    PubMed Central

    Doerschner, K.; Boyaci, H.; Maloney, L. T.

    2007-01-01

    We investigated limits on the human visual system’s ability to discount directional variation in complex light fields when estimating Lambertian surface color. Directional variation in the light field was represented in the frequency domain using spherical harmonics. The bidirectional reflectance distribution function of a Lambertian surface acts as a low-pass filter on directional variation in the light field. Consequently, the visual system needs to discount only the low-pass component of the incident light corresponding to the first nine terms of a spherical harmonics expansion (Basri & Jacobs, 2001; Ramamoorthi & Hanrahan, 2001) to accurately estimate surface color. We test experimentally whether the visual system discounts directional variation in the light field up to this physical limit. Our results are consistent with the claim that the visual system can compensate for all of the complexity in the light field that affects the appearance of Lambertian surfaces. PMID:18053846
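
    The "first nine terms" figure follows from counting spherical-harmonic basis functions: order l contributes 2l + 1 terms, so the low-pass band a Lambertian BRDF preserves (l ≤ 2) contains exactly nine. A one-line counting check:

```python
# Counting check for the "first nine terms" statement above: spherical-harmonic
# order l contributes 2l + 1 basis functions, so truncating at l_max keeps
# (l_max + 1)**2 terms; the Lambertian low-pass band l <= 2 keeps nine.

def harmonic_terms(l_max):
    """Number of spherical-harmonic basis functions with order l <= l_max."""
    return sum(2 * l + 1 for l in range(l_max + 1))
```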

  4. Forecasting overhaul or replacement intervals based on estimated system failure intensity

    NASA Astrophysics Data System (ADS)

    Gannon, James M.

    1994-12-01

    System reliability can be expressed in terms of the pattern of failure events over time. Assuming a nonhomogeneous Poisson process and Weibull intensity function for complex repairable system failures, the degree of system deterioration can be approximated. Maximum likelihood estimators (MLE's) for the system Rate of Occurrence of Failure (ROCOF) function are presented. Evaluating the integral of the ROCOF over annual usage intervals yields the expected number of annual system failures. By associating a cost of failure with the expected number of failures, budget and program policy decisions can be made based on expected future maintenance costs. Monte Carlo simulation is used to estimate the range and the distribution of the net present value and internal rate of return of alternative cash flows based on the distributions of the cost inputs and confidence intervals of the MLE's.
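
    The calculation described above has a closed form for the power-law (Weibull-intensity) NHPP: integrating the ROCOF over an interval gives the expected number of failures in that interval. A hedged sketch follows; the shape and scale values in the usage line are invented illustrations, not estimated MLEs from any fleet.

```python
# Sketch of the expected-failures calculation described above for a power-law
# (Weibull-intensity) NHPP; beta > 1 models a deteriorating system. The shape
# and scale values in the usage line are illustrative, not fitted MLEs.

def rocof(t, beta, eta):
    """Rate of occurrence of failure at system age t."""
    return (beta / eta) * (t / eta) ** (beta - 1)

def expected_failures(t1, t2, beta, eta):
    """Integral of the ROCOF over [t1, t2]: (t2/eta)**beta - (t1/eta)**beta."""
    return (t2 / eta) ** beta - (t1 / eta) ** beta

# Expected failures during a hypothetical third year of service (in hours):
year3 = expected_failures(17520.0, 26280.0, beta=1.8, eta=5000.0)
```

    Multiplying the expected count by a cost per failure gives the expected annual maintenance cost that drives the overhaul-versus-replace decision.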

  5. Distributed mixed-integer fuzzy hierarchical programming for municipal solid waste management. Part II: scheme analysis and mechanism revelation.

    PubMed

    Cheng, Guanhui; Huang, Guohe; Dong, Cong; Xu, Ye; Chen, Jiapei; Chen, Xiujuan; Li, Kailong

    2017-03-01

    As presented in the first companion paper, distributed mixed-integer fuzzy hierarchical programming (DMIFHP) was developed for municipal solid waste management (MSWM) under complexities of heterogeneities, hierarchy, discreteness, and interactions. Beijing was selected as a representative case. This paper focuses on presenting the obtained schemes and the revealed mechanisms of the Beijing MSWM system. The optimal MSWM schemes for Beijing under various solid waste treatment policies and their differences are deliberated. The impacts of facility expansion, hierarchy, and spatial heterogeneities and potential extensions of DMIFHP are also discussed. A few findings are revealed from the results and a series of comparisons and analyses. For instance, DMIFHP is capable of robustly reflecting these complexities in MSWM systems, especially for Beijing. The optimal MSWM schemes are of fragmented patterns due to the dominant role of the proximity principle in allocating solid waste treatment resources, and they are closely related to regulated ratios of landfilling, incineration, and composting. Communities without significant differences among distances to different types of treatment facilities are more sensitive to these ratios than others. The complexities of hierarchy and heterogeneities pose significant impacts on MSWM practices. Spatial dislocation of MSW generation rates and facility capacities caused by unreasonable planning in the past may result in insufficient utilization of treatment capacities under substantial influences of transportation costs. The problems of unreasonable MSWM planning, e.g., severe imbalance among different technologies and complete vacancy of ten facilities, deserve deliberation by the public and the municipal or local governments in Beijing. 
These findings are helpful for gaining insights into MSWM systems under these complexities, mitigating key challenges in the planning of these systems, improving the related management practices, and eliminating potential socio-economic and eco-environmental issues resulting from unreasonable management.

  6. A resource management architecture based on complex network theory in cloud computing federation

    NASA Astrophysics Data System (ADS)

    Zhang, Zehua; Zhang, Xuejie

    2011-10-01

    Cloud Computing Federation is a main trend of Cloud Computing. Resource Management has a significant effect on the design, realization, and efficiency of Cloud Computing Federation. Cloud Computing Federation has the typical characteristics of a Complex System; therefore, we propose a resource management architecture based on complex network theory for Cloud Computing Federation (abbreviated as RMABC) in this paper, with a detailed design of the resource discovery and resource announcement mechanisms. Compared with the existing resource management mechanisms in distributed computing systems, a Task Manager in RMABC can use historical information and current state data obtained from other Task Managers for the evolution of the complex network composed of Task Managers, and thus has advantages in resource discovery speed, fault tolerance and adaptive ability. The results of the model experiment confirmed the advantage of RMABC in resource discovery performance.

  7. Communication and complexity in a GRN-based multicellular system for graph colouring.

    PubMed

    Buck, Moritz; Nehaniv, Chrystopher L

    2008-01-01

    Artificial Genetic Regulatory Networks (GRNs) are interesting control models through their simplicity and versatility. They can be easily implemented, evolved and modified, and their similarity to their biological counterparts makes them interesting for simulations of life-like systems as well. These aspects suggest they may be perfect control systems for distributed computing in diverse situations, but to be usable for such applications the computational power and evolvability of GRNs need to be studied. In this research we propose a simple distributed system implementing GRNs to solve the well known NP-complete graph colouring problem. Every node (cell) of the graph to be coloured is controlled by an instance of the same GRN. All the cells communicate directly with their immediate neighbours in the graph so as to set up a good colouring. The quality of this colouring directs the evolution of the GRNs using a genetic algorithm. We then observe the quality of the colouring for two different graphs according to different communication protocols and the number of different proteins in the cell (a measure for the possible complexity of a GRN). Those two points, being the main scalability issues that any computational paradigm raises, will then be discussed.
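
    The paper's cells evolve GRN controllers for this task; as a hedged point of comparison only, the classic greedy local rule for distributed colouring is sketched below: each node repeatedly adopts the smallest colour not used by any neighbour. This is not the paper's GRN method, just the simple baseline such systems are usually measured against.

```python
# Greedy local-rule baseline for distributed graph colouring (NOT the GRN
# approach of the paper above): each node takes the smallest colour absent
# from its neighbourhood, sweeping until no node changes.

def greedy_local_colouring(adj, max_rounds=100):
    colour = {v: 0 for v in adj}
    for _ in range(max_rounds):
        changed = False
        for v in adj:                     # sequential sweep over the "cells"
            taken = {colour[u] for u in adj[v]}
            best = next(c for c in range(len(adj)) if c not in taken)
            if best != colour[v]:
                colour[v], changed = best, True
        if not changed:
            break
    return colour
```

    After the first full sweep the colouring is already proper (each node avoids its neighbours' current colours); later sweeps can only shrink colours, so the loop terminates.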

  8. USING SELF-ORGANIZING MAPS TO EXPLORE PATTERNS IN SPECIES RICHNESS AND PROTECTION

    EPA Science Inventory

    The combination of species distributions with abiotic and landscape variables using Geographic Information Systems can be used to help prioritize areas for biodiversity protection, although the number of variables and complexity of the relationships between them can prove difficu...

  9. Irregular Warfare Stability Model (IWSMod)

    DTIC Science & Technology

    2014-01-01

    single shot causing no casualties to a highly coordinated complex attack using two or more weapon systems. Advances in technology have allowed...to power law, the more stable the environment. To determine how close the actual distribution is to the power law, the method determines the

  10. Biomechanical comparison of two different collar structured implants supporting 3-unit fixed partial denture: a 3-D FEM study.

    PubMed

    Meriç, Gökçe; Erkmen, Erkan; Kurt, Ahmet; Eser, Atilim; Ozden, Ahmet Utku

    2012-01-01

    The purpose of the study was to compare the effects of two distinct collar geometries of implants on stress distribution in the bone as well as in the fixture-abutment complex, in the framework and in the veneering material of 3-unit fixed partial denture (FPD). The 3-dimensional finite element analysis method was selected to evaluate the stress distribution in the system composed of 3-unit FPD supported by two different dental implant systems with two distinct collar geometries; microthread collar structure (MCS) and non-microthread collar structure (NMCS). In separate load cases, 300 N vertical, 150 N oblique and 60 N horizontal, forces were utilized to simulate the multidirectional chewing forces. Tensile and compressive stress values in the cortical and cancellous bone and von Mises stresses in the fixture-abutment complex, in the framework and veneering material, were simulated as a body and investigated separately. In the cortical bone lower stress values were found in the MCS model, when compared with NMCS. In the cancellous bone, lower stress values were observed in the NMCS model when compared with MCS. In the implant-abutment complex, highest von Mises stress values were noted in the NMCS model; however, in the framework and veneering material, highest stress values were calculated in MCS model. MCS implants when compared with NMCS implants supporting 3-unit FPDs decrease the stress values in the cortical bone and implant-abutment complex. The results of the present study will be evaluated as a base for our ongoing FEA studies focused on stress distribution around the microthread and non-microthread collar geometries with various prosthesis design.

  11. A Wideband Fast Multipole Method for the two-dimensional complex Helmholtz equation

    NASA Astrophysics Data System (ADS)

    Cho, Min Hyung; Cai, Wei

    2010-12-01

    A Wideband Fast Multipole Method (FMM) for the 2D Helmholtz equation is presented. It can evaluate the interactions between N particles governed by the fundamental solution of 2D complex Helmholtz equation in a fast manner for a wide range of complex wave number k, which was not easy with the original FMM due to the instability of the diagonalized conversion operator. This paper includes the description of theoretical backgrounds, the FMM algorithm, software structures, and some test runs. Program summary: Program title: 2D-WFMM Catalogue identifier: AEHI_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHI_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 4636 No. of bytes in distributed program, including test data, etc.: 82 582 Distribution format: tar.gz Programming language: C Computer: Any Operating system: Any operating system with gcc version 4.2 or newer Has the code been vectorized or parallelized?: Multi-core processors with shared memory RAM: Depending on the number of particles N and the wave number k Classification: 4.8, 4.12 External routines: OpenMP (http://openmp.org/wp/) Nature of problem: Evaluate interaction between N particles governed by the fundamental solution of 2D Helmholtz equation with complex k. Solution method: Multilevel Fast Multipole Algorithm in a hierarchical quad-tree structure with cutoff level which combines low frequency method and high frequency method. Running time: Depending on the number of particles N, wave number k, and number of cores in CPU. CPU time increases as N log N.

  12. Verification of Space Station Secondary Power System Stability Using Design of Experiment

    NASA Technical Reports Server (NTRS)

    Karimi, Kamiar J.; Booker, Andrew J.; Mong, Alvin C.; Manners, Bruce

    1998-01-01

    This paper describes analytical methods used in the verification of large DC power systems, with applications to the International Space Station (ISS). Large DC power systems contain many switching power converters with negative-resistance characteristics. The ISS power system presents numerous stability challenges, such as complex sources and undefined loads. The Space Station program has developed impedance specifications for sources and loads; the overall approach to system stability consists of specific hardware requirements coupled with extensive system analysis and testing. Testing of large, complex distributed power systems is not practical due to the size and complexity of the system, so computer modeling has been used extensively to develop hardware specifications and to identify system configurations for lab testing. The statistical method of Design of Experiments (DoE) is used as an analysis tool for the verification of these large systems. DoE reduces the number of computer runs necessary to analyze the performance of a complex power system consisting of hundreds of DC/DC converters. DoE also provides valuable information about the effect of changes in system parameters on the performance of the system, about various operating scenarios, and about identifying those scenarios with potential for instability. In this paper we describe how we have used computer modeling to analyze a large DC power system, give a brief description of DoE, and provide examples of applying DoE to the analysis and verification of the ISS power system.
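
    The core DoE idea, screening which parameters drive a response with far fewer runs than exhaustive simulation, can be sketched with a two-level full factorial design. The toy response below is invented for illustration and has no relation to the ISS converter models:

```python
# Hedged sketch of a 2-level full factorial Design of Experiments screen.
# The response function is a made-up stand-in for a power-system simulation.
import itertools

def full_factorial(n_factors):
    """All combinations of low/high (-1/+1) settings for each factor."""
    return list(itertools.product(*[(-1.0, 1.0)] * n_factors))

def main_effects(design, responses):
    """Average change in response when each factor goes from low to high."""
    k = len(design[0])
    effects = []
    for f in range(k):
        hi = [r for run, r in zip(design, responses) if run[f] > 0]
        lo = [r for run, r in zip(design, responses) if run[f] < 0]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

# Toy model: factor 0 matters strongly, factor 1 weakly, factor 2 not at all.
design = full_factorial(3)                       # 2^3 = 8 runs
responses = [5.0 * a + 1.0 * b + 0.0 * c for a, b, c in design]
effects = main_effects(design, responses)        # ranks factor importance
```

    With hundreds of converter parameters, a fractional factorial (a chosen subset of these runs) keeps the run count tractable, which is exactly the reduction the abstract credits to DoE.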

  13. A program for the Bayesian Neural Network in the ROOT framework

    NASA Astrophysics Data System (ADS)

    Zhong, Jiahang; Huang, Run-Sheng; Lee, Shih-Chang

    2011-12-01

    We present a Bayesian Neural Network algorithm implemented in the TMVA package (Hoecker et al., 2007 [1]), within the ROOT framework (Brun and Rademakers, 1997 [2]). Compared with the conventional use of a Neural Network as a discriminator, this implementation offers advantages as a non-parametric regression tool, particularly for fitting probabilities. It provides functionalities including cost function selection, complexity control, and uncertainty estimation. An example of such an application in High Energy Physics is shown. The algorithm is available with ROOT releases later than 5.29.
    Program summary
    Program title: TMVA-BNN
    Catalogue identifier: AEJX_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJX_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: BSD license
    No. of lines in distributed program, including test data, etc.: 5094
    No. of bytes in distributed program, including test data, etc.: 1,320,987
    Distribution format: tar.gz
    Programming language: C++
    Computer: Any computer system or cluster with a C++ compiler and a UNIX-like operating system
    Operating system: Most UNIX/Linux systems. The application programs were thoroughly tested under Fedora and Scientific Linux CERN.
    Classification: 11.9
    External routines: ROOT package version 5.29 or higher (http://root.cern.ch)
    Nature of problem: Non-parametric fitting of multivariate distributions
    Solution method: An implementation of a Neural Network following the Bayesian statistical interpretation. Uses the Laplace approximation for the Bayesian marginalizations. Provides automatic complexity control and uncertainty estimation.
    Running time: Training time depends substantially on the size of the input sample, the NN topology, the number of training iterations, etc. For the example in this manuscript, about 7 min was used on a PC/Linux with 2.0 GHz processors.
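
    The Laplace approximation named in the solution method can be illustrated on a one-parameter toy problem (this sketch is not TMVA's C++ implementation): near the posterior mode, the posterior is approximated by a Gaussian whose width comes from the curvature of the negative log-posterior.

```python
# Toy Laplace approximation: fit the mean of Gaussian data (flat prior) and
# estimate its uncertainty from the curvature at the mode. Illustrative only.
import math

def neg_log_likelihood(mu, data, sigma=1.0):
    return sum((x - mu) ** 2 / (2.0 * sigma ** 2) for x in data)

def laplace_std(f, theta_map, h=1e-4):
    """1-sigma uncertainty from the numeric second derivative at the mode."""
    curv = (f(theta_map + h) - 2.0 * f(theta_map) + f(theta_map - h)) / h ** 2
    return 1.0 / math.sqrt(curv)

data = [0.9, 1.1, 1.0, 0.8, 1.2]
mu_map = sum(data) / len(data)          # mode (flat prior) = sample mean
std = laplace_std(lambda m: neg_log_likelihood(m, data), mu_map)
# For a Gaussian likelihood the exact answer is sigma / sqrt(N).
```

    In a Bayesian NN the same idea is applied in the full weight space: the Hessian of the negative log-posterior at the trained weights yields both the regularization ("complexity control") and the prediction uncertainty.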

  14. Simulated Tempering Distributed Replica Sampling, Virtual Replica Exchange, and Other Generalized-Ensemble Methods for Conformational Sampling.

    PubMed

    Rauscher, Sarah; Neale, Chris; Pomès, Régis

    2009-10-13

    Generalized-ensemble algorithms in temperature space have become popular tools to enhance conformational sampling in biomolecular simulations. A random walk in temperature leads to a corresponding random walk in potential energy, which can be used to cross over energetic barriers and overcome the problem of quasi-nonergodicity. In this paper, we introduce two novel methods: simulated tempering distributed replica sampling (STDR) and virtual replica exchange (VREX). These methods are designed to address the practical issues inherent in the replica exchange (RE), simulated tempering (ST), and serial replica exchange (SREM) algorithms. RE requires a large, dedicated, and homogeneous cluster of CPUs to function efficiently when applied to complex systems. ST and SREM both have the drawback of requiring extensive initial simulations, possibly adaptive, for the calculation of weight factors or potential energy distribution functions. STDR and VREX alleviate the need for lengthy initial simulations, and for synchronization and extensive communication between replicas. Both methods are therefore suitable for distributed or heterogeneous computing platforms. We perform an objective comparison of all five algorithms in terms of both implementation issues and sampling efficiency. We use disordered peptides in explicit water as test systems, for a total simulation time of over 42 μs. Efficiency is defined in terms of both structural convergence and temperature diffusion, and we show that these definitions of efficiency are in fact correlated. Importantly, we find that ST-based methods exhibit faster temperature diffusion and correspondingly faster convergence of structural properties compared to RE-based methods. Within the RE-based methods, VREX is superior to both SREM and RE. On the basis of our observations, we conclude that ST is ideal for simple systems, while STDR is well-suited for complex systems.
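
    The temperature move at the heart of ST-type methods can be sketched compactly. This is a generic simulated-tempering acceptance rule, not the STDR/VREX code: a jump from inverse temperature b1 to b2 at potential energy E is accepted with probability min(1, exp((b1 - b2)E + (w2 - w1))), where w1 and w2 are the per-temperature weight factors the abstract discusses.

```python
# Generic simulated-tempering Metropolis criterion (illustrative sketch).
import math
import random

def tempering_accept_prob(E, b1, b2, w1=0.0, w2=0.0):
    """Probability of accepting a jump from inverse temperature b1 to b2."""
    x = (b1 - b2) * E + (w2 - w1)
    return 1.0 if x >= 0.0 else math.exp(x)   # avoids overflow for large x

def attempt_jump(E, b1, b2, w1=0.0, w2=0.0, rng=random.random):
    return rng() < tempering_accept_prob(E, b1, b2, w1, w2)
```

    Because only the current replica's energy enters the criterion, no synchronization with other replicas is needed, which is why ST-based schemes suit the distributed, heterogeneous platforms the paper targets.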

  15. Distribution of electromagnetic field and group velocities in two-dimensional periodic systems with dissipative metallic components

    NASA Astrophysics Data System (ADS)

    Kuzmiak, Vladimir; Maradudin, Alexei A.

    1998-09-01

    We study the distribution of the electromagnetic field of the eigenmodes and corresponding group velocities associated with the photonic band structures of two-dimensional periodic systems consisting of an array of infinitely long parallel metallic rods whose intersections with a perpendicular plane form a simple square lattice. We consider both nondissipative and lossy metallic components characterized by a complex frequency-dependent dielectric function. Our analysis is based on the calculation of the complex photonic band structure obtained by using a modified plane-wave method that transforms the problem of solving Maxwell's equations into the problem of diagonalizing an equivalent non-Hermitian matrix. In order to investigate the nature and the symmetry properties of the eigenvectors, which significantly affect the optical properties of the photonic lattices, we evaluate the associated field distribution at the high symmetry points and along high symmetry directions in the two-dimensional first Brillouin zone of the periodic system. By considering both lossless and lossy metallic rods we study the effect of damping on the spatial distribution of the eigenvectors. Then we use the Hellmann-Feynman theorem and the eigenvectors and eigenfrequencies obtained from a photonic band-structure calculation based on a standard plane-wave approach applied to the nondissipative system to calculate the components of the group velocities associated with individual bands as functions of the wave vector in the first Brillouin zone. From the group velocity of each eigenmode the flow of energy is examined. The results obtained indicate a strong directional dependence of the group velocity, and confirm the experimental observation that a photonic crystal is a potentially efficient tool in controlling photon propagation.

  16. A Content Markup Language for Data Services

    NASA Astrophysics Data System (ADS)

    Noviello, C.; Acampa, P.; Mango Furnari, M.

    Network content delivery and document sharing are possible using a variety of technologies, such as distributed databases and service-oriented applications. The development of such systems is a complex job, because the document life cycle involves close cooperation between domain experts and software developers. Furthermore, emerging software methodologies, such as service-oriented architecture and knowledge organization (e.g., the semantic web), have not fully solved the problems faced in a real distributed, cooperative setting. In this chapter the authors' efforts to design and deploy a distributed, cooperative content management system are described. The main features of the system are a user-configurable document type definition and a management middleware layer, which allows CMS developers to orchestrate the composition of specialized software components around the structure of a document. The chapter also reports some of the experiences gained in deploying the developed framework in a cultural heritage dissemination setting.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Module-level power electronics, such as DC power optimizers, microinverters, and those found in AC modules, are increasing in popularity in smaller-scale photovoltaic (PV) systems as their prices continue to decline. Therefore, it is important to provide PV modelers with guidelines about how to model these distributed power electronics appropriately in PV modeling software. This paper extends the work completed at NREL that provided recommendations to model the performance of distributed power electronics in NREL's popular PVWatts calculator [1], to provide similar guidelines for modeling these technologies in NREL's more complex System Advisor Model (SAM).

  18. Preparation of organogel with tea polyphenols complex for enhancing the antioxidation properties of edible oil.

    PubMed

    Shi, Rong; Zhang, Qiuyue; Vriesekoop, Frank; Yuan, Qipeng; Liang, Hao

    2014-08-20

    Food-grade organogels are semisolid systems in which liquid edible oil is immobilized in a three-dimensional network of self-assembled gelators, and they are expected to have a broad range of potential applications in the food industry. In this work, an edible organogel containing tea polyphenols was developed, which possesses a highly effective antioxidative function. To enhance the dispersibility of the tea polyphenols in the oil phase, a solid lipid-surfactant-tea polyphenols complex (organogel complex) was first prepared according to a novel method. A food-grade organogel was then prepared by mixing this organogel complex with fresh peanut oil. Compared with adding free tea polyphenols, the organogel complex could be distributed more homogeneously in the prepared organogel system, especially under heating conditions. Furthermore, the organogel loaded with tea polyphenols exhibited 2.5-fold higher antioxidant performance than other chemically synthesized antioxidants (butylated hydroxytoluene and propyl gallate), as evaluated by the peroxide value of the fresh peanut-oil-based organogel under accelerated oxidation conditions.

  19. Mass balance of metal species in supercritical fluid extraction using sodium diethyldithiocarbamate and dibutylammonium dibutyldithiocarbamate.

    PubMed

    Wang, Joanna Shaofen; Chiu, Kong-Hwa

    2006-03-01

    The objective of this work is to track the amount of metal complexes distributed in the extraction cell, collection vial, and tubing used in supercritical fluid extraction (SFE) systems after progressive removal of metal ions in supercritical carbon dioxide (SC-CO2). Sodium diethyldithiocarbamate (NaDDC) and dibutylammonium dibutyldithiocarbamate (DBDC) ligands were used to form complexes with Cd, Cu, Pb, and Zn, with CO2/5% methanol as the supercritical fluid. The mass balance of metal complexes was obtained before and after extraction, and metals in different locations in the system were flushed out using an organic solvent and nitric acid (HNO3). These results suggest that the stability constant (beta) of the metal-ligand complex has a strong correlation with SFE. Because of the composition of the stainless-steel cell, Fe, Cr, Ni, or other trace elements in the cell might interfere with the mass balance of metal complexes in SFE due to an exchange mechanism taking place between the cell and the sample.

  20. Rényi entropies characterizing the shape and the extension of the phase space representation of quantum wave functions in disordered systems.

    PubMed

    Varga, Imre; Pipek, János

    2003-08-01

    We discuss some properties of the generalized entropies, called Rényi entropies, and their application to the case of continuous distributions. In particular, it is shown that these measures of complexity can be divergent; however, their differences are free from these divergences, thus enabling them to be good candidates for the description of the extension and the shape of continuous distributions. We apply this formalism to the projection of wave functions onto the coherent state basis, i.e., to the Husimi representation. We also show how the localization properties of the Husimi distribution on average can be reconstructed from its marginal distributions that are calculated in position and momentum space in the case when the phase space has no structure, i.e., no classical limit can be defined. Numerical simulations on a one-dimensional disordered system corroborate our expectations.
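
    For a discrete distribution the Rényi entropies discussed above take a simple form, H_q = ln(Σ_i p_i^q) / (1 - q), with the Shannon entropy recovered in the limit q → 1. The short sketch below computes them directly (a generic textbook formula, not the authors' code):

```python
# Rényi entropy of a discrete probability distribution p (generic sketch).
import math

def renyi_entropy(p, q):
    """H_q = ln(sum p_i^q) / (1 - q); Shannon entropy at q == 1."""
    if abs(q - 1.0) < 1e-12:                       # q -> 1 limit
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return math.log(sum(pi ** q for pi in p if pi > 0)) / (1.0 - q)
```

    The paper's point carries over: for continuous distributions the individual entropies can diverge, but differences such as H_q - H_{q'} stay finite, which is what makes them usable as shape and extension measures.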

  1. Data Collection for Mobile Group Consumption: An Asynchronous Distributed Approach.

    PubMed

    Zhu, Weiping; Chen, Weiran; Hu, Zhejie; Li, Zuoyou; Liang, Yue; Chen, Jiaojiao

    2016-04-06

    Mobile group consumption refers to consumption by a group of people, such as a couple, a family, colleagues, or friends, based on mobile communications. It differs from consumption involving only individuals because of the complex relations among group members. Existing data collection systems for mobile group consumption are centralized, which has the disadvantages of being a performance bottleneck, having a single point of failure, and increasing business and security risks. Moreover, these data collection systems are based on a synchronized clock, which is often unrealistic because of hardware constraints, privacy concerns, or synchronization cost. In this paper, we propose the first asynchronous distributed approach to collecting data generated by mobile group consumption. We formally build a system model based on asynchronous distributed communication and design a simulation system for the model, for which we propose a three-layer solution framework. We then describe how to detect the causality relation of two or three gathering events that happened in the system based on the collected data. Various definitions of causality relations based on asynchronous distributed communication are supported. Extensive simulation results show that the proposed approach is effective for data collection relating to mobile group consumption.
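
    Detecting causality between events without a synchronized clock is classically done with vector clocks and the happened-before relation. The sketch below is that standard mechanism, not the paper's specific protocol; the function names are invented for illustration:

```python
# Standard vector-clock bookkeeping for asynchronous distributed events.
# Generic illustration; not the paper's data-collection protocol.

def new_clock(n):                      # one counter per process
    return [0] * n

def local_event(clock, pid):
    clock[pid] += 1

def send(clock, pid):
    """Tick and return the timestamp to piggyback on the message."""
    local_event(clock, pid)
    return list(clock)

def receive(clock, pid, msg_ts):
    """Merge the sender's timestamp, then tick the local counter."""
    for i in range(len(clock)):
        clock[i] = max(clock[i], msg_ts[i])
    local_event(clock, pid)

def happened_before(a, b):
    """True if the event stamped `a` causally precedes the event stamped `b`."""
    return all(x <= y for x, y in zip(a, b)) and a != b

def concurrent(a, b):
    return not happened_before(a, b) and not happened_before(b, a)
```

    Comparing the timestamps of two gathering events then classifies them as causally ordered or concurrent, which is the kind of causality detection the abstract describes performing offline on the collected data.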

  2. A modular approach to addressing model design, scale, and parameter estimation issues in distributed hydrological modelling

    USGS Publications Warehouse

    Leavesley, G.H.; Markstrom, S.L.; Restrepo, Pedro J.; Viger, R.J.

    2002-01-01

    A modular approach to model design and construction provides a flexible framework in which to focus the multidisciplinary research and operational efforts needed to facilitate the development, selection, and application of the most robust distributed modelling methods. A variety of modular approaches have been developed, but with little consideration for compatibility among systems and concepts. Several systems are proprietary, limiting any user interaction. The US Geological Survey modular modelling system (MMS) is a modular modelling framework that uses an open source software approach to enable all members of the scientific community to address collaboratively the many complex issues associated with the design, development, and application of distributed hydrological and environmental models. Implementation of a common modular concept is not a trivial task. However, it brings the resources of a larger community to bear on the problems of distributed modelling, provides a framework in which to compare alternative modelling approaches objectively, and provides a means of sharing the latest modelling advances. The concepts and components of the MMS are described and an example application of the MMS, in a decision-support system context, is presented to demonstrate current system capabilities. Copyright © 2002 John Wiley and Sons, Ltd.

  3. Off-the-shelf Control of Data Analysis Software

    NASA Astrophysics Data System (ADS)

    Wampler, S.

    The Gemini Project must provide convenient access to data analysis facilities to a wide user community. The international nature of this community makes the selection of data analysis software particularly interesting, with staunch advocates of systems such as ADAM and IRAF among the users. Additionally, the continuing trends towards increased use of networked systems and distributed processing impose additional complexity. To meet these needs, the Gemini Project is proposing the novel approach of using low-cost, off-the-shelf software to abstract out both the control and distribution of data analysis from the functionality of the data analysis software. For example, the orthogonal nature of control versus function means that users might select analysis routines from both ADAM and IRAF as appropriate, distributing these routines across a network of machines. It is the belief of the Gemini Project that this approach results in a system that is highly flexible, maintainable, and inexpensive to develop. The Khoros visualization system is presented as an example of control software that is currently available for providing the control and distribution within a data analysis system. The visual programming environment provided with Khoros is also discussed as a means to providing convenient access to this control.

  4. Data Collection for Mobile Group Consumption: An Asynchronous Distributed Approach †

    PubMed Central

    Zhu, Weiping; Chen, Weiran; Hu, Zhejie; Li, Zuoyou; Liang, Yue; Chen, Jiaojiao

    2016-01-01

    Mobile group consumption refers to consumption by a group of people, such as a couple, a family, colleagues, or friends, based on mobile communications. It differs from consumption involving only individuals because of the complex relations among group members. Existing data collection systems for mobile group consumption are centralized, which has the disadvantages of being a performance bottleneck, having a single point of failure, and increasing business and security risks. Moreover, these data collection systems are based on a synchronized clock, which is often unrealistic because of hardware constraints, privacy concerns, or synchronization cost. In this paper, we propose the first asynchronous distributed approach to collecting data generated by mobile group consumption. We formally build a system model based on asynchronous distributed communication and design a simulation system for the model, for which we propose a three-layer solution framework. We then describe how to detect the causality relation of two or three gathering events that happened in the system based on the collected data. Various definitions of causality relations based on asynchronous distributed communication are supported. Extensive simulation results show that the proposed approach is effective for data collection relating to mobile group consumption. PMID:27058544

  5. XSUMMER - Transcendental functions and symbolic summation in FORM

    NASA Astrophysics Data System (ADS)

    Moch, S.; Uwer, P.

    2006-05-01

    Harmonic sums and their generalizations are extremely useful in the evaluation of higher-order perturbative corrections in quantum field theory. Of particular interest have been the so-called nested sums, where the harmonic sums and their generalizations appear as building blocks, originating for example from the expansion of generalized hypergeometric functions around integer values of the parameters. In this paper we discuss the implementation of several algorithms to solve these sums by algebraic means, using the computer algebra system FORM.
    Program summary
    Title of program: XSUMMER
    Catalogue identifier: ADXQ_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXQ_v1_0
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    License: GNU Public License and FORM License
    Computers: all
    Operating system: all
    Program language: FORM
    Memory required to execute: Depending on the complexity of the problem; at least 64 MB RAM recommended
    No. of lines in distributed program, including test data, etc.: 9854
    No. of bytes in distributed program, including test data, etc.: 126 551
    Distribution format: tar.gz
    Other programs called: none
    External files needed: none
    Nature of the physical problem: Systematic expansion of higher transcendental functions in a small parameter. The expansions arise in the calculation of loop integrals in perturbative quantum field theory.
    Method of solution: Algebraic manipulation of nested sums.
    Restrictions on complexity of the problem: Usually limited only by the available disk space.
    Typical running time: Dependent on the complexity of the problem.

  6. Exact sampling hardness of Ising spin models

    NASA Astrophysics Data System (ADS)

    Fefferman, B.; Foss-Feig, M.; Gorshkov, A. V.

    2017-09-01

    We study the complexity of classically sampling from the output distribution of an Ising spin model, which can be implemented naturally in a variety of atomic, molecular, and optical systems. In particular, we construct a specific example of an Ising Hamiltonian that, after time evolution starting from a trivial initial state, produces a particular output configuration with probability very nearly proportional to the square of the permanent of a matrix with arbitrary integer entries. In a similar spirit to boson sampling, the ability to sample classically from the probability distribution induced by time evolution under this Hamiltonian would imply unlikely complexity theoretic consequences, suggesting that the dynamics of such a spin model cannot be efficiently simulated with a classical computer. Physical Ising spin systems with problem sizes (i.e., qubit numbers) large enough that classical sampling of the output distribution is difficult in practice may be achievable in the near future. Unlike boson sampling, our current results only imply hardness of exact classical sampling, leaving open the important question of whether a much stronger approximate-sampling hardness result holds in this context. The latter is most likely necessary to enable a convincing experimental demonstration of quantum supremacy. As referenced in a recent paper [A. Bouland, L. Mancinska, and X. Zhang, in Proceedings of the 31st Conference on Computational Complexity (CCC 2016), Leibniz International Proceedings in Informatics (Schloss Dagstuhl-Leibniz-Zentrum für Informatik, Dagstuhl, 2016)], our result completes the sampling hardness classification of two-qubit commuting Hamiltonians.
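
    The hardness argument hinges on the matrix permanent, whose evaluation is #P-hard. For intuition, the best known general-purpose algorithm, Ryser's inclusion-exclusion formula, is easy to state (this is the textbook formula, included only to show why even "fast" permanent evaluation is exponential):

```python
# Ryser's formula: perm(M) = sum over nonempty column subsets S of
# (-1)^(n-|S|) * prod_i (sum_{j in S} M[i][j]).  O(2^n * n^2) time,
# versus n! terms for the naive definition -- still exponential.
import itertools

def permanent(M):
    n = len(M)
    total = 0
    for r in range(1, n + 1):
        for cols in itertools.combinations(range(n), r):
            prod = 1
            for row in M:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** (n - r) * prod
    return total
```

    Since the output probabilities of the constructed Hamiltonian track squared permanents, a classical sampler that were efficient would collapse complexity classes, which is the paper's central point.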

  7. Distribution of radionuclides during melting of carbon steel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thurber, W.C.; MacKinney, J.

    1997-02-01

    During the melting of steel with radioactive contamination, radionuclides may be distributed among the metal product, the home scrap, the slag, the furnace lining and the off-gas collection system. In addition, some radionuclides will pass through the furnace system and vent to the atmosphere. To estimate radiological impacts of recycling radioactive scrap steel, it is essential to understand how radionuclides are distributed within the furnace system. For example, an isotope of a gaseous element (e.g., radon) will exhaust directly from the furnace system into the atmosphere while a relatively non-volatile element (e.g., manganese) can be distributed among all the other possible media. This distribution of radioactive contaminants is a complex process that can be influenced by numerous chemical and physical factors, including composition of the steel bath, chemistry of the slag, vapor pressure of the particular element of interest, solubility of the element in molten iron, density of the oxide(s), steel melting temperature and melting practice (e.g., furnace type and size, melting time, method of carbon adjustment and method of alloy additions). This paper discusses the distribution of various elements with particular reference to electric arc furnace steelmaking. The first two sections consider the calculation of partition ratios for elements between metal and slag based on thermodynamic considerations. The third section presents laboratory and production measurements of the distribution of various elements among slag, metal, and the off-gas collection system; and the final section provides recommendations for the assumed distribution of each element of interest.

  8. Application of Bayesian inference to the study of hierarchical organization in self-organized complex adaptive systems

    NASA Astrophysics Data System (ADS)

    Knuth, K. H.

    2001-05-01

    We consider the application of Bayesian inference to the study of self-organized structures in complex adaptive systems. In particular, we examine the distribution of elements, agents, or processes in systems dominated by hierarchical structure. We demonstrate that results obtained by Caianiello [1] on Hierarchical Modular Systems (HMS) can be found by applying Jaynes' Principle of Group Invariance [2] to a few key assumptions about our knowledge of hierarchical organization. Subsequent application of the Principle of Maximum Entropy allows inferences to be made about specific systems. The utility of the Bayesian method is considered by examining both successes and failures of the hierarchical model. We discuss how Caianiello's original statements suffer from the Mind Projection Fallacy [3] and we restate his assumptions thus widening the applicability of the HMS model. The relationship between inference and statistical physics, described by Jaynes [4], is reiterated with the expectation that this realization will aid the field of complex systems research by moving away from often inappropriate direct application of statistical mechanics to a more encompassing inferential methodology.

  9. Automatic selection of dynamic data partitioning schemes for distributed memory multicomputers

    NASA Technical Reports Server (NTRS)

    Palermo, Daniel J.; Banerjee, Prithviraj

    1995-01-01

    For distributed memory multicomputers such as the Intel Paragon, the IBM SP-2, the NCUBE/2, and the Thinking Machines CM-5, the quality of the data partitioning for a given application is crucial to obtaining high performance. This task has traditionally been the user's responsibility, but in recent years much effort has been directed to automating the selection of data partitioning schemes. Several researchers have proposed systems that are able to produce data distributions that remain in effect for the entire execution of an application. For complex programs, however, such static data distributions may be insufficient to obtain acceptable performance. The selection of distributions that dynamically change over the course of a program's execution adds another dimension to the data partitioning problem. In this paper, we present a technique that can be used to automatically determine which partitionings are most beneficial over specific sections of a program while taking into account the added overhead of performing redistribution. This system is being built as part of the PARADIGM (PARAllelizing compiler for DIstributed memory General-purpose Multicomputers) project at the University of Illinois. The complete system will provide a fully automated means to parallelize programs written in a serial programming model obtaining high performance on a wide range of distributed-memory multicomputers.
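
    The selection problem the abstract describes, choosing a (possibly different) data distribution for each program phase while charging for redistribution between phases, has a natural shortest-path structure. The sketch below is an illustrative dynamic-programming cost model, not PARADIGM's actual algorithm; the cost tables are assumed inputs:

```python
# Illustrative DP for phase-wise distribution selection with redistribution
# costs (not the PARADIGM implementation).

def best_distribution_plan(exec_cost, redist_cost):
    """exec_cost[p][d]: cost of running phase p with distribution d.
    redist_cost[a][b]: cost of switching from distribution a to b.
    Returns (total cost, chosen distribution per phase)."""
    n_dist = len(exec_cost[0])
    best = list(exec_cost[0])          # cheapest cost ending phase 0 in d
    backptrs = []
    for phase in range(1, len(exec_cost)):
        new_best, back = [], []
        for d in range(n_dist):
            cands = [best[p] + redist_cost[p][d] + exec_cost[phase][d]
                     for p in range(n_dist)]
            p_star = min(range(n_dist), key=lambda p: cands[p])
            new_best.append(cands[p_star])
            back.append(p_star)
        best = new_best
        backptrs.append(back)
    d = min(range(n_dist), key=lambda i: best[i])
    total, plan = best[d], [d]
    for back in reversed(backptrs):    # walk the back-pointers to recover plan
        d = back[d]
        plan.append(d)
    plan.reverse()
    return total, plan
```

    When redistribution is cheap the plan switches distributions between phases; when it is expensive the same distribution is kept throughout, which mirrors the static-versus-dynamic trade-off discussed above.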

  10. An evaluative model of system performance in manned teleoperational systems

    NASA Technical Reports Server (NTRS)

    Haines, Richard F.

    1989-01-01

    Manned teleoperational systems are used in aerospace operations in which humans must interact with machines remotely. Manual guidance of remotely piloted vehicles, controlling a wind tunnel, and carrying out a scientific procedure remotely are examples of teleoperation. A four-input-parameter throughput (Tp) model is presented which can be used to evaluate complex, manned, teleoperations-based systems and make critical comparisons among candidate control systems. The first two parameters of this model deal with nominal (A) and off-nominal (B) predicted events, while the last two focus on measured events of two types: human performance (C) and system performance (D). Digital simulations showed that the expression A(1-B)/(C+D) produced the greatest homogeneity of variance and distribution symmetry. Results from a recently completed manned life science telescience experiment will be used to further validate the model. Complex, interacting teleoperational systems may be systematically evaluated using this expression, much like a computer benchmark is used.
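
    The throughput expression Tp = A(1-B)/(C+D) is simple enough to apply directly when comparing candidate control systems. The helper below uses the four parameters as named in the abstract; the numeric values (and the treatment of B as a fraction) are assumptions for illustration only:

```python
# Throughput model Tp = A(1-B)/(C+D) from the abstract; parameter values and
# the interpretation of B as a fraction are illustrative assumptions.

def throughput(A, B, C, D):
    """A: nominal predicted events; B: off-nominal predicted events (fraction);
    C: measured human performance; D: measured system performance."""
    return A * (1.0 - B) / (C + D)

# Comparing two hypothetical candidate control systems:
tp_candidate_1 = throughput(A=100, B=0.10, C=20, D=10)   # fewer off-nominals
tp_candidate_2 = throughput(A=100, B=0.25, C=20, D=10)   # more off-nominals
```

    Under this reading, the candidate with the higher Tp (here candidate 1) would be preferred, in the benchmark-like spirit the abstract describes.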

  11. Places to Intervene to Make Complex Food Systems More Healthy, Green, Fair, and Affordable

    PubMed Central

    Malhi, Luvdeep; Karanfil, Özge; Merth, Tommy; Acheson, Molly; Palmer, Amanda; Finegood, Diane T.

    2009-01-01

    A Food Systems and Public Health conference was convened in April 2009 to consider research supporting food systems that are healthy, green, fair, and affordable. We used a complex systems framework to examine the contents of background material provided to conference participants. Application of our intervention-level framework (paradigm, goals, system structure, feedback and delays, structural elements) enabled comparison of the conference themes of healthy, green, fair, and affordable. At the level of system structure suggested actions to achieve these goals are fairly compatible, including broad public discussion and implementation of policies and programs that support sustainable food production and distribution. At the level of paradigm and goals, the challenge of making healthy and green food affordable becomes apparent as some actions may be in conflict. Systems thinking can provide insight into the challenges and opportunities to act to make the food supply more healthy, green, fair, and affordable. PMID:23173029

  12. X-ray tomography using the full complex index of refraction.

    PubMed

    Nielsen, M S; Lauridsen, T; Thomsen, M; Jensen, T H; Bech, M; Christensen, L B; Olsen, E V; Hviid, M; Feidenhans'l, R; Pfeiffer, F

    2012-10-07

    We report on x-ray tomography using the full complex index of refraction recorded with a grating-based x-ray phase-contrast setup. Combining simultaneous absorption and phase-contrast information, the distribution of the full complex index of refraction is determined and depicted in a bivariate graph. A simple multivariable threshold segmentation can be applied offering higher accuracy than with a single-variable threshold segmentation as well as new possibilities for the partial volume analysis and edge detection. It is particularly beneficial for low-contrast systems. In this paper, this concept is demonstrated by experimental results.
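
    The multivariable threshold segmentation mentioned above amounts to classifying each voxel by a rectangle in the bivariate (phase decrement, absorption) plane rather than by an interval on a single axis. The sketch below is a generic illustration with invented class names and thresholds, not the authors' analysis code:

```python
# Bivariate threshold segmentation sketch: each voxel carries two measured
# quantities (d, b) from the complex refractive index; a class is a rectangle
# in the (d, b) plane. Class names and thresholds are invented.

def segment(voxels, classes):
    """voxels: list of (d, b) pairs; classes: {name: (dmin, dmax, bmin, bmax)}.
    Returns the first matching class name per voxel, or None."""
    labels = []
    for d, b in voxels:
        label = None
        for name, (dmin, dmax, bmin, bmax) in classes.items():
            if dmin <= d < dmax and bmin <= b < bmax:
                label = name
                break
        labels.append(label)
    return labels
```

    Two materials with similar absorption but different phase decrement (or vice versa) fall into different rectangles even though no single-variable threshold separates them, which is the accuracy gain the abstract reports for low-contrast systems.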

  13. Identifying apparent local stable isotope equilibrium in a complex non-equilibrium system.

    PubMed

    He, Yuyang; Cao, Xiaobin; Wang, Jianwei; Bao, Huiming

    2018-02-28

    Although they are out of equilibrium, biomolecules in organisms have the potential to approach isotope equilibrium locally because enzymatic reactions are intrinsically reversible. However, a rigorous approach that can describe the isotope distribution among biomolecules and their apparent deviation from the equilibrium state has been lacking. Applying the concept of a distance matrix from graph theory, we propose that apparent local isotope equilibrium among a subset of biomolecules can be assessed using an apparent fractionation difference (|Δα|) matrix, in which the differences between the observed isotope compositions (δ') and the calculated equilibrium fractionation factors (1000lnβ) can be evaluated more rigorously than with a previous approach for multiple biomolecules. We tested the |Δα| matrix approach by re-analyzing published data for different amino acids (AAs) in potato and in green algae. Our re-analysis shows that biosynthesis pathways could be the reason for an apparently close-to-equilibrium relationship within AA families in potato leaves, and that different biosynthesis/degradation pathways in tubers may have led to the observed difference in isotope distribution between potato leaves and tubers. The analysis of the green algae data does not support the conclusion that AAs are further from equilibrium in glucose-cultured green algae than in autotrophic ones. Application of the |Δα| matrix can help locate potential reversible reactions or reaction networks in a complex system such as a metabolic system. The same approach can be applied broadly to complex systems with multiple components, e.g. geochemical or atmospheric systems of the early Earth or other planets. Copyright © 2017 John Wiley & Sons, Ltd.
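    The |Δα| matrix idea can be sketched pairwise: for compounds i and j, compare the observed composition difference (δ'ᵢ − δ'ⱼ) with the calculated equilibrium fractionation (1000lnβᵢ − 1000lnβⱼ); a small entry suggests the pair is near local equilibrium. The numbers below are made up for illustration, not from the paper.

```python
import numpy as np

# Sketch of an apparent fractionation difference (|Δα|) matrix. For each pair
# of compounds, subtract the calculated equilibrium fractionation from the
# observed isotope-composition difference; small entries hint at local
# equilibrium. All input values here are invented, in permil.
def abs_delta_alpha(delta_prime, ln_beta_1000):
    d = np.asarray(delta_prime, float)
    b = np.asarray(ln_beta_1000, float)
    observed = d[:, None] - d[None, :]       # delta'_i - delta'_j for every pair
    equilibrium = b[:, None] - b[None, :]    # 1000lnbeta_i - 1000lnbeta_j
    return np.abs(observed - equilibrium)

# Toy data for three amino acids (illustrative numbers only).
aa = ["Ala", "Gly", "Glu"]
delta_prime = [-25.0, -23.5, -27.0]
ln_beta = [170.0, 168.7, 171.9]

M = abs_delta_alpha(delta_prime, ln_beta)
print(np.round(M, 2))
```

    The matrix is symmetric with a zero diagonal, so only the upper triangle needs inspection when screening a larger set of biomolecules.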

  14. Examining System-Wide Impacts of Solar PV Control Systems with a Power Hardware-in-the-Loop Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Tess L.; Fuller, Jason C.; Schneider, Kevin P.

    2014-10-11

    High penetration levels of distributed solar PV power generation can lead to adverse power quality impacts such as excessive voltage rise, voltage flicker, and reactive power values that result in unacceptable voltage levels. Advanced inverter control schemes have been proposed that have the potential to mitigate many power quality concerns. However, closed-loop control may lead to unintended behavior in deployed systems, as complex interactions can occur between numerous operating devices. In order to enable the study of the performance of advanced control schemes in a detailed distribution system environment, a Hardware-in-the-Loop (HIL) platform has been developed. In the HIL system, GridLAB-D, a distribution system simulation tool, runs in real-time mode at the Pacific Northwest National Laboratory (PNNL) and supplies power system parameters at a point of common coupling to hardware located at the National Renewable Energy Laboratory (NREL). Hardware inverters interact with grid and PV simulators emulating an operational distribution system, and power output from the inverters is measured and sent to PNNL to update the real-time distribution system simulation. The platform is described and initial test cases are presented. The platform is used to study the system-wide impacts and the interactions of controls applied to inverters that are integrated into a simulation of the IEEE 8500-node test feeder, with inverters in either constant power factor control or active volt/VAR control. We demonstrate that this HIL platform is well-suited to the study of advanced inverter controls and their impacts on the power quality of a distribution feeder. Additionally, the results from HIL are used to validate GridLAB-D simulations of advanced inverter controls.
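    The volt/VAR control mentioned above is commonly specified as a piecewise-linear droop curve: the inverter absorbs reactive power when voltage is high and injects it when low, with a deadband around nominal. The breakpoints below are hypothetical, not the paper's settings.

```python
import numpy as np

# Illustrative sketch (not the paper's controller): a piecewise-linear
# volt/VAR droop curve of the kind studied for smart inverters. Breakpoint
# voltages and the reactive-power limit are assumed values.
def volt_var(v_pu, q_max=0.44, deadband=(0.98, 1.02), saturate=(0.95, 1.05)):
    """Reactive-power setpoint (per unit of rated VA) for a voltage in p.u.
    Positive q = injection (raises voltage), negative q = absorption."""
    v_lo_sat, v_hi_sat = saturate
    v_lo_db, v_hi_db = deadband
    return np.interp(
        v_pu,
        [v_lo_sat, v_lo_db, v_hi_db, v_hi_sat],
        [q_max, 0.0, 0.0, -q_max],
    )

print(volt_var(1.00))  # inside the deadband -> 0.0
print(volt_var(1.05))  # at the high saturation point -> -0.44
```

    The closed-loop concern studied in the paper arises because each inverter's q output shifts the feeder voltage, which in turn moves every other inverter along its own droop curve.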

  15. Preisach modeling of temperature-dependent ferroelectric response of piezoceramics at sub-switching regime

    NASA Astrophysics Data System (ADS)

    Ochoa, Diego Alejandro; García, Jose Eduardo

    2016-04-01

    The Preisach model is a classical method for describing nonlinear behavior in hysteretic systems. According to this model, a hysteretic system contains a collection of simple bistable units, each characterized by an internal field and a coercive field. The set of bistable units exhibits a statistical distribution with these fields as parameters, so the nonlinear response depends on the specific distribution function associated with the material. In this work, the model is used successfully to describe the temperature-dependent ferroelectric response of PZT- and KNN-based piezoceramics. A distribution function expanded as a Maclaurin series, retaining only the first terms in the internal field and the coercive field, is proposed. Changes in the coefficient relations of a single distribution function allow us to explain the complex temperature dependence of hard piezoceramic behavior. A similar analysis, based on the same form of the distribution function, shows that the KNL-NTS properties soften around its orthorhombic-to-tetragonal phase transition.
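    The collection of bistable units can be sketched as a discrete Preisach model: a grid of hysterons, each with an "up" switching field α and a "down" field β (α ≥ β), weighted by a distribution μ(α, β). The Gaussian weights below are a placeholder, not the Maclaurin-expanded distribution proposed in the paper.

```python
import numpy as np

# Minimal discrete Preisach sketch. Each hysteron switches up when the field
# reaches alpha and down when it falls to beta; the weighted sum of hysteron
# states gives the net (hysteretic) response. Weights are placeholder values.
class Preisach:
    def __init__(self, n=60, h_max=1.0):
        a = np.linspace(-h_max, h_max, n)
        A, B = np.meshgrid(a, a, indexing="ij")
        keep = A >= B                        # hysterons live on the half-plane
        self.alpha, self.beta = A[keep], B[keep]
        w = np.exp(-(self.alpha**2 + self.beta**2))   # placeholder distribution
        self.w = w / w.sum()
        self.state = -np.ones_like(self.w)            # all start "down"

    def apply(self, h):
        self.state[h >= self.alpha] = +1.0   # field reaches up-switch threshold
        self.state[h <= self.beta] = -1.0    # field reaches down-switch threshold
        return float(np.dot(self.w, self.state))      # net polarization

p = Preisach()
loop = [p.apply(h) for h in [1.0, 0.0, -1.0, 0.0, 1.0]]
print(np.round(loop, 3))
```

    Cycling the field 1 → 0 → −1 → 0 → 1 traces a hysteresis loop: the two readings at h = 0 differ (positive and negative remanence), which is exactly the history dependence the model encodes.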

  16. Telerobotic system performance measurement - Motivation and methods

    NASA Technical Reports Server (NTRS)

    Kondraske, George V.; Khoury, George J.

    1992-01-01

    A systems-performance-based strategy for modeling and conducting experiments relevant to the design and performance characterization of telerobotic systems is described, along with a developmental testbed consisting of a distributed telerobotics network and initial efforts to implement the strategy. Consideration is given to general systems performance theory (GSPT), originally developed for human performance problems, as a basis for: measurement of overall telerobotic system (TRS) performance; task decomposition; development of a generic TRS model; and characterization of the performance of the subsystems comprising the generic model. GSPT employs a resource construct to model performance and resource-economic principles to govern the interfacing of systems to tasks. It provides a comprehensive modeling and measurement strategy applicable to complex systems containing both human and artificial components. Application is presented within the framework of a distributed telerobotics network used as a testbed, and insight is given into the design of test protocols that elicit application-independent data.

  17. Determination of the total concentration and speciation of metal ions in river, estuarine and seawater samples.

    PubMed

    Alberti, Giancarla; Biesuz, Raffaela; Pesavento, Maria

    2008-12-01

    Different natural water samples were investigated to determine the total concentration and the distribution of species for Cu(II), Pb(II), Al(III) and U(VI). The proposed method, named resin titration (RT), was developed in our laboratory to investigate the distribution of metal-ion species in complex matrices. It is a competition method, in which a complexing resin competes with the natural ligands present in the sample for the metal ions. In the present paper, river, estuarine and seawater samples, collected during a cruise in the Adriatic Sea, were investigated. For each sample, two RTs were performed using different complexing resins, the iminodiacetic Chelex 100 and the carboxylic Amberlite CG50, making it possible to detect different classes of ligands. Satisfactory results were obtained and are commented on critically. They were summarized by principal component analysis (PCA), and correlations with physicochemical parameters allowed the evolution of the metals along the considered transect to be followed. It should be pointed out that, according to our findings, the ligands responsible for metal-ion complexation are not the major components of the water system, since the major components form considerably weaker complexes.
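    The competition principle behind resin titration can be sketched with a simple partition calculation: the resin (conditional constant K_res) and a natural ligand (constant K_lig) compete for the free metal ion, and under the simplifying assumption of excess binding sites the metal partitions in proportion to K × concentration. All constants below are invented for illustration.

```python
# Illustrative sketch of competitive complexation, not the paper's data
# treatment. Assumes excess resin sites and ligand, so each bound fraction
# scales with (conditional constant) x (free concentration of the binder).
def species_fractions(k_res, resin_sites, k_lig, ligand_conc):
    """Return (free, on_resin, on_ligand) fractions of total metal."""
    terms = [1.0, k_res * resin_sites, k_lig * ligand_conc]
    total = sum(terms)
    return tuple(t / total for t in terms)

free, on_resin, on_ligand = species_fractions(
    k_res=1e8, resin_sites=1e-3,   # hypothetical strong resin, mmol/L of sites
    k_lig=1e10, ligand_conc=1e-6,  # hypothetical strong but scarce natural ligand
)
print(f"free={free:.3g} resin={on_resin:.3g} ligand={on_ligand:.3g}")
```

    Titrating with a weaker resin (e.g. a carboxylic instead of an iminodiacetic one) shifts the partition and thereby probes a different strength class of natural ligands, which is the rationale for running two RTs per sample.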

  18. Reactive solute transport in an asymmetrical fracture-rock matrix system

    NASA Astrophysics Data System (ADS)

    Zhou, Renjie; Zhan, Hongbin

    2018-02-01

    The understanding of reactive solute transport in a single fracture-rock matrix system is the foundation for studying transport behavior in complex fractured porous media. When transport properties are asymmetrically distributed in the adjacent rock matrixes, reactive solute transport has to be treated as a coupled three-domain problem, which is more complex than the symmetric case with identical transport properties in the adjacent rock matrixes. This study deals with transport in a single fracture-rock matrix system with an asymmetrical distribution of transport properties in the rock matrixes. Mathematical models are developed for this problem under the first-type and the third-type boundary conditions to analyze the spatio-temporal concentration and mass distributions in the fracture and rock matrix, with the help of the Laplace transform technique and the de Hoog numerical inverse Laplace algorithm. The newly acquired solutions are tested extensively against previous analytical and numerical solutions and are proven to be robust and accurate. Furthermore, a water flushing phase is imposed on the left boundary of the system after a certain time. The diffusive mass exchange along the fracture/rock-matrix interfaces and the relative masses stored in each of the three domains (fracture, upper rock matrix, and lower rock matrix) after water flushing provide great insight into transport with an asymmetric distribution of transport properties. This study has the following findings: 1) The asymmetric distribution of transport properties imposes greater control on solute transport in the rock matrixes, whereas transport in the fracture is only mildly influenced. 2) The mass stored in the fracture responds quickly to water flushing, while the mass stored in the rock matrix is much less sensitive to it. 3) The diffusive mass exchange during the water flushing phase has similar patterns in the symmetric and asymmetric cases. 4) The characteristic distance, at which diffusion between the fracture and the rock matrix vanishes during the water flushing phase, is closely associated with the dispersive process in the fracture.
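    The solution strategy above (solve in the Laplace domain, invert numerically) can be illustrated self-containedly. The paper uses the de Hoog algorithm; as a simpler stand-in, the sketch below inverts F(s) = 1/(s + 1), whose exact inverse is exp(−t), with the Gaver-Stehfest scheme.

```python
import math

# Numerical inverse Laplace transform via Gaver-Stehfest (a simpler stand-in
# for the de Hoog algorithm used in the paper). Good for smooth, non-
# oscillatory f(t) at double precision with moderate n.
def stehfest_coeffs(n):
    """Gaver-Stehfest weights V_k for even n."""
    half = n // 2
    v = []
    for k in range(1, n + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, half) + 1):
            s += (j**half * math.factorial(2 * j)
                  / (math.factorial(half - j) * math.factorial(j)
                     * math.factorial(j - 1) * math.factorial(k - j)
                     * math.factorial(2 * j - k)))
        v.append((-1) ** (k + half) * s)
    return v

def invert_laplace(F, t, n=12):
    """Approximate f(t) from its Laplace transform F(s)."""
    v = stehfest_coeffs(n)
    ln2_t = math.log(2.0) / t
    return ln2_t * sum(vk * F((k + 1) * ln2_t) for k, vk in enumerate(v))

F = lambda s: 1.0 / (s + 1.0)
for t in (0.5, 1.0, 2.0):
    print(t, invert_laplace(F, t), math.exp(-t))
```

    In the paper's setting, F(s) would be the coupled three-domain concentration solution in the Laplace domain; only the inversion step is sketched here.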

  19. A Distributed Value of Information (VoI)-Based Approach for Mission-Adaptive Context-Aware Information Management and Presentation

    DTIC Science & Technology

    2016-05-16

    ...metrics involve regulating automation of complex systems, such as aircraft. Additionally, adaptive management of content in user interfaces has also... both the user and environmental context would aid in deciding how to present the information to the Warfighter. The prototype system currently... positioning system, and rate sensors can provide user-specific context to disambiguate physiologic data. The consumer "quantified self" market has driven...

  20. Efficient modeling of photonic crystals with local Hermite polynomials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boucher, C. R.; Li, Zehao; Albrecht, J. D.

    2014-04-21

    Developing compact algorithms for accurate electrodynamic calculations with minimal computational cost is an active area of research given the increasing complexity in the design of electromagnetic composite structures such as photonic crystals, metamaterials, optical interconnects, and on-chip routing. We show that electric and magnetic (EM) fields can be calculated using scalar Hermite interpolation polynomials as the numerical basis functions without having to invoke edge-based vector finite elements to suppress spurious solutions or to satisfy boundary conditions. This approach offers several fundamental advantages as evidenced through band structure solutions for periodic systems and through waveguide analysis. Compared with reciprocal space (plane wave expansion) methods for periodic systems, advantages are shown in computational costs, the ability to capture spatial complexity in the dielectric distributions, the demonstration of numerical convergence with scaling, and variational eigenfunctions free of numerical artifacts that arise from mixed-order real space basis sets or the inherent aberrations from transforming reciprocal space solutions of finite expansions. The photonic band structure of a simple crystal is used as a benchmark comparison and the ability to capture the effects of spatially complex dielectric distributions is treated using a complex pattern with highly irregular features that would stress spatial transform limits. This general method is applicable to a broad class of physical systems, e.g., to semiconducting lasers which require simultaneous modeling of transitions in quantum wells or dots together with EM cavity calculations, to modeling plasmonic structures in the presence of EM field emissions, and to on-chip propagation within monolithic integrated circuits.
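    The scalar Hermite basis idea can be illustrated in one dimension: cubic Hermite shape functions interpolate both nodal values and nodal derivatives, giving a C¹-continuous field on each element. The paper's 2-D finite-element basis is richer; this sketch only shows the principle.

```python
import numpy as np

# 1-D cubic Hermite shape functions on the reference element [0, 1]. Each
# element carries four degrees of freedom: value and derivative at each end.
def hermite_basis(t):
    return np.array([
        2 * t**3 - 3 * t**2 + 1,   # weight of the value at the left node
        t**3 - 2 * t**2 + t,       # weight of the derivative at the left node
        -2 * t**3 + 3 * t**2,      # weight of the value at the right node
        t**3 - t**2,               # weight of the derivative at the right node
    ])

def interpolate(f0, df0, f1, df1, t):
    """Hermite-interpolate on [0, 1] from endpoint values and slopes."""
    return np.dot([f0, df0, f1, df1], hermite_basis(t))

# Interpolate sin(x) on [0, 1] from endpoint data; check at the midpoint.
val = interpolate(np.sin(0.0), np.cos(0.0), np.sin(1.0), np.cos(1.0), 0.5)
print(val, np.sin(0.5))
```

    Because the derivative is itself a degree of freedom, adjacent elements share both value and slope at their common node, which is the smoothness property the scalar Hermite basis exploits.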
