Sample records for complex processing systems

  1. Real-time monitoring of clinical processes using complex event processing and transition systems.

    PubMed

    Meinecke, Sebastian

    2014-01-01

    Dependencies between tasks in clinical processes are often complex and error-prone. Our aim is to describe a new approach for the automatic derivation of clinical events, identified via the behaviour of IT systems, using Complex Event Processing. Furthermore, we map these events onto transition systems to monitor crucial clinical processes in real time, preventing and detecting erroneous situations.

  2. On the use of multi-agent systems for the monitoring of industrial systems

    NASA Astrophysics Data System (ADS)

    Rezki, Nafissa; Kazar, Okba; Mouss, Leila Hayet; Kahloul, Laid; Rezki, Djamil

    2016-03-01

    The objective of the current paper is to present an intelligent system for complex process monitoring, based on artificial intelligence technologies. This system aims to accomplish all of the complex process monitoring tasks: detection, diagnosis, identification and reconfiguration. For this purpose, the development of a multi-agent system that combines multiple intelligences such as multivariate control charts, neural networks, Bayesian networks and expert systems has become a necessity. The proposed system is evaluated in the monitoring of the complex Tennessee Eastman process.

  3. Integrating complex business processes for knowledge-driven clinical decision support systems.

    PubMed

    Kamaleswaran, Rishikesan; McGregor, Carolyn

    2012-01-01

    This paper presents in detail the component of the Complex Business Process for Stream Processing framework that is responsible for integrating complex business processes to enable knowledge-driven Clinical Decision Support System (CDSS) recommendations. CDSSs aid the clinician in supporting the care of patients by providing accurate data analysis and evidence-based recommendations. However, the incorporation of a dynamic knowledge-management system that supports the definition and enactment of complex business processes and real-time data streams has not been researched. In this paper we discuss the process web service as an innovative method of providing contextual information to a real-time data stream processing CDSS.

  4. Apollo Experiment Report: Lunar-Sample Processing in the Lunar Receiving Laboratory High-Vacuum Complex

    NASA Technical Reports Server (NTRS)

    White, D. R.

    1976-01-01

    A high-vacuum complex composed of an atmospheric decontamination system, sample-processing chambers, storage chambers, and a transfer system was built to process and examine lunar material while maintaining quarantine status. Problems identified, equipment modifications, and procedure changes made for Apollo 11 and 12 sample processing are presented. The sample processing experiences indicate that only a few operating personnel are required to process the sample efficiently, safely, and rapidly in the high-vacuum complex. The high-vacuum complex was designed to handle the many contingencies, both quarantine and scientific, associated with handling an unknown entity such as the lunar sample. Lunar sample handling necessitated a complex system that could not respond rapidly to changing scientific requirements as the characteristics of the lunar sample were better defined. Although the complex successfully handled the processing of Apollo 11 and 12 lunar samples, the scientific requirement for vacuum samples was deleted after the Apollo 12 mission just as the vacuum system was reaching its full potential.

  5. Simplifying the complexity surrounding ICU work processes--identifying the scope for information management in ICU settings.

    PubMed

    Munir, Samina K; Kay, Stephen

    2005-08-01

    A multi-site study, conducted in two English and two Danish intensive care units, investigates the complexity of work processes in intensive care, and the implications of this complexity for information management with regard to clinical information systems. Data were collected via observations, shadowing of clinical staff, interviews and questionnaires. The construction of role activity diagrams enabled the capture of critical care work processes. Upon analysing these diagrams, it was found that intensive care work processes consist of 'simplified-complexity'; these processes change with the introduction of information systems for the everyday use and management of all clinical information. The prevailing notion of complexity surrounding critical care clinical work processes was refuted and found to be misleading; in reality, it is not the work processes that cause the complexity, but the way in which clinical information is used and managed. This study emphasises that the potential for clinical information systems that integrate all clinical information requirements is not only immense but also very plausible.

  6. Overview of DYMCAS, the Y-12 Material Control And Accountability System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alspaugh, D. H.

    2001-07-01

    This paper gives an overview of DYMCAS, the material control and accountability information system for the Y-12 National Security Complex. A common misconception, even within the DOE community, understates the nature and complexity of material control and accountability (MC and A) systems, likening them to parcel delivery systems tracking packages at various locations or banking systems that account for money, down to the penny. A major point set forth in this paper is that MC and A systems such as DYMCAS can be and often are very complex. Given accountability reporting requirements and the critical and sensitive nature of the task, no MC and A system can be simple. The complexity of site-level accountability systems, however, varies dramatically depending on the amounts, kinds, and forms of nuclear materials and the kinds of processing performed at the site. Some accountability systems are tailored to unique and highly complex site-level materials and material processing and, consequently, are highly complex systems. Sites with less complexity require less complex accountability systems, and where processes and practices are the same or similar, sites on the mid-to-low end of the complexity scale can effectively utilize a standard accountability system. In addition to being complex, a unique feature of DYMCAS is its integration with the site production control and manufacturing system. This paper will review the advantages of such integration, as well as related challenges, and make the point that the effectiveness of complex MC and A systems can be significantly enhanced through appropriate systems integration.

  7. Improving processes through evolutionary optimization.

    PubMed

    Clancy, Thomas R

    2011-09-01

    As systems evolve over time, their natural tendency is to become increasingly more complex. Studies on complex systems have generated new perspectives on management in social organizations such as hospitals. Much of this research appears as a natural extension of the cross-disciplinary field of systems theory. This is the 18th in a series of articles applying complex systems science to the traditional management concepts of planning, organizing, directing, coordinating, and controlling. In this article, I discuss methods to optimize complex healthcare processes through learning, adaptation, and evolutionary planning.

  8. Tailoring Enterprise Systems Engineering Policy for Project Scale and Complexity

    NASA Technical Reports Server (NTRS)

    Cox, Renee I.; Thomas, L. Dale

    2014-01-01

    Space systems are characterized by varying degrees of scale and complexity. Accordingly, cost-effective implementation of systems engineering also varies depending on scale and complexity. Recognizing that systems engineering and integration happen everywhere and at all levels of a given system, and that the life cycle is an integrated process necessary to mature a design, the National Aeronautics and Space Administration's (NASA's) Marshall Space Flight Center (MSFC) has developed a suite of customized implementation approaches based on project scale and complexity. While it may be argued that a top-level systems engineering process is common to and indeed desirable across an enterprise for all space systems, implementation of that top-level process and the associated products developed as a result differ from system to system. The implementation approaches used for developing a scientific instrument necessarily differ from those used for a space station.

  9. Animal models and conserved processes

    PubMed Central

    2012-01-01

    Background The concept of conserved processes presents unique opportunities for using nonhuman animal models in biomedical research. However, the concept must be examined in the context that humans and nonhuman animals are evolved, complex, adaptive systems. Given that nonhuman animals are examples of living systems that are differently complex from humans, what does the existence of a conserved gene or process imply for inter-species extrapolation? Methods We surveyed the literature including philosophy of science, biological complexity, conserved processes, evolutionary biology, comparative medicine, anti-neoplastic agents, inhalational anesthetics, and drug development journals in order to determine the value of nonhuman animal models when studying conserved processes. Results Evolution through natural selection has employed components and processes both to produce the same outcomes among species but also to generate different functions and traits. Many genes and processes are conserved, but new combinations of these processes or different regulation of the genes involved in these processes have resulted in unique organisms. Further, there is a hierarchy of organization in complex living systems. At some levels, the components are simple systems that can be analyzed by mathematics or the physical sciences, while at other levels the system cannot be fully analyzed by reducing it to a physical system. The study of complex living systems must alternate between focusing on the parts and examining the intact whole organism while taking into account the connections between the two. Systems biology aims for this holism. We examined the actions of inhalational anesthetic agents and anti-neoplastic agents in order to address what the characteristics of complex living systems imply for inter-species extrapolation of traits and responses related to conserved processes. 
Conclusion We conclude that even the presence of conserved processes is insufficient for inter-species extrapolation when the trait or response being studied is located at higher levels of organization, is in a different module, or is influenced by other modules. However, when the examination of the conserved process occurs at the same level of organization or in the same module, and hence is subject to study solely by reductionism, then extrapolation is possible. PMID:22963674

  10. A foundational methodology for determining system static complexity using notional lunar oxygen production processes

    NASA Astrophysics Data System (ADS)

    Long, Nicholas James

    This thesis serves to develop a preliminary foundational methodology for evaluating the static complexity of future lunar oxygen production systems when extensive information is not yet available about the various systems under consideration. Evaluating static complexity, as part of an overall system complexity analysis, is an important consideration in ultimately selecting a process to be used in a lunar base. When system complexity is higher, there is generally an overall increase in risk, which could impact the safety of astronauts and the economic performance of the mission. To evaluate static complexity in lunar oxygen production, static complexity is simplified and defined in terms of its essential components. First, three essential dimensions of static complexity are investigated: interconnective complexity, strength of connections, and complexity in variety. Then a set of methods is developed with which to evaluate each dimension separately. Q-connectivity analysis is proposed as a means to evaluate interconnective complexity and strength of connections. The law of requisite variety, originating from cybernetic theory, is suggested to interpret complexity in variety. Second, a means to aggregate the results of each analysis is proposed to create a holistic measurement of static complexity using the Single Multi-Attribute Ranking Technique (SMART). Each method of static complexity analysis, and the aggregation technique, is demonstrated using notional data for four lunar oxygen production processes.
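
The SMART-style aggregation step described in this record can be sketched in miniature: each static-complexity dimension receives a score, and the scores are combined with normalized importance weights into one value. The dimension names, scores, and weights below are illustrative assumptions, not values from the thesis.

```python
# Illustrative SMART-style weighted aggregation of static-complexity
# dimensions. All scores and weights are hypothetical.

def smart_score(scores, weights):
    """Combine per-dimension scores into one weighted complexity value.

    scores  -- dict mapping dimension name to a raw score (higher = more complex)
    weights -- dict mapping dimension name to its relative importance
    """
    total_w = sum(weights.values())
    # Normalize weights so they sum to 1, then take the weighted sum.
    return sum(scores[d] * weights[d] / total_w for d in scores)

# Hypothetical scores for one candidate lunar oxygen production process.
scores = {"interconnective": 0.7, "connection_strength": 0.5, "variety": 0.3}
weights = {"interconnective": 40, "connection_strength": 35, "variety": 25}

print(round(smart_score(scores, weights), 3))  # -> 0.53
```

Candidate processes could then be ranked by this single aggregated value.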

  11. Controls for Burning Solid Wastes

    ERIC Educational Resources Information Center

    Toro, Richard F.; Weinstein, Norman J.

    1975-01-01

    Modern thermal solid waste processing systems are becoming more complex, incorporating features that require instrumentation and control systems to a degree greater than that previously required just for proper combustion control. With the advent of complex, sophisticated, thermal processing systems, TV monitoring and computer control should…

  12. A systems-based approach for integrated design of materials, products and design process chains

    NASA Astrophysics Data System (ADS)

    Panchal, Jitesh H.; Choi, Hae-Jin; Allen, Janet K.; McDowell, David L.; Mistree, Farrokh

    2007-12-01

    The concurrent design of materials and products provides designers with flexibility to achieve design objectives that were not previously accessible. However, the improved flexibility comes at a cost of increased complexity of the design process chains and the materials simulation models used for executing the design chains. Efforts to reduce the complexity generally result in increased uncertainty. We contend that a systems-based approach is essential for managing both the complexity and the uncertainty in design process chains and simulation models in concurrent material and product design. Our approach is based on simplifying the design process chains systematically such that the resulting uncertainty does not significantly affect the overall system performance. Similarly, instead of striving for accurate models for multiscale systems (which are inherently complex), we rely on making design decisions that are robust to uncertainties in the models. Accordingly, we pursue hierarchical modeling in the context of design of multiscale systems. In this paper our focus is on design process chains. We present a systems-based approach, premised on the assumption that complex systems can be designed efficiently by managing the complexity of design process chains. The approach relies on (a) the use of reusable interaction patterns to model design process chains, and (b) consideration of design process decisions using value-of-information-based metrics. The approach is illustrated using a Multifunctional Energetic Structural Material (MESM) design example. Energetic materials store considerable energy which can be released through shock-induced detonation; conventionally, they are not engineered for strength properties. The design objectives for the MESM in this paper include both sufficient strength and energy release characteristics. The design is carried out by using models at different length and time scales that simulate different aspects of the system. 
Finally, by applying the method to the MESM design problem, we show that the integrated design of materials and products can be carried out more efficiently by explicitly accounting for design process decisions with the hierarchy of models.

  13. Graphical Language for Data Processing

    NASA Technical Reports Server (NTRS)

    Alphonso, Keith

    2011-01-01

    A graphical language for processing data allows processing elements to be connected with virtual wires that represent data flows between processing modules. The processing of complex data, such as lidar data, requires many different algorithms to be applied. The purpose of this innovation is to automate the processing of complex data, such as lidar data, without the need for complex scripting and programming languages. The system consists of a set of user-interface components that allow the user to drag and drop various algorithmic and processing components onto a process graph. By working graphically, the user can completely visualize the process flow and create complex diagrams. This innovation supports the nesting of graphs, such that a graph can be included in another graph as a single step for processing. In addition to the user interface components, the system includes a set of .NET classes that represent the graph internally. These classes provide the internal system representation of the graphical user interface. The system includes a graph execution component that reads the internal representation of the graph (as described above) and executes that graph. The execution of the graph follows the interpreted model of execution, in that each node is traversed and executed from the original internal representation. In addition, there are components that allow external code elements, such as algorithms, to be easily integrated into the system, thus making the system infinitely expandable.
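
The interpreted graph-execution model this record describes can be sketched as a tiny dataflow interpreter: each node wraps a processing function, "wires" name the upstream nodes supplying its inputs, and the executor traverses nodes in dependency order. The class and function names are illustrative only; the actual innovation is .NET-based.

```python
# Minimal sketch of an interpreted dataflow graph. Names are illustrative,
# not those of the NASA system.

class Node:
    def __init__(self, name, func, inputs=()):
        self.name = name          # node label on the process graph
        self.func = func          # processing algorithm for this node
        self.inputs = inputs      # names of upstream nodes (the virtual wires)

def execute(graph):
    """Traverse and execute every node, memoizing results (interpreted model)."""
    results = {}
    def run(name):
        if name not in results:
            node = graph[name]
            args = [run(up) for up in node.inputs]   # pull data along the wires
            results[name] = node.func(*args)
        return results[name]
    for name in graph:
        run(name)
    return results

# Toy lidar-like pipeline: load points -> filter outliers -> compute a feature.
graph = {
    "load":   Node("load",   lambda: [3.0, 4.0, 100.0]),
    "filter": Node("filter", lambda pts: [p for p in pts if p < 50], ["load"]),
    "mean":   Node("mean",   lambda pts: sum(pts) / len(pts), ["filter"]),
}
print(execute(graph)["mean"])   # -> 3.5
```

Nesting a graph as a single step, as the record mentions, would amount to a node whose `func` calls `execute` on an inner graph.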

  14. Inclusive Education as Complex Process and Challenge for School System

    ERIC Educational Resources Information Center

    Al-Khamisy, Danuta

    2015-01-01

    Education may be considered as a number of processes, actions and effects affecting the human being, as the state or level of the results of these processes, or as the modification of the functions, institutions and social practice roles which, as a result of inclusion, become a new, integrated system. Thus, this is a very complex process. Nowadays the…

  15. Wave processes in the human cardiovascular system: The measuring complex, computing models, and diagnostic analysis

    NASA Astrophysics Data System (ADS)

    Ganiev, R. F.; Reviznikov, D. L.; Rogoza, A. N.; Slastushenskiy, Yu. V.; Ukrainskiy, L. E.

    2017-03-01

    A description of a complex approach to investigation of nonlinear wave processes in the human cardiovascular system based on a combination of high-precision methods of measuring a pulse wave, mathematical methods of processing the empirical data, and methods of direct numerical modeling of hemodynamic processes in an arterial tree is given.

  16. Sparse dynamical Boltzmann machine for reconstructing complex networks with binary dynamics

    NASA Astrophysics Data System (ADS)

    Chen, Yu-Zhong; Lai, Ying-Cheng

    2018-03-01

    Revealing the structure and dynamics of complex networked systems from observed data is a problem of current interest. Is it possible to develop a completely data-driven framework to decipher the network structure and different types of dynamical processes on complex networks? We develop a model named sparse dynamical Boltzmann machine (SDBM) as a structural estimator for complex networks that host binary dynamical processes. The SDBM attains its topology according to that of the original system and is capable of simulating the original binary dynamical process. We develop a fully automated method based on compressive sensing and a clustering algorithm to construct the SDBM. We demonstrate, for a variety of representative dynamical processes on model and real world complex networks, that the equivalent SDBM can recover the network structure of the original system and simulates its dynamical behavior with high precision.

  17. Sparse dynamical Boltzmann machine for reconstructing complex networks with binary dynamics.

    PubMed

    Chen, Yu-Zhong; Lai, Ying-Cheng

    2018-03-01

    Revealing the structure and dynamics of complex networked systems from observed data is a problem of current interest. Is it possible to develop a completely data-driven framework to decipher the network structure and different types of dynamical processes on complex networks? We develop a model named sparse dynamical Boltzmann machine (SDBM) as a structural estimator for complex networks that host binary dynamical processes. The SDBM attains its topology according to that of the original system and is capable of simulating the original binary dynamical process. We develop a fully automated method based on compressive sensing and a clustering algorithm to construct the SDBM. We demonstrate, for a variety of representative dynamical processes on model and real world complex networks, that the equivalent SDBM can recover the network structure of the original system and simulates its dynamical behavior with high precision.

  18. Application of advanced multidisciplinary analysis and optimization methods to vehicle design synthesis

    NASA Technical Reports Server (NTRS)

    Consoli, Robert David; Sobieszczanski-Sobieski, Jaroslaw

    1990-01-01

    Advanced multidisciplinary analysis and optimization methods, namely system sensitivity analysis and non-hierarchical system decomposition, are applied to reduce the cost and improve the visibility of an automated vehicle design synthesis process. This process is inherently complex due to the large number of functional disciplines and associated interdisciplinary couplings. Recent developments in system sensitivity analysis as applied to complex non-hierarchic multidisciplinary design optimization problems enable the decomposition of these complex interactions into sub-processes that can be evaluated in parallel. The application of these techniques results in significant cost, accuracy, and visibility benefits for the entire design synthesis process.

  19. Complexity in electronic negotiation support systems.

    PubMed

    Griessmair, Michele; Strunk, Guido; Vetschera, Rudolf; Koeszegi, Sabine T

    2011-10-01

    It is generally acknowledged that the medium influences the way we communicate, and negotiation research directs considerable attention to the impact of different electronic communication modes on the negotiation process and outcomes. Complexity theories offer models and methods that allow the investigation of how patterns and temporal sequences unfold over time in negotiation interactions. By focusing on the dynamic and interactive quality of negotiations, as well as the information, choice, and uncertainty contained in the negotiation process, the complexity perspective addresses several issues of central interest in classical negotiation research. In the present study we compare the complexity of the negotiation communication process among synchronous and asynchronous negotiations (IM vs. e-mail) as well as an electronic negotiation support system including a decision support system (DSS). For this purpose, transcripts of 145 negotiations were coded and analyzed with the Shannon entropy and the grammar complexity. Our results show that negotiating asynchronously via e-mail, as well as including a DSS, significantly reduces the complexity of the negotiation process. Furthermore, a reduction of the complexity increases the probability of reaching an agreement.
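
The Shannon entropy measure used in this record can be illustrated: once each utterance is coded into a category, the entropy of the category distribution quantifies the uncertainty, one facet of process complexity, in the interaction. The category labels below are invented for illustration and are not the study's coding scheme.

```python
from collections import Counter
from math import log2

# Shannon entropy (in bits) of a coded negotiation transcript.
# Category labels are invented, not the study's actual coding scheme.

def shannon_entropy(codes):
    counts = Counter(codes)
    n = len(codes)
    # H = -sum p * log2(p) over the observed categories.
    return -sum((c / n) * log2(c / n) for c in counts.values())

transcript = ["offer", "counteroffer", "question", "offer",
              "concession", "offer", "question", "accept"]
print(round(shannon_entropy(transcript), 3))  # -> 2.156
```

A transcript dominated by one move type scores near 0 bits; a process spread evenly over many move types scores higher, matching the paper's use of entropy as a complexity measure.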

  20. The Design, Development and Testing of a Multi-process Real-time Software System

    DTIC Science & Technology

    2007-03-01

    programming large systems stems from the complexity of dealing with many different details at one time. A sound engineering approach is to break...controls and 3) is portable to other OS platforms such as Microsoft Windows. Next, to reduce the complexity of the programming tasks, the system...processes depending on how often the process has to check to see if common data was modified. A good method for one process to quickly notify another

  1. Complexity in Soil Systems: What Does It Mean and How Should We Proceed?

    NASA Astrophysics Data System (ADS)

    Faybishenko, B.; Molz, F. J.; Brodie, E.; Hubbard, S. S.

    2015-12-01

    The complex soil systems approach is needed fundamentally for the development of integrated, interdisciplinary methods to measure and quantify the physical, chemical and biological processes taking place in soil, and to determine the role of fine-scale heterogeneities. This presentation is aimed at a review of the concepts and observations concerning complexity and complex systems theory, including terminology, emergent complexity and simplicity, self-organization and a general approach to the study of complex systems using the Weaver (1948) concept of "organized complexity." These concepts are used to provide understanding of complex soil systems, and to develop experimental and mathematical approaches to soil microbiological processes. The results of numerical simulations, observations and experiments are presented that indicate the presence of deterministic chaotic dynamics in soil microbial systems. So what are the implications for the scientists who wish to develop mathematical models in the area of organized complexity or to perform experiments to help clarify an aspect of an organized complex system? The modelers have to deal with coupled systems having at least three dependent variables, and they have to forgo making linear approximations to nonlinear phenomena. The analogous rule for experimentalists is that they need to perform experiments that involve measurement of at least three interacting entities (variables depending on time, space, and each other). These entities could be microbes in soil penetrated by roots. If a process being studied in a soil affects the soil properties, like biofilm formation, then this effect has to be measured and included. The mathematical implications of this viewpoint are examined, and results of numerical solutions to a system of equations demonstrating deterministic chaotic behavior are also discussed using time series and the 3D strange attractors.
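
The modelling rule stated in this record (at least three coupled dependent variables, no linear approximations) can be illustrated with the classic Lorenz system, the textbook three-variable example of deterministic chaos. It is used here as a generic stand-in, not as the soil-microbial model from the presentation.

```python
import numpy as np

# Lorenz system: three coupled, nonlinear dependent variables exhibiting
# deterministic chaos. A generic illustration only, not the soil model.

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    dx = sigma * (y - x)          # linear coupling between x and y
    dy = x * (rho - z) - y        # nonlinear term x*z
    dz = x * y - beta * z         # nonlinear term x*y
    return np.array([x + dt * dx, y + dt * dy, z + dt * dz])

# Two trajectories starting a tiny distance apart diverge: sensitive
# dependence on initial conditions, the hallmark of deterministic chaos.
a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-8, 0.0, 0.0])
for _ in range(3000):             # 30 time units of forward-Euler integration
    a, b = lorenz_step(a), lorenz_step(b)
print(np.linalg.norm(a - b))      # separation has grown by orders of magnitude
```

Dropping any one variable or linearizing any term destroys the chaotic behavior, which is exactly the point the record makes for both modelers and experimentalists.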

  2. Superstructure-based Design and Optimization of Batch Biodiesel Production Using Heterogeneous Catalysts

    NASA Astrophysics Data System (ADS)

    Nuh, M. Z.; Nasir, N. F.

    2017-08-01

    Biodiesel is a fuel comprised of mono-alkyl esters of long-chain fatty acids derived from renewable lipid feedstocks, such as vegetable oil and animal fat. Biodiesel production is a complex process which needs systematic design and optimization. However, no case study has applied process system engineering (PSE) elements to the superstructure optimization of the batch process, which involves complex problems and uses mixed-integer nonlinear programming (MINLP). PSE offers a solution to complex engineering systems by enabling the use of viable tools and techniques to better manage and comprehend the complexity of the system. This study aims, first, to apply PSE tools to the simulation and optimization of the biodiesel process and to develop mathematical models for the components of the plant for cases A, B and C using published kinetic data; second, to determine an economic analysis for biodiesel production, focusing on heterogeneous catalysts; and, finally, to develop the superstructure for biodiesel production using a heterogeneous catalyst. The mathematical models are developed from the superstructure, and the resulting mixed-integer nonlinear model is solved and the economic analysis estimated using MATLAB software. The result of the optimization, with the objective function of minimizing the annual production cost of the batch process, is 23.2587 million USD for case C. Overall, this application of process system engineering has optimized the modelling, design and cost estimation, resolving the complexity of batch biodiesel production and processing.

  3. Application of simplified Complexity Theory concepts for healthcare social systems to explain the implementation of evidence into practice.

    PubMed

    Chandler, Jacqueline; Rycroft-Malone, Jo; Hawkes, Claire; Noyes, Jane

    2016-02-01

    To examine the application of core concepts from Complexity Theory to explain the findings from a process evaluation undertaken in a trial evaluating implementation strategies for recommendations about reducing surgical fasting times. The proliferation of evidence-based guidance requires a greater focus on its implementation. Theory is required to explain the complex processes across the multiple healthcare organizational levels. This social healthcare context involves the interaction between professionals, patients and the organizational systems in care delivery. Complexity Theory may provide an explanatory framework for the complexities inherent in implementation in social healthcare contexts. A secondary thematic analysis of qualitative process evaluation data informed by Complexity Theory. Seminal texts applying Complexity Theory to the social context were annotated, key concepts extracted and core Complexity Theory concepts identified. These core concepts were applied as a theoretical lens to explain the themes from a process evaluation of a trial evaluating the implementation of strategies to reduce surgical fasting times. Sampled substantive texts provided a representative spread of theoretical development and application of Complexity Theory from the late 1990s to 2013 in social science, healthcare, management and philosophy. Five core Complexity Theory concepts were extracted: 'self-organization', 'interaction', 'emergence', 'system history' and 'temporality'. Application of these concepts suggests routine surgical fasting practice is habituated in the social healthcare system and therefore cannot easily be reversed. A reduction in fasting times requires an incentivised new approach to emerge in the surgical system's priority of completing the operating list. The application of Complexity Theory provides a useful explanation for resistance to changing fasting practice. 
Its utility in implementation research warrants further attention and evaluation. © 2015 John Wiley & Sons Ltd.

  4. Unstructured Cartesian/prismatic grid generation for complex geometries

    NASA Technical Reports Server (NTRS)

    Karman, Steve L., Jr.

    1995-01-01

    The generation of a hybrid grid system for discretizing complex three dimensional (3D) geometries is described. The primary grid system is an unstructured Cartesian grid automatically generated using recursive cell subdivision. This grid system is sufficient for computing Euler solutions about extremely complex 3D geometries. A secondary grid system, using triangular-prismatic elements, may be added for resolving the boundary layer region of viscous flows near surfaces of solid bodies. This paper describes the grid generation processes used to generate each grid type. Several example grids are shown, demonstrating the ability of the method to discretize complex geometries, with very little pre-processing required by the user.
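
The recursive cell subdivision this record describes can be sketched in miniature (in 2D, as a quadtree rather than the paper's 3D octree): starting from one root cell, any cell whose box crosses the body surface is split into equal children until a minimum size is reached. The circular body, radius, and refinement criterion are simplified assumptions for illustration.

```python
# Miniature 2D sketch of recursive Cartesian cell subdivision: refine any
# cell crossing the boundary of an assumed circular body of radius 0.5.

def crosses_boundary(x0, y0, size, r=0.5):
    """True if the square cell [x0,x0+size]x[y0,y0+size] crosses |p| = r."""
    nx = min(max(x0, 0.0), x0 + size)   # box point nearest the origin
    ny = min(max(y0, 0.0), y0 + size)
    fx = max(abs(x0), abs(x0 + size))   # box point farthest from the origin
    fy = max(abs(y0), abs(y0 + size))
    return nx * nx + ny * ny < r * r < fx * fx + fy * fy

def subdivide(x0, y0, size, min_size, cells):
    if size <= min_size or not crosses_boundary(x0, y0, size):
        cells.append((x0, y0, size))    # leaf cell of the Cartesian grid
        return
    half = size / 2                     # split into four equal children
    for dx in (0, half):
        for dy in (0, half):
            subdivide(x0 + dx, y0 + dy, half, min_size, cells)

cells = []
subdivide(-1.0, -1.0, 2.0, 0.25, cells)  # root cell covering [-1, 1]^2
print(len(cells))                        # small cells appear only near the body
```

Cells far from the body stay coarse while cells cut by the surface are refined, which is how the method concentrates resolution near complex geometry with little user pre-processing.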

  5. Performance evaluation of functioning of natural-industrial system of mining-processing complex with help of analytical and mathematical models

    NASA Astrophysics Data System (ADS)

    Bosikov, I. I.; Klyuev, R. V.; Revazov, V. Ch; Pilieva, D. E.

    2018-03-01

    The article describes research and analysis of hazardous processes occurring in the natural-industrial system, and assesses the effectiveness of its functioning using mathematical models. Studies of the regularities of the functioning of the natural-industrial system are becoming increasingly relevant in connection with the task of modernizing production and the economy of Russia as a whole. Because a significant amount of the data is poorly structured, it is difficult to establish regulations for the effective functioning of production processes and of social and natural complexes under which sustainable development of the natural-industrial system of the mining and processing complex would be ensured. Therefore, the scientific and applied problems whose solution allows one to formalize the hidden structural patterns of the functioning of the natural-industrial system, and to make managerial decisions of an organizational and technological nature to improve the efficiency of the system, are highly relevant.

  6. Collective Autoionization in Multiply-Excited Systems: A novel ionization process observed in Helium Nanodroplets

    PubMed Central

    LaForge, A. C.; Drabbels, M.; Brauer, N. B.; Coreno, M.; Devetta, M.; Di Fraia, M.; Finetti, P.; Grazioli, C.; Katzy, R.; Lyamayev, V.; Mazza, T.; Mudrich, M.; O'Keeffe, P.; Ovcharenko, Y.; Piseri, P.; Plekan, O.; Prince, K. C.; Richter, R.; Stranges, S.; Callegari, C.; Möller, T.; Stienkemeier, F.

    2014-01-01

    Free electron lasers (FELs) offer the unprecedented capability to study reaction dynamics and image the structure of complex systems. When multiple photons are absorbed in complex systems, a plasma-like state is formed where many atoms are ionized on a femtosecond timescale. If multiphoton absorption is resonantly-enhanced, the system becomes electronically-excited prior to plasma formation, with subsequent decay paths which have been scarcely investigated to date. Here, we show using helium nanodroplets as an example that these systems can decay by a new type of process, named collective autoionization. In addition, we show that this process is surprisingly efficient, leading to ion abundances much greater than that of direct single-photon ionization. This novel collective ionization process is expected to be important in many other complex systems, e.g. macromolecules and nanoparticles, exposed to high intensity radiation fields. PMID:24406316

  7. A Principled Approach to the Specification of System Architectures for Space Missions

    NASA Technical Reports Server (NTRS)

    McKelvin, Mark L. Jr.; Castillo, Robert; Bonanne, Kevin; Bonnici, Michael; Cox, Brian; Gibson, Corrina; Leon, Juan P.; Gomez-Mustafa, Jose; Jimenez, Alejandro; Madni, Azad

    2015-01-01

    Modern space systems are increasing in complexity and scale at an unprecedented pace. Consequently, innovative methods, processes, and tools are needed to cope with the increasing complexity of architecting these systems. A key systems challenge in practice is the ability to scale processes, methods, and tools used to architect complex space systems. Traditionally, the process for specifying space system architectures has largely relied on capturing the system architecture in informal descriptions that are often embedded within loosely coupled design documents and domain expertise. Such informal descriptions often lead to misunderstandings between design teams, ambiguous specifications, difficulty in maintaining consistency as the architecture evolves throughout the system development life cycle, and costly design iterations. Therefore, traditional methods are becoming increasingly inefficient to cope with ever-increasing system complexity. We apply the principles of component-based design and platform-based design to the development of the system architecture for a practical space system to demonstrate the feasibility of our approach using SysML. Our results show that we are able to apply a systematic design method to manage system complexity, thus enabling effective data management, semantic coherence, and traceability across different levels of abstraction in the design chain. Just as important, our approach enables interoperability among heterogeneous tools in a concurrent-engineering, model-based design environment.

  8. Grounding explanations in evolving, diagnostic situations

    NASA Technical Reports Server (NTRS)

    Johannesen, Leila J.; Cook, Richard I.; Woods, David D.

    1994-01-01

    Certain fields of practice involve the management and control of complex dynamic systems. These include flight deck operations in commercial aviation, control of space systems, anesthetic management during surgery, and chemical or nuclear process control. Fault diagnosis of these dynamic systems generally must occur with the monitored process on-line and in conjunction with maintaining system integrity. This research seeks to understand in more detail what it means for an intelligent system to function cooperatively, or as a 'team player', in complex, dynamic environments. The approach taken was to study human practitioners engaged in the management of a complex, dynamic process: anesthesiologists during neurosurgical operations. The investigation focused on understanding how team members cooperate in management and fault diagnosis and comparing this interaction to the situation with an Artificial Intelligence (AI) system that provides diagnoses and explanations. Of particular concern was to study the ways in which practitioners support one another in keeping aware of relevant information concerning the state of the monitored process and of the problem-solving process.

  9. QMU as an approach to strengthening the predictive capabilities of complex models.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gray, Genetha Anne.; Boggs, Paul T.; Grace, Matthew D.

    2010-09-01

    Complex systems are made up of multiple interdependent parts, and the behavior of the entire system cannot always be directly inferred from the behavior of the individual parts. They are nonlinear, and system responses are not necessarily additive. Examples of complex systems include energy, cyber and telecommunication infrastructures, human and animal social structures, and biological structures such as cells. To meet the goals of infrastructure development, maintenance, and protection for cyber-related complex systems, novel modeling and simulation (M&S) technology is needed. Sandia has shown success using M&S in the nuclear weapons (NW) program. However, complex systems represent a significant challenge and relative departure from the classical M&S exercises, and many of the scientific and mathematical M&S processes must be re-envisioned. Specifically, in the NW program, requirements and acceptable margins for performance, resilience, and security are well-defined and given quantitatively from the start. The Quantification of Margins and Uncertainties (QMU) process helps to assess whether or not these safety, reliability and performance requirements have been met after a system has been developed. In this sense, QMU is used as a sort of check that requirements have been met once the development process is completed. In contrast, performance requirements and margins may not have been defined a priori for many complex systems (i.e., the Internet, electrical distribution grids, etc.), particularly not in quantitative terms. This project addresses this fundamental difference by investigating the use of QMU at the start of the design process for complex systems. Three major tasks were completed. First, the characteristics of the cyber infrastructure problem were collected and considered in the context of QMU-based tools. Second, UQ methodologies for the quantification of model discrepancies were considered in the context of statistical models of cyber activity. Third, Bayesian methods for optimal testing in the QMU framework were developed. The completion of this project represents an increased understanding of how to apply and use the QMU process as a means for improving model predictions of the behavior of complex systems.
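    The basic QMU bookkeeping can be illustrated in a few lines: a performance margin is compared against a combined uncertainty to give a confidence ratio. This is a simplified sketch with made-up numbers, not Sandia's actual process:

```python
# Minimal illustration of a QMU-style check (hypothetical numbers).
# Margin M = distance between the best-estimate response and the requirement;
# Uncertainty U = combined uncertainty in both quantities. A confidence
# ratio M/U > 1 suggests the requirement is met with margin to spare.

def qmu_ratio(best_estimate, requirement, u_estimate, u_requirement):
    margin = best_estimate - requirement
    uncertainty = (u_estimate**2 + u_requirement**2) ** 0.5  # root-sum-square
    return margin / uncertainty

ratio = qmu_ratio(best_estimate=120.0, requirement=100.0,
                  u_estimate=6.0, u_requirement=8.0)
print(round(ratio, 2))  # margin 20 against combined uncertainty 10 -> 2.0
```

    The project's question is what to do when, unlike here, the requirement and margins are not given quantitatively at the outset.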

  10. A Chemical Engineer's Perspective on Health and Disease

    PubMed Central

    Androulakis, Ioannis P.

    2014-01-01

    Chemical process systems engineering considers complex supply chains which are coupled networks of dynamically interacting systems. The quest to optimize the supply chain while meeting robustness and flexibility constraints in the face of ever-changing environments necessitated the development of theoretical and computational tools for the analysis, synthesis and design of such complex engineered architectures. However, it was realized early on that optimality is a complex characteristic required to achieve proper balance between multiple, often competing, objectives. As we begin to unravel life's intricate complexities, we realize that living systems share similar structural and dynamic characteristics; hence much can be learned about biological complexity from engineered systems. In this article, we draw analogies between concepts in process systems engineering and conceptual models of health and disease; establish connections between these concepts and physiologic modeling; and describe how these mirror onto the physiological counterparts of engineered systems. PMID:25506103

  11. Exploring the application of an evolutionary educational complex systems framework to teaching and learning about issues in the science and technology classroom

    NASA Astrophysics Data System (ADS)

    Yoon, Susan Anne

    Understanding the world through a complex systems lens has recently garnered a great deal of interest in many knowledge disciplines. In the educational arena, interactional studies, through their focus on understanding patterns of system behaviour including the dynamical processes and trajectories of learning, lend support for investigating how a complex systems approach can inform educational research. This study uses previously existing literature and tools for complex systems applications and seeks to extend this research base by exploring learning outcomes of a complex systems framework when applied to curriculum and instruction. It is argued that by applying the evolutionary dynamics of variation, interaction and selection, complexity may be harnessed to achieve growth in both the social and cognitive systems of the classroom. Furthermore, if the goal of education, i.e., the social system under investigation, is to teach for understanding, conceptual knowledge of the kind described in Popper's (1972; 1976) World 3, needs to evolve. Both the study of memetic processes and knowledge building pioneered by Bereiter (cf. Bereiter, 2002) draw on the World 3 notion of ideas existing as conceptual artifacts that can be investigated as products outside of the individual mind providing an educational lens from which to proceed. The curricular topic addressed is the development of an ethical understanding of the scientific and technological issues of genetic engineering. Eleven grade 8 students are studied as they proceed through 40 hours of curricular instruction based on the complex systems evolutionary framework. Results demonstrate growth in both complex systems thinking and content knowledge of the topic of genetic engineering. Several memetic processes are hypothesized to have influenced how and why ideas change. 
Categorized by factors influencing either reflective or non-reflective selection, these processes appear to have exerted differential effects on students' abilities to think and act in complex ways at various points throughout the study. Finally, an analysis of winner and loser memes is offered that is intended to reveal information about the conceptual system---its strengths and deficiencies---that can help educators assess curricular goals and organize and construct additional educational activities.

  12. Using a biased qubit to probe complex systems

    NASA Astrophysics Data System (ADS)

    Pollock, Felix A.; Checińska, Agata; Pascazio, Saverio; Modi, Kavan

    2016-09-01

    Complex mesoscopic systems play increasingly important roles in modern science, from understanding biological functions at the molecular level to designing solid-state information processing devices. The operation of these systems typically depends on their energetic structure, yet probing their energy landscape can be extremely challenging; they have many degrees of freedom, which may be hard to isolate and measure independently. Here, we show that a qubit (a two-level quantum system) with a biased energy splitting can directly probe the spectral properties of a complex system, without knowledge of how they couple. Our work is based on the completely positive and trace-preserving map formalism, which treats any unknown dynamics as a "black-box" process. This black box contains information about the system with which the probe interacts, which we access by measuring the survival probability of the initial state of the probe as a function of the energy splitting and the process time. Fourier transforming the results yields the energy spectrum of the complex system. Without making assumptions about the strength or form of its coupling, our probe could determine aspects of a complex molecule's energy landscape as well as, in many cases, test for coherent superposition of its energy eigenstates.
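    The spectral-readout step can be illustrated with a classical toy: a time trace oscillating at a few made-up transition frequencies stands in for the measured survival probability, and a Fourier transform recovers those frequencies as peaks. This is a sketch of the signal-processing idea only, not the paper's quantum protocol:

```python
import numpy as np

# Toy spectral readout: a "survival" trace oscillating at three hypothetical
# transition frequencies; its Fourier transform recovers them as peaks.

freqs_true = [0.05, 0.125, 0.2]                 # made-up frequencies (Hz)
t = np.linspace(0.0, 200.0, 2000, endpoint=False)
survival = sum(np.cos(2 * np.pi * f * t) for f in freqs_true) / len(freqs_true)

spectrum = np.abs(np.fft.rfft(survival))
freq_axis = np.fft.rfftfreq(t.size, d=t[1] - t[0])

# The three largest non-DC bins sit at the injected frequencies.
peaks = sorted(freq_axis[np.argsort(spectrum[1:])[-3:] + 1])
print(np.round(peaks, 3).tolist())  # → [0.05, 0.125, 0.2]
```

    The frequencies here are chosen to fall exactly on FFT bins; with real data, windowing and peak interpolation would be needed.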

  13. Empirical modeling for intelligent, real-time manufacture control

    NASA Technical Reports Server (NTRS)

    Xu, Xiaoshu

    1994-01-01

    Artificial neural systems (ANS), also known as neural networks, are an attempt to develop computer systems that emulate the neural reasoning behavior of biological neural systems (e.g. the human brain). As such, they are loosely based on biological neural networks. The ANS consists of a series of nodes (neurons) and weighted connections (axons) that, when presented with a specific input pattern, can associate specific output patterns. It is essentially a highly complex, nonlinear, mathematical relationship or transform. These constructs have two significant properties that have proven useful to the authors in signal processing and process modeling: noise tolerance and complex pattern recognition. Specifically, the authors have developed a new network learning algorithm that has resulted in the successful application of ANS's to high speed signal processing and to developing models of highly complex processes. Two of the applications, the Weld Bead Geometry Control System and the Welding Penetration Monitoring System, are discussed in the body of this paper.
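    As a minimal illustration of the node/weighted-connection structure described above, here is a single artificial neuron trained with the classic perceptron (delta) rule to associate input patterns with an output pattern. This is a textbook toy, not the authors' network or their new learning algorithm:

```python
# A single neuron with weighted connections, trained to associate the
# four AND input patterns with their target outputs (illustrative only).

def step(x):
    return 1.0 if x >= 0.0 else 0.0

patterns = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = [0.0, 0.0]
bias = 0.0
rate = 0.1

for _ in range(20):                      # a few passes over the patterns
    for (x1, x2), target in patterns:
        out = step(w[0] * x1 + w[1] * x2 + bias)
        err = target - out               # delta rule: adjust along the error
        w[0] += rate * err * x1
        w[1] += rate * err * x2
        bias += rate * err

print([step(w[0] * x1 + w[1] * x2 + bias) for (x1, x2), _ in patterns])
# → [0.0, 0.0, 0.0, 1.0]
```

    Real applications such as the weld-geometry systems mentioned above use multi-layer networks, but the weighted-sum-plus-learning-rule core is the same.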

  14. Potential of Laboratory Execution Systems (LESs) to Simplify the Application of Business Process Management Systems (BPMSs) in Laboratory Automation.

    PubMed

    Neubert, Sebastian; Göde, Bernd; Gu, Xiangyu; Stoll, Norbert; Thurow, Kerstin

    2017-04-01

    Modern business process management (BPM) is increasingly interesting for laboratory automation. End-to-end workflow automation and improved top-level systems integration for information technology (IT) and automation systems are especially prominent objectives. With the ISO Standard Business Process Model and Notation (BPMN) 2.X, a system-independent and interdisciplinary accepted graphical process control notation is provided, allowing process analysis, while also being executable. The transfer of BPM solutions to structured laboratory automation places novel demands, for example, concerning the real-time-critical process and systems integration. The article discusses the potential of laboratory execution systems (LESs) for an easier implementation of the business process management system (BPMS) in hierarchical laboratory automation. In particular, complex application scenarios, including long process chains based on, for example, several distributed automation islands and mobile laboratory robots for a material transport, are difficult to handle in BPMSs. The presented approach deals with the displacement of workflow control tasks into life science specialized LESs, the reduction of numerous different interfaces between BPMSs and subsystems, and the simplification of complex process modeling. Thus, the integration effort for complex laboratory workflows can be significantly reduced for strictly structured automation solutions. An example application, consisting of a mixture of manual and automated subprocesses, is demonstrated by the presented BPMS-LES approach.

  15. Complexity Theory

    USGS Publications Warehouse

    Lee, William H K.

    2016-01-01

    A complex system consists of many interacting parts, generates new collective behavior through self-organization, and adaptively evolves through time. Many theories have been developed to study complex systems, including chaos, fractals, cellular automata, self-organization, stochastic processes, turbulence, and genetic algorithms.

  16. Artificial intelligence applied to process signal analysis

    NASA Technical Reports Server (NTRS)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.
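    The alarm-filtering idea can be sketched as a tiny rule base mapping root-cause alarms to the consequence alarms they explain, so the operator sees one diagnosis instead of a cluster. All names here are hypothetical, not from any real plant knowledge base:

```python
# Hypothetical knowledge-based alarm filter: a rule maps a "root" alarm to
# the consequence alarms it explains; explained alarms are suppressed.

RULES = {
    "coolant_pump_trip": {"low_coolant_flow", "high_core_temp", "low_loop_pressure"},
}

def filter_alarms(active):
    """Return root alarms plus any alarms no rule explains."""
    roots = {a for a in active if a in RULES}
    explained = set().union(*(RULES[r] for r in roots))
    return sorted(roots | (set(active) - explained - roots))

alarms = ["high_core_temp", "coolant_pump_trip", "low_coolant_flow", "valve_fault"]
print(filter_alarms(alarms))  # → ['coolant_pump_trip', 'valve_fault']
```

    A deployed system would add temporal reasoning and sensor validation, but the filtering principle is the same: present the cause, suppress its symptoms.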

  17. Process for the enhanced capture of heavy metal emissions

    DOEpatents

    Biswas, Pratim; Wu, Chang-Yu

    2001-01-01

    This invention is directed to a process for forming a sorbent-metal complex. The process includes oxidizing a sorbent precursor and contacting the sorbent precursor with a metallic species. The process further includes chemically reacting the sorbent precursor and the metallic species, thereby forming a sorbent-metal complex. In one particular aspect of the invention, at least a portion of the sorbent precursor is transformed into sorbent particles during the oxidation step. These sorbent particles then are contacted with the metallic species and chemically reacted with the metallic species, thereby forming a sorbent-metal complex. Another aspect of the invention is directed to a process for forming a sorbent metal complex in a combustion system. The process includes introducing a sorbent precursor into a combustion system and subjecting the sorbent precursor to an elevated temperature sufficient to oxidize the sorbent precursor and transform the sorbent precursor into sorbent particles. The process further includes contacting the sorbent particles with a metallic species and exposing the sorbent particles and the metallic species to a complex-forming temperature whereby the metallic species reacts with the sorbent particles thereby forming a sorbent-metal complex under UV irradiation.

  18. An information transfer based novel framework for fault root cause tracing of complex electromechanical systems in the processing industry

    NASA Astrophysics Data System (ADS)

    Wang, Rongxi; Gao, Xu; Gao, Jianmin; Gao, Zhiyong; Kang, Jiani

    2018-02-01

    As one of the most important approaches for analyzing the mechanism of fault pervasion, fault root cause tracing is a powerful and useful tool for detecting the fundamental causes of faults so as to prevent any further propagation and amplification. Focused on the problems arising from the lack of systematic and comprehensive integration, a novel information-transfer-based, data-driven framework for fault root cause tracing of complex electromechanical systems in the processing industry was proposed, taking into consideration the experience and qualitative analysis of conventional fault root cause tracing methods. Firstly, an improved symbolic transfer entropy method was presented to construct a directed-weighted information model for a specific complex electromechanical system based on the information flow. Secondly, considering the feedback mechanisms in the complex electromechanical systems, a method for determining the threshold values of weights was developed to explore the patterns of fault propagation. Lastly, an iterative method was introduced to identify the fault development process. The fault root cause was traced by analyzing the changes in information transfer between the nodes along with the fault propagation pathway. An actual fault root cause tracing application of a complex electromechanical system is used to verify the effectiveness of the proposed framework. A unique fault root cause is obtained regardless of the choice of the initial variable. Thus, the proposed framework can be flexibly and effectively used in fault root cause tracing for complex electromechanical systems in the processing industry, and forms the foundation for system vulnerability analysis and condition prediction, as well as other engineering applications.
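    A minimal version of the transfer-entropy building block might look like the following. The symbolization (sign of increments) and the test series are illustrative assumptions, much cruder than the improved symbolic transfer entropy in the paper; the point is only that a directed, asymmetric information-flow measure can be estimated from symbol frequencies:

```python
import math
import random
from collections import Counter

# Sketch of symbolic transfer entropy: symbolize each series, then estimate
# T(X -> Y) = sum p(y_next, y_now, x_now) * log2[ p(y_next | y_now, x_now)
#                                               / p(y_next | y_now) ].

def symbolize(series):
    """Crude symbolization: 1 where the series increases, else 0."""
    return [1 if b > a else 0 for a, b in zip(series, series[1:])]

def transfer_entropy(x, y):
    """Estimate T(X -> Y) in bits from joint symbol frequencies."""
    sx, sy = symbolize(x), symbolize(y)
    triples = list(zip(sy[1:], sy[:-1], sx[:-1]))      # (y_next, y_now, x_now)
    n = len(triples)
    p_xyz = Counter(triples)
    p_yy = Counter((yn, yo) for yn, yo, _ in triples)   # (y_next, y_now)
    p_yx = Counter((yo, xo) for _, yo, xo in triples)   # (y_now, x_now)
    p_y = Counter(yo for _, yo, _ in triples)
    te = 0.0
    for (yn, yo, xo), c in p_xyz.items():
        te += (c / n) * math.log2(c * p_y[yo] / (p_yy[(yn, yo)] * p_yx[(yo, xo)]))
    return te

random.seed(0)
x = [random.randint(0, 1) for _ in range(400)]
y = [0] + x[:-1]                      # y copies x with one step of delay
print(transfer_entropy(x, y) > transfer_entropy(y, x))
```

    Because y lags x, information flows strongly from x to y and only weakly back, so the estimated entropy is directional; the paper uses such asymmetries as edge weights when tracing fault propagation.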

  19. Modeling Stochastic Complexity in Complex Adaptive Systems: Non-Kolmogorov Probability and the Process Algebra Approach.

    PubMed

    Sulis, William H

    2017-10-01

    Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear, correlation-based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of mean and/or variance and conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.

  20. Simulating Complex, Cold-region Process Interactions Using a Multi-scale, Variable-complexity Hydrological Model

    NASA Astrophysics Data System (ADS)

    Marsh, C.; Pomeroy, J. W.; Wheater, H. S.

    2017-12-01

    Accurate management of water resources is necessary for social, economic, and environmental sustainability worldwide. In locations with seasonal snowcovers, the accurate prediction of these water resources is further complicated due to frozen soils, solid-phase precipitation, blowing snow transport, and snowcover-vegetation-atmosphere interactions. Complex process interactions and feedbacks are a key feature of hydrological systems and may result in emergent phenomena, i.e., the arising of novel and unexpected properties within a complex system. One example is the feedback associated with blowing snow redistribution, which can lead to drifts that cause locally-increased soil moisture, thus increasing plant growth that in turn subsequently impacts snow redistribution, creating larger drifts. Attempting to simulate these emergent behaviours is a significant challenge, however, and there is concern that process conceptualizations within current models are too incomplete to represent the needed interactions. An improved understanding of the role of emergence in hydrological systems often requires high resolution distributed numerical hydrological models that incorporate the relevant process dynamics. The Canadian Hydrological Model (CHM) provides a novel tool for examining cold region hydrological systems. Key features include efficient terrain representation, allowing simulations at various spatial scales, reduced computational overhead, and a modular process representation allowing for an alternative-hypothesis framework. Using both physics-based and conceptual process representations sourced from long term process studies and the current cold regions literature allows for comparison of process representations and, importantly, of their ability to produce emergent behaviours. Examining the system in a holistic, process-based manner may yield important insights and aid in the development of improved process representations.

  1. System-Level Shared Governance Structures and Processes in Healthcare Systems With Magnet®-Designated Hospitals: A Descriptive Study.

    PubMed

    Underwood, Carlisa M; Hayne, Arlene N

    The purpose was to identify and describe structures and processes of best practices for system-level shared governance in healthcare systems. Currently, more than 64.6% of US community hospitals are part of a system. System chief nurse executives (SCNEs) are challenged to establish leadership structures and processes that effectively and efficiently disseminate best practices for patients and staff across complex organizations, geographically dispersed locations, and populations. Eleven US healthcare SCNEs from the American Nurses Credentialing Center's repository of Magnet®-designated facilities participated in a 35-question interview based on Kanter's Theory of Organizational Empowerment. Most SCNEs reported the presence of more than 50% of the empowerment structures and processes in system-level shared governance. Despite the difficulties and complexities of growing health systems, SCNEs have replicated empowerment characteristics of hospital shared governance structures and processes at the system level.

  2. Feed-back between geriatric syndromes: general system theory in geriatrics.

    PubMed

    Musso, Carlos G; Núñez, Juan F Macías

    2006-01-01

    Geriatrics has described three entities: confusional syndrome, incontinence and gait disorders, calling them the geriatric giants. The ageing process also induces changes in renal physiology, such as a reduction in glomerular filtration rate and alterations in water and electrolyte handling. These ageing renal changes have been named the nephrogeriatric giants. These two groups of giants, geriatric and nephrogeriatric, can predispose and potentiate each other, leading old people to fatal outcomes. This phenomenon of feedback between geriatric syndromes has its roots in the loss of complexity that accompanies the ageing process. Complexity means that all the body systems work harmoniously. The process of senescence weakens this coordination among systems, undermining complexity and making the old person frail.

  3. Verification of Triple Modular Redundancy Insertion for Reliable and Trusted Systems

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth

    2016-01-01

    If a system is required to be protected using triple modular redundancy (TMR), improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process and the complexity of digital designs, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems.
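    The core of TMR can be illustrated with a bitwise 2-of-3 majority voter; this is a generic sketch of the redundancy idea, not the verification method the abstract proposes:

```python
# Minimal sketch of the TMR idea: three redundant copies of a computation
# feed a majority voter, so a fault in any single copy cannot corrupt
# the voted output.

def majority(a, b, c):
    """Bitwise 2-of-3 vote over integer words."""
    return (a & b) | (a & c) | (b & c)

good = 0b1011
faulty = 0b0011          # one copy upset by a single-event fault
print(bin(majority(good, good, faulty)))  # → 0b1011
```

    Verifying TMR insertion amounts to confirming that every protected signal in the netlist actually passes through such voting logic, which is the problem the paper addresses.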

  4. Part 3. Specialized aspects of GIS and spatial analysis: Garage band science and dynamic spatial models

    NASA Astrophysics Data System (ADS)

    Box, Paul W.

    GIS and spatial analysis is suited mainly for static pictures of the landscape, but many of the processes that need exploring are dynamic in nature. Dynamic processes can be complex when put in a spatial context; our ability to study such processes will probably come with advances in understanding complex systems in general. Cellular automata and agent-based models are two prime candidates for exploring complex spatial systems, but are difficult to implement. Innovative tools that help build complex simulations will create larger user communities, who will probably find novel solutions for understanding complexity. A significant source for such innovations is likely to be from the collective efforts of hobbyists and part-time programmers, who have been dubbed "garage-band scientists" in the popular press.
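    A few lines suffice to build one of the two model classes named above; this sketch runs Wolfram's elementary cellular automaton rule 30 from a single seed cell, an illustrative toy unrelated to any specific GIS tool:

```python
# Elementary cellular automaton: each cell's next state depends only on its
# own state and its two neighbours', via the lookup table encoded in RULE.

RULE = 30  # Wolfram rule number: bit k gives the output for neighbourhood k

def step(cells):
    n = len(cells)
    return [(RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)]

row = [0] * 31
row[15] = 1                      # single seed cell
for _ in range(12):
    print("".join(".#"[c] for c in row))
    row = step(row)
```

    Despite the trivial local rule, the printed pattern is already complex and aperiodic, the kind of emergent dynamics that motivates using such models for dynamic spatial processes.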

  5. Collaborative Educational Leadership: The Emergence of Human Interactional Sense-Making Process as a Complex System

    ERIC Educational Resources Information Center

    Jäppinen, Aini-Kristiina

    2014-01-01

    The article aims at explicating the emergence of human interactional sense-making process within educational leadership as a complex system. The kind of leadership is understood as a holistic entity called collaborative leadership. There, sense-making emerges across interdependent domains, called attributes of collaborative leadership. The…

  6. Expert systems for superalloy studies

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.; Kaukler, William F.

    1990-01-01

    There are many areas in science and engineering which require knowledge of an extremely complex foundation of experimental results in order to design methodologies for developing new materials or products. Superalloys fit well into this discussion in the sense that they are complex combinations of elements which exhibit certain characteristics. Obviously the use of superalloys in high performance, high temperature systems such as the Space Shuttle Main Engine is of interest to NASA. The superalloy manufacturing process is complex, and the implementation of an expert system within the design process requires some thought as to how and where it should be implemented. A major motivation is to develop a methodology to assist metallurgists in the design of superalloy materials using current expert systems technology. Hydrogen embrittlement is disastrous to rocket engines, and the heuristics can be very complex. Attacking this problem as one module in the overall design process represents a significant step forward. In order to describe the objectives of the first phase implementation, the expert system was designated the Hydrogen Environment Embrittlement Expert System (HEEES).

  7. Optimized design of embedded DSP system hardware supporting complex algorithms

    NASA Astrophysics Data System (ADS)

    Li, Yanhua; Wang, Xiangjun; Zhou, Xinling

    2003-09-01

    The paper presents an optimized design method for a flexible and economical embedded DSP system that can implement complex processing algorithms such as biometric recognition and real-time image processing. It consists of a floating-point DSP, 512 Kbytes of data RAM, 1 Mbyte of FLASH program memory, a CPLD for achieving flexible logic control of the input channel, and a RS-485 transceiver for local network communication. Because of employing a high performance-price ratio DSP TMS320C6712 and a large FLASH in the design, this system permits loading and performing complex algorithms with little algorithm optimization and code reduction. The CPLD provides flexible logic control for the whole DSP board, especially in the input channel, and allows a convenient interface between different sensors and the DSP system. The transceiver circuit can transfer data between the DSP and a host computer. In the paper, some key technologies are also introduced which make the whole system work efficiently. Because of the characteristics referred to above, the hardware is a well-suited platform for multi-channel data collection, image processing, and other signal processing with high performance and adaptability. The application section of this paper presents how this hardware is adapted for a biometric identification system with high identification precision. The result reveals that this hardware is easy to interface with a CMOS imager and is capable of carrying out complex biometric identification algorithms, which require real-time processing.

  8. Systems and methods for rapid processing and storage of data

    DOEpatents

    Stalzer, Mark A.

    2017-01-24

    Systems and methods of building massively parallel computing systems using low power computing complexes in accordance with embodiments of the invention are disclosed. A massively parallel computing system in accordance with one embodiment of the invention includes at least one Solid State Blade configured to communicate via a high performance network fabric. In addition, each Solid State Blade includes a processor configured to communicate with a plurality of low power computing complexes interconnected by a router, and each low power computing complex includes at least one general processing core, an accelerator, an I/O interface, and cache memory and is configured to communicate with non-volatile solid state memory.

  9. Integrating technology into complex intervention trial processes: a case study.

    PubMed

    Drew, Cheney J G; Poile, Vincent; Trubey, Rob; Watson, Gareth; Kelson, Mark; Townson, Julia; Rosser, Anne; Hood, Kerenza; Quinn, Lori; Busse, Monica

    2016-11-17

    Trials of complex interventions are associated with high costs and burdens in terms of paperwork, management, data collection, validation, and intervention fidelity assessment occurring across multiple sites. Traditional data collection methods rely on paper-based forms, where processing can be time-consuming and error rates high. Electronic source data collection can potentially address many of these inefficiencies, but has not routinely been used in complex intervention trials. Here we present the use of an on-line system for managing all aspects of data handling and for the monitoring of trial processes in a multicentre trial of a complex intervention. We custom built a web-accessible software application for the delivery of ENGAGE-HD, a multicentre trial of a complex physical therapy intervention. The software incorporated functionality for participant randomisation, data collection and assessment of intervention fidelity. It was accessible to multiple users with differing levels of access depending on required usage or to maintain blinding. Each site was supplied with a 4G-enabled iPad for accessing the system. The impact of this system was quantified through review of data quality and collation of feedback from site coordinators and assessors through structured process interviews. The custom-built system was an efficient tool for collecting data and managing trial processes. Although the set-up time required was significant, using the system resulted in an overall data completion rate of 98.5% with a data query rate of 0.1%, the majority of which were resolved in under a week. Feedback from research staff indicated that the system was highly acceptable for use in a research environment. This was a reflection of the portability and accessibility of the system when using the iPad and its usefulness in aiding accurate data collection, intervention fidelity and general administration. 
A combination of commercially available hardware and a bespoke online database designed to support data collection, intervention fidelity and trial progress provides a viable option for streamlining trial processes in a multicentre complex intervention trial. There is scope to further extend the system to cater for larger trials and add further functionality such as automatic reporting facilities and participant management support. ISRCTN65378754 , registered on 13 March 2014.

  10. Integrating a Genetic Algorithm Into a Knowledge-Based System for Ordering Complex Design Processes

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; McCulley, Collin M.; Bloebaum, Christina L.

    1996-01-01

    The design cycle associated with large engineering systems requires an initial decomposition of the complex system into design processes which are coupled through the transference of output data. Some of these design processes may be grouped into iterative subcycles. In analyzing or optimizing such a coupled system, it is essential to be able to determine the best ordering of the processes within these subcycles to reduce design cycle time and cost. Many decomposition approaches assume the capability is available to determine what design processes and couplings exist and what order of execution will be imposed during the design cycle. Unfortunately, this is often a complex problem and beyond the capabilities of a human design manager. A new feature, a genetic algorithm, has been added to DeMAID (Design Manager's Aid for Intelligent Decomposition) to allow the design manager to rapidly examine many different combinations of ordering processes in an iterative subcycle and to optimize the ordering based on cost, time, and iteration requirements. Two sample test cases are presented to show the effects of optimizing the ordering with a genetic algorithm.
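    The core idea, searching over permutations of the design processes to minimize costly feedback couplings, can be illustrated with a small genetic algorithm. The sketch below is not DeMAID's actual implementation: the cost function (counting couplings that feed backward under an ordering), the order-preserving crossover, and all parameter values are illustrative assumptions.

```python
import random

def feedback_cost(order, couplings):
    """Count couplings that feed backward under a given ordering.

    couplings: set of (src, dst) pairs meaning src's output feeds dst.
    A coupling is a costly feedback if dst comes before src in the order.
    """
    pos = {p: i for i, p in enumerate(order)}
    return sum(1 for src, dst in couplings if pos[dst] < pos[src])

def crossover(a, b):
    """Order-preserving crossover: keep a slice of parent a, fill from b."""
    i, j = sorted(random.sample(range(len(a)), 2))
    middle = a[i:j]
    rest = [p for p in b if p not in middle]
    return rest[:i] + middle + rest[i:]

def mutate(order, rate=0.2):
    """With some probability, swap two processes in the ordering."""
    order = order[:]
    if random.random() < rate:
        i, j = random.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]
    return order

def ga_order(processes, couplings, pop_size=30, generations=200, seed=0):
    """Evolve an ordering of processes that minimizes feedback couplings."""
    random.seed(seed)
    pop = [random.sample(processes, len(processes)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda o: feedback_cost(o, couplings))
        survivors = pop[: pop_size // 2]          # elitist selection
        children = [mutate(crossover(*random.sample(survivors, 2)))
                    for _ in range(pop_size - len(survivors))]
        pop = survivors + children
    return min(pop, key=lambda o: feedback_cost(o, couplings))
```

    In a real design cycle the cost function would also weight time, cost, and iteration requirements, as the abstract notes; the count of backward couplings stands in for that richer objective here.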

  11. Methodological aspects of fuel performance system analysis at raw hydrocarbon processing plants

    NASA Astrophysics Data System (ADS)

    Kulbjakina, A. V.; Dolotovskij, I. V.

    2018-01-01

    The article discusses the methodological aspects of fuel performance system analysis at raw hydrocarbon (RH) processing plants. Modern RH processing facilities are major consumers of energy resources (ER) for their own needs. Reducing ER consumption, including fuel consumption, and developing a rational fuel system structure are complex and relevant scientific tasks that can only be accomplished using system analysis and complex system synthesis. In accordance with the principles of system analysis, the hierarchical structure of the fuel system, the block scheme for synthesizing the most efficient fuel system alternative using mathematical models, and the set of performance criteria have been developed for the main stages of the study. Results from the introduction of specific engineering solutions for developing on-site energy supply sources at RH processing facilities are also provided.

  12. Principal process analysis of biological models.

    PubMed

    Casagranda, Stefano; Touzeau, Suzanne; Ropers, Delphine; Gouzé, Jean-Luc

    2018-06-14

    Understanding the dynamical behaviour of biological systems is challenged by their large number of components and interactions. While efforts have been made in this direction to reduce model complexity, they often prove insufficient to grasp which and when model processes play a crucial role. Answering these questions is fundamental to unravel the functioning of living organisms. We design a method for dealing with model complexity, based on the analysis of dynamical models by means of Principal Process Analysis. We apply the method to a well-known model of circadian rhythms in mammals. The knowledge of the system trajectories allows us to decompose the system dynamics into processes that are active or inactive with respect to a certain threshold value. Process activities are graphically represented by Boolean and Dynamical Process Maps. We detect model processes that are always inactive, or inactive on some time interval. Eliminating these processes reduces the complex dynamics of the original model to the much simpler dynamics of the core processes, in a succession of sub-models that are easier to analyse. We quantify by means of global relative errors the extent to which the simplified models reproduce the main features of the original system dynamics and apply global sensitivity analysis to test the influence of model parameters on the errors. The results obtained prove the robustness of the method. The analysis of the sub-model dynamics allows us to identify the source of circadian oscillations. We find that the negative feedback loop involving proteins PER, CRY, CLOCK-BMAL1 is the main oscillator, in agreement with previous modelling and experimental studies. In conclusion, Principal Process Analysis is a simple-to-use method, which constitutes an additional and useful tool for analysing the complex dynamical behaviour of biological systems.
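    The activity test at the heart of this approach can be sketched simply: each process term's contribution to a variable's derivative is compared, in relative terms, against a threshold. This is a minimal illustration under assumed inputs, not the authors' implementation; the normalization by the summed absolute contributions and the default threshold value are assumptions.

```python
def process_activity(process_values, delta=0.1):
    """Boolean activity of each process term at each time point.

    process_values: list of rows; each row holds the contribution of
    every process to one variable's derivative at one time point.
    A process is marked active if its relative weight among all
    processes in the row is at least the threshold delta.
    """
    activity = []
    for row in process_values:
        total = sum(abs(v) for v in row)
        activity.append([abs(v) / total >= delta if total else False
                         for v in row])
    return activity
```

    Processes that come out inactive at every sampled time point are the candidates for elimination, yielding the simpler sub-models the abstract describes.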

  13. Primordial Evolution in the Finitary Process Soup

    NASA Astrophysics Data System (ADS)

    Görnerup, Olof; Crutchfield, James P.

    A general and basic model of primordial evolution—a soup of reacting finitary and discrete processes—is employed to identify and analyze fundamental mechanisms that generate and maintain complex structures in prebiotic systems. The processes—ɛ-machines as defined in computational mechanics—and their interaction networks both provide well defined notions of structure. This enables us to quantitatively demonstrate hierarchical self-organization in the soup in terms of complexity. We found that replicating processes evolve the strategy of successively building higher levels of organization by autocatalysis. Moreover, this is facilitated by local components that have low structural complexity, but high generality. In effect, the finitary process soup spontaneously evolves a selection pressure that favors such components. In light of the finitary process soup's generality, these results suggest a fundamental law of hierarchical systems: global complexity requires local simplicity.

  14. Complex systems dynamics in aging: new evidence, continuing questions.

    PubMed

    Cohen, Alan A

    2016-02-01

    There have long been suggestions that aging is tightly linked to the complex dynamics of the physiological systems that maintain homeostasis, and in particular to dysregulation of regulatory networks of molecules. This review synthesizes recent work that is starting to provide evidence for the importance of such complex systems dynamics in aging. There is now clear evidence that physiological dysregulation--the gradual breakdown in the capacity of complex regulatory networks to maintain homeostasis--is an emergent property of these regulatory networks, and that it plays an important role in aging. It can be measured simply using small numbers of biomarkers. Additionally, there are indications of the importance during aging of emergent physiological processes, functional processes that cannot be easily understood through clear metabolic pathways, but can nonetheless be precisely quantified and studied. The overall role of such complex systems dynamics in aging remains an important open question, and to understand it future studies will need to distinguish and integrate related aspects of aging research, including multi-factorial theories of aging, systems biology, bioinformatics, network approaches, robustness, and loss of complexity.
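    One concrete way physiological dysregulation has been quantified from small biomarker panels is as a multivariate statistical distance of an individual's biomarker profile from a reference population. The sketch below uses the Mahalanobis distance and assumes a precomputed inverse covariance matrix; the choice of biomarkers and of reference statistics is left open, and this is an illustration of the general idea rather than any specific study's pipeline.

```python
import math

def mahalanobis(x, mean, cov_inv):
    """Mahalanobis distance of biomarker vector x from a reference mean,
    given the inverse covariance matrix of the reference population."""
    d = [xi - mi for xi, mi in zip(x, mean)]
    # quadratic form d^T * cov_inv * d
    s = sum(d[i] * cov_inv[i][j] * d[j]
            for i in range(len(d)) for j in range(len(d)))
    return math.sqrt(s)
```

    A rising distance over repeated measurements would then be read as a gradual breakdown of the regulatory network's ability to hold the biomarker profile near its homeostatic set point.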

  15. Communication Network Integration and Group Uniformity in a Complex Organization.

    ERIC Educational Resources Information Center

    Danowski, James A.; Farace, Richard V.

    This paper contains a discussion of the limitations of research on group processes in complex organizations and the manner in which a procedure for network analysis in on-going systems can reduce problems. The research literature on group uniformity processes and on theoretical models of these processes from an information processing perspective…

  16. Mixture and odorant processing in the olfactory systems of insects: a comparative perspective.

    PubMed

    Clifford, Marie R; Riffell, Jeffrey A

    2013-11-01

    Natural olfactory stimuli are often complex mixtures of volatiles, of which the identities and ratios of constituents are important for odor-mediated behaviors. Despite this importance, the mechanism by which the olfactory system processes this complex information remains an area of active study. In this review, we describe recent progress in how odorants and mixtures are processed in the brain of insects. We use a comparative approach toward contrasting olfactory coding and the behavioral efficacy of mixtures in different insect species, and organize these topics around four sections: (1) Examples of the behavioral efficacy of odor mixtures and the olfactory environment; (2) mixture processing in the periphery; (3) mixture coding in the antennal lobe; and (4) evolutionary implications and adaptations for olfactory processing. We also include pertinent background information about the processing of individual odorants and comparative differences in wiring and anatomy, as these topics have been richly investigated and inform the processing of mixtures in the insect olfactory system. Finally, we describe exciting studies that have begun to elucidate the role of the processing of complex olfactory information in evolution and speciation.

  17. A PDA-based system for online recording and analysis of concurrent events in complex behavioral processes.

    PubMed

    Held, Jürgen; Manser, Tanja

    2005-02-01

    This article outlines how a Palm- or Newton-based PDA (personal digital assistant) system for online event recording was used to record and analyze concurrent events. We describe the features of this PDA-based system, called the FIT-System (flexible interface technique), and its application to the analysis of concurrent events in complex behavioral processes--in this case, anesthesia work processes. The patented FIT-System has a unique user interface design allowing the user to design an interface template with a pencil and paper or using a transparency film. The template usually consists of a drawing or sketch that includes icons or symbols that depict the observer's representation of the situation to be observed. In this study, the FIT-System allowed us to create a design for fast, intuitive online recording of concurrent events using a set of 41 observation codes. An analysis of concurrent events leads to a description of action density, and our results revealed a characteristic distribution of action density during the administration of anesthesia in the operating room. This distribution indicated the central role of the overlapping operations in the action sequences of medical professionals as they deal with the varying requirements of this complex task. We believe that the FIT-System for online recording of concurrent events in complex behavioral processes has the potential to be useful across a broad spectrum of research areas.
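    An action-density curve of the kind described can be derived from timestamped event records by counting how many coded events are in progress at each sampled instant. The following is a minimal sketch assuming events recorded as (start, end) time pairs; the FIT-System's own data format is not specified here.

```python
def action_density(events, t):
    """Number of events in progress at time t, for (start, end) intervals."""
    return sum(1 for start, end in events if start <= t < end)

def density_profile(events, times):
    """Sample the action density at each of the given time points."""
    return [action_density(events, t) for t in times]
```

    Peaks in the resulting profile mark the phases of overlapping operations that the study found characteristic of anesthesia work.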

  18. Measuring the complexity of design in real-time imaging software

    NASA Astrophysics Data System (ADS)

    Sangwan, Raghvinder S.; Vercellone-Smith, Pamela; Laplante, Phillip A.

    2007-02-01

    Due to the intricacies of the algorithms involved, the design of imaging software is considered to be more complex than that of non-image-processing software (Sangwan et al., 2005). A recent investigation (Larsson and Laplante, 2006) examined the complexity of several image processing and non-image-processing software packages along a wide variety of metrics, including those postulated by McCabe (1976), Chidamber and Kemerer (1994), and Martin (2003). This work found that it was not always possible to quantitatively compare the complexity of imaging applications with that of non-image-processing systems. Newer research and an accompanying tool (Structure 101, 2006), however, provide a greatly simplified approach to measuring software complexity. Therefore it may be possible to definitively quantify the complexity differences between imaging and non-imaging software, between imaging and real-time imaging software, and between software programs of the same application type. In this paper, we review prior results and describe the methodology for measuring complexity in imaging systems. We then apply a new complexity measurement methodology to several sets of imaging and non-imaging code in order to compare the complexity differences between the two types of applications. The benefit of such quantification is far-reaching, for example, leading to more easily measured performance improvement and quality in real-time imaging code.
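    Of the metrics mentioned, McCabe's cyclomatic complexity is the most mechanical to compute: for a control-flow graph with E edges, N nodes, and P connected components, V(G) = E - N + 2P. A minimal sketch (the example graph in the test is hypothetical):

```python
def cyclomatic_complexity(edges, nodes, components=1):
    """McCabe's measure V(G) = E - N + 2P for a control-flow graph.

    edges: iterable of (src, dst) pairs; nodes: collection of node names;
    components: number of connected components (1 for a single routine).
    """
    return len(edges) - len(nodes) + 2 * components
```

    For a single if/else, the graph has one extra path through the code, so V(G) comes out as 2: one more than straight-line code.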

  19. Markov and non-Markov processes in complex systems by the dynamical information entropy

    NASA Astrophysics Data System (ADS)

    Yulmetyev, R. M.; Gafarov, F. M.

    1999-12-01

    We consider Markov and non-Markov processes in complex systems by the dynamical information Shannon entropy (DISE) method. The influence and important role of the two mutually dependent channels of entropy, alternation (creation or generation of correlation) and anti-correlation (destruction or annihilation of correlation), are discussed. The developed method has been used for the analysis of complex systems of various natures: slow neutron scattering in liquid cesium; psychology (short-term human memory for numerals and patterns, and the effect of stress on the dynamical tapping test); the random dynamics of RR intervals in human ECG (the problem of diagnosing various diseases of the human cardiovascular system); and the chaotic dynamics of the parameters of financial markets and ecological systems.
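    Although the full DISE method is more elaborate, its basic ingredient, the Shannon entropy of a discretized signal, is easy to sketch. The uniform binning below is an illustrative assumption, not the authors' procedure:

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy (bits) of a symbol sequence."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def discretize(series, bins=4):
    """Map a numeric series onto integer bin labels 0..bins-1."""
    lo, hi = min(series), max(series)
    width = (hi - lo) / bins or 1.0
    return [min(int((v - lo) / width), bins - 1) for v in series]
```

    Applied to, say, a sequence of RR intervals, `shannon_entropy(discretize(rr))` gives a single-number summary of how spread out the signal's dynamics are across bins.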

  20. The Acquisition Process as a Vehicle for Enabling Knowledge Management in the Lifecycle of Complex Federal Systems

    NASA Technical Reports Server (NTRS)

    Stewart, Helen; Spence, Matt Chew; Holm, Jeanne; Koga, Dennis (Technical Monitor)

    2001-01-01

    This white paper explores how to increase the success and operation of critical, complex, national systems by effectively capturing knowledge management requirements within the federal acquisition process. Although we focus on aerospace flight systems, the principles outlined within may have general applicability to other critical federal systems as well. Fundamental design deficiencies in federal, mission-critical systems have contributed to recent, highly visible system failures, such as the V-22 Osprey and the Delta rocket family. These failures indicate that the current mechanisms for knowledge management and risk management are inadequate to meet the challenges imposed by the rising complexity of critical systems. Failures of aerospace system operations and vehicles might have been prevented, or their effects lessened, through better knowledge management and information management techniques.

  1. Extension of optical lithography by mask-litho integration with computational lithography

    NASA Astrophysics Data System (ADS)

    Takigawa, T.; Gronlund, K.; Wiley, J.

    2010-05-01

    Wafer lithography process windows can be enlarged by using source-mask co-optimization (SMO). Recently, SMO including freeform wafer scanner illumination sources has been developed. Freeform sources are generated by a programmable illumination system using a micro-mirror array or by custom Diffractive Optical Elements (DOE). The combination of freeform sources and complex masks generated by SMO shows an increased wafer lithography process window and reduced MEEF. Full-chip mask optimization using a source optimized by SMO can generate complex masks with small, variably sized sub-resolution assist features (SRAF). These complex masks create challenges for accurate mask pattern writing and low-false-defect inspection. The accuracy of the small, variably sized mask SRAF patterns is degraded by short-range mask process proximity effects. To achieve the accuracy needed for these complex masks, we developed a highly accurate mask process correction (MPC) capability. It is also difficult to achieve low-false-defect inspection of complex masks with conventional mask defect inspection systems. A printability check system, Mask Lithography Manufacturability Check (M-LMC), was developed and integrated with the 199-nm high-NA inspection system NPI. M-LMC successfully identifies printable defects from the mass of raw defect images collected during the inspection of a complex mask. Long-range mask CD uniformity errors are compensated by scanner dose control. A mask CD uniformity error map obtained by a mask metrology system is used as input data to the scanner. Using this method, wafer CD uniformity is improved. As reviewed above, mask-litho integration technology with computational lithography is becoming increasingly important.

  2. Spectral simplicity of apparent complexity. II. Exact complexities and complexity spectra

    NASA Astrophysics Data System (ADS)

    Riechers, Paul M.; Crutchfield, James P.

    2018-03-01

    The meromorphic functional calculus developed in Part I overcomes the nondiagonalizability of linear operators that arises often in the temporal evolution of complex systems and is generic to the metadynamics of predicting their behavior. Using the resulting spectral decomposition, we derive closed-form expressions for correlation functions, finite-length Shannon entropy-rate approximates, asymptotic entropy rate, excess entropy, transient information, transient and asymptotic state uncertainties, and synchronization information of stochastic processes generated by finite-state hidden Markov models. This introduces analytical tractability to investigating information processing in discrete-event stochastic processes, symbolic dynamics, and chaotic dynamical systems. Comparisons reveal mathematical similarities between complexity measures originally thought to capture distinct informational and computational properties. We also introduce a new kind of spectral analysis via coronal spectrograms and the frequency-dependent spectra of past-future mutual information. We analyze a number of examples to illustrate the methods, emphasizing processes with multivariate dependencies beyond pairwise correlation. This includes spectral decomposition calculations for one representative example in full detail.
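    As a much simpler relative of the closed-form quantities derived in the paper, the entropy rate of an ordinary (fully observed) Markov chain has the well-known form h = -Σ_i π_i Σ_j T_ij log2 T_ij, where π is the stationary distribution of the transition matrix T. A minimal sketch using power iteration (the example matrices in the test are illustrative, and this is not the paper's hidden-Markov machinery):

```python
import math

def stationary(T, iters=500):
    """Stationary distribution of a row-stochastic matrix by power iteration."""
    n = len(T)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * T[i][j] for i in range(n)) for j in range(n)]
    return pi

def entropy_rate(T):
    """Entropy rate (bits per symbol) of a stationary Markov chain."""
    pi = stationary(T)
    return -sum(pi[i] * T[i][j] * math.log2(T[i][j])
                for i in range(len(T)) for j in range(len(T))
                if T[i][j] > 0)
```

    For hidden Markov models the observed process is generally non-Markovian and this formula no longer applies directly, which is exactly the gap the paper's spectral decomposition addresses.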

  3. Problems in modernization of automation systems at coal preparation plants

    NASA Astrophysics Data System (ADS)

    Myshlyaev, L. P.; Lyakhovets, M. V.; Venger, K. G.; Leontiev, I. A.; Makarov, G. V.; Salamatin, A. S.

    2018-05-01

    The factors influencing the modernization (reconstruction) of automation systems at coal preparation plants are described. Problems are discussed such as the heterogeneity of existing and newly developed systems, planning the reconstruction of a technological complex without taking the modernization of automated systems into account, commissioning without stopping the existing technological complex, and the conduct of procurement procedures. An option of stage-by-stage commissioning and adjustment work under ongoing modernization, without long stoppages of the process equipment, is proposed.

  4. Is the destabilization of the Cournot equilibrium a good business strategy in Cournot-Puu duopoly?

    PubMed

    Canovas, Jose S

    2011-10-01

    It is generally acknowledged that the medium influences the way we communicate, and negotiation research directs considerable attention to the impact of different electronic communication modes on the negotiation process and outcomes. Complexity theories offer models and methods that allow the investigation of how patterns and temporal sequences unfold over time in negotiation interactions. By focusing on the dynamic and interactive quality of negotiations as well as the information, choice, and uncertainty contained in the negotiation process, the complexity perspective addresses several issues of central interest in classical negotiation research. In the present study we compare the complexity of the negotiation communication process between synchronous and asynchronous negotiations (IM vs. e-mail) as well as an electronic negotiation support system including a decision support system (DSS). For this purpose, transcripts of 145 negotiations have been coded and analyzed with the Shannon entropy and the grammar complexity. Our results show that negotiating asynchronously via e-mail, as well as including a DSS, significantly reduces the complexity of the negotiation process. Furthermore, a reduction of the complexity increases the probability of reaching an agreement.

  5. Documentation Driven Development for Complex Real-Time Systems

    DTIC Science & Technology

    2004-12-01

    This paper presents a novel approach for development of complex real-time systems, called the documentation-driven development (DDD) approach. This ... time systems. DDD will also support automated software generation based on a computational model and some relevant techniques. DDD includes two main ... stakeholders to be easily involved in development processes and, therefore, significantly improve the agility of software development for complex real ...

  6. Cx-02 Program, workshop on modeling complex systems

    USGS Publications Warehouse

    Mossotti, Victor G.; Barragan, Jo Ann; Westergard, Todd D.

    2003-01-01

    This publication contains the abstracts and program for the workshop on complex systems that was held on November 19-21, 2002, in Reno, Nevada. Complex systems are ubiquitous within the realm of the earth sciences. Geological systems consist of a multiplicity of linked components with nested feedback loops; the dynamics of these systems are non-linear, iterative, multi-scale, and operate far from equilibrium. That notwithstanding, it appears that, with the exception of papers on seismic studies, work in geology and geophysics has been disproportionately underrepresented at regional and national meetings on complex systems relative to papers in the life sciences. This is somewhat puzzling because geologists and geophysicists are, in many ways, preadapted to thinking in terms of complex-system mechanisms. Geologists and geophysicists think about processes involving large volumes of rock below the sunlit surface of Earth, the accumulated consequence of processes extending hundreds of millions of years into the past. Not only do geologists think in the abstract by virtue of these vast time spans, but most of the evidence is also out of sight. A primary goal of this workshop is to begin to bridge the gap between the earth sciences and life sciences through demonstration of the universality of complex systems science, both philosophically and in model structures.

  7. The Influence of Cultural Factors on Trust in Automation

    ERIC Educational Resources Information Center

    Chien, Shih-Yi James

    2016-01-01

    Human interaction with automation is a complex process that requires both skilled operators and complex system designs to effectively enhance overall performance. Although automation has successfully managed complex systems throughout the world for over half a century, inappropriate reliance on automation can still occur, such as the recent…

  8. Auditory Processing of Complex Sounds Across Frequency Channels.

    DTIC Science & Technology

    1992-06-26

    towards gaining an understanding of how the auditory system processes complex sounds. The results of binaural psychophysical experiments in human subjects ... suggest (1) that spectrally synthetic binaural processing is the rule when the components in the tone complex are relatively few (less than 10) and there are no dynamic binaural cues to aid segregation of the target from the background, and (2) that waveforms having large effective ...

  9. Software and Dataware for Energy Generation and Consumption Analysis System of Gas Processing Enterprises

    NASA Astrophysics Data System (ADS)

    Dolotovskii, I. V.; Dolotovskaya, N. V.; Larin, E. A.

    2018-05-01

    The article presents the architecture and content of a specialized analytical system for monitoring operational conditions, planning of consumption and generation of energy resources, long-term planning of production activities and development of a strategy for the development of the energy complex of gas processing enterprises. A compositional model of structured data on the equipment of the main systems of the power complex is proposed. The correctness of the use of software modules and the database of the analytical system is confirmed by comparing the results of measurements on the equipment of the electric power system and simulation at the operating gas processing plant. A high accuracy in the planning of consumption of fuel and energy resources has been achieved (the error does not exceed 1%). Information and program modules of the analytical system allow us to develop a strategy for improving the energy complex in the face of changing technological topology and partial uncertainty of economic factors.

  10. Intelligent classifier for dynamic fault patterns based on hidden Markov model

    NASA Astrophysics Data System (ADS)

    Xu, Bo; Feng, Yuguang; Yu, Jinsong

    2006-11-01

    It is difficult to build precise mathematical models for complex engineering systems because of the complexity of their structure and dynamic characteristics. Intelligent fault diagnosis introduces artificial intelligence and works in a different way, without building an analytical mathematical model of the diagnostic object, so it is a practical approach to solving the diagnostic problems of complex systems. This paper presents an intelligent fault diagnosis method, an integrated fault-pattern classifier based on the Hidden Markov Model (HMM). The classifier consists of a dynamic time warping (DTW) algorithm, a self-organizing feature mapping (SOFM) network and a Hidden Markov Model. First, after the dynamic observation vector in the measuring space is processed by DTW, an error vector containing the fault features of the system under test is obtained. Then a SOFM network is used as a feature extractor and vector quantization processor. Finally, fault diagnosis is realized by classifying fault patterns with the Hidden Markov Model classifier. The introduction of dynamic time warping solves the problem of extracting features from the dynamic process vectors of complex systems such as aero-engines, and makes it possible to diagnose complex systems using dynamic process information. Simulation experiments show that the diagnosis model is easy to extend, and that the fault pattern classifier is efficient and convenient for detecting and diagnosing new faults.
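    The DTW stage can be sketched independently of the rest of the classifier: it aligns two sequences by dynamic programming and returns the accumulated alignment cost, which is small when one sequence is a time-warped version of the other. The absolute-difference local cost below is an illustrative assumption; the paper's feature vectors and distance choice are not specified here.

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two numeric sequences."""
    inf = float("inf")
    n, m = len(a), len(b)
    # D[i][j]: cost of the best alignment of a[:i] with b[:j]
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]
```

    A sequence that merely lingers on a value (e.g. [1, 2, 2, 3] versus [1, 2, 3]) aligns at zero cost, which is why DTW suits dynamic process vectors whose timing varies from run to run.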

  11. Conceptual Foundations of Systems Biology Explaining Complex Cardiac Diseases.

    PubMed

    Louridas, George E; Lourida, Katerina G

    2017-02-21

    Systems biology is an important concept that connects molecular biology and genomics with computing science, mathematics and engineering. An endeavor is made in this paper to associate basic conceptual ideas of systems biology with clinical medicine. Complex cardiac diseases are clinical phenotypes generated by the integration of genetic, molecular and environmental factors. Basic concepts of systems biology such as network construction, modular thinking, biological constraints (the downward biological direction) and emergence (the upward biological direction) can be applied to clinical medicine. Especially in the field of cardiology, these concepts can be used to explain complex clinical cardiac phenotypes such as chronic heart failure and coronary artery disease. Cardiac diseases are biologically complex entities which, like other biological phenomena, can be explained by a systems biology approach. These powerful tools of systems biology can explain the growth of robustness and stability during the disease process, from modulation to phenotype. The purpose of the present review is to implement a systems biology strategy and to incorporate some conceptual issues raised by this approach into the clinical field of complex cardiac diseases. Cardiac disease process and progression can be addressed by the holistic, realistic approach of systems biology in order to achieve earlier diagnosis and more effective therapy.

  12. Challenges in the analysis of complex systems: introduction and overview

    NASA Astrophysics Data System (ADS)

    Hastings, Harold M.; Davidsen, Jörn; Leung, Henry

    2017-12-01

    One of the main challenges of modern physics is to provide a systematic understanding of systems far from equilibrium exhibiting emergent behavior. Prominent examples of such complex systems include, but are not limited to the cardiac electrical system, the brain, the power grid, social systems, material failure and earthquakes, and the climate system. Due to the technological advances over the last decade, the amount of observations and data available to characterize complex systems and their dynamics, as well as the capability to process that data, has increased substantially. The present issue discusses a cross section of the current research on complex systems, with a focus on novel experimental and data-driven approaches to complex systems that provide the necessary platform to model the behavior of such systems.

  13. Technology-design-manufacturing co-optimization for advanced mobile SoCs

    NASA Astrophysics Data System (ADS)

    Yang, Da; Gan, Chock; Chidambaram, P. R.; Nallapadi, Giri; Zhu, John; Song, S. C.; Xu, Jeff; Yeap, Geoffrey

    2014-03-01

    How to maintain Moore's Law scaling beyond the 193-nm immersion resolution limit is the key question the semiconductor industry needs to answer in the near future. Process complexity will undoubtedly increase for the 14-nm node and beyond, which brings both challenges and opportunities for technology development. A vertically integrated design-technology-manufacturing co-optimization flow is desired to better address the complicated issues that new process changes bring. In recent years smart mobile wireless devices have been the fastest-growing consumer electronics market. Advanced mobile devices such as smartphones are complex systems with the overriding objective of providing the best user-experience value by harnessing all the technology innovations. The most critical system drivers are better system performance and power efficiency, cost effectiveness, and smaller form factors, which, in turn, drive the need for system designs and solutions with More-than-Moore innovations. Mobile systems-on-chips (SoCs) have become the leading driver for semiconductor technology definition and manufacturing. Here we highlight how the co-optimization strategy influenced architecture, device/circuit, process technology and packaging, in the face of growing process cost, complexity and variability as well as design rule restrictions.

  14. Orbiter data reduction complex data processing requirements for the OFT mission evaluation team (level C)

    NASA Technical Reports Server (NTRS)

    1979-01-01

    This document addresses requirements for post-test data reduction in support of the Orbital Flight Tests (OFT) mission evaluation team, specifically those which are planned to be implemented in the ODRC (Orbiter Data Reduction Complex). Only those requirements which have been previously baselined by the Data Systems and Analysis Directorate configuration control board are included. This document serves as the control document between Institutional Data Systems Division and the Integration Division for OFT mission evaluation data processing requirements, and shall be the basis for detailed design of ODRC data processing systems.

  15. Methodology and Results of Mathematical Modelling of Complex Technological Processes

    NASA Astrophysics Data System (ADS)

    Mokrova, Nataliya V.

    2018-03-01

The methodology of system analysis allows us to derive a mathematical model of a complex technological process. A mathematical description of the plasma-chemical process is proposed. The importance of the quenching rate and of the initial temperature decrease time for producing the maximum amount of the target product is confirmed. The results of numerical integration of the system of differential equations can be used to describe reagent concentrations, plasma jet rate and temperature in order to achieve the optimal hardening mode. Such models are applicable both for solving control problems and for predicting future states of sophisticated technological systems.

  16. Engineering Design Thinking

    ERIC Educational Resources Information Center

    Lammi, Matthew; Becker, Kurt

    2013-01-01

    Engineering design thinking is "a complex cognitive process" including divergence-convergence, a systems perspective, ambiguity, and collaboration (Dym, Agogino, Eris, Frey, & Leifer, 2005, p. 104). Design is often complex, involving multiple levels of interacting components within a system that may be nested within or connected to other systems.…

  17. Engineering research, development and technology FY99

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langland, R T

The growth of computer power and connectivity, together with advances in wireless sensing and communication technologies, is transforming the field of complex distributed systems. The ability to deploy large numbers of sensors with a rapid, broadband communication system will enable high-fidelity, near real-time monitoring of complex systems. These technological developments will provide unprecedented insight into the actual performance of engineered and natural environment systems, enable the evolution of many new types of engineered systems for monitoring and detection, and enhance our ability to perform improved and validated large-scale simulations of complex systems. One of the challenges facing engineering is to develop methodologies to exploit the emerging information technologies. Particularly important will be the ability to assimilate measured data into the simulation process in a way which is much more sophisticated than current, primarily ad hoc procedures. The reports contained in this section on the Center for Complex Distributed Systems describe activities related to the integrated engineering of large complex systems. The first three papers describe recent developments for each link of the integrated engineering process for large structural systems. These include (1) the development of model-based signal processing algorithms which will formalize the process of coupling measurements and simulation and provide a rigorous methodology for validation and update of computational models; (2) collaborative efforts with faculty at the University of California at Berkeley on the development of massive simulation models for the earth and large bridge structures; and (3) the development of wireless data acquisition systems which provide a practical means of monitoring large systems like the National Ignition Facility (NIF) optical support structures.
These successful developments are coming to a confluence in the next year with applications to NIF structural characterizations and analysis of large bridge structures for the State of California. Initial feasibility investigations into the development of monitoring and detection systems are described in the papers on imaging of underground structures with ground-penetrating radar, and the use of live insects as sensor platforms. These efforts are establishing the basic performance characteristics essential to the decision process for future development of sensor arrays for information gathering related to national security.

  18. Managing Complexity: Impact of Organization and Processing Style on Nonverbal Memory in Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Tsatsanis, Katherine D.; Noens, Ilse L. J.; Illmann, Cornelia L.; Pauls, David L.; Volkmar, Fred R.; Schultz, Robert T.; Klin, Ami

    2011-01-01

    The contributions of cognitive style and organization to processing and recalling a complex novel stimulus were examined by comparing the Rey Osterrieth Complex Figure (ROCF) test performance of children, adolescents, and adults with ASD to clinical controls (CC) and non-impaired controls (NC) using the "Developmental Scoring System."…

  19. Research of processes of reception and analysis of dynamic digital medical images in hardware/software complexes used for diagnostics and treatment of cardiovascular diseases

    NASA Astrophysics Data System (ADS)

    Karmazikov, Y. V.; Fainberg, E. M.

    2005-06-01

Work with DICOM-compatible equipment integrated into hardware and software systems for medical purposes is considered. The structure of the data reception and transformation process is presented using the example of the digital roentgenography and angiography systems included in the hardware-software complex DIMOL-IK. Algorithms for data reception and analysis are proposed. Issues of further processing and storage of the received data are also considered.

  20. Classrooms as Complex Adaptive Systems: A Relational Model

    ERIC Educational Resources Information Center

    Burns, Anne; Knox, John S.

    2011-01-01

    In this article, we describe and model the language classroom as a complex adaptive system (see Logan & Schumann, 2005). We argue that linear, categorical descriptions of classroom processes and interactions do not sufficiently explain the complex nature of classrooms, and cannot account for how classroom change occurs (or does not occur), over…

  1. A Macro-Level Analysis of SRL Processes and Their Relations to the Acquisition of a Sophisticated Mental Model of a Complex System

    ERIC Educational Resources Information Center

    Greene, Jeffrey Alan; Azevedo, Roger

    2009-01-01

    In this study, we used think-aloud verbal protocols to examine how various macro-level processes of self-regulated learning (SRL; e.g., planning, monitoring, strategy use, handling of task difficulty and demands) were associated with the acquisition of a sophisticated mental model of a complex biological system. Numerous studies examine how…

  2. Inference, simulation, modeling, and analysis of complex networks, with special emphasis on complex networks in systems biology

    NASA Astrophysics Data System (ADS)

    Christensen, Claire Petra

    Across diverse fields ranging from physics to biology, sociology, and economics, the technological advances of the past decade have engendered an unprecedented explosion of data on highly complex systems with thousands, if not millions of interacting components. These systems exist at many scales of size and complexity, and it is becoming ever-more apparent that they are, in fact, universal, arising in every field of study. Moreover, they share fundamental properties---chief among these, that the individual interactions of their constituent parts may be well-understood, but the characteristic behaviour produced by the confluence of these interactions---by these complex networks---is unpredictable; in a nutshell, the whole is more than the sum of its parts. There is, perhaps, no better illustration of this concept than the discoveries being made regarding complex networks in the biological sciences. In particular, though the sequencing of the human genome in 2003 was a remarkable feat, scientists understand that the "cellular-level blueprints" for the human being are cellular-level parts lists, but they say nothing (explicitly) about cellular-level processes. The challenge of modern molecular biology is to understand these processes in terms of the networks of parts---in terms of the interactions among proteins, enzymes, genes, and metabolites---as it is these processes that ultimately differentiate animate from inanimate, giving rise to life! It is the goal of systems biology---an umbrella field encapsulating everything from molecular biology to epidemiology in social systems---to understand processes in terms of fundamental networks of core biological parts, be they proteins or people. By virtue of the fact that there are literally countless complex systems, not to mention tools and techniques used to infer, simulate, analyze, and model these systems, it is impossible to give a truly comprehensive account of the history and study of complex systems. 
The author's own publications have contributed network inference, simulation, modeling, and analysis methods to the much larger body of work in systems biology, and indeed, in network science. The aim of this thesis is therefore twofold: to present this original work in the historical context of network science, and to provide sufficient review and reference regarding complex systems (with an emphasis on complex networks in systems biology) and the tools and techniques for their inference, simulation, analysis, and modeling, such that the reader will be comfortable in seeking out further information on the subject. The review-like Chapters 1, 2, and 4 are intended to convey the co-evolution of network science and the slow but noticeable breakdown of boundaries between disciplines in academia, as research on and comparison of diverse systems has brought to light the shared properties of these systems. It is the author's hope that these chapters impart some sense of the remarkable and rapid progress in complex systems research that has led to this unprecedented academic synergy. Chapters 3 and 5 detail the author's original work in the context of complex systems research. Chapter 3 presents the methods and results of a two-stage modeling process that generates candidate gene-regulatory networks of the bacterium B. subtilis from experimentally obtained, yet mathematically underdetermined, microchip array data. These networks are then analyzed from a graph-theoretical perspective, and their biological viability is critiqued by comparing the networks' graph-theoretical properties to those of other biological systems. The results of topological perturbation analyses, revealing commonalities in behavior at multiple levels of complexity, are also presented, and are shown to be an invaluable means by which to ascertain the level of complexity at which the network inference process is robust to noise.
Chapter 5 outlines a learning algorithm for the development of a realistic, evolving social network (a city) into which a disease is introduced. The results of simulations in populations spanning two orders of magnitude are compared to prevaccine era measles data for England and Wales and demonstrate that the simulations are able to capture the quantitative and qualitative features of epidemics in populations as small as 10,000 people. The work presented in Chapter 5 validates the utility of network simulation in concurrently probing contact network dynamics and disease dynamics.

  3. Structural model of control system for hydraulic stepper motor complex

    NASA Astrophysics Data System (ADS)

    Obukhov, A. D.; Dedov, D. L.; Kolodin, A. N.

    2018-03-01

The article considers the problem of developing a structural model of the control system for a hydraulic stepper drive complex. A comparative analysis of stepper drives and an assessment of the applicability of HSM for solving problems requiring accurate displacement in space with subsequent positioning of the object are carried out. The presented structural model of the automated control system of the multi-spindle complex of hydraulic stepper drives reflects the main components of the system, as well as the process of its control based on the transfer of control signals to the solenoid valves by the controller. The models and methods described in the article can be used to formalize the control process in technical systems based on the application of hydraulic stepper drives, and allow switching from mechanical control to automated control.

  4. Molecular gearing systems

    DOE PAGES

    Gakh, Andrei A.; Sachleben, Richard A.; Bryan, Jeff C.

    1997-11-01

The race to create smaller devices is fueling much of the research in electronics. The competition has intensified with the advent of microelectromechanical systems (MEMS), in which miniaturization is already reaching the dimensional limits imposed by physics of current lithographic techniques. Also, in the realm of biochemistry, evidence is accumulating that certain enzyme complexes are capable of very sophisticated modes of motion. Complex synergistic biochemical complexes driven by sophisticated biomechanical processes are quite common. Their biochemical functions are based on the interplay of mechanical and chemical processes, including allosteric effects. In addition, the complexity of this interplay far exceeds that of typical chemical reactions. Understanding the behavior of artificial molecular devices as well as complex natural molecular biomechanical systems is difficult. Fortunately, the problem can be successfully resolved by direct molecular engineering of simple molecular systems that can mimic desired mechanical or electronic devices. These molecular systems are called technomimetics (the name is derived, by analogy, from biomimetics). Several classes of molecular systems that can mimic mechanical, electronic, or other features of macroscopic devices have been successfully synthesized by conventional chemical methods during the past two decades. In this article we discuss only one class of such model devices: molecular gearing systems.

  5. Research on the EDM Technology for Micro-holes at Complex Spatial Locations

    NASA Astrophysics Data System (ADS)

    Y Liu, J.; Guo, J. M.; Sun, D. J.; Cai, Y. H.; Ding, L. T.; Jiang, H.

    2017-12-01

To meet the demands of machining micro-holes at complex spatial locations, several key technical problems are solved, including the development of the micro-Electron Discharge Machining (micro-EDM) power supply system, the design of the host structure, and the machining process techniques. Through the development of a low-voltage power supply circuit, a high-voltage circuit, a micro and precision machining circuit and a clearance detection system, a narrow-pulse, high-frequency six-axis EDM machining power supply system is developed to meet the demands of micro-hole discharge machining. By combining CAD structure design, CAE simulation analysis, modal testing, ODS (Operational Deflection Shapes) testing and theoretical analysis, the host construction and key axes of the machine tool are optimized to meet the positioning demands of the micro-holes. A special deionized water filtration system is developed to ensure that the machining process is stable. The machining equipment and processing techniques developed in this paper are verified by developing the micro-hole processing flow and testing it on a real machine tool. The final test results show that the efficient micro-EDM machining pulse power supply system, machine tool host system, deionized water filtration system and processing method developed in this paper meet the demands of machining micro-holes at complex spatial locations.

  6. Autonomous control systems: applications to remote sensing and image processing

    NASA Astrophysics Data System (ADS)

    Jamshidi, Mohammad

    2001-11-01

One of the main challenges of any control (or image processing) paradigm is being able to handle complex systems under unforeseen uncertainties. A system may be called complex here if its dimension (order) is too high and its model (if available) is nonlinear and interconnected, and information on the system is so uncertain that classical techniques cannot easily handle the problem. Examples of complex systems are power networks, space robotic colonies, the national air traffic control system, integrated manufacturing plants, the Hubble Telescope, the International Space Station, etc. Soft computing, a consortium of methodologies such as fuzzy logic, neuro-computing, genetic algorithms and genetic programming, has proven to be a powerful tool for adding autonomy and semi-autonomy to many complex systems. For such systems the size of the soft computing control architecture will be nearly infinite. In this paper new paradigms using soft computing approaches are utilized to design autonomous controllers and image enhancers for a number of application areas. These applications are satellite array formations for synthetic aperture radar interferometry (InSAR) and enhancement of analog and digital images.
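The fuzzy-logic component of the soft-computing toolkit mentioned in this abstract can be illustrated with a minimal sketch. This is not the paper's controller; it is a generic zero-order Sugeno-style inference step (triangular membership functions, weighted-average defuzzification) mapping a hypothetical tracking error to an actuator command:

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peak 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_command(error):
    """Zero-order Sugeno inference with three illustrative rules:
    negative error -> push up (+1), near-zero -> hold (0),
    positive error -> push down (-1). Ranges are arbitrary."""
    rules = [
        (tri(error, -2.0, -1.0, 0.0), +1.0),  # "error is negative"
        (tri(error, -1.0,  0.0, 1.0),  0.0),  # "error is near zero"
        (tri(error,  0.0,  1.0, 2.0), -1.0),  # "error is positive"
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0  # no rule fires outside [-2, 2]

# Intermediate errors blend the rules smoothly, e.g. error = -0.5
# activates "negative" and "near zero" with weight 0.5 each.
cmd = fuzzy_command(-0.5)
```

The smooth interpolation between rules, rather than a hard threshold, is what makes such controllers tolerant of model uncertainty.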

  7. Methodological approach and tools for systems thinking in health systems research: technical assistants' support of health administration reform in the Democratic Republic of Congo as an application.

    PubMed

    Ribesse, Nathalie; Bossyns, Paul; Marchal, Bruno; Karemere, Hermes; Burman, Christopher J; Macq, Jean

    2017-03-01

In the field of development cooperation, interest in systems thinking and complex systems theories as a methodological approach is increasingly recognised. And so it is in health systems research, which informs health development aid interventions. However, practical applications remain scarce to date. The objective of this article is to contribute to the body of knowledge by presenting the tools inspired by systems thinking and complexity theories and the methodological lessons learned from their application. These tools were used in a case study; detailed results of this study are in preparation for publication in additional articles. Applying a complexity 'lens', the subject of the case study is the role of long-term international technical assistance in supporting health administration reform at the provincial level in the Democratic Republic of Congo. The Methods section presents the guiding principles of systems thinking and complex systems, their relevance and implications for the subject under study, and the existing tools associated with those theories which inspired us in the design of the data collection and analysis process. The tools and their application processes are presented in the results section, followed in the discussion section by a critical analysis of their innovative potential and emergent challenges. The overall methodology provides a coherent whole, each tool bringing a different and complementary perspective on the system.

  8. Renewal Processes in the Critical Brain

    NASA Astrophysics Data System (ADS)

    Allegrini, Paolo; Paradisi, Paolo; Menicucci, Danilo; Gemignani, Angelo

We describe herein a multidisciplinary research effort, as it develops and applies concepts of the theory of complexity, in turn stemming from recent advancements in statistical physics, to cognitive neuroscience. We discuss (define) complexity, and how the human brain is a paradigm of it. We discuss how the hypothesis of brain activity dynamically behaving as a critical system is gaining momentum in the literature; we then focus on a feature of critical systems (hence of the brain), which is the intermittent passage between metastable states, marked by events that locally reset the memory but give rise to correlation functions with infinite correlation times. The events, extracted from multi-channel ElectroEncephaloGrams, mark (are interpreted as) a birth/death process of cooperation, namely of system elements being recruited into collective states. Finally we discuss a recently discovered form of control (in the form of a new Linear Response Theory) that allows optimized information transmission between complex systems, named Complexity Matching.
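The intermittent, memory-resetting events described in this abstract are the realizations of a renewal process with power-law waiting times. A minimal sketch (an illustration of the general idea, not the authors' EEG analysis pipeline) draws waiting times from a survival function P(tau > t) = (t0/t)^(mu-1), i.e. psi(t) ~ t^-mu; for 1 < mu < 2 the mean waiting time diverges, which is what produces infinite correlation times:

```python
import random

def powerlaw_waiting_time(mu, t0=1.0, rng=random):
    """Inverse-CDF sample: P(tau > t) = (t0 / t)**(mu - 1) for t >= t0."""
    u = rng.random()               # uniform in [0, 1)
    return t0 * (1.0 - u) ** (-1.0 / (mu - 1.0))

def renewal_events(n, mu, t0=1.0, seed=0):
    """Event times of a renewal process: each event resets the clock,
    and the next waiting time is drawn independently."""
    rng = random.Random(seed)
    t, events = 0.0, []
    for _ in range(n):
        t += powerlaw_waiting_time(mu, t0, rng)
        events.append(t)
    return events

# With mu = 1.5 (inside the 1 < mu < 2 regime), the empirical mean
# waiting time does not converge as n grows.
events = renewal_events(10_000, mu=1.5)
mean_wait = events[-1] / len(events)
```

Comparing `mean_wait` across increasing `n` (or across seeds) shows the non-convergence that distinguishes this regime from Poisson-like renewal processes.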

  9. Emergence Processes up to Consciousness Using the Multiplicity Principle and Quantum Physics

    NASA Astrophysics Data System (ADS)

    Ehresmann, Andrée C.; Vanbremeersch, Jean-Paul

    2002-09-01

Evolution is marked by the emergence of new objects and interactions. Pursuing our preceding work on Memory Evolutive Systems (MES; cf. our Internet site), we propose a general mathematical model for this process, based on Category Theory. Its main characteristic is the Multiplicity Principle (MP), which asserts the existence of complex objects with several possible configurations. The MP entails the emergence of non-reducible, more and more complex objects (emergentist reductionism). From the laws of Quantum Physics, it follows that the MP is valid for the category of particles and atoms, hence, by complexification, for any natural autonomous anticipatory complex system, such as biological systems up to neural systems, or social systems. Applying the model to the MES of neurons, we describe the emergence of higher and higher cognitive processes and of a semantic memory. Consciousness is characterized by the development of a permanent `personal' memory, the archetypal core, which allows the formation of extended landscapes with an integration of the temporal dimensions.

  10. Using Simple Manipulatives to Improve Student Comprehension of a Complex Biological Process: Protein Synthesis

    ERIC Educational Resources Information Center

    Guzman, Karen; Bartlett, John

    2012-01-01

    Biological systems and living processes involve a complex interplay of biochemicals and macromolecular structures that can be challenging for undergraduate students to comprehend and, thus, misconceptions abound. Protein synthesis, or translation, is an example of a biological process for which students often hold many misconceptions. This article…

  11. Challenges in Developing Models Describing Complex Soil Systems

    NASA Astrophysics Data System (ADS)

    Simunek, J.; Jacques, D.

    2014-12-01

Quantitative mechanistic models that consider basic physical, mechanical, chemical, and biological processes have the potential to be powerful tools to integrate our understanding of complex soil systems, and the soil science community has often called for models that would include a large number of these diverse processes. However, once attempts have been made to develop such models, the response from the community has not always been overwhelming, especially once it was discovered that such models are highly complex: they require a large number of parameters, not all of which can be easily (or at all) measured or identified and which are often associated with large uncertainties, and they demand from their users deep knowledge of all or most of the implemented physical, mechanical, chemical and biological processes. The real, or perceived, complexity of these models then discourages users from using them even for relatively simple applications for which they would be perfectly adequate. Due to the nonlinear nature and chemical/biological complexity of soil systems, it is also virtually impossible to verify these types of models analytically, raising doubts about their applicability. Code inter-comparison, which is then likely the most suitable method to assess code capabilities and model performance, requires the existence of multiple models of similar or overlapping capabilities, which may not always exist. It is thus a challenge not only to develop models describing complex soil systems, but also to persuade the soil science community to use them. As a result, complex quantitative mechanistic models are still an underutilized tool in soil science research. We will demonstrate some of the challenges discussed above using our own efforts in developing quantitative mechanistic models (such as HP1/2) for complex soil systems.

  12. Application fields for the new Object Management Group (OMG) Standards Case Management Model and Notation (CMMN) and Decision Management Notation (DMN) in the perioperative field.

    PubMed

    Wiemuth, M; Junger, D; Leitritz, M A; Neumann, J; Neumuth, T; Burgert, O

    2017-08-01

    Medical processes can be modeled using different methods and notations. Currently used modeling systems like Business Process Model and Notation (BPMN) are not capable of describing the highly flexible and variable medical processes in sufficient detail. We combined two modeling systems, Business Process Management (BPM) and Adaptive Case Management (ACM), to be able to model non-deterministic medical processes. We used the new Standards Case Management Model and Notation (CMMN) and Decision Management Notation (DMN). First, we explain how CMMN, DMN and BPMN could be used to model non-deterministic medical processes. We applied this methodology to model 79 cataract operations provided by University Hospital Leipzig, Germany, and four cataract operations provided by University Eye Hospital Tuebingen, Germany. Our model consists of 85 tasks and about 20 decisions in BPMN. We were able to expand the system with more complex situations that might appear during an intervention. An effective modeling of the cataract intervention is possible using the combination of BPM and ACM. The combination gives the possibility to depict complex processes with complex decisions. This combination allows a significant advantage for modeling perioperative processes.

  13. Forward design of a complex enzyme cascade reaction

    PubMed Central

    Hold, Christoph; Billerbeck, Sonja; Panke, Sven

    2016-01-01

    Enzymatic reaction networks are unique in that one can operate a large number of reactions under the same set of conditions concomitantly in one pot, but the nonlinear kinetics of the enzymes and the resulting system complexity have so far defeated rational design processes for the construction of such complex cascade reactions. Here we demonstrate the forward design of an in vitro 10-membered system using enzymes from highly regulated biological processes such as glycolysis. For this, we adapt the characterization of the biochemical system to the needs of classical engineering systems theory: we combine online mass spectrometry and continuous system operation to apply standard system theory input functions and to use the detailed dynamic system responses to parameterize a model of sufficient quality for forward design. This allows the facile optimization of a 10-enzyme cascade reaction for fine chemical production purposes. PMID:27677244
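The "standard system theory input functions" mentioned in this abstract refer to classical identification signals such as step inputs. A minimal sketch of the general idea (not the authors' 10-enzyme model; the first-order dynamics and parameter values here are illustrative) applies a unit step to a linear first-order system and recovers its gain and time constant from the response:

```python
import math

def simulate_step_response(K, tau, dt=0.01, t_end=10.0):
    """Euler-integrate dy/dt = (K*u - y)/tau for a unit step input u = 1."""
    n = int(t_end / dt)
    y, ys = 0.0, []
    for _ in range(n):
        y += dt * (K * 1.0 - y) / tau
        ys.append(y)
    return ys

def fit_first_order(ys, dt=0.01):
    """Recover the gain and time constant from a settled step response:
    K_hat is the final value; tau_hat is the time at which the response
    first reaches 63.2% (1 - 1/e) of it."""
    K_hat = ys[-1]
    target = (1.0 - math.exp(-1.0)) * K_hat
    tau_hat = next(i for i, y in enumerate(ys) if y >= target) * dt
    return K_hat, tau_hat

# Simulate with known parameters, then identify them from the data alone.
ys = simulate_step_response(K=2.0, tau=1.5)
K_hat, tau_hat = fit_first_order(ys)
```

For a nonlinear enzyme cascade the same workflow applies, but the "model of sufficient quality" must be fitted numerically to the full dynamic response rather than read off two features of the curve.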

  14. Visual Complexity Attenuates Emotional Processing in Psychopathy: Implications for Fear-Potentiated Startle Deficits

    PubMed Central

    Sadeh, Naomi; Verona, Edelyn

    2012-01-01

    A long-standing debate is the extent to which psychopathy is characterized by a fundamental deficit in attention or emotion. We tested the hypothesis that the interplay of emotional and attentional systems is critical for understanding processing deficits in psychopathy. Sixty-three offenders were assessed using the Psychopathy Checklist: Screening Version. Event-related brain potentials (ERPs) and fear-potentiated startle (FPS) were collected while participants viewed pictures selected to disentangle an existing confound between perceptual complexity and emotional content in the pictures typically used to study fear deficits in psychopathy. As predicted, picture complexity moderated emotional processing deficits. Specifically, the affective-interpersonal features of psychopathy were associated with greater allocation of attentional resources to processing emotional stimuli at initial perception (visual N1) but only when picture stimuli were visually-complex. Despite this, results for the late positive potential indicated that emotional pictures were less attentionally engaging and held less motivational significance for individuals high in affective-interpersonal traits. This deficient negative emotional processing was observed later in their reduced defensive fear reactivity (FPS) to high-complexity unpleasant pictures. In contrast, the impulsive-antisocial features of psychopathy were associated with decreased sensitivity to picture complexity (visual N1) and unrelated to emotional processing as assessed by ERP and FPS. These findings are the first to demonstrate that picture complexity moderates FPS deficits and implicate the interplay of attention and emotional systems as deficient in psychopathy. PMID:22187225

  15. Environmental Uncertainty and Communication Network Complexity: A Cross-System, Cross-Cultural Test.

    ERIC Educational Resources Information Center

    Danowski, James

    An infographic model is proposed to account for the operation of systems within their information environments. Infographics is a communication paradigm used to indicate the clustering of information processing variables in communication systems. Four propositions concerning environmental uncertainty and internal communication network complexity,…

  16. AN ADVANCED SYSTEM FOR POLLUTION PREVENTION IN CHEMICAL COMPLEXES

    EPA Science Inventory

    One important accomplishment is that the system will give process engineers interactively and simultaneously use of programs for total cost analysis, life cycle assessment and sustainability metrics to provide direction for the optimal chemical complex analysis pro...

  17. Software control and system configuration management - A process that works

    NASA Technical Reports Server (NTRS)

    Petersen, K. L.; Flores, C., Jr.

    1983-01-01

    A comprehensive software control and system configuration management process for flight-crucial digital control systems of advanced aircraft has been developed and refined to insure efficient flight system development and safe flight operations. Because of the highly complex interactions among the hardware, software, and system elements of state-of-the-art digital flight control system designs, a systems-wide approach to configuration control and management has been used. Specific procedures are implemented to govern discrepancy reporting and reconciliation, software and hardware change control, systems verification and validation testing, and formal documentation requirements. An active and knowledgeable configuration control board reviews and approves all flight system configuration modifications and revalidation tests. This flexible process has proved effective during the development and flight testing of several research aircraft and remotely piloted research vehicles with digital flight control systems that ranged from relatively simple to highly complex, integrated mechanizations.

  18. Bioreactivity: Studies on a Simple Brain Stem Reflex in Behaving Animals

    DTIC Science & Technology

    1990-08-10

    problem in attempting to understand complex physiological processes, such as brain neuromodulation , or complex behavioral processes, such as arousal...containing only one synapse in brain, and receives dense inputs from two neurochemical systems important in neuromodulation and arousal. Initial

  19. Bioreactivity: Studies on a Simple Brain Stem Reflex in Behaving Animals

    DTIC Science & Technology

    1990-01-04

    attempting to understand complex physiological processes, such as brain neuromodulation, or complex behavioral processes, such as arousal, is finding a...one synapse in brain, and receives dense inputs from two neurochemical systems important in neuromodulation and arousal. Initial pharmacologic studies

  20. Complexity versus certainty in understanding species’ declines

    USGS Publications Warehouse

    Sundstrom, Shana M.; Allen, Craig R.

    2014-01-01

    Traditional approaches to predicting species declines (e.g. government processes or IUCN Red Lists) may be too simplistic and may therefore misguide management and conservation. Complex systems approaches that account for scale-specific patterns and processes have the potential to overcome these limitations.

  1. Reconceptualizing children's complex discharge with health systems theory: novel integrative review with embedded expert consultation and theory development.

    PubMed

    Noyes, Jane; Brenner, Maria; Fox, Patricia; Guerin, Ashleigh

    2014-05-01

    To report a novel review to develop a health systems model of successful transition of children with complex healthcare needs from hospital to home. Children with complex healthcare needs commonly experience an expensive, ineffectual and prolonged nurse-led discharge process. Children gain no benefit from prolonged hospitalization and are exposed to significant harm. Research to enable intervention development and process evaluation across the entire health system is lacking. Novel mixed-method integrative review informed by health systems theory. DATA SOURCES: CINAHL, PsychInfo, EMBASE, PubMed, citation searching, personal contact. REVIEW METHODS: Informed by consultation with experts. English language studies, opinion/discussion papers reporting research, best practice and experiences of children, parents and healthcare professionals and purposively selected policies/guidelines from 2002-December 2012 were abstracted using Framework synthesis, followed by iterative theory development. Seven critical factors derived from thirty-four sources across five health system levels explained successful discharge (new programme theory). All seven factors are required in an integrated care pathway, with a dynamic communication loop to facilitate effective discharge (new programme logic). Current health system responses were frequently static and critical success factors were commonly absent, thereby explaining ineffectual discharge. The novel evidence-based model, which reconceptualizes 'discharge' as a highly complex longitudinal health system intervention, makes a significant contribution to global knowledge to drive practice development. Research is required to develop process and outcome measures at different time points in the discharge process and future trials are needed to determine the effectiveness of integrated health system discharge models. © 2013 John Wiley & Sons Ltd.

  2. Automated fiber placement composite manufacturing: The mission at MSFC's Productivity Enhancement Complex

    NASA Technical Reports Server (NTRS)

    Vickers, John H.; Pelham, Larry I.

    1993-01-01

    Automated fiber placement is a manufacturing process used for producing complex composite structures. It represents a notable advance in the state of the art of automated composite manufacturing. The fiber placement capability was established at the Marshall Space Flight Center's (MSFC) Productivity Enhancement Complex in 1992 in collaboration with Thiokol Corporation to provide materials and processes research and development, and to fabricate components for many of the Center's Programs. The Fiber Placement System (FPX) was developed as a distinct solution to problems inherent to other automated composite manufacturing systems. This equipment provides unique capabilities to build composite parts in complex 3-D shapes with concave and other asymmetrical configurations. Components with complex geometries and localized reinforcements usually require labor-intensive efforts resulting in expensive, less reproducible components; the fiber placement system has the features necessary to overcome these conditions. The mechanical systems of the equipment have the motion characteristics of a filament winder and the fiber lay-up attributes of a tape laying machine, with the additional capabilities of differential tow payout speeds, compaction, and cut-restart to selectively place the correct number of fibers where the design dictates. This capability will produce a repeatable process resulting in lower cost and improved quality and reliability.

  3. Monitoring system for the quality assessment in additive manufacturing

    NASA Astrophysics Data System (ADS)

    Carl, Volker

    2015-03-01

    Additive Manufacturing (AM) refers to a process by which a set of digital data, representing a certain complex three-dimensional design, is used to grow the corresponding real structure. For the powder-based EOS manufacturing process a variety of plastic and metal materials can be used. AM is in many aspects a very powerful tool, as it can help to overcome particular limitations of conventional manufacturing: it enables more freedom of design; complex, hollow and/or lightweight structures; and product individualisation and functional integration. As such it is a promising approach for the future design and manufacturing of complex three-dimensional structures. On the other hand, it calls for new methods and standards for quality assessment. In particular, when utilizing AM for the design of complex parts used in aviation and aerospace technologies, appropriate monitoring systems are mandatory. In this respect, sustainable progress has recently been accomplished by joining the efforts of a manufacturer of Additive Manufacturing systems and the respective materials (EOS), an operator of such systems (MTU Aero Engines), and experienced application engineers (Carl Metrology) with know-how in optical and infrared methods for non-destructive examination (NDE). The newly developed technology is best described as a high-resolution layer-by-layer inspection technique, which allows for a 3-D tomography analysis of the complex part at any time during the manufacturing process. Inspection costs are kept rather low by using smart image-processing methods and CMOS sensors instead of infrared detectors. Moreover, results from conventional physical metallurgy may easily be correlated with the predictive results of the monitoring system, which not only allows for improvements of the AM monitoring system but ultimately leads to an optimisation of the quality and an assurance of the material integrity of the complex structure being manufactured. Both our poster and our oral presentation will explain the data flow between the above-mentioned parties. A suitable monitoring system for Additive Manufacturing will be introduced, along with a presentation of the respective high-resolution data acquisition, image processing and data analysis, allowing for precise control of the three-dimensional growth process.

  5. Managing complexity in simulations of land surface and near-surface processes

    DOE PAGES

    Coon, Ethan T.; Moulton, J. David; Painter, Scott L.

    2016-01-12

    Increasing computing power and the growing role of simulation in Earth systems science have led to an increase in the number and complexity of processes in modern simulators. We present a multiphysics framework that specifies interfaces for coupled processes and automates weak and strong coupling strategies to manage this complexity. Process management is enabled by viewing the system of equations as a tree, where individual equations are associated with leaf nodes and coupling strategies with internal nodes. A dynamically generated dependency graph connects a variable to its dependencies, streamlining and automating model evaluation, easing model development, and ensuring models are modular and flexible. Additionally, the dependency graph is used to ensure that data requirements are consistent between all processes in a given simulation. Here we discuss the design and implementation of these concepts within the Arcos framework, and demonstrate their use for verification testing and hypothesis evaluation in numerical experiments.
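    The tree-and-dependency-graph idea described in this abstract can be sketched in a few lines. The `Model` class and the variable names below are our own illustration, not the Arcos API: each variable declares its dependencies, and a topological sort of the resulting graph fixes a valid evaluation order automatically.

```python
# Sketch of dependency-graph-driven model evaluation (illustrative names,
# not the Arcos framework's actual API).
from graphlib import TopologicalSorter

class Model:
    """A derived variable: its name, the names it depends on, and a function."""
    def __init__(self, name, deps, fn):
        self.name, self.deps, self.fn = name, deps, fn

def evaluate(models, inputs):
    """Evaluate every variable in topological order of its dependencies."""
    graph = {m.name: set(m.deps) for m in models}   # node -> predecessors
    by_name = {m.name: m for m in models}
    values = dict(inputs)                           # leaf inputs seed the state
    for name in TopologicalSorter(graph).static_order():
        if name in by_name:                         # skip plain input leaves
            m = by_name[name]
            values[name] = m.fn(*(values[d] for d in m.deps))
    return values

# Hypothetical land-surface example: saturation depends on pressure,
# conductivity depends on saturation.
models = [
    Model("saturation", ["pressure"], lambda p: max(0.0, min(1.0, p / 1e5))),
    Model("conductivity", ["saturation"], lambda s: 1e-9 * s ** 3),
]
state = evaluate(models, {"pressure": 5e4})
```

    Because the evaluation order is derived rather than hand-coded, adding or swapping a process model only changes the list of `Model` entries, which is the modularity benefit the abstract describes.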

  6. The use of discrete-event simulation modelling to improve radiation therapy planning processes.

    PubMed

    Werker, Greg; Sauré, Antoine; French, John; Shechter, Steven

    2009-07-01

    The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.
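    As a rough, self-contained illustration of the technique named in this abstract (not the authors' Arena model; the single-server structure, distributions and parameters below are all invented), the core of such a simulation can be reduced to tracking when the planner next becomes free:

```python
# Minimal single-server queue simulation of a planning step, in the spirit of
# the discrete-event model described above. All parameters are invented.
import random

def simulate(n_patients, mean_interarrival, mean_planning, seed=1):
    """Return the average time from arrival to plan completion (FIFO queue)."""
    rng = random.Random(seed)
    t, arrivals = 0.0, []
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / mean_interarrival)  # exponential interarrivals
        arrivals.append(t)
    free_at = 0.0        # time at which the planner is next available
    total_flow = 0.0
    for a in arrivals:
        start = max(a, free_at)                        # wait if planner is busy
        free_at = start + rng.expovariate(1.0 / mean_planning)
        total_flow += free_at - a                      # waiting + planning time
    return total_flow / n_patients

avg_flow = simulate(500, mean_interarrival=2.0, mean_planning=1.5)
```

    Re-running the function with, say, a reduced `mean_planning` shows how a simulation of this kind quantifies the effect of a proposed process change before it is made, which is the study's central point.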

  7. Risk Modeling of Interdependent Complex Systems of Systems: Theory and Practice.

    PubMed

    Haimes, Yacov Y

    2018-01-01

    The emergence of the complexity characterizing our systems of systems (SoS) requires a reevaluation of the way we model, assess, manage, communicate, and analyze the risk thereto. Current models for risk analysis of emergent complex SoS are insufficient because too often they rely on the same risk functions and models used for single systems. These models commonly fail to incorporate the complexity derived from the networks of interdependencies and interconnectedness (I-I) characterizing SoS. There is a need to reevaluate currently practiced risk analysis to respond to this reality by examining, and thus comprehending, what makes emergent SoS complex. The key to evaluating the risk to SoS lies in understanding the genesis of characterizing I-I of systems manifested through shared states and other essential entities within and among the systems that constitute SoS. The term "essential entities" includes shared decisions, resources, functions, policies, decisionmakers, stakeholders, organizational setups, and others. This undertaking can be accomplished by building on state-space theory, which is fundamental to systems engineering and process control. This article presents a theoretical and analytical framework for modeling the risk to SoS with two case studies performed with the MITRE Corporation and demonstrates the pivotal contributions made by shared states and other essential entities to modeling and analysis of the risk to complex SoS. A third case study highlights the multifarious representations of SoS, which require harmonizing the risk analysis process currently applied to single systems when applied to complex SoS. © 2017 Society for Risk Analysis.

  8. Problems of Automation and Management Principles Information Flow in Manufacturing

    NASA Astrophysics Data System (ADS)

    Grigoryuk, E. N.; Bulkin, V. V.

    2017-07-01

    Automated control systems for technological processes are complex systems characterized by elements with a common purpose, by the systemic nature of the algorithms they implement for exchanging and processing information, and by a large number of functional subsystems. The article gives examples of automatic control systems and automated process control systems, drawing a parallel between them by identifying their strengths and weaknesses, and proposes a non-standard control system for a technological process.

  9. Complex Topographic Feature Ontology Patterns

    USGS Publications Warehouse

    Varanka, Dalia E.; Jerris, Thomas J.

    2015-01-01

    Semantic ontologies are examined as effective data models for the representation of complex topographic feature types. Complex feature types are viewed as integrated relations between basic features for a basic purpose. In the context of topographic science, such component assemblages are supported by resource systems and found on the local landscape. Ontologies are organized within six thematic modules of a domain ontology called Topography that includes within its sphere basic feature types, resource systems, and landscape types. Context is constructed not only as a spatial and temporal setting, but a setting also based on environmental processes. Types of spatial relations that exist between components include location, generative processes, and description. An example is offered in a complex feature type ‘mine.’ The identification and extraction of complex feature types are an area for future research.

  10. Evolution of the archaeal and mammalian information processing systems: towards an archaeal model for human disease.

    PubMed

    Lyu, Zhe; Whitman, William B

    2017-01-01

    Current evolutionary models suggest that Eukaryotes originated from within Archaea instead of being a sister lineage. To test this model of ancient evolution, we review recent studies and compare the three major information processing subsystems of replication, transcription and translation in the Archaea and Eukaryotes. Our hypothesis is that if the Eukaryotes arose within the archaeal radiation, their information processing systems will appear to be of a kind with the archaeal systems and not wholly original. Within the Eukaryotes, the mammalian or human systems are emphasized because of their importance in understanding health. Biochemical as well as genetic studies provide strong evidence for the functional similarity of archaeal homologs to the mammalian information processing system and their dissimilarity to the bacterial systems. In many independent instances, a simple archaeal system is functionally equivalent to more elaborate eukaryotic homologs, suggesting that evolution of complexity is likely a central feature of the eukaryotic information processing system. Because fewer components are often involved, biochemical characterizations of the archaeal systems are often easier to interpret. Similarly, the archaeal cell provides a genetically and metabolically simpler background, enabling convenient studies on the complex information processing system. Therefore, Archaea could serve as a parsimonious and tractable host for studying human diseases that arise in the information processing systems.

  11. Uncertainty Reduction for Stochastic Processes on Complex Networks

    NASA Astrophysics Data System (ADS)

    Radicchi, Filippo; Castellano, Claudio

    2018-05-01

    Many real-world systems are characterized by stochastic dynamical rules where a complex network of interactions among individual elements probabilistically determines their state. Even with full knowledge of the network structure and of the stochastic rules, the ability to predict system configurations is generally characterized by a large uncertainty. Selecting a fraction of the nodes and observing their state may help to reduce the uncertainty about the unobserved nodes. However, choosing these points of observation in an optimal way is a highly nontrivial task, depending on the nature of the stochastic process and on the structure of the underlying interaction pattern. In this paper, we introduce a computationally efficient algorithm to determine quasioptimal solutions to the problem. The method leverages network sparsity to reduce computational complexity from exponential to almost quadratic, thus allowing the straightforward application of the method to mid-to-large-size systems. Although the method is exact only for equilibrium stochastic processes defined on trees, it turns out to be effective also for out-of-equilibrium processes on sparse loopy networks.
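    The observation-selection problem described here can be sketched with a small Monte Carlo experiment. This is our own simplified illustration, not the authors' algorithm: since the joint entropy satisfies H(unobserved | observed) = H(all) - H(observed), greedily choosing observed nodes that maximize the entropy of the observed set minimizes the residual uncertainty about the rest.

```python
# Greedy choice of observation nodes on a small network, via empirical entropy
# estimated from samples of a toy majority-leaning stochastic process.
# Simplified sketch only; not the quasioptimal algorithm of the paper.
import random
from math import log2

def sample_states(neighbors, n_nodes, n_samples, steps=50, seed=0):
    """Draw binary configurations of a simple neighbor-influenced process."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_samples):
        s = [rng.choice([0, 1]) for _ in range(n_nodes)]
        for _ in range(steps):
            i = rng.randrange(n_nodes)
            ones = sum(s[j] for j in neighbors[i])
            p = (1 + ones) / (2 + len(neighbors[i]))   # lean toward neighbors
            s[i] = 1 if rng.random() < p else 0
        samples.append(tuple(s))
    return samples

def entropy(samples, nodes):
    """Empirical joint entropy (bits) of the selected nodes."""
    counts = {}
    for s in samples:
        key = tuple(s[i] for i in nodes)
        counts[key] = counts.get(key, 0) + 1
    n = len(samples)
    return -sum(c / n * log2(c / n) for c in counts.values())

def greedy_observers(samples, n_nodes, k):
    """Pick k nodes maximizing observed-set entropy, one at a time."""
    chosen = []
    for _ in range(k):
        best = max((i for i in range(n_nodes) if i not in chosen),
                   key=lambda i: entropy(samples, chosen + [i]))
        chosen.append(best)
    return chosen

path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}  # 5-node path graph
samples = sample_states(path, 5, 300)
observers = greedy_observers(samples, 5, 2)
```

    The exponential cost the abstract mentions shows up here as the size of the joint state space; the paper's contribution is precisely avoiding that blow-up on sparse networks, which this brute-force sketch does not.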

  12. Concept of a Cloud Service for Data Preparation and Computational Control on Custom HPC Systems in Application to Molecular Dynamics

    NASA Astrophysics Data System (ADS)

    Puzyrkov, Dmitry; Polyakov, Sergey; Podryga, Viktoriia; Markizov, Sergey

    2018-02-01

    At the present stage of computer technology development it is possible to study the properties and processes in complex systems at molecular and even atomic levels, for example, by means of molecular dynamics methods. The most interesting problems are those related to the study of complex processes under real physical conditions. Solving such problems requires the use of high performance computing systems of various types, for example, GRID systems and HPC clusters. Given such time-consuming computational tasks, the need arises for software that monitors the computations automatically and in a unified way. A complex computational task can be performed over different HPC systems. It requires output data synchronization between the storage chosen by a scientist and the HPC system used for computations. The design of the computational domain is also a considerable problem, requiring complex software tools and algorithms for proper atomistic data generation on HPC systems. The paper describes the prototype of a cloud service intended for the design of large-volume atomistic systems for subsequent detailed molecular dynamics calculations and for managing those calculations, and presents the part of its concept aimed at initial data generation on the HPC systems.

  13. Situational Analysis for Complex Systems: Methodological Development in Public Health Research.

    PubMed

    Martin, Wanda; Pauly, Bernie; MacDonald, Marjorie

    2016-01-01

    Public health systems have suffered infrastructure losses worldwide. Strengthening public health systems requires not only good policies and programs, but also development of new research methodologies to support public health systems renewal. Our research team considers public health systems to be complex adaptive systems and as such new methods are necessary to generate knowledge about the process of implementing public health programs and services. Within our program of research, we have employed situational analysis as a method for studying complex adaptive systems in four distinct research studies on public health program implementation. The purpose of this paper is to demonstrate the use of situational analysis as a method for studying complex systems and highlight the need for further methodological development.

  14. Dynamic control and information processing in chemical reaction systems by tuning self-organization behavior

    NASA Astrophysics Data System (ADS)

    Lebiedz, Dirk; Brandt-Pollmann, Ulrich

    2004-09-01

    Specific external control of chemical reaction systems and both dynamic control and signal processing as central functions in biochemical reaction systems are important issues of modern nonlinear science. For example, nonlinear input-output behavior and its regulation are crucial for the maintenance of the life process, which requires extensive communication between cells and their environment. An important question is how the dynamical behavior of biochemical systems is controlled and how they process information transmitted by incoming signals. From a general point of view, external forcing of complex chemical reaction processes is also important in many application areas ranging from chemical engineering to biomedicine. In order to study such control issues numerically, we choose a well characterized chemical system, the CO oxidation on Pt(110), which is interesting per se as an externally forced chemical oscillator model. We show numerically that tuning of temporal self-organization by input signals in this simple nonlinear chemical reaction exhibiting oscillatory behavior can in principle be exploited both for specific external control of dynamical system behavior and for processing of complex information.
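    The generic phenomenon the abstract exploits, a nonlinear chemical oscillator driven by an external input signal, can be illustrated numerically. The sketch below uses a periodically forced van der Pol oscillator as a stand-in for the CO-oxidation kinetics (which we do not model) with arbitrary parameters, integrated by a hand-rolled RK4 step:

```python
# Externally forced nonlinear oscillator: x' = y,
# y' = mu*(1 - x^2)*y - x + amp*sin(omega*t).
# A van der Pol system stands in for the CO oxidation on Pt(110);
# all parameters are illustrative, not fitted to the chemistry.
from math import sin

def step(x, y, t, dt, mu=1.0, amp=0.5, omega=1.1):
    """Advance the forced oscillator by one RK4 step of size dt."""
    def f(x, y, t):
        return y, mu * (1 - x * x) * y - x + amp * sin(omega * t)
    k1 = f(x, y, t)
    k2 = f(x + dt / 2 * k1[0], y + dt / 2 * k1[1], t + dt / 2)
    k3 = f(x + dt / 2 * k2[0], y + dt / 2 * k2[1], t + dt)
    k4 = f(x + dt * k3[0], y + dt * k3[1], t + dt)
    return (x + dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
            y + dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))

x, y, t, dt = 0.1, 0.0, 0.0, 0.01
trace = []
for _ in range(20000):
    x, y = step(x, y, t, dt)
    t += dt
    trace.append(x)
```

    Sweeping `amp` and `omega` and watching how the oscillation locks to (or drifts against) the drive is the simplest version of the "tuning of temporal self-organization by input signals" that the paper studies in the chemical system.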

  15. Graphical Environment Tools for Application to Gamma-Ray Energy Tracking Arrays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Todd, Richard A.; Radford, David C.

    2013-12-30

    Highly segmented, position-sensitive germanium detector systems are being developed for nuclear physics research where traditional electronic signal processing with mixed analog and digital function blocks would be enormously complex and costly. Future systems will be constructed using pipelined processing of high-speed digitized signals as is done in the telecommunications industry. Techniques which provide rapid algorithm and system development for future systems are desirable. This project has used digital signal processing concepts and existing graphical system design tools to develop a set of re-usable modular functions and libraries targeted for the nuclear physics community. Researchers working with complex nuclear detector arrays such as the Gamma-Ray Energy Tracking Array (GRETA) have been able to construct advanced data processing algorithms for implementation in field programmable gate arrays (FPGAs) through application of these library functions using intuitive graphical interfaces.

  16. Use of complex adaptive systems metaphor to achieve professional and organizational change.

    PubMed

    Rowe, Ann; Hogarth, Annette

    2005-08-01

    This paper uses the experiences of a programme designed to bring about change in the performance of public health nurses (health visitors and school nurses) in an inner city primary care trust to explore the issues of professional and organizational change in health care organizations. The United Kingdom government has given increasing emphasis to programmes of modernization within the National Health Service. A central facet of this policy shift has been an expectation of behaviour and practice change by health care professionals. Change was brought about through use of a Complex Adaptive Systems approach. This enabled change to be seen as an inclusive, evolving and unpredictable process rather than one which is linear and mechanistic. The paper examines in detail how the use of concepts and metaphors associated with Complex Adaptive Systems influenced the development of the programme, its implementation and outcomes. The programme resulted in extensive change in professional behaviour, service delivery and transformational change in the organizational structures and processes of the employing organization. This gave greater opportunities for experimentation and innovation, leading to new developments in service delivery, but also meant higher levels of uncertainty, responsibility, decision-making and risk management for practitioners. Using a Complex Adaptive Systems approach was helpful for developing alternative views of change and for understanding why and how some aspects of change were more successful than others. Its use encouraged the confrontation of some long-standing assumptions about change and service delivery patterns in the National Health Service, and the process exposed challenging tensions within the Service. The consequent destabilising of organizational and professional norms resulted in considerable emotional impacts for practitioners, an area which was found to be underplayed within the Complex Adaptive Systems literature.
A Complex Adaptive Systems approach can support change, in particular a recognition and understanding of the emergence of unexpected structures, patterns and processes. The approach can support nurses to change their behaviour and innovate, but requires high levels of accountability, individual and professional creativity.

  17. Implications of Biospheric Energization

    NASA Astrophysics Data System (ADS)

    Budding, Edd; Demircan, Osman; Gündüz, Güngör; Emin Özel, Mehmet

    2016-07-01

    Our physical model relating to the origin and development of lifelike processes from very simple beginnings is reviewed. This molecular ('ABC') process is compared with the chemoton model, noting the role of the autocatalytic tuning to the time-dependent source of energy. This substantiates a Darwinian character to evolution. The system evolves from very simple beginnings to a progressively more highly tuned, energized and complex responding biosphere, that grows exponentially; albeit with a very low net growth factor. Rates of growth and complexity in the evolution raise disturbing issues of inherent stability. Autocatalytic processes can include a fractal character to their development allowing recapitulative effects to be observed. This property, in allowing similarities of pattern to be recognized, can be useful in interpreting complex (lifelike) systems.

  18. Increasing complexity with quantum physics.

    PubMed

    Anders, Janet; Wiesner, Karoline

    2011-09-01

    We argue that complex systems science and the rules of quantum physics are intricately related. We discuss a range of quantum phenomena, such as cryptography, computation and quantum phases, and the rules responsible for their complexity. We identify correlations as a central concept connecting quantum information and complex systems science. We present two examples for the power of correlations: using quantum resources to simulate the correlations of a stochastic process and to implement a classically impossible computational task.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, Philip LaRoche

    At the end of his life, Stephen Jay Kline, longtime professor of mechanical engineering at Stanford University, completed a book on how to address complex systems. The title of the book is 'Conceptual Foundations of Multi-Disciplinary Thinking' (1995), but the topic of the book is systems. Kline first establishes certain limits that are characteristic of our conscious minds. Kline then establishes a complexity measure for systems and uses that complexity measure to develop a hierarchy of systems. Kline then argues that our minds, due to their characteristic limitations, are unable to model the complex systems in that hierarchy. Computers are of no help to us here. Our attempts at modeling these complex systems are based on the way we successfully model some simple systems, in particular, 'inert, naturally-occurring' objects and processes, such as what is the focus of physics. But complex systems overwhelm such attempts. As a result, the best we can do in working with these complex systems is to use a heuristic, what Kline calls the 'Guideline for Complex Systems.' Kline documents the problems that have developed due to 'oversimple' system models and from the inappropriate application of a system model from one domain to another. One prominent such problem is the Procrustean attempt to make the disciplines that deal with complex systems be 'physics-like.' Physics deals with simple systems, not complex ones, using Kline's complexity measure. The models that physics has developed are inappropriate for complex systems. Kline documents a number of the wasteful and dangerous fallacies of this type.

  20. Representation of People's Decisions in Health Information Systems. A Complementary Approach for Understanding Health Care Systems and Population Health.

    PubMed

    Gonzalez Bernaldo de Quiros, Fernan; Dawidowski, Adriana R; Figar, Silvana

    2017-02-01

    In this study, we aimed: 1) to conceptualize the theoretical challenges facing health information systems (HIS) to represent patients' decisions about health and medical treatments in everyday life; 2) to suggest approaches for modeling these processes. The conceptualization of the theoretical and methodological challenges was discussed in 2015 during a series of interdisciplinary meetings attended by health informatics staff, epidemiologists and health professionals working in quality management and primary and secondary prevention of chronic diseases of the Hospital Italiano de Buenos Aires, together with sociologists, anthropologists and e-health stakeholders. HIS are facing the need and challenge to represent social human processes based on constructivist and complexity theories, which are the current frameworks of human sciences for understanding human learning and socio-cultural changes. Computer systems based on these theories can model processes of social construction of concrete and subjective entities and the interrelationships between them. These theories could be implemented, among other ways, through the mapping of health assets, analysis of social impact through community trials and modeling of complexity with system simulation tools. This analysis suggested the need to complement the traditional linear causal explanations of disease onset (and treatments) that are the bases for models of analysis of HIS with constructivist and complexity frameworks. Both may enlighten the complex interrelationships among patients, health services and the health system. The aim of this strategy is to clarify people's decision making processes to improve the efficiency, quality and equity of the health services and the health system.

  1. Can Models Capture the Complexity of the Systems Engineering Process?

    NASA Astrophysics Data System (ADS)

    Boppana, Krishna; Chow, Sam; de Weck, Olivier L.; Lafon, Christian; Lekkakos, Spyridon D.; Lyneis, James; Rinaldi, Matthew; Wang, Zhiyong; Wheeler, Paul; Zborovskiy, Marat; Wojcik, Leonard A.

    Many large-scale, complex systems engineering (SE) programs have been problematic; a few examples are listed below (Bar-Yam, 2003; Cullen, 2004), and many others have been late, well over budget, or have failed: the Hilton/Marriott/American Airlines system for hotel reservations and flights, 1988-1992, 125 million; "scrapped"

  2. Bacterial flagella and Type III secretion: case studies in the evolution of complexity.

    PubMed

    Pallen, M J; Gophna, U

    2007-01-01

    Bacterial flagella at first sight appear uniquely sophisticated in structure, so much so that they have even been considered 'irreducibly complex' by the intelligent design movement. However, a more detailed analysis reveals that these remarkable pieces of molecular machinery are the product of processes that are fully compatible with Darwinian evolution. In this chapter we present evidence for such processes, based on a review of experimental studies, molecular phylogeny and microbial genomics. Several processes have played important roles in flagellar evolution: self-assembly of simple repeating subunits, gene duplication with subsequent divergence, recruitment of elements from other systems ('molecular bricolage'), and recombination. We also discuss additional tentative new assignments of homology (FliG with MgtE, FliO with YscJ). In conclusion, rather than providing evidence of intelligent design, flagellar and non-flagellar Type III secretion systems instead provide excellent case studies in the evolution of complex systems from simpler components.

  3. Self-conscious robotic system design process--from analysis to implementation.

    PubMed

    Chella, Antonio; Cossentino, Massimo; Seidita, Valeria

    2011-01-01

    Developing robotic systems endowed with self-conscious capabilities means realizing complex sub-systems needing ad-hoc software engineering techniques for their modelling, analysis and implementation. In this chapter the whole process (from analysis to implementation) to model the development of self-conscious robotic systems is presented and the new created design process, PASSIC, supporting each part of it, is fully illustrated.

  4. Complexity theory and physical unification: From microscopic to macroscopic level

    NASA Astrophysics Data System (ADS)

    Pavlos, G. P.; Iliopoulos, A. C.; Karakatsanis, L. P.; Tsoutsouras, V. G.; Pavlos, E. G.

    During the last two decades, low dimensional chaotic or self-organized criticality (SOC) processes have been observed by our group in many different physical systems such as space plasmas, the solar or the magnetospheric dynamics, the atmosphere, earthquakes, the brain activity as well as in informational systems. All these systems are complex systems living far from equilibrium with strong self-organization and phase transition character. The theoretical interpretation of these natural phenomena needs a deeper insight into the fundamentals of complexity theory. In this study, we try to give a synoptic description of complexity theory both at the microscopic and at the macroscopic level of physical reality. Also, we propose that the self-organization observed macroscopically is a phenomenon that reveals the strong unifying character of complex dynamics, which includes thermodynamical and dynamical characteristics at all levels of physical reality. From this point of view, macroscopic deterministic and stochastic processes are closely related to microscopic chaos and self-organization. In this study the scientific work of scientists such as Wilson, Nicolis, Prigogine, 't Hooft, Nottale, El Naschie, Castro, Tsallis, Chang and others is used for the development of a unified physical comprehension of complex dynamics from the microscopic to the macroscopic level.
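    The SOC phenomenology surveyed above can be illustrated with the canonical Bak-Tang-Wiesenfeld sandpile, a minimal sketch unrelated to the authors' specific datasets; the grid size and grain count below are arbitrary choices.

```python
import random

def sandpile_avalanches(size=20, grains=5000, seed=1):
    """Drive a Bak-Tang-Wiesenfeld sandpile and record avalanche sizes.

    Illustrative toy model of self-organized criticality, not the
    authors' systems: grains are dropped on random sites, and any site
    holding 4 or more grains topples, sending one grain to each
    neighbor (grains fall off the open boundary).
    """
    random.seed(seed)
    grid = [[0] * size for _ in range(size)]
    sizes = []
    for _ in range(grains):
        i, j = random.randrange(size), random.randrange(size)
        grid[i][j] += 1
        topple_count = 0
        unstable = [(i, j)] if grid[i][j] >= 4 else []
        while unstable:
            x, y = unstable.pop()
            if grid[x][y] < 4:
                continue  # already relaxed by an earlier topple
            grid[x][y] -= 4
            topple_count += 1
            for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
                if 0 <= nx < size and 0 <= ny < size:
                    grid[nx][ny] += 1
                    if grid[nx][ny] >= 4:
                        unstable.append((nx, ny))
        sizes.append(topple_count)
    return sizes
```

Most grain drops cause no toppling at all, while rare avalanches span much of the grid; the resulting heavy-tailed avalanche-size distribution is the signature of self-organized criticality the abstract refers to.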

  5. A global "imaging'' view on systems approaches in immunology.

    PubMed

    Ludewig, Burkhard; Stein, Jens V; Sharpe, James; Cervantes-Barragan, Luisa; Thiel, Volker; Bocharov, Gennady

    2012-12-01

    The immune system exhibits an enormous complexity. High throughput methods such as the "-omic'' technologies generate vast amounts of data that facilitate dissection of immunological processes at ever finer resolution. Using high-resolution data-driven systems analysis, causal relationships between complex molecular processes and particular immunological phenotypes can be constructed. However, processes in tissues, organs, and the organism itself (so-called higher level processes) also control and regulate the molecular (lower level) processes. Reverse systems engineering approaches, which focus on the examination of the structure, dynamics and control of the immune system, can help to understand the construction principles of the immune system. Such integrative mechanistic models can properly describe, explain, and predict the behavior of the immune system in health and disease by combining both higher and lower level processes. Moving from molecular and cellular levels to a multiscale systems understanding requires the development of methodologies that integrate data from different biological levels into multiscale mechanistic models. In particular, 3D imaging techniques and 4D modeling of the spatiotemporal dynamics of immune processes within lymphoid tissues are central for such integrative approaches. Both dynamic and global organ imaging technologies will be instrumental in facilitating comprehensive multiscale systems immunology analyses as discussed in this review. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. General and craniofacial development are complex adaptive processes influenced by diversity.

    PubMed

    Brook, A H; O'Donnell, M Brook; Hone, A; Hart, E; Hughes, T E; Smith, R N; Townsend, G C

    2014-06-01

    Complex systems are present in such diverse areas as social systems, economies, ecosystems and biology and, therefore, are highly relevant to dental research, education and practice. A Complex Adaptive System in biological development is a dynamic process in which, from interacting components at a lower level, higher level phenomena and structures emerge. Diversity makes substantial contributions to the performance of complex adaptive systems. It enhances the robustness of the process, allowing multiple responses to external stimuli as well as internal changes. From diversity comes variation in outcome and the possibility of major change; outliers in the distribution enhance the tipping points. The development of the dentition is a valuable, accessible model with extensive and reliable databases for investigating the role of complex adaptive systems in craniofacial and general development. The general characteristics of such systems are seen during tooth development: self-organization; bottom-up emergence; multitasking; self-adaptation; variation; tipping points; critical phases; and robustness. Dental findings are compatible with the Random Network Model, the Threshold Model and also with the Scale Free Network Model which has a Power Law distribution. In addition, dental development shows the characteristics of Modularity and Clustering to form Hierarchical Networks. The interactions between the genes (nodes) demonstrate Small World phenomena, Subgraph Motifs and Gene Regulatory Networks. Genetic mechanisms are involved in the creation and evolution of variation during development. The genetic factors interact with epigenetic and environmental factors at the molecular level and form complex networks within the cells. From these interactions emerge the higher level tissues, tooth germs and mineralized teeth. 
Approaching development in this way allows investigation of why there can be variations in phenotypes from identical genotypes; the phenotype is the outcome of perturbations in the cellular systems and networks, as well as of the genotype. Understanding and applying complexity theory will bring about substantial advances not only in dental research and education but also in the organization and delivery of oral health care. © 2014 Australian Dental Association.

  7. Social complexity, modernity and suicide: an assessment of Durkheim's suicide from the perspective of a non-linear analysis of complex social systems.

    PubMed

    Condorelli, Rosalia

    2016-01-01

    Can we still share today the vision of modernity that Durkheim left us through his analysis of suicide, or can society 'surprise us'? An answer to these questions is suggested by several studies which found that, beginning in the second half of the twentieth century, suicides in the more industrialized and modernized western countries do not increase in the constant, linear way that Durkheim's theory seems to lead us to predict as modernization and social fragmentation advance. Despite continued modernization, these studies found stabilizing or falling overall suicide rate trends. A gradual process of adaptation to the stress of modernization, and to the low levels of social integration associated with it, therefore seems to be at work in modern society. Assuming this perspective, the paper argues that this tendency may be understood in light of the new concept of social systems as complex adaptive systems: systems able to adapt to environmental perturbations and to generate, as a whole, surprising emergent effects through nonlinear interactions among their components. Within the frame of Nonlinear Dynamical System Modeling, we formalize the logic of the suicide decision-making process responsible for changes in aggregate suicide growth rates as a nonlinear differential equation of logistic form, attempting to capture the mechanism underlying change in the suicide growth rate and to test the hypothesis that the system's dynamics exhibit a restrained increase, expressing adaptation to the liquidity of social ties in modern society. In particular, a Nonlinear Logistic Map is applied to suicide data from a modern society, Italy, covering 1875 to 2010. 
    The analytic results, which seem to confirm the activation of an adaptation process to the liquidity of social ties, offer an opportunity for a more general reflection on the current configuration of modern society, relating Durkheimian theory to Halbwachs' theory and to current visions of modernity such as Bauman's. Complexity completes the interpretative framework by rooting the generating mechanism of the adaptation process in a new General Theory of Systems that makes nonlinearity of interactions, and surprise, the rules of functioning and evolution of social systems.
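    A minimal sketch of the logistic map the abstract applies to suicide growth rates; the parameter values here are illustrative, not the ones fitted to the Italian data.

```python
def logistic_map(r, x0, n):
    """Iterate x_{t+1} = r * x_t * (1 - x_t) for n steps."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# For 1 < r < 3 the trajectory settles on the fixed point 1 - 1/r,
# i.e. growth is restrained rather than unbounded: here 1 - 1/2.5 = 0.6.
traj = logistic_map(2.5, 0.1, 100)
```

Restrained growth of this kind is what the paper reads as an adaptation process; for r > 3 the same map bifurcates and eventually becomes chaotic, which is why the fitted growth parameter matters.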

  8. Complex multidisciplinary system composition for aerospace vehicle conceptual design

    NASA Astrophysics Data System (ADS)

    Gonzalez, Lex

    Although there exists a vast amount of work concerning the analysis, design, and integration of aerospace vehicle systems, there is no standard for how this data and knowledge should be combined in order to create a synthesis system. Each institution creating a synthesis system has in-house vehicle and hardware components it is attempting to model and proprietary methods with which to model them. Synthesis systems therefore begin as one-off creations meant to answer a specific problem. As the scope of the synthesis system grows to encompass more and more problems, so do its size and complexity; in order for a single synthesis system to answer multiple questions, the number of methods and method interfaces must increase. To curtail the requirement that an increase in an aircraft synthesis system's capability leads to an increase in its size and complexity, this research effort focuses on the idea that each problem in aerospace requires its own analysis framework. By focusing on the creation of a methodology centered on matching an analysis framework to the problem being solved, the complexity of the analysis framework is decoupled from the complexity of the system that creates it. The derived methodology allows for the composition of complex multi-disciplinary systems (CMDS) through the automatic creation and implementation of system and disciplinary method interfaces. The CMDS Composition process follows a four-step methodology meant to take a problem definition and progress towards the creation of an analysis framework meant to answer said problem. The unique implementation of the CMDS Composition process takes user-selected disciplinary analysis methods and automatically integrates them in order to create a syntactically composable analysis framework. As a means of assessing the validity of the CMDS Composition process, a prototype system (AVDDBMS) has been developed. 
    AVDDBMS has been used to model the Generic Hypersonic Vehicle (GHV), an open source family of hypersonic vehicles originating from the Air Force Research Laboratory. AVDDBMS has been applied in three different ways in order to assess its validity: verification using GHV disciplinary data, validation using selected disciplinary analysis methods, and application of the CMDS Composition process to assess the design solution space for the GHV hardware. The research demonstrates the holistic effect that the selection of individual disciplinary analysis methods has on the structure and integration of the analysis framework.
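    The automatic integration of user-selected disciplinary methods can be sketched as dependency-driven composition: each method declares its inputs and outputs, and the framework orders them so every input is produced before it is consumed. Method and variable names below are invented for illustration; this is not the AVDDBMS implementation.

```python
def compose(methods, given):
    """Order analysis methods so each runs only after its inputs exist."""
    available, order = set(given), []
    pending = dict(methods)
    while pending:
        ready = [m for m, (ins, _) in pending.items() if set(ins) <= available]
        if not ready:
            raise ValueError("inputs unsatisfied for: " + ", ".join(sorted(pending)))
        for m in sorted(ready):
            order.append(m)
            available |= set(pending.pop(m)[1])
    return order

# Hypothetical disciplinary methods: name -> (inputs, outputs).
methods = {
    "aerodynamics": (["geometry"], ["lift", "drag"]),
    "propulsion":   (["geometry"], ["thrust"]),
    "performance":  (["lift", "drag", "thrust"], ["range"]),
}
print(compose(methods, ["geometry"]))
# -> ['aerodynamics', 'propulsion', 'performance']
```

Swapping one disciplinary method for another with different inputs changes the whole schedule, a small-scale view of the "holistic effect" of method selection noted above.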

  9. Automated Derivation of Complex System Constraints from User Requirements

    NASA Technical Reports Server (NTRS)

    Foshee, Mark; Murey, Kim; Marsh, Angela

    2010-01-01

    The Payload Operations Integration Center (POIC) located at the Marshall Space Flight Center has the responsibility of integrating US payload science requirements for the International Space Station (ISS). All payload operations must request ISS system resources so that the resource usage will be included in the ISS on-board execution timelines. The scheduling of resources and building of the timeline is performed using the Consolidated Planning System (CPS). The ISS resources are quite complex due to the large number of components that must be accounted for. The planners at the POIC simplify the process for Payload Developers (PDs) by providing the PDs with an application that has the basic functionality PDs need, as well as a list of simplified resources, in the User Requirements Collection (URC) application. The planners maintain a mapping of the URC resources to the CPS resources. The process of manually converting PDs' science requirements from a simplified representation to a more complex CPS representation is time-consuming and tedious. The goal is to provide a software solution that allows the planners to build a mapping of the complex CPS constraints to the basic URC constraints and automatically convert the PDs' requirements into system requirements during export to CPS.
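    The planner-maintained mapping can be sketched as a lookup that fans a simplified URC request out into the CPS constraints behind it. The resource names below are hypothetical; the real URC and CPS catalogs are not described in the abstract.

```python
# Hypothetical mapping: one simplified URC resource -> several CPS resources.
URC_TO_CPS = {
    "power":    [("EPS_BUS_A", "watts"), ("EPS_BUS_B", "watts")],
    "downlink": [("KU_BAND_FWD", "kbps")],
}

def expand_requirements(urc_reqs):
    """Expand simplified URC (resource, amount) requests into CPS constraints."""
    cps = []
    for name, amount in urc_reqs:
        for cps_name, unit in URC_TO_CPS[name]:
            cps.append({"resource": cps_name, "amount": amount, "unit": unit})
    return cps
```

A single "power" request expands into one constraint per electrical bus, which is exactly the tedious fan-out the abstract says planners previously performed by hand.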

  10. An experimental study of phase transitions in a complex plasma

    NASA Astrophysics Data System (ADS)

    Smith, Bernard Albert Thomas, II

    In semiconductor manufacturing, contamination due to particulates significantly decreases the yield and quality of device fabrication, therefore increasing the cost of production. Dust particle clouds can be found in almost all plasma processing environments including both plasma etching devices and in plasma deposition processes. Dust particles suspended within such plasmas will acquire an electric charge from collisions with free electrons in the plasma. If the ratio of inter-particle potential energy to the average kinetic energy is sufficient, the particles will form either a "liquid" structure with short range ordering or a crystalline structure with long range ordering. Otherwise, the dust particle system will remain in a gaseous state. Many experiments have been conducted over the past decade on such complex plasmas to discover the character of the systems formed, but more work is needed to fully understand these structures. This paper describes the processes involved in setting up the CASPER GEC RF Reference Cell and the modifications necessary to examine complex plasmas. Research conducted to characterize the system is outlined to demonstrate that the CASPER Cell behaves as other GEC Cells. In addition, further research performed shows the behavior of the complex plasma system in the CASPER Cell is similar to complex plasmas studied by other groups in this field. Along the way analysis routines developed specifically for this system are described. New research involving polydisperse dust distributions is carried out in the system once the initial characterization is finished. Next, a system to externally vary the DC bias in the CASPER Cell is developed and characterized. Finally, new research conducted to specifically examine how the complex plasma system reacts to a variable DC bias is reported. Specifically, the response of the interparticle spacing to various system parameters (including the external DC bias) is examined. 
Also, a previously unreported phenomenon, namely layer splitting, is examined.
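    The ordering criterion mentioned above, the ratio of inter-particle potential energy to average kinetic energy, is the Coulomb coupling parameter. A minimal sketch with textbook constants and illustrative grain parameters (not values measured in the CASPER Cell):

```python
import math

E = 1.602176634e-19      # elementary charge, C
EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m
KB = 1.380649e-23        # Boltzmann constant, J/K

def coupling_parameter(z_charges, spacing_m, temp_k):
    """Coulomb coupling parameter Gamma: nearest-neighbor electrostatic
    potential energy divided by the mean thermal kinetic energy."""
    q = z_charges * E
    potential = q * q / (4 * math.pi * EPS0 * spacing_m)
    return potential / (KB * temp_k)

# Illustrative grain: ~1e4 elementary charges, 300 um spacing, room temperature.
gamma = coupling_parameter(1e4, 300e-6, 300.0)
```

With these assumed values Gamma comes out in the tens of thousands, far above the one-component-plasma crystallization threshold of roughly 170, consistent with the liquid and crystalline ordering the experiment observes.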

  11. Design for waste-management system

    NASA Technical Reports Server (NTRS)

    Guarneri, C. A.; Reed, A.; Renman, R.

    1973-01-01

    Study was made and system defined for water-recovery and solid-waste processing for low-rise apartment complexes. System can be modified to conform with unique requirements of community, including hydrology, geology, and climate. Reclamation is accomplished by treatment process that features reverse-osmosis membranes.

  12. Using evaluation to adapt health information outreach to the complex environments of community-based organizations.

    PubMed

    Olney, Cynthia A

    2005-10-01

    After arguing that most community-based organizations (CBOs) function as complex adaptive systems, this white paper describes the evaluation goals, questions, indicators, and methods most important at different stages of community-based health information outreach. This paper presents the basic characteristics of complex adaptive systems and argues that the typical CBO can be considered this type of system. It then presents evaluation as a tool for helping outreach teams adapt their outreach efforts to the CBO environment and thus maximize success. Finally, it describes the goals, questions, indicators, and methods most important or helpful at each stage of evaluation (community assessment, needs assessment and planning, process evaluation, and outcomes assessment). Literature from complex adaptive systems as applied to health care, business, and evaluation settings is presented. Evaluation models and applications, particularly those based on participatory approaches, are presented as methods for maximizing the effectiveness of evaluation in dynamic CBO environments. If one accepts that CBOs function as complex adaptive systems-characterized by dynamic relationships among many agents, influences, and forces-then effective evaluation at the stages of community assessment, needs assessment and planning, process evaluation, and outcomes assessment is critical to outreach success.

  13. A mammalian nervous system-specific plasma membrane proteasome complex that modulates neuronal function

    PubMed Central

    Ramachandran, Kapil V.; Margolis, Seth S.

    2017-01-01

    In the nervous system, rapidly occurring processes such as neuronal transmission and calcium signaling are affected by short-term inhibition of proteasome function. It remains unclear how proteasomes can acutely regulate such processes, as this is inconsistent with their canonical role in proteostasis. Here, we made the discovery of a mammalian nervous system-specific membrane proteasome complex that directly and rapidly modulates neuronal function by degrading intracellular proteins into extracellular peptides that can stimulate neuronal signaling. This proteasome complex is tightly associated with neuronal plasma membranes, exposed to the extracellular space, and catalytically active. Selective inhibition of this membrane proteasome complex by a cell-impermeable proteasome inhibitor blocked extracellular peptide production and attenuated neuronal activity-induced calcium signaling. Moreover, membrane proteasome-derived peptides are sufficient to induce neuronal calcium signaling. Our discoveries challenge the prevailing notion that proteasomes primarily function to maintain proteostasis, and highlight a form of neuronal communication through a membrane proteasome complex. PMID:28287632

  14. Quality data collection and management technology of aerospace complex product assembly process

    NASA Astrophysics Data System (ADS)

    Weng, Gang; Liu, Jianhua; He, Yongxi; Zhuang, Cunbo

    2017-04-01

    Aiming to solve the problems of difficult management and poor traceability of discrete assembly process quality data, a data collection and management method is proposed that takes the assembly process and the BOM as its core. The method comprises data collection based on workflow technology, a data model based on the BOM, and quality traceability of the assembly process. Finally, an assembly process quality data management system is developed, realizing effective control and management of quality information for the complex product assembly process.
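    The BOM-as-core idea can be sketched as quality records keyed to parts, with a traceability query walking the BOM tree. Part numbers and measurements below are invented; this only mirrors the structure the abstract describes.

```python
# Hypothetical BOM: parent part -> child parts.
BOM = {
    "assembly-A": ["subassy-B", "part-C"],
    "subassy-B":  ["part-D", "part-E"],
}

quality_records = {}  # part -> list of (inspection step, measurement, passed)

def record(part, step, value, ok):
    """Attach one quality measurement to a part."""
    quality_records.setdefault(part, []).append((step, value, ok))

def trace(part):
    """Collect quality records for a part and everything beneath it in the BOM."""
    rows = list(quality_records.get(part, []))
    for child in BOM.get(part, []):
        rows.extend(trace(child))
    return rows

record("part-D", "torque check", 12.1, True)
record("subassy-B", "leak test", 0.02, True)
# trace("assembly-A") now returns both records via the BOM hierarchy.
```

Because every record hangs off a BOM node, a query at any level of the product structure recovers the full quality history below it, which is the traceability property the method targets.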

  15. A framework to observe and evaluate the sustainability of human-natural systems in a complex dynamic context.

    PubMed

    Satanarachchi, Niranji; Mino, Takashi

    2014-01-01

    This paper aims to explore the prominent implications of the process of observing complex dynamics linked to sustainability in human-natural systems and to propose a framework for sustainability evaluation by introducing the concept of sustainability boundaries. Arguing that both observing and evaluating sustainability should engage awareness of complex dynamics from the outset, we try to embody this idea in the framework through two complementary methods, namely the layer view-based and dimensional view-based methods, which support the understanding of a reflexive and iterative sustainability process. The framework enables the observation of complex dynamic sustainability contexts, which we call observation metastructures, and enables us to map these contexts to sustainability boundaries.

  16. Ontology patterns for complex topographic feature types

    USGS Publications Warehouse

    Varanka, Dalia E.

    2011-01-01

    Complex feature types are defined as integrated relations between basic features for a shared meaning or concept. The shared semantic concept is difficult to define in commonly used geographic information systems (GIS) and remote sensing technologies. The role of spatial relations between complex feature parts was recognized in early GIS literature, but had limited representation in the feature or coverage data models of GIS. Spatial relations are more explicitly specified in semantic technology. In this paper, semantics for topographic feature ontology design patterns (ODP) are developed as data models for the representation of complex features. In the context of topographic processes, component assemblages are supported by resource systems and are found on local landscapes. The topographic ontology is organized across six thematic modules that can account for basic feature types, resource systems, and landscape types. Types of complex feature attributes include location, generative processes and physical description. Node/edge networks model standard spatial relations and relations specific to topographic science to represent complex features. To demonstrate these concepts, data from The National Map of the U. S. Geological Survey was converted and assembled into ODP.
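    The node/edge representation of a complex feature can be sketched as subject-predicate-object triples. Feature and predicate names below are invented for illustration and do not follow the actual vocabulary of the USGS topographic ontology.

```python
# A tiny in-memory triple store: a complex feature is an assemblage of
# basic features plus its supporting resource system and landscape.
triples = {
    ("school_complex", "hasPart", "building"),
    ("school_complex", "hasPart", "athletic_field"),
    ("school_complex", "supportedBy", "education_system"),
    ("school_complex", "locatedOn", "suburban_landscape"),
}

def parts_of(feature):
    """Return the basic features composing a complex feature."""
    return {o for s, p, o in triples if s == feature and p == "hasPart"}
```

Spatial and domain-specific relations become first-class edges here, which is what the feature and coverage data models of classical GIS could not express directly.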

  17. Accomplishment Summary 1968-1969. Biological Computer Laboratory.

    ERIC Educational Resources Information Center

    Von Foerster, Heinz; And Others

    This report summarizes theoretical, applied, and experimental studies in the areas of computational principles in complex intelligent systems, cybernetics, multivalued logic, and the mechanization of cognitive processes. This work is summarized under the following topic headings: properties of complex dynamic systems; computers and the language…

  18. Experimentally modeling stochastic processes with less memory by the use of a quantum processor

    PubMed Central

    Palsson, Matthew S.; Gu, Mile; Ho, Joseph; Wiseman, Howard M.; Pryde, Geoff J.

    2017-01-01

    Computer simulation of observable phenomena is an indispensable tool for engineering new technology, understanding the natural world, and studying human society. However, the most interesting systems are often so complex that simulating their future behavior demands storing immense amounts of information regarding how they have behaved in the past. For increasingly complex systems, simulation becomes increasingly difficult and is ultimately constrained by resources such as computer memory. Recent theoretical work shows that quantum theory can reduce this memory requirement beyond ultimate classical limits, as measured by a process’ statistical complexity, C. We experimentally demonstrate this quantum advantage in simulating stochastic processes. Our quantum implementation observes a memory requirement of Cq = 0.05 ± 0.01, far below the ultimate classical limit of C = 1. Scaling up this technique would substantially reduce the memory required in simulations of more complex systems. PMID:28168218
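    For a symmetric two-causal-state process of the "perturbed coin" kind, both memory measures have closed forms: classical C is the Shannon entropy of the (uniform) causal-state distribution, while the quantum encoding overlaps the two memory states and pays the von Neumann entropy of the resulting mixture. The perturbation probability p = 0.4 below is illustrative, not the experimental setting that yielded Cq = 0.05.

```python
import math

def h2(p):
    """Binary Shannon entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def complexities(p):
    """Classical and quantum statistical complexity of a symmetric
    two-causal-state process with perturbation probability p."""
    c_classical = h2(0.5)                 # uniform causal states: 1 bit
    overlap = 2 * math.sqrt(p * (1 - p))  # inner product of the two quantum memory states
    c_quantum = h2((1 + overlap) / 2)     # entropy of the mixed memory register
    return c_classical, c_quantum

c, cq = complexities(0.4)  # c = 1 bit; cq is roughly 0.08 bits
```

The nearly parallel quantum memory states make the mixture almost pure, so Cq falls far below the classical floor of 1 bit, which is the advantage the experiment demonstrates.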

  19. Evaluation in context: ATC automation in the field

    NASA Technical Reports Server (NTRS)

    Harwood, Kelly; Sanford, Beverly

    1994-01-01

    The process for incorporating advanced technologies into complex aviation systems is as important as the final product itself. This paper describes a process that is currently being applied to the development and assessment of an advanced ATC automation system, CTAS. The key element of the process is field exposure early in the system development cycle. The process deviates from current established practices of system development -- where field testing is an implementation endpoint -- and has been deemed necessary by the FAA for streamlining development and bringing system functions to a level of stability and usefulness. Methods and approaches for field assessment are borrowed from human factors engineering, cognitive engineering, and usability engineering and are tailored for the constraints of an operational ATC environment. To date, the focus has been on the qualitative assessment of the match between the capabilities of CTAS's Traffic Management Advisor (TMA) and the context for their use. Capturing the users' experience with the automation tool and understanding tool use in the context of the operational environment is important, not only for developing a tool that is an effective problem-solving instrument but also for defining meaningful operational requirements. Such requirements form the basis for certifying the safety and efficiency of the system. CTAS is the first U.S. advanced ATC automation system of its scope and complexity to undergo this field development and assessment process. With the rapid advances in aviation technologies and our limited understanding of their impact on system performance, it is time we opened our eyes to new possibilities for developing, validating, and ultimately certifying complex aviation systems.

  20. Policy experimentation and innovation as a response to complexity in China's management of health reforms.

    PubMed

    Husain, Lewis

    2017-08-03

    There are increasing criticisms of dominant models for scaling up health systems in developing countries and a recognition that approaches are needed that better take into account the complexity of health interventions. Since Reform and Opening in the late 1970s, the Chinese government has managed complex, rapid and intersecting reforms across many policy areas. As with reforms in other policy areas, reform of the health system has been through a process of trial and error. There is increasing understanding of the importance of policy experimentation and innovation in many of China's reforms; this article argues that these processes have been important in rebuilding China's health system. While China's current system still has many problems, progress is being made in developing a functioning system able to ensure broad population access. The article analyses Chinese thinking on policy experimentation and innovation and their use in management of complex reforms. It argues that China's management of reform allows space for policy tailoring and innovation by sub-national governments under a broad agreement over the ends of reform, and that shared understandings of policy innovation, alongside informational infrastructures for the systemic propagation and codification of useful practices, provide a framework for managing change in complex environments and under conditions of uncertainty in which 'what works' is not knowable in advance. The article situates China's use of experimentation and innovation in management of health system reform in relation to recent literature which applies complex systems thinking to global health, and concludes that there are lessons to be learnt from China's approaches to managing complexity in development of health systems for the benefit of the poor.

  1. Developing a framework for qualitative engineering: Research in design and analysis of complex structural systems

    NASA Technical Reports Server (NTRS)

    Franck, Bruno M.

    1990-01-01

    The research is focused on automating the evaluation of complex structural systems, whether for the design of a new system or the analysis of an existing one, by developing new structural analysis techniques based on qualitative reasoning. The problem is to identify and better understand: (1) the requirements for the automation of design, and (2) the qualitative reasoning associated with the conceptual development of a complex system. The long-term objective is to develop an integrated design-risk assessment environment for the evaluation of complex structural systems. The scope of this short presentation is to describe the design and cognition components of the research. Design has received special attention in cognitive science because it is now identified as a problem solving activity that is different from other information processing tasks (1). Before an attempt can be made to automate design, a thorough understanding of the underlying design theory and methodology is needed, since the design process is, in many cases, multi-disciplinary, complex in size and motivation, and uses various reasoning processes involving different kinds of knowledge in ways which vary from one context to another. The objective is to unify all the various types of knowledge under one framework of cognition. This presentation focuses on the cognitive science framework that we are using to represent the knowledge aspects associated with the human mind's abstraction abilities and how we apply it to the engineering knowledge and engineering reasoning in design.

  2. Mathematical and Computational Modeling in Complex Biological Systems

    PubMed Central

    Li, Wenyang; Zhu, Xiaoliang

    2017-01-01

    The biological processes and molecular functions involved in cancer progression remain difficult to understand for biologists and clinical doctors. Recent developments in high-throughput technologies urge systems biology to achieve more precise models for complex diseases. Computational and mathematical models are gradually being used to help us understand the omics data produced by high-throughput experimental techniques. The use of computational models in systems biology allows us to explore the pathogenesis of complex diseases, improve our understanding of the latent molecular mechanisms, and promote treatment strategy optimization and new drug discovery. Currently, it is urgent to bridge the gap between the development of high-throughput technologies and systemic modeling of biological processes in cancer research. In this review, we first examine several typical mathematical modeling approaches to biological systems at different scales and analyze in depth their characteristics, advantages, applications, and limitations. Next, three potential research directions in systems modeling are summarized. To conclude, this review provides an update of important solutions using computational modeling approaches in systems biology. PMID:28386558

  3. Mathematical and Computational Modeling in Complex Biological Systems.

    PubMed

    Ji, Zhiwei; Yan, Ke; Li, Wenyang; Hu, Haigen; Zhu, Xiaoliang

    2017-01-01

    The biological processes and molecular functions involved in cancer progression remain difficult for biologists and clinicians to understand. Recent developments in high-throughput technologies are pushing systems biology toward more precise models of complex diseases. Computational and mathematical models are increasingly used to interpret the omics data produced by high-throughput experimental techniques. Computational models in systems biology allow us to explore the pathogenesis of complex diseases, improve our understanding of latent molecular mechanisms, and support treatment-strategy optimization and new drug discovery. There is an urgent need to bridge the gap between the development of high-throughput technologies and the systemic modeling of biological processes in cancer research. In this review, we first examine several typical mathematical modeling approaches for biological systems at different scales and analyze their characteristics, advantages, applications, and limitations. We then summarize three potential research directions in systems modeling. In conclusion, this review provides an update on important solutions that use computational modeling approaches in systems biology.
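
    As a minimal, hypothetical instance of the kind of ODE model this review surveys, logistic growth dx/dt = r*x*(1 - x/K) is the simplest population-level model of tumour growth; the parameter values below are illustrative only, not taken from the review.

```python
def logistic_growth(x0, r, K, dt, steps):
    """Forward-Euler integration of dx/dt = r*x*(1 - x/K), the
    simplest ODE model of tumour-cell population growth."""
    x, traj = x0, [x0]
    for _ in range(steps):
        x += dt * r * x * (1 - x / K)
        traj.append(x)
    return traj

# Hypothetical parameters: growth rate 0.5/day, carrying capacity 1e6 cells,
# integrated over 40 days in 0.1-day steps.
traj = logistic_growth(x0=1e3, r=0.5, K=1e6, dt=0.1, steps=400)
print(round(traj[-1]))  # saturates near the carrying capacity K
```

    Even this toy model exhibits the qualitative behaviour (exponential onset, saturation) that more elaborate multiscale cancer models refine.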

  4. Managing the Process of Protection Level Assessment of the Complex Organization and Technical Industrial Enterprises

    NASA Astrophysics Data System (ADS)

    Gorlov, A. P.; Averchenkov, V. I.; Rytov, M. Yu; Eryomenko, V. T.

    2017-01-01

    The article presents a mathematical simulation of protection-level assessment for the complex organizational and technical systems of industrial enterprises, implemented as an automated system whose main functions are: information security (IS) auditing, forming the enterprise threat model, and generating recommendations for building the information protection system, including a set of organizational-administrative documentation.

  5. 29. Perimeter acquisition radar building room #318, data processing system ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    29. Perimeter acquisition radar building room #318, data processing system area; data processor maintenance and operations center, showing data processing consoles - Stanley R. Mickelsen Safeguard Complex, Perimeter Acquisition Radar Building, Limited Access Area, between Limited Access Patrol Road & Service Road A, Nekoma, Cavalier County, ND

  6. Simulating complex intracellular processes using object-oriented computational modelling.

    PubMed

    Johnson, Colin G; Goldman, Jacki P; Gullick, William J

    2004-11-01

    The aim of this paper is to give an overview of computer modelling and simulation in cellular biology, in particular as applied to complex biochemical processes within the cell. This is illustrated by the use of the techniques of object-oriented modelling, where the computer is used to construct abstractions of objects in the domain being modelled, and these objects then interact within the computer to simulate the system and allow emergent properties to be observed. The paper also discusses the role of computer simulation in understanding complexity in biological systems, and the kinds of information which can be obtained about biology via simulation.
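
    The object-oriented style the paper describes can be sketched in a few lines: objects stand in for cellular entities, local interaction rules are methods, and system-level quantities emerge from running the objects together. Everything below (class names, rates, the binding rule) is an illustrative stand-in, not the paper's model.

```python
import random

class Receptor:
    """Minimal object-oriented stand-in for a cell-surface receptor.
    Names and rates are illustrative only."""
    def __init__(self):
        self.bound = False

    def encounter(self, ligand_conc, rng):
        # Binding is stochastic; probability grows with ligand concentration.
        if not self.bound and rng.random() < ligand_conc:
            self.bound = True

def simulate(n_receptors=1000, ligand_conc=0.1, steps=20, seed=1):
    rng = random.Random(seed)
    receptors = [Receptor() for _ in range(n_receptors)]
    for _ in range(steps):
        for r in receptors:
            r.encounter(ligand_conc, rng)
    # Emergent property: population-level receptor occupancy.
    return sum(r.bound for r in receptors)

print(simulate())  # most receptors bound after 20 steps
```

    The occupancy curve is not coded anywhere explicitly; it emerges from many simple object interactions, which is the point of the approach.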

  7. Analysis of Complexity Evolution Management and Human Performance Issues in Commercial Aircraft Automation Systems

    NASA Technical Reports Server (NTRS)

    Vakil, Sanjay S.; Hansman, R. John

    2000-01-01

    Autoflight systems in the current generation of aircraft have been implicated in several recent incidents and accidents. A contributory aspect to these incidents may be the manner in which aircraft transition between differing behaviours or 'modes.' The current state of aircraft automation was investigated and the incremental development of the autoflight system was tracked through a set of aircraft to gain insight into how these systems developed. This process appears to have resulted in a system without a consistent global representation. In order to evaluate and examine autoflight systems, a 'Hybrid Automation Representation' (HAR) was developed. This representation was used to examine several specific problems known to exist in aircraft systems. Cyclomatic complexity is an analysis tool from computer science which counts the number of linearly independent paths through a program graph. This approach was extended to examine autoflight mode transitions modelled with the HAR. A survey was conducted of pilots to identify those autoflight mode transitions which airline pilots find difficult. The transitions identified in this survey were analyzed using cyclomatic complexity to gain insight into the apparent complexity of the autoflight system from the perspective of the pilot. Mode transitions which had been identified as complex by pilots were found to have a high cyclomatic complexity. Further examination was made into a set of specific problems identified in aircraft: the lack of a consistent representation of automation, concern regarding appropriate feedback from the automation, and the implications of physical limitations on the autoflight systems. Mode transitions involved in changing to and leveling at a new altitude were identified across multiple aircraft by numerous pilots. Where possible, evaluation and verification of the behaviour of these autoflight mode transitions was investigated via aircraft-specific high fidelity simulators. 
Three solution approaches to concerns regarding autoflight systems, and mode transitions in particular, are presented in this thesis. The first is to use training to modify pilot behaviours, or procedures to work around known problems. The second approach is to mitigate problems by enhancing feedback. The third approach is to modify the process by which automation is designed. The Operator Directed Process forces the consideration and creation of an automation model early in the design process for use as the basis of the software specification and training.
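
    The cyclomatic-complexity measure used above is M = E - N + 2P for a graph with E edges, N nodes, and P connected components, and can be computed directly from a mode-transition graph. A minimal sketch; the mode names are hypothetical, not taken from any particular autoflight system.

```python
def cyclomatic_complexity(edges):
    """McCabe cyclomatic complexity M = E - N + 2P of a directed graph
    given as a list of (src, dst) edges; P is the number of weakly
    connected components, found with a small union-find."""
    nodes = {n for e in edges for n in e}
    parent = {n: n for n in nodes}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for a, b in edges:
        parent[find(a)] = find(b)
    p = len({find(n) for n in nodes})
    return len(edges) - len(nodes) + 2 * p

# Hypothetical fragment of an autoflight mode-transition graph:
modes = [("ALT_HOLD", "VS"), ("VS", "ALT_CAP"), ("ALT_CAP", "ALT_HOLD"),
         ("VS", "ALT_HOLD")]
print(cyclomatic_complexity(modes))  # 4 edges - 3 nodes + 2*1 = 3
```

    Applied to a full set of transitions modelled with the HAR, the same count gives the per-mode complexity figure the pilot survey was compared against.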

  8. Is complex signal processing for bone conduction hearing aids useful?

    PubMed

    Kompis, Martin; Kurz, Anja; Pfiffner, Flurin; Senn, Pascal; Arnold, Andreas; Caversaccio, Marco

    2014-05-01

    To establish whether complex signal processing is beneficial for users of bone-anchored hearing aids. Review and analysis of two studies from our own group, each comparing a speech processor with basic digital signal processing (either Baha Divino or Baha Intenso) and a processor with complex digital signal processing (either Baha BP100 or Baha BP110 power). The main differences between basic and complex signal processing are the number of audiologist accessible frequency channels and the availability and complexity of the directional multi-microphone noise reduction and loudness compression systems. Both studies show a small, statistically non-significant improvement of speech understanding in quiet with the complex digital signal processing. The average improvement for speech in noise is +0.9 dB, if speech and noise are emitted both from the front of the listener. If noise is emitted from the rear and speech from the front of the listener, the advantage of the devices with complex digital signal processing as opposed to those with basic signal processing increases, on average, to +3.2 dB (range +2.3 to +5.1 dB, p ≤ 0.0032). Complex digital signal processing does indeed improve speech understanding, especially in noise coming from the rear. This finding has been supported by another study, which has been published recently by a different research group. When compared to basic digital signal processing, complex digital signal processing can increase speech understanding of users of bone-anchored hearing aids. The benefit is most significant for speech understanding in noise.

  9. Measuring the impact of final demand on global production system based on Markov process

    NASA Astrophysics Data System (ADS)

    Xing, Lizhi; Guan, Jun; Wu, Shan

    2018-07-01

    An input-output table is a comprehensive and detailed description of a national economic system, consisting of supply and demand information among industrial sectors. Complex network theory, a set of methods for measuring the structure of complex systems, can depict the structural properties of social and economic systems and reveal the complicated relationships between inner hierarchies and external macroeconomic functions. This paper measures the degree of globalization of industrial sectors on the global value chain. First, it constructs inter-country input-output network models to reproduce the topological structure of the global economic system. Second, it treats the propagation of intermediate goods along the global value chain as a Markov process and introduces counting first-passage betweenness to quantify the added processing amount when global final demand stimulates this production system. Third, it analyzes the features of globalization at both the global and the country-sector level.
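
    The propagation of final demand through an inter-sector network is conventionally captured by the Leontief inverse, x = (I - A)^-1 f = (I + A + A^2 + ...) f, where each power of the technical-coefficient matrix A is one further round of intermediate-goods production, the same step-by-step propagation the abstract models as a Markov process. A minimal sketch with a hypothetical 3-sector matrix (the paper's actual inter-country tables are far larger):

```python
import numpy as np

# Hypothetical technical-coefficient matrix A for three sectors:
# A[i, j] = input from sector i needed per unit of sector j's output.
A = np.array([[0.1, 0.3, 0.1],
              [0.2, 0.1, 0.2],
              [0.1, 0.2, 0.1]])
f = np.array([100.0, 50.0, 80.0])  # final demand per sector

# Total output needed to satisfy f, summed over all propagation rounds:
x = np.linalg.solve(np.eye(3) - A, f)

# Equivalently, truncate the Neumann series I + A + A^2 + ...
x_series = sum(np.linalg.matrix_power(A, k) @ f for k in range(50))
print(np.allclose(x, x_series))  # True
```

    The series form makes the Markov reading explicit: round k of the sum is demand that has passed through k intermediate production steps before reaching a final consumer.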

  10. Towards an integrated optofluidic system for highly sensitive detection of antibiotics in seawater incorporating bimodal waveguide photonic biosensors and complex, active microfluidics

    NASA Astrophysics Data System (ADS)

    Szydzik, C.; Gavela, A. F.; Roccisano, J.; Herranz de Andrés, S.; Mitchell, A.; Lechuga, L. M.

    2016-12-01

    We present recent results on the realisation and demonstration of an integrated optofluidic lab-on-a-chip measurement system. The system consists of an integrated on-chip automated microfluidic fluid handling subsystem, coupled with bimodal nano-interferometer waveguide technology, and is applied in the context of detection of antibiotics in seawater. The bimodal waveguide (BMWG) is a highly sensitive label-free biosensor. Integration of complex microfluidic systems with bimodal waveguide technology enables on-chip sample handling and fluid processing capabilities and allows for significant automation of experimental processes. The on-chip fluid-handling subsystem is realised through the integration of pneumatically actuated elastomer pumps and valves, enabling high temporal resolution sample and reagent delivery and facilitating multiplexed detection processes.

  11. Novel methodology to examine cognitive and experiential factors in language development: combining eye-tracking and LENA technology

    PubMed Central

    Odean, Rosalie; Nazareth, Alina; Pruden, Shannon M.

    2015-01-01

    Developmental systems theory posits that development cannot be segmented by influences acting in isolation, but should be studied through a scientific lens that highlights the complex interactions between these forces over time (Overton, 2013a). This poses a unique challenge for developmental psychologists studying complex processes like language development. In this paper, we advocate for the combining of highly sophisticated data collection technologies in an effort to move toward a more systemic approach to studying language development. We investigate the efficiency and appropriateness of combining eye-tracking technology and the LENA (Language Environment Analysis) system, an automated language analysis tool, in an effort to explore the relation between language processing in early development, and external dynamic influences like parent and educator language input in the home and school environments. Eye-tracking allows us to study language processing via eye movement analysis; these eye movements have been linked to both conscious and unconscious cognitive processing, and thus provide one means of evaluating cognitive processes underlying language development that does not require the use of subjective parent reports or checklists. The LENA system, on the other hand, provides automated language output that describes a child’s language-rich environment. In combination, these technologies provide critical information not only about a child’s language processing abilities but also about the complexity of the child’s language environment. Thus, when used in conjunction these technologies allow researchers to explore the nature of interacting systems involved in language development. PMID:26379591

  12. 29 CFR 1910.119 - Process safety management of highly hazardous chemicals.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...

  13. 29 CFR 1910.119 - Process safety management of highly hazardous chemicals.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...

  14. Teaching Information Systems Development via Process Variants

    ERIC Educational Resources Information Center

    Tan, Wee-Kek; Tan, Chuan-Hoo

    2010-01-01

    Acquiring the knowledge to assemble an integrated Information System (IS) development process that is tailored to the specific needs of a project has become increasingly important. It is therefore necessary for educators to impart to students this crucial skill. However, Situational Method Engineering (SME) is an inherently complex process that…

  15. Three faces of entropy for complex systems: Information, thermodynamics, and the maximum entropy principle

    NASA Astrophysics Data System (ADS)

    Thurner, Stefan; Corominas-Murtra, Bernat; Hanel, Rudolf

    2017-09-01

    There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure for information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems, is degenerate: H(p) = -Σ_i p_i log p_i. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes, the three entropy concepts lead to different functional forms of entropy, which we will refer to as S_EXT for extensive entropy, S_IT for the source information rate in information theory, and S_MEP for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: for Pólya urn processes, which are simple self-reinforcing processes, for sample-space-reducing (SSR) processes, which are simple history-dependent processes that are associated with power-law statistics, and finally for multinomial mixture processes.
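
    In the degenerate (multinomial) case all three concepts reduce to the form H(p) = -Σ_i p_i log p_i, while a Pólya urn is the abstract's simplest self-reinforcing, history-dependent process. A sketch of both, with illustrative parameters:

```python
import math
import random

def shannon_entropy(p):
    """H(p) = -sum_i p_i log p_i (natural log): the common functional
    form shared by S_EXT, S_IT and S_MEP in the multinomial case."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def polya_urn(steps, reinforcement=1, seed=0):
    """Self-reinforcing process: draw a ball, return it together with
    `reinforcement` extra copies of the same colour. Returns the final
    fraction of red balls; early draws shape the whole trajectory."""
    rng = random.Random(seed)
    urn = ["red", "blue"]
    for _ in range(steps):
        ball = rng.choice(urn)
        urn.extend([ball] * reinforcement)
    return urn.count("red") / len(urn)

print(shannon_entropy([0.5, 0.5]))  # log 2, about 0.693
```

    Rerunning `polya_urn` with different seeds shows the history dependence: the limiting red fraction varies from run to run instead of concentrating on a single value, which is exactly why the multinomial entropy form stops being the right description.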

  16. Structural Complexity in Linguistic Systems Research Topic 3: Mathematical Sciences

    DTIC Science & Technology

    2015-04-01

    understanding of structure (pattern, correlation, memory, semantics, ...) observed in linguistic systems and processes? On this score we believe the ... promised to overcome these difficulties, since it gives a clear and constructive view of structure in memoryful stochastic processes. In principle, this

  17. Pt(ii) coordination complexes as visible light photocatalysts for the oxidation of sulfides using batch and flow processes.

    PubMed

    Casado-Sánchez, Antonio; Gómez-Ballesteros, Rocío; Tato, Francisco; Soriano, Francisco J; Pascual-Coca, Gustavo; Cabrera, Silvia; Alemán, José

    2016-07-12

    A new catalytic system for the photooxidation of sulfides based on Pt(ii) complexes is presented. The catalyst is capable of oxidizing a large number of sulfides containing aryl, alkyl, allyl, benzyl, as well as more complex structures such as heterocycles and methionine amino acid, with complete chemoselectivity. In addition, the first sulfur oxidation in a continuous flow process has been developed.

  18. Further Understanding of Complex Information Processing in Verbal Adolescents and Adults with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Williams, Diane L.; Minshew, Nancy J.; Goldstein, Gerald

    2015-01-01

    More than 20 years ago, Minshew and colleagues proposed the Complex Information Processing model of autism in which the impairment is characterized as a generalized deficit involving multiple modalities and cognitive domains that depend on distributed cortical systems responsible for higher order abilities. Subsequent behavioral work revealed a…

  19. Complex Network Structure Influences Processing in Long-Term and Short-Term Memory

    ERIC Educational Resources Information Center

    Vitevitch, Michael S.; Chan, Kit Ying; Roodenrys, Steven

    2012-01-01

    Complex networks describe how entities in systems interact; the structure of such networks is argued to influence processing. One measure of network structure, clustering coefficient, C, measures the extent to which neighbors of a node are also neighbors of each other. Previous psycholinguistic experiments found that the C of phonological…
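
    The clustering coefficient C mentioned above is, for each node, the fraction of its neighbours' possible pairings that are actually linked. A minimal sketch; the word-form network below is hypothetical, not data from the study.

```python
from itertools import combinations

def clustering_coefficient(adj, node):
    """C = (links among neighbours) / (possible neighbour pairs)
    for an undirected graph given as {node: set(neighbours)}."""
    nbrs = adj[node]
    if len(nbrs) < 2:
        return 0.0
    links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
    return links / (len(nbrs) * (len(nbrs) - 1) / 2)

# Hypothetical phonological neighbourhood: "cat" has three neighbours,
# only one pair of which ("hat"-"bat") are neighbours of each other.
adj = {"cat": {"hat", "bat", "cut"},
       "hat": {"cat", "bat"},
       "bat": {"cat", "hat"},
       "cut": {"cat"}}
print(clustering_coefficient(adj, "cat"))  # 1 link of 3 possible pairs
```

    A high-C word sits in a tightly interconnected neighbourhood; a low-C word's neighbours are mutually unrelated, the contrast the psycholinguistic experiments manipulate.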

  20. Recoding Numerics to Geometrics for Complex Discrimination Tasks; A Feasibility Study of Coding Strategy.

    ERIC Educational Resources Information Center

    Simpkins, John D.

    Processing complex multivariate information effectively when relational properties of information sub-groups are ambiguous is difficult for man and man-machine systems. However, the information processing task is made easier through code study, cybernetic planning, and accurate display mechanisms. An exploratory laboratory study designed for the…

  1. An image processing and analysis tool for identifying and analysing complex plant root systems in 3D soil using non-destructive analysis: Root1.

    PubMed

    Flavel, Richard J; Guppy, Chris N; Rabbi, Sheikh M R; Young, Iain M

    2017-01-01

    The objective of this study was to develop a flexible and free image processing and analysis solution, based on the Public Domain ImageJ platform, for the segmentation and analysis of complex biological plant root systems in soil from x-ray tomography 3D images. Contrasting root architectures from wheat, barley and chickpea root systems were grown in soil and scanned using a high resolution micro-tomography system. A macro (Root1) was developed that reliably identified complex root systems with good to high accuracy (10% overestimation for chickpea, 1% underestimation for wheat, 8% underestimation for barley) and provided analysis of root length and angle. In-built flexibility allowed the user to (a) amend any aspect of the macro to account for specific preferences, and (b) take account of computational limitations of the platform. The platform is free, flexible and accurate in analysing root system metrics.

  2. An Improved Method to Control the Critical Parameters of a Multivariable Control System

    NASA Astrophysics Data System (ADS)

    Subha Hency Jims, P.; Dharmalingam, S.; Wessley, G. Jims John

    2017-10-01

    The role of control systems is to cope with process deficiencies and the undesirable effects of external disturbances. Most multivariable processes are highly iterative and complex in nature. Aircraft systems, modern power plants, refineries and robotic systems are a few such complex systems, involving numerous critical parameters that must be monitored and controlled. Control of these important parameters is not only tedious and cumbersome but also crucial from the environmental, safety and quality perspectives. In this paper, one such multivariable system, namely a utility boiler, has been considered. A modern power plant is a complex arrangement of pipework and machinery with numerous interacting control loops and support systems. The calculation of controller parameters based on classical tuning concepts is presented, and the controller parameters thus obtained were used to control the critical parameters of a boiler during fuel-switching disturbances. The proposed method can also be applied to control critical aircraft parameters such as the elevator, aileron, rudder, elevator trim, rudder and aileron trim, and flap control systems.
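
    The abstract does not spell out its tuning calculation; the classical Ziegler-Nichols ultimate-cycle rules are the textbook instance of tuning a PID controller from "classical concepts", sketched here with hypothetical boiler-loop values (the paper's actual numbers are not given).

```python
def ziegler_nichols_pid(Ku, Tu):
    """Classic Ziegler-Nichols ultimate-cycle PID tuning.
    Ku: ultimate gain at which the closed loop sustains oscillation,
    Tu: period of that oscillation (s).
    Returns (Kp, Ti, Td) for a controller of the form
    u = Kp * (e + (1/Ti) * integral(e) + Td * de/dt)."""
    return 0.6 * Ku, Tu / 2.0, Tu / 8.0

# Hypothetical drum-level loop that sustains oscillation at
# Ku = 4.0 and Tu = 10 s:
Kp, Ti, Td = ziegler_nichols_pid(4.0, 10.0)
print(Kp, Ti, Td)  # 2.4 5.0 1.25
```

    Such rule-of-thumb settings are normally a starting point that is then detuned against disturbance tests, for example the fuel-switching transients the paper considers.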

  3. Big Data Analysis of Manufacturing Processes

    NASA Astrophysics Data System (ADS)

    Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert

    2015-11-01

    The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: large distillation columns, agricultural harvesting processes and large-scale sorting plants. This paper describes the data-acquisition infrastructures developed for these application cases, as well as the algorithms developed and initial evaluation results.
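
    The abstract does not name its detection algorithms; the simplest illustration of anomaly detection that "learns" normal behaviour from observation is a rolling z-score test. The signal below is a hypothetical column-temperature trace, not data from the paper.

```python
import statistics

def zscore_anomalies(signal, window=20, threshold=3.0):
    """Flag indices where a sample deviates more than `threshold`
    standard deviations from the preceding `window` samples."""
    flagged = []
    for i in range(window, len(signal)):
        hist = signal[i - window:i]
        mu, sd = statistics.mean(hist), statistics.stdev(hist)
        if sd > 0 and abs(signal[i] - mu) > threshold * sd:
            flagged.append(i)
    return flagged

# Hypothetical column-temperature trace with one spike at index 30:
trace = [80.0 + 0.1 * (i % 5) for i in range(60)]
trace[30] = 95.0
print(zscore_anomalies(trace))  # [30]
```

    Production systems layer far more on top (multivariate models, seasonality, drift handling), but the principle of flagging deviations from observed normal behaviour is the same.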

  4. High rate information systems - Architectural trends in support of the interdisciplinary investigator

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Preheim, Larry E.

    1990-01-01

    Data systems requirements in the Earth Observing System (EOS) and Space Station Freedom (SSF) eras indicate increasing data volume, increased discipline interplay, higher complexity and broader data integration and interpretation. A response to the needs of the interdisciplinary investigator is proposed, considering the increasing complexity and rising costs of scientific investigation. The EOS Data Information System, conceived to be a widely distributed system with reliable communication links between central processing and the science user community, is described. Details are provided on information architecture, system models, intelligent data management of large complex databases, and standards for archiving ancillary data, using a research library, a laboratory and collaboration services.

  5. On Convergence of Development Costs and Cost Models for Complex Spaceflight Instrument Electronics

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Patel, Umeshkumar D.; Kasa, Robert L.; Hestnes, Phyllis; Brown, Tammy; Vootukuru, Madhavi

    2008-01-01

    Development costs of a few recent spaceflight instrument electrical and electronics subsystems have diverged from respective heritage cost model predictions. The cost models used are Grass Roots, Price-H and Parametric Model. These cost models originated in the military and industry around 1970 and were successfully adopted and patched by NASA on a mission-by-mission basis for years. However, the complexity of new instruments has recently changed rapidly, by orders of magnitude. This is most obvious in the complexity of a representative spaceflight instrument electronics data system, which is now required to perform intermediate processing of digitized data apart from conventional processing of science phenomenon signals from multiple detectors. This involves on-board instrument formatting of computational operands from raw data (for example, images), multi-million operations per second on large volumes of data in reconfigurable hardware (in addition to processing on a general-purpose embedded or standalone instrument flight computer), as well as making decisions for on-board system adaptation and resource reconfiguration. The instrument data system is now tasked to perform more functions, such as forming packets and instrument-level data compression of more than one data stream, which are traditionally performed by the spacecraft command and data handling system. It is furthermore required that the electronics box for new complex instruments be developed for single-digit-watt power consumption and small size, be lightweight, and deliver super-computing capabilities. The conflict between the actual development cost of newer complex instruments and the predictions of the heritage cost models for their electronics components seems to be irreconcilable. This conflict and an approach to its resolution are addressed in this paper by determining complexity parameters and a complexity index, and by their use in an enhanced cost model.
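
    The paper's enhanced model is not reproduced here; one plausible shape, sketched with entirely hypothetical parameter names and weights, is a heritage parametric estimate scaled by a weighted complexity index normalized so a heritage-class instrument scores 1.0.

```python
def complexity_index(params, weights):
    """Weighted sum of normalized complexity parameters. Parameter
    names and weights below are hypothetical illustrations; each
    value is the ratio of the new instrument to a heritage baseline."""
    return sum(weights[k] * params[k] for k in weights)

def adjusted_cost(heritage_estimate, index):
    """Scale a heritage parametric cost estimate by the complexity
    index -- the kind of enhancement the paper argues for."""
    return heritage_estimate * index

# Hypothetical instrument: 2.5x the processing throughput, 2x the
# reconfigurable-hardware content, 1.5x the compressed data streams.
params = {"gops": 2.5, "reconfig_hw": 2.0, "compression_streams": 1.5}
weights = {"gops": 0.5, "reconfig_hw": 0.3, "compression_streams": 0.2}
print(round(adjusted_cost(10.0, complexity_index(params, weights)), 2))  # 21.5
```

    The point of such a multiplier is exactly the paper's conflict: a heritage model that ignores the index would still predict the baseline 10.0, roughly half the adjusted figure.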

  6. Systems approach provides management control of complex programs

    NASA Technical Reports Server (NTRS)

    Dudek, E. F., Jr.; Mc Carthy, J. F., Jr.

    1970-01-01

    Integrated program management process provides management visual assistance through three interrelated charts - system model that identifies each function to be performed, matrix that identifies personnel responsibilities for these functions, process chart that breaks down the functions into discrete tasks.

  7. Panarchy

    USGS Publications Warehouse

    Garmestani, Ahjond S.; Allen, Craig R.; El-Shaarawi, Abdel H.; Piegorsch, Walter W.

    2012-01-01

    Panarchy is the term coined to describe hierarchical systems where control is not only top down, as typically considered, but also bottom up. A panarchy is composed of adaptive cycles, and an adaptive cycle describes the processes of development and decay in a system. Complex systems self-organize into hierarchies because this structure limits the possible spread of destructive phenomena (e.g., forest fires, epidemics) that could result in catastrophic system failure. Thus, hierarchical organization enhances the resilience of complex systems.

  8. Five schools of thought about complexity: Implications for design and process science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warfield, J.N.

    1996-12-31

    The prevalence of complexity is a fact of life in virtually all aspects of system design today. Five schools of thought concerning complexity seem to be present in areas where people strive to gain more facility with difficult issues: (1) Interdisciplinary or Cross-Disciplinary "approaches" or "methods" (fostered by the Association for Integrative Studies, a predominantly liberal-arts faculty activity), (2) Systems Dynamics (fostered by Jay Forrester, Dennis Meadows, Peter Senge, and others closely associated with MIT), (3) Chaos Theory (arising in small groups in many locations), (4) Adaptive Systems Theory (predominantly associated with the Santa Fe Institute), and (5) the Structure-Based school (developed by the author, his colleagues and associates). A comparison of these five schools of thought will be offered, in order to show their implications for the development and application of design and process science. The following criteria of comparison will be used: (a) how complexity is defined, (b) analysis versus synthesis, (c) potential for acquiring practical competence in coping with complexity, and (d) relationship to underlying formalisms that facilitate computer assistance in applications. Through these comparisons, the advantages and disadvantages of each school of thought can be clarified, and the possibilities of changes in the educational system to provide for the management of complexity in system design can be articulated.

  9. Digital Signal Processing and Control for the Study of Gene Networks

    NASA Astrophysics Data System (ADS)

    Shin, Yong-Jun

    2016-04-01

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.
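
    A minimal illustration of the discrete-time viewpoint the article advocates is a first-order digital filter used as a toy expression model: production driven by an input, balanced by first-order decay. The model and parameter values are illustrative, not taken from the article.

```python
def simulate_gene(input_signal, alpha=0.2, x0=0.0):
    """First-order discrete-time system x[n] = (1 - alpha)*x[n-1]
    + alpha*u[n]: a minimal digital model of an expression level that
    decays (degradation) while tracking its input (transcription
    drive). alpha and the step input are illustrative only."""
    x, out = x0, []
    for u in input_signal:
        x = (1 - alpha) * x + alpha * u
        out.append(x)
    return out

# Step input: the modelled expression level relaxes toward 1.0.
levels = simulate_gene([1.0] * 30)
print(round(levels[-1], 3))  # approaches 1.0
```

    Because the model is a standard digital filter, the whole DSP toolbox (frequency response, stability analysis, feedback-controller design) applies directly, which is the article's central point.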

  10. Digital Signal Processing and Control for the Study of Gene Networks.

    PubMed

    Shin, Yong-Jun

    2016-04-22

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.

  11. Digital Signal Processing and Control for the Study of Gene Networks

    PubMed Central

    Shin, Yong-Jun

    2016-01-01

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks. PMID:27102828

  12. Bridged transition-metal complexes and uses thereof for hydrogen separation, storage and hydrogenation

    DOEpatents

    Lilga, Michael A.; Hallen, Richard T.

    1990-01-01

    The present invention constitutes a class of organometallic complexes which reversibly react with hydrogen to form dihydrides and processes by which these compounds can be utilized. The class includes bimetallic complexes in which two cyclopentadienyl rings are bridged together and also separately .pi.-bonded to two transition metal atoms. The transition metals are believed to bond with the hydrogen in forming the dihydride. Transition metals such as Fe, Mn or Co may be employed in the complexes although Cr constitutes the preferred metal. A multiple number of ancillary ligands such as CO are bonded to the metal atoms in the complexes. Alkyl groups and the like may be substituted on the cyclopentadienyl rings. These organometallic compounds may be used in absorption/desorption systems and in facilitated transport membrane systems for storing and separating out H.sub.2 from mixed gas streams such as the producer gas from coal gasification processes.

  13. Bridged transition-metal complexes and uses thereof for hydrogen separation, storage and hydrogenation

    DOEpatents

    Lilga, M.A.; Hallen, R.T.

    1991-10-15

    The present invention constitutes a class of organometallic complexes which reversibly react with hydrogen to form dihydrides and processes by which these compounds can be utilized. The class includes bimetallic complexes in which two cyclopentadienyl rings are bridged together and also separately π-bonded to two transition metal atoms. The transition metals are believed to bond with the hydrogen in forming the dihydride. Transition metals such as Fe, Mn or Co may be employed in the complexes although Cr constitutes the preferred metal. A multiple number of ancillary ligands such as CO are bonded to the metal atoms in the complexes. Alkyl groups and the like may be substituted on the cyclopentadienyl rings. These organometallic compounds may be used in absorption/desorption systems and in facilitated transport membrane systems for storing and separating out H2 from mixed gas streams such as the product gas from coal gasification processes. 3 figures.

  14. Bridged transition-metal complexes and uses thereof for hydrogen separation, storage and hydrogenation

    DOEpatents

    Lilga, M.A.; Hallen, R.T.

    1990-08-28

    The present invention constitutes a class of organometallic complexes which reversibly react with hydrogen to form dihydrides and processes by which these compounds can be utilized. The class includes bimetallic complexes in which two cyclopentadienyl rings are bridged together and also separately π-bonded to two transition metal atoms. The transition metals are believed to bond with the hydrogen in forming the dihydride. Transition metals such as Fe, Mn or Co may be employed in the complexes although Cr constitutes the preferred metal. A multiple number of ancillary ligands such as CO are bonded to the metal atoms in the complexes. Alkyl groups and the like may be substituted on the cyclopentadienyl rings. These organometallic compounds may be used in absorption/desorption systems and in facilitated transport membrane systems for storing and separating out H2 from mixed gas streams such as the producer gas from coal gasification processes. 3 figs.

  15. Bridged transition-metal complexes and uses thereof for hydrogen separation, storage and hydrogenation

    DOEpatents

    Lilga, Michael A.; Hallen, Richard T.

    1991-01-01

    The present invention constitutes a class of organometallic complexes which reversibly react with hydrogen to form dihydrides and processes by which these compounds can be utilized. The class includes bimetallic complexes in which two cyclopentadienyl rings are bridged together and also separately π-bonded to two transition metal atoms. The transition metals are believed to bond with the hydrogen in forming the dihydride. Transition metals such as Fe, Mn or Co may be employed in the complexes although Cr constitutes the preferred metal. A multiple number of ancillary ligands such as CO are bonded to the metal atoms in the complexes. Alkyl groups and the like may be substituted on the cyclopentadienyl rings. These organometallic compounds may be used in absorption/desorption systems and in facilitated transport membrane systems for storing and separating out H2 from mixed gas streams such as the product gas from coal gasification processes.

  16. Role of the Conserved Oligomeric Golgi Complex in the Abnormalities of Glycoprotein Processing in Breast Cancer Cells

    DTIC Science & Technology

    2005-05-01

    Award Number: DAMD17-03-1-0243. Title: Role of the Conserved Oligomeric Golgi Complex in the Abnormalities of Glycoprotein Processing in Breast Cancer Cells. Principal Investigator: Sergey N... Keywords: processing of glycoproteins, exocytosis, protein delivery systems, gene expression, western and northern blot analysis, immunofluorescence, gradient...

  17. Advancing beyond the system: telemedicine nurses' clinical reasoning using a computerised decision support system for patients with COPD - an ethnographic study.

    PubMed

    Barken, Tina Lien; Thygesen, Elin; Söderhamn, Ulrika

    2017-12-28

    Telemedicine is changing traditional nursing care, and entails nurses performing advanced and complex care within a new clinical environment, and monitoring patients at a distance. Telemedicine practice requires complex disease management, advocating that the nurses' reasoning and decision-making processes are supported. Computerised decision support systems are being used increasingly to assist reasoning and decision-making in different situations. However, little research has focused on the clinical reasoning of nurses using a computerised decision support system in a telemedicine setting. Therefore, the objective of the study is to explore the process of telemedicine nurses' clinical reasoning when using a computerised decision support system for the management of patients with chronic obstructive pulmonary disease. The factors influencing the reasoning and decision-making processes were investigated. In this ethnographic study, a combination of data collection methods, including participatory observations, the think-aloud technique, and a focus group interview was employed. Collected data were analysed using qualitative content analysis. When telemedicine nurses used a computerised decision support system for the management of patients with complex, unstable chronic obstructive pulmonary disease, two categories emerged: "the process of telemedicine nurses' reasoning to assess health change" and "the influence of the telemedicine setting on nurses' reasoning and decision-making processes". An overall theme, termed "advancing beyond the system", represented the connection between the reasoning processes and the telemedicine work and setting, where being familiar with the patient functioned as a foundation for the nurses' clinical reasoning process. 
In the telemedicine setting, when supported by a computerised decision support system, nurses' reasoning was enabled by the continuous flow of digital clinical data, regular video-mediated contact and shared decision-making with the patient. These factors fostered an in-depth knowledge of the patients and acted as a foundation for the nurses' reasoning process. Nurses' reasoning frequently advanced beyond the computerised decision support system recommendations. Future studies are warranted to develop more accurate algorithms, increase system maturity, and improve the integration of the digital clinical information with clinical experiences, to support telemedicine nurses' reasoning process.

  18. A Design Rationale Capture Using REMAP/MM

    DTIC Science & Technology

    1994-06-01

    company-wide down-sizing, the power company has determined that an automated service order processing system is the most economical solution. This new...service order processing system for a large power company can easily be...led. A system of this complexity would typically require three to five years

  19. A study of compositional verification based IMA integration method

    NASA Astrophysics Data System (ADS)

    Huang, Hui; Zhang, Guoquan; Xu, Wanmeng

    2018-03-01

    The rapid development of avionics is driving the adoption of integrated modular avionics (IMA) systems. While IMA improves avionics integration, it also increases the complexity of system test, so the IMA test method needs to be simplified. An IMA system provides a modular platform that runs multiple applications and shares processing resources among them. Compared with a federated avionics system, failures in an IMA system are difficult to isolate; the critical problem for IMA verification is therefore how to test resources shared by multiple applications. For a simple avionics system, traditional test methods can readily exercise the whole system, but a large, highly integrated avionics system is hard to test exhaustively. This paper therefore applies compositional-verification theory to IMA system test, reducing the number of test processes and improving efficiency, and consequently lowering the cost of IMA system integration.

  20. Best geoscience approach to complex systems in environment

    NASA Astrophysics Data System (ADS)

    Mezemate, Yacine; Tchiguirinskaia, Ioulia; Schertzer, Daniel

    2017-04-01

    The environment is a social issue of growing importance. Its complexity, both cross-disciplinary and multi-scale, has given rise to a large number of scientific and technological locks that complex-systems approaches can resolve. Significant challenges must be met to achieve an understanding of complex environmental systems. Their study should proceed in steps in which the use of data and models is crucial: exploration, observation and basic data acquisition; identification of correlations, patterns and mechanisms; modelling; model validation, implementation and prediction; and construction of a theory. Since e-learning has become a powerful tool for sharing knowledge and best practice, we use it to teach environmental complexity and systems. In this presentation we promote an e-learning course aimed at a broad audience (undergraduates, graduates, PhD students and young scientists) which gathers and organizes different pedagogical materials on complex systems and environmental studies. The course describes complex processes using numerous illustrations, examples and tests that make learning enjoyable. For the sake of simplicity, the course is divided into modules, and at the end of each module a set of exercises and program codes is provided for best practice. The graphical user interface (GUI), built with the open-source tool Opale Scenari, offers simple navigation through the modules. The course treats the complex systems found in the environment and their observables, highlighting in particular the extreme variability of these observables over a wide range of scales. Using the multifractal formalism in different applications (turbulence, precipitation, hydrology), we demonstrate how such extreme variability of geophysical and biological fields can be used to solve everyday (geo-)environmental challenges.
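
    The multifractal moment scaling mentioned above can be illustrated on a synthetic multiplicative cascade. Everything below is an illustrative sketch, not material from the course: the weights, depth, and number of realizations are arbitrary choices. For this binary construction, moments of the density scale as ⟨ε^q⟩ ~ λ^K(q) with K(q) = log2(mean(w^q)).

```python
import math
import random

random.seed(1)
W = (1.4, 0.6)  # each parent splits its density between two halves (mean 1)

def cascade_moments(q, levels=10, runs=200):
    """Average <eps^q> at every cascade level over many realizations."""
    sums = [0.0] * (levels + 1)
    for _ in range(runs):
        field = [1.0]
        for lev in range(1, levels + 1):
            nxt = []
            for v in field:
                a, b = random.sample(W, 2)  # one of each weight: mass conserved
                nxt.extend((v * a, v * b))
            field = nxt
            sums[lev] += sum(x ** q for x in field) / len(field)
    return [s / runs for s in sums[1:]]

m = cascade_moments(q=2)
# Slope of log2<eps^2> versus level estimates K(2); the theoretical value here
# is log2((1.4**2 + 0.6**2) / 2), about 0.214.
k2 = (math.log2(m[-1]) - math.log2(m[0])) / (len(m) - 1)
print(0.05 < k2 < 0.5)
```

    A positive K(2) is the signature of intermittency: the variability of the field grows with resolution, which is the behaviour the course demonstrates for turbulence, precipitation, and hydrological fields.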

  1. Development of structural model of adaptive training complex in ergatic systems for professional use

    NASA Astrophysics Data System (ADS)

    Obukhov, A. D.; Dedov, D. L.; Arkhipov, A. E.

    2018-03-01

    The article considers the structural model of the adaptive training complex (ATC), which reflects the interrelations between the hardware, software and mathematical model of the ATC and describes the processes in this subject area. A description of the main components of the software and hardware complex, their interaction, and their functioning within the common system is given. The article also briefly describes the mathematical models of personnel activity, the technical system and influences, whose interactions formalize the regularities of ATC functioning. Studying the main objects of training complexes and the connections between them will make practical implementation of ATC in ergatic systems for professional use possible.

  2. Video Analysis and Remote Digital Ethnography: Approaches to understanding user perspectives and processes involving healthcare information technology.

    PubMed

    Kushniruk, Andre W; Borycki, Elizabeth M

    2015-01-01

    Innovations in healthcare information systems promise to revolutionize and streamline healthcare processes worldwide. However, the complexity of these systems and the need to better understand issues related to human-computer interaction have slowed progress in this area. In this chapter the authors describe their work in using methods adapted from usability engineering, video ethnography and analysis of digital log files for improving our understanding of complex real-world healthcare interactions between humans and technology. The approaches taken are cost-effective and practical and can provide detailed ethnographic data on issues health professionals and consumers encounter while using systems as well as potential safety problems. The work is important in that it can be used in techno-anthropology to characterize complex user interactions with technologies and also to provide feedback into redesign and optimization of improved healthcare information systems.

  3. Can modular psychological concepts like affect and emotion be assigned to a distinct subset of regional neural circuits?. Comment on "The quartet theory of human emotions: An integrative and neurofunctional model" by S. Koelsch et al.

    NASA Astrophysics Data System (ADS)

    Fehr, Thorsten; Herrmann, Manfred

    2015-06-01

    The proposed Quartet Theory of Human Emotions by Koelsch and co-workers [11] adumbrates evidence from various scientific sources to integrate and assign the psychological concepts of 'affect' and 'emotion' to four brain circuits or to four neuronal core systems for affect-processing in the brain. The authors differentiate between affect and emotion and assign several facultative, or to say modular, psychological domains and principles of information processing, such as learning and memory, antecedents of affective activity, emotion satiation, cognitive complexity, subjective quality feelings, degree of conscious appraisal, to different affect systems. Furthermore, they relate orbito-frontal brain structures to moral affects as uniquely human, and the hippocampus to attachment-related affects. An additional feature of the theory describes 'emotional effector-systems' for motor-related processes (e.g., emotion-related actions), physiological arousal, attention and memory that are assumed to be cross-linked with the four proposed affect systems. Thus, higher principles of emotional information processing, but also modular affect-related issues, such as moral and attachment related affects, are thought to be handled by these four different physiological sub-systems that are on the other side assumed to be highly interwoven at both physiological and functional levels. The authors also state that the proposed sub-systems have many features in common, such as the selection and modulation of biological processes related to behaviour, perception, attention and memory. The latter aspect challenges an ongoing discussion about the mind-body problem: To which degree do the proposed sub-systems 'sufficiently' cover the processing of complex modular or facultative emotional/affective and/or cognitive phenomena? 
There are current models and scientific positions that almost completely reject the idea that modular psychological phenomena are handled by a distinct selection of regional brain systems or neural modules, but rather suggest highly complex and cross-linked neural networks individually shaped by lifelong learning and experience [e.g., 6,7,10,13]. This holds true in particular for complex emotional phenomena such as aggression or empathy in social interaction [8,13]. It thus remains questionable whether - beyond primary sensory and motor-processing - a small number of modular sub-systems sufficiently cover the organisation of specific phenomenological and social features of perception and behaviour [7,10].

  4. A Hospital Is Not Just a Factory, but a Complex Adaptive System-Implications for Perioperative Care.

    PubMed

    Mahajan, Aman; Islam, Salim D; Schwartz, Michael J; Cannesson, Maxime

    2017-07-01

    Many methods used to improve hospital and perioperative services productivity and quality of care have assumed that the hospital is essentially a factory, and therefore, that industrial engineering and manufacturing-derived redesign approaches such as Six Sigma and Lean can be applied to hospitals and perioperative services just as they have been applied in factories. However, a hospital is not merely a factory but also a complex adaptive system (CAS). The hospital CAS has many subsystems, with perioperative care being an important one for which concepts of factory redesign are frequently advocated. In this article, we argue that applying only factory approaches such as lean methodologies or process standardization to complex systems such as perioperative care could account for difficulties and/or failures in improving performance in care delivery. Within perioperative services, only noncomplex/low-variance surgical episodes are amenable to manufacturing-based redesign. On the other hand, complex surgery/high-variance cases and preoperative segmentation (the process of distinguishing between normal and complex cases) can be viewed as CAS-like. These systems tend to self-organize, often resist or react unpredictably to attempts at control, and therefore require application of CAS principles to modify system behavior. We describe 2 examples of perioperative redesign to illustrate the concepts outlined above. These examples present complementary and contrasting cases from 2 leading delivery systems. The Mayo Clinic example illustrates the application of manufacturing-based redesign principles to a factory-like (high-volume, low-risk, and mature practice) clinical program, while the Kaiser Permanente example illustrates the application of both manufacturing-based and self-organization-based approaches to programs and processes that are not factory-like but CAS-like. 
In this article, we describe how factory-like processes and CAS can coexist within a hospital and how self-organization-based approaches can be used to improve care delivery in many situations where manufacturing-based approaches may not be appropriate.

  5. Advances and Computational Tools towards Predictable Design in Biological Engineering

    PubMed Central

    2014-01-01

    The design process of complex systems in all the fields of engineering requires a set of quantitatively characterized components and a method to predict the output of systems composed by such elements. This strategy relies on the modularity of the used components or the prediction of their context-dependent behaviour, when parts functioning depends on the specific context. Mathematical models usually support the whole process by guiding the selection of parts and by predicting the output of interconnected systems. Such bottom-up design process cannot be trivially adopted for biological systems engineering, since parts function is hard to predict when components are reused in different contexts. This issue and the intrinsic complexity of living systems limit the capability of synthetic biologists to predict the quantitative behaviour of biological systems. The high potential of synthetic biology strongly depends on the capability of mastering this issue. This review discusses the predictability issues of basic biological parts (promoters, ribosome binding sites, coding sequences, transcriptional terminators, and plasmids) when used to engineer simple and complex gene expression systems in Escherichia coli. A comparison between bottom-up and trial-and-error approaches is performed for all the discussed elements and mathematical models supporting the prediction of parts behaviour are illustrated. PMID:25161694

  6. A performance improvement case study in aircraft maintenance and its implications for hazard identification.

    PubMed

    Ward, Marie; McDonald, Nick; Morrison, Rabea; Gaynor, Des; Nugent, Tony

    2010-02-01

    Aircraft maintenance is a highly regulated, safety critical, complex and competitive industry. There is a need to develop innovative solutions to address process efficiency without compromising safety and quality. This paper presents the case that in order to improve a highly complex system such as aircraft maintenance, it is necessary to develop a comprehensive and ecologically valid model of the operational system, which represents not just what is meant to happen, but what normally happens. This model then provides the backdrop against which to change or improve the system. A performance report, the Blocker Report, specific to aircraft maintenance and related to the model was developed gathering data on anything that 'blocks' task or check performance. A Blocker Resolution Process was designed to resolve blockers and improve the current check system. Significant results were obtained for the company in the first trial and implications for safety management systems and hazard identification are discussed. Statement of Relevance: Aircraft maintenance is a safety critical, complex, competitive industry with a need to develop innovative solutions to address process and safety efficiency. This research addresses this through the development of a comprehensive and ecologically valid model of the system linked with a performance reporting and resolution system.

  7. Understanding the complexity of redesigning care around the clinical microsystem.

    PubMed

    Barach, P; Johnson, J K

    2006-12-01

    The microsystem is an organizing design construct in which social systems cut across traditional discipline boundaries. Because of its interdisciplinary focus, the clinical microsystem provides a conceptual and practical framework for simplifying complex organizations that deliver care. It also provides an important opportunity for organizational learning. Process mapping and microworld simulation may be especially useful for redesigning care around the microsystem concept. Process mapping, in which the core processes of the microsystem are delineated and assessed from the perspective of how the individual interacts with the system, is an important element of the continuous learning cycle of the microsystem and the healthcare organization. Microworld simulations are interactive computer based models that can be used as an experimental platform to test basic questions about decision making misperceptions, cause-effect inferences, and learning within the clinical microsystem. Together these tools offer the user and organization the ability to understand the complexity of healthcare systems and to facilitate the redesign of optimal outcomes.

  8. Experimental phase synchronization detection in non-phase coherent chaotic systems by using the discrete complex wavelet approach

    NASA Astrophysics Data System (ADS)

    Ferreira, Maria Teodora; Follmann, Rosangela; Domingues, Margarete O.; Macau, Elbert E. N.; Kiss, István Z.

    2017-08-01

    Phase synchronization may emerge from mutually interacting non-linear oscillators, even under weak coupling, when phase differences are bounded, while amplitudes remain uncorrelated. However, the detection of this phenomenon can be a challenging problem to tackle. In this work, we apply the Discrete Complex Wavelet Approach (DCWA) for phase assignment, considering signals from coupled chaotic systems and experimental data. The DCWA is based on the Dual-Tree Complex Wavelet Transform (DT-CWT), which is a discrete transformation. Due to its multi-scale properties in the context of phase characterization, it is possible to obtain very good results from scalar time series, even with non-phase-coherent chaotic systems without state space reconstruction or pre-processing. The method correctly predicts the phase synchronization for a chemical experiment with three locally coupled, non-phase-coherent chaotic processes. The impact of different time-scales is demonstrated on the synchronization process that outlines the advantages of DCWA for analysis of experimental data.
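
    The paper's DCWA itself relies on the dual-tree complex wavelet transform, but the underlying criterion it tests, phase differences staying bounded under coupling while amplitudes remain uncorrelated, can be illustrated with two generic coupled phase oscillators. This is a standard toy model, not the paper's method, and all parameters are invented.

```python
import math

# Two coupled phase oscillators:
#   dphi_i/dt = w_i + K * sin(phi_j - phi_i)
# Phase synchronization (locking) means the phase difference stays bounded
# even though the natural frequencies w_1 and w_2 differ.

def simulate(w1, w2, K, steps=20000, dt=0.001):
    """Euler-integrate the pair and return the phase-difference history."""
    p1, p2 = 0.0, 1.0
    diffs = []
    for _ in range(steps):
        d1 = w1 + K * math.sin(p2 - p1)
        d2 = w2 + K * math.sin(p1 - p2)
        p1 += d1 * dt
        p2 += d2 * dt
        diffs.append(p1 - p2)
    return diffs

# Coupling strong enough to lock: |w1 - w2| < 2 * K.
locked = simulate(w1=1.0, w2=1.2, K=0.5)
# Coupling too weak: the phase difference drifts without bound.
drifting = simulate(w1=1.0, w2=1.2, K=0.01)

print(max(abs(d) for d in locked) < 3.0)   # bounded difference
print(abs(drifting[-1]) > 3.0)             # unbounded drift
```

    Detecting this boundedness from a measured scalar time series is exactly where a phase-assignment method such as the DCWA comes in, especially for non-phase-coherent signals where a simple rotation angle is not available.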

  9. Cognitive process modelling of controllers in en route air traffic control.

    PubMed

    Inoue, Satoru; Furuta, Kazuo; Nakata, Keiichi; Kanno, Taro; Aoyama, Hisae; Brown, Mark

    2012-01-01

    In recent years, various efforts have been made in air traffic control (ATC) to maintain traffic safety and efficiency in the face of increasing air traffic demands. ATC is a complex process that depends to a large degree on human capabilities, and so understanding how controllers carry out their tasks is an important issue in the design and development of ATC systems. In particular, the human factor is considered to be a serious problem in ATC safety and has been identified as a causal factor in both major and minor incidents. There is, therefore, a need to analyse the mechanisms by which errors occur due to complex factors and to develop systems that can deal with these errors. From the cognitive process perspective, it is essential that system developers have an understanding of the more complex working processes that involve the cooperative work of multiple controllers. Distributed cognition is a methodological framework for analysing cognitive processes that span multiple actors mediated by technology. In this research, we attempt to analyse and model interactions that take place in en route ATC systems based on distributed cognition. We examine the functional problems in an ATC system from a human factors perspective, and conclude by identifying certain measures by which to address these problems. This research focuses on the analysis of air traffic controllers' tasks for en route ATC and modelling controllers' cognitive processes. This research focuses on an experimental study to gain a better understanding of controllers' cognitive processes in air traffic control. We conducted ethnographic observations and then analysed the data to develop a model of controllers' cognitive process. This analysis revealed that strategic routines are applicable to decision making.

  10. Image-Processing Software For A Hypercube Computer

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Mazer, Alan S.; Groom, Steven L.; Williams, Winifred I.

    1992-01-01

    Concurrent Image Processing Executive (CIPE) is software system intended to develop and use image-processing application programs on concurrent computing environment. Designed to shield programmer from complexities of concurrent-system architecture, it provides interactive image-processing environment for end user. CIPE utilizes architectural characteristics of particular concurrent system to maximize efficiency while preserving architectural independence from user and programmer. CIPE runs on Mark-IIIfp 8-node hypercube computer and associated SUN-4 host computer.

  11. Complexity in Nature and Society: Complexity Management in the Age of Globalization

    NASA Astrophysics Data System (ADS)

    Mainzer, Klaus

    The theory of nonlinear complex systems has become a proven problem-solving approach in the natural sciences from cosmic and quantum systems to cellular organisms and the brain. Even in modern engineering science self-organizing systems are developed to manage complex networks and processes. It is now recognized that many of our ecological, social, economic, and political problems are also of a global, complex, and nonlinear nature. What are the laws of sociodynamics? Is there a socio-engineering of nonlinear problem solving? What can we learn from nonlinear dynamics for complexity management in social, economic, financial and political systems? Is self-organization an acceptable strategy to handle the challenges of complexity in firms, institutions and other organizations? It is a main thesis of the talk that nature and society are basically governed by nonlinear and complex information dynamics. How computational is sociodynamics? What can we hope for in social, economic and political problem solving in the age of globalization?

  12. Social network supported process recommender system.

    PubMed

    Ye, Yanming; Yin, Jianwei; Xu, Yueshen

    2014-01-01

    Process recommendation technologies have gained more and more attention in the field of intelligent business process modeling as aids to process modeling. However, most existing technologies use only process structure analysis and do not take the social features of processes into account, even though process modeling is complex and comprehensive in most situations. This paper studies the feasibility of applying social network research technologies to process recommendation and builds a social network system of processes based on feature similarities. Three process matching degree measurements are then presented, and the system implementation is discussed subsequently. Finally, experimental evaluations and future work are presented.
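
    A minimal sketch of the similarity-based recommendation idea: score stored process models against a query by feature overlap and return the closest ones. The process names, feature sets, and the use of Jaccard similarity are illustrative assumptions; the paper's three matching-degree measurements are not specified here.

```python
# Rank stored business processes by feature-set similarity to a query process.

def jaccard(a, b):
    """Jaccard similarity of two feature sets: |A & B| / |A | B|."""
    return len(a & b) / len(a | b)

# Hypothetical process models described by sets of activity features.
processes = {
    "order_to_cash":  {"invoice", "ship", "payment", "credit_check"},
    "procure_to_pay": {"invoice", "payment", "purchase_order", "receive"},
    "claim_handling": {"register", "assess", "payment", "notify"},
}

def recommend(query_features, k=2):
    """Return the k stored processes most similar to the query."""
    scored = [(jaccard(query_features, f), name) for name, f in processes.items()]
    scored.sort(reverse=True)
    return [name for _, name in scored[:k]]

top = recommend({"invoice", "payment", "ship"})
print(top)  # most similar processes first
```

    In the paper's setting, the pairwise similarity scores would additionally define the edges of the social network of processes, so recommendation becomes a neighborhood query in that network.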

  13. Supporting the Knowledge-to-Action Process: A Systems-Thinking Approach

    ERIC Educational Resources Information Center

    Cherney, Adrian; Head, Brian

    2011-01-01

    The processes for moving research-based knowledge to the domains of action in social policy and professional practice are complex. Several disciplinary research traditions have illuminated several key aspects of these processes. A more holistic approach, drawing on systems thinking, has also been outlined and advocated by recent contributors to…

  14. The Capabilities of Chaos and Complexity

    PubMed Central

    Abel, David L.

    2009-01-01

    To what degree could chaos and complexity have organized a Peptide or RNA World of crude yet necessarily integrated protometabolism? How far could such protolife evolve in the absence of a heritable linear digital symbol system that could mutate, instruct, regulate, optimize and maintain metabolic homeostasis? To address these questions, chaos, complexity, self-ordered states, and organization must all be carefully defined and distinguished. In addition their cause-and-effect relationships and mechanisms of action must be delineated. Are there any formal (non physical, abstract, conceptual, algorithmic) components to chaos, complexity, self-ordering and organization, or are they entirely physicodynamic (physical, mass/energy interaction alone)? Chaos and complexity can produce some fascinating self-ordered phenomena. But can spontaneous chaos and complexity steer events and processes toward pragmatic benefit, select function over non function, optimize algorithms, integrate circuits, produce computational halting, organize processes into formal systems, control and regulate existing systems toward greater efficiency? The question is pursued of whether there might be some yet-to-be discovered new law of biology that will elucidate the derivation of prescriptive information and control. “System” will be rigorously defined. Can a low-informational rapid succession of Prigogine’s dissipative structures self-order into bona fide organization? PMID:19333445

  15. Process mining is an underutilized clinical research tool in transfusion medicine.

    PubMed

    Quinn, Jason G; Conrad, David M; Cheng, Calvino K

    2017-03-01

    To understand inventory performance, transfusion services commonly use key performance indicators (KPIs) as summary descriptors of inventory efficiency that are graphed, trended, and used to benchmark institutions. Here, we summarize current limitations in KPI-based evaluation of blood bank inventory efficiency and propose process mining as an ideal methodology for application to inventory management research to improve inventory flows and performance. The transit of a blood product from inventory receipt to final disposition is complex and relates to many internal and external influences, and KPIs may be inadequate to fully understand the complexity of the blood supply chain and how units interact with its processes. Process mining lends itself well to analysis of blood bank inventories, and modern laboratory information systems can track nearly all of the complex processes that occur in the blood bank. Process mining is an analytical tool already used in other industries and can be applied to blood bank inventory management and research through laboratory information systems data using commercial applications. Although the current understanding of real blood bank inventories is value-centric through KPIs, it potentially can be understood from a process-centric lens using process mining. © 2017 AABB.
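
    The core step that process-mining tools automate, extracting a directly-follows relation from an event log, can be sketched in a few lines. The blood-bank activities and traces below are invented for illustration.

```python
from collections import Counter

# Each trace is the ordered activity history of one blood product,
# as a laboratory information system would record it.
log = [
    ["receive", "test", "store", "crossmatch", "issue", "transfuse"],
    ["receive", "test", "store", "crossmatch", "issue", "return", "store"],
    ["receive", "test", "discard"],
]

def directly_follows(traces):
    """Count how often activity a is immediately followed by activity b."""
    pairs = Counter()
    for trace in traces:
        for a, b in zip(trace, trace[1:]):
            pairs[(a, b)] += 1
    return pairs

dfg = directly_follows(log)
print(dfg[("receive", "test")])  # every unit is tested after receipt
print(dfg[("test", "store")])    # most, but not all, units reach inventory
```

    The resulting counts form a directly-follows graph, the starting point for discovery algorithms that reconstruct the actual (rather than documented) inventory process, which is the process-centric view the article advocates over KPI summaries.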

  16. Complexity leadership: a healthcare imperative.

    PubMed

    Weberg, Dan

    2012-01-01

    The healthcare system is plagued with increasing cost and poor quality outcomes. A major contributing factor for these issues is that outdated leadership practices, such as leader-centricity, linear thinking, and poor readiness for innovation, are being used in healthcare organizations. Complexity leadership theory provides a new framework with which healthcare leaders may practice leadership. Complexity leadership theory conceptualizes leadership as a continual process that stems from collaboration, complex systems thinking, and innovation mindsets. Compared to transactional and transformational leadership concepts, complexity leadership practices hold promise to improve cost and quality in health care. © 2012 Wiley Periodicals, Inc.

  17. Data flow modeling techniques

    NASA Technical Reports Server (NTRS)

    Kavi, K. M.

    1984-01-01

    There have been a number of simulation packages developed for the purpose of designing, testing and validating computer systems, digital systems and software systems. Complex analytical tools based on Markov and semi-Markov processes have been designed to estimate the reliability and performance of simulated systems. Petri nets have received wide acceptance for modeling complex and highly parallel computers. In this research, data flow models for computer systems are investigated. Data flow models can be used to simulate both software and hardware in a uniform manner. Data flow simulation techniques provide the computer systems designer with a CAD environment which enables highly parallel complex systems to be defined, evaluated at all levels and finally implemented in either hardware or software. Inherent in the data flow concept is the hierarchical handling of complex systems. In this paper we describe how data flow can be used to model computer systems.

  18. The Emergence of Temporal Structures in Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Mainzer, Klaus

    2010-10-01

    Dynamical systems in classical, relativistic and quantum physics are ruled by laws with time reversibility. Complex dynamical systems with time-irreversibility are known from thermodynamics, biological evolution, growth of organisms, brain research, aging of people, and historical processes in social sciences. Complex systems are systems that comprise many interacting parts with the ability to generate a new quality of macroscopic collective behavior, the manifestations of which are the spontaneous emergence of distinctive temporal, spatial or functional structures. But, emergence is no mystery. In a general meaning, the emergence of macroscopic features results from the nonlinear interactions of the elements in a complex system. Mathematically, the emergence of irreversible structures is modelled by phase transitions in non-equilibrium dynamics of complex systems. These methods have been modified even for chemical, biological, economic and societal applications (e.g., econophysics). Emergence of irreversible structures can also be simulated by computational systems. The question arises how the emergence of irreversible structures is compatible with the reversibility of fundamental physical laws. It is argued that, according to quantum cosmology, cosmic evolution leads from symmetry to complexity of irreversible structures by symmetry breaking and phase transitions. Thus, arrows of time and aging processes are not only subjective experiences or even contradictions to natural laws, but they can be explained by quantum cosmology and the nonlinear dynamics of complex systems. Human experiences and religious concepts of arrows of time are considered in a modern scientific framework. Platonic ideas of eternity are at least understandable with respect to mathematical invariance and symmetry of physical laws. Heraclitus’s world of change and dynamics can be mapped onto our daily real-life experiences of arrows of time.

  19. Deployment Process, Mechanization, and Testing for the Mars Exploration Rovers

    NASA Technical Reports Server (NTRS)

    Iskenderian, Ted

    2004-01-01

    NASA's Mars Exploration Rover (MER) robotic prospectors were produced in an environment of unusually challenging schedule, volume, and mass restrictions. The technical challenges pushed the system's design towards extensive integration of function, which resulted in complex system engineering issues. One example of the system's integrated complexity can be found in the deployment process for the rover. Part of this process, rover "standup", is outlined in this paper. Particular attention is given to the role and design of the Rover Lift Mechanism (RLM). Analysis methods are presented and compared to test results. It is shown that because prudent design principles were followed, a robust mechanism was created that minimized the duration of integration and test, and enabled recovery without perturbing related systems when reasonably foreseeable problems did occur. Examples of avoidable, unnecessary difficulty are also presented.

  20. Smart signal processing for an evolving electric grid

    NASA Astrophysics Data System (ADS)

    Silva, Leandro Rodrigues Manso; Duque, Calos Augusto; Ribeiro, Paulo F.

    2015-12-01

    Electric grids are interconnected complex systems consisting of generation, transmission, distribution, and active loads, recently called prosumers as they produce and consume electric energy. Additionally, these encompass a vast array of equipment such as machines, power transformers, capacitor banks, power electronic devices, motors, etc. that are continuously evolving in their demand characteristics. Given these conditions, signal processing is becoming an essential assessment tool to enable the engineer and researcher to understand, plan, design, and operate the complex and smart electronic grid of the future. This paper focuses on recent developments associated with signal processing applied to power system analysis in terms of characterization and diagnostics. The following techniques are reviewed and their characteristics and applications discussed: active power system monitoring, sparse representation of power system signal, real-time resampling, and time-frequency (i.e., wavelets) applied to power fluctuations.

  1. Optimization of controlled processes in combined-cycle plant (new developments and researches)

    NASA Astrophysics Data System (ADS)

    Tverskoy, Yu S.; Muravev, I. K.

    2017-11-01

    All modern complex technical systems, including the power units of thermal (TPP) and nuclear power plants, operate within the system-forming structure of a multifunctional APCS. Advances in the mathematical support of modern APCS make it possible to extend automation to the real-time solution of complex optimization problems for the heat- and mass-exchange processes of the equipment. Efficient control of a binary power unit requires solving at least three problems jointly. The first problem concerns the physics of combined-cycle technologies. The second is determined by the sensitivity of CCGT operation to changes in regime and climatic factors. The third requires a precise description of the vector of controlled coordinates of a complex technological object. To obtain a joint solution of this set of interconnected problems, the methodology of generalized thermodynamic analysis, methods of automatic control theory, and mathematical modeling are used. The present report shows the results of new developments and studies, which improve the principles of process control and the structural synthesis of automatic control systems for power units with combined-cycle plants, providing attainable technical and economic efficiency and operational reliability of equipment.

  2. Bridging complexity theory and resilience to develop surge capacity in health systems.

    PubMed

    Therrien, Marie-Christine; Normandin, Julie-Maude; Denis, Jean-Louis

    2017-03-20

    Purpose Health systems are periodically confronted by crises - think of Severe Acute Respiratory Syndrome, H1N1, and Ebola - during which they are called upon to manage exceptional situations without interrupting essential services to the population. The ability to accomplish this dual mandate is at the heart of resilience strategies, which in healthcare systems involve developing surge capacity to manage a sudden influx of patients. The paper aims to discuss these issues. Design/methodology/approach This paper relates insights from resilience research to the four "S" of surge capacity (staff, stuff, structures and systems) and proposes a framework based on complexity theory to better understand and assess resilience factors that enable the development of surge capacity in complex health systems. Findings Detailed and dynamic complexities manifest in different challenges during a crisis. Resilience factors are classified according to these types of complexity and along their temporal dimensions: proactive factors that improve preparedness to confront both usual and exceptional requirements, and passive factors that enable response to unexpected demands as they arise during a crisis. The framework is completed by further categorizing resilience factors according to their stabilizing or destabilizing impact, drawing on feedback processes described in complexity theory. Favorable order resilience factors create consistency and act as stabilizing forces in systems, while favorable disorder factors such as diversity and complementarity act as destabilizing forces. Originality/value The framework suggests a balanced and innovative process to integrate these factors in a pragmatic approach built around the four "S" of surge capacity to increase health system resilience.

  3. Complex adaptive systems and their relevance for nursing: An evolutionary concept analysis.

    PubMed

    Notarnicola, Ippolito; Petrucci, Cristina; De Jesus Barbosa, Maria Rosimar; Giorgi, Fabio; Stievano, Alessandro; Rocco, Gennaro; Lancia, Loreto

    2017-06-01

    This study aimed to analyse the concept of "complex adaptive systems." The construct is still nebulous in the literature, and a further explanation of the idea is needed to have a shared knowledge of it. A concept analysis was conducted utilizing Rodgers' evolutionary method. The bibliographic search covered publications from 2005 to 2015. The search was conducted in PubMed©, CINAHL© (EBSCO host©), Scopus©, Web of Science©, and Academic Search Premier©. Retrieved papers were critically analysed to explore the attributes, antecedents, and consequences of the concept. Moreover, surrogates, related terms, and a pattern recognition scheme were identified. The concept analysis showed that complex systems are adaptive and have the ability to process information. They can adapt to the environment and consequently evolve. Nursing is a complex adaptive system, and the nursing profession in practice exhibits complex adaptive system characteristics. Complexity science, through complex adaptive systems, provides new ways of seeing and understanding the mechanisms that underpin the nursing profession. © 2017 John Wiley & Sons Australia, Ltd.

  4. Highly Manufacturable Deep (Sub-Millimeter) Etching Enabled High Aspect Ratio Complex Geometry Lego-Like Silicon Electronics.

    PubMed

    Ghoneim, Mohamed Tarek; Hussain, Muhammad Mustafa

    2017-04-01

    A highly manufacturable deep reactive ion etching process, based on a hybrid soft/hard mask technology, demonstrates the formation of high-aspect-ratio, complex-geometry, Lego-like silicon electronics, enabling free-form (physically flexible, stretchable, and reconfigurable) electronic systems. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Low complexity Reed-Solomon-based low-density parity-check design for software defined optical transmission system based on adaptive puncturing decoding algorithm

    NASA Astrophysics Data System (ADS)

    Pan, Xiaolong; Liu, Bo; Zheng, Jianglong; Tian, Qinghua

    2016-08-01

    We propose and demonstrate a low complexity Reed-Solomon-based low-density parity-check (RS-LDPC) code with an adaptive puncturing decoding algorithm for an elastic optical transmission system. Partially received codes and the corresponding columns of the parity-check matrix can be punctured during decoding, adapting the parity-check matrix to reduce the calculation complexity. The results show that the complexity of the proposed decoding algorithm is reduced by 30% compared with the regular RS-LDPC system. The optimized code rate of the RS-LDPC code can be obtained after five iterations.

  6. Fund Accounting Is Dead: Let This Complex System Rest in Peace.

    ERIC Educational Resources Information Center

    Coville, Joanne

    1995-01-01

    It is argued that colleges and universities have created extremely complex and convoluted accounting/reporting systems using fund accounting. Recent changes in accounting standards should be seen as an opportunity to streamline many of the processes that have been designed to support funds, allowing introduction of other approaches. (MSE)

  7. "Unhelpfully Complex and Exceedingly Opaque": Australia's School Funding System

    ERIC Educational Resources Information Center

    Dowling, Andrew

    2008-01-01

    Australia's system of school funding is notoriously complex and difficult to understand. This article shines some light on this issue by describing clearly the processes of school funding that currently exist in Australia. It describes the steps taken by federal and state governments to provide over $30 billion each year to government and…

  8. Computational Nonlinear Morphology with Emphasis on Semitic Languages. Studies in Natural Language Processing.

    ERIC Educational Resources Information Center

    Kiraz, George Anton

    This book presents a tractable computational model that can cope with complex morphological operations, especially in Semitic languages, and less complex morphological systems present in Western languages. It outlines a new generalized regular rewrite rule system that uses multiple finite-state automata to cater to root-and-pattern morphology,…

  9. Self-organization and complexity in historical landscape patterns

    Treesearch

    Janine Bolliger; Julien C. Sprott; David J. Mladenoff

    2003-01-01

    Self-organization describes the evolution process of complex structures where systems emerge spontaneously, driven internally by variations of the system itself. Self-organization to the critical state is manifested by scale-free behavior across many orders of magnitude (Bak et al. 1987, Bak 1996, Sole et al. 1999). Spatial scale-free behavior implies fractal...

  10. Embracing Connectedness and Change: A Complex Dynamic Systems Perspective for Applied Linguistic Research

    ERIC Educational Resources Information Center

    Cameron, Lynne

    2015-01-01

    Complex dynamic systems (CDS) theory offers a powerful metaphorical model of applied linguistic processes, allowing holistic descriptions of situated phenomena, and addressing the connectedness and change that often characterise issues in our field. A recent study of Kenyan conflict transformation illustrates application of a CDS perspective. Key…

  11. If Language Is a Complex Adaptive System, What Is Language Assessment?

    ERIC Educational Resources Information Center

    Mislevy, Robert J.; Yin, Chengbin

    2009-01-01

    Individuals' use of language in contexts emerges from second-to-second processes of activating and integrating traces of past experiences--an interactionist view compatible with the study of language as a complex adaptive system but quite different from the trait-based framework through which measurement specialists investigate validity, establish…

  12. Focusing on the Complexity of Emotion Issues in Academic Learning: A Dynamical Component Systems Approach

    ERIC Educational Resources Information Center

    Eynde, Peter Op 't; Turner, Jeannine E.

    2006-01-01

    Understanding the interrelations among students' cognitive, emotional, motivational, and volitional processes is an emerging focus in educational psychology. A dynamical, component systems theory of emotions is presented as a promising framework to further unravel these complex interrelations. This framework considers emotions to be a process…

  13. Spintronic characteristics of self-assembled neurotransmitter acetylcholine molecular complexes enable quantum information processing in neural networks and brain

    NASA Astrophysics Data System (ADS)

    Tamulis, Arvydas; Majauskaite, Kristina; Kairys, Visvaldas; Zborowski, Krzysztof; Adhikari, Kapil; Krisciukaitis, Sarunas

    2016-09-01

    Implementation of liquid state quantum information processing based on spatially localized electronic spin in the stable neutral molecular radical of the neurotransmitter acetylcholine (ACh) is discussed. Using DFT quantum calculations, we proved that this molecule possesses a stable localized electron spin, which may represent a qubit in quantum information processing. The necessary operating conditions for the ACh molecule are formulated for self-assembled dimers and more complex systems. The main quantum mechanical result of this paper is that the proposed neurotransmitter ACh systems include the use of quantum molecular spintronics arrays to control neurotransmission in neural networks.

  14. Translations on USSR Science and Technology, Biomedical and Behavioral Sciences, Number 15

    DTIC Science & Technology

    1977-11-16

    processed. By applying systems theory to synthesis of complex man-machine systems we form ergatic organisms which not only have external and internal...without exception (and this is extremely important to emphasize) as a complex , integral formation, which through various traditions has acquired a...and outputs of the whole, which has a complex internal organization and structure, which we can no longer ignore in our analysis. Thus analysis and

  15. Complex Permittivity of Planar Building Materials Measured With an Ultra-Wideband Free-Field Antenna Measurement System.

    PubMed

    Davis, Ben; Grosvenor, Chriss; Johnk, Robert; Novotny, David; Baker-Jarvis, James; Janezic, Michael

    2007-01-01

    Building materials are often incorporated into complex, multilayer macrostructures that are simply not amenable to measurements using coax or waveguide sample holders. In response to this, we developed an ultra-wideband (UWB) free-field measurement system. This measurement system uses a ground-plane-based setup and two TEM half-horn antennas to transmit and receive the RF signal. The material samples are placed between the antennas, and reflection and transmission measurements are made. Digital signal processing techniques are then applied to minimize environmental and systematic effects. The processed data are compared to a plane-wave model to extract the material properties with optimization software based on genetic algorithms.

  16. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  17. Reliable Real-time Calculation of Heart-rate Complexity in Critically Ill Patients Using Multiple Noisy Waveform Sources

    DTIC Science & Technology

    2014-01-01

    Index terms: systems; machine learning; automatic data processing. Heart-rate complexity (HRC) is a method of quantifying the amount of complex... Batchinsky AI, Skinner J, Necsoiu C, et al. New measures of heart-rate complexity: effect of chest trauma and hemorrhage. J Trauma. 2010;68:1178-85.

  18. Identifying Complex Dynamics in Social Systems: A New Methodological Approach Applied to Study School Segregation

    ERIC Educational Resources Information Center

    Spaiser, Viktoria; Hedström, Peter; Ranganathan, Shyam; Jansson, Kim; Nordvik, Monica K.; Sumpter, David J. T.

    2018-01-01

    It is widely recognized that segregation processes are often the result of complex nonlinear dynamics. Empirical analyses of complex dynamics are however rare, because there is a lack of appropriate empirical modeling techniques that are capable of capturing complex patterns and nonlinearities. At the same time, we know that many social phenomena…

  19. Behavior of the gypsy moth life system model and development of synoptic model formulations

    Treesearch

    J. J. Colbert; Xu Rumei

    1991-01-01

    Aims of the research: The gypsy moth life system model (GMLSM) is a complex model which incorporates numerous components (both biotic and abiotic) and ecological processes. It is a detailed simulation model which has much biological reality. However, it has not yet been tested with life system data. For such complex models, evaluation and testing cannot be adequately...

  20. The N2-P3 complex of the evoked potential and human performance

    NASA Technical Reports Server (NTRS)

    Odonnell, Brian F.; Cohen, Ronald A.

    1988-01-01

    The N2-P3 complex and other endogenous components of human evoked potential provide a set of tools for the investigation of human perceptual and cognitive processes. These multidimensional measures of central nervous system bioelectrical activity respond to a variety of environmental and internal factors which have been experimentally characterized. Their application to the analysis of human performance in naturalistic task environments is just beginning. Converging evidence suggests that the N2-P3 complex reflects processes of stimulus evaluation, perceptual resource allocation, and decision making that proceed in parallel, rather than in series, with response generation. Utilization of these EP components may provide insights into the central nervous system mechanisms modulating task performance unavailable from behavioral measures alone. The sensitivity of the N2-P3 complex to neuropathology, psychopathology, and pharmacological manipulation suggests that these components might provide sensitive markers for the effects of environmental stressors on the human central nervous system.

  1. Software control and system configuration management: A systems-wide approach

    NASA Technical Reports Server (NTRS)

    Petersen, K. L.; Flores, C., Jr.

    1984-01-01

    A comprehensive software control and system configuration management process for flight-crucial digital control systems of advanced aircraft has been developed and refined to ensure efficient flight system development and safe flight operations. Because of the highly complex interactions among the hardware, software, and system elements of state-of-the-art digital flight control system designs, a systems-wide approach to configuration control and management has been used. Specific procedures are implemented to govern discrepancy reporting and reconciliation, software and hardware change control, systems verification and validation testing, and formal documentation requirements. An active and knowledgeable configuration control board reviews and approves all flight system configuration modifications and revalidation tests. This flexible process has proved effective during the development and flight testing of several research aircraft and remotely piloted research vehicles with digital flight control systems that ranged from relatively simple to highly complex, integrated mechanizations.

  2. Integrating Wraparound into a Schoolwide System of Positive Behavior Supports

    ERIC Educational Resources Information Center

    Eber, Lucille; Hyde, Kelly; Suter, Jesse C.

    2011-01-01

    We describe the structure for implementation of the wraparound process within a multi-tiered system of school wide positive behavior support (SWPBS) to address the needs of the 1-5% of students with complex emotional/behavioral challenges. The installation of prerequisite system features that, based on a 3 year demonstration process, we consider…

  3. State Analysis: A Control Architecture View of Systems Engineering

    NASA Technical Reports Server (NTRS)

    Rasmussen, Robert D.

    2005-01-01

    A viewgraph presentation on the state analysis process is shown. The topics include: 1) Issues with growing complexity; 2) Limits of common practice; 3) Exploiting a control point of view; 4) A glimpse at the State Analysis process; 5) Synergy with model-based systems engineering; and 6) Bridging the systems to software gap.

  4. Effective Software Engineering Leadership for Development Programs

    ERIC Educational Resources Information Center

    Cagle West, Marsha

    2010-01-01

    Software is a critical component of systems ranging from simple consumer appliances to complex health, nuclear, and flight control systems. The development of quality, reliable, and effective software solutions requires the incorporation of effective software engineering processes and leadership. Processes, approaches, and methodologies for…

  5. The Caenorhabditis elegans Excretory System: A Model for Tubulogenesis, Cell Fate Specification, and Plasticity

    PubMed Central

    Sundaram, Meera V.; Buechner, Matthew

    2016-01-01

    The excretory system of the nematode Caenorhabditis elegans is a superb model of tubular organogenesis involving a minimum of cells. The system consists of just three unicellular tubes (canal, duct, and pore), a secretory gland, and two associated neurons. Just as in more complex organs, cells of the excretory system must first adopt specific identities and then coordinate diverse processes to form tubes of appropriate topology, shape, connectivity, and physiological function. The unicellular topology of excretory tubes, their varied and sometimes complex shapes, and the dynamic reprogramming of cell identity and remodeling of tube connectivity that occur during larval development are particularly fascinating features of this organ. The physiological roles of the excretory system in osmoregulation and other aspects of the animal’s life cycle are only beginning to be explored. The cellular mechanisms and molecular pathways used to build and shape excretory tubes appear similar to those used in both unicellular and multicellular tubes in more complex organs, such as the vertebrate vascular system and kidney, making this simple organ system a useful model for understanding disease processes. PMID:27183565

  6. A Decision Tool that Combines Discrete Event Software Process Models with System Dynamics Pieces for Software Development Cost Estimation and Analysis

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn Barrett; Malone, Linda

    2007-01-01

    The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as human behavior because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics that utilizes continuous simulation. Each has unique strengths and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
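    The combination the paper describes can be illustrated with a deliberately tiny, deterministic sketch (all parameter names and values here are invented for illustration, not the paper's model): a continuous system-dynamics state (productivity eroded by backlog) is integrated with Euler steps, while task completions fire as discrete events whenever enough work has accrued:

    ```python
    def simulate(tasks=100, base_rate=2.0, fatigue_gain=0.002, dt=1.0):
        """Hybrid sketch: continuous 'productivity' (system-dynamics side)
        drives discrete task-completion events (discrete-event side)."""
        productivity = 1.0   # continuous state, eroded by schedule pressure
        progress = 0.0       # fractional work accrued toward the next event
        done, t = 0, 0.0
        events = []          # completion times of the discrete events
        while done < tasks:
            backlog = tasks - done
            # Continuous update (Euler step): productivity decays with backlog.
            productivity = max(0.2, productivity - fatigue_gain * backlog * dt)
            progress += base_rate * productivity * dt
            # Fire discrete completion events for each whole task finished.
            while progress >= 1.0 and done < tasks:
                progress -= 1.0
                done += 1
                events.append(t)
            t += dt
        return t, events

    duration, events = simulate()
    print(duration, len(events))
    ```

    The point of coupling the two formalisms, as the paper argues, is visible even in this toy: the event schedule is not fixed in advance but shifts as the continuous workforce state evolves.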

  7. Adaptive Correction from Virtually Complex Dynamic Libraries: The Role of Noncovalent Interactions in Structural Selection and Folding.

    PubMed

    Lafuente, Maria; Atcher, Joan; Solà, Jordi; Alfonso, Ignacio

    2015-11-16

    The hierarchical self-assembling of complex molecular systems is dictated by the chemical and structural information stored in their components. This information can be expressed through an adaptive process that determines the structurally fittest assembly under given environmental conditions. We have set up complex disulfide-based dynamic covalent libraries of chemically and topologically diverse pseudopeptidic compounds. We show how the reaction evolves from very complex mixtures at short reaction times to the almost exclusive formation of a major compound, through the establishment of intramolecular noncovalent interactions. Our experiments demonstrate that the systems evolve through error-check and error-correction processes. The nature of these interactions, the importance of the folding and the effects of the environment are also discussed. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Advancing the application of systems thinking in health: managing rural China health system development in complex and dynamic contexts.

    PubMed

    Zhang, Xiulan; Bloom, Gerald; Xu, Xiaoxin; Chen, Lin; Liang, Xiaoyun; Wolcott, Sara J

    2014-08-26

    This paper explores the evolution of schemes for rural finance in China as a case study of the long and complex process of health system development. It argues that the evolution of these schemes has been the outcome of the response of a large number of agents to a rapidly changing context and of efforts by the government to influence this adaptation process and achieve public health goals. The study draws on several sources of data including a review of official policy documents and academic papers and in-depth interviews with key policy actors at national level and at a sample of localities. The study identifies three major transition points associated with changes in broad development strategy and demonstrates how the adaptation of large numbers of actors to these contextual changes had a major impact on the performance of the health system. Further, it documents how the Ministry of Health viewed its role as both an advocate for the interests of health facilities and health workers and as the agency responsible for ensuring that government health system objectives were met. It is argued that a major reason for the resilience of the health system and its ability to adapt to rapid economic and institutional change was the ability of the Ministry to provide overall strategy leadership. Additionally, it postulates that a number of interest groups have emerged, which now also seek to influence the pathway of health system development. This history illustrates the complex and political nature of the management of health system development and reform. The paper concludes that governments will need to increase their capacity to analyze the health sector as a complex system and to manage change processes.

  9. Inhomogeneous point-process entropy: An instantaneous measure of complexity in discrete systems

    NASA Astrophysics Data System (ADS)

    Valenza, Gaetano; Citi, Luca; Scilingo, Enzo Pasquale; Barbieri, Riccardo

    2014-05-01

    Measures of entropy have been widely used to characterize complexity, particularly in physiological dynamical systems modeled in discrete time. Current approaches associate these measures to finite single values within an observation window, thus not being able to characterize the system evolution at each moment in time. Here, we propose a new definition of approximate and sample entropy based on the inhomogeneous point-process theory. The discrete time series is modeled through probability density functions, which characterize and predict the time until the next event occurs as a function of the past history. Laguerre expansions of the Wiener-Volterra autoregressive terms account for the long-term nonlinear information. As the proposed measures of entropy are instantaneously defined through probability functions, the novel indices are able to provide instantaneous tracking of the system complexity. The new measures are tested on synthetic data, as well as on real data gathered from heartbeat dynamics of healthy subjects and patients with congestive heart failure, and gait recordings from short walks of young and elderly subjects. Results show that instantaneous complexity is able to effectively track the system dynamics and is not affected by statistical noise properties.
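    The instantaneous indices above generalize the conventional window-based sample entropy. As background, here is a minimal pure-Python sketch of the classical SampEn(m, r) that the point-process formulation extends; the function name and default parameters are illustrative, not taken from the paper.

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """Window-based sample entropy SampEn(m, r) over a whole series.

    B counts pairs of length-m templates within Chebyshev distance r,
    A counts the same for length m+1; SampEn = -ln(A / B).
    """
    n = len(x)

    def count_matches(length):
        starts = n - length + 1
        count = 0
        for i in range(starts):
            for j in range(i + 1, starts):
                if all(abs(x[i + k] - x[j + k]) < r for k in range(length)):
                    count += 1
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    if a == 0 or b == 0:
        return float("inf")  # undefined: series too short or too irregular
    return -math.log(a / b)

# A perfectly periodic series is highly predictable, so its entropy is low.
regular = [0.0, 1.0] * 50
low = sample_entropy(regular, m=2, r=0.5)
```

    Note that this classical form yields a single value per window, which is exactly the limitation the instantaneous point-process definition is designed to remove.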

  10. Qualitative Fault Isolation of Hybrid Systems: A Structural Model Decomposition-Based Approach

    NASA Technical Reports Server (NTRS)

    Bregon, Anibal; Daigle, Matthew; Roychoudhury, Indranil

    2016-01-01

    Quick and robust fault diagnosis is critical to ensuring safe operation of complex engineering systems. A large number of techniques are available to provide fault diagnosis in systems with continuous dynamics. However, many systems in aerospace and industrial environments are best represented as hybrid systems that consist of discrete behavioral modes, each with its own continuous dynamics. These hybrid dynamics make the on-line fault diagnosis task computationally more complex due to the large number of possible system modes and the existence of autonomous mode transitions. This paper presents a qualitative fault isolation framework for hybrid systems based on structural model decomposition. The fault isolation is performed by analyzing the qualitative information of the residual deviations. However, in hybrid systems this process becomes complex due to the possible existence of observation delays, which can cause observed deviations to be inconsistent with the expected deviations for the current mode of the system. The great advantage of structural model decomposition is that (i) it allows residuals to be designed that respond to only a subset of the faults, and (ii) every time a mode change occurs, only a subset of the residuals will need to be reconfigured, thus reducing the complexity of the reasoning process for isolation purposes. To demonstrate and test the validity of our approach, we use an electric circuit simulation as the case study.
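    The qualitative isolation step can be illustrated by matching observed residual deviation signs against a fault signature matrix. The sketch below uses hypothetical residual and fault names and a simplified consistency rule; it is not the paper's actual algorithm, which additionally reasons over mode changes and observation delays.

```python
def qualitative_isolate(observed, signatures):
    """Return faults whose predicted residual deviations match observations.

    observed:   {residual: '+', '-' or '0'} deviations seen so far
    signatures: {fault: {residual: expected deviation}}
    A fault stays in the candidate set only if every observed residual
    deviation agrees with what that fault predicts; residuals the fault's
    signature does not mention are treated as uninformative.
    """
    candidates = []
    for fault, expected in signatures.items():
        if all(expected.get(r, obs) == obs for r, obs in observed.items()):
            candidates.append(fault)
    return candidates

# Two hypothetical faults distinguished by the sign they induce on r2.
sigs = {"F1": {"r1": "+", "r2": "-"},
        "F2": {"r1": "+", "r2": "+"}}
```

    As more residual deviations are observed over time, the candidate set shrinks monotonically, which mirrors the incremental isolation process described in the abstract.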

  11. Automated campaign system

    NASA Astrophysics Data System (ADS)

    Vondran, Gary; Chao, Hui; Lin, Xiaofan; Beyer, Dirk; Joshi, Parag; Atkins, Brian; Obrador, Pere

    2006-02-01

    Running a targeted campaign involves coordination and management across numerous organizations and complex process flows. Everything from market analytics on customer databases, through acquiring content and images, composing the materials, meeting the sponsoring enterprise's brand standards, and driving through production and fulfillment, to evaluating results is currently performed by experienced, highly trained staff. Presented is a developed solution that not only brings together technologies that automate each process, but also automates the entire flow so that a novice user could easily run a successful campaign from their desktop. This paper presents the technologies, structure, and process flows used to bring this system together. Highlighted is how the complexity of running a targeted campaign is hidden from the user through technology, all while providing the benefits of a professionally managed campaign.

  12. Information processing using a single dynamical node as complex system

    PubMed Central

    Appeltant, L.; Soriano, M.C.; Van der Sande, G.; Danckaert, J.; Massar, S.; Dambre, J.; Schrauwen, B.; Mirasso, C.R.; Fischer, I.

    2011-01-01

    Novel methods for information processing are highly desired in our information-driven society. Inspired by the brain's ability to process information, the recently introduced paradigm known as 'reservoir computing' shows that complex networks can efficiently perform computation. Here we introduce a novel architecture that reduces the usually required large number of elements to a single nonlinear node with delayed feedback. Through an electronic implementation, we experimentally and numerically demonstrate excellent performance in a speech recognition benchmark. Complementary numerical studies also show excellent performance for a time series prediction benchmark. These results prove that delay-dynamical systems, even in their simplest manifestation, can perform efficient information processing. This finding paves the way to feasible and resource-efficient technological implementations of reservoir computing. PMID:21915110
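    The core idea, a single nonlinear node whose delayed feedback is time-multiplexed into many "virtual nodes", can be sketched in a few lines. The leaky-tanh update rule, the binary input mask, and the parameter values below are illustrative assumptions; in the paper the node is an analog electronic circuit, and a trained linear readout (not shown here) maps the virtual-node states to outputs.

```python
import math
import random

def delay_reservoir(u, n_virtual=20, theta=0.2, eta=0.5, gamma=0.05):
    """Time-multiplexed reservoir built from one node with delayed feedback.

    Each scalar input u[k] is held for one delay period and multiplied by a
    fixed random mask; the node state is updated against its value one full
    delay (n_virtual sub-steps) in the past, yielding n_virtual 'virtual
    node' states per input sample.
    """
    rng = random.Random(0)
    mask = [rng.choice((-1.0, 1.0)) for _ in range(n_virtual)]
    history = [0.0] * n_virtual          # the delay line x(t - tau)
    states = []
    for uk in u:
        row = []
        for i in range(n_virtual):
            delayed = history[i]
            # leaky nonlinear node driven by delayed feedback + masked input
            x = (1 - theta) * delayed + theta * math.tanh(
                eta * delayed + gamma * mask[i] * uk)
            row.append(x)
        history = row
        states.append(row)
    return states
```

    For a task such as speech recognition or time-series prediction, one would collect `states` for a training signal and fit a linear readout on them, which is the inexpensive part of reservoir computing.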

  13. Unlocking the potential of supported liquid phase catalysts with supercritical fluids: low temperature continuous flow catalysis with integrated product separation

    PubMed Central

    Franciò, Giancarlo; Hintermair, Ulrich; Leitner, Walter

    2015-01-01

    Solution-phase catalysis using molecular transition metal complexes is an extremely powerful tool for chemical synthesis and a key technology for sustainable manufacturing. However, as the reaction complexity and thermal sensitivity of the catalytic system increase, engineering challenges associated with product separation and catalyst recovery can override the value of the product. This persistent downstream issue often renders industrial exploitation of homogeneous catalysis uneconomical despite impressive batch performance of the catalyst. In this regard, continuous-flow systems that allow steady-state homogeneous turnover in a stationary liquid phase while at the same time effecting integrated product separation at mild process temperatures represent a particularly attractive scenario. While continuous-flow processing is a standard procedure for large volume manufacturing, capitalizing on its potential in the realm of the molecular complexity of organic synthesis is still an emerging area that requires innovative solutions. Here we highlight some recent developments which have succeeded in realizing such systems by the combination of near- and supercritical fluids with homogeneous catalysts in supported liquid phases. The cases discussed exemplify how all three levels of continuous-flow homogeneous catalysis (catalyst system, separation strategy, process scheme) must be matched to locate viable process conditions. PMID:26574523

  14. Unlocking the potential of supported liquid phase catalysts with supercritical fluids: low temperature continuous flow catalysis with integrated product separation.

    PubMed

    Franciò, Giancarlo; Hintermair, Ulrich; Leitner, Walter

    2015-12-28

    Solution-phase catalysis using molecular transition metal complexes is an extremely powerful tool for chemical synthesis and a key technology for sustainable manufacturing. However, as the reaction complexity and thermal sensitivity of the catalytic system increase, engineering challenges associated with product separation and catalyst recovery can override the value of the product. This persistent downstream issue often renders industrial exploitation of homogeneous catalysis uneconomical despite impressive batch performance of the catalyst. In this regard, continuous-flow systems that allow steady-state homogeneous turnover in a stationary liquid phase while at the same time effecting integrated product separation at mild process temperatures represent a particularly attractive scenario. While continuous-flow processing is a standard procedure for large volume manufacturing, capitalizing on its potential in the realm of the molecular complexity of organic synthesis is still an emerging area that requires innovative solutions. Here we highlight some recent developments which have succeeded in realizing such systems by the combination of near- and supercritical fluids with homogeneous catalysts in supported liquid phases. The cases discussed exemplify how all three levels of continuous-flow homogeneous catalysis (catalyst system, separation strategy, process scheme) must be matched to locate viable process conditions. © 2015 The Authors.

  15. Analog nonlinear MIMO receiver for optical mode division multiplexing transmission.

    PubMed

    Spalvieri, Arnaldo; Boffi, Pierpaolo; Pecorino, Simone; Barletta, Luca; Magarini, Maurizio; Gatto, Alberto; Martelli, Paolo; Martinelli, Mario

    2013-10-21

    The complexity and the power consumption of digital signal processing are crucial issues in optical transmission systems based on mode division multiplexing and coherent multiple-input multiple-output (MIMO) processing at the receiver. In this paper the inherent characteristic of spatial separation between fiber modes is exploited, getting a MIMO system where joint demultiplexing and detection is based on spatially separated photodetectors. After photodetection, one has a MIMO system with nonlinear crosstalk between modes. The paper shows that the nonlinear crosstalk can be dealt with by a low-complexity and non-adaptive detection scheme, at least in the cases presented in the paper.

  16. Tree physiology research in a changing world.

    PubMed

    Kaufmann, Merrill R.; Linder, Sune

    1996-01-01

    Changes in issues and advances in methodology have contributed to substantial progress in tree physiology research during the last several decades. Current research focuses on process interactions in complex systems and the integration of processes across multiple spatial and temporal scales. An increasingly important challenge for future research is assuring sustainability of production systems and forested ecosystems in the face of increased demands for natural resources and human disturbance of forests. Meeting this challenge requires significant shifts in research approach, including the study of limitations of productivity that may accompany achievement of system sustainability, and a focus on the biological capabilities of complex land bases altered by human activity.

  17. A perspective on the advancement of natural language processing tasks via topological analysis of complex networks. Comment on "Approaching human language with complex networks" by Cong and Liu

    NASA Astrophysics Data System (ADS)

    Amancio, Diego Raphael

    2014-12-01

    Concepts and methods of complex networks have been applied to probe the properties of a myriad of real systems [1]. The finding that written texts modeled as graphs share several properties of other completely different real systems has inspired the study of language as a complex system [2]. Actually, language can be represented as a complex network at its several levels of complexity. As a consequence, morphological, syntactical and semantical properties have been employed in the construction of linguistic networks [3]. Even the character level has been useful to unfold particular patterns [4,5]. In the review by Cong and Liu [6], the authors emphasize the need to use the topological information of complex networks modeling the various spheres of the language to better understand its origins, evolution and organization. In addition, the authors cite the use of networks in applications aiming at holistic typology and stylistic variations. In this context, I will discuss some possible directions that could be followed in future research directed towards the understanding of language via topological characterization of complex linguistic networks. In addition, I will comment on the use of network models for language processing applications. Additional prospects for future practical research lines will also be discussed in this comment.

  18. MHD processes in the outer heliosphere

    NASA Technical Reports Server (NTRS)

    Burlaga, L. F.

    1984-01-01

    The magnetic field measurements from Voyager and the magnetohydrodynamic (MHD) processes in the outer heliosphere are reviewed. A bibliography of the experimental and theoretical work concerning magnetic fields and plasmas observed in the outer heliosphere is given. Emphasis in this review is on basic concepts and dynamical processes involving the magnetic field. The theory that serves to explain and unify the interplanetary magnetic field and plasma observations is magnetohydrodynamics. Basic physical processes and observations that relate directly to solutions of the MHD equations are emphasized, but obtaining solutions of this complex system of equations involves various assumptions and approximations. The spatial and temporal complexity of the outer heliosphere and some approaches for dealing with this complexity are discussed.

  19. Complex agro-ecosystems for food security in a changing climate

    PubMed Central

    Khumairoh, Uma; Groot, Jeroen CJ; Lantinga, Egbert A

    2012-01-01

    Attempts to increase food crop yields by intensifying agricultural systems using high inputs of nonrenewable resources and chemicals frequently lead to degradation of natural resources, whereas most technological innovations are not accessible to smallholders, who represent the majority of farmers worldwide. Alternatively, cocultures consisting of assemblages of plant and animal species can support ecological processes of nutrient cycling and pest control, which may lead to increasing yields and declining susceptibility to extreme weather conditions with increasing complexity of the systems. Here we show that enhancing the complexity of a rice production system by adding combinations of compost, azolla, ducks, and fish resulted in strongly increased grain yields and revenues in a season with extremely adverse weather conditions in East Java, Indonesia. We found that azolla, ducks, and fish increased plant nutrient content, tillering and leaf area expansion, and strongly reduced the density of six different pests. The most complex system, comprising all components, gave the highest grain yield. The net revenues of this system from sales of rice grain, fish, and ducks, after correction for extra costs, were 114% higher than rice cultivation with only compost as fertilizer. These results provide more insight into the agro-ecological processes and demonstrate how complex agricultural systems can contribute to food security in a changing climate. If smallholders can be trained to manage these systems and are supported for initial investments by credits, their livelihoods can be improved while producing in an ecologically benign way. PMID:22957173

  20. Design tools for complex dynamic security systems.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Byrne, Raymond Harry; Rigdon, James Brian; Rohrer, Brandon Robinson

    2007-01-01

    The development of tools for complex dynamic security systems is not a straightforward engineering task but, rather, a scientific task where discovery of new scientific principles and math is necessary. For years, scientists have observed complex behavior but have had difficulty understanding it. Prominent examples include: insect colony organization, the stock market, molecular interactions, fractals, and emergent behavior. Engineering such systems will be an even greater challenge. This report explores four tools for engineered complex dynamic security systems: Partially Observable Markov Decision Processes, Percolation Theory, Graph Theory, and Exergy/Entropy Theory. Additionally, enabling hardware technologies for next generation security systems are described: a 100-node wireless sensor network, an unmanned ground vehicle and an unmanned aerial vehicle.

  1. How multiplicity determines entropy and the derivation of the maximum entropy principle for complex systems.

    PubMed

    Hanel, Rudolf; Thurner, Stefan; Gell-Mann, Murray

    2014-05-13

    The maximum entropy principle (MEP) is a method for obtaining the most likely distribution functions of observables from statistical systems by maximizing entropy under constraints. The MEP has found hundreds of applications in ergodic and Markovian systems in statistical mechanics, information theory, and statistics. For several decades there has been an ongoing controversy over whether the notion of the maximum entropy principle can be extended in a meaningful way to nonextensive, nonergodic, and complex statistical systems and processes. In this paper we start by reviewing how Boltzmann-Gibbs-Shannon entropy is related to multiplicities of independent random processes. We then show how the relaxation of independence naturally leads to the most general entropies that are compatible with the first three Shannon-Khinchin axioms, the (c,d)-entropies. We demonstrate that the MEP is a perfectly consistent concept for nonergodic and complex statistical systems if their relative entropy can be factored into a generalized multiplicity and a constraint term. The problem of finding such a factorization reduces to finding an appropriate representation of relative entropy in a linear basis. In a particular example we show that path-dependent random processes with memory naturally require specific generalized entropies. The example is to our knowledge the first exact derivation of a generalized entropy from the microscopic properties of a path-dependent random process.
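    For the classical ergodic case the paper starts from, the MEP reduces to maximizing Boltzmann-Gibbs-Shannon entropy under a normalization and a mean constraint, which yields the exponential (Gibbs) form p_i ∝ exp(-βE_i). The sketch below recovers this numerically by bisecting on the Lagrange multiplier; the function names and bracket values are illustrative.

```python
import math

def maxent_distribution(energies, mean_target, lo=-50.0, hi=50.0, tol=1e-10):
    """Maximum-entropy distribution over discrete states with a fixed mean.

    Maximizing Shannon entropy subject to sum(p) = 1 and
    sum(p * E) = mean_target gives p_i proportional to exp(-beta * E_i);
    beta is the Lagrange multiplier, found here by bisection.
    """
    def mean_energy(beta):
        w = [math.exp(-beta * e) for e in energies]
        z = sum(w)
        return sum(wi * e for wi, e in zip(w, energies)) / z

    # mean_energy is monotonically decreasing in beta, so bisect on it.
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_energy(mid) > mean_target:
            lo = mid
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    w = [math.exp(-beta * e) for e in energies]
    z = sum(w)
    return [wi / z for wi in w], beta
```

    When the target mean equals the unweighted average of the energies, the multiplier vanishes and the uniform distribution is recovered; this is the baseline that the generalized (c,d)-entropies deform for non-ergodic, path-dependent processes.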

  2. Harnessing expert knowledge: Defining a Bayesian network decision model with limited data-Model structure for the vibration qualification problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rizzo, Davinia B.; Blackburn, Mark R.

    As systems become more complex, systems engineers rely on experts to inform decisions. There are few experts and limited data in many complex new technologies. This challenges systems engineers as they strive to plan activities such as qualification in an environment where technical constraints are coupled with the traditional cost, risk, and schedule constraints. Bayesian network (BN) models provide a framework to aid systems engineers in planning qualification efforts with complex constraints by harnessing expert knowledge and incorporating technical factors. By quantifying causal factors, a BN model can provide data about the risk of implementing a decision supplemented with information on driving factors. This allows a systems engineer to make informed decisions and examine “what-if” scenarios. This paper discusses a novel process developed to define a BN model structure based primarily on expert knowledge supplemented with extremely limited data (25 data sets or less). The model was developed to aid qualification decisions—specifically to predict the suitability of six degrees of freedom (6DOF) vibration testing for qualification. The process defined the model structure with expert knowledge in an unbiased manner. Finally, validation during the process execution and of the model provided evidence the process may be an effective tool in harnessing expert knowledge for a BN model.

  3. Harnessing expert knowledge: Defining a Bayesian network decision model with limited data-Model structure for the vibration qualification problem

    DOE PAGES

    Rizzo, Davinia B.; Blackburn, Mark R.

    2018-03-30

    As systems become more complex, systems engineers rely on experts to inform decisions. There are few experts and limited data in many complex new technologies. This challenges systems engineers as they strive to plan activities such as qualification in an environment where technical constraints are coupled with the traditional cost, risk, and schedule constraints. Bayesian network (BN) models provide a framework to aid systems engineers in planning qualification efforts with complex constraints by harnessing expert knowledge and incorporating technical factors. By quantifying causal factors, a BN model can provide data about the risk of implementing a decision supplemented with information on driving factors. This allows a systems engineer to make informed decisions and examine “what-if” scenarios. This paper discusses a novel process developed to define a BN model structure based primarily on expert knowledge supplemented with extremely limited data (25 data sets or less). The model was developed to aid qualification decisions—specifically to predict the suitability of six degrees of freedom (6DOF) vibration testing for qualification. The process defined the model structure with expert knowledge in an unbiased manner. Finally, validation during the process execution and of the model provided evidence the process may be an effective tool in harnessing expert knowledge for a BN model.

  4. Task Complexity, Epistemological Beliefs and Metacognitive Calibration: An Exploratory Study

    ERIC Educational Resources Information Center

    Stahl, Elmar; Pieschl, Stephanie; Bromme, Rainer

    2006-01-01

    This article presents an explorative study, which is part of a comprehensive project to examine the impact of epistemological beliefs on metacognitive calibration during learning processes within a complex hypermedia information system. More specifically, this study investigates: 1) if learners differentiate between tasks of different complexity,…

  5. Structural Behavioral Study on the General Aviation Network Based on Complex Network

    NASA Astrophysics Data System (ADS)

    Zhang, Liang; Lu, Na

    2017-12-01

    The general aviation system is an open and dissipative system with complex structures and behavioral features. This paper has established the system model and network model for general aviation. We have analyzed integral attributes and individual attributes by applying complex network theory and concluded that the general aviation network has influential enterprise factors and node relations. We have checked whether the network exhibits the small-world effect, the scale-free property and the network centrality that a complex network should have by analyzing its degree distribution, and proved that the general aviation network system is a complex network. Therefore, we propose to advance the evolution of the general aviation industrial chain toward a collaborative innovation cluster of advanced-form industries by strengthening the network multiplication effect, stimulating innovation performance and spanning structural-hole paths.
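    The degree-distribution and small-world checks mentioned above can be sketched generically: compute each node's degree and the empirical P(k), and estimate the mean shortest-path length by breadth-first search. The toy star network below is a stand-in for illustration, not the paper's aviation data.

```python
from collections import defaultdict, deque

def degree_distribution(edges):
    """Node degrees and the empirical degree distribution P(k)."""
    deg = defaultdict(int)
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    counts = defaultdict(int)
    for k in deg.values():
        counts[k] += 1
    n = len(deg)
    return dict(deg), {k: c / n for k, c in sorted(counts.items())}

def average_path_length(edges):
    """Mean shortest-path length over connected node pairs (BFS per node)."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    total, pairs = 0, 0
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

# A hub-and-spoke ("star") network: one high-degree hub, short paths.
star = [("HUB", i) for i in range(1, 8)]
deg, pk = degree_distribution(star)
```

    A heavy right tail in P(k) (a few hubs with many links) is the empirical signature of the scale-free property, while a short average path length relative to network size indicates the small-world effect.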

  6. Using AI and Semantic Web Technologies to attack Process Complexity in Open Systems

    NASA Astrophysics Data System (ADS)

    Thompson, Simon; Giles, Nick; Li, Yang; Gharib, Hamid; Nguyen, Thuc Duong

    Recently many vendors and groups have advocated using BPEL and WS-BPEL as a workflow language to encapsulate business logic. While encapsulating workflow and process logic in one place is a sensible architectural decision the implementation of complex workflows suffers from the same problems that made managing and maintaining hierarchical procedural programs difficult. BPEL lacks constructs for logical modularity such as the requirements construct from the STL [12] or the ability to adapt constructs like pure abstract classes for the same purpose. We describe a system that uses semantic web and agent concepts to implement an abstraction layer for BPEL based on the notion of Goals and service typing. AI planning was used to enable process engineers to create and validate systems that used services and goals as first class concepts and compiled processes at run time for execution.

  7. An integration architecture for the automation of a continuous production complex.

    PubMed

    Chacón, Edgar; Besembel, Isabel; Narciso, Flor; Montilva, Jonás; Colina, Eliezer

    2002-01-01

    The development of integrated automation systems for continuous production plants is a very complicated process. A variety of factors must be taken into account, such as their different components (e.g., production units control systems, planning systems, financial systems, etc.), the interaction among them, and their different behavior (continuous or discrete). Moreover, the difficulty of this process is increased by the fact that each component can be viewed in a different way depending on the kind of decisions to be made, and its specific behavior. Modeling continuous production complexes as a composition of components, where, in turn, each component may also be a composite, appears to be the simplest and safest way to develop integrated automation systems. In order to provide the most versatile way to develop this kind of system, this work proposes a new approach for designing and building them, where process behavior, operation conditions and equipment conditions are integrated into a hierarchical automation architecture.

  8. The Role of Mental Models in Dynamic Decision-Making

    DTIC Science & Technology

    2009-03-01

    Humansystems® Incorporated, 111 Farquhar St., Guelph, ON N1H 3N4. Project Manager: Lisa A. Rehak. PWGSC Contract No.: W7711-078110/001/TOR. Call ... simulate the processes that people use to manage complex systems. These analogies, moreover, represent one way to help people to form more accurate ... make complex decisions. Control theory’s primary emphasis is on the role of feedback while managing a complex system. What is common to all of these ...

  9. Investigation of design considerations for a complex demodulation filter

    NASA Technical Reports Server (NTRS)

    Stoughton, J. W.

    1984-01-01

    The design of an adaptive digital filter to be employed in the processing of microwave remote sensor data was developed. In particular, a complex demodulation approach was developed to provide narrow-band power estimation for a proposed Doppler scatterometer system. This scatterometer was considered for application in the proposed National Oceanographic survey satellite, as an improvement on SEASAT features. A generalized analysis of complex demodulation diagrams is presented for the digital architecture component of the proposed system.

  10. Strategies for concurrent processing of complex algorithms in data driven architectures

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Mielke, Roland R.

    1988-01-01

    The purpose is to document research to develop strategies for concurrent processing of complex algorithms in data driven architectures. The problem domain consists of decision-free algorithms having large-grained, computationally complex primitive operations. Such operations are often found in signal processing and control applications. The anticipated multiprocessor environment is a data flow architecture containing between two and twenty computing elements. Each computing element is a processor with local program memory that communicates with a common global data memory. A new graph-theoretic model called ATAMM, which establishes rules for relating a decomposed algorithm to its execution in a data flow architecture, is presented. The ATAMM model is used to determine strategies to achieve optimum time performance and to develop a system diagnostic software tool. In addition, preliminary work on a new multiprocessor operating system based on the ATAMM specifications is described.
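    The data-driven execution that ATAMM formalizes can be illustrated with a toy dataflow interpreter: a primitive operation fires as soon as all of its input tokens are available, regardless of program order. This sketch is a generic illustration of the execution model, not the ATAMM model itself.

```python
def dataflow_execute(graph, sources):
    """Fire the primitive operations of a decision-free dataflow graph.

    graph:   {node: (func, [input_nodes])}
    sources: {node: initial token value}
    A node fires as soon as every input token is available, mimicking a
    data-driven architecture where readiness, not program order, triggers
    computation. Returns all produced tokens.
    """
    tokens = dict(sources)
    pending = dict(graph)
    while pending:
        ready = [n for n, (_, ins) in pending.items()
                 if all(i in tokens for i in ins)]
        if not ready:
            raise ValueError("deadlock: unresolved dependencies")
        for n in ready:   # these firings are independent, hence parallelizable
            func, ins = pending.pop(n)
            tokens[n] = func(*(tokens[i] for i in ins))
    return tokens

# (a + b) * (a - b), expressed as a decision-free dataflow graph:
graph = {
    "sum":  (lambda x, y: x + y, ["a", "b"]),
    "diff": (lambda x, y: x - y, ["a", "b"]),
    "out":  (lambda x, y: x * y, ["sum", "diff"]),
}
result = dataflow_execute(graph, {"a": 5, "b": 3})
```

    Nodes that become ready in the same pass ("sum" and "diff" above) have no mutual dependencies, which is exactly the concurrency a multiprocessor data flow architecture exploits.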

  11. Some aspects of mathematical and chemical modeling of complex chemical processes

    NASA Technical Reports Server (NTRS)

    Nemes, I.; Botar, L.; Danoczy, E.; Vidoczy, T.; Gal, D.

    1983-01-01

    Some theoretical questions involved in the mathematical modeling of the kinetics of complex chemical processes are discussed. The analysis is carried out for the homogeneous oxidation of ethylbenzene in the liquid phase. Particular attention is given to the determination of the general characteristics of chemical systems from an analysis of mathematical models developed on the basis of linear algebra.

  12. The application of a unique flow modeling technique to complex combustion systems

    NASA Astrophysics Data System (ADS)

    Waslo, J.; Hasegawa, T.; Hilt, M. B.

    1986-06-01

    This paper describes the application of a unique three-dimensional water flow modeling technique to the study of complex fluid flow patterns within an advanced gas turbine combustor. The visualization technique uses light scattering, coupled with real-time image processing, to determine flow fields. Additional image processing is used to make concentration measurements within the combustor.

  13. Epidemic processes in complex networks

    NASA Astrophysics Data System (ADS)

    Pastor-Satorras, Romualdo; Castellano, Claudio; Van Mieghem, Piet; Vespignani, Alessandro

    2015-07-01

    In recent years the research community has accumulated overwhelming evidence for the emergence of complex and heterogeneous connectivity patterns in a wide range of biological and sociotechnical systems. The complex properties of real-world networks have a profound impact on the behavior of equilibrium and nonequilibrium phenomena occurring in various systems, and the study of epidemic spreading is central to our understanding of the unfolding of dynamical processes in complex networks. The theoretical analysis of epidemic spreading in heterogeneous networks requires the development of novel analytical frameworks, and it has produced results of conceptual and practical relevance. A coherent and comprehensive review of the vast research activity concerning epidemic processes is presented, detailing the successful theoretical approaches as well as making their limits and assumptions clear. Physicists, mathematicians, epidemiologists, computer, and social scientists share a common interest in studying epidemic spreading and rely on similar models for the description of the diffusion of pathogens, knowledge, and innovation. For this reason, while focusing on the main results and the paradigmatic models in infectious disease modeling, the major results concerning generalized social contagion processes are also presented. Finally, the research activity at the forefront in the study of epidemic spreading in coevolving, coupled, and time-varying networks is reported.
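    A minimal discrete-time SIS (susceptible-infected-susceptible) simulation on an arbitrary network illustrates the class of epidemic processes reviewed. The parameter names beta (per-contact transmission) and mu (recovery) follow the standard convention; the synchronous update used here is one of several common modeling choices.

```python
import random
from collections import defaultdict

def sis_epidemic(edges, beta, mu, steps=200, seed=0):
    """Discrete-time SIS dynamics on an undirected network.

    Each step, every infected node transmits to each susceptible neighbor
    with probability beta, then recovers with probability mu. Returns the
    fraction of infected nodes over time (stops early on extinction).
    """
    rng = random.Random(seed)
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    nodes = list(adj)
    infected = {rng.choice(nodes)}
    history = []
    for _ in range(steps):
        new_infections = set()
        for u in infected:
            for v in adj[u]:
                if v not in infected and rng.random() < beta:
                    new_infections.add(v)
        recovered = {u for u in infected if rng.random() < mu}
        infected = (infected | new_infections) - recovered
        history.append(len(infected) / len(nodes))
        if not infected:
            break
    return history

# Complete graph on 30 nodes: well above the epidemic threshold,
# so the infection persists at an endemic level.
n = 30
complete = [(i, j) for i in range(n) for j in range(i + 1, n)]
trace = sis_epidemic(complete, beta=0.3, mu=0.2)
```

    On heterogeneous (e.g. scale-free) networks the same dynamics behave very differently from this homogeneous case, which is the central theme of the review: hubs can sustain epidemics even at very low transmission rates.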

  14. Supporting Space Systems Design via Systems Dependency Analysis Methodology

    NASA Astrophysics Data System (ADS)

    Guariniello, Cesare

    The increasing size and complexity of space systems and space missions pose severe challenges to space systems engineers. When complex systems and Systems-of-Systems are involved, the behavior of the whole entity is due not only to that of the individual systems involved but also to the interactions and dependencies between the systems. Dependencies can be varied and complex, and designers usually do not analyze the impact of dependencies at the level of complex systems, or this analysis involves excessive computational cost, or it occurs at a later stage of the design process, after designers have already set detailed requirements, following a bottom-up approach. While classical systems engineering attempts to integrate the perspectives involved across the variety of engineering disciplines and the objectives of multiple stakeholders, there is still a need for more effective tools and methods capable of identifying, analyzing and quantifying properties of the complex system as a whole and of modeling explicitly the effect of some of the features that characterize complex systems. This research describes the development and usage of Systems Operational Dependency Analysis and Systems Developmental Dependency Analysis, two methods based on parametric models of the behavior of complex systems, one in the operational domain and one in the developmental domain. The parameters of the developed models have intuitive meaning, are usable with subjective and quantitative data alike, and give direct insight into the causes of observed, and possibly emergent, behavior. The approach proposed in this dissertation combines models of one-to-one dependencies among systems and between systems and capabilities, to analyze and evaluate the impact of failures or delays on the outcome of the whole complex system. The analysis accounts for cascading effects, partial operational failures, multiple failures or delays, and partial developmental dependencies.
The user of these methods can assess the behavior of each system based on its internal status and on the topology of its dependencies on systems connected to it. Designers and decision makers can therefore quickly analyze and explore the behavior of complex systems and evaluate different architectures under various working conditions. The methods support educated decision making both in the design and in the update process of systems architecture, reducing the need to execute extensive simulations. In particular, in the phase of concept generation and selection, the information given by the methods can be used to identify promising architectures to be further tested and improved, while discarding architectures that do not show the required level of global features. The methods, when used in conjunction with appropriate metrics, also allow for improved reliability and risk analysis, as well as for automatic scheduling and re-scheduling based on the features of the dependencies and on the accepted level of risk. This dissertation illustrates the use of the two methods in sample aerospace applications, both in the operational and in the developmental domain. The applications show how to use the developed methodology to evaluate the impact of failures, assess the criticality of systems, quantify metrics of interest, quantify the impact of delays, support informed decision making when scheduling the development of systems and evaluate the achievement of partial capabilities. A larger, well-framed case study illustrates how the Systems Operational Dependency Analysis method and the Systems Developmental Dependency Analysis method can support analysis and decision making, at the mid and high level, in the design process of architectures for the exploration of Mars. The case study also shows how the methods do not replace the classical systems engineering methodologies, but support and improve them.
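The cascading-impact idea can be sketched in a few lines of code. The degradation rule, system names, and dependency weights below are illustrative assumptions for a toy acyclic architecture, not the dissertation's actual SODA formulation:

```python
def operability(system, internal, deps, memo=None):
    """Toy cascading-impact model over an acyclic dependency graph:
    a system's operability is its internal status, degraded by each
    upstream dependency in proportion to the dependency strength w."""
    if memo is None:
        memo = {}
    if system in memo:
        return memo[system]
    level = internal[system]
    for upstream, w in deps.get(system, []):
        up = operability(upstream, internal, deps, memo)
        level *= 1.0 - w * (1.0 - up)  # full upstream failure removes fraction w
    memo[system] = level
    return level

# Hypothetical architecture: a relay satellite feeds a rover and a lander.
internal = {"relay": 1.0, "rover": 1.0, "lander": 1.0}
deps = {"rover": [("relay", 0.8)], "lander": [("relay", 0.4)]}

internal["relay"] = 0.0  # complete failure of the relay
print(operability("rover", internal, deps),    # ~0.2: strongly dependent
      operability("lander", internal, deps))   # 0.6: weakly dependent
```

Exploring a different architecture is then just a matter of editing `deps`, which is the kind of quick what-if analysis the dissertation argues for (the actual methods additionally handle partial failures and developmental delays).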

  15. The problem of ecological scaling in spatially complex, nonequilibrium ecological systems [chapter 3]

    Treesearch

    Samuel A. Cushman; Jeremy Littell; Kevin McGarigal

    2010-01-01

    In the previous chapter we reviewed the challenges posed by spatial complexity and temporal disequilibrium to efforts to understand and predict the structure and dynamics of ecological systems. The central theme was that spatial variability in the environment and population processes fundamentally alters the interactions between species and their environments, largely...

  16. Computer simulation of functioning of elements of security systems

    NASA Astrophysics Data System (ADS)

    Godovykh, A. V.; Stepanov, B. P.; Sheveleva, A. A.

    2017-01-01

    The article addresses the development of an information complex for simulating the operation of security system elements. The complex is described in terms of its main objectives, its design concept, and the interrelation of its main elements. The proposed computer simulation concept makes it possible to simulate the operation of the security system for training security staff under both normal and emergency conditions.

  17. Parameter estimation procedure for complex non-linear systems: calibration of ASM No. 1 for N-removal in a full-scale oxidation ditch.

    PubMed

    Abusam, A; Keesman, K J; van Straten, G; Spanjers, H; Meinema, K

    2001-01-01

    When applied to large simulation models, the process of parameter estimation is also called calibration. Calibration of complex non-linear systems, such as activated sludge plants, is often not an easy task. On the one hand, manual calibration of such complex systems is usually time-consuming, and its results are often not reproducible. On the other hand, conventional automatic calibration methods are not always straightforward and are often hampered by local-minima problems. In this paper a new straightforward and automatic procedure, based on the response surface method (RSM), is proposed for selecting the best identifiable parameters. In RSM, the process response (output) is related to the levels of the input variables in terms of a first- or second-order regression model. Usually, RSM is used to relate measured process output quantities to process conditions. In this paper, however, RSM is used for selecting the dominant parameters, by evaluating parameter sensitivity in a predefined region. Good results obtained in the calibration of ASM No. 1 for N-removal in a full-scale oxidation ditch showed that the proposed procedure is successful and reliable.
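The screening idea can be sketched as follows. The toy response function, its coefficients, and the ±10% perturbation range are assumptions made for illustration; a real application would run the ASM No. 1 simulator at each design point:

```python
from itertools import product

def model(theta):
    """Stand-in for one simulation run (hypothetical response surface; a
    real application would run ASM No. 1 and return e.g. effluent N)."""
    k1, k2, k3 = theta
    return 3.0 * k1 + 0.2 * k2 + 1.5 * k1 * k3

# Two-level full-factorial design: perturb each parameter by +/-10%
nominal = [1.0, 1.0, 1.0]
runs = []
for d in product([-0.1, 0.1], repeat=3):
    theta = [n * (1.0 + di) for n, di in zip(nominal, d)]
    runs.append((d, model(theta)))

# First-order response surface y ~ b0 + sum(bi * di); for this orthogonal
# design each least-squares slope is simply sum(di * y) / sum(di^2).
names = ["k1", "k2", "k3"]
sens = {}
for i, name in enumerate(names):
    num = sum(d[i] * y for d, y in runs)
    den = sum(d[i] ** 2 for d, _ in runs)
    sens[name] = abs(num / den)

print(sorted(sens, key=sens.get, reverse=True))  # ['k1', 'k3', 'k2']
```

The ranking flags k1 (and, through the interaction term, k3) as the parameters worth calibrating, while k2 would be held at its default.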

  18. Towards Hybrid Online On-Demand Querying of Realtime Data with Stateful Complex Event Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Qunzhi; Simmhan, Yogesh; Prasanna, Viktor K.

    Emerging Big Data applications in areas like e-commerce and energy industry require both online and on-demand queries to be performed over vast and fast data arriving as streams. These present novel challenges to Big Data management systems. Complex Event Processing (CEP) is recognized as a high performance online query scheme which in particular deals with the velocity aspect of the 3-V’s of Big Data. However, traditional CEP systems do not consider data variety and lack the capability to embed ad hoc queries over the volume of data streams. In this paper, we propose H2O, a stateful complex event processing framework, to support hybrid online and on-demand queries over realtime data. We propose a semantically enriched event and query model to address data variety. A formal query algebra is developed to precisely capture the stateful and containment semantics of online and on-demand queries. We describe techniques to achieve the interactive query processing over realtime data featured by efficient online querying, dynamic stream data persistence and on-demand access. The system architecture is presented and the current implementation status reported.

  19. Social Network Supported Process Recommender System

    PubMed Central

    Ye, Yanming; Yin, Jianwei; Xu, Yueshen

    2014-01-01

    Process recommendation technologies have gained more and more attention in the field of intelligent business process modeling as a means of assisting the modeling process. However, most existing technologies rely only on analysis of process structure and do not take the social features of processes into account, even though process modeling is complex and comprehensive in most situations. This paper studies the feasibility of applying social network research techniques to process recommendation and builds a social network system of processes based on feature similarities. Three process matching degree measurements are then presented, and the system implementation is discussed. Finally, experimental evaluations are reported and future work is outlined. PMID:24672309

  20. Does human cognition allow Human Factors (HF) certification of advanced aircrew systems?

    NASA Technical Reports Server (NTRS)

    Macleod, Iain S.; Taylor, Robert M.

    1994-01-01

    This paper has examined the requirements of HF specification and certification within advanced or complex aircrew systems. It suggests reasons for current inadequacies in the use of HF in the design process, giving some examples in support, and suggesting an avenue towards the improvement of the HF certification process. The importance of human cognition to the operation and performance of advanced aircrew systems has been stressed. Many of the shortfalls of advanced aircrew systems must be attributed to over-automated designs that show little consideration of either the mental limits or the cognitive capabilities of the human system component. Traditional approaches to system design and HF certification are set on an overly physicalistic foundation. Traditionally, it was also assumed that physicalistic system functions could be attributed to either the human or the machine on a one-to-one basis, and any problems associated with the parallel need to promote human understanding alongside system operation and direction were generally resolved in practice by the natural flexibility and adaptability of human skills. The consideration of the human component of a complex system is seen as being primarily based on manifestations of human behavior, to the almost total exclusion of any appreciation of unobservable human mental and cognitive processes. The argument of this paper is that the considered functionality of any complex human-machine system must contain functions that are purely human and purely cognitive. Human-machine system reliability ultimately depends on human reliability and dependability and, therefore, on the form and frequency of the cognitive processes that have to be conducted to support system performance. The greater the demand placed by an advanced aircraft system on the human component's basic knowledge processes or cognition, rather than on skill, the more insidious the effects the human may have on that system.
    This paper discusses one example of an attempt to devise an improved method of specification and certification in relation to an advanced aircrew system, that of the RN Merlin helicopter. The method is recognized to have limitations in practice, mainly associated with the late production of the system specification relative to the system development process. The need for a careful appreciation of the capabilities and support needs of human cognition within the design process of a complex man-machine system has been argued, especially in relation to the concept of system functionality. Unlike the physicalistic Fitts list, a new classification of system functionality is proposed, namely: (1) equipment - system equipment related; (2) cognitive - human cognition related; and (3) associated - a necessary combination of equipment and cognitive. This paper has not proposed a method for a fuller consideration of cognition within systems design, but has suggested the need for such a method and indicated an avenue towards its development. Finally, the HF certification of advanced aircrew systems is seen as only being possible in a qualified sense until the important functions of human cognition are considered within the system design process. (This paper contains the opinions of its authors and does not necessarily reflect the standpoint of their respective organizations.)

  1. Dynamical complexity of short and noisy time series. Compression-Complexity vs. Shannon entropy

    NASA Astrophysics Data System (ADS)

    Nagaraj, Nithin; Balasubramanian, Karthi

    2017-07-01

    Shannon entropy has been extensively used for characterizing complexity of time series arising from chaotic dynamical systems and stochastic processes such as Markov chains. However, for short and noisy time series, Shannon entropy performs poorly. Complexity measures which are based on lossless compression algorithms are a good substitute in such scenarios. We evaluate the performance of two such Compression-Complexity Measures, namely Lempel-Ziv complexity (LZ) and Effort-To-Compress (ETC), on short time series from chaotic dynamical systems in the presence of noise. Both LZ and ETC outperform Shannon entropy (H) in accurately characterizing the dynamical complexity of such systems. For very short binary sequences (which arise in neuroscience applications), ETC has a higher number of distinct complexity values than LZ and H, thus enabling a finer resolution. For two-state ergodic Markov chains, we empirically show that ETC converges to a steady state value faster than LZ. Compression-Complexity measures are promising for applications which involve short and noisy time series.
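A minimal sketch of the Lempel-Ziv (LZ76) phrase-counting idea, with the Shannon entropy of the symbol distribution for comparison (illustrative only; the paper's ETC measure is not shown here):

```python
import math
from collections import Counter

def lz76_complexity(s):
    """Number of distinct phrases in the LZ76 parsing of s."""
    i, phrases, n = 0, 0, len(s)
    while i < n:
        length = 1
        # grow the current phrase while it already occurs in the preceding text
        while i + length <= n and s[i:i + length] in s[:i + length - 1]:
            length += 1
        phrases += 1
        i += length
    return phrases

def shannon_entropy(s):
    """Shannon entropy (bits/symbol) of the empirical symbol distribution."""
    n = len(s)
    return -sum(c / n * math.log2(c / n) for c in Counter(s).values())

# Two short binary sequences with identical symbol counts: the entropy
# estimate cannot distinguish them, but the compression-based count can.
a, b = "0000011111", "0110100110"
print(shannon_entropy(a), shannon_entropy(b))   # 1.0 1.0
print(lz76_complexity(a), lz76_complexity(b))   # 3 5
```

This is the "finer resolution" point in miniature: both sequences have maximal symbol entropy, yet the phrase count separates the ordered sequence from the irregular one.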

  2. System-level simulation of liquid filling in microfluidic chips.

    PubMed

    Song, Hongjun; Wang, Yi; Pant, Kapil

    2011-06-01

    Liquid filling in microfluidic channels is a complex process that depends on a variety of geometric, operating, and material parameters such as microchannel geometry, flow velocity/pressure, liquid surface tension, and contact angle of channel surface. Accurate analysis of the filling process can provide key insights into the filling time, air bubble trapping, and dead zone formation, and help evaluate trade-offs among the various design parameters and lead to optimal chip design. However, efficient modeling of liquid filling in complex microfluidic networks continues to be a significant challenge. High-fidelity computational methods, such as the volume of fluid method, are prohibitively expensive from a computational standpoint. Analytical models, on the other hand, are primarily applicable to idealized geometries and, hence, are unable to accurately capture chip level behavior of complex microfluidic systems. This paper presents a parametrized dynamic model for the system-level analysis of liquid filling in three-dimensional (3D) microfluidic networks. In our approach, a complex microfluidic network is deconstructed into a set of commonly used components, such as reservoirs, microchannels, and junctions. The components are then assembled according to their spatial layout and operating rationale to achieve a rapid system-level model. A dynamic model based on the transient momentum equation is developed to track the liquid front in the microchannels. The principle of mass conservation at the junction is used to link the fluidic parameters in the microchannels emanating from the junction. Assembly of these component models yields a set of differential and algebraic equations, which upon integration provides temporal information of the liquid filling process, particularly liquid front propagation (i.e., the arrival time).
The models are used to simulate the transient liquid filling process in a variety of microfluidic constructs and in a multiplexer, representing a complex microfluidic network. The accuracy (relative error less than 7%) and orders-of-magnitude speedup (30,000× to 4,000,000×) of our system-level models are verified by comparison against 3D high-fidelity numerical studies. Our findings clearly establish the utility of our models and simulation methodology for fast, reliable analysis of liquid filling to guide the design optimization of complex microfluidic networks.
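The front-tracking idea can be illustrated with a one-channel toy model: a constant capillary driving pressure balanced against quasi-steady Poiseuille drag behind the front. All property values below are assumed, and this is far simpler than the paper's full 3D network model with junction mass conservation:

```python
import math

# Assumed properties: water in a shallow rectangular channel
mu = 1.0e-3               # dynamic viscosity, Pa*s
gamma = 0.072             # surface tension, N/m
theta = math.radians(30)  # contact angle
h = 50e-6                 # channel depth, m
L = 10e-3                 # channel length, m

p_cap = 2.0 * gamma * math.cos(theta) / h  # capillary driving pressure, Pa

# Explicit-Euler front tracking: viscous drag grows with the filled length x,
# so the front decelerates (Washburn-like x ~ sqrt(t) behavior).
x, t, dt = 1e-6, 0.0, 1e-6  # small initial plug avoids the x -> 0 singularity
while x < L:
    v = p_cap * h ** 2 / (12.0 * mu * x)  # quasi-steady Poiseuille front speed
    x += v * dt
    t += dt

print(round(t * 1e3, 1), "ms to fill")  # ~96 ms for these assumed values
```

A network-level model chains many such channel elements together and couples them through flow-rate balances at the junctions, which is what reduces the problem to a small differential-algebraic system.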

  3. Method of separating and recovering uranium and related cations from spent Purex-type systems

    DOEpatents

    Mailen, J.C.; Tallent, O.K.

    1987-02-25

    A process for separating uranium and related cations from a spent Purex-type solvent extraction system which contains degradation complexes of tributylphosphate wherein the system is subjected to an ion-exchange process prior to a sodium carbonate scrubbing step. A further embodiment comprises recovery of the separated uranium and related cations. 5 figs.

  4. Measuring Business Process Learning with Enterprise Resource Planning Systems to Improve the Value of Education

    ERIC Educational Resources Information Center

    Monk, Ellen F.; Lycett, Mark

    2016-01-01

    Enterprise Resource Planning Systems (ERP) are very large and complex software packages that run every aspect of an organization. Increasingly, ERP systems are being used in higher education as one way to teach business processes, essential knowledge for students competing in today's business environment. Past research attempting to measure…

  5. Excitation energy transfer in photosynthetic protein-pigment complexes

    NASA Astrophysics Data System (ADS)

    Yeh, Shu-Hao

    Quantum biology is a relatively new research area which investigates the role that quantum mechanics plays in biology. One of the most intriguing topics in this field is coherent excitation energy transfer (EET) in photosynthesis. In this document I discuss the theories that are suitable for describing the photosynthetic EET process and the corresponding numerical results on several photosynthetic protein-pigment complexes (PPCs). In some photosynthetic EET processes, because the electronic coupling between the chromophores within the system is of the same order of magnitude as the system-bath (electron-phonon) coupling, a non-perturbative method called the hierarchical equations of motion (HEOM) is applied to study the EET dynamics. The first part of this thesis gives a brief introduction to, and derivation of, the HEOM approach. In the second part of this thesis, the HEOM method is applied to investigate the EET process within the B850 ring of light harvesting complex 2 (LH2) from the purple bacterium Rhodopseudomonas acidophila. The dynamics of the exciton population and coherence are analyzed under different initial excitation configurations and temperatures. Finally, I discuss how HEOM can be implemented to simulate the two-dimensional electronic spectra of photosynthetic PPCs. Two-dimensional electronic spectroscopy is a crucial experimental technique for probing EET dynamics in multi-chromophoric systems. The system we are interested in is the 7-chromophore Fenna-Matthews-Olson (FMO) complex from the green sulfur bacterium Prosthecochloris aestuarii. Recent crystallographic studies report the existence of an additional (eighth) chromophore in some of the FMO monomers. By applying HEOM we are able to calculate the two-dimensional electronic spectra of the 7-site and 8-site FMO complexes and investigate the functionality of the eighth chromophore.

  6. System complexity as a measure of safe capacity for the emergency department.

    PubMed

    France, Daniel J; Levin, Scott

    2006-11-01

    System complexity is introduced as a new measure of system state for the emergency department (ED). In its original form, the measure quantifies the uncertainty of demands on system resources. For application in the ED, the measure is being modified to quantify both workload and uncertainty to produce a single integrated measure of system state. Complexity is quantified using an information-theoretic or entropic approach developed in manufacturing and operations research. In its original form, complexity is calculated on the basis of four system parameters: 1) the number of resources (clinicians and processing entities such as radiology and laboratory systems), 2) the number of possible work states for each resource, 3) the probability that a resource is in a particular work state, and 4) the probability of queue changes (i.e., where a queue is defined by the number of patients or patient orders being managed by a resource) during a specified time period. An example is presented to demonstrate how complexity is calculated and interpreted for a simple system composed of three resources (i.e., emergency physicians) managing varying patient loads. The example shows that variation in physician work states and patient queues produces different scores of complexity for each physician. It also illustrates how complexity and workload differ. System complexity is a viable and technically feasible measurement for monitoring and managing surge capacity in the ED.
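The entropic idea for a static snapshot can be sketched directly. The physician names and work-state probabilities below are hypothetical, and this sketch covers only the first three parameters of the measure; the fourth (queue-change probabilities over a time window) is omitted:

```python
import math

def state_entropy(probs):
    """Shannon entropy (bits) of one resource's work-state distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def system_complexity(resources):
    """Static entropic complexity: the sum of per-resource state entropies.
    `resources` maps a resource name to its work-state probabilities."""
    return sum(state_entropy(p) for p in resources.values())

# Hypothetical ED snapshot: three physicians and the probability that each
# is managing a queue of 0, 1, 2, or 3 patients.
eds = {
    "physician_A": [0.7, 0.2, 0.1, 0.0],      # mostly idle: low uncertainty
    "physician_B": [0.25, 0.25, 0.25, 0.25],  # maximum uncertainty (2 bits)
    "physician_C": [0.1, 0.4, 0.4, 0.1],
}
print(round(system_complexity(eds), 3))  # ~4.879 bits
```

The per-physician terms make the workload-versus-uncertainty distinction visible: physician B contributes the most complexity not because of a heavy queue but because their state is the hardest to predict.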

  7. USMC Ground Surveillance Robot (GSR): Lessons Learned

    NASA Astrophysics Data System (ADS)

    Harmon, S. Y.

    1987-02-01

    This paper describes the design of an autonomous vehicle and the lessons learned during the implementation of that complex robot. The major problems encountered to which solutions were found include sensor processing bandwidth limitations, coordination of the interactions between major subsystems, sensor data fusion and system knowledge representation. Those problems remaining unresolved include system complexity management, the lack of powerful system monitoring and debugging tools, exploratory implementation of a complex system and safety and testing issues. Many of these problems arose from working with underdeveloped and continuously evolving technology and will probably be resolved as the technological resources mature and stabilize. Unfortunately, other problems will continue to plague developers throughout the evolution of autonomous system technology.

  8. Managing Programmatic Risk for Complex Space System Developments

    NASA Technical Reports Server (NTRS)

    Panetta, Peter V.; Hastings, Daniel; Brumfield, Mark (Technical Monitor)

    2001-01-01

    Risk management strategies have become a recent important research topic to many aerospace organizations as they prepare to develop the revolutionary complex space systems of the future. Future multi-disciplinary complex space systems will make it absolutely essential for organizations to practice a rigorous, comprehensive risk management process, emphasizing thorough systems engineering principles to succeed. Project managers must possess strong leadership skills to direct high quality, cross-disciplinary teams for successfully developing revolutionary space systems that are ever increasing in complexity. Proactive efforts to reduce or eliminate risk throughout a project's lifecycle ideally must be practiced by all technical members in the organization. This paper discusses some of the risk management perspectives that were collected from senior managers and project managers of aerospace and aeronautical organizations by the use of interviews and surveys. Some of the programmatic risks which drive the success or failure of projects are revealed. Key findings lead to a number of insights for organizations to consider for proactively approaching the risks which face current and future complex space systems projects.

  9. Integrated, systems metabolic picture of acetone-butanol-ethanol fermentation by Clostridium acetobutylicum.

    PubMed

    Liao, Chen; Seo, Seung-Oh; Celik, Venhar; Liu, Huaiwei; Kong, Wentao; Wang, Yi; Blaschek, Hans; Jin, Yong-Su; Lu, Ting

    2015-07-07

    Microbial metabolism involves complex, system-level processes implemented via the orchestration of metabolic reactions, gene regulation, and environmental cues. One canonical example of such processes is acetone-butanol-ethanol (ABE) fermentation by Clostridium acetobutylicum, during which cells convert carbon sources to organic acids that are later reassimilated to produce solvents as a strategy for cellular survival. The complexity and systems nature of the process have been largely underappreciated, rendering challenges in understanding and optimizing solvent production. Here, we present a system-level computational framework for ABE fermentation that combines metabolic reactions, gene regulation, and environmental cues. We developed the framework by decomposing the entire system into three modules, building each module separately, and then assembling them back into an integrated system. During the model construction, a bottom-up approach was used to link molecular events at the single-cell level into the events at the population level. The integrated model was able to successfully reproduce ABE fermentations of the WT C. acetobutylicum (ATCC 824), as well as its mutants, using data obtained from our own experiments and from literature. Furthermore, the model confers successful predictions of the fermentations with various network perturbations across metabolic, genetic, and environmental aspects. From foundation to applications, the framework advances our understanding of complex clostridial metabolism and physiology and also facilitates the development of systems engineering strategies for the production of advanced biofuels.

  10. Integrated, systems metabolic picture of acetone-butanol-ethanol fermentation by Clostridium acetobutylicum

    PubMed Central

    Liao, Chen; Seo, Seung-Oh; Celik, Venhar; Liu, Huaiwei; Kong, Wentao; Wang, Yi; Blaschek, Hans; Jin, Yong-Su; Lu, Ting

    2015-01-01

    Microbial metabolism involves complex, system-level processes implemented via the orchestration of metabolic reactions, gene regulation, and environmental cues. One canonical example of such processes is acetone-butanol-ethanol (ABE) fermentation by Clostridium acetobutylicum, during which cells convert carbon sources to organic acids that are later reassimilated to produce solvents as a strategy for cellular survival. The complexity and systems nature of the process have been largely underappreciated, rendering challenges in understanding and optimizing solvent production. Here, we present a system-level computational framework for ABE fermentation that combines metabolic reactions, gene regulation, and environmental cues. We developed the framework by decomposing the entire system into three modules, building each module separately, and then assembling them back into an integrated system. During the model construction, a bottom-up approach was used to link molecular events at the single-cell level into the events at the population level. The integrated model was able to successfully reproduce ABE fermentations of the WT C. acetobutylicum (ATCC 824), as well as its mutants, using data obtained from our own experiments and from literature. Furthermore, the model confers successful predictions of the fermentations with various network perturbations across metabolic, genetic, and environmental aspects. From foundation to applications, the framework advances our understanding of complex clostridial metabolism and physiology and also facilitates the development of systems engineering strategies for the production of advanced biofuels. PMID:26100881

  11. Hierarchical coordinate systems for understanding complexity and its evolution, with applications to genetic regulatory networks.

    PubMed

    Egri-Nagy, Attila; Nehaniv, Chrystopher L

    2008-01-01

    Beyond complexity measures, sometimes it is worthwhile in addition to investigate how complexity changes structurally, especially in artificial systems where we have complete knowledge about the evolutionary process. Hierarchical decomposition is a useful way of assessing structural complexity changes of organisms modeled as automata, and we show how recently developed computational tools can be used for this purpose, by computing holonomy decompositions and holonomy complexity. To gain insight into the evolution of complexity, we investigate the smoothness of the landscape structure of complexity under minimal transitions. As a proof of concept, we illustrate how the hierarchical complexity analysis reveals symmetries and irreversible structure in biological networks by applying the methods to the lac operon mechanism in the genetic regulatory network of Escherichia coli.

  12. Scope Complexity Options Risks Excursions (SCORE) Factor Mathematical Description.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gearhart, Jared Lee; Samberson, Jonell Nicole; Shettigar, Subhasini

    The purpose of the Scope, Complexity, Options, Risks, Excursions (SCORE) model is to estimate the relative complexity of design variants of future warhead options, resulting in scores. SCORE factors extend this capability by providing estimates of complexity relative to a base system (i.e., all design options are normalized to one weapon system). First, a clearly defined set of scope elements for a warhead option is established. The complexity of each scope element is estimated by Subject Matter Experts (SMEs), including a level of uncertainty, relative to a specific reference system. When determining factors, complexity estimates for a scope element can be directly tied to the base system or chained together via comparable scope elements in a string of reference systems that ends with the base system. The SCORE analysis process is a growing multi-organizational Nuclear Security Enterprise (NSE) effort, under the management of the NA-12 led Enterprise Modeling and Analysis Consortium (EMAC). Historically, it has provided the data elicitation, integration, and computation needed to support the out-year Life Extension Program (LEP) cost estimates included in the Stockpile Stewardship Management Plan (SSMP).
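The chaining of relative estimates can be illustrated in a few lines; the ratios below are hypothetical placeholders for SME-elicited values, not actual SCORE data:

```python
def chained_factor(ratios):
    """Complexity factor of a scope element relative to the base system,
    chained through a string of intermediate reference systems.
    (The ratios are hypothetical placeholders for SME-elicited estimates.)"""
    factor = 1.0
    for r in ratios:
        factor *= r
    return factor

# element vs. reference A: 1.3x; reference A vs. reference B: 0.9x;
# reference B vs. base system: 1.2x
print(round(chained_factor([1.3, 0.9, 1.2]), 3))  # 1.404
```

Because each link carries its own uncertainty, a real analysis would propagate the SME uncertainty ranges through the chain rather than multiply point estimates.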

  13. Understanding scaling through history-dependent processes with collapsing sample space.

    PubMed

    Corominas-Murtra, Bernat; Hanel, Rudolf; Thurner, Stefan

    2015-04-28

    History-dependent processes are ubiquitous in natural and social systems. Many such stochastic processes, especially those that are associated with complex systems, become more constrained as they unfold, meaning that their sample space, or their set of possible outcomes, reduces as they age. We demonstrate that these sample-space-reducing (SSR) processes necessarily lead to Zipf's law in the rank distributions of their outcomes. We show that by adding noise to SSR processes the corresponding rank distributions remain exact power laws, p(x) ~ x^(-λ), where the exponent directly corresponds to the mixing ratio of the SSR process and noise. This allows us to give a precise meaning to the scaling exponent in terms of the degree to which a given process reduces its sample space as it unfolds. Noisy SSR processes further allow us to explain a wide range of scaling exponents in frequency distributions, ranging from α = 2 to ∞. We discuss several applications showing how SSR processes can be used to understand Zipf's law in word frequencies, and how they are related to diffusion processes in directed networks and to aging processes such as fragmentation. SSR processes provide a new alternative for understanding the origin of scaling in complex systems without recourse to multiplicative, preferential, or self-organized critical processes.
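A minimal simulation of a noise-free SSR process shows the Zipf behavior directly (the state count and number of runs are arbitrary choices):

```python
import random
from collections import Counter

def ssr_run(n_states):
    """One sample-space-reducing run: start from the top state and repeatedly
    jump uniformly to a strictly smaller state until state 1 is reached."""
    visits, x = [], n_states
    while x > 1:
        x = random.randint(1, x - 1)  # the sample space shrinks at every step
        visits.append(x)
    return visits

random.seed(0)
counts = Counter()
for _ in range(20000):
    counts.update(ssr_run(100))

# Zipf's law: P(visit state x) ~ 1/x, so state 1 is visited about twice as
# often as state 2 and about ten times as often as state 10.
print(round(counts[1] / counts[2], 2), round(counts[1] / counts[10], 2))
```

Mixing such runs with purely random jumps (the paper's "noise") flattens the exponent in proportion to the mixing ratio.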

  14. LISP based simulation generators for modeling complex space processes

    NASA Technical Reports Server (NTRS)

    Tseng, Fan T.; Schroer, Bernard J.; Dwan, Wen-Shing

    1987-01-01

    The development of a simulation assistant for modeling discrete event processes is presented. Included are an overview of the system, a description of the simulation generators, and a sample process generated using the simulation assistant.

  15. Protonation free energy levels in complex molecular systems.

    PubMed

    Antosiewicz, Jan M

    2008-04-01

    All proteins, nucleic acids, and other biomolecules contain residues capable of exchanging protons with their environment. These proton transfer phenomena lead to pH sensitivity of many molecular processes underlying biological phenomena. In the course of biological evolution, Nature has invented mechanisms that use pH gradients to regulate biomolecular processes inside cells or in interstitial fluids. Therefore, an ability to model protonation equilibria in molecular systems accurately would be of enormous value for our understanding of biological processes and for rational influence on them, as in developing pH-dependent drugs to treat particular diseases. This work presents a derivation, by thermodynamic and statistical mechanical methods, of an expression for the free energy of a complex molecular system at an arbitrary ionization state of its titratable residues. This constitutes one of the elements of modeling protonation equilibria. Starting from a consideration of a simple acid-base equilibrium of a model compound with a single titratable group, we arrive at an expression which is of general validity for complex systems. The only approximation used in this derivation is postulating that the interaction energy between any pair of titratable sites does not depend on the protonation states of all the remaining ionizable groups.
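For the single-group model-compound equilibrium that the derivation starts from, the pH dependence can be written down directly. This is a standard Henderson-Hasselbalch sketch, not the paper's general multi-site expression:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def protonated_fraction(pH, pKa):
    """Fraction of a single titratable group that is protonated."""
    return 1.0 / (1.0 + 10.0 ** (pH - pKa))

def deprotonation_free_energy(pH, pKa, T=298.15):
    """Free energy (kJ/mol) of deprotonating the group at a given pH:
    dG = RT ln(10) (pKa - pH); negative means deprotonation is favorable."""
    return R * T * math.log(10.0) * (pKa - pH) / 1000.0

print(protonated_fraction(7.0, 7.0))  # 0.5 exactly at pH == pKa
print(round(deprotonation_free_energy(8.0, 7.0), 2))  # negative above the pKa
```

The multi-site case adds pairwise interaction terms between titratable sites, and the paper's approximation is precisely that those pair interactions are independent of the protonation state of all other groups.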

  16. Annular beam shaping system for advanced 3D laser brazing

    NASA Astrophysics Data System (ADS)

    Pütsch, Oliver; Stollenwerk, Jochen; Kogel-Hollacher, Markus; Traub, Martin

    2012-10-01

    As laser brazing benefits from advantages such as smooth joints and small heat-affected zones, it has become established as a joining technology that is widely used in the automotive industry. When processing complex-shaped geometries, however, recently developed brazing heads suffer from the need for continuous reorientation of the optical system and/or limited accessibility due to lateral wire feeding. This motivates the development of a laser brazing head with coaxial wire feeding and enhanced functionality. An optical system is designed that allows an annular intensity distribution to be generated in the working zone. The utilization of complex optical components avoids obscuration of the optical path by the wire feeding. The new design overcomes the disadvantages of state-of-the-art brazing heads with lateral wire feeding and benefits from direction independence when processing complex geometries. To increase the robustness of the brazing process, the beam path also includes a seam tracking system, leading to a more challenging design of the whole optical train. This paper mainly discusses the concept and the optical design of the coaxial brazing head, and also presents the results obtained with a prototype and selected application results.

  17. Probabilistic performance assessment of complex energy process systems - The case of a self-sustained sanitation system.

    PubMed

    Kolios, Athanasios; Jiang, Ying; Somorin, Tosin; Sowale, Ayodeji; Anastasopoulou, Aikaterini; Anthony, Edward J; Fidalgo, Beatriz; Parker, Alison; McAdam, Ewan; Williams, Leon; Collins, Matt; Tyrrel, Sean

    2018-05-01

    A probabilistic modelling approach was developed and applied to investigate the energy and environmental performance of an innovative sanitation system, the "Nano-membrane Toilet" (NMT). The system treats human excreta via an advanced energy and water recovery island with the aim of addressing current and future sanitation demands. Due to the complex design and the inherent characteristics of the system's input material, a number of stochastic variables may significantly affect the system's performance. The non-intrusive probabilistic approach adopted in this study combines a finite number of deterministic thermodynamic process simulations with an artificial neural network (ANN) approximation model and Monte Carlo simulations (MCS) to assess the effect of system uncertainties on the predicted performance of the NMT system. The joint probability distributions of the process performance indicators suggest a Stirling engine (SE) power output in the range of 61.5-73 W at a 95% confidence interval (CI). In addition, there is a high probability (at 95% CI) that the NMT system can achieve a positive net power output of between 15.8 and 35 W. A sensitivity study reveals that system power performance is most affected by the SE heater temperature. Investigation into the environmental performance of the NMT design, including water recovery and CO2/NOx emissions, suggests significant environmental benefits compared to conventional systems. Results of the probabilistic analysis can better inform future improvements to the system design and operational strategy, and this probabilistic assessment framework can also be applied to similar complex engineering systems.
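
    The non-intrusive workflow described above (a few deterministic runs, a cheap surrogate, then Monte Carlo on the surrogate) can be sketched in a few lines. For simplicity the sketch fits a linear least-squares surrogate to a hypothetical stand-in model rather than an ANN; all names and numbers are illustrative assumptions, not values from the paper:

```python
import random
import statistics

def stirling_power(heater_temp):
    """Hypothetical stand-in for one deterministic thermodynamic
    process simulation (the real NMT model is far more detailed)."""
    return 0.12 * heater_temp - 20.0

# 1) a finite number of deterministic runs at chosen design points
design = [500.0, 550.0, 600.0, 650.0, 700.0]
runs = [(t, stirling_power(t)) for t in design]

# 2) fit a cheap surrogate to those runs (linear least squares here,
#    standing in for the paper's ANN approximation model)
n = len(runs)
mt = sum(t for t, _ in runs) / n
mp = sum(p for _, p in runs) / n
slope = sum((t - mt) * (p - mp) for t, p in runs) / sum((t - mt) ** 2 for t, _ in runs)
intercept = mp - slope * mt

def surrogate(t):
    return intercept + slope * t

# 3) Monte Carlo over the uncertain input, evaluated on the surrogate only
random.seed(1)
samples = sorted(surrogate(random.gauss(600.0, 30.0)) for _ in range(20000))
lo = samples[int(0.025 * len(samples))]   # 2.5th percentile
hi = samples[int(0.975 * len(samples))]   # 97.5th percentile
```

    The surrogate makes the Monte Carlo step cheap: the expensive model is evaluated only at the design points, while the 20,000 random samples touch only the fitted approximation.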

  18. Electronic construction collaboration system -- final phase : [tech transfer summary].

    DOT National Transportation Integrated Search

    2014-07-01

    Construction projects have been growing more complex in terms of : project team composition, design aspects, and construction processes. : To help manage the shop/working drawings and requests for information : (RFIs) for its large, complex projects,...

  19. Friction Spinning—New Innovative Tool Systems For The Production of Complex Functionally Graded Workpieces

    NASA Astrophysics Data System (ADS)

    Homberg, Werner; Hornjak, Daniel

    2011-05-01

    Friction spinning is a new and promising incremental forming technology with high potential for manufacturing complex functionally graded workpieces and for extending the forming limits of conventional metal spinning processes. It is based on the integration of thermo-mechanical friction subprocesses into the incremental forming process. By choosing appropriate process parameters, e.g. the axial feed rate or the relative motion, the contact conditions between tool and workpiece can be influenced in a defined way and, thus, a required temperature profile can be obtained. Friction spinning extends the forming limits of conventional metal spinning, enabling the production of multifunctional components with locally varying properties and the manufacturing of, for example, complex hollow parts made of tubes, profiles, or sheet metal. In this way, it meets demands for efficiency and for the manufacturing of functionally graded lightweight components. For instance, the wall thickness in joining zones can be increased locally, achieving a higher-quality joint at lower cost. Such products have so far been difficult or impossible to produce by conventional processes. To benefit from the advantages and potential of this new process, new tooling systems and concepts are indispensable: they must fulfill the special thermal and tribological requirements of this thermo-mechanical process and allow simultaneous, defined forming and friction operations. An important goal of the corresponding research work at the Chair of Forming and Machining Technology at the University of Paderborn is the development of tool systems that allow the manufacturing of such complex parts by simple uniaxial or sequential biaxial linear tool paths. The paper discusses promising tool systems and geometries as well as results of theoretical and experimental research work (e.g. regarding the influence and interaction of process parameters on workpiece quality). Furthermore, it presents possibilities for manufacturing demonstrator geometries that are difficult or impossible to produce with conventional processes.

  20. Implementation of Complexity Analyzing Based on Additional Effect

    NASA Astrophysics Data System (ADS)

    Zhang, Peng; Li, Na; Liang, Yanhong; Liu, Fang

    According to Complexity Theory, complexity exists in a system when a functional requirement is not satisfied. Several studies have applied Complexity Theory within Axiomatic Design; however, they focus on reducing complexity, and none addresses a method for analyzing the complexity present in a system. This paper therefore puts forth a method of complexity analysis intended to make up for that deficiency. To ground the method, the paper introduces two concepts: ideal effect and additional effect. The method of analyzing complexity based on additional effect combines Complexity Theory with the Theory of Inventive Problem Solving (TRIZ), and it helps designers analyze complexity through additional effects. A case study shows the application of the process.

  1. Systems Engineering and Integration for Advanced Life Support System and HST

    NASA Technical Reports Server (NTRS)

    Kamarani, Ali K.

    2005-01-01

    The systems engineering (SE) discipline has revolutionized the way engineers and managers think about solving issues related to the design of complex systems. With the continued development of state-of-the-art technologies, systems are becoming more complex, and a systematic approach is therefore essential to control and manage their integrated design and development. This complexity is driven by integration issues: subsystems must interact with one another in order to achieve integration objectives and the overall system's required performance. The systems engineering process addresses these issues at multiple levels. It is a technology and management process dedicated to controlling all aspects of the system life cycle to assure integration at all levels. The Advanced Integration Matrix (AIM) project serves as the systems engineering and integration function for the Human Support Technology (HST) program. AIM provides integrated test facilities and personnel for performance trade studies, analyses, integrated models, test results, and validated requirements for the integration of HST. The goal of AIM is to address systems-level integration issues for exploration missions. It will use an incremental systems integration approach to yield technologies, baselines for further development, and possible breakthrough concepts in the areas of technological and organizational interfaces, total information flow, system-wide controls, technical synergism, mission operations protocols and procedures, and human-machine interfaces.

  2. Increasingly automated procedure acquisition in dynamic systems

    NASA Technical Reports Server (NTRS)

    Mathe, Nathalie; Kedar, Smadar

    1992-01-01

    Procedures are widely used by operators for controlling complex dynamic systems. Currently, most development of such procedures is done manually, consuming a large amount of paper, time, and manpower in the process. While automated knowledge acquisition is an active field of research, little attention has been paid to the problem of computer-assisted acquisition and refinement of complex procedures for dynamic systems. The Procedure Acquisition for Reactive Control Assistant (PARC) is designed to assist users in encoding and refining complex procedures more systematically and automatically. PARC is able to elicit knowledge interactively from the user during operation of the dynamic system. We categorize procedure refinement into two stages: diagnosis (diagnose the failure and choose a repair) and repair (plan and perform the repair). The basic approach taken in PARC is to assist the user in all steps of this process by providing increasing levels of assistance with layered tools. We illustrate the operation of PARC in refining procedures for the control of a robot arm.

  3. Psychoacoustics

    NASA Astrophysics Data System (ADS)

    Moore, Brian C. J.

    Psychoacoustics is concerned with the relationships between the physical characteristics of sounds and their perceptual attributes. This chapter describes: the absolute sensitivity of the auditory system for detecting weak sounds and how that sensitivity varies with frequency; the frequency selectivity of the auditory system (the ability to resolve or hear out the sinusoidal components in a complex sound) and its characterization in terms of an array of auditory filters; the processes that influence the masking of one sound by another; the range of sound levels that can be processed by the auditory system; the perception and modeling of loudness; level discrimination; the temporal resolution of the auditory system (the ability to detect changes over time); the perception and modeling of pitch for pure and complex tones; the perception of timbre for steady and time-varying sounds; the perception of space and sound localization; and the mechanisms underlying auditory scene analysis that allow the construction of percepts corresponding to individual sound sources when listening to complex mixtures of sounds.

  4. An Investigation and Interpretation of Selected Topics in Uncertainty Reasoning

    DTIC Science & Technology

    1989-12-01

    Characterizing secondary uncertainty as spurious evidence and including it in the inference process ... It was shown that probability ratio graphs are a ... in the inference process has great impact on the computational complexity of an inference process. ... Systems," he outlines a five-step process that incorporates Bayesian reasoning in the development of the expert system rule base: 1. A group of

  5. A complex adaptive systems perspective of health information technology implementation.

    PubMed

    Keshavjee, Karim; Kuziemsky, Craig; Vassanji, Karim; Ghany, Ahmad

    2013-01-01

    Implementing health information technology (HIT) is a challenge because of the complexity and multiple interactions that define HIT implementation. Much of the research on HIT implementation is descriptive in nature and has focused on distinct processes such as order entry or decision support. These studies fail to take into account the underlying complexity of the processes, people and settings that are typical of HIT implementations. Complex adaptive systems (CAS) is a promising field that could elucidate the complexity and non-linear interacting issues that are typical in HIT implementation. Initially we sought new models that would enable us to better understand the complex nature of HIT implementation, to proactively identify problem issues that could be a precursor to unintended consequences and to develop new models and new approaches to successful HIT implementations. Our investigation demonstrates that CAS does not provide prediction, but forces us to rethink our HIT implementation paradigms and question what we think we know. CAS provides new ways to conceptualize HIT implementation and suggests new approaches to increasing HIT implementation successes.

  6. A new simulation system of traffic flow based on cellular automata principle

    NASA Astrophysics Data System (ADS)

    Shan, Junru

    2017-05-01

    Traffic flow is a complex, multi-behavior system, so it is difficult to express with a single mathematical equation. With the rapid development of computer technology, simulating the interaction mechanisms between vehicles in order to reproduce complex traffic behavior has become an important research method. Using a preset collection of operating rules, a cellular automaton is a dynamical system that is discrete in both time and space. It can simulate the real traffic process well and offers a good way to study traffic problems.
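
    A minimal example of such an operating-rule cellular automaton is the classic Nagel-Schreckenberg traffic model, sketched below (this is the standard textbook model, not necessarily the specific system described in the abstract):

```python
import random

def nagel_schreckenberg(road, v, vmax=5, p_slow=0.3, rng=random):
    """One parallel update of the Nagel-Schreckenberg traffic CA.
    road: 0/1 occupancy of cells on a ring; v: dict position -> speed."""
    L = len(road)
    new_road = [0] * L
    new_v = {}
    for pos in sorted(v):
        speed = min(v[pos] + 1, vmax)        # rule 1: accelerate
        gap = 1                              # rule 2: brake behind leader
        while gap <= speed and road[(pos + gap) % L] == 0:
            gap += 1
        speed = min(speed, gap - 1)
        if speed > 0 and rng.random() < p_slow:
            speed -= 1                       # rule 3: random slowdown
        new_pos = (pos + speed) % L          # rule 4: move
        new_road[new_pos] = 1
        new_v[new_pos] = speed
    return new_road, new_v
```

    Each update applies the four rules (accelerate, brake, randomly slow down, move) in parallel to every car on a circular road; even these simple rules reproduce emergent phenomena such as spontaneous traffic jams.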

  7. Strong anticipation and long-range cross-correlation: Application of detrended cross-correlation analysis to human behavioral data

    NASA Astrophysics Data System (ADS)

    Delignières, Didier; Marmelat, Vivien

    2014-01-01

    In this paper, we analyze empirical data, accounting for coordination processes between complex systems (bimanual coordination, interpersonal coordination, and synchronization with a fractal metronome), by using a recently proposed method: detrended cross-correlation analysis (DCCA). This work is motivated by the strong anticipation hypothesis, which supposes that coordination between complex systems is not achieved on the basis of local adaptations (i.e., correction, predictions), but results from a more global matching of complexity properties. Indeed, recent experiments have evidenced a very close correlation between the scaling properties of the series produced by two coordinated systems, despite a quite weak local synchronization. We hypothesized that strong anticipation should result in the presence of long-range cross-correlations between the series produced by the two systems. Results allow a detailed analysis of the effects of coordination on the fluctuations of the series produced by the two systems. In the long term, series tend to present similar scaling properties, with clear evidence of long-range cross-correlation. Short-term results strongly depend on the nature of the task. Simulation studies allow disentangling the respective effects of noise and short-term coupling processes on DCCA results, and suggest that the matching of long-term fluctuations could be the result of short-term coupling processes.
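
    DCCA itself is straightforward to sketch: both series are integrated into profiles, linearly detrended within non-overlapping boxes, and the detrended covariance is accumulated across boxes. The sketch below computes the detrended cross-correlation coefficient for a single box size (a simplified reading of the method; published analyses scan many box sizes to obtain scaling exponents):

```python
import math

def dcca_coefficient(x, y, box_size):
    """Detrended cross-correlation coefficient of two equal-length
    series for one box size: the detrended covariance normalized by
    the two detrended variances (so it lies in [-1, 1])."""
    n = min(len(x), len(y))
    mx = sum(x[:n]) / n
    my = sum(y[:n]) / n
    X, Y, sx, sy = [], [], 0.0, 0.0
    for i in range(n):                       # integrated "profile" series
        sx += x[i] - mx
        sy += y[i] - my
        X.append(sx)
        Y.append(sy)

    def residuals(seg):                      # least-squares linear detrend
        m = len(seg)
        t_mean = (m - 1) / 2.0
        s_mean = sum(seg) / m
        denom = sum((t - t_mean) ** 2 for t in range(m))
        slope = sum((t - t_mean) * (seg[t] - s_mean) for t in range(m)) / denom
        return [seg[t] - (s_mean + slope * (t - t_mean)) for t in range(m)]

    f_xy = f_xx = f_yy = 0.0
    for start in range(0, n - box_size + 1, box_size):
        rx = residuals(X[start:start + box_size])
        ry = residuals(Y[start:start + box_size])
        f_xy += sum(a * b for a, b in zip(rx, ry))
        f_xx += sum(a * a for a in rx)
        f_yy += sum(b * b for b in ry)
    return f_xy / math.sqrt(f_xx * f_yy)
```

    A series is perfectly cross-correlated with itself (coefficient 1) and perfectly anti-correlated with its negation (coefficient -1), which makes a convenient sanity check.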

  8. Observing Consistency in Online Communication Patterns for User Re-Identification.

    PubMed

    Adeyemi, Ikuesan Richard; Razak, Shukor Abd; Salleh, Mazleena; Venter, Hein S

    2016-01-01

    Comprehension of the statistical and structural mechanisms governing human dynamics in online interaction plays a pivotal role in online user identification, online profile development, and recommender systems. However, building a characteristic model of human dynamics on the Internet involves a complete analysis of the variations in human activity patterns, which is a complex process. This complexity is inherent in human dynamics and has not been extensively studied to reveal the structural composition of human behavior. A typical way to dissect such a complex system is to examine each of the independent interconnections that constitute its complexity. This paper examines the various dimensions of human communication patterns in online interactions. The study employed reliable server-side web data from 31 known users to explore characteristics of human-driven communications. Various machine-learning techniques were explored. The results revealed that each individual exhibited a relatively consistent, unique behavioral signature and that the logistic regression model and model tree can be used to accurately distinguish online users. These results are applicable to one-to-one online user identification processes, insider-misuse investigation processes, and online profiling in various areas.

  9. Active control of complex, multicomponent self-assembly processes

    NASA Astrophysics Data System (ADS)

    Schulman, Rebecca

    The kinetics of many complex biological self-assembly processes, such as cytoskeletal assembly, are precisely controlled by cells. Spatiotemporal control over the rates of filament nucleation, growth, and disassembly determines how self-assembly occurs and how the assembled form changes over time. These reaction rates can be manipulated by changing the concentrations of the components needed for assembly, by activating or deactivating them. I will describe how we can use these principles to design driven self-assembly processes in which we assemble and disassemble multiple types of components to create micron-scale networks of semiflexible filaments assembled from DNA. The same set of primitive components can be assembled into many different structures depending on the concentrations of the components and on how designed DNA-based chemical reaction networks manipulate these concentrations over time. These chemical reaction networks can in turn interpret environmental stimuli to direct complex, multistage responses. Such a system is a laboratory for understanding complex active-material behaviors, such as metamorphosis, self-healing, or adaptation to the environment, that are ubiquitous in biological systems but difficult to quantitatively characterize or engineer.

  10. Data Acquisition System for Multi-Frequency Radar Flight Operations Preparation

    NASA Technical Reports Server (NTRS)

    Leachman, Jonathan

    2010-01-01

    A three-channel data acquisition system was developed for the NASA Multi-Frequency Radar (MFR) system. The system is based on a commercial-off-the-shelf (COTS) industrial PC (personal computer) and two dual-channel 14-bit digital receiver cards. The decimated complex envelope representations of the three radar signals are passed to the host PC via the PCI bus, and then processed in parallel by multiple cores of the PC CPU (central processing unit). The innovation is this parallelization of the radar data processing using multiple cores of a standard COTS multi-core CPU. The data processing portion of the data acquisition software was built using autonomous program modules or threads, which can run simultaneously on different cores. A master program module calculates the optimal number of processing threads, launches them, and continually supplies each with data. The benefit of this new parallel software architecture is that COTS PCs can be used to implement increasingly complex processing algorithms on an increasing number of radar range gates and data rates. As new PCs become available with higher numbers of CPU cores, the software will automatically utilize the additional computational capacity.
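
    The master/worker split described above can be illustrated with a small sketch: a master sizes a pool of workers to the available cores and keeps each supplied with data blocks. Here a thread pool and a toy power-estimate stage stand in for the real per-core radar processing modules:

```python
import os
from concurrent.futures import ThreadPoolExecutor

def process_block(block):
    """Stand-in for one processing thread's work on one data block,
    e.g. mean power of complex samples given as (I, Q) pairs."""
    return sum(i * i + q * q for i, q in block) / len(block)

def run_pipeline(blocks):
    """Master module: size the worker pool to the available CPU cores
    and keep every worker continually supplied with data blocks."""
    workers = os.cpu_count() or 1
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map() preserves block order, so results line up with inputs
        return list(pool.map(process_block, blocks))
```

    Because the pool is sized from `os.cpu_count()`, the same code automatically uses more workers on machines with more cores, mirroring the scaling behavior claimed for the radar software.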

  11. Modeling and deadlock avoidance of automated manufacturing systems with multiple automated guided vehicles.

    PubMed

    Wu, Naiqi; Zhou, MengChu

    2005-12-01

    An automated manufacturing system (AMS) is computer-controlled and contains a number of versatile machines (or workstations), buffers, and an automated material handling system (MHS). An effective and flexible way to implement the MHS is to use an automated guided vehicle (AGV) system. The deadlock issue in AMS is very important to its operation and has been studied extensively. Deadlock problems have traditionally been treated separately for parts in production and for transportation, with many techniques developed for each problem. Such treatment, however, does not take advantage of the flexibility offered by multiple AGVs. In general, it is intractable to obtain a maximally permissive control policy for either problem. Instead, this paper investigates the two problems in an integrated way. First we model the AGV system and the part processing by resource-oriented Petri nets. Then the two models are integrated by using macro transitions. Based on the combined model, a novel control policy for deadlock avoidance is proposed. It is shown to be maximally permissive with computational complexity of O(n²), where n is the number of machines in the AMS, if the complexity of controlling part transportation by the AGVs is not considered. Thus, the complexity of deadlock avoidance for the whole system is bounded by the complexity of controlling the AGV system. An illustrative example shows its application and power.
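
    The place/transition mechanics that any Petri-net-based avoidance policy builds on can be sketched generically. This is plain token play on a P/T net (a transition fires by consuming tokens from its input places and depositing tokens in its output places), not the paper's resource-oriented nets, macro transitions, or control policy:

```python
def enabled(marking, pre, transition):
    """A transition may fire iff every input place holds enough tokens."""
    return all(marking[p] >= w for p, w in pre[transition].items())

def fire(marking, pre, post, transition):
    """Fire one transition: consume tokens from input places,
    deposit tokens in output places; returns the new marking."""
    if not enabled(marking, pre, transition):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p, w in pre[transition].items():
        m[p] -= w
    for p, w in post[transition].items():
        m[p] = m.get(p, 0) + w
    return m

def deadlocked(marking, pre):
    """Total deadlock: no transition is enabled in this marking."""
    return not any(enabled(marking, pre, t) for t in pre)
```

    An avoidance policy then amounts to refusing to fire any enabled transition whose resulting marking could lead only to deadlocked states.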

  12. On Using SysML, DoDAF 2.0 and UPDM to Model the Architecture for the NOAA's Joint Polar Satellite System (JPSS) Ground System (GS)

    NASA Technical Reports Server (NTRS)

    Hayden, Jeffrey L.; Jeffries, Alan

    2012-01-01

    The JPSS Ground System is a flexible system of systems responsible for telemetry, tracking & command (TT&C), data acquisition, routing, and data processing services for a varied fleet of satellites to support weather prediction, weather modeling, and climate modeling. To assist in this engineering effort, architecture modeling tools are being employed to translate the former NPOESS baseline to the new JPSS baseline. The paper will focus on the methodology for the system engineering process and the use of these architecture modeling tools within that process. The Department of Defense Architecture Framework version 2.0 (DoDAF 2.0) viewpoints and views that are being used to describe the JPSS GS architecture are discussed. The Unified Profile for DoDAF and MODAF (UPDM) and the Systems Modeling Language (SysML), as provided by extensions to the MagicDraw UML modeling tool, are used to develop the diagrams and tables that make up the architecture model. The model development process and structure are discussed, examples are shown, and details of handling the complexities of a large System of Systems (SoS), such as the JPSS GS, with an equally complex modeling tool are described.

  13. Architectures Toward Reusable Science Data Systems

    NASA Technical Reports Server (NTRS)

    Moses, John Firor

    2014-01-01

    Science Data Systems (SDS) comprise an important class of data processing systems that support product generation from remote sensors and in-situ observations. These systems enable research into new science data products, replication of experiments and verification of results. NASA has been building systems for satellite data processing since the first Earth observing satellites launched and is continuing development of systems to support NASA science research and NOAA's Earth observing satellite operations. The basic data processing workflows and scenarios continue to be valid for remote sensor observations research as well as for the complex multi-instrument operational satellite data systems being built today.

  14. Systems thinking and complexity: considerations for health promoting schools.

    PubMed

    Rosas, Scott R

    2017-04-01

    The health promoting schools concept reflects a comprehensive and integrated philosophy to improving student and personnel health and well-being. Conceptualized as a configuration of interacting, interdependent parts connected through a web of relationships that form a whole greater than the sum of its parts, school health promotion initiatives often target several levels (e.g. individual, professional, procedural and policy) simultaneously. Health promoting initiatives, such as those operationalized under the whole school approach, include several interconnected components that are coordinated to improve health outcomes in complex settings. These complex systems interventions are embedded in intricate arrangements of physical, biological, ecological, social, political and organizational relationships. Systems thinking and characteristics of complex adaptive systems are introduced in this article to provide a perspective that emphasizes the patterns of inter-relationships associated with the nonlinear, dynamic and adaptive nature of complex hierarchical systems. Four systems thinking areas: knowledge, networks, models and organizing are explored as a means to further manage the complex nature of the development and sustainability of health promoting schools. Applying systems thinking and insights about complex adaptive systems can illuminate how to address challenges found in settings with both complicated (i.e. multi-level and multisite) and complex aspects (i.e. synergistic processes and emergent outcomes). © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  15. How synthetic membrane systems contribute to the understanding of lipid-driven endocytosis.

    PubMed

    Schubert, Thomas; Römer, Winfried

    2015-11-01

    Synthetic membrane systems, such as giant unilamellar vesicles and solid supported lipid bilayers, have widened our understanding of biological processes occurring at or through membranes. Artificial systems are particularly suited to study the inherent properties of membranes with regard to their components and characteristics. This review critically reflects the emerging molecular mechanism of lipid-driven endocytosis and the impact of model membrane systems in elucidating the complex interplay of biomolecules within this process. Lipid receptor clustering induced by binding of several toxins, viruses and bacteria to the plasma membrane leads to local membrane bending and formation of tubular membrane invaginations. Here, lipid shape, and protein structure and valency are the essential parameters in membrane deformation. Combining observations of complex cellular processes and their reconstitution on minimal systems seems to be a promising future approach to resolve basic underlying mechanisms. This article is part of a Special Issue entitled: Mechanobiology. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Colour processing in complex environments: insights from the visual system of bees

    PubMed Central

    Dyer, Adrian G.; Paulk, Angelique C.; Reser, David H.

    2011-01-01

    Colour vision enables animals to detect and discriminate differences in chromatic cues independent of brightness. How the bee visual system manages this task is of interest for understanding information processing in miniaturized systems, as well as the relationship between bee pollinators and flowering plants. Bees can quickly discriminate dissimilar colours, but can also slowly learn to discriminate very similar colours, raising the question as to how the visual system can support this, or whether it is simply a learning and memory operation. We discuss the detailed neuroanatomical layout of the brain, identify probable brain areas for colour processing, and suggest that there may be multiple systems in the bee brain that mediate either coarse or fine colour discrimination ability in a manner dependent upon individual experience. These multiple colour pathways have been identified along both functional and anatomical lines in the bee brain, providing us with some insights into how the brain may operate to support complex colour discrimination behaviours. PMID:21147796

  17. AOIPS water resources data management system

    NASA Technical Reports Server (NTRS)

    Vanwie, P.

    1977-01-01

    The text and computer-generated displays used to demonstrate the AOIPS (Atmospheric and Oceanographic Information Processing System) water resources data management system are investigated. The system was developed to assist hydrologists in analyzing the physical processes occurring in watersheds. It was designed to alleviate some of the problems encountered while investigating the complex interrelationships of variables such as land-cover type, topography, precipitation, snow melt, surface runoff, evapotranspiration, and streamflow rates. The system has an interactive image processing capability and a color video display to display results as they are obtained.

  18. An intelligent factory-wide optimal operation system for continuous production process

    NASA Astrophysics Data System (ADS)

    Ding, Jinliang; Chai, Tianyou; Wang, Hongfeng; Wang, Junwei; Zheng, Xiuping

    2016-03-01

    In this study, a novel intelligent factory-wide operation system for a continuous production process is designed to optimise the entire production process, which consists of multiple units; furthermore, this system is developed using process operational data to avoid the complexity of mathematical modelling of the continuous production process. The data-driven approach aims to specify the structure of the optimal operation system; in particular, the operational data of the process are used to formulate each part of the system. In this context, the domain knowledge of process engineers is utilised, and a closed-loop dynamic optimisation strategy, which combines feedback, performance prediction, feed-forward, and dynamic tuning schemes into a framework, is employed. The effectiveness of the proposed system has been verified using industrial experimental results.

  19. Ultrafast Primary Reactions in the Photosystems of Oxygen-Evolving Organisms

    NASA Astrophysics Data System (ADS)

    Holzwarth, A. R.

    In oxygen-evolving photosynthetic organisms (plants, green algae, cyanobacteria), the primary steps of photosynthesis occur in two membrane-bound protein supercomplexes, Photosystem I (PS I) and Photosystem II (PS II), located in the thylakoid membrane (cf. Fig. 7.1) along with two other important protein complexes, the cytochrome b6/f complex and the ATP-synthase [1]. Each photosystem consists of a reaction center (RC), where the photoinduced early electron transfer processes occur; a so-called core antenna of chlorophyll (Chl) protein complexes responsible for light absorption and ultrafast energy transfer to the RC pigments; and additional peripheral antenna complexes of various kinds that increase the absorption cross-section. The peripheral complexes are Chl a/b-protein complexes in higher plants and green algae (LHC I or LHC II for PS I or PS II, respectively) and so-called phycobilisomes in cyanobacteria and red algae [2-4]. The structures and light-harvesting functions of these antenna systems have been extensively reviewed [2, 5-9]. Recently, X-ray structures of both PS I and PS II antenna/RC complexes have been determined, some to atomic resolution. Although many details of the pigment content and organization of the RCs and antenna systems of PS I and PS II were known before, the high-resolution structures of the integral complexes allow us for the first time to try to understand structure/function relationships in detail. This article covers our present understanding of the ultrafast energy transfer and early electron transfer processes occurring in the photosystems of oxygen-evolving organisms. The main emphasis will be on the electron transfer processes. However, in both photosystems the kinetics of the energy transfer processes in the core antennae is intimately interwoven with the kinetics of the electron transfer steps.
Since both types of processes occur on a similar time scale, their kinetics cannot be considered separately in any experiment and consequently they have to be discussed together.

  20. Solute transport with equilibrium aqueous complexation and either sorption or ion exchange: Simulation methodology and applications

    USGS Publications Warehouse

    Lewis, F.M.; Voss, C.I.; Rubin, J.

    1987-01-01

    Methodologies that account for specific types of chemical reactions in the simulation of solute transport can be developed so they are compatible with solution algorithms employed in existing transport codes. This enables the simulation of reactive transport in complex multidimensional flow regimes, and provides a means for existing codes to account for some of the fundamental chemical processes that occur among transported solutes. Two equilibrium-controlled reaction systems demonstrate a methodology for accommodating chemical interaction into models of solute transport. One system involves the sorption of a given chemical species, as well as two aqueous complexations in which the sorbing species is a participant. The other reaction set involves binary ion exchange coupled with aqueous complexation involving one of the exchanging species. The methodology accommodates these reaction systems through the addition of nonlinear terms to the transport equations for the sorbing species. Example simulation results show (1) the effect equilibrium chemical parameters have on the spatial distributions of concentration for complexing solutes; (2) that an interrelationship exists between mechanical dispersion and the various reaction processes; (3) that dispersive parameters of the porous media cannot be determined from reactive concentration distributions unless the reaction is accounted for or the influence of the reaction is negligible; (4) how the concentration of a chemical species may be significantly affected by its participation in an aqueous complex with a second species which also sorbs; and (5) that these coupled chemical processes influencing reactive transport can be demonstrated in two-dimensional flow regimes. © 1987.

  1. Multi-enzyme logic network architectures for assessing injuries: digital processing of biomarkers.

    PubMed

    Halámek, Jan; Bocharova, Vera; Chinnapareddy, Soujanya; Windmiller, Joshua Ray; Strack, Guinevere; Chuang, Min-Chieh; Zhou, Jian; Santhosh, Padmanabhan; Ramirez, Gabriela V; Arugula, Mary A; Wang, Joseph; Katz, Evgeny

    2010-12-01

    A multi-enzyme biocatalytic cascade simultaneously processing five biomarkers characteristic of traumatic brain injury (TBI) and soft tissue injury (STI) was developed. The system operates as a digital biosensor based on the concerted function of 8 Boolean AND logic gates, resulting in a decision about the physiological conditions based on logic analysis of complex patterns of the biomarkers. The system represents the first example of a multi-step/multi-enzyme biosensor with built-in logic for the analysis of complex combinations of biochemical inputs. The approach is based on recent advances in enzyme-based biocomputing systems, and the present paper demonstrates the potential applicability of biocomputing for developing novel digital biosensor networks.
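The digital AND-gate idea can be sketched in software terms: each biomarker concentration is digitized against a cutoff and the resulting bits are combined through AND gates, so the output fires only when the full pattern is present. The biomarker names and thresholds below are hypothetical, and the sketch ignores the enzymatic implementation entirely.

```python
def AND(a, b):
    """Two-input Boolean AND gate over already-digitized (0/1) inputs."""
    return a & b

def digitize(conc, threshold):
    """Map a biomarker concentration to logic 0/1 relative to a cutoff.
    Thresholds here are arbitrary placeholders, not values from the paper."""
    return 1 if conc >= threshold else 0

def injury_call(biomarkers, thresholds):
    """Concerted AND network: flag a condition only when every biomarker
    in the pattern exceeds its threshold (a sketch of the logic-gate idea,
    not the paper's actual enzyme cascade)."""
    bits = [digitize(biomarkers[k], thresholds[k]) for k in thresholds]
    out = bits[0]
    for b in bits[1:]:
        out = AND(out, b)
    return out

thresholds = {"enolase": 2.0, "LDH": 1.5, "GS": 0.8}   # hypothetical cutoffs
assert injury_call({"enolase": 3.1, "LDH": 2.2, "GS": 1.0}, thresholds) == 1
assert injury_call({"enolase": 3.1, "LDH": 0.2, "GS": 1.0}, thresholds) == 0
```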

  2. Human Aided Reinforcement Learning in Complex Environments

    DTIC Science & Technology

    learn to solve tasks through a trial-and-error process. As an agent takes ...task faster and more accurately, a human expert can be added to the system to guide an agent in solving the task. This project seeks to expand on current...the environment, which works particularly well for reactive tasks. In more complex tasks, these systems do not work as intended. The manipulation

  3. Doing Away with the "Native Speaker": A Complex Adaptive Systems Approach to L2 Phonological Attainment

    ERIC Educational Resources Information Center

    Aslan, Erhan

    2017-01-01

    Employing the complex adaptive systems (CAS) model, the present case study provides a self-report description of the attitudes, perceptions and experiences of an advanced adult L2 English learner with respect to his L2 phonological attainment. CAS is predicated on the notion that an individual's cognitive processes are intricately related to his…

  4. Behavior Models for Software Architecture

    DTIC Science & Technology

    2014-11-01

    MP. Existing process modeling frameworks (BPEL, BPMN [Grosskopf et al. 2009], IDEF) usually follow the “single flowchart” paradigm. MP separates...Process: Business Process Modeling using BPMN, Meghan Kiffer Press. HAREL, D., 1987, A Visual Formalism for Complex Systems. Science of Computer

  5. An Investigation into Semantic and Phonological Processing in Individuals with Williams Syndrome

    ERIC Educational Resources Information Center

    Lee, Cheryl S.; Binder, Katherine S.

    2014-01-01

    Purpose: The current study examined semantic and phonological processing in individuals with Williams syndrome (WS). Previous research in language processing in individuals with WS suggests a complex linguistic system characterized by "deviant" semantic organization and differential phonological processing. Method: Two experiments…

  6. Ontology of Earth's nonlinear dynamic complex systems

    NASA Astrophysics Data System (ADS)

    Babaie, Hassan; Davarpanah, Armita

    2017-04-01

    As a complex system, Earth and its major integrated and dynamically interacting subsystems (e.g., hydrosphere, atmosphere) display nonlinear behavior in response to internal and external influences. The Earth Nonlinear Dynamic Complex Systems (ENDCS) ontology formally represents the semantics of the knowledge about the nonlinear system element (agent) behavior, function, and structure, inter-agent and agent-environment feedback loops, and the emergent collective properties of the whole complex system as the result of interaction of the agents with other agents and their environment. It also models nonlinear concepts such as aperiodic, random, and chaotic behavior, sensitivity to initial conditions, bifurcation of dynamic processes, levels of organization, self-organization, aggregated and isolated functionality, and emergence of collective complex behavior at the system level. By incorporating several existing ontologies, the ENDCS ontology represents the dynamic system variables and the rules of transformation of their state, emergent state, and other features of complex systems such as the trajectories in state (phase) space (attractor and strange attractor), basins of attraction, basin divide (separatrix), fractal dimension, and the system's interface to its environment. The ontology also defines different object properties that change the system behavior, function, and structure and trigger instability. ENDCS will help to integrate the data and knowledge related to the five complex subsystems of Earth by annotating common data types, unifying the semantics of shared terminology, and facilitating interoperability among different fields of Earth science.
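Sensitivity to initial conditions, one of the nonlinear concepts the ontology covers, is easy to demonstrate with the standard logistic map. This is a generic textbook example, not drawn from ENDCS itself.

```python
def logistic(x, r=4.0):
    """One iteration of the logistic map x -> r*x*(1-x); r=4 is chaotic."""
    return r * x * (1 - x)

def trajectory(x0, n=40, r=4.0):
    """Iterate the map n times starting from x0."""
    xs = [x0]
    for _ in range(n):
        xs.append(logistic(xs[-1], r))
    return xs

a = trajectory(0.2)
b = trajectory(0.2 + 1e-9)   # tiny perturbation of the initial condition
divergence = max(abs(u - v) for u, v in zip(a, b))
```

After a few dozen iterations the nanometer-scale perturbation has grown to order one, the hallmark of chaotic dynamics the ontology formalizes.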

  7. Multi-Criteria Approach in Multifunctional Building Design Process

    NASA Astrophysics Data System (ADS)

    Gerigk, Mateusz

    2017-10-01

    The paper presents a new approach to the multifunctional building design process and defines problems related to the design of complex multifunctional buildings. Contemporary urban areas are characterized by very intensive use of space. Today, buildings are being built bigger and contain more diverse functions to meet the needs of a large number of users in one capacity. The trends show the need for recognition of design objects in an organized structure, which must meet current design criteria. The design process in terms of the complex system is a theoretical model, which is the basis for optimizing solutions over the entire life cycle of the building. From the concept phase through the exploitation phase to the disposal phase, multipurpose spaces should guarantee aesthetics, functionality, system efficiency, system safety and environmental protection in the best possible way. The result of the analysis of the design process is presented as a theoretical model of the multifunctional structure. Recognition of the multi-criteria model in the form of a Cartesian product allows creating a holistic representation of the designed building in the form of a graph model. The proposed network is the theoretical base that can be used in the design process of complex engineering systems. The systematic multi-criteria approach makes it possible to maintain control over the entire design process and to provide the best possible performance. With respect to current design requirements, there are no established design rules for multifunctional buildings in relation to their operating phase. Enriching the basic criteria with a functional flexibility criterion makes it possible to extend the exploitation phase, which brings advantages on many levels.

  8. Understanding global health governance as a complex adaptive system.

    PubMed

    Hill, Peter S

    2011-01-01

    The transition from international to global health reflects the rapid growth in the numbers and nature of stakeholders in health, as well as the constant change embodied in the process of globalisation itself. This paper argues that global health governance shares the characteristics of complex adaptive systems, with its multiple and diverse players, and their polyvalent and constantly evolving relationships, and rich and dynamic interactions. The sheer quantum of initiatives, the multiple networks through which stakeholders (re)configure their influence, the range of contexts in which development for health is played out - all compound the complexity of this system. This paper maps out the characteristics of complex adaptive systems as they apply to global health governance, linking them to developments in the past two decades, and the multiple responses to these changes. Examining global health governance through the frame of complexity theory offers insight into the current dynamics of governance, and while providing a framework for making meaning of the whole, opens up ways of accessing this complexity through local points of engagement.

  9. VPipe: Virtual Pipelining for Scheduling of DAG Stream Query Plans

    NASA Astrophysics Data System (ADS)

    Wang, Song; Gupta, Chetan; Mehta, Abhay

    There are data streams all around us that can be harnessed for tremendous business and personal advantage. For an enterprise-level stream processing system such as CHAOS [1] (Continuous, Heterogeneous Analytic Over Streams), handling of complex query plans with resource constraints is challenging. While several scheduling strategies exist for stream processing, efficient scheduling of complex DAG query plans is still largely unsolved. In this paper, we propose a novel execution scheme for scheduling complex directed acyclic graph (DAG) query plans with meta-data enriched stream tuples. Our solution, called Virtual Pipelined Chain (or VPipe Chain for short), effectively extends the "Chain" pipelining scheduling approach to complex DAG query plans.
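The underlying idea, cutting a DAG query plan into linear chains that can each be scheduled as one pipeline, can be sketched as follows. This is a simplified greedy decomposition for illustration, not the paper's VPipe algorithm.

```python
from collections import deque

def chain_decompose(dag):
    """Decompose a DAG query plan (node -> list of downstream operators)
    into linear chains. Walk nodes in topological order; start a chain at
    every node not yet absorbed, then extend it greedily along unvisited
    children. A sketch of Chain/VPipe-style pipelining, not the paper's
    actual scheduler."""
    nodes = set(dag)
    for vs in dag.values():
        nodes.update(vs)
    indeg = {u: 0 for u in nodes}
    for vs in dag.values():
        for v in vs:
            indeg[v] += 1
    order, q = [], deque(u for u in nodes if indeg[u] == 0)
    while q:
        u = q.popleft()
        order.append(u)
        for v in dag.get(u, []):
            indeg[v] -= 1
            if indeg[v] == 0:
                q.append(v)
    seen, chains = set(), []
    for u in order:
        if u in seen:
            continue
        chain = [u]
        seen.add(u)
        nxt = [v for v in dag.get(u, []) if v not in seen]
        while nxt:
            v = nxt[0]
            chain.append(v)
            seen.add(v)
            nxt = [w for w in dag.get(v, []) if w not in seen]
        chains.append(chain)
    return chains

# Diamond-shaped plan: source S fans out to operators A and B, joined at J.
plan = {"S": ["A", "B"], "A": ["J"], "B": ["J"], "J": []}
result = chain_decompose(plan)
```

Each returned chain respects the plan's edges, so every chain can run as one pipeline while the scheduler only arbitrates between chains.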

  10. Energy Center Structure Optimization by using Smart Technologies in Process Control System

    NASA Astrophysics Data System (ADS)

    Shilkina, Svetlana V.

    2018-03-01

    The article deals with the practical application of fuzzy logic methods in process control systems. The control object considered is an agroindustrial greenhouse complex that includes its own energy center. The paper analyzes the object's power supply options, taking into account connection to external power grids and/or installation of its own power generating equipment with various layouts. The main problem of a greenhouse facility's basic process is extremely uneven power consumption, which forces the purchase of redundant generating equipment that idles most of the time and quite negatively affects project profitability. Energy center structure optimization is largely based on solving the issue of constructing the object's process control system. To cut the investor's costs, it was proposed to optimize power consumption by building an energy-saving production control system based on a fuzzy logic controller. The developed algorithm for automated process control system functioning ensured more even electric and thermal energy consumption and allowed proposing construction of the object's energy center with a smaller number of units due to their more even utilization. As a result, it is shown how the practical use of a fuzzy microclimate control system during object functioning leads to optimization of the agroindustrial complex's energy facility structure, which contributes to a significant reduction in construction and operation costs.
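A fuzzy logic controller of the general kind described can be sketched as a tiny Mamdani-style rule base. The membership ranges, rule outputs, and variable names below are invented for illustration and are not taken from the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_heat_power(temp_err):
    """Map greenhouse temperature error (setpoint - measured, deg C) to
    heating power in [0, 1] by a weighted average of rule outputs.
    Three hypothetical rules: too cold -> high, on target -> medium,
    too warm -> off."""
    cold = tri(temp_err, 0.0, 5.0, 10.0)     # "too cold"
    ok   = tri(temp_err, -2.0, 0.0, 2.0)     # "on target"
    warm = tri(temp_err, -10.0, -5.0, 0.0)   # "too warm"
    w = cold + ok + warm
    if w == 0:
        return 0.0
    return (cold * 1.0 + ok * 0.4 + warm * 0.0) / w
```

Smoothing control output this way, instead of switching heaters on and off at hard thresholds, is what evens out the power draw the article is concerned with.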

  11. Reaction-diffusion controlled growth of complex structures

    NASA Astrophysics Data System (ADS)

    Noorduin, Willem; Mahadevan, L.; Aizenberg, Joanna

    2013-03-01

    Understanding how the emergence of complex forms and shapes in biominerals came about is of both fundamental and practical interest. Although biomineralization processes and organization strategies that give higher-order architectures have been studied extensively, synthetic approaches to mimic these self-assembled structures are highly complex and have been difficult to emulate, let alone replicate. The emergence of solution patterns has been found in reaction-diffusion systems such as Turing patterns and the BZ reaction. Intrigued by this spontaneous formation of complexity, we explored whether similar processes can lead to patterns in the solid state. We here identify a reaction-diffusion system in which the shape of the solidified products is a direct readout of the environmental conditions. Based on insights into the underlying mechanism, we developed a toolbox of engineering strategies to deterministically sculpt patterns and shapes, and to combine different morphologies to create a landscape of hierarchical, multiscale complex tectonic architectures with unprecedented levels of complexity. These findings may hold profound implications for understanding, mimicking and ultimately expanding upon nature's morphogenesis strategies, allowing the synthesis of advanced, highly complex microscale materials and devices. WLN acknowledges the Netherlands Organization for Scientific Research for financial support.
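The kind of reaction-diffusion dynamics invoked here can be sketched with the standard Gray-Scott model in one dimension. This is a generic textbook system with invented parameters; the chemistry the paper actually studies is different.

```python
import numpy as np

def gray_scott_step(u, v, Du=0.16, Dv=0.08, F=0.035, k=0.065, dt=1.0):
    """One explicit Euler step of the 1-D Gray-Scott reaction-diffusion
    model with periodic boundaries: substrate u feeds autocatalyst v,
    and the two diffuse at different rates. Shown only to illustrate how
    local reaction plus diffusion generates pattern; not the paper's system."""
    lap = lambda a: np.roll(a, 1) + np.roll(a, -1) - 2 * a
    uvv = u * v * v
    u_new = u + dt * (Du * lap(u) - uvv + F * (1 - u))
    v_new = v + dt * (Dv * lap(v) + uvv - (F + k) * v)
    return u_new, v_new

u = np.ones(128)
v = np.zeros(128)
v[60:68] = 0.5                      # local perturbation seeds the pattern
for _ in range(2000):
    u, v = gray_scott_step(u, v)
```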

  12. Measures and Metrics of Information Processing in Complex Systems: A Rope of Sand

    ERIC Educational Resources Information Center

    James, Ryan Gregory

    2013-01-01

    How much information do natural systems store and process? In this work we attempt to answer this question in multiple ways. We first establish a mathematical framework where natural systems are represented by a canonical form of edge-labeled hidden Markov models called ε-machines. Then, utilizing this framework, a variety of measures are defined and…

  13. Balancing Broad Ideas with Context: An Evaluation of Student Accuracy in Describing Ecosystem Processes after a System-Level Intervention

    ERIC Educational Resources Information Center

    Jordan, Rebecca C.; Brooks, Wesley R.; Hmelo-Silver, Cindy; Eberbach, Catherine; Sinha, Suparna

    2014-01-01

    Promoting student understanding of ecosystem processes is critical to biological education. Yet, teaching complex life systems can be difficult because systems are dynamic and often behave in a non-linear manner. In this paper, we discuss assessment results from a middle school classroom intervention in which a conceptual representation framework…

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    The four-dimensional scattering function S(Q,ω) obtained by inelastic neutron scattering measurements provides unique "dynamical fingerprints" of the spin state and interactions present in complex magnetic materials. Extracting this information, however, is currently a slow and complex process that may take an expert (depending on the complexity of the system) up to several weeks of painstaking work to complete. Spin Wave Genie was created to abstract and automate this process. It strives both to reduce the time to complete this analysis and to make these calculations more accessible to a broader group of scientists and engineers.

  15. Collaborative simulation method with spatiotemporal synchronization process control

    NASA Astrophysics Data System (ADS)

    Zou, Yisheng; Ding, Guofu; Zhang, Weihua; Zhang, Jian; Qin, Shengfeng; Tan, John Kian

    2016-10-01

    When designing a complex mechatronic system, such as a high-speed train, it is relatively difficult to effectively simulate the entire system's dynamic behaviors because it involves multi-disciplinary subsystems. Currently, the most practical approach for multi-disciplinary simulation is the interface-based coupling simulation method, but it faces a twofold challenge: spatial and temporal desynchronization among the multi-directional coupling simulations of subsystems. A new collaborative simulation method with spatiotemporal synchronization process control is proposed for coupled simulation of a given complex mechatronic system across multiple subsystems on different platforms. The method consists of (1) a coupler-based coupling mechanism to define the interfacing and interaction mechanisms among subsystems, and (2) a simulation process control algorithm to realize the coupling simulation in a spatiotemporally synchronized manner. The test results from a case study show that the proposed method (1) can certainly be used to simulate the subsystems' interactions under different simulation conditions in an engineering system, and (2) effectively supports multi-directional coupling simulation among multi-disciplinary subsystems. This method has been successfully applied in China's high-speed train design and development processes, demonstrating that it can be applied to a wide range of engineering system design and simulation tasks with improved efficiency and effectiveness.
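The coupler idea, subsystems that advance independently and exchange interface values only at synchronized macro time points, can be sketched on a toy pair of coupled ODEs. This is a minimal Jacobi-type scheme with invented dynamics, not the paper's coupling mechanism.

```python
def cosimulate(steps=100, H=0.01):
    """Jacobi-type coupled simulation: two subsystems exchange interface
    values at each synchronized macro step, then each advances one Euler
    step using the frozen value from the other side.
    Subsystem A: dx/dt = -x + u_a; subsystem B: dy/dt = -2y + u_b,
    with cross-coupling u_a = y, u_b = x (dynamics invented for the sketch)."""
    x, y = 1.0, 0.0
    for _ in range(steps):
        u_a, u_b = y, x            # exchange at the synchronization point
        x = x + H * (-x + u_a)     # each side advances independently...
        y = y + H * (-2 * y + u_b)  # ...using the frozen interface value
    return x, y

x, y = cosimulate()
```

Because the exchanged values are frozen between synchronization points, both subsystems see a consistent interface state, which is the essence of the spatiotemporal synchronization the paper controls explicitly.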

  16. Observations on Complexity and Costs for Over Three Decades of Communications Satellites

    NASA Astrophysics Data System (ADS)

    Bearden, David A.

    2002-01-01

    This paper takes an objective look at approximately thirty communications satellites built over three decades, using a complexity index as an economic model. The complexity index is derived from a number of technical parameters including dry mass, end-of-life power, payload type, communication bands, spacecraft lifetime, and attitude control approach. Complexity is then plotted versus total satellite cost and development time (defined as contract start to first launch). A comparison of the relative cost and development time for various classes of communications satellites and conclusions regarding dependence on system complexity are presented. Observations regarding inherent differences between commercially acquired systems and those procured by government organizations are also presented. A process is described whereby a new communications system in the formative stage may be compared against similarly "complex" missions of the recent past to balance risk within allotted time and funds.
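A complexity index of this general shape, a normalized weighted sum of technical parameters, can be sketched as follows. The weights, parameter ranges, and example values are invented; the paper's actual index definition is not reproduced here.

```python
def complexity_index(params, weights, ranges):
    """Normalized weighted sum of technical parameters (dry mass,
    end-of-life power, lifetime, ...), each scaled to [0, 1] over the
    range observed across the fleet. 0 = simplest, 1 = most complex."""
    idx = 0.0
    for name, w in weights.items():
        lo, hi = ranges[name]
        idx += w * (params[name] - lo) / (hi - lo)
    return idx / sum(weights.values())

# Hypothetical fleet ranges and weights, for illustration only.
ranges = {"dry_mass_kg": (500, 3000), "eol_power_w": (1000, 15000), "life_yr": (5, 15)}
weights = {"dry_mass_kg": 0.4, "eol_power_w": 0.4, "life_yr": 0.2}
small = complexity_index({"dry_mass_kg": 800, "eol_power_w": 2000, "life_yr": 7},
                         weights, ranges)
large = complexity_index({"dry_mass_kg": 2800, "eol_power_w": 14000, "life_yr": 15},
                         weights, ranges)
```

Plotting such an index against cost and schedule for past missions is what lets a new design be placed among "similarly complex" precedents.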

  17. Free Energy and Virtual Reality in Neuroscience and Psychoanalysis: A Complexity Theory of Dreaming and Mental Disorder.

    PubMed

    Hopkins, Jim

    2016-01-01

    The main concepts of the free energy (FE) neuroscience developed by Karl Friston and colleagues parallel those of Freud's Project for a Scientific Psychology. In Hobson et al. (2014) these include an innate virtual reality generator that produces the fictive prior beliefs that Freud described as the primary process. This enables Friston's account to encompass a unified treatment (a complexity theory) of the role of virtual reality in both dreaming and mental disorder. In both accounts the brain operates to minimize FE aroused by sensory impingements (including interoceptive impingements that report compliance with biological imperatives) and constructs a representation/model of the causes of impingement that enables this minimization. In Friston's account (variational) FE equals complexity minus accuracy, and is minimized by increasing accuracy and decreasing complexity. Roughly, the brain (or model) increases accuracy together with complexity in waking. This is mediated by consciousness-creating active inference, by which it explains sensory impingements in terms of perceptual experiences of their causes. In sleep it reduces complexity by processes that include both synaptic pruning and consciousness/virtual reality/dreaming in REM. The consciousness-creating active inference that effects complexity-reduction in REM dreaming must operate on FE-arousing data distinct from sensory impingement. The most relevant source is remembered arousals of emotion, both recent and remote, as processed in SWS and REM on "active systems" accounts of memory consolidation/reconsolidation. Freud describes these remembered arousals as condensed in the dreamwork for use in the conscious contents of dreams, and similar condensation can be seen in symptoms. Complexity partly reflects emotional conflict and trauma.
This indicates that dreams and symptoms are both produced to reduce complexity in the form of potentially adverse (traumatic or conflicting) arousals of amygdala-related emotions. Mental disorder is thus caused by computational complexity together with mechanisms like synaptic pruning that have evolved for complexity-reduction; and important features of disorder can be understood in these terms. Details of the consilience among Freudian, systems consolidation, and complexity-reduction accounts appear clearly in the analysis of a single fragment of a dream, indicating also how complexity reduction proceeds by a process resembling Bayesian model selection.

  18. Conceptual Modeling in Systems Biology Fosters Empirical Findings: The mRNA Lifecycle

    PubMed Central

    Dori, Dov; Choder, Mordechai

    2007-01-01

    One of the main obstacles to understanding complex biological systems is the extent and rapid evolution of information, way beyond the capacity of individuals to manage and comprehend. Current modeling approaches and tools lack adequate capacity to concurrently model the structure and behavior of biological systems. Here we propose Object-Process Methodology (OPM), a holistic conceptual modeling paradigm, as a means to model biological systems both diagrammatically and textually, formally and intuitively, at any desired number of levels of detail. OPM combines objects, e.g., proteins, and processes, e.g., transcription, in a way that is simple and easily comprehensible to researchers and scholars. As a case in point, we modeled the yeast mRNA lifecycle. The mRNA lifecycle involves mRNA synthesis in the nucleus, mRNA transport to the cytoplasm, and its subsequent translation and degradation therein. Recent studies have identified specific cytoplasmic foci, termed processing bodies, that contain large complexes of mRNAs and decay factors. Our OPM model of this cellular subsystem, presented here, led to the discovery of a new constituent of these complexes, the translation termination factor eRF3. Association of eRF3 with processing bodies is observed after a long-term starvation period. We suggest that OPM can eventually serve as a comprehensive evolvable model of the entire living cell system. The model would serve as a research and communication platform, highlighting unknown and uncertain aspects that can be addressed empirically and updated accordingly while maintaining consistency. PMID:17849002

  19. Complex of technologies and prototype systems for eco-friendly shutdown of the power-generating, process, capacitive, and transport equipment

    NASA Astrophysics Data System (ADS)

    Smorodin, A. I.; Red'kin, V. V.; Frolov, Y. D.; Korobkov, A. A.; Kemaev, O. V.; Kulik, M. V.; Shabalin, O. V.

    2015-07-01

    A set of technologies and prototype systems for eco-friendly shutdown of power-generating, process, capacitive, and transport equipment is offered. The following technologies are regarded as core technologies for the complex: nitrogen cryogenic technology for displacement of hydrogen from the cooling circuit of turbine generators, cryoblasting of the power units with carbon dioxide granules, preservation of the shutdown power units with dehydrated air, and dismantling and severing of equipment and structural materials of power units. Four prototype systems for eco-friendly shutdown of the power units may be built on the basis of the selected technologies: a multimode nitrogen cryogenic system with four subsystems, a cryoblasting system with CO2 granules for the thermal-mechanical and electrical equipment of power units, a compressionless air-drainage system for drying and storage of the shutdown power units, and a cryo-gas system for general severing of the steam-turbine power units. Results of the research and of pilot and demonstration tests of the operational units of the considered technological systems allow applying the proposed technologies and systems in prototype systems for shutdown of power-generating, process, capacitive, and transport equipment.

  20. Understanding the biological underpinnings of ecohydrological processes

    NASA Astrophysics Data System (ADS)

    Huxman, T. E.; Scott, R. L.; Barron-Gafford, G. A.; Hamerlynck, E. P.; Jenerette, D.; Tissue, D. T.; Breshears, D. D.; Saleska, S. R.

    2012-12-01

    Climate change presents a challenge for predicting ecosystem response, as multiple factors drive both the physical and life processes happening on the land surface and their interactions result in a complex, evolving coupled system. For example, changes in surface temperature and precipitation influence near-surface hydrology through impacts on system energy balance, affecting a range of physical processes. These changes in the salient features of the environment affect biological processes and elicit responses along the hierarchy of life (biochemistry to community composition). Many of these structural or process changes can alter patterns of soil water-use and influence land surface characteristics that affect local climate. Of the many features that affect our ability to predict the future dynamics of ecosystems, it is this hierarchical response of life that creates substantial complexity. Advances in the ability to predict or understand aspects of demography help describe thresholds in coupled ecohydrological systems. Disentangling the physical and biological features that underlie land surface dynamics following disturbance is allowing a better understanding of the partitioning of water in the time-course of recovery. Better predicting the timing of phenology and key seasonal events allows for a more accurate description of the full functional response of the land surface to climate. In addition, explicitly considering the hierarchical structural features of life is helping to describe complex time-dependent behavior in ecosystems. However, despite this progress, we have yet to build an ability to fully account for the generalization of the main features of living systems into models that can describe ecohydrological processes, especially acclimation, assembly and adaptation. This is unfortunate, given that many key ecosystem services are functions of these coupled co-evolutionary processes.
    To date, the lack of both controlled measurements and experimentation has precluded sufficient theoretical development. Understanding the land-surface response and feedback to climate change requires a mechanistic understanding of the coupling of ecological and hydrological processes and an expansion of theory from the life sciences to contribute appropriately to the broader Earth system science goal.

  1. An expert systems application to space base data processing

    NASA Technical Reports Server (NTRS)

    Babb, Stephen M.

    1988-01-01

    The advent of space vehicles with their increased data requirements is reflected in the complexity of future telemetry systems. Space-based operations, with their immense operating costs, will shift the burden of data processing and routine analysis from the space station to the Orbital Transfer Vehicle (OTV). A research and development project is described which addresses the real-time onboard data processing tasks associated with a space-based vehicle, specifically focusing on an implementation of an expert system.

  2. Applying dynamic simulation modeling methods in health care delivery research-the SIMULATE checklist: report of the ISPOR simulation modeling emerging good practices task force.

    PubMed

    Marshall, Deborah A; Burgos-Liz, Lina; IJzerman, Maarten J; Osgood, Nathaniel D; Padula, William V; Higashi, Mitchell K; Wong, Peter K; Pasupathy, Kalyan S; Crown, William

    2015-01-01

    Health care delivery systems are inherently complex, consisting of multiple tiers of interdependent subsystems and processes that are adaptive to changes in the environment and behave in a nonlinear fashion. Traditional health technology assessment and modeling methods often neglect the wider health system impacts that can be critical for achieving desired health system goals and are often of limited usefulness when applied to complex health systems. Researchers and health care decision makers can either underestimate or fail to consider the interactions among the people, processes, technology, and facility designs. Health care delivery system interventions need to incorporate the dynamics and complexities of the health care system context in which the intervention is delivered. This report provides an overview of common dynamic simulation modeling methods and examples of health care system interventions in which such methods could be useful. Three dynamic simulation modeling methods are presented to evaluate system interventions for health care delivery: system dynamics, discrete event simulation, and agent-based modeling. In contrast to conventional evaluations, a dynamic systems approach incorporates the complexity of the system and anticipates the upstream and downstream consequences of changes in complex health care delivery systems. This report assists researchers and decision makers in deciding whether these simulation methods are appropriate to address specific health system problems through an eight-point checklist referred to as the SIMULATE (System, Interactions, Multilevel, Understanding, Loops, Agents, Time, Emergence) tool. It is a primer for researchers and decision makers working in health care delivery and implementation sciences who face complex challenges in delivering effective and efficient care that can be addressed with system interventions. 
On reviewing this report, the readers should be able to identify whether these simulation modeling methods are appropriate to answer the problem they are addressing and to recognize the differences of these methods from other modeling approaches used typically in health technology assessment applications. Copyright © 2015 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
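Of the three method families the report presents, discrete event simulation is the most compact to sketch. The toy single-server clinic below uses the Lindley recurrence for waiting times; all parameters and the scenario are invented for illustration and are not from the report.

```python
import random

def des_clinic(n_patients=200, arrival_mean=5.0, service_mean=4.0, seed=1):
    """Minimal discrete-event simulation of a single-server clinic:
    exponential inter-arrival and service times, first-come first-served.
    Each patient's wait follows from when the server last frees up
    (the Lindley recurrence), so no explicit event queue is needed.
    Returns the mean waiting time in the same (invented) time units."""
    random.seed(seed)
    t, busy_until, waits = 0.0, 0.0, []
    for _ in range(n_patients):
        t += random.expovariate(1.0 / arrival_mean)   # next arrival time
        start = max(t, busy_until)                    # wait if server busy
        waits.append(start - t)
        busy_until = start + random.expovariate(1.0 / service_mean)
    return sum(waits) / len(waits)

avg_wait = des_clinic()
```

With utilization around 0.8, queueing delay dominates, which is exactly the kind of system-level consequence the SIMULATE checklist asks modelers to anticipate before choosing a method.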

  3. Understanding Whole Systems Change in Health Care: Insights into System Level Diffusion from Nursing Service Delivery Innovations--A Multiple Case Study

    ERIC Educational Resources Information Center

    Berta, Whitney; Virani, Tazim; Bajnok, Irmajean; Edwards, Nancy; Rowan, Margo

    2014-01-01

    Our study responds to calls for theory-driven approaches to studying innovation diffusion processes in health care. While most research on diffusion in health care is situated at the service delivery level, we study innovations and associated processes that have diffused to the system level, and refer to work on complex adaptive systems and whole…

  4. Personnel Selection Influences on Remotely Piloted Aircraft Human-System Integration.

    PubMed

    Carretta, Thomas R; King, Raymond E

    2015-08-01

    Human-system integration (HSI) is a complex process used to design and develop systems that integrate human capabilities and limitations in an effective and affordable manner. Effective HSI incorporates several domains, including manpower, personnel and training, human factors, environment, safety, occupational health, habitability, survivability, logistics, intelligence, mobility, and command and control. To achieve effective HSI, the relationships among these domains must be considered. Although this integrated approach is well documented, there are many instances where it is not followed. Human factors engineers typically focus on system design with little attention to the skills, abilities, and other characteristics needed by human operators. When problems with fielded systems occur, additional training of personnel is developed and conducted. Personnel selection is seldom considered during the HSI process. Complex systems such as aviation require careful selection of the individuals who will interact with the system. Personnel selection is a two-stage process involving select-in and select-out procedures. Select-in procedures determine which candidates have the aptitude to profit from training and represent the best investment. Select-out procedures focus on medical qualification and determine who should not enter training for medical reasons. The current paper discusses the role of personnel selection in the HSI process in the context of remotely piloted aircraft systems.

  5. Nonterrestrial material processing and manufacturing of large space systems

    NASA Technical Reports Server (NTRS)

    Von Tiesenhausen, G.

    1979-01-01

    Nonterrestrial processing of materials and the manufacturing of large space system components from preprocessed lunar materials at a manufacturing site in space are described. Lunar materials mined and preprocessed at the lunar resource complex will be flown to the space manufacturing facility (SMF), where, together with supplementary terrestrial materials, they will undergo final processing and fabrication into space communication systems, solar cell blankets, radio frequency generators, and electrical equipment. Satellite Power System (SPS) material requirements and lunar material availability and utilization are detailed, and the SMF processing, refining, and fabricating facilities, material flow, and manpower requirements are described.

  6. Towards the understanding of network information processing in biology

    NASA Astrophysics Data System (ADS)

    Singh, Vijay

    Living organisms perform incredibly well in detecting signals present in the environment. This information processing is achieved near optimally and quite reliably, even though the sources of signals are highly variable and complex. The work in the last few decades has given us a fair understanding of how individual signal processing units like neurons and cell receptors process signals, but the principles of collective information processing on biological networks are far from clear. Information processing in biological networks, like the brain, metabolic circuits, cellular-signaling circuits, etc., involves complex interactions among a large number of units (neurons, receptors). The combinatorially large number of states such a system can exist in makes it impossible to study these systems from first principles, starting from the interactions between the basic units. The principles of collective information processing on such complex networks can instead be identified using coarse-graining approaches. This could provide insights into the organization and function of complex biological networks. Here I study models of biological networks using continuum dynamics, renormalization, maximum likelihood estimation and information theory. Such coarse-graining approaches identify features that are essential for certain processes performed by the underlying biological networks. We find that long-range connections in the brain allow for global-scale feature detection in a signal. These connections also suppress noise and remove gaps present in the signal. Hierarchical organization with long-range connections leads to large-scale connectivity at low synapse numbers. Time delays can be utilized to separate a mixture of signals with different temporal scales. Our observations indicate that the rules of multivariate signal processing are quite different from those of traditional single-unit signal processing.

  7. Process consistency in models: The importance of system signatures, expert knowledge, and process complexity

    NASA Astrophysics Data System (ADS)

    Hrachowitz, M.; Fovet, O.; Ruiz, L.; Euser, T.; Gharari, S.; Nijzink, R.; Freer, J.; Savenije, H. H. G.; Gascuel-Odoux, C.

    2014-09-01

    Hydrological models frequently suffer from limited predictive power despite adequate calibration performances. This can indicate insufficient representations of the underlying processes. Thus, ways are sought to increase model consistency while satisfying the contrasting priorities of increased model complexity and limited equifinality. In this study, the value of a systematic use of hydrological signatures and expert knowledge for increasing model consistency was tested. It was found that a simple conceptual model, constrained by four calibration objective functions, was able to adequately reproduce the hydrograph in the calibration period. The model, however, could not reproduce a suite of hydrological signatures, indicating a lack of model consistency. Subsequently, testing 11 models, model complexity was increased in a stepwise way and counter-balanced by "prior constraints," inferred from expert knowledge to ensure a model which behaves well with respect to the modeler's perception of the system. We showed that, in spite of unchanged calibration performance, the most complex model setup exhibited increased performance in the independent test period and skill to better reproduce all tested signatures, indicating a better system representation. The results suggest that a model may be inadequate despite good performance with respect to multiple calibration objectives and that increasing model complexity, if counter-balanced by prior constraints, can significantly increase predictive performance of a model and its skill to reproduce hydrological signatures. The results strongly illustrate the need to balance automated model calibration with a more expert-knowledge-driven strategy of constraining models.
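    Hydrological signatures of the kind used here to test model consistency can be computed directly from a discharge series. The two functions below (rank-based exceedance quantiles and the slope of the log flow-duration curve between 33% and 66% exceedance) are generic textbook choices, not necessarily the paper's exact definitions:

```python
import math

def exceedance_quantile(q, p):
    """Flow exceeded a fraction p (0-1) of the time, by rank in the sorted series."""
    s = sorted(q, reverse=True)
    idx = min(int(p * len(s)), len(s) - 1)
    return s[idx]

def fdc_slope(q):
    """Slope of the log flow-duration curve between 33% and 66% exceedance.

    Flows must be strictly positive; a flashier regime gives a steeper slope.
    """
    q33 = exceedance_quantile(q, 0.33)
    q66 = exceedance_quantile(q, 0.66)
    return (math.log(q33) - math.log(q66)) / (0.66 - 0.33)
```

    A constant discharge series yields a slope of zero, while a series spanning a wide range of flows yields a positive slope, which is the kind of behavioural information a hydrograph-fit objective function alone can miss.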

  8. "Chemical transformers" from nanoparticle ensembles operated with logic.

    PubMed

    Motornov, Mikhail; Zhou, Jian; Pita, Marcos; Gopishetty, Venkateshwarlu; Tokarev, Ihor; Katz, Evgeny; Minko, Sergiy

    2008-09-01

    pH-responsive nanoparticles were coupled with information-processing enzyme-based systems to yield "smart" signal-responsive hybrid systems with built-in Boolean logic. The enzyme systems performed AND/OR logic operations, transducing biochemical input signals into reversible structural changes (signal-directed self-assembly) of the nanoparticle assemblies, thus resulting in the processing and amplification of the biochemical signals. The hybrid system mimics biological systems in the effective processing of complex biochemical information, resulting in reversible changes of the self-assembled structures of the nanoparticles. The bioinspired approach to nanostructured morphing materials could be used in future self-assembled molecular robotic systems.
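    The logic layer described above can be sketched in a few lines. The thresholding scheme, gate names, and state labels below are illustrative assumptions of mine, not the paper's implementation:

```python
# Toy sketch: thresholded AND/OR gating of two normalized biochemical input
# signals, with the gate output mapped onto a notional reversible
# "assembled"/"dispersed" nanoparticle state.

def gate_output(a, b, gate="AND", threshold=0.5):
    """Binarize two normalized input signals and apply a Boolean gate."""
    bits = (a >= threshold, b >= threshold)
    if gate == "AND":
        return all(bits)
    if gate == "OR":
        return any(bits)
    raise ValueError(f"unknown gate: {gate}")

def particle_state(output):
    """Map the logic output onto the reversible structural change."""
    return "assembled" if output else "dispersed"
```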

  9. Influence of the preparation method on the physicochemical properties of indomethacin and methyl-β-cyclodextrin complexes.

    PubMed

    Rudrangi, Shashi Ravi Suman; Bhomia, Ruchir; Trivedi, Vivek; Vine, George J; Mitchell, John C; Alexander, Bruce David; Wicks, Stephen Richard

    2015-02-20

    The main objective of this study was to investigate different manufacturing processes claimed to promote inclusion complexation between indomethacin and cyclodextrins in order to enhance the apparent solubility and dissolution properties of indomethacin. Especially, the effectiveness of supercritical carbon dioxide processing for preparing solid drug-cyclodextrin inclusion complexes was investigated and compared to other preparation methods. The complexes were prepared by physical mixing, co-evaporation, freeze drying from aqueous solution, spray drying and supercritical carbon dioxide processing methods. The prepared complexes were then evaluated by scanning electron microscopy, differential scanning calorimetry, X-ray powder diffraction, solubility and dissolution studies. The method of preparation of the inclusion complexes was shown to influence the physicochemical properties of the formed complexes. Indomethacin exists in a highly crystalline solid form. Physical mixing of indomethacin and methyl-β-cyclodextrin appeared not to reduce the degree of crystallinity of the drug. The co-evaporated and freeze dried complexes had a lower degree of crystallinity than the physical mix; however the lowest degree of crystallinity was achieved in complexes prepared by spray drying and supercritical carbon dioxide processing methods. All systems based on methyl-β-cyclodextrin exhibited better dissolution properties than the drug alone. The greatest improvement in drug dissolution properties was obtained from complexes prepared using supercritical carbon dioxide processing, thereafter by spray drying, freeze drying, co-evaporation and finally by physical mixing. Supercritical carbon dioxide processing is well known as an energy efficient alternative to other pharmaceutical processes and may have application for the preparation of solid-state drug-cyclodextrin inclusion complexes. 
It is an effective and economic method that allows the formation of solid complexes with a high yield, without the use of organic solvents and problems associated with their residues. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Direct-to-digital holography reduction of reference hologram noise and fourier space smearing

    DOEpatents

    Voelkl, Edgar

    2006-06-27

    Systems and methods are described for reduction of reference hologram noise and reduction of Fourier space smearing, especially in the context of direct-to-digital holography (off-axis interferometry). A method of reducing reference hologram noise includes: recording a plurality of reference holograms; processing the plurality of reference holograms into a corresponding plurality of reference image waves; and transforming the corresponding plurality of reference image waves into a reduced noise reference image wave. A method of reducing smearing in Fourier space includes: recording a plurality of reference holograms; processing the plurality of reference holograms into a corresponding plurality of reference complex image waves; transforming the corresponding plurality of reference image waves into a reduced noise reference complex image wave; recording a hologram of an object; processing the hologram of the object into an object complex image wave; and dividing the complex image wave of the object by the reduced noise reference complex image wave to obtain a reduced smearing object complex image wave.
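    The two claimed steps can be sketched numerically, under the simplifying assumption that each reference hologram has already been processed into a complex image wave on a common pixel grid (the function names and the regularization term are mine):

```python
import numpy as np

# Sketch: (1) combine several noisy reference complex image waves into one
# reduced-noise wave by complex averaging; (2) divide the object complex image
# wave by it to remove systematic amplitude/phase errors.

def reduced_noise_reference(reference_waves):
    """Average a stack of complex reference image waves (N, H, W) -> (H, W)."""
    return np.mean(reference_waves, axis=0)

def correct_object_wave(object_wave, reference_wave, eps=1e-12):
    """Divide the object wave by the reference wave (eps avoids division by zero)."""
    return object_wave / (reference_wave + eps)
```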

  11. Connectivity in the human brain dissociates entropy and complexity of auditory inputs

    PubMed Central

    Nastase, Samuel A.; Iacovella, Vittorio; Davis, Ben; Hasson, Uri

    2015-01-01

    Complex systems are described according to two central dimensions: (a) the randomness of their output, quantified via entropy; and (b) their complexity, which reflects the organization of a system's generators. Whereas some approaches hold that complexity can be reduced to uncertainty or entropy, an axiom of complexity science is that signals with very high or very low entropy are generated by relatively non-complex systems, while complex systems typically generate outputs with entropy peaking between these two extremes. In understanding their environment, individuals would benefit from coding for both input entropy and complexity; entropy indexes uncertainty and can inform probabilistic coding strategies, whereas complexity reflects a concise and abstract representation of the underlying environmental configuration, which can serve independent purposes, e.g., as a template for generalization and rapid comparisons between environments. Using functional neuroimaging, we demonstrate that, in response to passively processed auditory inputs, functional integration patterns in the human brain track both the entropy and complexity of the auditory signal. Connectivity between several brain regions scaled monotonically with input entropy, suggesting sensitivity to uncertainty, whereas connectivity between other regions tracked entropy in a convex manner consistent with sensitivity to input complexity. These findings suggest that the human brain simultaneously tracks the uncertainty of sensory data and effectively models their environmental generators. PMID:25536493
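    The entropy half of this distinction is easy to make concrete. The toy measure below (mine, not the study's) scores output randomness only: a constant sequence and a fair coin sit at opposite entropy extremes even though both are generated by trivially simple processes, which is exactly why complexity cannot be reduced to entropy:

```python
import math
from collections import Counter

def shannon_entropy(seq):
    """Shannon entropy (bits) of the empirical symbol distribution of seq."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```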

  12. Modeling and Simulation for Mission Operations Work System Design

    NASA Technical Reports Server (NTRS)

    Sierhuis, Maarten; Clancey, William J.; Seah, Chin; Trimble, Jay P.; Sims, Michael H.

    2003-01-01

    Work system analysis and design is complex and non-deterministic. In this paper we describe Brahms, a multiagent modeling and simulation environment for designing complex interactions in human-machine systems. Brahms was originally conceived as a business process design tool that simulates work practices, including social systems of work. We describe our modeling and simulation method for mission operations work system design, based on a research case study in which we used Brahms to design mission operations for a proposed discovery mission to the Moon. We then describe the results of an actual method application project: the Brahms Mars Exploration Rover. Space mission operations are similar to operations of traditional organizations; we show that the application of Brahms for space mission operations design is relevant and transferable to other types of business processes in organizations.

  13. Sustainable development goals for global health: facilitating good governance in a complex environment.

    PubMed

    Haffeld, Just

    2013-11-01

    Increasing complexity is following in the wake of rampant globalization. Thus, the discussion about Sustainable Development Goals (SDGs) requires new thinking that departs from a critique of current policy tools in exploration of a complexity-friendly approach. This article argues that potential SDGs should: treat stakeholders, like states, business and civil society actors, as agents on different aggregate levels of networks; incorporate good governance processes that facilitate early involvement of relevant resources, as well as equitable participation, consultative processes, and regular policy and programme implementation reviews; anchor adoption and enforcement of such rules to democratic processes in accountable organizations; and include comprehensive systems evaluations, including procedural indicators. A global framework convention for health could be a suitable instrument for handling some of the challenges related to the governance of a complex environment. It could structure and legitimize government involvement, engage stakeholders, arrange deliberation and decision-making processes with due participation and regular policy review, and define minimum standards for health services. A monitoring scheme could ensure that agents in networks comply according to whole-systems targets, locally defined outcome indicators, and process indicators, thus resolving the paradox of government control vs. local policy space. A convention could thus exploit the energy created in the encounter between civil society, international organizations and national authorities. Copyright © 2013 Reproductive Health Matters. Published by Elsevier Ltd. All rights reserved.

  14. The value of mechanistic biophysical information for systems-level understanding of complex biological processes such as cytokinesis.

    PubMed

    Pollard, Thomas D

    2014-12-02

    This review illustrates the value of quantitative information including concentrations, kinetic constants and equilibrium constants in modeling and simulating complex biological processes. Although much has been learned about some biological systems without these parameter values, they greatly strengthen mechanistic accounts of dynamical systems. The analysis of muscle contraction is a classic example of the value of combining an inventory of the molecules, atomic structures of the molecules, kinetic constants for the reactions, reconstitutions with purified proteins and theoretical modeling to account for the contraction of whole muscles. A similar strategy is now being used to understand the mechanism of cytokinesis using fission yeast as a favorable model system. Copyright © 2014 Biophysical Society. Published by Elsevier Inc. All rights reserved.
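    A minimal illustration of why kinetic and equilibrium constants matter in such models (a toy binding reaction of my own, not taken from the review): integrating A + B <-> AB forward in time with rate constants kon and koff relaxes to the equilibrium fixed by Kd = koff/kon alone, so the measured constants fully determine the dynamical model's endpoint:

```python
import math

def simulate_binding(a0, b0, kon, koff, dt=1e-3, t_end=20.0):
    """Forward-Euler integration of d[AB]/dt = kon*[A][B] - koff*[AB]."""
    ab = 0.0
    for _ in range(int(t_end / dt)):
        ab += dt * (kon * (a0 - ab) * (b0 - ab) - koff * ab)
    return ab

def equilibrium_ab(a0, b0, kd):
    """Analytic equilibrium [AB] from the quadratic mass-action relation."""
    s = a0 + b0 + kd
    return (s - math.sqrt(s * s - 4.0 * a0 * b0)) / 2.0
```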

  15. A novel knowledge-based system for interpreting complex engineering drawings: theory, representation, and implementation.

    PubMed

    Lu, Tong; Tai, Chiew-Lan; Yang, Huafei; Cai, Shijie

    2009-08-01

    We present a novel knowledge-based system to automatically convert real-life engineering drawings to content-oriented high-level descriptions. The proposed method essentially turns the complex interpretation process into two parts: knowledge representation and knowledge-based interpretation. We propose a new hierarchical descriptor-based knowledge representation method to organize the various types of engineering objects and their complex high-level relations. The descriptors are defined using an Extended Backus Naur Form (EBNF), facilitating modification and maintenance. When interpreting a set of related engineering drawings, the knowledge-based interpretation system first constructs an EBNF-tree from the knowledge representation file, then searches for potential engineering objects guided by a depth-first order of the nodes in the EBNF-tree. Experimental results and comparisons with other interpretation systems demonstrate that our knowledge-based system is accurate and robust for high-level interpretation of complex real-life engineering projects.
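    The hierarchical-descriptor idea can be sketched abstractly. The structure and names below are invented for illustration and are not the paper's EBNF format; they only show how a descriptor tree yields the depth-first node order that guides the search for engineering objects:

```python
# Hypothetical descriptor tree: a composite engineering object defined in
# terms of parts, down to terminal graphical primitives.
DESCRIPTORS = {
    "valve_symbol": {"parts": ["triangle_pair", "stem"]},
    "triangle_pair": {"parts": []},   # terminal primitive
    "stem": {"parts": []},            # terminal primitive
}

def depth_first_nodes(name, descriptors):
    """Yield descriptor names in the depth-first order used to guide search."""
    yield name
    for part in descriptors[name]["parts"]:
        yield from depth_first_nodes(part, descriptors)
```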

  16. Toddlers' Complex Communication: Playfulness from a Secure Base

    ERIC Educational Resources Information Center

    Alcock, Sophie

    2013-01-01

    Attachment theory is presented in this article as involving embodied relational processes within complex relational systems. Two narrative-like "events" are represented to illustrate very young children playfully relating -- connecting and communicating inter- and intrasubjectively. The ethnographic-inspired research methods included…

  17. Teaching Complex Organizations: A Survey Essay.

    ERIC Educational Resources Information Center

    Dobratz, Betty

    1988-01-01

    Briefly reviews six textbooks for teaching about complex organizations: ORGANIZATIONS: STRUCTURES, PROCESSES, AND OUTCOMES (Hall, 1987); ORGANIZATIONS: RATIONAL, NATURAL, AND OPEN SYSTEMS (Scott, 1987); ORGANIZATIONS IN SOCIETY (Etzioni, 1985); ORGANIZATIONAL BEHAVIOR (Hellriegel et al, 1986); ORGANIZATIONAL BEHAVIOR: EXPERIENCES AND CASES (Hai,…

  18. Panarchy: Theory and Application

    EPA Science Inventory

    The concept of panarchy was introduced by Gunderson et al. (1995) and refined by Gunderson and Holling (2002) as a heuristic model to help explain complex changes in ecosystem processes and structures within and across scales of organization. The concept takes a complex systems a...

  19. Safety management of complex research operations

    NASA Technical Reports Server (NTRS)

    Brown, W. J.

    1981-01-01

    Complex research and technology operations present varied potential hazards which are addressed in a disciplined, independent safety review and approval process. Potential hazards vary from high energy fuels to hydrocarbon fuels, high pressure systems to high voltage systems, toxic chemicals to radioactive materials and high speed rotating machinery to high powered lasers. A Safety Permit System presently covers about 600 potentially hazardous operations. The Safety Management Program described is believed to be a major factor in maintaining an excellent safety record.

  20. Protein-Protein Interactions of Azurin Complex by Coarse-Grained Simulations with a Gō-Like Model

    NASA Astrophysics Data System (ADS)

    Rusmerryani, Micke; Takasu, Masako; Kawaguchi, Kazutomo; Saito, Hiroaki; Nagao, Hidemi

    Proteins usually perform their biological functions by forming a complex with other proteins. It is very important to study protein-protein interactions since these interactions are crucial in many processes of a living organism. In this study, we develop a coarse-grained model to simulate a protein complex in a liquid system. We carry out molecular dynamics simulations with topology-based potential interactions to simulate the dynamical properties of Pseudomonas aeruginosa azurin complex systems. Azurin is known to play an essential role as an anticancer agent and to bind many important intracellular molecules. Some physical properties are monitored during the simulation to gain a better understanding of the influence of protein-protein interactions on the dynamics of the azurin complex. These studies will provide valuable insights for further investigation of protein-protein interactions in more realistic systems.
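    The core ingredient of such a topology-based (Gō-like) model is a contact potential whose minimum sits at the native contact distance. The 12-10 form below is a common textbook choice, not necessarily this study's exact potential, and the parameter values are illustrative:

```python
# 12-10 Go-style contact potential: minimum of depth -eps at r = r_native,
# so the native structure is the energetic ground state by construction.

def go_contact_energy(r, r_native, eps=1.0):
    """V(r) = eps * (5*(r0/r)**12 - 6*(r0/r)**10), minimized at r = r0."""
    s = r_native / r
    return eps * (5.0 * s**12 - 6.0 * s**10)

def total_native_energy(distances, native_distances, eps=1.0):
    """Sum the contact potential over all native contact pairs."""
    return sum(go_contact_energy(r, r0, eps)
               for r, r0 in zip(distances, native_distances))
```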

  1. The Difference between Uncertainty and Information, and Why This Matters

    NASA Astrophysics Data System (ADS)

    Nearing, G. S.

    2016-12-01

    Earth science investigation and arbitration (for decision making) is very often organized around a concept of uncertainty. It seems relatively straightforward that the purpose of our science is to reduce uncertainty about how environmental systems will react and evolve under different conditions. I propose here that approaching a science of complex systems as a process of quantifying and reducing uncertainty is a mistake, and specifically a mistake that is rooted in certain rather historic logical errors. Instead I propose that we should be asking questions about information. I argue here that an information-based perspective facilitates almost trivial answers to environmental science questions that are either difficult or theoretically impossible to answer when posed as questions about uncertainty. In particular, I propose that an information-centric perspective leads to: (1) coherent and non-subjective hypothesis tests for complex systems models; (2) process-level diagnostics for complex systems models; (3) methods for building complex systems models that allow for inductive inference without the need for a priori specification of likelihood functions or ad hoc error metrics; and (4) asymptotically correct quantification of epistemic uncertainty. To put this in slightly more basic terms, I propose that an information-theoretic philosophy of science has the potential to resolve certain important aspects of the Demarcation Problem and the Duhem-Quine Problem, and that Hydrology and other Earth Systems Sciences can immediately capitalize on this to address some of our most difficult and persistent problems.

  2. Considerations Regarding the Integration-Intrication Process in Nature and Technology

    NASA Astrophysics Data System (ADS)

    Tecaru Berekmeri, Camelia Velia; Blebea, Ioan

    2014-11-01

    The big challenges in education and R&D activities in the century just started are related to understanding and promoting complexity and transdisciplinarity. These approaches are necessary in order to understand the unity of the world we live in through the unity of knowledge. Complexity is the result of the integration process. The paper presents fundamentals of the integration-intrication process in nature and technology. The concept of integronics and the basic principles of the integration process are outlined as well. The main features of mechatronics as an environment for transdisciplinary learning, and the concept of integral education it promotes, are also presented. Advanced mechatronics and embedded systems are fundamentals of the cyber-physical systems of the future.

  3. Reaction pathways in atomistic models of thin film growth

    NASA Astrophysics Data System (ADS)

    Lloyd, Adam L.; Zhou, Ying; Yu, Miao; Scott, Chris; Smith, Roger; Kenny, Steven D.

    2017-10-01

    The atomistic processes that form the basis of thin film growth often involve complex multi-atom movements of atoms or groups of atoms on or close to the surface of a substrate. These transitions and their pathways are often difficult to predict in advance. By using an adaptive kinetic Monte Carlo (AKMC) approach, many complex mechanisms can be identified so that the growth processes can be understood and ultimately controlled. Here the AKMC technique is briefly described along with some special adaptions that can speed up the simulations when, for example, the transition barriers are small. Examples are given of such complex processes that occur in different material systems especially for the growth of metals and metallic oxides.
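    A single (non-adaptive) kinetic Monte Carlo step underlying such simulations can be sketched as follows. The harmonic-TST prefactor, barrier values, and temperature below are illustrative; real AKMC additionally discovers the available transitions and their barriers on the fly:

```python
import math
import random

KB = 8.617333262e-5  # Boltzmann constant in eV/K

def arrhenius_rate(barrier_ev, temperature_k, prefactor=1e13):
    """Harmonic transition-state-theory rate for a saddle-point barrier."""
    return prefactor * math.exp(-barrier_ev / (KB * temperature_k))

def kmc_step(barriers_ev, temperature_k, rng=random.random):
    """Pick one event proportionally to its rate and draw the residence time."""
    rates = [arrhenius_rate(b, temperature_k) for b in barriers_ev]
    total = sum(rates)
    pick, acc = rng() * total, 0.0
    for i, r in enumerate(rates):
        acc += r
        if pick <= acc:
            break
    dt = -math.log(rng()) / total  # stochastic time increment before the hop
    return i, dt
```

    Small barriers produce large rates and tiny time increments, which is precisely the regime where the speed-up adaptations mentioned in the abstract become necessary.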

  4. Use of the self-organising map network (SOMNet) as a decision support system for regional mental health planning.

    PubMed

    Chung, Younjin; Salvador-Carulla, Luis; Salinas-Pérez, José A; Uriarte-Uriarte, Jose J; Iruin-Sanz, Alvaro; García-Alonso, Carlos R

    2018-04-25

    Decision-making in mental health systems should be supported by the evidence-informed knowledge transfer of data. Since mental health systems are inherently complex, involving interactions between their structures, processes and outcomes, decision support systems (DSS) need to be developed using advanced computational methods and visual tools to allow full system analysis, whilst incorporating domain experts in the analysis process. In this study, we use a DSS model developed for interactive data mining and domain expert collaboration in the analysis of complex mental health systems to improve system knowledge and evidence-informed policy planning. We combine an interactive visual data mining approach, the self-organising map network (SOMNet), with an operational expert knowledge approach, expert-based collaborative analysis (EbCA), to develop a DSS model. The SOMNet was applied to the analysis of healthcare patterns and indicators of three different regional mental health systems in Spain, comprising 106 small catchment areas and providing healthcare for over 9 million inhabitants. Based on the EbCA, the domain experts in the development team guided and evaluated the analytical processes and results. Another group of 13 domain experts in mental health systems planning and research evaluated the model based on the analytical information of the SOMNet approach for processing information and discovering knowledge in a real-world context. Through the evaluation, the domain experts assessed the feasibility and technology readiness level (TRL) of the DSS model. The SOMNet, combined with the EbCA, effectively processed evidence-based information when analysing system outliers, explaining global and local patterns, and refining key performance indicators with their analytical interpretations. The evaluation results showed that the DSS model was judged feasible by the domain experts and reached level 7 of the TRL (system prototype demonstration in operational environment).
This study supports the benefits of combining health systems engineering (SOMNet) and expert knowledge (EbCA) to analyse the complexity of health systems research. The use of the SOMNet approach contributes to the demonstration of DSS for mental health planning in practice.
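    At its core, a self-organising map trains prototype vectors by pulling the best-matching unit (and, more weakly, its neighbours) toward each input. The toy one-dimensional version below is my own minimal sketch, far simpler than the SOMNet used in the study, but it shows the mechanism by which catchment-area indicator patterns would cluster onto map units:

```python
import random

def train_som(data, n_units=4, epochs=50, lr=0.3, seed=0):
    """Train a 1-D SOM: BMU updated with h=1, immediate neighbours with h=0.5."""
    rng = random.Random(seed)
    weights = [[rng.random() for _ in data[0]] for _ in range(n_units)]
    for _ in range(epochs):
        for x in data:
            # best-matching unit = prototype with smallest squared distance
            bmu = min(range(n_units),
                      key=lambda i: sum((w - v) ** 2 for w, v in zip(weights[i], x)))
            for i in range(n_units):
                h = 1.0 if i == bmu else (0.5 if abs(i - bmu) == 1 else 0.0)
                weights[i] = [w + lr * h * (v - w) for w, v in zip(weights[i], x)]
    return weights
```

    After training on two well-separated clusters, at least one prototype settles near each cluster, which is the property the visual pattern analysis relies on.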

  5. Securing Information with Complex Optical Encryption Networks

    DTIC Science & Technology

    2015-08-11

    Keywords: network security, network vulnerability, multi-dimensional processing, optoelectronic devices. Since the optoelectronic devices and systems must be analyzed before the information can be retrieved, any hostile hacker will need to possess multi-disciplinary scientific knowledge of sophisticated optoelectronic principles and systems in order to process the information. However, in military applications, most military…

  6. How Do Students Regulate their Learning of Complex Systems with Hypermedia?.

    ERIC Educational Resources Information Center

    Azevedo, Roger; Seibert, Diane; Guthrie, John T.; Cromley, Jennifer G.; Wang, Huei-yu; Tron, Myriam

    This study examined the role of different goal-setting instructional interventions in facilitating students' shift to more sophisticated mental models of the circulatory system as indicated by both performance and process data. Researchers adopted the information processing model of self-regulated learning of P. Winne and colleagues (1998, 2001)…

  7. Is Model-Based Development a Favorable Approach for Complex and Safety-Critical Computer Systems on Commercial Aircraft?

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2014-01-01

    A system is safety-critical if its failure can endanger human life or cause significant damage to property or the environment. State-of-the-art computer systems on commercial aircraft are highly complex, software-intensive, functionally integrated, and network-centric systems of systems. Ensuring that such systems are safe and comply with existing safety regulations is costly and time-consuming as the level of rigor in the development process, especially the validation and verification activities, is determined by considerations of system complexity and safety criticality. A significant degree of care and deep insight into the operational principles of these systems is required to ensure adequate coverage of all design implications relevant to system safety. Model-based development methodologies, methods, tools, and techniques facilitate collaboration and enable the use of common design artifacts among groups dealing with different aspects of the development of a system. This paper examines the application of model-based development to complex and safety-critical aircraft computer systems. Benefits and detriments are identified and an overall assessment of the approach is given.

  8. Could a neuroscientist understand a microprocessor?

    DOE PAGES

    Jonas, Eric; Kording, Konrad Paul; Diedrichsen, Jorn

    2017-01-12

    There is a popular belief in neuroscience that we are primarily data limited, and that producing large, multimodal, and complex datasets will, with the help of advanced data analysis algorithms, lead to fundamental insights into the way the brain processes information. These datasets do not yet exist, and if they did we would have no way of evaluating whether or not the algorithmically-generated insights were sufficient or even correct. To address this, here we take a classical microprocessor as a model organism, and use our ability to perform arbitrary experiments on it to see if popular data analysis methods from neuroscience can elucidate the way it processes information. Microprocessors are among those artificial information processing systems that are both complex and that we understand at all levels, from the overall logical flow, via logical gates, to the dynamics of transistors. We show that the approaches reveal interesting structure in the data but do not meaningfully describe the hierarchy of information processing in the microprocessor. This suggests current analytic approaches in neuroscience may fall short of producing meaningful understanding of neural systems, regardless of the amount of data. Furthermore, we argue for scientists using complex non-linear dynamical systems with known ground truth, such as the microprocessor as a validation platform for time-series and structure discovery methods.

  9. Beyond a series of security nets: Applying STAMP & STPA to port security

    DOE PAGES

    Williams, Adam D.

    2015-11-17

    Port security is an increasing concern considering the significant role of ports in global commerce and today’s increasingly complex threat environment. Current approaches to port security mirror traditional models of accident causality -- ‘a series of security nets’ based on component reliability and probabilistic assumptions. Traditional port security frameworks result in isolated and inconsistent improvement strategies. Recent work in engineered safety combines the ideas of hierarchy, emergence, control and communication into a new paradigm for understanding port security as an emergent complex system property. The ‘System-Theoretic Accident Model and Process (STAMP)’ is a new model of causality based on systems and control theory. The associated analysis process -- System Theoretic Process Analysis (STPA) -- identifies specific technical or procedural security requirements designed to work in coordination with (and be traceable to) overall port objectives. This process yields port security design specifications that can mitigate (if not eliminate) port security vulnerabilities related to an emphasis on component reliability, lack of coordination between port security stakeholders or economic pressures endemic in the maritime industry. As a result, this article aims to demonstrate how STAMP’s broader view of causality and complexity can better address the dynamic and interactive behaviors of social, organizational and technical components of port security.

  10. Could a Neuroscientist Understand a Microprocessor?

    PubMed Central

    Kording, Konrad Paul

    2017-01-01

    There is a popular belief in neuroscience that we are primarily data limited, and that producing large, multimodal, and complex datasets will, with the help of advanced data analysis algorithms, lead to fundamental insights into the way the brain processes information. These datasets do not yet exist, and if they did we would have no way of evaluating whether or not the algorithmically-generated insights were sufficient or even correct. To address this, here we take a classical microprocessor as a model organism, and use our ability to perform arbitrary experiments on it to see if popular data analysis methods from neuroscience can elucidate the way it processes information. Microprocessors are among those artificial information processing systems that are both complex and that we understand at all levels, from the overall logical flow, via logical gates, to the dynamics of transistors. We show that the approaches reveal interesting structure in the data but do not meaningfully describe the hierarchy of information processing in the microprocessor. This suggests current analytic approaches in neuroscience may fall short of producing meaningful understanding of neural systems, regardless of the amount of data. Additionally, we argue for scientists using complex non-linear dynamical systems with known ground truth, such as the microprocessor as a validation platform for time-series and structure discovery methods. PMID:28081141

  11. Could a Neuroscientist Understand a Microprocessor?

    PubMed

    Jonas, Eric; Kording, Konrad Paul

    2017-01-01

    There is a popular belief in neuroscience that we are primarily data limited, and that producing large, multimodal, and complex datasets will, with the help of advanced data analysis algorithms, lead to fundamental insights into the way the brain processes information. These datasets do not yet exist, and if they did we would have no way of evaluating whether or not the algorithmically-generated insights were sufficient or even correct. To address this, here we take a classical microprocessor as a model organism, and use our ability to perform arbitrary experiments on it to see if popular data analysis methods from neuroscience can elucidate the way it processes information. Microprocessors are among those artificial information processing systems that are both complex and that we understand at all levels, from the overall logical flow, via logical gates, to the dynamics of transistors. We show that the approaches reveal interesting structure in the data but do not meaningfully describe the hierarchy of information processing in the microprocessor. This suggests current analytic approaches in neuroscience may fall short of producing meaningful understanding of neural systems, regardless of the amount of data. Additionally, we argue for scientists using complex non-linear dynamical systems with known ground truth, such as the microprocessor as a validation platform for time-series and structure discovery methods.

  12. Could a neuroscientist understand a microprocessor?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jonas, Eric; Kording, Konrad Paul; Diedrichsen, Jorn

    There is a popular belief in neuroscience that we are primarily data limited, and that producing large, multimodal, and complex datasets will, with the help of advanced data analysis algorithms, lead to fundamental insights into the way the brain processes information. These datasets do not yet exist, and if they did we would have no way of evaluating whether or not the algorithmically-generated insights were sufficient or even correct. To address this, here we take a classical microprocessor as a model organism, and use our ability to perform arbitrary experiments on it to see if popular data analysis methods from neuroscience can elucidate the way it processes information. Microprocessors are among those artificial information processing systems that are both complex and that we understand at all levels, from the overall logical flow, via logical gates, to the dynamics of transistors. We show that the approaches reveal interesting structure in the data but do not meaningfully describe the hierarchy of information processing in the microprocessor. This suggests current analytic approaches in neuroscience may fall short of producing meaningful understanding of neural systems, regardless of the amount of data. Furthermore, we argue for scientists using complex non-linear dynamical systems with known ground truth, such as the microprocessor as a validation platform for time-series and structure discovery methods.

  13. Beyond a series of security nets: Applying STAMP & STPA to port security

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Adam D.

    Port security is an increasing concern considering the significant role of ports in global commerce and today’s increasingly complex threat environment. Current approaches to port security mirror traditional models of accident causality -- ‘a series of security nets’ based on component reliability and probabilistic assumptions. Traditional port security frameworks result in isolated and inconsistent improvement strategies. Recent work in engineered safety combines the ideas of hierarchy, emergence, control and communication into a new paradigm for understanding port security as an emergent complex system property. The ‘System-Theoretic Accident Model and Process (STAMP)’ is a new model of causality based on systems and control theory. The associated analysis process -- System Theoretic Process Analysis (STPA) -- identifies specific technical or procedural security requirements designed to work in coordination with (and be traceable to) overall port objectives. This process yields port security design specifications that can mitigate (if not eliminate) port security vulnerabilities related to an emphasis on component reliability, lack of coordination between port security stakeholders or economic pressures endemic in the maritime industry. As a result, this article aims to demonstrate how STAMP’s broader view of causality and complexity can better address the dynamic and interactive behaviors of social, organizational and technical components of port security.

  14. Physical bases of the generation of short-term earthquake precursors: A complex model of ionization-induced geophysical processes in the lithosphere-atmosphere-ionosphere-magnetosphere system

    NASA Astrophysics Data System (ADS)

    Pulinets, S. A.; Ouzounov, D. P.; Karelin, A. V.; Davidenko, D. V.

    2015-07-01

    This paper describes the current understanding of the interaction between geospheres from a complex set of physical and chemical processes under the influence of ionization. The sources of ionization involve the Earth's natural radioactivity and its intensification before earthquakes in seismically active regions, anthropogenic radioactivity caused by nuclear weapon testing and accidents in nuclear power plants and radioactive waste storage, the impact of galactic and solar cosmic rays, and active geophysical experiments using artificial ionization equipment. This approach treats the environment as an open complex system with dissipation, where inherent processes can be considered in the framework of the synergistic approach. We demonstrate the synergy between the evolution of thermal and electromagnetic anomalies in the Earth's atmosphere, ionosphere, and magnetosphere. This makes it possible to determine the direction of the interaction process, which is especially important in applications related to short-term earthquake prediction. That is why the emphasis in this study is on the processes preceding the final stage of earthquake preparation; the effects of other ionization sources are used to demonstrate that the model is versatile and broadly applicable in geophysics.

  15. On common noise-induced synchronization in complex networks with state-dependent noise diffusion processes

    NASA Astrophysics Data System (ADS)

    Russo, Giovanni; Shorten, Robert

    2018-04-01

    This paper is concerned with the study of common noise-induced synchronization phenomena in complex networks of diffusively coupled nonlinear systems. We consider the case where common noise propagation depends on the network state and, as a result, the noise diffusion process at the nodes depends on the state of the network. For such networks, we present an algebraic sufficient condition for the onset of synchronization, which depends on the network topology, the dynamics at the nodes, the coupling strength and the noise diffusion. Our result explicitly shows that certain noise diffusion processes can drive an unsynchronized network towards synchronization. In order to illustrate the effectiveness of our result, we consider two applications: collective decision processes and synchronization of chaotic systems. We explicitly show that, in the former application, a sufficiently large noise can drive a population towards a common decision, while, in the latter, we show how common noise can synchronize a network of Lorenz chaotic systems.
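
    As an illustrative sketch (not the paper's general network setting), two identical phase oscillators driven by one shared noise source with state-dependent diffusion g(θ) = cos θ show the effect: the common noise alone pulls two different initial conditions together.

```python
import numpy as np

# Two identical phase oscillators, dθ = ω dt + σ cos(θ) dW, sharing the
# SAME noise increments dW (common noise with state-dependent diffusion).
# Euler-Maruyama integration; only the initial phases differ.
rng = np.random.default_rng(1)
omega, sigma, dt, steps = 1.0, 0.8, 0.01, 20_000
th1, th2 = 0.0, 1.0                            # 1 rad apart initially

for _ in range(steps):
    dW = np.sqrt(dt) * rng.standard_normal()   # one common noise sample
    th1 += omega * dt + sigma * np.cos(th1) * dW
    th2 += omega * dt + sigma * np.cos(th2) * dW

# circular distance between the two phases
gap = abs((th1 - th2 + np.pi) % (2 * np.pi) - np.pi)
print(f"final phase gap: {gap:.4f} rad")       # shrinks toward zero
```

    Without the noise (σ = 0) the gap would stay at exactly 1 rad forever, which matches the abstract's claim that a suitable common noise can itself drive an unsynchronized system towards synchronization.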

  16. Semi-automatic image analysis methodology for the segmentation of bubbles and drops in complex dispersions occurring in bioreactors

    NASA Astrophysics Data System (ADS)

    Taboada, B.; Vega-Alvarado, L.; Córdova-Aguilar, M. S.; Galindo, E.; Corkidi, G.

    2006-09-01

    Characterization of multiphase systems occurring in fermentation processes is a time-consuming and tedious process when manual methods are used. This work describes a new semi-automatic methodology for the on-line assessment of diameters of oil drops and air bubbles occurring in a complex simulated fermentation broth. High-quality digital images were obtained from the interior of a mechanically stirred tank. These images were pre-processed to find segments of edges belonging to the objects of interest. The contours of air bubbles and oil drops were then reconstructed using an improved Hough transform algorithm which was tested in two, three and four-phase simulated fermentation model systems. The results were compared against those obtained manually by a trained observer, showing no significant statistical differences. The method was able to reduce the total processing time for the measurements of bubbles and drops in different systems by 21-50% and the manual intervention time for the segmentation procedure by 80-100%.
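
    The contour reconstruction step rests on a circle Hough transform. A minimal fixed-radius voting version (a simplified sketch with synthetic edge points and a known radius, not the authors' improved algorithm) looks like this:

```python
import numpy as np

# Circle Hough transform at a fixed radius: every edge pixel votes for
# all candidate centres lying at distance r from it; the accumulator
# peak is the recovered bubble/drop centre.
H, W, r = 100, 100, 20
cx, cy = 48, 52                                   # ground-truth centre

phis = np.linspace(0, 2 * np.pi, 360, endpoint=False)
edge_x = np.rint(cx + r * np.cos(phis)).astype(int)   # synthetic edge pixels
edge_y = np.rint(cy + r * np.sin(phis)).astype(int)

acc = np.zeros((H, W), dtype=int)
for x, y in zip(edge_x, edge_y):
    a = np.rint(x - r * np.cos(phis)).astype(int)     # candidate centres
    b = np.rint(y - r * np.sin(phis)).astype(int)
    ok = (a >= 0) & (a < W) & (b >= 0) & (b < H)
    np.add.at(acc, (b[ok], a[ok]), 1)                 # cast votes

py, px = np.unravel_index(acc.argmax(), acc.shape)
print(f"recovered centre: ({px}, {py})")              # near (48, 52)
```

    In practice the radius is unknown, so votes are accumulated in a three-dimensional (x, y, r) space, and edge points come from an edge detector applied to the pre-processed image rather than being synthesized.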

  17. 24 CFR 103.205 - Systemic processing.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... are pervasive or institutional in nature, or that the processing of the complaint will involve complex issues, novel questions of fact or law, or will affect a large number of persons, the Assistant Secretary...

  18. A measuring tool for tree-rings analysis

    NASA Astrophysics Data System (ADS)

    Shumilov, Oleg; Kanatjev, Alexander; Kasatkina, Elena

    2013-04-01

    A special tool has been created for measuring and analysing annual tree-ring widths. It consists of a professional scanner, a computer system and software. In many respects this complex is not inferior to similar systems (LINTAB, WinDENDRO), and in comparison with manual measurement systems it offers a number of advantages: productivity gains, the possibility of archiving the measurement results at any stage of processing, and operator comfort. New software has been developed that allows processing of samples of different types (cores, saw cuts), including those that are difficult to process because of a complex wood structure (inhomogeneous growth in different directions; missing, light and false rings, etc.). This software can analyze pictures made with optical scanners and analog or digital cameras. The software was written in C++ and is compatible with modern operating systems like Windows X. Annual ring widths are measured along interactively traced paths. These paths can have any orientation and can be created so that ring widths are measured perpendicular to ring boundaries. A graph of ring width as a function of year is displayed on screen during the analysis and can be used for visual and numerical cross-dating and comparison with other series or master chronologies. Ring widths are saved to text files in a special format, and those files are converted to the format accepted for data conservation in the International Tree-Ring Data Bank. The complex is universal in application, which will allow its use for solving different problems in biology and ecology. With the help of this complex, long-term juniper (1328-2004) and pine (1445-2005) tree-ring chronologies have been reconstructed from samples collected on the Kola Peninsula (northwestern Russia).
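
    The measurement step described above -- ring widths along an interactively traced path -- reduces to locating dark ring boundaries in a brightness profile and differencing their positions. A toy sketch with a synthetic profile (boundary positions and contrast values are hypothetical):

```python
import numpy as np

# Synthetic brightness profile along a measurement path: dark dips mark
# ring boundaries (dense latewood). Ring widths are the gaps between dips.
x = np.arange(180)
boundaries = [20, 55, 85, 120, 150]              # assumed boundary positions
profile = np.ones_like(x, dtype=float)
for b in boundaries:
    profile -= 0.8 * np.exp(-(x - b) ** 2 / 8.0)  # dark dip at each boundary

# detect strict local minima that are also darker than a threshold
interior = np.arange(1, len(x) - 1)
is_min = (profile[1:-1] < profile[:-2]) & (profile[1:-1] < profile[2:])
dips = interior[is_min & (profile[1:-1] < 0.6)]

widths = np.diff(dips)
print(f"ring widths (px): {widths.tolist()}")    # [35, 30, 35, 30]
```

    Converting pixel widths to millimetres then only requires the scanner's calibrated resolution.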

  19. Using Complexity and Network Concepts to Inform Healthcare Knowledge Translation

    PubMed Central

    Kitson, Alison; Brook, Alan; Harvey, Gill; Jordan, Zoe; Marshall, Rhianon; O’Shea, Rebekah; Wilson, David

    2018-01-01

    Many representations of the movement of healthcare knowledge through society exist, and multiple models for the translation of evidence into policy and practice have been articulated. Most are linear or cyclical and very few come close to reflecting the dense and intricate relationships, systems and politics of organizations and the processes required to enact sustainable improvements. We illustrate how using complexity and network concepts can better inform knowledge translation (KT) and argue that changing the way we think and talk about KT could enhance the creation and movement of knowledge throughout those systems needing to develop and utilise it. From our theoretical refinement, we propose that KT is a complex network composed of five interdependent sub-networks, or clusters, of key processes (problem identification [PI], knowledge creation [KC], knowledge synthesis [KS], implementation [I], and evaluation [E]) that interact dynamically in different ways at different times across one or more sectors (community; health; government; education; research for example). We call this the KT Complexity Network, defined as a network that optimises the effective, appropriate and timely creation and movement of knowledge to those who need it in order to improve what they do. Activation within and throughout any one of these processes and systems depends upon the agents promoting the change, successfully working across and between multiple systems and clusters. The case is presented for moving to a way of thinking about KT using complexity and network concepts. This extends the thinking that is developing around integrated KT approaches. There are a number of policy and practice implications that need to be considered in light of this shift in thinking. PMID:29524952

  20. The Design, Development and Testing of Complex Avionics Systems: Conference Proceedings Held at the Avionics Panel Symposium in Las Vegas, Nevada on 27 April-1 May 1987

    DTIC Science & Technology

    1987-12-01

    Normally, the system is decomposed into manageable parts with accurately defined interfaces. By rigidly controlling this process, aerospace companies have... A CHANGE IN SYSTEM DESIGN EMPHASIS: FROM MACHINE TO MAN by M. L. Metersky and J. L. Ryder; SESSION II - MANAGING THE FUTURE SYSTEM DESIGN PROCESS; MANAGING ADVANCED AVIONIC SYSTEM DESIGN by P. Simons; PSYCHOSENSORY ERGONOMICS OF COCKPITS: THE VALUE OF INTELLIGENT COMPUTER SYSTEMS (Ergonomie psychosensorielle des cockpits, intérêt des systèmes informatiques intelligents)

  1. PACS technologies and reliability: are we making things better or worse?

    NASA Astrophysics Data System (ADS)

    Horii, Steven C.; Redfern, Regina O.; Kundel, Harold L.; Nodine, Calvin F.

    2002-05-01

    In the process of installing picture archiving and communications (PACS) and speech recognition equipment, upgrading it, and working with previously stored digital image information, the authors encountered a number of problems. Examination of these difficulties illustrated the complex nature of our existing systems and how difficult it is, in many cases, to predict their behavior. This was found to be true even for our relatively small number of interconnected systems. The purpose of this paper is to illustrate some of the principles of understanding complex system interaction through examples from our experience. The work for this paper grew out of a number of studies we had carried out on our PACS over several years. The complex nature of our systems was evaluated through comparison of our operations with known examples of systems in other industries. Three scenarios -- a network failure, a system software upgrade, and attempting to read media from an old archive -- showed that the major systems used in the radiology departments of many healthcare facilities (HIS, RIS, PACS, and speech recognition) are likely to interact in complex and often unpredictable ways. These interactions may be very difficult or impossible to predict, so plans should be made to overcome the negative aspects of the problems that result. Failures and problems, often unpredictable ones, are a likely side effect of having multiple information handling and processing systems interconnected and interoperating. Planning to avoid, or at least be less vulnerable to, such difficulties is an important aspect of systems planning.

  2. Performance analysis of Integrated Communication and Control System networks

    NASA Technical Reports Server (NTRS)

    Halevi, Y.; Ray, A.

    1990-01-01

    This paper presents statistical analysis of delays in Integrated Communication and Control System (ICCS) networks that are based on asynchronous time-division multiplexing. The models are obtained in closed form for analyzing control systems with randomly varying delays. The results of this research are applicable to ICCS design for complex dynamical processes like advanced aircraft and spacecraft, autonomous manufacturing plants, and chemical and processing plants.
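
    For an asynchronous time-division multiplexed link, a simple delay model is a random wait for the next assigned slot plus a fixed transmission time, for which the first two moments are available in closed form. A small numerical check (illustrative parameter values, not the paper's):

```python
import numpy as np

# Delay model: a message generated at a random instant waits U(0, T) for
# its next TDM slot (frame period T), then takes a fixed time tau to
# transmit. Closed form: E[d] = T/2 + tau, Var[d] = T^2 / 12.
rng = np.random.default_rng(2)
T, tau, n = 0.010, 0.002, 100_000          # 10 ms frame, 2 ms transmission
delays = rng.uniform(0.0, T, n) + tau

print(f"mean: empirical {delays.mean():.5f}, closed-form {T / 2 + tau:.5f}")
print(f"var:  empirical {delays.var():.2e}, closed-form {T ** 2 / 12:.2e}")
```

    Randomly varying delays of this kind are exactly what make the closed-loop stability analysis in the paper nontrivial.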

  3. Complex adaptive systems: concept analysis.

    PubMed

    Holden, Lela M

    2005-12-01

    The aim of this paper is to explicate the concept of complex adaptive systems through an analysis that provides a description, antecedents, consequences, and a model case from the nursing and health care literature. Life is more than atoms and molecules--it is patterns of organization. Complexity science is the latest generation of systems thinking that investigates patterns and has emerged from the exploration of the subatomic world and quantum physics. A key component of complexity science is the concept of complex adaptive systems, and active research is found in many disciplines--from biology to economics to health care. However, the research and literature related to these appealing topics have generated confusion. A thorough explication of complex adaptive systems is needed. A modified application of the methods recommended by Walker and Avant for concept analysis was used. A complex adaptive system is a collection of individual agents with freedom to act in ways that are not always totally predictable and whose actions are interconnected. Examples include a colony of termites, the financial market, and a surgical team. It is often referred to as chaos theory, but the two are not the same. Chaos theory is actually a subset of complexity science. Complexity science offers a powerful new approach--beyond merely looking at clinical processes and the skills of healthcare professionals. The use of complex adaptive systems as a framework is increasing for a wide range of scientific applications, including nursing and healthcare management research. When nursing and other healthcare managers focus on increasing connections, diversity, and interactions they increase information flow and promote creative adaptation referred to as self-organization. Complexity science builds on the rich tradition in nursing that views patients and nursing care from a systems perspective.

  4. Mitochondrial Translation and Beyond: Processes Implicated in Combined Oxidative Phosphorylation Deficiencies

    PubMed Central

    Smits, Paulien; Smeitink, Jan; van den Heuvel, Lambert

    2010-01-01

    Mitochondrial disorders are a heterogeneous group of often multisystemic and early fatal diseases, which are amongst the most common inherited human diseases. These disorders are caused by defects in the oxidative phosphorylation (OXPHOS) system, which comprises five multisubunit enzyme complexes encoded by both the nuclear and the mitochondrial genomes. Due to the multitude of proteins and intricacy of the processes required for a properly functioning OXPHOS system, identifying the genetic defect that underlies an OXPHOS deficiency is not an easy task, especially in the case of combined OXPHOS defects. In the present communication we give an extensive overview of the proteins and processes (in)directly involved in mitochondrial translation and the biogenesis of the OXPHOS system and their roles in combined OXPHOS deficiencies. This knowledge is important for further research into the genetic causes, with the ultimate goal to effectively prevent and cure these complex and often devastating disorders. PMID:20396601

  5. Reactive extraction at liquid-liquid systems

    NASA Astrophysics Data System (ADS)

    Wieszczycka, Karolina

    2018-01-01

    The chapter summarizes the state of knowledge about metal transport in two-phase systems. The first part of this review focuses on the distribution law and the main factors governing classical solvent extraction (solubility and polarity of the solute, as well as inter- and intramolecular interactions). The next part of the chapter is devoted to reactive solvent extraction and molecular modeling, which require knowledge of the type of extractant, complexation mechanisms, metal-ion speciation and oxidation during complex formation, and other parameters that make it possible to understand the extraction process. The kinetic data needed for proper modeling, simulation and design of the processes required for critical separations are also discussed. Extraction in liquid-solid systems using solvent-impregnated resins is partially identical to the corresponding solvent extraction, so this subject is also presented in all aspects of the separation process (equilibrium, mechanism, kinetics).

  6. Learning To Live with Complexity.

    ERIC Educational Resources Information Center

    Dosa, Marta

    Neither the design of information systems and networks nor the delivery of library services can claim true user centricity without an understanding of the multifaceted psychological environment of users and potential users. The complexity of the political process, social problems, challenges to scientific inquiry, entrepreneurship, and…

  7. Complex-time singularity and locality estimates for quantum lattice systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bouch, Gabriel

    2015-12-15

    We present and prove a well-known locality bound for the complex-time dynamics of a general class of one-dimensional quantum spin systems. Then we discuss how one might hope to extend this same procedure to higher dimensions using ideas related to the Eden growth process and lattice trees. Finally, we demonstrate with a specific family of lattice trees in the plane why this approach breaks down in dimensions greater than one and prove that there exist interactions for which the complex-time dynamics blows up in finite imaginary time.

  8. [Process-oriented cost calculation in interventional radiology. A case study].

    PubMed

    Mahnken, A H; Bruners, P; Günther, R W; Rasche, C

    2012-01-01

    Currently used costing methods such as cost centre accounting do not sufficiently reflect the process-based resource utilization in medicine. The goal of this study was to establish a process-oriented cost assessment of percutaneous radiofrequency (RF) ablation of liver and lung metastases. In each of 15 patients a detailed task analysis of the primary process of hepatic and pulmonary RF ablation was performed. Based on these data a dedicated cost calculation model was developed for each primary process. The costs of each process were computed and compared with the revenue for in-patients according to the German diagnosis-related groups (DRG) system 2010. The RF ablation of liver metastases in patients without relevant comorbidities and a low patient complexity level results in a loss of EUR 588.44, whereas the treatment of patients with a higher complexity level yields an acceptable profit. The treatment of pulmonary metastases is profitable even in cases of additional expenses due to complications. Process-oriented costing provides relevant information that is needed for understanding the economic impact of treatment decisions. It is well suited as a starting point for economically driven process optimization and reengineering. Under the terms of the German DRG 2010 system percutaneous RF ablation of lung metastases is economically reasonable, while RF ablation of liver metastases in cases of low patient complexity levels does not cover the costs.
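
    The costing logic itself is straightforward to sketch: price each task of the primary process by duration times resource rate, then compare the total with a flat DRG reimbursement. All figures below are hypothetical, not the study's actual data:

```python
# Process-oriented costing sketch: cost each task of a primary process
# by duration x resource rate, then compare with a flat DRG revenue.
# All numbers below are hypothetical, not the study's measured figures.
tasks = [
    ("planning & consent", 30, 2.0),   # (name, minutes, EUR per minute)
    ("RF ablation",        45, 4.0),
    ("imaging & recovery", 60, 1.0),
]

process_cost = sum(minutes * rate for _, minutes, rate in tasks)
drg_revenue = 250.0                    # hypothetical flat reimbursement
margin = drg_revenue - process_cost

print(f"process cost {process_cost:.2f} EUR, margin {margin:+.2f} EUR")
```

    A negative margin, as in the study's low-complexity liver-ablation case, signals that the flat reimbursement does not cover the resources the task analysis reveals.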

  9. Nonlinear Decoupling Control With ANFIS-Based Unmodeled Dynamics Compensation for a Class of Complex Industrial Processes.

    PubMed

    Zhang, Yajun; Chai, Tianyou; Wang, Hong; Wang, Dianhui; Chen, Xinkai

    2018-06-01

    Complex industrial processes are multivariable and generally exhibit strong coupling among their control loops with heavy nonlinear nature. These make it very difficult to obtain an accurate model. As a result, the conventional and data-driven control methods are difficult to apply. Using a twin-tank level control system as an example, a novel multivariable decoupling control algorithm with adaptive neural-fuzzy inference system (ANFIS)-based unmodeled dynamics (UD) compensation is proposed in this paper for a class of complex industrial processes. At first, a nonlinear multivariable decoupling controller with UD compensation is introduced. Different from the existing methods, the decomposition estimation algorithm using ANFIS is employed to estimate the UD, and the desired estimating and decoupling control effects are achieved. Second, the proposed method does not require the complicated switching mechanism which has been commonly used in the literature. This significantly simplifies the obtained decoupling algorithm and its realization. Third, based on some new lemmas and theorems, the conditions on the stability and convergence of the closed-loop system are analyzed to show the uniform boundedness of all the variables. This is then followed by the summary on experimental tests on a heavily coupled nonlinear twin-tank system that demonstrates the effectiveness and the practicability of the proposed method.

  10. Near infrared light-mediated photoactivation of cytotoxic Re(i) complexes by using lanthanide-doped upconversion nanoparticles.

    PubMed

    Hu, Ming; Zhao, Jixian; Ai, Xiangzhao; Budanovic, Maja; Mu, Jing; Webster, Richard D; Cao, Qian; Mao, Zongwan; Xing, Bengang

    2016-09-13

    Platinum-based chemotherapy, although it has been well proven to be effective in the battle against cancer, suffers from limited specificity, severe side effects and drug resistance. The development of new alternatives with potent anticancer effects and improved specificity is therefore urgently needed. Recently, there are some new chemotherapy reagents based on photoactive Re(i) complexes which have been reported as promising alternatives to improve specificity mainly attributed to the spatial and temporal activation process by light irradiation. However, most of them respond to short-wavelength light (e.g. UV, blue or green light), which may cause unwanted photo damage to cells. Herein, we demonstrate a system for near-infrared (NIR) light controlled activation of Re(i) complex cytotoxicity by integration of photoactivatable Re(i) complexes and lanthanide-doped upconversion nanoparticles (UCNPs). Upon NIR irradiation at 980 nm, the Re(i) complex can be locally activated by upconverted UV light emitted from UCNPs and subsequently leads to enhanced cell lethality. Cytotoxicity studies showed effective inactivation of both drug susceptible human ovarian carcinoma A2780 cells and cisplatin resistant subline A2780cis cells by our UCNP based system with NIR irradiation, and there was minimum light toxicity observed in the whole process, suggesting that such a system could provide a promising strategy to control localized activation of Re(i) complexes and therefore minimize potential side effects.

  11. Deciphering the Interdependence between Ecological and Evolutionary Networks.

    PubMed

    Melián, Carlos J; Matthews, Blake; de Andreazzi, Cecilia S; Rodríguez, Jorge P; Harmon, Luke J; Fortuna, Miguel A

    2018-05-24

    Biological systems consist of elements that interact within and across hierarchical levels. For example, interactions among genes determine traits of individuals, competitive and cooperative interactions among individuals influence population dynamics, and interactions among species affect the dynamics of communities and ecosystem processes. Such systems can be represented as hierarchical networks, but can have complex dynamics when interdependencies among levels of the hierarchy occur. We propose integrating ecological and evolutionary processes in hierarchical networks to explore interdependencies in biological systems. We connect gene networks underlying predator-prey trait distributions to food webs. Our approach addresses longstanding questions about how complex traits and intraspecific trait variation affect the interdependencies among biological levels and the stability of meta-ecosystems.

  12. Low-complexity camera digital signal imaging for video document projection system

    NASA Astrophysics Data System (ADS)

    Hsia, Shih-Chang; Tsai, Po-Shien

    2011-04-01

    We present high-performance and low-complexity algorithms for real-time camera imaging applications. The main functions of the proposed camera digital signal processing (DSP) involve color interpolation, white balance, adaptive binary processing, auto gain control, and edge and color enhancement for video projection systems. A series of simulations demonstrate that the proposed method can achieve good image quality while keeping computation cost and memory requirements low. On the basis of the proposed algorithms, the cost-effective hardware core is developed using Verilog HDL. The prototype chip has been verified with one low-cost programmable device. The real-time camera system can achieve 1270 × 792 resolution with the combination of extra components and can demonstrate each DSP function.
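
    Of the DSP stages listed, white balance is the simplest to illustrate. A gray-world correction (a common low-complexity choice; the paper does not specify this exact algorithm) rescales each channel so that all channel means coincide:

```python
import numpy as np

# Gray-world white balance: assume the average scene colour is gray, so
# each channel is scaled until its mean matches the overall mean.
rng = np.random.default_rng(3)
img = rng.uniform(0.0, 1.0, (32, 32, 3))   # synthetic RGB frame in [0, 1]
img[..., 2] *= 0.5                          # simulate a blue-deficient cast

means = img.mean(axis=(0, 1))               # per-channel means
gains = means.mean() / means                # one low-cost gain per channel
balanced = np.clip(img * gains, 0.0, 1.0)

print("channel means after balance:", balanced.mean(axis=(0, 1)).round(3))
```

    Because the correction is just three multiplies per pixel, it fits the paper's goal of low computation cost and a compact hardware core.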

  13. Gathering Information from Transport Systems for Processing in Supply Chains

    NASA Astrophysics Data System (ADS)

    Kodym, Oldřich; Unucka, Jakub

    2016-12-01

    The paper deals with a complex system for processing information from means of transport acting as parts of a train (rail or road). It focuses on automated information gathering using AutoID technology, information transmission via Internet of Things networks, and the use of this information in the information systems of logistics firms to support selected processes at the MES and ERP levels. The different kinds of information gathered across the whole transport chain are discussed, and compliance with existing standards is addressed. Security of information over its full life cycle is an integral part of the presented system. The design of a fully equipped system based on synthesized functional nodes is presented.

  14. Continuous monitoring of the lunar or Martian subsurface using on-board pattern recognition and neural processing of Rover geophysical data

    NASA Technical Reports Server (NTRS)

    Mcgill, J. W.; Glass, C. E.; Sternberg, B. K.

    1990-01-01

    The ultimate goal is to create an extraterrestrial unmanned system for subsurface mapping and exploration. Neural networks are to be used to recognize anomalies in the profiles that correspond to potentially exploitable subsurface features. The ground penetrating radar (GPR) techniques are likewise identical. Hence, the preliminary research focus on GPR systems will be directly applicable to seismic systems once such systems can be designed for continuous operation. The original GPR profile may be very complex due to electrical behavior of the background, targets, and antennas, much as the seismic record is made complex by multiple reflections, ghosting, and ringing. Because the format of the GPR data is similar to the format of seismic data, seismic processing software may be applied to GPR data to help enhance the data. A neural network may then be trained to more accurately identify anomalies from the processed record than from the original record.

  15. The kinetics of lanthanide complexation by EDTA and DTPA in lactate media.

    PubMed

    Nash, K L; Brigham, D; Shehee, T C; Martin, A

    2012-12-28

    The interaction of trivalent lanthanide and actinide cations with polyaminopolycarboxylic acid complexing agents in lactic acid buffer systems is an important feature of the chemistry of the TALSPEAK process for the separation of trivalent actinides from lanthanides. To improve understanding of metal ion coordination chemistry in this process, the results of an investigation of the kinetics of lanthanide complexation by ethylenediamine-N,N,N',N'-tetraacetic acid (EDTA) and diethylenetriamine-N,N,N',N'',N''-pentaacetic acid (DTPA) in 0.3 M lactic acid/0.3 M ionic strength solution are reported. Progress of the reaction was monitored using the distinctive visible spectral changes attendant to lanthanide complexation by the colorimetric indicator ligand Arsenazo III, which enables the experiment but plays no mechanistic role. Under the conditions of these experiments, the reactions occur in a time regime suitable for study by stopped-flow spectrophotometric techniques. Experiments have been conducted as a function of EDTA/DTPA ligand concentration, total lactic acid concentration, and pH. The equilibrium perturbation reaction proceeds as a first order approach to equilibrium over a wide range of conditions, allowing the simultaneous determination of complex formation and dissociation rate constants. The rate of the complexation reaction has been determined for the entire lanthanide series (except Pm(3+)). The predominant pathway for lanthanide-EDTA and lanthanide-DTPA dissociation is inversely dependent on the total lactate concentration; the complex formation reaction demonstrates a direct dependence on [H(+)]. Unexpectedly, the rate of the complex formation reaction is seen in both ligand systems to be fastest for Gd(3+). Correlation of these results indicates that in 0.3 M lactate solutions the exchange of lanthanide ions between lactate complexes and the polyaminopolycarboxylate govern the process.
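
    The "first order approach to equilibrium" means the transient relaxes with a single observed rate, k_obs = k_f[L] + k_d, which is recovered from the absorbance trace by a log-linear fit. A noiseless synthetic version of that analysis (illustrative constants, not the measured TALSPEAK values):

```python
import numpy as np

# Stopped-flow style analysis: A(t) = A_inf + (A0 - A_inf) * exp(-k_obs t)
# for a first-order approach to equilibrium, where k_obs = k_f*[L] + k_d,
# so formation and dissociation constants can be determined simultaneously.
k_f, k_d, L = 1.0, 0.6, 0.5            # hypothetical rate constants and [L]
k_obs_true = k_f * L + k_d             # 1.1 s^-1

t = np.linspace(0.0, 4.0, 40)
A_inf, A0 = 0.30, 1.00                 # hypothetical absorbance endpoints
A = A_inf + (A0 - A_inf) * np.exp(-k_obs_true * t)

# linearize: ln(A - A_inf) = ln(A0 - A_inf) - k_obs * t, then fit the slope
slope, intercept = np.polyfit(t, np.log(A - A_inf), 1)
print(f"recovered k_obs = {-slope:.3f} s^-1")
```

    Repeating the fit across ligand concentrations separates k_f (the slope of k_obs versus [L]) from k_d (the intercept), which is how formation and dissociation rate constants are obtained together.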

  16. The Paperless Solution

    NASA Technical Reports Server (NTRS)

    2001-01-01

REI Systems, Inc. developed a software solution that uses the Internet to eliminate the paperwork typically required to document and manage complex business processes. The data management solution, called Electronic Handbooks (EHBs), is presently used for the entire SBIR program process at NASA. The EHB-based system is ideal for programs and projects whose users are geographically distributed and are involved in complex management processes and procedures. EHBs provide flexible access control and increased communications while maintaining security for systems of all sizes. Through Internet Protocol-based access, user authentication and user-based access restrictions, role-based access control, and encryption/decryption, EHBs provide the level of security required for confidential data transfer. EHBs contain electronic forms and menus, which can be used in real time to execute the described processes. EHBs use standard word processors that generate ASCII HTML code to set up electronic forms that are viewed within a web browser. EHBs require no end-user software distribution, significantly reducing operating costs. Each interactive handbook simulates a hard-copy version containing chapters with descriptions of participants' roles in the online process.

  17. InSAR Deformation Time Series Processed On-Demand in the Cloud

    NASA Astrophysics Data System (ADS)

    Horn, W. B.; Weeden, R.; Dimarchi, H.; Arko, S. A.; Hogenson, K.

    2017-12-01

During this past year, ASF has developed a cloud-based on-demand processing system known as HyP3 (http://hyp3.asf.alaska.edu/), the Hybrid Pluggable Processing Pipeline, for Synthetic Aperture Radar (SAR) data. The system makes it easy for a user who doesn't have the time or inclination to install and use complex SAR processing software to leverage SAR data in their research or operations. One such processing algorithm is generation of a deformation time series product: a series of images representing ground displacement over time, computed from a time series of interferometric SAR (InSAR) products. The software tools necessary to generate this useful product are difficult to install, configure, and use. Moreover, for a long time series with many images, the processing of just the interferograms can take days. Principally built by three undergraduate students at the ASF DAAC, the deformation time series processing relies on the new AWS Batch service, which enables processing of jobs with complex interconnected dependencies in a straightforward and efficient manner. In the case of generating a deformation time series product from a stack of single-look complex SAR images, the system uses Batch to serialize the up-front processing, interferogram generation, optional tropospheric correction, and deformation time series generation. The most time consuming portion is the interferogram generation, because even for a fairly small stack of images many interferograms need to be processed. By using AWS Batch, the interferograms are all generated in parallel; the entire process completes in hours rather than days. Additionally, the individual interferograms are saved in Amazon's cloud storage, so that when new data is acquired in the stack, an updated time series product can be generated with minimal additional processing.
This presentation will focus on the development techniques and enabling technologies that were used in developing the time series processing in the ASF HyP3 system. Data and process flow from job submission through to order completion will be shown, highlighting the benefits of the cloud for each step.
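The fan-out/fan-in dependency pattern described above (many independent interferogram jobs feeding one time-series job) can be sketched in plain Python. The job-record fields here are hypothetical stand-ins for the parameters an AWS Batch job submission would carry, not HyP3's actual job schema:

```python
def build_time_series_dag(scene_ids):
    """Build a fan-out/fan-in job graph: one interferogram job per
    consecutive pair of SAR scenes (all independent, so a scheduler can
    run them in parallel), and a final time-series job that depends on
    every interferogram job."""
    ifg_jobs = [
        {"name": f"ifg-{a}-{b}", "depends_on": []}
        for a, b in zip(scene_ids, scene_ids[1:])
    ]
    ts_job = {
        "name": "deformation-time-series",
        "depends_on": [job["name"] for job in ifg_jobs],
    }
    return ifg_jobs, ts_job
```

Because the interferogram jobs share no dependencies, a batch scheduler is free to run them all concurrently, which is why a stack that takes days to process serially can finish in hours.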

  18. A hierarchical approach to cooperativity in macromolecular and self-assembling binding systems.

    PubMed

    Garcés, Josep Lluís; Acerenza, Luis; Mizraji, Eduardo; Mas, Francesc

    2008-04-01

The study of complex macromolecular binding systems reveals that a high number of states and processes are involved in their mechanism of action, as has become more apparent with the sophistication of the experimental techniques used. The resulting information is often difficult to interpret because of the complexity of the scheme (large size and profuse interactions, including cooperative and self-assembling interactions) and the lack of transparency that this complexity introduces into the interpretation of the indexes traditionally used to describe the binding properties. In particular, cooperative behaviour can be attributed to very different causes, such as direct chemical modification of the binding sites, conformational changes in the whole structure of the macromolecule, aggregation processes between different subunits, etc. In this paper, we propose a novel approach for the analysis of the binding properties of complex macromolecular and self-assembling systems. To quantify the binding behaviour, we use the global association quotient defined as K(c) = [occupied sites]/([free sites] L), L being the free ligand concentration. K(c) can be easily related to other measures of cooperativity (such as the Hill number or the Scatchard plot) and to the free energies involved in the binding processes at each ligand concentration. In a previous work, it was shown that K(c) could be decomposed as an average of equilibrium constants in two ways: intrinsic constants for Adair binding systems and elementary constants for the general case. In this study, we show that these two decompositions are particular cases of a more general expression, where the average is over partial association quotients, associated with subsystems of which the system is composed.
We also show that if the system is split into different subsystems according to a binding hierarchy that starts from the lower, microscopic level and ends at the higher, aggregation level, the global association quotient can be decomposed following the hierarchical levels of macromolecular organisation. In this process, the partial association quotients of one level are expressed, in a recursive way, as a function of the partial quotients of the level that is immediately below, until the microscopic level is reached. As a result, the binding properties of very complex macromolecular systems can be analysed in detail, making the mechanistic explanation of their behaviour transparent. In addition, our approach provides a model-independent interpretation of the intrinsic equilibrium constants in terms of the elementary ones.
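The global association quotient defined above can be written down directly. As an illustration (not the authors' hierarchical decomposition), here it is evaluated for a textbook two-site Adair binding scheme; the stepwise constants K1 and K2 in the test are arbitrary example values:

```python
def adair_two_site_saturation(L, K1, K2):
    """Fractional saturation (occupied sites / total sites) for a
    two-site Adair scheme with stepwise association constants K1, K2."""
    num = K1 * L + 2.0 * K1 * K2 * L**2
    den = 2.0 * (1.0 + K1 * L + K1 * K2 * L**2)
    return num / den

def global_association_quotient(theta, L):
    """K_c = [occupied sites] / ([free sites] * L)
           = theta / ((1 - theta) * L),
    with theta the fractional saturation and L the free ligand
    concentration, as defined in the abstract."""
    return theta / ((1.0 - theta) * L)
```

Plotting K_c against L for different K1/K2 ratios reproduces the familiar signatures of positive and negative cooperativity, which is what makes the quotient a convenient single-number bridge to the Hill number and the Scatchard plot.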

  19. Stochastic tools hidden behind the empirical dielectric relaxation laws

    NASA Astrophysics Data System (ADS)

    Stanislavsky, Aleksander; Weron, Karina

    2017-03-01

The paper is devoted to recent advances in stochastic modeling of anomalous kinetic processes observed in dielectric materials which are prominent examples of disordered (complex) systems. Theoretical studies of dynamical properties of ‘structures with variations’ (Goldenfeld and Kadanoff 1999 Science 284 87-9) require the application of mathematical tools by means of which their random nature can be analyzed and, independently of the details distinguishing various systems (dipolar materials, glasses, semiconductors, liquid crystals, polymers, etc), the empirical universal kinetic patterns can be derived. We begin with a brief survey of the historical background of the dielectric relaxation study. After a short outline of the theoretical ideas providing the random tools applicable to modeling of relaxation phenomena, we present probabilistic implications for the study of the relaxation-rate distribution models. In the framework of the probability distribution of relaxation rates we consider description of complex systems, in which relaxing entities form random clusters interacting with each other and single entities. Then we focus on stochastic mechanisms of the relaxation phenomenon. We discuss the diffusion approach and its usefulness for understanding of anomalous dynamics of relaxing systems. We also discuss extensions of the diffusive approach to systems under tempered random processes. Useful relationships among different stochastic approaches to the anomalous dynamics of complex systems allow us to get a fresh look at this subject. The paper closes with a final discussion on achievements of stochastic tools describing the anomalous time evolution of complex systems.
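For context, two of the empirical universal relaxation patterns this line of work seeks to derive are the Kohlrausch-Williams-Watts stretched-exponential decay in the time domain and the Cole-Cole susceptibility in the frequency domain:

```latex
\phi(t) = \exp\!\left[-(t/\tau)^{\beta}\right],
  \qquad 0 < \beta \le 1
  \qquad \text{(Kohlrausch--Williams--Watts)}

\varepsilon^{*}(\omega) = \varepsilon_{\infty}
  + \frac{\Delta\varepsilon}{1 + (i\omega\tau)^{\alpha}},
  \qquad 0 < \alpha \le 1
  \qquad \text{(Cole--Cole)}
```

Both reduce to classical Debye relaxation when the stretching exponent equals one; values below one are the signature of the anomalous, heavy-tailed relaxation-rate distributions that the stochastic tools in the paper are built to explain.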

  20. Licensing of future mobile satellite systems

    NASA Technical Reports Server (NTRS)

    Lepkowski, Ronald J.

    1990-01-01

    The regulatory process for licensing mobile satellite systems is complex and can require many years to complete. This process involves frequency allocations, national licensing, and frequency coordination. The regulatory process that resulted in the establishment of the radiodetermination satellite service (RDSS) between 1983 and 1987 is described. In contrast, each of these steps in the licensing of the mobile satellite service (MSS) is taking a significantly longer period of time to complete.

  1. Social regulation of emotion: messy layers

    PubMed Central

    Kappas, Arvid

    2013-01-01

Emotions are evolved systems of intra- and interpersonal processes that are regulatory in nature, dealing mostly with issues of personal or social concern. They regulate social interaction and, by extension, the social sphere. In turn, processes in the social sphere regulate emotions of individuals and groups. In other words, intrapersonal processes project in the interpersonal space, and inversely, interpersonal experiences deeply influence intrapersonal processes. Thus, I argue that the concepts of emotion generation and regulation should not be artificially separated. Similarly, interpersonal emotions should not be reduced to interacting systems of intraindividual processes. Instead, we can consider emotions at different social levels, ranging from dyads to large-scale e-communities. The interaction between these levels is complex and does not only involve influences from one level to the next. In this sense the levels of emotion/regulation are messy and a challenge for empirical study. In this article, I discuss the concepts of emotions and regulation at different intra- and interpersonal levels. I extend the concept of auto-regulation of emotions (Kappas, 2008, 2011a,b) to social processes. Furthermore, I argue for the necessity of including mediated communication, particularly in cyberspace in contemporary models of emotion/regulation. Lastly, I suggest the use of concepts from systems dynamics and complex systems to tackle the challenge of the “messy layers.” PMID:23424049

  2. A proven knowledge-based approach to prioritizing process information

    NASA Technical Reports Server (NTRS)

    Corsberg, Daniel R.

    1991-01-01

    Many space-related processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect is rapid analysis of the changing process information. During a disturbance, this task can overwhelm humans as well as computers. Humans deal with this by applying heuristics in determining significant information. A simple, knowledge-based approach to prioritizing information is described. The approach models those heuristics that humans would use in similar circumstances. The approach described has received two patents and was implemented in the Alarm Filtering System (AFS) at the Idaho National Engineering Laboratory (INEL). AFS was first developed for application in a nuclear reactor control room. It has since been used in chemical processing applications, where it has had a significant impact on control room environments. The approach uses knowledge-based heuristics to analyze data from process instrumentation and respond to that data according to knowledge encapsulated in objects and rules. While AFS cannot perform the complete diagnosis and control task, it has proven to be extremely effective at filtering and prioritizing information. AFS was used for over two years as a first level of analysis for human diagnosticians. Given the approach's proven track record in a wide variety of practical applications, it should be useful in both ground- and space-based systems.
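The filtering heuristic described above can be illustrated with a toy suppression rule: an alarm that is an expected consequence of another active alarm is filtered out, so the operator sees the probable root cause first. The alarm names and rule table below are hypothetical examples, not taken from AFS:

```python
def filter_alarms(active_alarms, consequence_of):
    """Keep only alarms that are not explained by another active alarm.
    consequence_of maps an alarm name to the root-cause alarms whose
    presence would account for it (a simple stand-in for the objects
    and rules that encapsulate process knowledge)."""
    active = set(active_alarms)
    return [a for a in active_alarms
            if not any(root in active for root in consequence_of.get(a, ()))]
```

During a transient, a single root-cause event can raise dozens of consequential alarms; a pass like this reduces the list to the few alarms a human diagnostician should look at first, which is the prioritization role the abstract describes.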

  3. Thermal Control Technologies for Complex Spacecraft

    NASA Technical Reports Server (NTRS)

    Swanson, Theodore D.

    2004-01-01

    Thermal control is a generic need for all spacecraft. In response to ever more demanding science and exploration requirements, spacecraft are becoming ever more complex, and hence their thermal control systems must evolve. This paper briefly discusses the process of technology development, the state-of-the-art in thermal control, recent experiences with on-orbit two-phase systems, and the emerging thermal control technologies to meet these evolving needs. Some "lessons learned" based on experience with on-orbit systems are also presented.

  4. Queueing Network Models for Parallel Processing of Task Systems: an Operational Approach

    NASA Technical Reports Server (NTRS)

    Mak, Victor W. K.

    1986-01-01

    Computer performance modeling of possibly complex computations running on highly concurrent systems is considered. Earlier works in this area either dealt with a very simple program structure or resulted in methods with exponential complexity. An efficient procedure is developed to compute the performance measures for series-parallel-reducible task systems using queueing network models. The procedure is based on the concept of hierarchical decomposition and a new operational approach. Numerical results for three test cases are presented and compared to those of simulations.
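As a much simpler illustration of why series-parallel structure keeps the analysis tractable (a deterministic toy, not the paper's operational queueing-network procedure): a series-parallel task system can be evaluated in one recursive pass, with series nodes summing and parallel nodes taking the maximum of their children's completion times.

```python
def completion_time(node):
    """Completion time of a series-parallel task graph with
    deterministic task durations.  A node is one of:
      ("task", duration), ("series", [children]), ("parallel", [children])."""
    kind, payload = node
    if kind == "task":
        return payload
    times = [completion_time(child) for child in payload]
    return sum(times) if kind == "series" else max(times)
```

The hierarchical decomposition the abstract refers to exploits the same recursive shape: each series or parallel subsystem can be analyzed in isolation and replaced by an aggregate, avoiding the exponential blow-up of enumerating all task interleavings.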

  5. Problem solving using soft systems methodology.

    PubMed

    Land, L

    This article outlines a method of problem solving which considers holistic solutions to complex problems. Soft systems methodology allows people involved in the problem situation to have control over the decision-making process.

  6. Sustainability science: accounting for nonlinear dynamics in policy and social-ecological systems

    EPA Science Inventory

    Resilience is an emergent property of complex systems. Understanding resilience is critical for sustainability science, as linked social-ecological systems and the policy process that governs them are characterized by non-linear dynamics. Non-linear dynamics in these systems mean...

  7. Considerations In The Design And Specifications Of An Automatic Inspection System

    NASA Astrophysics Data System (ADS)

    Lee, David T.

    1980-05-01

    Considerable activities have been centered around the automation of manufacturing quality control and inspection functions. Several reasons can be cited for this development. The continuous pressure of direct and indirect labor cost increase is only one of the obvious motivations. With the drive for electronics miniaturization come more and more complex processes where control parameters are critical and the yield is highly susceptible to inadequate process monitor and inspection. With multi-step, multi-layer process for substrate fabrication, process defects that are not detected and corrected at certain critical points may render the entire subassembly useless. As a process becomes more complex, the time required to test the product increases significantly in the total build cycle. The urgency to reduce test time brings more pressure to improve in-process control and inspection. The advances and improvements of components, assemblies and systems such as micro-processors, micro-computers, programmable controllers, and other intelligent devices, have made the automation of quality control much more cost effective and justifiable.

  8. Simulation of groundwater flow in the glacial aquifer system of northeastern Wisconsin with variable model complexity

    USGS Publications Warehouse

    Juckem, Paul F.; Clark, Brian R.; Feinstein, Daniel T.

    2017-05-04

    The U.S. Geological Survey, National Water-Quality Assessment seeks to map estimated intrinsic susceptibility of the glacial aquifer system of the conterminous United States. Improved understanding of the hydrogeologic characteristics that explain spatial patterns of intrinsic susceptibility, commonly inferred from estimates of groundwater age distributions, is sought so that methods used for the estimation process are properly equipped. An important step beyond identifying relevant hydrogeologic datasets, such as glacial geology maps, is to evaluate how incorporation of these resources into process-based models using differing levels of detail could affect resulting simulations of groundwater age distributions and, thus, estimates of intrinsic susceptibility.This report describes the construction and calibration of three groundwater-flow models of northeastern Wisconsin that were developed with differing levels of complexity to provide a framework for subsequent evaluations of the effects of process-based model complexity on estimations of groundwater age distributions for withdrawal wells and streams. Preliminary assessments, which focused on the effects of model complexity on simulated water levels and base flows in the glacial aquifer system, illustrate that simulation of vertical gradients using multiple model layers improves simulated heads more in low-permeability units than in high-permeability units. Moreover, simulation of heterogeneous hydraulic conductivity fields in coarse-grained and some fine-grained glacial materials produced a larger improvement in simulated water levels in the glacial aquifer system compared with simulation of uniform hydraulic conductivity within zones. The relation between base flows and model complexity was less clear; however, the relation generally seemed to follow a similar pattern as water levels. 
Although increased model complexity resulted in improved calibrations, future application of the models using simulated particle tracking is anticipated to evaluate whether these model design considerations are similarly important for the primary modeling objective: to simulate reasonable groundwater age distributions.

  9. Observing Consistency in Online Communication Patterns for User Re-Identification

    PubMed Central

    Venter, Hein S.

    2016-01-01

    Comprehension of the statistical and structural mechanisms governing human dynamics in online interaction plays a pivotal role in online user identification, online profile development, and recommender systems. However, building a characteristic model of human dynamics on the Internet involves a complete analysis of the variations in human activity patterns, which is a complex process. This complexity is inherent in human dynamics and has not been extensively studied to reveal the structural composition of human behavior. A typical method of anatomizing such a complex system is viewing all independent interconnectivity that constitutes the complexity. An examination of the various dimensions of human communication pattern in online interactions is presented in this paper. The study employed reliable server-side web data from 31 known users to explore characteristics of human-driven communications. Various machine-learning techniques were explored. The results revealed that each individual exhibited a relatively consistent, unique behavioral signature and that the logistic regression model and model tree can be used to accurately distinguish online users. These results are applicable to one-to-one online user identification processes, insider misuse investigation processes, and online profiling in various areas. PMID:27918593
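A logistic-regression separator of the kind the study evaluated can be sketched from scratch. The two-dimensional "behavioral" features below are synthetic stand-ins for the real communication-pattern features derived from the server-side web data:

```python
import math

def train_logistic(X, y, lr=0.5, epochs=2000):
    """Plain gradient-descent logistic regression (no regularization),
    fitting weights w and bias b to binary labels y."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))  # predicted P(user = 1)
            g = p - yi                      # gradient of log-loss w.r.t. z
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    """Assign the observation to user 1 if the decision score is positive."""
    return 1 if b + sum(wj * xj for wj, xj in zip(w, x)) > 0 else 0
```

If each user's behavioral signature is consistent over time, as the study found, a separator fit on past sessions will keep assigning new sessions to the right user, which is the basis of the re-identification application.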

  10. The Dynamics of Coalition Formation on Complex Networks

    NASA Astrophysics Data System (ADS)

    Auer, S.; Heitzig, J.; Kornek, U.; Schöll, E.; Kurths, J.

    2015-08-01

Complex networks describe the structure of many socio-economic systems. However, in studies of decision-making processes the evolution of the underlying social relations is disregarded. In this report, we aim to understand the formation of self-organizing domains of cooperation (“coalitions”) on an acquaintance network. We include both the network’s influence on the formation of coalitions and vice versa how the network adapts to the current coalition structure, thus forming a social feedback loop. We increase complexity from simple opinion adaptation processes studied in earlier research to more complex decision-making determined by costs and benefits, and from bilateral to multilateral cooperation. We show how phase transitions emerge from such coevolutionary dynamics, which can be interpreted as processes of great transformations. If the network adaptation rate is high, the social dynamics prevent the formation of a grand coalition and therefore full cooperation. We find some empirical support for our main results: Our model develops a bimodal coalition size distribution over time similar to those found in social structures. Our detection and distinguishing of phase transitions may be exemplary for other models of socio-economic systems with low agent numbers and therefore strong finite-size effects.

  11. The behavior and importance of lactic acid complexation in Talspeak extraction systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grimes, Travis S.; Nilsson, Mikael; Nash, Kenneth L.

    2008-07-01

Advanced partitioning of spent nuclear fuel in the UREX+1a process relies on the TALSPEAK process for separation of fission-product lanthanides from trivalent actinides. The classic TALSPEAK process utilizes an aqueous medium of both lactic acid and diethylenetriaminepentaacetic acid and the extraction reagent di(2-ethylhexyl)phosphoric acid in an aromatic diluent. In this study, the specific role of lactic acid and the complexes involved in the extraction of the trivalent actinides and lanthanides have been investigated using ¹⁴C-labeled lactic acid. Our results show that lactic acid partitions between the phases in a complex fashion. (authors)

  12. Connected Worlds: Connecting the public with complex environmental systems

    NASA Astrophysics Data System (ADS)

    Uzzo, S. M.; Chen, R. S.; Downs, R. R.

    2016-12-01

Among the most important concepts in environmental science learning is the structure and dynamics of coupled human and natural systems (CHANS). But the fundamental epistemology for understanding CHANS requires systems thinking, interdisciplinarity, and complexity. Although the Next Generation Science Standards mandate connecting ideas across disciplines and systems, traditional approaches to education do not provide more than superficial understanding of this concept. Informal science learning institutions have a key role in bridging gaps between the reductive nature of classroom learning and contemporary data-driven science. The New York Hall of Science, in partnership with Design I/O and Columbia University's Center for International Earth Science Information Network, has developed an approach to immerse visitors in complex human-nature interactions and provide opportunities for those of all ages to elicit and notice environmental consequences of their actions. Connected Worlds is a nearly 1,000 m² immersive, playful environment in which students learn about complexity and interconnectedness in ecosystems and how ecosystems might respond to human intervention. It engages students through direct interactions with fanciful flora and fauna within and among six biomes: desert, rainforest, grassland, mountain valley, reservoir, and wetlands, which are interconnected through stocks and flows of water. Through gestures and the manipulation of a dynamic water system, Connected Worlds enables students, teachers, and parents to experience how the ecosystems of planet Earth are connected and to observe relationships between the behavior of Earth's inhabitants and our shared world. It is also a cyberlearning platform to study how visitors notice and scaffold their understanding of complex environmental processes and the responses of these processes to human intervention, to help inform the improvement of education practices in complex environmental science.
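The stock-and-flow coupling of the biomes through water can be sketched in a few lines. The biome names and flow rates here are illustrative, not Connected Worlds' actual model:

```python
def step(stocks, flows, dt=1.0):
    """Advance a stock-and-flow model one time step.  stocks maps each
    biome to its water volume; flows is a list of (source, destination,
    rate) tuples.  Water is conserved, and a source cannot go negative."""
    new = dict(stocks)
    for src, dst, rate in flows:
        moved = min(new[src], rate * dt)
        new[src] -= moved
        new[dst] += moved
    return new
```

Iterating this step while letting visitors redirect the flows is the essence of a stock-and-flow exhibit: local actions (diverting water to one biome) produce system-level consequences (another biome drying out) that emerge only over several steps.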

  13. Science with society in the anthropocene.

    PubMed

    Seidl, Roman; Brand, Fridolin Simon; Stauffacher, Michael; Krütli, Pius; Le, Quang Bao; Spörri, Andy; Meylan, Grégoire; Moser, Corinne; González, Monica Berger; Scholz, Roland Werner

    2013-02-01

    Interdisciplinary scientific knowledge is necessary but not sufficient when it comes to addressing sustainable transformations, as science increasingly has to deal with normative and value-related issues. A systems perspective on coupled human-environmental systems (HES) helps to address the inherent complexities. Additionally, a thorough interaction between science and society (i.e., transdisciplinarity = TD) is necessary, as sustainable transitions are sometimes contested and can cause conflicts. In order to navigate complexities regarding the delicate interaction of scientific research with societal decisions these processes must proceed in a structured and functional way. We thus propose HES-based TD processes to provide a basis for reorganizing science in coming decades.

  14. Reframing the challenges to integrated care: a complex-adaptive systems perspective.

    PubMed

    Tsasis, Peter; Evans, Jenna M; Owen, Susan

    2012-01-01

    Despite over two decades of international experience and research on health systems integration, integrated care has not developed widely. We hypothesized that part of the problem may lie in how we conceptualize the integration process and the complex systems within which integrated care is enacted. This study aims to contribute to discourse regarding the relevance and utility of a complex-adaptive systems (CAS) perspective on integrated care. In the Canadian province of Ontario, government mandated the development of fourteen Local Health Integration Networks in 2006. Against the backdrop of these efforts to integrate care, we collected focus group data from a diverse sample of healthcare professionals in the Greater Toronto Area using convenience and snowball sampling. A semi-structured interview guide was used to elicit participant views and experiences of health systems integration. We use a CAS framework to describe and analyze the data, and to assess the theoretical fit of a CAS perspective with the dominant themes in participant responses. Our findings indicate that integration is challenged by system complexity, weak ties and poor alignment among professionals and organizations, a lack of funding incentives to support collaborative work, and a bureaucratic environment based on a command and control approach to management. Using a CAS framework, we identified several characteristics of CAS in our data, including diverse, interdependent and semi-autonomous actors; embedded co-evolutionary systems; emergent behaviours and non-linearity; and self-organizing capacity. One possible explanation for the lack of systems change towards integration is that we have failed to treat the healthcare system as complex-adaptive. The data suggest that future integration initiatives must be anchored in a CAS perspective, and focus on building the system's capacity to self-organize. 
We conclude that integrating care requires policies and management practices that promote system awareness, relationship-building and information-sharing, and that recognize change as an evolving learning process rather than a series of programmatic steps.

  15. @neurIST: infrastructure for advanced disease management through integration of heterogeneous data, computing, and complex processing services.

    PubMed

    Benkner, Siegfried; Arbona, Antonio; Berti, Guntram; Chiarini, Alessandro; Dunlop, Robert; Engelbrecht, Gerhard; Frangi, Alejandro F; Friedrich, Christoph M; Hanser, Susanne; Hasselmeyer, Peer; Hose, Rod D; Iavindrasana, Jimison; Köhler, Martin; Iacono, Luigi Lo; Lonsdale, Guy; Meyer, Rodolphe; Moore, Bob; Rajasekaran, Hariharan; Summers, Paul E; Wöhrer, Alexander; Wood, Steven

    2010-11-01

    The increasing volume of data describing human disease processes and the growing complexity of understanding, managing, and sharing such data presents a huge challenge for clinicians and medical researchers. This paper presents the @neurIST system, which provides an infrastructure for biomedical research while aiding clinical care, by bringing together heterogeneous data and complex processing and computing services. Although @neurIST targets the investigation and treatment of cerebral aneurysms, the system's architecture is generic enough that it could be adapted to the treatment of other diseases. Innovations in @neurIST include confining the patient data pertaining to aneurysms inside a single environment that offers clinicians the tools to analyze and interpret patient data and make use of knowledge-based guidance in planning their treatment. Medical researchers gain access to a critical mass of aneurysm related data due to the system's ability to federate distributed information sources. A semantically mediated grid infrastructure ensures that both clinicians and researchers are able to seamlessly access and work on data that is distributed across multiple sites in a secure way in addition to providing computing resources on demand for performing computationally intensive simulations for treatment planning and research.

  16. The way to universal and correct medical presentation of diagnostic informations for complex spectrophotometry noninvasive medical diagnostic systems

    NASA Astrophysics Data System (ADS)

    Rogatkin, Dmitrii A.; Tchernyi, Vladimir V.

    2003-07-01

Optical noninvasive diagnostic systems are now widely applied and investigated in many areas of medicine. One such technique is noninvasive spectrophotometry, a complex diagnostic approach combining elastic scattering spectroscopy, absorption spectroscopy, fluorescence diagnostics, photoplethysmography, etc. Today, many real optical diagnostic systems report only technical parameters and physical data as the result of the diagnostic procedure, whereas medical staff need information presented in clinically meaningful terms. This presentation outlines a general approach to developing diagnostic system software that carries the processing of diagnostic data all the way from the physical to the medical level. It is shown that this is a multilevel (three-level) procedure, and that the principal diagnostic result for noninvasive spectrophotometry methods, the biochemical and morphological composition of the examined tissues, emerges at the second level of calculation.

  17. Method for Evaluating Information to Solve Problems of Control, Monitoring and Diagnostics

    NASA Astrophysics Data System (ADS)

    Vasil'ev, V. A.; Dobrynina, N. V.

    2017-06-01

The article describes a method for evaluating information to solve problems of control, monitoring and diagnostics. The method reduces the dimensionality of the informational indicators of a situation, brings them to relative units, calculates generalized information indicators from them, ranks them by characteristic levels, and computes a criterion of system efficiency in real time. On this basis, an information evaluation system has been designed that analyzes, processes and assesses information about an object, which may be a complex technical, economic or social system. The method, and systems based on it, can find wide application in the analysis, processing and evaluation of information on the functioning of systems regardless of their purpose, goals, tasks and complexity; for example, they can be used to assess the innovation capacity of industrial enterprises and management decisions.

  18. Phase transitions in Pareto optimal complex networks

    NASA Astrophysics Data System (ADS)

    Seoane, Luís F.; Solé, Ricard

    2015-09-01

    The organization of interactions in complex systems can be described by networks connecting different units. These graphs are useful representations of the local and global complexity of the underlying systems. The origin of their topological structure can be diverse, resulting from different mechanisms including multiplicative processes and optimization. In spatial networks or in graphs where cost constraints are at work, as occurs in a plethora of situations from power grids to the wiring of neurons in the brain, optimization plays an important part in shaping their organization. In this paper we study network designs resulting from a Pareto optimization process, where different simultaneous constraints are the targets of selection. We analyze three variations on a problem, finding phase transitions of different kinds. Distinct phases are associated with different arrangements of the connections, but the need for drastic topological changes does not determine the presence or the nature of the phase transitions encountered. Instead, the functions under optimization do play a determinant role. This reinforces the view that phase transitions do not arise from intrinsic properties of a system alone, but from the interplay of that system with its external constraints.
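    The Pareto selection described above can be illustrated with a minimal non-dominated filter over candidate designs scored on two objectives to be minimized (say, wiring cost and mean path length). This is an illustrative sketch with made-up sample points, not the paper's actual objectives or search procedure:

```python
# Non-dominated (Pareto) filter: keep every candidate that no other
# candidate beats or ties in all objectives (lower is better here).
# Illustrative only; the paper's optimization is far richer.
def pareto_front(points):
    return [p for p in points
            if not any(q != p and all(qi <= pi for qi, pi in zip(q, p))
                       for q in points)]

# Hypothetical candidate designs as (cost, mean_path_length) pairs.
designs = [(1, 5), (2, 3), (3, 4), (4, 1), (5, 5)]
front = pareto_front(designs)
print(front)  # the mutually non-dominated trade-offs
```

The surviving points form the Pareto front: improving one objective on the front necessarily worsens the other, which is the trade-off structure whose abrupt rearrangements the paper studies as phase transitions.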

  19. Connections Matter: Social Networks and Lifespan Health in Primate Translational Models

    PubMed Central

    McCowan, Brenda; Beisner, Brianne; Bliss-Moreau, Eliza; Vandeleest, Jessica; Jin, Jian; Hannibal, Darcy; Hsieh, Fushing

    2016-01-01

    Humans live in societies full of rich and complex relationships that influence health. The ability to improve human health requires a detailed understanding of the complex interplay of biological systems that contribute to disease processes, including the mechanisms underlying the influence of social contexts on these biological systems. A longitudinal computational systems science approach provides methods uniquely suited to elucidate the mechanisms by which social systems influence health and well-being by investigating how they modulate the interplay among biological systems across the lifespan. In the present report, we argue that nonhuman primate social systems are sufficiently complex to serve as model systems allowing for the development and refinement of both analytical and theoretical frameworks linking social life to health. Ultimately, developing systems science frameworks in nonhuman primate models will speed discovery of the mechanisms that subserve the relationship between social life and human health. PMID:27148103

  20. High Speed PC Based Data Acquisition and Instrumentation for Measurement of Simulated Low Earth Orbit Thermally Induced Disturbances

    NASA Technical Reports Server (NTRS)

    Sills, Joel W., Jr.; Griffin, Thomas J. (Technical Monitor)

    2001-01-01

    The Hubble Space Telescope (HST) Disturbance Verification Test (DVT) was conducted to characterize responses of the Observatory's new set of rigid solar arrays (SA3) to thermally induced 'creak' or stiction releases. The data acquired in the DVT were used in verification of the HST Pointing Control System on-orbit performance, post-Servicing Mission 3B (SM3B). The test simulated the on-orbit environment on a deployed SA3 flight wing. Instrumentation for this test required pretest simulations in order to select the correct sensitivities. Vacuum-compatible, highly accurate accelerometers and force gages were used for this test. The complexity of the test, as well as a short planning schedule, required a data acquisition system that was easy to configure, highly flexible, and extremely robust. A PC Windows oriented data acquisition system meets these requirements, allowing the test engineers to minimize the time required to plan and perform complex environmental tests. The SA3 DVT provided a direct, practical, and complex demonstration of the versatility that PC based data acquisition systems provide. Two PC based data acquisition systems were assembled to acquire, process, distribute, and provide real-time processing for several types of transducers used in the SA3 DVT. A high sample rate digital tape recorder was used to archive the sensor signals. The two systems provided multi-channel hardware and software architecture and were selected based on the test requirements. How these systems acquire and process multiple data rates from different transducer types is discussed, along with the system hardware and software architecture.

  1. A chaotic view of behavior change: a quantum leap for health promotion.

    PubMed

    Resnicow, Ken; Vaughan, Roger

    2006-09-12

    The study of health behavior change, including nutrition and physical activity behaviors, has been rooted in a cognitive-rational paradigm. Change is conceptualized as a linear, deterministic process in which individuals weigh pros and cons, and change occurs at the point where the benefits outweigh the costs. Consistent with this paradigm, the associated statistical models have almost exclusively assumed a linear relationship between psychosocial predictors and behavior. Such a perspective, however, fails to account for non-linear, quantum influences on human thought and action. Consider why, after years of false starts and failed attempts, a person succeeds at increasing their physical activity, eating more healthily, or losing weight. Or why, after years of success, a person relapses. This paper discusses a competing view of health behavior change that was presented at the 2006 annual ISBNPA meeting in Boston. Rather than viewing behavior change from a linear perspective, it can be viewed as a quantum event that can be understood through the lens of Chaos Theory and Complex Dynamic Systems. Key principles of Chaos Theory and Complex Dynamic Systems relevant to understanding health behavior change include: 1) Chaotic systems can be mathematically modeled but are nearly impossible to predict; 2) Chaotic systems are sensitive to initial conditions; 3) Complex Systems involve multiple component parts that interact in a nonlinear fashion; and 4) The results of Complex Systems are often greater than the sum of their parts. Accordingly, small changes in knowledge, attitude, efficacy, etc. may dramatically alter motivation and behavioral outcomes, and the interaction of such variables can yield almost infinite potential patterns of motivation and behavior change. In the linear paradigm, unaccounted-for variance is generally relegated to the catch-all "error" term, when in fact such "error" may represent the chaotic component of the process.
The linear and chaotic paradigms are, however, not mutually exclusive, as behavior change may include both chaotic and cognitive processes. Studies of addiction suggest that many decisions to change are quantum rather than planned events; motivation arrives rather than being planned. Moreover, changes made through quantum processes appear more enduring than those that involve more rational, planned processes. How such processes may apply to nutrition and physical activity behavior and related interventions merits examination.
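    The sensitivity to initial conditions invoked above (principle 2) is easy to demonstrate numerically with the logistic map, a standard toy chaotic system. This sketch is purely illustrative and is not taken from the paper:

```python
# Logistic map x_{n+1} = r * x_n * (1 - x_n), chaotic at r = 4.0.
# Two trajectories that start 1e-9 apart diverge to order-one separation,
# illustrating sensitivity to initial conditions.
def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-9)
gap = [abs(x - y) for x, y in zip(a, b)]
# The initially negligible gap grows by many orders of magnitude,
# which is why chaotic systems can be modeled but not long-term predicted.
print(gap[0], max(gap))
```

In behavioral terms, two near-identical starting states (the same person on two similar days) can end in very different outcomes, which is the paper's argument against purely linear prediction.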

  2. A probabilistic process model for pelagic marine ecosystems informed by Bayesian inverse analysis

    EPA Science Inventory

    Marine ecosystems are complex systems with multiple pathways that produce feedback cycles, which may lead to unanticipated effects. Models abstract this complexity and allow us to predict, understand, and hypothesize. In ecological models, however, the paucity of empirical data...

  3. Synchronization in complex networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arenas, A.; Diaz-Guilera, A.; Moreno, Y.

    Synchronization processes in populations of locally interacting elements are the focus of intense research in physical, biological, chemical, technological and social systems. The many efforts devoted to understanding synchronization phenomena in natural systems now take advantage of the recent theory of complex networks. In this review, we report the advances in the comprehension of synchronization phenomena when oscillating elements are constrained to interact in a complex network topology. We also overview the new emergent features coming out of the interplay between the structure and the function of the underlying pattern of connections. Extensive numerical work as well as analytical approaches to the problem are presented. Finally, we review several applications of synchronization in complex networks to different disciplines: biological systems and neuroscience, engineering and computer science, and economy and social sciences.
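    The canonical model in this literature is the Kuramoto model of coupled phase oscillators, which the review covers at length. Below is a minimal Euler-integrated sketch on an all-to-all network with a standard order parameter r measuring coherence; the network, coupling strength, and step size are illustrative choices, not values from the review:

```python
import math
import random

# Kuramoto dynamics: d(theta_i)/dt = omega_i + K * sum_j sin(theta_j - theta_i),
# with the sum over the neighbors of node i in the interaction network.
def kuramoto_step(theta, omega, adj, K, dt):
    return [th + dt * (omega[i] + K * sum(math.sin(theta[j] - th) for j in adj[i]))
            for i, th in enumerate(theta)]

def order_parameter(theta):
    # r = |mean of exp(i*theta)|: 1 means full synchrony, ~0 means incoherence.
    n = len(theta)
    re = sum(math.cos(t) for t in theta) / n
    im = sum(math.sin(t) for t in theta) / n
    return math.hypot(re, im)

random.seed(0)
n = 20
adj = [[j for j in range(n) if j != i] for i in range(n)]  # complete graph
theta = [random.uniform(0, 2 * math.pi) for _ in range(n)]
omega = [random.gauss(0, 0.1) for _ in range(n)]           # natural frequencies
for _ in range(2000):
    theta = kuramoto_step(theta, omega, adj, K=0.5, dt=0.01)
print(order_parameter(theta))  # strong coupling drives r toward 1
```

Replacing `adj` with a scale-free or small-world adjacency list is exactly the setting the review surveys: the same dynamics, constrained by a complex network topology.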

  4. Indicator system for a process plant control complex

    DOEpatents

    Scarola, Kenneth; Jamison, David S.; Manazir, Richard M.; Rescorl, Robert L.; Harmon, Daryl L.

    1993-01-01

    An advanced control room complex for a nuclear power plant, including a discrete indicator and alarm system (72) which is nuclear qualified for rapid response to changes in plant parameters and a component control system (64) which together provide a discrete monitoring and control capability at a panel (14-22, 26, 28) in the control room (10). A separate data processing system (70), which need not be nuclear qualified, provides integrated and overview information to the control room and to each panel, through CRTs (84) and a large, overhead integrated process status overview board (24). The discrete indicator and alarm system (72) and the data processing system (70) receive inputs from common plant sensors and validate the sensor outputs to arrive at a representative value of the parameter for use by the operator during both normal and accident conditions, thereby avoiding the need for him to assimilate data from each sensor individually. The integrated process status board (24) is at the apex of an information hierarchy that extends through four levels and provides access at each panel to the full display hierarchy. The control room panels are preferably of a modular construction, permitting the definition of inputs and outputs, the man machine interface, and the plant specific algorithms, to proceed in parallel with the fabrication of the panels, the installation of the equipment and the generic testing thereof.

  5. A Framework to Determine New System Requirements Under Design Parameter and Demand Uncertainties

    DTIC Science & Technology

    2015-04-30

    relegates quantitative complexities of decision-making to the method and designates trade-space exploration to the practitioner. We demonstrate the approach...play a critical role in determining new system requirements. Scope and Method of Approach The early stages of the design process have substantial

  6. Human Error In Complex Systems

    NASA Technical Reports Server (NTRS)

    Morris, Nancy M.; Rouse, William B.

    1991-01-01

    Report presents results of research aimed at understanding causes of human error in such complex systems as aircraft, nuclear powerplants, and chemical processing plants. Research considered both slips (errors of action) and mistakes (errors of intention), and the influence of workload on them. Results indicated that humans respond to conditions in which errors are expected by attempting to reduce the incidence of errors, and that adaptation to conditions is a potent influence on human behavior in discretionary situations.

  7. Uncertainties in building a strategic defense.

    PubMed

    Zraket, C A

    1987-03-27

    Building a strategic defense against nuclear ballistic missiles involves complex and uncertain functional, spatial, and temporal relations. Such a defensive system would evolve and grow over decades. It is too complex, dynamic, and interactive to be fully understood initially by design, analysis, and experiments. Uncertainties exist in the formulation of requirements and in the research and design of a defense architecture that can be implemented incrementally and be fully tested to operate reliably. The analysis and measurement of system survivability, performance, and cost-effectiveness are critical to this process. Similar complexities exist for an adversary's system that would suppress or use countermeasures against a missile defense. Problems and opportunities posed by these relations are described, with emphasis on the unique characteristics and vulnerabilities of space-based systems.

  8. Fault management for the Space Station Freedom control center

    NASA Technical Reports Server (NTRS)

    Clark, Colin; Jowers, Steven; Mcnenny, Robert; Culbert, Chris; Kirby, Sarah; Lauritsen, Janet

    1992-01-01

    This paper describes model-based reasoning fault isolation in complex systems using automated digraph analysis. It discusses the use of the digraph representation as the paradigm for modeling physical systems and a method for executing these failure models to provide real-time failure analysis. It also discusses the generality, ease of development and maintenance, complexity management, and amenability to verification and validation of digraph failure models. It specifically describes how a NASA-developed digraph evaluation tool and an automated process working with that tool can identify failures in a monitored system when supplied with one or more fault indications. This approach is well suited to commercial applications of real-time failure analysis in complex systems because it is both powerful and cost effective.

  9. Hybrid modeling and empirical analysis of automobile supply chain network

    NASA Astrophysics Data System (ADS)

    Sun, Jun-yan; Tang, Jian-ming; Fu, Wei-ping; Wu, Bing-ying

    2017-05-01

    Based on the connection mechanism of nodes that automatically select upstream and downstream agents, a simulation model for the dynamic evolutionary process of a consumer-driven automobile supply chain is established by integrating ABM and discrete modeling in a GIS-based map. First, the model's soundness is demonstrated by analyzing the consistency of sales and of changes in various agent parameters between the simulation model and a real automobile supply chain. Second, through complex network theory, hierarchical structures of the model and relationships of networks at different levels are analyzed to calculate characteristic parameters such as mean distance, mean clustering coefficient, and degree distribution, verifying that the model is a typical scale-free, small-world network. Finally, the motion law of the model is analyzed from the perspective of complex adaptive systems. The chaotic state of the simulation system is verified, which suggests that the system has typical nonlinear characteristics. This model not only macroscopically illustrates the dynamic evolution of complex automobile supply chain networks but also microscopically reflects the business process of each agent. Moreover, constructing and simulating the system by combining CAS theory and complex networks supplies a novel method for supply chain analysis, as well as a theoretical basis and practical experience for the supply chain analysis of auto companies.
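    The small-world indicators named above (mean distance, mean clustering coefficient) have simple definitions that can be computed directly on any undirected graph. The sketch below uses a tiny hand-made graph for illustration; it is not the paper's pipeline, and the graph is hypothetical:

```python
from collections import deque

# Local clustering coefficient of node i: the fraction of pairs of i's
# neighbors that are themselves connected.
def clustering(adj, i):
    nbrs = adj[i]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for u in nbrs for v in nbrs if u < v and v in adj[u])
    return 2 * links / (k * (k - 1))

def mean_clustering(adj):
    return sum(clustering(adj, i) for i in adj) / len(adj)

# Mean distance: average shortest-path length, via BFS from every node.
def mean_distance(adj):
    total, pairs = 0, 0
    for s in adj:
        dist, q = {s: 0}, deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

# Toy undirected graph: a triangle (0-1-2) with a pendant node 3.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
print(mean_clustering(adj), mean_distance(adj))
```

A small-world network is characterized by a mean clustering coefficient much higher, and a mean distance comparable to, that of a random graph with the same size and density, which is the comparison the paper makes for its supply chain network.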

  10. Adaptive identifier for uncertain complex nonlinear systems based on continuous neural networks.

    PubMed

    Alfaro-Ponce, Mariel; Cruz, Amadeo Argüelles; Chairez, Isaac

    2014-03-01

    This paper presents the design of a complex-valued differential neural network identifier for uncertain nonlinear systems defined in the complex domain. This design includes the construction of an adaptive algorithm to adjust the parameters included in the identifier. The algorithm is obtained based on a special class of controlled Lyapunov functions. The quality of the identification process is characterized using the practical stability framework. Indeed, the region where the identification error converges is derived by the same Lyapunov method. This zone is defined by the power of uncertainties and perturbations affecting the complex-valued uncertain dynamics. Moreover, this convergence zone is reduced to its lowest possible value using ideas related to the so-called ellipsoid methodology. Two simple but informative numerical examples are developed to show how the identifier proposed in this paper can be used to approximate uncertain nonlinear systems valued in the complex domain.

  11. Unifying Complexity and Information

    NASA Astrophysics Data System (ADS)

    Ke, Da-Guan

    2013-04-01

    Complex systems, arising in many contexts in the computer, life, social, and physical sciences, have not shared a generally accepted complexity measure playing a fundamental role comparable to that of the Shannon entropy H in statistical mechanics. Superficially conflicting criteria of complexity measurement, i.e. complexity-randomness (C-R) relations, have given rise to a special measure intrinsically adaptable to more than one criterion. However, the deep causes of the conflict and of this adaptability remain unclear. Here I trace the root of each representative or adaptable measure to its particular universal data-generating or -regenerating model (UDGM or UDRM). A representative measure for deterministic dynamical systems is found as a counterpart of the H for random processes, clearly redefining the boundary between different criteria. And a specific UDRM achieving the intrinsic adaptability enables a general information measure that ultimately resolves all major disputes. This work encourages a single framework covering deterministic systems, statistical mechanics and real-world living organisms.

  12. Is the Process of Special Measures an Effective Tool for Bringing about Authentic School Improvement?

    ERIC Educational Resources Information Center

    Willis, Lynne

    2010-01-01

    Managing change in education is a complex process, but to do so under the pressure of a punishment-based measurement system (Fullan, 2008) makes sustainable and meaningful change increasingly difficult. Systems which produce high-stakes accountability measures, which bring with them sanctions that create a greater sense of distrust, demoralization…

  13. Assessing Students' Ability to Trace Matter in Dynamic Systems in Cell Biology

    ERIC Educational Resources Information Center

    Wilson, Christopher D.; Anderson, Charles W.; Heidemann, Merle; Merrill, John E.; Merritt, Brett W.; Richmond, Gail; Sibley, Duncan F.; Parker, Joyce M.

    2006-01-01

    College-level biology courses contain many complex processes that are often taught and learned as detailed narratives. These processes can be better understood by perceiving them as dynamic systems that are governed by common fundamental principles. Conservation of matter is such a principle, and thus tracing matter is an essential step in…

  14. Space and time in ecology: Noise or fundamental driver? [chapter 2

    Treesearch

    Samuel A. Cushman

    2010-01-01

    In this chapter I frame the central issue of the book, namely is spatial and temporal complexity in ecological systems merely noise around the predictions of non-spatial, equilibrium processes? Or, alternatively, do spatial and temporal variability in the environment and autogenic space-time processes in populations fundamentally alter system behavior such that ideal...

  15. Context sensitivity and ambiguity in component-based systems design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bespalko, S.J.; Sindt, A.

    1997-10-01

    Designers of component-based, real-time systems need to guarantee the correctness of software and its output. Complexity of a system, and thus the propensity for error, is best characterized by the number of states a component can encounter. In many cases, large numbers of states arise where the processing is highly dependent on context. In these cases, states are often missed, leading to errors. The following are proposals for compactly specifying system states that allow the factoring of complex components into a control module and a semantic processing module. Further, the need for methods that allow for the explicit representation of ambiguity and uncertainty in the design of components is discussed. Presented herein are examples of real-world problems which are highly context-sensitive or are inherently ambiguous.

  16. Technology Evaluation for the Big Spring Water Treatment System at the Y-12 National Security Complex, Oak Ridge, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becthel Jacobs Company LLC

    2002-11-01

    The Y-12 National Security Complex (Y-12 Complex) is an active manufacturing and developmental engineering facility that is located on the U.S. Department of Energy (DOE) Oak Ridge Reservation. Building 9201-2 was one of the first process buildings constructed at the Y-12 Complex. Construction involved relocating and straightening of the Upper East Fork Poplar Creek (UEFPC) channel, adding large quantities of fill material to level areas along the creek, and pumping of concrete into sinkholes and solution cavities present within the limestone bedrock. Flow from a large natural spring designated as ''Big Spring'' on the original 1943 Stone & Webster Building 9201-2 Field Sketch FS6003 was captured and directed to UEFPC through a drainpipe designated Outfall 51. The building was used from 1953 to 1955 for pilot plant operations for an industrial process that involved the use of large quantities of elemental mercury. Past operations at the Y-12 Complex led to the release of mercury to the environment. Significant environmental media at the site were contaminated by accidental releases of mercury from the building process facilities piping and sumps associated with Y-12 Complex mercury handling facilities. Releases to the soil surrounding the buildings have resulted in significant levels of mercury in these areas of contamination, which is ultimately transported to UEFPC, its streambed, and off-site. Bechtel Jacobs Company LLC (BJC) is the DOE-Oak Ridge Operations prime contractor responsible for conducting environmental restoration activities at the Y-12 Complex. In order to mitigate the mercury being released to UEFPC, the Big Spring Water Treatment System will be designed and constructed as a Comprehensive Environmental Response, Compensation, and Liability Act action. This facility will treat the combined flow from Big Spring feeding Outfall 51 and the inflow now being processed at the East End Mercury Treatment System (EEMTS).
Both discharge to UEFPC adjacent to Bldg. 9201-2. The EEMTS treats mercury-contaminated groundwater that collects in sumps in the basement of Bldg. 9201-2. A pre-design study was performed to investigate the applicability of various treatment technologies for reducing mercury discharges at Outfall 51 in support of the design of the Big Spring Water Treatment System. This document evaluates the results of the pre-design study for selection of the mercury removal technology for the treatment system.

  17. Connectivity in the human brain dissociates entropy and complexity of auditory inputs.

    PubMed

    Nastase, Samuel A; Iacovella, Vittorio; Davis, Ben; Hasson, Uri

    2015-03-01

    Complex systems are described according to two central dimensions: (a) the randomness of their output, quantified via entropy; and (b) their complexity, which reflects the organization of a system's generators. Whereas some approaches hold that complexity can be reduced to uncertainty or entropy, an axiom of complexity science is that signals with very high or very low entropy are generated by relatively non-complex systems, while complex systems typically generate outputs with entropy peaking between these two extremes. In understanding their environment, individuals would benefit from coding for both input entropy and complexity; entropy indexes uncertainty and can inform probabilistic coding strategies, whereas complexity reflects a concise and abstract representation of the underlying environmental configuration, which can serve independent purposes, e.g., as a template for generalization and rapid comparisons between environments. Using functional neuroimaging, we demonstrate that, in response to passively processed auditory inputs, functional integration patterns in the human brain track both the entropy and complexity of the auditory signal. Connectivity between several brain regions scaled monotonically with input entropy, suggesting sensitivity to uncertainty, whereas connectivity between other regions tracked entropy in a convex manner consistent with sensitivity to input complexity. These findings suggest that the human brain simultaneously tracks the uncertainty of sensory data and effectively models their environmental generators. Copyright © 2014. Published by Elsevier Inc.
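    The entropy axis described above is standard Shannon entropy of the input. This sketch shows it on three tiny symbol strings, illustrating the axiom stated in the abstract: both very low and very high entropy correspond to simple generators. It is an illustration only, not the study's stimulus construction or estimator:

```python
import math
from collections import Counter

# Shannon entropy of a discretized signal, in bits per symbol:
# H = -sum_x p(x) * log2(p(x)).
def entropy(seq):
    n = len(seq)
    return -sum(c / n * math.log2(c / n) for c in Counter(seq).values())

low = "aaaaaaaa"    # fully predictable: a trivially simple generator
mid = "aabbaabb"    # structured repetition, 1 bit/symbol
high = "abcdefgh"   # maximal entropy for 8 symbols (3 bits): also a
                    # simple generator, namely a uniform random source
print(entropy(low), entropy(mid), entropy(high))
```

Complexity, by contrast, is a property of the generator's organization rather than of the output's unpredictability, which is why the study could dissociate the two dimensions in the brain's connectivity patterns.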

  18. Systematic approach to optimal design of induction heating installations for aluminum extrusion process

    NASA Astrophysics Data System (ADS)

    Zimin, L. S.; Sorokin, A. G.; Egiazaryan, A. S.; Filimonova, O. V.

    2018-03-01

    An induction heating system has a number of inherent benefits compared to traditional heating systems due to its non-contact heating process. It is widely used in vehicle manufacture, cast-rolling, forging, preheating before rolling, heat treatment, galvanizing and so on. Compared to other heating technologies, induction heating has the advantages of high efficiency, fast heating rate and easy control. The paper presents a new systematic approach to the design and operation of induction heating installations (IHI) in aluminum alloy production. The heating temperature in industrial “induction heating - deformation” complexes is not fixed in advance, but is determined so as to maximize or minimize the total economic performance of the metal heating and deformation process. It is shown that an energy-efficient technological complex “IHI – Metal Forming (MF)” can be designed only with regard to its power supply system (PSS), so the task of designing induction heating systems is to provide, together with the power supply system and the forming equipment, the minimum energy cost for metal heating.

  19. Laser micromachining of biofactory-on-a-chip devices

    NASA Astrophysics Data System (ADS)

    Burt, Julian P.; Goater, Andrew D.; Hayden, Christopher J.; Tame, John A.

    2002-06-01

    Excimer laser micromachining provides a flexible means for the manufacture and rapid prototyping of miniaturized systems such as Biofactory-on-a-Chip devices. Biofactories are miniaturized diagnostic devices capable of characterizing, manipulating, separating and sorting suspensions of particles such as biological cells. Such systems operate by exploiting the electrical properties of microparticles and controlling particle movement in AC non-uniform stationary and moving electric fields. Applications of Biofactory devices are diverse and include, among others, the healthcare, pharmaceutical, chemical processing, environmental monitoring and food diagnostic markets. To achieve such characterization and separation, Biofactory devices employ laboratory-on-a-chip type components such as complex multilayer microelectrode arrays, microfluidic channels, manifold systems and on-chip detection systems. Here we discuss the manufacturing requirements of Biofactory devices and describe the use of different excimer laser micromachining methods, both as stand-alone processes and in conjunction with conventional fabrication processes such as photolithography and thermal molding. Particular attention is given to the production of large-area multilayer microelectrode arrays and the manufacture of complex cross-section microfluidic channel systems for use in simple distribution and device interfacing.

  20. High performance embedded system for real-time pattern matching

    NASA Astrophysics Data System (ADS)

    Sotiropoulou, C.-L.; Luciano, P.; Gkaitatzis, S.; Citraro, S.; Giannetti, P.; Dell'Orso, M.

    2017-02-01

    In this paper we present an innovative and high performance embedded system for real-time pattern matching. This system is based on the evolution of hardware and algorithms developed for the field of High Energy Physics and more specifically for the execution of extremely fast pattern matching for tracking of particles produced by proton-proton collisions in hadron collider experiments. A miniaturized version of this complex system is being developed for pattern matching in generic image processing applications. The system works as a contour identifier able to extract the salient features of an image. It is based on the principles of cognitive image processing, which means that it executes fast pattern matching and data reduction mimicking the operation of the human brain. The pattern matching can be executed by a custom designed Associative Memory chip. The reference patterns are chosen by a complex training algorithm implemented on an FPGA device. Post processing algorithms (e.g. pixel clustering) are also implemented on the FPGA. The pattern matching can be executed on a 2D or 3D space, on black and white or grayscale images, depending on the application and thus increasing exponentially the processing requirements of the system. We present the firmware implementation of the training and pattern matching algorithm, performance and results on a latest generation Xilinx Kintex Ultrascale FPGA device.

  1. Application of programmable logic controllers to space simulation

    NASA Technical Reports Server (NTRS)

    Sushon, Janet

    1992-01-01

    Incorporating a state-of-the-art process control and instrumentation system into a complex system for thermal vacuum testing is discussed. The challenge was to connect several independent control systems provided by various vendors to a supervisory computer. This combination will sequentially control and monitor the process, collect the data, and transmit it to a color graphics system for subsequent manipulation. The vacuum system upgrade included: replacement of seventeen diffusion pumps with eight cryogenic pumps and one turbomolecular pump, replacement of a relay-based control system, replacement of the vacuum instrumentation, and an upgrade of the data acquisition system.

  2. Measuring information processing in a client with extreme agitation following traumatic brain injury using the Perceive, Recall, Plan and Perform System of Task Analysis.

    PubMed

    Nott, Melissa T; Chapparo, Christine

    2008-09-01

    Agitation following traumatic brain injury is characterised by a heightened state of activity with disorganised information processing that interferes with learning and achieving functional goals. This study aimed, first, to identify information processing problems during task performance of a severely agitated adult using the Perceive, Recall, Plan and Perform (PRPP) System of Task Analysis; second, to examine the sensitivity of the PRPP System to changes in task performance over a short period of rehabilitation; and third, to evaluate the guidance provided by the PRPP in directing intervention. A case study research design was employed. The PRPP System of Task Analysis was used to assess changes in task-embedded information processing capacity during occupational therapy intervention with a severely agitated adult in a rehabilitation context. Performance was assessed on three selected tasks over a one-month period. Information processing difficulties during task performance could be clearly identified when observing a severely agitated adult following a traumatic brain injury. Processing skills involving attention, sensory processing and planning were most affected at this stage of rehabilitation. These processing difficulties are linked to established descriptions of agitated behaviour. Fluctuations in performance across three tasks of differing processing complexity were evident, leading to hypothesised relationships of task complexity, environment and novelty with information processing errors. Changes in specific information processing capacity over time were evident based on repeated measures using the PRPP System of Task Analysis. This lends preliminary support to its utility as an outcome measure, and raises hypotheses about the type of therapy required to enhance information processing in people with severe agitation.
    The PRPP System is sensitive to information processing changes in severely agitated adults when used to reassess performance over short intervals, and can directly guide occupational therapy intervention to improve task-embedded information processing by categorising errors under the four stages of an information processing model: Perceive, Recall, Plan and Perform.

  3. Distinct amyloid precursor protein processing machineries of the olfactory system.

    PubMed

    Kim, Jae Yeon; Rasheed, Ameer; Yoo, Seung-Jun; Kim, So Yeun; Cho, Bongki; Son, Gowoon; Yu, Seong-Woon; Chang, Keun-A; Suh, Yoo-Hun; Moon, Cheil

    2018-01-01

    Processing of amyloid precursor protein (APP) occurs through sequential cleavages, first by β-secretase and then by the γ-secretase complex. However, abnormal processing of APP leads to excessive production of β-amyloid (Aβ) in the central nervous system (CNS), an event regarded as a primary cause of Alzheimer's disease (AD). In particular, gene mutations of the γ-secretase complex, which contains presenilin 1 or 2 as its catalytic core, can trigger marked Aβ accumulation. Olfactory dysfunction usually occurs before the onset of typical AD-related symptoms (e.g., memory loss or motor impairment), suggesting that the olfactory system may be one of the regions most vulnerable to AD. To date, however, little is known about why the olfactory system is affected by AD so early, before other regions. We therefore examined the distribution of secretases and the levels of APP processing in the olfactory system under healthy and pathological conditions. Here, we show that the olfactory system has distinct APP processing machineries. In particular, we identified higher expression levels and activity of γ-secretase in the olfactory epithelium (OE) than in other regions of the brain, and APP C-terminal fragments (CTFs) were readily detected there. During AD progression, we note increased expression of presenilin 2, a γ-secretase component, in the OE but not in the olfactory bulb (OB), and show that the neurotoxic Aβ*56 accumulates more quickly in the OE. Taken together, these results suggest that the olfactory system has distinct APP processing machineries under healthy and pathological conditions. This finding may provide a crucial understanding of the unique APP-processing mechanisms in the olfactory system, and further highlights the correlation between olfactory deficits and AD symptoms. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Direct observation of multistep energy transfer in LHCII with fifth-order 3D electronic spectroscopy.

    PubMed

    Zhang, Zhengyang; Lambrev, Petar H; Wells, Kym L; Garab, Győző; Tan, Howe-Siang

    2015-07-31

    During photosynthesis, sunlight is efficiently captured by light-harvesting complexes, and the excitation energy is then funneled towards the reaction centre. These photosynthetic excitation energy transfer (EET) pathways are complex and proceed in a multistep fashion. Ultrafast two-dimensional electronic spectroscopy (2DES) is an important tool for studying EET processes in photosynthetic complexes. However, multistep EET processes can only be inferred indirectly, by correlating different cross peaks across a series of 2DES spectra. Here we directly observe multistep EET processes in LHCII using ultrafast fifth-order three-dimensional electronic spectroscopy (3DES). We measure cross peaks in 3DES spectra of LHCII that directly indicate energy transfer from excitons in the chlorophyll b (Chl b) manifold to the low-energy chlorophyll a (Chl a) levels via mid-level Chl a energy states. This new spectroscopic technique is a step towards mapping the complete network of EET processes in photosynthetic systems.

  5. Process Network Approach to Understanding How Forest Ecosystems Adapt to Changes

    NASA Astrophysics Data System (ADS)

    Kim, J.; Yun, J.; Hong, J.; Kwon, H.; Chun, J.

    2011-12-01

    Sustainability challenges are transforming science and its role in society. Complex systems science has emerged as an indispensable field of education and research, which transcends disciplinary boundaries and focuses on understanding the dynamics of complex social-ecological systems (SES). An SES is a combined system of social and ecological components and drivers that interact and give rise to outcomes that could not be understood on the basis of sociological or ecological considerations alone. Both systems, however, may be viewed as a network of processes, and such a network hierarchy may serve as a hinge to bridge social and ecological systems. As a first step toward this goal, we attempted to delineate and interpret such process networks in forest ecosystems, which play a critical role in the cycles of carbon and water from local to global scales. These cycles and their variability, in turn, play an important role in the emergent and self-organizing interactions between forest ecosystems and their environment. Ruddell and Kumar (2009) define a process network as a network of feedback loops and the related time scales, which describe the magnitude and direction of the flow of energy, matter, and information between the different variables in a complex system. Observational evidence, based on micrometeorological eddy covariance measurements, suggests that heterogeneity and disturbances in forest ecosystems in monsoon East Asia may help build resilience for adaptation to change. Yet the principles that characterize the role of variability in these interactions remain elusive. In this presentation, we report results from the analysis of multivariate ecohydrologic and biogeochemical time series data obtained from temperate forest ecosystems in East Asia, based on information flow statistics.
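    The information flow statistic underlying Ruddell and Kumar's process networks is transfer entropy, which measures how much knowing one variable's past reduces uncertainty about another variable's next value. A minimal sketch (lag 1, series already discretized into bins; a plug-in estimator, not the authors' full pipeline):

```python
from collections import Counter
from math import log2

def transfer_entropy(source, target):
    """Lag-1 transfer entropy T(source -> target) in bits, for two
    equal-length, already-binned (discrete-valued) time series."""
    triples = list(zip(target[1:], target[:-1], source[:-1]))  # (y_t+1, y_t, x_t)
    n = len(triples)
    c3 = Counter(triples)
    c2_yy = Counter(zip(target[1:], target[:-1]))   # (y_t+1, y_t)
    c2_yx = Counter(zip(target[:-1], source[:-1]))  # (y_t, x_t)
    c1 = Counter(target[:-1])                       # y_t
    te = 0.0
    for (y1, y0, x0), c in c3.items():
        # p(y1 | y0, x0) / p(y1 | y0), expressed with raw counts
        te += (c / n) * log2((c * c1[y0]) / (c2_yy[(y1, y0)] * c2_yx[(y0, x0)]))
    return te
```

    A process network is then built by computing this statistic for every ordered pair of observed variables (e.g. fluxes and meteorological drivers) and drawing a directed link where the flow is significant.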

  6. Biomedically relevant chemical and physical properties of coal combustion products.

    PubMed Central

    Fisher, G L

    1983-01-01

    The evaluation of the potential public and occupational health hazards of developing and existing combustion processes requires a detailed understanding of the physical and chemical properties of effluents available for human and environmental exposures. These processes produce complex mixtures of gases and aerosols which may interact synergistically or antagonistically with biological systems. Because of the physicochemical complexity of the effluents, the biomedically relevant properties of these materials must be carefully assessed. Subsequent to release from combustion sources, environmental interactions further complicate assessment of the toxicity of combustion products. This report provides an overview of the biomedically relevant physical and chemical properties of coal fly ash. Coal fly ash is presented as a model complex mixture for health and safety evaluation of combustion processes. PMID:6337824

  7. Characterising the development of the understanding of human body systems in high-school biology students - a longitudinal study

    NASA Astrophysics Data System (ADS)

    Snapir, Zohar; Eberbach, Catherine; Ben-Zvi-Assaraf, Orit; Hmelo-Silver, Cindy; Tripto, Jaklin

    2017-10-01

    Science education today has become increasingly focused on research into complex natural, social and technological systems. In this study, we examined the development of high-school biology students' systems understanding of the human body, in a three-year longitudinal study. The development of the students' system understanding was evaluated using the Components Mechanisms Phenomena (CMP) framework for conceptual representation. We coded and analysed the repertory grid personal constructs of 67 high-school biology students at 4 points throughout the study. Our data analysis builds on the assumption that systems understanding entails a perception of all the system categories, including structures within the system (its Components), specific processes and interactions at the macro and micro levels (Mechanisms), and the Phenomena that present the macro scale of processes and patterns within a system. Our findings suggest that as the learning process progressed, the systems understanding of our students became more advanced, moving forward within each of the major CMP categories. Moreover, there was an increase in the mechanism complexity presented by the students, manifested by more students describing mechanisms at the molecular level. Thus, the 'mechanism' category and the micro level are critical components that enable students to understand system-level phenomena such as homeostasis.

  8. PCLIPS: Parallel CLIPS

    NASA Technical Reports Server (NTRS)

    Gryphon, Coranth D.; Miller, Mark D.

    1991-01-01

    PCLIPS (Parallel CLIPS) is a set of extensions to the C Language Integrated Production System (CLIPS) expert system language. PCLIPS is intended to provide an environment for the development of more complex, extensive expert systems. Multiple CLIPS expert systems are now capable of running simultaneously on separate processors, or separate machines, thus dramatically increasing the scope of solvable tasks within the expert systems. As a tool for parallel processing, PCLIPS allows for an expert system to add to its fact-base information generated by other expert systems, thus allowing systems to assist each other in solving a complex problem. This allows individual expert systems to be more compact and efficient, and thus run faster or on smaller machines.
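    The fact-sharing idea, where one expert system asserts facts that another then consumes, can be caricatured with a toy forward-chaining loop over a shared fact base (illustrative Python, not CLIPS syntax; the rule names are invented):

```python
def forward_chain(rules, facts):
    """Fire (premises, conclusion) rules until no new facts appear.
    `rules` is a list of (set_of_premises, conclusion) pairs;
    `facts` is a mutable set playing the role of the fact-base."""
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and premises <= facts:
                facts.add(conclusion)
                changed = True
    return facts

# Two toy "expert systems" cooperating through one shared fact-base.
diagnosis_rules = [({"engine_hot"}, "coolant_low_suspected")]
planning_rules = [({"coolant_low_suspected"}, "schedule_coolant_check")]

shared_facts = {"engine_hot"}
forward_chain(diagnosis_rules, shared_facts)  # system A asserts its conclusion
forward_chain(planning_rules, shared_facts)   # system B builds on A's fact
```

    Each rule set stays small and focused, yet the shared fact base lets the two systems jointly reach a conclusion neither could derive alone, which is the cooperation PCLIPS enables across processors.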

  9. Influence of using challenging tasks in biology classrooms on students' cognitive knowledge structure: an empirical video study

    NASA Astrophysics Data System (ADS)

    Nawani, Jigna; Rixius, Julia; Neuhaus, Birgit J.

    2016-08-01

    Empirical analysis of secondary biology classrooms revealed that, on average, 68% of teaching time in Germany revolved around processing tasks. Quality of instruction can thus be assessed by analyzing the quality of tasks used in classroom discourse. This quasi-experimental study analyzed how teachers used tasks in 38 videotaped biology lessons on the topic 'blood and circulatory system'. Tasks were analyzed along two fundamental characteristics: (1) the required cognitive level of processing (e.g. low-level information processing: repetition, summary, definition, classification; high-level information processing: interpreting and analyzing data, formulating hypotheses, etc.) and (2) the complexity of the task content (e.g. whether tasks require use of factual, linking or concept-level content). Additionally, students' cognitive knowledge structure about the topic 'blood and circulatory system' was measured using student-drawn concept maps (N = 970 students). Finally, linear multilevel models were created with high-level cognitive processing tasks and higher content complexity tasks as class-level predictors, and students' prior knowledge, students' interest in biology, and students' interest in biology activities as control covariates. Results showed a positive influence of high-level cognitive processing tasks (β = 0.07; p < .01) on students' cognitive knowledge structure. However, there was no observed effect of higher content complexity tasks on students' cognitive knowledge structure. The presented findings encourage the use of high-level cognitive processing tasks in biology instruction.

  10. Electronic Timekeeping: North Dakota State University Improves Payroll Processing.

    ERIC Educational Resources Information Center

    Vetter, Ronald J.; And Others

    1993-01-01

    North Dakota State University has adopted automated timekeeping to improve the efficiency and effectiveness of payroll processing. The microcomputer-based system accurately records and computes employee time, tracks labor distribution, accommodates complex labor policies and company pay practices, provides automatic data processing and reporting,…

  11. A Process Management System for Networked Manufacturing

    NASA Astrophysics Data System (ADS)

    Liu, Tingting; Wang, Huifen; Liu, Linyan

    With the development of computers, communication and networks, networked manufacturing has become one of the main manufacturing paradigms of the 21st century. Under a networked manufacturing environment, there exist large numbers of cooperative tasks that are susceptible to alteration, conflicts over resources, and problems of cost and quality, all of which increase the complexity of administration. Process management is a technology used to design, enact, control, and analyze networked manufacturing processes. It supports efficient execution, effective management, conflict resolution, cost containment and quality control. In this paper we propose an integrated process management system for networked manufacturing. Requirements of process management are analyzed and the architecture of the system is presented, and a process model considering process cost and quality is developed. Finally, a case study is provided to explain how the system runs efficiently.

  12. Plants: Novel Developmental Processes.

    ERIC Educational Resources Information Center

    Goldberg, Robert B.

    1988-01-01

    Describes the diversity of plants. Outlines novel developmental and complex genetic processes that are specific to plants. Identifies approaches that can be used to solve problems in plant biology. Cites the advantages of using higher plants for experimental systems. (RT)

  13. Complex adaptive systems: A new approach for understanding health practices.

    PubMed

    Gomersall, Tim

    2018-06-22

    This article explores the potential of complex adaptive systems theory to inform behaviour change research. A complex adaptive system describes a collection of heterogeneous agents interacting within a particular context and adapting to each other's actions. In practical terms, this implies that behaviour change is 1) socially and culturally situated; 2) highly sensitive to small baseline differences in individuals, groups, and intervention components; and 3) determined by multiple components interacting "chaotically". Two approaches to studying complex adaptive systems are briefly reviewed. Agent-based modelling is a computer simulation technique that allows researchers to investigate "what if" questions in a virtual environment. Applied qualitative research techniques, on the other hand, offer a way to examine what happens when an intervention is pursued in real time, and to identify the sorts of rules and assumptions governing social action. Although these represent very different approaches to complexity, there may be scope for mixing the methods, for example by grounding models in insights derived from qualitative fieldwork. Finally, I argue that the concept of complex adaptive systems offers an opportunity to gain a deeper understanding of health-related practices, and to examine the social psychological processes that produce health-promoting or damaging actions.
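    The agent-based modelling approach mentioned above can be sketched with a deliberately tiny threshold model: heterogeneous agents on a ring adopt a behaviour once enough of their neighbours have, illustrating sensitivity to baseline differences. Everything here (ring topology, thresholds, parameter values) is an invented illustration, not a model from the article:

```python
import random

def run_threshold_model(n=50, steps=30, seed=1):
    """Toy complex-adaptive-systems sketch: agents on a ring adopt a
    behaviour once the fraction of adopting neighbours reaches their
    personal threshold; adoption is sticky (no un-adoption).
    Returns (initial, final) adoption fractions."""
    rng = random.Random(seed)
    thresholds = [rng.uniform(0.2, 0.8) for _ in range(n)]       # heterogeneous baselines
    state = [1 if rng.random() < 0.1 else 0 for _ in range(n)]   # a few initial adopters
    initial = sum(state) / n
    for _ in range(steps):
        nxt = state[:]
        for i in range(n):
            neighbour_frac = (state[i - 1] + state[(i + 1) % n]) / 2
            if state[i] == 0 and neighbour_frac >= thresholds[i]:
                nxt[i] = 1
        state = nxt
    return initial, sum(state) / n
```

    Re-running with different seeds (different thresholds and initial adopters) shows how strongly the final adoption level depends on small baseline differences, which is exactly the "what if" experimentation the article describes.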

  14. Image processing for optical mapping.

    PubMed

    Ravindran, Prabu; Gupta, Aditya

    2015-01-01

    Optical Mapping is an established single-molecule, whole-genome analysis system, which has been used to gain a comprehensive understanding of genomic structure and to study structural variation of complex genomes. A critical component of the Optical Mapping system is the image processing module, which extracts single-molecule restriction maps from image datasets of immobilized, restriction-digested and fluorescently stained large DNA molecules. In this review, we describe robust and efficient image processing techniques to process these massive datasets and extract accurate restriction maps in the presence of noise, ambiguity and confounding artifacts. We also highlight a few applications of the Optical Mapping system.

  15. A System for Planning Ahead

    NASA Technical Reports Server (NTRS)

    2002-01-01

    A software system that uses artificial intelligence techniques to help with complex Space Shuttle scheduling at Kennedy Space Center is commercially available. Stottler Henke Associates, Inc.(SHAI), is marketing its automatic scheduling system, the Automated Manifest Planner (AMP), to industries that must plan and project changes many different times before the tasks are executed. The system creates optimal schedules while reducing manpower costs. Using information entered into the system by expert planners, the system automatically makes scheduling decisions based upon resource limitations and other constraints. It provides a constraint authoring system for adding other constraints to the scheduling process as needed. AMP is adaptable to assist with a variety of complex scheduling problems in manufacturing, transportation, business, architecture, and construction. AMP can benefit vehicle assembly plants, batch processing plants, semiconductor manufacturing, printing and textiles, surface and underground mining operations, and maintenance shops. For most of SHAI's commercial sales, the company obtains a service contract to customize AMP to a specific domain and then issues the customer a user license.

  16. Distributed Cognition and Process Management Enabling Individualized Translational Research: The NIH Undiagnosed Diseases Program Experience

    PubMed Central

    Links, Amanda E.; Draper, David; Lee, Elizabeth; Guzman, Jessica; Valivullah, Zaheer; Maduro, Valerie; Lebedev, Vlad; Didenko, Maxim; Tomlin, Garrick; Brudno, Michael; Girdea, Marta; Dumitriu, Sergiu; Haendel, Melissa A.; Mungall, Christopher J.; Smedley, Damian; Hochheiser, Harry; Arnold, Andrew M.; Coessens, Bert; Verhoeven, Steven; Bone, William; Adams, David; Boerkoel, Cornelius F.; Gahl, William A.; Sincan, Murat

    2016-01-01

    The National Institutes of Health Undiagnosed Diseases Program (NIH UDP) applies translational research systematically to diagnose patients with undiagnosed diseases. The challenge is to implement an information system enabling scalable translational research. The authors hypothesized that similar complex problems are resolvable through process management and the distributed cognition of communities. The team, therefore, built the NIH UDP integrated collaboration system (UDPICS) to form virtual collaborative multidisciplinary research networks or communities. UDPICS supports these communities through integrated process management, ontology-based phenotyping, biospecimen management, cloud-based genomic analysis, and an electronic laboratory notebook. UDPICS provided a mechanism for efficient, transparent, and scalable translational research and thereby addressed many of the complex and diverse research and logistical problems of the NIH UDP. Full definition of the strengths and deficiencies of UDPICS will require formal qualitative and quantitative usability and process improvement measurement. PMID:27785453

  17. Crystallization process of a three-dimensional complex plasma

    NASA Astrophysics Data System (ADS)

    Steinmüller, Benjamin; Dietz, Christopher; Kretschmer, Michael; Thoma, Markus H.

    2018-05-01

    Characteristic timescales and length scales for phase transitions of real materials lie in ranges where direct visualization is unfeasible, so model systems can be useful. Here, the crystallization process of a three-dimensional complex plasma under gravity conditions is considered, where the system extends to a large degree into the bulk plasma. Time-resolved measurements reveal the process down to the single-particle level. Primary clusters, consisting of particles in the solid state, grow vertically and, secondarily, horizontally. The box-counting method yields a fractal dimension of d_f ≈ 2.72 for the clusters. This value suggests that the formation process is a combination of local epitaxial and diffusion-limited growth. The particle density and the interparticle distance to the nearest neighbor remain constant within the clusters during crystallization. All results are in good agreement with former observations of a single-particle layer.
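    The box-counting method used here estimates a fractal dimension as the slope of log N(s) against log(1/s), where N(s) is the number of boxes of side s needed to cover the point set. A minimal sketch, checked on a 2-D set with a known dimension (the paper's clusters are 3-D, but the function works for points of any dimension):

```python
from math import log

def box_count(points, s):
    """Number of axis-aligned boxes of side s that contain a point."""
    return len({tuple(int(c // s) for c in p) for p in points})

def box_dimension(points, sizes):
    """Least-squares slope of log N(s) versus log(1/s)."""
    xs = [log(1.0 / s) for s in sizes]
    ys = [log(box_count(points, s)) for s in sizes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Sanity check on a set with a known dimension: the discrete Sierpinski
# triangle (x & y == 0) has box-counting dimension log 3 / log 2 = 1.585.
sierpinski = [(x, y) for x in range(256) for y in range(256) if x & y == 0]
```

    For experimental particle coordinates, the slope is fitted over a range of box sizes between the interparticle distance and the cluster size, since the scaling only holds between those cutoffs.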

  18. Managing IT service management implementation complexity: from the perspective of the Warfield Version of systems science

    NASA Astrophysics Data System (ADS)

    Wan, Jiangping; Jones, James D.

    2013-11-01

    The Warfield version of systems science supports a wide variety of application areas, and is useful to practitioners who use the work program of complexity (WPOC) tool. In this article, WPOC is applied to information technology service management (ITSM) for managing the complexity of projects. In discussing the application of WPOC to ITSM, we discuss several steps of WPOC. The discovery step of WPOC consists of a description process and a diagnosis process. During the description process, 52 risk factors are identified, which are then narrowed to 20 key risk factors; all of this is done through interviews and surveys. Root risk factors (the most basic risk factors) consist of 11 kinds of common 'mindbugs', selected from an interpretive structural model by empirical analysis of 25 kinds of mindbugs. (A lesser aim of this research is to affirm that these mindbugs, developed from a Western mindset, have corresponding relevance in a completely different culture: the People's Republic of China.) During the diagnosis process, the relationships among the root risk factors in the implementation of the ITSM project are identified. The resolution step of WPOC consists of a design process and an implementation process. During the design process, issues related to the ITSM application are compared to both e-Government operation and maintenance, and software process improvement; the ITSM knowledge support structure is also designed at this time. During the implementation process, 10 keys to the successful implementation of ITSM projects are identified.

  19. Multifaceted Modelling of Complex Business Enterprises

    PubMed Central

    2015-01-01

    We formalise and present a new generic multifaceted complex system approach for modelling complex business enterprises. Our method has a strong focus on integrating the various data types available in an enterprise which represent the diverse perspectives of various stakeholders. We explain the challenges faced and define a novel approach to converting diverse data types into usable Bayesian probability forms. The data types that can be integrated include historic data, survey data, and management planning data, expert knowledge and incomplete data. The structural complexities of the complex system modelling process, based on various decision contexts, are also explained along with a solution. This new application of complex system models as a management tool for decision making is demonstrated using a railway transport case study. The case study demonstrates how the new approach can be utilised to develop a customised decision support model for a specific enterprise. Various decision scenarios are also provided to illustrate the versatility of the decision model at different phases of enterprise operations such as planning and control. PMID:26247591

  20. Multifaceted Modelling of Complex Business Enterprises.

    PubMed

    Chakraborty, Subrata; Mengersen, Kerrie; Fidge, Colin; Ma, Lin; Lassen, David

    2015-01-01

    We formalise and present a new generic multifaceted complex system approach for modelling complex business enterprises. Our method has a strong focus on integrating the various data types available in an enterprise which represent the diverse perspectives of various stakeholders. We explain the challenges faced and define a novel approach to converting diverse data types into usable Bayesian probability forms. The data types that can be integrated include historic data, survey data, and management planning data, expert knowledge and incomplete data. The structural complexities of the complex system modelling process, based on various decision contexts, are also explained along with a solution. This new application of complex system models as a management tool for decision making is demonstrated using a railway transport case study. The case study demonstrates how the new approach can be utilised to develop a customised decision support model for a specific enterprise. Various decision scenarios are also provided to illustrate the versatility of the decision model at different phases of enterprise operations such as planning and control.
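    Converting heterogeneous inputs into "usable Bayesian probability forms", as both records above describe, amounts at its core to expressing each source as a prior or a conditional probability table and combining them by Bayes' rule. A minimal discrete sketch; the state names and numbers are invented, loosely echoing the railway case study, and are not taken from the paper:

```python
def bayes_update(prior, likelihood, observation):
    """Discrete Bayes update: `prior` maps states to probabilities
    (e.g. elicited from expert surveys); `likelihood[state][obs]` is
    estimated from historic data. Returns the posterior over states."""
    unnorm = {s: p * likelihood[s][observation] for s, p in prior.items()}
    z = sum(unnorm.values())
    return {s: v / z for s, v in unnorm.items()}

# Invented railway example: is a route segment healthy or degraded,
# given that today's service ran late?
prior = {"healthy": 0.8, "degraded": 0.2}                 # survey-based belief
likelihood = {"healthy": {"late": 0.1, "on_time": 0.9},   # from historic data
              "degraded": {"late": 0.6, "on_time": 0.4}}
posterior = bayes_update(prior, likelihood, "late")
```

    In a full Bayesian network, each node carries such a conditional table and updates propagate through the whole model, but the per-node arithmetic is exactly this normalised product of prior and likelihood.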

  1. A method for multiprotein assembly in cells reveals independent action of kinesins in complex

    PubMed Central

    Norris, Stephen R.; Soppina, Virupakshi; Dizaji, Aslan S.; Schimert, Kristin I.; Sept, David; Cai, Dawen; Sivaramakrishnan, Sivaraj

    2014-01-01

    Teams of processive molecular motors are critical for intracellular transport and organization, yet coordination between motors remains poorly understood. Here, we develop a system using protein components to generate assemblies of defined spacing and composition inside cells. This system is applicable to studying macromolecular complexes in the context of cell signaling, motility, and intracellular trafficking. We use the system to study the emergent behavior of kinesin motors in teams. We find that two kinesin motors in complex act independently (do not help or hinder each other) and can alternate their activities. For complexes containing a slow kinesin-1 and fast kinesin-3 motor, the slow motor dominates motility in vitro but the fast motor can dominate on certain subpopulations of microtubules in cells. Both motors showed dynamic interactions with the complex, suggesting that motor–cargo linkages are sensitive to forces applied by the motors. We conclude that kinesin motors in complex act independently in a manner regulated by the microtubule track. PMID:25365993

  2. [Complex automatic data processing in multi-profile hospitals].

    PubMed

    Dovzhenko, Iu M; Panov, G D

    1990-01-01

    The computerization of data processing in multi-disciplinary hospitals is a key factor in raising the quality of medical care provided to the population, intensifying the work of the personnel, and improving the curative and diagnostic process and the use of resources. Even limited experience with comprehensive computerization at the Botkin Hospital indicates that the automated system improves the quality of data processing, supports a high standard of patient examination, speeds the training of young specialists, and creates conditions for the continuing education of physicians through analysis of their own activity. In large hospitals, the most promising form of computerization is an integrated solution of administrative and clinical-diagnostic tasks on the basis of a hospital-wide terminal network and a hospital-wide data bank.

  3. Cas5d Protein Processes Pre-crRNA and Assembles into a Cascade-like Interference Complex in Subtype I-C/Dvulg CRISPR-Cas System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nam, Ki Hyun; Haitjema, Charles; Liu, Xueqi

    Clustered regularly interspaced short palindromic repeats (CRISPRs), together with an operon of CRISPR-associated (Cas) proteins, form an RNA-based prokaryotic immune system against exogenous genetic elements. Cas5 family proteins are found in several type I CRISPR-Cas systems. Here, we report the molecular function of subtype I-C/Dvulg Cas5d from Bacillus halodurans. We show that Cas5d cleaves pre-crRNA into unit length by recognizing both the hairpin structure and the 3′ single-stranded sequence in the CRISPR repeat region. The Cas5d structure reveals a ferredoxin domain-based architecture and a catalytic triad formed by the Y46, K116, and H117 residues. We further show that after pre-crRNA processing, Cas5d assembles with crRNA, Csd1, and Csd2 proteins to form a multi-subunit interference complex similar in architecture to the Escherichia coli Cascade (CRISPR-associated complex for antiviral defense). Our results suggest that formation of a crRNA-presenting Cascade-like complex is likely a common theme among type I CRISPR subtypes.
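    The unit-length cleavage described above can be caricatured in code: cutting at a fixed offset inside each repeat releases crRNAs of identical length, each carrying one spacer flanked by repeat fragments. The sequences and offset below are invented placeholders, not the real Dvulg repeat or the actual Cas5d cut site:

```python
def cleave_pre_crrna(pre_crrna, repeat, cut_offset):
    """Toy model of repeat-directed processing: cleave the pre-crRNA
    string at position `cut_offset` inside every occurrence of
    `repeat`, so the internal products all share one unit length
    (one spacer plus the surrounding repeat fragments)."""
    cut_sites = []
    i = pre_crrna.find(repeat)
    while i != -1:
        cut_sites.append(i + cut_offset)
        i = pre_crrna.find(repeat, i + 1)
    pieces, prev = [], 0
    for c in cut_sites:
        pieces.append(pre_crrna[prev:c])
        prev = c
    pieces.append(pre_crrna[prev:])
    return pieces

# Invented array: three short repeats flanking two different spacers.
array = "GTCGC" + "AAAA" + "GTCGC" + "CCCC" + "GTCGC"
units = cleave_pre_crrna(array, "GTCGC", 2)
```

    The real enzyme recognises the repeat's hairpin and 3′ tail rather than a literal substring, but the outcome is the same: a ladder of uniform crRNA units ready for loading into the interference complex.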

  4. Visualization-based decision support for value-driven system design

    NASA Astrophysics Data System (ADS)

    Tibor, Elliott

    In the past 50 years, the military, communication, and transportation systems that permeate our world, have grown exponentially in size and complexity. The development and production of these systems has seen ballooning costs and increased risk. This is particularly critical for the aerospace industry. The inability to deal with growing system complexity is a crippling force in the advancement of engineered systems. Value-Driven Design represents a paradigm shift in the field of design engineering that has potential to help counteract this trend. The philosophy of Value-Driven Design places the desires of the stakeholder at the forefront of the design process to capture true preferences and reveal system alternatives that were never previously thought possible. Modern aerospace engineering design problems are large, complex, and involve multiple levels of decision-making. To find the best design, the decision-maker is often required to analyze hundreds or thousands of combinations of design variables and attributes. Visualization can be used to support these decisions, by communicating large amounts of data in a meaningful way. Understanding the design space, the subsystem relationships, and the design uncertainties is vital to the advancement of Value-Driven Design as an accepted process for the development of more effective, efficient, robust, and elegant aerospace systems. This research investigates the use of multi-dimensional data visualization tools to support decision-making under uncertainty during the Value-Driven Design process. A satellite design system comprising a satellite, ground station, and launch vehicle is used to demonstrate effectiveness of new visualization methods to aid in decision support during complex aerospace system design. 
These methods are used to facilitate the exploration of the feasible design space by representing the value impact of system attribute changes and comparing the results of multi-objective optimization formulations with a Value-Driven Design formulation. The visualization methods are also used to assist in the decomposition of a value function, by representing attribute sensitivities to aid with trade-off studies. Lastly, visualization is used to enable greater understanding of the subsystem relationships, by displaying derivative-based couplings, and the design uncertainties, through implementation of utility theory. The use of these visualization methods is shown to enhance the decision-making capabilities of the designer by granting them a more holistic view of the complex design space.

  5. Engineering Complex Embedded Systems with State Analysis and the Mission Data System

    NASA Technical Reports Server (NTRS)

    Ingham, Michel D.; Rasmussen, Robert D.; Bennett, Matthew B.; Moncada, Alex C.

    2004-01-01

    It has become clear that spacecraft system complexity is reaching a threshold where customary methods of control are no longer affordable or sufficiently reliable. At the heart of this problem are the conventional approaches to systems and software engineering based on subsystem-level functional decomposition, which fail to scale in the tangled web of interactions typically encountered in complex spacecraft designs. Furthermore, there is a fundamental gap between the requirements on software specified by systems engineers and the implementation of these requirements by software engineers. Software engineers must perform the translation of requirements into software code, hoping to accurately capture the systems engineer's understanding of the system behavior, which is not always explicitly specified. This gap opens up the possibility for misinterpretation of the systems engineer's intent, potentially leading to software errors. This problem is addressed by a systems engineering methodology called State Analysis, which provides a process for capturing system and software requirements in the form of explicit models. This paper describes how requirements for complex aerospace systems can be developed using State Analysis and how these requirements inform the design of the system software, using representative spacecraft examples.

  6. Analyzing SystemC Designs: SystemC Analysis Approaches for Varying Applications

    PubMed Central

    Stoppe, Jannis; Drechsler, Rolf

    2015-01-01

    The complexity of hardware designs is still increasing according to Moore's law. With embedded systems being more and more intertwined and working together not only with each other, but also with their environments as cyber physical systems (CPSs), more streamlined development workflows are employed to handle the increasing complexity during a system's design phase. SystemC is a C++ library for the design of hardware/software systems, enabling the designer to quickly prototype, e.g., a distributed CPS without having to decide about particular implementation details (such as whether to implement a feature in hardware or in software) early in the design process. Thereby, this approach reduces the initial implementation's complexity by offering an abstract layer with which to build a working prototype. However, as SystemC is based on C++, analyzing designs becomes a difficult task due to the complex language features that are available to the designer. Several fundamentally different approaches for analyzing SystemC designs have been suggested. This work illustrates several different SystemC analysis approaches, including their specific advantages and shortcomings, allowing designers to pick the right tools to assist them with a specific problem during the design of a system using SystemC. PMID:25946632

  7. Analyzing SystemC Designs: SystemC Analysis Approaches for Varying Applications.

    PubMed

    Stoppe, Jannis; Drechsler, Rolf

    2015-05-04

    The complexity of hardware designs is still increasing according to Moore's law. With embedded systems being more and more intertwined and working together not only with each other, but also with their environments as cyber physical systems (CPSs), more streamlined development workflows are employed to handle the increasing complexity during a system's design phase. SystemC is a C++ library for the design of hardware/software systems, enabling the designer to quickly prototype, e.g., a distributed CPS without having to decide about particular implementation details (such as whether to implement a feature in hardware or in software) early in the design process. Thereby, this approach reduces the initial implementation's complexity by offering an abstract layer with which to build a working prototype. However, as SystemC is based on C++, analyzing designs becomes a difficult task due to the complex language features that are available to the designer. Several fundamentally different approaches for analyzing SystemC designs have been suggested. This work illustrates several different SystemC analysis approaches, including their specific advantages and shortcomings, allowing designers to pick the right tools to assist them with a specific problem during the design of a system using SystemC.
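    The analysis difficulty described above stems partly from SystemC's execution model: a discrete-event simulation kernel driving C++ modules. As a rough illustration of that execution model only, here is a toy Python sketch (not SystemC itself; `ToyKernel`, `producer`, and `consumer` are invented names for this example):

```python
import heapq

class ToyKernel:
    """Minimal discrete-event scheduler in the spirit of SystemC's kernel
    (illustrative only; real SystemC is a C++ class library)."""
    def __init__(self):
        self.now = 0
        self._queue = []   # (time, seq, callback)
        self._seq = 0      # tie-breaker keeps same-time events deterministic

    def schedule(self, delay, callback):
        heapq.heappush(self._queue, (self.now + delay, self._seq, callback))
        self._seq += 1

    def run(self):
        while self._queue:
            self.now, _, cb = heapq.heappop(self._queue)
            cb()

# Two "modules" communicating through the scheduler.
log = []
kernel = ToyKernel()

def producer():
    log.append((kernel.now, "produce"))
    kernel.schedule(5, consumer)   # hand off after 5 time units

def consumer():
    log.append((kernel.now, "consume"))

kernel.schedule(0, producer)
kernel.run()
# log is now [(0, 'produce'), (5, 'consume')]
```

    Static analysis of a real SystemC design must recover exactly this kind of scheduling structure from arbitrary C++ code, which is why the paper surveys several fundamentally different approaches.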

  8. Collaborative Systems Thinking: A Response to the Problems Faced by Systems Engineering's 'Middle Tier'

    NASA Technical Reports Server (NTRS)

    Pfarr, Barbara B.; So, Maria M.; Lamb, Caroline Twomey; Rhodes, Donna H.

    2009-01-01

    Experienced systems engineers are adept at more than implementing systems engineering processes: they utilize systems thinking to solve complex engineering problems. Within the space industry, demographics and economic pressures are reducing the number of experienced systems engineers who will be available in the future. Collaborative systems thinking within systems engineering teams is proposed as a way to integrate systems engineers of various experience levels to handle complex systems engineering challenges. This paper uses the GOES-R Program Systems Engineering team to illustrate the enablers of and barriers to team-level systems thinking and to identify ways in which performance could be improved. Ways NASA could expand its engineering training to promote team-level systems thinking are proposed.

  9. Simulating Mercury And Methyl Mercury Stream Concentrations At Multiple Scales in a Wetland Influenced Coastal Plain Watershed (McTier Creek, SC, USA)

    EPA Science Inventory

    Use of mechanistic models to improve understanding: differential, mass-balance, process-based; spatial and temporal resolution; necessary simplifications of system complexity; combining field monitoring and modeling efforts; balance between capturing complexity and maintaining...

  10. High energy density battery based on complex hydrides

    DOEpatents

    Zidan, Ragaiy

    2016-04-26

    A battery and a process of operating a battery system are provided using high-hydrogen-capacity complex hydrides in an organic non-aqueous solvent that allows the transport of hydride ions such as AlH₄⁻ and metal ions during the respective discharging and charging steps.

  11. Automated Derivation of Complex System Constraints from User Requirements

    NASA Technical Reports Server (NTRS)

    Muery, Kim; Foshee, Mark; Marsh, Angela

    2006-01-01

    International Space Station (ISS) payload developers submit their payload science requirements for the development of on-board execution timelines. The ISS systems required to execute the payload science operations must be represented as constraints for the execution timeline. Payload developers use a software application, User Requirements Collection (URC), to submit their requirements by selecting a simplified representation of ISS system constraints. To fully represent the complex ISS systems, the constraints require a level of detail that is beyond the insight of the payload developer. To provide the complex representation of the ISS system constraints, HOSC operations personnel, specifically the Payload Activity Requirements Coordinators (PARC), manually translate the payload developers' simplified constraints into detailed ISS system constraints used for scheduling the payload activities in the Consolidated Planning System (CPS). This paper describes the implementation of a software application, User Requirements Integration (URI), developed to automate this manual ISS constraint translation process.

  12. Biological robustness.

    PubMed

    Kitano, Hiroaki

    2004-11-01

    Robustness is a ubiquitously observed property of biological systems. It is considered to be a fundamental feature of complex evolvable systems. It is attained by several underlying principles that are universal to both biological organisms and sophisticated engineering systems. Robustness facilitates evolvability and robust traits are often selected by evolution. Such a mutually beneficial process is made possible by specific architectural features observed in robust systems. But there are trade-offs between robustness, fragility, performance and resource demands, which explain system behaviour, including the patterns of failure. Insights into inherent properties of robust systems will provide us with a better understanding of complex diseases and a guiding principle for therapy design.

  13. Quaternary Morphodynamics of Fluvial Dispersal Systems Revealed: The Fly River, PNG, and the Sunda Shelf, SE Asia, simulated with the Massively Parallel GPU-based Model 'GULLEM'

    NASA Astrophysics Data System (ADS)

    Aalto, R. E.; Lauer, J. W.; Darby, S. E.; Best, J.; Dietrich, W. E.

    2015-12-01

    During glacial-marine transgressions vast volumes of sediment are deposited due to the infilling of lowland fluvial systems and shallow shelves, material that is removed during ensuing regressions. Modelling these processes would illuminate system morphodynamics, fluxes, and 'complexity' in response to base level change, yet such problems are computationally formidable. Environmental systems are characterized by strong interconnectivity, yet traditional supercomputers have slow inter-node communication -- whereas rapidly advancing Graphics Processing Unit (GPU) technology offers vastly higher (>100x) bandwidths. GULLEM (GpU-accelerated Lowland Landscape Evolution Model) employs massively parallel code to simulate coupled fluvial-landscape evolution for complex lowland river systems over large temporal and spatial scales. GULLEM models the accommodation space carved/infilled by representing a range of geomorphic processes, including: river & tributary incision within a multi-directional flow regime, non-linear diffusion, glacial-isostatic flexure, hydraulic geometry, tectonic deformation, sediment production, transport & deposition, and full 3D tracking of all resulting stratigraphy. Model results concur with the Holocene dynamics of the Fly River, PNG -- as documented with dated cores, sonar imaging of floodbasin stratigraphy, and the observations of topographic remnants from LGM conditions. Other supporting research was conducted along the Mekong River, the largest fluvial system of the Sunda Shelf. These and other field data provide tantalizing empirical glimpses into the lowland landscapes of large rivers during glacial-interglacial transitions, observations that can be explored with this powerful numerical model. GULLEM affords estimates for the timing and flux budgets within the Fly and Sunda Systems, illustrating complex internal system responses to the external forcing of sea level and climate. 
    Furthermore, GULLEM can be applied to almost any fluvial system to explore processes across a wide range of temporal and spatial scales. The presentation will provide insights (& many animations) illustrating river morphodynamics & the resulting landscapes formed as a result of sea level oscillations. [Image: The incised 3.2e6 km^2 Sundaland domain @ 431 ka]
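    Of the geomorphic processes the abstract lists, diffusion is the simplest to sketch. Below is one explicit timestep of linear 1D hillslope diffusion in Python, purely as an illustration of the class of process being modeled; GULLEM's actual numerics (non-linear, 2D, GPU-parallel) are not given in the abstract and are not reproduced here:

```python
def diffuse(z, kappa=0.1, dx=1.0, dt=1.0):
    """One explicit timestep of linear hillslope diffusion,
    dz/dt = kappa * d2z/dx2, with fixed boundary nodes.
    Illustrative stand-in, not GULLEM's actual scheme."""
    znew = z[:]
    for i in range(1, len(z) - 1):
        znew[i] = z[i] + kappa * dt / dx**2 * (z[i-1] - 2*z[i] + z[i+1])
    return znew

profile = [0.0, 0.0, 1.0, 0.0, 0.0]   # a small topographic bump
smoothed = diffuse(profile)
# the bump lowers and spreads to its neighbors; total elevation is conserved
```

    The stability constraint kappa*dt/dx^2 <= 0.5 for explicit schemes like this is one reason large-domain, long-timescale runs become computationally formidable, motivating the GPU approach.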

  14. Redundant Asynchronous Microprocessor System

    NASA Technical Reports Server (NTRS)

    Meyer, G.; Johnston, J. O.; Dunn, W. R.

    1985-01-01

    Fault-tolerant computer structure called RAMPS (for redundant asynchronous microprocessor system) has simplicity of static redundancy but offers intermittent-fault handling ability of complex, dynamically redundant systems. New structure useful wherever several microprocessors are employed for control - in aircraft, industrial processes, robotics, and automatic machining, for example.
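    Static redundancy of the kind RAMPS builds on typically reduces to majority voting across redundant channel outputs. A hedged sketch of that general idea (not the actual RAMPS design; `vote` is an invented helper):

```python
from collections import Counter

def vote(channels):
    """Majority vote across redundant channel outputs. Returns the winning
    value and a flag indicating whether any channel disagreed (useful for
    flagging intermittent faults). Illustrates static redundancy generally,
    not RAMPS's actual asynchronous design."""
    value, count = Counter(channels).most_common(1)[0]
    if count <= len(channels) // 2:
        raise ValueError("no majority among redundant channels")
    return value, count < len(channels)

vote([42, 42, 42])   # -> (42, False): all channels agree
vote([42, 7, 42])    # -> (42, True): one channel faulted, output still correct
```

    The disagreement flag is what lets a system distinguish a transient upset on one channel from a persistent failure, which is the intermittent-fault handling ability the abstract highlights.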

  15. Hydrogeologic processes of large-scale tectonomagmatic complexes in Mongolia-southern Siberia and on Mars

    USGS Publications Warehouse

    Komatsu, G.; Dohm, J.M.; Hare, T.M.

    2004-01-01

    Large-scale tectonomagmatic complexes are common on Earth and Mars. Many of these complexes are created or at least influenced by mantle processes, including a wide array of plume types ranging from superplumes to mantle plumes. Among the most prominent complexes, the Mongolian plateau on Earth and the Tharsis bulge on Mars share remarkable similarities in terms of large domal uplifted areas, great rift canyon systems, and widespread volcanism on their surfaces. Water has also played an important role in the development of the two complexes. In general, atmospheric and surface water play a bigger role in the development of the present-day Mongolian plateau than for the Tharsis bulge, as evidenced by highly developed drainages and thick accumulation of sediments in the basins of the Baikal rift system. On the Tharsis bulge, however, water appears to have remained as ground ice except during periods of elevated magmatic activity. Glacial and periglacial processes are well documented for the Mongolian plateau and are also reported for parts of the Tharsis bulge. Ice-magma interactions, which are represented by the formation of subice volcanoes in parts of the Mongolian plateau region, have been reported for the Valles Marineris region of Mars. The complexes are also characterized by cataclysmic floods, but their triggering mechanism may differ: mainly ice-dam failures for the Mongolian plateau and outburst of groundwater for the Tharsis bulge, probably by magma-ice interactions, although ice-dam failures within the Valles Marineris region cannot be ruled out as a possible contributor. Comparative studies of the Mongolian plateau and Tharsis bulge provide excellent opportunities for understanding surface manifestations of plume-driven processes on terrestrial planets and how they interact with hydro-cryospheres. © 2004 Geological Society of America.

  16. Precision molding of advanced glass optics: innovative production technology for lens arrays and free form optics

    NASA Astrophysics Data System (ADS)

    Pongs, Guido; Bresseler, Bernd; Bergs, Thomas; Menke, Gert

    2012-10-01

    Today isothermal precision molding of imaging glass optics has become a widely applied and integrated production technology in the optical industry. Especially in consumer electronics (e.g. digital cameras, mobile phones, Blu-ray), many optical systems contain rotationally symmetrical aspherical lenses produced by precision glass molding. But due to higher demands on the complexity and miniaturization of optical elements, the established process chain for precision glass molding is no longer sufficient. Wafer-based molding processes for glass optics manufacturing are becoming more and more interesting for mobile phone applications. Cylindrical lens arrays can also be used in high-power laser systems, and the use of unsymmetrical free-form optics allows an increase of efficiency in optical laser systems. Aixtooling is working on different aspects of mold manufacturing technologies and molding processes for extremely complex optical components. In terms of array molding technologies, Aixtooling has developed, together with European partners, a manufacturing technology for the ultra-precision machining of carbide molds. The development covers the machining of multi-lens arrays as well as cylindrical lens arrays. The biggest challenge is the molding of complex free-form optics with no axis of symmetry. Comprehensive CAD/CAM data management along the entire process chain is essential to reach high accuracies on the molded lenses. Within a nationally funded project, Aixtooling is working on a consistent data-handling procedure in the process chain for the precision molding of free-form optics.

  17. Computerized procedures system

    DOEpatents

    Lipner, Melvin H.; Mundy, Roger A.; Franusich, Michael D.

    2010-10-12

    An online, data-driven computerized procedures system that guides an operator through a complex process facility's operating procedures. The system monitors plant data, processes the data and then, based upon this processing, presents the status of the current procedure step and/or substep to the operator. The system supports multiple users, and a single procedure definition supports several interface formats that can be tailored to the individual user. Layered security controls access privileges, and revisions are version-controlled. The procedures run on a server that is platform-independent of the user workstations it interfaces with, and the user interface supports diverse procedural views.

  18. Complex coacervates as a foundation for synthetic underwater adhesives

    PubMed Central

    Stewart, Russell J.; Wang, Ching Shuen; Shao, Hui

    2011-01-01

    Complex coacervation was proposed to play a role in the formation of the underwater bioadhesive of the Sandcastle worm (Phragmatopoma californica) based on the polyacidic and polybasic nature of the glue proteins and the balance of opposite charges at physiological pH. Morphological studies of the secretory system suggested the natural process does not involve complex coacervation as commonly defined. The distinction may not be important because electrostatic interactions likely play an important role in formation of the sandcastle glue. Complex coacervation has also been invoked in the formation of adhesive underwater silk fibers of caddisfly larvae and the adhesive plaques of mussels. A process similar to complex coacervation, that is, condensation and dehydration of biopolyelectrolytes through electrostatic associations, seems plausible for the caddisfly silk. This much is clear, the sandcastle glue complex coacervation model provided a valuable blueprint for the synthesis of a biomimetic, waterborne, underwater adhesive with demonstrated potential for repair of wet tissue. PMID:21081223

  19. Loss of 'complexity' and aging. Potential applications of fractals and chaos theory to senescence

    NASA Technical Reports Server (NTRS)

    Lipsitz, L. A.; Goldberger, A. L.

    1992-01-01

    The concept of "complexity," derived from the field of nonlinear dynamics, can be adapted to measure the output of physiologic processes that generate highly variable fluctuations resembling "chaos." We review data suggesting that physiologic aging is associated with a generalized loss of such complexity in the dynamics of healthy organ system function and hypothesize that such loss of complexity leads to an impaired ability to adapt to physiologic stress. This hypothesis is supported by observations showing an age-related loss of complex variability in multiple physiologic processes including cardiovascular control, pulsatile hormone release, and electroencephalographic potentials. If further research supports this hypothesis, measures of complexity based on chaos theory and the related geometric concept of fractals may provide new ways to monitor senescence and test the efficacy of specific interventions to modify the age-related decline in adaptive capacity.
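    One concrete way to quantify the "complexity" of a fluctuating physiologic signal is permutation entropy (Bandt & Pompe's ordinal-pattern measure). The sketch below illustrates the general idea of a scalar complexity index only; it is not the fractal- or chaos-based measures the authors themselves propose:

```python
import math
import random
from collections import Counter

def permutation_entropy(series, order=3):
    """Normalized permutation entropy: Shannon entropy of the distribution
    of ordinal patterns of length `order`, scaled to [0, 1]. A simple proxy
    for signal 'complexity'; not the measures used in the reviewed paper."""
    patterns = Counter(
        tuple(sorted(range(order), key=lambda k: series[i + k]))
        for i in range(len(series) - order + 1)
    )
    total = sum(patterns.values())
    h = -sum((n / total) * math.log(n / total) for n in patterns.values())
    return h / math.log(math.factorial(order))

flat = [float(i % 2) for i in range(100)]        # rigidly periodic signal
random.seed(1)
noisy = [random.random() for _ in range(100)]    # irregular signal
# the irregular signal scores higher than the periodic one
```

    Under the hypothesis in the abstract, healthy physiologic dynamics would score toward the high-complexity end of such an index, and senescent dynamics would drift lower.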

  20. Evolution of complex adaptations in molecular systems

    PubMed Central

    Pál, Csaba; Papp, Balázs

    2017-01-01

    A central challenge in evolutionary biology concerns the mechanisms by which complex adaptations arise. Such adaptations depend on the fixation of multiple, highly specific mutations, where intermediate stages of evolution seemingly provide little or no benefit. It is generally assumed that the establishment of complex adaptations is very slow in nature, as evolution of such traits demands special population genetic or environmental circumstances. However, blueprints of complex adaptations in molecular systems are pervasive, indicating that they can readily evolve. We discuss the prospects and limitations of non-adaptive scenarios, which assume multiple neutral or deleterious steps in the evolution of complex adaptations. Next, we examine how complex adaptations can evolve by natural selection in a changing environment. Finally, we argue that molecular 'springboards', such as phenotypic heterogeneity and promiscuous interactions, facilitate this process by providing access to new adaptive paths. PMID:28782044

  1. Conference Proceedings: Aptitude, Learning, and Instruction. Volume 2. Cognitive Process Analyses of Learning and Problem Solving,

    DTIC Science & Technology

    1981-01-01

    of a Complex System, 177; Albert L. Stevens and Allan Collins: Introduction 177, Models 182, Conclusion 196; 21. Complex Learning Processes, 199, John R... have called schemata (Norman, Gentner, & Stevens, 1976), frames (Minsky, 1975), and scripts (Schank & Abelson, 1977). If these other authors are... York: McGraw-Hill, 1975. Norman, D. A., Gentner, D. R., & Stevens, A. L. Comments on learning schemata and memory representation. In D. Klahr (Ed

  2. Patient safety - the role of human factors and systems engineering.

    PubMed

    Carayon, Pascale; Wood, Kenneth E

    2010-01-01

    Patient safety is a global challenge that requires knowledge and skills in multiple areas, including human factors and systems engineering. In this chapter, numerous conceptual approaches and methods for analyzing, preventing and mitigating medical errors are described. Given the complexity of healthcare work systems and processes, we emphasize the need for increasing partnerships between the health sciences and human factors and systems engineering to improve patient safety. Those partnerships will be able to develop and implement the system redesigns that are necessary to improve healthcare work systems and processes for patient safety.

  3. Dancing with Swarms: Utilizing Swarm Intelligence to Build, Investigate, and Control Complex Systems

    NASA Astrophysics Data System (ADS)

    Jacob, Christian

    We are surrounded by a natural world of massively parallel, decentralized biological "information processing" systems, a world that exhibits fascinating emergent properties in many ways. In fact, our very own bodies are the result of emergent patterns, as the development of any multi-cellular organism is determined by localized interactions among an enormous number of cells, carefully orchestrated by enzymes, signalling proteins and other molecular "agents". What is particularly striking about these highly distributed developmental processes is that a centralized control agency is completely absent. This is also the case for many other biological systems, such as termites which build their nests without an architect that draws a plan, or brain cells evolving into a complex "mind machine" without an explicit blueprint of a network layout.

  4. Rethinking Communication in Innovation Processes: Creating Space for Change in Complex Systems

    ERIC Educational Resources Information Center

    Leeuwis, Cees; Aarts, Noelle

    2011-01-01

    This paper systematically rethinks the role of communication in innovation processes, starting from largely separate theoretical developments in communication science and innovation studies. Literature review forms the basis of the arguments presented. The paper concludes that innovation is a collective process that involves the contextual…

  5. Language Is a Complex Adaptive System: Position Paper

    ERIC Educational Resources Information Center

    Beckner, Clay; Blythe, Richard; Bybee, Joan; Christiansen, Morten H.; Croft, William; Ellis, Nick C.; Holland, John; Ke, Jinyun; Larsen-Freeman, Diane; Schoenemann, Tom

    2009-01-01

    Language has a fundamentally social function. Processes of human interaction along with domain-general cognitive processes shape the structure and knowledge of language. Recent research in the cognitive sciences has demonstrated that patterns of use strongly affect how language is acquired, is used, and changes. These processes are not independent…

  6. Reengineering the JPL Spacecraft Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, C.

    1995-01-01

    This presentation describes the factors that have emerged in the evolved process of reengineering the unmanned spacecraft design process at the Jet Propulsion Laboratory in Pasadena, California. Topics discussed include: New facilities, new design factors, new system-level tools, complex performance objectives, changing behaviors, design integration, leadership styles, and optimization.

  7. Towards an Intelligent Planning Knowledge Base Development Environment

    NASA Technical Reports Server (NTRS)

    Chien, S.

    1994-01-01

    This abstract describes work in developing knowledge base editing and debugging tools for the Multimission VICAR Planner (MVP) system. MVP uses artificial intelligence planning techniques to automatically construct executable complex image processing procedures (using models of the smaller constituent image processing steps) in response to image processing requests made to the JPL Multimission Image Processing Laboratory.

  8. Multi-mission space science data processing systems - Past, present, and future

    NASA Technical Reports Server (NTRS)

    Stallings, William H.

    1990-01-01

    Packetized telemetry that is consistent with the international Consultative Committee for Space Data Systems (CCSDS) has been baselined for future NASA missions such as Space Station Freedom. Some experiences from past and present multimission systems are examined, including current experiences in implementing a CCSDS standard packetized data processing system, relative to the effectiveness of the multimission approach in lowering life cycle cost and the complexity of meeting new mission needs. It is shown that the continued effort toward standardization of telemetry and processing support will permit the development of multimission systems needed to meet the increased requirements of future NASA missions.
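    The CCSDS packetized telemetry baselined here begins every Space Packet with a six-octet primary header. A sketch of unpacking that header (layout per the public CCSDS 133.0-B recommendation; a toy parser for illustration, not flight software):

```python
import struct

def parse_primary_header(packet):
    """Unpack the six-octet CCSDS Space Packet primary header
    (bit layout per CCSDS 133.0-B; illustrative sketch only)."""
    word1, word2, length = struct.unpack(">HHH", packet[:6])
    return {
        "version":      word1 >> 13,
        "type":         (word1 >> 12) & 0x1,   # 0 = telemetry, 1 = telecommand
        "sec_hdr_flag": (word1 >> 11) & 0x1,
        "apid":         word1 & 0x7FF,         # application process identifier
        "seq_flags":    word2 >> 14,
        "seq_count":    word2 & 0x3FFF,
        "data_length":  length,                # octets in data field minus one
    }

# A telemetry packet: APID 0x123, unsegmented (seq flags 0b11), count 7:
hdr = parse_primary_header(struct.pack(">HHH", 0x0123, 0xC007, 0x0009))
# hdr["apid"] == 0x123, hdr["seq_count"] == 7, hdr["data_length"] == 9
```

    Because every mission's packets share this self-identifying header, a single multimission ground system can demultiplex and route data by APID without mission-specific framing code, which is the cost advantage the abstract argues for.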

  9. Simple processes drive unpredictable differences in estuarine fish assemblages: Baselines for understanding site-specific ecological and anthropogenic impacts

    NASA Astrophysics Data System (ADS)

    Sheaves, Marcus

    2016-03-01

    Predicting patterns of abundance and composition of biotic assemblages is essential to our understanding of key ecological processes, and our ability to monitor, evaluate and manage assemblages and ecosystems. Fish assemblages often vary from estuary to estuary in apparently unpredictable ways, making it challenging to develop a general understanding of the processes that determine assemblage composition. This makes it problematic to transfer understanding from one estuary situation to another and therefore difficult to assemble effective management plans or to assess the impacts of natural and anthropogenic disturbance. Although system-to-system variability is a common property of ecological systems, rather than being random it is the product of complex interactions of multiple causes and effects at a variety of spatial and temporal scales. I investigate the drivers of differences in estuary fish assemblages, to develop a simple model explaining the diversity and complexity of observed estuary-to-estuary differences, and explore its implications for management and conservation. The model attributes apparently unpredictable differences in fish assemblage composition from estuary to estuary to the interaction of species-specific, life history-specific and scale-specific processes. In explaining innate faunal differences among estuaries without the need to invoke complex ecological or anthropogenic drivers, the model provides a baseline against which the effects of additional natural and anthropogenic factors can be evaluated.

  10. Life History of a Topic in an Online Discussion: A Complex Systems Theory Perspective on How One Message Attracts Class Members to Create Meaning Collaboratively

    ERIC Educational Resources Information Center

    Vogler, Jane S.; Schallert, Diane L.; Jordan, Michelle E.; Song, Kwangok; Sanders, Anke J. Z.; Te Chiang, Yueh-hui Yan; Lee, Ji-Eun; Park, Jeongbin Hannah; Yu, Li-Tang

    2017-01-01

    Complex adaptive systems theory served as a framework for this qualitative study exploring the process of how meaning emerges from the collective interactions of individuals in a synchronous online discussion through their shared words about a topic. In an effort to bridge levels of analysis from the individual to the small group to the community,…

  11. A user-system interface agent

    NASA Technical Reports Server (NTRS)

    Wakim, Nagi T.; Srivastava, Sadanand; Bousaidi, Mehdi; Goh, Gin-Hua

    1995-01-01

    Agent-based technologies answer to several challenges posed by additional information processing requirements in today's computing environments. In particular, (1) users desire interaction with computing devices in a mode which is similar to that used between people, (2) the efficiency and successful completion of information processing tasks often require a high-level of expertise in complex and multiple domains, (3) information processing tasks often require handling of large volumes of data and, therefore, continuous and endless processing activities. The concept of an agent is an attempt to address these new challenges by introducing information processing environments in which (1) users can communicate with a system in a natural way, (2) an agent is a specialist and a self-learner and, therefore, it qualifies to be trusted to perform tasks independent of the human user, and (3) an agent is an entity that is continuously active performing tasks that are either delegated to it or self-imposed. The work described in this paper focuses on the development of an interface agent for users of a complex information processing environment (IPE). This activity is part of an on-going effort to build a model for developing agent-based information systems. Such systems will be highly applicable to environments which require a high degree of automation, such as, flight control operations and/or processing of large volumes of data in complex domains, such as the EOSDIS environment and other multidisciplinary, scientific data systems. The concept of an agent as an information processing entity is fully described with emphasis on characteristics of special interest to the User-System Interface Agent (USIA). Issues such as agent 'existence' and 'qualification' are discussed in this paper. 
Based on a definition of an agent and its main characteristics, we propose an architecture for the development of interface agents for users of an IPE that is agent-oriented and whose resources are likely to be distributed and heterogeneous in nature. The architecture of USIA is outlined in two main components: (1) the user interface which is concerned with issues as user dialog and interaction, user modeling, and adaptation to user profile, and (2) the system interface part which deals with identification of IPE capabilities, task understanding and feasibility assessment, and task delegation and coordination of assistant agents.

  12. Reversibility in Quantum Models of Stochastic Processes

    NASA Astrophysics Data System (ADS)

    Gier, David; Crutchfield, James; Mahoney, John; James, Ryan

    Natural phenomena such as time series of neural firing, orientation of layers in crystal stacking and successive measurements in spin-systems are inherently probabilistic. The provably minimal classical models of such stochastic processes are ɛ-machines, which consist of internal states, transition probabilities between states and output values. The topological properties of the ɛ-machine for a given process characterize the structure, memory and patterns of that process. However ɛ-machines are often not ideal because their statistical complexity (Cμ) is demonstrably greater than the excess entropy (E) of the processes they represent. Quantum models (q-machines) of the same processes can do better in that their statistical complexity (Cq) obeys the relation Cμ ≥ Cq ≥ E. q-machines can be constructed to consider longer lengths of strings, resulting in greater compression. With code-words of sufficiently long length, the statistical complexity becomes time-symmetric, a feature apparently novel to this quantum representation. This result has ramifications for compression of classical information in quantum computing and quantum communication technology.
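    The statistical complexity Cμ referred to above is the Shannon entropy of the stationary distribution over an ɛ-machine's causal states. A small sketch computing it for the textbook Golden Mean process (chosen purely as an illustration; it is not stated to be one of the processes the abstract analyzes):

```python
import math

def statistical_complexity(transitions, iters=200):
    """Cmu in bits: Shannon entropy of the stationary distribution over
    causal states, found by power iteration on the state-to-state
    transition probabilities (standard computational-mechanics definition)."""
    states = sorted(transitions)
    pi = {s: 1.0 / len(states) for s in states}
    for _ in range(iters):
        nxt = {s: 0.0 for s in states}
        for s, edges in transitions.items():
            for t, p in edges.items():
                nxt[t] += pi[s] * p
        pi = nxt
    return -sum(p * math.log2(p) for p in pi.values() if p > 0)

# Golden Mean process ("no two 1s in a row"): from state A, emit 0 (stay in A)
# or 1 (go to B) with equal probability; from B, emit 0 and return to A.
golden_mean = {"A": {"A": 0.5, "B": 0.5}, "B": {"A": 1.0}}
cmu = statistical_complexity(golden_mean)
# stationary distribution is (2/3, 1/3), so cmu ≈ 0.918 bits
```

    The excess entropy E of this process is strictly smaller than Cμ, which is exactly the gap the q-machine construction in the abstract is designed to narrow.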

  13. [Soft- and hardware support for the setup for computer tracking of radiation teletherapy].

    PubMed

    Tarutin, I G; Piliavets, V I; Strakh, A G; Minenko, V F; Golubovskiĭ, A I

    1983-06-01

    A hardware and software computer-assisted complex has been developed for gamma-beam therapy. The complex includes all radiotherapeutic units, including a Siemens program-controlled betatron with an energy of 42 MeV, an ES-1022 computer, a Medigraf system for the processing of graphic information, a Mars-256 system for monitoring the homogeneity of the dose-rate distribution over the irradiation field, and a package of mathematical programs for selecting an irradiation plan for various tumor sites. The prospects of using such complexes in the dosimetric support of radiation therapy are discussed.

  14. Work-Facilitating Information Visualization Techniques for Complex Wastewater Systems

    NASA Astrophysics Data System (ADS)

    Ebert, Achim; Einsfeld, Katja

    The design and the operation of urban drainage systems and wastewater treatment plants (WWTP) have become increasingly complex. This complexity is due to increased requirements concerning process technology and technical, environmental, economic, and occupational safety aspects. The plant operator has access not only to a few time-worn files and measured parameters but also to numerous on-line and off-line parameters that characterize the current state of the plant in detail. Moreover, expert databases and specific support pages of plant manufacturers are accessible through the World Wide Web. Thus, the operator is overwhelmed with predominantly unstructured data.

  15. The Physics of Life and Quantum Complex Matter: A Case of Cross-Fertilization

    PubMed Central

    Poccia, Nicola; Bianconi, Antonio

    2011-01-01

    Progress in the science of complexity, from the Big Bang to the coming of humankind, from chemistry and biology to geosciences and medicine, and from materials engineering to energy sciences, is leading to a shift of paradigm in the physical sciences. The focus is on the understanding of non-equilibrium processes in fine-tuned systems. Quantum complex materials such as high-temperature superconductors and living matter are both non-equilibrium and fine-tuned systems. These topics have been subjects of scientific discussion in the Rome Symposium on the “Quantum Physics of Living Matter”. PMID:26791661

  16. Mathematical concepts for modeling human behavior in complex man-machine systems

    NASA Technical Reports Server (NTRS)

    Johannsen, G.; Rouse, W. B.

    1979-01-01

    Many human behavior (e.g., manual control) models have been found to be inadequate for describing processes in certain real complex man-machine systems. An attempt is made to find a way to overcome this problem by examining the range of applicability of existing mathematical models with respect to the hierarchy of human activities in real complex tasks. Automobile driving is chosen as a baseline scenario, and a hierarchy of human activities is derived by analyzing this task in general terms. A structural description leads to a block diagram and a time-sharing computer analogy.

  17. Scalable and Interactive Segmentation and Visualization of Neural Processes in EM Datasets

    PubMed Central

    Jeong, Won-Ki; Beyer, Johanna; Hadwiger, Markus; Vazquez, Amelio; Pfister, Hanspeter; Whitaker, Ross T.

    2011-01-01

    Recent advances in scanning technology provide high resolution EM (Electron Microscopy) datasets that allow neuroscientists to reconstruct complex neural connections in a nervous system. However, due to the enormous size and complexity of the resulting data, segmentation and visualization of neural processes in EM data is usually a difficult and very time-consuming task. In this paper, we present NeuroTrace, a novel EM volume segmentation and visualization system that consists of two parts: a semi-automatic multiphase level set segmentation with 3D tracking for reconstruction of neural processes, and a specialized volume rendering approach for visualization of EM volumes. It employs view-dependent on-demand filtering and evaluation of a local histogram edge metric, as well as on-the-fly interpolation and ray-casting of implicit surfaces for segmented neural structures. Both methods are implemented on the GPU for interactive performance. NeuroTrace is designed to be scalable to large datasets and data-parallel hardware architectures. A comparison of NeuroTrace with a commonly used manual EM segmentation tool shows that our interactive workflow is faster and easier to use for the reconstruction of complex neural processes. PMID:19834227

  18. Path lumping: An efficient algorithm to identify metastable path channels for conformational dynamics of multi-body systems

    NASA Astrophysics Data System (ADS)

    Meng, Luming; Sheong, Fu Kit; Zeng, Xiangze; Zhu, Lizhe; Huang, Xuhui

    2017-07-01

    Constructing Markov state models from large-scale molecular dynamics simulation trajectories is a promising approach to dissect the kinetic mechanisms of complex chemical and biological processes. Combined with transition path theory, Markov state models can be applied to identify all pathways connecting any conformational states of interest. However, the identified pathways can be too complex to comprehend, especially for multi-body processes where numerous parallel pathways with comparable flux probability often coexist. Here, we have developed a path lumping method to group these parallel pathways into metastable path channels for analysis. We define the similarity between two pathways as the intercrossing flux between them and then apply the spectral clustering algorithm to lump these pathways into groups. We demonstrate the power of our method by applying it to two systems: a 2D-potential consisting of four metastable energy channels and the hydrophobic collapse process of two hydrophobic molecules. In both cases, our algorithm successfully reveals the metastable path channels. We expect this path lumping algorithm to be a promising tool for revealing unprecedented insights into the kinetic mechanisms of complex multi-body processes.
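
The lumping step described above can be sketched with generic numerical tools: form a pairwise similarity matrix from the intercrossing fluxes and spectrally cluster it. The 6×6 matrix below is synthetic (two obvious channels by construction) and stands in for real flux data; for two channels, the sign of the Fiedler vector suffices as the clustering step.

```python
import numpy as np

# Synthetic "intercrossing flux" similarities between 6 pathways
# (pathways 0-2 and 3-5 form two channels by construction).
W = np.array([[0, 9, 8, 1, 0, 1],
              [9, 0, 7, 0, 1, 0],
              [8, 7, 0, 1, 0, 1],
              [1, 0, 1, 0, 8, 9],
              [0, 1, 0, 8, 0, 7],
              [1, 0, 1, 9, 7, 0]], dtype=float)

d = W.sum(axis=1)
L = np.eye(6) - W / np.sqrt(np.outer(d, d))  # normalized graph Laplacian
evals, evecs = np.linalg.eigh(L)             # eigh sorts eigenvalues ascending
fiedler = evecs[:, 1]                        # second-smallest eigenvector
labels = (fiedler > 0).astype(int)           # two metastable path channels
print(labels)
```

A full implementation would run k-means on several Laplacian eigenvectors when the number of channels exceeds two.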

  19. Multiscale analysis of information dynamics for linear multivariate processes.

    PubMed

    Faes, Luca; Montalto, Alessandro; Stramaglia, Sebastiano; Nollo, Giandomenico; Marinazzo, Daniele

    2016-08-01

    In the study of complex physical and physiological systems represented by multivariate time series, an issue of great interest is the description of the system dynamics over a range of different temporal scales. While information-theoretic approaches to the multiscale analysis of complex dynamics are being increasingly used, the theoretical properties of the applied measures are poorly understood. This study introduces for the first time a framework for the analytical computation of information dynamics for linear multivariate stochastic processes explored at different time scales. After showing that the multiscale processing of a vector autoregressive (VAR) process introduces a moving average (MA) component, we describe how to represent the resulting VARMA process using state-space (SS) models and how to exploit the SS model parameters to compute analytical measures of information storage and information transfer for the original and rescaled processes. The framework is then used to quantify multiscale information dynamics for simulated unidirectionally and bidirectionally coupled VAR processes, showing that rescaling may lead to insightful patterns of information storage and transfer but also to potentially misleading behaviors.
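
The rescaling operation the abstract refers to, averaging the process over a scale factor tau (a moving-average filter followed by downsampling), can be sketched directly. The bivariate VAR(1) coefficients below are invented for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy bivariate VAR(1): x2 drives x1 with coupling 0.5 (illustrative values).
A = np.array([[0.5, 0.5],
              [0.0, 0.7]])
n = 20000
X = np.zeros((n, 2))
for t in range(1, n):
    X[t] = A @ X[t - 1] + rng.standard_normal(2)

def rescale(X, tau):
    """Average non-overlapping windows of length tau: the low-pass MA
    filtering + downsampling step that turns a VAR into a VARMA."""
    m = (len(X) // tau) * tau
    return X[:m].reshape(-1, tau, X.shape[1]).mean(axis=1)

for tau in (1, 2, 5):
    Y = rescale(X, tau)
    r = np.corrcoef(Y[:-1, 1], Y[1:, 0])[0, 1]  # lagged x2 -> x1 dependence
    print(tau, round(r, 3))
```

This empirical sketch only shows how the rescaled statistics change with tau; the paper's contribution is computing such quantities analytically via the state-space form.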

  20. Digital Radar-Signal Processors Implemented in FPGAs

    NASA Technical Reports Server (NTRS)

    Berkun, Andrew; Andraka, Ray

    2004-01-01

    High-performance digital electronic circuits for onboard processing of return signals in an airborne precipitation-measuring radar system have been implemented in commercially available field-programmable gate arrays (FPGAs). Previously, it was standard practice to downlink the radar-return data to a ground station for postprocessing, a costly practice that prevents the nearly-real-time use of the data for automated targeting. In principle, the onboard processing could be performed by a system of about 20 personal-computer-type microprocessors; relative to such a system, the present FPGA-based processor is much smaller and consumes much less power. Alternatively, the onboard processing could be performed by an application-specific integrated circuit (ASIC), but in comparison with an ASIC implementation, the present FPGA implementation offers the advantages of (1) greater flexibility for research applications like the present one and (2) lower cost in the small production volumes typical of research applications. The generation and processing of signals in the airborne precipitation-measuring radar system in question involves the following especially notable steps: The system utilizes a total of four channels: two carrier frequencies and two polarizations at each frequency. The system uses pulse compression: that is, the transmitted pulse is spread out in time and the received echo of the pulse is processed with a matched filter to despread it. The return signal is band-limited and digitally demodulated to a complex baseband signal that, for each pulse, comprises a large number of samples. Each complex pair of samples (denoted a range gate in radar terminology) is associated with a numerical index that corresponds to a specific time offset from the beginning of the radar pulse, so that each such pair represents the energy reflected from a specific range. This energy and the average echo power are computed. The phase of each range bin is compared to the previous echo by complex-conjugate multiplication to obtain the mean Doppler shift (and hence the mean and variance of the velocity of precipitation) of the echo at that range.
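
The pulse-pair step described in the last sentence can be sketched in a few lines: the angle of the lag-one autocorrelation of the complex samples at one range gate gives the mean Doppler shift. All parameters below (PRF, Doppler frequency, noise level) are invented for the sketch and are not the NASA system's values.

```python
import numpy as np

rng = np.random.default_rng(1)

prf = 4000.0        # pulse repetition frequency, Hz (assumed)
f_dopp = 300.0      # true mean Doppler shift, Hz (assumed)
n_pulses = 128
t = np.arange(n_pulses) / prf

# Complex baseband echo at a single range gate, plus receiver noise.
z = np.exp(2j * np.pi * f_dopp * t)
z += 0.1 * (rng.standard_normal(n_pulses) + 1j * rng.standard_normal(n_pulses))

# Average echo power, and complex-conjugate multiplication of each
# pulse with the previous one (pulse-pair estimator).
power = np.mean(np.abs(z) ** 2)
r1 = np.mean(z[1:] * np.conj(z[:-1]))
f_est = np.angle(r1) * prf / (2 * np.pi)
print(round(f_est, 1))  # close to 300 Hz
```

The estimator is unambiguous only for Doppler shifts within ±PRF/2, which is why the PRF bounds the measurable precipitation velocities.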

  1. Rubber pad forming - Efficient approach for the manufacturing of complex structured sheet metal blanks for food industry

    NASA Astrophysics Data System (ADS)

    Spoelstra, Paul; Djakow, Eugen; Homberg, Werner

    2017-10-01

    The production of complex organic shapes in sheet metals is gaining importance in the food industry due to increasing functional and hygienic demands. It is therefore necessary to produce parts with complex geometries that promote cleanability and general sanitation, leading to improved food safety. In this context, and especially when stainless steel has to be formed into highly complex geometries while maintaining the desired surface properties, alternative manufacturing processes that meet these requirements will inevitably need to be used. Rubber pad forming offers high potential when it comes to shaping complex parts with excellent surface quality, with virtually no tool marks and scratches. Especially in cases where only small series are to be produced, rubber pad forming offers both technological and economic advantages. Due to the flexible punch, variations in metal thickness can be accommodated with the same forming tool. The investment required to set up rubber pad forming is low in comparison to conventional sheet metal forming processes. The process facilitates the production of shallow sheet metal parts with complex contours and bends. Different bending sequences in a multiple-tool set-up can also be conducted. This contribution gives a brief overview of rubber pad technology. It presents the prototype rubber pad forming machine, which can be used to produce complex part geometries made from stainless steel (1.4301). Based on an analysis of existing systems and new machines for rubber pad forming processes, together with their process properties, influencing variables and areas of application, some relevant parts for the food industry are presented.

  2. Engineering healthcare as a service system.

    PubMed

    Tien, James M; Goldschmidt-Clermont, Pascal J

    2010-01-01

    Engineering has had and will continue to have a critical impact on healthcare; the application of technology-based techniques to biological problems can be defined as technobiology applications. This paper is primarily focused on applying the technobiology approach of systems engineering to the development of a healthcare service system that is both integrated and adaptive. In general, healthcare services are carried out with knowledge-intensive agents or components which work together as providers and consumers to create or co-produce value. Indeed, the engineering design of a healthcare system must recognize the fact that it is actually a complex integration of human-centered activities that is increasingly dependent on information technology and knowledge. Like any service system, healthcare can be considered to be a combination or recombination of three essential components - people (characterized by behaviors, values, knowledge, etc.), processes (characterized by collaboration, customization, etc.) and products (characterized by software, hardware, infrastructures, etc.). Thus, a healthcare system is an integrated and adaptive set of people, processes and products. It is, in essence, a system of systems whose objectives are to enhance its efficiency (leading to greater interdependency) and effectiveness (leading to improved health). Integration occurs over the physical, temporal, organizational and functional dimensions, while adaptation occurs over the monitoring, feedback, cybernetic and learning dimensions. In sum, service systems such as healthcare are indeed complex, especially due to the uncertainties associated with the human-centered aspects of these systems. Moreover, the system complexities can only be dealt with by methods that enhance system integration and adaptation.

  3. The Role of Self-Regulated Learning in Fostering Students' Conceptual Understanding of Complex Systems with Hypermedia

    ERIC Educational Resources Information Center

    Azevedo, Roger; Guthrie, John T.; Seibert, Diane

    2004-01-01

    This study examines the role of self-regulated learning (SRL) in facilitating students' shifts to more sophisticated mental models of the circulatory system as indicated by both performance and process data. We began with Winne and colleagues' information processing model of SRL (Winne, 2001; Winne & Hadwin, 1998) and used it to examine how…

  4. DyNAMiC Workbench: an integrated development environment for dynamic DNA nanotechnology

    PubMed Central

    Grun, Casey; Werfel, Justin; Zhang, David Yu; Yin, Peng

    2015-01-01

    Dynamic DNA nanotechnology provides a promising avenue for implementing sophisticated assembly processes, mechanical behaviours, sensing and computation at the nanoscale. However, design of these systems is complex and error-prone, because the need to control the kinetic pathway of a system greatly increases the number of design constraints and possible failure modes for the system. Previous tools have automated some parts of the design workflow, but an integrated solution is lacking. Here, we present software implementing a three ‘tier’ design process: a high-level visual programming language is used to describe systems, a molecular compiler builds a DNA implementation and nucleotide sequences are generated and optimized. Additionally, our software includes tools for analysing and ‘debugging’ the designs in silico, and for importing/exporting designs to other commonly used software systems. The software we present is built on many existing pieces of software, but is integrated into a single package—accessible using a Web-based interface at http://molecular-systems.net/workbench. We hope that the deep integration between tools and the flexibility of this design process will lead to better experimental results, fewer experimental design iterations and the development of more complex DNA nanosystems. PMID:26423437

  5. TMT approach to observatory software development process

    NASA Astrophysics Data System (ADS)

    Buur, Hanne; Subramaniam, Annapurni; Gillies, Kim; Dumas, Christophe; Bhatia, Ravinder

    2016-07-01

    The purpose of the Observatory Software System (OSW) is to integrate all software and hardware components of the Thirty Meter Telescope (TMT) to enable observations and data capture; thus it is a complex software system that is defined by four principal software subsystems: Common Software (CSW), Executive Software (ESW), Data Management System (DMS) and Science Operations Support System (SOSS), all of which have interdependencies with the observatory control systems and data acquisition systems. Therefore, the software development process and plan must consider dependencies on other subsystems; manage architecture, interfaces and design; manage software scope and complexity; and standardize and optimize the use of resources and tools. Additionally, the TMT Observatory Software will largely be developed in India through TMT's workshare relationship with the India TMT Coordination Centre (ITCC) and the use of Indian software industry vendors, which adds complexity and challenges to the software development process, to the communication and coordination of activities and priorities, and to the measurement of performance and the management of quality and risk. The software project management challenge for the TMT OSW is thus multi-faceted: technical, managerial, communications-related and interpersonal.
The approach TMT is using to manage this multifaceted challenge is a combination of establishing an effective geographically distributed software team (Integrated Product Team) with strong project management and technical leadership provided by the TMT Project Office (PO) and the ITCC partner to manage plans, process, performance, risk and quality, and to facilitate effective communications; establishing an effective cross-functional software management team composed of stakeholders, OSW leadership and ITCC leadership to manage dependencies and software release plans, technical complexities and change to approved interfaces, architecture, design and tool set, and to facilitate effective communications; adopting an agile-based software development process across the observatory to enable frequent software releases to help mitigate subsystem interdependencies; defining concise scope and work packages for each of the OSW subsystems to facilitate effective outsourcing of software deliverables to the ITCC partner, and to enable performance monitoring and risk management. At this stage, the architecture and high-level design of the software system has been established and reviewed. During construction each subsystem will have a final design phase with reviews, followed by implementation and testing. The results of the TMT approach to the Observatory Software development process will only be preliminary at the time of the submittal of this paper, but it is anticipated that the early results will be a favorable indication of progress.

  6. Teachers' Beliefs and Practices: A Dynamic and Complex Relationship

    ERIC Educational Resources Information Center

    Zheng, Hongying

    2013-01-01

    Research on teachers' beliefs has provided useful insights into understanding processes of teaching. However, no research has explored teachers' beliefs as a system nor have researchers investigated the substance of interactions between teachers' beliefs, practices and context. Therefore, the author adopts complexity theory to explore the features…

  7. COMPLEX HOST-PARASITE SYSTEMS IN MARTES: IMPLICATIONS FOR CONSERVATION BIOLOGY OF ENDEMIC FAUNAS.

    USDA-ARS?s Scientific Manuscript database

    Complex assemblages of hosts and parasites reveal insights about biogeography and ecology and inform us about processes which serve to structure faunal diversity and the biosphere in space and time. Exploring aspects of parasite diversity among martens (species of Martes) and other mustelids reveal...

  8. Attention Guidance in Learning from a Complex Animation: Seeing Is Understanding?

    ERIC Educational Resources Information Center

    de Koning, Bjorn B.; Tabbers, Huib K.; Rikers, Remy M. J. P.; Paas, Fred

    2010-01-01

    To examine how visual attentional resources are allocated when learning from a complex animation about the cardiovascular system, eye movements were registered in the absence and presence of visual cues. Cognitive processing was assessed using cued retrospective reporting, whereas comprehension and transfer tests measured the quality of the…

  9. [System of technical facilities for equipping the anesthesiologist's work place in the operating room].

    PubMed

    Burlakov, R I; Iurevich, V M

    1981-01-01

    The authors demonstrate the advisability of complex technical provision for certain functional cycles, or parts of the medical technological process. The example given is a modified working place for the anesthesiologist in the operating theatre. The principal and additional devices included in the complex are specified.

  10. Teaching Complex Concepts in the Geosciences by Integrating Analytical Reasoning with GIS

    ERIC Educational Resources Information Center

    Houser, Chris; Bishop, Michael P.; Lemmons, Kelly

    2017-01-01

    Conceptual models have long served as a means for physical geographers to organize their understanding of feedback mechanisms and complex systems. Analytical reasoning provides undergraduate students with an opportunity to develop conceptual models based upon their understanding of surface processes and environmental conditions. This study…

  11. Numerical propulsion system simulation: An interdisciplinary approach

    NASA Technical Reports Server (NTRS)

    Nichols, Lester D.; Chamis, Christos C.

    1991-01-01

    The tremendous progress being made in computational engineering and the rapid growth in computing power that is resulting from parallel processing now make it feasible to consider the use of computer simulations to gain insights into the complex interactions in aerospace propulsion systems and to evaluate new concepts early in the design process before a commitment to hardware is made. Described here is a NASA initiative to develop a Numerical Propulsion System Simulation (NPSS) capability.

  13. The Acoculco caldera magmas: genesis, evolution and relation with the Acoculco geothermal system

    NASA Astrophysics Data System (ADS)

    Sosa-Ceballos, G.; Macías, J. L.; Avellán, D.

    2017-12-01

    The Acoculco Caldera Complex (ACC) is located at the eastern part of the Trans-Mexican Volcanic Belt, México. This caldera complex has been active since 2.7 Ma through reactivations of the system or associated magmatism. The ACC is therefore an excellent case scenario for investigating the relation between the magmatic heat supply and the evolution processes that modified magmatic reservoirs in a potential geothermal field. We investigated the origin of the ACC rocks and the magmatic processes (magma mixing, assimilation and crystallization) that modified them by petrography, major-oxide and trace-element geochemistry, and isotopic analysis. Magma mixing is considered the heat supply that keeps the magmatic system active, whereas assimilation yielded insights about the depth at which these processes occurred. In addition, we performed a series of hydrothermal experiments to constrain the storage depth of the magma tapped during the caldera collapse. Rocks from the ACC were catalogued as pre-, syn- and post-caldera. The post-caldera rocks are peralkaline rhyolites, in contrast to all other rocks, which are subalkaline. Our investigation focuses on whether the collapse modified the plumbing system and the depth at which magmas stagnate and record the magmatic processes.

  14. A situation-response model for intelligent pilot aiding

    NASA Technical Reports Server (NTRS)

    Schudy, Robert; Corker, Kevin

    1987-01-01

    An intelligent pilot aiding system needs models of the pilot information processing to provide the computational basis for successful cooperation between the pilot and the aiding system. By combining artificial intelligence concepts with the human information processing model of Rasmussen, an abstraction hierarchy of states of knowledge, processing functions, and shortcuts are developed, which is useful for characterizing the information processing both of the pilot and of the aiding system. This approach is used in the conceptual design of a real time intelligent aiding system for flight crews of transport aircraft. One promising result was the tentative identification of a particular class of information processing shortcuts, from situation characterizations to appropriate responses, as the most important reliable pathway for dealing with complex time critical situations.

  15. Operationally efficient propulsion system study (OEPSS) data book. Volume 7; Launch Operations Index (LOI) Design Features and Options

    NASA Technical Reports Server (NTRS)

    Ziese, James M.

    1992-01-01

    A figure-of-merit design tool was developed that allows the operability of a propulsion system design to be measured. This Launch Operations Index (LOI) relates operations efficiency to system complexity. The figure of merit can be used by conceptual designers to compare different propulsion system designs based on their impact on launch operations. The LOI will improve the design process by ensuring that direct launch operations experience feeds back into the design process.

  16. Automated complex for research of electric drives control

    NASA Astrophysics Data System (ADS)

    Avlasko, P. V.; Antonenko, D. A.

    2018-05-01

    In this article, an automated complex intended for the study of various control modes of electric motors, including a doubly-fed induction motor, is described. The National Instruments platform was chosen as the basis of the complex. The platform's built-in controller is supplied with a real-time operating system for creating measurement and control systems. The software, developed in the LabVIEW environment, consists of several interconnected modules residing in different elements of the complex. In addition to the software for automated control of the experimental installation, a program complex was developed for modelling processes in the electric drive. As a result, simulated and experimentally obtained transient characteristics of the electric drive can be compared in various operating modes.

  17. Clinical application of three-dimensional printing to the management of complex univentricular hearts with abnormal systemic or pulmonary venous drainage.

    PubMed

    McGovern, Eimear; Kelleher, Eoin; Snow, Aisling; Walsh, Kevin; Gadallah, Bassem; Kutty, Shelby; Redmond, John M; McMahon, Colin J

    2017-09-01

    In recent years, three-dimensional printing has demonstrated reliable reproducibility of several organs including hearts with complex congenital cardiac anomalies. This represents the next step in advanced image processing and can be used to plan surgical repair. In this study, we describe three children with complex univentricular hearts and abnormal systemic or pulmonary venous drainage, in whom three-dimensional printed models based on CT data assisted with preoperative planning. For two children, after group discussion and examination of the models, a decision was made not to proceed with surgery. We extend the current clinical experience with three-dimensional printed modelling and discuss the benefits of such models in the setting of managing complex surgical problems in children with univentricular circulation and abnormal systemic or pulmonary venous drainage.

  18. A Spatially Continuous Model of Carbohydrate Digestion and Transport Processes in the Colon

    PubMed Central

    Moorthy, Arun S.; Brooks, Stephen P. J.; Kalmokoff, Martin; Eberl, Hermann J.

    2015-01-01

    A spatially continuous mathematical model of transport processes, anaerobic digestion and microbial complexity as would be expected in the human colon is presented. The model is a system of first-order partial differential equations with a context-determined number of dependent variables and stiff, non-linear source terms. Numerical simulation of the model is used to elucidate information about the colon-microbiota complex. It is found that the composition of materials at the outflow of the model does not describe well the composition of material at other model locations, and inferences using outflow data vary according to the model's reactor representation. Additionally, increased microbial complexity allows the total microbial community to withstand major system perturbations in diet and community structure. However, the distribution of strains and functional groups within the microbial community can be modified depending on perturbation length and microbial kinetic parameters. Preliminary model extensions and potential investigative opportunities using the computational model are discussed. PMID:26680208
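
To make the model class concrete, here is a minimal instance of a first-order transport PDE with a nonlinear source, solved with a first-order upwind scheme: a substrate advected along the domain at speed v and consumed by Monod-type kinetics. The equation and all parameters are illustrative stand-ins, not the paper's actual model.

```python
import numpy as np

# ds/dt + v ds/dx = -mu * s / (K + s),  s(0, t) = s_in  (all values assumed)
L, nx, v, mu, K = 1.0, 100, 0.5, 2.0, 0.2
dx = L / nx
dt = 0.4 * dx / v               # CFL number 0.4 < 1: stable for upwind
s = np.zeros(nx)
s_in = 1.0                      # inflow substrate concentration

for _ in range(2000):           # integrate to (near) steady state
    upstream = np.empty(nx)
    upstream[0] = s_in          # inflow boundary condition
    upstream[1:] = s[:-1]       # first-order upwind neighbor
    s += dt * (v * (upstream - s) / dx - mu * s / (K + s))

print(s[0].round(3), s[-1].round(3))  # substrate decays along the domain
```

The paper's model couples many such equations (one per substrate and microbial group) with stiff source terms, which is what makes the numerics demanding.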

  19. Terabit bandwidth-adaptive transmission using low-complexity format-transparent digital signal processing.

    PubMed

    Zhuge, Qunbi; Morsy-Osman, Mohamed; Chagnon, Mathieu; Xu, Xian; Qiu, Meng; Plant, David V

    2014-02-10

    In this paper, we propose a low-complexity format-transparent digital signal processing (DSP) scheme for next-generation flexible and energy-efficient transceivers. It employs QPSK symbols as the training and pilot symbols for the initialization and tracking stages, respectively, of the receiver-side DSP for various modulation formats. The performance is numerically and experimentally evaluated in a dual-polarization (DP) 11 Gbaud 64QAM system. Employing the proposed DSP scheme, we conduct a system-level study of Tb/s bandwidth-adaptive superchannel transmissions with flexible modulation formats including QPSK, 8QAM and 16QAM. The spectrum bandwidth allocation is realized in the digital domain instead of by turning sub-channels on and off, which improves the performance of higher-order QAM. Various transmission distances ranging from 240 km to 6240 km are demonstrated with colorless detection for hardware complexity reduction.
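
The role of the QPSK pilots can be sketched generically: known pilot symbols interleaved with the data let the receiver estimate a common phase rotation regardless of the payload's modulation format. The block below is a minimal single-parameter illustration (static phase offset, invented parameters), not the paper's full initialization-and-tracking DSP chain.

```python
import numpy as np

rng = np.random.default_rng(2)

n, step = 1024, 32                       # one pilot every 32 symbols (assumed)
# QPSK symbols on the unit circle; every `step`-th one is a known pilot.
sym = np.exp(1j * (np.pi / 4 + np.pi / 2 * rng.integers(0, 4, n)))
phi = 0.3                                # unknown common phase rotation, rad
rx = sym * np.exp(1j * phi)
rx += 0.05 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

pilots = slice(0, n, step)
# Estimate the rotation by correlating received pilots with the known ones.
phi_hat = np.angle(np.mean(rx[pilots] * np.conj(sym[pilots])))
rx_corrected = rx * np.exp(-1j * phi_hat)
print(round(phi_hat, 3))                 # close to the true 0.3 rad
```

In the actual scheme the pilots track a time-varying phase, so the averaging would run over a sliding window rather than the whole burst.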

  20. Complexation between sodium dodecyl sulfate and amphoteric polyurethane nanoparticles.

    PubMed

    Qiao, Yong; Zhang, Shifeng; Lin, Ouya; Deng, Liandong; Dong, Anjie

    2007-09-27

    The complexation between negatively charged sodium dodecyl sulfate (SDS) and positively charged amphoteric polyurethane (APU) self-assembled nanoparticles (NPs) containing nonionic hydrophobic segments is studied by dynamic light scattering, pyrene fluorescent probing, zeta-potential measurements, and transmission electron microscopy (TEM) in the present paper. With an increasing molar ratio of SDS to the positive charges on the surface of the APU NPs, the aqueous solution of APU NPs precipitates at pH 2 around the stoichiometric SDS concentration, and the precipitate then dissociates with excess SDS to form more stable nanoparticles of ionomer complexes. Three stages of the complexation process are clearly shown by the pyrene I1/I3 variation of the complex systems, which depends only on the SDS/APU ratio, demonstrating that the process is dominated by electrostatic attraction and hydrophobic aggregation.
