Sample records for complex systems requires

  1. Formal Verification of Complex Systems based on SysML Functional Requirements

    DTIC Science & Technology

    2014-12-23

    Formal Verification of Complex Systems based on SysML Functional Requirements. Hoda Mehrpouyan, Irem Y. Tumer, Chris Hoyle, Dimitra Giannakopoulou...requirements for design of complex engineered systems. The proposed approach combines a SysML modeling approach to document and structure safety requirements...methods and tools to support the integration of safety into the design solution. 2.1. SysML for Complex Engineered Systems. Traditional methods and tools

  2. State analysis requirements database for engineering complex embedded systems

    NASA Technical Reports Server (NTRS)

    Bennett, Matthew B.; Rasmussen, Robert D.; Ingham, Michel D.

    2004-01-01

    It has become clear that spacecraft system complexity is reaching a threshold where customary methods of control are no longer affordable or sufficiently reliable. At the heart of this problem are the conventional approaches to systems and software engineering based on subsystem-level functional decomposition, which fail to scale in the tangled web of interactions typically encountered in complex spacecraft designs. Furthermore, there is a fundamental gap between the requirements on software specified by systems engineers and the implementation of these requirements by software engineers. Software engineers must perform the translation of requirements into software code, hoping to accurately capture the systems engineer's understanding of the system behavior, which is not always explicitly specified. This gap opens up the possibility for misinterpretation of the systems engineer's intent, potentially leading to software errors. This problem is addressed by a systems engineering tool called the State Analysis Database, which provides a tool for capturing system and software requirements in the form of explicit models. This paper describes how requirements for complex aerospace systems can be developed using the State Analysis Database.

  3. Automated Derivation of Complex System Constraints from User Requirements

    NASA Technical Reports Server (NTRS)

    Muery, Kim; Foshee, Mark; Marsh, Angela

    2006-01-01

    International Space Station (ISS) payload developers submit their payload science requirements for the development of on-board execution timelines. The ISS systems required to execute the payload science operations must be represented as constraints for the execution timeline. Payload developers use a software application, User Requirements Collection (URC), to submit their requirements by selecting a simplified representation of ISS system constraints. To fully represent the complex ISS systems, the constraints require a level of detail that is beyond the insight of the payload developer. To provide the complex representation of the ISS system constraints, HOSC operations personnel, specifically the Payload Activity Requirements Coordinators (PARC), manually translate the payload developers' simplified constraints into detailed ISS system constraints used for scheduling the payload activities in the Consolidated Planning System (CPS). This paper describes the implementation of a software application, User Requirements Integration (URI), developed to automate the manual ISS constraint translation process.

  4. Engineering Complex Embedded Systems with State Analysis and the Mission Data System

    NASA Technical Reports Server (NTRS)

    Ingham, Michel D.; Rasmussen, Robert D.; Bennett, Matthew B.; Moncada, Alex C.

    2004-01-01

    It has become clear that spacecraft system complexity is reaching a threshold where customary methods of control are no longer affordable or sufficiently reliable. At the heart of this problem are the conventional approaches to systems and software engineering based on subsystem-level functional decomposition, which fail to scale in the tangled web of interactions typically encountered in complex spacecraft designs. Furthermore, there is a fundamental gap between the requirements on software specified by systems engineers and the implementation of these requirements by software engineers. Software engineers must perform the translation of requirements into software code, hoping to accurately capture the systems engineer's understanding of the system behavior, which is not always explicitly specified. This gap opens up the possibility for misinterpretation of the systems engineer's intent, potentially leading to software errors. This problem is addressed by a systems engineering methodology called State Analysis, which provides a process for capturing system and software requirements in the form of explicit models. This paper describes how requirements for complex aerospace systems can be developed using State Analysis and how these requirements inform the design of the system software, using representative spacecraft examples.

  5. Enabling Requirements-Based Programming for Highly-Dependable Complex Parallel and Distributed Systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    The manual application of formal methods in system specification has produced successes, but in the end, despite any claims and assertions by practitioners, there is no provable relationship between a manually derived system specification or formal model and the customer's original requirements. Complex parallel and distributed systems present the worst-case implications for today's dearth of viable approaches for achieving system dependability. No avenue other than formal methods constitutes a serious contender for resolving the problem, and so recognition of requirements-based programming has come at a critical juncture. We describe a new, NASA-developed automated requirements-based programming method that can be applied to certain classes of systems, including complex parallel and distributed systems, to achieve a high degree of dependability.

  6. Managing Schools as Complex Adaptive Systems: A Strategic Perspective

    ERIC Educational Resources Information Center

    Fidan, Tuncer; Balci, Ali

    2017-01-01

    This conceptual study examines the analogies between schools and complex adaptive systems and identifies strategies used to manage schools as complex adaptive systems. The complex adaptive systems approach, introduced by complexity theory, requires school administrators to develop new skills and strategies to realize their agendas in an…

  7. Controls for Burning Solid Wastes

    ERIC Educational Resources Information Center

    Toro, Richard F.; Weinstein, Norman J.

    1975-01-01

    Modern thermal solid waste processing systems are becoming more complex, incorporating features that require instrumentation and control systems to a degree greater than that previously required just for proper combustion control. With the advent of complex, sophisticated, thermal processing systems, TV monitoring and computer control should…

  8. Using SysML to model complex systems for security.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cano, Lester Arturo

    2010-08-01

    As security systems integrate more Information Technology, the design of these systems has tended to become more complex. Some of the most difficult issues in designing Complex Security Systems (CSS) are: capturing requirements; defining hardware interfaces; defining software interfaces; and integrating technologies such as radio systems, Voice over IP systems, and situational awareness systems.

  9. 77 FR 50726 - Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... Computer Software and Complex Electronics Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear...-1209, ``Software Requirement Specifications for Digital Computer Software and Complex Electronics used... Electronics Engineers (ANSI/IEEE) Standard 830-1998, ``IEEE Recommended Practice for Software Requirements...

  10. Directed evolution and synthetic biology applications to microbial systems.

    PubMed

    Bassalo, Marcelo C; Liu, Rongming; Gill, Ryan T

    2016-06-01

    Biotechnology applications require engineering complex multi-genic traits. The lack of knowledge on the genetic basis of complex phenotypes restricts our ability to rationally engineer them. However, complex phenotypes can be engineered at the systems level, utilizing directed evolution strategies that drive whole biological systems toward desired phenotypes without requiring prior knowledge of the genetic basis of the targeted trait. Recent developments in the synthetic biology field accelerate the directed evolution cycle, facilitating engineering of increasingly complex traits in biological systems. In this review, we summarize some of the most recent advances in directed evolution and synthetic biology that allow engineering of complex traits in microbial systems. Then, we discuss applications that can be achieved through engineering at the systems level. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Apollo Experiment Report: Lunar-Sample Processing in the Lunar Receiving Laboratory High-Vacuum Complex

    NASA Technical Reports Server (NTRS)

    White, D. R.

    1976-01-01

    A high-vacuum complex composed of an atmospheric decontamination system, sample-processing chambers, storage chambers, and a transfer system was built to process and examine lunar material while maintaining quarantine status. Problems identified, equipment modifications, and procedure changes made for Apollo 11 and 12 sample processing are presented. The sample processing experiences indicate that only a few operating personnel are required to process the sample efficiently, safely, and rapidly in the high-vacuum complex. The high-vacuum complex was designed to handle the many contingencies, both quarantine and scientific, associated with handling an unknown entity such as the lunar sample. Lunar sample handling necessitated a complex system that could not respond rapidly to changing scientific requirements as the characteristics of the lunar sample were better defined. Although the complex successfully handled the processing of Apollo 11 and 12 lunar samples, the scientific requirement for vacuum samples was deleted after the Apollo 12 mission just as the vacuum system was reaching its full potential.

  12. A review of human factors challenges of complex adaptive systems: discovering and understanding chaos in human performance.

    PubMed

    Karwowski, Waldemar

    2012-12-01

    In this paper, the author explores a need for a greater understanding of the true nature of human-system interactions from the perspective of the theory of complex adaptive systems, including the essence of complexity, emergent properties of system behavior, nonlinear systems dynamics, and deterministic chaos. Human performance, more often than not, constitutes complex adaptive phenomena with emergent properties that exhibit nonlinear dynamical (chaotic) behaviors. The complexity challenges in the design and management of contemporary work systems, including service systems, are explored. Examples of selected applications of the concepts of nonlinear dynamics to the study of human physical performance are provided. Understanding and applications of the concepts of the theory of complex adaptive and dynamical systems should significantly improve the effectiveness of human-centered design efforts of a large system of systems. Performance of many contemporary work systems and environments may be sensitive to the initial conditions and may exhibit dynamic nonlinear properties and chaotic system behaviors. Human-centered design of emergent human-system interactions requires application of the theories of nonlinear dynamics and complex adaptive systems. The success of future human-systems integration efforts requires the fusion of paradigms, knowledge, design principles, and methodologies of human factors and ergonomics with those of the science of complex adaptive systems as well as modern systems engineering.

  13. Undecidability and Irreducibility Conditions for Open-Ended Evolution and Emergence.

    PubMed

    Hernández-Orozco, Santiago; Hernández-Quiroz, Francisco; Zenil, Hector

    2018-01-01

    Is undecidability a requirement for open-ended evolution (OEE)? Using methods derived from algorithmic complexity theory, we propose robust computational definitions of open-ended evolution and the adaptability of computable dynamical systems. Within this framework, we show that decidability imposes absolute limits on the stable growth of complexity in computable dynamical systems. Conversely, systems that exhibit (strong) open-ended evolution must be undecidable, establishing undecidability as a requirement for such systems. Complexity is assessed in terms of three measures: sophistication, coarse sophistication, and busy beaver logical depth. These three complexity measures assign low complexity values to random (incompressible) objects. As time grows, the stated complexity measures allow for the existence of complex states during the evolution of a computable dynamical system. We show, however, that finding these states involves undecidable computations. We conjecture that for similar complexity measures that assign low complexity values, decidability imposes comparable limits on the stable growth of complexity, and that such behavior is necessary for nontrivial evolutionary systems. We show that the undecidability of adapted states imposes novel and unpredictable behavior on the individuals or populations being modeled. Such behavior is irreducible. Finally, we offer an example of a system, first proposed by Chaitin, that exhibits strong OEE.

  14. The Stryker Mobile Gun System: A Case Study on Managing Complexity

    DTIC Science & Technology

    2009-06-01

    In his article "Managing Innovation in Complex Product Systems," Howard Rush (1997) identified three "hotspot" categories: 1) requirements... Managing innovation in complex product systems. The Institution of Electrical Engineers. Retrieved February 2, 2009, from http

  15. Automated Derivation of Complex System Constraints from User Requirements

    NASA Technical Reports Server (NTRS)

    Foshee, Mark; Murey, Kim; Marsh, Angela

    2010-01-01

    The Payload Operations Integration Center (POIC) located at the Marshall Space Flight Center has the responsibility of integrating US payload science requirements for the International Space Station (ISS). All payload operations must request ISS system resources so that the resource usage will be included in the ISS on-board execution timelines. The scheduling of resources and building of the timeline is performed using the Consolidated Planning System (CPS). The ISS resources are quite complex due to the large number of components that must be accounted for. The planners at the POIC simplify the process for Payload Developers (PD) by providing the PDs with an application that has the basic functionality PDs need as well as a list of simplified resources in the User Requirements Collection (URC) application. The planners maintain a mapping of the URC resources to the CPS resources. The process of manually converting PDs' science requirements from a simplified representation to a more complex CPS representation is a time-consuming and tedious process. The goal is to provide a software solution to allow the planners to build a mapping of the complex CPS constraints to the basic URC constraints and automatically convert the PDs' requirements into system requirements during export to CPS.

  16. Selecting Measures to Evaluate Complex Sociotechnical Systems: An Empirical Comparison of a Task-based and Constraint-based Method

    DTIC Science & Technology

    2013-07-01

    experimental requirements of the research are described (see Appendix A for a full description of the development and testing)... Chapter 3: Test system used for the research... A complex socio-technical system is required to compare the methods. An emulation of a radar warning

  17. Overview of DYMCAS, the Y-12 Material Control And Accountability System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alspaugh, D. H.

    2001-07-01

    This paper gives an overview of DYMCAS, the material control and accountability information system for the Y-12 National Security Complex. A common misconception, even within the DOE community, understates the nature and complexity of material control and accountability (MC and A) systems, likening them to parcel delivery systems tracking packages at various locations or banking systems that account for money, down to the penny. A major point set forth in this paper is that MC and A systems such as DYMCAS can be and often are very complex. Given accountability reporting requirements and the critical and sensitive nature of the task, no MC and A system can be simple. The complexity of site-level accountability systems, however, varies dramatically depending on the amounts, kinds, and forms of nuclear materials and the kinds of processing performed at the site. Some accountability systems are tailored to unique and highly complex site-level materials and material processing and, consequently, are highly complex systems. Sites with less complexity require less complex accountability systems, and where processes and practices are the same or similar, sites on the mid-to-low end of the complexity scale can effectively utilize a standard accountability system. In addition to being complex, a unique feature of DYMCAS is its integration with the site production control and manufacturing system. This paper will review the advantages of such integration, as well as related challenges, and make the point that the effectiveness of complex MC and A systems can be significantly enhanced through appropriate systems integration.

  18. Orbiter data reduction complex data processing requirements for the OFT mission evaluation team (level C)

    NASA Technical Reports Server (NTRS)

    1979-01-01

    This document addresses requirements for post-test data reduction in support of the Orbital Flight Tests (OFT) mission evaluation team, specifically those which are planned to be implemented in the ODRC (Orbiter Data Reduction Complex). Only those requirements which have been previously baselined by the Data Systems and Analysis Directorate configuration control board are included. This document serves as the control document between Institutional Data Systems Division and the Integration Division for OFT mission evaluation data processing requirements, and shall be the basis for detailed design of ODRC data processing systems.

  19. Complex, Dynamic Systems: A New Transdisciplinary Theme for Applied Linguistics?

    ERIC Educational Resources Information Center

    Larsen-Freeman, Diane

    2012-01-01

    In this plenary address, I suggest that Complexity Theory has the potential to contribute a transdisciplinary theme to applied linguistics. Transdisciplinary themes supersede disciplines and spur new kinds of creative activity (Halliday 2001 [1990]). Investigating complex systems requires researchers to pay attention to system dynamics. Since…

  20. Concept of a Cloud Service for Data Preparation and Computational Control on Custom HPC Systems in Application to Molecular Dynamics

    NASA Astrophysics Data System (ADS)

    Puzyrkov, Dmitry; Polyakov, Sergey; Podryga, Viktoriia; Markizov, Sergey

    2018-02-01

    At the present stage of computer technology development, it is possible to study the properties and processes in complex systems at molecular and even atomic levels, for example, by means of molecular dynamics methods. The most interesting are problems related to the study of complex processes under real physical conditions. Solving such problems requires the use of high performance computing systems of various types, for example, GRID systems and HPC clusters. Considering the time-consuming computational tasks, the need arises for software for automatic and unified monitoring of such computations. A complex computational task can be performed over different HPC systems. It requires output data synchronization between the storage chosen by a scientist and the HPC system used for computations. The design of the computational domain is also quite a problem. It requires complex software tools and algorithms for proper atomistic data generation on HPC systems. The paper describes the prototype of a cloud service, intended for the design of large-volume atomistic systems for further detailed molecular dynamics calculations and for the management of these computations, and presents the part of its concept aimed at initial data generation on the HPC systems.

  1. Sensemaking in a Value Based Context for Large Scale Complex Engineered Systems

    NASA Astrophysics Data System (ADS)

    Sikkandar Basha, Nazareen

    The design and development of Large-Scale Complex Engineered Systems (LSCES) requires the involvement of multiple teams and numerous levels of the organization and interactions with large numbers of people and interdisciplinary departments. Traditionally, requirements-driven Systems Engineering (SE) is used in the design and development of these LSCES. The requirements are used to capture the preferences of the stakeholder for the LSCES. Due to the complexity of the system, multiple levels of interactions are required to elicit the requirements of the system within the organization. Since LSCES involve people and interactions between the teams and interdisciplinary departments, the approach should be socio-technical in nature. The elicitation of the requirements of most large-scale system projects is subject to creep in time and cost due to the uncertainty and ambiguity of requirements during the design and development. In an organization structure, the cost and time overrun can occur at any level and iterate back and forth, thus increasing the cost and time. To avoid such creep, past research has shown that rigorous approaches such as value-based design can be used to control it. But before the rigorous approaches can be used, the decision maker should have a proper understanding of requirements creep and the state of the system when the creep occurs. Sensemaking is used to understand the state of the system when the creep occurs and provide guidance to the decision maker. This research proposes the use of the Cynefin framework, a sensemaking framework, in the design and development of LSCES. It can aid in understanding the system and in decision making to minimize the value gap due to requirements creep by eliminating ambiguity which occurs during design and development. A sample hierarchical organization is used to demonstrate the state of the system at the occurrence of requirements creep in terms of cost and time using the Cynefin framework. These trials are continued for different requirements and at different sub-system levels. The results obtained show that the Cynefin framework can be used to improve the value of the system and can be used for predictive analysis. The decision makers can use these findings to apply rigorous approaches and improve the design of Large Scale Complex Engineered Systems.

  2. QMU as an approach to strengthening the predictive capabilities of complex models.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gray, Genetha Anne.; Boggs, Paul T.; Grace, Matthew D.

    2010-09-01

    Complex systems are made up of multiple interdependent parts, and the behavior of the entire system cannot always be directly inferred from the behavior of the individual parts. They are nonlinear and system responses are not necessarily additive. Examples of complex systems include energy, cyber and telecommunication infrastructures, human and animal social structures, and biological structures such as cells. To meet the goals of infrastructure development, maintenance, and protection for cyber-related complex systems, novel modeling and simulation technology is needed. Sandia has shown success using M&S in the nuclear weapons (NW) program. However, complex systems represent a significant challenge and relative departure from the classical M&S exercises, and many of the scientific and mathematical M&S processes must be re-envisioned. Specifically, in the NW program, requirements and acceptable margins for performance, resilience, and security are well-defined and given quantitatively from the start. The Quantification of Margins and Uncertainties (QMU) process helps to assess whether or not these safety, reliability and performance requirements have been met after a system has been developed. In this sense, QMU is used as a sort of check that requirements have been met once the development process is completed. In contrast, performance requirements and margins may not have been defined a priori for many complex systems (i.e., the Internet, electrical distribution grids, etc.), particularly not in quantitative terms. This project addresses this fundamental difference by investigating the use of QMU at the start of the design process for complex systems. Three major tasks were completed. First, the characteristics of the cyber infrastructure problem were collected and considered in the context of QMU-based tools. Second, UQ methodologies for the quantification of model discrepancies were considered in the context of statistical models of cyber activity. Third, Bayesian methods for optimal testing in the QMU framework were developed. The completion of this project represents an increased understanding of how to apply and use the QMU process as a means for improving model predictions of the behavior of complex systems.

  3. Formal Requirements-Based Programming for Complex Systems

    NASA Technical Reports Server (NTRS)

    Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis

    2005-01-01

    Computer science as a field has not yet produced a general method to mechanically transform complex computer system requirements into a provably equivalent implementation. Such a method would be one major step towards dealing with complexity in computing, yet it remains the elusive holy grail of system development. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that such tools and methods leave unfilled is that the formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of complex systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations. While other techniques are available, this method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. We illustrate the application of the method to an example procedure from the Hubble Robotic Servicing Mission currently under study and preliminary formulation at NASA Goddard Space Flight Center.

  4. Lasercom system architecture with reduced complexity

    NASA Technical Reports Server (NTRS)

    Lesh, James R. (Inventor); Chen, Chien-Chung (Inventor); Ansari, Homayoon (Inventor)

    1994-01-01

    Spatial acquisition and precision beam pointing functions are critical to spaceborne laser communication systems. In the present invention, a single high bandwidth CCD detector is used to perform both spatial acquisition and tracking functions. Compared to previous lasercom hardware design, the array tracking concept offers reduced system complexity by reducing the number of optical elements in the design. Specifically, the design requires only one detector and one beam steering mechanism. It also provides the means to optically close the point-ahead control loop. The technology required for high bandwidth array tracking was examined and shown to be consistent with current state of the art. The single detector design can lead to a significantly reduced system complexity and a lower system cost.

  5. LaserCom System Architecture With Reduced Complexity

    NASA Technical Reports Server (NTRS)

    Lesh, James R. (Inventor); Chen, Chien-Chung (Inventor); Ansari, Homa-Yoon (Inventor)

    1996-01-01

    Spatial acquisition and precision beam pointing functions are critical to spaceborne laser communication systems. In the present invention a single high bandwidth CCD detector is used to perform both spatial acquisition and tracking functions. Compared to previous lasercom hardware design, the array tracking concept offers reduced system complexity by reducing the number of optical elements in the design. Specifically, the design requires only one detector and one beam steering mechanism. It also provides means to optically close the point-ahead control loop. The technology required for high bandwidth array tracking was examined and shown to be consistent with current state of the art. The single detector design can lead to a significantly reduced system complexity and a lower system cost.

  6. Development of simulation computer complex specification

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The Training Simulation Computer Complex Study was one of three studies contracted in support of preparations for procurement of a shuttle mission simulator for shuttle crew training. The subject study was concerned with definition of the software loads to be imposed on the computer complex to be associated with the shuttle mission simulator and the development of procurement specifications based on the resulting computer requirements. These procurement specifications cover the computer hardware and system software as well as the data conversion equipment required to interface the computer to the simulator hardware. The development of the necessary hardware and software specifications required the execution of a number of related tasks which included: (1) simulation software sizing, (2) computer requirements definition, (3) data conversion equipment requirements definition, (4) system software requirements definition, (5) a simulation management plan, (6) a background survey, and (7) preparation of the specifications.

  7. Integrating a geographic information system, a scientific visualization system and an orographic precipitation model

    USGS Publications Warehouse

    Hay, L.; Knapp, L.

    1996-01-01

    Investigating natural, potential, and man-induced impacts on hydrological systems commonly requires complex modelling with overlapping data requirements, and massive amounts of one- to four-dimensional data at multiple scales and formats. Given the complexity of most hydrological studies, the requisite software infrastructure must incorporate many components including simulation modelling, spatial analysis and flexible, intuitive displays. There is a general requirement for a set of capabilities to support scientific analysis which, at this time, can only come from an integration of several software components. Integration of geographic information systems (GISs) and scientific visualization systems (SVSs) is a powerful technique for developing and analysing complex models. This paper describes the integration of an orographic precipitation model, a GIS and a SVS. The combination of these individual components provides a robust infrastructure which allows the scientist to work with the full dimensionality of the data and to examine the data in a more intuitive manner.

  8. Application of simplified Complexity Theory concepts for healthcare social systems to explain the implementation of evidence into practice.

    PubMed

    Chandler, Jacqueline; Rycroft-Malone, Jo; Hawkes, Claire; Noyes, Jane

    2016-02-01

    To examine the application of core concepts from Complexity Theory to explain the findings from a process evaluation undertaken in a trial evaluating implementation strategies for recommendations about reducing surgical fasting times. The proliferation of evidence-based guidance requires a greater focus on its implementation. Theory is required to explain the complex processes across the multiple healthcare organizational levels. This social healthcare context involves the interaction between professionals, patients and the organizational systems in care delivery. Complexity Theory may provide an explanatory framework to explain the complexities inherent in implementation in social healthcare contexts. A secondary thematic analysis of qualitative process evaluation data informed by Complexity Theory. Seminal texts applying Complexity Theory to the social context were annotated, key concepts extracted and core Complexity Theory concepts identified. These core concepts were applied as a theoretical lens to provide an explanation of themes from a process evaluation of a trial evaluating the implementation of strategies to reduce surgical fasting times. Sampled substantive texts provided a representative spread of theoretical development and application of Complexity Theory from the late 1990s to 2013 in social science, healthcare, management and philosophy. Five Complexity Theory core concepts extracted were 'self-organization', 'interaction', 'emergence', 'system history' and 'temporality'. Application of these concepts suggests routine surgical fasting practice is habituated in the social healthcare system and therefore it cannot easily be reversed. A reduction in fasting times requires an incentivised new approach to emerge in the surgical system's priority of completing the operating list. The application of Complexity Theory provides a useful explanation for resistance to changing fasting practice. Its utility in implementation research warrants further attention and evaluation. © 2015 John Wiley & Sons Ltd.

  9. Experimentally modeling stochastic processes with less memory by the use of a quantum processor

    PubMed Central

    Palsson, Matthew S.; Gu, Mile; Ho, Joseph; Wiseman, Howard M.; Pryde, Geoff J.

    2017-01-01

    Computer simulation of observable phenomena is an indispensable tool for engineering new technology, understanding the natural world, and studying human society. However, the most interesting systems are often so complex that simulating their future behavior demands storing immense amounts of information regarding how they have behaved in the past. For increasingly complex systems, simulation becomes increasingly difficult and is ultimately constrained by resources such as computer memory. Recent theoretical work shows that quantum theory can reduce this memory requirement beyond ultimate classical limits, as measured by a process’ statistical complexity, C. We experimentally demonstrate this quantum advantage in simulating stochastic processes. Our quantum implementation observes a memory requirement of Cq = 0.05 ± 0.01, far below the ultimate classical limit of C = 1. Scaling up this technique would substantially reduce the memory required in simulations of more complex systems. PMID:28168218

  10. Applying Early Systems Engineering: Injecting Knowledge into the Capability Development Process

    DTIC Science & Technology

    2012-10-01

    involves early use of systems engineering and technical analyses to supplement the existing operational analysis techniques currently used in...complexity, and costs of systems now being developed require tight coupling between operational requirements stated in the CDD, system requirements...Fleischer... Keywords: Capability Development, Competitive Prototyping, Knowledge Points, Early Systems Engineering

  11. Explicit solution techniques for impact with contact constraints

    NASA Technical Reports Server (NTRS)

    Mccarty, Robert E.

    1993-01-01

    Modern military aircraft transparency systems, windshields and canopies, are complex systems which must meet a large and rapidly growing number of requirements. Many of these transparency system requirements are conflicting, presenting difficult balances which must be achieved. One example of a challenging requirements balance or trade is shaping for stealth versus aircrew vision. The large number of requirements involved may be grouped in a variety of areas including man-machine interface; structural integration with the airframe; combat hazards; environmental exposures; and supportability. Some individual requirements by themselves pose very difficult, severely nonlinear analysis problems. One such complex problem is that associated with the dynamic structural response resulting from high energy bird impact. An improved analytical capability for soft-body impact simulation was developed.

  12. Explicit solution techniques for impact with contact constraints

    NASA Astrophysics Data System (ADS)

    McCarty, Robert E.

    1993-08-01

    Modern military aircraft transparency systems, windshields and canopies, are complex systems which must meet a large and rapidly growing number of requirements. Many of these transparency system requirements are conflicting, presenting difficult balances which must be achieved. One example of a challenging requirements balance or trade is shaping for stealth versus aircrew vision. The large number of requirements involved may be grouped in a variety of areas including man-machine interface; structural integration with the airframe; combat hazards; environmental exposures; and supportability. Some individual requirements by themselves pose very difficult, severely nonlinear analysis problems. One such complex problem is that associated with the dynamic structural response resulting from high energy bird impact. An improved analytical capability for soft-body impact simulation was developed.

  13. Bipartite recognition of target RNAs activates DNA cleavage by the Type III-B CRISPR–Cas system

    PubMed Central

    Elmore, Joshua R.; Sheppard, Nolan F.; Ramia, Nancy; Deighan, Trace; Li, Hong; Terns, Rebecca M.; Terns, Michael P.

    2016-01-01

    CRISPR–Cas systems eliminate nucleic acid invaders in bacteria and archaea. The effector complex of the Type III-B Cmr system cleaves invader RNAs recognized by the CRISPR RNA (crRNA ) of the complex. Here we show that invader RNAs also activate the Cmr complex to cleave DNA. As has been observed for other Type III systems, Cmr eliminates plasmid invaders in Pyrococcus furiosus by a mechanism that depends on transcription of the crRNA target sequence within the plasmid. Notably, we found that the target RNA per se induces DNA cleavage by the Cmr complex in vitro. DNA cleavage activity does not depend on cleavage of the target RNA but notably does require the presence of a short sequence adjacent to the target sequence within the activating target RNA (rPAM [RNA protospacer-adjacent motif]). The activated complex does not require a target sequence (or a PAM) in the DNA substrate. Plasmid elimination by the P. furiosus Cmr system also does not require the Csx1 (CRISPR-associated Rossman fold [CARF] superfamily) protein. Plasmid silencing depends on the HD nuclease and Palm domains of the Cmr2 (Cas10 superfamily) protein. The results establish the Cmr complex as a novel DNA nuclease activated by invader RNAs containing a crRNA target sequence and a rPAM. PMID:26848045

  14. Optimal space communications techniques. [using digital and phase locked systems for signal processing

    NASA Technical Reports Server (NTRS)

    Schilling, D. L.

    1974-01-01

    Digital multiplication of two waveforms using delta modulation (DM) is discussed. It is shown that while conventional multiplication of two N-bit words requires N² complexity, multiplication using DM requires complexity which increases linearly with N. Bounds on the signal-to-quantization noise ratio (SNR) resulting from this multiplication are determined and compared with the SNR obtained using standard multiplication techniques. The phase locked loop (PLL) system, consisting of a phase detector, voltage controlled oscillator, and a linear loop filter, is discussed in terms of its design and system advantages. Areas requiring further research are identified.

  15. Research Methodology on Language Development from a Complex Systems Perspective

    ERIC Educational Resources Information Center

    Larsen-Freeman, Diane; Cameron, Lynne

    2008-01-01

    Changes to research methodology motivated by the adoption of a complexity theory perspective on language development are considered. The dynamic, nonlinear, and open nature of complex systems, together with their tendency toward self-organization and interaction across levels and timescales, requires changes in traditional views of the functions…

  16. The Influence of Cultural Factors on Trust in Automation

    ERIC Educational Resources Information Center

    Chien, Shih-Yi James

    2016-01-01

    Human interaction with automation is a complex process that requires both skilled operators and complex system designs to effectively enhance overall performance. Although automation has successfully managed complex systems throughout the world for over half a century, inappropriate reliance on automation can still occur, such as the recent…

  17. JPRS Report, Science & Technology, USSR: Computers, Control Systems and Machines

    DTIC Science & Technology

    1989-03-14

    optimizatsii slozhnykh sistem (Coding Theory and Complex System Optimization). Alma-Ata, Nauka Press, 1977, pp. 8-16. Author's certificate number... Interpreter Specifics [O. I. Amvrosova]... Creation of Modern Computer Systems for Complex Ecological... processor can be designed to decrease degradation upon failure and assure more reliable processor operation, without requiring more complex software or

  18. Wisconsin System for Instructional Management: Teachers' Manual for the Unified System. Practical Paper No. 18.

    ERIC Educational Resources Information Center

    Bozeman, William C.; And Others

    Individualized instruction including continuous progress education and team teaching requires a complexity of organizational structure dissimilar to that of traditional schools. In such systems, teachers must maintain extensive and complex student record systems. This teachers' manual provides an example of a computerized record system developed…

  19. 76 FR 2183 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of a...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-12

    ... Electronic Complex Orders entered to the NYSE Arca System must comply with the order exposure requirements of... Complex Order, a Stock/ Option Order, or a Stock/Complex Order must be entered into the NYSE Arca System... Change Amending NYSE Arca Options Rule 6.62(h) to Define Stock/Complex Orders, Amending NYSE Arca Options...

  20. Intelligent systems engineering methodology

    NASA Technical Reports Server (NTRS)

    Fouse, Scott

    1990-01-01

    An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.

  1. Confluence and convergence: team effectiveness in complex systems.

    PubMed

    Porter-OʼGrady, Tim

    2015-01-01

    Complex adaptive systems require nursing leadership to rethink organizational work and the viability and effectiveness of teams. Much of the emergent thinking about complexity, systems, and organizations alters the understanding of the nature and function of teamwork and the configuration and leadership of team effort. Reflecting on basic concepts of complexity and their application to team formation, dynamics, and outcomes lays an important foundation for effectively guiding the strategic activity of systems through the focused tactical action of teams. Basic principles of complexity, their impact on teams, and the fundamental elements of team effectiveness are explored.

  2. A Study of Students' Reasoning about Probabilistic Causality: Implications for Understanding Complex Systems and for Instructional Design

    ERIC Educational Resources Information Center

    Grotzer, Tina A.; Solis, S. Lynneth; Tutwiler, M. Shane; Cuzzolino, Megan Powell

    2017-01-01

    Understanding complex systems requires reasoning about causal relationships that behave or appear to behave probabilistically. Features such as distributed agency, large spatial scales, and time delays obscure co-variation relationships and complex interactions can result in non-deterministic relationships between causes and effects that are best…

  3. A Framework to Determine New System Requirements Under Design Parameter and Demand Uncertainties

    DTIC Science & Technology

    2015-04-30

    relegates quantitative complexities of decision-making to the method and designates trade-space exploration to the practitioner. We demonstrate the approach...play a critical role in determining new system requirements. Scope and Method of Approach: The early stages of the design process have substantial

  4. Challenges in Developing Models Describing Complex Soil Systems

    NASA Astrophysics Data System (ADS)

    Simunek, J.; Jacques, D.

    2014-12-01

    Quantitative mechanistic models that consider basic physical, mechanical, chemical, and biological processes have the potential to be powerful tools to integrate our understanding of complex soil systems, and the soil science community has often called for models that would include a large number of these diverse processes. However, once attempts have been made to develop such models, the response from the community has not always been overwhelming, especially after it discovered that these models are consequently highly complex, requiring not only a large number of parameters, not all of which can be easily (or at all) measured and/or identified, and which are often associated with large uncertainties, but also requiring from their users deep knowledge of all/most of these implemented physical, mechanical, chemical and biological processes. Real, or perceived, complexity of these models then discourages users from using them even for relatively simple applications, for which they would be perfectly adequate. Due to the nonlinear nature and chemical/biological complexity of the soil systems, it is also virtually impossible to verify these types of models analytically, raising doubts about their applicability. Code inter-comparison, which is then likely the most suitable method to assess code capabilities and model performance, requires the existence of multiple models of similar/overlapping capabilities, which may not always exist. It is thus a challenge not only to develop models describing complex soil systems, but also to persuade the soil science community to use them. As a result, complex quantitative mechanistic models are still an underutilized tool in soil science research. We will demonstrate some of the challenges discussed above using our own efforts in developing quantitative mechanistic models (such as HP1/2) for complex soil systems.

  5. Telerobot operator control station requirements

    NASA Technical Reports Server (NTRS)

    Kan, Edwin P.

    1988-01-01

    The operator control station of a telerobot system has unique functional and human factors requirements. It has to satisfy the needs of a truly interactive and user-friendly complex system, a telerobot system being a hybrid between a teleoperated and an autonomous system. These functional, hardware and software requirements are discussed, with explicit reference to the design objectives and constraints of the JPL/NASA Telerobot Demonstrator System.

  6. Visualizing Parallel Computer System Performance

    NASA Technical Reports Server (NTRS)

    Malony, Allen D.; Reed, Daniel A.

    1988-01-01

    Parallel computer systems are among the most complex of man's creations, making satisfactory performance characterization difficult. Despite this complexity, there are strong, indeed, almost irresistible, incentives to quantify parallel system performance using a single metric. The fallacy lies in succumbing to such temptations. A complete performance characterization requires not only an analysis of the system's constituent levels, it also requires both static and dynamic characterizations. Static or average behavior analysis may mask transients that dramatically alter system performance. Although the human visual system is remarkably adept at interpreting and identifying anomalies in false color data, the importance of dynamic, visual scientific data presentation has only recently been recognized. Large, complex parallel systems pose equally vexing performance interpretation problems. Data from hardware and software performance monitors must be presented in ways that emphasize important events while eliding irrelevant details. Design approaches and tools for performance visualization are the subject of this paper.

  7. A Multifaceted Mathematical Approach for Complex Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alexander, F.; Anitescu, M.; Bell, J.

    2012-03-07

    Applied mathematics has an important role to play in developing the tools needed for the analysis, simulation, and optimization of complex problems. These efforts require the development of the mathematical foundations for scientific discovery, engineering design, and risk analysis based on a sound integrated approach for the understanding of complex systems. However, maximizing the impact of applied mathematics on these challenges requires a novel perspective on approaching the mathematical enterprise. Previous reports that have surveyed the DOE's research needs in applied mathematics have played a key role in defining research directions with the community. Although these reports have had significant impact, accurately assessing current research needs requires an evaluation of today's challenges against the backdrop of recent advances in applied mathematics and computing. To address these needs, the DOE Applied Mathematics Program sponsored a Workshop for Mathematics for the Analysis, Simulation and Optimization of Complex Systems on September 13-14, 2011. The workshop had approximately 50 participants from both the national labs and academia. The goal of the workshop was to identify new research areas in applied mathematics that will complement and enhance the existing DOE ASCR Applied Mathematics Program efforts that are needed to address problems associated with complex systems. This report describes recommendations from the workshop and subsequent analysis of the workshop findings by the organizing committee.

  8. Evaluating the Science of Discovery in Complex Health Systems

    ERIC Educational Resources Information Center

    Norman, Cameron D.; Best, Allan; Mortimer, Sharon; Huerta, Timothy; Buchan, Alison

    2011-01-01

    Complex health problems such as chronic disease or pandemics require knowledge that transcends disciplinary boundaries to generate solutions. Such transdisciplinary discovery requires researchers to work and collaborate across boundaries, combining elements of basic and applied science. At the same time, calls for more interdisciplinary health…

  9. Human-Robot Interaction in High Vulnerability Domains

    NASA Technical Reports Server (NTRS)

    Gore, Brian F.

    2016-01-01

    Future NASA missions will require successful integration of the human with highly complex systems. Highly complex systems are likely to involve humans, automation, and some level of robotic assistance. The complex environments will require successful integration of the human with automation, with robots, and with human-automation-robot teams to accomplish mission critical goals. Many challenges exist for the human performing in these types of operational environments with these kinds of systems. Systems must be designed to optimally integrate various levels of inputs and outputs based on the roles and responsibilities of the human, the automation, and the robots; from direct manual control, shared human-robotic control, or no active human control (i.e. human supervisory control). It is assumed that the human will remain involved at some level. Technologies that vary based on contextual demands and on operator characteristics (workload, situation awareness) will be needed when the human integrates into these systems. Predictive models that estimate the impact of the technologies on the system performance and on the human operator are also needed to meet the challenges associated with such future complex human-automation-robot systems in extreme environments.

  10. Building Complex Systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Mike

    2006-01-01

    The explosion of capabilities and new products within ICT (Information and Communication Technology) has fostered widespread, overly optimistic opinions regarding the industry, based on common but unjustified assumptions of quality and correctness of software. These assumptions are encouraged by software producers and vendors, who have not succeeded in finding a way to overcome the lack of an automated, mathematically sound way to develop correct systems from requirements. NASA faces this dilemma as it envisages advanced mission concepts in future exploration missions, which may well be the most ambitious computer-based systems ever developed. Such missions entail levels of complexity that beg for new methods for system development. NASA-led research in such areas as sensor networks, formal methods, autonomic computing, and requirements-based programming (to name but a few) will offer some innovative approaches to achieving correctness in complex system development.

  11. Systems and context modeling approach to requirements analysis

    NASA Astrophysics Data System (ADS)

    Ahuja, Amrit; Muralikrishna, G.; Patwari, Puneet; Subhrojyoti, C.; Swaminathan, N.; Vin, Harrick

    2014-08-01

    Ensuring completeness and correctness of the requirements for a complex system such as the SKA is challenging. Current system engineering practice includes developing a stakeholder needs definition, a concept of operations, and defining system requirements in terms of use cases and requirements statements. We present a method that enhances this current practice into a collection of system models with mutual consistency relationships. These include stakeholder goals, needs definition and system-of-interest models, together with a context model that participates in the consistency relationships among these models. We illustrate this approach by using it to analyze the SKA system requirements.

  12. Liquid Rocket Booster (LRB) for the Space Transportation System (STS) systems study. Appendix G: LRB for the STS system study level 2 requirements, revision 1

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Requirements are presented for shuttle system definition; performance and design characteristics; shuttle vehicle end item performance and design characteristics; ground operations complex performance and design characteristics; operability and system design and construction standards; and quality control.

  13. Establishment of an in vitro transcription system for Peste des petits ruminant virus.

    PubMed

    Yunus, Mohammad; Shaila, Melkote S

    2012-12-05

    Peste-des-petits ruminants virus (PPRV) is a non-segmented, negative-strand RNA virus of the genus Morbillivirus within the family Paramyxoviridae. Negative-strand RNA viruses are known to carry the nucleocapsid (N) protein, phosphoprotein (P), and RNA polymerase (L protein) packaged within the virion, which together possess all activities required for transcription, post-transcriptional modification of mRNA, and replication. In order to understand the mechanism of transcription and replication of the virus, an in vitro transcription reconstitution system is required. In the present work, an in vitro transcription system has been developed with ribonucleoprotein (RNP) complex purified from virus-infected cells as well as partially purified recombinant polymerase (L-P) complex from insect cells, along with N-RNA (genomic RNA encapsidated by N protein) template isolated from virus-infected cells. RNP complex isolated from virus-infected cells and recombinant L-P complex purified from insect cells were used to reconstitute transcription on the N-RNA template. The requirements for this transcription reconstitution have been defined. Transcription of viral genes in the in vitro system was confirmed by PCR amplification of cDNAs corresponding to individual transcripts using gene-specific primers. To measure the relative expression levels of viral transcripts, real-time PCR analysis was carried out. qPCR analysis of the transcription products made in vitro showed a gradient of polarity of transcription from the 3' end to the 5' end of the genome, similar to that exhibited by the virus in infected cells. This report describes for the first time the development of an in vitro transcription reconstitution system for PPRV with RNP complex purified from infected cells and recombinant L-P complex expressed in insect cells. Both complexes were able to synthesize all the mRNA species in vitro, exhibiting a gradient of polarity in transcription.

  14. Risk Modeling of Interdependent Complex Systems of Systems: Theory and Practice.

    PubMed

    Haimes, Yacov Y

    2018-01-01

    The emergence of the complexity characterizing our systems of systems (SoS) requires a reevaluation of the way we model, assess, manage, communicate, and analyze the risk thereto. Current models for risk analysis of emergent complex SoS are insufficient because too often they rely on the same risk functions and models used for single systems. These models commonly fail to incorporate the complexity derived from the networks of interdependencies and interconnectedness (I-I) characterizing SoS. There is a need to reevaluate currently practiced risk analysis to respond to this reality by examining, and thus comprehending, what makes emergent SoS complex. The key to evaluating the risk to SoS lies in understanding the genesis of the I-I that characterize systems, manifested through shared states and other essential entities within and among the systems that constitute SoS. The term "essential entities" includes shared decisions, resources, functions, policies, decision makers, stakeholders, organizational setups, and others. This undertaking can be accomplished by building on state-space theory, which is fundamental to systems engineering and process control. This article presents a theoretical and analytical framework for modeling the risk to SoS, with two case studies performed with the MITRE Corporation, and demonstrates the pivotal contributions made by shared states and other essential entities to modeling and analysis of the risk to complex SoS. A third case study highlights the multifarious representations of SoS, which require harmonizing the risk analysis process currently applied to single systems when applied to complex SoS. © 2017 Society for Risk Analysis.
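
    As a rough illustration of the state-space view this abstract builds on, the sketch below (not from the paper; the system matrices and values are invented) simulates two discrete-time subsystems coupled through a single shared state, showing how a disturbance initiated in one subsystem propagates to the other.

```python
import numpy as np

# Hypothetical three-state example: x = [x1, shared, x2].
# The middle state is shared between subsystem A (x1, shared)
# and subsystem B (shared, x2), coupling their dynamics.
A = np.array([
    [0.90, 0.10, 0.00],   # subsystem A dynamics
    [0.05, 0.85, 0.05],   # shared state, driven by both subsystems
    [0.00, 0.10, 0.90],   # subsystem B dynamics
])

x = np.array([1.0, 0.0, 0.0])   # disturbance initiated in subsystem A only
for _ in range(20):
    x = A @ x

# Although the disturbance started in A, subsystem B's state is now nonzero.
print("x after 20 steps:", np.round(x, 4))
```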

  15. Organizational Agility and Complex Enterprise System Innovations: A Mixed Methods Study of the Effects of Enterprise Systems on Organizational Agility

    ERIC Educational Resources Information Center

    Kharabe, Amol T.

    2012-01-01

    Over the last two decades, firms have operated in "increasingly" accelerated "high-velocity" dynamic markets, which require them to become "agile." During the same time frame, firms have increasingly deployed complex enterprise systems--large-scale packaged software "innovations" that integrate and automate…

  16. A Knowledge-Based and Model-Driven Requirements Engineering Approach to Conceptual Satellite Design

    NASA Astrophysics Data System (ADS)

    Dos Santos, Walter A.; Leonor, Bruno B. F.; Stephany, Stephan

    Satellite systems are becoming even more complex, making technical issues a significant cost driver. The increasing complexity of these systems makes requirements engineering activities both more important and difficult. Additionally, today's competitive pressures and other market forces drive manufacturing companies to improve the efficiency with which they design and manufacture space products and systems. This imposes a heavy burden on systems-of-systems engineering skills and particularly on requirements engineering which is an important phase in a system's life cycle. When this is poorly performed, various problems may occur, such as failures, cost overruns and delays. One solution is to underpin the preliminary conceptual satellite design with computer-based information reuse and integration to deal with the interdisciplinary nature of this problem domain. This can be attained by taking a model-driven engineering approach (MDE), in which models are the main artifacts during system development. MDE is an emergent approach that tries to address system complexity by the intense use of models. This work outlines the use of SysML (Systems Modeling Language) and a novel knowledge-based software tool, named SatBudgets, to deal with these and other challenges confronted during the conceptual phase of a university satellite system, called ITASAT, currently being developed by INPE and some Brazilian universities.

  17. Expert systems for superalloy studies

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.; Kaukler, William F.

    1990-01-01

    There are many areas in science and engineering which require knowledge of an extremely complex foundation of experimental results in order to design methodologies for developing new materials or products. Superalloys fit well into this discussion in the sense that they are complex combinations of elements which exhibit certain characteristics. Obviously the use of superalloys in high-performance, high-temperature systems such as the Space Shuttle Main Engine is of interest to NASA. The superalloy manufacturing process is complex, and the implementation of an expert system within the design process requires some thought as to how and where it should be implemented. A major motivation is to develop a methodology to assist metallurgists in the design of superalloy materials using current expert systems technology. Hydrogen embrittlement is disastrous to rocket engines, and the heuristics can be very complex. Attacking this problem as one module in the overall design process represents a significant step forward. In order to describe the objectives of the first phase implementation, the expert system was designated the Hydrogen Environment Embrittlement Expert System (HEEES).
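
    A minimal sketch of the kind of rule-based inference such an expert system encodes; the rules, attribute names, and thresholds here are invented placeholders for illustration, not the actual HEEES knowledge base.

```python
# Toy rule-based assessment; rules and thresholds are illustrative only.
def assess_embrittlement_risk(alloy):
    findings = []
    if alloy.get("hydrogen_ppm", 0) > 5:        # hypothetical threshold
        findings.append("high dissolved hydrogen")
    if alloy.get("nickel_fraction", 0) > 0.5:   # hypothetical Ni-content rule
        findings.append("Ni-rich matrix")
    # A compound rule fires only when both simpler findings are present.
    if "high dissolved hydrogen" in findings and "Ni-rich matrix" in findings:
        return "elevated hydrogen-embrittlement risk", findings
    return "no rule fired", findings

print(assess_embrittlement_risk({"hydrogen_ppm": 8, "nickel_fraction": 0.6}))
```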

  18. The effect of multiple external representations (MERs) worksheets toward complex system reasoning achievement

    NASA Astrophysics Data System (ADS)

    Sumarno; Ibrahim, M.; Supardi, Z. A. I.

    2018-03-01

    The application of a systems approach to assessing biological systems offers hope for a coherent understanding of patterns of cell dynamics and their relationship to plant life. Such assessment requires reasoning about complex systems. At the same time, many researchers have provided evidence that instruction involving multiple external representations (MERs) improves biology learning. The researchers conducted an investigation using a one-shot case study design involving 30 students to test whether MERs worksheets affect students' achievement in reasoning about complex systems. The data were collected from a test of reasoning about complex systems and from the identification work the students produced while working through the MERs. The results showed that only some students achieved reasoning about complex systems, but their MERs skills supported their ability to reason about complex systems. This study offers a new prospect for developing MERs worksheets as a tool to facilitate reasoning about complex systems.

  19. Robust Fixed-Structure Controller Synthesis

    NASA Technical Reports Server (NTRS)

    Corrado, Joseph R.; Haddad, Wassim M.; Gupta, Kajal (Technical Monitor)

    2000-01-01

    The ability to develop an integrated control system design methodology for robust high-performance controllers satisfying multiple design criteria and real-world hardware constraints constitutes a challenging task. The increasingly stringent performance specifications required for controlling such systems necessitate a trade-off between controller complexity and robustness. The principal challenge of minimal-complexity robust control design is to arrive at a tractable control design formulation in spite of the extreme complexity of such systems. Hence, the design of minimal-complexity robust controllers for systems in the face of modeling errors has been a major preoccupation of system and control theorists and practitioners for the past several decades.

  20. Development of sensor augmented robotic weld systems for aerospace propulsion system fabrication

    NASA Technical Reports Server (NTRS)

    Jones, C. S.; Gangl, K. J.

    1986-01-01

    In order to meet stringent performance goals for power and reusability, the Space Shuttle Main Engine was designed with many complex, difficult welded joints that provide maximum strength and minimum weight. To this end, the SSME requires 370 meters of welded joints. Automation of some welds has improved welding productivity significantly over manual welding. Application has previously been limited by accessibility constraints, requirements for complex process control, low production volumes, high part variability, and stringent quality requirements. Development of robots for welding in this application requires that a unique set of constraints be addressed. This paper shows how robotic welding can enhance production of aerospace components by addressing their specific requirements. A development program at the Marshall Space Flight Center combining industrial robots with state-of-the-art sensor systems and computer simulation is providing technology for the automation of welds in Space Shuttle Main Engine production.

  1. Systems, Stakeholders, and Students: Including Students in School Reform

    ERIC Educational Resources Information Center

    Zion, Shelley D.

    2009-01-01

    The education system in the United States is under pressure from a variety of sources to reform and improve the delivery of educational services to students. Change across a system as complex and dynamic as the educational system requires a systemic approach and requires the participation or buy-in of all participants and stakeholders. This…

  2. High Speed PC Based Data Acquisition and Instrumentation for Measurement of Simulated Low Earth Orbit Thermally Induced Disturbances

    NASA Technical Reports Server (NTRS)

    Sills, Joel W., Jr.; Griffin, Thomas J. (Technical Monitor)

    2001-01-01

    The Hubble Space Telescope (HST) Disturbance Verification Test (DVT) was conducted to characterize responses of the Observatory's new set of rigid solar arrays (SA3) to thermally induced 'creak' or stiction releases. The data acquired in the DVT were used in verification of the HST Pointing Control System on-orbit performance, post-Servicing Mission 3B (SM3B). The test simulated the on-orbit environment on a deployed SA3 flight wing. Instrumentation for this test required pretest simulations in order to select the correct sensitivities. Vacuum-compatible, highly accurate accelerometers and force gages were used for this test. The complexity of the test, as well as a short planning schedule, required a data acquisition system that was easy to configure, highly flexible, and extremely robust. A PC Windows oriented data acquisition system meets these requirements, allowing the test engineers to minimize the time required to plan and perform complex environmental tests. The SA3 DVT provided a direct, practical, and complex demonstration of the versatility that PC based data acquisition systems provide. Two PC based data acquisition systems were assembled to acquire, process, distribute, and provide real time processing for several types of transducers used in the SA3 DVT. A high sample rate digital tape recorder was used to archive the sensor signals. The two systems provided multi-channel hardware and software architecture and were selected based on the test requirements. How these systems acquire and process multiple data rates from different transducer types is discussed, along with the system hardware and software architecture.

  3. Technical support for digital systems technology development. Task order 1: ISP contention analysis and control

    NASA Technical Reports Server (NTRS)

    Stehle, Roy H.; Ogier, Richard G.

    1993-01-01

    Alternatives for realizing a packet-based network switch for use on a frequency division multiple access/time division multiplexed (FDMA/TDM) geostationary communication satellite were investigated. Each of the eight downlink beams supports eight directed dwells. The design needed to accommodate multicast packets with very low probability of loss due to contention. Three switch architectures were designed and analyzed. An output-queued, shared bus system yielded a functionally simple system, utilizing a first-in, first-out (FIFO) memory per downlink dwell, but at the expense of a large total memory requirement. A shared memory architecture offered the most efficiency in memory requirements, requiring about half the memory of the shared bus design. The processing requirement for the shared-memory system adds system complexity that may offset the benefits of the smaller memory. An alternative design using a shared memory buffer per downlink beam decreases circuit complexity through a distributed design, and requires at most 1000 packets of memory more than the completely shared memory design. Modifications to the basic packet switch designs were proposed to accommodate circuit-switched traffic, which must be served on a periodic basis with minimal delay. Methods for dynamically controlling the downlink dwell lengths were developed and analyzed. These methods adapt quickly to changing traffic demands, and do not add significant complexity or cost to the satellite and ground station designs. Methods for reducing the memory requirement by not requiring the satellite to store full packets were also proposed and analyzed. In addition, optimal packet and dwell lengths were computed as functions of memory size for the three switch architectures.
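
    A rough sketch of the output-queued, FIFO-per-dwell architecture described above; the queue organization and multicast handling follow the abstract, but the packet format, dwell scheduling, and sizes are placeholders rather than the studied design.

```python
from collections import deque

N_BEAMS, N_DWELLS = 8, 8  # eight downlink beams, each supporting eight directed dwells

# One FIFO per (beam, dwell) output: the shared-bus, output-queued option.
fifos = {(b, d): deque() for b in range(N_BEAMS) for d in range(N_DWELLS)}

def enqueue(packet, destinations):
    """Multicast: copy the packet onto every destination dwell's FIFO."""
    for beam, dwell in destinations:
        fifos[(beam, dwell)].append(packet)

def serve_dwell(beam, dwell, slots):
    """Drain up to `slots` packets from one dwell during its downlink window."""
    q = fifos[(beam, dwell)]
    return [q.popleft() for _ in range(min(slots, len(q)))]

# Tiny demo: one multicast packet addressed to two dwells, then serve one of them.
enqueue("pkt-0", [(0, 3), (5, 1)])
print(serve_dwell(0, 3, slots=4))   # ['pkt-0']
print(len(fifos[(5, 1)]))           # the copy for the other beam is still queued
```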

  4. Situational Analysis for Complex Systems: Methodological Development in Public Health Research.

    PubMed

    Martin, Wanda; Pauly, Bernie; MacDonald, Marjorie

    2016-01-01

    Public health systems have suffered infrastructure losses worldwide. Strengthening public health systems requires not only good policies and programs, but also development of new research methodologies to support public health systems renewal. Our research team considers public health systems to be complex adaptive systems and as such new methods are necessary to generate knowledge about the process of implementing public health programs and services. Within our program of research, we have employed situational analysis as a method for studying complex adaptive systems in four distinct research studies on public health program implementation. The purpose of this paper is to demonstrate the use of situational analysis as a method for studying complex systems and highlight the need for further methodological development.

  5. Social complexity as a proximate and ultimate factor in communicative complexity

    PubMed Central

    Freeberg, Todd M.; Dunbar, Robin I. M.; Ord, Terry J.

    2012-01-01

    The ‘social complexity hypothesis’ for communication posits that groups with complex social systems require more complex communicative systems to regulate interactions and relations among group members. Complex social systems, compared with simple social systems, are those in which individuals frequently interact in many different contexts with many different individuals, and often repeatedly interact with many of the same individuals in networks over time. Complex communicative systems, compared with simple communicative systems, are those that contain a large number of structurally and functionally distinct elements or possess a high amount of bits of information. Here, we describe some of the historical arguments that led to the social complexity hypothesis, and review evidence in support of the hypothesis. We discuss social complexity as a driver of communication and possible causal factor in human language origins. Finally, we discuss some of the key current limitations to the social complexity hypothesis—the lack of tests against alternative hypotheses for communicative complexity and evidence corroborating the hypothesis from modalities other than the vocal signalling channel. PMID:22641818

  6. Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems.

    PubMed

    Williams, Richard A; Timmis, Jon; Qwarnstrom, Eva E

    2016-01-01

    Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model, is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model), that acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model.

  7. Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems

    PubMed Central

    Timmis, Jon; Qwarnstrom, Eva E.

    2016-01-01

    Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model, is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model), that acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model. PMID:27571414

  8. Practical aspects of modeling aircraft dynamics from flight data

    NASA Technical Reports Server (NTRS)

    Iliff, K. W.; Maine, R. E.

    1984-01-01

    The purpose of parameter estimation, a subset of system identification, is to estimate the coefficients (such as stability and control derivatives) of the aircraft differential equations of motion from sampled measured dynamic responses. In the past, the primary reason for estimating stability and control derivatives from flight tests was to make comparisons with wind tunnel estimates. As aircraft became more complex, and as flight envelopes were expanded to include flight regimes that were not well understood, new requirements for the derivative estimates evolved. For many years, the flight determined derivatives were used in simulations to aid in flight planning and in pilot training. The simulations were particularly important in research flight test programs in which an envelope expansion into new flight regimes was required. Parameter estimation techniques for estimating stability and control derivatives from flight data became more sophisticated to support the flight test programs. As knowledge of these new flight regimes increased, more complex aircraft were flown. Much of this increased complexity was in sophisticated flight control systems. The design and refinement of the control system required higher fidelity simulations than were previously required.
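
    For concreteness, a minimal equation-error (least-squares) sketch of estimating stability and control derivatives from sampled responses; the single-axis model structure, the "true" derivative values, and the synthetic data are invented for the demo and are far simpler than the maximum-likelihood machinery used in practice.

```python
import numpy as np

# Synthetic pitch-axis data: angle of attack, pitch rate, elevator deflection.
rng = np.random.default_rng(0)
n = 500
alpha = rng.normal(0.0, 0.05, n)   # [rad]
q     = rng.normal(0.0, 0.10, n)   # [rad/s]
de    = rng.normal(0.0, 0.08, n)   # [rad]

# Hypothetical true derivatives and a "measured" response with noise.
M_alpha, M_q, M_de = -4.0, -1.2, -6.5
qdot = M_alpha * alpha + M_q * q + M_de * de + rng.normal(0.0, 0.02, n)

# Least-squares fit of the derivatives from the sampled dynamic response.
X = np.column_stack([alpha, q, de])
est, *_ = np.linalg.lstsq(X, qdot, rcond=None)
print("estimated [M_alpha, M_q, M_de]:", np.round(est, 3))
```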

  9. Evaluation in the Design of Complex Systems

    ERIC Educational Resources Information Center

    Ho, Li-An; Schwen, Thomas M.

    2006-01-01

    We identify literature that argues the process of creating knowledge-based systems is often imbalanced. In most knowledge-based systems, development is often technology-driven instead of requirement-driven. Therefore, we argue designers must recognize that evaluation is a critical link in the application of requirement-driven development models…

  10. Navy Additive Manufacturing: Adding Parts, Subtracting Steps

    DTIC Science & Technology

    2015-06-01

    complex weapon systems within designed specifications requires extensive routine and preventative maintenance as well as expeditious repairs when...failures occur. These repairs are sometimes complex and often unpredictable in both peace and wartime environments. To keep these weapon systems...basis. The solution is not a simple one, but rather one of high complexity that cannot just be adopted from a big-box store such as Walmart, Target

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arion is a library and tool set that enables researchers to holistically define test system models. Defining a complex system for testing an algorithm or control requires expertise across multiple domains. Simulating a complex system requires the integration of multiple simulators and test hardware, each with its own specification languages and concepts, which demands an extensive set of knowledge and capabilities. Arion was developed to alleviate this challenge. Arion is a set of Java libraries that abstracts the concepts of the supported simulators into a cohesive model language, allowing users to build models to their needed level of fidelity and expertise. Arion is also a software tool that translates the user's model back into the specification languages of the simulators and test hardware needed for execution.

  12. Unstructured Cartesian/prismatic grid generation for complex geometries

    NASA Technical Reports Server (NTRS)

    Karman, Steve L., Jr.

    1995-01-01

    The generation of a hybrid grid system for discretizing complex three dimensional (3D) geometries is described. The primary grid system is an unstructured Cartesian grid automatically generated using recursive cell subdivision. This grid system is sufficient for computing Euler solutions about extremely complex 3D geometries. A secondary grid system, using triangular-prismatic elements, may be added for resolving the boundary layer region of viscous flows near surfaces of solid bodies. This paper describes the grid generation processes used to generate each grid type. Several example grids are shown, demonstrating the ability of the method to discretize complex geometries, with very little pre-processing required by the user.
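
    A stripped-down 2-D analogue of the recursive cell subdivision described above, included only to illustrate the idea; the refinement test (refine any cell whose box touches an invented "body", here the unit disk) and the depth limit stand in for the real geometric criteria, and the 3-D octree version is analogous.

```python
# 2-D recursive Cartesian subdivision (quadtree sketch).
def subdivide(cell, intersects_body, depth, max_depth):
    """cell = (xmin, ymin, xmax, ymax); returns the leaf cells of the refined grid."""
    if depth == max_depth or not intersects_body(cell):
        return [cell]
    xmin, ymin, xmax, ymax = cell
    xm, ym = 0.5 * (xmin + xmax), 0.5 * (ymin + ymax)
    children = [(xmin, ymin, xm, ym), (xm, ymin, xmax, ym),
                (xmin, ym, xm, ymax), (xm, ym, xmax, ymax)]
    leaves = []
    for child in children:
        leaves.extend(subdivide(child, intersects_body, depth + 1, max_depth))
    return leaves

def intersects_disk(cell):
    """Hypothetical body: the unit disk centered at the origin."""
    xmin, ymin, xmax, ymax = cell
    # Clamp the origin into the box to find the box point nearest the origin.
    nx = max(xmin, min(0.0, xmax))
    ny = max(ymin, min(0.0, ymax))
    return nx * nx + ny * ny <= 1.0

leaves = subdivide((-2.0, -2.0, 2.0, 2.0), intersects_disk, depth=0, max_depth=4)
print(len(leaves), "leaf cells after refinement near the body")
```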

  13. Real-time automated failure identification in the Control Center Complex (CCC)

    NASA Technical Reports Server (NTRS)

    Kirby, Sarah; Lauritsen, Janet; Pack, Ginger; Ha, Anhhoang; Jowers, Steven; Mcnenny, Robert; Truong, The; Dell, James

    1993-01-01

    A system which will provide real-time failure management support to the Space Station Freedom program is described. The system's use of a simplified form of model-based reasoning qualifies it as an advanced automation system. However, it differs from most such systems in that it was designed from the outset to meet two sets of requirements. First, it must provide a useful increment to the fault management capabilities of the Johnson Space Center (JSC) Control Center Complex (CCC) Fault Detection Management system. Second, it must satisfy CCC operational environment constraints such as cost, computer resource requirements, verification and validation, etc. The need to meet both requirement sets presents a much greater design challenge than would have been the case had functionality been the sole design consideration. The choice of technology, aspects of that choice, and the process for migrating the system into the control center are reviewed.

  14. Nuclear localization of Schizosaccharomyces pombe Mcm2/Cdc19p requires MCM complex assembly.

    PubMed

    Pasion, S G; Forsburg, S L

    1999-12-01

    The minichromosome maintenance (MCM) proteins MCM2-MCM7 are conserved eukaryotic replication factors that assemble in a heterohexameric complex. In fission yeast, these proteins are nuclear throughout the cell cycle. In studying the mechanism that regulates assembly of the MCM complex, we analyzed the cis and trans elements required for nuclear localization of a single subunit, Mcm2p. Mutation of any single mcm gene leads to redistribution of wild-type MCM subunits to the cytoplasm, and this redistribution depends on an active nuclear export system. We identified the nuclear localization signal sequences of Mcm2p and showed that these are required for nuclear targeting of other MCM subunits. In turn, Mcm2p must associate with other MCM proteins for its proper localization; nuclear localization of MCM proteins thus requires assembly of MCM proteins in a complex. We suggest that coupling complex assembly to nuclear targeting and retention ensures that only intact heterohexameric MCM complexes remain nuclear.

  15. Nuclear Localization of Schizosaccharomyces pombe Mcm2/Cdc19p Requires MCM Complex Assembly

    PubMed Central

    Pasion, Sally G.; Forsburg, Susan L.

    1999-01-01

    The minichromosome maintenance (MCM) proteins MCM2–MCM7 are conserved eukaryotic replication factors that assemble in a heterohexameric complex. In fission yeast, these proteins are nuclear throughout the cell cycle. In studying the mechanism that regulates assembly of the MCM complex, we analyzed the cis and trans elements required for nuclear localization of a single subunit, Mcm2p. Mutation of any single mcm gene leads to redistribution of wild-type MCM subunits to the cytoplasm, and this redistribution depends on an active nuclear export system. We identified the nuclear localization signal sequences of Mcm2p and showed that these are required for nuclear targeting of other MCM subunits. In turn, Mcm2p must associate with other MCM proteins for its proper localization; nuclear localization of MCM proteins thus requires assembly of MCM proteins in a complex. We suggest that coupling complex assembly to nuclear targeting and retention ensures that only intact heterohexameric MCM complexes remain nuclear. PMID:10588642

  16. Empirical Requirements Analysis for Mars Surface Operations Using the Flashline Mars Arctic Research Station

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Lee, Pascal; Sierhuis, Maarten; Norvig, Peter (Technical Monitor)

    2001-01-01

    Living and working on Mars will require model-based computer systems for maintaining and controlling complex life support, communication, transportation, and power systems. This technology must work properly on the first three-year mission, augmenting human autonomy, without adding yet more complexity to be diagnosed and repaired. One design method is to work with scientists in analog (Mars-like) settings to understand how they prefer to work, what constraints will be imposed by the Mars environment, and how to ameliorate difficulties. We describe how we are using empirical requirements analysis to prototype model-based tools at a research station in the High Canadian Arctic.

  17. Biologically-Inspired Concepts for Self-Management of Complexity

    NASA Technical Reports Server (NTRS)

    Sterritt, Roy; Hinchey, G.

    2006-01-01

    Inherent complexity in large-scale applications may be impossible to eliminate or even ameliorate despite a number of promising advances. In such cases, the complexity must be tolerated and managed. Such management may be beyond the abilities of humans, or require such overhead as to make management by humans unrealistic. A number of initiatives inspired by concepts in biology have arisen for self-management of complex systems. We present some ideas and techniques we have been experimenting with, inspired by lesser-known concepts in biology that show promise in protecting complex systems and represent a step towards self-management of complexity.

  18. Complex Adaptive Systems of Systems (CASoS) engineering and foundations for global design.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brodsky, Nancy S.; Finley, Patrick D.; Beyeler, Walter Eugene

    2012-01-01

    Complex Adaptive Systems of Systems, or CASoS, are vastly complex ecological, sociological, economic and/or technical systems which must be recognized and reckoned with to design a secure future for the nation and the world. Design within CASoS requires the fostering of a new discipline, CASoS Engineering, and the building of capability to support it. Towards this primary objective, we created the Phoenix Pilot as a crucible from which systemization of the new discipline could emerge. Using a wide range of applications, Phoenix has begun building both theoretical foundations and capability for: the integration of Applications to continuously build common understanding and capability; a Framework for defining problems, designing and testing solutions, and actualizing these solutions within the CASoS of interest; and an engineering Environment required for 'the doing' of CASoS Engineering. In a secondary objective, we applied CASoS Engineering principles to begin to build a foundation for design in the context of Global CASoS.

  19. Hybrid and concatenated coding applications.

    NASA Technical Reports Server (NTRS)

    Hofman, L. B.; Odenwalder, J. P.

    1972-01-01

    Results of a study to evaluate the performance and implementation complexity of a concatenated and a hybrid coding system for moderate-speed deep-space applications. It is shown that with a total complexity of less than three times that of the basic Viterbi decoder, concatenated coding improves a constraint length 8, rate 1/3 Viterbi decoding system by 1.1 and 2.6 dB at bit error probabilities of 10^-4 and 10^-8, respectively. With a somewhat greater total complexity, the hybrid coding system is shown to obtain a 0.9-dB computational performance improvement over the basic rate 1/3 sequential decoding system. Although substantial, these complexities are much less than those required to achieve the same performances with more complex Viterbi or sequential decoder systems.

  20. A parsimonious land data assimilation system for the SMAP/GPM satellite era

    USDA-ARS?s Scientific Manuscript database

    Land data assimilation systems typically require complex parameterizations in order to: define required observation operators, quantify observing/forecasting errors and calibrate a land surface assimilation model. These parameters are commonly defined in an arbitrary manner and, if poorly specified,...

  1. Results of an Experimental Exploration of Advanced Automated Geospatial Tools: Agility in Complex Planning

    DTIC Science & Technology

    2009-06-01

    AUTOMATED GEOSPATIAL TOOLS : AGILITY IN COMPLEX PLANNING Primary Topic: Track 5 – Experimentation and Analysis Walter A. Powell [STUDENT] - GMU...Abstract Typically, the development of tools and systems for the military is requirement driven; systems are developed to meet

  2. Optically controlled phased-array antenna technology for space communication systems

    NASA Technical Reports Server (NTRS)

    Kunath, Richard R.; Bhasin, Kul B.

    1988-01-01

    Using MMICs in phased-array applications above 20 GHz requires complex RF and control signal distribution systems. Conventional waveguide, coaxial cable, and microstrip methods are undesirable due to their high weight, high loss, limited mechanical flexibility and large volume. An attractive alternative to these transmission media, for RF and control signal distribution in MMIC phased-array antennas, is optical fiber. Presented are potential system architectures and their associated characteristics. The status of high frequency opto-electronic components needed to realize the potential system architectures is also discussed. It is concluded that an optical fiber network will reduce weight and complexity, and increase reliability and performance, but may require higher power.

  3. Computational complexities and storage requirements of some Riccati equation solvers

    NASA Technical Reports Server (NTRS)

    Utku, Senol; Garba, John A.; Ramesh, A. V.

    1989-01-01

    The linear optimal control problem of an nth-order time-invariant dynamic system with a quadratic performance functional is usually solved by the Hamilton-Jacobi approach. This leads to the solution of the differential matrix Riccati equation with a terminal condition. The bulk of the computation for the optimal control problem is related to the solution of this equation. There are various algorithms in the literature for solving the matrix Riccati equation. However, computational complexities and storage requirements as a function of numbers of state variables, control variables, and sensors are not available for all these algorithms. In this work, the computational complexities and storage requirements for some of these algorithms are given. These expressions show the immensity of the computational requirements of the algorithms in solving the Riccati equation for large-order systems such as the control of highly flexible space structures. The expressions are also needed to compute the speedup and efficiency of any implementation of these algorithms on concurrent machines.
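
    For concreteness, a small sketch of the steady-state (algebraic) special case of the Riccati solve that underlies this linear-quadratic optimal control problem, using SciPy; the 2-state system matrices are arbitrary examples, not taken from the cited work. The roughly cubic-in-n cost per solve of such routines is the kind of growth the cited complexity expressions quantify for large-order systems.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Arbitrary 2-state, 1-input example.
A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)          # state weighting in the quadratic performance functional
R = np.array([[1.0]])  # control weighting

P = solve_continuous_are(A, B, Q, R)   # steady-state Riccati solution
K = np.linalg.solve(R, B.T @ P)        # optimal feedback gain, u = -K x
print("P =\n", np.round(P, 3))
print("K =", np.round(K, 3))
```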

  4. An integrated cell-free metabolic platform for protein production and synthetic biology

    PubMed Central

    Jewett, Michael C; Calhoun, Kara A; Voloshin, Alexei; Wuu, Jessica J; Swartz, James R

    2008-01-01

    Cell-free systems offer a unique platform for expanding the capabilities of natural biological systems for useful purposes, i.e. synthetic biology. They reduce complexity, remove structural barriers, and do not require the maintenance of cell viability. Cell-free systems, however, have been limited by their inability to co-activate multiple biochemical networks in a single integrated platform. Here, we report the assessment of biochemical reactions in an Escherichia coli cell-free platform designed to activate natural metabolism, the Cytomim system. We reveal that central catabolism, oxidative phosphorylation, and protein synthesis can be co-activated in a single reaction system. Never before have these complex systems been shown to be simultaneously activated without living cells. The Cytomim system therefore promises to provide the metabolic foundation for diverse ab initio cell-free synthetic biology projects. In addition, we describe an improved Cytomim system with enhanced protein synthesis yields (up to 1200 mg/l in 2 h) and lower costs to facilitate production of protein therapeutics and biochemicals that are difficult to make in vivo because of their toxicity, complexity, or unusual cofactor requirements. PMID:18854819

  5. MASPROP- MASS PROPERTIES OF A RIGID STRUCTURE

    NASA Technical Reports Server (NTRS)

    Hull, R. A.

    1994-01-01

    The computer program MASPROP was developed to rapidly calculate the mass properties of complex rigid structural systems. This program's basic premise is that complex systems can be adequately described by a combination of basic elementary structural shapes. Thirteen widely used basic structural shapes are available in this program. They are as follows: Discrete Mass, Cylinder, Truncated Cone, Torus, Beam (arbitrary cross section), Circular Rod (arbitrary cross section), Spherical Segment, Sphere, Hemisphere, Parallelepiped, Swept Trapezoidal Panel, Symmetric Trapezoidal Panels, and a Curved Rectangular Panel. MASPROP provides a designer with a simple technique that requires minimal input to calculate the mass properties of a complex rigid structure and should be useful in any situation where one needs to calculate the center of gravity and moments of inertia of a complex structure. Rigid body analysis is used to calculate mass properties. Mass properties are calculated about component axes that have been rotated to be parallel to the system coordinate axes. Then the system center of gravity is calculated and the mass properties are transferred to axes through the system center of gravity by using the parallel axis theorem. System weight, moments of inertia about the system origin, and the products of inertia about the system center of mass are calculated and printed. From the information about the system center of mass the principal axes of the system and the moments of inertia about them are calculated and printed. The only input required is simple geometric data describing the size and location of each element and the respective material density or weight of each element. This program is written in FORTRAN for execution on a CDC 6000 series computer with a central memory requirement of approximately 62K (octal) of 60 bit words. The development of this program was completed in 1978.
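
    A minimal illustration of the composite-body bookkeeping such a program automates: combine per-element masses and centroids, then shift each element's inertia to the composite center of gravity with the parallel axis theorem. The two elements and their properties are invented inputs, and only a single scalar moment of inertia (Izz) is shown rather than the full inertia tensor MASPROP handles.

```python
import numpy as np

# Hypothetical elements: (mass, centroid xyz, Izz about the element's own centroid).
elements = [
    (10.0, np.array([0.0, 0.0, 0.0]), 2.0),
    ( 5.0, np.array([2.0, 0.0, 0.0]), 0.5),
]

m_total = sum(m for m, _, _ in elements)
cg = sum(m * r for m, r, _ in elements) / m_total   # composite center of gravity

# Parallel axis theorem: shift each element's Izz to the composite CG.
Izz_total = 0.0
for m, r, Izz_local in elements:
    d2 = (r[0] - cg[0])**2 + (r[1] - cg[1])**2       # in-plane offset for Izz
    Izz_total += Izz_local + m * d2

print("total mass:", m_total)
print("composite CG:", cg)
print("Izz about composite CG:", round(Izz_total, 3))
```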

  6. Tools and techniques for developing policies for complex and uncertain systems.

    PubMed

    Bankes, Steven C

    2002-05-14

    Agent-based models (ABM) are examples of complex adaptive systems, which can be characterized as those systems for which no model less complex than the system itself can accurately predict in detail how the system will behave at future times. Consequently, the standard tools of policy analysis, based as they are on devising policies that perform well on some best estimate model of the system, cannot be reliably used for ABM. This paper argues that policy analysis by using ABM requires an alternative approach to decision theory. The general characteristics of such an approach are described, and examples are provided of its application to policy analysis.

  7. High rate information systems - Architectural trends in support of the interdisciplinary investigator

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Preheim, Larry E.

    1990-01-01

    Data systems requirements in the Earth Observing System (EOS) Space Station Freedom (SSF) eras indicate increasing data volume, increased discipline interplay, higher complexity and broader data integration and interpretation. A response to the needs of the interdisciplinary investigator is proposed, considering the increasing complexity and rising costs of scientific investigation. The EOS Data Information System, conceived to be a widely distributed system with reliable communication links between central processing and the science user community, is described. Details are provided on information architecture, system models, intelligent data management of large complex databases, and standards for archiving ancillary data, using a research library, a laboratory and collaboration services.

  8. Using DCOM to support interoperability in forest ecosystem management decision support systems

    Treesearch

    W.D. Potter; S. Liu; X. Deng; H.M. Rauscher

    2000-01-01

    Forest ecosystems exhibit complex dynamics over time and space. Management of forest ecosystems involves the need to forecast future states of complex systems that are often undergoing structural changes. This in turn requires integration of quantitative science and engineering components with sociopolitical, regulatory, and economic considerations. The amount of data...

  9. Application of Nonlinear Systems Inverses to Automatic Flight Control Design: System Concepts and Flight Evaluations

    NASA Technical Reports Server (NTRS)

    Meyer, G.; Cicolani, L.

    1981-01-01

    A practical method for the design of automatic flight control systems for aircraft with complex characteristics and operational requirements, such as the powered lift STOL and V/STOL configurations, is presented. The method is effective for a large class of dynamic systems requiring multi-axis control which have highly coupled nonlinearities, redundant controls, and complex multidimensional operational envelopes. It exploits the concept of inverse dynamic systems, and an algorithm for the construction of inverse is given. A hierarchic structure for the total control logic with inverses is presented. The method is illustrated with an application to the Augmentor Wing Jet STOL Research Aircraft equipped with a digital flight control system. Results of flight evaluation of the control concept on this aircraft are presented.
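
    The core idea of controlling through a dynamic-system inverse can be sketched for a scalar toy plant as below; the drift and control-effectiveness functions are invented stand-ins, not the powered-lift aircraft model, and the hierarchical structure of the full method is omitted.

```python
import numpy as np

# Toy nonlinear plant: xdot = f(x) + g(x)*u  (illustrative only).
f = lambda x: -np.sin(x)          # nonlinear drift
g = lambda x: 1.0 + 0.5 * x**2    # nonzero control effectiveness

def inverse_control(x, x_ref, k=2.0):
    """Dynamic inversion: choose u so the plant follows the desired pseudo-control v."""
    v = -k * (x - x_ref)          # simple linear tracking law for the inverted plant
    return (v - f(x)) / g(x)

# Track a constant reference with explicit Euler integration.
x, x_ref, dt = 0.0, 1.0, 0.01
for _ in range(1000):
    u = inverse_control(x, x_ref)
    x += dt * (f(x) + g(x) * u)
print("state after 10 s:", round(x, 4))   # approaches the reference value 1.0
```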

  10. Design Patterns for Learning and Assessment: Facilitating the Introduction of a Complex Simulation-Based Learning Environment into a Community of Instructors

    ERIC Educational Resources Information Center

    Frezzo, Dennis C.; Behrens, John T.; Mislevy, Robert J.

    2010-01-01

    Simulation environments make it possible for science and engineering students to learn to interact with complex systems. Putting these capabilities to effective use for learning, and assessing learning, requires more than a simulation environment alone. It requires a conceptual framework for the knowledge, skills, and ways of thinking that are…

  11. RAFCON: A Graphical Tool for Engineering Complex, Robotic Tasks

    DTIC Science & Technology

    2016-10-09

    Robotic tasks are becoming increasingly complex, and with this also the robotic systems. This requires new tools to manage this complexity and to...execution of robotic tasks, called RAFCON. These tasks are described in hierarchical state machines supporting concurrency. A formal notation of this concept

  12. A Metrics-Based Approach to Intrusion Detection System Evaluation for Distributed Real-Time Systems

    DTIC Science & Technology

    2002-04-01

    Based Approach to Intrusion Detection System Evaluation for Distributed Real-Time Systems Authors: G. A. Fink, B. L. Chappell, T. G. Turner, and...Distributed, Security. 1 Introduction Processing and cost requirements are driving future naval combat platforms to use distributed, real-time systems of...distributed, real-time systems. As these systems grow more complex, the timing requirements do not diminish; indeed, they may become more constrained

  13. Ferroelectric Memory Devices and a Proposed Standardized Test System Design

    DTIC Science & Technology

    1992-06-01

    positive clock transition. This provides automatic data protection in case of power loss. The device is being evaluated for applications such as automobile ...systems requiring nonvolatile memory and as these systems become more complex, the demand for reprogrammable nonvolatile memory increases. The...complexity and cost in making conventional nonvolatile memory reprogrammable also increases, so the potential for using ferroelectric memory as a replacement

  14. A Chemical Engineer's Perspective on Health and Disease

    PubMed Central

    Androulakis, Ioannis P.

    2014-01-01

    Chemical process systems engineering considers complex supply chains, which are coupled networks of dynamically interacting systems. The quest to optimize the supply chain while meeting robustness and flexibility constraints in the face of ever-changing environments necessitated the development of theoretical and computational tools for the analysis, synthesis and design of such complex engineered architectures. However, it was realized early on that optimality is a complex characteristic required to achieve proper balance between multiple, often competing, objectives. As we begin to unravel life's intricate complexities, we realize that living systems share similar structural and dynamic characteristics; hence much can be learned about biological complexity from engineered systems. In this article, we draw analogies between concepts in process systems engineering and conceptual models of health and disease; establish connections between these concepts and physiologic modeling; and describe how these mirror onto the physiological counterparts of engineered systems. PMID:25506103

  15. Verification of Triple Modular Redundancy Insertion for Reliable and Trusted Systems

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth

    2016-01-01

    If a system is required to be protected using triple modular redundancy (TMR), improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process and the complexity of digital designs, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems.
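
    As a reminder of the redundancy structure that TMR insertion produces, here is a minimal bitwise 2-of-3 majority voter; it illustrates the voting logic only, not the verification method proposed in the abstract.

```python
def tmr_vote(a: int, b: int, c: int) -> int:
    """Bitwise 2-of-3 majority of three redundant copies of a word."""
    return (a & b) | (a & c) | (b & c)

# A single-bit upset in one copy is outvoted by the other two copies.
golden = 0b1011_0010
upset  = golden ^ 0b0000_1000   # flip one bit in one copy
assert tmr_vote(golden, upset, golden) == golden
print(bin(tmr_vote(golden, upset, golden)))
```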

  16. Birefringence measurement in complex optical systems

    NASA Astrophysics Data System (ADS)

    Knell, Holger; Heuck, Hans-Martin

    2017-06-01

    State-of-the-art optical systems are becoming more complex. More lenses are required in the optical design, and optical coatings have more layers. These complex designs are prone to induce more thermal stress into the optical system, which causes birefringence. In addition, a certain degree of freedom is required to meet optical specifications during the assembly process. The mechanical fixation of these degrees of freedom can also lead to mechanical stress in the optical system and therefore to birefringence. To be able to distinguish these two types of stress, a method to image the birefringence in the optical system is required. In the proposed setup, light is polarized by a circular polarization filter and then transmitted through a rotatable linear retarder and the tested optical system. The light is then reflected back along the same path by a mirror. After the light passes the circular polarization filter on the way back, the intensity is recorded. When the rotatable retarder is rotated, the recorded intensity is modulated depending on the birefringence of the tested optical system. This modulation can be analyzed in the Fourier domain, and the linear retardance (the phase delay between the slow and the fast axis) as well as the orientation of the fast axis can be calculated. The retardance distribution over the pupil of the optical system can then be analyzed using Zernike decomposition. From the Zernike decomposition, the origin of the birefringence can be identified. Since it is required to quantify small amounts of retardance well below 10 nm, the birefringence of the measurement system must be characterized before the measurement and considered in the calculation of the resulting birefringence. Temperature change of the measurement system can still produce measurement artifacts in the calculated result, which must also be compensated for.
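
    A sketch of the Fourier-domain extraction step only: sampled intensity versus retarder rotation angle is transformed and the amplitude and phase of one harmonic are read off. The simulated signal, the choice of the fourth harmonic, and the noise level are invented; the mapping from harmonic amplitude and phase to sample retardance and fast-axis angle depends on the actual polarimeter model and is not reproduced here.

```python
import numpy as np

# Hypothetical sampled intensity over one full retarder rotation.
n = 360
theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
noise = 0.01 * np.random.default_rng(1).normal(size=n)
intensity = 0.5 + 0.08 * np.cos(4 * theta - 0.6) + noise

spec = np.fft.rfft(intensity) / n
amp_4theta   = 2.0 * np.abs(spec[4])   # amplitude of the 4-theta harmonic
phase_4theta = np.angle(spec[4])       # phase of the 4-theta harmonic

print("4-theta amplitude:", round(amp_4theta, 4))
print("4-theta phase [rad]:", round(phase_4theta, 3))
```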

  17. DfM requirements and ROI analysis for system-on-chip

    NASA Astrophysics Data System (ADS)

    Balasinski, Artur

    2005-11-01

    DfM (Design-for-Manufacturability) has become a staple requirement beyond the 100 nm technology node for efficient generation of mask data, cost reduction, and optimal circuit performance. Layout patterns have to comply with many requirements pertaining to database structure and complexity, suitability for image enhancement by optical proximity correction, and mask data pattern density and distribution over the image field. These requirements are of particular complexity for Systems-on-Chip (SoC). A number of macro-, meso-, and microscopic effects such as reticle macroloading, planarization dishing, and pattern bridging or breaking would compromise fab yield, device performance, or both. In order to determine the optimal set of DfM rules applicable to particular designs, Return-on-Investment (ROI) and Failure Mode and Effect Analysis (FMEA) are proposed.

  18. Automatic Fault Characterization via Abnormality-Enhanced Classification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bronevetsky, G; Laguna, I; de Supinski, B R

    Enterprise and high-performance computing systems are growing extremely large and complex, employing hundreds to hundreds of thousands of processors and software/hardware stacks built by many people across many organizations. As the growing scale of these machines increases the frequency of faults, system complexity makes these faults difficult to detect and to diagnose. Current system management techniques, which focus primarily on efficient data access and query mechanisms, require system administrators to examine the behavior of various system services manually. Growing system complexity is making this manual process unmanageable: administrators require more effective management tools that can detect faults and help to identify their root causes. System administrators need timely notification when a fault is manifested that includes the type of fault, the time period in which it occurred and the processor on which it originated. Statistical modeling approaches can accurately characterize system behavior. However, the complex effects of system faults make these tools difficult to apply effectively. This paper investigates the application of classification and clustering algorithms to fault detection and characterization. We show experimentally that naively applying these methods achieves poor accuracy. Further, we design novel techniques that combine classification algorithms with information on the abnormality of application behavior to improve detection and characterization accuracy. Our experiments demonstrate that these techniques can detect and characterize faults with 65% accuracy, compared to just 5% accuracy for naive approaches.
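
    One plausible reading of "combining classification with abnormality information" is sketched below, where an unsupervised anomaly score is appended to each sample's feature vector before a standard classifier is trained. This is an interpretation for illustration on synthetic data, not the paper's algorithm or results.

```python
import numpy as np
from sklearn.ensemble import IsolationForest, RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for per-process behavioral metrics and fault labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 8))
y = (X[:, 0] + 0.5 * X[:, 3] > 1.5).astype(int)   # invented "fault" label

# Abnormality score from an unsupervised model, appended as an extra feature.
iso = IsolationForest(random_state=0).fit(X)
X_aug = np.column_stack([X, iso.score_samples(X)])

X_tr, X_te, y_tr, y_te = train_test_split(X_aug, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", round(clf.score(X_te, y_te), 3))
```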

  19. A new VLSI complex integer multiplier which uses a quadratic-polynomial residue system with Fermat numbers

    NASA Technical Reports Server (NTRS)

    Truong, T. K.; Hsu, I. S.; Chang, J. J.; Shyu, H. C.; Reed, I. S.

    1986-01-01

    A quadratic-polynomial Fermat residue number system (QFNS) has been used to compute complex integer multiplications. The advantage of such a QFNS is that a complex integer multiplication requires only two integer multiplications. In this article, a new type Fermat number multiplier is developed which eliminates the initialization condition of the previous method. It is shown that the new complex multiplier can be implemented on a single VLSI chip. Such a chip is designed and fabricated in CMOS-pw technology.
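
    The two-multiplication property can be demonstrated numerically: modulo the Fermat prime F4 = 65537, j = 256 satisfies j^2 ≡ -1, so mapping a complex integer (a, b) to the residue pair (a + j·b, a − j·b) turns complex multiplication into two independent modular multiplications. The sketch below is an arithmetic illustration of that mapping, not the VLSI architecture of the paper; the mapping steps themselves multiply only by the fixed constant j = 256 (an 8-bit shift in hardware terms).

```python
F4 = 2**16 + 1            # Fermat prime 65537
J = 256                   # 256**2 = 65536 ≡ -1 (mod F4): a square root of -1
INV2 = pow(2, -1, F4)     # modular inverse of 2
INV2J = pow(2 * J, -1, F4)

def to_qrns(a, b):
    """Map the complex integer a + bi to the residue pair (z, z*)."""
    return ((a + J * b) % F4, (a - J * b) % F4)

def from_qrns(z, zc):
    """Recover (a, b) from the residue pair."""
    return ((z + zc) * INV2) % F4, ((z - zc) * INV2J) % F4

def complex_mul_qrns(x, y):
    """Complex multiply using two general integer multiplications (mod F4)."""
    xz, xzc = to_qrns(*x)
    yz, yzc = to_qrns(*y)
    return from_qrns((xz * yz) % F4, (xzc * yzc) % F4)

# Check against ordinary complex arithmetic (results are residues mod F4):
print(complex_mul_qrns((3, 7), (5, 2)))   # (3+7i)(5+2i) = 1 + 41i  ->  (1, 41)
```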

  20. A new VLSI complex integer multiplier which uses a quadratic-polynomial residue system with Fermat numbers

    NASA Technical Reports Server (NTRS)

    Shyu, H. C.; Reed, I. S.; Truong, T. K.; Hsu, I. S.; Chang, J. J.

    1987-01-01

    A quadratic-polynomial Fermat residue number system (QFNS) has been used to compute complex integer multiplications. The advantage of such a QFNS is that a complex integer multiplication requires only two integer multiplications. In this article, a new type Fermat number multiplier is developed which eliminates the initialization condition of the previous method. It is shown that the new complex multiplier can be implemented on a single VLSI chip. Such a chip is designed and fabricated in CMOS-Pw technology.

  1. Conversion from Tree to Graph Representation of Requirements

    NASA Technical Reports Server (NTRS)

    Mayank, Vimal; Everett, David Frank; Shmunis, Natalya; Austin, Mark

    2009-01-01

    A procedure and software to implement the procedure have been devised to enable conversion from a tree representation to a graph representation of the requirements governing the development and design of an engineering system. The need for this procedure and software and for other requirements-management tools arises as follows: In systems-engineering circles, it is well known that requirements- management capability improves the likelihood of success in the team-based development of complex systems involving multiple technological disciplines. It is especially desirable to be able to visualize (in order to identify and manage) requirements early in the system- design process, when errors can be corrected most easily and inexpensively.
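
    A toy sketch of the tree-to-graph conversion idea: requirements stored as a strict parent/child tree are re-expressed as a node set plus a typed edge list, so that cross-cutting links between arbitrary requirements can then be added. The node names and link types are invented for illustration and do not reflect the tool described above.

```python
# Requirements captured as a tree (each child has exactly one parent).
tree = {
    "SYS-1": ["SUB-1.1", "SUB-1.2"],
    "SUB-1.1": ["REQ-1.1.1"],
    "SUB-1.2": [],
    "REQ-1.1.1": [],
}

# Convert to a general directed graph: a node set plus a typed edge list.
nodes = set(tree)
edges = [(parent, child, "refines")
         for parent, children in tree.items() for child in children]

# A cross-cutting link that a strict tree cannot represent:
edges.append(("SUB-1.2", "REQ-1.1.1", "depends-on"))

for src, dst, kind in edges:
    print(f"{src} --{kind}--> {dst}")
```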

  2. An evolutionary communications scenario for Mars exploration

    NASA Technical Reports Server (NTRS)

    Stevenson, Steven M.

    1987-01-01

    As Mars exploration grows in complexity with time, the corresponding communication needs will grow in variety and complexity also. From initial Earth/Mars links, further needs will arise for complete surface connectivity for the provision of navigation, position location, and voice, data, and video communications services among multiple Mars bases and remote exploration sites. This paper addresses the likely required communication functions over the first few decades of Martian exploration and postulates systems for providing these services. Required technologies are identified and development requirements indicated.

  3. Development of the Next Generation of Biogeochemistry Simulations Using EMSL's NWChem Molecular Modeling Software

    NASA Astrophysics Data System (ADS)

    Bylaska, E. J.; Kowalski, K.; Apra, E.; Govind, N.; Valiev, M.

    2017-12-01

    Methods of directly simulating the behavior of complex, strongly interacting atomic systems (molecular dynamics, Monte Carlo) have provided important insight into the behavior of nanoparticles, biogeochemical systems, mineral/fluid systems, actinide systems and geofluids. The limitation of these methods to even wider applications is the difficulty of developing accurate potential interactions in these systems at the molecular level that capture their complex chemistry. The well-developed tools of quantum chemistry and physics have been shown to approach the accuracy required. However, despite the continuous effort being put into improving their accuracy and efficiency, these tools will be of little value to condensed matter problems without continued improvements in techniques to traverse and sample the high-dimensional phase space needed to span the ~10^12 time scale differences between molecular simulation and chemical events. In recent years, we have made considerable progress in developing electronic structure and AIMD methods tailored to treat biochemical and geochemical problems, including very efficient implementations of many-body methods, fast exact exchange methods, electron-transfer methods, excited state methods, QM/MM, and new parallel algorithms that scale to +100,000 cores. The poster will focus on the fundamentals of these methods and the realities in terms of system size, computational requirements and simulation times that are required for their application to complex biogeochemical systems.

  4. Use of low orbital satellite communications systems for humanitarian programs

    NASA Technical Reports Server (NTRS)

    Vlasov, Vladimir N.; Gorkovoy, Vladimir

    1991-01-01

    Communication and information exchange play a decisive role in progress and social development. However, in many parts of the world the communication infrastructure is inadequate and the capacity for on-line exchange of information may not exist. This is true of underdeveloped countries, remote and relatively inaccessible regions, sites of natural disasters, and of all cases where the resources needed to create complex communication systems are limited. The creation of an inexpensive space communications system to service such areas is therefore a high priority task. In addition to a relatively low-cost space segment, an inexpensive space communications system requires a large number of ground terminals, which must be relatively inexpensive, energy efficient (using power generated by storage batteries, solar arrays, etc.), small in size, and must not require highly expert maintenance. The ground terminals must be portable and readily deployable. Communications satellites in geostationary orbit at altitudes of about 36,000 km are very expensive and require complex and expensive ground stations and launch vehicles. Given current technology, it is categorically impossible to develop inexpensive satellite systems with portable ground terminals using such satellites. To solve the problem of developing an inexpensive satellite communications system that can operate with relatively small ground stations, including portable terminals, we propose to use a system with satellites in low Earth orbit, at an altitude of 900-1500 km. Because low orbital satellites are much closer to the Earth than geostationary ones and require vastly less energy expenditure by the satellite and ground terminals for transmission of messages, a system using them is relatively inexpensive. Such a system could use portable ground terminals no more complex than ordinary mobile police radios.

  5. A Model-Based Approach to Engineering Behavior of Complex Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Ingham, Michel; Day, John; Donahue, Kenneth; Kadesch, Alex; Kennedy, Andrew; Khan, Mohammed Omair; Post, Ethan; Standley, Shaun

    2012-01-01

    One of the most challenging yet poorly defined aspects of engineering a complex aerospace system is behavior engineering, including definition, specification, design, implementation, and verification and validation of the system's behaviors. This is especially true for behaviors of highly autonomous and intelligent systems. Behavior engineering is more of an art than a science. As a process it is generally ad-hoc, poorly specified, and inconsistently applied from one project to the next. It uses largely informal representations, and results in system behavior being documented in a wide variety of disparate documents. To address this problem, JPL has undertaken a pilot project to apply its institutional capabilities in Model-Based Systems Engineering to the challenge of specifying complex spacecraft system behavior. This paper describes the results of the work in progress on this project. In particular, we discuss our approach to modeling spacecraft behavior including 1) requirements and design flowdown from system-level to subsystem-level, 2) patterns for behavior decomposition, 3) allocation of behaviors to physical elements in the system, and 4) patterns for capturing V&V activities associated with behavioral requirements. We provide examples of interesting behavior specification patterns, and discuss findings from the pilot project.

  6. Spatiotemporal control to eliminate cardiac alternans using isostable reduction

    NASA Astrophysics Data System (ADS)

    Wilson, Dan; Moehlis, Jeff

    2017-03-01

    Cardiac alternans, an arrhythmia characterized by a beat-to-beat alternation of cardiac action potential durations, is widely believed to facilitate the transition from normal cardiac function to ventricular fibrillation and sudden cardiac death. Alternans arises due to an instability of a healthy period-1 rhythm, and most dynamical control strategies either require extensive knowledge of the cardiac system, making experimental validation difficult, or are model independent and sacrifice important information about the specific system under study. Isostable reduction provides an alternative approach, in which the response of a system to external perturbations can be used to reduce the complexity of a cardiac system, making it easier to work with from an analytical perspective while retaining many of its important features. Here, we use isostable reduction strategies to reduce the complexity of partial differential equation models of cardiac systems in order to develop energy optimal strategies for the elimination of alternans. Resulting control strategies require significantly less energy to terminate alternans than comparable strategies and do not require continuous state feedback.
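    For readers unfamiliar with alternans, the toy sketch below uses a standard restitution-map picture and a simple beat-to-beat pacing feedback, not the isostable-reduction method of the record; the restitution curve, constants, and gain are invented for illustration and exaggerate the effect.

```python
# Toy restitution-map picture of alternans and a simple pacing-interval
# feedback (NOT the isostable-reduction approach of the paper). Constants are
# arbitrary and exaggerate the alternation.

import math

A_MAX, TAU, BCL = 200.0, 80.0, 176.0          # ms

def f(di):
    """Restitution curve: next action potential duration vs. diastolic interval."""
    return A_MAX * (1.0 - math.exp(-max(di, 1.0) / TAU))

def pace(n_beats, gamma=0.0):
    """APD_{n+1} = f(BCL_n - APD_n); gamma > 0 feeds back the beat-to-beat change."""
    apd = [105.0, 118.0]
    for _ in range(n_beats):
        bcl_n = BCL + gamma * (apd[-1] - apd[-2])   # small pacing perturbation
        apd.append(f(bcl_n - apd[-1]))
    return apd

for gamma, label in [(0.0, "uncontrolled"), (0.6, "with feedback")]:
    a = pace(200, gamma)
    print(f"{label}: beat-to-beat APD difference ~ {abs(a[-1] - a[-2]):.2f} ms")
```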

  7. Analysis of Multilayered Printed Circuit Boards using Computed Tomography

    DTIC Science & Technology

    2014-05-01

    complex PCBs that present a challenge for any testing or fault analysis. Set-to-work testing and fault analysis of any electronic circuit require...Electronic Warfare and Radar Division in December 2010. He is currently in Electro-Optic Countermeasures Group. Samuel works on embedded system design...and software optimisation of complex electro-optical systems, including the set to work and characterisation of these systems. He has a Bachelor of

  8. SQL is Dead; Long-live SQL: Relational Database Technology in Science Contexts

    NASA Astrophysics Data System (ADS)

    Howe, B.; Halperin, D.

    2014-12-01

    Relational databases are often perceived as a poor fit in science contexts: Rigid schemas, poor support for complex analytics, unpredictable performance, significant maintenance and tuning requirements --- these idiosyncrasies often make databases unattractive in science contexts characterized by heterogeneous data sources, complex analysis tasks, rapidly changing requirements, and limited IT budgets. In this talk, I'll argue that although the value proposition of typical relational database systems is weak in science, the core ideas that power relational databases have become incredibly prolific in open source science software, and are emerging as a universal abstraction for both big data and small data. In addition, I'll talk about two open source systems we are building to "jailbreak" the core technology of relational databases and adapt them for use in science. The first is SQLShare, a Database-as-a-Service system supporting collaborative data analysis and exchange by reducing database use to an Upload-Query-Share workflow with no installation, schema design, or configuration required. The second is Myria, a service that supports much larger scale data and complex analytics, and supports multiple back-end systems. Finally, I'll describe some of the ways our collaborators in oceanography, astronomy, biology, fisheries science, and more are using these systems to replace script-based workflows for reasons of performance, flexibility, and convenience.
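    The Upload-Query-Share idea can be illustrated in miniature with the standard-library sqlite3 module (this is not the SQLShare or Myria API; the table, columns, and data are made up):

```python
# Upload-Query-Share in miniature with the standard-library sqlite3 module
# (not the SQLShare/Myria services themselves; data and names are invented).

import sqlite3

conn = sqlite3.connect(":memory:")

# "Upload": load a small ad-hoc dataset without lengthy schema design.
conn.execute("CREATE TABLE casts (station TEXT, depth_m REAL, temp_c REAL)")
conn.executemany(
    "INSERT INTO casts VALUES (?, ?, ?)",
    [("P1", 5.0, 12.3), ("P1", 50.0, 9.8), ("P2", 5.0, 13.1), ("P2", 50.0, 10.2)],
)

# "Query": a declarative question instead of a hand-written analysis script.
rows = conn.execute(
    "SELECT station, AVG(temp_c) FROM casts GROUP BY station"
).fetchall()
print(rows)   # [('P1', 11.05), ('P2', 11.65)]

# "Share" would then publish the query and its result through the service.
```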

  9. Use of Antibiotic-Impregnated Absorbable Beads and Tissue Coverage of Complex Wounds.

    PubMed

    White, Terris L; Culliford, Alfred T; Zomaya, Martin; Freed, Gary; Demas, Christopher P

    2016-11-01

    The treatment of complex wounds is commonplace for plastic surgeons. Standard management is debridement of infected and devitalized tissue and systemic antibiotic therapy. In cases where vital structures are exposed within the wound, coverage is obtained with vascularized tissue using both muscle and fasciocutaneous flaps. The use of nondissolving polymethylmethacrylate and absorbable antibiotic-impregnated beads has been shown to deliver high concentrations of antibiotics with low systemic levels of the same antibiotic. We present a multicenter retrospective review of all cases that used absorbable antibiotic-impregnated beads for complex wound management from 2003 to 2013. A total of 104 cases were investigated; flap coverage was used in 97 cases (93.3%). Overall, 15 patients (14.4%) required reoperation, with the highest rates involving orthopedic and sternal wounds. The advantages of using absorbable antibiotic-impregnated beads in complex infected wounds have been demonstrated with minimal disadvantages. The utilization of these beads is expanding to a variety of complex infectious wounds requiring high concentrations of local antibiotics.

  10. Unifying Human Centered Design and Systems Engineering for Human Systems Integration

    NASA Technical Reports Server (NTRS)

    Boy, Guy A.; McGovernNarkevicius, Jennifer

    2013-01-01

    Despite the holistic approach of systems engineering (SE), systems still fail, and sometimes spectacularly. Requirements, solutions and the world constantly evolve and are very difficult to keep current. SE requires more flexibility, and new approaches to SE have to be developed that include creativity as an integral part and that appropriately allocate the functions of people and technology within our highly interconnected, complex organizations. Instead of disregarding complexity because it is too difficult to handle, we should take advantage of it, discovering behavioral attractors and the emerging properties that it generates. Human-centered design (HCD) provides the creativity factor that SE lacks. It promotes modeling and simulation from the early stages of design and throughout the life cycle of a product. Unifying HCD and SE will shape appropriate human-systems integration (HSI) and produce successful systems.

  11. A new decision sciences for complex systems.

    PubMed

    Lempert, Robert J

    2002-05-14

    Models of complex systems can capture much useful information but can be difficult to apply to real-world decision-making because the type of information they contain is often inconsistent with that required for traditional decision analysis. New approaches, which use inductive reasoning over large ensembles of computational experiments, now make possible systematic comparison of alternative policy options using models of complex systems. This article describes Computer-Assisted Reasoning, an approach to decision-making under conditions of deep uncertainty that is ideally suited to applying complex systems to policy analysis. The article demonstrates the approach on the policy problem of global climate change, with a particular focus on the role of technology policies in a robust, adaptive strategy for greenhouse gas abatement.
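    A minimal sketch of the ensemble-of-experiments idea, with an invented toy model, invented policies, and arbitrary cost terms (none of this is taken from the article): evaluate each policy across many plausible futures and compare them by worst-case regret.

```python
# Toy ensemble-of-futures comparison of policies by worst-case regret.
# The model, policies, and cost terms are invented for illustration.

import random

random.seed(0)
futures = [(random.uniform(1.5, 4.5), random.uniform(0.5, 1.5))  # (damage proxy, tech cost)
           for _ in range(1000)]
policies = {"no_action": 0.0, "moderate_abatement": 0.5, "aggressive_abatement": 1.0}

def cost(abatement, damage_proxy, tech_cost):
    damages = damage_proxy * (1.0 - abatement) ** 2   # residual climate damages
    spending = tech_cost * abatement ** 2             # cost of abating
    return damages + spending

max_regret = {}
for name, level in policies.items():
    regrets = []
    for d, t in futures:
        best = min(cost(a, d, t) for a in policies.values())
        regrets.append(cost(level, d, t) - best)      # regret relative to the best choice
    max_regret[name] = max(regrets)

print(max_regret)   # a robust policy keeps worst-case regret small across the ensemble
```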

  12. The potential benefit of an advanced integrated utility system

    NASA Technical Reports Server (NTRS)

    Wolfer, B. M.

    1975-01-01

    The applicability of an advanced integrated utility system based on 1980 technology was investigated. An example of such a system, which provides electricity, heating and air conditioning, solid waste disposal, and water treatment in a single integrated plant, is illustrated for a hypothetical apartment complex. The system requires approximately 50 percent of the energy and approximately 55 percent of the water that would be required by a typical current conventional system.

  13. Mission activities planning for a Hermes mission by means of AI-technology

    NASA Technical Reports Server (NTRS)

    Pape, U.; Hajen, G.; Schielow, N.; Mitschdoerfer, P.; Allard, F.

    1993-01-01

    Mission Activities Planning is a complex task to be performed by mission control centers. AI technology can offer attractive solutions to the planning problem. This paper presents the use of a new AI-based Mission Planning System for crew activity planning. Based on a HERMES servicing mission to the COLUMBUS Man Tended Free Flyer (MTFF) with complex time and resource constraints, approximately 2000 activities with 50 different resources have been generated, processed, and planned with parametric variation of operationally sensitive parameters. The architecture, as well as the performance of the mission planning system, is discussed. An outlook to future planning scenarios, the requirements, and how a system like MARS can fulfill those requirements is given.

  14. Extreme-scale Algorithms and Solver Resilience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dongarra, Jack

    A widening gap exists between the peak performance of high-performance computers and the performance achieved by complex applications running on these platforms. Over the next decade, extreme-scale systems will present major new challenges to algorithm development that could amplify this mismatch in such a way that it prevents the productive use of future DOE Leadership computers, due to the following: extreme levels of parallelism due to multicore processors; an increase in system fault rates requiring algorithms to be resilient beyond just checkpoint/restart; complex memory hierarchies and costly data movement in both energy and performance; heterogeneous system architectures (mixing CPUs, GPUs, etc.); and conflicting goals of performance, resilience, and power requirements.

  15. Modeling and Verification of Dependable Electronic Power System Architecture

    NASA Astrophysics Data System (ADS)

    Yuan, Ling; Fan, Ping; Zhang, Xiao-fang

    The electronic power system can be viewed as a system composed of a set of concurrently interacting subsystems to generate, transmit, and distribute electric power. The complex interaction among subsystems makes the design of the electronic power system complicated. Furthermore, in order to guarantee the safe generation and distribution of electric power, fault tolerant mechanisms are incorporated in the system design to satisfy high reliability requirements. As a result, this incorporation makes the design of such a system more complicated. We propose a dependable electronic power system architecture, which can provide a generic framework to guide the development of electronic power systems and ease the development complexity. In order to provide common idioms and patterns to system designers, we formally model the electronic power system architecture using the PVS formal language. Based on the PVS model of this system architecture, we formally verify the fault tolerant properties of the architecture using the PVS theorem prover, which can guarantee that the system architecture satisfies the high reliability requirements.

  16. Space station Simulation Computer System (SCS) study for NASA/MSFC. Volume 2: Concept document

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The Simulation Computer System (SCS) concept document describes and establishes requirements for the functional performance of the SCS system, including interface, logistic, and qualification requirements. The SCS is the computational, communications, and display segment of the Marshall Space Flight Center (MSFC) Payload Training Complex (PTC). The PTC is the MSFC facility that will train onboard and ground operations personnel to operate the payloads and experiments on board the international Space Station Freedom. The requirements to be satisfied by the system implementation are identified here: the concept document defines the requirements to be satisfied through implementation of the system capability, provides the operational basis for defining the requirements to be allocated to the system components, and enables the systems organization to assess whether the completed system complies with those requirements.

  17. Development of a Naval C2 Capability Evaluation Facility

    DTIC Science & Technology

    2014-06-01

    designs is required in highly complex systems since sub-system evaluation may not be predictive of the overall system effect. It has been shown by...all individual and team behaviours, communications and interactions must be recordable. From the start of the project the design concept was for a...experimentation requirements of the concept evaluations being developed by the concept development team. A system design that allowed a variable fidelity in

  18. Advanced Kalman Filter for Real-Time Responsiveness in Complex Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Welch, Gregory Francis; Zhang, Jinghe

    2014-06-10

    Complex engineering systems pose fundamental challenges in real-time operations and control because they are highly dynamic systems consisting of a large number of elements with severe nonlinearities and discontinuities. Today's tools for real-time complex system operations are mostly based on steady state models, unable to capture the dynamic nature and too slow to prevent system failures. We developed advanced Kalman filtering techniques and a formulation of dynamic state estimation based on Kalman filtering to capture complex system dynamics in aiding real-time operations and control. In this work, we looked at complex system issues including severe nonlinearity of system equations, discontinuities caused by system controls and network switches, sparse measurements in space and time, and real-time requirements of power grid operations. We sought to bridge the disciplinary boundaries between Computer Science and Power Systems Engineering by introducing methods that leverage both existing and new techniques. While our methods were developed in the context of electrical power systems, they should generalize to other large-scale scientific and engineering applications.
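    For orientation, the sketch below is a textbook linear Kalman filter on a toy two-state system, not the advanced nonlinear, discontinuity-aware estimators developed in the project; the matrices and noise levels are illustrative assumptions.

```python
# Textbook linear Kalman filter on a toy two-state system (value and rate);
# matrices and noise levels are illustrative, not from the project.

import numpy as np

rng = np.random.default_rng(1)
dt, n_steps = 0.1, 100
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
H = np.array([[1.0, 0.0]])              # only the value is measured
Q = 1e-3 * np.eye(2)                    # process noise covariance
R = np.array([[0.05]])                  # measurement noise covariance

x_true = np.array([1.0, 0.2])
x_est, P = np.zeros(2), np.eye(2)

for _ in range(n_steps):
    x_true = F @ x_true + rng.multivariate_normal([0.0, 0.0], Q)
    z = H @ x_true + rng.normal(0.0, np.sqrt(R[0, 0]), 1)

    x_est = F @ x_est                   # predict
    P = F @ P @ F.T + Q
    S = H @ P @ H.T + R                 # update
    K = P @ H.T @ np.linalg.inv(S)
    x_est = x_est + K @ (z - H @ x_est)
    P = (np.eye(2) - K @ H) @ P

print("true state:", x_true, " estimate:", x_est)
```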

  19. Analyzing Software Requirements Errors in Safety-Critical, Embedded Systems

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn R.

    1993-01-01

    This paper analyzes the root causes of safety-related software errors in safety-critical, embedded systems. The results show that software errors identified as potentially hazardous to the system tend to be produced by different error mechanisms than non-safety-related software errors. Safety-related software errors are shown to arise most commonly from (1) discrepancies between the documented requirements specifications and the requirements needed for correct functioning of the system and (2) misunderstandings of the software's interface with the rest of the system. The paper uses these results to identify methods by which requirements errors can be prevented. The goal is to reduce safety-related software errors and to enhance the safety of complex, embedded systems.

  20. Factors which Limit the Value of Additional Redundancy in Human Rated Launch Vehicle Systems

    NASA Technical Reports Server (NTRS)

    Anderson, Joel M.; Stott, James E.; Ring, Robert W.; Hatfield, Spencer; Kaltz, Gregory M.

    2008-01-01

    The National Aeronautics and Space Administration (NASA) has embarked on an ambitious program to return humans to the moon and beyond. As NASA moves forward in the development and design of new launch vehicles for future space exploration, it must fully consider the implications that rule-based requirements of redundancy or fault tolerance have on system reliability/risk. These considerations include common cause failure, increased system complexity, combined serial and parallel configurations, and the impact of design features implemented to control premature activation. These factors and others must be considered in trade studies to support design decisions that balance safety, reliability, performance and system complexity to achieve a relatively simple, operable system that provides the safest and most reliable system within the specified performance requirements. This paper describes conditions under which additional functional redundancy can impede improved system reliability. Examples from current NASA programs including the Ares I Upper Stage will be shown.
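    A back-of-the-envelope sketch of one such trade, using a simple beta-factor common-cause model and an invented allowance for failure modes added by the extra hardware (the numbers are illustrative, not from any NASA program):

```python
# Beta-factor sketch: how common-cause failure plus failure modes added by the
# redundancy itself can erase its benefit. Probabilities are illustrative only
# (small-probability approximations throughout).

def one_of_two(p_fail, beta, p_added):
    """1-out-of-2 redundancy: common-cause part + both-independent part + added modes."""
    p_ind = (1.0 - beta) * p_fail
    p_ccf = beta * p_fail
    return p_ccf + p_ind**2 + p_added

p_single = 1e-3
for beta, p_added in [(0.0, 0.0), (0.1, 0.0), (0.1, 5e-4), (0.1, 2e-3)]:
    p_red = one_of_two(p_single, beta, p_added)
    print(f"beta={beta:.2f} added={p_added:.0e}: single={p_single:.1e} redundant={p_red:.1e}")
# Once common-cause coupling and added failure modes (e.g. premature activation
# paths) approach the single-string probability, redundancy stops paying off.
```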

  1. Systems Engineering and Integration for Advanced Life Support System and HST

    NASA Technical Reports Server (NTRS)

    Kamarani, Ali K.

    2005-01-01

    The systems engineering (SE) discipline has revolutionized the way engineers and managers think about solving issues related to the design of complex systems. With continued development of state-of-the-art technologies, systems are becoming more complex, and a systematic approach is therefore essential to control and manage their integrated design and development. This complexity is driven by integration issues: subsystems must interact with one another in order to achieve integration objectives and the overall system's required performance. The systems engineering process addresses these issues at multiple levels. It is a technology and management process dedicated to controlling all aspects of the system life cycle to assure integration at all levels. The Advanced Integration Matrix (AIM) project serves as the systems engineering and integration function for the Human Support Technology (HST) program. AIM provides integrated test facilities and personnel for performance trade studies, analyses, integrated models, test results, and validated requirements for the integration of HST. The goal of AIM is to address systems-level integration issues for exploration missions. It will use an incremental systems integration approach to yield technologies, baselines for further development, and possible breakthrough concepts in the areas of technological and organizational interfaces, total information flow, system-wide controls, technical synergism, mission operations protocols and procedures, and human-machine interfaces.

  2. Chief of Naval Air Training Resource Planning System (RPS).

    ERIC Educational Resources Information Center

    Hodak, Gary W.; And Others

    The Resource Planning System (RPS) provides the Chief of Naval Air Training (CNATRA) with the capability to determine the resources required to produce a specified number of Naval Aviators and Naval Flight Officers (NAs/NFOs) quickly and efficiently. The training of NAs and NFOs is extremely time consuming and complex. It requires extensive…

  3. Organizing for the Future Requires the Non-Aristotelian Lens of a Dragonfly.

    ERIC Educational Resources Information Center

    Collins, Marla Del

    To organize for the future requires non-Aristotelian thinking...a multifaceted wide-angle lens revealing hidden information. A multifaceted lens includes at least three general systems of evaluation, all of which promote complex thinking. The three systems are general semantics, postmodern feminist philosophy, and the unifying principle of…

  4. Minimum Control Requirements for Advanced Life Support Systems

    NASA Technical Reports Server (NTRS)

    Boulange, Richard; Jones, Harry

    2002-01-01

    Advanced control technologies are not necessary for the safe, reliable and continuous operation of Advanced Life Support (ALS) systems. ALS systems can be and are adequately controlled by simple, reliable, low-level methodologies and algorithms. The automation provided by advanced control technologies is claimed to decrease system mass and necessary crew time by reducing buffer size and minimizing crew involvement. In truth, these approaches increase control system complexity without clearly demonstrating an increase in reliability across the ALS system. Unless these systems are as reliable as the hardware they control, there are no savings to be had. A baseline ALS system is presented with the minimal control system required for its continuous, safe, reliable operation. This baseline control system uses simple algorithms and scheduling methodologies and relies on human intervention only in the event of failure of the redundant backup equipment. This ALS system architecture is designed for reliable operation, with minimal components and minimal control system complexity. The fundamental design precept followed is "If it isn't there, it can't fail".
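    A minimal sketch of the kind of simple, low-level control the paper argues is sufficient, with invented setpoints, rates, and redundancy logic (not an actual ALS algorithm):

```python
# Simple, low-level buffer control: hysteresis (on/off) logic with a crew alarm
# only if both redundant units are down. Setpoints and rates are invented.

def control_step(level, generating, primary_ok, backup_ok):
    """One control cycle for an O2 buffer; returns (generate?, alarm?)."""
    LOW, HIGH = 20.0, 80.0                   # % of buffer capacity
    if level < LOW:
        generating = True
    elif level > HIGH:
        generating = False
    alarm = generating and not (primary_ok or backup_ok)
    return generating, alarm

level, generating = 50.0, False
for hour in range(48):
    generating, alarm = control_step(level, generating, primary_ok=True, backup_ok=True)
    level += (3.0 if generating else 0.0) - 1.5   # production minus crew consumption, %/h
    if alarm:
        print(f"hour {hour}: crew intervention required")
print(f"buffer level after 48 h: {level:.1f}%")
```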

  5. A new bipolar Qtrim power supply system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mi, C.; Bruno, D.; Drozd, J.

    2015-05-03

    This year marks the 15th run of RHIC (Relativistic Heavy Ion Collider) operations. The reliability of superconducting magnet power supplies is one of the essential factors in the entire accelerator complex. Besides maintaining existing power supplies and their associated equipment, newly designed systems are also required based on the physicists' latest requirements. A bipolar power supply was required for this year's main quadrupole trim power supply. This paper will explain the design, prototyping, testing, installation and operation of this recently installed power supply system.

  6. Recording information on protein complexes in an information management system

    PubMed Central

    Savitsky, Marc; Diprose, Jonathan M.; Morris, Chris; Griffiths, Susanne L.; Daniel, Edward; Lin, Bill; Daenke, Susan; Bishop, Benjamin; Siebold, Christian; Wilson, Keith S.; Blake, Richard; Stuart, David I.; Esnouf, Robert M.

    2011-01-01

    The Protein Information Management System (PiMS) is a laboratory information management system (LIMS) designed for use with the production of proteins in a research environment. The software is distributed under the CCP4 licence, and so is available free of charge to academic laboratories. Like most LIMS, the underlying PiMS data model originally had no support for protein–protein complexes. To support the SPINE2-Complexes project the developers have extended PiMS to meet these requirements. The modifications to PiMS, described here, include data model changes, additional protocols, some user interface changes and functionality to detect when an experiment may have formed a complex. Example data are shown for the production of a crystal of a protein complex. Integration with SPINE2-Complexes Target Tracker application is also described. PMID:21605682

  7. Recording information on protein complexes in an information management system.

    PubMed

    Savitsky, Marc; Diprose, Jonathan M; Morris, Chris; Griffiths, Susanne L; Daniel, Edward; Lin, Bill; Daenke, Susan; Bishop, Benjamin; Siebold, Christian; Wilson, Keith S; Blake, Richard; Stuart, David I; Esnouf, Robert M

    2011-08-01

    The Protein Information Management System (PiMS) is a laboratory information management system (LIMS) designed for use with the production of proteins in a research environment. The software is distributed under the CCP4 licence, and so is available free of charge to academic laboratories. Like most LIMS, the underlying PiMS data model originally had no support for protein-protein complexes. To support the SPINE2-Complexes project the developers have extended PiMS to meet these requirements. The modifications to PiMS, described here, include data model changes, additional protocols, some user interface changes and functionality to detect when an experiment may have formed a complex. Example data are shown for the production of a crystal of a protein complex. Integration with SPINE2-Complexes Target Tracker application is also described. Copyright © 2011 Elsevier Inc. All rights reserved.

  8. Aeropropulsion 1987. Session 2: Aeropropulsion Structures Research

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Aeropropulsion systems present unique problems to the structural engineer. The extremes in operating temperatures, rotational effects, and behaviors of advanced material systems combine into complexities that require advances in many scientific disciplines involved in structural analysis and design procedures. This session provides an overview of the complexities of aeropropulsion structures and the theoretical, computational, and experimental research conducted to achieve the needed advances.

  9. 76 FR 30382 - Willamette Valley National Wildlife Refuge Complex, Benton, Linn, Marion, and Polk Counties, OR

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-25

    ... Wildlife Refuge Complex, 26208 Finley Refuge Road, Corvallis, OR 97333-9533. Web site: http://www.fws.gov... Refuge System Administration Act of 1966 (16 U.S.C. 668dd-668ee) (Refuge Administration Act), as amended by the National Wildlife Refuge System Improvement Act of 1997, requires us to develop a CCP for each...

  10. Overview of Intelligent Systems and Operations Development

    NASA Technical Reports Server (NTRS)

    Pallix, Joan; Dorais, Greg; Penix, John

    2004-01-01

    To achieve NASA's ambitious mission objectives for the future, aircraft and spacecraft will need intelligence to take the correct action in a variety of circumstances. Vehicle intelligence can be defined as the ability to "do the right thing" when faced with a complex decision-making situation. It will be necessary to implement integrated autonomous operations and low-level adaptive flight control technologies to direct actions that enhance the safety and success of complex missions despite component failures, degraded performance, operator errors, and environment uncertainty. This paper will describe the array of technologies required to meet these complex objectives. This includes the integration of high-level reasoning and autonomous capabilities with multiple subsystem controllers for robust performance. Future intelligent systems will use models of the system, its environment, and other intelligent agents with which it interacts. They will also require planners, reasoning engines, and adaptive controllers that can recommend or execute commands enabling the system to respond intelligently. The presentation will also address the development of highly dependable software, which is a key component to ensure the reliability of intelligent systems.

  11. 1991 Annual report on scientific programs: A broad research program on the sciences of complexity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-01-01

    1991 was a year of continued rapid growth for the Santa Fe Institute (SFI) as it broadened its interdisciplinary research into the organization, evolution and operation of complex systems and sought the deep principles underlying their dynamic behavior. Research on complex systems--the focus of work at SFI--involves an extraordinary range of topics normally studied in seemingly disparate fields. Natural systems displaying complex behavior range upwards from proteins and DNA through cells and evolutionary systems to human societies. Research models exhibiting complexity include nonlinear equations, spin glasses, cellular automata, genetic algorithms, classifier systems, and an array of other computational models. Some of the major questions facing complex systems researchers are: (1) explaining how complexity arises from the nonlinear interaction of simple components, (2) describing the mechanisms underlying the high-level aggregate behavior of complex systems (such as the overt behavior of an organism, the flow of energy in an ecology, the GNP of an economy), and (3) creating a theoretical framework to enable predictions about the likely behavior of such systems in various conditions. The importance of understanding such systems is enormous: many of the most serious challenges facing humanity--e.g., environmental sustainability, economic stability, the control of disease--as well as many of the hardest scientific questions--e.g., protein folding, the distinction between self and non-self in the immune system, the nature of intelligence, the origin of life--require deep understanding of complex systems.

  12. 1991 Annual report on scientific programs: A broad research program on the sciences of complexity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-12-31

    1991 was a year of continued rapid growth for the Santa Fe Institute (SFI) as it broadened its interdisciplinary research into the organization, evolution and operation of complex systems and sought the deep principles underlying their dynamic behavior. Research on complex systems--the focus of work at SFI--involves an extraordinary range of topics normally studied in seemingly disparate fields. Natural systems displaying complex behavior range upwards from proteins and DNA through cells and evolutionary systems to human societies. Research models exhibiting complexity include nonlinear equations, spin glasses, cellular automata, genetic algorithms, classifier systems, and an array of other computational models. Some of the major questions facing complex systems researchers are: (1) explaining how complexity arises from the nonlinear interaction of simple components, (2) describing the mechanisms underlying the high-level aggregate behavior of complex systems (such as the overt behavior of an organism, the flow of energy in an ecology, the GNP of an economy), and (3) creating a theoretical framework to enable predictions about the likely behavior of such systems in various conditions. The importance of understanding such systems is enormous: many of the most serious challenges facing humanity--e.g., environmental sustainability, economic stability, the control of disease--as well as many of the hardest scientific questions--e.g., protein folding, the distinction between self and non-self in the immune system, the nature of intelligence, the origin of life--require deep understanding of complex systems.

  13. 33 CFR 149.125 - What are the requirements for the malfunction detection system?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... marine transfer area on an oil deepwater port must be equipped with a monitoring system in accordance...) Each oil and natural gas system, between a pumping platform complex and the shore, must have a system...

  14. 33 CFR 149.125 - What are the requirements for the malfunction detection system?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... marine transfer area on an oil deepwater port must be equipped with a monitoring system in accordance...) Each oil and natural gas system, between a pumping platform complex and the shore, must have a system...

  15. 33 CFR 149.125 - What are the requirements for the malfunction detection system?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... marine transfer area on an oil deepwater port must be equipped with a monitoring system in accordance...) Each oil and natural gas system, between a pumping platform complex and the shore, must have a system...

  16. The MSFC Systems Engineering Guide: An Overview and Plan

    NASA Technical Reports Server (NTRS)

    Shelby, Jerry A.; Thomas, L. Dale

    2007-01-01

    As system and subsystem requirements become more complex in the pursuit of space exploration, advanced technology will demand and require an integrated approach to the design and development of safe and successful space vehicles and their products. Systems engineers play a vital role in transforming mission needs into vehicle requirements that can be verified and validated, resulting in a safe and cost effective design that satisfies the mission schedule. A key to successful vehicle design within systems engineering is communication. Communication, through a systems engineering infrastructure, will not only ensure that customers and stakeholders are satisfied but will also assist in the identification, integration, and management of vehicle requirements. This vehicle design will produce a system that is verifiable, traceable, and effectively satisfies cost, schedule, performance, and risk throughout the life cycle of the product. A communication infrastructure will bring about the integration of different engineering disciplines within vehicle design. A system utilizing these aspects will enhance systems engineering performance and improve required activities such as Development of Requirements, Requirements Management, Functional Analysis, Test, Synthesis, Trade Studies, Documentation, and Lessons Learned to produce a successful final product. This paper will describe the guiding vision, progress to date, and the plan forward for development of the Marshall Space Flight Center (MSFC) Systems Engineering Guide (SEG), a virtual systems engineering handbook and archive that describes the systems engineering processes used by MSFC in the development of complex systems such as the Ares launch vehicle. The intent of this website is to be a "One Stop Shop" for systems engineers, providing tutorial information, an overview of processes and procedures, links to guidance and references, and an archive of systems engineering artifacts produced by the many NASA projects developed and managed by MSFC over the years.

  17. Global Hawk Systems Engineering. Case Study

    DTIC Science & Technology

    2010-01-01

    Management Core System (TBMCS) (complex software development) • F-111 Fighter (joint program with significant involvement by the Office of the...Software Requirements Specification TACC Tailored Airworthiness Certification Criteria TBMCS Theater Battle Management Core System TEMP Test and

  18. Implementation of Complexity Analyzing Based on Additional Effect

    NASA Astrophysics Data System (ADS)

    Zhang, Peng; Li, Na; Liang, Yanhong; Liu, Fang

    According to Complexity Theory, there is complexity in a system when a functional requirement is not satisfied. There have been several studies of Complexity Theory based on Axiomatic Design; however, they focus on reducing complexity, and none focuses on a method for analyzing the complexity in a system. Therefore, this paper puts forth a method of analyzing complexity that seeks to make up for this deficiency in the research. In order to discuss the method of analyzing complexity based on additional effect, this paper introduces two concepts: ideal effect and additional effect. The method of analyzing complexity based on additional effect combines Complexity Theory with the Theory of Inventive Problem Solving (TRIZ). It helps designers analyze complexity by using additional effect. A case study shows the application of the process.

  19. Changes in Reference Question Complexity Following the Implementation of a Proactive Chat System: Implications for Practice

    ERIC Educational Resources Information Center

    Maloney, Krisellen; Kemp, Jan H.

    2015-01-01

    There has been longstanding debate about whether the level of complexity of questions received at reference desks and via online chat services requires a librarian's expertise. Continued decreases in the number and complexity of reference questions have all but ended the debate; many academic libraries no longer staff service points with…

  20. Embracing uncertainty, managing complexity: applying complexity thinking principles to transformation efforts in healthcare systems.

    PubMed

    Khan, Sobia; Vandermorris, Ashley; Shepherd, John; Begun, James W; Lanham, Holly Jordan; Uhl-Bien, Mary; Berta, Whitney

    2018-03-21

    Complexity thinking is increasingly being embraced in healthcare, which is often described as a complex adaptive system (CAS). Applying CAS to healthcare as an explanatory model for understanding the nature of the system, and to stimulate changes and transformations within the system, is valuable. A seminar series on systems and complexity thinking hosted at the University of Toronto in 2016 offered a number of insights on applications of CAS perspectives to healthcare that we explore here. We synthesized topics from this series into a set of six insights on how complexity thinking fosters a deeper understanding of accepted ideas in healthcare, applications of CAS to actors within the system, and paradoxes in applications of complexity thinking that may require further debate: 1) a complexity lens helps us better understand the nebulous term "context"; 2) concepts of CAS may be applied differently when actors are cognizant of the system in which they operate; 3) actor responses to uncertainty within a CAS is a mechanism for emergent and intentional adaptation; 4) acknowledging complexity supports patient-centred intersectional approaches to patient care; 5) complexity perspectives can support ways that leaders manage change (and transformation) in healthcare; and 6) complexity demands different ways of implementing ideas and assessing the system. To enhance our exploration of key insights, we augmented the knowledge gleaned from the series with key articles on complexity in the literature. Ultimately, complexity thinking acknowledges the "messiness" that we seek to control in healthcare and encourages us to embrace it. This means seeing challenges as opportunities for adaptation, stimulating innovative solutions to ensure positive adaptation, leveraging the social system to enable ideas to emerge and spread across the system, and even more important, acknowledging that these adaptive actions are part of system behaviour just as much as periods of stability are. By embracing uncertainty and adapting innovatively, complexity thinking enables system actors to engage meaningfully and comfortably in healthcare system transformation.

  1. The dynamics of health care reform--learning from a complex adaptive systems theoretical perspective.

    PubMed

    Sturmberg, Joachim P; Martin, Carmel M

    2010-10-01

    Health services demonstrate key features of complex adaptive systems (CAS): they are dynamic and unfold in unpredictable ways, and unfolding events are often unique. To better understand the complex adaptive nature of health systems around a core attractor we propose the metaphor of the health care vortex. We also suggest that in an ideal health care system the core attractor would be personal health attainment. Health care reforms around the world offer an opportunity to analyse health system change from a complex adaptive perspective. By and large, health care reforms have been pursued while disregarding the complex adaptive nature of the health system. The paper details some recent reforms and outlines how to understand their strategies and outcomes, and what could be learnt for future efforts, utilising CAS principles. Current health systems show the inherent properties of a CAS driven by a core attractor of disease and cost containment. We contend that more meaningful health systems reform requires the delicate task of shifting the core attractor from disease and cost containment towards health attainment.

  2. Design of experiments for identification of complex biochemical systems with applications to mitochondrial bioenergetics.

    PubMed

    Vinnakota, Kalyan C; Beard, Daniel A; Dash, Ranjan K

    2009-01-01

    Identification of a complex biochemical system model requires appropriate experimental data. Models constructed on the basis of data from the literature often contain parameters that are not identifiable with high sensitivity and therefore require additional experimental data to identify those parameters. Here we report the application of a local sensitivity analysis to design experiments that will improve the identifiability of previously unidentifiable model parameters in a model of mitochondrial oxidative phosphorylation and the tricarboxylic acid cycle. Experiments were designed based on measurable biochemical reactants in a dilute suspension of purified cardiac mitochondria with experimentally feasible perturbations to this system. Experimental perturbations and variables yielding the greatest number of parameters above a 5% sensitivity level are presented and discussed.
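    The general recipe can be sketched with a toy model (the two-parameter exponential below stands in for the mitochondrial model; the parameters, time points, and step size are invented): estimate relative sensitivities by finite differences and flag where they exceed the 5% level.

```python
# Finite-difference relative sensitivities for a toy two-parameter model
# standing in for the mitochondrial model; all numbers are illustrative.

import numpy as np

def model(params, t):
    """Toy observable y(t) = a * (1 - exp(-k t))."""
    a, k = params
    return a * (1.0 - np.exp(-k * t))

def relative_sensitivities(params, t, h=1e-4):
    """S[i, j] = (p_j / y_i) * dy_i/dp_j, estimated by central differences."""
    y0 = model(params, t)
    S = np.zeros((len(t), len(params)))
    for j, p in enumerate(params):
        up, dn = params.copy(), params.copy()
        up[j] += h * p
        dn[j] -= h * p
        S[:, j] = (model(up, t) - model(dn, t)) / (2 * h * p) * (p / y0)
    return S

t = np.linspace(0.5, 10.0, 20)
S = relative_sensitivities(np.array([2.0, 0.3]), t)
# Time points where |S| exceeds 0.05 are the measurements that would help
# identify that parameter (the "5% sensitivity level" of the abstract).
print((np.abs(S) > 0.05).sum(axis=0), "of", len(t), "time points are informative per parameter")
```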

  3. HALOS: fast, autonomous, holographic adaptive optics

    NASA Astrophysics Data System (ADS)

    Andersen, Geoff P.; Gelsinger-Austin, Paul; Gaddipati, Ravi; Gaddipati, Phani; Ghebremichael, Fassil

    2014-08-01

    We present progress on our holographic adaptive laser optics system (HALOS): a compact, closed-loop aberration correction system that uses a multiplexed hologram to deconvolve the phase aberrations in an input beam. The wavefront characterization is based on simple, parallel measurements of the intensity of fixed focal spots and does not require any complex calculations. As such, the system does not require a computer and is thus much cheaper and less complex than conventional approaches. We present details of a fully functional, closed-loop prototype incorporating a 32-element MEMS mirror, operating at a bandwidth of over 10 kHz. Additionally, since the all-optical sensing is made in parallel, the speed is independent of the number of actuators, running at the same bandwidth for one actuator as for a million.

  4. Cross-terminology mapping challenges: a demonstration using medication terminological systems.

    PubMed

    Saitwal, Himali; Qing, David; Jones, Stephen; Bernstam, Elmer V; Chute, Christopher G; Johnson, Todd R

    2012-08-01

    Standardized terminological systems for biomedical information have provided considerable benefits to biomedical applications and research. However, practical use of this information often requires mapping across terminological systems, a complex and time-consuming process. This paper demonstrates the complexity and challenges of mapping across terminological systems in the context of medication information. It provides a review of medication terminological systems and their linkages, then describes a case study in which we mapped proprietary medication codes from an electronic health record to SNOMED CT and the UMLS Metathesaurus. The goal was to create a polyhierarchical classification system for querying an i2b2 clinical data warehouse. We found that three methods were required to accurately map the majority of actively prescribed medications. Only 62.5% of source medication codes could be mapped automatically. The remaining codes were mapped using a combination of semi-automated string comparison with expert selection, and a completely manual approach. Compound drugs were especially difficult to map: only 7.5% could be mapped using the automatic method. General challenges to mapping across terminological systems include (1) the availability of up-to-date information to assess the suitability of a given terminological system for a particular use case, and to assess the quality and completeness of cross-terminology links; (2) the difficulty of correctly using complex, rapidly evolving, modern terminologies; (3) the time and effort required to complete and evaluate the mapping; (4) the need to address differences in granularity between the source and target terminologies; and (5) the need to continuously update the mapping as terminological systems evolve. Copyright © 2012 Elsevier Inc. All rights reserved.
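    The semi-automated string-comparison step can be sketched as follows; the medication strings, the target terms, and the 0.85 acceptance threshold are invented for illustration and are not the study's actual mapping rules.

```python
# Semi-automated mapping step in miniature: propose best string matches and
# route low-confidence ones to expert review. Strings and the 0.85 threshold
# are invented, not the study's actual rules.

from difflib import SequenceMatcher

local_codes = {"MED001": "acetaminophen 325 mg tab",
               "MED002": "ASA/oxycodone 325-5mg"}
target_terms = ["Acetaminophen 325 MG Oral Tablet",
                "Aspirin 325 MG / Oxycodone 5 MG Oral Tablet",
                "Ibuprofen 200 MG Oral Tablet"]

def best_match(name, candidates):
    """Return (similarity, candidate) for the closest target term."""
    return max((SequenceMatcher(None, name.lower(), c.lower()).ratio(), c)
               for c in candidates)

for code, name in local_codes.items():
    score, term = best_match(name, target_terms)
    status = "auto-accept" if score >= 0.85 else "expert review"
    print(f"{code}: '{name}' -> '{term}' ({score:.2f}, {status})")
```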

  5. Benefits of advanced software techniques for mission planning systems

    NASA Technical Reports Server (NTRS)

    Gasquet, A.; Parrod, Y.; Desaintvincent, A.

    1994-01-01

    The increasing complexity of modern spacecraft, and the stringent requirement for maximizing their mission return, call for a new generation of Mission Planning Systems (MPS). In this paper, we discuss the requirements for the Space Mission Planning and the benefits which can be expected from Artificial Intelligence techniques through examples of applications developed by Matra Marconi Space.

  6. Benefits of advanced software techniques for mission planning systems

    NASA Astrophysics Data System (ADS)

    Gasquet, A.; Parrod, Y.; Desaintvincent, A.

    1994-10-01

    The increasing complexity of modern spacecraft, and the stringent requirement for maximizing their mission return, call for a new generation of Mission Planning Systems (MPS). In this paper, we discuss the requirements for the Space Mission Planning and the benefits which can be expected from Artificial Intelligence techniques through examples of applications developed by Matra Marconi Space.

  7. Axiomatic Design of Space Life Support Systems

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.

    2017-01-01

    Systems engineering is an organized way to design and develop systems, but the initial system design concepts are usually seen as the products of unexplained but highly creative intuition. Axiomatic design is a mathematical approach to produce and compare system architectures. The two axioms are: (1) maintain the independence of the functional requirements; and (2) minimize the information content (or complexity) of the design. The first axiom generates good system design structures and the second axiom ranks them. The closed system human life support architecture now implemented in the International Space Station has been essentially unchanged for fifty years. In contrast, brief missions such as Apollo and Shuttle have used open loop life support. As mission length increases, greater system closure and increased recycling become more cost-effective. Closure can be gradually increased, first recycling humidity condensate, then hygiene wastewater, urine, carbon dioxide, and water recovery brine. A long term space station or planetary base could implement nearly full closure, including food production. Dynamic systems theory supports the axioms by showing that fewer requirements, fewer subsystems, and fewer interconnections all increase system stability. If systems are too complex and interconnected, reliability is reduced and operations and maintenance become more difficult. Using axiomatic design shows how the mission duration and other requirements determine the best life support system design, including the degree of closure.
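    The second (information) axiom can be made concrete with Suh's formula I = sum_i log2(1/p_i), where p_i is the probability that functional requirement i is satisfied; the probabilities below are invented to contrast two hypothetical architectures, not taken from the paper.

```python
# Information content under Axiom 2: I = sum_i log2(1/p_i), with p_i the
# probability of satisfying functional requirement i. Probabilities below are
# invented to contrast two hypothetical life support architectures.

import math

def information_content(probabilities):
    return sum(math.log2(1.0 / p) for p in probabilities)

open_loop = [0.99, 0.98, 0.97]                    # fewer, simpler functions
closed_loop = [0.99, 0.95, 0.90, 0.90, 0.85]      # more recycling functions, each less certain

for name, p in [("open loop", open_loop), ("closed loop", closed_loop)]:
    print(f"{name}: I = {information_content(p):.2f} bits")
# Axiom 2 prefers the lower-information design unless mission duration makes
# the extra closure worth its added complexity, which is the trade discussed above.
```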

  8. Method and apparatus for purifying nucleic acids and performing polymerase chain reaction assays using an immiscible fluid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koh, Chung-Yan; Light, Yooli Kim; Piccini, Matthew Ernest

    Embodiments of the present invention are directed toward devices, systems, and methods for purifying nucleic acids to conduct polymerase chain reaction (PCR) assays. In one example, a method includes generating complexes of silica beads and nucleic acids in a lysis buffer, transporting the complexes through an immiscible fluid to remove interfering compounds from the complexes, further transporting the complexes into a density medium containing components required for PCR where the nucleic acids disassociate from the silica beads, and thermocycling the contents of the density medium to achieve PCR. Signal may be detected from labeling agents in the components required for PCR.

  9. Spacecraft Parachute Recovery System Testing from a Failure Rate Perspective

    NASA Technical Reports Server (NTRS)

    Stewart, Christine E.

    2013-01-01

    Spacecraft parachute recovery systems, especially those with a parachute cluster, require testing to identify and reduce failures. This is especially important when the spacecraft in question is human-rated. Due to the recent effort to make spaceflight affordable, the importance of determining a minimum testing requirement has increased. The number of tests required to achieve a mature design, with a relatively constant failure rate, can be estimated from a review of previous complex spacecraft recovery systems. Examination of the Apollo parachute testing and the Shuttle Solid Rocket Booster recovery chute system operation will clarify at which point in those programs the system reached maturity. This examination will also clarify the risks inherent in not performing a sufficient number of tests prior to operation with humans on board. When looking at complex parachute systems used in spaceflight landing systems, a pattern emerges regarding the minimum amount of testing required to wring out the failure modes and reduce the failure rate of the parachute system to an acceptable level for human spaceflight. Not only a sufficient amount of system-level testing, but also the ability to update the design as failure modes are found, is required to drive the failure rate of the system down to an acceptable level. In addition, sufficient data and images are necessary to identify incipient failure modes or to identify failure causes when a system failure occurs. In order to demonstrate the need for sufficient system-level testing prior to achieving an acceptable failure rate, the Apollo Earth Landing System (ELS) test program and the Shuttle Solid Rocket Booster Recovery System failure history will be examined, and some experiences from the Orion Capsule Parachute Assembly System will be noted.
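    One rough way to see why test counts grow quickly is the classical success-run relation n >= ln(1 - C) / ln(R) for demonstrating reliability R at confidence C with zero failures; the sketch below uses generic numbers, not Apollo or Shuttle values, and ignores the design-fix cycle the text emphasizes.

```python
# Success-run relation: smallest n with 1 - R**n >= C (zero-failure testing).
# Generic numbers, not Apollo/Shuttle values.

import math

def zero_failure_tests(reliability, confidence):
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

for R in (0.90, 0.95, 0.99):
    print(f"R = {R}: {zero_failure_tests(R, 0.90)} failure-free tests for 90% confidence")
# 22, 45, and 230 tests respectively: demonstrating high reliability for a
# parachute cluster by test alone quickly becomes a large program.
```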

  10. Advances In High Temperature (Viscoelastoplastic) Material Modeling for Thermal Structural Analysis

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Saleeb, Atef F.

    2005-01-01

    High temperature applications demand high performance materials and involve: (1) complex thermomechanical loading; (2) complex material response requiring time-dependent/hereditary (viscoelastic/viscoplastic) models; and (3) comprehensive characterization (tensile, creep, relaxation) for a variety of material systems.

  11. Gaming science innovations to integrate health systems science into medical education and practice

    PubMed Central

    White, Earla J; Lewis, Joy H; McCoy, Lise

    2018-01-01

    Health systems science (HSS) is an emerging discipline addressing multiple, complex, interdependent variables that affect providers’ abilities to deliver patient care and influence population health. New perspectives and innovations are required as physician leaders and medical educators strive to accelerate changes in medical education and practice to meet the needs of evolving populations and systems. The purpose of this paper is to introduce gaming science as a lens to magnify HSS integration opportunities in the scope of medical education and practice. Evidence supports gaming science innovations as effective teaching and learning tools to promote learner engagement in scientific and systems thinking for decision making in complex scenarios. Valuable insights and lessons gained through the history of war games have resulted in strategic thinking to minimize risk and save lives. In health care, where decisions can affect patient and population outcomes, gaming science innovations have the potential to provide safe learning environments to practice crucial decision-making skills. Research of gaming science limitations, gaps, and strategies to maximize innovations to further advance HSS in medical education and practice is required. Gaming science holds promise to equip health care teams with HSS knowledge and skills required for transformative practice. The ultimate goals are to empower providers to work in complex systems to improve patient and population health outcomes and experiences, and to reduce costs and improve care team well-being.

  12. Gaming science innovations to integrate health systems science into medical education and practice.

    PubMed

    White, Earla J; Lewis, Joy H; McCoy, Lise

    2018-01-01

    Health systems science (HSS) is an emerging discipline addressing multiple, complex, interdependent variables that affect providers' abilities to deliver patient care and influence population health. New perspectives and innovations are required as physician leaders and medical educators strive to accelerate changes in medical education and practice to meet the needs of evolving populations and systems. The purpose of this paper is to introduce gaming science as a lens to magnify HSS integration opportunities in the scope of medical education and practice. Evidence supports gaming science innovations as effective teaching and learning tools to promote learner engagement in scientific and systems thinking for decision making in complex scenarios. Valuable insights and lessons gained through the history of war games have resulted in strategic thinking to minimize risk and save lives. In health care, where decisions can affect patient and population outcomes, gaming science innovations have the potential to provide safe learning environments to practice crucial decision-making skills. Research of gaming science limitations, gaps, and strategies to maximize innovations to further advance HSS in medical education and practice is required. Gaming science holds promise to equip health care teams with HSS knowledge and skills required for transformative practice. The ultimate goals are to empower providers to work in complex systems to improve patient and population health outcomes and experiences, and to reduce costs and improve care team well-being.

  13. StakeMeter: value-based stakeholder identification and quantification framework for value-based software systems.

    PubMed

    Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N A; Bin Zaheer, Kashif

    2015-01-01

    Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for the VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of the VBS system is associated with the concrete set of valuable requirements, and the valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of the VBS systems. However, the focus on the valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for the VBS systems. The existing approaches are time-consuming, complex and inconsistent, which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide the low-level implementation details for SIQ initiation and stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes in the form of a new SIQ framework called 'StakeMeter'. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and an application procedure, in contrast to other methods. The proposed framework solves the issues of stakeholder quantification or prioritization, higher time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for the VBS systems with less judgmental error.

  14. A View on Future Building System Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wetter, Michael

    This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. Also, they typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. This chapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems. A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).
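
    The mix of continuous dynamics, discrete equations and events described above can be pictured with a minimal, self-contained sketch: one thermal zone governed by an energy-balance ODE, coupled to an on/off thermostat that introduces discrete state and switching events. The zone parameters and the controller logic are invented for illustration and are not drawn from any particular building simulation tool.

```python
# Minimal sketch of a heterogeneous (hybrid) building model:
# a continuous zone-temperature ODE coupled to a discrete on/off thermostat.
# All parameters are illustrative, not taken from any real simulation program.

def simulate(t_end=3600.0, dt=1.0):
    T = 18.0          # zone temperature [degC]
    T_out = 5.0       # outdoor temperature [degC]
    UA = 200.0        # heat-loss coefficient [W/K]
    C = 2.0e6         # zone thermal capacitance [J/K]
    Q_heat = 5000.0   # heater capacity [W]
    heater_on = False # discrete state
    events = []

    t = 0.0
    while t < t_end:
        # Discrete part: hysteresis thermostat generates switching events.
        if heater_on and T > 21.0:
            heater_on = False
            events.append((t, "heater_off"))
        elif not heater_on and T < 19.0:
            heater_on = True
            events.append((t, "heater_on"))

        # Continuous part: explicit Euler step of the zone energy balance.
        Q = Q_heat if heater_on else 0.0
        dT_dt = (Q - UA * (T - T_out)) / C
        T += dT_dt * dt
        t += dt

    return T, events

if __name__ == "__main__":
    final_T, events = simulate()
    print(f"final zone temperature: {final_T:.2f} degC, {len(events)} switching events")
```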

  15. Modeling and Performance Considerations for Automated Fault Isolation in Complex Systems

    NASA Technical Reports Server (NTRS)

    Ferrell, Bob; Oostdyk, Rebecca

    2010-01-01

    The purpose of this paper is to document the modeling considerations and performance metrics that were examined in the development of a large-scale Fault Detection, Isolation and Recovery (FDIR) system. The FDIR system is envisioned to perform health management functions for both a launch vehicle and the ground systems that support the vehicle during checkout and launch countdown by using a suite of complementary software tools that alert operators to anomalies and failures in real-time. The FDIR team members developed a set of operational requirements for the models that would be used for fault isolation and worked closely with the vendor of the software tools selected for fault isolation to ensure that the software was able to meet the requirements. Once the requirements were established, example models of sufficient complexity were used to test the performance of the software. The results of the performance testing demonstrated the need for enhancements to the software in order to meet the demands of the full-scale ground and vehicle FDIR system. The paper highlights the importance of the development of operational requirements and preliminary performance testing as a strategy for identifying deficiencies in highly scalable systems and rectifying those deficiencies before they imperil the success of the project.

  16. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Concept document

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  17. The Hsp90 chaperone complex regulates GDI-dependent Rab recycling.

    PubMed

    Chen, Christine Y; Balch, William E

    2006-08-01

    Rab GTPase-regulated hubs provide a framework for an integrated coding system, the membrome network, that controls the dynamics of the specialized exocytic and endocytic membrane architectures found in eukaryotic cells. Herein, we report that Rab recycling in the early exocytic pathways involves the heat-shock protein (Hsp)90 chaperone system. We find that Hsp90 forms a complex with guanine nucleotide dissociation inhibitor (GDI) to direct recycling of the client substrate Rab1 required for endoplasmic reticulum (ER)-to-Golgi transport. ER-to-Golgi traffic is inhibited by the Hsp90-specific inhibitors geldanamycin (GA), 17-(dimethylaminoethylamino)-17-demethoxygeldanamycin (17-DMAG), and radicicol. Hsp90 activity is required to form a functional GDI complex to retrieve Rab1 from the membrane. Moreover, we find that Hsp90 is essential for Rab1-dependent Golgi assembly. The observation that the highly divergent Rab GTPases Rab1, involved in ER-to-Golgi transport, and Rab3A, involved in synaptic vesicle fusion, require Hsp90 for retrieval from membranes leads us to propose that the Hsp90 chaperone system may function as a general regulator for Rab GTPase recycling in exocytic and endocytic trafficking pathways involved in cell signaling and proliferation.

  18. Multi-agent based control of large-scale complex systems employing distributed dynamic inference engine

    NASA Astrophysics Data System (ADS)

    Zhang, Daili

    Increasing societal demand for automation has led to considerable efforts to control large-scale complex systems, especially in the area of autonomous intelligent control methods. The control system of a large-scale complex system needs to satisfy four system level requirements: robustness, flexibility, reusability, and scalability. Corresponding to the four system level requirements, there arise four major challenges. First, it is difficult to get accurate and complete information. Second, the system may be physically highly distributed. Third, the system evolves very quickly. Fourth, emergent global behaviors of the system can be caused by small disturbances at the component level. The Multi-Agent Based Control (MABC) method as an implementation of distributed intelligent control has been the focus of research since the 1970s, in an effort to solve the above-mentioned problems in controlling large-scale complex systems. However, to the author's best knowledge, all MABC systems for large-scale complex systems with significant uncertainties are problem-specific and thus difficult to extend to other domains or larger systems. This situation is partly due to the control architecture of multiple agents being determined by agent-to-agent coupling and interaction mechanisms. Therefore, the research objective of this dissertation is to develop a comprehensive, generalized framework for the control system design of general large-scale complex systems with significant uncertainties, with the focus on distributed control architecture design and distributed inference engine design. A Hybrid Multi-Agent Based Control (HyMABC) architecture is proposed by combining hierarchical control architecture and module control architecture with logical replication rings. First, it decomposes a complex system hierarchically; second, it groups the components at the same level into modules and then designs common interfaces for all of the components in the same module; third, replications are made for critical agents and are organized into logical rings. This architecture maintains clear guidelines for complexity decomposition and also increases the robustness of the whole system. Multiple Sectioned Dynamic Bayesian Networks (MSDBNs), as a distributed dynamic probabilistic inference engine, can be embedded into the control architecture to handle uncertainties of general large-scale complex systems. MSDBNs decompose a large knowledge-based system into many agents. Each agent holds its partial perspective of a large problem domain by representing its knowledge as a Dynamic Bayesian Network (DBN). Each agent accesses local evidence from its corresponding local sensors and communicates with other agents through finite message passing. If the distributed agents can be organized into a tree structure, satisfying the running intersection property and d-sep set requirements, globally consistent inferences are achievable in a distributed way. By using different frequencies for local DBN agent belief updating and global system belief updating, the approach balances the communication cost with the global consistency of inferences. In this dissertation, a fully factorized Boyen-Koller (BK) approximation algorithm is used for local DBN agent belief updating, and the static Junction Forest Linkage Tree (JFLT) algorithm is used for global system belief updating. MSDBNs assume a static structure and a stable communication network for the whole system.
    However, for a real system, sub-Bayesian networks as nodes could be lost, and the communication network could be shut down due to partial damage in the system. Therefore, on-line and automatic MSDBN structure formation is necessary for making robust state estimations and increasing the survivability of the whole system. A Distributed Spanning Tree Optimization (DSTO) algorithm, a Distributed D-Sep Set Satisfaction (DDSSS) algorithm, and a Distributed Running Intersection Satisfaction (DRIS) algorithm are proposed in this dissertation. Combining these three distributed algorithms and a Distributed Belief Propagation (DBP) algorithm in MSDBNs makes state estimations robust to partial damage in the whole system. Combining the distributed control architecture design and the distributed inference engine design leads to a process of control system design for a general large-scale complex system. As applications of the proposed methodology, the control system designs of a simplified ship chilled water system and a notional ship chilled water system have been demonstrated step by step. Simulation results not only show that the proposed methodology gives a clear guideline for control system design for general large-scale complex systems in dynamic and uncertain environments, but also indicate that the combination of MSDBNs and HyMABC can provide excellent performance for controlling general large-scale complex systems.

  19. U.S. Patent Pending, Cyberspace Security System for Complex Systems, U.S. Patent Application No.: 14/134,949

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Sheldon, Frederick T; Mili, Ali

    A computer implemented method monetizes the security of a cyber-system in terms of the losses each stakeholder may expect if a security breakdown occurs. A non-transitory media stores instructions for generating a stakes structure that includes costs that each stakeholder of a system would lose if the system failed to meet security requirements and generating a requirement structure that includes probabilities of failing requirements when computer components fail. The system generates a vulnerability model that includes probabilities of a component failing given threats materializing and generates a perpetrator model that includes probabilities of threats materializing. The system generates a dot product of the stakes structure, the requirement structure, the vulnerability model and the perpetrator model. The system can further be used to compare, contrast and evaluate alternative courses of action best suited for the stakeholders and their requirements.
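
    The "dot product" of the stakes, requirement, vulnerability, and perpetrator structures described above can be sketched as a chain of matrix-vector products that yields an expected loss per stakeholder. The matrices below are invented toy data; only the shapes and the ordering of the factors follow the abstract's description, not the patented implementation.

```python
# Toy sketch of monetizing cyber security as expected loss per stakeholder:
# expected_loss = Stakes (stakeholder x requirement)
#               @ Dependency (requirement x component)
#               @ Impact (component x threat)
#               @ ThreatProbability (threat,)
# All numbers are illustrative placeholders.
import numpy as np

stakes = np.array([      # $ lost by each stakeholder if a requirement fails
    [1e6, 2e5],          # stakeholder A vs requirements R1, R2
    [5e4, 8e5],          # stakeholder B
])
dependency = np.array([  # P(requirement fails | component fails)
    [0.9, 0.1, 0.0],     # R1 vs components C1..C3
    [0.0, 0.3, 0.7],     # R2
])
impact = np.array([      # P(component fails | threat materializes)
    [0.2, 0.0],          # C1 vs threats T1, T2
    [0.1, 0.4],          # C2
    [0.0, 0.5],          # C3
])
threat_prob = np.array([0.05, 0.01])  # P(threat materializes) per unit time

expected_loss = stakes @ dependency @ impact @ threat_prob
for name, loss in zip(["stakeholder A", "stakeholder B"], expected_loss):
    print(f"{name}: expected loss ${loss:,.0f} per unit time")
```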

  20. Space shuttle engineering and operations support. Avionics system engineering

    NASA Technical Reports Server (NTRS)

    Broome, P. A.; Neubaur, R. J.; Welsh, R. T.

    1976-01-01

    The shuttle avionics integration laboratory (SAIL) requirements for supporting the Spacelab/orbiter avionics verification process are defined. The principal topics are a Spacelab avionics hardware assessment, test operations center/electronic systems test laboratory (TOC/ESL) data processing requirements definition, SAIL (Building 16) payload accommodations study, and projected funding and test scheduling. Because of the complex nature of the Spacelab/orbiter computer systems, the PCM data link, and the high rate digital data system hardware/software relationships, early avionics interface verification is required. The SAIL is a prime candidate test location to accomplish this early avionics verification.

  1. A Bayes network approach to uncertainty quantification in hierarchically developed computational models

    DOE PAGES

    Urbina, Angel; Mahadevan, Sankaran; Paez, Thomas L.

    2012-03-01

    Here, performance assessment of complex systems is ideally accomplished through system-level testing, but because they are expensive, such tests are seldom performed. On the other hand, for economic reasons, data from tests on individual components that are parts of complex systems are more readily available. The lack of system-level data leads to a need to build computational models of systems and use them for performance prediction in lieu of experiments. Because of their complexity, models are sometimes built in a hierarchical manner, starting with simple components, progressing to collections of components, and finally, to the full system. Quantification of uncertainty in the predicted response of a system model is required in order to establish confidence in the representation of actual system behavior. This paper proposes a framework for the complex, but very practical problem of quantification of uncertainty in system-level model predictions. It is based on Bayes networks and uses the available data at multiple levels of complexity (i.e., components, subsystem, etc.). Because epistemic sources of uncertainty were shown to be secondary in this application, only aleatoric uncertainty is included in the present uncertainty quantification. An example showing application of the techniques to uncertainty quantification of measures of response of a real, complex aerospace system is included.
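
    A crude way to picture the multi-level idea, far simpler than the Bayes-network framework the paper proposes, is to characterize component-level parameter variability from component test data and then propagate samples of those parameters through a system-level model. The two-spring system and the test data below are invented for illustration.

```python
# Simplified sketch of hierarchical uncertainty propagation (aleatoric only):
# 1) characterize each component's parameter from component-level test data,
# 2) sample those parameters and push them through a system-level model.
# The component model and the data are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Component-level test data (e.g., measured stiffnesses of two components).
k1_tests = np.array([105.0, 98.0, 101.0, 103.0])
k2_tests = np.array([52.0, 49.0, 50.5, 51.0])

def system_response(k1, k2, force=10.0):
    # Toy "system model": two springs in series under a static load.
    k_eff = 1.0 / (1.0 / k1 + 1.0 / k2)
    return force / k_eff   # displacement

# Propagate component-level variability to the system-level prediction.
n = 10_000
k1_samples = rng.normal(k1_tests.mean(), k1_tests.std(ddof=1), n)
k2_samples = rng.normal(k2_tests.mean(), k2_tests.std(ddof=1), n)
disp = system_response(k1_samples, k2_samples)

print(f"system displacement: mean={disp.mean():.3f}, "
      f"95% interval=({np.percentile(disp, 2.5):.3f}, {np.percentile(disp, 97.5):.3f})")
```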

  2. Toolsets Maintain Health of Complex Systems

    NASA Technical Reports Server (NTRS)

    2010-01-01

    First featured in Spinoff 2001, Qualtech Systems Inc. (QSI), of Wethersfield, Connecticut, adapted its Testability, Engineering, and Maintenance System (TEAMS) toolset under Small Business Innovation Research (SBIR) contracts from Ames Research Center to strengthen NASA's systems health management approach for its large, complex, and interconnected systems. Today, six NASA field centers utilize the TEAMS toolset, including TEAMS-Designer, TEAMS-RT, TEAMATE, and TEAMS-RDS. TEAMS is also being used on industrial systems that generate power, carry data, refine chemicals, perform medical functions, and produce semiconductor wafers. QSI finds TEAMS can lower costs by decreasing problems requiring service by 30 to 50 percent.

  3. Design consideration in constructing high performance embedded Knowledge-Based Systems (KBS)

    NASA Technical Reports Server (NTRS)

    Dalton, Shelly D.; Daley, Philip C.

    1988-01-01

    As the hardware trends for artificial intelligence (AI) involve more and more complexity, the process of optimizing the computer system design for a particular problem will also increase in complexity. Space applications of knowledge-based systems (KBS) will often require an ability to perform both numerically intensive vector computations and real time symbolic computations. Although parallel machines can theoretically achieve the speeds necessary for most of these problems, if the application itself is not highly parallel, the machine's power cannot be utilized. A scheme is presented which will provide the computer systems engineer with a tool for analyzing machines with various configurations of array, symbolic, scalar, and multiprocessors. High speed networks and interconnections make customized, distributed, intelligent systems feasible for the application of AI in space. The method presented can be used to optimize such AI system configurations and to make comparisons between existing computer systems. It is an open question whether or not, for a given mission requirement, a suitable computer system design can be constructed for any amount of money.

  4. A development framework for artificial intelligence based distributed operations support systems

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.; Cottman, Bruce H.

    1990-01-01

    Advanced automation is required to reduce costly human operations support requirements for complex space-based and ground control systems. Existing knowledge based technologies have been used successfully to automate individual operations tasks. Considerably less progress has been made in integrating and coordinating multiple operations applications for unified intelligent support systems. To fill this gap, SOCIAL, a tool set for developing Distributed Artificial Intelligence (DAI) systems, is being constructed. SOCIAL consists of three primary language-based components defining: models of interprocess communication across heterogeneous platforms; models for interprocess coordination, concurrency control, and fault management; and models for accessing heterogeneous information resources. DAI application subsystems, either new or existing, will access these distributed services non-intrusively, via high-level message-based protocols. SOCIAL will reduce the complexity of distributed communications, control, and integration, enabling developers to concentrate on the design and functionality of the target DAI system itself.

  5. The Acquisition Process as a Vehicle for Enabling Knowledge Management in the Lifecycle of Complex Federal Systems

    NASA Technical Reports Server (NTRS)

    Stewart, Helen; Spence, Matt Chew; Holm, Jeanne; Koga, Dennis (Technical Monitor)

    2001-01-01

    This white paper explores how to increase the success and operation of critical, complex, national systems by effectively capturing knowledge management requirements within the federal acquisition process. Although we focus on aerospace flight systems, the principles outlined within may have a general applicability to other critical federal systems as well. Fundamental design deficiencies in federal, mission-critical systems have contributed to recent, highly visible system failures, such as the V-22 Osprey and the Delta rocket family. These failures indicate that the current mechanisms for knowledge management and risk management are inadequate to meet the challenges imposed by the rising complexity of critical systems. Failures of aerospace system operations and vehicles may have been prevented or lessened through utilization of better knowledge management and information management techniques.

  6. Cross-terminology mapping challenges: A demonstration using medication terminological systems

    PubMed Central

    Saitwal, Himali; Qing, David; Jones, Stephen; Bernstam, Elmer; Chute, Christopher G.; Johnson, Todd R.

    2015-01-01

    Standardized terminological systems for biomedical information have provided considerable benefits to biomedical applications and research. However, practical use of this information often requires mapping across terminological systems—a complex and time-consuming process. This paper demonstrates the complexity and challenges of mapping across terminological systems in the context of medication information. It provides a review of medication terminological systems and their linkages, then describes a case study in which we mapped proprietary medication codes from an electronic health record to SNOMED-CT and the UMLS Metathesaurus. The goal was to create a polyhierarchical classification system for querying an i2b2 clinical data warehouse. We found that three methods were required to accurately map the majority of actively prescribed medications. Only 62.5% of source medication codes could be mapped automatically. The remaining codes were mapped using a combination of semi-automated string comparison with expert selection, and a completely manual approach. Compound drugs were especially difficult to map: only 7.5% could be mapped using the automatic method. General challenges to mapping across terminological systems include (1) the availability of up-to-date information to assess the suitability of a given terminological system for a particular use case, and to assess the quality and completeness of cross-terminology links; (2) the difficulty of correctly using complex, rapidly evolving, modern terminologies; (3) the time and effort required to complete and evaluate the mapping; (4) the need to address differences in granularity between the source and target terminologies; and (5) the need to continuously update the mapping as terminological systems evolve. PMID:22750536
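
    The semi-automated string-comparison step mentioned in the abstract can be pictured with a small sketch that ranks candidate target terms by string similarity and leaves borderline matches to an expert. The drug names, codes, and threshold are made up; real mappings to SNOMED CT or the UMLS Metathesaurus would go through their published release files or APIs.

```python
# Sketch of semi-automated string matching for terminology mapping:
# rank candidate target terms by similarity, let an expert confirm borderline hits.
# Source codes and the target list are invented examples, not real SNOMED CT content.
from difflib import SequenceMatcher

source_terms = {"RX001": "acetaminophen 325 mg tab",
                "RX002": "amoxicillin/clavulanate 875-125 mg"}
target_terms = ["Acetaminophen 325 MG Oral Tablet",
                "Amoxicillin 875 MG / Clavulanate 125 MG Oral Tablet",
                "Aspirin 81 MG Oral Tablet"]

def similarity(a: str, b: str) -> float:
    # Simple character-level similarity in [0, 1].
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

for code, src in source_terms.items():
    ranked = sorted(target_terms, key=lambda t: similarity(src, t), reverse=True)
    best, score = ranked[0], similarity(src, ranked[0])
    status = "auto-accept" if score > 0.85 else "needs expert review"
    print(f"{code}: '{src}' -> '{best}' (score {score:.2f}, {status})")
```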

  7. On Machine Capacitance Dimensional and Surface Profile Measurement System

    NASA Technical Reports Server (NTRS)

    Resnick, Ralph

    1993-01-01

    A program was awarded under the Air Force Machine Tool Sensor Improvements Program Research and Development Announcement to develop and demonstrate the use of a Capacitance Sensor System including Capacitive Non-Contact Analog Probe and a Capacitive Array Dimensional Measurement System to check the dimensions of complex shapes and contours on a machine tool or in an automated inspection cell. The manufacturing of complex shapes and contours and the subsequent verification of those manufactured shapes is fundamental and widespread throughout industry. The critical profile of a gear tooth; the overall shape of a graphite EDM electrode; the contour of a turbine blade in a jet engine; and countless other components in varied applications possess complex shapes that require detailed and complex inspection procedures. Current inspection methods for complex shapes and contours are expensive, time-consuming, and labor intensive.

  8. Next generation space interconnect research and development in space communications

    NASA Astrophysics Data System (ADS)

    Collier, Charles Patrick

    2017-11-01

    Interconnect or "bus" is one of the critical technologies in the design of spacecraft avionics systems that dictates its architecture and complexity. MIL-STD-1553B has long been used as the avionics backbone technology. As avionics systems become more and more capable and complex, however, limitations of MIL-STD-1553B such as insufficient 1 Mbps bandwidth and separability have forced current avionics architects and designers to use a combination of different interconnect technologies in order to meet various requirements: CompactPCI is used for backplane interconnect; LVDS or RS422 is used for low and high-speed direct point-to-point interconnect; and some proprietary interconnect standards are designed for custom interfaces. This results in a very complicated system that consumes significant spacecraft mass and power and requires extensive resources in design, integration and testing of spacecraft systems.

  9. The threshold algorithm: Description of the methodology and new developments

    NASA Astrophysics Data System (ADS)

    Neelamraju, Sridhar; Oligschleger, Christina; Schön, J. Christian

    2017-10-01

    Understanding the dynamics of complex systems requires the investigation of their energy landscape. In particular, the flow of probability on such landscapes is a central feature in visualizing the time evolution of complex systems. To obtain such flows, and the concomitant stable states of the systems and the generalized barriers among them, the threshold algorithm has been developed. Here, we describe the methodology of this approach starting from the fundamental concepts in complex energy landscapes and present recent new developments, the threshold-minimization algorithm and the molecular dynamics threshold algorithm. For applications of these new algorithms, we draw on landscape studies of three disaccharide molecules: lactose, maltose, and sucrose.
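
    As a rough illustration of the threshold idea, the sketch below confines random walks on a toy one-dimensional landscape to energies below a prescribed lid and records which basins become reachable from a given start as the lid is raised. It is a schematic of the general principle only, not the authors' implementation or the new threshold-minimization and molecular dynamics variants.

```python
# Schematic of the threshold idea on a toy 1-D energy landscape:
# random walks are confined below an energy lid; which basins become reachable
# from a start point as the lid is raised hints at the barrier structure.
# Landscape, lids, and walk lengths are all illustrative choices.
import random

def energy(x: float) -> float:
    # Tilted double-well landscape with minima near x = -1 and x = +1.
    return 0.5 * (x**2 - 1.0) ** 2 + 0.1 * x

def basin(x: float) -> str:
    return "left well" if x < 0.0 else "right well"

def threshold_run(x0: float, lid: float, steps: int = 20000, step_size: float = 0.05):
    x = x0
    visited = {basin(x)}
    for _ in range(steps):
        trial = x + random.uniform(-step_size, step_size)
        if energy(trial) <= lid:   # accept only moves that stay below the lid
            x = trial
            visited.add(basin(x))
    return visited

random.seed(1)
for lid in (0.2, 0.45, 0.8):       # the barrier top sits near E = 0.5
    reached = threshold_run(x0=-1.0, lid=lid)
    print(f"lid {lid:.2f}: basins reached from the left minimum -> {sorted(reached)}")
```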

  10. Hybrid estimation of complex systems.

    PubMed

    Hofbaur, Michael W; Williams, Brian C

    2004-10-01

    Modern automated systems evolve both continuously and discretely, and hence require estimation techniques that go well beyond the capability of a typical Kalman Filter. Multiple model (MM) estimation schemes track these system evolutions by applying a bank of filters, one for each discrete system mode. Modern systems, however, are often composed of many interconnected components that exhibit rich behaviors, due to complex, system-wide interactions. Modeling these systems leads to complex stochastic hybrid models that capture the large number of operational and failure modes. This large number of modes makes a typical MM estimation approach infeasible for online estimation. This paper analyzes the shortcomings of MM estimation, and then introduces an alternative hybrid estimation scheme that can efficiently estimate complex systems with a large number of modes. It utilizes search techniques from the toolkit of model-based reasoning in order to focus the estimation on the set of most likely modes, without missing symptoms that might be hidden amongst the system noise. In addition, we present a novel approach to hybrid estimation in the presence of unknown behavioral modes. This leads to an overall hybrid estimation scheme for complex systems that robustly copes with unforeseen situations in a degraded, but fail-safe manner.
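
    A bare-bones picture of the multiple-model baseline that the paper improves upon is a bank of filters, one per discrete mode, whose measurement likelihoods update the mode probabilities. The scalar dynamics, noise levels, and the two modes below are invented for illustration; the paper's contribution, focusing the estimation on only the most likely modes of a large hybrid model, is not reproduced here.

```python
# Minimal sketch of multiple-model (MM) estimation with a bank of scalar
# Kalman filters, one per discrete mode. Dynamics, noise levels, and the
# two modes ("nominal", "degraded") are invented for illustration.
import math
import random

MODES = {"nominal": 1.00, "degraded": 0.80}   # state transition coefficient per mode
Q, R = 0.01, 0.04                             # process / measurement noise variances

def kf_step(x, P, a, z):
    # Predict.
    x_pred, P_pred = a * x, a * a * P + Q
    # Update (measurement model H = 1).
    S = P_pred + R                            # innovation variance
    K = P_pred / S
    innov = z - x_pred
    x_new = x_pred + K * innov
    P_new = (1.0 - K) * P_pred
    likelihood = math.exp(-0.5 * innov**2 / S) / math.sqrt(2 * math.pi * S)
    return x_new, P_new, likelihood

random.seed(0)
true_x, true_mode = 1.0, "degraded"
est = {m: (1.0, 1.0) for m in MODES}          # (x, P) per mode
prob = {m: 1.0 / len(MODES) for m in MODES}   # mode probabilities

for _ in range(30):
    true_x = MODES[true_mode] * true_x + random.gauss(0, math.sqrt(Q))
    z = true_x + random.gauss(0, math.sqrt(R))
    for m in MODES:
        x, P = est[m]
        x, P, lik = kf_step(x, P, MODES[m], z)
        est[m] = (x, P)
        prob[m] *= lik                        # Bayesian mode update
    total = sum(prob.values())
    prob = {m: p / total for m, p in prob.items()}

print("posterior mode probabilities:", {m: round(p, 3) for m, p in prob.items()})
```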

  11. A user's guide to coping with estuarine management bureaucracy: An Estuarine Planning Support System (EPSS) tool.

    PubMed

    Lonsdale, Jemma; Nicholson, Rose; Weston, Keith; Elliott, Michael; Birchenough, Andrew; Sühring, Roxana

    2018-02-01

    Estuaries are amongst the most socio-economically and ecologically important environments; however, due to competing and conflicting demands, management is often challenging, with a complex legislative framework managed by multiple agencies. To facilitate the understanding of this legislative framework, we have developed a GIS-based Estuarine Planning Support System tool. The tool integrates the requirements of the relevant legislation and provides a basis for assessing the current environmental state of an estuary as well as informing and assessing new plans to ensure a healthy estuarine state. The tool ensures that the information is easily accessible for regulators, managers, developers and the public. The tool is intended to be adaptable, but is assessed using the Humber Estuary, United Kingdom as a case study area. The successful application of the tool for complex socio-economic and environmental systems demonstrates that the tool can efficiently guide users through the complex requirements needed to support sustainable development. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  12. Getting to the core of cadherin complex function in Caenorhabditis elegans.

    PubMed

    Hardin, Jeff

    2015-01-01

    The classic cadherin-catenin complex (CCC) mediates cell-cell adhesion in metazoans. Although substantial insights have been gained by studying the CCC in vertebrate tissue culture, analyzing requirements for and regulation of the CCC in vertebrates remains challenging. Caenorhabditis elegans is a powerful system for connecting the molecular details of CCC function with functional requirements in a living organism. Recent data, using an "angstroms to embryos" approach, have elucidated functions for key residues, conserved across all metazoans, that mediate cadherin/β-catenin binding. Other recent work reveals a novel, potentially ancestral, role for the C. elegans p120ctn homologue in regulating polarization of blastomeres in the early embryo via Cdc42 and the partitioning-defective (PAR)/atypical protein kinase C (aPKC) complex. Finally, recent work suggests that the CCC is trafficked to the cell surface via the clathrin adaptor protein complex 1 (AP-1) in surprising ways. These studies continue to underscore the value of C. elegans as a model system for identifying conserved molecular mechanisms involving the CCC.

  13. Mass Drug Administration and beyond: how can we strengthen health systems to deliver complex interventions to eliminate neglected tropical diseases?

    PubMed

    Macpherson, Eleanor E; Adams, Emily R; Bockarie, Moses J; Hollingsworth, T Deirdre; Kelly-Hope, Louise A; Lehane, Mike; Kovacic, Vanja; Harrison, Robert A; Paine, Mark Ji; Reimer, Lisa J; Torr, Stephen J

    2015-01-01

    Achieving the 2020 goals for Neglected Tropical Diseases (NTDs) requires scale-up of Mass Drug Administration (MDA) which will require long-term commitment of national and global financing partners, strengthening national capacity and, at the community level, systems to monitor and evaluate activities and impact. For some settings and diseases, MDA is not appropriate and alternative interventions are required. Operational research is necessary to identify how existing MDA networks can deliver this more complex range of interventions equitably. The final stages of the different global programmes to eliminate NTDs require eliminating foci of transmission which are likely to persist in complex and remote rural settings. Operational research is required to identify how current tools and practices might be adapted to locate and eliminate these hard-to-reach foci. Chronic disabilities caused by NTDs will persist after transmission of pathogens ceases. Development and delivery of sustainable services to reduce the NTD-related disability is an urgent public health priority. LSTM and its partners are world leaders in developing and delivering interventions to control vector-borne NTDs and malaria, particularly in hard-to-reach settings in Africa. Our experience, partnerships and research capacity allows us to serve as a hub for developing, supporting, monitoring and evaluating global programmes to eliminate NTDs.

  14. Automated Planning Enables Complex Protocols on Liquid-Handling Robots.

    PubMed

    Whitehead, Ellis; Rudolf, Fabian; Kaltenbach, Hans-Michael; Stelling, Jörg

    2018-03-16

    Robotic automation in synthetic biology is especially relevant for liquid handling to facilitate complex experiments. However, research tasks that are not highly standardized are still rarely automated in practice. Two main reasons for this are the substantial investments required to translate molecular biological protocols into robot programs, and the fact that the resulting programs are often too specific to be easily reused and shared. Recent developments of standardized protocols and dedicated programming languages for liquid-handling operations addressed some aspects of ease-of-use and portability of protocols. However, either they focus on simplicity, at the expense of enabling complex protocols, or they entail detailed programming, with corresponding skills and efforts required from the users. To reconcile these trade-offs, we developed Roboliq, a software system that uses artificial intelligence (AI) methods to integrate (i) generic formal, yet intuitive, protocol descriptions, (ii) complete, but usually hidden, programming capabilities, and (iii) user-system interactions to automatically generate executable, optimized robot programs. Roboliq also enables high-level specifications of complex tasks with conditional execution. To demonstrate the system's benefits for experiments that are difficult to perform manually because of their complexity, duration, or time-critical nature, we present three proof-of-principle applications for the reproducible, quantitative characterization of GFP variants.

  15. Integrating a Genetic Algorithm Into a Knowledge-Based System for Ordering Complex Design Processes

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; McCulley, Collin M.; Bloebaum, Christina L.

    1996-01-01

    The design cycle associated with large engineering systems requires an initial decomposition of the complex system into design processes which are coupled through the transference of output data. Some of these design processes may be grouped into iterative subcycles. In analyzing or optimizing such a coupled system, it is essential to be able to determine the best ordering of the processes within these subcycles to reduce design cycle time and cost. Many decomposition approaches assume the capability is available to determine what design processes and couplings exist and what order of execution will be imposed during the design cycle. Unfortunately, this is often a complex problem and beyond the capabilities of a human design manager. A new feature, a genetic algorithm, has been added to DeMAID (Design Manager's Aid for Intelligent Decomposition) to allow the design manager to rapidly examine many different combinations of ordering processes in an iterative subcycle and to optimize the ordering based on cost, time, and iteration requirements. Two sample test cases are presented to show the effects of optimizing the ordering with a genetic algorithm.
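
    The ordering problem described above can be caricatured as permuting a coupling matrix so that as few dependencies as possible point "backwards" in the execution sequence. The sketch below evolves such permutations with a toy genetic algorithm over a made-up coupling matrix; it is illustrative only and is not the DeMAID implementation.

```python
# Toy genetic algorithm that orders coupled design processes to minimize
# feedback couplings (dependencies pointing backwards in the sequence).
# The coupling matrix and GA settings are invented for illustration.
import random

# couples[i][j] == 1 means process i needs output from process j.
couples = [
    [0, 1, 0, 0, 1],
    [0, 0, 1, 0, 0],
    [1, 0, 0, 1, 0],
    [0, 0, 0, 0, 1],
    [0, 0, 0, 0, 0],
]
N = len(couples)

def feedbacks(order):
    pos = {p: k for k, p in enumerate(order)}
    # A coupling is feedback if the producer j comes after the consumer i.
    return sum(1 for i in range(N) for j in range(N)
               if couples[i][j] and pos[j] > pos[i])

def crossover(a, b):
    cut = random.randrange(1, N)
    head = a[:cut]
    return head + [p for p in b if p not in head]   # order crossover

def mutate(order):
    i, j = random.sample(range(N), 2)
    order[i], order[j] = order[j], order[i]

random.seed(2)
pop = [random.sample(range(N), N) for _ in range(30)]
for _ in range(100):
    pop.sort(key=feedbacks)
    survivors = pop[:10]
    children = []
    while len(children) < 20:
        child = crossover(*random.sample(survivors, 2))
        if random.random() < 0.3:
            mutate(child)
        children.append(child)
    pop = survivors + children

best = min(pop, key=feedbacks)
print("best order:", best, "feedback couplings:", feedbacks(best))
```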

  16. A model-based design and validation approach with OMEGA-UML and the IF toolset

    NASA Astrophysics Data System (ADS)

    Ben-hafaiedh, Imene; Constant, Olivier; Graf, Susanne; Robbana, Riadh

    2009-03-01

    Intelligent, embedded systems such as autonomous robots and other industrial systems are becoming increasingly more heterogeneous with respect to the platforms on which they are implemented, and their software architectures are thus more complex to design and analyse. In this context, it is important to have well-defined design methodologies which should be supported by (1) high-level design concepts for mastering the design complexity, (2) concepts for the expression of non-functional requirements, and (3) analysis tools for verifying, or invalidating, that the system under development will be able to conform to its requirements. We illustrate such an approach for the design of complex embedded systems using a small case study as a running example. We briefly present the important concepts of the OMEGA-RT UML profile, we show how we use this profile in a modelling approach, and explain how these concepts are used in the IFx verification toolbox to integrate validation into the design flow and make scalable verification possible.

  17. Validation and Verification of Future Integrated Safety-Critical Systems Operating under Off-Nominal Conditions

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.

    2010-01-01

    Loss of control remains one of the largest contributors to aircraft fatal accidents worldwide. Aircraft loss-of-control accidents are highly complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents and reducing them will require a holistic integrated intervention capability. Future onboard integrated system technologies developed for preventing loss of vehicle control accidents must be able to assure safe operation under the associated off-nominal conditions. The transition of these technologies into the commercial fleet will require their extensive validation and verification (V and V) and ultimate certification. The V and V of complex integrated systems poses major nontrivial technical challenges particularly for safety-critical operation under highly off-nominal conditions associated with aircraft loss-of-control events. This paper summarizes the V and V problem and presents a proposed process that could be applied to complex integrated safety-critical systems developed for preventing aircraft loss-of-control accidents. A summary of recent research accomplishments in this effort is also provided.

  18. The Evolution of Software and Its Impact on Complex System Design in Robotic Spacecraft Embedded Systems

    NASA Technical Reports Server (NTRS)

    Butler, Roy

    2013-01-01

    The growth in computer hardware performance, coupled with reduced energy requirements, has led to a rapid expansion of the resources available to software systems, driving them towards greater logical abstraction, flexibility, and complexity. This shift in focus from compacting functionality into a limited field towards developing layered, multi-state architectures in a grand field has both driven and been driven by the history of embedded processor design in the robotic spacecraft industry. The combinatorial growth of interprocess conditions is accompanied by benefits (concurrent development, situational autonomy, and evolution of goals) and drawbacks (late integration, non-deterministic interactions, and multifaceted anomalies) in achieving mission success, as illustrated by the case of the Mars Reconnaissance Orbiter. Approaches to optimizing the benefits while mitigating the drawbacks have taken the form of the formalization of requirements, modular design practices, extensive system simulation, and spacecraft data trend analysis. The growth of hardware capability and software complexity can be expected to continue, with future directions including stackable commodity subsystems, computer-generated algorithms, runtime reconfigurable processors, and greater autonomy.

  19. State Machine Modeling of the Space Launch System Solid Rocket Boosters

    NASA Technical Reports Server (NTRS)

    Harris, Joshua A.; Patterson-Hine, Ann

    2013-01-01

    The Space Launch System is a Shuttle-derived heavy-lift vehicle currently in development to serve as NASA's premier launch vehicle for space exploration. The Space Launch System is a multistage rocket with two Solid Rocket Boosters and multiple payloads, including the Multi-Purpose Crew Vehicle. Planned Space Launch System destinations include near-Earth asteroids, the Moon, Mars, and Lagrange points. The Space Launch System is a complex system with many subsystems, requiring considerable systems engineering and integration. To this end, state machine analysis offers a method to support engineering and operational efforts, identify and avert undesirable or potentially hazardous system states, and evaluate system requirements. Finite State Machines model a system as a finite number of states, with transitions between states controlled by state-based and event-based logic. State machines are a useful tool for understanding complex system behaviors and evaluating "what-if" scenarios. This work contributes to a state machine model of the Space Launch System developed at NASA Ames Research Center. The Space Launch System Solid Rocket Booster avionics and ignition subsystems are modeled using MATLAB/Stateflow software. This model is integrated into a larger model of Space Launch System avionics used for verification and validation of Space Launch System operating procedures and design requirements. This includes testing both nominal and off-nominal system states and command sequences.
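
    A finite state machine of the kind described, with transitions driven by state-based and event-based logic, can be sketched in a few lines. The states, events, and guard below are hypothetical placeholders and do not reflect the actual Solid Rocket Booster avionics model, which was built in MATLAB/Stateflow.

```python
# Generic finite-state-machine sketch with event-driven transitions and a
# guard, in the spirit of modeling nominal and off-nominal command sequences.
# States, events, and the guard are hypothetical, not the actual SRB model.

TRANSITIONS = {
    ("SAFED",   "arm_cmd"):    "ARMED",
    ("ARMED",   "ignite_cmd"): "IGNITED",
    ("ARMED",   "safe_cmd"):   "SAFED",
    ("IGNITED", "burnout"):    "BURNOUT",
    # any unexpected (state, event) pair drives the machine to FAULT (see step()).
}

def step(state, event, armed_power_ok=True):
    # Guard: ignition is only allowed when the (hypothetical) power flag is good.
    if (state, event) == ("ARMED", "ignite_cmd") and not armed_power_ok:
        return "FAULT"
    return TRANSITIONS.get((state, event), "FAULT")

def run(sequence, armed_power_ok=True):
    state = "SAFED"
    for event in sequence:
        state = step(state, event, armed_power_ok)
        print(f"  event {event!r:14} -> {state}")
    return state

print("nominal sequence:")
run(["arm_cmd", "ignite_cmd", "burnout"])
print("off-nominal sequence (ignite while safed):")
run(["ignite_cmd"])
```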

  20. Development of a structured approach for decomposition of complex systems on a functional basis

    NASA Astrophysics Data System (ADS)

    Yildirim, Unal; Felician Campean, I.

    2014-07-01

    The purpose of this paper is to present the System State Flow Diagram (SSFD) as a structured and coherent methodology to decompose a complex system on a solution-independent functional basis. The paper starts by reviewing common function modelling frameworks in the literature and discusses practical requirements of the SSFD in the context of the current literature and current approaches in industry. The proposed methodology is illustrated through the analysis of a case study: design analysis of a generic Bread Toasting System (BTS).

  1. Thermal Control Technologies for Complex Spacecraft

    NASA Technical Reports Server (NTRS)

    Swanson, Theodore D.

    2004-01-01

    Thermal control is a generic need for all spacecraft. In response to ever more demanding science and exploration requirements, spacecraft are becoming ever more complex, and hence their thermal control systems must evolve. This paper briefly discusses the process of technology development, the state-of-the-art in thermal control, recent experiences with on-orbit two-phase systems, and the emerging thermal control technologies to meet these evolving needs. Some "lessons learned" based on experience with on-orbit systems are also presented.

  2. Hardware Specific Integration Strategy for Impedance-Based Structural Health Monitoring of Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Owen, Robert B.; Gyekenyesi, Andrew L.; Inman, Daniel J.; Ha, Dong S.

    2011-01-01

    The Integrated Vehicle Health Management (IVHM) Project, sponsored by NASA's Aeronautics Research Mission Directorate, is conducting research to advance the state of highly integrated and complex flight-critical health management technologies and systems. An effective IVHM system requires Structural Health Monitoring (SHM). The impedance method is one such SHM technique for detecting and monitoring damage in complex structures. This position paper on the impedance method presents the current state of the art, future directions, applications and possible flight test demonstrations.

  3. ADAM: analysis of discrete models of biological systems using computer algebra.

    PubMed

    Hinkelmann, Franziska; Brandon, Madison; Guang, Bonny; McNeill, Rustin; Blekherman, Grigoriy; Veliz-Cuba, Alan; Laubenbacher, Reinhard

    2011-07-20

    Many biological systems are modeled qualitatively with discrete models, such as probabilistic Boolean networks, logical models, Petri nets, and agent-based models, to gain a better understanding of them. The computational complexity to analyze the complete dynamics of these models grows exponentially in the number of variables, which impedes working with complex models. There exist software tools to analyze discrete models, but they either lack the algorithmic functionality to analyze complex models deterministically or they are inaccessible to many users as they require understanding the underlying algorithm and implementation, do not have a graphical user interface, or are hard to install. Efficient analysis methods that are accessible to modelers and easy to use are needed. We propose a method for efficiently identifying attractors and introduce the web-based tool Analysis of Dynamic Algebraic Models (ADAM), which provides this and other analysis methods for discrete models. ADAM converts several discrete model types automatically into polynomial dynamical systems and analyzes their dynamics using tools from computer algebra. Specifically, we propose a method to identify attractors of a discrete model that is equivalent to solving a system of polynomial equations, a long-studied problem in computer algebra. Based on extensive experimentation with both discrete models arising in systems biology and randomly generated networks, we found that the algebraic algorithms presented in this manuscript are fast for systems with the structure maintained by most biological systems, namely sparseness and robustness. For a large set of published complex discrete models, ADAM identified the attractors in less than one second. Discrete modeling techniques are a useful tool for analyzing complex biological systems and there is a need in the biological community for accessible efficient analysis tools. ADAM provides analysis methods based on mathematical algorithms as a web-based tool for several different input formats, and it makes analysis of complex models accessible to a larger community, as it is platform independent as a web-service and does not require understanding of the underlying mathematics.
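
    For a very small Boolean network, the attractor-identification task that ADAM performs can be illustrated by exhaustively iterating the synchronous update map and collecting the cycles it falls into. Realistic models are far too large for this brute-force enumeration, which is why ADAM instead recasts the problem as solving polynomial equations over a finite field; the three-gene network below is an invented example, not one of ADAM's input formats.

```python
# Brute-force attractor search for a tiny synchronous Boolean network.
# ADAM avoids this enumeration by solving polynomial systems over F_2;
# the three-node network here is an invented toy example.
from itertools import product

def update(state):
    a, b, c = state
    return (b and not c,   # a' = b AND NOT c
            a or c,        # b' = a OR c
            a)             # c' = a

attractors = set()
for start in product([False, True], repeat=3):
    seen = set()
    s = start
    while s not in seen:        # iterate until a state repeats
        seen.add(s)
        s = update(s)
    # s is the first repeated state: the cycle from s onward is an attractor.
    cycle = [s]
    nxt = update(s)
    while nxt != s:
        cycle.append(nxt)
        nxt = update(nxt)
    attractors.add(frozenset(cycle))

for i, att in enumerate(sorted(attractors, key=len), 1):
    pretty = [''.join('1' if v else '0' for v in state) for state in att]
    print(f"attractor {i}: {sorted(pretty)}")
```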

  4. Data System Implications Derived from User Application Requirements for Satellite Data

    NASA Technical Reports Server (NTRS)

    Neiers, J.

    1979-01-01

    An investigation of the data system needs as driven by users of space-acquired Earth observation data is documented. Two major categories of users, operational and research, are identified. Limiting data acquisition alleviates some of the delays in processing, thus improving the timeliness of the delivered product. Trade-offs occur between timeliness and data distribution costs, and between data storage and reprocessing. The complexity of the data system requirements to apply space data to users' needs is such that no single analysis suffices to design and implement the optimum system. A series of iterations is required, with analyses of the salient problems in a general way, followed by a limited implementation of benefit to some users with a continual upgrade in system capacity, functions, and applications served. The resulting most important requirement for the data system is flexibility to accommodate changing requirements as the system is implemented.

  5. Detailed requirements document for the integrated structural analysis system, phase B

    NASA Technical Reports Server (NTRS)

    Rainey, J. A.

    1976-01-01

    The requirements are defined for a software system entitled Integrated Structural Analysis System (ISAS) Phase B, which is being developed to provide the user with a tool by which a complete and detailed analysis of a complex structural system can be performed. This software system will allow for automated interface with numerous structural analysis batch programs and for user interaction in the creation, selection, and validation of data. This system will include modifications to the 4 functions developed for ISAS, and the development of 25 new functions. The new functions are described.

  6. Large space telescope engineering scale model optical design

    NASA Technical Reports Server (NTRS)

    Facey, T. A.

    1973-01-01

    The objective is to develop the detailed design and tolerance data for the LST engineering scale model optical system. This will enable MSFC to move forward to the optical element procurement phase and also to evaluate tolerances, manufacturing requirements, assembly/checkout procedures, reliability, operational complexity, stability requirements of the structure and thermal system, and the flexibility to change and grow.

  7. Scheduling language and algorithm development study. Volume 1: Study summary and overview

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A high level computer programming language and a program library were developed to be used in writing programs for scheduling complex systems such as the space transportation system. The objectives and requirements of the study are summarized and unique features of the specified language and program library are described and related to the why of the objectives and requirements.

  8. SeaFrame: Sustaining Today’s Fleet Efficiently and Effectively. Volume 5, Issue 1, 2009

    DTIC Science & Technology

    2009-01-01

    Maneuvering 11 Shipboard Launch and Recovery Systems 13 Integrated Logistics System 15 Special Hull Treatment Tile Manufacturing 17 Navy Shipboard Oil ...Developing advanced blade section design technology for propulsors that reduces cavitation damage and required repair cost and time. - Conducting...complex we have ever written.” Ammeen adds that steering and diving algorithms are also very complex, because hydrodynamic effects of a submarine

  9. A common framework for greenhouse gas assessment protocols in temperate agroforestry systems: Connecting via GRACEnet

    USDA-ARS?s Scientific Manuscript database

    Agroforestry systems offer many ecosystem benefits, but such systems have previously been marginalized in temperate environments due to overriding economic goals and perceived management complexity. In view of adaptation to a changing climate, agroforestry systems offer advantages that require quan...

  10. Is Model-Based Development a Favorable Approach for Complex and Safety-Critical Computer Systems on Commercial Aircraft?

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2014-01-01

    A system is safety-critical if its failure can endanger human life or cause significant damage to property or the environment. State-of-the-art computer systems on commercial aircraft are highly complex, software-intensive, functionally integrated, and network-centric systems of systems. Ensuring that such systems are safe and comply with existing safety regulations is costly and time-consuming as the level of rigor in the development process, especially the validation and verification activities, is determined by considerations of system complexity and safety criticality. A significant degree of care and deep insight into the operational principles of these systems is required to ensure adequate coverage of all design implications relevant to system safety. Model-based development methodologies, methods, tools, and techniques facilitate collaboration and enable the use of common design artifacts among groups dealing with different aspects of the development of a system. This paper examines the application of model-based development to complex and safety-critical aircraft computer systems. Benefits and detriments are identified and an overall assessment of the approach is given.

  11. Intelligent mobility research for robotic locomotion in complex terrain

    NASA Astrophysics Data System (ADS)

    Trentini, Michael; Beckman, Blake; Digney, Bruce; Vincent, Isabelle; Ricard, Benoit

    2006-05-01

    The objective of the Autonomous Intelligent Systems Section of Defence R&D Canada - Suffield is best described by its mission statement, which is "to augment soldiers and combat systems by developing and demonstrating practical, cost effective, autonomous intelligent systems capable of completing military missions in complex operating environments." The mobility requirement for ground-based mobile systems operating in urban settings must increase significantly if robotic technology is to augment human efforts in these roles and environments. The intelligence required for autonomous systems to operate in complex environments demands advances in many fields of robotics. This has resulted in large bodies of research in areas of perception, world representation, and navigation, but the problem of locomotion in complex terrain has largely been ignored. In order to achieve its objective, the Autonomous Intelligent Systems Section is pursuing research that explores the use of intelligent mobility algorithms designed to improve robot mobility. Intelligent mobility uses sensing, control, and learning algorithms to extract measured variables from the world, control vehicle dynamics, and learn by experience. These algorithms seek to exploit available world representations of the environment and the inherent dexterity of the robot to allow the vehicle to interact with its surroundings and produce locomotion in complex terrain. The primary focus of the paper is to present the intelligent mobility research within the framework of the research methodology, plan and direction defined at Defence R&D Canada - Suffield. It discusses the progress and future direction of intelligent mobility research and presents the research tools, topics, and plans to address this critical research gap. This research will create effective intelligence to improve the mobility of ground-based mobile systems operating in urban settings to assist the Canadian Forces in their future urban operations.

  12. Towards an integral computer environment supporting system operations analysis and conceptual design

    NASA Technical Reports Server (NTRS)

    Barro, E.; Delbufalo, A.; Rossi, F.

    1994-01-01

    VITROCISET has developed in-house a prototype tool named System Dynamic Analysis Environment (SDAE) to support system engineering activities in the initial definition phase of a complex space system. The SDAE goal is to provide powerful means for the definition, analysis, and trade-off of operations and design concepts for the space and ground elements involved in a mission. For this purpose, SDAE implements a dedicated modeling methodology based on the integration of different modern (static and dynamic) analysis and simulation techniques. The resulting 'system model' is capable of representing all the operational, functional, and behavioral aspects of the system elements which are part of a mission. The execution of customized model simulations enables: the validation of selected concepts with respect to mission requirements; the in-depth investigation of mission specific operational and/or architectural aspects; and the early assessment of the performance required of the system elements to cope with mission constraints and objectives. Due to its characteristics, SDAE is particularly tailored for nonconventional or highly complex systems, which require a great analysis effort in their early definition stages. SDAE runs under PC-Windows and is currently used by the VITROCISET system engineering group. This paper describes the SDAE main features, showing some tool output examples.

  13. Systems Integration Challenges for a National Space Launch System

    NASA Technical Reports Server (NTRS)

    May, Todd A.

    2011-01-01

    System Integration was refined through the complexity and early failures experienced in rocket flight. System Integration encompasses many different viewpoints of the system development. System Integration must ensure consistency in development and operations activities. Human Space Flight tends toward large, complex systems. Understanding the system's operational and use context is the guiding principle for System Integration: (1) sizeable costs can be driven into systems by not fully understanding context; (2) adhering to the system context throughout the system's life cycle is essential to maintaining efficient System Integration. System Integration exists within the System Architecture. Beautiful systems are simple in use and operation; block upgrades facilitate manageable steps in functionality evolution. Effective System Integration requires a stable system concept. Communication is essential to system simplicity.

  14. Ares I Integrated Vehicle System Safety Team

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jon; McNairy, Lisa; Shackelford, Carla

    2009-01-01

    Complex systems require integrated analysis teams, which are sometimes divided into subsystem teams. Proper division of the analysis into subsystem teams is important. Safety analysis is one of the most difficult aspects of integration.

  15. Design for waste-management system

    NASA Technical Reports Server (NTRS)

    Guarneri, C. A.; Reed, A.; Renman, R.

    1973-01-01

    Study was made and system defined for water-recovery and solid-waste processing for low-rise apartment complexes. System can be modified to conform with unique requirements of community, including hydrology, geology, and climate. Reclamation is accomplished by treatment process that features reverse-osmosis membranes.

  16. Thermal Environment for Classrooms. Central System Approach to Air Conditioning.

    ERIC Educational Resources Information Center

    Triechler, Walter W.

    This speech compares the air conditioning requirements of high-rise office buildings with those of large centralized school complexes. A description of one particular air conditioning system provides information about the system's arrangement, functions, performance efficiency, and cost effectiveness. (MLF)

  17. Realtime multi-plot graphics system

    NASA Technical Reports Server (NTRS)

    Shipkowski, Michael S.

    1990-01-01

    The increased complexity of test operations and customer requirements at Langley Research Center's National Transonic Facility (NTF) surpassed the capabilities of the initial realtime graphics system. The analysis of existing hardware and software and the enhancements made to develop a new realtime graphics system are described. The result of this effort is a cost-effective system, based on hardware already in place, that supports high-speed, high-resolution generation and display of multiple realtime plots. The enhanced graphics system (EGS) meets the current and foreseeable future realtime graphics requirements of the NTF. While this system was developed to support wind tunnel operations, the overall design and capability of the system are applicable to other realtime data acquisition systems that have realtime plot requirements.

  18. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Volume 2: Baseline architecture report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  19. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Phased development plan

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is made up of computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  20. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Volume 1: Baseline architecture report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is made up of the computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  1. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Operations concept report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is made up of computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  2. Assessment of Stone Complexity for PCNL: A Systematic Review of the Literature, How Best Can We Record Stone Complexity in PCNL?

    PubMed

    Withington, John; Armitage, James; Finch, William; Wiseman, Oliver; Glass, Jonathan; Burgess, Neil

    2016-01-01

    This study aims to systematically review the literature reporting tools for scoring stone complexity and the stratification of outcomes by stone complexity. In doing so, we aim to determine whether the evidence favors uniform adoption of any one scoring system. The PubMed and Embase databases were systematically searched for relevant studies from 2004 to 2014. Reports selected according to predetermined inclusion and exclusion criteria were appraised in terms of methodologic quality and their findings summarized in structured tables. After review, 15 studies were considered suitable for inclusion. Four distinct scoring systems were identified, along with a further five studies that aimed to validate aspects of those scoring systems. Six studies reported the stratification of outcomes by stone complexity without specifically defining a scoring system. All studies reported some correlation between stone complexity and stone clearance. Where investigated, correlation with complications was less clearly established. This review does not allow us to firmly recommend one scoring system over the others. However, the quality of evidence supporting validation of the Guy's Stone Score is marginally superior, according to the criteria applied in this study. Further evaluation of the interobserver reliability of this scoring system is required.

  3. Advanced Launch System Multi-Path Redundant Avionics Architecture Analysis and Characterization

    NASA Technical Reports Server (NTRS)

    Baker, Robert L.

    1993-01-01

    The objective of the Multi-Path Redundant Avionics Suite (MPRAS) program is the development of a set of avionic architectural modules which will be applicable to the family of launch vehicles required to support the Advanced Launch System (ALS). To enable ALS cost/performance requirements to be met, the MPRAS must support autonomy, maintenance, and testability capabilities which exceed those present in conventional launch vehicles. The multi-path redundant or fault-tolerant characteristics of the MPRAS are necessary to offset a reduction in avionics reliability due to the increased complexity needed to support these new cost reduction and performance capabilities and to meet avionics reliability requirements which will provide cost-effective reductions in overall ALS recurring costs. A complex, real-time distributed computing system is needed to meet the ALS avionics system requirements. General Dynamics, Boeing Aerospace, and C.S. Draper Laboratory have proposed system architectures as candidates for the ALS MPRAS. The purpose of this document is to report the results of independent performance and reliability characterization and assessment analyses of each proposed candidate architecture and qualitative assessments of testability, maintainability, and fault tolerance mechanisms. These independent analyses were conducted as part of the MPRAS Part 2 program and were carried out under NASA Langley Research Contract NAS1-17964, Task Assignment 28.

  4. Multicriteria hierarchical iterative interactive algorithm for organizing operational modes of large heat supply systems

    NASA Astrophysics Data System (ADS)

    Korotkova, T. I.; Popova, V. I.

    2017-11-01

    A generalized mathematical model of decision-making for planning and mode selection that provides the required heat loads in a large heat supply system is considered. The system is multilevel, decomposed into levels of main and distribution heating networks with intermediate control stages. The effectiveness, reliability and safety of such a complex system are evaluated directly against several indicators, in particular pressure, flow and temperature. This global constrained multicriteria optimization problem is decomposed into a number of local optimization problems and a coordination problem. A coordinated solution of the local problems provides a solution to the global multicriteria decision-making problem for the complex system. The optimal operating mode of a complex heat supply system is chosen on the basis of an iterative coordination process, which converges to the coordinated solution of the local optimization tasks. The interactive principle of multicriteria decision-making includes, in particular, periodic adjustments, when necessary, that guarantee optimal safety, reliability and efficiency of the system as a whole during operation. The required accuracy of the solution, for example the allowed deviation of the indoor air temperature from the required value, can also be changed interactively. This allows adjustment activities to be carried out in the best way and improves the quality of heat supply to consumers. At the same time, an energy-saving problem is solved to determine the minimum required heads at heat sources and pumping stations.
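
    As a rough illustration of the coordination idea summarized above, the sketch below decomposes a toy load-allocation problem into independent local optimizations plus a price-style coordinator that iterates until the required total load is met. The quadratic local costs, the update rule, and all numbers are invented for illustration and are not the authors' model of a heating network.

        # Toy sketch of iterative coordination: local subproblems plus a coordinator.
        import numpy as np

        demand = 100.0                      # required total heat load (arbitrary units)
        a = np.array([0.10, 0.08, 0.12])    # quadratic cost coefficients of three local sources

        def local_best_response(lam):
            # Each source i minimizes a_i * q_i**2 - lam * q_i  =>  q_i = lam / (2 * a_i)
            return lam / (2.0 * a)

        lam, step = 1.0, 0.02
        for it in range(500):
            q = local_best_response(lam)    # local optimization problems, solved independently
            mismatch = demand - q.sum()     # coordination error against the required load
            if abs(mismatch) < 1e-6:
                break
            lam += step * mismatch          # coordinator raises the "price" while load is unmet

        print(f"iterations={it}, price={lam:.3f}, loads={np.round(q, 2)}, total={q.sum():.2f}")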

  5. On Convergence of Development Costs and Cost Models for Complex Spaceflight Instrument Electronics

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Patel, Umeshkumar D.; Kasa, Robert L.; Hestnes, Phyllis; Brown, Tammy; Vootukuru, Madhavi

    2008-01-01

    Development costs of a few recent spaceflight instrument electrical and electronics subsystems have diverged from respective heritage cost model predictions. The cost models used are Grass Roots, Price-H and Parametric Model. These cost models originated in the military and industry around 1970 and were successfully adopted and patched by NASA on a mission-by-mission basis for years. However, the complexity of new instruments has recently changed rapidly, by orders of magnitude. This is most obvious in the complexity of a representative spaceflight instrument electronics data system. It is now required to perform intermediate processing of digitized data apart from conventional processing of science phenomenon signals from multiple detectors. This involves on-board instrument formatting of computational operands from raw data (for example, images), multi-million operations per second on large volumes of data in reconfigurable hardware (in addition to processing on a general purpose embedded or standalone instrument flight computer), as well as making decisions for on-board system adaptation and resource reconfiguration. The instrument data system is now tasked to perform more functions, such as forming packets and instrument-level data compression of more than one data stream, which are traditionally performed by the spacecraft command and data handling system. It is furthermore required that the electronics box for new complex instruments is developed for single-digit-watt power consumption, small size and light weight, while delivering supercomputing capabilities. The conflict between the actual development cost of newer complex instruments and the heritage cost model predictions for their electronics components seems to be irreconcilable. This conflict and an approach to its resolution are addressed in this paper by determining the complexity parameters, a complexity index, and their use in an enhanced cost model.

  6. Plant metabolic modeling: achieving new insight into metabolism and metabolic engineering.

    PubMed

    Baghalian, Kambiz; Hajirezaei, Mohammad-Reza; Schreiber, Falk

    2014-10-01

    Models are used to represent aspects of the real world for specific purposes, and mathematical models have opened up new approaches in studying the behavior and complexity of biological systems. However, modeling is often time-consuming and requires significant computational resources for data development, data analysis, and simulation. Computational modeling has been successfully applied as an aid for metabolic engineering in microorganisms. But such model-based approaches have only recently been extended to plant metabolic engineering, mainly due to greater pathway complexity in plants and their highly compartmentalized cellular structure. Recent progress in plant systems biology and bioinformatics has begun to disentangle this complexity and facilitate the creation of efficient plant metabolic models. This review highlights several aspects of plant metabolic modeling in the context of understanding, predicting and modifying complex plant metabolism. We discuss opportunities for engineering photosynthetic carbon metabolism, sucrose synthesis, and the tricarboxylic acid cycle in leaves and oil synthesis in seeds and the application of metabolic modeling to the study of plant acclimation to the environment. The aim of the review is to offer a current perspective for plant biologists without requiring specialized knowledge of bioinformatics or systems biology. © 2014 American Society of Plant Biologists. All rights reserved.
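
    To make concrete what such a model computes in practice, the sketch below runs constraint-based flux balance analysis (FBA), one widely used class of metabolic model, on an invented three-reaction toy network: fluxes are constrained to steady state (S·v = 0) and a target flux is maximized by linear programming. The network, bounds, and objective are illustrative and are not taken from this review.

        # Minimal flux balance analysis (FBA) sketch on a toy network: uptake -> A -> B -> biomass.
        import numpy as np
        from scipy.optimize import linprog

        # Stoichiometric matrix S (rows = metabolites A, B; columns = reactions R1, R2, R3)
        S = np.array([
            [1, -1,  0],   # metabolite A: produced by R1, consumed by R2
            [0,  1, -1],   # metabolite B: produced by R2, consumed by R3
        ])
        bounds = [(0, 10), (0, 10), (0, 10)]   # flux bounds for each reaction
        c = np.zeros(3)
        c[2] = -1.0                            # maximize flux through R3 (linprog minimizes)

        res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds)
        print("optimal biomass flux:", -res.fun, "| fluxes:", res.x)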

  7. Plant Metabolic Modeling: Achieving New Insight into Metabolism and Metabolic Engineering

    PubMed Central

    Baghalian, Kambiz; Hajirezaei, Mohammad-Reza; Schreiber, Falk

    2014-01-01

    Models are used to represent aspects of the real world for specific purposes, and mathematical models have opened up new approaches in studying the behavior and complexity of biological systems. However, modeling is often time-consuming and requires significant computational resources for data development, data analysis, and simulation. Computational modeling has been successfully applied as an aid for metabolic engineering in microorganisms. But such model-based approaches have only recently been extended to plant metabolic engineering, mainly due to greater pathway complexity in plants and their highly compartmentalized cellular structure. Recent progress in plant systems biology and bioinformatics has begun to disentangle this complexity and facilitate the creation of efficient plant metabolic models. This review highlights several aspects of plant metabolic modeling in the context of understanding, predicting and modifying complex plant metabolism. We discuss opportunities for engineering photosynthetic carbon metabolism, sucrose synthesis, and the tricarboxylic acid cycle in leaves and oil synthesis in seeds and the application of metabolic modeling to the study of plant acclimation to the environment. The aim of the review is to offer a current perspective for plant biologists without requiring specialized knowledge of bioinformatics or systems biology. PMID:25344492

  8. Specification and Design of Electrical Flight System Architectures with SysML

    NASA Technical Reports Server (NTRS)

    McKelvin, Mark L., Jr.; Jimenez, Alejandro

    2012-01-01

    Modern space flight systems are required to perform more complex functions than previous generations to support space missions. This demand is driving the trend to deploy more electronics to realize system functionality. The traditional approach for the specification, design, and deployment of electrical system architectures in space flight systems includes the use of informal definitions and descriptions that are often embedded within loosely coupled but highly interdependent design documents. Traditional methods become inefficient to cope with increasing system complexity, evolving requirements, and the ability to meet project budget and time constraints. Thus, there is a need for more rigorous methods to capture the relevant information about the electrical system architecture as the design evolves. In this work, we propose a model-centric approach to support the specification and design of electrical flight system architectures using the System Modeling Language (SysML). In our approach, we develop a domain specific language for specifying electrical system architectures, and we propose a design flow for the specification and design of electrical interfaces. Our approach is applied to a practical flight system.

  9. Perimetric Complexity of Binary Digital Images: Notes on Calculation and Relation to Visual Complexity

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.

    2011-01-01

    Perimetric complexity is a measure of the complexity of binary pictures. It is defined as the sum of the inside and outside perimeters of the foreground, squared, divided by the foreground area, divided by 4π. Difficulties arise when this definition is applied to digital images composed of binary pixels. In this paper we identify these problems and propose solutions. Perimetric complexity is often used as a measure of visual complexity, in which case it should take into account the limited resolution of the visual system. We propose a measure of visual perimetric complexity that meets this requirement.
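
    A minimal sketch of the measure, writing the definition as C = P^2 / (4πA) and approximating the perimeter P by counting exposed pixel edges; the paper's own treatment of the digital-image pitfalls is more careful and is not reproduced here.

        # Rough perimetric-complexity estimate for a binary image (foreground = 1).
        import numpy as np

        def perimetric_complexity(img):
            img = (np.asarray(img) > 0).astype(int)
            area = img.sum()
            if area == 0:
                return 0.0
            padded = np.pad(img, 1)
            # Approximate the boundary length by counting foreground/background pixel edges.
            perimeter = (np.abs(np.diff(padded, axis=0)).sum() +
                         np.abs(np.diff(padded, axis=1)).sum())
            return perimeter ** 2 / (4.0 * np.pi * area)

        # Example: a filled 10x10 square has P = 40 and A = 100, so C = 1600 / (400π) ≈ 1.27
        print(round(perimetric_complexity(np.ones((10, 10))), 3))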

  10. NASA Langley Distributed Propulsion VTOL Tilt-Wing Aircraft Testing, Modeling, Simulation, Control, and Flight Test Development

    NASA Technical Reports Server (NTRS)

    Rothhaar, Paul M.; Murphy, Patrick C.; Bacon, Barton J.; Gregory, Irene M.; Grauer, Jared A.; Busan, Ronald C.; Croom, Mark A.

    2014-01-01

    Control of complex Vertical Take-Off and Landing (VTOL) aircraft traversing from hovering to wing-borne flight mode and back poses notoriously difficult modeling, simulation, control, and flight-testing challenges. This paper provides an overview of the techniques and advances required to develop the GL-10 tilt-wing, tilt-tail, long endurance, VTOL aircraft control system. The GL-10 prototype's unusual and complex configuration requires application of state-of-the-art techniques and some significant advances in wind tunnel infrastructure automation, efficient Design Of Experiments (DOE) tunnel test techniques, modeling, multi-body equations of motion, multi-body actuator models, simulation, control algorithm design, and flight test avionics, testing, and analysis. The following compendium surveys key disciplines required to develop an effective control system for this challenging vehicle in this on-going effort.

  11. Structural model of control system for hydraulic stepper motor complex

    NASA Astrophysics Data System (ADS)

    Obukhov, A. D.; Dedov, D. L.; Kolodin, A. N.

    2018-03-01

    The article considers the problem of developing a structural model of the control system for a hydraulic stepper drive complex. A comparative analysis of stepper drives is carried out, together with an assessment of the applicability of hydraulic stepper motors (HSM) to problems requiring accurate displacement in space with subsequent positioning of the object. The presented structural model of the automated control system for a multi-spindle complex of hydraulic stepper drives reflects the main components of the system, as well as its control process, based on the transfer of control signals from the controller to the solenoid valves. The models and methods described in the article can be used to formalize the control process in technical systems built on hydraulic stepper drives and allow a transition from mechanical to automated control.

  12. EPR spectroscopy of complex biological iron-sulfur systems.

    PubMed

    Hagen, Wilfred R

    2018-02-21

    From the very first discovery of biological iron-sulfur clusters with EPR, the spectroscopy has been used to study not only purified proteins but also complex systems such as respiratory complexes, membrane particles and, later, whole cells. In recent times, the emphasis of iron-sulfur biochemistry has moved from the characterization of individual proteins to the systems biology of iron-sulfur biosynthesis, regulation, degradation, and implications for human health. Although this move would suggest a blossoming of System-EPR as a specific, non-invasive monitor of Fe/S (dys)homeostasis in whole cells, a review of the literature reveals limited success, possibly due to technical difficulties in adherence to EPR spectroscopic and biochemical standards. In an attempt to boost the application of System-EPR, the required boundary conditions and their practical applications are explicitly and comprehensively formulated.

  13. Uncertainties in building a strategic defense.

    PubMed

    Zraket, C A

    1987-03-27

    Building a strategic defense against nuclear ballistic missiles involves complex and uncertain functional, spatial, and temporal relations. Such a defensive system would evolve and grow over decades. It is too complex, dynamic, and interactive to be fully understood initially by design, analysis, and experiments. Uncertainties exist in the formulation of requirements and in the research and design of a defense architecture that can be implemented incrementally and be fully tested to operate reliably. The analysis and measurement of system survivability, performance, and cost-effectiveness are critical to this process. Similar complexities exist for an adversary's system that would suppress or use countermeasures against a missile defense. Problems and opportunities posed by these relations are described, with emphasis on the unique characteristics and vulnerabilities of space-based systems.

  14. 33 CFR 149.125 - What are the requirements for the malfunction detection system?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) Each oil and natural gas system, between a pumping platform complex and the shore, must have a system... malfunction detection system? 149.125 Section 149.125 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: DESIGN, CONSTRUCTION, AND EQUIPMENT...

  15. 33 CFR 149.125 - What are the requirements for the malfunction detection system?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) Each oil and natural gas system, between a pumping platform complex and the shore, must have a system... malfunction detection system? 149.125 Section 149.125 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: DESIGN, CONSTRUCTION, AND EQUIPMENT...

  16. Traditional Chinese medicine: potential approaches from modern dynamical complexity theories.

    PubMed

    Ma, Yan; Zhou, Kehua; Fan, Jing; Sun, Shuchen

    2016-03-01

    Despite the widespread use of traditional Chinese medicine (TCM) in clinical settings, proving its effectiveness via scientific trials is still a challenge. TCM views the human body as a complex dynamical system, and focuses on the balance of the human body, both internally and with its external environment. Such fundamental concepts require investigations using system-level quantification approaches, which are beyond conventional reductionism. Only methods that quantify dynamical complexity can bring new insights into the evaluation of TCM. In a previous article, we briefly introduced the potential value of Multiscale Entropy (MSE) analysis in TCM. This article aims to explain the existing challenges in TCM quantification, to introduce the consistency of dynamical complexity theories and TCM theories, and to inspire future system-level research on health and disease.
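
    For orientation, the sketch below computes multiscale entropy in a common way: coarse-grain the signal at several scales and take the sample entropy of each coarse-grained series. The parameter choices (m = 2, r = 0.15 × SD, non-overlapping averaging) are conventional defaults and are not taken from this article.

        # Multiscale entropy (MSE) sketch: sample entropy of coarse-grained series.
        import numpy as np

        def sample_entropy(x, m=2, r=0.2):
            x = np.asarray(x, dtype=float)
            n = len(x)
            def match_count(mm):
                templates = np.array([x[i:i + mm] for i in range(n - mm)])
                count = 0
                for i in range(len(templates)):
                    dist = np.max(np.abs(templates - templates[i]), axis=1)
                    count += np.sum(dist <= r) - 1      # exclude the self-match
                return count
            b, a = match_count(m), match_count(m + 1)
            return -np.log(a / b) if a > 0 and b > 0 else np.inf

        def multiscale_entropy(x, scales=range(1, 6), m=2):
            x = np.asarray(x, dtype=float)
            r = 0.15 * x.std()                          # tolerance fixed at the original scale
            values = []
            for tau in scales:
                n = len(x) // tau
                coarse = x[:n * tau].reshape(n, tau).mean(axis=1)   # non-overlapping averages
                values.append(sample_entropy(coarse, m=m, r=r))
            return values

        # White noise loses entropy at coarser scales; correlated (e.g. 1/f-like) signals do not.
        rng = np.random.default_rng(0)
        print(np.round(multiscale_entropy(rng.standard_normal(3000)), 3))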

  17. Application of Strength Requirements to Complex Loading Scenarios

    NASA Technical Reports Server (NTRS)

    England, Scott; Rajulu, Sudhakar

    2016-01-01

    NASA's endeavors in human spaceflight rely on extensive volumes of human-systems integration requirements to ensure mission success. These requirements protect for space hardware accommodation for the full range of potential crewmembers, but cannot cover every possible action and contingency in detail. This study was undertaken in response to questions from various strength requirement users who were unclear how to apply idealized strength requirements that did not map well to the complex loading scenarios that crewmembers would encounter. Three of the most commonly occurring questions from stakeholders were selected to be investigated with human testing and human modeling. Preliminary findings indicate deviation from nominal postures can affect strength requirement compliance positively or negatively, depending on the nature of the deviation. Human modeling offers some avenues for quickly addressing requirement verification questions, but is limited by the fidelity of the model and environment.

  18. Challenges of Developing New Classes of NASA Self-Managing Mission

    NASA Technical Reports Server (NTRS)

    Hinchey, M. G.; Rash, J. I.; Truszkowski, W. F.; Rouff, C. A.; Sterritt, R.

    2005-01-01

    NASA is proposing increasingly complex missions that will require a high degree of autonomy and autonomicity. These missions pose heretofore unforeseen problems and raise issues that have not been well addressed by the community. Assuring the success of such missions will require new software development techniques and tools. This paper discusses some of the challenges that NASA and the rest of the software development community are facing in developing these increasingly complex systems. We give an overview of a proposed NASA mission as well as techniques and tools that are being developed to address autonomic management and the complexity issues inherent in these missions.

  19. StakeMeter: Value-Based Stakeholder Identification and Quantification Framework for Value-Based Software Systems

    PubMed Central

    Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N. A.; Zaheer, Kashif Bin

    2015-01-01

    Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of a VBS system is associated with a concrete set of valuable requirements, and valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of VBS systems; however, the focus on valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for VBS systems. The existing approaches are time-consuming, complex and inconsistent, which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide the low-level implementation details for SIQ initiation and stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes a new SIQ framework called ‘StakeMeter’. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and an application procedure as compared to the other methods. The proposed framework addresses the issues of stakeholder quantification or prioritization, high time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for VBS systems with less judgmental error. PMID:25799490

  20. After Action Report: Advanced Test Reactor Complex 2015 Evaluated Drill October 6, 2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holmes, Forest Howard

    2015-11-01

    The Advanced Test Reactor (ATR) Complex, operated by Battelle Energy Alliance, LLC, at the Idaho National Laboratory (INL) conducted an evaluated drill on October 6, 2015, to allow the ATR Complex emergency response organization (ERO) to demonstrate the ability to respond to and mitigate an emergency by implementing the requirements of DOE O 151.1C, “Comprehensive Emergency Management System.”

  1. Activation of the DnaK-ClpB Complex is Regulated by the Properties of the Bound Substrate.

    PubMed

    Fernández-Higuero, Jose Angel; Aguado, Alejandra; Perales-Calvo, Judit; Moro, Fernando; Muga, Arturo

    2018-04-11

    The chaperone ClpB in bacteria is responsible for the reactivation of aggregated proteins in collaboration with the DnaK system. Association of these chaperones at the aggregate surface stimulates ATP hydrolysis, which mediates substrate remodeling. However, a question that remains unanswered is whether the bichaperone complex can be selectively activated by substrates that require remodeling. We find that large aggregates or bulky, native-like substrates activate the complex, whereas a smaller, permanently unfolded protein or extended, short peptides fail to stimulate it. Our data also indicate that ClpB interacts differently with DnaK in the presence of aggregates or small peptides, displaying a higher affinity for aggregate-bound DnaK, and that DnaK-ClpB collaboration requires the coupled ATPase-dependent remodeling activities of both chaperones. Complex stimulation is mediated by residues at the β subdomain of the DnaK substrate-binding domain, which become accessible to the disaggregase when the lid is allosterically detached from the β subdomain. Complex activation also requires an active NBD2 and the integrity of the M-domain ring of ClpB. Disruption of the M-domain ring allows the unproductive stimulation of the DnaK-ClpB complex in solution. The ability of the DnaK-ClpB complex to discriminate different substrate proteins might allow its activation when client proteins require remodeling.

  2. Reconceptualizing children's complex discharge with health systems theory: novel integrative review with embedded expert consultation and theory development.

    PubMed

    Noyes, Jane; Brenner, Maria; Fox, Patricia; Guerin, Ashleigh

    2014-05-01

    To report a novel review to develop a health systems model of successful transition of children with complex healthcare needs from hospital to home. Children with complex healthcare needs commonly experience an expensive, ineffectual and prolonged nurse-led discharge process. Children gain no benefit from prolonged hospitalization and are exposed to significant harm. Research to enable intervention development and process evaluation across the entire health system is lacking. Novel mixed-method integrative review informed by health systems theory. Data sources: CINAHL, PsychInfo, EMBASE, PubMed, citation searching, personal contact. Review methods: Informed by consultation with experts. English language studies, opinion/discussion papers reporting research, best practice and experiences of children, parents and healthcare professionals and purposively selected policies/guidelines from 2002-December 2012 were abstracted using Framework synthesis, followed by iterative theory development. Seven critical factors derived from thirty-four sources across five health system levels explained successful discharge (new programme theory). All seven factors are required in an integrated care pathway, with a dynamic communication loop to facilitate effective discharge (new programme logic). Current health system responses were frequently static and critical success factors were commonly absent, thereby explaining ineffectual discharge. The novel evidence-based model, which reconceptualizes 'discharge' as a highly complex longitudinal health system intervention, makes a significant contribution to global knowledge to drive practice development. Research is required to develop process and outcome measures at different time points in the discharge process and future trials are needed to determine the effectiveness of integrated health system discharge models. © 2013 John Wiley & Sons Ltd.

  3. Connections Matter: Social Networks and Lifespan Health in Primate Translational Models

    PubMed Central

    McCowan, Brenda; Beisner, Brianne; Bliss-Moreau, Eliza; Vandeleest, Jessica; Jin, Jian; Hannibal, Darcy; Hsieh, Fushing

    2016-01-01

    Humans live in societies full of rich and complex relationships that influence health. The ability to improve human health requires a detailed understanding of the complex interplay of biological systems that contribute to disease processes, including the mechanisms underlying the influence of social contexts on these biological systems. A longitudinal computational systems science approach provides methods uniquely suited to elucidate the mechanisms by which social systems influence health and well-being by investigating how they modulate the interplay among biological systems across the lifespan. In the present report, we argue that nonhuman primate social systems are sufficiently complex to serve as model systems allowing for the development and refinement of both analytical and theoretical frameworks linking social life to health. Ultimately, developing systems science frameworks in nonhuman primate models will speed discovery of the mechanisms that subserve the relationship between social life and human health. PMID:27148103

  4. Representing Operational Modes for Situation Awareness

    NASA Astrophysics Data System (ADS)

    Kirchhübel, Denis; Lind, Morten; Ravn, Ole

    2017-01-01

    Operating complex plants is an increasingly demanding task for human operators. Diagnosis of and reaction to on-line events requires the interpretation of real-time data. Vast amounts of sensor data as well as operational knowledge about the state and design of the plant are necessary to deduce reasonable reactions to abnormal situations. Intelligent computational support tools can make the operator’s task easier, but they require knowledge about the overall system in the form of some model. While tools used for fault-tolerant control design based on physical principles and relations are valuable for designing robust systems, the models become too complex when considering the interactions on a plant-wide level. The alarm systems meant to support human operators in the diagnosis of the plant-wide situation, on the other hand, fail regularly in situations where these interactions of systems lead to many related alarms, overloading the operator with alarm floods. Functional modelling can provide a middle way to reduce the complexity of plant-wide models by abstracting from physical details to more general functions and behaviours. Based on functional models, the propagation of failures through the interconnected systems can be inferred and alarm floods can potentially be reduced to their root cause. However, the desired behaviour of a complex system changes due to operating procedures that require more than one physical and functional configuration. In this paper a consistent representation of possible configurations is deduced from the analysis of an exemplary start-up procedure by functional models. The proposed interpretation of the modelling concepts simplifies the functional modelling of distinct modes. The analysis further reveals relevant links between the quantitative sensor data and the qualitative perspective of the diagnostics tool based on functional models. This will form the basis for the ongoing development of a novel real-time diagnostics system based on the on-line adaptation of the underlying MFM model.

  5. FPGA-Based Laboratory Assignments for NoC-Based Manycore Systems

    ERIC Educational Resources Information Center

    Ttofis, C.; Theocharides, T.; Michael, M. K.

    2012-01-01

    Manycore systems have emerged as being one of the dominant architectural trends in next-generation computer systems. These highly parallel systems are expected to be interconnected via packet-based networks-on-chip (NoC). The complexity of such systems poses novel and exciting challenges in academia, as teaching their design requires the students…

  6. Principal Physicochemical Methods Used to Characterize Dendrimer Molecule Complexes Used as Genetic Therapy Agents, Nanovaccines or Drug Carriers.

    PubMed

    Alberto, Rodríguez Fonseca Rolando; Joao, Rodrigues; de Los Angeles, Muñoz-Fernández María; Alberto, Martínez Muñoz; Manuel Jonathan, Fragoso Vázquez; José, Correa Basurto

    2017-08-30

    Nanomedicine is the application of nanotechnology to medicine. This field is related to the study of nanodevices and nanomaterials applied to various medical uses, such as improving the pharmacological properties of different molecules. Dendrimers are synthetic nanoparticles whose physicochemical properties vary according to their chemical structure. These molecules have been extensively investigated as drug nanocarriers to improve drug solubility and as sustained-release systems. New therapies such as gene therapy and the development of nanovaccines can be improved by the use of dendrimers. The biophysical and physicochemical characterization of nucleic acid/peptide-dendrimer complexes is crucial to identify their functional properties prior to biological evaluation. In that sense, it is necessary to first identify whether the peptide-dendrimer or nucleic acid-dendrimer complexes can be formed and whether the complex can dissociate under the appropriate conditions at the target cells. In addition, biophysical and physicochemical characterization is required to determine how long the complexes remain stable, what proportion of peptide or nucleic acid is required to form the complex or saturate the dendrimer, and the size of the complex formed. In this review, we present the latest information on characterization systems for dendrimer-nucleic acid, dendrimer-peptide and dendrimer-drug complexes with several biotechnological and pharmacological applications. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  7. JPL Counterfeit Parts Avoidance

    NASA Technical Reports Server (NTRS)

    Risse, Lori

    2012-01-01

    SPACE ARCHITECTURE / ENGINEERING: It provides an extreme test bed for technologies and concepts as well as for procedures and processes. Design and construction (engineering) always go together, especially with complex systems. Requirements (objectives) are crucial; more important than the answers are the questions, requirements, tools, techniques, and processes. Different environments force architects and engineers to think out of the box: for instance, there might not be gravity forces. Complex architectural problems have common roots, in Space and on Earth. Let us bring Space down to Earth so we can keep sending Mankind to the stars from a better world. Have fun being architects and engineers! This time is amazing and historical. We are changing the way we inhabit the solar system!

  8. Enhanced job control language procedures for the SIMSYS2D two-dimensional water-quality simulation system

    USGS Publications Warehouse

    Karavitis, G.A.

    1984-01-01

    The SIMSYS2D two-dimensional water-quality simulation system is a large-scale digital modeling software system used to simulate flow and transport of solutes in freshwater and estuarine environments. Due to the size, processing requirements, and complexity of the system, there is a need to easily move the system and its associated files between computer sites when required. A series of job control language (JCL) procedures was written to allow transferability between IBM and IBM-compatible computers. (USGS)

  9. Managing Complex IT Security Processes with Value Based Measures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Sheldon, Frederick T; Mili, Ali

    2009-01-01

    Current trends indicate that IT security measures will need to greatly expand to counter the increasingly sophisticated, well-funded and/or economically motivated threat space. Traditional risk management approaches provide an effective method for guiding courses of action for assessment and mitigation investments. However, such approaches, no matter how popular, demand very detailed knowledge about the IT security domain and the enterprise/cyber architectural context. Typically, the critical nature and/or high stakes require careful consideration and adaptation of a balanced approach that provides reliable and consistent methods for rating vulnerabilities. As reported in earlier works, the Cyberspace Security Econometrics System provides a more comprehensive measure of the reliability, security and safety of a system that accounts for the criticality of each requirement as a function of one or more stakeholders' interests in that requirement. This paper advocates a dependability measure that acknowledges the aggregate structure of complex system specifications, and accounts for variations by stakeholder, by specification components, and by verification and validation impact.

  10. Peculiarities of organizing the construction of nuclear medicine facilities and the transportation of radionuclide

    NASA Astrophysics Data System (ADS)

    Telichenko, Valeriy; Malykha, Galina; Dorogan, Igor

    2017-10-01

    The article is devoted to the organization of the construction of nuclear medicine facilities in Russia. It describes the main methods of nuclear medical diagnostics, the peculiarities of nuclear medicine facilities that call for specific methods of organizing and managing their construction, and methods of requirements management in organizing such construction. The sustainable organization of the transport of radioactive isotopes from the place of production to the places of consumption is also very important for the safety of the population. A requirements management system is an important and necessary component in organizing the construction of complex facilities such as nuclear medicine facilities. The author developed and proposed a requirements management system for the design, construction and operation of a nuclear medicine facility, which provides for a cyclic sequence of actions. This system reduces the consumption of resources, including materials and energy, during the construction and operation of complex facilities.

  11. Designing and Validating Assessments of Complex Thinking in Science

    ERIC Educational Resources Information Center

    Ryoo, Kihyun; Linn, Marcia C.

    2015-01-01

    Typical assessment systems often measure isolated ideas rather than the coherent understanding valued in current science classrooms. Such assessments may motivate students to memorize, rather than to use new ideas to solve complex problems. To meet the requirements of the Next Generation Science Standards, instruction needs to emphasize sustained…

  12. Receiver bandwidth effects on complex modulation and detection using directly modulated lasers.

    PubMed

    Yuan, Feng; Che, Di; Shieh, William

    2016-05-01

    Directly modulated lasers (DMLs) have long been employed for short- and medium-reach optical communications due to their low cost. Recently, a new modulation scheme called complex modulated DMLs has been demonstrated showing a significant optical signal to noise ratio sensitivity enhancement compared with the traditional intensity-only detection scheme. However, chirp-induced optical spectrum broadening is inevitable in complex modulated systems, which may imply a need for high-bandwidth receivers. In this Letter, we study the impact of receiver bandwidth effects on the performance of complex modulation and coherent detection systems based on DMLs. We experimentally demonstrate that such systems exhibit a reasonable tolerance for the reduced receiver bandwidth. For 10 Gbaud 4-level pulse amplitude modulation signals, the required electrical bandwidth is as low as 8.5 and 7.5 GHz for 7% and 20% forward error correction, respectively. Therefore, it is feasible to realize DML-based complex modulated systems using cost-effective receivers with narrow bandwidth.
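
    The phrase "complex modulated DML" can be made concrete with a small numerical sketch: given an intensity waveform and a chirp (instantaneous frequency deviation) assumed proportional to it, the optical field is the square root of the power times a phase equal to the time integral of the chirp. The sample rate, waveform, and adiabatic-chirp coefficient below are invented, and the Letter's laser and receiver models are not reproduced.

        # Building the complex optical field of a chirped directly modulated laser (toy model).
        import numpy as np

        fs = 100e9                                                 # sample rate, 100 GSa/s (assumed)
        t = np.arange(4096) / fs
        power = 1.0 + 0.5 * np.sign(np.sin(2 * np.pi * 5e9 * t))   # toy 5 GHz on-off power waveform
        kappa = 12e9                                               # adiabatic chirp, Hz per unit power (assumed)
        chirp = kappa * power                                      # instantaneous frequency deviation in Hz
        phase = 2 * np.pi * np.cumsum(chirp) / fs                  # phase = time integral of frequency
        field = np.sqrt(power) * np.exp(1j * phase)                # 2-D (intensity + phase) optical field

        # A coherent receiver sees both |field|**2 and the phase, which is what gives
        # chirp-enabled complex modulation its OSNR advantage over intensity-only detection.
        print(np.round(field[:3], 3))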

  13. Development of Boolean calculus and its applications. [digital systems design

    NASA Technical Reports Server (NTRS)

    Tapia, M. A.

    1980-01-01

    The development of Boolean calculus for its application to developing digital system design methodologies that would reduce system complexity, size, cost, speed, power requirements, etc., is discussed. Synthesis procedures for logic circuits are examined particularly asynchronous circuits using clock triggered flip flops.

  14. Simulation of a Moving Elastic Beam Using Hamilton’s Weak Principle

    DTIC Science & Technology

    2006-03-01

    versions were limited to two-dimensional systems with open tree configurations (where a cut in any component separates the system in half) [48]. This...whose components experienced large angular rotations (turbomachinery, camshafts, flywheels, etc.). More complex systems required the simultaneous

  15. 10 CFR 960.5-2-10 - Hydrology.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... the host rock and the land surface. (2) Absence of surface-water systems that could potentially cause flooding of the repository. (3) Availability of the water required for repository construction, operation, and closure. (c) Potentially adverse condition. Ground-water conditions that could require complex...

  16. Object Based Systems Engineering

    DTIC Science & Technology

    2011-10-17

    practically impossible where the original SMEs are unavailable or lack perfect recall. 7. Capture the precious and transient logic behind this...complex system. References 1. FITCH, J. Exploiting Decision-to-Requirements Traceability, briefing to NDIA CMMI Conference, November, 2009 2

  17. Controllability of complex networks for sustainable system dynamics

    EPA Science Inventory

    Successful implementation of sustainability ideas in ecosystem management requires a basic understanding of the often non-linear and non-intuitive relationships among different dimensions of sustainability, particularly the system-wide implications of human actions. This basic un...

  18. PM2006: a highly scalable urban planning management information system--Case study: Suzhou Urban Planning Bureau

    NASA Astrophysics Data System (ADS)

    Jing, Changfeng; Liang, Song; Ruan, Yong; Huang, Jie

    2008-10-01

    During the urbanization process, with the complex requirements of city development, ever-growing urban data, the rapid development of the planning business and increasing planning complexity, a scalable, extensible urban planning management information system is urgently needed. PM2006 is such a system. In response to the current status and problems of urban planning, the scalability and extensibility of PM2006 are introduced, covering business-oriented workflow extensibility, scalability of the DLL-based architecture, flexibility across GIS and database platforms, scalability of data updating and maintenance, and so on. It is verified that the PM2006 system has good extensibility and scalability, can meet the requirements of all levels of administrative divisions, and can adapt to ever-growing changes in the urban planning business. At the end of this paper, the application of PM2006 in the Urban Planning Bureau of Suzhou city is described.

  19. Integrated geometry and grid generation system for complex configurations

    NASA Technical Reports Server (NTRS)

    Akdag, Vedat; Wulf, Armin

    1992-01-01

    A grid generation system was developed that enables grid generation for complex configurations. The system called ICEM/CFD is described and its role in computational fluid dynamics (CFD) applications is presented. The capabilities of the system include full computer aided design (CAD), grid generation on the actual CAD geometry definition using robust surface projection algorithms, interfacing easily with known CAD packages through common file formats for geometry transfer, grid quality evaluation of the volume grid, coupling boundary condition set-up for block faces with grid topology generation, multi-block grid generation with or without point continuity and block to block interface requirement, and generating grid files directly compatible with known flow solvers. The interactive and integrated approach to the problem of computational grid generation not only substantially reduces manpower time but also increases the flexibility of later grid modifications and enhancements which is required in an environment where CFD is integrated into a product design cycle.

  20. Duobinary pulse shaping for frequency chirp enabled complex modulation.

    PubMed

    Che, Di; Yuan, Feng; Khodakarami, Hamid; Shieh, William

    2016-09-01

    The frequency chirp of optical direct modulation (DM) used to be a performance barrier for optical transmission systems, because it broadens the signal's optical spectrum, which then becomes more susceptible to chromatic-dispersion-induced inter-symbol interference (ISI). However, by considering the chirp as frequency modulation, a single DM simultaneously generates a 2-D signal containing the intensity and the phase (namely, the time integral of frequency). This complex modulation concept significantly increases the optical signal-to-noise ratio (OSNR) sensitivity of DM systems. This Letter studies duobinary pulse shaping (DB-PS) for chirp-enabled DM and its impact on the optical bandwidth and system OSNR sensitivity. DB-PS relieves the bandwidth requirement at the expense of system OSNR sensitivity. As DB-PS induces a controlled ISI, the receiver requires one more tap for maximum likelihood sequence estimation (MLSE). We verify this modified MLSE with a 10-Gbaud duobinary PAM-4 transmission experiment.
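
    A toy sketch of duobinary pulse shaping as a (1 + D) filter applied to a PAM-4 symbol stream, the controlled-ISI idea described above; the symbol alphabet and normalization are illustrative, and the MLSE receiver with its extra tap is not modeled.

        # Duobinary (1 + D) shaping of a PAM-4 symbol stream.
        import numpy as np

        rng = np.random.default_rng(1)
        pam4 = rng.choice([-3, -1, 1, 3], size=20)       # PAM-4 symbols
        duobinary = np.convolve(pam4, [1, 1])[:-1]       # each output is s[k] + s[k-1]

        print("PAM-4    :", pam4[:10])
        print("duobinary:", duobinary[:10])
        # The one-symbol memory roughly halves the required bandwidth but produces 7 levels,
        # so the receiver must equalize the known ISI (e.g. with sequence estimation).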

  1. Engine health monitoring: An advanced system

    NASA Technical Reports Server (NTRS)

    Dyson, R. J. E.

    1981-01-01

    The advanced propulsion monitoring system is described. The system was developed in order to fulfill a growing need for effective engine health monitoring. This need is generated by military requirements for increased performance and efficiency in more complex propulsion systems, while maintaining or improving the cost to operate. This program represents a vital technological step in the advancement of the state of the art for monitoring systems in terms of reliability, flexibility, accuracy, and provision of user oriented results. It draws heavily on the technology and control theory developed for modern, complex, electronically controlled engines and utilizes engine information which is a by-product of such a system.

  2. In-flight testing of the space shuttle orbiter thermal control system

    NASA Technical Reports Server (NTRS)

    Taylor, J. T.

    1985-01-01

    In-flight thermal control system testing of a complex manned spacecraft such as the space shuttle orbiter and the considerations attendant to the definition of the tests are described. Design concerns, design mission requirements, flight test objectives, crew vehicle and mission risk considerations, instrumentation, data requirements, and real-time mission monitoring are discussed. An overview of the tests results is presented.

  3. System Requirement Analyses for Ubiquitous Environment Management System

    NASA Astrophysics Data System (ADS)

    Lim, Sang Boem; Gil, Kyung Jun; Choe, Ho Rim; Eo, Yang Dam

    We are living in a new stage of society. The U-City brings to the future city a new paradigm that cannot be achieved in the traditional city. Korea is one of the most active countries constructing U-Cities, building on advances in IT technologies, especially the high-speed network deployed throughout the country [1]. People are realizing that ubiquitous services are a key factor in the success of a U-City. Among these U-services, the U-security service is one of the most important. Nowadays we have to be concerned about traditional threats and also about personal information. Since the apartment complex is the most common residence type in Korea, we are developing security rules and a system based on analyses of the apartment complex and its assets. Based on these analyses, we are developing apartment complex security using various technologies, including a home network system. We will also discuss a basic home network security architecture.

  4. Validation of a SysML based design for wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Berrachedi, Amel; Rahim, Messaoud; Ioualalen, Malika; Hammad, Ahmed

    2017-07-01

    When developing complex systems, the requirement for the verification of the systems' design is one of the main challenges. Wireless Sensor Networks (WSNs) are examples of such systems. We address the problem of how WSNs must be designed to fulfil the system requirements. Using the SysML Language, we propose a Model Based System Engineering (MBSE) specification and verification methodology for designing WSNs. This methodology uses SysML to describe the WSNs requirements, structure and behaviour. Then, it translates the SysML elements to an analytic model, specifically, to a Deterministic Stochastic Petri Net. The proposed approach allows to design WSNs and study their behaviors and their energy performances.

  5. Framework for Architecture Trade Study Using MBSE and Performance Simulation

    NASA Technical Reports Server (NTRS)

    Ryan, Jessica; Sarkani, Shahram; Mazzuchim, Thomas

    2012-01-01

    Increasing complexity in modern systems as well as cost and schedule constraints require a new paradigm of system engineering to fulfill stakeholder needs. Challenges facing efficient trade studies include poor tool interoperability, lack of simulation coordination (design parameters) and requirements flowdown. A recent trend toward Model Based System Engineering (MBSE) includes flexible architecture definition, program documentation, requirements traceability and system engineering reuse. As a new domain, MBSE still lacks governing standards and commonly accepted frameworks. This paper proposes a framework for efficient architecture definition using MBSE in conjunction with domain-specific simulation to evaluate trade studies. A general framework is provided, followed by a specific example including a method for designing a trade study, defining candidate architectures, planning simulations to fulfill requirements and, finally, a weighted decision analysis to optimize system objectives.
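
    The weighted decision analysis mentioned above can be as simple as a weighted-sum scoring of candidate architectures against trade-study criteria; the criteria, weights, and scores in this sketch are invented for illustration and are not from the paper.

        # Weighted-sum decision analysis over candidate architectures (toy numbers).
        import numpy as np

        criteria = ["performance", "cost", "risk", "schedule"]
        weights = np.array([0.4, 0.3, 0.2, 0.1])         # stakeholder weights, summing to 1

        # Normalized scores (0..1, higher is better) per candidate architecture.
        candidates = {
            "Architecture A": np.array([0.9, 0.5, 0.6, 0.7]),
            "Architecture B": np.array([0.7, 0.8, 0.8, 0.6]),
            "Architecture C": np.array([0.6, 0.9, 0.7, 0.9]),
        }

        ranked = sorted(((name, float(s @ weights)) for name, s in candidates.items()),
                        key=lambda kv: kv[1], reverse=True)
        for name, total in ranked:
            print(f"{name}: weighted score = {total:.2f}")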

  6. Healthcare software assurance.

    PubMed

    Cooper, Jason G; Pauley, Keith A

    2006-01-01

    Software assurance is a rigorous, lifecycle phase-independent set of activities which ensure completeness, safety, and reliability of software processes and products. This is accomplished by guaranteeing conformance to all requirements, standards, procedures, and regulations. These assurance processes are even more important when coupled with healthcare software systems, embedded software in medical instrumentation, and other healthcare-oriented life-critical systems. The current Food and Drug Administration (FDA) regulatory requirements and guidance documentation do not address certain aspects of complete software assurance activities. In addition, the FDA's software oversight processes require enhancement to include increasingly complex healthcare systems such as Hospital Information Systems (HIS). The importance of complete software assurance is introduced, current regulatory requirements and guidance discussed, and the necessity for enhancements to the current processes shall be highlighted.

  7. Healthcare Software Assurance

    PubMed Central

    Cooper, Jason G.; Pauley, Keith A.

    2006-01-01

    Software assurance is a rigorous, lifecycle phase-independent set of activities which ensure completeness, safety, and reliability of software processes and products. This is accomplished by guaranteeing conformance to all requirements, standards, procedures, and regulations. These assurance processes are even more important when coupled with healthcare software systems, embedded software in medical instrumentation, and other healthcare-oriented life-critical systems. The current Food and Drug Administration (FDA) regulatory requirements and guidance documentation do not address certain aspects of complete software assurance activities. In addition, the FDA’s software oversight processes require enhancement to include increasingly complex healthcare systems such as Hospital Information Systems (HIS). The importance of complete software assurance is introduced, current regulatory requirements and guidance are discussed, and the necessity for enhancements to the current processes is highlighted. PMID:17238324

  8. Control of complex physically simulated robot groups

    NASA Astrophysics Data System (ADS)

    Brogan, David C.

    2001-10-01

    Actuated systems such as robots take many forms and sizes but each requires solving the difficult task of utilizing available control inputs to accomplish desired system performance. Coordinated groups of robots provide the opportunity to accomplish more complex tasks, to adapt to changing environmental conditions, and to survive individual failures. Similarly, groups of simulated robots, represented as graphical characters, can test the design of experimental scenarios and provide autonomous interactive counterparts for video games. The complexity of writing control algorithms for these groups currently hinders their use. A combination of biologically inspired heuristics, search strategies, and optimization techniques serve to reduce the complexity of controlling these real and simulated characters and to provide computationally feasible solutions.

  9. Work-Facilitating Information Visualization Techniques for Complex Wastewater Systems

    NASA Astrophysics Data System (ADS)

    Ebert, Achim; Einsfeld, Katja

    The design and the operation of urban drainage systems and wastewater treatment plants (WWTP) have become increasingly complex. This complexity is due to increased requirements concerning process technology as well as technical, environmental, economic, and occupational safety aspects. The plant operator has access not only to some timeworn files and measured parameters but also to numerous on-line and off-line parameters that characterize the current state of the plant in detail. Moreover, expert databases and specific support pages of plant manufacturers are accessible through the World Wide Web. Thus, the operator is overwhelmed with predominantly unstructured data.

  10. Software Requirements Analysis as Fault Predictor

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores

    2003-01-01

    Waiting until the integration and system test phase to discover errors leads to more costly rework than resolving those same errors earlier in the lifecycle. Costs increase even more significantly once a software system has become operational. We can assess the quality of system requirements, but do little to correlate this information either to system assurance activities or to long-term reliability projections - both of which remain unclear and anecdotal. Extending earlier work on requirements accomplished by the ARM tool, measuring requirements quality information against code complexity and test data for the same system may be used to predict the specific software modules containing high-impact or deeply embedded faults that now escape into operational systems. Such knowledge would lead to more effective and efficient test programs. It may enable insight into whether a program should be maintained or started over.
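
    To make the proposed correlation concrete, the sketch below shows one hypothetical way requirements-quality indicators (such as those reported by the ARM tool) could be combined with code complexity into a per-module fault-risk ranking. The metric names, weights, and numbers are invented for illustration and are not from the paper.

    ```python
    # Hypothetical per-module fault-risk ranking combining requirements-quality
    # indicators with code complexity; all names, weights and numbers are invented.
    modules = [
        # (module, ambiguous requirement terms, fraction of requirements traced, cyclomatic complexity)
        ("guidance",   12, 0.70, 45),
        ("telemetry",   3, 0.95, 20),
        ("power_mgmt",  8, 0.80, 60),
    ]

    def risk_score(ambiguous_terms, traced_fraction, complexity,
                   w_ambig=0.4, w_trace=0.3, w_cplx=0.3):
        """Higher score = more likely to harbour deeply embedded faults."""
        return (w_ambig * ambiguous_terms / 20.0        # normalised against a nominal maximum
                + w_trace * (1.0 - traced_fraction)     # poor traceability raises risk
                + w_cplx * complexity / 100.0)          # high complexity raises risk

    for name, ambig, traced, cplx in sorted(modules,
                                            key=lambda m: risk_score(*m[1:]), reverse=True):
        print(f"{name:12s} risk = {risk_score(ambig, traced, cplx):.2f}")
    ```

    A real study would calibrate such weights against observed fault data rather than fixing them a priori.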

  11. Cellular Decomposition Based Hybrid-Hierarchical Control Systems with Applications to Flight Management Systems

    NASA Technical Reports Server (NTRS)

    Caines, P. E.

    1999-01-01

    The work in this research project has been focused on the construction of a hierarchical hybrid control theory which is applicable to flight management systems. The motivation and underlying philosophical position for this work has been that the scale, inherent complexity and the large number of agents (aircraft) involved in an air traffic system imply that a hierarchical modelling and control methodology is required for its management and real time control. In the current work the complex discrete or continuous state space of a system with a small number of agents is aggregated in such a way that discrete (finite state machine or supervisory automaton) controlled dynamics are abstracted from the system's behaviour. High level control may then be either directly applied at this abstracted level, or, if this is in itself of significant complexity, further layers of abstractions may be created to produce a system with an acceptable degree of complexity at each level. By the nature of this construction, high level commands are necessarily realizable at lower levels in the system.

  12. Light-controlled resistors provide quadrature signal rejection for high-gain servo systems

    NASA Technical Reports Server (NTRS)

    Mc Cauley, D. D.

    1967-01-01

    Servo amplifier feedback system, in which the phase-sensitive detection, low-pass filtering, and multiplication functions required for quadrature rejection are performed by light-controlled photoresistors, eliminates complex circuitry. System increases gain, improves signal-to-noise ratio, and eliminates the necessity for compensation.

  13. Public Leadership Competencies in Adoption of Enterprise Systems at Federal Government Institutions

    ERIC Educational Resources Information Center

    Lapham, John Edmund

    2009-01-01

    The Federal Government continues to implement enterprise systems (information and communication technology solutions) as part of reinvention and business transformation. Enterprise system implementations are complex, costly, and often under achieving endeavors requiring that effective public leaders engage and influence the sociotechnical projects…

  14. Efficient evaluation of wireless real-time control networks.

    PubMed

    Horvath, Peter; Yampolskiy, Mark; Koutsoukos, Xenofon

    2015-02-11

    In this paper, we present a system simulation framework for the design and performance evaluation of complex wireless cyber-physical systems. We describe the simulator architecture and the specific developments that are required to simulate cyber-physical systems relying on multi-channel, multi-hop mesh networks. We introduce realistic and efficient physical layer models and a system simulation methodology, which provides statistically significant performance evaluation results with low computational complexity. The capabilities of the proposed framework are illustrated in the example of WirelessHART, a centralized, real-time, multi-hop mesh network designed for industrial control and monitoring applications.

  15. Scaled CMOS Reliability and Considerations for Spacecraft Systems : Bottom-Up and Top-Down Perspectives

    NASA Technical Reports Server (NTRS)

    White, Mark

    2012-01-01

    The recently launched Mars Science Laboratory (MSL) flagship mission, named Curiosity, is the most complex rover ever built by NASA and is scheduled to touch down on the red planet in August 2012 in Gale Crater. The rover and its instruments will have to endure the harsh environments of the surface of Mars to fulfill its main science objectives. Such complex systems require reliable microelectronic components coupled with adequate component- and system-level design margins. Reliability aspects of these elements of the spacecraft system are presented from bottom-up and top-down perspectives.

  16. Advanced Materials, Technologies, and Complex Systems Analyses: Emerging Opportunities to Enhance Urban Water Security.

    PubMed

    Zodrow, Katherine R; Li, Qilin; Buono, Regina M; Chen, Wei; Daigger, Glen; Dueñas-Osorio, Leonardo; Elimelech, Menachem; Huang, Xia; Jiang, Guibin; Kim, Jae-Hong; Logan, Bruce E; Sedlak, David L; Westerhoff, Paul; Alvarez, Pedro J J

    2017-09-19

    Innovation in urban water systems is required to address the increasing demand for clean water due to population growth and aggravated water stress caused by water pollution, aging infrastructure, and climate change. Advances in materials science, modular water treatment technologies, and complex systems analyses, coupled with the drive to minimize the energy and environmental footprints of cities, provide new opportunities to ensure a resilient and safe water supply. We present a vision for enhancing efficiency and resiliency of urban water systems and discuss approaches and research needs for overcoming associated implementation challenges.

  17. Etoile Project : Social Intelligent ICT-System for very large scale education in complex systems

    NASA Astrophysics Data System (ADS)

    Bourgine, P.; Johnson, J.

    2009-04-01

    The project will devise new theory and implement new ICT-based methods of delivering high-quality low-cost postgraduate education to many thousands of people in a scalable way, with the cost of each extra student being negligible (< a few Euros). The research will create an in vivo laboratory of one to ten thousand postgraduate students studying courses in complex systems. This community is chosen because it is large and interdisciplinary and there is a known requirement for courses for thousands of students across Europe. The project involves every aspect of course production and delivery. Within this, the research focused on the creation of a Socially Intelligent Resource Mining system to gather large volumes of high quality educational resources from the internet; new methods to deconstruct these to produce a semantically tagged Learning Object Database; a Living Course Ecology to support the creation and maintenance of evolving course materials; systems to deliver courses; and a 'socially intelligent assessment system'. The system will be tested on one to ten thousand postgraduate students in Europe working towards the Complex System Society's title of European PhD in Complex Systems. Étoile will have a very high impact both scientifically and socially by (i) the provision of new scalable ICT-based methods for providing very low cost scientific education, (ii) the creation of new mathematical and statistical theory for the multiscale dynamics of complex systems, (iii) the provision of a working example of adaptation and emergence in complex socio-technical systems, and (iv) making a major educational contribution to European complex systems science and its applications.

  18. Complexity analysis of the Next Gen Air Traffic Management System: trajectory based operations.

    PubMed

    Lyons, Rhonda

    2012-01-01

    According to Federal Aviation Administration traffic predictions, our Air Traffic Management (ATM) system is currently operating at 150 percent of capacity, and traffic is forecast to increase to a staggering 250 percent within the next two decades [17]. This will require a major redesign of the system. Today's ATM system is complex. It is designed to provide air traffic services safely, economically, and efficiently through the cost-effective provision of facilities and seamless services in collaboration with multiple agents; however, contrary to that vision, the system is loosely integrated and suffers tremendously from antiquated equipment and saturated airways. The new Next Generation (NextGen) ATM system is intended to transform the current system into an agile, robust and responsive set of operations that can safely manage the growing needs of a projected, increasingly complex and diverse set of air transportation system users and massive projected worldwide traffic rates. This revolutionary technology-centric system is dynamically complex and much more sophisticated than its soon-to-be predecessor. ATM system failures could yield large-scale catastrophic consequences, as it is a safety-critical system. This work attempts to describe complexity and the complex nature of the NextGen ATM system and Trajectory Based Operations (TBO). Complex human factors interactions within NextGen are analyzed using a proposed dual experimental approach designed to identify hazards and gaps and to elicit emergent hazards that would not be visible if the analyses were conducted in isolation. Suggestions are made, along with a proposal for future human factors research in the safety-critical NextGen TBO environment.

  19. Model Error Budgets

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    2008-01-01

    An error budget is a commonly used tool in the design of complex aerospace systems. It represents system performance requirements in terms of allowable errors and flows these down through a hierarchical structure to lower assemblies and components. The requirements may simply be 'allocated' based upon heuristics or experience, or they may be designed through the use of physics-based models. This paper presents a basis for developing an error budget for models of the system, as opposed to the system itself. The need for model error budgets arises when system models are a principal design agent, as is increasingly common for poorly testable high-performance space systems.
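
    As a minimal illustration of the flow-down idea, the sketch below allocates a hypothetical top-level error to sub-assemblies and checks the roll-up by root-sum-square; RSS is only one of several combination rules an error budget might use, and the structure and numbers here are assumptions.

    ```python
    import math

    # Illustrative error budget: allocate a hypothetical top-level pointing error
    # (arcsec) to sub-assemblies and verify the root-sum-square (RSS) roll-up.
    top_level_requirement = 10.0   # arcsec, assumed

    allocations = {                # arcsec, assumed allocations
        "structure_thermal":    4.0,
        "sensor_noise":         5.0,
        "control_residual":     6.0,
        "alignment_knowledge":  3.0,
    }

    def rss(values):
        return math.sqrt(sum(v * v for v in values))

    rolled_up = rss(allocations.values())
    print(f"RSS roll-up = {rolled_up:.2f} arcsec, "
          f"margin = {top_level_requirement - rolled_up:.2f} arcsec")
    assert rolled_up <= top_level_requirement, "allocations exceed the requirement"
    ```

    The same roll-up applies whether the leaves are hardware assemblies or, as the paper argues, the error sources of the system models themselves.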

  20. Real-Time Operating System/360

    NASA Technical Reports Server (NTRS)

    Hoffman, R. L.; Kopp, R. S.; Mueller, H. H.; Pollan, W. D.; Van Sant, B. W.; Weiler, P. W.

    1969-01-01

    RTOS has a cost savings advantage for real-time applications, such as those with random inputs requiring a flexible data routing facility, display systems simplified by a device independent interface language, and complex applications needing added storage protection and data queuing.

  1. Automated Design of Complex Dynamic Systems

    PubMed Central

    Hermans, Michiel; Schrauwen, Benjamin; Bienstman, Peter; Dambre, Joni

    2014-01-01

    Several fields of study are concerned with uniting the concept of computation with that of the design of physical systems. For example, a recent trend in robotics is to design robots in such a way that they require a minimal control effort. Another example is found in the domain of photonics, where recent efforts try to benefit directly from the complex nonlinear dynamics to achieve more efficient signal processing. The underlying goal of these and similar research efforts is to internalize a large part of the necessary computations within the physical system itself by exploiting its inherent non-linear dynamics. This, however, often requires the optimization of large numbers of system parameters, related to both the system's structure as well as its material properties. In addition, many of these parameters are subject to fabrication variability or to variations through time. In this paper we apply a machine learning algorithm to optimize physical dynamic systems. We show that such algorithms, which are normally applied on abstract computational entities, can be extended to the field of differential equations and used to optimize an associated set of parameters which determine their behavior. We show that machine learning training methodologies are highly useful in designing robust systems, and we provide a set of both simple and complex examples using models of physical dynamical systems. Interestingly, the derived optimization method is intimately related to direct collocation, a method known in the field of optimal control. Our work suggests that the application domains of both machine learning and optimal control have a largely unexplored overlapping area which envelops a novel design methodology of smart and highly complex physical systems. PMID:24497969
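
    As a much-reduced illustration of "training" a physical dynamical system (not the authors' method), the sketch below tunes the damping coefficient of a simulated oscillator by finite-difference gradient descent on a simple loss; the model, loss, and learning rate are arbitrary choices.

    ```python
    # Toy "training" of a physical dynamical system: tune the damping c of a
    # simulated unit-mass, unit-stiffness oscillator so it is near rest at the end
    # of the run. Finite-difference gradient descent stands in for the ML methods.
    def simulate(c, dt=0.01, steps=2000):
        x, v = 1.0, 0.0                       # initial displacement and velocity
        for _ in range(steps):
            a = -x - c * v                    # acceleration of the damped oscillator
            v += a * dt                       # semi-implicit Euler update
            x += v * dt
        return x, v

    def loss(c):
        x, v = simulate(c)
        return x * x + v * v                  # want the oscillator at rest at the end

    c, lr, eps = 0.1, 0.5, 1e-4
    for _ in range(100):
        grad = (loss(c + eps) - loss(c - eps)) / (2 * eps)   # finite-difference gradient
        c -= lr * grad

    print(f"tuned damping c = {c:.3f}, final loss = {loss(c):.2e}")
    ```

    The paper's contribution is to do this at scale with machine-learning training machinery and to relate the result to direct collocation; the gradient-descent loop above only conveys the flavour.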

  2. Acceleration techniques for dependability simulation. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Barnette, James David

    1995-01-01

    As computer systems increase in complexity, the need to project system performance from the earliest design and development stages increases. We have to employ simulation for detailed dependability studies of large systems. However, as the complexity of the simulation model increases, the time required to obtain statistically significant results also increases. This paper discusses an approach that is application independent and can be readily applied to any process-based simulation model. Topics include background on classical discrete event simulation and techniques for random variate generation and statistics gathering to support simulation.
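
    Since the thesis builds on classical discrete-event simulation with random variate generation and statistics gathering, a bare-bones sketch of those ingredients follows: an M/M/1 queue simulated with exponential variates. The arrival and service rates are arbitrary examples, and the acceleration techniques themselves are not shown.

    ```python
    import random

    # Bare-bones discrete-event simulation of an M/M/1 queue: exponential random
    # variates for arrivals and service, plus simple statistics gathering.
    # Illustrates the classical machinery the thesis builds on, not its speed-ups.
    def mm1(arrival_rate=0.8, service_rate=1.0, customers=100000, seed=42):
        random.seed(seed)
        clock = last_departure = total_time_in_system = 0.0
        for _ in range(customers):
            clock += random.expovariate(arrival_rate)     # this customer's arrival time
            start = max(clock, last_departure)            # wait if the server is busy
            last_departure = start + random.expovariate(service_rate)
            total_time_in_system += last_departure - clock
        return total_time_in_system / customers

    print(f"mean time in system ≈ {mm1():.2f}  (M/M/1 theory: 1/(mu - lambda) = 5.00)")
    ```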

  3. Interactions of platinum metals and their complexes in biological systems.

    PubMed Central

    LeRoy, A F

    1975-01-01

    Platinum-metal oxidation catalysts are to be introduced in exhaust systems of many 1975 model-year automobiles in the U.S. to meet Clean Air Act standards. Small quantities of finely divided catalyst have been found issuing from prototype systems; platinum and palladium compounds may be found also. Although platinum exhibits a remarkable resistance to oxidation and chemical attack, it reacts chemically under some conditions, producing coordination complex compounds. Palladium reacts more readily than platinum. Some platinum-metal complexes interact with biological systems as bacteriostatic, bactericidal, viricidal, and immunosuppressive agents. Workers chronically exposed to platinum complexes often develop asthma-like respiratory distress and skin reactions called platinosis. Platinum complexes used alone and in combination therapy with other drugs have recently emerged as effective agents in cancer chemotherapy. Understanding toxic and favorable interactions of metal species with living organisms requires basic information on the quantities and chemical characteristics of complexes at trace concentrations in biological materials. Some basic chemical kinetic and thermodynamic data are presented to characterize the chemical behavior of the complex cis-[Pt(NH3)2Cl2] used therapeutically. Platinum at nanogram levels in biological tissue is also briefly discussed. PMID:50943

  4. Operable Data Management for Ocean Observing Systems

    NASA Astrophysics Data System (ADS)

    Chavez, F. P.; Graybeal, J. B.; Godin, M. A.

    2004-12-01

    As oceanographic observing systems become more numerous and complex, data management solutions must follow. Most existing oceanographic data management systems fall into one of three categories: they have been developed as dedicated solutions, with limited application to other observing systems; they expect that data will be pre-processed into well-defined formats, such as netCDF; or they are conceived as robust, generic data management solutions, with complexity (high) and maturity and adoption rates (low) to match. Each approach has strengths and weaknesses; no approach yet fully addresses, nor takes advantage of, the sophistication of ocean observing systems as they are now conceived. In this presentation we describe critical data management requirements for advanced ocean observing systems, of the type envisioned by ORION and IOOS. By defining common requirements -- functional, qualitative, and programmatic -- for all such ocean observing systems, the performance and nature of the general data management solution can be characterized. Issues such as scalability, maintaining metadata relationships, data access security, visualization, and operational flexibility suggest baseline architectural characteristics, which may in turn lead to reusable components and approaches. Interoperability with other data management systems, with standards-based solutions in metadata specification and data transport protocols, and with the data management infrastructure envisioned by IOOS and ORION, can also be used to define necessary capabilities. Finally, some requirements for the software infrastructure of ocean observing systems can be inferred. Early operational results and lessons learned, from development and operations of MBARI ocean observing systems, are used to illustrate key requirements, choices, and challenges. Reference systems include the Monterey Ocean Observing System (MOOS), its component software systems (Software Infrastructure and Applications for MOOS, and the Shore Side Data System), and the Autonomous Ocean Sampling Network (AOSN).

  5. Analysis of the possibility of SysML and BPMN application in formal data acquisition system description

    NASA Astrophysics Data System (ADS)

    Ćwikła, G.; Gwiazda, A.; Banaś, W.; Monica, Z.; Foit, K.

    2017-08-01

    The article presents a study of the possible application of selected methods of complex-system description that can support the Manufacturing Information Acquisition System (MIAS) methodology, which describes how to design a data acquisition system for collecting and processing real-time data on the functioning of a production system, as needed for the management of a company. MIAS can enable conversion into a Cyber-Physical Production System. MIAS gathers and pre-processes data on the state of the production system, including, e.g., the realisation of production orders and the state of machines, materials and human resources. A systematised approach and model-based development are proposed for improving the quality of the design of MIAS methodology-based complex systems supporting data acquisition in various types of companies. A graphical specification can be the baseline for any model-based development in the specified areas. The possibility of applying SysML and BPMN, both being UML-based languages representing different approaches to modelling the requirements, architecture and implementation of the data acquisition system, is considered as a means of supporting the description of the required features of MIAS.

  6. Decreasing the temporal complexity for nonlinear, implicit reduced-order models by forecasting

    DOE PAGES

    Carlberg, Kevin; Ray, Jaideep; van Bloemen Waanders, Bart

    2015-02-14

    Implicit numerical integration of nonlinear ODEs requires solving a system of nonlinear algebraic equations at each time step. Each of these systems is often solved by a Newton-like method, which incurs a sequence of linear-system solves. Most model-reduction techniques for nonlinear ODEs exploit knowledge of the system's spatial behavior to reduce the computational complexity of each linear-system solve. However, the number of linear-system solves for the reduced-order simulation often remains roughly the same as that for the full-order simulation. We propose exploiting knowledge of the model's temporal behavior to (1) forecast the unknown variable of the reduced-order system of nonlinear equations at future time steps, and (2) use this forecast as an initial guess for the Newton-like solver during the reduced-order-model simulation. To compute the forecast, we propose using the Gappy POD technique. As a result, the goal is to generate an accurate initial guess so that the Newton solver requires many fewer iterations to converge, thereby decreasing the number of linear-system solves in the reduced-order-model simulation.
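
    The central idea is to warm-start each Newton solve with a forecast of the unknown. The authors use Gappy POD for the forecast; the toy sketch below substitutes plain linear extrapolation of previous steps to show how a better initial guess reduces Newton iterations for an implicit backward-Euler step. The scalar ODE, step size, and extrapolation order are illustrative assumptions.

    ```python
    # Toy implicit (backward-Euler) integration of dx/dt = -x**3, comparing Newton
    # iteration counts for a naive initial guess versus a forecast extrapolated from
    # previous steps (the paper uses Gappy POD; plain extrapolation is shown here).
    def newton(residual, dresidual, x0, tol=1e-12, max_iter=50):
        x, iters = x0, 0
        while abs(residual(x)) > tol and iters < max_iter:
            x -= residual(x) / dresidual(x)
            iters += 1
        return x, iters

    def integrate(forecast, dt=0.1, steps=50):
        history, total_iters = [1.0], 0
        for _ in range(steps):
            x_prev = history[-1]
            res = lambda x, xp=x_prev: x - xp + dt * x**3     # backward-Euler residual
            dres = lambda x: 1.0 + 3.0 * dt * x**2
            x_new, iters = newton(res, dres, forecast(history))
            history.append(x_new)
            total_iters += iters
        return total_iters

    naive = integrate(lambda h: h[-1])                         # guess = previous value
    linear = integrate(lambda h: 2 * h[-1] - h[-2] if len(h) > 1 else h[-1])
    print(f"total Newton iterations: naive guess = {naive}, forecast guess = {linear}")
    ```

    In the reduced-order-model setting the forecast is computed in the reduced coordinates, but the warm-starting effect is the same.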

  7. Management issues in systems engineering

    NASA Astrophysics Data System (ADS)

    Shishko, Robert; Chamberlain, Robert G.; Aster, Robert; Bilardo, Vincent; Forsberg, Kevin; Mooz, Hal; Polaski, Lou; Wade, Ron

    When applied to a system, the doctrine of successive refinement is a divide-and-conquer strategy. Complex systems are successively divided into pieces that are less complex, until they are simple enough to be conquered. This decomposition results in several structures for describing the product system and the producing system. These structures play important roles in systems engineering and project management. Many of the remaining sections in this chapter are devoted to describing some of these key structures. Structures that describe the product system include, but are not limited to, the requirements tree, system architecture and certain symbolic information such as system drawings, schematics, and data bases. The structures that describe the producing system include the project's work breakdown, schedules, cost accounts and organization.

  8. Management issues in systems engineering

    NASA Technical Reports Server (NTRS)

    Shishko, Robert; Chamberlain, Robert G.; Aster, Robert; Bilardo, Vincent; Forsberg, Kevin; Mooz, Hal; Polaski, Lou; Wade, Ron

    1993-01-01

    When applied to a system, the doctrine of successive refinement is a divide-and-conquer strategy. Complex systems are successively divided into pieces that are less complex, until they are simple enough to be conquered. This decomposition results in several structures for describing the product system and the producing system. These structures play important roles in systems engineering and project management. Many of the remaining sections in this chapter are devoted to describing some of these key structures. Structures that describe the product system include, but are not limited to, the requirements tree, system architecture and certain symbolic information such as system drawings, schematics, and data bases. The structures that describe the producing system include the project's work breakdown, schedules, cost accounts and organization.

  9. An Improved Indoor Positioning System Using RGB-D Cameras and Wireless Networks for Use in Complex Environments.

    PubMed

    Duque Domingo, Jaime; Cerrada, Carlos; Valero, Enrique; Cerrada, Jose A

    2017-10-20

    This work presents an Indoor Positioning System to estimate the location of people navigating in complex indoor environments. The developed technique combines WiFi Positioning Systems and depth maps, delivering promising results in complex inhabited environments, consisting of various connected rooms, where people are freely moving. This is a non-intrusive system in which personal information about subjects is not needed and, although RGB-D cameras are installed in the sensing area, users are only required to carry their smart-phones. In this article, the methods developed to combine the above-mentioned technologies and the experiments performed to test the system are detailed. The obtained results show a significant improvement in terms of accuracy and performance with respect to previous WiFi-based solutions as well as an extension in the range of operation.

  10. Anatomy and Physiology of Multiscale Modeling and Simulation in Systems Medicine.

    PubMed

    Mizeranschi, Alexandru; Groen, Derek; Borgdorff, Joris; Hoekstra, Alfons G; Chopard, Bastien; Dubitzky, Werner

    2016-01-01

    Systems medicine is the application of systems biology concepts, methods, and tools to medical research and practice. It aims to integrate data and knowledge from different disciplines into biomedical models and simulations for the understanding, prevention, cure, and management of complex diseases. Complex diseases arise from the interactions among disease-influencing factors across multiple levels of biological organization from the environment to molecules. To tackle the enormous challenges posed by complex diseases, we need a modeling and simulation framework capable of capturing and integrating information originating from multiple spatiotemporal and organizational scales. Multiscale modeling and simulation in systems medicine is an emerging methodology and discipline that has already demonstrated its potential in becoming this framework. The aim of this chapter is to present some of the main concepts, requirements, and challenges of multiscale modeling and simulation in systems medicine.

  11. Mergers and acquisitions in professional organizations: a complex adaptive systems approach.

    PubMed

    Walls, M E; McDaniel, R R

    1999-09-01

    Nurse managers face unique challenges as they cope with mergers and acquisitions among health care organizations. These challenges can be better understood if it is recognized that health care institutions are professional organizations and that the transformations required are extremely difficult. These difficulties are caused, in part, by the institutionalized nature of professional organizations, and this nature is explicated. Professional organizations are stubborn. They are repositories of expertise and values that are societal in origin and difficult to change. When professional organizations are understood as complex adaptive systems, complexity theory offers insight that provide strategies for managing mergers and acquisitions that may not be apparent when more traditional conceptualizations of professional organizations are used. Specific managerial techniques consistent with both the institutionalized characteristics and the complex adaptive systems characteristics of professional organizations are offered to nurse managers.

  12. TAFII-independent activation mediated by human TBP in the presence of the positive cofactor PC4.

    PubMed Central

    Wu, S Y; Kershnar, E; Chiang, C M

    1998-01-01

    TFIID is a multiprotein complex comprised of the TATA-binding protein (TBP) and an array of TBP-associated factors (TAFIIs). Whereas TBP is sufficient for basal transcription in conjunction with other general transcription factors and RNA polymerase II, TAFIIs are additionally required for activator-dependent transcription in mammalian cell-free transcription systems. However, recent in vivo studies carried out in yeast suggest that TAFIIs are not globally required for activator function. The discrepancy between in vivo yeast studies and in vitro mammalian cell-free systems remains to be resolved. In this study, we describe a mammalian cell-free transcription system reconstituted with only recombinant proteins and epitope-tagged multiprotein complexes. Transcriptional activation can be recapitulated in this highly purified in vitro transcription system in the absence of TAFIIs. This TBP-mediated activation is not induced by human mediator, another transcriptional coactivator complex potentially implicated in activator response. In contrast, general transcription factors TFIIH and TFIIA play a significant role in TBP-mediated activation, which can be detected in vitro with Gal4 fusion proteins containing various transcriptional activation domains. Our data, therefore, suggest that TFIIH and TFIIA can mediate activator function in the absence of TAFIIs. PMID:9687514

  13. Non-rocket Earth-Moon transport system

    NASA Astrophysics Data System (ADS)

    Bolonkin, Alexander

    2003-06-01

    This paper proposes a new transportation system for travel between Earth and Moon. This transportation system uses mechanical energy transfer and requires only minimal energy, using an engine located on Earth. A cable directly connects a pole of the Earth, through a drive station, to the lunar surface. The equation for an optimal equal-stress cable in the complex Earth-Moon gravitational field has been derived, allowing significantly lower cable masses. The required strength could be provided by cables constructed of carbon nanotubes or carbon whiskers. Some of the constraints on such a system are discussed.

  14. Advanced Electric Propulsion for Space Solar Power Satellites

    NASA Technical Reports Server (NTRS)

    Oleson, Steve

    1999-01-01

    The sun tower concept of collecting solar energy in space and beaming it down for commercial use will require very affordable in-space as well as earth-to-orbit transportation. Advanced electric propulsion using a 200 kW power and propulsion system added to the sun tower nodes can provide a factor of two reduction in the required number of launch vehicles when compared to in-space cryogenic chemical systems. In addition, the total time required to launch and deliver the complete sun tower system is of the same order of magnitude using high power electric propulsion or cryogenic chemical propulsion: around one year. Advanced electric propulsion can also be used to minimize the stationkeeping propulsion system mass for this unique space platform. 50 to 100 kW class Hall, ion, magnetoplasmadynamic, and pulsed inductive thrusters are compared. High power Hall thruster technology provides the best mix of launches saved and shortest ground to Geosynchronous Earth Orbital Environment (GEO) delivery time of all the systems, including chemical. More detailed studies comparing launch vehicle costs, transfer operations costs, and propulsion system costs and complexities must be made to down-select a technology. The concept of adding electric propulsion to the sun tower nodes was compared to a concept using re-useable electric propulsion tugs for Low Earth Orbital Environment (LEO) to GEO transfer. While the tug concept would reduce the total number of required propulsion systems, more launchers and notably longer LEO to GEO and complete sun tower ground to GEO times would be required. The tugs would also need more complex, longer life propulsion systems and the ability to dock with sun tower nodes.

  15. Molding cork sheets to complex shapes

    NASA Technical Reports Server (NTRS)

    Sharpe, M. H.; Simpson, W. G.; Walker, H. M.

    1977-01-01

    Partially cured cork sheet is easily formed to complex shapes and then final-cured. Temperature and pressure levels required for process depend upon resin system used and final density and strength desired. Sheet can be bonded to surface during final cure, or can be first-formed in mold and bonded to surface in separate step.

  16. Architectures for Distributed and Complex M-Learning Systems: Applying Intelligent Technologies

    ERIC Educational Resources Information Center

    Caballe, Santi, Ed.; Xhafa, Fatos, Ed.; Daradoumis, Thanasis, Ed.; Juan, Angel A., Ed.

    2009-01-01

    Over the last decade, the needs of educational organizations have been changing in accordance with increasingly complex pedagogical models and with the technological evolution of e-learning environments with very dynamic teaching and learning requirements. This book explores state-of-the-art software architectures and platforms used to support…

  17. Sustainability Learning through Gaming: An Exploratory Study

    ERIC Educational Resources Information Center

    Fabricatore, Carlo; Lopez, Ximena

    2012-01-01

    This study explored the potential of digital games as learning environments to develop mindsets capable of dealing with complexity in the domain of sustainability. Building sustainable futures requires the ability to deal with the complex dynamics that characterize the world in which we live. As central elements in this system, we must develop the…

  18. An integrated view of complex landscapes: a big data-model integration approach to trans-disciplinary science

    USDA-ARS?s Scientific Manuscript database

    The Earth is a complex system comprised of many interacting spatial and temporal scales. Understanding, predicting, and managing for these dynamics requires a trans-disciplinary integrated approach. Although there have been calls for this integration, a general approach is needed. We developed a Tra...

  19. Holonic Rationale and Bio-inspiration on Design of Complex Emergent and Evolvable Systems

    NASA Astrophysics Data System (ADS)

    Leitao, Paulo

    Traditional centralized and rigid control structures are becoming inflexible to face the requirements of reconfigurability, responsiveness and robustness, imposed by customer demands in the current global economy. The Holonic Manufacturing Systems (HMS) paradigm, which was pointed out as a suitable solution to face these requirements, translates the concepts inherited from social organizations and biology to the manufacturing world. It offers an alternative way of designing adaptive systems where the traditional centralized control is replaced by decentralization over distributed and autonomous entities organized in hierarchical structures formed by intermediate stable forms. In spite of its enormous potential, methods regarding the self-adaptation and self-organization of complex systems are still missing. This paper discusses how the insights from biology in connection with new fields of computer science can be useful to enhance the holonic design aiming to achieve more self-adaptive and evolvable systems. Special attention is devoted to the discussion of emergent behavior and self-organization concepts, and the way they can be combined with the holonic rationale.

  20. Towards Self-adaptation for Dependable Service-Oriented Systems

    NASA Astrophysics Data System (ADS)

    Cardellini, Valeria; Casalicchio, Emiliano; Grassi, Vincenzo; Lo Presti, Francesco; Mirandola, Raffaela

    Increasingly complex information systems operating in dynamic environments ask for management policies able to deal intelligently and autonomously with problems and tasks. An attempt to deal with these aspects can be found in the Service-Oriented Architecture (SOA) paradigm that foresees the creation of business applications from independently developed services, where services and applications build up complex dependencies. Therefore the dependability of SOA systems strongly depends on their ability to self-manage and adapt themselves to cope with changes in the operating conditions and to meet the required dependability with a minimum of resources. In this paper we propose a model-based approach to the realization of self-adaptable SOA systems, aimed at the fulfillment of dependability requirements. Specifically, we provide a methodology driving the system adaptation and we discuss the architectural issues related to its implementation. To bring this approach to fruition, we developed a prototype tool and we show the results that can be achieved with a simple example.

  1. Integrating complexity into data-driven multi-hazard supply chain network strategies

    USGS Publications Warehouse

    Long, Suzanna K.; Shoberg, Thomas G.; Ramachandran, Varun; Corns, Steven M.; Carlo, Hector J.

    2013-01-01

    Major strategies in the wake of a large-scale disaster have focused on short-term emergency response solutions. Few consider medium-to-long-term restoration strategies that reconnect urban areas to the national supply chain networks (SCN) and their supporting infrastructure. To re-establish this connectivity, the relationships within the SCN must be defined and formulated as a model of a complex adaptive system (CAS). A CAS model is a representation of a system that consists of large numbers of inter-connections, demonstrates non-linear behaviors and emergent properties, and responds to stimulus from its environment. CAS modeling is an effective method of managing complexities associated with SCN restoration after large-scale disasters. In order to populate the data space large data sets are required. Currently access to these data is hampered by proprietary restrictions. The aim of this paper is to identify the data required to build a SCN restoration model, look at the inherent problems associated with these data, and understand the complexity that arises due to integration of these data.

  2. A new organismal systems biology: how animals walk the tight rope between stability and change.

    PubMed

    Padilla, Dianna K; Tsukimura, Brian

    2014-07-01

    The amount of knowledge in the biological sciences is growing at an exponential rate. Simultaneously, the incorporation of new technologies in gathering scientific information has greatly accelerated our capacity to ask, and answer, new questions. How do we, as organismal biologists, meet these challenges, and develop research strategies that will allow us to address the grand challenge question: how do organisms walk the tightrope between stability and change? Organisms and organismal systems are complex, and multi-scale in both space and time. It is clear that addressing major questions about organismal biology will not come from "business as usual" approaches. Rather, we require the collaboration of a wide range of experts and integration of biological information with more quantitative approaches traditionally found in engineering and applied mathematics. Research programs designed to address grand challenge questions will require deep knowledge and expertise within subfields of organismal biology, collaboration and integration among otherwise disparate areas of research, and consideration of organisms as integrated systems. Our ability to predict which features of complex integrated systems provide the capacity to be robust in changing environments is poorly developed. A predictive organismal biology is needed, but will require more quantitative approaches than are typical in biology, including complex systems-modeling approaches common to engineering. This new organismal systems biology will have reciprocal benefits for biologists, engineers, and mathematicians who address similar questions, including those working on control theory and dynamical systems biology, and will develop the tools we need to address the grand challenge questions of the 21st century. © The Author 2014. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology. All rights reserved. For permissions please email: journals.permissions@oup.com.

  3. Moving toward climate-informed agricultural decision support - can we use PRISM data for more than just monthly averages?

    USDA-ARS?s Scientific Manuscript database

    Decision support systems/models for agriculture are varied in target application and complexity, ranging from simple worksheets to near real-time forecast systems requiring significant computational and manpower resources. Until recently, most such decision support systems have been constructed with...

  4. A protocol for parameterization and calibration of RZWQM2 in field research

    USDA-ARS?s Scientific Manuscript database

    Use of agricultural system models in field research requires a full understanding of both the model and the system it simulates. Since the 1960s, agricultural system models have increased tremendously in their complexity due to greater understanding of the processes simulated, their application to r...

  5. A Design Rationale Capture Using REMAP/MM

    DTIC Science & Technology

    1994-06-01

    company-wide down-sizing, the power company has determined that an automated service order processing system is the most economical solution. This new...service order processing system for a large power company can easily be modelled. A system of this complexity would typically require three to five years

  6. 33 CFR 149.665 - What are the requirements for a general alarm system?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...? Each pumping platform complex must have a general alarm system that: (a) Is capable of being manually... general alarm system? 149.665 Section 149.665 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: DESIGN, CONSTRUCTION, AND EQUIPMENT Design...

  7. 33 CFR 149.675 - What are the requirements for the public address system?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...? (a) For a manned deepwater port, each pumping platform complex must have a public address system... public address system? 149.675 Section 149.675 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: DESIGN, CONSTRUCTION, AND EQUIPMENT Design...

  8. 33 CFR 149.665 - What are the requirements for a general alarm system?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...? Each pumping platform complex must have a general alarm system that: (a) Is capable of being manually... general alarm system? 149.665 Section 149.665 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: DESIGN, CONSTRUCTION, AND EQUIPMENT Design...

  9. 33 CFR 149.675 - What are the requirements for the public address system?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...? (a) For a manned deepwater port, each pumping platform complex must have a public address system... public address system? 149.675 Section 149.675 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: DESIGN, CONSTRUCTION, AND EQUIPMENT Design...

  10. Children's and Adolescents' Thoughts on Pollution: Cognitive Abilities Required to Understand Environmental Systems

    ERIC Educational Resources Information Center

    Rodríguez, Manuel; Kohen, Raquel; Delval, Juan

    2015-01-01

    Pollution phenomena are complex systems in which different parts are integrated by means of causal and temporal relationships. To understand pollution, children must develop some cognitive abilities related to system thinking and temporal and causal inferential reasoning. These cognitive abilities constrain and guide how children understand…

  11. Analysis of Software Systems for Specialized Computers,

    DTIC Science & Technology

    computer) with given computer hardware and software. The object of study is the software system of a computer, designed for solving a fixed complex of...purpose of the analysis is to find parameters that characterize the system and its elements during operation, i.e., when servicing the given requirement flow. (Author)

  12. Stoichiometry for binding and transport by the twin arginine translocation system.

    PubMed

    Celedon, Jose M; Cline, Kenneth

    2012-05-14

    Twin arginine translocation (Tat) systems transport large folded proteins across sealed membranes. Tat systems accomplish this feat with three membrane components organized in two complexes. In thylakoid membranes, cpTatC and Hcf106 comprise a large receptor complex containing an estimated eight cpTatC-Hcf106 pairs. Protein transport occurs when Tha4 joins the receptor complex as an oligomer of uncertain size that is thought to form the protein-conducting structure. Here, binding analyses with intact membranes or purified complexes indicate that each receptor complex could bind eight precursor proteins. Kinetic analysis of translocation showed that each precursor-bound site was independently functional for transport, and, with sufficient Tha4, all sites were concurrently active for transport. Tha4 titration determined that ∼26 Tha4 protomers were required for transport of each OE17 (oxygen-evolving complex subunit of 17 kD) precursor protein. Our results suggest that, when fully saturated with precursor proteins and Tha4, the Tat translocase is an ∼2.2-megadalton complex that can individually transport eight precursor proteins or cooperatively transport multimeric precursors.

  13. Simplifying the complexity surrounding ICU work processes--identifying the scope for information management in ICU settings.

    PubMed

    Munir, Samina K; Kay, Stephen

    2005-08-01

    A multi-site study, conducted in two English and two Danish intensive care units, investigates the complexity of work processes in intensive care, and the implications of this complexity for information management with regard to clinical information systems. Data were collected via observations, shadowing of clinical staff, interviews and questionnaires. The construction of role activity diagrams enabled the capture of critical care work processes. Upon analysing these diagrams, it was found that intensive care work processes consist of 'simplified complexity'; these processes change with the introduction of information systems for the everyday use and management of all clinical information. The prevailing notion of complexity surrounding critical care clinical work processes was refuted and found to be misleading; in reality, it is not the work processes that cause the complexity; rather, the complexity is rooted in the way in which clinical information is used and managed. This study emphasises that the potential for clinical information systems that integrate all clinical information requirements is not only immense but also very plausible.

  14. Transportation Network Topologies

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia (Editor)

    2004-01-01

    The existing U.S. hub-and-spoke air transportation system is reaching saturation. Major aspects of the current system, such as capacity, safety, mobility, customer satisfaction, security, communications, and ecological effects, require improvements. The changing dynamics - increased presence of general aviation, unmanned autonomous vehicles, military aircraft in civil airspace as part of homeland defense - contributes to growing complexity of airspace. The system has proven remarkably resistant to change. NASA Langley Research Center and the National Institute of Aerospace conducted a workshop on Transportation Network Topologies on 9-10 December 2003 in Williamsburg, Virginia. The workshop aimed to examine the feasibility of traditional methods for complex system analysis and design as well as potential novel alternatives in application to transportation systems, identify state-of-the-art models and methods, conduct gap analysis, and thus to lay a foundation for establishing a focused research program in complex systems applied to air transportation.

  15. Structural and Functional Analyses of the Proteins Involved in the Iron-Sulfur Cluster Biosynthesis

    NASA Astrophysics Data System (ADS)

    Wada, Kei

    The iron-sulfur (Fe-S) clusters are ubiquitous prosthetic groups that are required to maintain such fundamental life processes as respiratory chain, photosynthesis and the regulation of gene expression. Assembly of intracellular Fe-S cluster requires the sophisticated biosynthetic systems called ISC and SUF machineries. To shed light on the molecular mechanism of Fe-S cluster assembly mediated by SUF machinery, several structures of the SUF components and their sub-complex were determined. The structural findings together with biochemical characterization of the core-complex (SufB-SufC-SufD complex) have led me to propose a working model for the cluster biosynthesis in the SUF machinery.

  16. Management Information Systems.

    ERIC Educational Resources Information Center

    Finlayson, Jean, Ed.

    1989-01-01

    This collection of papers addresses key questions facing college managers and others choosing, introducing, and living with big, complex computer-based systems. "What Use the User Requirement?" (Tony Coles) stresses the importance of an information strategy driven by corporate objectives, not technology. "Process of Selecting a…

  17. Designing the microturbine engine for waste-derived fuels.

    PubMed

    Seljak, Tine; Katrašnik, Tomaž

    2016-01-01

    The presented paper deals with the adaptation procedure of a microturbine (MGT) for the exploitation of refuse-derived fuels (RDF). RDF often possess significantly different properties than conventional fuels and usually require at least some adaptation of internal combustion systems to obtain full functionality. With the methodology developed in the paper, it is possible to evaluate the extent of the required adaptations by performing a thorough analysis of fuel combustion properties in a dedicated experimental rig suitable for testing a wide variety of waste- and biomass-derived fuels. In the first part, key turbine components are analyzed, followed by a cause-and-effect analysis of the interaction between different fuel properties and the design parameters of the components. The data are then used to build a dedicated test system in which two fuels with diametrically different physical and chemical properties are tested - liquefied biomass waste (LW) and waste tire pyrolysis oil (TPO). The analysis suggests that exploitation of LW requires a more complex target MGT system, as stable combustion can be achieved only with a regenerative thermodynamic cycle, high fuel preheat temperatures and an optimized fuel injection nozzle. In contrast, TPO requires a less complex MGT design, and sufficient operational stability is achieved already with a simple-cycle MGT and a conventional fuel system. The presented testing approach can significantly reduce the extent and cost of the required adaptations of a commercial system, as the pre-selection of a suitable MGT is done in the developed test system. The obtained data can at the same time serve as an input for fine-tuning the processes for RDF production. Copyright © 2015. Published by Elsevier Ltd.

  18. Data based identification and prediction of nonlinear and complex dynamical systems

    NASA Astrophysics Data System (ADS)

    Wang, Wen-Xu; Lai, Ying-Cheng; Grebogi, Celso

    2016-07-01

    The problem of reconstructing nonlinear and complex dynamical systems from measured data or time series is central to many scientific disciplines including physical, biological, computer, and social sciences, as well as engineering and economics. The classic approach to phase-space reconstruction through the methodology of delay-coordinate embedding has been practiced for more than three decades, but the paradigm is effective mostly for low-dimensional dynamical systems. Often, the methodology yields only a topological correspondence of the original system. There are situations in various fields of science and engineering where the systems of interest are complex and high dimensional with many interacting components. A complex system typically exhibits a rich variety of collective dynamics, and it is of great interest to be able to detect, classify, understand, predict, and control the dynamics using data that are becoming increasingly accessible due to the advances of modern information technology. To accomplish these goals, especially prediction and control, an accurate reconstruction of the original system is required. Nonlinear and complex systems identification aims at inferring, from data, the mathematical equations that govern the dynamical evolution and the complex interaction patterns, or topology, among the various components of the system. With successful reconstruction of the system equations and the connecting topology, it may be possible to address challenging and significant problems such as identification of causal relations among the interacting components and detection of hidden nodes. The "inverse" problem thus presents a grand challenge, requiring new paradigms beyond the traditional delay-coordinate embedding methodology. The past fifteen years have witnessed rapid development of contemporary complex graph theory with broad applications in interdisciplinary science and engineering. The combination of graph, information, and nonlinear dynamical systems theories with tools from statistical physics, optimization, engineering control, applied mathematics, and scientific computing enables the development of a number of paradigms to address the problem of nonlinear and complex systems reconstruction. In this Review, we describe the recent advances in this forefront and rapidly evolving field, with a focus on compressive sensing based methods. In particular, compressive sensing is a paradigm developed in recent years in applied mathematics, electrical engineering, and nonlinear physics to reconstruct sparse signals using only limited data. It has broad applications ranging from image compression/reconstruction to the analysis of large-scale sensor networks, and it has become a powerful technique to obtain high-fidelity signals for applications where sufficient observations are not available. We will describe in detail how compressive sensing can be exploited to address a diverse array of problems in data based reconstruction of nonlinear and complex networked systems. The problems include identification of chaotic systems and prediction of catastrophic bifurcations, forecasting future attractors of time-varying nonlinear systems, reconstruction of complex networks with oscillatory and evolutionary game dynamics, detection of hidden nodes, identification of chaotic elements in neuronal networks, reconstruction of complex geospatial networks and nodal positioning, and reconstruction of complex spreading networks with binary data.. 
A number of alternative methods, such as those based on system response to external driving, synchronization, and noise-induced dynamical correlation, will also be discussed. Due to the high relevance of network reconstruction to biological sciences, a special section is devoted to a brief survey of the current methods to infer biological networks. Finally, a number of open problems including control and controllability of complex nonlinear dynamical networks are discussed. The methods outlined in this Review are principled on various concepts in complexity science and engineering such as phase transitions, bifurcations, stabilities, and robustness. The methodologies have the potential to significantly improve our ability to understand a variety of complex dynamical systems ranging from gene regulatory systems to social networks toward the ultimate goal of controlling such systems.
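
    As a small, concrete illustration of the data-driven reconstruction theme (in the spirit of the sparse-regression and compressive-sensing methods reviewed, though not any specific algorithm from the review, and assuming scikit-learn is available), the sketch below recovers the coefficients of a one-dimensional ODE from simulated time-series data by sparse regression over a polynomial library.

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso

    # Recover dx/dt = -2*x + 0.5*x**3 from a simulated time series by sparse
    # regression over a small polynomial library (a toy analogue of the
    # compressive-sensing reconstruction idea, not an algorithm from the review).
    dt, steps = 0.001, 5000
    x = np.empty(steps)
    x[0] = 1.5
    for k in range(steps - 1):                       # simulate the "measured" data
        x[k + 1] = x[k] + dt * (-2.0 * x[k] + 0.5 * x[k] ** 3)

    dxdt = np.gradient(x, dt)                        # numerical derivative of the data
    library = np.column_stack([np.ones_like(x), x, x**2, x**3])   # candidate terms

    model = Lasso(alpha=1e-4, fit_intercept=False, max_iter=100000)
    model.fit(library, dxdt)
    for term, coef in zip(["1", "x", "x^2", "x^3"], model.coef_):
        print(f"{term:>3s}: {coef:+.3f}")            # expect roughly 0, -2, 0, +0.5
    ```

    With noisy or partial measurements the same idea needs more care (regularisation strength, library choice, and the network-reconstruction variants surveyed above), but the sparse-recovery principle is unchanged.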

  19. Audits for advanced treatment dosimetry

    NASA Astrophysics Data System (ADS)

    Ibbott, G. S.; Thwaites, D. I.

    2015-01-01

    Radiation therapy has advanced rapidly over the last few decades, progressing from 3D conformal treatment to image-guided intensity modulated therapy of several different flavors, both 3D and 4D and to adaptive radiotherapy. The use of intensity modulation has increased the complexity of quality assurance and essentially eliminated the physicist's ability to judge the validity of a treatment plan, even approximately, on the basis of appearance and experience. Instead, complex QA devices and procedures are required at the institutional level. Similarly, the assessment of treatment quality through remote and on-site audits also requires greater sophistication. The introduction of 3D and 4D dosimetry into external audit systems must follow, to enable quality assurance systems to perform meaningful and thorough audits.

  20. Formation of virions is strictly required for turnip yellows virus long-distance movement in plants.

    PubMed

    Hipper, Clémence; Monsion, Baptiste; Bortolamiol-Bécet, Diane; Ziegler-Graff, Véronique; Brault, Véronique

    2014-02-01

    Viral genomic RNA of the Turnip yellows virus (TuYV; genus Polerovirus; family Luteoviridae) is protected in virions formed by the major capsid protein (CP) and the minor component, the readthrough (RT*) protein. Long-distance transport, used commonly by viruses to systemically infect host plants, occurs in phloem sieve elements, and two viral forms of transport have been described: virions and ribonucleoprotein (RNP) complexes. With regard to poleroviruses, virions have always been presumed to be the long-distance transport form, but the potential role of RNP complexes has not been investigated. Here, we examined the requirement of virions for polerovirus systemic movement by analysing CP-targeted mutants that were unable to form viral particles. We confirmed that TuYV mutants that cannot encapsidate into virions are not able to reach systemic leaves. To completely rule out the possibility that the introduced mutations in CP simply blocked the formation or the movement of RNP complexes, we tested in trans complementation of TuYV CP mutants by providing WT CP expressed in transgenic plants. WT CP was able to facilitate systemic movement of TuYV CP mutants, and this observation was always correlated with the formation of virions. This clearly demonstrated that virus particles are essential for polerovirus systemic movement.

  1. MTL distributed magnet measurement system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nogiec, J.M.; Craker, P.A.; Garbarini, J.P.

    1993-04-01

    The Magnet Test Laboratory (MTL) at the Superconducting Super Collider Laboratory will be required to precisely and reliably measure properties of magnets in a production environment. The extensive testing of the superconducting magnets comprises several types of measurements whose main purpose is to evaluate basic parameters characterizing the magnetic, mechanical, and cryogenic properties of magnets. The measurement process will produce a significant amount of data which will be subjected to complex analysis. Such massive measurements require a careful design of both the hardware and software of the computer systems, with a reliable, maximally automated system in mind. In order to fulfill this requirement, a dedicated Distributed Magnet Measurement System (DMMS) is being developed.

  2. Open Architecture Data System for NASA Langley Combined Loads Test System

    NASA Technical Reports Server (NTRS)

    Lightfoot, Michael C.; Ambur, Damodar R.

    1998-01-01

    The Combined Loads Test System (COLTS) is a new structures test complex that is being developed at NASA Langley Research Center (LaRC) to test large curved panels and cylindrical shell structures. These structural components are representative of aircraft fuselage sections of subsonic and supersonic transport aircraft and cryogenic tank structures of reusable launch vehicles. Test structures are subjected to combined loading conditions that simulate realistic flight load conditions. The facility consists of two pressure-box test machines and one combined loads test machine. Each test machine possesses a unique set of requirements for research data acquisition and real-time data display. Given the complex nature of the mechanical and thermal loads to be applied to the various research test articles, each data system has been designed with connectivity attributes that support both data acquisition and data management functions. This paper addresses the research-driven data acquisition requirements for each test machine and demonstrates how an open architecture data system design not only meets those needs but also provides robust data sharing between data systems, including the various control systems which apply spectra of mechanical and thermal loading profiles.

  3. Exploring model based engineering for large telescopes: getting started with descriptive models

    NASA Astrophysics Data System (ADS)

    Karban, R.; Zamparelli, M.; Bauvir, B.; Koehler, B.; Noethe, L.; Balestra, A.

    2008-07-01

    Large telescopes pose a continuous challenge to systems engineering due to their complexity in terms of requirements, operational modes, long duty lifetime, interfaces and number of components. A multitude of decisions must be taken throughout the life cycle of a new system, and a prime means of coping with complexity and uncertainty is using models as one decision aid. The potential of descriptive models based on the OMG Systems Modeling Language (OMG SysML™) is examined in different areas: building a comprehensive model serves as the basis for subsequent activities of soliciting and reviewing requirements, analysis and design alike. Furthermore, a model is an effective communication instrument against the misinterpretation pitfalls which are typical of cross-disciplinary activities when using natural language only or free-format diagrams. Modeling the essential characteristics of the system, such as interfaces, system structure and behavior, addresses important system-level issues. Also shown is how to use a model as an analysis tool to describe the relationships among disturbances, opto-mechanical effects and control decisions and to refine the control use cases. Considerations on the scalability of the model structure and organization, its impact on the development process, the relation to document-centric structures, style and usage guidelines and the required tool chain are presented.

  4. The methodology of multi-viewpoint clustering analysis

    NASA Technical Reports Server (NTRS)

    Mehrotra, Mala; Wild, Chris

    1993-01-01

    One of the greatest challenges facing the software engineering community is the ability to produce large and complex computer systems, such as ground support systems for unmanned scientific missions, that are reliable and cost effective. In order to build and maintain these systems, it is important that the knowledge in the system be suitably abstracted, structured, and otherwise clustered in a manner which facilitates its understanding, manipulation, testing, and utilization. Development of complex mission-critical systems will require the ability to abstract overall concepts in the system at various levels of detail and to consider the system from different points of view. The Multi-ViewPoint Clustering Analysis (MVP-CA) methodology has been developed to provide multiple views of large, complicated systems. MVP-CA provides an ability to discover significant structures by providing an automated mechanism to structure both hierarchically (from detail to abstract) and orthogonally (from different perspectives). We propose to integrate MVP-CA into an overall software engineering life cycle to support the development and evolution of complex mission-critical systems.

  5. Verification and Validation Challenges for Adaptive Flight Control of Complex Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan T.

    2018-01-01

    Autonomy of aerospace systems requires the ability for flight control systems to adapt to complex, uncertain, dynamic environments. In spite of five decades of research in adaptive control, the fact remains that no adaptive control system has ever been deployed on any safety-critical or human-rated production system such as passenger transport aircraft. The problem lies in the difficulty with the certification of adaptive control systems, since existing certification methods cannot readily be used for nonlinear adaptive control systems. Research to address the notion of metrics for adaptive control began to appear in recent years. These metrics, if accepted, could pave a path towards certification that would potentially lead to the adoption of adaptive control as a future control technology for safety-critical and human-rated production systems. Development of certifiable adaptive control systems represents a major challenge to overcome. Adaptive control systems with learning algorithms will never become part of the future unless it can be proven that they are highly safe and reliable. Rigorous methods for adaptive control software verification and validation must therefore be developed to ensure that adaptive control system software failures will not occur, to verify that the adaptive control system functions as required, to eliminate unintended functionality, and to demonstrate that certification requirements imposed by regulatory bodies such as the Federal Aviation Administration (FAA) can be satisfied. This presentation will discuss some of the technical issues with adaptive flight control and related V&V challenges.

  6. ADAM: Analysis of Discrete Models of Biological Systems Using Computer Algebra

    PubMed Central

    2011-01-01

    Background Many biological systems are modeled qualitatively with discrete models, such as probabilistic Boolean networks, logical models, Petri nets, and agent-based models, to gain a better understanding of them. The computational complexity to analyze the complete dynamics of these models grows exponentially in the number of variables, which impedes working with complex models. There exist software tools to analyze discrete models, but they either lack the algorithmic functionality to analyze complex models deterministically or they are inaccessible to many users as they require understanding the underlying algorithm and implementation, do not have a graphical user interface, or are hard to install. Efficient analysis methods that are accessible to modelers and easy to use are needed. Results We propose a method for efficiently identifying attractors and introduce the web-based tool Analysis of Dynamic Algebraic Models (ADAM), which provides this and other analysis methods for discrete models. ADAM converts several discrete model types automatically into polynomial dynamical systems and analyzes their dynamics using tools from computer algebra. Specifically, we propose a method to identify attractors of a discrete model that is equivalent to solving a system of polynomial equations, a long-studied problem in computer algebra. Based on extensive experimentation with both discrete models arising in systems biology and randomly generated networks, we found that the algebraic algorithms presented in this manuscript are fast for systems with the structure maintained by most biological systems, namely sparseness and robustness. For a large set of published complex discrete models, ADAM identified the attractors in less than one second. Conclusions Discrete modeling techniques are a useful tool for analyzing complex biological systems and there is a need in the biological community for accessible efficient analysis tools. ADAM provides analysis methods based on mathematical algorithms as a web-based tool for several different input formats, and it makes analysis of complex models accessible to a larger community, as it is platform independent as a web-service and does not require understanding of the underlying mathematics. PMID:21774817
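
    As a rough illustration of the attractor-identification idea (a toy sketch, not the ADAM tool itself; the three-node network and its update rules are invented for the example), the fixed points of a small Boolean model can be found by requiring each update rule, viewed as a polynomial over GF(2), to return the current value of its node:

      # Hedged sketch: fixed points (length-one attractors) of a toy Boolean network.
      from itertools import product

      rules = {
          0: lambda x: x[1] & ~x[2] & 1,    # x0' = x1 AND NOT x2
          1: lambda x: x[0] | x[2],         # x1' = x0 OR x2
          2: lambda x: x[0] ^ x[1],         # x2' = x0 XOR x1
      }

      def fixed_points(rules, n):
          # A state is a fixed point if every rule reproduces the current node value.
          return [s for s in product((0, 1), repeat=n)
                  if all((rules[i](s) & 1) == s[i] for i in range(n))]

      print(fixed_points(rules, 3))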

  7. Integrating technology into complex intervention trial processes: a case study.

    PubMed

    Drew, Cheney J G; Poile, Vincent; Trubey, Rob; Watson, Gareth; Kelson, Mark; Townson, Julia; Rosser, Anne; Hood, Kerenza; Quinn, Lori; Busse, Monica

    2016-11-17

    Trials of complex interventions are associated with high costs and burdens in terms of paperwork, management, data collection, validation, and intervention fidelity assessment occurring across multiple sites. Traditional data collection methods rely on paper-based forms, where processing can be time-consuming and error rates high. Electronic source data collection can potentially address many of these inefficiencies, but has not routinely been used in complex intervention trials. Here we present the use of an on-line system for managing all aspects of data handling and for the monitoring of trial processes in a multicentre trial of a complex intervention. We custom built a web-accessible software application for the delivery of ENGAGE-HD, a multicentre trial of a complex physical therapy intervention. The software incorporated functionality for participant randomisation, data collection and assessment of intervention fidelity. It was accessible to multiple users with differing levels of access depending on required usage or to maintain blinding. Each site was supplied with a 4G-enabled iPad for accessing the system. The impact of this system was quantified through review of data quality and collation of feedback from site coordinators and assessors through structured process interviews. The custom-built system was an efficient tool for collecting data and managing trial processes. Although the set-up time required was significant, using the system resulted in an overall data completion rate of 98.5% with a data query rate of 0.1%, the majority of which were resolved in under a week. Feedback from research staff indicated that the system was highly acceptable for use in a research environment. This was a reflection of the portability and accessibility of the system when using the iPad and its usefulness in aiding accurate data collection, intervention fidelity and general administration. A combination of commercially available hardware and a bespoke online database designed to support data collection, intervention fidelity and trial progress provides a viable option for streamlining trial processes in a multicentre complex intervention trial. There is scope to further extend the system to cater for larger trials and add further functionality such as automatic reporting facilities and participant management support. ISRCTN65378754 , registered on 13 March 2014.

  8. Systems engineering for very large systems

    NASA Technical Reports Server (NTRS)

    Lewkowicz, Paul E.

    1993-01-01

    Very large integrated systems have always posed special problems for engineers. Whether they are power generation systems, computer networks or space vehicles, whenever there are multiple interfaces, complex technologies or just demanding customers, the challenges are unique. 'Systems engineering' has evolved as a discipline in order to meet these challenges by providing a structured, top-down design and development methodology for the engineer. This paper attempts to define the general class of problems requiring the complete systems engineering treatment and to show how systems engineering can be utilized to improve customer satisfaction and profitability. Specifically, this work will focus on a design methodology for the largest of systems, not necessarily in terms of physical size, but in terms of complexity and interconnectivity.

  9. Systems engineering for very large systems

    NASA Astrophysics Data System (ADS)

    Lewkowicz, Paul E.

    Very large integrated systems have always posed special problems for engineers. Whether they are power generation systems, computer networks or space vehicles, whenever there are multiple interfaces, complex technologies or just demanding customers, the challenges are unique. 'Systems engineering' has evolved as a discipline in order to meet these challenges by providing a structured, top-down design and development methodology for the engineer. This paper attempts to define the general class of problems requiring the complete systems engineering treatment and to show how systems engineering can be utilized to improve customer satisfaction and profitability. Specifically, this work will focus on a design methodology for the largest of systems, not necessarily in terms of physical size, but in terms of complexity and interconnectivity.

  10. Complex multidisciplinary system composition for aerospace vehicle conceptual design

    NASA Astrophysics Data System (ADS)

    Gonzalez, Lex

    Although there exists a vast amount of work concerning the analysis, design, and integration of aerospace vehicle systems, there is no standard for how this data and knowledge should be combined in order to create a synthesis system. Each institution creating a synthesis system has in-house vehicle and hardware components they are attempting to model and proprietary methods with which to model them. This leads to the fact that synthesis systems begin as one-off creations meant to answer a specific problem. As the scope of the synthesis system grows to encompass more and more problems, so does its size and complexity; in order for a single synthesis system to answer multiple questions the number of methods and method interfaces must increase. As a means to curtail the requirement that an increase in an aircraft synthesis system's capability lead to an increase in its size and complexity, this research effort focuses on the idea that each problem in aerospace requires its own analysis framework. By focusing on a methodology which centers on matching an analysis framework to the problem being solved, the complexity of the analysis framework is decoupled from the complexity of the system that creates it. The derived methodology allows for the composition of complex multi-disciplinary systems (CMDS) through the automatic creation and implementation of system and disciplinary method interfaces. The CMDS Composition process follows a four-step methodology meant to take a problem definition and progress towards the creation of an analysis framework meant to answer said problem. The implementation of the CMDS Composition process takes user-selected disciplinary analysis methods and automatically integrates them to create a syntactically composable analysis framework. As a means of assessing the validity of the CMDS Composition process, a prototype system (AVDDBMS) has been developed. AVDDBMS has been used to model the Generic Hypersonic Vehicle (GHV), an open source family of hypersonic vehicles originating from the Air Force Research Laboratory. AVDDBMS has been applied in three different ways in order to assess its validity: verification using GHV disciplinary data, validation using selected disciplinary analysis methods, and application of the CMDS Composition process to assess the design solution space for the GHV hardware. The research demonstrates the holistic effect that selection of individual disciplinary analysis methods has on the structure and integration of the analysis framework.

  11. Tips for Ensuring Successful Software Implementation

    ERIC Educational Resources Information Center

    Weathers, Robert

    2013-01-01

    Implementing an enterprise-level, mission-critical software system is an infrastructure project akin to other sizable projects, such as building a school. It's costly and complex, takes a year or more to complete, requires the collaboration of many different parties, involves uncertainties, results in a long-lived asset requiring ongoing…

  12. Big Data Goes Personal: Privacy and Social Challenges

    ERIC Educational Resources Information Center

    Bonomi, Luca

    2015-01-01

    The Big Data phenomenon is posing new challenges in our modern society. In addition to requiring information systems to effectively manage high-dimensional and complex data, the privacy and social implications associated with the data collection, data analytics, and service requirements create new important research problems. First, the high…

  13. Simulation modelling for new gas turbine fuel controller creation.

    NASA Astrophysics Data System (ADS)

    Vendland, L. E.; Pribylov, V. G.; Borisov, Yu A.; Arzamastsev, M. A.; Kosoy, A. A.

    2017-11-01

    State-of-the-art gas turbine fuel flow control systems are based on the throttle principle. A major disadvantage of such systems is that they require a high-pressure fuel intake. A different approach to fuel flow control is to use a regulating compressor, and for this approach, because of the interaction between the controller and the gas turbine, a specific regulating compressor is required. Difficulties emerge as early as the requirement definition stage. To define requirements for a new object, its properties must be known. Simulation modelling helps to overcome these difficulties. At the requirement definition stage the most simplified mathematical model is used. The mathematical models will become more complex and detailed as the planned work advances. In the future, adjusting the regulating compressor physical model to work with a virtual gas turbine and a physical control system is planned.

  14. Two Wavelength Ti:sapphire Laser for Ozone DIAL Measurements from Aircraft

    NASA Technical Reports Server (NTRS)

    Situ, Wen; DeYoung, Russel J.

    1998-01-01

    Laser remote sensing of ozone from aircraft has proven to be a valuable technique for understanding the distribution and dynamics of ozone in the atmosphere. Presently the differential absorption lidar (DIAL) technique, using dual Nd:YAG lasers that are doubled to pump dye lasers which in turn are doubled into the UV for the "on" and "off" line lasers, is used on either the NASA DC-8 or P-3 aircraft. Typically, the laser output for each line is 40 mJ and this is split into two beams, one looking up and the other downward, each beam having about 20 mJ. The residual Nd:YAG (1.06 micron) and dye laser energies are also transmitted to obtain information on the atmospheric aerosols. While this system has operated well, there are several system characteristics that make it less than ideal for aircraft operations. The system, which uses separate "on" and "off" line lasers, is quite large and massive, requiring valuable aircraft volume and weight. The dye slowly degrades with time, requiring replacement. The laser complexity requires a number of technical people to maintain the system performance. There is also future interest in deploying an ozone DIAL system in an Unpiloted Atmospheric Vehicle (UAV), which would require a total payload mass of less than 150 kg and a power requirement of less than 1500 W. A laser technology has emerged that could potentially provide significant enhancements over the present ozone DIAL system. The flashlamp-pumped Ti:sapphire laser system is an emerging technology that could reduce the mass and volume over the present system and also provide a system with fewer conversion steps, reducing system complexity. This paper will discuss preliminary results from a flashlamp-pumped Ti:sapphire laser constructed as a radiation source for a UV DIAL system to measure ozone.

  15. Performance Support Systems: Integrating AI, Hypermedia, and CBT to Enhance User Performance.

    ERIC Educational Resources Information Center

    McGraw, Karen L.

    1994-01-01

    Examines the use of a performance support system (PSS) to enhance user performance on an operational system. Highlights include background information that describes the stimulus for PSS development; discussion of the major PSS components and the technology they require; and discussion of the design of a PSS for a complex database system.…

  16. 33 CFR 149.415 - What are the requirements for a fire main system on a manned deepwater port?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... complex must have a fixed fire main system. The system must either: (1) Comply with 46 CFR 108.415 through... fire main system on a manned deepwater port? 149.415 Section 149.415 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: DESIGN...

  17. Transparent Information Systems through Gateways, Front Ends, Intermediaries, and Interfaces.

    ERIC Educational Resources Information Center

    Williams, Martha E.

    1986-01-01

    Provides overview of design requirements for transparent information retrieval (implies that user sees through complexity of retrieval activities sequence). Highlights include need for transparent systems; history of transparent retrieval research; information retrieval functions (automated converters, routers, selectors, evaluators/analyzers);…

  18. A Systems Perspective on Responses to Climate Change

    EPA Science Inventory

    The science of climate change integrates many scientific fields to explain and predict the complex effects of greenhouse gas concentrations on the planet’s energy balance, weather patterns, and ecosystems as well as economic and social systems. A changing climate requires respons...

  19. Effective Software Engineering Leadership for Development Programs

    ERIC Educational Resources Information Center

    Cagle West, Marsha

    2010-01-01

    Software is a critical component of systems ranging from simple consumer appliances to complex health, nuclear, and flight control systems. The development of quality, reliable, and effective software solutions requires the incorporation of effective software engineering processes and leadership. Processes, approaches, and methodologies for…

  20. AN AUDITING FRAMEWORK TO SUBSTANTIATE ELECTRONIC RECORDKEEPING PRACTICES

    EPA Science Inventory

    Quality assurance audits of computer systems help to ensure that the end data meet the needs of the user. Increasingly complex systems require the stepwise procedures outlined below.

    The areas reviewed in this paper include both technical and evidentiary criteria. I...

  1. Reliability-Based Model to Analyze the Performance and Cost of a Transit Fare Collection System.

    DOT National Transportation Integrated Search

    1985-06-01

    The collection of transit system fares has become more sophisticated in recent years, with more flexible structures requiring more sophisticated fare collection equipment to process tickets and admit passengers. However, this new and complex equipmen...

  2. System engineering of complex optical systems for mission assurance and affordability

    NASA Astrophysics Data System (ADS)

    Ahmad, Anees

    2017-08-01

    Affordability and reliability are equally important as the performance and development time for many optical systems for military, space and commercial applications. These characteristics are even more important for the systems meant for space and military applications where total lifecycle costs must be affordable. Most customers are looking for high performance optical systems that are not only affordable but are designed with "no doubt" mission assurance, reliability and maintainability in mind. Both US military and commercial customers are now demanding an optimum balance between performance, reliability and affordability. Therefore, it is important to employ a disciplined systems design approach for meeting the performance, cost and schedule targets while keeping affordability and reliability in mind. The US Missile Defense Agency (MDA) now requires all of their systems to be engineered, tested and produced according to the Mission Assurance Provisions (MAP). These provisions or requirements are meant to ensure complex and expensive military systems are designed, integrated, tested and produced with the reliability and total lifecycle costs in mind. This paper describes a system design approach based on the MAP document for developing sophisticated optical systems that are not only cost-effective but also deliver superior and reliable performance during their intended missions.

  3. Efficient calculation of open quantum system dynamics and time-resolved spectroscopy with distributed memory HEOM (DM-HEOM).

    PubMed

    Kramer, Tobias; Noack, Matthias; Reinefeld, Alexander; Rodríguez, Mirta; Zelinskyy, Yaroslav

    2018-06-11

    Time- and frequency-resolved optical signals provide insights into the properties of light-harvesting molecular complexes, including excitation energies, dipole strengths and orientations, as well as into the exciton energy flow through the complex. The hierarchical equations of motion (HEOM) provide a unifying theory, which allows one to study the combined effects of system-environment dissipation and non-Markovian memory without making restrictive assumptions about weak or strong couplings or separability of vibrational and electronic degrees of freedom. With increasing system size the exact solution of the open quantum system dynamics requires memory and compute resources beyond a single compute node. To overcome this barrier, we developed a scalable variant of HEOM. Our distributed memory HEOM, DM-HEOM, is a universal tool for open quantum system dynamics. It is used to accurately compute all experimentally accessible time- and frequency-resolved processes in light-harvesting molecular complexes with arbitrary system-environment couplings for a wide range of temperatures and complex sizes. © 2018 Wiley Periodicals, Inc.

  4. Spacecraft systems engineering: An introduction to the process at GSFC

    NASA Technical Reports Server (NTRS)

    Fragomeni, Tony; Ryschkewitsch, Michael G.

    1993-01-01

    The main objective in systems engineering is to devise a coherent total system design capable of achieving the stated requirements. Requirements should be rigid. However, they should be continuously challenged, rechallenged and/or validated. The systems engineer must specify every requirement in order to design, document, implement and conduct the mission. Each and every requirement must be logically considered, traceable and evaluated through various analysis and trade studies in a total systems design. Margins must be determined to be realistic as well as adequate. The systems engineer must also continuously close the loop and verify system performance against the requirements. The fundamental role of the systems engineer, however, is to engineer, not manage. Yet, in large, complex missions, where more than one systems engineer is required, someone needs to manage the systems engineers, and we call them 'systems managers.' Systems engineering management is an overview function which plans, guides, monitors and controls the technical execution of a project as implemented by the systems engineers. As the project moves on through Phases A and B into Phase C/D, the systems engineering tasks become a small portion of the total effort. The systems management role increases since discipline subsystem engineers are conducting analyses and reviewing test data for final review and acceptance by the systems managers.

  5. A Cost Model for Testing Unmanned and Autonomous Systems of Systems

    DTIC Science & Technology

    2011-02-01

    those risks. In addition, the fundamental methods presented by Aranha and Borba to include the complexity and sizing of tests for UASoS, can be expanded...used as an input for test execution effort estimation models (Aranha & Borba , 2007). Such methodology is very relevant to this work because as a UASoS...calculate the test effort based on the complexity of the SoS. However, Aranha and Borba define test size as the number of steps required to complete

  6. Three-dimensional imaging of the craniofacial complex.

    PubMed

    Nguyen, Can X.; Nissanov, Jonathan; Öztürk, Cengizhan; Nuveen, Michiel J.; Tuncay, Orhan C.

    2000-02-01

    Orthodontic treatment requires the rearrangement of craniofacial complex elements in three planes of space, but oddly the diagnosis is done with two-dimensional images. Here we report on a three-dimensional (3D) imaging system that employs the stereoimaging method of structured light to capture the facial image. The images can be subsequently integrated with 3D cephalometric tracings derived from lateral and PA films (www.clinorthodres.com/cor-c-070). The accuracy of the reconstruction obtained with this inexpensive system is about 400 µm.

  7. The production of multiprotein complexes in insect cells using the baculovirus expression system.

    PubMed

    Abdulrahman, Wassim; Radu, Laura; Garzoni, Frederic; Kolesnikova, Olga; Gupta, Kapil; Osz-Papai, Judit; Berger, Imre; Poterszman, Arnaud

    2015-01-01

    The production of a homogeneous protein sample in sufficient quantities is an essential prerequisite not only for structural investigations but also represents a rate-limiting step for many functional studies. In the cell, a large fraction of eukaryotic proteins exists as large multicomponent assemblies with many subunits, which act in concert to catalyze specific activities. Many of these complexes cannot be obtained from endogenous source material, so recombinant expression and reconstitution are then required to overcome this bottleneck. This chapter describes current strategies and protocols for the efficient production of multiprotein complexes in large quantities and of high quality, using the baculovirus/insect cell expression system.

  8. Managing Complexity in the MSL/Curiosity Entry, Descent, and Landing Flight Software and Avionics Verification and Validation Campaign

    NASA Technical Reports Server (NTRS)

    Stehura, Aaron; Rozek, Matthew

    2013-01-01

    The complexity of the Mars Science Laboratory (MSL) mission presented the Entry, Descent, and Landing systems engineering team with many challenges in its Verification and Validation (V&V) campaign. This paper describes some of the logistical hurdles related to managing a complex set of requirements, test venues, test objectives, and analysis products in the implementation of a specific portion of the overall V&V program to test the interaction of flight software with the MSL avionics suite. Application-specific solutions to these problems are presented herein, which can be generalized to other space missions and to similar formidable systems engineering problems.

  9. Water Conservation and Hydrological Transitions in Cities

    NASA Astrophysics Data System (ADS)

    Hornberger, G. M.; Gilligan, J. M.; Hess, D. J.

    2014-12-01

    A 2012 report by the National Research Council, Challenges and Opportunities in the Hydrologic Sciences, called for the development of "translational hydrologic science." Translational research in this context requires knowledge about the communication of science to decision makers and to the public but also improved understanding of the public by the scientists. This kind of knowledge is inherently interdisciplinary because it requires understanding of the complex sociotechnical dimensions of water, policy, and user relations. It is axiomatic that good governance of water resources and water infrastructure requires information about water resources themselves and about the institutions that govern water use. This "socio-hydrologic" or "hydrosociological" knowledge is often characterized by complex dynamics between and among human and natural systems. Water Resources Research has provided a forum for presentation of interdisciplinary research in coupled natural-human systems since its inception 50 years ago. The evolution of ideas presented in the journal provides a basis for framing new work, an example of which is water conservation in cities. In particular, we explore the complex interactions of political, sociodemographic, economic, and hydroclimatological factors in affecting decisions that either advance or retard the development of water conservation policies.

  10. System architectures for telerobotic research

    NASA Technical Reports Server (NTRS)

    Harrison, F. Wallace

    1989-01-01

    Several activities are performed related to the definition and creation of telerobotic systems. The effort and investment required to create architectures for these complex systems can be enormous; however, the magnitude of the process can be reduced if structured design techniques are applied. A number of informal methodologies supporting certain aspects of the design process are available. More recently, prototypes of integrated tools supporting all phases of system design from requirements analysis to code generation and hardware layout have begun to appear. Activities related to the system architecture of telerobots are described, including current activities which are designed to provide a methodology for the comparison and quantitative analysis of alternative system architectures.

  11. Coherent operation of detector systems and their readout electronics in a complex experiment control environment

    NASA Astrophysics Data System (ADS)

    Koestner, Stefan

    2009-09-01

    With the increasing size and degree of complexity of today's experiments in high energy physics the required amount of work and complexity to integrate a complete subdetector into an experiment control system is often underestimated. We report here on the layered software structure and protocols used by the LHCb experiment to control its detectors and readout boards. The experiment control system of LHCb is based on the commercial SCADA system PVSS II. Readout boards which are outside the radiation area are accessed via embedded credit card sized PCs which are connected to a large local area network. The SPECS protocol is used for control of the front end electronics. Finite state machines are introduced to facilitate the control of a large number of electronic devices and to model the whole experiment at the level of an expert system.
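
    To illustrate the finite-state-machine idea in isolation (a minimal sketch only; the states, commands, and board name are hypothetical, and this is not the PVSS II/SPECS implementation used by LHCb), a readout board can be modelled so that large numbers of devices are commanded through a small, uniform set of transitions:

      # Hedged sketch: a toy device finite state machine.
      from enum import Enum, auto

      class State(Enum):
          NOT_READY = auto()
          CONFIGURED = auto()
          RUNNING = auto()
          ERROR = auto()

      TRANSITIONS = {
          (State.NOT_READY, "configure"): State.CONFIGURED,
          (State.CONFIGURED, "start"):    State.RUNNING,
          (State.RUNNING, "stop"):        State.CONFIGURED,
          (State.CONFIGURED, "reset"):    State.NOT_READY,
      }

      class Device:
          def __init__(self, name):
              self.name, self.state = name, State.NOT_READY

          def command(self, cmd):
              # Unknown commands for the current state drive the device to ERROR.
              self.state = TRANSITIONS.get((self.state, cmd), State.ERROR)
              return self.state

      board = Device("board_042")           # hypothetical device name
      for cmd in ("configure", "start", "stop"):
          print(cmd, "->", board.command(cmd).name)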

  12. Modeling complexity in engineered infrastructure system: Water distribution network as an example

    NASA Astrophysics Data System (ADS)

    Zeng, Fang; Li, Xiang; Li, Ke

    2017-02-01

    The complex topology and adaptive behavior of infrastructure systems are driven by both self-organization of the demand and rigid engineering solutions. Therefore, engineering complex systems requires a method balancing holism and reductionism. To model the growth of water distribution networks, a complex network model was developed following the combination of local optimization rules and engineering considerations. The demand node generation is dynamic and follows the scaling law of urban growth. The proposed model can generate a water distribution network (WDN) similar to reported real-world WDNs in some structural properties. Comparison with different modeling approaches indicates that a realistic demand node distribution and co-evolution of demand nodes and the network are important for the simulation of real complex networks. The simulation results indicate that the efficiency of water distribution networks is exponentially affected by the urban growth pattern. In contrast, the improvement of efficiency by engineering optimization is limited and relatively insignificant. Redundancy and robustness, on the other hand, can be significantly improved through engineering methods.
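
    A minimal sketch of a growth-by-local-optimization rule of the kind described above (not the authors' model; the node placement, the attachment cost, and all parameters are illustrative assumptions):

      # Hedged sketch: grow a spatial network by attaching each new demand node
      # to the existing node that minimizes a simple local cost (Euclidean distance).
      import math
      import random
      import networkx as nx

      def grow_network(n_nodes, seed=0):
          rng = random.Random(seed)
          g = nx.Graph()
          g.add_node(0, pos=(0.5, 0.5))                 # source / first node
          for i in range(1, n_nodes):
              p = (rng.random(), rng.random())          # hypothetical demand location
              nearest = min(g.nodes, key=lambda j: math.dist(p, g.nodes[j]["pos"]))
              g.add_node(i, pos=p)
              g.add_edge(i, nearest, length=math.dist(p, g.nodes[nearest]["pos"]))
          return g

      g = grow_network(50)
      print(g.number_of_nodes(), g.number_of_edges())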

  13. The Coordinated School Health Program: Implementation in a Rural Elementary School District

    ERIC Educational Resources Information Center

    Miller, Kim H.; Bice, Matthew R.

    2014-01-01

    Child health is a complex issue that requires a comprehensive approach to address the many factors that influence it and are influenced by it. In light of the complexity of children's health, the Coordinated School Health Program (CSHP) was developed as a framework for a systems approach to planning and implementing school-based children's health…

  14. What Gene-Environment Interactions Can Tell Us about Social Competence in Typical and Atypical Populations

    ERIC Educational Resources Information Center

    Iarocci, Grace; Yager, Jodi; Elfers, Theo

    2007-01-01

    Social competence is a complex human behaviour that is likely to involve a system of genes that interacts with a myriad of environmental risk and protective factors. The search for its genetic and environmental origins and influences is equally complex and will require a multidimensional conceptualization and multiple methods and levels of…

  15. "I Want to Listen to My Students' Lives": Developing an Ecological Perspective in Learning to Teach

    ERIC Educational Resources Information Center

    Cook-Sather, Alison; Curl, Heather

    2014-01-01

    Preparing teachers who want to "listen to their students' lives" requires creating opportunities for prospective teachers to perceive and learn about their students' lives and how those unfold within and as part of complex systems. That means supporting prospective teachers not only in understanding students as complex beings who have to…

  16. Using Representational Tools to Learn about Complex Systems: A Tale of Two Classrooms

    ERIC Educational Resources Information Center

    Hmelo-Silver, Cindy E.; Liu, Lei; Gray, Steven; Jordan, Rebecca

    2015-01-01

    Orchestrating inquiry-based science learning in the classroom is a complex undertaking. It requires fitting the culture of the classroom with the teacher's teaching and inquiry practices. To understand the interactions between these variables in relation to student learning, we conducted an investigation in two different classroom settings to…

  17. One-time pad, complexity of verification of keys, and practical security of quantum cryptography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Molotkov, S. N., E-mail: sergei.molotkov@gmail.com

    2016-11-15

    A direct relation between the complexity of the complete verification of keys, which is one of the main criteria of security in classical systems, and a trace distance used in quantum cryptography is demonstrated. Bounds for the minimum and maximum numbers of verification steps required to determine the actual key are obtained.

  18. A measurement system for large, complex software programs

    NASA Technical Reports Server (NTRS)

    Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.

    1994-01-01

    This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.

  19. Rich complex behaviour of self-assembled nanoparticles far from equilibrium

    PubMed Central

    Ilday, Serim; Makey, Ghaith; Akguc, Gursoy B.; Yavuz, Özgün; Tokel, Onur; Pavlov, Ihor; Gülseren, Oguz; Ilday, F. Ömer

    2017-01-01

    A profoundly fundamental question at the interface between physics and biology remains open: what are the minimum requirements for emergence of complex behaviour from nonliving systems? Here, we address this question and report complex behaviour of tens to thousands of colloidal nanoparticles in a system designed to be as plain as possible: the system is driven far from equilibrium by ultrafast laser pulses that create spatiotemporal temperature gradients, inducing Marangoni flow that drags particles towards aggregation; strong Brownian motion, used as source of fluctuations, opposes aggregation. Nonlinear feedback mechanisms naturally arise between flow, aggregate and Brownian motion, allowing fast external control with minimal intervention. Consequently, complex behaviour, analogous to those seen in living organisms, emerges, whereby aggregates can self-sustain, self-regulate, self-replicate, self-heal and can be transferred from one location to another, all within seconds. Aggregates can comprise only one pattern or bifurcated patterns can coexist, compete, endure or perish. PMID:28443636

  20. Rich complex behaviour of self-assembled nanoparticles far from equilibrium

    NASA Astrophysics Data System (ADS)

    Ilday, Serim; Makey, Ghaith; Akguc, Gursoy B.; Yavuz, Özgün; Tokel, Onur; Pavlov, Ihor; Gülseren, Oguz; Ilday, F. Ömer

    2017-04-01

    A profoundly fundamental question at the interface between physics and biology remains open: what are the minimum requirements for emergence of complex behaviour from nonliving systems? Here, we address this question and report complex behaviour of tens to thousands of colloidal nanoparticles in a system designed to be as plain as possible: the system is driven far from equilibrium by ultrafast laser pulses that create spatiotemporal temperature gradients, inducing Marangoni flow that drags particles towards aggregation; strong Brownian motion, used as source of fluctuations, opposes aggregation. Nonlinear feedback mechanisms naturally arise between flow, aggregate and Brownian motion, allowing fast external control with minimal intervention. Consequently, complex behaviour, analogous to those seen in living organisms, emerges, whereby aggregates can self-sustain, self-regulate, self-replicate, self-heal and can be transferred from one location to another, all within seconds. Aggregates can comprise only one pattern or bifurcated patterns can coexist, compete, endure or perish.

  1. Anharmonic Vibrational Spectroscopy on Metal Transition Complexes

    NASA Astrophysics Data System (ADS)

    Latouche, Camille; Bloino, Julien; Barone, Vincenzo

    2014-06-01

    Advances in hardware performance and the availability of efficient and reliable computational models have made possible the application of computational spectroscopy to ever larger molecular systems. The systematic interpretation of experimental data and the full characterization of complex molecules can then be facilitated. Focusing on vibrational spectroscopy, several approaches have been proposed to simulate spectra beyond the double harmonic approximation, so that more details become available. However, routine use of such tools requires the preliminary definition of a valid protocol with the most appropriate combination of electronic structure and nuclear calculation models. Several benchmarks of anharmonic frequency calculations have been carried out on organic molecules. Nevertheless, benchmarks at this level for organometallic or inorganic metal complexes are largely lacking, despite the interest in these systems due to their strong emission and vibrational properties. Herein we report a benchmark study of anharmonic calculations on simple metal complexes, along with some pilot applications on systems of direct technological or biological interest.

  2. GIS Toolsets for Planetary Geomorphology and Landing-Site Analysis

    NASA Astrophysics Data System (ADS)

    Nass, Andrea; van Gasselt, Stephan

    2015-04-01

    Modern Geographic Information Systems (GIS) allow expert and lay users alike to load and position geographic data and perform simple to highly complex surface analyses. For many applications dedicated and ready-to-use GIS tools are available in standard software systems, while other applications require the modular combination of available basic tools to answer more specific questions. This also applies to analyses in modern planetary geomorphology, where many such (basic) tools can be used to build complex analysis tools, e.g. in image and terrain-model analysis. Apart from the simple application of sets of different tools, many complex tasks require a more sophisticated design for storing and accessing data using databases (e.g. ArcHydro for hydrological data analysis). In planetary sciences, complex database-driven models are often required to efficiently analyse potential landing sites or store rover data, but geologic mapping data can also be efficiently stored and accessed using database models rather than stand-alone shapefiles. For landing-site analyses, relief and surface roughness estimates are two common concepts that are of particular interest, and for both a number of different definitions co-exist. We here present an advanced toolset for the analysis of image and terrain-model data with an emphasis on extraction of landing-site characteristics using established criteria. We provide working examples and particularly focus on the concept of terrain roughness as it is interpreted in geomorphology and engineering studies.
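
    As one concrete instance of a terrain-roughness definition of the kind referred to above (an assumption for illustration, not necessarily the definition adopted by the toolset): the per-pixel standard deviation of elevation within a sliding window over a digital elevation model.

      # Hedged sketch: windowed standard deviation of elevation as a roughness map.
      import numpy as np
      from scipy.ndimage import uniform_filter

      def window_roughness(dem, size=5):
          # Var = E[z^2] - E[z]^2, computed with box filters; clip tiny negatives.
          mean = uniform_filter(dem, size=size)
          mean_sq = uniform_filter(dem * dem, size=size)
          return np.sqrt(np.maximum(mean_sq - mean * mean, 0.0))

      dem = np.random.default_rng(0).normal(0.0, 1.0, (100, 100))   # synthetic DEM
      print(window_roughness(dem).max())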

  3. The Paperless Solution

    NASA Technical Reports Server (NTRS)

    2001-01-01

    REI Systems, Inc. developed a software solution that uses the Internet to eliminate the paperwork typically required to document and manage complex business processes. The data management solution, called Electronic Handbooks (EHBs), is presently used for the entire SBIR program processes at NASA. The EHB-based system is ideal for programs and projects whose users are geographically distributed and are involved in complex management processes and procedures. EHBs provide flexible access control and increased communications while maintaining security for systems of all sizes. Through Internet Protocol-based access, user authentication and user-based access restrictions, role-based access control, and encryption/decryption, EHBs provide the level of security required for confidential data transfer. EHBs contain electronic forms and menus, which can be used in real time to execute the described processes. EHBs use standard word processors that generate ASCII HTML code to set up electronic forms that are viewed within a web browser. EHBs require no end-user software distribution, significantly reducing operating costs. Each interactive handbook simulates a hard-copy version containing chapters with descriptions of participants' roles in the online process.

  4. From Molecules to Life: Quantifying the Complexity of Chemical and Biological Systems in the Universe.

    PubMed

    Böttcher, Thomas

    2018-01-01

    Life is a complex phenomenon and much research has been devoted to both understanding its origins from prebiotic chemistry and discovering life beyond Earth. Yet, it has remained elusive how to quantify this complexity and how to compare chemical and biological units on one common scale. Here, a mathematical description of molecular complexity was applied allowing to quantitatively assess complexity of chemical structures. This in combination with the orthogonal measure of information complexity resulted in a two-dimensional complexity space ranging over the entire spectrum from molecules to organisms. Entities with a certain level of information complexity directly require a functionally complex mechanism for their production or replication and are hence indicative for life-like systems. In order to describe entities combining molecular and information complexity, the term biogenic unit was introduced. Exemplified biogenic unit complexities were calculated for ribozymes, protein enzymes, multimeric protein complexes, and even an entire virus particle. Complexities of prokaryotic and eukaryotic cells, as well as multicellular organisms, were estimated. Thereby distinct evolutionary stages in complexity space were identified. The here developed approach to compare the complexity of biogenic units allows for the first time to address the gradual characteristics of prebiotic and life-like systems without the need for a definition of life. This operational concept may guide our search for life in the Universe, and it may direct the investigations of prebiotic trajectories that lead towards the evolution of complexity at the origins of life.

  5. Design and Development of a Web-Based Interactive Software Tool for Teaching Operating Systems

    ERIC Educational Resources Information Center

    Garmpis, Aristogiannis

    2011-01-01

    Operating Systems (OS) is an important and mandatory discipline in many Computer Science, Information Systems and Computer Engineering curricula. Some of its topics require a careful and detailed explanation from the instructor as they often involve theoretical concepts and somewhat complex mechanisms, demanding a certain degree of abstraction…

  6. A new method for predicting response in complex linear systems. II. [under random or deterministic steady state excitation

    NASA Technical Reports Server (NTRS)

    Bogdanoff, J. L.; Kayser, K.; Krieger, W.

    1977-01-01

    The paper describes convergence and response studies in the low frequency range of complex systems, particularly with low values of damping of different distributions, and reports on the modification of the relaxation procedure required under these conditions. A new method is presented for response estimation in complex lumped parameter linear systems under random or deterministic steady state excitation. The essence of the method is the use of relaxation procedures with a suitable error function to find the estimated response; natural frequencies and normal modes are not computed. For a 45 degree of freedom system, and two relaxation procedures, convergence studies and frequency response estimates were performed. The low frequency studies are considered in the framework of earlier studies (Kayser and Bogdanoff, 1975) involving the mid to high frequency range.
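
    A minimal sketch of the general idea of estimating steady-state response by relaxation, without computing natural frequencies or normal modes (a plain Gauss-Seidel sweep on the frequency-domain equations, not the authors' specific procedure; the two-degree-of-freedom matrices and excitation are illustrative):

      # Hedged sketch: iterative relaxation solve of (K + i*w*C - w^2*M) x = f.
      import numpy as np

      def relax_response(M, C, K, f, w, sweeps=200, tol=1e-10):
          A = K + 1j * w * C - (w ** 2) * M
          x = np.zeros(len(f), dtype=complex)
          for _ in range(sweeps):                        # Gauss-Seidel sweeps
              x_old = x.copy()
              for i in range(len(f)):
                  s = A[i] @ x - A[i, i] * x[i]
                  x[i] = (f[i] - s) / A[i, i]
              if np.linalg.norm(x - x_old) < tol * (np.linalg.norm(x) + 1e-30):
                  break
          return x

      M = np.diag([1.0, 1.0])                            # illustrative 2-DOF system
      K = np.array([[2.0, -1.0], [-1.0, 2.0]])
      C = 0.01 * K                                       # light damping
      f = np.array([1.0, 0.0], dtype=complex)
      print(np.abs(relax_response(M, C, K, f, w=0.5)))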

  7. An Approach to Experimental Design for the Computer Analysis of Complex Phenomenon

    NASA Technical Reports Server (NTRS)

    Rutherford, Brian

    2000-01-01

    The ability to make credible system assessments, predictions and design decisions related to engineered systems and other complex phenomenon is key to a successful program for many large-scale investigations in government and industry. Recently, many of these large-scale analyses have turned to computational simulation to provide much of the required information. Addressing specific goals in the computer analysis of these complex phenomenon is often accomplished through the use of performance measures that are based on system response models. The response models are constructed using computer-generated responses together with physical test results where possible. They are often based on probabilistically defined inputs and generally require estimation of a set of response modeling parameters. As a consequence, the performance measures are themselves distributed quantities reflecting these variabilities and uncertainties. Uncertainty in the values of the performance measures leads to uncertainties in predicted performance and can cloud the decisions required of the analysis. A specific goal of this research has been to develop methodology that will reduce this uncertainty in an analysis environment where limited resources and system complexity together restrict the number of simulations that can be performed. An approach has been developed that is based on evaluation of the potential information provided for each "intelligently selected" candidate set of computer runs. Each candidate is evaluated by partitioning the performance measure uncertainty into two components - one component that could be explained through the additional computational simulation runs and a second that would remain uncertain. The portion explained is estimated using a probabilistic evaluation of likely results for the additional computational analyses based on what is currently known about the system. The set of runs indicating the largest potential reduction in uncertainty is then selected and the computational simulations are performed. Examples are provided to demonstrate this approach on small scale problems. These examples give encouraging results. Directions for further research are indicated.
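
    A much-simplified stand-in for the run-selection idea described above (not the paper's method): rank candidate computer runs by the predictive uncertainty a Gaussian-process surrogate assigns to them and select the most informative one. The toy response function, candidate set, and kernel settings are assumptions.

      # Hedged sketch: pick the next simulation run where surrogate uncertainty is largest.
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      rng = np.random.default_rng(1)
      X_run = rng.uniform(0, 1, (8, 2))                  # runs already performed
      y_run = np.sin(3 * X_run[:, 0]) + X_run[:, 1]      # hypothetical responses
      X_cand = rng.uniform(0, 1, (50, 2))                # candidate new runs

      gp = GaussianProcessRegressor(kernel=RBF(0.3), normalize_y=True).fit(X_run, y_run)
      _, std = gp.predict(X_cand, return_std=True)
      print("next run at", X_cand[np.argmax(std)])       # largest remaining uncertainty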

  8. Hierarchical modeling and robust synthesis for the preliminary design of large scale complex systems

    NASA Astrophysics Data System (ADS)

    Koch, Patrick Nathan

    Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation these issues are addressed through the development of a method for hierarchical robust preliminary design exploration to facilitate concurrent system and subsystem design exploration and the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: (1) Hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts, and allowing integration of subproblems for system synthesis, (2) Statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration, and (3) Noise modeling techniques for implementing robust preliminary design when approximate models are employed. The method developed and associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system; the turbofan system-level problem is partitioned into engine cycle and configuration design and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation.

  9. Shannon information, LMC complexity and Rényi entropies: a straightforward approach.

    PubMed

    López-Ruiz, Ricardo

    2005-04-01

    The LMC complexity, an indicator of complexity based on a probabilistic description, is revisited. A straightforward approach allows us to establish the time evolution of this indicator in a near-equilibrium situation and gives us a new insight for interpreting the LMC complexity for a general non-equilibrium system. Its relationship with the Rényi entropies is also explained. One of the advantages of this indicator is that its calculation does not require a considerable computational effort in many cases of physical and biological interest.
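
    For reference, the LMC indicator is conventionally computed as the product of a (normalized) Shannon entropy H and the disequilibrium D, the quadratic distance of the distribution from equiprobability; the Rényi entropy mentioned above generalizes H through a parameter alpha. A short sketch follows, with arbitrary example distributions.

      import numpy as np

      def shannon_entropy(p):
          p = np.asarray(p, dtype=float)
          p = p[p > 0]
          return -np.sum(p * np.log(p))

      def renyi_entropy(p, alpha):
          """Rényi entropy H_alpha = log(sum p_i^alpha) / (1 - alpha); tends to Shannon as alpha -> 1."""
          p = np.asarray(p, dtype=float)
          if np.isclose(alpha, 1.0):
              return shannon_entropy(p)
          return np.log(np.sum(p**alpha)) / (1.0 - alpha)

      def lmc_complexity(p, normalize=True):
          """LMC complexity C = H * D: (normalized) Shannon entropy times the
          disequilibrium D = sum_i (p_i - 1/N)^2, the distance from equiprobability."""
          p = np.asarray(p, dtype=float)
          n = len(p)
          H = shannon_entropy(p) / (np.log(n) if normalize else 1.0)
          D = np.sum((p - 1.0 / n) ** 2)
          return H * D

      # Both extremes have zero complexity: fully ordered and fully random
      print(lmc_complexity([1.0, 0.0, 0.0, 0.0]))      # H = 0 -> C = 0
      print(lmc_complexity([0.25, 0.25, 0.25, 0.25]))  # D = 0 -> C = 0
      print(lmc_complexity([0.6, 0.2, 0.1, 0.1]))      # intermediate case -> C > 0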

  10. Affinity proteomics to study endogenous protein complexes: Pointers, pitfalls, preferences and perspectives

    PubMed Central

    LaCava, John; Molloy, Kelly R.; Taylor, Martin S.; Domanski, Michal; Chait, Brian T.; Rout, Michael P.

    2015-01-01

    Dissecting and studying cellular systems requires the ability to specifically isolate distinct proteins along with the co-assembled constituents of their associated complexes. Affinity capture techniques leverage high affinity, high specificity reagents to target and capture proteins of interest along with specifically associated proteins from cell extracts. Affinity capture coupled to mass spectrometry (MS)-based proteomic analyses has enabled the isolation and characterization of a wide range of endogenous protein complexes. Here, we outline effective procedures for the affinity capture of protein complexes, highlighting best practices and common pitfalls. PMID:25757543

  11. Automated and miniaturized detection of biological threats with a centrifugal microfluidic system

    NASA Astrophysics Data System (ADS)

    Mark, D.; van Oordt, T.; Strohmeier, O.; Roth, G.; Drexler, J.; Eberhard, M.; Niedrig, M.; Patel, P.; Zgaga-Griesz, A.; Bessler, W.; Weidmann, M.; Hufert, F.; Zengerle, R.; von Stetten, F.

    2012-06-01

    The world's growing mobility, mass tourism, and the threat of terrorism increase the risk of the fast spread of infectious microorganisms and toxins. Today's procedures for pathogen detection involve complex stationary devices, and are often too time consuming for a rapid and effective response. Therefore a robust and mobile diagnostic system is required. We present a microstructured LabDisk which performs complex biochemical analyses together with a mobile centrifugal microfluidic device which processes the LabDisk. This portable system will allow fully automated and rapid detection of biological threats at the point-of-need.

  12. Reverse osmosis water purification system

    NASA Technical Reports Server (NTRS)

    Ahlstrom, H. G.; Hames, P. S.; Menninger, F. J.

    1986-01-01

    A reverse osmosis water purification system, which uses a programmable controller (PC) as the control system, was designed and built to maintain the cleanliness and level of water for various systems of a 64-m antenna. The installation operates with other equipment of the antenna at the Goldstone Deep Space Communication Complex. The reverse osmosis system was designed to be fully automatic; with the PC, many complex sequential and timed logic networks were easily implemented and are easily modified. The PC monitors water levels, pressures, flows, control panel requests, and set points on analog meters; with this information, various processes are initiated, monitored, modified, halted, or eliminated as required by the equipment being supplied pure water.

  13. Supporting Space Systems Design via Systems Dependency Analysis Methodology

    NASA Astrophysics Data System (ADS)

    Guariniello, Cesare

    The increasing size and complexity of space systems and space missions pose severe challenges to space systems engineers. When complex systems and Systems-of-Systems are involved, the behavior of the whole entity is not only due to that of the individual systems involved but also to the interactions and dependencies between the systems. Dependencies can be varied and complex, and designers usually do not perform analysis of the impact of dependencies at the level of complex systems, or this analysis involves excessive computational cost, or occurs at a later stage of the design process, after designers have already set detailed requirements, following a bottom-up approach. While classical systems engineering attempts to integrate the perspectives involved across the variety of engineering disciplines and the objectives of multiple stakeholders, there is still a need for more effective tools and methods capable of identifying, analyzing and quantifying properties of the complex system as a whole and of modeling explicitly the effect of some of the features that characterize complex systems. This research describes the development and usage of Systems Operational Dependency Analysis and Systems Developmental Dependency Analysis, two methods based on parametric models of the behavior of complex systems, one in the operational domain and one in the developmental domain. The parameters of the developed models have intuitive meaning, are usable with subjective and quantitative data alike, and give direct insight into the causes of observed, and possibly emergent, behavior. The approach proposed in this dissertation combines models of one-to-one dependencies among systems and between systems and capabilities, to analyze and evaluate the impact of failures or delays on the outcome of the whole complex system. The analysis accounts for cascading effects, partial operational failures, multiple failures or delays, and partial developmental dependencies. The user of these methods can assess the behavior of each system based on its internal status and on the topology of its dependencies on systems connected to it. Designers and decision makers can therefore quickly analyze and explore the behavior of complex systems and evaluate different architectures under various working conditions. The methods support educated decision making both in the design and in the update process of systems architecture, reducing the need to execute extensive simulations. In particular, in the phase of concept generation and selection, the information given by the methods can be used to identify promising architectures to be further tested and improved, while discarding architectures that do not show the required level of global features. The methods, when used in conjunction with appropriate metrics, also allow for improved reliability and risk analysis, as well as for automatic scheduling and re-scheduling based on the features of the dependencies and on the accepted level of risk. This dissertation illustrates the use of the two methods in sample aerospace applications, both in the operational and in the developmental domain. The applications show how to use the developed methodology to evaluate the impact of failures, assess the criticality of systems, quantify metrics of interest, quantify the impact of delays, support informed decision making when scheduling the development of systems and evaluate the achievement of partial capabilities.
    A larger, well-framed case study illustrates how the Systems Operational Dependency Analysis method and the Systems Developmental Dependency Analysis method can support analysis and decision making, at the mid and high level, in the design process of architectures for the exploration of Mars. The case study also shows how the methods do not replace the classical systems engineering methodologies, but support and improve them.
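
    The parametric dependency models themselves are the subject of the dissertation and are not reproduced in the abstract. Purely to illustrate the kind of computation involved (propagating degraded operability through one-to-one dependencies with a strength parameter until cascading effects settle), here is a toy sketch; the network, the parameter values, and the update rule are hypothetical, not the published formulation.

      # Toy propagation of operability through a dependency network.
      # Each edge (j -> i) carries a "strength of dependency" s in [0, 1]:
      # s = 1 means i is fully disabled if j fails, s = 0 means i ignores j.
      internal = {"power": 0.4, "comms": 1.0, "nav": 1.0, "lander": 1.0}   # power is degraded
      depends_on = {                                                        # i: [(j, strength)]
          "comms":  [("power", 0.9)],
          "nav":    [("power", 0.6), ("comms", 0.5)],
          "lander": [("nav", 0.8), ("comms", 0.7)],
      }

      def propagate(internal, depends_on, iters=20):
          op = dict(internal)
          for _ in range(iters):                      # iterate to capture cascading effects
              new = {}
              for node, base in internal.items():
                  factor = 1.0
                  for supplier, strength in depends_on.get(node, []):
                      factor = min(factor, 1.0 - strength * (1.0 - op[supplier]))
                  new[node] = base * factor
              if new == op:
                  break
              op = new
          return op

      print(propagate(internal, depends_on))   # degraded power cascades down to the lander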

  14. A complexity science-based framework for global joint operations analysis to support force projection: LDRD Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawton, Craig R.

    2015-01-01

    The military is undergoing a significant transformation as it modernizes for the information age and adapts to address an emerging asymmetric threat beyond traditional cold war era adversaries. Techniques such as traditional large-scale, joint services war gaming analysis are no longer adequate to support program evaluation activities and mission planning analysis at the enterprise level because the operating environment is evolving too quickly. New analytical capabilities are necessary to address modernization of the Department of Defense (DoD) enterprise. This presents significant opportunity to Sandia in supporting the nation at this transformational enterprise scale. Although Sandia has significant experience with engineering system of systems (SoS) and Complex Adaptive System of Systems (CASoS), significant fundamental research is required to develop modeling, simulation and analysis capabilities at the enterprise scale. This report documents an enterprise modeling framework which will enable senior level decision makers to better understand their enterprise and required future investments.

  15. Reaction factoring and bipartite update graphs accelerate the Gillespie Algorithm for large-scale biochemical systems.

    PubMed

    Indurkhya, Sagar; Beal, Jacob

    2010-01-06

    ODE simulations of chemical systems perform poorly when some of the species have extremely low concentrations. Stochastic simulation methods, which can handle this case, have been impractical for large systems due to computational complexity. We observe, however, that when modeling complex biological systems: (1) a small number of reactions tend to occur a disproportionately large percentage of the time, and (2) a small number of species tend to participate in a disproportionately large percentage of reactions. We exploit these properties in LOLCAT Method, a new implementation of the Gillespie Algorithm. First, factoring reaction propensities allows many propensities dependent on a single species to be updated in a single operation. Second, representing dependencies between reactions with a bipartite graph of reactions and species requires storage proportional only to the numbers of reactions and species, rather than the quadratic storage required for a graph that includes only reactions. Together, these improvements allow our implementation of LOLCAT Method to execute orders of magnitude faster than currently existing Gillespie Algorithm variants when simulating several yeast MAPK cascade models.
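
    The factored-propensity bookkeeping of the LOLCAT Method is not detailed in the abstract. As background, the sketch below shows a plain direct-method Gillespie step that uses a species-to-reaction dependency structure, so that only the propensities touched by the fired reaction are recomputed; this is the general idea the abstract builds on, and the two-reaction system and rate constants are hypothetical.

      import numpy as np

      rng = np.random.default_rng(1)

      # Hypothetical system: A + B -> C (k1), C -> A + B (k2)
      species = {"A": 100, "B": 80, "C": 0}
      reactions = [
          {"rate": 0.005, "reactants": {"A": 1, "B": 1}, "change": {"A": -1, "B": -1, "C": +1}},
          {"rate": 0.1,   "reactants": {"C": 1},         "change": {"A": +1, "B": +1, "C": -1}},
      ]

      # Bipartite-style dependency structure: species -> reactions whose propensity uses it
      uses = {s: [i for i, r in enumerate(reactions) if s in r["reactants"]] for s in species}

      def propensity(r, x):
          a = r["rate"]
          for s, order in r["reactants"].items():
              for k in range(order):
                  a *= x[s] - k
          return a

      def ssa(x, reactions, t_end=50.0):
          t = 0.0
          a = np.array([propensity(r, x) for r in reactions])   # full computation once
          while t < t_end:
              a0 = a.sum()
              if a0 <= 0:
                  break
              t += rng.exponential(1.0 / a0)
              j = rng.choice(len(reactions), p=a / a0)
              for s, d in reactions[j]["change"].items():
                  x[s] += d
              # update only the propensities that depend on a changed species
              touched = {i for s in reactions[j]["change"] for i in uses[s]}
              for i in touched:
                  a[i] = propensity(reactions[i], x)
          return t, x

      print(ssa(dict(species), reactions))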

  16. Reaction Factoring and Bipartite Update Graphs Accelerate the Gillespie Algorithm for Large-Scale Biochemical Systems

    PubMed Central

    Indurkhya, Sagar; Beal, Jacob

    2010-01-01

    ODE simulations of chemical systems perform poorly when some of the species have extremely low concentrations. Stochastic simulation methods, which can handle this case, have been impractical for large systems due to computational complexity. We observe, however, that when modeling complex biological systems: (1) a small number of reactions tend to occur a disproportionately large percentage of the time, and (2) a small number of species tend to participate in a disproportionately large percentage of reactions. We exploit these properties in LOLCAT Method, a new implementation of the Gillespie Algorithm. First, factoring reaction propensities allows many propensities dependent on a single species to be updated in a single operation. Second, representing dependencies between reactions with a bipartite graph of reactions and species requires storage proportional only to the numbers of reactions and species, rather than the quadratic storage required for a graph that includes only reactions. Together, these improvements allow our implementation of LOLCAT Method to execute orders of magnitude faster than currently existing Gillespie Algorithm variants when simulating several yeast MAPK cascade models. PMID:20066048

  17. Microgravity isolation system design: A modern control analysis framework

    NASA Technical Reports Server (NTRS)

    Hampton, R. D.; Knospe, C. R.; Allaire, P. E.; Grodsinsky, C. M.

    1994-01-01

    Many acceleration-sensitive, microgravity science experiments will require active vibration isolation from the manned orbiters on which they will be mounted. The isolation problem, especially in the case of a tethered payload, is a complex three-dimensional one that is best suited to modern-control design methods. These methods, although more powerful than their classical counterparts, can nonetheless go only so far in meeting the design requirements for practical systems. Once a tentative controller design is available, it must still be evaluated to determine whether or not it is fully acceptable, and to compare it with other possible design candidates. Realistically, such evaluation will be an inherent part of a necessary iterative design process. In this paper, an approach is presented for applying complex mu-analysis methods to a closed-loop vibration isolation system (experiment plus controller). An analysis framework is presented for evaluating nominal stability, nominal performance, robust stability, and robust performance of active microgravity isolation systems, with emphasis on the effective use of mu-analysis methods.

  18. NASA's Solar Dynamics Observatory (SDO): A Systems Approach to a Complex Mission

    NASA Technical Reports Server (NTRS)

    Ruffa, John A.; Ward, David K.; Bartusek, Lisa M.; Bay, Michael; Gonzales, Peter J.; Pesnell, William D.

    2012-01-01

    The Solar Dynamics Observatory (SDO) includes three advanced instruments, massive science data volume, stringent science data completeness requirements, and a custom ground station to meet mission demands. The strict instrument science requirements imposed a number of challenging drivers on the overall mission system design, leading the SDO team to adopt an integrated systems engineering presence across all aspects of the mission to ensure that mission science requirements would be met. Key strategies were devised to address these system level drivers and mitigate identified threats to mission success. The global systems engineering team approach ensured that key drivers and risk areas were rigorously addressed through all phases of the mission, leading to the successful SDO launch and on-orbit operation. Since launch, SDO's on-orbit performance has met all mission science requirements and enabled groundbreaking science observations, expanding our understanding of the Sun and its dynamic processes.

  19. NASA's Solar Dynamics Observatory (SDO): A Systems Approach to a Complex Mission

    NASA Technical Reports Server (NTRS)

    Ruffa, John A.; Ward, David K.; Bartusek, Lisa M.; Bay, Michael; Gonzales, Peter J.; Pesnell, William D.

    2012-01-01

    The Solar Dynamics Observatory (SDO) includes three advanced instruments, massive science data volume, stringent science data completeness requirements, and a custom ground station to meet mission demands. The strict instrument science requirements imposed a number of challenging drivers on the overall mission system design, leading the SDO team to adopt an integrated systems engineering presence across all aspects of the mission to ensure that mission science requirements would be met. Key strategies were devised to address these system level drivers and mitigate identified threats to mission success. The global systems engineering team approach ensured that key drivers and risk areas were rigorously addressed through all phases of the mission, leading to the successful SDO launch and on-orbit operation. Since launch, SDO's on-orbit performance has met all mission science requirements and enabled groundbreaking science observations, expanding our understanding of the Sun and its dynamic processes.

  20. Nonterrestrial material processing and manufacturing of large space systems

    NASA Technical Reports Server (NTRS)

    Von Tiesenhausen, G.

    1979-01-01

    Nonterrestrial processing of materials and manufacturing of large space system components from preprocessed lunar materials at a manufacturing site in space is described. Lunar materials mined and preprocessed at the lunar resource complex will be flown to the space manufacturing facility (SMF), where, together with supplementary terrestrial materials, they will be final-processed and fabricated into space communication systems, solar cell blankets, radio frequency generators, and electrical equipment. Satellite Power System (SPS) material requirements and lunar material availability and utilization are detailed, and the SMF processing, refining, fabricating facilities, material flow and manpower requirements are described.

  1. System data communication structures for active-control transport aircraft, volume 2

    NASA Technical Reports Server (NTRS)

    Hopkins, A. L.; Martin, J. H.; Brock, L. D.; Jansson, D. G.; Serben, S.; Smith, T. B.; Hanley, L. D.

    1981-01-01

    The application of communication structures to advanced transport aircraft is addressed. First, a set of avionic functional requirements is established, and a baseline set of avionics equipment is defined that will meet the requirements. Three alternative configurations for this equipment are then identified that represent the evolution toward more dispersed systems. Candidate communication structures are proposed for each system configuration, and these are compared using trade-off analyses; these analyses emphasize reliability but also address complexity. Multiplex buses are recognized as the likely near-term choice, with mesh networks being desirable for advanced, highly dispersed systems.

  2. Future Data Communication Architectures for Safety Critical Aircraft Cabin Systems

    NASA Astrophysics Data System (ADS)

    Berkhahn, Sven-Olaf

    2012-05-01

    The cabin of modern aircraft is subject to increasing demands for fast reconfiguration and hence flexibility. These demands require studies of new network architectures and technologies for the electronic cabin systems, which also consider weight and cost reductions as well as safety constraints. Two major approaches are under consideration to reduce the complex and heavy wiring harness: the use of a so-called hybrid data bus technology, which enables the common usage of the same data bus for several electronic cabin systems with different safety and security requirements, and the application of wireless data transfer technologies for electronic cabin systems.

  3. Batch-mode Reinforcement Learning for improved hydro-environmental systems management

    NASA Astrophysics Data System (ADS)

    Castelletti, A.; Galelli, S.; Restelli, M.; Soncini-Sessa, R.

    2010-12-01

    Despite the great progress made in the last decades, the optimal management of hydro-environmental systems still remains a very active and challenging research area. The combination of multiple, often conflicting interests, high non-linearities of the physical processes and the management objectives, strong uncertainties in the inputs, and a high-dimensional state makes the problem challenging and intriguing. Stochastic Dynamic Programming (SDP) is one of the most suitable methods for designing (Pareto) optimal management policies preserving the original problem complexity. However, it suffers from a dual curse, which, de facto, prevents its practical application to even reasonably complex water systems. (i) The computational requirement grows exponentially with the state and control dimension (Bellman's curse of dimensionality), so that SDP cannot be used with water systems where the state vector includes more than a few (2-3) units. (ii) An explicit model of each system component is required (curse of modelling) to anticipate the effects of the system transitions, i.e. any information included in the SDP framework can only be either a state variable described by a dynamic model or a stochastic disturbance, independent in time, with the associated pdf. Any exogenous information that could effectively improve the system operation cannot be explicitly considered in taking the management decision, unless a dynamic model is identified for each additional piece of information, thus adding to the problem complexity through the curse of dimensionality (additional state variables). To mitigate this dual curse, the combined use of batch-mode Reinforcement Learning (bRL) and Dynamic Model Reduction (DMR) techniques is explored in this study. bRL overcomes the curse of modelling by replacing explicit modelling with an external simulator and/or historical observations. The curse of dimensionality is averted using a functional approximation of the SDP value function based on proper non-linear regressors. DMR reduces the complexity and the associated computational requirements of non-linear, distributed, process-based models, making them suitable for inclusion in optimization schemes. Results from real-world applications of the approach are also presented, including reservoir operation with both quality and quantity targets.
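
    The coupling of bRL with dynamic model reduction is specific to the study above. A generic batch-mode building block it relies on is fitted Q-iteration over a batch of observed transitions; the sketch below applies it to a toy one-dimensional storage problem, and the dynamics, reward, discretized actions, and the choice of an extra-trees regressor (scikit-learn assumed available) are illustrative only.

      import numpy as np
      from sklearn.ensemble import ExtraTreesRegressor

      rng = np.random.default_rng(0)

      # Batch of transitions (s, a, r, s') from historical data or an external simulator:
      # toy storage dynamics s' = clip(s - a + inflow); reward penalises deviation from a target level.
      def simulate_transition(s, a):
          inflow = rng.uniform(0.0, 0.3)
          s_next = np.clip(s - a + inflow, 0.0, 1.0)
          reward = -abs(s_next - 0.5)
          return reward, s_next

      actions = np.array([0.0, 0.1, 0.2, 0.3])              # discretised releases
      S = rng.uniform(0.0, 1.0, 2000)
      A = rng.choice(actions, 2000)
      R, S_next = zip(*(simulate_transition(s, a) for s, a in zip(S, A)))
      R, S_next = np.array(R), np.array(S_next)

      # Fitted Q-iteration: repeatedly regress bootstrapped Q-targets on (s, a) pairs.
      gamma, Q = 0.95, None
      X = np.column_stack([S, A])
      for _ in range(30):
          if Q is None:
              target = R
          else:
              q_next = np.column_stack(
                  [Q.predict(np.column_stack([S_next, np.full_like(S_next, a)])) for a in actions]
              )
              target = R + gamma * q_next.max(axis=1)
          Q = ExtraTreesRegressor(n_estimators=50, random_state=0).fit(X, target)

      # Greedy policy read off the learned Q-function
      s = 0.8
      best_a = actions[np.argmax([Q.predict([[s, a]])[0] for a in actions])]
      print(f"suggested release at storage {s}: {best_a}")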

  4. Parent Perspective on Care Coordination Services for Their Child with Medical Complexity.

    PubMed

    Cady, Rhonda G; Belew, John L

    2017-06-06

    The overarching goal of care coordination is communication and co-management across settings. Children with medical complexity require care from multiple services and providers, and the many benefits of care coordination on health and patient experience outcomes have been documented. Despite these findings, parents still report their greatest challenge is communication gaps. When this occurs, parents assume responsibility for aggregating and sharing health information across providers and settings. A new primary-specialty care coordination partnership model for children with medical complexity works to address these challenges and bridge communication gaps. During the first year of the new partnership, parents participated in focus groups to better understand how they perceive communication and collaboration between the providers and services delivering care for their medically complex child. Our findings from these sessions reflect the current literature and highlight additional challenges of rural families, as seen from the perspective of the parents. We found that parents appreciate when professional care coordination is provided, but this is often the exception and not the norm. Additionally, parents feel that the local health system's inability to care for their medically complex child results in unnecessary trips to urban-based specialty care. These gaps require a system-level approach to care coordination and, consequently, new paradigms for delivery are urgently needed.

  5. Preparing new nurses with complexity science and problem-based learning.

    PubMed

    Hodges, Helen F

    2011-01-01

    Successful nurses function effectively with adaptability, improvability, and interconnectedness, and can see emerging and unpredictable complex problems. Preparing new nurses for complexity requires a significant change in prevalent but dated nursing education models for rising graduates. The science of complexity coupled with problem-based learning and peer review contributes a feasible framework for a constructivist learning environment to examine real-time systems data; explore uncertainty, inherent patterns, and ambiguity; and develop skills for unstructured problem solving. This article describes a pilot study of a problem-based learning strategy guided by principles of complexity science in a community clinical nursing course. Thirty-five senior nursing students participated during a 3-year period. Assessments included peer review, a final project paper, reflection, and a satisfaction survey. Results included higher-than-expected levels of student satisfaction, increased breadth of analysis of complex data, acknowledgment of the community as a complex adaptive system, and overall higher-level thinking skills than in previous years. 2011, SLACK Incorporated.

  6. The life of plant mitochondrial complex I.

    PubMed

    Braun, Hans-Peter; Binder, Stefan; Brennicke, Axel; Eubel, Holger; Fernie, Alisdair R; Finkemeier, Iris; Klodmann, Jennifer; König, Ann-Christine; Kühn, Kristina; Meyer, Etienne; Obata, Toshihiro; Schwarzländer, Markus; Takenaka, Mizuki; Zehrmann, Anja

    2014-11-01

    The mitochondrial NADH dehydrogenase complex (complex I) of the respiratory chain has several remarkable features in plants: (i) particularly many of its subunits are encoded by the mitochondrial genome, (ii) its mitochondrial transcripts undergo extensive maturation processes (e.g. RNA editing, trans-splicing), (iii) its assembly follows unique routes, (iv) it includes an additional functional domain which contains carbonic anhydrases and (v) it is, indirectly, involved in photosynthesis. Comprising about 50 distinct protein subunits, complex I of plants is very large. However, an even larger number of proteins are required to synthesize these subunits and assemble the enzyme complex. This review aims to follow the complete "life cycle" of plant complex I from various molecular perspectives. We provide arguments that complex I represents an ideal model system for studying the interplay of respiration and photosynthesis, the cooperation of mitochondria and the nucleus during organelle biogenesis and the evolution of the mitochondrial oxidative phosphorylation system. Copyright © 2014 Elsevier B.V. and Mitochondria Research Society. All rights reserved.

  7. Nonparametric method for failures diagnosis in the actuating subsystem of aircraft control system

    NASA Astrophysics Data System (ADS)

    Terentev, M. N.; Karpenko, S. S.; Zybin, E. Yu; Kosyanchuk, V. V.

    2018-02-01

    In this paper we design a nonparametric method for failures diagnosis in the aircraft control system that uses the measurements of the control signals and the aircraft states only. It doesn't require a priori information about the aircraft model parameters, training or statistical calculations, and is based on an analytical nonparametric one-step-ahead state prediction approach. This makes it possible to predict the behavior of unidentified and faulty dynamic systems, to weaken the requirements on control signals, and to reduce the diagnostic time and problem complexity.
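
    The analytical construction used in the paper is not given in the abstract. The sketch below conveys only the flavour of a nonparametric one-step-ahead check: predict the next state from past control and state measurements alone (here with a k-nearest-neighbour average) and flag a failure when the prediction residual grows. The actuator model, noise level, and threshold are invented.

      import numpy as np

      rng = np.random.default_rng(2)

      def one_step_ahead_knn(U, X, u_now, x_now, k=10):
          """Predict x[t+1] from (u[t], x[t]) using past measurements only: no model parameters."""
          Z = np.column_stack([U[:-1], X[:-1]])            # regressors: past (u, x) pairs
          Y = X[1:]                                        # targets: the states that followed
          d = np.linalg.norm(Z - np.array([u_now, x_now]), axis=1)
          return Y[np.argsort(d)[:k]].mean()

      # Hypothetical actuator loop: x[t+1] = 0.9 x[t] + 0.5 u[t] + noise, failing at t = 1600
      T, fail_at = 2000, 1600
      U = rng.uniform(-1.0, 1.0, T)
      X = np.zeros(T)
      for t in range(T - 1):
          gain = 0.5 if t < fail_at else 0.0               # actuator stops responding after the fault
          X[t + 1] = 0.9 * X[t] + gain * U[t] + 0.01 * rng.standard_normal()

      # Diagnose by comparing each new state with its one-step-ahead prediction
      # built from the pre-fault measurement history alone.
      for t in range(fail_at - 50, T - 1):
          x_pred = one_step_ahead_knn(U[:fail_at], X[:fail_at], U[t], X[t])
          if abs(X[t + 1] - x_pred) > 0.2:                 # hypothetical residual threshold
              print(f"failure flagged at t = {t + 1}")
              break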

  8. Endoscopic, single-catheter treatment of Dandy-Walker syndrome hydrocephalus: technical case report and review of treatment options.

    PubMed

    Sikorski, Christian W; Curry, Daniel J

    2005-01-01

    Optimal treatment for hydrocephalus related to Dandy-Walker syndrome (DWS) remains elusive. Patients with DWS-related hydrocephalus often require combinations of shunting systems to effectively drain both the supratentorial ventricles and posterior fossa cyst. We describe an endoscopic technique, whereby a frontally placed, single-catheter shunting system effectively drained the supratentorial and infratentorial compartments. This reduces the complexity and potential risk associated with the combined shunting systems required by so many with DWS-related hydrocephalus. Copyright 2005 S. Karger AG, Basel.

  9. Optical Imaging of Targeted β-Galactosidase in Brain Tumors to Detect EGFR Levels

    PubMed Central

    Broome, Ann-Marie; Ramamurthy, Gopal; Lavik, Kari; Liggett, Alexander; Kinstlinger, Ian; Basilion, James

    2015-01-01

    A current limitation in molecular imaging is that it often requires genetic manipulation of cancer cells for noninvasive imaging. Other methods to detect tumor cells in vivo using exogenously delivered and functionally active reporters, such as β-gal, are required. We report the development of a platform system for linking β-gal to any number of different ligands or antibodies for in vivo targeting to tissue or cells, without the requirement for genetic engineering of the target cells prior to imaging. Our studies demonstrate significant uptake in vitro and in vivo of an EGFR-targeted β-gal complex. We were then able to image orthotopic brain tumor accumulation and localization of the targeted enzyme when a fluorophore was added to the complex, as well as validate the internalization of the intravenously administered β-gal reporter complex ex vivo. After fluorescence imaging localized the β-gal complexes to the brain tumor, we topically applied a bioluminescent β-gal substrate to serial sections of the brain to evaluate the delivery and integrity of the enzyme. Finally, robust bioluminescence of the EGFR-targeted β-gal complex was captured within the tumor during noninvasive in vivo imaging. PMID:25775241

  10. Optical imaging of targeted β-galactosidase in brain tumors to detect EGFR levels.

    PubMed

    Broome, Ann-Marie; Ramamurthy, Gopal; Lavik, Kari; Liggett, Alexander; Kinstlinger, Ian; Basilion, James

    2015-04-15

    A current limitation in molecular imaging is that it often requires genetic manipulation of cancer cells for noninvasive imaging. Other methods to detect tumor cells in vivo using exogenously delivered and functionally active reporters, such as β-gal, are required. We report the development of a platform system for linking β-gal to any number of different ligands or antibodies for in vivo targeting to tissue or cells, without the requirement for genetic engineering of the target cells prior to imaging. Our studies demonstrate significant uptake in vitro and in vivo of an EGFR-targeted β-gal complex. We were then able to image orthotopic brain tumor accumulation and localization of the targeted enzyme when a fluorophore was added to the complex, as well as validate the internalization of the intravenously administered β-gal reporter complex ex vivo. After fluorescence imaging localized the β-gal complexes to the brain tumor, we topically applied a bioluminescent β-gal substrate to serial sections of the brain to evaluate the delivery and integrity of the enzyme. Finally, robust bioluminescence of the EGFR-targeted β-gal complex was captured within the tumor during noninvasive in vivo imaging.

  11. Determining the Specificity of Cascade Binding, Interference, and Primed Adaptation In Vivo in the Escherichia coli Type I-E CRISPR-Cas System.

    PubMed

    Cooper, Lauren A; Stringer, Anne M; Wade, Joseph T

    2018-04-17

    In clustered regularly interspaced short palindromic repeat (CRISPR)-Cas (CRISPR-associated) immunity systems, short CRISPR RNAs (crRNAs) are bound by Cas proteins, and these complexes target invading nucleic acid molecules for degradation in a process known as interference. In type I CRISPR-Cas systems, the Cas protein complex that binds DNA is known as Cascade. Association of Cascade with target DNA can also lead to acquisition of new immunity elements in a process known as primed adaptation. Here, we assess the specificity determinants for Cascade-DNA interaction, interference, and primed adaptation in vivo, for the type I-E system of Escherichia coli. Remarkably, as few as 5 bp of crRNA-DNA are sufficient for association of Cascade with a DNA target. Consequently, a single crRNA promotes Cascade association with numerous off-target sites, and the endogenous E. coli crRNAs direct Cascade binding to >100 chromosomal sites. In contrast to the low specificity of Cascade-DNA interactions, >18 bp are required for both interference and primed adaptation. Hence, Cascade binding to suboptimal, off-target sites is inert. Our data support a model in which the initial Cascade association with DNA targets requires only limited sequence complementarity at the crRNA 5' end, whereas recruitment and/or activation of the Cas3 nuclease, a prerequisite for interference and primed adaptation, requires extensive base pairing. IMPORTANCE Many bacterial and archaeal species encode CRISPR-Cas immunity systems that protect against invasion by foreign DNA. In the Escherichia coli CRISPR-Cas system, a protein complex, Cascade, binds 61-nucleotide (nt) CRISPR RNAs (crRNAs). The Cascade complex is directed to invading DNA molecules through base pairing between the crRNA and target DNA. This leads to recruitment of the Cas3 nuclease, which destroys the invading DNA molecule and promotes acquisition of new immunity elements. We made the first in vivo measurements of Cascade binding to DNA targets. Thus, we show that Cascade binding to DNA is highly promiscuous; endogenous E. coli crRNAs can direct Cascade binding to >100 chromosomal locations. In contrast, we show that targeted degradation and acquisition of new immunity elements require highly specific association of Cascade with DNA, limiting CRISPR-Cas function to the appropriate targets. Copyright © 2018 Cooper et al.

  12. FPGA-based coprocessor for matrix algorithms implementation

    NASA Astrophysics Data System (ADS)

    Amira, Abbes; Bensaali, Faycal

    2003-03-01

    Matrix algorithms are important in many types of applications, including image and signal processing. These areas require enormous computing power. A close examination of the algorithms used in these, and related, applications reveals that many of the fundamental actions involve matrix operations such as matrix multiplication, which has O(N^3) complexity on a sequential computer and O(N^3/p) complexity on a parallel system with p processors. This paper presents an investigation into the design and implementation of different matrix algorithms, such as matrix operations, matrix transforms and matrix decompositions, using an FPGA-based environment. Solutions for the problem of processing large matrices have been proposed. The proposed system architectures are scalable and modular, and require less area and lower time complexity, with reduced latency, when compared with existing structures.
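
    As a plain-software reminder of where the O(N^3) operation count comes from, and of how a row partition across p workers divides the work into N^3/p shares, consider the sketch below; it is a Python illustration, not an FPGA design, and the matrix size and partition are arbitrary.

      import numpy as np

      def matmul_rows(A, B, rows):
          """Multiply the given rows of A by B: each row costs N*N multiply-accumulates,
          so the full product is N^3 and a worker owning N/p rows does N^3/p of the work."""
          N = B.shape[0]
          C = np.zeros((len(rows), B.shape[1]))
          for ci, i in enumerate(rows):
              for j in range(B.shape[1]):
                  acc = 0.0
                  for k in range(N):
                      acc += A[i, k] * B[k, j]
                  C[ci, j] = acc
          return C

      N, p = 8, 4
      A, B = np.random.rand(N, N), np.random.rand(N, N)
      blocks = np.array_split(np.arange(N), p)                 # row partition across p "processors"
      C = np.vstack([matmul_rows(A, B, rows) for rows in blocks])
      print(np.allclose(C, A @ B))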

  13. Systems engineering implementation in the preliminary design phase of the Giant Magellan Telescope

    NASA Astrophysics Data System (ADS)

    Maiten, J.; Johns, M.; Trancho, G.; Sawyer, D.; Mady, P.

    2012-09-01

    Like many telescope projects today, the 24.5-meter Giant Magellan Telescope (GMT) is truly a complex system. The primary and secondary mirrors of the GMT are segmented and actuated to support two operating modes: natural seeing and adaptive optics. GMT is a general-purpose telescope supporting multiple science instruments operated in those modes. GMT is a large, diverse collaboration and development includes geographically distributed teams. The need to implement good systems engineering processes for managing the development of systems like GMT becomes imperative. The management of the requirements flow down from the science requirements to the component level requirements is an inherently difficult task in itself. The interfaces must also be negotiated so that the interactions between subsystems and assemblies are well defined and controlled. This paper will provide an overview of the systems engineering processes and tools implemented for the GMT project during the preliminary design phase. This will include requirements management, documentation and configuration control, interface development and technical risk management. Because of the complexity of the GMT system and the distributed team, using web-accessible tools for collaboration is vital. To accomplish this, GMTO has selected three tools: Cognition Cockpit, Xerox Docushare, and Solidworks Enterprise Product Data Management (EPDM). Key to this is the use of Cockpit for managing and documenting the product tree, architecture, error budget, requirements, interfaces, and risks. Additionally, drawing management is accomplished using an EPDM vault. Docushare, a documentation and configuration management tool, is used to manage workflow of documents and drawings for the GMT project. These tools electronically facilitate collaboration in real time, enabling the GMT team to track, trace and report on key project metrics and design parameters.

  14. DCS: A global satellite environmental data collection system study

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Cost analysis and technical feasibility data are presented on five medium-orbiting and six geosynchronous satellite data collection systems with varying degrees of spacecraft and local user terminal complexity. Data are also provided on system approaches, user requirements, and user classes. Systems considered include orbiting ERTS and EOS type satellites and geosynchronous SMS and SEOS type data collectors.

  15. Neurophysiology of Hunger and Satiety

    ERIC Educational Resources Information Center

    Smith, Pauline M.; Ferguson, Alastair V.

    2008-01-01

    Hunger is defined as a strong desire or need for food while satiety is the condition of being full or gratified. The maintenance of energy homeostasis requires a balance between energy intake and energy expenditure. The regulation of food intake is a complex behavior. It requires discrete nuclei within the central nervous system (CNS) to detect…

  16. Lunar Landing Operational Risk Model

    NASA Technical Reports Server (NTRS)

    Mattenberger, Chris; Putney, Blake; Rust, Randy; Derkowski, Brian

    2010-01-01

    Characterizing the risk of spacecraft goes beyond simply modeling equipment reliability. Some portions of the mission require complex interactions between system elements that can lead to failure without an actual hardware fault. Landing risk is currently the least characterized aspect of the Altair lunar lander and appears to result from complex temporal interactions between pilot, sensors, surface characteristics and vehicle capabilities rather than hardware failures. The Lunar Landing Operational Risk Model (LLORM) seeks to provide rapid and flexible quantitative insight into the risks driving the landing event and to gauge sensitivities of the vehicle to changes in system configuration and mission operations. The LLORM takes a Monte Carlo-based approach to estimate the operational risk of the Lunar Landing Event and calculates estimates of the likelihood of three outcomes: Loss of Mission (LOM), in which an abort is required and succeeds; Loss of Crew (LOC), in which the vehicle crashes or cannot reach orbit; and Success. The LLORM is meant to be used during the conceptual design phase to inform decision makers transparently of the reliability impacts of design decisions, to identify areas of the design which may require additional robustness, and to aid in the development and flow-down of requirements.
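
    LLORM itself is not described in enough detail above to reproduce; the snippet below is only a toy Monte Carlo of the same general shape: sample a landing scenario, classify it as Success, LOM, or LOC, and tally the frequencies. The scenario variables, distributions, and thresholds are invented.

      import numpy as np

      rng = np.random.default_rng(7)

      def landing_outcome():
          """Toy scenario: outcome depends on propellant margin, hazard density and sensor error."""
          propellant_margin = rng.normal(0.15, 0.05)     # fraction of descent budget remaining
          hazard_fraction   = rng.uniform(0.0, 0.3)      # fraction of landing area that is unsafe
          sensor_error      = abs(rng.normal(0.0, 1.0))  # metres of terrain-relative nav error
          divert_needed = hazard_fraction > 0.2 or sensor_error > 2.0
          if divert_needed and propellant_margin < 0.05:
              return "LOC"                               # cannot divert, vehicle crashes
          if divert_needed and propellant_margin < 0.10:
              return "LOM"                               # abort is required and succeeds
          return "success"

      N = 100_000
      outcomes = [landing_outcome() for _ in range(N)]
      for label in ("success", "LOM", "LOC"):
          print(f"P({label}) ~ {outcomes.count(label) / N:.4f}")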

  17. Thermal performance of complex fenestration systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carpenter, S.C.; Elmahdy, A.H.

    1994-12-31

    The thermal performance (i.e., U-factor) of four complex fenestration systems is examined using computer simulation tools and guarded hot box testing. The systems include a flat glazed skylight, a domed or bubble skylight, a greenhouse window, and a curtain wall. The extra care required in performing simulation and testing of these complex products is described. There was good agreement (within 10%) between test and simulation for two of the four products. The agreement was slightly poorer (maximum difference of 16%) for the two high-heat-transfer products: the domed skylight and the greenhouse window. Possible causes for the larger discrepancy in these projecting window products are uncertainties in the inside and outside film coefficients and lower warm-side air temperatures because of stagnant airflow.

  18. How to predict community responses to perturbations in the face of imperfect knowledge and network complexity

    USGS Publications Warehouse

    Aufderheide, Helge; Rudolf, Lars; Gross, Thilo; Lafferty, Kevin D.

    2013-01-01

    Recent attempts to predict the response of large food webs to perturbations have revealed that in larger systems increasingly precise information on the elements of the system is required. Thus, the effort needed for good predictions grows quickly with the system's complexity. Here, we show that not all elements need to be measured equally well, suggesting that a more efficient allocation of effort is possible. We develop an iterative technique for determining an efficient measurement strategy. In model food webs, we find that it is most important to precisely measure the mortality and predation rates of long-lived, generalist, top predators. Prioritizing the study of such species will make it easier to understand the response of complex food webs to perturbations.
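
    The iterative measurement-allocation technique is not spelled out in the abstract. As a rough illustration of the underlying idea, the toy Monte Carlo below samples uncertain interaction strengths for a three-species web, computes the press-perturbation response from the community matrix, and asks which parameter correlates most strongly with the spread in the prediction, so that measurement effort could be focused there. The web, priors, and sensitivity measure are invented, not taken from the paper.

      import numpy as np

      rng = np.random.default_rng(4)

      # Toy 3-species web (resource -> consumer -> top predator).  Interaction strengths
      # are only known as ranges; sample them and see which ones drive prediction spread.
      def sample_community():
          a_rc = rng.uniform(0.1, 1.0)     # effect of resource on consumer
          a_cr = rng.uniform(0.1, 1.0)     # (negative) effect of consumer on resource
          a_pc = rng.uniform(0.1, 1.0)     # effect of consumer on predator
          a_cp = rng.uniform(0.1, 1.0)     # (negative) effect of predator on consumer
          m_p  = rng.uniform(0.05, 0.5)    # predator self-regulation / mortality
          A = np.array([[-1.0,  -a_cr,  0.0],
                        [ a_rc, -0.2,  -a_cp],
                        [ 0.0,   a_pc, -m_p]])
          return {"a_rc": a_rc, "a_cr": a_cr, "a_pc": a_pc, "a_cp": a_cp, "m_p": m_p}, A

      samples, responses = [], []
      press = np.array([1.0, 0.0, 0.0])                 # sustained increase in the resource
      for _ in range(5000):
          params, A = sample_community()
          x = -np.linalg.solve(A, press)                # long-term press response of all species
          samples.append(params)
          responses.append(x[2])                        # prediction of interest: top predator

      responses = np.array(responses)
      for name in samples[0]:
          values = np.array([s[name] for s in samples])
          r = np.corrcoef(values, responses)[0, 1]
          print(f"{name:5s} |corr with prediction| = {abs(r):.2f}")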

  19. Design of neural network model-based controller in a fed-batch microbial electrolysis cell reactor for bio-hydrogen gas production

    NASA Astrophysics Data System (ADS)

    Azwar; Hussain, M. A.; Abdul-Wahab, A. K.; Zanil, M. F.; Mukhlishien

    2018-03-01

    One of the major challenges in the bio-hydrogen production process using the microbial electrolysis cell (MEC) process is that it is a nonlinear and highly complex system. This is mainly due to the presence of microbial interactions and highly complex phenomena in the system. Its complexity makes the MEC system difficult to operate and control under optimal conditions. Thus, precise control is required for the MEC reactor, so that the amount of current required to produce hydrogen gas can be controlled according to the composition of the substrate in the reactor. In this work, two schemes for controlling the current and voltage of the MEC were evaluated. The controllers evaluated are a PID controller and an inverse neural network (NN) controller. The comparative study was carried out under optimal conditions for the production of bio-hydrogen gas, wherein the controller output is based on the correlation of optimal current and voltage for the MEC. Various simulation tests involving multiple set-point changes and disturbance rejection have been evaluated, and the performances of both controllers are discussed. The neural network-based controller results in a faster response time and smaller overshoots, while the offset effects are minimal. In conclusion, the inverse neural network (NN)-based controller provides better control performance for the MEC system compared to the PID controller.
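
    Neither the MEC model nor the NN controller can be reconstructed from the abstract. For reference, the PID baseline it compares against has the standard discrete form sketched below, here regulating a made-up first-order current loop; the gains, time constant, and set point are illustrative only.

      def pid_step(error, state, kp, ki, kd, dt):
          """One update of a discrete PID controller; state carries the integral and last error."""
          integral, last_error = state
          integral += error * dt
          derivative = (error - last_error) / dt
          u = kp * error + ki * integral + kd * derivative
          return u, (integral, error)

      # Toy plant: first-order response of MEC current to applied voltage (made-up constants)
      setpoint, current, dt = 0.25, 0.0, 0.1      # ampere target, initial current, seconds
      state = (0.0, 0.0)
      for step in range(200):
          u, state = pid_step(setpoint - current, state, kp=2.0, ki=1.5, kd=0.05, dt=dt)
          current += dt * (-current + 0.5 * u)    # dI/dt = (-I + gain*V)/tau with tau = 1 s
          if step % 50 == 0:
              print(f"t = {step*dt:5.1f} s  current = {current:.3f} A")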

  20. Immune complex-induced human monocyte procoagulant activity. I. a rapid unidirectional lymphocyte-instructed pathway.

    PubMed

    Schwartz, B S; Edgington, T S

    1981-09-01

    It has previously been described that soluble antigen:antibody complexes in antigen excess can induce an increase in the procoagulant activity of human peripheral blood mononuclear cells. It has been proposed that this response may explain the presence of fibrin in immune complex-mediated tissue lesions. In the present study we define cellular participants and their roles in the procoagulant response to soluble immune complexes. Monocytes were shown by cell fractionation and by a direct cytologic assay to be the cell of origin of the procoagulant activity, and virtually all monocytes were able to participate in the response. Monocytes, however, required the presence of lymphocytes to respond. The procoagulant response required cell cooperation, and this collaborative interaction between lymphocytes and monocytes appeared to be unidirectional. Lymphocytes, once triggered by immune complexes, induced monocytes to synthesize the procoagulant product. Intact viable lymphocytes were required to present instructions to monocytes; no soluble mediator could be found to subserve this function. Indeed, all that appeared necessary to induce monocytes to produce procoagulant activity was an encounter with lymphocytes that had previously been in contact with soluble immune complexes. The optimum cellular ratio for this interaction was four lymphocytes per monocyte, about half the ratio in peripheral blood. The procoagulant response was rapid, reaching a maximum within 6 h after exposure to antigen:antibody complexes. The procoagulant activity was consistent with tissue factor because Factors VII and X and prothrombin were required for clotting of fibrinogen. We propose that this pathway differs from a number of others involving cells of the immune system. Elucidation of the pathway may clarify the role of this lymphocyte-instructed monocyte response in the Shwartzman phenomenon and other thrombohemorrhagic events associated with immune cell function and the formation of immune complexes.

  1. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
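
    The authors' mechanical transformation into a formal (process-algebraic) model is not reproduced in the abstract. As a much simpler illustration of turning a finite set of scenarios into a formal model, the sketch below folds example event sequences into a prefix-tree automaton that accepts exactly the specified behaviours; it is a stand-in for the general idea, not the paper's method, and the scenarios are invented.

      # Fold a finite set of requirement scenarios (event sequences) into a
      # prefix-tree automaton; the scenarios themselves are invented examples.
      scenarios = [
          ["power_on", "self_test", "standby"],
          ["power_on", "self_test", "fault", "safe_mode"],
          ["power_on", "standby"],
      ]

      def build_automaton(scenarios):
          transitions, accepting, next_state = {}, set(), 1
          for seq in scenarios:
              state = 0
              for event in seq:
                  if (state, event) not in transitions:
                      transitions[(state, event)] = next_state
                      next_state += 1
                  state = transitions[(state, event)]
              accepting.add(state)
          return transitions, accepting

      def accepts(transitions, accepting, seq):
          state = 0
          for event in seq:
              if (state, event) not in transitions:
                  return False
              state = transitions[(state, event)]
          return state in accepting

      transitions, accepting = build_automaton(scenarios)
      print(accepts(transitions, accepting, ["power_on", "self_test", "standby"]))   # True
      print(accepts(transitions, accepting, ["power_on", "fault"]))                  # False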

  2. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  3. Digital control of highly augmented combat rotorcraft

    NASA Technical Reports Server (NTRS)

    Tischler, Mark B.

    1987-01-01

    Proposed concepts for the next generation of combat helicopters are to be embodied in a complex, highly maneuverable, multirole vehicle with avionics systems. Single pilot and nap-of-the-Earth operations require handling qualities which minimize the involvement of the pilot in basic stabilization tasks. To meet these requirements will demand a full authority, high-gain, multimode, multiply-redundant, digital flight-control system. The gap between these requirements and current low-authority, low-bandwidth operational rotorcraft flight-control technology is considerable. This research aims at smoothing the transition between current technology and advanced concept requirements. The state of the art of high-bandwidth digital flight-control systems is reviewed; areas of specific concern for flight-control systems of modern combat are exposed; and the important concepts are illustrated in the design and analysis of high-gain, digital systems with a detailed case study involving a current rotorcraft system. Approximate and exact methods are explained and illustrated for treating the important concerns which are unique to digital systems.

  4. Proceedings of the Symposium on Long-Life Hardware for Space

    NASA Technical Reports Server (NTRS)

    1970-01-01

    A two-volume edition of the symposium papers is described. It is divided into six sections: parts, materials, management, system testing, component design, and system test. The material presented focuses attention on problems created by the increased complexity of technology and by long-term mission requirements.

  5. META II Complex Systems Design and Analysis (CODA)

    DTIC Science & Technology

    2011-08-01

    Only front-matter fragments of this report are indexed in place of an abstract, including the table-of-contents entries "Variables, Parameters and Constraints" and "Objective" and the figure captions "Inputs, States, Outputs and Parameters of System Requirements Specifications", "AEE Device Design Rule Based on Device Parameter", and "AEE Device Design Rules (excerpt)".

  6. Natural selection and self-organization in complex adaptive systems.

    PubMed

    Di Bernardo, Mirko

    2010-01-01

    The central theme of this work is self-organization, "interpreted" both from the point of view of theoretical biology and from a philosophical point of view. By analysing, on the one hand, complex systems and deterministic chaos, which are now considered (and not only in the field of physics) to be among the most important discoveries, and, on the other hand, the new frontiers of systemic biology, this work highlights how large open thermodynamic systems can spontaneously stay in an orderly regime. Such systems can represent the natural source of the order required for a stable self-organization, for homoeostasis and for hereditary variations. The order, emerging in enormous randomly interconnected nets of binary variables, is almost certainly only the precursor of similar orders emerging in all the varieties of complex systems. Hence, this work, by finding new foundations for the order pervading the living world, advances the daring hypothesis according to which Darwinian natural selection is not the only source of order in the biosphere. Thus, the article, by examining the passage from Prigogine's dissipative structures theory to the contemporary theory of biological complexity, highlights the development of a coherent and continuous line of research which sets out to identify the general principles underlying the mysterious self-organization that characterizes the complexity of life.

  7. Decision Support System Requirements Definition for Human Extravehicular Activity Based on Cognitive Work Analysis

    PubMed Central

    Miller, Matthew James; McGuire, Kerry M.; Feigh, Karen M.

    2016-01-01

    The design and adoption of decision support systems within complex work domains is a challenge for cognitive systems engineering (CSE) practitioners, particularly at the onset of project development. This article presents an example of applying CSE techniques to derive design requirements compatible with traditional systems engineering to guide decision support system development. Specifically, it demonstrates the requirements derivation process based on cognitive work analysis for a subset of human spaceflight operations known as extravehicular activity. The results are presented in two phases. First, a work domain analysis revealed a comprehensive set of work functions and constraints that exist in the extravehicular activity work domain. Second, a control task analysis was performed on a subset of the work functions identified by the work domain analysis to articulate the translation of subject matter states of knowledge to high-level decision support system requirements. This work emphasizes an incremental requirements specification process as a critical component of CSE analyses to better situate CSE perspectives within the early phases of traditional systems engineering design. PMID:28491008

  8. Decision Support System Requirements Definition for Human Extravehicular Activity Based on Cognitive Work Analysis.

    PubMed

    Miller, Matthew James; McGuire, Kerry M; Feigh, Karen M

    2017-06-01

    The design and adoption of decision support systems within complex work domains is a challenge for cognitive systems engineering (CSE) practitioners, particularly at the onset of project development. This article presents an example of applying CSE techniques to derive design requirements compatible with traditional systems engineering to guide decision support system development. Specifically, it demonstrates the requirements derivation process based on cognitive work analysis for a subset of human spaceflight operations known as extravehicular activity. The results are presented in two phases. First, a work domain analysis revealed a comprehensive set of work functions and constraints that exist in the extravehicular activity work domain. Second, a control task analysis was performed on a subset of the work functions identified by the work domain analysis to articulate the translation of subject matter states of knowledge to high-level decision support system requirements. This work emphasizes an incremental requirements specification process as a critical component of CSE analyses to better situate CSE perspectives within the early phases of traditional systems engineering design.

  9. Parametric Cost Analysis: A Design Function

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.

    1989-01-01

    Parametric cost analysis uses equations to map measurable system attributes into cost. The measures of the system attributes are called metrics. The equations are called cost estimating relationships (CER's), and are obtained by the analysis of cost and technical metric data of products analogous to those to be estimated. Examples of system metrics include mass, power, failure_rate, mean_time_to_repair, energy_consumed, payload_to_orbit, pointing_accuracy, manufacturing_complexity, number_of_fasteners, and percent_of_electronics_weight. The basic assumption is that a measurable relationship exists between system attributes and the cost of the system. If a function exists, the attributes are cost drivers. Candidates for metrics include system requirement metrics and engineering process metrics. Requirements are constraints on the engineering process. From optimization theory we know that any active constraint generates cost by not permitting full optimization of the objective. Thus, requirements are cost drivers. Engineering processes reflect a projection of the requirements onto the corporate culture, engineering technology, and system technology. Engineering processes are an indirect measure of the requirements and, hence, are cost drivers.
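
    A common concrete form of a CER is a power law in a single driver metric, cost = a * mass^b, fitted in log space to data from analogous products; the sketch below does exactly that with invented cost and mass figures.

      import numpy as np

      # Invented cost/mass data for products "analogous" to the one being estimated
      mass_kg = np.array([120.0, 250.0, 480.0, 900.0, 1500.0])
      cost_m  = np.array([ 14.0,  24.0,  41.0,  68.0,  104.0])   # $M

      # Fit the CER  cost = a * mass^b  by least squares in log-log space
      b, log_a = np.polyfit(np.log(mass_kg), np.log(cost_m), 1)
      a = np.exp(log_a)
      print(f"CER: cost ~ {a:.3f} * mass^{b:.3f}")

      # Apply the CER to a new design concept
      new_mass = 700.0
      print(f"estimated cost for {new_mass} kg: ${a * new_mass**b:.1f}M")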

  10. The Winning Number: An Heuristic Approach with the Geogebra's Help

    ERIC Educational Resources Information Center

    Alves, Francisco Regis Vieira

    2016-01-01

    Admittedly, the study of Complex Analysis (CA) requires considerable mental effort from the student, characterized by the mobilization of thinking related to complex mathematical concepts. Thus, with the aid of the dynamic system Geogebra, we discuss in this paper a particular concept in CA. In fact, the notion of winding number v[f(gamma),P] =…

  11. Support for Self-Regulation in Learning Complex Topics from Multimedia Explanations: Do Learners Need Extensive or Minimal Support?

    ERIC Educational Resources Information Center

    Rodicio, Hector Garcia; Sanchez, Emilio; Acuna, Santiago R.

    2013-01-01

    Acquiring complex conceptual knowledge requires learners to self-regulate their learning by planning, monitoring, and adjusting the process but they find it difficult to do so. In one experiment, we examined whether learners need broad systems of support for self-regulation or whether they are also able to learn with more economical support…

  12. Training Knowledge Bots for Physics-Based Simulations Using Artificial Neural Networks

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.; Wong, Jay Ming

    2014-01-01

    Millions of complex physics-based simulations are required for the design of an aerospace vehicle. These simulations are usually performed by highly trained and skilled analysts, who execute, monitor, and steer each simulation. Analysts rely heavily on their broad experience that may have taken 20-30 years to accumulate. In addition, the simulation software is complex in nature, requiring significant computational resources. Simulations of system of systems become even more complex and are beyond human capacity to effectively learn their behavior. IBM has developed machines that can learn and compete successfully with a chess grandmaster and the most successful Jeopardy contestants. These machines are capable of learning some complex problems much faster than humans can learn. In this paper, we propose using artificial neural networks to train knowledge bots to identify the idiosyncrasies of simulation software and recognize patterns that can lead to successful simulations. We examine the use of knowledge bots for applications of computational fluid dynamics (CFD), trajectory analysis, commercial finite-element analysis software, and slosh propellant dynamics. We will show that machine learning algorithms can be used to learn the idiosyncrasies of computational simulations and identify regions of instability without including any additional information about their mathematical form or applied discretization approaches.
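    The sketch below shows, in miniature, what training such a knowledge bot could look like: a small neural network learns to flag input regions where runs tend to fail, using synthetic labels in place of real CFD or trajectory outcomes. The feature names, thresholds, and scikit-learn model are illustrative assumptions, not the authors' implementation.

        # Illustrative sketch (not the paper's actual knowledge bot): train a small
        # neural network to flag simulation inputs likely to produce unstable runs.
        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        # Hypothetical inputs: (Mach number, angle of attack); label 1 = unstable run
        X = rng.uniform([0.3, 0.0], [2.0, 20.0], size=(2000, 2))
        y = ((X[:, 0] > 0.9) & (X[:, 0] < 1.1) & (X[:, 1] > 12.0)).astype(int)

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
        bot = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
        bot.fit(X_train, y_train)
        print("held-out accuracy:", bot.score(X_test, y_test))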

  13. Evaluation in context: ATC automation in the field

    NASA Technical Reports Server (NTRS)

    Harwood, Kelly; Sanford, Beverly

    1994-01-01

    The process for incorporating advanced technologies into complex aviation systems is as important as the final product itself. This paper described a process that is currently being applied to the development and assessment of an advanced ATC automation system, CTAS. The key element of the process is field exposure early in the system development cycle. The process deviates from current established practices of system development -- where field testing is an implementation endpoint -- and has been deemed necessary by the FAA for streamlining development and bringing system functions to a level of stability and usefulness. Methods and approaches for field assessment are borrowed from human factors engineering, cognitive engineering, and usability engineering and are tailored for the constraints of an operational ATC environment. To date, the focus has been on the qualitative assessment of the match between TMA capabilities and the context for their use. Capturing the users' experience with the automation tool and understanding tool use in the context of the operational environment is important, not only for developing a tool that is an effective problem-solving instrument but also for defining meaningful operational requirements. Such requirements form the basis for certifying the safety and efficiency of the system. CTAS is the first U.S. advanced ATC automation system of its scope and complexity to undergo this field development and assessment process. With the rapid advances in aviation technologies and our limited understanding of their impact on system performance, it is time we opened our eyes to new possibilities for developing, validating, and ultimately certifying complex aviation systems.

  14. Use of the Trusted Computer System Evaluation Criteria (TCSEC) for Complex, Evolving, Multipolicy Systems.

    DTIC Science & Technology

    1994-07-01

    incorporate the Bell-La Padula rules for implementing the DoD security policy. The policy from which we begin here is the organization’s operational...security policy, which assumes the Bell-La Padula model and assigns the required security variables to elements of the system. A way to ensure a

  15. A Middleware Platform for Providing Mobile and Embedded Computing Instruction to Software Engineering Students

    ERIC Educational Resources Information Center

    Mattmann, C. A.; Medvidovic, N.; Malek, S.; Edwards, G.; Banerjee, S.

    2012-01-01

    As embedded software systems have grown in number, complexity, and importance in the modern world, a corresponding need to teach computer science students how to effectively engineer such systems has arisen. Embedded software systems, such as those that control cell phones, aircraft, and medical equipment, are subject to requirements and…

  16. Defense Systems Modernization and Sustainment Initiative

    DTIC Science & Technology

    2014-03-31

    research programs focus on sustainable production, sustainable energy, sustainable mobility, and ecologically friendly information technology systems...for Sustainable Mobility (CSM): focused on developing viable technologies for sustainable transportation systems and the support of complex equipment...utilization of mobile devices. The objective of the evaluation was to identify features that the new implementation of LEEDS would require, such as

  17. Developing a Software for Fuzzy Group Decision Support System: A Case Study

    ERIC Educational Resources Information Center

    Baba, A. Fevzi; Kuscu, Dincer; Han, Kerem

    2009-01-01

    The complex nature and uncertain information in social problems required the emergence of fuzzy decision support systems in social areas. In this paper, we developed user-friendly Fuzzy Group Decision Support Systems (FGDSS) software. The software can be used for multi-purpose decision making processes. It helps the users determine the main and…

  18. The topological requirements for robust perfect adaptation in networks of any size.

    PubMed

    Araujo, Robyn P; Liotta, Lance A

    2018-05-01

    Robustness, and the ability to function and thrive amid changing and unfavorable environments, is a fundamental requirement for living systems. Until now it has been an open question how large and complex biological networks can exhibit robust behaviors, such as perfect adaptation to a variable stimulus, since complexity is generally associated with fragility. Here we report that all networks that exhibit robust perfect adaptation (RPA) to a persistent change in stimulus are decomposable into well-defined modules, of which there exist two distinct classes. These two modular classes represent a topological basis for all RPA-capable networks, and generate the full set of topological realizations of the internal model principle for RPA in complex, self-organizing, evolvable bionetworks. This unexpected result supports the notion that evolutionary processes are empowered by simple and scalable modular design principles that promote robust performance no matter how large or complex the underlying networks become.

  19. Generation of two-dimensional binary mixtures in complex plasmas

    NASA Astrophysics Data System (ADS)

    Wieben, Frank; Block, Dietmar

    2016-10-01

    Complex plasmas are an excellent model system for strong coupling phenomena. Under certain conditions the dust particles immersed in the plasma form crystals which can be analyzed in terms of structure and dynamics. Previous experiments focussed mostly on monodisperse particle systems, whereas dusty plasmas in nature and technology are polydisperse. Thus, a first and important step towards experiments in polydisperse systems is binary mixtures. Recent experiments on binary mixtures under microgravity conditions observed a phase separation of particle species with different radii even for small size disparities. This contradicts several numerical studies of 2D binary mixtures. Therefore, dedicated experiments are required to gain more insight into the physics of polydisperse systems. In this contribution, first ground-based experiments on two-dimensional binary mixtures are presented. Particular attention is paid to the requirements for the generation of such systems, which involve the consideration of the temporal evolution of the particle properties. Furthermore, the structure of these two-component crystals is analyzed and compared to simulations. This work was supported by the Deutsche Forschungsgemeinschaft DFG in the framework of the SFB TR24 Greifswald Kiel, Project A3b.

  20. Enhancing Integrated Pest Management in GM Cotton Systems Using Host Plant Resistance

    PubMed Central

    Trapero, Carlos; Wilson, Iain W.; Stiller, Warwick N.; Wilson, Lewis J.

    2016-01-01

    Cotton has lost many ancestral defensive traits against key invertebrate pests. This is suggested by the levels of resistance to some pests found in wild cotton genotypes as well as in cultivated landraces and is a result of domestication and a long history of targeted breeding for yield and fiber quality, along with the capacity to control pests with pesticides. Genetic modification (GM) allowed integration of toxins from a bacteria into cotton to control key Lepidopteran pests. Since the mid-1990s, use of GM cotton cultivars has greatly reduced the amount of pesticides used in many cotton systems. However, pests not controlled by the GM traits have usually emerged as problems, especially the sucking bug complex. Control of this complex with pesticides often causes a reduction in beneficial invertebrate populations, allowing other secondary pests to increase rapidly and require control. Control of both sucking bug complex and secondary pests is problematic due to the cost of pesticides and/or high risk of selecting for pesticide resistance. Deployment of host plant resistance (HPR) provides an opportunity to manage these issues in GM cotton systems. Cotton cultivars resistant to the sucking bug complex and/or secondary pests would require fewer pesticide applications, reducing costs and risks to beneficial invertebrate populations and pesticide resistance. Incorporation of HPR traits into elite cotton cultivars with high yield and fiber quality offers the potential to further reduce pesticide use and increase the durability of pest management in GM cotton systems. We review the challenges that the identification and use of HPR against invertebrate pests brings to cotton breeding. We explore sources of resistance to the sucking bug complex and secondary pests, the mechanisms that control them and the approaches to incorporate these defense traits to commercial cultivars. PMID:27148323

  1. Optical intersatellite links - Application to commercial satellite communications

    NASA Technical Reports Server (NTRS)

    Paul, D.; Faris, F.; Garlow, R.; Inukai, T.; Pontano, B.; Razdan, R.; Ganz, Aura; Caudill, L.

    1992-01-01

    Application of optical intersatellite links for commercial satellite communications services is addressed in this paper. The feasibility of commercialization centers around basic issues such as the need and derived benefits, implementation complexity and overall cost. In this paper, commercialization of optical ISLs is assessed in terms of the services provided, systems requirements and feasibility of appropriate technology. Both long- and short-range ISLs for GEO-GEO, GEO-LEO and LEO applications are considered. Impact of systems requirements on the payload design and use of advanced technology in reducing its mass, power, and volume requirements are discussed.

  2. Present capabilities and future requirements for computer-aided geometric modeling in the design and manufacture of gas turbine

    NASA Technical Reports Server (NTRS)

    Caille, E.; Propen, M.; Hoffman, A.

    1984-01-01

    Gas turbine engine design requires the ability to rapidly develop complex structures which are subject to severe thermal and mechanical operating loads. As in all facets of the aerospace industry, engine designs are constantly driving towards increased performance, higher temperatures, higher speeds, and lower weight. The ability to address such requirements in a relatively short time frame has resulted in a major thrust towards integrated design/analysis/manufacturing systems. These computer driven graphics systems represent a unique challenge, with major payback opportunities if properly conceived, implemented, and applied.

  3. 29 CFR 541.402 - Executive and administrative computer employees.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... planning, scheduling, and coordinating activities required to develop systems to solve complex business, scientific or engineering problems of the employer or the employer's customers. Similarly, a senior or lead...

  4. Optical techniques to feed and control GaAs MMIC modules for phased array antenna applications

    NASA Astrophysics Data System (ADS)

    Bhasin, K. B.; Anzic, G.; Kunath, R. R.; Connolly, D. J.

    A complex signal distribution system is required to feed and control GaAs monolithic microwave integrated circuits (MMICs) for phased array antenna applications above 20 GHz. Each MMIC module will require one or more RF lines, one or more bias voltage lines, and digital lines to provide a minimum of 10 bits of combined phase and gain control information. In a closely spaced array, the routing of these multiple lines presents difficult topology problems as well as a high probability of signal interference. To overcome GaAs MMIC phased array signal distribution problems optical fibers interconnected to monolithically integrated optical components with GaAs MMIC array elements are proposed as a solution. System architecture considerations using optical fibers are described. The analog and digital optical links to respectively feed and control MMIC elements are analyzed. It is concluded that a fiber optic network will reduce weight and complexity, and increase reliability and performance, but higher power will be required.

  5. Optical techniques to feed and control GaAs MMIC modules for phased array antenna applications

    NASA Technical Reports Server (NTRS)

    Bhasin, K. B.; Anzic, G.; Kunath, R. R.; Connolly, D. J.

    1986-01-01

    A complex signal distribution system is required to feed and control GaAs monolithic microwave integrated circuits (MMICs) for phased array antenna applications above 20 GHz. Each MMIC module will require one or more RF lines, one or more bias voltage lines, and digital lines to provide a minimum of 10 bits of combined phase and gain control information. In a closely spaced array, the routing of these multiple lines presents difficult topology problems as well as a high probability of signal interference. To overcome GaAs MMIC phased array signal distribution problems optical fibers interconnected to monolithically integrated optical components with GaAs MMIC array elements are proposed as a solution. System architecture considerations using optical fibers are described. The analog and digital optical links to respectively feed and control MMIC elements are analyzed. It is concluded that a fiber optic network will reduce weight and complexity, and increase reliability and performance, but higher power will be required.

  6. GT-CATS: Tracking Operator Activities in Complex Systems

    NASA Technical Reports Server (NTRS)

    Callantine, Todd J.; Mitchell, Christine M.; Palmer, Everett A.

    1999-01-01

    Human operators of complex dynamic systems can experience difficulties supervising advanced control automation. One remedy is to develop intelligent aiding systems that can provide operators with context-sensitive advice and reminders. The research reported herein proposes, implements, and evaluates a methodology for activity tracking, a form of intent inferencing that can supply the knowledge required for an intelligent aid by constructing and maintaining a representation of operator activities in real time. The methodology was implemented in the Georgia Tech Crew Activity Tracking System (GT-CATS), which predicts and interprets the actions performed by Boeing 757/767 pilots navigating using autopilot flight modes. This report first describes research on intent inferencing and complex modes of automation. It then provides a detailed description of the GT-CATS methodology, knowledge structures, and processing scheme. The results of an experimental evaluation using airline pilots are given. The results show that GT-CATS was effective in predicting and interpreting pilot actions in real time.

  7. A method for work modeling at complex systems: towards applying information systems in family health care units.

    PubMed

    Jatobá, Alessandro; de Carvalho, Paulo Victor R; da Cunha, Amauri Marques

    2012-01-01

    Work in organizations requires a minimum level of consensus on the understanding of the practices performed. To adopt technological devices to support the activities in environments where work is complex, characterized by the interdependence among a large number of variables, understanding about how work is done not only takes an even greater importance, but also becomes a more difficult task. Therefore, this study aims to present a method for modeling of work in complex systems, which allows improving the knowledge about the way activities are performed where these activities do not simply happen by performing procedures. Uniting techniques of Cognitive Task Analysis with the concept of Work Process, this work seeks to provide a method capable of providing a detailed and accurate vision of how people perform their tasks, in order to apply information systems for supporting work in organizations.

  8. "And DPSIR begat DAPSI(W)R(M)!" - A unifying framework for marine environmental management.

    PubMed

    Elliott, M; Burdon, D; Atkins, J P; Borja, A; Cormier, R; de Jonge, V N; Turner, R K

    2017-05-15

    The marine environment is a complex system formed by interactions between ecological structure and functioning, physico-chemical processes and socio-economic systems. An increase in competing marine uses and users requires a holistic approach to marine management which considers the environmental, economic and societal impacts of all activities. If managed sustainably, the marine environment will deliver a range of ecosystem services which lead to benefits for society. In order to understand the complexity of the system, the DPSIR (Driver-Pressure-State-Impact-Response) approach has long been a valuable problem-structuring framework used to assess the causes, consequences and responses to change in a holistic way. Despite DPSIR being used for a long time, there is still confusion over the definition of its terms and so to be appropriate for current marine management, we contend that this confusion needs to be addressed. Our viewpoint advocates that DPSIR should be extended to DAPSI(W)R(M) (pronounced dap-see-worm) in which Drivers of basic human needs require Activities which lead to Pressures. The Pressures are the mechanisms of State change on the natural system which then leads to Impacts (on human Welfare). Those then require Responses (as Measures). Furthermore, because of the complexity of any managed sea area in terms of multiple Activities, there is the need for a linked-DAPSI(W)R(M) framework, and then the connectivity between marine ecosystems and ecosystems in the catchment and further at sea, requires an interlinked, nested-DAPSI(W)R(M) framework to reflect the continuum between adjacent ecosystems. Finally, the unifying framework for integrated marine management is completed by encompassing ecosystem structure and functioning, ecosystem services and societal benefits. Hence, DAPSI(W)R(M) links the socio-ecological system of the effects of changes to the natural system on the human uses and benefits of the marine system. However, to deliver these sustainably in the light of human activities requires a Risk Assessment and Risk Management framework; the ISO-compliant Bow-Tie method is used here as an example. Finally, to secure ecosystem health and economic benefits such as Blue Growth, successful, adaptive and sustainable marine management Responses (as Measures) are delivered using the 10-tenets, a set of facets covering all management disciplines and approaches. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.

  9. A Science Rationale for Mobility in Planetary Environments

    NASA Technical Reports Server (NTRS)

    1999-01-01

    For the last several decades, the Committee on Planetary and Lunar Exploration (COMPLEX) has advocated a systematic approach to exploration of the solar system; that is, the information and understanding resulting from one mission provide the scientific foundations that motivate subsequent, more elaborate investigations. COMPLEX's 1994 report, An Integrated Strategy for the Planetary Sciences: 1995-2010,1 advocated an approach to planetary studies emphasizing "hypothesizing and comprehending" rather than "cataloging and categorizing." More recently, NASA reports, including The Space Science Enterprise Strategic Plan2 and, in particular, Mission to the Solar System: Exploration and Discovery-A Mission and Technology Roadmap,3 have outlined comprehensive plans for planetary exploration during the next several decades. The missions outlined in these plans are both generally consistent with the priorities outlined in the Integrated Strategy and other NRC reports,4-5 and are replete with examples of devices embodying some degree of mobility in the form of rovers, robotic arms, and the like. Because the change in focus of planetary studies called for in the Integrated Strategy appears to require an evolutionary change in the technical means by which solar system exploration missions are conducted, the Space Studies Board charged COMPLEX to review the science that can be uniquely addressed by mobility in planetary environments. In particular, COMPLEX was asked to address the following questions: (1) What are the practical methods for achieving mobility? (2) For surface missions, what are the associated needs for sample acquisition? (3) What is the state of technology for planetary mobility in the United States and elsewhere, and what are the key requirements for technology development? (4) What terrestrial field demonstrations are required prior to spaceflight missions?

  10. A Scientific Rationale for Mobility in Planetary Environments

    NASA Astrophysics Data System (ADS)

    1999-01-01

    For the last several decades, the COMmittee on Planetary and Lunar EXploration (COMPLEX) has advocated a systematic approach to exploration of the solar system; that is, the information and understanding resulting from one mission provide the scientific foundations that motivate subsequent, more elaborate investigations. COMPLEX's 1994 report, An Integrated Strategy for the Planetary Sciences: 1995-2010,1 advocated an approach to planetary studies emphasizing "hypothesizing and comprehending" rather than "cataloging and categorizing." More recently, NASA reports, including The Space Science Enterprise Strategic Plan2 and, in particular, Mission to the Solar System: Exploration and Discovery-A Mission and Technology Roadmap,3 have outlined comprehensive plans for planetary exploration during the next several decades. The missions outlined in these plans are both generally consistent with the priorities outlined in the Integrated Strategy and other NRC reports,4,5 and are replete with examples of devices embodying some degree of mobility in the form of rovers, robotic arms, and the like. Because the change in focus of planetary studies called for in the Integrated Strategy appears to require an evolutionary change in the technical means by which solar system exploration missions are conducted, the Space Studies Board charged COMPLEX to review the science that can be uniquely addressed by mobility in planetary environments. In particular, COMPLEX was asked to address the following questions: 1. What are the practical methods for achieving mobility? 2. For surface missions, what are the associated needs for sample acquisition? 3. What is the state of technology for planetary mobility in the United States and elsewhere, and what are the key requirements for technology development? 4. What terrestrial field demonstrations are required prior to spaceflight missions?

  11. Integrated Safety Analysis Teams

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jonathan C.

    2008-01-01

    Today's complex systems require understanding beyond one person's capability to comprehend. Each system requires a team to divide the system into understandable subsystems which can then be analyzed with an Integrated Hazard Analysis. The team must have both specific experiences and diversity of experience. Safety experience and system understanding are not always manifested in one individual. Group dynamics make the difference between success and failure as well as the difference between a difficult task and a rewarding experience. There are examples in the news which demonstrate the need to connect the pieces of a system into a complete picture. The Columbia disaster is now a standard example of how a low-consequence hazard in one part of the system, the External Tank, was a catastrophic hazard cause for a companion subsystem, the Space Shuttle Orbiter. The interaction between the hardware, the manufacturing process, the handling, and the operations contributed to the problem. Each of these had analysis performed, but who constituted the team which integrated this analysis together? This paper will explore some of the methods used for dividing up a complex system and how one integration team has analyzed the parts. How this analysis has been documented in one particular space launch vehicle case will also be discussed.

  12. Self-dissimilarity as a High Dimensional Complexity Measure

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.; Macready, William

    2005-01-01

    For many systems characterized as "complex" the patterns exhibited on different scales differ markedly from one another. For example the biomass distribution in a human body "looks very different" depending on the scale at which one examines it. Conversely, the patterns at different scales in "simple" systems (e.g., gases, mountains, crystals) vary little from one scale to another. Accordingly, the degrees of self-dissimilarity between the patterns of a system at various scales constitute a complexity "signature" of that system. Here we present a novel quantification of self-dissimilarity. This signature can, if desired, incorporate a novel information-theoretic measure of the distance between probability distributions that we derive here. Whatever distance measure is chosen, our quantification of self-dissimilarity can be measured for many kinds of real-world data. This allows comparisons of the complexity signatures of wholly different kinds of systems (e.g., systems involving information density in a digital computer vs. species densities in a rain-forest vs. capital density in an economy, etc.). Moreover, in contrast to many other suggested complexity measures, evaluating the self-dissimilarity of a system does not require one to already have a model of the system. These facts may allow self-dissimilarity signatures to be used as the underlying observational variables of an eventual overarching theory relating all complex systems. To illustrate self-dissimilarity we present several numerical experiments. In particular, we show that the underlying structure of the logistic map is picked out by the self-dissimilarity signature of time series produced by that map.
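    One way to make the idea concrete is sketched below: coarse-grain a signal at two different scales, histogram the coarse-grained values, and report the Jensen-Shannon distance between the two distributions as a self-dissimilarity score. This is a simplified stand-in, assuming a particular coarse-graining and distance measure rather than the authors' own information-theoretic measure; the logistic-map signal mirrors the abstract's example.

        # Simplified self-dissimilarity-style signature: compare coarse-grained
        # value distributions of a signal at two scales (illustrative choices only).
        import numpy as np
        from scipy.spatial.distance import jensenshannon

        def block_average(x, scale):
            """Coarse-grain a 1-D signal by averaging non-overlapping blocks."""
            n = (len(x) // scale) * scale
            return x[:n].reshape(-1, scale).mean(axis=1)

        def value_histogram(x, bins):
            hist, _ = np.histogram(x, bins=bins)
            return hist / hist.sum()

        # Logistic map time series (r = 4), as in the abstract's example
        x = np.empty(20000)
        x[0] = 0.3
        for i in range(1, len(x)):
            x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

        bins = np.linspace(0.0, 1.0, 41)
        p = value_histogram(block_average(x, 2), bins)
        q = value_histogram(block_average(x, 16), bins)
        print("JS distance between scale-2 and scale-16 patterns:", jensenshannon(p, q))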

  13. The ITER disruption mitigation trigger: developing its preliminary design

    NASA Astrophysics Data System (ADS)

    Pautasso, G.; de Vries, P. C.; Humphreys, D.; Lehnen, M.; Rapson, C.; Raupp, G.; Snipes, J. A.; Treutterer, W.; Vergara-Fernandez, A.; Zabeo, L.

    2018-03-01

    A concept for the generation of the trigger for the ITER disruption mitigation system is described in this paper. The issuing of the trigger will be the result of a complex decision process, taken by the plasma control system, or by the central interlock system, determining that the plasma is unavoidably going to disrupt—or has disrupted—and that a fast mitigated shut-down is required. Given the redundancy of the mitigation system, the plasma control system must also formulate an injection scheme and specify when and how the injectors of the mitigation system should be activated. The parameters and the conceptual algorithms required for the configuration and generation of the trigger are discussed.

  14. Networked buffering: a basic mechanism for distributed robustness in complex adaptive systems.

    PubMed

    Whitacre, James M; Bender, Axel

    2010-06-15

    A generic mechanism--networked buffering--is proposed for the generation of robust traits in complex systems. It requires two basic conditions to be satisfied: 1) agents are versatile enough to perform more than one single functional role within a system and 2) agents are degenerate, i.e. there exists partial overlap in the functional capabilities of agents. Given these prerequisites, degenerate systems can readily produce a distributed systemic response to local perturbations. Reciprocally, excess resources related to a single function can indirectly support multiple unrelated functions within a degenerate system. In models of genome:proteome mappings for which localized decision-making and modularity of genetic functions are assumed, we verify that such distributed compensatory effects cause enhanced robustness of system traits. The conditions needed for networked buffering to occur are neither demanding nor rare, supporting the conjecture that degeneracy may fundamentally underpin distributed robustness within several biotic and abiotic systems. For instance, networked buffering offers new insights into systems engineering and planning activities that occur under high uncertainty. It may also help explain recent developments in understanding the origins of resilience within complex ecosystems.

  15. Creating virtual humans for simulation-based training and planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stansfield, S.; Sobel, A.

    1998-05-12

    Sandia National Laboratories has developed a distributed, high-fidelity simulation system for training and planning small-team operations. The system provides an immersive environment populated by virtual objects and humans capable of displaying complex behaviors. The work has focused on developing the behaviors required to carry out complex tasks and decision making under stress. Central to this work are techniques for creating behaviors for virtual humans and for dynamically assigning behaviors to CGF to allow scenarios without fixed outcomes. Two prototype systems have been developed that illustrate these capabilities: MediSim, a trainer for battlefield medics, and VRaptor, a system for planning, rehearsing, and training assault operations.

  16. Using a data base management system for modelling SSME test history data

    NASA Technical Reports Server (NTRS)

    Abernethy, K.

    1985-01-01

    The usefulness of a data base management system (DBMS) for modelling historical test data for the complete series of static test firings for the Space Shuttle Main Engine (SSME) was assessed. From an analysis of user data base query requirements, it became clear that a relational DBMS which included a relationally complete query language would permit a model satisfying the query requirements. Representative models and sample queries are discussed. A list of environment-particular evaluation criteria for the desired DBMS was constructed; these criteria include requirements in the areas of user-interface complexity, program independence, flexibility, modifiability, and output capability. The evaluation process included the construction of several prototype data bases for user assessment. The systems studied, representing the three major DBMS conceptual models, were: MIRADS, a hierarchical system; DMS-1100, a CODASYL-based network system; ORACLE, a relational system; and DATATRIEVE, a relational-type system.
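    To show what a relationally complete query language buys in this setting, the sketch below builds a toy test-history relation with Python's built-in sqlite3 module and runs one representative aggregate query. The table name, columns, and rows are hypothetical, not the actual SSME data model evaluated in the study.

        # Toy relational model for test-history records (hypothetical schema and data).
        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute("""CREATE TABLE test_firing (
            test_id TEXT PRIMARY KEY,
            engine_id TEXT,
            test_date TEXT,
            duration_s REAL,
            cutoff_reason TEXT)""")
        con.executemany(
            "INSERT INTO test_firing VALUES (?, ?, ?, ?, ?)",
            [("A1-001", "E2017", "1981-02-13", 520.0, "planned"),
             ("A1-002", "E2017", "1981-03-02", 11.5, "redline"),
             ("A2-001", "E2020", "1981-03-20", 823.0, "planned")])

        # Representative query: accumulated run time per engine, longest first
        query = ("SELECT engine_id, SUM(duration_s) AS total_s FROM test_firing "
                 "GROUP BY engine_id ORDER BY total_s DESC")
        for engine_id, total_s in con.execute(query):
            print(engine_id, total_s)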

  17. Education Governance in Action: Lessons from Case Studies

    ERIC Educational Resources Information Center

    Burns, Tracey; Köster, Florian; Fuster, Marc

    2016-01-01

    Governing multi-level education systems requires governance models that balance responsiveness to local diversity with the ability to ensure national objectives. This delicate equilibrium is difficult to achieve given the complexity of many education systems. Countries are therefore increasingly looking for examples of good practice and models of…

  18. Information retrieval and display system

    NASA Technical Reports Server (NTRS)

    Groover, J. L.; King, W. L.

    1977-01-01

    Versatile command-driven data management system offers users, through simplified command language, a means of storing and searching data files, sorting data files into specified orders, performing simple or complex computations, effecting file updates, and printing or displaying output data. Commands are simple to use and flexible enough to meet most data management requirements.

  19. Logic system aids in evaluation of project readiness

    NASA Technical Reports Server (NTRS)

    Maris, S. J.; Obrien, T. J.

    1966-01-01

    Measurement Operational Readiness Requirements /MORR/ assignments logic is used for determining the readiness of a complex project to go forward as planned. The system uses a logic network which assigns qualities to all important criteria in a project and establishes a logical sequence of measurements to determine what the conditions are.

  20. Designing the Regional College Management Information System.

    ERIC Educational Resources Information Center

    Kin Maung Kywe; And Others

    Beginning in 1976, Regional Colleges were formed in Burma to implement career and technical education at the post-secondary level. This paper describes the Regional Colleges and explores the possible use of a systemic management information process that could assist in the complex planning required to develop second-year vocational and technical…

  1. Teaching High-Accuracy Global Positioning System to Undergraduates Using Online Processing Services

    ERIC Educational Resources Information Center

    Wang, Guoquan

    2013-01-01

    High-accuracy Global Positioning System (GPS) has become an important geoscientific tool used to measure ground motions associated with plate movements, glacial movements, volcanoes, active faults, landslides, subsidence, slow earthquake events, as well as large earthquakes. Complex calculations are required in order to achieve high-precision…

  2. Description of the microbial ecology evaluation device, flight equipment, and ground transporter

    NASA Technical Reports Server (NTRS)

    Chassay, C. E.; Taylor, G. R.

    1973-01-01

    Exposure of test systems in space required the fabrication of specialized hardware termed a Microbial Ecology Evaluation Device that had individual test chambers and a complex optical filter system. The characteristics of this device and the manner in which it was deployed in space are described.

  3. Radiation Hardness Assurance (RHA) for Small Missions

    NASA Technical Reports Server (NTRS)

    Campola, Michael J.

    2016-01-01

    Varied mission life and complexity is growing for small spacecraft. Small missions benefit from detailed hazard definition and evaluation as done in the past. Requirements need to flow from the system down to the parts level and aid system level radiation tolerance. RHA is highlighted with increasing COTS usage.

  4. 41 CFR 101-25.101-4 - Supply through indefinite quantity requirement contracts.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    .... (3) The item is proprietary or so complex in design, function, or operation as to be noncompetitive... Federal Property Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND... introduced into a supply system), or no advantage accrues doing so; and (b) Industry distribution facilities...

  5. 41 CFR 101-25.101-4 - Supply through indefinite quantity requirement contracts.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .... (3) The item is proprietary or so complex in design, function, or operation as to be noncompetitive... Federal Property Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND... introduced into a supply system), or no advantage accrues doing so; and (b) Industry distribution facilities...

  6. FIRESCOPE: a new concept in multiagency fire suppression coordination

    Treesearch

    Richard A. Chase

    1980-01-01

    FIRESCOPE is a system developed to improve the capability of firefighting agencies in southern California in allocating and managing fire suppression resources. The system provides an effective and efficient solution to operational coordination requirements and problems of the major fire protection agencies serving the southern California urban-wildland complex. Major...

  7. Responding to an RFP: A Vendor's Viewpoint.

    ERIC Educational Resources Information Center

    Kington, Robert A.

    1987-01-01

    Outlines factors used by online vendors to decide whether to bid on RFPs (requests for proposals) for library automation systems, including specifications for software, hardware or performance requirements not met by the vendor; specifications based on competitors' systems; the size and complexity of the request itself; and vendors' time…

  8. Integrating water and agricultural management: collaborative governance for a complex policy problem.

    PubMed

    Fish, Rob D; Ioris, Antonio A R; Watson, Nigel M

    2010-11-01

    This paper examines governance requirements for integrating water and agricultural management (IWAM). The institutional arrangements for the agriculture and water sectors are complex and multi-dimensional, and integration cannot therefore be achieved through a simplistic 'additive' policy process. Effective integration requires the development of a new collaborative approach to governance that is designed to cope with scale dependencies and interactions, uncertainty and contested knowledge, and interdependency among diverse and unequal interests. When combined with interdisciplinary research, collaborative governance provides a viable normative model because of its emphasis on reciprocity, relationships, learning and creativity. Ultimately, such an approach could lead to the sorts of system adaptations and transformations that are required for IWAM. Copyright © 2009 Elsevier B.V. All rights reserved.

  9. Analytical Micromechanics Modeling Technique Developed for Ceramic Matrix Composites Analysis

    NASA Technical Reports Server (NTRS)

    Min, James B.

    2005-01-01

    Ceramic matrix composites (CMCs) promise many advantages for next-generation aerospace propulsion systems. Specifically, carbon-reinforced silicon carbide (C/SiC) CMCs enable higher operational temperatures and provide potential component weight savings by virtue of their high specific strength. These attributes may provide systemwide benefits. Higher operating temperatures lessen or eliminate the need for cooling, thereby reducing both fuel consumption and the complex hardware and plumbing required for heat management. This, in turn, lowers system weight, size, and complexity, while improving efficiency, reliability, and service life, resulting in overall lower operating costs.

  10. The multi-replication protein A (RPA) system--a new perspective.

    PubMed

    Sakaguchi, Kengo; Ishibashi, Toyotaka; Uchiyama, Yukinobu; Iwabata, Kazuki

    2009-02-01

    Replication protein A (RPA) complex has been shown, using both in vivo and in vitro approaches, to be required for most aspects of eukaryotic DNA metabolism: replication, repair, telomere maintenance and homologous recombination. Here, we review recent data concerning the function and biological importance of the multi-RPA complex. There are distinct complexes of RPA found in the biological kingdoms, although for a long time only one type of RPA complex was believed to be present in eukaryotes. Each complex probably serves a different role. In higher plants, three distinct large and medium subunits are present, but only one species of the smallest subunit. Each of these protein subunits forms stable complexes with their respective partners. They are paralogs as complex. Humans possess two paralogs and one analog of RPA. The multi-RPA system can be regarded as universal in eukaryotes. Among eukaryotic kingdoms, paralogs, orthologs, analogs and heterologs of many DNA synthesis-related factors, including RPA, are ubiquitous. Convergent evolution seems to be ubiquitous in these processes. Using recent findings, we review the composition and biological functions of RPA complexes.

  11. Computer program for determining mass properties of a rigid structure

    NASA Technical Reports Server (NTRS)

    Hull, R. A.; Gilbert, J. L.; Klich, P. J.

    1978-01-01

    A computer program was developed for the rapid computation of the mass properties of complex structural systems. The program uses rigid body analyses and permits differences in structural material throughout the total system. It is based on the premise that complex systems can be adequately described by a combination of basic elemental shapes. Simple geometric data describing size and location of each element and the respective material density or weight of each element were the only required input data. From this minimum input, the program yields system weight, center of gravity, moments of inertia and products of inertia with respect to mutually perpendicular axes through the system center of gravity. The program also yields mass properties of the individual shapes relative to component axes.
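    A minimal sketch of the underlying bookkeeping is given below: given each element's mass, center-of-gravity location, and inertia about its own CG, the system mass and CG follow from mass-weighted sums and the inertia about the system CG from the parallel-axis theorem (only Izz is shown for brevity). The element values are hypothetical, and the sketch is not the NASA program itself.

        # Combine elemental shapes into system mass, CG, and Izz about the system CG.
        import numpy as np

        # Hypothetical elements: (mass [kg], CG position [m], Izz about own CG [kg*m^2])
        elements = [
            (120.0, np.array([0.0, 0.0, 0.5]), 15.0),
            ( 80.0, np.array([1.2, 0.0, 0.5]),  6.0),
            ( 40.0, np.array([0.6, 0.8, 1.0]),  2.5),
        ]

        total_mass = sum(m for m, _, _ in elements)
        system_cg = sum(m * r for m, r, _ in elements) / total_mass

        # Parallel-axis theorem: Izz_sys = sum(Izz_i + m_i * d_i^2),
        # where d_i is each element's in-plane (x, y) offset from the system CG.
        izz_system = sum(izz + m * np.sum((r[:2] - system_cg[:2]) ** 2)
                         for m, r, izz in elements)

        print("system mass [kg]:", total_mass)
        print("system CG [m]:", system_cg)
        print("Izz about system CG [kg*m^2]:", izz_system)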

  12. A statistical learning strategy for closed-loop control of fluid flows

    NASA Astrophysics Data System (ADS)

    Guéniat, Florimond; Mathelin, Lionel; Hussaini, M. Yousuff

    2016-12-01

    This work discusses a closed-loop control strategy for complex systems utilizing scarce and streaming data. A discrete embedding space is first built using hash functions applied to the sensor measurements, from which a Markov process model is derived, approximating the complex system's dynamics. A control strategy is then learned using reinforcement learning once rewards relevant to the control objective are identified. This method is designed for experimental configurations, requiring no computations or prior knowledge of the system, and enjoys intrinsic robustness. It is illustrated on two systems: the control of the transitions of a Lorenz'63 dynamical system, and the control of the drag of a cylinder flow. The method is shown to perform well.
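    The toy sketch below mirrors the general recipe described (discretize sensor readings into a small state space, then learn a policy with reinforcement learning), applied to a contrived one-dimensional system. The binning, dynamics, reward, and tabular Q-learning choices are assumptions for illustration, not the authors' hash functions or flow-control setup.

        # Toy closed-loop control from discretized measurements via tabular Q-learning.
        import numpy as np

        rng = np.random.default_rng(1)
        actions = np.array([-1.0, 0.0, 1.0])            # available control inputs
        n_states, n_actions = 20, len(actions)
        Q = np.zeros((n_states, n_actions))

        def embed(x):
            """Hash-like embedding: bin a scalar measurement into a discrete state."""
            return int(np.clip((x + 2.0) / 4.0 * n_states, 0, n_states - 1))

        def step(x, u):
            """Toy unstable dynamics with control and noise; goal: keep x near 0."""
            return 1.05 * x + 0.2 * u + 0.05 * rng.standard_normal()

        alpha, gamma, eps = 0.1, 0.95, 0.1
        x = 0.5
        for _ in range(50_000):
            s = embed(x)
            a = rng.integers(n_actions) if rng.random() < eps else int(np.argmax(Q[s]))
            x_next = step(x, actions[a])
            reward = -abs(x_next)                        # reward keeping the state small
            Q[s, a] += alpha * (reward + gamma * Q[embed(x_next)].max() - Q[s, a])
            x = x_next if abs(x_next) < 2.0 else rng.uniform(-0.5, 0.5)  # reset if it escapes

        print("greedy action per state:", actions[np.argmax(Q, axis=1)])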

  13. Clustering and negative feedback by endocytosis in planar cell polarity signaling is modulated by ubiquitinylation of prickle.

    PubMed

    Cho, Bomsoo; Pierre-Louis, Gandhy; Sagner, Andreas; Eaton, Suzanne; Axelrod, Jeffrey D

    2015-05-01

    The core components of the planar cell polarity (PCP) signaling system, including both transmembrane and peripheral membrane associated proteins, form asymmetric complexes that bridge apical intercellular junctions. While these can assemble in either orientation, coordinated cell polarization requires the enrichment of complexes of a given orientation at specific junctions. This might occur by both positive and negative feedback between oppositely oriented complexes, and requires the peripheral membrane associated PCP components. However, the molecular mechanisms underlying feedback are not understood. We find that the E3 ubiquitin ligase complex Cullin1(Cul1)/SkpA/Supernumerary limbs(Slimb) regulates the stability of one of the peripheral membrane components, Prickle (Pk). Excess Pk disrupts PCP feedback and prevents asymmetry. We show that Pk participates in negative feedback by mediating internalization of PCP complexes containing the transmembrane components Van Gogh (Vang) and Flamingo (Fmi), and that internalization is activated by oppositely oriented complexes within clusters. Pk also participates in positive feedback through an unknown mechanism promoting clustering. Our results therefore identify a molecular mechanism underlying generation of asymmetry in PCP signaling.

  14. System Engineering of Autonomous Space Vehicles

    NASA Technical Reports Server (NTRS)

    Watson, Michael D.; Johnson, Stephen B.; Trevino, Luis

    2014-01-01

    Human exploration of the solar system requires fully autonomous systems when travelling more than 5 light minutes from Earth. This autonomy is necessary to manage a large, complex spacecraft with limited crew members and skills available. The communication latency requires the vehicle to deal with events with only limited crew interaction in most cases. The engineering of these systems requires an extensive knowledge of the spacecraft systems, information theory, and autonomous algorithm characteristics. The characteristics of the spacecraft systems must be matched with the autonomous algorithm characteristics to reliably monitor and control the system. This presents a large system engineering problem. Recent work on product-focused, elegant system engineering will be applied to this application, looking at the full autonomy stack, the matching of autonomous systems to spacecraft systems, and the integration of different types of algorithms. Each of these areas will be outlined and a general approach defined for system engineering to provide the optimal solution to the given application context.

  15. Tree physiology research in a changing world.

    PubMed

    Kaufmann, Merrill R.; Linder, Sune

    1996-01-01

    Changes in issues and advances in methodology have contributed to substantial progress in tree physiology research during the last several decades. Current research focuses on process interactions in complex systems and the integration of processes across multiple spatial and temporal scales. An increasingly important challenge for future research is assuring sustainability of production systems and forested ecosystems in the face of increased demands for natural resources and human disturbance of forests. Meeting this challenge requires significant shifts in research approach, including the study of limitations of productivity that may accompany achievement of system sustainability, and a focus on the biological capabilities of complex land bases altered by human activity.

  16. Quality Management and Key Performance Indicators in Oncologic Esophageal Surgery.

    PubMed

    Gockel, Ines; Ahlbrand, Constantin Johannes; Arras, Michael; Schreiber, Elke Maria; Lang, Hauke

    2015-12-01

    Ranking systems and comparisons of quality and performance indicators will be of increasing relevance for complex "high-risk" procedures such as esophageal cancer surgery. The identification of evidence-based standards relevant for key performance indicators in esophageal surgery is essential for establishing monitoring systems and furthermore a requirement to enhance treatment quality. In the course of this review, we analyze the key performance indicators case volume, radicality of resection, and postoperative morbidity and mortality, leading to continuous quality improvement. Ranking systems established on this basis will gain increased relevance in highly complex procedures within the national and international comparison and furthermore improve the treatment of patients with esophageal carcinoma.

  17. A Generalized Decision Framework Using Multi-objective Optimization for Water Resources Planning

    NASA Astrophysics Data System (ADS)

    Basdekas, L.; Stewart, N.; Triana, E.

    2013-12-01

    Colorado Springs Utilities (CSU) is currently engaged in an Integrated Water Resource Plan (IWRP) to address the complex planning scenarios, across multiple time scales, currently faced by CSU. The modeling framework developed for the IWRP uses a flexible data-centered Decision Support System (DSS) with a MODSIM-based modeling system to represent the operation of the current CSU raw water system coupled with a state-of-the-art multi-objective optimization algorithm. Three basic components are required for the framework, which can be implemented for planning horizons ranging from seasonal to interdecadal. First, a water resources system model is required that is capable of reasonable system simulation to resolve performance metrics at the appropriate temporal and spatial scales of interest. The system model should be an existing simulation model, or one developed during the planning process with stakeholders, so that 'buy-in' has already been achieved. Second, a hydrologic scenario tool(s) capable of generating a range of plausible inflows for the planning period of interest is required. This may include paleo informed or climate change informed sequences. Third, a multi-objective optimization model that can be wrapped around the system simulation model is required. The new generation of multi-objective optimization models do not require parameterization which greatly reduces problem complexity. Bridging the gap between research and practice will be evident as we use a case study from CSU's planning process to demonstrate this framework with specific competing water management objectives. Careful formulation of objective functions, choice of decision variables, and system constraints will be discussed. Rather than treating results as theoretically Pareto optimal in a planning process, we use the powerful multi-objective optimization models as tools to more efficiently and effectively move out of the inferior decision space. The use of this framework will help CSU evaluate tradeoffs in a continually changing world.
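    As a tiny illustration of "moving out of the inferior decision space," the sketch below screens a handful of candidate portfolios for Pareto dominance under two competing minimization objectives (cost and shortage). The portfolio names and objective values are hypothetical, not CSU planning data.

        # Screen alternatives for Pareto dominance with two minimization objectives.
        def dominates(a, b):
            """True if a is at least as good as b in every objective and strictly
            better in at least one (both objectives are minimized)."""
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        # (cost in $M, shortage in acre-feet) for hypothetical portfolios
        alternatives = {"A": (120, 900), "B": (150, 400), "C": (160, 950), "D": (200, 350)}

        pareto = [name for name, obj in alternatives.items()
                  if not any(dominates(other, obj)
                             for other_name, other in alternatives.items()
                             if other_name != name)]
        print("non-dominated alternatives:", pareto)   # C is dominated by A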

  18. Conceptual design of a monitoring system for the Charters of Freedom

    NASA Technical Reports Server (NTRS)

    Cutts, J. A.

    1984-01-01

    A conceptual design of a monitoring system for the Charters of Freedom was developed for the National Archives and Records Service. The monitoring system would be installed at the National Archives and used to document the condition of the Charters as part of a regular inspection program. The results of an experimental measurements program that led to the definition of analysis system requirements are presented, a conceptual design of the monitoring system is described and the alternative approaches to implementing this design were discussed. The monitoring system is required to optically detect and measure deterioration in documents that are permanently encapsulated in glass cases. An electronic imaging system with the capability for precise photometric measurements of the contrast of the script on the documents can perform this task. Two general types of imaging systems are considered (line and area array), and their suitability for performing these required measurements are compared. A digital processing capability for analyzing the electronic imaging data is also required, and several optional levels of complexity for this digital analysis system are evaluated.

  19. A meteorological distribution system for high-resolution terrestrial modeling (MicroMet)

    Treesearch

    Glen E. Liston; Kelly Elder

    2006-01-01

    An intermediate-complexity, quasi-physically based, meteorological model (MicroMet) has been developed to produce high-resolution (e.g., 30-m to 1-km horizontal grid increment) atmospheric forcings required to run spatially distributed terrestrial models over a wide variety of landscapes. The following eight variables, required to run most terrestrial models, are...

  20. Learning Science by Constructing Models: Can Dragoon Increase Learning without Increasing the Time Required?

    ERIC Educational Resources Information Center

    VanLehn, Kurt; Chung, Greg; Grover, Sachin; Madni, Ayesha; Wetzel, Jon

    2016-01-01

    A common hypothesis is that students will more deeply understand dynamic systems and other complex phenomena if they construct computational models of them. Attempts to demonstrate the advantages of model construction have been stymied by the long time required for students to acquire skill in model construction. In order to make model…

  1. The Teaching of Creativity in Information Systems Programmes at South African Higher Education Institutions

    ERIC Educational Resources Information Center

    Turpin, Marita; Matthee, Machdel; Kruger, Anine

    2015-01-01

    The development of problem solving skills is a shared goal in science, engineering, mathematics and technology education. In the applied sciences, problems are often open-ended and complex, requiring a multidisciplinary approach as well as new designs. In such cases, problem solving requires not only analytical capabilities, but also creativity…

  2. Raising the Bar: Challenging Students in a Capstone Project Course with an Android and Mobile Web Parallel Development Team Project

    ERIC Educational Resources Information Center

    Wong, Wilson; Pepe, James; Englander, Irv

    2017-01-01

    Information systems capstone projects aim to prepare students for what they will encounter in the industry after graduation. Corporate application development is often a complex endeavor that requires coordination between related products. For example, software development in the mobile application sector may require a coordinated parallel…

  3. A Quantitative Systems Pharmacology Approach to Infer Pathways Involved in Complex Disease Phenotypes.

    PubMed

    Schurdak, Mark E; Pei, Fen; Lezon, Timothy R; Carlisle, Diane; Friedlander, Robert; Taylor, D Lansing; Stern, Andrew M

    2018-01-01

    Designing effective therapeutic strategies for complex diseases such as cancer and neurodegeneration that involve tissue context-specific interactions among multiple gene products presents a major challenge for precision medicine. Safe and selective pharmacological modulation of individual molecular entities associated with a disease often fails to provide efficacy in the clinic. Thus, development of optimized therapeutic strategies for individual patients with complex diseases requires a more comprehensive, systems-level understanding of disease progression. Quantitative systems pharmacology (QSP) is an approach to drug discovery that integrates computational and experimental methods to understand the molecular pathogenesis of a disease at the systems level more completely. Described here is the chemogenomic component of QSP for the inference of biological pathways involved in the modulation of the disease phenotype. The approach involves testing sets of compounds of diverse mechanisms of action in a disease-relevant phenotypic assay, and using the mechanistic information known for the active compounds, to infer pathways and networks associated with the phenotype. The example used here is for monogenic Huntington's disease (HD), which due to the pleiotropic nature of the mutant phenotype has a complex pathogenesis. The overall approach, however, is applicable to any complex disease.

  4. System Guidelines for EMC Safety-Critical Circuits: Design, Selection, and Margin Demonstration

    NASA Technical Reports Server (NTRS)

    Lawton, R. M.

    1996-01-01

    Demonstration of required safety margins on critical electrical/electronic circuits in large complex systems has become an implementation and cost problem. These margins are the difference between the activation level of the circuit and the electrical noise on the circuit in the actual operating environment. This document discusses the origin of the requirement and gives a detailed process flow for the identification of the system electromagnetic compatibility (EMC) critical circuit list. The process flow discusses the roles of engineering disciplines such as systems engineering, safety, and EMC. Design and analysis guidelines are provided to assist the designer in assuring the system design has a high probability of meeting the margin requirements. Examples of approaches used on actual programs (Skylab and Space Shuttle Solid Rocket Booster) are provided to show how variations of the approach can be used successfully.
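    The margin arithmetic described above reduces to a simple ratio, commonly expressed in decibels; the sketch below shows that computation. The voltage levels and the 6 dB required margin are illustrative assumptions, not values from the document.

        # EMC safety margin: activation (threshold) level versus measured noise, in dB.
        import math

        def margin_db(activation_v, noise_v):
            """Margin in dB between the circuit activation level and the noise level."""
            return 20.0 * math.log10(activation_v / noise_v)

        required_margin_db = 6.0                             # assumed requirement
        measured = margin_db(activation_v=2.0, noise_v=0.5)  # hypothetical levels
        print(f"measured margin: {measured:.1f} dB "
              f"({'meets' if measured >= required_margin_db else 'violates'} "
              f"the assumed {required_margin_db:.0f} dB requirement)")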

  5. EMASS (tm): An expandable solution for NASA space data storage needs

    NASA Technical Reports Server (NTRS)

    Peterson, Anthony L.; Cardwell, P. Larry

    1992-01-01

    The data acquisition, distribution, processing, and archiving requirements of NASA and other U.S. Government data centers present significant data management challenges that must be met in the 1990's. The Earth Observing System (EOS) project alone is expected to generate daily data volumes greater than 2 Terabytes (2 x 10(exp 12) Bytes). As the scientific community makes use of this data, their work product will result in larger, increasingly complex data sets to be further exploited and managed. The challenge for data storage systems is to satisfy the initial data management requirements with cost effective solutions that provide for planned growth. This paper describes the expandable architecture of the E-Systems Modular Automated Storage System (EMASS (TM)), a mass storage system which is designed to support NASA's data capture, storage, distribution, and management requirements into the 21st century.

  6. EMASS (trademark): An expandable solution for NASA space data storage needs

    NASA Technical Reports Server (NTRS)

    Peterson, Anthony L.; Cardwell, P. Larry

    1991-01-01

    The data acquisition, distribution, processing, and archiving requirements of NASA and other U.S. Government data centers present significant data management challenges that must be met in the 1990's. The Earth Observing System (EOS) project alone is expected to generate daily data volumes greater than 2 Terabytes (2 x 10(exp 12) Bytes). As the scientific community makes use of this data, their work will result in larger, increasingly complex data sets to be further exploited and managed. The challenge for data storage systems is to satisfy the initial data management requirements with cost effective solutions that provide for planned growth. The expandable architecture of the E-Systems Modular Automated Storage System (EMASS(TM)), a mass storage system which is designed to support NASA's data capture, storage, distribution, and management requirements into the 21st century is described.

  7. Shuttle mission simulator software conceptual design

    NASA Technical Reports Server (NTRS)

    Burke, J. F.

    1973-01-01

    Software conceptual designs (SCD) are presented for meeting the simulator requirements for the shuttle missions. The major areas of the SCD discussed include: malfunction insertion, flight software, applications software, systems software, and computer complex.

  8. Performance tasks for operator-skills research.

    DOT National Transportation Integrated Search

    1966-06-01

    The selection, development, and operation of several tasks for use in skilled-operator-performance research are described. The tasks are intended, collectively, to sample a broad spectrum of abilities required by complex operator systems; individuall...

  9. The synergy of the whole: building a global system for clinical trials to accelerate medicines development.

    PubMed

    Koski, Greg; Tobin, Mary F; Whalen, Matthew

    2014-10-01

    The pharmaceutical industry, once highly respected, productive, and profitable, is in the throes of major change driven by many forces, including economics, science, regulation, and ethics. A variety of initiatives and partnerships have been launched to improve efficiency and productivity but without significant effect because they have failed to consider the process as a system. Addressing the challenges facing this complex endeavor requires more than modifications of individual processes; it requires a fully integrated application of systems thinking and an understanding of the desired goals and complex interactions among essential components and stakeholders of the whole. A multistakeholder collaborative effort, led by the Alliance for Clinical Research Excellence and Safety (ACRES), a global nonprofit organization operating in the public interest, is now under way to build a shared global system for clinical research. Its systems approach focuses on the interconnection of stakeholders at critical points of interaction within 4 operational domains: site development and support, quality management, information technology, and safety. The ACRES initiatives, Site Accreditation and Standards, Product Safety Culture, Global Ethical Review and Regulatory Innovation, and Quality Assurance and Safety, focus on building and implementing systems solutions. Underpinning these initiatives is an open, shared, integrated technology (site and optics and quality informatics initiative). We describe the rationale, challenges, progress, and successes of this effort to date and lessons learned. The complexity and fragmentation of the intensely proprietary ecosystem of drug development, challenging regulatory climate, and magnitude of the endeavor itself pose significant challenges, but the economic, social, and scientific rewards will more than justify the effort. An effective alliance model requires a willingness of multiple stakeholders to work together to build a shared system within a noncompetitive space that will have major benefits for all, including better access to medicines, better health, and more productive lives. Copyright © 2014 Elsevier HS Journals, Inc. All rights reserved.

  10. The IRMIS object model and services API.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saunders, C.; Dohan, D. A.; Arnold, N. D.

    2005-01-01

    The relational model developed for the Integrated Relational Model of Installed Systems (IRMIS) toolkit has been successfully used to capture the Advanced Photon Source (APS) control system software (EPICS process variables and their definitions). The relational tables are populated by a crawler script that parses each Input/Output Controller (IOC) start-up file when an IOC reboot is detected. User interaction is provided by a Java Swing application that acts as a desktop for viewing the process variable information. Mapping between the display objects and the relational tables was carried out with the Hibernate Object Relational Modeling (ORM) framework. Work is well underway at the APS to extend the relational modeling to include control system hardware. For this work, due in part to the complex user interaction required, the primary application development environment has shifted from the relational database view to the object oriented (Java) perspective. With this approach, the business logic is executed in Java rather than in SQL stored procedures. This paper describes the object model used to represent control system software, hardware, and interconnects in IRMIS. We also describe the services API used to encapsulate the required behaviors for creating and maintaining the complex data. In addition to the core schema and object model, many important concepts in IRMIS are captured by the services API. IRMIS is an ambitious collaborative effort for defining and developing a relational database and associated applications to comprehensively document the large and complex EPICS-based control systems of today's accelerators. The documentation effort includes process variables, control system hardware, and interconnections. The approach could also be used to document all components of the accelerator, including mechanical, vacuum, power supplies, etc. One key aspect of IRMIS is that it is a documentation framework, not a design and development tool. We do not generate EPICS control system configurations from IRMIS, and hence do not impose any additional requirements on EPICS developers.
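
    The crawler idea above can be pictured with a very small sketch: scan an EPICS IOC start-up file for dbLoadRecords() calls and collect the database files and macro substitutions they reference. This is a hedged, hypothetical fragment; the real IRMIS crawler parses much more and writes into the relational schema rather than printing.

```python
# Hedged sketch of the "crawler" idea: extract dbLoadRecords() calls from an
# EPICS IOC start-up file (hypothetical contents shown inline).
import re

ST_CMD = """\
dbLoadRecords("db/vacuum.db", "sector=01, P=VAC:")
dbLoadRecords("db/power_supply.db", "P=PS:01")
iocInit()
"""

pattern = re.compile(r'dbLoadRecords\(\s*"([^"]+)"\s*(?:,\s*"([^"]*)")?\s*\)')

for match in pattern.finditer(ST_CMD):
    db_file, macros = match.group(1), match.group(2) or ""
    print(f"database: {db_file:22s} macros: {macros}")
```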

  11. Identifying problems and generating recommendations for enhancing complex systems: applying the abstraction hierarchy framework as an analytical tool.

    PubMed

    Xu, Wei

    2007-12-01

    This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated from these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily with automation operations that were more goal directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.
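
    The four analysis steps above can be pictured as a small data structure that ties observations to nodes at each abstraction level. The sketch below is hypothetical (the level names follow Rasmussen's usual labels, but the nodes and observations are invented) and is not the instrument used in the study.

```python
# Hedged sketch of the abstraction-hierarchy (AH) workflow: build an AH model,
# map observations onto its nodes, then report nodes with mapped problems.
from collections import defaultdict

AH_LEVELS = ["functional_purpose", "abstract_function",
             "generalized_function", "physical_function", "physical_form"]

# Step 1: a (hypothetical) AH model for an aircraft-automation work domain.
ah_nodes = {
    "safe_flight":        "functional_purpose",
    "energy_management":  "abstract_function",
    "autoflight_modes":   "generalized_function",
    "mode_control_panel": "physical_function",
    "MCP_hardware":       "physical_form",
}

# Step 2: map data gathered by various methods onto AH nodes.
observations = [
    ("autoflight_modes", "pilots unaware of automatic mode reversion"),
    ("mode_control_panel", "target altitude entry error"),
    ("autoflight_modes", "unexpected vertical-speed behaviour"),
]
mapped = defaultdict(list)
for node, issue in observations:
    mapped[node].append(issue)

# Steps 3-4: identify problem clusters and report them level by level.
for level in AH_LEVELS:
    for node, issues in mapped.items():
        if ah_nodes[node] == level:
            print(f"[{level}] {node}: {len(issues)} issue(s)")
            for i in issues:
                print(f"   - {i}")
```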

  12. The JPL functional requirements tool

    NASA Technical Reports Server (NTRS)

    Giffin, Geoff; Skinner, Judith; Stoller, Richard

    1987-01-01

    Planetary spacecraft are complex vehicles which are built according to many thousands of requirements. Problems encountered in documenting and maintaining these requirements led to the current attempt to reduce or eliminate these problems by a computer automated data base Functional Requirements Tool. The tool developed at JPL and in use on several JPL Projects is described. The organization and functionality of the Tool, together with an explanation of the data base inputs, their relationships, and use are presented. Methods of interfacing with external documents, representation of tables and figures, and methods of approval and change processing are discussed. The options available for disseminating information from the Tool are identified. The implementation of the Requirements Tool is outlined, and the operation is summarized. The conclusions drawn from this work are that the Requirements Tool represents a useful addition to the system engineer's tool kit, that it is not currently available elsewhere, and that a clear development path exists to expand the capabilities of the Tool to serve larger and more complex projects.

  13. Modeling of BN Lifetime Prediction of a System Based on Integrated Multi-Level Information

    PubMed Central

    Wang, Xiaohong; Wang, Lizhi

    2017-01-01

    Predicting system lifetime is important to ensure safe and reliable operation of products, which requires integrated modeling based on multi-level, multi-sensor information. However, lifetime characteristics of equipment in a system are different and failure mechanisms are inter-coupled, which leads to complex logical correlations and the lack of a uniform lifetime measure. Based on a Bayesian network (BN), a lifetime prediction method for systems that combine multi-level sensor information is proposed. The method considers the correlation between accidental failures and degradation failure mechanisms, and achieves system modeling and lifetime prediction under complex logic correlations. This method is applied in the lifetime prediction of a multi-level solar-powered unmanned system, and the predicted results can provide guidance for the improvement of system reliability and for the maintenance and protection of the system. PMID:28926930

  14. Modeling of BN Lifetime Prediction of a System Based on Integrated Multi-Level Information.

    PubMed

    Wang, Jingbin; Wang, Xiaohong; Wang, Lizhi

    2017-09-15

    Predicting system lifetime is important to ensure safe and reliable operation of products, which requires integrated modeling based on multi-level, multi-sensor information. However, lifetime characteristics of equipment in a system are different and failure mechanisms are inter-coupled, which leads to complex logical correlations and the lack of a uniform lifetime measure. Based on a Bayesian network (BN), a lifetime prediction method for systems that combine multi-level sensor information is proposed. The method considers the correlation between accidental failures and degradation failure mechanisms, and achieves system modeling and lifetime prediction under complex logic correlations. This method is applied in the lifetime prediction of a multi-level solar-powered unmanned system, and the predicted results can provide guidance for the improvement of system reliability and for the maintenance and protection of the system.
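
    The two records above combine accidental failures with degradation failures inside a Bayesian network. As a toy illustration of that combination (not the authors' model), the sketch below uses made-up probabilities, a noisy-OR style merge per unit, and a simple series roll-up to the system level.

```python
# Hedged sketch: combine an accidental failure probability with a degradation
# failure probability for each unit, then roll units up to system level.
# All numbers are hypothetical; the paper's BN is far more detailed.

def unit_failure(p_accidental, p_degradation):
    """Noisy-OR style combination: the unit fails if either mechanism occurs."""
    return 1.0 - (1.0 - p_accidental) * (1.0 - p_degradation)

units = {                      # per-mission failure probabilities (made up)
    "solar_array": unit_failure(0.01, 0.04),
    "battery":     unit_failure(0.02, 0.06),
    "avionics":    unit_failure(0.005, 0.01),
}

# Series assumption: the system survives only if every unit survives.
p_system_survive = 1.0
for p_fail in units.values():
    p_system_survive *= (1.0 - p_fail)

print("unit failure probabilities:", {k: round(v, 4) for k, v in units.items()})
print(f"system failure probability: {1.0 - p_system_survive:.4f}")
```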

  15. Pharmacokinetic Modeling of JP-8 Jet Fuel Components: II. A Conceptual Framework

    DTIC Science & Technology

    2003-12-01

    example, a single type of (simple) binary interaction between 300 components would require the specification of some 10(exp 5) interaction coefficients. One...individual substances, via binary mechanisms, is enough to predict the interactions present in the mixture. Secondly, complex mixtures can often be...approximated as pseudo-binary systems, consisting of the compound of interest plus a single interacting complex vehicle with well-defined, composite

  16. Numerical implementation of complex orthogonalization, parallel transport on Stiefel bundles, and analyticity

    NASA Astrophysics Data System (ADS)

    Avitabile, Daniele; Bridges, Thomas J.

    2010-06-01

    Numerical integration of complex linear systems of ODEs depending analytically on an eigenvalue parameter is considered. Complex orthogonalization, which is required to stabilize the numerical integration, results in non-analytic systems. It is shown that properties of eigenvalues are still efficiently recoverable by extracting information from a non-analytic characteristic function. The orthonormal systems are constructed using the geometry of Stiefel bundles. Different forms of continuous orthogonalization in the literature are shown to correspond to different choices of connection one-form on the Stiefel bundle. For the numerical integration, Gauss-Legendre Runge-Kutta algorithms are the principal choice for preserving orthogonality, and performance results are shown for a range of GLRK methods. The theory and methods are tested by application to example boundary value problems including the Orr-Sommerfeld equation in hydrodynamic stability.
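
    To make the orthogonality-preservation point concrete, here is a minimal sketch (not the paper's code): the one-stage Gauss-Legendre method, i.e. the implicit midpoint rule, applied to a linear system Y' = S Y with a skew-Hermitian S, for which the step reduces to a Cayley transform and the frame stays orthonormal to round-off. The Stiefel-bundle connection forms and the Orr-Sommerfeld application are, of course, far more involved.

```python
# Hedged sketch: one-stage Gauss-Legendre (implicit midpoint) step for
# Y' = S Y with S skew-Hermitian.  For a constant-coefficient linear system
# the step is a Cayley transform, which is exactly unitary, so the frame Y
# stays orthonormal up to round-off.
import numpy as np

rng = np.random.default_rng(0)
n, k, h, steps = 6, 2, 0.05, 200

A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
S = A - A.conj().T                                        # skew-Hermitian matrix
Y, _ = np.linalg.qr(rng.standard_normal((n, k)) + 0j)     # orthonormal start frame

I = np.eye(n)
step = np.linalg.solve(I - 0.5 * h * S, I + 0.5 * h * S)  # Cayley transform

for _ in range(steps):
    Y = step @ Y

drift = np.linalg.norm(Y.conj().T @ Y - np.eye(k))
print(f"orthonormality drift after {steps} steps: {drift:.2e}")
```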

  17. Operations management system

    NASA Technical Reports Server (NTRS)

    Brandli, A. E.; Eckelkamp, R. E.; Kelly, C. M.; Mccandless, W.; Rue, D. L.

    1990-01-01

    The objective of an operations management system is to provide an orderly and efficient method to operate and maintain aerospace vehicles. Concepts are described for an operations management system and the key technologies are highlighted which will be required if this capability is brought to fruition. Without this automation and decision aiding capability, the growing complexity of avionics will result in an unmanageable workload for the operator, ultimately threatening mission success or survivability of the aircraft or space system. The key technologies include expert system application to operational tasks such as replanning, equipment diagnostics and checkout, global system management, and advanced man machine interfaces. The economical development of operations management systems, which are largely software, will require advancements in other technological areas such as software engineering and computer hardware.

  18. Model-based system engineering approach for the Euclid mission to manage scientific and technical complexity

    NASA Astrophysics Data System (ADS)

    Lorenzo Alvarez, Jose; Metselaar, Harold; Amiaux, Jerome; Saavedra Criado, Gonzalo; Gaspar Venancio, Luis M.; Salvignol, Jean-Christophe; Laureijs, René J.; Vavrek, Roland

    2016-08-01

    In recent years, the system engineering field has been coming to terms with a paradigm change in the approach for complexity management. Different strategies have been proposed to cope with highly interrelated systems; system-of-systems and collaborative system engineering approaches have emerged, and significant effort is being invested in standardization and ontology definition. In particular, Model Based System Engineering (MBSE) intends to introduce methodologies for a systematic system definition, development, validation, deployment, operation and decommission, based on logical and visual relationship mapping, rather than traditional 'document based' information management. The practical implementation in real large-scale projects is not uniform across fields. In space science missions, the usage has been limited to subsystems or sample projects with modeling being performed 'a-posteriori' in many instances. The main hurdle for the introduction of MBSE practices in new projects is still the difficulty of demonstrating their added value to a project and whether their benefit is commensurate with the level of effort required to put them in place. In this paper we present the implemented Euclid system modeling activities, and an analysis of the benefits and limitations identified to support in particular requirement break-down and allocation, and verification planning at mission level.

  19. Software control and system configuration management: A systems-wide approach

    NASA Technical Reports Server (NTRS)

    Petersen, K. L.; Flores, C., Jr.

    1984-01-01

    A comprehensive software control and system configuration management process for flight-crucial digital control systems of advanced aircraft has been developed and refined to insure efficient flight system development and safe flight operations. Because of the highly complex interactions among the hardware, software, and system elements of state-of-the-art digital flight control system designs, a systems-wide approach to configuration control and management has been used. Specific procedures are implemented to govern discrepancy reporting and reconciliation, software and hardware change control, systems verification and validation testing, and formal documentation requirements. An active and knowledgeable configuration control board reviews and approves all flight system configuration modifications and revalidation tests. This flexible process has proved effective during the development and flight testing of several research aircraft and remotely piloted research vehicles with digital flight control systems that ranged from relatively simple to highly complex, integrated mechanizations.

  20. Primordial Evolution in the Finitary Process Soup

    NASA Astrophysics Data System (ADS)

    Görnerup, Olof; Crutchfield, James P.

    A general and basic model of primordial evolution—a soup of reacting finitary and discrete processes—is employed to identify and analyze fundamental mechanisms that generate and maintain complex structures in prebiotic systems. The processes—ɛ-machines as defined in computational mechanics—and their interaction networks both provide well defined notions of structure. This enables us to quantitatively demonstrate hierarchical self-organization in the soup in terms of complexity. We found that replicating processes evolve the strategy of successively building higher levels of organization by autocatalysis. Moreover, this is facilitated by local components that have low structural complexity, but high generality. In effect, the finitary process soup spontaneously evolves a selection pressure that favors such components. In light of the finitary process soup's generality, these results suggest a fundamental law of hierarchical systems: global complexity requires local simplicity.

  1. Comparison of an algebraic multigrid algorithm to two iterative solvers used for modeling ground water flow and transport

    USGS Publications Warehouse

    Detwiler, R.L.; Mehl, S.; Rajaram, H.; Cheung, W.W.

    2002-01-01

    Numerical solution of large-scale ground water flow and transport problems is often constrained by the convergence behavior of the iterative solvers used to solve the resulting systems of equations. We demonstrate the ability of an algebraic multigrid algorithm (AMG) to efficiently solve the large, sparse systems of equations that result from computational models of ground water flow and transport in large and complex domains. Unlike geometric multigrid methods, this algorithm is applicable to problems in complex flow geometries, such as those encountered in pore-scale modeling of two-phase flow and transport. We integrated AMG into MODFLOW 2000 to compare two- and three-dimensional flow simulations using AMG to simulations using PCG2, a preconditioned conjugate gradient solver that uses the modified incomplete Cholesky preconditioner and is included with MODFLOW 2000. Convergence with AMG required up to 140 times less CPU time than PCG2. The cost of this increased speed was up to a nine-fold increase in required random access memory (RAM) for the three-dimensional problems and up to a four-fold increase in required RAM for the two-dimensional problems. We also compared two-dimensional numerical simulations of steady-state transport using AMG and the generalized minimum residual method with an incomplete LU-decomposition preconditioner. For these transport simulations, AMG yielded increased speeds of up to 17 times with only a 20% increase in required RAM. The ability of AMG to solve flow and transport problems in large, complex flow systems and its ready availability make it an ideal solver for use in both field-scale and pore-scale modeling.
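
    A small way to reproduce the flavour of the AMG-versus-iterative-solver comparison (not the MODFLOW experiments themselves) is to solve a 2-D Poisson system with the pyamg package and compare it against SciPy's conjugate-gradient solver. Availability of pyamg and the default solver settings are assumptions here.

```python
# Hedged sketch: algebraic multigrid (pyamg) vs conjugate gradients (SciPy)
# on a 2-D Poisson problem -- a stand-in for ground-water flow matrices.
import time
import numpy as np
import pyamg
from scipy.sparse.linalg import cg

A = pyamg.gallery.poisson((300, 300), format='csr')   # 90,000 unknowns
b = np.ones(A.shape[0])

t0 = time.perf_counter()
ml = pyamg.ruge_stuben_solver(A)                       # classical (Ruge-Stuben) AMG
x_amg = ml.solve(b, tol=1e-8)
t_amg = time.perf_counter() - t0

t0 = time.perf_counter()
x_cg, info = cg(A, b)                                  # unpreconditioned CG baseline
t_cg = time.perf_counter() - t0

print(f"AMG residual {np.linalg.norm(b - A @ x_amg):.1e}  time {t_amg:.2f}s")
print(f"CG  residual {np.linalg.norm(b - A @ x_cg):.1e}  time {t_cg:.2f}s (info={info})")
```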

  2. Scaling the Pyramid Model across Complex Systems Providing Early Care for Preschoolers: Exploring How Models for Decision Making May Enhance Implementation Science

    ERIC Educational Resources Information Center

    Johnson, LeAnne D.

    2017-01-01

    Bringing effective practices to scale across large systems requires attending to how information and belief systems come together in decisions to adopt, implement, and sustain those practices. Statewide scaling of the Pyramid Model, a framework for positive behavior intervention and support, across different types of early childhood programs…

  3. Assessing regional differences in nitrogen losses from U.S. dairy farms using the integrated farm systems model

    USDA-ARS?s Scientific Manuscript database

    Nitrogen (N) enters and leaves a dairy production system through many pathways and in many forms: undergoing numerous transformations as it passes from feed to animal to milk or manure and back again. Due to the complexity of the dairy system, estimates of N flows and losses require the use of model...

  4. Developing custom fire behavior fuel models from ecologically complex fuel structures for upper Atlantic Coastal Plain forests

    Treesearch

    Bernard R. Parresol; Joe H. Scott; Anne Andreu; Susan Prichard; Laurie Kurth

    2012-01-01

    Currently geospatial fire behavior analyses are performed with an array of fire behavior modeling systems such as FARSITE, FlamMap, and the Large Fire Simulation System. These systems currently require standard or customized surface fire behavior fuel models as inputs that are often assigned through remote sensing information. The ability to handle hundreds or...

  5. Taking a systems approach to ecological systems

    USGS Publications Warehouse

    Grace, James B.

    2015-01-01

    Increasingly, there is interest in a systems-level understanding of ecological problems, which requires the evaluation of more complex, causal hypotheses. In this issue of the Journal of Vegetation Science, Soliveres et al. use structural equation modeling to test a causal network hypothesis about how tree canopies affect understorey communities. Historical analysis suggests structural equation modeling has been under-utilized in ecology.

  6. An approach to improve management visibility within the procurement and financial group at Goldstone

    NASA Technical Reports Server (NTRS)

    Maiocco, F. R.; Rozek, J. B.

    1976-01-01

    Improvements in the operational efficiency of the data management systems at the Goldstone Deep Space Communications Complex (GDSCC) are discussed. This addresses the existing procurement and financial management data system at GDSCC, identifies management requirements for better visibility, describes a proposed computerized data management system, summarizes results to date, and identifies plans for future development.

  7. Assuring the required spectroradiometric characteristics of the Fragment multispectral system

    NASA Astrophysics Data System (ADS)

    Bogdanov, A. A.; Kuzmin, V. I.; Mosevnina, L. G.; Popkov, A. V.; Sychev, A. G.; Tarnopolskii, V. I.

    The paper examines methods and equipment for assuring the required spectroradiometric characteristics of the satellite-borne Fragment multispectral scanning system during development, fabrication, and autonomous and complex testing. These characteristics comprise: (1) the integrated sensitivity of the measuring channels to the spectral density of brightness (SDB); (2) the relative spectral sensitivity of the channels; (3) the effective spectral width of the sensitivity intervals and their position in the spectral range; (4) maximum values of SDB measured by the system in each spectral interval of sensitivity; (5) the SNR in each measuring channel; and (6) the relative rms of SDB measurements.

  8. High-efficiency high-reliability optical components for a large, high-average-power visible laser system

    NASA Astrophysics Data System (ADS)

    Taylor, John R.; Stolz, Christopher J.

    1993-08-01

    Laser system performance and reliability depend on the related performance and reliability of the optical components which define the cavity and transport subsystems. High-average-power and long transport lengths impose specific requirements on component performance. The complexity of the manufacturing process for optical components requires a high degree of process control and verification. Qualification has proven effective in ensuring confidence in the procurement process for these optical components. Issues related to component reliability have been studied and provide useful information to better understand the long-term performance and reliability of the laser system.

  9. High-efficiency high-reliability optical components for a large, high-average-power visible laser system

    NASA Astrophysics Data System (ADS)

    Taylor, J. R.; Stolz, C. J.

    1992-12-01

    Laser system performance and reliability depend on the related performance and reliability of the optical components which define the cavity and transport subsystems. High-average-power and long transport lengths impose specific requirements on component performance. The complexity of the manufacturing process for optical components requires a high degree of process control and verification. Qualification has proven effective in ensuring confidence in the procurement process for these optical components. Issues related to component reliability have been studied and provide useful information to better understand the long-term performance and reliability of the laser system.

  10. The Mechanism of Room-Temperature Ionic-Liquid-Based Electrochemical CO₂ Reduction: A Review.

    PubMed

    Lim, Hyung-Kyu; Kim, Hyungjun

    2017-03-28

    Electrochemical CO₂ conversion technology is becoming indispensable in the development of a sustainable carbon-based economy. While various types of electrocatalytic systems have been designed, those based on room-temperature ionic liquids (RTILs) have attracted considerable attention because of their high efficiencies and selectivities. Furthermore, it should be possible to develop more advanced electrocatalytic systems for commercial use because target-specific characteristics can be fine-tuned using various combinations of RTIL ions. To achieve this goal, we require a systematic understanding of the role of the RTIL components in electrocatalytic systems; however, their role has not yet been clarified by experiment or theory. Thus, the purpose of this short review is to summarize recent experimental and theoretical mechanistic studies to provide insight into and to develop guidelines for the successful development of new CO₂ conversion systems. The results discussed here can be summarized as follows. Complex physical and chemical interactions between the RTIL components and the reaction intermediates, in particular at the electrode surface, are critical for determining the activity and selectivity of the electrocatalytic system, although no single factor dominates. Therefore, more fundamental research is required to understand the physical, chemical, and thermodynamic characteristics of complex RTIL-based electrocatalytic systems.

  11. The Challenge of Wireless Reliability and Coexistence.

    PubMed

    Berger, H Stephen

    2016-09-01

    Wireless communication plays an increasingly important role in healthcare delivery. This further heightens the importance of wireless reliability, but quantifying wireless reliability is a complex and difficult challenge. Understanding the risks that accompany the many benefits of wireless communication should be a component of overall risk management. The emerging trend of using sensors and other device-to-device communications, as part of the emerging Internet of Things concept, is evident in healthcare delivery. The trend increases both the importance and complexity of this challenge. As with most system problems, finding a solution requires breaking down the problem into manageable steps. Understanding the operational reliability of a new wireless device and its supporting system requires developing solid, quantified answers to three questions: 1) How well can this new device and its system operate in a spectral environment where many other wireless devices are also operating? 2) What is the spectral environment in which this device and its system are expected to operate? Are the risks and reliability in its operating environment acceptable? 3) How might the new device and its system affect other devices and systems already in use? When operated under an insightful risk management process, wireless technology can be safely implemented, resulting in improved delivery of care.

  12. A theoretical study of complexes formed between cations and curved aromatic systems: electrostatics does not always control cation-π interaction.

    PubMed

    Carrazana-García, Jorge A; Cabaleiro-Lago, Enrique M; Rodríguez-Otero, Jesús

    2017-04-19

    The present work studies the interaction of two extended curved π-systems (corannulene and sumanene) with various cations (sodium, potassium, ammonium, tetramethylammonium, guanidinium and imidazolium). Polyatomic cations are models of groups found in important biomolecules in which cation-π interaction plays a fundamental role. The results indicate an important size effect: with extended π systems and cations of the size of potassium and larger, dispersion is much more important than has been generally recognized for cation-π interactions. In most of the systems studied here, the stability of the cation-π complexes is the result of a balanced combination of electrostatic, induction and dispersion contributions. In none of the systems studied here does the electrostatic interaction contribute more than 42% of the stability. Induction dominates stabilization in complexes with sodium, and in some of the potassium and ammonium complexes. In complexes with large cations and with flat cations, dispersion is the major stabilizing contribution and can provide more than 50% of the stabilization energy. This implies that theoretical studies of the cation-π interaction involving large or even medium-size fragments require a level of calculation capable of properly modelling dispersion. The separation between the cation and the π system is another important factor to take into account, especially when the fragments of the cation-π complex are bound (for example, to a protein backbone) and cannot interact at the most favourable distance.

  13. Mechanism of neem limonoids-induced cell death in cancer: role of oxidative phosphorylation

    PubMed Central

    Yadav, Neelu; Kumar, Sandeep; Kumar, Rahul; Srivastava, Pragya; Sun, Leimin; Rapali, Peter; Marlowe, Timothy; Schneider, Andrea; Inigo, Joseph; O’Malley, Jordan; Londonkar, Ramesh; Gogada, Raghu; Chaudhary, Ajay; Yadava, Nagendra; Chandra, Dhyan

    2016-01-01

    We have previously reported that neem limonoids (neem) induce multiple cancer cell death pathways. Here we dissect the underlying mechanisms of neem-induced apoptotic cell death in cancer. We observed that neem-induced caspase activation does not require Bax/Bak channel-mediated mitochondrial outer membrane permeabilization, permeability transition pore, and mitochondrial fragmentation. Neem enhanced mitochondrial DNA and mitochondrial biomass. While oxidative phosphorylation (OXPHOS) Complex-I activity was decreased, the activities of other OXPHOS complexes including Complex-II and -IV were unaltered. Increased reactive oxygen species (ROS) levels were associated with an increase in mitochondrial biomass and apoptosis upon neem exposure. Complex-I deficiency due to the loss of Ndufa1-encoded MWFE protein inhibited neem-induced caspase activation and apoptosis, but cell death induction was enhanced. Complex II-deficiency due to the loss of succinate dehydrogenase complex subunit C (SDHC) robustly decreased caspase activation, apoptosis, and cell death. Additionally, the ablation of Complexes-I, -III, -IV, and -V together did not inhibit caspase activation. Together, we demonstrate that neem limonoids target OXPHOS system to induce cancer cell death, which does not require upregulation or activation of proapoptotic Bcl-2 family proteins. PMID:26627937

  14. Mechanism of neem limonoids-induced cell death in cancer: Role of oxidative phosphorylation.

    PubMed

    Yadav, Neelu; Kumar, Sandeep; Kumar, Rahul; Srivastava, Pragya; Sun, Leimin; Rapali, Peter; Marlowe, Timothy; Schneider, Andrea; Inigo, Joseph R; O'Malley, Jordan; Londonkar, Ramesh; Gogada, Raghu; Chaudhary, Ajay K; Yadava, Nagendra; Chandra, Dhyan

    2016-01-01

    We have previously reported that neem limonoids (neem) induce multiple cancer cell death pathways. Here we dissect the underlying mechanisms of neem-induced apoptotic cell death in cancer. We observed that neem-induced caspase activation does not require Bax/Bak channel-mediated mitochondrial outer membrane permeabilization, permeability transition pore, and mitochondrial fragmentation. Neem enhanced mitochondrial DNA and mitochondrial biomass. While oxidative phosphorylation (OXPHOS) Complex-I activity was decreased, the activities of other OXPHOS complexes including Complex-II and -IV were unaltered. Increased reactive oxygen species (ROS) levels were associated with an increase in mitochondrial biomass and apoptosis upon neem exposure. Complex-I deficiency due to the loss of Ndufa1-encoded MWFE protein inhibited neem-induced caspase activation and apoptosis, but cell death induction was enhanced. Complex II-deficiency due to the loss of succinate dehydrogenase complex subunit C (SDHC) robustly decreased caspase activation, apoptosis, and cell death. Additionally, the ablation of Complexes-I, -III, -IV, and -V together did not inhibit caspase activation. Together, we demonstrate that neem limonoids target OXPHOS system to induce cancer cell death, which does not require upregulation or activation of proapoptotic Bcl-2 family proteins. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Spacecraft Antennas

    NASA Technical Reports Server (NTRS)

    Jamnejad, Vahraz; Manshadi, Farzin; Rahmat-Samii, Yahya; Cramer, Paul

    1990-01-01

    Some of the various categories of issues that must be considered in the selection and design of spacecraft antennas for a Personal Access Satellite System (PASS) are addressed, and parametric studies of some of the antenna concepts are provided to help the system designer make the most appropriate antenna choice with regard to weight, size, complexity, etc. The question of appropriate polarization for the spacecraft as well as for the User Terminal Antenna required particular attention and was studied in some depth. Circular polarization seems to be the favored outcome of this study. Another problem that has generally been a complicating factor in designing multiple-beam reflector antennas is the type of feeds (single vs. multiple element and overlapping vs. non-overlapping clusters) needed for generating the beams. This choice is dependent on certain system design factors, such as the required frequency reuse, acceptable interbeam isolation, antenna efficiency, number of beams scanned, and beam-forming network (BFN) complexity. This issue is partially addressed, but is not completely resolved. Indications are that it may be possible to use relatively simple non-overlapping clusters of only a few elements, unless a large frequency reuse and very stringent isolation levels are required.

  16. Simulation of the human-telerobot interface

    NASA Technical Reports Server (NTRS)

    Stuart, Mark A.; Smith, Randy L.

    1988-01-01

    A part of NASA's Space Station will be a Flight Telerobotic Servicer (FTS) used to help assemble, service, and maintain the Space Station. Since the human operator will be required to control the FTS, the design of the human-telerobot interface must be optimized from a human factors perspective. Simulation has been used as an aid in the development of complex systems and is especially useful here, since this is a complex system for which few direct comparisons to existing systems can be made. Simulation should ensure that the hardware and software components of the human-telerobot interface have been designed and selected so that the operator's capabilities and limitations are accommodated. Three broad areas of the human-telerobot interface where simulation can be of assistance are described. The use of simulation not only can result in a well-designed human-telerobot interface, but can also be used to ensure that components have been selected to best meet the system's goals, and for operator training.

  17. Reconstitution of the yeast RNA polymerase III transcription system with all recombinant factors.

    PubMed

    Ducrot, Cécile; Lefebvre, Olivier; Landrieux, Emilie; Guirouilh-Barbat, Josée; Sentenac, André; Acker, Joel

    2006-04-28

    Transcription factor TFIIIC is a multisubunit complex required for promoter recognition and transcriptional activation of class III genes. We describe here the reconstitution of complete recombinant yeast TFIIIC and the molecular characterization of its two DNA-binding domains, tauA and tauB, using the baculovirus expression system. The B block-binding module, rtauB, was reconstituted with rtau138, rtau91, and rtau60 subunits. The rtau131, rtau95, and rtau55 subunits also formed a stable complex, rtauA, that displayed nonspecific DNA binding activity. Recombinant rTFIIIC was functionally equivalent to purified yeast TFIIIC, suggesting that the six recombinant subunits are necessary and sufficient to reconstitute a transcriptionally active TFIIIC complex. The formation and the properties of rTFIIIC-DNA complexes were affected by dephosphorylation treatments. The combination of complete recombinant rTFIIIC and rTFIIIB directed a low level of basal transcription, much weaker than with the crude B'' fraction, suggesting the existence of auxiliary factors that could modulate the yeast RNA polymerase III transcription system.

  18. A case study of quality improvement methods for complex adaptive systems applied to an academic hepatology program.

    PubMed

    Fontanesi, John; Martinez, Anthony; Boyo, Toritsesan O; Gish, Robert

    2015-01-01

    Although demands for greater access to hepatology services that are less costly and achieve better outcomes have led to numerous quality improvement initiatives, traditional quality management methods may be inappropriate for hepatology. We empirically tested a model for conducting quality improvement in an academic hepatology program using methods developed to analyze and improve complex adaptive systems. We achieved a 25% increase in volume using 15% more clinical sessions with no change in staff or faculty FTEs, generating a positive margin of 50%. Wait times for next available appointments were reduced from five months to two weeks; unscheduled appointment slots dropped from 7% to less than 1%; "no-show" rates dropped to less than 10%; Press-Ganey scores increased to the 100th percentile. We conclude that framing hepatology as a complex adaptive system may improve our understanding of the complex, interdependent actions required to improve quality of care, patient satisfaction, and cost-effectiveness.

  19. Assembly dynamics and stability of the pneumococcal epsilon zeta antitoxin toxin (PezAT) system from Streptococcus pneumoniae.

    PubMed

    Mutschler, Hannes; Reinstein, Jochen; Meinhart, Anton

    2010-07-09

    The pneumococcal epsilon zeta antitoxin toxin (PezAT) system is a chromosomally encoded, class II toxin antitoxin system from the human pathogen Streptococcus pneumoniae. Neutralization of the bacteriotoxic protein PezT is carried out by complex formation with its cognate antitoxin PezA. Here we study the stability of the inhibitory complex in vivo and in vitro. We found that toxin release is impeded in Escherichia coli and Bacillus subtilis due to the proteolytic resistance of PezA once bound to PezT. These findings are supported by in vitro experiments demonstrating a strong thermodynamic stabilization of both proteins upon binding. A detailed kinetic analysis of PezAT assembly revealed that these particular features of PezAT are based on a strong, electrostatically guided binding mechanism leading to a stable toxin antitoxin complex with femtomolar affinity. Our data show that PezAT complex formation is distinct from all other conventional toxin antitoxin modules and a controlled mode of toxin release is required for activation.

  20. Computer model for characterizing, screening, and optimizing electrolyte systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gering, Kevin L.

    2015-06-15

    Electrolyte systems in contemporary batteries are tasked with operating under increasing performance requirements. All battery operation is in some way tied to the electrolyte and how it interacts with various regions within the cell environment. Because the electrolyte plays a crucial role in battery performance and longevity, it is imperative that accurate, physics-based models be developed that will characterize key electrolyte properties while keeping pace with the increasing complexity of these liquid systems. Advanced models are needed since laboratory measurements require significant resources to carry out for even a modest experimental matrix. The Advanced Electrolyte Model (AEM) developed at the INL is a proven capability designed to explore molecular-to-macroscale level aspects of electrolyte behavior, and can be used to drastically reduce the time required to characterize and optimize electrolytes. Although it is applied most frequently to lithium-ion battery systems, it is general in its theory and can be used toward numerous other targets and intended applications. This capability is unique, powerful, relevant to present and future electrolyte development, and without peer. It redefines electrolyte modeling for highly-complex contemporary systems, wherein significant steps have been taken to capture the reality of electrolyte behavior in the electrochemical cell environment. This capability can have a very positive impact on accelerating domestic battery development to support aggressive vehicle and energy goals in the 21st century.

  1. Real-time biomimetic Central Pattern Generators in an FPGA for hybrid experiments

    PubMed Central

    Ambroise, Matthieu; Levi, Timothée; Joucla, Sébastien; Yvert, Blaise; Saïghi, Sylvain

    2013-01-01

    This investigation of the leech heartbeat neural network system led to the development of low-resource, real-time, biomimetic digital hardware for use in hybrid experiments. The leech heartbeat neural network is one of the simplest central pattern generators (CPGs). In biology, CPGs provide the rhythmic bursts of spikes that form the basis for all muscle contraction orders (heartbeat) and locomotion (walking, running, etc.). The leech neural network system was previously investigated, and this CPG was formalized in the Hodgkin–Huxley (HH) neural model, the most complex devised to date. However, the resources required for a neural model are proportional to its complexity. In response to this issue, this article describes a biomimetic implementation of a network of 240 CPGs in an FPGA (Field Programmable Gate Array), using a simple neuron model (Izhikevich), and proposes a new synapse model: the activity-dependent depression synapse. The network implementation architecture operates on a single computation core. This digital system works in real time, requires few resources, and has the same bursting activity behavior as the complex model. The implementation of this CPG was initially validated by comparing it with a simulation of the complex model. Its activity was then matched with pharmacological data from rat spinal cord activity. This digital system opens the way for future hybrid experiments and represents an important step toward hybridization of biological tissue and artificial neural networks. This CPG network is also likely to be useful for mimicking the locomotion activity of various animals and developing hybrid experiments for neuroprosthesis development. PMID:24319408
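
    The Izhikevich model mentioned above is compact enough to sketch directly. The fragment below integrates a single neuron with the published "intrinsically bursting" parameter set using forward-Euler steps; the synapse model, the 240-unit network, and the FPGA fixed-point details from the paper are not shown.

```python
# Hedged sketch: one Izhikevich neuron with "intrinsically bursting"
# parameters, integrated with simple forward-Euler steps.
import numpy as np

a, b, c, d = 0.02, 0.2, -55.0, 4.0      # intrinsically bursting parameter set
dt, T, I_ext = 0.25, 400.0, 10.0        # time step (ms), duration (ms), input current

v, u = -65.0, b * -65.0                 # membrane potential and recovery variable
spike_times = []

for step in range(int(T / dt)):
    t = step * dt
    v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I_ext)
    u += dt * a * (b * v - u)
    if v >= 30.0:                       # spike: record time and reset
        spike_times.append(t)
        v, u = c, u + d

print(f"{len(spike_times)} spikes; first few: {np.round(spike_times[:8], 1)}")
```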

  2. Managing the integration and harmonization of national airspace for unmanned and manned systems

    NASA Astrophysics Data System (ADS)

    Mumm, Hans

    This dissertation examines the leadership challenge created by the requirement to integrate unmanned aerial vehicles (UAVs) into the national airspace system (NAS). The lack of UAV-related federal rules and regulations is a primary factor prolonging this integration. This effort focuses primarily on the leadership portion of the solution and not the technological requirements. The research explores an adaptation of complexity theory that offers a potential leadership framework for the government, industry, and academia to use for achieving the full integration of UAVs into the NAS. Due to the large number of stakeholders and the multitude of interrelated issues, a complexity-theory-leadership methodology was created and examined as a potential way to help the FAA accelerate its rule-making efforts. This dissertation focuses on United States UAV issues. The United States is one of the leaders in the unmanned systems arena, including the first significant use of recoverable autonomous weaponized systems in combat. Issues such as airspace, airworthiness, social issues, privacy issues, regulations, and the lack of policies, procedures, or governance are universal for all countries that are active in this technology area. This qualitative dissertation makes use of the grounded theory methodology as it combines a literature review and research along with interviews with subject matter experts, and information gained from attending UAV-related gatherings/discussions. The investigation uncovered significant FAA process impediments as well as some possible breakthrough concepts that could work well with the complexity-theory-leadership methodology. Keywords: Complexity theory, leadership, change management, UAV, unmanned aerial vehicle, National Airspace, NAS, FAA, Federal Aviation Administration.

  3. Canonical Initiation Factor Requirements of the Myc Family of Internal Ribosome Entry Segments▿ †

    PubMed Central

    Spriggs, Keith A.; Cobbold, Laura C.; Jopling, Catherine L.; Cooper, Rebecca E.; Wilson, Lindsay A.; Stoneley, Mark; Coldwell, Mark J.; Poncet, Didier; Shen, Ya-Ching; Morley, Simon J.; Bushell, Martin; Willis, Anne E.

    2009-01-01

    Initiation of protein synthesis in eukaryotes requires recruitment of the ribosome to the mRNA and its translocation to the start codon. There are at least two distinct mechanisms by which this process can be achieved; the ribosome can be recruited either to the cap structure at the 5′ end of the message or to an internal ribosome entry segment (IRES), a complex RNA structural element located in the 5′ untranslated region (5′-UTR) of the mRNA. However, it is not well understood how cellular IRESs function to recruit the ribosome or how the 40S ribosomal subunits translocate from the initial recruitment site on the mRNA to the AUG initiation codon. We have investigated the canonical factors that are required by the IRESs found in the 5′-UTRs of c-, L-, and N-myc, using specific inhibitors and a tissue culture-based assay system, and have shown that they differ considerably in their requirements. The L-myc IRES requires the eIF4F complex and the association of PABP and eIF3 with eIF4G for activity. The minimum requirements of the N- and c-myc IRESs are the C-terminal domain of eIF4G to which eIF4A is bound and eIF3, although interestingly this protein does not appear to be recruited to the IRES RNA via eIF4G. Finally, our data show that all three IRESs require a ternary complex, although in contrast to c- and L-myc IRESs, the N-myc IRES has a lesser requirement for a ternary complex. PMID:19124605

  4. Software control and system configuration management - A process that works

    NASA Technical Reports Server (NTRS)

    Petersen, K. L.; Flores, C., Jr.

    1983-01-01

    A comprehensive software control and system configuration management process for flight-crucial digital control systems of advanced aircraft has been developed and refined to insure efficient flight system development and safe flight operations. Because of the highly complex interactions among the hardware, software, and system elements of state-of-the-art digital flight control system designs, a systems-wide approach to configuration control and management has been used. Specific procedures are implemented to govern discrepancy reporting and reconciliation, software and hardware change control, systems verification and validation testing, and formal documentation requirements. An active and knowledgeable configuration control board reviews and approves all flight system configuration modifications and revalidation tests. This flexible process has proved effective during the development and flight testing of several research aircraft and remotely piloted research vehicles with digital flight control systems that ranged from relatively simple to highly complex, integrated mechanizations.

  5. Verification of Space Station Secondary Power System Stability Using Design of Experiment

    NASA Technical Reports Server (NTRS)

    Karimi, Kamiar J.; Booker, Andrew J.; Mong, Alvin C.; Manners, Bruce

    1998-01-01

    This paper describes analytical methods used in verification of large DC power systems with applications to the International Space Station (ISS). Large DC power systems contain many switching power converters with negative resistance characteristics. The ISS power system presents numerous challenges with respect to system stability, such as complex sources and undefined loads. The Space Station program has developed impedance specifications for sources and loads. The overall approach to system stability consists of specific hardware requirements coupled with extensive system analysis and testing. Testing of large complex distributed power systems is not practical due to the size and complexity of the system. Computer modeling has been extensively used to develop hardware specifications as well as to identify system configurations for lab testing. The statistical method of Design of Experiments (DoE) is used as an analysis tool for verification of these large systems. DoE reduces the number of computer runs that are necessary to analyze the performance of a complex power system consisting of hundreds of DC/DC converters. DoE also provides valuable information about the effect of changes in system parameters on the performance of the system. DoE provides information about various operating scenarios and identification of the ones with potential for instability. In this paper we will describe how we have used computer modeling to analyze a large DC power system. A brief description of DoE is given. Examples using applications of DoE to analysis and verification of the ISS power system are provided.
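
    As a toy illustration of why a designed experiment beats exhaustive simulation, the sketch below screens a two-level factorial over a few hypothetical converter parameters and flags configurations whose source/load impedance ratio violates an assumed margin (a Middlebrook-style minor-loop-gain check). The factor levels, load model, and margin are all made up; the ISS analysis used far more detailed converter models.

```python
# Hedged sketch: two-level factorial screening of hypothetical DC bus
# parameters, flagging cases where |Z_source/Z_load| exceeds a margin.
from itertools import product

factors = {                      # low / high levels (all values hypothetical)
    "source_impedance_ohm": (1.0, 5.0),
    "load_power_w":         (500.0, 2000.0),
    "bus_voltage_v":        (120.0, 160.0),
}
MARGIN = 0.5                     # require |Zs/Zl| < 0.5 (assumed 6 dB gain margin)

unstable = []
for levels in product(*factors.values()):
    cfg = dict(zip(factors, levels))
    # A constant-power load looks like |Zl| = V^2 / P at the operating point
    # (a deliberately simplified incremental-resistance model).
    z_load = cfg["bus_voltage_v"] ** 2 / cfg["load_power_w"]
    ratio = cfg["source_impedance_ohm"] / z_load
    if ratio >= MARGIN:
        unstable.append((cfg, ratio))

print(f"{len(unstable)} of {2 ** len(factors)} factorial runs violate the margin")
for cfg, ratio in unstable:
    print(cfg, f"|Zs/Zl| = {ratio:.2f}")
```

    A fractional-factorial or response-surface design would shrink the run count further once the number of factors grows, which is the point the abstract makes about hundreds of DC/DC converters.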

  6. Stereotaxy, navigation and the temporal concatenation.

    PubMed

    Apuzzo, M L; Chen, J C

    1999-01-01

    Nautical and cerebral navigation share similar elements of functional need and similar developmental pathways. The need for orientation necessitates the development of appropriate concepts, and such concepts are dependent on technology for practical realization. Occasionally, a concept precedes technology in time and requires a period of delay for appropriate development. A temporal concatenation exists in which, over time, need, concept, and technology combine to provide an elegant solution. Nautical navigation has proceeded through periods of dead reckoning and celestial navigation to satellite orientation with associated refinements of instrumentation and charts for guidance. Cerebral navigation has progressed from craniometric orientation and burr hole mounted guidance systems to simple rectolinear and arc-centered devices based on radiographs to guidance by complex anatomical and functional maps provided as an amalgam of modern imaging modes. These maps are now augmented by complex frame and frameless systems which allow not only precise orientation, but also point and volumetric action. These complex technical modalities drew upon and developed in part from elements of maritime navigation that have been translated to cerebral navigation in a temporal concatenation. Copyright 2000 S. Karger AG, Basel

  7. Humidity control as a strategy for lattice optimization applied to crystals of HLA-A*1101 complexed with variant peptides from dengue virus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chotiyarnwong, Pojchong; Medical Molecular Biology Unit, Faculty of Medicine, Siriraj Hospital, Mahidol University; Stewart-Jones, Guillaume B.

    Crystals of an MHC class I molecule bound to naturally occurring peptide variants from the dengue virus NS3 protein contained high levels of solvent and required optimization of cryoprotectant and dehydration protocols for each complex to yield well ordered diffraction, a process facilitated by the use of a free-mounting system. T-cell recognition of the antigenic peptides presented by MHC class I molecules normally triggers protective immune responses, but can result in immune enhancement of disease. Cross-reactive T-cell responses may underlie immunopathology in dengue haemorrhagic fever. To analyze these effects at the molecular level, the functional MHC class I molecule HLA-A*1101 was crystallized bound to six naturally occurring peptide variants from the dengue virus NS3 protein. The crystals contained high levels of solvent and required optimization of the cryoprotectant and dehydration protocols for each complex to yield well ordered diffraction, a process that was facilitated by the use of a free-mounting system.

  8. Constructing the wonders of the bacterial world: biosynthesis of complex enzymes.

    PubMed

    Sargent, Frank

    2007-03-01

    The prokaryotic cytoplasmic membrane not only maintains cell integrity and forms a barrier between the cell and its outside environment, but is also the location for essential biochemical processes. Microbial model systems provide excellent bases for the study of fundamental problems in membrane biology including signal transduction, chemotaxis, solute transport and, as will be the topic of this review, energy metabolism. Bacterial respiration requires a diverse array of complex, multi-subunit, cofactor-containing redox enzymes, many of which are embedded within, or located on the extracellular side of, the membrane. The biosynthesis of these enzymes therefore requires carefully controlled expression, assembly, targeting and transport processes. Here, focusing on the molybdenum-containing respiratory enzymes central to anaerobic respiration in Escherichia coli, recent descriptions of a chaperone-mediated 'proofreading' system involved in coordinating assembly and export of complex extracellular enzymes will be discussed. The paradigm proofreading chaperones are members of a large group of proteins known as the TorD family, and recent research in this area highlights common principles that underpin biosynthesis of both exported and non-exported respiratory enzymes.

  9. The trajectory of life. Decreasing physiological network complexity through changing fractal patterns

    PubMed Central

    Sturmberg, Joachim P.; Bennett, Jeanette M.; Picard, Martin; Seely, Andrew J. E.

    2015-01-01

    In this position paper, we submit a synthesis of theoretical models based on physiology, non-equilibrium thermodynamics, and non-linear time-series analysis. Based on an understanding of the human organism as a system of interconnected complex adaptive systems, we seek to examine the relationship between health, complexity, variability, and entropy production, as it might be useful to help understand aging, and improve care for patients. We observe the trajectory of life is characterized by the growth, plateauing and subsequent loss of adaptive function of organ systems, associated with loss of functioning and coordination of systems. Understanding development and aging requires the examination of interdependence among these organ systems. Increasing evidence suggests network interconnectedness and complexity can be captured/measured/associated with the degree and complexity of healthy biologic rhythm variability (e.g., heart and respiratory rate variability). We review physiological mechanisms linking the omics, arousal/stress systems, immune function, and mitochondrial bioenergetics; highlighting their interdependence in normal physiological function and aging. We argue that aging, known to be characterized by a loss of variability, is manifested at multiple scales, within functional units at the small scale, and reflected by diagnostic features at the larger scale. While still controversial and under investigation, it appears conceivable that the integrity of whole body complexity may be, at least partially, reflected in the degree and variability of intrinsic biologic rhythms, which we believe are related to overall system complexity that may be a defining feature of health and its loss through aging. Harnessing this information for the development of therapeutic and preventative strategies may hold an opportunity to significantly improve the health of our patients across the trajectory of life. PMID:26082722
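
    As an illustration of the kind of rhythm-variability measures this abstract points to (heart and respiratory rate variability), the sketch below computes two standard time-domain heart rate variability indices from a series of R-R intervals. It is not taken from the paper; the synthetic interval series and the choice of indices are assumptions made only for demonstration.

    ```python
    import numpy as np

    def hrv_time_domain(rr_ms):
        """Compute two standard time-domain heart rate variability indices.

        rr_ms: successive R-R (inter-beat) intervals in milliseconds.
        Returns SDNN (overall variability) and RMSSD (beat-to-beat variability).
        """
        rr = np.asarray(rr_ms, dtype=float)
        sdnn = rr.std(ddof=1)                       # standard deviation of all intervals
        rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))  # root mean square of successive differences
        return sdnn, rmssd

    # Illustrative comparison: a more variable rhythm vs a less variable one,
    # echoing the loss-of-variability pattern described in the abstract.
    rng = np.random.default_rng(0)
    varied = 800 + 60 * np.sin(np.linspace(0, 20, 300)) + rng.normal(0, 30, 300)
    flat   = 800 + 10 * np.sin(np.linspace(0, 20, 300)) + rng.normal(0, 5, 300)
    print("varied rhythm SDNN/RMSSD:", hrv_time_domain(varied))
    print("flat rhythm   SDNN/RMSSD:", hrv_time_domain(flat))
    ```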

  10. Packet communications in satellites with multiple-beam antennas and signal processing

    NASA Technical Reports Server (NTRS)

    Davies, R.; Chethik, F.; Penick, M.

    1980-01-01

    A communication satellite with a multiple-beam antenna and onboard signal processing is considered for use in a 'message-switched' data relay system. The signal processor may incorporate demodulation, routing, storage, and remodulation of the data. A system user model is established and key functional elements for the signal processing are identified. With the throughput and delay requirements as the controlled variables, the hardware complexity, operational discipline, occupied bandwidth, and overall user end-to-end cost are estimated for (1) random-access packet switching; and (2) reservation-access packet switching. Other aspects of this network (e.g., the adaptability to channel-switched traffic requirements) are examined. For the given requirements and constraints, the reservation system appears to be the most attractive protocol.
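
    To make the random-access versus reservation-access trade concrete, the sketch below compares the classical slotted-ALOHA throughput S = G·e^(-G) with an idealized reservation scheme whose useful throughput tracks the offered load up to the capacity left after a fixed reservation overhead. The overhead value and the simplified reservation model are illustrative assumptions, not figures from the study.

    ```python
    import math

    def slotted_aloha_throughput(G):
        """Classical slotted-ALOHA throughput for offered load G (packets/slot)."""
        return G * math.exp(-G)

    def reservation_throughput(G, overhead=0.1):
        """Idealized reservation access: useful throughput follows the offered load
        up to the capacity remaining after reservation overhead (illustrative only)."""
        return min(G, 1.0 - overhead)

    for G in (0.2, 0.5, 1.0, 2.0):
        print(f"G={G:4.1f}  random-access={slotted_aloha_throughput(G):.3f}  "
              f"reservation={reservation_throughput(G):.3f}")
    ```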

  11. Expert systems for MSFC power systems

    NASA Technical Reports Server (NTRS)

    Weeks, David J.

    1988-01-01

    Future space vehicles and platforms, including Space Station, will possess complex power systems. These systems will require a high level of autonomous operation to allow the crew to concentrate on mission activities and to keep the number of ground support personnel reasonable. The Electrical Power Branch at NASA-Marshall is developing advanced automation approaches which will enable the necessary levels of autonomy. These approaches include the utilization of knowledge-based, or expert, systems.
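
    A minimal sketch of the knowledge-based approach mentioned above: a small forward-chaining rule set that maps power-system telemetry to operator advisories. The rule names, telemetry fields, and thresholds are hypothetical, chosen only to show the structure of such a system, not taken from the MSFC work.

    ```python
    # Hypothetical telemetry keys and thresholds, for illustration only.
    def battery_low(t):      return t["battery_soc"] < 0.3
    def array_degraded(t):   return t["array_power_w"] < 0.8 * t["array_rated_w"]
    def bus_overload(t):     return t["bus_load_w"] > t["bus_limit_w"]

    RULES = [
        (battery_low,    "Shed non-critical loads; prioritize battery charging."),
        (array_degraded, "Flag solar array degradation for ground review."),
        (bus_overload,   "Trip lowest-priority load to protect the bus."),
    ]

    def advise(telemetry):
        """Return the advisories whose conditions match the current telemetry."""
        return [action for condition, action in RULES if condition(telemetry)]

    print(advise({"battery_soc": 0.25, "array_power_w": 900.0,
                  "array_rated_w": 1000.0, "bus_load_w": 1200.0,
                  "bus_limit_w": 1500.0}))
    ```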

  12. HyperForest: A high performance multi-processor architecture for real-time intelligent systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, P. Jr.; Rebeil, J.P.; Pollard, H.

    1997-04-01

    Intelligent Systems are characterized by the intensive use of computer power. The computer revolution of the last few years is what has made possible the development of the first generation of Intelligent Systems. Software for second-generation Intelligent Systems will be more complex and will require more powerful computing engines in order to meet real-time constraints imposed by new robots, sensors, and applications. A multiprocessor architecture was developed that merges the advantages of message-passing and shared-memory structures: expandability and real-time compliance. The HyperForest architecture will provide an expandable real-time computing platform for computationally intensive Intelligent Systems and open the doors for the application of these systems to more complex tasks in environmental restoration and cleanup projects, flexible manufacturing systems, and DOE's own production and disassembly activities.
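
    The contrast between the message-passing and shared-memory structures that HyperForest is said to merge can be illustrated, at a much smaller scale, with Python's multiprocessing primitives: a queue carries explicit messages between processes, while a synchronized shared value is updated in place by several workers. This is a conceptual sketch only and says nothing about the actual HyperForest hardware.

    ```python
    import multiprocessing as mp

    def producer(q):
        # Message passing: results travel as explicit messages through a queue.
        for i in range(5):
            q.put(i * i)
        q.put(None)  # sentinel marking end of stream

    def accumulator(counter):
        # Shared memory: workers update one synchronized counter directly.
        with counter.get_lock():
            counter.value += 1

    if __name__ == "__main__":
        q = mp.Queue()
        p = mp.Process(target=producer, args=(q,))
        p.start()
        results = []
        while (item := q.get()) is not None:
            results.append(item)
        p.join()

        counter = mp.Value("i", 0)
        workers = [mp.Process(target=accumulator, args=(counter,)) for _ in range(4)]
        for w in workers: w.start()
        for w in workers: w.join()

        print("message-passing results:", results)
        print("shared-memory counter:", counter.value)
    ```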

  13. 47 CFR 36.123 - Operator systems equipment-Category 1.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... apportioned on the basis of the relative processor real time (i.e., actual seconds) required to process TSPS... relative processor real time (i.e., actual seconds) for the entire TSPS complex. [52 FR 17229, May 6, 1987... 47 Telecommunication 2 2014-10-01 2014-10-01 false Operator systems equipment-Category 1. 36.123...
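
    The rule quoted above apportions costs in proportion to relative processor real time. A minimal sketch of such a proportional split follows; the category names and figures are invented for illustration and do not come from the regulation.

    ```python
    def apportion(total_cost, seconds_by_category):
        """Split a cost pool in proportion to processor real time (seconds)
        consumed by each category, mirroring a relative-use apportionment rule."""
        total_seconds = sum(seconds_by_category.values())
        return {cat: total_cost * s / total_seconds
                for cat, s in seconds_by_category.items()}

    # Hypothetical figures for demonstration only.
    print(apportion(100_000.0, {"interstate": 1_200.0, "intrastate": 2_800.0}))
    ```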

  14. 47 CFR 36.123 - Operator systems equipment-Category 1.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... apportioned on the basis of the relative processor real time (i.e., actual seconds) required to process TSPS... relative processor real time (i.e., actual seconds) for the entire TSPS complex. [52 FR 17229, May 6, 1987... 47 Telecommunication 2 2013-10-01 2013-10-01 false Operator systems equipment-Category 1. 36.123...

  15. 47 CFR 36.123 - Operator systems equipment-Category 1.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... apportioned on the basis of the relative processor real time (i.e., actual seconds) required to process TSPS... relative processor real time (i.e., actual seconds) for the entire TSPS complex. [52 FR 17229, May 6, 1987... 47 Telecommunication 2 2012-10-01 2012-10-01 false Operator systems equipment-Category 1. 36.123...

  16. 47 CFR 36.123 - Operator systems equipment-Category 1.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... apportioned on the basis of the relative processor real time (i.e., actual seconds) required to process TSPS... relative processor real time (i.e., actual seconds) for the entire TSPS complex. [52 FR 17229, May 6, 1987... 47 Telecommunication 2 2011-10-01 2011-10-01 false Operator systems equipment-Category 1. 36.123...

  17. Assessment Systems and Data Management in Colleges of Education: An Examination of Systems and Infrastructure

    ERIC Educational Resources Information Center

    Haughton, Noela A.; Keil, Virginia L.

    2009-01-01

    The College of Education Assessment Infrastructure Survey was developed and administered to 1011 institutions over a twelve-month period ending April 2007. The survey examined the capacity of university-based teacher preparation programs to respond to the growing and increasingly complex data management requirements that accompany assessment and…

  18. Analyzing Change in Students' Gene-to-Evolution Models in College-Level Introductory Biology

    ERIC Educational Resources Information Center

    Dauer, Joseph T.; Momsen, Jennifer L.; Speth, Elena Bray; Makohon-Moore, Sasha C.; Long, Tammy M.

    2013-01-01

    Research in contemporary biology has become increasingly complex and organized around understanding biological processes in the context of systems. To better reflect the ways of thinking required for learning about systems, we developed and implemented a pedagogical approach using box-and-arrow models (similar to concept maps) as a foundational…

  19. The science of decisionmaking: applications for sustainable forest and grassland management in the National Forest System

    Treesearch

    Matthew P. Thompson; Bruce G. Marcot; Frank R. Thompson; Steven McNulty; Larry A. Fisher; Michael C. Runge; David Cleaves; Monica Tomosy

    2013-01-01

    Sustainable management of national forests and grasslands within the National Forest System (NFS) often requires managers to make tough decisions under considerable uncertainty, complexity, and potential conflict. Resource decisionmakers must weigh a variety of risks, stressors, and challenges to sustainable management, including climate change, wildland fire, invasive...

  20. 47 CFR 36.123 - Operator systems equipment-Category 1.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... apportioned on the basis of the relative processor real time (i.e., actual seconds) required to process TSPS... relative processor real time (i.e., actual seconds) for the entire TSPS complex. [52 FR 17229, May 6, 1987... 47 Telecommunication 2 2010-10-01 2010-10-01 false Operator systems equipment-Category 1. 36.123...
