Sample records for control computer complex

  1. Description and operational status of the National Transonic Facility computer complex

    NASA Technical Reports Server (NTRS)

    Boyles, G. B., Jr.

    1986-01-01

    This paper describes the National Transonic Facility (NTF) computer complex and its support of tunnel operations. The research data acquisition and reduction capabilities are discussed, along with the types of data that can be acquired and presented. Pretest, test, and posttest capabilities are also outlined, along with a discussion of how the computer complex monitors the tunnel control processes and provides the tunnel operators with the information needed to control the tunnel. Planned enhancements to the computer complex for support of future testing are presented.

  2. Closely Spaced Independent Parallel Runway Simulation.

    DTIC Science & Technology

    1984-10-01

    The facility consists of the Central Computer Facility, the Controller Laboratory, and the Simulator Pilot Complex. The Central Computer Facility consists of a group of mainframes, minicomputers, and associated peripherals which host the operational and data acquisition ... in the Controller Laboratory and convert their verbal directives into a keyboard entry which is transmitted to the Central Computer Complex, where ...

  3. A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems

    NASA Technical Reports Server (NTRS)

    Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  4. Safety Metrics for Human-Computer Controlled Systems

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy G.; Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  5. Computational complexities and storage requirements of some Riccati equation solvers

    NASA Technical Reports Server (NTRS)

    Utku, Senol; Garba, John A.; Ramesh, A. V.

    1989-01-01

    The linear optimal control problem of an nth-order time-invariant dynamic system with a quadratic performance functional is usually solved by the Hamilton-Jacobi approach. This leads to the solution of the differential matrix Riccati equation with a terminal condition. The bulk of the computation for the optimal control problem is related to the solution of this equation. There are various algorithms in the literature for solving the matrix Riccati equation. However, computational complexities and storage requirements as a function of numbers of state variables, control variables, and sensors are not available for all these algorithms. In this work, the computational complexities and storage requirements for some of these algorithms are given. These expressions show the immensity of the computational requirements of the algorithms in solving the Riccati equation for large-order systems such as the control of highly flexible space structures. The expressions are also needed to compute the speedup and efficiency of any implementation of these algorithms on concurrent machines.
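
    For reference, a standard statement of the equation in question (the generic linear-quadratic form; the paper's specific algorithm costs are not reproduced here): for the time-invariant system with quadratic cost, the Hamilton-Jacobi approach yields the differential matrix Riccati equation with a terminal condition, from which the optimal control follows.

      \dot{x}(t) = A x(t) + B u(t), \qquad
      J = \tfrac{1}{2}\, x(t_f)^{\mathsf{T}} S_f\, x(t_f)
        + \tfrac{1}{2} \int_{t_0}^{t_f} \left( x^{\mathsf{T}} Q x + u^{\mathsf{T}} R u \right) dt,

      -\dot{S}(t) = A^{\mathsf{T}} S(t) + S(t) A - S(t) B R^{-1} B^{\mathsf{T}} S(t) + Q,
      \qquad S(t_f) = S_f, \qquad u^{*}(t) = -R^{-1} B^{\mathsf{T}} S(t)\, x(t).

    Since S(t) is symmetric, the matrix equation hides n(n+1)/2 coupled scalar equations, which is what drives the complexity and storage expressions discussed above.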

  6. Autonomous control systems: applications to remote sensing and image processing

    NASA Astrophysics Data System (ADS)

    Jamshidi, Mohammad

    2001-11-01

    One of the main challenges of any control (or image processing) paradigm is being able to handle complex systems under unforeseen uncertainties. A system may be called complex here if its dimension (order) is too high and its model (if available) is nonlinear and interconnected, and information on the system is uncertain, such that classical techniques cannot easily handle the problem. Examples of complex systems are power networks, space robotic colonies, the national air traffic control system, integrated manufacturing plants, the Hubble Telescope, and the International Space Station. Soft computing, a consortium of methodologies such as fuzzy logic, neuro-computing, genetic algorithms, and genetic programming, has proven to be a powerful tool for adding autonomy and semi-autonomy to many complex systems. For such systems the size of the soft computing control architecture will be nearly infinite. In this paper new paradigms using soft computing approaches are utilized to design autonomous controllers and image enhancers for a number of application areas. These applications are satellite array formations for synthetic aperture radar interferometry (InSAR) and enhancement of analog and digital images.

  7. [Soft- and hardware support for the setup for computer tracking of radiation teletherapy].

    PubMed

    Tarutin, I G; Piliavets, V I; Strakh, A G; Minenko, V F; Golubovskiĭ, A I

    1983-06-01

    A hardware and software computer-assisted complex has been developed for gamma-beam therapy. The complex includes all radiotherapeutic units: a Siemens program-controlled betatron with an energy of 42 MeV, an ES-1022 computer, a Medigraf system for the processing of graphic information, a Mars-256 system for control over the homogeneity of the dose-rate distribution on the irradiation field, and a package of mathematical programs to select a plan of irradiation for various tumor sites. The prospects of the utilization of such complexes in the dosimetric support of radiation therapy are discussed.

  8. Simplified microprocessor design for VLSI control applications

    NASA Technical Reports Server (NTRS)

    Cameron, K.

    1991-01-01

    A design technique for microprocessors combining the simplicity of reduced instruction set computers (RISCs) with the richer instruction sets of complex instruction set computers (CISCs) is presented. These processors utilize the pipelined instruction decode and datapaths common to RISCs. Instruction-invariant data processing sequences, which transparently support complex addressing modes, permit the formulation of simple control circuitry. Compact implementations are possible since neither complicated controllers nor large register sets are required.

  9. Complexity control algorithm based on adaptive mode selection for interframe coding in high efficiency video coding

    NASA Astrophysics Data System (ADS)

    Chen, Gang; Yang, Bing; Zhang, Xiaoyun; Gao, Zhiyong

    2017-07-01

    The latest high efficiency video coding (HEVC) standard significantly increases the encoding complexity for improving its coding efficiency. Due to the limited computational capability of handheld devices, complexity-constrained video coding has drawn great attention in recent years. A complexity control algorithm based on adaptive mode selection is proposed for interframe coding in HEVC. Considering the direct proportionality between encoding time and computational complexity, the computational complexity is measured in terms of encoding time. First, complexity is mapped to a target in terms of prediction modes. Then, an adaptive mode selection algorithm is proposed for the mode decision process. Specifically, the optimal mode combination scheme that is chosen through offline statistics is developed at low complexity. If the complexity budget has not been used up, an adaptive mode sorting method is employed to further improve coding efficiency. The experimental results show that the proposed algorithm achieves a very large complexity control range (as low as 10%) for the HEVC encoder while maintaining good rate-distortion performance. For the low-delay P condition, compared with the direct resource allocation method and the state-of-the-art method, average gains of 0.63 and 0.17 dB in BD-PSNR are observed for 18 sequences when the target complexity is around 40%.
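
    As a rough illustration of the budget-driven idea only (a minimal Python sketch with hypothetical mode tables and timings, not the authors' algorithm or the HEVC reference encoder), the encoder can track how much of its time budget remains and restrict the candidate prediction modes accordingly:

      # Minimal sketch of complexity-budgeted mode selection (hypothetical
      # mode tables; not the paper's exact scheme).
      import time

      # Candidate mode sets ordered from cheapest to most thorough; the real
      # algorithm derives such combinations from offline statistics.
      MODE_SETS = [
          ["MERGE"],
          ["MERGE", "INTER_2Nx2N"],
          ["MERGE", "INTER_2Nx2N", "INTER_2NxN", "INTER_Nx2N"],
          ["MERGE", "INTER_2Nx2N", "INTER_2NxN", "INTER_Nx2N", "INTRA"],
      ]

      def pick_mode_set(time_used, time_budget, frames_done, frames_total):
          """Choose a mode set so projected encoding time stays in budget."""
          if frames_done == 0:
              return MODE_SETS[-1]                     # no statistics yet
          projected = time_used / frames_done * frames_total
          ratio = time_budget / max(projected, 1e-9)   # >1 means slack
          idx = min(int(ratio * len(MODE_SETS)), len(MODE_SETS) - 1)
          return MODE_SETS[max(idx, 0)]

      def encode_frame(frame, modes):
          """Stand-in for rate-distortion-optimized encoding over modes."""
          time.sleep(0.001 * len(modes))               # cost grows with modes

      start, budget, n_frames = time.time(), 0.05, 10
      for i in range(n_frames):
          modes = pick_mode_set(time.time() - start, budget, i, n_frames)
          encode_frame(f"frame{i}", modes)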

  10. Controls for Burning Solid Wastes

    ERIC Educational Resources Information Center

    Toro, Richard F.; Weinstein, Norman J.

    1975-01-01

    Modern thermal solid waste processing systems are becoming more complex, incorporating features that require instrumentation and control systems to a degree greater than that previously required just for proper combustion control. With the advent of complex, sophisticated, thermal processing systems, TV monitoring and computer control should…

  11. Energy conservation and analysis and evaluation. [specifically at Slidell Computer Complex]

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The survey assembled information and made recommendations directed at conserving utilities and reducing the use of energy at the Slidell Computer Complex. Specific items included were: (1) scheduling and controlling the use of gas and electricity, (2) building modifications to reduce energy use, (3) replacement of old, inefficient equipment, (4) modifications to control systems, (5) evaluations of economizer cycles in HVAC systems, and (6) corrective settings for thermostats, ductstats, and other temperature and pressure control devices.

  12. USSR and Eastern Europe Scientific Abstracts, Cybernetics, Computers, and Automation Technology, Number 26

    DTIC Science & Technology

    1977-01-26

    Sisteme Matematicheskogo Obespecheniya YeS EVM [Applied Programs in the Software System for the Unified System of Computers], by A. Ye. Fateyev, A. I. ... computerized systems are most effective in large production complexes, in which the level of utilization of computers can be as high as 500,000 ... performance of these tasks could be furthered by the complex introduction of electronic computers in automated control systems. The creation of ASU ...

  13. JPRS Report, Science & Technology, USSR: Computers, Control Systems and Machines

    DTIC Science & Technology

    1989-03-14

    optimizatsii slozhnykh sistem (Coding Theory and Complex System Optimization). Alma-Ata, Nauka Press, 1977, pp. 8-16. 11. Author's certificate number ... Interpreter Specifics [O. I. Amvrosova] ... Creation of Modern Computer Systems for Complex Ecological ... processor can be designed to decrease degradation upon failure and assure more reliable processor operation, without requiring more complex software or ...

  14. A tool for modeling concurrent real-time computation

    NASA Technical Reports Server (NTRS)

    Sharma, D. D.; Huang, Shie-Rei; Bhatt, Rahul; Sridharan, N. S.

    1990-01-01

    Real-time computation is a significant area of research in general, and in AI in particular. The complexity of practical real-time problems demands the use of knowledge-based problem solving techniques while satisfying real-time performance constraints. Since the demands of a complex real-time problem cannot be predicted (owing to the dynamic nature of the environment), powerful dynamic resource control techniques are needed to monitor and control the performance. A real-time computation model for a real-time tool, an implementation of the QP-Net simulator on a Symbolics machine, and an implementation on a Butterfly multiprocessor machine are briefly described.

  15. High level language for measurement complex control based on the computer E-1001

    NASA Technical Reports Server (NTRS)

    Zubkov, B. V.

    1980-01-01

    A high level language was designed to control the process of conducting an experiment using the "Elektronika-1001" computer. Program examples are given for controlling the measuring and actuating devices. The procedure for including these programs in the suggested high level language is described.

  16. How to Compute Labile Metal-Ligand Equilibria

    ERIC Educational Resources Information Center

    de Levie, Robert

    2007-01-01

    The different methods used for computing labile metal-ligand complexes, which are suitable for an iterative computer solution, are illustrated. The ligand function has allowed students to relegate otherwise tedious iterations to a computer, while retaining complete control over what is calculated.
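
    To make the kind of iteration concrete, here is a minimal sketch (with illustrative stability constants, not de Levie's ligand-function formalism): for mononuclear complexes ML_i with overall stability constants beta_i, the free ligand concentration [L] satisfies a mass balance that is conveniently solved numerically.

      # Free-ligand concentration for labile M-L complexes, solved by
      # bisection (illustrative constants, not from the article).
      # Mass balances for [ML_i] = beta_i [M] [L]^i :
      #   M_tot = [M] (1 + sum_i beta_i [L]^i)
      #   L_tot = [L] + [M] sum_i i beta_i [L]^i

      def free_ligand(M_tot, L_tot, betas):
          def residual(L):
              poly = sum(b * L ** (i + 1) for i, b in enumerate(betas))
              dpoly = sum((i + 1) * b * L ** (i + 1) for i, b in enumerate(betas))
              M = M_tot / (1.0 + poly)
              return L + M * dpoly - L_tot      # zero at the true [L]

          lo, hi = 0.0, L_tot                   # [L] must lie in this bracket
          for _ in range(100):                  # bisection: slow but robust
              mid = 0.5 * (lo + hi)
              if residual(mid) > 0.0:
                  hi = mid
              else:
                  lo = mid
          return 0.5 * (lo + hi)

      # Hypothetical four-step system with log(beta) = 2.6, 4.6, 6.0, 7.0
      betas = [10 ** 2.6, 10 ** 4.6, 10 ** 6.0, 10 ** 7.0]
      print(free_ligand(M_tot=1e-3, L_tot=1e-2, betas=betas))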

  17. [The P300-based brain-computer interface: presentation of the complex "flash + movement" stimuli].

    PubMed

    Ganin, I P; Kaplan, A Ia

    2014-01-01

    The P300-based brain-computer interface requires the detection of the P300 wave of brain event-related potentials. Most users learn BCI control in several minutes, and after short classifier training they can type text on the computer screen or assemble an image from separate fragments in simple BCI-based video games. Nevertheless, insufficient attractiveness for users and conservative stimulus organization in this BCI may restrict its integration into the control of real information processes. At the same time, the initial movement of an object (motion-onset stimuli) may be an independent factor that induces the P300 wave. In the current work we tested the hypothesis that complex "flash + movement" stimuli, together with a drastic and compact stimulus organization on the computer screen, may be much more attractive to the user operating a P300 BCI. In a study of 20 subjects we showed the effectiveness of our interface. Both accuracy and P300 amplitude were higher for flashing stimuli and complex "flash + movement" stimuli compared to motion-onset stimuli. N200 amplitude was maximal for flashing stimuli, while for "flash + movement" stimuli and motion-onset stimuli it was only half of that value. A similar BCI with complex stimuli may be embedded into compact control systems that require a high level of user attention under negative external effects obstructing BCI control.

  18. Computer-Assisted Monitoring Of A Complex System

    NASA Technical Reports Server (NTRS)

    Beil, Bob J.; Mickelson, Eric M.; Sterritt, John M.; Costantino, Rob W.; Houvener, Bob C.; Super, Mike A.

    1995-01-01

    Propulsion System Advisor (PSA) computer-based system assists engineers and technicians in analyzing masses of sensory data indicative of operating conditions of space shuttle propulsion system during pre-launch and launch activities. Designed solely for monitoring; does not perform any control functions. Although PSA developed for highly specialized application, serves as prototype of noncontrolling, computer-based subsystems for monitoring other complex systems like electric-power-distribution networks and factories.

  19. The engineering design integration (EDIN) system. [digital computer program complex

    NASA Technical Reports Server (NTRS)

    Glatt, C. R.; Hirsch, G. N.; Alford, G. E.; Colquitt, W. N.; Reiners, S. J.

    1974-01-01

    A digital computer program complex for the evaluation of aerospace vehicle preliminary designs is described. The system consists of a Univac 1100 series computer and peripherals using the Exec 8 operating system, a set of demand access terminals of the alphanumeric and graphics types, and a library of independent computer programs. Modification of the partial run streams, data base maintenance and construction, and control of program sequencing are provided by a data manipulation program called the DLG processor. The executive control of library program execution is performed by the Univac Exec 8 operating system through a user established run stream. A combination of demand and batch operations is employed in the evaluation of preliminary designs. Applications accomplished with the EDIN system are described.

  20. Approximation algorithms for planning and control

    NASA Technical Reports Server (NTRS)

    Boddy, Mark; Dean, Thomas

    1989-01-01

    A control system operating in a complex environment will encounter a variety of different situations, with varying amounts of time available to respond to critical events. Ideally, such a control system will do the best possible with the time available. In other words, its responses should approximate those that would result from having unlimited time for computation, where the degree of the approximation depends on the amount of time it actually has. There exist approximation algorithms for a wide variety of problems. Unfortunately, the solution to any reasonably complex control problem will require solving several computationally intensive problems. Algorithms for successive approximation are a subclass of the class of anytime algorithms, algorithms that return answers for any amount of computation time, where the answers improve as more time is allotted. An architecture is described for allocating computation time to a set of anytime algorithms, based on expectations regarding the value of the answers they return. The architecture described is quite general, producing optimal schedules for a set of algorithms under widely varying conditions.
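
    A minimal sketch of the anytime property itself (a toy interruptible 2-opt tour improver, not the deliberation-scheduling architecture described above): the routine can be stopped at any moment, always holds a valid best-so-far answer, and its answer improves as more time is allotted.

      # Anytime algorithm sketch: interruptible 2-opt tour improvement.
      import random, time

      def tour_length(pts, tour):
          return sum(((pts[tour[i - 1]][0] - pts[tour[i]][0]) ** 2 +
                      (pts[tour[i - 1]][1] - pts[tour[i]][1]) ** 2) ** 0.5
                     for i in range(len(tour)))

      def anytime_2opt(pts, deadline):
          """Improve a tour until the deadline; best-so-far stays valid."""
          tour = list(range(len(pts)))
          best = tour_length(pts, tour)
          while time.time() < deadline:
              i, j = sorted(random.sample(range(len(pts)), 2))
              cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
              length = tour_length(pts, cand)
              if length < best:                 # keep only improvements
                  tour, best = cand, length
          return tour, best

      random.seed(0)
      pts = [(random.random(), random.random()) for _ in range(60)]
      for budget in (0.01, 0.1, 1.0):           # more time, better answer
          _, best = anytime_2opt(pts, time.time() + budget)
          print(f"budget {budget:5.2f}s -> tour length {best:.3f}")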

  1. Multiplexed Predictive Control of a Large Commercial Turbofan Engine

    NASA Technical Reports Server (NTRS)

    Richter, Hanz; Singaraju, Anil; Litt, Jonathan S.

    2008-01-01

    Model predictive control is a strategy well-suited to handle the highly complex, nonlinear, uncertain, and constrained dynamics involved in aircraft engine control problems. However, it has thus far been infeasible to implement model predictive control in engine control applications, because of the combination of model complexity and the time allotted for the control update calculation. In this paper, a multiplexed implementation is proposed that dramatically reduces the computational burden of the quadratic programming optimization that must be solved online as part of the model-predictive-control algorithm. Actuator updates are calculated sequentially and cyclically in a multiplexed implementation, as opposed to the simultaneous optimization taking place in conventional model predictive control. Theoretical aspects are discussed based on a nominal model, and actual computational savings are demonstrated using a realistic commercial engine model.
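
    A toy sketch of the multiplexed idea (an unconstrained quadratic stand-in in Python, not the engine model or the paper's constrained QP): rather than optimizing all actuator moves jointly at every sample, each sample exactly re-optimizes a single actuator in a fixed cycle, and the sequence converges toward the jointly optimal point.

      # Simultaneous vs. multiplexed minimization of J(u) = 0.5 u'Hu + g'u.
      import numpy as np

      rng = np.random.default_rng(1)
      m = 4                                  # number of actuators
      M = rng.standard_normal((m, m))
      H = M @ M.T + m * np.eye(m)            # positive definite Hessian
      g = rng.standard_normal(m)

      u_star = -np.linalg.solve(H, g)        # simultaneous optimum (full QP)

      # Multiplexed: each "sample" re-optimizes one actuator, holding the
      # others at their previous values (exact coordinate minimization).
      u = np.zeros(m)
      for k in range(8 * m):
          i = k % m                          # cyclic actuator schedule
          # Solve dJ/du_i = H[i, :] @ u + g[i] = 0 for u_i alone:
          u[i] = -(g[i] + H[i, :] @ u - H[i, i] * u[i]) / H[i, i]

      print("full QP  :", np.round(u_star, 4))
      print("multiplex:", np.round(u, 4))    # approaches the same optimum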

  2. A service-oriented data access control model

    NASA Astrophysics Data System (ADS)

    Meng, Wei; Li, Fengmin; Pan, Juchen; Song, Song; Bian, Jiali

    2017-01-01

    The development of mobile computing, cloud computing, and distributed computing meets growing individual service needs. Facing complex application systems, ensuring real-time, dynamic, and fine-grained data access control is an urgent problem. By analyzing common data access control models and building on the mandatory access control model, the paper proposes a service-oriented access control model. By regarding system services as subjects and database data as objects, the model defines access levels and access identification for subjects and objects, and ensures that system services access databases securely.
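
    A minimal sketch of the level check such a model implies (hypothetical level names and rules; the paper's actual model is richer): a service, acting as the subject, may access a data object only if its access level dominates the object's.

      # Mandatory-style check with services as subjects and database data
      # as objects (hypothetical levels; not the paper's full model).
      from dataclasses import dataclass

      LEVELS = {"public": 0, "internal": 1, "confidential": 2, "secret": 3}

      @dataclass(frozen=True)
      class Subject:                  # a system service
          service_id: str
          clearance: str

      @dataclass(frozen=True)
      class DataObject:               # a table, row, or column
          object_id: str
          classification: str

      def may_read(s: Subject, o: DataObject) -> bool:
          """Dominance rule: no read up."""
          return LEVELS[s.clearance] >= LEVELS[o.classification]

      def may_write(s: Subject, o: DataObject) -> bool:
          """Star-property-style rule: no write down."""
          return LEVELS[s.clearance] <= LEVELS[o.classification]

      billing = Subject("billing-svc", "internal")
      salaries = DataObject("hr.salaries", "confidential")
      print(may_read(billing, salaries))    # False: insufficient clearance
      print(may_write(billing, salaries))   # True: writing up is allowed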

  3. Parametric bicubic spline and CAD tools for complex targets shape modelling in physical optics radar cross section prediction

    NASA Astrophysics Data System (ADS)

    Delogu, A.; Furini, F.

    1991-09-01

    Increasing interest in radar cross section (RCS) reduction is placing new demands on theoretical, computational, and graphic techniques for calculating the scattering properties of complex targets. In particular, computer codes capable of predicting the RCS of an entire aircraft at high frequency and of achieving RCS control with modest structural changes are becoming of paramount importance in stealth design. A computer code that evaluates the RCS of arbitrarily shaped metallic objects generated by computer-aided design (CAD), together with its validation against measurements carried out using ALENIA RCS test facilities, is presented. The code, based on the physical optics method, is characterized by an efficient integration algorithm with error control, in order to contain the computer time within acceptable limits, and by an accurate parametric representation of the target surface in terms of bicubic splines.

  4. Nonlinear dynamics as an engine of computation.

    PubMed

    Kia, Behnam; Lindner, John F; Ditto, William L

    2017-03-06

    Control of chaos teaches that control theory can tame the complex, random-like behaviour of chaotic systems. This alliance between control methods and physics (cybernetical physics) opens the door to many applications, including dynamics-based computing. In this article, we introduce nonlinear dynamics and its rich, sometimes chaotic behaviour as an engine of computation. We review our work that has demonstrated how to compute using nonlinear dynamics. Furthermore, we investigate the interrelationship between invariant measures of a dynamical system and its computing power to strengthen the bridge between physics and computation. This article is part of the themed issue 'Horizons of cybernetical physics'. © 2017 The Author(s).

  5. Nonlinear dynamics as an engine of computation

    PubMed Central

    Lindner, John F.; Ditto, William L.

    2017-01-01

    Control of chaos teaches that control theory can tame the complex, random-like behaviour of chaotic systems. This alliance between control methods and physics—cybernetical physics—opens the door to many applications, including dynamics-based computing. In this article, we introduce nonlinear dynamics and its rich, sometimes chaotic behaviour as an engine of computation. We review our work that has demonstrated how to compute using nonlinear dynamics. Furthermore, we investigate the interrelationship between invariant measures of a dynamical system and its computing power to strengthen the bridge between physics and computation. This article is part of the themed issue ‘Horizons of cybernetical physics’. PMID:28115619

  6. Interactive computer graphics and its role in control system design of large space structures

    NASA Technical Reports Server (NTRS)

    Reddy, A. S. S. R.

    1985-01-01

    This paper attempts to show the relevance of interactive computer graphics in the design of control systems that maintain the attitude and shape of large space structures to accomplish the required mission objectives. The typical phases of control system design, starting from the physical model, such as modeling the dynamics, modal analysis, and control system design methodology, are reviewed, and the need for interactive computer graphics is demonstrated. Typical constituent parts of large space structures such as free-free beams and free-free plates are used to demonstrate the complexity of the control system design and the effectiveness of interactive computer graphics.

  7. Advanced computer architecture for large-scale real-time applications.

    DOT National Transportation Integrated Search

    1973-04-01

    Air traffic control automation is identified as a crucial problem which provides a complex, real-time computer application environment. A novel computer architecture in the form of a pipeline associative processor is conceived to achieve greater perf...

  8. Role of optical computers in aeronautical control applications

    NASA Technical Reports Server (NTRS)

    Baumbick, R. J.

    1981-01-01

    The role that optical computers play in aircraft control is determined. The optical computer has the potential high-speed capability required, especially for matrix/matrix operations. It also has the potential for handling nonlinear simulations in real time, and it is more compatible with fiber-optic signal transmission. Optics also permit the use of passive sensors to measure process variables: no electrical energy need be supplied to the sensor. Complex interfacing between optical sensors and the optical computer is avoided if the optical sensor outputs can be directly processed by the optical computer.

  9. Fast calculation of the `ILC norm' in iterative learning control

    NASA Astrophysics Data System (ADS)

    Rice, Justin K.; van Wingerden, Jan-Willem

    2013-06-01

    In this paper, we discuss and demonstrate a method for the exploitation of matrix structure in computations for iterative learning control (ILC). In Barton, Bristow, and Alleyne [International Journal of Control, 83(2), 1-8 (2010)], a special insight into the structure of the lifted convolution matrices involved in ILC is used along with a modified Lanczos method to achieve very fast computational bounds on the learning convergence, by calculating the 'ILC norm' in ? computational complexity. In this paper, we show how their method is equivalent to a special instance of the sequentially semi-separable (SSS) matrix arithmetic, and thus can be extended to many other computations in ILC, and specialised in some cases to even faster methods. Our SSS-based methodology will be demonstrated on two examples: a linear time-varying example resulting in the same ? complexity as in Barton et al., and a linear time-invariant example where our approach reduces the computational complexity to ?, thus decreasing the computation time for an example from the literature by a factor of almost 100. This improvement is achieved by transforming the norm computation via a linear matrix inequality into a check of positive definiteness, which allows us to further exploit the almost-Toeplitz properties of the matrix, and additionally provides explicit upper and lower bounds on the norm of the matrix, instead of the indirect Ritz estimate. These methods are now implemented in a MATLAB toolbox, freely available on the Internet.
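
    A sketch of the kind of structure exploitation involved, for the LTI case (a generic FFT-plus-power-iteration method in the same spirit; not the SSS arithmetic or the toolbox from the paper): the lifted ILC matrix is lower-triangular Toeplitz, so its action on a vector costs O(N log N) instead of O(N^2), and its norm follows from a few such products.

      # Spectral norm of a lifted (lower-triangular Toeplitz) ILC matrix
      # without forming it: FFT matvecs + power iteration.
      import numpy as np

      def conv_matvec(h, x):
          """y = P x, P the N x N lower-triangular Toeplitz matrix whose
          first column is the impulse response h."""
          N = len(x)
          return np.fft.irfft(np.fft.rfft(h, 2 * N) *
                              np.fft.rfft(x, 2 * N), 2 * N)[:N]

      def conv_rmatvec(h, y):
          """x = P^T y (correlation instead of convolution)."""
          N = len(y)
          return np.fft.irfft(np.fft.rfft(h, 2 * N).conj() *
                              np.fft.rfft(y, 2 * N), 2 * N)[:N]

      def ilc_norm(h, N, iters=200):
          """||P||_2 by power iteration on P^T P, O(N log N) per step."""
          x = np.random.default_rng(0).standard_normal(N)
          x /= np.linalg.norm(x)
          for _ in range(iters):
              x = conv_rmatvec(h, conv_matvec(h, x))
              x /= np.linalg.norm(x)
          return np.linalg.norm(conv_matvec(h, x))

      N = 512
      h = 0.9 ** np.arange(N)          # impulse response of a stable system
      print(ilc_norm(h, N))            # agrees with the dense check below
      P = np.array([[h[i - j] if i >= j else 0.0 for j in range(N)]
                    for i in range(N)])
      print(np.linalg.norm(P, 2))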

  10. A Logically Centralized Approach for Control and Management of Large Computer Networks

    ERIC Educational Resources Information Center

    Iqbal, Hammad A.

    2012-01-01

    Management of large enterprise and Internet service provider networks is a complex, error-prone, and costly challenge. It is widely accepted that the key contributors to this complexity are the bundling of control and data forwarding in traditional routers and the use of fully distributed protocols for network control. To address these…

  11. Computationally efficient algorithm for high sampling-frequency operation of active noise control

    NASA Astrophysics Data System (ADS)

    Rout, Nirmal Kumar; Das, Debi Prasad; Panda, Ganapati

    2015-05-01

    In high sampling-frequency operation of an active noise control (ANC) system, the secondary path estimate and the ANC filter are very long. This increases the computational complexity of the conventional filtered-x least mean square (FXLMS) algorithm. To reduce the computational complexity of long-order ANC systems using the FXLMS algorithm, frequency domain block ANC algorithms have been proposed in the past. These full block frequency domain ANC algorithms are associated with some disadvantages, such as large block delay, quantization error due to the computation of large-size transforms, and implementation difficulties in existing low-end DSP hardware. To overcome these shortcomings, a partitioned block ANC algorithm is newly proposed, where the long filters in ANC are divided into a number of equal partitions and suitably assembled to perform the FXLMS algorithm in the frequency domain. The complexity of this proposed frequency domain partitioned block FXLMS (FPBFXLMS) algorithm is much reduced compared to the conventional FXLMS algorithm. It is further reduced by merging one fast Fourier transform (FFT)-inverse fast Fourier transform (IFFT) combination to derive the reduced-structure FPBFXLMS (RFPBFXLMS) algorithm. Computational complexity analysis for different filter orders and partition sizes is presented. Systematic computer simulations are carried out for both proposed partitioned block ANC algorithms to show their accuracy compared to the time domain FXLMS algorithm.
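
    For orientation, a minimal time-domain FXLMS loop is sketched below (toy paths and signals; this is the O(L)-per-sample baseline whose cost the partitioned block algorithm reduces, not the paper's frequency domain implementation):

      # Time-domain FXLMS baseline for ANC (toy paths and signals).
      import numpy as np

      rng = np.random.default_rng(0)
      n = 20000
      x = rng.standard_normal(n)            # reference noise
      p = rng.standard_normal(64) * 0.1     # primary path (unknown to ANC)
      s = np.array([0.0, 0.5, 0.3, 0.1])    # secondary path
      s_hat = s.copy()                      # assume a perfect estimate
      d = np.convolve(x, p)[:n]             # disturbance at the error mic

      L = 64                                # ANC filter length
      w, xbuf, fxbuf = np.zeros(L), np.zeros(L), np.zeros(L)
      ybuf = np.zeros(len(s))               # anti-noise history
      mu, err = 1e-3, np.zeros(n)

      for k in range(n):
          xbuf = np.roll(xbuf, 1); xbuf[0] = x[k]
          y = w @ xbuf                      # anti-noise sample
          ybuf = np.roll(ybuf, 1); ybuf[0] = y
          e = d[k] - s @ ybuf               # residual at the error mic
          fx = s_hat @ xbuf[:len(s_hat)]    # filtered-reference sample
          fxbuf = np.roll(fxbuf, 1); fxbuf[0] = fx
          w += mu * e * fxbuf               # FXLMS weight update
          err[k] = e

      print("initial MSE:", np.mean(err[:1000] ** 2))
      print("final   MSE:", np.mean(err[-1000:] ** 2))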

  12. Complete LabVIEW-Controlled HPLC Lab: An Advanced Undergraduate Experience

    ERIC Educational Resources Information Center

    Beussman, Douglas J.; Walters, John P.

    2017-01-01

    Virtually all modern chemical instrumentation is controlled by computers. While software packages are continually becoming easier to use, allowing for more researchers to utilize more complex instruments, conveying some level of understanding as to how computers and instruments communicate is still an important part of the undergraduate…

  13. Complex Systems Simulation and Optimization | Computational Science | NREL

    Science.gov Websites

    Stochastic Optimization and Control: formulation and implementation of advanced optimization and control methods that take uncertainty into account. Contact: Wesley Jones, Group Manager, Complex Systems Simulation and Optimization.

  14. Secure Data Access Control for Fog Computing Based on Multi-Authority Attribute-Based Signcryption with Computation Outsourcing and Attribute Revocation.

    PubMed

    Xu, Qian; Tan, Chengxiang; Fan, Zhijie; Zhu, Wenye; Xiao, Ya; Cheng, Fujia

    2018-05-17

    Nowadays, fog computing provides computation, storage, and application services to end users in the Internet of Things. One of the major concerns in fog computing systems is how fine-grained access control can be imposed. As a logical combination of attribute-based encryption and attribute-based signature, Attribute-based Signcryption (ABSC) can provide confidentiality and anonymous authentication for sensitive data and is more efficient than the traditional "encrypt-then-sign" or "sign-then-encrypt" strategies. Thus, ABSC is suitable for fine-grained access control in a semi-trusted cloud environment and is gaining more and more attention recently. However, in many existing ABSC systems, the computation cost required for the end users in signcryption and designcryption grows linearly with the complexity of the signing and encryption access policies. Moreover, only a single authority that is responsible for attribute management and key generation exists in previously proposed ABSC schemes, whereas in reality, mostly, different authorities monitor different attributes of the user. In this paper, we propose OMDAC-ABSC, a novel data access control scheme based on Ciphertext-Policy ABSC, to provide data confidentiality, fine-grained control, and anonymous authentication in a multi-authority fog computing system. The signcryption and designcryption overhead for the user is significantly reduced by outsourcing the undesirable computation operations to fog nodes. The proposed scheme is proven to be secure in the standard model and can provide attribute revocation and public verifiability. The security analysis, asymptotic complexity comparison, and implementation results indicate that our construction can balance the security goals with practical efficiency in computation.

  15. Concept of a Cloud Service for Data Preparation and Computational Control on Custom HPC Systems in Application to Molecular Dynamics

    NASA Astrophysics Data System (ADS)

    Puzyrkov, Dmitry; Polyakov, Sergey; Podryga, Viktoriia; Markizov, Sergey

    2018-02-01

    At the present stage of computer technology development it is possible to study the properties and processes in complex systems at the molecular and even atomic levels, for example, by means of molecular dynamics methods. The most interesting problems are related to the study of complex processes under real physical conditions. Solving such problems requires the use of high-performance computing systems of various types, for example, GRID systems and HPC clusters. Given how time consuming these computational tasks are, the need arises for software for automatic and unified monitoring of such computations. A complex computational task can be performed over different HPC systems. It requires output data synchronization between the storage chosen by a scientist and the HPC system used for computations. The design of the computational domain is also quite a problem. It requires complex software tools and algorithms for proper atomistic data generation on HPC systems. The paper describes the prototype of a cloud service intended for the design of large-volume atomistic systems for further detailed molecular dynamics calculations and for the computational management of these calculations, and presents the part of its concept aimed at initial data generation on the HPC systems.

  16. Geometry of Quantum Computation with Qudits

    PubMed Central

    Luo, Ming-Xing; Chen, Xiu-Bo; Yang, Yi-Xian; Wang, Xiaojun

    2014-01-01

    The circuit complexity of quantum qubit system evolution as a primitive problem in quantum computation has been discussed widely. We investigate this problem in terms of qudit systems. Using Riemannian geometry, the optimal quantum circuits are equivalent to geodesic evolutions in a specially curved parametrization of SU(d^n), and the quantum circuit complexity depends explicitly on a controllable approximation error bound. PMID:24509710

  17. Remote control system for high-performance computer simulation of crystal growth by the PFC method

    NASA Astrophysics Data System (ADS)

    Pavlyuk, Evgeny; Starodumov, Ilya; Osipov, Sergei

    2017-04-01

    Modeling of the crystallization process by the phase field crystal (PFC) method is one of the important directions of modern computational materials science. In this paper, the practical side of the computer simulation of the crystallization process by the PFC method is investigated. To solve problems using this method, it is necessary to use high-performance computing clusters, data storage systems, and other often expensive and complex computer systems. Access to such resources is often limited, unstable, and accompanied by various administrative problems. In addition, the variety of software and settings of different computing clusters sometimes does not allow researchers to use unified program code, so the code must be adapted to each configuration of the computer complex. The practical experience of the authors has shown that the creation of a special control system for computations, with the possibility of remote use, can greatly simplify the implementation of simulations and increase the productivity of scientific research. In the current paper we show the principal idea of such a system and justify its efficiency.

  18. Automation of multi-agent control for complex dynamic systems in heterogeneous computational network

    NASA Astrophysics Data System (ADS)

    Oparin, Gennady; Feoktistov, Alexander; Bogdanova, Vera; Sidorov, Ivan

    2017-01-01

    The rapid progress of high-performance computing entails new challenges related to solving large scientific problems for various subject domains in a heterogeneous distributed computing environment (e.g., a network, Grid system, or Cloud infrastructure). Specialists in the field of parallel and distributed computing give special attention to the scalability of applications for problem solving. Effective management of scalable applications in a heterogeneous distributed computing environment is still a non-trivial issue. Control systems that operate in networks especially relate to this issue. We propose a new approach to multi-agent management of scalable applications in a heterogeneous computational network. The fundamentals of our approach are the integrated use of conceptual programming, simulation modeling, network monitoring, multi-agent management, and service-oriented programming. We developed a special framework for automation of problem solving. Advantages of the proposed approach are demonstrated on a parametric synthesis example: the static linear regulator for complex dynamic systems. Benefits of the scalable application for solving this problem include automation of multi-agent control for such systems in a parallel mode at various degrees of detailed elaboration.

  19. Cognitive engineering models: A prerequisite to the design of human-computer interaction in complex dynamic systems

    NASA Technical Reports Server (NTRS)

    Mitchell, Christine M.

    1993-01-01

    This chapter examines a class of human-computer interaction applications, specifically the design of human-computer interaction for the operators of complex systems. Such systems include space systems (e.g., manned systems such as the Shuttle or space station, and unmanned systems such as NASA scientific satellites), aviation systems (e.g., the flight deck of 'glass cockpit' airplanes or air traffic control) and industrial systems (e.g., power plants, telephone networks, and sophisticated, e.g., 'lights out,' manufacturing facilities). The main body of human-computer interaction (HCI) research complements but does not directly address the primary issues involved in human-computer interaction design for operators of complex systems. Interfaces to complex systems are somewhat special. The 'user' in such systems - i.e., the human operator responsible for safe and effective system operation - is highly skilled, someone who in human-machine systems engineering is sometimes characterized as 'well trained, well motivated'. The 'job' or task context is paramount and, thus, human-computer interaction is subordinate to human job interaction. The design of human interaction with complex systems, i.e., the design of human job interaction, is sometimes called cognitive engineering.

  20. Habitual control of goal selection in humans

    PubMed Central

    Cushman, Fiery; Morris, Adam

    2015-01-01

    Humans choose actions based on both habit and planning. Habitual control is computationally frugal but adapts slowly to novel circumstances, whereas planning is computationally expensive but can adapt swiftly. Current research emphasizes the competition between habits and plans for behavioral control, yet many complex tasks instead favor their integration. We consider a hierarchical architecture that exploits the computational efficiency of habitual control to select goals while preserving the flexibility of planning to achieve those goals. We formalize this mechanism in a reinforcement learning setting, illustrate its costs and benefits, and experimentally demonstrate its spontaneous application in a sequential decision-making task. PMID:26460050
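
    A schematic sketch of the gist (tiny grid world with hypothetical rewards; the paper formalizes this in a richer reinforcement-learning setting): cached, slowly updated values choose which goal to pursue, while an explicit planner computes how to reach it.

      # Habit (cached goal values) selects the goal; planning (BFS over a
      # known grid) computes the route. Hypothetical task and parameters.
      import random
      from collections import deque

      SIZE = 5
      GOALS = {(0, 4): 1.0, (4, 4): 0.3}        # two candidate goals
      Q = {g: 0.0 for g in GOALS}               # habitual value of each goal
      alpha = 0.2

      def plan(start, goal):
          """Model-based part: shortest path by breadth-first search."""
          frontier, parent = deque([start]), {start: None}
          while frontier:
              s = frontier.popleft()
              if s == goal:
                  path = [s]
                  while parent[path[-1]] is not None:
                      path.append(parent[path[-1]])
                  return path[::-1]
              for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                  t = (s[0] + dx, s[1] + dy)
                  if 0 <= t[0] < SIZE and 0 <= t[1] < SIZE and t not in parent:
                      parent[t] = s
                      frontier.append(t)

      random.seed(0)
      for episode in range(200):
          # Habitual part: epsilon-greedy choice *of a goal*.
          goal = (random.choice(list(GOALS)) if random.random() < 0.1
                  else max(Q, key=Q.get))
          route = plan((0, 0), goal)            # flexible means to a cached end
          reward = GOALS[goal] + random.gauss(0, 0.1)
          Q[goal] += alpha * (reward - Q[goal]) # slow, model-free update

      print({g: round(v, 2) for g, v in Q.items()})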

  1. Robust scalable stabilisability conditions for large-scale heterogeneous multi-agent systems with uncertain nonlinear interactions: towards a distributed computing architecture

    NASA Astrophysics Data System (ADS)

    Manfredi, Sabato

    2016-06-01

    Large-scale dynamic systems are becoming highly pervasive, with applications ranging from systems biology and environment monitoring to sensor networks and power systems. They are characterised by high dimensionality, complexity, and uncertainty in the node dynamics/interactions, and they require increasingly computationally demanding methods for their analysis and control design as the network size and node system/interaction complexity increase. Therefore, it is a challenging problem to find scalable computational methods for the distributed control design of large-scale networks. In this paper, we investigate the robust distributed stabilisation problem of large-scale nonlinear multi-agent systems (briefly, MASs) composed of non-identical (heterogeneous) linear dynamical systems coupled by uncertain nonlinear time-varying interconnections. By employing Lyapunov stability theory and the linear matrix inequality (LMI) technique, new conditions are given for the distributed control design of large-scale MASs that can be easily solved with the MATLAB toolbox. The stabilisability of each node dynamic is a sufficient assumption to design a globally stabilising distributed control. The proposed approach improves some of the existing LMI-based results on MASs by both overcoming their computational limits and extending the applicative scenario to large-scale nonlinear heterogeneous MASs. Additionally, the proposed LMI conditions are further reduced in terms of computational requirements in the case of weakly heterogeneous MASs, which is a common scenario in real applications where the network nodes and links are affected by parameter uncertainties. One of the main advantages of the proposed approach is to allow a move from a centralised towards a distributed computing architecture, so that the expensive computational workload spent solving LMIs may be shared among processors located at the networked nodes, thus increasing the scalability of the approach with the network size. Finally, a numerical example shows the applicability of the proposed method and its advantage in terms of computational complexity when compared with existing approaches.
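
    For context, the elementary building block that such conditions generalize (a textbook statement, not the paper's distributed conditions): a node dynamic \dot{x} = Ax + Bu admits a stabilising state feedback u = Kx if and only if the following LMI in the variables Y and W is feasible, and feasibility problems of exactly this form are what LMI solvers such as the MATLAB toolbox handle.

      \exists\, Y = Y^{\mathsf{T}} \succ 0,\; W : \qquad
      A Y + Y A^{\mathsf{T}} + B W + W^{\mathsf{T}} B^{\mathsf{T}} \prec 0,
      \qquad K = W Y^{-1}

    The certifying Lyapunov function is V(x) = x^{\mathsf{T}} Y^{-1} x.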

  2. Secure Data Access Control for Fog Computing Based on Multi-Authority Attribute-Based Signcryption with Computation Outsourcing and Attribute Revocation

    PubMed Central

    Xu, Qian; Tan, Chengxiang; Fan, Zhijie; Zhu, Wenye; Xiao, Ya; Cheng, Fujia

    2018-01-01

    Nowadays, fog computing provides computation, storage, and application services to end users in the Internet of Things. One of the major concerns in fog computing systems is how fine-grained access control can be imposed. As a logical combination of attribute-based encryption and attribute-based signature, Attribute-based Signcryption (ABSC) can provide confidentiality and anonymous authentication for sensitive data and is more efficient than the traditional “encrypt-then-sign” or “sign-then-encrypt” strategies. Thus, ABSC is suitable for fine-grained access control in a semi-trusted cloud environment and is gaining more and more attention recently. However, in many existing ABSC systems, the computation cost required for the end users in signcryption and designcryption grows linearly with the complexity of the signing and encryption access policies. Moreover, only a single authority that is responsible for attribute management and key generation exists in previously proposed ABSC schemes, whereas in reality, mostly, different authorities monitor different attributes of the user. In this paper, we propose OMDAC-ABSC, a novel data access control scheme based on Ciphertext-Policy ABSC, to provide data confidentiality, fine-grained control, and anonymous authentication in a multi-authority fog computing system. The signcryption and designcryption overhead for the user is significantly reduced by outsourcing the undesirable computation operations to fog nodes. The proposed scheme is proven to be secure in the standard model and can provide attribute revocation and public verifiability. The security analysis, asymptotic complexity comparison, and implementation results indicate that our construction can balance the security goals with practical efficiency in computation. PMID:29772840

  3. Scalable quantum computation scheme based on quantum-actuated nuclear-spin decoherence-free qubits

    NASA Astrophysics Data System (ADS)

    Dong, Lihong; Rong, Xing; Geng, Jianpei; Shi, Fazhan; Li, Zhaokai; Duan, Changkui; Du, Jiangfeng

    2017-11-01

    We propose a novel theoretical scheme of quantum computation. Nuclear spin pairs are utilized to encode decoherence-free (DF) qubits. A nitrogen-vacancy center serves as a quantum actuator to initialize, read out, and control the DF qubits. The realization of CNOT gates between two DF qubits is also presented. Numerical simulations show high fidelities for all these processes. Additionally, we discuss the potential for scalability. Our scheme reduces the challenge of classical interfaces from controlling and observing complex quantum systems down to a simple quantum actuator. It also provides a novel way to handle complex quantum systems.

  4. A synthetic computational environment: To control the spread of respiratory infections in a virtual university

    NASA Astrophysics Data System (ADS)

    Ge, Yuanzheng; Chen, Bin; liu, Liang; Qiu, Xiaogang; Song, Hongbin; Wang, Yong

    2018-02-01

    An individual-based computational environment provides an effective way to study complex social events by reconstructing scenarios. Challenges remain in reconstructing the virtual scenarios and reproducing their complex evolution. In this paper, we propose a framework to reconstruct a synthetic computational environment, reproduce an epidemic outbreak, and evaluate management interventions in a virtual university. The reconstructed computational environment includes 4 fundamental components: the synthetic population, behavior algorithms, multiple social networks, and the geographic campus environment. In the virtual university, influenza H1N1 transmission experiments are conducted, and gradually enhanced interventions are evaluated and compared quantitatively. The experimental results indicate that the reconstructed virtual environment provides a solution for reproducing complex emergencies and evaluating policies to be executed in the real world.

  5. Implications of Windowing Techniques for CAI.

    ERIC Educational Resources Information Center

    Heines, Jesse M.; Grinstein, Georges G.

    This paper discusses the use of a technique called windowing in computer assisted instruction to allow independent control of functional areas in complex CAI displays and simultaneous display of output from a running computer program and coordinated instructional material. Two obstacles to widespread use of CAI in computer science courses are…

  6. Experiments in cooperative-arm object manipulation with a two-armed free-flying robot. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Koningstein, Ross

    1990-01-01

    Developing computed-torque controllers for complex manipulator systems using current techniques and tools is difficult because those tools address the issues pertinent to simulation, as opposed to control. A new formulation of computed-torque (CT) control that leads to an automated computed-torque robot controller program is presented. This automated tool is used for simulations and experimental demonstrations of endpoint and object control from a free-flying robot. The new computed-torque formulation states the multibody control problem in an elegant, homogeneous, and practical form. A recursive dynamics algorithm is presented that numerically evaluates kinematics and dynamics terms for multibody systems given a topological description. Manipulators may be free-flying and may have closed-chain constraints. With the exception of object squeeze-force control, the algorithm does not deal with actuator redundancy. The algorithm is used to implement an automated 2D computed-torque dynamics and control package that allows joint, endpoint, orientation, momentum, and object squeeze-force control. This package obviates the need for hand-derivation of kinematics and dynamics, and is used for both simulation and experimental control. Endpoint control experiments are performed on a laboratory robot that has two arms to manipulate payloads and uses an air bearing to achieve very low drag characteristics. Simulations and experimental data for endpoint and object controllers are presented for the experimental robot, a complex dynamic system. There is a rather wide set of conditions under which CT endpoint controllers can neglect robot base accelerations (but not motions) and achieve performance comparable to controllers that include base accelerations in the model. The regime over which this simplification holds is explored by simulation and experiment.

  7. Reward-Modulated Hebbian Plasticity as Leverage for Partially Embodied Control in Compliant Robotics

    PubMed Central

    Burms, Jeroen; Caluwaerts, Ken; Dambre, Joni

    2015-01-01

    In embodied computation (or morphological computation), part of the complexity of motor control is offloaded to the body dynamics. We demonstrate that a simple Hebbian-like learning rule can be used to train systems with (partial) embodiment, and can be extended outside of the scope of traditional neural networks. To this end, we apply the learning rule to optimize the connection weights of recurrent neural networks with different topologies and for various tasks. We then apply this learning rule to a simulated compliant tensegrity robot by optimizing static feedback controllers that directly exploit the dynamics of the robot body. This leads to partially embodied controllers, i.e., hybrid controllers that naturally integrate the computations that are performed by the robot body into a neural network architecture. Our results demonstrate the universal applicability of reward-modulated Hebbian learning. Furthermore, they demonstrate the robustness of systems trained with the learning rule. This study strengthens our belief that compliant robots should or can be seen as computational units, instead of dumb hardware that needs a complex controller. This link between compliant robotics and neural networks is also the main reason for our search for simple universal learning rules for both neural networks and robotics. PMID:26347645
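
    The rule itself is compact; a minimal sketch on a toy regression task (hypothetical constants, not the tensegrity experiments): weight changes are the product of presynaptic activity and exploratory output noise, gated by how the obtained reward compares with its running expectation.

      # Reward-modulated Hebbian learning on a toy task (hypothetical
      # constants). Update: dw = eta * (reward - baseline) * pre * noise.
      import numpy as np

      rng = np.random.default_rng(0)
      w = np.zeros(3)
      w_target = np.array([0.5, -0.3, 0.8])   # unknown mapping to learn
      eta, sigma, r_baseline = 0.05, 0.1, 0.0

      for step in range(5000):
          x = rng.standard_normal(3)           # presynaptic activity
          noise = sigma * rng.standard_normal()
          y = w @ x + noise                    # explored postsynaptic output
          reward = -(y - w_target @ x) ** 2    # task reward: negative error
          w += eta * (reward - r_baseline) * x * noise   # gated Hebbian step
          r_baseline += 0.01 * (reward - r_baseline)     # expected reward

      print(np.round(w, 2), "target:", w_target)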

  8. A computer program to trace seismic ray distribution in complex two-dimensional geological models

    USGS Publications Warehouse

    Yacoub, Nazieh K.; Scott, James H.

    1970-01-01

    A computer program has been developed to trace seismic rays and their amplitudes and energies through complex two-dimensional geological models, for which boundaries between elastic units are defined by a series of digitized X-, Y-coordinate values. Input data for the program include problem identification, control parameters, model coordinates, and elastic parameters for the elastic units. The program evaluates the partitioning of ray amplitude and energy at elastic boundaries and computes the total travel time, total travel distance, and other parameters for rays arriving at the earth's surface. Instructions are given for punching program control cards and data cards, and for arranging input card decks. An example of printer output for a simple problem is presented. The program is written in the FORTRAN IV language; its listing is shown in the Appendix, with an example output from a CDC-6600 computer.

  9. An efficient formulation of robot arm dynamics for control and computer simulation

    NASA Astrophysics Data System (ADS)

    Lee, C. S. G.; Nigam, R.

    This paper describes an efficient formulation of the dynamic equations of motion of industrial robots based on the Lagrange formulation of d'Alembert's principle. This formulation, as applied to a PUMA robot arm, results in a set of closed form second order differential equations with cross product terms. They are not as efficient in computation as those formulated by the Newton-Euler method, but provide a better analytical model for control analysis and computer simulation. Computational complexities of this dynamic model together with other models are tabulated for discussion.
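
    The closed form referred to has the standard manipulator structure (a generic statement; the PUMA-specific coefficient expressions derived in the paper are not reproduced here):

      \tau = M(q)\, \ddot{q} + C(q, \dot{q})\, \dot{q} + g(q)

    Here q is the vector of joint variables, M(q) the configuration-dependent inertia matrix, C(q, \dot{q})\dot{q} the Coriolis and centrifugal contribution (the cross-product terms \dot{q}_i \dot{q}_j mentioned above), and g(q) the gravity loading; control analysis and simulation evaluate this model to obtain the joint torques \tau along a desired trajectory.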

  10. Control of complex physically simulated robot groups

    NASA Astrophysics Data System (ADS)

    Brogan, David C.

    2001-10-01

    Actuated systems such as robots take many forms and sizes but each requires solving the difficult task of utilizing available control inputs to accomplish desired system performance. Coordinated groups of robots provide the opportunity to accomplish more complex tasks, to adapt to changing environmental conditions, and to survive individual failures. Similarly, groups of simulated robots, represented as graphical characters, can test the design of experimental scenarios and provide autonomous interactive counterparts for video games. The complexity of writing control algorithms for these groups currently hinders their use. A combination of biologically inspired heuristics, search strategies, and optimization techniques serve to reduce the complexity of controlling these real and simulated characters and to provide computationally feasible solutions.

  11. Sensitivity analysis of dynamic biological systems with time-delays.

    PubMed

    Wu, Wu Hsiung; Wang, Feng Sheng; Chang, Maw Shang

    2010-10-15

    Mathematical modeling has been applied to the study and analysis of complex biological systems for a long time. Some processes in biological systems, such as gene expression and feedback control in signal transduction networks, involve a time delay. These systems are represented as delay differential equation (DDE) models. Numerical sensitivity analysis of a DDE model by the direct method requires the solutions of model and sensitivity equations with time-delays. The major effort is the computation of the Jacobian matrix when computing the solution of the sensitivity equations. The computation of partial derivatives of complex equations, either by the analytic method or by symbolic manipulation, is time consuming, inconvenient, and prone to human error. To address this problem, an automatic approach to obtain the derivatives of complex functions efficiently and accurately is necessary. We have proposed an efficient algorithm with adaptive step size control to compute the solution and dynamic sensitivities of biological systems described by ordinary differential equations (ODEs). The adaptive direct-decoupled algorithm is extended here to compute the solution and dynamic sensitivities of time-delay systems described by DDEs. To save human effort and avoid human errors in the computation of partial derivatives, an automatic differentiation technique is embedded in the extended algorithm to evaluate the Jacobian matrix. The extended algorithm is implemented and applied to two realistic models with time-delays: the cardiovascular control system and the TNF-α signal transduction network. The results show that the extended algorithm is a good tool for dynamic sensitivity analysis on DDE models with less user intervention. A theoretical comparison with direct-coupled methods shows that the extended algorithm is efficient, accurate, and easy for end users without a programming background to apply to dynamic sensitivity analysis of complex biological systems with time-delays.
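
    A compact illustration of the quantity being computed (crude central-difference sensitivity of a delayed logistic model with a fixed-step history buffer; the paper's direct method with automatic differentiation is more accurate and efficient than this):

      # Finite-difference dynamic sensitivity dx/dr for the delayed
      # logistic DDE  x'(t) = r x(t) (1 - x(t - tau)).
      def solve_dde(r, tau=1.0, x0=0.5, T=10.0, h=0.001):
          """Explicit Euler with a buffer holding the delayed history."""
          lag = int(round(tau / h))
          hist = [x0] * (lag + 1)       # constant history for t <= 0
          x = x0
          for _ in range(int(T / h)):
              x_delayed = hist.pop(0)   # value from time t - tau
              x += h * r * x * (1.0 - x_delayed)
              hist.append(x)
          return x

      r, dr = 1.4, 1e-6
      sens = (solve_dde(r + dr) - solve_dde(r - dr)) / (2 * dr)
      print(f"x(T) = {solve_dde(r):.5f}, dx(T)/dr ~ {sens:.5f}")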

  12. The hierarchical expert tuning of PID controllers using tools of soft computing.

    PubMed

    Karray, F; Gueaieb, W; Al-Sharhan, S

    2002-01-01

    We present soft computing-based results pertaining to the hierarchical tuning process of PID controllers located within the control loop of a class of nonlinear systems. The results are compared with PID controllers implemented either in a stand-alone scheme or as part of a conventional gain-scheduling structure. This work is motivated by the increasing need in industry to design highly reliable and efficient controllers for dealing with the regulation and tracking capabilities of complex processes characterized by nonlinearities and possibly time-varying parameters. The soft computing-based controllers proposed are hybrid in nature in that they integrate, within a well-defined hierarchical structure, the benefits of hard algorithmic controllers with those having supervisory capabilities. The controllers proposed also have the distinct features of learning and auto-tuning without the need for tedious and computationally extensive online system identification schemes.
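
    A minimal sketch of the supervisory idea (crisp threshold rules standing in for the fuzzy inference layer; toy first-order plant and hypothetical gains, not the paper's hybrid architecture): an outer expert layer periodically retunes the PID gains according to the observed error behaviour.

      # Hierarchical supervised PID tuning: an outer rule layer (crisp
      # stand-ins for fuzzy rules) adjusts gains around an inner PID loop.
      def simulate(supervised, T=2000, dt=0.01):
          kp, ki, kd = 1.0, 0.1, 0.05
          y, integ, prev_e = 0.0, 0.0, 1.0
          for k in range(T):
              e = 1.0 - y                       # setpoint is 1.0
              if supervised and k % 100 == 0:   # supervisor runs slowly
                  if abs(e) > 0.5:
                      kp *= 1.2                 # rule 1: big error, act harder
                  elif abs(e - prev_e) < 1e-4 and abs(e) > 0.02:
                      ki *= 1.5                 # rule 2: stuck offset, more I
              integ += e * dt
              u = kp * e + ki * integ + kd * (e - prev_e) / dt
              prev_e = e
              # first-order nonlinear plant: gain shrinks at high output
              y += dt * (-y + u / (1.0 + 0.5 * y * y))
          return abs(1.0 - y)

      print("fixed PID  final error:", round(simulate(False), 4))
      print("supervised final error:", round(simulate(True), 4))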

  13. Computer assisted thermal-vacuum testing

    NASA Technical Reports Server (NTRS)

    Petrie, W.; Mikk, G.

    1977-01-01

    In testing complex systems and components under dynamic thermal-vacuum environments, it is desirable to optimize the environment control sequence in order to reduce test duration and cost. This paper describes an approach in which a computer is utilized as part of the test control operation. Real-time test data are made available to the computer through time-sharing terminals at appropriate intervals. A mathematical model of the test article and environmental control equipment then operates on the real-time data to yield current thermal status, temperature analysis, trend prediction, and recommended thermal control setting changes to arrive at the required thermal condition. The data acquisition interface and the time-sharing hook-up to an IBM-370 computer are described, along with a typical control program and data demonstrating its use.

  14. Fuzzy logic based robotic controller

    NASA Technical Reports Server (NTRS)

    Attia, F.; Upadhyaya, M.

    1994-01-01

    Existing Proportional-Integral-Derivative (PID) robotic controllers rely on an inverse kinematic model to convert user-specified Cartesian trajectory coordinates to joint variables. These joints experience friction, stiction, and gear backlash effects. Due to the lack of proper linearization of these effects, modern control theory based on state-space methods cannot provide adequate control for robotic systems. In the presence of loads, the dynamic behavior of robotic systems is complex and nonlinear, especially where mathematical models must be evaluated in real time. Fuzzy Logic Control is a fast-emerging alternative to conventional control systems in situations where it may not be feasible to formulate an analytical model of the complex system. Fuzzy logic techniques track a user-defined trajectory without requiring the host computer to explicitly solve the nonlinear inverse kinematic equations. The goal is to provide a rule-based approach that is closer to human reasoning. The approach expresses end-point error, location of manipulator joints, and proximity to obstacles as fuzzy variables. The resulting decisions are based upon linguistic and non-numerical information. This paper presents an alternative to the conventional robot controller that is independent of computationally intensive kinematic equations. Computer simulation results of this approach, as obtained from a software implementation, are also discussed.
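
    A minimal sketch of the kind of rule-based inference involved, assuming triangular membership functions and a three-rule base for a single command variable; the shapes and rules are illustrative, not the controller described in the paper.

      # Mamdani-style fuzzy inference for one command variable.
      def tri(x, a, b, c):
          """Triangular membership function peaking at b."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x < b else (c - x) / (c - b)

      def fuzzy_command(error):
          # Fuzzify end-point error into three linguistic sets.
          neg = tri(error, -2.0, -1.0, 0.0)
          zero = tri(error, -1.0, 0.0, 1.0)
          pos = tri(error, 0.0, 1.0, 2.0)
          # Rules: NEG error -> command -1, ZERO -> 0, POS -> +1;
          # defuzzify with a weighted (centroid-like) average.
          w = neg + zero + pos
          return (neg * -1.0 + zero * 0.0 + pos * 1.0) / w if w else 0.0

      print(fuzzy_command(0.4))  # blends the ZERO and POS rules -> 0.4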

  15. Analysis of Selected Enhancements to the En Route Central Computing Complex

    DOT National Transportation Integrated Search

    1981-09-01

    This report analyzes selected hardware enhancements that could improve the performance of the 9020 computer systems, which are used to provide en route air traffic control services. These enhancements could be implemented quickly, would be relatively...

  16. Algorithm For Optimal Control Of Large Structures

    NASA Technical Reports Server (NTRS)

    Salama, Moktar A.; Garba, John A..; Utku, Senol

    1989-01-01

    Cost of computation appears competitive with other methods. Problem to compute optimal control of forced response of structure with n degrees of freedom identified in terms of smaller number, r, of vibrational modes. Article begins with Hamilton-Jacobi formulation of mechanics and use of quadratic cost functional. Complexity reduced by alternative approach in which quadratic cost functional expressed in terms of control variables only. Leads to iterative solution of second-order time-integral matrix Volterra equation of second kind containing optimal control vector. Cost of algorithm, measured in terms of number of computations required, is of order of, or less than, cost of prior algorithms applied to similar problems.

  17. Development of Onboard Computer Complex for Russian Segment of ISS

    NASA Technical Reports Server (NTRS)

    Branets, V.; Brand, G.; Vlasov, R.; Graf, I.; Clubb, J.; Mikrin, E.; Samitov, R.

    1998-01-01

    This report presents a description of the Onboard Computer Complex (CC) that was developed during the period 1994-1998 for the Russian Segment of the ISS. The system was developed in cooperation with NASA and ESA. ESA developed a new computation system under the RSC Energia Technical Assignment, called DMS-R. The CC also includes elements developed by Russian experts and organizations. A general architecture of the computer system and the characteristics of the primary elements of this system are described. The system was integrated at RSC Energia with the participation of American and European specialists. The report contains information on the software simulators and the verification and debugging facilities which were developed for both stand-alone and integrated tests and verification. This CC serves as the basis for the Russian Segment Onboard Control Complex on the ISS.

  18. Novel Image Quality Control Systems(Add-On). Innovative Computational Methods for Inverse Problems in Optical and SAR Imaging

    DTIC Science & Technology

    2007-02-28

    Z. Mu, R. Plemmons, and P. Santago, "Iterative Ultrasonic Signal and Image Deconvolution for Estimation of the Complex Medium Response," International Journal of Imaging Systems and Technology, 1767-1782, 2006. The effort comprised rigorous mathematical and computational research on inverse problems in optical imaging of direct interest to the Army and also the intelligence agencies.

  19. VIEW OF COMPUTER/DATA COLLECTION AREA, SOUTH OF FIRING ROOM NO. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    VIEW OF COMPUTER/DATA COLLECTION AREA, SOUTH OF FIRING ROOM NO. 3, FACING SOUTHEAST - Cape Canaveral Air Force Station, Launch Complex 39, Launch Control Center, LCC Road, East of Kennedy Parkway North, Cape Canaveral, Brevard County, FL

  20. Fast H.264/AVC FRExt intra coding using belief propagation.

    PubMed

    Milani, Simone

    2011-01-01

    In the H.264/AVC FRExt coder, the coding performance of Intra coding significantly surpasses that of previous still-image coding standards, such as JPEG2000, thanks to a massive use of spatial prediction. Unfortunately, the adoption of an extensive set of predictors induces a significant increase in the computational complexity required by the rate-distortion optimization routine. The paper presents a complexity reduction strategy that aims at reducing the computational load of Intra coding with a small loss in compression performance. The proposed algorithm relies on selecting a reduced set of prediction modes according to their probabilities, which are estimated by adopting a belief-propagation procedure. Experimental results show that the proposed method permits saving up to 60% of the coding time required by an exhaustive rate-distortion optimization method, with a negligible loss in performance. Moreover, unlike other methods, where the computational complexity depends upon the coded sequence, it permits accurate control of the computational complexity.
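
    The selection strategy can be pictured with a toy Python sketch: prune to the most probable modes, then run the expensive rate-distortion comparison only on the survivors. The mode names, probabilities, and costs below are invented placeholders for the belief-propagation estimates and encoder internals; the `keep` parameter is what gives direct control over complexity.

      # Probability-driven mode pruning before rate-distortion optimization.
      def prune_and_search(mode_probs, rd_cost, keep=3):
          """Keep the `keep` most probable intra modes, then pick the RD-best one."""
          candidates = sorted(mode_probs, key=mode_probs.get, reverse=True)[:keep]
          return min(candidates, key=rd_cost)

      mode_probs = {"DC": 0.35, "vertical": 0.30, "horizontal": 0.20,
                    "diag_down_left": 0.10, "plane": 0.05}
      rd_cost = {"DC": 12.1, "vertical": 9.8, "horizontal": 10.4,
                 "diag_down_left": 8.9, "plane": 11.7}.get
      print(prune_and_search(mode_probs, rd_cost))  # "vertical"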

  1. Integration of symbolic and algorithmic hardware and software for the automation of space station subsystems

    NASA Technical Reports Server (NTRS)

    Gregg, Hugh; Healey, Kathleen; Hack, Edmund; Wong, Carla

    1987-01-01

    Expert systems that require access to databases, complex simulations, and real-time instrumentation have both symbolic and algorithmic computing needs. These needs could be met using either a general computing workstation running both symbolic and algorithmic code, or separate, specialized computers networked together. The latter approach was chosen to implement TEXSYS, the thermal expert system, developed to demonstrate the ability of an expert system to autonomously control the thermal control system of the space station. TEXSYS has been implemented on a Symbolics workstation and will be linked to a microVAX computer that will control a thermal test bed. Integration options are explored and several possible solutions are presented.

  2. Supplementary Computer Generated Cueing to Enhance Air Traffic Controller Efficiency

    DTIC Science & Technology

    2013-03-01

    ...assess the complexity of air traffic control (Mogford, Guttman, Morrow, & Kopardekar, 1995; Laudeman, Shelden, Branstrom, & Brasil, 1998)...

  3. Engineering and Design: Control Stations and Control Systems for Navigation Locks and Dams

    DTIC Science & Technology

    1997-05-30

    ...of human intelligence... hypothetical lock and dam configurations. Terminology: (1) PLC system. The computer-based systems utilize special... electrical industry for industrial use. Therefore, for purposes of this document, a computer-based system is referred to as a PLC system. (2) Relay-based... be custom made; because most of today's control systems of any complexity are PLC-based, the standard size of a given motor starter cubicle is not...

  4. An exploratory investigation of various assessment instruments as correlates of complex visual monitoring performance.

    DOT National Transportation Integrated Search

    1980-10-01

    The present study examined a variety of possible predictors of complex monitoring performance. The criterion task was designed to resemble that of a highly automated air traffic control radar system containing computer-generated alphanumeric displays...

  5. Controlling Light Transmission Through Highly Scattering Media Using Semi-Definite Programming as a Phase Retrieval Computation Method.

    PubMed

    N'Gom, Moussa; Lien, Miao-Bin; Estakhri, Nooshin M; Norris, Theodore B; Michielssen, Eric; Nadakuditi, Raj Rao

    2017-05-31

    Complex Semi-Definite Programming (SDP) is introduced as a novel approach to phase-retrieval-enabled control of monochromatic light transmission through highly scattering media. In a simple optical setup, a spatial light modulator is used to generate a random sequence of phase-modulated wavefronts, and the resulting intensity speckle patterns in the transmitted light are acquired on a camera. The SDP algorithm allows computation of the complex transmission matrix of the system from this sequence of intensity-only measurements, without need for a reference beam. Once the transmission matrix is determined, optimal wavefronts are computed that focus the incident beam to any position or sequence of positions on the far side of the scattering medium, without the need for any subsequent measurements or wavefront-shaping iterations. The number of measurements required and the degree of enhancement of the intensity at focus are determined by the number of pixels controlled by the spatial light modulator.

  6. Instrumentino: An Open-Source Software for Scientific Instruments.

    PubMed

    Koenka, Israel Joel; Sáiz, Jorge; Hauser, Peter C

    2015-01-01

    Scientists often need to build dedicated computer-controlled experimental systems. For this purpose, it is becoming common to employ open-source microcontroller platforms, such as the Arduino. These boards and associated integrated software development environments provide affordable yet powerful solutions for the implementation of hardware control of transducers and acquisition of signals from detectors and sensors. It is, however, a challenge to write programs that allow interactive use of such arrangements from a personal computer. This task is particularly complex if some of the included hardware components are connected directly to the computer and not via the microcontroller. A graphical user interface framework, Instrumentino, was therefore developed to allow the creation of control programs for complex systems with minimal programming effort. By writing a single code file, a powerful custom user interface is generated, which enables the automatic running of elaborate operation sequences and observation of acquired experimental data in real time. The framework, which is written in Python, allows extension by users, and is made available as an open source project.
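
    The serial link such a framework manages might look like the following pyserial sketch; the port name and the command string are hypothetical placeholders for whatever protocol the connected firmware actually speaks.

      # Minimal computer-to-microcontroller exchange over a serial port.
      import serial  # pip install pyserial

      with serial.Serial("/dev/ttyACM0", 9600, timeout=1) as ser:  # assumed port
          ser.write(b"SET_VOLTAGE 2.5\n")    # hypothetical firmware command
          reply = ser.readline().decode().strip()
          print("instrument replied:", reply)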

  7. Stream Processors

    NASA Astrophysics Data System (ADS)

    Erez, Mattan; Dally, William J.

    Stream processors, like other multicore architectures, partition their functional units and storage into multiple processing elements. In contrast to typical architectures, which contain symmetric general-purpose cores and a cache hierarchy, stream processors have a significantly leaner design. Stream processors are specifically designed for the stream execution model, in which applications have large amounts of explicit parallel computation, structured and predictable control, and memory accesses that can be performed at a coarse granularity. Applications in the streaming model are expressed in a gather-compute-scatter form, yielding programs with explicit control over transferring data to and from on-chip memory. Relying on these characteristics, which are common to many media processing and scientific computing applications, stream architectures redefine the boundary between software and hardware responsibilities, with software bearing much of the complexity required to manage concurrency, locality, and latency tolerance. Thus, stream processors have minimal control, consisting of fetching medium- and coarse-grained instructions and executing them directly on the many ALUs. Moreover, the on-chip storage hierarchy of stream processors is under explicit software control, as is all communication, eliminating the need for complex reactive hardware mechanisms.
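
    The gather-compute-scatter form is easy to picture in a few lines of Python with numpy; the indices and data are arbitrary examples.

      # Gather-compute-scatter, the structure stream programs make explicit.
      import numpy as np

      data = np.array([10.0, 20.0, 30.0, 40.0])
      gather_idx = np.array([0, 2, 3])    # gather: explicit coarse-grained reads
      stream = data[gather_idx]

      stream = stream * 2.0 + 1.0         # compute: purely local, parallel math

      out = np.zeros_like(data)
      np.add.at(out, gather_idx, stream)  # scatter: explicit writes back to memory
      print(out)                          # [21.  0. 61. 81.]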

  8. Strategic control in decision-making under uncertainty.

    PubMed

    Venkatraman, Vinod; Huettel, Scott A

    2012-04-01

    Complex economic decisions - whether investing money for retirement or purchasing some new electronic gadget - often involve uncertainty about the likely consequences of our choices. Critical for resolving that uncertainty are strategic meta-decision processes, which allow people to simplify complex decision problems, evaluate outcomes against a variety of contexts, and flexibly match behavior to changes in the environment. In recent years, substantial research has implicated the dorsomedial prefrontal cortex (dmPFC) in the flexible control of behavior. However, nearly all such evidence comes from paradigms involving executive function or response selection, not complex decision-making. Here, we review evidence that demonstrates that the dmPFC contributes to strategic control in complex decision-making. This region contains a functional topography such that the posterior dmPFC supports response-related control, whereas the anterior dmPFC supports strategic control. Activation in the anterior dmPFC signals changes in how a decision problem is represented, which in turn can shape computational processes elsewhere in the brain. Based on these findings, we argue for both generalized contributions of the dmPFC to cognitive control, and specific computational roles for its subregions depending upon the task demands and context. We also contend that these strategic considerations are likely to be critical for decision-making in other domains, including interpersonal interactions in social settings.

  9. Strategic Control in Decision Making under Uncertainty

    PubMed Central

    Venkatraman, Vinod; Huettel, Scott

    2012-01-01

    Complex economic decisions – whether investing money for retirement or purchasing some new electronic gadget – often involve uncertainty about the likely consequences of our choices. Critical for resolving that uncertainty are strategic meta-decision processes, which allow people to simplify complex decision problems, to evaluate outcomes against a variety of contexts, and to flexibly match behavior to changes in the environment. In recent years, substantial research implicates the dorsomedial prefrontal cortex (dmPFC) in the flexible control of behavior. However, nearly all such evidence comes from paradigms involving executive function or response selection, not complex decision making. Here, we review evidence that demonstrates that the dmPFC contributes to strategic control in complex decision making. This region contains a functional topography such that the posterior dmPFC supports response-related control while the anterior dmPFC supports strategic control. Activation in the anterior dmPFC signals changes in how a decision problem is represented, which in turn can shape computational processes elsewhere in the brain. Based on these findings, we argue both for generalized contributions of the dmPFC to cognitive control, and for specific computational roles for its subregions depending upon the task demands and context. We also contend that these strategic considerations are also likely to be critical for decision making in other domains, including interpersonal interactions in social settings. PMID:22487037

  10. Computing with dynamical systems based on insulator-metal-transition oscillators

    NASA Astrophysics Data System (ADS)

    Parihar, Abhinav; Shukla, Nikhil; Jerry, Matthew; Datta, Suman; Raychowdhury, Arijit

    2017-04-01

    In this paper, we review recent work on novel computing paradigms using coupled oscillatory dynamical systems. We explore systems of relaxation oscillators based on linear state-transitioning devices, which switch between two discrete states with hysteresis. By harnessing the dynamics of complex, connected systems, we embrace the philosophy of "let physics do the computing" and demonstrate how the complex phase and frequency dynamics of such systems can be controlled, programmed, and observed to solve computationally hard problems. Although our discussion in this paper is limited to insulator-to-metal state transition devices, the general philosophy of such computing paradigms can be translated to other media, including optical systems. We present the mathematical treatments necessary to understand the time evolution of these systems and demonstrate, through recent experimental results, the potential of such computational primitives.
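
    A toy simulation in the same "let physics do the computing" spirit is sketched below: a Kuramoto-style system of coupled phase oscillators, a simplified stand-in for the coupled relaxation oscillators of the paper, with an assumed all-to-all coupling graph.

      # Coupled phase oscillators relaxing toward a synchronized state.
      import numpy as np

      rng = np.random.default_rng(0)
      n, steps, dt, k = 8, 4000, 0.01, 1.5
      coupling = np.ones((n, n)) - np.eye(n)   # assumed all-to-all graph
      theta = rng.uniform(0, 2 * np.pi, n)

      for _ in range(steps):
          # Each oscillator is pulled toward the phases of its neighbours.
          dtheta = 1.0 + (k / n) * (
              coupling * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
          theta = (theta + dt * dtheta) % (2 * np.pi)

      # Attractive coupling drives the phases into a single cluster.
      print(np.round(np.sin(theta), 2))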

  11. BIO-Plex Information System Concept

    NASA Technical Reports Server (NTRS)

    Jones, Harry; Boulanger, Richard; Arnold, James O. (Technical Monitor)

    1999-01-01

    This paper describes a suggested design for an integrated information system for the proposed BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) at Johnson Space Center (JSC), including distributed control systems, central control, networks, database servers, personal computers and workstations, applications software, and external communications. The system will have an open commercial computing and networking architecture. The network will provide automatic real-time transfer of information to database server computers, which perform data collection and validation. This information system will support integrated, data-sharing applications for everything from system alarms to management summaries. Most existing complex process control systems have information gaps between the different real-time subsystems, between these subsystems and the central controller, between the central controller and system-level planning and analysis application software, and between the system-level applications and management overview reporting. An integrated information system is vitally necessary as the basis for the integration of planning, scheduling, modeling, monitoring, and control, which will allow improved monitoring and control based on timely, accurate, and complete data. Data describing the system configuration and the real-time processes can be collected, checked, reconciled, analyzed, and stored in database servers that can be accessed by all applications. The required technology is available. The only opportunity to design a distributed, nonredundant, integrated system is before it is built; retrofit is extremely difficult and costly.

  12. RTSPM: real-time Linux control software for scanning probe microscopy.

    PubMed

    Chandrasekhar, V; Mehta, M M

    2013-01-01

    Real time computer control is an essential feature of scanning probe microscopes, which have become important tools for the characterization and investigation of nanometer scale samples. Most commercial (and some open-source) scanning probe data acquisition software uses digital signal processors to handle the real time data processing and control, which adds to the expense and complexity of the control software. We describe here scan control software that uses a single computer and a data acquisition card to acquire scan data. The computer runs an open-source real time Linux kernel, which permits fast acquisition and control while maintaining a responsive graphical user interface. Images from a simulated tuning-fork based microscope as well as a standard topographical sample are also presented, showing some of the capabilities of the software.

  13. Technology and Transformation in Academic Libraries.

    ERIC Educational Resources Information Center

    Shaw, Ward

    Academic library computing systems, which are among the most complex found in academic environments, now include external systems, such as online commercial search services and nationwide networks, and local systems that control and support internal operations. As librarians have realized the benefit of using computer systems to perform…

  14. Modeling and deadlock avoidance of automated manufacturing systems with multiple automated guided vehicles.

    PubMed

    Wu, Naiqi; Zhou, MengChu

    2005-12-01

    An automated manufacturing system (AMS) contains a number of versatile machines (or workstations), buffers, and an automated material handling system (MHS), and is computer-controlled. An effective and flexible alternative for implementing the MHS is to use an automated guided vehicle (AGV) system. The deadlock issue in an AMS is very important to its operation and has been extensively studied. Deadlock problems have been treated separately for parts in production and in transportation, and many techniques have been developed for each problem. However, such treatment does not take advantage of the flexibility offered by multiple AGVs. In general, it is intractable to obtain a maximally permissive control policy for either problem. Instead, this paper investigates the two problems in an integrated way. First we model the AGV system and the part processing processes by resource-oriented Petri nets, respectively. Then the two models are integrated by using macro transitions. Based on the combined model, a novel control policy for deadlock avoidance is proposed. It is shown to be maximally permissive with computational complexity O(n²), where n is the number of machines in the AMS, if the complexity of controlling the part transportation by AGVs is not considered. Thus, the complexity of deadlock avoidance for the whole system is bounded by the complexity of controlling the AGV system. An illustrative example shows its application and power.
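
    The paper's Petri-net policy is not reproduced here, but the general flavor of deadlock avoidance by a safety check can be sketched with a banker's-algorithm-style test, plainly a different technique; the machine/AGV capacities below are made-up numbers.

      # Banker's-style safety check: grant a state only if every in-process
      # job can still finish in some order.
      def is_safe(available, allocated, needed):
          work, done = available[:], [False] * len(allocated)
          progressed = True
          while progressed:
              progressed = False
              for i, (alloc, need) in enumerate(zip(allocated, needed)):
                  if not done[i] and all(n <= w for n, w in zip(need, work)):
                      work = [w + a for w, a in zip(work, alloc)]  # job i releases
                      done[i] = progressed = True
          return all(done)

      # Two resource types (machines, AGVs); three in-process parts.
      print(is_safe(available=[1, 1],
                    allocated=[[1, 0], [0, 1], [0, 0]],
                    needed=[[0, 1], [1, 0], [1, 1]]))  # True: a safe order exists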

  15. Representing and Learning Complex Object Interactions

    PubMed Central

    Zhou, Yilun; Konidaris, George

    2017-01-01

    We present a framework for representing scenarios with complex object interactions, in which a robot cannot directly interact with the object it wishes to control, but must instead do so via intermediate objects. For example, a robot learning to drive a car can only indirectly change its pose, by rotating the steering wheel. We formalize such complex interactions as chains of Markov decision processes and show how they can be learned and used for control. We describe two systems in which a robot uses learning from demonstration to achieve indirect control: playing a computer game, and using a hot water dispenser to heat a cup of water. PMID:28593181

  16. Computing and data processing

    NASA Technical Reports Server (NTRS)

    Smarr, Larry; Press, William; Arnett, David W.; Cameron, Alastair G. W.; Crutcher, Richard M.; Helfand, David J.; Horowitz, Paul; Kleinmann, Susan G.; Linsky, Jeffrey L.; Madore, Barry F.

    1991-01-01

    The applications of computers and data processing to astronomy are discussed. Among the topics covered are the emerging national information infrastructure, workstations and supercomputers, supertelescopes, digital astronomy, astrophysics in a numerical laboratory, community software, archiving of ground-based observations, dynamical simulations of complex systems, plasma astrophysics, and the remote control of fourth dimension supercomputers.

  17. Air Defense: A Computer Game for Research in Human Performance.

    DTIC Science & Technology

    1981-07-01

    ...warfare (ANW) threat analysis. Major elements of the threat analysis problem were embedded in an interactive air defense game controlled by a... The game requires sustained attention to a complex and interactive "hostile" environment, provides proper experimental control of relevant variables...

  18. Programmable chemical controllers made from DNA.

    PubMed

    Chen, Yuan-Jyue; Dalchau, Neil; Srinivas, Niranjan; Phillips, Andrew; Cardelli, Luca; Soloveichik, David; Seelig, Georg

    2013-10-01

    Biological organisms use complex molecular networks to navigate their environment and regulate their internal state. The development of synthetic systems with similar capabilities could lead to applications such as smart therapeutics or fabrication methods based on self-organization. To achieve this, molecular control circuits need to be engineered to perform integrated sensing, computation and actuation. Here we report a DNA-based technology for implementing the computational core of such controllers. We use the formalism of chemical reaction networks as a 'programming language' and our DNA architecture can, in principle, implement any behaviour that can be mathematically expressed as such. Unlike logic circuits, our formulation naturally allows complex signal processing of intrinsically analogue biological and chemical inputs. Controller components can be derived from biologically synthesized (plasmid) DNA, which reduces errors associated with chemically synthesized DNA. We implement several building-block reaction types and then combine them into a network that realizes, at the molecular level, an algorithm used in distributed control systems for achieving consensus between multiple agents.
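
    The consensus algorithm mentioned can be pictured, at the level of deterministic mass-action kinetics, with a toy ODE simulation of an approximate-majority network; the rate constant and initial concentrations are assumed values, and this is a sketch of the dynamics only, not of the DNA implementation.

      # Mass-action ODEs for an approximate-majority consensus network:
      #   X + Y -> 2B,  B + X -> 2X,  B + Y -> 2Y
      from scipy.integrate import odeint
      import numpy as np

      def rhs(s, t, k=1.0):
          x, y, b = s
          return [-k * x * y + k * b * x,
                  -k * x * y + k * b * y,
                  2 * k * x * y - k * b * x - k * b * y]

      t = np.linspace(0, 50, 500)
      x, y, b = odeint(rhs, [0.6, 0.4, 0.0], t).T
      print(round(x[-1], 3), round(y[-1], 3))  # initial X majority wins: x -> 1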

  19. Programmable chemical controllers made from DNA

    NASA Astrophysics Data System (ADS)

    Chen, Yuan-Jyue; Dalchau, Neil; Srinivas, Niranjan; Phillips, Andrew; Cardelli, Luca; Soloveichik, David; Seelig, Georg

    2013-10-01

    Biological organisms use complex molecular networks to navigate their environment and regulate their internal state. The development of synthetic systems with similar capabilities could lead to applications such as smart therapeutics or fabrication methods based on self-organization. To achieve this, molecular control circuits need to be engineered to perform integrated sensing, computation and actuation. Here we report a DNA-based technology for implementing the computational core of such controllers. We use the formalism of chemical reaction networks as a 'programming language' and our DNA architecture can, in principle, implement any behaviour that can be mathematically expressed as such. Unlike logic circuits, our formulation naturally allows complex signal processing of intrinsically analogue biological and chemical inputs. Controller components can be derived from biologically synthesized (plasmid) DNA, which reduces errors associated with chemically synthesized DNA. We implement several building-block reaction types and then combine them into a network that realizes, at the molecular level, an algorithm used in distributed control systems for achieving consensus between multiple agents.

  20. Cyber-physical approach to the network-centric robotics control task

    NASA Astrophysics Data System (ADS)

    Muliukha, Vladimir; Ilyashenko, Alexander; Zaborovsky, Vladimir; Lukashin, Alexey

    2016-10-01

    Complex engineering tasks concerning the control of groups of mobile robots are still poorly developed. To formalize them, we use a cyber-physical approach, which extends the range of engineering and physical methods for the design of complex technical objects by studying the informational aspects of communication and interaction between objects and with an external environment [1]. The paper analyzes network-centric methods for the control of cyber-physical objects. Robots, or cyber-physical objects, interact with each other by transmitting information via computer networks using a preemptive queueing system with a randomized push-out mechanism [2],[3]. The main field of application for the results of our work is space robotics. The selection of cyber-physical systems as a special class of designed objects is due to the necessity of integrating various components responsible for computing, communications, and control processes. Network-centric solutions allow the use of universal means of organizing information exchange to integrate different technologies into the control system.

  1. Programmable chemical controllers made from DNA

    PubMed Central

    Chen, Yuan-Jyue; Dalchau, Neil; Srinivas, Niranjan; Phillips, Andrew; Cardelli, Luca; Soloveichik, David; Seelig, Georg

    2014-01-01

    Biological organisms use complex molecular networks to navigate their environment and regulate their internal state. The development of synthetic systems with similar capabilities could lead to applications such as smart therapeutics or fabrication methods based on self-organization. To achieve this, molecular control circuits need to be engineered to perform integrated sensing, computation and actuation. Here we report a DNA-based technology for implementing the computational core of such controllers. We use the formalism of chemical reaction networks as a 'programming language', and our DNA architecture can, in principle, implement any behaviour that can be mathematically expressed as such. Unlike logic circuits, our formulation naturally allows complex signal processing of intrinsically analogue biological and chemical inputs. Controller components can be derived from biologically synthesized (plasmid) DNA, which reduces errors associated with chemically synthesized DNA. We implement several building-block reaction types and then combine them into a network that realizes, at the molecular level, an algorithm used in distributed control systems for achieving consensus between multiple agents. PMID:24077029

  2. Computational Modeling of Liquid and Gaseous Control Valves

    NASA Technical Reports Server (NTRS)

    Daines, Russell; Ahuja, Vineet; Hosangadi, Ashvin; Shipman, Jeremy; Moore, Arden; Sulyma, Peter

    2005-01-01

    In this paper, computational modeling efforts undertaken at NASA Stennis Space Center (SSC) in support of rocket engine component testing are discussed. Such analyses include structurally complex cryogenic liquid valves and gas valves operating at high pressures and flow rates. Basic modeling and initial successes are documented, and other issues that make valve modeling at SSC somewhat unique are also addressed. These include transient behavior, valve stall, and the determination of flow patterns in LOX valves. Hexahedral structured grids are used for valves that can be simplified through the use of an axisymmetric approximation. A hybrid unstructured methodology is used for structurally complex valves that have disparate length scales and complex flow paths, including strong swirl and local recirculation zones/secondary-flow effects. Hexahedral (structured), unstructured, and hybrid meshes are compared for accuracy and computational efficiency. Accuracy is determined using verification and validation techniques.

  3. Human factors aspects of control room design

    NASA Technical Reports Server (NTRS)

    Jenkins, J. P.

    1983-01-01

    A plan for the design and analysis of a multistation control room is reviewed. It is found that acceptance of the computer-based information system by the users in the control room is mandatory for mission and system success. Criteria to improve the computer/user interface include: match of system input/output with the user; reliability, compatibility, and maintainability; easy to learn, with little training needed; self-descriptive system; system under user control; transparent language, format, and organization; correspondence to user expectations; adaptability to user experience level; fault tolerance; dialog capability, with user communication needs reflected in flexibility, complexity, power, and information load; an integrated system; and documentation.

  4. Visual control of prey-capture flight in dragonflies.

    PubMed

    Olberg, Robert M

    2012-04-01

    Interacting with a moving object poses a computational problem for an animal's nervous system. This problem has been elegantly solved by the dragonfly, a formidable visual predator on flying insects. The dragonfly computes an interception flight trajectory and steers to maintain it during its prey-pursuit flight. This review summarizes current knowledge about pursuit behavior and neurons thought to control interception in the dragonfly. When understood, this system has the potential for explaining how a small group of neurons can control complex interactions with moving objects.

  5. Simulation of a turbofan engine for evaluation of multivariable optimal control concepts. [(computerized simulation)]

    NASA Technical Reports Server (NTRS)

    Seldner, K.

    1976-01-01

    The development of control systems for jet engines requires a real-time computer simulation. The simulation provides an effective tool for evaluating control concepts and problem areas prior to actual engine testing. The development and use of a real-time simulation of the Pratt and Whitney F100-PW100 turbofan engine is described. The simulation was used in a multivariable optimal control research program based on linear quadratic regulator (LQR) theory. The simulation is used to generate linear engine models at selected operating points and to evaluate the control algorithm. To reduce the complexity of the design, it is desirable to reduce the order of the linear model; a technique for doing so is discussed, and selected results from the high- and low-order models are compared. The LQR control algorithms can be programmed on a digital computer, which will then control the engine simulation over the desired flight envelope.
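
    The controller-design step can be sketched in a few lines: given a linear model, solve the continuous-time algebraic Riccati equation and form the LQR gain. The two-state matrices below are arbitrary placeholders, not the F100 engine model.

      # LQR gain from a linear model via the continuous-time Riccati equation.
      import numpy as np
      from scipy.linalg import solve_continuous_are

      A = np.array([[-1.0, 0.5], [0.0, -2.0]])   # assumed linearized model
      B = np.array([[0.0], [1.0]])
      Q, R = np.eye(2), np.array([[1.0]])        # state and control weights

      P = solve_continuous_are(A, B, Q, R)       # Riccati solution
      K = np.linalg.inv(R) @ B.T @ P             # optimal state-feedback gain
      print(K, np.linalg.eigvals(A - B @ K))     # closed-loop poles are stable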

  6. Bilayer avalanche spin-diode logic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman, Joseph S., E-mail: joseph.friedman@u-psud.fr; Querlioz, Damien; Fadel, Eric R.

    2015-11-15

    A novel spintronic computing paradigm is proposed and analyzed in which InSb p-n bilayer avalanche spin-diodes are cascaded to efficiently perform complex logic operations. This spin-diode logic family uses control wires to generate magnetic fields that modulate the resistance of the spin-diodes, and currents through these devices control the resistance of cascaded devices. Electromagnetic simulations are performed to demonstrate the cascading mechanism, and guidelines are provided for the development of this innovative computing technology. This cascading scheme permits compact logic circuits with switching speeds determined by electromagnetic wave propagation rather than electron motion, enabling high-performance spintronic computing.

  7. Faster Finances

    NASA Technical Reports Server (NTRS)

    1976-01-01

    TRW has applied the Apollo checkout procedures to retail-store and bank-transaction systems, as well as to control systems for electric power transmission grids, reducing the chance of power blackouts. Automatic checkout equipment for the Apollo spacecraft is one of the most complex computer systems in the world. Used to integrate extensive Apollo checkout procedures from manufacture to launch, it has spawned major advances in computer systems technology. The store and bank credit system has brought significant improvement in the speed and accuracy of transactions, credit authorization, and inventory control. A similar computer service called "Validata" is used nationwide by airlines, airline ticket offices, car rental agencies, and hotels.

  8. Software for Simulating a Complex Robot

    NASA Technical Reports Server (NTRS)

    Goza, S. Michael

    2003-01-01

    RoboSim (Robot Simulation) is a computer program that simulates the poses and motions of the Robonaut, a developmental anthropomorphic robot that has a complex system of joints with 43 degrees of freedom and multiple modes of operation and control. RoboSim performs a full kinematic simulation of all degrees of freedom. It also includes interface components that duplicate the functionality of the real Robonaut interface with control software and human operators. Basically, users see no difference between the real Robonaut and the simulation. Consequently, new control algorithms can be tested by computational simulation, without risk to the Robonaut hardware and without using excessive Robonaut-hardware experimental time, which is always at a premium. Previously developed software incorporated into RoboSim includes Enigma (for graphical displays), OSCAR (for kinematical computations), and NDDS (for communication between the Robonaut and external software). In addition, RoboSim incorporates unique inverse-kinematical algorithms for chains of joints that have fewer than six degrees of freedom (e.g., finger joints). In comparison with the algorithms of OSCAR, these algorithms are more readily adaptable and provide better results when using equivalent sets of data.
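
    For chains with fewer than six degrees of freedom, one standard approach is damped-least-squares inverse kinematics, sketched below for an assumed planar two-link arm; this is a generic illustration, not necessarily RoboSim's algorithm.

      # Damped-least-squares inverse kinematics for a planar 2-link arm.
      import numpy as np

      def fk(q, l1=1.0, l2=1.0):
          """Forward kinematics: end-point position of the arm."""
          return np.array([l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1]),
                           l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])])

      def jac(q, l1=1.0, l2=1.0):
          s1, s12 = np.sin(q[0]), np.sin(q[0] + q[1])
          c1, c12 = np.cos(q[0]), np.cos(q[0] + q[1])
          return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                           [ l1 * c1 + l2 * c12,  l2 * c12]])

      q, target, damping = np.array([0.3, 0.3]), np.array([1.2, 0.8]), 0.1
      for _ in range(100):
          err = target - fk(q)
          J = jac(q)
          # Damped pseudo-inverse step: robust near singular configurations.
          q += np.linalg.solve(J.T @ J + damping**2 * np.eye(2), J.T @ err)
      print(np.round(fk(q), 3))  # close to the target [1.2, 0.8]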

  9. Minimized state complexity of quantum-encoded cryptic processes

    NASA Astrophysics Data System (ADS)

    Riechers, Paul M.; Mahoney, John R.; Aghamohammadi, Cina; Crutchfield, James P.

    2016-05-01

    The predictive information required for proper trajectory sampling of a stochastic process can be more efficiently transmitted via a quantum channel than a classical one. This recent discovery allows quantum information processing to drastically reduce the memory necessary to simulate complex classical stochastic processes. It also points to a new perspective on the intrinsic complexity that nature must employ in generating the processes we observe. The quantum advantage increases with codeword length: the length of process sequences used in constructing the quantum communication scheme. In analogy with the classical complexity measure, statistical complexity, we use this reduced communication cost as an entropic measure of state complexity in the quantum representation. Previously difficult to compute, the quantum advantage is expressed here in closed form using spectral decomposition. This allows for efficient numerical computation of the quantum-reduced state complexity at all encoding lengths, including infinite. Additionally, it makes clear how finite-codeword reduction in state complexity is controlled by the classical process's cryptic order, and it allows asymptotic analysis of infinite-cryptic-order processes.

  10. Computational Modeling and Real-Time Control of Patient-Specific Laser Treatment of Cancer

    PubMed Central

    Fuentes, D.; Oden, J. T.; Diller, K. R.; Hazle, J. D.; Elliott, A.; Shetty, A.; Stafford, R. J.

    2014-01-01

    An adaptive feedback control system is presented which employs a computational model of bioheat transfer in living tissue to guide, in real time, laser treatments of prostate cancer monitored by magnetic resonance thermal imaging (MRTI). The system is built on what can be referred to as cyberinfrastructure - a complex structure of high-speed network, large-scale parallel computing devices, laser optics, imaging, visualizations, inverse-analysis algorithms, mesh generation, and control systems that guide laser therapy to optimally control the ablation of cancerous tissue. The computational system has been successfully tested on in vivo canine prostate. Over the course of an 18-minute laser induced thermal therapy (LITT) performed at M.D. Anderson Cancer Center (MDACC) in Houston, Texas, the computational models were calibrated to intra-operative real-time thermal imaging treatment data, and the calibrated models controlled the bioheat transfer to within 5°C of the predetermined treatment plan. The computational arena is in Austin, Texas, and managed at the Institute for Computational Engineering and Sciences (ICES). The system is designed to control the bioheat transfer remotely while simultaneously providing real-time remote visualization of the ongoing treatment. Postoperative histology of the canine prostate reveals that the damage region was within the targeted 1.2 cm diameter treatment objective. PMID:19148754

  11. Computational modeling and real-time control of patient-specific laser treatment of cancer.

    PubMed

    Fuentes, D; Oden, J T; Diller, K R; Hazle, J D; Elliott, A; Shetty, A; Stafford, R J

    2009-04-01

    An adaptive feedback control system is presented which employs a computational model of bioheat transfer in living tissue to guide, in real time, laser treatments of prostate cancer monitored by magnetic resonance thermal imaging. The system is built on what can be referred to as cyberinfrastructure - a complex structure of high-speed network, large-scale parallel computing devices, laser optics, imaging, visualizations, inverse-analysis algorithms, mesh generation, and control systems that guide laser therapy to optimally control the ablation of cancerous tissue. The computational system has been successfully tested on in vivo canine prostate. Over the course of an 18 min laser-induced thermal therapy performed at M.D. Anderson Cancer Center (MDACC) in Houston, Texas, the computational models were calibrated to intra-operative real-time thermal imaging treatment data, and the calibrated models controlled the bioheat transfer to within 5 degrees C of the predetermined treatment plan. The computational arena is in Austin, Texas, and managed at the Institute for Computational Engineering and Sciences (ICES). The system is designed to control the bioheat transfer remotely while simultaneously providing real-time remote visualization of the ongoing treatment. Post-operative histology of the canine prostate reveals that the damage region was within the targeted 1.2 cm diameter treatment objective.

  12. A simplified fuel control approach for low cost aircraft gas turbines

    NASA Technical Reports Server (NTRS)

    Gold, H.

    1973-01-01

    Reduction in the complexity of gas turbine fuel controls without loss of control accuracy, reliability, or effectiveness as a method for reducing engine costs is discussed. A description and analysis of a hydromechanical approach are presented. A computer simulation of the control mechanism is given, and the performance of a physical model in an engine test is reported.

  13. Blinks, saccades, and fixation pauses during vigilance task performance. I., Time on task.

    DOT National Transportation Integrated Search

    1994-12-01

    In the future, operators of complex equipment will spend more time monitoring computer controlled devices rather than having hands on control of such equipment. The operator intervenes in system operation under "unusual" conditions or when there is a...

  14. Controlling the Universe

    ERIC Educational Resources Information Center

    Evanson, Nick

    2004-01-01

    Basic electronic devices have been used to great effect with console computer games. This paper looks at a range of devices from the very simple, such as microswitches and potentiometers, up to the more complex Hall effect probe. There is a great deal of relatively straightforward use of simple devices in computer games systems, and having read…

  15. 51. VIEW OF LORAL ADS 100A COMPUTERS LOCATED CENTRALLY ON ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    51. VIEW OF LORAL ADS 100A COMPUTERS LOCATED CENTRALLY ON NORTH WALL OF TELEMETRY ROOM (ROOM 106). SLC-3W CONTROL ROOM IS VISIBLE IN BACKGROUND THROUGH WINDOW IN NORTH WALL. - Vandenberg Air Force Base, Space Launch Complex 3, Launch Operations Building, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

  16. Modeling driver behavior in a cognitive architecture.

    PubMed

    Salvucci, Dario D

    2006-01-01

    This paper explores the development of a rigorous computational model of driver behavior in a cognitive architecture--a computational framework with underlying psychological theories that incorporate basic properties and limitations of the human system. Computational modeling has emerged as a powerful tool for studying the complex task of driving, allowing researchers to simulate driver behavior and explore the parameters and constraints of this behavior. An integrated driver model developed in the ACT-R (Adaptive Control of Thought-Rational) cognitive architecture is described that focuses on the component processes of control, monitoring, and decision making in a multilane highway environment. This model accounts for the steering profiles, lateral position profiles, and gaze distributions of human drivers during lane keeping, curve negotiation, and lane changing. The model demonstrates how cognitive architectures facilitate understanding of driver behavior in the context of general human abilities and constraints and how the driving domain benefits cognitive architectures by pushing model development toward more complex, realistic tasks. The model can also serve as a core computational engine for practical applications that predict and recognize driver behavior and distraction.

  17. Efficient control schemes with limited computation complexity for Tomographic AO systems on VLTs and ELTs

    NASA Astrophysics Data System (ADS)

    Petit, C.; Le Louarn, M.; Fusco, T.; Madec, P.-Y.

    2011-09-01

    Various tomographic control solutions have been proposed during the last decades to ensure efficient or even optimal closed-loop correction for tomographic Adaptive Optics (AO) concepts such as Laser Tomographic AO (LTAO) and Multi-Conjugate AO (MCAO). The optimal solution, based on the Linear Quadratic Gaussian (LQG) approach, as well as suboptimal but efficient solutions such as Pseudo-Open-Loop Control (POLC), requires multiple Matrix-Vector Multiplications (MVMs). Whatever their respective performance, these control solutions thus exhibit a strong increase in on-line complexity, and their implementation may become difficult in demanding cases. Two such cases are of particular interest. First, the system's Real-Time Computer (RTC) architecture and implementation may be derived from past or present solutions that do not support multiple MVMs. This is the case for the AO Facility, whose RTC architecture is derived from the SPARTA platform and inherits its single-MVM structure, which does not accommodate LTAO control solutions, for instance. Second, in future systems such as Extremely Large Telescopes, the number of degrees of freedom is twenty to one hundred times larger than in present systems. Under these conditions, tomographic control solutions can hardly be used in their standard form, and optimized implementations must be considered. Single-MVM tomographic control solutions are a potential answer, and straightforward approaches such as Virtual Deformable Mirrors have already been proposed for LTAO, though with tuning issues. In this paper we investigate the possibility of deriving from tomographic control solutions, such as POLC or LQG, simplified control solutions that preserve a single-MVM architecture and could thus be implemented on today's systems or on future complex systems. We theoretically derive various solutions and analyze their respective performance on various systems through numerical simulation. We discuss the optimization of their performance and stability issues with respect to classic control solutions. We finally discuss off-line computation and implementation constraints.
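
    The single-MVM constraint these solutions must respect can be shown with a toy integrator loop: each frame costs exactly one matrix-vector multiply. The interaction matrix, least-squares reconstructor, and static aberration below are assumed placeholders for a real system's calibration products.

      # Single-MVM integrator control loop (one matrix-vector multiply per frame).
      import numpy as np

      rng = np.random.default_rng(1)
      n_meas, n_act = 16, 8
      D = rng.standard_normal((n_meas, n_act))   # toy interaction matrix
      R = np.linalg.pinv(D)                      # command matrix, computed off-line
      gain, u = 0.5, np.zeros(n_act)
      phase = rng.standard_normal(n_meas)        # static aberration to correct

      for _ in range(50):
          s = phase + D @ u                      # residual measurement
          u -= gain * (R @ s)                    # the single MVM per frame
      print(round(np.linalg.norm(phase + D @ u), 3))  # only the uncorrectable part remains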

  18. Fast and stable algorithms for computing the principal square root of a complex matrix

    NASA Technical Reports Server (NTRS)

    Shieh, Leang S.; Lian, Sui R.; Mcinnis, Bayliss C.

    1987-01-01

    This note presents recursive algorithms, rapidly convergent and more stable than prior ones, for finding the principal square root of a complex matrix. The developed algorithms are also utilized to derive fast and stable matrix sign algorithms, which are useful in developing applications to control system problems.
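
    Two classical recursions of this kind are the Denman-Beavers iteration for the principal square root and the Newton iteration for the matrix sign, sketched below; these are standard algorithms, not necessarily the exact variants of the note.

      # Denman-Beavers square-root iteration and Newton matrix-sign iteration.
      import numpy as np

      def sqrtm_db(A, iters=30):
          """Denman-Beavers: Y -> (Y + Z^-1)/2, Z -> (Z + Y^-1)/2; Y -> A**(1/2)."""
          Y, Z = A.astype(complex), np.eye(len(A), dtype=complex)
          for _ in range(iters):
              Y, Z = (Y + np.linalg.inv(Z)) / 2, (Z + np.linalg.inv(Y)) / 2
          return Y

      def signm(A, iters=30):
          """Matrix sign via Newton: S -> (S + S^-1)/2."""
          S = A.astype(complex)
          for _ in range(iters):
              S = (S + np.linalg.inv(S)) / 2
          return S

      A = np.array([[4.0, 1.0], [0.0, 9.0]])
      print(np.round(sqrtm_db(A) @ sqrtm_db(A), 6))                  # recovers A
      print(np.round(signm(np.array([[2.0, 0.0], [0.0, -3.0]])), 6)) # diag(1, -1)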

  19. Combining high performance simulation, data acquisition, and graphics display computers

    NASA Technical Reports Server (NTRS)

    Hickman, Robert J.

    1989-01-01

    Issues involved in the continuing development of an advanced simulation complex are discussed. This approach provides the capability to perform the majority of tests on advanced systems non-destructively. The controlled test environments can be replicated to examine the response of the systems under test to alternative treatments of the system control design, or to test the function and qualification of specific hardware. Field tests verify that the elements simulated in the laboratories are sufficient. The digital complex is hosted by a Digital Equipment Corp. MicroVAX computer, with an Aptec Computer Systems Model 24 I/O computer performing the communication function. An Applied Dynamics International AD100 performs the high-speed simulation computing, and an Evans and Sutherland PS350 performs on-line graphics display. A Scientific Computer Systems SCS40 acts as a high-performance FORTRAN program processor to support the complex by generating the numerous large files, from programs coded in FORTRAN, that are required for the real-time processing. Four programming languages are involved in the process: FORTRAN, ADSIM, ADRIO, and STAPLE. FORTRAN is employed on the MicroVAX host to initialize and terminate the simulation runs on the system. The generation of the data files on the SCS40 is also performed with FORTRAN programs. ADSIM and ADRIO are used to program the processing elements of the AD100 and its IOCP processor. STAPLE is used to program the Aptec DIP and DIA processors.

  20. Strategies for concurrent processing of complex algorithms in data driven architectures

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Mielke, Roland R.

    1988-01-01

    The purpose is to document research to develop strategies for concurrent processing of complex algorithms in data-driven architectures. The problem domain consists of decision-free algorithms having large-grained, computationally complex primitive operations; such algorithms are often found in signal processing and control applications. The anticipated multiprocessor environment is a data flow architecture containing between two and twenty computing elements. Each computing element is a processor having local program memory, and communicates with a common global data memory. A new graph-theoretic model, called ATAMM, which establishes rules for relating a decomposed algorithm to its execution in a data flow architecture, is presented. The ATAMM model is used to determine strategies to achieve optimum time performance and to develop a system diagnostic software tool. In addition, preliminary work on a new multiprocessor operating system based on the ATAMM specifications is described.

  1. Quadratic Programming for Allocating Control Effort

    NASA Technical Reports Server (NTRS)

    Singh, Gurkirpal

    2005-01-01

    A computer program calculates an optimal allocation of control effort in a system that includes redundant control actuators. The program implements an iterative (but otherwise single-stage) algorithm of the quadratic-programming type. In general, in the quadratic-programming problem, one seeks the values of a set of variables that minimize a quadratic cost function, subject to a set of linear equality and inequality constraints. In this program, the cost function combines control effort (typically quantified in terms of energy or fuel consumed) and control residuals (differences between commanded and sensed values of variables to be controlled). In comparison with prior control-allocation software, this program offers approximately equal accuracy but much greater computational efficiency. In addition, this program offers flexibility, robustness to actuation failures, and a capability for selective enforcement of control requirements. The computational efficiency of this program makes it suitable for such complex, real-time applications as controlling redundant aircraft actuators or redundant spacecraft thrusters. The program is written in the C language for execution in a UNIX operating system.
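
    The underlying optimization can be sketched as a bounded linear least-squares problem: drive the achieved control B u toward the demand d while penalizing effort, subject to actuator limits. This is a generic reformulation under assumed numbers, not the program's actual C implementation.

      # Control allocation as bounded least squares:
      #   minimize ||B u - d||^2 + lam^2 * ||u||^2   subject to  -1 <= u_i <= 1
      import numpy as np
      from scipy.optimize import lsq_linear

      B = np.array([[1.0, 1.0, 0.5],        # effectiveness of 3 redundant actuators
                    [0.0, 1.0, -1.0]])      # on 2 controlled axes
      d = np.array([1.0, 0.2])              # demanded control action
      lam = 0.1                             # weight on control effort

      # Stack the residual and effort terms into one least-squares problem.
      A = np.vstack([B, lam * np.eye(3)])
      b = np.concatenate([d, np.zeros(3)])
      res = lsq_linear(A, b, bounds=(-1.0, 1.0))
      print(np.round(res.x, 3), np.round(B @ res.x - d, 3))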

  2. Structural Technology Evaluation and Analysis Program (STEAP). Delivery Order 0037: Prognosis-Based Control Reconfiguration for an Aircraft with Faulty Actuator to Enable Performance in a Degraded State

    DTIC Science & Technology

    2010-12-01

    ...computers in 1953. HIL motion simulators were also built for the dynamic testing of vehicle components (e.g., suspensions, bodies) with hydraulic or... complex, comprehensive mechanical systems can be simulated in real time by parallel computers; examples include multibody systems, brake systems... hard constraints in a multivariable control framework. And the third aspect is the ability to perform online optimization. These aspects result in...

  3. Digital Signal Processing and Control for the Study of Gene Networks

    NASA Astrophysics Data System (ADS)

    Shin, Yong-Jun

    2016-04-01

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems that can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks, since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.

  4. Digital Signal Processing and Control for the Study of Gene Networks.

    PubMed

    Shin, Yong-Jun

    2016-04-22

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems that can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks, since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.

  5. Digital Signal Processing and Control for the Study of Gene Networks

    PubMed Central

    Shin, Yong-Jun

    2016-01-01

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems that can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks, since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks. PMID:27102828
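
    A toy rendering of this digital-control viewpoint: a sampled first-order production/degradation model of gene expression regulated by a discrete PI controller. All constants are illustrative assumptions.

      # Discrete PI control of a sampled gene expression model.
      deg, dt = 0.1, 1.0          # degradation rate, sampling period
      kp, ki = 0.5, 0.05          # PI gains
      target, x, integ = 2.0, 0.0, 0.0

      for k in range(100):
          err = target - x                      # measured expression error
          integ += err * dt
          u = max(0.0, kp * err + ki * integ)   # production rate (non-negative)
          x += dt * (u - deg * x)               # x[k+1] = x[k] + dt*(u - deg*x[k])
      print(round(x, 3))                        # settles near the target level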

  6. Web-based interactive drone control using hand gesture

    NASA Astrophysics Data System (ADS)

    Zhao, Zhenfei; Luo, Hao; Song, Guang-Hua; Chen, Zhou; Lu, Zhe-Ming; Wu, Xiaofeng

    2018-01-01

    This paper develops a drone control prototype based on web technology with the aid of hand gestures. The uplink control commands and downlink data (e.g., video) are transmitted over WiFi, and all information exchange is realized on the web. Control commands are translated from various predetermined hand gestures. Specifically, the hardware of this friendly interactive control system is composed of a quadrotor drone, a computer-vision-based hand gesture sensor, and a cost-effective computer. The software is simplified to a web-based user interface program. Aided by natural hand gestures, this system significantly reduces the complexity of traditional human-computer interaction, making remote drone operation more intuitive. Meanwhile, a web-based automatic control mode is provided in addition to the hand gesture control mode. For both operation modes, no extra application program needs to be installed on the computer. Experimental results demonstrate the effectiveness and efficiency of the proposed system, including control accuracy, operation latency, etc. This system can be used in many applications, such as controlling a drone in a global-positioning-system-denied environment or by handlers without professional drone control knowledge, since it is easy to get started.

  7. Web-based interactive drone control using hand gesture.

    PubMed

    Zhao, Zhenfei; Luo, Hao; Song, Guang-Hua; Chen, Zhou; Lu, Zhe-Ming; Wu, Xiaofeng

    2018-01-01

    This paper develops a drone control prototype based on web technology with the aid of hand gestures. The uplink control commands and downlink data (e.g., video) are transmitted over WiFi, and all information exchange is realized on the web. Control commands are translated from various predetermined hand gestures. Specifically, the hardware of this friendly interactive control system is composed of a quadrotor drone, a computer-vision-based hand gesture sensor, and a cost-effective computer. The software is simplified to a web-based user interface program. Aided by natural hand gestures, this system significantly reduces the complexity of traditional human-computer interaction, making remote drone operation more intuitive. Meanwhile, a web-based automatic control mode is provided in addition to the hand gesture control mode. For both operation modes, no extra application program needs to be installed on the computer. Experimental results demonstrate the effectiveness and efficiency of the proposed system, including control accuracy, operation latency, etc. This system can be used in many applications, such as controlling a drone in a global-positioning-system-denied environment or by handlers without professional drone control knowledge, since it is easy to get started.

  8. Computationally inexpensive approach for pitch control of offshore wind turbine on barge floating platform.

    PubMed

    Zuo, Shan; Song, Y D; Wang, Lei; Song, Qing-wang

    2013-01-01

    Offshore floating wind turbines (OFWTs) have gained increasing attention during the past decade because of the high-quality offshore wind power and the complex load environment. In the above-rated wind speed region, the control system is a tradeoff between power tracking and fatigue load reduction. To address the external disturbances and uncertain system parameters of OFWTs due to the proximity to load centers and strong wave coupling, this paper proposes a computationally inexpensive robust adaptive control approach with memory-based compensation for blade pitch control. The method is tested and compared with a baseline controller and a conventional individual blade pitch controller, with the "NREL offshore 5 MW baseline wind turbine" mounted on a barge platform and run in FAST and MATLAB/Simulink, operating in the above-rated condition. It is shown that the advanced control approach is not only robust to complex wind and wave disturbances but also adaptive to varying and uncertain system parameters. The simulation results demonstrate that the proposed method performs better in reducing power fluctuations, fatigue loads, and platform vibration than the conventional individual blade pitch control.
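
    The paper's adaptive memory-based law is not reproduced here; as a rough illustration of above-rated pitch control in general (every number below, from the inertia to the toy aerodynamic torque curve, is a made-up stand-in), a gain-scheduled PI loop can regulate rotor speed by pitching the blades:

    ```python
    import numpy as np

    # All numbers are invented stand-ins, not NREL 5 MW values.
    J = 4.0e7                  # rotor inertia [kg m^2]
    omega_rated = 1.27         # rated rotor speed [rad/s]
    kp, ki = 0.1, 0.02         # PI gains on speed error, hand-tuned
    dt = 0.05

    def aero_torque(wind, pitch):
        # Toy aerodynamic torque: grows with wind, shed by pitching [rad].
        return 4.5e6 * (wind / 18.0) ** 2 * max(0.0, 1.0 - 4.0 * pitch)

    gen_torque = aero_torque(18.0, 0.1)            # hold rated generator torque
    omega, pitch, integ = omega_rated, 0.0, 0.0
    for n in range(8000):
        wind = 18.0 + 2.0 * np.sin(0.1 * n * dt)   # gusty above-rated wind
        err = omega - omega_rated
        integ += err * dt
        sched = 1.0 / (1.0 + 3.0 * pitch)          # gain scheduling on pitch
        pitch = np.clip(sched * (kp * err + ki * integ), 0.0, 0.5)
        omega += dt * (aero_torque(wind, pitch) - gen_torque) / J
    print(f"rotor speed error after {8000 * dt:.0f} s: "
          f"{omega - omega_rated:+.4f} rad/s")
    ```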

  9. Computationally Inexpensive Approach for Pitch Control of Offshore Wind Turbine on Barge Floating Platform

    PubMed Central

    Zuo, Shan; Song, Y. D.; Wang, Lei; Song, Qing-wang

    2013-01-01

    Offshore floating wind turbines (OFWTs) have gained increasing attention during the past decade because of the high-quality offshore wind power and the complex load environment. In the above-rated wind speed region, the control system is a tradeoff between power tracking and fatigue load reduction. To address the external disturbances and uncertain system parameters of OFWTs due to the proximity to load centers and strong wave coupling, this paper proposes a computationally inexpensive robust adaptive control approach with memory-based compensation for blade pitch control. The method is tested and compared with a baseline controller and a conventional individual blade pitch controller, with the “NREL offshore 5 MW baseline wind turbine” mounted on a barge platform and run in FAST and MATLAB/Simulink, operating in the above-rated condition. It is shown that the advanced control approach is not only robust to complex wind and wave disturbances but also adaptive to varying and uncertain system parameters. The simulation results demonstrate that the proposed method performs better in reducing power fluctuations, fatigue loads, and platform vibration than the conventional individual blade pitch control. PMID:24453834

  10. Modeling of the Human - Operator in a Complex System Functioning Under Extreme Conditions

    NASA Astrophysics Data System (ADS)

    Getzov, Peter; Hubenova, Zoia; Yordanov, Dimitar; Popov, Wiliam

    2013-12-01

    Problems related to the operation of sophisticated control systems for objects functioning under extreme conditions are examined, together with the impact of the operator's effectiveness on the system as a whole. The necessity of creating complex simulation models that reflect the operator's activity is discussed. The organizational and technical system of an unmanned aviation complex is described as a sophisticated ergatic system. A computer realization of the main subsystems of the algorithmic model of the human as a controlling system is implemented, and specialized software for data processing and analysis is developed. An original computer model of the human as a tracking system has been implemented. A model of an unmanned complex for operator training and for the formation of a mental model in emergency situations, implemented in the MATLAB/Simulink environment, has been synthesized. As a unit of the control loop, the pilot (operator) is viewed in simplified form as an automatic control system consisting of three main interconnected subsystems: sensory organs (perception sensors); the central nervous system; and executive organs (muscles of the arms, legs, and back). A theoretical-data model for predicting the level of the operator's information load in ergatic systems is proposed, allowing the assessment and prediction of the effectiveness of a real working operator. A simulation model of the operator's activity during takeoff, based on Petri nets, has been synthesized.
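
    The paper's operator model is not given in the abstract; a common simplified stand-in for a human in a compensatory tracking loop is McRuer's crossover model, i.e., a gain, a reaction delay, and a neuromuscular lag, which maps loosely onto the sensor/CNS/muscle decomposition described above. A sketch with assumed parameter values:

    ```python
    # McRuer-style operator: gain K, reaction delay tau, neuromuscular lag Tn,
    # closing a compensatory tracking loop around an integrator plant.
    # Parameter values are typical textbook magnitudes, assumed here.
    dt, tau, K, Tn = 0.01, 0.25, 2.0, 0.1
    err_hist = [0.0] * int(tau / dt)   # FIFO buffer realizing the delay

    y, u = 0.0, 0.0                    # plant output, operator output
    for n in range(3000):
        target = 1.0 if n * dt > 1.0 else 0.0   # step to be tracked
        err_hist.append(target - y)
        e_delayed = err_hist.pop(0)             # error perceived tau s ago
        u += dt * (K * e_delayed - u) / Tn      # first-order muscle lag
        y += dt * u                             # controlled element: 1/s
    print(f"tracking error after 30 s: {1.0 - y:+.4f}")
    ```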

  11. Microprocessor-based integration of microfluidic control for the implementation of automated sensor monitoring and multithreaded optimization algorithms.

    PubMed

    Ezra, Elishai; Maor, Idan; Bavli, Danny; Shalom, Itai; Levy, Gahl; Prill, Sebastian; Jaeger, Magnus S; Nahmias, Yaakov

    2015-08-01

    Microfluidic applications range from combinatorial synthesis to high-throughput screening, with platforms integrating analog perfusion components, digitally controlled micro-valves, and a range of sensors that demand a variety of communication protocols. Currently, discrete control units are used to regulate and monitor each component, resulting in scattered control interfaces that limit data integration and synchronization. Here, we present a microprocessor-based control unit, utilizing the MS Gadgeteer open framework, that integrates all aspects of microfluidics through a high-current electronic circuit that supports and synchronizes digital and analog signals for perfusion components, pressure elements, and arbitrary sensor communication protocols using a plug-and-play interface. The control unit supports an integrated touch screen and a TCP/IP interface that provides local and remote control of flow and data acquisition. To establish the ability of our control unit to integrate and synchronize complex microfluidic circuits, we developed an equi-pressure combinatorial mixer. We demonstrate the generation of complex perfusion sequences, allowing the automated sampling, washing, and calibration of an electrochemical lactate sensor continuously monitoring hepatocyte viability following exposure to the pesticide rotenone. Importantly, integration of an optical sensor allowed us to implement automated optimization protocols that pose different computational challenges, including prioritized data structures in a genetic algorithm, distributed computational effort in multiple hill-climbing searches, and real-time realization of probabilistic models in simulated annealing. Our system offers a comprehensive solution for establishing optimization protocols and perfusion sequences in complex microfluidic circuits.
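
    One of the optimization modes named above, simulated annealing, fits in a few lines. In the sketch below the objective is a made-up stand-in (in the real system, each candidate setting would be scored from the on-line optical sensor readout):

    ```python
    import math, random

    def sensor_score(setting):
        # Hypothetical response surface with local optima; in the real system
        # this value would come from the optical sensor.
        return -(setting - 37) ** 2 + 40 * math.cos(setting / 3.0)

    current = best = random.randint(0, 100)
    T = 50.0
    while T > 0.1:
        candidate = min(100, max(0, current + random.randint(-5, 5)))
        delta = sensor_score(candidate) - sensor_score(current)
        # Always accept improvements; accept worse moves with Boltzmann odds.
        if delta > 0 or random.random() < math.exp(delta / T):
            current = candidate
        if sensor_score(current) > sensor_score(best):
            best = current
        T *= 0.99                      # geometric cooling schedule
    print("best flow-rate setting found:", best)
    ```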

  12. On the study of control effectiveness and computational efficiency of reduced Saint-Venant model in model predictive control of open channel flow

    NASA Astrophysics Data System (ADS)

    Xu, M.; van Overloop, P. J.; van de Giesen, N. C.

    2011-02-01

    Model predictive control (MPC) of open channel flow is becoming an important tool in water management. The complexity of the prediction model has a large influence on the MPC application in terms of control effectiveness and computational efficiency. The Saint-Venant equations, called the SV model in this paper, and the Integrator Delay (ID) model are either accurate but computationally costly, or simple but restricted in the flow changes they allow. In this paper, a reduced Saint-Venant (RSV) model is developed through a model reduction technique, Proper Orthogonal Decomposition (POD), applied to the SV equations. The RSV model keeps the main flow dynamics and functions over a large flow range but is easier to implement in MPC. In the test case of a modeled canal reach, the numbers of states and disturbances in the RSV model are about 45 and 16 times smaller, respectively, than in the SV model. The computational time of MPC with the RSV model is significantly reduced, while the controller remains effective. Thus, the RSV model is a promising means to balance control effectiveness and computational efficiency.
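
    The core of the RSV construction, POD, reduces to a singular value decomposition of state snapshots. A minimal sketch follows (the Saint-Venant dynamics are replaced by an arbitrary stand-in matrix A, and the dimensions are illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, r = 200, 8                     # full and reduced state dimensions
    # Stand-in linearized dynamics (the real SV model would provide A).
    A = np.eye(n) + 0.01 * rng.standard_normal((n, n)) / np.sqrt(n)

    # Gather snapshots of the full model.
    x = rng.standard_normal(n)
    snaps = []
    for _ in range(300):
        x = A @ x
        snaps.append(x.copy())
    X = np.array(snaps).T             # n x n_snapshots snapshot matrix

    U, s, _ = np.linalg.svd(X, full_matrices=False)
    Phi = U[:, :r]                    # leading POD modes
    A_r = Phi.T @ A @ Phi             # r x r dynamics for use inside MPC

    kept = (s[:r] ** 2).sum() / (s ** 2).sum()
    print(f"{n} -> {r} states; snapshot energy captured: {kept:.1%}")
    ```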

  13. 76 FR 59241 - Foreign Futures and Options Contracts on a Non-Narrow-Based Security Index; Commission...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-26

    ... controls on trading; information and data relating to the index, including the design, computation and... futures contract raises novel or complex issues that require additional time for review, or if the foreign... composition, computation, or method of selection of component entities of an index referenced and defined in...

  14. EXTENDING THE REALM OF OPTIMIZATION FOR COMPLEX SYSTEMS: UNCERTAINTY, COMPETITION, AND DYNAMICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shanbhag, Uday V; Basar, Tamer; Meyn, Sean

    The research reported addressed the following topics: the development of analytical and algorithmic tools for distributed computation of Nash equilibria; synchronization in mean-field oscillator games, with an emphasis on learning and efficiency analysis; questions that combine learning and computation; questions involving stochastic and mean-field games; and modeling and control in the context of power markets.

  15. A Middleware Platform for Providing Mobile and Embedded Computing Instruction to Software Engineering Students

    ERIC Educational Resources Information Center

    Mattmann, C. A.; Medvidovic, N.; Malek, S.; Edwards, G.; Banerjee, S.

    2012-01-01

    As embedded software systems have grown in number, complexity, and importance in the modern world, a corresponding need to teach computer science students how to effectively engineer such systems has arisen. Embedded software systems, such as those that control cell phones, aircraft, and medical equipment, are subject to requirements and…

  16. Improved FFT-based numerical inversion of Laplace transforms via fast Hartley transform algorithm

    NASA Technical Reports Server (NTRS)

    Hwang, Chyi; Lu, Ming-Jeng; Shieh, Leang S.

    1991-01-01

    The disadvantages of numerical inversion of the Laplace transform via the conventional fast Fourier transform (FFT) are identified and an improved method is presented to remedy them. The improved method is based on introducing a new integration step length Δω = π/(mT) for trapezoidal-rule approximation of the Bromwich integral, in which a new parameter, m, is introduced for controlling the accuracy of the numerical integration. Naturally, this method leads to multiple sets of complex FFT computations. A new inversion formula is derived such that N equally spaced samples of the inverse Laplace transform function can be obtained by (m/2) + 1 sets of N-point complex FFT computations or by m sets of real fast Hartley transform (FHT) computations.
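
    The conventional FFT-based scheme that the article improves upon can be sketched as follows (this is the plain trapezoidal Bromwich inversion; the article's m-parameter refinement and FHT variant are not shown, and N, sigma, and the test transform are illustrative choices):

    ```python
    import numpy as np

    def ilaplace_fft(F, T, N=2048, sigma=1.0):
        """Trapezoidal Bromwich inversion on t in [0, T) via one complex FFT:
        f(t_k) ~ (dw/pi) * exp(sigma*t_k) * Re sum_j F(sigma + i*j*dw)
        * e^{i*2*pi*j*k/N}, with half weight on the j = 0 term."""
        dt = T / N
        dw = 2.0 * np.pi / (N * dt)          # so that dw*dt = 2*pi/N
        Fv = F(sigma + 1j * dw * np.arange(N))
        Fv[0] *= 0.5                         # trapezoidal half-weight at w = 0
        s = N * np.fft.ifft(Fv)              # sum with e^{+i 2 pi j k / N}
        t = dt * np.arange(N)
        return t, (dw / np.pi) * np.exp(sigma * t) * s.real

    # Check against a known pair: F(s) = 1/(s+1)  <->  f(t) = exp(-t).
    t, f = ilaplace_fft(lambda s: 1.0 / (s + 1.0), T=10.0)
    k = 100
    print(f"t = {t[k]:.3f}: numeric {f[k]:.3f}, exact {np.exp(-t[k]):.3f}")
    ```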

  17. Novel physical constraints on implementation of computational processes

    NASA Astrophysics Data System (ADS)

    Wolpert, David; Kolchinsky, Artemy

    Non-equilibrium statistical physics permits us to analyze computational processes, i.e., ways to drive a physical system such that its coarse-grained dynamics implements some desired map. It is now known how to implement any such desired computation without dissipating work, and what the minimal (dissipationless) work is that such a computation will require (the so-called "generalized Landauer bound"). We consider how these analyses change if we impose realistic constraints on the computational process. First, we analyze how many degrees of freedom of the system must be controlled, in addition to the ones specifying the information-bearing degrees of freedom, in order to avoid dissipating work during a given computation, when local detailed balance holds. We analyze this issue for deterministic computations, deriving a state-space vs. speed trade-off, and use our results to motivate a measure of the complexity of a computation. Second, we consider computations that are implemented with logic circuits, in which only a small number of degrees of freedom are coupled at a time. We show that the way a computation is implemented using circuits affects its minimal work requirements, and relate these minimal work requirements to information-theoretic measures of complexity.

  18. Robust pinning control of heterogeneous complex networks with jointly connected topologies and time-varying parametric uncertainty

    NASA Astrophysics Data System (ADS)

    Manfredi, Sabato

    2018-05-01

    Pinning/leader control problems concern the design of a leader or pinning controller that guides a complex network to a desired trajectory or target (synchronisation or consensus). For a time-invariant complex network, this includes choosing the controller gain and the number of nodes to pin. Usually, the lower the number of pinned nodes, the larger the pinning gain required to achieve network synchronisation. On the other hand, realistic application scenarios of complex networks are characterised by switching topologies and time-varying node coupling strengths and link weights, which make the pinning/leader control problem hard to solve. Additionally, the system dynamics at the nodes can be heterogeneous. In this paper, we derive robust stabilisation conditions for time-varying heterogeneous complex networks with jointly connected topologies when the coupling strengths and link-weight interactions are affected by time-varying uncertainties. By employing Lyapunov stability theory and the linear matrix inequality (LMI) technique, we formulate computationally undemanding stabilisability conditions for designing a pinning/leader control gain for robust network synchronisation. The effectiveness of the proposed approach is shown by several design examples applied to a paradigmatic, well-known complex network composed of heterogeneous Chua's circuits.

  19. Optimal control strategy for a novel computer virus propagation model on scale-free networks

    NASA Astrophysics Data System (ADS)

    Zhang, Chunming; Huang, Haitao

    2016-06-01

    This paper aims to study the combined impact of reinstalling systems and of network topology on the spread of computer viruses over the Internet. Based on a scale-free network, the paper proposes a novel computer virus propagation model, the SLBOS model. A systematic analysis of this new model shows that the virus-free equilibrium is globally asymptotically stable when the spreading threshold is less than one, whereas the viral equilibrium is proved to be permanent if the spreading threshold is greater than one. The impacts of different model parameters on the spreading threshold are then analyzed. Next, an optimally controlled SLBOS epidemic model on complex networks is also studied. We prove that an optimal control exists for the control problem. Some numerical simulations are finally given to illustrate the main results.

  20. Re-membering the body: applications of computational neuroscience to the top-down control of regeneration of limbs and other complex organs.

    PubMed

    Pezzulo, G; Levin, M

    2015-12-01

    A major goal of regenerative medicine and bioengineering is the regeneration of complex organs, such as limbs, and the capability to create artificial constructs (so-called biobots) with defined morphologies and robust self-repair capabilities. Developmental biology presents remarkable examples of systems that self-assemble and regenerate complex structures toward their correct shape despite significant perturbations. A fundamental challenge is to translate progress in molecular genetics into control of large-scale organismal anatomy, and the field is still searching for an appropriate theoretical paradigm for facilitating control of pattern homeostasis. However, computational neuroscience provides many examples in which cell networks - brains - store memories (e.g., of geometric configurations, rules, and patterns) and coordinate their activity towards proximal and distant goals. In this Perspective, we propose that programming large-scale morphogenesis requires exploiting the information processing by which cellular structures work toward specific shapes. In non-neural cells, as in the brain, bioelectric signaling implements information processing, decision-making, and memory in regulating pattern and its remodeling. Thus, approaches used in computational neuroscience to understand goal-seeking neural systems offer a toolbox of techniques to model and control regenerative pattern formation. Here, we review recent data on developmental bioelectricity as a regulator of patterning, and propose that target morphology could be encoded within tissues as a kind of memory, using the same molecular mechanisms and algorithms so successfully exploited by the brain. We highlight the next steps of an unconventional research program, which may allow top-down control of growth and form for numerous applications in regenerative medicine and synthetic bioengineering.

  1. ARACHNE: A neural-neuroglial network builder with remotely controlled parallel computing

    PubMed Central

    Rusakov, Dmitri A.; Savtchenko, Leonid P.

    2017-01-01

    Creating and running realistic models of neural networks has hitherto been a task for computing professionals rather than experimental neuroscientists. This is mainly because such networks usually engage substantial computational resources, the handling of which requires specific programming skills. Here we put forward a newly developed simulation environment, ARACHNE: it enables an investigator to build and explore cellular networks of arbitrary biophysical and architectural complexity using the logic of NEURON and a simple interface on a local computer or a mobile device. The interface can control, through the internet, an optimized computational kernel installed on a remote computer cluster. ARACHNE can combine neuronal (wired) and astroglial (extracellular volume-transmission driven) network types and adopt realistic cell models from the NEURON library. The program and documentation (current version) are available at the GitHub repository https://github.com/LeonidSavtchenko/Arachne under the MIT License (MIT). PMID:28362877

  2. Predictive Modeling and Computational Toxicology

    EPA Science Inventory

    Embryonic development is orchestrated via a complex series of cellular interactions controlling behaviors such as mitosis, migration, differentiation, adhesion, contractility, apoptosis, and extracellular matrix remodeling. Any chemical exposure that perturbs these cellular proce...

  3. Analysis hierarchical model for discrete event systems

    NASA Astrophysics Data System (ADS)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete-event networks for robotic systems. Under the hierarchical approach, the Petri net is analysed as a network spanning from the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for the modelling and control of complex robotic systems. Such a system is structured, controlled, and analysed here using the Visual Object Net++ package, which is relatively simple and easy to use and yields representations that are easy to interpret. The hierarchical structure of the robotic system is implemented and analysed on computers using specialized programs. Implementing the hierarchical discrete-event model as a real-time operating system on a computer network connected via a serial bus is possible, with each computer dedicated to the local Petri model of one subsystem of the global robotic system. Because Petri models are simple enough to run on general-purpose computers, the analysis, modelling, and control of complex manufacturing systems can be achieved with Petri nets, and discrete-event formalisms are a pragmatic tool for modelling industrial systems. To capture auxiliary times, the Petri model of the transport stream is divided into hierarchical levels whose sections are analysed successively. Simulating the proposed robotic system with timed Petri nets offers the opportunity to view the timing of the robot: from transport and transmission times measured on the spot, graphics are obtained showing the average time for the transport activity for given sets of finished products.
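
    The basic place/transition mechanics underlying such models fit in a short interpreter. The net below is a hypothetical three-transition robotic load cycle, not one from the paper; firing a transition consumes tokens from its input places and deposits them in its output places:

    ```python
    # Each transition: (tokens consumed per place, tokens produced per place).
    transitions = {
        "grip":    ({"part_ready": 1, "robot_idle": 1}, {"holding": 1}),
        "move":    ({"holding": 1},                     {"at_machine": 1}),
        "release": ({"at_machine": 1}, {"machine_loaded": 1, "robot_idle": 1}),
    }
    marking = {"part_ready": 3, "robot_idle": 1, "holding": 0,
               "at_machine": 0, "machine_loaded": 0}

    def enabled(name):
        pre, _ = transitions[name]
        return all(marking[p] >= k for p, k in pre.items())

    def fire(name):
        pre, post = transitions[name]
        for p, k in pre.items():
            marking[p] -= k
        for p, k in post.items():
            marking[p] += k

    # Fire until deadlock: all three parts end up loaded on the machine.
    while any(enabled(t) for t in transitions):
        fire(next(t for t in transitions if enabled(t)))
    print(marking)
    ```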

  4. Evolutionary Computation for the Identification of Emergent Behavior in Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Terrile, Richard J.; Guillaume, Alexandre

    2009-01-01

    Over the past several years the Center for Evolutionary Computation and Automated Design at the Jet Propulsion Laboratory has developed a technique based on Evolutionary Computational Methods (ECM) that allows for the automated optimization of complex computationally modeled systems. An important application of this technique is for the identification of emergent behaviors in autonomous systems. Mobility platforms such as rovers or airborne vehicles are now being designed with autonomous mission controllers that can find trajectories over a solution space that is larger than can reasonably be tested. It is critical to identify control behaviors that are not predicted and can have surprising results (both good and bad). These emergent behaviors need to be identified, characterized and either incorporated into or isolated from the acceptable range of control characteristics. We use cluster analysis of automatically retrieved solutions to identify isolated populations of solutions with divergent behaviors.

  5. Air traffic control : good progress on interim replacement for outage-plagued system, but risks can be further reduced

    DOT National Transportation Integrated Search

    1996-10-01

    Certain air traffic control (ATC) centers experienced a series of major outages, some of which were caused by the Display Channel Complex or DCC, a mainframe computer system that processes radar and other data into displayable images on controlle...

  6. An autonomous molecular computer for logical control of gene expression.

    PubMed

    Benenson, Yaakov; Gil, Binyamin; Ben-Dor, Uri; Adar, Rivka; Shapiro, Ehud

    2004-05-27

    Early biomolecular computer research focused on laboratory-scale, human-operated computers for complex computational problems. Recently, simple molecular-scale autonomous programmable computers were demonstrated allowing both input and output information to be in molecular form. Such computers, using biological molecules as input data and biologically active molecules as outputs, could produce a system for 'logical' control of biological processes. Here we describe an autonomous biomolecular computer that, at least in vitro, logically analyses the levels of messenger RNA species, and in response produces a molecule capable of affecting levels of gene expression. The computer operates at a concentration of close to a trillion computers per microlitre and consists of three programmable modules: a computation module, that is, a stochastic molecular automaton; an input module, by which specific mRNA levels or point mutations regulate software molecule concentrations, and hence automaton transition probabilities; and an output module, capable of controlled release of a short single-stranded DNA molecule. This approach might be applied in vivo to biochemical sensing, genetic engineering and even medical diagnosis and treatment. As a proof of principle we programmed the computer to identify and analyse mRNA of disease-related genes associated with models of small-cell lung cancer and prostate cancer, and to produce a single-stranded DNA molecule modelled after an anticancer drug.

  7. Communication and complexity in a GRN-based multicellular system for graph colouring.

    PubMed

    Buck, Moritz; Nehaniv, Chrystopher L

    2008-01-01

    Artificial Genetic Regulatory Networks (GRNs) are interesting control models through their simplicity and versatility. They can be easily implemented, evolved, and modified, and their similarity to their biological counterparts makes them interesting for simulations of life-like systems as well. These aspects suggest they may be perfect control systems for distributed computing in diverse situations, but to be usable for such applications the computational power and evolvability of GRNs need to be studied. In this research we propose a simple distributed system implementing GRNs to solve the well-known NP-complete graph colouring problem. Every node (cell) of the graph to be coloured is controlled by an instance of the same GRN. All the cells communicate directly with their immediate neighbours in the graph so as to set up a good colouring. The quality of this colouring directs the evolution of the GRNs using a genetic algorithm. We then observe the quality of the colouring for two different graphs according to different communication protocols and the number of different proteins in the cell (a measure of the possible complexity of a GRN). These two points, the main scalability issues raised by any computational paradigm, are then discussed.
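
    For comparison, the behaviour the evolved GRNs must achieve can be sketched as a purely local colouring protocol (this is a simple greedy conflict-repair rule, not the paper's GRN controller; the graph is an arbitrary example):

    ```python
    import random

    edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 4), (3, 4), (3, 5), (4, 5)]
    n = 6
    neigh = {v: set() for v in range(n)}
    for a, b in edges:
        neigh[a].add(b)
        neigh[b].add(a)

    colour = {v: 0 for v in range(n)}      # start with everyone in conflict
    while True:
        conflicted = [v for v in range(n)
                      if any(colour[u] == colour[v] for u in neigh[v])]
        if not conflicted:
            break
        v = random.choice(conflicted)      # asynchronous local repair
        used = {colour[u] for u in neigh[v]}
        colour[v] = min(c for c in range(n) if c not in used)
    print(colour, "| colours used:", len(set(colour.values())))
    ```

    Each repair step removes all conflicts at the chosen node without creating new ones, so the loop terminates in at most n steps; the interest of the paper lies in obtaining comparable behaviour from evolved regulatory dynamics rather than from an explicit rule.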

  8. On the Relevancy of Efficient, Integrated Computer and Network Monitoring in HEP Distributed Online Environment

    NASA Astrophysics Data System (ADS)

    Carvalho, D.; Gavillet, Ph.; Delgado, V.; Albert, J. N.; Bellas, N.; Javello, J.; Miere, Y.; Ruffinoni, D.; Smith, G.

    Large scientific equipment is controlled by computer systems whose complexity is growing, driven on the one hand by the volume and variety of the information, its distributed nature, and the sophistication of its treatment, and on the other hand by the fast evolution of the computer and network market. Some people call these systems generically Large-Scale Distributed Data Intensive Information Systems, or Distributed Computer Control Systems (DCCS) for those dealing more with real-time control. Taking advantage of (or forced by) the distributed architecture, the tasks are more and more often implemented as client-server applications. In this framework the monitoring of the computer nodes, the communications network, and the applications becomes of primary importance for ensuring the safe running and guaranteed performance of the system. With the future generation of HEP experiments, such as those at the LHC, in view, it is proposed to integrate the various functions of DCCS monitoring into one general-purpose multi-layer system.

  9. An Expressive, Lightweight and Secure Construction of Key Policy Attribute-Based Cloud Data Sharing Access Control

    NASA Astrophysics Data System (ADS)

    Lin, Guofen; Hong, Hanshu; Xia, Yunhao; Sun, Zhixin

    2017-10-01

    Attribute-based encryption (ABE) is an interesting cryptographic technique for flexible cloud data sharing access control. However, some open challenges hinder its practical application. In previous schemes, all attributes are treated as having the same status, although they do not in most practical scenarios. Meanwhile, the size of the access policy increases dramatically as its expressiveness grows. In addition, current research hardly notices that mobile front-end devices, such as smartphones, have poor computational performance, while ABE requires a great deal of bilinear pairing computation. In this paper, we propose a key-policy weighted attribute-based encryption scheme without bilinear pairing computation (KP-WABE-WB) for secure cloud data sharing access control. A simple weighted mechanism is presented to describe the different importance of each attribute. We introduce a novel construction of ABE that executes no bilinear pairing computation. Compared to previous schemes, our scheme performs better in the expressiveness of its access policy and in computational efficiency.

  10. Chromatin Computation

    PubMed Central

    Bryant, Barbara

    2012-01-01

    In living cells, DNA is packaged along with protein and RNA into chromatin. Chemical modifications to nucleotides and histone proteins are added, removed and recognized by multi-functional molecular complexes. Here I define a new computational model, in which chromatin modifications are information units that can be written onto a one-dimensional string of nucleosomes, analogous to the symbols written onto cells of a Turing machine tape, and chromatin-modifying complexes are modeled as read-write rules that operate on a finite set of adjacent nucleosomes. I illustrate the use of this “chromatin computer” to solve an instance of the Hamiltonian path problem. I prove that chromatin computers are computationally universal – and therefore more powerful than the logic circuits often used to model transcription factor control of gene expression. Features of biological chromatin provide a rich instruction set for efficient computation of nontrivial algorithms in biological time scales. Modeling chromatin as a computer shifts how we think about chromatin function, suggests new approaches to medical intervention, and lays the groundwork for the engineering of a new class of biological computing machines. PMID:22567109
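
    The paper's read-write formalism can be illustrated with a toy rule (the mark and the rule itself are invented for illustration, not taken from the paper): nucleosomes form a one-dimensional tape, and a reader-writer "complex" copies a mark onto adjacent unmarked nucleosomes, one sweep per step.

    ```python
    # Nucleosome tape with one active mark; a reader-writer rule spreads it.
    tape = ["H3K4me3"] + ["none"] * 8

    def spread(tape):
        """Where an active mark sits next to an unmarked nucleosome,
        the complex writes the mark onto that neighbour."""
        out = list(tape)
        for i, mark in enumerate(tape):
            if mark == "H3K4me3":
                for j in (i - 1, i + 1):
                    if 0 <= j < len(tape) and tape[j] == "none":
                        out[j] = "H3K4me3"
        return out

    steps = 0
    while "none" in tape:
        tape = spread(tape)
        steps += 1
    print(f"mark covered the fibre in {steps} sweeps")
    ```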

  11. Common data buffer system. [communication with computational equipment utilized in spacecraft operations

    NASA Technical Reports Server (NTRS)

    Byrne, F. (Inventor)

    1981-01-01

    A high speed common data buffer system is described for providing an interface and communications medium between a plurality of computers utilized in a distributed computer complex forming part of a checkout, command and control system for space vehicles and associated ground support equipment. The system includes the capability for temporarily storing data to be transferred between computers, for transferring a plurality of interrupts between computers, for monitoring and recording these transfers, and for correcting errors incurred in these transfers. Validity checks are made on each transfer and appropriate error notification is given to the computer associated with that transfer.

  12. Quasi-analytical treatment of spatially averaged radiation transfer in complex terrain

    NASA Astrophysics Data System (ADS)

    Löwe, H.; Helbig, N.

    2012-04-01

    We provide a new quasi-analytical method to compute the topographic influence on the effective albedo of complex topography as required for meteorological, land-surface or climate models. We investigate radiative transfer in complex terrain via the radiosity equation on isotropic Gaussian random fields. Under controlled approximations we derive expressions for domain averages of direct, diffuse and terrain radiation and the sky view factor. Domain averaged quantities are related to a type of level-crossing probability of the random field which is approximated by longstanding results developed for acoustic scattering at ocean boundaries. This allows us to express all non-local horizon effects in terms of a local terrain parameter, namely the mean squared slope. Emerging integrals are computed numerically and fit formulas are given for practical purposes. As an implication of our approach we provide an expression for the effective albedo of complex terrain in terms of the sun elevation angle, mean squared slope, the area averaged surface albedo, and the direct-to-diffuse ratio of solar radiation. As an application, we compute the effective albedo for the Swiss Alps and discuss possible generalizations of the method.

  13. The Use of a Computer-Controlled Random Access Slide Projector for Rapid Information Display.

    ERIC Educational Resources Information Center

    Muller, Mark T.

    A 35mm random access slide projector operated in conjunction with a computer terminal was adapted to meet the need for a more rapid and complex graphic display mechanism than is currently available with teletypewriter terminals. The model projector can be operated manually to provide for a maintenance checkout of the electromechanical system.…

  14. Next generation keyboards: The importance of cognitive compatibility

    NASA Technical Reports Server (NTRS)

    Amell, John R.; Ewry, Michael E.; Colle, Herbert A.

    1988-01-01

    The computer keyboard of today is essentially the same as it has been for many years. Few advances have been made in keyboard design even though computer systems in general have made remarkable progress in improvements. This paper discusses the future of keyboards, their competition and compatibility with voice input systems, and possible special-application intelligent keyboards for controlling complex systems.

  15. Metagram Software - A New Perspective on the Art of Computation.

    DTIC Science & Technology

    1981-10-01

    Keywords: computer programming, information and analysis, metagramming, philosophy, intelligence information systems, abstraction & metasystems. ...control would also serve well in the analysis of military and political intelligence, and in other areas where highly abstract methods of thought serve...needed in intelligence because several levels of abstraction are involved in a political or military system, because analysis entails a complex interplay

  16. A statistical learning strategy for closed-loop control of fluid flows

    NASA Astrophysics Data System (ADS)

    Guéniat, Florimond; Mathelin, Lionel; Hussaini, M. Yousuff

    2016-12-01

    This work discusses a closed-loop control strategy for complex systems utilizing scarce and streaming data. A discrete embedding space is first built using hash functions applied to the sensor measurements, from which a Markov process model is derived, approximating the complex system's dynamics. A control strategy is then learned using reinforcement learning once rewards relevant to the control objective are identified. This method is designed for experimental configurations, requiring neither computation nor prior knowledge of the system, and it enjoys intrinsic robustness. It is illustrated on two systems: the control of the transitions of a Lorenz'63 dynamical system, and the control of the drag of a cylinder flow. The method is shown to perform well.
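
    In the same spirit, though with every detail invented (the toy plant, the hash, and the gains below are placeholders, not the paper's configuration), discretizing sensor readings with a coarse hash and learning a tabular policy by Q-learning looks like this:

    ```python
    import random
    from collections import defaultdict

    def obs_hash(x, v):
        # Coarse hash of continuous sensor readings into a discrete state.
        return (round(x, 1), round(v, 1))

    Q = defaultdict(float)
    actions = (-1.0, 1.0)                  # push left / push right
    alpha, gamma, eps = 0.2, 0.95, 0.1

    for episode in range(500):
        x, v = random.uniform(-1, 1), 0.0
        for _ in range(100):
            s = obs_hash(x, v)
            a = (random.choice(actions) if random.random() < eps
                 else max(actions, key=lambda b: Q[s, b]))
            v += 0.05 * a - 0.02 * v       # toy dynamics with drag
            x += 0.05 * v
            s2 = obs_hash(x, v)
            reward = -abs(x)               # reward staying near the origin
            best_next = max(Q[s2, b] for b in actions)
            Q[s, a] += alpha * (reward + gamma * best_next - Q[s, a])
    print("states discovered:", len({k[0] for k in Q}))
    ```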

  17. Automated Design of Complex Dynamic Systems

    PubMed Central

    Hermans, Michiel; Schrauwen, Benjamin; Bienstman, Peter; Dambre, Joni

    2014-01-01

    Several fields of study are concerned with uniting the concept of computation with that of the design of physical systems. For example, a recent trend in robotics is to design robots in such a way that they require a minimal control effort. Another example is found in the domain of photonics, where recent efforts try to benefit directly from complex nonlinear dynamics to achieve more efficient signal processing. The underlying goal of these and similar research efforts is to internalize a large part of the necessary computations within the physical system itself by exploiting its inherent non-linear dynamics. This, however, often requires the optimization of large numbers of system parameters, related to both the system's structure as well as its material properties. In addition, many of these parameters are subject to fabrication variability or to variations through time. In this paper we apply a machine learning algorithm to optimize physical dynamic systems. We show that such algorithms, which are normally applied to abstract computational entities, can be extended to the field of differential equations and used to optimize an associated set of parameters which determine their behavior. We show that machine learning training methodologies are highly useful in designing robust systems, and we provide a set of both simple and complex examples using models of physical dynamical systems. Interestingly, the derived optimization method is intimately related to direct collocation, a method known in the field of optimal control. Our work suggests that the application domains of both machine learning and optimal control have a largely unexplored overlapping area which envelopes a novel design methodology of smart and highly complex physical systems. PMID:24497969

  18. Dynamic Feed Control For Injection Molding

    DOEpatents

    Kazmer, David O.

    1996-09-17

    The invention provides methods and apparatus in which mold material flows through a gate into a mold cavity that defines the shape of a desired part. An adjustable valve is provided that is operable to change dynamically the effective size of the gate to control the flow of mold material through the gate. The valve is adjustable while the mold material is flowing through the gate into the mold cavity. A sensor is provided for sensing a process condition while the part is being molded. During molding, the valve is adjusted based at least in part on information from the sensor. In the preferred embodiment, the adjustable valve is controlled by a digital computer, which includes circuitry for acquiring data from the sensor, processing circuitry for computing a desired position of the valve based on the data from the sensor and a control data file containing target process conditions, and control circuitry for generating signals to control a valve driver to adjust the position of the valve. More complex embodiments include a plurality of gates, sensors, and controllable valves. Each valve is individually controllable so that process conditions corresponding to each gate can be adjusted independently. This allows for great flexibility in the control of injection molding to produce complex, high-quality parts.

  19. STARS: An integrated general-purpose finite element structural, aeroelastic, and aeroservoelastic analysis computer program

    NASA Technical Reports Server (NTRS)

    Gupta, Kajal K.

    1991-01-01

    The details of an integrated general-purpose finite element structural analysis computer program which is also capable of solving complex multidisciplinary problems is presented. Thus, the SOLIDS module of the program possesses an extensive finite element library suitable for modeling most practical problems and is capable of solving statics, vibration, buckling, and dynamic response problems of complex structures, including spinning ones. The aerodynamic module, AERO, enables computation of unsteady aerodynamic forces for both subsonic and supersonic flow for subsequent flutter and divergence analysis of the structure. The associated aeroservoelastic analysis module, ASE, effects aero-structural-control stability analysis yielding frequency responses as well as damping characteristics of the structure. The program is written in standard FORTRAN to run on a wide variety of computers. Extensive graphics, preprocessing, and postprocessing routines are also available pertaining to a number of terminals.

  20. Two-photon quantum walk in a multimode fiber

    PubMed Central

    Defienne, Hugo; Barbieri, Marco; Walmsley, Ian A.; Smith, Brian J.; Gigan, Sylvain

    2016-01-01

    Multiphoton propagation in connected structures—a quantum walk—offers the potential of simulating complex physical systems and provides a route to universal quantum computation. Increasing the complexity of quantum photonic networks where the walk occurs is essential for many applications. We implement a quantum walk of indistinguishable photon pairs in a multimode fiber supporting 380 modes. Using wavefront shaping, we control the propagation of the two-photon state through the fiber in which all modes are coupled. Excitation of arbitrary output modes of the system is realized by controlling classical and quantum interferences. This report demonstrates a highly multimode platform for multiphoton interference experiments and provides a powerful method to program a general high-dimensional multiport optical circuit. This work paves the way for the next generation of photonic devices for quantum simulation, computing, and communication. PMID:27152325

  1. Growth monitoring and control in complex medium: a case study employing fed-batch penicillin fermentation and computer-aided on-line mass balancing.

    PubMed

    Mou, D G; Cooney, C L

    1983-01-01

    To broaden the practicality of on-line growth monitoring and control, its application in fed-batch penicillin fermentation using a high corn steep liquor (CSL) concentration (53 g/L) is demonstrated. By employing a calculation method that considers the vagaries of CSL consumption, overall and instantaneous carbon-balancing equations are successfully used to calculate, on-line, the cell concentration and the instantaneous specific growth rate in the penicillin production phase. As a consequence, these equations, together with a feedback control strategy, enable computer control of the glucose feed and maintenance of the preselected production-phase growth rate with an error of less than 0.002 h(-1).
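
    The logic of on-line carbon balancing can be caricatured in a few lines. In the sketch below, every coefficient is a hypothetical placeholder rather than one of the paper's calibrated values: biomass is estimated from the difference between carbon fed and carbon leaving as CO2, and the feed is nudged to hold a target specific growth rate.

    ```python
    # All coefficients are hypothetical placeholders, not the paper's values.
    Y_xc = 2.0                # g biomass per g carbon assimilated
    mu_target = 0.01          # target specific growth rate [1/h]
    dt = 0.25                 # balancing interval [h]

    X, feed = 5.0, 1.0        # biomass [g/L], carbon feed rate [g C/(L h)]
    for _ in range(200):
        co2 = 0.6 * feed + 0.005 * X       # toy "measured" CO2 evolution
        to_biomass = max(0.0, feed - co2)  # instantaneous carbon balance
        dX = Y_xc * to_biomass * dt
        mu = dX / (X * dt)                 # instantaneous specific growth rate
        X += dX
        feed *= 1.0 + 0.5 * (mu_target - mu)   # feedback on the feed rate
    print(f"biomass {X:.1f} g/L, growth-rate error {mu - mu_target:+.4f} 1/h")
    ```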

  2. Large space structure damping design

    NASA Technical Reports Server (NTRS)

    Pilkey, W. D.; Haviland, J. K.

    1983-01-01

    Several FORTRAN subroutines and programs were developed which compute complex eigenvalues of a damped system using different approaches, and which rescale mode shapes to unit generalized mass and make rigid bodies orthogonal to each other. An analytical proof of a Minimum Constrained Frequency Criterion (MCFC) for a single damper is presented. A method to minimize the effect of control spill-over for large space structures is proposed. The characteristic equation of an undamped system with a generalized control law is derived using reanalysis theory. This equation can be implemented in computer programs for efficient eigenvalue analysis or control quasi synthesis. Methods to control vibrations in large space structure are reviewed and analyzed. The resulting prototype, using electromagnetic actuator, is described.

  3. Reliability history of the Apollo guidance computer

    NASA Technical Reports Server (NTRS)

    Hall, E. C.

    1972-01-01

    The Apollo guidance computer was designed to provide the computation necessary for guidance, navigation and control of the command module and the lunar landing module of the Apollo spacecraft. The computer was designed using the technology of the early 1960's and the production was completed by 1969. During the development, production, and operational phase of the program, the computer has accumulated a very interesting history which is valuable for evaluating the technology, production methods, system integration, and the reliability of the hardware. The operational experience in the Apollo guidance systems includes 17 computers which flew missions and another 26 flight type computers which are still in various phases of prelaunch activity including storage, system checkout, prelaunch spacecraft checkout, etc. These computers were manufactured and maintained under very strict quality control procedures with requirements for reporting and analyzing all indications of failure. Probably no other computer or electronic equipment with equivalent complexity has been as well documented and monitored. Since it has demonstrated a unique reliability history, it is important to evaluate the techniques and methods which have contributed to the high reliability of this computer.

  4. On Chaotic and Hyperchaotic Complex Nonlinear Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Mahmoud, Gamal M.

    Dynamical systems described by real and complex variables are currently one of the most popular areas of scientific research. These systems play an important role in several fields of physics, engineering, and computer science, for example laser systems, control (or chaos suppression), secure communications, and information science. Their basic dynamical properties, chaos (hyperchaos) synchronization, chaos control, and the generation of hyperchaotic behavior are briefly summarized. The main advantage of introducing complex variables is the reduction of phase-space dimensions by half. Complex variables are also used to describe and simulate the physics of detuned lasers and thermal convection of liquid flows, where the electric field and the atomic polarization amplitudes are both complex. Clearly, if the variables of the system are complex, the equations involve twice as many variables and control parameters, thus making it that much harder for a hostile agent to intercept and decipher a coded message. Chaotic and hyperchaotic complex systems are presented as examples. Finally, there are many open problems in the study of chaotic and hyperchaotic complex nonlinear dynamical systems which need further investigation; some of these open problems are given.
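
    A standard example of such a system is the complex Lorenz model (Fowler, Gibbon and McGuinness), in which x and y are complex and z is real, so five real dimensions are written as three equations; the parameter values below are illustrative rather than tied to a particular physical fit:

    ```python
    import numpy as np

    # Complex Lorenz equations; x, y complex, z real.
    sigma, b = 10.0, 8.0 / 3.0
    r = 28.0 + 1.0j            # complex Rayleigh-like parameter
    a = 1.0 - 0.5j             # complex damping parameter

    x, y, z = 1.0 + 0.0j, 0.0 + 0.0j, 0.0
    dt = 0.001
    for _ in range(50000):     # forward-Euler integration
        dx = sigma * (y - x)
        dy = (r - z) * x - a * y
        dz = (np.conj(x) * y + x * np.conj(y)).real / 2.0 - b * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
    print(f"|x| = {abs(x):.2f}, |y| = {abs(y):.2f}, z = {z:.2f}")
    ```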

  5. Enhanced job control language procedures for the SIMSYS2D two-dimensional water-quality simulation system

    USGS Publications Warehouse

    Karavitis, G.A.

    1984-01-01

    The SIMSYS2D two-dimensional water-quality simulation system is a large-scale digital modeling software system used to simulate flow and transport of solutes in freshwater and estuarine environments. Due to the size, processing requirements, and complexity of the system, there is a need to easily move the system and its associated files between computer sites when required. A series of job control language (JCL) procedures was written to allow transferability between IBM and IBM-compatible computers. (USGS)

  6. Design and Effectiveness of Intelligent Tutors for Operators of Complex Dynamic Systems: A Tutor Implementation for Satellite System Operators.

    ERIC Educational Resources Information Center

    Mitchell, Christine M.; Govindaraj, T.

    1990-01-01

    Discusses the use of intelligent tutoring systems as opposed to traditional on-the-job training for training operators of complex dynamic systems and describes the computer architecture for a system for operators of a NASA (National Aeronautics and Space Administration) satellite control system. An experimental evaluation with college students is…

  7. Electroencephalography(EEG)-based instinctive brain-control of a quadruped locomotion robot.

    PubMed

    Jia, Wenchuan; Huang, Dandan; Luo, Xin; Pu, Huayan; Chen, Xuedong; Bai, Ou

    2012-01-01

    Artificial intelligence and bionic control have been applied in electroencephalography (EEG)-based robot systems to execute complex brain-control tasks. Nevertheless, due to technical limitations of EEG decoding, the brain-computer interface (BCI) protocol is often complex, and the mapping between EEG signals and the practical instructions lacks a logical association, which restricts actual use. This paper presents a strategy that can be used to control a quadruped locomotion robot through the user's instinctive actions, based on five kinds of movement-related neurophysiological signals. In actual use, the user performs or imagines limb/wrist actions to generate EEG signals that adjust the real movement of the robot according to his or her own motor reflex to the robot's locomotion. This method is easy to use in practice, as the user generates the brain-control signal through instinctive reactions. By adopting behavioral control with learning and evolution based on the proposed strategy, complex movement tasks may be realized through instinctive brain-control.

  8. Control and instanton trajectories for random transitions in turbulent flows

    NASA Astrophysics Data System (ADS)

    Bouchet, Freddy; Laurie, Jason; Zaboronski, Oleg

    2011-12-01

    Many turbulent systems exhibit random switches between qualitatively different attractors. The transition between these bistable states is often an extremely rare event that cannot be computed through DNS due to complexity limitations. We present results for the calculation of instanton trajectories (a control problem) between non-equilibrium stationary states (attractors) in the 2D stochastic Navier-Stokes equations. By representing the transition probability between two states using a path-integral formulation, we can compute the most probable trajectory (instanton) joining two non-equilibrium stationary states. Technically, this is equivalent to the minimization of an action, which can be related to a fluid-mechanics control problem.
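
    The structure of the computation can be shown on a one-dimensional caricature (the paper works with 2D stochastic Navier-Stokes; the double-well drift, horizon, and discretization below are toy choices): discretize the Freidlin-Wentzell action and minimize it over paths pinned to the two attractors.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    N, Ttot = 100, 20.0
    dt = Ttot / N

    def action(interior):
        """Discretized Freidlin-Wentzell action for drift f(x) = x - x^3,
        with the path pinned to the attractors x = -1 and x = +1."""
        x = np.concatenate(([-1.0], interior, [1.0]))
        v = np.diff(x) / dt
        f = x[:-1] - x[:-1] ** 3
        return 0.5 * np.sum((v - f) ** 2) * dt

    x0 = np.linspace(-1.0, 1.0, N + 1)[1:-1]       # straight-line guess
    res = minimize(action, x0, method="L-BFGS-B")
    # For this gradient system the continuum minimal action is
    # 2*(V(saddle) - V(well)) = 0.5, approached by the discrete minimum.
    print(f"action: guess {action(x0):.3f} -> instanton {res.fun:.3f}")
    ```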

  9. An Upgrade of the Aeroheating Software ''MINIVER''

    NASA Technical Reports Server (NTRS)

    Louderback, Pierce

    2013-01-01

    Detailed computational modeling: CFD is often used to create and execute computational domains; complexity increases when moving from 2D to 3D geometries; computational time increases as finer grids are used (for accuracy); a strong tool, but one that takes time to set up and run. MINIVER: uses theoretical and empirical correlations; orders of magnitude faster to set up and run; not as accurate as CFD, but gives reasonable estimations. MINIVER's drawbacks: a rigid command-line interface; lackluster, unorganized documentation; no central control, so multiple versions exist and have diverged.

  10. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Volume 2: Baseline architecture report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  11. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Phased development plan

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is made up of computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  12. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Volume 1: Baseline architecture report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is made up of the computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  13. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Operations concept report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is made up of computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  14. Intelligent vision system for autonomous vehicle operations

    NASA Technical Reports Server (NTRS)

    Scholl, Marija S.

    1991-01-01

    A complex optical system is described, consisting of a 4f optical correlator with programmatic filters under the control of a digital on-board computer that operates at video rates for filter generation, storage, and management.

  15. Compressed quantum computation using a remote five-qubit quantum computer

    NASA Astrophysics Data System (ADS)

    Hebenstreit, M.; Alsina, D.; Latorre, J. I.; Kraus, B.

    2017-05-01

    The notion of compressed quantum computation is employed to simulate the Ising interaction of a one-dimensional chain consisting of n qubits using the universal IBM cloud quantum computer running on log2(n) qubits. The external field parameter that controls the quantum phase transition of this model translates into particular settings of the quantum gates that generate the circuit. We measure the magnetization, which displays the quantum phase transition, on a two-qubit system, which simulates a four-qubit Ising chain, and show its agreement with the theoretical prediction within a certain error. We also discuss the relevant point of how to assess errors when using a cloud quantum computer with a limited number of runs. As a solution, we propose to use validating circuits, that is, to run independent controlled quantum circuits of similar complexity to the circuit of interest.

  16. Video Feedback in the Classroom: Development of an Easy-to-Use Learning Environment

    ERIC Educational Resources Information Center

    De Poorter, John; De Jaegher, Lut; De Cock, Mieke; Neuttiens, Tom

    2007-01-01

    Video feedback offers great potential for use in teaching but the relative complexity of the normal set-up of a video camera, a special tripod and a monitor has limited its use in teaching. The authors have developed a computer-webcam set-up which simplifies this. Anyone with an ordinary computer and webcam can learn to control the video feedback…

  17. Translations on Eastern Europe, Scientific Affairs, Number 542.

    DTIC Science & Technology

    1977-04-18

    transplanting human tissue has not as yet been given a final juridical approval like euthanasia, artificial insemination, abortion, birth control, and others...and data teleprocessing. This computer may also be used as a satellite computer for complex systems. The IZOT 310 has a large instruction...a well-known truth that modern science is using the most modern and leading technical facilities—from bathyscaphes to satellites, from gigantic

  18. Reliability/safety analysis of a fly-by-wire system

    NASA Technical Reports Server (NTRS)

    Brock, L. D.; Goodman, H. A.

    1980-01-01

    An analysis technique has been developed to estimate the reliability of a very complex, safety-critical system by constructing a diagram of the reliability equations for the total system. This diagram has many of the characteristics of a fault-tree or success-path diagram, but is much easier to construct for complex redundant systems. The diagram provides insight into system failure characteristics and identifies the most likely failure modes. A computer program aids in the construction of the diagram and the computation of reliability. Analysis of the NASA F-8 Digital Fly-by-Wire Flight Control System is used to illustrate the technique.
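
    As a generic illustration of the kind of computation such reliability diagrams feed (not the paper's actual tool or the F-8 system model), the sketch below evaluates the failure probability of a hypothetical triplex-redundant channel with majority voting, using the standard 2-out-of-3 formula; the failure rate is an assumed value.

```python
import math

def channel_reliability(lam, t):
    """Reliability of one channel with constant failure rate lam (per hour)."""
    return math.exp(-lam * t)

def triplex_majority(r):
    """Probability that at least 2 of 3 independent channels survive."""
    return 3 * r**2 - 2 * r**3

lam = 1e-4   # hypothetical failure rate, failures/hour
for t in (1, 10, 100):
    r = channel_reliability(lam, t)
    print(f"t={t:4d} h  single fail={1 - r:.3e}  triplex fail={1 - triplex_majority(r):.3e}")
```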

  19. [Complex treatment of patients with cholangiogenic hepatic abscess].

    PubMed

    Nychytaĭlo, M Iu; Skums, A V; Medvets'kyĭ, Ie B; Ohorodnyk, P V; Mashkovs'kyĭ, H Iu; Shkarban, V P; Shkarban, P O; Farzolakh, Mekhraban Jafarlu

    2005-07-01

    Results of treatment of 47 patients with cholangiogenic hepatic abscess were analyzed. Clinical, laboratory, and special methods of investigation were applied for diagnosis. The authors consider ultrasound investigation (USI), computed tomography, and abscess puncture under USI control, with subsequent cytological and bacteriological examination, the decisive methods in the diagnosis of hepatic abscess. In the complex treatment of these patients, miniinvasive technologies were applied according to indications: abscess puncture, drainage of the abscess cavity under USI control, percutaneous transhepatic cholangiostomy, and endoscopic papillosphincterotomy with lithotripsy and nasobiliary drainage. The efficacy of abscess cavity sanation using miramistinum and decasan was proved. Directed transport of medicines was applied as part of general therapy.

  20. Controlling Complex Systems and Developing Dynamic Technology

    NASA Astrophysics Data System (ADS)

    Avizienis, Audrius Victor

    In complex systems, control and understanding become intertwined. Following Ilya Prigogine, we define complex systems as having control parameters which mediate transitions between distinct modes of dynamical behavior. From this perspective, determining the nature of control parameters and demonstrating the associated dynamical phase transitions are practically equivalent and fundamental to engaging with complexity. In the first part of this work, a control parameter is determined for a non-equilibrium electrochemical system by studying a transition in the morphology of structures produced by an electroless deposition reaction. Specifically, changing the size of copper posts used as the substrate for growing metallic silver structures by the reduction of Ag+ from solution under diffusion-limited reaction conditions causes a dynamical phase transition in the crystal growth process. For Cu posts with edge lengths on the order of one micron, local forces promoting anisotropic growth predominate, and the reaction produces interconnected networks of Ag nanowires. As the post size is increased above 10 microns, the local interfacial growth reaction dynamics couple with the macroscopic diffusion field, leading to spatially propagating instabilities in the electrochemical potential which induce periodic branching during crystal growth, producing dendritic deposits. This result is interesting both as an example of control and understanding in a complex system, and as a useful combination of top-down lithography with bottom-up electrochemical self-assembly. The second part of this work focuses on the technological development of devices fabricated using this non-equilibrium electrochemical process, towards a goal of integrating a complex network as a dynamic functional component in a neuromorphic computing device. Self-assembled networks of silver nanowires were reacted with sulfur to produce interfacial "atomic switches": silver-silver sulfide junctions, which exhibit complex dynamics (e.g. both short- and long-term changes in conductivity) in response to applied voltage signals. Characterization of these atomic switch networks (ASNs) brought out interesting parallels to biological neural networks, including power-law scaling in the statistics of electrical signal propagation and dynamic self-organization of differentiated subnetworks. A reservoir computing (RC) strategy was employed to utilize measurements of electrical signals dynamically generated in ASNs to perform time-series memory and manipulation tasks including a parity test and arbitrary waveform generation. These results represent the useful integration of a complex network into a dynamic physical RC device.
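
    The reservoir computing strategy mentioned above can be illustrated in simulation. The sketch below substitutes a random recurrent network for the physical atomic switch network and trains only a linear readout, by ridge regression, on a delayed-parity task similar to the parity test described; all sizes and parameters are illustrative assumptions, not values from the work itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in reservoir: a random recurrent network plays the
# role the physical atomic-switch network plays in the work above.
N, T, k = 100, 2000, 2           # reservoir size, timesteps, parity window
u = rng.integers(0, 2, T)        # random binary input stream
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1
w_in = rng.normal(0, 1, N)

x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])
    states[t] = x

# Target: parity (XOR) of the last k input bits
y = np.array([u[t - k + 1:t + 1].sum() % 2 for t in range(k - 1, T)])
S = states[k - 1:]

# Linear readout via ridge regression -- the only trained part
# (evaluated in-sample here, for brevity)
ridge = 1e-6
w_out = np.linalg.solve(S.T @ S + ridge * np.eye(N), S.T @ (2 * y - 1))
pred = (S @ w_out > 0).astype(int)
print("parity accuracy:", (pred == y).mean())
```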

  1. An autonomous molecular computer for logical control of gene expression

    PubMed Central

    Benenson, Yaakov; Gil, Binyamin; Ben-Dor, Uri; Adar, Rivka; Shapiro, Ehud

    2013-01-01

    Early biomolecular computer research focused on laboratory-scale, human-operated computers for complex computational problems1–7. Recently, simple molecular-scale autonomous programmable computers were demonstrated8–15 allowing both input and output information to be in molecular form. Such computers, using biological molecules as input data and biologically active molecules as outputs, could produce a system for ‘logical’ control of biological processes. Here we describe an autonomous biomolecular computer that, at least in vitro, logically analyses the levels of messenger RNA species, and in response produces a molecule capable of affecting levels of gene expression. The computer operates at a concentration of close to a trillion computers per microlitre and consists of three programmable modules: a computation module, that is, a stochastic molecular automaton12–17; an input module, by which specific mRNA levels or point mutations regulate software molecule concentrations, and hence automaton transition probabilities; and an output module, capable of controlled release of a short single-stranded DNA molecule. This approach might be applied in vivo to biochemical sensing, genetic engineering and even medical diagnosis and treatment. As a proof of principle we programmed the computer to identify and analyse mRNA of disease-related genes18–22 associated with models of small-cell lung cancer and prostate cancer, and to produce a single-stranded DNA molecule modelled after an anticancer drug. PMID:15116117
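
    To make the "stochastic molecular automaton" idea concrete, the toy simulation below runs a two-state Yes/No automaton whose per-step survival probabilities stand in for the mRNA-regulated transition probabilities of the computation module; the probabilities and the all-symptoms-must-pass diagnostic rule are illustrative assumptions, not the paper's protocol.

```python
import random

random.seed(0)

def diagnose(p_yes_per_symptom, trials=10000):
    """Each symptom check keeps the Yes path alive with probability p;
    the 'output module' (drug release) fires only if all checks pass."""
    releases = 0
    for _ in range(trials):
        state = "yes"
        for p in p_yes_per_symptom:
            if random.random() > p:      # stochastic Yes -> No transition
                state = "no"
                break
        releases += (state == "yes")
    return releases / trials

# High marker levels -> high per-step survival probability (hypothetical values)
print("disease-like input:", diagnose([0.95, 0.95, 0.90]))
print("healthy-like input:", diagnose([0.20, 0.95, 0.90]))
```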

  2. Extending the Capabilities of Closed-loop Distributed Engine Control Simulations Using LAN Communication

    NASA Technical Reports Server (NTRS)

    Aretskin-Hariton, Eliot D.; Zinnecker, Alicia Mae; Culley, Dennis E.

    2014-01-01

    Distributed Engine Control (DEC) is an enabling technology that has the potential to advance the state-of-the-art in gas turbine engine control. To analyze the capabilities that DEC offers, a Hardware-In-the-Loop (HIL) test bed is being developed at NASA Glenn Research Center. This test bed will support a systems-level analysis of control capabilities in closed-loop engine simulations. The structure of the HIL emulates a virtual test cell by implementing the operator functions, control system, and engine on three separate computers. This implementation increases the flexibility and extensibility of the HIL. Here, a method is discussed for implementing these interfaces by connecting the three platforms over a dedicated Local Area Network (LAN). This approach is verified using the Commercial Modular Aero-Propulsion System Simulation 40k (C-MAPSS40k), which is typically implemented on one computer. There are marginal differences between the results from simulation of the typical and the three-computer implementation. Additional analysis of the LAN network, including characterization of network load, packet drop, and latency, is presented. The three-computer setup supports the incorporation of complex control models and proprietary engine models into the HIL framework.
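
    The abstract does not specify the LAN protocol or message layout, but one way to exchange state between two of the three nodes (say, controller and engine model) can be sketched with UDP datagrams, as below; the addresses, ports, and signal names are hypothetical.

```python
import socket
import struct

# Hypothetical address of the engine-model node; only the
# controller <-> engine link is sketched here.
ENGINE_ADDR = ("192.168.1.20", 5005)

def send_actuator_cmd(sock, fuel_flow, vane_angle):
    """Pack two floats (network byte order) and send to the engine node."""
    sock.sendto(struct.pack("!dd", fuel_flow, vane_angle), ENGINE_ADDR)

def recv_sensor_frame(sock):
    """Block for one sensor frame (e.g., spool speed, pressure, temperature)."""
    data, _ = sock.recvfrom(1024)
    return struct.unpack("!ddd", data)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 5006))      # controller's local port
send_actuator_cmd(sock, 1.25, 30.0)
# n1, p3, t4 = recv_sensor_frame(sock)   # would block until the engine node replies
```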

  3. Intrinsic evolution of controllable oscillators in FPTA-2

    NASA Technical Reports Server (NTRS)

    Sekanina, Lukas; Zebulum, Ricardo S.

    2005-01-01

    Simple one- and two-bit controllable oscillators were intrinsically evolved using only four cells of the Field Programmable Transistor Array (FPTA-2). These oscillators can produce different oscillations for different settings of the control signals. Therefore, they could be used, in principle, to compose complex networks of oscillators that could exhibit rich dynamical behavior in order to perform a computation or to model a desired system.

  4. Computational Models and Emergent Properties of Respiratory Neural Networks

    PubMed Central

    Lindsey, Bruce G.; Rybak, Ilya A.; Smith, Jeffrey C.

    2012-01-01

    Computational models of the neural control system for breathing in mammals provide a theoretical and computational framework bringing together experimental data obtained from different animal preparations under various experimental conditions. Many of these models were developed in parallel and iteratively with experimental studies and provided predictions guiding new experiments. This data-driven modeling approach has advanced our understanding of respiratory network architecture and neural mechanisms underlying generation of the respiratory rhythm and pattern, including their functional reorganization under different physiological conditions. Models reviewed here vary in neurobiological details and computational complexity and span multiple spatiotemporal scales of respiratory control mechanisms. Recent models describe interacting populations of respiratory neurons spatially distributed within the Bötzinger and pre-Bötzinger complexes and rostral ventrolateral medulla that contain core circuits of the respiratory central pattern generator (CPG). Network interactions within these circuits along with intrinsic rhythmogenic properties of neurons form a hierarchy of multiple rhythm generation mechanisms. The functional expression of these mechanisms is controlled by input drives from other brainstem components, including the retrotrapezoid nucleus and pons, which regulate the dynamic behavior of the core circuitry. The emerging view is that the brainstem respiratory network has rhythmogenic capabilities at multiple levels of circuit organization. This allows flexible, state-dependent expression of different neural pattern-generation mechanisms under various physiological conditions, enabling a wide repertoire of respiratory behaviors. Some models consider control of the respiratory CPG by pulmonary feedback and network reconfiguration during defensive behaviors such as cough. Future directions in modeling of the respiratory CPG are considered. PMID:23687564
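
    A minimal rate-model sketch of one rhythmogenic mechanism discussed in such models, mutual inhibition plus slow adaptation (a half-center oscillator), is given below; the parameters are illustrative and are not taken from any model in the review.

```python
import numpy as np

# Two rate units with cross-inhibition and slow adaptation can produce
# alternating (inspiratory/expiratory-like) bursts.
dt, T = 0.001, 20.0
n = int(T / dt)
x = np.array([0.1, 0.0])     # firing-rate variables
a = np.zeros(2)              # slow adaptation variables
tau_x, tau_a = 0.05, 1.0
w_inh, g_a, drive = 2.0, 1.5, 1.0

relu = lambda v: np.maximum(v, 0.0)
trace = np.zeros((n, 2))
for k in range(n):
    inp = drive - w_inh * x[::-1] - g_a * a   # cross-inhibition + adaptation
    x += dt / tau_x * (-x + relu(inp))
    a += dt / tau_a * (-a + x)
    trace[k] = x

# Alternation check: the two rates should be anticorrelated
print("rate correlation:", np.corrcoef(trace[:, 0], trace[:, 1])[0, 1].round(2))
```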

  5. Parietal neural prosthetic control of a computer cursor in a graphical-user-interface task

    NASA Astrophysics Data System (ADS)

    Revechkis, Boris; Aflalo, Tyson NS; Kellis, Spencer; Pouratian, Nader; Andersen, Richard A.

    2014-12-01

    Objective. To date, the majority of Brain-Machine Interfaces have been used to perform simple tasks with sequences of individual targets in otherwise blank environments. In this study we developed a more practical and clinically relevant task that approximated modern computers and graphical user interfaces (GUIs). This task could be problematic given the known sensitivity of areas typically used for BMIs to visual stimuli, eye movements, decision-making, and attentional control. Consequently, we sought to assess the effect of a complex, GUI-like task on the quality of neural decoding. Approach. A male rhesus macaque monkey was implanted with two 96-channel electrode arrays in area 5d of the superior parietal lobule. The animal was trained to perform a GUI-like ‘Face in a Crowd’ task on a computer screen that required selecting one cued, icon-like, face image from a group of alternatives (the ‘Crowd’) using a neurally controlled cursor. We assessed whether the crowd affected decodes of intended cursor movements by comparing it to a ‘Crowd Off’ condition in which only the matching target appeared without alternatives. We also examined if training a neural decoder with the Crowd On rather than Off had any effect on subsequent decode quality. Main results. Despite the additional demands of working with the Crowd On, the animal was able to robustly perform the task under Brain Control. The presence of the crowd did not itself affect decode quality. Training the decoder with the Crowd On relative to Off had no negative influence on subsequent decoding performance. Additionally, the subject was able to gaze around freely without influencing cursor position. Significance. Our results demonstrate that area 5d recordings can be used for decoding in a complex, GUI-like task with free gaze. Thus, this area is a promising source of signals for neural prosthetics that utilize computing devices with GUI interfaces, e.g. personal computers, mobile devices, and tablet computers.

  6. Parietal neural prosthetic control of a computer cursor in a graphical-user-interface task.

    PubMed

    Revechkis, Boris; Aflalo, Tyson N S; Kellis, Spencer; Pouratian, Nader; Andersen, Richard A

    2014-12-01

    To date, the majority of Brain-Machine Interfaces have been used to perform simple tasks with sequences of individual targets in otherwise blank environments. In this study we developed a more practical and clinically relevant task that approximated modern computers and graphical user interfaces (GUIs). This task could be problematic given the known sensitivity of areas typically used for BMIs to visual stimuli, eye movements, decision-making, and attentional control. Consequently, we sought to assess the effect of a complex, GUI-like task on the quality of neural decoding. A male rhesus macaque monkey was implanted with two 96-channel electrode arrays in area 5d of the superior parietal lobule. The animal was trained to perform a GUI-like 'Face in a Crowd' task on a computer screen that required selecting one cued, icon-like, face image from a group of alternatives (the 'Crowd') using a neurally controlled cursor. We assessed whether the crowd affected decodes of intended cursor movements by comparing it to a 'Crowd Off' condition in which only the matching target appeared without alternatives. We also examined if training a neural decoder with the Crowd On rather than Off had any effect on subsequent decode quality. Despite the additional demands of working with the Crowd On, the animal was able to robustly perform the task under Brain Control. The presence of the crowd did not itself affect decode quality. Training the decoder with the Crowd On relative to Off had no negative influence on subsequent decoding performance. Additionally, the subject was able to gaze around freely without influencing cursor position. Our results demonstrate that area 5d recordings can be used for decoding in a complex, GUI-like task with free gaze. Thus, this area is a promising source of signals for neural prosthetics that utilize computing devices with GUI interfaces, e.g. personal computers, mobile devices, and tablet computers.
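
    The two records above do not give the decoding algorithm itself, but the class of computation can be sketched: the toy example below fits a linear (ridge) decoder from synthetic 96-channel firing rates to 2D cursor velocity. All data, dimensions, and the ridge penalty are simulated assumptions, not the study's decoder.

```python
import numpy as np

rng = np.random.default_rng(0)
n_units, n_bins = 96, 5000
true_W = rng.normal(0, 0.1, (n_units, 2))          # synthetic unit tuning
vel = rng.normal(0, 1, (n_bins, 2))                # intended cursor velocities
rates = vel @ true_W.T + rng.normal(0, 0.5, (n_bins, n_units))  # noisy rates

# Ridge regression readout: W = (R^T R + lambda I)^-1 R^T V
lam = 1.0
W = np.linalg.solve(rates.T @ rates + lam * np.eye(n_units), rates.T @ vel)

pred = rates @ W
r2 = 1 - ((vel - pred)**2).sum() / ((vel - vel.mean(0))**2).sum()
print("decoding R^2:", round(r2, 3))
```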

  7. An algorithm for automatic reduction of complex signal flow graphs

    NASA Technical Reports Server (NTRS)

    Young, K. R.; Hoberock, L. L.; Thompson, J. G.

    1976-01-01

    A computer algorithm is developed that provides efficient means to compute transmittances directly from a signal flow graph or a block diagram. Signal flow graphs are cast as directed graphs described by adjacency matrices. Nonsearch computation, designed for compilers without symbolic capability, is used to identify all arcs that are members of simple cycles for use with Mason's gain formula. The routine does not require the visual acumen of an interpreter to reduce the topology of the graph, and it is particularly useful for analyzing control systems described for computer analyses by means of interactive graphics.
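
    A modern equivalent of the cycle-identification step is easy to sketch: the example below enumerates the simple loops of a small signal flow graph and applies Mason's gain formula for a single forward path. The graph and branch gains are illustrative, and networkx is used in place of the paper's adjacency-matrix routine.

```python
import networkx as nx

# Signal flow graph as a directed graph: nodes are signals, edge weight
# is the transmittance of the branch.
G = nx.DiGraph()
G.add_weighted_edges_from([
    ("in", "x1", 1.0), ("x1", "x2", 5.0),
    ("x2", "x1", -0.5),            # feedback loop
    ("x2", "out", 2.0),
])

def loop_gain(graph, cycle):
    """Product of branch gains around one simple cycle."""
    gain = 1.0
    for a, b in zip(cycle, cycle[1:] + cycle[:1]):
        gain *= graph[a][b]["weight"]
    return gain

loops = list(nx.simple_cycles(G))
print("simple loops:", loops)

# Mason's gain formula for the single forward path in -> x1 -> x2 -> out:
# T = P / (1 - sum of loop gains + ...); no non-touching loop pairs here.
P = 1.0 * 5.0 * 2.0
Delta = 1 - sum(loop_gain(G, c) for c in loops)
print("transmittance in->out:", P / Delta)
```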

  8. Computer-controlled multi-parameter mapping of 3D compressible flowfields using planar laser-induced iodine fluorescence

    NASA Technical Reports Server (NTRS)

    Donohue, James M.; Victor, Kenneth G.; Mcdaniel, James C., Jr.

    1993-01-01

    A computer-controlled technique, using planar laser-induced iodine fluorescence, for measuring complex compressible flowfields is presented. A new laser permits the use of a planar two-line temperature technique so that all parameters can be measured with the laser operated narrowband. Pressure and temperature measurements in a step flowfield show agreement within 10 percent of a CFD model except in regions close to walls. Deviation of near-wall temperature measurements from the model was decreased from 21 percent to 12 percent compared to broadband planar temperature measurements. Computer control of the experiment has been implemented, except for the frequency tuning of the laser. Image data storage and processing have been improved by integrating a workstation into the experimental setup, reducing the data reduction time by a factor of 50.

  9. Transcriptional Network Analysis in Muscle Reveals AP-1 as a Partner of PGC-1α in the Regulation of the Hypoxic Gene Program

    PubMed Central

    Baresic, Mario; Salatino, Silvia; Kupr, Barbara

    2014-01-01

    Skeletal muscle tissue shows an extraordinary cellular plasticity, but the underlying molecular mechanisms are still poorly understood. Here, we use a combination of experimental and computational approaches to unravel the complex transcriptional network of muscle cell plasticity centered on the peroxisome proliferator-activated receptor γ coactivator 1α (PGC-1α), a regulatory nexus in endurance training adaptation. By integrating data on genome-wide binding of PGC-1α and gene expression upon PGC-1α overexpression with comprehensive computational prediction of transcription factor binding sites (TFBSs), we uncover a hitherto-underestimated number of transcription factor partners involved in mediating PGC-1α action. In particular, principal component analysis of TFBSs at PGC-1α binding regions predicts that, besides the well-known role of the estrogen-related receptor α (ERRα), the activator protein 1 complex (AP-1) plays a major role in regulating the PGC-1α-controlled gene program of the hypoxia response. Our findings thus reveal the complex transcriptional network of muscle cell plasticity controlled by PGC-1α. PMID:24912679

  10. Continuously Adaptive vs. Discrete Changes of Task Difficulty in the Training of a Complex Perceptual-Motor Task.

    ERIC Educational Resources Information Center

    Wood, Milton E.

    The purpose of the effort was to determine the benefits to be derived from the adaptive training technique of automatically adjusting task difficulty as a function of student skill during early learning of a complex perceptual-motor task. A digital computer provided the task dynamics, scoring, and adaptive control of a second-order, two-axis,…

  11. Computation of the target state and feedback controls for time optimal consensus in multi-agent systems

    NASA Astrophysics Data System (ADS)

    Mulla, Ameer K.; Patil, Deepak U.; Chakraborty, Debraj

    2018-02-01

    N identical agents with bounded inputs aim to reach a common target state (consensus) in the minimum possible time. Algorithms for computing this time-optimal consensus point, the control law to be used by each agent and the time taken for the consensus to occur, are proposed. Two types of multi-agent systems are considered, namely (1) coupled single-integrator agents on a plane and, (2) double-integrator agents on a line. At the initial time instant, each agent is assumed to have access to the state information of all the other agents. An algorithm, using convexity of attainable sets and Helly's theorem, is proposed, to compute the final consensus target state and the minimum time to achieve this consensus. Further, parts of the computation are parallelised amongst the agents such that each agent has to perform computations of O(N2) run time complexity. Finally, local feedback time-optimal control laws are synthesised to drive each agent to the target point in minimum time. During this part of the operation, the controller for each agent uses measurements of only its own states and does not need to communicate with any neighbouring agents.
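
    For intuition only (the paper's algorithms handle coupled single integrators and double integrators via attainable sets and Helly's theorem), the sketch below computes the min-time consensus point for the simpler, uncoupled planar case, where it reduces to the center of the smallest circle enclosing the initial positions; the positions and speed bound are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical planar single-integrator agents with speed bound u_max.
# Under these assumptions the min-time consensus point minimizes the
# worst-case travel time, i.e., the smallest-enclosing-circle center.
pts = np.array([[0.0, 0.0], [4.0, 0.0], [1.0, 3.0], [2.0, 1.0]])
u_max = 1.0

def worst_travel_time(x):
    return np.max(np.linalg.norm(pts - x, axis=1)) / u_max

res = minimize(worst_travel_time, pts.mean(axis=0), method="Nelder-Mead")
print("consensus point:", res.x)
print("minimum consensus time:", res.fun)

# Each agent's time-optimal feedback: head straight for the target at u_max,
# using only its own state once the target is known.
def control(x_i, target=res.x):
    d = target - x_i
    dist = np.linalg.norm(d)
    return np.zeros(2) if dist < 1e-9 else u_max * d / dist
```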

  12. Anthropometric considerations for a 4-axis side-arm flight controller

    NASA Technical Reports Server (NTRS)

    Debellis, W. B.

    1986-01-01

    A data base on multiaxis side-arm flight controls was generated. The rapid advances in fly-by-light technology, automatic stability systems, and onboard computers have combined to create flexible flight control systems which could reduce the workload imposed on the operator by complex new equipment. This side-arm flight controller combines four controls into one unit and should simplify the pilot's task. However, the use of a multiaxis side-arm flight controller without complete cockpit integration may tend to increase the pilot's workload.

  13. On the Computational Capabilities of Physical Systems. Part 1; The Impossibility of Infallible Computation

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.; Koga, Dennis (Technical Monitor)

    2000-01-01

    In this first of two papers, strong limits on the accuracy of physical computation are established. First it is proven that there cannot be a physical computer C to which one can pose any and all computational tasks concerning the physical universe. Next it is proven that no physical computer C can correctly carry out any computational task in the subset of such tasks that can be posed to C. This result holds whether the computational tasks concern a system that is physically isolated from C, or instead concern a system that is coupled to C. As a particular example, this result means that there cannot be a physical computer that can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly 'processing information faster than the universe does'. The results also mean that there cannot exist an infallible, general-purpose observation apparatus, and that there cannot be an infallible, general-purpose control apparatus. These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - a definition of 'physical computation' - is needed to address the issues considered in these papers. While this definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. The second in this pair of papers presents a preliminary exploration of some of this mathematical structure, including in particular that of prediction complexity, which is a 'physical computation analogue' of algorithmic information complexity. It is proven in that second paper that either the Hamiltonian of our universe proscribes a certain type of computation, or prediction complexity is unique (unlike algorithmic information complexity), in that there is one and only one version of it that can be applicable throughout our universe.

  14. Emotor control: computations underlying bodily resource allocation, emotions, and confidence.

    PubMed

    Kepecs, Adam; Mensh, Brett D

    2015-12-01

    Emotional processes are central to behavior, yet their deeply subjective nature has been a challenge for neuroscientific study as well as for psychiatric diagnosis. Here we explore the relationships between subjective feelings and their underlying brain circuits from a computational perspective. We apply recent insights from systems neuroscience-approaching subjective behavior as the result of mental computations instantiated in the brain-to the study of emotions. We develop the hypothesis that emotions are the product of neural computations whose motor role is to reallocate bodily resources mostly gated by smooth muscles. This "emotor" control system is analogous to the more familiar motor control computations that coordinate skeletal muscle movements. To illustrate this framework, we review recent research on "confidence." Although familiar as a feeling, confidence is also an objective statistical quantity: an estimate of the probability that a hypothesis is correct. This model-based approach helped reveal the neural basis of decision confidence in mammals and provides a bridge to the subjective feeling of confidence in humans. These results have important implications for psychiatry, since disorders of confidence computations appear to contribute to a number of psychopathologies. More broadly, this computational approach to emotions resonates with the emerging view that psychiatric nosology may be best parameterized in terms of disorders of the cognitive computations underlying complex behavior.

  15. Computationally efficient control allocation

    NASA Technical Reports Server (NTRS)

    Durham, Wayne (Inventor)

    2001-01-01

    A computationally efficient method for calculating near-optimal solutions to the three-objective, linear control allocation problem is disclosed. The control allocation problem is that of distributing the effort of redundant control effectors to achieve some desired set of objectives. The problem is deemed linear if control effectiveness is affine with respect to the individual control effectors. The optimal solution is that which exploits the collective maximum capability of the effectors within their individual physical limits. Computational efficiency is measured by the number of floating-point operations required for solution. The method presented returned optimal solutions in more than 90% of the cases examined; non-optimal solutions returned by the method were typically much less than 1% different from optimal, and the errors tended to become smaller than 0.01% as the number of controls was increased. The magnitude of the errors returned by the present method was much smaller than those that resulted from either pseudoinverse or cascaded generalized inverse solutions. The computational complexity of the method presented varied linearly with increasing numbers of controls; the number of required floating point operations increased from 5.5 to 7 times faster than did the minimum-norm solution (the pseudoinverse), and at about the same rate as did the cascaded generalized inverse solution. The computational requirements of the method presented were much better than those of previously described facet-searching methods, which increase in proportion to the square of the number of controls.
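
    The minimum-norm (pseudoinverse) baseline against which the patented method is compared can be sketched in a few lines; the effectiveness matrix, desired moments, and limits below are illustrative. The clipping step is exactly the shortcoming, loss of the commanded moment at the limits, that optimal allocation methods address.

```python
import numpy as np

# Hypothetical 3-moment, 5-effector allocation problem: B maps effector
# deflections u to body moments m, and each effector is limited to +/-1.
rng = np.random.default_rng(1)
B = rng.normal(size=(3, 5))
m_des = np.array([0.8, -0.3, 0.5])

# Minimum-norm (pseudoinverse) allocation ignores the limits, so the
# result must be clipped afterwards, distorting the achieved moments.
u = np.linalg.pinv(B) @ m_des
u_clipped = np.clip(u, -1.0, 1.0)

print("allocated u:     ", np.round(u, 3))
print("achieved moments:", np.round(B @ u_clipped, 3), " desired:", m_des)
```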

  16. Autonomous Performance Monitoring System: Monitoring and Self-Tuning (MAST)

    NASA Technical Reports Server (NTRS)

    Peterson, Chariya; Ziyad, Nigel A.

    2000-01-01

    Maintaining the long-term performance of software onboard a spacecraft can be a major factor in the cost of operations. In particular, the task of controlling and maintaining a future mission of distributed spacecraft will undoubtedly pose a great challenge, since the complexity of multiple spacecraft flying in formation grows rapidly as the number of spacecraft in the formation increases. Eventually, new approaches will be required in developing viable control systems that can handle the complexity of the data and that are flexible, reliable and efficient. In this paper we propose a methodology that aims to maintain the accuracy of flight software while reducing the computational complexity of software tuning tasks. The proposed Monitoring and Self-Tuning (MAST) method consists of two parts: a flight software monitoring algorithm and a tuning algorithm. The dependency on the software being monitored is mostly contained in the monitoring process, while the tuning process is a generic algorithm independent of detailed knowledge of the software. This architecture will enable MAST to be applicable to different onboard software controlling various dynamics of the spacecraft, such as attitude self-calibration and formation control. An advantage of MAST over conventional techniques such as a filter or batch least squares is that the tuning algorithm uses a machine learning approach to handle uncertainty in the problem domain, reducing overall computational complexity. The underlying concept of this technique is a reinforcement learning scheme based on cumulative probability generated by the historical performance of the system. The success of MAST will depend heavily on the reinforcement scheme used in the tuning algorithm, which must guarantee that tuning solutions exist.

  17. Position And Force Control For Multiple-Arm Robots

    NASA Technical Reports Server (NTRS)

    Hayati, Samad A.

    1988-01-01

    Number of arms increased without introducing undue complexity. Strategy and computer architecture developed for simultaneous control of positions of number of robot arms manipulating same object and of forces and torques that arms exert on object. Scheme enables coordinated manipulation of object, causing it to move along assigned trajectory and be subjected to assigned internal forces and torques.

  18. Application of AI techniques to a voice-actuated computer system for reconstructing and displaying magnetic resonance imaging data

    NASA Astrophysics Data System (ADS)

    Sherley, Patrick L.; Pujol, Alfonso, Jr.; Meadow, John S.

    1990-07-01

    To provide a means of rendering complex computer architectures, languages, and input/output modalities transparent to experienced and inexperienced users, research is being conducted to develop a voice driven/voice response computer graphics imaging system. The system will be used for reconstructing and displaying computed tomography and magnetic resonance imaging scan data. In conjunction with this study, an artificial intelligence (AI) control strategy was developed to interface the voice components and support software to the computer graphics functions implemented on the Sun Microsystems 4/280 color graphics workstation. Based on generated text and converted renditions of verbal utterances by the user, the AI control strategy determines the user's intent and develops and validates a plan. The program type and parameters within the plan are used as input to the graphics system for reconstructing and displaying medical image data corresponding to that perceived intent. If the plan is not valid, the control strategy queries the user for additional information. The control strategy operates in a conversation mode and vocally provides system status reports. A detailed examination of the various AI techniques is presented, with major emphasis placed on their specific roles within the total control strategy structure.

  19. Fast computation of an optimal controller for large-scale adaptive optics.

    PubMed

    Massioni, Paolo; Kulcsár, Caroline; Raynaud, Henri-François; Conan, Jean-Marc

    2011-11-01

    The linear quadratic Gaussian regulator provides the minimum-variance control solution for a linear time-invariant system. For adaptive optics (AO) applications, under the hypothesis of a deformable mirror with instantaneous response, such a controller boils down to a minimum-variance phase estimator (a Kalman filter) and a projection onto the mirror space. The Kalman filter gain can be computed by solving an algebraic Riccati matrix equation, whose computational complexity grows very quickly with the size of the telescope aperture. This "curse of dimensionality" makes the standard solvers for Riccati equations very slow in the case of extremely large telescopes. In this article, we propose a way of computing the Kalman gain for AO systems by means of an approximation that considers the turbulence phase screen as the cropped version of an infinite-size screen. We demonstrate the advantages of the methods for both off- and on-line computational time, and we evaluate its performance for classical AO as well as for wide-field tomographic AO with multiple natural guide stars. Simulation results are reported.
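
    The Riccati-based gain computation whose cost motivates the paper can be sketched directly: the example below solves a small discrete algebraic Riccati equation with SciPy and forms the asymptotic Kalman gain. The toy state-space model is an assumption standing in for the turbulence dynamics; the paper's cropped-infinite-screen approximation is not reproduced here.

```python
import numpy as np
from scipy.linalg import solve_discrete_are

# Toy time-invariant model x_{k+1} = A x_k + w,  y_k = C x_k + v.
# The point of the paper is that solving this Riccati equation directly
# scales badly as the state dimension grows with aperture size.
n = 50                                   # state dimension (tiny vs. a real AO system)
rng = np.random.default_rng(0)
A = 0.99 * np.eye(n)                     # near-unity temporal correlation
C = rng.normal(size=(n // 2, n))         # measurement (wavefront-sensor-like) matrix
Q = 0.01 * np.eye(n)                     # process noise covariance
R = 0.1 * np.eye(n // 2)                 # measurement noise covariance

# Steady-state error covariance from the discrete algebraic Riccati equation
P = solve_discrete_are(A.T, C.T, Q, R)

# Asymptotic Kalman gain
K = P @ C.T @ np.linalg.inv(C @ P @ C.T + R)
print("Kalman gain shape:", K.shape)
```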

  20. Temperature and heat flux datasets of a complex object in a fire plume for the validation of fire and thermal response codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jernigan, Dann A.; Blanchat, Thomas K.

    It is necessary to improve understanding and develop temporally- and spatially-resolved integral scale validation data of the heat flux incident to a complex object in addition to measuring the thermal response of said object located within the fire plume for the validation of the SIERRA/FUEGO/SYRINX fire and SIERRA/CALORE codes. To meet this objective, a complex calorimeter with sufficient instrumentation to allow validation of the coupling between FUEGO/SYRINX/CALORE has been designed, fabricated, and tested in the Fire Laboratory for Accreditation of Models and Experiments (FLAME) facility. Validation experiments are specifically designed for direct comparison with the computational predictions. Making meaningful comparison between the computational and experimental results requires careful characterization and control of the experimental features or parameters used as inputs into the computational model. Validation experiments must be designed to capture the essential physical phenomena, including all relevant initial and boundary conditions. This report presents the data validation steps and processes, the results of the penlight radiant heat experiments (for the purpose of validating the CALORE heat transfer modeling of the complex calorimeter), and the results of the fire tests in FLAME.

  1. Spintronic Nanodevices for Bioinspired Computing

    PubMed Central

    Grollier, Julie; Querlioz, Damien; Stiles, Mark D.

    2016-01-01

    Bioinspired hardware holds the promise of low-energy, intelligent, and highly adaptable computing systems. Applications span from automatic classification for big data management, through unmanned vehicle control, to control for biomedical prosthesis. However, one of the major challenges of fabricating bioinspired hardware is building ultra-high-density networks out of complex processing units interlinked by tunable connections. Nanometer-scale devices exploiting spin electronics (or spintronics) can be a key technology in this context. In particular, magnetic tunnel junctions (MTJs) are well suited for this purpose because of their multiple tunable functionalities. One such functionality, non-volatile memory, can provide massive embedded memory in unconventional circuits, thus escaping the von-Neumann bottleneck arising when memory and processors are located separately. Other features of spintronic devices that could be beneficial for bioinspired computing include tunable fast nonlinear dynamics, controlled stochasticity, and the ability of single devices to change functions in different operating conditions. Large networks of interacting spintronic nanodevices can have their interactions tuned to induce complex dynamics such as synchronization, chaos, soliton diffusion, phase transitions, criticality, and convergence to multiple metastable states. A number of groups have recently proposed bioinspired architectures that include one or several types of spintronic nanodevices. In this paper, we show how spintronics can be used for bioinspired computing. We review the different approaches that have been proposed, the recent advances in this direction, and the challenges toward fully integrated spintronics complementary metal–oxide–semiconductor (CMOS) bioinspired hardware. PMID:27881881

  2. An Exploration of Cognitive Agility as Quantified by Attention Allocation in a Complex Environment

    DTIC Science & Technology

    2017-03-01

    quantified by eye-tracking data collected while subjects played a military-relevant cognitive agility computer game (Make Goal), to determine whether certain patterns are associated with effective performance. Reported comparisons include the Experimental Group versus the Control Group, and high versus low performers, on eye tracking and game performance measures.

  3. Confessions of a robot lobotomist

    NASA Technical Reports Server (NTRS)

    Gottshall, R. Marc

    1994-01-01

    Since their inception, numerically controlled (NC) machining methods have been used throughout the aerospace industry to mill, drill, and turn complex shapes by sequentially stepping through motion programs. However, the recent demand for more precision, faster feeds, exotic sensors, and branching execution have existing computer numerical control (CNC) and distributed numerical control (DNC) systems running at maximum controller capacity. Typical disadvantages of current CNC's include fixed memory capacities, limited communication ports, and the use of multiple control languages. The need to tailor CNC's to meet specific applications, whether it be expanded memory, additional communications, or integrated vision, often requires replacing the original controller supplied with the commercial machine tool with a more powerful and capable system. This paper briefly describes the process and equipment requirements for new controllers and their evolutionary implementation in an aerospace environment. The process of controller retrofit with currently available machines is examined, along with several case studies and their computational and architectural implications.

  4. Quantum computing gates via optimal control

    NASA Astrophysics Data System (ADS)

    Atia, Yosi; Elias, Yuval; Mor, Tal; Weinstein, Yossi

    2014-10-01

    We demonstrate the use of optimal control to design two entropy-manipulating quantum gates which are more complex than the corresponding, commonly used, gates, such as CNOT and Toffoli (CCNOT): A two-qubit gate called polarization exchange (PE) and a three-qubit gate called polarization compression (COMP) were designed using GRAPE, an optimal control algorithm. Both gates were designed for a three-spin system. Our design provided efficient and robust nuclear magnetic resonance (NMR) radio frequency (RF) pulses for 13C2-trichloroethylene (TCE), our chosen three-spin system. We then experimentally applied these two quantum gates onto TCE at the NMR lab. Such design of these gates and others could be relevant for near-future applications of quantum computing devices.
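
    A toy version of the GRAPE idea, on one qubit rather than the paper's three-spin TCE system, is sketched below: piecewise-constant control amplitudes are adjusted by gradient ascent to maximize the fidelity of the realized gate to a target NOT. A finite-difference gradient is used for brevity (GRAPE proper uses an analytic gradient, which is what makes it efficient), all parameters are illustrative, and the ascent may stop at a local optimum.

```python
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
U_target = X                      # target gate: NOT
N, dt = 20, 0.1                   # time slices and slice duration
u = np.zeros(N)                   # control amplitudes on the X drive

def propagator(u):
    """Product of slice propagators under drift (Z) plus control (X)."""
    U = np.eye(2, dtype=complex)
    for uk in u:
        U = expm(-1j * dt * (0.5 * Z + uk * X)) @ U
    return U

def fidelity(u):
    """|Tr(U_target^dag U)|^2 / d^2, equal to 1 for a perfect gate."""
    return abs(np.trace(U_target.conj().T @ propagator(u)))**2 / 4

eps, lr = 1e-6, 2.0
for it in range(300):
    grad = np.zeros(N)
    f0 = fidelity(u)
    for k in range(N):
        du = np.zeros(N); du[k] = eps
        grad[k] = (fidelity(u + du) - f0) / eps
    u += lr * grad                # gradient ascent step
print("final gate fidelity:", round(fidelity(u), 4))
```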

  5. Computerized parts list system coordinates engineering releases, parts control, and manufacturing planning

    NASA Technical Reports Server (NTRS)

    Horton, W.; Kinsey, M.

    1967-01-01

    The computerized parts list system compiles and summarizes all pertinent and available information on complex new systems. The parts list system consists of three computer subroutines: a list of parts, a parts numerical sequence list, and a specifications list.

  6. Efficient Pricing Technique for Resource Allocation Problem in Downlink OFDM Cognitive Radio Networks

    NASA Astrophysics Data System (ADS)

    Abdulghafoor, O. B.; Shaat, M. M. R.; Ismail, M.; Nordin, R.; Yuwono, T.; Alwahedy, O. N. A.

    2017-05-01

    In this paper, the problem of resource allocation in OFDM-based downlink cognitive radio (CR) networks is addressed. The purpose of this research is to decrease the computational complexity of the resource allocation algorithm for the downlink CR network while respecting the interference constraint of the primary network. The objective is secured by adopting a pricing scheme to develop a power allocation algorithm with the following concerns: (i) reducing the complexity of the proposed algorithm and (ii) providing firm control of the interference introduced to primary users (PUs). The performance of the proposed algorithm is tested for OFDM CRNs. The simulation results show that the performance of the proposed algorithm approaches that of the optimal algorithm at a lower computational complexity, i.e., O(N log N), which makes the proposed algorithm suitable for more practical applications.
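
    The paper's pricing-based algorithm is not reproduced in the abstract, but the classical O(N log N) water-filling baseline for OFDM power allocation, a natural point of comparison, can be sketched as below; the channel gains and power budget are illustrative, and the interference price toward primary users is omitted.

```python
import numpy as np

def waterfill(gains, p_total):
    """Classical water-filling: maximize sum log2(1 + g_i p_i) subject to
    sum p_i <= p_total. O(N log N) after sorting the inverse gains."""
    inv = np.sort(1.0 / gains)             # noise-to-gain levels, ascending
    n = len(inv)
    for k in range(n, 0, -1):              # try filling the k best channels
        mu = (p_total + inv[:k].sum()) / k # candidate water level
        if mu > inv[k - 1]:                # all k channels stay above water
            break
    return np.maximum(mu - 1.0 / gains, 0.0)

gains = np.array([2.0, 0.5, 1.0, 0.1, 3.0])   # per-subcarrier SNR gains
p = waterfill(gains, p_total=4.0)
print("powers:", np.round(p, 3), " sum:", p.sum())
```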

  7. Chaotic dynamics in nanoscale NbO2 Mott memristors for analogue computing

    NASA Astrophysics Data System (ADS)

    Kumar, Suhas; Strachan, John Paul; Williams, R. Stanley

    2017-08-01

    At present, machine learning systems use simplified neuron models that lack the rich nonlinear phenomena observed in biological systems, which display spatio-temporal cooperative dynamics. There is evidence that neurons operate in a regime called the edge of chaos that may be central to complexity, learning efficiency, adaptability and analogue (non-Boolean) computation in brains. Neural networks have exhibited enhanced computational complexity when operated at the edge of chaos, and networks of chaotic elements have been proposed for solving combinatorial or global optimization problems. Thus, a source of controllable chaotic behaviour that can be incorporated into a neural-inspired circuit may be an essential component of future computational systems. Such chaotic elements have been simulated using elaborate transistor circuits that simulate known equations of chaos, but an experimental realization of chaotic dynamics from a single scalable electronic device has been lacking. Here we describe niobium dioxide (NbO2) Mott memristors each less than 100 nanometres across that exhibit both a nonlinear-transport-driven current-controlled negative differential resistance and a Mott-transition-driven temperature-controlled negative differential resistance. Mott materials have a temperature-dependent metal-insulator transition that acts as an electronic switch, which introduces a history-dependent resistance into the device. We incorporate these memristors into a relaxation oscillator and observe a tunable range of periodic and chaotic self-oscillations. We show that the nonlinear current transport coupled with thermal fluctuations at the nanoscale generates chaotic oscillations. Such memristors could be useful in certain types of neural-inspired computation by introducing a pseudo-random signal that prevents global synchronization and could also assist in finding a global minimum during a constrained search. We specifically demonstrate that incorporating such memristors into the hardware of a Hopfield computing network can greatly improve the efficiency and accuracy of converging to a solution for computationally difficult problems.

  8. Concept of software interface for BCI systems

    NASA Astrophysics Data System (ADS)

    Svejda, Jaromir; Zak, Roman; Jasek, Roman

    2016-06-01

    Brain Computer Interface (BCI) technology is intended to control an external system by brain activity. One of the main parts of such a system is the software interface, which is responsible for clear communication between the brain and either the computer or additional devices connected to it. This paper is organized as follows. Firstly, current knowledge about the human brain is briefly summarized to point out its complexity. Secondly, a concept of a BCI system is described, which is then used to build an architecture for the proposed software interface. Finally, disadvantages of the sensing technology discovered during the sensing part of our research are mentioned.

  9. Complexity analysis of fetal heart rate preceding intrauterine demise.

    PubMed

    Schnettler, William T; Goldberger, Ary L; Ralston, Steven J; Costa, Madalena

    2016-08-01

    Visual non-stress test interpretation lacks the optimal specificity and observer-agreement of an ideal screening tool for intrauterine fetal demise (IUFD) syndrome prevention. Computational methods based on traditional heart rate variability have also been of limited value. Complexity analysis probes properties of the dynamics of physiologic signals that are otherwise not accessible and, therefore, might be useful in this context. To explore the association between fetal heart rate (FHR) complexity analysis and subsequent IUFD. Our specific hypothesis is that the complexity of the fetal heart rate dynamics is lower in the IUFD group compared with controls. This case-control study utilized cases of IUFD at a single tertiary-care center among singleton pregnancies with at least 10 min of continuous electronic FHR monitoring on at least 2 weekly occasions in the 3 weeks immediately prior to fetal demise. Controls delivered a live singleton beyond 35 weeks' gestation and were matched to cases by gestational age, testing indication, and maternal age in a 3:1 ratio. FHR data were analyzed using the multiscale entropy (MSE) method to derive a complexity index. In addition, pNNx, a measure of short-term heart rate variability, which in adults is ascribable primarily to cardiac vagal tone modulation, was also computed. 211 IUFDs occurred during the 9-year period of review, but only 6 met inclusion criteria. The median gestational age at the time of IUFD was 35.5 weeks. Three controls were matched to each case for a total of 24 subjects, and 87 FHR tracings were included for analysis. The median gestational age at the first fetal heart rate tracing was similar between groups (median [1st-3rd quartiles] weeks: IUFD cases: 34.7 (34.4-36.2); controls: 35.3 (34.4-36.1); p=.94). The median complexity of the cases' tracings was significantly less than the controls' (12.44 [8.9-16.77] vs. 17.82 [15.21-22.17]; p<.0001). Furthermore, the cases' median complexity decreased as gestation advanced whereas the controls' median complexity increased over time. However, this difference was not statistically significant [-0.83 (-2.03 to 0.47) vs. 0.14 (-1.25 to 0.94); p=.62]. The degree of short-term variability of FHR tracings, as measured by the pNN metric, was significantly lower (p<.005) for the controls (1.1 [0.8-1.3]) than the IUFD cases (1.3 [1.1-1.6]). FHR complexity analysis using multiscale entropy analysis may add value to other measures in detecting and monitoring pregnancies at the highest risk for IUFD. The decrease in complexity and short-term variability seen in the IUFD cases may reflect perturbations in neuroautonomic control due to multiple maternal-fetal factors.
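
    A minimal sketch of the multiscale entropy computation used above is given below: the series is coarse-grained at each scale, and sample entropy is computed on each coarse-grained series (the complexity index is then typically the sum or mean over scales). The tolerance, embedding dimension, and the toy beat-to-beat series are illustrative assumptions, and the sample-entropy counting is a simplified variant.

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.15):
    """SampEn(m, r): -log of the conditional probability that sequences
    matching for m points also match for m + 1 (simplified counting)."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()
    def count(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(templ[:, None] - templ[None, :]), axis=2)
        return (np.sum(d <= r) - len(templ)) / 2   # exclude self-matches
    B, A = count(m), count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

def multiscale_entropy(x, max_scale=5):
    """Coarse-grain the series at each scale, then take SampEn of each."""
    out = []
    for tau in range(1, max_scale + 1):
        n = len(x) // tau
        coarse = np.asarray(x[:n * tau]).reshape(n, tau).mean(axis=1)
        out.append(sample_entropy(coarse))
    return out

rng = np.random.default_rng(0)
rr = 60 / (140 + 5 * rng.standard_normal(600))   # toy beat-to-beat intervals (s)
print([round(e, 2) for e in multiscale_entropy(rr)])
```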

  10. Petri net model for analysis of concurrently processed complex algorithms

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Mielke, Roland R.

    1986-01-01

    This paper presents a Petri-net model suitable for analyzing the concurrent processing of computationally complex algorithms. The decomposed operations are to be processed in a multiple processor, data driven architecture. Of particular interest is the application of the model to both the description of the data/control flow of a particular algorithm, and to the general specification of the data driven architecture. A candidate architecture is also presented.
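
    A minimal token-firing sketch conveys the data-driven flavor of such models: a transition fires when all of its input places hold tokens. The net below is a hypothetical three-operation pipeline, and it omits the separate data-status edges that distinguish the paper's model from an ordinary Petri net.

```python
# Places hold tokens; transitions consume from `pre` and produce to `post`.
pre = {            # transition -> places whose tokens it consumes
    "t_load":  ["ready"],
    "t_mult":  ["a_avail", "b_avail"],
    "t_accum": ["prod_avail", "acc_free"],
}
post = {           # transition -> places that receive a token on firing
    "t_load":  ["a_avail", "b_avail"],
    "t_mult":  ["prod_avail"],
    "t_accum": ["acc_free"],
}
marking = {"ready": 1, "a_avail": 0, "b_avail": 0,
           "prod_avail": 0, "acc_free": 1}

def enabled(t):
    return all(marking[p] > 0 for p in pre[t])

def fire(t):
    for p in pre[t]:  marking[p] -= 1
    for p in post[t]: marking[p] += 1

# Fire any enabled transition until quiescent; concurrency shows up as
# several transitions being simultaneously enabled by the same marking.
changed = True
while changed:
    changed = False
    for t in pre:
        if enabled(t):
            fire(t)
            print("fired", t, "->", marking)
            changed = True
```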

  11. Advanced computer-aided design for bone tissue-engineering scaffolds.

    PubMed

    Ramin, E; Harris, R A

    2009-04-01

    The design of scaffolds with an intricate and controlled internal structure represents a challenge for tissue engineering. Several scaffold-manufacturing techniques allow the creation of complex architectures but with little or no control over the main features of the channel network such as the size, shape, and interconnectivity of each individual channel, resulting in intricate but random structures. The combined use of computer-aided design (CAD) systems and layer-manufacturing techniques allows a high degree of control over these parameters with few limitations in terms of achievable complexity. However, the design of complex and intricate networks of channels required in CAD is extremely time-consuming since manually modelling hundreds of different geometrical elements, all with different parameters, may require several days to design individual scaffold structures. An automated design methodology is proposed by this research to overcome these limitations. This approach involves the investigation of novel software algorithms, which are able to interact with a conventional CAD program and permit the automated design of several geometrical elements, each with a different size and shape. In this work, the variability of the parameters required to define each geometry has been set as random, but any other distribution could have been adopted. This methodology has been used to design five cubic scaffolds with interconnected pore channels that range from 200 to 800 μm in diameter, each with an increased complexity of the internal geometrical arrangement. A clinical case study, consisting of an integration of one of these geometries with a craniofacial implant, is then presented.

  12. Computational models of neuromodulation.

    PubMed

    Fellous, J M; Linster, C

    1998-05-15

    Computational modeling of neural substrates provides an excellent theoretical framework for the understanding of the computational roles of neuromodulation. In this review, we illustrate, with a large number of modeling studies, the specific computations performed by neuromodulation in the context of various neural models of invertebrate and vertebrate preparations. We base our characterization of neuromodulations on their computational and functional roles rather than on anatomical or chemical criteria. We review the main framework in which neuromodulation has been studied theoretically (central pattern generation and oscillations, sensory processing, memory and information integration). Finally, we present a detailed mathematical overview of how neuromodulation has been implemented at the single cell and network levels in modeling studies. Overall, neuromodulation is found to increase and control computational complexity.

  13. Using 3D computer simulations to enhance ophthalmic training.

    PubMed

    Glittenberg, C; Binder, S

    2006-01-01

    To develop more effective methods of demonstrating and teaching complex topics in ophthalmology with the use of computer aided three-dimensional (3D) animation and interactive multimedia technologies. We created 3D animations and interactive computer programmes demonstrating the neuroophthalmological nature of the oculomotor system, including the anatomy, physiology and pathophysiology of the extra-ocular eye muscles and the oculomotor cranial nerves, as well as pupillary symptoms of neurological diseases. At the University of Vienna we compared their teaching effectiveness to conventional teaching methods in a comparative study involving 100 medical students, a multiple choice exam and a survey. The comparative study showed that our students achieved significantly better test results (80%) than the control group (63%) (diff. = 17 +/- 5%, p = 0.004). The survey showed a positive reaction to the software and a strong preference to have more subjects and techniques demonstrated in this fashion. Three-dimensional computer animation technology can significantly increase the quality and efficiency of the education and demonstration of complex topics in ophthalmology.

  14. Efficient evaluation of wireless real-time control networks.

    PubMed

    Horvath, Peter; Yampolskiy, Mark; Koutsoukos, Xenofon

    2015-02-11

    In this paper, we present a system simulation framework for the design and performance evaluation of complex wireless cyber-physical systems. We describe the simulator architecture and the specific developments that are required to simulate cyber-physical systems relying on multi-channel, multihop mesh networks. We introduce realistic and efficient physical layer models and a system simulation methodology, which provides statistically significant performance evaluation results with low computational complexity. The capabilities of the proposed framework are illustrated in the example of WirelessHART, a centralized, real-time, multi-hop mesh network designed for industrial control and monitor applications.

  15. Test and evaluation of a multifunction keyboard and a dedicated keyboard for control of a flight management computer

    NASA Technical Reports Server (NTRS)

    Crane, J. M.; Boucek, G. P., Jr.; Smith, W. D.

    1986-01-01

    A flight management computer (FMC) control display unit (CDU) test was conducted to compare two types of input devices: a fixed legend (dedicated) keyboard and a programmable legend (multifunction) keyboard. The task used for comparison was operation of the flight management computer for the Boeing 737-300. The same tasks were performed by twelve pilots on the FMC control display unit configured with a programmable legend keyboard and with the currently used B737-300 dedicated keyboard. Flight simulator work activity levels and input task complexity were varied during each pilot session. Half of the pilots tested were previously familiar with the B737-300 dedicated keyboard CDU and half had no prior experience with it. The data collected included simulator flight parameters, keystroke time and sequences, and pilot questionnaire responses. A timeline analysis was also used for evaluation of the two keyboard concepts.

  16. Structured analysis and modeling of complex systems

    NASA Technical Reports Server (NTRS)

    Strome, David R.; Dalrymple, Mathieu A.

    1992-01-01

    The Aircrew Evaluation Sustained Operations Performance (AESOP) facility at Brooks AFB, Texas, combines the realism of an operational environment with the control of a research laboratory. In recent studies we collected extensive data from the Airborne Warning and Control Systems (AWACS) Weapons Directors subjected to high and low workload Defensive Counter Air Scenarios. A critical and complex task in this environment involves committing a friendly fighter against a hostile fighter. Structured Analysis and Design techniques and computer modeling systems were applied to this task as tools for analyzing subject performance and workload. This technology is being transferred to the Man-Systems Division of NASA Johnson Space Center for application to complex mission related tasks, such as manipulating the Shuttle grappler arm.

  17. NASCAP user's manual, 1978

    NASA Technical Reports Server (NTRS)

    Cassidy, J. J., III

    1978-01-01

    NASCAP simulates the charging process for a complex object in either tenuous plasma (geosynchronous orbit) or ground test (electron gun source) environment. Program control words, the structure of user input files, and various user options available are described in this computer programmer's user manual.

  18. Reduced-Order Modeling for Optimization and Control of Complex Flows

    DTIC Science & Technology

    2010-11-30

    Statistics Colloquium, Auburn, AL, (January 2009). 16. University of Pittsburgh, Mathematics Colloquium, Pittsburgh, PA, (February 2009). 17. Goethe ...Center for Scientific Computing, Goethe University Frankfurt am Main, Germany, (June 2009). 18. Air Force Institute of Technology, Wright-Patterson

  19. Method for concurrent execution of primitive operations by dynamically assigning operations based upon computational marked graph and availability of data

    NASA Technical Reports Server (NTRS)

    Mielke, Roland V. (Inventor); Stoughton, John W. (Inventor)

    1990-01-01

    Computationally complex primitive operations of an algorithm are executed concurrently in a plurality of functional units under the control of an assignment manager. The algorithm is preferably defined as a computational marked graph containing data status edges (paths) corresponding to each of the data flow edges. The assignment manager assigns primitive operations to the functional units and monitors completion of the primitive operations to determine data availability using the computational marked graph of the algorithm. All data accessing of the primitive operations is performed by the functional units independently of the assignment manager.
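
    A minimal sketch of the assignment-manager idea: primitive operations fire as soon as all of their input data are available, mimicking token flow in a marked graph. The two-operation "algorithm" and all names are illustrative, not the patented design.

    ```python
    from collections import deque

    # operation name -> (set of input tokens, output token, primitive)
    OPS = {
        "add": ({"a", "b"}, "s", lambda d: d["a"] + d["b"]),
        "mul": ({"s", "c"}, "p", lambda d: d["s"] * d["c"]),
    }

    def run(initial_data):
        data = dict(initial_data)              # tokens currently available
        pending = dict(OPS)                    # operations not yet executed
        ready = deque(n for n, (ins, _, _) in pending.items()
                      if ins <= data.keys())
        while ready:
            name = ready.popleft()
            ins, out, fn = pending.pop(name)
            data[out] = fn(data)               # a "functional unit" executes it
            for n, (i, _, _) in pending.items():   # new token may enable others
                if i <= data.keys() and n not in ready:
                    ready.append(n)
        return data

    print(run({"a": 2, "b": 3, "c": 4}))   # s = 5, p = 20
    ```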

  20. High performance network and channel-based storage

    NASA Technical Reports Server (NTRS)

    Katz, Randy H.

    1991-01-01

    In the traditional mainframe-centered view of a computer system, storage devices are coupled to the system through complex hardware subsystems called input/output (I/O) channels. With the dramatic shift towards workstation-based computing, and its associated client/server model of computation, storage facilities are now found attached to file servers and distributed throughout the network. We discuss the underlying technology trends that are leading to high performance network-based storage, namely advances in networks, storage devices, and I/O controller and server architectures. We review several commercial systems and research prototypes that are leading to a new approach to high performance computing based on network-attached storage.

  1. From an Executive Network to Executive Control: A Computational Model of the "n"-Back Task

    ERIC Educational Resources Information Center

    Chatham, Christopher H.; Herd, Seth A.; Brant, Angela M.; Hazy, Thomas E.; Miyake, Akira; O'Reilly, Randy; Friedman, Naomi P.

    2011-01-01

    A paradigmatic test of executive control, the n-back task, is known to recruit a widely distributed parietal, frontal, and striatal "executive network," and is thought to require an equally wide array of executive functions. The mapping of functions onto substrates in such a complex task presents a significant challenge to any theoretical…

  2. Autonomous stair-climbing with miniature jumping robots.

    PubMed

    Stoeter, Sascha A; Papanikolopoulos, Nikolaos

    2005-04-01

    The problem of vision-guided control of miniature mobile robots is investigated. Untethered mobile robots with small physical dimensions of around 10 cm or less do not permit powerful onboard computers because of size and power constraints. These challenges have, in the past, reduced the functionality of such devices to that of a complex remote control vehicle with fancy sensors. With the help of a computationally more powerful entity such as a larger companion robot, the control loop can be closed. Using the miniature robot's video transmission or that of an observer to localize it in the world, control commands can be computed and relayed to the inept robot. The result is a system that exhibits autonomous capabilities. The framework presented here solves the problem of climbing stairs with the miniature Scout robot. The robot's unique locomotion mode, the jump, is employed to hop one step at a time. Methods for externally tracking the Scout are developed. A large number of real-world experiments are conducted and the results discussed.

  3. Real-Time linux dynamic clamp: a fast and flexible way to construct virtual ion channels in living cells.

    PubMed

    Dorval, A D; Christini, D J; White, J A

    2001-10-01

    We describe a system for real-time control of biological and other experiments. This device, based around the Real-Time Linux operating system, was tested specifically in the context of dynamic clamping, a demanding real-time task in which a computational system mimics the effects of nonlinear membrane conductances in living cells. The system is fast enough to represent dozens of nonlinear conductances in real time at clock rates well above 10 kHz. Conductances can be represented in deterministic form, or more accurately as discrete collections of stochastically gating ion channels. Tests were performed using a variety of complex models of nonlinear membrane mechanisms in excitable cells, including simulations of spatially extended excitable structures and multiple interacting cells. Only in extreme cases does the computational load interfere with high-speed "hard" real-time processing (i.e., real-time processing that never falters). Freely available on the World Wide Web, this experimental control system combines good performance, immense flexibility, low cost, and reasonable ease of use. It is easily adapted to any task involving real-time control, and excels in particular for applications requiring complex control algorithms that must operate at speeds over 1 kHz.
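
    A stripped-down sketch of one dynamic-clamp cycle (not the published RT-Linux code): read the membrane potential, advance a gating variable, and compute the current to inject, I = g_max * s * (V - E_rev). All parameter values are placeholders.

    ```python
    G_MAX, E_REV = 10e-9, -70e-3   # max conductance (S), reversal potential (V)
    TAU, S_INF = 5e-3, 0.6         # gating time constant (s) and steady state
    DT = 1.0 / 20000.0             # 20 kHz update rate, within the >10 kHz regime

    def clamp_step(v_membrane, s):
        s += DT * (S_INF - s) / TAU                  # first-order gating kinetics
        i_inject = G_MAX * s * (v_membrane - E_REV)  # conductance-based current
        return i_inject, s

    # offline stand-in for the acquire/inject loop of a real dynamic clamp
    s = 0.0
    for step in range(5):
        v = -60e-3                                   # fake A/D reading (V)
        i, s = clamp_step(v, s)
        print(f"inject {i * 1e12:.2f} pA")
    ```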

  4. Electromagnetic interference-aware transmission scheduling and power control for dynamic wireless access in hospital environments.

    PubMed

    Phunchongharn, Phond; Hossain, Ekram; Camorlinga, Sergio

    2011-11-01

    We study the multiple access problem for e-Health applications (referred to as secondary users) coexisting with medical devices (referred to as primary or protected users) in a hospital environment. In particular, we focus on transmission scheduling and power control of secondary users in multiple spatial reuse time-division multiple access (STDMA) networks. The objective is to maximize the spectrum utilization of secondary users and minimize their power consumption, subject to the electromagnetic interference (EMI) constraints for active and passive medical devices and a minimum throughput guarantee for secondary users. The multiple access problem is formulated as a dual-objective optimization problem which is shown to be NP-complete. We propose a joint scheduling and power control algorithm based on a greedy approach to solve the problem with much lower computational complexity. To this end, an enhanced greedy algorithm is proposed to improve the performance of the greedy algorithm by finding the optimal sequence of secondary users for scheduling. Using extensive simulations, the tradeoff in performance in terms of spectrum utilization, energy consumption, and computational complexity is evaluated for both algorithms.
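
    A heavily simplified sketch of the greedy idea: admit secondary users into a time slot in decreasing order of utility while the cumulative EMI at the most sensitive medical device stays below a threshold. The actual algorithm also optimizes transmit powers and guarantees minimum throughput; every number here is invented.

    ```python
    USERS = [  # (user id, scheduling utility, EMI contribution at the device)
        ("u1", 5.0, 0.3), ("u2", 4.0, 0.5), ("u3", 3.0, 0.1), ("u4", 2.5, 0.4),
    ]
    EMI_LIMIT = 0.8

    def greedy_schedule(users, limit):
        admitted, emi = [], 0.0
        for uid, _, e in sorted(users, key=lambda u: -u[1]):
            if emi + e <= limit:       # EMI constraint for protected devices
                admitted.append(uid)
                emi += e
        return admitted

    print(greedy_schedule(USERS, EMI_LIMIT))   # ['u1', 'u2'] -> total EMI 0.8
    ```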

  5. Electroencephalogram complexity analysis in children with attention-deficit/hyperactivity disorder during a visual cognitive task.

    PubMed

    Zarafshan, Hadi; Khaleghi, Ali; Mohammadi, Mohammad Reza; Moeini, Mahdi; Malmir, Nastaran

    2016-01-01

    The aim of this study was to investigate electroencephalogram (EEG) dynamics using complexity analysis in children with attention-deficit/hyperactivity disorder (ADHD) compared with healthy control children performing a cognitive task. Thirty 7-12-year-old children meeting Diagnostic and Statistical Manual of Mental Disorders-Fifth Edition (DSM-5) criteria for ADHD and 30 healthy control children underwent an EEG evaluation during a cognitive task, and Lempel-Ziv complexity (LZC) values were computed. There were no significant differences between the ADHD and control groups in age and gender. The mean LZC of the ADHD children was significantly larger than that of the healthy children over the right anterior and right posterior regions during the cognitive performance. In the ADHD group, complexity of the right hemisphere was higher than that of the left hemisphere, whereas in the control group complexity of the left hemisphere was higher than that of the right. Although fronto-striatal dysfunction is considered conclusive evidence for the pathophysiology of ADHD, our mental arithmetic task provided evidence of structural and functional changes in the posterior regions and probably the cerebellum in ADHD.
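
    A sketch of the Lempel-Ziv complexity (LZC) measure: binarize the signal around its median, then count distinct phrases in a left-to-right parse. This LZ78-style parse is one common variant; the paper's exact parsing and normalization may differ.

    ```python
    import statistics

    def lz_complexity(signal):
        med = statistics.median(signal)
        bits = "".join("1" if x > med else "0" for x in signal)  # binarization
        phrases, phrase = set(), ""
        for b in bits:
            phrase += b
            if phrase not in phrases:   # a new phrase ends here
                phrases.add(phrase)
                phrase = ""
        return len(phrases)             # a trailing repeated phrase is ignored

    print(lz_complexity([1, 3, 2, 5, 4, 6, 2, 7]))   # parses 00011101 -> 5
    ```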

  6. Advanced Kalman Filter for Real-Time Responsiveness in Complex Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Welch, Gregory Francis; Zhang, Jinghe

    2014-06-10

    Complex engineering systems pose fundamental challenges in real-time operations and control because they are highly dynamic systems consisting of a large number of elements with severe nonlinearities and discontinuities. Today's tools for real-time complex system operations are mostly based on steady-state models, unable to capture the dynamic nature of these systems and too slow to prevent system failures. We developed advanced Kalman filtering techniques and a formulation of dynamic state estimation using Kalman filtering to capture complex system dynamics in aiding real-time operations and control. In this work, we looked at complex system issues including severe nonlinearity of system equations, discontinuities caused by system controls and network switches, sparse measurements in space and time, and the real-time requirements of power grid operations. We sought to bridge the disciplinary boundaries between computer science and power systems engineering by introducing methods that leverage both existing and new techniques. While our methods were developed in the context of electrical power systems, they should generalize to other large-scale scientific and engineering applications.
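
    A textbook linear Kalman filter step, included only to make the state-estimation machinery concrete; the techniques described above extend well beyond this sketch to severe nonlinearities and discontinuities.

    ```python
    import numpy as np

    def kalman_step(x, P, z, F, H, Q, R):
        # predict state and covariance forward one step
        x_pred = F @ x
        P_pred = F @ P @ F.T + Q
        # update with the new measurement z
        S = H @ P_pred @ H.T + R                # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
        x_new = x_pred + K @ (z - H @ x_pred)
        P_new = (np.eye(len(x)) - K @ H) @ P_pred
        return x_new, P_new

    # 1-D constant state observed with noise
    x, P = np.array([0.0]), np.eye(1)
    F = np.eye(1); H = np.eye(1); Q = np.eye(1) * 0.01; R = np.eye(1) * 0.5
    for z in [1.1, 0.9, 1.05]:
        x, P = kalman_step(x, P, np.array([z]), F, H, Q, R)
    print(x)   # estimate approaches 1.0
    ```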

  7. Electrooptical adaptive switching network for the hypercube computer

    NASA Technical Reports Server (NTRS)

    Chow, E.; Peterson, J.

    1988-01-01

    An all-optical network design for the hyperswitch network using regular free-space interconnects between electronic processor nodes is presented. The adaptive routing model used is described, and an adaptive routing control example is presented. The design demonstrates that existing electrooptical techniques are sufficient for implementing efficient parallel architectures without the need for more complex means of implementing arbitrary interconnection schemes. The electrooptical hyperswitch network significantly improves the communication performance of the hypercube computer.

  8. Operating room integration and telehealth.

    PubMed

    Bucholz, Richard D; Laycock, Keith A; McDurmont, Leslie

    2011-01-01

    The increasing use of advanced automated and computer-controlled systems and devices in surgical procedures has resulted in problems arising from the crowding of the operating room with equipment and the incompatible control and communication standards associated with each system. This lack of compatibility between systems and centralized control means that the surgeon is frequently required to interact with multiple computer interfaces in order to obtain updates and exert control over the various devices at his disposal. To reduce this complexity and provide the surgeon with more complete and precise control of the operating room systems, a unified interface and communication network has been developed. In addition to improving efficiency, this network also allows the surgeon to grant remote access to consultants and observers at other institutions, enabling experts to participate in the procedure without having to travel to the site.

  9. Soft control of scanning probe microscope with high flexibility.

    PubMed

    Liu, Zhenghui; Guo, Yuzheng; Zhang, Zhaohui; Zhu, Xing

    2007-01-01

    Most commercial scanning probe microscopes have multiple embedded digital microprocessors and utilize complex software for system control, which is not easily obtained or modified by researchers wishing to perform novel and special applications. In this paper, we present a simple and flexible control solution that depends only on software running on a single-processor personal computer with a real-time Linux operating system to carry out all the control tasks, including negative feedback, tip movement, data processing, and the user interface. In this way, we fully exploit the computational and programming potential of a personal computer, enabling us to manipulate the scanning probe as required without any special digital control circuits and related technical know-how. This solution has been successfully applied to a homemade ultrahigh vacuum scanning tunneling microscope and a multiprobe scanning tunneling microscope.

  10. Fully probabilistic control design in an adaptive critic framework.

    PubMed

    Herzallah, Randa; Kárný, Miroslav

    2011-12-01

    An optimal stochastic controller pushes the closed-loop behavior as close as possible to the desired one. The fully probabilistic design (FPD) uses a probabilistic description of the desired closed loop and minimizes the Kullback-Leibler divergence of the closed-loop description from the desired one. Practical exploitation of fully probabilistic design control theory continues to be hindered by the computational complexities involved in numerically solving the associated stochastic dynamic programming problem, in particular the very hard multivariate integration and approximate interpolation of the involved multivariate functions. This paper proposes a new fully probabilistic control algorithm that uses adaptive critic methods to circumvent the need for explicitly evaluating the optimal value function, thereby dramatically reducing computational requirements. This is the main contribution of this paper. Copyright © 2011 Elsevier Ltd. All rights reserved.
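
    A discrete-case illustration of the FPD objective: score a controller by the Kullback-Leibler divergence from the closed-loop distribution to the desired one. The paper's contribution, using an adaptive critic to avoid evaluating the optimal value function, is not captured by this toy computation; the distributions are invented.

    ```python
    import math

    def kl_divergence(p, q):
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    closed_loop = [0.70, 0.20, 0.10]   # hypothetical closed-loop behavior
    desired     = [0.80, 0.15, 0.05]   # desired closed-loop distribution
    print(kl_divergence(closed_loop, desired))   # smaller = closer to desired
    ```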

  11. JMS: An Open Source Workflow Management System and Web-Based Cluster Front-End for High Performance Computing.

    PubMed

    Brown, David K; Penkler, David L; Musyoka, Thommas M; Bishop, Özlem Tastan

    2015-01-01

    Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS.

  12. JMS: An Open Source Workflow Management System and Web-Based Cluster Front-End for High Performance Computing

    PubMed Central

    Brown, David K.; Penkler, David L.; Musyoka, Thommas M.; Bishop, Özlem Tastan

    2015-01-01

    Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS. PMID:26280450

  13. Probabilistic structural mechanics research for parallel processing computers

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Chen, Heh-Chyun; Twisdale, Lawrence A.; Martin, William R.

    1991-01-01

    Aerospace structures and spacecraft are a complex assemblage of structural components that are subjected to a variety of complex, cyclic, and transient loading conditions. Significant modeling uncertainties are present in these structures, in addition to the inherent randomness of material properties and loads. To properly account for these uncertainties in evaluating and assessing the reliability of these components and structures, probabilistic structural mechanics (PSM) procedures must be used. Much research has focused on basic theory development and the development of approximate analytic solution methods in random vibrations and structural reliability. Practical application of PSM methods has been hampered by their computationally intensive nature. Solution of PSM problems requires repeated analyses of structures that are often large and exhibit nonlinear and/or dynamic response behavior. These methods are all inherently parallel and ideally suited to implementation on parallel processing computers. New hardware architectures and innovative control software and solution methodologies are needed to make solution of large-scale PSM problems practical.
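
    A sketch of why PSM is "inherently parallel": a Monte Carlo estimate of failure probability needs many independent structural analyses, here reduced to a toy limit state g = R - S (resistance minus load effect). Each sample could run on a separate processor; the distributions are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000
    R = rng.normal(250.0, 25.0, n)     # resistance samples
    S = rng.normal(180.0, 30.0, n)     # load-effect samples
    pf = np.mean(R - S < 0.0)          # failure when load exceeds resistance
    print(f"estimated failure probability: {pf:.4f}")
    ```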

  14. On the placement of active members in adaptive truss structures for vibration control

    NASA Technical Reports Server (NTRS)

    Lu, L.-Y.; Utku, S.; Wada, B. K.

    1992-01-01

    The problem of optimal placement of active members which are used for vibration control in adaptive truss structures is investigated. The control scheme is based on the method of eigenvalue assignment as a means of shaping the transient response of the controlled adaptive structures, and the minimization of required control action is considered as the optimization criterion. To this end, a performance index which measures the control strokes of active members is formulated in an efficient way. In order to reduce the computational burden, particularly for the case where the locations of active members have to be selected from a large set of available sites, several heuristic search schemes are proposed for obtaining near-optimal locations. The proposed schemes significantly reduce the computational complexity of placing multiple active members to the order of that when a single active member is placed.

  15. Constrained Multipoint Aerodynamic Shape Optimization Using an Adjoint Formulation and Parallel Computers

    NASA Technical Reports Server (NTRS)

    Reuther, James; Jameson, Antony; Alonso, Juan Jose; Rimlinger, Mark J.; Saunders, David

    1997-01-01

    An aerodynamic shape optimization method that treats the design of complex aircraft configurations subject to high fidelity computational fluid dynamics (CFD), geometric constraints and multiple design points is described. The design process will be greatly accelerated through the use of both control theory and distributed memory computer architectures. Control theory is employed to derive the adjoint differential equations whose solution allows for the evaluation of design gradient information at a fraction of the computational cost required by previous design methods. The resulting problem is implemented on parallel distributed memory architectures using a domain decomposition approach, an optimized communication schedule, and the MPI (Message Passing Interface) standard for portability and efficiency. The final result achieves very rapid aerodynamic design based on a higher order CFD method. In order to facilitate the integration of these high fidelity CFD approaches into future multi-disciplinary optimization (MDO) applications, new methods must be developed which are capable of simultaneously addressing complex geometries, multiple objective functions, and geometric design constraints. In our earlier studies, we coupled the adjoint based design formulations with unconstrained optimization algorithms and showed that the approach was effective for the aerodynamic design of airfoils, wings, wing-bodies, and complex aircraft configurations. In many of the results presented in these earlier works, geometric constraints were satisfied either by a projection into feasible space or by posing the design space parameterization such that it automatically satisfied constraints. Furthermore, with the exception of reference 9 where the second author initially explored the use of multipoint design in conjunction with adjoint formulations, our earlier works have focused on single point design efforts. Here we demonstrate that the same methodology may be extended to treat complete configuration designs subject to multiple design points and geometric constraints. Examples are presented for both transonic and supersonic configurations ranging from wing alone designs to complex configuration designs involving wing, fuselage, nacelles and pylons.

  16. Emotor control: computations underlying bodily resource allocation, emotions, and confidence

    PubMed Central

    Kepecs, Adam; Mensh, Brett D.

    2015-01-01

    Emotional processes are central to behavior, yet their deeply subjective nature has been a challenge for neuroscientific study as well as for psychiatric diagnosis. Here we explore the relationships between subjective feelings and their underlying brain circuits from a computational perspective. We apply recent insights from systems neuroscience—approaching subjective behavior as the result of mental computations instantiated in the brain—to the study of emotions. We develop the hypothesis that emotions are the product of neural computations whose motor role is to reallocate bodily resources mostly gated by smooth muscles. This “emotor” control system is analogous to the more familiar motor control computations that coordinate skeletal muscle movements. To illustrate this framework, we review recent research on “confidence.” Although familiar as a feeling, confidence is also an objective statistical quantity: an estimate of the probability that a hypothesis is correct. This model-based approach helped reveal the neural basis of decision confidence in mammals and provides a bridge to the subjective feeling of confidence in humans. These results have important implications for psychiatry, since disorders of confidence computations appear to contribute to a number of psychopathologies. More broadly, this computational approach to emotions resonates with the emerging view that psychiatric nosology may be best parameterized in terms of disorders of the cognitive computations underlying complex behavior. PMID:26869840

  17. Fluid/Structure Interaction Studies of Aircraft Using High Fidelity Equations on Parallel Computers

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru; VanDalsem, William (Technical Monitor)

    1994-01-01

    Aeroelasticity, which involves strong coupling of fluids, structures, and controls, is an important element in designing an aircraft. Computational aeroelasticity using low-fidelity methods, such as the linear aerodynamic flow equations coupled with the modal structural equations, is well advanced. Though these low-fidelity approaches are computationally less intensive, they are not adequate for the analysis of modern aircraft such as the High Speed Civil Transport (HSCT) and Advanced Subsonic Transport (AST), which can experience complex flow/structure interactions. HSCT can experience vortex-induced aeroelastic oscillations, whereas AST can experience transonic buffet associated structural oscillations. Both aircraft may experience a dip in the flutter speed at the transonic regime. For accurate aeroelastic computations in these complex fluid/structure interaction situations, high-fidelity equations such as the Navier-Stokes equations for fluids and finite elements for structures are needed. Computations using these high-fidelity equations require large computational resources both in memory and speed. Current conventional supercomputers have reached their limitations both in memory and speed. As a result, parallel computers have evolved to overcome the limitations of conventional computers. This paper will address the transition that is taking place in computational aeroelasticity from conventional computers to parallel computers, and the special techniques needed to take advantage of the architecture of new parallel computers. Results will be illustrated from computations made on the iPSC/860 and IBM SP2 computers using the ENSAERO code, which directly couples the Euler/Navier-Stokes flow equations with high-resolution finite-element structural equations.

  18. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deboever, Jeremiah; Zhang, Xiaochen; Reno, Matthew J.

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers 10 to 120 hours of computational time when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: the number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
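
    A toy illustration of the QSTS burden: a year at 1-second resolution is roughly 31.5 million sequential power flows, and each step depends on controller state carried over from the previous step, so the time loop cannot be naively parallelized. solve_power_flow and the regulator model below are stand-ins, not a real distribution-system solver.

    ```python
    STEPS_PER_YEAR = 365 * 24 * 3600          # 31,536,000 one-second steps

    def solve_power_flow(load, tap):
        return 1.0 - 0.1 * load + 0.00625 * tap    # fake feeder voltage (p.u.)

    def regulator(tap, v):                    # time-dependent voltage controller
        if v < 0.95 and tap < 16:
            tap += 1
        elif v > 1.05 and tap > -16:
            tap -= 1
        return tap

    tap = 0
    for t in range(3600):                     # one simulated hour, for brevity
        load = 0.8 + 0.2 * ((t % 900) / 900.0)     # fake load shape
        v = solve_power_flow(load, tap)
        tap = regulator(tap, v)               # the next step depends on this state
    print(f"final tap position after one hour: {tap}")
    ```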

  19. Autonomous perception and decision making in cyber-physical systems

    NASA Astrophysics Data System (ADS)

    Sarkar, Soumik

    2011-07-01

    The cyber-physical system (CPS) is a relatively new interdisciplinary technology area that includes the general class of embedded and hybrid systems. CPSs require integration of computation and physical processes that involves aspects of physical quantities such as time, energy and space during information processing and control. The physical space is the source of information, and the cyber space makes use of the generated information to make decisions. This dissertation proposes an overall architecture for autonomous perception-based decision and control of complex cyber-physical systems. Perception involves the recently developed framework of Symbolic Dynamic Filtering for abstraction of the physical world in the cyber space. For example, under this framework, sensor observations from a physical entity are discretized temporally and spatially to generate blocks of symbols, also called words, which form a language. A grammar of a language is the set of rules that determine the relationships among words to build sentences. Subsequently, a physical system is conjectured to be a linguistic source that is capable of generating a specific language. The proposed technology is validated on various (experimental and simulated) case studies that include health monitoring of aircraft gas turbine engines, detection and estimation of fatigue damage in polycrystalline alloys, and parameter identification. Control of complex cyber-physical systems involves distributed sensing, computation, and control, as well as complexity analysis. A novel statistical mechanics-inspired complexity analysis approach is proposed in this dissertation. In such a scenario of networked physical systems, the distribution of physical entities determines the underlying network topology, and the interaction among the entities forms the abstract cyber space. It is envisioned that the general contributions made in this dissertation will be useful for potential application areas such as smart power grids and buildings, distributed energy systems, advanced health care procedures, and future ground and air transportation systems.
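
    A sketch of the symbolization step behind Symbolic Dynamic Filtering: partition the sensor range into bins (the alphabet), map each sample to a symbol, and estimate probabilities of fixed-length words. The uniform partition, alphabet size, and word length are placeholder choices.

    ```python
    import math
    from collections import Counter

    def symbolize(signal, n_symbols=4):
        lo, hi = min(signal), max(signal)
        width = (hi - lo) / n_symbols or 1.0       # guard against a flat signal
        return "".join(
            chr(ord("a") + min(int((x - lo) / width), n_symbols - 1))
            for x in signal)

    signal = [math.sin(0.3 * t) for t in range(200)]     # stand-in sensor data
    symbols = symbolize(signal)
    words = Counter(symbols[i:i + 2] for i in range(len(symbols) - 1))
    total = sum(words.values())
    probs = {w: c / total for w, c in words.items()}     # the "language" statistics
    print(sorted(probs.items(), key=lambda kv: -kv[1])[:3])
    ```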

  20. The JAU-JPL anthropomorphic telerobot

    NASA Technical Reports Server (NTRS)

    Jau, Bruno M.

    1989-01-01

    Work in progress on the new anthropomorphic telerobot is described. The initial robot configuration consists of a seven DOF arm and a sixteen DOF hand, having three fingers and a thumb. The robot has active compliance, enabling subsequent dual arm manipulations. To control the rather complex configuration of this robot, an exoskeleton master arm harness and a glove controller were built. The controller will be used for teleoperational tasks and as a research tool to efficiently teach the computer controller advanced manipulation techniques.

  1. A spectral method for spatial downscaling

    EPA Science Inventory

    Complex computer models play a crucial role in air quality research. These models are used to evaluate potential regulatory impacts of emission control strategies and to estimate air quality in areas without monitoring data. For both of these purposes, it is important to calibrat...

  2. Redundant Asynchronous Microprocessor System

    NASA Technical Reports Server (NTRS)

    Meyer, G.; Johnston, J. O.; Dunn, W. R.

    1985-01-01

    Fault-tolerant computer structure called RAMPS (for redundant asynchronous microprocessor system) has simplicity of static redundancy but offers intermittent-fault handling ability of complex, dynamically redundant systems. New structure useful wherever several microprocessors are employed for control - in aircraft, industrial processes, robotics, and automatic machining, for example.

  3. The implementation of fail-operative functions in integrated digital avionics systems

    NASA Technical Reports Server (NTRS)

    Osoer, S. S.

    1976-01-01

    System architectures which incorporate fail operative flight guidance functions within a total integrated avionics complex are described. It is shown that the mixture of flight critical and nonflight critical functions within a common computer complex is an efficient solution to the integration of navigation, guidance, flight control, display, and flight management. Interfacing subsystems retain autonomous capability to avoid vulnerability to total avionics system shutdown as a result of only a few failures.

  4. Dynamic Identification for Control of Large Space Structures

    NASA Technical Reports Server (NTRS)

    Ibrahim, S. R.

    1985-01-01

    This is a compilation of reports by one author on one subject. It consists of the following five journal articles: (1) A Parametric Study of the Ibrahim Time Domain Modal Identification Algorithm; (2) Large Modal Survey Testing Using the Ibrahim Time Domain Identification Technique; (3) Computation of Normal Modes from Identified Complex Modes; (4) Dynamic Modeling of Structures from Measured Complex Modes; and (5) Time Domain Quasi-Linear Identification of Nonlinear Dynamic Systems.

  5. Using block pulse functions for seismic vibration semi-active control of structures with MR dampers

    NASA Astrophysics Data System (ADS)

    Rahimi Gendeshmin, Saeed; Davarnia, Daniel

    2018-03-01

    This article applies the idea of block pulse (BP) functions to the semi-active control of structures. BP functions provide effective tools for approximating complex problems. The applied control algorithm has a major effect on the performance of the controlled system and on the requirements of the control devices. In control problems, it is important to devise an accurate analytical technique with low computational cost. BP functions have proven to be fundamental tools in approximation problems and have been applied in disparate areas over recent decades. This study focuses on employing BP functions in the control algorithm to reduce computational cost. Magneto-rheological (MR) dampers are one of the well-known semi-active tools that can be used to control the response of civil structures during earthquakes. For validation purposes, numerical simulations of a 5-story shear building frame with MR dampers are presented. The results of the suggested method were compared with those obtained by controlling the frame with an optimal control method based on linear quadratic regulator (LQR) theory. The simulation results show that the suggested method can be helpful in reducing seismic structural responses. Moreover, the method has acceptable accuracy and agrees with the optimal control method at lower computational cost.
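
    A minimal sketch of a block pulse expansion: approximate f on [0, T] by m piecewise-constant blocks whose heights are the mean of f over each subinterval. The semi-active control formulation builds on this kind of expansion; f, T, and m here are arbitrary.

    ```python
    import math

    def block_pulse_coeffs(f, T, m, samples=100):
        h = T / m
        return [sum(f(i * h + h * (k + 0.5) / samples) for k in range(samples))
                / samples for i in range(m)]           # mean of f on each block

    def bp_eval(coeffs, T, t):
        i = min(int(t / (T / len(coeffs))), len(coeffs) - 1)
        return coeffs[i]

    c = block_pulse_coeffs(math.sin, 2 * math.pi, 16)
    print(bp_eval(c, 2 * math.pi, 1.0), math.sin(1.0))  # block value vs exact
    ```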

  6. Distributed intelligent control and status networking

    NASA Technical Reports Server (NTRS)

    Fortin, Andre; Patel, Manoj

    1993-01-01

    Over the past two years, the Network Control Systems Branch (Code 532) has been investigating control and status networking technologies. These emerging technologies use distributed processing over a network to accomplish a particular custom task. These networks consist of small intelligent 'nodes' that perform simple tasks. Containing simple, inexpensive hardware and software, these nodes can be easily developed and maintained. Once networked, the nodes can perform a complex operation without a central host. This type of system provides an alternative to more complex control and status systems which require a central computer. This paper will provide some background and discuss some applications of this technology. It will also demonstrate the suitability of one particular technology for the Space Network (SN) and discuss the prototyping activities of Code 532 utilizing this technology.

  7. Biologically inspired collision avoidance system for unmanned vehicles

    NASA Astrophysics Data System (ADS)

    Ortiz, Fernando E.; Graham, Brett; Spagnoli, Kyle; Kelmelis, Eric J.

    2009-05-01

    In this project, we collaborate with researchers in the neuroscience department at the University of Delaware to develop a Field Programmable Gate Array (FPGA)-based embedded computer inspired by the brains of small vertebrates (fish). The mechanisms of object detection and avoidance in fish have been extensively studied by our Delaware collaborators. The midbrain optic tectum is a biological multimodal navigation controller capable of processing input from all senses that convey spatial information, including vision, audition, touch, and the lateral line (water current sensing in fish). Unfortunately, computational complexity makes these models too slow for use in real-time applications. These simulations are run offline on state-of-the-art desktop computers, presenting a gap between the application and the target platform: a low-power embedded device. EM Photonics has expertise in developing high-performance computers based on commodity platforms such as graphics cards (GPUs) and FPGAs. FPGAs offer (1) high computational power, low power consumption, and small footprint (in line with typical autonomous vehicle constraints), and (2) the ability to implement massively parallel computational architectures, which can be leveraged to closely emulate biological systems. Combining UD's brain modeling algorithms and the power of FPGAs, this computer enables autonomous navigation in complex environments, and further types of onboard neural processing in future applications.

  8. Computer-assisted midface reconstruction in Treacher Collins syndrome part 1: skeletal reconstruction.

    PubMed

    Herlin, Christian; Doucet, Jean Charles; Bigorre, Michèle; Khelifa, Hatem Cheikh; Captier, Guillaume

    2013-10-01

    Treacher Collins syndrome (TCS) is a severe and complex craniofacial malformation affecting the facial skeleton and soft tissues. The palate as well as the external and middle ear are also affected, but the prognosis is mainly related to neonatal airway management. Methods of zygomatico-orbital reconstruction are numerous and currently use primarily autologous bone, lyophilized cartilage, alloplastic implants, or even free flaps. This work developed a reliable "customized" method of zygomatico-orbital bony reconstruction using a generic reference model tailored to each patient. From a standard computed tomography (CT) acquisition, we studied qualitatively and quantitatively the skeletons of four individuals with TCS whose ages were between 6 and 20 years. In parallel, we studied 40 age-matched controls to obtain a morphometric reference database. Surgical simulation was carried out using validated software used in craniofacial surgery. The zygomatic hypoplasia was quantitatively and morphologically severe in all TCS individuals. Orbital involvement was mainly morphological, with volumes comparable to those of controls of the same age. The control database was used to create three-dimensional computer models to be used in the manufacture of cutting guides for autologous cranial bone grafts or alloplastic implants perfectly adapted to each patient's morphology. Presurgical simulation was also used to fabricate custom positioning guides permitting a simple and reliable surgical procedure. The use of a virtual database allowed us to design a reliable and reproducible skeletal reconstruction method for this rare and complex syndrome. The use of presurgical simulation tools seems essential in this type of craniofacial malformation to increase the reliability of these uncommon and complex surgical procedures, and to ensure stable results over time. Copyright © 2013 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  9. Computational approaches to motor learning by imitation.

    PubMed Central

    Schaal, Stefan; Ijspeert, Auke; Billard, Aude

    2003-01-01

    Movement imitation requires a complex set of mechanisms that map an observed movement of a teacher onto one's own movement apparatus. Relevant problems include movement recognition, pose estimation, pose tracking, body correspondence, coordinate transformation from external to egocentric space, matching of observed against previously learned movement, resolution of redundant degrees-of-freedom that are unconstrained by the observation, suitable movement representations for imitation, modularization of motor control, etc. All of these topics by themselves are active research problems in computational and neurobiological sciences, such that their combination into a complete imitation system remains a daunting undertaking; indeed, one could argue that we need to understand the complete perception-action loop. As a strategy to untangle the complexity of imitation, this paper will examine imitation purely from a computational point of view, i.e. we will review statistical and mathematical approaches that have been suggested for tackling parts of the imitation problem, and discuss their merits, disadvantages and underlying principles. Given the focus on action recognition of other contributions in this special issue, this paper will primarily emphasize the motor side of imitation, assuming that a perceptual system has already identified important features of a demonstrated movement and created their corresponding spatial information. Based on the formalization of motor control in terms of control policies and their associated performance criteria, useful taxonomies of imitation learning can be generated that clarify different approaches and future research directions. PMID:12689379

  10. Proper Orthogonal Decomposition in Optimal Control of Fluids

    NASA Technical Reports Server (NTRS)

    Ravindran, S. S.

    1999-01-01

    In this article, we present a reduced order modeling approach suitable for active control of fluid dynamical systems based on proper orthogonal decomposition (POD). The rationale behind reduced order modeling is that numerical simulation of the Navier-Stokes equations is still too costly for the purpose of optimization and control of unsteady flows. We examine the possibility of obtaining reduced order models that reduce the computational complexity associated with the Navier-Stokes equations while capturing the essential dynamics by using the POD. The POD allows extraction of a certain optimal set of basis functions, perhaps few in number, from a computational or experimental database through an eigenvalue analysis. The solution is then obtained as a linear combination of this optimal set of basis functions by means of Galerkin projection. This makes it attractive for optimal control and estimation of systems governed by partial differential equations. We here use it in active control of fluid flows governed by the Navier-Stokes equations. We show that the resulting reduced order model can be very efficient for the computations of optimization and control problems in unsteady flows. Finally, implementational issues and numerical experiments are presented for simulations and optimal control of fluid flow through channels.
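
    A sketch of POD basis extraction from snapshot data: stack flow snapshots as columns, take the SVD, and keep the leading left singular vectors as the reduced basis; Galerkin projection onto this basis then yields the reduced-order model. The random "snapshots" are placeholders for real simulation or experimental data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    snapshots = rng.standard_normal((500, 40))    # 500 grid points x 40 snapshots
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    energy = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(energy, 0.99)) + 1    # modes capturing 99% of energy
    basis = U[:, :r]                              # POD modes (optimal basis)
    coeffs = basis.T @ snapshots                  # reduced coordinates
    print(f"kept {r} of {len(s)} modes")
    ```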

  11. Computation of Ground-State Properties in Molecular Systems: Back-Propagation with Auxiliary-Field Quantum Monte Carlo.

    PubMed

    Motta, Mario; Zhang, Shiwei

    2017-11-14

    We address the computation of ground-state properties of chemical systems and realistic materials within the auxiliary-field quantum Monte Carlo method. The phase constraint to control the Fermion phase problem requires the random walks in Slater determinant space to be open-ended with branching. This in turn makes it necessary to use back-propagation (BP) to compute averages and correlation functions of operators that do not commute with the Hamiltonian. Several BP schemes are investigated, and their optimization with respect to the phaseless constraint is considered. We propose a modified BP method for the computation of observables in electronic systems, discuss its numerical stability and computational complexity, and assess its performance by computing ground-state properties in several molecular systems, including small organic molecules.

  12. Current Grid Generation Strategies and Future Requirements in Hypersonic Vehicle Design, Analysis and Testing

    NASA Technical Reports Server (NTRS)

    Papadopoulos, Periklis; Venkatapathy, Ethiraj; Prabhu, Dinesh; Loomis, Mark P.; Olynick, Dave; Arnold, James O. (Technical Monitor)

    1998-01-01

    Recent advances in computational power enable computational fluid dynamic modeling of increasingly complex configurations. A review of grid generation methodologies implemented in support of the computational work performed for the X-38 and X-33 is presented. In strategizing topological constructs and blocking structures, the factors considered are the geometric configuration, optimal grid size, numerical algorithms, accuracy requirements, physics of the problem at hand, computational expense, and the available computer hardware. Also addressed are grid refinement strategies, the effects of wall spacing, and convergence. The significance of the grid is demonstrated through a comparison of computational and experimental results of the aeroheating environment experienced by the X-38 vehicle. Special topics on grid generation strategies to model control surface deflections and material mapping are also addressed.

  13. Controlling uncertainty: a review of human behavior in complex dynamic environments.

    PubMed

    Osman, Magda

    2010-01-01

    Complex dynamic control (CDC) tasks are a type of problem-solving environment used for examining many cognitive activities (e.g., attention, control, decision making, hypothesis testing, implicit learning, memory, monitoring, planning, and problem solving). Because of their popularity, there have been many findings from diverse domains of research (economics, engineering, ergonomics, human-computer interaction, management, psychology), but they remain largely disconnected from each other. The objective of this article is to review theoretical developments and empirical work on CDC tasks, and to introduce a novel framework (monitoring and control framework) as a tool for integrating theory and findings. The main thesis of the monitoring and control framework is that CDC tasks are characteristically uncertain environments, and subjective judgments of uncertainty guide the way in which monitoring and control behaviors attempt to reduce it. The article concludes by discussing new insights into continuing debates and future directions for research on CDC tasks.

  14. Automating quantum experiment control

    NASA Astrophysics Data System (ADS)

    Stevens, Kelly E.; Amini, Jason M.; Doret, S. Charles; Mohler, Greg; Volin, Curtis; Harter, Alexa W.

    2017-03-01

    The field of quantum information processing is rapidly advancing. As the control of quantum systems approaches the level needed for useful computation, the physical hardware underlying the quantum systems is becoming increasingly complex. It is already becoming impractical to manually code control for the larger hardware implementations. In this chapter, we will employ an approach to the problem of system control that parallels compiler design for a classical computer. We will start with a candidate quantum computing technology, the surface electrode ion trap, and build a system instruction language which can be generated from a simple machine-independent programming language via compilation. We incorporate compile time generation of ion routing that separates the algorithm description from the physical geometry of the hardware. Extending this approach to automatic routing at run time allows for automated initialization of qubit number and placement and additionally allows for automated recovery after catastrophic events such as qubit loss. To show that these systems can handle real hardware, we present a simple demonstration system that routes two ions around a multi-zone ion trap and handles ion loss and ion placement. While we will mainly use examples from transport-based ion trap quantum computing, many of the issues and solutions are applicable to other architectures.

  15. Trajectory-Based Complexity (TBX): A Modified Aircraft Count to Predict Sector Complexity During Trajectory-Based Operations

    NASA Technical Reports Server (NTRS)

    Prevot, Thomas; Lee, Paul U.

    2011-01-01

    In this paper we introduce a new complexity metric to predict, in real time, sector complexity for trajectory-based operations (TBO). TBO will be implemented in the Next Generation Air Transportation System (NextGen). Trajectory-Based Complexity (TBX) is a modified aircraft count that can easily be computed and communicated in a TBO environment based upon predictions of aircraft and weather trajectories. TBX is scaled to aircraft count and represents an alternate and additional means to manage air traffic demand and capacity with more consideration of dynamic factors, such as weather, aircraft equipage, or predicted separation violations, as well as static factors such as sector size. We have developed and evaluated TBX in the Airspace Operations Laboratory (AOL) at the NASA Ames Research Center during human-in-the-loop studies of trajectory-based concepts since 2009. In this paper we describe the TBX computation in detail and present the underlying algorithm. Next, we describe the specific TBX used in an experiment at NASA's AOL and evaluate the performance of this metric using data collected during a controller-in-the-loop study on trajectory-based operations at different equipage levels. In this study controllers were prompted at regular intervals to rate their current workload on a numeric scale. Comparing this real-time workload rating to the TBX values predicted for these time periods, we demonstrate that TBX is a better predictor of workload than aircraft count. Furthermore, we demonstrate that TBX is well suited for complexity management in TBO and can easily be adjusted to future operational concepts.
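
    A hedged reconstruction of the idea (not NASA's published formula): start from aircraft count and add weighted penalties for dynamic complexity factors, scaled so the result still reads like an equivalent aircraft count. All weights are invented for illustration.

    ```python
    def tbx(n_aircraft, n_unequipped, n_conflicts, weather_cells,
            sector_scale=1.0):
        raw = (n_aircraft
               + 0.5 * n_unequipped     # extra workload per unequipped aircraft
               + 1.0 * n_conflicts      # predicted separation violations
               + 0.3 * weather_cells)   # convective weather inside the sector
        return raw * sector_scale       # static adjustment for sector size

    print(tbx(n_aircraft=12, n_unequipped=3, n_conflicts=1, weather_cells=2))
    # 15.1 "equivalent aircraft" vs. a raw count of 12
    ```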

  16. Controls of multi-modal wave conditions in a complex coastal setting

    USGS Publications Warehouse

    Hegermiller, Christie; Rueda, Ana C.; Erikson, Li H.; Barnard, Patrick L.; Antolinez, J.A.A.; Mendez, Fernando J.

    2017-01-01

    Coastal hazards emerge from the combined effect of wave conditions and sea level anomalies associated with storms or low-frequency atmosphere-ocean oscillations. Rigorous characterization of wave climate is limited by the availability of spectral wave observations, the computational cost of dynamical simulations, and the ability to link wave-generating atmospheric patterns with coastal conditions. We present a hybrid statistical-dynamical approach to simulating nearshore wave climate in complex coastal settings, demonstrated in the Southern California Bight, where waves arriving from distant, disparate locations are refracted over complex bathymetry and shadowed by offshore islands. Contributions of wave families and large-scale atmospheric drivers to nearshore wave energy flux are analyzed. Results highlight the variability of influences controlling wave conditions along neighboring coastlines. The universal method demonstrated here can be applied to complex coastal settings worldwide, facilitating analysis of the effects of climate change on nearshore wave climate.

  17. Controls of Multimodal Wave Conditions in a Complex Coastal Setting

    NASA Astrophysics Data System (ADS)

    Hegermiller, C. A.; Rueda, A.; Erikson, L. H.; Barnard, P. L.; Antolinez, J. A. A.; Mendez, F. J.

    2017-12-01

    Coastal hazards emerge from the combined effect of wave conditions and sea level anomalies associated with storms or low-frequency atmosphere-ocean oscillations. Rigorous characterization of wave climate is limited by the availability of spectral wave observations, the computational cost of dynamical simulations, and the ability to link wave-generating atmospheric patterns with coastal conditions. We present a hybrid statistical-dynamical approach to simulating nearshore wave climate in complex coastal settings, demonstrated in the Southern California Bight, where waves arriving from distant, disparate locations are refracted over complex bathymetry and shadowed by offshore islands. Contributions of wave families and large-scale atmospheric drivers to nearshore wave energy flux are analyzed. Results highlight the variability of influences controlling wave conditions along neighboring coastlines. The universal method demonstrated here can be applied to complex coastal settings worldwide, facilitating analysis of the effects of climate change on nearshore wave climate.

  18. Periodically Self Restoring Redundant Systems for VLSI Based Highly Reliable Design,

    DTIC Science & Technology

    1984-01-01

    A fault-tolerance technique for realizing highly reliable computer systems for critical real-time control applications is considered; VLSI technology, however, has imposed new constraints. In the classical "static" voted redundancy approach, modules found not to be operating correctly are discarded from the vote, and the system remains failure-free as long as enough redundant modules survive; this approach, however, requires a large number of interconnections and high modular complexity.

  19. Minimally complex ion traps as modules for quantum communication and computing

    NASA Astrophysics Data System (ADS)

    Nigmatullin, Ramil; Ballance, Christopher J.; de Beaudrap, Niel; Benjamin, Simon C.

    2016-10-01

    Optically linked ion traps are promising as components of network-based quantum technologies, including communication systems and modular computers. Experimental results achieved to date indicate that the fidelity of operations within each ion trap module will be far higher than the fidelity of operations involving the links; fortunately internal storage and processing can effectively upgrade the links through the process of purification. Here we perform the most detailed analysis to date on this purification task, using a protocol which is balanced to maximise fidelity while minimising the device complexity and the time cost of the process. Moreover we ‘compile down’ the quantum circuit to device-level operations including cooling and shuttling events. We find that a linear trap with only five ions (two of one species, three of another) can support our protocol while incorporating desirable features such as global control, i.e. laser control pulses need only target an entire zone rather than differentiating one ion from its neighbour. To evaluate the capabilities of such a module we consider its use both as a universal communications node for quantum key distribution, and as the basic repeating unit of a quantum computer. For the latter case we evaluate the threshold for fault tolerant quantum computing using the surface code, finding acceptable fidelities for the ‘raw’ entangling link as low as 83% (or under 75% if an additional ion is available).

  20. Physiological complexity and system adaptability: evidence from postural control dynamics of older adults.

    PubMed

    Manor, Brad; Costa, Madalena D; Hu, Kun; Newton, Elizabeth; Starobinets, Olga; Kang, Hyun Gu; Peng, C K; Novak, Vera; Lipsitz, Lewis A

    2010-12-01

    The degree of multiscale complexity in human behavioral regulation, such as that required for postural control, appears to decrease with advanced aging or disease. To help delineate causes and functional consequences of complexity loss, we examined the effects of visual and somatosensory impairment on the complexity of postural sway during quiet standing and its relationship to postural adaptation to cognitive dual tasking. Participants of the MOBILIZE Boston Study were classified into mutually exclusive groups: controls [intact vision and foot somatosensation, n = 299, 76 ± 5 (SD) yr old], visual impairment only (<20/40 vision, n = 81, 77 ± 4 yr old), somatosensory impairment only (inability to perceive 5.07 monofilament on plantar halluxes, n = 48, 80 ± 5 yr old), and combined impairments (n = 25, 80 ± 4 yr old). Postural sway (i.e., center-of-pressure) dynamics were assessed during quiet standing and cognitive dual tasking, and a complexity index was quantified using multiscale entropy analysis. Postural sway speed and area, which did not correlate with complexity, were also computed. During quiet standing, the complexity index (mean ± SD) was highest in controls (9.5 ± 1.2) and successively lower in the visual (9.1 ± 1.1), somatosensory (8.6 ± 1.6), and combined (7.8 ± 1.3) impairment groups (P = 0.001). Dual tasking resulted in increased sway speed and area but reduced complexity (P < 0.01). Lower complexity during quiet standing correlated with greater absolute (R = -0.34, P = 0.002) and percent (R = -0.45, P < 0.001) increases in postural sway speed from quiet standing to dual-tasking conditions. Sensory impairments contributed to decreased postural sway complexity, which reflected reduced adaptive capacity of the postural control system. Relatively low baseline complexity may, therefore, indicate control systems that are more vulnerable to cognitive and other stressors.
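
    A compact sketch of the multiscale entropy (MSE) analysis behind the complexity index: coarse-grain the sway series at several scales and sum the sample entropy across scales. This O(n^2) reference implementation and the scale range are illustrative; the study's parameters may differ.

    ```python
    import math
    import random

    def sample_entropy(x, m=2, r=None):
        n = len(x)
        if r is None:
            mu = sum(x) / n
            r = 0.2 * math.sqrt(sum((v - mu) ** 2 for v in x) / n)  # 0.2 * SD
        def matches(length):
            c = 0
            for i in range(n - m):
                for j in range(i + 1, n - m):
                    if max(abs(x[i + k] - x[j + k]) for k in range(length)) <= r:
                        c += 1
            return c
        b, a = matches(m), matches(m + 1)     # template matches at m and m+1
        return -math.log(a / b) if a and b else float("inf")

    def coarse_grain(x, scale):
        return [sum(x[i:i + scale]) / scale
                for i in range(0, len(x) - scale + 1, scale)]

    def complexity_index(x, scales=range(1, 6)):
        return sum(sample_entropy(coarse_grain(x, s)) for s in scales)

    random.seed(0)
    sway = [random.gauss(0, 1) for _ in range(300)]   # stand-in sway series
    print(f"complexity index: {complexity_index(sway):.2f}")
    ```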

  1. Physiological complexity and system adaptability: evidence from postural control dynamics of older adults

    PubMed Central

    Costa, Madalena D.; Hu, Kun; Newton, Elizabeth; Starobinets, Olga; Kang, Hyun Gu; Peng, C. K.; Novak, Vera; Lipsitz, Lewis A.

    2010-01-01

    The degree of multiscale complexity in human behavioral regulation, such as that required for postural control, appears to decrease with advanced aging or disease. To help delineate causes and functional consequences of complexity loss, we examined the effects of visual and somatosensory impairment on the complexity of postural sway during quiet standing and its relationship to postural adaptation to cognitive dual tasking. Participants of the MOBILIZE Boston Study were classified into mutually exclusive groups: controls [intact vision and foot somatosensation, n = 299, 76 ± 5 (SD) yr old], visual impairment only (<20/40 vision, n = 81, 77 ± 4 yr old), somatosensory impairment only (inability to perceive 5.07 monofilament on plantar halluxes, n = 48, 80 ± 5 yr old), and combined impairments (n = 25, 80 ± 4 yr old). Postural sway (i.e., center-of-pressure) dynamics were assessed during quiet standing and cognitive dual tasking, and a complexity index was quantified using multiscale entropy analysis. Postural sway speed and area, which did not correlate with complexity, were also computed. During quiet standing, the complexity index (mean ± SD) was highest in controls (9.5 ± 1.2) and successively lower in the visual (9.1 ± 1.1), somatosensory (8.6 ± 1.6), and combined (7.8 ± 1.3) impairment groups (P = 0.001). Dual tasking resulted in increased sway speed and area but reduced complexity (P < 0.01). Lower complexity during quiet standing correlated with greater absolute (R = −0.34, P = 0.002) and percent (R = −0.45, P < 0.001) increases in postural sway speed from quiet standing to dual-tasking conditions. Sensory impairments contributed to decreased postural sway complexity, which reflected reduced adaptive capacity of the postural control system. Relatively low baseline complexity may, therefore, indicate control systems that are more vulnerable to cognitive and other stressors. PMID:20947715

  2. A radiation-hardened computer for satellite applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaona, J.I. Jr.

    1996-08-01

    This paper describes high reliability radiation hardened computers built by Sandia for application aboard DOE satellite programs requiring 32 bit processing. The computers highlight a radiation hardened (10 kGy(Si)) R3000 executing up to 10 million reduced instruction set (RISC) instructions per second (MIPS), a dual purpose module control bus used for real-time default and power management which allows for extended mission operation on as little as 1.2 watts, and a local area network capable of 480 Mbits/s. The central processing unit (CPU) is the NASA Goddard R3000 nicknamed the "Mongoose" or "Mongoose 1". The Sandia Satellite Computer (SSC) uses Rational's Ada compiler, debugger, operating system kernel, and enhanced floating point emulation library targeted at the Mongoose. The SSC gives Sandia the capability of processing complex types of spacecraft attitude determination and control algorithms and of modifying programmed control laws via ground command. In general, the SSC offers end users the ability to process data onboard the spacecraft that would normally have been sent to the ground, which allows reconsideration of traditional space-ground partitioning options.

  3. Application of programmable logic controllers to space simulation

    NASA Technical Reports Server (NTRS)

    Sushon, Janet

    1992-01-01

    Incorporating a state-of-the-art process control and instrumentation system into a complex system for thermal vacuum testing is discussed. The challenge was to connect several independent control systems provided by various vendors to a supervisory computer. This combination will sequentially control and monitor the process, collect the data, and transmit it to a color graphics system for subsequent manipulation. The vacuum system upgrade included replacement of seventeen diffusion pumps with eight cryogenic pumps and one turbomolecular pump, replacement of a relay-based control system, replacement of the vacuum instrumentation, and an upgrade of the data acquisition system.

  4. Airport-Noise Levels and Annoyance Model (ALAMO) user's guide

    NASA Technical Reports Server (NTRS)

    Deloach, R.; Donaldson, J. L.; Johnson, M. J.

    1986-01-01

    A guide for the use of the Airport-Noise Levels and Annoyance MOdel (ALAMO) at the Langley Research Center computer complex is provided. This document is divided into five primary sections: the introduction, the purpose of the model, and in-depth descriptions of the following subsystems: baseline, noise reduction simulation, and track analysis. For each subsystem, the user is provided with a description of the architecture, an explanation of subsystem use, sample results, and a case runner's checklist. It is assumed that the user is familiar with operations at the Langley Research Center (LaRC) computer complex, the Network Operating System (NOS 1.4), and CYBER Control Language. Incorporated within the ALAMO model is a census database system called SITE II.

  5. A decision-theoretic approach to the display of information for time-critical decisions: The Vista project

    NASA Technical Reports Server (NTRS)

    Horvitz, Eric; Ruokangas, Corinne; Srinivas, Sampath; Barry, Matthew

    1993-01-01

    We describe a collaborative research and development effort between the Palo Alto Laboratory of the Rockwell Science Center, Rockwell Space Operations Company, and the Propulsion Systems Section of NASA JSC to design computational tools that can manage the complexity of information displayed to human operators in high-stakes, time-critical decision contexts. We shall review an application from NASA Mission Control and describe how we integrated a probabilistic diagnostic model and a time-dependent utility model, with techniques for managing the complexity of computer displays. Then, we shall describe the behavior of VPROP, a system constructed to demonstrate promising display-management techniques. Finally, we shall describe our current research directions on the Vista 2 follow-on project.

  6. Prospects of a mathematical theory of human behavior in complex man-machine systems tasks. [time sharing computer analogy of automobile driving

    NASA Technical Reports Server (NTRS)

    Johannsen, G.; Rouse, W. B.

    1978-01-01

    A hierarchy of human activities is derived by analyzing automobile driving in general terms. A structural description leads to a block diagram and a time-sharing computer analogy. The range of applicability of existing mathematical models is considered with respect to the hierarchy of human activities in actual complex tasks. Other mathematical tools so far not often applied to man-machine systems are also discussed. The mathematical descriptions at least briefly considered here include utility, estimation, control, queueing, and fuzzy set theory as well as artificial intelligence techniques. Some thoughts are given as to how these methods might be integrated and how further work might be pursued.

  7. Scalable service architecture for providing strong service guarantees

    NASA Astrophysics Data System (ADS)

    Christin, Nicolas; Liebeherr, Joerg

    2002-07-01

    For the past decade, a lot of Internet research has been devoted to providing different levels of service to applications. Initial proposals for service differentiation provided strong service guarantees, with strict bounds on delays, loss rates, and throughput, but required high overhead in terms of computational complexity and memory, both of which raise scalability concerns. Recently, the interest has shifted to service architectures with low overhead. However, these newer service architectures only provide weak service guarantees, which do not always address the needs of applications. In this paper, we describe a service architecture that supports strong service guarantees, can be implemented with low computational complexity, and requires only a small amount of state information to be maintained. A key mechanism of the proposed service architecture is that it addresses scheduling and buffer management in a single algorithm. The presented architecture offers no solution for controlling the amount of traffic that enters the network. Instead, we plan on exploiting feedback mechanisms of TCP congestion control algorithms for the purpose of regulating the traffic entering the network.
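
    The abstract does not name the algorithm, so the following is only a schematic Python illustration of how scheduling and buffer management can be folded into one mechanism: a single deadline-ordered queue both orders transmissions (earliest deadline first) and drops packets, on overflow at enqueue or on deadline expiry at dequeue.

        import heapq, itertools

        class DeadlineQueue:
            """Illustrative combined scheduler/buffer manager (assumed design)."""

            def __init__(self, capacity):
                self.heap, self.capacity = [], capacity
                self.seq = itertools.count()          # tie-breaker for equal deadlines

            def enqueue(self, packet, deadline):
                if len(self.heap) >= self.capacity:   # buffer management: reject on overflow
                    return False
                heapq.heappush(self.heap, (deadline, next(self.seq), packet))
                return True

            def dequeue(self, now):
                while self.heap:
                    deadline, _, packet = heapq.heappop(self.heap)
                    if deadline >= now:               # still schedulable: transmit
                        return packet
                    # deadline already violated: drop silently and keep looking
                return None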

  8. Viability of Bioprinted Cellular Constructs Using a Three Dispenser Cartesian Printer.

    PubMed

    Dennis, Sarah Grace; Trusk, Thomas; Richards, Dylan; Jia, Jia; Tan, Yu; Mei, Ying; Fann, Stephen; Markwald, Roger; Yost, Michael

    2015-09-22

    Tissue engineering has centered its focus on the construction of replacements for non-functional or damaged tissue. The utilization of three-dimensional bioprinting in tissue engineering has generated new methods for the printing of cells and matrix to fabricate biomimetic tissue constructs. The solid freeform fabrication (SFF) method developed for three-dimensional bioprinting uses an additive manufacturing approach by depositing droplets of cells and hydrogels in a layer-by-layer fashion. Bioprinting fabrication is dependent on the specific placement of biological materials into three-dimensional architectures, and the printed constructs should closely mimic the complex organization of cells and extracellular matrices in native tissue. This paper highlights the use of the Palmetto Printer, a Cartesian bioprinter, as well as the process of producing spatially organized, viable constructs while simultaneously allowing control of environmental factors. This methodology utilizes computer-aided design and computer-aided manufacturing to produce these specific and complex geometries. Finally, this approach allows for the reproducible production of fabricated constructs optimized by controllable printing parameters.

  9. An auto-adaptive optimization approach for targeting nonpoint source pollution control practices.

    PubMed

    Chen, Lei; Wei, Guoyuan; Shen, Zhenyao

    2015-10-21

    To solve the computationally intensive and technically complex problem of nonpoint source pollution control, the traditional genetic algorithm was modified into an auto-adaptive pattern, and a new framework was proposed by integrating this new algorithm with a watershed model and an economic module. Although conceptually simple and comprehensive, the proposed algorithm searches automatically for Pareto-optimal solutions without complex calibration of optimization parameters. The model was applied in a case study in a typical watershed of the Three Gorges Reservoir area, China. The results indicated that the evolutionary process of optimization was improved due to the incorporation of auto-adaptive parameters. In addition, the proposed algorithm outperformed existing state-of-the-art algorithms in terms of convergence ability and computational efficiency. At the same cost level, solutions with greater pollutant reductions could be identified. From a scientific viewpoint, the proposed algorithm could be extended to other watersheds to provide cost-effective configurations of BMPs.
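
    A minimal sketch of what 'auto-adaptive' can mean in this setting: crossover and mutation probabilities are recomputed every generation from population diversity instead of being calibrated by hand. The rate formulas, real-valued encoding, and placeholder fitness function below are assumptions, not the authors' scheme.

        import random

        def diversity(pop):
            # Mean squared distance of individuals from the population centroid.
            mean = [sum(col) / len(pop) for col in zip(*pop)]
            return sum(sum((g - m)**2 for g, m in zip(ind, mean)) for ind in pop) / len(pop)

        def evolve(fitness, n_genes=5, pop_size=40, generations=100):
            pop = [[random.random() for _ in range(n_genes)] for _ in range(pop_size)]
            d0 = diversity(pop) or 1.0
            for _ in range(generations):
                d = diversity(pop) / d0              # 1.0 = fully diverse, -> 0 = converged
                p_cross = 0.6 + 0.3 * (1 - d)        # auto-adaptive rates: push harder
                p_mut = 0.01 + 0.2 * (1 - d)         # as diversity collapses
                pop.sort(key=fitness)                # minimisation: best individuals first
                parents = pop[:pop_size // 2]
                children = []
                while len(children) < pop_size - len(parents):
                    a, b = random.sample(parents, 2)
                    child = [x if random.random() > p_cross else y for x, y in zip(a, b)]
                    child = [g if random.random() > p_mut else random.random() for g in child]
                    children.append(child)
                pop = parents + children
            return min(pop, key=fitness)

        best = evolve(lambda ind: sum((g - 0.3)**2 for g in ind))   # toy cost function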

  10. Computational Fluid Dynamics (CFD) Simulations of a Finned Projectile with Microflaps for Flow Control

    DTIC Science & Technology

    2016-04-01

    fields associated with these control mechanisms for US Army weapons are complex, involving 3-dimensional (3-D) shock-boundary layer interactions... distribution over the rear finned section and thus produce control forces and moments. Dykes et al. used a flat-plate fin interaction design of... Mixed cells—tetrahedrals, triangular prisms, and pyramids—were used in the mesh. Grid points were clustered in the boundary layer region.

  11. Fuzzy Integration of Support Vector Regression Models for Anticipatory Control of Complex Energy Systems

    DOE PAGES

    Alamaniotis, Miltiadis; Agarwal, Vivek

    2014-04-01

    Anticipatory control systems are a class of systems whose decisions are based on predictions for the future state of the system under monitoring. Anticipation denotes intelligence and is an inherent property of humans, who make decisions by projecting into the future. Likewise, artificially intelligent systems equipped with predictive functions may be utilized for anticipating future states of complex systems, and therefore facilitate automated control decisions. Anticipatory control of complex energy systems is paramount to their normal and safe operation. In this paper a new intelligent methodology integrating fuzzy inference with support vector regression is introduced. Our proposed methodology implements an anticipatory system aiming at controlling energy systems in a robust way. Initially a set of support vector regressors is adopted for making predictions over critical system parameters. Furthermore, the predicted values are fed into a two-stage fuzzy inference system that makes decisions regarding the state of the energy system. The inference system integrates the individual predictions into a single one at its first stage, and outputs a decision together with a certainty factor computed at its second stage. The certainty factor is an index of the significance of the decision. The proposed anticipatory control system is tested on a real-world set of data obtained from a complex energy system, describing the degradation of a turbine. Results exhibit the robustness of the proposed system in controlling complex energy systems.
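
    The two-stage structure can be sketched compactly, with the regressor outputs stubbed as plain numbers: stage 1 fuses the individual predictions, stage 2 maps the fused value to a decision plus a certainty factor. The membership shapes, state labels, and thresholds below are assumptions for illustration, not the authors' design.

        import numpy as np

        def triangular(x, a, b, c):
            """Triangular membership function peaking at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def fuse(predictions, confidences):
            """Stage 1: confidence-weighted aggregation of regressor outputs."""
            w = np.asarray(confidences, dtype=float)
            return float(np.dot(w, predictions) / w.sum())

        def decide(fused_value):
            """Stage 2: fuzzy decision with a certainty factor in [0, 1]."""
            memberships = {
                "normal":  triangular(fused_value, -1.0, 0.0, 0.5),
                "degrade": triangular(fused_value,  0.2, 0.6, 0.9),
                "trip":    triangular(fused_value,  0.7, 1.0, 2.0),
            }
            decision = max(memberships, key=memberships.get)
            return decision, memberships[decision]

        # e.g. three SVR predictions of a normalised degradation parameter:
        print(decide(fuse([0.55, 0.62, 0.58], [0.9, 0.7, 0.8])))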

  12. A quantum Fredkin gate.

    PubMed

    Patel, Raj B; Ho, Joseph; Ferreyrol, Franck; Ralph, Timothy C; Pryde, Geoff J

    2016-03-01

    Minimizing the resources required to build logic gates into useful processing circuits is key to realizing quantum computers. Although the salient features of a quantum computer have been shown in proof-of-principle experiments, difficulties in scaling quantum systems have made more complex operations intractable. This is exemplified in the classical Fredkin (controlled-SWAP) gate for which, despite theoretical proposals, no quantum analog has been realized. By adding control to the SWAP unitary, we use photonic qubit logic to demonstrate the first quantum Fredkin gate, which promises many applications in quantum information and measurement. We implement example algorithms and generate the highest-fidelity three-photon Greenberger-Horne-Zeilinger states to date. The technique we use allows one to add a control operation to a black-box unitary, something that is impossible in the standard circuit model. Our experiment represents the first use of this technique to control a two-qubit operation and paves the way for larger controlled circuits to be realized efficiently.
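
    For reference, the classical truth table the quantum gate must reproduce on computational basis states: controlled-SWAP exchanges the two target bits exactly when the control is 1. The sketch below builds the corresponding 8 x 8 permutation and checks that it is unitary and self-inverse; it says nothing, of course, about the photonic implementation itself.

        import numpy as np

        def fredkin(c, t1, t2):
            # Swap the targets only when the control bit is set.
            return (c, t2, t1) if c == 1 else (c, t1, t2)

        # Build the gate in the computational basis |c t1 t2>.
        U = np.zeros((8, 8))
        for c in (0, 1):
            for t1 in (0, 1):
                for t2 in (0, 1):
                    i = (c << 2) | (t1 << 1) | t2
                    cc, o1, o2 = fredkin(c, t1, t2)
                    j = (cc << 2) | (o1 << 1) | o2
                    U[j, i] = 1.0

        assert np.allclose(U @ U.T, np.eye(8))   # permutation matrix => unitary
        assert np.allclose(U @ U, np.eye(8))     # Fredkin is its own inverse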

  13. Computer software.

    PubMed

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  14. Enabling Predictive Simulation and UQ of Complex Multiphysics PDE Systems by the Development of Goal-Oriented Variational Sensitivity Analysis and a-Posteriori Error Estimation Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Estep, Donald

    2015-11-30

    This project addressed the challenge of predictive computational analysis of strongly coupled, highly nonlinear multiphysics systems characterized by multiple physical phenomena that span a large range of length- and time-scales. Specifically, the project was focused on computational estimation of numerical error and sensitivity analysis of computational solutions with respect to variations in parameters and data. In addition, the project investigated the use of accurate computational estimates to guide efficient adaptive discretization. The project developed, analyzed and evaluated new variational adjoint-based techniques for integration, model, and data error estimation/control and sensitivity analysis, in evolutionary multiphysics multiscale simulations.

  15. Handling Qualities Evaluations of Low Complexity Model Reference Adaptive Controllers for Reduced Pitch and Roll Damping Scenarios

    NASA Technical Reports Server (NTRS)

    Hanson, Curt; Schaefer, Jacob; Burken, John J.; Johnson, Marcus; Nguyen, Nhan

    2011-01-01

    National Aeronautics and Space Administration (NASA) researchers have conducted a series of flight experiments designed to study the effects of varying levels of adaptive controller complexity on the performance and handling qualities of an aircraft under various simulated failure or damage conditions. A baseline, nonlinear dynamic inversion controller was augmented with three variations of a model reference adaptive control design. The simplest design consisted of a single adaptive parameter in each of the pitch and roll axes computed using a basic gradient-based update law. A second design was built upon the first by increasing the complexity of the update law. The third and most complex design added an additional adaptive parameter to each axis. Flight tests were conducted using NASA's Full-scale Advanced Systems Testbed, a highly modified F-18 aircraft that contains a research flight control system capable of housing advanced flight controls experiments. Each controller was evaluated against a suite of simulated failures and damage ranging from destabilization of the pitch and roll axes to significant coupling between the axes. Two pilots evaluated the three adaptive controllers as well as the non-adaptive baseline controller in a variety of dynamic maneuvers and precision flying tasks designed to uncover potential deficiencies in the handling qualities of the aircraft, and adverse interactions between the pilot and the adaptive controllers. The work was completed as part of the Integrated Resilient Aircraft Control Project under NASA's Aviation Safety Program.
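
    To make "a single adaptive parameter in each axis, computed using a basic gradient-based update law" concrete, here is a scalar sketch in the classic MIT-rule style for adapting a feedforward gain. The first-order plant, reference model, gains, and step size are invented for illustration and bear no relation to the F-18 dynamics.

        # Scalar MIT-rule sketch (illustrative values, not the flight system).
        dt, gamma = 0.001, 1.0
        a, k_p, k_m = 2.0, 3.0, 1.0        # shared pole, plant gain, model gain

        x = xm = theta = 0.0               # plant state, model state, adaptive gain
        for k in range(20000):
            r = 1.0 if (k * dt) % 4 < 2 else -1.0   # square-wave reference
            u = theta * r                            # single adaptive parameter
            e = x - xm                               # tracking error
            theta += -gamma * e * xm * dt            # gradient-based (MIT rule) update
            x += (-a * x + k_p * u) * dt             # plant:  x'  = -a x  + k_p u
            xm += (-a * xm + k_m * r) * dt           # model:  xm' = -a xm + k_m r

        print(f"adapted gain {theta:.3f} (ideal {k_m / k_p:.3f})")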

  16. Multiobjective Optimal Control Methodology for the Analysis of Certain Sociodynamic Problems

    DTIC Science & Technology

    2009-03-01

    ...but less expensive in both time and memory.

  17. Characterization of the influence of external stimulus on protein-nucleic acid complex through multiscale computations

    NASA Astrophysics Data System (ADS)

    Gosai, Agnivo

    The concomitant detection, monitoring and analysis of biomolecules have assumed utmost importance in the field of medical diagnostics as well as in different spheres of biotechnology research such as drug development, environmental hazard detection and biodefense. There is an increased demand for the modulation of the biological response for such detection / sensing schemes, which will be facilitated by the sensitive and controllable transmission of external stimuli. Electrostatic actuation for the controlled release/capture of biomolecules through conformational transformations of bioreceptors provides an efficient and feasible mechanism to modulate biological response. In addition, the electrostatic actuation mechanism has the advantage of allowing massively parallel schemes and measurement capabilities that could ultimately be essential for biomedical applications. Experiments have previously demonstrated the unbinding of thrombin from its aptamer in the presence of a small positive electrode potential, whereas the complex remained associated in the presence of small negative or zero potentials. However, the nanoscale physics/chemistry involved in this process is not clearly understood. In this thesis a combination of continuum mechanics based modeling and a variety of atomistic simulation techniques have been utilized to corroborate the aforementioned experimental observations. It is found that the computational approach can satisfactorily predict the dynamics of the electrically excited aptamer-thrombin complex as well as provide an analytical model to characterize the forced binding of the complex.

  18. Visual Complexity and Affect: Ratings Reflect More Than Meets the Eye.

    PubMed

    Madan, Christopher R; Bayer, Janine; Gamer, Matthias; Lonsdorf, Tina B; Sommer, Tobias

    2017-01-01

    Pictorial stimuli can vary on many dimensions, several aspects of which are captured by the term 'visual complexity.' Visual complexity can be described as, "a picture of a few objects, colors, or structures would be less complex than a very colorful picture of many objects that is composed of several components." Prior studies have reported a relationship between affect and visual complexity, where complex pictures are rated as more pleasant and arousing. However, a relationship in the opposite direction, an effect of affect on visual complexity, is also possible; emotional arousal and valence are known to influence selective attention and visual processing. In a series of experiments, we found that ratings of visual complexity correlated with affective ratings, and independently also with computational measures of visual complexity. These computational measures did not correlate with affect, suggesting that complexity ratings are separately related to distinct factors. We investigated the relationship between affect and ratings of visual complexity, finding an 'arousal-complexity bias' to be a robust phenomenon. Moreover, we found this bias could be attenuated when explicitly indicated but did not correlate with inter-individual difference measures of affective processing, and was largely unrelated to cognitive and eyetracking measures. Taken together, the arousal-complexity bias seems to be caused by a relationship between arousal and visual processing as it has been described for the greater vividness of arousing pictures. The described arousal-complexity bias is also of relevance from an experimental perspective because visual complexity is often considered a variable to control for when using pictorial stimuli.

  19. Visual Complexity and Affect: Ratings Reflect More Than Meets the Eye

    PubMed Central

    Madan, Christopher R.; Bayer, Janine; Gamer, Matthias; Lonsdorf, Tina B.; Sommer, Tobias

    2018-01-01

    Pictorial stimuli can vary on many dimensions, several aspects of which are captured by the term ‘visual complexity.’ Visual complexity can be described as, “a picture of a few objects, colors, or structures would be less complex than a very colorful picture of many objects that is composed of several components.” Prior studies have reported a relationship between affect and visual complexity, where complex pictures are rated as more pleasant and arousing. However, a relationship in the opposite direction, an effect of affect on visual complexity, is also possible; emotional arousal and valence are known to influence selective attention and visual processing. In a series of experiments, we found that ratings of visual complexity correlated with affective ratings, and independently also with computational measures of visual complexity. These computational measures did not correlate with affect, suggesting that complexity ratings are separately related to distinct factors. We investigated the relationship between affect and ratings of visual complexity, finding an ‘arousal-complexity bias’ to be a robust phenomenon. Moreover, we found this bias could be attenuated when explicitly indicated but did not correlate with inter-individual difference measures of affective processing, and was largely unrelated to cognitive and eyetracking measures. Taken together, the arousal-complexity bias seems to be caused by a relationship between arousal and visual processing as it has been described for the greater vividness of arousing pictures. The described arousal-complexity bias is also of relevance from an experimental perspective because visual complexity is often considered a variable to control for when using pictorial stimuli. PMID:29403412

  20. Fusing terrain and goals: agent control in urban environments

    NASA Astrophysics Data System (ADS)

    Kaptan, Varol; Gelenbe, Erol

    2006-04-01

    The changing face of contemporary military conflicts has forced a major shift of focus in tactical planning and evaluation from the classical Cold War battlefield to an asymmetric guerrilla-type warfare in densely populated urban areas. The new arena of conflict presents unique operational difficulties due to factors like complex mobility restrictions and the necessity to preserve civilian lives and infrastructure. In this paper we present a novel method for autonomous agent control in an urban environment. Our approach is based on fusing terrain information and agent goals for the purpose of transforming the problem of navigation in a complex environment with many obstacles into the easier problem of navigation in a virtual obstacle-free space. The main advantage of our approach is its ability to act as an adapter layer for a number of efficient agent control techniques which normally show poor performance when applied to an environment with many complex obstacles. Because of the very low computational and space complexity at runtime, our method is also particularly well suited for simulation or control of a huge number of agents (military as well as civilian) in a complex urban environment where traditional path-planning may be too expensive or where a just-in-time decision with hard real-time constraints is required.

  1. New technologies in radiation therapy: ensuring patient safety, radiation safety and regulatory issues in radiation oncology.

    PubMed

    Amols, Howard I

    2008-11-01

    New technologies such as intensity modulated and image guided radiation therapy, computer controlled linear accelerators, record and verify systems, electronic charts, and digital imaging have revolutionized radiation therapy over the past 10-15 y. Quality assurance (QA) as historically practiced and as recommended in reports such as American Association of Physicists in Medicine Task Groups 40 and 53 needs to be updated to address the increasing complexity and computerization of radiotherapy equipment, and the increased quantity of data defining a treatment plan and treatment delivery. While new technology has reduced the probability of many types of medical events, new types of errors caused by improper use of new technology, communication failures between computers, corrupted or erroneous computer data files, and "software bugs" are now being seen. The increased use of computed tomography, magnetic resonance, and positron emission tomography imaging has become routine for many types of radiotherapy treatment planning, and QA for imaging modalities is beyond the expertise of most radiotherapy physicists. Errors in radiotherapy rarely result solely from hardware failures. More commonly they are a combination of computer and human errors. The increased use of radiosurgery, hypofractionation, more complex intensity modulated treatment plans, image guided radiation therapy, and increasing financial pressures to treat more patients in less time will continue to fuel this reliance on high technology and complex computer software. Clinical practitioners and regulatory agencies are beginning to realize that QA for new technologies is a major challenge and poses dangers different in nature than what are historically familiar.

  2. Synthetic analog computation in living cells.

    PubMed

    Daniel, Ramiz; Rubens, Jacob R; Sarpeshkar, Rahul; Lu, Timothy K

    2013-05-30

    A central goal of synthetic biology is to achieve multi-signal integration and processing in living cells for diagnostic, therapeutic and biotechnology applications. Digital logic has been used to build small-scale circuits, but other frameworks may be needed for efficient computation in the resource-limited environments of cells. Here we demonstrate that synthetic analog gene circuits can be engineered to execute sophisticated computational functions in living cells using just three transcription factors. Such synthetic analog gene circuits exploit feedback to implement logarithmically linear sensing, addition, ratiometric and power-law computations. The circuits exhibit Weber's law behaviour as in natural biological systems, operate over a wide dynamic range of up to four orders of magnitude and can be designed to have tunable transfer functions. Our circuits can be composed to implement higher-order functions that are well described by both intricate biochemical models and simple mathematical functions. By exploiting analog building-block functions that are already naturally present in cells, this approach efficiently implements arithmetic operations and complex functions in the logarithmic domain. Such circuits may lead to new applications for synthetic biology and biotechnology that require complex computations with limited parts, need wide-dynamic-range biosensing or would benefit from the fine control of gene expression.

  3. Data-Based Predictive Control with Multirate Prediction Step

    NASA Technical Reports Server (NTRS)

    Barlow, Jonathan S.

    2010-01-01

    Data-based predictive control is an emerging control method that stems from Model Predictive Control (MPC). MPC computes current control action based on a prediction of the system output a number of time steps into the future and is generally derived from a known model of the system. Data-based predictive control has the advantage of deriving predictive models and controller gains from input-output data. Thus, a controller can be designed from the outputs of complex simulation code or a physical system where no explicit model exists. If the output data happens to be corrupted by periodic disturbances, the designed controller will also have the built-in ability to reject these disturbances without the need to know them. When data-based predictive control is implemented online, it becomes a version of adaptive control. One challenge of MPC is computational requirements increasing with prediction horizon length. This paper develops a closed-loop dynamic output feedback controller that minimizes a multi-step-ahead receding-horizon cost function with multirate prediction step. One result is a reduced influence of prediction horizon and the number of system outputs on the computational requirements of the controller. Another result is an emphasis on portions of the prediction window that are sampled more frequently. A third result is the ability to include more outputs in the feedback path than in the cost function.
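
    A compact sketch of the data-based idea (deliberately simpler than the paper's multirate, output-feedback formulation): identify an ARX predictor from input-output data by least squares, iterate it for multi-step-ahead prediction, and choose the control in a receding-horizon loop. The model orders and the candidate-input grid are assumptions.

        import numpy as np

        def fit_arx(u, y, na=2, nb=2):
            """Least-squares ARX fit: y[k] = a1*y[k-1]+... + b1*u[k-1]+..."""
            rows, rhs = [], []
            for k in range(max(na, nb), len(y)):
                rows.append(np.r_[y[k - na:k][::-1], u[k - nb:k][::-1]])
                rhs.append(y[k])
            theta, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
            return theta[:na], theta[na:]

        def predict(a, b, y_hist, u_hist, u_future):
            """Multi-step-ahead prediction by iterating the one-step model."""
            y_hist, u_hist, out = list(y_hist), list(u_hist), []
            for u_next in u_future:
                u_hist.append(u_next)
                y_next = float(np.dot(a, y_hist[-len(a):][::-1]) +
                               np.dot(b, u_hist[-len(b):][::-1]))
                y_hist.append(y_next)
                out.append(y_next)
            return np.array(out)

        def receding_horizon_move(a, b, y_hist, u_hist, setpoint, horizon=10):
            """Pick the constant input over the horizon minimising tracking cost."""
            candidates = np.linspace(-1.0, 1.0, 41)
            costs = [np.sum((predict(a, b, y_hist, u_hist, [u] * horizon) - setpoint)**2)
                     for u in candidates]
            return candidates[int(np.argmin(costs))]   # apply first move, then repeat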

  4. General Methodology Combining Engineering Optimization of Primary HVAC and R Plants with Decision Analysis Methods--Part II: Uncertainty and Decision Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Wei; Reddy, T. A.; Gurian, Patrick

    2007-01-31

    This companion paper to Jiang and Reddy presents a general and computationally efficient methodology for dynamic scheduling and optimal control of complex primary HVAC&R plants using a deterministic engineering optimization approach.

  5. Mathematical concepts for modeling human behavior in complex man-machine systems

    NASA Technical Reports Server (NTRS)

    Johannsen, G.; Rouse, W. B.

    1979-01-01

    Many human behavior (e.g., manual control) models have been found to be inadequate for describing processes in certain real complex man-machine systems. An attempt is made to find a way to overcome this problem by examining the range of applicability of existing mathematical models with respect to the hierarchy of human activities in real complex tasks. Automobile driving is chosen as a baseline scenario, and a hierarchy of human activities is derived by analyzing this task in general terms. A structural description leads to a block diagram and a time-sharing computer analogy.

  6. The semiotics of control and modeling relations in complex systems.

    PubMed

    Joslyn, C

    2001-01-01

    We provide a conceptual analysis of ideas and principles from the systems theory discourse which underlie Pattee's semantic or semiotic closure, which is itself foundational for a school of theoretical biology derived from systems theory and cybernetics, and is now being related to biological semiotics and explicated in the relational biological school of Rashevsky and Rosen. Atomic control systems and models are described as the canonical forms of semiotic organization, sharing measurement relations, but differing topologically in that control systems are circularly and models linearly related to their environments. Computation in control systems is introduced, motivating hierarchical decomposition, hybrid modeling and control systems, and anticipatory or model-based control. The semiotic relations in complex control systems are described in terms of relational constraints, and rules and laws are distinguished as contingent and necessary functional entailments, respectively. Finally, selection as a meta-level of constraint is introduced as the necessary condition for semantic relations in control systems and models.

  7. Complexity Bounds for Quantum Computation

    DTIC Science & Technology

    2007-06-22

    This project focused on upper and lower bounds for quantum computability using constant... classical computation models, particularly emphasizing new examples of where quantum circuits are more powerful than their classical counterparts. A second...

  8. Assessment of flat rolling theories for the use in a model-based controller for high-precision rolling applications

    NASA Astrophysics Data System (ADS)

    Stockert, Sven; Wehr, Matthias; Lohmar, Johannes; Abel, Dirk; Hirt, Gerhard

    2017-10-01

    In the electrical and medical industries the trend towards further miniaturization of devices is accompanied by the demand for smaller manufacturing tolerances. Such industries use a multitude of small and narrow cold rolled metal strips with high thickness accuracy. Conventional rolling mills can hardly achieve further improvement of these tolerances. However, a model-based controller in combination with an additional piezoelectric actuator for highly dynamic roll adjustment is expected to enable the production of the required metal strips with a thickness tolerance of +/-1 µm. The model-based controller has to be based on a rolling theory which can describe the rolling process very accurately. Additionally, the required computing time has to be low in order to predict the rolling process in real time. In this work, four rolling theories from the literature with different levels of complexity are tested for their suitability for the predictive controller. The rolling theories of von Kármán, Siebel, Bland & Ford, and Alexander are implemented in Matlab and afterwards transferred to the real-time computer used for the controller. The prediction accuracy of these theories is validated using rolling trials with different thickness reductions and a comparison with the calculated results. Furthermore, the required computing time on the real-time computer is measured. Adequate prediction accuracy can be achieved with the rolling theories developed by Bland & Ford and Alexander. A comparison of the computing times of these two theories reveals that Alexander's theory cannot be evaluated within the 1 kHz sample rate of the real-time computer.

  9. Comparison between iterative wavefront control algorithm and direct gradient wavefront control algorithm for adaptive optics system

    NASA Astrophysics Data System (ADS)

    Cheng, Sheng-Yi; Liu, Wen-Jin; Chen, Shan-Qiu; Dong, Li-Zhi; Yang, Ping; Xu, Bing

    2015-08-01

    Among the many kinds of wavefront control algorithms in adaptive optics systems, the direct gradient wavefront control algorithm is the most widespread and common method. This control algorithm obtains the actuator voltages directly from wavefront slopes through a pre-measured relational matrix between the deformable mirror actuators and the Hartmann wavefront sensor, with perfect real-time characteristics and stability. However, as the numbers of sub-apertures in the wavefront sensor and of deformable mirror actuators in adaptive optics systems increase, the matrix operation in the direct gradient algorithm takes too much time, which becomes a major factor limiting the control effect of adaptive optics systems. In this paper we apply an iterative wavefront control algorithm to high-resolution adaptive optics systems, in which the voltage of each actuator is obtained through iteration, gaining great advantages in computation and storage. For an AO system with thousands of actuators, the computational complexity estimate is about O(n^2) to O(n^3) for the direct gradient wavefront control algorithm, while the computational complexity estimate for the iterative wavefront control algorithm is about O(n) to O(n^(3/2)), where n is the number of actuators of the AO system. The larger the numbers of sub-apertures and deformable mirror actuators, the more significant the advantage the iterative wavefront control algorithm exhibits. Project supported by the National Key Scientific and Research Equipment Development Project of China (Grant No. ZDYZ2013-2), the National Natural Science Foundation of China (Grant No. 11173008), and the Sichuan Provincial Outstanding Youth Academic Technology Leaders Program, China (Grant No. 2012JQ0012).
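
    The contrast between the two reconstruction styles can be seen in a few lines; the sizes and the random interaction matrix below are placeholders, and conjugate gradient is used as one representative iterative scheme rather than the specific iteration of the paper. The direct method applies a precomputed pseudoinverse; the iterative method needs only matrix-vector products.

        import numpy as np

        rng = np.random.default_rng(0)
        n_slopes, n_act = 1200, 600
        D = rng.standard_normal((n_slopes, n_act)) / np.sqrt(n_slopes)  # slopes = D @ v
        s = rng.standard_normal(n_slopes)                               # measured slopes

        # Direct gradient reconstruction: one big precomputed matrix.
        R = np.linalg.pinv(D)
        v_direct = R @ s

        # Iterative reconstruction: conjugate gradient on D^T D v = D^T s.
        def cg(matvec, rhs, iters=50, tol=1e-10):
            x, r = np.zeros_like(rhs), rhs.copy()
            p, rs = r.copy(), float(r @ r)
            for _ in range(iters):
                Ap = matvec(p)
                alpha = rs / float(p @ Ap)
                x += alpha * p
                r -= alpha * Ap
                rs_new = float(r @ r)
                if rs_new**0.5 < tol:
                    break
                p = r + (rs_new / rs) * p
                rs = rs_new
            return x

        v_iter = cg(lambda v: D.T @ (D @ v), D.T @ s)
        print(np.linalg.norm(v_iter - v_direct))   # the two solutions agree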

  10. Controllability and observability of Boolean networks arising from biology

    NASA Astrophysics Data System (ADS)

    Li, Rui; Yang, Meng; Chu, Tianguang

    2015-02-01

    Boolean networks are currently receiving considerable attention as a computational scheme for system level analysis and modeling of biological systems. Studying control-related problems in Boolean networks may reveal new insights into the intrinsic control in complex biological systems and enable us to develop strategies for manipulating biological systems using exogenous inputs. This paper considers controllability and observability of Boolean biological networks. We propose a new approach, which draws from the rich theory of symbolic computation, to solve the problems. Consequently, simple necessary and sufficient conditions for reachability, controllability, and observability are obtained, and algorithmic tests for controllability and observability which are based on the Gröbner basis method are presented. As practical applications, we apply the proposed approach to several different biological systems, namely, the mammalian cell-cycle network, the T-cell activation network, the large granular lymphocyte survival signaling network, and the Drosophila segment polarity network, gaining novel insights into the control and/or monitoring of the specific biological systems.
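
    As a toy illustration of the reachability notion involved (the paper's algebraic, Gröbner-basis tests scale far beyond this), a brute-force search over all states of a small, made-up Boolean control network:

        from itertools import product

        def step(state, u):
            # Illustrative next-state update rules for a 3-node network.
            x1, x2, x3 = state
            return (x2 and not u,
                    x1 or u,
                    x1 != x3)

        def reachable(x0):
            seen, frontier = {x0}, {x0}
            while frontier:
                frontier = {step(x, u) for x in frontier for u in (False, True)} - seen
                seen |= frontier
            return seen

        x0 = (False, False, False)
        reach = reachable(x0)
        all_states = set(product((False, True), repeat=3))
        print(f"reachable from {x0}: {len(reach)}/8 states; "
              f"controllable from x0: {reach == all_states}")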

  11. A time-efficient algorithm for implementing the Catmull-Clark subdivision method

    NASA Astrophysics Data System (ADS)

    Ioannou, G.; Savva, A.; Stylianou, V.

    2015-10-01

    Splines are the most popular methods in figure modeling and CAGD (Computer Aided Geometric Design) for generating smooth surfaces from a number of control points. The control points define the shape of a figure, and splines calculate the required number of points which, when displayed on a computer screen, result in a smooth surface. However, spline methods are based on a rectangular topological structure of points, i.e., a two-dimensional table of vertices, and thus cannot generate complex figures, such as human and animal bodies, whose complex structure does not allow them to be defined by a regular rectangular grid. On the other hand, surface subdivision methods, which are derived from splines, generate surfaces defined by an arbitrary topology of control points. This is the reason that during the last fifteen years subdivision methods have taken the lead over regular spline methods in all areas of modeling in both industry and research. The cost of executing software developed to read control points and calculate the surface lies in its run time, due to the fact that the surface structure required for handling arbitrary topological grids is very complicated. Many software programs related to the implementation of subdivision surfaces have been developed; however, not many algorithms are documented in the literature to support developers in writing efficient code. This paper aims to assist programmers by presenting a time-efficient algorithm for implementing subdivision splines. The Catmull-Clark method, the most popular of the subdivision methods, has been employed to illustrate the algorithm.
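
    For concreteness, here is one Catmull-Clark refinement step on a closed quad mesh, written naively: this version rebuilds all of its adjacency tables on every call, which is exactly the run-time cost an efficient surface structure is meant to avoid. Boundary handling is omitted, and the mesh representation (faces as lists of vertex indices) is an assumption.

        import numpy as np
        from collections import defaultdict

        def catmull_clark(verts, faces):
            """One subdivision step. verts: (V, 3) array; faces: vertex-id lists."""
            verts = np.asarray(verts, dtype=float)
            V, F = len(verts), len(faces)
            face_pts = np.array([verts[f].mean(axis=0) for f in faces])

            edge_faces, edge_mid = defaultdict(list), {}
            v_faces, v_edges = defaultdict(list), defaultdict(set)
            for fi, f in enumerate(faces):
                for a, b in zip(f, f[1:] + f[:1]):
                    e = (min(a, b), max(a, b))
                    edge_faces[e].append(fi)
                    edge_mid[e] = (verts[a] + verts[b]) / 2.0
                    v_faces[a].append(fi)
                    v_edges[a].add(e)
                    v_edges[b].add(e)

            # Edge point: average of the edge midpoint and adjacent face points.
            edge_pt = {e: (edge_mid[e] + face_pts[fs].mean(axis=0)) / 2.0
                       for e, fs in edge_faces.items()}

            # Moved vertex of valence n: (Q + 2R + (n - 3)P) / n, with Q the mean
            # adjacent face point and R the mean incident edge midpoint.
            moved = np.empty_like(verts)
            for v in range(V):
                n = len(v_edges[v])
                Q = face_pts[v_faces[v]].mean(axis=0)
                R = np.mean([edge_mid[e] for e in v_edges[v]], axis=0)
                moved[v] = (Q + 2.0 * R + (n - 3.0) * verts[v]) / n

            e_index = {e: V + F + i for i, e in enumerate(edge_pt)}
            new_verts = np.vstack([moved, face_pts, list(edge_pt.values())])
            new_faces = []
            for fi, f in enumerate(faces):
                k = len(f)
                for i, a in enumerate(f):
                    b, p = f[(i + 1) % k], f[(i - 1) % k]
                    new_faces.append([a,
                                      e_index[(min(a, b), max(a, b))],
                                      V + fi,
                                      e_index[(min(p, a), max(p, a))]])
            return new_verts, new_faces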

  12. Hybrid Toffoli gate on photons and quantum spins

    PubMed Central

    Luo, Ming-Xing; Ma, Song-Ya; Chen, Xiu-Bo; Wang, Xiaojun

    2015-01-01

    Quantum computation offers potential advantages in solving a number of interesting and difficult problems. Several controlled logic gates, the elemental building blocks of a quantum computer, have been realized with various physical systems. A general technique was recently proposed that significantly reduces the realization complexity of multiple-control logic gates by harnessing multi-level information carriers. We present implementations of a key quantum circuit: the three-qubit Toffoli gate. By exploring the optical selection rules of one-sided optical microcavities, a Toffoli gate may be realized on all combinations of photon and quantum spins in the QD-cavity. Three general controlled-NOT gates are involved, using an auxiliary photon with two degrees of freedom. Our results show that photons and quantum spins may be used alternatively in quantum information processing. PMID:26568078

  13. Hybrid Toffoli gate on photons and quantum spins.

    PubMed

    Luo, Ming-Xing; Ma, Song-Ya; Chen, Xiu-Bo; Wang, Xiaojun

    2015-11-16

    Quantum computation offers potential advantages in solving a number of interesting and difficult problems. Several controlled logic gates, the elemental building blocks of a quantum computer, have been realized with various physical systems. A general technique was recently proposed that significantly reduces the realization complexity of multiple-control logic gates by harnessing multi-level information carriers. We present implementations of a key quantum circuit: the three-qubit Toffoli gate. By exploring the optical selection rules of one-sided optical microcavities, a Toffoli gate may be realized on all combinations of photon and quantum spins in the QD-cavity. Three general controlled-NOT gates are involved, using an auxiliary photon with two degrees of freedom. Our results show that photons and quantum spins may be used alternatively in quantum information processing.

  14. A model for the neural control of pineal periodicity

    NASA Astrophysics Data System (ADS)

    de Oliveira Cruz, Frederico Alan; Soares, Marilia Amavel Gomes; Cortez, Celia Martins

    2016-12-01

    The aim of this work was to verify whether a computational model associating the synchronization dynamics of coupled oscillators with a set of synaptic transmission equations would be able to simulate the control of the pineal gland by the complex neural pathway that connects the retina to this gland. Results from the simulations showed that the frequency and temporal firing patterns were in the range of values found in the literature.

  15. Research on the man in the loop control system of the robot arm based on gesture control

    NASA Astrophysics Data System (ADS)

    Xiao, Lifeng; Peng, Jinbao

    2017-03-01

    This research takes as its background a complex real-world environment that requires the operator to continuously control and adjust a remote manipulator, and takes as its research object the entire human-in-the-loop system completing a specific mission. The paper puts forward a human-in-the-loop control system for a robot arm based on gesture control, which combines gesture-based arm control with virtual-reality scene feedback to enhance the operator's immersion and integration, so that the operator truly becomes a part of the whole control loop. The paper expounds how to construct such a human-in-the-loop control system. The system is a complex human-computer cooperative control system and belongs to the problem area of human-in-the-loop control. The new system solves the problems of the traditional method: the lack of immersion, the unnatural operating lever, the long adjustment time, and data gloves that are uncomfortable to wear and expensive.

  16. Applications of advanced data analysis and expert system technologies in the ATLAS Trigger-DAQ Controls framework

    NASA Astrophysics Data System (ADS)

    Avolio, G.; Corso Radu, A.; Kazarov, A.; Lehmann Miotto, G.; Magnoni, L.

    2012-12-01

    The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment is a very complex distributed computing system, composed of more than 20000 applications running on more than 2000 computers. The TDAQ Controls system has to guarantee the smooth and synchronous operations of all the TDAQ components and has to provide the means to minimize the downtime of the system caused by runtime failures. During data taking runs, streams of information messages sent or published by running applications are the main sources of knowledge about correctness of running operations. The huge flow of operational monitoring data produced is constantly monitored by experts in order to detect problems or misbehaviours. Given the scale of the system and the rates of data to be analyzed, the automation of the system functionality in the areas of operational monitoring, system verification, error detection and recovery is a strong requirement. To accomplish its objective, the Controls system includes some high-level components which are based on advanced software technologies, namely the rule-based Expert System and the Complex Event Processing engines. The chosen techniques allow one to formalize, store, and reuse the knowledge of experts and thus to assist the shifters in the ATLAS control room during the data-taking activities.

  17. Quantitative analysis of biomedical samples using synchrotron radiation microbeams

    NASA Astrophysics Data System (ADS)

    Ektessabi, Ali; Shikine, Shunsuke; Yoshida, Sohei

    2001-07-01

    X-ray fluorescence (XRF) using a synchrotron radiation (SR) microbeam was applied to investigate distributions and concentrations of elements in single neurons of patients with neurodegenerative diseases. In this paper we introduce a computer code that has been developed to quantify the trace elements and matrix elements at the single cell level. This computer code has been used in studies of several important neurodegenerative diseases such as Alzheimer's disease (AD), Parkinson's disease (PD) and parkinsonism-dementia complex (PDC), as well as in basic biological experiments to determine the elemental changes in cells due to incorporation of foreign metal elements. Substantia nigra (SN) tissue obtained from autopsy specimens of patients with Guamanian parkinsonism-dementia complex (PDC) and from control cases was examined. Quantitative XRF analysis showed that neuromelanin granules of Parkinsonian SN contained higher levels of Fe than those of the control, with concentrations in the ranges of 2300-3100 ppm and 2000-2400 ppm respectively. In contrast, Zn and Ni in neuromelanin granules of SN tissue from the PDC case were lower than those of the control. In particular, Zn was less than 40 ppm in SN tissue from the PDC case, while it was 560-810 ppm in the control. These changes are considered to be closely related to the neurodegeneration and cell death.

  18. Evolution of brain-computer interfaces: going beyond classic motor physiology

    PubMed Central

    Leuthardt, Eric C.; Schalk, Gerwin; Roland, Jarod; Rouse, Adam; Moran, Daniel W.

    2010-01-01

    The notion that a computer can decode brain signals to infer the intentions of a human and then enact those intentions directly through a machine is becoming a realistic technical possibility. These types of devices are known as brain-computer interfaces (BCIs). The evolution of these neuroprosthetic technologies could have significant implications for patients with motor disabilities by enhancing their ability to interact and communicate with their environment. The cortical physiology most investigated and used for device control has been brain signals from the primary motor cortex. To date, this classic motor physiology has been an effective substrate for demonstrating the potential efficacy of BCI-based control. However, emerging research now stands to further enhance our understanding of the cortical physiology underpinning human intent and provide further signals for more complex brain-derived control. In this review, the authors report the current status of BCIs and detail the emerging research trends that stand to augment clinical applications in the future. PMID:19569892

  19. Digital system for structural dynamics simulation

    NASA Technical Reports Server (NTRS)

    Krauter, A. I.; Lagace, L. J.; Wojnar, M. K.; Glor, C.

    1982-01-01

    State-of-the-art digital hardware and software for the simulation of complex structural dynamic interactions, such as those which occur in rotating structures (engine systems), were incorporated in a system designed to use an array of processors in which the computation for each physical subelement or functional subsystem is assigned to a single specific processor in the simulator. These node processors are microprogrammed bit-slice microcomputers which function autonomously and can communicate with each other and with a central control minicomputer over parallel digital lines. Inter-processor nearest-neighbor communication busses pass the constants which represent physical constraints and boundary conditions. Each node processor is connected to its six nearest-neighbor node processors to simulate the actual physical interfaces of real substructures. Computer-generated finite element mesh and force models can be developed with the aid of the central control minicomputer. The control computer also oversees the animation of a graphics display system and disk-based mass storage, along with the individual processing elements.

  20. High Performance Parallel Computational Nanotechnology

    NASA Technical Reports Server (NTRS)

    Saini, Subhash; Craw, James M. (Technical Monitor)

    1995-01-01

    At a recent press conference, NASA Administrator Dan Goldin encouraged NASA Ames Research Center to take a lead role in promoting research and development of advanced, high-performance computer technology, including nanotechnology. Manufacturers of leading-edge microprocessors currently perform large-scale simulations in the design and verification of semiconductor devices and microprocessors. Recently, the need for this intensive simulation and modeling analysis has greatly increased, due in part to the ever-increasing complexity of these devices, as well as the lessons of experiences such as the Pentium fiasco. Simulation, modeling, testing, and validation will be even more important for designing molecular computers because of the complex specification of millions of atoms, thousands of assembly steps, as well as the simulation and modeling needed to ensure reliable, robust and efficient fabrication of the molecular devices. The software for this capacity does not exist today, but it can be extrapolated from the software currently used in molecular modeling for other applications: semi-empirical methods, ab initio methods, self-consistent field methods, Hartree-Fock methods, molecular mechanics; and simulation methods for diamondoid structures. Inasmuch as it seems clear that the application of such methods in nanotechnology will require powerful, highly parallel systems, this talk will discuss techniques and issues for performing these types of computations on parallel systems. We will describe system design issues (memory, I/O, mass storage, operating system requirements, special user interface issues, interconnects, bandwidths, and programming languages) involved in parallel methods for scalable classical, semiclassical, quantum, molecular mechanics, and continuum models; molecular nanotechnology computer-aided designs (NanoCAD) techniques; visualization using virtual reality techniques of structural models and assembly sequences; software required to control mini robotic manipulators for positional control; scalable numerical algorithms for reliability, verifications and testability. There appears no fundamental obstacle to simulating molecular compilers and molecular computers on high performance parallel computers, just as the Boeing 777 was simulated on a computer before manufacturing it.

  1. Boolean Logic Tree of Label-Free Dual-Signal Electrochemical Aptasensor System for Biosensing, Three-State Logic Computation, and Keypad Lock Security Operation.

    PubMed

    Lu, Jiao Yang; Zhang, Xin Xing; Huang, Wei Tao; Zhu, Qiu Yan; Ding, Xue Zhi; Xia, Li Qiu; Luo, Hong Qun; Li, Nian Bing

    2017-09-19

    The most serious and yet unsolved problems of molecular logic computing consist in how to connect molecular events in complex systems into a usable device with specific functions and how to selectively control branchy logic processes from the cascading logic systems. This report demonstrates that a Boolean logic tree is utilized to organize and connect "plug and play" chemical events (DNA, nanomaterials, organic dye, biomolecule, and denaturant) for developing a dual-signal electrochemical evolution aptasensor system with good resettability for amplification detection of thrombin, controllable and selectable three-state logic computation, and keypad lock security operation. The aptasensor system combines the merits of DNA-functionalized nanoamplification architecture and the simple dual-signal electroactive dye brilliant cresyl blue for sensitive and selective detection of thrombin with a wide linear response range of 0.02-100 nM and a detection limit of 1.92 pM. By using these aforementioned chemical events as inputs and the differential pulse voltammetry current changes at different voltages as dual outputs, a resettable three-input biomolecular keypad lock based on sequential logic is established. Moreover, the first example of controllable and selectable three-state molecular logic computation with active-high and active-low logic functions can be implemented and allows the output ports to assume a high-impedance (Z) state in addition to the 0 and 1 logic levels, effectively controlling subsequent branchy logic computation processes. Our approach is helpful in developing advanced controllable and selectable logic computing and sensing systems in large-scale integration circuits for application in biomedical engineering, intelligent sensing, and control.
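
    Abstracted away from the chemistry, the keypad-lock behaviour reduces to a small sequential state machine: the lock opens only when the three inputs arrive in the one correct order, and any wrong input resets it, which is precisely what distinguishes sequential from combinational logic. The event names below are placeholders for the chemical inputs.

        class KeypadLock:
            SECRET = ("input_A", "input_B", "input_C")   # required order (assumed)

            def __init__(self):
                self.position = 0

            def press(self, event):
                if event == self.SECRET[self.position]:
                    self.position += 1
                else:
                    self.position = 0                    # wrong event: lock resets
                if self.position == len(self.SECRET):
                    self.position = 0                    # resettable: relock after opening
                    return True                          # output 1: unlocked
                return False

        lock = KeypadLock()
        print([lock.press(e) for e in ("input_A", "input_B", "input_C")])  # [False, False, True]
        print([lock.press(e) for e in ("input_B", "input_A", "input_C")])  # [False, False, False]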

  2. User's guide for ENSAERO: A multidisciplinary program for fluid/structural/control interaction studies of aircraft (release 1)

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.

    1994-01-01

    Strong interactions can occur between the flow about an aerospace vehicle and its structural components resulting in several important aeroelastic phenomena. These aeroelastic phenomena can significantly influence the performance of the vehicle. At present, closed-form solutions are available for aeroelastic computations when flows are in either the linear subsonic or supersonic range. However, for aeroelasticity involving complex nonlinear flows with shock waves, vortices, flow separations, and aerodynamic heating, computational methods are still under development. These complex aeroelastic interactions can be dangerous and limit the performance of aircraft. Examples of these detrimental effects are aircraft with highly swept wings experiencing vortex-induced aeroelastic oscillations, transonic regime at which the flutter speed is low, aerothermoelastic loads that play a critical role in the design of high-speed vehicles, and flow separations that often lead to buffeting with undesirable structural oscillations. The simulation of these complex aeroelastic phenomena requires an integrated analysis of fluids and structures. This report presents a summary of the development, applications, and procedures to use the multidisciplinary computer code ENSAERO. This code is based on the Euler/Navier-Stokes flow equations and modal/finite-element structural equations.

  3. The algorithmic details of polynomials application in the problems of heat and mass transfer control on the hypersonic aircraft permeable surfaces

    NASA Astrophysics Data System (ADS)

    Bilchenko, G. G.; Bilchenko, N. G.

    2018-03-01

    Mathematical modeling problems for the effective control of heat and mass transfer on hypersonic aircraft permeable surfaces are considered. The constructive and gasdynamic restrictions on the control (the blowing) are analyzed for porous and perforated surfaces. Classes of functions that allow the controls to be realized while respecting the arising types of restrictions are suggested. Estimates are given of the computational complexity of applying the W. G. Horner scheme when the C. Hermite interpolation polynomial is used.
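
    The W. G. Horner scheme referenced in the abstract evaluates a degree-n polynomial with only n multiplications and n additions, which is what such complexity estimates count. A minimal sketch (the coefficient values are illustrative):

      def horner(coeffs, x):
          # coeffs ordered from the highest power down to the constant term
          result = 0.0
          for c in coeffs:
              result = result * x + c
          return result

      # p(x) = 2x^3 - 6x^2 + 2x - 1 evaluated at x = 3.0
      print(horner([2.0, -6.0, 2.0, -1.0], 3.0))  # -> 5.0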

  4. A framework for stochastic simulations and visualization of biological electron-transfer dynamics

    NASA Astrophysics Data System (ADS)

    Nakano, C. Masato; Byun, Hye Suk; Ma, Heng; Wei, Tao; El-Naggar, Mohamed Y.

    2015-08-01

    Electron transfer (ET) dictates a wide variety of energy-conversion processes in biological systems. Visualizing ET dynamics could provide key insight into understanding and possibly controlling these processes. We present a computational framework named VizBET to visualize biological ET dynamics, using an outer-membrane Mtr-Omc cytochrome complex in Shewanella oneidensis MR-1 as an example. Starting from X-ray crystal structures of the constituent cytochromes, molecular dynamics simulations are combined with homology modeling, protein docking, and binding free energy computations to sample the configuration of the complex as well as the change of the free energy associated with ET. This information, along with quantum-mechanical calculations of the electronic coupling, provides inputs to kinetic Monte Carlo (KMC) simulations of ET dynamics in a network of heme groups within the complex. Visualization of the KMC simulation results has been implemented as a plugin to the Visual Molecular Dynamics (VMD) software. VizBET has been used to reveal the nature of ET dynamics associated with novel nonequilibrium phase transitions in a candidate configuration of the Mtr-Omc complex due to electron-electron interactions.
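
    The kinetic Monte Carlo step at the core of such a framework can be sketched as a Gillespie-style hop process on a rate matrix; the three-site topology and rates below are illustrative stand-ins, not the Mtr-Omc values computed in the paper:

      import numpy as np

      rng = np.random.default_rng(0)
      rates = np.array([[0.0, 2.0, 0.0],
                        [1.0, 0.0, 3.0],
                        [0.0, 0.5, 0.0]])  # rates[i, j]: hop rate from heme i to j

      def kmc_trajectory(start, n_steps):
          site, t, path = start, 0.0, [start]
          for _ in range(n_steps):
              out = rates[site]
              total = out.sum()
              if total == 0.0:
                  break                                   # absorbing site
              t += rng.exponential(1.0 / total)           # exponential waiting time
              site = rng.choice(len(out), p=out / total)  # pick next hop by rate
              path.append(site)
          return t, path

      print(kmc_trajectory(start=0, n_steps=10))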

  5. Computer aided reliability, availability, and safety modeling for fault-tolerant computer systems with commentary on the HARP program

    NASA Technical Reports Server (NTRS)

    Shooman, Martin L.

    1991-01-01

    Many of the most challenging reliability problems of our present decade involve complex distributed systems such as interconnected telephone switching computers, air traffic control centers, aircraft and space vehicles, and local area and wide area computer networks. In addition to the challenge of complexity, modern fault-tolerant computer systems require very high levels of reliability, e.g., avionic computers with MTTF goals of one billion hours. Most analysts find that it is too difficult to model such complex systems without computer-aided design programs. In response to this need, NASA has developed a suite of computer-aided reliability modeling programs, beginning with CARE 3 and including a group of new programs such as HARP, HARP-PC, the Reliability Analysts Workbench (a combination of the model solvers SURE, STEM, and PAWS with the common front-end model ASSIST), and the Fault Tree Compiler. The HARP program is studied, and the investigation focuses on how well users can model systems with it. One important objective is to study how user-friendly the program is, e.g., how easy it is to model the system, provide the input information, and interpret the results. The experiences of the author and his graduate students, who used HARP in two graduate courses, are described. Some brief comparisons are made with the ARIES program, which the students also used. Theoretical studies of the modeling techniques used in HARP are also included. Of course, no answer can be any more accurate than the fidelity of the model; thus, an appendix is included which discusses modeling accuracy. A broad viewpoint is taken, and all problems that occurred in the use of HARP are discussed, including computer system problems, installation manual problems, user manual problems, program inconsistencies, program limitations, confusing notation, long run times, and accuracy problems.

  6. Simple Logic for Big Problems: An Inside Look at Relational Databases.

    ERIC Educational Resources Information Center

    Seba, Douglas B.; Smith, Pat

    1982-01-01

    Discusses database design concept termed "normalization" (process replacing associations between data with associations in two-dimensional tabular form) which results in formation of relational databases (they are to computers what dictionaries are to spoken languages). Applications of the database in serials control and complex systems…

  7. Neural network applications in telecommunications

    NASA Technical Reports Server (NTRS)

    Alspector, Joshua

    1994-01-01

    Neural network capabilities include automatic and organized handling of complex information, quick adaptation to continuously changing environments, nonlinear modeling, and parallel implementation. This viewgraph presentation presents Bellcore work on applications, learning chip computational function, learning system block diagram, neural network equalization, broadband access control, calling-card fraud detection, software reliability prediction, and conclusions.

  8. A Review and Reappraisal of Adaptive Human-Computer Interfaces in Complex Control Systems

    DTIC Science & Technology

    2006-08-01

    ... maneuverability measures. The cost elements were expressed as fuzzy membership functions. Figure 9 shows the flowchart of the route planner. A fuzzy navigator ... and updating of the user model, which contains information about three generic stereotypes (beginner, intermediate, and expert users) plus an ...

  9. Air Force Laboratory’s 2005 Technology Milestones

    DTIC Science & Technology

    2006-01-01

    Computational materials science methods can benefit the design and property prediction of complex real-world materials. With these models, scientists and ... Scientists created the High-Frequency Acoustic Suppression Technology (HiFAST) airflow control ...

  10. New Algorithms Manage Fourfold Redundancy

    NASA Technical Reports Server (NTRS)

    Gelderloos, H. C.

    1982-01-01

    Redundant sensors, actuators, and computers improve reliability of complex control systems, such as those in nuclear powerplants and aircraft. If one or more redundant elements fail, another takes over so that normal operation is not interrupted. Quad selection filter rejects data from null-failed and hardover-failed units.
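
    A generic mid-value selection scheme of this kind (a sketch, not the specific flight algorithm) sorts the four redundant readings and averages the middle pair, so a single hardover or null failure cannot corrupt the selected value:

      def quad_select(readings):
          # Discard the lowest and highest of four readings; average the rest.
          a = sorted(readings)
          return 0.5 * (a[1] + a[2])

      print(quad_select([10.1, 9.9, 10.0, 99.9]))  # hardover 99.9 rejected -> 10.05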

  11. Sparsity enabled cluster reduced-order models for control

    NASA Astrophysics Data System (ADS)

    Kaiser, Eurika; Morzyński, Marek; Daviller, Guillaume; Kutz, J. Nathan; Brunton, Bingni W.; Brunton, Steven L.

    2018-01-01

    Characterizing and controlling nonlinear, multi-scale phenomena are central goals in science and engineering. Cluster-based reduced-order modeling (CROM) was introduced to exploit the underlying low-dimensional dynamics of complex systems. CROM builds a data-driven discretization of the Perron-Frobenius operator, resulting in a probabilistic model for ensembles of trajectories. A key advantage of CROM is that it embeds nonlinear dynamics in a linear framework, which enables the application of standard linear techniques to the nonlinear system. CROM is typically computed on high-dimensional data; however, access to and computations on this full-state data limit the online implementation of CROM for prediction and control. Here, we address this key challenge by identifying a small subset of critical measurements to learn an efficient CROM, referred to as sparsity-enabled CROM. In particular, we leverage compressive measurements to faithfully embed the cluster geometry and preserve the probabilistic dynamics. Further, we show how to identify fewer optimized sensor locations tailored to a specific problem that outperform random measurements. Both of these sparsity-enabled sensing strategies significantly reduce the burden of data acquisition and processing for low-latency in-time estimation and control. We illustrate this unsupervised learning approach on three different high-dimensional nonlinear dynamical systems from fluids with increasing complexity, with one application in flow control. Sparsity-enabled CROM is a critical facilitator for real-time implementation on high-dimensional systems where full-state information may be inaccessible.
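
    The core CROM construction can be sketched in a few lines: cluster the snapshot data, then estimate a Markov transition matrix between clusters as the discretized Perron-Frobenius operator. The random snapshots and cluster count below are illustrative stand-ins for flow data:

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(1)
      snapshots = rng.normal(size=(500, 8))   # stand-in for time-ordered flow snapshots
      k = 4
      labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(snapshots)

      P = np.zeros((k, k))                    # P[i, j] ~ Prob(cluster j at t+1 | i at t)
      for i, j in zip(labels[:-1], labels[1:]):
          P[i, j] += 1.0
      P /= np.maximum(P.sum(axis=1, keepdims=True), 1.0)  # row-normalize the counts
      print(np.round(P, 2))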

  12. Dissecting innate immune responses with the tools of systems biology.

    PubMed

    Smith, Kelly D; Bolouri, Hamid

    2005-02-01

    Systems biology strives to derive accurate predictive descriptions of complex systems such as innate immunity. The innate immune system is essential for host defense, yet the resulting inflammatory response must be tightly regulated. Current understanding indicates that this system is controlled by complex regulatory networks, which maintain homoeostasis while accurately distinguishing pathogenic infections from harmless exposures. Recent studies have used high throughput technologies and computational techniques that presage predictive models and will be the foundation of a systems level understanding of innate immunity.

  13. Controllability of Complex Dynamic Objects

    NASA Astrophysics Data System (ADS)

    Kalach, G. G.; Kazachek, N. A.; Morozov, A. A.

    2017-01-01

    Quality requirements for mobile robots intended for both specialized and everyday use are increasing in step with the complexity of the technological tasks assigned to the robots. Whether a mobile robot is for ground, aerial, or underwater use, the relevant quality characteristics can be summarized under the common concept of agility. This term denotes the object's (the robot's) ability to react quickly to control actions (change speed and direction), turn in a limited area, etc. When using this approach in an integrated assessment of the quality characteristics of an object together with its control system, it seems more constructive to use the term "degree of control". This paper assesses the degree of control using the example of a mobile robot with a variable-geometry drive wheel axle. We show how the degree of control changes with the robot's configuration, with results illustrated by calculation data, computer simulations, and practical experiments. We also describe the prospects of using intelligent technology for efficient control of objects with a high degree of controllability.

  14. A real-time control system for the control of suspended interferometers based on hybrid computing techniques

    NASA Astrophysics Data System (ADS)

    Acernese, Fausto; Barone, Fabrizio; De Rosa, Rosario; Eleuteri, Antonio; Milano, Leopoldo; Pardi, Silvio; Ricciardi, Iolanda; Russo, Guido

    2004-09-01

    One of the main requirements of a digital system for the control of interferometric detectors of gravitational waves is computing power, a direct consequence of the increasing complexity of the digital algorithms needed for control signal generation. For this specific task, many specialized, non-standard real-time architectures have been developed, often very expensive and difficult to upgrade. On the other hand, such computing power is generally fully available for off-line applications on standard PC-based systems. Therefore, a possible and obvious solution is to integrate the real-time and off-line architectures into a hybrid control system built from standard available components, combining the perfect data synchronization provided by real-time systems with the large computing power available on PC-based systems. Such integration can be achieved by linking the two architectures through the standard Ethernet network, whose data transfer speed has been increasing rapidly in recent years, using the TCP/IP, UDP, and raw Ethernet protocols. In this paper we describe the architecture of a hybrid Ethernet-based real-time control system prototype we implemented in Napoli, discussing its characteristics and performance. Finally, we discuss a possible application to the real-time control of a suspended mass of the mode cleaner of the 3 m prototype optical interferometer for gravitational wave detection (IDGW-3P) operational in Napoli.

  15. Focal Cortical Dysplasia (FCD) lesion analysis with complex diffusion approach.

    PubMed

    Rajan, Jeny; Kannan, K; Kesavadas, C; Thomas, Bejoy

    2009-10-01

    Identification of Focal Cortical Dysplasia (FCD) can be difficult due to subtle MRI changes. Though sequences like FLAIR (fluid-attenuated inversion recovery) can detect the large majority of these lesions, smaller lesions without signal changes can easily go unnoticed by the naked eye. The aim of this study is to improve the visibility of focal cortical dysplasia lesions in T1-weighted brain MRI images. In the proposed method, we used a complex diffusion based approach for calculating the FCD-affected areas. Based on the diffused image and a thickness map, a complex map is created, from which FCD areas can be easily identified. MRI brain images of 48 subjects selected by neuroradiologists were given to computer scientists, who developed the complex map for identifying the cortical dysplasia. The scientists were blinded to the MRI interpretation results of the neuroradiologists. The FCD could be identified in all the patients in whom surgery was done; however, three patients had false-positive lesions. More lesions were identified in patients in whom surgery was not performed, and lesions were seen in a few of the controls; these were considered false positives. This computer-aided detection technique using the complex diffusion approach can help detect focal cortical dysplasia in patients with epilepsy.

  16. Design of safety-oriented control allocation strategies for overactuated electric vehicles

    NASA Astrophysics Data System (ADS)

    de Castro, Ricardo; Tanelli, Mara; Esteves Araújo, Rui; Savaresi, Sergio M.

    2014-08-01

    The new vehicle platforms for electric vehicles (EVs) that are becoming available are characterised by actuator redundancy, which makes it possible to jointly optimise different aspects of the vehicle motion. To do this, high-level control objectives are first specified and solved with appropriate control strategies. Then, the resulting virtual control action must be translated into actual actuator commands by a control allocation layer that takes care of computing the forces to be applied at the wheels. This step, in general, is quite demanding as far as computational complexity is considered. In this work, a safety-oriented approach to this problem is proposed. Specifically, a four-wheel steer EV with four in-wheel motors is considered, and the high-level motion controller is designed within a sliding mode framework with conditional integrators. For distributing the forces among the tyres, two control allocation approaches are investigated. The first, based on the extension of the cascading generalised inverse method, is computationally efficient but shows some limitations in dealing with unfeasible force values. To solve the problem, a second allocation algorithm is proposed, which relies on the linearisation of the tyre-road friction constraints. Extensive tests, carried out in the CarSim simulation environment, demonstrate the effectiveness of the proposed approach.

  17. Proceedings of the 2004 Workshop on CFD Validation of Synthetic Jets and Turbulent Separation Control

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L. (Compiler)

    2007-01-01

    The papers presented here are from the Langley Research Center Workshop on Computational Fluid Dynamics (CFD) Validation of Synthetic Jets and Turbulent Separation Control (nicknamed "CFDVAL2004"), held March 2004 in Williamsburg, Virginia. The goal of the workshop was to bring together an international group of CFD practitioners to assess the current capabilities of different classes of turbulent flow solution methodologies to predict flow fields induced by synthetic jets and separation control geometries. The workshop consisted of three flow-control test cases of varying complexity, and participants could contribute to any number of the cases. Along with their workshop submissions, each participant included a short write-up describing their method for computing the particular case(s). These write-ups are presented as received from the authors with no editing. Descriptions of each of the test cases and experiments are also included.

  18. Information diffusion, Facebook clusters, and the simplicial model of social aggregation: a computational simulation of simplicial diffusers for community health interventions.

    PubMed

    Kee, Kerk F; Sparks, Lisa; Struppa, Daniele C; Mannucci, Mirco A; Damiano, Alberto

    2016-01-01

    By integrating the simplicial model of social aggregation with existing research on opinion leadership and diffusion networks, this article introduces the constructs of simplicial diffusers (mathematically defined as nodes embedded in simplexes; a simplex is a socially bonded cluster) and simplicial diffusing sets (mathematically defined as minimal covers of a simplicial complex; a simplicial complex is a social aggregation in which socially bonded clusters are embedded) to propose a strategic approach for information diffusion of cancer screenings as a health intervention on Facebook for community cancer prevention and control. This approach is novel in its incorporation of interpersonally bonded clusters, culturally distinct subgroups, and different united social entities that coexist within a larger community into a computational simulation to select sets of simplicial diffusers with the highest degree of information diffusion for health intervention dissemination. The unique contributions of the article also include seven propositions and five algorithmic steps for computationally modeling the simplicial model with Facebook data.
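
    A simplicial diffusing set can be approximated with a greedy cover heuristic: keep picking the node that belongs to the most uncovered clusters until every bonded cluster contains a chosen diffuser. This is a sketch of the idea, not the authors' algorithm, and the clusters below are illustrative:

      simplexes = [{"a", "b", "c"}, {"c", "d"}, {"d", "e", "f"}, {"f", "g"}]

      diffusers, uncovered = set(), list(simplexes)
      while uncovered:
          # Pick the node appearing in the most still-uncovered clusters.
          best = max({n for s in uncovered for n in s},
                     key=lambda n: sum(n in s for s in uncovered))
          diffusers.add(best)
          uncovered = [s for s in uncovered if best not in s]

      print(diffusers)  # e.g. {'c', 'f'}; tie-breaking may vary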

  19. Ad Hoc modeling, expert problem solving, and R&T program evaluation

    NASA Technical Reports Server (NTRS)

    Silverman, B. G.; Liebowitz, J.; Moustakis, V. S.

    1983-01-01

    A simplified cost and time (SCAT) analysis program utilizing personal-computer technology is presented and demonstrated in the case of the NASA-Goddard end-to-end data system. The difficulties encountered in implementing complex program-selection and evaluation models in the research and technology field are outlined. The prototype SCAT system described here is designed to allow user-friendly ad hoc modeling in real time and at low cost. A worksheet constructed on the computer screen displays the critical parameters and shows how each is affected when one is altered experimentally. In the NASA case, satellite data-output and control requirements, ground-facility data-handling capabilities, and project priorities are intricately interrelated. Scenario studies of the effects of spacecraft phaseout or new spacecraft on throughput and delay parameters are shown. The use of a network of personal computers for higher-level coordination of decision-making processes is suggested, as a complement or alternative to complex large-scale modeling.

  20. Active vibration control with model correction on a flexible laboratory grid structure

    NASA Technical Reports Server (NTRS)

    Schamel, George C., II; Haftka, Raphael T.

    1991-01-01

    This paper presents experimental and computational comparisons of three active damping control laws applied to a complex laboratory structure. Two reduced structural models were used with one model being corrected on the basis of measured mode shapes and frequencies. Three control laws were investigated, a time-invariant linear quadratic regulator with state estimation and two direct rate feedback control laws. Experimental results for all designs were obtained with digital implementation. It was found that model correction improved the agreement between analytical and experimental results. The best agreement was obtained with the simplest direct rate feedback control.
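
    For reference, the time-invariant LQR gain used in the first control law can be computed from the algebraic Riccati equation; the lightly damped two-state model below is a toy stand-in for the grid structure's reduced model, not the paper's data:

      import numpy as np
      from scipy.linalg import solve_continuous_are

      A = np.array([[0.0, 1.0],
                    [-4.0, -0.02]])   # [displacement, rate], weakly damped mode
      B = np.array([[0.0], [1.0]])
      Q = np.diag([10.0, 1.0])        # state weighting
      R = np.array([[0.1]])           # control weighting

      P = solve_continuous_are(A, B, Q, R)
      K = np.linalg.inv(R) @ B.T @ P  # optimal state feedback u = -K x
      print("LQR gain:", K)

      # Direct rate feedback, the simplest law compared in the paper: u = -g * rate
      K_rate = np.array([[0.0, 2.0]])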

  1. Using CamiTK for rapid prototyping of interactive computer assisted medical intervention applications.

    PubMed

    Promayon, Emmanuel; Fouard, Céline; Bailet, Mathieu; Deram, Aurélien; Fiard, Gaëlle; Hungr, Nikolai; Luboz, Vincent; Payan, Yohan; Sarrazin, Johan; Saubat, Nicolas; Selmi, Sonia Yuki; Voros, Sandrine; Cinquin, Philippe; Troccaz, Jocelyne

    2013-01-01

    Computer Assisted Medical Intervention (CAMI hereafter) is a complex multi-disciplinary field. CAMI research requires the collaboration of experts in several fields as diverse as medicine, computer science, mathematics, instrumentation, signal processing, mechanics, modeling, automatics, optics, etc. CamiTK is a modular framework that helps researchers and clinicians collaborate in order to prototype CAMI applications by regrouping the knowledge and expertise from each discipline. It is an open-source, cross-platform, generic and modular tool written in C++ which can handle medical images, surgical navigation, biomedical simulations, and robot control. This paper presents the Computer Assisted Medical Intervention ToolKit (CamiTK) and how it is used in various applications in our research team.

  2. Optimization behavior of brainstem respiratory neurons. A cerebral neural network model.

    PubMed

    Poon, C S

    1991-01-01

    A recent model of respiratory control suggested that the steady-state respiratory responses to CO2 and exercise may be governed by an optimal control law in the brainstem respiratory neurons. It was not certain, however, whether such complex optimization behavior could be accomplished by a realistic biological neural network. To test this hypothesis, we developed a hybrid computer-neural model in which the dynamics of the lung, brain and other tissue compartments were simulated on a digital computer. Mimicking the "controller" was a human subject who pedalled on a bicycle with varying speed (analog of ventilatory output) with a view to minimize an analog signal of the total cost of breathing (chemical and mechanical) which was computed interactively and displayed on an oscilloscope. In this manner, the visuomotor cortex served as a proxy (homolog) of the brainstem respiratory neurons in the model. Results in 4 subjects showed a linear steady-state ventilatory CO2 response to arterial PCO2 during simulated CO2 inhalation and a nearly isocapnic steady-state response during simulated exercise. Thus, neural optimization is a plausible mechanism for respiratory control during exercise and can be achieved by a neural network with cognitive computational ability without the need for an exercise stimulus.

  3. A computational interactome for prioritizing genes associated with complex agronomic traits in rice (Oryza sativa).

    PubMed

    Liu, Shiwei; Liu, Yihui; Zhao, Jiawei; Cai, Shitao; Qian, Hongmei; Zuo, Kaijing; Zhao, Lingxia; Zhang, Lida

    2017-04-01

    Rice (Oryza sativa) is one of the most important staple foods for more than half of the global population. Many rice traits are quantitative, complex and controlled by multiple interacting genes. Thus, a full understanding of genetic relationships will be critical to systematically identify genes controlling agronomic traits. We developed a genome-wide rice protein-protein interaction network (RicePPINet, http://netbio.sjtu.edu.cn/riceppinet) using machine learning with structural relationship and functional information. RicePPINet contained 708 819 predicted interactions for 16 895 non-transposable element related proteins. The power of the network for discovering novel protein interactions was demonstrated through comparison with other publicly available protein-protein interaction (PPI) prediction methods, and by experimentally determined PPI data sets. Furthermore, global analysis of domain-mediated interactions revealed RicePPINet accurately reflects PPIs at the domain level. Our studies showed the efficiency of the RicePPINet-based method in prioritizing candidate genes involved in complex agronomic traits, such as disease resistance and drought tolerance, was approximately 2-11 times better than random prediction. RicePPINet provides an expanded landscape of computational interactome for the genetic dissection of agronomically important traits in rice. © 2017 The Authors The Plant Journal © 2017 John Wiley & Sons Ltd.

  4. 3D graphics, virtual reality, and motion-onset visual evoked potentials in neurogaming.

    PubMed

    Beveridge, R; Wilson, S; Coyle, D

    2016-01-01

    A brain-computer interface (BCI) offers movement-free control of a computer application and is achieved by reading and translating the cortical activity of the brain into semantic control signals. Motion-onset visual evoked potentials (mVEP) are neural potentials employed in BCIs and occur when motion-related stimuli are attended visually. mVEP dynamics are correlated with the position and timing of the moving stimuli. To investigate the feasibility of utilizing the mVEP paradigm with video games of various graphical complexities, including those of commercial quality, we conducted three studies over four separate sessions comparing the performance of classifying five mVEP responses with variations in graphical complexity and style, in-game distractions, and display parameters surrounding mVEP stimuli. To investigate the feasibility of utilizing contemporary presentation modalities in neurogaming, one of the studies compared mVEP classification performance when stimuli were presented using the Oculus Rift virtual reality headset. Results from 31 independent subjects were analyzed offline. The results show classification performances ranging up to 90%, with variations in graphical complexity having limited effect on mVEP performance, thus demonstrating the feasibility of using the mVEP paradigm within BCI-based neurogaming. © 2016 Elsevier B.V. All rights reserved.

  5. Ultra-Compact Transputer-Based Controller for High-Level, Multi-Axis Coordination

    NASA Technical Reports Server (NTRS)

    Zenowich, Brian; Crowell, Adam; Townsend, William T.

    2013-01-01

    The design of machines that rely on arrays of servomotors such as robotic arms, orbital platforms, and combinations of both, imposes a heavy computational burden to coordinate their actions to perform coherent tasks. For example, the robotic equivalent of a person tracing a straight line in space requires enormously complex kinematics calculations, and complexity increases with the number of servo nodes. A new high-level architecture for coordinated servo-machine control enables a practical, distributed transputer alternative to conventional central processor electronics. The solution is inherently scalable, dramatically reduces bulkiness and number of conductor runs throughout the machine, requires only a fraction of the power, and is designed for cooling in a vacuum.

  6. Circuit For Control Of Electromechanical Prosthetic Hand

    NASA Technical Reports Server (NTRS)

    Bozeman, Richard J., Jr.

    1995-01-01

    Proposed circuit for control of electromechanical prosthetic hand derives electrical control signals from shoulder movements. Circuit is part of updated, electronic version of prosthesis that includes two hooklike fingers actuated via cables from shoulder harness. Built around favored shoulder harness, circuit provides more dexterous movement without incurring complexity of computer-controlled "bionic" or hydraulically actuated devices. Additional harness and potentiometer connected to similar control circuit mounted on other shoulder are used to control stepping motor rotating hand about prosthetic wrist to one of number of angles consistent with number of digital outputs. Finger-control signals developed by circuit connected to first shoulder harness are transmitted to prosthetic hand via sliprings at prosthetic wrist joint.

  7. STARS: An Integrated, Multidisciplinary, Finite-Element, Structural, Fluids, Aeroelastic, and Aeroservoelastic Analysis Computer Program

    NASA Technical Reports Server (NTRS)

    Gupta, K. K.

    1997-01-01

    A multidisciplinary, finite element-based, highly graphics-oriented, linear and nonlinear analysis capability that includes such disciplines as structures, heat transfer, linear aerodynamics, computational fluid dynamics, and controls engineering has been achieved by integrating several new modules in the original STARS (STructural Analysis RoutineS) computer program. Each individual analysis module is general-purpose in nature and is effectively integrated to yield aeroelastic and aeroservoelastic solutions of complex engineering problems. Examples of advanced NASA Dryden Flight Research Center projects analyzed by the code in recent years include the X-29A, F-18 High Alpha Research Vehicle/Thrust Vectoring Control System, B-52/Pegasus Generic Hypersonics, National AeroSpace Plane (NASP), SR-71/Hypersonic Launch Vehicle, and High Speed Civil Transport (HSCT) projects. Extensive graphics capabilities exist for convenient model development and postprocessing of analysis results. The program is written in modular form in standard FORTRAN language to run on a variety of computers, such as the IBM RISC/6000, SGI, DEC, Cray, and personal computer; associated graphics codes use OpenGL and IBM/graPHIGS language for color depiction. This program is available from COSMIC, the NASA agency for distribution of computer programs.

  8. Increasingly automated procedure acquisition in dynamic systems

    NASA Technical Reports Server (NTRS)

    Mathe, Nathalie; Kedar, Smadar

    1992-01-01

    Procedures are widely used by operators for controlling complex dynamic systems. Currently, most development of such procedures is done manually, consuming a large amount of paper, time, and manpower in the process. While automated knowledge acquisition is an active field of research, not much attention has been paid to the problem of computer-assisted acquisition and refinement of complex procedures for dynamic systems. The Procedure Acquisition for Reactive Control Assistant (PARC) is designed to assist users in more systematically and automatically encoding and refining complex procedures. PARC is able to elicit knowledge interactively from the user during operation of the dynamic system. We categorize procedure refinement into two stages: diagnosis (diagnose the failure and choose a repair) and repair (plan and perform the repair). The basic approach taken in PARC is to assist the user in all steps of this process by providing increased levels of assistance with layered tools. We illustrate the operation of PARC in refining procedures for the control of a robot arm.

  9. Effective algorithm for solving complex problems of production control and of material flows control of industrial enterprise

    NASA Astrophysics Data System (ADS)

    Mezentsev, Yu A.; Baranova, N. V.

    2018-05-01

    A universal economic-mathematical model for determining optimal management strategies for the production and logistics subsystems (and subsystem components) of enterprises is considered. The declared universality makes it possible to take into account, at the system level, production components, including limitations on the ways raw materials and components are converted into sold goods, as well as resource and logical restrictions on input and output material flows. The presented model and the generated control problems are developed within a unified framework that allows logical conditions of any complexity to be implemented and the corresponding formal optimization tasks to be defined. The conceptual meaning of the criteria and limitations used is explained. The generated mixed-programming tasks are shown to belong to the class NP. An approximate polynomial algorithm is proposed for solving the posed mixed-programming optimization tasks of realistic dimension and high computational complexity. Results of testing the algorithm on tasks across a wide range of dimensions are presented.

  10. A single network adaptive critic (SNAC) architecture for optimal control synthesis for a class of nonlinear systems.

    PubMed

    Padhi, Radhakant; Unnikrishnan, Nishant; Wang, Xiaohua; Balakrishnan, S N

    2006-12-01

    Even though dynamic programming offers an optimal control solution in a state feedback form, the method is overwhelmed by computational and storage requirements. Approximate dynamic programming implemented with an Adaptive Critic (AC) neural network structure has evolved as a powerful alternative technique that obviates the need for excessive computations and storage requirements in solving optimal control problems. In this paper, an improvement to the AC architecture, called the "Single Network Adaptive Critic (SNAC)" is presented. This approach is applicable to a wide class of nonlinear systems where the optimal control (stationary) equation can be explicitly expressed in terms of the state and costate variables. The selection of this terminology is guided by the fact that it eliminates the use of one neural network (namely the action network) that is part of a typical dual network AC setup. As a consequence, the SNAC architecture offers three potential advantages: a simpler architecture, lesser computational load and elimination of the approximation error associated with the eliminated network. In order to demonstrate these benefits and the control synthesis technique using SNAC, two problems have been solved with the AC and SNAC approaches and their computational performances are compared. One of these problems is a real-life Micro-Electro-Mechanical-system (MEMS) problem, which demonstrates that the SNAC technique is applicable to complex engineering systems.

  11. Theoretical prediction of welding distortion in large and complex structures

    NASA Astrophysics Data System (ADS)

    Deng, De-An

    2010-06-01

    Welding technology is widely used to assemble large thin plate structures such as ships, automobiles, and passenger trains because of its high productivity. However, it is impossible to avoid welding-induced distortion during the assembly process. Welding distortion not only reduces the fabrication accuracy of a weldment, but also decreases the productivity due to correction work. If welding distortion can be predicted using a practical method beforehand, the prediction will be useful for taking appropriate measures to control the dimensional accuracy to an acceptable limit. In this study, a two-step computational approach, which is a combination of a thermoelastic-plastic finite element method (FEM) and an elastic finite element with consideration for large deformation, is developed to estimate welding distortion for large and complex welded structures. Welding distortions in several representative large complex structures, which are often used in shipbuilding, are simulated using the proposed method. By comparing the predictions and the measurements, the effectiveness of the two-step computational approach is verified.

  12. Virtual-system-coupled adaptive umbrella sampling to compute free-energy landscape for flexible molecular docking.

    PubMed

    Higo, Junichi; Dasgupta, Bhaskar; Mashimo, Tadaaki; Kasahara, Kota; Fukunishi, Yoshifumi; Nakamura, Haruki

    2015-07-30

    A novel enhanced conformational sampling method, virtual-system-coupled adaptive umbrella sampling (V-AUS), was proposed to compute the 300-K free-energy landscape for flexible molecular docking, where a virtual degree of freedom was introduced to control the sampling. This degree of freedom interacts with the biomolecular system. V-AUS was applied to complex formation of two disordered amyloid-β (Aβ30-35) peptides in a periodic box filled with an explicit solvent. An interpeptide distance was defined as the reaction coordinate, along which sampling was enhanced. A uniform conformational distribution was obtained covering a wide range of interpeptide distances, from the bound to the unbound state. The 300-K free-energy landscape was characterized by thermodynamically stable basins of antiparallel and parallel β-sheet complexes and some other complex forms. Helices were frequently observed when the two peptides contacted loosely or fluctuated freely without interpeptide contacts. We observed that V-AUS converged to the uniform distribution more effectively than conventional AUS sampling did. © 2015 Wiley Periodicals, Inc.

  13. A controlled genetic algorithm by fuzzy logic and belief functions for job-shop scheduling.

    PubMed

    Hajri, S; Liouane, N; Hammadi, S; Borne, P

    2000-01-01

    Most scheduling problems are highly complex combinatorial problems. However, stochastic methods such as genetic algorithm yield good solutions. In this paper, we present a controlled genetic algorithm (CGA) based on fuzzy logic and belief functions to solve job-shop scheduling problems. For better performance, we propose an efficient representational scheme, heuristic rules for creating the initial population, and a new methodology for mixing and computing genetic operator probabilities.
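
    The controlling idea can be sketched without the full fuzzy/belief-function machinery: recompute the genetic operator probabilities each generation from a population statistic such as diversity. The encoding, fitness function, and two-rule controller below are illustrative placeholders, not the paper's scheme:

      import random

      def fitness(x):                # toy objective with a peak at x = 0.7
          return -(x - 0.7) ** 2

      def diversity(pop):
          mean = sum(pop) / len(pop)
          return sum((x - mean) ** 2 for x in pop) / len(pop)

      pop = [random.random() for _ in range(30)]
      for gen in range(50):
          # Crude controller: raise mutation when the population collapses.
          p_mut = 0.4 if diversity(pop) < 0.01 else 0.05
          pop.sort(key=fitness, reverse=True)
          parents = pop[:10]
          pop = [random.choice(parents)
                 + (random.gauss(0.0, 0.1) if random.random() < p_mut else 0.0)
                 for _ in range(30)]
      print(max(pop, key=fitness))   # should approach 0.7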

  14. Bioinspired Concepts: Unified Theory for Complex Biological and Engineering Systems

    DTIC Science & Technology

    2006-01-01

    ... i.e., data flows of finite size arrive at the system randomly. For such a system, we propose a modified dual scheduling algorithm that stabilizes ... We compute the efficiency of the controller over finite and infinite time intervals, and since the controller is optimal, this yields hard limits ...

  15. OCCULT-ORSER complete conversational user-language translator

    NASA Technical Reports Server (NTRS)

    Ramapriyan, H. K.; Young, K.

    1981-01-01

    Translator program (OCCULT) assists non-computer-oriented users in setting up and submitting jobs for the complex ORSER system. ORSER is a collection of image processing programs for analyzing remotely sensed data. OCCULT is designed for those who would like to use ORSER but cannot justify acquiring and maintaining the necessary proficiency in Remote Job Entry Language, Job Control Language, and control-card formats. OCCULT is written in FORTRAN IV and OS Assembler for interactive execution.

  16. Quadcopter control in three-dimensional space using a noninvasive motor imagery based brain-computer interface

    PubMed Central

    LaFleur, Karl; Cassady, Kaitlin; Doud, Alexander; Shades, Kaleb; Rogin, Eitan; He, Bin

    2013-01-01

    Objective: At the balanced intersection of human and machine adaptation is found the optimally functioning brain-computer interface (BCI). In this study, we report a novel experiment of BCI control of a robotic quadcopter in three-dimensional physical space using noninvasive scalp EEG in human subjects. We then quantify the performance of this system using metrics suitable for asynchronous BCI. Lastly, we examine the impact that operation of a real-world device has on subjects' control, with comparison to a two-dimensional virtual cursor task. Approach: Five human subjects were trained to modulate their sensorimotor rhythms to control an AR Drone navigating a three-dimensional physical space. Visual feedback was provided via a forward-facing camera on the hull of the drone. Individual subjects were able to accurately acquire up to 90.5% of all valid targets presented while travelling at an average straight-line speed of 0.69 m/s. Significance: Freely exploring and interacting with the world around us is a crucial element of autonomy that is lost in the context of neurodegenerative disease. Brain-computer interfaces are systems that aim to restore or enhance a user's ability to interact with the environment via a computer and through the use of only thought. We demonstrate for the first time the ability to control a flying robot in three-dimensional physical space using noninvasive scalp-recorded EEG in humans. Our work indicates the potential of noninvasive EEG-based BCI systems to accomplish complex control in three-dimensional physical space. The present study may serve as a framework for the investigation of multidimensional noninvasive brain-computer interface control in a physical environment using telepresence robotics. PMID:23735712

  17. Non-adiabatic holonomic quantum computation in linear system-bath coupling

    PubMed Central

    Sun, Chunfang; Wang, Gangcheng; Wu, Chunfeng; Liu, Haodi; Feng, Xun-Li; Chen, Jing-Ling; Xue, Kang

    2016-01-01

    Non-adiabatic holonomic quantum computation in decoherence-free subspaces protects quantum information from control imprecisions and decoherence. For the non-collective decoherence in which each qubit has its own bath, we show the implementations of two non-commutable holonomic single-qubit gates and one holonomic nontrivial two-qubit gate that compose a universal set of non-adiabatic holonomic quantum gates in decoherence-free subspaces of the decoupling group, with an encoding rate of (N - 2)/N. The proposed scheme is robust against control imprecisions and the non-collective decoherence, and its non-adiabatic property ensures a shorter operation time. We demonstrate that our proposed scheme can be realized by utilizing only two-qubit interactions rather than many-qubit interactions. Our results reduce the complexity of practical implementation of holonomic quantum computation in experiments. We also discuss the physical implementation of our scheme in coupled microcavities. PMID:26846444

  18. Non-adiabatic holonomic quantum computation in linear system-bath coupling.

    PubMed

    Sun, Chunfang; Wang, Gangcheng; Wu, Chunfeng; Liu, Haodi; Feng, Xun-Li; Chen, Jing-Ling; Xue, Kang

    2016-02-05

    Non-adiabatic holonomic quantum computation in decoherence-free subspaces protects quantum information from control imprecisions and decoherence. For the non-collective decoherence in which each qubit has its own bath, we show the implementations of two non-commutable holonomic single-qubit gates and one holonomic nontrivial two-qubit gate that compose a universal set of non-adiabatic holonomic quantum gates in decoherence-free subspaces of the decoupling group, with an encoding rate of (N - 2)/N. The proposed scheme is robust against control imprecisions and the non-collective decoherence, and its non-adiabatic property ensures a shorter operation time. We demonstrate that our proposed scheme can be realized by utilizing only two-qubit interactions rather than many-qubit interactions. Our results reduce the complexity of practical implementation of holonomic quantum computation in experiments. We also discuss the physical implementation of our scheme in coupled microcavities.

  19. Biosensors with Built-In Biomolecular Logic Gates for Practical Applications

    PubMed Central

    Lai, Yu-Hsuan; Sun, Sin-Cih; Chuang, Min-Chieh

    2014-01-01

    Molecular logic gates, designs constructed with biological and chemical molecules, have emerged as an alternative computing approach to silicon-based logic operations. These molecular computers are capable of receiving and integrating multiple stimuli of biochemical significance to generate a definitive output, opening a new research avenue to advanced diagnostics and therapeutics which demand handling of complex factors and precise control. In molecularly gated devices, Boolean logic computations can be activated by specific inputs and accurately processed via bio-recognition, bio-catalysis, and selective chemical reactions. In this review, we survey recent advances of the molecular logic approaches to practical applications of biosensors, including designs constructed with proteins, enzymes, nucleic acids, nanomaterials, and organic compounds, as well as the research avenues for future development of digitally operating “sense and act” schemes that logically process biochemical signals through networked circuits to implement intelligent control systems. PMID:25587423

  20. Integration of symbolic and algorithmic hardware and software for the automation of space station subsystems

    NASA Technical Reports Server (NTRS)

    Gregg, Hugh; Healey, Kathleen; Hack, Edmund; Wong, Carla

    1987-01-01

    Traditional expert systems, such as diagnostic and training systems, interact with users only through a keyboard and screen, and are usually symbolic in nature. Expert systems that require access to data bases, complex simulations and real-time instrumentation have both symbolic as well as algorithmic computing needs. These needs could both be met using a general purpose workstation running both symbolic and algorithmic code, or separate, specialized computers networked together. The latter approach was chosen to implement TEXSYS, the thermal expert system, developed by NASA Ames Research Center in conjunction with Johnson Space Center to demonstrate the ability of an expert system to autonomously monitor the thermal control system of the space station. TEXSYS has been implemented on a Symbolics workstation, and will be linked to a microVAX computer that will control a thermal test bed. This paper will explore the integration options, and present several possible solutions.

  1. Scalable software-defined optical networking with high-performance routing and wavelength assignment algorithms.

    PubMed

    Lee, Chankyun; Cao, Xiaoyuan; Yoshikane, Noboru; Tsuritani, Takehiro; Rhee, June-Koo Kevin

    2015-10-19

    The feasibility of software-defined optical networking (SDON) for practical applications critically depends on the scalability of centralized control performance. In this paper, highly scalable routing and wavelength assignment (RWA) algorithms are investigated on an OpenFlow-based SDON testbed for a proof-of-concept demonstration. Efficient RWA algorithms are proposed to achieve high network capacity at reduced computation cost, a significant attribute in a scalable centralized-control SDON. The proposed heuristic RWA algorithms differ in the order in which requests are processed and in the procedures for routing table updates. Combined with a shortest-path-based routing algorithm, a hottest-request-first processing policy that considers demand intensity and end-to-end distance information offers both the highest network throughput and acceptable computation scalability. We further investigate the trade-off relationship between network throughput and computation complexity in the routing table update procedure through a simulation study.
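
    A baseline of the kind such heuristics are measured against is shortest-path routing with first-fit wavelength assignment; the ring topology, wavelength count, and requests below are illustrative, not the testbed configuration:

      import networkx as nx

      W = 4                                  # wavelengths per fiber link
      G = nx.cycle_graph(6)                  # toy 6-node ring
      used = {tuple(sorted(e)): set() for e in G.edges}

      def assign(src, dst):
          path = nx.shortest_path(G, src, dst)
          links = [tuple(sorted((path[i], path[i + 1])))
                   for i in range(len(path) - 1)]
          for w in range(W):                 # first-fit: lowest free wavelength
              if all(w not in used[l] for l in links):
                  for l in links:
                      used[l].add(w)
                  return path, w
          return None                        # request blocked

      print(assign(0, 3))                    # e.g. ([0, 1, 2, 3], 0)
      print(assign(0, 3))                    # same path, next free wavelength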

  2. F100 Multivariable Control Synthesis Program. Computer Implementation of the F100 Multivariable Control Algorithm

    NASA Technical Reports Server (NTRS)

    Soeder, J. F.

    1983-01-01

    As turbofan engines become more complex, the development of controls necessitates the use of multivariable control techniques. A control developed for the F100-PW-100(3) turbofan engine by using linear quadratic regulator theory and other modern multivariable control synthesis techniques is described. The assembly language implementation of this control on an SEL 810B minicomputer is described. This implementation was then evaluated by using a real-time hybrid simulation of the engine. The control software was modified to run with a real engine. These modifications, in the form of sensor and actuator failure checks and control executive sequencing, are discussed. Finally, recommendations for control software implementations are presented.

  3. MapReduce Based Parallel Bayesian Network for Manufacturing Quality Control

    NASA Astrophysics Data System (ADS)

    Zheng, Mao-Kuan; Ming, Xin-Guo; Zhang, Xian-Yu; Li, Guo-Ming

    2017-09-01

    The increasing complexity of industrial products and manufacturing processes has challenged conventional statistics-based quality management approaches in circumstances of dynamic production. An integrated Bayesian network and big data analytics approach for manufacturing process quality analysis and control is proposed. Based on the Hadoop distributed architecture and the MapReduce parallel computing model, the large volume and variety of quality-related data generated during the manufacturing process can be handled. Artificial intelligence algorithms, including Bayesian network learning, classification, and reasoning, are embedded in the Reduce process. Relying on the ability of the Bayesian network to deal with dynamic and uncertain problems and on the parallel computing power of MapReduce, a Bayesian network of factors affecting quality is built from prior probability distributions and modified with posterior probability distributions. A case study on hull segment manufacturing precision management for ship and offshore platform building shows that computing speed accelerates almost in direct proportion to the increase in computing nodes. It is also shown that the proposed model is feasible for locating and reasoning about root causes, forecasting manufacturing outcomes, and intelligent decision-making for precision problem solving. The integration of big data analytics and the BN method offers a whole new perspective on manufacturing quality control.
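
    The Map/Reduce split for one network node can be sketched with plain Python: mappers emit (parent state, outcome) pairs from shards of process records, and the reducer aggregates them into conditional probability counts. The field names and records are illustrative placeholders:

      from collections import Counter
      from functools import reduce

      shards = [
          [("temp_high", "defect"), ("temp_low", "ok")],
          [("temp_high", "ok"), ("temp_high", "defect")],
      ]

      def mapper(shard):                     # one mapper per data shard
          return Counter(shard)

      def reducer(c1, c2):                   # merge partial counts
          return c1 + c2

      counts = reduce(reducer, map(mapper, shards))
      total = sum(v for (p, _), v in counts.items() if p == "temp_high")
      print("P(defect | temp_high) =", counts[("temp_high", "defect")] / total)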

  4. Systems biology by the rules: hybrid intelligent systems for pathway modeling and discovery.

    PubMed

    Bosl, William J

    2007-02-15

    Expert knowledge in journal articles is an important source of data for reconstructing biological pathways and creating new hypotheses. An important need for medical research is to integrate this data with high-throughput sources to build useful models that span several scales. Researchers traditionally use mental models of pathways to integrate information and develop new hypotheses. Unfortunately, the amount of information is often overwhelming, and such mental models are inadequate for predicting the dynamic response of complex pathways. Hierarchical computational models that allow exploration of semi-quantitative dynamics are useful systems biology tools for theoreticians, experimentalists, and clinicians and may provide a means for cross-communication. A novel approach for biological pathway modeling based on hybrid intelligent systems or soft computing technologies is presented here. Hybrid intelligent systems, which refer to several related computing methods such as fuzzy logic, neural nets, genetic algorithms, and statistical analysis, have become ubiquitous in engineering applications for complex control system modeling and design. Biological pathways may be considered complex control systems, which medicine tries to manipulate to achieve desired results. Thus, hybrid intelligent systems may provide a useful tool for modeling biological system dynamics and computational exploration of new drug targets. A new modeling approach based on these methods is presented in the context of hedgehog regulation of the cell cycle in granule cells. Code and input files can be found at the Bionet website: www.chip.ord/~wbosl/Software/Bionet. This paper presents the algorithmic methods needed for modeling complicated biochemical dynamics using rule-based models to represent expert knowledge in the context of cell cycle regulation and tumor growth. A notable feature of this modeling approach is that it allows biologists to build complex models from their knowledge base without the need to translate that knowledge into mathematical form. Dynamics on several levels, from molecular pathways to tissue growth, are seamlessly integrated. A number of common network motifs are examined and used to build a model of hedgehog regulation of the cell cycle in cerebellar neurons, which is believed to play a key role in the etiology of medulloblastoma, a devastating childhood brain cancer.

  5. Projection rule for complex-valued associative memory with large constant terms

    NASA Astrophysics Data System (ADS)

    Kitahara, Michimasa; Kobayashi, Masaki

    Complex-valued Associative Memory (CAM) has an inherent property of rotation invariance. Rotation invariance produces many undesirable stable states and reduces the noise robustness of CAM. Constant terms may remove rotation invariance, but if the constant terms are too small, rotation invariance does not vanish. In this paper, we eliminate rotation invariance by introducing large constant terms to the complex-valued neurons. The constant terms have to be made sufficiently large to improve the noise robustness. We introduce into the projection rule a parameter that controls the amplitudes of the constant terms. The large constant terms are shown to be effective by our computer simulations.
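
    A hedged sketch of the idea: augment each stored pattern with a constant input of amplitude c, build the weights with the standard projection (pseudoinverse) rule, and clamp the constant input during recall so that a global rotation of the state is no longer a symmetry of the network. The patterns and c are illustrative, not the paper's simulation settings:

      import numpy as np

      c = 5.0                                              # constant-term amplitude
      patterns = np.array([[1, 1j, -1, -1j],
                           [1, -1, 1, -1]], dtype=complex).T  # columns = patterns
      X = np.vstack([patterns, c * np.ones((1, patterns.shape[1]))])
      W = X @ np.linalg.pinv(X)                            # projection rule

      def recall_step(v):
          u = np.append(v, c)                  # clamp the constant input
          out = (W @ u)[:-1]
          return out / np.abs(out)             # map each neuron back to the unit circle

      rotated = patterns[:, 0] * np.exp(0.3j)  # globally rotated stored pattern
      print(np.round(recall_step(rotated), 2))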

  6. A method for computing the kernel of the downwash integral equation for arbitrary complex frequencies

    NASA Technical Reports Server (NTRS)

    Desmarais, R. N.; Rowe, W. S.

    1984-01-01

    For the design of active controls to stabilize flight vehicles, which requires the use of unsteady aerodynamics that are valid for arbitrary complex frequencies, algorithms are derived for evaluating the nonelementary part of the kernel of the integral equation that relates unsteady pressure to downwash. This part of the kernel is separated into an infinite limit integral that is evaluated using Bessel and Struve functions and into a finite limit integral that is expanded in series and integrated termwise in closed form. The developed series expansions gave reliable answers for all complex reduced frequencies and executed faster than exponential approximations for many pressure stations.

  7. A musculoskeletal model of the elbow joint complex

    NASA Technical Reports Server (NTRS)

    Gonzalez, Roger V.; Barr, Ronald E.; Abraham, Lawrence D.

    1993-01-01

    This paper describes a musculoskeletal model that represents human elbow flexion-extension and forearm pronation-supination. Musculotendon parameters and the skeletal geometry were determined for the musculoskeletal model in the analysis of ballistic elbow joint complex movements. The key objective was to develop a computational model, guided by optimal control, to investigate the relationship among patterns of muscle excitation, individual muscle forces, and movement kinematics. The model was verified using experimental kinematic, torque, and electromyographic data from volunteer subjects performing both isometric and ballistic elbow joint complex movements. In general, the model predicted kinematic and muscle excitation patterns similar to what was experimentally measured.

  8. Morphological communication: exploiting coupled dynamics in a complex mechanical structure to achieve locomotion

    PubMed Central

    Rieffel, John A.; Valero-Cuevas, Francisco J.; Lipson, Hod

    2010-01-01

    Traditional engineering approaches strive to avoid, or actively suppress, nonlinear dynamic coupling among components. Biological systems, in contrast, are often rife with these dynamics. Could there be, in some cases, a benefit to high degrees of dynamical coupling? Here we present a distributed robotic control scheme inspired by the biological phenomenon of tensegrity-based mechanotransduction. This emergence of morphology-as-information-conduit or ‘morphological communication’, enabled by time-sensitive spiking neural networks, presents a new paradigm for the decentralized control of large, coupled, modular systems. These results significantly bolster, both in magnitude and in form, the idea of morphological computation in robotic control. Furthermore, they lend further credence to ideas of embodied anatomical computation in biological systems, on scales ranging from cellular structures up to the tendinous networks of the human hand. PMID:19776146

  9. Preliminary demonstration of a robust controller design method

    NASA Technical Reports Server (NTRS)

    Anderson, L. R.

    1980-01-01

    Alternative computational procedures for obtaining a feedback control law which yields a control signal based on measurable quantities are evaluated. The three methods evaluated are: (1) the standard linear quadratic regulator design model; (2) minimization of the norm of the feedback matrix K via nonlinear programming, subject to the constraint that the closed-loop eigenvalues lie in a specified domain in the complex plane; and (3) maximization of the angles between the closed-loop eigenvectors in combination with minimization of the norm of K, also via constrained nonlinear programming. The third, or robust, design method was chosen to yield a closed-loop system whose eigenvalues are insensitive to small changes in the A and B matrices. The relationship between orthogonality of closed-loop eigenvectors and the sensitivity of closed-loop eigenvalues is described. Computer programs are described.
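
    For reference, method (1) is the standard LQR design; a minimal sketch with illustrative A, B, Q, and R matrices (not those of the report) follows.

      import numpy as np
      from scipy.linalg import solve_continuous_are

      # Minimal sketch of a standard continuous-time LQR design (method 1).
      # The system and weighting matrices below are illustrative examples.
      A = np.array([[0.0, 1.0],
                    [-2.0, -0.5]])
      B = np.array([[0.0],
                    [1.0]])
      Q = np.eye(2)            # state weighting
      R = np.array([[1.0]])    # control weighting

      P = solve_continuous_are(A, B, Q, R)     # solve A'P + PA - PB R^-1 B'P + Q = 0
      K = np.linalg.solve(R, B.T @ P)          # feedback gain: u = -Kx

      print("gain K:", K)
      print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))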

  10. Thermodynamic cost of computation, algorithmic complexity and the information metric

    NASA Technical Reports Server (NTRS)

    Zurek, W. H.

    1989-01-01

    Algorithmic complexity is discussed as a computational counterpart to the second law of thermodynamics. It is shown that algorithmic complexity, which is a measure of randomness, sets limits on the thermodynamic cost of computations and casts a new light on the limitations of Maxwell's demon. Algorithmic complexity can also be used to define distance between binary strings.

  11. FAA Air Traffic Control Operations Concepts. Volume 2. ACF/ACCC (Area Control Facility/Area Control Computer Complex) Terminal and En Route Controllers. Change 1

    DTIC Science & Technology

    1988-07-29


  12. Analysis of explicit model predictive control for path-following control

    PubMed Central

    2018-01-01

    In this paper, explicit Model Predictive Control (MPC) is employed for automated lane-keeping systems. MPC has been regarded as the key to handling such constrained systems. However, the massive computational complexity of MPC, which employs online optimization, has been a major drawback that limits the range of its target applications to relatively small and/or slow problems. Explicit MPC can reduce this computational burden using a multi-parametric quadratic programming (mp-QP) technique. The control objective is to derive an optimal front steering wheel angle at each sampling time so that autonomous vehicles travel along desired paths, including straight, circular, and clothoid parts, at high entry speeds. In terms of the design of the proposed controller, a method of choosing weighting matrices in an optimization problem and the range of horizons for path-following control are described through simulations. For the verification of the proposed controller, simulation results obtained using other control methods such as MPC, the Linear-Quadratic Regulator (LQR), and a driver model are employed, and CarSim, which reflects the features of a vehicle more realistically than MATLAB/Simulink, is used for reliable demonstration. PMID:29534080

  13. Analysis of explicit model predictive control for path-following control.

    PubMed

    Lee, Junho; Chang, Hyuk-Jun

    2018-01-01

    In this paper, explicit Model Predictive Control (MPC) is employed for automated lane-keeping systems. MPC has been regarded as the key to handling such constrained systems. However, the massive computational complexity of MPC, which employs online optimization, has been a major drawback that limits the range of its target applications to relatively small and/or slow problems. Explicit MPC can reduce this computational burden using a multi-parametric quadratic programming (mp-QP) technique. The control objective is to derive an optimal front steering wheel angle at each sampling time so that autonomous vehicles travel along desired paths, including straight, circular, and clothoid parts, at high entry speeds. In terms of the design of the proposed controller, a method of choosing weighting matrices in an optimization problem and the range of horizons for path-following control are described through simulations. For the verification of the proposed controller, simulation results obtained using other control methods such as MPC, the Linear-Quadratic Regulator (LQR), and a driver model are employed, and CarSim, which reflects the features of a vehicle more realistically than MATLAB/Simulink, is used for reliable demonstration.
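
    The mp-QP step that makes the controller "explicit" precomputes, offline, the solution of an online quadratic program of roughly the following form; the double-integrator lateral-error model, weights, horizon, and bounds below are illustrative stand-ins, not the vehicle model used in the paper.

      import numpy as np
      import cvxpy as cp

      # Minimal sketch of the online MPC quadratic program that explicit MPC
      # precomputes via mp-QP.  Model and parameters are illustrative.
      dt, Np = 0.05, 20                       # sampling time, prediction horizon
      A = np.array([[1.0, dt], [0.0, 1.0]])   # lateral error and its rate
      B = np.array([[0.0], [dt]])             # steering input enters the rate
      Q = np.diag([10.0, 1.0])                # state weights
      R = 0.1                                 # input weight

      x0 = np.array([0.5, 0.0])               # initial lateral offset of 0.5 m
      x = cp.Variable((2, Np + 1))
      u = cp.Variable((1, Np))

      cost, constr = 0, [x[:, 0] == x0]
      for k in range(Np):
          cost += cp.quad_form(x[:, k], Q) + R * cp.square(u[0, k])
          constr += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
                     cp.abs(u[0, k]) <= 0.5]  # steering-angle limit (rad)

      cp.Problem(cp.Minimize(cost), constr).solve()
      print("first optimal steering angle:", u.value[0, 0])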

  14. QUALITY ASSURANCE AND QUALITY CONTROL IN THE DEVELOPMENT AND APPLICATION OF THE AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA) TOOL

    EPA Science Inventory

    Planning and assessment in land and water resource management are evolving from simple, local-scale problems toward complex, spatially explicit regional ones. Such problems have to be addressed with distributed models that can compute runoff and erosion at different spatial and t...

  15. A cardiovascular system model for lower-body negative pressure response

    NASA Technical Reports Server (NTRS)

    Mitchell, B. A., Jr.; Giese, R. P.

    1971-01-01

    Mathematical models used to study complex physiological control systems are discussed. Efforts were made to modify a model of the cardiovascular system for use in studying lower body negative pressure. A computer program was written which allows orderly, straightforward expansion to include exercise, metabolism (thermal stress), respiration, and other body functions.

  16. Design and Evaluation of Simulations for the Development of Complex Decision-Making Skills.

    ERIC Educational Resources Information Center

    Hartley, Roger; Varley, Glen

    2002-01-01

    Command and Control Training Using Simulation (CACTUS) is a computer digital mapping system used by police to manage large-scale public events. Audio and video records of adaptive training scenarios using CACTUS show how the simulation develops decision-making skills for strategic and tactical event management. (SK)

  17. Sensor fusion and computer vision for context-aware control of a multi degree-of-freedom prosthesis

    NASA Astrophysics Data System (ADS)

    Markovic, Marko; Dosen, Strahinja; Popovic, Dejan; Graimann, Bernhard; Farina, Dario

    2015-12-01

    Objective. Myoelectric activity volitionally generated by the user is often used for controlling hand prostheses in order to replicate the synergistic actions of muscles in healthy humans during grasping. Muscle synergies in healthy humans are based on the integration of visual perception, heuristics and proprioception. Here, we demonstrate how sensor fusion that combines artificial vision and proprioceptive information with the high-level processing characteristics of biological systems can be effectively used in transradial prosthesis control. Approach. We developed a novel context- and user-aware prosthesis (CASP) controller integrating computer vision and inertial sensing with myoelectric activity in order to achieve semi-autonomous and reactive control of a prosthetic hand. The presented method semi-automatically provides simultaneous and proportional control of multiple degrees-of-freedom (DOFs), thus decreasing overall physical effort while retaining full user control. The system was compared against a major commercial state-of-the-art myoelectric control system in ten able-bodied subjects and one amputee. All subjects used a transradial prosthesis with an active wrist to grasp objects typically associated with activities of daily living. Main results. The CASP significantly outperformed the myoelectric interface when controlling all of the prosthesis DOFs. However, when tested with a less complex prosthetic system (smaller number of DOFs), the CASP was slower but produced reaching motions that contained fewer compensatory movements. Another important finding is that the CASP system required minimal user adaptation and training. Significance. The CASP constitutes a substantial improvement for the control of multi-DOF prostheses. The application of the CASP will have a significant impact when translated to real-life scenarios, particularly with respect to improving the usability and acceptance of highly complex systems (e.g., full prosthetic arms) by amputees.

  18. Sensor fusion and computer vision for context-aware control of a multi degree-of-freedom prosthesis.

    PubMed

    Markovic, Marko; Dosen, Strahinja; Popovic, Dejan; Graimann, Bernhard; Farina, Dario

    2015-12-01

    Myoelectric activity volitionally generated by the user is often used for controlling hand prostheses in order to replicate the synergistic actions of muscles in healthy humans during grasping. Muscle synergies in healthy humans are based on the integration of visual perception, heuristics and proprioception. Here, we demonstrate how sensor fusion that combines artificial vision and proprioceptive information with the high-level processing characteristics of biological systems can be effectively used in transradial prosthesis control. We developed a novel context- and user-aware prosthesis (CASP) controller integrating computer vision and inertial sensing with myoelectric activity in order to achieve semi-autonomous and reactive control of a prosthetic hand. The presented method semi-automatically provides simultaneous and proportional control of multiple degrees-of-freedom (DOFs), thus decreasing overall physical effort while retaining full user control. The system was compared against a major commercial state-of-the-art myoelectric control system in ten able-bodied subjects and one amputee. All subjects used a transradial prosthesis with an active wrist to grasp objects typically associated with activities of daily living. The CASP significantly outperformed the myoelectric interface when controlling all of the prosthesis DOFs. However, when tested with a less complex prosthetic system (smaller number of DOFs), the CASP was slower but produced reaching motions that contained fewer compensatory movements. Another important finding is that the CASP system required minimal user adaptation and training. The CASP constitutes a substantial improvement for the control of multi-DOF prostheses. The application of the CASP will have a significant impact when translated to real-life scenarios, particularly with respect to improving the usability and acceptance of highly complex systems (e.g., full prosthetic arms) by amputees.

  19. Designing for adaptation to novelty and change: functional information, emergent feature graphics, and higher-level control.

    PubMed

    Hajdukiewicz, John R; Vicente, Kim J

    2002-01-01

    Ecological interface design (EID) is a theoretical framework that aims to support worker adaptation to change and novelty in complex systems. Previous evaluations of EID have emphasized representativeness to enhance generalizability of results to operational settings. The research presented here is complementary, emphasizing experimental control to enhance theory building. Two experiments were conducted to test the impact of functional information and emergent feature graphics on adaptation to novelty and change in a thermal-hydraulic process control microworld. Presenting functional information in an interface using emergent features encouraged experienced participants to become perceptually coupled to the interface and thereby to exhibit higher-level control and more successful adaptation to unanticipated events. The absence of functional information or of emergent features generally led to lower-level control and less success at adaptation, the exception being a minority of participants who compensated by relying on analytical reasoning. These findings may have practical implications for shaping coordination in complex systems and fundamental implications for the development of a general unified theory of coordination for the technical, human, and social sciences. Actual or potential applications of this research include the design of human-computer interfaces that improve safety in complex sociotechnical systems.

  20. Parallel Optimization of Polynomials for Large-scale Problems in Stability and Control

    NASA Astrophysics Data System (ADS)

    Kamyar, Reza

    In this thesis, we focus on some of the NP-hard problems in control theory. Thanks to converse Lyapunov theory, these problems can often be modeled as optimization over polynomials. To avoid the problem of intractability, we establish a trade-off between accuracy and complexity. In particular, we develop a sequence of tractable optimization problems --- in the form of Linear Programs (LPs) and/or Semi-Definite Programs (SDPs) --- whose solutions converge to the exact solution of the NP-hard problem. However, the computational and memory complexity of these LPs and SDPs grows exponentially with the progress of the sequence, meaning that improving the accuracy of the solutions requires solving SDPs with tens of thousands of decision variables and constraints. Setting up and solving such problems is a significant challenge. The existing optimization algorithms and software are only designed to use desktop computers or small cluster computers --- machines which do not have sufficient memory for solving such large SDPs. Moreover, the speed-up of these algorithms does not scale beyond dozens of processors. This in fact is the reason we seek parallel algorithms for setting up and solving large SDPs on large clusters and/or supercomputers. We propose parallel algorithms for stability analysis of two classes of systems: 1) linear systems with a large number of uncertain parameters; 2) nonlinear systems defined by polynomial vector fields. First, we develop a distributed parallel algorithm which applies Polya's and/or Handelman's theorems to some variants of parameter-dependent Lyapunov inequalities with parameters defined over the standard simplex. The result is a sequence of SDPs which possess a block-diagonal structure. We then develop a parallel SDP solver which exploits this structure in order to map the computation, memory and communication to a distributed parallel environment. Numerical tests on a supercomputer demonstrate the ability of the algorithm to efficiently utilize hundreds and potentially thousands of processors, and to analyze systems with state-spaces of dimension 100 and higher. Furthermore, we extend our algorithms to analyze robust stability over more complicated geometries such as hypercubes and arbitrary convex polytopes. Our algorithms can be readily extended to address a wide variety of problems in control such as H-infinity synthesis for systems with parametric uncertainty and computing control Lyapunov functions.
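
    At the core of such SDP hierarchies is the Lyapunov inequality; a minimal, non-parameterized instance, with an arbitrary example matrix, can be posed as follows.

      import numpy as np
      import cvxpy as cp

      # Minimal sketch of the kind of Lyapunov SDP the thesis scales up:
      # find P > 0 with A'P + PA < 0, certifying stability of dx/dt = Ax.
      # The matrix A and the margin 1e-6 are illustrative choices.
      A = np.array([[-1.0, 2.0],
                    [0.0, -3.0]])

      P = cp.Variable((2, 2), symmetric=True)
      constraints = [P >> 1e-6 * np.eye(2),                  # P positive definite
                     A.T @ P + P @ A << -1e-6 * np.eye(2)]   # Lyapunov inequality

      cp.Problem(cp.Minimize(0), constraints).solve()
      print("Lyapunov matrix P:\n", P.value)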

  1. A quantum Fredkin gate

    PubMed Central

    Patel, Raj B.; Ho, Joseph; Ferreyrol, Franck; Ralph, Timothy C.; Pryde, Geoff J.

    2016-01-01

    Minimizing the resources required to build logic gates into useful processing circuits is key to realizing quantum computers. Although the salient features of a quantum computer have been shown in proof-of-principle experiments, difficulties in scaling quantum systems have made more complex operations intractable. This is exemplified in the classical Fredkin (controlled-SWAP) gate for which, despite theoretical proposals, no quantum analog has been realized. By adding control to the SWAP unitary, we use photonic qubit logic to demonstrate the first quantum Fredkin gate, which promises many applications in quantum information and measurement. We implement example algorithms and generate the highest-fidelity three-photon Greenberger-Horne-Zeilinger states to date. The technique we use allows one to add a control operation to a black-box unitary, something that is impossible in the standard circuit model. Our experiment represents the first use of this technique to control a two-qubit operation and paves the way for larger controlled circuits to be realized efficiently. PMID:27051868
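
    For reference, the Fredkin (controlled-SWAP) unitary itself is the 8 x 8 permutation sketched below; this shows only the gate's action on basis states |c, t1, t2>, not the photonic implementation.

      import numpy as np

      # Minimal sketch of the Fredkin (controlled-SWAP) unitary acting on
      # basis states |c, t1, t2>; an illustration of the gate's algebra only.
      F = np.eye(8)
      F[[5, 6]] = F[[6, 5]]           # swap |101> <-> |110> when control = 1

      def ket(bits):                  # basis state from a bit string, e.g. "110"
          v = np.zeros(8)
          v[int(bits, 2)] = 1.0
          return v

      print(F @ ket("110"))           # control 1: targets swapped -> |101>
      print(F @ ket("010"))           # control 0: state unchanged  -> |010>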

  2. Maximized gust loads for a nonlinear airplane using matched filter theory and constrained optimization

    NASA Technical Reports Server (NTRS)

    Scott, Robert C.; Pototzky, Anthony S.; Perry, Boyd, III

    1991-01-01

    Two matched-filter-theory-based schemes are described and illustrated for obtaining maximized and time-correlated gust loads for a nonlinear aircraft. The first scheme is computationally fast because it uses a simple one-dimensional search procedure to obtain its answers. The second scheme is computationally slow because it uses a more complex multi-dimensional search procedure to obtain its answers, but it consistently provides slightly higher maximum loads than the first scheme. Both schemes are illustrated with numerical examples involving a nonlinear control system.

  3. Maximized gust loads for a nonlinear airplane using matched filter theory and constrained optimization

    NASA Technical Reports Server (NTRS)

    Scott, Robert C.; Perry, Boyd, III; Pototzky, Anthony S.

    1991-01-01

    This paper describes and illustrates two matched-filter-theory-based schemes for obtaining maximized and time-correlated gust loads for a nonlinear airplane. The first scheme is computationally fast because it uses a simple one-dimensional search procedure to obtain its answers. The second scheme is computationally slow because it uses a more complex multidimensional search procedure to obtain its answers, but it consistently provides slightly higher maximum loads than the first scheme. Both schemes are illustrated with numerical examples involving a nonlinear control system.

  4. State estimation for distributed systems with sensing delay

    NASA Astrophysics Data System (ADS)

    Alexander, Harold L.

    1991-08-01

    Control of complex systems such as remote robotic vehicles requires combining data from many sensors where the data may often be delayed by sensory processing requirements. The number and variety of sensors make it desirable to distribute the computational burden of sensing and estimation among multiple processors. Classic Kalman filters do not lend themselves to distributed implementations or delayed measurement data. The alternative Kalman filter designs presented in this paper are adapted for delays in sensor data generation and for distribution of computation for sensing and estimation over a set of networked processors.
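
    One common way to adapt a Kalman filter to delayed sensor data is to buffer past estimates, apply the late update at the step the measurement refers to, and re-run the predictions forward. The sketch below, with an arbitrary constant-velocity model, illustrates that generic technique; it is not necessarily the specific design presented in the paper.

      import numpy as np

      # Minimal sketch: fold in a measurement delayed by d steps by buffering
      # past filter states, updating at the step the measurement refers to,
      # and re-running the predictions forward.  Model values are arbitrary.
      A, C = np.array([[1.0, 1.0], [0.0, 1.0]]), np.array([[1.0, 0.0]])
      Qn, Rn = 0.01 * np.eye(2), np.array([[0.25]])

      def predict(x, P):
          return A @ x, A @ P @ A.T + Qn

      def update(x, P, z):
          S = C @ P @ C.T + Rn
          K = P @ C.T @ np.linalg.inv(S)            # Kalman gain
          return x + K @ (z - C @ x), (np.eye(2) - K @ C) @ P

      x, P = np.zeros(2), np.eye(2)
      history = []                                  # estimate at each past step
      for k in range(10):
          history.append((x.copy(), P.copy()))
          x, P = predict(x, P)

      # A measurement taken at step 10 - d arrives now, delayed by d = 3 steps:
      d, z_delayed = 3, np.array([6.8])
      x, P = history[10 - d]                        # roll back to the delayed step
      x, P = update(x, P, z_delayed)                # apply the late measurement
      for _ in range(d):                            # re-run predictions to "now"
          x, P = predict(x, P)
      print(x)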

  5. Computer aided statistical process control for on-line instrumentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meils, D.E.

    1995-01-01

    On-line chemical process instrumentation historically has been used for trending. Recent technological advances in on-line instrumentation have improved the accuracy and reliability of on-line instrumentation. However, little attention has been given to validating and verifying on-line instrumentation. This paper presents two practical approaches for validating instrument performance by comparison of on-line instrument response to either another portable instrument or another bench instrument. Because the comparison of two instruments' performance to each other requires somewhat complex statistical calculations, a computer code (Lab Stats Pack®) is used to simplify the calculations. Lab Stats Pack® also develops control charts that may be used for continuous verification of on-line instrument performance.

  6. Diagnosis of the Computer-Controlled Milling Machine, Definition of the Working Errors and Input Corrections on the Basis of Mathematical Model

    NASA Astrophysics Data System (ADS)

    Starikov, A. I.; Nekrasov, R. Yu; Teploukhov, O. J.; Soloviev, I. V.; Narikov, K. A.

    2016-10-01

    Machinery and equipment improve in design as science and technology advance, and requirements for quality and longevity rise with them. In particular, the requirements for surface quality and manufacturing precision of oil and gas equipment parts are constantly increasing. Production of oil and gas engineering products on modern machine tools with computer numerical control is a complex synthesis of the mechanical and electrical parts of the equipment, as well as the machining procedure. The mechanical part of the machine wears during operation, and mathematical errors accumulate in the electrical part. Thus, shortcomings in any of these parts of metalworking equipment affect the manufacturing process as a whole and, as a result, lead to flaws.

  7. Introduction to the LaRC central scientific computing complex

    NASA Technical Reports Server (NTRS)

    Shoosmith, John N.

    1993-01-01

    The computers and associated equipment that make up the Central Scientific Computing Complex of the Langley Research Center are briefly described. The electronic networks that provide access to the various components of the complex, and a number of areas that can be used by Langley and contractor staff for special applications (scientific visualization, image processing, software engineering, and grid generation), are also described. Flight simulation facilities that use the central computers are described. Management of the complex, procedures for its use, and available services and resources are discussed. This document is intended for new users of the complex, for current users who wish to keep apprised of changes, and for visitors who need to understand the role of central scientific computers at Langley.

  8. [Influence of mental rotation of objects on psychophysiological functions of women].

    PubMed

    Chikina, L V; Fedorchuk, S V; Trushina, V A; Ianchuk, P I; Makarchuk, M Iu

    2012-01-01

    An integral part of modern human activity is work with computer systems, which in turn produces nervous-emotional tension. Hence, the problems of controlling the psychophysiological state of workers, with the purpose of preserving health and ensuring successful activity, and of applying rehabilitation measures are topical. It is currently known that the efficiency of rehabilitation procedures rises when a complex of restorative programs is applied. Our previous investigation showed that mental rotation is capable of compensating for the consequences of nervous-emotional tension. Therefore, in the present work we investigated how the complex of spatial tasks developed by us influences the psychophysiological performance of the tested women, for whom psycho-emotional tension arising from the use of computer technologies is more pronounced and the procedure of mental rotation is a more complex task than for men. The complex of spatial tasks applied in the given work included: mental rotation of simple objects (letters and digits), mental rotation of complex objects (geometrical figures), and mental rotation of complex objects with the use of short-term memory. Execution of the complex of spatial tasks reduces the time of simple and complex sensorimotor responses, raises the parameters of short-term memory and brain working capacity, and improves nervous processes. Collectively, mental rotation of objects can be recommended as a rehabilitation resource for compensating the consequences of psycho-emotional strain, both for men and for women.

  9. POPEYE: A production rule-based model of multitask supervisory control (POPCORN)

    NASA Technical Reports Server (NTRS)

    Townsend, James T.; Kadlec, Helena; Kantowitz, Barry H.

    1988-01-01

    Recent studies of relationships between subjective ratings of mental workload, performance, and human operator and task characteristics have indicated that these relationships are quite complex. In order to study the various relationships and place subjective mental workload within a theoretical framework, we developed a production system model for the performance component of the complex supervisory task called POPCORN. The production system model is represented by a hierarchical structure of goals and subgoals, and the information flow is controlled by a set of condition-action rules. The implementation of this production system, called POPEYE, generates computer-simulated data under different task difficulty conditions which are comparable to those of human operators performing the task. This model is the performance aspect of an overall dynamic psychological model which we are developing to examine and quantify relationships between performance and psychological aspects in a complex environment.
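
    As a schematic illustration of the condition-action architecture described above (a toy example, not POPEYE's actual rule set), a recognize-act cycle can be as simple as the following loop.

      # Minimal sketch of a production-system control loop: condition-action
      # rules fire against a working memory under a goal.  The rules and
      # state here are hypothetical toy examples.
      state = {"goal": "handle_targets", "targets_pending": 3, "resource_free": True}

      rules = [
          # (name, condition, action)
          ("engage", lambda s: s["targets_pending"] > 0 and s["resource_free"],
           lambda s: s.update(targets_pending=s["targets_pending"] - 1,
                              resource_free=False)),
          ("recover", lambda s: not s["resource_free"],
           lambda s: s.update(resource_free=True)),
          ("idle", lambda s: s["targets_pending"] == 0,
           lambda s: s.update(goal="monitor")),
      ]

      while state["goal"] == "handle_targets":     # recognize-act cycle
          for name, cond, act in rules:
              if cond(state):                      # first matching rule fires
                  act(state)
                  print("fired:", name, "->", state)
                  break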

  10. Empirical modeling for intelligent, real-time manufacture control

    NASA Technical Reports Server (NTRS)

    Xu, Xiaoshu

    1994-01-01

    Artificial neural systems (ANS), also known as neural networks, are an attempt to develop computer systems that emulate the neural reasoning behavior of biological neural systems (e.g. the human brain). As such, they are loosely based on biological neural networks. An ANS consists of a series of nodes (neurons) and weighted connections (axons) that, when presented with a specific input pattern, can associate specific output patterns. It is essentially a highly complex, nonlinear, mathematical relationship or transform. These constructs have two significant properties that have proven useful to the authors in signal processing and process modeling: noise tolerance and complex pattern recognition. Specifically, the authors have developed a new network learning algorithm that has resulted in the successful application of ANSs to high-speed signal processing and to developing models of highly complex processes. Two of the applications, the Weld Bead Geometry Control System and the Welding Penetration Monitoring System, are discussed in the body of this paper.

  11. Experimental econophysics: Complexity, self-organization, and emergent properties

    NASA Astrophysics Data System (ADS)

    Huang, J. P.

    2015-03-01

    Experimental econophysics is concerned with statistical physics of humans in the laboratory, and it is based on controlled human experiments developed by physicists to study some problems related to economics or finance. It relies on controlled human experiments in the laboratory together with agent-based modeling (for computer simulations and/or analytical theory), with an attempt to reveal the general cause-effect relationship between specific conditions and emergent properties of real economic/financial markets (a kind of complex adaptive systems). Here I review the latest progress in the field, namely, stylized facts, herd behavior, contrarian behavior, spontaneous cooperation, partial information, and risk management. Also, I highlight the connections between such progress and other topics of traditional statistical physics. The main theme of the review is to show diverse emergent properties of the laboratory markets, originating from self-organization due to the nonlinear interactions among heterogeneous humans or agents (complexity).

  12. Cognitive engineering models in space systems

    NASA Technical Reports Server (NTRS)

    Mitchell, Christine M.

    1992-01-01

    NASA space systems, including mission operations on the ground and in space, are complex, dynamic, predominantly automated systems in which the human operator is a supervisory controller. The human operator monitors and fine-tunes computer-based control systems and is responsible for ensuring safe and efficient system operation. In such systems, the potential consequences of human mistakes and errors may be very large, so a low probability of such events is essential. Thus, models of cognitive functions in complex systems are needed to describe human performance and form the theoretical basis of operator workstation design, including displays, controls, and decision support aids. The operator function model represents normative operator behavior: the operator activities expected given the current system state. The extension of the theoretical structure of the operator function model and its application to NASA Johnson mission operations and space station applications is discussed.

  13. Implicit Multibody Penalty-Based Distributed Contact.

    PubMed

    Xu, Hongyi; Zhao, Yili; Barbic, Jernej

    2014-09-01

    The penalty method is a simple and popular approach to resolving contact in computer graphics and robotics. Penalty-based contact, however, suffers from stability problems due to the highly variable and unpredictable net stiffness, and this is particularly pronounced in simulations with time-varying, distributed, geometrically complex contact. We employ semi-implicit integration, exact analytical contact gradients, symbolic Gaussian elimination and an SVD solver to simulate stable penalty-based frictional contact with large, time-varying contact areas, involving many rigid objects and articulated rigid objects in complex conforming contact and self-contact. We also derive implicit proportional-derivative control forces for real-time control of articulated structures with loops. We present challenging contact scenarios such as screwing a hex bolt into a hole, bowls stacked in perfectly conforming configurations, and manipulating many objects using actively controlled articulated mechanisms in real time.
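
    At its simplest, penalty contact turns penetration into a spring-damper force. The one-dimensional sketch below shows the idea with semi-implicit Euler integration; all constants are illustrative, and the setting is far simpler than the paper's articulated, frictional one.

      import numpy as np

      # Minimal sketch of penalty-based contact with semi-implicit Euler:
      # a point mass dropped onto the floor y = 0.  Stiffness, damping, and
      # time step are illustrative choices.
      m, g, k, c, dt = 1.0, 9.81, 5000.0, 10.0, 1e-3
      y, v = 1.0, 0.0                       # height and vertical velocity

      for step in range(2000):
          f = -m * g
          if y < 0.0:                       # penetration depth is -y
              f += -k * y - c * v           # penalty spring + damping force
          v += dt * f / m                   # semi-implicit: velocity first...
          y += dt * v                       # ...then position uses the new velocity

      # settles slightly below y = 0; the residual penetration is the penalty artifact
      print(f"final height ~ {y:.4f} m")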

  14. Computational complexity of the landscape II-Cosmological considerations

    NASA Astrophysics Data System (ADS)

    Denef, Frederik; Douglas, Michael R.; Greene, Brian; Zukowski, Claire

    2018-05-01

    We propose a new approach for multiverse analysis based on computational complexity, which leads to a new family of "computational" measure factors. By defining a cosmology as a space-time containing a vacuum with specified properties (for example small cosmological constant) together with rules for how time evolution will produce the vacuum, we can associate global time in a multiverse with clock time on a supercomputer which simulates it. We argue for a principle of "limited computational complexity" governing early universe dynamics as simulated by this supercomputer, which translates to a global measure for regulating the infinities of eternal inflation. The rules for time evolution can be thought of as a search algorithm, whose details should be constrained by a stronger principle of "minimal computational complexity". Unlike previously studied global measures, ours avoids standard equilibrium considerations and the well-known problems of Boltzmann Brains and the youngness paradox. We also give various definitions of the computational complexity of a cosmology, and argue that there are only a few natural complexity classes.

  15. Dual Quaternions as Constraints in 4D-DPM Models for Pose Estimation.

    PubMed

    Martinez-Berti, Enrique; Sánchez-Salmerón, Antonio-José; Ricolfe-Viala, Carlos

    2017-08-19

    The goal of this research work is to improve the accuracy of human pose estimation using the Deformation Part Model (DPM) without increasing computational complexity. First, the proposed method seeks to improve pose estimation accuracy by adding the depth channel to DPM, which was formerly defined based only on red-green-blue (RGB) channels, in order to obtain a four-dimensional DPM (4D-DPM). In addition, computational complexity can be controlled by reducing the number of joints taken into account in a reduced 4D-DPM. Finally, complete solutions are obtained by solving for the omitted joints using inverse kinematics models. In this context, the main goal of this paper is to analyze the effect on pose estimation timing cost when using dual quaternions to solve the inverse kinematics.
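
    A minimal sketch of the underlying algebra follows: a rigid transform as a dual quaternion q = r + eps*d with d = (1/2) t r, composed by dual-quaternion multiplication. The conventions and example transform are illustrative; the paper's coupling to the 4D-DPM is not reproduced here.

      import numpy as np

      # Minimal sketch of dual quaternions for rigid transforms; illustrative only.
      def qmul(a, b):                        # Hamilton product of quaternions (w, x, y, z)
          w1, x1, y1, z1 = a
          w2, x2, y2, z2 = b
          return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                           w1*x2 + x1*w2 + y1*z2 - z1*y2,
                           w1*y2 - x1*z2 + y1*w2 + z1*x2,
                           w1*z2 + x1*y2 - y1*x2 + z1*w2])

      def dq_from_rt(r, t):                  # rotation quaternion r, translation vector t
          d = 0.5 * qmul(np.array([0.0, *t]), r)
          return r, d

      def dq_mul(a, b):                      # (ra + eps*da)(rb + eps*db)
          ra, da = a
          rb, db = b
          return qmul(ra, rb), qmul(ra, db) + qmul(da, rb)

      # Compose a 90-degree rotation about z with a translation along x:
      r90 = np.array([np.cos(np.pi/4), 0.0, 0.0, np.sin(np.pi/4)])
      T1 = dq_from_rt(r90, [0.0, 0.0, 0.0])
      T2 = dq_from_rt(np.array([1.0, 0.0, 0.0, 0.0]), [1.0, 0.0, 0.0])
      r, d = dq_mul(T1, T2)                  # apply T2 first, then T1
      t = 2.0 * qmul(d, np.array([r[0], -r[1], -r[2], -r[3]]))[1:]  # t = 2 d r*
      print("rotation q:", r, " translation:", t)   # translation -> (0, 1, 0)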

  16. Galileo battery testing and the impact of test automation

    NASA Technical Reports Server (NTRS)

    Pertuch, W. T.; Dils, C. T.

    1985-01-01

    Test complexity, changes of test specifications, and the demand for tight control of tests led to the development of automated testing used for Galileo and other projects. The use of standardized interfacing, i.e., IEEE-488, with desktop computers and test instruments, resulted in greater reliability, repeatability, and accuracy of both control and data reporting. Increased flexibility of test programming has reduced costs by permitting a wide spectrum of test requirements at one station rather than many stations.
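
    The modern descendant of such an IEEE-488 setup is a SCPI instrument driven over GPIB from a scripting language. A minimal sketch using the PyVISA library follows; the bus address and commands are hypothetical examples (Galileo-era instruments largely predate SCPI).

      # Minimal sketch of IEEE-488 (GPIB) test automation with PyVISA.
      # The instrument address and SCPI commands are hypothetical examples.
      import pyvisa

      rm = pyvisa.ResourceManager()
      dmm = rm.open_resource("GPIB0::12::INSTR")     # hypothetical bus address

      print(dmm.query("*IDN?"))                      # instrument identification
      dmm.write("CONF:VOLT:DC")                      # configure a DC-voltage reading
      reading = float(dmm.query("READ?"))            # trigger and fetch a measurement
      print(f"battery voltage: {reading:.4f} V")
      dmm.close()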

  17. Federal Aviation Administration Aviation System Capital Investment Plan 1993

    DTIC Science & Technology

    1993-12-01

    Facilitates full use of terminal airspace capacity; increases safety and efficiency. Covers installation of tower control computer complexes (TCCCs) in selected airport traffic control towers, AAS software for terminal and en route ATC, and installation of 40 ASR-9 radars at ASR-4/5/6 sites to provide economical radar service at airports with sufficiently high air traffic densities.

  18. Using an Extended Dynamic Drag-and-Drop Assistive Program to Assist People with Multiple Disabilities and Minimal Motor Control to Improve Computer Drag-and-Drop Ability through a Mouse Wheel

    ERIC Educational Resources Information Center

    Shih, Ching-Hsiang

    2012-01-01

    Software technology is adopted by the current research to improve the Drag-and-Drop abilities of two people with multiple disabilities and minimal motor control. This goal was realized through a Dynamic Drag-and-Drop Assistive Program (DDnDAP) in which the complex dragging process is replaced by simply poking the mouse wheel and clicking. However,…

  19. Flight control application of new stability robustness bounds for linear uncertain systems

    NASA Technical Reports Server (NTRS)

    Yedavalli, Rama K.

    1993-01-01

    This paper addresses the issue of obtaining bounds on the real parameter perturbations of a linear state-space model for robust stability. Based on Kronecker algebra, new, easily computable sufficient bounds are derived that are much less conservative than the existing bounds, since the technique is designed for real parameter perturbations only (in contrast to specializing the complex-variation case to the real-parameter case). The proposed theory is illustrated with application to several flight control examples.

  20. Approximations of Thermoelastic and Viscoelastic Control Systems

    DTIC Science & Technology

    1990-06-01

    parabolic partial differential equations. The development of computational algorithms for designing controllers for such systems is an immensely complex... hereditary differential system on R^r, then approximate the "history" or "memory" term (i.e., the integral term). In this paper we will use a variation, introduced by Fabiano and Ito ([FI]), of the averaging scheme considered by Banks and Burns ([BB]) for the second stage.

  1. The Capabilities of Chaos and Complexity

    PubMed Central

    Abel, David L.

    2009-01-01

    To what degree could chaos and complexity have organized a Peptide or RNA World of crude yet necessarily integrated protometabolism? How far could such protolife evolve in the absence of a heritable linear digital symbol system that could mutate, instruct, regulate, optimize and maintain metabolic homeostasis? To address these questions, chaos, complexity, self-ordered states, and organization must all be carefully defined and distinguished. In addition their cause-and-effect relationships and mechanisms of action must be delineated. Are there any formal (non-physical, abstract, conceptual, algorithmic) components to chaos, complexity, self-ordering and organization, or are they entirely physicodynamic (physical, mass/energy interaction alone)? Chaos and complexity can produce some fascinating self-ordered phenomena. But can spontaneous chaos and complexity steer events and processes toward pragmatic benefit, select function over non-function, optimize algorithms, integrate circuits, produce computational halting, organize processes into formal systems, control and regulate existing systems toward greater efficiency? The question is pursued of whether there might be some yet-to-be discovered new law of biology that will elucidate the derivation of prescriptive information and control. “System” will be rigorously defined. Can a low-informational rapid succession of Prigogine’s dissipative structures self-order into bona fide organization? PMID:19333445

  2. EIAGRID: In-field optimization of seismic data acquisition by real-time subsurface imaging using a remote GRID computing environment.

    NASA Astrophysics Data System (ADS)

    Heilmann, B. Z.; Vallenilla Ferrara, A. M.

    2009-04-01

    The constant growth of contaminated sites, the unsustainable use of natural resources, and, last but not least, the hydrological risk related to extreme meteorological events and increased climate variability are major environmental issues of today. Finding solutions for these complex problems requires an integrated cross-disciplinary approach, providing a unified basis for environmental science and engineering. In computer science, grid computing is emerging worldwide as a formidable tool allowing distributed computation and data management with administratively-distant resources. Utilizing these modern High Performance Computing (HPC) technologies, the GRIDA3 project bundles several applications from different fields of geoscience aiming to support decision making for reasonable and responsible land use and resource management. In this abstract we present a geophysical application called EIAGRID that uses grid computing facilities to perform real-time subsurface imaging by on-the-fly processing of seismic field data and fast optimization of the processing workflow. Even though seismic reflection profiling has a broad application range, spanning from shallow targets a few meters deep to targets at a depth of several kilometers, it is primarily used by the hydrocarbon industry and rarely for environmental purposes. The complexity of data acquisition and processing poses severe problems for environmental and geotechnical engineering: professional seismic processing software is expensive to buy and demands large experience from the user. In-field processing equipment needed for real-time data Quality Control (QC) and immediate optimization of the acquisition parameters is often not available for this kind of study. As a result, the data quality will be suboptimal. In the worst case, a crucial parameter such as receiver spacing, maximum offset, or recording time turns out later to be inappropriate and the complete acquisition campaign has to be repeated. The EIAGRID portal provides an innovative solution to this problem combining state-of-the-art data processing methods and modern remote grid computing technology. In-field processing equipment is replaced by remote access to high-performance grid computing facilities. The latter can be ubiquitously controlled by a user-friendly web-browser interface accessed from the field by any mobile computer using wireless data transmission technology such as UMTS (Universal Mobile Telecommunications System) or HSUPA/HSDPA (High-Speed Uplink/Downlink Packet Access). The complexity of data manipulation and processing, and thus also the time-demanding user interaction, is minimized by a data-driven and highly automated velocity analysis and imaging approach based on the Common-Reflection-Surface (CRS) stack. Furthermore, the huge computing power provided by the grid deployment allows parallel testing of alternative processing sequences and parameter settings, a feature which considerably reduces the turn-around times. A shared data storage using georeferencing tools and data grid technology is under current development. It will allow already accomplished projects to be published, making results, processing workflows and parameter settings available in a transparent and reproducible way. Creating a unified database shared by all users will facilitate complex studies and enable the use of data-crossing techniques to incorporate results of other environmental applications hosted on the GRIDA3 portal.

  3. Design and control of a macro-micro robot for precise force applications

    NASA Technical Reports Server (NTRS)

    Wang, Yulun; Mangaser, Amante; Laby, Keith; Jordan, Steve; Wilson, Jeff

    1993-01-01

    Creating a robot which can delicately interact with its environment has been the goal of much research. Two difficulties in particular have made this goal hard to attain. Control strategies that enable precise force manipulation are difficult to execute in real time because such algorithms have been too computationally complex for available controllers. Also, a robot mechanism which can quickly and precisely execute a force command is difficult to design: actuation joints must be sufficiently stiff, frictionless, and lightweight so that desired torques can be accurately applied. This paper describes a robotic system which is capable of delicate manipulations. A modular high-performance multiprocessor control system was designed to provide sufficient compute power for executing advanced control methods. An 8-degree-of-freedom macro-micro mechanism was constructed to enable accurate tip forces. Control algorithms based on the impedance control method were derived, coded, and load-balanced for maximum execution speed on the multiprocessor system. Delicate force tasks, such as polishing, finishing, cleaning, and deburring, are the target applications of the robot.
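
    The impedance control method referred to above commands forces so that the tip behaves like a programmable mass-spring-damper about the desired trajectory. A one-degree-of-freedom sketch follows; the gains and environment stiffness are illustrative, not those of the macro-micro arm.

      # Minimal 1-DOF sketch of impedance control against a stiff surface.
      # All constants are illustrative choices.
      m, b, k = 1.0, 20.0, 100.0        # target impedance: mass, damping, stiffness
      k_env = 1000.0                    # stiffness of the contacted surface at x = 0
      dt = 1e-3
      x, v = 0.02, 0.0                  # tip starts 2 cm above the surface
      x_des = -0.005                    # commanded slightly "into" the surface

      for step in range(3000):
          f_env = -k_env * x if x < 0.0 else 0.0   # contact force below the surface
          f_cmd = k * (x_des - x) - b * v          # impedance law: spring-damper tracking
          v += dt * (f_cmd + f_env) / m
          x += dt * v

      # the compliant tip applies a gentle, bounded force instead of fighting the surface
      print(f"steady tip position: {x*1000:.3f} mm, contact force: {k_env*max(-x, 0.0):.2f} N")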

  4. Using the stereokinetic effect to convey depth - Computationally efficient depth-from-motion displays

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary K.; Proffitt, Dennis R.

    1992-01-01

    Recent developments in microelectronics have encouraged the use of 3D data bases to create compelling volumetric renderings of graphical objects. However, even with the computational capabilities of current-generation graphical systems, real-time displays of such objects are difficult, particularly when dynamic spatial transformations are involved. In this paper we discuss a type of visual stimulus (the stereokinetic effect display) that is computationally far less complex than a true three-dimensional transformation but yields an equally compelling depth impression, often perceptually indiscriminable from the true spatial transformation. Several possible applications for this technique are discussed (e.g., animating contour maps and air traffic control displays so as to evoke accurate depth percepts).

  5. Computational methods in metabolic engineering for strain design.

    PubMed

    Long, Matthew R; Ong, Wai Kit; Reed, Jennifer L

    2015-08-01

    Metabolic engineering uses genetic approaches to control microbial metabolism to produce desired compounds. Computational tools can identify new biological routes to chemicals and the changes needed in host metabolism to improve chemical production. Recent computational efforts have focused on exploring what compounds can be made biologically using native enzymes, heterologous enzymes, and/or enzymes with broad specificity. Additionally, computational methods have been developed to suggest different types of genetic modifications (e.g. gene deletion/addition or up/down regulation), as well as to suggest strategies meeting different criteria (e.g. high yield, high productivity, or substrate co-utilization). Strategies to improve runtime performance have also been developed, which allow for more complex metabolic engineering strategies to be identified. Future incorporation of kinetic considerations will further improve strain design algorithms. Copyright © 2015 Elsevier Ltd. All rights reserved.
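
    One classic computational building block behind such strain-design methods is flux balance analysis (FBA): maximize a flux of interest subject to steady-state mass balance S v = 0 and flux bounds. The sketch below solves a hypothetical three-reaction toy network.

      import numpy as np
      from scipy.optimize import linprog

      # Minimal sketch of flux balance analysis (FBA) on a hypothetical network:
      #   v0: uptake of A,  v1: A -> B,  v2: B -> product (exported)
      S = np.array([[1.0, -1.0,  0.0],     # metabolite A mass balance
                    [0.0,  1.0, -1.0]])    # metabolite B mass balance
      bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10 units

      # linprog minimizes, so negate the objective to maximize the product flux v2
      res = linprog(c=[0.0, 0.0, -1.0], A_eq=S, b_eq=np.zeros(2), bounds=bounds)
      print("optimal product flux:", -res.fun)     # -> 10.0, limited by uptake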

  6. NGScloud: RNA-seq analysis of non-model species using cloud computing.

    PubMed

    Mora-Márquez, Fernando; Vázquez-Poletti, José Luis; López de Heredia, Unai

    2018-05-03

    RNA-seq analysis usually requires large computing infrastructures. NGScloud is a bioinformatic system developed to analyze RNA-seq data using the cloud computing services of Amazon, which permit access to ad hoc computing infrastructure scaled according to the complexity of the experiment, so that its costs and times can be optimized. The application provides a user-friendly front-end to operate Amazon's hardware resources and to control a workflow of RNA-seq analysis oriented to non-model species, incorporating the cluster concept, which allows parallel runs of common RNA-seq analysis programs in several virtual machines for faster analysis. NGScloud is freely available at https://github.com/GGFHF/NGScloud/. A manual detailing installation and how-to-use instructions is available with the distribution. unai.lopezdeheredia@upm.es.

  7. Undecidability and Irreducibility Conditions for Open-Ended Evolution and Emergence.

    PubMed

    Hernández-Orozco, Santiago; Hernández-Quiroz, Francisco; Zenil, Hector

    2018-01-01

    Is undecidability a requirement for open-ended evolution (OEE)? Using methods derived from algorithmic complexity theory, we propose robust computational definitions of open-ended evolution and the adaptability of computable dynamical systems. Within this framework, we show that decidability imposes absolute limits on the stable growth of complexity in computable dynamical systems. Conversely, systems that exhibit (strong) open-ended evolution must be undecidable, establishing undecidability as a requirement for such systems. Complexity is assessed in terms of three measures: sophistication, coarse sophistication, and busy beaver logical depth. These three complexity measures assign low complexity values to random (incompressible) objects. As time grows, the stated complexity measures allow for the existence of complex states during the evolution of a computable dynamical system. We show, however, that finding these states involves undecidable computations. We conjecture that for similar complexity measures that assign low complexity values, decidability imposes comparable limits on the stable growth of complexity, and that such behavior is necessary for nontrivial evolutionary systems. We show that the undecidability of adapted states imposes novel and unpredictable behavior on the individuals or populations being modeled. Such behavior is irreducible. Finally, we offer an example of a system, first proposed by Chaitin, that exhibits strong OEE.

  8. 77 FR 50726 - Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... Computer Software and Complex Electronics Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear...-1209, ``Software Requirement Specifications for Digital Computer Software and Complex Electronics used... Electronics Engineers (ANSI/IEEE) Standard 830-1998, ``IEEE Recommended Practice for Software Requirements...

  9. The role of the host in a cooperating mainframe and workstation environment, volumes 1 and 2

    NASA Technical Reports Server (NTRS)

    Kusmanoff, Antone; Martin, Nancy L.

    1989-01-01

    In recent years, advancements made in computer systems have prompted a move from centralized computing based on timesharing a large mainframe computer to distributed computing based on a connected set of engineering workstations. A major factor in this advancement is the increased performance and lower cost of engineering workstations. The shift from centralized to distributed computing has led to challenges associated with the residency of application programs within the system. In a combined system of multiple engineering workstations attached to a mainframe host, the question arises of how a system designer should assign applications between the larger mainframe host and the smaller, yet powerful, workstations. The concepts related to real-time data processing are analyzed, and systems are described which use a host mainframe and a number of engineering workstations interconnected by a local area network. In most cases, distributed systems can be classified as having a single function or multiple functions and as executing programs in real time or non-real time. In a system of multiple computers, the degree of autonomy of the computers is important; a system with one master control computer generally differs in reliability, performance, and complexity from a system in which all computers share the control. This research is concerned with generating general criteria and principles for software residency decisions (host or workstation) for a diverse yet coupled group of users (the clustered workstations) which may need the use of a shared resource (the mainframe) to perform their functions.

  10. Structural studies of RNA-protein complexes: A hybrid approach involving hydrodynamics, scattering, and computational methods.

    PubMed

    Patel, Trushar R; Chojnowski, Grzegorz; Astha; Koul, Amit; McKenna, Sean A; Bujnicki, Janusz M

    2017-04-15

    The diverse functional cellular roles played by ribonucleic acids (RNA) have emphasized the need to develop rapid and accurate methodologies to elucidate the relationship between the structure and function of RNA. Structural biology tools such as X-ray crystallography and Nuclear Magnetic Resonance are highly useful methods to obtain atomic-level resolution models of macromolecules. However, both methods have sample, time, and technical limitations that prevent their application to a number of macromolecules of interest. An emerging alternative to high-resolution structural techniques is to employ a hybrid approach that combines low-resolution shape information about macromolecules and their complexes from experimental hydrodynamic (e.g. analytical ultracentrifugation) and solution scattering measurements (e.g., solution X-ray or neutron scattering), with computational modeling to obtain atomic-level models. While promising, scattering methods rely on aggregation-free, monodispersed preparations and therefore the careful development of a quality control pipeline is fundamental to an unbiased and reliable structural determination. This review article describes hydrodynamic techniques that are highly valuable for homogeneity studies, scattering techniques useful to study the low-resolution shape, and strategies for computational modeling to obtain high-resolution 3D structural models of RNAs, proteins, and RNA-protein complexes. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.

  11. Standardised Embedded Data framework for Drones [SEDD]

    NASA Astrophysics Data System (ADS)

    Wyngaard, J.; Barbieri, L.; Peterson, F. S.

    2015-12-01

    A number of barriers to entry remain for UAS use in science. One in particular is that of implementing an experiment- and UAS-specific software stack. Currently this stack is most often developed in-house and customised for a particular UAS-sensor pairing, limiting its reuse. Alternatively, an adaptable commercial package may be used, but such systems are both costly and usually suboptimal. In order to address this challenge, the Standardised Embedded Data framework for Drones [SEDD] is being developed in μpython. SEDD provides an open source, reusable, and scientist-accessible drop-in solution for drone data capture and triage. Targeted at embedded hardware, and offering easy access to standard I/O interfaces, SEDD provides an easy solution for simply capturing data from a sensor. However, the intention is rather to enable more complex systems of multiple sensors, computer hardware, and feedback loops, via three primary components. A data asset manager ensures data assets are associated with appropriate metadata as they are captured. Thereafter, the asset is easily archived or otherwise redirected, possibly to onboard storage, onboard compute resources for processing, an interface for transmission, another sensor control system, remote storage and processing (such as EarthCube's CHORDS), or any combination of the above. A service workflow manager enables easy implementation of complex onboard systems via dedicated control of multiple continuous and periodic services. Such services include the housekeeping chores of operating a UAS and multiple sensors, but also permit a scientist to drop in initial scientific data-processing code utilising onboard compute resources beyond the autopilot. Having such capabilities firstly enables easy creation of real-time feedback, to the human or auto pilot or to other sensors, on data quality or needed flight-path changes. Secondly, compute hardware provides the opportunity to carry out real-time data triage, for the purpose of conserving onboard storage space or transmission bandwidth in inherently poor-connectivity environments. A compute manager is finally included. Depending on system complexity, and given the need for power-efficient parallelism, it can quickly become necessary to provide a scheduling service for multiple workflows.

  12. A review on locomotion robophysics: the study of movement at the intersection of robotics, soft matter and dynamical systems.

    PubMed

    Aguilar, Jeffrey; Zhang, Tingnan; Qian, Feifei; Kingsbury, Mark; McInroe, Benjamin; Mazouchova, Nicole; Li, Chen; Maladen, Ryan; Gong, Chaohui; Travers, Matt; Hatton, Ross L; Choset, Howie; Umbanhowar, Paul B; Goldman, Daniel I

    2016-11-01

    Discovery of fundamental principles which govern and limit effective locomotion (self-propulsion) is of intellectual interest and practical importance. Human technology has created robotic moving systems that excel in movement on and within environments of societal interest: paved roads, open air and water. However, such devices cannot yet robustly and efficiently navigate (as animals do) the enormous diversity of natural environments which might be of future interest for autonomous robots; examples include vertical surfaces like trees and cliffs, heterogeneous ground like desert rubble and brush, turbulent flows found near seashores, and deformable/flowable substrates like sand, mud and soil. In this review we argue for the creation of a physics of moving systems, a 'locomotion robophysics', which we define as the pursuit of principles of self-generated motion. Robophysics can provide an important intellectual complement to the discipline of robotics, largely the domain of researchers from engineering and computer science. The essential idea is that we must complement the study of complex robots in complex situations with systematic study of simplified robotic devices in controlled laboratory settings and in simplified theoretical models. We must thus use the methods of physics to examine both locomotor successes and failures using parameter space exploration, systematic control, and techniques from dynamical systems. Using examples from our and others' research, we will discuss how such robophysical studies have begun to aid engineers in the creation of devices that have begun to achieve life-like locomotor abilities on and within complex environments, have inspired interesting physics questions in low dimensional dynamical systems, geometric mechanics and soft matter physics, and have been useful to develop models for biological locomotion in complex terrain. The rapidly decreasing cost of constructing robot models with easy access to significant computational power bodes well for scientists and engineers to engage in a discipline which can readily integrate experiment, theory and computation.

  13. A review on locomotion robophysics: the study of movement at the intersection of robotics, soft matter and dynamical systems

    NASA Astrophysics Data System (ADS)

    Aguilar, Jeffrey; Zhang, Tingnan; Qian, Feifei; Kingsbury, Mark; McInroe, Benjamin; Mazouchova, Nicole; Li, Chen; Maladen, Ryan; Gong, Chaohui; Travers, Matt; Hatton, Ross L.; Choset, Howie; Umbanhowar, Paul B.; Goldman, Daniel I.

    2016-11-01

    Discovery of fundamental principles which govern and limit effective locomotion (self-propulsion) is of intellectual interest and practical importance. Human technology has created robotic moving systems that excel in movement on and within environments of societal interest: paved roads, open air and water. However, such devices cannot yet robustly and efficiently navigate (as animals do) the enormous diversity of natural environments which might be of future interest for autonomous robots; examples include vertical surfaces like trees and cliffs, heterogeneous ground like desert rubble and brush, turbulent flows found near seashores, and deformable/flowable substrates like sand, mud and soil. In this review we argue for the creation of a physics of moving systems—a ‘locomotion robophysics’—which we define as the pursuit of principles of self-generated motion. Robophysics can provide an important intellectual complement to the discipline of robotics, largely the domain of researchers from engineering and computer science. The essential idea is that we must complement the study of complex robots in complex situations with systematic study of simplified robotic devices in controlled laboratory settings and in simplified theoretical models. We must thus use the methods of physics to examine both locomotor successes and failures using parameter space exploration, systematic control, and techniques from dynamical systems. Using examples from our and others’ research, we will discuss how such robophysical studies have begun to aid engineers in the creation of devices that have begun to achieve life-like locomotor abilities on and within complex environments, have inspired interesting physics questions in low dimensional dynamical systems, geometric mechanics and soft matter physics, and have been useful to develop models for biological locomotion in complex terrain. The rapidly decreasing cost of constructing robot models with easy access to significant computational power bodes well for scientists and engineers to engage in a discipline which can readily integrate experiment, theory and computation.

  14. The receptive field is dead. Long live the receptive field?

    PubMed Central

    Fairhall, Adrienne

    2014-01-01

    Advances in experimental techniques, including behavioral paradigms using rich stimuli under closed loop conditions and the interfacing of neural systems with external inputs and outputs, reveal complex dynamics in the neural code and require a revisiting of standard concepts of representation. High-throughput recording and imaging methods along with the ability to observe and control neuronal subpopulations allow increasingly detailed access to the neural circuitry that subserves these representations and the computations they support. How do we harness theory to build biologically grounded models of complex neural function? PMID:24618227

  15. Considerations In The Design And Specifications Of An Automatic Inspection System

    NASA Astrophysics Data System (ADS)

    Lee, David T.

    1980-05-01

    Considerable activity has centered on the automation of manufacturing quality control and inspection functions. Several reasons can be cited for this development. The continuous pressure of direct and indirect labor cost increases is only one of the obvious motivations. With the drive for electronics miniaturization come more and more complex processes where control parameters are critical and the yield is highly susceptible to inadequate process monitoring and inspection. With multi-step, multi-layer processes for substrate fabrication, process defects that are not detected and corrected at certain critical points may render the entire subassembly useless. As a process becomes more complex, the time required to test the product increases significantly within the total build cycle. The urgency to reduce test time brings more pressure to improve in-process control and inspection. The advances and improvements of components, assemblies, and systems such as micro-processors, micro-computers, programmable controllers, and other intelligent devices have made the automation of quality control much more cost effective and justifiable.

  16. Analysis of a display and control system man-machine interface concept. Volume 1: Final technical report

    NASA Technical Reports Server (NTRS)

    Karl, D. R.

    1972-01-01

    An evaluation was made of the feasibility of utilizing a simplified man-machine interface concept to manage and control a complex space system involving multiple redundant computers that control multiple redundant subsystems. The concept involves the use of a CRT for display and a simple keyboard for control, with a tree-type control logic for accessing and controlling mission, systems, and subsystem elements. The concept was evaluated in terms of the Phase B space shuttle orbiter, to utilize the wide scope of data management and subsystem control inherent in the central data management subsystem provided by the Phase B design philosophy. Results of these investigations are reported in four volumes.

  17. Quantum computational complexity, Einstein's equations and accelerated expansion of the Universe

    NASA Astrophysics Data System (ADS)

    Ge, Xian-Hui; Wang, Bin

    2018-02-01

    We study the relation between quantum computational complexity and general relativity. The quantum computational complexity is proposed to be quantified by the shortest length of geodesic quantum curves. We examine the complexity/volume duality in a geodesic causal ball in the framework of Fermi normal coordinates and derive the full non-linear Einstein equation. Using insights from the complexity/action duality, we argue that the accelerated expansion of the universe could be driven by quantum complexity, free from the coincidence and fine-tuning problems.
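
    For reference, the two holographic proposals the abstract draws on are commonly written as follows; these are the standard forms from the literature, not necessarily the paper's own notation:

      $$\mathcal{C}_V = \max_{\partial B = \Sigma} \frac{V(B)}{G_N\,\ell}, \qquad \mathcal{C}_A = \frac{I_{\mathrm{WDW}}}{\pi\hbar},$$

    where $V(B)$ is the volume of the maximal bulk slice anchored to the boundary time slice $\Sigma$, $\ell$ is a bulk length scale, and $I_{\mathrm{WDW}}$ is the gravitational action of the Wheeler-DeWitt patch.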

  18. A Study of the Use of Ontologies for Building Computer-Aided Control Engineering Self-Learning Educational Software

    NASA Astrophysics Data System (ADS)

    García, Isaías; Benavides, Carmen; Alaiz, Héctor; Alonso, Angel

    2013-08-01

    This paper describes research on the use of knowledge models (ontologies) for building computer-aided educational software in the field of control engineering. Ontologies are able to represent in the computer a very rich conceptual model of a given domain. This model can later be used for a number of purposes in different software applications. In this study, a domain ontology for the field of lead-lag compensator design was built and used for automatic exercise generation, graphical user interface population, and interaction with the user at any level of detail, including explanations of why things occur. An application called Onto-CELE (ontology-based control engineering learning environment) uses the ontology to implement a learning environment for self and lifelong learning purposes. The experience has shown that knowledge models used as the basis for educational software can show students the whole complexity of the analysis and design processes at any level of detail. A practical experience with postgraduate students confirmed these benefits and the possibilities of the approach.
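
    Since the ontology's exercise domain is lead-lag compensator design, a compact Python illustration of that domain follows; the gain and corner frequencies are arbitrary example values, not ones generated by Onto-CELE.

      import numpy as np

      def lead_lag(K, z, p):
          """C(s) = K (s + z) / (s + p): phase lead if z < p, lag if z > p."""
          return lambda w: K * (1j * w + z) / (1j * w + p)

      C = lead_lag(K=10.0, z=1.0, p=10.0)     # an example lead compensator
      w = np.logspace(-2, 3, 500)             # frequency grid, rad/s
      phase_deg = np.degrees(np.angle(C(w)))
      print("max phase lead: %.1f deg at w = %.2f rad/s"
            % (phase_deg.max(), w[np.argmax(phase_deg)]))
      # Peak lead occurs at w = sqrt(z*p), here ~3.16 rad/s with ~55 degrees.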

  19. Computer tomography urography assisted real-time ultrasound-guided percutaneous nephrolithotomy on renal calculus.

    PubMed

    Fang, You-Qiang; Wu, Jie-Ying; Li, Teng-Cheng; Zheng, Hao-Feng; Liang, Guan-Can; Chen, Yan-Xiong; Hong, Xiao-Bin; Cai, Wei-Zhong; Zang, Zhi-Jun; Di, Jin-Ming

    2017-06-01

    This study aimed to assess the role of a pre-designed route on computer tomography urography (CTU) in ultrasound-guided percutaneous nephrolithotomy (PCNL) for renal calculus. From August 2013 to May 2016, a total of 100 patients diagnosed with complex renal calculus in our hospital were randomly divided into a CTU group and a control group (without CTU assistance). CTU was used to design a rational route for puncturing in the CTU group. Ultrasound was used in both groups to establish a working trace in the operation areas. Patients' perioperative parameters and postoperative complications were recorded. All operations were performed successfully, without conversion to open surgery. Time of channel establishment in the CTU group (6.5 ± 4.3 minutes) was shorter than in the control group (10.0 ± 6.7 minutes) (P = .002). In addition, the CTU group had shorter operation times, lower rates of blood transfusion and secondary operation, and fewer access channels. The incidence of postoperative complications, including residual stones, sepsis, severe hemorrhage, and perirenal hematoma, was lower in the CTU group than in the control group. Pre-designing the puncture route on CTU images improves puncturing accuracy, reduces the number of access channels, and improves the safety of ultrasound-guided PCNL for complex renal calculus, but at the cost of increased radiation exposure.

  20. Health technology assessment review: Computerized glucose regulation in the intensive care unit - how to create artificial control

    PubMed Central

    2009-01-01

    Current care guidelines recommend glucose control (GC) in critically ill patients. To achieve GC, many ICUs have implemented a (nurse-based) protocol on paper. However, such protocols are often complex, time-consuming, and can cause iatrogenic hypoglycemia. Computerized glucose regulation protocols may improve patient safety, efficiency, and nurse compliance. Such computerized clinical decision support systems (CDSSs) use more complex logic to provide an insulin infusion rate based on previous blood glucose levels and other parameters. A computerized CDSS for glucose control has the potential to reduce overall workload, reduce the chance of human cognitive failure, and improve glucose control. Several computer-assisted glucose regulation programs have been published recently. In order of increasing complexity, the three main types of algorithms used are computerized flowcharts, Proportional-Integral-Derivative (PID) control, and Model Predictive Control (MPC). PID is essentially a closed-loop feedback system, whereas MPC models the behavior of glucose and insulin in ICU patients. Although the best approach has not yet been determined, it should be noted that PID controllers are generally thought to be more robust than MPC systems. The computerized CDSSs most likely to emerge are those that are fully a part of the routine workflow, use patient-specific characteristics, and apply variable sampling intervals. PMID:19849827
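
    A minimal sketch of the PID flavor of such a CDSS follows; the gains, the 140 mg/dL target, and the rate clamp are invented for illustration and are not clinical values.

      class GlucosePID:
          """Discrete PID loop mapping blood-glucose error to an insulin rate."""
          def __init__(self, kp, ki, kd, target=140.0, u_max=20.0):
              self.kp, self.ki, self.kd = kp, ki, kd
              self.target = target        # mg/dL, illustrative only
              self.u_max = u_max          # U/h clamp, illustrative only
              self.integral = 0.0
              self.prev_error = None

          def update(self, glucose, dt):
              error = glucose - self.target   # positive when hyperglycemic
              self.integral += error * dt
              deriv = 0.0 if self.prev_error is None else \
                  (error - self.prev_error) / dt
              self.prev_error = error
              u = self.kp * error + self.ki * self.integral + self.kd * deriv
              return min(max(u, 0.0), self.u_max)  # no negative insulin

      pid = GlucosePID(kp=0.02, ki=0.001, kd=0.05)
      for bg in (240, 210, 185, 160):          # hourly samples, mg/dL
          print("BG %d -> %.2f U/h" % (bg, pid.update(bg, dt=1.0)))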

  1. Cloud Computing for Complex Performance Codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Appel, Gordon John; Hadgu, Teklu; Klein, Brandon Thorin

    This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 demonstrated that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure. Phase 2 focused on proving that non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.

  2. Geometry definition and grid generation for a complete fighter aircraft

    NASA Technical Reports Server (NTRS)

    Edwards, T. A.

    1986-01-01

    Recent advances in computing power and numerical solution procedures have enabled computational fluid dynamicists to attempt increasingly difficult problems. In particular, efforts are focusing on computations of complex three-dimensional flow fields about realistic aerodynamic bodies. To perform such computations, a very accurate and detailed description of the surface geometry must be provided, and a three-dimensional grid must be generated in the space around the body. The geometry must be supplied in a format compatible with the grid generation requirements, and must be verified to be free of inconsistencies. This paper presents a procedure for performing the geometry definition of a fighter aircraft that makes use of a commercial computer-aided design/computer-aided manufacturing system. Furthermore, visual representations of the geometry are generated using a computer graphics system for verification of the body definition. Finally, the three-dimensional grids for fighter-like aircraft are generated by means of an efficient new parabolic grid generation method. This method exhibits good control of grid quality.

  3. Geometry definition and grid generation for a complete fighter aircraft

    NASA Technical Reports Server (NTRS)

    Edwards, Thomas A.

    1986-01-01

    Recent advances in computing power and numerical solution procedures have enabled computational fluid dynamicists to attempt increasingly difficult problems. In particular, efforts are focusing on computations of complex three-dimensional flow fields about realistic aerodynamic bodies. To perform such computations, a very accurate and detailed description of the surface geometry must be provided, and a three-dimensional grid must be generated in the space around the body. The geometry must be supplied in a format compatible with the grid generation requirements, and must be verified to be free of inconsistencies. A procedure for performing the geometry definition of a fighter aircraft that makes use of a commercial computer-aided design/computer-aided manufacturing system is presented. Furthermore, visual representations of the geometry are generated using a computer graphics system for verification of the body definition. Finally, the three-dimensional grids for fighter-like aircraft are generated by means of an efficient new parabolic grid generation method. This method exhibits good control of grid quality.

  4. Dimensionality of visual complexity in computer graphics scenes

    NASA Astrophysics Data System (ADS)

    Ramanarayanan, Ganesh; Bala, Kavita; Ferwerda, James A.; Walter, Bruce

    2008-02-01

    How do human observers perceive visual complexity in images? This problem is especially relevant for computer graphics, where a better understanding of visual complexity can aid in the development of more advanced rendering algorithms. In this paper, we describe a study of the dimensionality of visual complexity in computer graphics scenes. We conducted an experiment where subjects judged the relative complexity of 21 high-resolution scenes, rendered with photorealistic methods. Scenes were gathered from web archives and varied in theme, number and layout of objects, material properties, and lighting. We analyzed the pooled subject responses using multidimensional scaling. This analysis embedded the stimulus images in a two-dimensional space, with axes that roughly corresponded to "numerosity" and "material / lighting complexity". In a follow-up analysis, we derived a one-dimensional complexity ordering of the stimulus images. We compared this ordering with several computable complexity metrics, such as scene polygon count and JPEG compression size, and found them to be only weakly correlated with it. Understanding the differences between these measures can lead to the design of more efficient rendering algorithms in computer graphics.
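
    One of the computable metrics mentioned, JPEG compression size, is easy to reproduce; this sketch assumes the Pillow library and hypothetical image files.

      import io
      from PIL import Image  # requires Pillow

      def jpeg_complexity(path, quality=75):
          """Bytes needed to JPEG-encode the image, a crude complexity proxy."""
          buf = io.BytesIO()
          Image.open(path).convert("RGB").save(buf, format="JPEG", quality=quality)
          return buf.tell()

      # Rank scenes by the metric; per the study, expect only weak agreement
      # with human complexity judgments.
      scenes = ["scene_a.png", "scene_b.png"]   # hypothetical files
      print(sorted(scenes, key=jpeg_complexity))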

  5. Quasi-analytical treatment of spatially averaged radiation transfer in complex terrain

    NASA Astrophysics Data System (ADS)

    Löwe, H.; Helbig, N.

    2012-10-01

    We provide a new quasi-analytical method to compute the subgrid topographic influences on the shortwave radiation fluxes and the effective albedo in complex terrain as required for large-scale meteorological, land surface, or climate models. We investigate radiative transfer in complex terrain via the radiosity equation on isotropic Gaussian random fields. Under controlled approximations we derive expressions for domain-averaged fluxes of direct, diffuse, and terrain radiation and the sky view factor. Domain-averaged quantities can be related to a type of level-crossing probability of the random field, which is approximated by long-standing results developed for acoustic scattering at ocean boundaries. This allows us to express all nonlocal horizon effects in terms of a local terrain parameter, namely, the mean-square slope. Emerging integrals are computed numerically, and fit formulas are given for practical purposes. As an implication of our approach, we provide an expression for the effective albedo of complex terrain in terms of the Sun elevation angle, mean-square slope, the area-averaged surface albedo, and the ratio of atmospheric direct beam to diffuse radiation. For demonstration we compute the decrease of the effective albedo relative to the area-averaged albedo in Switzerland for idealized snow-covered and clear-sky conditions at noon in winter. We find an average decrease of 5.8% and spatial patterns which originate from characteristics of the underlying relief. Limitations and possible generalizations of the method are discussed.
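
    For orientation, the simplest local approximation that such schemes generalize is the sky view factor of a single uniform slope of inclination $\beta$,

      $$F_{\mathrm{sky}} \approx \frac{1 + \cos\beta}{2},$$

    whereas the domain-averaged analogue derived in the paper is parameterized by the mean-square slope of the Gaussian relief.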

  6. Guest Editorial High Performance Computing (HPC) Applications for a More Resilient and Efficient Power Grid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Zhenyu Henry; Tate, Zeb; Abhyankar, Shrirang

    The power grid has been evolving over the last 120 years, but it is seeing more changes in this decade and next than it has seen over the past century. In particular, the widespread deployment of intermittent renewable generation, smart loads and devices, hierarchical and distributed control technologies, phasor measurement units, energy storage, and widespread usage of electric vehicles will require fundamental changes in methods and tools for the operation and planning of the power grid. The resulting new dynamic and stochastic behaviors will demand the inclusion of more complexity in modeling the power grid. Solving such complex models in the traditional computing environment will be a major challenge. Along with the increasing complexity of power system models, the increasing complexity of smart grid data further adds to the prevailing challenges. In this environment, as the myriad smart sensors and meters in the power grid increase by multiple orders of magnitude, so do the volume and speed of the data. The information infrastructure will need to change drastically to support the exchange of enormous amounts of data, as smart grid applications will need the capability to collect, assimilate, analyze, and process the data to meet real-time grid functions. High performance computing (HPC) holds the promise to enhance these functions, but it is a great resource that has not been fully explored and adopted for the power grid domain.

  7. Integrative Utilization of Microenvironments, Biomaterials and Computational Techniques for Advanced Tissue Engineering.

    PubMed

    Shamloo, Amir; Mohammadaliha, Negar; Mohseni, Mina

    2015-10-20

    This review aims to propose the integrative implementation of microfluidic devices, biomaterials, and computational methods that can lead to significant progress in tissue engineering and regenerative medicine research. Simultaneous implementation of multiple techniques can be very helpful in addressing biological processes. Providing controllable biochemical and biomechanical cues within an artificial extracellular matrix similar to in vivo conditions is crucial in tissue engineering and regenerative medicine research. Microfluidic devices provide precise spatial and temporal control over the cell microenvironment. Moreover, generation of accurate and controllable spatial and temporal gradients of biochemical factors is attainable inside microdevices. Since biomaterials with tunable properties are a worthwhile option to construct artificial extracellular matrix, in vitro platforms that simultaneously utilize natural, synthetic, or engineered biomaterials inside microfluidic devices are phenomenally advantageous to experimental studies in the field of tissue engineering. Additionally, collaboration between experimental and computational methods is a useful way to predict and understand mechanisms responsible for complex biological phenomena. Computational results can be verified by using experimental platforms. Computational methods can also broaden the understanding of the mechanisms behind the biological phenomena observed during experiments. Furthermore, computational methods are powerful tools to optimize the fabrication of microfluidic devices and biomaterials with specific features. Here we present a succinct review of the benefits of microfluidic devices, biomaterials, and computational methods in tissue engineering and regenerative medicine. Furthermore, some breakthroughs in biological phenomena, including neuronal axon development, cancerous cell migration, and blood vessel formation via angiogenesis, by virtue of the aforementioned approaches are discussed.

  8. Tutoring at a Distance: Modelling as a Tool to Control Chaos

    ERIC Educational Resources Information Center

    Bertin, Jean-Claude; Narcy-Combes, Jean-Paul

    2012-01-01

    This article builds on a previous article published in 2007, which aimed at clarifying the concept of tutoring. Based on a new epistemological stance (emergentism) the authors will here show how the various components of the computer-assisted language learning situation form a complex chaotic system. They advocate that modelling is a way of…

  9. Automation of checkout for the shuttle operations era

    NASA Technical Reports Server (NTRS)

    Anderson, J. A.; Hendrickson, K. O.

    1985-01-01

    The Space Shuttle checkout differs from that of its Apollo predecessor. The complexity of the hardware, the shortened turnaround time, and the software that performs ground checkout are outlined. New techniques and standards for software development, and the management structure to control them, are described. The utilization of computer systems for vehicle testing is highlighted.

  10. Computations of Aerodynamic Performance Databases Using Output-Based Refinement

    NASA Technical Reports Server (NTRS)

    Nemec, Marian; Aftosmis, Michael J.

    2009-01-01

    Objectives: handle complex geometry problems; control discretization errors via solution-adaptive mesh refinement; and focus on aerodynamic databases for parametric and optimization studies, which demand (1) accuracy: satisfy prescribed error bounds; (2) robustness and speed: may require over 10^5 mesh generations; and (3) automation: avoid user supervision, obtain "expert meshes" independent of user skill, and run every case adaptively in production settings.

  11. Password Complexity Recommendations: xezandpAxat8Um or P4$$w0rd!!!!

    DTIC Science & Technology

    2014-10-01

    …have we seen the computer screen with fast-scrolling characters, with good answers being indicated one by one? This is not a MasterMind game! [Remainder of record garbled in extraction; it cites, among others, …security/2013/05/how-crackers-make-minced-meat-out-of-your-passwords (Access Date: 2014-04-02).]

  12. A source-controlled data center network model.

    PubMed

    Yu, Yang; Liang, Mangui; Wang, Zhe

    2017-01-01

    The construction of data center networks using SDN technology has become a hot research topic. The SDN architecture innovatively separates the control plane from the data plane, which makes the network more software-oriented and agile. Moreover, it provides virtual multi-tenancy, effective resource scheduling, and centralized control strategies to meet the demands of cloud computing data centers. However, the explosion of network information poses severe challenges for the SDN controller. Flow storage and lookup mechanisms based on TCAM devices have led to restricted scalability, high cost, and high energy consumption. In view of this, a source-controlled data center network (SCDCN) model is proposed herein. The SCDCN model applies a new type of source routing address named the vector address (VA) as the packet-switching label. The VA completely defines the communication path, and the data forwarding process can be completed relying solely on the VA. There are four advantages in the SCDCN architecture. 1) The model adopts hierarchical multi-controllers and abstracts the large-scale data center network into small network domains, which removes the restriction imposed by the processing ability of a single controller and reduces the computational complexity. 2) Vector switches (VS) developed in the core network no longer apply TCAM for table storage and lookup, which significantly cuts down the cost and complexity of switches. Meanwhile, the problem of scalability can be solved effectively. 3) The SCDCN model simplifies the establishment process for new flows, and there is no need to download flow tables to the VS. The amount of control signaling consumed when establishing new flows can be significantly decreased. 4) We design the VS on the NetFPGA platform. The statistical results show that the hardware resource consumption in a VS is about 27% of that in an OFS.

  13. A source-controlled data center network model

    PubMed Central

    Yu, Yang; Liang, Mangui; Wang, Zhe

    2017-01-01

    The construction of data center networks using SDN technology has become a hot research topic. The SDN architecture innovatively separates the control plane from the data plane, which makes the network more software-oriented and agile. Moreover, it provides virtual multi-tenancy, effective resource scheduling, and centralized control strategies to meet the demands of cloud computing data centers. However, the explosion of network information poses severe challenges for the SDN controller. Flow storage and lookup mechanisms based on TCAM devices have led to restricted scalability, high cost, and high energy consumption. In view of this, a source-controlled data center network (SCDCN) model is proposed herein. The SCDCN model applies a new type of source routing address named the vector address (VA) as the packet-switching label. The VA completely defines the communication path, and the data forwarding process can be completed relying solely on the VA. There are four advantages in the SCDCN architecture. 1) The model adopts hierarchical multi-controllers and abstracts the large-scale data center network into small network domains, which removes the restriction imposed by the processing ability of a single controller and reduces the computational complexity. 2) Vector switches (VS) developed in the core network no longer apply TCAM for table storage and lookup, which significantly cuts down the cost and complexity of switches. Meanwhile, the problem of scalability can be solved effectively. 3) The SCDCN model simplifies the establishment process for new flows, and there is no need to download flow tables to the VS. The amount of control signaling consumed when establishing new flows can be significantly decreased. 4) We design the VS on the NetFPGA platform. The statistical results show that the hardware resource consumption in a VS is about 27% of that in an OFS. PMID:28328925
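
    Because the vector address fully encodes the path, forwarding requires no table lookup at all. The following toy simulation of that idea uses invented switch and port names and is not the authors' implementation.

      # Toy source routing: the packet's vector address (VA) is the list of
      # output ports to take at each hop, so switches keep no flow tables.
      class VectorSwitch:
          def __init__(self, name, ports):
              self.name = name
              self.ports = ports            # port index -> next switch or host

          def forward(self, packet):
              va, hop = packet["va"], packet["hop"]
              if hop == len(va):            # VA exhausted: packet has arrived
                  print("delivered at", self.name, ":", packet["data"])
                  return
              packet["hop"] += 1
              self.ports[va[hop]].forward(packet)

      host = VectorSwitch("host-B", {})
      edge = VectorSwitch("edge-2", {0: host})
      core = VectorSwitch("core-1", {0: edge})
      ingress = VectorSwitch("edge-1", {1: core})

      # The (hypothetical) controller computed VA = [1, 0, 0] for this path.
      ingress.forward({"va": [1, 0, 0], "hop": 0, "data": "hello"})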

  14. A distributed, graphical user interface based, computer control system for atomic physics experiments

    NASA Astrophysics Data System (ADS)

    Keshet, Aviv; Ketterle, Wolfgang

    2013-01-01

    Atomic physics experiments often require a complex sequence of precisely timed computer controlled events. This paper describes a distributed graphical user interface-based control system designed with such experiments in mind, which makes use of off-the-shelf output hardware from National Instruments. The software makes use of a client-server separation between a user interface for sequence design and a set of output hardware servers. Output hardware servers are designed to use standard National Instruments output cards, but the client-server nature should allow this to be extended to other output hardware. Output sequences running on multiple servers and output cards can be synchronized using a shared clock. By using a field programmable gate array-generated variable frequency clock, redundant buffers can be dramatically shortened, and a time resolution of 100 ns achieved over effectively arbitrary sequence lengths.

  15. A distributed, graphical user interface based, computer control system for atomic physics experiments.

    PubMed

    Keshet, Aviv; Ketterle, Wolfgang

    2013-01-01

    Atomic physics experiments often require a complex sequence of precisely timed computer controlled events. This paper describes a distributed graphical user interface-based control system designed with such experiments in mind, which makes use of off-the-shelf output hardware from National Instruments. The software makes use of a client-server separation between a user interface for sequence design and a set of output hardware servers. Output hardware servers are designed to use standard National Instruments output cards, but the client-server nature should allow this to be extended to other output hardware. Output sequences running on multiple servers and output cards can be synchronized using a shared clock. By using a field programmable gate array-generated variable frequency clock, redundant buffers can be dramatically shortened, and a time resolution of 100 ns achieved over effectively arbitrary sequence lengths.

  16. Emulating weak localization using a solid-state quantum circuit.

    PubMed

    Chen, Yu; Roushan, P; Sank, D; Neill, C; Lucero, Erik; Mariantoni, Matteo; Barends, R; Chiaro, B; Kelly, J; Megrant, A; Mutus, J Y; O'Malley, P J J; Vainsencher, A; Wenner, J; White, T C; Yin, Yi; Cleland, A N; Martinis, John M

    2014-10-14

    Quantum interference is one of the most fundamental physical effects found in nature. Recent advances in quantum computing now employ interference as a fundamental resource for computation and control. Quantum interference also lies at the heart of sophisticated condensed matter phenomena such as Anderson localization, phenomena that are difficult to reproduce in numerical simulations. Here, employing a multiple-element superconducting quantum circuit, with which we manipulate a single microwave photon, we demonstrate that we can emulate the basic effects of weak localization. By engineering the control sequence, we are able to reproduce the well-known negative magnetoresistance of weak localization as well as its temperature dependence. Furthermore, we can use our circuit to continuously tune the level of disorder, a parameter that is not readily accessible in mesoscopic systems. Demonstrating a high level of control, our experiment shows the potential for employing superconducting quantum circuits as emulators for complex quantum phenomena.

  17. A programmable computational image sensor for high-speed vision

    NASA Astrophysics Data System (ADS)

    Yang, Jie; Shi, Cong; Long, Xitian; Wu, Nanjian

    2013-08-01

    In this paper we present a programmable computational image sensor for high-speed vision. This computational image sensor contains four main blocks: an image pixel array, a massively parallel processing element (PE) array, a row processor (RP) array, and a RISC core. The pixel-parallel PE array is responsible for transferring, storing, and processing raw image data in a SIMD fashion with its own programming language. The RPs are a one-dimensional array of simplified RISC cores that can carry out complex arithmetic and logic operations. The PE array and RP array can complete a great amount of computation in few instruction cycles and therefore satisfy low- and middle-level high-speed image processing requirements. The RISC core controls the whole system operation and executes some high-level image processing algorithms. We utilize a simplified AHB bus as the system bus to connect the major components. A programming language and corresponding tool chain for this computational image sensor have also been developed.

  18. Experimental, Theoretical, and Computational Investigation of Separated Nozzle Flows

    NASA Technical Reports Server (NTRS)

    Hunter, Craig A.

    2004-01-01

    A detailed experimental, theoretical, and computational study of separated nozzle flows has been conducted. Experimental testing was performed at the NASA Langley 16-Foot Transonic Tunnel Complex. As part of a comprehensive static performance investigation, force, moment, and pressure measurements were made and schlieren flow visualization was obtained for a sub-scale, non-axisymmetric, two-dimensional, convergent- divergent nozzle. In addition, two-dimensional numerical simulations were run using the computational fluid dynamics code PAB3D with two-equation turbulence closure and algebraic Reynolds stress modeling. For reference, experimental and computational results were compared with theoretical predictions based on one-dimensional gas dynamics and an approximate integral momentum boundary layer method. Experimental results from this study indicate that off-design overexpanded nozzle flow was dominated by shock induced boundary layer separation, which was divided into two distinct flow regimes; three- dimensional separation with partial reattachment, and fully detached two-dimensional separation. The test nozzle was observed to go through a marked transition in passing from one regime to the other. In all cases, separation provided a significant increase in static thrust efficiency compared to the ideal prediction. Results indicate that with controlled separation, the entire overexpanded range of nozzle performance would be within 10% of the peak thrust efficiency. By offering savings in weight and complexity over a conventional mechanical exhaust system, this may allow a fixed geometry nozzle to cover an entire flight envelope. The computational simulation was in excellent agreement with experimental data over most of the test range, and did a good job of modeling internal flow and thrust performance. An exception occurred at low nozzle pressure ratios, where the two-dimensional computational model was inconsistent with the three-dimensional separation observed in the experiment. In general, the computation captured the physics of the shock boundary layer interaction and shock induced boundary layer separation in the nozzle, though there were some differences in shock structure compared to experiment. Though minor, these differences could be important for studies involving flow control or thrust vectoring of separated nozzles. Combined with other observations, this indicates that more detailed, three-dimensional computational modeling needs to be conducted to more realistically simulate shock-separated nozzle flows.

  19. Mission Control Center (MCC) system specification for the shuttle Orbital Flight Test (OFT) timeframe

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The Mission Control Center (MCC) Shuttle Orbital Flight Test (OFT) Data System (OFTDS) provides facilities for flight control and data systems personnel to monitor and control Shuttle flights from launch (tower clear) to rollout (wheels stopped on runway). It also supports preparation for flight (flight planning, flight controller and crew training, and integrated vehicle and network testing activities). The MCC Shuttle OFTDS is described in detail. Three major support systems of the OFTDS, and the data types and sources of data entering or exiting the MCC, are illustrated. These systems are the communication interface system, the data computation complex, and the display and control system.

  20. A Study of Complex Deep Learning Networks on High Performance, Neuromorphic, and Quantum Computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Potok, Thomas E; Schuman, Catherine D; Young, Steven R

    Current Deep Learning models use highly optimized convolutional neural networks (CNN) trained on large graphical processing units (GPU)-based computers with a fairly simple layered network topology, i.e., highly connected layers, without intra-layer connections. Complex topologies have been proposed, but are intractable to train on current systems. Building the topologies of the deep learning network requires hand tuning, and implementing the network in hardware is expensive in both cost and power. In this paper, we evaluate deep learning models using three different computing architectures to address these problems: quantum computing to train complex topologies, high performance computing (HPC) to automatically determine network topology, and neuromorphic computing for a low-power hardware implementation. Due to input size limitations of current quantum computers we use the MNIST dataset for our evaluation. The results show the possibility of using the three architectures in tandem to explore complex deep learning networks that are untrainable using a von Neumann architecture. We show that a quantum computer can find high quality values of intra-layer connections and weights, while yielding a tractable time result as the complexity of the network increases; a high performance computer can find optimal layer-based topologies; and a neuromorphic computer can represent the complex topology and weights derived from the other architectures in low power memristive hardware. This represents a new capability that is not feasible with current von Neumann architecture. It potentially enables the ability to solve very complicated problems unsolvable with current computing technologies.

  1. Comparing Virtual and Physical Robotics Environments for Supporting Complex Systems and Computational Thinking

    ERIC Educational Resources Information Center

    Berland, Matthew; Wilensky, Uri

    2015-01-01

    Both complex systems methods (such as agent-based modeling) and computational methods (such as programming) provide powerful ways for students to understand new phenomena. To understand how to effectively teach complex systems and computational content to younger students, we conducted a study in four urban middle school classrooms comparing…

  2. Dynamic Detection of Malicious Code in COTS Software

    DTIC Science & Technology

    2000-04-01

    [Record garbled in extraction. Recoverable content: the report runs documented hostile Java applets and ActiveX controls against detection tools, with a results table comparing tools such as eSafe Protect Desktop (9/9 hostile applets blocked; 13/17 in a second suite) and Surfinshield Online (9/9 blocked). It notes that Exploder is an ActiveX control that performs a clean shutdown of your computer, with an interface that is attractive, although rather complex.]

  3. Input-output oriented computation algorithms for the control of large flexible structures

    NASA Technical Reports Server (NTRS)

    Minto, K. D.

    1989-01-01

    An overview is given of work in progress aimed at developing computational algorithms addressing two important aspects in the control of large flexible space structures; namely, the selection and placement of sensors and actuators, and the resulting multivariable control law design problem. The issue of sensor/actuator set selection is particularly crucial to obtaining a satisfactory control design, as clearly a poor choice will inherently limit the degree to which good control can be achieved. With regard to control law design, the researchers are driven by concerns stemming from the practical issues associated with eventual implementation of multivariable control laws, such as reliability, limit protection, multimode operation, sampling rate selection, processor throughput, etc. Naturally, the burden imposed by dealing with these aspects of the problem can be reduced by ensuring that the complexity of the compensator is minimized. Our approach to these problems is based on extensions to input/output oriented techniques that have proven useful in the design of multivariable control systems for aircraft engines. In particular, researchers are exploring the use of relative gain analysis and the condition number as a means of quantifying the process of sensor/actuator selection and placement for shape control of a large space platform.
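
    Of the quantitative tools named above, the relative gain array (RGA) is simple to state: it is the elementwise product of the steady-state gain matrix with the transpose of its inverse. A short sketch with an invented gain matrix:

      import numpy as np

      def rga(G):
          """Relative gain array: G elementwise-times transpose(inv(G))."""
          return G * np.linalg.inv(G).T

      # Hypothetical 2x2 gain matrix for two actuator/sensor pairings.
      G = np.array([[1.0, 0.4],
                    [0.3, 1.2]])
      print(rga(G))  # rows and columns each sum to 1; entries near 1
                     # suggest weakly interacting pairings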

  4. A resource management architecture based on complex network theory in cloud computing federation

    NASA Astrophysics Data System (ADS)

    Zhang, Zehua; Zhang, Xuejie

    2011-10-01

    Cloud Computing Federation is a main trend of Cloud Computing. Resource management has a significant effect on the design, realization, and efficiency of Cloud Computing Federation. Cloud Computing Federation has the typical characteristics of a complex system; therefore, we propose a resource management architecture based on complex network theory for Cloud Computing Federation (abbreviated as RMABC) in this paper, with a detailed design of the resource discovery and resource announcement mechanisms. Compared with existing resource management mechanisms in distributed computing systems, a Task Manager in RMABC can use historical information and current state data obtained from other Task Managers for the evolution of the complex network composed of Task Managers, and thus has advantages in resource discovery speed, fault tolerance, and adaptive ability. The results of the model experiment confirmed the advantages of RMABC in resource discovery performance.

  5. Spatial issues in user interface design from a graphic design perspective

    NASA Technical Reports Server (NTRS)

    Marcus, Aaron

    1989-01-01

    The user interface of a computer system is a visual display that provides information about the status of operations on data within the computer and control options to the user that enable adjustments to these operations. From the very beginning of computer technology the user interface was a spatial display, although its spatial features were not necessarily complex or explicitly recognized by the users. All text and nonverbal signs appeared in a virtual space generally thought of as a single flat plane of symbols. Current technology of high performance workstations permits any element of the display to appear as dynamic, multicolor, 3-D signs in a virtual 3-D space. The complexity of appearance and the user's interaction with the display provide significant challenges to the graphic designer of current and future user interfaces. In particular, spatial depiction provides many opportunities for effective communication of objects, structures, processes, navigation, selection, and manipulation. Issues are presented that are relevant to the graphic designer seeking to optimize the user interface's spatial attributes for effective visual communication.

  6. Space station Simulation Computer System (SCS) study for NASA/MSFC. Volume 1: Overview and summary

    NASA Technical Reports Server (NTRS)

    1989-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned Marshall Space Flight Center (MSFC) Payload Training Complex (PTC) required to meet this need will train the space station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs. This study was performed August 1988 to October 1989. Thus, the results are based on the SSFP August 1989 baseline, i.e., pre-Langley configuration/budget review (C/BR) baseline. Some terms, e.g., combined trainer, are being redefined. An overview of the study activities and a summary of study results are given here.

  7. Vapor Compression Distillation Subsystem (VCDS) Component Enhancement, Testing and Expert Fault Diagnostics Development, Volume 2

    NASA Technical Reports Server (NTRS)

    Mallinak, E. S.

    1987-01-01

    A wide variety of Space Station functions will be managed via computerized controls. Many of these functions are at the same time very complex and very critical to the operation of the Space Station. The Environmental Control and Life Support System is one group of very complex and critical subsystems which directly affects the ability of the crew to perform their mission. Failures of the Environmental Control and Life Support Subsystems are to be avoided and, in the event of failure, repair must be effected as rapidly as possible. Due to the complex and diverse nature of the subsystems, it is not possible to train the Space Station crew to be experts in the operation of all of the subsystems. By applying the concepts of computer-based expert systems, it may be possible to provide the necessary expertise for these subsystems in dedicated controllers. In this way, an expert system could avoid failures and extend the operating time of the subsystems even in the event of failure of some components, and could reduce the time to repair by being able to pinpoint the cause of a failure when one cannot be avoided.

  8. Nonsomatotopic organization of the higher motor centers in octopus.

    PubMed

    Zullo, Letizia; Sumbre, German; Agnisola, Claudio; Flash, Tamar; Hochner, Binyamin

    2009-10-13

    Hyperredundant limbs with a virtually unlimited number of degrees of freedom (DOFs) pose a challenge for both biological and computational systems of motor control. In the flexible arms of the octopus, simplification strategies have evolved to reduce the number of controlled DOFs. Motor control in the octopus nervous system is hierarchically organized. A relatively small central brain integrates a huge amount of visual and tactile information from the large optic lobes and the peripheral nervous system of the arms and issues commands to lower motor centers controlling the elaborated neuromuscular system of the arms. This unique organization raises new questions on the organization of the octopus brain and whether and how it represents the rich movement repertoire. We developed a method of brain microstimulation in freely behaving animals and stimulated the higher motor centers-the basal lobes-thus inducing discrete and complex sets of movements. As stimulation strength increased, complex movements were recruited from basic components shared by different types of movement. We found no stimulation site where movements of a single arm or body part could be elicited. Discrete and complex components have no central topographical organization but are distributed over wide regions.

  9. Intelligence and cortical thickness in children with complex partial seizures.

    PubMed

    Tosun, Duygu; Caplan, Rochelle; Siddarth, Prabha; Seidenberg, Michael; Gurbani, Suresh; Toga, Arthur W; Hermann, Bruce

    2011-07-15

    Prior studies on healthy children have demonstrated regional variations and a complex and dynamic relationship between intelligence and cerebral tissue. Yet, there is little information regarding the neuroanatomical correlates of general intelligence in children with epilepsy compared to healthy controls. In vivo imaging techniques, combined with methods for advanced image processing and analysis, offer the potential to examine quantitative mapping of brain development and its abnormalities in childhood epilepsy. A surface-based, computational high resolution 3-D magnetic resonance image analytic technique was used to compare the relationship of cortical thickness with age and intelligence quotient (IQ) in 65 children and adolescents with complex partial seizures (CPS) and 58 healthy controls, aged 6-18 years. Children were grouped according to health status (epilepsy; controls) and IQ level (average and above; below average) and compared on age-related patterns of cortical thickness. Our cross-sectional findings suggest that disruption in normal age-related cortical thickness expression is associated with intelligence in pediatric CPS patients both with average and below average IQ scores.

  10. A study of interactive control scheduling and economic assessment for robotic systems

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A class of interactive control systems is derived by generalizing interactive manipulator control systems. Tasks of interactive control systems can be represented as a network of a finite set of actions which have specific operational characteristics and specific resource requirements, and which are of limited duration. This has enabled the overall control algorithm to be decomposed into parts that execute simultaneously and asynchronously. The performance benefits of sensor-referenced and computer-aided control of manipulators in a complex environment are evaluated. The first phase of the CURV arm control system software development and the basic features of the control algorithms and their software implementation are presented. An optimal solution is investigated for a production scheduling problem that will be easy to implement in practical situations.

  11. Planning and task management in Parkinson's disease: differential emphasis in dual-task performance.

    PubMed

    Bialystok, Ellen; Craik, Fergus I M; Stefurak, Taresa

    2008-03-01

    Seventeen patients diagnosed with Parkinson's disease completed a complex computer-based task that involved planning and management while also performing an attention-demanding secondary task. The tasks were performed concurrently, but it was necessary to switch from one to the other. Performance was compared to a group of healthy age-matched control participants and a group of young participants. Parkinson's patients performed better than the age-matched controls on almost all measures and as well as the young controls in many cases. However, the Parkinson's patients achieved this by paying relatively less attention to the secondary task and focusing attention more on the primary task. Thus, Parkinson's patients can apparently improve their performance on some aspects of a multidimensional task by simplifying task demands. This benefit may occur as a consequence of their inflexible exaggerated attention to some aspects of a complex task to the relative neglect of other aspects.

  12. Real-time automated failure identification in the Control Center Complex (CCC)

    NASA Technical Reports Server (NTRS)

    Kirby, Sarah; Lauritsen, Janet; Pack, Ginger; Ha, Anhhoang; Jowers, Steven; Mcnenny, Robert; Truong, The; Dell, James

    1993-01-01

    A system which will provide real-time failure management support to the Space Station Freedom program is described. The system's use of a simplified form of model-based reasoning qualifies it as an advanced automation system. However, it differs from most such systems in that it was designed from the outset to meet two sets of requirements. First, it must provide a useful increment to the fault management capabilities of the Johnson Space Center (JSC) Control Center Complex (CCC) Fault Detection Management system. Second, it must satisfy CCC operational environment constraints such as cost, computer resource requirements, verification and validation, etc. The need to meet both requirement sets presents a much greater design challenge than would have been the case had functionality been the sole design consideration. The choice of technology is overviewed, including aspects of that choice and the process for migrating the system into the control center.

  13. Direct adaptive control of a PUMA 560 industrial robot

    NASA Technical Reports Server (NTRS)

    Seraji, Homayoun; Lee, Thomas; Delpech, Michel

    1989-01-01

    The implementation and experimental validation of a new direct adaptive control scheme on a PUMA 560 industrial robot is described. The testbed facility consists of a Unimation PUMA 560 six-jointed robot and controller, and a DEC MicroVAX II computer which hosts the Robot Control C Library software. The control algorithm is implemented on the MicroVAX which acts as a digital controller for the PUMA robot, and the Unimation controller is effectively bypassed and used merely as an I/O device to interface the MicroVAX to the joint motors. The control algorithm for each robot joint consists of an auxiliary signal generated by a constant-gain Proportional plus Integral plus Derivative (PID) controller, and an adaptive position-velocity (PD) feedback controller with adjustable gains. The adaptive independent joint controllers compensate for the inter-joint couplings and achieve accurate trajectory tracking without the need for the complex dynamic model and parameter values of the robot. Extensive experimental results on PUMA joint control are presented to confirm the feasibility of the proposed scheme, in spite of strong interactions between joint motions. Experimental results validate the capabilities of the proposed control scheme. The control scheme is extremely simple and computationally very fast for concurrent processing with high sampling rates.
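
    The structure described, a constant-gain PID auxiliary signal plus an adaptive PD term per joint, can be sketched as below; the adaptation law and every numeric constant are illustrative assumptions, not the published scheme.

      class AdaptiveJointController:
          """Per-joint control: fixed-gain PID auxiliary signal plus a PD
          term whose gains are adjusted online from the tracking error."""
          def __init__(self, dt, kp0=5.0, kd0=1.0, gp=0.5, gd=0.1):
              self.dt = dt
              self.kp, self.kd = kp0, kd0   # adjustable PD gains
              self.gp, self.gd = gp, gd     # adaptation rates (assumed)
              self.i_term = 0.0
              self.prev_e = 0.0

          def update(self, q_des, q):
              e = q_des - q
              de = (e - self.prev_e) / self.dt
              self.prev_e = e
              self.i_term += 0.2 * e * self.dt       # constant-gain PID part
              aux = 2.0 * e + self.i_term + 0.1 * de
              r = e + 0.5 * de                       # weighted error signal
              self.kp += self.gp * r * e * self.dt   # gradient-style updates
              self.kd += self.gd * r * de * self.dt
              return aux + self.kp * e + self.kd * de  # joint torque command

      ctl = AdaptiveJointController(dt=0.001)   # 1 kHz digital loop
      print(ctl.update(q_des=0.5, q=0.45))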

  14. Integration of symbolic and algorithmic hardware and software for the automation of space station subsystems

    NASA Technical Reports Server (NTRS)

    Gregg, Hugh; Healey, Kathleen; Hack, Edmund; Wong, Carla

    1988-01-01

    Expert systems that require access to data bases, complex simulations and real time instrumentation have both symbolic and algorithmic needs. Both of these needs could be met using a general purpose workstation running both symbolic and algorithmic codes, or separate, specialized computers networked together. The latter approach was chosen to implement TEXSYS, the thermal expert system, developed by the NASA Ames Research Center in conjunction with the Johnson Space Center to demonstrate the ability of an expert system to autonomously monitor the thermal control system of the space station. TEXSYS has been implemented on a Symbolics workstation, and will be linked to a microVAX computer that will control a thermal test bed. The integration options and several possible solutions are presented.

  15. Development of Moire machine vision

    NASA Technical Reports Server (NTRS)

    Harding, Kevin G.

    1987-01-01

    Three dimensional perception is essential to the development of versatile robotics systems in order to handle complex manufacturing tasks in future factories and in providing high accuracy measurements needed in flexible manufacturing and quality control. A program is described which will develop the potential of Moire techniques to provide this capability in vision systems and automated measurements, and demonstrate artificial intelligence (AI) techniques to take advantage of the strengths of Moire sensing. Moire techniques provide a means of optically manipulating the complex visual data in a three dimensional scene into a form which can be easily and quickly analyzed by computers. This type of optical data manipulation provides high productivity through integrated automation, producing a high quality product while reducing computer and mechanical manipulation requirements and thereby the cost and time of production. This nondestructive evaluation is developed to be able to make full field range measurement and three dimensional scene analysis.

  16. Simulation and optimization of an experimental membrane wastewater treatment plant using computational intelligence methods.

    PubMed

    Ludwig, T; Kern, P; Bongards, M; Wolf, C

    2011-01-01

    The optimization of relaxation and filtration times of submerged microfiltration flat modules in membrane bioreactors used for municipal wastewater treatment is essential for efficient plant operation. However, the optimization and control of such plants and their filtration processes is a challenging problem due to the underlying highly nonlinear and complex processes. This paper presents the use of genetic algorithms for this optimization problem in conjunction with a fully calibrated simulation model, as computational intelligence methods are perfectly suited to the nonconvex multi-objective nature of the optimization problems posed by these complex systems. The simulation model is developed and calibrated using membrane modules from the wastewater simulation software GPS-X based on the Activated Sludge Model No.1 (ASM1). Simulation results have been validated at a technical reference plant. They clearly show that filtration process costs for cleaning and energy can be reduced significantly by intelligent process optimization.
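
    A toy version of the genetic-algorithm loop follows; the cost function is a stand-in for the calibrated GPS-X simulation, and all numbers are invented for illustration.

      import random

      def cost(filt_s, relax_s):
          """Placeholder for the simulation model: penalize deviations from
          a fictitious optimum of 480 s filtration / 60 s relaxation."""
          return (filt_s - 480) ** 2 / 1e4 + (relax_s - 60) ** 2 / 1e2

      def evolve(pop_size=30, generations=50):
          pop = [(random.uniform(60, 900), random.uniform(10, 120))
                 for _ in range(pop_size)]
          for _ in range(generations):
              pop.sort(key=lambda ind: cost(*ind))
              survivors = pop[:pop_size // 2]        # elitist selection
              children = []
              while len(survivors) + len(children) < pop_size:
                  a, b = random.sample(survivors, 2)
                  children.append(((a[0] + b[0]) / 2 + random.gauss(0, 10),
                                   (a[1] + b[1]) / 2 + random.gauss(0, 2)))
              pop = survivors + children             # crossover + mutation
          return min(pop, key=lambda ind: cost(*ind))

      print("best (filtration s, relaxation s):", evolve())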

  17. Development of Moire machine vision

    NASA Astrophysics Data System (ADS)

    Harding, Kevin G.

    1987-10-01

    Three dimensional perception is essential to the development of versatile robotics systems in order to handle complex manufacturing tasks in future factories and in providing high accuracy measurements needed in flexible manufacturing and quality control. A program is described which will develop the potential of Moire techniques to provide this capability in vision systems and automated measurements, and demonstrate artificial intelligence (AI) techniques to take advantage of the strengths of Moire sensing. Moire techniques provide a means of optically manipulating the complex visual data in a three dimensional scene into a form which can be easily and quickly analyzed by computers. This type of optical data manipulation provides high productivity through integrated automation, producing a high quality product while reducing computer and mechanical manipulation requirements and thereby the cost and time of production. This nondestructive evaluation is developed to be able to make full field range measurement and three dimensional scene analysis.

  18. An effective and secure key-management scheme for hierarchical access control in E-medicine system.

    PubMed

    Odelu, Vanga; Das, Ashok Kumar; Goswami, Adrijit

    2013-04-01

    Recently, several hierarchical access control schemes have been proposed in the literature to provide security of e-medicine systems. However, most of them are either insecure against the man-in-the-middle attack or they require high storage and computational overheads. Wu and Chen proposed a key management method to solve dynamic access control problems in a user hierarchy based on a hybrid cryptosystem. Though their scheme improves computational efficiency over Nikooghadam et al.'s approach, it suffers from large storage space for public parameters in the public domain and computational inefficiency due to costly elliptic curve point multiplication. Recently, Nikooghadam and Zakerolhosseini showed that Wu-Chen's scheme is vulnerable to the man-in-the-middle attack. In order to remedy this security weakness in Wu-Chen's scheme, they proposed a secure scheme which is again based on ECC (elliptic curve cryptography) and an efficient one-way hash function. However, their scheme incurs a huge computational cost for providing verification of public information in the public domain, as it uses an ECC digital signature, which is costly compared to a symmetric-key cryptosystem. In this paper, we propose an effective access control scheme in a user hierarchy which is based only on a symmetric-key cryptosystem and an efficient one-way hash function. We show that our scheme significantly reduces the storage space for both public and private domains, as well as the computational complexity, when compared to Wu-Chen's scheme, Nikooghadam-Zakerolhosseini's scheme, and other related schemes. Through informal and formal security analysis, we further show that our scheme is secure against various attacks, including the man-in-the-middle attack. Moreover, dynamic access control problems in our scheme are also solved efficiently compared to other related schemes, making our scheme much more suitable for practical applications of e-medicine systems.
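
    The core idea of a purely symmetric-key, hash-based hierarchy can be illustrated in a few lines of Python. This is a generic textbook construction, not the authors' exact scheme: each security class derives the key of any descendant by hashing its own key with the child's public identifier, so key material flows only downward in the hierarchy.

      import hashlib

      # Generic hash-chain key derivation for a hierarchy (illustrative only):
      # an ancestor class can recompute any descendant's key on demand, but a
      # descendant cannot invert the one-way hash to recover an ancestor's key.

      def derive_child_key(parent_key: bytes, child_id: str) -> bytes:
          return hashlib.sha256(parent_key + child_id.encode()).digest()

      # Example hierarchy: hospital root -> physician -> nurse
      root_key = b"\x00" * 32                       # hypothetical root secret
      k_physician = derive_child_key(root_key, "physician")
      k_nurse = derive_child_key(k_physician, "nurse")

      # The physician class derives k_nurse directly from its own key.
      assert derive_child_key(k_physician, "nurse") == k_nurse

    Because the hash is one-way, a lower class cannot recover its ancestor's key, which is what removes the need for costly elliptic curve operations in such constructions.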

  19. Atomic switch networks-nanoarchitectonic design of a complex system for natural computing.

    PubMed

    Demis, E C; Aguilera, R; Sillin, H O; Scharnhorst, K; Sandouk, E J; Aono, M; Stieg, A Z; Gimzewski, J K

    2015-05-22

    Self-organized complex systems are ubiquitous in nature, and the structural complexity of these natural systems can be used as a model to design new classes of functional nanotechnology based on highly interconnected networks of interacting units. Conventional fabrication methods for electronic computing devices are subject to known scaling limits, confining the diversity of possible architectures. This work explores methods of fabricating a self-organized complex device known as an atomic switch network and discusses its potential utility in computing. Through a merger of top-down and bottom-up techniques guided by mathematical and nanoarchitectonic design principles, we have produced functional devices comprising nanoscale elements whose intrinsic nonlinear dynamics and memorization capabilities produce robust patterns of distributed activity and a capacity for nonlinear transformation of input signals when configured in the appropriate network architecture. Their operational characteristics represent a unique potential for hardware implementation of natural computation, specifically in the area of reservoir computing, a burgeoning field that investigates the computational aptitude of complex biologically inspired systems.
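
    A software echo-state network is perhaps the simplest way to see the reservoir computing idea the abstract invokes; in the paper, the physical atomic switch network plays the role of the fixed random reservoir. The sketch below, with illustrative sizes and a toy delay-memory task, trains only a linear readout, which is the defining feature of the paradigm.

      import numpy as np

      # Bare-bones echo-state network: a fixed random recurrent reservoir does the
      # nonlinear transformation; only the linear readout is trained. Sizes, the
      # spectral radius, and the task are illustrative assumptions.

      rng = np.random.default_rng(0)
      N, T = 100, 500
      W = rng.standard_normal((N, N))
      W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1
      w_in = rng.standard_normal(N)

      u = rng.uniform(-1, 1, T)                 # input signal
      target = np.roll(u, 3)                    # task: recall input 3 steps back

      x = np.zeros(N)
      states = np.zeros((T, N))
      for t in range(T):
          x = np.tanh(W @ x + w_in * u[t])      # fixed reservoir dynamics
          states[t] = x

      # train the linear readout by ridge regression on the collected states
      ridge = 1e-4
      w_out = np.linalg.solve(states.T @ states + ridge * np.eye(N),
                              states.T @ target)
      pred = states @ w_out
      print(np.mean((pred[10:] - target[10:]) ** 2))   # readout MSE on the delay task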

  20. Quantum population and entanglement evolution in photosynthetic process

    NASA Astrophysics Data System (ADS)

    Zhu, Jing

    Applications of the concepts of quantum information theory are usually related to the powerful and counter-intuitive quantum mechanical effects of superposition, interference and entanglement. In this thesis, I examine the role of coherence and entanglement in complex chemical systems. The research has focused mainly on two related projects: the first develops a theoretical model to explain recent ultrafast experiments on excitonic migration in photosynthetic complexes, which show long-lived coherence on the order of hundreds of femtoseconds; the second develops the Grover algorithm for global optimization of complex systems. The first part can be divided into two sections. The first section investigates the theoretical framework for the transfer of electronic excitation energy through the Fenna-Matthews-Olson (FMO) pigment-protein complex. The newly developed modified scaled hierarchical equation of motion (HEOM) approach is employed for simulating the open quantum system. The second section investigates the evolution of entanglement in the FMO complex based on the simulation results from the scaled HEOM approach. We examine the role of multipartite entanglement in the FMO complex by direct computation of the convex roof optimization for a number of different measures, including pairwise, triplet, quadruple and quintuple site entanglement. Our results support the hypothesis that multipartite entanglement is maximal primarily along the two distinct electronic energy transfer pathways. The second part of this thesis can be separated into two sections. The first section demonstrates that a modified Grover's quantum algorithm can be applied to real problems of finding a global minimum using modest numbers of quantum bits. Calculations of the global minimum of simple test functions and Lennard-Jones clusters have been carried out on a quantum computer simulator using a modified Grover's algorithm. The second section implements the basic quantum logic gates on arrays of trapped ultracold polar molecules serving as qubits for the quantum computer. Multi-Target Optimal Control Theory (MTOCT) is utilized as a means of manipulating the initial-to-target transition probability via an external laser field. The detailed calculation is applied to the SrO molecule, an ideal candidate in proposed quantum computers using arrays of trapped ultra-cold polar molecules.
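
    The Grover component can be illustrated classically with a small statevector simulation. This is a generic textbook sketch, not the thesis' modified algorithm: an oracle flips the phase of the marked index and the diffusion step inverts all amplitudes about their mean. In minimum finding (the Durr-Hoyer style of use), the oracle would instead mark every entry below the current best threshold.

      import numpy as np

      # Classical statevector simulation of plain Grover search (illustrative).
      n = 3                                    # qubits, search space of size 8
      N = 2 ** n
      marked = 5                               # stand-in for the current best minimum
      psi = np.ones(N) / np.sqrt(N)            # uniform superposition
      for _ in range(int(round(np.pi / 4 * np.sqrt(N)))):
          psi[marked] *= -1.0                  # oracle: phase flip on marked state
          psi = 2.0 * psi.mean() - psi         # diffusion: inversion about the mean
      print(np.argmax(psi**2), (psi**2)[marked])   # marked state found with prob ~0.95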

  1. CCOMP: An efficient algorithm for complex roots computation of determinantal equations

    NASA Astrophysics Data System (ADS)

    Zouros, Grigorios P.

    2018-01-01

    In this paper a free Python algorithm, entitled CCOMP (Complex roots COMPutation), is developed for the efficient computation of complex roots of determinantal equations inside a prescribed complex domain. The key to the method presented is the efficient determination of candidate points inside the domain in whose close neighborhood a complex root may lie. Once these points are detected, the algorithm proceeds to a two-dimensional minimization problem with respect to the minimum modulus eigenvalue of the system matrix. In the core of CCOMP exist three sub-algorithms whose tasks are the efficient estimation of the minimum modulus eigenvalues of the system matrix inside the prescribed domain, the efficient computation of candidate points which guarantee the existence of minima, and finally, the computation of minima via bound constrained minimization algorithms. Theoretical results and heuristics support the development and the performance of the algorithm, which is discussed in detail. CCOMP supports general complex matrices, and its efficiency, applicability and validity are demonstrated in a variety of microwave applications.
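
    The workflow described above can be mimicked on a toy determinantal equation in a few lines of Python/SciPy: scan a grid in the complex domain for points where the minimum-modulus eigenvalue of the system matrix dips toward zero, then refine each candidate with a bound-constrained minimizer. The matrix M(z), grid, and tolerances below are illustrative stand-ins, not CCOMP's actual sub-algorithms.

      import numpy as np
      from scipy.optimize import minimize

      def M(z):
          # arbitrary 2x2 determinantal system for illustration
          return np.array([[z**2 - 1.0, 0.3], [0.2, z + 2.0]])

      def min_mod_eig(x, y):
          return np.abs(np.linalg.eigvals(M(x + 1j * y))).min()

      xs, ys = np.linspace(-3, 3, 61), np.linspace(-3, 3, 61)
      roots = []
      for x in xs:
          for y in ys:
              if min_mod_eig(x, y) < 0.2:                 # crude candidate test
                  res = minimize(lambda p: min_mod_eig(*p), [x, y],
                                 bounds=[(-3, 3), (-3, 3)])
                  z = complex(res.x[0], res.x[1])
                  if res.fun < 1e-5 and not any(abs(z - r) < 1e-3 for r in roots):
                      roots.append(z)
      print(roots)   # approximate roots of det M(z) = 0 inside the domain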

  2. Control of a Serpentine Robot for Inspection Tasks

    NASA Technical Reports Server (NTRS)

    Colbaugh, R.; Glass, K.; Seraji, H.

    1994-01-01

    This paper presents a simple and robust kinematic control scheme for the JPL serpentine robot system. The proposed strategy is developed using the damped-least-squares/configuration control methodology, and permits the considerable dexterity of the JPL serpentine robot to be effectively utilized for maneuvering in the congested and uncertain workspaces often encountered in inspection tasks. Computer simulation results are given for the 20 degree-of-freedom (DOF) manipulator system obtained by mounting the twelve DOF serpentine robot at the end-effector of an eight DOF Robotics Research arm/lathe-bed system. These simulations demonstrate that the proposed approach provides an effective method of controlling this complex system.
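
    The heart of the damped-least-squares approach is a single matrix update per control cycle, sketched below in Python with an illustrative Jacobian for a 20-DOF system. The actual configuration-control formulation also folds in additional kinematic task constraints, which this sketch omits.

      import numpy as np

      # One damped-least-squares step: the damping factor lam trades tracking
      # accuracy for robustness near singularities. J and e are placeholders.

      def dls_step(J, e, lam=0.1):
          # dq = J^T (J J^T + lam^2 I)^{-1} e
          JJt = J @ J.T
          return J.T @ np.linalg.solve(JJt + lam**2 * np.eye(JJt.shape[0]), e)

      J = np.random.randn(3, 20)            # task Jacobian (illustrative values)
      e = np.array([0.01, -0.02, 0.005])    # Cartesian position error this cycle
      dq = dls_step(J, e)                   # joint increment for this control cycle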

  3. A Scalable O(N) Algorithm for Large-Scale Parallel First-Principles Molecular Dynamics Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osei-Kuffuor, Daniel; Fattebert, Jean-Luc

    2014-01-01

    Traditional algorithms for first-principles molecular dynamics (FPMD) simulations only gain a modest capability increase from current petascale computers, due to their O(N³) complexity and their heavy use of global communications. To address this issue, we are developing a truly scalable O(N) complexity FPMD algorithm, based on density functional theory (DFT), which avoids global communications. The computational model uses a general nonorthogonal orbital formulation for the DFT energy functional, which requires knowledge of selected elements of the inverse of the associated overlap matrix. We present a scalable algorithm for approximately computing selected entries of the inverse of the overlap matrix, based on an approximate inverse technique, by inverting local blocks corresponding to principal submatrices of the global overlap matrix. The new FPMD algorithm exploits sparsity and uses nearest neighbor communication to provide a computational scheme capable of extreme scalability. Accuracy is controlled by the mesh spacing of the finite difference discretization, the size of the localization regions in which the electronic orbitals are confined, and a cutoff beyond which the entries of the overlap matrix can be omitted when computing selected entries of its inverse. We demonstrate the algorithm's excellent parallel scaling for up to O(100K) atoms on O(100K) processors, with a wall-clock time of O(1) minute per molecular dynamics time step.
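
    A toy Python version of the selected-inverse idea: to approximate an entry of S⁻¹ for a localized overlap matrix S, invert only a local principal submatrix around the indices of interest instead of the full matrix. The matrix, its decay profile, and the block half-width below are illustrative, not the paper's actual discretization.

      import numpy as np

      # Approximate a selected entry of the inverse of a matrix with decaying
      # off-diagonal entries by inverting only a local principal submatrix.

      n, half = 200, 10
      idx = np.arange(n)
      S = np.eye(n) + 0.05 * np.exp(-np.abs(np.subtract.outer(idx, idx)))

      def selected_inverse_entry(S, i, j, half=10):
          lo = max(0, min(i, j) - half)
          hi = min(S.shape[0], max(i, j) + half + 1)
          block_inv = np.linalg.inv(S[lo:hi, lo:hi])   # local principal submatrix
          return block_inv[i - lo, j - lo]

      approx = selected_inverse_entry(S, 50, 52)
      exact = np.linalg.inv(S)[50, 52]
      print(approx, exact)   # close when off-block couplings decay quickly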

  4. Mode Transitions in Glass Cockpit Aircraft: Results of a Field Study

    NASA Technical Reports Server (NTRS)

    Degani, Asaf; Kirlik, Alex; Shafto, Michael (Technical Monitor)

    1995-01-01

    One consequence of increased levels of automation in complex control systems is the presence of modes. A mode is a particular configuration of a control system that defines how human command inputs are interpreted. In complex systems, modes also often determine a specific allocation of control authority between the human and automated systems. Even in simple static devices (e.g., electronic watches, word processors), the presence of modes has been found to cause problems in either the acquisition or production of skilled performance. Many of these problems arise due to the fact that the selection of a mode causes device behavior to be mediated by hidden internal state information. For these simple systems, many of these interaction problems can be solved by the design of appropriate feedback to communicate internal state information to the human operator. In complex dynamic systems, however, the design issues associated with modes seem to transcend the problem of merely communicating internal state information via displayed feedback. In complex supervisory control systems (e.g., aircraft, spacecraft, military command and control), a key function of modes is the selection of a particular configuration of control authority between the human operator and automated control systems. One mode may result in full manual control, another may result in a mix of manual and automatic control, while a third may result in full automatic control over the entire system. The human operator selects an appropriate mode as a function of current goals, operating conditions, and operating procedures. Thus, the operator is put in a position of essentially trying to control two coupled dynamic systems: the target system itself, and also a highly complex suite of automation controlling the target system. From a historical perspective, it should probably not come as a surprise that very little information is available to guide the design of mode-oriented control systems. The topic of function allocation (i.e., the proper division of control authority between human and computer) has a long history in human-machine systems research. Although this research has produced some relevant guidelines, a design approach capable of defining appropriate allocations of control function between the human and automation is not yet available. As a result, the function allocation decision itself has been allocated to the operator, to be performed in real-time, in the operation of mode-oriented control systems. A variety of documented aircraft accidents and incidents suggest that the real-time selection and monitoring of control modes is a weak link in the effective operation of complex supervisory control systems. Research in human-machine systems and human-computer interaction has barely scraped the surface of the problem of understanding how operators manage this task. The purpose of this paper is to present the results of a field study which examined how operators manage mode selection in a complex supervisory control system. Data on mode engagements using the Boeing B757/767 auto-flight system were collected during approach and descent into four major airports on the East Coast of the United States. Protocols documenting mode selection, automatic mode changes, pilot actions, quantitative records of flight-path variables, and verbal reports during and after mode engagements were collected by an observer from the jumpseat. Observations were conducted on two typical trips between three airports. 
Each trip was replicated 11 times, which yielded a total of 22 trips and 66 legs on which data were collected. All data collected concerned the same flight numbers, and therefore, the same time of day, same type of aircraft, and identical operational environments (e.g., ATC facilities, weather patterns, traffic flow, etc.).

  5. Characterization of Freshwater EM Sub Bottom Sediment Properties and Target Responses for Detection of UXO with Ground-Penetrating RADAR (GPR)

    DTIC Science & Technology

    2008-09-01

    such that n* = √ε*. We computed phase velocity v_ph = c/Re(n*). We computed the one-way attenuation rate β (dB m−1) from the imaginary part of the... velocities of propagation at 100 MHz and 1 GHz. At 1 GHz we might expect v_ph to be controlled by the free, or nearly free, value of εshi. The complex... distorted waveform resulted from changes in v_ph, β, or both across the pulse bandwidth. The small differences in v_ph,meas between 100 MHz and 1 GHz at

  6. On a computational model of building thermal dynamic response

    NASA Astrophysics Data System (ADS)

    Jarošová, Petra; Vala, Jiří

    2016-07-01

    Development and exploitation of advanced materials, structures and technologies in civil engineering, both for buildings with carefully controlled interior temperature and for common residential houses, together with new European and national directives and technical standards, stimulate the development of rather complex and robust, but sufficiently simple and inexpensive computational tools, supporting their design and optimization of energy consumption. This paper demonstrates the possibility of consideration of such seemingly contradictory requirements, using the simplified non-stationary thermal model of a building, motivated by the analogy with the analysis of electric circuits; certain semi-analytical forms of solutions come from the method of lines.
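
    In the electric-circuit analogy, a zone temperature behaves like the voltage on a capacitor fed through thermal resistances; a minimal single-zone Python version is sketched below with illustrative, uncalibrated parameters. The method of lines mentioned above produces systems of exactly this kind of ordinary differential equation, one per lumped node.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Single-zone RC thermal model (illustrative parameters, not calibrated):
      # C dT/dt = P_heat + (T_ext - T) / R

      C = 5e6            # zone heat capacity [J/K]
      R = 0.005          # wall thermal resistance [K/W]
      T_ext = 273.15     # exterior temperature [K]
      P_heat = 2000.0    # heating power [W]

      def rhs(t, T):
          return [(P_heat + (T_ext - T[0]) / R) / C]

      sol = solve_ivp(rhs, (0, 48 * 3600), [293.15], max_step=600)
      print(sol.y[0, -1])   # approaches the steady state T_ext + R * P_heat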

  7. A sweep algorithm for massively parallel simulation of circuit-switched networks

    NASA Technical Reports Server (NTRS)

    Gaujal, Bruno; Greenberg, Albert G.; Nicol, David M.

    1992-01-01

    A new massively parallel algorithm is presented for simulating large asymmetric circuit-switched networks, controlled by a randomized-routing policy that includes trunk-reservation. A single instruction multiple data (SIMD) implementation is described, and corresponding experiments on a 16384 processor MasPar parallel computer are reported. A multiple instruction multiple data (MIMD) implementation is also described, and corresponding experiments on an Intel iPSC/860 parallel computer, using 16 processors, are reported. By exploiting parallelism, our algorithm increases the possible execution rate of such complex simulations by as much as an order of magnitude.

  8. Grid Computing and Collaboration Technology in Support of Fusion Energy Sciences

    NASA Astrophysics Data System (ADS)

    Schissel, D. P.

    2004-11-01

    The SciDAC Initiative is creating a computational grid designed to advance scientific understanding in fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling, and allowing more efficient use of experimental facilities. The philosophy is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as easy-to-use, network-available services. Access to services is stressed rather than portability. Services share the same basic security infrastructure so that stakeholders can control their own resources, which also helps ensure fair use of resources. The collaborative control room is being developed using the open-source Access Grid software that enables secure group-to-group collaboration with capabilities beyond teleconferencing, including application sharing and control. The ability to effectively integrate off-site scientists into a dynamic control room will be critical to the success of future international projects like ITER. Grid computing, the secure integration of computer systems over high-speed networks to provide on-demand access to data analysis capabilities and related functions, is being deployed as an alternative to traditional resource sharing among institutions. The first grid computational service deployed was the transport code TRANSP and included tools for run preparation, submission, monitoring and management. This approach saves user sites from the laborious effort of maintaining a complex code while at the same time reducing the burden on developers by avoiding the support of a large number of heterogeneous installations. This tutorial will present the philosophy behind an advanced collaborative environment, give specific examples, and discuss its usage beyond FES.

  9. Speech recognition for embedded automatic positioner for laparoscope

    NASA Astrophysics Data System (ADS)

    Chen, Xiaodong; Yin, Qingyun; Wang, Yi; Yu, Daoyin

    2014-07-01

    In this paper a novel speech recognition methodology based on the Hidden Markov Model (HMM) is proposed for an embedded Automatic Positioner for Laparoscope (APL), which is built around a fixed point ARM processor. The APL system is designed to assist the doctor in laparoscopic surgery by implementing vocal control of the laparoscope for the specific doctor. Real-time response to voice commands demands an efficient speech recognition algorithm for the APL. In order to reduce computation cost without significant loss in recognition accuracy, both arithmetic and algorithmic optimizations are applied in the method presented. First, relying mostly on arithmetic optimizations, a fixed point frontend for speech feature analysis is built to match the ARM processor's characteristics. Then a fast likelihood computation algorithm is used to reduce the computational complexity of the HMM-based recognition algorithm. The experimental results show that the method keeps recognition time under 0.5 s while maintaining accuracy above 99%, demonstrating its ability to achieve real-time vocal control of the APL.
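
    One standard fast-likelihood trick for Gaussian-emission HMMs, sketched below in Python, is to precompute each state's log-constant and inverse variances offline so the per-frame work reduces to multiply-accumulate operations, which also map naturally onto fixed-point hardware. The abstract does not specify the exact optimization used, so this is a representative example with placeholder model parameters.

      import numpy as np

      # Fast diagonal-covariance Gaussian log-likelihood: constants and inverse
      # variances are precomputed so each frame costs only multiply-adds.

      D, S = 13, 8                             # feature dim, number of HMM states
      mu = np.random.randn(S, D)               # state means (placeholder model)
      var = np.random.rand(S, D) + 0.5         # state variances (placeholder)

      inv_var = 1.0 / var                                       # precomputed
      log_const = -0.5 * (D * np.log(2 * np.pi)
                          + np.log(var).sum(axis=1))            # precomputed

      def frame_loglik(x):
          diff = x - mu                        # (S, D) broadcast against the frame
          return log_const - 0.5 * np.einsum('sd,sd->s', diff * diff, inv_var)

      print(frame_loglik(np.random.randn(D)))  # log-likelihood of one frame per state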

  10. Design, Control and in Situ Visualization of Gas Nitriding Processes

    PubMed Central

    Ratajski, Jerzy; Olik, Roman; Suszko, Tomasz; Dobrodziej, Jerzy; Michalski, Jerzy

    2010-01-01

    The article presents a complex system for the design, in situ visualization and control of a commonly used surface treatment process: gas nitriding. The computer-aided design concept makes use of analytical mathematical models and artificial intelligence methods. As a result, the system enables poly-optimization and poly-parametric simulation of the course of the process, combined with visualization of the changes in process parameter values as a function of time, as well as prediction of the properties of nitrided layers. For in situ visualization of the growth of the nitrided layer, computer procedures were developed which correlate the direct and differential voltage-time runs of the process result sensor (a magnetic sensor) with the corresponding layer growth stage. These computer procedures make it possible to combine, during the process, the registered voltage-time runs with the models of the process. PMID:22315536

  11. Adaptive Wavelet Coding Applied in a Wireless Control System.

    PubMed

    Gama, Felipe O S; Silveira, Luiz F Q; Salazar, Andrés O

    2017-12-13

    Wireless control systems can sense, control and act on the information exchanged between the wireless sensor nodes in a control loop. However, the exchanged information becomes susceptible to the degenerative effects produced by multipath propagation. In order to minimize the destructive effects characteristic of wireless channels, several techniques have been investigated recently. Among them, wavelet coding is a good alternative for wireless communications because of its robustness to the effects of multipath and its low computational complexity. This work proposes an adaptive wavelet coding scheme whose code rate and signal constellation vary according to the fading level, and evaluates the use of this transmission system in a control loop implemented by wireless sensor nodes. The performance of the adaptive system was evaluated in terms of bit error rate (BER) versus Eb/N0 and spectral efficiency, considering a time-varying channel with flat Rayleigh fading, and in terms of processing overhead on a control system with wireless communication. The results obtained through computational simulations and experimental tests show performance gains obtained by inserting the adaptive wavelet coding into a control loop with nodes interconnected by a wireless link. These results support the use of this technique in wireless-link control loops.

  12. Reinforcement learning in computer vision

    NASA Astrophysics Data System (ADS)

    Bernstein, A. V.; Burnaev, E. V.

    2018-04-01

    Nowadays, machine learning has become one of the basic technologies used in solving various computer vision tasks such as feature detection, image segmentation, object recognition and tracking. In many applications, complex systems such as robots are equipped with visual sensors from which they learn the state of the surrounding environment by solving corresponding computer vision tasks. Solutions of these tasks are used for making decisions about possible future actions. It is not surprising that when solving computer vision tasks we should take into account special aspects of their subsequent application in model-based predictive control. Reinforcement learning is one of the modern machine learning technologies in which learning is carried out through interaction with the environment. In recent years, reinforcement learning has been used both for solving applied tasks such as processing and analysis of visual information, and for solving specific computer vision problems such as filtering, extracting image features, localizing objects in scenes, and many others. The paper briefly describes reinforcement learning technology and its use in solving computer vision problems.
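
    For readers new to the technology, the following is a minimal tabular Q-learning loop, the textbook form of reinforcement learning. The five-state chain environment is a hypothetical stand-in for a vision-driven task in which the state would come from a detector or tracker rather than from an index.

      import random

      # Tabular Q-learning on a tiny 5-state chain: reach the rightmost state.
      N_STATES, ACTIONS = 5, [0, 1]            # actions: 0 = left, 1 = right
      alpha, gamma, eps = 0.1, 0.9, 0.1
      Q = [[0.0, 0.0] for _ in range(N_STATES)]

      def policy(s):
          # epsilon-greedy with random tie-breaking
          if random.random() < eps or Q[s][0] == Q[s][1]:
              return random.choice(ACTIONS)
          return 0 if Q[s][0] > Q[s][1] else 1

      def step(s, a):
          s2 = max(0, min(N_STATES - 1, s + (1 if a == 1 else -1)))
          reward = 1.0 if s2 == N_STATES - 1 else 0.0   # goal at the right end
          return s2, reward, s2 == N_STATES - 1

      for episode in range(500):
          s, done = 0, False
          while not done:
              a = policy(s)
              s2, r, done = step(s, a)
              # one-step temporal-difference update
              Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
              s = s2
      print(Q)   # learned action values; the right action dominates in each state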

  13. Cross-Talk in Superconducting Transmon Quantum Computing Architecture

    NASA Astrophysics Data System (ADS)

    Abraham, David; Chow, Jerry; Corcoles, Antonio; Rothwell, Mary; Keefe, George; Gambetta, Jay; Steffen, Matthias; IBM Quantum Computing Team

    2013-03-01

    Superconducting transmon quantum computing test structures often exhibit significant undesired cross-talk. For experiments with only a handful of qubits this cross-talk can be quantified and understood, and therefore corrected. As quantum computing circuits become more complex, and thereby contain increasing numbers of qubits and resonators, it becomes more vital that the inadvertent coupling between these elements is minimized. The task of accurately controlling each single qubit to the level of precision required throughout the realization of a quantum algorithm is difficult by itself; coupled with the need to null out leakage signals from neighboring qubits or resonators, it would quickly become impossible. We discuss an approach to solve this critical problem. We acknowledge support from IARPA under contract W911NF-10-1-0324.

  14. Design Strategy for a Formally Verified Reliable Computing Platform

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Caldwell, James L.; DiVito, Ben L.

    1991-01-01

    This paper presents a high-level design for a reliable computing platform for real-time control applications. The design tradeoffs and analyses related to the development of a formally verified reliable computing platform are discussed. The design strategy advocated in this paper requires the use of techniques that can be completely characterized mathematically as opposed to more powerful or more flexible algorithms whose performance properties can only be analyzed by simulation and testing. The need for accurate reliability models that can be related to the behavior models is also stressed. Tradeoffs between reliability and voting complexity are explored. In particular, the transient recovery properties of the system are found to be fundamental to both the reliability analysis as well as the "correctness" models.

  15. Improved ALE mesh velocities for complex flows

    DOE PAGES

    Bakosi, Jozsef; Waltz, Jacob I.; Morgan, Nathaniel Ray

    2017-05-31

    A key choice in the development of arbitrary Lagrangian-Eulerian solution algorithms is how to move the computational mesh. The most common approaches are smoothing and relaxation techniques, or to compute a mesh velocity field that produces smooth mesh displacements. We present a method in which the mesh velocity is specified by the irrotational component of the fluid velocity as computed from a Helmholtz decomposition, and excess compression of mesh cells is treated through a noniterative, local spring-force model. This approach allows distinct and separate control over rotational and translational modes. In conclusion, the utility of the new mesh motion algorithm is demonstrated on a number of 3D test problems, including problems that involve both shocks and significant amounts of vorticity.
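
    For a periodic field, the irrotational component used as the mesh velocity can be extracted spectrally: in Fourier space it is simply the projection of the velocity onto the wavevector. The NumPy sketch below does this for an illustrative 2D field; the paper's solver operates on 3D meshes, where the decomposition is computed differently.

      import numpy as np

      # Spectral Helmholtz decomposition of a 2D periodic velocity field:
      # the irrotational (curl-free) part is k (k . u_hat) / |k|^2.

      n = 64
      kx = 2 * np.pi * np.fft.fftfreq(n, d=1.0 / n)
      KX, KY = np.meshgrid(kx, kx, indexing='ij')
      K2 = KX**2 + KY**2
      K2[0, 0] = 1.0                      # avoid division by zero at the mean mode

      rng = np.random.default_rng(0)
      ux, uy = rng.standard_normal((2, n, n))     # illustrative velocity field
      ux_h, uy_h = np.fft.fft2(ux), np.fft.fft2(uy)

      div_h = KX * ux_h + KY * uy_h               # k . u_hat
      wx = np.real(np.fft.ifft2(KX * div_h / K2))
      wy = np.real(np.fft.ifft2(KY * div_h / K2))
      # (wx, wy) is the curl-free part that would serve as the mesh velocity;
      # (ux - wx, uy - wy) is the divergence-free (rotational) remainder.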

  16. Computer modeling of electron and proton transport in chloroplasts.

    PubMed

    Tikhonov, Alexander N; Vershubskii, Alexey V

    2014-07-01

    Photosynthesis is one of the most important biological processes in the biosphere, which provides production of organic substances from atmospheric CO2 and water at the expense of solar energy. In this review, we contemplate computer models of oxygenic photosynthesis in the context of feedback regulation of photosynthetic electron transport in chloroplasts, the energy-transducing organelles of the plant cell. We start with a brief overview of electron and proton transport processes in chloroplasts coupled to ATP synthesis and consider basic regulatory mechanisms of oxygenic photosynthesis. General approaches to computer simulation of photosynthetic processes are considered, including the random walk models of plastoquinone diffusion in thylakoid membranes and the deterministic approach to modeling electron transport in chloroplasts based on the mass action law. Then we focus on a kinetic model of oxygenic photosynthesis that includes key stages of the linear electron transport, alternative pathways of electron transfer around photosystem I (PSI), transmembrane proton transport and ATP synthesis in chloroplasts. This model includes different regulatory processes: pH-dependent control of the intersystem electron transport, down-regulation of photosystem II (PSII) activity (non-photochemical quenching), the light-induced activation of the Bassham-Benson-Calvin (BBC) cycle. The model correctly describes pH-dependent feedback control of electron transport in chloroplasts and adequately reproduces a variety of experimental data on induction events observed under different experimental conditions in intact chloroplasts (variations of CO2 and O2 concentrations in the atmosphere), including the complex kinetics of P700 (primary electron donor in PSI) photooxidation, CO2 consumption in the BBC cycle, and photorespiration. Finally, we describe diffusion-controlled photosynthetic processes in chloroplasts within the framework of the model that takes into account the complex architecture of chloroplasts and the lateral heterogeneity of the lamellar system of thylakoids. The lateral profiles of pH in the thylakoid lumen and in the narrow gap between grana thylakoids have been calculated under different metabolic conditions. Analyzing topological aspects of diffusion-controlled stages of electron and proton transport in chloroplasts, we conclude that along with the NPQ mechanism of attenuation of PSII activity and deceleration of PQH2 oxidation by the cytochrome b6f complex caused by the lumen acidification, the intersystem electron transport may be down-regulated due to the light-induced alkalization of the narrow partition between adjacent thylakoids of grana. The computer models of electron and proton transport described in this article may be integrated as appropriate modules into a comprehensive model of oxygenic photosynthesis. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
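
    In the deterministic, mass-action approach mentioned above, each electron carrier pool contributes an ordinary differential equation of the schematic form below. This is a deliberately tiny, illustrative two-rate example, not a model taken from the review: PSII reduces a plastoquinone-like pool at one rate and PSI reoxidizes it at another.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Mass-action kinetics for one redox pool (illustrative rates):
      # reduction is proportional to the oxidized fraction, oxidation to the
      # reduced fraction.

      k_red, k_ox, P_tot = 2.0, 1.5, 1.0

      def rhs(t, y):
          P_red = y[0]                    # reduced fraction of the pool
          P_ox = P_tot - P_red
          return [k_red * P_ox - k_ox * P_red]

      sol = solve_ivp(rhs, (0.0, 5.0), [0.0], max_step=0.01)
      print(sol.y[0, -1])   # approaches the steady state k_red / (k_red + k_ox)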

  17. RAMP: A fault tolerant distributed microcomputer structure for aircraft navigation and control

    NASA Technical Reports Server (NTRS)

    Dunn, W. R.

    1980-01-01

    RAMP consists of distributed sets of parallel computers partitioned on the basis of software and packaging constraints. To minimize hardware and software complexity, the processors operate asynchronously. It was shown that through the design of asymptotically stable control laws, data errors due to the asynchronism were minimized. It was further shown that by designing control laws with this property and making minor hardware modifications to the RAMP modules, the system became inherently tolerant to intermittent faults. A laboratory version of RAMP was constructed and is described in the paper along with the experimental results.

  18. Quadcopter control in three-dimensional space using a noninvasive motor imagery-based brain-computer interface

    NASA Astrophysics Data System (ADS)

    LaFleur, Karl; Cassady, Kaitlin; Doud, Alexander; Shades, Kaleb; Rogin, Eitan; He, Bin

    2013-08-01

    Objective. At the balanced intersection of human and machine adaptation is found the optimally functioning brain-computer interface (BCI). In this study, we report a novel experiment of BCI controlling a robotic quadcopter in three-dimensional (3D) physical space using noninvasive scalp electroencephalogram (EEG) in human subjects. We then quantify the performance of this system using metrics suitable for asynchronous BCI. Lastly, we examine the impact that the operation of a real world device has on subjects' control in comparison to a 2D virtual cursor task. Approach. Five human subjects were trained to modulate their sensorimotor rhythms to control an AR Drone navigating a 3D physical space. Visual feedback was provided via a forward facing camera on the hull of the drone. Main results. Individual subjects were able to accurately acquire up to 90.5% of all valid targets presented while travelling at an average straight-line speed of 0.69 m s-1. Significance. Freely exploring and interacting with the world around us is a crucial element of autonomy that is lost in the context of neurodegenerative disease. Brain-computer interfaces are systems that aim to restore or enhance a user's ability to interact with the environment via a computer and through the use of only thought. We demonstrate for the first time the ability to control a flying robot in 3D physical space using noninvasive scalp recorded EEG in humans. Our work indicates the potential of noninvasive EEG-based BCI systems to accomplish complex control in 3D physical space. The present study may serve as a framework for the investigation of multidimensional noninvasive BCI control in a physical environment using telepresence robotics.

  19. Quadcopter control in three-dimensional space using a noninvasive motor imagery-based brain-computer interface.

    PubMed

    LaFleur, Karl; Cassady, Kaitlin; Doud, Alexander; Shades, Kaleb; Rogin, Eitan; He, Bin

    2013-08-01

    At the balanced intersection of human and machine adaptation is found the optimally functioning brain-computer interface (BCI). In this study, we report a novel experiment of BCI controlling a robotic quadcopter in three-dimensional (3D) physical space using noninvasive scalp electroencephalogram (EEG) in human subjects. We then quantify the performance of this system using metrics suitable for asynchronous BCI. Lastly, we examine the impact that the operation of a real world device has on subjects' control in comparison to a 2D virtual cursor task. Five human subjects were trained to modulate their sensorimotor rhythms to control an AR Drone navigating a 3D physical space. Visual feedback was provided via a forward facing camera on the hull of the drone. Individual subjects were able to accurately acquire up to 90.5% of all valid targets presented while travelling at an average straight-line speed of 0.69 m s(-1). Freely exploring and interacting with the world around us is a crucial element of autonomy that is lost in the context of neurodegenerative disease. Brain-computer interfaces are systems that aim to restore or enhance a user's ability to interact with the environment via a computer and through the use of only thought. We demonstrate for the first time the ability to control a flying robot in 3D physical space using noninvasive scalp recorded EEG in humans. Our work indicates the potential of noninvasive EEG-based BCI systems to accomplish complex control in 3D physical space. The present study may serve as a framework for the investigation of multidimensional noninvasive BCI control in a physical environment using telepresence robotics.

  20. Optimal control of complex atomic quantum systems

    PubMed Central

    van Frank, S.; Bonneau, M.; Schmiedmayer, J.; Hild, S.; Gross, C.; Cheneau, M.; Bloch, I.; Pichler, T.; Negretti, A.; Calarco, T.; Montangero, S.

    2016-01-01

    Quantum technologies will ultimately require manipulating many-body quantum systems with high precision. Cold atom experiments represent a stepping stone in that direction: a high degree of control has been achieved on systems of increasing complexity. However, this control is still sub-optimal. In many scenarios, achieving a fast transformation is crucial to fight against decoherence and imperfection effects. Optimal control theory is believed to be the ideal candidate to bridge the gap between early stage proof-of-principle demonstrations and experimental protocols suitable for practical applications. Indeed, it can engineer protocols at the quantum speed limit – the fastest achievable timescale of the transformation. Here, we demonstrate such potential by computing theoretically and verifying experimentally the optimal transformations in two very different interacting systems: the coherent manipulation of motional states of an atomic Bose-Einstein condensate and the crossing of a quantum phase transition in small systems of cold atoms in optical lattices. We also show that such processes are robust with respect to perturbations, including temperature and atom number fluctuations. PMID:27725688

  1. Optimal control of complex atomic quantum systems.

    PubMed

    van Frank, S; Bonneau, M; Schmiedmayer, J; Hild, S; Gross, C; Cheneau, M; Bloch, I; Pichler, T; Negretti, A; Calarco, T; Montangero, S

    2016-10-11

    Quantum technologies will ultimately require manipulating many-body quantum systems with high precision. Cold atom experiments represent a stepping stone in that direction: a high degree of control has been achieved on systems of increasing complexity. However, this control is still sub-optimal. In many scenarios, achieving a fast transformation is crucial to fight against decoherence and imperfection effects. Optimal control theory is believed to be the ideal candidate to bridge the gap between early stage proof-of-principle demonstrations and experimental protocols suitable for practical applications. Indeed, it can engineer protocols at the quantum speed limit - the fastest achievable timescale of the transformation. Here, we demonstrate such potential by computing theoretically and verifying experimentally the optimal transformations in two very different interacting systems: the coherent manipulation of motional states of an atomic Bose-Einstein condensate and the crossing of a quantum phase transition in small systems of cold atoms in optical lattices. We also show that such processes are robust with respect to perturbations, including temperature and atom number fluctuations.

  2. Predictor symbology in computer-generated pictorial displays

    NASA Technical Reports Server (NTRS)

    Grunwald, A. J.

    1981-01-01

    The display under investigation is a tunnel display for the four-dimensional commercial aircraft approach to landing under instrument flight rules. It is investigated whether more complex predictive information, such as a three-dimensional perspective vehicle symbol predicting the future vehicle position as well as future vehicle attitude angles, contributes to a better system response, and suitable predictor laws for the predictor motions are formulated. Methods for utilizing the predictor symbol in controlling the forward velocity of the aircraft in four-dimensional approaches are investigated. The simulator tests show that the complex perspective vehicle symbol yields improved damping in the lateral response as compared to a flat two-dimensional predictor cross, but yields generally larger vertical deviations. Methods of using the predictor symbol in controlling the forward velocity of the vehicle are shown to be effective. The tunnel display with superimposed perspective vehicle symbol yields very satisfactory results and pilot acceptance in lateral control but is found to be unsatisfactory in vertical control, as a result of excessively large vertical path-angle deviations.

  3. Improved prescribed performance control for air-breathing hypersonic vehicles with unknown deadzone input nonlinearity.

    PubMed

    Wang, Yingyang; Hu, Jianbo

    2018-05-19

    An improved prescribed performance controller is proposed for the longitudinal model of an air-breathing hypersonic vehicle (AHV) subject to uncertain dynamics and input nonlinearity. Unlike the traditional non-affine model, which requires the non-affine functions to be differentiable, this paper utilizes a semi-decomposed non-affine model whose non-affine functions are locally semi-bounded and possibly non-differentiable. A new error transformation combined with novel prescribed performance functions is proposed to bypass the complex deductions caused by conventional error constraint approaches and to circumvent high frequency chattering in control inputs. On the basis of the backstepping technique, an improved prescribed performance controller with low structural and computational complexity is designed. The methodology keeps the altitude and velocity tracking errors within transient and steady state performance envelopes and presents excellent robustness against uncertain dynamics and deadzone input nonlinearity. Simulation results demonstrate the efficacy of the proposed method. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
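
    The generic machinery behind prescribed performance control can be sketched in a few lines; the paper's exact envelope and transformation differ, so the functions below are a standard textbook form. A decaying envelope ρ(t) bounds the tracking error, and a logarithmic transformation maps the constrained error into an unconstrained variable that a backstepping design can regulate.

      import numpy as np

      # Standard prescribed-performance ingredients (illustrative parameters).
      rho0, rho_inf, l = 2.0, 0.1, 0.5

      def rho(t):
          # performance envelope: starts at rho0, decays exponentially to rho_inf
          return (rho0 - rho_inf) * np.exp(-l * t) + rho_inf

      def transform(e, t, delta=1.0):
          # normalized error z must stay in (-delta, 1) for the log to be finite;
          # the transform maps the constrained error onto all of R
          z = e / rho(t)
          return np.log((z + delta) / (1.0 - z))

      print(transform(0.5, 1.0))  # finite while e(t) stays inside the envelope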

  4. Compact, Reliable EEPROM Controller

    NASA Technical Reports Server (NTRS)

    Katz, Richard; Kleyner, Igor

    2010-01-01

    A compact, reliable controller for an electrically erasable, programmable read-only memory (EEPROM) has been developed specifically for a space-flight application. The design may be adaptable to other applications in which there are requirements for reliability in general and, in particular, for prevention of inadvertent writing of data in EEPROM cells. Inadvertent writes pose risks of loss of reliability in the original space-flight application and could pose such risks in other applications. Prior EEPROM controllers are large and complex and do not provide all reasonable protections (in many cases, few or no protections) against inadvertent writes. In contrast, the present controller provides several layers of protection against inadvertent writes. The controller also incorporates a write-time monitor, enabling determination of trends in the performance of an EEPROM through all phases of testing. The controller has been designed as an integral subsystem of a system that includes not only the controller and the controlled EEPROM aboard a spacecraft but also computers in a ground control station, relatively simple onboard support circuitry, and an onboard communication subsystem that utilizes the MIL-STD-1553B protocol. (MIL-STD-1553B is a military standard that encompasses a method of communication and electrical-interface requirements for digital electronic subsystems connected to a data bus. MIL-STD-1553B is commonly used in defense and space applications.) The intent was to maximize reliability while minimizing the size and complexity of onboard circuitry. In operation, control of the EEPROM is effected via the ground computers, the MIL-STD-1553B communication subsystem, and the onboard support circuitry, all of which, in combination, provide the multiple layers of protection against inadvertent writes. There is no controller software, unlike in many prior EEPROM controllers; software can be a major contributor to unreliability, particularly in fault situations such as the loss of power or brownouts. Protection is also provided by a power-monitoring circuit.

  5. Feedback controlled optics with wavefront compensation

    NASA Technical Reports Server (NTRS)

    Breckenridge, William G. (Inventor); Redding, David C. (Inventor)

    1993-01-01

    The sensitivity model of a complex optical system obtained by linear ray tracing is used to compute a control gain matrix by imposing the mathematical condition for minimizing the total wavefront error at the optical system's exit pupil. The most recent deformations or error states of the controlled segments or optical surfaces of the system are then assembled as an error vector, and the error vector is transformed by the control gain matrix to produce the exact control variables which will minimize the total wavefront error at the exit pupil of the optical system. These exact control variables are then applied to the actuators controlling the various optical surfaces in the system causing the immediate reduction in total wavefront error observed at the exit pupil of the optical system.
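
    One way to read the gain computation described above is as a least-squares pseudoinverse of the linearized sensitivity model, as in the NumPy sketch below. The dimensions and error state are placeholders, and the actual implementation may weight and constrain the problem differently.

      import numpy as np

      # If A maps actuator commands to wavefront changes at the exit pupil
      # (the linearized sensitivity model), G = -pinv(A) yields commands that
      # minimize the residual wavefront error in the least-squares sense.

      A = np.random.randn(50, 6)        # 50 wavefront samples, 6 actuators (illustrative)
      G = -np.linalg.pinv(A)            # control gain matrix (minimum-norm LS)

      w_err = A @ np.array([0.2, -0.1, 0.05, 0.0, 0.03, -0.02])  # current error state
      u = G @ w_err                     # corrective actuator commands
      print(np.linalg.norm(w_err + A @ u))  # ~0: the commands null the error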

  6. Space shuttle main engine controller

    NASA Technical Reports Server (NTRS)

    Mattox, R. M.; White, J. B.

    1981-01-01

    A technical description of the space shuttle main engine controller, which provides engine checkout prior to launch, engine control and monitoring during launch, and engine safety and monitoring in orbit, is presented. Each of the major controller subassemblies (the central processing unit, the computer interface electronics, the input electronics, the output electronics, and the power supplies) is described and discussed in detail along with engine and orbiter interfaces and operational requirements. The controller represents a unique application of digital concepts, techniques, and technology in monitoring, managing, and controlling a high performance rocket engine propulsion system. The operational requirements placed on the controller, the extremely harsh operating environment to which it is exposed, and the reliability demanded result in the most complex and rugged digital system ever designed, fabricated, and flown.

  7. Scalable hybrid computation with spikes.

    PubMed

    Sarpeshkar, Rahul; O'Halloran, Micah

    2002-09-01

    We outline a hybrid analog-digital scheme for computing with three important features that enable it to scale to systems of large complexity: First, like digital computation, which uses several one-bit precise logical units to collectively compute a precise answer to a computation, the hybrid scheme uses several moderate-precision analog units to collectively compute a precise answer to a computation. Second, frequent discrete signal restoration of the analog information prevents analog noise and offset from degrading the computation. And, third, a state machine enables complex computations to be created using a sequence of elementary computations. A natural choice for implementing this hybrid scheme is one based on spikes because spike-count codes are digital, while spike-time codes are analog. We illustrate how spikes afford easy ways to implement all three components of scalable hybrid computation. First, as an important example of distributed analog computation, we show how spikes can create a distributed modular representation of an analog number by implementing digital carry interactions between spiking analog neurons. Second, we show how signal restoration may be performed by recursive spike-count quantization of spike-time codes. And, third, we use spikes from an analog dynamical system to trigger state transitions in a digital dynamical system, which reconfigures the analog dynamical system using a binary control vector; such feedback interactions between analog and digital dynamical systems create a hybrid state machine (HSM). The HSM extends and expands the concept of a digital finite-state-machine to the hybrid domain. We present experimental data from a two-neuron HSM on a chip that implements error-correcting analog-to-digital conversion with the concurrent use of spike-time and spike-count codes. We also present experimental data from silicon circuits that implement HSM-based pattern recognition using spike-time synchrony. We outline how HSMs may be used to perform learning, vector quantization, spike pattern recognition and generation, and how they may be reconfigured.

  8. Hierarchical Ada robot programming system (HARPS)- A complete and working telerobot control system based on the NASREM model

    NASA Technical Reports Server (NTRS)

    Leake, Stephen; Green, Tom; Cofer, Sue; Sauerwein, Tim

    1989-01-01

    HARPS is a telerobot control system that can perform some simple but useful tasks. This capability is demonstrated by performing the ORU exchange demonstration. HARPS is based on NASREM (NASA Standard Reference Model). All software is developed in Ada, and the project incorporates a number of different CASE (computer-aided software engineering) tools. NASREM was found to be a valid and useful model for building a telerobot control system. Its hierarchical and distributed structure creates a natural and logical flow for implementing large complex robust control systems. The ability of Ada to create and enforce abstraction enhanced the implementation of such control systems.

  9. Motion of the two-control airplane in rectilinear flight after initial disturbances with introduction of controls following an exponential law

    NASA Technical Reports Server (NTRS)

    Klemin, Alexander

    1937-01-01

    An airplane in steady rectilinear flight was assumed to experience an initial disturbance in rolling or yawing velocity. The equations of motion were solved to see if it was possible to hasten recovery of a stable airplane or to secure recovery of an unstable airplane by the application of a single lateral control following an exponential law. The sample computations indicate that, for initial disturbances complex in character, it would be difficult to secure correlation with any type of exponential control. The possibility is visualized that the two-control operation may seriously impair the ability to hasten recovery or counteract instability.

  10. Qualitative Case Study Exploring Operational Barriers Impeding Small and Private, Nonprofit Higher Education Institutions from Implementing Information Security Controls

    ERIC Educational Resources Information Center

    Liesen, Joseph J.

    2017-01-01

    The higher education industry uses the very latest technologies to effectively prepare students for their careers, but these technologies often contain vulnerabilities that can be exploited via their connection to the Internet. The complex task of securing information and computing systems is made more difficult at institutions of higher education…

  11. Theoretical Foundations of Software Technology.

    DTIC Science & Technology

    1983-02-14

    major research interests are software testing, artificial intelligence, pattern recognition, and computer graphics. Dr. Chandrasekaran is currently... produce PASCAL language code for the problems. Because of its relationship to many issues in Artificial Intelligence, we also investigated problems of... analysis to concurrent-process software re- are not "intelligent" enough to discover these by themselves, ... more complex control flow models. The PAF

  12. Automated validation of a computer operating system

    NASA Technical Reports Server (NTRS)

    Dervage, M. M.; Milberg, B. A.

    1970-01-01

    Programs apply selected input/output loads to a complex computer operating system and measure the performance of that system under such loads. The technique lends itself to checkout of computer software designed to monitor automated complex industrial systems.

  13. Vibrotactile Feedback for Brain-Computer Interface Operation

    PubMed Central

    Cincotti, Febo; Kauhanen, Laura; Aloise, Fabio; Palomäki, Tapio; Caporusso, Nicholas; Jylänki, Pasi; Mattia, Donatella; Babiloni, Fabio; Vanacker, Gerolf; Nuttin, Marnix; Marciani, Maria Grazia; Millán, José del R.

    2007-01-01

    To be correctly mastered, brain-computer interfaces (BCIs) need an uninterrupted flow of feedback to the user. This feedback is usually delivered through the visual channel. Our aim was to explore the benefits of vibrotactile feedback during users' training and control of EEG-based BCI applications. A protocol for delivering vibrotactile feedback, including specific hardware and software arrangements, was specified. In three studies with 33 subjects (including 3 with spinal cord injury), we compared vibrotactile and visual feedback, addressing: (I) the feasibility of subjects' training to master their EEG rhythms using tactile feedback; (II) the compatibility of this form of feedback in the presence of a visual distracter; (III) the performance in the presence of a complex visual task on the same (visual) or different (tactile) sensory channel. The stimulation protocol we developed supports a general usage of the tactors, as confirmed in preliminary experimentation. All studies indicated that the vibrotactile channel can function as a valuable feedback modality with reliability comparable to the classical visual feedback. Advantages of using vibrotactile feedback emerged when the visual channel was highly loaded by a complex task. In all experiments, vibrotactile feedback felt, after some training, more natural for both controls and SCI users. PMID:18354734

  14. Implementation of data acquisition interface using on-board field-programmable gate array (FPGA) universal serial bus (USB) link

    NASA Astrophysics Data System (ADS)

    Yussup, N.; Ibrahim, M. M.; Lombigit, L.; Rahman, N. A. A.; Zin, M. R. M.

    2014-02-01

    Typically, a system consists of hardware acting as the controller and software installed on a personal computer (PC). In effective nuclear detection, the hardware comprises the detection setup and the electronics used, while the software provides analysis tools and a graphical display on the PC. A data acquisition interface is necessary to enable communication between the controller hardware and the PC. Nowadays, the Universal Serial Bus (USB) has become a standard connection method for computer peripherals and has replaced many varieties of serial and parallel ports. However, implementing USB is complex. This paper describes the implementation of a data acquisition interface between a field-programmable gate array (FPGA) board and a PC by exploiting the USB link of the FPGA board. The USB link is based on an FTDI chip which allows direct access of input and output to the Joint Test Action Group (JTAG) signals from a USB host, and a complex programmable logic device (CPLD) with a 24 MHz clock input to the USB link. The implementation and results of using the USB link of the FPGA board for data interfacing are discussed.
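
    On the PC side, an FTDI-based link commonly enumerates as a virtual COM port, in which case the host software reduces to a simple framed read loop, as in the pyserial sketch below. The port name, start command, and frame format here are hypothetical, and the paper's board drives the FTDI chip through JTAG/CPLD glue logic, so its actual transport details may differ.

      import serial  # pyserial (assumed installed)

      # Hypothetical host-side read loop for an FTDI virtual COM port.
      PORT, FRAME_BYTES = "/dev/ttyUSB0", 4

      with serial.Serial(PORT, baudrate=115200, timeout=1.0) as link:
          link.write(b"\x01")                  # hypothetical 'start acquisition' command
          for _ in range(100):
              frame = link.read(FRAME_BYTES)   # one fixed-size sample frame
              if len(frame) < FRAME_BYTES:
                  break                        # timeout: no more data available
              sample = int.from_bytes(frame, "little")
              print(sample)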

  15. Implementation of data acquisition interface using on-board field-programmable gate array (FPGA) universal serial bus (USB) link

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yussup, N.; Ibrahim, M. M.; Lombigit, L.

    Typically, a system consists of hardware acting as the controller and software installed on a personal computer (PC). In effective nuclear detection, the hardware comprises the detection setup and the electronics used, while the software provides analysis tools and a graphical display on the PC. A data acquisition interface is necessary to enable communication between the controller hardware and the PC. Nowadays, the Universal Serial Bus (USB) has become a standard connection method for computer peripherals and has replaced many varieties of serial and parallel ports. However, implementing USB is complex. This paper describes the implementation of a data acquisition interface between a field-programmable gate array (FPGA) board and a PC by exploiting the USB link of the FPGA board. The USB link is based on an FTDI chip which allows direct access of input and output to the Joint Test Action Group (JTAG) signals from a USB host, and a complex programmable logic device (CPLD) with a 24 MHz clock input to the USB link. The implementation and results of using the USB link of the FPGA board for data interfacing are discussed.

  16. Data based identification and prediction of nonlinear and complex dynamical systems

    NASA Astrophysics Data System (ADS)

    Wang, Wen-Xu; Lai, Ying-Cheng; Grebogi, Celso

    2016-07-01

    The problem of reconstructing nonlinear and complex dynamical systems from measured data or time series is central to many scientific disciplines including physical, biological, computer, and social sciences, as well as engineering and economics. The classic approach to phase-space reconstruction through the methodology of delay-coordinate embedding has been practiced for more than three decades, but the paradigm is effective mostly for low-dimensional dynamical systems. Often, the methodology yields only a topological correspondence of the original system. There are situations in various fields of science and engineering where the systems of interest are complex and high dimensional with many interacting components. A complex system typically exhibits a rich variety of collective dynamics, and it is of great interest to be able to detect, classify, understand, predict, and control the dynamics using data that are becoming increasingly accessible due to the advances of modern information technology. To accomplish these goals, especially prediction and control, an accurate reconstruction of the original system is required. Nonlinear and complex systems identification aims at inferring, from data, the mathematical equations that govern the dynamical evolution and the complex interaction patterns, or topology, among the various components of the system. With successful reconstruction of the system equations and the connecting topology, it may be possible to address challenging and significant problems such as identification of causal relations among the interacting components and detection of hidden nodes. The "inverse" problem thus presents a grand challenge, requiring new paradigms beyond the traditional delay-coordinate embedding methodology. The past fifteen years have witnessed rapid development of contemporary complex graph theory with broad applications in interdisciplinary science and engineering. The combination of graph, information, and nonlinear dynamical systems theories with tools from statistical physics, optimization, engineering control, applied mathematics, and scientific computing enables the development of a number of paradigms to address the problem of nonlinear and complex systems reconstruction. In this Review, we describe the recent advances in this forefront and rapidly evolving field, with a focus on compressive sensing based methods. In particular, compressive sensing is a paradigm developed in recent years in applied mathematics, electrical engineering, and nonlinear physics to reconstruct sparse signals using only limited data. It has broad applications ranging from image compression/reconstruction to the analysis of large-scale sensor networks, and it has become a powerful technique to obtain high-fidelity signals for applications where sufficient observations are not available. We will describe in detail how compressive sensing can be exploited to address a diverse array of problems in data based reconstruction of nonlinear and complex networked systems. The problems include identification of chaotic systems and prediction of catastrophic bifurcations, forecasting future attractors of time-varying nonlinear systems, reconstruction of complex networks with oscillatory and evolutionary game dynamics, detection of hidden nodes, identification of chaotic elements in neuronal networks, reconstruction of complex geospatial networks and nodal positioning, and reconstruction of complex spreading networks with binary data.. 
A number of alternative methods, such as those based on system response to external driving, synchronization, and noise-induced dynamical correlation, will also be discussed. Due to the high relevance of network reconstruction to biological sciences, a special section is devoted to a brief survey of the current methods to infer biological networks. Finally, a number of open problems including control and controllability of complex nonlinear dynamical networks are discussed. The methods outlined in this Review are based on various concepts in complexity science and engineering such as phase transitions, bifurcations, stabilities, and robustness. The methodologies have the potential to significantly improve our ability to understand a variety of complex dynamical systems ranging from gene regulatory systems to social networks toward the ultimate goal of controlling such systems.
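
    As a concrete illustration of the compressive-sensing idea in this setting, the following minimal sketch (not taken from the Review; the toy system, basis size, and regularization weight are all illustrative) assumes the governing equation of a scalar system is sparse in a monomial basis and recovers the few nonzero coefficients from limited measurements with an L1-regularized (Lasso) fit:

        import numpy as np
        from sklearn.linear_model import Lasso

        rng = np.random.default_rng(0)

        # Hypothetical sparse system: dx/dt = 1.5*x - 0.8*x**3.
        x = rng.uniform(-1.5, 1.5, size=30)          # limited measurements
        dxdt = 1.5 * x - 0.8 * x**3                  # measured derivatives

        # Dictionary of candidate terms (monomials up to degree 5).
        library = np.column_stack([x**k for k in range(6)])

        # The L1 penalty drives the dense candidate set toward a sparse fit.
        model = Lasso(alpha=1e-3, fit_intercept=False, max_iter=50000)
        model.fit(library, dxdt)
        for k, c in enumerate(model.coef_):
            if abs(c) > 1e-2:
                print(f"term x^{k}: coefficient {c:+.3f}")

    With only 30 samples, the fit should return coefficients near +1.5 for x^1 and -0.8 for x^3, with the remaining terms shrunk toward zero; exploiting exactly this sparsity with far fewer measurements than unknowns is what the compressive-sensing formulation formalizes.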

  17. Technical Development and Application of Soft Computing in Agricultural and Biological Engineering

    USDA-ARS's Scientific Manuscript database

    Soft computing is a set of “inexact” computing techniques, which are able to model and analyze very complex problems. For these complex problems, more conventional methods have not been able to produce cost-effective, analytical, or complete solutions. Soft computing has been extensively studied and...

  18. Development of Soft Computing and Applications in Agricultural and Biological Engineering

    USDA-ARS's Scientific Manuscript database

    Soft computing is a set of “inexact” computing techniques, which are able to model and analyze very complex problems. For these complex problems, more conventional methods have not been able to produce cost-effective, analytical, or complete solutions. Soft computing has been extensively studied and...

  19. Does the nervous system use equilibrium-point control to guide single and multiple joint movements?

    PubMed

    Bizzi, E; Hogan, N; Mussa-Ivaldi, F A; Giszter, S

    1992-12-01

    The hypothesis that the central nervous system (CNS) generates movement as a shift of the limb's equilibrium posture has been corroborated experimentally in studies involving single- and multijoint motions. Posture may be controlled through the choice of muscle length-tension curves that set agonist-antagonist torque-angle curves, determining an equilibrium position for the limb and the stiffness about the joints. Arm trajectories seem to be generated through a control signal defining a series of equilibrium postures. The equilibrium-point hypothesis drastically simplifies the requisite computations for multijoint movements and mechanical interactions with complex dynamic objects in the environment. Because the neuromuscular system is springlike, the instantaneous difference between the arm's actual position and the equilibrium position specified by the neural activity can generate the requisite torques, avoiding the complex "inverse dynamic" problem of computing the torques at the joints. The hypothesis provides a simple, unified description of posture and movement as well as contact control task performance, in which the limb must exert force stably and do work on objects in the environment. The latter is a surprisingly difficult problem, as robotic experience has shown. The prior evidence for the hypothesis came mainly from psychophysical and behavioral experiments. Our recent work has shown that microstimulation of the frog spinal cord's premotoneural network produces leg movements to various positions in the frog's motor space. The hypothesis can now be investigated in the neurophysiological machinery of the spinal cord.
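
    The spring-like control idea admits a compact numerical illustration. The following sketch (not from the paper; stiffness, damping, and inertia values are arbitrary) moves a single joint by gradually shifting the commanded equilibrium angle, with torque generated purely by the difference between the actual and equilibrium angles and no inverse dynamics computed:

        # Single joint under equilibrium-point control.
        K, B, I = 5.0, 1.0, 0.1          # stiffness, damping, limb inertia
        dt, T = 0.001, 1.0
        theta, omega = 0.0, 0.0          # initial joint angle and velocity

        for step in range(int(T / dt)):
            t = step * dt
            theta_eq = min(t / 0.5, 1.0)                  # equilibrium ramped over 0.5 s
            torque = K * (theta_eq - theta) - B * omega   # spring-like muscle behavior
            omega += (torque / I) * dt
            theta += omega * dt

        print(f"final angle {theta:.3f} rad (commanded equilibrium 1.000 rad)")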

  20. Computational quantum-classical boundary of noisy commuting quantum circuits

    PubMed Central

    Fujii, Keisuke; Tamate, Shuhei

    2016-01-01

    It is often said that the transition from the quantum to the classical world is caused by decoherence originating from an interaction between a system of interest and its surrounding environment. Here we establish a computational quantum-classical boundary from the viewpoint of classical simulatability of a quantum system under decoherence. Specifically, we consider commuting quantum circuits subject to decoherence; equivalently, we can regard them as measurement-based quantum computation on decohered weighted graph states. To show intractability of classical simulation on the quantum side, we utilize the postselection argument and crucially strengthen it by taking the noise effect into account. Classical simulatability on the classical side is also shown constructively by using both separability criteria in a projected-entangled-pair-state picture and the Gottesman-Knill theorem for mixed-state Clifford circuits. We found that when each qubit is subject to single-qubit completely-positive-trace-preserving noise, the computational quantum-classical boundary is sharply given by the noise rate required for the distillability of a magic state. The obtained quantum-classical boundary of noisy quantum dynamics reveals a complexity landscape of controlled quantum systems. This paves the way toward an experimentally feasible verification of quantum mechanics in a high-complexity limit beyond the classically simulatable region. PMID:27189039
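
    A small numerical check makes the boundary criterion concrete. The sketch below is an illustration, not the authors' construction: it applies a single-qubit depolarizing channel (one convenient CPTP noise) to an H-type magic state and compares the resulting fidelity with the known distillability threshold (1 + 1/sqrt(2))/2 for that state family:

        import numpy as np

        theta = np.pi / 8
        psi = np.array([np.cos(theta), np.sin(theta)])   # H-type magic state
        rho = np.outer(psi, psi.conj())

        def depolarize(rho, p):
            # Depolarizing channel: rho -> (1 - p) * rho + p * I/2.
            return (1 - p) * rho + p * np.eye(2) / 2

        threshold = (1 + 1 / np.sqrt(2)) / 2             # ~0.854
        for p in (0.1, 0.3, 0.5):
            fid = np.real(psi.conj() @ depolarize(rho, p) @ psi)
            print(f"p={p:.1f}: fidelity {fid:.3f}, distillable: {fid > threshold}")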

  1. Computational quantum-classical boundary of noisy commuting quantum circuits.

    PubMed

    Fujii, Keisuke; Tamate, Shuhei

    2016-05-18

    It is often said that the transition from the quantum to the classical world is caused by decoherence originating from an interaction between a system of interest and its surrounding environment. Here we establish a computational quantum-classical boundary from the viewpoint of classical simulatability of a quantum system under decoherence. Specifically, we consider commuting quantum circuits subject to decoherence; equivalently, we can regard them as measurement-based quantum computation on decohered weighted graph states. To show intractability of classical simulation on the quantum side, we utilize the postselection argument and crucially strengthen it by taking the noise effect into account. Classical simulatability on the classical side is also shown constructively by using both separability criteria in a projected-entangled-pair-state picture and the Gottesman-Knill theorem for mixed-state Clifford circuits. We found that when each qubit is subject to single-qubit completely-positive-trace-preserving noise, the computational quantum-classical boundary is sharply given by the noise rate required for the distillability of a magic state. The obtained quantum-classical boundary of noisy quantum dynamics reveals a complexity landscape of controlled quantum systems. This paves the way toward an experimentally feasible verification of quantum mechanics in a high-complexity limit beyond the classically simulatable region.

  2. Computational quantum-classical boundary of noisy commuting quantum circuits

    NASA Astrophysics Data System (ADS)

    Fujii, Keisuke; Tamate, Shuhei

    2016-05-01

    It is often said that the transition from the quantum to the classical world is caused by decoherence originating from an interaction between a system of interest and its surrounding environment. Here we establish a computational quantum-classical boundary from the viewpoint of classical simulatability of a quantum system under decoherence. Specifically, we consider commuting quantum circuits subject to decoherence; equivalently, we can regard them as measurement-based quantum computation on decohered weighted graph states. To show intractability of classical simulation on the quantum side, we utilize the postselection argument and crucially strengthen it by taking the noise effect into account. Classical simulatability on the classical side is also shown constructively by using both separability criteria in a projected-entangled-pair-state picture and the Gottesman-Knill theorem for mixed-state Clifford circuits. We found that when each qubit is subject to single-qubit completely-positive-trace-preserving noise, the computational quantum-classical boundary is sharply given by the noise rate required for the distillability of a magic state. The obtained quantum-classical boundary of noisy quantum dynamics reveals a complexity landscape of controlled quantum systems. This paves the way toward an experimentally feasible verification of quantum mechanics in a high-complexity limit beyond the classically simulatable region.

  3. Essentials and Perspectives of Computational Modelling Assistance for CNS-oriented Nanoparticle-based Drug Delivery Systems.

    PubMed

    Kisała, Joanna; Heclik, Kinga I; Pogocki, Krzysztof; Pogocki, Dariusz

    2018-05-16

    The blood-brain barrier (BBB) is a complex system controlling two-way traffic of substances between the circulatory (cardiovascular) system and the central nervous system (CNS). It is almost perfectly crafted to regulate brain homeostasis and to permit selective transport of molecules that are essential for brain function. For potential drug candidates, the CNS-oriented neuropharmaceuticals as well as those with primary targets in the periphery, the extent to which a substance in the circulation gains access to the CNS seems crucial. With the advent of nanopharmacology, the problem of BBB permeability for drug nano-carriers gains new significance. Compared to some other fields of medicinal chemistry, the computational science of nanodelivery is still too immature to offer black-box-type solutions, especially for the BBB case. However, even its enormous complexity can be spelled out in terms of physical principles and, as such, subjected to computation. A basic understanding of the various physico-chemical parameters describing brain uptake is required to take advantage of them for BBB nanodelivery. This mini-review provides a sketchy introduction to the essential concepts allowing the application of computational simulation to BBB nanodelivery design.

  4. Measurement of the complex transmittance of large optical elements with Ptychographical Iterative Engine.

    PubMed

    Wang, Hai-Yan; Liu, Cheng; Veetil, Suhas P; Pan, Xing-Chen; Zhu, Jian-Qiang

    2014-01-27

    Wavefront control is a significant concern in inertial confinement fusion (ICF). The complex transmittance of the large optical elements often used in ICF is obtained by computing the phase difference between the illuminating and transmitted fields using the Ptychographical Iterative Engine (PIE). This can accurately and effectively measure the transmittance of large optical elements with irregular surface profiles, which are otherwise not measurable with commonly used interferometric techniques due to the lack of a standard reference plate. Experiments are done with a Continuous Phase Plate (CPP) to illustrate the feasibility of this method.
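
    Once PIE has recovered both fields, the complex transmittance itself is a pointwise ratio. The sketch below uses synthetic stand-in fields to show only the quantity being computed; it does not implement the PIE iterations:

        import numpy as np

        ny, nx = 128, 128
        yy, xx = np.mgrid[0:ny, 0:nx]
        illumination = np.exp(1j * 0.01 * (xx**2 + yy**2) / nx)   # curved probe field
        plate_phase = 0.5 * np.sin(2 * np.pi * xx / 32)           # irregular surface
        exit_field = illumination * 0.95 * np.exp(1j * plate_phase)

        transmittance = exit_field / illumination     # complex transmittance
        print(f"mean amplitude {np.abs(transmittance).mean():.3f}, "
              f"phase rms {np.angle(transmittance).std():.3f} rad")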

  5. Coordinating complex problem-solving among distributed intelligent agents

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.

    1992-01-01

    A process-oriented control model is described for distributed problem solving. The model coordinates the transfer and manipulation of information across independent networked applications, both intelligent and conventional. The model was implemented using SOCIAL, a set of object-oriented tools for distributed computing. Complex sequences of distributed tasks are specified in terms of high-level scripts. Scripts are executed by SOCIAL objects called Manager Agents, which realize an intelligent coordination model that routes individual tasks to suitable server applications across the network. These tools are illustrated in a prototype distributed system for decision support of ground operations for NASA's Space Shuttle fleet.

  6. Controllability of Surface Water Networks

    NASA Astrophysics Data System (ADS)

    Riasi, M. Sadegh; Yeghiazarian, Lilit

    2017-12-01

    To sustainably manage water resources, we must understand how to control complex networked systems. In this paper, we study surface water networks from the perspective of structural controllability, a concept that integrates classical control theory with graph-theoretic formalism. We present structural controllability theory and compute four metrics: full and target controllability, control centrality and control profile (FTCP) that collectively determine the structural boundaries of the system's control space. We use these metrics to answer the following questions: How does the structure of a surface water network affect its controllability? How to efficiently control a preselected subset of the network? Which nodes have the highest control power? What types of topological structures dominate controllability? Finally, we demonstrate the structural controllability theory in the analysis of a wide range of surface water networks, such as tributary, deltaic, and braided river systems.
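
    Full structural controllability is commonly computed with the maximum-matching criterion of Liu et al.: the minimum number of driver nodes equals the number of nodes left unmatched by a maximum matching of the network's bipartite representation. The sketch below applies that generic criterion (it is not the authors' code) to an illustrative five-node directed network using networkx:

        import networkx as nx
        from networkx.algorithms import bipartite

        edges = [("A", "B"), ("B", "C"), ("C", "D"), ("B", "E")]   # u influences v
        nodes = {"A", "B", "C", "D", "E"}

        # Bipartite representation: an out-copy and an in-copy of every node.
        G = nx.Graph()
        G.add_nodes_from((f"{n}+" for n in nodes), bipartite=0)
        G.add_nodes_from((f"{n}-" for n in nodes), bipartite=1)
        G.add_edges_from((f"{u}+", f"{v}-") for u, v in edges)

        matching = bipartite.maximum_matching(G, top_nodes={f"{n}+" for n in nodes})
        matched = len(matching) // 2          # the dict stores both directions
        print("minimum number of driver nodes:", max(len(nodes) - matched, 1))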

  7. Laminar and orientation-dependent characteristics of spatial nonlinearities: implications for the computational architecture of visual cortex.

    PubMed

    Victor, Jonathan D; Mechler, Ferenc; Ohiorhenuan, Ifije; Schmid, Anita M; Purpura, Keith P

    2009-12-01

    A full understanding of the computations performed in primary visual cortex is an important yet elusive goal. Receptive field models consisting of cascades of linear filters and static nonlinearities may be adequate to account for responses to simple stimuli such as gratings and random checkerboards, but their predictions of responses to complex stimuli such as natural scenes are only approximately correct. It is unclear whether these discrepancies are limited to quantitative inaccuracies that reflect well-recognized mechanisms such as response normalization, gain controls, and cross-orientation suppression or, alternatively, imply additional qualitative features of the underlying computations. To address this question, we examined responses of V1 and V2 neurons in the monkey and area 17 neurons in the cat to two-dimensional Hermite functions (TDHs). TDHs are intermediate in complexity between traditional analytic stimuli and natural scenes and have mathematical properties that facilitate their use to test candidate models. By exploiting these properties, along with the laminar organization of V1, we identify qualitative aspects of neural computations beyond those anticipated from the above-cited model framework. Specifically, we find that V1 neurons receive signals from orientation-selective mechanisms that are highly nonlinear: they are sensitive to phase correlations, not just spatial frequency content. That is, the behavior of V1 neurons departs from that of linear-nonlinear cascades with standard modulatory mechanisms in a qualitative manner: even relatively simple stimuli evoke responses that imply complex spatial nonlinearities. The presence of these findings in the input layers suggests that these nonlinearities act in a feedback fashion.

  8. Handheld Devices with Wide-Area Wireless Connectivity: Applications in Astronomy Educational Technology and Remote Computational Control

    NASA Astrophysics Data System (ADS)

    Budiardja, R. D.; Lingerfelt, E. J.; Guidry, M. W.

    2003-05-01

    Wireless technology implemented with handheld devices has attractive features because of the potential to access large amounts of data and the prospect of on-the-fly computational analysis from a device that can be carried in a shirt pocket. We shall describe applications of such technology to the general paradigm of making digital wireless connections from the field to upload information and queries to network servers, executing (potentially complex) programs and controlling data analysis and/or database operations on fast network computers, and returning real-time information from this analysis to the handheld device in the field. As illustration, we shall describe several client/server programs that we have written for applications in teaching introductory astronomy. For example, one program allows static and dynamic properties of astronomical objects to be accessed in a remote observation laboratory setting using a digital cell phone or PDA. Another implements interactive quizzing over a cell phone or PDA using a 700-question introductory astronomy quiz database, thus permitting students to study for astronomy quizzes in any environment in which they have a few free minutes and a digital cell phone or wireless PDA. Another allows one to control and monitor a computation done on a Beowulf cluster by changing the parameters of the computation remotely and retrieving the result when the computation is done. The presentation will include hands-on demonstrations with real devices. *Managed by UT-Battelle, LLC, for the U.S. Department of Energy under contract DE-AC05-00OR22725.

  9. A study on special test stand of automatic and manual descent control in presence of simulated g-load effect

    NASA Astrophysics Data System (ADS)

    Glazkov, Yury; Artjuchin, Yury; Astakhov, Alexander; Vas'kov, Alexander; Malyshev, Veniamin; Mitroshin, Edward; Glinsky, Valery; Moiseenko, Vasily; Makovlev, Vyacheslav

    The development of aircraft-type reusable space vehicles (RSV) involves the problem of complete compatibility among automatic, director, and manual control. The task is complicated, in particular, by considerable quantitative and qualitative changes in vehicle dynamic characteristics, small stability margins (or even instability) of the RSV, and stringent requirements on control accuracy in some flight phases. Besides, during control the pilot is affected by g-loads, which hamper motor activity and degrade its accuracy, alter the functional status of the visual analyser, and influence higher nervous activity. A study of g-load effects on control efficiency, especially in the manual and director modes, is therefore of primary importance. The main tools for studying the rational selection of manual and director vehicle control systems, and for formulating recommendations for optimum crew-automatic control system interaction, are special complex and functional flight simulator test stands. The proposed simulator stand includes a powerful digital computer complex combined with the control system of a centrifuge. The interior of a pilot's vehicle cabin is imitated. A situation image system, a psycho-physical monitoring system, and physician, centrifuge operator, and instructor stations are linked with the test stand.

  10. Robust Decision Making: The Cognitive and Computational Modeling of Team Problem Solving for Decision Making under Complex and Dynamic Conditions

    DTIC Science & Technology

    2015-07-14

    AFRL-OSR-VA-TR-2015-0202; grant FA9550-12-1... [Garbled report-form extraction; recoverable abstract fragment:] ...team functioning as they solve complex problems, and propose the means to improve the performance of teams, under changing or adversarial conditions. By ...

  11. An adaptive Cartesian control scheme for manipulators

    NASA Technical Reports Server (NTRS)

    Seraji, H.

    1987-01-01

    An adaptive control scheme for direct control of manipulator end-effectors to achieve trajectory tracking in Cartesian space is developed. The control structure is obtained from linear multivariable theory and is composed of simple feedforward and feedback controllers and an auxiliary input. The direct adaptation laws are derived from model reference adaptive control theory and are not based on parameter estimation of the robot model. The utilization of feedforward control and the inclusion of the auxiliary input are novel features of the present scheme and result in improved dynamic performance over existing adaptive control schemes. The adaptive controller does not require the complex mathematical model of the robot dynamics or any knowledge of the robot parameters or the payload, and is computationally fast for online implementation with high sampling rates.
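
    The flavor of such direct adaptation can be conveyed with a one-degree-of-freedom analogue. The sketch below is not the scheme of the paper: it is the classic Lyapunov-rule model-reference adaptive controller for a first-order plant, in which feedforward and feedback gains are adjusted from the tracking error alone, with no estimation of the plant parameters a and b:

        a, b = -2.0, 3.0            # unknown plant: dx/dt = a*x + b*u
        am, bm = -4.0, 4.0          # reference model: dxm/dt = am*xm + bm*r
        gamma, dt = 5.0, 0.001      # adaptation gain, time step

        x = xm = 0.0
        kr = kx = 0.0               # adaptive feedforward / feedback gains
        for step in range(int(10.0 / dt)):
            r = 1.0 if (step * dt) % 4 < 2 else -1.0   # square-wave setpoint
            u = kr * r + kx * x
            e = x - xm
            kr += -gamma * e * r * dt   # Lyapunov-rule updates
            kx += -gamma * e * x * dt   # (only the sign of b is assumed known)
            x += (a * x + b * u) * dt
            xm += (am * xm + bm * r) * dt

        print(f"final tracking error {x - xm:+.5f}")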

  12. Fast Dynamic Simulation-Based Small Signal Stability Assessment and Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Acharya, Naresh; Baone, Chaitanya; Veda, Santosh

    2014-12-31

    Power grid planning and operation decisions are made based on simulation of the dynamic behavior of the system. Enabling substantial energy savings while increasing the reliability of the aging North American power grid through improved utilization of existing transmission assets hinges on the adoption of wide-area measurement systems (WAMS) for power system stabilization. However, adoption of WAMS alone will not suffice if the power system is to reach its full entitlement in stability and reliability. It is necessary to enhance predictability with "faster than real-time" dynamic simulations that will enable assessment of dynamic stability margins, support proactive real-time control, and improve grid resiliency to fast time-scale phenomena such as cascading network failures. Present-day dynamic simulations are performed only during offline planning studies, considering only worst-case conditions such as summer peak days, winter peak days, etc. With widespread deployment of renewable generation, controllable loads, energy storage devices and plug-in hybrid electric vehicles expected in the near future and greater integration of cyber infrastructure (communications, computation and control), monitoring and controlling the dynamic performance of the grid in real time would become increasingly important. The state-of-the-art dynamic simulation tools have limited computational speed and are not suitable for real-time applications, given the large set of contingency conditions to be evaluated. These tools are optimized for best performance on single-processor computers, but the simulation is still several times slower than real time due to its computational complexity. With recent significant advances in numerical methods and computational hardware, the expectations have been rising towards more efficient and faster techniques to be implemented in power system simulators. This is a natural expectation, given that the core solution algorithms of most commercial simulators were developed decades ago, when High Performance Computing (HPC) resources were not commonly available.

  13. Decomposed multidimensional control grid interpolation for common consumer electronic image processing applications

    NASA Astrophysics Data System (ADS)

    Zwart, Christine M.; Venkatesan, Ragav; Frakes, David H.

    2012-10-01

    Interpolation is an essential and broadly employed function of signal processing. Accordingly, considerable development has focused on advancing interpolation algorithms toward optimal accuracy. Such development has motivated a clear shift in the state of the art from classical interpolation to more intelligent and resourceful approaches, registration-based interpolation for example. As a natural result, many of the most accurate current algorithms are highly complex, specific, and computationally demanding. However, the diverse hardware destinations for interpolation algorithms present unique constraints that often preclude use of the most accurate available options. For example, while computationally demanding interpolators may be suitable for highly equipped image processing platforms (e.g., computer workstations and clusters), only more efficient interpolators may be practical for less well equipped platforms (e.g., smartphones and tablet computers). The latter examples of consumer electronics present a design tradeoff in this regard: high accuracy interpolation benefits the consumer experience but computing capabilities are limited. It follows that interpolators with favorable combinations of accuracy and efficiency are of great practical value to the consumer electronics industry. We address multidimensional interpolation-based image processing problems that are common to consumer electronic devices through a decomposition approach. The multidimensional problems are first broken down into multiple, independent, one-dimensional (1-D) interpolation steps that are then executed with a newly modified registration-based one-dimensional control grid interpolator. The proposed approach, decomposed multidimensional control grid interpolation (DMCGI), combines the accuracy of registration-based interpolation with the simplicity, flexibility, and computational efficiency of a 1-D interpolation framework. Results demonstrate that DMCGI provides improved interpolation accuracy (and other benefits) in image resizing, color sample demosaicing, and video deinterlacing applications, at a computational cost that is manageable or reduced in comparison to popular alternatives.
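
    The decomposition structure itself is simple to show. In the sketch below, a 2-D resize is performed as two independent 1-D passes (rows, then columns); plain linear interpolation stands in for the registration-based 1-D control grid interpolator that DMCGI actually uses, so only the decomposition, not the accuracy benefit, is illustrated:

        import numpy as np

        def interp_axis(img, new_len, axis):
            # One 1-D interpolation pass along the chosen axis.
            old = np.linspace(0.0, 1.0, img.shape[axis])
            new = np.linspace(0.0, 1.0, new_len)
            return np.apply_along_axis(lambda line: np.interp(new, old, line), axis, img)

        image = np.random.default_rng(1).random((120, 160))
        upscaled = interp_axis(interp_axis(image, 240, axis=0), 320, axis=1)
        print(image.shape, "->", upscaled.shape)    # (120, 160) -> (240, 320)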

  14. A Computer Story: Complexity from Simplicity

    ERIC Educational Resources Information Center

    DeLeo, Gary; Weidenhammer, Amanda; Wecht, Kristen

    2012-01-01

    In this technological age, digital devices are conspicuous examples of extraordinary complexity. When a user clicks on computer icons or presses calculator buttons, these devices channel electricity through a complex system of decision-making circuits. Yet, in spite of this remarkable complexity, the hearts of these devices are components that…

  15. Dicopper(II) metallacyclophanes as multifunctional magnetic devices: a joint experimental and computational study.

    PubMed

    Castellano, María; Ruiz-García, Rafael; Cano, Joan; Ferrando-Soria, Jesús; Pardo, Emilio; Fortea-Pérez, Francisco R; Stiriba, Salah-Eddine; Julve, Miguel; Lloret, Francesc

    2015-03-17

    Metallosupramolecular complexes constitute an important advance in the emerging fields of molecular spintronics and quantum computation and a useful platform in the development of active components of spintronic circuits and quantum computers for applications in information processing and storage. The external control of chemical reactivity (electro- and photochemical) and physical properties (electronic and magnetic) in metallosupramolecular complexes is a current challenge in supramolecular coordination chemistry, which lies at the interface of several other supramolecular disciplines, including electro-, photo-, and magnetochemistry. The specific control of current flow or spin delocalization through a molecular assembly in response to one or many input signals leads to the concept of developing a molecule-based spintronics that can be viewed as a potential alternative to the classical molecule-based electronics. A great variety of factors can influence these electronically or magnetically coupled, metallosupramolecular complexes in a reversible manner, electronic or photonic external stimuli being the most promising ones. The response ability of the metal centers and/or the organic bridging ligands to the application of an electric field or light irradiation, together with the geometrical features that allow the precise positioning in space of substituent groups, make these metal-organic systems particularly suitable to build highly integrated molecular spintronic circuits. In this Account, we describe the chemistry and physics of dinuclear copper(II) metallacyclophanes with oxamato-containing dinucleating ligands featuring redox- and photoactive aromatic spacers. Our recent works on dicopper(II) metallacyclophanes and earlier ones on related organic cyclophanes are now compared in a critical manner. Special focus is placed on the ligand design as well as on the combination of experimental and computational methods to demonstrate the multifunctional nature of these metallosupramolecular complexes. This new class of oxamato-based dicopper(II) metallacyclophanes affords an excellent synthetic and theoretical set of models for both chemical and physical fundamental studies on redox- and photo-triggered, long-distance electron exchange phenomena, which are two major topics in molecular magnetism and molecular electronics. Apart from their use as ground tests for the fundamental research on the relative importance of the spin delocalization and spin polarization mechanisms of the electron exchange interaction through extended π-conjugated aromatic ligands in polymetallic complexes, oxamato-based dicopper(II) metallacyclophanes possessing spin-containing electro- and chromophores at the metal and/or the ligand counterparts emerge as potentially active (magnetic and electronic) molecular components to build a metal-based spintronic circuit. They are thus unique examples of multifunctional magnetic complexes for obtaining single-molecule spintronic devices by controlling and allowing the spin communication, when serving as molecular magnetic couplers and wires, or by exhibiting bistable spin behavior, when acting as molecular magnetic rectifiers and switches. Oxamato-based dicopper(II) metallacyclophanes also emerge as potential candidates for the study of coherent electron transport through single molecules, both experimentally and theoretically.
The results presented herein, which are a first step in the metallosupramolecular approach to molecular spintronics, are intended to attract the attention of physicists and materials scientists with large expertise in the manipulation and measurement of single-molecule electron transport properties, as well as in the processing and addressing of molecules on different supports.

  16. Intelligent control system based on ARM for lithography tool

    NASA Astrophysics Data System (ADS)

    Chen, Changlong; Tang, Xiaoping; Hu, Song; Wang, Nan

    2014-08-01

    The control system of a traditional lithography tool is based on a PC and an MCU. The PC handles the complex algorithms and human-computer interaction and communicates with the MCU via a serial port; the MCU controls motors, electromagnetic valves, etc. This mode has shortcomings such as large volume, high power consumption, and wasted PC resources. In this paper, an embedded intelligent control system for a lithography tool, based on ARM, is presented. The control system uses an S5PV210 as its processor, taking over the functions of the PC in a traditional lithography tool, and provides good human-computer interaction through an LCD and a capacitive touch screen. Using Android 4.0.3 as the operating system, the equipment provides a clean and easy UI that makes control more user-friendly, and implements remote control and debugging, pushing product video information over the network. As a result, it is convenient for the equipment vendor to provide technical support to users. Finally, compared with a traditional lithography tool, this design eliminates the PC, uses hardware resources efficiently, and reduces cost and volume. Introducing an embedded OS and "Internet of Things" concepts into lithography tool design is a likely development trend.

  17. Dynamic emulation modelling for the optimal operation of water systems: an overview

    NASA Astrophysics Data System (ADS)

    Castelletti, A.; Galelli, S.; Giuliani, M.

    2014-12-01

    Despite a sustained increase in computing power over recent decades, computational limitations remain a major barrier to the effective and systematic use of large-scale, process-based simulation models in rational environmental decision-making. Whereas complex models may provide clear advantages when the goal of the modelling exercise is to enhance our understanding of the natural processes, they introduce problems of model identifiability caused by over-parameterization and suffer from high computational burden when used in management and planning problems. As a result, increasing attention is now being devoted to emulation modelling (or model reduction) as a way of overcoming these limitations. An emulation model, or emulator, is a low-order approximation of the process-based model that can be substituted for it in order to solve highly resource-demanding problems. In this talk, an overview of emulation modelling within the context of the optimal operation of water systems will be provided. Particular emphasis will be given to Dynamic Emulation Modelling (DEMo), a special type of model complexity reduction in which the dynamic nature of the original process-based model is preserved, with consequent advantages in a wide range of problems, particularly feedback control problems. This will be contrasted with traditional non-dynamic emulators (e.g. response surface and surrogate models) that have been studied extensively in recent years and are mainly used for planning purposes. A number of real-world numerical experiences will be used to support the discussion, ranging from multi-outlet water quality control in water reservoirs, through erosion/sedimentation rebalancing in the operation of run-of-river power plants, to salinity control in lakes and reservoirs.
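
    The dynamic-emulation idea can be sketched in a few lines. Below, with all names and equations illustrative, a nonlinear recursion stands in for the process-based simulator, and a low-order linear emulator x[t+1] ~ w . (x[t], u[t]) is fitted to its trajectories, preserving the state-to-state structure that feedback-control problems require:

        import numpy as np

        rng = np.random.default_rng(2)

        def process_model(x, u):            # stand-in for the full simulator
            return 0.9 * x + 0.2 * np.tanh(x) + 0.5 * u

        u = rng.uniform(-1, 1, 500)         # exogenous input (e.g. a release decision)
        x = np.zeros(501)
        for t in range(500):
            x[t + 1] = process_model(x[t], u[t])

        features = np.column_stack([x[:-1], u])          # dynamic regressors
        w, *_ = np.linalg.lstsq(features, x[1:], rcond=None)
        rmse = np.sqrt(np.mean((features @ w - x[1:])**2))
        print(f"emulator one-step RMSE: {rmse:.4f}")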

  18. Adaptive hybrid control of manipulators

    NASA Technical Reports Server (NTRS)

    Seraji, H.

    1987-01-01

    Simple methods for the design of adaptive force and position controllers for robot manipulators within the hybrid control architecture are presented. The force controller is composed of an adaptive PID feedback controller, an auxiliary signal and a force feedforward term, and it achieves tracking of desired force setpoints in the constraint directions. The position controller consists of adaptive feedback and feedforward controllers and an auxiliary signal, and it accomplishes tracking of desired position trajectories in the free directions. The controllers are capable of compensating for dynamic cross-couplings that exist between the position and force control loops in the hybrid control architecture. The adaptive controllers do not require knowledge of the complex dynamic model or parameter values of the manipulator or the environment. The proposed control schemes are computationally fast and suitable for implementation in on-line control with high sampling rates.
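
    The core decomposition of the hybrid architecture can be shown compactly. The sketch below is illustrative rather than the paper's adaptive laws: a diagonal selection matrix S routes the constraint direction to the force loop and the free directions to the position loop, with simple proportional terms standing in for the adaptive PID, feedforward, and auxiliary signals:

        import numpy as np

        S = np.diag([1.0, 0.0, 0.0])   # x force-controlled; y, z position-controlled
        I = np.eye(3)

        f_des, f_meas = np.array([5.0, 0.0, 0.0]), np.array([4.2, 0.1, 0.0])
        p_des, p_meas = np.array([0.0, 0.3, 0.2]), np.array([0.0, 0.25, 0.22])

        kf, kp = 0.8, 40.0             # proportional stand-ins for the adaptive laws
        command = S @ (kf * (f_des - f_meas)) + (I - S) @ (kp * (p_des - p_meas))
        print("commanded end-effector action:", np.round(command, 3))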

  19. Synthetic biology to access and expand nature’s chemical diversity

    PubMed Central

    Smanski, Michael J.; Zhou, Hui; Claesen, Jan; Shen, Ben; Fischbach, Michael; Voigt, Christopher A.

    2016-01-01

    Bacterial genomes encode the biosynthetic potential to produce hundreds of thousands of complex molecules with diverse applications, from medicine to agriculture and materials. Economically accessing the potential encoded within sequenced genomes promises to reinvigorate waning drug discovery pipelines and provide novel routes to intricate chemicals. This is a tremendous undertaking, as the pathways often comprise dozens of genes spanning 100+ kilobases of DNA, are controlled by complex regulatory networks, and the most interesting molecules are made by non-model organisms. Advances in synthetic biology address these issues, including DNA construction technologies, genetic parts for precision expression control, synthetic regulatory circuits, computer-aided design, and multiplexed genome engineering. Collectively, these technologies are moving towards an era when chemicals can be accessed en masse based on sequence information alone. This will enable the harnessing of metagenomic data and massive strain banks for high-throughput molecular discovery and, ultimately, the ability to forward-design pathways to complex chemicals not found in nature. PMID:26876034

  20. RAVEN: a GUI and an Artificial Intelligence Engine in a Dynamic PRA Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    C. Rabiti; D. Mandelli; A. Alfonsi

    Increases in computational power and pressure for more accurate simulations and estimations of accident scenario consequences are driving the need for Dynamic Probabilistic Risk Assessment (PRA) [1] of very complex models. While more sophisticated algorithms and computational power address the back end of this challenge, the front end is still handled by engineers who need to extract meaningful information from the large amount of data and build these complex models. Compounding this problem is the difficulty in knowledge transfer and retention, and the increasing speed of software development. The above-described issues would have negatively impacted deployment of the new high-fidelity plant simulator RELAP-7 (Reactor Excursion and Leak Analysis Program) at Idaho National Laboratory. Therefore, RAVEN, initially conceived as the plant controller for RELAP-7, will help mitigate future RELAP-7 software engineering risks. In order to accomplish this task, the Reactor Analysis and Virtual Control Environment (RAVEN) has been designed to provide an easy-to-use Graphical User Interface (GUI) for building plant models and to leverage artificial intelligence algorithms in order to reduce computational time, improve results, and help the user to identify the behavioral patterns of Nuclear Power Plants (NPPs). In this paper we will present the GUI implementation and its current capability status. We will also introduce the support vector machine algorithms and show our evaluation of their potential for increasing the accuracy and reducing the computational costs of PRA analysis. In this evaluation we will refer to preliminary studies performed under the Risk Informed Safety Margins Characterization (RISMC) project of the Light Water Reactors Sustainability (LWRS) campaign [3]. RISMC simulation needs and algorithm testing are currently used as guidance to prioritize RAVEN developments relevant to PRA.

  1. Lightweight fuzzy processes in clinical computing.

    PubMed

    Hurdle, J F

    1997-09-01

    In spite of advances in computing hardware, many hospitals still have a hard time finding extra capacity in their production clinical information system to run artificial intelligence (AI) modules, for example: to support real-time drug-drug or drug-lab interactions; to track infection trends; to monitor compliance with case-specific clinical guidelines; or to monitor/control biomedical devices like an intelligent ventilator. Historically, adding AI functionality was not a major design concern when a typical clinical system was originally specified. AI technology is usually retrofitted 'on top of the old system' or 'run off line' in tandem with the old system to ensure that the routine workload would still get done (with as little impact from the AI side as possible). To compound the burden on system performance, most institutions have witnessed a long and increasing trend for intramural and extramural reporting (e.g., the collection of data for a quality-control report in microbiology, or a meta-analysis of a suite of coronary artery bypass graft techniques, etc.), and these place an ever-growing burden on a typical computer system's performance. We discuss a promising approach to adding extra AI processing power to a heavily-used system based on the notion of 'lightweight fuzzy processing' (LFP), that is, fuzzy modules designed from the outset to impose a small computational load. A formal model for a useful subclass of fuzzy systems is defined below and is used as a framework for the automated generation of LFPs. By seeking to reduce the arithmetic complexity of the model (a hand-crafted process) and the data complexity of the model (an automated process), we show how LFPs can be generated for three sample datasets of clinical relevance.
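
    The computational thrust of the idea, fuzzy modules cheap enough to run inside a loaded production system, can be sketched as follows. The rule base (a toy drug-dose alert) and all membership breakpoints are invented for illustration; the paper's formal LFP subclass and its automated generation process are not reproduced here:

        def tri(x, a, b, c):
            """Triangular membership on breakpoints a <= b <= c: a few multiplies."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def alert_level(creatinine, dose):
            high_cr = tri(creatinine, 1.0, 2.0, 3.0)
            high_dose = tri(dose, 50.0, 100.0, 150.0)
            rules = [
                (min(high_cr, high_dose), 1.0),   # both high   -> strong alert
                (max(high_cr, high_dose), 0.5),   # either high -> mild alert
            ]
            total = sum(w for w, _ in rules)
            return sum(w * out for w, out in rules) / total if total else 0.0

        print(f"alert level: {alert_level(creatinine=2.2, dose=120.0):.2f}")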

  2. EPIBLASTER-fast exhaustive two-locus epistasis detection strategy using graphical processing units

    PubMed Central

    Kam-Thong, Tony; Czamara, Darina; Tsuda, Koji; Borgwardt, Karsten; Lewis, Cathryn M; Erhardt-Lehmann, Angelika; Hemmer, Bernhard; Rieckmann, Peter; Daake, Markus; Weber, Frank; Wolf, Christiane; Ziegler, Andreas; Pütz, Benno; Holsboer, Florian; Schölkopf, Bernhard; Müller-Myhsok, Bertram

    2011-01-01

    Detection of epistatic interaction between loci has been postulated to provide a more in-depth understanding of the complex biological and biochemical pathways underlying human diseases. Studying the interaction between two loci is the natural progression following traditional and well-established single locus analysis. However, the added costs and time duration required for the computation involved have thus far deterred researchers from pursuing a genome-wide analysis of epistasis. In this paper, we propose a method allowing such analysis to be conducted very rapidly. The method, dubbed EPIBLASTER, is applicable to case–control studies and consists of a two-step process in which the difference in Pearson's correlation coefficients is computed between controls and cases across all possible SNP pairs as an indication of significant interaction warranting further analysis. For the subset of interactions deemed potentially significant, a second-stage analysis is performed using the likelihood ratio test from the logistic regression to obtain the P-value for the estimated coefficients of the individual effects and the interaction term. The algorithm is implemented using the parallel computational capability of commercially available graphical processing units to greatly reduce the computation time involved. In the current setup and example data sets (211 cases, 222 controls, 299468 SNPs; and 601 cases, 825 controls, 291095 SNPs), this coefficient evaluation stage can be completed in roughly 1 day. Our method allows for exhaustive and rapid detection of significant SNP pair interactions without imposing significant marginal effects of the single loci involved in the pair. PMID:21150885
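
    The first stage of the method reduces to a difference of two correlation matrices, which is easy to sketch on synthetic data (the paper computes this on GPUs; np.corrcoef serves here as a CPU stand-in for that kernel):

        import numpy as np

        rng = np.random.default_rng(3)
        n_snps = 200
        cases = rng.integers(0, 3, size=(300, n_snps)).astype(float)     # genotypes 0/1/2
        controls = rng.integers(0, 3, size=(320, n_snps)).astype(float)

        # Difference in Pearson correlation for all SNP pairs at once.
        delta = np.corrcoef(cases, rowvar=False) - np.corrcoef(controls, rowvar=False)

        iu = np.triu_indices(n_snps, k=1)                # unique SNP pairs
        top = np.argsort(-np.abs(delta[iu]))[:5]
        for idx in top:
            i, j = iu[0][idx], iu[1][idx]
            print(f"SNP pair ({i}, {j}): delta r = {delta[i, j]:+.3f}")

    Pairs passing this screen would then go to the second-stage logistic-regression likelihood ratio test described in the abstract.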

  3. A new concept of a unified parameter management, experiment control, and data analysis in fMRI: application to real-time fMRI at 3T and 7T.

    PubMed

    Hollmann, M; Mönch, T; Mulla-Osman, S; Tempelmann, C; Stadler, J; Bernarding, J

    2008-10-30

    In functional MRI (fMRI), complex experiments and applications require increasingly complex parameter handling, as the experimental setup usually consists of separate software and hardware systems. Advanced real-time applications such as neurofeedback-based training or brain-computer interfaces (BCIs) may even require adaptive changes of the paradigms and experimental setup during the measurement. This would be facilitated by an automated management of the overall workflow and a control of the communication between all experimental components. We realized a concept based on an XML software framework called Experiment Description Language (EDL). All parameters relevant for real-time data acquisition, real-time fMRI (rtfMRI) statistical data analysis, stimulus presentation, and activation processing are stored in one central EDL file, and processed during the experiment. A usability study comparing the central EDL parameter management with traditional approaches showed an improvement in overall experimental handling. Based on this concept, a feasibility study realizing a dynamic rtfMRI-based brain-computer interface showed that the developed system in combination with EDL was able to reliably detect and evaluate activation patterns in real time. The implementation of a centrally controlled communication between the subsystems involved in the rtfMRI experiments reduced potential inconsistencies, and will open new applications for adaptive BCIs.
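
    The central-parameter-file concept is easy to illustrate. The XML element and attribute names below are invented for the sketch (the actual EDL schema is not given in the abstract); the point is that acquisition, analysis, and stimulation subsystems all read from one file:

        import xml.etree.ElementTree as ET

        edl_text = """
        <experiment name="neurofeedback_demo">
          <acquisition tr_ms="2000" volumes="240"/>
          <analysis model="incremental_glm" threshold_z="2.3"/>
          <stimulation package="feedback_bars" update_every="1"/>
        </experiment>
        """

        root = ET.fromstring(edl_text)
        tr = float(root.find("acquisition").attrib["tr_ms"])
        z = float(root.find("analysis").attrib["threshold_z"])
        print(f"experiment '{root.attrib['name']}': TR={tr:.0f} ms, z-threshold={z}")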

  4. FAA Air Traffic Control Operations Concepts. Volume 5. ATCT/TCCC (airport Traffic Control Tower/Tower Control Computer Complex) Tower Controllers

    DTIC Science & Technology

    1988-07-29

    [Garbled extraction of requirements text; recoverable fragments:] Departure List; Arrival List (requirement 3.7.2.2.1.1.2.2); Ordering - in manual ordering, the controller shall have the capability to put a new FOE in the appropriate place in a sublist.

  5. A problem of optimal control and observation for distributed homogeneous multi-agent system

    NASA Astrophysics Data System (ADS)

    Kruglikov, Sergey V.

    2017-12-01

    The paper considers the implementation of an algorithm for controlling a distributed complex of several mobile multi-robots. The concept of a unified information space for the controlling system is applied. The presented information and mathematical models of participants and obstacles, as real agents, and of goals and scenarios, as virtual agents, form the algorithmic and software basis for a computer decision support system. The control scheme assumes indirect management of the robotic team on the basis of an optimal control and observation problem that predicts intelligent behavior in a dynamic, hostile environment. A representative problem is compound cargo transportation by a group of participants under a distributed control scheme in terrain with multiple obstacles.

  6. Multiaxis, Lightweight, Computer-Controlled Exercise System

    NASA Technical Reports Server (NTRS)

    Haynes, Leonard; Bachrach, Benjamin; Harvey, William

    2006-01-01

    The multipurpose, multiaxial, isokinetic dynamometer (MMID) is a computer-controlled system of exercise machinery that can serve as a means for quantitatively assessing a subject's muscle coordination, range of motion, strength, and overall physical condition with respect to a wide variety of forces, motions, and exercise regimens. The MMID is easily reconfigurable and compactly stowable and, in comparison with prior computer-controlled exercise systems, it weighs less, costs less, and offers more capabilities. Whereas a typical prior isokinetic exercise machine is limited to operation in only one plane, the MMID can operate along any path. In addition, the MMID is not limited to the isokinetic (constant-speed) mode of operation. The MMID provides for control and/or measurement of position, force, and/or speed of exertion in as many as six degrees of freedom simultaneously; hence, it can accommodate more complex, more nearly natural combinations of motions and, in so doing, offers greater capabilities for physical conditioning and evaluation. The MMID includes as many as eight active modules, each of which can be anchored to a floor, wall, ceiling, or other fixed object. A cable is paid out from a reel in each module to a bar or other suitable object that is gripped and manipulated by the subject. The reel is driven by a DC brushless motor or other suitable electric motor via a gear reduction unit. The motor can be made to function as either a driver or an electromagnetic brake, depending on the required nature of the interaction with the subject. The module includes a force and a displacement sensor for real-time monitoring of the tension in and displacement of the cable, respectively. In response to commands from a control computer, the motor can be operated to generate a required tension in the cable, to displace the cable a required distance, or to reel the cable in or out at a required speed. The computer can be programmed, either locally or via a remote terminal, to support exercises in one or more of the usual exercise modes (isometric, isokinetic, or isotonic) along complex, multiaxis trajectories. The motions of, and forces applied by, the subject can be monitored in real time and recorded for subsequent evaluation. Through suitable programming, the exercise can be adjusted in real time according to the physical condition of the subject. The remote-programming capability makes it possible to connect multiple exercise machines into a network for supervised exercise by multiple subjects or even for competition by geographically dispersed subjects.

  7. Complex Instruction Set Quantum Computing

    NASA Astrophysics Data System (ADS)

    Sanders, G. D.; Kim, K. W.; Holton, W. C.

    1998-03-01

    In proposed quantum computers, electromagnetic pulses are used to implement logic gates on quantum bits (qubits). Gates are unitary transformations applied to coherent qubit wavefunctions and a universal computer can be created using a minimal set of gates. By applying many elementary gates in sequence, desired quantum computations can be performed. This reduced instruction set approach to quantum computing (RISC QC) is characterized by serial application of a few basic pulse shapes and a long coherence time. However, the unitary matrix of the overall computation is ultimately a unitary matrix of the same size as any of the elementary matrices. This suggests that we might replace a sequence of reduced instructions with a single complex instruction using an optimally tailored pulse. We refer to this approach as complex instruction set quantum computing (CISC QC). One trades the requirement for long coherence times for the ability to design and generate potentially more complex pulses. We consider a model system of coupled qubits interacting through nearest neighbor coupling and show that CISC QC can reduce the time required to perform quantum computations.
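
    The observation driving CISC QC is that any gate sequence on a fixed register multiplies out to a single unitary of the same size, which one suitably designed pulse could in principle implement directly. A minimal numerical check of that statement, using standard gates rather than the paper's coupled-qubit model:

        import numpy as np

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard
        CNOT = np.array([[1, 0, 0, 0],
                         [0, 1, 0, 0],
                         [0, 0, 0, 1],
                         [0, 0, 1, 0]])

        # RISC-style sequence: H on qubit 0, then CNOT(0 -> 1).
        U = CNOT @ np.kron(H, np.eye(2))

        # The whole program is itself one 4x4 unitary ("complex instruction").
        print(np.allclose(U.conj().T @ U, np.eye(4)))          # True
        print(np.round(U @ np.array([1, 0, 0, 0]), 3))         # Bell-state amplitudes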

  8. LABORATORY PROCESS CONTROLLER USING NATURAL LANGUAGE COMMANDS FROM A PERSONAL COMPUTER

    NASA Technical Reports Server (NTRS)

    Will, H.

    1994-01-01

    The complex environment of the typical research laboratory requires flexible process control. This program provides natural language process control from an IBM PC or compatible machine. Process control schedules sometimes require frequent changes, even several times per day. These changes may include adding, deleting, and rearranging steps in a process. This program sets up a process control system that can either run without an operator or be run by workers with limited programming skills. The software system includes three programs. Two of the programs, written in FORTRAN77, record data and control research processes. The third program, written in Pascal, generates the FORTRAN subroutines used by the other two programs to identify the user commands with the user-written device drivers. The software system also includes an input data set which allows the user to define the user commands which are to be executed by the computer. To set the system up, the operator writes device-driver routines for all of the controlled devices. Once set up, this system requires only an input file containing natural language command lines which tell the system what to do and when to do it. The operator can make up custom commands for operating, and taking data from, external research equipment at any time of the day or night without an operator in attendance. This process control system requires a personal computer operating under MS-DOS with suitable hardware interfaces to all controlled devices. The program requires a FORTRAN77 compiler and user-written device drivers. This program was developed in 1989 and has a memory requirement of about 62 Kbytes.
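
    A modern analogue of the command-dispatch structure is sketched below in Python (the original system generated FORTRAN77 subroutines from a Pascal program); the command words and device drivers are invented for illustration:

        def open_valve(name):
            print(f"[driver] opening valve {name}")

        def log_temperature(channel):
            print(f"[driver] reading temperature on channel {channel}")

        # User-defined command words mapped to user-written device drivers.
        COMMANDS = {
            "OPEN VALVE": lambda args: open_valve(args[0]),
            "LOG TEMPERATURE": lambda args: log_temperature(args[0]),
        }

        # A schedule file of natural language lines drives the process unattended.
        schedule = ["OPEN VALVE argon-main", "LOG TEMPERATURE 3"]
        for line in schedule:
            for phrase, handler in COMMANDS.items():
                if line.startswith(phrase):
                    handler(line[len(phrase):].split())
                    break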

  9. Applying a cloud computing approach to storage architectures for spacecraft

    NASA Astrophysics Data System (ADS)

    Baldor, Sue A.; Quiroz, Carlos; Wood, Paul

    As sensor technologies, processor speeds, and memory densities increase, spacecraft command, control, processing, and data storage systems have grown in complexity to take advantage of these improvements and expand the possible missions of spacecraft. Spacecraft systems engineers are increasingly looking for novel ways to address this growth in complexity and mitigate associated risks. In conventional computing, many solutions have been developed to address both complexity and heterogeneity in systems. In particular, the cloud-based paradigm provides a solution for distributing applications and storage capabilities across multiple platforms. In this paper, we propose utilizing a cloud-like architecture to provide a scalable mechanism for providing mass storage in spacecraft networks that can be reused on multiple spacecraft systems. By presenting a consistent interface to applications and devices that request data to be stored, complex systems designed by multiple organizations may be more readily integrated. Behind the abstraction, the cloud storage capability would manage wear-leveling, power consumption, and other attributes related to the physical memory devices, critical components in any mass storage solution for spacecraft. Our approach employs SpaceWire networks and SpaceWire-capable devices, although the concept could easily be extended to non-heterogeneous networks consisting of multiple spacecraft and potentially the ground segment.

  10. Improving communication when seeking informed consent: a randomised controlled study of a computer-based method for providing information to prospective clinical trial participants.

    PubMed

    Karunaratne, Asuntha S; Korenman, Stanley G; Thomas, Samantha L; Myles, Paul S; Komesaroff, Paul A

    2010-04-05

    To assess the efficacy, with respect to participant understanding of information, of a computer-based approach to communication about complex, technical issues that commonly arise when seeking informed consent for clinical research trials. An open, randomised controlled study of 60 patients with diabetes mellitus, aged 27-70 years, recruited between August 2006 and October 2007 from the Department of Diabetes and Endocrinology at the Alfred Hospital and Baker IDI Heart and Diabetes Institute, Melbourne. Participants were asked to read information about a mock study via a computer-based presentation (n = 30) or a conventional paper-based information statement (n = 30). The computer-based presentation contained visual aids, including diagrams, video, hyperlinks and quiz pages. Understanding of information as assessed by quantitative and qualitative means. Assessment scores used to measure level of understanding were significantly higher in the group that completed the computer-based task than the group that completed the paper-based task (82% v 73%; P = 0.005). More participants in the group that completed the computer-based task expressed interest in taking part in the mock study (23 v 17 participants; P = 0.01). Most participants from both groups preferred the idea of a computer-based presentation to the paper-based statement (21 in the computer-based task group, 18 in the paper-based task group). A computer-based method of providing information may help overcome existing deficiencies in communication about clinical research, and may reduce costs and improve efficiency in recruiting participants for clinical trials.

  11. Virtual Transgenics: Using a Molecular Biology Simulation to Impact Student Academic Achievement and Attitudes

    NASA Astrophysics Data System (ADS)

    Shegog, Ross; Lazarus, Melanie M.; Murray, Nancy G.; Diamond, Pamela M.; Sessions, Nathalie; Zsigmond, Eva

    2012-10-01

    The transgenic mouse model is useful for studying the causes and potential cures for human genetic diseases. Exposing high school biology students to laboratory experience in developing transgenic animal models is logistically prohibitive. Computer-based simulation, however, offers this potential in addition to advantages of fidelity and reach. This study describes and evaluates a computer-based simulation that trains advanced placement high school science students in the laboratory protocols by which a transgenic mouse model is produced. A simulation module on preparing a gene construct in the molecular biology lab was evaluated using a randomized controlled design with advanced placement high school biology students in Mercedes, Texas (n = 44). Pre-post tests assessed procedural and declarative knowledge, time on task, attitudes toward computers for learning and towards science careers. Students who used the simulation increased their procedural and declarative knowledge regarding molecular biology compared to those in the control condition (both p < 0.005). Significant increases continued to occur with additional use of the simulation (p < 0.001). Students in the treatment group became more positive toward using computers for learning (p < 0.001). The simulation did not significantly affect attitudes toward science in general. Computer simulation of complex transgenic protocols has the potential to provide a "virtual" laboratory experience as an adjunct to conventional educational approaches.

  12. Computer simulations in the high school: students' cognitive stages, science process skills and academic achievement in microbiology

    NASA Astrophysics Data System (ADS)

    Huppert, J.; Michal Lomask, S.; Lazarowitz, R.

    2002-08-01

    Computer-assisted learning, including simulated experiments, has great potential to address the problem solving process which is a complex activity. It requires a highly structured approach in order to understand the use of simulations as an instructional device. This study is based on a computer simulation program, 'The Growth Curve of Microorganisms', which required tenth grade biology students to use problem solving skills whilst simultaneously manipulating three independent variables in one simulated experiment. The aims were to investigate the computer simulation's impact on students' academic achievement and on their mastery of science process skills in relation to their cognitive stages. The results indicate that the concrete and transition operational students in the experimental group achieved significantly higher academic achievement than their counterparts in the control group. The higher the cognitive operational stage, the higher students' achievement was, except in the control group where students in the concrete and transition operational stages did not differ. Girls achieved equally with the boys in the experimental group. Students' academic achievement may indicate the potential impact a computer simulation program can have, enabling students with low reasoning abilities to cope successfully with learning concepts and principles in science which require high cognitive skills.

  13. VBOT: Motivating computational and complex systems fluencies with constructionist virtual/physical robotics

    NASA Astrophysics Data System (ADS)

    Berland, Matthew W.

    As scientists use the tools of computational and complex systems theory to broaden science perspectives (e.g., Bar-Yam, 1997; Holland, 1995; Wolfram, 2002), so can middle-school students broaden their perspectives using appropriate tools. The goals of this dissertation project are to build, study, evaluate, and compare activities designed to foster both computational and complex systems fluencies through collaborative constructionist virtual and physical robotics. In these activities, each student builds an agent (e.g., a robot-bird) that must interact with fellow students' agents to generate a complex aggregate (e.g., a flock of robot-birds) in a participatory simulation environment (Wilensky & Stroup, 1999a). In a participatory simulation, students collaborate by acting in a common space, teaching each other, and discussing content with one another. As a result, the students improve both their computational fluency and their complex systems fluency, where fluency is defined as the ability to both consume and produce relevant content (DiSessa, 2000). To date, several systems have been designed to foster computational and complex systems fluencies through computer programming and collaborative play (e.g., Hancock, 2003; Wilensky & Stroup, 1999b); this study suggests that when both fluencies are supported through collaborative play, they become mutually reinforcing. In this work, I present both the design of the VBOT virtual/physical constructionist robotics learning environment and a comparative study of student interaction with the virtual and physical environments across four middle-school classrooms, focusing on the contrasting systems perspectives afforded by the two environments. In particular, I found that while performance gains were similar overall, the physical environment supported agent perspectives on aggregate behavior, and the virtual environment supported aggregate perspectives on agent behavior. The primary research questions are: (1) What are the relative affordances of virtual and physical constructionist robotics systems toward computational and complex systems fluencies? (2) What can middle school students learn using computational/complex systems learning environments in a collaborative setting? (3) In what ways are these environments and activities effective in teaching students computational and complex systems fluencies?
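
    For readers unfamiliar with this class of simulation, the sketch below illustrates the agent-to-aggregate idea in the spirit of a robot-bird flock: each agent follows a simple local rule, and a flock-level pattern emerges. It is a generic alignment-only flocking toy, not VBOT's actual code.

      # Minimal agent/aggregate sketch: each agent steers toward the mean
      # heading of its neighbours, and a flock (aggregate pattern) emerges
      # from purely local rules. Not VBOT's actual implementation.
      import math
      import random

      class Bird:
          def __init__(self):
              self.x, self.y = random.random(), random.random()
              self.heading = random.uniform(0, 2 * math.pi)

          def step(self, flock, radius=0.2, speed=0.01):
              # Align with the mean heading of nearby agents (the agent rule).
              near = [b for b in flock if b is not self
                      and math.dist((self.x, self.y), (b.x, b.y)) < radius]
              if near:
                  mean_x = sum(math.cos(b.heading) for b in near)
                  mean_y = sum(math.sin(b.heading) for b in near)
                  self.heading = math.atan2(mean_y, mean_x)
              # Move forward on a wrap-around unit square.
              self.x = (self.x + speed * math.cos(self.heading)) % 1.0
              self.y = (self.y + speed * math.sin(self.heading)) % 1.0

      flock = [Bird() for _ in range(30)]
      for _ in range(200):            # aggregate behaviour emerges over time
          for bird in flock:
              bird.step(flock)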

  14. A comprehensive approach to identify dominant controls of the behavior of a land surface-hydrology model across various hydroclimatic conditions

    NASA Astrophysics Data System (ADS)

    Haghnegahdar, Amin; Elshamy, Mohamed; Yassin, Fuad; Razavi, Saman; Wheater, Howard; Pietroniro, Al

    2017-04-01

    Complex physically-based environmental models are being increasingly used as the primary tool for watershed planning and management, owing to advances in computational power and data acquisition. Model sensitivity analysis plays a crucial role in understanding the behavior of these complex models and improving their performance. Because of the non-linearity and interactions within these models, global sensitivity analysis (GSA) techniques should be adopted to provide a comprehensive understanding of model behavior and identify its dominant controls. In this study we adopt a multi-basin, multi-criteria GSA approach to systematically assess the behavior of the Modélisation Environmentale-Surface et Hydrologie (MESH) model across various hydroclimatic conditions in Canada, including areas in the Great Lakes Basin, Mackenzie River Basin, and South Saskatchewan River Basin. MESH is a semi-distributed, physically-based coupled land surface-hydrology modelling system developed by Environment and Climate Change Canada (ECCC) for various water resources management purposes in Canada. We use a novel method, called Variogram Analysis of Response Surfaces (VARS), to perform the sensitivity analysis. VARS is a variogram-based GSA technique that can efficiently provide a spectrum of sensitivity information across a range of scales within the parameter space. We use multiple metrics to identify dominant controls of model response (e.g. streamflow) to model parameters under various conditions such as high flows, low flows, and flow volume. We also investigate the influence of initial conditions on model behavior as part of this study. Our preliminary results suggest that this type of GSA can significantly help with estimating model parameters, decreasing the computational burden of calibration, and reducing prediction uncertainty.
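
    As a rough illustration of the variogram idea underlying VARS, the sketch below estimates a directional variogram of a toy response surface for each parameter; it conveys only the intuition (more influential parameters produce larger variograms) and is not the VARS algorithm itself.

      # Toy directional variogram as a sensitivity measure: for parameter i,
      # estimate gamma_i(h) = 0.5 * E[(f(x + h*e_i) - f(x))^2] from random
      # samples. Larger variograms indicate more influential parameters.
      # Illustration of the idea only, not the VARS algorithm itself.
      import numpy as np

      rng = np.random.default_rng(0)

      def model(x):
          # Hypothetical stand-in for an expensive hydrological model run.
          return np.sin(3 * x[..., 0]) + 0.1 * x[..., 1] ** 2

      def directional_variogram(f, dim, i, h, n=2000):
          x = rng.uniform(0, 1, size=(n, dim))
          x_shift = x.copy()
          x_shift[:, i] = np.clip(x_shift[:, i] + h, 0, 1)
          return 0.5 * np.mean((f(x_shift) - f(x)) ** 2)

      for i in range(2):
          gammas = [directional_variogram(model, 2, i, h) for h in (0.05, 0.1, 0.3)]
          print(f"parameter {i}: variograms at h=0.05,0.1,0.3 -> {np.round(gammas, 4)}")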

  15. Using Off-the-Shelf Gaming Controllers For Computer Control in the K-12 Classroom

    NASA Astrophysics Data System (ADS)

    Bourgoin, N. L.; Withee, J.; Segee, M.; Birkel, S. D.; Albee, E.; Koons, P. O.; Zhu, Y.; Segee, B.

    2009-12-01

    In the classroom, the interaction between students, teachers, and datasets is becoming more game-like. Software such as GoogleEarth allows students to interact with data on a more personal level, letting them dynamically change variables, move arbitrarily, and personalize their experience with the datasets. As this experience becomes more immersive, traditional input methods such as the keyboard and mouse begin to hold students back in terms of intuitive interfacing with the data. This is a problem that has best been tackled by modern gaming systems such as the Wii, Xbox 360, and PlayStation 3. By utilizing the solutions developed for these gaming systems, it is possible to deepen a student's immersion in a system. Through an NSF ITEST (Information and Technology Experiences for Students and Teachers) grant, researchers at the University of Maine have experimented with using the game controller for the Nintendo Wii (often called a Wiimote) with existing geodynamic systems in an effort to ease interaction with them. Since these game controllers operate using Bluetooth, a common protocol in computing, Wiimotes can easily communicate with the existing laptop computers issued to Maine students. This paper describes the technical requirements, setup, and usage of Wiimotes as an input device to complex geodynamical systems for use in the K-12 classroom.
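
    The basic control loop such a setup requires is easy to sketch. In the snippet below, the Wiimote read and the view-panning call are stubs: a real deployment would replace them with calls from a Bluetooth Wiimote library (such as cwiid on Linux) and from the visualization's own interface. None of these names come from the paper.

      # Hypothetical control loop mapping Wiimote tilt to panning in a
      # visualization such as GoogleEarth. The accelerometer read and the
      # view call are stubs standing in for real library calls.
      import time

      CENTER = 512      # assumed raw accelerometer mid-point
      DEADZONE = 20     # ignore small hand tremor
      GAIN = 0.05       # degrees of pan per raw accelerometer unit

      def read_accelerometer():
          """Stub for a real Wiimote read (e.g. via the cwiid library)."""
          return 552, 507, 512  # fixed raw (x, y, z) values for illustration

      def pan_view(d_lon, d_lat):
          """Stub for a real visualization API call."""
          print(f"pan by ({d_lon:+.2f}, {d_lat:+.2f}) degrees")

      def tilt_to_pan(raw_x, raw_y):
          """Convert raw tilt readings to (pan_lon, pan_lat) deltas."""
          dx, dy = raw_x - CENTER, raw_y - CENTER
          pan_lon = GAIN * dx if abs(dx) > DEADZONE else 0.0
          pan_lat = GAIN * dy if abs(dy) > DEADZONE else 0.0
          return pan_lon, pan_lat

      for _ in range(5):                   # a real loop would run continuously
          x, y, _z = read_accelerometer()
          pan_view(*tilt_to_pan(x, y))
          time.sleep(0.02)                 # ~50 Hz update rate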

  16. Western Aeronautical Test Range (WATR) mission control Blue room

    NASA Image and Video Library

    1994-12-05

    Mission control Blue Room, seen here in building 4800 at NASA's Dryden Flight Research Center, is part of the Western Aeronautical Test Range (WATR). All aspects of a research mission are monitored from one of the two control rooms at Dryden. The WATR consists of a highly automated complex of computer-controlled tracking, telemetry, and communications systems and control room complexes capable of supporting any type of mission, ranging from system and component testing to sub-scale and full-scale flight tests of new aircraft and reentry systems. Designated areas are assigned for spin/dive tests; corridors are provided for low-, medium-, and high-altitude supersonic flight; and special STOL/VSTOL facilities are available at Ames Moffett and Crows Landing. Special use airspace, available at Edwards, covers approximately twelve thousand square miles of mostly desert area. The southern boundary lies to the south of Rogers Dry Lake, the western boundary lies midway between Mojave and Bakersfield, the northern boundary passes just south of Bishop, and the eastern boundary runs about 25 miles west of the Nevada border except in the northern areas, where it crosses into Nevada.

  17. Optimizing the way kinematical feed chains with great distance between slides are chosen for CNC machine tools

    NASA Astrophysics Data System (ADS)

    Lucian, P.; Gheorghe, S.

    2017-08-01

    This paper presents a new method, based on the FRISCO formula, for optimizing the choice of the best control system for kinematical feed chains with a great distance between slides, as used in computer numerical controlled machine tools. Such machines are typically, though not exclusively, used for machining large and complex parts (mostly in the aviation industry) or complex casting molds. In such machine tools the kinematic feed chains are arranged in a dual-parallel drive structure that allows the mobile element to be moved by two kinematical branches and their related control systems. This arrangement allows for high speed and high rigidity (a critical requirement for precision machining) during the machining process. A significant issue for such an arrangement is the ability of the two parallel control systems to follow the same trajectory accurately. To address this issue, it is necessary to achieve synchronous motion control of the two kinematical branches, ensuring that the mobile element keeps the correct perpendicular position during its motion on the two slides.
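
    The synchronization requirement lends itself to a cross-coupled control scheme, sketched below as a generic illustration: each slide tracks the shared setpoint with a proportional loop, and a coupling term penalizes inter-slide skew. This is textbook-style gantry control with arbitrary gains, not the FRISCO-based selection method the paper develops.

      # Generic cross-coupled control sketch for a dual-drive (gantry) axis:
      # each slide tracks the same setpoint with a proportional loop, and a
      # coupling gain penalizes the difference between the two slides so the
      # crossbeam stays perpendicular. Gains and values are illustrative.

      KP = 2.0        # proportional gain, per-axis tracking
      KC = 5.0        # cross-coupling gain on inter-axis skew
      DT = 0.001      # control period in seconds

      def step(pos1, pos2, setpoint):
          """One control tick: return updated positions of the two slides."""
          skew = pos1 - pos2                      # perpendicularity error
          v1 = KP * (setpoint - pos1) - KC * skew
          v2 = KP * (setpoint - pos2) + KC * skew
          return pos1 + v1 * DT, pos2 + v2 * DT

      pos1, pos2 = 0.000, 0.004                   # slides start misaligned
      for _ in range(5000):                       # 5 s of simulated motion
          pos1, pos2 = step(pos1, pos2, setpoint=0.100)
      print(f"slides: {pos1:.4f} m, {pos2:.4f} m (skew {pos1 - pos2:+.6f} m)")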

  18. Western Aeronautical Test Range (WATR) mission control Gold room

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Mission control Gold room, seen here at NASA's Dryden Flight Research Center, is part of the Western Aeronautical Test Range (WATR). All aspects of a research mission are monitored from one of the two control rooms at Dryden. The WATR consists of a highly automated complex of computer-controlled tracking, telemetry, and communications systems and control room complexes capable of supporting any type of mission, ranging from system and component testing to sub-scale and full-scale flight tests of new aircraft and reentry systems. Designated areas are assigned for spin/dive tests; corridors are provided for low-, medium-, and high-altitude supersonic flight; and special STOL/VSTOL facilities are available at Ames Moffett and Crows Landing. Special use airspace, available at Edwards, covers approximately twelve thousand square miles of mostly desert area. The southern boundary lies to the south of Rogers Dry Lake, the western boundary lies midway between Mojave and Bakersfield, the northern boundary passes just south of Bishop, and the eastern boundary runs about 25 miles west of the Nevada border except in the northern areas, where it crosses into Nevada.

  19. Phosphorescent cyclometalated complexes for efficient blue organic light-emitting diodes

    PubMed Central

    Suzuri, Yoshiyuki; Oshiyama, Tomohiro; Ito, Hiroto; Hiyama, Kunihisa; Kita, Hiroshi

    2014-01-01

    Phosphorescent emitters are extremely important for efficient organic light-emitting diodes (OLEDs), which attract significant attention. Phosphorescent emitters, which have a high phosphorescence quantum yield at room temperature, typically contain a heavy metal such as iridium and have been reported to emit blue, green and red light. In particular, blue cyclometalated complexes with high efficiency and high stability are being developed. In this review, we focus on blue cyclometalated complexes. Recent progress in the computational analysis needed to design a cyclometalated complex is introduced; predicting the radiative transition is indispensable for obtaining an emissive cyclometalated complex. We summarize four methods of controlling the phosphorescence peak of a cyclometalated complex: (i) substituent effects on ligands, (ii) effects of ancillary ligands in heteroleptic complexes, (iii) design of the ligand skeleton, and (iv) selection of the central metal. Novel ligand skeletons are considered important for achieving both high efficiency and long lifetime in blue OLEDs. Moreover, the combination of emitter and host matters as much as the emitter itself: given how device performance depends on this combination, controlling the triplet exciton density is necessary to achieve both high efficiency and long lifetime, because triplet-state annihilation causes exciton quenching and material deterioration. PMID:27877712
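
    As a hint of what such computational analysis involves in practice, the sketch below runs a minimal TD-DFT excitation calculation with PySCF, using water as a stand-in molecule. Predicting phosphorescence of a real Ir(III) cyclometalated complex requires much heavier machinery (relativistic effective core potentials, spin-orbit coupling), so this is strictly a toy workflow.

      # Toy TD-DFT excitation-energy calculation with PySCF on water as a
      # stand-in molecule. Real cyclometalated emitters need relativistic
      # treatments and spin-orbit coupling; this only shows the bare
      # workflow of computing excited states after a ground-state DFT run.
      from pyscf import gto, dft, tddft

      mol = gto.M(
          atom="O 0 0 0; H 0 0.757 0.587; H 0 -0.757 0.587",
          basis="6-31g",
      )
      mf = dft.RKS(mol)
      mf.xc = "b3lyp"
      mf.kernel()                      # ground-state DFT

      td = tddft.TDA(mf)               # Tamm-Dancoff approximation
      td.nstates = 3
      td.kernel()                      # lowest excitations

      for i, e in enumerate(td.e, start=1):
          print(f"excited state {i}: {e * 27.2114:.2f} eV")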

  20. Phosphorescent cyclometalated complexes for efficient blue organic light-emitting diodes

    NASA Astrophysics Data System (ADS)

    Suzuri, Yoshiyuki; Oshiyama, Tomohiro; Ito, Hiroto; Hiyama, Kunihisa; Kita, Hiroshi

    2014-10-01

    Phosphorescent emitters are extremely important for efficient organic light-emitting diodes (OLEDs), which attract significant attention. Phosphorescent emitters, which have a high phosphorescence quantum yield at room temperature, typically contain a heavy metal such as iridium and have been reported to emit blue, green and red light. In particular, blue cyclometalated complexes with high efficiency and high stability are being developed. In this review, we focus on blue cyclometalated complexes. Recent progress in the computational analysis needed to design a cyclometalated complex is introduced; predicting the radiative transition is indispensable for obtaining an emissive cyclometalated complex. We summarize four methods of controlling the phosphorescence peak of a cyclometalated complex: (i) substituent effects on ligands, (ii) effects of ancillary ligands in heteroleptic complexes, (iii) design of the ligand skeleton, and (iv) selection of the central metal. Novel ligand skeletons are considered important for achieving both high efficiency and long lifetime in blue OLEDs. Moreover, the combination of emitter and host matters as much as the emitter itself: given how device performance depends on this combination, controlling the triplet exciton density is necessary to achieve both high efficiency and long lifetime, because triplet-state annihilation causes exciton quenching and material deterioration.
