Sample records for emulation capabilities support

  1. MCCx C3I Control Center Interface Emulator

    NASA Technical Reports Server (NTRS)

    Mireles, James R.

    2010-01-01

    This slide presentation reviews the project to develop and demonstrate alternate Information Technologies and systems for new Mission Control Centers that will reduce facility development, maintenance, and operational costs and will enable more cost-effective operations concepts for ground support operations. The development of an emulator for the Control Center capability will enable facilities to conduct simulations requiring interactivity with the Control Center when it is offline or unavailable, and it will support testing of C3I interfaces for both command and telemetry data exchange messages (DEMs).

  2. Personal-Computer Video-Terminal Emulator

    NASA Technical Reports Server (NTRS)

    Buckley, R. H.; Koromilas, A.; Smith, R. M.; Lee, G. E.; Giering, E. W.

    1985-01-01

    An OWL-1200 video-terminal emulator has been written for the IBM Personal Computer. The OWL-1200 is a simple user terminal with some intelligent capabilities, including screen formatting and block transmission of data. The emulator is written in PASCAL and Assembler for the IBM Personal Computer operating under DOS 1.1.

  3. UPEML Version 2.0: A machine-portable CDC Update emulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mehlhorn, T.A.; Young, M.F.

    1987-05-01

    UPEML is a machine-portable CDC Update emulation program. UPEML is written in ANSI-standard Fortran-77 and is relatively simple and compact. It is capable of emulating a significant subset of the standard CDC Update functions, including program library creation and subsequent modification. Machine-portability is an essential attribute of UPEML, which was written primarily to facilitate the use of CDC-based scientific packages on alternate computer systems such as the VAX 11/780 and the IBM 3081. UPEML has also been successfully used on the multiprocessor ELXSI, on CRAYs under both the COS and CTSS operating systems, on APOLLO workstations, and on the HP-9000. Version 2.0 includes enhanced error checking, full ASCII character support, a program library audit capability, and a partial update option in which only selected or modified decks are written to the compile file. Further enhancements include checks for overlapping corrections, processing of nested calls to common decks, and reads and addfiles from alternate input files.

  4. Development of a Converter-Based Transmission Line Emulator with Three-Phase Short-Circuit Fault Emulation Capability

    DOE PAGES

    Zhang, Shuoting; Liu, Bo; Zheng, Sheng; ...

    2018-01-01

    A transmission line emulator has been developed to flexibly represent interconnected ac lines under normal operating conditions in a voltage source converter (VSC)-based power system emulation platform. As the most serious short-circuit fault condition, the three-phase short-circuit fault is essential to emulate for power system studies. This paper proposes a model to realize three-phase short-circuit fault emulation at different locations along a single transmission line or along one of several parallel-connected transmission lines. At the same time, a combination method is proposed to eliminate the undesired transients caused by current-reference step changes while switching between the fault state and the normal state. Experimental results verify the developed transmission line three-phase short-circuit fault emulation capability.

  5. Development of a Converter-Based Transmission Line Emulator with Three-Phase Short-Circuit Fault Emulation Capability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Shuoting; Liu, Bo; Zheng, Sheng

    A transmission line emulator has been developed to flexibly represent interconnected ac lines under normal operating conditions in a voltage source converter (VSC)-based power system emulation platform. As the most serious short-circuit fault condition, the three-phase short-circuit fault is essential to emulate for power system studies. This paper proposes a model to realize three-phase short-circuit fault emulation at different locations along a single transmission line or along one of several parallel-connected transmission lines. At the same time, a combination method is proposed to eliminate the undesired transients caused by current-reference step changes while switching between the fault state and the normal state. Experimental results verify the developed transmission line three-phase short-circuit fault emulation capability.

  6. A Device to Emulate Diffusion and Thermal Conductivity Using Water Flow

    ERIC Educational Resources Information Center

    Blanck, Harvey F.

    2005-01-01

    A device designed to emulate diffusion and thermal conductivity using flowing water is reviewed. Water flowing through a series of cells, connected by a small tube in each partition of this plastic model, is capable of emulating the diffusion and thermal conductivity that occur in a variety of systems described by several mathematical equations.
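    The record above describes, in effect, a discrete diffusion model: water levels in a chain of cells equalize through small connecting tubes, with the tube flow constant playing the role of the diffusion coefficient. A minimal sketch of that cell model (the cell count, flow constant, and initial spike are illustrative assumptions, not the device's actual parameters):

```python
import numpy as np

def step(levels, k=0.1):
    """One exchange step: flow through each tube is proportional to the level difference."""
    flux = k * np.diff(levels)      # positive when the right-hand cell is higher
    new = levels.copy()
    new[:-1] += flux                # each cell gains from / loses to its neighbors
    new[1:] -= flux
    return new

levels = np.zeros(10)               # ten cells in a row
levels[0] = 1.0                     # all the "concentration" starts in cell 0
for _ in range(200):
    levels = step(levels)
```

    Total water is conserved at every step, and the initial spike relaxes toward a uniform level, mirroring how concentration or temperature gradients decay.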

  7. Utility of Emulation and Simulation Computer Modeling of Space Station Environmental Control and Life Support Systems

    NASA Technical Reports Server (NTRS)

    Yanosy, James L.

    1988-01-01

    Over the years, computer modeling has been used extensively in many disciplines to solve engineering problems. A set of computer program tools is proposed to assist the engineer in the various phases of the Space Station program from technology selection through flight operations. The development and application of emulation and simulation transient performance modeling tools for life support systems are examined. The results of the development and the demonstration of the utility of three computer models are presented. The first model is a detailed computer model (emulation) of a solid amine water desorbed (SAWD) CO2 removal subsystem combined with much less detailed models (simulations) of a cabin, crew, and heat exchangers. This model was used in parallel with the hardware design and test of this CO2 removal subsystem. The second model is a simulation of an air revitalization system combined with a wastewater processing system to demonstrate the capabilities to study subsystem integration. The third model is that of a Space Station total air revitalization system. The station configuration consists of a habitat module, a lab module, two crews, and four connecting nodes.

  8. Investigation of hyper-NA scanner emulation for photomask CDU performance

    NASA Astrophysics Data System (ADS)

    Poortinga, Eric; Scheruebl, Thomas; Conley, Will; Sundermann, Frank

    2007-02-01

    As the semiconductor industry moves toward immersion lithography using numerical apertures above 1.0, the quality of the photomask becomes even more crucial. Photomask specifications are driven by the critical dimension (CD) metrology within the wafer fab. Knowledge of the CD values at resist level provides a reliable mechanism for the prediction of device performance. Ultimately, tolerances of device electrical properties drive the wafer linewidth specifications of the lithography group. Staying within this budget is influenced mainly by the scanner settings, resist process, and photomask quality. Tightening of photomask specifications is one mechanism for meeting the wafer CD targets. The challenge lies in determining how photomask-level metrology results influence wafer-level imaging performance. Can it be inferred that photomask-level CD performance is the direct contributor to wafer-level CD performance? With respect to phase-shift masks, criteria such as phase and transmission control are generally tightened with each technology node. Are there other photomask-relevant influences that affect wafer CD performance? A comprehensive study is presented supporting the use of scanner-emulation-based photomask CD metrology to predict wafer-level within-chip CD uniformity (CDU). Using scanner emulation with the photomask can provide more accurate wafer-level prediction because it inherently includes all contributors to image formation related to the 3D topography, such as the physical CD, phase, transmission, sidewall angle, surface roughness, and other material properties. Emulated images from different photomask types were captured to provide CD values across the chip. Emulated scanner image measurements were completed using an AIMS 45-193i with its hyper-NA, through-pellicle data acquisition capability, including the Global CDU Map software option for AIMS tools. The through-pellicle data acquisition capability is an essential prerequisite for capturing final CDU data (after final clean and pellicle mounting) before the photomask ships or for re-qualification at the wafer fab. Data were also collected on these photomasks using a conventional CD-SEM metrology system with the pellicles removed. A comparison was then made to wafer prints, demonstrating the benefit of using scanner-emulation-based photomask CD metrology.

  9. Machine Learning and Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Chapline, George

    The author has previously pointed out some similarities between self-organizing neural networks and quantum mechanics. These types of neural networks were originally conceived as a way of emulating the cognitive capabilities of the human brain. Recently, extensions of these networks, collectively referred to as deep learning networks, have strengthened the connection between self-organizing neural networks and human cognitive capabilities. In this note we consider whether hardware quantum devices might be useful for emulating neural networks with human-like cognitive capabilities, or alternatively whether implementations of deep learning neural networks using conventional computers might lead to better algorithms for solving the many-body Schrödinger equation.

  10. UPEML: a machine-portable CDC Update emulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mehlhorn, T.A.; Young, M.F.

    1984-12-01

    UPEML is a machine-portable CDC Update emulation program. UPEML is written in ANSI standard Fortran-77 and is relatively simple and compact. It is capable of emulating a significant subset of the standard CDC Update functions including program library creation and subsequent modification. Machine-portability is an essential attribute of UPEML. It was written primarily to facilitate the use of CDC-based scientific packages on alternate computer systems such as the VAX 11/780 and the IBM 3081.

  11. Advanced Machine Learning Emulators of Radiative Transfer Models

    NASA Astrophysics Data System (ADS)

    Camps-Valls, G.; Verrelst, J.; Martino, L.; Vicent, J.

    2017-12-01

    Physically-based model inversion methodologies rest on physical laws and established cause-effect relationships. A plethora of remote sensing applications rely on the physical inversion of a Radiative Transfer Model (RTM), which leads to physically meaningful bio-geo-physical parameter estimates. The process is, however, computationally expensive and needs expert knowledge for the selection of the RTM, its parametrization, the look-up table generation, and its inversion. Mimicking complex codes with statistical nonlinear machine learning algorithms has recently become the natural alternative. Emulators are statistical constructs able to approximate the RTM at a fraction of the computational cost, while providing an estimation of uncertainty and estimations of the gradient or finite integral forms. We review the field and recent advances in emulation of RTMs with machine learning models. We posit Gaussian processes (GPs) as the proper framework to tackle the problem. Furthermore, we introduce an automatic methodology to construct emulators for costly RTMs. The Automatic Gaussian Process Emulator (AGAPE) methodology combines the interpolation capabilities of GPs with the accurate design of an acquisition function that favours sampling in low-density regions and flatness of the interpolation function. We illustrate the good capabilities of our emulators in toy examples, in the leaf- and canopy-level PROSPECT and PROSAIL RTMs, and in the construction of an optimal look-up table for atmospheric correction based on MODTRAN5.
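    The core idea — replace a costly RTM with a cheap statistical surrogate fitted to a small design of expensive runs — can be sketched with a minimal GP interpolator. The RBF kernel, length-scale, and the toy `costly_model` below are illustrative assumptions; a real AGAPE emulator would wrap an RTM such as PROSAIL or MODTRAN5 and choose its design points with the acquisition function described in the abstract.

```python
import numpy as np

def rbf(a, b, ls=0.5):
    """Squared-exponential kernel between two 1-D point sets."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def costly_model(x):
    """Stand-in for an expensive RTM run (illustrative, not PROSAIL)."""
    return np.sin(3 * x) + 0.5 * x

X = np.linspace(0.0, 2.0, 12)           # small design of expensive runs
y = costly_model(X)

K = rbf(X, X) + 1e-8 * np.eye(len(X))   # jitter for numerical stability
alpha = np.linalg.solve(K, y)

def emulate(x_new):
    """Cheap GP mean prediction replacing further costly-model calls."""
    return rbf(x_new, X) @ alpha

x_test = np.array([0.7, 1.3])
pred = emulate(x_test)
```

    After the 12 training runs, every further evaluation is a kernel product rather than a full model run, which is what makes GP emulators attractive for look-up-table generation and inversion.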

  12. Windows Program For Driving The TDU-850 Printer

    NASA Technical Reports Server (NTRS)

    Parrish, Brett T.

    1995-01-01

    Program provides WYSIWYG compatibility between video display and printout. PDW is Microsoft Windows printer-driver computer program for use with Raytheon TDU-850 printer. Provides previously unavailable linkage between printer and IBM PC-compatible computers running Microsoft Windows. Enhances capabilities of Raytheon TDU-850 hardcopier by emulating all textual and graphical features normally supported by laser/ink-jet printers and makes printer compatible with any Microsoft Windows application. Also provides capabilities not found in laser/ink-jet printer drivers by providing certain Windows applications with ability to render high quality, true gray-scale photographic hardcopy on TDU-850. Written in C language.

  13. Digital avionics design and reliability analyzer

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The description and specifications for a digital avionics design and reliability analyzer are given. Its basic function is to provide for the simulation and emulation of the various fault-tolerant digital avionic computer designs that are developed. It has been established that hardware emulation at the gate level will be utilized. The primary benefit of emulation to reliability analysis is that it provides the capability to model a system at a very detailed level. Emulation allows the direct insertion of faults into the system, rather than waiting for actual hardware failures to occur, which allows for controlled and accelerated testing of system reaction to hardware failures. A trade study led to the decision to specify a two-machine system, comprising an emulation computer connected to a general-purpose computer. Potential computers to serve as the emulation computer are also evaluated.
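    The "direct insertion of faults" that the record credits to gate-level emulation can be illustrated in miniature: evaluate a netlist once fault-free, then again with one internal net forced to a stuck-at value, and compare the outputs. The three-gate netlist and input vector below are invented for illustration.

```python
# Hypothetical gate-level netlist as (output net, gate type, input nets).
NETLIST = [
    ("n1", "AND", ("a", "b")),
    ("n2", "OR",  ("n1", "c")),
    ("out", "NOT", ("n2",)),
]

GATES = {"AND": lambda x: all(x), "OR": lambda x: any(x), "NOT": lambda x: not x[0]}

def simulate(inputs, fault=None):
    """Evaluate the netlist; `fault=(net, value)` pins one net to a stuck value."""
    nets = dict(inputs)
    for out, gate, ins in NETLIST:
        nets[out] = GATES[gate]([nets[i] for i in ins])
        if fault and out == fault[0]:
            nets[out] = fault[1]        # inject the stuck-at fault on this net
    return nets["out"]

good = simulate({"a": 1, "b": 1, "c": 0})
bad = simulate({"a": 1, "b": 1, "c": 0}, fault=("n1", 0))
# for this vector, forcing n1 stuck-at-0 flips the primary output
```

    Sweeping such injected faults over every net and input vector is exactly the kind of controlled, accelerated failure testing the analyzer performs in hardware.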

  14. Criticality as a Set-Point for Adaptive Behavior in Neuromorphic Hardware

    PubMed Central

    Srinivasa, Narayan; Stepp, Nigel D.; Cruz-Albrecht, Jose

    2015-01-01

    Neuromorphic hardware is designed by drawing inspiration from biology to overcome limitations of current computer architectures while forging the development of a new class of autonomous systems that can exhibit adaptive behaviors. Several designs in the recent past are capable of emulating large-scale networks but avoid complexity in network dynamics by minimizing the number of dynamic variables that are supported and tunable in hardware. We believe that this is due to the lack of a clear understanding of how to design self-tuning complex systems. It has been widely demonstrated that criticality appears to be the default state of the brain and manifests in the form of spontaneous scale-invariant cascades of neural activity. Experiments, theory, and recent models have shown that neuronal networks at criticality demonstrate optimal information transfer, learning, and information processing capabilities that affect behavior. In this perspective article, we argue that understanding how large-scale neuromorphic electronics can be designed to enable emergent adaptive behavior will require an understanding of how networks emulated by such hardware can self-tune local parameters to maintain criticality as a set-point. We believe that such capability will enable the design of truly scalable intelligent systems using neuromorphic hardware that embrace complexity in network dynamics rather than avoiding it. PMID:26648839
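    The notion of criticality as a set-point can be made concrete with a toy branching process for avalanches: each active unit activates a Poisson(sigma) number of descendants, and sigma is the tunable parameter. This is a schematic sketch, not a model from the article.

```python
import math
import random

def rng_poisson(lam, rng):
    """Poisson sample via Knuth's multiplication method (small lambda)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def avalanche_size(sigma, rng, cap=10_000):
    """Total size of a branching-process avalanche started by one unit."""
    size = active = 1
    while active and size < cap:
        children = sum(rng_poisson(sigma, rng) for _ in range(active))
        size += children
        active = children
    return size

rng = random.Random(0)
subcrit = [avalanche_size(0.5, rng) for _ in range(2000)]
crit = [avalanche_size(1.0, rng) for _ in range(2000)]
mean_sub = sum(subcrit) / len(subcrit)
mean_crit = sum(crit) / len(crit)
```

    Subcritical cascades (sigma < 1) die out after a few units, while at the critical set-point sigma = 1 avalanche sizes become heavy-tailed — the scale-invariant regime the authors argue hardware should self-tune toward.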

  15. Emulation of reionization simulations for Bayesian inference of astrophysics parameters using neural networks

    NASA Astrophysics Data System (ADS)

    Schmit, C. J.; Pritchard, J. R.

    2018-03-01

    Next generation radio experiments such as LOFAR, HERA, and SKA are expected to probe the Epoch of Reionization (EoR) and claim a first direct detection of the cosmic 21 cm signal within the next decade. Data volumes will be enormous and can thus potentially revolutionize our understanding of the early Universe and galaxy formation. However, numerical modelling of the EoR can be prohibitively expensive for Bayesian parameter inference, and how to optimally extract information from incoming data is currently unclear. Emulation techniques for fast model evaluations have recently been proposed as a way to bypass costly simulations. We consider the use of artificial neural networks as a blind emulation technique. We study the impact of training duration and training set size on the quality of the network prediction and the resulting best-fitting values of a parameter search. A direct comparison is drawn between our emulation technique and an equivalent analysis using 21CMMC. We find good predictive capabilities of our network using training sets of as few as 100 model evaluations, which is within the capabilities of fully numerical radiative transfer codes.
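    The emulation loop described above — run the expensive simulation a limited number of times, then train a network to stand in for it inside the inference — can be sketched with a toy one-hidden-layer network in plain NumPy. The stand-in "simulation", network size, and training settings are illustrative assumptions, not the 21CMMC setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta):
    """Toy 'simulation': maps a parameter to a 3-bin signal (illustrative)."""
    return np.stack([np.sin(theta), np.cos(theta), theta ** 2], axis=-1)

# Training set: 100 expensive runs, matching the abstract's smallest case.
theta_train = rng.uniform(0.0, 2.0, size=(100, 1))
y_train = simulate(theta_train[:, 0])

# One-hidden-layer network trained by full-batch gradient descent.
W1 = rng.normal(0.0, 1.0, (1, 32)); b1 = np.zeros(32)
W2 = rng.normal(0.0, 0.1, (32, 3)); b2 = np.zeros(3)
lr = 0.05
for _ in range(4000):
    h = np.tanh(theta_train @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y_train
    gW2 = h.T @ err / 100; gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)
    gW1 = theta_train.T @ dh / 100; gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

def emulate(theta):
    """Cheap network prediction replacing further simulation calls."""
    h = np.tanh(np.atleast_2d(theta).T @ W1 + b1)
    return h @ W2 + b2
```

    Inside a Bayesian sampler, `emulate` would replace `simulate` in the likelihood, so each MCMC step costs a matrix multiply instead of a full simulation run.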

  16. Application of the dynamically allocated virtual clustering management system to emulated tactical network experimentation

    NASA Astrophysics Data System (ADS)

    Marcus, Kelvin

    2014-06-01

    The U.S. Army Research Laboratory (ARL) has built a "Network Science Research Lab" to support research that aims to improve the ability to analyze, predict, design, and govern complex systems that interweave the social/cognitive, information, and communication network genres. Researchers at ARL and the Network Science Collaborative Technology Alliance (NS-CTA), a collaborative research alliance funded by ARL, conducted experimentation to determine whether automated network monitoring tools and task-aware agents deployed within an emulated tactical wireless network could potentially increase the retrieval of relevant data from heterogeneous distributed information nodes. ARL and the NS-CTA required the capability to perform this experimentation over clusters of heterogeneous nodes with emulated wireless tactical networks, where each node could contain different operating systems, application sets, and physical hardware attributes. Researchers utilized the Dynamically Allocated Virtual Clustering Management System (DAVC) to address each of the infrastructure support requirements necessary for conducting their experimentation. The DAVC is an experimentation infrastructure that provides the means to dynamically create, deploy, and manage virtual clusters of heterogeneous nodes within a cloud computing environment based upon resource utilization such as CPU load, available RAM, and hard disk space. The DAVC uses 802.1Q Virtual LANs (VLANs) to prevent experimentation crosstalk and to allow for complex private networks. Clusters created by the DAVC can be utilized for software development, experimentation, and integration with existing hardware and software. The goal of this paper is to explore how ARL and the NS-CTA leveraged the DAVC to create, deploy, and manage multiple experimentation clusters to support their experimentation goals.

  17. Providing a parallel and distributed capability for JMASS using SPEEDES

    NASA Astrophysics Data System (ADS)

    Valinski, Maria; Driscoll, Jonathan; McGraw, Robert M.; Meyer, Bob

    2002-07-01

    The Joint Modeling And Simulation System (JMASS) is a Tri-Service simulation environment that supports engineering and engagement-level simulations. As JMASS is expanded to support other Tri-Service domains, the current set of modeling services must be extended for High Performance Computing (HPC) applications by adding support for advanced time-management algorithms, parallel and distributed topologies, and high-speed communications. By providing these services, JMASS can better address modeling domains requiring parallel, computationally intense calculations, such as clutter, vulnerability, and lethality calculations and underwater-based scenarios. A risk reduction effort implementing some HPC services for JMASS using the SPEEDES (Synchronous Parallel Environment for Emulation and Discrete Event Simulation) simulation framework has recently concluded. As an artifact of the JMASS-SPEEDES integration, not only can HPC functionality be brought to the JMASS program through SPEEDES, but an additional HLA-based capability can be demonstrated that further addresses interoperability issues. The JMASS-SPEEDES integration provided a means of adding HLA capability to preexisting JMASS scenarios through an implementation of the standard JMASS port communication mechanism that allows players to communicate.

  18. Component-Level Electronic-Assembly Repair (CLEAR) Synthetic Instrument Capabilities Assessment and Test Report

    NASA Technical Reports Server (NTRS)

    Oeftering, Richard C.; Bradish, Martin A.

    2011-01-01

    The role of synthetic instruments (SIs) for Component-Level Electronic-Assembly Repair (CLEAR) is to provide an external, lower-level diagnostic and functional test capability beyond the built-in test capabilities of spacecraft electronics. Built-in diagnostics can report faults and symptoms, but isolating the root cause and performing corrective action requires specialized instruments. Often a fault can be revealed by emulating the operation of external hardware, which ordinarily implies complex equipment too massive to be accommodated in spacecraft. The SI strategy is aimed at minimizing complexity and mass by employing highly reconfigurable instruments that perform diagnostics and emulate external functions; in effect, an SI can synthesize an instrument on demand. The SI architecture section of this document summarizes the result of a recent program diagnostic and test needs assessment based on the International Space Station. The SI architecture addresses operational issues such as minimizing crew time and crew skill level, and the SI data transactions between the crew and supporting ground engineers searching for the root cause and formulating corrective actions. SI technology is described within a teleoperations framework. The remaining sections describe a lab demonstration intended to show that a single SI circuit could synthesize an instrument in hardware and subsequently clear the hardware and synthesize a completely different instrument on demand. An analysis of the capabilities and limitations of commercially available SI hardware and programming tools is included. Future work in SI technology is also described.

  19. An autonomous fault detection, isolation, and recovery system for a 20-kHz electric power distribution test bed

    NASA Technical Reports Server (NTRS)

    Quinn, Todd M.; Walters, Jerry L.

    1991-01-01

    Future space exploration will require long-term human presence in space. Space environments that provide working and living quarters for manned missions are becoming increasingly large and sophisticated. Monitoring and control of the space environment subsystems by expert-system software, which emulates human reasoning processes, could maintain the health of the subsystems and help reduce the human workload. The autonomous power expert (APEX) system was developed to emulate a human expert's reasoning processes used to diagnose fault conditions in the domain of space power distribution. APEX is a fault detection, isolation, and recovery (FDIR) system capable of autonomous monitoring and control of the power distribution system. APEX consists of a knowledge base, a data base, an inference engine, and various support and interface software, and it provides the user with an easy-to-use interactive interface. When a fault is detected, APEX informs the user of the detection. The user can direct APEX to isolate the probable cause of the fault; once a fault has been isolated, the user can ask APEX to justify its fault isolation and to recommend actions to correct the fault. APEX implementation and capabilities are discussed.
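    The detect-isolate-recommend flow of such a knowledge-based FDIR system can be sketched as a toy rule engine; the rules and symptoms below are invented for illustration and are not APEX's knowledge base.

```python
# Each rule maps a symptom pattern to (probable cause, recommended action).
RULES = [
    ({"bus_voltage_low": True, "breaker_open": True},
     ("tripped breaker", "reset remote breaker")),
    ({"bus_voltage_low": True, "breaker_open": False},
     ("load overcurrent", "shed non-critical loads")),
]

def isolate(symptoms):
    """Return the diagnosis of the first rule whose pattern matches."""
    for pattern, diagnosis in RULES:
        if all(symptoms.get(k) == v for k, v in pattern.items()):
            return diagnosis
    return ("unknown", "request ground analysis")

cause, action = isolate({"bus_voltage_low": True, "breaker_open": True})
```

    Matching the rule also gives the system something to show when asked to justify its isolation: the symptom pattern itself is the explanation.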

  20. Scalable and reusable emulator for evaluating the performance of SS7 networks

    NASA Astrophysics Data System (ADS)

    Lazar, Aurel A.; Tseng, Kent H.; Lim, Koon Seng; Choe, Winston

    1994-04-01

    A scalable and reusable emulator was designed and implemented for studying the behavior of SS7 networks. The emulator design was largely based on public domain software. It was developed on top of an environment supported by PVM, the Parallel Virtual Machine, and managed by OSIMIS, the OSI Management Information Service platform. The emulator runs on top of a commercially available ATM LAN interconnecting engineering workstations. As a case study for evaluating the emulator, the behavior of the Singapore National SS7 Network under fault and unbalanced loading conditions was investigated.

  1. Challenges in building intelligent systems for space mission operations

    NASA Technical Reports Server (NTRS)

    Hartman, Wayne

    1991-01-01

    The purpose here is to provide a top-level look at the stewardship functions performed in space operations, and to identify the major issues and challenges that must be addressed to build intelligent systems that can realistically support operations functions. The focus is on decision support activities involving monitoring, state assessment, goal generation, plan generation, and plan execution. The bottom line is that problem solving in the space operations domain is a very complex process. A variety of knowledge constructs, representations, and reasoning processes are necessary to support effective human problem solving. Emulating these kinds of capabilities in intelligent systems offers major technical challenges that the artificial intelligence community is only beginning to address.

  2. Hawaiian Electric Advanced Inverter Grid Support Function Laboratory Validation and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Austin; Nagarajan, Adarsh; Prabakar, Kumar

    The objective of this test plan was to better understand how to utilize the performance capabilities of advanced inverter functions to allow the interconnection of distributed energy resource (DER) systems in support of the new Customer Self-Supply, Customer Grid-Supply, and other future DER programs. The purpose of this project was: 1) to characterize how the tested grid-supportive inverters performed the functions of interest, 2) to evaluate the grid-supportive inverters in an environment that emulates the dynamics of O'ahu's electrical distribution system, and 3) to gain insight into the benefits of the grid support functions on selected O'ahu island distribution feeders. These goals were achieved through laboratory testing of photovoltaic inverters, including power hardware-in-the-loop testing.
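    One of the grid support functions typically characterized in such inverter testing is volt-var control, where the reactive-power command is a piecewise-linear droop function of terminal voltage. A generic sketch (the dead band, slope, and limits are illustrative, not Hawaiian Electric's settings):

```python
def volt_var(v_pu, dead_band=0.02, slope=10.0, q_max=0.44):
    """Reactive-power command (p.u.) from terminal voltage (p.u.).

    Zero inside the dead band; a linear droop outside it, clipped at q_max.
    Negative q absorbs vars (high voltage); positive q injects (low voltage).
    """
    dv = v_pu - 1.0
    if abs(dv) <= dead_band:
        return 0.0
    edge = dead_band if dv > 0 else -dead_band
    q = -slope * (dv - edge)
    return max(-q_max, min(q_max, q))
```

    For example, `volt_var(1.05)` commands about -0.3 p.u. (absorbing vars to pull the voltage down), while `volt_var(0.90)` saturates at the +0.44 p.u. injection limit.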

  3. Manufacture and characterization of breast tissue phantoms for emulating benign lesions

    NASA Astrophysics Data System (ADS)

    Villamarín, J. A.; Rojas, M. A.; Potosi, O. M.; Narváez-Semanate, J. L.; Gaviria, C.

    2017-11-01

    Phantom elaboration has become a very important field of study during the last decades due to its applications in medicine. These objects are capable of acoustically emulating or mimicking biological tissues, in which parameters like speed of sound (SOS) and attenuation are successfully reproduced. However, such materials can be expensive depending on their characteristics (USD 460.00 - 6000.00), and precise measurements are difficult because of their composition. This paper presents the elaboration and characterization of low-cost (~USD 25.00) breast phantoms which emulate histological normality and pathological conditions in order to support algorithm calibration procedures in imaging diagnosis. Quantitative ultrasound (QUS) was applied to estimate SOS and attenuation values for breast tissue (background) and benign lesions (fibroadenoma and cysts). Results showed values of SOS and attenuation for the background between 1410 - 1450 m/s and 0.40 - 0.55 dB/cm at 1 MHz sampling frequency, respectively. On the other hand, the SOS obtained for the lesions ranges from 1350 to 1700 m/s, with attenuation values between 0.50 - 1.80 dB/cm at 1 MHz. Finally, the fabricated phantoms allowed for obtaining ultrasonograms comparable with real ones, whose acoustic parameters are in agreement with those reported in the literature.
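    The two acoustic parameters the phantoms are characterized by reduce to simple relations: speed of sound is path length over measured time of flight, and attenuation is the log amplitude ratio per unit thickness. A sketch with illustrative numbers chosen to fall inside the reported background ranges (they are not measurements from the paper):

```python
import math

def speed_of_sound(distance_m, time_of_flight_s):
    """SOS from a known propagation path and a measured time of flight."""
    return distance_m / time_of_flight_s

def attenuation_db_per_cm(a_in, a_out, thickness_cm):
    """Attenuation coefficient from input/output signal amplitudes."""
    return 20 * math.log10(a_in / a_out) / thickness_cm

# e.g. a 40 mm phantom traversed in ~27.6 microseconds, with the signal
# amplitude dropping to 80% over the 4 cm path (illustrative values)
sos = speed_of_sound(0.040, 27.6e-6)          # ~1449 m/s
att = attenuation_db_per_cm(1.0, 0.80, 4.0)   # ~0.48 dB/cm
```

    Both illustrative results land inside the background ranges quoted in the abstract (1410 - 1450 m/s and 0.40 - 0.55 dB/cm).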

  4. Development of a Radio Frequency Space Environment Path Emulator for Evaluating Spacecraft Ranging Hardware

    NASA Technical Reports Server (NTRS)

    Mitchell, Jason W.; Baldwin, Philip J.; Kurichh, Rishi; Naasz, Bo J.; Luquette, Richard J.

    2007-01-01

    The Formation Flying Testbed (FFTB) at the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) provides a hardware-in-the-loop test environment for formation navigation and control. The facility is evolving as a modular, hybrid, dynamic simulation facility for end-to-end guidance, navigation and control (GN&C) design and analysis of formation flying spacecraft. The core capabilities of the FFTB, as a platform for testing critical hardware and software algorithms in-the-loop, have expanded to include S-band Radio Frequency (RF) modems for inter-spacecraft communication and ranging. To enable realistic simulations that require RF ranging sensors for relative navigation, a mechanism is needed to buffer the RF signals exchanged between spacecraft that accurately emulates the dynamic environment through which the RF signals travel, including the effects of medium, moving platforms, and radiated power. The Path Emulator for RF Signals (PERFS), currently under development at NASA GSFC, provides this capability. The function and performance of a prototype device are presented.

  5. Characterization of a Prototype Radio Frequency Space Environment Path Emulator for Evaluating Spacecraft Ranging Hardware

    NASA Technical Reports Server (NTRS)

    Mitchell, Jason W.; Baldwin, Philip J.; Kurichh, Rishi; Naasz, Bo J.; Luquette, Richard J.

    2007-01-01

    The Formation Flying Testbed (FFTB) at the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) provides a hardware-in-the-loop test environment for formation navigation and control. The facility is evolving as a modular, hybrid, dynamic simulation facility for end-to-end guidance, navigation and control (GN&C) design and analysis of formation flying spacecraft. The core capabilities of the FFTB, as a platform for testing critical hardware and software algorithms in-the-loop, have expanded to include S-band Radio Frequency (RF) modems for interspacecraft communication and ranging. To enable realistic simulations that require RF ranging sensors for relative navigation, a mechanism is needed to buffer the RF signals exchanged between spacecraft that accurately emulates the dynamic environment through which the RF signals travel, including the effects of the medium, moving platforms, and radiated power. The Path Emulator for Radio Frequency Signals (PERFS), currently under development at NASA GSFC, provides this capability. The function and performance of a prototype device are presented.

  6. Hardware and Software Integration to Support Real-Time Space Link Emulation

    NASA Technical Reports Server (NTRS)

    Murawski, Robert; Bhasin, Kul; Bittner, David; Sweet, Aaron; Coulter, Rachel; Schwab, Devin

    2012-01-01

    Prior to operational use, communications hardware and software must be thoroughly tested and verified. In space-link communications, field testing equipment can be prohibitively expensive and cannot test to non-ideal situations. In this paper, we show how software and hardware emulation tools can be used to accurately model the characteristics of a satellite communication channel in a lab environment. We describe some of the challenges associated with developing an emulation lab and present results to demonstrate the channel modeling. We then show how network emulation software can be used to extend a hardware emulation model without requiring additional network and channel simulation hardware.
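    First-order ingredients of such a channel model are propagation delay and bit errors; a toy sketch (the GEO slant distance and bit-error rate are illustrative assumptions, not a calibrated link model):

```python
import random

C = 299_792_458.0                          # speed of light, m/s

def propagation_delay_s(distance_m):
    """One-way propagation delay over a free-space path."""
    return distance_m / C

def apply_channel(bits, ber, rng):
    """Flip each bit independently with probability `ber`."""
    return [b ^ (rng.random() < ber) for b in bits]

delay = propagation_delay_s(35_786_000)    # GEO altitude; ~0.119 s one-way
rng = random.Random(1)
tx = [1, 0, 1, 1] * 250
rx = apply_channel(tx, ber=1e-2, rng=rng)
```

    A software emulator layers effects like these (delay, loss, Doppler, bandwidth limits) onto live traffic so that flight hardware sees a realistic link without leaving the lab.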

  7. Hardware and Software Integration to Support Real-Time Space-Link Emulation

    NASA Technical Reports Server (NTRS)

    Murawski, Robert; Bhasin, Kul; Bittner, David

    2012-01-01

    Prior to operational use, communications hardware and software must be thoroughly tested and verified. In space-link communications, field testing equipment can be prohibitively expensive and cannot test to non-ideal situations. In this paper, we show how software and hardware emulation tools can be used to accurately model the characteristics of a satellite communication channel in a lab environment. We describe some of the challenges associated with developing an emulation lab and present results to demonstrate the channel modeling. We then show how network emulation software can be used to extend a hardware emulation model without requiring additional network and channel simulation hardware.

  8. Six networks on a universal neuromorphic computing substrate.

    PubMed

    Pfeil, Thomas; Grübl, Andreas; Jeltsch, Sebastian; Müller, Eric; Müller, Paul; Petrovici, Mihai A; Schmuker, Michael; Brüderle, Daniel; Schemmel, Johannes; Meier, Karlheinz

    2013-01-01

    In this study, we present a highly configurable neuromorphic computing substrate and use it for emulating several types of neural networks. At the heart of this system lies a mixed-signal chip, with analog implementations of neurons and synapses and digital transmission of action potentials. Major advantages of this emulation device, which has been explicitly designed as a universal neural network emulator, are its inherent parallelism and high acceleration factor compared to conventional computers. Its configurability allows the realization of almost arbitrary network topologies and the use of widely varied neuronal and synaptic parameters. Fixed-pattern noise inherent to analog circuitry is reduced by calibration routines. An integrated development environment allows neuroscientists to operate the device without any prior knowledge of neuromorphic circuit design. As a showcase for the capabilities of the system, we describe the successful emulation of six different neural networks which cover a broad spectrum of both structure and functionality.
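    The analog neuron circuits such chips implement are commonly described by the leaky integrate-and-fire (LIF) abstraction. A minimal software version of that model, with all parameter values chosen purely for illustration, looks like this:

```python
def lif_spike_times(i_input, dt=1e-4, tau=0.02, v_rest=0.0,
                    v_thresh=1.0, v_reset=0.0, r_m=1.0):
    """Leaky integrate-and-fire: dV/dt = (-(V - v_rest) + R*I) / tau.
    Returns the spike times (s) produced by the input current trace."""
    v, spikes = v_rest, []
    for step, i in enumerate(i_input):
        v += dt * (-(v - v_rest) + r_m * i) / tau
        if v >= v_thresh:           # threshold crossing -> emit spike, reset
            spikes.append(step * dt)
            v = v_reset
    return spikes

# A constant suprathreshold current produces regular spiking
spikes = lif_spike_times([1.5] * 10_000)
```

    A neuromorphic substrate evaluates this dynamics in parallel analog hardware, which is where the acceleration factor over a step-by-step software loop like this one comes from.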

  9. Six Networks on a Universal Neuromorphic Computing Substrate

    PubMed Central

    Pfeil, Thomas; Grübl, Andreas; Jeltsch, Sebastian; Müller, Eric; Müller, Paul; Petrovici, Mihai A.; Schmuker, Michael; Brüderle, Daniel; Schemmel, Johannes; Meier, Karlheinz

    2013-01-01

    In this study, we present a highly configurable neuromorphic computing substrate and use it for emulating several types of neural networks. At the heart of this system lies a mixed-signal chip, with analog implementations of neurons and synapses and digital transmission of action potentials. Major advantages of this emulation device, which has been explicitly designed as a universal neural network emulator, are its inherent parallelism and high acceleration factor compared to conventional computers. Its configurability allows the realization of almost arbitrary network topologies and the use of widely varied neuronal and synaptic parameters. Fixed-pattern noise inherent to analog circuitry is reduced by calibration routines. An integrated development environment allows neuroscientists to operate the device without any prior knowledge of neuromorphic circuit design. As a showcase for the capabilities of the system, we describe the successful emulation of six different neural networks which cover a broad spectrum of both structure and functionality. PMID:23423583

  10. The Significance of Emulation in the Oral Interaction between Teacher and Students

    ERIC Educational Resources Information Center

    Kindeberg, Tina

    2013-01-01

    The lack of attention to the role of emotions generally has led modern learning theories to neglect the importance of emulation as a pedagogical support to students' learning. One reason could be that the influence of teacher personality is not considered in relation to learning outcome. Another reason may be that the concept of emulation has been…

  11. Emulation as an Integrating Principle for Cognition

    PubMed Central

    Colder, Brian

    2011-01-01

    Emulations, defined as ongoing internal representations of potential actions and the futures those actions are expected to produce, play a critical role in directing human bodily activities. Studies of gross motor behavior, perception, allocation of attention, response to errors, interoception and homeostatic activities, and higher cognitive reasoning suggest that the proper execution of all these functions relies on emulations. Further evidence supports the notion that reinforcement learning in humans is aimed at updating emulations, and that action selection occurs via the advancement of preferred emulations toward realization of their action and environmental prediction. Emulations are hypothesized to exist as distributed active networks of neurons in cortical and sub-cortical structures. This manuscript ties together previously unrelated theories of the role of prediction in different aspects of human information processing to create an integrated framework for cognition. PMID:21660288

  12. Emulation Platform for Cyber Analysis of Wireless Communication Network Protocols

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Leeuwen, Brian P.; Eldridge, John M.

    Wireless networking and mobile communications is increasing around the world and in all sectors of our lives. With increasing use, the density and complexity of the systems increase, with more base stations and advanced protocols to enable higher data throughputs. The security of data transported over wireless networks must also evolve with the advances in technologies enabling more capable wireless networks. However, means for analysis of the effectiveness of security approaches and implementations used on wireless networks are lacking. More specifically, a capability to analyze the lower-layer protocols (i.e., Link and Physical layers) is a major challenge. An analysis approach that incorporates protocol implementations without the need for RF emissions is necessary. In this research paper, several emulation tools and custom extensions that enable an analysis platform to perform cyber security analysis of lower-layer wireless network protocols are presented. A use case of a published exploit in the 802.11 (i.e., WiFi) protocol family is provided to demonstrate the effectiveness of the described emulation platform.

  13. Experimental Demonstration of Technologies for Autonomous On-Orbit Robotic Assembly

    NASA Technical Reports Server (NTRS)

    LeMaster, Edward A.; Schaechter, David B.; Carrington, Connie K.

    2006-01-01

    The Modular Reconfigurable High Energy (MRHE) program aimed to develop technologies for the automated assembly and deployment of large-scale space structures and aggregate spacecraft. Part of the project involved creation of a terrestrial robotic testbed for validation and demonstration of these technologies and for the support of future development activities. This testbed was completed in 2005, and was thereafter used to demonstrate automated rendezvous, docking, and self-assembly tasks between a group of three modular robotic spacecraft emulators. This paper discusses the rationale for the MRHE project, describes the testbed capabilities, and presents the MRHE assembly demonstration sequence.

  14. Development of HWIL Testing Capabilities for Satellite Target Emulation at AEDC

    NASA Astrophysics Data System (ADS)

    Lowry, H.; Crider, D.; Burns, J.; Thompson, R.; Goldsmith, G., II; Sholes, W.

    Programs involved in Space Situational Awareness (SSA) need the capability to test satellite sensors in a Hardware-in-the-Loop (HWIL) environment. Testing in a ground system avoids the significant cost of on-orbit test targets and the resulting issues, such as debris mitigation and other in-space testing implications. The space sensor test facilities at AEDC consist of cryo-vacuum chambers that have been developed to project simulated targets to air-borne, space-borne, and ballistic platforms. The 7V chamber performs calibration and characterization of surveillance and seeker systems, as well as some mission simulation. The 10V chamber is being upgraded to provide real-time target simulation during the detection, acquisition, discrimination, and terminal phases of a seeker mission. The objective of the Satellite Emulation project is to upgrade this existing capability to support the ability to discern and track other satellites and orbital debris in a HWIL environment. It would provide a baseline for realistic testing of satellite surveillance sensors operated in a controlled environment. Many sensor functions could be tested, including scene recognition and maneuvering control software, using real interceptor hardware and software. Statistically significant and repeatable datasets produced by the satellite emulation system can be acquired during such tests and saved for further analysis. In addition, the robustness of the discrimination and tracking algorithms can be investigated by a parametric analysis using slightly different scenarios; this will be used to determine critical points where a sensor system might fail. The radiometric characteristics of satellites are expected to be similar to the targets and decoys that make up a typical interceptor mission scenario, since they are near ambient temperature. Their spectral reflectivity, emissivity, and shape must also be considered, but the projection systems employed in the 7V and 10V chambers should be capable of providing the simulation of satellites as well. There may also be a need for greater radiometric intensity or shorter time response. An appropriate satellite model is integral to the scene generation process to meet the requirements of SSA programs. The Kinetic Kill Vehicle Hardware-in-the-Loop Simulator (KHILS) facility and the Guided Weapons Evaluation Facility (GWEF), both at Eglin Air Force Base, FL, are assisting in developing the scene projection hardware, based on their significant test experience using resistive emitter arrays to test interceptors in a real-time environment. The Army Aviation and Missile Research & Development Command (AMRDEC) will develop the Scene Generation System for the real-time mission simulation.

  15. An emulator for minimizing finite element analysis implementation resources

    NASA Technical Reports Server (NTRS)

    Melosh, R. J.; Utku, S.; Salama, M.; Islam, M.

    1982-01-01

    A finite element analysis emulator providing a basis for efficiently establishing an optimum computer implementation strategy when many calculations are involved is described. The SCOPE emulator determines the computer resources required as a function of the structural model, the structural load-deflection equation characteristics, the storage allocation plan, and the computer hardware capabilities. It thereby provides data for trading off analysis implementation options to arrive at a best strategy. The models contained in SCOPE lead to micro-operation computer counts of each finite element operation as well as overall computer resource cost estimates. Application of SCOPE to the Memphis-Arkansas bridge analysis provides measures of the accuracy of resource assessments. Data indicate that predictions are within 17.3 percent for calculation times and within 3.2 percent for peripheral storage resources for the ELAS code.

  16. High performance flight computer developed for deep space applications

    NASA Technical Reports Server (NTRS)

    Bunker, Robert L.

    1993-01-01

    The development of an advanced space flight computer for real-time embedded deep space applications, which embodies the lessons learned on Galileo and modern computer technology, is described. The requirements are listed, and the design implementation that meets those requirements is described. The development of SPACE-16 (Spaceborne Advanced Computing Engine, where 16 designates the databus width) was initiated to support the MM2 (Mariner Mark II) project. The computer is based on a radiation-hardened emulation of a modern 32-bit microprocessor and its family of support devices, including a high performance floating point accelerator. Additional custom devices, which include a coprocessor to improve input/output capabilities, a memory interface chip, and an additional support chip that provides management of all fault tolerant features, are described. Detailed supporting analyses and rationale which justify specific design and architectural decisions are provided. The six chip types were designed and fabricated. Testing and evaluation of a brassboard was initiated.

  17. Demonstration of emulator-based Bayesian calibration of safety analysis codes: Theory and formulation

    DOE PAGES

    Yurko, Joseph P.; Buongiorno, Jacopo; Youngblood, Robert

    2015-05-28

    System codes for simulation of safety performance of nuclear plants may contain parameters whose values are not known very accurately. New information from tests or operating experience is incorporated into safety codes by a process known as calibration, which reduces uncertainty in the output of the code and thereby improves its support for decision-making. The work reported here implements several improvements on classic calibration techniques afforded by modern analysis techniques. The key innovation has come from development of code surrogate model (or code emulator) construction and prediction algorithms. Use of a fast emulator makes the calibration processes used here with Markov Chain Monte Carlo (MCMC) sampling feasible. This study uses Gaussian Process (GP) based emulators, which have been used previously to emulate computer codes in the nuclear field. The present work describes the formulation of an emulator that incorporates GPs into a factor analysis-type or pattern recognition-type model. This “function factorization” Gaussian Process (FFGP) model allows overcoming limitations present in standard GP emulators, thereby improving both accuracy and speed of the emulator-based calibration process. Calibration of a friction-factor example using a Method of Manufactured Solutions is performed to illustrate key properties of the FFGP-based process.
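    A plain GP emulator of the kind this work builds on (not the FFGP model itself) can be sketched in a few lines: fit an RBF-kernel interpolant to a handful of "code runs" and predict the output at untried inputs. The training function and kernel parameters below are illustrative stand-ins:

```python
import math

def rbf(x1, x2, length=1.0, var=1.0):
    """Squared-exponential (RBF) covariance between two scalar inputs."""
    return var * math.exp(-0.5 * ((x1 - x2) / length) ** 2)

def solve(a, b):
    """Solve a x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def gp_predict(x_train, y_train, x_star, noise=1e-6):
    """GP posterior mean at x_star given noisy observations of the code."""
    k = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(x_train)] for i, a in enumerate(x_train)]
    alpha = solve(k, y_train)
    return sum(rbf(x_star, xi) * ai for xi, ai in zip(x_train, alpha))

# Train on five "code runs" of a toy response, predict in between
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [math.sin(x) for x in xs]
pred = gp_predict(xs, ys, 0.75)   # close to sin(0.75)
```

    The point of the emulator is that `gp_predict` is cheap enough to call inside an MCMC loop, whereas the real system code is not.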

  18. Extending the Capabilities of Closed-loop Distributed Engine Control Simulations Using LAN Communication

    NASA Technical Reports Server (NTRS)

    Aretskin-Hariton, Eliot D.; Zinnecker, Alicia Mae; Culley, Dennis E.

    2014-01-01

    Distributed Engine Control (DEC) is an enabling technology that has the potential to advance the state-of-the-art in gas turbine engine control. To analyze the capabilities that DEC offers, a Hardware-In-the-Loop (HIL) test bed is being developed at NASA Glenn Research Center. This test bed will support a systems-level analysis of control capabilities in closed-loop engine simulations. The structure of the HIL emulates a virtual test cell by implementing the operator functions, control system, and engine on three separate computers. This implementation increases the flexibility and extensibility of the HIL. Here, a method is discussed for implementing these interfaces by connecting the three platforms over a dedicated Local Area Network (LAN). This approach is verified using the Commercial Modular Aero-Propulsion System Simulation 40k (C-MAPSS40k), which is typically implemented on a single computer. Only marginal differences are observed between the results of the typical single-computer simulation and the three-computer implementation. Additional analysis of the LAN network, including characterization of network load, packet drop, and latency, is presented. The three-computer setup supports the incorporation of complex control models and proprietary engine models into the HIL framework.
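    The latency characterization mentioned above amounts to timing round trips on the dedicated link. A minimal loopback sketch of such a measurement is below; the UDP echo pattern and message are placeholders, not the actual C-MAPSS40k interfaces:

```python
import socket
import threading
import time

def _echo_once(sock):
    """Echo a single datagram back to its sender."""
    data, addr = sock.recvfrom(1024)
    sock.sendto(data, addr)

def measure_rtt_ms(n=10):
    """Average UDP round-trip time over a loopback 'LAN', in milliseconds."""
    server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    server.bind(("127.0.0.1", 0))          # OS-assigned port
    client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    client.settimeout(2.0)
    rtts = []
    try:
        for _ in range(n):
            t = threading.Thread(target=_echo_once, args=(server,))
            t.start()
            start = time.perf_counter()
            client.sendto(b"ping", server.getsockname())
            client.recvfrom(1024)           # wait for the echo
            rtts.append((time.perf_counter() - start) * 1e3)
            t.join()
    finally:
        server.close()
        client.close()
    return sum(rtts) / len(rtts)
```

    On a dedicated LAN the interesting question is how this round-trip time compares with the control loop's sample period.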

  19. Light Microscopy Module (LMM)-Emulator

    NASA Technical Reports Server (NTRS)

    Levine, Howard G.; Smith, Trent M.; Richards, Stephanie E.

    2016-01-01

    The Light Microscopy Module (LMM) is a microscope facility developed at Glenn Research Center (GRC) that provides researchers with powerful imaging capability onboard the International Space Station (ISS). The LMM hardware can be reconfigured on-orbit to accommodate a wide variety of investigations, with the capability of remotely acquiring and downloading digital images across multiple levels of magnification.

  20. Space Communications Emulation Facility

    NASA Technical Reports Server (NTRS)

    Hill, Chante A.

    2004-01-01

    Establishing space communication between ground facilities and other satellites is a painstaking task that requires many precise calculations dealing with relay time, atmospheric conditions, and satellite positions, to name a few. The Space Communications Emulation Facility (SCEF) team here at NASA is developing a facility that will approximately emulate the conditions in space that impact space communication. The emulation facility is comprised of a 32-node distributed cluster of computers, each node representing a satellite or ground station. The objective of the satellites is to observe the topography of the Earth (water, vegetation, land, and ice) and relay this information back to the ground stations. Software originally designed by the University of Kansas, labeled the Emulation Manager, controls the interaction of the satellites and ground stations, as well as handling the recording of data. The Emulation Manager is installed on a Linux operating system, employing both Java and C++ code. The emulation scenarios are written in eXtensible Markup Language (XML). XML documents are designed to store, carry, and exchange data. With XML documents, data can be exchanged between incompatible systems, which makes XML ideal for this project because Linux, Mac, and Windows operating systems are all used. Unfortunately, XML documents cannot display data like HTML documents. Therefore, the SCEF team uses an XML Schema Definition (XSD), or simply schema, to describe the structure of an XML document. Schemas are very important because they can validate the correctness of data, define restrictions on data, define data formats, and convert data between different data types, among other things. At this time, in order for the Emulation Manager to open and run an XML emulation scenario file, the user must first establish a link between the schema file and the directory under which the XML scenario files are saved. 
    This procedure takes place on the command line of the Linux operating system. Once this link has been established, the Emulation Manager validates all the XML files in that directory against the schema file before the actual scenario is run. Using sophisticated commercial software called the Satellite Tool Kit (STK), installed on the Linux machine, the Emulation Manager is able to display the data and graphics generated by the execution of an XML emulation scenario file. The Emulation Manager software is written in Java. Since the SCEF project is in the development stage, the source code for this software is being modified to better fit the requirements of the SCEF project. Some parameters for the emulation are hard-coded, set at fixed values. Members of the SCEF team are altering the code to allow the user to choose the values of these hard-coded parameters by inserting a toolbar onto the preexisting GUI.
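    The role the schema plays, rejecting malformed scenario files before a run starts, can be sketched with hand-written structural checks. Python's standard library cannot validate against an XSD directly, and the element and attribute names below are hypothetical, not the SCEF schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical scenario file standing in for a real SCEF XML scenario
SCENARIO = """<scenario name="earth-obs">
  <satellite id="sat1" altitude_km="705"/>
  <satellite id="sat2" altitude_km="833"/>
  <ground_station id="gs1"/>
</scenario>"""

def validate_scenario(xml_text):
    """Structural checks mimicking what an XSD validator would enforce:
    every <satellite> needs a unique id and a numeric altitude_km."""
    errors = []
    root = ET.fromstring(xml_text)
    seen = set()
    for sat in root.iter("satellite"):
        sid = sat.get("id")
        if sid is None or sid in seen:
            errors.append("satellite missing or duplicate id")
        seen.add(sid)
        try:
            float(sat.get("altitude_km", ""))
        except ValueError:
            errors.append(f"satellite {sid}: bad altitude_km")
    return errors

errors = validate_scenario(SCENARIO)   # empty list for a well-formed scenario
```

    Validating every file in a directory before a run, as the Emulation Manager does, is then just this check inside a loop over the directory listing.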

  1. A Novel Approach for Determining Source–Receptor Relationships in Model Simulations: A Case Study of Black Carbon Transport in Northern Hemisphere Winter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Po-Lun; Gattiker, J. R.; Liu, Xiaohong

    2013-06-27

    A Gaussian process (GP) emulator is applied to quantify the contribution of local and remote emissions of black carbon (BC) on the BC concentrations in different regions using a Latin Hypercube sampling strategy for emission perturbations in the offline version of the Community Atmosphere Model Version 5.1 (CAM5) simulations. The source-receptor relationships are computed based on simulations constrained by a standard free-running CAM5 simulation and the ERA-Interim reanalysis product. The analysis demonstrates that the emulator is capable of retrieving the source-receptor relationships based on a small number of CAM5 simulations. Most regions are found susceptible to their local emissions. The emulator also finds that the source-receptor relationships retrieved from the model-driven and the reanalysis-driven simulations are very similar, suggesting that the simulated circulation in CAM5 resembles the assimilated meteorology in ERA-Interim. The robustness of the results provides confidence for applying the emulator to detect dose-response signals in the climate system.
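    Latin Hypercube sampling, the design strategy named in the abstract, stratifies each input dimension so that a small number of runs still spans the whole perturbation range. A minimal sketch follows; the two-dimensional bounds stand in for regional emission scaling factors and are illustrative only:

```python
import random

def latin_hypercube(n_samples, bounds, seed=42):
    """One point per stratum in each dimension, randomly paired across
    dimensions. `bounds` is a list of (lo, hi) per input dimension."""
    rng = random.Random(seed)
    dims = len(bounds)
    samples = [[0.0] * dims for _ in range(n_samples)]
    for d, (lo, hi) in enumerate(bounds):
        strata = list(range(n_samples))
        rng.shuffle(strata)                      # pair strata across dims
        for i, s in enumerate(strata):
            u = (s + rng.random()) / n_samples   # jitter within the stratum
            samples[i][d] = lo + u * (hi - lo)
    return samples

# e.g. perturb two regional emission scaling factors over [0.5, 1.5]
design = latin_hypercube(8, [(0.5, 1.5), (0.5, 1.5)])
```

    Each of the 8 design points would then define one CAM5 run, and the emulator is trained on the resulting input-output pairs.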

  2. BSM (+BMM) Data Emulator Dynamic Interrogative Data Capture (DIDC) Assessment Report: Proof of Concept

    DOT National Transportation Integrated Search

    2016-04-01

    The report documents the study conducted as part of the BSM+BMM Data Emulator project, to examine the DIDC concept and determine the best set of DIDC parameters that provide the most support to the performance measure estimation process with the leas...

  3. User authentication based on the NFC host-card-emulation technology

    NASA Astrophysics Data System (ADS)

    Kološ, Jan; Kotyrba, Martin

    2017-11-01

    This paper deals with the implementation of algorithms for data exchange between mobile devices supporting NFC HCE (Host Card Emulation) and a contactless NFC reader communicating in read/write mode. The solution provides a multiplatform architecture for data exchange between devices, with a focus on safe and simple user authentication.
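    The first exchange in an HCE session is typically an ISO/IEC 7816-4 SELECT-by-AID command APDU, which routes the reader to the emulated card application. A minimal sketch of building and parsing that command follows; the AID value is hypothetical, not one from the paper:

```python
def build_select_apdu(aid: bytes) -> bytes:
    """SELECT by AID: CLA=00, INS=A4, P1=04, P2=00, Lc, AID data, Le=00."""
    return bytes([0x00, 0xA4, 0x04, 0x00, len(aid)]) + aid + b"\x00"

def parse_apdu(apdu: bytes) -> dict:
    """Split a short command APDU into its header and data fields."""
    cla, ins, p1, p2, lc = apdu[:5]
    return {"cla": cla, "ins": ins, "p1": p1, "p2": p2,
            "data": apdu[5:5 + lc]}

STATUS_OK = b"\x90\x00"  # success status word the card side replies with

aid = bytes.fromhex("F0010203040506")  # hypothetical proprietary AID
cmd = build_select_apdu(aid)
```

    On Android, the HCE service receives exactly such byte arrays and answers with response APDUs; the authentication protocol is then layered over subsequent command/response pairs.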

  4. Development of the CELSS Emulator at NASA JSC

    NASA Technical Reports Server (NTRS)

    Cullingford, Hatice S.

    1989-01-01

    The Controlled Ecological Life Support System (CELSS) Emulator is under development at the NASA Johnson Space Center (JSC) with the purpose of investigating computer simulations of integrated CELSS operations involving humans, plants, and process machinery. This paper describes Version 1.0 of the CELSS Emulator, which was initiated in 1988 on the JSC Multi Purpose Applications Console Test Bed as the simulation framework. The run module of the simulation system now contains a CELSS model called BLSS. The CELSS Emulator makes it possible to generate model data sets, store libraries of results for further analysis, and display plots of model variables as a function of time. The progress of the project is presented with sample test runs and simulation display pages.

  5. Integration Testing of Space Flight Systems

    NASA Technical Reports Server (NTRS)

    Honeycutt, Timothy; Sowards, Stephanie

    2008-01-01

    Based on the previous successes of Multi-Element Integration Tests (MEITs) for the International Space Station Program, these types of integrated tests have also been planned for the Constellation Program. MEIT: (1) CEV to ISS (emulated); (2) CEV to Lunar Lander/EDS (emulated); (3) Future: Lunar Surface Systems and Mars missions. Finite Element Integration Test (FEIT): (1) CEV/CLV; (2) Lunar Lander/EDS/CaLV. Integrated Verification Tests (IVT): (1) performed as a subset of the FEITs during the flight tests, and then performed for every flight after Full Operational Capability (FOC) has been obtained with the flight and ground systems.

  6. Electroactive polymer (EAP) actuators for future humanlike robots

    NASA Astrophysics Data System (ADS)

    Bar-Cohen, Yoseph

    2009-03-01

    Human-like robots are increasingly becoming an engineering reality thanks to recent technology advances. These robots, which are inspired greatly by science fiction, originated from the desire to reproduce the human appearance, functions, and intelligence, and they may become our household appliances or even companions. The development of such robots is greatly supported by emerging biologically inspired technologies. Potentially, electroactive polymer (EAP) materials offer actuation capabilities that allow emulating the action of our natural muscles for making such machines perform lifelike. There are many technical issues related to making such robots, including the need for EAP materials that can operate as effective actuators. Besides the technology challenges, these robots also raise concerns that need to be addressed prior to forming super-capable robots. These include the need to prevent accidents, deliberate harm, or their use in crimes. In this paper, the potential EAP actuators and the challenges that these robots may pose are reviewed.

  7. Electroactive Polymer (EAP) Actuators for Future Humanlike Robots

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph

    2009-01-01

    Human-like robots are increasingly becoming an engineering reality thanks to recent technology advances. These robots, which are inspired greatly by science fiction, originated from the desire to reproduce the human appearance, functions, and intelligence, and they may become our household appliances or even companions. The development of such robots is greatly supported by emerging biologically inspired technologies. Potentially, electroactive polymer (EAP) materials offer actuation capabilities that allow emulating the action of our natural muscles for making such machines perform lifelike. There are many technical issues related to making such robots, including the need for EAP materials that can operate as effective actuators. Besides the technology challenges, these robots also raise concerns that need to be addressed prior to forming super-capable robots. These include the need to prevent accidents, deliberate harm, or their use in crimes. In this paper, the potential EAP actuators and the challenges that these robots may pose are reviewed.

  8. Multi-level emulation of complex climate model responses to boundary forcing data

    NASA Astrophysics Data System (ADS)

    Tran, Giang T.; Oliver, Kevin I. C.; Holden, Philip B.; Edwards, Neil R.; Sóbester, András; Challenor, Peter

    2018-04-01

    Climate model components involve both high-dimensional input and output fields. It is desirable to efficiently generate spatio-temporal outputs of these models for applications in integrated assessment modelling or to assess the statistical relationship between such sets of inputs and outputs, for example, uncertainty analysis. However, the need for efficiency often compromises the fidelity of output through the use of low complexity models. Here, we develop a technique which combines statistical emulation with a dimensionality reduction technique to emulate a wide range of outputs from an atmospheric general circulation model, PLASIM, as functions of the boundary forcing prescribed by the ocean component of a lower complexity climate model, GENIE-1. Although accurate and detailed spatial information on atmospheric variables such as precipitation and wind speed is well beyond the capability of GENIE-1's energy-moisture balance model of the atmosphere, this study demonstrates that the output of this model is useful in predicting PLASIM's spatio-temporal fields through multi-level emulation. Meaningful information from the fast model, GENIE-1 was extracted by utilising the correlation between variables of the same type in the two models and between variables of different types in PLASIM. We present here the construction and validation of several PLASIM variable emulators and discuss their potential use in developing a hybrid model with statistical components.

  9. Material requirements for bio-inspired sensing systems

    NASA Astrophysics Data System (ADS)

    Biggins, Peter; Lloyd, Peter; Salmond, David; Kusterbeck, Anne

    2008-10-01

    The aim of developing bio-inspired sensing systems is to try and emulate the amazing sensitivity and specificity observed in the natural world. These capabilities have evolved, often for specific tasks, which provide the organism with an advantage in its fight to survive and prosper. Capabilities cover a wide range of sensing functions including vision, temperature, hearing, touch, taste and smell. For some functions, the capabilities of natural systems are still greater than that achieved by traditional engineering solutions; a good example being a dog's sense of smell. Furthermore, attempting to emulate aspects of biological optics, processing and guidance may lead to more simple and effective devices. A bio-inspired sensing system is much more than the sensory mechanism. A system will need to collect samples, especially if pathogens or chemicals are of interest. Other functions could include the provision of power, surfaces and receptors, structure, locomotion and control. In fact it is possible to conceive of a complete bio-inspired system concept which is likely to be radically different from more conventional approaches. This concept will be described and individual component technologies considered.

  10. An orbital emulator for pursuit-evasion game theoretic sensor management

    NASA Astrophysics Data System (ADS)

    Shen, Dan; Wang, Tao; Wang, Gang; Jia, Bin; Wang, Zhonghai; Chen, Genshe; Blasch, Erik; Pham, Khanh

    2017-05-01

    This paper develops and evaluates an orbital emulator (OE) for space situational awareness (SSA). The OE can reproduce 3D satellite movements using a combination of omni-wheeled robots and robotic-arm motion. The 3D motion of a satellite is partitioned into movements in the equatorial plane and up-down motions in the vertical plane. The equatorial-plane motions are emulated by the omni-wheeled robots, while the up-down motions are performed by a stepper-motor-controlled ball moving along a rod (the robotic arm) attached to each robot. For multiple satellites, a fast map-merging algorithm is integrated into the robot operating system (ROS) and simultaneous localization and mapping (SLAM) routines to locate the multiple robots in the scene. The OE is used to demonstrate a pursuit-evasion (PE) game-theoretic sensor management algorithm, which models conflicts between a space-based-visible (SBV) satellite (as pursuer) and a geosynchronous (GEO) satellite (as evader). The cost function of the PE game is based on the informational entropy of the SBV-tracking-GEO scenario. The GEO satellite can maneuver using a continuous low-thrust engine. The hardware-in-the-loop space emulator visually illustrates the solution of the SSA problem based on the PE game.
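    An entropy-based cost of the kind the abstract describes can be sketched directly: discretize the pursuer's belief over the evader's state and score sensor actions by how much they reduce its Shannon entropy. The belief vectors below are illustrative, not from the paper:

```python
import math

def shannon_entropy(probs):
    """Entropy (bits) of a discretized belief over evader states."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def pursuer_cost(belief_before, belief_after):
    """Entropy change from a sensor action: negative when tracking
    sharpens the belief (the pursuer minimizes this; the evader,
    by maneuvering, tries to keep it near zero)."""
    return shannon_entropy(belief_after) - shannon_entropy(belief_before)

# A tracking update that concentrates the belief lowers the cost
before = [0.25, 0.25, 0.25, 0.25]   # uniform: 2 bits of uncertainty
after = [0.7, 0.1, 0.1, 0.1]        # sharpened after an observation
```

    In the game, the pursuer's sensor schedule and the evader's low-thrust maneuvers are chosen against each other with this entropy term inside the payoff.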

  11. Extreme Environments Capabilities at Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Balcerski, Jeffrey; Kremic, Tibor; Arnett, Lori; Vento, Dan; Nakley, Leah

    2016-01-01

    The NASA Glenn Research Center has several facilities that can provide testing for the extreme environments of interest to the New Frontiers community. This includes the Glenn Extreme Environments Rig (GEER), which can duplicate the atmospheric chemistry and conditions of the Venus surface or of any other planet with a hot environment. GRC also has several cryogenic facilities with the capability to run with hydrogen, hydrocarbon, CO2-based, or nitrogen atmospheres. The cryogenic facilities have the capability to emulate Titan lakes.

  12. The use of emulator-based simulators for on-board software maintenance

    NASA Astrophysics Data System (ADS)

    Irvine, M. M.; Dartnell, A.

    2002-07-01

    Traditionally, onboard software maintenance activities within the space sector are performed using hardware-based facilities. These facilities are developed around the use of hardware emulation or breadboards containing target processors. Some sort of environment is provided around the hardware to support the maintenance activities. However, these environments are not easy to use to set up the required test scenarios, particularly when the onboard software executes in a dynamic I/O environment, e.g. attitude control software or data handling software. In addition, the hardware and/or environment may not support the test set-up required during investigations into software anomalies, e.g. raising a spurious interrupt, failing memory, etc., and the overall "visibility" of the executing software may be limited. The Software Maintenance Simulator (SOMSIM) is a tool that can complement the traditional maintenance facilities. Some of the main benefits that SOMSIM can provide are: a low-cost, flexible extension to an existing product (an operational simulator containing a software processor emulator); a system-level, high-fidelity test bed in which the software "executes"; a high degree of control and configuration over the entire "system", including contingency conditions perhaps not possible with real hardware; and high visibility and control over execution of the emulated software. This paper describes the SOMSIM concept in more detail, and also describes the SOMSIM study being carried out for ESA/ESOC by VEGA IT GmbH.

  13. Model description document for a computer program for the emulation/simulation of a space station environmental control and life support system (ESCM)

    NASA Technical Reports Server (NTRS)

    Yanosy, James L.

    1988-01-01

    The Emulation/Simulation Computer Model (ESCM) computes the transient performance of a Space Station air revitalization subsystem with carbon dioxide removal provided by a solid amine, water-desorbed subsystem called SAWD. This manual describes the mathematical modeling and equations used in the ESCM. For the system as a whole and for each individual component, the fundamental physical and chemical laws that govern their operation are presented. Assumptions are stated and, where necessary, data are presented to support empirically developed relationships.

  14. Autonomous Soft Robotic Fish Capable of Escape Maneuvers Using Fluidic Elastomer Actuators.

    PubMed

    Marchese, Andrew D; Onal, Cagdas D; Rus, Daniela

    2014-03-01

    In this work we describe an autonomous soft-bodied robot that is both self-contained and capable of rapid, continuum-body motion. We detail the design, modeling, fabrication, and control of the soft fish, focusing on enabling the robot to perform rapid escape responses. The robot employs a compliant body with embedded actuators emulating the slender anatomical form of a fish. In addition, the robot has a novel fluidic actuation system that drives body motion and has all the subsystems of a traditional robot onboard: power, actuation, processing, and control. At the core of the fish's soft body is an array of fluidic elastomer actuators. We design the fish to emulate escape responses in addition to forward swimming because such maneuvers require rapid body accelerations and continuum-body motion. These maneuvers showcase the performance capabilities of this self-contained robot. The kinematics and controllability of the robot during simulated escape response maneuvers are analyzed and compared with studies on biological fish. We show that during escape responses, the soft-bodied robot has similar input-output relationships to those observed in biological fish. The major implication of this work is that we show soft robots can be both self-contained and capable of rapid body motion.

  15. Text Processing and Formatting: Composure, Composition and Eros.

    ERIC Educational Resources Information Center

    Blair, John C., Jr.

    1984-01-01

    Review of computer software offering text editing/processing capabilities highlights work habits, elements of computer style and composition, buffers, the CRT, line- and screen-oriented text editors, video attributes, "swapping," "cache" memory, "disk emulators," text editing versus text processing, and UNIX operating…

  16. Development of Network-based Communications Architectures for Future NASA Missions

    NASA Technical Reports Server (NTRS)

    Slywczak, Richard A.

    2007-01-01

    Since the Vision for Space Exploration (VSE) announcement, NASA has been developing a communications infrastructure that combines existing terrestrial techniques with newer concepts and capabilities. The overall goal is to develop a flexible, modular, and extensible architecture that leverages and enhances terrestrial networking technologies that can either be directly applied or modified for the space regime. In addition, where existing technologies leave gaps, new technologies must be developed; an example is dynamic routing that accounts for constrained power and bandwidth environments. Using these enhanced technologies, NASA can develop nodes that provide characteristics such as routing, store-and-forward, and access-on-demand capabilities. With the development of the new infrastructure, however, challenges and obstacles will arise. The current communications infrastructure has been developed on a mission-by-mission basis rather than with an end-to-end approach; this has led to a large ground infrastructure but has not encouraged communications between space-based assets. This alone is one of the key challenges NASA must address. With the development of the new Crew Exploration Vehicle (CEV), NASA has the opportunity to provide an integration path for the new vehicles and standards for their development. Some of the newer capabilities these vehicles could include are routing, security, and Software Defined Radios (SDRs). To meet these needs, the NASA Glenn Research Center's (GRC) Network Emulation Laboratory (NEL) has been using both simulation and emulation to study and evaluate these architectures. These techniques provide options to NASA that directly impact architecture development. This paper identifies components of the infrastructure that play a pivotal role in the new NASA architecture, develops a scheme using simulation and emulation for testing these architectures, and demonstrates how NASA can strengthen the new infrastructure by implementing these concepts.

  17. Interfacing Space Communications and Navigation Network Simulation with Distributed System Integration Laboratories (DSIL)

    NASA Technical Reports Server (NTRS)

    Jennings, Esther H.; Nguyen, Sam P.; Wang, Shin-Ywan; Woo, Simon S.

    2008-01-01

    NASA's planned Lunar missions will involve multiple NASA centers, each with a specific role and specialization. In this vision, the Constellation program (CxP)'s Distributed System Integration Laboratories (DSIL) architecture consists of multiple System Integration Labs (SILs), with simulators, emulators, test labs, and control centers interacting with each other over a broadband network to perform test and verification of mission scenarios. To support the end-to-end simulation and emulation effort of NASA's exploration initiatives, different NASA centers are interconnected to participate in distributed simulations. Currently, DSIL has interconnections among the following NASA centers: Johnson Space Center (JSC), Kennedy Space Center (KSC), Marshall Space Flight Center (MSFC), and the Jet Propulsion Laboratory (JPL). Through interconnections and interactions among different NASA centers, critical resources and data can be shared, while independent simulations can be performed simultaneously at different NASA locations, to effectively utilize the simulation and emulation capabilities at each center. Furthermore, the development of DSIL can maximally leverage existing project simulation and testing plans. In this work, we describe the specific role and development activities at JPL for the Space Communications and Navigation Network (SCaN) simulator, which uses the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) tool to simulate communications effects among mission assets. Using MACHETE, different space network configurations among spacecraft and ground systems can be simulated for various parameter sets. Data necessary for tracking, navigation, and guidance of spacecraft such as the Crew Exploration Vehicle (CEV), Crew Launch Vehicle (CLV), and Lunar Relay Satellite (LRS), along with orbit calculation data, are disseminated to different NASA centers and updated periodically using the High Level Architecture (HLA). In addition, the performance of DSIL under different traffic loads with different mixes of data and priorities is evaluated.

  18. The TAVERNS emulator: An Ada simulation of the space station data communications network and software development environment

    NASA Technical Reports Server (NTRS)

    Howes, Norman R.

    1986-01-01

    The Space Station DMS (Data Management System) is the onboard component of the Space Station Information System (SSIS) that includes the computers, networks and software that support the various core and payload subsystems of the Space Station. TAVERNS (Test And Validation Environment for Remote Networked Systems) is a distributed approach for development and validation of application software for Space Station. The TAVERNS concept assumes that the different subsystems will be developed by different contractors who may be geographically separated. The TAVERNS Emulator is an Ada simulation of a TAVERNS on the ASD VAX. The software services described in the DMS Test Bed User's Manual are being emulated on the VAX together with simulations of some of the core subsystems and a simulation of the DCN. The TAVERNS Emulator will be accessible remotely from any VAX that can communicate with the ASD VAX.

  19. Flash LIDAR Emulator for HIL Simulation

    NASA Technical Reports Server (NTRS)

    Brewster, Paul F.

    2010-01-01

    NASA's Autonomous Landing and Hazard Avoidance Technology (ALHAT) project is building a system for detecting hazards and automatically landing controlled vehicles safely anywhere on the Moon. The Flash Light Detection And Ranging (LIDAR) sensor is used to create, on the fly, a 3D map of the unknown terrain for hazard detection. As part of the ALHAT project, a hardware-in-the-loop (HIL) simulation testbed was developed to test the data processing, guidance, and navigation algorithms in real time to prove their feasibility for flight. Replacing the Flash LIDAR camera with an emulator in the testbed provides a cheaper, safer, and more practical way to test the algorithms in a controlled environment. The emulator must have the same hardware interfaces as the LIDAR camera, match its performance characteristics, and produce images similar in quality to the camera's. This presentation describes the issues involved and the techniques used to create a real-time flash LIDAR emulator to support HIL simulation.
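
    The emulator's core job, producing camera-like frames without the camera, can be sketched as synthesizing a range image over a terrain height map. The nadir-looking geometry, grid size, altitude, and noise figure below are illustrative assumptions, not ALHAT parameters.

```python
# Hedged sketch of synthetic range-image generation for a flash-LIDAR-style
# emulator: a nadir-looking sensor over a terrain height map. All values
# here are invented for illustration.
import random

def synthetic_range_image(heightmap, altitude, noise_sigma=0.02):
    """Each detector pixel reports range = altitude - terrain height,
    perturbed by Gaussian ranging noise (a simple flat-look model)."""
    rng = random.Random(0)               # seeded for repeatability
    return [[altitude - h + rng.gauss(0.0, noise_sigma) for h in row]
            for row in heightmap]

terrain = [[0.0, 0.5],                   # metres above the landing datum
           [1.0, 0.2]]
image = synthetic_range_image(terrain, altitude=100.0)
print(len(image), len(image[0]))         # → 2 2
```

    A real emulator would add the sensor's frame rate, field-of-view projection, and hardware interface timing on top of this per-pixel model.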

  20. S-Band propagation measurements

    NASA Technical Reports Server (NTRS)

    Briskman, Robert D.

    1994-01-01

    A geosynchronous satellite system capable of providing many channels of digital audio radio service (DARS) to mobile platforms within the contiguous United States using S-band radio frequencies is being implemented. The system is designed uniquely to mitigate both multipath fading and outages from physical blockage in the transmission path by use of satellite spatial diversity in combination with radio frequency and time diversity. The system also employs a satellite orbital geometry wherein all mobile platforms in the contiguous United States have elevation angles greater than 20 deg to both of the diversity satellites. Since implementation of the satellite system will require three years, an emulation has been performed using terrestrial facilities in order to allow evaluation of DARS capabilities in advance of satellite system operations. The major objective of the emulation was to prove the feasibility of broadcasting from satellites 30 channels of CD quality programming using S-band frequencies to an automobile equipped with a small disk antenna and to obtain quantitative performance data on S-band propagation in a satellite spatial diversity system.

  1. Proposed Development of NASA Glenn Research Center's Aeronautical Network Research Simulator

    NASA Technical Reports Server (NTRS)

    Nguyen, Thanh C.; Kerczewski, Robert J.; Wargo, Chris A.; Kocin, Michael J.; Garcia, Manuel L.

    2004-01-01

    Accurate knowledge and understanding of the data link traffic loads that will have an impact on the underlying communications infrastructure within the National Airspace System (NAS) is of paramount importance for the planning, development, and fielding of future airborne and ground-based communications systems. Attempting to better understand this impact, NASA Glenn Research Center (GRC), through its contractor Computer Networks & Software, Inc. (CNS, Inc.), has developed an emulation and test facility known as the Virtual Aircraft and Controller (VAC) to study data link interactions and the capacity of the NAS to support Controller Pilot Data Link Communications (CPDLC) traffic. The drawback of the current VAC test bed is that it does not allow test personnel and researchers to present a real-world RF environment to a complex airborne or ground system. Fortunately, the United States Air Force and Navy Avionics Test Commands, through their contractor ViaSat, Inc., have developed the Joint Communications Simulator (JCS) to provide communications-band test and simulation capability for the RF spectrum through 18 GHz, including Communications, Navigation, Identification, and Surveillance functions. In this paper, we propose the development of a new and robust test bed that will leverage the capabilities and functionality of the existing NASA GRC VAC and the Air Force and Navy Commands' JCS systems. The proposed NASA Glenn Research Center Aeronautical Networks Research Simulator (ANRS) will combine current Air Traffic Control applications and physical RF stimulation into an integrated system capable of emulating data transmission behaviors including propagation delay, physical protocol delay, transmission failure, and channel interference. The ANRS will provide a simulation/stimulation tool and test bed environment that allows researchers to predict the performance of various aeronautical network protocol standards and their associated waveforms under varying density conditions. The system allows the user to define human-interactive and scripted aircraft and controller models of various standards, such as (but not limited to) Very High Frequency Digital Link (VDL) in its various modes.

  2. QERx- A Faster than Real-Time Emulator for Space Processors

    NASA Astrophysics Data System (ADS)

    Carvalho, B.; Pidgeon, A.; Robinson, P.

    2012-08-01

    Developing software for space systems is challenging, especially because, to be sure it can cope with the harshness of the environment and the imperative requirements and constraints imposed by the platform where it will run, it needs to be tested exhaustively. Software Validation Facilities (SVFs) are known to the industry and to developers, and provide the means to run the On-Board Software (OBSW) in a realistic environment, allowing the development team to debug and test the software. But the challenge is to keep up with the performance of the new processors (LEON2 and LEON3), which need to be emulated within the SVF. Such processor emulators are also used in operational simulators, which support mission preparation and train mission operators. These simulators mimic the satellite and its behaviour as realistically as possible. For test/operational efficiency reasons, and because they will need to interact with external systems, both of these use cases require the processor emulators to provide real-time, or faster, performance. It is known to the industry that the performance of previously available emulators is not enough to cope with the performance of the new processors available in the market. SciSys approached this problem with dynamic translation technology, trying to keep costs down by avoiding a hardware solution and keeping the integration flexibility of full software emulation. SciSys presented "QERx: A High Performance Emulator for Software Validation and Simulations" [1] at a previous DASIA event. Since then, that idea has evolved and QERx has been successfully validated. SciSys is now presenting QERx as a product that can be tailored to fit different emulation needs. This paper presents the latest QERx developments and its current status.
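
    The essence of dynamic translation, decoding a block of target instructions once, caching a host-native version, and reusing it on later visits, can be sketched as follows. The toy instruction set is an assumption for illustration; QERx itself translates LEON (SPARC) code.

```python
# Block-caching sketch of dynamic translation: decode a basic block once,
# cache a host-native callable, reuse it on every later visit. The toy
# ISA is an assumption; QERx's translator targets LEON (SPARC) binaries.

def decode_block(program, pc):
    """Collect straight-line instructions, then build one Python closure
    that applies the whole block in a single host-level call."""
    ops = []
    while pc < len(program) and program[pc][0] != "BR":
        ops.append(program[pc])
        pc += 1
    def translated(acc):
        for op, arg in ops:
            acc = acc + arg if op == "ADD" else arg   # ADD or LOAD
        return acc
    return translated, pc

cache = {}                               # translated-block cache, keyed by pc

def run(program, pc=0, acc=0):
    while pc < len(program):
        if pc not in cache:              # translate only on the first visit
            cache[pc] = decode_block(program, pc)
        block, end_pc = cache[pc]
        acc = block(acc)
        # branches themselves are elided in this sketch
        pc = end_pc + 1 if end_pc < len(program) else end_pc
    return acc

prog = [("LOAD", 1), ("ADD", 2), ("ADD", 3)]
print(run(prog))                         # → 6
```

    Hot loops pay the decode cost once; this amortization is what lets a software emulator approach, or exceed, real-time performance.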

  3. A polymorphic reconfigurable emulator for parallel simulation

    NASA Technical Reports Server (NTRS)

    Parrish, E. A., Jr.; Mcvey, E. S.; Cook, G.

    1980-01-01

    Microprocessor and arithmetic support chip technology was applied to the design of a reconfigurable emulator for real-time flight simulation. The system developed consists of a master control system, which performs all man-machine interactions and configures the hardware to emulate a given aircraft, and numerous slave compute modules (SCMs), which comprise the parallel computational units. It is shown that all parts of the state equations can be worked on simultaneously, but that the algebraic equations cannot (unless they are slowly varying). Attempts to obtain algorithms that allow parallel updates are reported. The word length and step size to be used in the SCMs are determined, and the architecture of the hardware and software is described.
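
    The point about state versus algebraic equations can be illustrated with a Jacobi-style update: every state derivative depends only on the previous step's snapshot, so the slave compute modules can all advance simultaneously. The two-state dynamics below are an invented example.

```python
# Jacobi-style parallel integration step: every state component is advanced
# from the same frozen snapshot, as independent slave compute modules could
# do concurrently. The coupled two-state system is invented for illustration.

def parallel_euler_step(state, derivs, dt):
    snapshot = dict(state)                 # freeze the previous step's state
    return {name: snapshot[name] + dt * f(snapshot)
            for name, f in derivs.items()}

# toy coupled system: dx/dt = -y, dy/dt = x
derivs = {"x": lambda s: -s["y"], "y": lambda s: s["x"]}
state = {"x": 1.0, "y": 0.0}
state = parallel_euler_step(state, derivs, dt=0.1)
print(state)                               # → {'x': 1.0, 'y': 0.1}
```

    Algebraic constraints, by contrast, need the *current* values of the other variables, which is why they resist this decomposition unless they vary slowly.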

  4. Teaching Effectively with Visual Effect in an Image-Processing Class.

    ERIC Educational Resources Information Center

    Ng, G. S.

    1997-01-01

    Describes a course teaching the use of computers in emulating human visual capability and image processing and proposes an interactive presentation using multimedia technology to capture and sustain student attention. Describes the three phase presentation: introduction of image processing equipment, presentation of lecture material, and…

  5. Fourth International Workshop on Grid Simulator Testing of Wind Turbine

    Science.gov Websites

    Agenda excerpt: …, United Kingdom; Smart Reconfiguration and Protection in Advanced Electric Distribution Grids - Mayank …; … Capabilities in Kinectrics - Nicolas Wrathall, Kinectrics, Canada; Discussion. Day 2: April 26, 2017, Advanced Grid Emulation Methods; Advanced PHIL Interface for Multi-MW Scale Inverter Testing - Przemyslaw …

  6. A Voice-Detecting Sensor and a Scanning Keyboard Emulator to Support Word Writing by Two Boys with Extensive Motor Disabilities

    ERIC Educational Resources Information Center

    Lancioni, Giulio E.; Singh, Nirbhay N.; O'Reilly, Mark F.; Sigafoos, Jeff; Green, Vanessa; Chiapparino, Claudia; Stasolla, Fabrizio; Oliva, Doretta

    2009-01-01

    The present study assessed the use of a voice-detecting sensor interfaced with a scanning keyboard emulator to allow two boys with extensive motor disabilities to write. Specifically, the study (a) compared the effects of the voice-detecting sensor with those of a familiar pressure sensor on the boys' writing time, (b) checked which of the sensors…

  7. Internet Tomography in Support of Internet and Network Simulation and Emulation Modelling

    NASA Astrophysics Data System (ADS)

    Moloisane, A.; Ganchev, I.; O'Droma, M.

    This paper addresses Internet performance measurement data extracted through Internet tomography techniques and metrics, and how such data may be used to enhance the capability of network simulation and emulation modelling. The advantages of network simulation and emulation as a means to aid the design and development of the component networks, which make up the Internet and are fundamental to its ongoing evolution, are highlighted. The Internet's rapid growth has spurred development of new protocols and algorithms to meet changing operational requirements such as security, multicast delivery, mobile networking, policy management, and quality of service (QoS) support. Both the development and the evaluation of these operational tools require answering many design and operational questions. Creating the technical support required by network engineers and managers in their efforts to answer these questions is in itself a major challenge. Within the Internet, the number and range of services supported continues to grow exponentially, from legacy and client/server applications to VoIP, multimedia streaming services, and interactive multimedia services. Services have their own distinctive requirements and idiosyncrasies: they respond differently to bandwidth limitations, latency, and jitter problems, and they generate different types of "conversations" between end-user terminals, back-end resources, and middle-tier servers. To add to the complexity, each new or enhanced service introduced onto the network contends for available bandwidth with every other service. To help ensure that the networking products and resources being designed and developed can handle the diverse conditions encountered in real Internet environments, network simulation and emulation modelling is a valuable tool, and is becoming a critical element, in networking product and application design and development. The better these laboratory tools reflect real-world environments and conditions, the more helpful to designers they will be.

  8. Voice/Data Integration in the PBX (Private Branch Exchange).

    DTIC Science & Technology

    1985-01-01

    Michelle A. Bull. …capability to emulate other data environments such as IBM's System Network Architecture (SNA) environment… Access outside the PBX system is…

  9. Development of the CELSS emulator at NASA. Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Cullingford, Hatice S.

    1990-01-01

    The Closed Ecological Life Support System (CELSS) Emulator is under development. It will be used to investigate computer simulations of integrated CELSS operations involving humans, plants, and process machinery. Described here is Version 1.0 of the CELSS Emulator, initiated in 1988 on the Johnson Space Center (JSC) Multi-Purpose Applications Console Test Bed as the simulation framework. The run model of the simulation system now contains a CELSS model called BLSS. The CELSS simulator enables users to generate model data sets, store libraries of results for further analysis, and display plots of model variables as a function of time. The progress of the project is presented with sample test runs and simulation display pages.

  10. Competition among states: Case studies in the political role of remote sensing capabilities

    NASA Astrophysics Data System (ADS)

    Ammons, Audrey Ann

    International politics is a competitive realm. One of the most powerful modern advantages in this competitive world is the ownership of independent and autonomous remote sensing satellites. Few states have this venue for competition, and those that do belong to a very exclusive group. Kenneth Waltz, author of Theory of International Politics, theorized that states emulate the innovations, strategies, and practices of those countries with the greatest capability and ingenuity. As Waltz explains, states will emulate the leader in an anarchic realm to attain the same capabilities that helped the hegemon attain or maintain its status; Waltz referred to this as a tendency toward sameness among competitors. Modern-day states that pursue global preeminence often exhibit exceptional risk-taking and significant technological innovation. They also challenge the recognized hegemon in an area of expertise and leadership. Realists would say that these states are emulating the behavior of the states they view as successful in order to maintain or improve their position in the world order. Realists also point out that strategic interests lead states to try to gain, or at least neutralize, those areas that, if controlled by an adversary, could menace them. Realist writers suggest that states will be reluctant to cede control of an important new technology to another state, even a friendly one, lest they find themselves permanently disadvantaged in an ongoing contest for wealth, influence, and even preeminence. The purpose of this research is to investigate whether remote sensing capabilities are a venue of competition among modern states and one that they view as a potential path to global preeminence. Why do some states expend scarce resources to develop and maintain an indigenous remote sensing capability when it appears that they can acquire much of the end product from other sources at a reasonable cost?
If this is true, it should be possible to confirm that states acquire end-to-end remote sensing capabilities as a means to maintain or improve their position in the world order. These states are willing to devote significant resources in order to control this technology because they believe successful states have used remote sensing technology as a means to acquire and maintain their preeminent position. States that own and operate remote sensing capabilities must take considerable risks and apply technological innovation to succeed. Whether the technology is an historical example such as a sixteenth century ship or its modern equivalent---a twenty-first century satellite---the potential rewards are the same: military advantage, commercial markets, and global recognition.

  11. Emulating DC constant power load: a robust sliding mode control approach

    NASA Astrophysics Data System (ADS)

    Singh, Suresh; Fulwani, Deepak; Kumar, Vinod

    2017-09-01

    This article presents the emulation of a programmable power-electronic constant power load (CPL) using a dc/dc step-up (boost) converter. The converter is controlled by a robust sliding mode controller (SMC). A novel switching surface is proposed to ensure the required power is sunk by the converter. The proposed dc CPL is simple in design, has fast dynamic response and high accuracy, and offers an inexpensive alternative for studying converters in cascaded dc distribution power system applications. Furthermore, the proposed CPL is sufficiently robust against input voltage variations. A laboratory prototype of the proposed dc CPL has been developed and validated, with the SMC realised on the OPAL-RT platform. The capability of the proposed dc CPL is confirmed via experiments in varied scenarios.
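
    The defining behaviour the converter must reproduce can be stated in a few lines: a CPL draws i = P/v, so its incremental resistance dv/di is negative, which is what makes CPLs destabilizing in cascaded dc systems. The numbers below are illustrative, not values from the article.

```python
# Defining i-v behaviour of a constant power load: i = P/v, which yields a
# negative incremental resistance. Numbers are illustrative, not the paper's.

def cpl_current(p_load, v_bus):
    """Current a CPL sinks from the bus at a given voltage."""
    return p_load / v_bus

P = 100.0                      # watts the load must sink
i1 = cpl_current(P, 50.0)      # 2.0 A at nominal voltage
i2 = cpl_current(P, 40.0)      # 2.5 A: as the bus sags, current rises
incremental_r = (40.0 - 50.0) / (i2 - i1)
print(incremental_r)           # → -20.0 (ohms): the destabilizing property
```

    The emulator's controller must hold this i = P/v relationship accurately despite input voltage variations, which is what the proposed switching surface enforces.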

  12. Programming a Detector Emulator on NI's FlexRIO Platform

    NASA Astrophysics Data System (ADS)

    Gervais, Michelle; Crawford, Christopher; Sprow, Aaron; Nab Collaboration

    2017-09-01

    Recently, digital detector emulators have been on the rise as a means to test data acquisition systems and analysis toolkits against a well-understood data set. National Instruments' PXIe-7962R FPGA module and Active Technologies' AT-1212 DAC module provide a customizable platform for analog output. Using a graphical programming language, we have developed a system capable of producing two time-correlated channels of analog output which sample unique amplitude spectra to mimic nuclear physics experiments. This system will be used to model the Nab experiment, in which a prompt beta-decay electron is followed by a slow proton according to a defined time distribution. We will present the results of our work and discuss further development potential. Supported by DOE under Contract DE-SC0008107.
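
    The event-generation logic of such an emulator, a prompt pulse on one channel followed by a correlated delayed pulse on the other, can be sketched as below. The exponential delay and Gaussian amplitude spectra are illustrative stand-ins, not the Nab experiment's actual distributions.

```python
# Hedged sketch of two time-correlated emulator channels: a prompt electron
# pulse, then a delayed proton pulse drawn from a defined time distribution.
# The exponential delay and amplitude spectra are invented placeholders.
import random

def generate_event(rng):
    t_electron = rng.uniform(0.0, 1.0)        # prompt beta electron (us)
    delay = rng.expovariate(1.0 / 10.0)       # slow proton lag, mean 10 us
    amp_e = rng.gauss(1.0, 0.1)               # channel-A amplitude spectrum
    amp_p = rng.gauss(0.3, 0.05)              # channel-B amplitude spectrum
    return (t_electron, amp_e), (t_electron + delay, amp_p)

rng = random.Random(42)
events = [generate_event(rng) for _ in range(1000)]
mean_lag = sum(p[0] - e[0] for e, p in events) / len(events)
print(round(mean_lag, 2))                     # close to 10 us by construction
```

    On the real platform each generated (time, amplitude) pair would be rendered as an analog pulse by the DAC module.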

  13. Dynamic emulation modelling for the optimal operation of water systems: an overview

    NASA Astrophysics Data System (ADS)

    Castelletti, A.; Galelli, S.; Giuliani, M.

    2014-12-01

    Despite the sustained increase in computing power over recent decades, computational limitations remain a major barrier to the effective and systematic use of large-scale, process-based simulation models in rational environmental decision-making. Whereas complex models may provide clear advantages when the goal of the modelling exercise is to enhance our understanding of the natural processes, they introduce problems of model identifiability caused by over-parameterization and suffer from a high computational burden when used in management and planning problems. As a result, increasing attention is now being devoted to emulation modelling (or model reduction) as a way of overcoming these limitations. An emulation model, or emulator, is a low-order approximation of the process-based model that can be substituted for it in order to solve highly resource-demanding problems. In this talk, an overview of emulation modelling within the context of the optimal operation of water systems will be provided. Particular emphasis will be given to Dynamic Emulation Modelling (DEMo), a special type of model complexity reduction in which the dynamic nature of the original process-based model is preserved, with consequent advantages in a wide range of problems, particularly feedback control problems. This will be contrasted with traditional non-dynamic emulators (e.g. response surface and surrogate models) that have been studied extensively in recent years and are mainly used for planning purposes. A number of real-world numerical experiences will be used to support the discussion, ranging from multi-outlet water quality control in water reservoirs, through erosion/sedimentation rebalancing in the operation of run-of-river power plants, to salinity control in lakes and reservoirs.
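
    The distinction between a dynamic emulator and a static response surface can be made concrete: DEMo-style reduction identifies a low-order state recurrence from trajectories of the process-based model, so the emulator can sit inside a feedback loop. The linear stand-in model and its coefficients below are invented for illustration.

```python
# Sketch of identifying a *dynamic* emulator x[t+1] = a*x[t] + b*u[t] from
# one trajectory of a (stand-in) process-based model. The true model and
# its coefficients (a=0.8, b=0.5) are invented for this example.

def simulate_process(u_series, x0=0.0):
    """Stand-in for an expensive process-based model."""
    xs, x = [x0], x0
    for u in u_series:
        x = 0.8 * x + 0.5 * u
        xs.append(x)
    return xs

def identify_dynamic_emulator(xs, us):
    """Least-squares fit of a, b via the normal equations."""
    sxx = sum(x * x for x in xs[:-1]); suu = sum(u * u for u in us)
    sxu = sum(x * u for x, u in zip(xs[:-1], us))
    sxy = sum(x * y for x, y in zip(xs[:-1], xs[1:]))
    suy = sum(u * y for u, y in zip(us, xs[1:]))
    det = sxx * suu - sxu * sxu
    a = (sxy * suu - suy * sxu) / det
    b = (suy * sxx - sxy * sxu) / det
    return a, b

us = [1.0, 0.0, 2.0, 1.0, 3.0, 0.5]
xs = simulate_process(us)
a, b = identify_dynamic_emulator(xs, us)
print(round(a, 3), round(b, 3))          # → 0.8 0.5
```

    Because the identified model keeps a state, it can replace the original inside a receding-horizon or feedback controller, which a static surrogate y = f(u) cannot do.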

  14. Graphics with Special Interfaces for Disabled People.

    ERIC Educational Resources Information Center

    Tronconi, A.; And Others

    The paper describes new software and special input devices to allow physically impaired children to utilize the graphic capabilities of personal computers. Special input devices for computer graphics access--the voice recognition card, the single switch, or the mouse emulator--can be used either singly or in combination by the disabled to control…

  15. 46 CFR 525.3 - Availability of marine terminal operator schedules.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... that is made available to the public shall be available during normal business hours and in electronic... computer (PC) by: (1) Dial-up connection via public switched telephone networks (PSTN); or (2) The Internet... incoming calls, (ii) Smart terminal capability for VT-100 terminal or terminal emulation access, and (iii...

  16. 46 CFR 525.3 - Availability of marine terminal operator schedules.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... that is made available to the public shall be available during normal business hours and in electronic... computer (PC) by: (1) Dial-up connection via public switched telephone networks (PSTN); or (2) The Internet... incoming calls, (ii) Smart terminal capability for VT-100 terminal or terminal emulation access, and (iii...

  17. 46 CFR 525.3 - Availability of marine terminal operator schedules.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... that is made available to the public shall be available during normal business hours and in electronic... computer (PC) by: (1) Dial-up connection via public switched telephone networks (PSTN); or (2) The Internet... incoming calls, (ii) Smart terminal capability for VT-100 terminal or terminal emulation access, and (iii...

  18. Biomimetic robots using EAP as artificial muscles - progress and challenges

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph

    2004-01-01

    Biology offers a great model for emulation in areas ranging from tools and computational algorithms to materials science, mechanisms, and information technology. In recent years, the field of biomimetics, namely the mimicking of biology, has blossomed, with significant advances enabling the reverse engineering of many animal functions and the implementation of some of these capabilities.

  19. A Hardware-in-the-Loop Testbed for Spacecraft Formation Flying Applications

    NASA Technical Reports Server (NTRS)

    Leitner, Jesse; Bauer, Frank H. (Technical Monitor)

    2001-01-01

    The Formation Flying Test Bed (FFTB) at NASA Goddard Space Flight Center (GSFC) is being developed as a modular, hybrid dynamic simulation facility employed for end-to-end guidance, navigation, and control (GN&C) analysis and design for formation flying clusters and constellations of satellites. The FFTB will support critical hardware and software technology development to enable current and future missions for NASA, other government agencies, and external customers for a wide range of missions, particularly those involving distributed spacecraft operations. The initial capabilities of the FFTB are based upon an integration of high-fidelity hardware and software simulation, emulation, and test platforms developed at GSFC in recent years, including a high-fidelity GPS simulator which has been a fundamental component of the Guidance, Navigation, and Control Center's GPS Test Facility. The FFTB will continuously evolve over the next several years from a tool with initial capabilities in GPS navigation hardware/software-in-the-loop analysis and closed-loop GPS-based orbit control algorithm assessment to one with cross-link communications and relative navigation analysis and simulation capability. Eventually the FFTB will provide full capability to support all aspects of multi-sensor, absolute and relative position determination and control, in all (attitude and orbit) degrees of freedom, as well as information management for satellite clusters and constellations. In this paper we focus on the architecture of the FFTB as a general GN&C analysis environment for the spacecraft formation flying community inside and outside of NASA GSFC, and we briefly reference some current and future activities that will drive the requirements and development.

  20. Internal monitoring of GBTx emulator using IPbus for CBM experiment

    NASA Astrophysics Data System (ADS)

    Mandal, Swagata; Zabolotny, Wojciech; Sau, Suman; Chakrabarti, Amlan; Saini, Jogender; Chattopadhyay, Subhasis; Pal, Sushanta Kumar

    2015-09-01

    The Compressed Baryonic Matter (CBM) experiment is part of the Facility for Antiproton and Ion Research (FAIR) at GSI in Darmstadt. The CBM experiment requires precisely time-synchronized, fault-tolerant, self-triggered electronics for its Data Acquisition (DAQ) system, which must support high data rates (up to several TB/s). As part of the implementation of the DAQ system of the Muon Chamber (MUCH), one of the important detectors in the CBM experiment, an FPGA-based Gigabit Transceiver (GBTx) emulator has been implemented. The readout chain for MUCH consists of XYTER chips (front-end electronics) connected directly to the detector, the GBTx emulator, the Data Processing Board (DPB), and the First Level Event Selector board (FLIB) with a back-end software interface. The GBTx emulator is connected to the XYTER emulator through an LVDS (Low Voltage Differential Signalling) line at the front end, and at the back end it is connected to the DPB through a 4.8 Gbps optical link. IPbus over Ethernet is used for internal monitoring of the registers within the GBTx. The IPbus implementation uses the User Datagram Protocol (UDP) at the transport layer of the OSI model, so the GBTx can be controlled remotely. A Python script on the computer side drives the IPbus controller.
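    A remote register read of the kind used for this monitoring can be sketched in a few lines of Python. The packet layout below is loosely modeled on the IPbus control-packet idea but is illustrative only; the exact field packing, transaction-ID width, and register address are assumptions, not the actual GBTx register map or the normative IPbus 2.0 wire format.

```python
import struct

def build_read_request(addr: int, tid: int = 1) -> bytes:
    """Build a single-word register read request (illustrative layout).

    Three little-endian 32-bit words: a packet header, a transaction
    header carrying the transaction id, and the register address.
    """
    pkt_hdr = (0x2 << 28) | (0xF << 4)                         # version + byte-order marker
    txn_hdr = (0x2 << 28) | (tid << 16) | (1 << 8) | 0xF       # read of 1 word, id = tid
    return struct.pack("<III", pkt_hdr, txn_hdr, addr)

def parse_read_reply(payload: bytes) -> int:
    """Extract the 32-bit register value from a (simulated) read reply."""
    _pkt, _txn, value = struct.unpack("<III", payload)
    return value
```

    In a deployment the request bytes would be sent over a UDP socket to the emulator's monitoring port and the reply parsed the same way, which is exactly why UDP at the transport layer makes remote control straightforward.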

  1. High Fidelity Simulations of Large-Scale Wireless Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onunkwo, Uzoma; Benz, Zachary

    The worldwide proliferation of wireless connected devices continues to accelerate. There are tens of billions of wireless links across the planet, with a further explosion of new wireless usage anticipated as the Internet of Things develops. Wireless technologies not only provide convenience for mobile applications but are also extremely cost-effective to deploy. Thus, this trend towards wireless connectivity will only continue, and Sandia must develop the necessary simulation technology to proactively analyze the associated emerging vulnerabilities. Wireless networks are marked by mobility and proximity-based connectivity. The de facto standard for exploratory studies of wireless networks is discrete event simulation (DES). However, the simulation of large-scale wireless networks is extremely difficult due to prohibitively large turnaround times. A path forward is to expedite simulations with parallel discrete event simulation (PDES) techniques. The mobility and distance-based connectivity associated with wireless simulations, however, typically doom PDES approaches to poor scaling (e.g., the OPNET and ns-3 simulators). We propose a PDES-based tool aimed at reducing the communication overhead between processors. The proposed solution will use light-weight processes to dynamically distribute computation workload while mitigating the communication overhead associated with synchronization. This work is vital to the analytics and validation capabilities of simulation and emulation at Sandia. We have years of experience in Sandia's simulation and emulation projects (e.g., MINIMEGA and FIREWHEEL). Sandia's current highly-regarded capabilities in large-scale emulation have focused on wired networks, where two assumptions prevent scalable wireless studies: (a) the connections between objects are mostly static and (b) the nodes have fixed locations.

  2. discovery toolset for Emulytics v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fritz, David; Crussell, Jonathan

    The discovery toolset for Emulytics enables the construction of high-fidelity emulation models of systems. The toolset consists of a set of tools and techniques to automatically go from network discovery of operational systems to emulating those complex systems. Our toolset combines data from host discovery and network mapping tools into an intermediate representation that can then be further refined. Once the intermediate representation reaches the desired state, our toolset supports emitting the Emulytics models with varying levels of specificity based on experiment needs.

  3. Debugging embedded computer programs. [tactical missile computers

    NASA Technical Reports Server (NTRS)

    Kemp, G. H.

    1980-01-01

    Every embedded computer program must complete its debugging cycle using some system that will allow real time debugging. Many of the common items addressed during debugging are listed. Seven approaches to debugging are analyzed to evaluate how well they treat those items. Cost evaluations are also included in the comparison. The results indicate that the best collection of capabilities to cover the common items present in the debugging task occurs in the approach where a minicomputer handles the environment simulation with an emulation of some kind representing the embedded computer. This approach can be taken at a reasonable cost. The case study chosen is an embedded computer in a tactical missile. Several choices of computer for the environment simulation are discussed as well as different approaches to the embedded emulator.

  4. The use of real-time, hardware-in-the-loop simulation in the design and development of the new Hughes HS601 spacecraft attitude control system

    NASA Technical Reports Server (NTRS)

    Slafer, Loren I.

    1989-01-01

    Real-time simulation and hardware-in-the-loop testing are being used extensively in all phases of the design, development, and testing of the attitude control system (ACS) for the new Hughes HS601 satellite bus. Real-time, hardware-in-the-loop simulation, integrated with traditional analysis and pure simulation activities, is shown to provide a highly efficient and productive overall development program. Implementation of high-fidelity simulations of the satellite dynamics and control system algorithms, capable of real-time execution (using Applied Dynamics International's System 100), provides a tool which can be integrated with the critical flight microprocessor to create a mixed simulation test (MST). The MST creates a highly accurate, detailed simulated on-orbit test environment, capable of open- and closed-loop ACS testing, in which the ACS design can be validated. The MST is shown to provide a valuable extension of traditional test methods. A description of the MST configuration is presented, including the spacecraft dynamics simulation model, sensor and actuator emulators, and the test support system. Overall system performance parameters are presented. MST applications are discussed: supporting ACS design, developing on-orbit system performance predictions, flight software development and qualification testing (augmenting the traditional software-based testing), mission planning, and cost-effective subsystem-level acceptance testing. The MST is shown to provide an ideal tool with which the ACS designer can fly the spacecraft on the ground.

  5. Kernel-based Linux emulation for Plan 9.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Minnich, Ronald G.

    2010-09-01

    CNKemu is a kernel-based system for the 9k variant of the Plan 9 kernel. It is designed to provide transparent binary support for programs compiled for IBM's Compute Node Kernel (CNK) on the Blue Gene series of supercomputers. This support allows users to build applications with the standard Blue Gene toolchain, including C++ and Fortran compilers. While the CNK is not Linux, IBM designed the CNK so that the user interface has much in common with the Linux 2.0 system call interface. The Plan 9 CNK emulator hence provides the foundation of kernel-based Linux system call support on Plan 9. In this paper we discuss CNKemu's implementation and some of its more interesting features, such as the ability to easily intermix Plan 9 and Linux system calls.

  6. Emulation of reactor irradiation damage using ion beams

    DOE PAGES

    Was, G. S.; Jiao, Z.; Getto, E.; ...

    2014-06-14

    The continued operation of existing light water nuclear reactors and the development of advanced nuclear reactors depend heavily on understanding how high levels of radiation damage degrade the materials that serve as structural components in reactor cores. The first high-dose ion irradiation experiments on a ferritic-martensitic steel showing that ion irradiation closely emulates the full radiation damage microstructure created in-reactor are described. Ferritic-martensitic alloy HT9 (heat 84425) in the form of a hexagonal fuel bundle duct (ACO-3) accumulated 155 dpa at an average temperature of 443°C in the Fast Flux Test Facility (FFTF). Using invariance theory as a guide, irradiation of the same heat was conducted using self-ions (Fe++) at 5 MeV, at a temperature of 460°C, to a dose of 188 displacements per atom. The void swelling was nearly identical between the two irradiations, and the size and density of precipitates and loops following ion irradiation are within a factor of two of those for neutron irradiation. The level of agreement across all of the principal microstructure changes between ion and reactor irradiation establishes the capability of tailoring ion irradiation to emulate the reactor-irradiated microstructure.

  7. A robotic orbital emulator with lidar-based SLAM and AMCL for multiple entity pose estimation

    NASA Astrophysics Data System (ADS)

    Shen, Dan; Xiang, Xingyu; Jia, Bin; Wang, Zhonghai; Chen, Genshe; Blasch, Erik; Pham, Khanh

    2018-05-01

    This paper revises and evaluates an orbital emulator (OE) for space situational awareness (SSA). The OE can reproduce 3D satellite movements using capabilities generated from omni-wheeled robot and robotic arm motions. The 3D motion of a satellite is partitioned into movements in the equatorial plane and up-down motions in the vertical plane. The in-plane motions are emulated by omni-wheeled robots, while the up-down motions are performed by a stepper-motor-controlled ball moving along a rod (the robotic arm) attached to each robot. Lidar-only measurements are used to estimate the pose information of the multiple robots. SLAM (simultaneous localization and mapping) runs on one robot to generate the map and compute that robot's pose. Based on the SLAM map maintained by that robot, the other robots run the adaptive Monte Carlo localization (AMCL) method to estimate their poses. A controller is designed to guide each robot to follow a given orbit, and controllability is analyzed using a feedback linearization method. Experiments are conducted to show the convergence of AMCL and the orbit tracking performance.

  8. Designing Cyber Exercises

    DTIC Science & Technology

    2014-10-01

    maneuver • Traditional "Brick and Mortar" training models – Difficult to train regularly due to logistics/budget restrictions – Doesn't scale... complexity, scenario, location, and resources available • Scalable 4-cell planning construct – Exercise Control (White Cell) – Threat Emulation (Red... business impact) • Collaborative effort – Trusted Agents (SMEs) – Threats – Cyber defense capabilities – Policies and procedures – Project and/or

  9. Appendices to the model description document for a computer program for the emulation/simulation of a space station environmental control and life support system

    NASA Technical Reports Server (NTRS)

    Yanosy, James L.

    1988-01-01

    A Model Description Document for the Emulation Simulation Computer Model was published previously. The model consisted of a detailed model (emulation) of a SAWD CO2 removal subsystem which operated with much less detailed (simulation) models of a cabin, crew, and condensing and sensible heat exchangers. The purpose was to explore the utility of such an emulation/simulation combination in the design, development, and test of a piece of ARS hardware, SAWD. Extensions to this original effort are presented. The first extension is an update of the model to reflect changes in the SAWD control logic which resulted from testing; slight changes were also made to the SAWD model to permit restarting and to improve the iteration technique. The second extension is the development of simulation models for more pieces of air and water processing equipment. Models are presented for: EDC, Molecular Sieve, Bosch, Sabatier, a new condensing heat exchanger, SPE, SFWES, Catalytic Oxidizer, and multifiltration. The third extension is the creation of two system simulations using these models. The first system presented consists of one air and one water processing system; the second consists of a potential air revitalization system.

  10. Appendices to the user's manual for a computer program for the emulation/simulation of a space station environmental control and life support system

    NASA Technical Reports Server (NTRS)

    Yanosy, James L.

    1988-01-01

    A user's Manual for the Emulation Simulation Computer Model was published previously. The model consisted of a detailed model (emulation) of a SAWD CO2 removal subsystem which operated with much less detailed (simulation) models of a cabin, crew, and condensing and sensible heat exchangers. The purpose was to explore the utility of such an emulation/simulation combination in the design, development, and test of a piece of ARS hardware - SAWD. Extensions to this original effort are presented. The first extension is an update of the model to reflect changes in the SAWD control logic which resulted from the test. In addition, slight changes were also made to the SAWD model to permit restarting and to improve the iteration technique. The second extension is the development of simulation models for more pieces of air and water processing equipment. Models are presented for: EDC, Molecular Sieve, Bosch, Sabatier, a new condensing heat exchanger, SPE, SFWES, Catalytic Oxidizer, and multifiltration. The third extension is to create two system simulations using these models. The first system presented consists of one air and one water processing system, the second a potential Space Station air revitalization system.

  11. Establishing the Capability of a 1D SVAT Modelling Scheme in Predicting Key Biophysical Vegetation Characterisation Parameters

    NASA Astrophysics Data System (ADS)

    Ireland, Gareth; Petropoulos, George P.; Carlson, Toby N.; Purdy, Sarah

    2015-04-01

    Sensitivity analysis (SA) is an integral and important validation check of a computer simulation model before it is used to perform any kind of analysis. In the present work, we report the results of a SA performed on the SimSphere Soil Vegetation Atmosphere Transfer (SVAT) model using a cutting-edge and robust Global Sensitivity Analysis (GSA) approach based on the Gaussian Emulation Machine for Sensitivity Analysis (GEM-SA) tool. The sensitivity of the following model outputs was evaluated: the ambient CO2 concentration and the rate of CO2 uptake by the plant, the ambient O3 concentration, the flux of O3 from the air to the plant/soil boundary, and the flux of O3 taken up by the plant alone. The most sensitive model inputs for the majority of model outputs were related to the structural properties of vegetation, namely the Leaf Area Index, Fractional Vegetation Cover, Cuticle Resistance and Vegetation Height. The external CO2 concentration in the leaf and the O3 concentration in the air also exerted a significant influence on model outputs. This work is a very important step towards an all-inclusive evaluation of SimSphere. Indeed, results from this study contribute decisively towards establishing its capability as a useful teaching and research tool in modelling Earth's land surface interactions. This is of considerable importance in light of the rapidly expanding use of this model worldwide, which also includes research conducted by various Space Agencies examining its synergistic use with Earth Observation data towards the development of operational products at a global scale. This research was supported by the European Commission Marie Curie Re-Integration Grant "TRANSFORM-EO". SimSphere is currently maintained and freely distributed by the Department of Geography and Earth Sciences at Aberystwyth University (http://www.aber.ac.uk/simsphere).
Keywords: CO2 flux, ambient CO2, O3 flux, SimSphere, Gaussian process emulators, BACCO GEM-SA, TRANSFORM-EO.
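    The variance-based sensitivity indices that GEM-SA estimates can be illustrated on a toy model. The sketch below computes first-order Sobol' indices by brute force on a grid for an invented two-input function; GEM-SA obtains the same quantities far more cheaply by fitting a Gaussian-process emulator to a modest number of model runs.

```python
# First-order variance-based sensitivity indices for a toy two-input
# model, computed directly on a uniform grid over [0, 1] x [0, 1].
# S_i = Var( E[Y | X_i] ) / Var(Y): the fraction of output variance
# explained by input i alone.

def first_order_indices(f, n=101):
    xs = [i / (n - 1) for i in range(n)]                 # uniform grid on [0, 1]
    ys = [f(a, b) for a in xs for b in xs]
    mean = sum(ys) / len(ys)
    var_total = sum((y - mean) ** 2 for y in ys) / len(ys)
    # Conditional means: average out the other input, then take the variance.
    cond_a = [sum(f(a, b) for b in xs) / n for a in xs]
    cond_b = [sum(f(a, b) for a in xs) / n for b in xs]
    var_a = sum((c - mean) ** 2 for c in cond_a) / n
    var_b = sum((c - mean) ** 2 for c in cond_b) / n
    return var_a / var_total, var_b / var_total

# An additive model dominated by its first input: the indices confirm it.
s1, s2 = first_order_indices(lambda a, b: a + 0.1 * b)
```

    For an additive model like this one the two indices sum to one; interactions between inputs would show up as a shortfall, which is what total-effect indices capture.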

  12. Sensor Management for Fighter Applications

    DTIC Science & Technology

    2006-06-01

    has consistently shown that by directly estimating the probability density of a target state using a track-before-detect scheme, weak and densely... track-before-detect nonlinear filter was constructed to estimate the joint density of all state variables. A simulation that emulates estimator... targets in clutter and noise from sensed kinematic and identity data. Among the most capable is track-before-detect (TBD), which delivers

  13. Microcomputer software development facilities

    NASA Technical Reports Server (NTRS)

    Gorman, J. S.; Mathiasen, C.

    1980-01-01

    A more efficient and cost-effective method for developing microcomputer software is to utilize a host computer with high-speed peripheral support. Application programs such as cross assemblers, loaders, and simulators are implemented on the host computer for each of the microcomputers for which software development is required. The host computer is configured to operate in a time-share mode for multiple users. The remote terminals, printers, and downloading capabilities provided are based on user requirements. With this configuration a user, either local or remote, can use the host computer for microcomputer software development. Once the software is developed (through the code and modular debug stage) it can be downloaded to a development system or emulator in a test area where hardware/software integration can proceed. The microcomputer software program sources reside on the host computer and can be edited, assembled, loaded, and then downloaded as required until the software development project has been completed.

  14. Using quantum process tomography to characterize decoherence in an analog electronic device

    NASA Astrophysics Data System (ADS)

    Ostrove, Corey; La Cour, Brian; Lanham, Andrew; Ott, Granville

    The mathematical structure of a universal gate-based quantum computer can be emulated faithfully on a classical electronic device using analog signals to represent a multi-qubit state. We describe a prototype device capable of performing a programmable sequence of single-qubit and controlled two-qubit gate operations on a pair of voltage signals representing the real and imaginary parts of a two-qubit quantum state. Analog filters and true-RMS voltage measurements are used to perform unitary and measurement gate operations. We characterize the degradation of the represented quantum state with successive gate operations by formally performing quantum process tomography to estimate the equivalent decoherence channel. Experimental measurements indicate that the performance of the device may be accurately modeled as an equivalent quantum operation closely resembling a depolarizing channel with a fidelity of over 99%. This work was supported by the Office of Naval Research under Grant No. N00014-14-1-0323.
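    The gate sequence such a device performs can be checked against a conventional state-vector emulation. The sketch below is an illustrative pure-Python representation of a two-qubit state (the hardware described above uses analog voltage signals instead); it applies a Hadamard followed by a CNOT to prepare a Bell state, with qubit 0 taken as the least-significant bit of the basis-state index.

```python
import math

def apply_single(gate, target, state):
    """Apply a 2x2 gate to one qubit of a 2-qubit state vector (length 4)."""
    out = [0j] * 4
    for i in range(4):
        bit = (i >> target) & 1
        for new_bit in (0, 1):
            j = (i & ~(1 << target)) | (new_bit << target)
            out[j] += gate[new_bit][bit] * state[i]
    return out

def apply_cnot(control, target, state):
    """Flip the target bit of every basis state whose control bit is set."""
    out = list(state)
    for i in range(4):
        if (i >> control) & 1:
            out[i ^ (1 << target)] = state[i]   # paired indices swap amplitudes
    return out

h = 1 / math.sqrt(2)
H = [[h, h], [h, -h]]                            # Hadamard gate
bell = apply_cnot(0, 1, apply_single(H, 0, [1 + 0j, 0j, 0j, 0j]))
# bell now holds (|00> + |11>)/sqrt(2)
```

    Repeating such a sequence while injecting noise into the amplitudes is the software analogue of the decoherence characterization the paper performs with quantum process tomography.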

  15. rpe v5: an emulator for reduced floating-point precision in large numerical simulations

    NASA Astrophysics Data System (ADS)

    Dawson, Andrew; Düben, Peter D.

    2017-06-01

    This paper describes the rpe (reduced-precision emulator) library, which has the capability to emulate the use of arbitrary reduced floating-point precision within large numerical models written in Fortran. The rpe software allows model developers to test how reduced floating-point precision affects the results of their simulations without having to make extensive code changes or port the model onto specialized hardware. The software can be used to identify parts of a program that are problematic for numerical precision and to guide changes to the program to allow a stronger reduction in precision. The development of rpe was motivated by the strong demand for more computing power. If numerical precision can be reduced for an application under consideration while still achieving results of acceptable quality, computational cost can be reduced, since a reduction in numerical precision may allow an increase in performance or a reduction in power consumption. For simulations with weather and climate models, savings due to a reduction in precision could be reinvested to allow model simulations at higher spatial resolution or complexity, or to increase the number of ensemble members to improve predictions. rpe was developed with a particular focus on the weather and climate modelling community, but the software could be used with numerical simulations from other domains.
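    The core operation of such an emulator can be sketched in a few lines. The function below rounds a double to a chosen number of significand bits; the name and interface are illustrative only and do not reproduce the actual rpe Fortran API, which wraps array types so that rounding happens automatically after every assignment.

```python
import math

def reduce_precision(x: float, sig_bits: int) -> float:
    """Round x to sig_bits bits of significand (round-to-nearest).

    Decompose x as m * 2**e with 0.5 <= |m| < 1, round m onto a grid of
    2**sig_bits steps, and reassemble. Zeros, infinities and NaNs pass
    through unchanged.
    """
    if x == 0.0 or not math.isfinite(x):
        return x
    m, e = math.frexp(x)
    scale = 2.0 ** sig_bits
    return math.ldexp(round(m * scale) / scale, e)
```

    For example, reduce_precision(1.2345, 3) keeps only three significand bits and returns 1.25; emulating half precision would use roughly 11 bits, and the residual difference from the full-precision result is exactly the kind of signal rpe lets modellers inspect.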

  16. Friendly Neighborhood Computer Project. Extension of the IBM NJE network to DEC VAX computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raffenetti, R.C.; Bertoncini, P.J.; Engert, D.E.

    1984-07-01

    This manual is divided into six chapters. The first is an overview of the VAX NJE emulator system and describes what can be done with the VAX NJE emulator software. The second chapter describes the commands that users of the VAX systems will use. Each command description includes the format of the command, a list of valid options and parameters and their meanings, and several short examples of command use. The third chapter describes the commands and capabilities for sending general, sequential files from and to VAX VMS nodes. The fourth chapter describes how to transmit data to a VAX from other computer systems on the network. The fifth chapter explains how to exchange electronic mail with IBM CMS users and with users of other VAX VMS systems connected by NJE communications. The sixth chapter describes operator procedures and the additional commands operators may use.

  17. Approaches for scalable modeling and emulation of cyber systems : LDRD final report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayo, Jackson R.; Minnich, Ronald G.; Armstrong, Robert C.

    2009-09-01

    The goal of this research was to combine theoretical and computational approaches to better understand the potential emergent behaviors of large-scale cyber systems, such as networks of ~10^6 computers. The scale and sophistication of modern computer software, hardware, and deployed networked systems have significantly exceeded the computational research community's ability to understand, model, and predict current and future behaviors. This predictive understanding, however, is critical to the development of new approaches for proactively designing new systems or enhancing existing systems with robustness to current and future cyber threats, including distributed malware such as botnets. We have developed preliminary theoretical and modeling capabilities that can ultimately answer questions such as: How would we reboot the Internet if it were taken down? Can we change network protocols to make them more secure without disrupting existing Internet connectivity and traffic flow? We have begun to address these issues by developing new capabilities for understanding and modeling Internet systems at scale. Specifically, we have addressed the need for scalable network simulation by carrying out emulations of a network with ~10^6 virtualized operating system instances on a high-performance computing cluster - a 'virtual Internet'. We have also explored mappings between previously studied emergent behaviors of complex systems and their potential cyber counterparts. Our results provide foundational capabilities for further research toward understanding the effects of complexity in cyber systems, to allow anticipating and thwarting hackers.

  18. Conjunctively optimizing flash flood control and water quality in urban water reservoirs by model predictive control and dynamic emulation

    NASA Astrophysics Data System (ADS)

    Galelli, Stefano; Goedbloed, Albert; Schmitter, Petra; Castelletti, Andrea

    2014-05-01

    Urban water reservoirs are a viable adaptation option for meeting the increasing drinking water demand of urbanized areas, as they allow storage and re-use of water that would otherwise be lost. In addition, the direct availability of freshwater reduces pumping costs and diversifies the portfolio of drinking water supply. Yet these benefits come at a twofold cost. Firstly, the presence of large impervious areas increases the hydraulic efficiency of urban catchments, with short times of concentration, increased runoff rates, losses of infiltration and baseflow, and higher risk of flash floods. Secondly, the high concentrations of nutrients and sediments characterizing urban discharges are likely to cause water quality problems. In this study we propose a new control scheme combining Model Predictive Control (MPC), hydro-meteorological forecasts and dynamic model emulation to design real-time operating policies that conjunctively optimize water quantity and quality targets. The main advantage of this scheme lies in its capability of exploiting real-time hydro-meteorological forecasts, which are crucial in such fast-varying systems. In addition, the reduced computational requirements of the MPC scheme allow coupling it with dynamic emulators of water quality processes. The approach is demonstrated on Marina Reservoir, a multi-purpose reservoir located in the heart of Singapore and characterized by a large, highly urbanized catchment with a short time of concentration (approximately one hour). Results show that the MPC scheme, coupled with a water quality emulator, provides a good compromise between the different operating objectives, namely flood risk reduction, drinking water supply and salinity control. Finally, the scheme is used to assess the effect of source control measures (e.g. green roofs) aimed at restoring the natural hydrological regime of the Marina Reservoir catchment.
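    The receding-horizon idea behind MPC can be sketched with a deliberately simplified single-reservoir model. All numbers below (storage bounds, candidate releases, penalty weights) are invented for illustration, and brute-force enumeration stands in for a proper optimizer; the real controller optimizes jointly over water quantity and the emulated water-quality dynamics.

```python
# Minimal receding-horizon control of one reservoir: at each step, search
# over a small set of release decisions for the plan that minimizes the
# accumulated penalty over the inflow forecast, then apply only the
# first decision. Units and values are purely illustrative.

S_MAX, S_TARGET = 100.0, 60.0           # storage capacity / supply target
RELEASES = [0.0, 5.0, 10.0, 20.0]       # candidate releases per step

def stage_cost(storage):
    flood = max(0.0, storage - S_MAX) ** 2        # penalize overtopping
    supply = max(0.0, S_TARGET - storage) ** 2    # penalize drawdown
    return flood + supply

def plan(storage, forecast):
    """Return (total cost, best first release) over the forecast horizon."""
    if not forecast:
        return 0.0, None
    best_cost, best_u = float("inf"), None
    for u in RELEASES:
        s_next = max(0.0, storage + forecast[0] - u)   # mass balance
        cost = stage_cost(s_next) + plan(s_next, forecast[1:])[0]
        if cost < best_cost:
            best_cost, best_u = cost, u
    return best_cost, best_u

def mpc_step(storage, inflow_forecast):
    """Receding horizon: plan over the whole forecast, apply only the first move."""
    return plan(storage, inflow_forecast)[1]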

  19. FMCW Radar Jamming Techniques and Analysis

    DTIC Science & Technology

    2013-09-01

    an education system that is compacted with various radar capabilities, the circuitry does not provide the full functionality of each type of radar as...example of a typical FMCW architecture. The hardware components and their functionalities are explained individually in the order of the signal processing...drawn. Chapter IV presents a MATLAB model that emulates the functionality of the homodyne FMCW radar discussed in Chapter II. The model design and

  20. Topics in programmable automation. [for materials handling, inspection, and assembly

    NASA Technical Reports Server (NTRS)

    Rosen, C. A.

    1975-01-01

    Topics explored in the development of integrated programmable automation systems include: numerically controlled and computer controlled machining; machine intelligence and the emulation of human-like capabilities; large scale semiconductor integration technology applications; and sensor technology for asynchronous local computation without burdening the executive minicomputer which controls the whole system. The role and development of training aids, and the potential application of these aids to augmented teleoperator systems are discussed.

  1. Neurokernel: An Open Source Platform for Emulating the Fruit Fly Brain

    PubMed Central

    2016-01-01

    We have developed an open software platform called Neurokernel for collaborative development of comprehensive models of the brain of the fruit fly Drosophila melanogaster and their execution and testing on multiple Graphics Processing Units (GPUs). Neurokernel provides a programming model that capitalizes upon the structural organization of the fly brain into a fixed number of functional modules to distinguish between these modules’ local information processing capabilities and the connectivity patterns that link them. By defining mandatory communication interfaces that specify how data is transmitted between models of each of these modules regardless of their internal design, Neurokernel explicitly enables multiple researchers to collaboratively model the fruit fly’s entire brain by integration of their independently developed models of its constituent processing units. We demonstrate the power of Neurokernel’s model integration by combining independently developed models of the retina and lamina neuropils in the fly’s visual system and by demonstrating their neuroinformation processing capability. We also illustrate Neurokernel’s ability to take advantage of direct GPU-to-GPU data transfers with benchmarks that demonstrate scaling of Neurokernel’s communication performance both over the number of interface ports exposed by an emulation’s constituent modules and the total number of modules comprised by an emulation. PMID:26751378

  2. Intelligent Systems For Aerospace Engineering: An Overview

    NASA Technical Reports Server (NTRS)

    KrishnaKumar, K.

    2003-01-01

    Intelligent systems are nature-inspired, mathematically sound, computationally intensive problem solving tools and methodologies that have become extremely important for advancing the current trends in information technology. Artificially intelligent systems currently utilize computers to emulate various faculties of human intelligence and biological metaphors. They use a combination of symbolic and sub-symbolic systems capable of evolving human cognitive skills and intelligence, not just systems capable of doing things humans do not do well. Intelligent systems are ideally suited for tasks such as search and optimization, pattern recognition and matching, planning, uncertainty management, control, and adaptation. In this paper, the intelligent system technologies and their application potential are highlighted via several examples.

  3. Intelligent Systems for Aerospace Engineering: An Overview

    NASA Technical Reports Server (NTRS)

    Krishnakumar, Kalmanje

    2002-01-01

    Intelligent systems are nature-inspired, mathematically sound, computationally intensive problem solving tools and methodologies that have become extremely important for advancing the current trends in information technology. Artificially intelligent systems currently utilize computers to emulate various faculties of human intelligence and biological metaphors. They use a combination of symbolic and sub-symbolic systems capable of evolving human cognitive skills and intelligence, not just systems capable of doing things humans do not do well. Intelligent systems are ideally suited for tasks such as search and optimization, pattern recognition and matching, planning, uncertainty management, control, and adaptation. In this paper, the intelligent system technologies and their application potential are highlighted via several examples.

  4. User's manual for a computer program for the emulation/simulation of a space station Environmental Control and Life Support System (ESCM)

    NASA Technical Reports Server (NTRS)

    Yanosy, James L.

    1988-01-01

    This manual describes how to use the Emulation Simulation Computer Model (ESCM). Based on G189A, ESCM computes the transient performance of a Space Station atmospheric revitalization subsystem (ARS) with CO2 removal provided by a solid amine water desorbed subsystem called SAWD. Many performance parameters are computed some of which are cabin CO2 partial pressure, relative humidity, temperature, O2 partial pressure, and dew point. The program allows the user to simulate various possible combinations of man loading, metabolic profiles, cabin volumes and certain hypothesized failures that could occur.

  5. UPEML Version 3.0: A machine-portable CDC update emulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mehlhorn, T.A.; Haill, T.A.

    1992-04-01

    UPEML is a machine-portable program that emulates a subset of the functions of the standard CDC Update. Machine-portability has been achieved by conforming to ANSI standards for Fortran-77. UPEML is compact and fairly efficient; however, it only allows a restricted syntax as compared with the CDC Update. This program was written primarily to facilitate the use of CDC-based scientific packages on alternate computer systems such as the VAX/VMS mainframes and UNIX workstations. UPEML has also been successfully used on the multiprocessor ELXSI, on CRAYs under both UNICOS and CTSS operating systems, and on Sun, HP, Stardent and IBM workstations. UPEML was originally released with the ITS electron/photon Monte Carlo transport package, which was developed on a CDC-7600 and makes extensive use of conditional file structure to combine several problem geometry and machine options into a single program file. UPEML 3.0 is an enhanced version of the original code and is being independently released for use at any installation or with any code package. Version 3.0 includes enhanced error checking, full ASCII character support, a program library audit capability, and a partial update option in which only selected or modified decks are written to the compile file. Version 3.0 also checks for overlapping corrections, allows processing of nested calls to common decks, and allows the use of alternate files in READ and ADDFILE commands. Finally, UPEML Version 3.0 allows the assignment of input and output files at runtime on the control line.
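The deck-and-directive editing model that an Update emulator implements can be sketched in a few lines: a "deck" is a numbered sequence of source lines, and correction directives delete or insert lines addressed by their original line numbers. The directive tuples and function below are a hypothetical illustration of the concept only; they do not reproduce the actual directive syntax of CDC Update or UPEML.

```python
# Toy sketch of Update-style deck correction: deletions and insertions
# are both addressed against the ORIGINAL line numbers of the deck,
# which is why overlapping corrections need the checking UPEML added.

def apply_corrections(deck, directives):
    """deck: list of source lines (addressed 1..len(deck)).
    directives: ('D', lo, hi) deletes original lines lo..hi;
                ('I', after, [lines]) inserts after original line n."""
    keep = {i + 1: [line] for i, line in enumerate(deck)}
    inserts = {0: []}
    for d in directives:
        if d[0] == 'D':
            _, lo, hi = d
            for n in range(lo, hi + 1):
                keep[n] = []                 # drop the original line
        elif d[0] == 'I':
            _, after, lines = d
            inserts.setdefault(after, []).extend(lines)
    out = list(inserts[0])
    for n in sorted(keep):
        out.extend(keep[n])
        out.extend(inserts.get(n, []))
    return out

deck = ["PROGRAM MAIN", "X = 1", "PRINT X", "END"]
new = apply_corrections(deck, [('D', 2, 2), ('I', 2, ["X = 2"])])
# new == ["PROGRAM MAIN", "X = 2", "PRINT X", "END"]
```

Because every directive refers to the unmodified line numbering, corrections commute unless they touch the same address range, which is exactly the overlap case a real emulator must detect.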

  7. Digital tracking loops for a programmable digital modem

    NASA Technical Reports Server (NTRS)

    Poklemba, John J.

    1992-01-01

    In this paper, an analysis and hardware emulation of the tracking loops for a very flexible programmable digital modem (PDM) will be presented. The modem is capable of being programmed for 2, 4, 8, 16-PSK, 16-QAM, MSK, and Offset-QPSK modulation schemes over a range of data rates from 2.34 to 300 Mbps with programmable spectral occupancy from 1.2 to 1.8 times the symbol rate; these operational parameters are executable in burst or continuous mode. All of the critical processing in both the modulator and demodulator is done at baseband with very high-speed digital hardware and memory. Quadrature analog front-ends are used for translation between baseband and the IF center frequency. The modulator is based on a table lookup approach, where precomputed samples are stored in memory and clocked out according to the incoming data pattern. The sample values are predistorted to counteract the effects of the other filtering functions in the link as well as any transmission impairments. The demodulator architecture was adapted from a joint estimator-detector (JED) mathematical analysis. Its structure is applicable to most signalling formats that can be represented in a two-dimensional space. The JED realization uses interdependent, mutually aiding tracking loops with post-detection data feedback. To expedite and provide for more reliable synchronization, initial estimates for these loops are computed in a parallel acquisition processor. The cornerstone of the demodulator realization is the pre-averager received data filter, which allows operation over a broad range of data rates without any hardware changes and greatly simplifies the implementation complexity. The emulation results confirmed tracking loop operation over the entire range of operational parameters listed above, as well as the capability of achieving and maintaining synchronization at BERs in excess of 10^-1.
The emulation results also showed very close agreement with the tracking loop analysis, and validated the resolution apportionment of the various hardware elements in the tracking loops.
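The decision-directed phase tracking idea underlying such loops can be illustrated with a minimal first-order loop for BPSK. This is a conceptual sketch only, far simpler than the PDM's interdependent estimator-detector loops; the loop gain and noiseless signal model are illustrative assumptions.

```python
import cmath

# Minimal first-order decision-directed carrier phase tracking loop
# for BPSK: de-rotate by the current estimate, strip the data
# modulation with a hard decision, and integrate the residual error.

def track_phase(symbols, gain=0.1):
    """Return the running phase-offset estimate (radians)."""
    est = 0.0
    track = []
    for s in symbols:
        derot = s * cmath.exp(-1j * est)       # de-rotate by estimate
        decision = 1.0 if derot.real >= 0 else -1.0
        err = derot.imag * decision            # strips data modulation
        est += gain * err                      # first-order loop update
        track.append(est)
    return track

# BPSK data with a constant 0.5 rad carrier phase offset
data = [1, -1, 1, 1, -1, 1, -1, -1] * 25
rx = [d * cmath.exp(1j * 0.5) for d in data]
estimates = track_phase(rx)
# the estimate converges toward the true 0.5 rad offset
```

The data-feedback step (multiplying the error by the symbol decision) is what lets the loop track through modulation, the same principle the JED loops apply with post-detection feedback.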

  8. USAF Expeditionary Security Operations 2040:A Technology Vision For Deployed Air Base Defense Capabilities

    DTIC Science & Technology

    2014-04-09

    development is the ability to connect to language translation service providers, exemplified by the hand-held Enabling Language Service Anywhere (ELSA) ... device. Designed primarily for use by first responders, ELSA connects via cellular signal to a company employing interpreters for over 180 languages ... ELSA provides a possible model for military emulation where a pool of linguists is available on-call for use in a tactical environment

  9. Time-Filtered Navier-Stokes Approach and Emulation of Turbulence-Chemistry Interaction

    NASA Technical Reports Server (NTRS)

    Liu, Nan-Suey; Wey, Thomas; Shih, Tsan-Hsing

    2013-01-01

    This paper describes the time-filtered Navier-Stokes approach capable of capturing unsteady flow structures important for turbulent mixing and an accompanying subgrid model directly accounting for the major processes in turbulence-chemistry interaction. They have been applied to the computation of two-phase turbulent combustion occurring in a single-element lean-direct-injection combustor. Some of the preliminary results from this computational effort are presented in this paper.

  10. Integration of Marine Mammal Movement and Behavior into the Effects of Sound on the Marine Environment

    DTIC Science & Technology

    2011-09-30

    capability to emulate the dive and movement behavior of marine mammals provides a significant advantage in modeling environmental impact over the historic ... approaches used in Navy environmental assessments (EA) and impact statements (EIS). Many previous methods have been statistical or pseudo-statistical ... Siderius. 2011. Comparison of methods used for computing the impact of sound on the marine environment, Marine Environmental Research, 71:342-350. [published

  11. ALLTEM Multi-Axis Electromagnetic Induction System Demonstration and Validation

    DTIC Science & Technology

    2011-11-17

    fencing that test the capabilities of the platform systems. Recently, the Open Field area was reconfigured to emulate typical impact area conditions. The ... surveyed. • Open field (indirect fire): the indirect fire subarea contains only three munition types that could be typically found at an impact area ... direct fire subarea contains only three munition types that could be typically found at an impact area of a direct fire weapons range. These are 25 mm

  12. Comparison of cooperative and non-cooperative adaptive optics reference performance for propagation with thermal blooming effects

    NASA Astrophysics Data System (ADS)

    Edwards, Brian E.; Nitkowski, Arthur; Lawrence, Ryan; Horton, Kasey; Higgs, Charles

    2004-10-01

    Atmospheric turbulence and laser-induced thermal blooming effects can degrade the beam quality of a high-energy laser (HEL) weapon, and ultimately limit the amount of energy deliverable to a target. Lincoln Laboratory has built a thermal blooming laboratory capable of emulating atmospheric thermal blooming and turbulence effects for tactical HEL systems. The HEL weapon emulation hardware includes an adaptive optics beam delivery system, which utilizes a Shack-Hartmann wavefront sensor and a 349-actuator deformable mirror. For this experiment, the laboratory was configured to emulate an engagement scenario consisting of a sea-skimming target approaching directly toward the HEL weapon at a range of 10 km. The weapon utilizes a 1.5 m aperture and radiates at a 1.62 micron wavelength. An adaptive optics reference beam was provided as either a point source located at the target (cooperative) or a projected point source reflected from the target (uncooperative). Performance of the adaptive optics system was then compared between reference sources. Results show that, for operating conditions with a thermal blooming distortion number of 75 and weak turbulence (Rytov of 0.02 and D/r0 of 3), cooperative beacon AO correction experiences Phase Compensation Instability, resulting in lower performance than a simple, open-loop condition. The uncooperative beacon resulted in slightly better performance than the open-loop condition.

  13. Improving Non-Observational Experiences: Channelling and Ordering

    ERIC Educational Resources Information Center

    de Zeeuw, Gerard

    2011-01-01

    That the present day society profits from research in many areas is evident. This has stimulated a keen desire to emulate similarly advantageous contributions in other areas. It appears to imply not only a need to know how to (better) support action in general or any action, but also how to support the act of making "better" itself (better…

  14. FEM design and simulation of a short, 10 MV, S-band Linac with Monte Carlo dose simulations.

    PubMed

    Baillie, Devin; St Aubin, J; Fallone, B G; Steciw, S

    2015-04-01

    Current commercial 10 MV Linac waveguides are 1.5 m. The authors' current 6 MV linear accelerator-magnetic resonance imager (Linac-MR) system fits in typical radiotherapy vaults. To allow 10 MV treatments with the Linac-MR and still fit within typical vaults, the authors design a 10 MV Linac with an accelerator waveguide of the same length (27.5 cm) as current 6 MV Linacs. The first design stage is to design a cavity such that a specific experimental measurement for breakdown is applicable to the cavity. This is accomplished through the use of finite element method (FEM) simulations to match published shunt impedance, Q factor, and ratio of peak to mean-axial electric field strength from an electric breakdown study. A full waveguide is then designed and tuned in FEM simulations based on this cavity design. Electron trajectories are computed through the resulting radio frequency fields, and the waveguide geometry is modified by shifting the first coupling cavity in order to optimize the electron beam properties until the energy spread and mean energy closely match values published for an emulated 10 MV Linac. Finally, Monte Carlo dose simulations are used to compare the resulting photon beam depth dose profile and penumbra with that produced by the emulated 10 MV Linac. The shunt impedance, Q factor, and ratio of peak to mean-axial electric field strength are all matched to within 0.1%. A first coupling cavity shift of 1.45 mm produces an energy spectrum width of 0.347 MeV, very close to the published value for the emulated 10 MV of 0.315 MeV, and a mean energy of 10.53 MeV, nearly identical to the published 10.5 MeV for the emulated 10 MV Linac. The depth dose profile produced by their new Linac is within 1% of that produced by the emulated 10 MV spectrum for all depths greater than 1.5 cm. The penumbra produced is 11% narrower, as measured from 80% to 20% of the central axis dose. 
    The authors have successfully designed and simulated an S-band waveguide of length 27.5 cm capable of producing a 10 MV photon beam. This waveguide operates well within the breakdown threshold determined for the cavity geometry used. The designed Linac produces depth dose profiles similar to those of the emulated 10 MV Linac (waveguide length of 1.5 m) but yields a narrower penumbra.

  15. FEM design and simulation of a short, 10 MV, S-band Linac with Monte Carlo dose simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baillie, Devin; Aubin, J. St.; Steciw, S., E-mail: ssteciw@ualberta.ca

    2015-04-15

    Purpose: Current commercial 10 MV Linac waveguides are 1.5 m. The authors’ current 6 MV linear accelerator–magnetic resonance imager (Linac–MR) system fits in typical radiotherapy vaults. To allow 10 MV treatments with the Linac–MR and still fit within typical vaults, the authors design a 10 MV Linac with an accelerator waveguide of the same length (27.5 cm) as current 6 MV Linacs. Methods: The first design stage is to design a cavity such that a specific experimental measurement for breakdown is applicable to the cavity. This is accomplished through the use of finite element method (FEM) simulations to match published shunt impedance, Q factor, and ratio of peak to mean-axial electric field strength from an electric breakdown study. A full waveguide is then designed and tuned in FEM simulations based on this cavity design. Electron trajectories are computed through the resulting radio frequency fields, and the waveguide geometry is modified by shifting the first coupling cavity in order to optimize the electron beam properties until the energy spread and mean energy closely match values published for an emulated 10 MV Linac. Finally, Monte Carlo dose simulations are used to compare the resulting photon beam depth dose profile and penumbra with that produced by the emulated 10 MV Linac. Results: The shunt impedance, Q factor, and ratio of peak to mean-axial electric field strength are all matched to within 0.1%. A first coupling cavity shift of 1.45 mm produces an energy spectrum width of 0.347 MeV, very close to the published value for the emulated 10 MV of 0.315 MeV, and a mean energy of 10.53 MeV, nearly identical to the published 10.5 MeV for the emulated 10 MV Linac. The depth dose profile produced by their new Linac is within 1% of that produced by the emulated 10 MV spectrum for all depths greater than 1.5 cm. The penumbra produced is 11% narrower, as measured from 80% to 20% of the central axis dose. 
    Conclusions: The authors have successfully designed and simulated an S-band waveguide of length 27.5 cm capable of producing a 10 MV photon beam. This waveguide operates well within the breakdown threshold determined for the cavity geometry used. The designed Linac produces depth dose profiles similar to those of the emulated 10 MV Linac (waveguide length of 1.5 m) but yields a narrower penumbra.

  16. Multiple ion beam irradiation for the study of radiation damage in materials

    NASA Astrophysics Data System (ADS)

    Taller, Stephen; Woodley, David; Getto, Elizabeth; Monterrosa, Anthony M.; Jiao, Zhijie; Toader, Ovidiu; Naab, Fabian; Kubley, Thomas; Dwaraknath, Shyam; Was, Gary S.

    2017-12-01

    The effects of transmutation produced helium and hydrogen must be included in ion irradiation experiments to emulate the microstructure of reactor irradiated materials. Descriptions of the criteria and systems necessary for multiple ion beam irradiation are presented and validated experimentally. A calculation methodology was developed to quantify the spatial distribution, implantation depth and amount of energy-degraded and implanted light ions when using a thin foil rotating energy degrader during multi-ion beam irradiation. A dual ion implantation using 1.34 MeV Fe+ ions and energy-degraded D+ ions was conducted on single crystal silicon to benchmark the dosimetry used for multi-ion beam irradiations. Secondary Ion Mass Spectroscopy (SIMS) analysis showed good agreement with calculations of the peak implantation depth and the total amount of iron and deuterium implanted. The results establish the capability to quantify the ion fluence from both heavy ion beams and energy-degraded light ion beams for the purpose of using multi-ion beam irradiations to emulate reactor irradiated microstructures.

  17. A Practical Guide To Solar Array Simulation And PCDU Test

    NASA Astrophysics Data System (ADS)

    Schmitz, Noah; Carroll, Greg; Clegg, Russell

    2011-10-01

    Solar arrays consisting of multiple photovoltaic segments provide power to satellites and charge internal batteries for use during eclipse. Solar arrays have unique I-V characteristics and output power which vary with environmental and operational conditions such as temperature, irradiance, spin, and eclipse. Therefore, specialty power solutions are needed to properly test the satellite on the ground, especially the Power Control and Distribution Unit (PCDU) and the Array Power Regulator (APR). This paper explores some practical and theoretical considerations that should be taken into account when choosing a commercial, off-the-shelf solar array simulator (SAS) for verification of the satellite PCDU. An SAS is a unique power supply with I-V output characteristics that emulate the solar arrays used to power a satellite. It is important to think about the strengths and the limitations of this emulation capability, how closely the SAS approximates a real solar panel, and how best to design a system using SASs as components.
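The static I-V characteristic an SAS must reproduce can be sketched with the ideal single-diode photovoltaic model, here with series resistance neglected. All cell parameters below are hypothetical placeholders, not values for any real array or commercial simulator.

```python
import math

# Ideal single-diode PV segment model (series resistance neglected):
# I(V) = I_ph - I_0 * (exp(V / (N_cells * n * V_t)) - 1)
# where V_t = kT/q is the thermal voltage.

K_B = 1.380649e-23      # Boltzmann constant, J/K
Q_E = 1.602176634e-19   # elementary charge, C

def iv_curve(v_points, i_ph=3.0, i_0=1e-9, ideality=1.3,
             cells=36, t_kelvin=298.0):
    """Return segment current (A) at each terminal voltage (V)."""
    v_t = K_B * t_kelvin / Q_E                  # thermal voltage
    return [i_ph - i_0 * (math.exp(v / (cells * ideality * v_t)) - 1.0)
            for v in v_points]

volts = [0.0, 5.0, 10.0, 15.0, 20.0]
amps = iv_curve(volts)
# amps[0] is the short-circuit current; current falls monotonically
# as the voltage approaches open circuit
```

An SAS plays back curves of this shape (with temperature and irradiance folded into the photocurrent and diode terms), so how faithfully it renders the knee of the curve largely determines how well it exercises the PCDU's peak-power tracking.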

  18. Using machine learning to emulate human hearing for predictive maintenance of equipment

    NASA Astrophysics Data System (ADS)

    Verma, Dinesh; Bent, Graham

    2017-05-01

    At the current time, interfaces between humans and machines use only a limited subset of the senses that humans are capable of. The interaction among humans and computers can become much more intuitive and effective if we are able to use more senses, and create other modes of communicating between them. New machine learning technologies can make this type of interaction become a reality. In this paper, we present a framework for holistic communication between humans and machines that uses all of the senses, and discuss how a subset of this capability can allow machines to talk to humans to indicate their health for tasks such as predictive maintenance.

  19. SEXTANT - Station Explorer for X-Ray Timing and Navigation Technology

    NASA Technical Reports Server (NTRS)

    Mitchell, Jason; Hasouneh, Monther; Winternitz, Luke; Valdez, Jennifer; Price, Sam; Semper, Sean; Yu, Wayne; Gaebler, John; Ray, Paul; Wood, Kent

    2015-01-01

    The Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) is a NASA-funded technology demonstration. SEXTANT will, for the first time, demonstrate real-time, on-board X-ray Pulsar-based Navigation (XNAV), a significant milestone in the quest to establish a GPS-like navigation capability available throughout our Solar System and beyond. This paper describes the basic design of the SEXTANT system with a focus on core models and algorithms, and the design and continued development of the GSFC X-ray Navigation Laboratory Testbed (GXLT) with its dynamic pulsar emulation capability. We also present early results from GXLT modeling of the combined NICER X-ray timing instrument hardware and SEXTANT flight software algorithms.

  20. Numerical model estimating the capabilities and limitations of the fast Fourier transform technique in absolute interferometry

    NASA Astrophysics Data System (ADS)

    Talamonti, James J.; Kay, Richard B.; Krebs, Danny J.

    1996-05-01

    A numerical model was developed to emulate the capabilities of systems performing noncontact absolute distance measurements. The model incorporates known methods to minimize signal processing and digital sampling errors and evaluates the accuracy limitations imposed by spectral peak isolation using Hanning, Blackman, and Gaussian windows in the fast Fourier transform technique. We applied this model to the specific case of measuring the relative lengths of a compound Michelson interferometer. By processing computer-simulated data through our model, we project the ultimate precision for ideal data and for data containing AM-FM noise. The precision is shown to be limited by nonlinearities in the laser scan.
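The windowed spectral peak isolation the abstract evaluates can be sketched as follows: apply a Hann window to suppress leakage, transform, and locate the peak bin. A naive textbook O(N^2) DFT stands in for the FFT here, and the test signal and sizes are illustrative only.

```python
import cmath
import math

# Hann-windowed spectral peak isolation: window, transform, find peak.

def hann(n_pts):
    """Hann (a.k.a. Hanning) window of n_pts samples."""
    return [0.5 - 0.5 * math.cos(2 * math.pi * k / (n_pts - 1))
            for k in range(n_pts)]

def dft_magnitudes(x):
    """Naive DFT magnitude spectrum over the positive-frequency bins."""
    n_pts = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * f * t / n_pts)
                    for t in range(n_pts)))
            for f in range(n_pts // 2)]

N = 128
f0 = 17                                        # cycles per record
signal = [math.sin(2 * math.pi * f0 * t / N) for t in range(N)]
windowed = [s * w for s, w in zip(signal, hann(N))]
mags = dft_magnitudes(windowed)
peak_bin = max(range(len(mags)), key=mags.__getitem__)
# for this on-bin tone the peak lands in bin 17
```

With an off-bin tone the windowed mainlobe spans several bins, and the choice among Hann, Blackman, and Gaussian windows trades mainlobe width against sidelobe leakage, which is exactly the accuracy limitation the paper's model quantifies.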

  1. X-ray Pulsar Navigation Algorithms and Testbed for SEXTANT

    NASA Technical Reports Server (NTRS)

    Winternitz, Luke M. B.; Hasouneh, Monther A.; Mitchell, Jason W.; Valdez, Jennifer E.; Price, Samuel R.; Semper, Sean R.; Yu, Wayne H.; Ray, Paul S.; Wood, Kent S.; Arzoumanian, Zaven

    2015-01-01

    The Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) is a NASA-funded technology demonstration. SEXTANT will, for the first time, demonstrate real-time, on-board X-ray Pulsar-based Navigation (XNAV), a significant milestone in the quest to establish a GPS-like navigation capability available throughout our Solar System and beyond. This paper describes the basic design of the SEXTANT system with a focus on core models and algorithms, and the design and continued development of the GSFC X-ray Navigation Laboratory Testbed (GXLT) with its dynamic pulsar emulation capability. We also present early results from GXLT modeling of the combined NICER X-ray timing instrument hardware and SEXTANT flight software algorithms.

  2. Is Brain Emulation Dangerous?

    NASA Astrophysics Data System (ADS)

    Eckersley, Peter; Sandberg, Anders

    2013-12-01

    Brain emulation is a hypothetical but extremely transformative technology which has a non-zero chance of appearing during the next century. This paper investigates whether such a technology would also have any predictable characteristics that give it a chance of being catastrophically dangerous, and whether there are any policy levers which might be used to make it safer. We conclude that the riskiness of brain emulation probably depends on the order of the preceding research trajectory. Broadly speaking, it appears safer for brain emulation to happen sooner, because slower CPUs would make the technology's impact more gradual. It may also be safer if brains are scanned before they are fully understood from a neuroscience perspective, thereby increasing the initial population of emulations, although this prediction is weaker and more scenario-dependent. The risks posed by brain emulation also seem strongly connected to questions about the balance of power between attackers and defenders in computer security contests. If economic property rights in CPU cycles are essentially enforceable, emulation appears to be comparatively safe; if CPU cycles are ultimately easy to steal, the appearance of brain emulation is more likely to be a destabilizing development for human geopolitics. Furthermore, if the computers used to run emulations can be kept secure, then it appears that making brain emulation technologies "open" would make them safer. If, however, computer insecurity is deep and unavoidable, openness may actually be more dangerous. We point to some arguments that suggest the former may be true, tentatively implying that it would be good policy to work towards brain emulation using open scientific methodology and free/open source software codebases.

  3. Architecture overview and data summary of a 5.4 km free-space laser communication experiment

    NASA Astrophysics Data System (ADS)

    Moores, John D.; Walther, Frederick G.; Greco, Joseph A.; Michael, Steven; Wilcox, William E., Jr.; Volpicelli, Alicia M.; Magliocco, Richard J.; Henion, Scott R.

    2009-08-01

    MIT Lincoln Laboratory designed and built two free-space laser communications terminals, and successfully demonstrated error-free communication between two ground sites separated by 5.4 km in September 2008. The primary goal of this work was to emulate a low elevation angle air-to-ground link capable of supporting standard OTU1 (2.667 Gb/s) data formatting with standard client interfaces. Mitigation of turbulence-induced scintillation effects was accomplished through the use of multiple small-aperture receivers and novel encoding and interleaver hardware. Data from both the field and laboratory experiments were used to assess link performance as a function of system parameters such as transmitted power, degree of spatial diversity, and interleaver span, with and without forward error correction. This work was sponsored by the Department of Defense, RRCO DDR&E, under Air Force Contract FA8721-05-C-0002. Opinions, interpretations, conclusions and recommendations are those of the authors and are not necessarily endorsed by the United States Government.
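The role of the interleaver in this kind of link can be illustrated with a minimal block interleaver: a scintillation fade corrupts a burst of consecutive channel symbols, and interleaving spreads that burst across many codewords so forward error correction can handle it. The row/column sizes below are arbitrary illustrations, not the experiment's interleaver span.

```python
# Minimal block interleaver: write row-wise into a rows x cols array,
# transmit column-wise. A burst of consecutive channel errors then
# touches at most one symbol per row of the original stream.

def interleave(symbols, rows, cols):
    assert len(symbols) == rows * cols
    return [symbols[r * cols + c] for c in range(cols) for r in range(rows)]

def deinterleave(symbols, rows, cols):
    """Inverse mapping: element (r, c) sits at index c*rows + r."""
    return [symbols[c * rows + r] for r in range(rows) for c in range(cols)]

data = list(range(12))
tx = interleave(data, rows=3, cols=4)
# a burst hitting 3 consecutive transmitted symbols now touches
# 3 different rows of the original stream
assert deinterleave(tx, rows=3, cols=4) == data
```

Lengthening the interleaver span (more columns) tolerates longer fades at the cost of added latency, the trade-off the experiment's span parameter explores.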

  4. Real-time autocorrelator for fluorescence correlation spectroscopy based on graphical-processor-unit architecture: method, implementation, and comparative studies

    NASA Astrophysics Data System (ADS)

    Laracuente, Nicholas; Grossman, Carl

    2013-03-01

    We developed an algorithm and software to calculate autocorrelation functions from real-time photon-counting data using the fast, parallel capabilities of graphical processor units (GPUs). Recent developments in hardware and software have allowed for general purpose computing with inexpensive GPU hardware. These devices are more suited for emulating hardware autocorrelators than traditional CPU-based software applications by emphasizing parallel throughput over sequential speed. Incoming data are binned in a standard multi-tau scheme with configurable points-per-bin size and are mapped into a GPU memory pattern to reduce time-expensive memory access. Applications include dynamic light scattering (DLS) and fluorescence correlation spectroscopy (FCS) experiments. We ran the software on a 64-core PCI graphics card in a 3.2 GHz Intel i5 CPU-based computer running Linux. FCS measurements were made on Alexa-546 and Texas Red dyes in a standard buffer (PBS). Software correlations were compared to hardware correlator measurements on the same signals. Supported by HHMI and Swarthmore College.
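The quantity such a correlator computes is the normalized intensity autocorrelation g2(tau) = ⟨I(t) I(t+tau)⟩ / ⟨I⟩². The sketch below is a CPU reference using a plain linear-lag estimator, not the multi-tau binning scheme the abstract describes, and the photon-count trace is synthetic.

```python
import math

# Linear-lag estimator of the normalized intensity autocorrelation
# g2(tau) = <I(t) I(t+tau)> / <I>^2, evaluated for tau = 1..max_lag.

def autocorrelate(counts, max_lag):
    n = len(counts)
    mean = sum(counts) / n
    g2 = []
    for lag in range(1, max_lag + 1):
        num = sum(counts[t] * counts[t + lag] for t in range(n - lag))
        g2.append((num / (n - lag)) / (mean * mean))
    return g2

# synthetic correlated intensity trace: a slowly varying rate
counts = [5.0 + 4.0 * math.sin(t / 20.0) for t in range(2000)]
g2 = autocorrelate(counts, max_lag=50)
# g2 starts above 1 and decays as the lag grows for this signal
```

The multi-tau scheme replaces this uniform lag grid with progressively coarser bins at longer lags, which is what makes hardware (and GPU) correlators tractable over many decades of delay time.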

  5. Pynamic: the Python Dynamic Benchmark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, G L; Ahn, D H; de Supinksi, B R

    2007-07-10

    Python is widely used in scientific computing to facilitate application development and to support features such as computational steering. Making full use of some of Python's popular features, which improve programmer productivity, leads to applications that access extremely high numbers of dynamically linked libraries (DLLs). As a result, some important Python-based applications severely stress a system's dynamic linking and loading capabilities and also cause significant difficulties for most development environment tools, such as debuggers. Furthermore, using the Python paradigm for large scale MPI-based applications can create significant file I/O and further stress tools and operating systems. In this paper, we present Pynamic, the first benchmark program to support configurable emulation of a wide range of the DLL usage of Python-based applications for large scale systems. Pynamic has already accurately reproduced system software and tool issues encountered by important large Python-based scientific applications on our supercomputers. Pynamic provided insight for our system software and tool vendors, and our application developers, into the impact of several design decisions. As we describe the Pynamic benchmark, we will highlight some of the issues discovered in our large scale system software and tools using Pynamic.
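The Pynamic idea can be sketched in a much-simplified form: generate many small modules at runtime and import them all, stressing the loader the way a DLL-heavy Python application does. Pynamic itself builds real C shared libraries at configurable sizes and counts; the pure-Python modules and toy count below are only a stand-in.

```python
import importlib
import os
import sys
import tempfile

# Generate n_modules tiny modules on disk, then import every one,
# exercising the interpreter's module search and load path.

workdir = tempfile.mkdtemp()
sys.path.insert(0, workdir)
n_modules = 50
for i in range(n_modules):
    with open(os.path.join(workdir, "pyn_mod_%d.py" % i), "w") as f:
        f.write("def value():\n    return %d\n" % i)
importlib.invalidate_caches()      # the files were created at runtime

total = 0
for i in range(n_modules):
    mod = importlib.import_module("pyn_mod_%d" % i)
    total += mod.value()
# total accumulates 0 + 1 + ... + 49
```

Scaling the module count (and, in Pynamic proper, the size of each generated shared library) turns this from a toy into a loader stress test that mirrors a target application's DLL profile.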

  6. Integrated RF/Optical Interplanetary Networking Preliminary Explorations and Empirical Results

    NASA Technical Reports Server (NTRS)

    Raible, Daniel E.; Hylton, Alan G.

    2012-01-01

    Over the last decade interplanetary telecommunication capabilities have been significantly expanded--specifically in support of the Mars exploration rover and lander missions. NASA is continuing to drive advances in new, high payoff optical communications technologies to enhance the network to Gbps performance from Mars, and the transition from technology demonstration to operational system is examined through a hybrid RF/optical approach. Such a system combines the best features of RF and optical communications considering availability and performance to realize a dual band trunk line operating within characteristic constraints. Disconnection due to planetary obscuration and solar conjunction, link delays, timing, ground terminal mission congestion and scheduling policy along with space and atmospheric weather disruptions all imply the need for network protocol solutions to ultimately manage the physical layer in a transparent manner to the end user. Delay Tolerant Networking (DTN) is an approach under evaluation which addresses these challenges. A multi-hop multi-path hybrid RF and optical test bed has been constructed to emulate the integrated deep space network and to support protocol and hardware refinement. Initial experimental results characterize several of these challenges and evaluate the effectiveness of DTN as a solution to mitigate them.

  7. Guidance, Navigation and Control Digital Emulation Technology Laboratory. Volume 1. Part 2. Task 1: Digital Emulation Technology Laboratory

    DTIC Science & Technology

    1991-09-27

    AD-A241 692. Annual Report, Volume 1, Part 2, Task 1: Digital Emulation Technology Laboratory. Report No. AR-0142-91-001, September 27, 1991. Contract No. DASG60-89-C-0142, sponsored by the United States Army Strategic Defense Command. Authors: Thomas R. Collins and Stephen R. Wachtel.

  8. A Physical Heart Failure Simulation System Utilizing the Total Artificial Heart and Modified Donovan Mock Circulation.

    PubMed

    Crosby, Jessica R; DeCook, Katrina J; Tran, Phat L; Betterton, Edward; Smith, Richard G; Larson, Douglas F; Khalpey, Zain I; Burkhoff, Daniel; Slepian, Marvin J

    2017-07-01

    With the growth and diversity of mechanical circulatory support (MCS) systems entering clinical use, a need exists for a robust mock circulation system capable of reliably emulating and reproducing physiologic as well as pathophysiologic states for use in MCS training and inter-device comparison. We report on the development of such a platform utilizing the SynCardia Total Artificial Heart and a modified Donovan Mock Circulation System, capable of being driven at normal and reduced output. With this platform, clinically relevant heart failure hemodynamics could be reliably reproduced as evidenced by elevated left atrial pressure (+112%), reduced aortic flow (-12.6%), blunted Starling-like behavior, and increased afterload sensitivity when compared with normal function. Similarly, pressure-volume relationships demonstrated enhanced sensitivity to afterload and decreased Starling-like behavior in the heart failure model. Lastly, the platform was configured to allow the easy addition of a left ventricular assist device (HeartMate II at 9600 RPM), which upon insertion resulted in improvement of hemodynamics. The present configuration has the potential to serve as a viable system for training and research, aimed at fostering safe and effective MCS device use. © 2016 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.

  9. Emulator-assisted data assimilation in complex models

    NASA Astrophysics Data System (ADS)

    Margvelashvili, Nugzar Yu; Herzfeld, Mike; Rizwi, Farhan; Mongin, Mathieu; Baird, Mark E.; Jones, Emlyn; Schaffelke, Britta; King, Edward; Schroeder, Thomas

    2016-09-01

    Emulators are surrogates of complex models that run orders of magnitude faster than the original model. The utility of emulators for data assimilation into ocean models is still not well understood. The high complexity of ocean models translates into high uncertainty in the corresponding emulators, which may undermine the quality of assimilation schemes built on them. Numerical experiments with a chaotic Lorenz-95 model are conducted to illustrate this point and to suggest a strategy that alleviates the problem through localization of the emulation and data assimilation procedures. Insights gained through these experiments are used to design and implement a data assimilation scenario for a 3D fine-resolution sediment transport model of the Great Barrier Reef (GBR), Australia.
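
    The Lorenz-95 (also called Lorenz-96) testbed mentioned above is small enough to sketch directly. Below is a minimal pure-Python integrator, assuming the conventional 40-variable configuration with forcing F = 8; the record does not state the paper's exact settings, so these are illustrative defaults.

```python
# Minimal Lorenz-95/96 integrator, the toy chaotic model used to probe
# emulator-based assimilation.  The 40-variable state and forcing F = 8
# are the conventional choices, assumed here rather than taken from the
# paper.

def lorenz96_rhs(x, forcing=8.0):
    """dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F (cyclic indices)."""
    n = len(x)
    return [(x[(i + 1) % n] - x[(i - 2) % n]) * x[(i - 1) % n] - x[i] + forcing
            for i in range(n)]

def rk4_step(x, dt=0.05):
    """One fourth-order Runge-Kutta step of the Lorenz-96 system."""
    k1 = lorenz96_rhs(x)
    k2 = lorenz96_rhs([xi + 0.5 * dt * ki for xi, ki in zip(x, k1)])
    k3 = lorenz96_rhs([xi + 0.5 * dt * ki for xi, ki in zip(x, k2)])
    k4 = lorenz96_rhs([xi + dt * ki for xi, ki in zip(x, k3)])
    return [xi + dt / 6.0 * (a + 2 * b + 2 * c + d)
            for xi, a, b, c, d in zip(x, k1, k2, k3, k4)]

# Spin up a 40-variable state from a slightly perturbed equilibrium.
state = [8.0] * 40
state[0] = 8.01
for _ in range(500):
    state = rk4_step(state)
```

    An emulator study of the kind described would fit a fast surrogate to input-output pairs generated by a model like this and compare assimilation quality against the true dynamics.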

  10. Validation techniques for fault emulation of SRAM-based FPGAs

    DOE PAGES

    Quinn, Heather; Wirthlin, Michael

    2015-08-07

    A variety of fault emulation systems have been created to study the effect of single-event effects (SEEs) in static random access memory (SRAM) based field-programmable gate arrays (FPGAs). These systems are useful for augmenting radiation-hardness assurance (RHA) methodologies for verifying the effectiveness of mitigation techniques; understanding error signatures and failure modes in FPGAs; and failure rate estimation. For radiation effects researchers, it is important that these systems properly emulate how SEEs manifest in FPGAs. If a fault emulation system does not mimic the radiation environment, it will generate erroneous data and incorrect predictions of the behavior of the FPGA in a radiation environment. Validation determines whether the emulated faults are reasonable analogs to the radiation-induced faults. In this study we present methods for validating fault emulation systems and provide several examples of validated FPGA fault emulation systems.
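
    At its simplest, the validation step asks how well the emulated fault population matches the radiation-induced one. A toy sketch of one such comparison, scoring the overlap of observed error-signature sets; the signature names are invented placeholders, not data from the study.

```python
# Toy validation check in the spirit of the record: compare the error
# signatures produced by a fault-emulation campaign against those seen
# under radiation test, and score their agreement.

def signature_agreement(emulated, radiation):
    """Jaccard overlap of two sets of observed error signatures."""
    emulated, radiation = set(emulated), set(radiation)
    union = emulated | radiation
    return len(emulated & radiation) / len(union) if union else 1.0

# Hypothetical signature sets for illustration only.
emulated_sigs  = {"SEFI-reset", "CRAM-upset-route", "CRAM-upset-lut", "BRAM-data"}
radiation_sigs = {"SEFI-reset", "CRAM-upset-route", "CRAM-upset-lut", "MBU-pair"}

score = signature_agreement(emulated_sigs, radiation_sigs)  # 3 shared of 5 total
```

    A low score flags signatures the emulator never produces (or produces spuriously), which is exactly the disagreement that would drive incorrect failure-rate predictions.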

  11. The EmulSiv filter removes microbial contamination from propofol but is not a substitute for aseptic technique.

    PubMed

    Hall, Wendy C E; Jolly, Donald T; Hrazdil, Jiri; Galbraith, John C; Greacen, Maria; Clanachan, Alexander S

    2003-01-01

    To evaluate the ability of the EmulSiv filter (EF) to remove extrinsic microbial contaminants from propofol. Aliquots of Staphylococcus aureus (S. aureus), Candida albicans (C. albicans), Klebsiella pneumoniae (K. pneumoniae), Moraxella osloensis (M. osloensis), Enterobacter agglomerans (E. agglomerans), Escherichia coli (E. coli), Serratia marcescens (S. marcescens), Moraxella catarrhalis (M. catarrhalis), Haemophilus influenzae (H. influenzae) and Campylobacter jejuni (C. jejuni) were inoculated into vials containing 20 mL of sterile propofol. The unfiltered inoculated propofol solutions served as controls. Samples of 10 mL and 20 mL of the inoculated propofol were filtered through the EF. All solutions were then subplated onto three culture plates using a precision 1 micro L calibrated platinum loop and incubated. The number of colony forming units (CFU) was counted. Data were analyzed using a one-sample t test, and a P value of less than 0.05 was selected as the level of statistical significance. The EF completely removed CFU of S. aureus, C. albicans, K. pneumoniae, M. osloensis, E. agglomerans, E. coli, S. marcescens, and M. catarrhalis (P < 0.05). A small number of H. influenzae CFU evaded filtration in both the 10 mL and 20 mL samples; C. jejuni CFU evaded filtration only in the 10 mL sample. The EF removes the majority of microbial contaminants from propofol, with the exception of H. influenzae and C. jejuni. Although the EF removes most of the microbial contamination produced by H. influenzae and C. jejuni, a few CFU are capable of evading filtration. Consequently, even the use of a filter capable of removing microbial contaminants is not a substitute for meticulous aseptic technique and prompt administration when propofol is used.

  12. Remediation of blowouts by clonal plants in Maqu degraded alpine grasslands of northwest China.

    PubMed

    Kang, JianJun; Zhao, WenZhi; Zhao, Ming

    2017-03-01

    Sand fixation by plants is considered the most effective and fundamental measure for desertification control in many arid and semi-arid regions. Carex brunnescens and Leymus secalinus, two perennial clonal herbs native to the degraded alpine areas of Maqu in northwest China, are dominant, constructive species on active sand dunes with excellent sand-fixing adaptability. To study the ability and mechanism of blowout remediation by these two clonal plants, artificially emulated blowouts were set up within field populations of each species. Both C. brunnescens and L. secalinus produced more new ramets in the artificially emulated blowouts than under natural conditions, suggesting that both clonal plants have a strong capacity for blowout remediation; however, the biomass, leaf number, and height of new ramets in the emulated blowouts were lower than under natural conditions, owing to the poor nutrient status of the emulated blowouts. C. brunnescens showed a stronger remediation ability than L. secalinus, generating more new ramets during the remediation process. The new ramets of L. secalinus arose mainly from buds on rhizomes spreading in from outside the blowouts, whereas those of C. brunnescens arose both from such incoming rhizome buds and from buds on rhizomes inside the blowouts that were released from dormancy in the deeper soil under wind erosion. These findings suggest that, through rapid clonal expansion, C. brunnescens and L. secalinus exhibit a strong blowout-remediation ability that can serve as one of the most effective strategies for restoring and reconstructing degraded vegetation in the Maqu alpine areas of northwest China.

  13. Dynamically allocated virtual clustering management system

    NASA Astrophysics Data System (ADS)

    Marcus, Kelvin; Cannata, Jess

    2013-05-01

    The U.S. Army Research Laboratory (ARL) has built a "Wireless Emulation Lab" to support research in wireless mobile networks. In our current experimentation environment, our researchers need the capability to run clusters of heterogeneous nodes to model emulated wireless tactical networks, where each node could contain a different operating system, application set, and physical hardware. To complicate matters, most experiments require the researcher to have root privileges. Our previous solution of using a single shared cluster of statically deployed virtual machines did not sufficiently separate each user's experiment due to undesirable network crosstalk, so only one experiment could be run at a time. In addition, the cluster did not make efficient use of our servers and physical networks. To address these concerns, we created the Dynamically Allocated Virtual Clustering management system (DAVC). This system leverages existing open-source software to create private clusters of nodes that are either virtual or physical machines. These clusters can be utilized for software development, experimentation, and integration with existing hardware and software. The system uses the Grid Engine job scheduler to efficiently allocate virtual machines to idle systems and networks, and it deploys stateless nodes via network booting. The system uses 802.1Q Virtual LANs (VLANs) to prevent experimentation crosstalk and to allow for complex, private networks, eliminating the need to map each virtual machine to a specific switch port. The system monitors the health of the clusters and the underlying physical servers, and it maintains cluster usage statistics for historical trends. Users can start private clusters of heterogeneous nodes with root privileges for the duration of the experiment, and they control when to shut down their clusters.
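
    The isolation bookkeeping such a system needs can be sketched simply: each private cluster gets its own 802.1Q VLAN ID from a pool so experiments cannot see each other's traffic. The ID range and cluster names below are illustrative, not ARL's actual configuration.

```python
# Sketch of per-cluster VLAN allocation for experiment isolation.
# Each cluster receives one VLAN ID; releasing a cluster returns the
# ID to the pool for reuse.

class VlanAllocator:
    def __init__(self, first=100, last=199):
        self._free = list(range(first, last + 1))
        self._by_cluster = {}

    def allocate(self, cluster):
        """Reserve one VLAN for a cluster; raise if the pool is empty."""
        if cluster in self._by_cluster:
            return self._by_cluster[cluster]
        if not self._free:
            raise RuntimeError("VLAN pool exhausted")
        vlan = self._free.pop(0)
        self._by_cluster[cluster] = vlan
        return vlan

    def release(self, cluster):
        """Return a cluster's VLAN to the pool at experiment teardown."""
        self._free.append(self._by_cluster.pop(cluster))

pool = VlanAllocator()
a = pool.allocate("expA")   # distinct VLANs -> no crosstalk between
b = pool.allocate("expB")   # concurrently running experiments
pool.release("expA")
```

    In a real deployment the allocated ID would be pushed to the switches and hypervisors; here only the accounting is shown.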

  14. Integration of OpenMC methods into MAMMOTH and Serpent

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kerby, Leslie; DeHart, Mark; Tumulak, Aaron

    OpenMC, a Monte Carlo particle transport simulation code focused on neutron criticality calculations, contains several methods we wish to emulate in MAMMOTH and Serpent. First, research coupling OpenMC and the Multiphysics Object-Oriented Simulation Environment (MOOSE) has shown promising results. Second, the utilization of Functional Expansion Tallies (FETs) allows for a more efficient passing of multiphysics data between OpenMC and MOOSE. Both of these capabilities have been preliminarily implemented into Serpent. Results are discussed and future work recommended.
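
    The Functional Expansion Tally idea mentioned above can be sketched in a few lines: instead of binning particle scores in space, one accumulates Legendre-moment coefficients so a smooth field can be passed between codes. This is a generic pure-Python illustration with synthetic samples, not OpenMC or Serpent output.

```python
# Sketch of a Functional Expansion Tally (FET) on [-1, 1]: accumulate
# Legendre moments of sampled positions instead of spatial bins.
import random

def legendre(n, x):
    """P_n(x) via the Bonnet recurrence (k+1)P_{k+1} = (2k+1)xP_k - kP_{k-1}."""
    p0, p1 = 1.0, x
    if n == 0:
        return p0
    for k in range(1, n):
        p0, p1 = p1, ((2 * k + 1) * x * p1 - k * p0) / (k + 1)
    return p1

def fet_coefficients(samples, order):
    """c_n = (2n+1)/2 * E[P_n(x)] for samples x in [-1, 1]."""
    return [(2 * n + 1) / 2.0 * sum(legendre(n, x) for x in samples) / len(samples)
            for n in range(order + 1)]

random.seed(0)
uniform_samples = [random.uniform(-1, 1) for _ in range(20000)]
coeffs = fet_coefficients(uniform_samples, order=3)
# A uniform source gives c_0 = 0.5 and higher moments near zero.
```

    The resulting coefficient vector is a far more compact multiphysics interface than a fine spatial mesh tally, which is the efficiency argument made in the abstract.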

  15. Emulation-Based Virtual Laboratories: A Low-Cost Alternative to Physical Experiments in Control Engineering Education

    ERIC Educational Resources Information Center

    Goodwin, G. C.; Medioli, A. M.; Sher, W.; Vlacic, L. B.; Welsh, J. S.

    2011-01-01

    This paper argues the case for emulation-based virtual laboratories in control engineering education. It demonstrates that such emulation experiments can give students an industrially relevant educational experience at relatively low cost. The paper also describes a particular emulation-based system that has been developed with the aim of giving…

  16. A Prolog Emulator

    NASA Technical Reports Server (NTRS)

    Tick, Evan

    1987-01-01

    This note describes an efficient software emulator for the Warren Abstract Machine (WAM) Prolog architecture. The version of the WAM implemented is called Lcode. The Lcode emulator, written in C, executes the 'naive reverse' benchmark at 3900 LIPS. The emulator is one of a set of tools used to measure the memory-referencing characteristics and performance of Prolog programs. These tools include a compiler, assembler, and memory simulators. An overview of the Lcode architecture is given here, followed by a description and listing of the emulator code implementing each Lcode instruction. This note will be of special interest to those studying the WAM and its performance characteristics. In general, this note will be of interest to those creating efficient software emulators for abstract machine architectures.
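
    An abstract-machine emulator like the one described is, at heart, a fetch-dispatch loop over instructions. The toy stack machine below (not Lcode itself, whose instruction set is not reproduced here) shows that structure, including the instruction counting that underlies throughput metrics such as LIPS.

```python
# Minimal fetch-dispatch loop for a toy stack machine.  Each opcode maps
# to a small handler, mirroring the switch-based dispatch typical of
# emulators written in C.

def run(program):
    """Execute (opcode, operand) pairs; return (top of stack, count)."""
    stack, pc, executed = [], 0, 0
    while pc < len(program):
        op, arg = program[pc]          # fetch
        pc += 1
        executed += 1                  # instruction counter, a la LIPS metering
        if op == "push":
            stack.append(arg)
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "jmpz":             # jump to arg if top of stack is zero
            if stack.pop() == 0:
                pc = arg
        elif op == "halt":
            break
        else:
            raise ValueError(f"unknown opcode {op!r}")
    return stack[-1], executed

result, count = run([("push", 2), ("push", 3), ("add", None), ("halt", None)])
```

    Timing `executed` against wall-clock time would give an instructions-per-second figure analogous to the LIPS measurement quoted for the Lcode emulator.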

  17. Bridging groundwater models and decision support with a Bayesian network

    USGS Publications Warehouse

    Fienen, Michael N.; Masterson, John P.; Plant, Nathaniel G.; Gutierrez, Benjamin T.; Thieler, E. Robert

    2013-01-01

    Resource managers need to make decisions to plan for future environmental conditions, particularly sea level rise, in the face of substantial uncertainty. Many interacting processes factor into the decisions they face. Advances in process models and the quantification of uncertainty have made models a valuable tool for this purpose. Long simulation runtimes and, often, numerical instability make linking process models impractical in many cases. A method for emulating the important connections between model input and forecasts, while propagating uncertainty, has the potential to provide a bridge between complicated numerical process models and the efficiency and stability needed for decision making. We explore this using a Bayesian network (BN) to emulate a groundwater flow model. We expand on previous approaches to validating a BN by calculating forecasting skill using cross validation of a groundwater model of Assateague Island in Virginia and Maryland, USA. This BN emulation was shown to capture the important groundwater-flow characteristics and uncertainty of the groundwater system because of its connection to island morphology and sea level. Forecast power metrics associated with the validation of multiple alternative BN designs guided the selection of an optimal level of BN complexity. Assateague Island is an ideal test case for exploring a forecasting tool based on current conditions because the unique hydrogeomorphological variability of the island includes a range of settings indicative of past, current, and future conditions. The resulting BN is a valuable tool for exploring the response of groundwater conditions to sea level rise in decision support.
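
    The cross-validated forecasting skill used to compare emulator designs can be sketched generically: hold one observation out, predict it, and score against a "climatology" (mean) baseline. The linear toy predictor and data below are placeholders, not the paper's Bayesian network or Assateague data.

```python
# Leave-one-out forecast skill: 1 - MSE(model) / MSE(mean baseline).
# Skill near 1 means the model adds much information over climatology;
# skill <= 0 means it adds none.

def loo_skill(xs, ys, predict_from):
    err_model, err_base = 0.0, 0.0
    for i in range(len(xs)):
        train_x = xs[:i] + xs[i + 1:]
        train_y = ys[:i] + ys[i + 1:]
        err_model += (predict_from(train_x, train_y, xs[i]) - ys[i]) ** 2
        err_base += (sum(train_y) / len(train_y) - ys[i]) ** 2
    return 1.0 - err_model / err_base

def linear_fit_predict(train_x, train_y, x0):
    """Ordinary least-squares line fit, then predict at x0."""
    n = len(train_x)
    mx, my = sum(train_x) / n, sum(train_y) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(train_x, train_y))
    sxx = sum((x - mx) ** 2 for x in train_x)
    return my + (sxy / sxx) * (x0 - mx)

# Synthetic example: water-table height rising roughly linearly with sea level.
sea_level = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]
water_table = [1.0, 1.11, 1.19, 1.32, 1.38, 1.52]
skill = loo_skill(sea_level, water_table, linear_fit_predict)
```

    Comparing such skill scores across alternative emulator designs is what guided the selection of BN complexity in the study.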

  18. Lessons in Self-Made Success: Programs Teach Business, Entrepreneurship

    ERIC Educational Resources Information Center

    Rosenfeld, Stuart; Pages, Erik

    2008-01-01

    Everyone admires entrepreneurs, and every region aspires to become entrepreneurial. Whether community colleges should teach entrepreneurship today--or support entrepreneurs--is a non-issue. Colleges want students, graduates, faculty, and administrators to be entrepreneurial. Other countries marvel at, and work to emulate, America's entrepreneurial…

  19. Evaluation of Electromechanical Systems Dynamically Emulating a Candidate Hydrokinetic Turbine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cavagnaro, Robert J.; Neely, Jason C.; Fay, François-Xavier

    The use of controllable motor-generator sets to emulate the dynamics of a hydrokinetic turbine is evaluated as an alternative to field testing a prototype. The emulator control dynamic equations are presented, methods for scaling turbine parameters are examined, and experimental results are presented from three electromechanical emulation machines (EEMs) programmed to emulate the same vertical-axis fixed-pitch turbine. Although hardware platforms and control implementations varied, results show that each EEM is successful in emulating the turbine model, thus demonstrating the general feasibility of the approach. However, performance of motor control under torque command, current command, or speed command differed. In one of the EEMs evaluated, the power take-off controller tracks the maximum power point of the turbine in response to turbulence. Utilizing realistic inflow conditions and control laws, the emulator dynamic speed response is shown to agree well at low frequencies with numerical simulation but to deviate at high frequencies.

  20. Evaluation of Electromechanical Systems Dynamically Emulating a Candidate Hydrokinetic Turbine

    DOE PAGES

    Cavagnaro, Robert J.; Neely, Jason C.; Fay, François-Xavier; ...

    2016-11-06

    The use of controllable motor-generator sets to emulate the dynamics of a hydrokinetic turbine is evaluated as an alternative to field testing a prototype. The emulator control dynamic equations are presented, methods for scaling turbine parameters are examined, and experimental results are presented from three electromechanical emulation machines (EEMs) programmed to emulate the same vertical-axis fixed-pitch turbine. Although hardware platforms and control implementations varied, results show that each EEM is successful in emulating the turbine model, thus demonstrating the general feasibility of the approach. However, performance of motor control under torque command, current command, or speed command differed. In one of the EEMs evaluated, the power take-off controller tracks the maximum power point of the turbine in response to turbulence. Utilizing realistic inflow conditions and control laws, the emulator dynamic speed response is shown to agree well at low frequencies with numerical simulation but to deviate at high frequencies.

  1. Hydropower Optimization Using Artificial Neural Network Surrogate Models of a High-Fidelity Hydrodynamics and Water Quality Model

    NASA Astrophysics Data System (ADS)

    Shaw, Amelia R.; Smith Sawyer, Heather; LeBoeuf, Eugene J.; McDonald, Mark P.; Hadjerioua, Boualem

    2017-11-01

    Hydropower operations optimization subject to environmental constraints is limited by challenges associated with dimensionality and spatial and temporal resolution. The need for high-fidelity hydrodynamic and water quality models within optimization schemes is driven by improved computational capabilities, increased requirements to meet specific points of compliance with greater resolution, and the need to optimize operations of not just single reservoirs but systems of reservoirs. This study describes an important advancement for computing hourly power generation schemes for a hydropower reservoir using high-fidelity models, surrogate modeling techniques, and optimization methods. The predictive power of the high-fidelity hydrodynamic and water quality model CE-QUAL-W2 is successfully emulated by an artificial neural network, then integrated into a genetic algorithm optimization approach to maximize hydropower generation subject to constraints on dam operations and water quality. This methodology is applied to a multipurpose reservoir near Nashville, Tennessee, USA. The model successfully reproduced high-fidelity reservoir information while enabling 6.8% and 6.6% increases in hydropower production value relative to actual operations for dissolved oxygen (DO) limits of 5 and 6 mg/L, respectively, while witnessing an expected decrease in power generation at more restrictive DO constraints. Exploration of simultaneous temperature and DO constraints revealed capability to address multiple water quality constraints at specified locations. The reduced computational requirements of the new modeling approach demonstrated an ability to provide decision support for reservoir operations scheduling while maintaining high-fidelity hydrodynamic and water quality information as part of the optimization decision support routines.
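
    The optimization pattern described above, a cheap surrogate standing in for the expensive CE-QUAL-W2 model inside a genetic algorithm, with a penalty enforcing a water-quality constraint, can be sketched as follows. The surrogate below is an invented analytic stand-in, not a trained neural network, and the DO relationship is hypothetical.

```python
# Surrogate-in-the-loop genetic algorithm: maximize a power surrogate
# subject to a dissolved-oxygen (DO) constraint applied as a penalty.
import random

def surrogate_power(q):      # revenue-like curve vs. normalized turbine flow q
    return 4 * q - 3 * q ** 2

def surrogate_do(q):         # DO falls as more water is turbined (made-up law)
    return 8.0 - 3.0 * q

def fitness(q, do_limit=5.0):
    penalty = 100.0 * max(0.0, do_limit - surrogate_do(q))
    return surrogate_power(q) - penalty

def genetic_maximize(fn, pop_size=30, generations=60, seed=1):
    rng = random.Random(seed)
    pop = [rng.random() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fn, reverse=True)
        parents = pop[: pop_size // 2]           # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = 0.5 * (a + b) + rng.gauss(0.0, 0.02)   # crossover + mutation
            children.append(min(1.0, max(0.0, child)))
        pop = parents + children
    return max(pop, key=fn)

best_q = genetic_maximize(fitness)
```

    Because each fitness evaluation calls only the surrogate, thousands of GA evaluations cost far less than a single run of the high-fidelity model, which is the computational advantage the study reports.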

  2. Hydropower Optimization Using Artificial Neural Network Surrogate Models of a High-Fidelity Hydrodynamics and Water Quality Model

    DOE PAGES

    Shaw, Amelia R.; Sawyer, Heather Smith; LeBoeuf, Eugene J.; ...

    2017-10-24

    Hydropower operations optimization subject to environmental constraints is limited by challenges associated with dimensionality and spatial and temporal resolution. The need for high-fidelity hydrodynamic and water quality models within optimization schemes is driven by improved computational capabilities, increased requirements to meet specific points of compliance with greater resolution, and the need to optimize operations of not just single reservoirs but systems of reservoirs. This study describes an important advancement for computing hourly power generation schemes for a hydropower reservoir using high-fidelity models, surrogate modeling techniques, and optimization methods. The predictive power of the high-fidelity hydrodynamic and water quality model CE-QUAL-W2 is successfully emulated by an artificial neural network, then integrated into a genetic algorithm optimization approach to maximize hydropower generation subject to constraints on dam operations and water quality. This methodology is applied to a multipurpose reservoir near Nashville, Tennessee, USA. The model successfully reproduced high-fidelity reservoir information while enabling 6.8% and 6.6% increases in hydropower production value relative to actual operations for dissolved oxygen (DO) limits of 5 and 6 mg/L, respectively, while witnessing an expected decrease in power generation at more restrictive DO constraints. Exploration of simultaneous temperature and DO constraints revealed capability to address multiple water quality constraints at specified locations. Here, the reduced computational requirements of the new modeling approach demonstrated an ability to provide decision support for reservoir operations scheduling while maintaining high-fidelity hydrodynamic and water quality information as part of the optimization decision support routines.

  3. Hydropower Optimization Using Artificial Neural Network Surrogate Models of a High-Fidelity Hydrodynamics and Water Quality Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaw, Amelia R.; Sawyer, Heather Smith; LeBoeuf, Eugene J.

    Hydropower operations optimization subject to environmental constraints is limited by challenges associated with dimensionality and spatial and temporal resolution. The need for high-fidelity hydrodynamic and water quality models within optimization schemes is driven by improved computational capabilities, increased requirements to meet specific points of compliance with greater resolution, and the need to optimize operations of not just single reservoirs but systems of reservoirs. This study describes an important advancement for computing hourly power generation schemes for a hydropower reservoir using high-fidelity models, surrogate modeling techniques, and optimization methods. The predictive power of the high-fidelity hydrodynamic and water quality model CE-QUAL-W2 is successfully emulated by an artificial neural network, then integrated into a genetic algorithm optimization approach to maximize hydropower generation subject to constraints on dam operations and water quality. This methodology is applied to a multipurpose reservoir near Nashville, Tennessee, USA. The model successfully reproduced high-fidelity reservoir information while enabling 6.8% and 6.6% increases in hydropower production value relative to actual operations for dissolved oxygen (DO) limits of 5 and 6 mg/L, respectively, while witnessing an expected decrease in power generation at more restrictive DO constraints. Exploration of simultaneous temperature and DO constraints revealed capability to address multiple water quality constraints at specified locations. Here, the reduced computational requirements of the new modeling approach demonstrated an ability to provide decision support for reservoir operations scheduling while maintaining high-fidelity hydrodynamic and water quality information as part of the optimization decision support routines.

  4. Validation techniques for fault emulation of SRAM-based FPGAs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quinn, Heather; Wirthlin, Michael

    A variety of fault emulation systems have been created to study the effect of single-event effects (SEEs) in static random access memory (SRAM) based field-programmable gate arrays (FPGAs). These systems are useful for augmenting radiation-hardness assurance (RHA) methodologies for verifying the effectiveness of mitigation techniques; understanding error signatures and failure modes in FPGAs; and failure rate estimation. For radiation effects researchers, it is important that these systems properly emulate how SEEs manifest in FPGAs. If a fault emulation system does not mimic the radiation environment, it will generate erroneous data and incorrect predictions of the behavior of the FPGA in a radiation environment. Validation determines whether the emulated faults are reasonable analogs to the radiation-induced faults. In this study we present methods for validating fault emulation systems and provide several examples of validated FPGA fault emulation systems.

  5. Real-time emulation of neural images in the outer retinal circuit.

    PubMed

    Hasegawa, Jun; Yagi, Tetsuya

    2008-12-01

    We describe a novel real-time system that emulates the architecture and functionality of the vertebrate retina. This system reconstructs the neural images formed by the retinal neurons in real time by using a combination of analog and digital systems consisting of a neuromorphic silicon retina chip, a field-programmable gate array, and a digital computer. While the silicon retina carries out the spatial filtering of input images instantaneously, using the embedded resistive networks that emulate the receptive field structure of the outer retinal neurons, the digital computer carries out the temporal filtering of the spatially filtered images to emulate the dynamical properties of the outer retinal circuits. Neural images comprising 128 x 128 bipolar cells are emulated at a frame rate of 62.5 Hz. Emulated responses to the Hermann grid, a spot of light, and an annulus of light demonstrated that the system behaves as expected from previous physiological and psychophysical observations. Furthermore, the emulated dynamics of neural images in response to natural scenes revealed the complex nature of retinal neuron activity. We conclude that the system reflects the spatiotemporal responses of bipolar cells in the vertebrate retina. The proposed emulation system is expected to aid in understanding the visual computation in the retina and the brain.
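
    The center-surround receptive field that the chip's resistive networks implement is commonly modeled as a difference of Gaussians. A minimal sketch over a 1-D "image" illustrates that spatial filtering stage; the sizes and sigmas are illustrative, not the chip's 128 x 128 parameters.

```python
# Difference-of-Gaussians center-surround filter: a narrow "center"
# Gaussian minus a wide "surround" Gaussian, applied to a step edge.
import math

def gaussian_kernel(sigma, radius):
    k = [math.exp(-(i * i) / (2 * sigma * sigma)) for i in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def convolve(signal, kernel):
    r = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = min(max(i + j - r, 0), len(signal) - 1)   # clamp at edges
            acc += w * signal[idx]
        out.append(acc)
    return out

def center_surround(signal, sigma_c=1.0, sigma_s=3.0, radius=9):
    center = convolve(signal, gaussian_kernel(sigma_c, radius))
    surround = convolve(signal, gaussian_kernel(sigma_s, radius))
    return [c - s for c, s in zip(center, surround)]

# A step edge: the response peaks near the luminance boundary and is
# flat in uniform regions -- the classic edge-enhancing behavior of
# outer retinal neurons.
edge = [0.0] * 16 + [1.0] * 16
response = center_surround(edge)
```

    The same antagonistic center-surround structure is what produces the illusory dark spots in the Hermann grid response mentioned in the abstract.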

  6. Robotic End Effectors for Hard-Rock Climbing

    NASA Technical Reports Server (NTRS)

    Kennedy, Brett; Leger, Patrick

    2004-01-01

    Special-purpose robot hands (end effectors) now under development are intended to enable robots to traverse cliffs much as human climbers do. Potential applications for robots having this capability include scientific exploration (both on Earth and other rocky bodies in space), military reconnaissance, and outdoor search and rescue operations. Until now, enabling robots to traverse cliffs has been considered too difficult a task because of the perceived need for prohibitively sophisticated planning algorithms as well as end effectors as dexterous as human hands. The present end effectors are being designed to enable robots to attach themselves to typical rock-face features with less planning and simpler end effectors. This advance is based on the emulation of the equipment used by human climbers rather than the emulation of the human hand. Climbing-aid equipment, specifically cams, aid hooks, and cam hooks, is used by sport climbers when a quick ascent of a cliff is desired (see Figure 1). Currently, two different end-effector designs have been created. The first, denoted the simple hook emulator, consists of three "fingers" arranged around a central "palm." Each finger emulates the function of a particular type of climbing hook (aid hook, wide cam hook, and narrow cam hook). These fingers are connected to the palm via a mechanical linkage actuated with a leadscrew/nut. This mechanism allows the fingers to be extended or retracted. The second design, denoted the advanced hook emulator (see Figure 2), shares these features, but it incorporates an aid hook and a cam hook into each finger. The spring-loading of the aid hook allows the passive selection of the type of hook used. The end effectors can be used in several different modes. In the aid-hook mode, the aid hook on one of the fingers locks onto a horizontal ledge while the other two fingers act to stabilize the end effector against the cliff face.
In the cam-hook mode, the broad, flat tip of the cam hook is inserted into a non-horizontal crack in the cliff face. A subsequent transfer of weight onto the end effector causes the tip to rotate within the crack, creating a passive, self-locking action of the hook relative to the crack. In the advanced hook emulator, the aid hook is pushed into its retracted position by contact with the cliff face as the cam hook tip is inserted into the crack. When a cliff face contains relatively large pockets or cracks, another type of passive self-locking can be used. Emulating the function of the piece of climbing equipment called a "cam" (note: not the same as a "cam hook"; see Figure 1), the fingers can be fully retracted and the entire end effector inserted into the feature. The fingers are then extended as far as the feature allows. Any weight then transferred to the end effector will tend to extend the fingers further due to frictional force, passively increasing the grip on the feature. In addition to the climbing modes, these end effectors can be used to walk on (either on the palm or the fingertips) and to grasp objects by fully extending the fingers.

  7. A Social Cognitive View of Parental Influences on Student Academic Self-Regulation.

    ERIC Educational Resources Information Center

    Martinez-Pons, Manuel

    2002-01-01

    Discusses recent theory and research on parental activities that influence children's academic self-regulatory development, describing a social-cognitive perspective on academic self- regulation which assumes parents function as implicit and explicit social models for their children and socially support their emulation and adaptive use of…

  8. Undocumented Citizens: The Civic Engagement of Activist Immigrants

    ERIC Educational Resources Information Center

    Hinton, Kip Austin

    2015-01-01

    IDEAS is the undocumented student support group at University of California Los Angeles. This ethnography follows their planning of a conference on immigrant rights legislation. How do undocumented immigrants engage in active citizenship? Patterns of student activism are seen during the conference planning process. IDEAS members emulate Freirean…

  9. Diagnostic emulation: Implementation and user's guide

    NASA Technical Reports Server (NTRS)

    Becher, Bernice

    1987-01-01

    The Diagnostic Emulation Technique was developed within the System Validation Methods Branch as a part of the development of methods for the analysis of the reliability of highly reliable, fault tolerant digital avionics systems. This is a general technique which allows for the emulation of a digital hardware system. The technique is general in the sense that it is completely independent of the particular target hardware which is being emulated. Parts of the system are described and emulated at the logic or gate level, while other parts of the system are described and emulated at the functional level. This algorithm allows for the insertion of faults into the system, and for the observation of the response of the system to these faults. This allows for controlled and accelerated testing of system reaction to hardware failures in the target machine. This document describes in detail how the algorithm was implemented at NASA Langley Research Center and gives instructions for using the system.
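
    The core idea of the technique, emulate a circuit at the gate level, insert a fault, and observe the system's response, can be sketched in miniature. The two-gate half adder below is an invented example, not the Langley target hardware, and only stuck-at faults are shown.

```python
# Gate-level emulation with stuck-at fault injection.  The netlist maps
# each output net to (gate type, input net a, input net b); nets must be
# listed in dependency order for this simple evaluator.

def evaluate(netlist, inputs, stuck_at=None):
    """Return net values; stuck_at=(net, value) pins a net to a constant."""
    values = dict(inputs)
    if stuck_at:
        values[stuck_at[0]] = stuck_at[1]
    for net, (gate, a, b) in netlist.items():
        if stuck_at and net == stuck_at[0]:
            continue                      # the faulted net ignores its driver
        if gate == "and":
            values[net] = values[a] & values[b]
        elif gate == "xor":
            values[net] = values[a] ^ values[b]
    return values

# Half adder: sum = a xor b, carry = a and b.
half_adder = {"sum": ("xor", "a", "b"), "carry": ("and", "a", "b")}

good = evaluate(half_adder, {"a": 1, "b": 1})
bad = evaluate(half_adder, {"a": 1, "b": 1}, stuck_at=("carry", 0))
```

    Diffing `good` against `bad` over many input vectors is the accelerated, controlled observation of fault response that the document describes.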

  10. A Novel Approach for Determining Source-Receptor Relationships of Aerosols in Model Simulations

    NASA Astrophysics Data System (ADS)

    Ma, P.; Gattiker, J.; Liu, X.; Rasch, P. J.

    2013-12-01

    The climate modeling community usually performs sensitivity studies in a 'one-factor-at-a-time' fashion. However, owing to the a priori unknown complexity and nonlinearity of the climate system and simulation response, it is computationally expensive to systematically identify the cause-and-effect relationships of multiple factors in climate models. In this study, we use a Gaussian Process emulator, based on a small number of Community Atmosphere Model Version 5.1 (CAM5) simulations (constrained by meteorological reanalyses) using a Latin Hypercube experimental design, to demonstrate that it is possible to characterize model behavior accurately and very efficiently without any modifications to the model itself. We use the emulator to characterize the source-receptor relationships of black carbon (BC), focusing specifically on describing the constituent burden and surface deposition rates from emissions in various regions. Our results show that the emulator is capable of quantifying the contribution of aerosol burden and surface deposition from different source regions, finding that most current Arctic BC comes from remote sources. We also demonstrate that the sensitivity of the BC burdens to emission perturbations differs for various source regions. For example, emission growth in Africa, where dry convection is strong, results in a moderate increase of BC burden over the globe, while the same emission growth in the Arctic leads to a significant increase of local BC burdens and surface deposition rates. These results provide insights into the dynamical, physical, and chemical processes of the climate model, and the conclusions may have policy implications for making cost-effective global and regional pollution management strategies.
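
    The Latin Hypercube design used to place the small number of CAM5 training runs can be sketched generically: each of the d parameter axes is cut into n strata, and each stratum is sampled exactly once. The run count, dimensionality, and "emission scaling" interpretation below are illustrative, not the study's actual design.

```python
# Latin Hypercube sampling on the unit cube: n_samples points in
# n_dims dimensions, with one point per stratum along every axis.
import random

def latin_hypercube(n_samples, n_dims, seed=42):
    rng = random.Random(seed)
    design = []
    for _ in range(n_dims):
        # one point per stratum of width 1/n, then shuffle strata
        column = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(column)
        design.append(column)
    return [list(point) for point in zip(*design)]   # rows = model runs

# e.g. 10 training runs over 3 hypothetical emission-scaling factors in [0, 1]
runs = latin_hypercube(10, 3)
```

    Stratifying every axis guarantees the handful of expensive simulations spread across the whole parameter space, which is what lets a Gaussian Process emulator generalize from so few runs.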

  11. Computer-Graphics Emulation of Chemical Instrumentation: Absorption Spectrophotometers.

    ERIC Educational Resources Information Center

    Gilbert, D. D.; And Others

    1982-01-01

    Describes interactive, computer-graphics program emulating behavior of high resolution, ultraviolet-visible analog recording spectrophotometer. Graphics terminal behaves as recording absorption spectrophotometer. Objective of the emulation is study of optimization of the instrument to yield accurate absorption spectra, including…

  12. Using a Gaussian Process Emulator for Data-driven Surrogate Modelling of a Complex Urban Drainage Simulator

    NASA Astrophysics Data System (ADS)

    Bellos, V.; Mahmoodian, M.; Leopold, U.; Torres-Matallana, J. A.; Schutz, G.; Clemens, F.

    2017-12-01

Surrogate models help to decrease the run-time of computationally expensive, detailed models. Recent studies show that Gaussian Process Emulators (GPE) are promising techniques in the field of urban drainage modelling. This study focuses on developing a GPE-based surrogate model, for later application in Real Time Control (RTC), using input and output time series of a complex simulator. The case study is an urban drainage catchment in Luxembourg. A detailed simulator, implemented in InfoWorks ICM, is used to generate 120 input-output ensembles, of which 100 are used for training the emulator and 20 for validation of the results. An ensemble of historical rainfall events with 2-hour duration and 10-minute time steps is used as the input data. Two example outputs are selected: the wastewater volume and the total COD concentration in a storage tank in the network. The results of the emulator are tested with unseen random rainfall events from the ensemble dataset. The emulator is approximately 1000 times faster than the original simulator for this small case study. Whereas the overall patterns of the simulator are matched by the emulator, in some cases the emulator deviates from the simulator. To quantify the accuracy of the emulator in comparison with the original simulator, the Nash-Sutcliffe efficiency (NSE) between the emulator and the simulator is calculated for unseen rainfall scenarios. The NSE for tank volume ranges from 0.88 to 0.99 with a mean value of 0.95, whereas for COD it ranges from 0.71 to 0.99 with a mean value of 0.92. The emulator is able to predict the tank volume with higher accuracy because the relationship between rainfall intensity and tank volume is linear. For COD, which has non-linear behaviour, the predictions are less accurate and more uncertain, in particular when rainfall intensity increases. These predictions were improved by including a larger amount of training data for the higher rainfall intensities. It was observed that the accuracy of the emulator predictions depends on the design of the ensemble training dataset and the amount of training data. Finally, more investigation is required to test the possibility of applying this type of fast emulator in model-based RTC applications, in which a limited number of inputs and outputs are considered over a short prediction horizon.
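The Nash-Sutcliffe efficiency used above to score the emulator against the simulator is simple to compute; a minimal sketch with made-up series (not the study's data):

```python
import numpy as np

def nash_sutcliffe(observed, modelled):
    """Nash-Sutcliffe efficiency: 1 is a perfect match; 0 means the
    model predicts no better than the mean of the observations."""
    observed = np.asarray(observed, dtype=float)
    modelled = np.asarray(modelled, dtype=float)
    return 1.0 - np.sum((observed - modelled) ** 2) / np.sum(
        (observed - observed.mean()) ** 2)

# Hypothetical simulator vs. emulator tank-volume series.
sim = np.array([10.0, 12.0, 15.0, 11.0, 9.0])
emu = np.array([10.2, 11.8, 14.5, 11.3, 9.1])
nse = nash_sutcliffe(sim, emu)
```

Here the detailed simulator output plays the role of "observed" data, so the NSE measures how faithfully the cheap emulator reproduces it.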

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, Frank; Popp, Till; Wieczorek, Klaus

The purposes of this paper are to review the vast amount of knowledge concerning crushed salt reconsolidation and its attendant hydraulic properties (i.e., its capability for fluid or gas transport) and to provide a sufficient basis to understand reconsolidation and healing rates under repository conditions. Topics covered include: deformation mechanisms and hydro-mechanical interactions during reconsolidation; the experimental data base pertaining to crushed salt reconsolidation; transport properties of consolidating granulated salt, with quantitative substantiation of their evolution toward characteristics emulating undisturbed rock salt; and extension of microscopic and laboratory observations and data to the applicable field scale.

  14. Capability and Interface Assessment of Gaming Technologies for Future Multi-Unmanned Air Vehicle Systems

    DTIC Science & Technology

    2011-08-01

    resource management games (e.g., Sim City 2000), board game simulations (e.g., VASSAL), and abstract games (e.g., Tetris). The second purpose of the...which occur simultaneously o E.g., Starcraft  Board game o A computer game that emulates a board game o E.g., Archon  2D Side View o A game...a mouse  Joypad o E.g., A playstation/X-box controller  Accelerometer o E.g., A Wii Controller  Touch 22 Distribution A: Approved for

  15. Self Diagnostic Accelerometer for Mission Critical Health Monitoring of Aircraft and Spacecraft Engines

    NASA Technical Reports Server (NTRS)

    Lekki, John; Tokars, Roger; Jaros, Dave; Riggs, M. Terrence; Evans, Kenneth P.; Gyekenyesi, Andrew

    2009-01-01

    A self diagnostic accelerometer system has been shown to be sensitive to multiple failure modes of charge mode accelerometers. These failures include sensor structural damage, an electrical open circuit and most importantly sensor detachment. In this paper, experimental work that was performed to determine the capabilities of a self diagnostic accelerometer system while operating in the presence of various levels of mechanical noise, emulating real world conditions, is presented. The results show that the system can successfully conduct a self diagnostic routine under these conditions.

  16. Adaptive control of a Stewart platform-based manipulator

    NASA Technical Reports Server (NTRS)

    Nguyen, Charles C.; Antrazi, Sami S.; Zhou, Zhen-Lei; Campbell, Charles E., Jr.

    1993-01-01

A joint-space adaptive control scheme for controlling noncompliant motion of a Stewart platform-based manipulator (SPBM) was implemented in the Hardware Real-Time Emulator at Goddard Space Flight Center. The six-degree-of-freedom SPBM uses two platforms and six linear actuators driven by DC motors. The adaptive control scheme is based on proportional-derivative controllers whose gains are adjusted by an adaptation law based on model reference adaptive control and the Liapunov direct method. It is concluded that the adaptive control scheme provides superior tracking capability compared to fixed-gain controllers.

  17. The Processing of Airspace Concept Evaluations Using FASTE-CNS as a Pre- or Post-Simulation CNS Analysis Tool

    NASA Technical Reports Server (NTRS)

    Mainger, Steve

    2004-01-01

As NASA speculates on and explores the future of aviation, the technological and physical aspects of our environment increasingly become hurdles that must be overcome for success. Research into methods for overcoming some of these selected hurdles has been proposed by several NASA research partners as concepts. The task of establishing a common evaluation environment was placed on NASA's Virtual Airspace Simulation Technologies (VAST) project (a sub-project of VAMS), which responded with the development of the Airspace Concept Evaluation System (ACES). When one examines the ACES environment from a communication, navigation, or surveillance (CNS) perspective, the simulation parameters are built with assumed perfection in the transactions associated with CNS. To truly evaluate these concepts in a realistic sense, the contributions and effects of CNS must be part of ACES. NASA Glenn Research Center (GRC) has supported the Virtual Airspace Modeling and Simulation (VAMS) project through the continued development of CNS models and analysis capabilities that support the ACES environment. As part of this support, NASA GRC initiated the development of a communications traffic loading analysis tool called the Future Aeronautical Sub-network Traffic Emulator for Communications, Navigation and Surveillance (FASTE-CNS). This tool allows forecasting of communications load, with the understanding that there is no single, common source for the loading models used to evaluate existing and planned communications channels, and that consensus and accuracy in the traffic load models are very important inputs to the decisions being made on the acceptability of communication techniques used to fulfill aeronautical requirements.
Leveraging the existing capabilities of the FASTE-CNS tool, GRC has called for FASTE-CNS to have the functionality to pre- and post-process the simulation runs of ACES to report on instances when traffic density, frequency congestion, or aircraft spacing/distance violations have occurred. The integration of these functions requires that the CNS models used to characterize these avionic systems be of higher fidelity and better consistency than is present in the current FASTE-CNS system. This presentation explores the capabilities of FASTE-CNS, with renewed emphasis on the enhancements being added to perform these processing functions; the fidelity and reliability of the CNS models necessary to make the enhancements work; and the benchmarking of FASTE-CNS results to improve confidence in the results of the new processing capabilities.

  18. Demonstration of L-band DP-QPSK transmission over FSO and fiber channels employing InAs/InP quantum-dash laser source

    NASA Astrophysics Data System (ADS)

    Shemis, M. A.; Khan, M. T. A.; Alkhazraji, E.; Ragheb, A. M.; Esmail, M. A.; Fathallah, H.; Qureshi, K. K.; Alshebeili, S.; Khan, M. Z. M.

    2018-03-01

The next generation of optical access communication networks, supporting 100 Gbps and beyond, requires advances in modulation schemes, spectrum utilization, new transmission bands, and efficient devices, particularly laser diodes. In this paper, we investigate the viability of a new class of InAs/InP quantum-dash laser diodes (Qdash-LDs), exhibiting multiple longitudinal light modes in the L-band, to carry high-speed data for access network applications. We exploited and compared external and self-injection-locking techniques on the Qdash-LD to generate a large number of stable and tunable locked modes. To assess the capability of each locked mode as a potential subcarrier, data transmission is carried out over two media, single-mode fiber (SMF) and free-space optics (FSO), to emulate real deployment scenarios of optical networks. The results show that with external-injection locking (EIL), error-free transmission of a 100 Gbps dual-polarization quadrature phase shift keying (DP-QPSK) signal is demonstrated over 10 km SMF and 4 m indoor FSO channels, with the capability of reaching up to 128 Gbps demonstrated in a back-to-back (BTB) configuration. On the other hand, using the self-injection locking (SIL) scheme, successful transmission of 64 Gbps and 128 Gbps DP-QPSK signals over 20 km SMF and 10 m indoor FSO links, respectively, is achieved.

  19. Evaluating Emulation-based Models of Distributed Computing Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Stephen T.; Gabert, Kasimir G.; Tarman, Thomas D.

Emulation-based models of distributed computing systems are collections of virtual machines, virtual networks, and other emulation components configured to stand in for operational systems when performing experimental science, training, analysis of design alternatives, test and evaluation, or idea generation. As with any tool, we should carefully evaluate whether our uses of emulation-based models are appropriate and justified. Otherwise, we run the risk of using a model incorrectly and creating meaningless results. The various uses of emulation-based models each have their own goals and deserve thoughtful evaluation. In this paper, we enumerate some of these uses and describe approaches that one can take to build an evidence-based case that a use of an emulation-based model is credible. Predictive uses of emulation-based models, where we expect a model to tell us something true about the real world, set the bar especially high, and the principal evaluation method, called validation, is commensurately rigorous. We spend the majority of our time describing and demonstrating the validation of a simple predictive model using a well-established methodology inherited from decades of development in the computational science and engineering community.

  20. What role does performance information play in securing improvement in healthcare? a conceptual framework for levers of change

    PubMed Central

    Levesque, Jean-Frederic; Sutherland, Kim

    2017-01-01

Objective Across healthcare systems, there is consensus on the need for independent and impartial assessment of performance. There is less agreement about how measurement and reporting performance improves healthcare. This paper draws on academic theories to develop a conceptual framework—one that classifies in an integrated manner the ways in which change can be leveraged by healthcare performance information. Methods A synthesis of published frameworks. Results The framework identifies eight levers for change enabled by performance information, spanning internal and external drivers, and emergent and planned processes: (1) cognitive levers provide awareness and understanding; (2) mimetic levers inform about the performance of others to encourage emulation; (3) supportive levers provide facilitation, implementation tools or models of care to actively support change; (4) formative levers develop capabilities and skills through teaching, mentoring and feedback; (5) normative levers set performance against guidelines, standards, certification and accreditation processes; (6) coercive levers use policies, regulations, incentives and disincentives to force change; (7) structural levers modify the physical environment or professional cultures and routines; (8) competitive levers attract patients or funders. Conclusion This framework highlights how performance measurement and reporting can contribute to eight different levers for change. It provides guidance on how to align performance measurement and reporting with quality improvement programmes. PMID:28851769

  1. Inquiry into the Heart of a Comet

    ERIC Educational Resources Information Center

    Cobb, Whitney; Roundtree-Brown, Maura; McFadden, Lucy; Warner, Elizabeth

    2011-01-01

    Real science means wrangling with peers over real ideas. Wouldn't it be thrilling to emulate a real life model of science in action in classrooms? How? By starting with a great, hands-on activity modeling an object in space that introduces both key vocabulary and science concepts with visuals to support retention and learning; encouraging…

  2. Public acceptance of disturbance-based forest management: factors influencing support

    Treesearch

    Christine S. Olsen; Angela L. Mallon; Bruce A. Shindler

    2012-01-01

    Growing emphasis on ecosystem and landscape-level forest management across North America has spurred an examination of alternative management strategies which focus on emulating dynamic natural disturbance processes, particularly those associated with forest fire regimes. This topic is the cornerstone of research in the Blue River Landscape Study (BRLS) on the...

  3. Emulating multiple inheritance in Fortran 2003/2008

    DOE PAGES

    Morris, Karla

    2015-01-24

Although the high-performance computing (HPC) community increasingly embraces object-oriented programming (OOP), most HPC OOP projects employ the C++ programming language. Until recently, Fortran programmers interested in mining the benefits of OOP had to emulate OOP in Fortran 90/95. The advent of widespread compiler support for Fortran 2003 now facilitates explicitly constructing object-oriented class hierarchies via inheritance and leveraging related class behaviors such as dynamic polymorphism. Although C++ allows a class to inherit from multiple parent classes, Fortran and several other OOP languages restrict or prohibit explicit multiple inheritance relationships in order to circumvent several pitfalls associated with them. Nonetheless, what appears as an intrinsic feature in one language can be modeled as a user-constructed design pattern in another language. The present paper demonstrates how to apply the facade structural design pattern to support a multiple inheritance class relationship in Fortran 2003. As a result, the design unleashes the power of the associated class relationships for modeling complicated data structures yet avoids the ambiguities that plague some multiple inheritance scenarios.
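The paper applies the facade structural design pattern in Fortran 2003; as a language-neutral sketch of the same idea — a facade composing one component per would-be parent class and forwarding calls, with any ambiguity made explicit — here is a minimal Python model. All class names are invented, and Python itself has native multiple inheritance, so this is purely an illustration of the pattern rather than the paper's implementation.

```python
# Composition-based facade emulating multiple inheritance: the facade
# holds one component per would-be parent class and forwards attribute
# lookups, raising an explicit error on name collisions instead of
# allowing silent ambiguity (the "diamond" pitfall).

class Integrable:
    def integrate(self):
        return "integrating"

class Printable:
    def display(self):
        return "displaying"

class FieldFacade:
    """Facade presenting the union of both parents' interfaces."""
    def __init__(self):
        self._parents = (Integrable(), Printable())

    def __getattr__(self, name):
        # Called only when normal lookup fails; search each component.
        matches = [getattr(p, name) for p in self._parents if hasattr(p, name)]
        if len(matches) > 1:
            raise AttributeError(f"ambiguous attribute: {name}")
        if not matches:
            raise AttributeError(name)
        return matches[0]

f = FieldFacade()
```

The facade thus behaves as if it inherited from both classes, while collisions between the parents must be resolved deliberately rather than by a language-defined resolution order.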

  4. Performance Evaluation, Emulation, and Control of Cross-Flow Hydrokinetic Turbines

    NASA Astrophysics Data System (ADS)

    Cavagnaro, Robert J.

    Cross-flow hydrokinetic turbines are a promising option for effectively harvesting energy from fast-flowing streams or currents. This work describes the dynamics of such turbines, analyzes techniques used to scale turbine properties for prototyping, determines and demonstrates the limits of stability for cross-flow rotors, and discusses means and objectives of turbine control. Novel control strategies are under development to utilize low-speed operation (slower than at maximum power point) as a means of shedding power under rated conditions. However, operation in this regime may be unstable. An experiment designed to characterize the stability of a laboratory-scale cross-flow turbine operating near a critically low speed yields evidence that system stall (complete loss of ability to rotate) occurs due, in part, to interactions with turbulent decreases in flow speed. The turbine is capable of maintaining 'stable' operation at critical speed for short duration (typically less than 10 s), as described by exponential decay. The presence of accelerated 'bypass' flow around the rotor and decelerated 'induction' region directly upstream of the rotor, both predicted by linear momentum theory, are observed and quantified with particle image velocimetry (PIV) measurements conducted upstream of the turbine. Additionally, general agreement is seen between PIV inflow measurements and those obtained by an advection-corrected acoustic Doppler velocimeter (ADV) further upstream. Performance of a turbine at small (prototype) geometric scale may be prone to undesirable effects due to operation at low Reynolds number and in the presence of high channel blockage. Therefore, testing at larger scale, in open water is desirable. A cross-flow hydrokinetic turbine with a projected area (product of blade span and rotor diameter) of 0.7 m2 is evaluated in open-water tow trials at three inflow speeds ranging from 1.0 m/s to 2.1 m/s. 
Measurements of the inflow velocity, the rotor mechanical power, and the electrical power output of a complete power take-off (PTO) system are utilized to determine the rotor hydrodynamic efficiency (maximum of 17%) and total system efficiency (maximum of 9%). A lab-based dynamometry method yields individual component and total PTO efficiencies, shown to have high variability and strong influence on total system efficiency. As found during field characterization, the dynamic efficiencies of PTO components can affect the overall efficiency of a turbine system. Thus, the ability to evaluate such components and their potential effects on turbine performance prior to field deployment is desirable. Before attempting control experiments with actual turbines, hardware-in-the-loop testing on controllable motor-generator sets, or electromechanical emulation machines (EEMs), is explored to better understand power take-off response. The emulator control dynamic equations are presented, methods for scaling turbine parameters are developed and evaluated, and experimental results are presented from three EEMs programmed to emulate the same cross-flow turbine. Although hardware platforms and control implementations varied, results show that each EEM is successful in emulating the turbine model at different power levels, thus demonstrating the general feasibility of the approach. However, performance of motor control under torque command, current command, or speed command differed; torque methods required accurate characterization of the motors, while speed methods utilized encoder feedback and more accurately tracked turbine dynamics. In a demonstration of an EEM for evaluating a hydrokinetic turbine implementation, a controller is used to track the maximum power point of the turbine in response to turbulence. 
Utilizing realistic inflow conditions and control laws, the emulator dynamic speed response is shown to agree well at low frequencies with simulation but to deviate at high frequencies. The efficacy of an electromechanical emulator as an accurate representation of a fielded turbine is evaluated. A commercial horizontally-oriented cross-flow turbine is dynamically emulated on hardware to investigate control strategies and grid integration. A representative inflow time-series with a mean of 2 m/s is generated from high-resolution flow measurements of a riverine site and is used to drive emulation. Power output during emulation under similar input and loading conditions yields agreement with field measurements to within 3% at high power, near-optimal levels. Constant tip-speed ratio and constant speed proportional plus integral control schemes are compared to optimal nonlinear control and constant resistance regulation. All controllers yield similar results in terms of overall system efficiency. The emulated turbine is more responsive to turbulent inflow than the field turbine, as the model utilized to drive emulation does not account for a smoothing effect of turbulent fluctuations over the span of the fielded turbine's rotors. The turbine has a lower inertia than the demand of an isolated grid, indicating a secondary source of power with a similar frequency response is necessary if a single turbine cannot meet the entire demand. (Abstract shortened by UMI.).

  5. Emulating Industrial Control System Field Devices Using Gumstix Technology

    DTIC Science & Technology

    2012-06-01

EMULATING INDUSTRIAL CONTROL SYSTEM FIELD DEVICES USING GUMSTIX TECHNOLOGY THESIS Dustin J...APPROVED FOR PUBLIC RELEASE; DISTRIBUTION UNLIMITED. The views expressed in this thesis are those of the... Presented to the Faculty Department of

  6. Direct-coupled microcomputer-based building emulator for building energy management and control systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lam, H.N.

    1999-07-01

In this paper, the development and implementation of a direct-coupled building emulator for a building energy management and control system (EMCS) is presented. The building emulator consists of a microcomputer and a computer model of an air-conditioning system implemented in a modular dynamic simulation software package for direct-coupling to an EMCS, without using analog-to-digital and digital-to-analog converters. The building emulator can be used to simulate in real time the behavior of the air-conditioning system under a given operating environment and subject to a given usage pattern. Software modules for data communication, graphical display, dynamic data exchange, and synchronization of simulation outputs with real time have been developed to achieve direct digital data transfer between the building emulator and a commercial EMCS. Based on the tests conducted, the validity of the building emulator has been established and the proportional-plus-integral control function of the EMCS assessed.

  7. Development of Wave Turbine Emulator in a Laboratory Environment

    NASA Astrophysics Data System (ADS)

    Vinatha, U.; Vittal K, P.

    2013-07-01

A wave turbine emulator (WTE) is an important piece of equipment for developing wave energy conversion systems. The emulator reflects the actual behavior of a wave turbine by reproducing the characteristics of a real wave turbine without reliance on natural wave resources or an actual wave turbine. It offers a controllable test environment that allows the evaluation and improvement of control schemes for electric generators. The emulator can be used in research applications to drive an electrical generator in the same way as a practical wave turbine. This article presents the development of a WTE in a laboratory environment and studies the behavior of an electrical generator coupled to the emulator. The WTE consists of a PC on which the characteristics of the turbine are implemented, an AC drive to emulate the turbine rotor, a feedback mechanism from the drive, and power electronic equipment to control the drive. The feedback signal is acquired by the PC through an A/D converter, and the signal for driving the power electronic device comes from the PC through a D/A converter.
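One loop iteration of the architecture described — the PC reads the drive speed through an A/D converter, evaluates the stored turbine characteristic, and commands the drive through a D/A converter — might look like the following sketch. The quadratic torque characteristic and all numbers are hypothetical placeholders, not a real wave-turbine curve.

```python
# One iteration of a hypothetical turbine-emulator control loop.
# A real WTE would store the measured torque-speed characteristic of
# the wave turbine being emulated; here a quadratic stands in for it.

def turbine_torque(omega, k=0.8):
    """Reference torque from the stored turbine characteristic (N*m)."""
    return k * omega ** 2

def emulator_step(read_speed, write_torque):
    """Read rotor speed via A/D, look up torque, command drive via D/A."""
    omega = read_speed()            # A/D: measured drive speed (rad/s)
    t_ref = turbine_torque(omega)   # stored turbine characteristic
    write_torque(t_ref)             # D/A: torque command to the drive
    return t_ref

# Stand-ins for the A/D input and D/A output interfaces.
commands = []
t = emulator_step(lambda: 10.0, commands.append)
```

In the laboratory setup this loop would run continuously, so the AC drive presents the generator with the same torque-speed behavior a real wave turbine would.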

  8. Emulation for probabilistic weather forecasting

    NASA Astrophysics Data System (ADS)

    Cornford, Dan; Barillec, Remi

    2010-05-01

    Numerical weather prediction models are typically very expensive to run due to their complexity and resolution. Characterising the sensitivity of the model to its initial condition and/or to its parameters requires numerous runs of the model, which is impractical for all but the simplest models. To produce probabilistic forecasts requires knowledge of the distribution of the model outputs, given the distribution over the inputs, where the inputs include the initial conditions, boundary conditions and model parameters. Such uncertainty analysis for complex weather prediction models seems a long way off, given current computing power, with ensembles providing only a partial answer. One possible way forward that we develop in this work is the use of statistical emulators. Emulators provide an efficient statistical approximation to the model (or simulator) while quantifying the uncertainty introduced. In the emulator framework, a Gaussian process is fitted to the simulator response as a function of the simulator inputs using some training data. The emulator is essentially an interpolator of the simulator output and the response in unobserved areas is dictated by the choice of covariance structure and parameters in the Gaussian process. Suitable parameters are inferred from the data in a maximum likelihood, or Bayesian framework. Once trained, the emulator allows operations such as sensitivity analysis or uncertainty analysis to be performed at a much lower computational cost. The efficiency of emulators can be further improved by exploiting the redundancy in the simulator output through appropriate dimension reduction techniques. We demonstrate this using both Principal Component Analysis on the model output and a new reduced-rank emulator in which an optimal linear projection operator is estimated jointly with other parameters, in the context of simple low order models, such as the Lorenz 40D system. 
We present the application of emulators to probabilistic weather forecasting, where the construction of the emulator training set replaces the traditional ensemble model runs. Thus the actual forecast distributions are computed using the emulator conditioned on the 'ensemble runs', which are chosen to explore the plausible input space using relatively crude experimental design methods. One benefit here is that the ensemble does not need to be a sample from the true distribution of the input space; rather, it should cover that input space in some sense. The probabilistic forecasts are computed using Monte Carlo methods, sampling from the input distribution and using the emulator to produce the output distribution. Finally, we discuss the limitations of this approach and briefly mention how we might use similar methods to learn the model error within a framework that incorporates a data-assimilation-like aspect, using emulators and learning complex model error representations. We suggest future directions for research in the area that will be necessary to apply the method to more realistic numerical weather prediction models.
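The pipeline this abstract describes — fit a Gaussian process to a set of design-point simulator runs, then push Monte Carlo samples of the input distribution through the cheap emulator to obtain a forecast distribution — can be sketched as follows. The toy simulator, training design, and input distribution are all illustrative assumptions, not the Lorenz system or any numerical weather prediction model.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Toy nonlinear "simulator" standing in for an expensive forecast model.
def simulator(x):
    return np.exp(-x[:, 0] ** 2) * np.cos(4 * x[:, 1])

rng = np.random.default_rng(0)

# 'Ensemble runs': a space-filling training design covering the plausible
# input space -- not a sample from the input distribution itself.
X_train = rng.uniform(-1, 1, size=(60, 2))
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
gp.fit(X_train, simulator(X_train))

# Probabilistic forecast: Monte Carlo samples from the (assumed Gaussian)
# input distribution, pushed through the cheap emulator.
X_mc = rng.normal(0.0, 0.3, size=(5000, 2))
forecast = gp.predict(X_mc)
mean, spread = forecast.mean(), forecast.std()
```

The 5000 emulator evaluations here replace 5000 runs of the expensive simulator, which is the source of the computational savings the abstract emphasizes.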

  9. Expanding Hardware-in-the-Loop Formation Navigation and Control with Radio Frequency Crosslink Ranging

    NASA Technical Reports Server (NTRS)

    Mitchell, Jason W.; Barbee, Brent W.; Baldwin, Philip J.; Luquette, Richard J.

    2007-01-01

    The Formation Flying Testbed (FFTB) at the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) provides a hardware-in-the-loop test environment for formation navigation and control. The facility continues to evolve as a modular, hybrid, dynamic simulation facility for end-to-end guidance, navigation, and control (GN&C) design and analysis of formation flying spacecraft. The core capabilities of the FFTB, as a platform for testing critical hardware and software algorithms in-the-loop, are reviewed with a focus on recent improvements. With the most recent improvement, in support of Technology Readiness Level (TRL) 6 testing of the Inter-spacecraft Ranging and Alarm System (IRAS) for the Magnetospheric Multiscale (MMS) mission, the FFTB has significantly expanded its ability to perform realistic simulations that require Radio Frequency (RF) ranging sensors for relative navigation with the Path Emulator for RF Signals (PERFS). The PERFS, currently under development at NASA GSFC, modulates RF signals exchanged between spacecraft. The RF signals are modified to accurately reflect the dynamic environment through which they travel, including the effects of medium, moving platforms, and radiated power.

  10. MACHETE: Environment for Space Networking Evaluation

    NASA Technical Reports Server (NTRS)

    Jennings, Esther H.; Segui, John S.; Woo, Simon

    2010-01-01

Space exploration missions require the design and implementation of space networking that differs from terrestrial networks. In a space networking architecture, interplanetary communication protocols need to be designed, validated and evaluated carefully to support different mission requirements. As actual systems are expensive to build, it is essential to have a low-cost method to validate and verify mission/system designs and operations. This can be accomplished through simulation. Simulation can aid design decisions where alternative solutions are being considered, support trade studies and enable fast study of what-if scenarios. It can be used to identify risks, verify system performance against requirements, and as an initial test environment as one moves towards emulation and actual hardware implementation of the systems. We describe the development of the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) and its use cases in supporting architecture trade studies and protocol performance, and its role in hybrid simulation/emulation. The MACHETE environment contains various tools and interfaces such that users may select the set of tools tailored for the specific simulation end goal. The use cases illustrate tool combinations for simulating space networking in different mission scenarios. This simulation environment is useful in supporting space networking design for planned and future missions as well as evaluating performance of existing networks where non-determinism exists in data traffic and/or link conditions.

  11. Entrainment and motor emulation approaches to joint action: Alternatives or complementary approaches?

    PubMed

    Colling, Lincoln J; Williamson, Kellie

    2014-01-01

Joint actions, such as music and dance, rely crucially on the ability of two, or more, agents to align their actions with great temporal precision. Within the literature that seeks to explain how this action alignment is possible, two broad approaches have appeared. The first, what we term the entrainment approach, has sought to explain these alignment phenomena in terms of the behavioral dynamics of the system of two agents. The second, what we term the emulator approach, has sought to explain these alignment phenomena in terms of mechanisms, such as forward and inverse models, that are implemented in the brain. They have often been pitched as alternative explanations of the same phenomena; however, we argue that this view is mistaken, because, as we show, these two approaches are engaged in distinct, and not mutually exclusive, explanatory tasks. While the entrainment approach seeks to uncover the general laws that govern behavior, the emulator approach seeks to uncover mechanisms. We argue that it is possible to do both and that the entrainment approach must pay greater attention to the mechanisms that support the behavioral dynamics of interest. In short, the entrainment approach must be transformed into a neuroentrainment approach by adopting a mechanistic view of explanation and by seeking mechanisms that are implemented in the brain.

  12. Mixed-mode oscillations in memristor emulator based Liénard system

    NASA Astrophysics Data System (ADS)

    Kingston, S. Leo; Suresh, K.; Thamilmaran, K.

    2018-04-01

    We report the existence of mixed-mode oscillations in a memristor-emulator-based Liénard system that is externally driven by a sinusoidal force. The charge-flux relationship of the memristor emulator device is explored based on a smooth cubic nonlinear element. The system exhibits successive period-adding sequences of mixed-mode oscillations over a wide parameter region. The electronic circuit of the memristor emulator is successfully implemented in PSpice, and the mixed-mode oscillations observed in the PSpice experiment qualitatively match the numerical simulation.
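
    The driven Liénard dynamics described above can be sketched numerically. The snippet below integrates a sinusoidally forced Liénard-type oscillator whose damping is shaped by a smooth cubic memristor charge-flux curve q(phi) = A*phi + B*phi**3, giving memductance W(phi) = A + 3*B*phi**2. The specific equation form and every parameter value are illustrative assumptions, not the paper's circuit.

```python
import numpy as np

# Hypothetical parameter values chosen for demonstration only.
A, B, MU, F, OMEGA = 1.0, 0.1, 0.5, 0.3, 0.7

def deriv(t, s):
    # x'' + MU*W(x)*x' + x = F*sin(OMEGA*t), written as a first-order system
    x, y = s
    return np.array([y, -MU * (A + 3.0 * B * x**2) * y - x + F * np.sin(OMEGA * t)])

def rk4(s0, t0, t1, n):
    """Fixed-step classical Runge-Kutta 4 integrator."""
    h = (t1 - t0) / n
    t, s = t0, np.asarray(s0, dtype=float)
    traj = [s.copy()]
    for _ in range(n):
        k1 = deriv(t, s)
        k2 = deriv(t + h / 2, s + h / 2 * k1)
        k3 = deriv(t + h / 2, s + h / 2 * k2)
        k4 = deriv(t + h, s + h * k3)
        s = s + (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
        traj.append(s.copy())
    return np.array(traj)

traj = rk4([0.1, 0.0], 0.0, 200.0, 20000)  # samples of x(t), x'(t)
```

    In a sketch like this, period-adding sequences of mixed-mode oscillations would be searched for by sweeping the forcing amplitude F or frequency OMEGA and counting large/small excursions per period.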

  13. Megatux

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-09-25

    The Megatux platform enables the emulation of large-scale (multi-million-node) distributed systems. In particular, it allows for the emulation of large-scale networks interconnecting a very large number of emulated computer systems. It does this by leveraging virtualization and associated technologies to allow hundreds of virtual computers to be hosted on a single moderately sized server or workstation. Virtualization technology provided by modern processors allows multiple guest OSs to run at the same time, sharing the hardware resources. The Megatux platform can be deployed on a single PC, a small cluster of a few boxes, or a large cluster of computers. With a modest cluster, the Megatux platform can emulate complex organizational networks. By using virtualization, we emulate the hardware but run actual software, enabling large scale without sacrificing fidelity.

  14. Advanced data management system architectures testbed

    NASA Technical Reports Server (NTRS)

    Grant, Terry

    1990-01-01

    The objective of the Architecture and Tools Testbed is to provide a working, experimental focus to the evolving automation applications for the Space Station Freedom data management system. Emphasis is on defining and refining real-world applications including the following: the validation of user needs; understanding system requirements and capabilities; and extending capabilities. The approach is to provide an open, distributed system of high performance workstations representing both the standard data processors and networks and advanced RISC-based processors and multiprocessor systems. The system provides a base from which to develop and evaluate new performance and risk management concepts and for sharing the results. Participants are given a common view of requirements and capability via: remote login to the testbed; standard, natural user interfaces to simulations and emulations; special attention to user manuals for all software tools; and E-mail communication. The testbed elements which instantiate the approach are briefly described including the workstations, the software simulation and monitoring tools, and performance and fault tolerance experiments.

  15. Processor Emulator with Benchmark Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lloyd, G. Scott; Pearce, Roger; Gokhale, Maya

    2015-11-13

    A processor emulator and a suite of benchmark applications have been developed to assist in characterizing the performance of data-centric workloads on current and future computer architectures. Some of the applications have been collected from other open source projects. For more details on the emulator and an example of its usage, see reference [1].

  16. The Ghost Condition: Imitation Versus Emulation in Young Children's Observational Learning.

    ERIC Educational Resources Information Center

    Thompson, Doreen E.; Russell, James

    2004-01-01

    Although observational learning by children may occur through imitating a modeler's actions, it can also occur through learning about an object's dynamic affordances- a process that M. Tomasello (1996) calls "emulation." The relative contributions of imitation and emulation within observational learning were examined in a study with 14- to…

  17. The Use and Abuses of Emulation as a Pedagogical Practice

    ERIC Educational Resources Information Center

    Jonas, Mark E.; Chambers, Drew W.

    2017-01-01

    From the late eighteenth through the end of the nineteenth century, educational philosophers and practitioners debated the benefits and shortcomings of the use of emulation in schools. During this period, "emulation" referred to a pedagogy that leveraged comparisons between students as a tool to motivate them to higher achievement. Many…

  18. Evaluating and Celebrating PBIS Success: Development and Implementation of Ohio's PBIS Recognition System

    ERIC Educational Resources Information Center

    Noltemeyer, Amity; Petrasek, Michael; Stine, Karen; Palmer, Katelyn; Meehan, Cricket; Jordan, Emily

    2018-01-01

    With the increasing use of Positive Behavioral Interventions and Supports (PBIS) nationally, several states have developed systems to recognize exemplar schools that are implementing PBIS with fidelity. These systems serve the dual purpose of identifying model PBIS schools for other schools to emulate while also reinforcing schools' effective PBIS…

  19. Aerial image measurement technique for automated reticle defect disposition (ARDD) in wafer fabs

    NASA Astrophysics Data System (ADS)

    Zibold, Axel M.; Schmid, Rainer M.; Stegemann, B.; Scheruebl, Thomas; Harnisch, Wolfgang; Kobiyama, Yuji

    2004-08-01

    The Aerial Image Measurement System (AIMS)* for 193 nm lithography emulation has been brought into operation successfully worldwide. A second-generation system comprising 193 nm AIMS capability, a mini-environment, and SMIF, the AIMS fab 193 plus, is currently being introduced to the market. By adjusting the numerical aperture (NA), illumination type, and partial illumination coherence to match the conditions in 193 nm steppers or scanners, it can emulate the exposure tool for any type of reticle, such as binary, OPC, and PSM, down to the 65 nm node. The system allows a rapid prediction of the wafer printability of defects or defect repairs, and of critical features such as dense patterns or contacts on the mask, without the need to perform expensive image qualification consisting of test wafer exposures followed by SEM measurements. AIMS is therefore a mask quality verification standard for high-end photomasks, established in mask shops worldwide. The progress on AIMS technology described in this paper will highlight that, beyond mask shops, there is a very beneficial use of AIMS in the wafer fab, and we propose an Automated Reticle Defect Disposition (ARDD) process. At smaller nodes, where design rules are 65 nm or less, it is expected that smaller defects on reticles will occur in increasing numbers in the wafer fab. These smaller mask defects will matter more and more and become a serious yield-limiting factor. With increasing mask prices and an increasing number and severity of defects on reticles, it will become cost-beneficial to perform defect disposition on reticles in wafer production. Currently ongoing studies demonstrate AIMS benefits for wafer fab applications. An outlook is given for the extension of 193 nm aerial imaging down to the 45 nm node based on emulation of immersion scanners.

  20. A 1.26 μW Cytomimetic IC Emulating Complex Nonlinear Mammalian Cell Cycle Dynamics: Synthesis, Simulation and Proof-of-Concept Measured Results.

    PubMed

    Houssein, Alexandros; Papadimitriou, Konstantinos I; Drakakis, Emmanuel M

    2015-08-01

    Cytomimetic circuits represent a novel, ultra low-power, continuous-time, continuous-value class of circuits, capable of mapping on silicon cellular and molecular dynamics modelled by means of nonlinear ordinary differential equations (ODEs). Such monolithic circuits are in principle able to emulate on chip single or multiple cell operations in a highly parallel fashion. Cytomimetic topologies can be synthesized by adopting the Nonlinear Bernoulli Cell Formalism (NBCF), a mathematical framework that exploits the striking similarities between the equations describing weakly-inverted Metal-Oxide Semiconductor (MOS) devices and coupled nonlinear ODEs, typically appearing in models of naturally encountered biochemical systems. The NBCF maps biological state variables onto strictly positive subthreshold MOS circuit currents. This paper presents the synthesis, simulation, and proof-of-concept chip results corresponding to the emulation of a complex cellular network mechanism, the skeleton model for the network of Cyclin-dependent Kinases (CdKs) driving the mammalian cell cycle. This five-variable nonlinear biological model, when appropriate model parameter values are assigned, can exhibit multiple oscillatory behaviors, varying from simple periodic oscillations to complex oscillations such as quasi-periodicity and chaos. The validity of our approach is verified by simulated results with realistic process parameters from the commercially available AMS 0.35 μm technology and by chip measurements. The fabricated chip occupies an area of 2.27 mm² and consumes 1.26 μW from a 3 V power supply. The presented cytomimetic topology follows closely the behavior of its biological counterpart, exhibiting similar time-dependent solutions of the Cdk complexes, the transcription factors, and the proteins.

  1. Idols as Sunshine or Road Signs: Comparing Absorption-Addiction Idolatry With Identification-Emulation Idolatry.

    PubMed

    Cheung, Chau-Kiu; Yue, Xiao Dong

    2018-01-01

    This study seeks to contrast absorption-addiction idolatry and identification-emulation idolatry. Whereas absorption-addiction idolatry progresses from entertainment/socializing to personalizing and obsession about the idol, identification-emulation idolatry unfolds in terms of identification, attachment, romanticization, idealization, and consumption of the idol or his or her derivatives. Based on a sample of 1310 secondary school and university students in Hong Kong, the study verified the original factor model composed of five first-order identification-emulation idolatry factors and three first-order absorption-addiction idolatry factors, with the latter more predictable by fan club membership.

  2. Design of BLDCM emulator for transmission control units

    NASA Astrophysics Data System (ADS)

    Liu, Chang; He, Yongyi; Zhang, Bodong

    2018-04-01

    According to the testing requirements of the transmission control unit, a brushless DC motor emulating system is designed based on motor simulation and power hardware-in-the-loop. A discrete motor model is established, and a real-time numerical method is designed to solve for the motor states. The motor emulator interacts directly with the power stage of the transmission control unit using a power-efficient circuit topology and is compatible with sensorless control. Experiments on a laboratory prototype verify that the system can emulate the real motor currents and voltages, whether the motor is starting up or suddenly loaded.
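
    The "discrete motor model solved in real time" can be illustrated with a minimal sketch: a simplified DC-equivalent motor model advanced one forward-Euler step per control period. The model structure and every parameter value below are assumptions for demonstration, not the paper's BLDC emulator design.

```python
# Simplified DC-equivalent motor model, one Euler step per control period.
# All parameter values are illustrative assumptions.
R, L = 0.5, 1e-3      # winding resistance [ohm], inductance [H]
KE = KT = 0.05        # back-EMF constant [V s/rad] / torque constant [N m/A]
J, BF = 1e-3, 1e-4    # rotor inertia [kg m^2], viscous friction [N m s/rad]
DT = 1e-4             # solver step [s], one update per control period

def step(i, w, v, t_load=0.0):
    """Forward-Euler update of phase current i [A] and speed w [rad/s]."""
    di = (v - R * i - KE * w) / L          # electrical equation
    dw = (KT * i - BF * w - t_load) / J    # mechanical equation
    return i + DT * di, w + DT * dw

# Start-up transient at a constant 12 V input, 2 s of emulated time
i = w = 0.0
for _ in range(20000):
    i, w = step(i, w, 12.0)
```

    A real-time emulator of this kind would, at each period, also drive power amplifiers so that the measured terminal currents and voltages track the model states.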

  3. Memristor emulator causes dissimilarity on a coupled memristive systems

    NASA Astrophysics Data System (ADS)

    Sabarathinam, S.; Prasad, Awadhesh

    2018-04-01

    The memristor is known as the basic fourth passive solid-state circuit element. It is gaining increasing attention for creating the next generation of electronic devices and is commonly used as a fundamental chaotic circuit element, although often with arbitrary (typically piecewise-linear or cubic) flux-charge characteristics. In the present work, the effects of the memristor emulator are studied in a coupled memristive chaotic oscillator for the first time. We confirm that the emulator allows synchronization between the oscillators and causes dissimilarity between the systems as the coupling strength and the coefficient of the memristor emulator increase. A detailed statistical analysis was performed to confirm this phenomenon.

  4. Positive Behavior Support and Applied Behavior Analysis

    PubMed Central

    Johnston, J.M; Foxx, Richard M; Jacobson, John W; Green, Gina; Mulick, James A

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We also consider the features of PBS that have facilitated its broad dissemination and how ABA might benefit from emulating certain practices of the PBS movement. PMID:22478452

  5. System on a Chip Real-Time Emulation (SOCRE)

    DTIC Science & Technology

    2006-09-01

    …emulation platform included LDPC decoders, A/V and radio applications… Port BEE flow to Emulation Platforms, SOC Technologies… Once the design has been described within Simulink, the designer runs the BEE design flow within Matlab using the bee_xps interface.

  6. Emotions Targeting Moral Exemplarity: Making Sense of the Logical Geography of Admiration, Emulation and Elevation

    ERIC Educational Resources Information Center

    Kristjánsson, Kristján

    2017-01-01

    Despite renewed interest in moral role-modelling and its emotional underpinnings, further conceptual work is needed on the logical geography of the emotions purportedly driving it, in particular, admiration, emulation and elevation. In this article, I explore admiration (as understood by Linda Zagzebski), emulation (as understood by Aristotle) and…

  7. Organic-based molecular switches for molecular electronics.

    PubMed

    Fuentes, Noelia; Martín-Lasanta, Ana; Alvarez de Cienfuegos, Luis; Ribagorda, Maria; Parra, Andres; Cuerva, Juan M

    2011-10-05

    In a general sense, molecular electronics (ME) is the branch of nanotechnology that studies the application of molecular building blocks to the fabrication of electronic components. Among the different types of molecules, organic compounds have been revealed as promising candidates for ME due to their easy access, great structural diversity, and suitable electronic and mechanical properties. Thanks to these useful capabilities, organic molecules have been used to emulate electronic devices at the nanoscopic scale. In this feature article, we present the diverse strategies used to develop organic switches for ME, with special attention to non-volatile systems.

  8. Intelligent flight control systems

    NASA Technical Reports Server (NTRS)

    Stengel, Robert F.

    1993-01-01

    The capabilities of flight control systems can be enhanced by designing them to emulate functions of natural intelligence. Intelligent control functions fall in three categories. Declarative actions involve decision-making, providing models for system monitoring, goal planning, and system/scenario identification. Procedural actions concern skilled behavior and have parallels in guidance, navigation, and adaptation. Reflexive actions are spontaneous, inner-loop responses for control and estimation. Intelligent flight control systems learn knowledge of the aircraft and its mission and adapt to changes in the flight environment. Cognitive models form an efficient basis for integrating 'outer-loop/inner-loop' control functions and for developing robust parallel-processing algorithms.

  9. IBM NJE protocol emulator for VAX/VMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engert, D.E.

    1981-01-01

    Communications software has been written at Argonne National Laboratory to enable a VAX/VMS system to participate as an end-node in a standard IBM network by emulating the Network Job Entry (NJE) protocol. NJE is actually a collection of programs that support job networking for the operating systems used on most large IBM-compatible computers (e.g., VM/370, MVS with JES2 or JES3, SVS, MVT with ASP or HASP). Files received by the VAX can be printed or saved in user-selected disk files. Files sent to the network can be routed to any node in the network for printing, punching, or job submission, as well as to a VM/370 user's virtual reader. Files sent from the VAX are queued and transmitted asynchronously to allow users to perform other work while files are awaiting transmission. No changes are required to the IBM software.

  10. A cross-validation package driving Netica with python

    USGS Publications Warehouse

    Fienen, Michael N.; Plant, Nathaniel G.

    2014-01-01

    Bayesian networks (BNs) are powerful tools for probabilistically simulating natural systems and emulating process models. Cross-validation is a technique to avoid the overfitting that results from overly complex BNs; overfitting reduces predictive skill. Cross-validation for BNs is known but rarely implemented, due partly to a lack of software tools designed to work with available BN packages. CVNetica is open source, written in Python, and extends the Netica software package to perform cross-validation and to read, rebuild, and learn BNs from data. Insights gained from cross-validation, and its implications for prediction versus description, are illustrated with two examples: a data-driven oceanographic application and a model-emulation application. These examples show that overfitting occurs when BNs become more complex than the supporting data allow, and that overfitting incurs computational costs as well as a reduction in prediction skill. CVNetica evaluates overfitting using several complexity metrics (we used level of discretization) and their impact on performance metrics (we used skill).
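
    The overfitting effect that CVNetica is built to expose can be demonstrated with plain k-fold cross-validation on a toy problem (polynomial fits rather than Bayesian networks, since the Netica-specific machinery is not reproduced here):

```python
import numpy as np

# Truly linear data plus noise: a complex model can only fit the noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 60)
y = 2.0 * x + rng.normal(0.0, 0.3, 60)

def cv_mse(degree, k=5):
    """Mean squared prediction error over k held-out folds."""
    idx = np.arange(len(x))
    errs = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        coef = np.polyfit(x[train], y[train], degree)
        errs.append(np.mean((np.polyval(coef, x[fold]) - y[fold]) ** 2))
    return float(np.mean(errs))

simple, complex_ = cv_mse(1), cv_mse(8)   # degree 1 vs. overfit degree 8
```

    Here the degree-1 model's cross-validated error sits near the noise variance, while the degree-8 model pays a penalty for fitting noise, which is exactly the complexity-versus-skill trade-off the abstract describes.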

  11. Case-based reasoning emulation of persons for wheelchair navigation.

    PubMed

    Peula, Jose Manuel; Urdiales, Cristina; Herrero, Ignacio; Fernandez-Carmona, Manuel; Sandoval, Francisco

    2012-10-01

    Testing is a key stage in system development, particularly for systems such as a wheelchair whose final user is typically a disabled person. These systems have stringent safety requirements, demanding extensive testing with many different individuals. Ideally, the wheelchair would be tested by many different end users, as each disability affects driving skills in a different way. Unfortunately, from a practical point of view it is difficult to engage end users as beta testers. Hence, testing often relies on simulations. Naturally, these simulations need to be as realistic as possible to make the system robust and safe before real tests can be accomplished. This work presents a tool to automatically test wheelchairs through realistic emulation of different wheelchair users. Our approach is based on extracting meaningful data from real users driving a power wheelchair autonomously. These data are then used to train a case-based reasoning (CBR) system that captures the specifics of the driver via learning. The resulting case base is then used to emulate the driving behavior of that specific person in more complex situations or when a new assistive algorithm needs to be tested. CBR returns the user's motion commands appropriate for each specific situation, adding the human component to shared control systems. The proposed system has been used to emulate several power wheelchair users presenting different disabilities. Data to create this emulation were obtained from previous wheelchair navigation experiments with 35 volunteer in-patients presenting different degrees of disability. CBR was trained with a limited number of scenarios for each volunteer.
    Results proved that: (i) emulated and real users returned similar paths in the same scenario (maximum and mean path deviations equal to 23 cm and 10 cm, respectively) and similar efficiency; (ii) the generality of our approach was established using a new path not present in the training traces; (iii) the emulated user is more realistic - path and efficiency are less homogeneous and smooth - than potential field approaches; and (iv) the system adequately emulates in-patients with specific disabilities (apraxia and dementia) - maximum and mean path deviations approximately 19 cm and 8.3 cm, with similar efficiencies - obtaining different behaviors during emulation for each of the in-patients, as expected. The proposed system adequately emulates the driving behavior of people with different disabilities in indoor scenarios. This approach is suitable for emulating real users' driving behaviors in the early testing stages of assistive navigation systems.
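
    The retrieval step of such a CBR emulator can be sketched as nearest-case lookup. Everything below, including the feature choice (heading error to the goal, distance to the nearest obstacle) and the stored cases, is a hypothetical illustration rather than the authors' implementation:

```python
import numpy as np

class CBRDriver:
    """Toy case-based reasoning driver emulator. Each case pairs a situation
    feature vector with the motion command (v, w) the real driver issued."""
    def __init__(self):
        self.features, self.commands = [], []

    def learn(self, feature, command):
        self.features.append(np.asarray(feature, float))
        self.commands.append(np.asarray(command, float))

    def emulate(self, feature, k=3):
        """Distance-weighted command from the k most similar stored cases."""
        F = np.stack(self.features)
        d = np.linalg.norm(F - np.asarray(feature, float), axis=1)
        near = np.argsort(d)[:k]
        w = 1.0 / (d[near] + 1e-9)
        C = np.stack(self.commands)[near]
        return (w[:, None] * C).sum(axis=0) / w.sum()

driver = CBRDriver()
driver.learn([0.0, 2.0], [0.50, 0.0])    # clear path ahead: go straight
driver.learn([0.8, 2.0], [0.30, 0.4])    # goal to the left: turn left
driver.learn([-0.8, 2.0], [0.30, -0.4])  # goal to the right: turn right
driver.learn([0.0, 0.3], [0.05, 0.0])    # obstacle close: crawl forward
cmd = driver.emulate([0.05, 1.9], k=1)   # near the "clear path" case
```

    Per-user emulation then amounts to populating the case base from one volunteer's recorded traces, so queries reproduce that person's driving style.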

  12. Performance tests of a power-electronics converter for multi-megawatt wind turbines using a grid emulator

    NASA Astrophysics Data System (ADS)

    Rizqy Averous, Nurhan; Berthold, Anica; Schneider, Alexander; Schwimmbeck, Franz; Monti, Antonello; De Doncker, Rik W.

    2016-09-01

    The vast increase in the contribution of wind turbines (WT) to modern electrical grids has led to the development of grid connection requirements. In contrast to the conventional test method, testing power-electronics converters for WT using a grid emulator at the Center for Wind Power Drives (CWD), RWTH Aachen University, offers more flexibility in conducting test scenarios. Further analysis of the performance of the device under test (DUT) is, however, required when testing with a grid emulator, since the characteristics of the grid emulator might influence the performance of the DUT. This paper focuses on the performance analysis of the DUT when tested using a grid emulator. Besides the issue of current harmonics, the performance during Fault Ride-Through (FRT) is discussed in detail. A power hardware-in-the-loop setup is an attractive solution for conducting a comprehensive study of the interaction between power-electronics converters and electrical grids.

  13. THE MIRA–TITAN UNIVERSE: PRECISION PREDICTIONS FOR DARK ENERGY SURVEYS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heitmann, Katrin; Habib, Salman; Biswas, Rahul

    2016-04-01

    Large-scale simulations of cosmic structure formation play an important role in interpreting cosmological observations at high precision. The simulations must cover a parameter range beyond the standard six cosmological parameters and need to be run at high mass and force resolution. A key simulation-based task is the generation of accurate theoretical predictions for observables using a finite number of simulation runs, via the method of emulation. Using a new sampling technique, we explore an eight-dimensional parameter space including massive neutrinos and a variable equation of state of dark energy. We construct trial emulators using two surrogate models (the linear power spectrum and an approximate halo mass function). The new sampling method allows us to build precision emulators from just 26 cosmological models and to systematically increase the emulator accuracy by adding new sets of simulations in a prescribed way. Emulator fidelity can now be continuously improved as new observational data sets become available and higher accuracy is required. Finally, using one ΛCDM cosmology as an example, we study the demands imposed on a simulation campaign to achieve the required statistics and accuracy when building emulators for investigations of dark energy.
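
    The emulation step itself can be sketched with a minimal Gaussian-process surrogate: fit to a handful of expensive "simulation" runs, then predict the observable at unsampled inputs. A one-dimensional toy simulator and a hand-picked kernel length scale stand in here for the cosmology code and its eight-dimensional parameter space:

```python
import numpy as np

def rbf(x1, x2, ell=0.3, sf=1.0):
    """Squared-exponential covariance between 1-D input arrays."""
    return sf**2 * np.exp(-0.5 * (x1[:, None] - x2[None, :]) ** 2 / ell**2)

simulator = lambda x: np.sin(3.0 * x)   # stand-in for an expensive code

X = np.linspace(0.0, 1.0, 8)            # eight training "runs"
y = simulator(X)

K = rbf(X, X) + 1e-8 * np.eye(len(X))   # jitter for numerical stability
alpha = np.linalg.solve(K, y)           # precomputed weights K^{-1} y

def emulate(xs):
    """GP posterior-mean prediction at new inputs."""
    return rbf(np.asarray(xs, float), X) @ alpha

pred = emulate([0.55])[0]               # held-out point between runs
```

    Adding a new batch of runs corresponds to appending rows to X and refitting, which is how emulator fidelity is improved incrementally as the abstract describes.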

  14. The Mira–Titan Universe: Precision predictions for dark energy surveys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heitmann, Katrin; Bingham, Derek; Lawrence, Earl

    2016-03-28

    Large-scale simulations of cosmic structure formation play an important role in interpreting cosmological observations at high precision. The simulations must cover a parameter range beyond the standard six cosmological parameters and need to be run at high mass and force resolution. A key simulation-based task is the generation of accurate theoretical predictions for observables using a finite number of simulation runs, via the method of emulation. Using a new sampling technique, we explore an eight-dimensional parameter space including massive neutrinos and a variable equation of state of dark energy. We construct trial emulators using two surrogate models (the linear power spectrum and an approximate halo mass function). The new sampling method allows us to build precision emulators from just 26 cosmological models and to systematically increase the emulator accuracy by adding new sets of simulations in a prescribed way. Emulator fidelity can now be continuously improved as new observational data sets become available and higher accuracy is required. Finally, using one ΛCDM cosmology as an example, we study the demands imposed on a simulation campaign to achieve the required statistics and accuracy when building emulators for investigations of dark energy.

  15. SimSchool: An Opportunity for Using Serious Gaming for Training Teachers in Rural Areas

    ERIC Educational Resources Information Center

    Tyler-Wood, Tandra; Estes, Mary; Christensen, Rhonda; Knezek, Gerald; Gibson, David

    2015-01-01

    This article examines the use of simSchool as a training tool for educators working with students with special needs in rural districts. SimSchool is a game that emulates a classroom utilizing a virtual environment. The theory supporting simSchool is explored and current research associated with simSchool is reviewed. The issues surrounding…

  16. Transformed Telepresence and Its Association with Learning in Computer-Supported Collaborative Learning: A Case Study in English Learning and Its Evaluation

    ERIC Educational Resources Information Center

    Ting, Yu-Liang; Tai, Yaming; Chen, Jun-Horng

    2017-01-01

    Telepresence has been playing an important role in a mediated learning environment. However, the current design of telepresence seems to be dominated by the emulation of physical human presence. With reference to social constructivism learning and the recognition of individuals as intelligent entities, this study explored the transformation of…

  17. An EXPRESS Rack Overview and Support for Microgravity Research on the International Space Station (ISS)

    NASA Technical Reports Server (NTRS)

    Pelfrey, Joseph J.; Jordan, Lee P.

    2008-01-01

    The EXpedite the PRocessing of Experiments to Space Station (EXPRESS) Rack System has provided accommodations and facilitated operations for microgravity-based research payloads for over 6 years on the International Space Station (ISS). The EXPRESS Rack accepts Space Shuttle middeck-type lockers and International Subrack Interface Standard (ISIS) drawers, providing a modular-type interface on the ISS. The EXPRESS Rack provides 28 Vdc power, Ethernet and RS-422 data interfaces, thermal conditioning, vacuum exhaust, and a nitrogen supply for payload use. The EXPRESS Rack system also includes payload checkout capability with a flight rack or flight rack emulator prior to launch, providing a high degree of confidence in successful operations once on-orbit. In addition, EXPRESS trainer racks are provided to support crew training on both rack systems and subrack operations. Standard hardware and software interfaces provided by the EXPRESS Rack simplify the integration processes for ISS payload development. The EXPRESS Rack is designed to accommodate multidiscipline research, allowing for the independent operation of each subrack payload within a single rack. On-orbit operations began for the EXPRESS Rack Project on April 24, 2001, with one rack operating continuously to support high-priority payloads. The other on-orbit EXPRESS Racks operate based on payload need and resource availability. Over 50 multi-discipline payloads have now been supported on-orbit by the EXPRESS Rack Program. Sustaining engineering, logistics, and maintenance functions are in place to maintain hardware and operations and to provide software upgrades. Additional EXPRESS Racks are planned for launch prior to ISS completion in support of long-term operations and the planned transition of the U.S. Segment to a National Laboratory.

  18. Network Modeling and Simulation Environment (NEMSE)

    DTIC Science & Technology

    2012-07-01

    The NEMSE program investigated complex emulation techniques and selected compatible emulation techniques for all OSI network stack layers. Other… EMULAB; 2) Completed the selection of compatible emulation techniques that allow working with all layers of the Open System Interconnect (OSI)… The elements table, Figure 3, reconciles the various elements of NEMSE against the OSI stack and other functions.

  19. Generic Software for Emulating Multiprocessor Architectures.

    DTIC Science & Technology

    1985-05-01

    AD-A157 662: Generic Software for Emulating Multiprocessor Architectures. MIT Laboratory for Computer Science, 545 Technology Square, Cambridge, MA 02139. Keywords: computer architecture, emulation, simulation, dataflow.

  20. Propulsion Electric Grid Simulator (PEGS) for Future Turboelectric Distributed Propulsion Aircraft

    NASA Technical Reports Server (NTRS)

    Choi, Benjamin B.; Morrison, Carlos; Dever, Timothy; Brown, Gerald V.

    2014-01-01

    NASA Glenn Research Center, in collaboration with the aerospace industry and academia, has begun the development of technology for a future hybrid-wing body electric airplane with a turboelectric distributed propulsion (TeDP) system. It is essential to design a subscale system to emulate the TeDP power grid, which would enable rapid analysis and demonstration of the proof-of-concept of the TeDP electrical system. This paper describes how small electrical machines with their controllers can emulate all the components in a TeDP power train. The whole system model in Matlab/Simulink was first developed and tested in simulation, and the simulation results showed that system dynamic characteristics could be implemented by using the closed-loop control of the electric motor drive systems. Then we designed a subscale experimental system to emulate the entire power system from the turbine engine to the propulsive fans. Firstly, we built a system to emulate a gas turbine engine driving a generator, consisting of two permanent magnet (PM) motors with brushless motor drives, coupled by a shaft. We programmed the first motor and its drive to mimic the speed-torque characteristic of the gas turbine engine, while the second motor and drive act as a generator and produce a torque load on the first motor. Secondly, we built another system of two PM motors and drives to emulate a motor driving a propulsive fan. We programmed the first motor and drive to emulate a wound-rotor synchronous motor. The propulsive fan was emulated by implementing fan maps and flight conditions into the fourth motor and drive, which produce a torque load on the driving motor. The stator of each PM motor is designed to travel axially to change the coupling between rotor and stator. This feature allows the PM motor to more closely emulate a wound-rotor synchronous machine. 
These techniques can convert the plain motor system into a unique TeDP power grid emulator that enables real-time simulation performance using hardware-in-the-loop (HIL).
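
    The speed-torque-map idea at the heart of the turbine emulation can be sketched as follows: command the turbine-emulating motor drive with torque drawn from a stored map, so the shaft "feels" like a gas turbine while the coupled machine applies a load torque. The map values, shaft inertia, and time step below are invented for illustration and are not NASA's numbers.

```python
import numpy as np

def turbine_torque(speed_rpm):
    """Hypothetical turbine map: torque tapers off as speed rises [N m]."""
    return np.interp(speed_rpm, [0.0, 5000.0, 10000.0], [40.0, 30.0, 5.0])

J, DT = 0.05, 1e-3          # shaft inertia [kg m^2], drive update period [s]

def step(speed_rpm, load_torque):
    """One emulation period: integrate shaft speed under the net torque."""
    w = speed_rpm * 2.0 * np.pi / 60.0
    w += DT * (turbine_torque(speed_rpm) - load_torque) / J
    return w * 60.0 / (2.0 * np.pi)

speed = 0.0
for _ in range(20000):       # 20 s of emulated run-up against a 25 N m load
    speed = step(speed, 25.0)
```

    The shaft settles where the map torque balances the load, just as the emulated turbine-generator pair settles at the operating point implied by its characteristic.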

  1. Emulation of simulations of atmospheric dispersion at Fukushima for Sobol' sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Girard, Sylvain; Korsakissok, Irène; Mallet, Vivien

    2015-04-01

    Polyphemus/Polair3D, from which IRSN's operational model ldX is derived, was used to simulate the atmospheric dispersion of radionuclides at the Japan scale after the Fukushima disaster. A previous study with the screening method of Morris had shown that: (i) the sensitivities depend strongly on the considered output; (ii) only a few of the inputs are non-influential on all considered outputs; and (iii) most influential inputs have either non-linear effects or are interacting. These preliminary results called for a more detailed sensitivity analysis, especially regarding the characterization of interactions. The method of Sobol' allows for a precise evaluation of interactions but requires large simulation samples. Gaussian process emulators for each considered output were built in order to relieve this computational burden. Globally aggregated outputs proved easy to emulate with high accuracy, and the associated Sobol' indices are in broad agreement with previous results obtained with the Morris method. More localized outputs, such as temporal averages of gamma dose rates at measurement stations, resulted in poorer emulator performance: test simulations could not be satisfactorily reproduced by some emulators. These outputs are of special interest because they can be compared to available observations, for instance for calibration purposes. A thorough inspection of prediction residuals hinted that the model response to wind perturbations often falls into distinct regimes relative to certain thresholds. Complementing the initial sample with wind perturbations set to the extreme values allowed for appreciable improvement of some of the emulators, while others remained too unreliable to be used in a sensitivity analysis. Adaptive sampling or regime-wise emulation could be tried to circumvent this issue. Sobol' indices for local outputs revealed interesting patterns, mostly dominated by the winds, with very high interactions. The emulators will be useful for subsequent studies: our goal is to characterize the model output uncertainty, but too little information is available about input uncertainties. Hence, calibration of the input distributions against observations with a Bayesian approach seems necessary. This would probably involve methods such as MCMC, which would be intractable without emulators.
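    The Sobol' first-order indices discussed above can be estimated with a pick-freeze (Saltelli-style) scheme. The sketch below uses a toy stand-in for the trained emulator; the test function, input ranges, and sample size are our own assumptions, not the dispersion model:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def emulator(X):
        # Toy stand-in for a trained Gaussian-process emulator: strong effect
        # of the second input, nearly no effect of the third (illustrative).
        return np.sin(X[:, 0]) + 0.7 * X[:, 1] ** 2 + 0.05 * X[:, 2]

    def first_order_sobol(f, dim, n):
        """Pick-freeze (Saltelli) estimate of first-order Sobol' indices."""
        A = rng.uniform(-np.pi, np.pi, (n, dim))
        B = rng.uniform(-np.pi, np.pi, (n, dim))
        yA, yB = f(A), f(B)
        var = np.concatenate([yA, yB]).var()
        S = np.empty(dim)
        for i in range(dim):
            AB = A.copy()
            AB[:, i] = B[:, i]                 # freeze all inputs except x_i
            S[i] = np.mean(yB * (f(AB) - yA)) / var
        return S

    S = first_order_sobol(emulator, 3, 50_000)
    print(np.round(S, 2))  # second input dominates; third is nearly non-influential
    ```

    Because each index requires only emulator evaluations, the large sample the Sobol' method demands becomes affordable, which is precisely the motivation for building the emulators.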

  2. Control System Applicable Use Assessment of the Secure Computing Corporation - Secure Firewall (Sidewinder)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hadley, Mark D.; Clements, Samuel L.

    2009-01-01

    Battelle’s National Security & Defense objective is “applying unmatched expertise and unique facilities to deliver homeland security solutions. From detection and protection against weapons of mass destruction to emergency preparedness/response and protection of critical infrastructure, we are working with industry and government to integrate policy, operational, technological, and logistical parameters that will secure a safe future”. Meeting this mission requires ongoing engagements with industry intended to improve the operational and technical attributes of commercial solutions related to national security initiatives. Such engagements ensure that commercial entities consider critical infrastructure protection in their development, design, and deployment lifecycles, aligning identified deficiencies with the improvements needed to support national cyber security initiatives. The Secure Firewall (Sidewinder) appliance by Secure Computing was assessed for applicable use in critical infrastructure control system environments, such as electric power, nuclear, and other facilities containing critical systems that require augmented protection from cyber threat. The testing was performed in the Pacific Northwest National Laboratory’s (PNNL) Electric Infrastructure Operations Center (EIOC). The Secure Firewall was tested in a network configuration that emulates a typical control center network and then evaluated. A number of observations and recommendations are included in this report relating to features currently included in the Secure Firewall that support critical infrastructure security needs.

  3. Copy of Using Emulation and Simulation to Understand the Large-Scale Behavior of the Internet.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adalsteinsson, Helgi; Armstrong, Robert C.; Chiang, Ken

    2008-10-01

    We report on the work done in the late-start LDRD "Using Emulation and Simulation to Understand the Large-Scale Behavior of the Internet." We describe the creation of a research platform that emulates many thousands of machines to be used for the study of large-scale internet behavior. We describe a proof-of-concept simple attack we performed in this environment. We describe the successful capture of a Storm bot and, from the study of the bot and further literature search, establish large-scale aspects we seek to understand via emulation of Storm on our research platform in possible follow-on work. Finally, we discuss possible future work.

  4. FPGA Based Reconfigurable ATM Switch Test Bed

    NASA Technical Reports Server (NTRS)

    Chu, Pong P.; Jones, Robert E.

    1998-01-01

    Various issues associated with "FPGA Based Reconfigurable ATM Switch Test Bed" are presented in viewgraph form. Specific topics include: 1) Network performance evaluation; 2) traditional approaches; 3) software simulation; 4) hardware emulation; 5) test bed highlights; 6) design environment; 7) test bed architecture; 8) abstract shared-memory switch; 9) detailed switch diagram; 10) traffic generator; 11) data collection circuit and user interface; 12) initial results; and 13) the following conclusions: advances in FPGAs make hardware emulation feasible for performance evaluation; hardware emulation can provide several orders of magnitude speed-up over software simulation; and, due to the complexity of the hardware synthesis process, development in emulation is much more difficult than simulation and requires knowledge of both networks and digital design.

  5. Modeling of a latent fault detector in a digital system

    NASA Technical Reports Server (NTRS)

    Nagel, P. M.

    1978-01-01

    Methods of modeling the detection time or latency period of a hardware fault in a digital system are proposed that explain how a computer detects faults in a computational mode. The objectives were to study how software reacts to a fault, to account for as many variables as possible affecting detection, and to forecast a given program's detecting ability prior to computation. A series of experiments was conducted on a small emulated microprocessor with fault injection capability. Results indicate that the detecting capability of a program largely depends on the instruction subset used during computation and the frequency of its use, and has little direct dependence on such variables as fault mode, number set, degree of branching, and program length. A model is discussed which employs an analogy with balls in an urn to explain the rate at which subsequent repetitions of an instruction or instruction set detect a given fault.
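    One simple reading of the balls-in-an-urn analogy: if a fraction p of executed instructions can expose the injected fault, detection latency is geometric in the number of executions. A minimal Monte Carlo sketch, with p chosen purely for illustration:

    ```python
    import random

    random.seed(1)

    # Urn reading of the model: each executed instruction is a draw from an
    # urn in which a fraction p of the 'balls' expose the injected fault.
    p = 0.02                 # assumed fraction of fault-sensitive instructions
    trials = 100_000
    latencies = []
    for _ in range(trials):
        k = 1
        while random.random() > p:   # draw until a fault-sensitive instruction runs
            k += 1
        latencies.append(k)

    mean_latency = sum(latencies) / trials
    print(round(mean_latency, 1))    # close to 1/p = 50 executions
    ```

    This matches the abstract's finding that latency is driven by the instruction subset in use (which sets p) rather than by the fault mode itself.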

  6. Software-implemented fault insertion: An FTMP example

    NASA Technical Reports Server (NTRS)

    Czeck, Edward W.; Siewiorek, Daniel P.; Segall, Zary Z.

    1987-01-01

    This report presents a model for fault insertion through software; describes its implementation on a fault-tolerant computer, FTMP; presents a summary of fault detection, identification, and reconfiguration data collected with software-implemented fault insertion; and compares the results to hardware fault insertion data. Experimental results show detection time to be a function of time of insertion and system workload. For fault detection time, there is no correlation between software-inserted faults and hardware-inserted faults; this is because hardware-inserted faults must manifest as errors before detection, whereas software-inserted faults immediately exercise the error-detection mechanisms. In summary, software-implemented fault insertion can be used as an evaluation technique for the fault-handling capabilities of a system in fault detection, identification, and recovery. Although software-inserted faults do not map directly to hardware-inserted faults, experiments show software-implemented fault insertion is capable of emulating hardware fault insertion, with greater ease and automation.
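    Software-implemented fault insertion of this kind can be sketched in miniature: flip one bit of a stored word and check whether an error-detection mechanism fires immediately, as the abstract notes software-inserted faults do. The bit-flip helper and the checksum detector below are illustrative assumptions, not FTMP's actual mechanisms:

    ```python
    import struct

    def insert_fault(value: float, bit: int) -> float:
        """Software fault insertion: flip one bit of a 64-bit float, emulating
        a transient hardware fault in a register or memory word."""
        (raw,) = struct.unpack("<Q", struct.pack("<d", value))
        (out,) = struct.unpack("<d", struct.pack("<Q", raw ^ (1 << bit)))
        return out

    def checksum_detects(xs, faulty_xs, tol=1e-9):
        """A trivial error-detection mechanism: recompute and compare a checksum."""
        return abs(sum(xs) - sum(faulty_xs)) > tol

    data = [1.0, 2.0, 3.0]
    faulty = data.copy()
    faulty[1] = insert_fault(faulty[1], bit=52)   # flip the low exponent bit
    print(checksum_detects(data, faulty))         # the fault manifests immediately
    ```

    Unlike a hardware fault, which must first propagate into an erroneous value, the software-inserted corruption is visible to the detector on the very next check, which is the asymmetry the report measures.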

  7. Statistical evaluation of metal fill widths for emulated metal fill in parasitic extraction methodology

    NASA Astrophysics Data System (ADS)

    J-Me, Teh; Noh, Norlaili Mohd.; Aziz, Zalina Abdul

    2015-05-01

    In the chip industry today, the key goal of a chip development organization is to develop and market chips within a short time frame to gain a foothold on market share. This paper proposes a design flow around parasitic extraction to improve design cycle time. The proposed flow relies on metal fill emulation, as opposed to the current flow, which performs metal fill insertion directly. Replacing metal fill structures with an emulation methodology in earlier iterations of the design flow is targeted to reduce runtime in the fill insertion stage. A statistical design-of-experiments methodology using the randomized complete block design was used to select an appropriate emulated metal fill width to improve emulation accuracy. The experiment was conducted on test cases of different sizes, ranging from 1000 gates to 21000 gates. The metal width was varied from 1x to 6x the minimum metal width. Two-way analysis of variance and Fisher's least significant difference test were used to analyze the interconnect net capacitance values of the different test cases. This paper presents the results of the statistical analysis for the 45 nm process technology. The recommended emulated metal fill width was found to be 4x the minimum metal width.
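    A randomized complete block design F-test of the kind used here can be sketched with synthetic data. The block and treatment counts, effect sizes, and noise levels below are invented for illustration and are not the paper's measurements:

    ```python
    import numpy as np

    # RCBD, as in the abstract: blocks = test cases, treatments = emulated
    # fill widths. y[b, t] = capacitance response for block b, treatment t.
    rng = np.random.default_rng(2)
    blocks, treatments = 6, 4
    effect = np.array([0.0, 0.3, 0.9, 0.5])          # assumed treatment effects
    y = (rng.normal(0, 0.1, (blocks, treatments))    # within-cell noise
         + effect                                    # treatment (width) effect
         + rng.normal(0, 0.2, (blocks, 1)))          # block (test-case) effect

    grand = y.mean()
    ss_treat = blocks * ((y.mean(axis=0) - grand) ** 2).sum()
    ss_block = treatments * ((y.mean(axis=1) - grand) ** 2).sum()
    ss_error = ((y - grand) ** 2).sum() - ss_treat - ss_block

    df_treat = treatments - 1
    df_error = (blocks - 1) * (treatments - 1)
    F = (ss_treat / df_treat) / (ss_error / df_error)
    print(F > 3.29)   # exceeds the F(3, 15) critical value at alpha = 0.05
    ```

    Blocking on test case removes the case-to-case variation from the error term, so the F statistic isolates the fill-width effect, which is what lets the study recommend a single width across designs of very different sizes.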

  8. Boosting flood warning schemes with fast emulator of detailed hydrodynamic models

    NASA Astrophysics Data System (ADS)

    Bellos, V.; Carbajal, J. P.; Leitao, J. P.

    2017-12-01

    Floods are among the most destructive catastrophic events, and their frequency has increased over the last decades. To reduce flood impact and risk, flood warning schemes are installed in flood-prone areas. Frequently, these schemes are based on numerical models which quickly provide predictions of water levels and other relevant observables. However, the high complexity of flood wave propagation in the real world, and the need for accurate predictions in urban environments or floodplains, hinder the use of detailed simulators. This frames the difficulty: we need fast predictions that meet the accuracy requirements. Most physics-based detailed simulators, although accurate, cannot meet the speed demand even when High Performance Computing techniques are used, since the required simulation times are on the order of minutes to hours. As a consequence, most flood warning schemes are based on coarse ad-hoc approximations that cannot take advantage of a detailed hydrodynamic simulation. In this work, we present a methodology for developing a flood warning scheme using a Gaussian-process-based emulator of a detailed hydrodynamic model. The methodology consists of two main stages: 1) an offline stage to build the emulator; and 2) an online stage using the emulator to predict and generate warnings. The offline stage consists of the following steps: a) definition of the critical sites of the area under study, and specification of the observables to predict at those sites, e.g. water depth, flow velocity, etc.; b) generation of a detailed simulation dataset to train the emulator; and c) calibration of the required parameters (if measurements are available). The online stage uses the emulator to predict the relevant observables quickly, while the detailed simulator runs in parallel to verify key predictions of the emulator. The speed gain given by the emulator also allows uncertainty in predictions to be quantified using ensemble methods. The methodology is applied in a real-world scenario.
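    The offline/online split can be sketched as follows. The stand-in "detailed simulator", the kernel smoother used in place of a Gaussian process, and the warning threshold are all illustrative assumptions:

    ```python
    import numpy as np

    # --- Offline stage: run the 'detailed simulator' (a stand-in function
    # here) on a training design and fit a cheap emulator of peak depth.
    def detailed_simulator(rain_mm):        # placeholder for the hydrodynamic model
        return 0.02 * rain_mm ** 1.3        # peak water depth (m) at one site

    X = np.linspace(0, 100, 25)             # design over rainfall intensities
    Y = detailed_simulator(X)

    def emulator(x, X=X, Y=Y, h=8.0):
        """Kernel smoother standing in for the Gaussian-process emulator."""
        w = np.exp(-((x - X) ** 2) / (2 * h * h))
        return float((w @ Y) / w.sum())

    # --- Online stage: predict quickly and warn above a depth threshold.
    THRESHOLD = 1.0                         # metres, assumed warning level
    forecast_rain = 85.0
    depth = emulator(forecast_rain)
    print(depth > THRESHOLD)                # issue the warning?
    ```

    Once trained, the emulator answers in microseconds, so an ensemble of perturbed forecasts can be evaluated for uncertainty while the detailed simulator verifies only the key predictions in parallel.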

  9. Linking big models to big data: efficient ecosystem model calibration through Bayesian model emulation

    NASA Astrophysics Data System (ADS)

    Fer, I.; Kelly, R.; Andrews, T.; Dietze, M.; Richardson, A. D.

    2016-12-01

    Our ability to forecast ecosystems is limited by how well we parameterize ecosystem models. Direct measurements for all model parameters are not always possible, and inverse estimation of these parameters through Bayesian methods is computationally costly. A solution to the computational challenges of Bayesian calibration is to approximate the posterior probability surface using a Gaussian process that emulates the complex process-based model. Here we report the integration of this method within an ecoinformatics toolbox, the Predictive Ecosystem Analyzer (PEcAn), and its application with two ecosystem models: SIPNET and ED2.1. SIPNET is a simple model, allowing application of MCMC methods both to the model itself and to its emulator. We used both approaches to assimilate flux (CO2 and latent heat), soil respiration, and soil carbon data from Bartlett Experimental Forest. This comparison showed that the emulator is reliable in terms of convergence to the posterior distribution. A 10000-iteration MCMC analysis with SIPNET itself required more than two orders of magnitude greater computation time than an MCMC run of the same length with its emulator; this difference would be greater for a more computationally demanding model. Validation of the emulator-calibrated SIPNET against both the assimilated data and out-of-sample data showed improved fit and reduced uncertainty around model predictions. We next applied the validated emulator method to ED2, whose complexity precludes standard Bayesian data assimilation. We used the ED2 emulator to assimilate demographic data from a network of inventory plots. For validation of the calibrated ED2, we compared the model to results from the Empirical Succession Mapping (ESM), a novel synthesis of successional patterns in Forest Inventory and Analysis data. Our results revealed that while the pre-assimilation ED2 formulation cannot capture the emergent demographic patterns from the ESM analysis, constraining the model parameters controlling demographic processes considerably increased their agreement.
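    The core trick (run MCMC against a cheap emulated posterior instead of the expensive model) can be sketched as below. The Gaussian stand-in for the emulated log-posterior is an assumption for illustration, not either ecosystem model:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    def emulated_log_posterior(theta):
        # Stand-in for a GP emulator of model-data mismatch: a Gaussian
        # posterior centred on theta = 1.5 (illustrative only).
        return -0.5 * ((theta - 1.5) / 0.3) ** 2

    def metropolis(logp, n, step=0.4, theta0=0.0):
        """Random-walk Metropolis driven by the cheap emulator, not the model."""
        chain = np.empty(n)
        theta, lp = theta0, logp(theta0)
        for i in range(n):
            prop = theta + step * rng.normal()
            lp_prop = logp(prop)
            if np.log(rng.random()) < lp_prop - lp:   # accept/reject
                theta, lp = prop, lp_prop
            chain[i] = theta
        return chain

    chain = metropolis(emulated_log_posterior, 20_000)[5_000:]  # drop burn-in
    print(round(chain.mean(), 1))  # near the posterior mean of 1.5
    ```

    Each MCMC step costs one emulator call rather than one model run, which is where the two-orders-of-magnitude speed-up reported above comes from.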

  10. Nice and Kind, Smart and Funny: What Children Like and Want to Emulate in Their Teachers

    ERIC Educational Resources Information Center

    Hutchings, Merryn; Carrington, Bruce; Francis, Becky; Skelton, Christine; Read, Barbara; Hall, Ian

    2008-01-01

    In many western countries, government statements about the need to recruit more men to primary teaching are frequently supported by references to the importance of male teachers as role models for boys. The suggestion is that boys will both achieve better and behave better when taught by male teachers, because they will identify with them and want…

  11. Comprehensive Software Simulation on Ground Power Supply for Launch Pads and Processing Facilities at NASA Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Dominguez, Jesus A.; Victor, Elias; Vasquez, Angel L.; Urbina, Alfredo R.

    2017-01-01

    A multi-threaded software application has been developed in-house by the Ground Special Power (GSP) team at NASA Kennedy Space Center (KSC) to separately simulate and fully emulate all units that supply VDC power and battery-based power backup to multiple KSC launch ground support systems for the NASA Space Launch System (SLS) rocket.

  12. Distributed Emulation in Support of Large Networks

    DTIC Science & Technology

    2016-06-01

    ...in an emulation environment, modifications to a network, protocol, or model can be executed – and the effects measured – without affecting real-world users or services... produce their results when analyzing performance of Long Term Evolution (LTE) gateways [3]. Many research scenarios allow problems to be represented...

  13. Observability analysis of DVL/PS aided INS for a maneuvering AUV.

    PubMed

    Klein, Itzik; Diamant, Roee

    2015-10-22

    Recently, ocean exploration has increased considerably through the use of autonomous underwater vehicles (AUVs). A key enabling technology is the precision of the AUV navigation capability. In this paper, we focus on understanding the limitations of the AUV navigation system; that is, what are the observable error-states for different maneuvering types of the AUV? Since analyzing the performance of an underwater navigation system is highly complex, current approaches answer the above question using simulations. This, of course, limits the conclusions to the emulated type of vehicle used and to the simulation setup. For this reason, we take a different approach and analyze the system observability for different types of vehicle dynamics by finding the set of observable and unobservable states. To that end, we apply the observability Gramian approach, previously used only for terrestrial applications. We demonstrate our analysis for an underwater inertial navigation system aided by a Doppler velocity logger or by a pressure sensor. The result is a first prediction of the performance of an AUV standing still, rotating in place, and turning at a constant speed. Our conclusions regarding the observable and unobservable navigation error states for different dynamics are supported by extensive numerical simulation.

  15. Analysis of Eye-Tracking Data with Regards to the Complexity of Flight Deck Information Automation and Management - Inattentional Blindness, System State Awareness, and EFB Usage

    NASA Technical Reports Server (NTRS)

    Dill, Evan T.; Young, Steven D.

    2015-01-01

    In the constant drive to further the safety and efficiency of air travel, the complexity of avionics-related systems, and the procedures for interacting with these systems, appear to be on an ever-increasing trend. While this growing complexity often yields productive results with respect to system capabilities and flight efficiency, it can place a larger burden on pilots to manage increasing amounts of information and to understand intricate system designs. Evidence supporting this observation is becoming widespread, yet has been largely anecdotal or the result of subjective analysis. One way to gain more insight into this issue is through experimentation using more objective measures or indicators. This study utilizes and analyzes eye-tracking data obtained during a high-fidelity flight simulation study wherein many of the complexities of current flight decks, as well as those planned for the next generation air transportation system (NextGen), were emulated. The following paper presents the findings of this study with a focus on electronic flight bag (EFB) usage, system state awareness (SSA) and events involving suspected inattentional blindness (IB).

  16. Artificial Intelligence and Semantics through the Prism of Structural, Post-Structural and Transcendental Approaches.

    PubMed

    Gasparyan, Diana

    2016-12-01

    There is a problem associated with contemporary studies of philosophy of mind, which focuses on the identification and convergence of human and machine intelligence. This is the problem of machine emulation of sense. In the present study, analysis of this problem is carried out based on concepts from structural and post-structural approaches that have been almost entirely overlooked by contemporary philosophy of mind. If we refer to the basic definitions of "sign" and "meaning" found in structuralism and post-structuralism, we see a fundamental difference between the capabilities of a machine and the human brain engaged in the processing of a sign. This research will exemplify and provide additional evidence to support distinctions between syntactic and semantic aspects of intelligence, an issue widely discussed by scholars of contemporary philosophy of mind. The research will demonstrate that some aspects of the ideas proposed in relation to semantics and semiosis in structuralism and post-structuralism are similar to those we find in contemporary analytical studies related to the theory and philosophy of artificial intelligence. The concluding part of the paper offers an interpretation of the problem of the formalization of sense, connected to its metaphysical (transcendental) properties.

  17. High-emulation mask recognition with high-resolution hyperspectral video capture system

    NASA Astrophysics Data System (ADS)

    Feng, Jiao; Fang, Xiaojing; Li, Shoufeng; Wang, Yongjin

    2014-11-01

    We present a method for distinguishing a human face from a high-emulation mask, which is increasingly used by criminals for activities such as stealing card numbers and passwords at ATMs. Traditional facial recognition techniques have difficulty detecting such camouflaged criminals. In this paper, we use a high-resolution hyperspectral video capture system to detect high-emulation masks. An RGB camera is used for traditional facial recognition. A prism and a grayscale camera are used to capture spectral information of the observed face. Experiments show that masks made of silica gel have different spectral reflectance compared with human skin. As a multispectral image offers additional spectral information about physical characteristics, a high-emulation mask can be easily recognized.
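    The reflectance comparison described can be sketched with a spectral-angle test between the observed spectrum and reference signatures. The five-band signatures below are invented for illustration, not measured data, and the spectral angle mapper is a standard technique we substitute here, not necessarily the paper's classifier:

    ```python
    import numpy as np

    def spectral_angle(a, b):
        """Angle between two reflectance spectra (spectral angle mapper)."""
        cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return float(np.arccos(np.clip(cos, -1.0, 1.0)))

    # Illustrative 5-band reflectance signatures (not measured data):
    skin = np.array([0.35, 0.45, 0.60, 0.70, 0.65])
    silicone_mask = np.array([0.50, 0.52, 0.55, 0.56, 0.55])  # flatter spectrum
    observed = np.array([0.34, 0.46, 0.59, 0.71, 0.66])

    is_real_face = (spectral_angle(observed, skin)
                    < spectral_angle(observed, silicone_mask))
    print(is_real_face)  # observed spectrum matches skin more closely
    ```

    Because the angle ignores overall brightness, the test keys on spectral shape, which is exactly the property that separates silica gel from skin even when the mask fools an RGB camera.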

  18. Statistical Emulator for Expensive Classification Simulators

    NASA Technical Reports Server (NTRS)

    Ross, Jerret; Samareh, Jamshid A.

    2016-01-01

    Expensive simulators prevent any kind of meaningful analysis from being performed on the phenomena they model. To get around this problem, the concept of using a statistical emulator as a surrogate representation of the simulator was introduced in the 1980s. Presently, simulators have become more and more complex, and as a result running a single example on these simulators is very expensive and can take days, weeks, or even months. Many new techniques, termed criteria, have been introduced which sequentially select the next best (most informative to the emulator) point that should be run on the simulator. These criteria methods allow for the creation of an emulator with only a small number of simulator runs. We follow and extend this framework to expensive classification simulators.
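    One common criterion for classification emulators is uncertainty sampling: query the expensive simulator where the current emulator is least certain. The simulator stand-in, the logistic-regression emulator, and all sizes below are our own assumptions, sketching the framework rather than the paper's criterion:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def expensive_classifier(X):
        """Stand-in for the expensive simulator (class boundary x1 + x2 = 0)."""
        return (X.sum(axis=-1) > 0.0).astype(float)

    def fit_logistic(X, y, iters=2000, lr=0.5):
        """Tiny logistic-regression emulator fit by gradient descent."""
        Xb = np.hstack([X, np.ones((len(X), 1))])       # add bias column
        w = np.zeros(Xb.shape[1])
        for _ in range(iters):
            p = 1.0 / (1.0 + np.exp(-np.clip(Xb @ w, -30, 30)))
            w -= lr * Xb.T @ (p - y) / len(y)
        def predict(Xq):
            Xq = np.hstack([Xq, np.ones((len(Xq), 1))])
            return 1.0 / (1.0 + np.exp(-np.clip(Xq @ w, -30, 30)))
        return predict

    X = rng.uniform(-1, 1, (8, 2))                      # initial design
    y = expensive_classifier(X)
    candidates = rng.uniform(-1, 1, (500, 2))
    for _ in range(10):                                 # ten sequential queries
        predict = fit_logistic(X, y)
        pick = np.argmin(np.abs(predict(candidates) - 0.5))  # most uncertain
        X = np.vstack([X, candidates[pick]])
        y = np.append(y, expensive_classifier(candidates[pick:pick + 1]))

    predict = fit_logistic(X, y)
    accuracy = np.mean((predict(candidates) > 0.5) == expensive_classifier(candidates))
    print(accuracy)
    ```

    The criterion concentrates the handful of simulator runs near the decision boundary, which is what lets a usable emulator emerge from so few evaluations.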

  19. Nanoelectronic programmable synapses based on phase change materials for brain-inspired computing.

    PubMed

    Kuzum, Duygu; Jeyasingh, Rakesh G D; Lee, Byoungil; Wong, H-S Philip

    2012-05-09

    Brain-inspired computing is an emerging field, which aims to extend the capabilities of information technology beyond digital logic. A compact nanoscale device, emulating biological synapses, is needed as the building block for brain-like computational systems. Here, we report a new nanoscale electronic synapse based on technologically mature phase change materials employed in optical data storage and nonvolatile memory applications. We utilize continuous resistance transitions in phase change materials to mimic the analog nature of biological synapses, enabling the implementation of a synaptic learning rule. We demonstrate different forms of spike-timing-dependent plasticity using the same nanoscale synapse with picojoule level energy consumption.
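    Spike-timing-dependent plasticity of the kind demonstrated is commonly expressed as an exponential learning window. The constants below are generic illustrative values, not the measured response of the phase-change synapse:

    ```python
    import math

    # Exponential STDP window: potentiate when the presynaptic spike precedes
    # the postsynaptic one (causal), depress otherwise (anti-causal).
    A_PLUS, A_MINUS = 0.05, 0.025   # illustrative learning rates
    TAU = 20.0                      # ms, assumed time constant

    def stdp_dw(dt_ms):
        """Weight change for a spike interval dt = t_post - t_pre."""
        if dt_ms >= 0:
            return A_PLUS * math.exp(-dt_ms / TAU)
        return -A_MINUS * math.exp(dt_ms / TAU)

    print(round(stdp_dw(10.0), 4))    # potentiation for causal timing
    print(round(stdp_dw(-10.0), 4))   # depression for anti-causal timing
    ```

    In a phase-change implementation, the positive branch maps to partial crystallization (lower resistance) and the negative branch to partial amorphization (higher resistance), giving the analog weight updates described above.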

  20. A floating-point/multiple-precision processor for airborne applications

    NASA Technical Reports Server (NTRS)

    Yee, R.

    1982-01-01

    A compact input output (I/O) numerical processor capable of performing floating-point, multiple precision and other arithmetic functions at execution times which are at least 100 times faster than comparable software emulation is described. The I/O device is a microcomputer system containing a 16 bit microprocessor, a numerical coprocessor with eight 80 bit registers running at a 5 MHz clock rate, 18K random access memory (RAM) and 16K electrically programmable read only memory (EPROM). The processor acts as an intelligent slave to the host computer and can be programmed in high order languages such as FORTRAN and PL/M-86.

  1. Final Progress Report: Isotope Identification Algorithm for Rapid and Accurate Determination of Radioisotopes Feasibility Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rawool-Sullivan, Mohini; Bounds, John Alan; Brumby, Steven P.

    2012-04-30

    This is the final report of the project titled 'Isotope Identification Algorithm for Rapid and Accurate Determination of Radioisotopes,' PMIS project number LA10-HUMANID-PD03. It summarizes work performed over the FY10 time period. The goal of the work was to demonstrate principles of emulating a human analysis approach towards the data collected using radiation isotope identification devices (RIIDs). Human analysts begin analyzing a spectrum based on features in the spectrum: lines and shapes that are present in a given spectrum. The proposed work was to carry out a feasibility study that picks out all gamma-ray peaks and other features such as Compton edges, bremsstrahlung, the presence or absence of shielding, and the presence of neutrons and escape peaks. Ultimate success of this feasibility study will allow us to collectively explain identified features and form a realistic scenario that produced a given spectrum. We wanted to develop and demonstrate machine learning algorithms that will qualitatively enhance the automated identification capabilities of portable radiological sensors that are currently being used in the field.

  2. Introduction to multiprotocol over ATM (MPOA)

    NASA Astrophysics Data System (ADS)

    Fredette, Andre N.

    1997-10-01

    Multiprotocol over ATM (MPOA) is a new protocol specified by the ATM Forum. MPOA provides a framework for effectively synthesizing bridging and routing with ATM in an environment of diverse protocols and network technologies. The primary goal of MPOA is the efficient transfer of inter-subnet unicast data in a LAN Emulation (LANE) environment. MPOA integrates LANE and the next hop resolution protocol (NHRP) to preserve the benefits of LAN Emulation, while allowing inter-subnet, internetwork layer protocol communication over ATM VCCs without requiring routers in the data path. It reduces latency and the internetwork layer forwarding load on backbone routers by enabling direct connectivity between ATM-attached edge devices (i.e., shortcuts). To establish these shortcuts, MPOA uses both routing and bridging information to locate the edge device closest to the addressed end station. By integrating LANE and NHRP, MPOA allows the physical separation of internetwork layer route calculation and forwarding, a technique known as virtual routing. This separation provides a number of key benefits including enhanced manageability and reduced complexity of internetwork layer capable edge devices. This paper provides an overview of MPOA that summarizes the goals, architecture, and key attributes of the protocol. In presenting this overview, the salient attributes of LANE and NHRP are described as well.

  3. A Weather Radar Simulator for the Evaluation of Polarimetric Phased Array Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Byrd, Andrew D.; Ivic, Igor R.; Palmer, Robert D.

    A radar simulator capable of generating time series data for a polarimetric phased array weather radar has been designed and implemented. The received signals are composed from a high-resolution numerical weather prediction model. Thousands of scattering centers, each with an independent, randomly generated Doppler spectrum, populate the field of view of the radar. The moments of the scattering center spectra are derived from the numerical weather model, and the scattering center positions are updated based on the three-dimensional wind field. In order to accurately emulate the effects of system-induced cross-polar contamination, the array is modeled using a complete set of dual-polarization radiation patterns. The simulator offers reconfigurable element patterns and positions as well as access to independent time series data for each element, resulting in easy implementation of any beamforming method. It also allows for arbitrary waveform designs and is able to model the effects of quantization on waveform performance. Simultaneous, alternating, quasi-simultaneous, and pulse-to-pulse phase-coded modes of polarimetric signal transmission have been implemented. This framework allows for realistic emulation of the effects of cross-polar fields on weather observations, as well as the evaluation of possible techniques for the mitigation of those effects.
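    Generating a time series with a prescribed Gaussian Doppler spectrum, as the simulator does for each scattering center, can be sketched by spectral shaping of white noise. The PRT, mean Doppler, and spectrum width below are illustrative assumptions, and the shaping/pulse-pair approach is a standard technique we substitute here, not necessarily the paper's exact method:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # I/Q time series for one scattering center with a Gaussian Doppler
    # spectrum, via spectral shaping of complex white noise.
    N, PRT = 4096, 1e-3                  # samples, pulse repetition time (s)
    f_d, sigma_f = 100.0, 30.0           # assumed mean Doppler / width (Hz)

    f = np.fft.fftfreq(N, PRT)
    spectrum = np.exp(-((f - f_d) ** 2) / (2 * sigma_f ** 2))
    noise = rng.normal(size=N) + 1j * rng.normal(size=N)
    iq = np.fft.ifft(np.sqrt(spectrum) * noise)

    # Pulse-pair estimate of mean Doppler from the lag-1 autocorrelation.
    R1 = np.mean(iq[1:] * np.conj(iq[:-1]))
    f_est = np.angle(R1) / (2 * np.pi * PRT)
    print(round(f_est))  # close to the 100 Hz mean Doppler set above
    ```

    Summing thousands of such independently shaped series, weighted by the model-derived moments, yields received signals whose estimated moments reproduce the numerical weather model, which is what makes the emulated data useful for beamforming and cross-polar studies.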

  4. Simulating the dynamic interaction of a robotic arm and the Space Shuttle remote manipulator system. M.S. Thesis - George Washington Univ., Dec. 1994

    NASA Technical Reports Server (NTRS)

    Garrahan, Steven L.; Tolson, Robert H.; Williams, Robert L., II

    1995-01-01

    Industrial robots are usually attached to a rigid base. Placing the robot on a compliant base introduces dynamic coupling between the two systems. The Vehicle Emulation System (VES) is a six DOF platform that is capable of modeling this interaction. The VES employs a force-torque sensor as the interface between robot and base. A computer simulation of the VES is presented. Each of the hardware and software components is described and Simulink is used as the programming environment. The simulation performance is compared with experimental results to validate accuracy. A second simulation which models the dynamic interaction of a robot and a flexible base acts as a comparison to the simulated motion of the VES. Results are presented that compare the simulated VES motion with the motion of the VES hardware using the same admittance model. The two computer simulations are compared to determine how well the VES is expected to emulate the desired motion. Simulation results are given for robots mounted to the end effector of the Space Shuttle Remote Manipulator System (SRMS). It is shown that for fast motions of the two robots studied, the SRMS experiences disturbances on the order of centimeters. Larger disturbances are possible if different manipulators are used.

  5. Wealth distribution across communities of adaptive financial agents

    NASA Astrophysics Data System (ADS)

    DeLellis, Pietro; Garofalo, Franco; Lo Iudice, Francesco; Napoletano, Elena

    2015-08-01

    This paper studies the trading volumes and wealth distribution of a novel agent-based model of an artificial financial market. In this model, heterogeneous agents, behaving according to the Von Neumann and Morgenstern utility theory, may mutually interact. A Tobin-like tax (TT) on successful investments and a flat tax are compared to assess the effects on the agents’ wealth distribution. We carry out extensive numerical simulations in two alternative scenarios: (i) a reference scenario, where the agents keep their utility function fixed, and (ii) a focal scenario, where the agents are adaptive and self-organize in communities, emulating their neighbours by updating their own utility function. Specifically, the interactions among the agents are modelled through a directed scale-free network to account for the presence of community leaders, and the herding-like effect is tested against the reference scenario. We observe that our model is capable of replicating the benefits and drawbacks of the two taxation systems and that the interactions among the agents strongly affect the wealth distribution across the communities. Remarkably, the communities benefit from the presence of leaders with successful trading strategies, and are more likely to increase their average wealth. Moreover, this emulation mechanism mitigates the decrease in trading volumes, which is a typical drawback of TTs.
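    The neighbour-emulation mechanism can be illustrated with a toy update rule. Everything below is hypothetical: the paper's agents update full Von Neumann and Morgenstern utility functions over a scale-free network, whereas this sketch reduces each utility to a single parameter:

```python
def imitation_step(wealth, risk_aversion, in_neighbors, rate=0.5):
    """One synchronous update: each agent nudges its utility parameter
    toward that of its wealthiest in-neighbor, provided that neighbor
    is wealthier than itself (a toy herding rule)."""
    new = list(risk_aversion)
    for i, nbrs in enumerate(in_neighbors):
        if not nbrs:
            continue  # this agent observes no one (e.g., a leader)
        best = max(nbrs, key=lambda j: wealth[j])
        if wealth[best] > wealth[i]:
            new[i] += rate * (risk_aversion[best] - risk_aversion[i])
    return new

# Agent 1 plays the community leader: wealthiest, observed by 0 and 2
wealth = [1.0, 5.0, 3.0]
risk = [0.2, 0.8, 0.5]
updated = imitation_step(wealth, risk, in_neighbors=[[1, 2], [], [0, 1]])
```

    Iterating such a rule pulls the followers' parameters toward the leader's, which is the herding-like effect the focal scenario tests against the fixed-utility reference scenario.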

  6. A Prosthetic Foot Emulator to Optimize Prescription of Prosthetic Feet in Veterans and Service Members with Leg Amputations

    DTIC Science & Technology

    2017-09-01

    AWARD NUMBER: W81XWH-16-1-0569 TITLE: A Prosthetic Foot Emulator to Optimize Prescription of Prosthetic Feet in Veterans and Service Members with Leg Amputations

  7. Satellite Communication Hardware Emulation System (SCHES)

    NASA Technical Reports Server (NTRS)

    Kaplan, Ted

    1993-01-01

    Satellite Communication Hardware Emulator System (SCHES) is a powerful simulator that emulates the hardware used in TDRSS links. SCHES is a true bit-by-bit simulator that models communications hardware accurately enough to be used as a verification mechanism for actual hardware tests on user spacecraft. As a credit to its modular design, SCHES is easily configurable to model any user satellite communication link, though some development may be required to tailor existing software to user specific hardware.

  8. Statistical emulation of landslide-induced tsunamis at the Rockall Bank, NE Atlantic

    PubMed Central

    Guillas, S.; Georgiopoulou, A.; Dias, F.

    2017-01-01

    Statistical methods constitute a useful approach to understand and quantify the uncertainty that governs complex tsunami mechanisms. Numerical experiments may often have a high computational cost. This forms a limiting factor for performing uncertainty and sensitivity analyses, where numerous simulations are required. Statistical emulators, as surrogates of these simulators, can provide predictions of the physical process in a much faster and computationally inexpensive way. They can form a prominent solution to explore thousands of scenarios that would be otherwise numerically expensive and difficult to achieve. In this work, we build a statistical emulator of the deterministic codes used to simulate submarine sliding and tsunami generation at the Rockall Bank, NE Atlantic Ocean, in two stages. First we calibrate, against observations of the landslide deposits, the parameters used in the landslide simulations. This calibration is performed under a Bayesian framework using Gaussian Process (GP) emulators to approximate the landslide model, and the discrepancy function between model and observations. Distributions of the calibrated input parameters are obtained as a result of the calibration. In a second step, a GP emulator is built to mimic the coupled landslide-tsunami numerical process. The emulator propagates the uncertainties in the distributions of the calibrated input parameters inferred from the first step to the outputs. As a result, a quantification of the uncertainty of the maximum free surface elevation at specified locations is obtained. PMID:28484339
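    The two-stage approach rests on standard Gaussian Process regression machinery. Below is a minimal GP-emulator sketch with a cheap analytic stand-in for the expensive landslide-tsunami code; the kernel, its hyperparameters, and the toy simulator are all invented for illustration:

```python
import numpy as np

def rbf(x1, x2, length=0.5, var=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-8):
    """Zero-mean GP posterior mean and variance at the test inputs."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_test, x_train)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha
    cov = rbf(x_test, x_test) - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)

# Cheap analytic stand-in for the expensive deterministic code
def simulator(x):
    return np.sin(3.0 * x) + 0.5 * x

x_train = np.linspace(0.0, 2.0, 8)   # design points (the only "runs")
y_train = simulator(x_train)
x_test = np.array([0.5, 1.3])
mean, var = gp_predict(x_train, y_train, x_test)
```

    The posterior mean tracks the simulator between design points at negligible cost, which is what makes exploring thousands of scenarios feasible; the posterior variance is the emulator's own uncertainty, propagated in the second stage.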

  9. Uncertainty in modeled upper ocean heat content change

    NASA Astrophysics Data System (ADS)

    Tokmakian, Robin; Challenor, Peter

    2014-02-01

    This paper examines the uncertainty in the change in the heat content in the ocean component of a general circulation model. We describe the design and implementation of our statistical methodology. Using an ensemble of model runs and an emulator, we produce an estimate of the full probability distribution function (PDF) for the change in upper ocean heat in an Atmosphere/Ocean General Circulation Model, the Community Climate System Model v. 3, across a multi-dimensional input space. We show how the emulator of the GCM's heat content change and hence, the PDF, can be validated and how implausible outcomes from the emulator can be identified when compared to observational estimates of the metric. In addition, the paper describes how the emulator outcomes and related uncertainty information might inform estimates of the same metric from a multi-model Coupled Model Intercomparison Project phase 3 ensemble. We illustrate how to (1) construct an ensemble based on experiment design methods, (2) construct and evaluate an emulator for a particular metric of a complex model, (3) validate the emulator using observational estimates and explore the input space with respect to implausible outcomes and (4) contribute to the understanding of uncertainties within a multi-model ensemble. Finally, we estimate the most likely value for heat content change and its uncertainty for the model, with respect to both observations and the uncertainty in the value for the input parameters.

  10. Statistical emulation of landslide-induced tsunamis at the Rockall Bank, NE Atlantic.

    PubMed

    Salmanidou, D M; Guillas, S; Georgiopoulou, A; Dias, F

    2017-04-01

    Statistical methods constitute a useful approach to understand and quantify the uncertainty that governs complex tsunami mechanisms. Numerical experiments may often have a high computational cost. This forms a limiting factor for performing uncertainty and sensitivity analyses, where numerous simulations are required. Statistical emulators, as surrogates of these simulators, can provide predictions of the physical process in a much faster and computationally inexpensive way. They can form a prominent solution to explore thousands of scenarios that would be otherwise numerically expensive and difficult to achieve. In this work, we build a statistical emulator of the deterministic codes used to simulate submarine sliding and tsunami generation at the Rockall Bank, NE Atlantic Ocean, in two stages. First we calibrate, against observations of the landslide deposits, the parameters used in the landslide simulations. This calibration is performed under a Bayesian framework using Gaussian Process (GP) emulators to approximate the landslide model, and the discrepancy function between model and observations. Distributions of the calibrated input parameters are obtained as a result of the calibration. In a second step, a GP emulator is built to mimic the coupled landslide-tsunami numerical process. The emulator propagates the uncertainties in the distributions of the calibrated input parameters inferred from the first step to the outputs. As a result, a quantification of the uncertainty of the maximum free surface elevation at specified locations is obtained.

  11. Emulation applied to reliability analysis of reconfigurable, highly reliable, fault-tolerant computing systems

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques applied to the analysis of the reliability of highly reliable computer systems for future commercial aircraft are described. The lack of credible precision in reliability estimates obtained by analytical modeling techniques is first established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Next, the technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. Use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques. Finally an illustrative example is presented to demonstrate from actual use the promise of the proposed application of emulation.

  12. Space-Shuttle Emulator Software

    NASA Technical Reports Server (NTRS)

    Arnold, Scott; Askew, Bill; Barry, Matthew R.; Leigh, Agnes; Mermelstein, Scott; Owens, James; Payne, Dan; Pemble, Jim; Sollinger, John; Thompson, Hiram; hide

    2007-01-01

    A package of software has been developed to execute a raw binary image of the space shuttle flight software for simulation of the computational effects of operation of space shuttle avionics. This software can be run on inexpensive computer workstations. Heretofore, it was necessary to use real flight computers to perform such tests and simulations. The package includes a program that emulates the space shuttle orbiter general-purpose computer [consisting of a central processing unit (CPU), input/output processor (IOP), master sequence controller, and bus-control elements]; an emulator of the orbiter display electronics unit and models of the associated cathode-ray tubes, keyboards, and switch controls; computational models of the data-bus network; computational models of the multiplexer-demultiplexer components; an emulation of the pulse-code modulation master unit; an emulation of the payload data interleaver; a model of the master timing unit; a model of the mass memory unit; and a software component that ensures compatibility of telemetry and command services between the simulated space shuttle avionics and a mission control center. The software package is portable to several host platforms.

  13. Cinematic camera emulation using two-dimensional color transforms

    NASA Astrophysics Data System (ADS)

    McElvain, Jon S.; Gish, Walter

    2015-02-01

    For cinematic and episodic productions, on-set look management is an important component of the creative process, and involves iterative adjustments of the set, actors, lighting and camera configuration. Instead of using the professional cinematic camera to establish a particular look, the use of a smaller form factor DSLR is considered for this purpose due to its increased agility. Because the spectral response characteristics will be different between the two camera systems, a camera emulation transform is needed to approximate the behavior of the destination camera. Recently, two-dimensional transforms have been shown to provide high-accuracy conversion of raw camera signals to a defined colorimetric state. In this study, the same formalism is used for camera emulation, whereby a Canon 5D Mark III DSLR is used to approximate the behavior of a Red Epic cinematic camera. The spectral response characteristics for both cameras were measured and used to build 2D as well as 3x3 matrix emulation transforms. When tested on multispectral image databases, the 2D emulation transforms outperform their matrix counterparts, particularly for images containing highly chromatic content.
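    The 3x3 matrix baseline is typically fit by least squares over paired training patches. A sketch with made-up camera responses (the paper's measured spectral data are not reproduced here, so a known "true" mapping generates the destination signals):

```python
import numpy as np

# Hypothetical raw RGB responses of the two cameras to the same set of
# training patches (rows = patches); all numbers are invented.
src = np.array([[0.9, 0.2, 0.1],
                [0.3, 0.8, 0.2],
                [0.1, 0.3, 0.7],
                [0.5, 0.5, 0.5],
                [0.7, 0.6, 0.2]])
M_true = np.array([[1.10, -0.10, 0.00],
                   [0.05,  0.90, 0.05],
                   [0.00, -0.20, 1.20]])
dst = src @ M_true.T   # destination-camera responses to the same patches

# Least-squares fit of the 3x3 emulation matrix M, so that dst ~= src @ M.T
M = np.linalg.lstsq(src, dst, rcond=None)[0].T
emulated = src @ M.T
```

    With noise-free synthetic data the fit recovers the mapping exactly; on real cameras the residual of this matrix fit is what the 2D transforms reduce, particularly for highly chromatic patches.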

  14. Evaluating the sources of water to wells: Three techniques for metamodeling of a groundwater flow model

    USGS Publications Warehouse

    Fienen, Michael N.; Nolan, Bernard T.; Feinstein, Daniel T.

    2016-01-01

    For decision support, the insights and predictive power of numerical process models can be hampered by the insufficient expertise and computational resources required to evaluate system response to new stresses. An alternative is to emulate the process model with a statistical “metamodel.” Built on a dataset of collocated numerical model input and output, a groundwater flow model was emulated using a Bayesian network, an artificial neural network, and a gradient boosted regression tree. The response of interest was surface water depletion, expressed as the source of water to wells. The results have application for managing the allocation of groundwater. Each technique was tuned using cross validation and further evaluated using a held-out dataset. A numerical MODFLOW-USG model of the Lake Michigan Basin, USA, was used for the evaluation. The performance and interpretability of each technique were compared, pointing to the advantages of each. The metamodel can extend to unmodeled areas.

  15. NJE; VAX-VMS IBM NJE network protocol emulator. [DEC VAX11/780; VAX-11 FORTRAN 77 (99%) and MACRO-11 (1%)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engert, D.E.; Raffenetti, C.

    NJE is communications software developed to enable a VAX VMS system to participate as an end-node in a standard IBM network by emulating the Network Job Entry (NJE) protocol. NJE supports job networking for the operating systems used on most large IBM-compatible computers (e.g., VM/370, MVS with JES2 or JES3, SVS, MVT with ASP or HASP). Files received by the VAX can be printed or saved in user-selected disk files. Files sent to the network can be routed to any network node for printing, punching, or job submission, or to a VM/370 user's virtual reader. Files sent from the VAX are queued and transmitted asynchronously. No changes are required to the IBM software. DEC VAX11/780; VAX-11 FORTRAN 77 (99%) and MACRO-11 (1%); VMS 2.5; VAX11/780 with DUP-11 UNIBUS interface and 9600 baud synchronous modem.

  16. Profiling optimization for big data transfer over dedicated channels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yun, D.; Wu, Qishi; Rao, Nageswara S

    The transfer of big data is increasingly supported by dedicated channels in high-performance networks, where transport protocols play an important role in maximizing application-level throughput and link utilization. The performance of transport protocols largely depends on their control parameter settings, but it is prohibitively time consuming to conduct an exhaustive search in a large parameter space to find the best set of parameter values. We propose FastProf, a stochastic approximation-based transport profiler, to quickly determine the optimal operational zone of a given data transfer protocol/method over dedicated channels. We implement and test the proposed method using both emulations based on real-life performance measurements and experiments over physical connections with short (2 ms) and long (380 ms) delays. Both the emulation and experimental results show that FastProf significantly reduces the profiling overhead while achieving a comparable level of end-to-end throughput performance with the exhaustive search-based approach.
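    FastProf's exact algorithm is not detailed in this abstract; the classical finite-difference stochastic approximation scheme below (Kiefer-Wolfowitz) illustrates the general idea of converging on a good operating point from noisy throughput probes. The throughput model is invented for illustration:

```python
import random

def measure_throughput(block_size, rng):
    """Stand-in for one timed transfer at a given control-parameter
    value (Mb/s): concave in the parameter, with measurement noise.
    The shape and optimum (block_size = 8) are made up."""
    return 100.0 - 0.5 * (block_size - 8.0) ** 2 + rng.gauss(0.0, 1.0)

def profile(measure, x0, rng, iters=200):
    """Kiefer-Wolfowitz stochastic approximation: estimate the slope of
    the noisy throughput curve from two probes and climb it with
    decaying gains, using far fewer transfers than an exhaustive sweep."""
    x = x0
    for n in range(1, iters + 1):
        a = 2.0 / n            # gain sequence; sum diverges, squares converge
        c = n ** -0.25         # probe offset, shrinking more slowly
        grad = (measure(x + c, rng) - measure(x - c, rng)) / (2.0 * c)
        x += a * grad
    return x

best = profile(measure_throughput, x0=2.0, rng=random.Random(1))
```

    Each iteration costs two measured transfers, so 200 iterations replace a dense grid over the whole parameter space; the decaying gain damps the measurement noise while the estimate homes in on the optimum.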

  17. Parallel VLSI architecture emulation and the organization of APSA/MPP

    NASA Technical Reports Server (NTRS)

    Odonnell, John T.

    1987-01-01

    The Applicative Programming System Architecture (APSA) combines an applicative language interpreter with a novel parallel computer architecture that is well suited for Very Large Scale Integration (VLSI) implementation. The Massively Parallel Processor (MPP) can simulate VLSI circuits by allocating one processing element in its square array to an area on a square VLSI chip. As long as there are not too many long data paths, the MPP can simulate a VLSI clock cycle very rapidly. The APSA circuit contains a binary tree with a few long paths and many short ones. A skewed H-tree layout allows every processing element to simulate a leaf cell and up to four tree nodes, with no loss in parallelism. Emulation of a key APSA algorithm on the MPP resulted in performance 16,000 times faster than a Vax. This speed will make it possible for the APSA language interpreter to run fast enough to support research in parallel list processing algorithms.

  18. High fidelity wireless network evaluation for heterogeneous cognitive radio networks

    NASA Astrophysics Data System (ADS)

    Ding, Lei; Sagduyu, Yalin; Yackoski, Justin; Azimi-Sadjadi, Babak; Li, Jason; Levy, Renato; Melodia, Tammaso

    2012-06-01

    We present a high fidelity cognitive radio (CR) network emulation platform for wireless system tests, measurements, and validation. This versatile platform provides the configurable functionalities to control and repeat realistic physical channel effects in integrated space, air, and ground networks. We combine the advantages of a scalable simulation environment with reliable hardware performance for high fidelity and repeatable evaluation of heterogeneous CR networks. This approach extends CR design beyond the device (software-defined-radio) or lower-level protocol (dynamic spectrum access) level to end-to-end cognitive networking, and facilitates low-cost deployment, development, and experimentation of new wireless network protocols and applications on frequency-agile programmable radios. Going beyond the channel emulator paradigm for point-to-point communications, we can support simultaneous transmissions by network-level emulation that allows realistic physical-layer interactions between diverse user classes, including secondary users, primary users, and adversarial jammers in CR networks. In particular, we can replay field tests in a lab environment with real radios perceiving and learning the dynamic environment, thereby adapting for end-to-end goals over distributed spectrum coordination channels that replace the common control channel as a single point of failure. CR networks offer several dimensions of tunable actions, including channel, power, rate, and route selection. The proposed network evaluation platform is fully programmable and can reliably evaluate the necessary cross-layer design solutions with a configurable optimization space by leveraging the hardware experiments to represent the realistic effects of physical channel, topology, mobility, and jamming on spectrum agility, situational awareness, and network resiliency.
We also provide the flexibility to scale up the test environment by introducing virtual radios and establishing seamless signal-level interactions with real radios. This holistic wireless evaluation approach supports a large-scale, heterogeneous, and dynamic CR network architecture and allows developing cross-layer network protocols under high fidelity, repeatable, and scalable wireless test scenarios suitable for heterogeneous space, air, and ground networks.

  19. A stimulus-control account of regulated drug intake in rats.

    PubMed

    Panlilio, Leigh V; Thorndike, Eric B; Schindler, Charles W

    2008-02-01

    Patterns of drug self-administration are often highly regular, with a consistent pause after each self-injection. This pausing might occur because the animal has learned that additional injections are not reinforcing once the drug effect has reached a certain level, possibly due to the reinforcement system reaching full capacity. Thus, interoceptive effects of the drug might function as a discriminative stimulus, signaling when additional drug will be reinforcing and when it will not. This hypothetical stimulus control aspect of drug self-administration was emulated using a schedule of food reinforcement. Rats' nose-poke responses produced food only when a cue light was present. No drug was administered at any time. However, the state of the light stimulus was determined by calculating what the whole-body drug level would have been if each response in the session had produced a drug injection. The light was only presented while this virtual drug level was below a specific threshold. A range of doses of cocaine and remifentanil were emulated using parameters based on previous self-administration experiments. Response patterns were highly regular, dose-dependent, and remarkably similar to actual drug self-administration. This similarity suggests that the emulation schedule may provide a reasonable model of the contingencies inherent in drug reinforcement. Thus, these results support a stimulus control account of regulated drug intake in which rats learn to discriminate when the level of drug effect has fallen to a point where another self-injection will be reinforcing.
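    The emulation schedule's core computation, a virtual whole-body drug level that gates reinforcement, can be sketched as follows. The sketch assumes single-compartment exponential decay, and all constants are invented rather than the paper's fitted pharmacokinetic parameters:

```python
import math

def virtual_drug_session(response_times, dose, half_life, threshold):
    """Replay nose-poke times against a virtual pharmacokinetic level.

    Each reinforced response adds one virtual dose; the level decays
    exponentially between responses. A response is reinforced (cue
    light available) only while the level is below the threshold;
    no drug is ever delivered.
    """
    k = math.log(2.0) / half_life   # first-order elimination rate
    level, t_prev = 0.0, 0.0
    reinforced = []
    for t in response_times:
        level *= math.exp(-k * (t - t_prev))  # decay since last response
        t_prev = t
        if level < threshold:
            level += dose                     # virtual "injection"
            reinforced.append(t)
    return reinforced

# Responses every 10 s; dose 1.0, 60 s half-life, cue threshold 1.6
schedule = virtual_drug_session(list(range(0, 80, 10)),
                                dose=1.0, half_life=60.0, threshold=1.6)
```

    With these toy numbers the responses at 0, 10, 30 and 70 s are reinforced: the inter-reinforcement pauses lengthen as the virtual level builds, reproducing the regular post-injection pausing the abstract describes.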

  20. Space Link Extension (SLE) Emulation for High-Throughput Network Communication

    NASA Technical Reports Server (NTRS)

    Murawski, Robert W.; Tchorowski, Nicole; Golden, Bert

    2014-01-01

    As the data rate requirements for space communications increase, significant stress is placed not only on the wireless satellite communication links, but also on the ground networks which forward data from end-users to remote ground stations. These wide area network (WAN) connections add delay and jitter to the end-to-end satellite communication link, effects which can have significant impacts on the wireless communication link. It is imperative that any ground communication protocol can react to these effects such that the ground network does not become a bottleneck in the communication path to the satellite. In this paper, we present our SCENIC Emulation Lab testbed, which was developed to test the CCSDS SLE protocol implementations proposed for use on future NASA communication networks. Our results show that in the presence of realistic levels of network delay, high-throughput SLE communication links can experience significant data rate throttling. Based on our observations, we present some insight into why this data throttling happens, and trace the probable issue back to non-optimal blocking communication which is supported by the CCSDS SLE API recommended practices. These issues were reported to the SLE implementation developers who, based on our reports, developed a new release of SLE which we show fixes the SLE blocking issue and greatly improves the protocol throughput. In this paper, we also discuss future developments for our end-to-end emulation lab and how these improvements can be used to develop and test future space communication technologies.

  1. Stochastic simulation of power systems with integrated renewable and utility-scale storage resources

    NASA Astrophysics Data System (ADS)

    Degeilh, Yannick

    The push for a more sustainable electric supply has led various countries to adopt policies advocating the integration of renewable yet variable energy resources, such as wind and solar, into the grid. The challenges of integrating such time-varying, intermittent resources have in turn sparked a growing interest in the implementation of utility-scale energy storage resources (ESRs) with MW-week storage capability. Indeed, storage devices provide flexibility to facilitate the management of power system operations in the presence of uncertain, highly time-varying and intermittent renewable resources. The ability to exploit the potential synergies between renewable and ESRs hinges on developing appropriate models, methodologies, tools and policy initiatives. We report on the development of a comprehensive simulation methodology that provides the capability to quantify the impacts of integrated renewable and ESRs on the economics, reliability and emission variable effects of power systems operating in a market environment. We model the uncertainty in the demands, the available capacity of conventional generation resources and the time-varying, intermittent renewable resources, with their temporal and spatial correlations, as discrete-time random processes. We deploy models of the ESRs to emulate their scheduling and operations in the transmission-constrained hourly day-ahead markets. To this end, we formulate a scheduling optimization problem (SOP) whose solutions determine the operational schedule of the controllable ESRs in coordination with the demands and the conventional/renewable resources. As such, the SOP serves the dual purpose of emulating the clearing of the transmission-constrained day-ahead markets (DAMs) and scheduling the energy storage resource operations.
We also represent the need for system operators to impose stricter ramping requirements on the conventional generating units so as to maintain the system capability to perform "load following", i.e., respond to quick variations in the loads and renewable resource outputs in a manner that maintains the power balance, by incorporating appropriate ramping requirement constraints in the formulation of the SOP. The simulation approach makes use of Monte Carlo simulation techniques to represent the impacts of the sources of uncertainty on the side-by-side power system and market operations. As such, we systematically sample the "input" random processes, namely the buyer demands, renewable resource outputs and conventional generation resource available capacities, to generate the realizations, or sample paths, that we use in the emulation of the transmission-constrained day-ahead markets via the SOP. As a result, we obtain realizations of the market outcomes and storage resource operations that we can use to approximate their statistics. The approach not only has the capability to emulate the side-by-side power system and energy market operations with the explicit representation of the chronology of time-dependent phenomena, including storage cycles of charge/discharge, and constraints imposed by the transmission network in terms of deliverability of the energy, but also to provide the figures of merit for all metrics to assess the economics, reliability and the environmental impacts of the performance of those operations. Our efforts to address the implementational aspects of the methodology so as to ensure computational tractability for large-scale systems over longer periods include relaxing the SOP, the use of a "warm-start" technique as well as representative simulation periods, parallelization and variance reduction techniques. Our simulation approach is useful in power system planning, operations and investment analysis.
There is a broad range of applications of the simulation methodology to resource planning studies, production costing issues, investment analysis, transmission utilization, reliability analysis, environmental assessments, policy formulation and to answer quantitatively various what-if questions. We demonstrate the capabilities of the simulation approach by carrying out various studies on modified IEEE 118- and WECC 240-bus systems. The results of our representative case studies effectively illustrate the synergies between wind and ESRs. Our investigations clearly indicate that energy storage and wind resources tend to complement each other in the reduction of wholesale purchase payments in the DAMs and the improvement of system reliability. In addition, we observe that CO2 emission impacts with energy storage depend on the resource mix characteristics. An important finding is that storage seems to attenuate the "diminishing returns" associated with increased penetration of wind generation. Our studies also evidence the limited ability of integrated ESRs to enhance the wind resource capability to replace conventional resources from purely a system reliability perspective. Some useful insights into the siting of ESRs are obtained and they indicate the potentially significant impacts of such decisions on the network congestion patterns and, consequently, on the LMPs. Simulation results further indicate that the explicit representation of ramping requirements on the conventional units at the DAM level causes the expected total wholesale purchase payments to increase, thereby mitigating the benefits of wind integration. The stricter ramping requirements are also shown to impact the revenues of generators that do not even provide any ramp capability services.
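    The Monte Carlo layer of the methodology can be illustrated at toy scale: sample the uncertain wind output, clear a merit-order dispatch for each realization, and average the resulting market cost. The two-unit system and every number below are invented for illustration:

```python
import random

def dispatch_cost(net_load, cheap_cap=80.0, cheap_price=20.0,
                  dear_price=60.0):
    """Merit-order dispatch for one hour: the cheap unit runs first,
    a peaker covers whatever net load remains (toy system)."""
    cheap = min(max(net_load, 0.0), cheap_cap)
    dear = max(net_load - cheap_cap, 0.0)
    return cheap * cheap_price + dear * dear_price

def monte_carlo_cost(demand, wind_mean, wind_sd, n_samples, rng):
    """Expected hourly market cost, averaged over sampled wind
    realizations (the 'input' random process of the methodology)."""
    total = 0.0
    for _ in range(n_samples):
        wind = max(rng.gauss(wind_mean, wind_sd), 0.0)
        total += dispatch_cost(demand - wind)
    return total / n_samples
```

    Because the dispatch cost is convex in the net load, wind variability raises the expected cost above the cost at the mean wind output, a small-scale version of the effect the full SOP-based simulation quantifies with transmission constraints and storage included.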

  2. Strategic Communication in the System for Health

    DTIC Science & Technology

    2013-03-01

    have borne our share of real crises and even tragedies, every day our Soldiers and their families are protected from injuries, illnesses, and...combat wounds; receive state-of-the-art treatment when prevention fails; and are supported by extraordinarily talented people.”5 And yet, while LTG...design, it “Emulates, nests, and aligns with Army Strategic Planning Guidance (ASPG) Vision and Army Campaign Plan (ACP) end state: Prevent, Shape, Win

  3. A data driven approach using Takagi-Sugeno models for computationally efficient lumped floodplain modelling

    NASA Astrophysics Data System (ADS)

    Wolfs, Vincent; Willems, Patrick

    2013-10-01

    Many applications in support of water management decisions require hydrodynamic models with limited calculation time, including real time control of river flooding, uncertainty and sensitivity analyses by Monte-Carlo simulations, and long term simulations in support of the statistical analysis of the model simulation results (e.g. flood frequency analysis). Several computationally efficient hydrodynamic models exist, but little attention is given to the modelling of floodplains. This paper presents a methodology that can emulate output from a full hydrodynamic model by predicting one or several levels in a floodplain, together with the flow rate between river and floodplain. The overtopping of the embankment is modelled as an overflow at a weir. Adaptive neuro fuzzy inference systems (ANFIS) are exploited to cope with the varying factors affecting the flow. Different input sets and identification methods are considered in model construction. Because of the dual use of simplified physically based equations and data-driven techniques, the ANFIS consist of very few rules with a low number of input variables. A second calculation scheme can be followed for exceptionally large floods. The obtained nominal emulation model was tested for four floodplains along the river Dender in Belgium. Results show that the obtained models are accurate with low computational cost.
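    The embankment-overtopping component described above reduces to the standard weir equation Q = C * L * H^1.5 feeding a lumped storage cell. A sketch with invented geometry; for simplicity it models one-way flow only, with no return flow from floodplain to river:

```python
def weir_flow(river_level, crest, width, c=1.7):
    """Free overflow across the embankment, treated as a broad-crested
    weir: Q = C * L * H^1.5 in SI units (m, m3/s); C ~ 1.7 is a
    typical broad-crested coefficient."""
    head = max(river_level - crest, 0.0)
    return c * width * head ** 1.5

def simulate_floodplain(river_levels, crest, width, area, dt=3600.0):
    """Lumped floodplain: route the hourly weir inflow into a flat
    storage cell and report the resulting water depth each step."""
    volume, depths = 0.0, []
    for h in river_levels:
        volume += weir_flow(h, crest, width) * dt
        depths.append(volume / area)   # flat-cell water depth
    return depths

depths = simulate_floodplain([5.5, 5.2, 4.8],
                             crest=5.0, width=10.0, area=1e5)
```

    In the data-driven scheme of the paper, ANFIS models replace exactly this kind of physically based relation while staying close to its structure, which is why so few fuzzy rules suffice.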

  4. Knowing for Nursing Practice: Patterns of Knowledge and Their Emulation in Expert Systems

    PubMed Central

    Abraham, Ivo L.; Fitzpatrick, Joyce J.

    1987-01-01

    This paper addresses the issue of clinical knowledge in nursing, and the feasibility of emulating this knowledge into expert system technology. The perspective on patterns of knowing for nursing practice, advanced by Carper (1978), serves as point of departure. The four patterns of knowing -- empirics, esthetics, ethics, personal knowledge -- are evaluated as to the extent to which they can be emulated in clinical expert systems, given constraints imposed by the current technology of these systems.

  5. Recovering from "amnesia" brought about by radiation. Verification of the "Over the air" (OTA) application software update mechanism On-Board Solar Orbiter's Energetic Particle Detector

    NASA Astrophysics Data System (ADS)

    Da Silva, Antonio; Sánchez Prieto, Sebastián; Rodriguez Polo, Oscar; Parra Espada, Pablo

    Computer memories are not supposed to forget, but they do. Because of the proximity of the Sun, from the Solar Orbiter boot software perspective it is mandatory to look out for permanent memory errors resulting from single-event latch-up (SEL) failures in application binaries stored in EEPROM and its SDRAM deployment areas. In this situation, the last line of defense established by FDIR mechanisms is the capability of the boot software to provide an accurate report of the memory damage and to perform an application software update that avoids the harmed locations by flashing the EEPROM with a new binary. This paper describes the verification of the OTA EEPROM firmware update procedure of the boot software that will run in the Instrument Control Unit (ICU) of the Energetic Particle Detector (EPD) on-board Solar Orbiter. Since the maximum number of rewrites on a real EEPROM is limited and permanent memory faults cannot be conveniently emulated in real hardware, the verification has been accomplished by the use of a LEON2 Virtual Platform (Leon2ViP) with fault injection capabilities and real SpaceWire interfaces, developed by the Space Research Group (SRG) of the University of Alcalá. This way it is possible to run the exact same target binary software as if it were run on the real ICU platform. Furthermore, the use of this virtual hardware-in-the-loop (VHIL) approach makes it possible to communicate with Electrical Ground Support Equipment (EGSE) through real SpaceWire interfaces in an agile, controlled and deterministic environment.
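    The damage report at the heart of such a boot sequence boils down to a write/read-back scan. A toy sketch with a simulated memory containing stuck-at bits; the real scan targets EEPROM and SDRAM through the LEON2 platform, and everything below (class, masks, addresses) is invented:

```python
class FaultyMemory:
    """Simulated byte-addressable memory with stuck-at-zero bits at a
    few addresses: a software stand-in for latch-up-damaged cells."""

    def __init__(self, size, stuck):
        self.cells = [0] * size
        self.stuck = stuck  # addr -> AND-mask forcing stuck bits to 0

    def write(self, addr, value):
        self.cells[addr] = value & self.stuck.get(addr, 0xFF)

    def read(self, addr):
        return self.cells[addr]

def damage_report(mem, size, patterns=(0x55, 0xAA)):
    """Write/read-back scan with complementary bit patterns: report
    every address that cannot hold both, so a new binary can be
    flashed around the damaged locations."""
    bad = []
    for addr in range(size):
        for p in patterns:
            mem.write(addr, p)
            if mem.read(addr) != p:
                bad.append(addr)
                break
    return bad

mem = FaultyMemory(16, stuck={3: 0xFE, 9: 0x7F})
report = damage_report(mem, 16)
```

    The complementary patterns 0x55/0xAA exercise every bit in both states, so a single stuck bit at any position is caught; on a virtual platform the stuck masks can be injected at will, which is the point of verifying the procedure there rather than on flight hardware.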

  6. QoS support over ultrafast TDM optical networks

    NASA Astrophysics Data System (ADS)

    Narvaez, Paolo; Siu, Kai-Yeung; Finn, Steven G.

    1999-08-01

    HLAN is a promising architecture for realizing Tb/s access networks based on ultrafast optical TDM technologies. This paper presents new research results on efficient algorithms for the support of quality of service over the HLAN network architecture. In particular, we propose a new scheduling algorithm that emulates fair queuing in a distributed manner for bandwidth allocation purposes. The proposed scheduler collects information on the queue of each host on the network and then instructs each host how much data to send. Our new scheduling algorithm ensures full bandwidth utilization while guaranteeing fairness among all hosts.
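    The fair-share computation a scheduler of this kind performs can be illustrated with the classic max-min fairness procedure. The Python sketch below is illustrative only, not the authors' distributed algorithm: it allocates link capacity across per-host queue demands so that no host receives more than it asked for and unmet demand is shared as equally as possible.

```python
def max_min_fair_share(demands, capacity):
    """Max-min fair allocation of `capacity` among hosts.

    demands:  dict mapping host id -> requested bandwidth
    returns:  dict mapping host id -> allocated bandwidth
    """
    alloc = {h: 0.0 for h in demands}
    pending = dict(demands)          # unmet demand per host
    cap = float(capacity)
    while pending and cap > 1e-12:
        share = cap / len(pending)   # equal split of what is left
        # hosts whose whole demand fits within the equal share
        small = [h for h, d in pending.items() if d <= share]
        if small:
            # satisfy them fully and redistribute the leftover
            for h in small:
                alloc[h] += pending[h]
                cap -= pending.pop(h)
        else:
            # every remaining host is bottlenecked: split equally
            for h in pending:
                alloc[h] += share
            cap = 0.0
            pending.clear()
    return alloc
```

In a distributed emulation of fair queuing, the scheduler would run this computation on collected queue reports and send each host its allocation for the next cycle.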

  7. Fabrication and Testing of a Modular Micro-Pocket Fission Detector Instrumentation System for Test Nuclear Reactors

    NASA Astrophysics Data System (ADS)

    Reichenberger, Michael A.; Nichols, Daniel M.; Stevenson, Sarah R.; Swope, Tanner M.; Hilger, Caden W.; Roberts, Jeremy A.; Unruh, Troy C.; McGregor, Douglas S.

    2018-01-01

    Advancements in nuclear reactor core modeling and computational capability have encouraged further development of in-core neutron sensors. Measurement of the neutron-flux distribution within the reactor core provides a more complete understanding of the operating conditions in the reactor than typical ex-core sensors. Micro-Pocket Fission Detectors have been developed and tested previously but have been limited to single-node operation and have utilized highly specialized designs. The development of a widely deployable, multi-node Micro-Pocket Fission Detector assembly will enhance nuclear research capabilities. A modular, four-node Micro-Pocket Fission Detector array was designed, fabricated, and tested at Kansas State University. The array was constructed from materials that do not significantly perturb the neutron flux in the reactor core. All four sensor nodes were equally spaced axially in the array to span the fuel region of the reactor core. The array was filled with neon gas, serving as an ionization medium in the small cavities of the Micro-Pocket Fission Detectors. The modular design of the instrument facilitates the testing and deployment of numerous sensor arrays. The unified design drastically improved device ruggedness and simplified construction relative to previous designs. Five 8-mm penetrations in the upper grid plate of the Kansas State University TRIGA Mk. II research nuclear reactor were utilized to deploy the array between fuel elements in the core. The Micro-Pocket Fission Detector array was coupled to an electronic support system specially developed to support pulse-mode operation. The array, composed of four sensors, was used to monitor the local neutron flux simultaneously at different axial locations at a constant reactor power of 100 kWth. The array was positioned at five different radial locations within the core to emulate the deployment of multiple arrays and develop a 2-dimensional measurement of neutron flux in the reactor core.

  8. Simulation, measurement, and emulation of photovoltaic modules using high frequency and high power density power electronic circuits

    NASA Astrophysics Data System (ADS)

    Erkaya, Yunus

    The number of solar photovoltaic (PV) installations is growing exponentially, and to improve the energy yield and the efficiency of PV systems, it is necessary to have correct methods for simulation, measurement, and emulation. PV systems can be simulated using PV models for different configurations and technologies of PV modules. Additionally, different environmental conditions of solar irradiance, temperature, and partial shading can be incorporated in the model to accurately simulate PV systems for any given condition. The electrical measurement of PV systems both prior to and after making electrical connections is important for attaining high efficiency and reliability. Measuring PV modules using a current-voltage (I-V) curve tracer allows the installer to know whether the PV modules are fully operational. The installed modules can then be properly matched to maximize performance. Once installed, the whole system needs to be characterized similarly to detect mismatches, partial shading, or installation damage before energizing the system. This will prevent reliability issues from the outset and ensure the system efficiency remains high. A capacitive load is implemented for making I-V curve measurements with the goal of minimizing curve tracer volume and cost. Additionally, measurement resolution and accuracy are increased via accurate voltage and current measurement methods and accurate PV models to translate the curves to standard testing conditions. A move from mechanical relays to solid-state MOSFETs improved system reliability while significantly reducing device volume and cost. Finally, emulating PV modules is necessary for testing the electrical components of a PV system. PV emulation simplifies and standardizes the tests, allowing different irradiance, temperature, and partial shading levels to be easily tested. Proper emulation of PV modules requires an accurate and mathematically simple PV model that incorporates all known system variables so that any PV module can be emulated as the design requires. A non-synchronous buck converter is proposed for the emulation of a single, high-power PV module using traditional silicon devices. After the proof of concept was demonstrated and improvements in efficiency, power density, and steady-state error were made, dynamic tests were performed using an inverter connected to the PV emulator. To improve the dynamic characteristics, a synchronous buck converter topology is proposed along with the use of advanced GaNFET devices, which resulted in very high power efficiency and improved dynamic response characteristics when emulating PV modules.
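    A PV emulator of this kind needs a mathematically simple module model to compute its current reference. A common choice is the single-diode equation; the Python sketch below (illustrative parameter values, with series and shunt resistance neglected so the equation is explicit, and not the dissertation's actual model) computes module current at a given terminal voltage.

```python
import math

def pv_current(v, i_ph=8.0, i_0=1e-9, n=1.3, cells=60, t=298.15):
    """Terminal current of an idealized PV module at voltage v.

    Single-diode model with series/shunt resistance neglected:
        I = I_ph - I_0 * (exp(V / (cells * n * Vt)) - 1)
    i_ph:  photo-generated current (A), proportional to irradiance
    i_0:   diode saturation current (A)
    n:     diode ideality factor
    cells: series-connected cells in the module
    t:     cell temperature (K)
    """
    # thermal voltage per cell, kT/q
    vt = 1.380649e-23 * t / 1.602176634e-19
    return i_ph - i_0 * (math.exp(v / (cells * n * vt)) - 1.0)
```

An emulator sweeping v from 0 to the open-circuit voltage traces the familiar I-V curve: nearly constant current near short circuit, then a sharp roll-off as the diode term dominates.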

  9. Dynamic climate emulators for solar geoengineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacMartin, Douglas G.; Kravitz, Ben

    2016-12-22

    Climate emulators trained on existing simulations can be used to project the climate effects that result from different possible future pathways of anthropogenic forcing, without further relying on general circulation model (GCM) simulations. We extend this idea to include different amounts of solar geoengineering in addition to different pathways of greenhouse gas concentrations, by training emulators on a multi-model ensemble of simulations from the Geoengineering Model Intercomparison Project (GeoMIP). The emulator is trained on the abrupt 4×CO2 simulation and a compensating solar reduction simulation (G1), and evaluated by comparing predictions against a simulated 1% per year CO2 increase and a similarly smaller solar reduction (G2). We find reasonable agreement in most models for predicting changes in temperature and precipitation (including regional effects), and annual-mean Northern Hemisphere sea ice extent, with the difference between simulation and prediction typically being smaller than natural variability. This verifies that the linearity assumption used in constructing the emulator is sufficient for these variables over the range of forcing considered. Annual-minimum Northern Hemisphere sea ice extent is less well predicted, indicating a limit to the linearity assumption.
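    The linearity assumption behind such an emulator can be made concrete with a toy pattern-superposition sketch in Python. The patterns and forcing values below are hypothetical placeholders, not GeoMIP data: two training runs (CO2-only, and CO2 plus solar dimming) determine per-unit-forcing response patterns, which are then linearly recombined for a new forcing scenario.

```python
def train_patterns(resp_co2_run, resp_combined_run, f_co2_train, f_sol_train):
    """Infer per-unit-forcing response patterns (one value per grid cell)
    from two training runs:
      - a CO2-only run:        resp = p_co2 * f_co2_train
      - a CO2 + solar run:     resp = p_co2 * f_co2_train + p_sol * f_sol_train
    """
    p_co2 = [r / f_co2_train for r in resp_co2_run]
    p_sol = [(g - c * f_co2_train) / f_sol_train
             for g, c in zip(resp_combined_run, p_co2)]
    return p_co2, p_sol

def emulate(p_co2, p_sol, f_co2, f_sol):
    """Linear superposition of the two patterns for a new forcing pair."""
    return [c * f_co2 + s * f_sol for c, s in zip(p_co2, p_sol)]
```

Evaluating the emulator then amounts to comparing `emulate(...)` for a held-out forcing pair (the G2 analogue) against the corresponding GCM simulation, as the abstract describes.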

  10. Panel Flutter Emulation Using a Few Concentrated Forces

    NASA Astrophysics Data System (ADS)

    Dhital, Kailash; Han, Jae-Hung

    2018-04-01

    The objective of this paper is to study the feasibility of panel flutter emulation using a few concentrated forces. The concentrated forces are considered to be equivalent to the aerodynamic forces; the equivalence is established using the surface spline method and the principle of virtual work. The structural modeling of the plate is based on classical plate theory, and the aerodynamic modeling is based on piston theory. The present approach differs from linear panel flutter analysis in how the modal aerodynamic forces are formulated, while the structural properties remain unchanged. The solutions for the flutter problem are obtained numerically using the standard eigenvalue procedure. A few concentrated forces were considered, with an optimization effort to determine their optimal locations. The optimization process is based on minimizing the error between the flutter bounds obtained from the emulated and the linear flutter analysis methods. The emulated flutter results for a square plate with four different boundary conditions using six concentrated forces agree with the reference values with minimal error. The results demonstrate the workability and viability of using concentrated forces to emulate real panel flutter. In addition, the paper includes parametric studies of linear panel flutter for which adequate literature is not available.

  11. The Mira-Titan Universe. II. Matter Power Spectrum Emulation

    NASA Astrophysics Data System (ADS)

    Lawrence, Earl; Heitmann, Katrin; Kwan, Juliana; Upadhye, Amol; Bingham, Derek; Habib, Salman; Higdon, David; Pope, Adrian; Finkel, Hal; Frontiere, Nicholas

    2017-09-01

    We introduce a new cosmic emulator for the matter power spectrum covering eight cosmological parameters. Targeted at optical surveys, the emulator provides accurate predictions out to a wavenumber k ~ 5 Mpc^-1 and redshift z ≤ 2. In addition to covering the standard set of ΛCDM parameters, massive neutrinos and a dynamical dark energy equation of state are included. The emulator is built on a sample set of 36 cosmological models, carefully chosen to provide accurate predictions over the wide and large parameter space. For each model, we have performed a high-resolution simulation, augmented with 16 medium-resolution simulations and TimeRG perturbation theory results to provide accurate coverage over a wide k-range; the data set generated as part of this project is more than 1.2 Pbytes. With the current set of simulated models, we achieve an accuracy of approximately 4%. Because the sampling approach used here has established convergence and error-control properties, follow-up results with more than a hundred cosmological models will soon achieve ~1% accuracy. We compare our approach with other prediction schemes that are based on halo model ideas and remapping approaches. The new emulator code is publicly available.

  12. Low-warming Scenarios and their Approximation: Testing Emulation Performance for Average and Extreme Variables

    NASA Astrophysics Data System (ADS)

    Tebaldi, C.; Knutti, R.; Armbruster, A.

    2017-12-01

    Taking advantage of the availability of ensemble simulations under low-warming scenarios performed with the NCAR-DOE CESM, we test the performance of established methods for climate model output emulation. The goal is to provide a green, yellow, or red light to the large impact-research community that may be interested in performing impact analysis using climate model output other than, or in conjunction with, CESM's, especially as the IPCC Special Report on the 1.5 °C target urgently calls for scientific contributions exploring the costs and benefits of attaining these ambitious goals. We test the performance of emulators of average temperature and precipitation - and their interannual variability - and we also explore the possibility of emulating indices of extremes (ETCCDI indices), devised to offer impact-relevant information from daily output of temperature and precipitation. Different degrees of departure from the linearity assumed in these traditional emulation approaches are found across the various quantities considered, and across regions, highlighting different degrees of quality in the approximations and therefore some challenges in the provision of climate-change information for impact analysis under these new scenarios, which few models have thus far targeted through their simulations.

  13. The Mira-Titan Universe. II. Matter Power Spectrum Emulation

    DOE PAGES

    Lawrence, Earl; Heitmann, Katrin; Kwan, Juliana; ...

    2017-09-20

    We introduce a new cosmic emulator for the matter power spectrum covering eight cosmological parameters. Targeted at optical surveys, the emulator provides accurate predictions out to a wavenumber k ~ 5 Mpc^-1 and redshift z ≤ 2. Besides covering the standard set of ΛCDM parameters, massive neutrinos and a dynamical dark energy equation of state are included. The emulator is built on a sample set of 36 cosmological models, carefully chosen to provide accurate predictions over the wide and large parameter space. For each model, we have performed a high-resolution simulation, augmented with sixteen medium-resolution simulations and TimeRG perturbation theory results to provide accurate coverage of a wide k-range; the dataset generated as part of this project is more than 1.2 Pbytes. With the current set of simulated models, we achieve an accuracy of approximately 4%. Because the sampling approach used here has established convergence and error-control properties, follow-on results with more than a hundred cosmological models will soon achieve ~1% accuracy. We compare our approach with other prediction schemes that are based on halo model ideas and remapping approaches. The new emulator code is publicly available.

  14. The Mira-Titan Universe. II. Matter Power Spectrum Emulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawrence, Earl; Heitmann, Katrin; Kwan, Juliana

    We introduce a new cosmic emulator for the matter power spectrum covering eight cosmological parameters. Targeted at optical surveys, the emulator provides accurate predictions out to a wavenumber k ~ 5 Mpc^-1 and redshift z ≤ 2. Besides covering the standard set of ΛCDM parameters, massive neutrinos and a dynamical dark energy equation of state are included. The emulator is built on a sample set of 36 cosmological models, carefully chosen to provide accurate predictions over the wide and large parameter space. For each model, we have performed a high-resolution simulation, augmented with sixteen medium-resolution simulations and TimeRG perturbation theory results to provide accurate coverage of a wide k-range; the dataset generated as part of this project is more than 1.2 Pbytes. With the current set of simulated models, we achieve an accuracy of approximately 4%. Because the sampling approach used here has established convergence and error-control properties, follow-on results with more than a hundred cosmological models will soon achieve ~1% accuracy. We compare our approach with other prediction schemes that are based on halo model ideas and remapping approaches. The new emulator code is publicly available.

  15. What role does performance information play in securing improvement in healthcare? a conceptual framework for levers of change.

    PubMed

    Levesque, Jean-Frederic; Sutherland, Kim

    2017-08-28

    Across healthcare systems, there is consensus on the need for independent and impartial assessment of performance. There is less agreement about how measuring and reporting performance improves healthcare. This paper draws on academic theories to develop a conceptual framework, one that classifies in an integrated manner the ways in which change can be leveraged by healthcare performance information. The framework was developed through a synthesis of published frameworks. It identifies eight levers for change enabled by performance information, spanning internal and external drivers, and emergent and planned processes: (1) cognitive levers provide awareness and understanding; (2) mimetic levers inform about the performance of others to encourage emulation; (3) supportive levers provide facilitation, implementation tools or models of care to actively support change; (4) formative levers develop capabilities and skills through teaching, mentoring and feedback; (5) normative levers set performance against guidelines, standards, certification and accreditation processes; (6) coercive levers use policies, regulations, incentives and disincentives to force change; (7) structural levers modify the physical environment or professional cultures and routines; (8) competitive levers attract patients or funders. This framework highlights how performance measurement and reporting can contribute to eight different levers for change. It provides guidance on how to align performance measurement and reporting with quality improvement programmes. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  16. Process membership in asynchronous environments

    NASA Technical Reports Server (NTRS)

    Ricciardi, Aleta M.; Birman, Kenneth P.

    1993-01-01

    The development of reliable distributed software is simplified by the ability to assume a fail-stop failure model. The emulation of such a model in an asynchronous distributed environment is discussed. The solution proposed, called Strong-GMP, can be supported through a highly efficient protocol, and was implemented as part of a distributed systems software project at Cornell University. The precise definition of the problem, the protocol, correctness proofs, and an analysis of costs are addressed.

  17. Megawatt-Scale Power Hardware-in-the-Loop Simulation Testing of a Power Conversion Module for Naval Applications

    DTIC Science & Technology

    2015-06-21

    problem was detected. Protection elements were implemented to trigger on over-voltage, over-current, over/under-frequency, and zero-sequence voltage ... power hardware in the loop simulation of distribution networks with photovoltaic generation," International Journal of Renewable Energy Research ... source modules were intended to support both emulation of a representative gas turbine generator set, as well as a flexible, controllable voltage source

  18. Launching applications on compute and service processors running under different operating systems in scalable network of processor boards with routers

    DOEpatents

    Tomkins, James L [Albuquerque, NM; Camp, William J [Albuquerque, NM

    2009-03-17

    A multiple processor computing apparatus includes a physical interconnect structure that is flexibly configurable to support selective segregation of classified and unclassified users. The physical interconnect structure also permits easy physical scalability of the computing apparatus. The computing apparatus can include an emulator which permits applications from the same job to be launched on processors that use different operating systems.

  19. Terahertz multistatic reflection imaging.

    PubMed

    Dorney, Timothy D; Symes, William W; Baraniuk, Richard G; Mittleman, Daniel M

    2002-07-01

    We describe a new imaging method using single-cycle pulses of terahertz (THz) radiation. This technique emulates the data collection and image processing procedures developed for geophysical prospecting and is made possible by the availability of fiber-coupled THz receiver antennas. We use a migration procedure to solve the inverse problem; this permits us to reconstruct the location, the shape, and the refractive index of targets. We show examples for both metallic and dielectric model targets, and we perform velocity analysis on dielectric targets to estimate the refractive indices of imaged components. These results broaden the capabilities of THz imaging systems and also demonstrate the viability of the THz system as a test bed for the exploration of new seismic processing methods.

  20. Dynamic Emulation of NASA Missions for IVandV: A Case Study of JWST and SLS

    NASA Technical Reports Server (NTRS)

    Yokum, Steve

    2015-01-01

    Software-only simulations are an emerging but quickly developing field of study throughout NASA. The NASA Independent Verification and Validation (IV&V) Independent Test Capability (ITC) team has been rapidly building a collection of simulators for a wide range of NASA missions. ITC specializes in full end-to-end simulations that enable developers, V&V personnel, and operators to test-as-you-fly. In four years, the team has delivered a wide variety of spacecraft simulations, ranging from low-complexity science missions such as the Global Precipitation Measurement (GPM) satellite and the Deep Space Climate Observatory (DSCOVR) to extremely complex missions such as the James Webb Space Telescope (JWST) and the Space Launch System (SLS).

  1. User-Friendly Interface Developed for a Web-Based Service for SpaceCAL Emulations

    NASA Technical Reports Server (NTRS)

    Liszka, Kathy J.; Holtz, Allen P.

    2004-01-01

    A team at the NASA Glenn Research Center is developing a Space Communications Architecture Laboratory (SpaceCAL) for protocol development activities for coordinated satellite missions. SpaceCAL will provide a multiuser, distributed system to emulate space-based Internet architectures, backbone networks, formation clusters, and constellations. As part of a new effort in 2003, building blocks are being defined for an open distributed system to make the satellite emulation test bed accessible through an Internet connection. The first step in creating a Web-based service to control the emulation remotely is providing a user-friendly interface for encoding the data into a well-formed and complete Extensible Markup Language (XML) document. XML provides coding that allows data to be transferred between dissimilar systems. Scenario specifications include control parameters, network routes, interface bandwidths, delay, and bit error rate. Specifications for all satellites, instruments, and ground stations in a given scenario are also included in the XML document. For the SpaceCAL emulation, the XML document can be created using XForms, a Web-based forms language for data collection. In contrast to older forms technologies, the interactive user interface emphasizes the science rather than the data representation. Required versus optional input fields, default values, automatic calculations, data validation, and reuse will help researchers quickly and accurately define missions. XForms can apply any XML schema defined for the test mission to validate data before forwarding it to the emulation facility. New instrument definitions, facilities, and mission types can be added to the existing schema. The first prototype user interface incorporates components for interactive input and form processing. Internet address, data rate, and the location of the facility are implemented with basic form controls, with default values provided for convenience and efficiency using basic XForms operations. Because different emulation scenarios will vary widely in their component structure, more complex operations are used to add and delete facilities.
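    A minimal sketch of generating such a scenario document with Python's standard library follows. The element and attribute names (`scenario`, `facility`, `dataRate`, and so on) are hypothetical illustrations, not the actual SpaceCAL schema; a real deployment would validate the result against the project's XML schema before forwarding it to the emulation facility.

```python
import xml.etree.ElementTree as ET

def build_scenario(name, facilities):
    """Serialize an emulation scenario to an XML string.

    facilities: list of dicts with keys name, address, rate_kbps,
                and optionally ber (bit error rate).
    """
    root = ET.Element("scenario", name=name)
    for f in facilities:
        fac = ET.SubElement(root, "facility", name=f["name"])
        ET.SubElement(fac, "address").text = f["address"]
        ET.SubElement(fac, "dataRate", units="kbps").text = str(f["rate_kbps"])
        ET.SubElement(fac, "bitErrorRate").text = repr(f.get("ber", 1e-7))
    return ET.tostring(root, encoding="unicode")
```

Because ElementTree always emits well-formed markup and escapes text content, the output satisfies the "well-formed and complete" requirement by construction; schema validation remains a separate step.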

  2. Organic core-sheath nanowire artificial synapses with femtojoule energy consumption.

    PubMed

    Xu, Wentao; Min, Sung-Yong; Hwang, Hyunsang; Lee, Tae-Woo

    2016-06-01

    Emulation of biological synapses is an important step toward construction of large-scale brain-inspired electronics. Despite remarkable progress in emulating synaptic functions, current synaptic devices still consume energy that is orders of magnitude greater than do biological synapses (~10 fJ per synaptic event). Reduction of energy consumption of artificial synapses remains a difficult challenge. We report organic nanowire (ONW) synaptic transistors (STs) that emulate the important working principles of a biological synapse. The ONWs emulate the morphology of nerve fibers. With a core-sheath-structured ONW active channel and a well-confined 300-nm channel length obtained using ONW lithography, ~1.23 fJ per synaptic event for individual ONW was attained, which rivals that of biological synapses. The ONW STs provide a significant step toward realizing low-energy-consuming artificial intelligent electronics and open new approaches to assembling soft neuromorphic systems with nanometer feature size.

  3. The application of emulation techniques in the analysis of highly reliable, guidance and control computer systems

    NASA Technical Reports Server (NTRS)

    Migneault, Gerard E.

    1987-01-01

    Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is that difficulty: the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precise outputs are quite sensitive to errors of approximation in their input data. The use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally, several examples of the application of emulation techniques are described.

  4. Fast emulation of track reconstruction in the CMS simulation

    NASA Astrophysics Data System (ADS)

    Komm, Matthias; CMS Collaboration

    2017-10-01

    Simulated samples of various physics processes are a key ingredient of analyses that unlock the physics behind LHC collision data. Samples with ever-larger statistics are required to keep up with the increasing amount of recorded data. During sample generation, significant computing time is spent on the reconstruction of charged-particle tracks from energy deposits, which additionally scales with the pileup conditions. In CMS, the FastSimulation package is developed to provide a fast alternative to the standard simulation and reconstruction workflow. It employs various techniques to emulate track reconstruction effects in particle collision events. Several analysis groups in CMS are utilizing the package, in particular those requiring many samples to scan the parameter space of physics models (e.g. SUSY) or for the purpose of estimating systematic uncertainties. The strategies for and recent developments in this emulation are presented, including a novel, flexible implementation of tracking emulation that retains a sufficient, tuneable accuracy.

  5. Performance Guaranteed Inertia Emulation for Diesel-Wind System Feed Microgrid via Model Reference Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melin, Alexander M.; Zhang, Yichen; Djouadi, Seddik

    In this paper, a model reference control based inertia emulation strategy is proposed. Desired inertia can be precisely emulated through this control strategy so that guaranteed performance is ensured. A typical frequency response model with parametric inertia is set as the reference model. A measurement at a specific location delivers the information of the disturbance acting on the diesel-wind system to the reference model. The objective is for the speed of the diesel-wind system to track the reference model. Since active power variation is dominantly governed by mechanical dynamics and modes, only mechanical dynamics and states, i.e., a swing-engine-governor system plus a reduced-order wind turbine generator, are involved in the feedback control design. The controller is implemented in a microgrid fed by a three-phase diesel-wind system. The results show that exact synthetic inertia is emulated, leading to guaranteed performance and safety bounds.
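    The effect of emulated inertia on frequency dynamics can be illustrated with a toy per-unit swing-equation simulation. This is a simplified first-order surrogate under assumed parameter values, not the paper's model reference controller: an inertia-emulating converter injecting power proportional to the rate of change of frequency is algebraically equivalent to raising the system inertia constant.

```python
def simulate_freq_dev(h_total, p_dist, d=1.0, dt=0.001, t_end=2.0):
    """Frequency deviation of a per-unit swing model under a step
    load disturbance p_dist, with damping d:

        2 * h_total * df/dt = -d * f - p_dist

    An inertia-emulating injection p_emu = -2 * h_emu * df/dt simply
    increases h_total by h_emu. Forward-Euler integration; returns the
    frequency-deviation trajectory.
    """
    f, trace = 0.0, []
    for _ in range(int(t_end / dt)):
        dfdt = (-d * f - p_dist) / (2.0 * h_total)
        f += dfdt * dt
        trace.append(f)
    return trace
```

Comparing a run with only physical inertia against one with added emulated inertia shows the expected effect: a lower initial rate of change of frequency (RoCoF) and a shallower frequency excursion, which is the "performance bound" such schemes aim to guarantee.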

  6. Calibration of International Space Station (ISS) Node 1 Vibro-Acoustic Model-Report 2

    NASA Technical Reports Server (NTRS)

    Zhang, Weiguo; Raveendra, Ravi

    2014-01-01

    Reported here is the capability of the Energy Finite Element Method (E-FEM) to predict the vibro-acoustic sound fields within the International Space Station (ISS) Node 1 and to compare the results with simulated leak sounds. A series of electronically generated structural ultrasonic noise sources were created in the pressure wall to emulate leak signals at different locations of the Node 1 STA module during its period of storage at Stennis Space Center (SSC). The exact sound source profiles created within the pressure wall at the source were unknown, but were estimated from the closest sensor measurement. The E-FEM method represents a reverberant sound field calculation, and of importance to this application is the requirement to correctly handle the direct field effect of the sound generation. It was also important to be able to compute the sound energy fields in the ultrasonic frequency range. This report demonstrates the capability of this technology as applied to this type of application.

  7. Social learning by imitation in a reptile (Pogona vitticeps).

    PubMed

    Kis, Anna; Huber, Ludwig; Wilkinson, Anna

    2015-01-01

    The ability to learn through imitation is thought to be the basis of cultural transmission and was long considered a distinctive characteristic of humans. There is now evidence that both mammals and birds are capable of imitation. However, nothing is known about these abilities in the third amniotic class, the reptiles. Here, we use a bidirectional control procedure to show that a reptile species, the bearded dragon (Pogona vitticeps), is capable of social learning that cannot be explained by simple mechanisms such as local enhancement or goal emulation. Subjects in the experimental group opened a trap door to the side that had been demonstrated, while subjects in the ghost control group, who observed the door move without the intervention of a conspecific, were unsuccessful. This, together with differences in behaviour between experimental and control groups, provides compelling evidence that reptiles possess cognitive abilities that are comparable to those observed in mammals and birds and suggests that learning by imitation is likely to be based on ancient mechanisms.

  8. Development, Demonstration, and Control of a Testbed for Multiterminal HVDC System

    DOE PAGES

    Li, Yalong; Shi, Xiaojie M.; Liu, Bo; ...

    2016-10-21

    This paper presents the development of a scaled four-terminal high-voltage direct current (HVDC) testbed, including hardware structure, communication architecture, and different control schemes. The developed testbed is capable of emulating typical operation scenarios including system start-up, power variation, line contingency, and converter station failure. Some unique scenarios are also developed and demonstrated, such as online control-mode transition and station re-commissioning. In particular, a dc line current control is proposed, through the regulation of a converter station at one terminal. By controlling a dc line current to zero, the transmission line can be opened using relatively low-cost HVDC disconnects with low current-interrupting capability, instead of the more expensive dc circuit breaker. Utilizing the dc line current control, an automatic line current limiting scheme is developed. As a result, when a dc line is overloaded, the line current control is automatically activated to regulate the current within the allowable maximum value.

  9. Robust self-cleaning and micromanipulation capabilities of gecko spatulae and their bio-mimics

    NASA Astrophysics Data System (ADS)

    Xu, Quan; Wan, Yiyang; Hu, Travis Shihao; Liu, Tony X.; Tao, Dashuai; Niewiarowski, Peter H.; Tian, Yu; Liu, Yue; Dai, Liming; Yang, Yanqing; Xia, Zhenhai

    2015-11-01

    Geckos have the extraordinary ability to prevent their sticky feet from fouling while running on dusty walls and ceilings. Understanding gecko adhesion and self-cleaning mechanisms is essential for elucidating animal behaviours and rationally designing gecko-inspired devices. Here we report a unique self-cleaning mechanism possessed by the nano-pads of gecko spatulae. The difference between the velocity-dependent particle-wall adhesion and the velocity-independent spatula-particle dynamic response leads to a robust self-cleaning capability, allowing geckos to efficiently dislodge dirt during their locomotion. Emulating this natural design, we fabricate artificial spatulae and micromanipulators that show similar effects, and that provide a new way to manipulate micro-objects. By simply tuning the pull-off velocity, our gecko-inspired micromanipulators, made of synthetic microfibers with graphene-decorated micro-pads, can easily pick up, transport, and drop-off microparticles for precise assembling. This work should open the door to the development of novel self-cleaning adhesives, smart surfaces, microelectromechanical systems, biomedical devices, and more.

  10. Deep Neural Network Emulation of a High-Order, WENO-Limited, Space-Time Reconstruction

    NASA Astrophysics Data System (ADS)

    Norman, M. R.; Hall, D. M.

    2017-12-01

    Deep Neural Networks (DNNs) have been used to emulate a number of processes in atmospheric models, including radiation and even so-called super-parameterization of moist convection. In each scenario, the DNN provides a good representation of the process even for inputs that have not been encountered before. More notably, they provide an emulation at a fraction of the cost of the original routine, giving speed-ups of 30× and even up to 200× compared to the runtime costs of the original routines. However, to our knowledge there has not been an investigation into using DNNs to emulate the dynamics. The most likely reason for this is that dynamics operators are typically both linear and low cost, meaning they cannot be sped up by a non-linear DNN emulation. However, there exist high-cost non-linear space-time dynamics operators that significantly reduce the number of parallel data transfers necessary to complete an atmospheric simulation. The WENO-limited Finite-Volume method with ADER-DT time integration is a prime example of this - needing only two parallel communications per large, fully limited time step. However, it comes at a high cost in terms of computation, which is why many would hesitate to use it. This talk investigates DNN emulation of the WENO-limited space-time finite-volume reconstruction procedure - the most expensive portion of this method, which densely clusters a large amount of non-linear computation. Different training techniques and network architectures are tested, and the accuracy and speed-up of each is given.
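    The emulation concept in this abstract can be sketched on a much simpler stand-in problem. The paper's target is a WENO-limited ADER-DT space-time reconstruction; below, a basic minmod slope reconstruction stands in for it, and a random-feature network (fixed random ReLU layer plus a least-squares readout) plays the role of the trained DNN. Everything here is an illustrative assumption, not the authors' method.

```python
# Illustrative sketch only: learning to emulate a nonlinear reconstruction
# operator. A random-feature network (fixed random ReLU features + linear
# ridge readout) fits a minmod slope reconstruction on 3-cell stencils;
# the paper's actual target is a far more complex WENO-limited ADER-DT
# space-time reconstruction.
import numpy as np

rng = np.random.default_rng(0)

def minmod(a, b):
    return np.where(a * b > 0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def reconstruct(stencil):               # left-interface value from 3 cell means
    left, mid, right = stencil[:, 0], stencil[:, 1], stencil[:, 2]
    slope = minmod(mid - left, right - mid)
    return mid - 0.5 * slope

# Training data: random 3-cell stencils and their reconstructed values.
X = rng.normal(size=(5000, 3))
y = reconstruct(X)

# Fixed random hidden layer; only the linear readout is "trained".
W = rng.normal(size=(3, 256))
b = rng.normal(size=256)
H = np.maximum(X @ W + b, 0.0)                        # ReLU features
coef = np.linalg.solve(H.T @ H + 1e-6 * np.eye(256), H.T @ y)

# Evaluate the emulator on held-out stencils.
Xt = rng.normal(size=(1000, 3))
pred = np.maximum(Xt @ W + b, 0.0) @ coef
rmse = np.sqrt(np.mean((pred - reconstruct(Xt)) ** 2))
print(rmse)
```

    The speed-up argument in the abstract hinges on the real reconstruction being expensive and nonlinear; this toy operator is cheap, so the sketch only demonstrates the fitting procedure, not a performance gain.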

  11. The value of improved wind power forecasting: Grid flexibility quantification, ramp capability analysis, and impacts of electricity market operation timescales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Qin; Wu, Hongyu; Florita, Anthony R.

    The value of improving wind power forecasting accuracy at different electricity market operation timescales was analyzed by simulating the IEEE 118-bus test system as modified to emulate the generation mixes of the Midcontinent, California, and New England independent system operator balancing authority areas. The wind power forecasting improvement methodology and error analysis for the data set were elaborated. Production cost simulation was conducted on the three emulated systems with a total of 480 scenarios, considering the impacts of different generation technologies, wind penetration levels, and wind power forecasting improvement timescales. The static operational flexibility of the three systems was compared through the diversity of generation mix, the percentage of must-run baseload generators, as well as the available ramp rate and the minimum generation levels. The dynamic operational flexibility was evaluated by the real-time upward and downward ramp capacity. Simulation results show that the generation resource mix plays a crucial role in evaluating the value of improved wind power forecasting at different timescales. In addition, the changes in annual operational electricity generation costs were mostly influenced by the dominant resource in the system. Lastly, the impacts of pumped-storage resources, generation ramp rates, and system minimum generation level requirements on the value of improved wind power forecasting were also analyzed.

  12. The value of improved wind power forecasting: Grid flexibility quantification, ramp capability analysis, and impacts of electricity market operation timescales

    DOE PAGES

    Wang, Qin; Wu, Hongyu; Florita, Anthony R.; ...

    2016-11-11

    The value of improving wind power forecasting accuracy at different electricity market operation timescales was analyzed by simulating the IEEE 118-bus test system as modified to emulate the generation mixes of the Midcontinent, California, and New England independent system operator balancing authority areas. The wind power forecasting improvement methodology and error analysis for the data set were elaborated. Production cost simulation was conducted on the three emulated systems with a total of 480 scenarios, considering the impacts of different generation technologies, wind penetration levels, and wind power forecasting improvement timescales. The static operational flexibility of the three systems was compared through the diversity of generation mix, the percentage of must-run baseload generators, as well as the available ramp rate and the minimum generation levels. The dynamic operational flexibility was evaluated by the real-time upward and downward ramp capacity. Simulation results show that the generation resource mix plays a crucial role in evaluating the value of improved wind power forecasting at different timescales. In addition, the changes in annual operational electricity generation costs were mostly influenced by the dominant resource in the system. Lastly, the impacts of pumped-storage resources, generation ramp rates, and system minimum generation level requirements on the value of improved wind power forecasting were also analyzed.

  13. High Density Polyetherurethane Foam as a Fragmentation and Radiographic Surrogate for Cortical Bone

    PubMed Central

    Beardsley, Christina L; Heiner, Anneliese D; Brandser, Eric A; Marsh, J Lawrence; Brown, Thomas D

    2000-01-01

    Background Although one of the most important factors in predicting outcome of articular fracture, the comminution of the fracture is only subjectively assessed. To facilitate development of objective, quantitative measures of comminution phenomena, there is need for a bone fragmentation surrogate. Methods Laboratory investigation was undertaken to develop and characterize a novel synthetic material capable of emulating the fragmentation and radiographic behavior of human cortical bone. Results Screening tests performed with a drop tower apparatus identified high-density polyetherurethane foam as having suitable fragmentation properties. The material's impact behavior and its quasi-static mechanical properties are here described. Dispersal of barium sulfate (BaSO4) in the resin achieved radio-density closely resembling that of bone, without detectably altering mechanical behavior. The surrogate material's ultimate strength, elastic modulus, and quasi-static toughness are within an order of magnitude of those of mammalian cortical bone. The spectrum of comminution patterns produced by this material when impacted with varying amounts of energy is very comparable to the spectrum of bone fragment comminution seen clinically. Conclusions A novel high-density polyetherurethane foam, when subjected to impact loading, sustains comminuted fracture in a manner strikingly similar to cortical bone. Moreover, since the material also can be doped with radio-opacifier so as to closely emulate bone's radiographic signature, it opens many new possibilities for CT-based systematic study of comminution phenomena. PMID:10934621

  14. Modeling dynamics of large tabular icebergs submerged in the ocean

    NASA Astrophysics Data System (ADS)

    Adcroft, A.; Stern, A. A.; Sergienko, O. V.

    2017-12-01

    Large tabular icebergs account for a major fraction of the ice calved from the Antarctic ice shelves, and have long lifetimes due to their size. They drift for long distances, interacting with the local ocean circulation, impacting bottom-water formation, sea-ice formation, and biological productivity in the vicinity of the icebergs. However, due to their large horizontal extent and mass, it is challenging to consistently represent large tabular icebergs in global ocean circulation models and so large tabular icebergs are not currently represented in climate models. In this study we develop a novel framework to model large tabular icebergs submerged in the ocean. In this framework, a tabular iceberg is represented by a collection of Lagrangian elements that are linked through rigid bonds. The Lagrangian elements are finite-area modifications of the point-particles used in previous studies to represent small icebergs. These elements interact with the ocean by exerting pressure on the ocean surface, and through melt water and momentum exchange. A breaking of the rigid bonds allows the model to emulate calving events (i.e. detachment of a tabular iceberg from an ice shelf), and to emulate the breaking up of tabular icebergs into smaller pieces. Idealized simulations of the calving of a tabular iceberg, subsequent drift and breakup, demonstrate the capabilities of the new framework with a promise that climate models may soon be able to represent large tabular icebergs.

  15. Emulation of Physician Tasks in Eye-Tracked Virtual Reality for Remote Diagnosis of Neurodegenerative Disease.

    PubMed

    Orlosky, Jason; Itoh, Yuta; Ranchet, Maud; Kiyokawa, Kiyoshi; Morgan, John; Devos, Hannes

    2017-04-01

    For neurodegenerative conditions like Parkinson's disease, early and accurate diagnosis is still a difficult task. Evaluations can be time consuming, patients must often travel to metropolitan areas or different cities to see experts, and misdiagnosis can result in improper treatment. To date, only a handful of assistive or remote methods exist to help physicians evaluate patients with suspected neurological disease in a convenient and consistent way. In this paper, we present a low-cost VR interface designed to support evaluation and diagnosis of neurodegenerative disease and test its use in a clinical setting. Using a commercially available VR display with an infrared camera integrated into the lens, we have constructed a 3D virtual environment designed to emulate common tasks used to evaluate patients, such as fixating on a point, conducting smooth pursuit of an object, or executing saccades. These virtual tasks are designed to elicit eye movements commonly associated with neurodegenerative disease, such as abnormal saccades, square wave jerks, and ocular tremor. Next, we conducted experiments with 9 patients with a diagnosis of Parkinson's disease and 7 healthy controls to test the system's potential to emulate tasks for clinical diagnosis. We then applied eye tracking algorithms and image enhancement to the eye recordings taken during the experiment and conducted a short follow-up study with two physicians for evaluation. Results showed that our VR interface was able to elicit five common types of movements usable for evaluation, physicians were able to confirm three out of four abnormalities, and visualizations were rated as potentially useful for diagnosis.

  16. ATM LAN Emulation: Getting from Here to There.

    ERIC Educational Resources Information Center

    Learn, Larry L., Ed.

    1995-01-01

    Discusses current LAN (local area network) configuration and explains ATM (asynchronous transfer mode) as the future telecommunications transport. Highlights include LAN emulation, which enables the interconnection of legacy LANs and the new ATM environment; virtual LANs; broadcast servers; and standards. (LRW)

  17. Statistical emulators of maize, rice, soybean and wheat yields from global gridded crop models

    DOE PAGES

    Blanc, Élodie

    2017-01-26

    This study provides statistical emulators of crop yields based on global gridded crop model simulations from the Inter-Sectoral Impact Model Intercomparison Project Fast Track project. The ensemble of simulations is used to build a panel of annual crop yields from five crop models and corresponding monthly summer weather variables for over a century at the grid cell level globally. This dataset is then used to estimate, for each crop and gridded crop model, the statistical relationship between yields, temperature, precipitation and carbon dioxide. This study considers a new functional form to better capture the non-linear response of yields to weather, especially for extreme temperature and precipitation events, and now accounts for the effect of soil type. In- and out-of-sample validations show that the statistical emulators are able to replicate spatial patterns of crop yields and changes over time projected by crop models reasonably well, although the accuracy of the emulators varies by model and by region. This study therefore provides a reliable and accessible alternative to global gridded crop yield models. By emulating crop yields for several models using parsimonious equations, the tools provide a computationally efficient method to account for uncertainty in climate change impact assessments.
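    The kind of statistical emulator this abstract describes can be sketched with an ordinary least-squares fit on synthetic data. The quadratic functional form, variable ranges, and the synthetic "crop model" below are all illustrative assumptions; the paper's actual emulators use a richer specification (per crop, per model, per grid cell, with soil-type effects).

```python
# Hedged sketch of a statistical yield emulator fit by least squares to
# synthetic data. All numbers and the functional form are illustrative,
# not taken from the paper.
import numpy as np

rng = np.random.default_rng(1)
n = 2000
temp = rng.uniform(10, 35, n)          # growing-season temperature (deg C)
precip = rng.uniform(100, 900, n)      # seasonal precipitation (mm)
co2 = rng.uniform(350, 700, n)         # CO2 concentration (ppm)

# Synthetic "crop model" yields: concave in temperature and precipitation,
# mildly increasing in CO2, plus noise.
yield_true = (-0.02 * (temp - 24) ** 2 - 1e-5 * (precip - 550) ** 2
              + 0.002 * co2 + 5.0)
yields = yield_true + rng.normal(0, 0.1, n)

# Emulator design matrix: linear and quadratic weather terms plus CO2.
X = np.column_stack([np.ones(n), temp, temp**2, precip, precip**2, co2])
beta, *_ = np.linalg.lstsq(X, yields, rcond=None)

pred = X @ beta
r2 = 1 - np.sum((yields - pred) ** 2) / np.sum((yields - yields.mean()) ** 2)
print(round(r2, 3))
```

    Once fitted, evaluating `X @ beta` replaces a full crop model run, which is the computational-efficiency point the abstract makes for climate impact assessments.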

  18. Emulation, imitation, over-imitation and the scope of culture for child and chimpanzee

    PubMed Central

    Whiten, Andrew; McGuigan, Nicola; Marshall-Pescini, Sarah; Hopper, Lydia M.

    2009-01-01

    We describe our recent studies of imitation and cultural transmission in chimpanzees and children, which question late twentieth-century characterizations of children as imitators, but chimpanzees as emulators. As emulation entails learning only about the results of others' actions, it has been thought to curtail any capacity to sustain cultures. Recent chimpanzee diffusion experiments have by contrast documented a significant capacity for copying local behavioural traditions. Additionally, in recent ‘ghost’ experiments with no model visible, chimpanzees failed to replicate the object movements on which emulation is supposed to focus. We conclude that chimpanzees rely more on imitation and have greater cultural capacities than previously acknowledged. However, we also find that they selectively apply a range of social learning processes that include emulation. Recent studies demonstrating surprisingly unselective ‘over-imitation’ in children suggest that children's propensity to imitate has been underestimated too. We discuss the implications of these developments for the nature of social learning and culture in the two species. Finally, our new experiments directly address cumulative cultural learning. Initial results demonstrate a relative conservatism and conformity in chimpanzees' learning, contrasting with cumulative cultural learning in young children. This difference may contribute much to the contrast in these species' capacities for cultural evolution. PMID:19620112

  19. Method using in vivo quantitative spectroscopy to guide design and optimization of low-cost, compact clinical imaging devices: emulation and evaluation of multispectral imaging systems

    NASA Astrophysics Data System (ADS)

    Saager, Rolf B.; Baldado, Melissa L.; Rowland, Rebecca A.; Kelly, Kristen M.; Durkin, Anthony J.

    2018-04-01

    With recent proliferation in compact and/or low-cost clinical multispectral imaging approaches and commercially available components, questions remain whether they adequately capture the requisite spectral content of their applications. We present a method to emulate the spectral range and resolution of a variety of multispectral imagers, based on in-vivo data acquired from spatial frequency domain spectroscopy (SFDS). This approach simulates spectral responses over 400 to 1100 nm. Comparing emulated data with full SFDS spectra of in-vivo tissue affords the opportunity to evaluate whether the sparse spectral content of these imagers can (1) account for all sources of optical contrast present (completeness) and (2) robustly separate and quantify sources of optical contrast (crosstalk). We validate the approach over a range of tissue-simulating phantoms, comparing the SFDS-based emulated spectra against measurements from an independently characterized multispectral imager. Emulated results match the imager across all phantoms (<3 % absorption, <1 % reduced scattering). In-vivo test cases (burn wounds and photoaging) illustrate how SFDS can be used to evaluate different multispectral imagers. This approach provides an in-vivo measurement method to evaluate the performance of multispectral imagers specific to their targeted clinical applications and can assist in the design and optimization of new spectral imaging devices.
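    The core emulation step this abstract describes, collapsing a densely sampled spectrum into a multispectral imager's sparse channels, can be sketched by integrating the spectrum against assumed band responses. The band centers, widths, and the toy reflectance spectrum below are hypothetical, not the SFDS system's or any specific imager's values.

```python
# Illustrative sketch: emulating a multispectral imager's sparse channel
# readings from a densely sampled spectrum over 400-1100 nm, analogous to
# deriving emulated imager data from full SFDS spectra. Band centers and
# widths are assumptions for illustration.
import numpy as np

wavelengths = np.arange(400, 1101, 1.0)  # nm, dense spectral grid
# Toy reflectance spectrum with a peak near 560 nm.
spectrum = 0.5 + 0.3 * np.exp(-((wavelengths - 560) / 60.0) ** 2)

def band_response(center, fwhm):
    """Gaussian band response, normalized to unit area."""
    sigma = fwhm / 2.355
    r = np.exp(-0.5 * ((wavelengths - center) / sigma) ** 2)
    return r / r.sum()

centers = [460, 560, 660, 850]           # hypothetical band centers (nm)
emulated = np.array([band_response(c, 40) @ spectrum for c in centers])
print(emulated)
```

    Comparing such emulated channel values against the full spectrum is what lets one check for the "completeness" and "crosstalk" issues the abstract raises: spectral features that fall between the assumed bands are simply not captured.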

  20. Statistical emulators of maize, rice, soybean and wheat yields from global gridded crop models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blanc, Élodie

    This study provides statistical emulators of crop yields based on global gridded crop model simulations from the Inter-Sectoral Impact Model Intercomparison Project Fast Track project. The ensemble of simulations is used to build a panel of annual crop yields from five crop models and corresponding monthly summer weather variables for over a century at the grid cell level globally. This dataset is then used to estimate, for each crop and gridded crop model, the statistical relationship between yields, temperature, precipitation and carbon dioxide. This study considers a new functional form to better capture the non-linear response of yields to weather,more » especially for extreme temperature and precipitation events, and now accounts for the effect of soil type. In- and out-of-sample validations show that the statistical emulators are able to replicate spatial patterns of yields crop levels and changes overtime projected by crop models reasonably well, although the accuracy of the emulators varies by model and by region. This study therefore provides a reliable and accessible alternative to global gridded crop yield models. By emulating crop yields for several models using parsimonious equations, the tools provide a computationally efficient method to account for uncertainty in climate change impact assessments.« less

  1. One-dimensional photonic crystals for eliminating cross-talk in mid-IR photonics-based respiratory gas sensing

    NASA Astrophysics Data System (ADS)

    Fleming, L.; Gibson, D.; Song, S.; Hutson, D.; Reid, S.; MacGregor, C.; Clark, C.

    2017-02-01

    Mid-IR carbon dioxide (CO2) gas sensing is critical for monitoring in respiratory care, and is finding increasing importance in surgical anaesthetics where nitrous oxide (N2O) induced cross-talk is a major obstacle to accurate CO2 monitoring. In this work, a novel, solid state mid-IR photonics-based CO2 gas sensor is described, and the role that 1-dimensional photonic crystals, often referred to as multilayer thin film optical coatings [1], play in boosting the sensor's capability of gas discrimination is discussed. Filter performance in isolating CO2 IR absorption is tested on an optical filter test bed and a theoretical gas sensor model is developed, with the inclusion of a modelled multilayer optical filter to analyse the efficacy of optical filtering on eliminating N2O induced cross-talk for this particular gas sensor architecture. Future possible in-house optical filter fabrication techniques are discussed. As the actual gas sensor configuration is small, it would be challenging to manufacture a filter of the correct size; dismantling the sensor and mounting a new filter for different optical coating designs each time would prove to be laborious. For this reason, an optical filter testbed set-up is described and, using a commercial optical filter, it is demonstrated that cross-talk can be considerably reduced; cross-talk is minimal even for very high concentrations of N2O, which are unlikely to be encountered in exhaled surgical anaesthetic patient breath profiles. A completely new and versatile system for breath emulation is described and the capability it has for producing realistic human exhaled CO2 vs. time waveforms is shown. The cross-talk inducing effect that N2O has on realistic emulated CO2 vs. time waveforms as measured using the NDIR gas sensing technique is demonstrated and the effect that optical filtering will have on said cross-talk is discussed.

  2. PERTS: A Prototyping Environment for Real-Time Systems

    NASA Technical Reports Server (NTRS)

    Liu, Jane W. S.; Lin, Kwei-Jay; Liu, C. L.

    1993-01-01

    PERTS is a prototyping environment for real-time systems. It is being built incrementally and will contain basic building blocks of operating systems for time-critical applications, tools, and performance models for the analysis, evaluation and measurement of real-time systems and a simulation/emulation environment. It is designed to support the use and evaluation of new design approaches, experimentations with alternative system building blocks, and the analysis and performance profiling of prototype real-time systems.

  3. R-EACTR: A Framework for Designing Realistic Cyber Warfare Exercises

    DTIC Science & Technology

    2017-09-11

    [Report excerpt; table-of-contents fragments] Sections: Environment; Adversary; Communications; Tactics; Roles; Case Study – Cyber Forge 11. The report presents a framework for designing realism into each aspect of the exercise, and a case study of one exercise where the framework was successfully employed (CMU/SEI-2017-TR-005). Keywords: network, emulation, logging, reporting. Supporting: computer network defense service provider (CNDSP), intelligence, reach-back, higher

  4. Enabling Diverse Software Stacks on Supercomputers using High Performance Virtual Clusters.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Younge, Andrew J.; Pedretti, Kevin; Grant, Ryan

    While large-scale simulations have been the hallmark of the High Performance Computing (HPC) community for decades, Large Scale Data Analytics (LSDA) workloads are gaining attention within the scientific community not only as a processing component to large HPC simulations, but also as standalone scientific tools for knowledge discovery. With the path towards Exascale, new HPC runtime systems are also emerging in a way that differs from classical distributed computing models. However, system software for such capabilities on the latest extreme-scale DOE supercomputing needs to be enhanced to more appropriately support these types of emerging software ecosystems. In this paper, we propose the use of Virtual Clusters on advanced supercomputing resources to enable systems to support not only HPC workloads, but also emerging big data stacks. Specifically, we have deployed the KVM hypervisor within Cray's Compute Node Linux on a XC-series supercomputer testbed. We also use libvirt and QEMU to manage and provision VMs directly on compute nodes, leveraging Ethernet-over-Aries network emulation. To our knowledge, this is the first known use of KVM on a true MPP supercomputer. We investigate the overhead of our solution using HPC benchmarks, both evaluating single-node performance as well as weak scaling of a 32-node virtual cluster. Overall, we find single node performance of our solution using KVM on a Cray is very efficient with near-native performance. However overhead increases by up to 20% as virtual cluster size increases, due to limitations of the Ethernet-over-Aries bridged network. Furthermore, we deploy Apache Spark with large data analysis workloads in a Virtual Cluster, effectively demonstrating how diverse software ecosystems can be supported by High Performance Virtual Clusters.

  5. What attributions do Australian high-performing general practices make for their success? Applying the clinical microsystems framework: a qualitative study

    PubMed Central

    Dunham, Annette H; Dunbar, James A; Johnson, Julie K; Fuller, Jeff; Morgan, Mark; Ford, Dale

    2018-01-01

    Objectives To identify the success attributions of high-performing Australian general practices and the enablers and barriers they envisage for practices wishing to emulate them. Design Qualitative study using semi-structured interviews and content analysis of the data. Responses were recorded, transcribed verbatim and coded according to success characteristics of high-performing clinical microsystems. Setting Primary healthcare with the participating general practices representing all Australian states and territories, and representing metropolitan and rural locations. Participants Twenty-two general practices identified as high performing via a number of success criteria. The 52 participants were 19 general practitioners, 18 practice managers and 15 practice nurses. Results Participants most frequently attributed success to the interdependence of the team members, patient-focused care and leadership of the practice. They most often signalled practice leadership, team interdependence and staff focus as enablers that other organisations would need to emulate their success. They most frequently identified barriers that might be encountered in the form of potential deficits or limitations in practice leadership, staff focus and mesosystem support. Conclusions Practice leaders need to empower their teams to take action through providing inclusive leadership that facilitates team interdependence. Mesosystem support for quality improvement in general practice should focus on enabling this leadership and team building, thereby ensuring improvement efforts are converted into effective healthcare provision. PMID:29643162

  6. Optimal visual simulation of the self-tracking combustion of the infrared decoy based on the particle system

    NASA Astrophysics Data System (ADS)

    Hu, Qi; Duan, Jin; Wang, LiNing; Zhai, Di

    2016-09-01

    High-efficiency simulation testing of military weapons matters greatly given the high cost and demanding operational requirements of live tests. Among simulation methods for explosive smoke in particular, approaches based on the particle system have attracted much attention. To improve on traditional simulations of the infrared decoy's motion over its real combustion cycle, this paper uses the OpenGL and Vega Prime virtual simulation platform and, drawing on the decoy's own radiation characteristics and aerodynamic characteristics, simulates the decoy's dynamic blur during the real combustion cycle with a particle system based on the double depth peeling algorithm, resolving key issues such as interfacing, coordinate conversion, and saving and restoring Vega Prime's state. The simulation experiments largely achieved the intended improvements, effectively increasing simulation fidelity and providing theoretical support for improving the performance of the infrared decoy.

  7. Acute cardiovascular responses while playing virtual games simulated by Nintendo Wii®

    PubMed Central

    Rodrigues, Gusthavo Augusto Alves; Felipe, Danilo De Souza; Silva, Elisangela; De Freitas, Wagner Zeferino; Higino, Wonder Passoni; Da Silva, Fabiano Fernandes; De Carvalho, Wellington Roberto Gomes; Aparecido de Souza, Renato

    2015-01-01

    [Purpose] This investigation evaluated the acute cardiovascular responses that occur while playing virtual games (aerobic and balance) emulated by Nintendo Wii®. [Subjects] Nineteen healthy male volunteers were recruited. [Methods] The ergospirometric variables of maximum oxygen consumption, metabolic equivalents, and heart rate were obtained during the aerobic (Obstacle Course, Hula Hoop, and Free Run) and balance (Soccer Heading, Penguin Slide, and Table Tilt) games of Wii Fit Plus® software. To access and analyze the ergospirometric information, a VO2000 analyzer was used. Normalized data (using maximum oxygen consumption and heart rate) were analyzed using repeated measures analysis of variance and Scheffe’s test. [Results] Significant differences were found among the balance and aerobic games in all variables analyzed. In addition, the Wii exercises performed were considered to be of light (balance games) and moderate (aerobic games) intensity in accordance with American College Sports Medicine exercise stratification. [Conclusion] Physical activity in a virtual environment emulated by Nintendo Wii® can change acute cardiovascular responses, primarily when Wii aerobic games are performed. These results support the use of the Nintendo Wii® in physical activity programs. PMID:26504308

  8. Acute cardiovascular responses while playing virtual games simulated by Nintendo Wii(®).

    PubMed

    Rodrigues, Gusthavo Augusto Alves; Felipe, Danilo De Souza; Silva, Elisangela; De Freitas, Wagner Zeferino; Higino, Wonder Passoni; Da Silva, Fabiano Fernandes; De Carvalho, Wellington Roberto Gomes; Aparecido de Souza, Renato

    2015-09-01

    [Purpose] This investigation evaluated the acute cardiovascular responses that occur while playing virtual games (aerobic and balance) emulated by Nintendo Wii(®). [Subjects] Nineteen healthy male volunteers were recruited. [Methods] The ergospirometric variables of maximum oxygen consumption, metabolic equivalents, and heart rate were obtained during the aerobic (Obstacle Course, Hula Hoop, and Free Run) and balance (Soccer Heading, Penguin Slide, and Table Tilt) games of Wii Fit Plus(®) software. To access and analyze the ergospirometric information, a VO2000 analyzer was used. Normalized data (using maximum oxygen consumption and heart rate) were analyzed using repeated measures analysis of variance and Scheffe's test. [Results] Significant differences were found among the balance and aerobic games in all variables analyzed. In addition, the Wii exercises performed were considered to be of light (balance games) and moderate (aerobic games) intensity in accordance with American College Sports Medicine exercise stratification. [Conclusion] Physical activity in a virtual environment emulated by Nintendo Wii(®) can change acute cardiovascular responses, primarily when Wii aerobic games are performed. These results support the use of the Nintendo Wii(®) in physical activity programs.

  9. Inter-Domain Roaming Mechanism Transparent to Mobile Nodes among PMIPv6 Networks

    NASA Astrophysics Data System (ADS)

    Park, Soochang; Lee, Euisin; Jin, Min-Sook; Kim, Sang-Ha

    In Proxy Mobile IPv6 (PMIPv6), when a Mobile Node (MN) enters a PMIPv6 domain and attaches to an access link, the router on the access link detects attachment of the MN by the link-layer access. All elements of PMIPv6 including the router then provide network-based mobility management service for the MN. If the MN moves to another router in this PMIPv6 domain, the new router emulates attachment to the previous router by providing same network prefix to the MN. In other words, PMIPv6 provides rapid mobility management based on layer-2 attachment and transparent mobility support to the MN by emulating layer-3 attachment with respect to intra-domain roaming. However, when the MN moves to other PMIPv6 domains, although the domains also provide the network-based mobility management service, the MN should exploit the host-based mobility management protocol, i.e. Mobile IPv6 (MIPv6), for the inter-domain roaming. Hence, this letter proposes the rapid and transparent inter-domain roaming mechanism controlled by the networks adopting PMIPv6.

  10. SUV Rollover Test

    NASA Technical Reports Server (NTRS)

    Chambers, William V.

    2004-01-01

    The National Highway Traffic Safety Administration (NHTSA) approached NASA to evaluate vehicle rollover resistance using the High Capacity Centrifuge Facility. Testing was planned for six different sport utility vehicles. Previous methods for simulating the rollover conditions were considered to be not indicative of the true driving conditions. A more realistic gradual application of side loading could be achieved by using a centrifuge facility. A unique load measuring lower support system was designed to measure tire loading on the inboard tires and to indicate tire liftoff. This lower support system was designed to more closely emulate the actual rollover conditions. Additional design features were provided to mitigate potential safety hazards.

  11. ATM Quality of Service Parameters at 45 Mbps Using a Satellite Emulator: Laboratory Measurements

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.; Bobinsky, Eric A.

    1997-01-01

    Results of 45-Mbps DS3 intermediate-frequency loopback measurements of asynchronous transfer mode (ATM) quality of service parameters (cell error ratio and cell loss ratio) are presented. These tests, which were conducted at the NASA Lewis Research Center in support of satellite-ATM interoperability research, represent initial efforts to quantify the minimum parameters for stringent ATM applications, such as MPEG-1 and MPEG-2 video transmission. Portions of these results were originally presented to the International Telecommunications Union's ITU-R Working Party 4B in February 1996 in support of their Draft Preliminary Recommendation on the Transmission of ATM Traffic via Satellite.

  12. Quantification of uncertainties in the tsunami hazard for Cascadia using statistical emulation

    NASA Astrophysics Data System (ADS)

    Guillas, S.; Day, S. J.; Joakim, B.

    2016-12-01

    We present new high-resolution tsunami wave propagation and coastal inundation simulations for the Cascadia region in the Pacific Northwest. The coseismic representation in this analysis is novel, and more realistic than in previous studies, as we jointly parametrize multiple aspects of the seabed deformation. Because such simulators carry a large computational cost, statistical emulation is required to carry out uncertainty quantification tasks, since emulators efficiently approximate simulators. The emulator replaces the tsunami model VOLNA with a fast surrogate, so we can efficiently propagate uncertainties from the source characteristics to wave heights and probabilistically assess tsunami hazard for Cascadia. We employ a new method for the design of the computer experiments that reduces the number of runs while maintaining good approximation properties of the emulator. Of the initial nine parameters, mostly describing the geometry and time variation of the seabed deformation, we drop two that turn out to have no influence on the resulting tsunami waves at the coast. We model the impact of another parameter linearly, as its influence on the wave heights is identified as linear. We combine this screening approach with the sequential design algorithm MICE (Mutual Information for Computer Experiments), which adaptively selects the input values at which to run the computer simulator so as to maximize the expected information gain (mutual information) over the input space. As a result, the emulation is both feasible and accurate. Starting from distributions of the source parameters that encapsulate geophysical knowledge of the possible source characteristics, we derive distributions of tsunami wave heights along the coastline.
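    The emulate-then-design loop described above can be illustrated with a one-dimensional toy problem. This sketch uses a Gaussian-process emulator and a maximum-predictive-variance selection rule as a simplified stand-in for the mutual-information criterion of MICE; the "simulator", length-scale and design sizes are all hypothetical.

```python
import numpy as np

# Toy Gaussian-process emulator of an "expensive" simulator, grown with a
# variance-based sequential design (a simplified stand-in for MICE's
# mutual-information criterion). All functions and settings are illustrative.

def simulator(x):                      # hypothetical expensive model
    return np.sin(3 * x) + 0.5 * x

def kernel(a, b, ell=0.3):             # squared-exponential correlation
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

def gp_predict(X, y, Xs, jitter=1e-8):
    K = kernel(X, X) + jitter * np.eye(len(X))
    Ks = kernel(X, Xs)
    mean = Ks.T @ np.linalg.solve(K, y)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)
    return mean, var

cands = np.linspace(0, 2, 201)          # candidate design points
X = np.array([0.1, 1.9])                # small initial design
y = simulator(X)
for _ in range(8):                      # grow the design sequentially
    _, var = gp_predict(X, y, cands)
    xn = cands[np.argmax(var)]          # most uncertain candidate
    X, y = np.append(X, xn), np.append(y, simulator(xn))

mean, _ = gp_predict(X, y, cands)
err = np.max(np.abs(mean - simulator(cands)))   # emulator accuracy
```

    With ten well-placed runs the surrogate tracks the simulator closely across the input range, which is the property that makes emulator-based uncertainty propagation affordable.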

  13. Emulation of Condensed Fuel Flames Using a Burning Rate Emulator (BRE) in Microgravity

    NASA Technical Reports Server (NTRS)

    Markan, A.; Quintiere, J. G.; Sunderland, P. B.; De Ris, J. L.; Stocker, D. P.

    2017-01-01

    The Burning Rate Emulator (BRE) is a gaseous fuel burner developed to emulate the burning of condensed-phase fuels. The current study details several tests at the NASA Glenn 5-s drop facility to evaluate the BRE technique under microgravity conditions. The tests were conducted with two burner diameters, 25 mm and 50 mm, using methane and ethylene as fuels; ambient pressure, oxygen content and fuel flow rate were additional parameters. The microgravity results exhibit a nominally hemispherical flame with decelerating growth and quasi-steady heat flux after about 5 seconds. The BRE burner was evaluated with a transient analysis to assess the extent to which steady state was achieved. The burning rate and flame height recorded at the end of the drop are correlated using two steady-state, purely diffusive models. A higher burning rate for the larger burner than theory predicts indicates the significance of gas radiation. The effects of ambient pressure and oxygen concentration on the heat of gasification are also examined.

  14. Design of testbed and emulation tools

    NASA Technical Reports Server (NTRS)

    Lundstrom, S. F.; Flynn, M. J.

    1986-01-01

    The research summarized was concerned with the design of testbed and emulation tools suitable to assist in projecting, with reasonable accuracy, the expected performance of highly concurrent computing systems on large, complete applications. Such testbed and emulation tools are intended for the eventual use of those exploring new concurrent system architectures and organizations, either as users or as designers of such systems. While a range of alternatives was considered, a software-based set of hierarchical tools was chosen to provide maximum flexibility, to ease migration to new computers as technology improves, and to take advantage of the inherent reliability and availability of commercially available computing systems.

  15. On stethoscope design: a challenge for biomedical circuit designers.

    PubMed

    Hahn, A W

    2001-01-01

    Most clinicians learned the art and science of auscultation using an acoustic stethoscope. While many models of electronic stethoscopes have been marketed over the years, none of them seems to do a very good job of emulating the most common acoustic stethoscopes available. This paper is an appeal to biomedical circuit designers to learn more about the acoustics of commonly used stethoscopes and to develop an appropriate family of circuits that would emulate them, much as music synthesizers can emulate almost any musical instrument. The hope is that creative designers will move toward a rational and widely acceptable design for both personal physician use and telemedicine.

  16. Electrode geometry for electrostatic generators and motors

    DOEpatents

    Post, Richard F.

    2016-02-23

    An electrostatic (ES) device is described with electrodes that improve its performance metrics. Devices include ES generators and ES motors, which are comprised of one or more stators (stationary members) and one or more rotors (rotatable members). The stator and rotors are configured as a pair of concentric cylindrical structures and aligned about a common axis. The stator and rotor are comprised of an ensemble of discrete, longitudinal electrodes, which are axially oriented in an annular arrangement. The shape of the electrodes described herein enables the ES device to function at voltages significantly greater than that of the existing art, resulting in devices with greater power-handling capability and overall efficiency. Electrode shapes include, but are not limited to, rods, corrugated sheets and emulations thereof.

  17. Harnessing Disordered-Ensemble Quantum Dynamics for Machine Learning

    NASA Astrophysics Data System (ADS)

    Fujii, Keisuke; Nakajima, Kohei

    2017-08-01

    The quantum computer has immense potential for fast information processing. However, realizing a digital quantum computer remains a challenging problem, requiring highly accurate controls and key application strategies. Here we propose a platform, quantum reservoir computing, that sidesteps these issues by exploiting for machine learning the natural quantum dynamics of ensemble systems, which are ubiquitous in today's laboratories. This framework enables ensemble quantum systems to universally emulate nonlinear dynamical systems, including classical chaos. Numerical experiments show that quantum systems of 5-7 qubits possess computational capabilities comparable to conventional recurrent neural networks of 100-500 nodes. This discovery opens up a new paradigm for information processing with artificial intelligence powered by quantum physics.
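    The classical analogue the abstract compares against can be sketched directly: in reservoir computing, a fixed random dynamical system is driven by the input and only a linear readout is trained. The reservoir size, task and scalings below are illustrative choices, not the paper's setup.

```python
import numpy as np

# Classical echo-state-network sketch of the reservoir-computing idea that
# the paper carries over to quantum ensembles: a fixed random recurrent
# system plus a trained linear readout. All settings are illustrative.

rng = np.random.default_rng(0)
N = 100                                          # reservoir size
Win = rng.uniform(-0.5, 0.5, N)                  # fixed input weights
W = rng.normal(0, 1, (N, N))                     # fixed recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius 0.9

u = rng.uniform(0, 0.5, 1200)                    # input stream
target = u[1:-1] * u[:-2]                        # task: y(t) = u(t-1) * u(t-2)

x, states = np.zeros(N), []
for ut in u:
    x = np.tanh(W @ x + Win * ut)                # reservoir dynamics
    states.append(x.copy())
S = np.array(states)[2:]                         # align states with target

ntr = 800                                        # train/test split
Wout, *_ = np.linalg.lstsq(S[:ntr], target[:ntr], rcond=None)
pred = S[ntr:] @ Wout
nmse = np.mean((pred - target[ntr:]) ** 2) / np.var(target[ntr:])
```

    Only `Wout` is learned; the recurrent dynamics are never trained, which is what makes naturally occurring (here random, in the paper quantum) dynamics usable as a computational resource.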

  18. Runway Incursion Prevention System: Demonstration and Testing at the Dallas/Fort Worth International Airport

    NASA Technical Reports Server (NTRS)

    Jones, Denise R.; Quach, Cuong C.; Young, Steven D.

    2007-01-01

    A Runway Incursion Prevention System (RIPS) was tested at the Dallas-Ft. Worth International Airport (DFW) in October 2000. The system integrated airborne and ground components to provide both pilots and controllers with enhanced situational awareness, supplemental guidance cues, a real-time display of traffic information, and warning of runway incursions in order to prevent runway incidents while also improving operational capability. A series of test runs was conducted using NASA's Boeing 757 research aircraft and a test van equipped to emulate an incurring aircraft. The system was also demonstrated to over 100 visitors from the aviation community. This paper gives an overview of the RIPS, DFW flight test activities, and quantitative and qualitative results of the testing.

  19. Automatic maintenance payload on board of a Mexican LEO microsatellite

    NASA Astrophysics Data System (ADS)

    Vicente-Vivas, Esaú; García-Nocetti, Fabián; Mendieta-Jiménez, Francisco

    2006-02-01

    Several research institutions in Mexico are working together to finalize the integration of a technological demonstration microsatellite called Satex, aiming at the launch of the first space vehicle fully designed and manufactured domestically. The project builds on technical knowledge gained in previous space experience, particularly in developing GASCAN automatic experiments for NASA's space shuttle, and on support from the local team that assembled the México-OSCAR-30 microsatellites. Satex includes three autonomous payloads and a power subsystem, each with a local microcomputer providing intelligent, dedicated control. It also contains a flight computer (FC) with a pair of full redundancies, which enables remote maintenance of processing boards from the ground station. A fourth, communications payload depends on the flight computer for control purposes. It was later decided to develop a fifth payload for the satellite. It adds value to the available on-board computers and extends the opportunity for a developing country to learn and to generate domestic space technology. Its aim is to provide automatic maintenance capabilities for the most critical on-board computer in order to achieve continuous satellite operations. This paper presents the virtual computer architecture specially developed to provide maintenance capabilities to the flight computer. The architecture is periodically implemented in software with a small number of physical processors (FC processors) and virtual redundancies (payload processors) to emulate a hybrid-redundancy computer. Communications among processors are accomplished over a fault-tolerant LAN, which allows versatile operating behavior in terms of data communication as well as distributed fault tolerance. Obtained results, payload validation and reliability results are also presented.

  20. Optics in neural computation

    NASA Astrophysics Data System (ADS)

    Levene, Michael John

    In all attempts to emulate the considerable powers of the brain, one is struck by its immense size, parallelism, and complexity. While the fields of neural networks, artificial intelligence, and neuromorphic engineering have all made simplifying approximations to this considerable complexity, all three can benefit from the inherent scalability and parallelism of optics. This thesis looks at specific aspects of three modes in which optics, and particularly volume holography, can play a part in neural computation. First, holography serves as the basis of highly parallel correlators, which are the foundation of optical neural networks. The huge input capability of optical neural networks makes them most useful for image processing and for image recognition and tracking. These tasks benefit from the shift invariance of optical correlators. In this thesis, I analyze the capacity of correlators, and then present several techniques for controlling the amount of shift invariance. Of particular interest is the Fresnel correlator, in which the hologram is displaced from the Fourier plane. In this case, the amount of shift invariance is limited not just by the thickness of the hologram, but by the distance of the hologram from the Fourier plane. Second, volume holography can provide the huge storage capacity and high-speed, parallel read-out necessary to support large artificial intelligence systems. However, previous methods for storing data in volume holograms have relied on awkward beam-steering or on as-yet non-existent cheap, wide-bandwidth, tunable laser sources. This thesis presents a new technique, shift multiplexing, which is capable of very high densities but has the advantage of a very simple implementation. In shift multiplexing, the reference wave consists of a focused spot a few millimeters in front of the hologram. Multiplexing is achieved by simply translating the hologram a few tens of microns or less.
This thesis describes the theory for how shift multiplexing works based on an unconventional, but very intuitive, analysis of the optical far-field. A more detailed analysis based on a path-integral interpretation of the Born approximation is also derived. The capacity of shift multiplexing is compared with that of angle and wavelength multiplexing. The last part of this thesis deals with the role of optics in neuromorphic engineering. Up until now, most neuromorphic engineering has involved one or a few VLSI circuits emulating early sensory systems. However, optical interconnects will be required in order to push towards more ambitious goals, such as the simulation of early visual cortex. I describe a preliminary approach to designing such a system, and show how shift multiplexing can be used to simultaneously store and implement the immense interconnections required by such a project.

  1. The Outline of Personhood Law Regarding Artificial Intelligences and Emulated Human Entities

    NASA Astrophysics Data System (ADS)

    Muzyka, Kamil

    2013-12-01

    On the verge of technological breakthroughs that define and revolutionize our understanding of intelligence, cognition, and personhood, especially with respect to artificial intelligences and mind uploads, one must consider the legal implications of granting personhood rights to artificial intelligences or emulated human entities.

  2. Emulation of the MBM-MEDUSA model: exploring the sea level and the basin-to-shelf transfer influence on the system dynamics

    NASA Astrophysics Data System (ADS)

    Ermakov, Ilya; Crucifix, Michel; Munhoven, Guy

    2013-04-01

    Complex climate models impose a high computational burden; however, this limitation may be avoided by using emulators. In this work we present several approaches for dynamical emulation (also called metamodelling) of the Multi-Box Model (MBM) coupled to the Model of Early Diagenesis in the Upper Sediment A (MEDUSA), which simulates the carbon cycle of the ocean and atmosphere [1]. We consider two experiments performed on the MBM-MEDUSA that explore the Basin-to-Shelf Transfer (BST) dynamics. In both experiments the sea level is varied according to a paleo sea-level reconstruction. Such experiments are interesting because the BST is an important cause of CO2 variation and the dynamics are potentially nonlinear. The output of interest is the variation of the carbon dioxide partial pressure in the atmosphere over the Pleistocene. In the first experiment the BST is held constant during the simulation. In the second experiment the BST is interactively adjusted according to the sea level, since sea level is the primary control on the growth and decay of coral reefs and other shelf carbon reservoirs. The main aim of the present contribution is to create a metamodel of the MBM-MEDUSA using the Dynamic Emulation Modelling methodology [2] and to compare results obtained using linear and nonlinear methods. The first step in the emulation methodology is to identify the structure of the metamodel. To select an optimal approach, we compare identification results obtained with simple linear and more complex nonlinear models. For the first experiment, simple linear regression with the least-squares method is sufficient to obtain a 99.9% fit between the temporal outputs of the model and the metamodel. For the second experiment the MBM's output is highly nonlinear.
    In this case we apply nonlinear models such as NARX, the Hammerstein model, and an 'ad-hoc' switching model. After identification we perform the parameter mapping using spline interpolation and validate the emulator on a new set of parameters. References: [1] G. Munhoven, "Glacial-interglacial rain ratio changes: Implications for atmospheric CO2 and ocean-sediment interaction," Deep-Sea Res Pt II, vol. 54, pp. 722-746, 2007. [2] A. Castelletti et al., "A general framework for Dynamic Emulation Modelling in environmental problems," Environ Modell Softw, vol. 34, pp. 5-18, 2012.
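    The linear identification step described above (fitting a discrete-time input-output model by least squares) can be shown on a synthetic example. The system, forcing and orders here are hypothetical stand-ins for the MBM-MEDUSA records; for a truly linear system the fit is essentially exact, which mirrors the 99.9% fit reported for the first experiment.

```python
import numpy as np

# Minimal example of linear system identification for dynamic emulation:
# fit an ARX(1,1) metamodel y(t) = a*y(t-1) + b*u(t-1) by least squares to
# input-output records from a (here synthetic, linear) simulator.

rng = np.random.default_rng(1)
a_true, b_true = 0.8, 0.5
u = rng.normal(size=500)                 # forcing (e.g. a sea-level record)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = a_true * y[t-1] + b_true * u[t-1]   # "simulator" output

# Regression matrix: one row per time step, columns [y(t-1), u(t-1)]
Phi = np.column_stack([y[:-1], u[:-1]])
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
a_hat, b_hat = theta
fit = 1 - np.sum((Phi @ theta - y[1:]) ** 2) / np.sum((y[1:] - y[1:].mean()) ** 2)
```

    When the underlying response is nonlinear, as in the second experiment, the same regression framework is extended with nonlinear regressors (NARX) or a static nonlinearity in series with a linear block (Hammerstein).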

  3. Emulating Simulations of Cosmic Dawn for 21 cm Power Spectrum Constraints on Cosmology, Reionization, and X-Ray Heating

    NASA Astrophysics Data System (ADS)

    Kern, Nicholas S.; Liu, Adrian; Parsons, Aaron R.; Mesinger, Andrei; Greig, Bradley

    2017-10-01

    Current and upcoming radio interferometric experiments are aiming to make a statistical characterization of the high-redshift 21 cm fluctuation signal spanning the hydrogen reionization and X-ray heating epochs of the universe. However, connecting 21 cm statistics to the underlying physical parameters is complicated by the theoretical challenge of modeling the relevant physics at computational speeds fast enough to enable exploration of the high-dimensional and weakly constrained parameter space. In this work, we use machine learning algorithms to build a fast emulator that can accurately mimic an expensive simulation of the 21 cm signal across a wide parameter space. We embed our emulator within a Markov Chain Monte Carlo framework in order to perform Bayesian parameter constraints over a large number of model parameters, including those that govern the Epoch of Reionization, the Epoch of X-ray Heating, and cosmology. As a worked example, we use our emulator to present an updated parameter constraint forecast for the Hydrogen Epoch of Reionization Array experiment, showing that its characterization of a fiducial 21 cm power spectrum will considerably narrow the allowed parameter space of reionization and heating parameters, and could help strengthen Planck's constraints on σ8. We provide both our generalized emulator code and its implementation specifically for 21 cm parameter constraints as publicly available software.
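    The emulator-in-MCMC pattern used here can be sketched on a one-parameter toy problem: train a cheap surrogate on a handful of "expensive" forward-model runs, then call only the surrogate inside a Metropolis-Hastings loop. The forward model, prior, noise level and chain settings below are all hypothetical.

```python
import numpy as np

# Emulator-embedded MCMC on a toy problem: a polynomial surrogate is fit to
# a few runs of an "expensive" forward model, then used in place of the
# model inside Metropolis-Hastings. All numbers are illustrative.

rng = np.random.default_rng(2)

def forward(theta):                  # stand-in for the expensive simulation
    return theta ** 2 + 0.1 * theta

train_t = np.linspace(0, 3, 10)      # a few simulator runs
coeffs = np.polyfit(train_t, forward(train_t), 3)   # polynomial emulator
emu = lambda t: np.polyval(coeffs, t)

theta_true, sigma = 1.5, 0.05
y_obs = forward(theta_true)          # observation to invert

def log_post(t):                     # flat prior on [0, 3]
    if not 0 <= t <= 3:
        return -np.inf
    return -0.5 * ((y_obs - emu(t)) / sigma) ** 2

chain, t = [], 0.5
lp = log_post(t)
for _ in range(20000):               # Metropolis-Hastings, emulator only
    prop = t + 0.2 * rng.normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        t, lp = prop, lp_prop
    chain.append(t)
post = np.array(chain[5000:])        # discard burn-in
```

    The chain never calls `forward` after training, so its cost is independent of the simulator's; in the paper the same pattern replaces a 21 cm simulation inside a many-parameter sampler.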

  4. OLIO+: an osteopathic medicine database.

    PubMed

    Woods, S E

    1991-01-01

    OLIO+ is a bibliographic database designed to meet the information needs of the osteopathic medical community. Produced by the American Osteopathic Association (AOA), OLIO+ is devoted exclusively to the osteopathic literature. The database is available only by subscription through the AOA and may be accessed from any data terminal with a modem, or from an IBM-compatible personal computer running telecommunications software that can emulate a VT100 or VT220. Apple access is also available, but some assistance from OLIO+ support staff may be necessary to modify the Apple keyboard.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Brian J.; Marcy, Peter W.

    We will investigate the use of derivative information in complex computer model emulation when the correlation function is of the compactly supported Bohman class. To this end, a Gaussian process model similar to that used by Kaufman et al. (2011) is extended to a situation where first partial derivatives in each dimension are calculated at each input site (i.e. using gradients). A simulation study in the ten-dimensional case is conducted to assess the utility of the Bohman correlation function against strictly positive correlation functions when a high degree of sparsity is induced.
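    The compactly supported correlation at the heart of this record is easy to state concretely. The sketch below implements the Bohman correlation (in the standard form given by Gneiting) and shows the sparsity it induces in a covariance matrix; the grid and cutoff range are illustrative, not the study's ten-dimensional design.

```python
import numpy as np

# The compactly supported Bohman correlation: exactly zero beyond a cutoff
# range, so large covariance matrices become sparse. The formula is the
# standard one; the grid and range below are illustrative choices.

def bohman(h, rng=1.0):
    """Bohman correlation for distances h with support [0, rng)."""
    t = np.minimum(np.abs(h) / rng, 1.0)
    return np.where(t < 1.0,
                    (1 - t) * np.cos(np.pi * t) + np.sin(np.pi * t) / np.pi,
                    0.0)

x = np.linspace(0, 5, 101)                            # 1-D input sites
K = bohman(np.abs(x[:, None] - x[None, :]), rng=1.0)  # covariance matrix
sparsity = np.mean(K == 0.0)                          # fraction of exact zeros
```

    The exact zeros (rather than merely small entries, as with strictly positive correlations like the squared exponential) are what permit sparse linear algebra in large Gaussian-process emulation, with or without derivative information.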

  6. Identification of potential compensatory muscle strategies in a breast cancer survivor population: A combined computational and experimental approach.

    PubMed

    Chopp-Hurley, Jaclyn N; Brookham, Rebecca L; Dickerson, Clark R

    2016-12-01

    Biomechanical models are often used to estimate the muscular demands of various activities. However, the specific muscle dysfunctions typical of unique clinical populations are rarely considered. Due to iatrogenic tissue damage, pectoralis major capability is markedly reduced in breast cancer survivors, which could influence muscular strategies for internal and external arm rotation. Accordingly, an optimization-based muscle force prediction model was systematically modified to emulate breast cancer survivors by adjusting pectoralis capability and enforcing an empirical muscular co-activation relationship. Model permutations were evaluated through comparisons between predicted muscle forces and empirically measured muscle activations in survivors. Agreement between empirical data and model outputs was influenced by muscle type, hand force, pectoralis major capability and co-activation constraints. Differences in magnitude were lower when the co-activation constraint was enforced (-18.4% [31.9]) than when it was not (-23.5% [27.6]) (p<0.0001). This research demonstrates that muscle dysfunction in breast cancer survivors can be reflected by including a capability constraint for pectoralis major. Further refinement of the co-activation constraint could improve its generalizability across this population and across activities. Improving biomechanical models to represent clinical populations more accurately can provide novel information to support the development of optimal treatment programs for breast cancer survivors. Copyright © 2016 Elsevier Ltd. All rights reserved.
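    A two-muscle toy version shows how a capability constraint changes an optimization-based force prediction. This is not the paper's shoulder model: the moment arms, strengths and cost function below are hypothetical, chosen only to illustrate load redistribution when one muscle's capability is reduced.

```python
import numpy as np

# Toy capability-constrained force sharing: two muscles must produce a
# required joint moment while minimizing summed squared activation.
# Reducing one muscle's capability (emulating pectoralis major impairment)
# shifts load to the other. All numbers are illustrative.

def share_moment(M, r, Fmax):
    """Minimize sum((f/Fmax)^2) s.t. r @ f = M, 0 <= f <= Fmax (2 muscles)."""
    lam = 2 * M / np.sum(r ** 2 * Fmax ** 2)   # Lagrange multiplier
    f = lam * r * Fmax ** 2 / 2                # unconstrained optimum
    for i in range(2):                         # simple projection if a cap binds
        if f[i] > Fmax[i]:
            f[i] = Fmax[i]
            j = 1 - i
            f[j] = (M - r[i] * f[i]) / r[j]
    return f

r = np.array([0.04, 0.03])                         # moment arms (m), hypothetical
healthy = share_moment(20.0, r, np.array([800.0, 600.0]))
impaired = share_moment(20.0, r, np.array([200.0, 600.0]))  # pec capability cut
```

    Both solutions satisfy the moment requirement, but the impaired case loads the second muscle far more heavily, which is the compensatory-strategy behavior the study probes with its capability and co-activation constraints.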

  7. ASAP progress and expenditure report for the month of December 1--31, 1995. Joint UK/US radar program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Twogood, R.E.; Brase, J.M.; Chambers, D.H.

    1996-01-19

    The RAR/SAR is a high-priority radar system for the joint US/UK program. Based on previous experiment results and coordination with the UK, specifications needed for future radar experiments were identified as follows: dual polarimetric (HH and VV) with medium to high resolution in SAR mode. Secondary airborne installation requirements included: high power (circa 10 kW) and SLIER capability to emulate a Tupolev-134 type system; initially X-band but easily extendible to other frequencies. In FY96 we intended to enhance the radar system's capabilities by providing a second polarization (VV), a spotlight imaging mode, extended frequency of operation to include S-band, increased power, and an interface to an existing infrared sensor. Short-term objectives are: continue to evaluate and characterize the radar system; upgrade navigation and real-time processing capability to refine motion compensation; upgrade to dual polarimetry (add VV); and develop a "spotlight" mode capability. Accomplishments this reporting period: design specifications for the SAR system polarimetric upgrade are complete. The upgrade is ready to begin the procurement cycle when funds become available. System characterization is one of the highest-priority tasks for the SAR. Although the radar is dedicated for our use, Hughes is waiting for contract funding before allowing us access to the hardware.

  8. Applicability of aquifer impact models to support decisions at CO 2 sequestration sites

    DOE PAGES

    Keating, Elizabeth; Bacon, Diana; Carroll, Susan; ...

    2016-07-25

    The National Risk Assessment Partnership has developed a suite of tools to assess and manage risk at CO2 sequestration sites. This capability includes polynomial or look-up-table-based reduced-order models (ROMs) that predict the impact of CO2 and brine leaks on overlying aquifers. The development of these computationally efficient models and the underlying reactive transport simulations they emulate has been documented elsewhere (Carroll et al., 2014a; Carroll et al., 2014b; Dai et al., 2014; Keating et al., 2016). In this paper, we seek to demonstrate the applicability of ROM-based analysis by considering which types of decisions and aquifer types would benefit from it. We present four hypothetical examples where applying ROMs in ensemble mode could support decisions during a geologic CO2 sequestration project. These decisions pertain to site selection, site characterization, monitoring network evaluation, and health impacts. In all cases, we consider potential brine/CO2 leak rates at the base of the aquifer to be uncertain. We show that the derived probabilities provide information relevant to the decision at hand. Although the ROMs were developed using site-specific data from two aquifers (High Plains and Edwards), the models accept aquifer characteristics as variable inputs and so may have broader applicability. Based on an analysis of nine water-quality metrics (pH, TDS, 4 trace metals, 3 organic compounds), we conclude that the pH and TDS predictions are the most transferable to other aquifers. Guidelines are presented for determining the aquifer types for which the ROMs should be applicable.
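    The "ensemble mode" use of a ROM can be sketched in a few lines: sample the uncertain leak rate, push each sample through the cheap surrogate, and report an exceedance probability. The polynomial ROM, the leak-rate distribution and the threshold below are hypothetical placeholders, not NRAP models or regulatory values.

```python
import numpy as np

# Ensemble-mode use of a reduced-order model: propagate an uncertain
# brine/CO2 leak rate through a cheap ROM for aquifer pH depression and
# report the probability of crossing a no-impact threshold. The ROM
# coefficients, distribution and threshold are hypothetical placeholders.

rng = np.random.default_rng(3)

def rom_ph_drop(leak_rate):
    """Hypothetical polynomial ROM: pH depression vs. leak rate (kg/s)."""
    return 0.8 * leak_rate + 0.3 * leak_rate ** 2

leaks = rng.lognormal(mean=-1.0, sigma=0.8, size=100_000)  # uncertain leak rate
ph_drop = rom_ph_drop(leaks)
p_exceed = np.mean(ph_drop > 0.5)       # P(pH change beyond threshold)
```

    Because each ROM evaluation is trivially cheap, hundreds of thousands of samples are affordable, turning an uncertain leak scenario into a single decision-relevant probability (e.g. for site selection or monitoring design).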

  9. High speed fault tolerant secure communication for muon chamber using FPGA based GBTx emulator

    NASA Astrophysics Data System (ADS)

    Sau, Suman; Mandal, Swagata; Saini, Jogender; Chakrabarti, Amlan; Chattopadhyay, Subhasis

    2015-12-01

    The Compressed Baryonic Matter (CBM) experiment is a part of the Facility for Antiproton and Ion Research (FAIR) in Darmstadt at the GSI. The CBM experiment will investigate highly compressed nuclear matter using nucleus-nucleus collisions. This experiment will examine heavy-ion collisions in fixed-target geometry and will be able to measure hadrons, electrons and muons. CBM requires precise time synchronization, compact hardware, radiation tolerance, self-triggered front-end electronics, efficient data aggregation schemes and the capability to handle high data rates (up to several TB/s). As part of the implementation of the readout chain of the Muon Chamber (MUCH) [1] in India, we have implemented an FPGA-based emulator of the GBTx. GBTx is a radiation-tolerant ASIC, developed by CERN, that implements multipurpose high-speed bidirectional optical links for high-energy physics (HEP) experiments. GBTx will be used in highly irradiated areas and is therefore prone to multi-bit errors. To mitigate this effect, instead of a single-bit-error-correcting RS code we have used a two-bit-error-correcting (15, 7) BCH code. This increases the redundancy, which in turn increases the reliability of the coded data, making it less susceptible to radiation-induced noise. The data travel from the detector to the PC through multiple nodes in the communication channel. The computing resources are connected to a network, and unauthorized data access could occur if network security were compromised, so data encryption is essential. To make the data communication secure, the Advanced Encryption Standard [2] (AES, a symmetric-key cipher) and RSA [3], [4] (asymmetric-key cryptography) are applied after the channel coding. We have implemented the GBTx emulator on two Xilinx Kintex-7 boards (KC705).
    One board acts as the transmitter and the other as the receiver; they are connected by optical fiber through small form-factor pluggable (SFP) ports. We tested the setup in the runtime environment using the Xilinx ChipScope Pro Analyzer. We also measured the resource utilization, throughput, and power consumption of the implemented design.
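    The (15, 7) double-error-correcting BCH code mentioned above can be illustrated with a software encoder. The generator polynomial g(x) = x^8 + x^7 + x^6 + x^4 + 1 is the standard one for this code (the LCM of the minimal polynomials of α and α^3 over GF(16)); the sketch covers systematic encoding only, and the actual implementation in the paper is in FPGA logic, not Python.

```python
# Systematic encoder for the double-error-correcting (15, 7) binary BCH
# code. Polynomials over GF(2) are represented as integer bit masks.
# Encoding only; syndrome decoding is omitted from this sketch.

G = 0b111010001          # g(x) = x^8 + x^7 + x^6 + x^4 + 1, degree 8

def poly_mod(dividend, divisor):
    """Remainder of GF(2) polynomial division (polynomials as bit masks)."""
    dlen = divisor.bit_length()
    while dividend.bit_length() >= dlen:
        shift = dividend.bit_length() - dlen
        dividend ^= divisor << shift   # subtract (XOR) a shifted divisor
    return dividend

def bch_encode(msg7):
    """7-bit message -> 15-bit systematic codeword m(x)*x^8 + parity."""
    assert 0 <= msg7 < 128
    shifted = msg7 << 8                # m(x) * x^8
    return shifted | poly_mod(shifted, G)

cw = bch_encode(0b1011001)
assert poly_mod(cw, G) == 0            # every codeword is divisible by g(x)
```

    Appending 8 parity bits to 7 message bits is the redundancy the abstract refers to: the resulting code has minimum distance 5, so any two bit flips per 15-bit word are correctable, unlike a single-error-correcting alternative.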

  10. Continuous-variable quantum authentication of physical unclonable keys: Security against an emulation attack

    NASA Astrophysics Data System (ADS)

    Nikolopoulos, Georgios M.

    2018-01-01

    We consider a recently proposed entity authentication protocol in which a physical unclonable key is interrogated by random coherent states of light, and the quadratures of the scattered light are analyzed by means of coarse-grained homodyne detection. We derive a sufficient condition for the protocol to be secure against an emulation attack in which an adversary knows the challenge-response properties of the key and, moreover, can access the challenges during verification. The security analysis relies on Holevo's bound and Fano's inequality, and suggests that the protocol is secure against the emulation attack for a broad range of physical parameters that are within reach of today's technology.

  11. Preparing the Direct Broadcast Community for GOES-R

    NASA Astrophysics Data System (ADS)

    Dubey, K. F.; Baptiste, E.; Prasad, K.; Shin, H.

    2012-12-01

    The first satellite in the United States' next-generation weather satellite program, GOES-R, will be launched in 2015. SeaSpace Corporation is using recent experience and lessons learned from bringing Suomi NPP-capable direct-reception systems online to similarly bring direct-reception solutions to future GOES-R users. This includes earlier outreach to customers, owing to the advance budgeting deadlines for procurement in many agencies. With the cancellation of eGRB, all current GOES GVAR customers will need a new direct-readout system, with a new receiver, a high-powered processing subsystem, and, at some locations, a larger antenna. SeaSpace's preparations have also included communicating with program leaders at NOAA and NASA regarding direct-readout specifications and the development of the borrowing process for the government-procured GRB emulator. At the request of NASA, SeaSpace has offered input on the emulator check-out process, which is expected to begin in spring 2013. After the launch of Suomi NPP, SeaSpace found a need among non-traditional customers (such as those with non-SeaSpace ground stations or those getting data via the NOAA archive) for a processing-only subsystem. In response, SeaSpace developed such a solution for Suomi NPP users and plans to do the same for GOES-R. This presentation will cover the steps SeaSpace is taking to prepare members of the direct-reception community for reception and processing of GOES-R satellite data, and will detail the solutions offered.

  12. Reconfigurable HIL Testing of Earth Satellites

    NASA Technical Reports Server (NTRS)

    2008-01-01

    In recent years, hardware-in-the-loop (HIL) testing has carved a strong niche in several industries, such as automotive, aerospace, telecom, and consumer electronics. As desktop computers have realized gains in speed, memory size, and data storage capacity, hardware/software platforms have evolved into high-performance, deterministic HIL platforms capable of hosting the most demanding applications for testing components and subsystems. Using simulation software to emulate the digital and analog I/O signals of system components, engineers of all disciplines can now test new systems in realistic environments to evaluate their function and performance prior to field deployment. Within the aerospace industry, space-borne satellite systems are arguably some of the most demanding in terms of their requirement for custom engineering and testing. Typically, spacecraft are built one or a few at a time to fulfill a space science or defense mission. In contrast to other industries that can amortize the cost of HIL systems over thousands, even millions, of units, spacecraft HIL systems have been built as one-of-a-kind solutions, expensive in terms of schedule, cost, and risk, to assure satellite and spacecraft system reliability. The focus of this paper is to present a new approach to HIL testing for spacecraft systems that takes advantage of a highly flexible hardware/software architecture based on National Instruments PXI reconfigurable hardware and virtual instruments developed using LabVIEW. This new approach is based on a multistage/multimode spacecraft bus emulation development model called Reconfigurable Hardware-In-the-Loop, or RHIL.

  13. EDT mode for JED -- An advanced Unix text editor

    NASA Astrophysics Data System (ADS)

    McIlwrath, B. K.; Page, C. G.

    This note describes Starlink extended EDT emulation for the JED editor. It provides a Unix text editor which can utilise the advanced facilities of DEC VTn00, xterm and similar terminals. JED in this mode provides a reasonably good emulation of the VAX/VMS editor EDT in addition to many extra facilities.

  14. Using Texas Instruments Emulators as Teaching Tools in Quantitative Chemical Analysis

    ERIC Educational Resources Information Center

    Young, Vaneica Y.

    2011-01-01

    This technology report alerts upper-division undergraduate chemistry faculty and lecturers to the use of Texas Instruments emulators as virtual graphing calculators. These may be used in multimedia lectures to instruct students on the use of their graphing calculators to obtain solutions to complex chemical problems. (Contains 1 figure.)

  15. System analysis for the Huntsville Operation Support Center distributed computer system

    NASA Technical Reports Server (NTRS)

    Ingels, F. M.

    1986-01-01

    A simulation model of the NASA Huntsville Operational Support Center (HOSC) was developed. This simulation model emulates the HYPERchannel Local Area Network (LAN) that ties together the various computers of HOSC. The HOSC system is a large installation of mainframe computers such as the Perkin Elmer 3200 series and the DEC VAX series. A series of six simulation exercises of the HOSC model is described using data sets provided by NASA. An analysis of the Ethernet LAN and the video terminal (VT) distribution system is presented. An interface analysis of the smart terminal network model, which allows the data flow requirements due to VTs on the Ethernet LAN to be estimated, is also presented.

  16. Utility of an emulation and simulation computer model for air revitalization system hardware design, development, and test

    NASA Technical Reports Server (NTRS)

    Yanosy, J. L.; Rowell, L. F.

    1985-01-01

    Efforts to make increasing use of suitable computer programs in the design of hardware have the potential to reduce expenditures. In this context, NASA has evaluated the benefits provided by software tools through an application to the Environmental Control and Life Support (ECLS) system. The present paper is concerned with the benefits obtained by an employment of simulation tools in the case of the Air Revitalization System (ARS) of a Space Station life support system. Attention is given to the ARS functions and components, a computer program overview, a SAND (solid amine water desorbed) bed model description, a model validation, and details regarding the simulation benefits.

  17. What attributions do Australian high-performing general practices make for their success? Applying the clinical microsystems framework: a qualitative study.

    PubMed

    Dunham, Annette H; Dunbar, James A; Johnson, Julie K; Fuller, Jeff; Morgan, Mark; Ford, Dale

    2018-04-10

    To identify the success attributions of high-performing Australian general practices and the enablers and barriers they envisage for practices wishing to emulate them. Qualitative study using semi-structured interviews and content analysis of the data. Responses were recorded, transcribed verbatim and coded according to success characteristics of high-performing clinical microsystems. Primary healthcare with the participating general practices representing all Australian states and territories, and representing metropolitan and rural locations. Twenty-two general practices identified as high performing via a number of success criteria. The 52 participants were 19 general practitioners, 18 practice managers and 15 practice nurses. Participants most frequently attributed success to the interdependence of the team members, patient-focused care and leadership of the practice. They most often signalled practice leadership, team interdependence and staff focus as enablers that other organisations would need to emulate their success. They most frequently identified barriers that might be encountered in the form of potential deficits or limitations in practice leadership, staff focus and mesosystem support. Practice leaders need to empower their teams to take action through providing inclusive leadership that facilitates team interdependence. Mesosystem support for quality improvement in general practice should focus on enabling this leadership and team building, thereby ensuring improvement efforts are converted into effective healthcare provision. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  18. Feasibility and reliability of using an exoskeleton to emulate muscle contractures during walking.

    PubMed

    Attias, M; Bonnefoy-Mazure, A; De Coulon, G; Cheze, L; Armand, S

    2016-10-01

    Contracture is a permanent shortening of the muscle-tendon-ligament complex that limits joint mobility. Contracture is involved in many diseases (cerebral palsy, stroke, etc.) and can impair walking and other activities of daily living. The purpose of this study was to quantify the reliability of an exoskeleton designed to emulate lower limb muscle contractures unilaterally and bilaterally during walking. An exoskeleton was built according to the following design criteria: adjustable to different morphologies; respect of the principal lines of muscular actions; placement of reflective markers on anatomical landmarks; and the ability to replicate the contractures of eight muscles of the lower limb unilaterally and bilaterally (psoas, rectus femoris, hamstring, hip adductors, gastrocnemius, soleus, tibialis posterior, and peroneus). Sixteen combinations of contractures were emulated on the unilateral and bilateral muscles of nine healthy participants. Two sessions of gait analysis were performed at weekly intervals to assess the reliability of the emulated contractures. Discrete variables were extracted from the kinematics to analyse the reliability. The exoskeleton did not affect normal walking when contractures were not emulated. Kinematic reliability varied from poor to excellent depending on the targeted muscle. Reliability was good for the bilateral and unilateral gastrocnemius, soleus, and tibialis posterior as well as the bilateral hamstring and unilateral hip adductors. The exoskeleton can be used to replicate contracture on healthy participants. The exoskeleton will allow us to differentiate primary and compensatory effects of muscle contractures on gait kinematics. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Fast and Precise Emulation of Stochastic Biochemical Reaction Networks With Amplified Thermal Noise in Silicon Chips.

    PubMed

    Kim, Jaewook; Woo, Sung Sik; Sarpeshkar, Rahul

    2018-04-01

    The analysis and simulation of complex interacting biochemical reaction pathways in cells is important in all of systems biology and medicine. Yet, the dynamics of even a modest number of noisy or stochastic coupled biochemical reactions is extremely time-consuming to simulate. In large part, this is because of the high cost of random number and Poisson process generation and the presence of stiff, coupled, nonlinear differential equations. Here, we demonstrate that we can amplify inherent thermal noise in chips to emulate randomness physically, thus alleviating these costs significantly. Concurrently, molecular flux in thermodynamic biochemical reactions maps to thermodynamic electronic current in a transistor such that stiff nonlinear biochemical differential equations are emulated exactly in compact, digitally programmable, highly parallel analog "cytomorphic" transistor circuits. For even small-scale systems involving just 80 stochastic reactions, our 0.35-μm BiCMOS chips yield a 311× speedup in the simulation time of Gillespie's stochastic algorithm over COPASI, a fast biochemical-reaction software simulator that is widely used in computational biology; they yield a 15 500× speedup over equivalent MATLAB stochastic simulations. The chip emulation results are consistent with these software simulations over a large range of signal-to-noise ratios. Most importantly, our physical emulation of Poisson chemical dynamics does not involve any inherently sequential processes and updates such that, unlike prior exact simulation approaches, it is parallelizable, asynchronous, and enables even more speedup for larger-size networks.
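    The software baseline these chips accelerate is Gillespie's direct-method stochastic simulation algorithm. A minimal pure-Python sketch of its propensity/waiting-time/update loop is below; the reversible reaction pair and rate constants are hypothetical, chosen only to make the loop concrete:

```python
import random

def gillespie_ab(a, b, k1, k2, t_end, seed=1):
    """Gillespie direct method for the reversible pair A->B (propensity
    k1*A) and B->A (propensity k2*B). Returns the counts at t_end."""
    rng = random.Random(seed)
    t = 0.0
    while True:
        p1, p2 = k1 * a, k2 * b          # mass-action propensities
        total = p1 + p2
        if total == 0.0:                 # no reaction can fire
            return a, b
        t += rng.expovariate(total)      # exponential waiting time
        if t > t_end:
            return a, b
        if rng.random() * total < p1:    # pick a reaction by propensity
            a, b = a - 1, b + 1
        else:
            a, b = a + 1, b - 1

a, b = gillespie_ab(100, 0, k1=1.0, k2=1.0, t_end=50.0)
print(a + b)  # 100: the total molecule count is conserved
```

    The expensive steps the paper targets are visible here: every firing draws two random numbers, which is exactly the cost the chip replaces with amplified thermal noise.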

  20. An aerosol activation metamodel of v1.2.0 of the pyrcel cloud parcel model: development and offline assessment for use in an aerosol–climate model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rothenberg, Daniel; Wang, Chien

    We describe an emulator of a detailed cloud parcel model which has been trained to assess droplet nucleation from a complex, multimodal aerosol size distribution simulated by a global aerosol–climate model. The emulator is constructed using a sensitivity analysis approach (polynomial chaos expansion) which reproduces the behavior of the targeted parcel model across the full range of aerosol properties and meteorology simulated by the parent climate model. An iterative technique using aerosol fields sampled from a global model is used to identify the critical aerosol size distribution parameters necessary for accurately predicting activation. Across the large parameter space used to train them, the emulators estimate cloud droplet number concentration (CDNC) with a mean relative error of 9.2% for aerosol populations without giant cloud condensation nuclei (CCN) and 6.9% when including them. Versus a parcel model driven by those same aerosol fields, the best-performing emulator has a mean relative error of 4.6%, which is comparable with two commonly used activation schemes also evaluated here (which have mean relative errors of 2.9% and 6.7%, respectively). We identify the potential for regional biases in modeled CDNC, particularly in oceanic regimes, where our best-performing emulator tends to overpredict by 7%, whereas the reference activation schemes range in mean relative error from -3% to 7%. The emulators which include the effects of giant CCN are more accurate in continental regimes (mean relative error of 0.3%) but strongly overestimate CDNC in oceanic regimes by up to 22%, particularly in the Southern Ocean. Finally, the biases in CDNC resulting from the subjective choice of activation scheme could potentially influence the magnitude of the indirect effect diagnosed from the model incorporating it.

  1. An aerosol activation metamodel of v1.2.0 of the pyrcel cloud parcel model: development and offline assessment for use in an aerosol–climate model

    DOE PAGES

    Rothenberg, Daniel; Wang, Chien

    2017-04-27

    We describe an emulator of a detailed cloud parcel model which has been trained to assess droplet nucleation from a complex, multimodal aerosol size distribution simulated by a global aerosol–climate model. The emulator is constructed using a sensitivity analysis approach (polynomial chaos expansion) which reproduces the behavior of the targeted parcel model across the full range of aerosol properties and meteorology simulated by the parent climate model. An iterative technique using aerosol fields sampled from a global model is used to identify the critical aerosol size distribution parameters necessary for accurately predicting activation. Across the large parameter space used to train them, the emulators estimate cloud droplet number concentration (CDNC) with a mean relative error of 9.2% for aerosol populations without giant cloud condensation nuclei (CCN) and 6.9% when including them. Versus a parcel model driven by those same aerosol fields, the best-performing emulator has a mean relative error of 4.6%, which is comparable with two commonly used activation schemes also evaluated here (which have mean relative errors of 2.9% and 6.7%, respectively). We identify the potential for regional biases in modeled CDNC, particularly in oceanic regimes, where our best-performing emulator tends to overpredict by 7%, whereas the reference activation schemes range in mean relative error from -3% to 7%. The emulators which include the effects of giant CCN are more accurate in continental regimes (mean relative error of 0.3%) but strongly overestimate CDNC in oceanic regimes by up to 22%, particularly in the Southern Ocean. Finally, the biases in CDNC resulting from the subjective choice of activation scheme could potentially influence the magnitude of the indirect effect diagnosed from the model incorporating it.
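    The non-intrusive regression route to a polynomial chaos emulator can be illustrated with a deliberately tiny sketch: one uniform input on [-1, 1], a three-term Legendre basis, and a hypothetical x² stand-in for the expensive parcel model (the actual emulator is multivariate, over aerosol and meteorology parameters, and uses a higher-order expansion):

```python
def legendre(x):
    # first three Legendre polynomials: the PCE basis for a uniform input
    return [1.0, x, 0.5 * (3.0 * x * x - 1.0)]

def fit_pce(xs, ys):
    """Least-squares fit of PCE coefficients from training samples
    (the non-intrusive 'regression' construction). Solves the 3x3
    normal equations by Gaussian elimination with partial pivoting."""
    n = 3
    A = [[0.0] * n for _ in range(n)]
    b = [0.0] * n
    for x, y in zip(xs, ys):
        phi = legendre(x)
        for i in range(n):
            b[i] += phi[i] * y
            for j in range(n):
                A[i][j] += phi[i] * phi[j]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for j in range(col, n):
                A[r][j] -= f * A[col][j]
            b[r] -= f * b[col]
    c = [0.0] * n
    for i in range(n - 1, -1, -1):
        c[i] = (b[i] - sum(A[i][j] * c[j] for j in range(i + 1, n))) / A[i][i]
    return c

def emulate(c, x):
    # evaluate the trained surrogate: a weighted sum of basis polynomials
    return sum(ci * pi for ci, pi in zip(c, legendre(x)))

# train on a coarse grid of 'expensive model' runs, here f(x) = x**2
xs = [-1.0 + 0.25 * i for i in range(9)]
c = fit_pce(xs, [x * x for x in xs])
print(round(emulate(c, 0.5), 6))  # 0.25: a degree-2 target is captured exactly
```

    Once the coefficients are fitted, every emulator call is a cheap polynomial evaluation, which is what makes the approach attractive inside a climate model.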

  2. A radiation-hardened computer for satellite applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaona, J.I. Jr.

    1996-08-01

    This paper describes high reliability radiation hardened computers built by Sandia for application aboard DOE satellite programs requiring 32-bit processing. The computers highlight a radiation-hardened (10 kGy(Si)) R3000 executing up to 10 million reduced instruction set computer (RISC) instructions per second (MIPS), a dual-purpose module control bus used for real-time fault and power management which allows for extended mission operation on as little as 1.2 watts, and a local area network capable of 480 Mbits/s. The central processing unit (CPU) is the NASA Goddard R3000, nicknamed the "Mongoose" or "Mongoose 1". The Sandia Satellite Computer (SSC) uses Rational's Ada compiler, debugger, operating system kernel, and enhanced floating point emulation library targeted at the Mongoose. The SSC gives Sandia the capability of processing complex types of spacecraft attitude determination and control algorithms and of modifying programmed control laws via ground command. In general, the SSC offers end users the ability to process data onboard the spacecraft that would normally have been sent to the ground, which allows reconsideration of traditional space-ground partitioning options.

  3. Microswitch and Keyboard-Emulator Technology to Facilitate the Writing Performance of Persons with Extensive Motor Disabilities

    ERIC Educational Resources Information Center

    Lancioni, Giulio E.; Singh, Nirbhay N.; O'Reilly, Mark F.; Sigafoos, Jeff; Green, Vanessa; Oliva, Doretta; Lang, Russell

    2011-01-01

    This study assessed the effectiveness of microswitches for simple responses (i.e., partial hand closure, vocalization, and hand stroking) and a keyboard emulator to facilitate the writing performance of three participants with extensive motor disabilities. The study was carried out according to an ABAB design. During the A phases, the participants…

  4. Radiation-hardened optically reconfigurable gate array exploiting holographic memory characteristics

    NASA Astrophysics Data System (ADS)

    Seto, Daisaku; Watanabe, Minoru

    2015-09-01

    In this paper, we present a proposal for a radiation-hardened optically reconfigurable gate array (ORGA). The ORGA is a type of field programmable gate array (FPGA). The ORGA configuration can be executed by the exploitation of holographic memory characteristics even if 20% of the configuration data are damaged. Moreover, the optoelectronic technology enables the high-speed reconfiguration of the programmable gate array. Such a high-speed reconfiguration can increase the radiation tolerance of its programmable gate array to 9.3 × 10⁴ times higher than that of current FPGAs. Through experimentation, this study clarified the configuration dependability using the impulse-noise emulation and high-speed configuration capabilities of the ORGA with corrupt configuration contexts. Moreover, the radiation tolerance of the programmable gate array was confirmed theoretically through probabilistic calculation.

  5. Engineering simulation development and evaluation of the two-segment noise abatement approach conducted in the B-727-222 flight simulator

    NASA Technical Reports Server (NTRS)

    Nylen, W. E.

    1974-01-01

    Profile modification as a means of reducing ground level noise from jet aircraft in the landing approach is evaluated. A flight simulator was modified to incorporate the cockpit hardware which would be in the prototype airplane installation. The two-segment system operational and aircraft interface logic was accurately emulated in software. Programs were developed to permit data to be recorded in real time on the line printer, a 14-channel oscillograph, and an x-y plotter. The two-segment profile and procedures which were developed are described with emphasis on operational concepts and constraints. The two-segment system operational logic and the flight simulator capabilities are described. The findings influenced the ultimate system design and aircraft interface.

  6. Laser optical appraisal and design of a PRIME/Rover interface

    NASA Technical Reports Server (NTRS)

    Donaldson, J. A.

    1980-01-01

    An appraisal of whether to improve the existing multi-laser/multi-detector system was undertaken. Two features of the elevation scanning mast which prevent the system from meeting desired specifications were studied. The elevation scanning mast has 20 detectors, as opposed to the desired 40. This influences the system's overall resolution. The mirror shaft encoder's finite resolution prevents the laser from being aimed exactly as desired. This influences the system's overall accuracy. It was concluded that the existing system needs no modification at present. The design and construction of a data emulator which allowed testing data transactions with a PRIME computer is described, and its theory of operation is briefly discussed. A full-blown PRIME/Rover Interface was designed and built. The capabilities of this Interface and its operating principles are discussed.

  7. A multi-modal stereo microscope based on a spatial light modulator.

    PubMed

    Lee, M P; Gibson, G M; Bowman, R; Bernet, S; Ritsch-Marte, M; Phillips, D B; Padgett, M J

    2013-07-15

    Spatial Light Modulators (SLMs) can emulate the classic microscopy techniques, including differential interference (DIC) contrast and (spiral) phase contrast. Their programmability entails the benefit of flexibility or the option to multiplex images, for single-shot quantitative imaging or for simultaneous multi-plane imaging (depth-of-field multiplexing). We report the development of a microscope sharing many of the previously demonstrated capabilities, within a holographic implementation of a stereo microscope. Furthermore, we use the SLM to combine stereo microscopy with a refocusing filter and with a darkfield filter. The instrument is built around a custom inverted microscope and equipped with an SLM which gives various imaging modes laterally displaced on the same camera chip. In addition, there is a wide angle camera for visualisation of a larger region of the sample.

  8. Formation of an internal model of environment dynamics during upper limb reaching movements: a fuzzy approach.

    PubMed

    MacDonald, Chad; Moussavi, Zahra; Sarkodie-Gyan, Thompson

    2007-01-01

    This paper presents the development and simulation of a fuzzy logic based learning mechanism to emulate human motor learning. In particular, fuzzy inference was used to develop an internal model of a novel dynamic environment experienced during planar reaching movements with the upper limb. A dynamic model of the human arm was developed and a fuzzy if-then rule base was created to relate trajectory movement and velocity errors to internal model update parameters. An experimental simulation was performed to compare the fuzzy system's performance with that of human subjects. It was found that the dynamic model behaved as expected, and the fuzzy learning mechanism created an internal model that was capable of opposing the environmental force field to regain a trajectory closely resembling the desired ideal.
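    The fuzzy if-then mechanism described above can be sketched with triangular membership functions and a weighted-average defuzzification. The breakpoints and consequent gains below are entirely hypothetical (the paper's actual rule base relates both trajectory and velocity errors to internal-model update parameters):

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b,
    falling to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_update_gain(err):
    """Map a trajectory error (m) to an internal-model update gain via
    three rules (small/medium/large error), defuzzified by a weighted
    average of the rule consequents. All constants are illustrative."""
    mu_small = tri(err, -0.01, 0.0, 0.02)
    mu_med = tri(err, 0.0, 0.02, 0.04)
    mu_large = tri(err, 0.02, 0.04, 0.08)
    gains = (0.1, 0.5, 0.9)  # hypothetical rule consequents
    num = mu_small * gains[0] + mu_med * gains[1] + mu_large * gains[2]
    den = mu_small + mu_med + mu_large
    return num / den if den else 0.0

print(round(fuzzy_update_gain(0.02), 2))  # 0.5: fully a 'medium' error
```

    After each reach, the returned gain would scale the correction applied to the learner's internal model of the force field, mimicking trial-by-trial motor adaptation.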

  9. A multi-stimuli responsive switch as a fluorescent molecular analogue of transistors† †Electronic supplementary information (ESI) available: Detailed experimental procedures and additional data on the characterization of 1. See DOI: 10.1039/c5sc03395k

    PubMed Central

    Gallardo, Iluminada; Morais, Sandy; Prats, Gemma

    2016-01-01

    Although the quantum nature of molecules makes them specially suitable for mimicking the operation of digital electronic elements, molecular compounds can also be envisioned to emulate the behavior of analog devices. In this work we report a novel fluorescent three-state switch capable of reproducing the analog response of transistors, a ubiquitous device in modern electronics. Exploiting the redox and thermal sensitivity of this compound, the amplitude of its fluorescence emission can be continuously modulated, in a similar way as the output current in a transistor is amplified by the gate-to-source voltage. PMID:28959394

  10. A reconfigurable on-line learning spiking neuromorphic processor comprising 256 neurons and 128K synapses.

    PubMed

    Qiao, Ning; Mostafa, Hesham; Corradi, Federico; Osswald, Marc; Stefanini, Fabio; Sumislawska, Dora; Indiveri, Giacomo

    2015-01-01

    Implementing compact, low-power artificial neural processing systems with real-time on-line learning abilities is still an open challenge. In this paper we present a full-custom mixed-signal VLSI device with neuromorphic learning circuits that emulate the biophysics of real spiking neurons and dynamic synapses for exploring the properties of computational neuroscience models and for building brain-inspired computing systems. The proposed architecture allows the on-chip configuration of a wide range of network connectivities, including recurrent and deep networks, with short-term and long-term plasticity. The device comprises 128K analog synapse and 256 neuron circuits with biologically plausible dynamics and bi-stable spike-based plasticity mechanisms that endow it with on-line learning abilities. In addition to the analog circuits, the device also comprises asynchronous digital logic circuits for setting different synapse and neuron properties as well as different network configurations. This prototype device, fabricated using a 180 nm 1P6M CMOS process, occupies an area of 51.4 mm², and consumes approximately 4 mW for typical experiments, for example involving attractor networks. Here we describe the details of the overall architecture and of the individual circuits and present experimental results that showcase its potential. By supporting a wide range of cortical-like computational modules comprising plasticity mechanisms, this device will enable the realization of intelligent autonomous systems with on-line learning capabilities.

  11. Intelligent Launch and Range Operations Virtual Test Bed (ILRO-VTB)

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge; Rajkumar, T.

    2003-01-01

    Intelligent Launch and Range Operations Virtual Test Bed (ILRO-VTB) is a real-time web-based command and control, communication, and intelligent simulation environment of ground-vehicle, launch and range operation activities. ILRO-VTB consists of a variety of simulation models combined with commercial and indigenous software developments (NASA Ames). It creates a hybrid software/hardware environment suitable for testing various integrated control system components of launch and range. The dynamic interactions of the integrated simulated control systems are not well understood. Insight into such systems can only be achieved through simulation/emulation. For that reason, NASA has established a VTB where we can learn the actual control and dynamics of designs for future space programs, including testing and performance evaluation. The current implementation of the VTB simulates the operations of a sub-orbital vehicle mission: control, ground-vehicle engineering, and launch and range operations. The present development of the test bed simulates the operations of the Space Shuttle Vehicle (SSV) at NASA Kennedy Space Center. The test bed supports a wide variety of shuttle missions with ancillary modeling capabilities such as weather forecasting, a lightning tracker, a toxic gas dispersion model, a debris dispersion model, telemetry, trajectory modeling, ground operations, and payload models. To achieve the simulations, all models are linked using the Common Object Request Broker Architecture (CORBA). The test bed provides opportunities for government, universities, researchers and industries to conduct a real-time shuttle launch in cyberspace.

  12. Intelligent launch and range operations virtual testbed (ILRO-VTB)

    NASA Astrophysics Data System (ADS)

    Bardina, Jorge; Rajkumar, Thirumalainambi

    2003-09-01

    Intelligent Launch and Range Operations Virtual Test Bed (ILRO-VTB) is a real-time web-based command and control, communication, and intelligent simulation environment of ground-vehicle, launch and range operation activities. ILRO-VTB consists of a variety of simulation models combined with commercial and indigenous software developments (NASA Ames). It creates a hybrid software/hardware environment suitable for testing various integrated control system components of launch and range. The dynamic interactions of the integrated simulated control systems are not well understood. Insight into such systems can only be achieved through simulation/emulation. For that reason, NASA has established a VTB where we can learn the actual control and dynamics of designs for future space programs, including testing and performance evaluation. The current implementation of the VTB simulates the operations of a sub-orbital vehicle mission: control, ground-vehicle engineering, and launch and range operations. The present development of the test bed simulates the operations of the Space Shuttle Vehicle (SSV) at NASA Kennedy Space Center. The test bed supports a wide variety of shuttle missions with ancillary modeling capabilities such as weather forecasting, a lightning tracker, a toxic gas dispersion model, a debris dispersion model, telemetry, trajectory modeling, ground operations, and payload models. To achieve the simulations, all models are linked using the Common Object Request Broker Architecture (CORBA). The test bed provides opportunities for government, universities, researchers and industries to conduct a real-time shuttle launch in cyberspace.

  13. Yielding physically-interpretable emulators - A Sparse PCA approach

    NASA Astrophysics Data System (ADS)

    Galelli, S.; Alsahaf, A.; Giuliani, M.; Castelletti, A.

    2015-12-01

    Projection-based techniques, such as Proper Orthogonal Decomposition (POD), are a common approach to surrogate high-fidelity process-based models by lower order dynamic emulators. With POD, the dimensionality reduction is achieved by using observations, or 'snapshots', generated with the high-fidelity model, to project the entire set of input and state variables of this model onto a smaller set of basis functions that account for most of the variability in the data. While the reduction efficiency and variance control of POD techniques are usually very high, the resulting emulators are structurally highly complex and can hardly be given a physically meaningful interpretation, as each basis is a projection of the entire set of inputs and states. In this work, we propose a novel approach based on Sparse Principal Component Analysis (SPCA) that combines the several assets of POD methods with the potential for ex-post interpretation of the emulator structure. SPCA reduces the number of non-zero coefficients in the basis functions by identifying a sparse matrix of coefficients. While the resulting set of basis functions may retain less variance of the snapshots, the presence of a few non-zero coefficients assists in the interpretation of the underlying physical processes. The SPCA approach is tested on the reduction of a 1D hydro-ecological model (DYRESM-CAEDYM) used to describe the main ecological and hydrodynamic processes in Tono Dam, Japan. An experimental comparison against a standard POD approach shows that SPCA achieves the same accuracy in emulating a given output variable, for the same level of dimensionality reduction, while yielding better insights into the main process dynamics.
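    One simple way to realize the sparse-loading idea is a power iteration with soft-thresholding of the loadings; this is only one SPCA variant, chosen for brevity (the abstract does not specify the exact algorithm used), and the toy data below are hypothetical:

```python
def sparse_pc1(data, lam=0.1, iters=200):
    """Leading sparse principal component via power iteration on the
    sample covariance matrix, soft-thresholding the loadings by lam so
    that weakly contributing variables get exactly zero coefficients."""
    n, p = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(p)]
    X = [[row[j] - means[j] for j in range(p)] for row in data]
    # sample covariance matrix of the centered data
    C = [[sum(X[k][i] * X[k][j] for k in range(n)) / (n - 1)
          for j in range(p)] for i in range(p)]
    v = [1.0] * p
    for _ in range(iters):
        w = [sum(C[i][j] * v[j] for j in range(p)) for i in range(p)]
        # soft-threshold: shrink each loading toward zero by lam
        w = [max(abs(x) - lam, 0.0) * (1 if x >= 0 else -1) for x in w]
        norm = sum(x * x for x in w) ** 0.5 or 1.0
        v = [x / norm for x in w]
    return v

# two strongly coupled variables plus one near-constant nuisance variable
data = [[-2.0, -2.0, 0.1], [-1.0, -1.0, -0.1], [0.0, 0.0, 0.1],
        [1.0, 1.0, -0.1], [2.0, 2.0, 0.0]]
v = sparse_pc1(data)
print([round(x, 3) for x in v])  # [0.707, 0.707, 0.0]: nuisance zeroed out
```

    The exact zero in the third loading is the interpretability payoff: the component can be read as "variables 1 and 2 move together", which ordinary PCA, with all-nonzero loadings, would not state so cleanly.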

  14. Assessment of phase based dose modulation for improved dose efficiency in cardiac CT on an anthropomorphic motion phantom

    NASA Astrophysics Data System (ADS)

    Budde, Adam; Nilsen, Roy; Nett, Brian

    2014-03-01

    State-of-the-art automatic exposure control modulates the tube current across view angle and Z based on patient anatomy for use in axial full scan reconstructions. Cardiac CT, however, uses a fundamentally different image reconstruction that applies a temporal weighting to reduce motion artifacts. This paper describes a phase based mA modulation that goes beyond axial and ECG modulation; it uses knowledge of the temporal view weighting applied within the reconstruction algorithm to improve dose efficiency in cardiac CT scanning. Using physical phantoms and synthetic noise emulation, we measure how knowledge of sinogram temporal weighting and the prescribed cardiac phase can be used to improve dose efficiency. First, we validated that a synthetic CT noise emulation method produced realistic image noise. Next, we used the CT noise emulation method to simulate mA modulation on scans of a physical anthropomorphic phantom where a motion profile corresponding to a heart rate of 60 beats per minute was used. The CT noise emulation method matched noise to lower dose scans across the image within 1.5% relative error. Using this noise emulation method to simulate modulating the mA while keeping the total dose constant, the image variance was reduced by an average of 11.9% on a scan with 50 msec padding, demonstrating improved dose efficiency. Radiation dose reduction in cardiac CT can be achieved while maintaining the same level of image noise through phase based dose modulation that incorporates knowledge of the cardiac reconstruction algorithm.
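    Under the common simplifying assumption that quantum noise variance scales inversely with tube current, the standard deviation of the synthetic noise that must be added to a full-dose image to emulate a lower-dose scan can be computed as below. This is a sketch of the general idea only, not the paper's validated emulation method:

```python
import math

def added_noise_sigma(sigma_full, ma_full, ma_low):
    """Std-dev of zero-mean synthetic noise to add to a full-dose image
    so its total variance matches a lower-dose scan, assuming quantum
    noise variance scales as 1/mA (independent noise sources add in
    variance)."""
    var_full = sigma_full ** 2
    var_low = var_full * (ma_full / ma_low)   # lower mA -> higher variance
    return math.sqrt(var_low - var_full)

# emulate a half-dose (200 mA) scan from a 400 mA acquisition, sigma = 10 HU
print(round(added_noise_sigma(10.0, 400.0, 200.0), 2))  # 10.0
```

    Noise is then injected per detector channel (in practice in the sinogram, where the statistics are better understood) before reconstruction.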

  15. Enhancing the Critical Role of Malaysian Institute of Higher Education from Ivy League American Universities Research Culture Experiences

    ERIC Educational Resources Information Center

    Jusoff, Hj. Kamaruzaman; Samah, Hjh. Siti Akmar Abu; Abdullah, Zaini

    2009-01-01

    Emulation by example is an old adage that has been a pragmatic initiative in great endeavors. Creating a dynamic research culture likewise requires revisiting eminent personalities and renowned organizations that one can copy in order to establish credibility. This paper explores the practicality of the emulation activities that help to establish…

  16. The Corporation of Learning: Nonprofit Higher Education Takes Lessons from Business. Research & Occasional Paper Series. CSHE.5.03

    ERIC Educational Resources Information Center

    Kirp, David L.

    2003-01-01

    This essay examines the ways in which nonprofit universities increasingly emulate businesses, focusing on two of the most direct forms of emulation: the creation of internal university markets at the University of Southern California through adoption of variants of resource center management (RCM) and the privatization of public higher education…

  17. Using Student Group Work in Higher Education to Emulate Professional Communities of Practice

    ERIC Educational Resources Information Center

    Fearon, Colm; McLaughlin, Heather; Eng, Tan Yoke

    2012-01-01

    Purpose: The purpose of this paper is to discuss the value of social learning from group work that emulates a professional community of practice. Design/methodology/approach: A thought piece that first, examines the role of group-work projects as part of social learning, then outlines key arguments for social learning based upon applying a…

  18. Application of a statistical emulator to fire emission modeling

    Treesearch

    Marwan Katurji; Jovanka Nikolic; Shiyuan Zhong; Scott Pratt; Lejiang Yu; Warren E. Heilman

    2015-01-01

    We have demonstrated the use of an advanced Gaussian-Process (GP) emulator to estimate wildland fire emissions over a wide range of fuel and atmospheric conditions. The Fire Emission Production Simulator, or FEPS, is used to produce an initial set of emissions data that correspond to some selected values in the domain of the input fuel and atmospheric parameters for...
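
    The Gaussian-Process emulation idea can be sketched in a few lines (FEPS is not reproduced here; a toy smooth function stands in for the expensive simulator, and the kernel lengthscale is an assumed value):

```python
import numpy as np

def rbf(a, b, length=0.3, var=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

# A toy smooth function stands in for the expensive emission simulator.
simulator = lambda x: np.sin(3 * x) + 0.5 * x

x_train = np.linspace(0.0, 1.0, 8)      # a few "expensive" runs
y_train = simulator(x_train)

# GP posterior mean at new inputs (noise-free interpolation).
x_new = np.linspace(0.0, 1.0, 50)
K = rbf(x_train, x_train) + 1e-8 * np.eye(len(x_train))
mean = rbf(x_new, x_train) @ np.linalg.solve(K, y_train)

print(float(np.max(np.abs(mean - simulator(x_new)))))  # small interpolation error
```

    The emulator then substitutes for the simulator across the full range of fuel and atmospheric inputs at negligible cost per evaluation.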

  19. The Center for Advanced Systems and Engineering (CASE)

    DTIC Science & Technology

    2012-01-01

    targets from multiple sensors. Qinru Qiu, State University of New York at Binghamton – A Neuromorphic Approach for Intelligent Text Recognition...Rogers, SUNYIT, Basic Research, Development and Emulation of Derived Models of Neuromorphic Brain Processes to Investigate the Computational Architecture...Issues They Present Work pertaining to the basic research, development and emulation of derived models of Neuromorphic brain processes to

  18. Extending the Coyote emulator to dark energy models with standard w0-wa parametrization of the equation of state

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casarini, L.; Bonometto, S.A.; Tessarotto, E.

    2016-08-01

    We discuss an extension of the Coyote emulator to predict non-linear matter power spectra of dark energy (DE) models with a scale factor dependent equation of state of the form w = w0 + (1-a)wa. The extension is based on the mapping rule between non-linear spectra of DE models with constant equation of state and those with time varying one originally introduced in ref. [40]. Using a series of N-body simulations we show that the spectral equivalence is accurate to sub-percent level across the same range of modes and redshift covered by the Coyote suite. Thus, the extended emulator provides a very efficient and accurate tool to predict non-linear power spectra for DE models with w0-wa parametrization. According to the same criteria we have developed a numerical code that we have implemented in a dedicated module for the CAMB code, that can be used in combination with the Coyote Emulator in likelihood analyses of non-linear matter power spectrum measurements. All codes can be found at https://github.com/luciano-casarini/pkequal.
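
    The w0-wa (CPL) parametrization used by the extension is simple enough to state directly (a minimal sketch; the function name is an illustrative assumption):

```python
import numpy as np

def w_de(a, w0=-1.0, wa=0.0):
    """CPL parametrization of the dark-energy equation of state:
    w(a) = w0 + (1 - a) * wa, with scale factor a (a = 1 today)."""
    return w0 + (1.0 - np.asarray(a)) * wa

# Today (a = 1) w reduces to w0; far in the past (a -> 0) it tends to w0 + wa.
print(w_de(1.0, w0=-1.0, wa=0.5))   # -1.0
print(w_de(0.0, w0=-1.0, wa=0.5))   # -0.5
```

    A constant equation of state is recovered by setting wa = 0, which is the constant-w case the original mapping rule starts from.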

  1. The Mira-Titan Universe. II. Matter Power Spectrum Emulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawrence, Earl; Heitmann, Katrin; Kwan, Juliana

    We introduce a new cosmic emulator for the matter power spectrum covering eight cosmological parameters. Targeted at optical surveys, the emulator provides accurate predictions out to a wavenumber k ~ 5 Mpc^-1 and redshift z <= 2. In addition to covering the standard set of Lambda CDM parameters, massive neutrinos and a dynamical dark energy equation of state are included. The emulator is built on a sample set of 36 cosmological models, carefully chosen to provide accurate predictions over the wide and large parameter space. For each model, we have performed a high-resolution simulation, augmented with 16 medium-resolution simulations and TimeRG perturbation theory results to provide accurate coverage over a wide k-range; the data set generated as part of this project is more than 1.2 Pbytes. With the current set of simulated models, we achieve an accuracy of approximately 4%. Because the sampling approach used here has established convergence and error-control properties, follow-up results with more than a hundred cosmological models will soon achieve ~1% accuracy. We compare our approach with other prediction schemes that are based on halo model ideas and remapping approaches.

  2. Piecewise linear emulator of the nonlinear Schrödinger equation and the resulting analytic solutions for Bose-Einstein condensates.

    PubMed

    Theodorakis, Stavros

    2003-06-01

    We emulate the cubic term Psi^3 in the nonlinear Schrödinger equation by a piecewise linear term, thus reducing the problem to a set of uncoupled linear inhomogeneous differential equations. The resulting analytic expressions constitute an excellent approximation to the exact solutions, as is explicitly shown in the case of the kink, the vortex, and a delta function trap. Such a piecewise linear emulation can be used for any differential equation where the only nonlinearity is a Psi^3 one. In particular, it can be used for the nonlinear Schrödinger equation in the presence of harmonic traps, giving analytic Bose-Einstein condensate solutions that reproduce very accurately the numerically calculated ones in one, two, and three dimensions.
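
    The core trick can be illustrated numerically (a generic interpolation sketch, not the paper's closed-form construction; the knot placement is an assumed choice): replacing psi^3 by a piecewise linear interpolant makes the equation linear between knots:

```python
import numpy as np

# Replace the cubic nonlinearity f(psi) = psi**3 by a piecewise linear
# interpolant on a few knots; between knots the differential equation
# becomes linear and inhomogeneous, hence solvable in closed form.
knots = np.linspace(-1.5, 1.5, 9)
f_knots = knots ** 3

psi = np.linspace(-1.5, 1.5, 400)
f_pl = np.interp(psi, knots, f_knots)   # piecewise linear emulator of psi^3

print(float(np.max(np.abs(f_pl - psi ** 3))))  # worst-case emulation error
```

    Refining the knot spacing shrinks the worst-case error quadratically, which is why a handful of linear pieces already tracks the cubic term closely.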

  3. Neural networks for self-learning control systems

    NASA Technical Reports Server (NTRS)

    Nguyen, Derrick H.; Widrow, Bernard

    1990-01-01

    It is shown how a neural network can learn of its own accord to control a nonlinear dynamic system. An emulator, a multilayered neural network, learns to identify the system's dynamic characteristics. The controller, another multilayered neural network, next learns to control the emulator. The self-trained controller is then used to control the actual dynamic system. The learning process continues as the emulator and controller improve and track the physical process. An example is given to illustrate these ideas. The 'truck backer-upper,' a neural network controller that steers a trailer truck while the truck is backing up to a loading dock, is demonstrated. The controller is able to guide the truck to the dock from almost any initial position. The technique explored should be applicable to a wide variety of nonlinear control problems.
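
    The identification step (training the emulator network) can be sketched with a toy plant (a minimal, assumed setup; the controller stage, which backpropagates through the frozen emulator, is omitted):

```python
import numpy as np

rng = np.random.default_rng(1)

# Unknown nonlinear plant: next state depends on current state and control.
plant = lambda x, u: 0.8 * np.tanh(x) + 0.4 * u

# Excite the plant with random states/controls and record its responses.
X = rng.uniform(-1, 1, (2000, 2))       # columns: state, control
Y = plant(X[:, 0], X[:, 1])

# One-hidden-layer emulator trained by full-batch gradient descent on MSE.
W1 = rng.normal(0.0, 0.5, (2, 16))
b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, 16)
b2 = 0.0
lr = 0.05
for _ in range(3000):
    H = np.tanh(X @ W1 + b1)            # hidden activations
    err = H @ W2 + b2 - Y               # prediction error
    gW2 = H.T @ err / len(X)
    gb2 = err.mean()
    gH = np.outer(err, W2) * (1 - H ** 2)
    gW1 = X.T @ gH / len(X)
    gb1 = gH.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - Y) ** 2))
print(mse)  # the trained emulator tracks the plant's input-output map
```

    Once the emulator is accurate, a controller network can be trained against it instead of the real plant, which is the key idea of the paper's two-stage scheme.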

  4. Short-Term Synaptic Plasticity Regulation in Solution-Gated Indium-Gallium-Zinc-Oxide Electric-Double-Layer Transistors.

    PubMed

    Wan, Chang Jin; Liu, Yang Hui; Zhu, Li Qiang; Feng, Ping; Shi, Yi; Wan, Qing

    2016-04-20

    In the biological nervous system, synaptic plasticity regulation is based on the modulation of ionic fluxes, and such regulation was regarded as the fundamental mechanism underlying memory and learning. Inspired by such biological strategies, indium-gallium-zinc-oxide (IGZO) electric-double-layer (EDL) transistors gated by aqueous solutions were proposed for synaptic behavior emulations. Short-term synaptic plasticity, such as paired-pulse facilitation, high-pass filtering, and orientation tuning, was experimentally emulated in these EDL transistors. Most importantly, we found that such short-term synaptic plasticity can be effectively regulated by alcohol (ethyl alcohol) and salt (potassium chloride) additives. Our results suggest that solution gated oxide-based EDL transistors could act as the platforms for short-term synaptic plasticity emulation.

  5. Interconnection arrangement of routers of processor boards in array of cabinets supporting secure physical partition

    DOEpatents

    Tomkins, James L [Albuquerque, NM; Camp, William J [Albuquerque, NM

    2007-07-17

    A multiple processor computing apparatus includes a physical interconnect structure that is flexibly configurable to support selective segregation of classified and unclassified users. The physical interconnect structure includes routers in service or compute processor boards distributed in an array of cabinets connected in series on each board and to respective routers in neighboring row cabinet boards with the routers in series connection coupled to routers in series connection in respective neighboring column cabinet boards. The array can include disconnect cabinets or respective routers in all boards in each cabinet connected in a toroid. The computing apparatus can include an emulator which permits applications from the same job to be launched on processors that use different operating systems.

  6. A Laboratory for Characterizing the Efficacy of Moving Target Defense

    DTIC Science & Technology

    2016-10-25

    of William and Mary are developing a scalable, dynamic, adaptive security system that combines virtualization , emulation, and mutable network...goal with the resource constraints of a small number of servers, and making virtual nodes “real enough” from the view of attackers. Unfortunately, with...we at College of William and Mary are developing a scalable, dynamic, adaptive security system that combines virtualization , emulation, and mutable

  7. Guidance, Navigation and Control Digital Emulation Technology Laboratory. Volume 1. Part 1. Task 1: Digital Emulation Technology Laboratory

    DTIC Science & Technology

    1991-09-27

    complex floating-point functions in a fraction of the time used by the best supercomputers on the market today. These co-processing boards "piggy-back...by the VNIX-based DECLARE program.

  8. JSME: a free molecule editor in JavaScript.

    PubMed

    Bienfait, Bruno; Ertl, Peter

    2013-01-01

    A molecule editor, i.e. a program facilitating graphical input and interactive editing of molecules, is an indispensable part of every cheminformatics or molecular processing system. Today, when a web browser has become the universal scientific user interface, a tool to edit molecules directly within the web browser is essential. One of the most popular tools for molecular structure input on the web is the JME applet. Since its release nearly 15 years ago, however, the web environment has changed and Java applets are facing increasing implementation hurdles due to their maintenance and support requirements, as well as security issues. This prompted us to update the JME editor and port it to a modern Internet programming language - JavaScript. The actual molecule editing Java code of the JME editor was translated into JavaScript with the help of the Google Web Toolkit compiler and a custom library that emulates a subset of the GUI features of the Java runtime environment. In this process, the editor was enhanced by additional functionalities including a substituent menu, copy/paste, drag and drop and undo/redo capabilities and an integrated help. In addition to desktop computers, the editor supports molecule editing on touch devices, including iPhone, iPad and Android phones and tablets. In analogy to JME, the new editor is named JSME. This new molecule editor is compact, easy to use and easy to incorporate into web pages. A free molecule editor written in JavaScript was developed and is released under the terms of a permissive BSD license. The editor is compatible with JME and has practically the same user interface as well as the same web application programming interface. The JSME editor is available for download from the project web page http://peter-ertl.com/jsme/

  9. Dynamic Emulation Modelling (DEMo) of large physically-based environmental models

    NASA Astrophysics Data System (ADS)

    Galelli, S.; Castelletti, A.

    2012-12-01

    In environmental modelling, large, spatially-distributed, physically-based models are widely adopted to describe the dynamics of physical, social and economic processes. Such an accurate process characterization comes, however, at a price: the computational requirements of these models are considerably high and prevent their use in any problem requiring hundreds or thousands of model runs to be satisfactorily solved. Typical examples include optimal planning and management, data assimilation, inverse modelling and sensitivity analysis. An effective approach to overcome this limitation is to perform a top-down reduction of the physically-based model by identifying a simplified, computationally efficient emulator, constructed from and then used in place of the original model in highly resource-demanding tasks. The underlying idea is that not all the process details in the original model are equally important and relevant to the dynamics of the outputs of interest for the type of problem considered. Emulation modelling has been successfully applied in many environmental applications; however, most of the literature considers non-dynamic emulators (e.g. metamodels, response surfaces and surrogate models), where the original dynamical model is reduced to a static map between input and the output of interest. In this study we focus on Dynamic Emulation Modelling (DEMo), a methodological approach that preserves the dynamic nature of the original physically-based model, with consequent advantages in a wide variety of problem areas. In particular, we propose a new data-driven DEMo approach that combines the many advantages of data-driven modelling in representing complex, non-linear relationships with the state-space representation typical of process-based models, which is both particularly effective in some applications (e.g. optimal management and data assimilation) and facilitates the ex-post physical interpretation of the emulator structure, thus enhancing the credibility of the model to stakeholders and decision-makers. Numerical results from the application of the approach to the reduction of 3D coupled hydrodynamic-ecological models in several real world case studies, including Marina Reservoir (Singapore) and Googong Reservoir (Australia), are illustrated.

  10. Emulating a System Dynamics Model with Agent-Based Models: A Methodological Case Study in Simulation of Diabetes Progression

    DOE PAGES

    Schryver, Jack; Nutaro, James; Shankar, Mallikarjun

    2015-10-30

    An agent-based simulation model hierarchy emulating disease states and behaviors critical to progression of type 2 diabetes was designed and implemented in the DEVS framework. The models are translations of basic elements of an established system dynamics model of diabetes. This model hierarchy, which mimics diabetes progression over an aggregated U.S. population, was disaggregated and reconstructed bottom-up at the individual (agent) level. Four levels of model complexity were defined in order to systematically evaluate which parameters are needed to mimic outputs of the system dynamics model. Moreover, the four estimated models attempted to replicate stock counts representing disease states in the system dynamics model, while estimating impacts of an elderliness factor, an obesity factor and health-related behavioral parameters. Health-related behavior was modeled as a simple realization of the Theory of Planned Behavior, a joint function of individual attitude and diffusion of social norms that spread over each agent's social network. Although the most complex agent-based simulation model contained 31 adjustable parameters, all models were considerably less complex than the system dynamics model, which required numerous time series inputs to make its predictions. All three elaborations of the baseline model provided significantly improved fits to the output of the system dynamics model. The performances of the baseline agent-based model and its extensions illustrate a promising approach to translating complex system dynamics models into agent-based model alternatives that are both conceptually simpler and capable of capturing main effects of complex local agent-agent interactions.

  11. Analysis of Public Datasets for Wearable Fall Detection Systems.

    PubMed

    Casilari, Eduardo; Santoyo-Ramón, José-Antonio; Cano-García, José-Manuel

    2017-06-27

    Due to the boom of wireless handheld devices such as smartwatches and smartphones, wearable Fall Detection Systems (FDSs) have become a major focus of attention for the research community in recent years. The effectiveness of a wearable FDS must be contrasted against a wide variety of measurements obtained from inertial sensors during the occurrence of falls and Activities of Daily Living (ADLs). In this regard, access to public databases constitutes the basis for an open and systematic assessment of fall detection techniques. This paper reviews and appraises twelve existing publicly available data repositories containing measurements of ADLs and emulated falls envisaged for the evaluation of fall detection algorithms in wearable FDSs. The analysis of these datasets is performed in a comprehensive way, taking into account the multiple factors involved in the definition of the testbeds deployed for the generation of the mobility samples. The study of the traces brings to light the lack of a common experimental benchmarking procedure and, consequently, the large heterogeneity of the datasets from a number of perspectives (length and number of samples, typology of the emulated falls and ADLs, characteristics of the test subjects, features and positions of the sensors, etc.). Concerning this, the statistical analysis of the samples reveals the impact of the sensor range on the reliability of the traces. In addition, the study evidences the importance of the selection of the ADLs and the need to categorize the ADLs depending on the intensity of the movements in order to evaluate the capability of a certain detection algorithm to discriminate falls from ADLs.

  12. Emulating a System Dynamics Model with Agent-Based Models: A Methodological Case Study in Simulation of Diabetes Progression

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schryver, Jack; Nutaro, James; Shankar, Mallikarjun

    An agent-based simulation model hierarchy emulating disease states and behaviors critical to progression of type 2 diabetes was designed and implemented in the DEVS framework. The models are translations of basic elements of an established system dynamics model of diabetes. This model hierarchy, which mimics diabetes progression over an aggregated U.S. population, was disaggregated and reconstructed bottom-up at the individual (agent) level. Four levels of model complexity were defined in order to systematically evaluate which parameters are needed to mimic outputs of the system dynamics model. Moreover, the four estimated models attempted to replicate stock counts representing disease states in the system dynamics model, while estimating impacts of an elderliness factor, an obesity factor and health-related behavioral parameters. Health-related behavior was modeled as a simple realization of the Theory of Planned Behavior, a joint function of individual attitude and diffusion of social norms that spread over each agent's social network. Although the most complex agent-based simulation model contained 31 adjustable parameters, all models were considerably less complex than the system dynamics model, which required numerous time series inputs to make its predictions. All three elaborations of the baseline model provided significantly improved fits to the output of the system dynamics model. The performances of the baseline agent-based model and its extensions illustrate a promising approach to translating complex system dynamics models into agent-based model alternatives that are both conceptually simpler and capable of capturing main effects of complex local agent-agent interactions.

  13. Analysis of Public Datasets for Wearable Fall Detection Systems

    PubMed Central

    Santoyo-Ramón, José-Antonio; Cano-García, José-Manuel

    2017-01-01

    Due to the boom of wireless handheld devices such as smartwatches and smartphones, wearable Fall Detection Systems (FDSs) have become a major focus of attention for the research community in recent years. The effectiveness of a wearable FDS must be contrasted against a wide variety of measurements obtained from inertial sensors during the occurrence of falls and Activities of Daily Living (ADLs). In this regard, access to public databases constitutes the basis for an open and systematic assessment of fall detection techniques. This paper reviews and appraises twelve existing publicly available data repositories containing measurements of ADLs and emulated falls envisaged for the evaluation of fall detection algorithms in wearable FDSs. The analysis of these datasets is performed in a comprehensive way, taking into account the multiple factors involved in the definition of the testbeds deployed for the generation of the mobility samples. The study of the traces brings to light the lack of a common experimental benchmarking procedure and, consequently, the large heterogeneity of the datasets from a number of perspectives (length and number of samples, typology of the emulated falls and ADLs, characteristics of the test subjects, features and positions of the sensors, etc.). Concerning this, the statistical analysis of the samples reveals the impact of the sensor range on the reliability of the traces. In addition, the study evidences the importance of the selection of the ADLs and the need to categorize the ADLs depending on the intensity of the movements in order to evaluate the capability of a certain detection algorithm to discriminate falls from ADLs. PMID:28653991

  14. A device for emulating cuff recordings of action potentials propagating along peripheral nerves.

    PubMed

    Rieger, Robert; Schuettler, Martin; Chuang, Sheng-Chih

    2014-09-01

    This paper describes a device that emulates propagation of action potentials along a peripheral nerve, suitable for reproducible testing of bio-potential recording systems using nerve cuff electrodes. The system is a microcontroller-based stand-alone instrument which uses established nerve and electrode models to represent neural activity of real nerves recorded with a nerve cuff interface, taking into consideration electrode impedance, voltages picked up by the electrodes, and action potential propagation characteristics. The system emulates different scenarios including compound action potentials with selectable propagation velocities and naturally occurring nerve traffic from different velocity fiber populations. Measured results from a prototype implementation are reported and compared with in vitro recordings from Xenopus laevis frog sciatic nerve, demonstrating that the electrophysiological setting is represented to a satisfactory degree, useful for the development, optimization and characterization of future recording systems.

  15. Memristors with diffusive dynamics as synaptic emulators for neuromorphic computing

    DOE PAGES

    Wang, Zhongrui; Joshi, Saumil; Savel’ev, Sergey E.; ...

    2016-09-26

    The accumulation and extrusion of Ca2+ in the pre- and postsynaptic compartments play a critical role in initiating plastic changes in biological synapses. In order to emulate this fundamental process in electronic devices, we developed diffusive Ag-in-oxide memristors with a temporal response during and after stimulation similar to that of the synaptic Ca2+ dynamics. In situ high-resolution transmission electron microscopy and nanoparticle dynamics simulations both demonstrate that Ag atoms disperse under electrical bias and regroup spontaneously under zero bias because of interfacial energy minimization, closely resembling synaptic influx and extrusion of Ca2+, respectively. Furthermore, the diffusive memristor and its dynamics enable a direct emulation of both short- and long-term plasticity of biological synapses, representing an advance in hardware implementation of neuromorphic functionalities.

  16. The third level trigger and output event unit of the UA1 data-acquisition system

    NASA Astrophysics Data System (ADS)

    Cittolin, S.; Demoulin, M.; Fucci, A.; Haynes, W.; Martin, B.; Porte, J. P.; Sphicas, P.

    1989-12-01

    The upgraded UA1 experiment utilizes twelve 3081/E emulators for its third-level trigger system. The system is interfaced to VME, and is controlled by 68000-microprocessor VME boards on the input and output. The output controller communicates with an IBM 9375 mainframe via the CERN-IBM developed VICI interface. The events selected by the emulators are output on IBM-3480 cassettes. The user interface to this system is based on a series of Macintosh personal computers connected to the VME bus. These Macs are also used for developing software for the emulators and for monitoring the entire system. The same configuration has also been used for offline event reconstruction. A description of the system, together with details of both the online and offline modes of operation and an evaluation of its performance, is presented.

  17. Cooling Atomic Gases With Disorder

    DOE PAGES

    Paiva, Thereza; Khatami, Ehsan; Yang, Shuxiang; ...

    2015-12-10

    Cold atomic gases have proven capable of emulating a number of fundamental condensed matter phenomena including Bose-Einstein condensation, the Mott transition, Fulde-Ferrell-Larkin-Ovchinnikov pairing, and the quantum Hall effect. Cooling to a low enough temperature to explore magnetism and exotic superconductivity in lattices of fermionic atoms remains a challenge. In this paper, we propose a method to produce a low-temperature gas by preparing it in a disordered potential and following a constant-entropy trajectory to deliver the gas into a nondisordered state which exhibits these incompletely understood phases. We show, using quantum Monte Carlo simulations, that we can approach the Néel temperature of the three-dimensional Hubbard model for experimentally achievable parameters. Recent experimental estimates suggest the randomness required lies in a regime where atom transport and equilibration are still robust.

  18. Trauma on the Isle of Man.

    PubMed Central

    Hackney, R G; Varley, G; Stevens, D; Green, A

    1993-01-01

    The Isle of Man Tourist Trophy motorcycle races remain one of the most popular venues for motorcycle races. This is despite the reduced status of the event. The reason for the loss of world championship and formula one status is the nature of the road racing circuit itself. The twisting narrow roads are only closed to the public at certain times during the practice and race weeks. Motorcycling visitors to the event attempt to emulate their heroes on machines capable of high speeds. Casualties from both visitors and racers are dealt with efficiently by an expanded medical service. This includes the use of an aeromedical evacuation helicopter. Casualties from the visitors exceeded those from the racers themselves during the period reported. PMID:8457818

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Youssef, Tarek A.; El Hariri, Mohamad; Elsayed, Ahmed T.

    The smart grid is seen as a power system with real-time communication and control capabilities between the consumer and the utility. This modern platform facilitates the optimization of energy usage based on several factors including environmental conditions, price preferences, and system technical issues. In this paper a real-time energy management system (EMS) for microgrids or nanogrids was developed. The developed system involves an online optimization scheme to adapt its parameters based on previous, current, and forecasted future system states. The communication requirements for all EMS modules were analyzed and are all integrated over a data distribution service (DDS) Ethernet network with appropriate quality of service (QoS) profiles. In conclusion, the developed EMS was emulated with actual residential energy consumption and irradiance data from Miami, Florida and proved its effectiveness in reducing consumers' bills and achieving flat peak load profiles.

  20. On the TFNS Subgrid Models for Liquid-Fueled Turbulent Combustion

    NASA Technical Reports Server (NTRS)

    Liu, Nan-Suey; Wey, Thomas

    2014-01-01

    This paper describes the time-filtered Navier-Stokes (TFNS) approach capable of capturing unsteady flow structures important for turbulent mixing in the combustion chamber and two different subgrid models used to emulate the major processes occurring in the turbulence-chemistry interaction. These two subgrid models are termed as LEM-like model and EUPDF-like model (Eulerian probability density function), respectively. Two-phase turbulent combustion in a single-element lean-direct-injection (LDI) combustor is calculated by employing the TFNS/LEM-like approach as well as the TFNS/EUPDF-like approach. Results obtained from the TFNS approach employing these two different subgrid models are compared with each other, along with the experimental data, followed by more detailed comparison between the results of an updated calculation using the TFNS/LEM-like model and the experimental data.

  1. Analog Approach to Constraint Satisfaction Enabled by Spin Orbit Torque Magnetic Tunnel Junctions.

    PubMed

    Wijesinghe, Parami; Liyanagedera, Chamika; Roy, Kaushik

    2018-05-02

    Boolean satisfiability (k-SAT) is an NP-complete (k ≥ 3) problem that constitutes one of the hardest classes of constraint satisfaction problems. In this work, we provide a proof-of-concept hardware-based analog k-SAT solver built using Magnetic Tunnel Junctions (MTJs). The inherent physics of MTJs, enhanced by device-level modifications, is harnessed here to emulate the intricate dynamics of an analog satisfiability (SAT) solver. In the presence of thermal noise, the MTJ-based system can successfully solve Boolean satisfiability problems. Most importantly, our results exhibit that the proposed MTJ-based hardware SAT solver is capable of finding a solution to a significant fraction (at least 85%) of hard 3-SAT problems, within a time that has a polynomial relationship with the number of variables (<50).
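
    A software analogue of the stochastic search dynamics can be sketched with WalkSAT-style local search (a generic algorithm, not the MTJ hardware; the tiny 3-SAT instance below is made up):

```python
import random

random.seed(0)

# Tiny satisfiable 3-SAT instance: positive literal k means "variable k is
# True"; negative means "variable k is False".
clauses = [(1, 2, -3), (-1, 3, 2), (-2, -3, 1), (3, -1, -2)]
n_vars = 3

assign = {v: random.random() < 0.5 for v in range(1, n_vars + 1)}

def satisfied(clause):
    return any(assign[abs(lit)] == (lit > 0) for lit in clause)

# Noise-driven local search: repeatedly flip a variable drawn from some
# currently unsatisfied clause, loosely mirroring thermally driven dynamics.
for _ in range(1000):
    unsat = [c for c in clauses if not satisfied(c)]
    if not unsat:
        break
    lit = random.choice(random.choice(unsat))
    assign[abs(lit)] = not assign[abs(lit)]

print(all(satisfied(c) for c in clauses))
```

    In the hardware version, thermal noise in the MTJs plays the role of the random flips, letting the analog dynamics escape local minima.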

  2. Josephson Circuits as Vector Quantum Spins

    NASA Astrophysics Data System (ADS)

    Samach, Gabriel; Kerman, Andrew J.

    While superconducting circuits based on Josephson junction technology can be engineered to represent spins in the quantum transverse-field Ising model, no circuit architecture to date has succeeded in emulating the vector quantum spin models of interest for next-generation quantum annealers and quantum simulators. Here, we present novel Josephson circuits which may provide these capabilities. We discuss our rigorous quantum-mechanical simulations of these circuits, as well as the larger architectures they may enable. This research was funded by the Office of the Director of National Intelligence (ODNI) and the Intelligence Advanced Research Projects Activity (IARPA) under Air Force Contract No. FA8721-05-C-0002. The views and conclusions contained herein are those of the authors and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of ODNI, IARPA, or the US Government.

  3. Distributed Spectral Monitoring For Emitter Localization

    DTIC Science & Technology

    2018-02-12

    localization techniques in a DSA sensor network. The results of the research are presented through simulation of localization algorithms, emulation of a...network on a wireless RF environment emulator, and field tests. The results of the various tests in both the lab and field are obtained and analyzed to... are two main classes of localization techniques, and the technique to use will depend on the information available with the emitter. The first class

  4. A Structured Microprogram Set for the SUMC Computer to Emulate the IBM System/360, Model 50

    NASA Technical Reports Server (NTRS)

    Gimenez, Cesar R.

    1975-01-01

    Similarities between regular and structured microprogramming were examined. An explanation of machine branching architecture (particularly in the SUMC computer), required for ease of structured microprogram implementation, is presented. Implementation of a structured microprogram set in the SUMC to emulate the IBM System/360 is described, and a comparison is made between the structured set and a nonstructured set previously written for the SUMC.

  5. Becoming a high-fidelity 'super' imitator: what are the contributions of social and individual learning?

    PubMed

    Subiaul, Francys; Patterson, Eric M; Schilder, Brian; Renner, Elizabeth; Barr, Rachel

    2015-11-01

    In contrast to other primates, human children's imitation performance goes from low to high fidelity soon after infancy. Are such changes associated with the development of other forms of learning? We addressed this question by testing 215 children (26-59 months) on two social conditions (imitation, emulation) - involving a demonstration - and two asocial conditions (trial-and-error, recall) - involving individual learning - using two touchscreen tasks. The tasks required responding to either three different pictures in a specific picture order (Cognitive: Airplane→Ball→Cow) or three identical pictures in a specific spatial order (Motor-Spatial: Up→Down→Right). There were age-related improvements across all conditions and imitation, emulation and recall performance were significantly better than trial-and-error learning. Generalized linear models demonstrated that motor-spatial imitation fidelity was associated with age and motor-spatial emulation performance, but cognitive imitation fidelity was only associated with age. While this study provides evidence for multiple imitation mechanisms, the development of one of those mechanisms - motor-spatial imitation - may be bootstrapped by the development of another social learning skill - motor-spatial emulation. Together, these findings provide important clues about the development of imitation, which is arguably a distinctive feature of the human species. © 2014 John Wiley & Sons Ltd.

  6. A universal ankle-foot prosthesis emulator for human locomotion experiments.

    PubMed

    Caputo, Joshua M; Collins, Steven H

    2014-03-01

    Robotic prostheses have the potential to significantly improve mobility for people with lower-limb amputation. Humans exhibit complex responses to mechanical interactions with these devices, however, and computational models are not yet able to predict such responses meaningfully. Experiments therefore play a critical role in development, but have been limited by the use of product-like prototypes, each requiring years of development and specialized for a narrow range of functions. Here we describe a robotic ankle-foot prosthesis system that enables rapid exploration of a wide range of dynamical behaviors in experiments with human subjects. This emulator comprises powerful off-board motor and control hardware, a flexible Bowden cable tether, and a lightweight instrumented prosthesis, resulting in a combination of low mass worn by the human (0.96 kg) and high mechatronic performance compared to prior platforms. Benchtop tests demonstrated closed-loop torque bandwidth of 17 Hz, peak torque of 175 Nm, and peak power of 1.0 kW. Tests with an anthropomorphic pendulum "leg" demonstrated low interference from the tether, less than 1 Nm about the hip. This combination of low worn mass, high bandwidth, high torque, and unrestricted movement makes the platform exceptionally versatile. To demonstrate suitability for human experiments, we performed preliminary tests in which a subject with unilateral transtibial amputation walked on a treadmill at 1.25 m/s while the prosthesis behaved in various ways. These tests revealed low torque tracking error (RMS error of 2.8 Nm) and the capacity to systematically vary work production or absorption across a broad range (from -5 to 21 J per step). These results support the use of robotic emulators during early-stage assessment of proposed device functionalities and for scientific study of fundamental aspects of human-robot interaction.
The design of simple, alternate end-effectors would enable studies at other joints or with additional degrees of freedom.
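    The two headline metrics above, RMS torque-tracking error and net work per step, can be reproduced on synthetic signals. The sketch below assumes made-up angle and torque traces (it is not the authors' analysis code); work is computed as the integral of torque with respect to joint angle:

```python
import numpy as np

t = np.linspace(0.0, 1.0, 1000)              # one gait cycle, s (synthetic)
theta = 0.2 * np.sin(2 * np.pi * t)          # ankle angle, rad (synthetic)
tau_des = 50.0 * np.sin(2 * np.pi * t)       # commanded torque, Nm (synthetic)
tau_meas = tau_des + np.random.default_rng(5).normal(0.0, 2.8, t.size)

# RMS torque-tracking error (the abstract reports 2.8 Nm for the real device)
rms_error = np.sqrt(np.mean((tau_meas - tau_des) ** 2))

# Net joint work per step: W = integral of tau d(theta), trapezoidal rule
work = np.sum(0.5 * (tau_meas[1:] + tau_meas[:-1]) * np.diff(theta))

print(round(float(rms_error), 1), "Nm;", round(float(work), 2), "J")
```

    With noise of the quoted magnitude, the recovered RMS error lands near 2.8 Nm, and the net work is near zero because the synthetic torque is in phase with the angle.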

  7. Carbon-Temperature-Water Change Analysis for Peanut Production Under Climate Change: A Prototype for the AgMIP Coordinated Climate-Crop Modeling Project (C3MP)

    NASA Technical Reports Server (NTRS)

    Ruane, Alex C.; McDermid, Sonali; Rosenzweig, Cynthia; Baigorria, Guillermo A.; Jones, James W.; Romero, Consuelo C.; Cecil, L. DeWayne

    2014-01-01

    Climate change is projected to push the limits of cropping systems and has the potential to disrupt the agricultural sector from local to global scales. This article introduces the Coordinated Climate-Crop Modeling Project (C3MP), an initiative of the Agricultural Model Intercomparison and Improvement Project (AgMIP) to engage a global network of crop modelers to explore the impacts of climate change via an investigation of crop responses to changes in carbon dioxide concentration ([CO2]), temperature, and water. As a demonstration of the C3MP protocols and enabled analyses, we apply the Decision Support System for Agrotechnology Transfer (DSSAT) CROPGRO-Peanut crop model for Henry County, Alabama, to evaluate responses to the range of plausible [CO2], temperature changes, and precipitation changes projected by climate models out to the end of the 21st century. These sensitivity tests are used to derive crop model emulators that estimate changes in mean yield and the coefficient of variation for seasonal yields across a broad range of climate conditions, reproducing mean yields from sensitivity test simulations with deviations of ca. 2% for rain-fed conditions. We apply these statistical emulators to investigate how peanuts respond to projections from various global climate models, time periods, and emissions scenarios, finding a robust projection of modest (<10%) median yield losses in the middle of the 21st century accelerating to more severe (>20%) losses and larger uncertainty at the end of the century under the more severe representative concentration pathway (RCP8.5). This projection is not substantially altered by the selection of the AgMERRA global gridded climate dataset rather than the local historical observations, differences between the Third and Fifth Coupled Model Intercomparison Project (CMIP3 and CMIP5), or the use of the delta method of climate impacts analysis rather than the C3MP impacts response surface and emulator approach.
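    The emulator idea in this abstract, fitting a fast statistical surrogate to a modest set of crop-model sensitivity runs, can be sketched as a quadratic response surface fit by least squares. Everything below (the stand-in crop_model, the design ranges, the feature set) is an illustrative assumption, not C3MP's actual protocol:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sensitivity-test design: temperature change (C), precip change (%), CO2 (ppm)
X = rng.uniform([-1.0, -25.0, 360.0], [6.0, 25.0, 900.0], size=(100, 3))

# Stand-in for the crop model's yield output (the "truth" to be emulated)
def crop_model(x):
    dT, dP, co2 = x
    return 100.0 - 2.5 * dT + 0.3 * dP + 0.04 * (co2 - 360.0)

y = np.array([crop_model(x) for x in X])

# Quadratic polynomial features; the least-squares fit is the emulator
def features(X):
    dT, dP, co2 = X.T
    return np.column_stack([np.ones(len(X)), dT, dP, co2,
                            dT**2, dP**2, co2**2, dT*dP, dT*co2, dP*co2])

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

def emulate(x):
    return features(np.atleast_2d(np.asarray(x, float))) @ coef

# Evaluate a projected climate shift instantly, without rerunning the model
print(float(emulate([2.0, -5.0, 550.0])))
```

    Once fit, the emulator can be swept over any climate-model projection of temperature, precipitation, and [CO2] at negligible cost, which is the point of the C3MP impact-response-surface approach.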

  8. Factors influencing food choice in an Australian Aboriginal community.

    PubMed

    Brimblecombe, Julie; Maypilama, Elaine; Colles, Susan; Scarlett, Maria; Dhurrkay, Joanne Garnggulkpuy; Ritchie, Jan; O'Dea, Kerin

    2014-03-01

    We explored with Aboriginal adults living in a remote Australian community the social context of food choice and factors perceived to shape food choice. An ethnographic approach of prolonged community engagement over 3 years was augmented by interviews. Our findings revealed that knowledge, health, and resources supporting food choice were considered "out of balance," and this imbalance was seen to manifest in a Western-imposed diet lacking variety and overrelying on familiar staples. Participants felt ill-equipped to emulate the traditional pattern of knowledge transfer through passing food-related wisdom to younger generations. The traditional food system was considered key to providing the framework for learning about the contemporary food environment. Practitioners seeking to improve diet and health outcomes for this population should attend to past and present contexts of food in nutrition education, support the educative role of caregivers, address the high cost of food, and support access to traditional foods.

  9. A data distribution strategy for the 1990s (files are not enough)

    NASA Technical Reports Server (NTRS)

    Tankenson, Mike; Wright, Steven

    1993-01-01

    Virtually all of the data distribution strategies being contemplated for the EOSDIS era revolve around the use of files. Most, if not all, mass storage technologies are based around the file model. However, files may be the wrong primary abstraction for supporting scientific users in the 1990s and beyond. Other abstractions more closely matching the respective scientific discipline of the end user may be more appropriate. JPL has built a unique multimission data distribution system based on a strategy of telemetry stream emulation to match the responsibilities of spacecraft team and ground data system operators supporting our nation's suite of planetary probes. The current system, operational since 1989 and the launch of the Magellan spacecraft, is supporting over 200 users at 15 remote sites. This stream-oriented data distribution model can provide important lessons learned to builders of future data systems.

  10. Composition and Realization of Source-to-Sink High-Performance Flows: File Systems, Storage, Hosts, LAN and WAN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Chase Qishi

    A number of Department of Energy (DOE) science applications, involving exascale computing systems and large experimental facilities, are expected to generate large volumes of data, in the range of petabytes to exabytes, which will be transported over wide-area networks for the purpose of storage, visualization, and analysis. To support such capabilities, significant progress has been made in various components including the deployment of 100 Gbps networks with future 1 Tbps bandwidth, increases in end-host capabilities with multiple cores and buses, capacity improvements in large disk arrays, and deployment of parallel file systems such as Lustre and GPFS. High-performance source-to-sink data flows must be composed of these component systems, which requires significant optimizations of the storage-to-host data and execution paths to match the edge and long-haul network connections. In particular, end systems are currently supported by 10-40 Gbps Network Interface Cards (NIC) and 8-32 Gbps storage Host Channel Adapters (HCAs), which carry the individual flows that collectively must reach network speeds of 100 Gbps and higher. Indeed, such data flows must be synthesized using multicore, multibus hosts connected to high-performance storage systems on one side and to the network on the other side. Current experimental results show that the constituent flows must be optimally composed and preserved from storage systems, across the hosts and the networks with minimal interference. Furthermore, such a capability must be made available transparently to the science users without placing undue demands on them to account for the details of underlying systems and networks. And, this task is expected to become even more complex in the future due to the increasing sophistication of hosts, storage systems, and networks that constitute the high-performance flows.
The objectives of this proposal are to (1) develop and test the component technologies and their synthesis methods to achieve source-to-sink high-performance flows, and (2) develop tools that provide these capabilities through simple interfaces to users and applications. In terms of the former, we propose to develop (1) optimization methods that align and transition multiple storage flows to multiple network flows on multicore, multibus hosts; and (2) edge and long-haul network path realization and maintenance using advanced provisioning methods including OSCARS and OpenFlow. We also propose synthesis methods that combine these individual technologies to compose high-performance flows using a collection of constituent storage-network flows, and realize them across the storage and local network connections as well as long-haul connections. We propose to develop automated user tools that profile the hosts, storage systems, and network connections; compose the source-to-sink complex flows; and set up and maintain the needed network connections. These solutions will be tested using (1) 100 Gbps connection(s) between Oak Ridge National Laboratory (ORNL) and Argonne National Laboratory (ANL) with storage systems supported by Lustre and GPFS file systems with an asymmetric connection to University of Memphis (UM); (2) ORNL testbed with multicore and multibus hosts, switches with OpenFlow capabilities, and network emulators; and (3) 100 Gbps connections from ESnet and their OpenFlow testbed, and other experimental connections. This proposal brings together the expertise and facilities of the two national laboratories, ORNL and ANL, and UM. It also represents a collaboration between DOE and the Department of Defense (DOD) projects at ORNL by sharing technical expertise and personnel costs, and leveraging the existing DOD Extreme Scale Systems Center (ESSC) facilities at ORNL.

  11. High-Performance Integrated Control of water quality and quantity in urban water reservoirs

    NASA Astrophysics Data System (ADS)

    Galelli, S.; Castelletti, A.; Goedbloed, A.

    2015-11-01

    This paper contributes a novel High-Performance Integrated Control framework to support the real-time operation of urban water supply storages affected by water quality problems. We use a 3-D, high-fidelity simulation model to predict the main water quality dynamics and inform a real-time controller based on Model Predictive Control. The integration of the simulation model into the control scheme is performed by a model reduction process that identifies a low-order, dynamic emulator running 4 orders of magnitude faster. The model reduction, which relies on a semiautomatic procedural approach integrating time series clustering and variable selection algorithms, generates a compact and physically meaningful emulator that can be coupled with the controller. The framework is used to design the hourly operation of Marina Reservoir, a 3.2 Mm3 storm-water-fed reservoir located in the center of Singapore, operated for drinking water supply and flood control. Because of its recent formation from a former estuary, the reservoir suffers from high salinity levels, whose behavior is modeled with Delft3D-FLOW. Results show that our control framework reduces the minimum salinity levels by nearly 40% and cuts the average annual deficit of drinking water supply by about 2 times the active storage of the reservoir (about 4% of the total annual demand).
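    The model-reduction step, replacing a 3-D simulator with a low-order dynamic emulator, can be illustrated with a first-order ARX model identified by least squares. The "simulator" below is a synthetic stand-in, not Delft3D-FLOW, and the two-parameter model is far simpler than the study's emulator:

```python
import numpy as np

rng = np.random.default_rng(1)
u = rng.uniform(0.0, 1.0, 500)        # forcing, e.g. inflow/release decisions
s = np.zeros(501)                     # state, e.g. a salinity indicator

# Stand-in for the high-fidelity model: a slow first-order response
for t in range(500):
    s[t + 1] = 0.95 * s[t] + 0.4 * u[t]

# Identify the emulator s[t+1] = a*s[t] + b*u[t] by regressing on the runs
A = np.column_stack([s[:-1], u])
(a, b), *_ = np.linalg.lstsq(A, s[1:], rcond=None)
print(round(float(a), 3), round(float(b), 3))  # recovers a = 0.95, b = 0.4
```

    The identified two-parameter model steps forward in microseconds, which is the property that lets an emulator sit inside a Model Predictive Control loop where the full 3-D simulation could not.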

  12. Observation of localized ground and excited orbitals in graphene photonic ribbons

    NASA Astrophysics Data System (ADS)

    Cantillano, C.; Mukherjee, S.; Morales-Inostroza, L.; Real, B.; Cáceres-Aravena, G.; Hermann-Avigliano, C.; Thomson, R. R.; Vicencio, R. A.

    2018-03-01

    We report on the experimental realization of a quasi-one-dimensional photonic graphene ribbon supporting four flat-bands (FBs). We study the dynamics of fundamental and dipolar modes, which are analogous to the s and p orbitals, respectively. In the experiment, both modes (orbitals) are effectively decoupled from each other, implying two sets of six bands, where two of them are completely flat (dispersionless). Using an image generator setup, we excite the s and p FB modes and demonstrate their non-diffracting propagation for the first time. Our results open an exciting route towards photonic emulation of higher orbital dynamics.

  13. Gas Stripping in the Simulated Pegasus Galaxy

    NASA Astrophysics Data System (ADS)

    Mercado, Francisco Javier; Samaniego, Alejandro; Wheeler, Coral; Bullock, James

    2017-01-01

    We utilize the hydrodynamic simulation code GIZMO to construct a non-cosmological idealized dwarf galaxy built to match the parameters of the observed Pegasus dwarf galaxy. This simulated galaxy will be used in a series of tests in which we will implement different methods of removing the dwarf’s gas in order to emulate the ram pressure stripping mechanism encountered by dwarf galaxies as they fall into more massive companion galaxies. These scenarios will be analyzed in order to determine the role that the removal of gas plays in rotational vs. dispersion support (Vrot/σ) of our galaxy.

  14. Transforming Primary Care Practice and Education: Lessons From 6 Academic Learning Collaboratives.

    PubMed

    Koch, Ursula; Bitton, Asaf; Landon, Bruce E; Phillips, Russell S

    Adoption of new primary care models has been slow in academic teaching practices. We describe a common framework that academic learning collaboratives are using to transform primary care practice based on our analysis of 6 collaboratives nationally. We show that the work of the collaboratives could be divided into 3 phases and provide detail on the phases of work and a road map for those who seek to emulate this work. We found that learning collaboratives foster transformation, even in complex academic practices, but need specific support adapted to their unique challenges.

  15. Predeployment validation of fault-tolerant systems through software-implemented fault insertion

    NASA Technical Reports Server (NTRS)

    Czeck, Edward W.; Siewiorek, Daniel P.; Segall, Zary Z.

    1989-01-01

    The fault injection-based automated testing (FIAT) environment, which can be used to experimentally characterize and evaluate distributed real-time systems under fault-free and faulted conditions, is described. A survey of validation methodologies is presented, and the need for fault insertion within these methodologies is demonstrated. The origins and models of faults, and the motivation for the FIAT concept, are reviewed. FIAT employs a validation methodology which builds confidence in the system by first providing a baseline of fault-free performance data and then characterizing the behavior of the system with faults present. Fault insertion is accomplished through software and allows faults, or the manifestations of faults, to be inserted by either seeding faults into memory or triggering error detection mechanisms. FIAT is capable of emulating a variety of fault-tolerant strategies and architectures, can monitor system activity, and can automatically orchestrate experiments involving insertion of faults. A common system interface eases use and decreases experiment development and run time. Fault models chosen for experiments on FIAT have generated system responses which parallel those observed in real systems under faulty conditions. These capabilities are shown by two example experiments, each using a different fault-tolerance strategy.
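    Software-implemented fault insertion of the kind FIAT performs, seeding a fault into memory and watching an error-detection mechanism fire, can be sketched minimally as a single-bit flip checked by parity. The mechanism and names below are illustrative, not FIAT's implementation:

```python
def flip_bit(word: int, bit: int) -> int:
    """Inject a fault by flipping one bit of a stored memory word."""
    return word ^ (1 << bit)

def parity(word: int) -> int:
    """Error-detection mechanism: even/odd parity of the set bits."""
    return bin(word).count("1") % 2

# Fault-free baseline: record the data word together with its parity
data = 0b1011_0010
check = parity(data)

# Insert the fault, then run the detector against the recorded parity
faulted = flip_bit(data, 5)
detected = parity(faulted) != check
print(detected)  # a single-bit flip always changes parity -> True
```

    Real fault-insertion campaigns automate this loop over many fault locations and trigger times, comparing faulted behavior against the fault-free baseline, which is the methodology the abstract describes.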

  16. Quantum Emulation of Gravitational Waves.

    PubMed

    Fernandez-Corbaton, Ivan; Cirio, Mauro; Büse, Alexander; Lamata, Lucas; Solano, Enrique; Molina-Terriza, Gabriel

    2015-07-14

    Gravitational waves, as predicted by Einstein's general relativity theory, appear as ripples in the fabric of spacetime traveling at the speed of light. We prove that the propagation of small amplitude gravitational waves in a curved spacetime is equivalent to the propagation of a subspace of electromagnetic states. We use this result to propose the use of entangled photons to emulate the evolution of gravitational waves in curved spacetimes by means of experimental electromagnetic setups featuring metamaterials.

  17. A Scalable and Dynamic Testbed for Conducting Penetration-Test Training in a Laboratory Environment

    DTIC Science & Technology

    2015-03-01

    entry point through which to execute a payload to accomplish a higher-level goal: executing arbitrary code, escalating privileges, pivoting...Mobile Ad Hoc Network Emulator (EMANE)26 can emulate the entire network stack (physical to application-layer protocols). 2. Methodology To build a...to host Windows, Linux, MacOS, Android, and other operating systems without much effort. 4 E. A simple and automatic "restore" function: Many

  18. On the diagnostic emulation technique and its use in the AIRLAB

    NASA Technical Reports Server (NTRS)

    Migneault, Gerard E.

    1988-01-01

    An aid is presented for understanding and judging the relevance of the diagnostic emulation technique to studies of highly reliable, digital computing systems for aircraft. A short review is presented of the need for and the use of the technique as well as an explanation of its principles of operation and implementation. Details that would be needed for operational control or modification of existing versions of the technique are not described.

  19. Tools and Methods to Create Scenarios for Experimental Research in the Network Science Research Laboratory (NSRL)

    DTIC Science & Technology

    2015-12-01

    research areas in network science. 15. SUBJECT TERMS scenario creation, emulation environment, NSRL 16. SECURITY CLASSIFICATION OF: 17. LIMITATION...mobility aspect of the emulated environment, the development and creation of scenarios play an integral part. By creating scenarios that model certain...during the visualization phase. We examine these 3 phases in detail by describing the creation of a scenario based upon a vignette from the Multi-Level

  20. Environmental control and life support system analysis tools for the Space Station era

    NASA Technical Reports Server (NTRS)

    Blakely, R. L.; Rowell, L. F.

    1984-01-01

    This paper describes the concept of a developing emulation, simulation, sizing, and technology assessment program (ESSTAP) which can be used effectively for the various functional disciplines (structures, power, ECLSS, etc.) beginning with the initial system selection and conceptual design processes and continuing on through the mission operation and growth phases of the Space Station for the purpose of minimizing overall program costs. It will discuss the basic requirements for these tools, as currently envisioned for the Environmental Control and Life Support System (ECLSS), identifying their intended and potential uses and applications, and present examples and status of several representative tools. The development and applications of a Space Station Atmospheric Revitalization Subsystem (ARS) demonstration model to be used for concept verification will also be discussed.

  1. Statistical Emulation of Climate Model Projections Based on Precomputed GCM Runs*

    DOE PAGES

    Castruccio, Stefano; McInerney, David J.; Stein, Michael L.; ...

    2014-02-24

    The authors describe a new approach for emulating the output of a fully coupled climate model under arbitrary forcing scenarios that is based on a small set of precomputed runs from the model. Temperature and precipitation are expressed as simple functions of the past trajectory of atmospheric CO 2 concentrations, and a statistical model is fit using a limited set of training runs. The approach is demonstrated to be a useful and computationally efficient alternative to pattern scaling and captures the nonlinear evolution of spatial patterns of climate anomalies inherent in transient climates. The approach does as well as pattern scaling in all circumstances and substantially better in many; it is not computationally demanding; and, once the statistical model is fit, it produces emulated climate output effectively instantaneously. It may therefore find wide application in climate impacts assessments and other policy analyses requiring rapid climate projections.
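    The core idea, expressing a climate variable as a simple function of the past CO2 trajectory and fitting it on a few precomputed runs, can be sketched with a one-parameter fading-memory emulator. The "GCM" here is a synthetic stand-in and the statistical model is far simpler than the paper's:

```python
import numpy as np

def fading_memory(co2, rho=0.8):
    """Predictor: exponentially weighted average of past log(CO2/280)."""
    x = np.log(np.asarray(co2, float) / 280.0)
    out = np.empty_like(x)
    acc = norm = 0.0
    for t in range(len(x)):
        acc = rho * acc + x[t]
        norm = rho * norm + 1.0
        out[t] = acc / norm
    return out

def synthetic_gcm(co2):
    # Stand-in for one precomputed coupled-model run: temperature anomaly
    # responds to the fading-memory CO2 history with sensitivity 3.0
    return 3.0 * fading_memory(co2)

# Fit the emulator's single coefficient on two "training" forcing scenarios
train = [280.0 * 1.01 ** np.arange(100), np.linspace(280.0, 700.0, 100)]
P = np.concatenate([fading_memory(c) for c in train])
T = np.concatenate([synthetic_gcm(c) for c in train])
beta = float(P @ T / (P @ P))

# Emulate an arbitrary new forcing scenario effectively instantaneously
new = np.linspace(280.0, 1000.0, 100)
print(np.allclose(beta * fading_memory(new), synthetic_gcm(new)))  # True
```

    The fitted emulator generalizes to forcing trajectories it never saw, which is the property that makes this approach attractive for rapid policy analyses.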

  2. Microwave emulations and tight-binding calculations of transport in polyacetylene

    NASA Astrophysics Data System (ADS)

    Stegmann, Thomas; Franco-Villafañe, John A.; Ortiz, Yenni P.; Kuhl, Ulrich; Mortessagne, Fabrice; Seligman, Thomas H.

    2017-01-01

    A novel approach to investigate the electron transport of cis- and trans-polyacetylene chains in the single-electron approximation is presented by using microwave emulation measurements and tight-binding calculations. In the emulation we take into account the different electronic couplings due to the double bonds leading to coupled dimer chains. The relative coupling constants are adjusted by DFT calculations. For sufficiently long chains a transport band gap is observed if the double bonds are present, whereas for identical couplings no band gap opens. The band gap can be observed also in relatively short chains, if additional edge atoms are absent, which cause strong resonance peaks within the band gap. The experimental results are in agreement with our tight-binding calculations using the nonequilibrium Green's function method. The tight-binding calculations show that it is crucial to include third nearest neighbor couplings to obtain the gap in the cis-polyacetylene.
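    The tight-binding/NEGF machinery referred to above can be sketched for a generic dimerized chain: with alternating couplings, transmission at the band center is suppressed by the gap, while identical couplings leave it open. The parameters below are arbitrary illustrations, not the polyacetylene values from the paper:

```python
import numpy as np

def transmission(E, couplings, gamma=0.5):
    """Landauer transmission T(E) = Tr[Gamma_L G Gamma_R G^dagger] for a 1-D
    tight-binding chain with the given nearest-neighbour couplings, attached
    to wide-band leads (constant imaginary self-energy) at both ends."""
    n = len(couplings) + 1
    H = np.zeros((n, n), dtype=complex)
    for i, t in enumerate(couplings):
        H[i, i + 1] = H[i + 1, i] = -t
    sigma = np.zeros((n, n), dtype=complex)       # lead self-energies
    sigma[0, 0] = sigma[-1, -1] = -1j * gamma / 2
    G = np.linalg.inv(E * np.eye(n) - H - sigma)  # retarded Green's function
    # With single-site couplings the trace reduces to gamma^2 |G_1n|^2
    return gamma**2 * abs(G[0, -1]) ** 2

# Alternating strong/weak bonds (double/single); the chain starts and ends on
# a strong bond so no in-gap edge states appear
dimer = ([1.0, 0.6] * 19) + [1.0]   # 39 bonds -> 40 sites
uniform = [0.8] * 39

print(transmission(0.0, dimer), transmission(0.0, uniform))
```

    At E = 0 the dimerized chain's transmission is exponentially small (it lies in the gap of width ~2|t1 - t2|), whereas the uniform chain transmits with order-one probability, mirroring the band-gap observation reported for the double-bond couplings.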

  3. Emulation and design of terahertz reflection-mode confocal scanning microscopy based on virtual pinhole

    NASA Astrophysics Data System (ADS)

    Yang, Yong-fa; Li, Qi

    2014-12-01

    In practical applications of terahertz reflection-mode confocal scanning microscopy, the size of the detector pinhole is an important factor determining the spatial resolution of the microscopic system. However, a physical pinhole brings inconvenience to the experiment, and its adjustment error has a great influence on the experimental result. By reasonably selecting the parameters of a matrix-detector virtual pinhole (VPH), the VPH can efficiently approximate a physical pinhole, significantly reducing the difficulty of experimental calibration. In this article, an imaging scheme for terahertz reflection-mode confocal scanning microscopy based on the matrix-detector VPH is put forward. The influence of detector pinhole size on the axial resolution of confocal scanning microscopy is emulated and analyzed, and the VPH parameters that yield the best axial imaging performance are then determined by emulation.

  4. A learning-based autonomous driver: emulate human driver's intelligence in low-speed car following

    NASA Astrophysics Data System (ADS)

    Wei, Junqing; Dolan, John M.; Litkouhi, Bakhtiar

    2010-04-01

    In this paper, an offline learning mechanism based on the genetic algorithm is proposed for autonomous vehicles to emulate human driver behaviors. The autonomous driving ability is implemented based on a Prediction- and Cost function-Based algorithm (PCB). PCB is designed to emulate a human driver's decision process, which is modeled as traffic scenario prediction and evaluation. This paper focuses on using a learning algorithm to optimize PCB with very limited training data, so that PCB can predict and evaluate traffic scenarios similarly to human drivers. 80 seconds of human driving data were collected in low-speed (< 30 miles/h) car-following scenarios. In the low-speed car-following tests, PCB was able to perform more human-like car-following after learning. A more general 120-kilometer-long simulation showed that PCB performs robustly even in scenarios that are not part of the training set.
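    The offline learning step, a genetic algorithm tuning a cost function so the policy matches human driving data, can be sketched on a toy linear car-following policy. The data, policy form, and GA settings below are all synthetic assumptions, not the paper's PCB algorithm:

```python
import random

random.seed(0)

# Logged (gap, closing-speed) -> acceleration samples, generated by a hidden
# "human" weight vector the GA should recover from the data alone
human_w = (0.5, 1.2)
data = [(g, v, human_w[0] * g + human_w[1] * v)
        for g in (5.0, 10.0, 20.0) for v in (-2.0, 0.0, 2.0)]

def fitness(w):
    """Negative squared error between policy output and logged behaviour."""
    return -sum((w[0] * g + w[1] * v - a) ** 2 for g, v, a in data)

def mutate(w):
    return tuple(x + random.gauss(0.0, 0.1) for x in w)

# Simple elitist GA: keep the 10 fittest, refill by mutating survivors
pop = [(random.uniform(0.0, 2.0), random.uniform(0.0, 2.0)) for _ in range(30)]
for _ in range(300):
    pop.sort(key=fitness, reverse=True)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(20)]

best = max(pop, key=fitness)
print(best)  # should land close to human_w = (0.5, 1.2)
```

    Because the fitness is simply agreement with logged behavior, the same loop works with very little training data, which matches the paper's emphasis on learning from only 80 seconds of driving.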

  5. Compact first and second order polarization mode dispersion emulator

    NASA Astrophysics Data System (ADS)

    Zhang, Yang; Li, Shiguang; Yang, Changxi

    2005-08-01

    We propose a 1st- and 2nd-order polarization mode dispersion emulator (PMDE) with one variable differential group delay (DGD) element using birefringent crystals and four polarization controllers (PCs). Monte Carlo simulations demonstrate that the 1st- and 2nd-order polarization mode dispersion (PMD) generated by the PMDE is consistent with statistical theory. Compared with former PMDEs, this design is tunable, lower-cost, and more integrated for fabrication; it shows a response time of 150 μs, a response frequency of 3.8 kHz, a working wavelength of 1550 nm, a total power consumption of less than 3 W, and working ranges of 0-84 ps and 0-3600 ps² for 1st- and 2nd-order PMD emulation, respectively. It is also programmable and can be controlled by either a single-chip microcontroller or a computer. It can be applied to study the outage probability of optical communication systems due to the PMD effect and the effectiveness of PMD compensation.
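    The Monte Carlo statistics check mentioned above can be illustrated directly: summing many randomly oriented per-section PMD vectors yields a first-order DGD whose mean follows the Maxwellian prediction. The section count and per-section DGD below are arbitrary, not the emulator's design values:

```python
import numpy as np

rng = np.random.default_rng(3)
N, dgd, trials = 30, 1.0, 20000   # sections per emulator, DGD per section (ps)

# Random unit vectors on the Poincare sphere, one per section per trial
v = rng.normal(size=(trials, N, 3))
v /= np.linalg.norm(v, axis=2, keepdims=True)

# First-order PMD vector = vector sum of the section contributions
tau = np.linalg.norm(dgd * v.sum(axis=1), axis=1)

mean_mc = float(tau.mean())
mean_theory = 2.0 * np.sqrt(2.0 * N / (3.0 * np.pi)) * dgd  # Maxwellian mean
print(round(mean_mc, 2), round(mean_theory, 2))
```

    The Monte Carlo mean DGD agrees with the Maxwellian value to within a few percent, which is the kind of consistency-with-statistical-theory check the abstract refers to.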

  6. Beamforming synthesis of binaural responses from computer simulations of acoustic spaces.

    PubMed

    Poletti, Mark A; Svensson, U Peter

    2008-07-01

    Auditorium designs can be evaluated prior to construction by numerical modeling of the design. High-accuracy numerical modeling produces the sound pressure on a rectangular grid, and subjective assessment of the design requires auralization of the sampled sound field at a desired listener position. This paper investigates the production of binaural outputs from the sound pressure at a selected number of grid points by using a least squares beam forming approach. Low-frequency axisymmetric emulations are derived by assuming a solid sphere model of the head, and a spherical array of 640 microphones is used to emulate ten measured head-related transfer function (HRTF) data sets from the CIPIC database for half the audio bandwidth. The spherical array can produce high-accuracy band-limited emulation of any human subject's measured HRTFs for a fixed listener position by using individual sets of beam forming impulse responses.
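    The least-squares beamforming step can be sketched as a linear solve: choose weights for the sampled field points so their combined response matches a target ear response. The transfer matrix and target below are random stand-ins for simulated field data and a measured HRTF:

```python
import numpy as np

rng = np.random.default_rng(4)
n_freq, n_pts = 64, 16   # frequency bins x sampled grid/array points

# A[f, m]: (synthetic) transfer function from grid point m to the ear at
# frequency bin f; a real design would take these from the field simulation
A = rng.normal(size=(n_freq, n_pts)) + 1j * rng.normal(size=(n_freq, n_pts))

# Target response (stand-in for one measured HRTF), chosen here to lie in
# the span of A so the least-squares fit is exact
w_true = rng.normal(size=n_pts)
target = A @ w_true

# Least-squares beamforming weights: w = argmin ||A w - target||^2
w, *_ = np.linalg.lstsq(A, target, rcond=None)
print(np.allclose(A @ w, target))  # True: the weighted sum reproduces the target
```

    In practice the target HRTF is only approximately in the span of the sampled field, so the fit is band-limited rather than exact, as the abstract notes for the 640-microphone spherical array.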

  7. Simulation/Emulation Techniques: Compressing Schedules With Parallel (HW/SW) Development

    NASA Technical Reports Server (NTRS)

    Mangieri, Mark L.; Hoang, June

    2014-01-01

    NASA has always been in the business of balancing new technologies and techniques to achieve human space travel objectives. NASA's Kedalion engineering analysis lab has been validating and using many contemporary avionics HW/SW development and integration techniques, which represent new paradigms to NASA's heritage culture. Kedalion has validated many of the Orion HW/SW engineering techniques borrowed from the adjacent commercial aircraft avionics solution space, inserting new techniques and skills into the Multi-Purpose Crew Vehicle (MPCV) Orion program. Using contemporary agile techniques, commercial-off-the-shelf (COTS) products, early rapid prototyping, in-house expertise and tools, and extensive use of simulators and emulators, NASA has achieved cost-effective paradigms that are currently serving the Orion program effectively. Elements of long-lead custom hardware on the Orion program have necessitated early use of simulators and emulators in advance of deliverable hardware to achieve parallel design and development on a compressed schedule.

  8. Emulating weak localization using a solid-state quantum circuit.

    PubMed

    Chen, Yu; Roushan, P; Sank, D; Neill, C; Lucero, Erik; Mariantoni, Matteo; Barends, R; Chiaro, B; Kelly, J; Megrant, A; Mutus, J Y; O'Malley, P J J; Vainsencher, A; Wenner, J; White, T C; Yin, Yi; Cleland, A N; Martinis, John M

    2014-10-14

    Quantum interference is one of the most fundamental physical effects found in nature. Recent advances in quantum computing now employ interference as a fundamental resource for computation and control. Quantum interference also lies at the heart of sophisticated condensed matter phenomena such as Anderson localization, phenomena that are difficult to reproduce in numerical simulations. Here, employing a multiple-element superconducting quantum circuit, with which we manipulate a single microwave photon, we demonstrate that we can emulate the basic effects of weak localization. By engineering the control sequence, we are able to reproduce the well-known negative magnetoresistance of weak localization as well as its temperature dependence. Furthermore, we can use our circuit to continuously tune the level of disorder, a parameter that is not readily accessible in mesoscopic systems. Demonstrating a high level of control, our experiment shows the potential for employing superconducting quantum circuits as emulators for complex quantum phenomena.

  9. Navigating tissue chips from development to dissemination: A pharmaceutical industry perspective

    PubMed Central

    Fabre, Kristin; Chakilam, Ananthsrinivas; Dragan, Yvonne; Duignan, David B; Eswaraka, Jeetu; Gan, Jinping; Guzzie-Peck, Peggy; Otieno, Monicah; Jeong, Claire G; Keller, Douglas A; de Morais, Sonia M; Phillips, Jonathan A; Proctor, William; Sura, Radhakrishna; Van Vleet, Terry; Watson, David; Will, Yvonne; Tagle, Danilo; Berridge, Brian

    2017-01-01

    Tissue chips are poised to deliver a paradigm shift in drug discovery. By emulating human physiology, these chips have the potential to increase the predictive power of preclinical modeling, which in turn will move the pharmaceutical industry closer to its aspiration of clinically relevant and ultimately animal-free drug discovery. Despite the tremendous science and innovation invested in these tissue chips, significant challenges remain to be addressed to enable their routine adoption into the industrial laboratory. This article describes the main steps that need to be taken and highlights key considerations in order to transform tissue chip technology from the hands of the innovators into those of the industrial scientists. Written by scientists from 13 pharmaceutical companies and partners at the National Institutes of Health, this article uniquely captures a consensus view on the progression strategy to facilitate and accelerate the adoption of this valuable technology. It concludes that success will be delivered by a partnership approach as well as a deep understanding of the context within which these chips will actually be used. Impact statement The rapid pace of scientific innovation in the tissue chip (TC) field requires a cohesive partnership between innovators and end users. Near-term uptake of these human-relevant platforms will fill gaps in current capabilities for assessing important properties of disposition, efficacy and safety liabilities. Similarly, these platforms could support mechanistic studies which aim to resolve challenges later in development (e.g. assessing the human relevance of a liability identified in animal studies). Building confidence that novel capabilities of TCs can address real-world challenges while they themselves are being developed will accelerate their application in the discovery and development of innovative medicines. This article outlines a strategic roadmap to unite innovators and end users, making implementation smooth and rapid. With the collective contributions from multiple international pharmaceutical companies and partners at the National Institutes of Health, this article should serve as an invaluable resource to the multi-disciplinary field of TC development. PMID:28622731

  10. Navigating tissue chips from development to dissemination: A pharmaceutical industry perspective.

    PubMed

    Ewart, Lorna; Fabre, Kristin; Chakilam, Ananthsrinivas; Dragan, Yvonne; Duignan, David B; Eswaraka, Jeetu; Gan, Jinping; Guzzie-Peck, Peggy; Otieno, Monicah; Jeong, Claire G; Keller, Douglas A; de Morais, Sonia M; Phillips, Jonathan A; Proctor, William; Sura, Radhakrishna; Van Vleet, Terry; Watson, David; Will, Yvonne; Tagle, Danilo; Berridge, Brian

    2017-10-01

    Tissue chips are poised to deliver a paradigm shift in drug discovery. By emulating human physiology, these chips have the potential to increase the predictive power of preclinical modeling, which in turn will move the pharmaceutical industry closer to its aspiration of clinically relevant and ultimately animal-free drug discovery. Despite the tremendous science and innovation invested in these tissue chips, significant challenges remain to be addressed to enable their routine adoption into the industrial laboratory. This article describes the main steps that need to be taken and highlights key considerations in order to transform tissue chip technology from the hands of the innovators into those of the industrial scientists. Written by scientists from 13 pharmaceutical companies and partners at the National Institutes of Health, this article uniquely captures a consensus view on the progression strategy to facilitate and accelerate the adoption of this valuable technology. It concludes that success will be delivered by a partnership approach as well as a deep understanding of the context within which these chips will actually be used. Impact statement The rapid pace of scientific innovation in the tissue chip (TC) field requires a cohesive partnership between innovators and end users. Near-term uptake of these human-relevant platforms will fill gaps in current capabilities for assessing important properties of disposition, efficacy and safety liabilities. Similarly, these platforms could support mechanistic studies which aim to resolve challenges later in development (e.g. assessing the human relevance of a liability identified in animal studies). Building confidence that novel capabilities of TCs can address real-world challenges while they themselves are being developed will accelerate their application in the discovery and development of innovative medicines. This article outlines a strategic roadmap to unite innovators and end users, making implementation smooth and rapid. With the collective contributions from multiple international pharmaceutical companies and partners at the National Institutes of Health, this article should serve as an invaluable resource to the multi-disciplinary field of TC development.

  11. Relating dynamic brain states to dynamic machine states: Human and machine solutions to the speech recognition problem

    PubMed Central

    Liu, Xunying; Zhang, Chao; Woodland, Phil; Fonteneau, Elisabeth

    2017-01-01

    There is widespread interest in the relationship between the neurobiological systems supporting human cognition and emerging computational systems capable of emulating these capacities. Human speech comprehension, poorly understood as a neurobiological process, is an important case in point. Automatic Speech Recognition (ASR) systems with near-human levels of performance are now available, which provide a computationally explicit solution for the recognition of words in continuous speech. This research aims to bridge the gap between speech recognition processes in humans and machines, using novel multivariate techniques to compare incremental ‘machine states’, generated as the ASR analysis progresses over time, to the incremental ‘brain states’, measured using combined electro- and magneto-encephalography (EMEG), generated as the same inputs are heard by human listeners. This direct comparison of dynamic human and machine internal states, as they respond to the same incrementally delivered sensory input, revealed a significant correspondence between neural response patterns in human superior temporal cortex and the structural properties of ASR-derived phonetic models. Spatially coherent patches in human temporal cortex responded selectively to individual phonetic features defined on the basis of machine-extracted regularities in the speech to lexicon mapping process. These results demonstrate the feasibility of relating human and ASR solutions to the problem of speech recognition, and suggest the potential for further studies relating complex neural computations in human speech comprehension to the rapidly evolving ASR systems that address the same problem domain. PMID:28945744
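
    The multivariate machine-state/brain-state comparison follows the logic of representational similarity analysis: build a dissimilarity matrix for each system over the same inputs, then correlate the two matrices. The sketch below uses random data standing in for EMEG recordings and ASR states, and plain Pearson correlation for simplicity; the study's exact stimuli, state definitions and statistics differ.

```python
import numpy as np

def rdm(states):
    """Representational dissimilarity matrix over stimuli (rows):
    1 - Pearson correlation between the state vectors they evoke."""
    return 1.0 - np.corrcoef(states)

def rsa_score(states_a, states_b):
    """Second-order comparison: correlate the upper triangles of two RDMs."""
    a, b = rdm(states_a), rdm(states_b)
    iu = np.triu_indices_from(a, k=1)
    return float(np.corrcoef(a[iu], b[iu])[0, 1])

# Illustrative data: 20 "stimuli"; machine states and (noisier) brain states
# share a common 5-dimensional latent structure, analogous to the shared
# phonetic structure hypothesized in the study.
rng = np.random.default_rng(0)
latent = rng.standard_normal((20, 5))
machine = latent @ rng.standard_normal((5, 30))
brain = latent @ rng.standard_normal((5, 100)) + 0.5*rng.standard_normal((20, 100))

matched = rsa_score(machine, brain)                           # structured correspondence
control = rsa_score(machine, rng.standard_normal((20, 100)))  # no shared structure
```

    A significant gap between the matched score and the unstructured control is the kind of evidence the direct human/machine comparison relies on.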

  12. A reconfigurable on-line learning spiking neuromorphic processor comprising 256 neurons and 128K synapses

    PubMed Central

    Qiao, Ning; Mostafa, Hesham; Corradi, Federico; Osswald, Marc; Stefanini, Fabio; Sumislawska, Dora; Indiveri, Giacomo

    2015-01-01

    Implementing compact, low-power artificial neural processing systems with real-time on-line learning abilities is still an open challenge. In this paper we present a full-custom mixed-signal VLSI device with neuromorphic learning circuits that emulate the biophysics of real spiking neurons and dynamic synapses, for exploring the properties of computational neuroscience models and for building brain-inspired computing systems. The proposed architecture allows the on-chip configuration of a wide range of network connectivities, including recurrent and deep networks, with short-term and long-term plasticity. The device comprises 128K analog synapse circuits and 256 neuron circuits with biologically plausible dynamics and bi-stable spike-based plasticity mechanisms that endow it with on-line learning abilities. In addition to the analog circuits, the device also comprises asynchronous digital logic circuits for setting different synapse and neuron properties as well as different network configurations. This prototype device, fabricated using a 180 nm 1P6M CMOS process, occupies an area of 51.4 mm², and consumes approximately 4 mW for typical experiments, for example those involving attractor networks. Here we describe the details of the overall architecture and of the individual circuits, and present experimental results that showcase the device's potential. By supporting a wide range of cortical-like computational modules comprising plasticity mechanisms, this device will enable the realization of intelligent autonomous systems with on-line learning capabilities. PMID:25972778

  13. Quantum Emulation of Gravitational Waves

    PubMed Central

    Fernandez-Corbaton, Ivan; Cirio, Mauro; Büse, Alexander; Lamata, Lucas; Solano, Enrique; Molina-Terriza, Gabriel

    2015-01-01

    Gravitational waves, as predicted by Einstein’s general relativity theory, appear as ripples in the fabric of spacetime traveling at the speed of light. We prove that the propagation of small amplitude gravitational waves in a curved spacetime is equivalent to the propagation of a subspace of electromagnetic states. We use this result to propose the use of entangled photons to emulate the evolution of gravitational waves in curved spacetimes by means of experimental electromagnetic setups featuring metamaterials. PMID:26169801

  14. The reflection of evolving bearing faults in the stator current's extended park vector approach for induction machines

    NASA Astrophysics Data System (ADS)

    Corne, Bram; Vervisch, Bram; Derammelaere, Stijn; Knockaert, Jos; Desmet, Jan

    2018-07-01

    Stator current analysis has the potential of becoming the most cost-effective condition monitoring technology for electric rotating machinery. Since both electrical and mechanical faults are detected by inexpensive and robust current sensors, measuring current is advantageous over other techniques such as vibration, acoustic or temperature analysis. However, this technology is struggling to break into the condition monitoring market because the electrical interpretation of mechanical machine problems is highly complicated. Recently, the authors built a test-rig which facilitates the emulation of several representative mechanical faults on an 11 kW induction machine with high accuracy and reproducibility. Operating this test-rig, the stator current of the induction machine under test can be analyzed while mechanical faults are emulated. Furthermore, while emulating, the fault severity can be manipulated adaptively under controllable environmental conditions. This creates the opportunity of examining the relation between the magnitude of the well-known current fault components and the corresponding fault severity. This paper presents the emulation of evolving bearing faults and their reflection in the Extended Park Vector Approach for the 11 kW induction machine under test. The results confirm the strong relation between the bearing faults and the stator current fault components, in both identification and fault severity. Conclusively, stator current analysis increases reliability in its application as a complete, robust, on-line condition monitoring technology.
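
    The core of the Extended Park Vector Approach can be sketched in a few lines: the three phase currents are projected onto the Park (d, q) plane and the spectrum of the vector's modulus is examined for fault-related lines. The signal below is synthetic; the 87 Hz amplitude modulation is an arbitrary stand-in for a bearing-fault characteristic frequency, not data from the authors' test-rig.

```python
import numpy as np

def park_vector_modulus(i_a, i_b, i_c):
    # Concordia/Park projection of the three phase currents onto the (d, q) plane
    i_d = np.sqrt(2.0/3.0)*i_a - (1.0/np.sqrt(6.0))*(i_b + i_c)
    i_q = (1.0/np.sqrt(2.0))*(i_b - i_c)
    return np.hypot(i_d, i_q)

# Synthetic balanced 50 Hz supply currents; a 2% amplitude modulation at
# 87 Hz stands in for a bearing-fault signature.
fs, f0 = 10_000, 50.0
t = np.arange(0, 1.0, 1.0/fs)
mod = 1.0 + 0.02*np.cos(2*np.pi*87.0*t)
i_a = mod*np.cos(2*np.pi*f0*t)
i_b = mod*np.cos(2*np.pi*f0*t - 2*np.pi/3)
i_c = mod*np.cos(2*np.pi*f0*t + 2*np.pi/3)

# EPVA: the fault appears directly in the spectrum of the Park vector
# modulus, free of the dominant supply-frequency component
m = park_vector_modulus(i_a, i_b, i_c)
spectrum = np.abs(np.fft.rfft(m - m.mean()))
freqs = np.fft.rfftfreq(m.size, 1.0/fs)
peak = freqs[np.argmax(spectrum)]
```

    For a healthy, balanced machine the modulus is constant, so any spectral line in it is fault-related; the magnitude of that line is what can be tracked against fault severity.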

  15. SiSeRHMap v1.0: a simulator for mapped seismic response using a hybrid model

    NASA Astrophysics Data System (ADS)

    Grelle, Gerardo; Bonito, Laura; Lampasi, Alessandro; Revellino, Paola; Guerriero, Luigi; Sappa, Giuseppe; Guadagno, Francesco Maria

    2016-04-01

    The SiSeRHMap (simulator for mapped seismic response using a hybrid model) is a computerized methodology capable of elaborating prediction maps of seismic response in terms of acceleration spectra. It was realized on the basis of a hybrid model which combines different approaches and models in a new and non-conventional way. These approaches and models are organized in a code architecture composed of five interdependent modules. A GIS (geographic information system) cubic model (GCM), which is a layered computational structure based on the concept of lithodynamic units and zones, aims at reproducing a parameterized layered subsoil model. A meta-modelling process confers a hybrid nature to the methodology. In this process, the one-dimensional (1-D) linear equivalent analysis produces acceleration response spectra for a specified number of site profiles using one or more input motions. The shear wave velocity-thickness profiles, defined as trainers, are randomly selected in each zone. Subsequently, a numerical adaptive simulation model (Emul-spectra) is optimized on the above trainer acceleration response spectra by means of a dedicated evolutionary algorithm (EA) and the Levenberg-Marquardt algorithm (LMA) as the final optimizer. In the final step, the GCM maps executor module produces a serial map set of a stratigraphic seismic response at different periods, grid solving the calibrated Emul-spectra model. In addition, the spectra topographic amplification is also computed by means of a 3-D validated numerical prediction model. This model is built to match the results of the numerical simulations related to isolate reliefs using GIS morphometric data. In this way, different sets of seismic response maps are developed on which maps of design acceleration response spectra are also defined by means of an enveloping technique.

  16. An abdominal active can defibrillator may facilitate a successful generator change when a lead failure is present.

    PubMed

    Solomon, A J; Moubarak, J B; Drood, J M; Tracy, C M; Karasik, P E

    1999-10-01

    Defibrillator generator changes are frequently performed on patients with an implantable cardioverter defibrillator in an abdominal pocket. These patients usually have epicardial patches or older endocardial lead systems. At the time of a defibrillator generator change, defibrillation may be unsuccessful as a result of lead failure. We tested the hypothesis that an active can defibrillator implanted in the abdominal pocket could replace a non-functioning endocardial lead or epicardial patch. An abdominal defibrillator generator change was performed in 10 patients (mean age = 67 +/- 13 years; nine men). Initially, a defibrillation threshold (DFT) was obtained using a passive defibrillator and the chronic endocardial or epicardial lead system. DFTs were then performed using an active can emulator and one chronic lead to simulate endocardial or epicardial lead failure. We tested 30 lead configurations (nine endocardial and 21 epicardial). Although a DFT of 7.3 +/- 4.2 joules was obtained with the intact chronic lead system, the active can emulator and one endocardial or epicardial lead still yielded an acceptable DFT of 19.9 +/- 6.1 joules. In addition, a successful implant (DFT < or = 24 joules) could have been accomplished in 28 of 30 (93%) lead configurations. An active can defibrillator in an abdominal pocket may allow for a successful generator change in patients with defibrillator lead malfunction. This would be simpler than abandoning the abdominal implant and moving to a new pectoral device and lead or tunnelling a new endocardial electrode. However, loss of defibrillation capability with a particular complex lead may be a warning of impending loss of other functions (e.g., sensing and/or pacing).

  17. Dynamical analysis of Parkinsonian state emulated by hybrid Izhikevich neuron models

    NASA Astrophysics Data System (ADS)

    Liu, Chen; Wang, Jiang; Yu, Haitao; Deng, Bin; Wei, Xile; Li, Huiyan; Loparo, Kenneth A.; Fietkiewicz, Chris

    2015-11-01

    Computational models play a significant role in exploring novel theories to complement the findings of physiological experiments. Various computational models have been developed to reveal the mechanisms underlying brain functions. Particularly, in the development of therapies to modulate behavioral and pathological abnormalities, computational models provide the basic foundations to exhibit transitions between physiological and pathological conditions. Considering the significant roles of the intrinsic properties of the globus pallidus and the coupling connections between neurons in determining the firing patterns and the dynamical activities of the basal ganglia neuronal network, we propose a hypothesis that pathological behaviors under the Parkinsonian state may originate from combined effects of intrinsic properties of globus pallidus neurons and synaptic conductances in the whole neuronal network. In order to establish a computationally efficient network model, a hybrid Izhikevich neuron model is used due to its capacity to capture the dynamical characteristics of biological neuronal activities. Detailed analysis of the individual Izhikevich neuron model can assist in understanding the roles of model parameters, which then facilitates the establishment of the basal ganglia-thalamic network model, and contributes to a further exploration of the underlying mechanisms of the Parkinsonian state. Simulation results show that the hybrid Izhikevich neuron model is capable of capturing many of the dynamical properties of the basal ganglia-thalamic neuronal network, such as variations of the firing rates and emergence of synchronous oscillations under the Parkinsonian condition, despite the simplicity of the two-dimensional neuronal model. This suggests that the computationally efficient hybrid Izhikevich neuron model can be used to explore basal ganglia normal and abnormal functions. In particular, it provides an efficient way of emulating large-scale neuronal networks and may contribute to the development of improved therapies for neurological disorders such as Parkinson's disease.
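
    The two-variable Izhikevich model underlying this work is compact enough to sketch directly. Below is a minimal forward-Euler integration with the standard regular-spiking parameter set from the Izhikevich model literature; it is a single-neuron illustration, not the authors' hybrid basal ganglia-thalamic network.

```python
def izhikevich(a=0.02, b=0.2, c=-65.0, d=8.0, I=10.0, T=1000.0, dt=0.25):
    """Forward-Euler integration of the two-variable Izhikevich model.

    v' = 0.04 v^2 + 5 v + 140 - u + I,   u' = a (b v - u);
    on a spike (v >= 30 mV): v <- c, u <- u + d.
    Default parameters give the classic regular-spiking pattern.
    """
    v, u = c, b*c
    spike_times = []
    for k in range(int(T/dt)):
        v += dt*(0.04*v*v + 5.0*v + 140.0 - u + I)
        u += dt*a*(b*v - u)
        if v >= 30.0:              # spike detected: record and reset
            spike_times.append(k*dt)
            v, u = c, u + d
    return spike_times

spikes = izhikevich()   # spike times (ms) over a 1 s constant-current run
```

    Changing only the four parameters (a, b, c, d) switches the firing pattern (bursting, fast spiking, etc.), which is the property that makes the model attractive for efficient large-scale network studies.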

  18. NWP model forecast skill optimization via closure parameter variations

    NASA Astrophysics Data System (ADS)

    Järvinen, H.; Ollinaho, P.; Laine, M.; Solonen, A.; Haario, H.

    2012-04-01

    We present results of a novel approach to tuning the predictive skill of numerical weather prediction (NWP) models. These models contain tunable parameters which appear in the parameterization schemes of sub-grid scale physical processes. The current practice is to specify the numerical parameter values manually, based on expert knowledge. We recently developed a concept and method (QJRMS 2011) for on-line estimation of NWP model parameters via closure parameter variations. The method, called EPPES ("Ensemble prediction and parameter estimation system"), utilizes the ensemble prediction infrastructure for parameter estimation in a very cost-effective way: practically no new computations are introduced. The approach provides an algorithmic decision-making tool for model parameter optimization in operational NWP. In EPPES, statistical inference about the NWP model tunable parameters is made by (i) generating an ensemble of predictions so that each member uses different model parameter values, drawn from a proposal distribution, and (ii) feeding back the relative merits of the parameter values to the proposal distribution, based on evaluation of a suitable likelihood function against verifying observations. In this presentation, the method is first illustrated in low-order numerical tests using a stochastic version of the Lorenz-95 model, which effectively emulates the principal features of ensemble prediction systems. The EPPES method correctly detects the unknown and wrongly specified parameter values, and leads to improved forecast skill. Second, results with an ensemble prediction system emulator, based on the ECHAM5 atmospheric GCM, show that the model tuning capability of EPPES scales up to realistic models and ensemble prediction systems. Finally, preliminary results of EPPES in the context of the ECMWF forecasting system are presented.
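
    The two-step idea, (i) sample parameter values from a proposal distribution across ensemble members, then (ii) feed back their relative merits into the proposal, can be illustrated on the Lorenz-95 model. The toy loop below recovers the forcing F of a truth run with F = 8; the likelihood weighting, proposal update and all numerical settings are simplified stand-ins for the published EPPES method, not its actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def l95_rhs(x, F):
    # Lorenz-95: dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F (cyclic indices)
    return (np.roll(x, -1) - np.roll(x, 2))*np.roll(x, 1) - x + F

def step(x, F, dt=0.05):
    # One classical RK4 step
    k1 = l95_rhs(x, F)
    k2 = l95_rhs(x + 0.5*dt*k1, F)
    k3 = l95_rhs(x + 0.5*dt*k2, F)
    k4 = l95_rhs(x + dt*k3, F)
    return x + (dt/6.0)*(k1 + 2*k2 + 2*k3 + k4)

def forecast(x, F, nsteps=4):
    for _ in range(nsteps):
        x = step(x, F)
    return x

# "Truth" run: the forcing to be recovered is F = 8
n = 40
x = 8.0 + rng.standard_normal(n)
for _ in range(500):              # spin up onto the attractor
    x = step(x, 8.0)

mu, sigma = 10.0, 2.0             # proposal distribution for F, deliberately wrong
for cycle in range(30):
    obs = forecast(x, 8.0)        # verifying "observations"
    F_samples = rng.normal(mu, sigma, 20)   # (i) ensemble with perturbed parameters
    errors = np.array([np.mean((forecast(x, F) - obs)**2) for F in F_samples])
    w = np.exp(-0.5*errors/errors.min())    # crude likelihood weights
    w /= w.sum()
    mu = float(np.sum(w*F_samples))         # (ii) feed back relative merits
    sigma = max(0.3, float(np.sqrt(np.sum(w*(F_samples - mu)**2))))
    x = obs                       # advance to the next assimilation window
# mu has drifted from 10 toward the true forcing F = 8
```

    As in EPPES proper, the only extra cost over running the ensemble itself is the per-member scoring and the proposal update.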

  19. The emerging phenomenon of electronic cigarettes.

    PubMed

    Caponnetto, Pasquale; Campagna, Davide; Papale, Gabriella; Russo, Cristina; Polosa, Riccardo

    2012-02-01

    The need for novel and more effective approaches to tobacco control is unquestionable. The electronic cigarette is a battery-powered electronic nicotine delivery system that looks very similar to a conventional cigarette and is capable of emulating smoking, but without the combustion products responsible for smoking's damaging effects. Smokers who decide to switch to electronic cigarettes instead of continuing to smoke would achieve large health gains. The electronic cigarette is an emerging phenomenon that is becoming increasingly popular with smokers worldwide. Users report buying them to help quit smoking, to reduce cigarette consumption, to relieve tobacco withdrawal symptoms due to workplace smoking restrictions and to continue to have a 'smoking' experience but with reduced health risks. The focus of the present article is the health effects of using electronic cigarettes, with consideration given to the acceptability, safety and effectiveness of this product to serve as a long-term substitute for smoking or as a tool for smoking cessation.

  20. Finding structure in the dark: Coupled dark energy, weak lensing, and the mildly nonlinear regime

    NASA Astrophysics Data System (ADS)

    Miranda, Vinicius; González, Mariana Carrillo; Krause, Elisabeth; Trodden, Mark

    2018-03-01

    We reexamine interactions between the dark sectors of cosmology, with a focus on robust constraints that can be obtained using only mildly nonlinear scales. While it is well known that couplings between dark matter and dark energy can be constrained to the percent level when including the full range of scales probed by future optical surveys, calibrating matter power spectrum emulators to all possible choices of potentials and couplings requires many computationally expensive n-body simulations. Here we show that lensing and clustering of galaxies in combination with the cosmic microwave background (CMB) are capable of probing the dark sector coupling to the few percent level for a given class of models, using only linear and quasilinear Fourier modes. These scales can, in principle, be described by semianalytical techniques such as the effective field theory of large-scale structure.

  1. Intelligent control based on fuzzy logic and neural net theory

    NASA Technical Reports Server (NTRS)

    Lee, Chuen-Chien

    1991-01-01

    In the conception and design of intelligent systems, one promising direction involves the use of fuzzy logic and neural network theory to enhance such systems' capability to learn from experience and adapt to changes in an environment of uncertainty and imprecision. Here, an intelligent control scheme is explored by integrating these multidisciplinary techniques. A self-learning system is proposed as an intelligent controller for dynamical processes, employing a control policy which evolves and improves automatically. One key component of the intelligent system is a fuzzy logic-based system which emulates human decision making behavior. It is shown that the system can solve a fairly difficult control learning problem. Simulation results demonstrate that improved learning performance can be achieved in relation to previously described systems employing bang-bang control. The proposed system is relatively insensitive to variations in the parameters of the system environment.

  2. Estimated Muscle Loads During Squat Exercise in Microgravity Conditions

    NASA Technical Reports Server (NTRS)

    Fregly, Christopher D.; Kim, Brandon T.; Li, Zhao; DeWitt, John K.; Fregly, Benjamin J.

    2012-01-01

    Loss of muscle mass in microgravity is one of the primary factors limiting long-term space flight. NASA researchers have developed a number of exercise devices to address this problem. The most recent is the Advanced Resistive Exercise Device (ARED), which is currently used by astronauts on the International Space Station (ISS) to emulate typical free-weight exercises in microgravity. ARED exercise on the ISS is intended to reproduce Earth-level muscle loads, but the actual muscle loads produced remain unknown as they cannot currently be measured directly. In this study we estimated muscle loads experienced during squat exercise on ARED in microgravity conditions representative of Mars, the moon, and the ISS. The estimates were generated using a subject-specific musculoskeletal computer model and ARED exercise data collected on Earth. The results provide insight into the capabilities and limitations of the ARED machine.

  3. High pressure reaction cell and transfer mechanism for ultrahigh vacuum spectroscopic chambers

    NASA Astrophysics Data System (ADS)

    Nelson, A. E.; Schulz, K. H.

    2000-06-01

    A novel high pressure reaction cell and sample transfer mechanism for ultrahigh vacuum (UHV) spectroscopic chambers is described. The design employs a unique modification of a commercial load-lock transfer system to emulate a tractable microreactor. The reaction cell has an operating pressure range of <1×10⁻⁴ to 1000 Torr and can be evacuated to UHV conditions to enable sample transfer into the spectroscopic chamber. Additionally, a newly designed sample holder equipped with electrical and thermocouple contacts is described. The sample holder is capable of resistive specimen heating to 400 and 800 °C with current requirements of 14 A (2 V) and 25 A (3.5 V), respectively. The design enables thorough material science characterization of catalytic reactions and the surface chemistry of catalytic materials without exposing the specimen to atmospheric contaminants. The system is constructed primarily from readily available commercial equipment allowing its rapid implementation into existing laboratories.

  4. Additive manufacturing of biologically-inspired materials.

    PubMed

    Studart, André R

    2016-01-21

    Additive manufacturing (AM) technologies offer an attractive pathway towards the fabrication of functional materials featuring complex heterogeneous architectures inspired by biological systems. In this paper, recent research on the use of AM approaches to program the local chemical composition, structure and properties of biologically-inspired materials is reviewed. A variety of structural motifs found in biological composites have been successfully emulated in synthetic systems using inkjet-based, direct-writing, stereolithography and slip casting technologies. The replication in synthetic systems of design principles underlying such structural motifs has enabled the fabrication of lightweight cellular materials, strong and tough composites, soft robots and autonomously shaping structures with unprecedented properties and functionalities. Pushing the current limits of AM technologies in future research should bring us closer to the manufacturing capabilities of living organisms, opening the way for the digital fabrication of advanced materials with superior performance, lower environmental impact and new functionalities.

  5. Acquiring Spectra of Solar System Objects with the NIRSpec Instrument on the James Webb Space Telescope

    NASA Astrophysics Data System (ADS)

    Proffitt, Charles R.; Birkmann, Stephan; Ferruit, Pierre; Guilbert, Aurelie; Holler, Bryan J.; Stansberry, John

    2017-10-01

    The NIRSpec Instrument on the James Webb Space Telescope will allow near-IR spectroscopy in the wavelength range between 0.6 and 5.3 microns with resolving power of ~100, 1000, or 2700. We review strategies for performing spectral observations of solar system objects using each of NIRSpec's available observing modes, including the integral field unit (IFU), Multi-Object Spectroscopy (MOS), and fixed slit (FS) templates, and discuss how the choice of mode affects the limiting target brightness as well as the detailed wavelength and spatial coverage obtained. We also discuss the expected pointing accuracy and target acquisition options for moving targets, including the use and limitations of the Wide Aperture Target Acquisition (WATA) capability and of the pre-defined field points that will be available for use with the MOS template to enable the use of custom micro-shutter patterns, including ones emulating very long slits.

  6. Modeling tabular icebergs submerged in the ocean

    NASA Astrophysics Data System (ADS)

    Stern, A. A.; Adcroft, A.; Sergienko, O.; Marques, G.

    2017-08-01

    Large tabular icebergs calved from Antarctic ice shelves have long lifetimes (due to their large size), during which they drift across large distances, altering ambient ocean circulation, bottom-water formation, sea-ice formation, and biological primary productivity in the icebergs' vicinity. However, despite their importance, the current generation of ocean circulation models usually do not represent large tabular icebergs. In this study, we develop a novel framework to model large tabular icebergs submerged in the ocean. In this framework, tabular icebergs are represented by pressure-exerting Lagrangian elements that drift in the ocean. The elements are held together and interact with each other via bonds. A breaking of these bonds allows the model to emulate calving events (i.e., detachment of a tabular iceberg from an ice shelf) and tabular icebergs breaking up into smaller pieces. Idealized simulations of a calving tabular iceberg, its drift, and its breakup demonstrate capabilities of the developed framework.

  7. Performance evaluation of heart sound cancellation in FPGA hardware implementation for electronic stethoscope.

    PubMed

    Chao, Chun-Tang; Maneetien, Nopadon; Wang, Chi-Jo; Chiou, Juing-Shian

    2014-01-01

    This paper presents the design and evaluation of the hardware circuit for electronic stethoscopes with heart sound cancellation capabilities using field programmable gate arrays (FPGAs). The adaptive line enhancer (ALE) was adopted as the filtering methodology to reduce heart sound attributes from the breath sounds obtained via the electronic stethoscope pickup. FPGAs were utilized to implement the ALE functions in hardware to achieve near real-time breath sound processing. We believe that such an implementation is unprecedented and crucial toward a truly useful, standalone medical device in outpatient clinic settings. The implementation evaluation with an Altera Cyclone II EP2C70F89 shows that the proposed ALE used 45% of the chip's resources. Experiments with the proposed prototype were made using the DE2-70 emulation board with recorded body signals obtained from online medical archives. Clear suppressions were observed in our experiments from both the frequency domain and time domain perspectives.
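
    The ALE principle is that a delayed copy of the input lets an adaptive predictor lock onto the periodic (heart) component, while the broadband (breath) component, being unpredictable across the delay, survives in the error output. A minimal LMS software sketch with synthetic signals follows; the tone frequency, tap count, delay and step size are illustrative choices, not the FPGA design's parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 2000
t = np.arange(0, 4.0, 1.0/fs)

heart = np.sin(2*np.pi*50.0*t)            # periodic interference (stand-in for heart sounds)
breath = 0.5*rng.standard_normal(t.size)  # broadband signal of interest (stand-in for breath sounds)
x = breath + heart

# ALE: LMS predictor fed a delayed copy of its own input
delay, taps, mu = 40, 64, 0.001
w = np.zeros(taps)
err = np.zeros(x.size)
for n in range(delay + taps, x.size):
    ref = x[n - delay - taps + 1 : n - delay + 1][::-1]   # delayed tap vector
    y = w @ ref                # prediction of the periodic component
    err[n] = x[n] - y          # heart tone removed here
    w += 2.0*mu*err[n]*ref     # LMS weight update

# After convergence, err tracks `breath` and the tone is suppressed
residual = np.mean((err[fs:] - breath[fs:])**2)
before = np.mean((x[fs:] - breath[fs:])**2)   # power of the interfering tone
```

    The same multiply-accumulate-and-update loop maps naturally onto FPGA hardware, which is what makes near real-time operation feasible.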

  8. Architecture of web services in the enhancement of real-time 3D video virtualization in cloud environment

    NASA Astrophysics Data System (ADS)

    Bada, Adedayo; Wang, Qi; Alcaraz-Calero, Jose M.; Grecos, Christos

    2016-04-01

    This paper proposes a new approach to improving the application of 3D video rendering and streaming by jointly exploring and optimizing both cloud-based virtualization and web-based delivery. The proposed web service architecture firstly establishes a software virtualization layer based on QEMU (Quick Emulator), an open-source virtualization software that has been able to virtualize system components except for 3D rendering, which is still in its infancy. The architecture then explores the cloud environment to boost the speed of the rendering at the QEMU software virtualization layer. The capabilities and inherent limitations of Virgil 3D, which is one of the most advanced 3D virtual Graphics Processing Unit (GPU) available, are analyzed through benchmarking experiments and integrated into the architecture to further speed up the rendering. Experimental results are reported and analyzed to demonstrate the benefits of the proposed approach.

  9. Magneto-rheological fluid shock absorbers for HMMWV

    NASA Astrophysics Data System (ADS)

    Gordaninejad, Faramarz; Kelso, Shawn P.

    2000-04-01

    This paper presents the development and evaluation of a controllable, semi-active magneto-rheological fluid (MRF) shock absorber for a High Mobility Multi-purpose Wheeled Vehicle (HMMWV). The University of Nevada, Reno (UNR) MRF damper is tailored for structures and ground vehicles that undergo a wide range of dynamic loading. It also has the capability for unique rebound and compression characteristics. The new MRF shock absorber emulates the original equipment manufacturer (OEM) shock absorber behavior in passive mode, and provides a wide controllable damping force range. A theoretical study is performed to evaluate the UNR MRF shock absorber. The Bingham plastic theory is employed to model the nonlinear behavior of the MR fluid. A fluid-mechanics-based theoretical model along with a three-dimensional finite element electromagnetic analysis is utilized to predict the MRF damper performance. The theoretical results are compared with experimental data and are demonstrated to be in excellent agreement.
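The Bingham plastic model mentioned above reduces, in its simplest lumped form, to a post-yield viscous term plus a field-controllable yield force. A minimal sketch follows; the parameter names and values are illustrative, not the UNR damper's.

```python
import numpy as np

def bingham_damper_force(v, c_post=1200.0, f_yield=0.0, f_friction=50.0):
    """Bingham-plastic MR damper force for piston velocity v (m/s):
    post-yield viscous damping plus a field-controllable yield force.
    Parameter values are illustrative, not the UNR damper's."""
    return c_post * v + (f_yield + f_friction) * np.sign(v)
```

With `f_yield = 0` the model degenerates to a passive viscous damper, mirroring the OEM-emulating passive mode; increasing the coil current raises `f_yield` and hence widens the controllable damping force range.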

  10. Improved Signal Processing Technique Leads to More Robust Self Diagnostic Accelerometer System

    NASA Technical Reports Server (NTRS)

    Tokars, Roger; Lekki, John; Jaros, Dave; Riggs, Terrence; Evans, Kenneth P.

    2010-01-01

    The self diagnostic accelerometer (SDA) is a sensor system designed to actively monitor the health of an accelerometer. In this case an accelerometer is considered healthy if it can be determined that it is operating correctly and its measurements may be relied upon. The SDA system accomplishes this by actively monitoring the accelerometer for a variety of failure conditions including accelerometer structural damage, an electrical open circuit, and most importantly accelerometer detachment. In recent testing of the SDA system in emulated engine operating conditions it has been found that a more robust signal processing technique was necessary. An improved accelerometer diagnostic technique and test results of the SDA system utilizing this technique are presented here. Furthermore, the real time, autonomous capability of the SDA system to concurrently compensate for effects from real operating conditions such as temperature changes and mechanical noise, while monitoring the condition of the accelerometer health and attachment, will be demonstrated.

  11. The Role of Health Care Transformation for the Chinese Dream: Powering Economic Growth, Promoting a Harmonious Society.

    PubMed

    Mattke, Soeren; Liu, Hangsheng; Hunter, Lauren E; Gu, Kun; Newberry, Sydne

    2014-12-30

    After having successfully expanded health insurance coverage, China now faces the challenge of building an effective and efficient delivery system to serve its large and aging population. The country finds itself at a crossroads: it can emulate the models of Western countries, with their well-known limitations, or embark on an ambitious endeavor to create an innovative and sustainable model. We recommend that China choose the second option and design and implement a health care system based on population health management principles and sophisticated health information technology. Taking this path could yield a triple dividend for China: Health care will contribute to the growth of service sector employment, stimulate domestic demand by unlocking savings, and enable China to export its health system development capabilities to other emerging economies, mirroring its success in building other critical infrastructure. These forces can help turn the Chinese Dream into a reality.

  12. A DDS-Based Energy Management Framework for Small Microgrid Operation and Control

    DOE PAGES

    Youssef, Tarek A.; El Hariri, Mohamad; Elsayed, Ahmed T.; ...

    2017-09-26

    The smart grid is seen as a power system with real-time communication and control capabilities between the consumer and the utility. This modern platform facilitates the optimization of energy usage based on several factors, including environmental concerns, price preferences, and system technical issues. In this paper, a real-time energy management system (EMS) for microgrids or nanogrids was developed. The developed system involves an online optimization scheme to adapt its parameters based on previous, current, and forecasted future system states. The communication requirements for all EMS modules were analyzed, and all modules are integrated over a data distribution service (DDS) Ethernet network with appropriate quality of service (QoS) profiles. In conclusion, the developed EMS was emulated with actual residential energy consumption and irradiance data from Miami, Florida and proved its effectiveness in reducing consumers' bills and achieving flat peak load profiles.

  13. Neuromorphic photonic networks using silicon photonic weight banks.

    PubMed

    Tait, Alexander N; de Lima, Thomas Ferreira; Zhou, Ellen; Wu, Allie X; Nahmias, Mitchell A; Shastri, Bhavin J; Prucnal, Paul R

    2017-08-07

    Photonic systems for high-performance information processing have attracted renewed interest. Neuromorphic silicon photonics has the potential to integrate processing functions that vastly exceed the capabilities of electronics. We report the first observations of a recurrent silicon photonic neural network, in which connections are configured by microring weight banks. A mathematical isomorphism between the silicon photonic circuit and a continuous neural network model is demonstrated through dynamical bifurcation analysis. Exploiting this isomorphism, a simulated 24-node silicon photonic neural network is programmed using a "neural compiler" to solve a differential system emulation task. A 294-fold acceleration against a conventional benchmark is predicted. We also propose and derive power consumption analysis for modulator-class neurons that, as opposed to laser-class neurons, are compatible with silicon photonic platforms. At increased scale, neuromorphic silicon photonics could access new regimes of ultrafast information processing for radio, control, and scientific computing.
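The continuous neural network model that the photonic circuit is shown to be isomorphic to can be sketched as a standard continuous-time recurrent network integrated with Euler steps. This is a generic illustration, not the authors' exact model; the tanh transfer function, time constant, and step size are assumptions.

```python
import numpy as np

def ctrnn_step(s, W, dt=0.01, tau=1.0, bias=0.0):
    """One Euler step of a continuous-time recurrent neural network,
    ds/dt = (-s + W @ tanh(s) + bias) / tau, where W is the weight
    matrix (the role played by the microring weight bank)."""
    return s + dt * (-s + W @ np.tanh(s) + bias) / tau
```

Reprogramming the network then amounts to rewriting `W`, which in the photonic system corresponds to retuning the microring weights.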

  14. Bayesian History Matching of Complex Infectious Disease Models Using Emulation: A Tutorial and a Case Study on HIV in Uganda

    PubMed Central

    Andrianakis, Ioannis; Vernon, Ian R.; McCreesh, Nicky; McKinley, Trevelyan J.; Oakley, Jeremy E.; Nsubuga, Rebecca N.; Goldstein, Michael; White, Richard G.

    2015-01-01

    Advances in scientific computing have allowed the development of complex models that are being routinely applied to problems in disease epidemiology, public health and decision making. The utility of these models depends in part on how well they can reproduce empirical data. However, fitting such models to real world data is greatly hindered both by large numbers of input and output parameters, and by long run times, such that many modelling studies lack a formal calibration methodology. We present a novel method that has the potential to improve the calibration of complex infectious disease models (hereafter called simulators). We present this in the form of a tutorial and a case study where we history match a dynamic, event-driven, individual-based stochastic HIV simulator, using extensive demographic, behavioural and epidemiological data available from Uganda. The tutorial describes history matching and emulation. History matching is an iterative procedure that reduces the simulator's input space by identifying and discarding areas that are unlikely to provide a good match to the empirical data. History matching relies on the computational efficiency of a Bayesian representation of the simulator, known as an emulator. Emulators mimic the simulator's behaviour, but are often several orders of magnitude faster to evaluate. In the case study, we use a 22 input simulator, fitting its 18 outputs simultaneously. After 9 iterations of history matching, a non-implausible region of the simulator input space was identified that was times smaller than the original input space. Simulator evaluations made within this region were found to have a 65% probability of fitting all 18 outputs. History matching and emulation are useful additions to the toolbox of infectious disease modellers. Further research is required to explicitly address the stochastic nature of the simulator as well as to account for correlations between outputs. PMID:25569850
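The history-matching loop described in the tutorial can be caricatured in a few lines: fit a cheap emulator to a handful of simulator runs, then discard candidate inputs whose implausibility exceeds the conventional cutoff of 3. The toy one-input simulator, polynomial emulator, and variance estimates below are illustrative stand-ins for the paper's 22-input HIV simulator and Bayesian emulator.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(x):
    """Stand-in for the expensive simulator (hypothetical toy model)."""
    return np.sin(3 * x) + 0.5 * x

# 1. Train a cheap emulator on a small number of simulator runs.
X = rng.uniform(-1, 1, 20)
Y = simulator(X)
coef = np.polyfit(X, Y, 5)                     # polynomial emulator
resid_var = np.var(Y - np.polyval(coef, X))    # crude emulator uncertainty

# 2. History match: screen out implausible inputs using only the
#    emulator's fast predictions, never the simulator itself.
z, obs_var = simulator(0.3), 0.01              # "observed" data + its variance
cand = rng.uniform(-1, 1, 10000)               # candidate input space
impl = np.abs(z - np.polyval(coef, cand)) / np.sqrt(resid_var + obs_var)
non_implausible = cand[impl < 3.0]             # conventional cutoff of 3
```

Iterating this step with new simulator runs inside the surviving region is what shrinks the input space over the paper's nine waves.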

  15. minimega v. 3.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crussell, Jonathan; Erickson, Jeremy; Fritz, David

    minimega is an emulytics platform for creating testbeds of networked devices. The platform consists of easily deployable tools that facilitate bringing up large networks of virtual machines, including Windows, Linux, and Android. minimega allows experiments to be brought up quickly with almost no configuration. minimega also includes tools for simple cluster management, as well as tools for creating Linux-based virtual machines. This release of minimega includes new emulated sensors for Android devices to improve the fidelity of testbeds that include mobile devices. Emulated sensors include GPS and

  16. Implementation of virtual LANs over ATM WANs

    NASA Astrophysics Data System (ADS)

    Braun, Torsten; Maehler, Martin

    1998-09-01

    Virtual LANs (VLANs) allow users to be interconnected over campus or wide area networks and give them the impression that they are connected to the same local area network (LAN). The implementation of VLANs is based on the ATM Forum's LAN Emulation and on LAN/ATM switches providing interconnection of emulated LANs over ATM and the LAN ports to which the users' end systems are attached. The paper discusses possible implementation architectures and describes advanced features such as ATM short-cuts, QoS, and redundancy concepts.

  17. A Low-Cost Part-Task Flight Training System: An Application of a Head Mounted Display

    DTIC Science & Technology

    1990-12-01

    architecture. The task at hand was to develop a software emulation library that would emulate the function calls used within the Flight and Dog programs. This...represented in two hexadecimal digits for each color. The format of the packed long integer looks like aaggbbrr with each color value representing a...Western Digital ethernet card as the cheapest compatible card available. Good fortune arrived, as I was calling to order the card, I saw an unused card

  18. Cyber integrated MEMS microhand for biological applications

    NASA Astrophysics Data System (ADS)

    Weissman, Adam; Frazier, Athena; Pepen, Michael; Lu, Yen-Wen; Yang, Shanchieh Jay

    2009-05-01

    Anthropomorphous robotic hands at microscales have been developed to receive information and perform tasks for biological applications. To emulate a human hand's dexterity, the microhand requires a master-slave interface with a wearable controller, force sensors, and perception displays for tele-manipulation. Recognizing the constraints and complexity imposed in developing a feedback interface during miniaturization, this project addresses that need by creating an integrated cyber environment, incorporating sensors with a microhand, haptic/visual display, and object model, that emulates the human hand's psychophysical perception at the microscale.

  19. 8755 Emulator Design

    DTIC Science & Technology

    1988-12-01

    Break Control....................51 8755 I/O Control..................54 Z-100 Control Software.................55 Pass User Memory...the emulator SRAM and the other is for the target SRAM. If either signal is replaced by the NACK signal the host computer displays an error message...Block Diagram...Figure 2b. Schematic Diagram

  20. Real time wind farm emulation using SimWindFarm toolbox

    NASA Astrophysics Data System (ADS)

    Topor, Marcel

    2016-06-01

    This paper presents a wind farm emulation solution using an open-source Matlab/Simulink toolbox and the National Instruments cRIO platform. This work is based on the Aeolus SimWindFarm (SWF) toolbox models developed at Aalborg University, Denmark. Using the Matlab Simulink models developed in SWF, the modeling code can be exported to a real-time model using the NI VeriStand model framework, and the resulting code is integrated as a hardware-in-the-loop control on the NI 9068 platform.

  1. Emulation: A fast stochastic Bayesian method to eliminate model space

    NASA Astrophysics Data System (ADS)

    Roberts, Alan; Hobbs, Richard; Goldstein, Michael

    2010-05-01

    Joint inversion of large 3D datasets has been the goal of geophysicists ever since the datasets first started to be produced. There are two broad approaches to this kind of problem, traditional deterministic inversion schemes and more recently developed Bayesian search methods, such as MCMC (Markov Chain Monte Carlo). However, using both these kinds of schemes has proved prohibitively expensive, both in computing power and time cost, due to the normally very large model space which needs to be searched using forward model simulators which take considerable time to run. At the heart of strategies aimed at accomplishing this kind of inversion is the question of how to reliably and practicably reduce the size of the model space in which the inversion is to be carried out. Here we present a practical Bayesian method, known as emulation, which can address this issue. Emulation is a Bayesian technique used with considerable success in a number of technical fields, such as in astronomy, where the evolution of the universe has been modelled using this technique, and in the petroleum industry where history matching is carried out of hydrocarbon reservoirs. The method of emulation involves building a fast-to-compute uncertainty-calibrated approximation to a forward model simulator. We do this by modelling the output data from a number of forward simulator runs by a computationally cheap function, and then fitting the coefficients defining this function to the model parameters. By calibrating the error of the emulator output with respect to the full simulator output, we can use this to screen out large areas of model space which contain only implausible models. For example, starting with what may be considered a geologically reasonable prior model space of 10000 models, using the emulator we can quickly show that only models which lie within 10% of that model space actually produce output data which is plausibly similar in character to an observed dataset. 
We can thus much more tightly constrain the input model space for a deterministic inversion or MCMC method. By using this technique jointly on several datasets (specifically seismic, gravity, and magnetotelluric (MT) data describing the same region), we can include in our modelling the uncertainties in the data measurements, the relationships between the various physical parameters involved, and the model representation uncertainty, and at the same time further reduce the range of plausible models to several percent of the original model space. Being stochastic in nature, the output posterior parameter distributions also allow our understanding of, and beliefs about, a geological region to be objectively updated, with full assessment of uncertainties; the emulator is thus also an inversion-type tool in its own right, with the advantage (as with any Bayesian method) that our uncertainties from all sources (both data and model) can be fully evaluated.

  2. Best practices for creating social presence and caring behaviors online.

    PubMed

    Plante, Kathleen; Asselin, Marilyn E

    2014-01-01

    To identify best practices and evidence-based strategies for creating an online learning environment that encompasses caring behaviors and promotes social presence. Faculty who teach online classes are challenged to create a sense of social presence and caring behaviors in a virtual world in which students feel connected and part of the learning environment. To extrapolate evidence to support best practices, a review of literature was conducted focused on social presence and caring online. Faculty messages that are respectful, positive, encouraging, timely, and frequent foster social presence and caring behaviors while also allowing for caring interactions, mutual respect, and finding meaning in relationships. A variety of measures to emulate caring online intertwine with social presence to promote a sense of caring and belonging. More research is needed to support the evidence for these strategies.

  3. Infrared Avionics Signal Distribution Using WDM

    NASA Technical Reports Server (NTRS)

    Atiquzzaman, Mohammed; Sluss, James J., Jr.

    2004-01-01

    Supporting analog RF signal transmission over optical fibers, this project demonstrates a successful application of wavelength division multiplexing (WDM) to the avionics environment. We characterize the simultaneous transmission of four RF signals (channels) over a single optical fiber. At different points along a fiber optic backbone, these four analog channels are sequentially multiplexed and demultiplexed to more closely emulate the conditions found onboard existing aircraft. We present data from measurements of optical power, transmission response (loss and gain), reflection response, group delay that defines phase distortion, signal-to-noise ratio (SNR), and dynamic range that defines nonlinear distortion. The data indicate that WDM is very suitable for avionics applications.

  4. The Application of Fiber Optic Wavelength Division Multiplexing in RF Avionics

    NASA Technical Reports Server (NTRS)

    Ngo, Duc; Nguyen, Hung; Atiquzzaman, Mohammed; Sluss, James J., Jr.; Refai, Hakki H.

    2004-01-01

    This paper demonstrates a successful application of wavelength division multiplexing (WDM) to the avionics environment to support analog RF signal transmission. We investigate the simultaneous transmission of four RF signals (channels) over a single optical fiber. These four analog channels are sequentially multiplexed and demultiplexed at different points along a fiber optic backbone to more closely emulate the conditions found onboard aircraft. We present data from measurements of signal-to-noise ratio (SNR), transmission response (loss and gain), group delay that defines phase distortion, and dynamic range that defines nonlinear distortion. The data indicate that WDM is well-suited for avionics applications.

  5. Centralized database for interconnection system design. [for spacecraft

    NASA Technical Reports Server (NTRS)

    Billitti, Joseph W.

    1989-01-01

    A database application called DFACS (Database, Forms and Applications for Cabling and Systems) is described. The objective of DFACS is to improve the speed and accuracy of interconnection system information flow during the design and fabrication stages of a project, while simultaneously supporting both the horizontal (end-to-end wiring) and the vertical (wiring by connector) design stratagems used by the Jet Propulsion Laboratory (JPL) project engineering community. The DFACS architecture is centered around a centralized database and program methodology which emulates the manual design process hitherto used at JPL. DFACS has been tested and successfully applied to existing JPL hardware tasks with a resulting reduction in schedule time and costs.

  6. Robert N. Butler, MD (January 21, 1927-July 4, 2010): visionary leader.

    PubMed

    Achenbaum, W Andrew

    2014-02-01

    The career and accomplishments of Dr. Robert N. Butler highlight the history of postwar gerontology and geriatrics here and abroad. Butler was an idea broker: he introduced "life review" as a therapeutic intervention and coined "ageism." He was the only researcher on aging to win a Pulitzer Prize and, long after normal retirement, laid the foundations for a new gerontology. Butler was an institution builder: he served as the first director of the National Institute on Aging, created the first department of geriatric medicine in the United States, and mobilized support here and abroad for global aging. His legacy provides much for successive generations to emulate and enhance.

  7. A Phase-Locked Loop Epilepsy Network Emulator.

    PubMed

    Watson, P D; Horecka, K M; Cohen, N J; Ratnam, R

    2016-10-15

    Most seizure forecasting employs statistical learning techniques that lack a representation of the network interactions that give rise to seizures. We present an epilepsy network emulator (ENE) that uses a network of interconnected phase-locked loops (PLLs) to model synchronous, circuit-level oscillations between electrocorticography (ECoG) electrodes. Using ECoG data from a canine-epilepsy model (Davis et al. 2011) and a physiological entropy measure (approximate entropy or ApEn, Pincus 1995), we demonstrate that the entropy of the emulator phases increases dramatically during ictal periods across all ECoG recording sites and across all animals in the sample. Further, this increase precedes the observable voltage spikes that characterize seizure activity in the ECoG data. These results suggest that the ENE is sensitive to phase-domain information in the neural circuits measured by ECoG and that an increase in the entropy of this measure coincides with increasing likelihood of seizure activity. Understanding this unpredictable phase-domain electrical activity present in ECoG recordings may provide a target for seizure detection and feedback control.
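The approximate entropy statistic (ApEn, Pincus) used to quantify the emulator phases can be sketched directly from its definition; the embedding dimension m = 2 and tolerance r = 0.2 SD below are the conventional defaults, not necessarily the paper's settings.

```python
import numpy as np

def approx_entropy(u, m=2, r=0.2):
    """Approximate entropy (ApEn) of a time series u: higher values
    indicate less regular, less predictable dynamics. m is the template
    length, r the tolerance as a fraction of the standard deviation."""
    u = np.asarray(u, dtype=float)
    tol = r * np.std(u)
    def phi(m):
        n = len(u) - m + 1
        tpl = np.array([u[i:i + m] for i in range(n)])
        # Chebyshev distance between every pair of templates
        d = np.max(np.abs(tpl[:, None] - tpl[None, :]), axis=2)
        c = np.mean(d <= tol, axis=1)   # self-matches keep c > 0
        return np.mean(np.log(c))
    return phi(m) - phi(m + 1)
```

A regular sinusoid scores lower than white noise, matching the intuition above that rising ApEn of the PLL phases signals less predictable, seizure-prone dynamics.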

  8. Evaluation Applied to Reliability Analysis of Reconfigurable, Highly Reliable, Fault-Tolerant, Computing Systems for Avionics

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques are proposed as a solution to a difficulty arising in the analysis of the reliability of highly reliable computer systems for future commercial aircraft. The difficulty, viz., the lack of credible precision in reliability estimates obtained by analytical modeling techniques, is established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible, (2) a complex system design technique, fault tolerance, (3) system reliability dominated by errors due to flaws in the system definition, and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. The technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. The use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques.

  9. Input current shaped ac-to-dc converters

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Input current shaping techniques for ac-to-dc converters were investigated. Input frequencies much higher than normal, up to 20 kHz were emphasized. Several methods of shaping the input current waveform in ac-to-dc converters were reviewed. The simplest method is the LC filter following the rectifier. The next simplest method is the resistor emulation approach in which the inductor size is determined by the converter switching frequency and not by the line input frequency. Other methods require complicated switch drive algorithms to construct the input current waveshape. For a high-frequency line input, on the order of 20 kHz, the simple LC cannot be discarded so peremptorily, since the inductor size can be compared with that for the resistor emulation method. In fact, since a dc regulator will normally be required after the filter anyway, the total component count is almost the same as for the resistor emulation method, in which the filter is effectively incorporated into the regulator.
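The resistor emulation approach described above amounts to commanding the input current proportional to the instantaneous line voltage, so the converter presents an emulated resistance R_e = V_rms^2 / P to the line. A minimal sketch of the current-reference computation follows; the function name and sampling are illustrative.

```python
import numpy as np

def resistor_emulation_reference(v_line, p_target):
    """Input-current reference for a resistor-emulating ac-to-dc stage:
    current is commanded proportional to the instantaneous line voltage,
    so the converter presents an emulated resistance R_e = V_rms**2 / P."""
    v_rms = np.sqrt(np.mean(v_line ** 2))
    r_emulated = v_rms ** 2 / p_target
    return v_line / r_emulated
```

The same relation holds for a 20 kHz line input; only the required bandwidth of the current loop (and hence the inductor sizing discussed above) changes.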

  10. Measurement of fault latency in a digital avionic miniprocessor

    NASA Technical Reports Server (NTRS)

    Mcgough, J. G.; Swern, F. L.

    1981-01-01

    The results of fault injection experiments utilizing a gate-level emulation of the central processor unit of the Bendix BDX-930 digital computer are presented. The failure detection coverage of comparison-monitoring and a typical avionics CPU self-test program was determined. The specific tasks and experiments included: (1) inject randomly selected gate-level and pin-level faults and emulate six software programs using comparison-monitoring to detect the faults; (2) based upon the derived empirical data develop and validate a model of fault latency that will forecast a software program's detecting ability; (3) given a typical avionics self-test program, inject randomly selected faults at both the gate-level and pin-level and determine the proportion of faults detected; (4) determine why faults were undetected; (5) recommend how the emulation can be extended to multiprocessor systems such as SIFT; and (6) determine the proportion of faults detected by a uniprocessor BIT (built-in-test) irrespective of self-test.
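Gate-level fault injection of the kind described above can be illustrated on a toy netlist: force one internal net to a constant (a stuck-at fault), re-run the test patterns, and count the faults whose outputs diverge from the fault-free ("golden") circuit. This full-adder example is a hypothetical miniature, not the BDX-930 emulation.

```python
from itertools import product

def full_adder(a, b, cin, stuck=None):
    """Gate-level full adder. `stuck` optionally forces one internal net
    to a constant value, e.g. ("c1", 0), emulating a stuck-at fault."""
    def drive(name, val):
        if stuck is not None and stuck[0] == name:
            return stuck[1]
        return val
    s1 = drive("s1", a ^ b)
    c1 = drive("c1", a & b)
    s = drive("s", s1 ^ cin)
    c2 = drive("c2", s1 & cin)
    co = drive("co", c1 | c2)
    return s, co

# Fault simulation: inject every single stuck-at fault and check which
# ones an exhaustive input pattern set detects (output differs from the
# fault-free "golden" circuit).
faults = [(net, v) for net in ("s1", "c1", "s", "c2", "co") for v in (0, 1)]
detected = set()
for a, b, cin in product((0, 1), repeat=3):
    golden = full_adder(a, b, cin)
    for f in faults:
        if full_adder(a, b, cin, stuck=f) != golden:
            detected.add(f)
coverage = len(detected) / len(faults)
```

Replacing the exhaustive pattern set with a self-test program's patterns, and the full adder with an emulated CPU, yields the detection-coverage measurements described above.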

  11. Dual-energy imaging of bone marrow edema on a dedicated multi-source cone-beam CT system for the extremities

    NASA Astrophysics Data System (ADS)

    Zbijewski, W.; Sisniega, A.; Stayman, J. W.; Thawait, G.; Packard, N.; Yorkston, J.; Demehri, S.; Fritz, J.; Siewerdsen, J. H.

    2015-03-01

    Purpose: Arthritis and bone trauma are often accompanied by bone marrow edema (BME). BME is challenging to detect in CT due to the overlaying trabecular structure but can be visualized using dual-energy (DE) techniques to discriminate water and fat. We investigate the feasibility of DE imaging of BME on a dedicated flat-panel detector (FPD) extremities cone-beam CT (CBCT) with a unique x-ray tube with three longitudinally mounted sources. Methods: Simulations involved a digital BME knee phantom imaged with a 60 kVp low-energy beam (LE) and 105 kVp high-energy beam (HE) (+0.25 mm Ag filter). Experiments were also performed on a test-bench with a Varian 4030CB FPD using the same beam energies as the simulation study. A three-source configuration was implemented with x-ray sources distributed along the longitudinal axis and DE CBCT acquisition in which the superior and inferior sources operate at HE (and collect half of the projection angles each) and the central source operates at LE. Three-source DE CBCT was compared to a double-scan, single-source orbit. Experiments were performed with a wrist phantom containing a 50 mg/ml densitometry insert submerged in alcohol (simulating fat) with drilled trabeculae down to ~1 mm to emulate the trabecular matrix. Reconstruction-based three-material decomposition of fat, soft tissue, and bone was performed. Results: For a low-dose scan (36 mAs in the HE and LE data), DE CBCT achieved combined accuracy of ~0.80 for a pattern of BME spherical lesions ranging 2.5 - 10 mm diameter in the knee phantom. The accuracy increased to ~0.90 for a 360 mAs scan. Excellent DE discrimination of the base materials was achieved in the experiments. 
Approximately 80% of the alcohol (fat) voxels in the trabecular phantom were properly identified for both single- and three-source acquisitions, indicating the ability to distinguish edematous tissue (water-equivalent plastic in the body of the densitometry insert) from the fat inside the trabecular matrix (emulating normal trabecular bone with a significant fraction of yellow marrow). Conclusion: Detection of BME and quantification of water and fat content were achieved in extremities DE CBCT with a longitudinal configuration of sources providing DE imaging in a single gantry rotation. The findings support the development of DE imaging capability for CBCT of the extremities in areas conventionally in the domain of MRI.
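The reconstruction-based three-material decomposition can be illustrated per voxel as a small linear solve: two beam-specific attenuation equations plus a volume-conservation constraint. The attenuation coefficients below are rough illustrative values, not the study's calibrated ones.

```python
import numpy as np

# Hypothetical linear attenuation coefficients (1/cm) at the LE and HE
# beams; illustrative values only, not the study's calibration.
mu = {"fat": (0.20, 0.17), "soft": (0.23, 0.19), "bone": (0.60, 0.40)}

A = np.array([[mu[m][0] for m in mu],   # LE attenuation of each material
              [mu[m][1] for m in mu],   # HE attenuation of each material
              [1.0, 1.0, 1.0]])         # volume fractions sum to one

def decompose(mu_le, mu_he):
    """Three-material decomposition of one voxel from its measured LE/HE
    attenuation: returns (fat, soft tissue, bone) volume fractions."""
    return np.linalg.solve(A, np.array([mu_le, mu_he, 1.0]))
```

In BME, marrow fat is displaced by water-equivalent tissue, so a rising soft-tissue fraction in trabecular voxels is the edema signature this decomposition exposes.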

  12. Dual-Energy Imaging of Bone Marrow Edema on a Dedicated Multi-Source Cone-Beam CT System for the Extremities

    PubMed Central

    Zbijewski, W.; Sisniega, A.; Stayman, J. W.; Thawait, G.; Packard, N.; Yorkston, J.; Demehri, S.; Fritz, J.; Siewerdsen, J. H.

    2015-01-01

    Purpose Arthritis and bone trauma are often accompanied by bone marrow edema (BME). BME is challenging to detect in CT due to the overlaying trabecular structure but can be visualized using dual-energy (DE) techniques to discriminate water and fat. We investigate the feasibility of DE imaging of BME on a dedicated flat-panel detector (FPD) extremities cone-beam CT (CBCT) with a unique x-ray tube with three longitudinally mounted sources. Methods Simulations involved a digital BME knee phantom imaged with a 60 kVp low-energy beam (LE) and 105 kVp high-energy beam (HE) (+0.25 mm Ag filter). Experiments were also performed on a test-bench with a Varian 4030CB FPD using the same beam energies as the simulation study. A three-source configuration was implemented with x-ray sources distributed along the longitudinal axis and DE CBCT acquisition in which the superior and inferior sources operate at HE (and collect half of the projection angles each) and the central source operates at LE. Three-source DE CBCT was compared to a double-scan, single-source orbit. Experiments were performed with a wrist phantom containing a 50 mg/ml densitometry insert submerged in alcohol (simulating fat) with drilled trabeculae down to ~1 mm to emulate the trabecular matrix. Reconstruction-based three-material decomposition of fat, soft tissue, and bone was performed. Results For a low-dose scan (36 mAs in the HE and LE data), DE CBCT achieved combined accuracy of ~0.80 for a pattern of BME spherical lesions ranging 2.5 – 10 mm diameter in the knee phantom. The accuracy increased to ~0.90 for a 360 mAs scan. Excellent DE discrimination of the base materials was achieved in the experiments. 
Approximately 80% of the alcohol (fat) voxels in the trabecular phantom were properly identified for both single- and three-source acquisitions, indicating the ability to distinguish edematous tissue (water-equivalent plastic in the body of the densitometry insert) from the fat inside the trabecular matrix (emulating normal trabecular bone with a significant fraction of yellow marrow). Conclusion Detection of BME and quantification of water and fat content were achieved in extremities DE CBCT with a longitudinal configuration of sources providing DE imaging in a single gantry rotation. The findings support the development of DE imaging capability for CBCT of the extremities in areas conventionally in the domain of MRI. PMID:26045631

  13. Dual-Energy Imaging of Bone Marrow Edema on a Dedicated Multi-Source Cone-Beam CT System for the Extremities.

    PubMed

    Zbijewski, W; Sisniega, A; Stayman, J W; Thawait, G; Packard, N; Yorkston, J; Demehri, S; Fritz, J; Siewerdsen, J H

    2015-02-21

    Arthritis and bone trauma are often accompanied by bone marrow edema (BME). BME is challenging to detect in CT due to the overlaying trabecular structure but can be visualized using dual-energy (DE) techniques to discriminate water and fat. We investigate the feasibility of DE imaging of BME on a dedicated flat-panel detector (FPD) extremities cone-beam CT (CBCT) with a unique x-ray tube with three longitudinally mounted sources. Simulations involved a digital BME knee phantom imaged with a 60 kVp low-energy beam (LE) and 105 kVp high-energy beam (HE) (+0.25 mm Ag filter). Experiments were also performed on a test-bench with a Varian 4030CB FPD using the same beam energies as the simulation study. A three-source configuration was implemented with x-ray sources distributed along the longitudinal axis and DE CBCT acquisition in which the superior and inferior sources operate at HE (and collect half of the projection angles each) and the central source operates at LE. Three-source DE CBCT was compared to a double-scan, single-source orbit. Experiments were performed with a wrist phantom containing a 50 mg/ml densitometry insert submerged in alcohol (simulating fat) with drilled trabeculae down to ~1 mm to emulate the trabecular matrix. Reconstruction-based three-material decomposition of fat, soft tissue, and bone was performed. For a low-dose scan (36 mAs in the HE and LE data), DE CBCT achieved combined accuracy of ~0.80 for a pattern of BME spherical lesions ranging 2.5 - 10 mm diameter in the knee phantom. The accuracy increased to ~0.90 for a 360 mAs scan. Excellent DE discrimination of the base materials was achieved in the experiments. 
Approximately 80% of the alcohol (fat) voxels in the trabecular phantom were properly identified for both single- and three-source acquisitions, indicating the ability to distinguish edematous tissue (water-equivalent plastic in the body of the densitometry insert) from the fat inside the trabecular matrix (emulating normal trabecular bone with a significant fraction of yellow marrow). Detection of BME and quantification of water and fat content were achieved in extremities DE CBCT with a longitudinal configuration of sources providing DE imaging in a single gantry rotation. The findings support the development of DE imaging capability for CBCT of the extremities in areas conventionally in the domain of MRI. PMID:26045631

  14. Quantifying Parkinson's disease progression by simulating gait patterns

    NASA Astrophysics Data System (ADS)

    Cárdenas, Luisa; Martínez, Fabio; Atehortúa, Angélica; Romero, Eduardo

    2015-12-01

    Modern rehabilitation protocols of most neurodegenerative diseases, in particular Parkinson's disease, rely on a clinical analysis of gait patterns. Currently, such analysis is highly dependent on both the examiner's expertise and the type of evaluation. Development of evaluation methods with objective measures is therefore crucial. Physical models arise as a powerful alternative to quantify movement patterns and to emulate the progression and performance of specific treatments. This work introduces a novel quantification of Parkinson's disease progression using a physical model that accurately represents the main gait biomarker, the body Center of Gravity (CoG). The model tracks the whole gait cycle by a coupled double inverted pendulum that emulates the leg swinging for the single-support phase and by a spring-damper system (SDP) that recreates both legs in contact with the ground for the double-support phase. The patterns generated by the proposed model are compared with actual ones learned from 24 subjects in stages 2, 3, and 4. The evaluation performed demonstrates a better performance of the proposed model when compared with a baseline model (SP) composed of a coupled double pendulum and a mass-spring system. The Fréchet distance measured differences between model estimations and real trajectories, showing for stages 2, 3 and 4 distances of 0.137, 0.155, 0.38 for the baseline and 0.07, 0.09, 0.29 for the proposed method.
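    The Fréchet distance used above to score trajectory agreement has a standard dynamic-programming form for sampled curves. A minimal sketch of the discrete variant (function and variable names are illustrative, not taken from the paper):

```python
import numpy as np

def discrete_frechet(p, q):
    """Discrete Frechet distance between two sampled trajectories
    (sequences of 2-D points), via the classic DP recurrence."""
    n, m = len(p), len(q)
    d = np.full((n, m), -1.0)
    dist = lambda i, j: np.linalg.norm(np.asarray(p[i]) - np.asarray(q[j]))
    for i in range(n):
        for j in range(m):
            c = dist(i, j)
            if i == 0 and j == 0:
                d[i, j] = c
            elif i == 0:
                d[i, j] = max(d[i, j - 1], c)
            elif j == 0:
                d[i, j] = max(d[i - 1, j], c)
            else:
                d[i, j] = max(min(d[i - 1, j], d[i - 1, j - 1], d[i, j - 1]), c)
    return d[n - 1, m - 1]
```

    In the paper's setting, `p` would be the CoG trajectory generated by the pendulum model and `q` the trajectory measured from a subject; a smaller value indicates a closer match.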

  15. Multiscale Modulation of Nanocrystalline Cellulose Hydrogel via Nanocarbon Hybridization for 3D Neuronal Bilayer Formation.

    PubMed

    Kim, Dongyoon; Park, Subeom; Jo, Insu; Kim, Seong-Min; Kang, Dong Hee; Cho, Sung-Pyo; Park, Jong Bo; Hong, Byung Hee; Yoon, Myung-Han

    2017-07-01

    Bacterial biopolymers have drawn much attention owing to their unconventional three-dimensional structures and interesting functions, which are closely integrated with bacterial physiology. The nongenetic modulation of bacterial (Acetobacter xylinum) cellulose synthesis via nanocarbon hybridization, and its application to the emulation of layered neuronal tissue, is reported. The controlled dispersion of graphene oxide (GO) nanoflakes into bacterial cellulose (BC) culture media not only induces structural changes within a crystalline cellulose nanofibril, but also modulates their 3D collective association, leading to a substantial reduction in Young's modulus (≈50%) and a clear definition of water-hydrogel interfaces. Furthermore, real-time investigation of 3D neuronal networks constructed in this GO-incorporated BC hydrogel with broken chiral nematic ordering revealed the vertical locomotion of growth cones, accelerated neurite outgrowth (≈100 µm per day) with reduced backward travel length, and the efficient formation of synaptic connectivity with distinct axonal bifurcation abundance at ≈750 µm outgrowth from a cell body. In comparison with the pristine BC, GO-BC supports the formation of well-defined neuronal bilayer networks with flattened interfacial profiles and vertical axonal outgrowth, apparently emulating neuronal development in vivo. We envision that our findings may contribute to various applications of engineered BC hydrogel in fundamental neurobiology studies and neural engineering. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Agricultural response functions to changes in carbon, temperature, and water based on the C3MP data set

    NASA Astrophysics Data System (ADS)

    Snyder, A.; Ruane, A. C.; Phillips, M.; Calvin, K. V.; Clarke, L.

    2017-12-01

    Agricultural yields vary depending on temperature, precipitation/irrigation conditions, fertilizer application, and CO2 concentration. The Coordinated Climate-Crop Modeling Project (C3MP), conducted as a component of the Agricultural Model Intercomparison and Improvement Project (AgMIP), organized sensitivity experiments across carbon-temperature-water (CTW) space for 1100 management conditions in 50+ countries, sampling 15 crop species and 20 crop models. Such coordinated sensitivity tests allow for the building of emulators of yield response to changes in CTW values, allowing rapid estimation of yield changes from the types of climate changes projected by the climate modeling community. The resulting emulator may be used to supply agricultural responses to climate change in any user-defined scenario, rather than being restricted to the RCPs used in many past works. We present the resulting emulators built from the C3MP output data set for use in the Global Change Assessment Model (GCAM), an integrated assessment model that allows for the co-evolution of socioeconomic development, greenhouse gas emissions, climate change, and agricultural sector ramifications. C3MP-based emulators may be of use in designing agricultural impact studies in other IAMs, and we place them in the context of past crop modeling efforts, including the Challinor et al. meta-analysis, the AgMIP Wheat team results, the AgMIP Global Gridded Crop Model Intercomparison (GGCMI) fast-track modeling results, and the MACSUR impact response surface results.
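    Emulators of this kind are often low-order response surfaces fitted to the sensitivity-test output. A hedged sketch of such a fit, using a quadratic surface in the three CTW offsets (the functional form and all names below are assumptions, not the C3MP specification):

```python
import numpy as np

def fit_ctw_emulator(dT, dW, dC, dyield):
    """Least-squares quadratic response surface: yield change as a
    function of temperature (dT), water (dW) and CO2 (dC) offsets."""
    X = np.column_stack([np.ones_like(dT), dT, dW, dC,
                         dT**2, dW**2, dC**2, dT*dW, dT*dC, dW*dC])
    coef, *_ = np.linalg.lstsq(X, dyield, rcond=None)
    return coef

def predict(coef, dT, dW, dC):
    """Evaluate the fitted surface at a single CTW point."""
    x = np.array([1.0, dT, dW, dC, dT**2, dW**2, dC**2, dT*dW, dT*dC, dW*dC])
    return float(x @ coef)
```

    Once fitted to the crop-model sensitivity runs, such a surface can be evaluated in microseconds inside an IAM scenario loop, which is the point of emulation.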

  17. Design and Modeling of a Test Bench for Dual-Motor Electric Drive Tracked Vehicles Based on a Dynamic Load Emulation Method.

    PubMed

    Wang, Zhe; Lv, Haoliang; Zhou, Xiaojun; Chen, Zhaomeng; Yang, Yong

    2018-06-21

    Dual-motor Electric Drive Tracked Vehicles (DDTVs) have attracted increasing attention due to their high transmission efficiency and economical fuel consumption. A test bench for the development and validation of new DDTV technologies is necessary and urgent. Reproducing on a DDTV test bench exactly the same load the vehicle experiences on a real road is a crucial issue when designing the bench. This paper proposes a novel dynamic load emulation method to address this problem. The method adopts dual dynamometers to simulate both the road load and the inertia load that are imposed on the dual independent drive systems. The vehicle's total inertia equivalent to the drive wheels is calculated with separate consideration of the vehicle body, tracks and road wheels to obtain a more accurate inertia load. A speed tracking control strategy with feedforward compensation is implemented to control the dual dynamometers, so as to make real-time dynamic load emulation possible. Additionally, a MATLAB/Simulink model of the test bench is built based on a dynamics analysis of the platform. Experiments are finally carried out on this test bench under different test conditions. The outcomes show that the proposed load emulation method is effective, and has good robustness and adaptability to complex driving conditions. In addition, the accuracy of the established test bench model is demonstrated by comparing the results obtained from the simulation model and experiments.
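    The speed-tracking-with-feedforward idea can be sketched in a few lines: the dynamometer torque command combines a PI correction on the speed error with a feedforward term computed from the emulated road and inertia loads. The gains, the quadratic road-load model, and all names below are illustrative assumptions, not values from the paper:

```python
class SpeedTrackingController:
    """Minimal PI speed-tracking loop with feedforward compensation."""
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def step(self, ref_speed, meas_speed, feedforward_torque):
        err = ref_speed - meas_speed
        self.integral += err * self.dt
        return feedforward_torque + self.kp * err + self.ki * self.integral

def emulated_load_torque(speed, accel, J_equiv, road_coeff):
    """Emulated load referenced to the drive wheel: speed-dependent
    road drag plus the inertia load J * alpha."""
    return road_coeff * speed**2 + J_equiv * accel
```

    With the feedforward term cancelling the nominal load, the PI loop only has to correct modelling error, which is what makes real-time emulation tractable.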

  18. Design and development of a trapezoidal permanent magnet synchronous machine emulator (Conception et mise au point d'un émulateur de machine synchrone trapézoïdale à aimants permanents)

    NASA Astrophysics Data System (ADS)

    Lessard, Francois

    The development of technology leads inevitably to higher systems' complexity faced by engineers. Over time, tools are often developed in parallel with the main systems to ensure their sustainability. The work presented in this document provides a new tool for testing motor drives. In general, this project refers to active loads, which are complex dynamic loads emulated electronically with a static converter. Specifically, this document proposes and implements a system whose purpose is to recreate the behaviour of a trapezoidal permanent magnet synchronous machine. The ultimate goal is to connect a motor drive to the three terminals of the motor emulator, as one would with a real motor. The emulator's response, when subjected to disturbances from the motor drive, is then ideally identical to that of a real motor. The motor emulator gives the test bench significant versatility because the electrical and mechanical parameters of the application can be easily modified. The work is divided into two main parts: the static converter and the real-time simulation. Together, these two entities form a PHIL (Power Hardware-in-the-Loop) real-time simulation. The static converter enables the exchange of real power between the motor drive and the real-time simulation. The latter gives the application the intelligence needed to interact with the motor drive in such a way that the desired behaviour is recreated. The main partner of this project, Opal-RT, supports this development. Keywords: virtual machine, PHIL, real-time simulation, electronic load

  19. Analytical Support Capabilities of Turkish General Staff Scientific Decision Support Centre (SDSC) to Defence Transformation

    DTIC Science & Technology

    2005-04-01

    RTO-MP-SAS-055. UNCLASSIFIED/UNLIMITED. Analytical Support Capabilities of Turkish General Staff Scientific...the end failed to achieve anything commensurate with the effort. The analytical support capabilities of Turkish Scientific Decision Support Center to...percent of the İpekkan, Z.; Özkil, A. (2005) Analytical Support Capabilities of Turkish General Staff Scientific Decision Support Centre (SDSC) to

  20. Inducing morphological changes in lipid bilayer membranes with microfabricated substrates

    NASA Astrophysics Data System (ADS)

    Liu, Fangjie; Collins, Liam F.; Ashkar, Rana; Heberle, Frederick A.; Srijanto, Bernadeta R.; Collier, C. Patrick

    2016-11-01

    Lateral organization of lipids and proteins into distinct domains and anchoring to a cytoskeleton are two important strategies employed by biological membranes to carry out many cellular functions. However, these interactions are difficult to emulate with model systems. Here we use the physical architecture of substrates consisting of arrays of micropillars to systematically control the behavior of supported lipid bilayers - an important step in engineering model lipid membrane systems with well-defined functionalities. Competition between attractive interactions of supported lipid bilayers with the underlying substrate versus the energy cost associated with membrane bending at pillar edges can be systematically investigated as functions of pillar height and pitch, chemical functionalization of the microstructured substrate, and the type of unilamellar vesicles used for assembling the supported bilayer. Confocal fluorescent imaging and AFM measurements highlight correlations that exist between topological and mechanical properties of lipid bilayers and lateral lipid mobility in these confined environments. This study provides a baseline for future investigations into lipid domain reorganization on structured solid surfaces and scaffolds for cell growth.

  1. Human-like object tracking and gaze estimation with PKD android

    PubMed Central

    Wijayasinghe, Indika B.; Miller, Haylie L.; Das, Sumit K.; Bugnariu, Nicoleta L.; Popa, Dan O.

    2018-01-01

    As the use of robots increases for tasks that require human-robot interactions, it is vital that robots exhibit and understand human-like cues for effective communication. In this paper, we describe the implementation of object tracking capability on Philip K. Dick (PKD) android and a gaze tracking algorithm, both of which further robot capabilities with regard to human communication. PKD's ability to track objects with human-like head postures is achieved with visual feedback from a Kinect system and an eye camera. The goal of object tracking with human-like gestures is twofold: to facilitate better human-robot interactions and to enable PKD as a human gaze emulator for future studies. The gaze tracking system employs a mobile eye tracking system (ETG; SensoMotoric Instruments) and a motion capture system (Cortex; Motion Analysis Corp.) for tracking the head orientations. Objects to be tracked are displayed by a virtual reality system, the Computer Assisted Rehabilitation Environment (CAREN; MotekForce Link). The gaze tracking algorithm converts eye tracking data and head orientations to gaze information facilitating two objectives: to evaluate the performance of the object tracking system for PKD and to use the gaze information to predict the intentions of the user, enabling the robot to understand physical cues by humans. PMID:29416193

  2. Characterization of performance-emission indices of a diesel engine using ANFIS operating in dual-fuel mode with LPG

    NASA Astrophysics Data System (ADS)

    Chakraborty, Amitav; Roy, Sumit; Banerjee, Rahul

    2018-03-01

    This experimental work highlights the inherent capability of an adaptive neuro-fuzzy inference system (ANFIS) based model to act as a robust system identification tool (SIT) in prognosticating the performance and emission parameters of an existing diesel engine running in diesel-LPG dual-fuel mode. The developed model proved its adeptness by successfully harnessing the effects of the input parameters of load, injection duration and LPG energy share on the output parameters of BSFCEQ, BTE, NOX, SOOT, CO and HC. Successive evaluation of the ANFIS model revealed high levels of resemblance with the previously forecasted ANN results for the same input parameters, and it was evident that, like ANN, ANFIS also has the innate ability to act as a robust SIT. The ANFIS-predicted data harmonized with the experimental data with high overall accuracy. The correlation coefficient (R) values ranged from 0.99207 to 0.999988. The mean absolute percentage error (MAPE) values were recorded in the range of 0.02-0.173%, with root mean square errors (RMSE) within acceptable margins. Hence the developed model is capable of emulating the actual engine parameters with commendable accuracy, which in turn would make it a robust prediction platform in future optimization work.
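    The goodness-of-fit statistics quoted above (R, MAPE, RMSE) are standard and easy to compute; a small helper, with names of my own choosing rather than the paper's, might look like:

```python
import numpy as np

def fit_metrics(y_true, y_pred):
    """Correlation coefficient R, mean absolute percentage error (MAPE, %)
    and root mean square error (RMSE) between measured and predicted data."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    r = np.corrcoef(y_true, y_pred)[0, 1]
    mape = 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return r, mape, rmse
```

    Note that MAPE as defined here is undefined when a measured value is zero, which is why emission quantities are usually reported in strictly positive units when this metric is used.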

  3. Human-like object tracking and gaze estimation with PKD android

    NASA Astrophysics Data System (ADS)

    Wijayasinghe, Indika B.; Miller, Haylie L.; Das, Sumit K.; Bugnariu, Nicoleta L.; Popa, Dan O.

    2016-05-01

    As the use of robots increases for tasks that require human-robot interactions, it is vital that robots exhibit and understand human-like cues for effective communication. In this paper, we describe the implementation of object tracking capability on Philip K. Dick (PKD) android and a gaze tracking algorithm, both of which further robot capabilities with regard to human communication. PKD's ability to track objects with human-like head postures is achieved with visual feedback from a Kinect system and an eye camera. The goal of object tracking with human-like gestures is twofold: to facilitate better human-robot interactions and to enable PKD as a human gaze emulator for future studies. The gaze tracking system employs a mobile eye tracking system (ETG; SensoMotoric Instruments) and a motion capture system (Cortex; Motion Analysis Corp.) for tracking the head orientations. Objects to be tracked are displayed by a virtual reality system, the Computer Assisted Rehabilitation Environment (CAREN; MotekForce Link). The gaze tracking algorithm converts eye tracking data and head orientations to gaze information facilitating two objectives: to evaluate the performance of the object tracking system for PKD and to use the gaze information to predict the intentions of the user, enabling the robot to understand physical cues by humans.

  4. Space Link Extension Protocol Emulation for High-Throughput, High-Latency Network Connections

    NASA Technical Reports Server (NTRS)

    Tchorowski, Nicole; Murawski, Robert

    2014-01-01

    New space missions require higher data rates and new protocols to meet these requirements. These high data rate space communication links push the limitations of not only the space communication links, but of the ground communication networks and protocols which forward user data to remote ground stations (GS) for transmission. The Consultative Committee for Space Data Systems (CCSDS) Space Link Extension (SLE) standard protocol is one protocol that has been proposed for use by the NASA Space Network (SN) Ground Segment Sustainment (SGSS) program. New protocol implementations must be carefully tested to ensure that they provide the required functionality, especially because of the remote nature of spacecraft. The SLE protocol standard has been tested in the NASA Glenn Research Center's SCENIC Emulation Lab in order to observe its operation under realistic network delay conditions. More specifically, the delay between the NASA Integrated Services Network (NISN) and the spacecraft has been emulated. The round trip time (RTT) delay for the continental NISN network has been shown to be up to 120 ms; as such, the SLE protocol was tested with network delays ranging from 0 ms to 200 ms. Both a base network condition and an SLE connection were tested with these RTT delays, and the reaction of both network tests to the delay conditions was recorded. Throughput for both of these links was set at 1.2 Gbps. The results show that, in the presence of realistic network delay, the SLE link throughput is significantly reduced, while the base network throughput remained at the 1.2 Gbps specification. The decrease in SLE throughput has been attributed to the implementation's use of blocking calls. The decrease in throughput is not acceptable for high data rate links, as the link requires constant data flow in order for spacecraft and ground radios to stay synchronized, unless significant data is queued at the ground station.
In cases where queuing the data is not an option, such as during real-time transmissions, the SLE implementation cannot support high data rate communication.
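    The blocking-call bottleneck has a simple back-of-envelope form: if each data unit must be acknowledged before the next is sent, throughput is capped near unit_size/RTT regardless of the link rate. A sketch using the article's 1.2 Gbps link and 120 ms RTT (the 1 Mbit transfer unit is an assumption for illustration):

```python
def blocking_throughput_gbps(block_bits, rtt_s, link_gbps):
    """Effective throughput of a stop-and-wait (blocking) transfer:
    each block costs its serialization time plus one full round trip."""
    serialize_s = block_bits / (link_gbps * 1e9)
    return block_bits / (serialize_s + rtt_s) / 1e9
```

    With a 1 Mbit unit and 120 ms RTT this evaluates to roughly 0.008 Gbps, orders of magnitude below the 1.2 Gbps link rate, which matches the qualitative collapse observed in the SLE tests; pipelining (non-blocking calls or deep queues) removes the RTT term from every block.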

  5. Fast neural network surrogates for very high dimensional physics-based models in computational oceanography.

    PubMed

    van der Merwe, Rudolph; Leen, Todd K; Lu, Zhengdong; Frolov, Sergey; Baptista, Antonio M

    2007-05-01

    We present neural network surrogates that provide extremely fast and accurate emulation of a large-scale circulation model for the coupled Columbia River, its estuary and near-ocean regions. The circulation model has O(10^7) degrees of freedom, is highly nonlinear and is driven by ocean, atmospheric and river influences at its boundaries. The surrogates provide accurate emulation of the full circulation code and run over 1000 times faster. Such fast dynamic surrogates will enable significant advances in ensemble forecasts in oceanography and weather.
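    The surrogate idea in miniature: train a small network on input/output pairs from an expensive simulator, then evaluate the cheap network instead. The sketch below uses a random-feature network fitted by least squares as a lightweight stand-in for the trained surrogates in the paper; the toy simulator and all names are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "expensive simulator": the surrogate only ever sees input/output pairs.
def simulator(x):
    return np.sin(3 * x) + 0.5 * x

# Training data from a modest number of simulator runs.
X = rng.uniform(-1, 1, size=(200, 1))
Y = simulator(X)

# Single hidden layer with fixed random weights; only the output layer
# is fitted (by least squares), keeping the sketch short and robust.
W1 = rng.normal(0, 2, (1, 32))
b1 = rng.normal(0, 1, 32)
H = np.tanh(X @ W1 + b1)
W2, *_ = np.linalg.lstsq(H, Y, rcond=None)

def surrogate(x):
    """Fast emulator of `simulator`: two matrix products and a tanh."""
    return np.tanh(np.atleast_2d(x) @ W1 + b1) @ W2
```

    The speedup comes from replacing the simulator's solve with a couple of small matrix products, which is how the paper's surrogates achieve their >1000x factor at vastly larger scale.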

  6. Specifying a target trial prevents immortal time bias and other self-inflicted injuries in observational analyses

    PubMed Central

    Hernán, Miguel A.; Sauer, Brian C.; Hernández-Díaz, Sonia; Platt, Robert; Shrier, Ian

    2016-01-01

    Many analyses of observational data are attempts to emulate a target trial. The emulation of the target trial may fail when researchers deviate from simple principles that guide the design and analysis of randomized experiments. We review a framework to describe and prevent biases, including immortal time bias, that result from a failure to align start of follow-up, specification of eligibility, and treatment assignment. We review some analytic approaches to avoid these problems in comparative effectiveness or safety research. PMID:27237061

  7. Real-time video streaming in mobile cloud over heterogeneous wireless networks

    NASA Astrophysics Data System (ADS)

    Abdallah-Saleh, Saleh; Wang, Qi; Grecos, Christos

    2012-06-01

    Recently, the concept of Mobile Cloud Computing (MCC) has been proposed to offload the resource requirements in computational capabilities, storage and security from mobile devices into the cloud. Internet video applications such as real-time streaming are expected to be ubiquitously deployed and supported over the cloud for mobile users, who typically encounter a range of wireless networks of diverse radio access technologies during their roaming. However, real-time video streaming for mobile cloud users across heterogeneous wireless networks presents multiple challenges. The network-layer quality of service (QoS) provision to support high-quality mobile video delivery in this demanding scenario remains an open research question, and this in turn affects the application-level visual quality and impedes mobile users' perceived quality of experience (QoE). In this paper, we devise a framework to support real-time video streaming in this new mobile video networking paradigm and evaluate the performance of the proposed framework empirically through a lab-based yet realistic testing platform. One particular issue we focus on is the effect of users' mobility on the QoS of video streaming over the cloud. We design and implement a hybrid platform comprising of a test-bed and an emulator, on which our concept of mobile cloud computing, video streaming and heterogeneous wireless networks are implemented and integrated to allow the testing of our framework. As representative heterogeneous wireless networks, the popular WLAN (Wi-Fi) and MAN (WiMAX) networks are incorporated in order to evaluate effects of handovers between these different radio access technologies. The H.264/AVC (Advanced Video Coding) standard is employed for real-time video streaming from a server to mobile users (client nodes) in the networks. Mobility support is introduced to enable continuous streaming experience for a mobile user across the heterogeneous wireless network. 
Real-time video stream packets are captured for analytical purposes on the mobile user node. Experimental results are obtained and analysed. Future work is identified towards further improvement of the current design and implementation. With this new mobile video networking concept and paradigm implemented and evaluated, results and observations obtained from this study would form the basis of a more in-depth, comprehensive understanding of various challenges and opportunities in supporting high-quality real-time video streaming in mobile cloud over heterogeneous wireless networks.

  8. Multi-level emulation of a volcanic ash transport and dispersion model to quantify sensitivity to uncertain parameters

    NASA Astrophysics Data System (ADS)

    Harvey, Natalie J.; Huntley, Nathan; Dacre, Helen F.; Goldstein, Michael; Thomson, David; Webster, Helen

    2018-01-01

    Following the disruption to European airspace caused by the eruption of Eyjafjallajökull in 2010 there has been a move towards producing quantitative predictions of volcanic ash concentration using volcanic ash transport and dispersion simulators. However, there is no formal framework for determining the uncertainties of these predictions and performing many simulations using these complex models is computationally expensive. In this paper a Bayesian linear emulation approach is applied to the Numerical Atmospheric-dispersion Modelling Environment (NAME) to better understand the influence of source and internal model parameters on the simulator output. Emulation is a statistical method for predicting the output of a computer simulator at new parameter choices without actually running the simulator. A multi-level emulation approach is applied using two configurations of NAME with different numbers of model particles. Information from many evaluations of the computationally faster configuration is combined with results from relatively few evaluations of the slower, more accurate, configuration. This approach is effective when it is not possible to run the accurate simulator many times and when there is also little prior knowledge about the influence of parameters. The approach is applied to the mean ash column loading in 75 geographical regions on 14 May 2010. Through this analysis it has been found that the parameters that contribute the most to the output uncertainty are initial plume rise height, mass eruption rate, free tropospheric turbulence levels and precipitation threshold for wet deposition. This information can be used to inform future model development and observational campaigns and routine monitoring. The analysis presented here suggests the need for further observational and theoretical research into parameterisation of atmospheric turbulence. 
Furthermore, it can also be used to inform the most important parameter perturbations for a small operational ensemble of simulations. The use of an emulator also identifies the input and internal parameters that do not contribute significantly to simulator uncertainty. Finally, the analysis highlights that the faster, less accurate configuration of NAME can, on its own, provide useful information for the problem of predicting average column load over large areas.
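    The multi-level idea can be sketched with plain linear regression standing in for the Bayes linear machinery: emulate the fast configuration from many cheap runs, then use a handful of accurate runs to learn a correction from the fast emulator's prediction to the accurate output. Everything below (the toy simulators, the scale-and-shift correction, the sample sizes) is an illustrative assumption:

```python
import numpy as np

def linfit(X, y):
    """Ordinary least squares with an intercept; returns a predictor."""
    A = np.column_stack([np.ones(len(X)), X])
    c, *_ = np.linalg.lstsq(A, y, rcond=None)
    return lambda Z: np.column_stack([np.ones(len(Z)), Z]) @ c

rng = np.random.default_rng(1)

# Level 1: many runs of the cheap configuration -> emulator of the fast code.
theta_fast = rng.uniform(0, 1, (200, 2))
y_fast = 2 * theta_fast[:, 0] - theta_fast[:, 1] + 0.05 * rng.normal(size=200)
em_fast = linfit(theta_fast, y_fast)

# Level 2: few runs of the accurate configuration -> regress the accurate
# output on the fast emulator's prediction (a scale-and-shift correction).
theta_acc = rng.uniform(0, 1, (15, 2))
y_acc = 1.1 * (2 * theta_acc[:, 0] - theta_acc[:, 1]) + 0.3
em_acc = linfit(np.column_stack([em_fast(theta_acc)]), y_acc)

def emulate_accurate(theta):
    """Predict the accurate simulator without ever running it."""
    return em_acc(np.column_stack([em_fast(theta)]))
```

    The payoff is exactly the one described in the abstract: information from 200 cheap runs is combined with only 15 expensive runs, yet the combined emulator tracks the expensive code.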

  9. Characterization of Orbital Debris Via Hyper-Velocity Ground-Based Tests

    NASA Technical Reports Server (NTRS)

    Cowardin, Heather

    2015-01-01

    The goal is to replicate a hyper-velocity fragmentation event using modern-day spacecraft materials and construction techniques to improve the existing DoD and NASA breakup models. DebriSat is intended to be representative of modern LEO satellites. Major design decisions were reviewed and approved by Aerospace subject matter experts from different disciplines. DebriSat includes seven major subsystems: attitude determination and control system (ADCS), command and data handling (C&DH), electrical power system (EPS), payload, propulsion, telemetry tracking and command (TT&C), and thermal management. To reduce cost, most components are emulated based on existing designs of flight hardware and fabricated with the same materials. A key laboratory-based test, the Satellite Orbital debris Characterization Impact Test (SOCIT), supporting the development of the DoD and NASA satellite breakup models, was conducted at AEDC in 1992. Breakup models based on SOCIT have supported many applications and matched on-orbit events reasonably well over the years.

  10. Evaluating the accuracy of climate change pattern emulation for low warming targets

    NASA Astrophysics Data System (ADS)

    Tebaldi, Claudia; Knutti, Reto

    2018-05-01

    Global climate policy is increasingly debating the value of very low warming targets, yet not many experiments conducted with global climate models in their fully coupled versions are currently available to help inform studies of the corresponding impacts. This raises the question whether a map of warming or precipitation change in a world 1.5 °C warmer than preindustrial can be emulated from existing simulations that reach higher warming targets, or whether entirely new simulations are required. Here we show that even for this type of low warming in strong mitigation scenarios, climate change signals are quite linear as a function of global temperature. Therefore, emulation techniques amounting to linear rescaling on the basis of global temperature change ratios (like simple pattern scaling) provide a viable way forward. The errors introduced are small relative to the spread in the forced response to a given scenario that we can assess from a multi-model ensemble. They are also small relative to the noise introduced into the estimates of the forced response by internal variability within a single model, which we can assess from either control simulations or initial condition ensembles. Challenges arise when scaling inadvertently reduces the inter-model spread or suppresses the internal variability, both important sources of uncertainty for impact assessment, or when the scenarios have very different characteristics in the composition of the forcings. Taking advantage of an available suite of coupled model simulations under low-warming and intermediate scenarios, we evaluate the accuracy of these emulation techniques and show that they are unlikely to represent a substantial contribution to the total uncertainty.
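    Simple pattern scaling, as described above, is just a linear rescaling of a simulated change field by the ratio of global-mean warmings. A minimal sketch (function and argument names are illustrative):

```python
import numpy as np

def pattern_scale(field_hi, dT_hi, dT_target):
    """Rescale a simulated change pattern (e.g. local warming or
    precipitation change per grid cell) from a high-warming run to a
    lower global-mean warming target, assuming local change is linear
    in global-mean temperature change."""
    return field_hi * (dT_target / dT_hi)
```

    For example, a change field from a run that warmed 3 K globally would be halved cell-by-cell to estimate the 1.5 K world; the abstract's caveats (suppressed internal variability, reduced inter-model spread, differing forcing compositions) are precisely the places where this linearity assumption bends.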

  11. Advanced Free Flight Planner and Dispatcher's Workstation: Preliminary Design Specification

    NASA Technical Reports Server (NTRS)

    Wilson, J.; Wright, C.; Couluris, G. J.

    1997-01-01

    The National Aeronautics and Space Administration (NASA) has implemented the Advanced Air Transportation Technology (AATT) program to investigate future improvements to the national and international air traffic management systems. This research, as part of the AATT program, developed preliminary design requirements for an advanced Airline Operations Control (AOC) dispatcher's workstation, with emphasis on flight planning. This design will support the implementation of an experimental workstation in NASA laboratories that would emulate AOC dispatch operations. The work developed an airline flight plan data base and specified requirements for: a computer tool for generation and evaluation of free flight, user preferred trajectories (UPT); the kernel of an advanced flight planning system to be incorporated into the UPT-generation tool; and an AOC workstation to house the UPT-generation tool and to provide a real-time testing environment. A prototype for the advanced flight plan optimization kernel was developed and demonstrated. The flight planner uses dynamic programming to search a four-dimensional wind and temperature grid to identify the optimal route, altitude and speed for successive segments of a flight. An iterative process is employed in which a series of trajectories are successively refined until the UPT is identified. The flight planner is designed to function in the current operational environment as well as in free flight. The free flight environment would enable greater flexibility in UPT selection based on alleviation of current procedural constraints. The prototype also takes advantage of advanced computer processing capabilities to implement more powerful optimization routines than would be possible with older computer systems.
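    The dynamic-programming search described above (choosing altitude and speed for successive flight segments) can be illustrated on a much-reduced grid: a cost per segment and altitude level, with transitions limited to adjacent flight levels. This is a toy reduction of the 4-D wind/temperature search, with invented data:

```python
def optimal_profile(costs):
    """Dynamic-programming altitude profile: costs[k][a] is the cost of
    flying segment k at altitude index a; climbs/descents are limited
    to adjacent altitude levels between consecutive segments."""
    n_seg, n_alt = len(costs), len(costs[0])
    best = list(costs[0])
    back = [[0] * n_alt for _ in range(n_seg)]
    for k in range(1, n_seg):
        new = []
        for a in range(n_alt):
            # Cheapest reachable predecessor among adjacent levels.
            prev_cost, prev_a = min(
                (best[b], b) for b in range(max(0, a - 1), min(n_alt, a + 2)))
            new.append(prev_cost + costs[k][a])
            back[k][a] = prev_a
        best = new
    # Backtrack from the cheapest final altitude.
    a = min(range(n_alt), key=lambda i: best[i])
    total, path = best[a], [a]
    for k in range(n_seg - 1, 0, -1):
        a = back[k][a]
        path.append(a)
    return total, path[::-1]
```

    The real kernel searches route, altitude and speed jointly against gridded winds and temperatures, but the recurrence (best cost to reach each state, then backtrack) is the same.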

  12. Results from Carbon Dioxide Washout Testing Using a Suited Manikin Test Apparatus with a Space Suit Ventilation Test Loop

    NASA Technical Reports Server (NTRS)

    Chullen, Cinda; Conger, Bruce; McMillin, Summer; Vonau, Walt; Kanne, Bryan; Korona, Adam; Swickrath, Mike

    2016-01-01

    NASA is developing an advanced portable life support system (PLSS) to meet the needs of a new NASA advanced space suit. The PLSS is one of the most critical aspects of the space suit providing the necessary oxygen, ventilation, and thermal protection for an astronaut performing a spacewalk. The ventilation subsystem in the PLSS must provide sufficient carbon dioxide (CO2) removal and ensure that the CO2 is washed away from the oronasal region of the astronaut. CO2 washout is a term used to describe the mechanism by which CO2 levels are controlled within the helmet to limit the concentration of CO2 inhaled by the astronaut. Accumulation of CO2 in the helmet or throughout the ventilation loop could cause the suited astronaut to experience hypercapnia (excessive carbon dioxide in the blood). A suited manikin test apparatus (SMTA) integrated with a space suit ventilation test loop was designed, developed, and assembled at NASA in order to experimentally validate adequate CO2 removal throughout the PLSS ventilation subsystem and to quantify CO2 washout performance under various conditions. The test results from this integrated system will be used to validate analytical models and augment human testing. This paper presents the system integration of the PLSS ventilation test loop with the SMTA including the newly developed regenerative Rapid Cycle Amine component used for CO2 removal and tidal breathing capability to emulate the human. The testing and analytical results of the integrated system are presented along with future work.

  13. A Phase-Locked Loop Epilepsy Network Emulator

    PubMed Central

    Watson, P.D.; Horecka, K. M.; Cohen, N.J.; Ratnam, R.

    2015-01-01

    Most seizure forecasting employs statistical learning techniques that lack a representation of the network interactions that give rise to seizures. We present an epilepsy network emulator (ENE) that uses a network of interconnected phase-locked loops (PLLs) to model synchronous, circuit-level oscillations between electrocorticography (ECoG) electrodes. Using ECoG data from a canine-epilepsy model (Davis et al. 2011) and a physiological entropy measure (approximate entropy or ApEn, Pincus 1995), we demonstrate that the entropy of the emulator phases increases dramatically during ictal periods across all ECoG recording sites and across all animals in the sample. Further, this increase precedes the observable voltage spikes that characterize seizure activity in the ECoG data. These results suggest that the ENE is sensitive to phase-domain information in the neural circuits measured by ECoG and that an increase in the entropy of this measure coincides with increasing likelihood of seizure activity. Understanding this unpredictable phase-domain electrical activity present in ECoG recordings may provide a target for seizure detection and feedback control. PMID:26664133
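
    Approximate entropy (ApEn), the regularity statistic cited above, can be computed directly from its definition: count templates of length m that stay within a tolerance r, take the average log frequency, and subtract the same quantity at length m+1. The sketch below is a generic illustration, not the authors' code; the default m and r are chosen only for the demo.

```python
# Approximate entropy (ApEn) computed directly from its definition.
# A minimal, generic illustration; not the ENE implementation.
import math

def approx_entropy(series, m=2, r=0.2):
    def phi(m):
        n = len(series) - m + 1
        templates = [series[i:i + m] for i in range(n)]
        logs = []
        for t in templates:
            # fraction of templates within tolerance r (Chebyshev distance);
            # each template matches itself, so the count is never zero
            c = sum(1 for u in templates
                    if max(abs(a - b) for a, b in zip(t, u)) <= r) / n
            logs.append(math.log(c))
        return sum(logs) / n
    # higher ApEn = more irregular (less predictable) series
    return phi(m) - phi(m + 1)
```

    A constant series scores exactly zero, and a strictly alternating series scores near zero, since both are perfectly predictable; irregular series score higher.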

  14. A satellite orbital testbed for SATCOM using mobile robots

    NASA Astrophysics Data System (ADS)

    Shen, Dan; Lu, Wenjie; Wang, Zhonghai; Jia, Bin; Wang, Gang; Wang, Tao; Chen, Genshe; Blasch, Erik; Pham, Khanh

    2016-05-01

    This paper develops and evaluates a satellite orbital testbed (SOT) for satellite communications (SATCOM). The SOT emulates a 3D satellite orbit using omni-wheeled robots and a robotic arm. The 3D motion of the satellite is partitioned into movements in the equatorial plane and up-down motions in the vertical plane. The former are emulated by omni-wheeled robots, while the up-down motions are performed by a stepper-motor-controlled ball that travels along a rod (the robotic arm) attached to each robot. The emulated satellite positions are fed to the measurement model, whose outputs are used to perform multiple space object tracking. The tracking results then drive maneuver detection and collision alerting, and the resulting satellite maneuver commands are translated into robot and robotic arm commands. In SATCOM, the effects of jamming depend on the range and angles of the satellite transponder relative to the jamming satellite. We extend the SOT to include USRP transceivers; in the extended SOT, the relative ranges and angles are implemented using the omni-wheeled robots and robotic arms.
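
    The planar/vertical partition can be illustrated with an idealized circular orbit: the in-plane components become the target for the omni-wheeled robot, and the out-of-plane component becomes the height of the ball on the arm. The function name and the lab scale factor below are hypothetical, not taken from the SOT.

```python
# Sketch of the SOT motion partition for an idealized circular orbit.
# `orbit_to_commands` and the scale factor are invented for illustration.
import math

def orbit_to_commands(t, radius_m, inclination_rad, period_s, scale=1e-6):
    theta = 2 * math.pi * t / period_s          # orbital phase at time t
    x = radius_m * math.cos(theta)              # in-plane components
    y = radius_m * math.sin(theta) * math.cos(inclination_rad)
    z = radius_m * math.sin(theta) * math.sin(inclination_rad)
    robot_xy = (x * scale, y * scale)           # planar target for the robot
    arm_height = z * scale                      # ball height along the rod
    return robot_xy, arm_height
```

    For a polar orbit (90-degree inclination), a quarter period after the ascending node the planar target returns to the origin while the arm carries the full radial excursion, which matches the intuition that the arm alone emulates out-of-plane motion.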

  15. Using Big Data to Emulate a Target Trial When a Randomized Trial Is Not Available.

    PubMed

    Hernán, Miguel A; Robins, James M

    2016-04-15

    Ideally, questions about comparative effectiveness or safety would be answered using an appropriately designed and conducted randomized experiment. When we cannot conduct a randomized experiment, we analyze observational data. Causal inference from large observational databases (big data) can be viewed as an attempt to emulate a randomized experiment-the target experiment or target trial-that would answer the question of interest. When the goal is to guide decisions among several strategies, causal analyses of observational data need to be evaluated with respect to how well they emulate a particular target trial. We outline a framework for comparative effectiveness research using big data that makes the target trial explicit. This framework channels counterfactual theory for comparing the effects of sustained treatment strategies, organizes analytic approaches, provides a structured process for the criticism of observational studies, and helps avoid common methodologic pitfalls.

  16. Brain-Inspired Photonic Signal Processor for Generating Periodic Patterns and Emulating Chaotic Systems

    NASA Astrophysics Data System (ADS)

    Antonik, Piotr; Haelterman, Marc; Massar, Serge

    2017-05-01

    Reservoir computing is a bioinspired computing paradigm for processing time-dependent signals. Its hardware implementations have received much attention because of their simplicity and remarkable performance on a series of benchmark tasks. In previous experiments, the output was uncoupled from the system and, in most cases, simply computed off-line on a postprocessing computer. However, numerical investigations have shown that feeding the output back into the reservoir opens the possibility of long-horizon time-series forecasting. Here, we present a photonic reservoir computer with output feedback, and we demonstrate its capacity to generate periodic time series and to emulate chaotic systems. We study in detail the effect of experimental noise on system performance. In the case of chaotic systems, we introduce several metrics, based on standard signal-processing techniques, to evaluate the quality of the emulation. Our work significantly enlarges the range of tasks that can be solved by hardware reservoir computers and, therefore, the range of applications they could potentially tackle. It also raises interesting questions in nonlinear dynamics and chaos theory.
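
    The output-feedback idea, training a readout while the generator is driven by a teacher signal and then feeding the output back so the system runs autonomously, can be shown with a toy stand-in for the reservoir. Here a two-tap delay line replaces the photonic reservoir and the readout is fit by ordinary least squares; this illustrates only the feedback principle, not the authors' optical hardware.

```python
# Toy illustration of output feedback: fit a linear readout on a delay-line
# "reservoir" against a teacher signal, then close the loop to generate the
# periodic series autonomously. Not a model of the photonic system.

def train_readout(states, targets):
    """Least-squares readout for 2-tap states via the 2x2 normal equations."""
    s11 = sum(s[0] * s[0] for s in states)
    s12 = sum(s[0] * s[1] for s in states)
    s22 = sum(s[1] * s[1] for s in states)
    b1 = sum(s[0] * y for s, y in zip(states, targets))
    b2 = sum(s[1] * y for s, y in zip(states, targets))
    det = s11 * s22 - s12 * s12
    return ((s22 * b1 - s12 * b2) / det, (s11 * b2 - s12 * b1) / det)

def free_run(w, state, n):
    """Autonomous generation: each output is fed back as the next input."""
    out = []
    for _ in range(n):
        y = w[0] * state[0] + w[1] * state[1]
        out.append(y)
        state = (y, state[0])     # shift the delay line
    return out
```

    Trained on the period-4 teacher [0, 1, 0, -1], the readout learns y[t] = -y[t-2] exactly, and the closed loop then reproduces the cycle indefinitely without further input.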

  17. The introspective may achieve more: Enhancing existing Geoscientific models with native-language emulated structural reflection

    NASA Astrophysics Data System (ADS)

    Ji, Xinye; Shen, Chaopeng

    2018-01-01

    Geoscientific models manage myriad and increasingly complex data structures as trans-disciplinary models are integrated. They often incur significant redundancy with cross-cutting tasks. Reflection, the ability of a program to inspect and modify its structure and behavior at runtime, is known as a powerful tool to improve code reusability, abstraction, and separation of concerns. Reflection is rarely adopted in high-performance Geoscientific models, especially with Fortran, where it was previously deemed implausible. Practical constraints of language and legacy often limit us to feather-weight, native-language solutions. We demonstrate the usefulness of a structural-reflection-emulating, dynamically linked metaObject, gd. We show real-world examples including data structure self-assembly, effortless input/output (I/O) and upgrades to parallel I/O, recursive actions, and batch operations. We share gd and a derived module that reproduces MATLAB-like structures in Fortran and C++. We suggest that both a gd representation and a Fortran-native representation be maintained to access the data, each for separate purposes. Embracing emulated reflection allows generically written codes that are highly reusable across projects.

  18. Impedance Analysis of Ion Transport Through Supported Lipid Membranes Doped with Ionophores: A New Kinetic Approach

    PubMed Central

    Alvarez, P. E.; Vallejo, A. E.

    2008-01-01

    Kinetics of facilitated ion transport through planar bilayer membranes are normally analyzed by electrical conductance methods. The additional use of electrical relaxation techniques, such as voltage jump, is necessary to evaluate individual rate constants. Although electrochemical impedance spectroscopy is recognized as the most powerful of the available electric relaxation techniques, it has rarely been used in connection with these kinetic studies. According to the new approach presented in this work, three steps were followed. First, a kinetic model was proposed that has the distinct quality of being general, i.e., it properly describes both carrier and channel mechanisms of ion transport. Second, the state equations for steady-state and for impedance experiments were derived, exhibiting the input–output representation pertaining to the model’s structure. With the application of a method based on the similarity transformation approach, it was possible to check that the proposed mechanism is distinguishable, i.e., no other model with a different structure exhibits the same input–output behavior for any input as the original. Additionally, the method allowed us to check whether the proposed model is globally identifiable (i.e., whether there is a single set of fit parameters for the model) when analyzed in terms of its impedance response. Thus, our model does not represent a theoretical interpretation of the experimental impedance but rather constitutes the prerequisite to select this type of experiment in order to obtain optimal kinetic identification of the system. Finally, impedance measurements were performed and the results were fitted to the proposed theoretical model in order to obtain the kinetic parameters of the system. The successful application of this approach is exemplified with results obtained for valinomycin–K+ in lipid bilayers supported onto gold substrates, i.e., an arrangement capable of emulating biological membranes. PMID:19669528

  19. Applicability of aquifer impact models to support decisions at CO2 sequestration sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keating, Elizabeth; Bacon, Diana; Carroll, Susan

    2016-09-01

    The National Risk Assessment Partnership has developed a suite of tools to assess and manage risk at CO2 sequestration sites (www.netl.doe.gov/nrap). This capability includes polynomial- or look-up-table-based reduced-order models (ROMs) that predict the impact of CO2 and brine leaks on overlying aquifers. The development of these computationally efficient models and the underlying reactive transport simulations they emulate has been documented elsewhere (Carroll et al., 2014; Dai et al., 2014; Keating et al., 2015). The ROMs reproduce the ensemble behavior of large numbers of simulations and are well suited to applications that consider a large number of scenarios to understand the effects of parameter sensitivity and uncertainty on the risk of CO2 leakage to groundwater quality. In this paper, we seek to demonstrate the applicability of ROM-based ensemble analysis by considering what types of decisions and aquifer types would benefit from the ROM analysis. We present four hypothetical examples where applying ROMs in ensemble mode could support decisions in the early stages of a geologic CO2 sequestration project. These decisions pertain to site selection, site characterization, monitoring network evaluation, and health impacts. In all cases, we consider potential brine/CO2 leak rates at the base of the aquifer to be uncertain. We show that the derived probabilities provide information relevant to the decision at hand. Although the ROMs were developed using site-specific data from two aquifers (High Plains and Edwards), the models accept aquifer characteristics as variable inputs and so may have broader applicability. We conclude that the pH and TDS predictions are the most transferable to other aquifers, based on analysis of the nine water quality metrics (pH, TDS, 4 trace metals, 3 organic compounds). Guidelines are presented for determining the aquifer types for which the ROMs should be applicable.
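
    A look-up-table ROM evaluated in ensemble mode reduces, in sketch form, to interpolation plus Monte Carlo sampling over the uncertain leak rate. The leak-rate grid and TDS responses below are invented numbers for illustration, not NRAP outputs.

```python
# Sketch of a look-up-table ROM run in ensemble mode: interpolate a
# precomputed aquifer response, then Monte Carlo over an uncertain leak rate.
# The grid and TDS values are hypothetical, not NRAP data.
import bisect
import random

RATES = [0.0, 0.5, 1.0, 2.0]            # leak-rate grid, kg/s (hypothetical)
TDS = [500.0, 900.0, 1400.0, 2600.0]    # tabulated TDS response, mg/L

def lut_rom(leak_rate):
    """Piecewise-linear interpolation of the tabulated TDS response."""
    i = min(max(bisect.bisect_left(RATES, leak_rate), 1), len(RATES) - 1)
    f = (leak_rate - RATES[i - 1]) / (RATES[i] - RATES[i - 1])
    return TDS[i - 1] + f * (TDS[i] - TDS[i - 1])

def exceedance_probability(n, threshold, seed=0):
    """Ensemble mode: probability that TDS exceeds a water-quality threshold
    when the leak rate is uniformly uncertain on [0, 2] kg/s."""
    rng = random.Random(seed)
    hits = sum(lut_rom(rng.uniform(0.0, 2.0)) > threshold for _ in range(n))
    return hits / n
```

    The derived exceedance probability is the kind of quantity that would feed a site-selection or monitoring decision; with thousands of ROM evaluations per second, sampling the full uncertainty range is cheap compared with rerunning reactive transport simulations.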

  20. Feasibility of a Networked Air Traffic Infrastructure Validation Environment for Advanced NextGen Concepts

    NASA Technical Reports Server (NTRS)

    McCormack, Michael J.; Gibson, Alec K.; Dennis, Noah E.; Underwood, Matthew C.; Miller, Lana B.; Ballin, Mark G.

    2013-01-01

    Next Generation Air Transportation System (NextGen) applications reliant upon aircraft data links such as Automatic Dependent Surveillance-Broadcast (ADS-B) offer a sweeping modernization of the National Airspace System (NAS), but the aviation stakeholder community has not yet established a positive business case for equipage, and message content standards remain in flux. It is necessary to transition promising Air Traffic Management (ATM) Concepts of Operations (ConOps) from simulation environments to full-scale flight tests in order to validate user benefits and solidify message standards. However, flight tests are prohibitively expensive, and message standards for Commercial-off-the-Shelf (COTS) systems cannot support many advanced ConOps. It is therefore proposed to simulate future aircraft surveillance and communications equipage and employ an existing commercial data link to exchange data during dedicated flight tests. This capability, referred to as the Networked Air Traffic Infrastructure Validation Environment (NATIVE), would emulate aircraft data links such as ADS-B using in-flight Internet and easily installed test equipment. By utilizing low-cost equipment that is easy to install and certify for testing, advanced ATM ConOps can be validated, message content standards can be solidified, and new standards can be established through full-scale flight trials without unnecessary or expensive equipage or extensive flight test preparation. This paper presents results of a feasibility study of the NATIVE concept. To determine requirements, six NATIVE design configurations were developed for two NASA ConOps that rely on ADS-B. The performance characteristics of three existing in-flight Internet services were investigated to determine whether performance is adequate to support the concept. Next, a study of requisite hardware and software was conducted to examine whether and how the NATIVE concept might be realized. Finally, to determine a business case, economic factors were evaluated and a preliminary cost-benefit analysis was performed.

  1. Effect of a new social support program by voluntary organization in pediatric oncology department in a developing country.

    PubMed

    Nair, Manjusha; Parukkutty, Kusumakumary; Kommadath, Sheethal

    2014-04-01

    Comprehensive childhood cancer treatment in the modern era means not only strenuous treatment regimens and meticulous nursing care; it also implies attention to the social, psychological, and financial aspects of disease and treatment. In a developing country like ours, though it is possible to provide good medical and nursing care in a government set-up, there is always a shortage of workforce and financial support, leading to nonadherence to treatment regimens by patients and parents and resulting in suboptimal treatment outcomes. Overcrowding of pediatric cancer patients along with general patients for lab tests and other hospital services, poor drug compliance, treatment abandonment and loss to follow-up, lack of funding to meet nonmedical expenses, and inadequate facilities for providing psychological support were some of the major lacunae we identified in our pediatric oncology division (POD). We introduced a new social support program with the help of additional staff supported by a nongovernmental agency, and new quality improvement services were introduced. The impact was demonstrable as reduced waiting time in the hospital, allayed anxiety about painful procedures, better drug compliance, less treatment abandonment, and improved follow-up. This model can be emulated in other similar resource-limited centers.

  2. Emulating short-term synaptic dynamics with memristive devices

    NASA Astrophysics Data System (ADS)

    Berdan, Radu; Vasilaki, Eleni; Khiat, Ali; Indiveri, Giacomo; Serb, Alexandru; Prodromakis, Themistoklis

    2016-01-01

    Neuromorphic architectures offer great promise for achieving computation capacities beyond conventional von Neumann machines. The essential elements for achieving this vision are highly scalable synaptic mimics that do not undermine biological fidelity. Here we demonstrate that single solid-state TiO2 memristors can exhibit non-associative plasticity phenomena observed in biological synapses, supported by their metastable memory state transition properties. We show that, contrary to conventional uses of solid-state memory, the existence of rate-limiting volatility is a key feature for capturing short-term synaptic dynamics. We also show how the temporal dynamics of our prototypes can be exploited to implement spatio-temporal computation, demonstrating the memristors' full potential for building biophysically realistic neural processing systems.
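
    The role of rate-limiting volatility in short-term synaptic dynamics can be captured by a toy state model: each pulse potentiates the conductance, which then relaxes toward baseline between pulses. All parameter values and names below are illustrative, not measurements from the TiO2 devices.

```python
# Toy volatile-synapse model: pulses potentiate the conductance, which then
# decays toward baseline with time constant tau (the rate-limiting volatility
# discussed above). Parameters are illustrative, not device measurements.
import math

def conductance_after_pulses(gaps_s, dg=1.0, tau=50e-3, g0=0.0):
    """Conductance after a pulse train; gaps_s[i] is the wait before pulse i."""
    g = g0
    for dt in gaps_s:
        g = g0 + (g - g0) * math.exp(-dt / tau)   # volatile decay between pulses
        g += dg                                   # potentiation at the pulse
    return g
```

    Closely spaced pulses ride on the undecayed state and accumulate conductance, the analogue of short-term facilitation, while widely spaced pulses see a fully relaxed device; without the volatility term, pulse timing would have no effect at all.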

  3. Part-whole reasoning in medical ontologies revisited--introducing SEP triplets into classification-based description logics.

    PubMed

    Schulz, S; Romacker, M; Hahn, U

    1998-01-01

    The development of powerful and comprehensive medical ontologies that support formal reasoning on a large scale is one of the key requirements for clinical computing in the next millennium. Taxonomic medical knowledge, a major portion of these ontologies, is mainly characterized by generalization and part-whole relations between concepts. While reasoning in generalization hierarchies is quite well understood, no fully conclusive mechanism as yet exists for part-whole reasoning. The approach we take emulates part-whole reasoning via classification-based reasoning using SEP triplets, a special data structure for encoding part-whole relations that is fully embedded in the formal framework of standard description logics.

  4. Final Report: Enabling Exascale Hardware and Software Design through Scalable System Virtualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bridges, Patrick G.

    2015-02-01

    In this grant, we enhanced the Palacios virtual machine monitor to increase its scalability and suitability for addressing exascale system software design issues. This included a wide range of research on core Palacios features, large-scale system emulation, fault injection, performance monitoring, and VMM extensibility. This research resulted in a large number of high-impact publications in well-known venues, the support of a number of students, and the graduation of two Ph.D. students and one M.S. student. In addition, our enhanced version of the Palacios virtual machine monitor has been adopted as a core element of the Hobbes operating system under active DOE-funded research and development.

  5. THE `IN' AND THE `OUT': An Evolutionary Approach

    NASA Astrophysics Data System (ADS)

    Medina-Martins, P. R.; Rocha, L.

    It is claimed that a great deal of the problems which the mechanist approaches to the emulation/modelling of the human mind are presently facing are due to a host of canons so `readily' accepted and familiar that some of their underlying processes have not yet been the object of intensive research. The essay proposes a (possible) solution for some of these problems, introducing the tenets of a new paradigm which, based on a reappraisal of the concepts of purposiveness and teleology, lays emphasis on the evolutionary aspects (biological and mental) of living beings. A complex neuro-fuzzy system which works as the supporting realization of this paradigm is briefly described in the essay.

  6. Part-whole reasoning in medical ontologies revisited--introducing SEP triplets into classification-based description logics.

    PubMed Central

    Schulz, S.; Romacker, M.; Hahn, U.

    1998-01-01

    The development of powerful and comprehensive medical ontologies that support formal reasoning on a large scale is one of the key requirements for clinical computing in the next millennium. Taxonomic medical knowledge, a major portion of these ontologies, is mainly characterized by generalization and part-whole relations between concepts. While reasoning in generalization hierarchies is quite well understood, no fully conclusive mechanism as yet exists for part-whole reasoning. The approach we take emulates part-whole reasoning via classification-based reasoning using SEP triplets, a special data structure for encoding part-whole relations that is fully embedded in the formal framework of standard description logics. Images Figure 3 PMID:9929335

  7. Certification of Completion of ASC FY08 Level-2 Milestone ID #2933

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lipari, D A

    2008-06-12

    This report documents the satisfaction of the completion criteria associated with ASC FY08 Milestone ID No. 2933: 'Deploy Moab resource management services on BlueGene/L'. Specifically, this milestone represents LLNL efforts to enhance both SLURM and Moab to extend Moab's capabilities to schedule and manage BlueGene/L, and increases portability of user scripts between ASC systems. The completion criteria for the milestone are the following: (1) Batch jobs can be specified, submitted to Moab, scheduled and run on the BlueGene/L system; (2) Moab will be able to support the markedly increased scale in node count as well as the wiring geometry that is unique to BlueGene/L; and (3) Moab will also prepare and report statistics of job CPU usage just as it does for the current systems it supports. This document presents the completion evidence for both of the stated milestone certification methods: completion evidence for this milestone will be in the form of (1) documentation, a report that certifies that the completion criteria have been met; and (2) user hand-off. As the selected Tri-Lab workload manager, Moab was chosen to replace LCRM as the enterprise-wide scheduler across Livermore Computing (LC) systems. While LCRM/SLURM successfully scheduled jobs on BG/L, the effort to replace LCRM with Moab on BG/L represented a significant challenge. Moab is a commercial product developed and sold by Cluster Resources, Inc. (CRI). Moab receives users' batch job requests and dispatches these jobs to run on a specific cluster. SLURM is an open-source resource manager whose development is managed by members of the Integrated Computational Resource Management Group (ICRMG) within the Services and Development Division at LLNL. SLURM is responsible for launching and running jobs on an individual cluster. Replacing LCRM with Moab on BG/L required substantial changes to both Moab and SLURM. While the ICRMG could directly manage the SLURM development effort, the work to enhance Moab had to be done by Moab's vendor. Members of the ICRMG held many meetings with CRI developers to develop the design and specify the requirements for what Moab needed to do. Extensions to SLURM are used to run jobs on the BlueGene/L architecture. These extensions support the three-dimensional network topology unique to BG/L. While BG/L geometry support was already in SLURM, enhancements were needed to provide backfill capability and answer 'will-run' queries from Moab. For its part, the Moab architecture needed to be modified to interact with SLURM in a more coordinated way. It needed enhancements to support SLURM's shorthand notation for representing thousands of compute nodes and report this information using Moab's existing status commands. The LCRM wrapper scripts that emulated LCRM commands also needed to be enhanced to support BG/L usage. The effort was successful as Moab 5.2.2 and SLURM 1.3 were installed on the 106,496-node BG/L machine on May 21, 2008, and turned over to the users to run production.

  8. Control of electro-rheological fluid-based torque generation components for use in active rehabilitation devices

    NASA Astrophysics Data System (ADS)

    Nikitczuk, Jason; Weinberg, Brian; Mavroidis, Constantinos

    2006-03-01

    In this paper we present the design and control algorithms for novel electro-rheological fluid-based torque generation elements that will be used to drive the joint of a new type of portable and controllable Active Knee Rehabilitation Orthotic Device (AKROD) for gait retraining in stroke patients. The AKROD is composed of straps and rigid components for attachment to the leg, with a central hinge mechanism where a gear system is connected. The key features of AKROD include: a compact, lightweight design with highly tunable torque capabilities through a variable damper component; full portability with on-board power, control circuitry, and sensors (encoder and torque); and real-time capabilities for closed-loop computer control for optimizing gait retraining. The variable damper component is achieved through an electro-rheological fluid (ERF) element that connects to the output of the gear system. Using the electrically controlled rheological properties of ERFs, compact brakes capable of supplying high resistive and controllable torques are developed. A preliminary prototype, AKROD v.2, has been developed and tested in our laboratory. The AKROD v.2 ERF resistive actuator was tested in laboratory experiments using our custom-made ERF Testing Apparatus (ETA). The ETA provides a computer-controlled environment to test ERF brakes and actuators in various conditions and scenarios, including emulating the interaction between the human muscles involved with the knee and AKROD's ERF actuators and brakes. In our preliminary results, AKROD's ERF resistive actuator was tested in closed-loop torque control experiments using a hybrid (non-linear, adaptive) Proportional-Integral (PI) torque controller.
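
    A PI torque loop of the kind described above can be sketched as a discrete PI controller with output clamping and simple anti-windup. The gains, limits, and the first-order plant used in the usage example are hypothetical, not AKROD parameters.

```python
# Sketch of a discrete PI torque loop with output clamping and simple
# anti-windup, in the spirit of the closed-loop torque control described
# above. Gains and limits are hypothetical, not AKROD values.

def make_pi_controller(kp, ki, u_min=0.0, u_max=1.0):
    state = {"integral": 0.0}
    def step(setpoint, measured, dt):
        error = setpoint - measured
        state["integral"] += error * dt
        u = kp * error + ki * state["integral"]
        clamped = min(max(u, u_min), u_max)
        if clamped != u:                       # anti-windup: undo the
            state["integral"] -= error * dt    # integration when saturated
        return clamped
    return step
```

    Driving a simple first-order torque plant with this loop settles at the setpoint with zero steady-state error, which is the reason for the integral term; the anti-windup clause keeps the integrator from accumulating while the ERF brake command is saturated.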

  9. Evolution of cooperative behavior in simulation agents

    NASA Astrophysics Data System (ADS)

    Stroud, Phillip D.

    1998-03-01

    A simulated automobile factory paint shop is used as a testbed for exploring the emulation of human decision-making behavior. A discrete-events simulation of the paint shop as a collection of interacting Java actors is described. An evolutionary cognitive architecture is under development for building software actors to emulate humans in simulations of human-dominated complex systems. In this paper, the cognitive architecture is extended by implementing a persistent population of trial behaviors with an incremental fitness valuation update strategy, and by allowing a group of cognitive actors to share information. A proof-of-principle demonstration is presented.
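
    An incremental fitness valuation update over a persistent population of trial behaviors can be sketched as an exponentially weighted moving estimate plus greedy selection. The behavior names and smoothing factor below are made up for illustration; the paper's actual update scheme is not specified here.

```python
# Sketch of an incremental fitness update for persistent trial behaviors.
# Names and the smoothing factor alpha are hypothetical.

def update_fitness(old, observed, alpha=0.2):
    """Move the fitness estimate a fraction alpha toward the latest payoff."""
    return old + alpha * (observed - old)

def select_behavior(population):
    """Greedy selection: the trial behavior with the highest fitness estimate."""
    return max(population, key=population.get)
```

    Because each observation only nudges the estimate, a behavior's fitness converges gradually toward its true payoff while retaining memory of past trials, which is what lets the population persist across episodes rather than being re-evaluated from scratch.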

  10. Specifying a target trial prevents immortal time bias and other self-inflicted injuries in observational analyses.

    PubMed

    Hernán, Miguel A; Sauer, Brian C; Hernández-Díaz, Sonia; Platt, Robert; Shrier, Ian

    2016-11-01

    Many analyses of observational data are attempts to emulate a target trial. The emulation of the target trial may fail when researchers deviate from simple principles that guide the design and analysis of randomized experiments. We review a framework to describe and prevent biases, including immortal time bias, that result from a failure to align start of follow-up, specification of eligibility, and treatment assignment. We review some analytic approaches to avoid these problems in comparative effectiveness or safety research.

  11. Guidance, Navigation and Control Digital Emulation Technology Laboratory. Volume 1. Part 3. Task 1: Digital Emulation Technology Laboratory

    DTIC Science & Technology

    1991-09-27

    [Abstract not recoverable: the record's abstract field contains OCR fragments of the report's Fortran source listings, including the header of SUBROUTINE RESP2R, a second-order response routine attributed to D. F. Smith.]

  12. 75 FR 9588 - Advisory Panel on Department of Defense Capabilities for Support of Civil Authorities After...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-03

    ... Defense Capabilities for Support of Civil Authorities after Certain Incidents (hereinafter referred to as... Advisory Panel on Department of Defense Capabilities for Support of Civil Authorities After Certain... DEPARTMENT OF DEFENSE Office of the Secretary Advisory Panel on Department of Defense Capabilities...

  13. Improved passive optical network architectures to support local area network emulation and protection

    NASA Astrophysics Data System (ADS)

    Wong, Elaine; Nadarajah, Nishaanthan; Chae, Chang-Joon; Nirmalathas, Ampalavanapillai; Attygalle, Sanjeewa M.

    2006-01-01

    We describe two optical layer schemes which simultaneously facilitate local area network emulation and automatic protection switching against distribution fiber breaks in passive optical networks. One scheme employs a narrowband fiber Bragg grating placed close to the star coupler in the feeder fiber of the passive optical network, while the other uses an additional short length distribution fiber from the star coupler to each customer for the redirection of the customer traffic. Both schemes use RF subcarrier multiplexed transmission for intercommunication between customers in conjunction with upstream access to the central office at baseband. Failure detection and automatic protection switching are performed independently by each optical network unit that is located at the customer premises in a distributed manner. The restoration of traffic transported between the central office and an optical network unit in the event of the distribution fiber break is performed by interconnecting adjacent optical network units and carrying out signal transmissions via an independent but interconnected optical network unit. Such a protection mechanism enables multiple adjacent optical network units to be simultaneously protected by a single optical network unit utilizing its maximum available bandwidth. We experimentally verify the feasibility of both schemes with 1.25 Gb/s upstream baseband transmission to the central office and 155 Mb/s local area network data transmission on a RF subcarrier frequency. The experimental results obtained from both schemes are compared, and the power budgets are calculated to analyze the scalability of each scheme.

  14. Space Link Extension (SLE) Emulation for High-Throughput Network Communication

    NASA Technical Reports Server (NTRS)

    Murawski, Robert; Tchorowski, Nicole; Golden, Bert

    2014-01-01

    As the data rate requirements for space communications increase, significant stress is placed not only on the wireless satellite communication links, but also on the ground networks which forward data from end-users to remote ground stations. These wide area network (WAN) connections add delay and jitter to the end-to-end satellite communication link, effects which can have significant impacts on the wireless communication link. It is imperative that any ground communication protocol can react to these effects such that the ground network does not become a bottleneck in the communication path to the satellite. In this paper, we present our SCENIC Emulation Lab testbed which was developed to test the CCSDS SLE protocol implementations proposed for use on future NASA communication networks. Our results show that in the presence of realistic levels of network delay, high-throughput SLE communication links can experience significant data rate throttling. Based on our observations, we present some insight into why this data throttling happens, and trace the probable issue back to non-optimal blocking communication which is supported by the CCSDS SLE API recommended practices. These issues were presented as well to the SLE implementation developers which, based on our reports, developed a new release for SLE which we show fixes the SLE blocking issue and greatly improves the protocol throughput. In this paper, we also discuss future developments for our end-to-end emulation lab and how these improvements can be used to develop and test future space communication technologies.
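
    The throttling reported above is what one would expect if each data block waits for an acknowledgement round trip before the next is sent. A back-of-the-envelope ceiling for that pattern (block size and delay values are illustrative, not SLE parameters):

```python
# Rough upper bound on throughput for stop-and-wait style blocking: one block
# per (serialization time + round trip). Values used here are illustrative,
# not taken from the SLE testbed.

def blocking_throughput_bps(block_bytes, rtt_s, link_bps):
    serialize_s = block_bytes * 8 / link_bps   # time to put the bits on the wire
    return block_bytes * 8 / (serialize_s + rtt_s)
```

    For example, with a 100 ms round trip, a 64 KiB block on a 1 Gb/s link is capped at roughly 5 Mb/s, far below the link rate, because the round-trip delay dominates the denominator; pipelining (non-blocking sends) removes the RTT term and restores throughput toward the link rate.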

  15. A pilot study comparing mouse and mouse-emulating interface devices for graphic input.

    PubMed

    Kanny, E M; Anson, D K

    1991-01-01

    Adaptive interface devices make it possible for individuals with physical disabilities to use microcomputers and thus perform many tasks that they would otherwise be unable to accomplish. Special equipment is available that purports to allow functional access to the computer for users with disabilities. As technology moves from purely keyboard applications to include graphic input, it will be necessary for assistive interface devices to support graphics as well as text entry. Headpointing systems that emulate the mouse in combination with on-screen keyboards are of particular interest to persons with severe physical impairment such as high level quadriplegia. Two such systems currently on the market are the HeadMaster and the Free Wheel. The authors have conducted a pilot study comparing graphic input speed using the mouse and two headpointing interface systems on the Macintosh computer. The study used a single subject design with six able-bodied subjects, to establish a baseline for comparison with persons with severe disabilities. Results of these preliminary data indicated that the HeadMaster was nearly as effective as the mouse and that it was superior to the Free Wheel for graphics input. This pilot study, however, demonstrated several experimental design problems that need to be addressed to make the study more robust. It also demonstrated the need to include the evaluation of text input so that the effectiveness of the interface devices with text and graphic input could be compared.

  16. CASE/A - COMPUTER AIDED SYSTEM ENGINEERING AND ANALYSIS, ECLSS/ATCS SERIES

    NASA Technical Reports Server (NTRS)

    Bacskay, A.

    1994-01-01

    Design and analysis of Environmental Control and Life Support Systems (ECLSS) and Active Thermal Control Systems (ATCS) for spacecraft missions requires powerful software that is flexible and responsive to the demands of particular projects. CASE/A is an interactive trade study and analysis tool designed to increase productivity during all phases of systems engineering. The graphics-based command-driven package provides a user-friendly environment in which the engineer can analyze the performance and interface characteristics of an ECLS/ATC system. The package is useful during all phases of a spacecraft design program, from initial conceptual design trade studies to the actual flight, including pre-flight prediction and in-flight anomaly analysis. The CASE/A program consists of three fundamental parts: 1) the schematic management system, 2) the database management system, and 3) the simulation control and execution system. The schematic management system allows the user to graphically construct a system model by arranging icons representing system components and connecting the components with physical fluid streams. Version 4.1 contains 51 fully coded and documented default component routines. New components can be added by the user through the "blackbox" component option. The database management system supports the storage and manipulation of component data, output data, and solution control data through interactive edit screens. The simulation control and execution system initiates and controls the iterative solution process, displaying time status and any necessary diagnostic messages. In addition to these primary functions, the program provides three other important functional areas: 1) model output management, 2) system utility commands, and 3) user operations logic capacity. The model output management system provides tabular and graphical output capability. 
Complete fluid constituent mass fraction and properties data (mass flow, pressure, temperature, specific heat, density, and viscosity) is generated at user-selected output intervals and stored for reference. The Integrated Plot Utility (IPU) provides plotting capability for all data output. System utility commands are provided to enable the user to operate more efficiently in the CASE/A environment. The user is able to customize a simulation through optional operations FORTRAN logic. This user-developed code is compiled and linked with a CASE/A model and enables the user to control and timeline component operating parameters during various phases of the iterative solution process. CASE/A provides for transient tracking of the flow stream constituents and determination of their thermodynamic state throughout an ECLSS/ATCS simulation, performing heat transfer, chemical reaction, mass/energy balance, and system pressure drop analysis based on user-specified operating conditions. The program tracks each constituent through all combination and decomposition states while maintaining a mass and energy balance on the overall system. This allows rapid assessment of ECLSS designs, the impact of alternate technologies, and impacts due to changes in metabolic forcing functions, consumables usage, and system control considerations. CASE/A is written in FORTRAN 77 for the DEC VAX/VMS computer series, and requires 12Mb of disk storage and a minimum paging file quota of 20,000 pages. The program operates on the Tektronix 4014 graphics standard and VT100 text standard. The program requires a Tektronix 4014 or later graphics terminal, third party composite graphics/text terminal, or personal computer loaded with appropriate VT100/TEK 4014 emulator software. The use of composite terminals or personal computers with popular emulation software is recommended for enhanced CASE/A operations and general ease of use. 
The program is available on an unlabeled 9-track 6250 BPI DEC VAX BACKUP format magnetic tape. CASE/A development began in 1985 under contract to NASA/Marshall Space Flight Center. The latest version (4.1) was released in 1990. Tektronix and TEK 4014 are trademarks of Tektronix, Inc. VT100 is a trademark of Digital Equipment Corporation.

  17. Robot tracking system improvements and visual calibration of orbiter position for radiator inspection

    NASA Technical Reports Server (NTRS)

    Tonkay, Gregory

    1990-01-01

    The following separate topics are addressed: (1) improving a robotic tracking system; and (2) providing insights into orbiter position calibration for radiator inspection. The objective of the tracking system project was to provide the capability to track moving targets more accurately by adjusting parameters in the control system and implementing a predictive algorithm. A computer model was developed to emulate the tracking system. Using this model as a test bed, a self-tuning algorithm was developed to tune the system gains. The model yielded important findings concerning factors that affect the gains. The self-tuning algorithms will provide the concepts to write a program to automatically tune the gains in the real system. The section concerning orbiter position calibration provides a comparison to previous work that had been performed for plant growth. It provided the conceptualized routines required to visually determine the orbiter position and orientation. Furthermore, it identified the types of information which are required to flow between the robot controller and the vision system.
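    The predict-then-correct idea behind the tracking improvement can be sketched with a minimal alpha-beta filter. The report's actual control law and gain values are not given in this abstract; the filter structure and gains below are generic illustrative choices.

```python
# Minimal alpha-beta tracking filter: predict ahead, then correct with the
# measurement residual (a stand-in for the predictive algorithm described).

def alpha_beta_track(measurements, dt, alpha=0.5, beta=0.1):
    x, v = measurements[0], 0.0          # initial position estimate, velocity
    estimates = []
    for z in measurements[1:]:
        x_pred = x + v * dt              # predict one sample ahead
        r = z - x_pred                   # innovation (prediction error)
        x = x_pred + alpha * r           # correct position
        v = v + (beta / dt) * r          # correct velocity
        estimates.append(x)
    return estimates

# A target moving at a constant 2 units/s, sampled at 10 Hz:
truth = [2.0 * 0.1 * k for k in range(50)]
est = alpha_beta_track(truth, dt=0.1)
print(abs(est[-1] - truth[-1]) < 1e-3)  # True: the filter locks onto the ramp
```

    Self-tuning amounts to choosing alpha and beta (and hence the filter bandwidth) from the observed innovation sequence rather than fixing them in advance.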

  18. Photonically enabled Ka-band radar and infrared sensor subscale testbed

    NASA Astrophysics Data System (ADS)

    Lohr, Michele B.; Sova, Raymond M.; Funk, Kevin B.; Airola, Marc B.; Dennis, Michael L.; Pavek, Richard E.; Hollenbeck, Jennifer S.; Garrison, Sean K.; Conard, Steven J.; Terry, David H.

    2014-10-01

    A subscale radio frequency (RF) and infrared (IR) testbed using novel RF-photonics techniques for generating radar waveforms is currently under development at The Johns Hopkins University Applied Physics Laboratory (JHU/APL) to study target scenarios in a laboratory setting. The linearity of Maxwell's equations allows the use of millimeter wavelengths and scaled-down target models to emulate full-scale RF scene effects. Coupled with passive IR and visible sensors, target motions and heating, and a processing and algorithm development environment, this testbed provides a means to flexibly and cost-effectively generate and analyze multi-modal data for a variety of applications, including verification of digital model hypotheses, investigation of correlated phenomenology, and aiding system capabilities assessment. In this work, concept feasibility is demonstrated for simultaneous RF, IR, and visible sensor measurements of heated, precessing, conical targets and of a calibration cylinder. Initial proof-of-principle results are shown of the Ka-band subscale radar, which models S-band for 1/10th scale targets, using stretch processing and Xpatch models.
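    The scale-model relation the testbed relies on follows from the linearity of Maxwell's equations: a 1/N-scale target probed at N times the frequency has the same electrical size (dimensions in wavelengths) as the full-scale target. The frequencies below are illustrative round numbers for Ka- and S-band, not measured testbed values.

```python
# Frequency scaling for subscale radar measurements.

def full_scale_frequency(model_freq_hz, scale_denominator):
    """Full-scale frequency emulated by a 1/scale_denominator-scale model."""
    return model_freq_hz / scale_denominator

# A 35 GHz (Ka-band) measurement of a 1/10th-scale target:
print(full_scale_frequency(35e9, 10) / 1e9, "GHz")  # 3.5 GHz, i.e. S-band
```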

  19. Noise-aided computation within a synthetic gene network through morphable and robust logic gates

    NASA Astrophysics Data System (ADS)

    Dari, Anna; Kia, Behnam; Wang, Xiao; Bulsara, Adi R.; Ditto, William

    2011-04-01

    An important goal for synthetic biology is to build robust and tunable genetic regulatory networks that are capable of performing assigned operations, usually in the presence of noise. In this work, a synthetic gene network derived from the bacteriophage λ underpins a reconfigurable logic gate wherein we exploit noise and nonlinearity through the application of the logical stochastic resonance paradigm. This biological logic gate can emulate or “morph” the AND and OR operations through varying internal system parameters in a noisy background. Such genetic circuits can afford intriguing possibilities in the realization of engineered genetic networks in which the actual function of the gate can be changed after the network has been built, via an external control parameter. In this article, the full system characterization is reported, with the logic gate performance studied in the presence of external and internal noise. The robustness of the gate, to noise, is studied and illustrated through numerical simulations.

  20. Experimental validation of wireless communication with chaos.

    PubMed

    Ren, Hai-Peng; Bai, Chao; Liu, Jian; Baptista, Murilo S; Grebogi, Celso

    2016-08-01

    The constraints of a wireless physical medium, such as multi-path propagation and complex ambient noise, prevent information from being communicated at a low bit error rate. Surprisingly, it has only recently been shown that, from a theoretical perspective, chaotic signals are optimal for communication: they maximise the receiver signal-to-noise performance, consequently minimising the bit error rate. This work demonstrates numerically and experimentally that chaotic systems can in fact be used to create a reliable and efficient wireless communication system. Toward this goal, we propose an impulsive control method to generate chaotic wave signals that encode arbitrary binary information signals, together with an integration logic and a matched filter capable of decreasing the noise effect over a wireless channel. The experimental validation is conducted by inputting the signals generated by an electronic transmitting circuit into an electronic circuit that emulates a wireless channel, where the signals travel along three different paths. The output signal is decoded by an electronic receiver after passing through a matched filter.

  1. Activity-dependent synaptic plasticity of a chalcogenide electronic synapse for neuromorphic systems.

    PubMed

    Li, Yi; Zhong, Yingpeng; Zhang, Jinjian; Xu, Lei; Wang, Qing; Sun, Huajun; Tong, Hao; Cheng, Xiaoming; Miao, Xiangshui

    2014-05-09

    Nanoscale inorganic electronic synapses or synaptic devices, which are capable of emulating the functions of biological synapses of brain neuronal systems, are regarded as the basic building blocks for beyond-Von Neumann computing architecture, combining information storage and processing. Here, we demonstrate a Ag/AgInSbTe/Ag structure for chalcogenide memristor-based electronic synapses. The memristive characteristics with reproducible gradual resistance tuning are utilised to mimic the activity-dependent synaptic plasticity that serves as the basis of memory and learning. Bidirectional long-term Hebbian plasticity modulation is implemented by the coactivity of pre- and postsynaptic spikes, and the sign and degree are affected by assorted factors including the temporal difference, spike rate and voltage. Moreover, synaptic saturation is observed to be an adjustment of Hebbian rules to stabilise the growth of synaptic weights. Our results may contribute to the development of highly functional plastic electronic synapses and the further construction of next-generation parallel neuromorphic computing architecture.
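    A textbook exponential STDP window illustrates the kind of spike-timing-dependent Hebbian modulation the memristive synapse mimics: the sign of the weight change follows the pre/post spike order, and its magnitude decays with the timing difference. The amplitudes and time constant below are generic assumptions, not Ag/AgInSbTe/Ag device parameters.

```python
import math

def stdp_weight_change(dt_ms, a_plus=1.0, a_minus=1.0, tau_ms=20.0):
    """Weight change vs spike timing difference dt = t_post - t_pre."""
    if dt_ms > 0:                         # pre before post: potentiation
        return a_plus * math.exp(-dt_ms / tau_ms)
    if dt_ms < 0:                         # post before pre: depression
        return -a_minus * math.exp(dt_ms / tau_ms)
    return 0.0

print(stdp_weight_change(10) > 0, stdp_weight_change(-10) < 0)  # True True
```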

  2. A digital waveguide-based approach for Clavinet modeling and synthesis

    NASA Astrophysics Data System (ADS)

    Gabrielli, Leonardo; Välimäki, Vesa; Penttinen, Henri; Squartini, Stefano; Bilbao, Stefan

    2013-12-01

    The Clavinet is an electromechanical musical instrument produced in the mid-twentieth century. As is the case for other vintage instruments, it is subject to aging and requires great effort to be maintained or restored. This paper reports analyses conducted on a Hohner Clavinet D6 and proposes a computational model to faithfully reproduce the Clavinet sound in real time, from tone generation to the emulation of the electronic components. The string excitation signal model is physically inspired and represents a cheap solution in terms of both computational resources and especially memory requirements (compared, e.g., to sample playback systems). Pickups and amplifier models have been implemented which enhance the natural character of the sound with respect to previous work. A model has been implemented on a real-time software platform, Pure Data, capable of a 10-voice polyphony with low latency on an embedded device. Finally, subjective listening tests conducted using the current model are compared to previous tests showing slightly improved results.
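    The simplest member of the digital waveguide family is the Karplus-Strong string loop: a delay line whose length sets the pitch, closed through a loss filter. The actual Clavinet model adds a physically informed excitation and pickup/amplifier stages; this sketch shows only the core recursion, with illustrative parameter values.

```python
import random

def karplus_strong(freq_hz, sample_rate=44100, n_samples=44100, decay=0.996):
    random.seed(0)                                     # reproducible excitation
    n = int(sample_rate / freq_hz)                     # delay length sets pitch
    line = [random.uniform(-1, 1) for _ in range(n)]   # noise-burst excitation
    out = []
    for i in range(n_samples):
        s = line[i % n]
        # Averaging adjacent samples is a one-zero lowpass; `decay` adds loss.
        line[i % n] = decay * 0.5 * (s + line[(i + 1) % n])
        out.append(s)
    return out

tone = karplus_strong(220.0)   # roughly A3; the tone decays as energy is lost
print(len(tone))               # 44100
```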

  3. Cost-Effective Hyperspectral Transmissometers for Oceanographic Applications: Performance Analysis

    PubMed Central

    Ramírez-Pérez, Marta; Röttgers, Rüdiger; Torrecilla, Elena; Piera, Jaume

    2015-01-01

    The recent development of inexpensive, compact hyperspectral transmissometers broadens the research capabilities of oceanographic applications. These developments have been achieved by incorporating technologies such as micro-spectrometers as detectors as well as light emitting diodes (LEDs) as light sources. In this study, we evaluate the performance of the new commercial LED-based hyperspectral transmissometer VIPER (TriOS GmbH, Rastede, Germany), which combines different LEDs to emulate the visible light spectrum, aiming at the determination of attenuation coefficients in coastal environments. For this purpose, experimental uncertainties related to the instrument stability, the effect of ambient light, and derived temperature and salinity correction factors are analyzed. Our results identify some issues related to the thermal management of the LEDs and the contamination of ambient light. Furthermore, the performance of VIPER is validated against other transmissometers through simultaneous field measurements. It is demonstrated that VIPER provides a compact and cost-effective alternative for beam attenuation measurements in coastal waters, but it requires the consideration of several optimizations. PMID:26343652
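    A transmissometer reading becomes a beam attenuation coefficient through the Beer-Lambert law, c = -ln(T)/z for fractional transmittance T over path length z. The 0.25 m path and 80% transmittance below are made-up example values, not VIPER specifications.

```python
import math

def beam_attenuation(transmittance, path_m):
    """Attenuation coefficient c (1/m) from fractional transmittance."""
    return -math.log(transmittance) / path_m

c = beam_attenuation(0.80, 0.25)   # 80% of the beam survives a 0.25 m path
print(round(c, 3), "1/m")          # 0.893 1/m
```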

  4. Experimental validation of wireless communication with chaos

    NASA Astrophysics Data System (ADS)

    Ren, Hai-Peng; Bai, Chao; Liu, Jian; Baptista, Murilo S.; Grebogi, Celso

    2016-08-01

    The constraints of a wireless physical medium, such as multi-path propagation and complex ambient noise, prevent information from being communicated at a low bit error rate. Surprisingly, it has only recently been shown that, from a theoretical perspective, chaotic signals are optimal for communication: they maximise the receiver signal-to-noise performance, consequently minimising the bit error rate. This work demonstrates numerically and experimentally that chaotic systems can in fact be used to create a reliable and efficient wireless communication system. Toward this goal, we propose an impulsive control method to generate chaotic wave signals that encode arbitrary binary information signals, together with an integration logic and a matched filter capable of decreasing the noise effect over a wireless channel. The experimental validation is conducted by inputting the signals generated by an electronic transmitting circuit into an electronic circuit that emulates a wireless channel, where the signals travel along three different paths. The output signal is decoded by an electronic receiver after passing through a matched filter.

  5. Performance Evaluation of Heart Sound Cancellation in FPGA Hardware Implementation for Electronic Stethoscope

    PubMed Central

    Chao, Chun-Tang

    2014-01-01

    This paper presents the design and evaluation of the hardware circuit for electronic stethoscopes with heart sound cancellation capabilities using field programmable gate arrays (FPGAs). The adaptive line enhancer (ALE) was adopted as the filtering methodology to reduce heart sound attributes from the breath sounds obtained via the electronic stethoscope pickup. FPGAs were utilized to implement the ALE functions in hardware to achieve near real-time breath sound processing. We believe that such an implementation is unprecedented and crucial toward a truly useful, standalone medical device in outpatient clinic settings. The implementation evaluation with one Altera cyclone II–EP2C70F89 shows that the proposed ALE used 45% resources of the chip. Experiments with the proposed prototype were made using DE2-70 emulation board with recorded body signals obtained from online medical archives. Clear suppressions were observed in our experiments from both the frequency domain and time domain perspectives. PMID:24790573
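    The ALE idea named above can be sketched in a few lines of plain Python: the input is delayed, and an adaptive FIR predictor learns the correlated, quasi-periodic component (heart sounds), leaving the broadband component (breath sounds) in the error output. Tap count, delay, and step size here are illustrative choices, not the paper's FPGA design values.

```python
import math

def ale(signal, n_taps=16, delay=8, mu=0.01):
    """LMS adaptive line enhancer: returns (predicted, residual) signals."""
    w = [0.0] * n_taps
    predicted, residual = [], []
    for i in range(len(signal)):
        # Reference vector: delayed samples of the same signal.
        x = [signal[i - delay - k] if i - delay - k >= 0 else 0.0
             for k in range(n_taps)]
        y = sum(wk * xk for wk, xk in zip(w, x))        # predictable part
        e = signal[i] - y                               # unpredictable remainder
        w = [wk + mu * e * xk for wk, xk in zip(w, x)]  # LMS update
        predicted.append(y)
        residual.append(e)
    return predicted, residual

# A pure sinusoid is fully predictable, so the residual shrinks as w adapts:
sig = [math.sin(0.2 * math.pi * n) for n in range(2000)]
y, e = ale(sig)
print(sum(v * v for v in e[-200:]) < sum(v * v for v in e[:200]))  # True
```

    In the stethoscope application the roles are reversed only in interpretation: the predictor output tracks the heart sound, and the residual is the de-noised breath sound.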

  6. A superconducting direct-current limiter with a power of up to 8 MVA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fisher, L. M.; Alferov, D. F., E-mail: DFAlferov@niitfa.ru; Akhmetgareev, M. R.

    2016-12-15

    A resistive switching superconducting fault current limiter (SFCL) for DC networks with a nominal voltage of 3.5 kV and a nominal current of 2 kA was developed, produced, and tested. The SFCL has two main units—an assembly of superconducting modules and a high-speed vacuum circuit breaker. The assembly of superconducting modules consists of nine (3 × 3) parallel–series connected modules. Each module contains four parallel-connected 2G high-temperature superconducting (HTS) tapes. The results of SFCL tests in the short-circuit emulation mode with a maximum current rise rate of 1300 A/ms are presented. The SFCL is capable of limiting the current at a level of 7 kA and breaking it 8 ms after the current-limiting mode begins. The average temperature of HTS tapes during the current-limiting mode increases to 210 K. After the current is interrupted, the superconductivity recovery time does not exceed 1 s.

  7. Study of a unified hardware and software fault-tolerant architecture

    NASA Technical Reports Server (NTRS)

    Lala, Jaynarayan; Alger, Linda; Friend, Steven; Greeley, Gregory; Sacco, Stephen; Adams, Stuart

    1989-01-01

    A unified architectural concept, called the Fault Tolerant Processor Attached Processor (FTP-AP), that can tolerate hardware as well as software faults is proposed for applications requiring ultrareliable computation capability. An emulation of the FTP-AP architecture, consisting of a breadboard Motorola 68010-based quadruply redundant Fault Tolerant Processor, four VAX 750s as attached processors, and four versions of a transport aircraft yaw damper control law, is used as a testbed in the AIRLAB to examine a number of critical issues. Solutions of several basic problems associated with N-Version software are proposed and implemented on the testbed. This includes a confidence voter to resolve coincident errors in N-Version software. A reliability model of N-Version software that is based upon the recent understanding of software failure mechanisms is also developed. The basic FTP-AP architectural concept appears suitable for hosting N-Version application software while at the same time tolerating hardware failures. Architectural enhancements for greater efficiency, software reliability modeling, and N-Version issues that merit further research are identified.
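    A baseline majority voter over N-version outputs illustrates how a single faulty version is masked. The study's confidence voter is more elaborate (it resolves coincident errors among versions); this sketch shows only the basic masking idea, and the tolerance value is an arbitrary choice.

```python
def majority_vote(outputs, tol=1e-6):
    """Group near-equal numeric outputs and return the majority value."""
    groups = []                                  # (representative, count)
    for v in outputs:
        for idx, (rep, cnt) in enumerate(groups):
            if abs(v - rep) <= tol:
                groups[idx] = (rep, cnt + 1)     # v matches an existing group
                break
        else:
            groups.append((v, 1))                # v starts a new group
    rep, cnt = max(groups, key=lambda g: g[1])
    if cnt <= len(outputs) // 2:
        raise ValueError("no majority among versions")
    return rep

# Four versions of a control-law output, one faulty:
print(majority_vote([0.42, 0.42, 0.97, 0.42]))  # 0.42
```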

  8. Advanced Grid Simulator for Multi-Megawatt Power Converter Testing and Certification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koralewicz, Przemyslaw; Gevorgian, Vahan; Wallen, Robb

    2017-02-16

    Grid integration testing of inverter-coupled renewable energy technologies is an essential step in the qualification of renewable energy and energy storage systems to ensure the stability of the power system. New types of devices must be thoroughly tested and validated for compliance with relevant grid codes and interconnection requirements. For this purpose, highly specialized custom-made testing equipment is needed to emulate various types of realistic grid conditions that are required by certification bodies or for research purposes. For testing multi-megawatt converters, a high-power grid simulator capable of creating controlled grid conditions and meeting both power quality and dynamic characteristics is needed. This paper describes the new grid simulator concept based on ABB's medium-voltage ACS6000 drive technology that utilizes advanced modulation and control techniques to create a unique testing platform for various multi-megawatt power converter systems. Its performance is demonstrated utilizing the test results obtained during commissioning activities at the National Renewable Energy Laboratory in Colorado, USA.

  9. Experimental validation of wireless communication with chaos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Hai-Peng; Bai, Chao; Liu, Jian

    The constraints of a wireless physical medium, such as multi-path propagation and complex ambient noise, prevent information from being communicated at a low bit error rate. Surprisingly, it has only recently been shown that, from a theoretical perspective, chaotic signals are optimal for communication: they maximise the receiver signal-to-noise performance, consequently minimising the bit error rate. This work demonstrates numerically and experimentally that chaotic systems can in fact be used to create a reliable and efficient wireless communication system. Toward this goal, we propose an impulsive control method to generate chaotic wave signals that encode arbitrary binary information signals, together with an integration logic and a matched filter capable of decreasing the noise effect over a wireless channel. The experimental validation is conducted by inputting the signals generated by an electronic transmitting circuit into an electronic circuit that emulates a wireless channel, where the signals travel along three different paths. The output signal is decoded by an electronic receiver after passing through a matched filter.

  10. In Situ Neutron Depth Profiling of Lithium Metal-Garnet Interfaces for Solid State Batteries.

    PubMed

    Wang, Chengwei; Gong, Yunhui; Dai, Jiaqi; Zhang, Lei; Xie, Hua; Pastel, Glenn; Liu, Boyang; Wachsman, Eric; Wang, Howard; Hu, Liangbing

    2017-10-11

    The garnet-based solid state electrolyte (SSE) is considered a promising candidate to realize all solid state lithium (Li) metal batteries. However, critical issues require additional investigation before practical applications become possible, among which high interfacial impedance and low interfacial stability remain the most challenging. In this work, neutron depth profiling (NDP), a nondestructive and uniquely Li-sensitive technique, has been used to reveal the interfacial behavior of garnet SSE in contact with metallic Li through in situ monitoring of Li plating-stripping processes. The NDP measurement demonstrates predictive capabilities for diagnosing short-circuits in solid state batteries. Two types of cells, symmetric Li/garnet/Li (LGL) cells and asymmetric Li/garnet/carbon-nanotubes (LGC), are fabricated to emulate the behavior of Li metal and Li-free Li metal anodes, respectively. The data imply the limitation of Li-free Li metal anode in forming reliable interfacial contacts, and strategies of excessive Li and better interfacial engineering need to be investigated.

  11. A superconducting direct-current limiter with a power of up to 8 MVA

    NASA Astrophysics Data System (ADS)

    Fisher, L. M.; Alferov, D. F.; Akhmetgareev, M. R.; Budovskii, A. I.; Evsin, D. V.; Voloshin, I. F.; Kalinov, A. V.

    2016-12-01

    A resistive switching superconducting fault current limiter (SFCL) for DC networks with a nominal voltage of 3.5 kV and a nominal current of 2 kA was developed, produced, and tested. The SFCL has two main units—an assembly of superconducting modules and a high-speed vacuum circuit breaker. The assembly of superconducting modules consists of nine (3 × 3) parallel-series connected modules. Each module contains four parallel-connected 2G high-temperature superconducting (HTS) tapes. The results of SFCL tests in the short-circuit emulation mode with a maximum current rise rate of 1300 A/ms are presented. The SFCL is capable of limiting the current at a level of 7 kA and breaking it 8 ms after the current-limiting mode begins. The average temperature of HTS tapes during the current-limiting mode increases to 210 K. After the current is interrupted, the superconductivity recovery time does not exceed 1 s.

  12. Optimized pulsed write schemes improve linearity and write speed for low-power organic neuromorphic devices

    NASA Astrophysics Data System (ADS)

    Keene, Scott T.; Melianas, Armantas; Fuller, Elliot J.; van de Burgt, Yoeri; Talin, A. Alec; Salleo, Alberto

    2018-06-01

    Neuromorphic devices are becoming increasingly appealing as efficient emulators of neural networks used to model real world problems. However, no hardware to date has demonstrated the necessary high accuracy and energy efficiency gain over CMOS in both (1) training via backpropagation and (2) in read via vector-matrix multiplication. Such shortcomings are due to device non-idealities, particularly asymmetric conductance tuning in response to uniform voltage pulse inputs. Here, by formulating a general circuit model for capacitive ion-exchange neuromorphic devices, we show that asymmetric nonlinearity in organic electrochemical neuromorphic devices (ENODes) can be suppressed by an appropriately chosen write scheme. Simulations based upon our model suggest that a nonlinear write-selector could reduce the switching voltage and energy, enabling analog tuning via a continuous set of resistance states (100 states) with extremely low switching energy (~170 fJ·µm⁻²). This work clarifies the pathway to neural algorithm accelerators capable of parallelism during both read and write operations.
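    A common phenomenological model of the non-ideality discussed above is a conductance step that shrinks as the device approaches its bounds, producing the saturating, nonlinear response that uniform pulses cannot avoid. All parameter values below are generic illustrations, not measured ENODe characteristics.

```python
import math

def pulse_update(g, potentiate, g_min=0.0, g_max=1.0, nl=3.0, rate=0.1):
    """Apply one uniform voltage pulse; nl > 0 gives the saturating response."""
    span = g_max - g_min
    if potentiate:
        step = (1 - math.exp(-nl * (g_max - g) / span)) / nl
    else:
        step = -(1 - math.exp(-nl * (g - g_min) / span)) / nl
    return min(g_max, max(g_min, g + rate * step))

g, trace = 0.0, []
for _ in range(50):                    # 50 identical potentiating pulses
    g = pulse_update(g, True)
    trace.append(g)
print(trace[1] - trace[0] > trace[-1] - trace[-2])  # True: steps shrink
```

    A write scheme that shapes the pulse (or a nonlinear selector) can compensate this state-dependent step size, which is the linearization strategy the paper pursues.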

  13. CDP - Adaptive Supervisory Control and Data Acquisition (SCADA) Technology for Infrastructure Protection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marco Carvalho; Richard Ford

    2012-05-14

    Supervisory Control and Data Acquisition (SCADA) Systems are a type of Industrial Control System characterized by the centralized (or hierarchical) monitoring and control of geographically dispersed assets. SCADA systems combine acquisition and network components to provide data gathering, transmission, and visualization for centralized monitoring and control. However, these integrated capabilities, especially when built over legacy systems and protocols, generally result in vulnerabilities that can be exploited by attackers, with potentially disastrous consequences. Our research project proposal was to investigate new approaches for secure and survivable SCADA systems. In particular, we were interested in the resilience and adaptability of large-scale mission-critical monitoring and control infrastructures. Our research proposal was divided into two main tasks. The first task was centered on the design and investigation of algorithms for survivable SCADA systems and a prototype framework demonstration. The second task was centered on the characterization and demonstration of the proposed approach in illustrative scenarios (simulated or emulated).

  14. A hierarchical structure for representing and learning fuzzy rules

    NASA Technical Reports Server (NTRS)

    Yager, Ronald R.

    1993-01-01

    Yager provides an example in which the flat representation of fuzzy if-then rules leads to unsatisfactory results. Consider a rule base consisting of two rules: if U is 12 then V is 29; if U is (10-15) then V is (25-30). If U = 12 we would get V is G where G = (25-30). The application of the defuzzification process leads to a selection of V = 27.5. Thus we see that the very specific instruction was not followed. The problem with the technique used is that the most specific information was swamped by the less specific information. In this paper we provide a new structure for the representation of fuzzy if-then rules. The representational form introduced here is called a Hierarchical Prioritized Structure (HPS) representation. Most importantly, in addition to overcoming the problem illustrated in the previous example, this HPS representation has an inherent capability to emulate the learning of general rules and provides a reasonably accurate cognitive mapping of how human beings store information.
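    The swamping effect can be rendered numerically: fire both rules of the flat rule base for U = 12, combine by maximum, and defuzzify by centroid. The 0.5-step sampling of the interval (25-30) is an arbitrary discretization choice.

```python
def centroid(points):
    """Centroid defuzzification over (value, membership) pairs."""
    return sum(v * m for v, m in points) / sum(m for _, m in points)

# Combined fuzzy output: the broad rule gives membership 1 across 25..30,
# and the specific rule's V = 29 lies inside that plateau, adding nothing
# under max-combination.
combined = [(25 + 0.5 * k, 1.0) for k in range(11)]
print(centroid(combined))  # 27.5 -- the specific instruction "V is 29" is lost
```

    The HPS representation avoids this by letting the more specific rule take priority, so the broad rule never dilutes its output.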

  15. Combining super-ensembles and statistical emulation to improve a regional climate and vegetation model

    NASA Astrophysics Data System (ADS)

    Hawkins, L. R.; Rupp, D. E.; Li, S.; Sarah, S.; McNeall, D. J.; Mote, P.; Betts, R. A.; Wallom, D.

    2017-12-01

    Changing regional patterns of surface temperature, precipitation, and humidity may cause ecosystem-scale changes in vegetation, altering the distribution of trees, shrubs, and grasses. A changing vegetation distribution, in turn, alters the albedo, latent heat flux, and carbon exchanged with the atmosphere, with resulting feedbacks onto the regional climate. However, a wide range of earth-system processes that affect the carbon, energy, and hydrologic cycles occur at sub-grid scales in climate models and must be parameterized. The appropriate parameter values in such parameterizations are often poorly constrained, leading to uncertainty in predictions of how the ecosystem will respond to changes in forcing. To better understand the sensitivity of regional climate to parameter selection and to improve regional climate and vegetation simulations, we used a large perturbed physics ensemble and a suite of statistical emulators. We dynamically downscaled a super-ensemble (multiple parameter sets and multiple initial conditions) of global climate simulations using the 25-km-resolution regional climate model HadRM3p with the land-surface scheme MOSES2 and dynamic vegetation module TRIFFID. We simultaneously perturbed land surface parameters relating to the exchange of carbon, water, and energy between the land surface and atmosphere in a large super-ensemble of regional climate simulations over the western US. Statistical emulation was used as a computationally cost-effective tool to explore uncertainties in parameter interactions. Regions of parameter space that did not satisfy observational constraints were eliminated, and an ensemble of parameter sets that reduce regional biases and span a range of plausible interactions among earth system processes was selected.
This study demonstrated that by combining super-ensemble simulations with statistical emulation, simulations of regional climate could be improved while simultaneously accounting for a range of plausible land-atmosphere feedback strengths.
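    The emulation step can be sketched with a miniature Gaussian-process surrogate: fit an RBF-kernel GP to a handful of "simulator" runs, then predict cheaply anywhere in parameter space. The toy sine function below stands in for the regional climate model, and the kernel settings are arbitrary.

```python
import math

def rbf(a, b, length=0.3):
    return math.exp(-((a - b) ** 2) / (2 * length ** 2))

def gp_predict(x_train, y_train, x_new, noise=1e-6):
    """GP mean prediction at x_new from 1-D training runs (x_train, y_train)."""
    n = len(x_train)
    K = [[rbf(x_train[i], x_train[j]) + (noise if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    # Solve K a = y by Gaussian elimination (K is symmetric positive definite,
    # so no pivoting is needed for this small system).
    a, M = list(y_train), [row[:] for row in K]
    for col in range(n):
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n):
                M[r][c] -= f * M[col][c]
            a[r] -= f * a[col]
    for i in range(n - 1, -1, -1):
        a[i] = (a[i] - sum(M[i][j] * a[j] for j in range(i + 1, n))) / M[i][i]
    return sum(a[i] * rbf(x_train[i], x_new) for i in range(n))

xs = [0.0, 0.25, 0.5, 0.75, 1.0]                 # five "simulator" runs
ys = [math.sin(2 * math.pi * x) for x in xs]
print(round(gp_predict(xs, ys, 0.25), 2))        # 1.0: reproduces a run
print(round(gp_predict(xs, ys, 0.4), 2))         # cheap off-design prediction
```

    Once trained, millions of such predictions cost microseconds each, which is what makes emulator-based exploration of a super-ensemble's parameter space tractable.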

  16. Causal knowledge and imitation/emulation switching in chimpanzees (Pan troglodytes) and children (Homo sapiens).

    PubMed

    Horner, Victoria; Whiten, Andrew

    2005-07-01

    This study explored whether the tendency of chimpanzees and children to use emulation or imitation to solve a tool-using task was a response to the availability of causal information. Young wild-born chimpanzees from an African sanctuary and 3- to 4-year-old children observed a human demonstrator use a tool to retrieve a reward from a puzzle-box. The demonstration involved both causally relevant and irrelevant actions, and the box was presented in each of two conditions: opaque and clear. In the opaque condition, causal information about the effect of the tool inside the box was not available, and hence it was impossible to differentiate between the relevant and irrelevant parts of the demonstration. However, in the clear condition causal information was available, and subjects could potentially determine which actions were necessary. When chimpanzees were presented with the opaque box, they reproduced both the relevant and irrelevant actions, thus imitating the overall structure of the task. When the box was presented in the clear condition they instead ignored the irrelevant actions in favour of a more efficient, emulative technique. These results suggest that emulation is the favoured strategy of chimpanzees when sufficient causal information is available. However, if such information is not available, chimpanzees are prone to employ a more comprehensive copy of an observed action. In contrast to the chimpanzees, children employed imitation to solve the task in both conditions, at the expense of efficiency. We suggest that the difference in performance of chimpanzees and children may be due to a greater susceptibility of children to cultural conventions, perhaps combined with a differential focus on the results, actions and goals of the demonstrator.

  17. Pinatubo Emulation in Multiple Models (POEMs): co-ordinated experiments in the ISA-MIP model intercomparison activity component of the SPARC Stratospheric Sulphur and its Role in Climate initiative (SSiRC)

    NASA Astrophysics Data System (ADS)

    Lee, Lindsay; Mann, Graham; Carslaw, Ken; Toohey, Matthew; Aquila, Valentina

    2016-04-01

    The World Climate Research Program's SPARC initiative has a new international activity, "Stratospheric Sulphur and its Role in Climate" (SSiRC), to better understand changes in stratospheric aerosol and precursor gaseous sulphur species. One component of SSiRC is an intercomparison, "ISA-MIP", of composition-climate models that simulate the stratospheric aerosol layer interactively. Within PoEMS, each modelling group will run a "perturbed physics ensemble" (PPE) of interactive stratospheric aerosol (ISA) simulations of the Pinatubo eruption, varying several uncertain parameters associated with the eruption's SO2 emissions and model processes. A powerful new technique to quantify and attribute sources of uncertainty in complex global models is described by Lee et al. (2011, ACP). The analysis uses Gaussian emulation to derive a probability density function (pdf) of predicted quantities, essentially interpolating the PPE results in multi-dimensional parameter space. Once the emulator is trained on the ensemble, a Monte Carlo simulation with the fast Gaussian emulator enables a full variance-based sensitivity analysis. The approach has already been used effectively by Carslaw et al. (2013, Nature) to quantify the uncertainty in the cloud albedo effect forcing from a 3D global aerosol-microphysics model, allowing the sensitivity of different predicted quantities to uncertainties in natural and anthropogenic emission types and in structural model parameters to be compared. Within ISA-MIP, each group will carry out a PPE of runs, with the subsequent emulator analysis assessing the uncertainty in the volcanic forcings predicted by each model. In this poster presentation we will give an outline of the PoEMS analysis, describing the uncertain parameters to be varied and the relevance to further understanding differences identified in previous international stratospheric aerosol assessments.
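
    The emulation workflow this record describes (train a Gaussian process on a perturbed physics ensemble, then run Monte Carlo with the fast emulator for a variance-based sensitivity analysis) can be sketched in a few lines. The three-parameter `expensive_model` below is a hypothetical stand-in for a single output of a real interactive stratospheric-aerosol run, not any ISA-MIP model; the Sobol-index estimator is the standard Saltelli pick-and-freeze form, chosen here for brevity:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

# Hypothetical stand-in for one expensive ensemble member: an interactive
# aerosol simulation reduced to a single scalar output (e.g. peak forcing).
def expensive_model(x):
    return 4.0 * x[:, 0] + 1.0 * x[:, 1] + 0.2 * x[:, 2] * x[:, 0]

# 1. "Perturbed physics ensemble": a small design over 3 uncertain parameters.
X_train = rng.uniform(0.0, 1.0, size=(120, 3))
y_train = expensive_model(X_train)

# 2. Train the Gaussian process emulator on the ensemble.
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([0.5] * 3),
                              normalize_y=True).fit(X_train, y_train)

# 3. Monte Carlo with the cheap emulator: Saltelli-style first-order
#    Sobol indices, S_i = Var(E[Y|X_i]) / Var(Y).
n = 4096
A = rng.uniform(0.0, 1.0, size=(n, 3))
B = rng.uniform(0.0, 1.0, size=(n, 3))
fA, fB = gp.predict(A), gp.predict(B)
var_y = np.var(np.concatenate([fA, fB]))
S = []
for i in range(3):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                  # swap column i only
    fABi = gp.predict(ABi)
    S.append(np.mean(fB * (fABi - fA)) / var_y)
print("first-order Sobol indices:", np.round(S, 2))
```

    The 8192 emulator evaluations per index cost milliseconds, whereas the same analysis with direct simulator runs would be infeasible for a global model; that trade is the entire point of the emulation step.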

  18. Characterization and Compensation of Network-Level Anomalies in Mixed-Signal Neuromorphic Modeling Platforms

    PubMed Central

    Petrovici, Mihai A.; Vogginger, Bernhard; Müller, Paul; Breitwieser, Oliver; Lundqvist, Mikael; Muller, Lyle; Ehrlich, Matthias; Destexhe, Alain; Lansner, Anders; Schüffny, René; Schemmel, Johannes; Meier, Karlheinz

    2014-01-01

    Advancing the size and complexity of neural network models leads to an ever increasing demand for computational resources for their simulation. Neuromorphic devices offer a number of advantages over conventional computing architectures, such as high emulation speed or low power consumption, but this usually comes at the price of reduced configurability and precision. In this article, we investigate the consequences of several such factors that are common to neuromorphic devices, more specifically limited hardware resources, limited parameter configurability and parameter variations due to fixed-pattern noise and trial-to-trial variability. Our final aim is to provide an array of methods for coping with such inevitable distortion mechanisms. As a platform for testing our proposed strategies, we use an executable system specification (ESS) of the BrainScaleS neuromorphic system, which has been designed as a universal emulation back-end for neuroscientific modeling. We address the most essential limitations of this device in detail and study their effects on three prototypical benchmark network models within a well-defined, systematic workflow. For each network model, we start by defining quantifiable functionality measures by which we then assess the effects of typical hardware-specific distortion mechanisms, both in idealized software simulations and on the ESS. For those effects that cause unacceptable deviations from the original network dynamics, we suggest generic compensation mechanisms and demonstrate their effectiveness. Both the suggested workflow and the investigated compensation mechanisms are largely back-end independent and do not require additional hardware configurability beyond the one required to emulate the benchmark networks in the first place. We hereby provide a generic methodological environment for configurable neuromorphic devices that are targeted at emulating large-scale, functional neural networks. PMID:25303102
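
    The two variability classes the abstract distinguishes, static fixed-pattern noise and trial-to-trial variability, and the general flavour of a calibration-style compensation can be illustrated with a toy rate model. All gains, noise amplitudes, and the averaging scheme below are invented for illustration and are not BrainScaleS values or the paper's actual compensation mechanisms:

```python
import numpy as np

rng = np.random.default_rng(1)
n_neurons, target_gain = 200, 1.0

# Fixed-pattern noise: each hardware neuron carries a static gain offset;
# trial-to-trial variability adds fresh jitter on every run.
fixed_pattern = rng.normal(0.0, 0.2, n_neurons)      # static, per neuron

def run_trial(input_weights):
    trial_noise = rng.normal(0.0, 0.05, n_neurons)   # re-drawn each trial
    gains = target_gain + fixed_pattern + trial_noise
    return input_weights * gains                      # observed responses

# Compensation: calibrate per-neuron weights so the *mean* response matches
# the target, averaging many trials to suppress trial-to-trial noise (the
# static fixed-pattern component can be calibrated out; the jitter cannot).
w = np.ones(n_neurons)
measured = np.mean([run_trial(w) for _ in range(50)], axis=0)
w_calibrated = w * target_gain / measured

resp = np.mean([run_trial(w_calibrated) for _ in range(50)], axis=0)
print("max deviation before:", np.max(np.abs(measured - target_gain)))
print("max deviation after: ", np.max(np.abs(resp - target_gain)))
```

    The sketch shows why the distinction matters: calibration removes the reproducible fixed-pattern component almost entirely, while the trial-to-trial component sets a floor that only trial averaging, not configuration, can lower.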

  19. Characterization and compensation of network-level anomalies in mixed-signal neuromorphic modeling platforms.

    PubMed

    Petrovici, Mihai A; Vogginger, Bernhard; Müller, Paul; Breitwieser, Oliver; Lundqvist, Mikael; Muller, Lyle; Ehrlich, Matthias; Destexhe, Alain; Lansner, Anders; Schüffny, René; Schemmel, Johannes; Meier, Karlheinz

    2014-01-01

    Advancing the size and complexity of neural network models leads to an ever increasing demand for computational resources for their simulation. Neuromorphic devices offer a number of advantages over conventional computing architectures, such as high emulation speed or low power consumption, but this usually comes at the price of reduced configurability and precision. In this article, we investigate the consequences of several such factors that are common to neuromorphic devices, more specifically limited hardware resources, limited parameter configurability and parameter variations due to fixed-pattern noise and trial-to-trial variability. Our final aim is to provide an array of methods for coping with such inevitable distortion mechanisms. As a platform for testing our proposed strategies, we use an executable system specification (ESS) of the BrainScaleS neuromorphic system, which has been designed as a universal emulation back-end for neuroscientific modeling. We address the most essential limitations of this device in detail and study their effects on three prototypical benchmark network models within a well-defined, systematic workflow. For each network model, we start by defining quantifiable functionality measures by which we then assess the effects of typical hardware-specific distortion mechanisms, both in idealized software simulations and on the ESS. For those effects that cause unacceptable deviations from the original network dynamics, we suggest generic compensation mechanisms and demonstrate their effectiveness. Both the suggested workflow and the investigated compensation mechanisms are largely back-end independent and do not require additional hardware configurability beyond the one required to emulate the benchmark networks in the first place. We hereby provide a generic methodological environment for configurable neuromorphic devices that are targeted at emulating large-scale, functional neural networks.

  20. Uncertainty-based simulation-optimization using Gaussian process emulation: Application to coastal groundwater management

    NASA Astrophysics Data System (ADS)

    Rajabi, Mohammad Mahdi; Ketabchi, Hamed

    2017-12-01

    Combined simulation-optimization (S/O) schemes have long been recognized as a valuable tool in coastal groundwater management (CGM). However, previous applications have mostly relied on deterministic seawater intrusion (SWI) simulations. This is a questionable simplification, knowing that SWI models are inevitably prone to epistemic and aleatory uncertainty, and hence a management strategy obtained through S/O without consideration of uncertainty may result in significantly different real-world outcomes than expected. However, two key issues have hindered the use of uncertainty-based S/O schemes in CGM, which are addressed in this paper. The first issue is how to solve the computational challenges resulting from the need to perform massive numbers of simulations. The second issue is how the management problem is formulated in the presence of uncertainty. We propose the use of Gaussian process (GP) emulation as a valuable tool in solving the computational challenges of uncertainty-based S/O in CGM. We apply GP emulation to the case study of Kish Island (located in the Persian Gulf) using an uncertainty-based S/O algorithm which relies on continuous ant colony optimization and Monte Carlo simulation. In doing so, we show that GP emulation can provide an acceptable level of accuracy, with no bias and low statistical dispersion, while tremendously reducing the computational time. Moreover, five new formulations for uncertainty-based S/O are presented based on concepts such as energy distances, prediction intervals and probabilities of SWI occurrence. We analyze the proposed formulations with respect to their resulting optimized solutions, the sensitivity of the solutions to the intended reliability levels, and the variations resulting from repeated optimization runs.
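
    The emulator-in-the-loop idea above, replacing the expensive SWI simulator with a trained GP inside a Monte Carlo reliability screen, can be sketched minimally. The `swi_model`, the salinity limit, and the conductivity distribution below are all invented for illustration; the actual study couples the emulator with continuous ant colony optimization, which a simple grid search stands in for here:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(2)

# Hypothetical stand-in for the SWI simulator: salinity at a control well
# as a function of pumping rate q (decision) and conductivity k (uncertain).
def swi_model(q, k):
    return 0.5 * q * k

# Train the emulator on a modest design over the (q, k) space.
Q = rng.uniform(0.0, 10.0, 150)
K = rng.uniform(0.5, 1.5, 150)
gp = GaussianProcessRegressor(kernel=RBF([2.0, 0.5]), normalize_y=True)
gp.fit(np.column_stack([Q, K]), swi_model(Q, K))

# Uncertainty-based screening: for each candidate pumping rate, Monte Carlo
# over the uncertain conductivity with the cheap emulator, and keep the
# largest rate whose probability of exceeding the salinity limit stays
# below 5% (a chance-constrained formulation, one of several possible).
limit = 3.0
k_samples = rng.normal(1.0, 0.15, 2000)
best_q = 0.0
for q in np.linspace(0.0, 10.0, 101):
    sal = gp.predict(np.column_stack([np.full_like(k_samples, q), k_samples]))
    if np.mean(sal > limit) < 0.05:
        best_q = q
print("largest reliable pumping rate:", best_q)
```

    Each candidate rate costs 2000 emulator predictions instead of 2000 simulator runs, which is what makes the Monte Carlo layer affordable inside an optimization loop.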
